problem_id
stringlengths 18
22
| source
stringclasses 1
value | task_type
stringclasses 1
value | in_source_id
stringlengths 13
58
| prompt
stringlengths 1.1k
25.4k
| golden_diff
stringlengths 145
5.13k
| verification_info
stringlengths 582
39.1k
| num_tokens
int64 271
4.1k
| num_tokens_diff
int64 47
1.02k
|
---|---|---|---|---|---|---|---|---|
gh_patches_debug_54595 | rasdani/github-patches | git_diff | zulip__zulip-21726 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Create collapsible "Private messages" section in left sidebar
At present, private messages are collapsed in the left sidebar, unless the user is in a private message narrow. This has a few down sides:
1. Getting to a PM conversation generally requires multiple clicks.
2. It's not immediately clear who send you a new private message, which is important for determining whether one needs to read it right away.
3. It can be hard for new users to figure out how to view and send private messages.
In order to address this, we should try making a private messages section in the left sidebar that is open by default. Specifically:
1. Make a Private messages section just above STREAMS in the left sidebar that is open by default.
2. In the new PMs section, use the same algorithm we use for stream topics to decide how many conversations to show.
3. Make the PMs section collapsible, similar to the collapsible sections in #20072. The open/collapsed state should be sticky as the user navigates around Zulip, closes and reopens the window, logs out and in, etc.
Note that this will likely require experimentation for us to get it right. To avoid misdirected effort, please post screenshots in the #design stream on chat.zulip.org for feedback. Also, if (3) can't be implemented quickly, we can test the experience in chat.zulip.org without waiting for it to be completed.
[Prior discussion on CZO](https://chat.zulip.org/#narrow/stream/101-design/topic/private.20messages.20UI/near/1159032).
See also #11108.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `tools/lib/capitalization.py`
Content:
```
1 import re
2 from typing import List, Match, Tuple
3
4 from bs4 import BeautifulSoup
5
6 # The phrases in this list will be ignored. The longest phrase is
7 # tried first; this removes the chance of smaller phrases changing
8 # the text before longer phrases are tried.
9 # The errors shown by `tools/check-capitalization` can be added to
10 # this list without any modification.
11 IGNORED_PHRASES = [
12 # Proper nouns and acronyms
13 r"API",
14 r"APNS",
15 r"Botserver",
16 r"Cookie Bot",
17 r"DevAuthBackend",
18 r"GCM",
19 r"GitHub",
20 r"Gravatar",
21 r"Help Center",
22 r"HTTP",
23 r"ID",
24 r"IDs",
25 r"IP",
26 r"JSON",
27 r"Kerberos",
28 r"LDAP",
29 r"Markdown",
30 r"OTP",
31 r"Pivotal",
32 r"PM",
33 r"PMs",
34 r"Slack",
35 r"Google",
36 r"Terms of Service",
37 r"Tuesday",
38 r"URL",
39 r"UUID",
40 r"Webathena",
41 r"WordPress",
42 r"Zephyr",
43 r"Zoom",
44 r"Zulip",
45 r"Zulip Server",
46 r"Zulip Account Security",
47 r"Zulip Security",
48 r"Zulip Cloud Standard",
49 r"BigBlueButton",
50 # Code things
51 r"\.zuliprc",
52 # BeautifulSoup will remove <z-user> which is horribly confusing,
53 # so we need more of the sentence.
54 r"<z-user></z-user> will have the same role",
55 # Things using "I"
56 r"I understand",
57 r"I'm",
58 r"I've",
59 # Specific short words
60 r"beta",
61 r"and",
62 r"bot",
63 r"e\.g\.",
64 r"enabled",
65 r"signups",
66 # Placeholders
67 r"keyword",
68 r"streamname",
69 r"user@example\.com",
70 # Fragments of larger strings
71 (r"your subscriptions on your Streams page"),
72 r"Add global time<br />Everyone sees global times in their own time zone\.",
73 r"user",
74 r"an unknown operating system",
75 r"Go to Settings",
76 # SPECIAL CASES
77 # Because topics usually are lower-case, this would look weird if it were capitalized
78 r"more topics",
79 # Used alone in a parenthetical where capitalized looks worse.
80 r"^deprecated$",
81 # Capital 'i' looks weird in reminders popover
82 r"in 1 hour",
83 r"in 20 minutes",
84 r"in 3 hours",
85 # these are used as topics
86 r"^new streams$",
87 r"^stream events$",
88 # These are used as example short names (e.g. an uncapitalized context):
89 r"^marketing$",
90 r"^cookie$",
91 # Used to refer custom time limits
92 r"\bN\b",
93 # Capital c feels obtrusive in clear status option
94 r"clear",
95 r"group private messages with \{recipient\}",
96 r"private messages with \{recipient\}",
97 r"private messages with yourself",
98 r"GIF",
99 # Emoji name placeholder
100 r"leafy green vegetable",
101 # Subdomain placeholder
102 r"your-organization-url",
103 # Used in invite modal
104 r"or",
105 # Used in GIPHY popover.
106 r"GIFs",
107 r"GIPHY",
108 # Used in our case studies
109 r"Technical University of Munich",
110 r"University of California San Diego",
111 # Used in stream creation form
112 r"email hidden",
113 # Use in compose box.
114 r"to send",
115 r"to add a new line",
116 # Used in showing Notification Bot read receipts message
117 "Notification Bot",
118 # Used in presence_enabled setting label
119 r"invisible mode off",
120 # Typeahead suggestions for "Pronouns" custom field type.
121 r"he/him",
122 r"she/her",
123 r"they/them",
124 ]
125
126 # Sort regexes in descending order of their lengths. As a result, the
127 # longer phrases will be ignored first.
128 IGNORED_PHRASES.sort(key=lambda regex: len(regex), reverse=True)
129
130 # Compile regexes to improve performance. This also extracts the
131 # text using BeautifulSoup and then removes extra whitespaces from
132 # it. This step enables us to add HTML in our regexes directly.
133 COMPILED_IGNORED_PHRASES = [
134 re.compile(" ".join(BeautifulSoup(regex, "lxml").text.split())) for regex in IGNORED_PHRASES
135 ]
136
137 SPLIT_BOUNDARY = "?.!" # Used to split string into sentences.
138 SPLIT_BOUNDARY_REGEX = re.compile(rf"[{SPLIT_BOUNDARY}]")
139
140 # Regexes which check capitalization in sentences.
141 DISALLOWED = [
142 r"^[a-z](?!\})", # Checks if the sentence starts with a lower case character.
143 r"^[A-Z][a-z]+[\sa-z0-9]+[A-Z]", # Checks if an upper case character exists
144 # after a lower case character when the first character is in upper case.
145 ]
146 DISALLOWED_REGEX = re.compile(r"|".join(DISALLOWED))
147
148 BANNED_WORDS = {
149 "realm": "The term realm should not appear in user-facing strings. Use organization instead.",
150 }
151
152
153 def get_safe_phrase(phrase: str) -> str:
154 """
155 Safe phrase is in lower case and doesn't contain characters which can
156 conflict with split boundaries. All conflicting characters are replaced
157 with low dash (_).
158 """
159 phrase = SPLIT_BOUNDARY_REGEX.sub("_", phrase)
160 return phrase.lower()
161
162
163 def replace_with_safe_phrase(matchobj: Match[str]) -> str:
164 """
165 The idea is to convert IGNORED_PHRASES into safe phrases, see
166 `get_safe_phrase()` function. The only exception is when the
167 IGNORED_PHRASE is at the start of the text or after a split
168 boundary; in this case, we change the first letter of the phrase
169 to upper case.
170 """
171 ignored_phrase = matchobj.group(0)
172 safe_string = get_safe_phrase(ignored_phrase)
173
174 start_index = matchobj.start()
175 complete_string = matchobj.string
176
177 is_string_start = start_index == 0
178 # We expect that there will be one space between split boundary
179 # and the next word.
180 punctuation = complete_string[max(start_index - 2, 0)]
181 is_after_split_boundary = punctuation in SPLIT_BOUNDARY
182 if is_string_start or is_after_split_boundary:
183 return safe_string.capitalize()
184
185 return safe_string
186
187
188 def get_safe_text(text: str) -> str:
189 """
190 This returns text which is rendered by BeautifulSoup and is in the
191 form that can be split easily and has all IGNORED_PHRASES processed.
192 """
193 soup = BeautifulSoup(text, "lxml")
194 text = " ".join(soup.text.split()) # Remove extra whitespaces.
195 for phrase_regex in COMPILED_IGNORED_PHRASES:
196 text = phrase_regex.sub(replace_with_safe_phrase, text)
197
198 return text
199
200
201 def is_capitalized(safe_text: str) -> bool:
202 sentences = SPLIT_BOUNDARY_REGEX.split(safe_text)
203 return not any(DISALLOWED_REGEX.search(sentence.strip()) for sentence in sentences)
204
205
206 def check_banned_words(text: str) -> List[str]:
207 lower_cased_text = text.lower()
208 errors = []
209 for word, reason in BANNED_WORDS.items():
210 if word in lower_cased_text:
211 # Hack: Should move this into BANNED_WORDS framework; for
212 # now, just hand-code the skips:
213 if "realm_name" in lower_cased_text:
214 continue
215 kwargs = dict(word=word, text=text, reason=reason)
216 msg = "{word} found in '{text}'. {reason}".format(**kwargs)
217 errors.append(msg)
218
219 return errors
220
221
222 def check_capitalization(strings: List[str]) -> Tuple[List[str], List[str], List[str]]:
223 errors = []
224 ignored = []
225 banned_word_errors = []
226 for text in strings:
227 text = " ".join(text.split()) # Remove extra whitespaces.
228 safe_text = get_safe_text(text)
229 has_ignored_phrase = text != safe_text
230 capitalized = is_capitalized(safe_text)
231 if not capitalized:
232 errors.append(text)
233 elif has_ignored_phrase:
234 ignored.append(text)
235
236 banned_word_errors.extend(check_banned_words(text))
237
238 return sorted(errors), sorted(ignored), sorted(banned_word_errors)
239
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/tools/lib/capitalization.py b/tools/lib/capitalization.py
--- a/tools/lib/capitalization.py
+++ b/tools/lib/capitalization.py
@@ -78,6 +78,9 @@
r"more topics",
# Used alone in a parenthetical where capitalized looks worse.
r"^deprecated$",
+ # We want the similar text in the Private Messages section to have the same capitalization.
+ r"more conversations",
+ r"back to streams",
# Capital 'i' looks weird in reminders popover
r"in 1 hour",
r"in 20 minutes",
| {"golden_diff": "diff --git a/tools/lib/capitalization.py b/tools/lib/capitalization.py\n--- a/tools/lib/capitalization.py\n+++ b/tools/lib/capitalization.py\n@@ -78,6 +78,9 @@\n r\"more topics\",\n # Used alone in a parenthetical where capitalized looks worse.\n r\"^deprecated$\",\n+ # We want the similar text in the Private Messages section to have the same capitalization.\n+ r\"more conversations\",\n+ r\"back to streams\",\n # Capital 'i' looks weird in reminders popover\n r\"in 1 hour\",\n r\"in 20 minutes\",\n", "issue": "Create collapsible \"Private messages\" section in left sidebar\nAt present, private messages are collapsed in the left sidebar, unless the user is in a private message narrow. This has a few down sides:\r\n\r\n1. Getting to a PM conversation generally requires multiple clicks.\r\n2. It's not immediately clear who send you a new private message, which is important for determining whether one needs to read it right away.\r\n3. It can be hard for new users to figure out how to view and send private messages.\r\n\r\nIn order to address this, we should try making a private messages section in the left sidebar that is open by default. Specifically:\r\n\r\n1. Make a Private messages section just above STREAMS in the left sidebar that is open by default.\r\n2. In the new PMs section, use the same algorithm we use for stream topics to decide how many conversations to show.\r\n3. Make the PMs section collapsible, similar to the collapsible sections in #20072. The open/collapsed state should be sticky as the user navigates around Zulip, closes and reopens the window, logs out and in, etc.\r\n\r\nNote that this will likely require experimentation for us to get it right. To avoid misdirected effort, please post screenshots in the #design stream on chat.zulip.org for feedback. Also, if (3) can't be implemented quickly, we can test the experience in chat.zulip.org without waiting for it to be completed.\r\n\r\n[Prior discussion on CZO](https://chat.zulip.org/#narrow/stream/101-design/topic/private.20messages.20UI/near/1159032).\r\n\r\nSee also #11108.\n", "before_files": [{"content": "import re\nfrom typing import List, Match, Tuple\n\nfrom bs4 import BeautifulSoup\n\n# The phrases in this list will be ignored. The longest phrase is\n# tried first; this removes the chance of smaller phrases changing\n# the text before longer phrases are tried.\n# The errors shown by `tools/check-capitalization` can be added to\n# this list without any modification.\nIGNORED_PHRASES = [\n # Proper nouns and acronyms\n r\"API\",\n r\"APNS\",\n r\"Botserver\",\n r\"Cookie Bot\",\n r\"DevAuthBackend\",\n r\"GCM\",\n r\"GitHub\",\n r\"Gravatar\",\n r\"Help Center\",\n r\"HTTP\",\n r\"ID\",\n r\"IDs\",\n r\"IP\",\n r\"JSON\",\n r\"Kerberos\",\n r\"LDAP\",\n r\"Markdown\",\n r\"OTP\",\n r\"Pivotal\",\n r\"PM\",\n r\"PMs\",\n r\"Slack\",\n r\"Google\",\n r\"Terms of Service\",\n r\"Tuesday\",\n r\"URL\",\n r\"UUID\",\n r\"Webathena\",\n r\"WordPress\",\n r\"Zephyr\",\n r\"Zoom\",\n r\"Zulip\",\n r\"Zulip Server\",\n r\"Zulip Account Security\",\n r\"Zulip Security\",\n r\"Zulip Cloud Standard\",\n r\"BigBlueButton\",\n # Code things\n r\"\\.zuliprc\",\n # BeautifulSoup will remove <z-user> which is horribly confusing,\n # so we need more of the sentence.\n r\"<z-user></z-user> will have the same role\",\n # Things using \"I\"\n r\"I understand\",\n r\"I'm\",\n r\"I've\",\n # Specific short words\n r\"beta\",\n r\"and\",\n r\"bot\",\n r\"e\\.g\\.\",\n r\"enabled\",\n r\"signups\",\n # Placeholders\n r\"keyword\",\n r\"streamname\",\n r\"user@example\\.com\",\n # Fragments of larger strings\n (r\"your subscriptions on your Streams page\"),\n r\"Add global time<br />Everyone sees global times in their own time zone\\.\",\n r\"user\",\n r\"an unknown operating system\",\n r\"Go to Settings\",\n # SPECIAL CASES\n # Because topics usually are lower-case, this would look weird if it were capitalized\n r\"more topics\",\n # Used alone in a parenthetical where capitalized looks worse.\n r\"^deprecated$\",\n # Capital 'i' looks weird in reminders popover\n r\"in 1 hour\",\n r\"in 20 minutes\",\n r\"in 3 hours\",\n # these are used as topics\n r\"^new streams$\",\n r\"^stream events$\",\n # These are used as example short names (e.g. an uncapitalized context):\n r\"^marketing$\",\n r\"^cookie$\",\n # Used to refer custom time limits\n r\"\\bN\\b\",\n # Capital c feels obtrusive in clear status option\n r\"clear\",\n r\"group private messages with \\{recipient\\}\",\n r\"private messages with \\{recipient\\}\",\n r\"private messages with yourself\",\n r\"GIF\",\n # Emoji name placeholder\n r\"leafy green vegetable\",\n # Subdomain placeholder\n r\"your-organization-url\",\n # Used in invite modal\n r\"or\",\n # Used in GIPHY popover.\n r\"GIFs\",\n r\"GIPHY\",\n # Used in our case studies\n r\"Technical University of Munich\",\n r\"University of California San Diego\",\n # Used in stream creation form\n r\"email hidden\",\n # Use in compose box.\n r\"to send\",\n r\"to add a new line\",\n # Used in showing Notification Bot read receipts message\n \"Notification Bot\",\n # Used in presence_enabled setting label\n r\"invisible mode off\",\n # Typeahead suggestions for \"Pronouns\" custom field type.\n r\"he/him\",\n r\"she/her\",\n r\"they/them\",\n]\n\n# Sort regexes in descending order of their lengths. As a result, the\n# longer phrases will be ignored first.\nIGNORED_PHRASES.sort(key=lambda regex: len(regex), reverse=True)\n\n# Compile regexes to improve performance. This also extracts the\n# text using BeautifulSoup and then removes extra whitespaces from\n# it. This step enables us to add HTML in our regexes directly.\nCOMPILED_IGNORED_PHRASES = [\n re.compile(\" \".join(BeautifulSoup(regex, \"lxml\").text.split())) for regex in IGNORED_PHRASES\n]\n\nSPLIT_BOUNDARY = \"?.!\" # Used to split string into sentences.\nSPLIT_BOUNDARY_REGEX = re.compile(rf\"[{SPLIT_BOUNDARY}]\")\n\n# Regexes which check capitalization in sentences.\nDISALLOWED = [\n r\"^[a-z](?!\\})\", # Checks if the sentence starts with a lower case character.\n r\"^[A-Z][a-z]+[\\sa-z0-9]+[A-Z]\", # Checks if an upper case character exists\n # after a lower case character when the first character is in upper case.\n]\nDISALLOWED_REGEX = re.compile(r\"|\".join(DISALLOWED))\n\nBANNED_WORDS = {\n \"realm\": \"The term realm should not appear in user-facing strings. Use organization instead.\",\n}\n\n\ndef get_safe_phrase(phrase: str) -> str:\n \"\"\"\n Safe phrase is in lower case and doesn't contain characters which can\n conflict with split boundaries. All conflicting characters are replaced\n with low dash (_).\n \"\"\"\n phrase = SPLIT_BOUNDARY_REGEX.sub(\"_\", phrase)\n return phrase.lower()\n\n\ndef replace_with_safe_phrase(matchobj: Match[str]) -> str:\n \"\"\"\n The idea is to convert IGNORED_PHRASES into safe phrases, see\n `get_safe_phrase()` function. The only exception is when the\n IGNORED_PHRASE is at the start of the text or after a split\n boundary; in this case, we change the first letter of the phrase\n to upper case.\n \"\"\"\n ignored_phrase = matchobj.group(0)\n safe_string = get_safe_phrase(ignored_phrase)\n\n start_index = matchobj.start()\n complete_string = matchobj.string\n\n is_string_start = start_index == 0\n # We expect that there will be one space between split boundary\n # and the next word.\n punctuation = complete_string[max(start_index - 2, 0)]\n is_after_split_boundary = punctuation in SPLIT_BOUNDARY\n if is_string_start or is_after_split_boundary:\n return safe_string.capitalize()\n\n return safe_string\n\n\ndef get_safe_text(text: str) -> str:\n \"\"\"\n This returns text which is rendered by BeautifulSoup and is in the\n form that can be split easily and has all IGNORED_PHRASES processed.\n \"\"\"\n soup = BeautifulSoup(text, \"lxml\")\n text = \" \".join(soup.text.split()) # Remove extra whitespaces.\n for phrase_regex in COMPILED_IGNORED_PHRASES:\n text = phrase_regex.sub(replace_with_safe_phrase, text)\n\n return text\n\n\ndef is_capitalized(safe_text: str) -> bool:\n sentences = SPLIT_BOUNDARY_REGEX.split(safe_text)\n return not any(DISALLOWED_REGEX.search(sentence.strip()) for sentence in sentences)\n\n\ndef check_banned_words(text: str) -> List[str]:\n lower_cased_text = text.lower()\n errors = []\n for word, reason in BANNED_WORDS.items():\n if word in lower_cased_text:\n # Hack: Should move this into BANNED_WORDS framework; for\n # now, just hand-code the skips:\n if \"realm_name\" in lower_cased_text:\n continue\n kwargs = dict(word=word, text=text, reason=reason)\n msg = \"{word} found in '{text}'. {reason}\".format(**kwargs)\n errors.append(msg)\n\n return errors\n\n\ndef check_capitalization(strings: List[str]) -> Tuple[List[str], List[str], List[str]]:\n errors = []\n ignored = []\n banned_word_errors = []\n for text in strings:\n text = \" \".join(text.split()) # Remove extra whitespaces.\n safe_text = get_safe_text(text)\n has_ignored_phrase = text != safe_text\n capitalized = is_capitalized(safe_text)\n if not capitalized:\n errors.append(text)\n elif has_ignored_phrase:\n ignored.append(text)\n\n banned_word_errors.extend(check_banned_words(text))\n\n return sorted(errors), sorted(ignored), sorted(banned_word_errors)\n", "path": "tools/lib/capitalization.py"}], "after_files": [{"content": "import re\nfrom typing import List, Match, Tuple\n\nfrom bs4 import BeautifulSoup\n\n# The phrases in this list will be ignored. The longest phrase is\n# tried first; this removes the chance of smaller phrases changing\n# the text before longer phrases are tried.\n# The errors shown by `tools/check-capitalization` can be added to\n# this list without any modification.\nIGNORED_PHRASES = [\n # Proper nouns and acronyms\n r\"API\",\n r\"APNS\",\n r\"Botserver\",\n r\"Cookie Bot\",\n r\"DevAuthBackend\",\n r\"GCM\",\n r\"GitHub\",\n r\"Gravatar\",\n r\"Help Center\",\n r\"HTTP\",\n r\"ID\",\n r\"IDs\",\n r\"IP\",\n r\"JSON\",\n r\"Kerberos\",\n r\"LDAP\",\n r\"Markdown\",\n r\"OTP\",\n r\"Pivotal\",\n r\"PM\",\n r\"PMs\",\n r\"Slack\",\n r\"Google\",\n r\"Terms of Service\",\n r\"Tuesday\",\n r\"URL\",\n r\"UUID\",\n r\"Webathena\",\n r\"WordPress\",\n r\"Zephyr\",\n r\"Zoom\",\n r\"Zulip\",\n r\"Zulip Server\",\n r\"Zulip Account Security\",\n r\"Zulip Security\",\n r\"Zulip Cloud Standard\",\n r\"BigBlueButton\",\n # Code things\n r\"\\.zuliprc\",\n # BeautifulSoup will remove <z-user> which is horribly confusing,\n # so we need more of the sentence.\n r\"<z-user></z-user> will have the same role\",\n # Things using \"I\"\n r\"I understand\",\n r\"I'm\",\n r\"I've\",\n # Specific short words\n r\"beta\",\n r\"and\",\n r\"bot\",\n r\"e\\.g\\.\",\n r\"enabled\",\n r\"signups\",\n # Placeholders\n r\"keyword\",\n r\"streamname\",\n r\"user@example\\.com\",\n # Fragments of larger strings\n (r\"your subscriptions on your Streams page\"),\n r\"Add global time<br />Everyone sees global times in their own time zone\\.\",\n r\"user\",\n r\"an unknown operating system\",\n r\"Go to Settings\",\n # SPECIAL CASES\n # Because topics usually are lower-case, this would look weird if it were capitalized\n r\"more topics\",\n # Used alone in a parenthetical where capitalized looks worse.\n r\"^deprecated$\",\n # We want the similar text in the Private Messages section to have the same capitalization.\n r\"more conversations\",\n r\"back to streams\",\n # Capital 'i' looks weird in reminders popover\n r\"in 1 hour\",\n r\"in 20 minutes\",\n r\"in 3 hours\",\n # these are used as topics\n r\"^new streams$\",\n r\"^stream events$\",\n # These are used as example short names (e.g. an uncapitalized context):\n r\"^marketing$\",\n r\"^cookie$\",\n # Used to refer custom time limits\n r\"\\bN\\b\",\n # Capital c feels obtrusive in clear status option\n r\"clear\",\n r\"group private messages with \\{recipient\\}\",\n r\"private messages with \\{recipient\\}\",\n r\"private messages with yourself\",\n r\"GIF\",\n # Emoji name placeholder\n r\"leafy green vegetable\",\n # Subdomain placeholder\n r\"your-organization-url\",\n # Used in invite modal\n r\"or\",\n # Used in GIPHY popover.\n r\"GIFs\",\n r\"GIPHY\",\n # Used in our case studies\n r\"Technical University of Munich\",\n r\"University of California San Diego\",\n # Used in stream creation form\n r\"email hidden\",\n # Use in compose box.\n r\"to send\",\n r\"to add a new line\",\n # Used in showing Notification Bot read receipts message\n \"Notification Bot\",\n # Used in presence_enabled setting label\n r\"invisible mode off\",\n # Typeahead suggestions for \"Pronouns\" custom field type.\n r\"he/him\",\n r\"she/her\",\n r\"they/them\",\n]\n\n# Sort regexes in descending order of their lengths. As a result, the\n# longer phrases will be ignored first.\nIGNORED_PHRASES.sort(key=lambda regex: len(regex), reverse=True)\n\n# Compile regexes to improve performance. This also extracts the\n# text using BeautifulSoup and then removes extra whitespaces from\n# it. This step enables us to add HTML in our regexes directly.\nCOMPILED_IGNORED_PHRASES = [\n re.compile(\" \".join(BeautifulSoup(regex, \"lxml\").text.split())) for regex in IGNORED_PHRASES\n]\n\nSPLIT_BOUNDARY = \"?.!\" # Used to split string into sentences.\nSPLIT_BOUNDARY_REGEX = re.compile(rf\"[{SPLIT_BOUNDARY}]\")\n\n# Regexes which check capitalization in sentences.\nDISALLOWED = [\n r\"^[a-z](?!\\})\", # Checks if the sentence starts with a lower case character.\n r\"^[A-Z][a-z]+[\\sa-z0-9]+[A-Z]\", # Checks if an upper case character exists\n # after a lower case character when the first character is in upper case.\n]\nDISALLOWED_REGEX = re.compile(r\"|\".join(DISALLOWED))\n\nBANNED_WORDS = {\n \"realm\": \"The term realm should not appear in user-facing strings. Use organization instead.\",\n}\n\n\ndef get_safe_phrase(phrase: str) -> str:\n \"\"\"\n Safe phrase is in lower case and doesn't contain characters which can\n conflict with split boundaries. All conflicting characters are replaced\n with low dash (_).\n \"\"\"\n phrase = SPLIT_BOUNDARY_REGEX.sub(\"_\", phrase)\n return phrase.lower()\n\n\ndef replace_with_safe_phrase(matchobj: Match[str]) -> str:\n \"\"\"\n The idea is to convert IGNORED_PHRASES into safe phrases, see\n `get_safe_phrase()` function. The only exception is when the\n IGNORED_PHRASE is at the start of the text or after a split\n boundary; in this case, we change the first letter of the phrase\n to upper case.\n \"\"\"\n ignored_phrase = matchobj.group(0)\n safe_string = get_safe_phrase(ignored_phrase)\n\n start_index = matchobj.start()\n complete_string = matchobj.string\n\n is_string_start = start_index == 0\n # We expect that there will be one space between split boundary\n # and the next word.\n punctuation = complete_string[max(start_index - 2, 0)]\n is_after_split_boundary = punctuation in SPLIT_BOUNDARY\n if is_string_start or is_after_split_boundary:\n return safe_string.capitalize()\n\n return safe_string\n\n\ndef get_safe_text(text: str) -> str:\n \"\"\"\n This returns text which is rendered by BeautifulSoup and is in the\n form that can be split easily and has all IGNORED_PHRASES processed.\n \"\"\"\n soup = BeautifulSoup(text, \"lxml\")\n text = \" \".join(soup.text.split()) # Remove extra whitespaces.\n for phrase_regex in COMPILED_IGNORED_PHRASES:\n text = phrase_regex.sub(replace_with_safe_phrase, text)\n\n return text\n\n\ndef is_capitalized(safe_text: str) -> bool:\n sentences = SPLIT_BOUNDARY_REGEX.split(safe_text)\n return not any(DISALLOWED_REGEX.search(sentence.strip()) for sentence in sentences)\n\n\ndef check_banned_words(text: str) -> List[str]:\n lower_cased_text = text.lower()\n errors = []\n for word, reason in BANNED_WORDS.items():\n if word in lower_cased_text:\n # Hack: Should move this into BANNED_WORDS framework; for\n # now, just hand-code the skips:\n if \"realm_name\" in lower_cased_text:\n continue\n kwargs = dict(word=word, text=text, reason=reason)\n msg = \"{word} found in '{text}'. {reason}\".format(**kwargs)\n errors.append(msg)\n\n return errors\n\n\ndef check_capitalization(strings: List[str]) -> Tuple[List[str], List[str], List[str]]:\n errors = []\n ignored = []\n banned_word_errors = []\n for text in strings:\n text = \" \".join(text.split()) # Remove extra whitespaces.\n safe_text = get_safe_text(text)\n has_ignored_phrase = text != safe_text\n capitalized = is_capitalized(safe_text)\n if not capitalized:\n errors.append(text)\n elif has_ignored_phrase:\n ignored.append(text)\n\n banned_word_errors.extend(check_banned_words(text))\n\n return sorted(errors), sorted(ignored), sorted(banned_word_errors)\n", "path": "tools/lib/capitalization.py"}]} | 3,137 | 140 |
gh_patches_debug_9063 | rasdani/github-patches | git_diff | pypa__virtualenv-1886 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`virtualenv --version` prints spurious error as of 20.0.24
**Issue**
When running `virtualenv --version`, a logger error is printed to stderr, though the return code is still 0.
**Environment**
Tested with Python 3.7 and 3.8, virtualenvs managed with pipenv
Ubuntu 18.04 on WSL
```
$ rm-rf tmp && mkdir tmp && cd tmp
$ pipenv install "virtualenv==20.0.23"
$ pipenv run virtualenv --version
virtualenv 20.0.23 from tmp/.venv/lib/python3.7/site-packages/virtualenv/__init__.py
$ rm-rf tmp && mkdir tmp && cd tmp
$ pipenv install "virtualenv==20.0.24"
$ pipenv run virtualenv --version
virtualenv 20.0.24 from tmp/.venv/lib/python3.7/site-packages/virtualenv/__init__.py
ERROR:root:SystemExit: 0
$ pipenv run virtualenv --version > /dev/null
ERROR:root:SystemExit: 0
$ echo $?
0
```
Nothing else is printed with `-vvv --with-traceback`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/virtualenv/__main__.py`
Content:
```
1 from __future__ import absolute_import, print_function, unicode_literals
2
3 import logging
4 import os
5 import sys
6 from datetime import datetime
7
8
9 def run(args=None, options=None):
10 start = datetime.now()
11 from virtualenv.util.error import ProcessCallFailed
12 from virtualenv.run import cli_run
13
14 if args is None:
15 args = sys.argv[1:]
16 try:
17 session = cli_run(args, options)
18 logging.warning(LogSession(session, start))
19 except ProcessCallFailed as exception:
20 print("subprocess call failed for {} with code {}".format(exception.cmd, exception.code))
21 print(exception.out, file=sys.stdout, end="")
22 print(exception.err, file=sys.stderr, end="")
23 raise SystemExit(exception.code)
24
25
26 class LogSession(object):
27 def __init__(self, session, start):
28 self.session = session
29 self.start = start
30
31 def __str__(self):
32 from virtualenv.util.six import ensure_text
33
34 spec = self.session.creator.interpreter.spec
35 elapsed = (datetime.now() - self.start).total_seconds() * 1000
36 lines = [
37 "created virtual environment {} in {:.0f}ms".format(spec, elapsed),
38 " creator {}".format(ensure_text(str(self.session.creator))),
39 ]
40 if self.session.seeder.enabled:
41 lines += (
42 " seeder {}".format(ensure_text(str(self.session.seeder))),
43 " added seed packages: {}".format(
44 ", ".join(
45 sorted(
46 "==".join(i.stem.split("-"))
47 for i in self.session.creator.purelib.iterdir()
48 if i.suffix == ".dist-info"
49 ),
50 ),
51 ),
52 )
53 if self.session.activators:
54 lines.append(" activators {}".format(",".join(i.__class__.__name__ for i in self.session.activators)))
55 return os.linesep.join(lines)
56
57
58 def run_with_catch(args=None):
59 from virtualenv.config.cli.parser import VirtualEnvOptions
60
61 options = VirtualEnvOptions()
62 try:
63 run(args, options)
64 except (KeyboardInterrupt, SystemExit, Exception) as exception:
65 try:
66 if getattr(options, "with_traceback", False):
67 raise
68 else:
69 logging.error("%s: %s", type(exception).__name__, exception)
70 code = exception.code if isinstance(exception, SystemExit) else 1
71 sys.exit(code)
72 finally:
73 logging.shutdown() # force flush of log messages before the trace is printed
74
75
76 if __name__ == "__main__": # pragma: no cov
77 run_with_catch() # pragma: no cov
78
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/virtualenv/__main__.py b/src/virtualenv/__main__.py
--- a/src/virtualenv/__main__.py
+++ b/src/virtualenv/__main__.py
@@ -66,7 +66,8 @@
if getattr(options, "with_traceback", False):
raise
else:
- logging.error("%s: %s", type(exception).__name__, exception)
+ if not (isinstance(exception, SystemExit) and exception.code == 0):
+ logging.error("%s: %s", type(exception).__name__, exception)
code = exception.code if isinstance(exception, SystemExit) else 1
sys.exit(code)
finally:
| {"golden_diff": "diff --git a/src/virtualenv/__main__.py b/src/virtualenv/__main__.py\n--- a/src/virtualenv/__main__.py\n+++ b/src/virtualenv/__main__.py\n@@ -66,7 +66,8 @@\n if getattr(options, \"with_traceback\", False):\n raise\n else:\n- logging.error(\"%s: %s\", type(exception).__name__, exception)\n+ if not (isinstance(exception, SystemExit) and exception.code == 0):\n+ logging.error(\"%s: %s\", type(exception).__name__, exception)\n code = exception.code if isinstance(exception, SystemExit) else 1\n sys.exit(code)\n finally:\n", "issue": "`virtualenv --version` prints spurious error as of 20.0.24\n**Issue**\r\n\r\nWhen running `virtualenv --version`, a logger error is printed to stderr, though the return code is still 0.\r\n\r\n**Environment**\r\n\r\nTested with Python 3.7 and 3.8, virtualenvs managed with pipenv\r\nUbuntu 18.04 on WSL\r\n\r\n```\r\n$ rm-rf tmp && mkdir tmp && cd tmp\r\n$ pipenv install \"virtualenv==20.0.23\"\r\n$ pipenv run virtualenv --version\r\nvirtualenv 20.0.23 from tmp/.venv/lib/python3.7/site-packages/virtualenv/__init__.py\r\n\r\n$ rm-rf tmp && mkdir tmp && cd tmp\r\n$ pipenv install \"virtualenv==20.0.24\"\r\n$ pipenv run virtualenv --version\r\nvirtualenv 20.0.24 from tmp/.venv/lib/python3.7/site-packages/virtualenv/__init__.py\r\nERROR:root:SystemExit: 0\r\n$ pipenv run virtualenv --version > /dev/null\r\nERROR:root:SystemExit: 0\r\n$ echo $?\r\n0\r\n```\r\n\r\nNothing else is printed with `-vvv --with-traceback`\n", "before_files": [{"content": "from __future__ import absolute_import, print_function, unicode_literals\n\nimport logging\nimport os\nimport sys\nfrom datetime import datetime\n\n\ndef run(args=None, options=None):\n start = datetime.now()\n from virtualenv.util.error import ProcessCallFailed\n from virtualenv.run import cli_run\n\n if args is None:\n args = sys.argv[1:]\n try:\n session = cli_run(args, options)\n logging.warning(LogSession(session, start))\n except ProcessCallFailed as exception:\n print(\"subprocess call failed for {} with code {}\".format(exception.cmd, exception.code))\n print(exception.out, file=sys.stdout, end=\"\")\n print(exception.err, file=sys.stderr, end=\"\")\n raise SystemExit(exception.code)\n\n\nclass LogSession(object):\n def __init__(self, session, start):\n self.session = session\n self.start = start\n\n def __str__(self):\n from virtualenv.util.six import ensure_text\n\n spec = self.session.creator.interpreter.spec\n elapsed = (datetime.now() - self.start).total_seconds() * 1000\n lines = [\n \"created virtual environment {} in {:.0f}ms\".format(spec, elapsed),\n \" creator {}\".format(ensure_text(str(self.session.creator))),\n ]\n if self.session.seeder.enabled:\n lines += (\n \" seeder {}\".format(ensure_text(str(self.session.seeder))),\n \" added seed packages: {}\".format(\n \", \".join(\n sorted(\n \"==\".join(i.stem.split(\"-\"))\n for i in self.session.creator.purelib.iterdir()\n if i.suffix == \".dist-info\"\n ),\n ),\n ),\n )\n if self.session.activators:\n lines.append(\" activators {}\".format(\",\".join(i.__class__.__name__ for i in self.session.activators)))\n return os.linesep.join(lines)\n\n\ndef run_with_catch(args=None):\n from virtualenv.config.cli.parser import VirtualEnvOptions\n\n options = VirtualEnvOptions()\n try:\n run(args, options)\n except (KeyboardInterrupt, SystemExit, Exception) as exception:\n try:\n if getattr(options, \"with_traceback\", False):\n raise\n else:\n logging.error(\"%s: %s\", type(exception).__name__, exception)\n code = exception.code if isinstance(exception, SystemExit) else 1\n sys.exit(code)\n finally:\n logging.shutdown() # force flush of log messages before the trace is printed\n\n\nif __name__ == \"__main__\": # pragma: no cov\n run_with_catch() # pragma: no cov\n", "path": "src/virtualenv/__main__.py"}], "after_files": [{"content": "from __future__ import absolute_import, print_function, unicode_literals\n\nimport logging\nimport os\nimport sys\nfrom datetime import datetime\n\n\ndef run(args=None, options=None):\n start = datetime.now()\n from virtualenv.util.error import ProcessCallFailed\n from virtualenv.run import cli_run\n\n if args is None:\n args = sys.argv[1:]\n try:\n session = cli_run(args, options)\n logging.warning(LogSession(session, start))\n except ProcessCallFailed as exception:\n print(\"subprocess call failed for {} with code {}\".format(exception.cmd, exception.code))\n print(exception.out, file=sys.stdout, end=\"\")\n print(exception.err, file=sys.stderr, end=\"\")\n raise SystemExit(exception.code)\n\n\nclass LogSession(object):\n def __init__(self, session, start):\n self.session = session\n self.start = start\n\n def __str__(self):\n from virtualenv.util.six import ensure_text\n\n spec = self.session.creator.interpreter.spec\n elapsed = (datetime.now() - self.start).total_seconds() * 1000\n lines = [\n \"created virtual environment {} in {:.0f}ms\".format(spec, elapsed),\n \" creator {}\".format(ensure_text(str(self.session.creator))),\n ]\n if self.session.seeder.enabled:\n lines += (\n \" seeder {}\".format(ensure_text(str(self.session.seeder))),\n \" added seed packages: {}\".format(\n \", \".join(\n sorted(\n \"==\".join(i.stem.split(\"-\"))\n for i in self.session.creator.purelib.iterdir()\n if i.suffix == \".dist-info\"\n ),\n ),\n ),\n )\n if self.session.activators:\n lines.append(\" activators {}\".format(\",\".join(i.__class__.__name__ for i in self.session.activators)))\n return os.linesep.join(lines)\n\n\ndef run_with_catch(args=None):\n from virtualenv.config.cli.parser import VirtualEnvOptions\n\n options = VirtualEnvOptions()\n try:\n run(args, options)\n except (KeyboardInterrupt, SystemExit, Exception) as exception:\n try:\n if getattr(options, \"with_traceback\", False):\n raise\n else:\n if not (isinstance(exception, SystemExit) and exception.code == 0):\n logging.error(\"%s: %s\", type(exception).__name__, exception)\n code = exception.code if isinstance(exception, SystemExit) else 1\n sys.exit(code)\n finally:\n logging.shutdown() # force flush of log messages before the trace is printed\n\n\nif __name__ == \"__main__\": # pragma: no cov\n run_with_catch() # pragma: no cov\n", "path": "src/virtualenv/__main__.py"}]} | 1,250 | 151 |
gh_patches_debug_22538 | rasdani/github-patches | git_diff | networkx__networkx-1976 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
all_node_cuts returns too few and incorrect cuts.
This could be a documentation issue, a bug or a user understanding issue. Are these cases pathological for the algorithm?
Given a square graph:
```
a -- b
| |
c -- d
```
Based on a cursory reading of the documentation, I would have expected all_node_cuts() to return:
```
[{'a','d'}, {'c','b'}]
```
I get `[{'a','c'}]` but if this is a valid node cut then surely {a,b}, {b,d}, {c,d} are also equally valid and a function called "all node cuts" should return them.
```
G = nx.Graph([('a','b'), ('a','c'), ('c','d'), ('b','d')])
print( G.nodes() )
print( G.edges() )
print( list(nx.all_node_cuts(G)) )
>>> ['a', 'c', 'b', 'd']
>>> [('a', 'c'), ('a', 'b'), ('c', 'd'), ('b', 'd')]
>>> [{'a', 'c'}]
```
Expanding to a hexagon, we see similar pattern of node cuts. There are many isometric node cuts omitted from the results list. Two of the proposed cuts fail to create "two or more connected components" as the documentation suggests.
```
G = nx.Graph([('a','b'), ('b','c'), ('c','d'),('d','e'), ('e','f'),('f','a')])
list(nx.all_node_cuts(G))
>>> [{'a', 'c'}, {'a', 'b'}, {'b', 'c'}]
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `networkx/algorithms/connectivity/kcutsets.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 """
3 Kanevsky all minimum node k cutsets algorithm.
4 """
5 from operator import itemgetter
6
7 import networkx as nx
8 from .utils import build_auxiliary_node_connectivity
9 from networkx.algorithms.flow import (
10 build_residual_network,
11 edmonds_karp,
12 shortest_augmenting_path,
13 )
14 default_flow_func = edmonds_karp
15
16
17 __author__ = '\n'.join(['Jordi Torrents <[email protected]>'])
18
19 __all__ = ['all_node_cuts']
20
21 def all_node_cuts(G, k=None, flow_func=None):
22 r"""Returns all minimum k cutsets of an undirected graph G.
23
24 This implementation is based on Kanevsky's algorithm [1]_ for finding all
25 minimum-size node cut-sets of an undirected graph G; ie the set (or sets)
26 of nodes of cardinality equal to the node connectivity of G. Thus if
27 removed, would break G into two or more connected components.
28
29 Parameters
30 ----------
31 G : NetworkX graph
32 Undirected graph
33
34 k : Integer
35 Node connectivity of the input graph. If k is None, then it is
36 computed. Default value: None.
37
38 flow_func : function
39 Function to perform the underlying flow computations. Default value
40 edmonds_karp. This function performs better in sparse graphs with
41 right tailed degree distributions. shortest_augmenting_path will
42 perform better in denser graphs.
43
44
45 Returns
46 -------
47 cuts : a generator of node cutsets
48 Each node cutset has cardinality equal to the node connectivity of
49 the input graph.
50
51 Examples
52 --------
53 >>> # A two-dimensional grid graph has 4 cutsets of cardinality 2
54 >>> G = nx.grid_2d_graph(5, 5)
55 >>> cutsets = list(nx.all_node_cuts(G))
56 >>> len(cutsets)
57 4
58 >>> all(2 == len(cutset) for cutset in cutsets)
59 True
60 >>> nx.node_connectivity(G)
61 2
62
63 Notes
64 -----
65 This implementation is based on the sequential algorithm for finding all
66 minimum-size separating vertex sets in a graph [1]_. The main idea is to
67 compute minimum cuts using local maximum flow computations among a set
68 of nodes of highest degree and all other non-adjacent nodes in the Graph.
69 Once we find a minimum cut, we add an edge between the high degree
70 node and the target node of the local maximum flow computation to make
71 sure that we will not find that minimum cut again.
72
73 See also
74 --------
75 node_connectivity
76 edmonds_karp
77 shortest_augmenting_path
78
79 References
80 ----------
81 .. [1] Kanevsky, A. (1993). Finding all minimum-size separating vertex
82 sets in a graph. Networks 23(6), 533--541.
83 http://onlinelibrary.wiley.com/doi/10.1002/net.3230230604/abstract
84
85 """
86 if not nx.is_connected(G):
87 raise nx.NetworkXError('Input graph is disconnected.')
88
89 # Initialize data structures.
90 # Keep track of the cuts already computed so we do not repeat them.
91 seen = []
92 # Even-Tarjan reduction is what we call auxiliary digraph
93 # for node connectivity.
94 H = build_auxiliary_node_connectivity(G)
95 mapping = H.graph['mapping']
96 R = build_residual_network(H, 'capacity')
97 kwargs = dict(capacity='capacity', residual=R)
98 # Define default flow function
99 if flow_func is None:
100 flow_func = default_flow_func
101 if flow_func is shortest_augmenting_path:
102 kwargs['two_phase'] = True
103 # Begin the actual algorithm
104 # step 1: Find node connectivity k of G
105 if k is None:
106 k = nx.node_connectivity(G, flow_func=flow_func)
107 # step 2:
108 # Find k nodes with top degree, call it X:
109 X = {n for n, d in sorted(G.degree(), key=itemgetter(1), reverse=True)[:k]}
110 # Check if X is a k-node-cutset
111 if _is_separating_set(G, X):
112 seen.append(X)
113 yield X
114
115 for x in X:
116 # step 3: Compute local connectivity flow of x with all other
117 # non adjacent nodes in G
118 non_adjacent = set(G) - X - set(G[x])
119 for v in non_adjacent:
120 # step 4: compute maximum flow in an Even-Tarjan reduction H of G
121 # and step:5 build the associated residual network R
122 R = flow_func(H, '%sB' % mapping[x], '%sA' % mapping[v], **kwargs)
123 flow_value = R.graph['flow_value']
124
125 if flow_value == k:
126 ## Remove saturated edges form the residual network
127 saturated_edges = [(u, w, d) for (u, w, d) in
128 R.edges(data=True)
129 if d['capacity'] == d['flow']]
130 R.remove_edges_from(saturated_edges)
131 # step 6: shrink the strongly connected components of
132 # residual flow network R and call it L
133 L = nx.condensation(R)
134 cmap = L.graph['mapping']
135 # step 7: Compute antichains of L; they map to closed sets in H
136 # Any edge in H that links a closed set is part of a cutset
137 for antichain in nx.antichains(L):
138 # Nodes in an antichain of the condensation graph of
139 # the residual network map to a closed set of nodes that
140 # define a node partition of the auxiliary digraph H.
141 S = {n for n, scc in cmap.items() if scc in antichain}
142 # Find the cutset that links the node partition (S,~S) in H
143 cutset = set()
144 for u in S:
145 cutset.update((u, w) for w in H[u] if w not in S)
146 # The edges in H that form the cutset are internal edges
147 # (ie edges that represent a node of the original graph G)
148 node_cut = {H.node[n]['id'] for edge in cutset for n in edge}
149
150 if len(node_cut) == k:
151 if node_cut not in seen:
152 yield node_cut
153 seen.append(node_cut)
154 # Add an edge (x, v) to make sure that we do not
155 # find this cutset again. This is equivalent
156 # of adding the edge in the input graph
157 # G.add_edge(x, v) and then regenerate H and R:
158 # Add edges to the auxiliary digraph.
159 H.add_edge('%sB' % mapping[x], '%sA' % mapping[v],
160 capacity=1)
161 H.add_edge('%sB' % mapping[v], '%sA' % mapping[x],
162 capacity=1)
163 # Add edges to the residual network.
164 R.add_edge('%sB' % mapping[x], '%sA' % mapping[v],
165 capacity=1)
166 R.add_edge('%sA' % mapping[v], '%sB' % mapping[x],
167 capacity=1)
168 break
169 # Add again the saturated edges to reuse the residual network
170 R.add_edges_from(saturated_edges)
171
172
173 def _is_separating_set(G, cut):
174 """Assumes that the input graph is connected"""
175 if len(cut) == len(G) - 1:
176 return True
177
178 H = G.copy(with_data=False)
179 H.remove_nodes_from(cut)
180 if nx.is_connected(H):
181 return False
182 return True
183
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/networkx/algorithms/connectivity/kcutsets.py b/networkx/algorithms/connectivity/kcutsets.py
--- a/networkx/algorithms/connectivity/kcutsets.py
+++ b/networkx/algorithms/connectivity/kcutsets.py
@@ -3,6 +3,7 @@
Kanevsky all minimum node k cutsets algorithm.
"""
from operator import itemgetter
+from itertools import combinations
import networkx as nx
from .utils import build_auxiliary_node_connectivity
@@ -86,6 +87,22 @@
if not nx.is_connected(G):
raise nx.NetworkXError('Input graph is disconnected.')
+ # Addess some corner cases first.
+ # For cycle graphs
+ if G.order() == G.size():
+ if all(2 == d for n, d in G.degree()):
+ seen = set()
+ for u in G:
+ for v in nx.non_neighbors(G, u):
+ if (u, v) not in seen and (v, u) not in seen:
+ yield {v, u}
+ seen.add((v, u))
+ return
+ # For complete Graphs
+ if nx.density(G) == 1:
+ for cut_set in combinations(G, len(G)-1):
+ yield set(cut_set)
+ return
# Initialize data structures.
# Keep track of the cuts already computed so we do not repeat them.
seen = []
| {"golden_diff": "diff --git a/networkx/algorithms/connectivity/kcutsets.py b/networkx/algorithms/connectivity/kcutsets.py\n--- a/networkx/algorithms/connectivity/kcutsets.py\n+++ b/networkx/algorithms/connectivity/kcutsets.py\n@@ -3,6 +3,7 @@\n Kanevsky all minimum node k cutsets algorithm.\n \"\"\"\n from operator import itemgetter\n+from itertools import combinations\n \n import networkx as nx\n from .utils import build_auxiliary_node_connectivity\n@@ -86,6 +87,22 @@\n if not nx.is_connected(G):\n raise nx.NetworkXError('Input graph is disconnected.')\n \n+ # Addess some corner cases first.\n+ # For cycle graphs\n+ if G.order() == G.size():\n+ if all(2 == d for n, d in G.degree()):\n+ seen = set()\n+ for u in G:\n+ for v in nx.non_neighbors(G, u):\n+ if (u, v) not in seen and (v, u) not in seen:\n+ yield {v, u}\n+ seen.add((v, u))\n+ return\n+ # For complete Graphs\n+ if nx.density(G) == 1:\n+ for cut_set in combinations(G, len(G)-1):\n+ yield set(cut_set)\n+ return\n # Initialize data structures.\n # Keep track of the cuts already computed so we do not repeat them.\n seen = []\n", "issue": "all_node_cuts returns too few and incorrect cuts.\nThis could be a documentation issue, a bug or a user understanding issue. Are these cases pathological for the algorithm?\n\nGiven a square graph:\n\n```\n a -- b\n | |\n c -- d\n```\n\nBased on a cursory reading of the documentation, I would have expected all_node_cuts() to return:\n\n```\n[{'a','d'}, {'c','b'}]\n```\n\nI get `[{'a','c'}]` but if this is a valid node cut then surely {a,b}, {b,d}, {c,d} are also equally valid and a function called \"all node cuts\" should return them. \n\n```\nG = nx.Graph([('a','b'), ('a','c'), ('c','d'), ('b','d')])\nprint( G.nodes() )\nprint( G.edges() )\nprint( list(nx.all_node_cuts(G)) )\n\n>>> ['a', 'c', 'b', 'd']\n>>> [('a', 'c'), ('a', 'b'), ('c', 'd'), ('b', 'd')]\n>>> [{'a', 'c'}]\n```\n\nExpanding to a hexagon, we see similar pattern of node cuts. There are many isometric node cuts omitted from the results list. Two of the proposed cuts fail to create \"two or more connected components\" as the documentation suggests.\n\n```\nG = nx.Graph([('a','b'), ('b','c'), ('c','d'),('d','e'), ('e','f'),('f','a')])\nlist(nx.all_node_cuts(G))\n>>> [{'a', 'c'}, {'a', 'b'}, {'b', 'c'}]\n```\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"\nKanevsky all minimum node k cutsets algorithm.\n\"\"\"\nfrom operator import itemgetter\n\nimport networkx as nx\nfrom .utils import build_auxiliary_node_connectivity\nfrom networkx.algorithms.flow import (\n build_residual_network,\n edmonds_karp,\n shortest_augmenting_path,\n)\ndefault_flow_func = edmonds_karp\n\n\n__author__ = '\\n'.join(['Jordi Torrents <[email protected]>'])\n\n__all__ = ['all_node_cuts']\n\ndef all_node_cuts(G, k=None, flow_func=None):\n r\"\"\"Returns all minimum k cutsets of an undirected graph G. \n\n This implementation is based on Kanevsky's algorithm [1]_ for finding all\n minimum-size node cut-sets of an undirected graph G; ie the set (or sets) \n of nodes of cardinality equal to the node connectivity of G. Thus if \n removed, would break G into two or more connected components.\n \n Parameters\n ----------\n G : NetworkX graph\n Undirected graph\n\n k : Integer\n Node connectivity of the input graph. If k is None, then it is \n computed. Default value: None.\n\n flow_func : function\n Function to perform the underlying flow computations. Default value\n edmonds_karp. This function performs better in sparse graphs with\n right tailed degree distributions. shortest_augmenting_path will\n perform better in denser graphs.\n \n\n Returns\n -------\n cuts : a generator of node cutsets\n Each node cutset has cardinality equal to the node connectivity of\n the input graph.\n\n Examples\n --------\n >>> # A two-dimensional grid graph has 4 cutsets of cardinality 2\n >>> G = nx.grid_2d_graph(5, 5)\n >>> cutsets = list(nx.all_node_cuts(G))\n >>> len(cutsets)\n 4\n >>> all(2 == len(cutset) for cutset in cutsets)\n True\n >>> nx.node_connectivity(G)\n 2\n\n Notes\n -----\n This implementation is based on the sequential algorithm for finding all\n minimum-size separating vertex sets in a graph [1]_. The main idea is to\n compute minimum cuts using local maximum flow computations among a set \n of nodes of highest degree and all other non-adjacent nodes in the Graph.\n Once we find a minimum cut, we add an edge between the high degree\n node and the target node of the local maximum flow computation to make \n sure that we will not find that minimum cut again.\n\n See also\n --------\n node_connectivity\n edmonds_karp\n shortest_augmenting_path\n\n References\n ----------\n .. [1] Kanevsky, A. (1993). Finding all minimum-size separating vertex \n sets in a graph. Networks 23(6), 533--541.\n http://onlinelibrary.wiley.com/doi/10.1002/net.3230230604/abstract\n\n \"\"\"\n if not nx.is_connected(G):\n raise nx.NetworkXError('Input graph is disconnected.')\n\n # Initialize data structures.\n # Keep track of the cuts already computed so we do not repeat them.\n seen = []\n # Even-Tarjan reduction is what we call auxiliary digraph \n # for node connectivity.\n H = build_auxiliary_node_connectivity(G)\n mapping = H.graph['mapping']\n R = build_residual_network(H, 'capacity')\n kwargs = dict(capacity='capacity', residual=R)\n # Define default flow function\n if flow_func is None:\n flow_func = default_flow_func\n if flow_func is shortest_augmenting_path:\n kwargs['two_phase'] = True\n # Begin the actual algorithm\n # step 1: Find node connectivity k of G\n if k is None:\n k = nx.node_connectivity(G, flow_func=flow_func)\n # step 2: \n # Find k nodes with top degree, call it X:\n X = {n for n, d in sorted(G.degree(), key=itemgetter(1), reverse=True)[:k]}\n # Check if X is a k-node-cutset\n if _is_separating_set(G, X):\n seen.append(X)\n yield X\n\n for x in X:\n # step 3: Compute local connectivity flow of x with all other\n # non adjacent nodes in G\n non_adjacent = set(G) - X - set(G[x])\n for v in non_adjacent:\n # step 4: compute maximum flow in an Even-Tarjan reduction H of G\n # and step:5 build the associated residual network R\n R = flow_func(H, '%sB' % mapping[x], '%sA' % mapping[v], **kwargs)\n flow_value = R.graph['flow_value']\n\n if flow_value == k:\n ## Remove saturated edges form the residual network\n saturated_edges = [(u, w, d) for (u, w, d) in\n R.edges(data=True)\n if d['capacity'] == d['flow']]\n R.remove_edges_from(saturated_edges)\n # step 6: shrink the strongly connected components of \n # residual flow network R and call it L\n L = nx.condensation(R)\n cmap = L.graph['mapping']\n # step 7: Compute antichains of L; they map to closed sets in H\n # Any edge in H that links a closed set is part of a cutset\n for antichain in nx.antichains(L):\n # Nodes in an antichain of the condensation graph of\n # the residual network map to a closed set of nodes that\n # define a node partition of the auxiliary digraph H.\n S = {n for n, scc in cmap.items() if scc in antichain}\n # Find the cutset that links the node partition (S,~S) in H\n cutset = set()\n for u in S:\n cutset.update((u, w) for w in H[u] if w not in S)\n # The edges in H that form the cutset are internal edges\n # (ie edges that represent a node of the original graph G)\n node_cut = {H.node[n]['id'] for edge in cutset for n in edge}\n\n if len(node_cut) == k:\n if node_cut not in seen:\n yield node_cut\n seen.append(node_cut)\n # Add an edge (x, v) to make sure that we do not\n # find this cutset again. This is equivalent\n # of adding the edge in the input graph \n # G.add_edge(x, v) and then regenerate H and R:\n # Add edges to the auxiliary digraph.\n H.add_edge('%sB' % mapping[x], '%sA' % mapping[v],\n capacity=1)\n H.add_edge('%sB' % mapping[v], '%sA' % mapping[x],\n capacity=1)\n # Add edges to the residual network.\n R.add_edge('%sB' % mapping[x], '%sA' % mapping[v],\n capacity=1)\n R.add_edge('%sA' % mapping[v], '%sB' % mapping[x],\n capacity=1)\n break\n # Add again the saturated edges to reuse the residual network\n R.add_edges_from(saturated_edges)\n\n\ndef _is_separating_set(G, cut):\n \"\"\"Assumes that the input graph is connected\"\"\"\n if len(cut) == len(G) - 1:\n return True\n\n H = G.copy(with_data=False)\n H.remove_nodes_from(cut)\n if nx.is_connected(H):\n return False\n return True\n", "path": "networkx/algorithms/connectivity/kcutsets.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"\nKanevsky all minimum node k cutsets algorithm.\n\"\"\"\nfrom operator import itemgetter\nfrom itertools import combinations\n\nimport networkx as nx\nfrom .utils import build_auxiliary_node_connectivity\nfrom networkx.algorithms.flow import (\n build_residual_network,\n edmonds_karp,\n shortest_augmenting_path,\n)\ndefault_flow_func = edmonds_karp\n\n\n__author__ = '\\n'.join(['Jordi Torrents <[email protected]>'])\n\n__all__ = ['all_node_cuts']\n\ndef all_node_cuts(G, k=None, flow_func=None):\n r\"\"\"Returns all minimum k cutsets of an undirected graph G. \n\n This implementation is based on Kanevsky's algorithm [1]_ for finding all\n minimum-size node cut-sets of an undirected graph G; ie the set (or sets) \n of nodes of cardinality equal to the node connectivity of G. Thus if \n removed, would break G into two or more connected components.\n \n Parameters\n ----------\n G : NetworkX graph\n Undirected graph\n\n k : Integer\n Node connectivity of the input graph. If k is None, then it is \n computed. Default value: None.\n\n flow_func : function\n Function to perform the underlying flow computations. Default value\n edmonds_karp. This function performs better in sparse graphs with\n right tailed degree distributions. shortest_augmenting_path will\n perform better in denser graphs.\n \n\n Returns\n -------\n cuts : a generator of node cutsets\n Each node cutset has cardinality equal to the node connectivity of\n the input graph.\n\n Examples\n --------\n >>> # A two-dimensional grid graph has 4 cutsets of cardinality 2\n >>> G = nx.grid_2d_graph(5, 5)\n >>> cutsets = list(nx.all_node_cuts(G))\n >>> len(cutsets)\n 4\n >>> all(2 == len(cutset) for cutset in cutsets)\n True\n >>> nx.node_connectivity(G)\n 2\n\n Notes\n -----\n This implementation is based on the sequential algorithm for finding all\n minimum-size separating vertex sets in a graph [1]_. The main idea is to\n compute minimum cuts using local maximum flow computations among a set \n of nodes of highest degree and all other non-adjacent nodes in the Graph.\n Once we find a minimum cut, we add an edge between the high degree\n node and the target node of the local maximum flow computation to make \n sure that we will not find that minimum cut again.\n\n See also\n --------\n node_connectivity\n edmonds_karp\n shortest_augmenting_path\n\n References\n ----------\n .. [1] Kanevsky, A. (1993). Finding all minimum-size separating vertex \n sets in a graph. Networks 23(6), 533--541.\n http://onlinelibrary.wiley.com/doi/10.1002/net.3230230604/abstract\n\n \"\"\"\n if not nx.is_connected(G):\n raise nx.NetworkXError('Input graph is disconnected.')\n\n # Addess some corner cases first.\n # For cycle graphs\n if G.order() == G.size():\n if all(2 == d for n, d in G.degree()):\n seen = set()\n for u in G:\n for v in nx.non_neighbors(G, u):\n if (u, v) not in seen and (v, u) not in seen:\n yield {v, u}\n seen.add((v, u))\n return\n # For complete Graphs\n if nx.density(G) == 1:\n for cut_set in combinations(G, len(G)-1):\n yield set(cut_set)\n return\n # Initialize data structures.\n # Keep track of the cuts already computed so we do not repeat them.\n seen = []\n # Even-Tarjan reduction is what we call auxiliary digraph \n # for node connectivity.\n H = build_auxiliary_node_connectivity(G)\n mapping = H.graph['mapping']\n R = build_residual_network(H, 'capacity')\n kwargs = dict(capacity='capacity', residual=R)\n # Define default flow function\n if flow_func is None:\n flow_func = default_flow_func\n if flow_func is shortest_augmenting_path:\n kwargs['two_phase'] = True\n # Begin the actual algorithm\n # step 1: Find node connectivity k of G\n if k is None:\n k = nx.node_connectivity(G, flow_func=flow_func)\n # step 2: \n # Find k nodes with top degree, call it X:\n X = {n for n, d in sorted(G.degree(), key=itemgetter(1), reverse=True)[:k]}\n # Check if X is a k-node-cutset\n if _is_separating_set(G, X):\n seen.append(X)\n yield X\n\n for x in X:\n # step 3: Compute local connectivity flow of x with all other\n # non adjacent nodes in G\n non_adjacent = set(G) - X - set(G[x])\n for v in non_adjacent:\n # step 4: compute maximum flow in an Even-Tarjan reduction H of G\n # and step:5 build the associated residual network R\n R = flow_func(H, '%sB' % mapping[x], '%sA' % mapping[v], **kwargs)\n flow_value = R.graph['flow_value']\n\n if flow_value == k:\n ## Remove saturated edges form the residual network\n saturated_edges = [(u, w, d) for (u, w, d) in\n R.edges(data=True)\n if d['capacity'] == d['flow']]\n R.remove_edges_from(saturated_edges)\n # step 6: shrink the strongly connected components of \n # residual flow network R and call it L\n L = nx.condensation(R)\n cmap = L.graph['mapping']\n # step 7: Compute antichains of L; they map to closed sets in H\n # Any edge in H that links a closed set is part of a cutset\n for antichain in nx.antichains(L):\n # Nodes in an antichain of the condensation graph of\n # the residual network map to a closed set of nodes that\n # define a node partition of the auxiliary digraph H.\n S = {n for n, scc in cmap.items() if scc in antichain}\n # Find the cutset that links the node partition (S,~S) in H\n cutset = set()\n for u in S:\n cutset.update((u, w) for w in H[u] if w not in S)\n # The edges in H that form the cutset are internal edges\n # (ie edges that represent a node of the original graph G)\n node_cut = {H.node[n]['id'] for edge in cutset for n in edge}\n\n if len(node_cut) == k:\n if node_cut not in seen:\n yield node_cut\n seen.append(node_cut)\n # Add an edge (x, v) to make sure that we do not\n # find this cutset again. This is equivalent\n # of adding the edge in the input graph \n # G.add_edge(x, v) and then regenerate H and R:\n # Add edges to the auxiliary digraph.\n H.add_edge('%sB' % mapping[x], '%sA' % mapping[v],\n capacity=1)\n H.add_edge('%sB' % mapping[v], '%sA' % mapping[x],\n capacity=1)\n # Add edges to the residual network.\n R.add_edge('%sB' % mapping[x], '%sA' % mapping[v],\n capacity=1)\n R.add_edge('%sA' % mapping[v], '%sB' % mapping[x],\n capacity=1)\n break\n # Add again the saturated edges to reuse the residual network\n R.add_edges_from(saturated_edges)\n\n\ndef _is_separating_set(G, cut):\n \"\"\"Assumes that the input graph is connected\"\"\"\n if len(cut) == len(G) - 1:\n return True\n\n H = G.copy(with_data=False)\n H.remove_nodes_from(cut)\n if nx.is_connected(H):\n return False\n return True\n", "path": "networkx/algorithms/connectivity/kcutsets.py"}]} | 2,771 | 319 |
gh_patches_debug_10902 | rasdani/github-patches | git_diff | google__flax-362 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Pooling: passing "sequence of `n` `(low, high)` integer pairs" resulting in TypeError
Trying to pass a tuple or list of tuples to a pool operation's padding parameter gives out the following errors:
`TypeError: Unknown padding type: (1, 1).`
`TypeError : unhashable type: 'list' `
Sample code for reproducing the bug:
```python3
from flax import nn
from jax import random
class FlaxModel(nn.Module):
def apply(self, x):
x = nn.max_pool(x, (3, 3), strides=(2, 2), padding=[(1, 1), (1, 1)])
return x
rng = random.PRNGKey(0)
model, _ = FlaxModel.init_by_shape(rng, [(1, 100, 100, 1)])
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `flax/nn/pooling.py`
Content:
```
1 # Copyright 2020 The Flax Authors.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """Pooling modules."""
16
17 from jax import lax
18 import jax.numpy as jnp
19
20 import numpy as onp
21
22
23 def pool(inputs, init, reduce_fn, window_shape, strides, padding):
24 """Helper function to define pooling functions.
25
26 Pooling functions are implemented using the ReduceWindow XLA op.
27 NOTE: Be aware that pooling is not generally differentiable.
28 That means providing a reduce_fn that is differentiable does not imply
29 that pool is differentiable.
30
31 Args:
32 inputs: input data with dimensions (batch, window dims..., features).
33 init: the initial value for the reduction
34 reduce_fn: a reduce function of the form `(T, T) -> T`.
35 window_shape: a shape tuple defining the window to reduce over.
36 strides: a sequence of `n` integers, representing the inter-window
37 strides.
38 padding: either the string `'SAME'`, the string `'VALID'`, or a sequence
39 of `n` `(low, high)` integer pairs that give the padding to apply before
40 and after each spatial dimension.
41 Returns:
42 The output of the reduction for each window slice.
43 """
44 strides = strides or (1,) * len(window_shape)
45 strides = (1,) + strides + (1,)
46 dims = (1,) + window_shape + (1,)
47 return lax.reduce_window(inputs, init, reduce_fn, dims, strides, padding)
48
49
50 def avg_pool(inputs, window_shape, strides=None, padding="VALID"):
51 """Pools the input by taking the average over a window.
52
53 Args:
54 inputs: input data with dimensions (batch, window dims..., features).
55 window_shape: a shape tuple defining the window to reduce over.
56 strides: a sequence of `n` integers, representing the inter-window
57 strides (default: `(1, ..., 1)`).
58 padding: either the string `'SAME'`, the string `'VALID'`, or a sequence
59 of `n` `(low, high)` integer pairs that give the padding to apply before
60 and after each spatial dimension (default: `'VALID'`).
61 Returns:
62 The average for each window slice.
63 """
64 y = pool(inputs, 0., lax.add, window_shape, strides, padding)
65 y = y / onp.prod(window_shape)
66 return y
67
68
69 def max_pool(inputs, window_shape, strides=None, padding="VALID"):
70 """Pools the input by taking the maximum of a window slice.
71
72 Args:
73 inputs: input data with dimensions (batch, window dims..., features).
74 window_shape: a shape tuple defining the window to reduce over.
75 strides: a sequence of `n` integers, representing the inter-window
76 strides (default: `(1, ..., 1)`).
77 padding: either the string `'SAME'`, the string `'VALID'`, or a sequence
78 of `n` `(low, high)` integer pairs that give the padding to apply before
79 and after each spatial dimension (default: `'VALID'`).
80 Returns:
81 The maximum for each window slice.
82 """
83 y = pool(inputs, -jnp.inf, lax.max, window_shape, strides, padding)
84 return y
85
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/flax/nn/pooling.py b/flax/nn/pooling.py
--- a/flax/nn/pooling.py
+++ b/flax/nn/pooling.py
@@ -44,6 +44,14 @@
strides = strides or (1,) * len(window_shape)
strides = (1,) + strides + (1,)
dims = (1,) + window_shape + (1,)
+ if not isinstance(padding, str):
+ padding = tuple(map(tuple, padding))
+ assert(len(padding) == len(window_shape)), (
+ f"padding {padding} must specify pads for same number of dims as "
+ f"window_shape {window_shape}")
+ assert(all([len(x) == 2 for x in padding])), (
+ f"each entry in padding {padding} must be length 2")
+ padding = ((0,0),) + padding + ((0,0),)
return lax.reduce_window(inputs, init, reduce_fn, dims, strides, padding)
| {"golden_diff": "diff --git a/flax/nn/pooling.py b/flax/nn/pooling.py\n--- a/flax/nn/pooling.py\n+++ b/flax/nn/pooling.py\n@@ -44,6 +44,14 @@\n strides = strides or (1,) * len(window_shape)\n strides = (1,) + strides + (1,)\n dims = (1,) + window_shape + (1,)\n+ if not isinstance(padding, str):\n+ padding = tuple(map(tuple, padding))\n+ assert(len(padding) == len(window_shape)), (\n+ f\"padding {padding} must specify pads for same number of dims as \"\n+ f\"window_shape {window_shape}\")\n+ assert(all([len(x) == 2 for x in padding])), (\n+ f\"each entry in padding {padding} must be length 2\")\n+ padding = ((0,0),) + padding + ((0,0),)\n return lax.reduce_window(inputs, init, reduce_fn, dims, strides, padding)\n", "issue": "Pooling: passing \"sequence of `n` `(low, high)` integer pairs\" resulting in TypeError\nTrying to pass a tuple or list of tuples to a pool operation's padding parameter gives out the following errors: \r\n`TypeError: Unknown padding type: (1, 1).`\r\n`TypeError : unhashable type: 'list' `\r\n\r\n\r\nSample code for reproducing the bug:\r\n```python3\r\nfrom flax import nn\r\nfrom jax import random\r\n\r\nclass FlaxModel(nn.Module):\r\n def apply(self, x):\r\n x = nn.max_pool(x, (3, 3), strides=(2, 2), padding=[(1, 1), (1, 1)])\r\n return x\r\n\r\nrng = random.PRNGKey(0)\r\nmodel, _ = FlaxModel.init_by_shape(rng, [(1, 100, 100, 1)])\r\n```\r\n\n", "before_files": [{"content": "# Copyright 2020 The Flax Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Pooling modules.\"\"\"\n\nfrom jax import lax\nimport jax.numpy as jnp\n\nimport numpy as onp\n\n\ndef pool(inputs, init, reduce_fn, window_shape, strides, padding):\n \"\"\"Helper function to define pooling functions.\n\n Pooling functions are implemented using the ReduceWindow XLA op.\n NOTE: Be aware that pooling is not generally differentiable.\n That means providing a reduce_fn that is differentiable does not imply\n that pool is differentiable.\n\n Args:\n inputs: input data with dimensions (batch, window dims..., features).\n init: the initial value for the reduction\n reduce_fn: a reduce function of the form `(T, T) -> T`.\n window_shape: a shape tuple defining the window to reduce over.\n strides: a sequence of `n` integers, representing the inter-window\n strides.\n padding: either the string `'SAME'`, the string `'VALID'`, or a sequence\n of `n` `(low, high)` integer pairs that give the padding to apply before\n and after each spatial dimension.\n Returns:\n The output of the reduction for each window slice.\n \"\"\"\n strides = strides or (1,) * len(window_shape)\n strides = (1,) + strides + (1,)\n dims = (1,) + window_shape + (1,)\n return lax.reduce_window(inputs, init, reduce_fn, dims, strides, padding)\n\n\ndef avg_pool(inputs, window_shape, strides=None, padding=\"VALID\"):\n \"\"\"Pools the input by taking the average over a window.\n\n Args:\n inputs: input data with dimensions (batch, window dims..., features).\n window_shape: a shape tuple defining the window to reduce over.\n strides: a sequence of `n` integers, representing the inter-window\n strides (default: `(1, ..., 1)`).\n padding: either the string `'SAME'`, the string `'VALID'`, or a sequence\n of `n` `(low, high)` integer pairs that give the padding to apply before\n and after each spatial dimension (default: `'VALID'`).\n Returns:\n The average for each window slice.\n \"\"\"\n y = pool(inputs, 0., lax.add, window_shape, strides, padding)\n y = y / onp.prod(window_shape)\n return y\n\n\ndef max_pool(inputs, window_shape, strides=None, padding=\"VALID\"):\n \"\"\"Pools the input by taking the maximum of a window slice.\n\n Args:\n inputs: input data with dimensions (batch, window dims..., features).\n window_shape: a shape tuple defining the window to reduce over.\n strides: a sequence of `n` integers, representing the inter-window\n strides (default: `(1, ..., 1)`).\n padding: either the string `'SAME'`, the string `'VALID'`, or a sequence\n of `n` `(low, high)` integer pairs that give the padding to apply before\n and after each spatial dimension (default: `'VALID'`).\n Returns:\n The maximum for each window slice.\n \"\"\"\n y = pool(inputs, -jnp.inf, lax.max, window_shape, strides, padding)\n return y\n", "path": "flax/nn/pooling.py"}], "after_files": [{"content": "# Copyright 2020 The Flax Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Pooling modules.\"\"\"\n\nfrom jax import lax\nimport jax.numpy as jnp\n\nimport numpy as onp\n\n\ndef pool(inputs, init, reduce_fn, window_shape, strides, padding):\n \"\"\"Helper function to define pooling functions.\n\n Pooling functions are implemented using the ReduceWindow XLA op.\n NOTE: Be aware that pooling is not generally differentiable.\n That means providing a reduce_fn that is differentiable does not imply\n that pool is differentiable.\n\n Args:\n inputs: input data with dimensions (batch, window dims..., features).\n init: the initial value for the reduction\n reduce_fn: a reduce function of the form `(T, T) -> T`.\n window_shape: a shape tuple defining the window to reduce over.\n strides: a sequence of `n` integers, representing the inter-window\n strides.\n padding: either the string `'SAME'`, the string `'VALID'`, or a sequence\n of `n` `(low, high)` integer pairs that give the padding to apply before\n and after each spatial dimension.\n Returns:\n The output of the reduction for each window slice.\n \"\"\"\n strides = strides or (1,) * len(window_shape)\n strides = (1,) + strides + (1,)\n dims = (1,) + window_shape + (1,)\n if not isinstance(padding, str):\n padding = tuple(map(tuple, padding))\n assert(len(padding) == len(window_shape)), (\n f\"padding {padding} must specify pads for same number of dims as \"\n f\"window_shape {window_shape}\")\n assert(all([len(x) == 2 for x in padding])), (\n f\"each entry in padding {padding} must be length 2\")\n padding = ((0,0),) + padding + ((0,0),)\n return lax.reduce_window(inputs, init, reduce_fn, dims, strides, padding)\n\n\ndef avg_pool(inputs, window_shape, strides=None, padding=\"VALID\"):\n \"\"\"Pools the input by taking the average over a window.\n\n Args:\n inputs: input data with dimensions (batch, window dims..., features).\n window_shape: a shape tuple defining the window to reduce over.\n strides: a sequence of `n` integers, representing the inter-window\n strides (default: `(1, ..., 1)`).\n padding: either the string `'SAME'`, the string `'VALID'`, or a sequence\n of `n` `(low, high)` integer pairs that give the padding to apply before\n and after each spatial dimension (default: `'VALID'`).\n Returns:\n The average for each window slice.\n \"\"\"\n y = pool(inputs, 0., lax.add, window_shape, strides, padding)\n y = y / onp.prod(window_shape)\n return y\n\n\ndef max_pool(inputs, window_shape, strides=None, padding=\"VALID\"):\n \"\"\"Pools the input by taking the maximum of a window slice.\n\n Args:\n inputs: input data with dimensions (batch, window dims..., features).\n window_shape: a shape tuple defining the window to reduce over.\n strides: a sequence of `n` integers, representing the inter-window\n strides (default: `(1, ..., 1)`).\n padding: either the string `'SAME'`, the string `'VALID'`, or a sequence\n of `n` `(low, high)` integer pairs that give the padding to apply before\n and after each spatial dimension (default: `'VALID'`).\n Returns:\n The maximum for each window slice.\n \"\"\"\n y = pool(inputs, -jnp.inf, lax.max, window_shape, strides, padding)\n return y\n", "path": "flax/nn/pooling.py"}]} | 1,428 | 225 |
gh_patches_debug_40184 | rasdani/github-patches | git_diff | PaddlePaddle__PaddleSpeech-1722 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[vec] deal with class imbalances
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `paddlespeech/vector/utils/vector_utils.py`
Content:
```
1 # Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15
16 def get_chunks(seg_dur, audio_id, audio_duration):
17 """Get all chunk segments from a utterance
18
19 Args:
20 seg_dur (float): segment chunk duration, seconds
21 audio_id (str): utterance name,
22 audio_duration (float): utterance duration, seconds
23
24 Returns:
25 List: all the chunk segments
26 """
27 num_chunks = int(audio_duration / seg_dur) # all in seconds
28 chunk_lst = [
29 audio_id + "_" + str(i * seg_dur) + "_" + str(i * seg_dur + seg_dur)
30 for i in range(num_chunks)
31 ]
32 return chunk_lst
33
34
35 def Q_from_tokens(token_num):
36 """Get prior model, data from uniform, would support others(guassian) in future
37 """
38 freq = [1] * token_num
39 Q = paddle.to_tensor(freq, dtype='float64')
40 return Q / Q.sum()
41
```
Path: `paddlespeech/vector/modules/loss.py`
Content:
```
1 # Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 # This is modified from SpeechBrain
15 # https://github.com/speechbrain/speechbrain/blob/085be635c07f16d42cd1295045bc46c407f1e15b/speechbrain/nnet/losses.py
16 import math
17
18 import paddle
19 import paddle.nn as nn
20 import paddle.nn.functional as F
21
22
23 class AngularMargin(nn.Layer):
24 def __init__(self, margin=0.0, scale=1.0):
25 """An implementation of Angular Margin (AM) proposed in the following
26 paper: '''Margin Matters: Towards More Discriminative Deep Neural Network
27 Embeddings for Speaker Recognition''' (https://arxiv.org/abs/1906.07317)
28
29 Args:
30 margin (float, optional): The margin for cosine similiarity. Defaults to 0.0.
31 scale (float, optional): The scale for cosine similiarity. Defaults to 1.0.
32 """
33 super(AngularMargin, self).__init__()
34 self.margin = margin
35 self.scale = scale
36
37 def forward(self, outputs, targets):
38 outputs = outputs - self.margin * targets
39 return self.scale * outputs
40
41
42 class AdditiveAngularMargin(AngularMargin):
43 def __init__(self, margin=0.0, scale=1.0, easy_margin=False):
44 """The Implementation of Additive Angular Margin (AAM) proposed
45 in the following paper: '''Margin Matters: Towards More Discriminative Deep Neural Network Embeddings for Speaker Recognition'''
46 (https://arxiv.org/abs/1906.07317)
47
48 Args:
49 margin (float, optional): margin factor. Defaults to 0.0.
50 scale (float, optional): scale factor. Defaults to 1.0.
51 easy_margin (bool, optional): easy_margin flag. Defaults to False.
52 """
53 super(AdditiveAngularMargin, self).__init__(margin, scale)
54 self.easy_margin = easy_margin
55
56 self.cos_m = math.cos(self.margin)
57 self.sin_m = math.sin(self.margin)
58 self.th = math.cos(math.pi - self.margin)
59 self.mm = math.sin(math.pi - self.margin) * self.margin
60
61 def forward(self, outputs, targets):
62 cosine = outputs.astype('float32')
63 sine = paddle.sqrt(1.0 - paddle.pow(cosine, 2))
64 phi = cosine * self.cos_m - sine * self.sin_m # cos(theta + m)
65 if self.easy_margin:
66 phi = paddle.where(cosine > 0, phi, cosine)
67 else:
68 phi = paddle.where(cosine > self.th, phi, cosine - self.mm)
69 outputs = (targets * phi) + ((1.0 - targets) * cosine)
70 return self.scale * outputs
71
72
73 class LogSoftmaxWrapper(nn.Layer):
74 def __init__(self, loss_fn):
75 """Speaker identificatin loss function wrapper
76 including all of compositions of the loss transformation
77 Args:
78 loss_fn (_type_): the loss value of a batch
79 """
80 super(LogSoftmaxWrapper, self).__init__()
81 self.loss_fn = loss_fn
82 self.criterion = paddle.nn.KLDivLoss(reduction="sum")
83
84 def forward(self, outputs, targets, length=None):
85 targets = F.one_hot(targets, outputs.shape[1])
86 try:
87 predictions = self.loss_fn(outputs, targets)
88 except TypeError:
89 predictions = self.loss_fn(outputs)
90
91 predictions = F.log_softmax(predictions, axis=1)
92 loss = self.criterion(predictions, targets) / targets.sum()
93 return loss
94
95
96 class NCELoss(nn.Layer):
97 """Noise Contrastive Estimation loss funtion
98
99 Noise Contrastive Estimation (NCE) is an approximation method that is used to
100 work around the huge computational cost of large softmax layer.
101 The basic idea is to convert the prediction problem into classification problem
102 at training stage. It has been proved that these two criterions converges to
103 the same minimal point as long as noise distribution is close enough to real one.
104
105 NCE bridges the gap between generative models and discriminative models,
106 rather than simply speedup the softmax layer.
107 With NCE, you can turn almost anything into posterior with less effort (I think).
108
109 Refs:
110 NCE:http://www.cs.helsinki.fi/u/ahyvarin/papers/Gutmann10AISTATS.pdf
111 Thanks: https://github.com/mingen-pan/easy-to-use-NCE-RNN-for-Pytorch/blob/master/nce.py
112
113 Examples:
114 Q = Q_from_tokens(output_dim)
115 NCELoss(Q)
116 """
117
118 def __init__(self, Q, noise_ratio=100, Z_offset=9.5):
119 """Noise Contrastive Estimation loss funtion
120
121 Args:
122 Q (tensor): prior model, uniform or guassian
123 noise_ratio (int, optional): noise sampling times. Defaults to 100.
124 Z_offset (float, optional): scale of post processing the score. Defaults to 9.5.
125 """
126 super(NCELoss, self).__init__()
127 assert type(noise_ratio) is int
128 self.Q = paddle.to_tensor(Q, stop_gradient=False)
129 self.N = self.Q.shape[0]
130 self.K = noise_ratio
131 self.Z_offset = Z_offset
132
133 def forward(self, output, target):
134 """Forward inference
135
136 Args:
137 output (tensor): the model output, which is the input of loss function
138 """
139 output = paddle.reshape(output, [-1, self.N])
140 B = output.shape[0]
141 noise_idx = self.get_noise(B)
142 idx = self.get_combined_idx(target, noise_idx)
143 P_target, P_noise = self.get_prob(idx, output, sep_target=True)
144 Q_target, Q_noise = self.get_Q(idx)
145 loss = self.nce_loss(P_target, P_noise, Q_noise, Q_target)
146 return loss.mean()
147
148 def get_Q(self, idx, sep_target=True):
149 """Get prior model of batchsize data
150 """
151 idx_size = idx.size
152 prob_model = paddle.to_tensor(
153 self.Q.numpy()[paddle.reshape(idx, [-1]).numpy()])
154 prob_model = paddle.reshape(prob_model, [idx.shape[0], idx.shape[1]])
155 if sep_target:
156 return prob_model[:, 0], prob_model[:, 1:]
157 else:
158 return prob_model
159
160 def get_prob(self, idx, scores, sep_target=True):
161 """Post processing the score of post model(output of nn) of batchsize data
162 """
163 scores = self.get_scores(idx, scores)
164 scale = paddle.to_tensor([self.Z_offset], dtype='float32')
165 scores = paddle.add(scores, -scale)
166 prob = paddle.exp(scores)
167 if sep_target:
168 return prob[:, 0], prob[:, 1:]
169 else:
170 return prob
171
172 def get_scores(self, idx, scores):
173 """Get the score of post model(output of nn) of batchsize data
174 """
175 B, N = scores.shape
176 K = idx.shape[1]
177 idx_increment = paddle.to_tensor(
178 N * paddle.reshape(paddle.arange(B), [B, 1]) * paddle.ones([1, K]),
179 dtype="int64",
180 stop_gradient=False)
181 new_idx = idx_increment + idx
182 new_scores = paddle.index_select(
183 paddle.reshape(scores, [-1]), paddle.reshape(new_idx, [-1]))
184
185 return paddle.reshape(new_scores, [B, K])
186
187 def get_noise(self, batch_size, uniform=True):
188 """Select noise sample
189 """
190 if uniform:
191 noise = np.random.randint(self.N, size=self.K * batch_size)
192 else:
193 noise = np.random.choice(
194 self.N, self.K * batch_size, replace=True, p=self.Q.data)
195 noise = paddle.to_tensor(noise, dtype='int64', stop_gradient=False)
196 noise_idx = paddle.reshape(noise, [batch_size, self.K])
197 return noise_idx
198
199 def get_combined_idx(self, target_idx, noise_idx):
200 """Combined target and noise
201 """
202 target_idx = paddle.reshape(target_idx, [-1, 1])
203 return paddle.concat((target_idx, noise_idx), 1)
204
205 def nce_loss(self, prob_model, prob_noise_in_model, prob_noise,
206 prob_target_in_noise):
207 """Combined the loss of target and noise
208 """
209
210 def safe_log(tensor):
211 """Safe log
212 """
213 EPSILON = 1e-10
214 return paddle.log(EPSILON + tensor)
215
216 model_loss = safe_log(prob_model /
217 (prob_model + self.K * prob_target_in_noise))
218 model_loss = paddle.reshape(model_loss, [-1])
219
220 noise_loss = paddle.sum(
221 safe_log((self.K * prob_noise) /
222 (prob_noise_in_model + self.K * prob_noise)), -1)
223 noise_loss = paddle.reshape(noise_loss, [-1])
224
225 loss = -(model_loss + noise_loss)
226
227 return loss
228
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/paddlespeech/vector/modules/loss.py b/paddlespeech/vector/modules/loss.py
--- a/paddlespeech/vector/modules/loss.py
+++ b/paddlespeech/vector/modules/loss.py
@@ -132,7 +132,7 @@
def forward(self, output, target):
"""Forward inference
-
+
Args:
output (tensor): the model output, which is the input of loss function
"""
@@ -161,7 +161,7 @@
"""Post processing the score of post model(output of nn) of batchsize data
"""
scores = self.get_scores(idx, scores)
- scale = paddle.to_tensor([self.Z_offset], dtype='float32')
+ scale = paddle.to_tensor([self.Z_offset], dtype='float64')
scores = paddle.add(scores, -scale)
prob = paddle.exp(scores)
if sep_target:
@@ -225,3 +225,65 @@
loss = -(model_loss + noise_loss)
return loss
+
+
+class FocalLoss(nn.Layer):
+ """This criterion is a implemenation of Focal Loss, which is proposed in
+ Focal Loss for Dense Object Detection.
+
+ Loss(x, class) = - \alpha (1-softmax(x)[class])^gamma \log(softmax(x)[class])
+
+ The losses are averaged across observations for each minibatch.
+
+ Args:
+ alpha(1D Tensor, Variable) : the scalar factor for this criterion
+ gamma(float, double) : gamma > 0; reduces the relative loss for well-classified examples (p > .5),
+ putting more focus on hard, misclassified examples
+ size_average(bool): By default, the losses are averaged over observations for each minibatch.
+ However, if the field size_average is set to False, the losses are
+ instead summed for each minibatch.
+ """
+
+ def __init__(self, alpha=1, gamma=0, size_average=True, ignore_index=-100):
+ super(FocalLoss, self).__init__()
+ self.alpha = alpha
+ self.gamma = gamma
+ self.size_average = size_average
+ self.ce = nn.CrossEntropyLoss(
+ ignore_index=ignore_index, reduction="none")
+
+ def forward(self, outputs, targets):
+ """Forword inference.
+
+ Args:
+ outputs: input tensor
+ target: target label tensor
+ """
+ ce_loss = self.ce(outputs, targets)
+ pt = paddle.exp(-ce_loss)
+ focal_loss = self.alpha * (1 - pt)**self.gamma * ce_loss
+ if self.size_average:
+ return focal_loss.mean()
+ else:
+ return focal_loss.sum()
+
+
+if __name__ == "__main__":
+ import numpy as np
+ from paddlespeech.vector.utils.vector_utils import Q_from_tokens
+ paddle.set_device("cpu")
+
+ input_data = paddle.uniform([5, 100], dtype="float64")
+ label_data = np.random.randint(0, 100, size=(5)).astype(np.int64)
+
+ input = paddle.to_tensor(input_data)
+ label = paddle.to_tensor(label_data)
+
+ loss1 = FocalLoss()
+ loss = loss1.forward(input, label)
+ print("loss: %.5f" % (loss))
+
+ Q = Q_from_tokens(100)
+ loss2 = NCELoss(Q)
+ loss = loss2.forward(input, label)
+ print("loss: %.5f" % (loss))
diff --git a/paddlespeech/vector/utils/vector_utils.py b/paddlespeech/vector/utils/vector_utils.py
--- a/paddlespeech/vector/utils/vector_utils.py
+++ b/paddlespeech/vector/utils/vector_utils.py
@@ -11,6 +11,7 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
+import paddle
def get_chunks(seg_dur, audio_id, audio_duration):
| {"golden_diff": "diff --git a/paddlespeech/vector/modules/loss.py b/paddlespeech/vector/modules/loss.py\n--- a/paddlespeech/vector/modules/loss.py\n+++ b/paddlespeech/vector/modules/loss.py\n@@ -132,7 +132,7 @@\n \n def forward(self, output, target):\n \"\"\"Forward inference\n- \n+\n Args:\n output (tensor): the model output, which is the input of loss function\n \"\"\"\n@@ -161,7 +161,7 @@\n \"\"\"Post processing the score of post model(output of nn) of batchsize data\n \"\"\"\n scores = self.get_scores(idx, scores)\n- scale = paddle.to_tensor([self.Z_offset], dtype='float32')\n+ scale = paddle.to_tensor([self.Z_offset], dtype='float64')\n scores = paddle.add(scores, -scale)\n prob = paddle.exp(scores)\n if sep_target:\n@@ -225,3 +225,65 @@\n loss = -(model_loss + noise_loss)\n \n return loss\n+\n+\n+class FocalLoss(nn.Layer):\n+ \"\"\"This criterion is a implemenation of Focal Loss, which is proposed in \n+ Focal Loss for Dense Object Detection.\n+\n+ Loss(x, class) = - \\alpha (1-softmax(x)[class])^gamma \\log(softmax(x)[class])\n+\n+ The losses are averaged across observations for each minibatch.\n+\n+ Args:\n+ alpha(1D Tensor, Variable) : the scalar factor for this criterion\n+ gamma(float, double) : gamma > 0; reduces the relative loss for well-classi\ufb01ed examples (p > .5), \n+ putting more focus on hard, misclassi\ufb01ed examples\n+ size_average(bool): By default, the losses are averaged over observations for each minibatch.\n+ However, if the field size_average is set to False, the losses are\n+ instead summed for each minibatch.\n+ \"\"\"\n+\n+ def __init__(self, alpha=1, gamma=0, size_average=True, ignore_index=-100):\n+ super(FocalLoss, self).__init__()\n+ self.alpha = alpha\n+ self.gamma = gamma\n+ self.size_average = size_average\n+ self.ce = nn.CrossEntropyLoss(\n+ ignore_index=ignore_index, reduction=\"none\")\n+\n+ def forward(self, outputs, targets):\n+ \"\"\"Forword inference.\n+\n+ Args:\n+ outputs: input tensor\n+ target: target label tensor\n+ \"\"\"\n+ ce_loss = self.ce(outputs, targets)\n+ pt = paddle.exp(-ce_loss)\n+ focal_loss = self.alpha * (1 - pt)**self.gamma * ce_loss\n+ if self.size_average:\n+ return focal_loss.mean()\n+ else:\n+ return focal_loss.sum()\n+\n+\n+if __name__ == \"__main__\":\n+ import numpy as np\n+ from paddlespeech.vector.utils.vector_utils import Q_from_tokens\n+ paddle.set_device(\"cpu\")\n+\n+ input_data = paddle.uniform([5, 100], dtype=\"float64\")\n+ label_data = np.random.randint(0, 100, size=(5)).astype(np.int64)\n+\n+ input = paddle.to_tensor(input_data)\n+ label = paddle.to_tensor(label_data)\n+\n+ loss1 = FocalLoss()\n+ loss = loss1.forward(input, label)\n+ print(\"loss: %.5f\" % (loss))\n+\n+ Q = Q_from_tokens(100)\n+ loss2 = NCELoss(Q)\n+ loss = loss2.forward(input, label)\n+ print(\"loss: %.5f\" % (loss))\ndiff --git a/paddlespeech/vector/utils/vector_utils.py b/paddlespeech/vector/utils/vector_utils.py\n--- a/paddlespeech/vector/utils/vector_utils.py\n+++ b/paddlespeech/vector/utils/vector_utils.py\n@@ -11,6 +11,7 @@\n # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n # See the License for the specific language governing permissions and\n # limitations under the License.\n+import paddle\n \n \n def get_chunks(seg_dur, audio_id, audio_duration):\n", "issue": "[vec] deal with class imbalances\n\n", "before_files": [{"content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\ndef get_chunks(seg_dur, audio_id, audio_duration):\n \"\"\"Get all chunk segments from a utterance\n\n Args:\n seg_dur (float): segment chunk duration, seconds\n audio_id (str): utterance name, \n audio_duration (float): utterance duration, seconds\n\n Returns:\n List: all the chunk segments \n \"\"\"\n num_chunks = int(audio_duration / seg_dur) # all in seconds\n chunk_lst = [\n audio_id + \"_\" + str(i * seg_dur) + \"_\" + str(i * seg_dur + seg_dur)\n for i in range(num_chunks)\n ]\n return chunk_lst\n\n\ndef Q_from_tokens(token_num):\n \"\"\"Get prior model, data from uniform, would support others(guassian) in future\n \"\"\"\n freq = [1] * token_num\n Q = paddle.to_tensor(freq, dtype='float64')\n return Q / Q.sum()\n", "path": "paddlespeech/vector/utils/vector_utils.py"}, {"content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# This is modified from SpeechBrain\n# https://github.com/speechbrain/speechbrain/blob/085be635c07f16d42cd1295045bc46c407f1e15b/speechbrain/nnet/losses.py\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\nclass AngularMargin(nn.Layer):\n def __init__(self, margin=0.0, scale=1.0):\n \"\"\"An implementation of Angular Margin (AM) proposed in the following\n paper: '''Margin Matters: Towards More Discriminative Deep Neural Network\n Embeddings for Speaker Recognition''' (https://arxiv.org/abs/1906.07317)\n\n Args:\n margin (float, optional): The margin for cosine similiarity. Defaults to 0.0.\n scale (float, optional): The scale for cosine similiarity. Defaults to 1.0.\n \"\"\"\n super(AngularMargin, self).__init__()\n self.margin = margin\n self.scale = scale\n\n def forward(self, outputs, targets):\n outputs = outputs - self.margin * targets\n return self.scale * outputs\n\n\nclass AdditiveAngularMargin(AngularMargin):\n def __init__(self, margin=0.0, scale=1.0, easy_margin=False):\n \"\"\"The Implementation of Additive Angular Margin (AAM) proposed\n in the following paper: '''Margin Matters: Towards More Discriminative Deep Neural Network Embeddings for Speaker Recognition'''\n (https://arxiv.org/abs/1906.07317)\n\n Args:\n margin (float, optional): margin factor. Defaults to 0.0.\n scale (float, optional): scale factor. Defaults to 1.0.\n easy_margin (bool, optional): easy_margin flag. Defaults to False.\n \"\"\"\n super(AdditiveAngularMargin, self).__init__(margin, scale)\n self.easy_margin = easy_margin\n\n self.cos_m = math.cos(self.margin)\n self.sin_m = math.sin(self.margin)\n self.th = math.cos(math.pi - self.margin)\n self.mm = math.sin(math.pi - self.margin) * self.margin\n\n def forward(self, outputs, targets):\n cosine = outputs.astype('float32')\n sine = paddle.sqrt(1.0 - paddle.pow(cosine, 2))\n phi = cosine * self.cos_m - sine * self.sin_m # cos(theta + m)\n if self.easy_margin:\n phi = paddle.where(cosine > 0, phi, cosine)\n else:\n phi = paddle.where(cosine > self.th, phi, cosine - self.mm)\n outputs = (targets * phi) + ((1.0 - targets) * cosine)\n return self.scale * outputs\n\n\nclass LogSoftmaxWrapper(nn.Layer):\n def __init__(self, loss_fn):\n \"\"\"Speaker identificatin loss function wrapper \n including all of compositions of the loss transformation\n Args:\n loss_fn (_type_): the loss value of a batch\n \"\"\"\n super(LogSoftmaxWrapper, self).__init__()\n self.loss_fn = loss_fn\n self.criterion = paddle.nn.KLDivLoss(reduction=\"sum\")\n\n def forward(self, outputs, targets, length=None):\n targets = F.one_hot(targets, outputs.shape[1])\n try:\n predictions = self.loss_fn(outputs, targets)\n except TypeError:\n predictions = self.loss_fn(outputs)\n\n predictions = F.log_softmax(predictions, axis=1)\n loss = self.criterion(predictions, targets) / targets.sum()\n return loss\n\n\nclass NCELoss(nn.Layer):\n \"\"\"Noise Contrastive Estimation loss funtion\n\n Noise Contrastive Estimation (NCE) is an approximation method that is used to\n work around the huge computational cost of large softmax layer.\n The basic idea is to convert the prediction problem into classification problem\n at training stage. It has been proved that these two criterions converges to\n the same minimal point as long as noise distribution is close enough to real one.\n\n NCE bridges the gap between generative models and discriminative models,\n rather than simply speedup the softmax layer.\n With NCE, you can turn almost anything into posterior with less effort (I think).\n\n Refs:\n NCE\uff1ahttp://www.cs.helsinki.fi/u/ahyvarin/papers/Gutmann10AISTATS.pdf\n Thanks: https://github.com/mingen-pan/easy-to-use-NCE-RNN-for-Pytorch/blob/master/nce.py\n\n Examples:\n Q = Q_from_tokens(output_dim)\n NCELoss(Q)\n \"\"\"\n\n def __init__(self, Q, noise_ratio=100, Z_offset=9.5):\n \"\"\"Noise Contrastive Estimation loss funtion\n\n Args:\n Q (tensor): prior model, uniform or guassian\n noise_ratio (int, optional): noise sampling times. Defaults to 100.\n Z_offset (float, optional): scale of post processing the score. Defaults to 9.5.\n \"\"\"\n super(NCELoss, self).__init__()\n assert type(noise_ratio) is int\n self.Q = paddle.to_tensor(Q, stop_gradient=False)\n self.N = self.Q.shape[0]\n self.K = noise_ratio\n self.Z_offset = Z_offset\n\n def forward(self, output, target):\n \"\"\"Forward inference\n \n Args:\n output (tensor): the model output, which is the input of loss function\n \"\"\"\n output = paddle.reshape(output, [-1, self.N])\n B = output.shape[0]\n noise_idx = self.get_noise(B)\n idx = self.get_combined_idx(target, noise_idx)\n P_target, P_noise = self.get_prob(idx, output, sep_target=True)\n Q_target, Q_noise = self.get_Q(idx)\n loss = self.nce_loss(P_target, P_noise, Q_noise, Q_target)\n return loss.mean()\n\n def get_Q(self, idx, sep_target=True):\n \"\"\"Get prior model of batchsize data\n \"\"\"\n idx_size = idx.size\n prob_model = paddle.to_tensor(\n self.Q.numpy()[paddle.reshape(idx, [-1]).numpy()])\n prob_model = paddle.reshape(prob_model, [idx.shape[0], idx.shape[1]])\n if sep_target:\n return prob_model[:, 0], prob_model[:, 1:]\n else:\n return prob_model\n\n def get_prob(self, idx, scores, sep_target=True):\n \"\"\"Post processing the score of post model(output of nn) of batchsize data\n \"\"\"\n scores = self.get_scores(idx, scores)\n scale = paddle.to_tensor([self.Z_offset], dtype='float32')\n scores = paddle.add(scores, -scale)\n prob = paddle.exp(scores)\n if sep_target:\n return prob[:, 0], prob[:, 1:]\n else:\n return prob\n\n def get_scores(self, idx, scores):\n \"\"\"Get the score of post model(output of nn) of batchsize data\n \"\"\"\n B, N = scores.shape\n K = idx.shape[1]\n idx_increment = paddle.to_tensor(\n N * paddle.reshape(paddle.arange(B), [B, 1]) * paddle.ones([1, K]),\n dtype=\"int64\",\n stop_gradient=False)\n new_idx = idx_increment + idx\n new_scores = paddle.index_select(\n paddle.reshape(scores, [-1]), paddle.reshape(new_idx, [-1]))\n\n return paddle.reshape(new_scores, [B, K])\n\n def get_noise(self, batch_size, uniform=True):\n \"\"\"Select noise sample\n \"\"\"\n if uniform:\n noise = np.random.randint(self.N, size=self.K * batch_size)\n else:\n noise = np.random.choice(\n self.N, self.K * batch_size, replace=True, p=self.Q.data)\n noise = paddle.to_tensor(noise, dtype='int64', stop_gradient=False)\n noise_idx = paddle.reshape(noise, [batch_size, self.K])\n return noise_idx\n\n def get_combined_idx(self, target_idx, noise_idx):\n \"\"\"Combined target and noise\n \"\"\"\n target_idx = paddle.reshape(target_idx, [-1, 1])\n return paddle.concat((target_idx, noise_idx), 1)\n\n def nce_loss(self, prob_model, prob_noise_in_model, prob_noise,\n prob_target_in_noise):\n \"\"\"Combined the loss of target and noise\n \"\"\"\n\n def safe_log(tensor):\n \"\"\"Safe log\n \"\"\"\n EPSILON = 1e-10\n return paddle.log(EPSILON + tensor)\n\n model_loss = safe_log(prob_model /\n (prob_model + self.K * prob_target_in_noise))\n model_loss = paddle.reshape(model_loss, [-1])\n\n noise_loss = paddle.sum(\n safe_log((self.K * prob_noise) /\n (prob_noise_in_model + self.K * prob_noise)), -1)\n noise_loss = paddle.reshape(noise_loss, [-1])\n\n loss = -(model_loss + noise_loss)\n\n return loss\n", "path": "paddlespeech/vector/modules/loss.py"}], "after_files": [{"content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport paddle\n\n\ndef get_chunks(seg_dur, audio_id, audio_duration):\n \"\"\"Get all chunk segments from a utterance\n\n Args:\n seg_dur (float): segment chunk duration, seconds\n audio_id (str): utterance name, \n audio_duration (float): utterance duration, seconds\n\n Returns:\n List: all the chunk segments \n \"\"\"\n num_chunks = int(audio_duration / seg_dur) # all in seconds\n chunk_lst = [\n audio_id + \"_\" + str(i * seg_dur) + \"_\" + str(i * seg_dur + seg_dur)\n for i in range(num_chunks)\n ]\n return chunk_lst\n\n\ndef Q_from_tokens(token_num):\n \"\"\"Get prior model, data from uniform, would support others(guassian) in future\n \"\"\"\n freq = [1] * token_num\n Q = paddle.to_tensor(freq, dtype='float64')\n return Q / Q.sum()\n", "path": "paddlespeech/vector/utils/vector_utils.py"}, {"content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# This is modified from SpeechBrain\n# https://github.com/speechbrain/speechbrain/blob/085be635c07f16d42cd1295045bc46c407f1e15b/speechbrain/nnet/losses.py\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\nclass AngularMargin(nn.Layer):\n def __init__(self, margin=0.0, scale=1.0):\n \"\"\"An implementation of Angular Margin (AM) proposed in the following\n paper: '''Margin Matters: Towards More Discriminative Deep Neural Network\n Embeddings for Speaker Recognition''' (https://arxiv.org/abs/1906.07317)\n\n Args:\n margin (float, optional): The margin for cosine similiarity. Defaults to 0.0.\n scale (float, optional): The scale for cosine similiarity. Defaults to 1.0.\n \"\"\"\n super(AngularMargin, self).__init__()\n self.margin = margin\n self.scale = scale\n\n def forward(self, outputs, targets):\n outputs = outputs - self.margin * targets\n return self.scale * outputs\n\n\nclass AdditiveAngularMargin(AngularMargin):\n def __init__(self, margin=0.0, scale=1.0, easy_margin=False):\n \"\"\"The Implementation of Additive Angular Margin (AAM) proposed\n in the following paper: '''Margin Matters: Towards More Discriminative Deep Neural Network Embeddings for Speaker Recognition'''\n (https://arxiv.org/abs/1906.07317)\n\n Args:\n margin (float, optional): margin factor. Defaults to 0.0.\n scale (float, optional): scale factor. Defaults to 1.0.\n easy_margin (bool, optional): easy_margin flag. Defaults to False.\n \"\"\"\n super(AdditiveAngularMargin, self).__init__(margin, scale)\n self.easy_margin = easy_margin\n\n self.cos_m = math.cos(self.margin)\n self.sin_m = math.sin(self.margin)\n self.th = math.cos(math.pi - self.margin)\n self.mm = math.sin(math.pi - self.margin) * self.margin\n\n def forward(self, outputs, targets):\n cosine = outputs.astype('float32')\n sine = paddle.sqrt(1.0 - paddle.pow(cosine, 2))\n phi = cosine * self.cos_m - sine * self.sin_m # cos(theta + m)\n if self.easy_margin:\n phi = paddle.where(cosine > 0, phi, cosine)\n else:\n phi = paddle.where(cosine > self.th, phi, cosine - self.mm)\n outputs = (targets * phi) + ((1.0 - targets) * cosine)\n return self.scale * outputs\n\n\nclass LogSoftmaxWrapper(nn.Layer):\n def __init__(self, loss_fn):\n \"\"\"Speaker identificatin loss function wrapper \n including all of compositions of the loss transformation\n Args:\n loss_fn (_type_): the loss value of a batch\n \"\"\"\n super(LogSoftmaxWrapper, self).__init__()\n self.loss_fn = loss_fn\n self.criterion = paddle.nn.KLDivLoss(reduction=\"sum\")\n\n def forward(self, outputs, targets, length=None):\n targets = F.one_hot(targets, outputs.shape[1])\n try:\n predictions = self.loss_fn(outputs, targets)\n except TypeError:\n predictions = self.loss_fn(outputs)\n\n predictions = F.log_softmax(predictions, axis=1)\n loss = self.criterion(predictions, targets) / targets.sum()\n return loss\n\n\nclass NCELoss(nn.Layer):\n \"\"\"Noise Contrastive Estimation loss funtion\n\n Noise Contrastive Estimation (NCE) is an approximation method that is used to\n work around the huge computational cost of large softmax layer.\n The basic idea is to convert the prediction problem into classification problem\n at training stage. It has been proved that these two criterions converges to\n the same minimal point as long as noise distribution is close enough to real one.\n\n NCE bridges the gap between generative models and discriminative models,\n rather than simply speedup the softmax layer.\n With NCE, you can turn almost anything into posterior with less effort (I think).\n\n Refs:\n NCE\uff1ahttp://www.cs.helsinki.fi/u/ahyvarin/papers/Gutmann10AISTATS.pdf\n Thanks: https://github.com/mingen-pan/easy-to-use-NCE-RNN-for-Pytorch/blob/master/nce.py\n\n Examples:\n Q = Q_from_tokens(output_dim)\n NCELoss(Q)\n \"\"\"\n\n def __init__(self, Q, noise_ratio=100, Z_offset=9.5):\n \"\"\"Noise Contrastive Estimation loss funtion\n\n Args:\n Q (tensor): prior model, uniform or guassian\n noise_ratio (int, optional): noise sampling times. Defaults to 100.\n Z_offset (float, optional): scale of post processing the score. Defaults to 9.5.\n \"\"\"\n super(NCELoss, self).__init__()\n assert type(noise_ratio) is int\n self.Q = paddle.to_tensor(Q, stop_gradient=False)\n self.N = self.Q.shape[0]\n self.K = noise_ratio\n self.Z_offset = Z_offset\n\n def forward(self, output, target):\n \"\"\"Forward inference\n\n Args:\n output (tensor): the model output, which is the input of loss function\n \"\"\"\n output = paddle.reshape(output, [-1, self.N])\n B = output.shape[0]\n noise_idx = self.get_noise(B)\n idx = self.get_combined_idx(target, noise_idx)\n P_target, P_noise = self.get_prob(idx, output, sep_target=True)\n Q_target, Q_noise = self.get_Q(idx)\n loss = self.nce_loss(P_target, P_noise, Q_noise, Q_target)\n return loss.mean()\n\n def get_Q(self, idx, sep_target=True):\n \"\"\"Get prior model of batchsize data\n \"\"\"\n idx_size = idx.size\n prob_model = paddle.to_tensor(\n self.Q.numpy()[paddle.reshape(idx, [-1]).numpy()])\n prob_model = paddle.reshape(prob_model, [idx.shape[0], idx.shape[1]])\n if sep_target:\n return prob_model[:, 0], prob_model[:, 1:]\n else:\n return prob_model\n\n def get_prob(self, idx, scores, sep_target=True):\n \"\"\"Post processing the score of post model(output of nn) of batchsize data\n \"\"\"\n scores = self.get_scores(idx, scores)\n scale = paddle.to_tensor([self.Z_offset], dtype='float64')\n scores = paddle.add(scores, -scale)\n prob = paddle.exp(scores)\n if sep_target:\n return prob[:, 0], prob[:, 1:]\n else:\n return prob\n\n def get_scores(self, idx, scores):\n \"\"\"Get the score of post model(output of nn) of batchsize data\n \"\"\"\n B, N = scores.shape\n K = idx.shape[1]\n idx_increment = paddle.to_tensor(\n N * paddle.reshape(paddle.arange(B), [B, 1]) * paddle.ones([1, K]),\n dtype=\"int64\",\n stop_gradient=False)\n new_idx = idx_increment + idx\n new_scores = paddle.index_select(\n paddle.reshape(scores, [-1]), paddle.reshape(new_idx, [-1]))\n\n return paddle.reshape(new_scores, [B, K])\n\n def get_noise(self, batch_size, uniform=True):\n \"\"\"Select noise sample\n \"\"\"\n if uniform:\n noise = np.random.randint(self.N, size=self.K * batch_size)\n else:\n noise = np.random.choice(\n self.N, self.K * batch_size, replace=True, p=self.Q.data)\n noise = paddle.to_tensor(noise, dtype='int64', stop_gradient=False)\n noise_idx = paddle.reshape(noise, [batch_size, self.K])\n return noise_idx\n\n def get_combined_idx(self, target_idx, noise_idx):\n \"\"\"Combined target and noise\n \"\"\"\n target_idx = paddle.reshape(target_idx, [-1, 1])\n return paddle.concat((target_idx, noise_idx), 1)\n\n def nce_loss(self, prob_model, prob_noise_in_model, prob_noise,\n prob_target_in_noise):\n \"\"\"Combined the loss of target and noise\n \"\"\"\n\n def safe_log(tensor):\n \"\"\"Safe log\n \"\"\"\n EPSILON = 1e-10\n return paddle.log(EPSILON + tensor)\n\n model_loss = safe_log(prob_model /\n (prob_model + self.K * prob_target_in_noise))\n model_loss = paddle.reshape(model_loss, [-1])\n\n noise_loss = paddle.sum(\n safe_log((self.K * prob_noise) /\n (prob_noise_in_model + self.K * prob_noise)), -1)\n noise_loss = paddle.reshape(noise_loss, [-1])\n\n loss = -(model_loss + noise_loss)\n\n return loss\n\n\nclass FocalLoss(nn.Layer):\n \"\"\"This criterion is a implemenation of Focal Loss, which is proposed in \n Focal Loss for Dense Object Detection.\n\n Loss(x, class) = - \\alpha (1-softmax(x)[class])^gamma \\log(softmax(x)[class])\n\n The losses are averaged across observations for each minibatch.\n\n Args:\n alpha(1D Tensor, Variable) : the scalar factor for this criterion\n gamma(float, double) : gamma > 0; reduces the relative loss for well-classi\ufb01ed examples (p > .5), \n putting more focus on hard, misclassi\ufb01ed examples\n size_average(bool): By default, the losses are averaged over observations for each minibatch.\n However, if the field size_average is set to False, the losses are\n instead summed for each minibatch.\n \"\"\"\n\n def __init__(self, alpha=1, gamma=0, size_average=True, ignore_index=-100):\n super(FocalLoss, self).__init__()\n self.alpha = alpha\n self.gamma = gamma\n self.size_average = size_average\n self.ce = nn.CrossEntropyLoss(\n ignore_index=ignore_index, reduction=\"none\")\n\n def forward(self, outputs, targets):\n \"\"\"Forword inference.\n\n Args:\n outputs: input tensor\n target: target label tensor\n \"\"\"\n ce_loss = self.ce(outputs, targets)\n pt = paddle.exp(-ce_loss)\n focal_loss = self.alpha * (1 - pt)**self.gamma * ce_loss\n if self.size_average:\n return focal_loss.mean()\n else:\n return focal_loss.sum()\n\n\nif __name__ == \"__main__\":\n import numpy as np\n from paddlespeech.vector.utils.vector_utils import Q_from_tokens\n paddle.set_device(\"cpu\")\n\n input_data = paddle.uniform([5, 100], dtype=\"float64\")\n label_data = np.random.randint(0, 100, size=(5)).astype(np.int64)\n\n input = paddle.to_tensor(input_data)\n label = paddle.to_tensor(label_data)\n\n loss1 = FocalLoss()\n loss = loss1.forward(input, label)\n print(\"loss: %.5f\" % (loss))\n\n Q = Q_from_tokens(100)\n loss2 = NCELoss(Q)\n loss = loss2.forward(input, label)\n print(\"loss: %.5f\" % (loss))\n", "path": "paddlespeech/vector/modules/loss.py"}]} | 3,414 | 930 |
gh_patches_debug_18242 | rasdani/github-patches | git_diff | Mailu__Mailu-1542 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Dovecot does not use redis, so it should be removed from start script
In core/dovecot/start.py REDIS_ADDRESS is resolved but redis is not used on dovecot. It should be removed from the script.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/dovecot/start.py`
Content:
```
1 #!/usr/bin/python3
2
3 import os
4 import glob
5 import multiprocessing
6 import logging as log
7 import sys
8
9 from podop import run_server
10 from socrate import system, conf
11
12 log.basicConfig(stream=sys.stderr, level=os.environ.get("LOG_LEVEL", "WARNING"))
13
14 def start_podop():
15 os.setuid(8)
16 url = "http://" + os.environ["ADMIN_ADDRESS"] + "/internal/dovecot/§"
17 run_server(0, "dovecot", "/tmp/podop.socket", [
18 ("quota", "url", url ),
19 ("auth", "url", url),
20 ("sieve", "url", url),
21 ])
22
23 # Actual startup script
24
25 os.environ["FRONT_ADDRESS"] = system.get_host_address_from_environment("FRONT", "front")
26 os.environ["REDIS_ADDRESS"] = system.get_host_address_from_environment("REDIS", "redis")
27 os.environ["ADMIN_ADDRESS"] = system.get_host_address_from_environment("ADMIN", "admin")
28 os.environ["ANTISPAM_WEBUI_ADDRESS"] = system.get_host_address_from_environment("ANTISPAM_WEBUI", "antispam:11334")
29 if os.environ["WEBMAIL"] != "none":
30 os.environ["WEBMAIL_ADDRESS"] = system.get_host_address_from_environment("WEBMAIL", "webmail")
31
32 for dovecot_file in glob.glob("/conf/*.conf"):
33 conf.jinja(dovecot_file, os.environ, os.path.join("/etc/dovecot", os.path.basename(dovecot_file)))
34
35 os.makedirs("/conf/bin", exist_ok=True)
36 for script_file in glob.glob("/conf/*.script"):
37 out_file = os.path.join("/conf/bin/", os.path.basename(script_file).replace('.script',''))
38 conf.jinja(script_file, os.environ, out_file)
39 os.chmod(out_file, 0o555)
40
41 # Run Podop, then postfix
42 multiprocessing.Process(target=start_podop).start()
43 os.system("chown mail:mail /mail")
44 os.system("chown -R mail:mail /var/lib/dovecot /conf")
45 os.execv("/usr/sbin/dovecot", ["dovecot", "-c", "/etc/dovecot/dovecot.conf", "-F"])
46
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/core/dovecot/start.py b/core/dovecot/start.py
--- a/core/dovecot/start.py
+++ b/core/dovecot/start.py
@@ -21,13 +21,9 @@
])
# Actual startup script
-
os.environ["FRONT_ADDRESS"] = system.get_host_address_from_environment("FRONT", "front")
-os.environ["REDIS_ADDRESS"] = system.get_host_address_from_environment("REDIS", "redis")
os.environ["ADMIN_ADDRESS"] = system.get_host_address_from_environment("ADMIN", "admin")
os.environ["ANTISPAM_WEBUI_ADDRESS"] = system.get_host_address_from_environment("ANTISPAM_WEBUI", "antispam:11334")
-if os.environ["WEBMAIL"] != "none":
- os.environ["WEBMAIL_ADDRESS"] = system.get_host_address_from_environment("WEBMAIL", "webmail")
for dovecot_file in glob.glob("/conf/*.conf"):
conf.jinja(dovecot_file, os.environ, os.path.join("/etc/dovecot", os.path.basename(dovecot_file)))
| {"golden_diff": "diff --git a/core/dovecot/start.py b/core/dovecot/start.py\n--- a/core/dovecot/start.py\n+++ b/core/dovecot/start.py\n@@ -21,13 +21,9 @@\n ])\n \n # Actual startup script\n-\n os.environ[\"FRONT_ADDRESS\"] = system.get_host_address_from_environment(\"FRONT\", \"front\")\n-os.environ[\"REDIS_ADDRESS\"] = system.get_host_address_from_environment(\"REDIS\", \"redis\")\n os.environ[\"ADMIN_ADDRESS\"] = system.get_host_address_from_environment(\"ADMIN\", \"admin\")\n os.environ[\"ANTISPAM_WEBUI_ADDRESS\"] = system.get_host_address_from_environment(\"ANTISPAM_WEBUI\", \"antispam:11334\")\n-if os.environ[\"WEBMAIL\"] != \"none\":\n- os.environ[\"WEBMAIL_ADDRESS\"] = system.get_host_address_from_environment(\"WEBMAIL\", \"webmail\")\n \n for dovecot_file in glob.glob(\"/conf/*.conf\"):\n conf.jinja(dovecot_file, os.environ, os.path.join(\"/etc/dovecot\", os.path.basename(dovecot_file)))\n", "issue": "Dovecot does not use redis, so it should be removed from start script \nIn core/dovecot/start.py REDIS_ADDRESS is resolved but redis is not used on dovecot. It should be removed from the script.\n", "before_files": [{"content": "#!/usr/bin/python3\n\nimport os\nimport glob\nimport multiprocessing\nimport logging as log\nimport sys\n\nfrom podop import run_server\nfrom socrate import system, conf\n\nlog.basicConfig(stream=sys.stderr, level=os.environ.get(\"LOG_LEVEL\", \"WARNING\"))\n\ndef start_podop():\n os.setuid(8)\n url = \"http://\" + os.environ[\"ADMIN_ADDRESS\"] + \"/internal/dovecot/\u00a7\"\n run_server(0, \"dovecot\", \"/tmp/podop.socket\", [\n\t\t(\"quota\", \"url\", url ),\n\t\t(\"auth\", \"url\", url),\n\t\t(\"sieve\", \"url\", url),\n ])\n\n# Actual startup script\n\nos.environ[\"FRONT_ADDRESS\"] = system.get_host_address_from_environment(\"FRONT\", \"front\")\nos.environ[\"REDIS_ADDRESS\"] = system.get_host_address_from_environment(\"REDIS\", \"redis\")\nos.environ[\"ADMIN_ADDRESS\"] = system.get_host_address_from_environment(\"ADMIN\", \"admin\")\nos.environ[\"ANTISPAM_WEBUI_ADDRESS\"] = system.get_host_address_from_environment(\"ANTISPAM_WEBUI\", \"antispam:11334\")\nif os.environ[\"WEBMAIL\"] != \"none\":\n os.environ[\"WEBMAIL_ADDRESS\"] = system.get_host_address_from_environment(\"WEBMAIL\", \"webmail\")\n\nfor dovecot_file in glob.glob(\"/conf/*.conf\"):\n conf.jinja(dovecot_file, os.environ, os.path.join(\"/etc/dovecot\", os.path.basename(dovecot_file)))\n\nos.makedirs(\"/conf/bin\", exist_ok=True)\nfor script_file in glob.glob(\"/conf/*.script\"):\n out_file = os.path.join(\"/conf/bin/\", os.path.basename(script_file).replace('.script',''))\n conf.jinja(script_file, os.environ, out_file)\n os.chmod(out_file, 0o555)\n\n# Run Podop, then postfix\nmultiprocessing.Process(target=start_podop).start()\nos.system(\"chown mail:mail /mail\")\nos.system(\"chown -R mail:mail /var/lib/dovecot /conf\")\nos.execv(\"/usr/sbin/dovecot\", [\"dovecot\", \"-c\", \"/etc/dovecot/dovecot.conf\", \"-F\"])\n", "path": "core/dovecot/start.py"}], "after_files": [{"content": "#!/usr/bin/python3\n\nimport os\nimport glob\nimport multiprocessing\nimport logging as log\nimport sys\n\nfrom podop import run_server\nfrom socrate import system, conf\n\nlog.basicConfig(stream=sys.stderr, level=os.environ.get(\"LOG_LEVEL\", \"WARNING\"))\n\ndef start_podop():\n os.setuid(8)\n url = \"http://\" + os.environ[\"ADMIN_ADDRESS\"] + \"/internal/dovecot/\u00a7\"\n run_server(0, \"dovecot\", \"/tmp/podop.socket\", [\n\t\t(\"quota\", \"url\", url ),\n\t\t(\"auth\", \"url\", url),\n\t\t(\"sieve\", \"url\", url),\n ])\n\n# Actual startup script\nos.environ[\"FRONT_ADDRESS\"] = system.get_host_address_from_environment(\"FRONT\", \"front\")\nos.environ[\"ADMIN_ADDRESS\"] = system.get_host_address_from_environment(\"ADMIN\", \"admin\")\nos.environ[\"ANTISPAM_WEBUI_ADDRESS\"] = system.get_host_address_from_environment(\"ANTISPAM_WEBUI\", \"antispam:11334\")\n\nfor dovecot_file in glob.glob(\"/conf/*.conf\"):\n conf.jinja(dovecot_file, os.environ, os.path.join(\"/etc/dovecot\", os.path.basename(dovecot_file)))\n\nos.makedirs(\"/conf/bin\", exist_ok=True)\nfor script_file in glob.glob(\"/conf/*.script\"):\n out_file = os.path.join(\"/conf/bin/\", os.path.basename(script_file).replace('.script',''))\n conf.jinja(script_file, os.environ, out_file)\n os.chmod(out_file, 0o555)\n\n# Run Podop, then postfix\nmultiprocessing.Process(target=start_podop).start()\nos.system(\"chown mail:mail /mail\")\nos.system(\"chown -R mail:mail /var/lib/dovecot /conf\")\nos.execv(\"/usr/sbin/dovecot\", [\"dovecot\", \"-c\", \"/etc/dovecot/dovecot.conf\", \"-F\"])\n", "path": "core/dovecot/start.py"}]} | 866 | 230 |
gh_patches_debug_3694 | rasdani/github-patches | git_diff | ansible__ansible-22432 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
tag attribute in ec2_snapshot_facts module should be dict instead of list
##### ISSUE TYPE
- Documentation Report
##### COMPONENT NAME
ansible-modules-extras/cloud/amazon/ec2_snapshot_facts.py
##### ANSIBLE VERSION
```
$ ansible --version
ansible 2.2.1.0
config file = /home/psd/.ansible.cfg
configured module search path = ['/home/psd/.ansible/library']
```
##### CONFIGURATION
N/A. Not configuration-specific
##### OS / ENVIRONMENT
N/A
##### SUMMARY
The returned snapshots dict have a tag attribute, it was already [converted to dict in the code](https://github.com/ansible/ansible-modules-extras/blob/devel/cloud/amazon/ec2_snapshot_facts.py#L196)
So the expected tag attr in this module is a dict, not a list(which is the return value type in boto3)
##### STEPS TO REPRODUCE
get snapshot_facts, we have an example output in the next section:
##### ACTUAL RESULTS
```
{
"description": "",
"encrypted": false,
"owner_id": "omitted",
"progress": "100%",
"snapshot_id": "snap-omitted",
"start_time": "2017-03-08T03:52:29+00:00",
"state": "completed",
"tags": {
"Name": "some name",
"creator": "cron on some machine",
"frequency": "hourly"
},
"volume_id": "vol-omitted",
"volume_size": 40
}
```
As you can see, the tags attr is a dict.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lib/ansible/modules/cloud/amazon/ec2_snapshot_facts.py`
Content:
```
1 #!/usr/bin/python
2 # This file is part of Ansible
3 #
4 # Ansible is free software: you can redistribute it and/or modify
5 # it under the terms of the GNU General Public License as published by
6 # the Free Software Foundation, either version 3 of the License, or
7 # (at your option) any later version.
8 #
9 # Ansible is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU General Public License
15 # along with Ansible. If not, see <http://www.gnu.org/licenses/>.
16
17 ANSIBLE_METADATA = {'status': ['preview'],
18 'supported_by': 'community',
19 'version': '1.0'}
20
21 DOCUMENTATION = '''
22 ---
23 module: ec2_snapshot_facts
24 short_description: Gather facts about ec2 volume snapshots in AWS
25 description:
26 - Gather facts about ec2 volume snapshots in AWS
27 version_added: "2.1"
28 author: "Rob White (@wimnat)"
29 options:
30 snapshot_ids:
31 description:
32 - If you specify one or more snapshot IDs, only snapshots that have the specified IDs are returned.
33 required: false
34 default: []
35 owner_ids:
36 description:
37 - If you specify one or more snapshot owners, only snapshots from the specified owners and for which you have \
38 access are returned.
39 required: false
40 default: []
41 restorable_by_user_ids:
42 description:
43 - If you specify a list of restorable users, only snapshots with create snapshot permissions for those users are \
44 returned.
45 required: false
46 default: []
47 filters:
48 description:
49 - A dict of filters to apply. Each dict item consists of a filter key and a filter value. See \
50 U(http://docs.aws.amazon.com/AWSEC2/latest/APIReference/API_DescribeSnapshots.html) for possible filters. Filter \
51 names and values are case sensitive.
52 required: false
53 default: {}
54 notes:
55 - By default, the module will return all snapshots, including public ones. To limit results to snapshots owned by \
56 the account use the filter 'owner-id'.
57
58 extends_documentation_fragment:
59 - aws
60 - ec2
61 '''
62
63 EXAMPLES = '''
64 # Note: These examples do not set authentication details, see the AWS Guide for details.
65
66 # Gather facts about all snapshots, including public ones
67 - ec2_snapshot_facts:
68
69 # Gather facts about all snapshots owned by the account 0123456789
70 - ec2_snapshot_facts:
71 filters:
72 owner-id: 0123456789
73
74 # Or alternatively...
75 - ec2_snapshot_facts:
76 owner_ids:
77 - 0123456789
78
79 # Gather facts about a particular snapshot using ID
80 - ec2_snapshot_facts:
81 filters:
82 snapshot-id: snap-00112233
83
84 # Or alternatively...
85 - ec2_snapshot_facts:
86 snapshot_ids:
87 - snap-00112233
88
89 # Gather facts about any snapshot with a tag key Name and value Example
90 - ec2_snapshot_facts:
91 filters:
92 "tag:Name": Example
93
94 # Gather facts about any snapshot with an error status
95 - ec2_snapshot_facts:
96 filters:
97 status: error
98
99 '''
100
101 RETURN = '''
102 snapshot_id:
103 description: The ID of the snapshot. Each snapshot receives a unique identifier when it is created.
104 type: string
105 sample: snap-01234567
106 volume_id:
107 description: The ID of the volume that was used to create the snapshot.
108 type: string
109 sample: vol-01234567
110 state:
111 description: The snapshot state (completed, pending or error).
112 type: string
113 sample: completed
114 state_message:
115 description: Encrypted Amazon EBS snapshots are copied asynchronously. If a snapshot copy operation fails (for example, if the proper AWS Key Management Service (AWS KMS) permissions are not obtained) this field displays error state details to help you diagnose why the error occurred.
116 type: string
117 sample:
118 start_time:
119 description: The time stamp when the snapshot was initiated.
120 type: datetime
121 sample: 2015-02-12T02:14:02+00:00
122 progress:
123 description: The progress of the snapshot, as a percentage.
124 type: string
125 sample: 100%
126 owner_id:
127 description: The AWS account ID of the EBS snapshot owner.
128 type: string
129 sample: 099720109477
130 description:
131 description: The description for the snapshot.
132 type: string
133 sample: My important backup
134 volume_size:
135 description: The size of the volume, in GiB.
136 type: integer
137 sample: 8
138 owner_alias:
139 description: The AWS account alias (for example, amazon, self) or AWS account ID that owns the snapshot.
140 type: string
141 sample: 033440102211
142 tags:
143 description: Any tags assigned to the snapshot.
144 type: list
145 sample: "{ 'my_tag_key': 'my_tag_value' }"
146 encrypted:
147 description: Indicates whether the snapshot is encrypted.
148 type: boolean
149 sample: True
150 kms_key_id:
151 description: The full ARN of the AWS Key Management Service (AWS KMS) customer master key (CMK) that was used to \
152 protect the volume encryption key for the parent volume.
153 type: string
154 sample: 74c9742a-a1b2-45cb-b3fe-abcdef123456
155 data_encryption_key_id:
156 description: The data encryption key identifier for the snapshot. This value is a unique identifier that \
157 corresponds to the data encryption key that was used to encrypt the original volume or snapshot copy.
158 type: string
159 sample: "arn:aws:kms:ap-southeast-2:012345678900:key/74c9742a-a1b2-45cb-b3fe-abcdef123456"
160
161 '''
162
163 try:
164 import boto3
165 from botocore.exceptions import ClientError, NoCredentialsError
166 HAS_BOTO3 = True
167 except ImportError:
168 HAS_BOTO3 = False
169
170 from ansible.module_utils.basic import AnsibleModule
171 from ansible.module_utils.ec2 import (ansible_dict_to_boto3_filter_list,
172 boto3_conn, boto3_tag_list_to_ansible_dict, camel_dict_to_snake_dict,
173 ec2_argument_spec, get_aws_connection_info)
174
175
176 def list_ec2_snapshots(connection, module):
177
178 snapshot_ids = module.params.get("snapshot_ids")
179 owner_ids = map(str, module.params.get("owner_ids"))
180 restorable_by_user_ids = module.params.get("restorable_by_user_ids")
181 filters = ansible_dict_to_boto3_filter_list(module.params.get("filters"))
182
183 try:
184 snapshots = connection.describe_snapshots(SnapshotIds=snapshot_ids, OwnerIds=owner_ids, RestorableByUserIds=restorable_by_user_ids, Filters=filters)
185 except ClientError as e:
186 module.fail_json(msg=e.message)
187
188 # Turn the boto3 result in to ansible_friendly_snaked_names
189 snaked_snapshots = []
190 for snapshot in snapshots['Snapshots']:
191 snaked_snapshots.append(camel_dict_to_snake_dict(snapshot))
192
193 # Turn the boto3 result in to ansible friendly tag dictionary
194 for snapshot in snaked_snapshots:
195 if 'tags' in snapshot:
196 snapshot['tags'] = boto3_tag_list_to_ansible_dict(snapshot['tags'])
197
198 module.exit_json(snapshots=snaked_snapshots)
199
200
201 def main():
202
203 argument_spec = ec2_argument_spec()
204 argument_spec.update(
205 dict(
206 snapshot_ids=dict(default=[], type='list'),
207 owner_ids=dict(default=[], type='list'),
208 restorable_by_user_ids=dict(default=[], type='list'),
209 filters=dict(default={}, type='dict')
210 )
211 )
212
213 module = AnsibleModule(argument_spec=argument_spec,
214 mutually_exclusive=[
215 ['snapshot_ids', 'owner_ids', 'restorable_by_user_ids', 'filters']
216 ]
217 )
218
219 if not HAS_BOTO3:
220 module.fail_json(msg='boto3 required for this module')
221
222 region, ec2_url, aws_connect_params = get_aws_connection_info(module, boto3=True)
223
224 if region:
225 connection = boto3_conn(module, conn_type='client', resource='ec2', region=region, endpoint=ec2_url, **aws_connect_params)
226 else:
227 module.fail_json(msg="region must be specified")
228
229 list_ec2_snapshots(connection, module)
230
231
232 if __name__ == '__main__':
233 main()
234
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lib/ansible/modules/cloud/amazon/ec2_snapshot_facts.py b/lib/ansible/modules/cloud/amazon/ec2_snapshot_facts.py
--- a/lib/ansible/modules/cloud/amazon/ec2_snapshot_facts.py
+++ b/lib/ansible/modules/cloud/amazon/ec2_snapshot_facts.py
@@ -141,7 +141,7 @@
sample: 033440102211
tags:
description: Any tags assigned to the snapshot.
- type: list
+ type: dict
sample: "{ 'my_tag_key': 'my_tag_value' }"
encrypted:
description: Indicates whether the snapshot is encrypted.
| {"golden_diff": "diff --git a/lib/ansible/modules/cloud/amazon/ec2_snapshot_facts.py b/lib/ansible/modules/cloud/amazon/ec2_snapshot_facts.py\n--- a/lib/ansible/modules/cloud/amazon/ec2_snapshot_facts.py\n+++ b/lib/ansible/modules/cloud/amazon/ec2_snapshot_facts.py\n@@ -141,7 +141,7 @@\n sample: 033440102211\n tags:\n description: Any tags assigned to the snapshot.\n- type: list\n+ type: dict\n sample: \"{ 'my_tag_key': 'my_tag_value' }\"\n encrypted:\n description: Indicates whether the snapshot is encrypted.\n", "issue": "tag attribute in ec2_snapshot_facts module should be dict instead of list\n##### ISSUE TYPE\r\n - Documentation Report\r\n\r\n##### COMPONENT NAME\r\nansible-modules-extras/cloud/amazon/ec2_snapshot_facts.py\r\n\r\n##### ANSIBLE VERSION\r\n```\r\n$ ansible --version\r\nansible 2.2.1.0\r\n config file = /home/psd/.ansible.cfg\r\n configured module search path = ['/home/psd/.ansible/library']\r\n```\r\n\r\n##### CONFIGURATION\r\nN/A. Not configuration-specific\r\n\r\n##### OS / ENVIRONMENT\r\nN/A\r\n\r\n##### SUMMARY\r\nThe returned snapshots dict have a tag attribute, it was already [converted to dict in the code](https://github.com/ansible/ansible-modules-extras/blob/devel/cloud/amazon/ec2_snapshot_facts.py#L196)\r\n\r\nSo the expected tag attr in this module is a dict, not a list(which is the return value type in boto3)\r\n\r\n##### STEPS TO REPRODUCE\r\nget snapshot_facts, we have an example output in the next section:\r\n\r\n##### ACTUAL RESULTS\r\n\r\n```\r\n {\r\n \"description\": \"\",\r\n \"encrypted\": false,\r\n \"owner_id\": \"omitted\",\r\n \"progress\": \"100%\",\r\n \"snapshot_id\": \"snap-omitted\",\r\n \"start_time\": \"2017-03-08T03:52:29+00:00\",\r\n \"state\": \"completed\",\r\n \"tags\": {\r\n \"Name\": \"some name\",\r\n \"creator\": \"cron on some machine\",\r\n \"frequency\": \"hourly\"\r\n },\r\n \"volume_id\": \"vol-omitted\",\r\n \"volume_size\": 40\r\n }\r\n```\r\nAs you can see, the tags attr is a dict.\n", "before_files": [{"content": "#!/usr/bin/python\n# This file is part of Ansible\n#\n# Ansible is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# Ansible is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with Ansible. If not, see <http://www.gnu.org/licenses/>.\n\nANSIBLE_METADATA = {'status': ['preview'],\n 'supported_by': 'community',\n 'version': '1.0'}\n\nDOCUMENTATION = '''\n---\nmodule: ec2_snapshot_facts\nshort_description: Gather facts about ec2 volume snapshots in AWS\ndescription:\n - Gather facts about ec2 volume snapshots in AWS\nversion_added: \"2.1\"\nauthor: \"Rob White (@wimnat)\"\noptions:\n snapshot_ids:\n description:\n - If you specify one or more snapshot IDs, only snapshots that have the specified IDs are returned.\n required: false\n default: []\n owner_ids:\n description:\n - If you specify one or more snapshot owners, only snapshots from the specified owners and for which you have \\\n access are returned.\n required: false\n default: []\n restorable_by_user_ids:\n description:\n - If you specify a list of restorable users, only snapshots with create snapshot permissions for those users are \\\n returned.\n required: false\n default: []\n filters:\n description:\n - A dict of filters to apply. Each dict item consists of a filter key and a filter value. See \\\n U(http://docs.aws.amazon.com/AWSEC2/latest/APIReference/API_DescribeSnapshots.html) for possible filters. Filter \\\n names and values are case sensitive.\n required: false\n default: {}\nnotes:\n - By default, the module will return all snapshots, including public ones. To limit results to snapshots owned by \\\n the account use the filter 'owner-id'.\n\nextends_documentation_fragment:\n - aws\n - ec2\n'''\n\nEXAMPLES = '''\n# Note: These examples do not set authentication details, see the AWS Guide for details.\n\n# Gather facts about all snapshots, including public ones\n- ec2_snapshot_facts:\n\n# Gather facts about all snapshots owned by the account 0123456789\n- ec2_snapshot_facts:\n filters:\n owner-id: 0123456789\n\n# Or alternatively...\n- ec2_snapshot_facts:\n owner_ids:\n - 0123456789\n\n# Gather facts about a particular snapshot using ID\n- ec2_snapshot_facts:\n filters:\n snapshot-id: snap-00112233\n\n# Or alternatively...\n- ec2_snapshot_facts:\n snapshot_ids:\n - snap-00112233\n\n# Gather facts about any snapshot with a tag key Name and value Example\n- ec2_snapshot_facts:\n filters:\n \"tag:Name\": Example\n\n# Gather facts about any snapshot with an error status\n- ec2_snapshot_facts:\n filters:\n status: error\n\n'''\n\nRETURN = '''\nsnapshot_id:\n description: The ID of the snapshot. Each snapshot receives a unique identifier when it is created.\n type: string\n sample: snap-01234567\nvolume_id:\n description: The ID of the volume that was used to create the snapshot.\n type: string\n sample: vol-01234567\nstate:\n description: The snapshot state (completed, pending or error).\n type: string\n sample: completed\nstate_message:\n description: Encrypted Amazon EBS snapshots are copied asynchronously. If a snapshot copy operation fails (for example, if the proper AWS Key Management Service (AWS KMS) permissions are not obtained) this field displays error state details to help you diagnose why the error occurred.\n type: string\n sample:\nstart_time:\n description: The time stamp when the snapshot was initiated.\n type: datetime\n sample: 2015-02-12T02:14:02+00:00\nprogress:\n description: The progress of the snapshot, as a percentage.\n type: string\n sample: 100%\nowner_id:\n description: The AWS account ID of the EBS snapshot owner.\n type: string\n sample: 099720109477\ndescription:\n description: The description for the snapshot.\n type: string\n sample: My important backup\nvolume_size:\n description: The size of the volume, in GiB.\n type: integer\n sample: 8\nowner_alias:\n description: The AWS account alias (for example, amazon, self) or AWS account ID that owns the snapshot.\n type: string\n sample: 033440102211\ntags:\n description: Any tags assigned to the snapshot.\n type: list\n sample: \"{ 'my_tag_key': 'my_tag_value' }\"\nencrypted:\n description: Indicates whether the snapshot is encrypted.\n type: boolean\n sample: True\nkms_key_id:\n description: The full ARN of the AWS Key Management Service (AWS KMS) customer master key (CMK) that was used to \\\n protect the volume encryption key for the parent volume.\n type: string\n sample: 74c9742a-a1b2-45cb-b3fe-abcdef123456\ndata_encryption_key_id:\n description: The data encryption key identifier for the snapshot. This value is a unique identifier that \\\n corresponds to the data encryption key that was used to encrypt the original volume or snapshot copy.\n type: string\n sample: \"arn:aws:kms:ap-southeast-2:012345678900:key/74c9742a-a1b2-45cb-b3fe-abcdef123456\"\n\n'''\n\ntry:\n import boto3\n from botocore.exceptions import ClientError, NoCredentialsError\n HAS_BOTO3 = True\nexcept ImportError:\n HAS_BOTO3 = False\n\nfrom ansible.module_utils.basic import AnsibleModule\nfrom ansible.module_utils.ec2 import (ansible_dict_to_boto3_filter_list,\n boto3_conn, boto3_tag_list_to_ansible_dict, camel_dict_to_snake_dict,\n ec2_argument_spec, get_aws_connection_info)\n\n\ndef list_ec2_snapshots(connection, module):\n\n snapshot_ids = module.params.get(\"snapshot_ids\")\n owner_ids = map(str, module.params.get(\"owner_ids\"))\n restorable_by_user_ids = module.params.get(\"restorable_by_user_ids\")\n filters = ansible_dict_to_boto3_filter_list(module.params.get(\"filters\"))\n\n try:\n snapshots = connection.describe_snapshots(SnapshotIds=snapshot_ids, OwnerIds=owner_ids, RestorableByUserIds=restorable_by_user_ids, Filters=filters)\n except ClientError as e:\n module.fail_json(msg=e.message)\n\n # Turn the boto3 result in to ansible_friendly_snaked_names\n snaked_snapshots = []\n for snapshot in snapshots['Snapshots']:\n snaked_snapshots.append(camel_dict_to_snake_dict(snapshot))\n\n # Turn the boto3 result in to ansible friendly tag dictionary\n for snapshot in snaked_snapshots:\n if 'tags' in snapshot:\n snapshot['tags'] = boto3_tag_list_to_ansible_dict(snapshot['tags'])\n\n module.exit_json(snapshots=snaked_snapshots)\n\n\ndef main():\n\n argument_spec = ec2_argument_spec()\n argument_spec.update(\n dict(\n snapshot_ids=dict(default=[], type='list'),\n owner_ids=dict(default=[], type='list'),\n restorable_by_user_ids=dict(default=[], type='list'),\n filters=dict(default={}, type='dict')\n )\n )\n\n module = AnsibleModule(argument_spec=argument_spec,\n mutually_exclusive=[\n ['snapshot_ids', 'owner_ids', 'restorable_by_user_ids', 'filters']\n ]\n )\n\n if not HAS_BOTO3:\n module.fail_json(msg='boto3 required for this module')\n\n region, ec2_url, aws_connect_params = get_aws_connection_info(module, boto3=True)\n\n if region:\n connection = boto3_conn(module, conn_type='client', resource='ec2', region=region, endpoint=ec2_url, **aws_connect_params)\n else:\n module.fail_json(msg=\"region must be specified\")\n\n list_ec2_snapshots(connection, module)\n\n\nif __name__ == '__main__':\n main()\n", "path": "lib/ansible/modules/cloud/amazon/ec2_snapshot_facts.py"}], "after_files": [{"content": "#!/usr/bin/python\n# This file is part of Ansible\n#\n# Ansible is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# Ansible is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with Ansible. If not, see <http://www.gnu.org/licenses/>.\n\nANSIBLE_METADATA = {'status': ['preview'],\n 'supported_by': 'community',\n 'version': '1.0'}\n\nDOCUMENTATION = '''\n---\nmodule: ec2_snapshot_facts\nshort_description: Gather facts about ec2 volume snapshots in AWS\ndescription:\n - Gather facts about ec2 volume snapshots in AWS\nversion_added: \"2.1\"\nauthor: \"Rob White (@wimnat)\"\noptions:\n snapshot_ids:\n description:\n - If you specify one or more snapshot IDs, only snapshots that have the specified IDs are returned.\n required: false\n default: []\n owner_ids:\n description:\n - If you specify one or more snapshot owners, only snapshots from the specified owners and for which you have \\\n access are returned.\n required: false\n default: []\n restorable_by_user_ids:\n description:\n - If you specify a list of restorable users, only snapshots with create snapshot permissions for those users are \\\n returned.\n required: false\n default: []\n filters:\n description:\n - A dict of filters to apply. Each dict item consists of a filter key and a filter value. See \\\n U(http://docs.aws.amazon.com/AWSEC2/latest/APIReference/API_DescribeSnapshots.html) for possible filters. Filter \\\n names and values are case sensitive.\n required: false\n default: {}\nnotes:\n - By default, the module will return all snapshots, including public ones. To limit results to snapshots owned by \\\n the account use the filter 'owner-id'.\n\nextends_documentation_fragment:\n - aws\n - ec2\n'''\n\nEXAMPLES = '''\n# Note: These examples do not set authentication details, see the AWS Guide for details.\n\n# Gather facts about all snapshots, including public ones\n- ec2_snapshot_facts:\n\n# Gather facts about all snapshots owned by the account 0123456789\n- ec2_snapshot_facts:\n filters:\n owner-id: 0123456789\n\n# Or alternatively...\n- ec2_snapshot_facts:\n owner_ids:\n - 0123456789\n\n# Gather facts about a particular snapshot using ID\n- ec2_snapshot_facts:\n filters:\n snapshot-id: snap-00112233\n\n# Or alternatively...\n- ec2_snapshot_facts:\n snapshot_ids:\n - snap-00112233\n\n# Gather facts about any snapshot with a tag key Name and value Example\n- ec2_snapshot_facts:\n filters:\n \"tag:Name\": Example\n\n# Gather facts about any snapshot with an error status\n- ec2_snapshot_facts:\n filters:\n status: error\n\n'''\n\nRETURN = '''\nsnapshot_id:\n description: The ID of the snapshot. Each snapshot receives a unique identifier when it is created.\n type: string\n sample: snap-01234567\nvolume_id:\n description: The ID of the volume that was used to create the snapshot.\n type: string\n sample: vol-01234567\nstate:\n description: The snapshot state (completed, pending or error).\n type: string\n sample: completed\nstate_message:\n description: Encrypted Amazon EBS snapshots are copied asynchronously. If a snapshot copy operation fails (for example, if the proper AWS Key Management Service (AWS KMS) permissions are not obtained) this field displays error state details to help you diagnose why the error occurred.\n type: string\n sample:\nstart_time:\n description: The time stamp when the snapshot was initiated.\n type: datetime\n sample: 2015-02-12T02:14:02+00:00\nprogress:\n description: The progress of the snapshot, as a percentage.\n type: string\n sample: 100%\nowner_id:\n description: The AWS account ID of the EBS snapshot owner.\n type: string\n sample: 099720109477\ndescription:\n description: The description for the snapshot.\n type: string\n sample: My important backup\nvolume_size:\n description: The size of the volume, in GiB.\n type: integer\n sample: 8\nowner_alias:\n description: The AWS account alias (for example, amazon, self) or AWS account ID that owns the snapshot.\n type: string\n sample: 033440102211\ntags:\n description: Any tags assigned to the snapshot.\n type: dict\n sample: \"{ 'my_tag_key': 'my_tag_value' }\"\nencrypted:\n description: Indicates whether the snapshot is encrypted.\n type: boolean\n sample: True\nkms_key_id:\n description: The full ARN of the AWS Key Management Service (AWS KMS) customer master key (CMK) that was used to \\\n protect the volume encryption key for the parent volume.\n type: string\n sample: 74c9742a-a1b2-45cb-b3fe-abcdef123456\ndata_encryption_key_id:\n description: The data encryption key identifier for the snapshot. This value is a unique identifier that \\\n corresponds to the data encryption key that was used to encrypt the original volume or snapshot copy.\n type: string\n sample: \"arn:aws:kms:ap-southeast-2:012345678900:key/74c9742a-a1b2-45cb-b3fe-abcdef123456\"\n\n'''\n\ntry:\n import boto3\n from botocore.exceptions import ClientError, NoCredentialsError\n HAS_BOTO3 = True\nexcept ImportError:\n HAS_BOTO3 = False\n\nfrom ansible.module_utils.basic import AnsibleModule\nfrom ansible.module_utils.ec2 import (ansible_dict_to_boto3_filter_list,\n boto3_conn, boto3_tag_list_to_ansible_dict, camel_dict_to_snake_dict,\n ec2_argument_spec, get_aws_connection_info)\n\n\ndef list_ec2_snapshots(connection, module):\n\n snapshot_ids = module.params.get(\"snapshot_ids\")\n owner_ids = map(str, module.params.get(\"owner_ids\"))\n restorable_by_user_ids = module.params.get(\"restorable_by_user_ids\")\n filters = ansible_dict_to_boto3_filter_list(module.params.get(\"filters\"))\n\n try:\n snapshots = connection.describe_snapshots(SnapshotIds=snapshot_ids, OwnerIds=owner_ids, RestorableByUserIds=restorable_by_user_ids, Filters=filters)\n except ClientError as e:\n module.fail_json(msg=e.message)\n\n # Turn the boto3 result in to ansible_friendly_snaked_names\n snaked_snapshots = []\n for snapshot in snapshots['Snapshots']:\n snaked_snapshots.append(camel_dict_to_snake_dict(snapshot))\n\n # Turn the boto3 result in to ansible friendly tag dictionary\n for snapshot in snaked_snapshots:\n if 'tags' in snapshot:\n snapshot['tags'] = boto3_tag_list_to_ansible_dict(snapshot['tags'])\n\n module.exit_json(snapshots=snaked_snapshots)\n\n\ndef main():\n\n argument_spec = ec2_argument_spec()\n argument_spec.update(\n dict(\n snapshot_ids=dict(default=[], type='list'),\n owner_ids=dict(default=[], type='list'),\n restorable_by_user_ids=dict(default=[], type='list'),\n filters=dict(default={}, type='dict')\n )\n )\n\n module = AnsibleModule(argument_spec=argument_spec,\n mutually_exclusive=[\n ['snapshot_ids', 'owner_ids', 'restorable_by_user_ids', 'filters']\n ]\n )\n\n if not HAS_BOTO3:\n module.fail_json(msg='boto3 required for this module')\n\n region, ec2_url, aws_connect_params = get_aws_connection_info(module, boto3=True)\n\n if region:\n connection = boto3_conn(module, conn_type='client', resource='ec2', region=region, endpoint=ec2_url, **aws_connect_params)\n else:\n module.fail_json(msg=\"region must be specified\")\n\n list_ec2_snapshots(connection, module)\n\n\nif __name__ == '__main__':\n main()\n", "path": "lib/ansible/modules/cloud/amazon/ec2_snapshot_facts.py"}]} | 3,195 | 147 |
gh_patches_debug_17431 | rasdani/github-patches | git_diff | translate__pootle-5736 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
update_stores' last updated date doesn't tickle up to project overview/language list
When updating against templates, the /projects/projectname/ listing doesn't reflect the **last update**, **unless** the update affected a file in the **toplevel** dir.
Within a language overview (/lang/projectname), changes deep in a directory hierarchy will also affect the parent directory's last-change date.
using pootle 2.8.0b5 (TDF)
screenshots to clarify. overview lists last update as e.g. 3 weeks ago:

drilling down to the language reveals that the files in xmlsecurity actually had been updated only 8 hours ago (in fact xmlsecurity/uiconfig/ui.po)

(also sorting by the last updated is not working properly, goes from 10months to 8 hours, to 3weeks…)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pootle/apps/pootle_data/project_data.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 #
3 # Copyright (C) Pootle contributors.
4 #
5 # This file is a part of the Pootle project. It is distributed under the GPL3
6 # or later license. See the LICENSE file for a copy of the license and the
7 # AUTHORS file for copyright and authorship information.
8
9 from .utils import RelatedStoresDataTool, RelatedTPsDataTool
10
11
12 class ProjectDataTool(RelatedTPsDataTool):
13 """Retrieves aggregate stats for a Project"""
14
15 cache_key_name = "project"
16
17 def filter_data(self, qs):
18 return qs.filter(tp__project=self.context)
19
20
21 class ProjectResourceDataTool(RelatedStoresDataTool):
22 group_by = ("store__translation_project__language__code", )
23 cache_key_name = "project_resource"
24
25 @property
26 def project_path(self):
27 return (
28 "/%s%s"
29 % (self.project_code, self.tp_path))
30
31 @property
32 def tp_path(self):
33 return (
34 "/%s%s"
35 % (self.dir_path,
36 self.filename))
37
38 def filter_data(self, qs):
39 return (
40 qs.filter(store__translation_project__project__code=self.project_code)
41 .filter(store__tp_path__startswith=self.tp_path))
42
43 @property
44 def context_name(self):
45 return "/projects%s" % self.project_path
46
47
48 class ProjectSetDataTool(RelatedTPsDataTool):
49 group_by = ("tp__project__code", )
50 cache_key_name = "projects"
51
52 def get_root_child_path(self, child):
53 return child[self.group_by[0]]
54
55 @property
56 def context_name(self):
57 return "ALL"
58
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pootle/apps/pootle_data/project_data.py b/pootle/apps/pootle_data/project_data.py
--- a/pootle/apps/pootle_data/project_data.py
+++ b/pootle/apps/pootle_data/project_data.py
@@ -6,6 +6,8 @@
# or later license. See the LICENSE file for a copy of the license and the
# AUTHORS file for copyright and authorship information.
+from pootle.core.delegate import revision
+
from .utils import RelatedStoresDataTool, RelatedTPsDataTool
@@ -17,6 +19,11 @@
def filter_data(self, qs):
return qs.filter(tp__project=self.context)
+ @property
+ def rev_cache_key(self):
+ return revision.get(
+ self.context.__class__)(self.context.directory).get(key="stats")
+
class ProjectResourceDataTool(RelatedStoresDataTool):
group_by = ("store__translation_project__language__code", )
| {"golden_diff": "diff --git a/pootle/apps/pootle_data/project_data.py b/pootle/apps/pootle_data/project_data.py\n--- a/pootle/apps/pootle_data/project_data.py\n+++ b/pootle/apps/pootle_data/project_data.py\n@@ -6,6 +6,8 @@\n # or later license. See the LICENSE file for a copy of the license and the\n # AUTHORS file for copyright and authorship information.\n \n+from pootle.core.delegate import revision\n+\n from .utils import RelatedStoresDataTool, RelatedTPsDataTool\n \n \n@@ -17,6 +19,11 @@\n def filter_data(self, qs):\n return qs.filter(tp__project=self.context)\n \n+ @property\n+ def rev_cache_key(self):\n+ return revision.get(\n+ self.context.__class__)(self.context.directory).get(key=\"stats\")\n+\n \n class ProjectResourceDataTool(RelatedStoresDataTool):\n group_by = (\"store__translation_project__language__code\", )\n", "issue": "update_stores' last updated date doesn't tickle up to project overview/language list\nWhen updating against templates, the /projects/projectname/ listing doesn't reflect the **last update**, **unless** the update affected a file in the **toplevel** dir.\r\n\r\nWithin a language overview (/lang/projectname), changes deep in a directory hierarchy will also affect the parent directory's last-change date.\r\n\r\nusing pootle 2.8.0b5 (TDF)\r\n\r\nscreenshots to clarify. overview lists last update as e.g. 3 weeks ago:\r\n\r\n\r\ndrilling down to the language reveals that the files in xmlsecurity actually had been updated only 8 hours ago (in fact xmlsecurity/uiconfig/ui.po)\r\n\r\n\r\n(also sorting by the last updated is not working properly, goes from 10months to 8 hours, to 3weeks\u2026) \n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nfrom .utils import RelatedStoresDataTool, RelatedTPsDataTool\n\n\nclass ProjectDataTool(RelatedTPsDataTool):\n \"\"\"Retrieves aggregate stats for a Project\"\"\"\n\n cache_key_name = \"project\"\n\n def filter_data(self, qs):\n return qs.filter(tp__project=self.context)\n\n\nclass ProjectResourceDataTool(RelatedStoresDataTool):\n group_by = (\"store__translation_project__language__code\", )\n cache_key_name = \"project_resource\"\n\n @property\n def project_path(self):\n return (\n \"/%s%s\"\n % (self.project_code, self.tp_path))\n\n @property\n def tp_path(self):\n return (\n \"/%s%s\"\n % (self.dir_path,\n self.filename))\n\n def filter_data(self, qs):\n return (\n qs.filter(store__translation_project__project__code=self.project_code)\n .filter(store__tp_path__startswith=self.tp_path))\n\n @property\n def context_name(self):\n return \"/projects%s\" % self.project_path\n\n\nclass ProjectSetDataTool(RelatedTPsDataTool):\n group_by = (\"tp__project__code\", )\n cache_key_name = \"projects\"\n\n def get_root_child_path(self, child):\n return child[self.group_by[0]]\n\n @property\n def context_name(self):\n return \"ALL\"\n", "path": "pootle/apps/pootle_data/project_data.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nfrom pootle.core.delegate import revision\n\nfrom .utils import RelatedStoresDataTool, RelatedTPsDataTool\n\n\nclass ProjectDataTool(RelatedTPsDataTool):\n \"\"\"Retrieves aggregate stats for a Project\"\"\"\n\n cache_key_name = \"project\"\n\n def filter_data(self, qs):\n return qs.filter(tp__project=self.context)\n\n @property\n def rev_cache_key(self):\n return revision.get(\n self.context.__class__)(self.context.directory).get(key=\"stats\")\n\n\nclass ProjectResourceDataTool(RelatedStoresDataTool):\n group_by = (\"store__translation_project__language__code\", )\n cache_key_name = \"project_resource\"\n\n @property\n def project_path(self):\n return (\n \"/%s%s\"\n % (self.project_code, self.tp_path))\n\n @property\n def tp_path(self):\n return (\n \"/%s%s\"\n % (self.dir_path,\n self.filename))\n\n def filter_data(self, qs):\n return (\n qs.filter(store__translation_project__project__code=self.project_code)\n .filter(store__tp_path__startswith=self.tp_path))\n\n @property\n def context_name(self):\n return \"/projects%s\" % self.project_path\n\n\nclass ProjectSetDataTool(RelatedTPsDataTool):\n group_by = (\"tp__project__code\", )\n cache_key_name = \"projects\"\n\n def get_root_child_path(self, child):\n return child[self.group_by[0]]\n\n @property\n def context_name(self):\n return \"ALL\"\n", "path": "pootle/apps/pootle_data/project_data.py"}]} | 1,086 | 219 |
gh_patches_debug_38407 | rasdani/github-patches | git_diff | wagtail__wagtail-556 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Search: Make update_index update all backends
Currently, it only updates the default backend. It should update all search backends.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wagtail/wagtailsearch/management/commands/update_index.py`
Content:
```
1 from django.core.management.base import BaseCommand
2 from django.db import models
3
4 from wagtail.wagtailsearch import Indexed, get_search_backend
5
6
7 class Command(BaseCommand):
8 def handle(self, **options):
9 # Print info
10 self.stdout.write("Getting object list")
11
12 # Get list of indexed models
13 indexed_models = [model for model in models.get_models() if issubclass(model, Indexed)]
14
15 # Object set
16 object_set = {}
17
18 # Add all objects to object set and detect any duplicates
19 # Duplicates are caused when both a model and a derived model are indexed
20 # Eg, if BlogPost inherits from Page and both of these models are indexed
21 # If we were to add all objects from both models into the index, all the BlogPosts will have two entries
22 for model in indexed_models:
23 # Get toplevel content type
24 toplevel_content_type = model.indexed_get_toplevel_content_type()
25
26 # Loop through objects
27 for obj in model.get_indexed_objects():
28 # Get key for this object
29 key = toplevel_content_type + ':' + str(obj.pk)
30
31 # Check if this key already exists
32 if key in object_set:
33 # Conflict, work out who should get this space
34 # The object with the longest content type string gets the space
35 # Eg, "wagtailcore.Page-myapp.BlogPost" kicks out "wagtailcore.Page"
36 if len(obj.indexed_get_content_type()) > len(object_set[key].indexed_get_content_type()):
37 # Take the spot
38 object_set[key] = obj
39 else:
40 # Space free, take it
41 object_set[key] = obj
42
43 # Search backend
44 if 'backend' in options:
45 s = options['backend']
46 else:
47 s = get_search_backend()
48
49 # Reset the index
50 self.stdout.write("Reseting index")
51 s.reset_index()
52
53 # Add types
54 self.stdout.write("Adding types")
55 for model in indexed_models:
56 s.add_type(model)
57
58 # Add objects to index
59 self.stdout.write("Adding objects")
60 for result in s.add_bulk(object_set.values()):
61 self.stdout.write(result[0] + ' ' + str(result[1]))
62
63 # Refresh index
64 self.stdout.write("Refreshing index")
65 s.refresh_index()
66
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/wagtail/wagtailsearch/management/commands/update_index.py b/wagtail/wagtailsearch/management/commands/update_index.py
--- a/wagtail/wagtailsearch/management/commands/update_index.py
+++ b/wagtail/wagtailsearch/management/commands/update_index.py
@@ -1,11 +1,22 @@
+from optparse import make_option
+
from django.core.management.base import BaseCommand
from django.db import models
+from django.conf import settings
from wagtail.wagtailsearch import Indexed, get_search_backend
+def get_search_backends():
+ if hasattr(settings, 'WAGTAILSEARCH_BACKENDS'):
+ for backend in settings.WAGTAILSEARCH_BACKENDS.keys():
+ yield backend, get_search_backend(backend)
+ else:
+ yield 'default', get_search_backend('default')
+
+
class Command(BaseCommand):
- def handle(self, **options):
+ def get_object_list(self):
# Print info
self.stdout.write("Getting object list")
@@ -40,26 +51,51 @@
# Space free, take it
object_set[key] = obj
- # Search backend
- if 'backend' in options:
- s = options['backend']
- else:
- s = get_search_backend()
+ return indexed_models, object_set.values()
+
+ def update_backend(self, backend, models, object_list, backend_name=''):
+ # Print info
+ self.stdout.write("Updating backend: " + backend_name)
+
+ # Get backend
+ if backend is None:
+ backend = get_search_backend(backend_name)
# Reset the index
- self.stdout.write("Reseting index")
- s.reset_index()
+ self.stdout.write(backend_name + ": Reseting index")
+ backend.reset_index()
# Add types
- self.stdout.write("Adding types")
- for model in indexed_models:
- s.add_type(model)
+ self.stdout.write(backend_name + ": Adding types")
+ for model in models:
+ backend.add_type(model)
# Add objects to index
- self.stdout.write("Adding objects")
- for result in s.add_bulk(object_set.values()):
+ self.stdout.write(backend_name + ": Adding objects")
+ for result in backend.add_bulk(object_list):
self.stdout.write(result[0] + ' ' + str(result[1]))
# Refresh index
- self.stdout.write("Refreshing index")
- s.refresh_index()
+ self.stdout.write(backend_name + ": Refreshing index")
+ backend.refresh_index()
+
+ option_list = BaseCommand.option_list + (
+ make_option('--backend',
+ action='store',
+ dest='backend_name',
+ default=False,
+ help="Specify a backend to update",
+ ),
+ )
+
+ def handle(self, **options):
+ # Get object list
+ models, object_list = self.get_object_list()
+
+ # Update backends
+ if 'backend_name' in options:
+ backend = dict(get_search_backends())[options['backend_name']]
+ self.update_backend(backend, models, object_list, backend_name=options['backend_name'])
+ else:
+ for backend_name, backend in get_search_backends():
+ self.update_backend(backend, models, object_list, backend_name=backend_name)
| {"golden_diff": "diff --git a/wagtail/wagtailsearch/management/commands/update_index.py b/wagtail/wagtailsearch/management/commands/update_index.py\n--- a/wagtail/wagtailsearch/management/commands/update_index.py\n+++ b/wagtail/wagtailsearch/management/commands/update_index.py\n@@ -1,11 +1,22 @@\n+from optparse import make_option\n+\n from django.core.management.base import BaseCommand\n from django.db import models\n+from django.conf import settings\n \n from wagtail.wagtailsearch import Indexed, get_search_backend\n \n \n+def get_search_backends():\n+ if hasattr(settings, 'WAGTAILSEARCH_BACKENDS'):\n+ for backend in settings.WAGTAILSEARCH_BACKENDS.keys():\n+ yield backend, get_search_backend(backend)\n+ else:\n+ yield 'default', get_search_backend('default')\n+\n+\n class Command(BaseCommand):\n- def handle(self, **options):\n+ def get_object_list(self):\n # Print info\n self.stdout.write(\"Getting object list\")\n \n@@ -40,26 +51,51 @@\n # Space free, take it\n object_set[key] = obj\n \n- # Search backend\n- if 'backend' in options:\n- s = options['backend']\n- else:\n- s = get_search_backend()\n+ return indexed_models, object_set.values()\n+\n+ def update_backend(self, backend, models, object_list, backend_name=''):\n+ # Print info\n+ self.stdout.write(\"Updating backend: \" + backend_name)\n+\n+ # Get backend\n+ if backend is None:\n+ backend = get_search_backend(backend_name)\n \n # Reset the index\n- self.stdout.write(\"Reseting index\")\n- s.reset_index()\n+ self.stdout.write(backend_name + \": Reseting index\")\n+ backend.reset_index()\n \n # Add types\n- self.stdout.write(\"Adding types\")\n- for model in indexed_models:\n- s.add_type(model)\n+ self.stdout.write(backend_name + \": Adding types\")\n+ for model in models:\n+ backend.add_type(model)\n \n # Add objects to index\n- self.stdout.write(\"Adding objects\")\n- for result in s.add_bulk(object_set.values()):\n+ self.stdout.write(backend_name + \": Adding objects\")\n+ for result in backend.add_bulk(object_list):\n self.stdout.write(result[0] + ' ' + str(result[1]))\n \n # Refresh index\n- self.stdout.write(\"Refreshing index\")\n- s.refresh_index()\n+ self.stdout.write(backend_name + \": Refreshing index\")\n+ backend.refresh_index()\n+\n+ option_list = BaseCommand.option_list + (\n+ make_option('--backend',\n+ action='store',\n+ dest='backend_name',\n+ default=False,\n+ help=\"Specify a backend to update\",\n+ ),\n+ )\n+\n+ def handle(self, **options):\n+ # Get object list\n+ models, object_list = self.get_object_list()\n+\n+ # Update backends\n+ if 'backend_name' in options:\n+ backend = dict(get_search_backends())[options['backend_name']]\n+ self.update_backend(backend, models, object_list, backend_name=options['backend_name'])\n+ else:\n+ for backend_name, backend in get_search_backends():\n+ self.update_backend(backend, models, object_list, backend_name=backend_name)\n", "issue": "Search: Make update_index update all backends\nCurrently, it only updates the default backend. It should update all search backends.\n\n", "before_files": [{"content": "from django.core.management.base import BaseCommand\nfrom django.db import models\n\nfrom wagtail.wagtailsearch import Indexed, get_search_backend\n\n\nclass Command(BaseCommand):\n def handle(self, **options):\n # Print info\n self.stdout.write(\"Getting object list\")\n\n # Get list of indexed models\n indexed_models = [model for model in models.get_models() if issubclass(model, Indexed)]\n\n # Object set\n object_set = {}\n\n # Add all objects to object set and detect any duplicates\n # Duplicates are caused when both a model and a derived model are indexed\n # Eg, if BlogPost inherits from Page and both of these models are indexed\n # If we were to add all objects from both models into the index, all the BlogPosts will have two entries\n for model in indexed_models:\n # Get toplevel content type\n toplevel_content_type = model.indexed_get_toplevel_content_type()\n\n # Loop through objects\n for obj in model.get_indexed_objects():\n # Get key for this object\n key = toplevel_content_type + ':' + str(obj.pk)\n\n # Check if this key already exists\n if key in object_set:\n # Conflict, work out who should get this space\n # The object with the longest content type string gets the space\n # Eg, \"wagtailcore.Page-myapp.BlogPost\" kicks out \"wagtailcore.Page\"\n if len(obj.indexed_get_content_type()) > len(object_set[key].indexed_get_content_type()):\n # Take the spot\n object_set[key] = obj\n else:\n # Space free, take it\n object_set[key] = obj\n\n # Search backend\n if 'backend' in options:\n s = options['backend']\n else:\n s = get_search_backend()\n\n # Reset the index\n self.stdout.write(\"Reseting index\")\n s.reset_index()\n\n # Add types\n self.stdout.write(\"Adding types\")\n for model in indexed_models:\n s.add_type(model)\n\n # Add objects to index\n self.stdout.write(\"Adding objects\")\n for result in s.add_bulk(object_set.values()):\n self.stdout.write(result[0] + ' ' + str(result[1]))\n\n # Refresh index\n self.stdout.write(\"Refreshing index\")\n s.refresh_index()\n", "path": "wagtail/wagtailsearch/management/commands/update_index.py"}], "after_files": [{"content": "from optparse import make_option\n\nfrom django.core.management.base import BaseCommand\nfrom django.db import models\nfrom django.conf import settings\n\nfrom wagtail.wagtailsearch import Indexed, get_search_backend\n\n\ndef get_search_backends():\n if hasattr(settings, 'WAGTAILSEARCH_BACKENDS'):\n for backend in settings.WAGTAILSEARCH_BACKENDS.keys():\n yield backend, get_search_backend(backend)\n else:\n yield 'default', get_search_backend('default')\n\n\nclass Command(BaseCommand):\n def get_object_list(self):\n # Print info\n self.stdout.write(\"Getting object list\")\n\n # Get list of indexed models\n indexed_models = [model for model in models.get_models() if issubclass(model, Indexed)]\n\n # Object set\n object_set = {}\n\n # Add all objects to object set and detect any duplicates\n # Duplicates are caused when both a model and a derived model are indexed\n # Eg, if BlogPost inherits from Page and both of these models are indexed\n # If we were to add all objects from both models into the index, all the BlogPosts will have two entries\n for model in indexed_models:\n # Get toplevel content type\n toplevel_content_type = model.indexed_get_toplevel_content_type()\n\n # Loop through objects\n for obj in model.get_indexed_objects():\n # Get key for this object\n key = toplevel_content_type + ':' + str(obj.pk)\n\n # Check if this key already exists\n if key in object_set:\n # Conflict, work out who should get this space\n # The object with the longest content type string gets the space\n # Eg, \"wagtailcore.Page-myapp.BlogPost\" kicks out \"wagtailcore.Page\"\n if len(obj.indexed_get_content_type()) > len(object_set[key].indexed_get_content_type()):\n # Take the spot\n object_set[key] = obj\n else:\n # Space free, take it\n object_set[key] = obj\n\n return indexed_models, object_set.values()\n\n def update_backend(self, backend, models, object_list, backend_name=''):\n # Print info\n self.stdout.write(\"Updating backend: \" + backend_name)\n\n # Get backend\n if backend is None:\n backend = get_search_backend(backend_name)\n\n # Reset the index\n self.stdout.write(backend_name + \": Reseting index\")\n backend.reset_index()\n\n # Add types\n self.stdout.write(backend_name + \": Adding types\")\n for model in models:\n backend.add_type(model)\n\n # Add objects to index\n self.stdout.write(backend_name + \": Adding objects\")\n for result in backend.add_bulk(object_list):\n self.stdout.write(result[0] + ' ' + str(result[1]))\n\n # Refresh index\n self.stdout.write(backend_name + \": Refreshing index\")\n backend.refresh_index()\n\n option_list = BaseCommand.option_list + (\n make_option('--backend',\n action='store',\n dest='backend_name',\n default=False,\n help=\"Specify a backend to update\",\n ),\n )\n\n def handle(self, **options):\n # Get object list\n models, object_list = self.get_object_list()\n\n # Update backends\n if 'backend_name' in options:\n backend = dict(get_search_backends())[options['backend_name']]\n self.update_backend(backend, models, object_list, backend_name=options['backend_name'])\n else:\n for backend_name, backend in get_search_backends():\n self.update_backend(backend, models, object_list, backend_name=backend_name)\n", "path": "wagtail/wagtailsearch/management/commands/update_index.py"}]} | 924 | 747 |
gh_patches_debug_12742 | rasdani/github-patches | git_diff | ocadotechnology__codeforlife-portal-782 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Speak to legal team about updating our T&Cs for GDPR
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `portal/admin.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # Code for Life
3 #
4 # Copyright (C) 2018, Ocado Innovation Limited
5 #
6 # This program is free software: you can redistribute it and/or modify
7 # it under the terms of the GNU Affero General Public License as
8 # published by the Free Software Foundation, either version 3 of the
9 # License, or (at your option) any later version.
10 #
11 # This program is distributed in the hope that it will be useful,
12 # but WITHOUT ANY WARRANTY; without even the implied warranty of
13 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 # GNU Affero General Public License for more details.
15 #
16 # You should have received a copy of the GNU Affero General Public License
17 # along with this program. If not, see <http://www.gnu.org/licenses/>.
18 #
19 # ADDITIONAL TERMS – Section 7 GNU General Public Licence
20 #
21 # This licence does not grant any right, title or interest in any “Ocado” logos,
22 # trade names or the trademark “Ocado” or any other trademarks or domain names
23 # owned by Ocado Innovation Limited or the Ocado group of companies or any other
24 # distinctive brand features of “Ocado” as may be secured from time to time. You
25 # must not distribute any modification of this program using the trademark
26 # “Ocado” or claim any affiliation or association with Ocado or its employees.
27 #
28 # You are not authorised to use the name Ocado (or any of its trade names) or
29 # the names of any author or contributor in advertising or for publicity purposes
30 # pertaining to the distribution of this program, without the prior written
31 # authorisation of Ocado.
32 #
33 # Any propagation, distribution or conveyance of this program must include this
34 # copyright notice and these terms. You must not misrepresent the origins of this
35 # program; modified versions of the program must be marked as such and not
36 # identified as the original program.
37 from django.contrib import admin
38 from django.contrib.auth.models import User
39 from django.contrib.auth.admin import UserAdmin
40
41
42 from portal.models import Class, Student, Guardian, Teacher, School, UserProfile, FrontPageNews, EmailVerification
43
44
45 class ClassAdmin(admin.ModelAdmin):
46 search_fields = ['name', 'teacher__new_user__first_name', 'teacher__new_user__last_name']
47 list_filter = ['teacher']
48 readonly_fields = ['teacher']
49
50
51 class SchoolAdmin(admin.ModelAdmin):
52 search_fields = ['name', 'country', 'postcode', 'town']
53 list_filter = ['postcode', 'country']
54
55
56 class StudentAdmin(admin.ModelAdmin):
57 search_fields = ['new_user__first_name', 'new_user__last_name']
58 list_filter = ['class_field', 'class_field__teacher']
59 readonly_fields = ['user', 'new_user']
60 raw_id_fields = ['class_field', 'pending_class_request']
61
62
63 class TeacherAdmin(admin.ModelAdmin):
64 search_fields = ['new_user__first_name', 'new_user__last_name']
65 list_filter = ['school']
66 readonly_fields = ['user', 'new_user']
67 raw_id_fields = ['school', 'pending_join_request']
68
69
70 class UserProfileAdmin(admin.ModelAdmin):
71 search_fields = ['user__first_name', 'user__last_name', 'new_username', 'user__date_joined']
72 list_filter = ['user__date_joined']
73 list_display = ['user', 'joined_recently']
74 readonly_fields = ['user']
75
76
77 class EmailVerificationAdmin(admin.ModelAdmin):
78 search_fields = ['new_user']
79
80
81 UserAdmin.list_display += ('date_joined',)
82 UserAdmin.list_filter += ('date_joined',)
83
84
85 admin.site.register(Class, ClassAdmin)
86 admin.site.register(Student, StudentAdmin)
87 admin.site.register(Guardian)
88 admin.site.register(Teacher, TeacherAdmin)
89 admin.site.register(School, SchoolAdmin)
90 admin.site.unregister(User)
91 admin.site.register(User, UserAdmin)
92 admin.site.register(UserProfile, UserProfileAdmin)
93 admin.site.register(FrontPageNews)
94 admin.site.register(EmailVerification, EmailVerificationAdmin)
95
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/portal/admin.py b/portal/admin.py
--- a/portal/admin.py
+++ b/portal/admin.py
@@ -68,14 +68,14 @@
class UserProfileAdmin(admin.ModelAdmin):
- search_fields = ['user__first_name', 'user__last_name', 'new_username', 'user__date_joined']
+ search_fields = ['user__first_name', 'user__last_name', 'user__username', 'user__date_joined']
list_filter = ['user__date_joined']
list_display = ['user', 'joined_recently']
readonly_fields = ['user']
class EmailVerificationAdmin(admin.ModelAdmin):
- search_fields = ['new_user']
+ search_fields = ['user__first_name', 'user__last_name', 'user__username', 'user__date_joined']
UserAdmin.list_display += ('date_joined',)
| {"golden_diff": "diff --git a/portal/admin.py b/portal/admin.py\n--- a/portal/admin.py\n+++ b/portal/admin.py\n@@ -68,14 +68,14 @@\n \n \n class UserProfileAdmin(admin.ModelAdmin):\n- search_fields = ['user__first_name', 'user__last_name', 'new_username', 'user__date_joined']\n+ search_fields = ['user__first_name', 'user__last_name', 'user__username', 'user__date_joined']\n list_filter = ['user__date_joined']\n list_display = ['user', 'joined_recently']\n readonly_fields = ['user']\n \n \n class EmailVerificationAdmin(admin.ModelAdmin):\n- search_fields = ['new_user']\n+ search_fields = ['user__first_name', 'user__last_name', 'user__username', 'user__date_joined']\n \n \n UserAdmin.list_display += ('date_joined',)\n", "issue": "Speak to legal team about updating our T&Cs for GDPR\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# Code for Life\n#\n# Copyright (C) 2018, Ocado Innovation Limited\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Affero General Public License as\n# published by the Free Software Foundation, either version 3 of the\n# License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Affero General Public License for more details.\n#\n# You should have received a copy of the GNU Affero General Public License\n# along with this program. If not, see <http://www.gnu.org/licenses/>.\n#\n# ADDITIONAL TERMS \u2013 Section 7 GNU General Public Licence\n#\n# This licence does not grant any right, title or interest in any \u201cOcado\u201d logos,\n# trade names or the trademark \u201cOcado\u201d or any other trademarks or domain names\n# owned by Ocado Innovation Limited or the Ocado group of companies or any other\n# distinctive brand features of \u201cOcado\u201d as may be secured from time to time. You\n# must not distribute any modification of this program using the trademark\n# \u201cOcado\u201d or claim any affiliation or association with Ocado or its employees.\n#\n# You are not authorised to use the name Ocado (or any of its trade names) or\n# the names of any author or contributor in advertising or for publicity purposes\n# pertaining to the distribution of this program, without the prior written\n# authorisation of Ocado.\n#\n# Any propagation, distribution or conveyance of this program must include this\n# copyright notice and these terms. You must not misrepresent the origins of this\n# program; modified versions of the program must be marked as such and not\n# identified as the original program.\nfrom django.contrib import admin\nfrom django.contrib.auth.models import User\nfrom django.contrib.auth.admin import UserAdmin\n\n\nfrom portal.models import Class, Student, Guardian, Teacher, School, UserProfile, FrontPageNews, EmailVerification\n\n\nclass ClassAdmin(admin.ModelAdmin):\n search_fields = ['name', 'teacher__new_user__first_name', 'teacher__new_user__last_name']\n list_filter = ['teacher']\n readonly_fields = ['teacher']\n\n\nclass SchoolAdmin(admin.ModelAdmin):\n search_fields = ['name', 'country', 'postcode', 'town']\n list_filter = ['postcode', 'country']\n\n\nclass StudentAdmin(admin.ModelAdmin):\n search_fields = ['new_user__first_name', 'new_user__last_name']\n list_filter = ['class_field', 'class_field__teacher']\n readonly_fields = ['user', 'new_user']\n raw_id_fields = ['class_field', 'pending_class_request']\n\n\nclass TeacherAdmin(admin.ModelAdmin):\n search_fields = ['new_user__first_name', 'new_user__last_name']\n list_filter = ['school']\n readonly_fields = ['user', 'new_user']\n raw_id_fields = ['school', 'pending_join_request']\n\n\nclass UserProfileAdmin(admin.ModelAdmin):\n search_fields = ['user__first_name', 'user__last_name', 'new_username', 'user__date_joined']\n list_filter = ['user__date_joined']\n list_display = ['user', 'joined_recently']\n readonly_fields = ['user']\n\n\nclass EmailVerificationAdmin(admin.ModelAdmin):\n search_fields = ['new_user']\n\n\nUserAdmin.list_display += ('date_joined',)\nUserAdmin.list_filter += ('date_joined',)\n\n\nadmin.site.register(Class, ClassAdmin)\nadmin.site.register(Student, StudentAdmin)\nadmin.site.register(Guardian)\nadmin.site.register(Teacher, TeacherAdmin)\nadmin.site.register(School, SchoolAdmin)\nadmin.site.unregister(User)\nadmin.site.register(User, UserAdmin)\nadmin.site.register(UserProfile, UserProfileAdmin)\nadmin.site.register(FrontPageNews)\nadmin.site.register(EmailVerification, EmailVerificationAdmin)\n", "path": "portal/admin.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# Code for Life\n#\n# Copyright (C) 2018, Ocado Innovation Limited\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Affero General Public License as\n# published by the Free Software Foundation, either version 3 of the\n# License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Affero General Public License for more details.\n#\n# You should have received a copy of the GNU Affero General Public License\n# along with this program. If not, see <http://www.gnu.org/licenses/>.\n#\n# ADDITIONAL TERMS \u2013 Section 7 GNU General Public Licence\n#\n# This licence does not grant any right, title or interest in any \u201cOcado\u201d logos,\n# trade names or the trademark \u201cOcado\u201d or any other trademarks or domain names\n# owned by Ocado Innovation Limited or the Ocado group of companies or any other\n# distinctive brand features of \u201cOcado\u201d as may be secured from time to time. You\n# must not distribute any modification of this program using the trademark\n# \u201cOcado\u201d or claim any affiliation or association with Ocado or its employees.\n#\n# You are not authorised to use the name Ocado (or any of its trade names) or\n# the names of any author or contributor in advertising or for publicity purposes\n# pertaining to the distribution of this program, without the prior written\n# authorisation of Ocado.\n#\n# Any propagation, distribution or conveyance of this program must include this\n# copyright notice and these terms. You must not misrepresent the origins of this\n# program; modified versions of the program must be marked as such and not\n# identified as the original program.\nfrom django.contrib import admin\nfrom django.contrib.auth.models import User\nfrom django.contrib.auth.admin import UserAdmin\n\n\nfrom portal.models import Class, Student, Guardian, Teacher, School, UserProfile, FrontPageNews, EmailVerification\n\n\nclass ClassAdmin(admin.ModelAdmin):\n search_fields = ['name', 'teacher__new_user__first_name', 'teacher__new_user__last_name']\n list_filter = ['teacher']\n readonly_fields = ['teacher']\n\n\nclass SchoolAdmin(admin.ModelAdmin):\n search_fields = ['name', 'country', 'postcode', 'town']\n list_filter = ['postcode', 'country']\n\n\nclass StudentAdmin(admin.ModelAdmin):\n search_fields = ['new_user__first_name', 'new_user__last_name']\n list_filter = ['class_field', 'class_field__teacher']\n readonly_fields = ['user', 'new_user']\n raw_id_fields = ['class_field', 'pending_class_request']\n\n\nclass TeacherAdmin(admin.ModelAdmin):\n search_fields = ['new_user__first_name', 'new_user__last_name']\n list_filter = ['school']\n readonly_fields = ['user', 'new_user']\n raw_id_fields = ['school', 'pending_join_request']\n\n\nclass UserProfileAdmin(admin.ModelAdmin):\n search_fields = ['user__first_name', 'user__last_name', 'user__username', 'user__date_joined']\n list_filter = ['user__date_joined']\n list_display = ['user', 'joined_recently']\n readonly_fields = ['user']\n\n\nclass EmailVerificationAdmin(admin.ModelAdmin):\n search_fields = ['user__first_name', 'user__last_name', 'user__username', 'user__date_joined']\n\n\nUserAdmin.list_display += ('date_joined',)\nUserAdmin.list_filter += ('date_joined',)\n\n\nadmin.site.register(Class, ClassAdmin)\nadmin.site.register(Student, StudentAdmin)\nadmin.site.register(Guardian)\nadmin.site.register(Teacher, TeacherAdmin)\nadmin.site.register(School, SchoolAdmin)\nadmin.site.unregister(User)\nadmin.site.register(User, UserAdmin)\nadmin.site.register(UserProfile, UserProfileAdmin)\nadmin.site.register(FrontPageNews)\nadmin.site.register(EmailVerification, EmailVerificationAdmin)\n", "path": "portal/admin.py"}]} | 1,304 | 193 |
gh_patches_debug_22097 | rasdani/github-patches | git_diff | svthalia__concrexit-2199 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add filter/display of members-only value to document admin
### Is your feature request related to a problem? Please describe.
It is not really issue to see which documents are marked as members only. And it is impossible to easily get a list with documents that have a true/false value.
### Describe the solution you'd like
I'd like to see more information about the documents in the admin page so that I do not have to open the detail page.
### Motivation
Easier to manage these files.
### Describe alternatives you've considered
The only alternative is not doing this.
### Additional context
#2084 could have been prevented.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `website/documents/admin.py`
Content:
```
1 """Registers admin interfaces for the documents module."""
2 from django.contrib import admin
3 from django.contrib.admin import ModelAdmin
4 from django.utils.translation import gettext_lazy as _
5
6 from documents import forms
7 from documents.models import (
8 AnnualDocument,
9 AssociationDocument,
10 EventDocument,
11 GeneralMeeting,
12 Minutes,
13 MiscellaneousDocument,
14 )
15 from documents.services import is_owner
16
17
18 class MinutesInline(admin.StackedInline):
19 """Inline for minutes of a general meeting."""
20
21 model = Minutes
22 form = forms.MinutesForm
23 extra = 0
24
25
26 @admin.register(GeneralMeeting)
27 class GeneralMeetingAdmin(ModelAdmin):
28 """Manage the general meetings."""
29
30 form = forms.GeneralMeetingForm
31 inlines = [
32 MinutesInline,
33 ]
34 list_filter = ("datetime",)
35
36
37 class LectureYearFilter(admin.SimpleListFilter):
38 """Filter the memberships on those started or ended in a lecture year."""
39
40 title = _("lecture year")
41 parameter_name = "lecture_year"
42
43 def lookups(self, request, model_admin):
44 if AnnualDocument.objects.count() > 0:
45 first_year = AnnualDocument.objects.order_by("year").first().year
46 last_year = AnnualDocument.objects.order_by("year").last().year
47
48 return [
49 (year, f"{year}-{year + 1}")
50 for year in range(last_year, first_year - 1, -1)
51 ]
52 return []
53
54 def queryset(self, request, queryset):
55 if not self.value():
56 return queryset
57
58 year = int(self.value())
59
60 return queryset.filter(year=year)
61
62
63 @admin.register(AnnualDocument)
64 class AnnualDocumentAdmin(ModelAdmin):
65 """Manage the annual documents."""
66
67 form = forms.AnnualDocumentForm
68 list_filter = (
69 LectureYearFilter,
70 "created",
71 "last_updated",
72 )
73
74
75 @admin.register(AssociationDocument)
76 class AssociationDocumentAdmin(ModelAdmin):
77 """Manage the association documents."""
78
79 form = forms.AssociationDocumentForm
80 list_filter = (
81 "created",
82 "last_updated",
83 )
84
85
86 @admin.register(EventDocument)
87 class EventDocumentAdmin(ModelAdmin):
88 """Manage the event documents."""
89
90 form = forms.EventDocumentForm
91 list_filter = (
92 "created",
93 "last_updated",
94 )
95
96 def has_change_permission(self, request, obj=None):
97 """Only allow access to the change form if the user is an owner."""
98 if obj is not None and not is_owner(request.member, obj):
99 return False
100 return super().has_change_permission(request, obj)
101
102 def has_delete_permission(self, request, obj=None):
103 """Only allow delete access if the user is an owner."""
104 if obj is not None and not is_owner(request.member, obj):
105 return False
106 return super().has_delete_permission(request, obj)
107
108
109 @admin.register(MiscellaneousDocument)
110 class MiscellaneousDocumentAdmin(ModelAdmin):
111 """Manage the miscellaneous documents."""
112
113 form = forms.MiscellaneousDocumentForm
114 list_filter = (
115 "created",
116 "last_updated",
117 )
118
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/website/documents/admin.py b/website/documents/admin.py
--- a/website/documents/admin.py
+++ b/website/documents/admin.py
@@ -69,6 +69,11 @@
LectureYearFilter,
"created",
"last_updated",
+ "members_only",
+ )
+ list_display = (
+ "__str__",
+ "members_only",
)
@@ -80,6 +85,11 @@
list_filter = (
"created",
"last_updated",
+ "members_only",
+ )
+ list_display = (
+ "__str__",
+ "members_only",
)
@@ -91,6 +101,11 @@
list_filter = (
"created",
"last_updated",
+ "members_only",
+ )
+ list_display = (
+ "__str__",
+ "members_only",
)
def has_change_permission(self, request, obj=None):
@@ -114,4 +129,9 @@
list_filter = (
"created",
"last_updated",
+ "members_only",
+ )
+ list_display = (
+ "__str__",
+ "members_only",
)
| {"golden_diff": "diff --git a/website/documents/admin.py b/website/documents/admin.py\n--- a/website/documents/admin.py\n+++ b/website/documents/admin.py\n@@ -69,6 +69,11 @@\n LectureYearFilter,\n \"created\",\n \"last_updated\",\n+ \"members_only\",\n+ )\n+ list_display = (\n+ \"__str__\",\n+ \"members_only\",\n )\n \n \n@@ -80,6 +85,11 @@\n list_filter = (\n \"created\",\n \"last_updated\",\n+ \"members_only\",\n+ )\n+ list_display = (\n+ \"__str__\",\n+ \"members_only\",\n )\n \n \n@@ -91,6 +101,11 @@\n list_filter = (\n \"created\",\n \"last_updated\",\n+ \"members_only\",\n+ )\n+ list_display = (\n+ \"__str__\",\n+ \"members_only\",\n )\n \n def has_change_permission(self, request, obj=None):\n@@ -114,4 +129,9 @@\n list_filter = (\n \"created\",\n \"last_updated\",\n+ \"members_only\",\n+ )\n+ list_display = (\n+ \"__str__\",\n+ \"members_only\",\n )\n", "issue": "Add filter/display of members-only value to document admin\n### Is your feature request related to a problem? Please describe.\r\nIt is not really issue to see which documents are marked as members only. And it is impossible to easily get a list with documents that have a true/false value.\r\n\r\n### Describe the solution you'd like\r\nI'd like to see more information about the documents in the admin page so that I do not have to open the detail page.\r\n\r\n### Motivation\r\nEasier to manage these files.\r\n\r\n### Describe alternatives you've considered\r\nThe only alternative is not doing this.\r\n\r\n### Additional context\r\n#2084 could have been prevented.\r\n\n", "before_files": [{"content": "\"\"\"Registers admin interfaces for the documents module.\"\"\"\nfrom django.contrib import admin\nfrom django.contrib.admin import ModelAdmin\nfrom django.utils.translation import gettext_lazy as _\n\nfrom documents import forms\nfrom documents.models import (\n AnnualDocument,\n AssociationDocument,\n EventDocument,\n GeneralMeeting,\n Minutes,\n MiscellaneousDocument,\n)\nfrom documents.services import is_owner\n\n\nclass MinutesInline(admin.StackedInline):\n \"\"\"Inline for minutes of a general meeting.\"\"\"\n\n model = Minutes\n form = forms.MinutesForm\n extra = 0\n\n\[email protected](GeneralMeeting)\nclass GeneralMeetingAdmin(ModelAdmin):\n \"\"\"Manage the general meetings.\"\"\"\n\n form = forms.GeneralMeetingForm\n inlines = [\n MinutesInline,\n ]\n list_filter = (\"datetime\",)\n\n\nclass LectureYearFilter(admin.SimpleListFilter):\n \"\"\"Filter the memberships on those started or ended in a lecture year.\"\"\"\n\n title = _(\"lecture year\")\n parameter_name = \"lecture_year\"\n\n def lookups(self, request, model_admin):\n if AnnualDocument.objects.count() > 0:\n first_year = AnnualDocument.objects.order_by(\"year\").first().year\n last_year = AnnualDocument.objects.order_by(\"year\").last().year\n\n return [\n (year, f\"{year}-{year + 1}\")\n for year in range(last_year, first_year - 1, -1)\n ]\n return []\n\n def queryset(self, request, queryset):\n if not self.value():\n return queryset\n\n year = int(self.value())\n\n return queryset.filter(year=year)\n\n\[email protected](AnnualDocument)\nclass AnnualDocumentAdmin(ModelAdmin):\n \"\"\"Manage the annual documents.\"\"\"\n\n form = forms.AnnualDocumentForm\n list_filter = (\n LectureYearFilter,\n \"created\",\n \"last_updated\",\n )\n\n\[email protected](AssociationDocument)\nclass AssociationDocumentAdmin(ModelAdmin):\n \"\"\"Manage the association documents.\"\"\"\n\n form = forms.AssociationDocumentForm\n list_filter = (\n \"created\",\n \"last_updated\",\n )\n\n\[email protected](EventDocument)\nclass EventDocumentAdmin(ModelAdmin):\n \"\"\"Manage the event documents.\"\"\"\n\n form = forms.EventDocumentForm\n list_filter = (\n \"created\",\n \"last_updated\",\n )\n\n def has_change_permission(self, request, obj=None):\n \"\"\"Only allow access to the change form if the user is an owner.\"\"\"\n if obj is not None and not is_owner(request.member, obj):\n return False\n return super().has_change_permission(request, obj)\n\n def has_delete_permission(self, request, obj=None):\n \"\"\"Only allow delete access if the user is an owner.\"\"\"\n if obj is not None and not is_owner(request.member, obj):\n return False\n return super().has_delete_permission(request, obj)\n\n\[email protected](MiscellaneousDocument)\nclass MiscellaneousDocumentAdmin(ModelAdmin):\n \"\"\"Manage the miscellaneous documents.\"\"\"\n\n form = forms.MiscellaneousDocumentForm\n list_filter = (\n \"created\",\n \"last_updated\",\n )\n", "path": "website/documents/admin.py"}], "after_files": [{"content": "\"\"\"Registers admin interfaces for the documents module.\"\"\"\nfrom django.contrib import admin\nfrom django.contrib.admin import ModelAdmin\nfrom django.utils.translation import gettext_lazy as _\n\nfrom documents import forms\nfrom documents.models import (\n AnnualDocument,\n AssociationDocument,\n EventDocument,\n GeneralMeeting,\n Minutes,\n MiscellaneousDocument,\n)\nfrom documents.services import is_owner\n\n\nclass MinutesInline(admin.StackedInline):\n \"\"\"Inline for minutes of a general meeting.\"\"\"\n\n model = Minutes\n form = forms.MinutesForm\n extra = 0\n\n\[email protected](GeneralMeeting)\nclass GeneralMeetingAdmin(ModelAdmin):\n \"\"\"Manage the general meetings.\"\"\"\n\n form = forms.GeneralMeetingForm\n inlines = [\n MinutesInline,\n ]\n list_filter = (\"datetime\",)\n\n\nclass LectureYearFilter(admin.SimpleListFilter):\n \"\"\"Filter the memberships on those started or ended in a lecture year.\"\"\"\n\n title = _(\"lecture year\")\n parameter_name = \"lecture_year\"\n\n def lookups(self, request, model_admin):\n if AnnualDocument.objects.count() > 0:\n first_year = AnnualDocument.objects.order_by(\"year\").first().year\n last_year = AnnualDocument.objects.order_by(\"year\").last().year\n\n return [\n (year, f\"{year}-{year + 1}\")\n for year in range(last_year, first_year - 1, -1)\n ]\n return []\n\n def queryset(self, request, queryset):\n if not self.value():\n return queryset\n\n year = int(self.value())\n\n return queryset.filter(year=year)\n\n\[email protected](AnnualDocument)\nclass AnnualDocumentAdmin(ModelAdmin):\n \"\"\"Manage the annual documents.\"\"\"\n\n form = forms.AnnualDocumentForm\n list_filter = (\n LectureYearFilter,\n \"created\",\n \"last_updated\",\n \"members_only\",\n )\n list_display = (\n \"__str__\",\n \"members_only\",\n )\n\n\[email protected](AssociationDocument)\nclass AssociationDocumentAdmin(ModelAdmin):\n \"\"\"Manage the association documents.\"\"\"\n\n form = forms.AssociationDocumentForm\n list_filter = (\n \"created\",\n \"last_updated\",\n \"members_only\",\n )\n list_display = (\n \"__str__\",\n \"members_only\",\n )\n\n\[email protected](EventDocument)\nclass EventDocumentAdmin(ModelAdmin):\n \"\"\"Manage the event documents.\"\"\"\n\n form = forms.EventDocumentForm\n list_filter = (\n \"created\",\n \"last_updated\",\n \"members_only\",\n )\n list_display = (\n \"__str__\",\n \"members_only\",\n )\n\n def has_change_permission(self, request, obj=None):\n \"\"\"Only allow access to the change form if the user is an owner.\"\"\"\n if obj is not None and not is_owner(request.member, obj):\n return False\n return super().has_change_permission(request, obj)\n\n def has_delete_permission(self, request, obj=None):\n \"\"\"Only allow delete access if the user is an owner.\"\"\"\n if obj is not None and not is_owner(request.member, obj):\n return False\n return super().has_delete_permission(request, obj)\n\n\[email protected](MiscellaneousDocument)\nclass MiscellaneousDocumentAdmin(ModelAdmin):\n \"\"\"Manage the miscellaneous documents.\"\"\"\n\n form = forms.MiscellaneousDocumentForm\n list_filter = (\n \"created\",\n \"last_updated\",\n \"members_only\",\n )\n list_display = (\n \"__str__\",\n \"members_only\",\n )\n", "path": "website/documents/admin.py"}]} | 1,294 | 273 |
gh_patches_debug_8038 | rasdani/github-patches | git_diff | microsoft__botbuilder-python-302 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
NumberPrompt doesn't accept retry value
## Version
v4.5
## Describe the bug
When you send an invalid number to a `NumberPrompt`, it sends out a retry prompt.
When attempting to send a 2nd response after being reprompted, you get a timeout error.
## To Reproduce
1. Create a `NumberPrompt` object
2. When it prompts you for a number, send in a non-numeric value (e.g. `"hello"`)
* this will trigger a retry prompt (e.g. `"You must enter a number."`)
3. Try sending in another value--no matter what type of value, you get a timeout error


## Expected behavior
To be able to send in a 2nd value when reprompted
## Additional context
```python
async def test_number_prompt_retry(self):
async def exec_test(turn_context: TurnContext) -> None:
dialog_context: DialogContext = await dialogs.create_context(turn_context)
results: DialogTurnResult = await dialog_context.continue_dialog()
if results.status == DialogTurnStatus.Empty:
options = PromptOptions(
prompt=Activity(type=ActivityTypes.message, text="Enter a number."),
retry_prompt=Activity(
type=ActivityTypes.message, text="You must enter a number."
),
)
await dialog_context.prompt("NumberPrompt", options)
elif results.status == DialogTurnStatus.Complete:
number_result = results.result
await turn_context.send_activity(
MessageFactory.text(f"Bot received the number '{number_result}'.")
)
await convo_state.save_changes(turn_context)
adapter = TestAdapter(exec_test)
convo_state = ConversationState(MemoryStorage())
dialog_state = convo_state.create_property("dialogState")
dialogs = DialogSet(dialog_state)
number_prompt = NumberPrompt(
dialog_id="NumberPrompt", validator=None, default_locale=Culture.English
)
dialogs.add(number_prompt)
step1 = await adapter.send("hello")
step2 = await step1.assert_reply("Enter a number.")
# TODO: something is breaking in the validators or retry prompt
# where it does not accept the 2nd answer after reprompting the user
# for another value
step3 = await step2.send("hello")
step4 = await step3.assert_reply("You must enter a number.")
step5 = await step4.send("64")
await step5.assert_reply("Bot received the number '64'.")
```
[bug]
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/number_prompt.py`
Content:
```
1 # Copyright (c) Microsoft Corporation. All rights reserved.
2 # Licensed under the MIT License.
3
4 from typing import Callable, Dict
5
6 from recognizers_number import recognize_number
7 from recognizers_text import Culture, ModelResult
8 from babel.numbers import parse_decimal
9
10 from botbuilder.core.turn_context import TurnContext
11 from botbuilder.schema import ActivityTypes
12
13 from .prompt import Prompt, PromptValidatorContext
14 from .prompt_options import PromptOptions
15 from .prompt_recognizer_result import PromptRecognizerResult
16
17
18 class NumberPrompt(Prompt):
19 # TODO: PromptValidator needs to be fixed
20 # Does not accept answer as intended (times out)
21 def __init__(
22 self,
23 dialog_id: str,
24 validator: Callable[[PromptValidatorContext], bool] = None,
25 default_locale: str = None,
26 ):
27 super(NumberPrompt, self).__init__(dialog_id, validator)
28 self.default_locale = default_locale
29
30 async def on_prompt(
31 self,
32 turn_context: TurnContext,
33 state: Dict[str, object],
34 options: PromptOptions,
35 is_retry: bool,
36 ):
37 if not turn_context:
38 raise TypeError("NumberPrompt.on_prompt(): turn_context cannot be None.")
39 if not options:
40 raise TypeError("NumberPrompt.on_prompt(): options cannot be None.")
41
42 if is_retry and options.retry_prompt is not None:
43 turn_context.send_activity(options.retry_prompt)
44 elif options.prompt is not None:
45 await turn_context.send_activity(options.prompt)
46
47 async def on_recognize(
48 self,
49 turn_context: TurnContext,
50 state: Dict[str, object],
51 options: PromptOptions,
52 ) -> PromptRecognizerResult:
53 if not turn_context:
54 raise TypeError("NumberPrompt.on_recognize(): turn_context cannot be None.")
55
56 result = PromptRecognizerResult()
57 if turn_context.activity.type == ActivityTypes.message:
58 message = turn_context.activity
59 culture = self._get_culture(turn_context)
60 results: [ModelResult] = recognize_number(message.text, culture)
61
62 if results:
63 result.succeeded = True
64 result.value = parse_decimal(
65 results[0].resolution["value"], locale=culture.replace("-", "_")
66 )
67
68 return result
69
70 def _get_culture(self, turn_context: TurnContext):
71 culture = (
72 turn_context.activity.locale
73 if turn_context.activity.locale
74 else self.default_locale
75 )
76
77 if not culture:
78 culture = Culture.English
79
80 return culture
81
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/number_prompt.py b/libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/number_prompt.py
--- a/libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/number_prompt.py
+++ b/libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/number_prompt.py
@@ -40,7 +40,7 @@
raise TypeError("NumberPrompt.on_prompt(): options cannot be None.")
if is_retry and options.retry_prompt is not None:
- turn_context.send_activity(options.retry_prompt)
+ await turn_context.send_activity(options.retry_prompt)
elif options.prompt is not None:
await turn_context.send_activity(options.prompt)
| {"golden_diff": "diff --git a/libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/number_prompt.py b/libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/number_prompt.py\n--- a/libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/number_prompt.py\n+++ b/libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/number_prompt.py\n@@ -40,7 +40,7 @@\n raise TypeError(\"NumberPrompt.on_prompt(): options cannot be None.\")\n \n if is_retry and options.retry_prompt is not None:\n- turn_context.send_activity(options.retry_prompt)\n+ await turn_context.send_activity(options.retry_prompt)\n elif options.prompt is not None:\n await turn_context.send_activity(options.prompt)\n", "issue": "NumberPrompt doesn't accept retry value\n## Version\r\nv4.5\r\n\r\n## Describe the bug\r\nWhen you send an invalid number to a `NumberPrompt`, it sends out a retry prompt.\r\nWhen attempting to send a 2nd response after being reprompted, you get a timeout error.\r\n\r\n\r\n\r\n## To Reproduce\r\n1. Create a `NumberPrompt` object\r\n2. When it prompts you for a number, send in a non-numeric value (e.g. `\"hello\"`)\r\n * this will trigger a retry prompt (e.g. `\"You must enter a number.\"`)\r\n3. Try sending in another value--no matter what type of value, you get a timeout error\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n## Expected behavior\r\nTo be able to send in a 2nd value when reprompted\r\n\r\n## Additional context\r\n```python\r\nasync def test_number_prompt_retry(self):\r\n async def exec_test(turn_context: TurnContext) -> None:\r\n dialog_context: DialogContext = await dialogs.create_context(turn_context)\r\n\r\n results: DialogTurnResult = await dialog_context.continue_dialog()\r\n\r\n if results.status == DialogTurnStatus.Empty:\r\n options = PromptOptions(\r\n prompt=Activity(type=ActivityTypes.message, text=\"Enter a number.\"),\r\n retry_prompt=Activity(\r\n type=ActivityTypes.message, text=\"You must enter a number.\"\r\n ),\r\n )\r\n await dialog_context.prompt(\"NumberPrompt\", options)\r\n elif results.status == DialogTurnStatus.Complete:\r\n number_result = results.result\r\n await turn_context.send_activity(\r\n MessageFactory.text(f\"Bot received the number '{number_result}'.\")\r\n )\r\n\r\n await convo_state.save_changes(turn_context)\r\n\r\n adapter = TestAdapter(exec_test)\r\n\r\n convo_state = ConversationState(MemoryStorage())\r\n dialog_state = convo_state.create_property(\"dialogState\")\r\n dialogs = DialogSet(dialog_state)\r\n number_prompt = NumberPrompt(\r\n dialog_id=\"NumberPrompt\", validator=None, default_locale=Culture.English\r\n )\r\n dialogs.add(number_prompt)\r\n\r\n step1 = await adapter.send(\"hello\")\r\n step2 = await step1.assert_reply(\"Enter a number.\")\r\n # TODO: something is breaking in the validators or retry prompt\r\n # where it does not accept the 2nd answer after reprompting the user\r\n # for another value\r\n step3 = await step2.send(\"hello\")\r\n step4 = await step3.assert_reply(\"You must enter a number.\")\r\n step5 = await step4.send(\"64\")\r\n await step5.assert_reply(\"Bot received the number '64'.\")\r\n```\r\n\r\n[bug]\r\n\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nfrom typing import Callable, Dict\n\nfrom recognizers_number import recognize_number\nfrom recognizers_text import Culture, ModelResult\nfrom babel.numbers import parse_decimal\n\nfrom botbuilder.core.turn_context import TurnContext\nfrom botbuilder.schema import ActivityTypes\n\nfrom .prompt import Prompt, PromptValidatorContext\nfrom .prompt_options import PromptOptions\nfrom .prompt_recognizer_result import PromptRecognizerResult\n\n\nclass NumberPrompt(Prompt):\n # TODO: PromptValidator needs to be fixed\n # Does not accept answer as intended (times out)\n def __init__(\n self,\n dialog_id: str,\n validator: Callable[[PromptValidatorContext], bool] = None,\n default_locale: str = None,\n ):\n super(NumberPrompt, self).__init__(dialog_id, validator)\n self.default_locale = default_locale\n\n async def on_prompt(\n self,\n turn_context: TurnContext,\n state: Dict[str, object],\n options: PromptOptions,\n is_retry: bool,\n ):\n if not turn_context:\n raise TypeError(\"NumberPrompt.on_prompt(): turn_context cannot be None.\")\n if not options:\n raise TypeError(\"NumberPrompt.on_prompt(): options cannot be None.\")\n\n if is_retry and options.retry_prompt is not None:\n turn_context.send_activity(options.retry_prompt)\n elif options.prompt is not None:\n await turn_context.send_activity(options.prompt)\n\n async def on_recognize(\n self,\n turn_context: TurnContext,\n state: Dict[str, object],\n options: PromptOptions,\n ) -> PromptRecognizerResult:\n if not turn_context:\n raise TypeError(\"NumberPrompt.on_recognize(): turn_context cannot be None.\")\n\n result = PromptRecognizerResult()\n if turn_context.activity.type == ActivityTypes.message:\n message = turn_context.activity\n culture = self._get_culture(turn_context)\n results: [ModelResult] = recognize_number(message.text, culture)\n\n if results:\n result.succeeded = True\n result.value = parse_decimal(\n results[0].resolution[\"value\"], locale=culture.replace(\"-\", \"_\")\n )\n\n return result\n\n def _get_culture(self, turn_context: TurnContext):\n culture = (\n turn_context.activity.locale\n if turn_context.activity.locale\n else self.default_locale\n )\n\n if not culture:\n culture = Culture.English\n\n return culture\n", "path": "libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/number_prompt.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nfrom typing import Callable, Dict\n\nfrom recognizers_number import recognize_number\nfrom recognizers_text import Culture, ModelResult\nfrom babel.numbers import parse_decimal\n\nfrom botbuilder.core.turn_context import TurnContext\nfrom botbuilder.schema import ActivityTypes\n\nfrom .prompt import Prompt, PromptValidatorContext\nfrom .prompt_options import PromptOptions\nfrom .prompt_recognizer_result import PromptRecognizerResult\n\n\nclass NumberPrompt(Prompt):\n # TODO: PromptValidator needs to be fixed\n # Does not accept answer as intended (times out)\n def __init__(\n self,\n dialog_id: str,\n validator: Callable[[PromptValidatorContext], bool] = None,\n default_locale: str = None,\n ):\n super(NumberPrompt, self).__init__(dialog_id, validator)\n self.default_locale = default_locale\n\n async def on_prompt(\n self,\n turn_context: TurnContext,\n state: Dict[str, object],\n options: PromptOptions,\n is_retry: bool,\n ):\n if not turn_context:\n raise TypeError(\"NumberPrompt.on_prompt(): turn_context cannot be None.\")\n if not options:\n raise TypeError(\"NumberPrompt.on_prompt(): options cannot be None.\")\n\n if is_retry and options.retry_prompt is not None:\n await turn_context.send_activity(options.retry_prompt)\n elif options.prompt is not None:\n await turn_context.send_activity(options.prompt)\n\n async def on_recognize(\n self,\n turn_context: TurnContext,\n state: Dict[str, object],\n options: PromptOptions,\n ) -> PromptRecognizerResult:\n if not turn_context:\n raise TypeError(\"NumberPrompt.on_recognize(): turn_context cannot be None.\")\n\n result = PromptRecognizerResult()\n if turn_context.activity.type == ActivityTypes.message:\n message = turn_context.activity\n culture = self._get_culture(turn_context)\n results: [ModelResult] = recognize_number(message.text, culture)\n\n if results:\n result.succeeded = True\n result.value = parse_decimal(\n results[0].resolution[\"value\"], locale=culture.replace(\"-\", \"_\")\n )\n\n return result\n\n def _get_culture(self, turn_context: TurnContext):\n culture = (\n turn_context.activity.locale\n if turn_context.activity.locale\n else self.default_locale\n )\n\n if not culture:\n culture = Culture.English\n\n return culture\n", "path": "libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/number_prompt.py"}]} | 1,610 | 161 |
gh_patches_debug_24268 | rasdani/github-patches | git_diff | dmlc__gluon-nlp-832 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ATIS/SNIPS datasets and GLUE datasets don't appear in the website API doc
http://gluon-nlp.mxnet.io/api/modules/data.html
does not show the details of ATISDataset/SNIPSDataset and GlueCoLA, GlueSST2, GlueSTSB, GlueQQP, GlueRTE, GlueMNLI, GlueQNLI, GlueWNLI
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/gluonnlp/data/__init__.py`
Content:
```
1 # coding: utf-8
2
3 # Licensed to the Apache Software Foundation (ASF) under one
4 # or more contributor license agreements. See the NOTICE file
5 # distributed with this work for additional information
6 # regarding copyright ownership. The ASF licenses this file
7 # to you under the Apache License, Version 2.0 (the
8 # "License"); you may not use this file except in compliance
9 # with the License. You may obtain a copy of the License at
10 #
11 # http://www.apache.org/licenses/LICENSE-2.0
12 #
13 # Unless required by applicable law or agreed to in writing,
14 # software distributed under the License is distributed on an
15 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
16 # KIND, either express or implied. See the License for the
17 # specific language governing permissions and limitations
18 # under the License.
19
20 # pylint: disable=wildcard-import
21 """This module includes common utilities such as data readers and counter."""
22
23 from . import (batchify, candidate_sampler, conll, corpora, dataloader,
24 dataset, question_answering, registry, sampler, sentiment,
25 stream, transforms, translation, utils,
26 word_embedding_evaluation, intent_slot)
27 from .candidate_sampler import *
28 from .conll import *
29 from .glue import *
30 from .corpora import *
31 from .dataloader import *
32 from .dataset import *
33 from .question_answering import *
34 from .registry import *
35 from .sampler import *
36 from .sentiment import *
37 from .stream import *
38 from .transforms import *
39 from .translation import *
40 from .utils import *
41 from .word_embedding_evaluation import *
42 from .intent_slot import *
43
44 __all__ = (['batchify'] + utils.__all__ + transforms.__all__ + sampler.__all__
45 + dataset.__all__ + corpora.__all__ + sentiment.__all__ +
46 word_embedding_evaluation.__all__ + stream.__all__ + conll.__all__ +
47 translation.__all__ + registry.__all__ + question_answering.__all__
48 + dataloader.__all__ + candidate_sampler.__all__)
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/gluonnlp/data/__init__.py b/src/gluonnlp/data/__init__.py
--- a/src/gluonnlp/data/__init__.py
+++ b/src/gluonnlp/data/__init__.py
@@ -23,7 +23,7 @@
from . import (batchify, candidate_sampler, conll, corpora, dataloader,
dataset, question_answering, registry, sampler, sentiment,
stream, transforms, translation, utils,
- word_embedding_evaluation, intent_slot)
+ word_embedding_evaluation, intent_slot, glue)
from .candidate_sampler import *
from .conll import *
from .glue import *
@@ -42,7 +42,8 @@
from .intent_slot import *
__all__ = (['batchify'] + utils.__all__ + transforms.__all__ + sampler.__all__
- + dataset.__all__ + corpora.__all__ + sentiment.__all__ +
- word_embedding_evaluation.__all__ + stream.__all__ + conll.__all__ +
- translation.__all__ + registry.__all__ + question_answering.__all__
- + dataloader.__all__ + candidate_sampler.__all__)
+ + dataset.__all__ + corpora.__all__ + sentiment.__all__
+ + word_embedding_evaluation.__all__ + stream.__all__ + conll.__all__
+ + translation.__all__ + registry.__all__ + question_answering.__all__
+ + dataloader.__all__ + candidate_sampler.__all__ + intent_slot.__all__
+ + glue.__all__)
| {"golden_diff": "diff --git a/src/gluonnlp/data/__init__.py b/src/gluonnlp/data/__init__.py\n--- a/src/gluonnlp/data/__init__.py\n+++ b/src/gluonnlp/data/__init__.py\n@@ -23,7 +23,7 @@\n from . import (batchify, candidate_sampler, conll, corpora, dataloader,\n dataset, question_answering, registry, sampler, sentiment,\n stream, transforms, translation, utils,\n- word_embedding_evaluation, intent_slot)\n+ word_embedding_evaluation, intent_slot, glue)\n from .candidate_sampler import *\n from .conll import *\n from .glue import *\n@@ -42,7 +42,8 @@\n from .intent_slot import *\n \n __all__ = (['batchify'] + utils.__all__ + transforms.__all__ + sampler.__all__\n- + dataset.__all__ + corpora.__all__ + sentiment.__all__ +\n- word_embedding_evaluation.__all__ + stream.__all__ + conll.__all__ +\n- translation.__all__ + registry.__all__ + question_answering.__all__\n- + dataloader.__all__ + candidate_sampler.__all__)\n+ + dataset.__all__ + corpora.__all__ + sentiment.__all__\n+ + word_embedding_evaluation.__all__ + stream.__all__ + conll.__all__\n+ + translation.__all__ + registry.__all__ + question_answering.__all__\n+ + dataloader.__all__ + candidate_sampler.__all__ + intent_slot.__all__\n+ + glue.__all__)\n", "issue": "ATIS/SNIPS datasets and GLUE datasets don't appear in the website API doc \nhttp://gluon-nlp.mxnet.io/api/modules/data.html\r\n\r\ndoes not show the details of ATISDataset/SNIPSDataset and GlueCoLA, GlueSST2, GlueSTSB, GlueQQP, GlueRTE, GlueMNLI, GlueQNLI, GlueWNLI\r\n\n", "before_files": [{"content": "# coding: utf-8\n\n# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. See the License for the\n# specific language governing permissions and limitations\n# under the License.\n\n# pylint: disable=wildcard-import\n\"\"\"This module includes common utilities such as data readers and counter.\"\"\"\n\nfrom . import (batchify, candidate_sampler, conll, corpora, dataloader,\n dataset, question_answering, registry, sampler, sentiment,\n stream, transforms, translation, utils,\n word_embedding_evaluation, intent_slot)\nfrom .candidate_sampler import *\nfrom .conll import *\nfrom .glue import *\nfrom .corpora import *\nfrom .dataloader import *\nfrom .dataset import *\nfrom .question_answering import *\nfrom .registry import *\nfrom .sampler import *\nfrom .sentiment import *\nfrom .stream import *\nfrom .transforms import *\nfrom .translation import *\nfrom .utils import *\nfrom .word_embedding_evaluation import *\nfrom .intent_slot import *\n\n__all__ = (['batchify'] + utils.__all__ + transforms.__all__ + sampler.__all__\n + dataset.__all__ + corpora.__all__ + sentiment.__all__ +\n word_embedding_evaluation.__all__ + stream.__all__ + conll.__all__ +\n translation.__all__ + registry.__all__ + question_answering.__all__\n + dataloader.__all__ + candidate_sampler.__all__)\n", "path": "src/gluonnlp/data/__init__.py"}], "after_files": [{"content": "# coding: utf-8\n\n# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. See the License for the\n# specific language governing permissions and limitations\n# under the License.\n\n# pylint: disable=wildcard-import\n\"\"\"This module includes common utilities such as data readers and counter.\"\"\"\n\nfrom . import (batchify, candidate_sampler, conll, corpora, dataloader,\n dataset, question_answering, registry, sampler, sentiment,\n stream, transforms, translation, utils,\n word_embedding_evaluation, intent_slot, glue)\nfrom .candidate_sampler import *\nfrom .conll import *\nfrom .glue import *\nfrom .corpora import *\nfrom .dataloader import *\nfrom .dataset import *\nfrom .question_answering import *\nfrom .registry import *\nfrom .sampler import *\nfrom .sentiment import *\nfrom .stream import *\nfrom .transforms import *\nfrom .translation import *\nfrom .utils import *\nfrom .word_embedding_evaluation import *\nfrom .intent_slot import *\n\n__all__ = (['batchify'] + utils.__all__ + transforms.__all__ + sampler.__all__\n + dataset.__all__ + corpora.__all__ + sentiment.__all__\n + word_embedding_evaluation.__all__ + stream.__all__ + conll.__all__\n + translation.__all__ + registry.__all__ + question_answering.__all__\n + dataloader.__all__ + candidate_sampler.__all__ + intent_slot.__all__\n + glue.__all__)\n", "path": "src/gluonnlp/data/__init__.py"}]} | 888 | 347 |
gh_patches_debug_29639 | rasdani/github-patches | git_diff | frappe__frappe-2519 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Move app installation to background
Long installs timeout the installation of the app and leads to broken installs.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `frappe/desk/page/applications/applications.py`
Content:
```
1 # Copyright (c) 2015, Frappe Technologies Pvt. Ltd. and Contributors
2 # MIT License. See license.txt
3
4 from __future__ import unicode_literals
5 import frappe
6 import frappe.utils
7 import frappe.installer
8 import frappe.sessions
9 import subprocess
10 import os
11 import json
12 from frappe import _
13 from distutils.spawn import find_executable
14
15 @frappe.whitelist()
16 def get_app_list():
17 """Get list of all apps with properties, installed, category from hooks and
18 `frappe/data/app_listing/` if an entry exists"""
19 out = {}
20 installed = frappe.get_installed_apps()
21 for app in frappe.get_all_apps(True):
22 app_hooks = frappe.get_hooks(app_name=app)
23
24 if app not in installed and app_hooks.get('hide_in_installer'):
25 continue
26
27 out[app] = {}
28 for key in ("app_name", "app_title", "app_description", "app_icon",
29 "app_publisher", "app_version", "app_url", "app_color"):
30 val = app_hooks.get(key) or []
31 out[app][key] = val[0] if len(val) else ""
32
33 if app in installed:
34 out[app]["installed"] = 1
35
36 for app_from_list in get_app_listing().values():
37 if app_from_list.app_name in out:
38 out[app_from_list.app_name].update(app_from_list)
39 else:
40 if not frappe.conf.disallow_app_listing:
41 out[app_from_list.app_name] = app_from_list
42
43 return out
44
45 def get_app_listing():
46 """Get apps listed in `frappe/data/app_listing/`"""
47 apps_listing_dir = os.path.join(os.path.dirname(frappe.__file__), 'data', 'app_listing')
48 out = {}
49 for app in os.listdir(apps_listing_dir):
50 if app.endswith(".json"):
51 with open(os.path.join(apps_listing_dir, app)) as f:
52 out[app[:-5]] = frappe._dict(json.load(f))
53 return out
54
55 @frappe.whitelist()
56 def install_app(name):
57 """Install app, if app is not installed in local environment, install it via git url in
58 `frappe/data/app_listing/`"""
59 frappe.only_for("System Manager")
60
61 if name not in frappe.get_all_apps(True):
62 if not frappe.conf.disallow_app_listing:
63 get_app(name)
64 frappe.cache().delete_value(["app_hooks"])
65 # reload sys.path
66 import site
67 reload(site)
68 else:
69 # will only come via direct API
70 frappe.throw("Listing app not allowed")
71
72 app_hooks = frappe.get_hooks(app_name=name)
73 if app_hooks.get('hide_in_installer'):
74 frappe.throw(_("You cannot install this app"))
75
76 frappe.publish_realtime("install_app_progress", {"status": _("Installing App {0}").format(name)},
77 user=frappe.session.user)
78
79 frappe.installer.install_app(name)
80
81 frappe.publish_realtime("install_app_progress", {"status": _("{0} Installed").format(name)},
82 user=frappe.session.user)
83
84 def get_app(name):
85 """Get app using git clone and install it in bench environment"""
86 app_listing = get_app_listing()
87 if name not in app_listing:
88 frappe.throw(_("Unknown app {0}").format(name))
89 raise frappe.ValidationError
90
91 frappe.publish_realtime("install_app_progress", {"status": _("Downloading App {0}").format(name)},
92 user=frappe.session.user)
93
94 args = [find_executable('bench'), 'get-app', name, app_listing[name]['repo_url']]
95
96 try:
97 subprocess.check_call(args, cwd=frappe.utils.get_bench_path(),
98 stderr=subprocess.STDOUT)
99 return "okay"
100 except subprocess.CalledProcessError as e:
101 frappe.msgprint("<b>" + " ".join(args) + "</b>")
102 frappe.msgprint(e.output)
103 return e.output
104
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/frappe/desk/page/applications/applications.py b/frappe/desk/page/applications/applications.py
--- a/frappe/desk/page/applications/applications.py
+++ b/frappe/desk/page/applications/applications.py
@@ -11,6 +11,7 @@
import json
from frappe import _
from distutils.spawn import find_executable
+from frappe.utils.background_jobs import enqueue
@frappe.whitelist()
def get_app_list():
@@ -73,6 +74,12 @@
if app_hooks.get('hide_in_installer'):
frappe.throw(_("You cannot install this app"))
+ enqueue('frappe.desk.page.applications.applications.start_install', name=name)
+
+ frappe.msgprint(_('Queued for install'))
+
+
+def start_install(name):
frappe.publish_realtime("install_app_progress", {"status": _("Installing App {0}").format(name)},
user=frappe.session.user)
@@ -81,6 +88,20 @@
frappe.publish_realtime("install_app_progress", {"status": _("{0} Installed").format(name)},
user=frappe.session.user)
[email protected]()
+def remove_app(name):
+ """Remove installed app"""
+ frappe.only_for("System Manager")
+
+ if name in frappe.get_installed_apps():
+ enqueue('frappe.desk.page.applications.applications.start_remove', name=name)
+
+ frappe.msgprint(_('Queued for backup and removing {0}').format(frappe.bold(name)))
+
+def start_remove(name):
+ frappe.installer.remove_app(app_name=name, yes=True)
+ frappe.publish_realtime('msgprint', _('App {0} removed').format(frappe.bold(name)))
+
def get_app(name):
"""Get app using git clone and install it in bench environment"""
app_listing = get_app_listing()
| {"golden_diff": "diff --git a/frappe/desk/page/applications/applications.py b/frappe/desk/page/applications/applications.py\n--- a/frappe/desk/page/applications/applications.py\n+++ b/frappe/desk/page/applications/applications.py\n@@ -11,6 +11,7 @@\n import json\n from frappe import _\n from distutils.spawn import find_executable\n+from frappe.utils.background_jobs import enqueue\n \n @frappe.whitelist()\n def get_app_list():\n@@ -73,6 +74,12 @@\n \tif app_hooks.get('hide_in_installer'):\n \t\tfrappe.throw(_(\"You cannot install this app\"))\n \n+\tenqueue('frappe.desk.page.applications.applications.start_install', name=name)\n+\n+\tfrappe.msgprint(_('Queued for install'))\n+\n+\n+def start_install(name):\n \tfrappe.publish_realtime(\"install_app_progress\", {\"status\": _(\"Installing App {0}\").format(name)},\n \t\tuser=frappe.session.user)\n \n@@ -81,6 +88,20 @@\n \tfrappe.publish_realtime(\"install_app_progress\", {\"status\": _(\"{0} Installed\").format(name)},\n \t\tuser=frappe.session.user)\n \[email protected]()\n+def remove_app(name):\n+\t\"\"\"Remove installed app\"\"\"\n+\tfrappe.only_for(\"System Manager\")\n+\n+\tif name in frappe.get_installed_apps():\n+\t\tenqueue('frappe.desk.page.applications.applications.start_remove', name=name)\n+\n+\tfrappe.msgprint(_('Queued for backup and removing {0}').format(frappe.bold(name)))\n+\n+def start_remove(name):\n+\tfrappe.installer.remove_app(app_name=name, yes=True)\n+\tfrappe.publish_realtime('msgprint', _('App {0} removed').format(frappe.bold(name)))\n+\n def get_app(name):\n \t\"\"\"Get app using git clone and install it in bench environment\"\"\"\n \tapp_listing = get_app_listing()\n", "issue": "Move app installation to background\nLong installs timeout the installation of the app and leads to broken installs.\n\n", "before_files": [{"content": "# Copyright (c) 2015, Frappe Technologies Pvt. Ltd. and Contributors\n# MIT License. See license.txt\n\nfrom __future__ import unicode_literals\nimport frappe\nimport frappe.utils\nimport frappe.installer\nimport frappe.sessions\nimport subprocess\nimport os\nimport json\nfrom frappe import _\nfrom distutils.spawn import find_executable\n\[email protected]()\ndef get_app_list():\n\t\"\"\"Get list of all apps with properties, installed, category from hooks and\n\t`frappe/data/app_listing/` if an entry exists\"\"\"\n\tout = {}\n\tinstalled = frappe.get_installed_apps()\n\tfor app in frappe.get_all_apps(True):\n\t\tapp_hooks = frappe.get_hooks(app_name=app)\n\n\t\tif app not in installed and app_hooks.get('hide_in_installer'):\n\t\t\tcontinue\n\n\t\tout[app] = {}\n\t\tfor key in (\"app_name\", \"app_title\", \"app_description\", \"app_icon\",\n\t\t\t\"app_publisher\", \"app_version\", \"app_url\", \"app_color\"):\n\t\t\t val = app_hooks.get(key) or []\n\t\t\t out[app][key] = val[0] if len(val) else \"\"\n\n\t\tif app in installed:\n\t\t\tout[app][\"installed\"] = 1\n\n\tfor app_from_list in get_app_listing().values():\n\t\tif app_from_list.app_name in out:\n\t\t\tout[app_from_list.app_name].update(app_from_list)\n\t\telse:\n\t\t\tif not frappe.conf.disallow_app_listing:\n\t\t\t\tout[app_from_list.app_name] = app_from_list\n\n\treturn out\n\ndef get_app_listing():\n\t\"\"\"Get apps listed in `frappe/data/app_listing/`\"\"\"\n\tapps_listing_dir = os.path.join(os.path.dirname(frappe.__file__), 'data', 'app_listing')\n\tout = {}\n\tfor app in os.listdir(apps_listing_dir):\n\t\tif app.endswith(\".json\"):\n\t\t\twith open(os.path.join(apps_listing_dir, app)) as f:\n\t\t\t\tout[app[:-5]] = frappe._dict(json.load(f))\n\treturn out\n\[email protected]()\ndef install_app(name):\n\t\"\"\"Install app, if app is not installed in local environment, install it via git url in\n\t`frappe/data/app_listing/`\"\"\"\n\tfrappe.only_for(\"System Manager\")\n\n\tif name not in frappe.get_all_apps(True):\n\t\tif not frappe.conf.disallow_app_listing:\n\t\t\tget_app(name)\n\t\t\tfrappe.cache().delete_value([\"app_hooks\"])\n\t\t\t# reload sys.path\n\t\t\timport site\n\t\t\treload(site)\n\t\telse:\n\t\t\t# will only come via direct API\n\t\t\tfrappe.throw(\"Listing app not allowed\")\n\n\tapp_hooks = frappe.get_hooks(app_name=name)\n\tif app_hooks.get('hide_in_installer'):\n\t\tfrappe.throw(_(\"You cannot install this app\"))\n\n\tfrappe.publish_realtime(\"install_app_progress\", {\"status\": _(\"Installing App {0}\").format(name)},\n\t\tuser=frappe.session.user)\n\n\tfrappe.installer.install_app(name)\n\n\tfrappe.publish_realtime(\"install_app_progress\", {\"status\": _(\"{0} Installed\").format(name)},\n\t\tuser=frappe.session.user)\n\ndef get_app(name):\n\t\"\"\"Get app using git clone and install it in bench environment\"\"\"\n\tapp_listing = get_app_listing()\n\tif name not in app_listing:\n\t\tfrappe.throw(_(\"Unknown app {0}\").format(name))\n\t\traise frappe.ValidationError\n\n\tfrappe.publish_realtime(\"install_app_progress\", {\"status\": _(\"Downloading App {0}\").format(name)},\n\t\tuser=frappe.session.user)\n\n\targs = [find_executable('bench'), 'get-app', name, app_listing[name]['repo_url']]\n\n\ttry:\n\t\tsubprocess.check_call(args, cwd=frappe.utils.get_bench_path(),\n\t\t\tstderr=subprocess.STDOUT)\n\t\treturn \"okay\"\n\texcept subprocess.CalledProcessError as e:\n\t\tfrappe.msgprint(\"<b>\" + \" \".join(args) + \"</b>\")\n\t\tfrappe.msgprint(e.output)\n\t\treturn e.output\n", "path": "frappe/desk/page/applications/applications.py"}], "after_files": [{"content": "# Copyright (c) 2015, Frappe Technologies Pvt. Ltd. and Contributors\n# MIT License. See license.txt\n\nfrom __future__ import unicode_literals\nimport frappe\nimport frappe.utils\nimport frappe.installer\nimport frappe.sessions\nimport subprocess\nimport os\nimport json\nfrom frappe import _\nfrom distutils.spawn import find_executable\nfrom frappe.utils.background_jobs import enqueue\n\[email protected]()\ndef get_app_list():\n\t\"\"\"Get list of all apps with properties, installed, category from hooks and\n\t`frappe/data/app_listing/` if an entry exists\"\"\"\n\tout = {}\n\tinstalled = frappe.get_installed_apps()\n\tfor app in frappe.get_all_apps(True):\n\t\tapp_hooks = frappe.get_hooks(app_name=app)\n\n\t\tif app not in installed and app_hooks.get('hide_in_installer'):\n\t\t\tcontinue\n\n\t\tout[app] = {}\n\t\tfor key in (\"app_name\", \"app_title\", \"app_description\", \"app_icon\",\n\t\t\t\"app_publisher\", \"app_version\", \"app_url\", \"app_color\"):\n\t\t\t val = app_hooks.get(key) or []\n\t\t\t out[app][key] = val[0] if len(val) else \"\"\n\n\t\tif app in installed:\n\t\t\tout[app][\"installed\"] = 1\n\n\tfor app_from_list in get_app_listing().values():\n\t\tif app_from_list.app_name in out:\n\t\t\tout[app_from_list.app_name].update(app_from_list)\n\t\telse:\n\t\t\tif not frappe.conf.disallow_app_listing:\n\t\t\t\tout[app_from_list.app_name] = app_from_list\n\n\treturn out\n\ndef get_app_listing():\n\t\"\"\"Get apps listed in `frappe/data/app_listing/`\"\"\"\n\tapps_listing_dir = os.path.join(os.path.dirname(frappe.__file__), 'data', 'app_listing')\n\tout = {}\n\tfor app in os.listdir(apps_listing_dir):\n\t\tif app.endswith(\".json\"):\n\t\t\twith open(os.path.join(apps_listing_dir, app)) as f:\n\t\t\t\tout[app[:-5]] = frappe._dict(json.load(f))\n\treturn out\n\[email protected]()\ndef install_app(name):\n\t\"\"\"Install app, if app is not installed in local environment, install it via git url in\n\t`frappe/data/app_listing/`\"\"\"\n\tfrappe.only_for(\"System Manager\")\n\n\tif name not in frappe.get_all_apps(True):\n\t\tif not frappe.conf.disallow_app_listing:\n\t\t\tget_app(name)\n\t\t\tfrappe.cache().delete_value([\"app_hooks\"])\n\t\t\t# reload sys.path\n\t\t\timport site\n\t\t\treload(site)\n\t\telse:\n\t\t\t# will only come via direct API\n\t\t\tfrappe.throw(\"Listing app not allowed\")\n\n\tapp_hooks = frappe.get_hooks(app_name=name)\n\tif app_hooks.get('hide_in_installer'):\n\t\tfrappe.throw(_(\"You cannot install this app\"))\n\n\tenqueue('frappe.desk.page.applications.applications.start_install', name=name)\n\n\tfrappe.msgprint(_('Queued for install'))\n\n\ndef start_install(name):\n\tfrappe.publish_realtime(\"install_app_progress\", {\"status\": _(\"Installing App {0}\").format(name)},\n\t\tuser=frappe.session.user)\n\n\tfrappe.installer.install_app(name)\n\n\tfrappe.publish_realtime(\"install_app_progress\", {\"status\": _(\"{0} Installed\").format(name)},\n\t\tuser=frappe.session.user)\n\[email protected]()\ndef remove_app(name):\n\t\"\"\"Remove installed app\"\"\"\n\tfrappe.only_for(\"System Manager\")\n\n\tif name in frappe.get_installed_apps():\n\t\tenqueue('frappe.desk.page.applications.applications.start_remove', name=name)\n\n\tfrappe.msgprint(_('Queued for backup and removing {0}').format(frappe.bold(name)))\n\ndef start_remove(name):\n\tfrappe.installer.remove_app(app_name=name, yes=True)\n\tfrappe.publish_realtime('msgprint', _('App {0} removed').format(frappe.bold(name)))\n\ndef get_app(name):\n\t\"\"\"Get app using git clone and install it in bench environment\"\"\"\n\tapp_listing = get_app_listing()\n\tif name not in app_listing:\n\t\tfrappe.throw(_(\"Unknown app {0}\").format(name))\n\t\traise frappe.ValidationError\n\n\tfrappe.publish_realtime(\"install_app_progress\", {\"status\": _(\"Downloading App {0}\").format(name)},\n\t\tuser=frappe.session.user)\n\n\targs = [find_executable('bench'), 'get-app', name, app_listing[name]['repo_url']]\n\n\ttry:\n\t\tsubprocess.check_call(args, cwd=frappe.utils.get_bench_path(),\n\t\t\tstderr=subprocess.STDOUT)\n\t\treturn \"okay\"\n\texcept subprocess.CalledProcessError as e:\n\t\tfrappe.msgprint(\"<b>\" + \" \".join(args) + \"</b>\")\n\t\tfrappe.msgprint(e.output)\n\t\treturn e.output\n", "path": "frappe/desk/page/applications/applications.py"}]} | 1,381 | 421 |
gh_patches_debug_1748 | rasdani/github-patches | git_diff | MycroftAI__mycroft-core-750 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Typing error in recognize_google() methode
In mycroft/stt/\_\_init\_\_.py line 74 :
Replacing mistyped 's' parameter by self.lang fixed the problem.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mycroft/stt/__init__.py`
Content:
```
1 # Copyright 2016 Mycroft AI, Inc.
2 #
3 # This file is part of Mycroft Core.
4 #
5 # Mycroft Core is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU General Public License as published by
7 # the Free Software Foundation, either version 3 of the License, or
8 # (at your option) any later version.
9 #
10 # Mycroft Core is distributed in the hope that it will be useful,
11 # but WITHOUT ANY WARRANTY; without even the implied warranty of
12 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
13 # GNU General Public License for more details.
14 #
15 # You should have received a copy of the GNU General Public License
16 # along with Mycroft Core. If not, see <http://www.gnu.org/licenses/>.
17 from abc import ABCMeta, abstractmethod
18
19 from speech_recognition import Recognizer
20
21 from mycroft.api import STTApi
22 from mycroft.configuration import ConfigurationManager
23 from mycroft.util.log import getLogger
24
25 __author__ = "jdorleans"
26
27 LOG = getLogger("STT")
28
29
30 class STT(object):
31 __metaclass__ = ABCMeta
32
33 def __init__(self):
34 config_core = ConfigurationManager.get()
35 self.lang = str(self.init_language(config_core))
36 config_stt = config_core.get("stt", {})
37 self.config = config_stt.get(config_stt.get("module"), {})
38 self.credential = self.config.get("credential", {})
39 self.recognizer = Recognizer()
40
41 @staticmethod
42 def init_language(config_core):
43 langs = config_core.get("lang", "en-US").split("-")
44 return langs[0].lower() + "-" + langs[1].upper()
45
46 @abstractmethod
47 def execute(self, audio, language=None):
48 pass
49
50
51 class TokenSTT(STT):
52 __metaclass__ = ABCMeta
53
54 def __init__(self):
55 super(TokenSTT, self).__init__()
56 self.token = str(self.credential.get("token"))
57
58
59 class BasicSTT(STT):
60 __metaclass__ = ABCMeta
61
62 def __init__(self):
63 super(BasicSTT, self).__init__()
64 self.username = str(self.credential.get("username"))
65 self.password = str(self.credential.get("password"))
66
67
68 class GoogleSTT(TokenSTT):
69 def __init__(self):
70 super(GoogleSTT, self).__init__()
71
72 def execute(self, audio, language=None):
73 self.lang = language or self.lang
74 return self.recognizer.recognize_google(audio, self.token, s)
75
76
77 class WITSTT(TokenSTT):
78 def __init__(self):
79 super(WITSTT, self).__init__()
80
81 def execute(self, audio, language=None):
82 LOG.warn("WITSTT language should be configured at wit.ai settings.")
83 return self.recognizer.recognize_wit(audio, self.token)
84
85
86 class IBMSTT(BasicSTT):
87 def __init__(self):
88 super(IBMSTT, self).__init__()
89
90 def execute(self, audio, language=None):
91 self.lang = language or self.lang
92 return self.recognizer.recognize_ibm(audio, self.username,
93 self.password, self.lang)
94
95
96 class MycroftSTT(STT):
97 def __init__(self):
98 super(MycroftSTT, self).__init__()
99 self.api = STTApi()
100
101 def execute(self, audio, language=None):
102 self.lang = language or self.lang
103 return self.api.stt(audio.get_flac_data(), self.lang, 1)[0]
104
105
106 class STTFactory(object):
107 CLASSES = {
108 "mycroft": MycroftSTT,
109 "google": GoogleSTT,
110 "wit": WITSTT,
111 "ibm": IBMSTT
112 }
113
114 @staticmethod
115 def create():
116 config = ConfigurationManager.get().get("stt", {})
117 module = config.get("module", "mycroft")
118 clazz = STTFactory.CLASSES.get(module)
119 return clazz()
120
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mycroft/stt/__init__.py b/mycroft/stt/__init__.py
--- a/mycroft/stt/__init__.py
+++ b/mycroft/stt/__init__.py
@@ -71,7 +71,7 @@
def execute(self, audio, language=None):
self.lang = language or self.lang
- return self.recognizer.recognize_google(audio, self.token, s)
+ return self.recognizer.recognize_google(audio, self.token, self.lang)
class WITSTT(TokenSTT):
| {"golden_diff": "diff --git a/mycroft/stt/__init__.py b/mycroft/stt/__init__.py\n--- a/mycroft/stt/__init__.py\n+++ b/mycroft/stt/__init__.py\n@@ -71,7 +71,7 @@\n \n def execute(self, audio, language=None):\n self.lang = language or self.lang\n- return self.recognizer.recognize_google(audio, self.token, s)\n+ return self.recognizer.recognize_google(audio, self.token, self.lang)\n \n \n class WITSTT(TokenSTT):\n", "issue": "Typing error in recognize_google() methode\nIn mycroft/stt/\\_\\_init\\_\\_.py line 74 :\r\nReplacing mistyped 's' parameter by self.lang fixed the problem.\n", "before_files": [{"content": "# Copyright 2016 Mycroft AI, Inc.\n#\n# This file is part of Mycroft Core.\n#\n# Mycroft Core is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# Mycroft Core is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with Mycroft Core. If not, see <http://www.gnu.org/licenses/>.\nfrom abc import ABCMeta, abstractmethod\n\nfrom speech_recognition import Recognizer\n\nfrom mycroft.api import STTApi\nfrom mycroft.configuration import ConfigurationManager\nfrom mycroft.util.log import getLogger\n\n__author__ = \"jdorleans\"\n\nLOG = getLogger(\"STT\")\n\n\nclass STT(object):\n __metaclass__ = ABCMeta\n\n def __init__(self):\n config_core = ConfigurationManager.get()\n self.lang = str(self.init_language(config_core))\n config_stt = config_core.get(\"stt\", {})\n self.config = config_stt.get(config_stt.get(\"module\"), {})\n self.credential = self.config.get(\"credential\", {})\n self.recognizer = Recognizer()\n\n @staticmethod\n def init_language(config_core):\n langs = config_core.get(\"lang\", \"en-US\").split(\"-\")\n return langs[0].lower() + \"-\" + langs[1].upper()\n\n @abstractmethod\n def execute(self, audio, language=None):\n pass\n\n\nclass TokenSTT(STT):\n __metaclass__ = ABCMeta\n\n def __init__(self):\n super(TokenSTT, self).__init__()\n self.token = str(self.credential.get(\"token\"))\n\n\nclass BasicSTT(STT):\n __metaclass__ = ABCMeta\n\n def __init__(self):\n super(BasicSTT, self).__init__()\n self.username = str(self.credential.get(\"username\"))\n self.password = str(self.credential.get(\"password\"))\n\n\nclass GoogleSTT(TokenSTT):\n def __init__(self):\n super(GoogleSTT, self).__init__()\n\n def execute(self, audio, language=None):\n self.lang = language or self.lang\n return self.recognizer.recognize_google(audio, self.token, s)\n\n\nclass WITSTT(TokenSTT):\n def __init__(self):\n super(WITSTT, self).__init__()\n\n def execute(self, audio, language=None):\n LOG.warn(\"WITSTT language should be configured at wit.ai settings.\")\n return self.recognizer.recognize_wit(audio, self.token)\n\n\nclass IBMSTT(BasicSTT):\n def __init__(self):\n super(IBMSTT, self).__init__()\n\n def execute(self, audio, language=None):\n self.lang = language or self.lang\n return self.recognizer.recognize_ibm(audio, self.username,\n self.password, self.lang)\n\n\nclass MycroftSTT(STT):\n def __init__(self):\n super(MycroftSTT, self).__init__()\n self.api = STTApi()\n\n def execute(self, audio, language=None):\n self.lang = language or self.lang\n return self.api.stt(audio.get_flac_data(), self.lang, 1)[0]\n\n\nclass STTFactory(object):\n CLASSES = {\n \"mycroft\": MycroftSTT,\n \"google\": GoogleSTT,\n \"wit\": WITSTT,\n \"ibm\": IBMSTT\n }\n\n @staticmethod\n def create():\n config = ConfigurationManager.get().get(\"stt\", {})\n module = config.get(\"module\", \"mycroft\")\n clazz = STTFactory.CLASSES.get(module)\n return clazz()\n", "path": "mycroft/stt/__init__.py"}], "after_files": [{"content": "# Copyright 2016 Mycroft AI, Inc.\n#\n# This file is part of Mycroft Core.\n#\n# Mycroft Core is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# Mycroft Core is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with Mycroft Core. If not, see <http://www.gnu.org/licenses/>.\nfrom abc import ABCMeta, abstractmethod\n\nfrom speech_recognition import Recognizer\n\nfrom mycroft.api import STTApi\nfrom mycroft.configuration import ConfigurationManager\nfrom mycroft.util.log import getLogger\n\n__author__ = \"jdorleans\"\n\nLOG = getLogger(\"STT\")\n\n\nclass STT(object):\n __metaclass__ = ABCMeta\n\n def __init__(self):\n config_core = ConfigurationManager.get()\n self.lang = str(self.init_language(config_core))\n config_stt = config_core.get(\"stt\", {})\n self.config = config_stt.get(config_stt.get(\"module\"), {})\n self.credential = self.config.get(\"credential\", {})\n self.recognizer = Recognizer()\n\n @staticmethod\n def init_language(config_core):\n langs = config_core.get(\"lang\", \"en-US\").split(\"-\")\n return langs[0].lower() + \"-\" + langs[1].upper()\n\n @abstractmethod\n def execute(self, audio, language=None):\n pass\n\n\nclass TokenSTT(STT):\n __metaclass__ = ABCMeta\n\n def __init__(self):\n super(TokenSTT, self).__init__()\n self.token = str(self.credential.get(\"token\"))\n\n\nclass BasicSTT(STT):\n __metaclass__ = ABCMeta\n\n def __init__(self):\n super(BasicSTT, self).__init__()\n self.username = str(self.credential.get(\"username\"))\n self.password = str(self.credential.get(\"password\"))\n\n\nclass GoogleSTT(TokenSTT):\n def __init__(self):\n super(GoogleSTT, self).__init__()\n\n def execute(self, audio, language=None):\n self.lang = language or self.lang\n return self.recognizer.recognize_google(audio, self.token, self.lang)\n\n\nclass WITSTT(TokenSTT):\n def __init__(self):\n super(WITSTT, self).__init__()\n\n def execute(self, audio, language=None):\n LOG.warn(\"WITSTT language should be configured at wit.ai settings.\")\n return self.recognizer.recognize_wit(audio, self.token)\n\n\nclass IBMSTT(BasicSTT):\n def __init__(self):\n super(IBMSTT, self).__init__()\n\n def execute(self, audio, language=None):\n self.lang = language or self.lang\n return self.recognizer.recognize_ibm(audio, self.username,\n self.password, self.lang)\n\n\nclass MycroftSTT(STT):\n def __init__(self):\n super(MycroftSTT, self).__init__()\n self.api = STTApi()\n\n def execute(self, audio, language=None):\n self.lang = language or self.lang\n return self.api.stt(audio.get_flac_data(), self.lang, 1)[0]\n\n\nclass STTFactory(object):\n CLASSES = {\n \"mycroft\": MycroftSTT,\n \"google\": GoogleSTT,\n \"wit\": WITSTT,\n \"ibm\": IBMSTT\n }\n\n @staticmethod\n def create():\n config = ConfigurationManager.get().get(\"stt\", {})\n module = config.get(\"module\", \"mycroft\")\n clazz = STTFactory.CLASSES.get(module)\n return clazz()\n", "path": "mycroft/stt/__init__.py"}]} | 1,436 | 123 |
gh_patches_debug_6541 | rasdani/github-patches | git_diff | pytorch__vision-5775 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Wrong label mapping in vision/torchvision/datasets/flowers102.py
### 🐛 Describe the bug
from line 70 in the file vision/torchvision/datasets/flowers102.py
```
self._labels = []
self._image_files = []
for image_id in image_ids:
self._labels.append(image_id_to_label[image_id]) # Note: the bug is here, current the labels are ranged from 1 to 102, which should be ranged from 0 to 101, consider to change this line into self._labels.append(image_id_to_label[image_id] - 1) ?
self._image_files.append(self._images_folder / f"image_{image_id:05d}.jpg")
```
### Versions
PyTorch version: 1.11.0
Is debug build: False
CUDA used to build PyTorch: 11.3
ROCM used to build PyTorch: N/A
OS: Ubuntu 20.04.4 LTS (x86_64)
GCC version: (Ubuntu 9.4.0-1ubuntu1~20.04.1) 9.4.0
Clang version: Could not collect
CMake version: version 3.16.3
Libc version: glibc-2.31
Python version: 3.8.13 (default, Mar 28 2022, 11:38:47) [GCC 7.5.0] (64-bit runtime)
Python platform: Linux-5.4.0-104-generic-x86_64-with-glibc2.17
Is CUDA available: True
CUDA runtime version: Could not collect
GPU models and configuration: GPU 0: NVIDIA GeForce RTX 3090
Nvidia driver version: 510.54
cuDNN version: Could not collect
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True
Versions of relevant libraries:
[pip3] advertorch==0.2.3
[pip3] numpy==1.21.2
[pip3] torch==1.11.0
[pip3] torchaudio==0.11.0
[pip3] torchvision==0.7.0
[conda] advertorch 0.2.3 pypi_0 pypi
[conda] blas 1.0 mkl https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
[conda] cudatoolkit 11.3.1 h2bc3f7f_2 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
[conda] ffmpeg 4.3 hf484d3e_0 pytorch
[conda] mkl 2021.4.0 h06a4308_640 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
[conda] mkl-service 2.4.0 py38h7f8727e_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
[conda] mkl_fft 1.3.1 py38hd3c417c_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
[conda] mkl_random 1.2.2 py38h51133e4_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
[conda] numpy 1.21.2 py38h20f2e39_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
[conda] numpy-base 1.21.2 py38h79a1101_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
[conda] pytorch 1.11.0 py3.8_cuda11.3_cudnn8.2.0_0 pytorch
[conda] pytorch-mutex 1.0 cuda pytorch
[conda] torchaudio 0.11.0 py38_cu113 pytorch
[conda] torchvision 0.7.0 pypi_0 pypi
cc @pmeier @YosuaMichael
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torchvision/datasets/flowers102.py`
Content:
```
1 from pathlib import Path
2 from typing import Any, Tuple, Callable, Optional
3
4 import PIL.Image
5
6 from .utils import check_integrity, download_and_extract_archive, download_url, verify_str_arg
7 from .vision import VisionDataset
8
9
10 class Flowers102(VisionDataset):
11 """`Oxford 102 Flower <https://www.robots.ox.ac.uk/~vgg/data/flowers/102/>`_ Dataset.
12
13 .. warning::
14
15 This class needs `scipy <https://docs.scipy.org/doc/>`_ to load target files from `.mat` format.
16
17 Oxford 102 Flower is an image classification dataset consisting of 102 flower categories. The
18 flowers were chosen to be flowers commonly occurring in the United Kingdom. Each class consists of
19 between 40 and 258 images.
20
21 The images have large scale, pose and light variations. In addition, there are categories that
22 have large variations within the category, and several very similar categories.
23
24 Args:
25 root (string): Root directory of the dataset.
26 split (string, optional): The dataset split, supports ``"train"`` (default), ``"val"``, or ``"test"``.
27 transform (callable, optional): A function/transform that takes in an PIL image and returns a
28 transformed version. E.g, ``transforms.RandomCrop``.
29 target_transform (callable, optional): A function/transform that takes in the target and transforms it.
30 download (bool, optional): If true, downloads the dataset from the internet and
31 puts it in root directory. If dataset is already downloaded, it is not
32 downloaded again.
33 """
34
35 _download_url_prefix = "https://www.robots.ox.ac.uk/~vgg/data/flowers/102/"
36 _file_dict = { # filename, md5
37 "image": ("102flowers.tgz", "52808999861908f626f3c1f4e79d11fa"),
38 "label": ("imagelabels.mat", "e0620be6f572b9609742df49c70aed4d"),
39 "setid": ("setid.mat", "a5357ecc9cb78c4bef273ce3793fc85c"),
40 }
41 _splits_map = {"train": "trnid", "val": "valid", "test": "tstid"}
42
43 def __init__(
44 self,
45 root: str,
46 split: str = "train",
47 transform: Optional[Callable] = None,
48 target_transform: Optional[Callable] = None,
49 download: bool = False,
50 ) -> None:
51 super().__init__(root, transform=transform, target_transform=target_transform)
52 self._split = verify_str_arg(split, "split", ("train", "val", "test"))
53 self._base_folder = Path(self.root) / "flowers-102"
54 self._images_folder = self._base_folder / "jpg"
55
56 if download:
57 self.download()
58
59 if not self._check_integrity():
60 raise RuntimeError("Dataset not found or corrupted. You can use download=True to download it")
61
62 from scipy.io import loadmat
63
64 set_ids = loadmat(self._base_folder / self._file_dict["setid"][0], squeeze_me=True)
65 image_ids = set_ids[self._splits_map[self._split]].tolist()
66
67 labels = loadmat(self._base_folder / self._file_dict["label"][0], squeeze_me=True)
68 image_id_to_label = dict(enumerate(labels["labels"].tolist(), 1))
69
70 self._labels = []
71 self._image_files = []
72 for image_id in image_ids:
73 self._labels.append(image_id_to_label[image_id])
74 self._image_files.append(self._images_folder / f"image_{image_id:05d}.jpg")
75
76 def __len__(self) -> int:
77 return len(self._image_files)
78
79 def __getitem__(self, idx) -> Tuple[Any, Any]:
80 image_file, label = self._image_files[idx], self._labels[idx]
81 image = PIL.Image.open(image_file).convert("RGB")
82
83 if self.transform:
84 image = self.transform(image)
85
86 if self.target_transform:
87 label = self.target_transform(label)
88
89 return image, label
90
91 def extra_repr(self) -> str:
92 return f"split={self._split}"
93
94 def _check_integrity(self):
95 if not (self._images_folder.exists() and self._images_folder.is_dir()):
96 return False
97
98 for id in ["label", "setid"]:
99 filename, md5 = self._file_dict[id]
100 if not check_integrity(str(self._base_folder / filename), md5):
101 return False
102 return True
103
104 def download(self):
105 if self._check_integrity():
106 return
107 download_and_extract_archive(
108 f"{self._download_url_prefix}{self._file_dict['image'][0]}",
109 str(self._base_folder),
110 md5=self._file_dict["image"][1],
111 )
112 for id in ["label", "setid"]:
113 filename, md5 = self._file_dict[id]
114 download_url(self._download_url_prefix + filename, str(self._base_folder), md5=md5)
115
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/torchvision/datasets/flowers102.py b/torchvision/datasets/flowers102.py
--- a/torchvision/datasets/flowers102.py
+++ b/torchvision/datasets/flowers102.py
@@ -65,7 +65,7 @@
image_ids = set_ids[self._splits_map[self._split]].tolist()
labels = loadmat(self._base_folder / self._file_dict["label"][0], squeeze_me=True)
- image_id_to_label = dict(enumerate(labels["labels"].tolist(), 1))
+ image_id_to_label = dict(enumerate((labels["labels"] - 1).tolist(), 1))
self._labels = []
self._image_files = []
| {"golden_diff": "diff --git a/torchvision/datasets/flowers102.py b/torchvision/datasets/flowers102.py\n--- a/torchvision/datasets/flowers102.py\n+++ b/torchvision/datasets/flowers102.py\n@@ -65,7 +65,7 @@\n image_ids = set_ids[self._splits_map[self._split]].tolist()\n \n labels = loadmat(self._base_folder / self._file_dict[\"label\"][0], squeeze_me=True)\n- image_id_to_label = dict(enumerate(labels[\"labels\"].tolist(), 1))\n+ image_id_to_label = dict(enumerate((labels[\"labels\"] - 1).tolist(), 1))\n \n self._labels = []\n self._image_files = []\n", "issue": "Wrong label mapping in vision/torchvision/datasets/flowers102.py\n### \ud83d\udc1b Describe the bug\n\nfrom line 70 in the file vision/torchvision/datasets/flowers102.py\r\n```\r\nself._labels = []\r\nself._image_files = []\r\nfor image_id in image_ids:\r\n self._labels.append(image_id_to_label[image_id]) # Note: the bug is here, current the labels are ranged from 1 to 102, which should be ranged from 0 to 101, consider to change this line into self._labels.append(image_id_to_label[image_id] - 1) ?\r\n self._image_files.append(self._images_folder / f\"image_{image_id:05d}.jpg\")\r\n```\n\n### Versions\n\nPyTorch version: 1.11.0\r\nIs debug build: False\r\nCUDA used to build PyTorch: 11.3\r\nROCM used to build PyTorch: N/A\r\n\r\nOS: Ubuntu 20.04.4 LTS (x86_64)\r\nGCC version: (Ubuntu 9.4.0-1ubuntu1~20.04.1) 9.4.0\r\nClang version: Could not collect\r\nCMake version: version 3.16.3\r\nLibc version: glibc-2.31\r\n\r\nPython version: 3.8.13 (default, Mar 28 2022, 11:38:47) [GCC 7.5.0] (64-bit runtime)\r\nPython platform: Linux-5.4.0-104-generic-x86_64-with-glibc2.17\r\nIs CUDA available: True\r\nCUDA runtime version: Could not collect\r\nGPU models and configuration: GPU 0: NVIDIA GeForce RTX 3090\r\nNvidia driver version: 510.54\r\ncuDNN version: Could not collect\r\nHIP runtime version: N/A\r\nMIOpen runtime version: N/A\r\nIs XNNPACK available: True\r\n\r\nVersions of relevant libraries:\r\n[pip3] advertorch==0.2.3\r\n[pip3] numpy==1.21.2\r\n[pip3] torch==1.11.0\r\n[pip3] torchaudio==0.11.0\r\n[pip3] torchvision==0.7.0\r\n[conda] advertorch 0.2.3 pypi_0 pypi\r\n[conda] blas 1.0 mkl https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free\r\n[conda] cudatoolkit 11.3.1 h2bc3f7f_2 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main\r\n[conda] ffmpeg 4.3 hf484d3e_0 pytorch\r\n[conda] mkl 2021.4.0 h06a4308_640 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main\r\n[conda] mkl-service 2.4.0 py38h7f8727e_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main\r\n[conda] mkl_fft 1.3.1 py38hd3c417c_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main\r\n[conda] mkl_random 1.2.2 py38h51133e4_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main\r\n[conda] numpy 1.21.2 py38h20f2e39_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main\r\n[conda] numpy-base 1.21.2 py38h79a1101_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main\r\n[conda] pytorch 1.11.0 py3.8_cuda11.3_cudnn8.2.0_0 pytorch\r\n[conda] pytorch-mutex 1.0 cuda pytorch\r\n[conda] torchaudio 0.11.0 py38_cu113 pytorch\r\n[conda] torchvision 0.7.0 pypi_0 pypi\n\ncc @pmeier @YosuaMichael\n", "before_files": [{"content": "from pathlib import Path\nfrom typing import Any, Tuple, Callable, Optional\n\nimport PIL.Image\n\nfrom .utils import check_integrity, download_and_extract_archive, download_url, verify_str_arg\nfrom .vision import VisionDataset\n\n\nclass Flowers102(VisionDataset):\n \"\"\"`Oxford 102 Flower <https://www.robots.ox.ac.uk/~vgg/data/flowers/102/>`_ Dataset.\n\n .. warning::\n\n This class needs `scipy <https://docs.scipy.org/doc/>`_ to load target files from `.mat` format.\n\n Oxford 102 Flower is an image classification dataset consisting of 102 flower categories. The\n flowers were chosen to be flowers commonly occurring in the United Kingdom. Each class consists of\n between 40 and 258 images.\n\n The images have large scale, pose and light variations. In addition, there are categories that\n have large variations within the category, and several very similar categories.\n\n Args:\n root (string): Root directory of the dataset.\n split (string, optional): The dataset split, supports ``\"train\"`` (default), ``\"val\"``, or ``\"test\"``.\n transform (callable, optional): A function/transform that takes in an PIL image and returns a\n transformed version. E.g, ``transforms.RandomCrop``.\n target_transform (callable, optional): A function/transform that takes in the target and transforms it.\n download (bool, optional): If true, downloads the dataset from the internet and\n puts it in root directory. If dataset is already downloaded, it is not\n downloaded again.\n \"\"\"\n\n _download_url_prefix = \"https://www.robots.ox.ac.uk/~vgg/data/flowers/102/\"\n _file_dict = { # filename, md5\n \"image\": (\"102flowers.tgz\", \"52808999861908f626f3c1f4e79d11fa\"),\n \"label\": (\"imagelabels.mat\", \"e0620be6f572b9609742df49c70aed4d\"),\n \"setid\": (\"setid.mat\", \"a5357ecc9cb78c4bef273ce3793fc85c\"),\n }\n _splits_map = {\"train\": \"trnid\", \"val\": \"valid\", \"test\": \"tstid\"}\n\n def __init__(\n self,\n root: str,\n split: str = \"train\",\n transform: Optional[Callable] = None,\n target_transform: Optional[Callable] = None,\n download: bool = False,\n ) -> None:\n super().__init__(root, transform=transform, target_transform=target_transform)\n self._split = verify_str_arg(split, \"split\", (\"train\", \"val\", \"test\"))\n self._base_folder = Path(self.root) / \"flowers-102\"\n self._images_folder = self._base_folder / \"jpg\"\n\n if download:\n self.download()\n\n if not self._check_integrity():\n raise RuntimeError(\"Dataset not found or corrupted. You can use download=True to download it\")\n\n from scipy.io import loadmat\n\n set_ids = loadmat(self._base_folder / self._file_dict[\"setid\"][0], squeeze_me=True)\n image_ids = set_ids[self._splits_map[self._split]].tolist()\n\n labels = loadmat(self._base_folder / self._file_dict[\"label\"][0], squeeze_me=True)\n image_id_to_label = dict(enumerate(labels[\"labels\"].tolist(), 1))\n\n self._labels = []\n self._image_files = []\n for image_id in image_ids:\n self._labels.append(image_id_to_label[image_id])\n self._image_files.append(self._images_folder / f\"image_{image_id:05d}.jpg\")\n\n def __len__(self) -> int:\n return len(self._image_files)\n\n def __getitem__(self, idx) -> Tuple[Any, Any]:\n image_file, label = self._image_files[idx], self._labels[idx]\n image = PIL.Image.open(image_file).convert(\"RGB\")\n\n if self.transform:\n image = self.transform(image)\n\n if self.target_transform:\n label = self.target_transform(label)\n\n return image, label\n\n def extra_repr(self) -> str:\n return f\"split={self._split}\"\n\n def _check_integrity(self):\n if not (self._images_folder.exists() and self._images_folder.is_dir()):\n return False\n\n for id in [\"label\", \"setid\"]:\n filename, md5 = self._file_dict[id]\n if not check_integrity(str(self._base_folder / filename), md5):\n return False\n return True\n\n def download(self):\n if self._check_integrity():\n return\n download_and_extract_archive(\n f\"{self._download_url_prefix}{self._file_dict['image'][0]}\",\n str(self._base_folder),\n md5=self._file_dict[\"image\"][1],\n )\n for id in [\"label\", \"setid\"]:\n filename, md5 = self._file_dict[id]\n download_url(self._download_url_prefix + filename, str(self._base_folder), md5=md5)\n", "path": "torchvision/datasets/flowers102.py"}], "after_files": [{"content": "from pathlib import Path\nfrom typing import Any, Tuple, Callable, Optional\n\nimport PIL.Image\n\nfrom .utils import check_integrity, download_and_extract_archive, download_url, verify_str_arg\nfrom .vision import VisionDataset\n\n\nclass Flowers102(VisionDataset):\n \"\"\"`Oxford 102 Flower <https://www.robots.ox.ac.uk/~vgg/data/flowers/102/>`_ Dataset.\n\n .. warning::\n\n This class needs `scipy <https://docs.scipy.org/doc/>`_ to load target files from `.mat` format.\n\n Oxford 102 Flower is an image classification dataset consisting of 102 flower categories. The\n flowers were chosen to be flowers commonly occurring in the United Kingdom. Each class consists of\n between 40 and 258 images.\n\n The images have large scale, pose and light variations. In addition, there are categories that\n have large variations within the category, and several very similar categories.\n\n Args:\n root (string): Root directory of the dataset.\n split (string, optional): The dataset split, supports ``\"train\"`` (default), ``\"val\"``, or ``\"test\"``.\n transform (callable, optional): A function/transform that takes in an PIL image and returns a\n transformed version. E.g, ``transforms.RandomCrop``.\n target_transform (callable, optional): A function/transform that takes in the target and transforms it.\n download (bool, optional): If true, downloads the dataset from the internet and\n puts it in root directory. If dataset is already downloaded, it is not\n downloaded again.\n \"\"\"\n\n _download_url_prefix = \"https://www.robots.ox.ac.uk/~vgg/data/flowers/102/\"\n _file_dict = { # filename, md5\n \"image\": (\"102flowers.tgz\", \"52808999861908f626f3c1f4e79d11fa\"),\n \"label\": (\"imagelabels.mat\", \"e0620be6f572b9609742df49c70aed4d\"),\n \"setid\": (\"setid.mat\", \"a5357ecc9cb78c4bef273ce3793fc85c\"),\n }\n _splits_map = {\"train\": \"trnid\", \"val\": \"valid\", \"test\": \"tstid\"}\n\n def __init__(\n self,\n root: str,\n split: str = \"train\",\n transform: Optional[Callable] = None,\n target_transform: Optional[Callable] = None,\n download: bool = False,\n ) -> None:\n super().__init__(root, transform=transform, target_transform=target_transform)\n self._split = verify_str_arg(split, \"split\", (\"train\", \"val\", \"test\"))\n self._base_folder = Path(self.root) / \"flowers-102\"\n self._images_folder = self._base_folder / \"jpg\"\n\n if download:\n self.download()\n\n if not self._check_integrity():\n raise RuntimeError(\"Dataset not found or corrupted. You can use download=True to download it\")\n\n from scipy.io import loadmat\n\n set_ids = loadmat(self._base_folder / self._file_dict[\"setid\"][0], squeeze_me=True)\n image_ids = set_ids[self._splits_map[self._split]].tolist()\n\n labels = loadmat(self._base_folder / self._file_dict[\"label\"][0], squeeze_me=True)\n image_id_to_label = dict(enumerate((labels[\"labels\"] - 1).tolist(), 1))\n\n self._labels = []\n self._image_files = []\n for image_id in image_ids:\n self._labels.append(image_id_to_label[image_id])\n self._image_files.append(self._images_folder / f\"image_{image_id:05d}.jpg\")\n\n def __len__(self) -> int:\n return len(self._image_files)\n\n def __getitem__(self, idx) -> Tuple[Any, Any]:\n image_file, label = self._image_files[idx], self._labels[idx]\n image = PIL.Image.open(image_file).convert(\"RGB\")\n\n if self.transform:\n image = self.transform(image)\n\n if self.target_transform:\n label = self.target_transform(label)\n\n return image, label\n\n def extra_repr(self) -> str:\n return f\"split={self._split}\"\n\n def _check_integrity(self):\n if not (self._images_folder.exists() and self._images_folder.is_dir()):\n return False\n\n for id in [\"label\", \"setid\"]:\n filename, md5 = self._file_dict[id]\n if not check_integrity(str(self._base_folder / filename), md5):\n return False\n return True\n\n def download(self):\n if self._check_integrity():\n return\n download_and_extract_archive(\n f\"{self._download_url_prefix}{self._file_dict['image'][0]}\",\n str(self._base_folder),\n md5=self._file_dict[\"image\"][1],\n )\n for id in [\"label\", \"setid\"]:\n filename, md5 = self._file_dict[id]\n download_url(self._download_url_prefix + filename, str(self._base_folder), md5=md5)\n", "path": "torchvision/datasets/flowers102.py"}]} | 2,717 | 166 |
gh_patches_debug_11370 | rasdani/github-patches | git_diff | googleapis__google-cloud-python-901 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Subscription.pull() fails with KeyError
Hi,
My code below fails with a KeyError coming from the library:
``` python
sub = Subscription(SUBSCRIPTION_NAME, Topic(INCOMING_TOPIC_NAME))
while True:
try:
msgs = sub.pull()
...
except:
msg = 'poll %s: %s' % (SUBSCRIPTION_NAME, traceback.format_exc())
print msg
```
Fails with the message:
```
poll extractors: Traceback (most recent call last):
File "main.py", line 46, in main
msgs = sub.pull()
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/subscription.py", line 176, in pull
for info in response['receivedMessages']]
KeyError: 'receivedMessages'
```
This seems like a bug on the pubsub library code.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gcloud/pubsub/subscription.py`
Content:
```
1 # Copyright 2015 Google Inc. All rights reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """Define API Subscriptions."""
16
17 from gcloud.exceptions import NotFound
18 from gcloud.pubsub.message import Message
19 from gcloud.pubsub.topic import Topic
20 from gcloud.pubsub._implicit_environ import _require_connection
21
22
23 class Subscription(object):
24 """Subscriptions receive messages published to their topics.
25
26 See:
27 https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions
28
29 :type name: string
30 :param name: the name of the subscription
31
32 :type topic: :class:`gcloud.pubsub.topic.Topic`
33 :param topic: the topic to which the subscription belongs..
34
35 :type ack_deadline: int
36 :param ack_deadline: the deadline (in seconds) by which messages pulled
37 from the back-end must be acknowledged.
38
39 :type push_endpoint: string
40 :param push_endpoint: URL to which messages will be pushed by the back-end.
41 If not set, the application must pull messages.
42 """
43 def __init__(self, name, topic, ack_deadline=None, push_endpoint=None):
44 self.name = name
45 self.topic = topic
46 self.ack_deadline = ack_deadline
47 self.push_endpoint = push_endpoint
48
49 @classmethod
50 def from_api_repr(cls, resource, topics=None):
51 """Factory: construct a topic given its API representation
52
53 :type resource: dict
54 :param resource: topic resource representation returned from the API
55
56 :type topics: dict or None
57 :param topics: A mapping of topic names -> topics. If not passed,
58 the subscription will have a newly-created topic.
59
60 :rtype: :class:`gcloud.pubsub.subscription.Subscription`
61 """
62 if topics is None:
63 topics = {}
64 t_name = resource['topic']
65 topic = topics.get(t_name)
66 if topic is None:
67 topic = topics[t_name] = Topic.from_api_repr({'name': t_name})
68 _, _, _, name = resource['name'].split('/')
69 ack_deadline = resource.get('ackDeadlineSeconds')
70 push_config = resource.get('pushConfig', {})
71 push_endpoint = push_config.get('pushEndpoint')
72 return cls(name, topic, ack_deadline, push_endpoint)
73
74 @property
75 def path(self):
76 """URL path for the subscription's APIs"""
77 project = self.topic.project
78 return '/projects/%s/subscriptions/%s' % (project, self.name)
79
80 def create(self, connection=None):
81 """API call: create the subscription via a PUT request
82
83 See:
84 https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/create
85
86 :type connection: :class:`gcloud.pubsub.connection.Connection` or None
87 :param connection: the connection to use. If not passed,
88 falls back to the topic's connection.
89 """
90 data = {'topic': self.topic.full_name}
91
92 if self.ack_deadline is not None:
93 data['ackDeadline'] = self.ack_deadline
94
95 if self.push_endpoint is not None:
96 data['pushConfig'] = {'pushEndpoint': self.push_endpoint}
97
98 connection = _require_connection(connection)
99 connection.api_request(method='PUT', path=self.path, data=data)
100
101 def exists(self, connection=None):
102 """API call: test existence of the subscription via a GET request
103
104 See
105 https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/get
106
107 :type connection: :class:`gcloud.pubsub.connection.Connection` or None
108 :param connection: the connection to use. If not passed,
109 falls back to the topic's connection.
110 """
111 connection = _require_connection(connection)
112 try:
113 connection.api_request(method='GET', path=self.path)
114 except NotFound:
115 return False
116 else:
117 return True
118
119 def reload(self, connection=None):
120 """API call: sync local subscription configuration via a GET request
121
122 See
123 https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/get
124
125 :type connection: :class:`gcloud.pubsub.connection.Connection` or None
126 :param connection: the connection to use. If not passed,
127 falls back to the topic's connection.
128 """
129 connection = _require_connection(connection)
130 data = connection.api_request(method='GET', path=self.path)
131 self.ack_deadline = data.get('ackDeadline')
132 push_config = data.get('pushConfig', {})
133 self.push_endpoint = push_config.get('pushEndpoint')
134
135 def modify_push_configuration(self, push_endpoint, connection=None):
136 """API call: update the push endpoint for the subscription.
137
138 See:
139 https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/modifyPushConfig
140
141 :type push_endpoint: string
142 :param push_endpoint: URL to which messages will be pushed by the
143 back-end. If None, the application must pull
144 messages.
145
146 :type connection: :class:`gcloud.pubsub.connection.Connection` or None
147 :param connection: the connection to use. If not passed,
148 falls back to the topic's connection.
149 """
150 connection = _require_connection(connection)
151 data = {}
152 config = data['pushConfig'] = {}
153 if push_endpoint is not None:
154 config['pushEndpoint'] = push_endpoint
155 connection.api_request(method='POST',
156 path='%s:modifyPushConfig' % self.path,
157 data=data)
158 self.push_endpoint = push_endpoint
159
160 def pull(self, return_immediately=False, max_messages=1, connection=None):
161 """API call: retrieve messages for the subscription.
162
163 See:
164 https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/pull
165
166 :type return_immediately: boolean
167 :param return_immediately: if True, the back-end returns even if no
168 messages are available; if False, the API
169 call blocks until one or more messages are
170 available.
171
172 :type max_messages: int
173 :param max_messages: the maximum number of messages to return.
174
175 :type connection: :class:`gcloud.pubsub.connection.Connection` or None
176 :param connection: the connection to use. If not passed,
177 falls back to the topic's connection.
178
179 :rtype: list of (ack_id, message) tuples
180 :returns: sequence of tuples: ``ack_id`` is the ID to be used in a
181 subsequent call to :meth:`acknowledge`, and ``message``
182 is an instance of :class:`gcloud.pubsub.message.Message`.
183 """
184 connection = _require_connection(connection)
185 data = {'returnImmediately': return_immediately,
186 'maxMessages': max_messages}
187 response = connection.api_request(method='POST',
188 path='%s:pull' % self.path,
189 data=data)
190 return [(info['ackId'], Message.from_api_repr(info['message']))
191 for info in response['receivedMessages']]
192
193 def acknowledge(self, ack_ids, connection=None):
194 """API call: acknowledge retrieved messages for the subscription.
195
196 See:
197 https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/acknowledge
198
199 :type ack_ids: list of string
200 :param ack_ids: ack IDs of messages being acknowledged
201
202 :type connection: :class:`gcloud.pubsub.connection.Connection` or None
203 :param connection: the connection to use. If not passed,
204 falls back to the topic's connection.
205 """
206 connection = _require_connection(connection)
207 data = {'ackIds': ack_ids}
208 connection.api_request(method='POST',
209 path='%s:acknowledge' % self.path,
210 data=data)
211
212 def modify_ack_deadline(self, ack_id, ack_deadline, connection=None):
213 """API call: update acknowledgement deadline for a retrieved message.
214
215 See:
216 https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/acknowledge
217
218 :type ack_id: string
219 :param ack_id: ack ID of message being updated
220
221 :type ack_deadline: int
222 :param ack_deadline: new deadline for the message, in seconds
223
224 :type connection: :class:`gcloud.pubsub.connection.Connection` or None
225 :param connection: the connection to use. If not passed,
226 falls back to the topic's connection.
227 """
228 connection = _require_connection(connection)
229 data = {'ackId': ack_id, 'ackDeadlineSeconds': ack_deadline}
230 connection.api_request(method='POST',
231 path='%s:modifyAckDeadline' % self.path,
232 data=data)
233
234 def delete(self, connection=None):
235 """API call: delete the subscription via a DELETE request.
236
237 See:
238 https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/delete
239
240 :type connection: :class:`gcloud.pubsub.connection.Connection` or None
241 :param connection: the connection to use. If not passed,
242 falls back to the topic's connection.
243 """
244 connection = _require_connection(connection)
245 connection.api_request(method='DELETE', path=self.path)
246
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/gcloud/pubsub/subscription.py b/gcloud/pubsub/subscription.py
--- a/gcloud/pubsub/subscription.py
+++ b/gcloud/pubsub/subscription.py
@@ -188,7 +188,7 @@
path='%s:pull' % self.path,
data=data)
return [(info['ackId'], Message.from_api_repr(info['message']))
- for info in response['receivedMessages']]
+ for info in response.get('receivedMessages', ())]
def acknowledge(self, ack_ids, connection=None):
"""API call: acknowledge retrieved messages for the subscription.
| {"golden_diff": "diff --git a/gcloud/pubsub/subscription.py b/gcloud/pubsub/subscription.py\n--- a/gcloud/pubsub/subscription.py\n+++ b/gcloud/pubsub/subscription.py\n@@ -188,7 +188,7 @@\n path='%s:pull' % self.path,\n data=data)\n return [(info['ackId'], Message.from_api_repr(info['message']))\n- for info in response['receivedMessages']]\n+ for info in response.get('receivedMessages', ())]\n \n def acknowledge(self, ack_ids, connection=None):\n \"\"\"API call: acknowledge retrieved messages for the subscription.\n", "issue": "Subscription.pull() fails with KeyError\nHi,\n\nMy code below fails with a KeyError coming from the library:\n\n``` python\n sub = Subscription(SUBSCRIPTION_NAME, Topic(INCOMING_TOPIC_NAME))\n\n while True:\n try:\n msgs = sub.pull()\n ...\n except:\n msg = 'poll %s: %s' % (SUBSCRIPTION_NAME, traceback.format_exc())\n print msg\n```\n\nFails with the message:\n\n```\npoll extractors: Traceback (most recent call last):\n File \"main.py\", line 46, in main\n msgs = sub.pull()\n File \"/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/subscription.py\", line 176, in pull\n for info in response['receivedMessages']]\nKeyError: 'receivedMessages'\n```\n\nThis seems like a bug on the pubsub library code.\n\n", "before_files": [{"content": "# Copyright 2015 Google Inc. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Define API Subscriptions.\"\"\"\n\nfrom gcloud.exceptions import NotFound\nfrom gcloud.pubsub.message import Message\nfrom gcloud.pubsub.topic import Topic\nfrom gcloud.pubsub._implicit_environ import _require_connection\n\n\nclass Subscription(object):\n \"\"\"Subscriptions receive messages published to their topics.\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions\n\n :type name: string\n :param name: the name of the subscription\n\n :type topic: :class:`gcloud.pubsub.topic.Topic`\n :param topic: the topic to which the subscription belongs..\n\n :type ack_deadline: int\n :param ack_deadline: the deadline (in seconds) by which messages pulled\n from the back-end must be acknowledged.\n\n :type push_endpoint: string\n :param push_endpoint: URL to which messages will be pushed by the back-end.\n If not set, the application must pull messages.\n \"\"\"\n def __init__(self, name, topic, ack_deadline=None, push_endpoint=None):\n self.name = name\n self.topic = topic\n self.ack_deadline = ack_deadline\n self.push_endpoint = push_endpoint\n\n @classmethod\n def from_api_repr(cls, resource, topics=None):\n \"\"\"Factory: construct a topic given its API representation\n\n :type resource: dict\n :param resource: topic resource representation returned from the API\n\n :type topics: dict or None\n :param topics: A mapping of topic names -> topics. If not passed,\n the subscription will have a newly-created topic.\n\n :rtype: :class:`gcloud.pubsub.subscription.Subscription`\n \"\"\"\n if topics is None:\n topics = {}\n t_name = resource['topic']\n topic = topics.get(t_name)\n if topic is None:\n topic = topics[t_name] = Topic.from_api_repr({'name': t_name})\n _, _, _, name = resource['name'].split('/')\n ack_deadline = resource.get('ackDeadlineSeconds')\n push_config = resource.get('pushConfig', {})\n push_endpoint = push_config.get('pushEndpoint')\n return cls(name, topic, ack_deadline, push_endpoint)\n\n @property\n def path(self):\n \"\"\"URL path for the subscription's APIs\"\"\"\n project = self.topic.project\n return '/projects/%s/subscriptions/%s' % (project, self.name)\n\n def create(self, connection=None):\n \"\"\"API call: create the subscription via a PUT request\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/create\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n data = {'topic': self.topic.full_name}\n\n if self.ack_deadline is not None:\n data['ackDeadline'] = self.ack_deadline\n\n if self.push_endpoint is not None:\n data['pushConfig'] = {'pushEndpoint': self.push_endpoint}\n\n connection = _require_connection(connection)\n connection.api_request(method='PUT', path=self.path, data=data)\n\n def exists(self, connection=None):\n \"\"\"API call: test existence of the subscription via a GET request\n\n See\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/get\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n connection = _require_connection(connection)\n try:\n connection.api_request(method='GET', path=self.path)\n except NotFound:\n return False\n else:\n return True\n\n def reload(self, connection=None):\n \"\"\"API call: sync local subscription configuration via a GET request\n\n See\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/get\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n connection = _require_connection(connection)\n data = connection.api_request(method='GET', path=self.path)\n self.ack_deadline = data.get('ackDeadline')\n push_config = data.get('pushConfig', {})\n self.push_endpoint = push_config.get('pushEndpoint')\n\n def modify_push_configuration(self, push_endpoint, connection=None):\n \"\"\"API call: update the push endpoint for the subscription.\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/modifyPushConfig\n\n :type push_endpoint: string\n :param push_endpoint: URL to which messages will be pushed by the\n back-end. If None, the application must pull\n messages.\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n connection = _require_connection(connection)\n data = {}\n config = data['pushConfig'] = {}\n if push_endpoint is not None:\n config['pushEndpoint'] = push_endpoint\n connection.api_request(method='POST',\n path='%s:modifyPushConfig' % self.path,\n data=data)\n self.push_endpoint = push_endpoint\n\n def pull(self, return_immediately=False, max_messages=1, connection=None):\n \"\"\"API call: retrieve messages for the subscription.\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/pull\n\n :type return_immediately: boolean\n :param return_immediately: if True, the back-end returns even if no\n messages are available; if False, the API\n call blocks until one or more messages are\n available.\n\n :type max_messages: int\n :param max_messages: the maximum number of messages to return.\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n\n :rtype: list of (ack_id, message) tuples\n :returns: sequence of tuples: ``ack_id`` is the ID to be used in a\n subsequent call to :meth:`acknowledge`, and ``message``\n is an instance of :class:`gcloud.pubsub.message.Message`.\n \"\"\"\n connection = _require_connection(connection)\n data = {'returnImmediately': return_immediately,\n 'maxMessages': max_messages}\n response = connection.api_request(method='POST',\n path='%s:pull' % self.path,\n data=data)\n return [(info['ackId'], Message.from_api_repr(info['message']))\n for info in response['receivedMessages']]\n\n def acknowledge(self, ack_ids, connection=None):\n \"\"\"API call: acknowledge retrieved messages for the subscription.\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/acknowledge\n\n :type ack_ids: list of string\n :param ack_ids: ack IDs of messages being acknowledged\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n connection = _require_connection(connection)\n data = {'ackIds': ack_ids}\n connection.api_request(method='POST',\n path='%s:acknowledge' % self.path,\n data=data)\n\n def modify_ack_deadline(self, ack_id, ack_deadline, connection=None):\n \"\"\"API call: update acknowledgement deadline for a retrieved message.\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/acknowledge\n\n :type ack_id: string\n :param ack_id: ack ID of message being updated\n\n :type ack_deadline: int\n :param ack_deadline: new deadline for the message, in seconds\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n connection = _require_connection(connection)\n data = {'ackId': ack_id, 'ackDeadlineSeconds': ack_deadline}\n connection.api_request(method='POST',\n path='%s:modifyAckDeadline' % self.path,\n data=data)\n\n def delete(self, connection=None):\n \"\"\"API call: delete the subscription via a DELETE request.\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/delete\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n connection = _require_connection(connection)\n connection.api_request(method='DELETE', path=self.path)\n", "path": "gcloud/pubsub/subscription.py"}], "after_files": [{"content": "# Copyright 2015 Google Inc. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Define API Subscriptions.\"\"\"\n\nfrom gcloud.exceptions import NotFound\nfrom gcloud.pubsub.message import Message\nfrom gcloud.pubsub.topic import Topic\nfrom gcloud.pubsub._implicit_environ import _require_connection\n\n\nclass Subscription(object):\n \"\"\"Subscriptions receive messages published to their topics.\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions\n\n :type name: string\n :param name: the name of the subscription\n\n :type topic: :class:`gcloud.pubsub.topic.Topic`\n :param topic: the topic to which the subscription belongs..\n\n :type ack_deadline: int\n :param ack_deadline: the deadline (in seconds) by which messages pulled\n from the back-end must be acknowledged.\n\n :type push_endpoint: string\n :param push_endpoint: URL to which messages will be pushed by the back-end.\n If not set, the application must pull messages.\n \"\"\"\n def __init__(self, name, topic, ack_deadline=None, push_endpoint=None):\n self.name = name\n self.topic = topic\n self.ack_deadline = ack_deadline\n self.push_endpoint = push_endpoint\n\n @classmethod\n def from_api_repr(cls, resource, topics=None):\n \"\"\"Factory: construct a topic given its API representation\n\n :type resource: dict\n :param resource: topic resource representation returned from the API\n\n :type topics: dict or None\n :param topics: A mapping of topic names -> topics. If not passed,\n the subscription will have a newly-created topic.\n\n :rtype: :class:`gcloud.pubsub.subscription.Subscription`\n \"\"\"\n if topics is None:\n topics = {}\n t_name = resource['topic']\n topic = topics.get(t_name)\n if topic is None:\n topic = topics[t_name] = Topic.from_api_repr({'name': t_name})\n _, _, _, name = resource['name'].split('/')\n ack_deadline = resource.get('ackDeadlineSeconds')\n push_config = resource.get('pushConfig', {})\n push_endpoint = push_config.get('pushEndpoint')\n return cls(name, topic, ack_deadline, push_endpoint)\n\n @property\n def path(self):\n \"\"\"URL path for the subscription's APIs\"\"\"\n project = self.topic.project\n return '/projects/%s/subscriptions/%s' % (project, self.name)\n\n def create(self, connection=None):\n \"\"\"API call: create the subscription via a PUT request\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/create\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n data = {'topic': self.topic.full_name}\n\n if self.ack_deadline is not None:\n data['ackDeadline'] = self.ack_deadline\n\n if self.push_endpoint is not None:\n data['pushConfig'] = {'pushEndpoint': self.push_endpoint}\n\n connection = _require_connection(connection)\n connection.api_request(method='PUT', path=self.path, data=data)\n\n def exists(self, connection=None):\n \"\"\"API call: test existence of the subscription via a GET request\n\n See\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/get\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n connection = _require_connection(connection)\n try:\n connection.api_request(method='GET', path=self.path)\n except NotFound:\n return False\n else:\n return True\n\n def reload(self, connection=None):\n \"\"\"API call: sync local subscription configuration via a GET request\n\n See\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/get\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n connection = _require_connection(connection)\n data = connection.api_request(method='GET', path=self.path)\n self.ack_deadline = data.get('ackDeadline')\n push_config = data.get('pushConfig', {})\n self.push_endpoint = push_config.get('pushEndpoint')\n\n def modify_push_configuration(self, push_endpoint, connection=None):\n \"\"\"API call: update the push endpoint for the subscription.\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/modifyPushConfig\n\n :type push_endpoint: string\n :param push_endpoint: URL to which messages will be pushed by the\n back-end. If None, the application must pull\n messages.\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n connection = _require_connection(connection)\n data = {}\n config = data['pushConfig'] = {}\n if push_endpoint is not None:\n config['pushEndpoint'] = push_endpoint\n connection.api_request(method='POST',\n path='%s:modifyPushConfig' % self.path,\n data=data)\n self.push_endpoint = push_endpoint\n\n def pull(self, return_immediately=False, max_messages=1, connection=None):\n \"\"\"API call: retrieve messages for the subscription.\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/pull\n\n :type return_immediately: boolean\n :param return_immediately: if True, the back-end returns even if no\n messages are available; if False, the API\n call blocks until one or more messages are\n available.\n\n :type max_messages: int\n :param max_messages: the maximum number of messages to return.\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n\n :rtype: list of (ack_id, message) tuples\n :returns: sequence of tuples: ``ack_id`` is the ID to be used in a\n subsequent call to :meth:`acknowledge`, and ``message``\n is an instance of :class:`gcloud.pubsub.message.Message`.\n \"\"\"\n connection = _require_connection(connection)\n data = {'returnImmediately': return_immediately,\n 'maxMessages': max_messages}\n response = connection.api_request(method='POST',\n path='%s:pull' % self.path,\n data=data)\n return [(info['ackId'], Message.from_api_repr(info['message']))\n for info in response.get('receivedMessages', ())]\n\n def acknowledge(self, ack_ids, connection=None):\n \"\"\"API call: acknowledge retrieved messages for the subscription.\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/acknowledge\n\n :type ack_ids: list of string\n :param ack_ids: ack IDs of messages being acknowledged\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n connection = _require_connection(connection)\n data = {'ackIds': ack_ids}\n connection.api_request(method='POST',\n path='%s:acknowledge' % self.path,\n data=data)\n\n def modify_ack_deadline(self, ack_id, ack_deadline, connection=None):\n \"\"\"API call: update acknowledgement deadline for a retrieved message.\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/acknowledge\n\n :type ack_id: string\n :param ack_id: ack ID of message being updated\n\n :type ack_deadline: int\n :param ack_deadline: new deadline for the message, in seconds\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n connection = _require_connection(connection)\n data = {'ackId': ack_id, 'ackDeadlineSeconds': ack_deadline}\n connection.api_request(method='POST',\n path='%s:modifyAckDeadline' % self.path,\n data=data)\n\n def delete(self, connection=None):\n \"\"\"API call: delete the subscription via a DELETE request.\n\n See:\n https://cloud.google.com/pubsub/reference/rest/v1beta2/projects/subscriptions/delete\n\n :type connection: :class:`gcloud.pubsub.connection.Connection` or None\n :param connection: the connection to use. If not passed,\n falls back to the topic's connection.\n \"\"\"\n connection = _require_connection(connection)\n connection.api_request(method='DELETE', path=self.path)\n", "path": "gcloud/pubsub/subscription.py"}]} | 3,196 | 133 |
gh_patches_debug_20733 | rasdani/github-patches | git_diff | pyca__cryptography-3539 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove `return True` in OpenSSL ecdsa verify implementation
https://github.com/pyca/cryptography/blob/master/src/cryptography/hazmat/backends/openssl/ec.py#L89
This isn't part of our documented API
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cryptography/hazmat/backends/openssl/ec.py`
Content:
```
1 # This file is dual licensed under the terms of the Apache License, Version
2 # 2.0, and the BSD License. See the LICENSE file in the root of this repository
3 # for complete details.
4
5 from __future__ import absolute_import, division, print_function
6
7 from cryptography import utils
8 from cryptography.exceptions import (
9 InvalidSignature, UnsupportedAlgorithm, _Reasons
10 )
11 from cryptography.hazmat.backends.openssl.utils import (
12 _calculate_digest_and_algorithm
13 )
14 from cryptography.hazmat.primitives import hashes, serialization
15 from cryptography.hazmat.primitives.asymmetric import (
16 AsymmetricSignatureContext, AsymmetricVerificationContext, ec
17 )
18
19
20 def _check_signature_algorithm(signature_algorithm):
21 if not isinstance(signature_algorithm, ec.ECDSA):
22 raise UnsupportedAlgorithm(
23 "Unsupported elliptic curve signature algorithm.",
24 _Reasons.UNSUPPORTED_PUBLIC_KEY_ALGORITHM)
25
26
27 def _ec_key_curve_sn(backend, ec_key):
28 group = backend._lib.EC_KEY_get0_group(ec_key)
29 backend.openssl_assert(group != backend._ffi.NULL)
30
31 nid = backend._lib.EC_GROUP_get_curve_name(group)
32 # The following check is to find EC keys with unnamed curves and raise
33 # an error for now.
34 if nid == backend._lib.NID_undef:
35 raise NotImplementedError(
36 "ECDSA certificates with unnamed curves are unsupported "
37 "at this time"
38 )
39
40 curve_name = backend._lib.OBJ_nid2sn(nid)
41 backend.openssl_assert(curve_name != backend._ffi.NULL)
42
43 sn = backend._ffi.string(curve_name).decode('ascii')
44 return sn
45
46
47 def _mark_asn1_named_ec_curve(backend, ec_cdata):
48 """
49 Set the named curve flag on the EC_KEY. This causes OpenSSL to
50 serialize EC keys along with their curve OID which makes
51 deserialization easier.
52 """
53
54 backend._lib.EC_KEY_set_asn1_flag(
55 ec_cdata, backend._lib.OPENSSL_EC_NAMED_CURVE
56 )
57
58
59 def _sn_to_elliptic_curve(backend, sn):
60 try:
61 return ec._CURVE_TYPES[sn]()
62 except KeyError:
63 raise UnsupportedAlgorithm(
64 "{0} is not a supported elliptic curve".format(sn),
65 _Reasons.UNSUPPORTED_ELLIPTIC_CURVE
66 )
67
68
69 def _ecdsa_sig_sign(backend, private_key, data):
70 max_size = backend._lib.ECDSA_size(private_key._ec_key)
71 backend.openssl_assert(max_size > 0)
72
73 sigbuf = backend._ffi.new("unsigned char[]", max_size)
74 siglen_ptr = backend._ffi.new("unsigned int[]", 1)
75 res = backend._lib.ECDSA_sign(
76 0, data, len(data), sigbuf, siglen_ptr, private_key._ec_key
77 )
78 backend.openssl_assert(res == 1)
79 return backend._ffi.buffer(sigbuf)[:siglen_ptr[0]]
80
81
82 def _ecdsa_sig_verify(backend, public_key, signature, data):
83 res = backend._lib.ECDSA_verify(
84 0, data, len(data), signature, len(signature), public_key._ec_key
85 )
86 if res != 1:
87 backend._consume_errors()
88 raise InvalidSignature
89 return True
90
91
92 @utils.register_interface(AsymmetricSignatureContext)
93 class _ECDSASignatureContext(object):
94 def __init__(self, backend, private_key, algorithm):
95 self._backend = backend
96 self._private_key = private_key
97 self._digest = hashes.Hash(algorithm, backend)
98
99 def update(self, data):
100 self._digest.update(data)
101
102 def finalize(self):
103 digest = self._digest.finalize()
104
105 return _ecdsa_sig_sign(self._backend, self._private_key, digest)
106
107
108 @utils.register_interface(AsymmetricVerificationContext)
109 class _ECDSAVerificationContext(object):
110 def __init__(self, backend, public_key, signature, algorithm):
111 self._backend = backend
112 self._public_key = public_key
113 self._signature = signature
114 self._digest = hashes.Hash(algorithm, backend)
115
116 def update(self, data):
117 self._digest.update(data)
118
119 def verify(self):
120 digest = self._digest.finalize()
121 return _ecdsa_sig_verify(
122 self._backend, self._public_key, self._signature, digest
123 )
124
125
126 @utils.register_interface(ec.EllipticCurvePrivateKeyWithSerialization)
127 class _EllipticCurvePrivateKey(object):
128 def __init__(self, backend, ec_key_cdata, evp_pkey):
129 self._backend = backend
130 _mark_asn1_named_ec_curve(backend, ec_key_cdata)
131 self._ec_key = ec_key_cdata
132 self._evp_pkey = evp_pkey
133
134 sn = _ec_key_curve_sn(backend, ec_key_cdata)
135 self._curve = _sn_to_elliptic_curve(backend, sn)
136
137 curve = utils.read_only_property("_curve")
138
139 def signer(self, signature_algorithm):
140 _check_signature_algorithm(signature_algorithm)
141 return _ECDSASignatureContext(
142 self._backend, self, signature_algorithm.algorithm
143 )
144
145 def exchange(self, algorithm, peer_public_key):
146 if not (
147 self._backend.elliptic_curve_exchange_algorithm_supported(
148 algorithm, self.curve
149 )
150 ):
151 raise UnsupportedAlgorithm(
152 "This backend does not support the ECDH algorithm.",
153 _Reasons.UNSUPPORTED_EXCHANGE_ALGORITHM
154 )
155
156 if peer_public_key.curve.name != self.curve.name:
157 raise ValueError(
158 "peer_public_key and self are not on the same curve"
159 )
160
161 group = self._backend._lib.EC_KEY_get0_group(self._ec_key)
162 z_len = (self._backend._lib.EC_GROUP_get_degree(group) + 7) // 8
163 self._backend.openssl_assert(z_len > 0)
164 z_buf = self._backend._ffi.new("uint8_t[]", z_len)
165 peer_key = self._backend._lib.EC_KEY_get0_public_key(
166 peer_public_key._ec_key
167 )
168
169 r = self._backend._lib.ECDH_compute_key(
170 z_buf, z_len, peer_key, self._ec_key, self._backend._ffi.NULL
171 )
172 self._backend.openssl_assert(r > 0)
173 return self._backend._ffi.buffer(z_buf)[:z_len]
174
175 def public_key(self):
176 group = self._backend._lib.EC_KEY_get0_group(self._ec_key)
177 self._backend.openssl_assert(group != self._backend._ffi.NULL)
178
179 curve_nid = self._backend._lib.EC_GROUP_get_curve_name(group)
180
181 public_ec_key = self._backend._lib.EC_KEY_new_by_curve_name(curve_nid)
182 self._backend.openssl_assert(public_ec_key != self._backend._ffi.NULL)
183 public_ec_key = self._backend._ffi.gc(
184 public_ec_key, self._backend._lib.EC_KEY_free
185 )
186
187 point = self._backend._lib.EC_KEY_get0_public_key(self._ec_key)
188 self._backend.openssl_assert(point != self._backend._ffi.NULL)
189
190 res = self._backend._lib.EC_KEY_set_public_key(public_ec_key, point)
191 self._backend.openssl_assert(res == 1)
192
193 evp_pkey = self._backend._ec_cdata_to_evp_pkey(public_ec_key)
194
195 return _EllipticCurvePublicKey(self._backend, public_ec_key, evp_pkey)
196
197 def private_numbers(self):
198 bn = self._backend._lib.EC_KEY_get0_private_key(self._ec_key)
199 private_value = self._backend._bn_to_int(bn)
200 return ec.EllipticCurvePrivateNumbers(
201 private_value=private_value,
202 public_numbers=self.public_key().public_numbers()
203 )
204
205 def private_bytes(self, encoding, format, encryption_algorithm):
206 return self._backend._private_key_bytes(
207 encoding,
208 format,
209 encryption_algorithm,
210 self._evp_pkey,
211 self._ec_key
212 )
213
214 def sign(self, data, signature_algorithm):
215 _check_signature_algorithm(signature_algorithm)
216 data, algorithm = _calculate_digest_and_algorithm(
217 self._backend, data, signature_algorithm._algorithm
218 )
219 return _ecdsa_sig_sign(self._backend, self, data)
220
221
222 @utils.register_interface(ec.EllipticCurvePublicKeyWithSerialization)
223 class _EllipticCurvePublicKey(object):
224 def __init__(self, backend, ec_key_cdata, evp_pkey):
225 self._backend = backend
226 _mark_asn1_named_ec_curve(backend, ec_key_cdata)
227 self._ec_key = ec_key_cdata
228 self._evp_pkey = evp_pkey
229
230 sn = _ec_key_curve_sn(backend, ec_key_cdata)
231 self._curve = _sn_to_elliptic_curve(backend, sn)
232
233 curve = utils.read_only_property("_curve")
234
235 def verifier(self, signature, signature_algorithm):
236 if not isinstance(signature, bytes):
237 raise TypeError("signature must be bytes.")
238
239 _check_signature_algorithm(signature_algorithm)
240 return _ECDSAVerificationContext(
241 self._backend, self, signature, signature_algorithm.algorithm
242 )
243
244 def public_numbers(self):
245 get_func, group = (
246 self._backend._ec_key_determine_group_get_func(self._ec_key)
247 )
248 point = self._backend._lib.EC_KEY_get0_public_key(self._ec_key)
249 self._backend.openssl_assert(point != self._backend._ffi.NULL)
250
251 with self._backend._tmp_bn_ctx() as bn_ctx:
252 bn_x = self._backend._lib.BN_CTX_get(bn_ctx)
253 bn_y = self._backend._lib.BN_CTX_get(bn_ctx)
254
255 res = get_func(group, point, bn_x, bn_y, bn_ctx)
256 self._backend.openssl_assert(res == 1)
257
258 x = self._backend._bn_to_int(bn_x)
259 y = self._backend._bn_to_int(bn_y)
260
261 return ec.EllipticCurvePublicNumbers(
262 x=x,
263 y=y,
264 curve=self._curve
265 )
266
267 def public_bytes(self, encoding, format):
268 if format is serialization.PublicFormat.PKCS1:
269 raise ValueError(
270 "EC public keys do not support PKCS1 serialization"
271 )
272
273 return self._backend._public_key_bytes(
274 encoding,
275 format,
276 self,
277 self._evp_pkey,
278 None
279 )
280
281 def verify(self, signature, data, signature_algorithm):
282 _check_signature_algorithm(signature_algorithm)
283 data, algorithm = _calculate_digest_and_algorithm(
284 self._backend, data, signature_algorithm._algorithm
285 )
286 return _ecdsa_sig_verify(self._backend, self, signature, data)
287
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/cryptography/hazmat/backends/openssl/ec.py b/src/cryptography/hazmat/backends/openssl/ec.py
--- a/src/cryptography/hazmat/backends/openssl/ec.py
+++ b/src/cryptography/hazmat/backends/openssl/ec.py
@@ -86,7 +86,6 @@
if res != 1:
backend._consume_errors()
raise InvalidSignature
- return True
@utils.register_interface(AsymmetricSignatureContext)
@@ -118,7 +117,7 @@
def verify(self):
digest = self._digest.finalize()
- return _ecdsa_sig_verify(
+ _ecdsa_sig_verify(
self._backend, self._public_key, self._signature, digest
)
@@ -283,4 +282,4 @@
data, algorithm = _calculate_digest_and_algorithm(
self._backend, data, signature_algorithm._algorithm
)
- return _ecdsa_sig_verify(self._backend, self, signature, data)
+ _ecdsa_sig_verify(self._backend, self, signature, data)
| {"golden_diff": "diff --git a/src/cryptography/hazmat/backends/openssl/ec.py b/src/cryptography/hazmat/backends/openssl/ec.py\n--- a/src/cryptography/hazmat/backends/openssl/ec.py\n+++ b/src/cryptography/hazmat/backends/openssl/ec.py\n@@ -86,7 +86,6 @@\n if res != 1:\n backend._consume_errors()\n raise InvalidSignature\n- return True\n \n \n @utils.register_interface(AsymmetricSignatureContext)\n@@ -118,7 +117,7 @@\n \n def verify(self):\n digest = self._digest.finalize()\n- return _ecdsa_sig_verify(\n+ _ecdsa_sig_verify(\n self._backend, self._public_key, self._signature, digest\n )\n \n@@ -283,4 +282,4 @@\n data, algorithm = _calculate_digest_and_algorithm(\n self._backend, data, signature_algorithm._algorithm\n )\n- return _ecdsa_sig_verify(self._backend, self, signature, data)\n+ _ecdsa_sig_verify(self._backend, self, signature, data)\n", "issue": "Remove `return True` in OpenSSL ecdsa verify implementation\nhttps://github.com/pyca/cryptography/blob/master/src/cryptography/hazmat/backends/openssl/ec.py#L89\r\n\r\nThis isn't part of our documented API\n", "before_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nfrom cryptography import utils\nfrom cryptography.exceptions import (\n InvalidSignature, UnsupportedAlgorithm, _Reasons\n)\nfrom cryptography.hazmat.backends.openssl.utils import (\n _calculate_digest_and_algorithm\n)\nfrom cryptography.hazmat.primitives import hashes, serialization\nfrom cryptography.hazmat.primitives.asymmetric import (\n AsymmetricSignatureContext, AsymmetricVerificationContext, ec\n)\n\n\ndef _check_signature_algorithm(signature_algorithm):\n if not isinstance(signature_algorithm, ec.ECDSA):\n raise UnsupportedAlgorithm(\n \"Unsupported elliptic curve signature algorithm.\",\n _Reasons.UNSUPPORTED_PUBLIC_KEY_ALGORITHM)\n\n\ndef _ec_key_curve_sn(backend, ec_key):\n group = backend._lib.EC_KEY_get0_group(ec_key)\n backend.openssl_assert(group != backend._ffi.NULL)\n\n nid = backend._lib.EC_GROUP_get_curve_name(group)\n # The following check is to find EC keys with unnamed curves and raise\n # an error for now.\n if nid == backend._lib.NID_undef:\n raise NotImplementedError(\n \"ECDSA certificates with unnamed curves are unsupported \"\n \"at this time\"\n )\n\n curve_name = backend._lib.OBJ_nid2sn(nid)\n backend.openssl_assert(curve_name != backend._ffi.NULL)\n\n sn = backend._ffi.string(curve_name).decode('ascii')\n return sn\n\n\ndef _mark_asn1_named_ec_curve(backend, ec_cdata):\n \"\"\"\n Set the named curve flag on the EC_KEY. This causes OpenSSL to\n serialize EC keys along with their curve OID which makes\n deserialization easier.\n \"\"\"\n\n backend._lib.EC_KEY_set_asn1_flag(\n ec_cdata, backend._lib.OPENSSL_EC_NAMED_CURVE\n )\n\n\ndef _sn_to_elliptic_curve(backend, sn):\n try:\n return ec._CURVE_TYPES[sn]()\n except KeyError:\n raise UnsupportedAlgorithm(\n \"{0} is not a supported elliptic curve\".format(sn),\n _Reasons.UNSUPPORTED_ELLIPTIC_CURVE\n )\n\n\ndef _ecdsa_sig_sign(backend, private_key, data):\n max_size = backend._lib.ECDSA_size(private_key._ec_key)\n backend.openssl_assert(max_size > 0)\n\n sigbuf = backend._ffi.new(\"unsigned char[]\", max_size)\n siglen_ptr = backend._ffi.new(\"unsigned int[]\", 1)\n res = backend._lib.ECDSA_sign(\n 0, data, len(data), sigbuf, siglen_ptr, private_key._ec_key\n )\n backend.openssl_assert(res == 1)\n return backend._ffi.buffer(sigbuf)[:siglen_ptr[0]]\n\n\ndef _ecdsa_sig_verify(backend, public_key, signature, data):\n res = backend._lib.ECDSA_verify(\n 0, data, len(data), signature, len(signature), public_key._ec_key\n )\n if res != 1:\n backend._consume_errors()\n raise InvalidSignature\n return True\n\n\[email protected]_interface(AsymmetricSignatureContext)\nclass _ECDSASignatureContext(object):\n def __init__(self, backend, private_key, algorithm):\n self._backend = backend\n self._private_key = private_key\n self._digest = hashes.Hash(algorithm, backend)\n\n def update(self, data):\n self._digest.update(data)\n\n def finalize(self):\n digest = self._digest.finalize()\n\n return _ecdsa_sig_sign(self._backend, self._private_key, digest)\n\n\[email protected]_interface(AsymmetricVerificationContext)\nclass _ECDSAVerificationContext(object):\n def __init__(self, backend, public_key, signature, algorithm):\n self._backend = backend\n self._public_key = public_key\n self._signature = signature\n self._digest = hashes.Hash(algorithm, backend)\n\n def update(self, data):\n self._digest.update(data)\n\n def verify(self):\n digest = self._digest.finalize()\n return _ecdsa_sig_verify(\n self._backend, self._public_key, self._signature, digest\n )\n\n\[email protected]_interface(ec.EllipticCurvePrivateKeyWithSerialization)\nclass _EllipticCurvePrivateKey(object):\n def __init__(self, backend, ec_key_cdata, evp_pkey):\n self._backend = backend\n _mark_asn1_named_ec_curve(backend, ec_key_cdata)\n self._ec_key = ec_key_cdata\n self._evp_pkey = evp_pkey\n\n sn = _ec_key_curve_sn(backend, ec_key_cdata)\n self._curve = _sn_to_elliptic_curve(backend, sn)\n\n curve = utils.read_only_property(\"_curve\")\n\n def signer(self, signature_algorithm):\n _check_signature_algorithm(signature_algorithm)\n return _ECDSASignatureContext(\n self._backend, self, signature_algorithm.algorithm\n )\n\n def exchange(self, algorithm, peer_public_key):\n if not (\n self._backend.elliptic_curve_exchange_algorithm_supported(\n algorithm, self.curve\n )\n ):\n raise UnsupportedAlgorithm(\n \"This backend does not support the ECDH algorithm.\",\n _Reasons.UNSUPPORTED_EXCHANGE_ALGORITHM\n )\n\n if peer_public_key.curve.name != self.curve.name:\n raise ValueError(\n \"peer_public_key and self are not on the same curve\"\n )\n\n group = self._backend._lib.EC_KEY_get0_group(self._ec_key)\n z_len = (self._backend._lib.EC_GROUP_get_degree(group) + 7) // 8\n self._backend.openssl_assert(z_len > 0)\n z_buf = self._backend._ffi.new(\"uint8_t[]\", z_len)\n peer_key = self._backend._lib.EC_KEY_get0_public_key(\n peer_public_key._ec_key\n )\n\n r = self._backend._lib.ECDH_compute_key(\n z_buf, z_len, peer_key, self._ec_key, self._backend._ffi.NULL\n )\n self._backend.openssl_assert(r > 0)\n return self._backend._ffi.buffer(z_buf)[:z_len]\n\n def public_key(self):\n group = self._backend._lib.EC_KEY_get0_group(self._ec_key)\n self._backend.openssl_assert(group != self._backend._ffi.NULL)\n\n curve_nid = self._backend._lib.EC_GROUP_get_curve_name(group)\n\n public_ec_key = self._backend._lib.EC_KEY_new_by_curve_name(curve_nid)\n self._backend.openssl_assert(public_ec_key != self._backend._ffi.NULL)\n public_ec_key = self._backend._ffi.gc(\n public_ec_key, self._backend._lib.EC_KEY_free\n )\n\n point = self._backend._lib.EC_KEY_get0_public_key(self._ec_key)\n self._backend.openssl_assert(point != self._backend._ffi.NULL)\n\n res = self._backend._lib.EC_KEY_set_public_key(public_ec_key, point)\n self._backend.openssl_assert(res == 1)\n\n evp_pkey = self._backend._ec_cdata_to_evp_pkey(public_ec_key)\n\n return _EllipticCurvePublicKey(self._backend, public_ec_key, evp_pkey)\n\n def private_numbers(self):\n bn = self._backend._lib.EC_KEY_get0_private_key(self._ec_key)\n private_value = self._backend._bn_to_int(bn)\n return ec.EllipticCurvePrivateNumbers(\n private_value=private_value,\n public_numbers=self.public_key().public_numbers()\n )\n\n def private_bytes(self, encoding, format, encryption_algorithm):\n return self._backend._private_key_bytes(\n encoding,\n format,\n encryption_algorithm,\n self._evp_pkey,\n self._ec_key\n )\n\n def sign(self, data, signature_algorithm):\n _check_signature_algorithm(signature_algorithm)\n data, algorithm = _calculate_digest_and_algorithm(\n self._backend, data, signature_algorithm._algorithm\n )\n return _ecdsa_sig_sign(self._backend, self, data)\n\n\[email protected]_interface(ec.EllipticCurvePublicKeyWithSerialization)\nclass _EllipticCurvePublicKey(object):\n def __init__(self, backend, ec_key_cdata, evp_pkey):\n self._backend = backend\n _mark_asn1_named_ec_curve(backend, ec_key_cdata)\n self._ec_key = ec_key_cdata\n self._evp_pkey = evp_pkey\n\n sn = _ec_key_curve_sn(backend, ec_key_cdata)\n self._curve = _sn_to_elliptic_curve(backend, sn)\n\n curve = utils.read_only_property(\"_curve\")\n\n def verifier(self, signature, signature_algorithm):\n if not isinstance(signature, bytes):\n raise TypeError(\"signature must be bytes.\")\n\n _check_signature_algorithm(signature_algorithm)\n return _ECDSAVerificationContext(\n self._backend, self, signature, signature_algorithm.algorithm\n )\n\n def public_numbers(self):\n get_func, group = (\n self._backend._ec_key_determine_group_get_func(self._ec_key)\n )\n point = self._backend._lib.EC_KEY_get0_public_key(self._ec_key)\n self._backend.openssl_assert(point != self._backend._ffi.NULL)\n\n with self._backend._tmp_bn_ctx() as bn_ctx:\n bn_x = self._backend._lib.BN_CTX_get(bn_ctx)\n bn_y = self._backend._lib.BN_CTX_get(bn_ctx)\n\n res = get_func(group, point, bn_x, bn_y, bn_ctx)\n self._backend.openssl_assert(res == 1)\n\n x = self._backend._bn_to_int(bn_x)\n y = self._backend._bn_to_int(bn_y)\n\n return ec.EllipticCurvePublicNumbers(\n x=x,\n y=y,\n curve=self._curve\n )\n\n def public_bytes(self, encoding, format):\n if format is serialization.PublicFormat.PKCS1:\n raise ValueError(\n \"EC public keys do not support PKCS1 serialization\"\n )\n\n return self._backend._public_key_bytes(\n encoding,\n format,\n self,\n self._evp_pkey,\n None\n )\n\n def verify(self, signature, data, signature_algorithm):\n _check_signature_algorithm(signature_algorithm)\n data, algorithm = _calculate_digest_and_algorithm(\n self._backend, data, signature_algorithm._algorithm\n )\n return _ecdsa_sig_verify(self._backend, self, signature, data)\n", "path": "src/cryptography/hazmat/backends/openssl/ec.py"}], "after_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nfrom cryptography import utils\nfrom cryptography.exceptions import (\n InvalidSignature, UnsupportedAlgorithm, _Reasons\n)\nfrom cryptography.hazmat.backends.openssl.utils import (\n _calculate_digest_and_algorithm\n)\nfrom cryptography.hazmat.primitives import hashes, serialization\nfrom cryptography.hazmat.primitives.asymmetric import (\n AsymmetricSignatureContext, AsymmetricVerificationContext, ec\n)\n\n\ndef _check_signature_algorithm(signature_algorithm):\n if not isinstance(signature_algorithm, ec.ECDSA):\n raise UnsupportedAlgorithm(\n \"Unsupported elliptic curve signature algorithm.\",\n _Reasons.UNSUPPORTED_PUBLIC_KEY_ALGORITHM)\n\n\ndef _ec_key_curve_sn(backend, ec_key):\n group = backend._lib.EC_KEY_get0_group(ec_key)\n backend.openssl_assert(group != backend._ffi.NULL)\n\n nid = backend._lib.EC_GROUP_get_curve_name(group)\n # The following check is to find EC keys with unnamed curves and raise\n # an error for now.\n if nid == backend._lib.NID_undef:\n raise NotImplementedError(\n \"ECDSA certificates with unnamed curves are unsupported \"\n \"at this time\"\n )\n\n curve_name = backend._lib.OBJ_nid2sn(nid)\n backend.openssl_assert(curve_name != backend._ffi.NULL)\n\n sn = backend._ffi.string(curve_name).decode('ascii')\n return sn\n\n\ndef _mark_asn1_named_ec_curve(backend, ec_cdata):\n \"\"\"\n Set the named curve flag on the EC_KEY. This causes OpenSSL to\n serialize EC keys along with their curve OID which makes\n deserialization easier.\n \"\"\"\n\n backend._lib.EC_KEY_set_asn1_flag(\n ec_cdata, backend._lib.OPENSSL_EC_NAMED_CURVE\n )\n\n\ndef _sn_to_elliptic_curve(backend, sn):\n try:\n return ec._CURVE_TYPES[sn]()\n except KeyError:\n raise UnsupportedAlgorithm(\n \"{0} is not a supported elliptic curve\".format(sn),\n _Reasons.UNSUPPORTED_ELLIPTIC_CURVE\n )\n\n\ndef _ecdsa_sig_sign(backend, private_key, data):\n max_size = backend._lib.ECDSA_size(private_key._ec_key)\n backend.openssl_assert(max_size > 0)\n\n sigbuf = backend._ffi.new(\"unsigned char[]\", max_size)\n siglen_ptr = backend._ffi.new(\"unsigned int[]\", 1)\n res = backend._lib.ECDSA_sign(\n 0, data, len(data), sigbuf, siglen_ptr, private_key._ec_key\n )\n backend.openssl_assert(res == 1)\n return backend._ffi.buffer(sigbuf)[:siglen_ptr[0]]\n\n\ndef _ecdsa_sig_verify(backend, public_key, signature, data):\n res = backend._lib.ECDSA_verify(\n 0, data, len(data), signature, len(signature), public_key._ec_key\n )\n if res != 1:\n backend._consume_errors()\n raise InvalidSignature\n\n\[email protected]_interface(AsymmetricSignatureContext)\nclass _ECDSASignatureContext(object):\n def __init__(self, backend, private_key, algorithm):\n self._backend = backend\n self._private_key = private_key\n self._digest = hashes.Hash(algorithm, backend)\n\n def update(self, data):\n self._digest.update(data)\n\n def finalize(self):\n digest = self._digest.finalize()\n\n return _ecdsa_sig_sign(self._backend, self._private_key, digest)\n\n\[email protected]_interface(AsymmetricVerificationContext)\nclass _ECDSAVerificationContext(object):\n def __init__(self, backend, public_key, signature, algorithm):\n self._backend = backend\n self._public_key = public_key\n self._signature = signature\n self._digest = hashes.Hash(algorithm, backend)\n\n def update(self, data):\n self._digest.update(data)\n\n def verify(self):\n digest = self._digest.finalize()\n _ecdsa_sig_verify(\n self._backend, self._public_key, self._signature, digest\n )\n\n\[email protected]_interface(ec.EllipticCurvePrivateKeyWithSerialization)\nclass _EllipticCurvePrivateKey(object):\n def __init__(self, backend, ec_key_cdata, evp_pkey):\n self._backend = backend\n _mark_asn1_named_ec_curve(backend, ec_key_cdata)\n self._ec_key = ec_key_cdata\n self._evp_pkey = evp_pkey\n\n sn = _ec_key_curve_sn(backend, ec_key_cdata)\n self._curve = _sn_to_elliptic_curve(backend, sn)\n\n curve = utils.read_only_property(\"_curve\")\n\n def signer(self, signature_algorithm):\n _check_signature_algorithm(signature_algorithm)\n return _ECDSASignatureContext(\n self._backend, self, signature_algorithm.algorithm\n )\n\n def exchange(self, algorithm, peer_public_key):\n if not (\n self._backend.elliptic_curve_exchange_algorithm_supported(\n algorithm, self.curve\n )\n ):\n raise UnsupportedAlgorithm(\n \"This backend does not support the ECDH algorithm.\",\n _Reasons.UNSUPPORTED_EXCHANGE_ALGORITHM\n )\n\n if peer_public_key.curve.name != self.curve.name:\n raise ValueError(\n \"peer_public_key and self are not on the same curve\"\n )\n\n group = self._backend._lib.EC_KEY_get0_group(self._ec_key)\n z_len = (self._backend._lib.EC_GROUP_get_degree(group) + 7) // 8\n self._backend.openssl_assert(z_len > 0)\n z_buf = self._backend._ffi.new(\"uint8_t[]\", z_len)\n peer_key = self._backend._lib.EC_KEY_get0_public_key(\n peer_public_key._ec_key\n )\n\n r = self._backend._lib.ECDH_compute_key(\n z_buf, z_len, peer_key, self._ec_key, self._backend._ffi.NULL\n )\n self._backend.openssl_assert(r > 0)\n return self._backend._ffi.buffer(z_buf)[:z_len]\n\n def public_key(self):\n group = self._backend._lib.EC_KEY_get0_group(self._ec_key)\n self._backend.openssl_assert(group != self._backend._ffi.NULL)\n\n curve_nid = self._backend._lib.EC_GROUP_get_curve_name(group)\n\n public_ec_key = self._backend._lib.EC_KEY_new_by_curve_name(curve_nid)\n self._backend.openssl_assert(public_ec_key != self._backend._ffi.NULL)\n public_ec_key = self._backend._ffi.gc(\n public_ec_key, self._backend._lib.EC_KEY_free\n )\n\n point = self._backend._lib.EC_KEY_get0_public_key(self._ec_key)\n self._backend.openssl_assert(point != self._backend._ffi.NULL)\n\n res = self._backend._lib.EC_KEY_set_public_key(public_ec_key, point)\n self._backend.openssl_assert(res == 1)\n\n evp_pkey = self._backend._ec_cdata_to_evp_pkey(public_ec_key)\n\n return _EllipticCurvePublicKey(self._backend, public_ec_key, evp_pkey)\n\n def private_numbers(self):\n bn = self._backend._lib.EC_KEY_get0_private_key(self._ec_key)\n private_value = self._backend._bn_to_int(bn)\n return ec.EllipticCurvePrivateNumbers(\n private_value=private_value,\n public_numbers=self.public_key().public_numbers()\n )\n\n def private_bytes(self, encoding, format, encryption_algorithm):\n return self._backend._private_key_bytes(\n encoding,\n format,\n encryption_algorithm,\n self._evp_pkey,\n self._ec_key\n )\n\n def sign(self, data, signature_algorithm):\n _check_signature_algorithm(signature_algorithm)\n data, algorithm = _calculate_digest_and_algorithm(\n self._backend, data, signature_algorithm._algorithm\n )\n return _ecdsa_sig_sign(self._backend, self, data)\n\n\[email protected]_interface(ec.EllipticCurvePublicKeyWithSerialization)\nclass _EllipticCurvePublicKey(object):\n def __init__(self, backend, ec_key_cdata, evp_pkey):\n self._backend = backend\n _mark_asn1_named_ec_curve(backend, ec_key_cdata)\n self._ec_key = ec_key_cdata\n self._evp_pkey = evp_pkey\n\n sn = _ec_key_curve_sn(backend, ec_key_cdata)\n self._curve = _sn_to_elliptic_curve(backend, sn)\n\n curve = utils.read_only_property(\"_curve\")\n\n def verifier(self, signature, signature_algorithm):\n if not isinstance(signature, bytes):\n raise TypeError(\"signature must be bytes.\")\n\n _check_signature_algorithm(signature_algorithm)\n return _ECDSAVerificationContext(\n self._backend, self, signature, signature_algorithm.algorithm\n )\n\n def public_numbers(self):\n get_func, group = (\n self._backend._ec_key_determine_group_get_func(self._ec_key)\n )\n point = self._backend._lib.EC_KEY_get0_public_key(self._ec_key)\n self._backend.openssl_assert(point != self._backend._ffi.NULL)\n\n with self._backend._tmp_bn_ctx() as bn_ctx:\n bn_x = self._backend._lib.BN_CTX_get(bn_ctx)\n bn_y = self._backend._lib.BN_CTX_get(bn_ctx)\n\n res = get_func(group, point, bn_x, bn_y, bn_ctx)\n self._backend.openssl_assert(res == 1)\n\n x = self._backend._bn_to_int(bn_x)\n y = self._backend._bn_to_int(bn_y)\n\n return ec.EllipticCurvePublicNumbers(\n x=x,\n y=y,\n curve=self._curve\n )\n\n def public_bytes(self, encoding, format):\n if format is serialization.PublicFormat.PKCS1:\n raise ValueError(\n \"EC public keys do not support PKCS1 serialization\"\n )\n\n return self._backend._public_key_bytes(\n encoding,\n format,\n self,\n self._evp_pkey,\n None\n )\n\n def verify(self, signature, data, signature_algorithm):\n _check_signature_algorithm(signature_algorithm)\n data, algorithm = _calculate_digest_and_algorithm(\n self._backend, data, signature_algorithm._algorithm\n )\n _ecdsa_sig_verify(self._backend, self, signature, data)\n", "path": "src/cryptography/hazmat/backends/openssl/ec.py"}]} | 3,448 | 245 |
gh_patches_debug_38147 | rasdani/github-patches | git_diff | WeblateOrg__weblate-8675 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Freezing in certain strings
### Describe the issue
Hi.
We just updated Weblate to 4.15.1, and our instance is hanging when we access certain strings, weird strings (that should be ignored when creating the PO files, I know).
Instance logs, sometimes show this:
```
[2023-01-24 12:54:51,272: DEBUG/90183] git: failure fatal: bad object 93f0b5592a265aa1ba11131707a710dbdcca0040
[2023-01-24 12:54:51,272: DEBUG/90183] git: failure fatal: bad object 93f0b5592a265aa1ba11131707a710dbdcca0040
```
This is an example of the string that cause the issue:
https://github.com/freebsd/freebsd-doc-translate/blob/main/documentation/content/es/articles/serial-uart/_index.po#L38-L52
```
#. type: Plain text
#: documentation/content/en/articles/serial-uart/_index.adoc:48
msgid "'''"
msgstr "'''"
```
postgres be stuck in selects.
Do you know if there is something we can do here?
Regards.
### I already tried
- [X] I've read and searched [the documentation](https://docs.weblate.org/).
- [X] I've searched for similar issues in this repository.
### Steps to reproduce the behavior
Go to any string like this:
```
#. type: Plain text
#: documentation/content/en/articles/serial-uart/_index.adoc:48
msgid "'''"
msgstr "'''"
```
### Expected behavior
_No response_
### Screenshots
_No response_
### Exception traceback
```pytb
Only this:
[2023-01-24 12:54:51,272: DEBUG/90183] git: failure fatal: bad object 93f0b5592a265aa1ba11131707a710dbdcca0040
[2023-01-24 12:54:51,272: DEBUG/90183] git: failure fatal: bad object 93f0b5592a265aa1ba11131707a710dbdcca0040
```
### How do you run Weblate?
weblate.org service
### Weblate versions
`4.15.1`
We have updated docker containers from `4.10.1`.
### Weblate deploy checks
_No response_
### Additional context
_No response_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `weblate/utils/db.py`
Content:
```
1 # Copyright © Michal Čihař <[email protected]>
2 #
3 # SPDX-License-Identifier: GPL-3.0-or-later
4
5 """Database specific code to extend Django."""
6
7 from django.db import connection, models
8 from django.db.models import Case, IntegerField, Sum, When
9 from django.db.models.lookups import PatternLookup
10
11 ESCAPED = frozenset(".\\+*?[^]$(){}=!<>|:-")
12
13 PG_TRGM = "CREATE INDEX {0}_{1}_fulltext ON trans_{0} USING GIN ({1} gin_trgm_ops {2})"
14 PG_DROP = "DROP INDEX {0}_{1}_fulltext"
15
16 MY_FTX = "CREATE FULLTEXT INDEX {0}_{1}_fulltext ON trans_{0}({1})"
17 MY_DROP = "ALTER TABLE trans_{0} DROP INDEX {0}_{1}_fulltext"
18
19
20 def conditional_sum(value=1, **cond):
21 """Wrapper to generate SUM on boolean/enum values."""
22 return Sum(Case(When(then=value, **cond), default=0, output_field=IntegerField()))
23
24
25 def using_postgresql():
26 return connection.vendor == "postgresql"
27
28
29 def adjust_similarity_threshold(value: float):
30 """
31 Adjusts pg_trgm.similarity_threshold for the % operator.
32
33 Ideally we would use directly similarity() in the search, but that doesn't seem
34 to use index, while using % does.
35 """
36 if not using_postgresql():
37 return
38 with connection.cursor() as cursor:
39 # The SELECT has to be executed first as othervise the trgm extension
40 # might not yet be loaded and GUC setting not possible.
41 if not hasattr(connection, "weblate_similarity"):
42 cursor.execute("SELECT show_limit()")
43 connection.weblate_similarity = cursor.fetchone()[0]
44 # Change setting only for reasonably big difference
45 if abs(connection.weblate_similarity - value) > 0.01:
46 cursor.execute("SELECT set_limit(%s)", [value])
47 connection.weblate_similarity = value
48
49
50 class PostgreSQLSearchLookup(PatternLookup):
51 lookup_name = "search"
52 param_pattern = "%s"
53
54 def as_sql(self, qn, connection):
55 lhs, lhs_params = self.process_lhs(qn, connection)
56 rhs, rhs_params = self.process_rhs(qn, connection)
57 params = lhs_params + rhs_params
58 return f"{lhs} %% {rhs} = true", params
59
60
61 class MySQLSearchLookup(models.Lookup):
62 lookup_name = "search"
63
64 def as_sql(self, compiler, connection):
65 lhs, lhs_params = self.process_lhs(compiler, connection)
66 rhs, rhs_params = self.process_rhs(compiler, connection)
67 params = lhs_params + rhs_params
68 return f"MATCH ({lhs}) AGAINST ({rhs} IN NATURAL LANGUAGE MODE)", params
69
70
71 class PostgreSQLSubstringLookup(PatternLookup):
72 """
73 Case insensitive substring lookup.
74
75 This is essentially same as icontains in Django, but utilizes ILIKE
76 operator which can use pg_trgm index.
77 """
78
79 lookup_name = "substring"
80
81 def as_sql(self, compiler, connection):
82 lhs, lhs_params = self.process_lhs(compiler, connection)
83 rhs, rhs_params = self.process_rhs(compiler, connection)
84 params = lhs_params + rhs_params
85 return f"{lhs} ILIKE {rhs}", params
86
87
88 class PostgreSQLILikeLookup(PostgreSQLSubstringLookup):
89 """
90 Case insensitive string lookup.
91
92 This is essentially same as iexact in Django, but utilizes ILIKE
93 operator which can use pg_trgm index.
94 """
95
96 lookup_name = "ilike"
97 param_pattern = "%s"
98
99
100 def re_escape(pattern):
101 """Escape for use in database regexp match.
102
103 This is based on re.escape, but that one escapes too much.
104 """
105 string = list(pattern)
106 for i, char in enumerate(pattern):
107 if char == "\000":
108 string[i] = "\\000"
109 elif char in ESCAPED:
110 string[i] = "\\" + char
111 return "".join(string)
112
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/weblate/utils/db.py b/weblate/utils/db.py
--- a/weblate/utils/db.py
+++ b/weblate/utils/db.py
@@ -6,7 +6,7 @@
from django.db import connection, models
from django.db.models import Case, IntegerField, Sum, When
-from django.db.models.lookups import PatternLookup
+from django.db.models.lookups import IContains, IExact, PatternLookup
ESCAPED = frozenset(".\\+*?[^]$(){}=!<>|:-")
@@ -47,13 +47,27 @@
connection.weblate_similarity = value
-class PostgreSQLSearchLookup(PatternLookup):
+class PostgreSQLFallbackLookup(PatternLookup):
+ def __init__(self, lhs, rhs):
+ self.orig_lhs = lhs
+ self.orig_rhs = rhs
+ super().__init__(lhs, rhs)
+
+ def needs_fallback(self):
+ return isinstance(self.orig_rhs, str) and not any(
+ char.isalnum() for char in self.orig_rhs
+ )
+
+
+class PostgreSQLSearchLookup(PostgreSQLFallbackLookup):
lookup_name = "search"
param_pattern = "%s"
- def as_sql(self, qn, connection):
- lhs, lhs_params = self.process_lhs(qn, connection)
- rhs, rhs_params = self.process_rhs(qn, connection)
+ def as_sql(self, compiler, connection):
+ if self.needs_fallback():
+ return IContains(self.orig_lhs, self.orig_rhs).as_sql(compiler, connection)
+ lhs, lhs_params = self.process_lhs(compiler, connection)
+ rhs, rhs_params = self.process_rhs(compiler, connection)
params = lhs_params + rhs_params
return f"{lhs} %% {rhs} = true", params
@@ -68,7 +82,7 @@
return f"MATCH ({lhs}) AGAINST ({rhs} IN NATURAL LANGUAGE MODE)", params
-class PostgreSQLSubstringLookup(PatternLookup):
+class PostgreSQLSubstringLookup(PostgreSQLFallbackLookup):
"""
Case insensitive substring lookup.
@@ -79,6 +93,8 @@
lookup_name = "substring"
def as_sql(self, compiler, connection):
+ if self.needs_fallback():
+ return IContains(self.orig_lhs, self.orig_rhs).as_sql(compiler, connection)
lhs, lhs_params = self.process_lhs(compiler, connection)
rhs, rhs_params = self.process_rhs(compiler, connection)
params = lhs_params + rhs_params
@@ -96,6 +112,11 @@
lookup_name = "ilike"
param_pattern = "%s"
+ def as_sql(self, compiler, connection):
+ if self.needs_fallback():
+ return IExact(self.orig_lhs, self.orig_rhs).as_sql(compiler, connection)
+ return super().as_sql(compiler, connection)
+
def re_escape(pattern):
"""Escape for use in database regexp match.
| {"golden_diff": "diff --git a/weblate/utils/db.py b/weblate/utils/db.py\n--- a/weblate/utils/db.py\n+++ b/weblate/utils/db.py\n@@ -6,7 +6,7 @@\n \n from django.db import connection, models\n from django.db.models import Case, IntegerField, Sum, When\n-from django.db.models.lookups import PatternLookup\n+from django.db.models.lookups import IContains, IExact, PatternLookup\n \n ESCAPED = frozenset(\".\\\\+*?[^]$(){}=!<>|:-\")\n \n@@ -47,13 +47,27 @@\n connection.weblate_similarity = value\n \n \n-class PostgreSQLSearchLookup(PatternLookup):\n+class PostgreSQLFallbackLookup(PatternLookup):\n+ def __init__(self, lhs, rhs):\n+ self.orig_lhs = lhs\n+ self.orig_rhs = rhs\n+ super().__init__(lhs, rhs)\n+\n+ def needs_fallback(self):\n+ return isinstance(self.orig_rhs, str) and not any(\n+ char.isalnum() for char in self.orig_rhs\n+ )\n+\n+\n+class PostgreSQLSearchLookup(PostgreSQLFallbackLookup):\n lookup_name = \"search\"\n param_pattern = \"%s\"\n \n- def as_sql(self, qn, connection):\n- lhs, lhs_params = self.process_lhs(qn, connection)\n- rhs, rhs_params = self.process_rhs(qn, connection)\n+ def as_sql(self, compiler, connection):\n+ if self.needs_fallback():\n+ return IContains(self.orig_lhs, self.orig_rhs).as_sql(compiler, connection)\n+ lhs, lhs_params = self.process_lhs(compiler, connection)\n+ rhs, rhs_params = self.process_rhs(compiler, connection)\n params = lhs_params + rhs_params\n return f\"{lhs} %% {rhs} = true\", params\n \n@@ -68,7 +82,7 @@\n return f\"MATCH ({lhs}) AGAINST ({rhs} IN NATURAL LANGUAGE MODE)\", params\n \n \n-class PostgreSQLSubstringLookup(PatternLookup):\n+class PostgreSQLSubstringLookup(PostgreSQLFallbackLookup):\n \"\"\"\n Case insensitive substring lookup.\n \n@@ -79,6 +93,8 @@\n lookup_name = \"substring\"\n \n def as_sql(self, compiler, connection):\n+ if self.needs_fallback():\n+ return IContains(self.orig_lhs, self.orig_rhs).as_sql(compiler, connection)\n lhs, lhs_params = self.process_lhs(compiler, connection)\n rhs, rhs_params = self.process_rhs(compiler, connection)\n params = lhs_params + rhs_params\n@@ -96,6 +112,11 @@\n lookup_name = \"ilike\"\n param_pattern = \"%s\"\n \n+ def as_sql(self, compiler, connection):\n+ if self.needs_fallback():\n+ return IExact(self.orig_lhs, self.orig_rhs).as_sql(compiler, connection)\n+ return super().as_sql(compiler, connection)\n+\n \n def re_escape(pattern):\n \"\"\"Escape for use in database regexp match.\n", "issue": "Freezing in certain strings\n### Describe the issue\r\n\r\nHi.\r\n\r\nWe just updated Weblate to 4.15.1, and our instance is hanging when we access certain strings, weird strings (that should be ignored when creating the PO files, I know).\r\n\r\nInstance logs, sometimes show this:\r\n```\r\n[2023-01-24 12:54:51,272: DEBUG/90183] git: failure fatal: bad object 93f0b5592a265aa1ba11131707a710dbdcca0040\r\n[2023-01-24 12:54:51,272: DEBUG/90183] git: failure fatal: bad object 93f0b5592a265aa1ba11131707a710dbdcca0040\r\n```\r\n\r\nThis is an example of the string that cause the issue:\r\n\r\nhttps://github.com/freebsd/freebsd-doc-translate/blob/main/documentation/content/es/articles/serial-uart/_index.po#L38-L52\r\n\r\n```\r\n#. type: Plain text\r\n#: documentation/content/en/articles/serial-uart/_index.adoc:48\r\nmsgid \"'''\"\r\nmsgstr \"'''\"\r\n```\r\n\r\npostgres be stuck in selects.\r\n\r\nDo you know if there is something we can do here?\r\n\r\nRegards.\r\n\r\n### I already tried\r\n\r\n- [X] I've read and searched [the documentation](https://docs.weblate.org/).\r\n- [X] I've searched for similar issues in this repository.\r\n\r\n### Steps to reproduce the behavior\r\n\r\nGo to any string like this:\r\n\r\n```\r\n#. type: Plain text\r\n#: documentation/content/en/articles/serial-uart/_index.adoc:48\r\nmsgid \"'''\"\r\nmsgstr \"'''\"\r\n```\r\n\r\n### Expected behavior\r\n\r\n_No response_\r\n\r\n### Screenshots\r\n\r\n_No response_\r\n\r\n### Exception traceback\r\n\r\n```pytb\r\nOnly this:\r\n\r\n\r\n[2023-01-24 12:54:51,272: DEBUG/90183] git: failure fatal: bad object 93f0b5592a265aa1ba11131707a710dbdcca0040\r\n[2023-01-24 12:54:51,272: DEBUG/90183] git: failure fatal: bad object 93f0b5592a265aa1ba11131707a710dbdcca0040\r\n```\r\n\r\n\r\n### How do you run Weblate?\r\n\r\nweblate.org service\r\n\r\n### Weblate versions\r\n\r\n`4.15.1`\r\nWe have updated docker containers from `4.10.1`.\r\n\r\n### Weblate deploy checks\r\n\r\n_No response_\r\n\r\n### Additional context\r\n\r\n_No response_\n", "before_files": [{"content": "# Copyright \u00a9 Michal \u010ciha\u0159 <[email protected]>\n#\n# SPDX-License-Identifier: GPL-3.0-or-later\n\n\"\"\"Database specific code to extend Django.\"\"\"\n\nfrom django.db import connection, models\nfrom django.db.models import Case, IntegerField, Sum, When\nfrom django.db.models.lookups import PatternLookup\n\nESCAPED = frozenset(\".\\\\+*?[^]$(){}=!<>|:-\")\n\nPG_TRGM = \"CREATE INDEX {0}_{1}_fulltext ON trans_{0} USING GIN ({1} gin_trgm_ops {2})\"\nPG_DROP = \"DROP INDEX {0}_{1}_fulltext\"\n\nMY_FTX = \"CREATE FULLTEXT INDEX {0}_{1}_fulltext ON trans_{0}({1})\"\nMY_DROP = \"ALTER TABLE trans_{0} DROP INDEX {0}_{1}_fulltext\"\n\n\ndef conditional_sum(value=1, **cond):\n \"\"\"Wrapper to generate SUM on boolean/enum values.\"\"\"\n return Sum(Case(When(then=value, **cond), default=0, output_field=IntegerField()))\n\n\ndef using_postgresql():\n return connection.vendor == \"postgresql\"\n\n\ndef adjust_similarity_threshold(value: float):\n \"\"\"\n Adjusts pg_trgm.similarity_threshold for the % operator.\n\n Ideally we would use directly similarity() in the search, but that doesn't seem\n to use index, while using % does.\n \"\"\"\n if not using_postgresql():\n return\n with connection.cursor() as cursor:\n # The SELECT has to be executed first as othervise the trgm extension\n # might not yet be loaded and GUC setting not possible.\n if not hasattr(connection, \"weblate_similarity\"):\n cursor.execute(\"SELECT show_limit()\")\n connection.weblate_similarity = cursor.fetchone()[0]\n # Change setting only for reasonably big difference\n if abs(connection.weblate_similarity - value) > 0.01:\n cursor.execute(\"SELECT set_limit(%s)\", [value])\n connection.weblate_similarity = value\n\n\nclass PostgreSQLSearchLookup(PatternLookup):\n lookup_name = \"search\"\n param_pattern = \"%s\"\n\n def as_sql(self, qn, connection):\n lhs, lhs_params = self.process_lhs(qn, connection)\n rhs, rhs_params = self.process_rhs(qn, connection)\n params = lhs_params + rhs_params\n return f\"{lhs} %% {rhs} = true\", params\n\n\nclass MySQLSearchLookup(models.Lookup):\n lookup_name = \"search\"\n\n def as_sql(self, compiler, connection):\n lhs, lhs_params = self.process_lhs(compiler, connection)\n rhs, rhs_params = self.process_rhs(compiler, connection)\n params = lhs_params + rhs_params\n return f\"MATCH ({lhs}) AGAINST ({rhs} IN NATURAL LANGUAGE MODE)\", params\n\n\nclass PostgreSQLSubstringLookup(PatternLookup):\n \"\"\"\n Case insensitive substring lookup.\n\n This is essentially same as icontains in Django, but utilizes ILIKE\n operator which can use pg_trgm index.\n \"\"\"\n\n lookup_name = \"substring\"\n\n def as_sql(self, compiler, connection):\n lhs, lhs_params = self.process_lhs(compiler, connection)\n rhs, rhs_params = self.process_rhs(compiler, connection)\n params = lhs_params + rhs_params\n return f\"{lhs} ILIKE {rhs}\", params\n\n\nclass PostgreSQLILikeLookup(PostgreSQLSubstringLookup):\n \"\"\"\n Case insensitive string lookup.\n\n This is essentially same as iexact in Django, but utilizes ILIKE\n operator which can use pg_trgm index.\n \"\"\"\n\n lookup_name = \"ilike\"\n param_pattern = \"%s\"\n\n\ndef re_escape(pattern):\n \"\"\"Escape for use in database regexp match.\n\n This is based on re.escape, but that one escapes too much.\n \"\"\"\n string = list(pattern)\n for i, char in enumerate(pattern):\n if char == \"\\000\":\n string[i] = \"\\\\000\"\n elif char in ESCAPED:\n string[i] = \"\\\\\" + char\n return \"\".join(string)\n", "path": "weblate/utils/db.py"}], "after_files": [{"content": "# Copyright \u00a9 Michal \u010ciha\u0159 <[email protected]>\n#\n# SPDX-License-Identifier: GPL-3.0-or-later\n\n\"\"\"Database specific code to extend Django.\"\"\"\n\nfrom django.db import connection, models\nfrom django.db.models import Case, IntegerField, Sum, When\nfrom django.db.models.lookups import IContains, IExact, PatternLookup\n\nESCAPED = frozenset(\".\\\\+*?[^]$(){}=!<>|:-\")\n\nPG_TRGM = \"CREATE INDEX {0}_{1}_fulltext ON trans_{0} USING GIN ({1} gin_trgm_ops {2})\"\nPG_DROP = \"DROP INDEX {0}_{1}_fulltext\"\n\nMY_FTX = \"CREATE FULLTEXT INDEX {0}_{1}_fulltext ON trans_{0}({1})\"\nMY_DROP = \"ALTER TABLE trans_{0} DROP INDEX {0}_{1}_fulltext\"\n\n\ndef conditional_sum(value=1, **cond):\n \"\"\"Wrapper to generate SUM on boolean/enum values.\"\"\"\n return Sum(Case(When(then=value, **cond), default=0, output_field=IntegerField()))\n\n\ndef using_postgresql():\n return connection.vendor == \"postgresql\"\n\n\ndef adjust_similarity_threshold(value: float):\n \"\"\"\n Adjusts pg_trgm.similarity_threshold for the % operator.\n\n Ideally we would use directly similarity() in the search, but that doesn't seem\n to use index, while using % does.\n \"\"\"\n if not using_postgresql():\n return\n with connection.cursor() as cursor:\n # The SELECT has to be executed first as othervise the trgm extension\n # might not yet be loaded and GUC setting not possible.\n if not hasattr(connection, \"weblate_similarity\"):\n cursor.execute(\"SELECT show_limit()\")\n connection.weblate_similarity = cursor.fetchone()[0]\n # Change setting only for reasonably big difference\n if abs(connection.weblate_similarity - value) > 0.01:\n cursor.execute(\"SELECT set_limit(%s)\", [value])\n connection.weblate_similarity = value\n\n\nclass PostgreSQLFallbackLookup(PatternLookup):\n def __init__(self, lhs, rhs):\n self.orig_lhs = lhs\n self.orig_rhs = rhs\n super().__init__(lhs, rhs)\n\n def needs_fallback(self):\n return isinstance(self.orig_rhs, str) and not any(\n char.isalnum() for char in self.orig_rhs\n )\n\n\nclass PostgreSQLSearchLookup(PostgreSQLFallbackLookup):\n lookup_name = \"search\"\n param_pattern = \"%s\"\n\n def as_sql(self, compiler, connection):\n if self.needs_fallback():\n return IContains(self.orig_lhs, self.orig_rhs).as_sql(compiler, connection)\n lhs, lhs_params = self.process_lhs(compiler, connection)\n rhs, rhs_params = self.process_rhs(compiler, connection)\n params = lhs_params + rhs_params\n return f\"{lhs} %% {rhs} = true\", params\n\n\nclass MySQLSearchLookup(models.Lookup):\n lookup_name = \"search\"\n\n def as_sql(self, compiler, connection):\n lhs, lhs_params = self.process_lhs(compiler, connection)\n rhs, rhs_params = self.process_rhs(compiler, connection)\n params = lhs_params + rhs_params\n return f\"MATCH ({lhs}) AGAINST ({rhs} IN NATURAL LANGUAGE MODE)\", params\n\n\nclass PostgreSQLSubstringLookup(PostgreSQLFallbackLookup):\n \"\"\"\n Case insensitive substring lookup.\n\n This is essentially same as icontains in Django, but utilizes ILIKE\n operator which can use pg_trgm index.\n \"\"\"\n\n lookup_name = \"substring\"\n\n def as_sql(self, compiler, connection):\n if self.needs_fallback():\n return IContains(self.orig_lhs, self.orig_rhs).as_sql(compiler, connection)\n lhs, lhs_params = self.process_lhs(compiler, connection)\n rhs, rhs_params = self.process_rhs(compiler, connection)\n params = lhs_params + rhs_params\n return f\"{lhs} ILIKE {rhs}\", params\n\n\nclass PostgreSQLILikeLookup(PostgreSQLSubstringLookup):\n \"\"\"\n Case insensitive string lookup.\n\n This is essentially same as iexact in Django, but utilizes ILIKE\n operator which can use pg_trgm index.\n \"\"\"\n\n lookup_name = \"ilike\"\n param_pattern = \"%s\"\n\n def as_sql(self, compiler, connection):\n if self.needs_fallback():\n return IExact(self.orig_lhs, self.orig_rhs).as_sql(compiler, connection)\n return super().as_sql(compiler, connection)\n\n\ndef re_escape(pattern):\n \"\"\"Escape for use in database regexp match.\n\n This is based on re.escape, but that one escapes too much.\n \"\"\"\n string = list(pattern)\n for i, char in enumerate(pattern):\n if char == \"\\000\":\n string[i] = \"\\\\000\"\n elif char in ESCAPED:\n string[i] = \"\\\\\" + char\n return \"\".join(string)\n", "path": "weblate/utils/db.py"}]} | 2,034 | 655 |
gh_patches_debug_39507 | rasdani/github-patches | git_diff | Nitrate__Nitrate-1106 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Drop RPM package build completely
Major reason and consideration:
- reduce the effort to maintain the builds
- easy to pin the dependencies
- make it clear to install and distribute via container images
AC:
- [x] Remove from CI
- [ ] Remove the Fedora Copr project
- [x] Refactor the Containerfile to build images directly from the source tree
- [x] Update README and documentation to remove the content about RPM packages
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `container/init.py`
Content:
```
1 #!/usr/bin/python3
2
3 import logging
4 import os
5 import time
6
7 logging.basicConfig(
8 level=logging.INFO,
9 format='%(asctime)s %(levelname)s %(name)s: %(message)s'
10 )
11 logger = logging.getLogger('entrypoint')
12
13 import django
14 django.setup()
15
16 from django.contrib.auth.models import User
17 from django.core.management import call_command
18 from django.db import connection
19
20
21 def create_superuser():
22 username = os.environ.get('NITRATE_SUPERUSER_USERNAME')
23 password = os.environ.get('NITRATE_SUPERUSER_PASSWORD')
24 email = os.environ.get('NITRATE_SUPERUSER_EMAIL')
25
26 if not (username and password and email):
27 logger.info(
28 'NITRATE_SUPERUSER_USERNAME, NITRATE_SUPERUSER_PASSWORD and NITRATE_SUPERUSER_EMAIL are not set. '
29 'Skip creating a superuser.'
30 )
31 return
32
33 try:
34 if User.objects.filter(username=username, email=email, is_superuser=True).exists():
35 logger.info('Superuser %s has been created.', username)
36 return
37 except: # noqa
38 pass
39
40 try:
41 User.objects.create_superuser(username, email=email, password=password)
42 logger.info('Superuser %s is created successfully.', username)
43 except Exception as e:
44 logger.warning('Failed to create superuser %s: %s', username, e)
45 logger.warning('Please check if the database is initialized properly.')
46
47
48 def set_default_permissions():
49 if os.environ.get('NITRATE_SET_DEFAULT_PERMS'):
50 try:
51 call_command('setdefaultperms')
52 logger.info('Default groups are created and permissions are set to groups properly.')
53 except Exception as e:
54 logger.warning('Failed to run command setdefaultperms: %s', e)
55 logger.warning('Please check if the database is initialized properly.')
56 else:
57 logger.info(
58 'Environment variable NITRATE_SET_DEFAULT_PERMS is not set. '
59 'Skip creating default groups and granting permissions to specific group.'
60 )
61
62
63 def migrate_db():
64 if os.environ.get('NITRATE_MIGRATE_DB'):
65 try:
66 call_command('migrate')
67 logger.info('Database is migrated successfully.')
68 except Exception as e:
69 logger.warning('Failed to migrate the database: %s', e)
70 else:
71 logger.info('Environment variable NITRATE_MIGRATE_DB is not set. Skip migrating database.')
72
73
74 def wait_for_db():
75 while 1:
76 try:
77 connection.cursor()
78 except: # noqa
79 logger.debug('Failed to connect to database. Sleep for a while and try again ...')
80 time.sleep(0.5)
81 else:
82 break
83
84
85 if __name__ == '__main__':
86 wait_for_db()
87 migrate_db()
88 create_superuser()
89 set_default_permissions()
90
```
Path: `contrib/scripts/make-release.py`
Content:
```
1 #!/usr/bin/env python3
2
3 import re
4 import argparse
5 import subprocess
6 from pathlib import Path
7
8 from datetime import datetime
9 from typing import Tuple
10 from pygit2 import Commit, Repository
11
12
13 def extract_short_log(commit: Commit) -> Tuple[str, None or str]:
14 lines = commit.message.split('\n')
15 subject = lines[0]
16 match = re.search(r'\((#\d+)\)$', subject)
17 return subject, match.groups()[0] if match else None
18
19
20 def generate_changelog(args: argparse.Namespace):
21 repo: Repository = Repository(args.repo or '.')
22 if args.since_version:
23 release_tag = repo.revparse_single(args.since_version)
24 else:
25 release_tag = repo.revparse_single(repo.describe().split('-')[0])
26
27 walker = repo.walk(repo.head.target)
28 walker.hide(release_tag.id)
29 logs = []
30 found_issue_keys = []
31
32 for commit in walker:
33 subject, issue_key = extract_short_log(commit)
34 if issue_key is not None:
35 found_issue_keys.append(issue_key)
36 subject = subject.replace(issue_key, f'`{issue_key}`_')
37 logs.append(f'* {subject}')
38
39 logs.append('')
40 found_issue_keys.sort()
41 for item in found_issue_keys:
42 logs.append(f'.. _{item}: https://github.com/Nitrate/Nitrate/issues/{item[1:]}')
43
44 return '\n'.join(logs)
45
46
47 def validate_version(value):
48 if value.startswith('v'):
49 raise argparse.ArgumentTypeError('Version should not be prefixed with v.')
50 return value
51
52
53 parser = argparse.ArgumentParser()
54 parser.add_argument('--repo', help='Path to git repository.')
55 parser.add_argument('--since-version', required=False,
56 type=validate_version,
57 help='Collect commits since this version.')
58 parser.add_argument('new_version', metavar='NEW_VERSION',
59 type=validate_version,
60 help='The version to be released.')
61
62 args = parser.parse_args()
63 new_version = args.new_version
64
65 Path('VERSION.txt').unlink()
66 Path('VERSION.txt').write_text(new_version, "utf-8")
67
68 template = Path('contrib/scripts/release-notes.tmpl.rst').read_text("utf-8")
69 Path(f'docs/source/releases/{new_version}.rst').write_text(
70 template.format(
71 new_version=new_version,
72 doc_ref=new_version,
73 title_marker=len(new_version) * '=',
74 change_logs=generate_changelog(args),
75 release_date=datetime.now().strftime('%b %d, %Y')
76 ),
77 "utf-8",
78 )
79
80 readme_md = Path('container/README.md')
81 content = readme_md.read_text("utf-8")
82 readme_md.unlink()
83 readme_md.write_text(
84 re.sub(r'quay.io/nitrate/nitrate:\d+\.\d+(\.\d+)?',
85 f'quay.io/nitrate/nitrate:{new_version}',
86 content),
87 "utf-8",
88 )
89
90 subprocess.check_call([
91 'rpmdev-bumpspec',
92 '-n', new_version,
93 '-c', f'Built for version {new_version}',
94 'python-nitrate-tcms.spec'
95 ])
96
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/container/init.py b/container/init.py
deleted file mode 100755
--- a/container/init.py
+++ /dev/null
@@ -1,89 +0,0 @@
-#!/usr/bin/python3
-
-import logging
-import os
-import time
-
-logging.basicConfig(
- level=logging.INFO,
- format='%(asctime)s %(levelname)s %(name)s: %(message)s'
-)
-logger = logging.getLogger('entrypoint')
-
-import django
-django.setup()
-
-from django.contrib.auth.models import User
-from django.core.management import call_command
-from django.db import connection
-
-
-def create_superuser():
- username = os.environ.get('NITRATE_SUPERUSER_USERNAME')
- password = os.environ.get('NITRATE_SUPERUSER_PASSWORD')
- email = os.environ.get('NITRATE_SUPERUSER_EMAIL')
-
- if not (username and password and email):
- logger.info(
- 'NITRATE_SUPERUSER_USERNAME, NITRATE_SUPERUSER_PASSWORD and NITRATE_SUPERUSER_EMAIL are not set. '
- 'Skip creating a superuser.'
- )
- return
-
- try:
- if User.objects.filter(username=username, email=email, is_superuser=True).exists():
- logger.info('Superuser %s has been created.', username)
- return
- except: # noqa
- pass
-
- try:
- User.objects.create_superuser(username, email=email, password=password)
- logger.info('Superuser %s is created successfully.', username)
- except Exception as e:
- logger.warning('Failed to create superuser %s: %s', username, e)
- logger.warning('Please check if the database is initialized properly.')
-
-
-def set_default_permissions():
- if os.environ.get('NITRATE_SET_DEFAULT_PERMS'):
- try:
- call_command('setdefaultperms')
- logger.info('Default groups are created and permissions are set to groups properly.')
- except Exception as e:
- logger.warning('Failed to run command setdefaultperms: %s', e)
- logger.warning('Please check if the database is initialized properly.')
- else:
- logger.info(
- 'Environment variable NITRATE_SET_DEFAULT_PERMS is not set. '
- 'Skip creating default groups and granting permissions to specific group.'
- )
-
-
-def migrate_db():
- if os.environ.get('NITRATE_MIGRATE_DB'):
- try:
- call_command('migrate')
- logger.info('Database is migrated successfully.')
- except Exception as e:
- logger.warning('Failed to migrate the database: %s', e)
- else:
- logger.info('Environment variable NITRATE_MIGRATE_DB is not set. Skip migrating database.')
-
-
-def wait_for_db():
- while 1:
- try:
- connection.cursor()
- except: # noqa
- logger.debug('Failed to connect to database. Sleep for a while and try again ...')
- time.sleep(0.5)
- else:
- break
-
-
-if __name__ == '__main__':
- wait_for_db()
- migrate_db()
- create_superuser()
- set_default_permissions()
diff --git a/contrib/scripts/make-release.py b/contrib/scripts/make-release.py
--- a/contrib/scripts/make-release.py
+++ b/contrib/scripts/make-release.py
@@ -2,7 +2,6 @@
import re
import argparse
-import subprocess
from pathlib import Path
from datetime import datetime
@@ -76,20 +75,3 @@
),
"utf-8",
)
-
-readme_md = Path('container/README.md')
-content = readme_md.read_text("utf-8")
-readme_md.unlink()
-readme_md.write_text(
- re.sub(r'quay.io/nitrate/nitrate:\d+\.\d+(\.\d+)?',
- f'quay.io/nitrate/nitrate:{new_version}',
- content),
- "utf-8",
-)
-
-subprocess.check_call([
- 'rpmdev-bumpspec',
- '-n', new_version,
- '-c', f'Built for version {new_version}',
- 'python-nitrate-tcms.spec'
-])
| {"golden_diff": "diff --git a/container/init.py b/container/init.py\ndeleted file mode 100755\n--- a/container/init.py\n+++ /dev/null\n@@ -1,89 +0,0 @@\n-#!/usr/bin/python3\n-\n-import logging\n-import os\n-import time\n-\n-logging.basicConfig(\n- level=logging.INFO,\n- format='%(asctime)s %(levelname)s %(name)s: %(message)s'\n-)\n-logger = logging.getLogger('entrypoint')\n-\n-import django\n-django.setup()\n-\n-from django.contrib.auth.models import User\n-from django.core.management import call_command\n-from django.db import connection\n-\n-\n-def create_superuser():\n- username = os.environ.get('NITRATE_SUPERUSER_USERNAME')\n- password = os.environ.get('NITRATE_SUPERUSER_PASSWORD')\n- email = os.environ.get('NITRATE_SUPERUSER_EMAIL')\n-\n- if not (username and password and email):\n- logger.info(\n- 'NITRATE_SUPERUSER_USERNAME, NITRATE_SUPERUSER_PASSWORD and NITRATE_SUPERUSER_EMAIL are not set. '\n- 'Skip creating a superuser.'\n- )\n- return\n-\n- try:\n- if User.objects.filter(username=username, email=email, is_superuser=True).exists():\n- logger.info('Superuser %s has been created.', username)\n- return\n- except: # noqa\n- pass\n-\n- try:\n- User.objects.create_superuser(username, email=email, password=password)\n- logger.info('Superuser %s is created successfully.', username)\n- except Exception as e:\n- logger.warning('Failed to create superuser %s: %s', username, e)\n- logger.warning('Please check if the database is initialized properly.')\n-\n-\n-def set_default_permissions():\n- if os.environ.get('NITRATE_SET_DEFAULT_PERMS'):\n- try:\n- call_command('setdefaultperms')\n- logger.info('Default groups are created and permissions are set to groups properly.')\n- except Exception as e:\n- logger.warning('Failed to run command setdefaultperms: %s', e)\n- logger.warning('Please check if the database is initialized properly.')\n- else:\n- logger.info(\n- 'Environment variable NITRATE_SET_DEFAULT_PERMS is not set. '\n- 'Skip creating default groups and granting permissions to specific group.'\n- )\n-\n-\n-def migrate_db():\n- if os.environ.get('NITRATE_MIGRATE_DB'):\n- try:\n- call_command('migrate')\n- logger.info('Database is migrated successfully.')\n- except Exception as e:\n- logger.warning('Failed to migrate the database: %s', e)\n- else:\n- logger.info('Environment variable NITRATE_MIGRATE_DB is not set. Skip migrating database.')\n-\n-\n-def wait_for_db():\n- while 1:\n- try:\n- connection.cursor()\n- except: # noqa\n- logger.debug('Failed to connect to database. Sleep for a while and try again ...')\n- time.sleep(0.5)\n- else:\n- break\n-\n-\n-if __name__ == '__main__':\n- wait_for_db()\n- migrate_db()\n- create_superuser()\n- set_default_permissions()\ndiff --git a/contrib/scripts/make-release.py b/contrib/scripts/make-release.py\n--- a/contrib/scripts/make-release.py\n+++ b/contrib/scripts/make-release.py\n@@ -2,7 +2,6 @@\n \n import re\n import argparse\n-import subprocess\n from pathlib import Path\n \n from datetime import datetime\n@@ -76,20 +75,3 @@\n ),\n \"utf-8\",\n )\n-\n-readme_md = Path('container/README.md')\n-content = readme_md.read_text(\"utf-8\")\n-readme_md.unlink()\n-readme_md.write_text(\n- re.sub(r'quay.io/nitrate/nitrate:\\d+\\.\\d+(\\.\\d+)?',\n- f'quay.io/nitrate/nitrate:{new_version}',\n- content),\n- \"utf-8\",\n-)\n-\n-subprocess.check_call([\n- 'rpmdev-bumpspec',\n- '-n', new_version,\n- '-c', f'Built for version {new_version}',\n- 'python-nitrate-tcms.spec'\n-])\n", "issue": "Drop RPM package build completely\nMajor reason and consideration:\r\n\r\n- reduce the effort to maintain the builds\r\n- easy to pin the dependencies\r\n- make it clear to install and distribute via container images\r\n\r\nAC:\r\n\r\n- [x] Remove from CI\r\n- [ ] Remove the Fedora Copr project\r\n- [x] Refactor the Containerfile to build images directly from the source tree\r\n- [x] Update README and documentation to remove the content about RPM packages\n", "before_files": [{"content": "#!/usr/bin/python3\n\nimport logging\nimport os\nimport time\n\nlogging.basicConfig(\n level=logging.INFO,\n format='%(asctime)s %(levelname)s %(name)s: %(message)s'\n)\nlogger = logging.getLogger('entrypoint')\n\nimport django\ndjango.setup()\n\nfrom django.contrib.auth.models import User\nfrom django.core.management import call_command\nfrom django.db import connection\n\n\ndef create_superuser():\n username = os.environ.get('NITRATE_SUPERUSER_USERNAME')\n password = os.environ.get('NITRATE_SUPERUSER_PASSWORD')\n email = os.environ.get('NITRATE_SUPERUSER_EMAIL')\n\n if not (username and password and email):\n logger.info(\n 'NITRATE_SUPERUSER_USERNAME, NITRATE_SUPERUSER_PASSWORD and NITRATE_SUPERUSER_EMAIL are not set. '\n 'Skip creating a superuser.'\n )\n return\n\n try:\n if User.objects.filter(username=username, email=email, is_superuser=True).exists():\n logger.info('Superuser %s has been created.', username)\n return\n except: # noqa\n pass\n\n try:\n User.objects.create_superuser(username, email=email, password=password)\n logger.info('Superuser %s is created successfully.', username)\n except Exception as e:\n logger.warning('Failed to create superuser %s: %s', username, e)\n logger.warning('Please check if the database is initialized properly.')\n\n\ndef set_default_permissions():\n if os.environ.get('NITRATE_SET_DEFAULT_PERMS'):\n try:\n call_command('setdefaultperms')\n logger.info('Default groups are created and permissions are set to groups properly.')\n except Exception as e:\n logger.warning('Failed to run command setdefaultperms: %s', e)\n logger.warning('Please check if the database is initialized properly.')\n else:\n logger.info(\n 'Environment variable NITRATE_SET_DEFAULT_PERMS is not set. '\n 'Skip creating default groups and granting permissions to specific group.'\n )\n\n\ndef migrate_db():\n if os.environ.get('NITRATE_MIGRATE_DB'):\n try:\n call_command('migrate')\n logger.info('Database is migrated successfully.')\n except Exception as e:\n logger.warning('Failed to migrate the database: %s', e)\n else:\n logger.info('Environment variable NITRATE_MIGRATE_DB is not set. Skip migrating database.')\n\n\ndef wait_for_db():\n while 1:\n try:\n connection.cursor()\n except: # noqa\n logger.debug('Failed to connect to database. Sleep for a while and try again ...')\n time.sleep(0.5)\n else:\n break\n\n\nif __name__ == '__main__':\n wait_for_db()\n migrate_db()\n create_superuser()\n set_default_permissions()\n", "path": "container/init.py"}, {"content": "#!/usr/bin/env python3\n\nimport re\nimport argparse\nimport subprocess\nfrom pathlib import Path\n\nfrom datetime import datetime\nfrom typing import Tuple\nfrom pygit2 import Commit, Repository\n\n\ndef extract_short_log(commit: Commit) -> Tuple[str, None or str]:\n lines = commit.message.split('\\n')\n subject = lines[0]\n match = re.search(r'\\((#\\d+)\\)$', subject)\n return subject, match.groups()[0] if match else None\n\n\ndef generate_changelog(args: argparse.Namespace):\n repo: Repository = Repository(args.repo or '.')\n if args.since_version:\n release_tag = repo.revparse_single(args.since_version)\n else:\n release_tag = repo.revparse_single(repo.describe().split('-')[0])\n\n walker = repo.walk(repo.head.target)\n walker.hide(release_tag.id)\n logs = []\n found_issue_keys = []\n\n for commit in walker:\n subject, issue_key = extract_short_log(commit)\n if issue_key is not None:\n found_issue_keys.append(issue_key)\n subject = subject.replace(issue_key, f'`{issue_key}`_')\n logs.append(f'* {subject}')\n\n logs.append('')\n found_issue_keys.sort()\n for item in found_issue_keys:\n logs.append(f'.. _{item}: https://github.com/Nitrate/Nitrate/issues/{item[1:]}')\n\n return '\\n'.join(logs)\n\n\ndef validate_version(value):\n if value.startswith('v'):\n raise argparse.ArgumentTypeError('Version should not be prefixed with v.')\n return value\n\n\nparser = argparse.ArgumentParser()\nparser.add_argument('--repo', help='Path to git repository.')\nparser.add_argument('--since-version', required=False,\n type=validate_version,\n help='Collect commits since this version.')\nparser.add_argument('new_version', metavar='NEW_VERSION',\n type=validate_version,\n help='The version to be released.')\n\nargs = parser.parse_args()\nnew_version = args.new_version\n\nPath('VERSION.txt').unlink()\nPath('VERSION.txt').write_text(new_version, \"utf-8\")\n\ntemplate = Path('contrib/scripts/release-notes.tmpl.rst').read_text(\"utf-8\")\nPath(f'docs/source/releases/{new_version}.rst').write_text(\n template.format(\n new_version=new_version,\n doc_ref=new_version,\n title_marker=len(new_version) * '=',\n change_logs=generate_changelog(args),\n release_date=datetime.now().strftime('%b %d, %Y')\n ),\n \"utf-8\",\n)\n\nreadme_md = Path('container/README.md')\ncontent = readme_md.read_text(\"utf-8\")\nreadme_md.unlink()\nreadme_md.write_text(\n re.sub(r'quay.io/nitrate/nitrate:\\d+\\.\\d+(\\.\\d+)?',\n f'quay.io/nitrate/nitrate:{new_version}',\n content),\n \"utf-8\",\n)\n\nsubprocess.check_call([\n 'rpmdev-bumpspec',\n '-n', new_version,\n '-c', f'Built for version {new_version}',\n 'python-nitrate-tcms.spec'\n])\n", "path": "contrib/scripts/make-release.py"}], "after_files": [{"content": null, "path": "container/init.py"}, {"content": "#!/usr/bin/env python3\n\nimport re\nimport argparse\nfrom pathlib import Path\n\nfrom datetime import datetime\nfrom typing import Tuple\nfrom pygit2 import Commit, Repository\n\n\ndef extract_short_log(commit: Commit) -> Tuple[str, None or str]:\n lines = commit.message.split('\\n')\n subject = lines[0]\n match = re.search(r'\\((#\\d+)\\)$', subject)\n return subject, match.groups()[0] if match else None\n\n\ndef generate_changelog(args: argparse.Namespace):\n repo: Repository = Repository(args.repo or '.')\n if args.since_version:\n release_tag = repo.revparse_single(args.since_version)\n else:\n release_tag = repo.revparse_single(repo.describe().split('-')[0])\n\n walker = repo.walk(repo.head.target)\n walker.hide(release_tag.id)\n logs = []\n found_issue_keys = []\n\n for commit in walker:\n subject, issue_key = extract_short_log(commit)\n if issue_key is not None:\n found_issue_keys.append(issue_key)\n subject = subject.replace(issue_key, f'`{issue_key}`_')\n logs.append(f'* {subject}')\n\n logs.append('')\n found_issue_keys.sort()\n for item in found_issue_keys:\n logs.append(f'.. _{item}: https://github.com/Nitrate/Nitrate/issues/{item[1:]}')\n\n return '\\n'.join(logs)\n\n\ndef validate_version(value):\n if value.startswith('v'):\n raise argparse.ArgumentTypeError('Version should not be prefixed with v.')\n return value\n\n\nparser = argparse.ArgumentParser()\nparser.add_argument('--repo', help='Path to git repository.')\nparser.add_argument('--since-version', required=False,\n type=validate_version,\n help='Collect commits since this version.')\nparser.add_argument('new_version', metavar='NEW_VERSION',\n type=validate_version,\n help='The version to be released.')\n\nargs = parser.parse_args()\nnew_version = args.new_version\n\nPath('VERSION.txt').unlink()\nPath('VERSION.txt').write_text(new_version, \"utf-8\")\n\ntemplate = Path('contrib/scripts/release-notes.tmpl.rst').read_text(\"utf-8\")\nPath(f'docs/source/releases/{new_version}.rst').write_text(\n template.format(\n new_version=new_version,\n doc_ref=new_version,\n title_marker=len(new_version) * '=',\n change_logs=generate_changelog(args),\n release_date=datetime.now().strftime('%b %d, %Y')\n ),\n \"utf-8\",\n)\n", "path": "contrib/scripts/make-release.py"}]} | 1,989 | 931 |
gh_patches_debug_28021 | rasdani/github-patches | git_diff | ipython__ipython-14125 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
%gui osx hangs with ipython 8.13.2 on macOS 13.2.1
This causes ipython to hang. The process cannot be suspended with ^Z and must be killed from another shell:
```% ipython
Python 3.11.2 (v3.11.2:878ead1ac1, Feb 7 2023, 10:02:41) [Clang 13.0.0 (clang-1300.0.29.30)]
Type 'copyright', 'credits' or 'license' for more information
IPython 8.13.2 -- An enhanced Interactive Python. Type '?' for help.
In [1]: %gui osx
Installed osx event loop hook.
In [2]:
```
This does not hang:
```% ipython
Python 3.11.2 (v3.11.2:878ead1ac1, Feb 7 2023, 10:02:41) [Clang 13.0.0 (clang-1300.0.29.30)]
Type 'copyright', 'credits' or 'license' for more information
IPython 8.13.2 -- An enhanced Interactive Python. Type '?' for help.
In [1]: import matplotlib
In [2]: matplotlib.get_backend()
Installed osx event loop hook.
Out[2]: 'MacOSX'
In [3]: 2+2
Out[3]: 4
```
If the hung process is killed with ABRT, the crash report shows a deadlock:
```
hread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 libsystem_kernel.dylib 0x7ff80a34c5c2 mach_msg2_trap + 10
1 libsystem_kernel.dylib 0x7ff80a35a604 mach_msg2_internal + 82
2 libsystem_kernel.dylib 0x7ff80a353635 mach_msg_overwrite + 723
3 libsystem_kernel.dylib 0x7ff80a34c8a8 mach_msg + 19
4 CoreFoundation 0x7ff80a466cbe __CFRunLoopServiceMachPort + 145
5 CoreFoundation 0x7ff80a46572a __CFRunLoopRun + 1360
6 CoreFoundation 0x7ff80a464b60 CFRunLoopRunSpecific + 560
7 CoreFoundation 0x7ff80a4e97aa CFRunLoopRun + 40
8 libffi.dylib 0x7ff8198e3912 ffi_call_unix64 + 82
9 libffi.dylib 0x7ff8198e323a ffi_call_int + 840
[ ... same line repeated 500 times ...]
510 libffi.dylib 0x7ff8198e323a ffi_call_int + 840
511 libffi.dylib 0x7ff8198e323a ffi_call_int + 840
Thread 1:
0 libsystem_kernel.dylib 0x7ff80a34f11a __psynch_cvwait + 10
1 libsystem_pthread.dylib 0x7ff80a38b7e1 _pthread_cond_wait + 1243
2 Python 0x105a41272 0x1057e3000 + 2482802
3 Python 0x105ac4d9e 0x1057e3000 + 3022238
4 Python 0x105ac4fdc 0x1057e3000 + 3022812
5 Python 0x10589ab07 0x1057e3000 + 752391
6 Python 0x1059b72ef 0x1057e3000 + 1917679
7 Python 0x1059bd4d0 0x1057e3000 + 1942736
8 Python 0x1059ba54a 0x1057e3000 + 1930570
9 Python 0x1059bd4d0 0x1057e3000 + 1942736
10 Python 0x1059ba54a 0x1057e3000 + 1930570
11 Python 0x1059bd4d0 0x1057e3000 + 1942736
12 Python 0x10588d816 0x1057e3000 + 698390
13 Python 0x105ac5b26 0x1057e3000 + 3025702
14 Python 0x105a40ea4 0x1057e3000 + 2481828
15 libsystem_pthread.dylib 0x7ff80a38b259 _pthread_start + 125
16 libsystem_pthread.dylib 0x7ff80a386c7b thread_start + 15
Thread 2:
0 libsystem_kernel.dylib 0x7ff80a34f11a __psynch_cvwait + 10
1 libsystem_pthread.dylib 0x7ff80a38b7e1 _pthread_cond_wait + 1243
2 Python 0x105a41272 0x1057e3000 + 2482802
3 Python 0x105ac4d9e 0x1057e3000 + 3022238
4 Python 0x105ac4fdc 0x1057e3000 + 3022812
5 Python 0x10589ab07 0x1057e3000 + 752391
6 Python 0x1059b72ef 0x1057e3000 + 1917679
7 Python 0x1059bd4d0 0x1057e3000 + 1942736
8 Python 0x10588d816 0x1057e3000 + 698390
9 Python 0x1059ba54a 0x1057e3000 + 1930570
10 Python 0x1059bd4d0 0x1057e3000 + 1942736
11 Python 0x10588d816 0x1057e3000 + 698390
12 Python 0x105ac5b26 0x1057e3000 + 3025702
13 Python 0x105a40ea4 0x1057e3000 + 2481828
14 libsystem_pthread.dylib 0x7ff80a38b259 _pthread_start + 125
15 libsystem_pthread.dylib 0x7ff80a386c7b thread_start + 15
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `IPython/terminal/pt_inputhooks/osx.py`
Content:
```
1 """Inputhook for OS X
2
3 Calls NSApp / CoreFoundation APIs via ctypes.
4 """
5
6 # obj-c boilerplate from appnope, used under BSD 2-clause
7
8 import ctypes
9 import ctypes.util
10 from threading import Event
11
12 objc = ctypes.cdll.LoadLibrary(ctypes.util.find_library("objc")) # type: ignore
13
14 void_p = ctypes.c_void_p
15
16 objc.objc_getClass.restype = void_p
17 objc.sel_registerName.restype = void_p
18 objc.objc_msgSend.restype = void_p
19 objc.objc_msgSend.argtypes = [void_p, void_p]
20
21 msg = objc.objc_msgSend
22
23 def _utf8(s):
24 """ensure utf8 bytes"""
25 if not isinstance(s, bytes):
26 s = s.encode('utf8')
27 return s
28
29 def n(name):
30 """create a selector name (for ObjC methods)"""
31 return objc.sel_registerName(_utf8(name))
32
33 def C(classname):
34 """get an ObjC Class by name"""
35 return objc.objc_getClass(_utf8(classname))
36
37 # end obj-c boilerplate from appnope
38
39 # CoreFoundation C-API calls we will use:
40 CoreFoundation = ctypes.cdll.LoadLibrary(ctypes.util.find_library("CoreFoundation")) # type: ignore
41
42 CFFileDescriptorCreate = CoreFoundation.CFFileDescriptorCreate
43 CFFileDescriptorCreate.restype = void_p
44 CFFileDescriptorCreate.argtypes = [void_p, ctypes.c_int, ctypes.c_bool, void_p, void_p]
45
46 CFFileDescriptorGetNativeDescriptor = CoreFoundation.CFFileDescriptorGetNativeDescriptor
47 CFFileDescriptorGetNativeDescriptor.restype = ctypes.c_int
48 CFFileDescriptorGetNativeDescriptor.argtypes = [void_p]
49
50 CFFileDescriptorEnableCallBacks = CoreFoundation.CFFileDescriptorEnableCallBacks
51 CFFileDescriptorEnableCallBacks.restype = None
52 CFFileDescriptorEnableCallBacks.argtypes = [void_p, ctypes.c_ulong]
53
54 CFFileDescriptorCreateRunLoopSource = CoreFoundation.CFFileDescriptorCreateRunLoopSource
55 CFFileDescriptorCreateRunLoopSource.restype = void_p
56 CFFileDescriptorCreateRunLoopSource.argtypes = [void_p, void_p, void_p]
57
58 CFRunLoopGetCurrent = CoreFoundation.CFRunLoopGetCurrent
59 CFRunLoopGetCurrent.restype = void_p
60
61 CFRunLoopAddSource = CoreFoundation.CFRunLoopAddSource
62 CFRunLoopAddSource.restype = None
63 CFRunLoopAddSource.argtypes = [void_p, void_p, void_p]
64
65 CFRelease = CoreFoundation.CFRelease
66 CFRelease.restype = None
67 CFRelease.argtypes = [void_p]
68
69 CFFileDescriptorInvalidate = CoreFoundation.CFFileDescriptorInvalidate
70 CFFileDescriptorInvalidate.restype = None
71 CFFileDescriptorInvalidate.argtypes = [void_p]
72
73 # From CFFileDescriptor.h
74 kCFFileDescriptorReadCallBack = 1
75 kCFRunLoopCommonModes = void_p.in_dll(CoreFoundation, 'kCFRunLoopCommonModes')
76
77
78 def _NSApp():
79 """Return the global NSApplication instance (NSApp)"""
80 objc.objc_msgSend.argtypes = [void_p, void_p]
81 return msg(C('NSApplication'), n('sharedApplication'))
82
83
84 def _wake(NSApp):
85 """Wake the Application"""
86 objc.objc_msgSend.argtypes = [
87 void_p,
88 void_p,
89 void_p,
90 void_p,
91 void_p,
92 void_p,
93 void_p,
94 void_p,
95 void_p,
96 void_p,
97 void_p,
98 ]
99 event = msg(
100 C("NSEvent"),
101 n(
102 "otherEventWithType:location:modifierFlags:"
103 "timestamp:windowNumber:context:subtype:data1:data2:"
104 ),
105 15, # Type
106 0, # location
107 0, # flags
108 0, # timestamp
109 0, # window
110 None, # context
111 0, # subtype
112 0, # data1
113 0, # data2
114 )
115 objc.objc_msgSend.argtypes = [void_p, void_p, void_p, void_p]
116 msg(NSApp, n('postEvent:atStart:'), void_p(event), True)
117
118
119 _triggered = Event()
120
121 def _input_callback(fdref, flags, info):
122 """Callback to fire when there's input to be read"""
123 _triggered.set()
124 CFFileDescriptorInvalidate(fdref)
125 CFRelease(fdref)
126 NSApp = _NSApp()
127 objc.objc_msgSend.argtypes = [void_p, void_p, void_p]
128 msg(NSApp, n('stop:'), NSApp)
129 _wake(NSApp)
130
131 _c_callback_func_type = ctypes.CFUNCTYPE(None, void_p, void_p, void_p)
132 _c_input_callback = _c_callback_func_type(_input_callback)
133
134
135 def _stop_on_read(fd):
136 """Register callback to stop eventloop when there's data on fd"""
137 _triggered.clear()
138 fdref = CFFileDescriptorCreate(None, fd, False, _c_input_callback, None)
139 CFFileDescriptorEnableCallBacks(fdref, kCFFileDescriptorReadCallBack)
140 source = CFFileDescriptorCreateRunLoopSource(None, fdref, 0)
141 loop = CFRunLoopGetCurrent()
142 CFRunLoopAddSource(loop, source, kCFRunLoopCommonModes)
143 CFRelease(source)
144
145
146 def inputhook(context):
147 """Inputhook for Cocoa (NSApp)"""
148 NSApp = _NSApp()
149 _stop_on_read(context.fileno())
150 objc.objc_msgSend.argtypes = [void_p, void_p]
151 msg(NSApp, n('run'))
152 if not _triggered.is_set():
153 # app closed without firing callback,
154 # probably due to last window being closed.
155 # Run the loop manually in this case,
156 # since there may be events still to process (#9734)
157 CoreFoundation.CFRunLoopRun()
158
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/IPython/terminal/pt_inputhooks/osx.py b/IPython/terminal/pt_inputhooks/osx.py
--- a/IPython/terminal/pt_inputhooks/osx.py
+++ b/IPython/terminal/pt_inputhooks/osx.py
@@ -116,11 +116,8 @@
msg(NSApp, n('postEvent:atStart:'), void_p(event), True)
-_triggered = Event()
-
def _input_callback(fdref, flags, info):
"""Callback to fire when there's input to be read"""
- _triggered.set()
CFFileDescriptorInvalidate(fdref)
CFRelease(fdref)
NSApp = _NSApp()
@@ -134,7 +131,6 @@
def _stop_on_read(fd):
"""Register callback to stop eventloop when there's data on fd"""
- _triggered.clear()
fdref = CFFileDescriptorCreate(None, fd, False, _c_input_callback, None)
CFFileDescriptorEnableCallBacks(fdref, kCFFileDescriptorReadCallBack)
source = CFFileDescriptorCreateRunLoopSource(None, fdref, 0)
@@ -149,9 +145,3 @@
_stop_on_read(context.fileno())
objc.objc_msgSend.argtypes = [void_p, void_p]
msg(NSApp, n('run'))
- if not _triggered.is_set():
- # app closed without firing callback,
- # probably due to last window being closed.
- # Run the loop manually in this case,
- # since there may be events still to process (#9734)
- CoreFoundation.CFRunLoopRun()
| {"golden_diff": "diff --git a/IPython/terminal/pt_inputhooks/osx.py b/IPython/terminal/pt_inputhooks/osx.py\n--- a/IPython/terminal/pt_inputhooks/osx.py\n+++ b/IPython/terminal/pt_inputhooks/osx.py\n@@ -116,11 +116,8 @@\n msg(NSApp, n('postEvent:atStart:'), void_p(event), True)\n \n \n-_triggered = Event()\n-\n def _input_callback(fdref, flags, info):\n \"\"\"Callback to fire when there's input to be read\"\"\"\n- _triggered.set()\n CFFileDescriptorInvalidate(fdref)\n CFRelease(fdref)\n NSApp = _NSApp()\n@@ -134,7 +131,6 @@\n \n def _stop_on_read(fd):\n \"\"\"Register callback to stop eventloop when there's data on fd\"\"\"\n- _triggered.clear()\n fdref = CFFileDescriptorCreate(None, fd, False, _c_input_callback, None)\n CFFileDescriptorEnableCallBacks(fdref, kCFFileDescriptorReadCallBack)\n source = CFFileDescriptorCreateRunLoopSource(None, fdref, 0)\n@@ -149,9 +145,3 @@\n _stop_on_read(context.fileno())\n objc.objc_msgSend.argtypes = [void_p, void_p]\n msg(NSApp, n('run'))\n- if not _triggered.is_set():\n- # app closed without firing callback,\n- # probably due to last window being closed.\n- # Run the loop manually in this case,\n- # since there may be events still to process (#9734)\n- CoreFoundation.CFRunLoopRun()\n", "issue": "%gui osx hangs with ipython 8.13.2 on macOS 13.2.1\nThis causes ipython to hang. The process cannot be suspended with ^Z and must be killed from another shell:\r\n\r\n```% ipython\r\nPython 3.11.2 (v3.11.2:878ead1ac1, Feb 7 2023, 10:02:41) [Clang 13.0.0 (clang-1300.0.29.30)]\r\nType 'copyright', 'credits' or 'license' for more information\r\nIPython 8.13.2 -- An enhanced Interactive Python. Type '?' for help.\r\n\r\nIn [1]: %gui osx\r\nInstalled osx event loop hook.\r\n\r\nIn [2]: \r\n```\r\n\r\nThis does not hang:\r\n```% ipython\r\nPython 3.11.2 (v3.11.2:878ead1ac1, Feb 7 2023, 10:02:41) [Clang 13.0.0 (clang-1300.0.29.30)]\r\nType 'copyright', 'credits' or 'license' for more information\r\nIPython 8.13.2 -- An enhanced Interactive Python. Type '?' for help.\r\n\r\nIn [1]: import matplotlib\r\n\r\nIn [2]: matplotlib.get_backend()\r\nInstalled osx event loop hook.\r\nOut[2]: 'MacOSX'\r\n\r\nIn [3]: 2+2\r\nOut[3]: 4\r\n```\r\nIf the hung process is killed with ABRT, the crash report shows a deadlock:\r\n```\r\nhread 0 Crashed:: Dispatch queue: com.apple.main-thread\r\n0 libsystem_kernel.dylib \t 0x7ff80a34c5c2 mach_msg2_trap + 10\r\n1 libsystem_kernel.dylib \t 0x7ff80a35a604 mach_msg2_internal + 82\r\n2 libsystem_kernel.dylib \t 0x7ff80a353635 mach_msg_overwrite + 723\r\n3 libsystem_kernel.dylib \t 0x7ff80a34c8a8 mach_msg + 19\r\n4 CoreFoundation \t 0x7ff80a466cbe __CFRunLoopServiceMachPort + 145\r\n5 CoreFoundation \t 0x7ff80a46572a __CFRunLoopRun + 1360\r\n6 CoreFoundation \t 0x7ff80a464b60 CFRunLoopRunSpecific + 560\r\n7 CoreFoundation \t 0x7ff80a4e97aa CFRunLoopRun + 40\r\n8 libffi.dylib \t 0x7ff8198e3912 ffi_call_unix64 + 82\r\n9 libffi.dylib \t 0x7ff8198e323a ffi_call_int + 840\r\n[ ... same line repeated 500 times ...]\r\n510 libffi.dylib \t 0x7ff8198e323a ffi_call_int + 840\r\n511 libffi.dylib \t 0x7ff8198e323a ffi_call_int + 840\r\n\r\nThread 1:\r\n0 libsystem_kernel.dylib \t 0x7ff80a34f11a __psynch_cvwait + 10\r\n1 libsystem_pthread.dylib \t 0x7ff80a38b7e1 _pthread_cond_wait + 1243\r\n2 Python \t 0x105a41272 0x1057e3000 + 2482802\r\n3 Python \t 0x105ac4d9e 0x1057e3000 + 3022238\r\n4 Python \t 0x105ac4fdc 0x1057e3000 + 3022812\r\n5 Python \t 0x10589ab07 0x1057e3000 + 752391\r\n6 Python \t 0x1059b72ef 0x1057e3000 + 1917679\r\n7 Python \t 0x1059bd4d0 0x1057e3000 + 1942736\r\n8 Python \t 0x1059ba54a 0x1057e3000 + 1930570\r\n9 Python \t 0x1059bd4d0 0x1057e3000 + 1942736\r\n10 Python \t 0x1059ba54a 0x1057e3000 + 1930570\r\n11 Python \t 0x1059bd4d0 0x1057e3000 + 1942736\r\n12 Python \t 0x10588d816 0x1057e3000 + 698390\r\n13 Python \t 0x105ac5b26 0x1057e3000 + 3025702\r\n14 Python \t 0x105a40ea4 0x1057e3000 + 2481828\r\n15 libsystem_pthread.dylib \t 0x7ff80a38b259 _pthread_start + 125\r\n16 libsystem_pthread.dylib \t 0x7ff80a386c7b thread_start + 15\r\n\r\nThread 2:\r\n0 libsystem_kernel.dylib \t 0x7ff80a34f11a __psynch_cvwait + 10\r\n1 libsystem_pthread.dylib \t 0x7ff80a38b7e1 _pthread_cond_wait + 1243\r\n2 Python \t 0x105a41272 0x1057e3000 + 2482802\r\n3 Python \t 0x105ac4d9e 0x1057e3000 + 3022238\r\n4 Python \t 0x105ac4fdc 0x1057e3000 + 3022812\r\n5 Python \t 0x10589ab07 0x1057e3000 + 752391\r\n6 Python \t 0x1059b72ef 0x1057e3000 + 1917679\r\n7 Python \t 0x1059bd4d0 0x1057e3000 + 1942736\r\n8 Python \t 0x10588d816 0x1057e3000 + 698390\r\n9 Python \t 0x1059ba54a 0x1057e3000 + 1930570\r\n10 Python \t 0x1059bd4d0 0x1057e3000 + 1942736\r\n11 Python \t 0x10588d816 0x1057e3000 + 698390\r\n12 Python \t 0x105ac5b26 0x1057e3000 + 3025702\r\n13 Python \t 0x105a40ea4 0x1057e3000 + 2481828\r\n14 libsystem_pthread.dylib \t 0x7ff80a38b259 _pthread_start + 125\r\n15 libsystem_pthread.dylib \t 0x7ff80a386c7b thread_start + 15\r\n\n", "before_files": [{"content": "\"\"\"Inputhook for OS X\n\nCalls NSApp / CoreFoundation APIs via ctypes.\n\"\"\"\n\n# obj-c boilerplate from appnope, used under BSD 2-clause\n\nimport ctypes\nimport ctypes.util\nfrom threading import Event\n\nobjc = ctypes.cdll.LoadLibrary(ctypes.util.find_library(\"objc\")) # type: ignore\n\nvoid_p = ctypes.c_void_p\n\nobjc.objc_getClass.restype = void_p\nobjc.sel_registerName.restype = void_p\nobjc.objc_msgSend.restype = void_p\nobjc.objc_msgSend.argtypes = [void_p, void_p]\n\nmsg = objc.objc_msgSend\n\ndef _utf8(s):\n \"\"\"ensure utf8 bytes\"\"\"\n if not isinstance(s, bytes):\n s = s.encode('utf8')\n return s\n\ndef n(name):\n \"\"\"create a selector name (for ObjC methods)\"\"\"\n return objc.sel_registerName(_utf8(name))\n\ndef C(classname):\n \"\"\"get an ObjC Class by name\"\"\"\n return objc.objc_getClass(_utf8(classname))\n\n# end obj-c boilerplate from appnope\n\n# CoreFoundation C-API calls we will use:\nCoreFoundation = ctypes.cdll.LoadLibrary(ctypes.util.find_library(\"CoreFoundation\")) # type: ignore\n\nCFFileDescriptorCreate = CoreFoundation.CFFileDescriptorCreate\nCFFileDescriptorCreate.restype = void_p\nCFFileDescriptorCreate.argtypes = [void_p, ctypes.c_int, ctypes.c_bool, void_p, void_p]\n\nCFFileDescriptorGetNativeDescriptor = CoreFoundation.CFFileDescriptorGetNativeDescriptor\nCFFileDescriptorGetNativeDescriptor.restype = ctypes.c_int\nCFFileDescriptorGetNativeDescriptor.argtypes = [void_p]\n\nCFFileDescriptorEnableCallBacks = CoreFoundation.CFFileDescriptorEnableCallBacks\nCFFileDescriptorEnableCallBacks.restype = None\nCFFileDescriptorEnableCallBacks.argtypes = [void_p, ctypes.c_ulong]\n\nCFFileDescriptorCreateRunLoopSource = CoreFoundation.CFFileDescriptorCreateRunLoopSource\nCFFileDescriptorCreateRunLoopSource.restype = void_p\nCFFileDescriptorCreateRunLoopSource.argtypes = [void_p, void_p, void_p]\n\nCFRunLoopGetCurrent = CoreFoundation.CFRunLoopGetCurrent\nCFRunLoopGetCurrent.restype = void_p\n\nCFRunLoopAddSource = CoreFoundation.CFRunLoopAddSource\nCFRunLoopAddSource.restype = None\nCFRunLoopAddSource.argtypes = [void_p, void_p, void_p]\n\nCFRelease = CoreFoundation.CFRelease\nCFRelease.restype = None\nCFRelease.argtypes = [void_p]\n\nCFFileDescriptorInvalidate = CoreFoundation.CFFileDescriptorInvalidate\nCFFileDescriptorInvalidate.restype = None\nCFFileDescriptorInvalidate.argtypes = [void_p]\n\n# From CFFileDescriptor.h\nkCFFileDescriptorReadCallBack = 1\nkCFRunLoopCommonModes = void_p.in_dll(CoreFoundation, 'kCFRunLoopCommonModes')\n\n\ndef _NSApp():\n \"\"\"Return the global NSApplication instance (NSApp)\"\"\"\n objc.objc_msgSend.argtypes = [void_p, void_p]\n return msg(C('NSApplication'), n('sharedApplication'))\n\n\ndef _wake(NSApp):\n \"\"\"Wake the Application\"\"\"\n objc.objc_msgSend.argtypes = [\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n ]\n event = msg(\n C(\"NSEvent\"),\n n(\n \"otherEventWithType:location:modifierFlags:\"\n \"timestamp:windowNumber:context:subtype:data1:data2:\"\n ),\n 15, # Type\n 0, # location\n 0, # flags\n 0, # timestamp\n 0, # window\n None, # context\n 0, # subtype\n 0, # data1\n 0, # data2\n )\n objc.objc_msgSend.argtypes = [void_p, void_p, void_p, void_p]\n msg(NSApp, n('postEvent:atStart:'), void_p(event), True)\n\n\n_triggered = Event()\n\ndef _input_callback(fdref, flags, info):\n \"\"\"Callback to fire when there's input to be read\"\"\"\n _triggered.set()\n CFFileDescriptorInvalidate(fdref)\n CFRelease(fdref)\n NSApp = _NSApp()\n objc.objc_msgSend.argtypes = [void_p, void_p, void_p]\n msg(NSApp, n('stop:'), NSApp)\n _wake(NSApp)\n\n_c_callback_func_type = ctypes.CFUNCTYPE(None, void_p, void_p, void_p)\n_c_input_callback = _c_callback_func_type(_input_callback)\n\n\ndef _stop_on_read(fd):\n \"\"\"Register callback to stop eventloop when there's data on fd\"\"\"\n _triggered.clear()\n fdref = CFFileDescriptorCreate(None, fd, False, _c_input_callback, None)\n CFFileDescriptorEnableCallBacks(fdref, kCFFileDescriptorReadCallBack)\n source = CFFileDescriptorCreateRunLoopSource(None, fdref, 0)\n loop = CFRunLoopGetCurrent()\n CFRunLoopAddSource(loop, source, kCFRunLoopCommonModes)\n CFRelease(source)\n\n\ndef inputhook(context):\n \"\"\"Inputhook for Cocoa (NSApp)\"\"\"\n NSApp = _NSApp()\n _stop_on_read(context.fileno())\n objc.objc_msgSend.argtypes = [void_p, void_p]\n msg(NSApp, n('run'))\n if not _triggered.is_set():\n # app closed without firing callback,\n # probably due to last window being closed.\n # Run the loop manually in this case,\n # since there may be events still to process (#9734)\n CoreFoundation.CFRunLoopRun()\n", "path": "IPython/terminal/pt_inputhooks/osx.py"}], "after_files": [{"content": "\"\"\"Inputhook for OS X\n\nCalls NSApp / CoreFoundation APIs via ctypes.\n\"\"\"\n\n# obj-c boilerplate from appnope, used under BSD 2-clause\n\nimport ctypes\nimport ctypes.util\nfrom threading import Event\n\nobjc = ctypes.cdll.LoadLibrary(ctypes.util.find_library(\"objc\")) # type: ignore\n\nvoid_p = ctypes.c_void_p\n\nobjc.objc_getClass.restype = void_p\nobjc.sel_registerName.restype = void_p\nobjc.objc_msgSend.restype = void_p\nobjc.objc_msgSend.argtypes = [void_p, void_p]\n\nmsg = objc.objc_msgSend\n\ndef _utf8(s):\n \"\"\"ensure utf8 bytes\"\"\"\n if not isinstance(s, bytes):\n s = s.encode('utf8')\n return s\n\ndef n(name):\n \"\"\"create a selector name (for ObjC methods)\"\"\"\n return objc.sel_registerName(_utf8(name))\n\ndef C(classname):\n \"\"\"get an ObjC Class by name\"\"\"\n return objc.objc_getClass(_utf8(classname))\n\n# end obj-c boilerplate from appnope\n\n# CoreFoundation C-API calls we will use:\nCoreFoundation = ctypes.cdll.LoadLibrary(ctypes.util.find_library(\"CoreFoundation\")) # type: ignore\n\nCFFileDescriptorCreate = CoreFoundation.CFFileDescriptorCreate\nCFFileDescriptorCreate.restype = void_p\nCFFileDescriptorCreate.argtypes = [void_p, ctypes.c_int, ctypes.c_bool, void_p, void_p]\n\nCFFileDescriptorGetNativeDescriptor = CoreFoundation.CFFileDescriptorGetNativeDescriptor\nCFFileDescriptorGetNativeDescriptor.restype = ctypes.c_int\nCFFileDescriptorGetNativeDescriptor.argtypes = [void_p]\n\nCFFileDescriptorEnableCallBacks = CoreFoundation.CFFileDescriptorEnableCallBacks\nCFFileDescriptorEnableCallBacks.restype = None\nCFFileDescriptorEnableCallBacks.argtypes = [void_p, ctypes.c_ulong]\n\nCFFileDescriptorCreateRunLoopSource = CoreFoundation.CFFileDescriptorCreateRunLoopSource\nCFFileDescriptorCreateRunLoopSource.restype = void_p\nCFFileDescriptorCreateRunLoopSource.argtypes = [void_p, void_p, void_p]\n\nCFRunLoopGetCurrent = CoreFoundation.CFRunLoopGetCurrent\nCFRunLoopGetCurrent.restype = void_p\n\nCFRunLoopAddSource = CoreFoundation.CFRunLoopAddSource\nCFRunLoopAddSource.restype = None\nCFRunLoopAddSource.argtypes = [void_p, void_p, void_p]\n\nCFRelease = CoreFoundation.CFRelease\nCFRelease.restype = None\nCFRelease.argtypes = [void_p]\n\nCFFileDescriptorInvalidate = CoreFoundation.CFFileDescriptorInvalidate\nCFFileDescriptorInvalidate.restype = None\nCFFileDescriptorInvalidate.argtypes = [void_p]\n\n# From CFFileDescriptor.h\nkCFFileDescriptorReadCallBack = 1\nkCFRunLoopCommonModes = void_p.in_dll(CoreFoundation, 'kCFRunLoopCommonModes')\n\n\ndef _NSApp():\n \"\"\"Return the global NSApplication instance (NSApp)\"\"\"\n objc.objc_msgSend.argtypes = [void_p, void_p]\n return msg(C('NSApplication'), n('sharedApplication'))\n\n\ndef _wake(NSApp):\n \"\"\"Wake the Application\"\"\"\n objc.objc_msgSend.argtypes = [\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n void_p,\n ]\n event = msg(\n C(\"NSEvent\"),\n n(\n \"otherEventWithType:location:modifierFlags:\"\n \"timestamp:windowNumber:context:subtype:data1:data2:\"\n ),\n 15, # Type\n 0, # location\n 0, # flags\n 0, # timestamp\n 0, # window\n None, # context\n 0, # subtype\n 0, # data1\n 0, # data2\n )\n objc.objc_msgSend.argtypes = [void_p, void_p, void_p, void_p]\n msg(NSApp, n('postEvent:atStart:'), void_p(event), True)\n\n\ndef _input_callback(fdref, flags, info):\n \"\"\"Callback to fire when there's input to be read\"\"\"\n CFFileDescriptorInvalidate(fdref)\n CFRelease(fdref)\n NSApp = _NSApp()\n objc.objc_msgSend.argtypes = [void_p, void_p, void_p]\n msg(NSApp, n('stop:'), NSApp)\n _wake(NSApp)\n\n_c_callback_func_type = ctypes.CFUNCTYPE(None, void_p, void_p, void_p)\n_c_input_callback = _c_callback_func_type(_input_callback)\n\n\ndef _stop_on_read(fd):\n \"\"\"Register callback to stop eventloop when there's data on fd\"\"\"\n fdref = CFFileDescriptorCreate(None, fd, False, _c_input_callback, None)\n CFFileDescriptorEnableCallBacks(fdref, kCFFileDescriptorReadCallBack)\n source = CFFileDescriptorCreateRunLoopSource(None, fdref, 0)\n loop = CFRunLoopGetCurrent()\n CFRunLoopAddSource(loop, source, kCFRunLoopCommonModes)\n CFRelease(source)\n\n\ndef inputhook(context):\n \"\"\"Inputhook for Cocoa (NSApp)\"\"\"\n NSApp = _NSApp()\n _stop_on_read(context.fileno())\n objc.objc_msgSend.argtypes = [void_p, void_p]\n msg(NSApp, n('run'))\n", "path": "IPython/terminal/pt_inputhooks/osx.py"}]} | 3,892 | 367 |
gh_patches_debug_18654 | rasdani/github-patches | git_diff | digitalfabrik__integreat-cms-430 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Move dev dependencies from setup.py to Pipfile
The dev extra dependencies in setup.py are not required anymore, because we don't install it with setuptools for local development. Instead, we could move all dependencies in the `extras_require`-section to the Pipfile, which would have the advantage to be able to install new dev dependencies with `pipenv install <new-dependency>`.
Move dev dependencies from setup.py to Pipfile
The dev extra dependencies in setup.py are not required anymore, because we don't install it with setuptools for local development. Instead, we could move all dependencies in the `extras_require`-section to the Pipfile, which would have the advantage to be able to install new dev dependencies with `pipenv install <new-dependency>`.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python3
2 """ Setup.py """
3
4 import os
5 import sys
6
7 from setuptools import find_packages, setup
8
9 # Add source directory to PATH variable to enable import of version number
10 sys.path.append(os.path.abspath('src'))
11 # pylint: disable=wrong-import-position
12 from backend.settings import VERSION
13
14 setup(
15 name='integreat_cms',
16 version=VERSION,
17 packages=find_packages('src'),
18 package_dir={'': 'src'},
19 include_package_data=True,
20 scripts=['src/integreat-cms-cli'],
21 data_files=[
22 (f'lib/integreat-{root}', [os.path.join(root, f) for f in files])
23 for root, _, files in os.walk('src/cms/templates/')
24 ] + [
25 (f'lib/integreat-{root}', [os.path.join(root, f) for f in files])
26 for root, _, files in os.walk('src/cms/static/')
27 ] + [
28 ('usr/lib/systemd/system/', ['systemd/[email protected]'])
29 ],
30 install_requires=[
31 'cffi',
32 'Django~=2.2.13',
33 'django-cors-headers',
34 'django-filer',
35 'django-mptt',
36 'django-widget-tweaks',
37 'idna',
38 'lxml',
39 'psycopg2-binary',
40 'python-dateutil',
41 'requests',
42 'rules',
43 'six',
44 'webauthn',
45 ],
46 extras_require={
47 'dev': [
48 'django-compressor',
49 'django-compressor-toolkit',
50 'packaging',
51 'pylint',
52 'pylint-django',
53 'pylint_runner',
54 'sphinx',
55 'sphinxcontrib-django',
56 'sphinx_rtd_theme',
57 'coverage',
58 'django_coverage_plugin',
59 ]
60 },
61 author='Integreat App Project',
62 author_email='[email protected]',
63 description='Content Management System for the Integreat App',
64 license='GPL-2.0-or-later',
65 keywords='Django Integreat CMS',
66 url='http://github.com/Integreat/',
67 classifiers=[
68 'Development Status :: 5 - Production/Stable',
69 'Intended Audience :: Developers',
70 'Programming Language :: Python :: 3.7',
71 ]
72 )
73
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -12,7 +12,7 @@
from backend.settings import VERSION
setup(
- name='integreat_cms',
+ name='integreat-cms',
version=VERSION,
packages=find_packages('src'),
package_dir={'': 'src'},
@@ -43,21 +43,6 @@
'six',
'webauthn',
],
- extras_require={
- 'dev': [
- 'django-compressor',
- 'django-compressor-toolkit',
- 'packaging',
- 'pylint',
- 'pylint-django',
- 'pylint_runner',
- 'sphinx',
- 'sphinxcontrib-django',
- 'sphinx_rtd_theme',
- 'coverage',
- 'django_coverage_plugin',
- ]
- },
author='Integreat App Project',
author_email='[email protected]',
description='Content Management System for the Integreat App',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -12,7 +12,7 @@\n from backend.settings import VERSION\n \n setup(\n- name='integreat_cms',\n+ name='integreat-cms',\n version=VERSION,\n packages=find_packages('src'),\n package_dir={'': 'src'},\n@@ -43,21 +43,6 @@\n 'six',\n 'webauthn',\n ],\n- extras_require={\n- 'dev': [\n- 'django-compressor',\n- 'django-compressor-toolkit',\n- 'packaging',\n- 'pylint',\n- 'pylint-django',\n- 'pylint_runner',\n- 'sphinx',\n- 'sphinxcontrib-django',\n- 'sphinx_rtd_theme',\n- 'coverage',\n- 'django_coverage_plugin',\n- ]\n- },\n author='Integreat App Project',\n author_email='[email protected]',\n description='Content Management System for the Integreat App',\n", "issue": "Move dev dependencies from setup.py to Pipfile\nThe dev extra dependencies in setup.py are not required anymore, because we don't install it with setuptools for local development. Instead, we could move all dependencies in the `extras_require`-section to the Pipfile, which would have the advantage to be able to install new dev dependencies with `pipenv install <new-dependency>`.\nMove dev dependencies from setup.py to Pipfile\nThe dev extra dependencies in setup.py are not required anymore, because we don't install it with setuptools for local development. Instead, we could move all dependencies in the `extras_require`-section to the Pipfile, which would have the advantage to be able to install new dev dependencies with `pipenv install <new-dependency>`.\n", "before_files": [{"content": "#!/usr/bin/env python3\n\"\"\" Setup.py \"\"\"\n\nimport os\nimport sys\n\nfrom setuptools import find_packages, setup\n\n# Add source directory to PATH variable to enable import of version number\nsys.path.append(os.path.abspath('src'))\n# pylint: disable=wrong-import-position\nfrom backend.settings import VERSION\n\nsetup(\n name='integreat_cms',\n version=VERSION,\n packages=find_packages('src'),\n package_dir={'': 'src'},\n include_package_data=True,\n scripts=['src/integreat-cms-cli'],\n data_files=[\n (f'lib/integreat-{root}', [os.path.join(root, f) for f in files])\n for root, _, files in os.walk('src/cms/templates/')\n ] + [\n (f'lib/integreat-{root}', [os.path.join(root, f) for f in files])\n for root, _, files in os.walk('src/cms/static/')\n ] + [\n ('usr/lib/systemd/system/', ['systemd/[email protected]'])\n ],\n install_requires=[\n 'cffi',\n 'Django~=2.2.13',\n 'django-cors-headers',\n 'django-filer',\n 'django-mptt',\n 'django-widget-tweaks',\n 'idna',\n 'lxml',\n 'psycopg2-binary',\n 'python-dateutil',\n 'requests',\n 'rules',\n 'six',\n 'webauthn',\n ],\n extras_require={\n 'dev': [\n 'django-compressor',\n 'django-compressor-toolkit',\n 'packaging',\n 'pylint',\n 'pylint-django',\n 'pylint_runner',\n 'sphinx',\n 'sphinxcontrib-django',\n 'sphinx_rtd_theme',\n 'coverage',\n 'django_coverage_plugin',\n ]\n },\n author='Integreat App Project',\n author_email='[email protected]',\n description='Content Management System for the Integreat App',\n license='GPL-2.0-or-later',\n keywords='Django Integreat CMS',\n url='http://github.com/Integreat/',\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n 'Intended Audience :: Developers',\n 'Programming Language :: Python :: 3.7',\n ]\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\"\"\" Setup.py \"\"\"\n\nimport os\nimport sys\n\nfrom setuptools import find_packages, setup\n\n# Add source directory to PATH variable to enable import of version number\nsys.path.append(os.path.abspath('src'))\n# pylint: disable=wrong-import-position\nfrom backend.settings import VERSION\n\nsetup(\n name='integreat-cms',\n version=VERSION,\n packages=find_packages('src'),\n package_dir={'': 'src'},\n include_package_data=True,\n scripts=['src/integreat-cms-cli'],\n data_files=[\n (f'lib/integreat-{root}', [os.path.join(root, f) for f in files])\n for root, _, files in os.walk('src/cms/templates/')\n ] + [\n (f'lib/integreat-{root}', [os.path.join(root, f) for f in files])\n for root, _, files in os.walk('src/cms/static/')\n ] + [\n ('usr/lib/systemd/system/', ['systemd/[email protected]'])\n ],\n install_requires=[\n 'cffi',\n 'Django~=2.2.13',\n 'django-cors-headers',\n 'django-filer',\n 'django-mptt',\n 'django-widget-tweaks',\n 'idna',\n 'lxml',\n 'psycopg2-binary',\n 'python-dateutil',\n 'requests',\n 'rules',\n 'six',\n 'webauthn',\n ],\n author='Integreat App Project',\n author_email='[email protected]',\n description='Content Management System for the Integreat App',\n license='GPL-2.0-or-later',\n keywords='Django Integreat CMS',\n url='http://github.com/Integreat/',\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n 'Intended Audience :: Developers',\n 'Programming Language :: Python :: 3.7',\n ]\n)\n", "path": "setup.py"}]} | 1,055 | 235 |
gh_patches_debug_17962 | rasdani/github-patches | git_diff | hylang__hy-2115 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Python 3.10 on Windows fails: AttributeError: module 'collections' has no attribute 'Callable'
Thanks to https://github.com/pyreadline/pyreadline/issues/65, Hy on Python 3.10 on Windows no longer starts, and a backtrace similar to the following is the result:
```console
Traceback (most recent call last):
File "c:\python\lib\runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "c:\python\lib\runpy.py", line 86, in _run_code
exec(code, run_globals)
File "C:\Python\Scripts\hy.exe\__main__.py", line 4, in <module>
File "c:\python\lib\site-packages\hy\cmdline.py", line 37, in <module>
from hy.completer import completion, Completer
File "c:\python\lib\site-packages\hy\completer.py", line 18, in <module>
import readline
File "c:\python\lib\site-packages\readline.py", line 34, in <module>
rl = Readline()
File "c:\python\lib\site-packages\pyreadline\rlmain.py", line 422, in __init__
BaseReadline.__init__(self)
File "c:\python\lib\site-packages\pyreadline\rlmain.py", line 62, in __init__
mode.init_editing_mode(None)
File "c:\python\lib\site-packages\pyreadline\modes\emacs.py", line 633, in init_editing_mode
self._bind_key('space', self.self_insert)
File "c:\python\lib\site-packages\pyreadline\modes\basemode.py", line 162, in _bind_key
if not callable(func):
File "c:\python\lib\site-packages\pyreadline\py3k_compat.py", line 8, in callable
return isinstance(x, collections.Callable)
AttributeError: module 'collections' has no attribute 'Callable'
```
Unfortunately from that bug (and the repository in general), it would appear that `pyreadline` is no longer actively maintained (last update of any kind was in 2015), so for Hy to continue to support Windows on future Python versions will require some amount of workaround (a fork of `pyreadline`, dropping readline support on Windows, etc).
I'm not sure if there's a way to specify that Python 3.10+ and Windows are simply mutually incompatible, but that would be the "simplest" workaround if there's a clean way to specify that. :see_no_evil:
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 # Copyright 2021 the authors.
3 # This file is part of Hy, which is free software licensed under the Expat
4 # license. See the LICENSE.
5
6 import glob
7 import importlib
8 import inspect
9 import os
10 import sys
11
12 from setuptools import find_packages, setup
13 from setuptools.command.install import install
14 import fastentrypoints # Monkey-patches setuptools.
15
16 from get_version import __version__
17
18 os.chdir(os.path.split(os.path.abspath(__file__))[0])
19
20 PKG = "hy"
21
22 long_description = """Hy is a Python <--> Lisp layer. It helps
23 make things work nicer, and lets Python and the Hy lisp variant play
24 nice together. """
25
26 class Install(install):
27 def __compile_hy_bytecode(self):
28 for path in sorted(glob.iglob('hy/**.hy', recursive=True)):
29 importlib.util.cache_from_source(path, optimize=self.optimize)
30
31 def run(self):
32 # Don't bother messing around with deps if they wouldn't be installed anyway.
33 # Code is based on setuptools's install.py.
34 if not (self.old_and_unmanageable or self.single_version_externally_managed
35 or not self._called_from_setup(inspect.currentframe())):
36 easy_install = self.distribution.get_command_class('easy_install')
37
38 cmd = easy_install(
39 self.distribution, args="x", root=self.root, record=self.record,
40 )
41 cmd.ensure_finalized()
42 cmd.always_copy_from = '.'
43 cmd.package_index.scan(glob.glob('*.egg'))
44
45 cmd.args = self.distribution.install_requires
46
47 # Avoid deprecation warnings on new setuptools versions.
48 if 'show_deprecation' in inspect.signature(cmd.run).parameters:
49 cmd.run(show_deprecation=False)
50 else:
51 cmd.run()
52
53 # Make sure any new packages get picked up.
54 import site
55 importlib.reload(site)
56 importlib.invalidate_caches()
57
58 self.__compile_hy_bytecode()
59
60 # The deps won't be reinstalled because of:
61 # https://github.com/pypa/setuptools/issues/456
62 return install.run(self)
63
64 setup(
65 name=PKG,
66 version=__version__,
67 install_requires=[
68 'rply>=0.7.7',
69 'funcparserlib>=0.3.6',
70 'colorama',
71 'astor>=0.8 ; python_version < "3.9"',
72 'pyreadline>=2.1 ; os_name == "nt"',
73 ],
74 cmdclass=dict(install=Install),
75 entry_points={
76 'console_scripts': [
77 'hy = hy.cmdline:hy_main',
78 'hy3 = hy.cmdline:hy_main',
79 'hyc = hy.cmdline:hyc_main',
80 'hyc3 = hy.cmdline:hyc_main',
81 'hy2py = hy.cmdline:hy2py_main',
82 'hy2py3 = hy.cmdline:hy2py_main',
83 ]
84 },
85 packages=find_packages(exclude=['tests*']),
86 package_data={
87 'hy.contrib': ['*.hy', '__pycache__/*'],
88 'hy.core': ['*.hy', '__pycache__/*'],
89 'hy.extra': ['*.hy', '__pycache__/*'],
90 },
91 data_files=[
92 ('get_version', ['get_version.py'])
93 ],
94 author="Paul Tagliamonte",
95 author_email="[email protected]",
96 long_description=long_description,
97 description='Lisp and Python love each other.',
98 license="Expat",
99 url="http://hylang.org/",
100 platforms=['any'],
101 classifiers=[
102 "Development Status :: 4 - Beta",
103 "Intended Audience :: Developers",
104 "License :: DFSG approved",
105 "License :: OSI Approved :: MIT License", # Really "Expat". Ugh.
106 "Operating System :: OS Independent",
107 "Programming Language :: Lisp",
108 "Programming Language :: Python",
109 "Programming Language :: Python :: 3",
110 "Programming Language :: Python :: 3.6",
111 "Programming Language :: Python :: 3.7",
112 "Programming Language :: Python :: 3.8",
113 "Programming Language :: Python :: 3.9",
114 "Programming Language :: Python :: 3.10",
115 "Topic :: Software Development :: Code Generators",
116 "Topic :: Software Development :: Compilers",
117 "Topic :: Software Development :: Libraries",
118 ]
119 )
120
```
Path: `hy/completer.py`
Content:
```
1 # Copyright 2021 the authors.
2 # This file is part of Hy, which is free software licensed under the Expat
3 # license. See the LICENSE.
4
5 import contextlib
6 import os
7 import re
8 import sys
9 import builtins
10
11 import hy.macros
12 import hy.compiler
13
14
15 docomplete = True
16
17 try:
18 import readline
19 except ImportError:
20 try:
21 import pyreadline.rlmain
22 import pyreadline.unicode_helper # NOQA
23 import readline
24 except ImportError:
25 docomplete = False
26
27 if docomplete:
28 if sys.platform == 'darwin' and 'libedit' in readline.__doc__:
29 readline_bind = "bind ^I rl_complete"
30 else:
31 readline_bind = "tab: complete"
32
33
34 class Completer(object):
35
36 def __init__(self, namespace={}):
37 if not isinstance(namespace, dict):
38 raise TypeError('namespace must be a dictionary')
39 self.namespace = namespace
40 self.path = [builtins.__dict__,
41 namespace]
42
43 namespace.setdefault('__macros__', {})
44
45 self.path.append(namespace['__macros__'])
46
47 def attr_matches(self, text):
48 # Borrowed from IPython's completer
49 m = re.match(r"(\S+(\.[\w-]+)*)\.([\w-]*)$", text)
50
51 if m:
52 expr, attr = m.group(1, 3)
53 attr = attr.replace("-", "_")
54 expr = expr.replace("-", "_")
55 else:
56 return []
57
58 try:
59 obj = eval(expr, self.namespace)
60 words = dir(obj)
61 except Exception:
62 return []
63
64 n = len(attr)
65 matches = []
66 for w in words:
67 if w[:n] == attr:
68 matches.append("{}.{}".format(
69 expr.replace("_", "-"), w.replace("_", "-")))
70 return matches
71
72 def global_matches(self, text):
73 matches = []
74 for p in self.path:
75 for k in p.keys():
76 if isinstance(k, str):
77 k = k.replace("_", "-")
78 if k.startswith(text):
79 matches.append(k)
80 return matches
81
82 def complete(self, text, state):
83 if "." in text:
84 matches = self.attr_matches(text)
85 else:
86 matches = self.global_matches(text)
87 try:
88 return matches[state]
89 except IndexError:
90 return None
91
92
93 @contextlib.contextmanager
94 def completion(completer=None):
95 delims = "()[]{} "
96 if not completer:
97 completer = Completer()
98
99 if docomplete:
100 readline.set_completer(completer.complete)
101 readline.set_completer_delims(delims)
102
103 history = os.environ.get(
104 "HY_HISTORY", os.path.expanduser("~/.hy-history"))
105 readline.parse_and_bind("set blink-matching-paren on")
106
107 try:
108 readline.read_history_file(history)
109 except IOError:
110 pass
111
112 readline.parse_and_bind(readline_bind)
113
114 try:
115 yield
116 finally:
117 if docomplete:
118 try:
119 readline.write_history_file(history)
120 except IOError:
121 pass
122
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/hy/completer.py b/hy/completer.py
--- a/hy/completer.py
+++ b/hy/completer.py
@@ -16,13 +16,14 @@
try:
import readline
-except ImportError:
- try:
- import pyreadline.rlmain
- import pyreadline.unicode_helper # NOQA
- import readline
- except ImportError:
+except AttributeError as e:
+ # https://github.com/pyreadline/pyreadline/issues/65
+ if "module 'collections' has no attribute 'Callable'" in str(e):
docomplete = False
+ else:
+ raise
+except ImportError:
+ docomplete = False
if docomplete:
if sys.platform == 'darwin' and 'libedit' in readline.__doc__:
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -69,7 +69,6 @@
'funcparserlib>=0.3.6',
'colorama',
'astor>=0.8 ; python_version < "3.9"',
- 'pyreadline>=2.1 ; os_name == "nt"',
],
cmdclass=dict(install=Install),
entry_points={
| {"golden_diff": "diff --git a/hy/completer.py b/hy/completer.py\n--- a/hy/completer.py\n+++ b/hy/completer.py\n@@ -16,13 +16,14 @@\n \n try:\n import readline\n-except ImportError:\n- try:\n- import pyreadline.rlmain\n- import pyreadline.unicode_helper # NOQA\n- import readline\n- except ImportError:\n+except AttributeError as e:\n+ # https://github.com/pyreadline/pyreadline/issues/65\n+ if \"module 'collections' has no attribute 'Callable'\" in str(e):\n docomplete = False\n+ else:\n+ raise\n+except ImportError:\n+ docomplete = False\n \n if docomplete:\n if sys.platform == 'darwin' and 'libedit' in readline.__doc__:\ndiff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -69,7 +69,6 @@\n 'funcparserlib>=0.3.6',\n 'colorama',\n 'astor>=0.8 ; python_version < \"3.9\"',\n- 'pyreadline>=2.1 ; os_name == \"nt\"',\n ],\n cmdclass=dict(install=Install),\n entry_points={\n", "issue": "Python 3.10 on Windows fails: AttributeError: module 'collections' has no attribute 'Callable'\nThanks to https://github.com/pyreadline/pyreadline/issues/65, Hy on Python 3.10 on Windows no longer starts, and a backtrace similar to the following is the result:\r\n\r\n```console\r\nTraceback (most recent call last):\r\n File \"c:\\python\\lib\\runpy.py\", line 196, in _run_module_as_main\r\n return _run_code(code, main_globals, None,\r\n File \"c:\\python\\lib\\runpy.py\", line 86, in _run_code\r\n exec(code, run_globals)\r\n File \"C:\\Python\\Scripts\\hy.exe\\__main__.py\", line 4, in <module>\r\n File \"c:\\python\\lib\\site-packages\\hy\\cmdline.py\", line 37, in <module>\r\n from hy.completer import completion, Completer\r\n File \"c:\\python\\lib\\site-packages\\hy\\completer.py\", line 18, in <module>\r\n import readline\r\n File \"c:\\python\\lib\\site-packages\\readline.py\", line 34, in <module>\r\n rl = Readline()\r\n File \"c:\\python\\lib\\site-packages\\pyreadline\\rlmain.py\", line 422, in __init__\r\n BaseReadline.__init__(self)\r\n File \"c:\\python\\lib\\site-packages\\pyreadline\\rlmain.py\", line 62, in __init__\r\n mode.init_editing_mode(None)\r\n File \"c:\\python\\lib\\site-packages\\pyreadline\\modes\\emacs.py\", line 633, in init_editing_mode\r\n self._bind_key('space', self.self_insert)\r\n File \"c:\\python\\lib\\site-packages\\pyreadline\\modes\\basemode.py\", line 162, in _bind_key\r\n if not callable(func):\r\n File \"c:\\python\\lib\\site-packages\\pyreadline\\py3k_compat.py\", line 8, in callable\r\n return isinstance(x, collections.Callable)\r\nAttributeError: module 'collections' has no attribute 'Callable'\r\n```\r\n\r\nUnfortunately from that bug (and the repository in general), it would appear that `pyreadline` is no longer actively maintained (last update of any kind was in 2015), so for Hy to continue to support Windows on future Python versions will require some amount of workaround (a fork of `pyreadline`, dropping readline support on Windows, etc).\r\n\r\nI'm not sure if there's a way to specify that Python 3.10+ and Windows are simply mutually incompatible, but that would be the \"simplest\" workaround if there's a clean way to specify that. :see_no_evil:\n", "before_files": [{"content": "#!/usr/bin/env python\n# Copyright 2021 the authors.\n# This file is part of Hy, which is free software licensed under the Expat\n# license. See the LICENSE.\n\nimport glob\nimport importlib\nimport inspect\nimport os\nimport sys\n\nfrom setuptools import find_packages, setup\nfrom setuptools.command.install import install\nimport fastentrypoints # Monkey-patches setuptools.\n\nfrom get_version import __version__\n\nos.chdir(os.path.split(os.path.abspath(__file__))[0])\n\nPKG = \"hy\"\n\nlong_description = \"\"\"Hy is a Python <--> Lisp layer. It helps\nmake things work nicer, and lets Python and the Hy lisp variant play\nnice together. \"\"\"\n\nclass Install(install):\n def __compile_hy_bytecode(self):\n for path in sorted(glob.iglob('hy/**.hy', recursive=True)):\n importlib.util.cache_from_source(path, optimize=self.optimize)\n\n def run(self):\n # Don't bother messing around with deps if they wouldn't be installed anyway.\n # Code is based on setuptools's install.py.\n if not (self.old_and_unmanageable or self.single_version_externally_managed\n or not self._called_from_setup(inspect.currentframe())):\n easy_install = self.distribution.get_command_class('easy_install')\n\n cmd = easy_install(\n self.distribution, args=\"x\", root=self.root, record=self.record,\n )\n cmd.ensure_finalized()\n cmd.always_copy_from = '.'\n cmd.package_index.scan(glob.glob('*.egg'))\n\n cmd.args = self.distribution.install_requires\n\n # Avoid deprecation warnings on new setuptools versions.\n if 'show_deprecation' in inspect.signature(cmd.run).parameters:\n cmd.run(show_deprecation=False)\n else:\n cmd.run()\n\n # Make sure any new packages get picked up.\n import site\n importlib.reload(site)\n importlib.invalidate_caches()\n\n self.__compile_hy_bytecode()\n\n # The deps won't be reinstalled because of:\n # https://github.com/pypa/setuptools/issues/456\n return install.run(self)\n\nsetup(\n name=PKG,\n version=__version__,\n install_requires=[\n 'rply>=0.7.7',\n 'funcparserlib>=0.3.6',\n 'colorama',\n 'astor>=0.8 ; python_version < \"3.9\"',\n 'pyreadline>=2.1 ; os_name == \"nt\"',\n ],\n cmdclass=dict(install=Install),\n entry_points={\n 'console_scripts': [\n 'hy = hy.cmdline:hy_main',\n 'hy3 = hy.cmdline:hy_main',\n 'hyc = hy.cmdline:hyc_main',\n 'hyc3 = hy.cmdline:hyc_main',\n 'hy2py = hy.cmdline:hy2py_main',\n 'hy2py3 = hy.cmdline:hy2py_main',\n ]\n },\n packages=find_packages(exclude=['tests*']),\n package_data={\n 'hy.contrib': ['*.hy', '__pycache__/*'],\n 'hy.core': ['*.hy', '__pycache__/*'],\n 'hy.extra': ['*.hy', '__pycache__/*'],\n },\n data_files=[\n ('get_version', ['get_version.py'])\n ],\n author=\"Paul Tagliamonte\",\n author_email=\"[email protected]\",\n long_description=long_description,\n description='Lisp and Python love each other.',\n license=\"Expat\",\n url=\"http://hylang.org/\",\n platforms=['any'],\n classifiers=[\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"License :: DFSG approved\",\n \"License :: OSI Approved :: MIT License\", # Really \"Expat\". Ugh.\n \"Operating System :: OS Independent\",\n \"Programming Language :: Lisp\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Software Development :: Code Generators\",\n \"Topic :: Software Development :: Compilers\",\n \"Topic :: Software Development :: Libraries\",\n ]\n)\n", "path": "setup.py"}, {"content": "# Copyright 2021 the authors.\n# This file is part of Hy, which is free software licensed under the Expat\n# license. See the LICENSE.\n\nimport contextlib\nimport os\nimport re\nimport sys\nimport builtins\n\nimport hy.macros\nimport hy.compiler\n\n\ndocomplete = True\n\ntry:\n import readline\nexcept ImportError:\n try:\n import pyreadline.rlmain\n import pyreadline.unicode_helper # NOQA\n import readline\n except ImportError:\n docomplete = False\n\nif docomplete:\n if sys.platform == 'darwin' and 'libedit' in readline.__doc__:\n readline_bind = \"bind ^I rl_complete\"\n else:\n readline_bind = \"tab: complete\"\n\n\nclass Completer(object):\n\n def __init__(self, namespace={}):\n if not isinstance(namespace, dict):\n raise TypeError('namespace must be a dictionary')\n self.namespace = namespace\n self.path = [builtins.__dict__,\n namespace]\n\n namespace.setdefault('__macros__', {})\n\n self.path.append(namespace['__macros__'])\n\n def attr_matches(self, text):\n # Borrowed from IPython's completer\n m = re.match(r\"(\\S+(\\.[\\w-]+)*)\\.([\\w-]*)$\", text)\n\n if m:\n expr, attr = m.group(1, 3)\n attr = attr.replace(\"-\", \"_\")\n expr = expr.replace(\"-\", \"_\")\n else:\n return []\n\n try:\n obj = eval(expr, self.namespace)\n words = dir(obj)\n except Exception:\n return []\n\n n = len(attr)\n matches = []\n for w in words:\n if w[:n] == attr:\n matches.append(\"{}.{}\".format(\n expr.replace(\"_\", \"-\"), w.replace(\"_\", \"-\")))\n return matches\n\n def global_matches(self, text):\n matches = []\n for p in self.path:\n for k in p.keys():\n if isinstance(k, str):\n k = k.replace(\"_\", \"-\")\n if k.startswith(text):\n matches.append(k)\n return matches\n\n def complete(self, text, state):\n if \".\" in text:\n matches = self.attr_matches(text)\n else:\n matches = self.global_matches(text)\n try:\n return matches[state]\n except IndexError:\n return None\n\n\[email protected]\ndef completion(completer=None):\n delims = \"()[]{} \"\n if not completer:\n completer = Completer()\n\n if docomplete:\n readline.set_completer(completer.complete)\n readline.set_completer_delims(delims)\n\n history = os.environ.get(\n \"HY_HISTORY\", os.path.expanduser(\"~/.hy-history\"))\n readline.parse_and_bind(\"set blink-matching-paren on\")\n\n try:\n readline.read_history_file(history)\n except IOError:\n pass\n\n readline.parse_and_bind(readline_bind)\n\n try:\n yield\n finally:\n if docomplete:\n try:\n readline.write_history_file(history)\n except IOError:\n pass\n", "path": "hy/completer.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# Copyright 2021 the authors.\n# This file is part of Hy, which is free software licensed under the Expat\n# license. See the LICENSE.\n\nimport glob\nimport importlib\nimport inspect\nimport os\nimport sys\n\nfrom setuptools import find_packages, setup\nfrom setuptools.command.install import install\nimport fastentrypoints # Monkey-patches setuptools.\n\nfrom get_version import __version__\n\nos.chdir(os.path.split(os.path.abspath(__file__))[0])\n\nPKG = \"hy\"\n\nlong_description = \"\"\"Hy is a Python <--> Lisp layer. It helps\nmake things work nicer, and lets Python and the Hy lisp variant play\nnice together. \"\"\"\n\nclass Install(install):\n def __compile_hy_bytecode(self):\n for path in sorted(glob.iglob('hy/**.hy', recursive=True)):\n importlib.util.cache_from_source(path, optimize=self.optimize)\n\n def run(self):\n # Don't bother messing around with deps if they wouldn't be installed anyway.\n # Code is based on setuptools's install.py.\n if not (self.old_and_unmanageable or self.single_version_externally_managed\n or not self._called_from_setup(inspect.currentframe())):\n easy_install = self.distribution.get_command_class('easy_install')\n\n cmd = easy_install(\n self.distribution, args=\"x\", root=self.root, record=self.record,\n )\n cmd.ensure_finalized()\n cmd.always_copy_from = '.'\n cmd.package_index.scan(glob.glob('*.egg'))\n\n cmd.args = self.distribution.install_requires\n\n # Avoid deprecation warnings on new setuptools versions.\n if 'show_deprecation' in inspect.signature(cmd.run).parameters:\n cmd.run(show_deprecation=False)\n else:\n cmd.run()\n\n # Make sure any new packages get picked up.\n import site\n importlib.reload(site)\n importlib.invalidate_caches()\n\n self.__compile_hy_bytecode()\n\n # The deps won't be reinstalled because of:\n # https://github.com/pypa/setuptools/issues/456\n return install.run(self)\n\nsetup(\n name=PKG,\n version=__version__,\n install_requires=[\n 'rply>=0.7.7',\n 'funcparserlib>=0.3.6',\n 'colorama',\n 'astor>=0.8 ; python_version < \"3.9\"',\n ],\n cmdclass=dict(install=Install),\n entry_points={\n 'console_scripts': [\n 'hy = hy.cmdline:hy_main',\n 'hy3 = hy.cmdline:hy_main',\n 'hyc = hy.cmdline:hyc_main',\n 'hyc3 = hy.cmdline:hyc_main',\n 'hy2py = hy.cmdline:hy2py_main',\n 'hy2py3 = hy.cmdline:hy2py_main',\n ]\n },\n packages=find_packages(exclude=['tests*']),\n package_data={\n 'hy.contrib': ['*.hy', '__pycache__/*'],\n 'hy.core': ['*.hy', '__pycache__/*'],\n 'hy.extra': ['*.hy', '__pycache__/*'],\n },\n data_files=[\n ('get_version', ['get_version.py'])\n ],\n author=\"Paul Tagliamonte\",\n author_email=\"[email protected]\",\n long_description=long_description,\n description='Lisp and Python love each other.',\n license=\"Expat\",\n url=\"http://hylang.org/\",\n platforms=['any'],\n classifiers=[\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"License :: DFSG approved\",\n \"License :: OSI Approved :: MIT License\", # Really \"Expat\". Ugh.\n \"Operating System :: OS Independent\",\n \"Programming Language :: Lisp\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Software Development :: Code Generators\",\n \"Topic :: Software Development :: Compilers\",\n \"Topic :: Software Development :: Libraries\",\n ]\n)\n", "path": "setup.py"}, {"content": "# Copyright 2021 the authors.\n# This file is part of Hy, which is free software licensed under the Expat\n# license. See the LICENSE.\n\nimport contextlib\nimport os\nimport re\nimport sys\nimport builtins\n\nimport hy.macros\nimport hy.compiler\n\n\ndocomplete = True\n\ntry:\n import readline\nexcept AttributeError as e:\n # https://github.com/pyreadline/pyreadline/issues/65\n if \"module 'collections' has no attribute 'Callable'\" in str(e):\n docomplete = False\n else:\n raise\nexcept ImportError:\n docomplete = False\n\nif docomplete:\n if sys.platform == 'darwin' and 'libedit' in readline.__doc__:\n readline_bind = \"bind ^I rl_complete\"\n else:\n readline_bind = \"tab: complete\"\n\n\nclass Completer(object):\n\n def __init__(self, namespace={}):\n if not isinstance(namespace, dict):\n raise TypeError('namespace must be a dictionary')\n self.namespace = namespace\n self.path = [builtins.__dict__,\n namespace]\n\n namespace.setdefault('__macros__', {})\n\n self.path.append(namespace['__macros__'])\n\n def attr_matches(self, text):\n # Borrowed from IPython's completer\n m = re.match(r\"(\\S+(\\.[\\w-]+)*)\\.([\\w-]*)$\", text)\n\n if m:\n expr, attr = m.group(1, 3)\n attr = attr.replace(\"-\", \"_\")\n expr = expr.replace(\"-\", \"_\")\n else:\n return []\n\n try:\n obj = eval(expr, self.namespace)\n words = dir(obj)\n except Exception:\n return []\n\n n = len(attr)\n matches = []\n for w in words:\n if w[:n] == attr:\n matches.append(\"{}.{}\".format(\n expr.replace(\"_\", \"-\"), w.replace(\"_\", \"-\")))\n return matches\n\n def global_matches(self, text):\n matches = []\n for p in self.path:\n for k in p.keys():\n if isinstance(k, str):\n k = k.replace(\"_\", \"-\")\n if k.startswith(text):\n matches.append(k)\n return matches\n\n def complete(self, text, state):\n if \".\" in text:\n matches = self.attr_matches(text)\n else:\n matches = self.global_matches(text)\n try:\n return matches[state]\n except IndexError:\n return None\n\n\[email protected]\ndef completion(completer=None):\n delims = \"()[]{} \"\n if not completer:\n completer = Completer()\n\n if docomplete:\n readline.set_completer(completer.complete)\n readline.set_completer_delims(delims)\n\n history = os.environ.get(\n \"HY_HISTORY\", os.path.expanduser(\"~/.hy-history\"))\n readline.parse_and_bind(\"set blink-matching-paren on\")\n\n try:\n readline.read_history_file(history)\n except IOError:\n pass\n\n readline.parse_and_bind(readline_bind)\n\n try:\n yield\n finally:\n if docomplete:\n try:\n readline.write_history_file(history)\n except IOError:\n pass\n", "path": "hy/completer.py"}]} | 3,023 | 289 |
gh_patches_debug_9950 | rasdani/github-patches | git_diff | ManimCommunity__manim-684 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add (opengraph) metadata to documentation
Previews to links to the documentation are currently not available due to missing opengraph metadata.
Also, a description meta tag should be added.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `docs/source/conf.py`
Content:
```
1 # Configuration file for the Sphinx documentation builder.
2 #
3 # This file only contains a selection of the most common options. For a full
4 # list see the documentation:
5 # https://www.sphinx-doc.org/en/master/usage/configuration.html
6
7 # -- Path setup --------------------------------------------------------------
8
9 # If extensions (or modules to document with autodoc) are in another directory,
10 # add these directories to sys.path here. If the directory is relative to the
11 # documentation root, use os.path.abspath to make it absolute, like shown here.
12
13 import os
14 import subprocess
15 import sys
16 from distutils.sysconfig import get_python_lib
17 from pathlib import Path
18
19 sys.path.insert(0, os.path.abspath("."))
20
21
22 if os.environ.get("READTHEDOCS") == "True":
23 site_path = get_python_lib()
24 # bindings for pangocffi, cairocffi, pangocairocffi need to be generated
25 subprocess.run(["python", "pangocffi/ffi_build.py"], cwd=site_path)
26 subprocess.run(["python", "cairocffi/ffi_build.py"], cwd=site_path)
27 subprocess.run(["python", "pangocairocffi/ffi_build.py"], cwd=site_path)
28 # we need to add ffmpeg to the path
29 ffmpeg_path = os.path.join(site_path, "imageio_ffmpeg", "binaries")
30 # the included binary is named ffmpeg-linux..., create a symlink
31 [ffmpeg_bin] = [
32 file for file in os.listdir(ffmpeg_path) if file.startswith("ffmpeg-")
33 ]
34 os.symlink(
35 os.path.join(ffmpeg_path, ffmpeg_bin), os.path.join(ffmpeg_path, "ffmpeg")
36 )
37 os.environ["PATH"] += os.pathsep + ffmpeg_path
38
39
40 # -- Project information -----------------------------------------------------
41
42 project = "Manim"
43 copyright = "2020, The Manim Community Dev Team"
44 author = "The Manim Community Dev Team"
45
46
47 # -- General configuration ---------------------------------------------------
48
49 # Add any Sphinx extension module names here, as strings. They can be
50 # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
51 # ones.
52 extensions = [
53 "sphinx.ext.autodoc",
54 "recommonmark",
55 "sphinx_copybutton",
56 "sphinx.ext.napoleon",
57 "sphinx.ext.autosummary",
58 "sphinx.ext.doctest",
59 "manim_directive",
60 ]
61
62 # Automatically generate stub pages when using the .. autosummary directive
63 autosummary_generate = True
64
65 # controls whether functions documented by the autofunction directive
66 # appear with their full module names
67 add_module_names = False
68
69 # Add any paths that contain templates here, relative to this directory.
70 templates_path = ["_templates"]
71
72 # List of patterns, relative to source directory, that match files and
73 # directories to ignore when looking for source files.
74 # This pattern also affects html_static_path and html_extra_path.
75 exclude_patterns = []
76
77
78 # -- Options for HTML output -------------------------------------------------
79
80 # The theme to use for HTML and HTML Help pages. See the documentation for
81 # a list of builtin themes.
82 #
83 import guzzle_sphinx_theme
84
85 html_theme_path = guzzle_sphinx_theme.html_theme_path()
86 html_theme = "guzzle_sphinx_theme"
87 html_favicon = str(Path("_static/favicon.ico"))
88
89 # There's a standing issue with Sphinx's new-style sidebars. This is a
90 # workaround. Taken from
91 # https://github.com/guzzle/guzzle_sphinx_theme/issues/33#issuecomment-637081826
92 html_sidebars = {"**": ["logo-text.html", "globaltoc.html", "searchbox.html"]}
93
94 # Register the theme as an extension to generate a sitemap.xml
95 extensions.append("guzzle_sphinx_theme")
96
97 # Add any paths that contain custom static files (such as style sheets) here,
98 # relative to this directory. They are copied after the builtin static files,
99 # so a file named "default.css" will overwrite the builtin "default.css".
100 html_static_path = ["_static"]
101
102 # This specifies any additional css files that will override the theme's
103 html_css_files = ["custom.css"]
104
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/docs/source/conf.py b/docs/source/conf.py
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -56,6 +56,7 @@
"sphinx.ext.napoleon",
"sphinx.ext.autosummary",
"sphinx.ext.doctest",
+ "sphinxext.opengraph",
"manim_directive",
]
@@ -101,3 +102,8 @@
# This specifies any additional css files that will override the theme's
html_css_files = ["custom.css"]
+
+# opengraph settings
+ogp_image = "https://www.manim.community/logo.png"
+ogp_site_name = "Manim Community | Documentation"
+ogp_site_url = "https://docs.manim.community/"
| {"golden_diff": "diff --git a/docs/source/conf.py b/docs/source/conf.py\n--- a/docs/source/conf.py\n+++ b/docs/source/conf.py\n@@ -56,6 +56,7 @@\n \"sphinx.ext.napoleon\",\n \"sphinx.ext.autosummary\",\n \"sphinx.ext.doctest\",\n+ \"sphinxext.opengraph\",\n \"manim_directive\",\n ]\n \n@@ -101,3 +102,8 @@\n \n # This specifies any additional css files that will override the theme's\n html_css_files = [\"custom.css\"]\n+\n+# opengraph settings\n+ogp_image = \"https://www.manim.community/logo.png\"\n+ogp_site_name = \"Manim Community | Documentation\"\n+ogp_site_url = \"https://docs.manim.community/\"\n", "issue": "Add (opengraph) metadata to documentation\nPreviews to links to the documentation are currently not available due to missing opengraph metadata.\r\n\r\nAlso, a description meta tag should be added.\r\n\r\n\r\n\r\n\n", "before_files": [{"content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n\nimport os\nimport subprocess\nimport sys\nfrom distutils.sysconfig import get_python_lib\nfrom pathlib import Path\n\nsys.path.insert(0, os.path.abspath(\".\"))\n\n\nif os.environ.get(\"READTHEDOCS\") == \"True\":\n site_path = get_python_lib()\n # bindings for pangocffi, cairocffi, pangocairocffi need to be generated\n subprocess.run([\"python\", \"pangocffi/ffi_build.py\"], cwd=site_path)\n subprocess.run([\"python\", \"cairocffi/ffi_build.py\"], cwd=site_path)\n subprocess.run([\"python\", \"pangocairocffi/ffi_build.py\"], cwd=site_path)\n # we need to add ffmpeg to the path\n ffmpeg_path = os.path.join(site_path, \"imageio_ffmpeg\", \"binaries\")\n # the included binary is named ffmpeg-linux..., create a symlink\n [ffmpeg_bin] = [\n file for file in os.listdir(ffmpeg_path) if file.startswith(\"ffmpeg-\")\n ]\n os.symlink(\n os.path.join(ffmpeg_path, ffmpeg_bin), os.path.join(ffmpeg_path, \"ffmpeg\")\n )\n os.environ[\"PATH\"] += os.pathsep + ffmpeg_path\n\n\n# -- Project information -----------------------------------------------------\n\nproject = \"Manim\"\ncopyright = \"2020, The Manim Community Dev Team\"\nauthor = \"The Manim Community Dev Team\"\n\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n \"sphinx.ext.autodoc\",\n \"recommonmark\",\n \"sphinx_copybutton\",\n \"sphinx.ext.napoleon\",\n \"sphinx.ext.autosummary\",\n \"sphinx.ext.doctest\",\n \"manim_directive\",\n]\n\n# Automatically generate stub pages when using the .. autosummary directive\nautosummary_generate = True\n\n# controls whether functions documented by the autofunction directive\n# appear with their full module names\nadd_module_names = False\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = [\"_templates\"]\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = []\n\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nimport guzzle_sphinx_theme\n\nhtml_theme_path = guzzle_sphinx_theme.html_theme_path()\nhtml_theme = \"guzzle_sphinx_theme\"\nhtml_favicon = str(Path(\"_static/favicon.ico\"))\n\n# There's a standing issue with Sphinx's new-style sidebars. This is a\n# workaround. Taken from\n# https://github.com/guzzle/guzzle_sphinx_theme/issues/33#issuecomment-637081826\nhtml_sidebars = {\"**\": [\"logo-text.html\", \"globaltoc.html\", \"searchbox.html\"]}\n\n# Register the theme as an extension to generate a sitemap.xml\nextensions.append(\"guzzle_sphinx_theme\")\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = [\"_static\"]\n\n# This specifies any additional css files that will override the theme's\nhtml_css_files = [\"custom.css\"]\n", "path": "docs/source/conf.py"}], "after_files": [{"content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n\nimport os\nimport subprocess\nimport sys\nfrom distutils.sysconfig import get_python_lib\nfrom pathlib import Path\n\nsys.path.insert(0, os.path.abspath(\".\"))\n\n\nif os.environ.get(\"READTHEDOCS\") == \"True\":\n site_path = get_python_lib()\n # bindings for pangocffi, cairocffi, pangocairocffi need to be generated\n subprocess.run([\"python\", \"pangocffi/ffi_build.py\"], cwd=site_path)\n subprocess.run([\"python\", \"cairocffi/ffi_build.py\"], cwd=site_path)\n subprocess.run([\"python\", \"pangocairocffi/ffi_build.py\"], cwd=site_path)\n # we need to add ffmpeg to the path\n ffmpeg_path = os.path.join(site_path, \"imageio_ffmpeg\", \"binaries\")\n # the included binary is named ffmpeg-linux..., create a symlink\n [ffmpeg_bin] = [\n file for file in os.listdir(ffmpeg_path) if file.startswith(\"ffmpeg-\")\n ]\n os.symlink(\n os.path.join(ffmpeg_path, ffmpeg_bin), os.path.join(ffmpeg_path, \"ffmpeg\")\n )\n os.environ[\"PATH\"] += os.pathsep + ffmpeg_path\n\n\n# -- Project information -----------------------------------------------------\n\nproject = \"Manim\"\ncopyright = \"2020, The Manim Community Dev Team\"\nauthor = \"The Manim Community Dev Team\"\n\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n \"sphinx.ext.autodoc\",\n \"recommonmark\",\n \"sphinx_copybutton\",\n \"sphinx.ext.napoleon\",\n \"sphinx.ext.autosummary\",\n \"sphinx.ext.doctest\",\n \"sphinxext.opengraph\",\n \"manim_directive\",\n]\n\n# Automatically generate stub pages when using the .. autosummary directive\nautosummary_generate = True\n\n# controls whether functions documented by the autofunction directive\n# appear with their full module names\nadd_module_names = False\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = [\"_templates\"]\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = []\n\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nimport guzzle_sphinx_theme\n\nhtml_theme_path = guzzle_sphinx_theme.html_theme_path()\nhtml_theme = \"guzzle_sphinx_theme\"\nhtml_favicon = str(Path(\"_static/favicon.ico\"))\n\n# There's a standing issue with Sphinx's new-style sidebars. This is a\n# workaround. Taken from\n# https://github.com/guzzle/guzzle_sphinx_theme/issues/33#issuecomment-637081826\nhtml_sidebars = {\"**\": [\"logo-text.html\", \"globaltoc.html\", \"searchbox.html\"]}\n\n# Register the theme as an extension to generate a sitemap.xml\nextensions.append(\"guzzle_sphinx_theme\")\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = [\"_static\"]\n\n# This specifies any additional css files that will override the theme's\nhtml_css_files = [\"custom.css\"]\n\n# opengraph settings\nogp_image = \"https://www.manim.community/logo.png\"\nogp_site_name = \"Manim Community | Documentation\"\nogp_site_url = \"https://docs.manim.community/\"\n", "path": "docs/source/conf.py"}]} | 1,395 | 171 |
gh_patches_debug_11488 | rasdani/github-patches | git_diff | pytorch__vision-355 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
utils.save_image fails when passing list of images
utils.save_image fails when passing in a list of images, as the code tries to call .cpu on the list.
Passing in a list should be possible according to the function's documentation.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torchvision/utils.py`
Content:
```
1 import torch
2 import math
3 irange = range
4
5
6 def make_grid(tensor, nrow=8, padding=2,
7 normalize=False, range=None, scale_each=False, pad_value=0):
8 """Make a grid of images.
9
10 Args:
11 tensor (Tensor or list): 4D mini-batch Tensor of shape (B x C x H x W)
12 or a list of images all of the same size.
13 nrow (int, optional): Number of images displayed in each row of the grid.
14 The Final grid size is (B / nrow, nrow). Default is 8.
15 padding (int, optional): amount of padding. Default is 2.
16 normalize (bool, optional): If True, shift the image to the range (0, 1),
17 by subtracting the minimum and dividing by the maximum pixel value.
18 range (tuple, optional): tuple (min, max) where min and max are numbers,
19 then these numbers are used to normalize the image. By default, min and max
20 are computed from the tensor.
21 scale_each (bool, optional): If True, scale each image in the batch of
22 images separately rather than the (min, max) over all images.
23 pad_value (float, optional): Value for the padded pixels.
24
25 Example:
26 See this notebook `here <https://gist.github.com/anonymous/bf16430f7750c023141c562f3e9f2a91>`_
27
28 """
29 if not (torch.is_tensor(tensor) or
30 (isinstance(tensor, list) and all(torch.is_tensor(t) for t in tensor))):
31 raise TypeError('tensor or list of tensors expected, got {}'.format(type(tensor)))
32
33 # if list of tensors, convert to a 4D mini-batch Tensor
34 if isinstance(tensor, list):
35 tensor = torch.stack(tensor, dim=0)
36
37 if tensor.dim() == 2: # single image H x W
38 tensor = tensor.view(1, tensor.size(0), tensor.size(1))
39 if tensor.dim() == 3: # single image
40 if tensor.size(0) == 1: # if single-channel, convert to 3-channel
41 tensor = torch.cat((tensor, tensor, tensor), 0)
42 return tensor
43 if tensor.dim() == 4 and tensor.size(1) == 1: # single-channel images
44 tensor = torch.cat((tensor, tensor, tensor), 1)
45
46 if normalize is True:
47 tensor = tensor.clone() # avoid modifying tensor in-place
48 if range is not None:
49 assert isinstance(range, tuple), \
50 "range has to be a tuple (min, max) if specified. min and max are numbers"
51
52 def norm_ip(img, min, max):
53 img.clamp_(min=min, max=max)
54 img.add_(-min).div_(max - min)
55
56 def norm_range(t, range):
57 if range is not None:
58 norm_ip(t, range[0], range[1])
59 else:
60 norm_ip(t, t.min(), t.max())
61
62 if scale_each is True:
63 for t in tensor: # loop over mini-batch dimension
64 norm_range(t, range)
65 else:
66 norm_range(tensor, range)
67
68 # make the mini-batch of images into a grid
69 nmaps = tensor.size(0)
70 xmaps = min(nrow, nmaps)
71 ymaps = int(math.ceil(float(nmaps) / xmaps))
72 height, width = int(tensor.size(2) + padding), int(tensor.size(3) + padding)
73 grid = tensor.new(3, height * ymaps + padding, width * xmaps + padding).fill_(pad_value)
74 k = 0
75 for y in irange(ymaps):
76 for x in irange(xmaps):
77 if k >= nmaps:
78 break
79 grid.narrow(1, y * height + padding, height - padding)\
80 .narrow(2, x * width + padding, width - padding)\
81 .copy_(tensor[k])
82 k = k + 1
83 return grid
84
85
86 def save_image(tensor, filename, nrow=8, padding=2,
87 normalize=False, range=None, scale_each=False, pad_value=0):
88 """Save a given Tensor into an image file.
89
90 Args:
91 tensor (Tensor or list): Image to be saved. If given a mini-batch tensor,
92 saves the tensor as a grid of images by calling ``make_grid``.
93 **kwargs: Other arguments are documented in ``make_grid``.
94 """
95 from PIL import Image
96 tensor = tensor.cpu()
97 grid = make_grid(tensor, nrow=nrow, padding=padding, pad_value=pad_value,
98 normalize=normalize, range=range, scale_each=scale_each)
99 ndarr = grid.mul(255).clamp(0, 255).byte().permute(1, 2, 0).numpy()
100 im = Image.fromarray(ndarr)
101 im.save(filename)
102
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/torchvision/utils.py b/torchvision/utils.py
--- a/torchvision/utils.py
+++ b/torchvision/utils.py
@@ -93,9 +93,8 @@
**kwargs: Other arguments are documented in ``make_grid``.
"""
from PIL import Image
- tensor = tensor.cpu()
grid = make_grid(tensor, nrow=nrow, padding=padding, pad_value=pad_value,
normalize=normalize, range=range, scale_each=scale_each)
- ndarr = grid.mul(255).clamp(0, 255).byte().permute(1, 2, 0).numpy()
+ ndarr = grid.mul(255).clamp(0, 255).byte().permute(1, 2, 0).cpu().numpy()
im = Image.fromarray(ndarr)
im.save(filename)
| {"golden_diff": "diff --git a/torchvision/utils.py b/torchvision/utils.py\n--- a/torchvision/utils.py\n+++ b/torchvision/utils.py\n@@ -93,9 +93,8 @@\n **kwargs: Other arguments are documented in ``make_grid``.\n \"\"\"\n from PIL import Image\n- tensor = tensor.cpu()\n grid = make_grid(tensor, nrow=nrow, padding=padding, pad_value=pad_value,\n normalize=normalize, range=range, scale_each=scale_each)\n- ndarr = grid.mul(255).clamp(0, 255).byte().permute(1, 2, 0).numpy()\n+ ndarr = grid.mul(255).clamp(0, 255).byte().permute(1, 2, 0).cpu().numpy()\n im = Image.fromarray(ndarr)\n im.save(filename)\n", "issue": "utils.save_image fails when passing list of images\nutils.save_image fails when passing in a list of images, as the code tries to call .cpu on the list. \r\nPassing in a list should be possible according to the function's documentation.\n", "before_files": [{"content": "import torch\nimport math\nirange = range\n\n\ndef make_grid(tensor, nrow=8, padding=2,\n normalize=False, range=None, scale_each=False, pad_value=0):\n \"\"\"Make a grid of images.\n\n Args:\n tensor (Tensor or list): 4D mini-batch Tensor of shape (B x C x H x W)\n or a list of images all of the same size.\n nrow (int, optional): Number of images displayed in each row of the grid.\n The Final grid size is (B / nrow, nrow). Default is 8.\n padding (int, optional): amount of padding. Default is 2.\n normalize (bool, optional): If True, shift the image to the range (0, 1),\n by subtracting the minimum and dividing by the maximum pixel value.\n range (tuple, optional): tuple (min, max) where min and max are numbers,\n then these numbers are used to normalize the image. By default, min and max\n are computed from the tensor.\n scale_each (bool, optional): If True, scale each image in the batch of\n images separately rather than the (min, max) over all images.\n pad_value (float, optional): Value for the padded pixels.\n\n Example:\n See this notebook `here <https://gist.github.com/anonymous/bf16430f7750c023141c562f3e9f2a91>`_\n\n \"\"\"\n if not (torch.is_tensor(tensor) or\n (isinstance(tensor, list) and all(torch.is_tensor(t) for t in tensor))):\n raise TypeError('tensor or list of tensors expected, got {}'.format(type(tensor)))\n\n # if list of tensors, convert to a 4D mini-batch Tensor\n if isinstance(tensor, list):\n tensor = torch.stack(tensor, dim=0)\n\n if tensor.dim() == 2: # single image H x W\n tensor = tensor.view(1, tensor.size(0), tensor.size(1))\n if tensor.dim() == 3: # single image\n if tensor.size(0) == 1: # if single-channel, convert to 3-channel\n tensor = torch.cat((tensor, tensor, tensor), 0)\n return tensor\n if tensor.dim() == 4 and tensor.size(1) == 1: # single-channel images\n tensor = torch.cat((tensor, tensor, tensor), 1)\n\n if normalize is True:\n tensor = tensor.clone() # avoid modifying tensor in-place\n if range is not None:\n assert isinstance(range, tuple), \\\n \"range has to be a tuple (min, max) if specified. min and max are numbers\"\n\n def norm_ip(img, min, max):\n img.clamp_(min=min, max=max)\n img.add_(-min).div_(max - min)\n\n def norm_range(t, range):\n if range is not None:\n norm_ip(t, range[0], range[1])\n else:\n norm_ip(t, t.min(), t.max())\n\n if scale_each is True:\n for t in tensor: # loop over mini-batch dimension\n norm_range(t, range)\n else:\n norm_range(tensor, range)\n\n # make the mini-batch of images into a grid\n nmaps = tensor.size(0)\n xmaps = min(nrow, nmaps)\n ymaps = int(math.ceil(float(nmaps) / xmaps))\n height, width = int(tensor.size(2) + padding), int(tensor.size(3) + padding)\n grid = tensor.new(3, height * ymaps + padding, width * xmaps + padding).fill_(pad_value)\n k = 0\n for y in irange(ymaps):\n for x in irange(xmaps):\n if k >= nmaps:\n break\n grid.narrow(1, y * height + padding, height - padding)\\\n .narrow(2, x * width + padding, width - padding)\\\n .copy_(tensor[k])\n k = k + 1\n return grid\n\n\ndef save_image(tensor, filename, nrow=8, padding=2,\n normalize=False, range=None, scale_each=False, pad_value=0):\n \"\"\"Save a given Tensor into an image file.\n\n Args:\n tensor (Tensor or list): Image to be saved. If given a mini-batch tensor,\n saves the tensor as a grid of images by calling ``make_grid``.\n **kwargs: Other arguments are documented in ``make_grid``.\n \"\"\"\n from PIL import Image\n tensor = tensor.cpu()\n grid = make_grid(tensor, nrow=nrow, padding=padding, pad_value=pad_value,\n normalize=normalize, range=range, scale_each=scale_each)\n ndarr = grid.mul(255).clamp(0, 255).byte().permute(1, 2, 0).numpy()\n im = Image.fromarray(ndarr)\n im.save(filename)\n", "path": "torchvision/utils.py"}], "after_files": [{"content": "import torch\nimport math\nirange = range\n\n\ndef make_grid(tensor, nrow=8, padding=2,\n normalize=False, range=None, scale_each=False, pad_value=0):\n \"\"\"Make a grid of images.\n\n Args:\n tensor (Tensor or list): 4D mini-batch Tensor of shape (B x C x H x W)\n or a list of images all of the same size.\n nrow (int, optional): Number of images displayed in each row of the grid.\n The Final grid size is (B / nrow, nrow). Default is 8.\n padding (int, optional): amount of padding. Default is 2.\n normalize (bool, optional): If True, shift the image to the range (0, 1),\n by subtracting the minimum and dividing by the maximum pixel value.\n range (tuple, optional): tuple (min, max) where min and max are numbers,\n then these numbers are used to normalize the image. By default, min and max\n are computed from the tensor.\n scale_each (bool, optional): If True, scale each image in the batch of\n images separately rather than the (min, max) over all images.\n pad_value (float, optional): Value for the padded pixels.\n\n Example:\n See this notebook `here <https://gist.github.com/anonymous/bf16430f7750c023141c562f3e9f2a91>`_\n\n \"\"\"\n if not (torch.is_tensor(tensor) or\n (isinstance(tensor, list) and all(torch.is_tensor(t) for t in tensor))):\n raise TypeError('tensor or list of tensors expected, got {}'.format(type(tensor)))\n\n # if list of tensors, convert to a 4D mini-batch Tensor\n if isinstance(tensor, list):\n tensor = torch.stack(tensor, dim=0)\n\n if tensor.dim() == 2: # single image H x W\n tensor = tensor.view(1, tensor.size(0), tensor.size(1))\n if tensor.dim() == 3: # single image\n if tensor.size(0) == 1: # if single-channel, convert to 3-channel\n tensor = torch.cat((tensor, tensor, tensor), 0)\n return tensor\n if tensor.dim() == 4 and tensor.size(1) == 1: # single-channel images\n tensor = torch.cat((tensor, tensor, tensor), 1)\n\n if normalize is True:\n tensor = tensor.clone() # avoid modifying tensor in-place\n if range is not None:\n assert isinstance(range, tuple), \\\n \"range has to be a tuple (min, max) if specified. min and max are numbers\"\n\n def norm_ip(img, min, max):\n img.clamp_(min=min, max=max)\n img.add_(-min).div_(max - min)\n\n def norm_range(t, range):\n if range is not None:\n norm_ip(t, range[0], range[1])\n else:\n norm_ip(t, t.min(), t.max())\n\n if scale_each is True:\n for t in tensor: # loop over mini-batch dimension\n norm_range(t, range)\n else:\n norm_range(tensor, range)\n\n # make the mini-batch of images into a grid\n nmaps = tensor.size(0)\n xmaps = min(nrow, nmaps)\n ymaps = int(math.ceil(float(nmaps) / xmaps))\n height, width = int(tensor.size(2) + padding), int(tensor.size(3) + padding)\n grid = tensor.new(3, height * ymaps + padding, width * xmaps + padding).fill_(pad_value)\n k = 0\n for y in irange(ymaps):\n for x in irange(xmaps):\n if k >= nmaps:\n break\n grid.narrow(1, y * height + padding, height - padding)\\\n .narrow(2, x * width + padding, width - padding)\\\n .copy_(tensor[k])\n k = k + 1\n return grid\n\n\ndef save_image(tensor, filename, nrow=8, padding=2,\n normalize=False, range=None, scale_each=False, pad_value=0):\n \"\"\"Save a given Tensor into an image file.\n\n Args:\n tensor (Tensor or list): Image to be saved. If given a mini-batch tensor,\n saves the tensor as a grid of images by calling ``make_grid``.\n **kwargs: Other arguments are documented in ``make_grid``.\n \"\"\"\n from PIL import Image\n grid = make_grid(tensor, nrow=nrow, padding=padding, pad_value=pad_value,\n normalize=normalize, range=range, scale_each=scale_each)\n ndarr = grid.mul(255).clamp(0, 255).byte().permute(1, 2, 0).cpu().numpy()\n im = Image.fromarray(ndarr)\n im.save(filename)\n", "path": "torchvision/utils.py"}]} | 1,615 | 198 |
gh_patches_debug_4144 | rasdani/github-patches | git_diff | mampfes__hacs_waste_collection_schedule-1676 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Source Request]:
### Municipality / Region
Bedburg
### Collection Calendar Webpage
https://buerger-portal-bedburg.azurewebsites.net/calendar/
### Example Address
Am Schirkerhof 2, Bedburg, Broich
### Collection Data Format
Something different (add to additional information)
### Additional Information
The city of Bedburg change its Waste Collection Provider. Thus the old Abfall+ Source doesn't work anymore. The new provider is Drekopf. Unfortunately they do not use the existing Drekopf API. The Webpage provided by Drekopf has multiple outputs. You can download the calendar via PDF, XLS, CSV and ICS, but i can't find the url. The underlying API seems to be C-Trace, but when calling this API, credentials are requested. Maybe i am missing something...
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `custom_components/waste_collection_schedule/waste_collection_schedule/source/buergerportal_de.py`
Content:
```
1 import datetime
2 import re
3 from dataclasses import dataclass
4 from typing import List, Literal, Optional, TypedDict, Union
5
6 import requests
7 from waste_collection_schedule import Collection # type: ignore[attr-defined]
8
9 TITLE = "Bürgerportal"
10 URL = "https://www.c-trace.de"
11 DESCRIPTION = "Source for waste collection in multiple service areas."
12
13
14 def EXTRA_INFO():
15 return [{"title": s["title"], "url": s["url"]} for s in SERVICE_MAP]
16
17
18 TEST_CASES = {
19 "Cochem-Zell": {
20 "operator": "cochem_zell",
21 "district": "Bullay",
22 "subdistrict": "Bullay",
23 "street": "Layenweg",
24 "number": 3,
25 },
26 "Alb-Donau": {
27 "operator": "alb_donau",
28 "district": "Blaubeuren",
29 "street": "Alberstraße",
30 "number": 3,
31 },
32 "Biedenkopf": {
33 "operator": "biedenkopf",
34 "district": "Biedenkopf",
35 "subdistrict": "Breidenstein",
36 "street": "Auf dem Hammer",
37 "number": 1,
38 },
39 }
40
41 ICON_MAP = {
42 "mobil": "mdi:truck",
43 "bio": "mdi:leaf",
44 "papier": "mdi:package-variant",
45 "verpackung": "mdi:recycle",
46 "gelb": "mdi:recycle",
47 "lvp": "mdi:recycle",
48 "rest": "mdi:trash-can",
49 "gruen": "mdi:forest",
50 "grün": "mdi:forest",
51 "baum": "mdi:forest",
52 "schnitt": "mdi:forest",
53 "schad": "mdi:biohazard",
54 }
55 API_HEADERS = {
56 "Accept": "application/json, text/plain;q=0.5",
57 "Cache-Control": "no-cache",
58 }
59 Operator = Literal["cochem_zell", "alb_donau", "biedenkopf"]
60
61 SERVICE_MAP = [
62 {
63 "title": "KV Cochem-Zell",
64 "url": "https://www.cochem-zell-online.de/",
65 "api_url": "https://buerger-portal-cochemzell.azurewebsites.net/api",
66 "operator": "cochem_zell",
67 },
68 {
69 "title": "Abfallwirtschaft Alb-Donau-Kreis",
70 "url": "https://www.aw-adk.de/",
71 "api_url": "https://buerger-portal-albdonaukreisabfallwirtschaft.azurewebsites.net/api",
72 "operator": "alb_donau",
73 },
74 {
75 "title": "MZV Biedenkopf",
76 "url": "https://mzv-biedenkopf.de/",
77 "api_url": "https://biedenkopfmzv.buergerportal.digital/api",
78 "operator": "biedenkopf",
79 },
80 ]
81
82
83 # This datalcass is used for adding entries to a set and remove duplicate entries.
84 # The default `Collection` extends the standard dict and thus is not hashable.
85 @dataclass(frozen=True, eq=True)
86 class CollectionEntry:
87 date: datetime.date
88 waste_type: str
89 icon: Optional[str]
90
91 def export(self) -> Collection:
92 return Collection(self.date, self.waste_type, self.icon)
93
94
95 def quote_none(value: Optional[str]) -> str:
96 if value is None:
97 return "null"
98
99 return f"'{value}'"
100
101
102 def get_api_map():
103 return {s["operator"]: s["api_url"] for s in SERVICE_MAP}
104
105
106 class Source:
107 def __init__(
108 self,
109 operator: Operator,
110 district: str,
111 street: str,
112 subdistrict: Optional[str] = None,
113 number: Union[int, str, None] = None,
114 show_volume: bool = False,
115 ):
116 self.api_url = get_api_map()[operator]
117 self.district = district
118 self.subdistrict = subdistrict
119 self.street = street
120 self.number = number
121 self.show_volume = show_volume
122
123 def fetch(self) -> list[Collection]:
124 session = requests.session()
125 session.headers.update(API_HEADERS)
126
127 year = datetime.datetime.now().year
128 entries: set[CollectionEntry] = set()
129
130 district_id = self.fetch_district_id(session)
131 street_id = self.fetch_street_id(session, district_id)
132 # Eventually verify house number in the future
133
134 params = {
135 "$expand": "Abfuhrplan,Abfuhrplan/GefaesstarifArt/Abfallart,Abfuhrplan/GefaesstarifArt/Volumen",
136 "$orderby": "Abfuhrplan/GefaesstarifArt/Abfallart/Name,Abfuhrplan/GefaesstarifArt/Volumen/VolumenWert",
137 "orteId": district_id,
138 "strassenId": street_id,
139 "jahr": year,
140 }
141
142 if self.number:
143 params["hausNr"] = (f"'{self.number}'",)
144
145 res = session.get(
146 f"{self.api_url}/AbfuhrtermineAbJahr",
147 params=params,
148 )
149 res.raise_for_status()
150 payload: CollectionsRes = res.json()
151
152 date_regex = re.compile(r"\d+")
153
154 for collection in payload["d"]:
155 if date_match := re.search(date_regex, collection["Termin"]):
156 timestamp = float(date_match.group())
157 date = datetime.datetime.utcfromtimestamp(timestamp / 1000).date()
158 waste_type = collection["Abfuhrplan"]["GefaesstarifArt"]["Abfallart"][
159 "Name"
160 ]
161 icon = None
162
163 for icon_type, tested_icon in ICON_MAP.items():
164 if icon_type.lower() in waste_type.lower():
165 icon = tested_icon
166
167 if self.show_volume:
168 volume = int(
169 collection["Abfuhrplan"]["GefaesstarifArt"]["Volumen"][
170 "VolumenWert"
171 ]
172 )
173 waste_type = f"{waste_type} ({volume} l)"
174
175 entries.add(CollectionEntry(date, waste_type, icon))
176
177 if len(entries) == 0:
178 raise ValueError(
179 "No collections found! Please verify that your configuration is correct."
180 )
181
182 return [entry.export() for entry in entries]
183
184 def fetch_district_id(self, session: requests.Session) -> int:
185 res = session.get(
186 f"{self.api_url}/OrteMitOrtsteilen",
187 headers=API_HEADERS,
188 )
189 res.raise_for_status()
190 payload: DistrictsRes = res.json()
191
192 try:
193 return next(
194 entry["OrteId"]
195 for entry in payload["d"]
196 if entry["Ortsname"] == self.district
197 and entry["Ortsteilname"] == self.subdistrict
198 )
199 except StopIteration:
200 raise ValueError(
201 "District id cannot be fetched. "
202 "Please make sure that you entered a subdistrict if there is a comma on the website."
203 )
204
205 def fetch_street_id(self, session: requests.Session, district_id: int):
206 res = session.get(
207 f"{self.api_url}/Strassen",
208 params={
209 "$filter": f"Ort/OrteId eq {district_id} and OrtsteilName eq {quote_none(self.subdistrict)}",
210 "$orderby": "Name asc",
211 },
212 headers=API_HEADERS,
213 )
214 res.raise_for_status()
215 payload: StreetsRes = res.json()
216
217 try:
218 return next(
219 entry["StrassenId"]
220 for entry in payload["d"]
221 if entry["Name"] == self.street
222 )
223 except StopIteration:
224 raise ValueError(
225 "Street ID cannot be fetched. Please verify your configuration."
226 )
227
228
229 # Typed dictionaries for the API
230 # Automatically generated using https://pytyper.dev/
231
232
233 class DistrictRes(TypedDict):
234 OrteId: int
235 Ortsname: str
236 Ortsteilname: Optional[str]
237
238
239 class DistrictsRes(TypedDict):
240 d: List[DistrictRes]
241
242
243 class StreetRes(TypedDict):
244 StrassenId: int
245 Name: str
246 Plz: str
247
248
249 class StreetsRes(TypedDict):
250 d: List[StreetRes]
251
252
253 class Capacity(TypedDict):
254 VolumenId: int
255 VolumenWert: str
256
257
258 class WasteType(TypedDict):
259 AbfallartenId: int
260 Code: str
261 Name: str
262 Farbe: str
263 IsBio: bool
264 IsPapier: bool
265 IsRest: bool
266 IsWertstoff: bool
267 Bemerkung: None
268 Aktiv: None
269 IsSchadstoff: None
270
271
272 class ContainerType(TypedDict):
273 GefaesstarifArtenId: int
274 BescheidText: None
275 BescheidTextLeerungsgebuehr: None
276 Bezeichnung: str
277 GefaesstarifArtVerwenden: bool
278 GefaesstarifArtVerwendenAbfallkalender: bool
279 Bemerkung: None
280 Volumen: Capacity
281 Abfallart: WasteType
282 # Abfuhrrhythmus: Abfuhrrhythmus
283
284
285 class CollectionPlan(TypedDict):
286 AbfuhrplaeneId: int
287 Jahr: int
288 GefaesstarifArt: ContainerType
289 # AbfallartenObj: Abfuhrrhythmus
290
291
292 class CollectionRes(TypedDict):
293 AbfuhrtermineId: int
294 Termin: str
295 Abfuhrplan: CollectionPlan
296
297
298 class CollectionsRes(TypedDict):
299 d: List[CollectionRes]
300
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/custom_components/waste_collection_schedule/waste_collection_schedule/source/buergerportal_de.py b/custom_components/waste_collection_schedule/waste_collection_schedule/source/buergerportal_de.py
--- a/custom_components/waste_collection_schedule/waste_collection_schedule/source/buergerportal_de.py
+++ b/custom_components/waste_collection_schedule/waste_collection_schedule/source/buergerportal_de.py
@@ -77,6 +77,12 @@
"api_url": "https://biedenkopfmzv.buergerportal.digital/api",
"operator": "biedenkopf",
},
+ {
+ "title": "Bürgerportal Bedburg",
+ "url": "https://www.bedburg.de/",
+ "api_url": "https://buerger-portal-bedburg.azurewebsites.net",
+ "operator": "bedburg",
+ },
]
| {"golden_diff": "diff --git a/custom_components/waste_collection_schedule/waste_collection_schedule/source/buergerportal_de.py b/custom_components/waste_collection_schedule/waste_collection_schedule/source/buergerportal_de.py\n--- a/custom_components/waste_collection_schedule/waste_collection_schedule/source/buergerportal_de.py\n+++ b/custom_components/waste_collection_schedule/waste_collection_schedule/source/buergerportal_de.py\n@@ -77,6 +77,12 @@\n \"api_url\": \"https://biedenkopfmzv.buergerportal.digital/api\",\n \"operator\": \"biedenkopf\",\n },\n+ {\n+ \"title\": \"B\u00fcrgerportal Bedburg\",\n+ \"url\": \"https://www.bedburg.de/\",\n+ \"api_url\": \"https://buerger-portal-bedburg.azurewebsites.net\",\n+ \"operator\": \"bedburg\",\n+ },\n ]\n", "issue": "[Source Request]: \n### Municipality / Region\n\nBedburg\n\n### Collection Calendar Webpage\n\nhttps://buerger-portal-bedburg.azurewebsites.net/calendar/\n\n### Example Address\n\nAm Schirkerhof 2, Bedburg, Broich\n\n### Collection Data Format\n\nSomething different (add to additional information)\n\n### Additional Information\n\nThe city of Bedburg change its Waste Collection Provider. Thus the old Abfall+ Source doesn't work anymore. The new provider is Drekopf. Unfortunately they do not use the existing Drekopf API. The Webpage provided by Drekopf has multiple outputs. You can download the calendar via PDF, XLS, CSV and ICS, but i can't find the url. The underlying API seems to be C-Trace, but when calling this API, credentials are requested. Maybe i am missing something...\n", "before_files": [{"content": "import datetime\nimport re\nfrom dataclasses import dataclass\nfrom typing import List, Literal, Optional, TypedDict, Union\n\nimport requests\nfrom waste_collection_schedule import Collection # type: ignore[attr-defined]\n\nTITLE = \"B\u00fcrgerportal\"\nURL = \"https://www.c-trace.de\"\nDESCRIPTION = \"Source for waste collection in multiple service areas.\"\n\n\ndef EXTRA_INFO():\n return [{\"title\": s[\"title\"], \"url\": s[\"url\"]} for s in SERVICE_MAP]\n\n\nTEST_CASES = {\n \"Cochem-Zell\": {\n \"operator\": \"cochem_zell\",\n \"district\": \"Bullay\",\n \"subdistrict\": \"Bullay\",\n \"street\": \"Layenweg\",\n \"number\": 3,\n },\n \"Alb-Donau\": {\n \"operator\": \"alb_donau\",\n \"district\": \"Blaubeuren\",\n \"street\": \"Alberstra\u00dfe\",\n \"number\": 3,\n },\n \"Biedenkopf\": {\n \"operator\": \"biedenkopf\",\n \"district\": \"Biedenkopf\",\n \"subdistrict\": \"Breidenstein\",\n \"street\": \"Auf dem Hammer\",\n \"number\": 1,\n },\n}\n\nICON_MAP = {\n \"mobil\": \"mdi:truck\",\n \"bio\": \"mdi:leaf\",\n \"papier\": \"mdi:package-variant\",\n \"verpackung\": \"mdi:recycle\",\n \"gelb\": \"mdi:recycle\",\n \"lvp\": \"mdi:recycle\",\n \"rest\": \"mdi:trash-can\",\n \"gruen\": \"mdi:forest\",\n \"gr\u00fcn\": \"mdi:forest\",\n \"baum\": \"mdi:forest\",\n \"schnitt\": \"mdi:forest\",\n \"schad\": \"mdi:biohazard\",\n}\nAPI_HEADERS = {\n \"Accept\": \"application/json, text/plain;q=0.5\",\n \"Cache-Control\": \"no-cache\",\n}\nOperator = Literal[\"cochem_zell\", \"alb_donau\", \"biedenkopf\"]\n\nSERVICE_MAP = [\n {\n \"title\": \"KV Cochem-Zell\",\n \"url\": \"https://www.cochem-zell-online.de/\",\n \"api_url\": \"https://buerger-portal-cochemzell.azurewebsites.net/api\",\n \"operator\": \"cochem_zell\",\n },\n {\n \"title\": \"Abfallwirtschaft Alb-Donau-Kreis\",\n \"url\": \"https://www.aw-adk.de/\",\n \"api_url\": \"https://buerger-portal-albdonaukreisabfallwirtschaft.azurewebsites.net/api\",\n \"operator\": \"alb_donau\",\n },\n {\n \"title\": \"MZV Biedenkopf\",\n \"url\": \"https://mzv-biedenkopf.de/\",\n \"api_url\": \"https://biedenkopfmzv.buergerportal.digital/api\",\n \"operator\": \"biedenkopf\",\n },\n]\n\n\n# This datalcass is used for adding entries to a set and remove duplicate entries.\n# The default `Collection` extends the standard dict and thus is not hashable.\n@dataclass(frozen=True, eq=True)\nclass CollectionEntry:\n date: datetime.date\n waste_type: str\n icon: Optional[str]\n\n def export(self) -> Collection:\n return Collection(self.date, self.waste_type, self.icon)\n\n\ndef quote_none(value: Optional[str]) -> str:\n if value is None:\n return \"null\"\n\n return f\"'{value}'\"\n\n\ndef get_api_map():\n return {s[\"operator\"]: s[\"api_url\"] for s in SERVICE_MAP}\n\n\nclass Source:\n def __init__(\n self,\n operator: Operator,\n district: str,\n street: str,\n subdistrict: Optional[str] = None,\n number: Union[int, str, None] = None,\n show_volume: bool = False,\n ):\n self.api_url = get_api_map()[operator]\n self.district = district\n self.subdistrict = subdistrict\n self.street = street\n self.number = number\n self.show_volume = show_volume\n\n def fetch(self) -> list[Collection]:\n session = requests.session()\n session.headers.update(API_HEADERS)\n\n year = datetime.datetime.now().year\n entries: set[CollectionEntry] = set()\n\n district_id = self.fetch_district_id(session)\n street_id = self.fetch_street_id(session, district_id)\n # Eventually verify house number in the future\n\n params = {\n \"$expand\": \"Abfuhrplan,Abfuhrplan/GefaesstarifArt/Abfallart,Abfuhrplan/GefaesstarifArt/Volumen\",\n \"$orderby\": \"Abfuhrplan/GefaesstarifArt/Abfallart/Name,Abfuhrplan/GefaesstarifArt/Volumen/VolumenWert\",\n \"orteId\": district_id,\n \"strassenId\": street_id,\n \"jahr\": year,\n }\n\n if self.number:\n params[\"hausNr\"] = (f\"'{self.number}'\",)\n\n res = session.get(\n f\"{self.api_url}/AbfuhrtermineAbJahr\",\n params=params,\n )\n res.raise_for_status()\n payload: CollectionsRes = res.json()\n\n date_regex = re.compile(r\"\\d+\")\n\n for collection in payload[\"d\"]:\n if date_match := re.search(date_regex, collection[\"Termin\"]):\n timestamp = float(date_match.group())\n date = datetime.datetime.utcfromtimestamp(timestamp / 1000).date()\n waste_type = collection[\"Abfuhrplan\"][\"GefaesstarifArt\"][\"Abfallart\"][\n \"Name\"\n ]\n icon = None\n\n for icon_type, tested_icon in ICON_MAP.items():\n if icon_type.lower() in waste_type.lower():\n icon = tested_icon\n\n if self.show_volume:\n volume = int(\n collection[\"Abfuhrplan\"][\"GefaesstarifArt\"][\"Volumen\"][\n \"VolumenWert\"\n ]\n )\n waste_type = f\"{waste_type} ({volume} l)\"\n\n entries.add(CollectionEntry(date, waste_type, icon))\n\n if len(entries) == 0:\n raise ValueError(\n \"No collections found! Please verify that your configuration is correct.\"\n )\n\n return [entry.export() for entry in entries]\n\n def fetch_district_id(self, session: requests.Session) -> int:\n res = session.get(\n f\"{self.api_url}/OrteMitOrtsteilen\",\n headers=API_HEADERS,\n )\n res.raise_for_status()\n payload: DistrictsRes = res.json()\n\n try:\n return next(\n entry[\"OrteId\"]\n for entry in payload[\"d\"]\n if entry[\"Ortsname\"] == self.district\n and entry[\"Ortsteilname\"] == self.subdistrict\n )\n except StopIteration:\n raise ValueError(\n \"District id cannot be fetched. \"\n \"Please make sure that you entered a subdistrict if there is a comma on the website.\"\n )\n\n def fetch_street_id(self, session: requests.Session, district_id: int):\n res = session.get(\n f\"{self.api_url}/Strassen\",\n params={\n \"$filter\": f\"Ort/OrteId eq {district_id} and OrtsteilName eq {quote_none(self.subdistrict)}\",\n \"$orderby\": \"Name asc\",\n },\n headers=API_HEADERS,\n )\n res.raise_for_status()\n payload: StreetsRes = res.json()\n\n try:\n return next(\n entry[\"StrassenId\"]\n for entry in payload[\"d\"]\n if entry[\"Name\"] == self.street\n )\n except StopIteration:\n raise ValueError(\n \"Street ID cannot be fetched. Please verify your configuration.\"\n )\n\n\n# Typed dictionaries for the API\n# Automatically generated using https://pytyper.dev/\n\n\nclass DistrictRes(TypedDict):\n OrteId: int\n Ortsname: str\n Ortsteilname: Optional[str]\n\n\nclass DistrictsRes(TypedDict):\n d: List[DistrictRes]\n\n\nclass StreetRes(TypedDict):\n StrassenId: int\n Name: str\n Plz: str\n\n\nclass StreetsRes(TypedDict):\n d: List[StreetRes]\n\n\nclass Capacity(TypedDict):\n VolumenId: int\n VolumenWert: str\n\n\nclass WasteType(TypedDict):\n AbfallartenId: int\n Code: str\n Name: str\n Farbe: str\n IsBio: bool\n IsPapier: bool\n IsRest: bool\n IsWertstoff: bool\n Bemerkung: None\n Aktiv: None\n IsSchadstoff: None\n\n\nclass ContainerType(TypedDict):\n GefaesstarifArtenId: int\n BescheidText: None\n BescheidTextLeerungsgebuehr: None\n Bezeichnung: str\n GefaesstarifArtVerwenden: bool\n GefaesstarifArtVerwendenAbfallkalender: bool\n Bemerkung: None\n Volumen: Capacity\n Abfallart: WasteType\n # Abfuhrrhythmus: Abfuhrrhythmus\n\n\nclass CollectionPlan(TypedDict):\n AbfuhrplaeneId: int\n Jahr: int\n GefaesstarifArt: ContainerType\n # AbfallartenObj: Abfuhrrhythmus\n\n\nclass CollectionRes(TypedDict):\n AbfuhrtermineId: int\n Termin: str\n Abfuhrplan: CollectionPlan\n\n\nclass CollectionsRes(TypedDict):\n d: List[CollectionRes]\n", "path": "custom_components/waste_collection_schedule/waste_collection_schedule/source/buergerportal_de.py"}], "after_files": [{"content": "import datetime\nimport re\nfrom dataclasses import dataclass\nfrom typing import List, Literal, Optional, TypedDict, Union\n\nimport requests\nfrom waste_collection_schedule import Collection # type: ignore[attr-defined]\n\nTITLE = \"B\u00fcrgerportal\"\nURL = \"https://www.c-trace.de\"\nDESCRIPTION = \"Source for waste collection in multiple service areas.\"\n\n\ndef EXTRA_INFO():\n return [{\"title\": s[\"title\"], \"url\": s[\"url\"]} for s in SERVICE_MAP]\n\n\nTEST_CASES = {\n \"Cochem-Zell\": {\n \"operator\": \"cochem_zell\",\n \"district\": \"Bullay\",\n \"subdistrict\": \"Bullay\",\n \"street\": \"Layenweg\",\n \"number\": 3,\n },\n \"Alb-Donau\": {\n \"operator\": \"alb_donau\",\n \"district\": \"Blaubeuren\",\n \"street\": \"Alberstra\u00dfe\",\n \"number\": 3,\n },\n \"Biedenkopf\": {\n \"operator\": \"biedenkopf\",\n \"district\": \"Biedenkopf\",\n \"subdistrict\": \"Breidenstein\",\n \"street\": \"Auf dem Hammer\",\n \"number\": 1,\n },\n}\n\nICON_MAP = {\n \"mobil\": \"mdi:truck\",\n \"bio\": \"mdi:leaf\",\n \"papier\": \"mdi:package-variant\",\n \"verpackung\": \"mdi:recycle\",\n \"gelb\": \"mdi:recycle\",\n \"lvp\": \"mdi:recycle\",\n \"rest\": \"mdi:trash-can\",\n \"gruen\": \"mdi:forest\",\n \"gr\u00fcn\": \"mdi:forest\",\n \"baum\": \"mdi:forest\",\n \"schnitt\": \"mdi:forest\",\n \"schad\": \"mdi:biohazard\",\n}\nAPI_HEADERS = {\n \"Accept\": \"application/json, text/plain;q=0.5\",\n \"Cache-Control\": \"no-cache\",\n}\nOperator = Literal[\"cochem_zell\", \"alb_donau\", \"biedenkopf\"]\n\nSERVICE_MAP = [\n {\n \"title\": \"KV Cochem-Zell\",\n \"url\": \"https://www.cochem-zell-online.de/\",\n \"api_url\": \"https://buerger-portal-cochemzell.azurewebsites.net/api\",\n \"operator\": \"cochem_zell\",\n },\n {\n \"title\": \"Abfallwirtschaft Alb-Donau-Kreis\",\n \"url\": \"https://www.aw-adk.de/\",\n \"api_url\": \"https://buerger-portal-albdonaukreisabfallwirtschaft.azurewebsites.net/api\",\n \"operator\": \"alb_donau\",\n },\n {\n \"title\": \"MZV Biedenkopf\",\n \"url\": \"https://mzv-biedenkopf.de/\",\n \"api_url\": \"https://biedenkopfmzv.buergerportal.digital/api\",\n \"operator\": \"biedenkopf\",\n },\n {\n \"title\": \"B\u00fcrgerportal Bedburg\",\n \"url\": \"https://www.bedburg.de/\",\n \"api_url\": \"https://buerger-portal-bedburg.azurewebsites.net\",\n \"operator\": \"bedburg\",\n },\n]\n\n\n# This datalcass is used for adding entries to a set and remove duplicate entries.\n# The default `Collection` extends the standard dict and thus is not hashable.\n@dataclass(frozen=True, eq=True)\nclass CollectionEntry:\n date: datetime.date\n waste_type: str\n icon: Optional[str]\n\n def export(self) -> Collection:\n return Collection(self.date, self.waste_type, self.icon)\n\n\ndef quote_none(value: Optional[str]) -> str:\n if value is None:\n return \"null\"\n\n return f\"'{value}'\"\n\n\ndef get_api_map():\n return {s[\"operator\"]: s[\"api_url\"] for s in SERVICE_MAP}\n\n\nclass Source:\n def __init__(\n self,\n operator: Operator,\n district: str,\n street: str,\n subdistrict: Optional[str] = None,\n number: Union[int, str, None] = None,\n show_volume: bool = False,\n ):\n self.api_url = get_api_map()[operator]\n self.district = district\n self.subdistrict = subdistrict\n self.street = street\n self.number = number\n self.show_volume = show_volume\n\n def fetch(self) -> list[Collection]:\n session = requests.session()\n session.headers.update(API_HEADERS)\n\n year = datetime.datetime.now().year\n entries: set[CollectionEntry] = set()\n\n district_id = self.fetch_district_id(session)\n street_id = self.fetch_street_id(session, district_id)\n # Eventually verify house number in the future\n\n params = {\n \"$expand\": \"Abfuhrplan,Abfuhrplan/GefaesstarifArt/Abfallart,Abfuhrplan/GefaesstarifArt/Volumen\",\n \"$orderby\": \"Abfuhrplan/GefaesstarifArt/Abfallart/Name,Abfuhrplan/GefaesstarifArt/Volumen/VolumenWert\",\n \"orteId\": district_id,\n \"strassenId\": street_id,\n \"jahr\": year,\n }\n\n if self.number:\n params[\"hausNr\"] = (f\"'{self.number}'\",)\n\n res = session.get(\n f\"{self.api_url}/AbfuhrtermineAbJahr\",\n params=params,\n )\n res.raise_for_status()\n payload: CollectionsRes = res.json()\n\n date_regex = re.compile(r\"\\d+\")\n\n for collection in payload[\"d\"]:\n if date_match := re.search(date_regex, collection[\"Termin\"]):\n timestamp = float(date_match.group())\n date = datetime.datetime.utcfromtimestamp(timestamp / 1000).date()\n waste_type = collection[\"Abfuhrplan\"][\"GefaesstarifArt\"][\"Abfallart\"][\n \"Name\"\n ]\n icon = None\n\n for icon_type, tested_icon in ICON_MAP.items():\n if icon_type.lower() in waste_type.lower():\n icon = tested_icon\n\n if self.show_volume:\n volume = int(\n collection[\"Abfuhrplan\"][\"GefaesstarifArt\"][\"Volumen\"][\n \"VolumenWert\"\n ]\n )\n waste_type = f\"{waste_type} ({volume} l)\"\n\n entries.add(CollectionEntry(date, waste_type, icon))\n\n if len(entries) == 0:\n raise ValueError(\n \"No collections found! Please verify that your configuration is correct.\"\n )\n\n return [entry.export() for entry in entries]\n\n def fetch_district_id(self, session: requests.Session) -> int:\n res = session.get(\n f\"{self.api_url}/OrteMitOrtsteilen\",\n headers=API_HEADERS,\n )\n res.raise_for_status()\n payload: DistrictsRes = res.json()\n\n try:\n return next(\n entry[\"OrteId\"]\n for entry in payload[\"d\"]\n if entry[\"Ortsname\"] == self.district\n and entry[\"Ortsteilname\"] == self.subdistrict\n )\n except StopIteration:\n raise ValueError(\n \"District id cannot be fetched. \"\n \"Please make sure that you entered a subdistrict if there is a comma on the website.\"\n )\n\n def fetch_street_id(self, session: requests.Session, district_id: int):\n res = session.get(\n f\"{self.api_url}/Strassen\",\n params={\n \"$filter\": f\"Ort/OrteId eq {district_id} and OrtsteilName eq {quote_none(self.subdistrict)}\",\n \"$orderby\": \"Name asc\",\n },\n headers=API_HEADERS,\n )\n res.raise_for_status()\n payload: StreetsRes = res.json()\n\n try:\n return next(\n entry[\"StrassenId\"]\n for entry in payload[\"d\"]\n if entry[\"Name\"] == self.street\n )\n except StopIteration:\n raise ValueError(\n \"Street ID cannot be fetched. Please verify your configuration.\"\n )\n\n\n# Typed dictionaries for the API\n# Automatically generated using https://pytyper.dev/\n\n\nclass DistrictRes(TypedDict):\n OrteId: int\n Ortsname: str\n Ortsteilname: Optional[str]\n\n\nclass DistrictsRes(TypedDict):\n d: List[DistrictRes]\n\n\nclass StreetRes(TypedDict):\n StrassenId: int\n Name: str\n Plz: str\n\n\nclass StreetsRes(TypedDict):\n d: List[StreetRes]\n\n\nclass Capacity(TypedDict):\n VolumenId: int\n VolumenWert: str\n\n\nclass WasteType(TypedDict):\n AbfallartenId: int\n Code: str\n Name: str\n Farbe: str\n IsBio: bool\n IsPapier: bool\n IsRest: bool\n IsWertstoff: bool\n Bemerkung: None\n Aktiv: None\n IsSchadstoff: None\n\n\nclass ContainerType(TypedDict):\n GefaesstarifArtenId: int\n BescheidText: None\n BescheidTextLeerungsgebuehr: None\n Bezeichnung: str\n GefaesstarifArtVerwenden: bool\n GefaesstarifArtVerwendenAbfallkalender: bool\n Bemerkung: None\n Volumen: Capacity\n Abfallart: WasteType\n # Abfuhrrhythmus: Abfuhrrhythmus\n\n\nclass CollectionPlan(TypedDict):\n AbfuhrplaeneId: int\n Jahr: int\n GefaesstarifArt: ContainerType\n # AbfallartenObj: Abfuhrrhythmus\n\n\nclass CollectionRes(TypedDict):\n AbfuhrtermineId: int\n Termin: str\n Abfuhrplan: CollectionPlan\n\n\nclass CollectionsRes(TypedDict):\n d: List[CollectionRes]\n", "path": "custom_components/waste_collection_schedule/waste_collection_schedule/source/buergerportal_de.py"}]} | 3,408 | 190 |
gh_patches_debug_35355 | rasdani/github-patches | git_diff | scikit-image__scikit-image-2134 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`min_size` is not strictly conformed in the implementation of felzenszwalb
## Description
With `min_size` specified, there're still some segments with sizes that less than it. I don't know if it is an inherent flaw of the algorithm.
## Way to reproduce
```
>>> I = skimage.io.imread('dragonbaby.jpg')
>>> fz = felzenszwalb(I, scale=300, sigma=0.8, min_size=80)
>>> (fz==9).sum()
1
```

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `skimage/segmentation/_felzenszwalb.py`
Content:
```
1 import numpy as np
2
3 from .._shared.utils import warn
4 from ._felzenszwalb_cy import _felzenszwalb_grey
5
6
7 def felzenszwalb(image, scale=1, sigma=0.8, min_size=20):
8 """Computes Felsenszwalb's efficient graph based image segmentation.
9
10 Produces an oversegmentation of a multichannel (i.e. RGB) image
11 using a fast, minimum spanning tree based clustering on the image grid.
12 The parameter ``scale`` sets an observation level. Higher scale means
13 less and larger segments. ``sigma`` is the diameter of a Gaussian kernel,
14 used for smoothing the image prior to segmentation.
15
16 The number of produced segments as well as their size can only be
17 controlled indirectly through ``scale``. Segment size within an image can
18 vary greatly depending on local contrast.
19
20 For RGB images, the algorithm computes a separate segmentation for each
21 channel and then combines these. The combined segmentation is the
22 intersection of the separate segmentations on the color channels.
23
24 Parameters
25 ----------
26 image : (width, height, 3) or (width, height) ndarray
27 Input image.
28 scale : float
29 Free parameter. Higher means larger clusters.
30 sigma : float
31 Width of Gaussian kernel used in preprocessing.
32 min_size : int
33 Minimum component size. Enforced using postprocessing.
34
35 Returns
36 -------
37 segment_mask : (width, height) ndarray
38 Integer mask indicating segment labels.
39
40 References
41 ----------
42 .. [1] Efficient graph-based image segmentation, Felzenszwalb, P.F. and
43 Huttenlocher, D.P. International Journal of Computer Vision, 2004
44
45 Examples
46 --------
47 >>> from skimage.segmentation import felzenszwalb
48 >>> from skimage.data import coffee
49 >>> img = coffee()
50 >>> segments = felzenszwalb(img, scale=3.0, sigma=0.95, min_size=5)
51 """
52
53 if image.ndim == 2:
54 # assume single channel image
55 return _felzenszwalb_grey(image, scale=scale, sigma=sigma,
56 min_size=min_size)
57
58 elif image.ndim != 3:
59 raise ValueError("Felzenswalb segmentation can only operate on RGB and"
60 " grey images, but input array of ndim %d given."
61 % image.ndim)
62
63 # assume we got 2d image with multiple channels
64 n_channels = image.shape[2]
65 if n_channels != 3:
66 warn("Got image with %d channels. Is that really what you"
67 " wanted?" % image.shape[2])
68 segmentations = []
69 # compute quickshift for each channel
70 for c in range(n_channels):
71 channel = np.ascontiguousarray(image[:, :, c])
72 s = _felzenszwalb_grey(channel, scale=scale, sigma=sigma,
73 min_size=min_size)
74 segmentations.append(s)
75
76 # put pixels in same segment only if in the same segment in all images
77 # we do this by combining the channels to one number
78 n0 = segmentations[0].max() + 1
79 n1 = segmentations[1].max() + 1
80 segmentation = (segmentations[0] + segmentations[1] * n0
81 + segmentations[2] * n0 * n1)
82 # make segment labels consecutive numbers starting at 0
83 labels = np.unique(segmentation, return_inverse=True)[1]
84 return labels.reshape(image.shape[:2])
85
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/skimage/segmentation/_felzenszwalb.py b/skimage/segmentation/_felzenszwalb.py
--- a/skimage/segmentation/_felzenszwalb.py
+++ b/skimage/segmentation/_felzenszwalb.py
@@ -1,7 +1,7 @@
import numpy as np
from .._shared.utils import warn
-from ._felzenszwalb_cy import _felzenszwalb_grey
+from ._felzenszwalb_cy import _felzenszwalb_cython
def felzenszwalb(image, scale=1, sigma=0.8, min_size=20):
@@ -17,9 +17,8 @@
controlled indirectly through ``scale``. Segment size within an image can
vary greatly depending on local contrast.
- For RGB images, the algorithm computes a separate segmentation for each
- channel and then combines these. The combined segmentation is the
- intersection of the separate segmentations on the color channels.
+ For RGB images, the algorithm uses the euclidean distance between pixels in
+ color space.
Parameters
----------
@@ -50,35 +49,6 @@
>>> segments = felzenszwalb(img, scale=3.0, sigma=0.95, min_size=5)
"""
- if image.ndim == 2:
- # assume single channel image
- return _felzenszwalb_grey(image, scale=scale, sigma=sigma,
- min_size=min_size)
-
- elif image.ndim != 3:
- raise ValueError("Felzenswalb segmentation can only operate on RGB and"
- " grey images, but input array of ndim %d given."
- % image.ndim)
-
- # assume we got 2d image with multiple channels
- n_channels = image.shape[2]
- if n_channels != 3:
- warn("Got image with %d channels. Is that really what you"
- " wanted?" % image.shape[2])
- segmentations = []
- # compute quickshift for each channel
- for c in range(n_channels):
- channel = np.ascontiguousarray(image[:, :, c])
- s = _felzenszwalb_grey(channel, scale=scale, sigma=sigma,
- min_size=min_size)
- segmentations.append(s)
-
- # put pixels in same segment only if in the same segment in all images
- # we do this by combining the channels to one number
- n0 = segmentations[0].max() + 1
- n1 = segmentations[1].max() + 1
- segmentation = (segmentations[0] + segmentations[1] * n0
- + segmentations[2] * n0 * n1)
- # make segment labels consecutive numbers starting at 0
- labels = np.unique(segmentation, return_inverse=True)[1]
- return labels.reshape(image.shape[:2])
+ image = np.atleast_3d(image)
+ return _felzenszwalb_cython(image, scale=scale, sigma=sigma,
+ min_size=min_size)
| {"golden_diff": "diff --git a/skimage/segmentation/_felzenszwalb.py b/skimage/segmentation/_felzenszwalb.py\n--- a/skimage/segmentation/_felzenszwalb.py\n+++ b/skimage/segmentation/_felzenszwalb.py\n@@ -1,7 +1,7 @@\n import numpy as np\n \n from .._shared.utils import warn\n-from ._felzenszwalb_cy import _felzenszwalb_grey\n+from ._felzenszwalb_cy import _felzenszwalb_cython\n \n \n def felzenszwalb(image, scale=1, sigma=0.8, min_size=20):\n@@ -17,9 +17,8 @@\n controlled indirectly through ``scale``. Segment size within an image can\n vary greatly depending on local contrast.\n \n- For RGB images, the algorithm computes a separate segmentation for each\n- channel and then combines these. The combined segmentation is the\n- intersection of the separate segmentations on the color channels.\n+ For RGB images, the algorithm uses the euclidean distance between pixels in\n+ color space.\n \n Parameters\n ----------\n@@ -50,35 +49,6 @@\n >>> segments = felzenszwalb(img, scale=3.0, sigma=0.95, min_size=5)\n \"\"\"\n \n- if image.ndim == 2:\n- # assume single channel image\n- return _felzenszwalb_grey(image, scale=scale, sigma=sigma,\n- min_size=min_size)\n-\n- elif image.ndim != 3:\n- raise ValueError(\"Felzenswalb segmentation can only operate on RGB and\"\n- \" grey images, but input array of ndim %d given.\"\n- % image.ndim)\n-\n- # assume we got 2d image with multiple channels\n- n_channels = image.shape[2]\n- if n_channels != 3:\n- warn(\"Got image with %d channels. Is that really what you\"\n- \" wanted?\" % image.shape[2])\n- segmentations = []\n- # compute quickshift for each channel\n- for c in range(n_channels):\n- channel = np.ascontiguousarray(image[:, :, c])\n- s = _felzenszwalb_grey(channel, scale=scale, sigma=sigma,\n- min_size=min_size)\n- segmentations.append(s)\n-\n- # put pixels in same segment only if in the same segment in all images\n- # we do this by combining the channels to one number\n- n0 = segmentations[0].max() + 1\n- n1 = segmentations[1].max() + 1\n- segmentation = (segmentations[0] + segmentations[1] * n0\n- + segmentations[2] * n0 * n1)\n- # make segment labels consecutive numbers starting at 0\n- labels = np.unique(segmentation, return_inverse=True)[1]\n- return labels.reshape(image.shape[:2])\n+ image = np.atleast_3d(image)\n+ return _felzenszwalb_cython(image, scale=scale, sigma=sigma,\n+ min_size=min_size)\n", "issue": "`min_size` is not strictly conformed in the implementation of felzenszwalb\n## Description\n\nWith `min_size` specified, there're still some segments with sizes that less than it. I don't know if it is an inherent flaw of the algorithm.\n## Way to reproduce\n\n```\n>>> I = skimage.io.imread('dragonbaby.jpg')\n>>> fz = felzenszwalb(I, scale=300, sigma=0.8, min_size=80)\n>>> (fz==9).sum()\n1\n```\n\n\n\n", "before_files": [{"content": "import numpy as np\n\nfrom .._shared.utils import warn\nfrom ._felzenszwalb_cy import _felzenszwalb_grey\n\n\ndef felzenszwalb(image, scale=1, sigma=0.8, min_size=20):\n \"\"\"Computes Felsenszwalb's efficient graph based image segmentation.\n\n Produces an oversegmentation of a multichannel (i.e. RGB) image\n using a fast, minimum spanning tree based clustering on the image grid.\n The parameter ``scale`` sets an observation level. Higher scale means\n less and larger segments. ``sigma`` is the diameter of a Gaussian kernel,\n used for smoothing the image prior to segmentation.\n\n The number of produced segments as well as their size can only be\n controlled indirectly through ``scale``. Segment size within an image can\n vary greatly depending on local contrast.\n\n For RGB images, the algorithm computes a separate segmentation for each\n channel and then combines these. The combined segmentation is the\n intersection of the separate segmentations on the color channels.\n\n Parameters\n ----------\n image : (width, height, 3) or (width, height) ndarray\n Input image.\n scale : float\n Free parameter. Higher means larger clusters.\n sigma : float\n Width of Gaussian kernel used in preprocessing.\n min_size : int\n Minimum component size. Enforced using postprocessing.\n\n Returns\n -------\n segment_mask : (width, height) ndarray\n Integer mask indicating segment labels.\n\n References\n ----------\n .. [1] Efficient graph-based image segmentation, Felzenszwalb, P.F. and\n Huttenlocher, D.P. International Journal of Computer Vision, 2004\n\n Examples\n --------\n >>> from skimage.segmentation import felzenszwalb\n >>> from skimage.data import coffee\n >>> img = coffee()\n >>> segments = felzenszwalb(img, scale=3.0, sigma=0.95, min_size=5)\n \"\"\"\n\n if image.ndim == 2:\n # assume single channel image\n return _felzenszwalb_grey(image, scale=scale, sigma=sigma,\n min_size=min_size)\n\n elif image.ndim != 3:\n raise ValueError(\"Felzenswalb segmentation can only operate on RGB and\"\n \" grey images, but input array of ndim %d given.\"\n % image.ndim)\n\n # assume we got 2d image with multiple channels\n n_channels = image.shape[2]\n if n_channels != 3:\n warn(\"Got image with %d channels. Is that really what you\"\n \" wanted?\" % image.shape[2])\n segmentations = []\n # compute quickshift for each channel\n for c in range(n_channels):\n channel = np.ascontiguousarray(image[:, :, c])\n s = _felzenszwalb_grey(channel, scale=scale, sigma=sigma,\n min_size=min_size)\n segmentations.append(s)\n\n # put pixels in same segment only if in the same segment in all images\n # we do this by combining the channels to one number\n n0 = segmentations[0].max() + 1\n n1 = segmentations[1].max() + 1\n segmentation = (segmentations[0] + segmentations[1] * n0\n + segmentations[2] * n0 * n1)\n # make segment labels consecutive numbers starting at 0\n labels = np.unique(segmentation, return_inverse=True)[1]\n return labels.reshape(image.shape[:2])\n", "path": "skimage/segmentation/_felzenszwalb.py"}], "after_files": [{"content": "import numpy as np\n\nfrom .._shared.utils import warn\nfrom ._felzenszwalb_cy import _felzenszwalb_cython\n\n\ndef felzenszwalb(image, scale=1, sigma=0.8, min_size=20):\n \"\"\"Computes Felsenszwalb's efficient graph based image segmentation.\n\n Produces an oversegmentation of a multichannel (i.e. RGB) image\n using a fast, minimum spanning tree based clustering on the image grid.\n The parameter ``scale`` sets an observation level. Higher scale means\n less and larger segments. ``sigma`` is the diameter of a Gaussian kernel,\n used for smoothing the image prior to segmentation.\n\n The number of produced segments as well as their size can only be\n controlled indirectly through ``scale``. Segment size within an image can\n vary greatly depending on local contrast.\n\n For RGB images, the algorithm uses the euclidean distance between pixels in\n color space.\n\n Parameters\n ----------\n image : (width, height, 3) or (width, height) ndarray\n Input image.\n scale : float\n Free parameter. Higher means larger clusters.\n sigma : float\n Width of Gaussian kernel used in preprocessing.\n min_size : int\n Minimum component size. Enforced using postprocessing.\n\n Returns\n -------\n segment_mask : (width, height) ndarray\n Integer mask indicating segment labels.\n\n References\n ----------\n .. [1] Efficient graph-based image segmentation, Felzenszwalb, P.F. and\n Huttenlocher, D.P. International Journal of Computer Vision, 2004\n\n Examples\n --------\n >>> from skimage.segmentation import felzenszwalb\n >>> from skimage.data import coffee\n >>> img = coffee()\n >>> segments = felzenszwalb(img, scale=3.0, sigma=0.95, min_size=5)\n \"\"\"\n\n image = np.atleast_3d(image)\n return _felzenszwalb_cython(image, scale=scale, sigma=sigma,\n min_size=min_size)\n", "path": "skimage/segmentation/_felzenszwalb.py"}]} | 1,388 | 711 |
gh_patches_debug_248 | rasdani/github-patches | git_diff | statsmodels__statsmodels-3976 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
The compat modules should use absolute imports
The [statsmodels.compat.collections](https://github.com/statsmodels/statsmodels/blob/a88830efc3a99cfbe0ebc9fbfd77820fe748fc59/statsmodels/compat/collections.py#L7) imports the namesake standard library module without requesting absolute imports. While it seems to work in many cases, it causes a problem to packages that override `__import__`. See enlnt/pyq#18.
Please consider adding
```python
from __future__ import absolute_import
```
to the compat modules.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `statsmodels/compat/collections.py`
Content:
```
1 '''backported compatibility functions for Python's collections
2
3 '''
4
5 try:
6 #python >= 2.7
7 from collections import OrderedDict
8 except ImportError:
9 #http://code.activestate.com/recipes/576693/
10 #author: Raymond Hettinger
11 from .ordereddict import OrderedDict
12
13 try:
14 #python >= 2.7
15 from collections import Counter
16 except ImportError:
17 #http://code.activestate.com/recipes/576611/
18 #author: Raymond Hettinger
19 from .counter import Counter
20
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/statsmodels/compat/collections.py b/statsmodels/compat/collections.py
--- a/statsmodels/compat/collections.py
+++ b/statsmodels/compat/collections.py
@@ -1,6 +1,7 @@
'''backported compatibility functions for Python's collections
'''
+from __future__ import absolute_import
try:
#python >= 2.7
| {"golden_diff": "diff --git a/statsmodels/compat/collections.py b/statsmodels/compat/collections.py\n--- a/statsmodels/compat/collections.py\n+++ b/statsmodels/compat/collections.py\n@@ -1,6 +1,7 @@\n '''backported compatibility functions for Python's collections\n \n '''\n+from __future__ import absolute_import\n \n try:\n #python >= 2.7\n", "issue": "The compat modules should use absolute imports\nThe [statsmodels.compat.collections](https://github.com/statsmodels/statsmodels/blob/a88830efc3a99cfbe0ebc9fbfd77820fe748fc59/statsmodels/compat/collections.py#L7) imports the namesake standard library module without requesting absolute imports. While it seems to work in many cases, it causes a problem to packages that override `__import__`. See enlnt/pyq#18.\r\n\r\nPlease consider adding\r\n\r\n```python\r\nfrom __future__ import absolute_import\r\n```\r\nto the compat modules.\n", "before_files": [{"content": "'''backported compatibility functions for Python's collections\n\n'''\n\ntry:\n #python >= 2.7\n from collections import OrderedDict\nexcept ImportError:\n #http://code.activestate.com/recipes/576693/\n #author: Raymond Hettinger\n from .ordereddict import OrderedDict\n\ntry:\n #python >= 2.7\n from collections import Counter\nexcept ImportError:\n #http://code.activestate.com/recipes/576611/\n #author: Raymond Hettinger\n from .counter import Counter\n", "path": "statsmodels/compat/collections.py"}], "after_files": [{"content": "'''backported compatibility functions for Python's collections\n\n'''\nfrom __future__ import absolute_import\n\ntry:\n #python >= 2.7\n from collections import OrderedDict\nexcept ImportError:\n #http://code.activestate.com/recipes/576693/\n #author: Raymond Hettinger\n from .ordereddict import OrderedDict\n\ntry:\n #python >= 2.7\n from collections import Counter\nexcept ImportError:\n #http://code.activestate.com/recipes/576611/\n #author: Raymond Hettinger\n from .counter import Counter\n", "path": "statsmodels/compat/collections.py"}]} | 550 | 81 |
gh_patches_debug_16439 | rasdani/github-patches | git_diff | streamlit__streamlit-761 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Warn users when they try to run Streamlit on an `ipynb` file.
# Problem
[Some users](https://discuss.streamlit.io/t/app-loading-problem/199/7?u=adrien_treuille) have tried to run Streamlit on a Jupyter notebook (`.ipnyb`) file and are confused when this doesn't work.
# Solution
It would be great if for _any_ non `.py` file, Streamlit behaved as follows:
```bash
$ streamlit run my_msft_word_file.docx
Error: Streamlit requires raw Python (.py) files, not .docx.
For more information, please see https://streamlit.io/docs
$ streamlit run my_notebook.ipnyb
Error: Streamlit requires raw Python (.py) files, not .ipnyb.
For more information, please see https://streamlit.io/docs
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lib/streamlit/cli.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # Copyright 2018-2019 Streamlit Inc.
3 #
4 # Licensed under the Apache License, Version 2.0 (the "License");
5 # you may not use this file except in compliance with the License.
6 # You may obtain a copy of the License at
7 #
8 # http://www.apache.org/licenses/LICENSE-2.0
9 #
10 # Unless required by applicable law or agreed to in writing, software
11 # distributed under the License is distributed on an "AS IS" BASIS,
12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 # See the License for the specific language governing permissions and
14 # limitations under the License.
15
16 """This is a script which is run when the Streamlit package is executed."""
17
18 # Python 2/3 compatibility
19 from __future__ import print_function, division, absolute_import
20
21 # Not importing unicode_literals from __future__ because click doesn't like it.
22 from streamlit.compatibility import setup_2_3_shims
23
24 setup_2_3_shims(globals())
25
26 from streamlit import config as _config
27
28 import os
29 import click
30
31 import streamlit
32 from streamlit.credentials import Credentials, check_credentials
33 from streamlit import version
34 import streamlit.bootstrap as bootstrap
35 from streamlit.case_converters import to_snake_case
36
37 LOG_LEVELS = ["error", "warning", "info", "debug"]
38
39 NEW_VERSION_TEXT = """
40 %(new_version)s
41
42 See what's new at https://discuss.streamlit.io/c/announcements
43
44 Enter the following command to upgrade:
45 %(prompt)s %(command)s
46 """ % {
47 "new_version": click.style(
48 "A new version of Streamlit is available.", fg="blue", bold=True
49 ),
50 "prompt": click.style("$", fg="blue"),
51 "command": click.style("pip install streamlit --upgrade", bold=True),
52 }
53
54
55 def _convert_config_option_to_click_option(config_option):
56 """Composes given config option options as options for click lib."""
57 option = "--{}".format(config_option.key)
58 param = config_option.key.replace(".", "_")
59 description = config_option.description
60 if config_option.deprecated:
61 description += "\n {} - {}".format(
62 config_option.deprecation_text, config_option.deprecation_date
63 )
64 envvar = "STREAMLIT_{}".format(to_snake_case(param).upper())
65
66 return {
67 "param": param,
68 "description": description,
69 "type": config_option.type,
70 "option": option,
71 "envvar": envvar,
72 }
73
74
75 def configurator_options(func):
76 """Decorator that adds config param keys to click dynamically."""
77 for _, value in reversed(_config._config_options.items()):
78 parsed_parameter = _convert_config_option_to_click_option(value)
79 config_option = click.option(
80 parsed_parameter["option"],
81 parsed_parameter["param"],
82 help=parsed_parameter["description"],
83 type=parsed_parameter["type"],
84 show_envvar=True,
85 envvar=parsed_parameter["envvar"],
86 )
87 func = config_option(func)
88 return func
89
90
91 def _apply_config_options_from_cli(kwargs):
92 """The "streamlit run" command supports passing Streamlit's config options
93 as flags.
94
95 This function reads through all config flags, massage them, and
96 pass them to _set_config() overriding default values and values set via
97 config.toml file
98
99 """
100 for config_option in kwargs:
101 if kwargs[config_option] is not None:
102 config_option_def_key = config_option.replace("_", ".")
103
104 _config._set_option(
105 config_option_def_key,
106 kwargs[config_option],
107 "command-line argument or environment variable",
108 )
109
110
111 # Fetch remote file at url_path to script_path
112 def _download_remote(script_path, url_path):
113 import requests
114
115 with open(script_path, "wb") as fp:
116 try:
117 resp = requests.get(url_path)
118 resp.raise_for_status()
119 fp.write(resp.content)
120 except requests.exceptions.RequestException as e:
121 raise click.BadParameter(("Unable to fetch {}.\n{}".format(url_path, e)))
122
123
124 @click.group(context_settings={"auto_envvar_prefix": "STREAMLIT"})
125 @click.option("--log_level", show_default=True, type=click.Choice(LOG_LEVELS))
126 @click.version_option(prog_name="Streamlit")
127 @click.pass_context
128 def main(ctx, log_level="info"):
129 """Try out a demo with:
130
131 $ streamlit hello
132
133 Or use the line below to run your own script:
134
135 $ streamlit run your_script.py
136 """
137
138 if log_level:
139 import streamlit.logger
140
141 streamlit.logger.set_log_level(log_level.upper())
142
143
144 @main.command("help")
145 @click.pass_context
146 def help(ctx):
147 """Print this help message."""
148 # Pretend user typed 'streamlit --help' instead of 'streamlit help'.
149 import sys
150
151 assert len(sys.argv) == 2 # This is always true, but let's assert anyway.
152 sys.argv[1] = "--help"
153 main()
154
155
156 @main.command("version")
157 @click.pass_context
158 def main_version(ctx):
159 """Print Streamlit's version number."""
160 # Pretend user typed 'streamlit --version' instead of 'streamlit version'
161 import sys
162
163 assert len(sys.argv) == 2 # This is always true, but let's assert anyway.
164 sys.argv[1] = "--version"
165 main()
166
167
168 @main.command("docs")
169 def main_docs():
170 """Show help in browser."""
171 print("Showing help page in browser...")
172 from streamlit import util
173
174 util.open_browser("https://streamlit.io/docs")
175
176
177 @main.command("hello")
178 @configurator_options
179 def main_hello(**kwargs):
180 """Runs the Hello World script."""
181 from streamlit.hello import hello
182
183 _apply_config_options_from_cli(kwargs)
184
185 filename = hello.__file__
186
187 # For Python 2 when Streamlit is actually installed (make install rather
188 # than make develop).
189 if filename.endswith(".pyc"):
190 filename = "%s.py" % filename[:-4]
191
192 _main_run(filename)
193
194
195 @main.command("run")
196 @configurator_options
197 @click.argument("target", required=True, envvar="STREAMLIT_RUN_TARGET")
198 @click.argument("args", nargs=-1)
199 def main_run(target, args=None, **kwargs):
200 """Run a Python script, piping stderr to Streamlit.
201
202 The script can be local or it can be an url. In the latter case, Streamlit
203 will download the script to a temporary file and runs this file.
204
205 """
206 from validators import url
207
208 _apply_config_options_from_cli(kwargs)
209
210 if url(target):
211 from streamlit.temporary_directory import TemporaryDirectory
212
213 with TemporaryDirectory() as temp_dir:
214 from urllib.parse import urlparse
215 from streamlit import url_util
216
217 path = urlparse(target).path
218 script_path = os.path.join(temp_dir, path.strip("/").rsplit("/", 1)[-1])
219 # if this is a GitHub/Gist blob url, convert to a raw URL first.
220 target = url_util.process_gitblob_url(target)
221 _download_remote(script_path, target)
222 _main_run(script_path, args)
223 else:
224 if not os.path.exists(target):
225 raise click.BadParameter("File does not exist: {}".format(target))
226 _main_run(target, args)
227
228
229 # Utility function to compute the command line as a string
230 def _get_command_line_as_string():
231 import subprocess
232
233 cmd_line_as_list = [click.get_current_context().parent.command_path]
234 cmd_line_as_list.extend(click.get_os_args())
235 return subprocess.list2cmdline(cmd_line_as_list)
236
237
238 def _main_run(file, args=[]):
239 command_line = _get_command_line_as_string()
240
241 # Set a global flag indicating that we're "within" streamlit.
242 streamlit._is_running_with_streamlit = True
243
244 # Check credentials.
245 check_credentials()
246
247 # Notify if streamlit is out of date.
248 if version.should_show_new_version_notice():
249 click.echo(NEW_VERSION_TEXT)
250
251 bootstrap.run(file, command_line, args)
252
253
254 # DEPRECATED
255
256 # TODO: Remove after 2019-09-01
257 @main.command("clear_cache", deprecated=True, hidden=True)
258 @click.pass_context
259 def main_clear_cache(ctx):
260 """Deprecated."""
261 click.echo(click.style('Use "cache clear" instead.', fg="red"))
262 ctx.invoke(cache_clear)
263
264
265 # TODO: Remove after 2019-09-01
266 @main.command("show_config", deprecated=True, hidden=True)
267 @click.pass_context
268 def main_show_config(ctx):
269 """Deprecated."""
270 click.echo(click.style('Use "config show" instead.', fg="red"))
271 ctx.invoke(config_show)
272
273
274 # SUBCOMMAND: cache
275
276
277 @main.group("cache")
278 def cache():
279 """Manage the Streamlit cache."""
280 pass
281
282
283 @cache.command("clear")
284 def cache_clear():
285 """Clear the Streamlit on-disk cache."""
286 import streamlit.caching
287
288 result = streamlit.caching.clear_cache()
289 cache_path = streamlit.caching.get_cache_path()
290 if result:
291 print("Cleared directory %s." % cache_path)
292 else:
293 print("Nothing to clear at %s." % cache_path)
294
295
296 # SUBCOMMAND: config
297
298
299 @main.group("config")
300 def config():
301 """Manage Streamlit's config settings."""
302 pass
303
304
305 @config.command("show")
306 @configurator_options
307 def config_show(**kwargs):
308 """Show all of Streamlit's config settings."""
309
310 _apply_config_options_from_cli(kwargs)
311
312 _config.show_config()
313
314
315 # SUBCOMMAND: activate
316
317
318 @main.group("activate", invoke_without_command=True)
319 @click.pass_context
320 def activate(ctx):
321 """Activate Streamlit by entering your email."""
322 if not ctx.invoked_subcommand:
323 Credentials.get_current().activate()
324
325
326 @activate.command("reset")
327 def activate_reset():
328 """Reset Activation Credentials."""
329 Credentials.get_current().reset()
330
331
332 if __name__ == "__main__":
333 main()
334
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lib/streamlit/cli.py b/lib/streamlit/cli.py
--- a/lib/streamlit/cli.py
+++ b/lib/streamlit/cli.py
@@ -34,7 +34,9 @@
import streamlit.bootstrap as bootstrap
from streamlit.case_converters import to_snake_case
-LOG_LEVELS = ["error", "warning", "info", "debug"]
+ACCEPTED_FILE_EXTENSIONS = ("py", "py3")
+
+LOG_LEVELS = ("error", "warning", "info", "debug")
NEW_VERSION_TEXT = """
%(new_version)s
@@ -207,6 +209,10 @@
_apply_config_options_from_cli(kwargs)
+ _, extension = os.path.splitext(target)
+ if extension[1:] not in ACCEPTED_FILE_EXTENSIONS:
+ raise click.BadArgumentUsage("Streamlit requires raw Python (.py) files, not %s.\nFor more information, please see https://streamlit.io/docs" % extension)
+
if url(target):
from streamlit.temporary_directory import TemporaryDirectory
| {"golden_diff": "diff --git a/lib/streamlit/cli.py b/lib/streamlit/cli.py\n--- a/lib/streamlit/cli.py\n+++ b/lib/streamlit/cli.py\n@@ -34,7 +34,9 @@\n import streamlit.bootstrap as bootstrap\n from streamlit.case_converters import to_snake_case\n \n-LOG_LEVELS = [\"error\", \"warning\", \"info\", \"debug\"]\n+ACCEPTED_FILE_EXTENSIONS = (\"py\", \"py3\")\n+\n+LOG_LEVELS = (\"error\", \"warning\", \"info\", \"debug\")\n \n NEW_VERSION_TEXT = \"\"\"\n %(new_version)s\n@@ -207,6 +209,10 @@\n \n _apply_config_options_from_cli(kwargs)\n \n+ _, extension = os.path.splitext(target)\n+ if extension[1:] not in ACCEPTED_FILE_EXTENSIONS:\n+ raise click.BadArgumentUsage(\"Streamlit requires raw Python (.py) files, not %s.\\nFor more information, please see https://streamlit.io/docs\" % extension)\n+\n if url(target):\n from streamlit.temporary_directory import TemporaryDirectory\n", "issue": "Warn users when they try to run Streamlit on an `ipynb` file.\n# Problem\r\n\r\n[Some users](https://discuss.streamlit.io/t/app-loading-problem/199/7?u=adrien_treuille) have tried to run Streamlit on a Jupyter notebook (`.ipnyb`) file and are confused when this doesn't work.\r\n\r\n# Solution\r\n\r\nIt would be great if for _any_ non `.py` file, Streamlit behaved as follows:\r\n\r\n```bash\r\n$ streamlit run my_msft_word_file.docx\r\n\r\nError: Streamlit requires raw Python (.py) files, not .docx.\r\nFor more information, please see https://streamlit.io/docs\r\n\r\n$ streamlit run my_notebook.ipnyb\r\n\r\nError: Streamlit requires raw Python (.py) files, not .ipnyb.\r\nFor more information, please see https://streamlit.io/docs\r\n```\r\n\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright 2018-2019 Streamlit Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"This is a script which is run when the Streamlit package is executed.\"\"\"\n\n# Python 2/3 compatibility\nfrom __future__ import print_function, division, absolute_import\n\n# Not importing unicode_literals from __future__ because click doesn't like it.\nfrom streamlit.compatibility import setup_2_3_shims\n\nsetup_2_3_shims(globals())\n\nfrom streamlit import config as _config\n\nimport os\nimport click\n\nimport streamlit\nfrom streamlit.credentials import Credentials, check_credentials\nfrom streamlit import version\nimport streamlit.bootstrap as bootstrap\nfrom streamlit.case_converters import to_snake_case\n\nLOG_LEVELS = [\"error\", \"warning\", \"info\", \"debug\"]\n\nNEW_VERSION_TEXT = \"\"\"\n %(new_version)s\n\n See what's new at https://discuss.streamlit.io/c/announcements\n\n Enter the following command to upgrade:\n %(prompt)s %(command)s\n\"\"\" % {\n \"new_version\": click.style(\n \"A new version of Streamlit is available.\", fg=\"blue\", bold=True\n ),\n \"prompt\": click.style(\"$\", fg=\"blue\"),\n \"command\": click.style(\"pip install streamlit --upgrade\", bold=True),\n}\n\n\ndef _convert_config_option_to_click_option(config_option):\n \"\"\"Composes given config option options as options for click lib.\"\"\"\n option = \"--{}\".format(config_option.key)\n param = config_option.key.replace(\".\", \"_\")\n description = config_option.description\n if config_option.deprecated:\n description += \"\\n {} - {}\".format(\n config_option.deprecation_text, config_option.deprecation_date\n )\n envvar = \"STREAMLIT_{}\".format(to_snake_case(param).upper())\n\n return {\n \"param\": param,\n \"description\": description,\n \"type\": config_option.type,\n \"option\": option,\n \"envvar\": envvar,\n }\n\n\ndef configurator_options(func):\n \"\"\"Decorator that adds config param keys to click dynamically.\"\"\"\n for _, value in reversed(_config._config_options.items()):\n parsed_parameter = _convert_config_option_to_click_option(value)\n config_option = click.option(\n parsed_parameter[\"option\"],\n parsed_parameter[\"param\"],\n help=parsed_parameter[\"description\"],\n type=parsed_parameter[\"type\"],\n show_envvar=True,\n envvar=parsed_parameter[\"envvar\"],\n )\n func = config_option(func)\n return func\n\n\ndef _apply_config_options_from_cli(kwargs):\n \"\"\"The \"streamlit run\" command supports passing Streamlit's config options\n as flags.\n\n This function reads through all config flags, massage them, and\n pass them to _set_config() overriding default values and values set via\n config.toml file\n\n \"\"\"\n for config_option in kwargs:\n if kwargs[config_option] is not None:\n config_option_def_key = config_option.replace(\"_\", \".\")\n\n _config._set_option(\n config_option_def_key,\n kwargs[config_option],\n \"command-line argument or environment variable\",\n )\n\n\n# Fetch remote file at url_path to script_path\ndef _download_remote(script_path, url_path):\n import requests\n\n with open(script_path, \"wb\") as fp:\n try:\n resp = requests.get(url_path)\n resp.raise_for_status()\n fp.write(resp.content)\n except requests.exceptions.RequestException as e:\n raise click.BadParameter((\"Unable to fetch {}.\\n{}\".format(url_path, e)))\n\n\[email protected](context_settings={\"auto_envvar_prefix\": \"STREAMLIT\"})\[email protected](\"--log_level\", show_default=True, type=click.Choice(LOG_LEVELS))\[email protected]_option(prog_name=\"Streamlit\")\[email protected]_context\ndef main(ctx, log_level=\"info\"):\n \"\"\"Try out a demo with:\n\n $ streamlit hello\n\n Or use the line below to run your own script:\n\n $ streamlit run your_script.py\n \"\"\"\n\n if log_level:\n import streamlit.logger\n\n streamlit.logger.set_log_level(log_level.upper())\n\n\[email protected](\"help\")\[email protected]_context\ndef help(ctx):\n \"\"\"Print this help message.\"\"\"\n # Pretend user typed 'streamlit --help' instead of 'streamlit help'.\n import sys\n\n assert len(sys.argv) == 2 # This is always true, but let's assert anyway.\n sys.argv[1] = \"--help\"\n main()\n\n\[email protected](\"version\")\[email protected]_context\ndef main_version(ctx):\n \"\"\"Print Streamlit's version number.\"\"\"\n # Pretend user typed 'streamlit --version' instead of 'streamlit version'\n import sys\n\n assert len(sys.argv) == 2 # This is always true, but let's assert anyway.\n sys.argv[1] = \"--version\"\n main()\n\n\[email protected](\"docs\")\ndef main_docs():\n \"\"\"Show help in browser.\"\"\"\n print(\"Showing help page in browser...\")\n from streamlit import util\n\n util.open_browser(\"https://streamlit.io/docs\")\n\n\[email protected](\"hello\")\n@configurator_options\ndef main_hello(**kwargs):\n \"\"\"Runs the Hello World script.\"\"\"\n from streamlit.hello import hello\n\n _apply_config_options_from_cli(kwargs)\n\n filename = hello.__file__\n\n # For Python 2 when Streamlit is actually installed (make install rather\n # than make develop).\n if filename.endswith(\".pyc\"):\n filename = \"%s.py\" % filename[:-4]\n\n _main_run(filename)\n\n\[email protected](\"run\")\n@configurator_options\[email protected](\"target\", required=True, envvar=\"STREAMLIT_RUN_TARGET\")\[email protected](\"args\", nargs=-1)\ndef main_run(target, args=None, **kwargs):\n \"\"\"Run a Python script, piping stderr to Streamlit.\n\n The script can be local or it can be an url. In the latter case, Streamlit\n will download the script to a temporary file and runs this file.\n\n \"\"\"\n from validators import url\n\n _apply_config_options_from_cli(kwargs)\n\n if url(target):\n from streamlit.temporary_directory import TemporaryDirectory\n\n with TemporaryDirectory() as temp_dir:\n from urllib.parse import urlparse\n from streamlit import url_util\n\n path = urlparse(target).path\n script_path = os.path.join(temp_dir, path.strip(\"/\").rsplit(\"/\", 1)[-1])\n # if this is a GitHub/Gist blob url, convert to a raw URL first.\n target = url_util.process_gitblob_url(target)\n _download_remote(script_path, target)\n _main_run(script_path, args)\n else:\n if not os.path.exists(target):\n raise click.BadParameter(\"File does not exist: {}\".format(target))\n _main_run(target, args)\n\n\n# Utility function to compute the command line as a string\ndef _get_command_line_as_string():\n import subprocess\n\n cmd_line_as_list = [click.get_current_context().parent.command_path]\n cmd_line_as_list.extend(click.get_os_args())\n return subprocess.list2cmdline(cmd_line_as_list)\n\n\ndef _main_run(file, args=[]):\n command_line = _get_command_line_as_string()\n\n # Set a global flag indicating that we're \"within\" streamlit.\n streamlit._is_running_with_streamlit = True\n\n # Check credentials.\n check_credentials()\n\n # Notify if streamlit is out of date.\n if version.should_show_new_version_notice():\n click.echo(NEW_VERSION_TEXT)\n\n bootstrap.run(file, command_line, args)\n\n\n# DEPRECATED\n\n# TODO: Remove after 2019-09-01\[email protected](\"clear_cache\", deprecated=True, hidden=True)\[email protected]_context\ndef main_clear_cache(ctx):\n \"\"\"Deprecated.\"\"\"\n click.echo(click.style('Use \"cache clear\" instead.', fg=\"red\"))\n ctx.invoke(cache_clear)\n\n\n# TODO: Remove after 2019-09-01\[email protected](\"show_config\", deprecated=True, hidden=True)\[email protected]_context\ndef main_show_config(ctx):\n \"\"\"Deprecated.\"\"\"\n click.echo(click.style('Use \"config show\" instead.', fg=\"red\"))\n ctx.invoke(config_show)\n\n\n# SUBCOMMAND: cache\n\n\[email protected](\"cache\")\ndef cache():\n \"\"\"Manage the Streamlit cache.\"\"\"\n pass\n\n\[email protected](\"clear\")\ndef cache_clear():\n \"\"\"Clear the Streamlit on-disk cache.\"\"\"\n import streamlit.caching\n\n result = streamlit.caching.clear_cache()\n cache_path = streamlit.caching.get_cache_path()\n if result:\n print(\"Cleared directory %s.\" % cache_path)\n else:\n print(\"Nothing to clear at %s.\" % cache_path)\n\n\n# SUBCOMMAND: config\n\n\[email protected](\"config\")\ndef config():\n \"\"\"Manage Streamlit's config settings.\"\"\"\n pass\n\n\[email protected](\"show\")\n@configurator_options\ndef config_show(**kwargs):\n \"\"\"Show all of Streamlit's config settings.\"\"\"\n\n _apply_config_options_from_cli(kwargs)\n\n _config.show_config()\n\n\n# SUBCOMMAND: activate\n\n\[email protected](\"activate\", invoke_without_command=True)\[email protected]_context\ndef activate(ctx):\n \"\"\"Activate Streamlit by entering your email.\"\"\"\n if not ctx.invoked_subcommand:\n Credentials.get_current().activate()\n\n\[email protected](\"reset\")\ndef activate_reset():\n \"\"\"Reset Activation Credentials.\"\"\"\n Credentials.get_current().reset()\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "lib/streamlit/cli.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright 2018-2019 Streamlit Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"This is a script which is run when the Streamlit package is executed.\"\"\"\n\n# Python 2/3 compatibility\nfrom __future__ import print_function, division, absolute_import\n\n# Not importing unicode_literals from __future__ because click doesn't like it.\nfrom streamlit.compatibility import setup_2_3_shims\n\nsetup_2_3_shims(globals())\n\nfrom streamlit import config as _config\n\nimport os\nimport click\n\nimport streamlit\nfrom streamlit.credentials import Credentials, check_credentials\nfrom streamlit import version\nimport streamlit.bootstrap as bootstrap\nfrom streamlit.case_converters import to_snake_case\n\nACCEPTED_FILE_EXTENSIONS = (\"py\", \"py3\")\n\nLOG_LEVELS = (\"error\", \"warning\", \"info\", \"debug\")\n\nNEW_VERSION_TEXT = \"\"\"\n %(new_version)s\n\n See what's new at https://discuss.streamlit.io/c/announcements\n\n Enter the following command to upgrade:\n %(prompt)s %(command)s\n\"\"\" % {\n \"new_version\": click.style(\n \"A new version of Streamlit is available.\", fg=\"blue\", bold=True\n ),\n \"prompt\": click.style(\"$\", fg=\"blue\"),\n \"command\": click.style(\"pip install streamlit --upgrade\", bold=True),\n}\n\n\ndef _convert_config_option_to_click_option(config_option):\n \"\"\"Composes given config option options as options for click lib.\"\"\"\n option = \"--{}\".format(config_option.key)\n param = config_option.key.replace(\".\", \"_\")\n description = config_option.description\n if config_option.deprecated:\n description += \"\\n {} - {}\".format(\n config_option.deprecation_text, config_option.deprecation_date\n )\n envvar = \"STREAMLIT_{}\".format(to_snake_case(param).upper())\n\n return {\n \"param\": param,\n \"description\": description,\n \"type\": config_option.type,\n \"option\": option,\n \"envvar\": envvar,\n }\n\n\ndef configurator_options(func):\n \"\"\"Decorator that adds config param keys to click dynamically.\"\"\"\n for _, value in reversed(_config._config_options.items()):\n parsed_parameter = _convert_config_option_to_click_option(value)\n config_option = click.option(\n parsed_parameter[\"option\"],\n parsed_parameter[\"param\"],\n help=parsed_parameter[\"description\"],\n type=parsed_parameter[\"type\"],\n show_envvar=True,\n envvar=parsed_parameter[\"envvar\"],\n )\n func = config_option(func)\n return func\n\n\ndef _apply_config_options_from_cli(kwargs):\n \"\"\"The \"streamlit run\" command supports passing Streamlit's config options\n as flags.\n\n This function reads through all config flags, massage them, and\n pass them to _set_config() overriding default values and values set via\n config.toml file\n\n \"\"\"\n for config_option in kwargs:\n if kwargs[config_option] is not None:\n config_option_def_key = config_option.replace(\"_\", \".\")\n\n _config._set_option(\n config_option_def_key,\n kwargs[config_option],\n \"command-line argument or environment variable\",\n )\n\n\n# Fetch remote file at url_path to script_path\ndef _download_remote(script_path, url_path):\n import requests\n\n with open(script_path, \"wb\") as fp:\n try:\n resp = requests.get(url_path)\n resp.raise_for_status()\n fp.write(resp.content)\n except requests.exceptions.RequestException as e:\n raise click.BadParameter((\"Unable to fetch {}.\\n{}\".format(url_path, e)))\n\n\[email protected](context_settings={\"auto_envvar_prefix\": \"STREAMLIT\"})\[email protected](\"--log_level\", show_default=True, type=click.Choice(LOG_LEVELS))\[email protected]_option(prog_name=\"Streamlit\")\[email protected]_context\ndef main(ctx, log_level=\"info\"):\n \"\"\"Try out a demo with:\n\n $ streamlit hello\n\n Or use the line below to run your own script:\n\n $ streamlit run your_script.py\n \"\"\"\n\n if log_level:\n import streamlit.logger\n\n streamlit.logger.set_log_level(log_level.upper())\n\n\[email protected](\"help\")\[email protected]_context\ndef help(ctx):\n \"\"\"Print this help message.\"\"\"\n # Pretend user typed 'streamlit --help' instead of 'streamlit help'.\n import sys\n\n assert len(sys.argv) == 2 # This is always true, but let's assert anyway.\n sys.argv[1] = \"--help\"\n main()\n\n\[email protected](\"version\")\[email protected]_context\ndef main_version(ctx):\n \"\"\"Print Streamlit's version number.\"\"\"\n # Pretend user typed 'streamlit --version' instead of 'streamlit version'\n import sys\n\n assert len(sys.argv) == 2 # This is always true, but let's assert anyway.\n sys.argv[1] = \"--version\"\n main()\n\n\[email protected](\"docs\")\ndef main_docs():\n \"\"\"Show help in browser.\"\"\"\n print(\"Showing help page in browser...\")\n from streamlit import util\n\n util.open_browser(\"https://streamlit.io/docs\")\n\n\[email protected](\"hello\")\n@configurator_options\ndef main_hello(**kwargs):\n \"\"\"Runs the Hello World script.\"\"\"\n from streamlit.hello import hello\n\n _apply_config_options_from_cli(kwargs)\n\n filename = hello.__file__\n\n # For Python 2 when Streamlit is actually installed (make install rather\n # than make develop).\n if filename.endswith(\".pyc\"):\n filename = \"%s.py\" % filename[:-4]\n\n _main_run(filename)\n\n\[email protected](\"run\")\n@configurator_options\[email protected](\"target\", required=True, envvar=\"STREAMLIT_RUN_TARGET\")\[email protected](\"args\", nargs=-1)\ndef main_run(target, args=None, **kwargs):\n \"\"\"Run a Python script, piping stderr to Streamlit.\n\n The script can be local or it can be an url. In the latter case, Streamlit\n will download the script to a temporary file and runs this file.\n\n \"\"\"\n from validators import url\n\n _apply_config_options_from_cli(kwargs)\n\n _, extension = os.path.splitext(target)\n if extension[1:] not in ACCEPTED_FILE_EXTENSIONS:\n raise click.BadArgumentUsage(\"Streamlit requires raw Python (.py) files, not %s.\\nFor more information, please see https://streamlit.io/docs\" % extension)\n\n if url(target):\n from streamlit.temporary_directory import TemporaryDirectory\n\n with TemporaryDirectory() as temp_dir:\n from urllib.parse import urlparse\n from streamlit import url_util\n\n path = urlparse(target).path\n script_path = os.path.join(temp_dir, path.strip(\"/\").rsplit(\"/\", 1)[-1])\n # if this is a GitHub/Gist blob url, convert to a raw URL first.\n target = url_util.process_gitblob_url(target)\n _download_remote(script_path, target)\n _main_run(script_path, args)\n else:\n if not os.path.exists(target):\n raise click.BadParameter(\"File does not exist: {}\".format(target))\n _main_run(target, args)\n\n\n# Utility function to compute the command line as a string\ndef _get_command_line_as_string():\n import subprocess\n\n cmd_line_as_list = [click.get_current_context().parent.command_path]\n cmd_line_as_list.extend(click.get_os_args())\n return subprocess.list2cmdline(cmd_line_as_list)\n\n\ndef _main_run(file, args=[]):\n command_line = _get_command_line_as_string()\n\n # Set a global flag indicating that we're \"within\" streamlit.\n streamlit._is_running_with_streamlit = True\n\n # Check credentials.\n check_credentials()\n\n # Notify if streamlit is out of date.\n if version.should_show_new_version_notice():\n click.echo(NEW_VERSION_TEXT)\n\n bootstrap.run(file, command_line, args)\n\n\n# DEPRECATED\n\n# TODO: Remove after 2019-09-01\[email protected](\"clear_cache\", deprecated=True, hidden=True)\[email protected]_context\ndef main_clear_cache(ctx):\n \"\"\"Deprecated.\"\"\"\n click.echo(click.style('Use \"cache clear\" instead.', fg=\"red\"))\n ctx.invoke(cache_clear)\n\n\n# TODO: Remove after 2019-09-01\[email protected](\"show_config\", deprecated=True, hidden=True)\[email protected]_context\ndef main_show_config(ctx):\n \"\"\"Deprecated.\"\"\"\n click.echo(click.style('Use \"config show\" instead.', fg=\"red\"))\n ctx.invoke(config_show)\n\n\n# SUBCOMMAND: cache\n\n\[email protected](\"cache\")\ndef cache():\n \"\"\"Manage the Streamlit cache.\"\"\"\n pass\n\n\[email protected](\"clear\")\ndef cache_clear():\n \"\"\"Clear the Streamlit on-disk cache.\"\"\"\n import streamlit.caching\n\n result = streamlit.caching.clear_cache()\n cache_path = streamlit.caching.get_cache_path()\n if result:\n print(\"Cleared directory %s.\" % cache_path)\n else:\n print(\"Nothing to clear at %s.\" % cache_path)\n\n\n# SUBCOMMAND: config\n\n\[email protected](\"config\")\ndef config():\n \"\"\"Manage Streamlit's config settings.\"\"\"\n pass\n\n\[email protected](\"show\")\n@configurator_options\ndef config_show(**kwargs):\n \"\"\"Show all of Streamlit's config settings.\"\"\"\n\n _apply_config_options_from_cli(kwargs)\n\n _config.show_config()\n\n\n# SUBCOMMAND: activate\n\n\[email protected](\"activate\", invoke_without_command=True)\[email protected]_context\ndef activate(ctx):\n \"\"\"Activate Streamlit by entering your email.\"\"\"\n if not ctx.invoked_subcommand:\n Credentials.get_current().activate()\n\n\[email protected](\"reset\")\ndef activate_reset():\n \"\"\"Reset Activation Credentials.\"\"\"\n Credentials.get_current().reset()\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "lib/streamlit/cli.py"}]} | 3,559 | 229 |
gh_patches_debug_2646 | rasdani/github-patches | git_diff | jupyter__docker-stacks-1964 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[BUG] - Healthcheck fails when using proxy
### What docker image(s) are you using?
base-notebook
### Host OS system and architecture running docker image
Windows 11 as host and linux/amd64 for docker
### What Docker command are you running?
docker compose up with the following dockerfile:
```Dockerfile
version: '3.4'
services:
datamining:
container_name: xxxx
image: xxxx
build:
context: .
dockerfile: ./Dockerfile
ports:
- "8888:8888"
volumes:
- xxxx:/home/jovyan/work
environment:
- DOCKER_STACKS_JUPYTER_CMD=lab
restart: on-failure
```
### How to Reproduce the problem?
Precondition is that the machine has to operate in a corporate environment using the companies proxy.
Start the container as above.
Check the state of the container with ```docker container ls```
The container is marked as unhealthy.
### Command output
```bash session
abcdefghijk "tini -g -- start-no…" x hours ago Up x hours (unhealthy) 0.0.0.0:8888->8888/tcp xxxx
```
### Expected behavior
```abcdedfghi abcdefghijk "tini -g -- start-no…" x hours ago Up x hours (healthy) 0.0.0.0:8888->8888/tcp xxxx```
### Actual behavior
After investigating the issue the problem is that docker_healthcheck.py does not run successfully giving the following error message:
```
Traceback (most recent call last):
File "/opt/conda/lib/python3.11/site-packages/urllib3/connectionpool.py", line 790, in urlopen
response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/urllib3/connectionpool.py", line 536, in _make_request
response = conn.getresponse()
^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/urllib3/connection.py", line 461, in getresponse
httplib_response = super().getresponse()
^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/http/client.py", line 1378, in getresponse
response.begin()
File "/opt/conda/lib/python3.11/http/client.py", line 318, in begin
version, status, reason = self._read_status()
^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/http/client.py", line 287, in _read_status
raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response
The above exception was the direct cause of the following exception:
urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/opt/conda/lib/python3.11/site-packages/requests/adapters.py", line 486, in send
resp = conn.urlopen(
^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/urllib3/connectionpool.py", line 844, in urlopen
retries = retries.increment(
^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/urllib3/util/retry.py", line 515, in increment
raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='host.docker.internal', port=9000): Max retries exceeded with url: http://7702f0e1c7d4:8888/api (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/etc/jupyter/docker_healthcheck.py", line 19, in <module>
r = requests.get(url, verify=False) # request without SSL verification
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/requests/api.py", line 73, in get
return request("get", url, params=params, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/requests/api.py", line 59, in request
return session.request(method=method, url=url, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/requests/sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/requests/sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/requests/adapters.py", line 513, in send
raise ProxyError(e, request=request)
requests.exceptions.ProxyError: HTTPConnectionPool(host='host.docker.internal', port=9000): Max retries exceeded with url: http://7702f0e1c7d4:8888/api (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
```
### Anything else?
After investigating the issue further I came to the conclusion that using the proxy will be the problem. So I applied the following fix to ```docker_healthcheck.py```:
```python
proxies = {
"http": None,
"https": None,
}
r = requests.get(url, proxies=proxies, verify=False) # request without SSL verification
```
Now the healthcheck works!
### Latest Docker version
- [X] I've updated my Docker version to the latest available, and the issue still persists
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `base-notebook/docker_healthcheck.py`
Content:
```
1 #!/usr/bin/env python3
2 # Copyright (c) Jupyter Development Team.
3 # Distributed under the terms of the Modified BSD License.
4 import json
5 import os
6 from pathlib import Path
7
8 import requests
9
10 # A number of operations below deliberately don't check for possible errors
11 # As this is a healthcheck, it should succeed or raise an exception on error
12
13 runtime_dir = Path("/home/") / os.environ["NB_USER"] / ".local/share/jupyter/runtime/"
14 json_file = next(runtime_dir.glob("*server-*.json"))
15
16 url = json.loads(json_file.read_bytes())["url"]
17 url = url + "api"
18
19 r = requests.get(url, verify=False) # request without SSL verification
20 r.raise_for_status()
21 print(r.content)
22
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/base-notebook/docker_healthcheck.py b/base-notebook/docker_healthcheck.py
--- a/base-notebook/docker_healthcheck.py
+++ b/base-notebook/docker_healthcheck.py
@@ -16,6 +16,11 @@
url = json.loads(json_file.read_bytes())["url"]
url = url + "api"
-r = requests.get(url, verify=False) # request without SSL verification
+proxies = {
+ "http": "",
+ "https": "",
+}
+
+r = requests.get(url, proxies=proxies, verify=False) # request without SSL verification
r.raise_for_status()
print(r.content)
| {"golden_diff": "diff --git a/base-notebook/docker_healthcheck.py b/base-notebook/docker_healthcheck.py\n--- a/base-notebook/docker_healthcheck.py\n+++ b/base-notebook/docker_healthcheck.py\n@@ -16,6 +16,11 @@\n url = json.loads(json_file.read_bytes())[\"url\"]\n url = url + \"api\"\n \n-r = requests.get(url, verify=False) # request without SSL verification\n+proxies = {\n+ \"http\": \"\",\n+ \"https\": \"\",\n+}\n+\n+r = requests.get(url, proxies=proxies, verify=False) # request without SSL verification\n r.raise_for_status()\n print(r.content)\n", "issue": "[BUG] - Healthcheck fails when using proxy\n### What docker image(s) are you using?\r\n\r\nbase-notebook\r\n\r\n### Host OS system and architecture running docker image\r\n\r\nWindows 11 as host and linux/amd64 for docker\r\n\r\n### What Docker command are you running?\r\n\r\ndocker compose up with the following dockerfile:\r\n\r\n```Dockerfile\r\nversion: '3.4'\r\n\r\nservices:\r\n datamining:\r\n container_name: xxxx\r\n image: xxxx\r\n build:\r\n context: .\r\n dockerfile: ./Dockerfile\r\n ports:\r\n - \"8888:8888\"\r\n volumes:\r\n - xxxx:/home/jovyan/work\r\n environment:\r\n - DOCKER_STACKS_JUPYTER_CMD=lab\r\n restart: on-failure\r\n```\r\n\r\n### How to Reproduce the problem?\r\n\r\nPrecondition is that the machine has to operate in a corporate environment using the companies proxy.\r\nStart the container as above.\r\nCheck the state of the container with ```docker container ls```\r\nThe container is marked as unhealthy.\r\n\r\n### Command output\r\n\r\n```bash session\r\nabcdefghijk \"tini -g -- start-no\u2026\" x hours ago Up x hours (unhealthy) 0.0.0.0:8888->8888/tcp xxxx\r\n```\r\n\r\n\r\n### Expected behavior\r\n\r\n```abcdedfghi abcdefghijk \"tini -g -- start-no\u2026\" x hours ago Up x hours (healthy) 0.0.0.0:8888->8888/tcp xxxx```\r\n\r\n### Actual behavior\r\n\r\nAfter investigating the issue the problem is that docker_healthcheck.py does not run successfully giving the following error message:\r\n```\r\nTraceback (most recent call last):\r\n File \"/opt/conda/lib/python3.11/site-packages/urllib3/connectionpool.py\", line 790, in urlopen\r\n response = self._make_request(\r\n ^^^^^^^^^^^^^^^^^^^\r\n File \"/opt/conda/lib/python3.11/site-packages/urllib3/connectionpool.py\", line 536, in _make_request\r\n response = conn.getresponse()\r\n ^^^^^^^^^^^^^^^^^^\r\n File \"/opt/conda/lib/python3.11/site-packages/urllib3/connection.py\", line 461, in getresponse\r\n httplib_response = super().getresponse()\r\n ^^^^^^^^^^^^^^^^^^^^^\r\n File \"/opt/conda/lib/python3.11/http/client.py\", line 1378, in getresponse\r\n response.begin()\r\n File \"/opt/conda/lib/python3.11/http/client.py\", line 318, in begin\r\n version, status, reason = self._read_status()\r\n ^^^^^^^^^^^^^^^^^^^\r\n File \"/opt/conda/lib/python3.11/http/client.py\", line 287, in _read_status\r\n raise RemoteDisconnected(\"Remote end closed connection without\"\r\nhttp.client.RemoteDisconnected: Remote end closed connection without response\r\n\r\nThe above exception was the direct cause of the following exception:\r\n\r\nurllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))\r\n\r\nThe above exception was the direct cause of the following exception:\r\n\r\nTraceback (most recent call last):\r\n File \"/opt/conda/lib/python3.11/site-packages/requests/adapters.py\", line 486, in send\r\n resp = conn.urlopen(\r\n ^^^^^^^^^^^^^\r\n File \"/opt/conda/lib/python3.11/site-packages/urllib3/connectionpool.py\", line 844, in urlopen\r\n retries = retries.increment(\r\n ^^^^^^^^^^^^^^^^^^\r\n File \"/opt/conda/lib/python3.11/site-packages/urllib3/util/retry.py\", line 515, in increment\r\n raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\nurllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='host.docker.internal', port=9000): Max retries exceeded with url: http://7702f0e1c7d4:8888/api (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"/etc/jupyter/docker_healthcheck.py\", line 19, in <module>\r\n r = requests.get(url, verify=False) # request without SSL verification\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/opt/conda/lib/python3.11/site-packages/requests/api.py\", line 73, in get\r\n return request(\"get\", url, params=params, **kwargs)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/opt/conda/lib/python3.11/site-packages/requests/api.py\", line 59, in request\r\n return session.request(method=method, url=url, **kwargs)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/opt/conda/lib/python3.11/site-packages/requests/sessions.py\", line 589, in request\r\n resp = self.send(prep, **send_kwargs)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/opt/conda/lib/python3.11/site-packages/requests/sessions.py\", line 703, in send\r\n r = adapter.send(request, **kwargs)\r\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n File \"/opt/conda/lib/python3.11/site-packages/requests/adapters.py\", line 513, in send\r\n raise ProxyError(e, request=request)\r\nrequests.exceptions.ProxyError: HTTPConnectionPool(host='host.docker.internal', port=9000): Max retries exceeded with url: http://7702f0e1c7d4:8888/api (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))\r\n```\r\n\r\n### Anything else?\r\n\r\nAfter investigating the issue further I came to the conclusion that using the proxy will be the problem. So I applied the following fix to ```docker_healthcheck.py```:\r\n```python\r\nproxies = {\r\n \"http\": None,\r\n \"https\": None,\r\n}\r\n\r\nr = requests.get(url, proxies=proxies, verify=False) # request without SSL verification\r\n```\r\nNow the healthcheck works!\r\n\r\n### Latest Docker version\r\n\r\n- [X] I've updated my Docker version to the latest available, and the issue still persists\n", "before_files": [{"content": "#!/usr/bin/env python3\n# Copyright (c) Jupyter Development Team.\n# Distributed under the terms of the Modified BSD License.\nimport json\nimport os\nfrom pathlib import Path\n\nimport requests\n\n# A number of operations below deliberately don't check for possible errors\n# As this is a healthcheck, it should succeed or raise an exception on error\n\nruntime_dir = Path(\"/home/\") / os.environ[\"NB_USER\"] / \".local/share/jupyter/runtime/\"\njson_file = next(runtime_dir.glob(\"*server-*.json\"))\n\nurl = json.loads(json_file.read_bytes())[\"url\"]\nurl = url + \"api\"\n\nr = requests.get(url, verify=False) # request without SSL verification\nr.raise_for_status()\nprint(r.content)\n", "path": "base-notebook/docker_healthcheck.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n# Copyright (c) Jupyter Development Team.\n# Distributed under the terms of the Modified BSD License.\nimport json\nimport os\nfrom pathlib import Path\n\nimport requests\n\n# A number of operations below deliberately don't check for possible errors\n# As this is a healthcheck, it should succeed or raise an exception on error\n\nruntime_dir = Path(\"/home/\") / os.environ[\"NB_USER\"] / \".local/share/jupyter/runtime/\"\njson_file = next(runtime_dir.glob(\"*server-*.json\"))\n\nurl = json.loads(json_file.read_bytes())[\"url\"]\nurl = url + \"api\"\n\nproxies = {\n \"http\": \"\",\n \"https\": \"\",\n}\n\nr = requests.get(url, proxies=proxies, verify=False) # request without SSL verification\nr.raise_for_status()\nprint(r.content)\n", "path": "base-notebook/docker_healthcheck.py"}]} | 1,896 | 139 |
gh_patches_debug_26103 | rasdani/github-patches | git_diff | pytorch__PiPPy-528 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Buck run device error
buck run reported the following error:
```
[trainer1]:RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:1 and cuda:0! (when checking argument for argument weight in method wrapper__native_layer_norm)
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pippy/utils.py`
Content:
```
1 # Copyright (c) Meta Platforms, Inc. and affiliates
2 import os
3 import socket
4 import logging
5
6 # Pinning process to a separate GPU if not yet done by launch script
7 # Notes:
8 # 1. Needed to work around the issue of RPC not automatically pinning spawned worker threads to CUDA device of the main
9 # thread
10 # 2. Must be done before `import torch` at which point CUDA context may be created
11 cuda_devices_str = os.getenv('CUDA_VISIBLE_DEVICES')
12 if (cuda_devices_str is None # not set
13 or len(cuda_devices_str.split(',')) > 1): # or set to all devices
14 # If launchers like Torchrun sets `LOCAL_RANK`, we would use this information
15 local_rank_str = os.getenv('LOCAL_RANK')
16 if local_rank_str is not None:
17 os.environ['CUDA_VISIBLE_DEVICES'] = local_rank_str
18 print(f"Pinning local process {local_rank_str} to gpu {os.getenv('CUDA_VISIBLE_DEVICES')}")
19
20 import torch
21 import torch.multiprocessing as mp
22 import torch.distributed.rpc as rpc
23
24
25 VERBOSE = bool(int(os.environ.get('VERBOSE', False)))
26
27 if VERBOSE:
28 logging.getLogger().setLevel(logging.DEBUG)
29
30
31 def has_efa() -> bool:
32 try:
33 import subprocess
34 return subprocess.run(["fi_info", "-p", "efa", "-t", "FI_EP_RDM"],
35 stdout=subprocess.DEVNULL,
36 stderr=subprocess.DEVNULL).returncode == 0
37 except FileNotFoundError:
38 return False
39 except PermissionError:
40 return False
41
42
43 def tp_transports():
44 return ["shm", "uv"] if has_efa() else None
45
46
47 def run_pippy(run_master, args, *extra_args):
48 if not hasattr(args, 'world_size'):
49 assert hasattr(args, 'pp_group_size')
50 args.dp_group_size = args.dp_group_size if hasattr(args, 'dp_group_size') else 1
51 else:
52 if not hasattr(args, 'dp_group_size'):
53 args.pp_group_size = args.pp_group_size if hasattr(args, 'pp_group_size') else args.world_size
54 assert args.world_size % args.pp_group_size == 0
55 args.dp_group_size = args.world_size // args.pp_group_size
56 elif not hasattr(args, 'pp_group_size'):
57 args.dp_group_size = args.dp_group_size if hasattr(args, 'dp_group_size') else 1
58 assert args.world_size % args.dp_group_size == 0
59 args.pp_group_size = args.world_size // args.dp_group_size
60 else:
61 pass
62 # TODO: doesn't work for PiPPyTrainingArguments
63 # assert args.world_size == args.dp_group_size * args.pp_group_size
64
65 actual_world_size = args.dp_group_size * args.pp_group_size
66 print(f'[PiPPy] World size: {actual_world_size}, '
67 f'DP group size: {args.dp_group_size}, '
68 f'PP group size: {args.pp_group_size}')
69
70 if args.rank == -1:
71 mp.spawn(run_worker, args=(run_master, args, *extra_args), nprocs=actual_world_size, join=True)
72 elif args.rank < actual_world_size:
73 run_worker(args.rank, run_master, args, *extra_args)
74 else:
75 print("I'm unused, exiting")
76
77
78 def run_worker(rank, run_master, args, *extra_args):
79 args.rank = rank
80
81 os.environ['MASTER_ADDR'] = args.master_addr
82 os.environ['MASTER_PORT'] = args.master_port
83
84 actual_world_size = args.dp_group_size * args.pp_group_size
85
86 # TODO: Move to training args, blocked by: cannot pickle 'TensorPipeRpcBackendOptions' object
87 # Exclude IB for metadata transport due to lack of EFA support on AWS
88 options = rpc.TensorPipeRpcBackendOptions(num_worker_threads=512,
89 rpc_timeout=1800,
90 _transports=tp_transports())
91 if args.cuda:
92 n_devs = torch.cuda.device_count()
93 if n_devs > 0:
94 dev_id = rank % n_devs
95 for i in range(actual_world_size):
96 options.set_device_map(f"worker{i}", {dev_id: i % n_devs})
97 # Does not seem effective for RPC device pinning. TODO
98 # options.set_devices([f'cuda:{dev_id}'])
99 else:
100 args.cuda = 0
101 print('Warning: no CUDA device found. Running on CPU instead.')
102
103 args.device = f'cuda:{dev_id}' if args.cuda else 'cpu'
104 print(f"rank = {rank} host/pid/device = "
105 f"{socket.gethostname()}/{os.getpid()}/{args.device}")
106
107 # Init DDP process group
108 backend = "nccl" if args.cuda else "gloo"
109 torch.distributed.init_process_group(backend=backend, rank=rank, world_size=actual_world_size)
110
111 rpc.init_rpc(
112 f"worker{rank}",
113 rank=rank,
114 world_size=actual_world_size,
115 rpc_backend_options=options
116 )
117
118 global dp_pg_per_pp_rank
119 dp_ranks_per_pp_rank = torch.arange(actual_world_size).reshape(args.pp_group_size,
120 args.dp_group_size).tolist()
121 dp_pg_per_pp_rank = [torch.distributed.new_group(ranks) for ranks in dp_ranks_per_pp_rank]
122
123 pp_ranks_per_dp_group = [[i * args.dp_group_size + rank for i in range(args.pp_group_size)]
124 for rank in range(args.dp_group_size)]
125
126 args.driver_group = torch.distributed.new_group(list(range(args.dp_group_size)))
127
128 global exclude_master
129 exclude_master = args.exclude_master if hasattr(args, 'exclude_master') else 0
130
131 if rank >= 0 and rank // args.dp_group_size == 0:
132 args.driver_index = rank
133 args.local_driver_index = os.getenv('LOCAL_RANK', rank)
134 run_master(pp_ranks_per_dp_group[rank], args, *extra_args)
135 rpc.shutdown()
136
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pippy/utils.py b/pippy/utils.py
--- a/pippy/utils.py
+++ b/pippy/utils.py
@@ -8,14 +8,17 @@
# 1. Needed to work around the issue of RPC not automatically pinning spawned worker threads to CUDA device of the main
# thread
# 2. Must be done before `import torch` at which point CUDA context may be created
-cuda_devices_str = os.getenv('CUDA_VISIBLE_DEVICES')
-if (cuda_devices_str is None # not set
- or len(cuda_devices_str.split(',')) > 1): # or set to all devices
- # If launchers like Torchrun sets `LOCAL_RANK`, we would use this information
- local_rank_str = os.getenv('LOCAL_RANK')
- if local_rank_str is not None:
- os.environ['CUDA_VISIBLE_DEVICES'] = local_rank_str
- print(f"Pinning local process {local_rank_str} to gpu {os.getenv('CUDA_VISIBLE_DEVICES')}")
+# 3. Currently this is enabled by default (as long as #1 is not implemented in RPC). Users may set `PIPPY_PIN_DEVICE` to
+# 0 to disable the pinning
+if os.getenv('PIPPY_PIN_DEVICE', '1') == '1':
+ cuda_devices_str = os.getenv('CUDA_VISIBLE_DEVICES')
+ if (cuda_devices_str is None # not set
+ or len(cuda_devices_str.split(',')) > 1): # or set to all devices
+ # If launchers like Torchrun sets `LOCAL_RANK`, we would use this information
+ local_rank_str = os.getenv('LOCAL_RANK')
+ if local_rank_str is not None:
+ os.environ['CUDA_VISIBLE_DEVICES'] = local_rank_str
+ print(f"Pinning local process {local_rank_str} to gpu {os.getenv('CUDA_VISIBLE_DEVICES')}")
import torch
import torch.multiprocessing as mp
| {"golden_diff": "diff --git a/pippy/utils.py b/pippy/utils.py\n--- a/pippy/utils.py\n+++ b/pippy/utils.py\n@@ -8,14 +8,17 @@\n # 1. Needed to work around the issue of RPC not automatically pinning spawned worker threads to CUDA device of the main\n # thread\n # 2. Must be done before `import torch` at which point CUDA context may be created\n-cuda_devices_str = os.getenv('CUDA_VISIBLE_DEVICES')\n-if (cuda_devices_str is None # not set\n- or len(cuda_devices_str.split(',')) > 1): # or set to all devices\n- # If launchers like Torchrun sets `LOCAL_RANK`, we would use this information\n- local_rank_str = os.getenv('LOCAL_RANK')\n- if local_rank_str is not None:\n- os.environ['CUDA_VISIBLE_DEVICES'] = local_rank_str\n- print(f\"Pinning local process {local_rank_str} to gpu {os.getenv('CUDA_VISIBLE_DEVICES')}\")\n+# 3. Currently this is enabled by default (as long as #1 is not implemented in RPC). Users may set `PIPPY_PIN_DEVICE` to\n+# 0 to disable the pinning\n+if os.getenv('PIPPY_PIN_DEVICE', '1') == '1':\n+ cuda_devices_str = os.getenv('CUDA_VISIBLE_DEVICES')\n+ if (cuda_devices_str is None # not set\n+ or len(cuda_devices_str.split(',')) > 1): # or set to all devices\n+ # If launchers like Torchrun sets `LOCAL_RANK`, we would use this information\n+ local_rank_str = os.getenv('LOCAL_RANK')\n+ if local_rank_str is not None:\n+ os.environ['CUDA_VISIBLE_DEVICES'] = local_rank_str\n+ print(f\"Pinning local process {local_rank_str} to gpu {os.getenv('CUDA_VISIBLE_DEVICES')}\")\n \n import torch\n import torch.multiprocessing as mp\n", "issue": "Buck run device error\nbuck run reported the following error:\r\n```\r\n[trainer1]:RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:1 and cuda:0! (when checking argument for argument weight in method wrapper__native_layer_norm)\r\n```\n", "before_files": [{"content": "# Copyright (c) Meta Platforms, Inc. and affiliates\nimport os\nimport socket\nimport logging\n\n# Pinning process to a separate GPU if not yet done by launch script\n# Notes:\n# 1. Needed to work around the issue of RPC not automatically pinning spawned worker threads to CUDA device of the main\n# thread\n# 2. Must be done before `import torch` at which point CUDA context may be created\ncuda_devices_str = os.getenv('CUDA_VISIBLE_DEVICES')\nif (cuda_devices_str is None # not set\n or len(cuda_devices_str.split(',')) > 1): # or set to all devices\n # If launchers like Torchrun sets `LOCAL_RANK`, we would use this information\n local_rank_str = os.getenv('LOCAL_RANK')\n if local_rank_str is not None:\n os.environ['CUDA_VISIBLE_DEVICES'] = local_rank_str\n print(f\"Pinning local process {local_rank_str} to gpu {os.getenv('CUDA_VISIBLE_DEVICES')}\")\n\nimport torch\nimport torch.multiprocessing as mp\nimport torch.distributed.rpc as rpc\n\n\nVERBOSE = bool(int(os.environ.get('VERBOSE', False)))\n\nif VERBOSE:\n logging.getLogger().setLevel(logging.DEBUG)\n\n\ndef has_efa() -> bool:\n try:\n import subprocess\n return subprocess.run([\"fi_info\", \"-p\", \"efa\", \"-t\", \"FI_EP_RDM\"],\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL).returncode == 0\n except FileNotFoundError:\n return False\n except PermissionError:\n return False\n\n\ndef tp_transports():\n return [\"shm\", \"uv\"] if has_efa() else None\n\n\ndef run_pippy(run_master, args, *extra_args):\n if not hasattr(args, 'world_size'):\n assert hasattr(args, 'pp_group_size')\n args.dp_group_size = args.dp_group_size if hasattr(args, 'dp_group_size') else 1\n else:\n if not hasattr(args, 'dp_group_size'):\n args.pp_group_size = args.pp_group_size if hasattr(args, 'pp_group_size') else args.world_size\n assert args.world_size % args.pp_group_size == 0\n args.dp_group_size = args.world_size // args.pp_group_size\n elif not hasattr(args, 'pp_group_size'):\n args.dp_group_size = args.dp_group_size if hasattr(args, 'dp_group_size') else 1\n assert args.world_size % args.dp_group_size == 0\n args.pp_group_size = args.world_size // args.dp_group_size\n else:\n pass\n # TODO: doesn't work for PiPPyTrainingArguments\n # assert args.world_size == args.dp_group_size * args.pp_group_size\n\n actual_world_size = args.dp_group_size * args.pp_group_size\n print(f'[PiPPy] World size: {actual_world_size}, '\n f'DP group size: {args.dp_group_size}, '\n f'PP group size: {args.pp_group_size}')\n\n if args.rank == -1:\n mp.spawn(run_worker, args=(run_master, args, *extra_args), nprocs=actual_world_size, join=True)\n elif args.rank < actual_world_size:\n run_worker(args.rank, run_master, args, *extra_args)\n else:\n print(\"I'm unused, exiting\")\n\n\ndef run_worker(rank, run_master, args, *extra_args):\n args.rank = rank\n\n os.environ['MASTER_ADDR'] = args.master_addr\n os.environ['MASTER_PORT'] = args.master_port\n\n actual_world_size = args.dp_group_size * args.pp_group_size\n\n # TODO: Move to training args, blocked by: cannot pickle 'TensorPipeRpcBackendOptions' object\n # Exclude IB for metadata transport due to lack of EFA support on AWS\n options = rpc.TensorPipeRpcBackendOptions(num_worker_threads=512,\n rpc_timeout=1800,\n _transports=tp_transports())\n if args.cuda:\n n_devs = torch.cuda.device_count()\n if n_devs > 0:\n dev_id = rank % n_devs\n for i in range(actual_world_size):\n options.set_device_map(f\"worker{i}\", {dev_id: i % n_devs})\n # Does not seem effective for RPC device pinning. TODO\n # options.set_devices([f'cuda:{dev_id}'])\n else:\n args.cuda = 0\n print('Warning: no CUDA device found. Running on CPU instead.')\n\n args.device = f'cuda:{dev_id}' if args.cuda else 'cpu'\n print(f\"rank = {rank} host/pid/device = \"\n f\"{socket.gethostname()}/{os.getpid()}/{args.device}\")\n\n # Init DDP process group\n backend = \"nccl\" if args.cuda else \"gloo\"\n torch.distributed.init_process_group(backend=backend, rank=rank, world_size=actual_world_size)\n\n rpc.init_rpc(\n f\"worker{rank}\",\n rank=rank,\n world_size=actual_world_size,\n rpc_backend_options=options\n )\n\n global dp_pg_per_pp_rank\n dp_ranks_per_pp_rank = torch.arange(actual_world_size).reshape(args.pp_group_size,\n args.dp_group_size).tolist()\n dp_pg_per_pp_rank = [torch.distributed.new_group(ranks) for ranks in dp_ranks_per_pp_rank]\n\n pp_ranks_per_dp_group = [[i * args.dp_group_size + rank for i in range(args.pp_group_size)]\n for rank in range(args.dp_group_size)]\n\n args.driver_group = torch.distributed.new_group(list(range(args.dp_group_size)))\n\n global exclude_master\n exclude_master = args.exclude_master if hasattr(args, 'exclude_master') else 0\n\n if rank >= 0 and rank // args.dp_group_size == 0:\n args.driver_index = rank\n args.local_driver_index = os.getenv('LOCAL_RANK', rank)\n run_master(pp_ranks_per_dp_group[rank], args, *extra_args)\n rpc.shutdown()\n", "path": "pippy/utils.py"}], "after_files": [{"content": "# Copyright (c) Meta Platforms, Inc. and affiliates\nimport os\nimport socket\nimport logging\n\n# Pinning process to a separate GPU if not yet done by launch script\n# Notes:\n# 1. Needed to work around the issue of RPC not automatically pinning spawned worker threads to CUDA device of the main\n# thread\n# 2. Must be done before `import torch` at which point CUDA context may be created\n# 3. Currently this is enabled by default (as long as #1 is not implemented in RPC). Users may set `PIPPY_PIN_DEVICE` to\n# 0 to disable the pinning\nif os.getenv('PIPPY_PIN_DEVICE', '1') == '1':\n cuda_devices_str = os.getenv('CUDA_VISIBLE_DEVICES')\n if (cuda_devices_str is None # not set\n or len(cuda_devices_str.split(',')) > 1): # or set to all devices\n # If launchers like Torchrun sets `LOCAL_RANK`, we would use this information\n local_rank_str = os.getenv('LOCAL_RANK')\n if local_rank_str is not None:\n os.environ['CUDA_VISIBLE_DEVICES'] = local_rank_str\n print(f\"Pinning local process {local_rank_str} to gpu {os.getenv('CUDA_VISIBLE_DEVICES')}\")\n\nimport torch\nimport torch.multiprocessing as mp\nimport torch.distributed.rpc as rpc\n\n\nVERBOSE = bool(int(os.environ.get('VERBOSE', False)))\n\nif VERBOSE:\n logging.getLogger().setLevel(logging.DEBUG)\n\n\ndef has_efa() -> bool:\n try:\n import subprocess\n return subprocess.run([\"fi_info\", \"-p\", \"efa\", \"-t\", \"FI_EP_RDM\"],\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL).returncode == 0\n except FileNotFoundError:\n return False\n except PermissionError:\n return False\n\n\ndef tp_transports():\n return [\"shm\", \"uv\"] if has_efa() else None\n\n\ndef run_pippy(run_master, args, *extra_args):\n if not hasattr(args, 'world_size'):\n assert hasattr(args, 'pp_group_size')\n args.dp_group_size = args.dp_group_size if hasattr(args, 'dp_group_size') else 1\n else:\n if not hasattr(args, 'dp_group_size'):\n args.pp_group_size = args.pp_group_size if hasattr(args, 'pp_group_size') else args.world_size\n assert args.world_size % args.pp_group_size == 0\n args.dp_group_size = args.world_size // args.pp_group_size\n elif not hasattr(args, 'pp_group_size'):\n args.dp_group_size = args.dp_group_size if hasattr(args, 'dp_group_size') else 1\n assert args.world_size % args.dp_group_size == 0\n args.pp_group_size = args.world_size // args.dp_group_size\n else:\n pass\n # TODO: doesn't work for PiPPyTrainingArguments\n # assert args.world_size == args.dp_group_size * args.pp_group_size\n\n actual_world_size = args.dp_group_size * args.pp_group_size\n print(f'[PiPPy] World size: {actual_world_size}, '\n f'DP group size: {args.dp_group_size}, '\n f'PP group size: {args.pp_group_size}')\n\n if args.rank == -1:\n mp.spawn(run_worker, args=(run_master, args, *extra_args), nprocs=actual_world_size, join=True)\n elif args.rank < actual_world_size:\n run_worker(args.rank, run_master, args, *extra_args)\n else:\n print(\"I'm unused, exiting\")\n\n\ndef run_worker(rank, run_master, args, *extra_args):\n args.rank = rank\n\n os.environ['MASTER_ADDR'] = args.master_addr\n os.environ['MASTER_PORT'] = args.master_port\n\n actual_world_size = args.dp_group_size * args.pp_group_size\n\n # TODO: Move to training args, blocked by: cannot pickle 'TensorPipeRpcBackendOptions' object\n # Exclude IB for metadata transport due to lack of EFA support on AWS\n options = rpc.TensorPipeRpcBackendOptions(num_worker_threads=512,\n rpc_timeout=1800,\n _transports=tp_transports())\n if args.cuda:\n n_devs = torch.cuda.device_count()\n if n_devs > 0:\n dev_id = rank % n_devs\n for i in range(actual_world_size):\n options.set_device_map(f\"worker{i}\", {dev_id: i % n_devs})\n # Does not seem effective for RPC device pinning. TODO\n # options.set_devices([f'cuda:{dev_id}'])\n else:\n args.cuda = 0\n print('Warning: no CUDA device found. Running on CPU instead.')\n\n args.device = f'cuda:{dev_id}' if args.cuda else 'cpu'\n print(f\"rank = {rank} host/pid/device = \"\n f\"{socket.gethostname()}/{os.getpid()}/{args.device}\")\n\n # Init DDP process group\n backend = \"nccl\" if args.cuda else \"gloo\"\n torch.distributed.init_process_group(backend=backend, rank=rank, world_size=actual_world_size)\n\n rpc.init_rpc(\n f\"worker{rank}\",\n rank=rank,\n world_size=actual_world_size,\n rpc_backend_options=options\n )\n\n global dp_pg_per_pp_rank\n dp_ranks_per_pp_rank = torch.arange(actual_world_size).reshape(args.pp_group_size,\n args.dp_group_size).tolist()\n dp_pg_per_pp_rank = [torch.distributed.new_group(ranks) for ranks in dp_ranks_per_pp_rank]\n\n pp_ranks_per_dp_group = [[i * args.dp_group_size + rank for i in range(args.pp_group_size)]\n for rank in range(args.dp_group_size)]\n\n args.driver_group = torch.distributed.new_group(list(range(args.dp_group_size)))\n\n global exclude_master\n exclude_master = args.exclude_master if hasattr(args, 'exclude_master') else 0\n\n if rank >= 0 and rank // args.dp_group_size == 0:\n args.driver_index = rank\n args.local_driver_index = os.getenv('LOCAL_RANK', rank)\n run_master(pp_ranks_per_dp_group[rank], args, *extra_args)\n rpc.shutdown()\n", "path": "pippy/utils.py"}]} | 1,928 | 418 |
gh_patches_debug_34128 | rasdani/github-patches | git_diff | conda__conda-5133 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
signal handler can only be used in main thread
Often get these with conda-build tests
https://travis-ci.org/conda/conda/jobs/225296134#L1380
```
Traceback (most recent call last):
File "/home/travis/build/conda/conda/conda-build/conda_build/build.py", line 688, in create_env
execute_actions(actions, index, verbose=config.debug)
File "/home/travis/build/conda/conda/conda/plan.py", line 612, in execute_actions
execute_instructions(plan, index, verbose)
File "/home/travis/build/conda/conda/conda/instructions.py", line 243, in execute_instructions
cmd(state, arg)
File "/home/travis/build/conda/conda/conda/instructions.py", line 98, in PROGRESSIVEFETCHEXTRACT_CMD
progressive_fetch_extract.execute()
File "/home/travis/build/conda/conda/conda/core/package_cache.py", line 491, in execute
with signal_handler(conda_signal_handler):
File "/home/travis/miniconda/lib/python3.6/contextlib.py", line 82, in __enter__
return next(self.gen)
File "/home/travis/build/conda/conda/conda/common/signals.py", line 41, in signal_handler
prev_handler = signal.signal(sig, handler)
File "/home/travis/miniconda/lib/python3.6/signal.py", line 47, in signal
handler = _signal.signal(_enum_to_int(signalnum), _enum_to_int(handler))
ValueError: signal only works in main thread
```
signal handler can only be used in main thread
Often get these with conda-build tests
https://travis-ci.org/conda/conda/jobs/225296134#L1380
```
Traceback (most recent call last):
File "/home/travis/build/conda/conda/conda-build/conda_build/build.py", line 688, in create_env
execute_actions(actions, index, verbose=config.debug)
File "/home/travis/build/conda/conda/conda/plan.py", line 612, in execute_actions
execute_instructions(plan, index, verbose)
File "/home/travis/build/conda/conda/conda/instructions.py", line 243, in execute_instructions
cmd(state, arg)
File "/home/travis/build/conda/conda/conda/instructions.py", line 98, in PROGRESSIVEFETCHEXTRACT_CMD
progressive_fetch_extract.execute()
File "/home/travis/build/conda/conda/conda/core/package_cache.py", line 491, in execute
with signal_handler(conda_signal_handler):
File "/home/travis/miniconda/lib/python3.6/contextlib.py", line 82, in __enter__
return next(self.gen)
File "/home/travis/build/conda/conda/conda/common/signals.py", line 41, in signal_handler
prev_handler = signal.signal(sig, handler)
File "/home/travis/miniconda/lib/python3.6/signal.py", line 47, in signal
handler = _signal.signal(_enum_to_int(signalnum), _enum_to_int(handler))
ValueError: signal only works in main thread
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `conda/exports.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 from __future__ import absolute_import, division, print_function, unicode_literals
3
4 from functools import partial
5 from logging import getLogger
6 from warnings import warn
7
8 log = getLogger(__name__)
9
10 from . import CondaError # NOQA
11 CondaError = CondaError
12
13 from . import compat, plan # NOQA
14 compat, plan = compat, plan
15
16 from .api import get_index # NOQA
17 get_index = get_index
18
19 from .cli.common import specs_from_args, spec_from_line, specs_from_url # NOQA
20 from .cli.conda_argparse import add_parser_prefix, add_parser_channels # NOQA
21 add_parser_channels, add_parser_prefix = add_parser_channels, add_parser_prefix
22 specs_from_args, spec_from_line = specs_from_args, spec_from_line
23 specs_from_url = specs_from_url
24
25 from .cli.conda_argparse import ArgumentParser # NOQA
26 ArgumentParser = ArgumentParser
27
28 from .common.compat import PY3, StringIO, input, iteritems, string_types, text_type # NOQA
29 PY3, StringIO, input, iteritems, string_types, text_type = PY3, StringIO, input, iteritems, string_types, text_type # NOQA
30 from .gateways.connection import CondaSession # NOQA
31 CondaSession = CondaSession
32
33 from .common.toposort import _toposort
34 _toposort = _toposort
35
36 from .gateways.disk.link import lchmod # NOQA
37 lchmod = lchmod
38
39 from .fetch import TmpDownload # NOQA
40 TmpDownload = TmpDownload
41 handle_proxy_407 = lambda x, y: warn("handle_proxy_407 is deprecated. "
42 "Now handled by CondaSession.")
43 from .core.index import dist_str_in_index, fetch_index # NOQA
44 dist_str_in_index, fetch_index = dist_str_in_index, fetch_index
45 from .core.package_cache import download, rm_fetched # NOQA
46 download, rm_fetched = download, rm_fetched
47
48 from .install import package_cache, prefix_placeholder, rm_rf, symlink_conda # NOQA
49 package_cache, prefix_placeholder, rm_rf, symlink_conda = package_cache, prefix_placeholder, rm_rf, symlink_conda # NOQA
50
51 from .gateways.disk.delete import delete_trash, move_to_trash # NOQA
52 delete_trash, move_to_trash = delete_trash, move_to_trash
53
54 from .core.linked_data import is_linked, linked, linked_data # NOQA
55 is_linked, linked, linked_data = is_linked, linked, linked_data
56
57 from .misc import untracked, walk_prefix # NOQA
58 untracked, walk_prefix = untracked, walk_prefix
59
60 from .resolve import MatchSpec, NoPackagesFound, Resolve, Unsatisfiable, normalized_version # NOQA
61 MatchSpec, NoPackagesFound, Resolve = MatchSpec, NoPackagesFound, Resolve
62 Unsatisfiable, normalized_version = Unsatisfiable, normalized_version
63
64 from .signature import KEYS, KEYS_DIR, hash_file, verify # NOQA
65 KEYS, KEYS_DIR = KEYS, KEYS_DIR
66 hash_file, verify = hash_file, verify
67
68 from .utils import hashsum_file, human_bytes, memoized, unix_path_to_win, win_path_to_unix, url_path # NOQA
69 hashsum_file, human_bytes = hashsum_file, human_bytes
70 memoized, unix_path_to_win = memoized, unix_path_to_win
71 win_path_to_unix, url_path = win_path_to_unix, url_path
72
73 from .gateways.disk.read import compute_md5sum # NOQA
74 md5_file = compute_md5sum
75
76 from .config import sys_rc_path # NOQA
77 sys_rc_path = sys_rc_path
78
79 from .models.version import VersionOrder # NOQA
80 VersionOrder = VersionOrder
81
82 import conda.base.context # NOQA
83 from .base.context import get_prefix as context_get_prefix, non_x86_linux_machines # NOQA
84 non_x86_linux_machines = non_x86_linux_machines
85
86 from ._vendor.auxlib.entity import EntityEncoder # NOQA
87 EntityEncoder = EntityEncoder
88 from .base.constants import DEFAULT_CHANNELS, DEFAULT_CHANNELS_WIN, DEFAULT_CHANNELS_UNIX # NOQA
89 DEFAULT_CHANNELS, DEFAULT_CHANNELS_WIN, DEFAULT_CHANNELS_UNIX = DEFAULT_CHANNELS, DEFAULT_CHANNELS_WIN, DEFAULT_CHANNELS_UNIX # NOQA
90 get_prefix = partial(context_get_prefix, conda.base.context.context)
91 get_default_urls = lambda: DEFAULT_CHANNELS
92
93 arch_name = conda.base.context.context.arch_name
94 binstar_upload = conda.base.context.context.anaconda_upload
95 bits = conda.base.context.context.bits
96 default_prefix = conda.base.context.context.default_prefix
97 default_python = conda.base.context.context.default_python
98 envs_dirs = conda.base.context.context.envs_dirs
99 pkgs_dirs = conda.base.context.context.pkgs_dirs
100 platform = conda.base.context.context.platform
101 root_dir = conda.base.context.context.root_prefix
102 root_writable = conda.base.context.context.root_writable
103 subdir = conda.base.context.context.subdir
104 from .models.channel import get_conda_build_local_url # NOQA
105 get_rc_urls = lambda: list(conda.base.context.context.channels)
106 get_local_urls = lambda: list(get_conda_build_local_url()) or []
107 load_condarc = lambda fn: conda.base.context.reset_context([fn])
108 from .exceptions import PaddingError # NOQA
109 PaddingError = PaddingError
110 from .gateways.disk.link import CrossPlatformStLink # NOQA
111 CrossPlatformStLink = CrossPlatformStLink
112
113 from .models.enums import FileMode # NOQA
114 FileMode = FileMode
115 from .models.enums import PathType # NOQA
116 PathType = PathType
117
118
119 if PY3:
120 import configparser # NOQA # pragma: py2 no cover
121 else:
122 import ConfigParser as configparser # NOQA # pragma: py3 no cover
123 configparser = configparser
124
125
126 from .compat import TemporaryDirectory # NOQA
127 TemporaryDirectory = TemporaryDirectory
128
129 from .gateways.subprocess import ACTIVE_SUBPROCESSES, subprocess_call # NOQA
130 ACTIVE_SUBPROCESSES, subprocess_call = ACTIVE_SUBPROCESSES, subprocess_call
131
132 from .core.repodata import cache_fn_url # NOQA
133 cache_fn_url = cache_fn_url
134
135
136 class Completer(object):
137 def get_items(self):
138 return self._get_items()
139
140 def __contains__(self, item):
141 return True
142
143 def __iter__(self):
144 return iter(self.get_items())
145
146 class InstalledPackages(object): pass # NOQA
147
```
Path: `conda/common/signals.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 from __future__ import absolute_import, division, print_function, unicode_literals
3
4 from contextlib import contextmanager
5 from logging import getLogger
6 import signal
7
8 from .compat import iteritems
9
10 log = getLogger(__name__)
11
12 INTERRUPT_SIGNALS = (
13 'SIGABRT',
14 'SIGINT',
15 'SIGTERM',
16 'SIGQUIT',
17 'SIGBREAK',
18 )
19
20
21 def get_signal_name(signum):
22 """
23 Examples:
24 >>> from signal import SIGINT
25 >>> get_signal_name(SIGINT)
26 'SIGINT'
27
28 """
29 return next((k for k, v in iteritems(signal.__dict__)
30 if v == signum and k.startswith('SIG') and not k.startswith('SIG_')),
31 None)
32
33
34 @contextmanager
35 def signal_handler(handler):
36 previous_handlers = []
37 for signame in INTERRUPT_SIGNALS:
38 sig = getattr(signal, signame, None)
39 if sig:
40 log.debug("registering handler for %s", signame)
41 prev_handler = signal.signal(sig, handler)
42 previous_handlers.append((sig, prev_handler))
43 try:
44 yield
45 finally:
46 standard_handlers = signal.SIG_IGN, signal.SIG_DFL
47 for sig, previous_handler in previous_handlers:
48 if callable(previous_handler) or previous_handler in standard_handlers:
49 log.debug("de-registering handler for %s", sig)
50 signal.signal(sig, previous_handler)
51
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/conda/common/signals.py b/conda/common/signals.py
--- a/conda/common/signals.py
+++ b/conda/common/signals.py
@@ -4,6 +4,7 @@
from contextlib import contextmanager
from logging import getLogger
import signal
+import threading
from .compat import iteritems
@@ -17,7 +18,6 @@
'SIGBREAK',
)
-
def get_signal_name(signum):
"""
Examples:
@@ -33,18 +33,23 @@
@contextmanager
def signal_handler(handler):
- previous_handlers = []
+ _thread_local = threading.local()
+ _thread_local.previous_handlers = []
for signame in INTERRUPT_SIGNALS:
sig = getattr(signal, signame, None)
if sig:
log.debug("registering handler for %s", signame)
- prev_handler = signal.signal(sig, handler)
- previous_handlers.append((sig, prev_handler))
+ try:
+ prev_handler = signal.signal(sig, handler)
+ _thread_local.previous_handlers.append((sig, prev_handler))
+ except ValueError as e:
+ # ValueError: signal only works in main thread
+ log.debug('%r', e)
try:
yield
finally:
standard_handlers = signal.SIG_IGN, signal.SIG_DFL
- for sig, previous_handler in previous_handlers:
+ for sig, previous_handler in _thread_local.previous_handlers:
if callable(previous_handler) or previous_handler in standard_handlers:
log.debug("de-registering handler for %s", sig)
signal.signal(sig, previous_handler)
diff --git a/conda/exports.py b/conda/exports.py
--- a/conda/exports.py
+++ b/conda/exports.py
@@ -30,7 +30,7 @@
from .gateways.connection import CondaSession # NOQA
CondaSession = CondaSession
-from .common.toposort import _toposort
+from .common.toposort import _toposort # NOQA
_toposort = _toposort
from .gateways.disk.link import lchmod # NOQA
| {"golden_diff": "diff --git a/conda/common/signals.py b/conda/common/signals.py\n--- a/conda/common/signals.py\n+++ b/conda/common/signals.py\n@@ -4,6 +4,7 @@\n from contextlib import contextmanager\n from logging import getLogger\n import signal\n+import threading\n \n from .compat import iteritems\n \n@@ -17,7 +18,6 @@\n 'SIGBREAK',\n )\n \n-\n def get_signal_name(signum):\n \"\"\"\n Examples:\n@@ -33,18 +33,23 @@\n \n @contextmanager\n def signal_handler(handler):\n- previous_handlers = []\n+ _thread_local = threading.local()\n+ _thread_local.previous_handlers = []\n for signame in INTERRUPT_SIGNALS:\n sig = getattr(signal, signame, None)\n if sig:\n log.debug(\"registering handler for %s\", signame)\n- prev_handler = signal.signal(sig, handler)\n- previous_handlers.append((sig, prev_handler))\n+ try:\n+ prev_handler = signal.signal(sig, handler)\n+ _thread_local.previous_handlers.append((sig, prev_handler))\n+ except ValueError as e:\n+ # ValueError: signal only works in main thread\n+ log.debug('%r', e)\n try:\n yield\n finally:\n standard_handlers = signal.SIG_IGN, signal.SIG_DFL\n- for sig, previous_handler in previous_handlers:\n+ for sig, previous_handler in _thread_local.previous_handlers:\n if callable(previous_handler) or previous_handler in standard_handlers:\n log.debug(\"de-registering handler for %s\", sig)\n signal.signal(sig, previous_handler)\ndiff --git a/conda/exports.py b/conda/exports.py\n--- a/conda/exports.py\n+++ b/conda/exports.py\n@@ -30,7 +30,7 @@\n from .gateways.connection import CondaSession # NOQA\n CondaSession = CondaSession\n \n-from .common.toposort import _toposort\n+from .common.toposort import _toposort # NOQA\n _toposort = _toposort\n \n from .gateways.disk.link import lchmod # NOQA\n", "issue": "signal handler can only be used in main thread\nOften get these with conda-build tests\r\n\r\nhttps://travis-ci.org/conda/conda/jobs/225296134#L1380\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/home/travis/build/conda/conda/conda-build/conda_build/build.py\", line 688, in create_env\r\n execute_actions(actions, index, verbose=config.debug)\r\n File \"/home/travis/build/conda/conda/conda/plan.py\", line 612, in execute_actions\r\n execute_instructions(plan, index, verbose)\r\n File \"/home/travis/build/conda/conda/conda/instructions.py\", line 243, in execute_instructions\r\n cmd(state, arg)\r\n File \"/home/travis/build/conda/conda/conda/instructions.py\", line 98, in PROGRESSIVEFETCHEXTRACT_CMD\r\n progressive_fetch_extract.execute()\r\n File \"/home/travis/build/conda/conda/conda/core/package_cache.py\", line 491, in execute\r\n with signal_handler(conda_signal_handler):\r\n File \"/home/travis/miniconda/lib/python3.6/contextlib.py\", line 82, in __enter__\r\n return next(self.gen)\r\n File \"/home/travis/build/conda/conda/conda/common/signals.py\", line 41, in signal_handler\r\n prev_handler = signal.signal(sig, handler)\r\n File \"/home/travis/miniconda/lib/python3.6/signal.py\", line 47, in signal\r\n handler = _signal.signal(_enum_to_int(signalnum), _enum_to_int(handler))\r\nValueError: signal only works in main thread\r\n```\nsignal handler can only be used in main thread\nOften get these with conda-build tests\r\n\r\nhttps://travis-ci.org/conda/conda/jobs/225296134#L1380\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/home/travis/build/conda/conda/conda-build/conda_build/build.py\", line 688, in create_env\r\n execute_actions(actions, index, verbose=config.debug)\r\n File \"/home/travis/build/conda/conda/conda/plan.py\", line 612, in execute_actions\r\n execute_instructions(plan, index, verbose)\r\n File \"/home/travis/build/conda/conda/conda/instructions.py\", line 243, in execute_instructions\r\n cmd(state, arg)\r\n File \"/home/travis/build/conda/conda/conda/instructions.py\", line 98, in PROGRESSIVEFETCHEXTRACT_CMD\r\n progressive_fetch_extract.execute()\r\n File \"/home/travis/build/conda/conda/conda/core/package_cache.py\", line 491, in execute\r\n with signal_handler(conda_signal_handler):\r\n File \"/home/travis/miniconda/lib/python3.6/contextlib.py\", line 82, in __enter__\r\n return next(self.gen)\r\n File \"/home/travis/build/conda/conda/conda/common/signals.py\", line 41, in signal_handler\r\n prev_handler = signal.signal(sig, handler)\r\n File \"/home/travis/miniconda/lib/python3.6/signal.py\", line 47, in signal\r\n handler = _signal.signal(_enum_to_int(signalnum), _enum_to_int(handler))\r\nValueError: signal only works in main thread\r\n```\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom functools import partial\nfrom logging import getLogger\nfrom warnings import warn\n\nlog = getLogger(__name__)\n\nfrom . import CondaError # NOQA\nCondaError = CondaError\n\nfrom . import compat, plan # NOQA\ncompat, plan = compat, plan\n\nfrom .api import get_index # NOQA\nget_index = get_index\n\nfrom .cli.common import specs_from_args, spec_from_line, specs_from_url # NOQA\nfrom .cli.conda_argparse import add_parser_prefix, add_parser_channels # NOQA\nadd_parser_channels, add_parser_prefix = add_parser_channels, add_parser_prefix\nspecs_from_args, spec_from_line = specs_from_args, spec_from_line\nspecs_from_url = specs_from_url\n\nfrom .cli.conda_argparse import ArgumentParser # NOQA\nArgumentParser = ArgumentParser\n\nfrom .common.compat import PY3, StringIO, input, iteritems, string_types, text_type # NOQA\nPY3, StringIO, input, iteritems, string_types, text_type = PY3, StringIO, input, iteritems, string_types, text_type # NOQA\nfrom .gateways.connection import CondaSession # NOQA\nCondaSession = CondaSession\n\nfrom .common.toposort import _toposort\n_toposort = _toposort\n\nfrom .gateways.disk.link import lchmod # NOQA\nlchmod = lchmod\n\nfrom .fetch import TmpDownload # NOQA\nTmpDownload = TmpDownload\nhandle_proxy_407 = lambda x, y: warn(\"handle_proxy_407 is deprecated. \"\n \"Now handled by CondaSession.\")\nfrom .core.index import dist_str_in_index, fetch_index # NOQA\ndist_str_in_index, fetch_index = dist_str_in_index, fetch_index\nfrom .core.package_cache import download, rm_fetched # NOQA\ndownload, rm_fetched = download, rm_fetched\n\nfrom .install import package_cache, prefix_placeholder, rm_rf, symlink_conda # NOQA\npackage_cache, prefix_placeholder, rm_rf, symlink_conda = package_cache, prefix_placeholder, rm_rf, symlink_conda # NOQA\n\nfrom .gateways.disk.delete import delete_trash, move_to_trash # NOQA\ndelete_trash, move_to_trash = delete_trash, move_to_trash\n\nfrom .core.linked_data import is_linked, linked, linked_data # NOQA\nis_linked, linked, linked_data = is_linked, linked, linked_data\n\nfrom .misc import untracked, walk_prefix # NOQA\nuntracked, walk_prefix = untracked, walk_prefix\n\nfrom .resolve import MatchSpec, NoPackagesFound, Resolve, Unsatisfiable, normalized_version # NOQA\nMatchSpec, NoPackagesFound, Resolve = MatchSpec, NoPackagesFound, Resolve\nUnsatisfiable, normalized_version = Unsatisfiable, normalized_version\n\nfrom .signature import KEYS, KEYS_DIR, hash_file, verify # NOQA\nKEYS, KEYS_DIR = KEYS, KEYS_DIR\nhash_file, verify = hash_file, verify\n\nfrom .utils import hashsum_file, human_bytes, memoized, unix_path_to_win, win_path_to_unix, url_path # NOQA\nhashsum_file, human_bytes = hashsum_file, human_bytes\nmemoized, unix_path_to_win = memoized, unix_path_to_win\nwin_path_to_unix, url_path = win_path_to_unix, url_path\n\nfrom .gateways.disk.read import compute_md5sum # NOQA\nmd5_file = compute_md5sum\n\nfrom .config import sys_rc_path # NOQA\nsys_rc_path = sys_rc_path\n\nfrom .models.version import VersionOrder # NOQA\nVersionOrder = VersionOrder\n\nimport conda.base.context # NOQA\nfrom .base.context import get_prefix as context_get_prefix, non_x86_linux_machines # NOQA\nnon_x86_linux_machines = non_x86_linux_machines\n\nfrom ._vendor.auxlib.entity import EntityEncoder # NOQA\nEntityEncoder = EntityEncoder\nfrom .base.constants import DEFAULT_CHANNELS, DEFAULT_CHANNELS_WIN, DEFAULT_CHANNELS_UNIX # NOQA\nDEFAULT_CHANNELS, DEFAULT_CHANNELS_WIN, DEFAULT_CHANNELS_UNIX = DEFAULT_CHANNELS, DEFAULT_CHANNELS_WIN, DEFAULT_CHANNELS_UNIX # NOQA\nget_prefix = partial(context_get_prefix, conda.base.context.context)\nget_default_urls = lambda: DEFAULT_CHANNELS\n\narch_name = conda.base.context.context.arch_name\nbinstar_upload = conda.base.context.context.anaconda_upload\nbits = conda.base.context.context.bits\ndefault_prefix = conda.base.context.context.default_prefix\ndefault_python = conda.base.context.context.default_python\nenvs_dirs = conda.base.context.context.envs_dirs\npkgs_dirs = conda.base.context.context.pkgs_dirs\nplatform = conda.base.context.context.platform\nroot_dir = conda.base.context.context.root_prefix\nroot_writable = conda.base.context.context.root_writable\nsubdir = conda.base.context.context.subdir\nfrom .models.channel import get_conda_build_local_url # NOQA\nget_rc_urls = lambda: list(conda.base.context.context.channels)\nget_local_urls = lambda: list(get_conda_build_local_url()) or []\nload_condarc = lambda fn: conda.base.context.reset_context([fn])\nfrom .exceptions import PaddingError # NOQA\nPaddingError = PaddingError\nfrom .gateways.disk.link import CrossPlatformStLink # NOQA\nCrossPlatformStLink = CrossPlatformStLink\n\nfrom .models.enums import FileMode # NOQA\nFileMode = FileMode\nfrom .models.enums import PathType # NOQA\nPathType = PathType\n\n\nif PY3:\n import configparser # NOQA # pragma: py2 no cover\nelse:\n import ConfigParser as configparser # NOQA # pragma: py3 no cover\nconfigparser = configparser\n\n\nfrom .compat import TemporaryDirectory # NOQA\nTemporaryDirectory = TemporaryDirectory\n\nfrom .gateways.subprocess import ACTIVE_SUBPROCESSES, subprocess_call # NOQA\nACTIVE_SUBPROCESSES, subprocess_call = ACTIVE_SUBPROCESSES, subprocess_call\n\nfrom .core.repodata import cache_fn_url # NOQA\ncache_fn_url = cache_fn_url\n\n\nclass Completer(object):\n def get_items(self):\n return self._get_items()\n\n def __contains__(self, item):\n return True\n\n def __iter__(self):\n return iter(self.get_items())\n\nclass InstalledPackages(object): pass # NOQA\n", "path": "conda/exports.py"}, {"content": "# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom contextlib import contextmanager\nfrom logging import getLogger\nimport signal\n\nfrom .compat import iteritems\n\nlog = getLogger(__name__)\n\nINTERRUPT_SIGNALS = (\n 'SIGABRT',\n 'SIGINT',\n 'SIGTERM',\n 'SIGQUIT',\n 'SIGBREAK',\n)\n\n\ndef get_signal_name(signum):\n \"\"\"\n Examples:\n >>> from signal import SIGINT\n >>> get_signal_name(SIGINT)\n 'SIGINT'\n\n \"\"\"\n return next((k for k, v in iteritems(signal.__dict__)\n if v == signum and k.startswith('SIG') and not k.startswith('SIG_')),\n None)\n\n\n@contextmanager\ndef signal_handler(handler):\n previous_handlers = []\n for signame in INTERRUPT_SIGNALS:\n sig = getattr(signal, signame, None)\n if sig:\n log.debug(\"registering handler for %s\", signame)\n prev_handler = signal.signal(sig, handler)\n previous_handlers.append((sig, prev_handler))\n try:\n yield\n finally:\n standard_handlers = signal.SIG_IGN, signal.SIG_DFL\n for sig, previous_handler in previous_handlers:\n if callable(previous_handler) or previous_handler in standard_handlers:\n log.debug(\"de-registering handler for %s\", sig)\n signal.signal(sig, previous_handler)\n", "path": "conda/common/signals.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom functools import partial\nfrom logging import getLogger\nfrom warnings import warn\n\nlog = getLogger(__name__)\n\nfrom . import CondaError # NOQA\nCondaError = CondaError\n\nfrom . import compat, plan # NOQA\ncompat, plan = compat, plan\n\nfrom .api import get_index # NOQA\nget_index = get_index\n\nfrom .cli.common import specs_from_args, spec_from_line, specs_from_url # NOQA\nfrom .cli.conda_argparse import add_parser_prefix, add_parser_channels # NOQA\nadd_parser_channels, add_parser_prefix = add_parser_channels, add_parser_prefix\nspecs_from_args, spec_from_line = specs_from_args, spec_from_line\nspecs_from_url = specs_from_url\n\nfrom .cli.conda_argparse import ArgumentParser # NOQA\nArgumentParser = ArgumentParser\n\nfrom .common.compat import PY3, StringIO, input, iteritems, string_types, text_type # NOQA\nPY3, StringIO, input, iteritems, string_types, text_type = PY3, StringIO, input, iteritems, string_types, text_type # NOQA\nfrom .gateways.connection import CondaSession # NOQA\nCondaSession = CondaSession\n\nfrom .common.toposort import _toposort # NOQA\n_toposort = _toposort\n\nfrom .gateways.disk.link import lchmod # NOQA\nlchmod = lchmod\n\nfrom .fetch import TmpDownload # NOQA\nTmpDownload = TmpDownload\nhandle_proxy_407 = lambda x, y: warn(\"handle_proxy_407 is deprecated. \"\n \"Now handled by CondaSession.\")\nfrom .core.index import dist_str_in_index, fetch_index # NOQA\ndist_str_in_index, fetch_index = dist_str_in_index, fetch_index\nfrom .core.package_cache import download, rm_fetched # NOQA\ndownload, rm_fetched = download, rm_fetched\n\nfrom .install import package_cache, prefix_placeholder, rm_rf, symlink_conda # NOQA\npackage_cache, prefix_placeholder, rm_rf, symlink_conda = package_cache, prefix_placeholder, rm_rf, symlink_conda # NOQA\n\nfrom .gateways.disk.delete import delete_trash, move_to_trash # NOQA\ndelete_trash, move_to_trash = delete_trash, move_to_trash\n\nfrom .core.linked_data import is_linked, linked, linked_data # NOQA\nis_linked, linked, linked_data = is_linked, linked, linked_data\n\nfrom .misc import untracked, walk_prefix # NOQA\nuntracked, walk_prefix = untracked, walk_prefix\n\nfrom .resolve import MatchSpec, NoPackagesFound, Resolve, Unsatisfiable, normalized_version # NOQA\nMatchSpec, NoPackagesFound, Resolve = MatchSpec, NoPackagesFound, Resolve\nUnsatisfiable, normalized_version = Unsatisfiable, normalized_version\n\nfrom .signature import KEYS, KEYS_DIR, hash_file, verify # NOQA\nKEYS, KEYS_DIR = KEYS, KEYS_DIR\nhash_file, verify = hash_file, verify\n\nfrom .utils import hashsum_file, human_bytes, memoized, unix_path_to_win, win_path_to_unix, url_path # NOQA\nhashsum_file, human_bytes = hashsum_file, human_bytes\nmemoized, unix_path_to_win = memoized, unix_path_to_win\nwin_path_to_unix, url_path = win_path_to_unix, url_path\n\nfrom .gateways.disk.read import compute_md5sum # NOQA\nmd5_file = compute_md5sum\n\nfrom .config import sys_rc_path # NOQA\nsys_rc_path = sys_rc_path\n\nfrom .models.version import VersionOrder # NOQA\nVersionOrder = VersionOrder\n\nimport conda.base.context # NOQA\nfrom .base.context import get_prefix as context_get_prefix, non_x86_linux_machines # NOQA\nnon_x86_linux_machines = non_x86_linux_machines\n\nfrom ._vendor.auxlib.entity import EntityEncoder # NOQA\nEntityEncoder = EntityEncoder\nfrom .base.constants import DEFAULT_CHANNELS, DEFAULT_CHANNELS_WIN, DEFAULT_CHANNELS_UNIX # NOQA\nDEFAULT_CHANNELS, DEFAULT_CHANNELS_WIN, DEFAULT_CHANNELS_UNIX = DEFAULT_CHANNELS, DEFAULT_CHANNELS_WIN, DEFAULT_CHANNELS_UNIX # NOQA\nget_prefix = partial(context_get_prefix, conda.base.context.context)\nget_default_urls = lambda: DEFAULT_CHANNELS\n\narch_name = conda.base.context.context.arch_name\nbinstar_upload = conda.base.context.context.anaconda_upload\nbits = conda.base.context.context.bits\ndefault_prefix = conda.base.context.context.default_prefix\ndefault_python = conda.base.context.context.default_python\nenvs_dirs = conda.base.context.context.envs_dirs\npkgs_dirs = conda.base.context.context.pkgs_dirs\nplatform = conda.base.context.context.platform\nroot_dir = conda.base.context.context.root_prefix\nroot_writable = conda.base.context.context.root_writable\nsubdir = conda.base.context.context.subdir\nfrom .models.channel import get_conda_build_local_url # NOQA\nget_rc_urls = lambda: list(conda.base.context.context.channels)\nget_local_urls = lambda: list(get_conda_build_local_url()) or []\nload_condarc = lambda fn: conda.base.context.reset_context([fn])\nfrom .exceptions import PaddingError # NOQA\nPaddingError = PaddingError\nfrom .gateways.disk.link import CrossPlatformStLink # NOQA\nCrossPlatformStLink = CrossPlatformStLink\n\nfrom .models.enums import FileMode # NOQA\nFileMode = FileMode\nfrom .models.enums import PathType # NOQA\nPathType = PathType\n\n\nif PY3:\n import configparser # NOQA # pragma: py2 no cover\nelse:\n import ConfigParser as configparser # NOQA # pragma: py3 no cover\nconfigparser = configparser\n\n\nfrom .compat import TemporaryDirectory # NOQA\nTemporaryDirectory = TemporaryDirectory\n\nfrom .gateways.subprocess import ACTIVE_SUBPROCESSES, subprocess_call # NOQA\nACTIVE_SUBPROCESSES, subprocess_call = ACTIVE_SUBPROCESSES, subprocess_call\n\nfrom .core.repodata import cache_fn_url # NOQA\ncache_fn_url = cache_fn_url\n\n\nclass Completer(object):\n def get_items(self):\n return self._get_items()\n\n def __contains__(self, item):\n return True\n\n def __iter__(self):\n return iter(self.get_items())\n\nclass InstalledPackages(object): pass # NOQA\n", "path": "conda/exports.py"}, {"content": "# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom contextlib import contextmanager\nfrom logging import getLogger\nimport signal\nimport threading\n\nfrom .compat import iteritems\n\nlog = getLogger(__name__)\n\nINTERRUPT_SIGNALS = (\n 'SIGABRT',\n 'SIGINT',\n 'SIGTERM',\n 'SIGQUIT',\n 'SIGBREAK',\n)\n\ndef get_signal_name(signum):\n \"\"\"\n Examples:\n >>> from signal import SIGINT\n >>> get_signal_name(SIGINT)\n 'SIGINT'\n\n \"\"\"\n return next((k for k, v in iteritems(signal.__dict__)\n if v == signum and k.startswith('SIG') and not k.startswith('SIG_')),\n None)\n\n\n@contextmanager\ndef signal_handler(handler):\n _thread_local = threading.local()\n _thread_local.previous_handlers = []\n for signame in INTERRUPT_SIGNALS:\n sig = getattr(signal, signame, None)\n if sig:\n log.debug(\"registering handler for %s\", signame)\n try:\n prev_handler = signal.signal(sig, handler)\n _thread_local.previous_handlers.append((sig, prev_handler))\n except ValueError as e:\n # ValueError: signal only works in main thread\n log.debug('%r', e)\n try:\n yield\n finally:\n standard_handlers = signal.SIG_IGN, signal.SIG_DFL\n for sig, previous_handler in _thread_local.previous_handlers:\n if callable(previous_handler) or previous_handler in standard_handlers:\n log.debug(\"de-registering handler for %s\", sig)\n signal.signal(sig, previous_handler)\n", "path": "conda/common/signals.py"}]} | 3,202 | 472 |
gh_patches_debug_12321 | rasdani/github-patches | git_diff | conan-io__conan-9073 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[bug] CMakeDeps generator: component and package with the same name
Hi!
During refactoring #8971 behavior of `tvalue` [macro](https://github.com/conan-io/conan/blob/80e99ca9a7d730e95adbc8bde5aa33da4f4fbb44/conan/tools/cmake/cmakedeps/templates/target_configuration.py#L35) was changed by adding some special case when `pkg_name` and `comp_name` are the same.
It accidentally break this case, because `tvalue` macro is inlined in declaration of several variables (see [this](https://github.com/conan-io/conan/blob/80e99ca9a7d730e95adbc8bde5aa33da4f4fbb44/conan/tools/cmake/cmakedeps/templates/target_configuration.py#L98) for example). So these variables have one name (`{pkg_name}_{comp_name}_...`) during setting and the other name (`{comp_name}_...`) during using it in `set_property`.
cc @lasote
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `conan/tools/cmake/cmakedeps/templates/target_configuration.py`
Content:
```
1 import textwrap
2
3 from conan.tools.cmake.cmakedeps.templates import CMakeDepsFileTemplate, get_component_alias, \
4 get_target_namespace
5
6 """
7
8 FooTarget-release.cmake
9
10 """
11
12
13 class TargetConfigurationTemplate(CMakeDepsFileTemplate):
14
15 @property
16 def filename(self):
17 return "{}Target-{}.cmake".format(self.file_name,
18 self.cmakedeps.configuration.lower())
19
20 @property
21 def context(self):
22 deps_targets_names = self.get_deps_targets_names() \
23 if not self.conanfile.is_build_context else []
24 return {"pkg_name": self.pkg_name,
25 "target_namespace": self.target_namespace,
26 "config_suffix": self.config_suffix,
27 "deps_targets_names": ";".join(deps_targets_names),
28 "components_names": self.get_required_components_names(),
29 "configuration": self.cmakedeps.configuration}
30
31 @property
32 def template(self):
33 return textwrap.dedent("""\
34
35 {%- macro tvalue(pkg_name, comp_name, var, config_suffix) -%}
36 {%- if comp_name == pkg_name -%}
37 {{'${'+pkg_name+'_'+var+config_suffix+'}'}}
38 {%- else -%}
39 {{'${'+pkg_name+'_'+comp_name+'_'+var+config_suffix+'}'}}
40 {%- endif -%}
41 {%- endmacro -%}
42
43 ########### VARIABLES #######################################################################
44 #############################################################################################
45
46 set({{ pkg_name }}_COMPILE_OPTIONS{{ config_suffix }}
47 "$<$<COMPILE_LANGUAGE:CXX>{{ ':${' }}{{ pkg_name }}_COMPILE_OPTIONS_CXX{{ config_suffix }}}>"
48 "$<$<COMPILE_LANGUAGE:C>{{ ':${' }}{{ pkg_name }}_COMPILE_OPTIONS_C{{ config_suffix }}}>")
49
50 set({{ pkg_name }}_LINKER_FLAGS{{ config_suffix }}
51 "$<$<STREQUAL{{ ':$' }}<TARGET_PROPERTY:TYPE>,SHARED_LIBRARY>{{ ':${' }}{{ pkg_name }}_SHARED_LINK_FLAGS{{ config_suffix }}}>"
52 "$<$<STREQUAL{{ ':$' }}<TARGET_PROPERTY:TYPE>,MODULE_LIBRARY>{{ ':${' }}{{ pkg_name }}_SHARED_LINK_FLAGS{{ config_suffix }}}>"
53 "$<$<STREQUAL{{ ':$' }}<TARGET_PROPERTY:TYPE>,EXECUTABLE>{{ ':${' }}{{ pkg_name }}_EXE_LINK_FLAGS{{ config_suffix }}}>")
54
55 set({{ pkg_name }}_FRAMEWORKS_FOUND{{ config_suffix }} "") # Will be filled later
56 conan_find_apple_frameworks({{ pkg_name }}_FRAMEWORKS_FOUND{{ config_suffix }} "{{ '${' }}{{ pkg_name }}_FRAMEWORKS{{ config_suffix }}}" "{{ '${' }}{{ pkg_name }}_FRAMEWORK_DIRS{{ config_suffix }}}")
57
58 # Gather all the libraries that should be linked to the targets (do not touch existing variables)
59 set(_{{ pkg_name }}_DEPENDENCIES{{ config_suffix }} "{{ '${' }}{{ pkg_name }}_FRAMEWORKS_FOUND{{ config_suffix }}} {{ '${' }}{{ pkg_name }}_SYSTEM_LIBS{{ config_suffix }}} {{ deps_targets_names }}")
60
61 set({{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} "") # Will be filled later
62 set({{ pkg_name }}_LIBRARIES{{ config_suffix }} "") # Will be filled later
63 conan_package_library_targets("{{ '${' }}{{ pkg_name }}_LIBS{{ config_suffix }}}" # libraries
64 "{{ '${' }}{{ pkg_name }}_LIB_DIRS{{ config_suffix }}}" # package_libdir
65 "{{ '${' }}_{{ pkg_name }}_DEPENDENCIES{{ config_suffix }}}" # deps
66 {{ pkg_name }}_LIBRARIES{{ config_suffix }} # out_libraries
67 {{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} # out_libraries_targets
68 "{{ config_suffix }}" # config_suffix
69 "{{ pkg_name }}") # package_name
70
71 foreach(_FRAMEWORK {{ '${' }}{{ pkg_name }}_FRAMEWORKS_FOUND{{ config_suffix }}})
72 list(APPEND {{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} ${_FRAMEWORK})
73 list(APPEND {{ pkg_name }}_LIBRARIES{{ config_suffix }} ${_FRAMEWORK})
74 endforeach()
75
76 foreach(_SYSTEM_LIB {{ '${' }}{{ pkg_name }}_SYSTEM_LIBS{{ config_suffix }}})
77 list(APPEND {{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} ${_SYSTEM_LIB})
78 list(APPEND {{ pkg_name }}_LIBRARIES{{ config_suffix }} ${_SYSTEM_LIB})
79 endforeach()
80
81 # We need to add our requirements too
82 set({{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} {{ '"${' }}{{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }}{{ '};' }}{{ deps_targets_names }}")
83 set({{ pkg_name }}_LIBRARIES{{ config_suffix }} {{ '"${' }}{{ pkg_name }}_LIBRARIES{{ config_suffix }}{{ '};' }}{{ deps_targets_names }}")
84
85 # FIXME: What is the result of this for multi-config? All configs adding themselves to path?
86 set(CMAKE_MODULE_PATH {{ '${' }}{{ pkg_name }}_BUILD_DIRS{{ config_suffix }}} {{ '${' }}CMAKE_MODULE_PATH})
87 set(CMAKE_PREFIX_PATH {{ '${' }}{{ pkg_name }}_BUILD_DIRS{{ config_suffix }}} {{ '${' }}CMAKE_PREFIX_PATH})
88
89 {%- for comp_name in components_names %}
90
91 ########## COMPONENT {{ comp_name }} FIND LIBRARIES & FRAMEWORKS / DYNAMIC VARS #############
92
93 set({{ pkg_name }}_{{ comp_name }}_FRAMEWORKS_FOUND{{ config_suffix }} "")
94 conan_find_apple_frameworks({{ pkg_name }}_{{ comp_name }}_FRAMEWORKS_FOUND{{ config_suffix }} "{{ '${'+pkg_name+'_'+comp_name+'_FRAMEWORKS'+config_suffix+'}' }}" "{{ '${'+pkg_name+'_'+comp_name+'_FRAMEWORK_DIRS'+config_suffix+'}' }}")
95
96 set({{ pkg_name }}_{{ comp_name }}_LIB_TARGETS{{ config_suffix }} "")
97 set({{ pkg_name }}_{{ comp_name }}_NOT_USED{{ config_suffix }} "")
98 set({{ pkg_name }}_{{ comp_name }}_LIBS_FRAMEWORKS_DEPS{{ config_suffix }} {{ '${'+pkg_name+'_'+comp_name+'_FRAMEWORKS_FOUND'+config_suffix+'}' }} {{ '${'+pkg_name+'_'+comp_name+'_SYSTEM_LIBS'+config_suffix+'}' }} {{ '${'+pkg_name+'_'+comp_name+'_DEPENDENCIES'+config_suffix+'}' }})
99 conan_package_library_targets("{{ '${'+pkg_name+'_'+comp_name+'_LIBS'+config_suffix+'}' }}"
100 "{{ '${'+pkg_name+'_'+comp_name+'_LIB_DIRS'+config_suffix+'}' }}"
101 "{{ '${'+pkg_name+'_'+comp_name+'_LIBS_FRAMEWORKS_DEPS'+config_suffix+'}' }}"
102 {{ pkg_name }}_{{ comp_name }}_NOT_USED{{ config_suffix }}
103 {{ pkg_name }}_{{ comp_name }}_LIB_TARGETS{{ config_suffix }}
104 "{{ config_suffix }}"
105 "{{ pkg_name }}_{{ comp_name }}")
106
107 set({{ pkg_name }}_{{ comp_name }}_LINK_LIBS{{ config_suffix }} {{ '${'+pkg_name+'_'+comp_name+'_LIB_TARGETS'+config_suffix+'}' }} {{ '${'+pkg_name+'_'+comp_name+'_LIBS_FRAMEWORKS_DEPS'+config_suffix+'}' }})
108 {%- endfor %}
109
110
111
112 ########## GLOBAL TARGET PROPERTIES {{ configuration }} ########################################
113 set_property(TARGET {{target_namespace}}::{{target_namespace}}
114 PROPERTY INTERFACE_LINK_LIBRARIES
115 $<$<CONFIG:{{configuration}}>:${{'{'}}{{pkg_name}}_LIBRARIES_TARGETS{{config_suffix}}}
116 ${{'{'}}{{pkg_name}}_LINKER_FLAGS{{config_suffix}}}> APPEND)
117 set_property(TARGET {{target_namespace}}::{{target_namespace}}
118 PROPERTY INTERFACE_INCLUDE_DIRECTORIES
119 $<$<CONFIG:{{configuration}}>:${{'{'}}{{pkg_name}}_INCLUDE_DIRS{{config_suffix}}}> APPEND)
120 set_property(TARGET {{target_namespace}}::{{target_namespace}}
121 PROPERTY INTERFACE_COMPILE_DEFINITIONS
122 $<$<CONFIG:{{configuration}}>:${{'{'}}{{pkg_name}}_COMPILE_DEFINITIONS{{config_suffix}}}> APPEND)
123 set_property(TARGET {{target_namespace}}::{{target_namespace}}
124 PROPERTY INTERFACE_COMPILE_OPTIONS
125 $<$<CONFIG:{{configuration}}>:${{'{'}}{{pkg_name}}_COMPILE_OPTIONS{{config_suffix}}}> APPEND)
126
127 ########## COMPONENTS TARGET PROPERTIES {{ configuration }} ########################################
128
129 {%- for comp_name in components_names %}
130
131 ########## COMPONENT {{ comp_name }} TARGET PROPERTIES ######################################
132 set_property(TARGET {{ target_namespace }}::{{ comp_name }} PROPERTY INTERFACE_LINK_LIBRARIES
133 $<$<CONFIG:{{ configuration }}>:{{tvalue(pkg_name, comp_name, 'LINK_LIBS', config_suffix)}}
134 {{tvalue(pkg_name, comp_name, 'LINKER_FLAGS', config_suffix)}}> APPEND)
135 set_property(TARGET {{ target_namespace }}::{{ comp_name }} PROPERTY INTERFACE_INCLUDE_DIRECTORIES
136 $<$<CONFIG:{{ configuration }}>:{{tvalue(pkg_name, comp_name, 'INCLUDE_DIRS', config_suffix)}}> APPEND)
137 set_property(TARGET {{ target_namespace }}::{{ comp_name }} PROPERTY INTERFACE_COMPILE_DEFINITIONS
138 $<$<CONFIG:{{ configuration }}>:{{tvalue(pkg_name, comp_name, 'COMPILE_DEFINITIONS', config_suffix)}}> APPEND)
139 set_property(TARGET {{ target_namespace }}::{{ comp_name }} PROPERTY INTERFACE_COMPILE_OPTIONS
140 $<$<CONFIG:{{ configuration }}>:
141 {{tvalue(pkg_name, comp_name, 'COMPILE_OPTIONS_C', config_suffix)}}
142 {{tvalue(pkg_name, comp_name, 'COMPILE_OPTIONS_CXX', config_suffix)}}> APPEND)
143 set({{ pkg_name }}_{{ comp_name }}_TARGET_PROPERTIES TRUE)
144
145 {%- endfor %}
146
147 """)
148
149 def get_required_components_names(self):
150 """Returns a list of component_name"""
151 ret = []
152 sorted_comps = self.conanfile.new_cpp_info.get_sorted_components()
153 for comp_name, comp in sorted_comps.items():
154 ret.append(get_component_alias(self.conanfile, comp_name))
155 ret.reverse()
156 return ret
157
158 def get_deps_targets_names(self):
159 """
160 - [{foo}::{bar}, ] of the required
161 """
162 ret = []
163
164 # Get a list of dependencies target names
165 # Declared cppinfo.requires or .components[].requires
166 if self.conanfile.new_cpp_info.required_components:
167 for dep_name, component_name in self.conanfile.new_cpp_info.required_components:
168 if not dep_name:
169 # Internal dep (no another component)
170 dep_name = get_target_namespace(self.conanfile)
171 req = self.conanfile
172 else:
173 req = self.conanfile.dependencies.transitive_host_requires[dep_name]
174 dep_name = get_target_namespace(req)
175
176 component_name = get_component_alias(req, component_name)
177 ret.append("{}::{}".format(dep_name, component_name))
178 elif self.conanfile.dependencies.host_requires:
179 # Regular external "conanfile.requires" declared, not cpp_info requires
180 ret = ["{p}::{p}".format(p=get_target_namespace(r))
181 for r in self.conanfile.dependencies.host_requires]
182 return ret
183
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/conan/tools/cmake/cmakedeps/templates/target_configuration.py b/conan/tools/cmake/cmakedeps/templates/target_configuration.py
--- a/conan/tools/cmake/cmakedeps/templates/target_configuration.py
+++ b/conan/tools/cmake/cmakedeps/templates/target_configuration.py
@@ -33,11 +33,7 @@
return textwrap.dedent("""\
{%- macro tvalue(pkg_name, comp_name, var, config_suffix) -%}
- {%- if comp_name == pkg_name -%}
- {{'${'+pkg_name+'_'+var+config_suffix+'}'}}
- {%- else -%}
- {{'${'+pkg_name+'_'+comp_name+'_'+var+config_suffix+'}'}}
- {%- endif -%}
+ {{'${'+pkg_name+'_'+comp_name+'_'+var+config_suffix+'}'}}
{%- endmacro -%}
########### VARIABLES #######################################################################
| {"golden_diff": "diff --git a/conan/tools/cmake/cmakedeps/templates/target_configuration.py b/conan/tools/cmake/cmakedeps/templates/target_configuration.py\n--- a/conan/tools/cmake/cmakedeps/templates/target_configuration.py\n+++ b/conan/tools/cmake/cmakedeps/templates/target_configuration.py\n@@ -33,11 +33,7 @@\n return textwrap.dedent(\"\"\"\\\n \n {%- macro tvalue(pkg_name, comp_name, var, config_suffix) -%}\n- {%- if comp_name == pkg_name -%}\n- {{'${'+pkg_name+'_'+var+config_suffix+'}'}}\n- {%- else -%}\n- {{'${'+pkg_name+'_'+comp_name+'_'+var+config_suffix+'}'}}\n- {%- endif -%}\n+ {{'${'+pkg_name+'_'+comp_name+'_'+var+config_suffix+'}'}}\n {%- endmacro -%}\n \n ########### VARIABLES #######################################################################\n", "issue": "[bug] CMakeDeps generator: component and package with the same name\nHi!\r\n\r\nDuring refactoring #8971 behavior of `tvalue` [macro](https://github.com/conan-io/conan/blob/80e99ca9a7d730e95adbc8bde5aa33da4f4fbb44/conan/tools/cmake/cmakedeps/templates/target_configuration.py#L35) was changed by adding some special case when `pkg_name` and `comp_name` are the same. \r\n\r\nIt accidentally break this case, because `tvalue` macro is inlined in declaration of several variables (see [this](https://github.com/conan-io/conan/blob/80e99ca9a7d730e95adbc8bde5aa33da4f4fbb44/conan/tools/cmake/cmakedeps/templates/target_configuration.py#L98) for example). So these variables have one name (`{pkg_name}_{comp_name}_...`) during setting and the other name (`{comp_name}_...`) during using it in `set_property`.\r\n\r\ncc @lasote \n", "before_files": [{"content": "import textwrap\n\nfrom conan.tools.cmake.cmakedeps.templates import CMakeDepsFileTemplate, get_component_alias, \\\n get_target_namespace\n\n\"\"\"\n\nFooTarget-release.cmake\n\n\"\"\"\n\n\nclass TargetConfigurationTemplate(CMakeDepsFileTemplate):\n\n @property\n def filename(self):\n return \"{}Target-{}.cmake\".format(self.file_name,\n self.cmakedeps.configuration.lower())\n\n @property\n def context(self):\n deps_targets_names = self.get_deps_targets_names() \\\n if not self.conanfile.is_build_context else []\n return {\"pkg_name\": self.pkg_name,\n \"target_namespace\": self.target_namespace,\n \"config_suffix\": self.config_suffix,\n \"deps_targets_names\": \";\".join(deps_targets_names),\n \"components_names\": self.get_required_components_names(),\n \"configuration\": self.cmakedeps.configuration}\n\n @property\n def template(self):\n return textwrap.dedent(\"\"\"\\\n\n {%- macro tvalue(pkg_name, comp_name, var, config_suffix) -%}\n {%- if comp_name == pkg_name -%}\n {{'${'+pkg_name+'_'+var+config_suffix+'}'}}\n {%- else -%}\n {{'${'+pkg_name+'_'+comp_name+'_'+var+config_suffix+'}'}}\n {%- endif -%}\n {%- endmacro -%}\n\n ########### VARIABLES #######################################################################\n #############################################################################################\n\n set({{ pkg_name }}_COMPILE_OPTIONS{{ config_suffix }}\n \"$<$<COMPILE_LANGUAGE:CXX>{{ ':${' }}{{ pkg_name }}_COMPILE_OPTIONS_CXX{{ config_suffix }}}>\"\n \"$<$<COMPILE_LANGUAGE:C>{{ ':${' }}{{ pkg_name }}_COMPILE_OPTIONS_C{{ config_suffix }}}>\")\n\n set({{ pkg_name }}_LINKER_FLAGS{{ config_suffix }}\n \"$<$<STREQUAL{{ ':$' }}<TARGET_PROPERTY:TYPE>,SHARED_LIBRARY>{{ ':${' }}{{ pkg_name }}_SHARED_LINK_FLAGS{{ config_suffix }}}>\"\n \"$<$<STREQUAL{{ ':$' }}<TARGET_PROPERTY:TYPE>,MODULE_LIBRARY>{{ ':${' }}{{ pkg_name }}_SHARED_LINK_FLAGS{{ config_suffix }}}>\"\n \"$<$<STREQUAL{{ ':$' }}<TARGET_PROPERTY:TYPE>,EXECUTABLE>{{ ':${' }}{{ pkg_name }}_EXE_LINK_FLAGS{{ config_suffix }}}>\")\n\n set({{ pkg_name }}_FRAMEWORKS_FOUND{{ config_suffix }} \"\") # Will be filled later\n conan_find_apple_frameworks({{ pkg_name }}_FRAMEWORKS_FOUND{{ config_suffix }} \"{{ '${' }}{{ pkg_name }}_FRAMEWORKS{{ config_suffix }}}\" \"{{ '${' }}{{ pkg_name }}_FRAMEWORK_DIRS{{ config_suffix }}}\")\n\n # Gather all the libraries that should be linked to the targets (do not touch existing variables)\n set(_{{ pkg_name }}_DEPENDENCIES{{ config_suffix }} \"{{ '${' }}{{ pkg_name }}_FRAMEWORKS_FOUND{{ config_suffix }}} {{ '${' }}{{ pkg_name }}_SYSTEM_LIBS{{ config_suffix }}} {{ deps_targets_names }}\")\n\n set({{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} \"\") # Will be filled later\n set({{ pkg_name }}_LIBRARIES{{ config_suffix }} \"\") # Will be filled later\n conan_package_library_targets(\"{{ '${' }}{{ pkg_name }}_LIBS{{ config_suffix }}}\" # libraries\n \"{{ '${' }}{{ pkg_name }}_LIB_DIRS{{ config_suffix }}}\" # package_libdir\n \"{{ '${' }}_{{ pkg_name }}_DEPENDENCIES{{ config_suffix }}}\" # deps\n {{ pkg_name }}_LIBRARIES{{ config_suffix }} # out_libraries\n {{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} # out_libraries_targets\n \"{{ config_suffix }}\" # config_suffix\n \"{{ pkg_name }}\") # package_name\n\n foreach(_FRAMEWORK {{ '${' }}{{ pkg_name }}_FRAMEWORKS_FOUND{{ config_suffix }}})\n list(APPEND {{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} ${_FRAMEWORK})\n list(APPEND {{ pkg_name }}_LIBRARIES{{ config_suffix }} ${_FRAMEWORK})\n endforeach()\n\n foreach(_SYSTEM_LIB {{ '${' }}{{ pkg_name }}_SYSTEM_LIBS{{ config_suffix }}})\n list(APPEND {{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} ${_SYSTEM_LIB})\n list(APPEND {{ pkg_name }}_LIBRARIES{{ config_suffix }} ${_SYSTEM_LIB})\n endforeach()\n\n # We need to add our requirements too\n set({{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} {{ '\"${' }}{{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }}{{ '};' }}{{ deps_targets_names }}\")\n set({{ pkg_name }}_LIBRARIES{{ config_suffix }} {{ '\"${' }}{{ pkg_name }}_LIBRARIES{{ config_suffix }}{{ '};' }}{{ deps_targets_names }}\")\n\n # FIXME: What is the result of this for multi-config? All configs adding themselves to path?\n set(CMAKE_MODULE_PATH {{ '${' }}{{ pkg_name }}_BUILD_DIRS{{ config_suffix }}} {{ '${' }}CMAKE_MODULE_PATH})\n set(CMAKE_PREFIX_PATH {{ '${' }}{{ pkg_name }}_BUILD_DIRS{{ config_suffix }}} {{ '${' }}CMAKE_PREFIX_PATH})\n\n {%- for comp_name in components_names %}\n\n ########## COMPONENT {{ comp_name }} FIND LIBRARIES & FRAMEWORKS / DYNAMIC VARS #############\n\n set({{ pkg_name }}_{{ comp_name }}_FRAMEWORKS_FOUND{{ config_suffix }} \"\")\n conan_find_apple_frameworks({{ pkg_name }}_{{ comp_name }}_FRAMEWORKS_FOUND{{ config_suffix }} \"{{ '${'+pkg_name+'_'+comp_name+'_FRAMEWORKS'+config_suffix+'}' }}\" \"{{ '${'+pkg_name+'_'+comp_name+'_FRAMEWORK_DIRS'+config_suffix+'}' }}\")\n\n set({{ pkg_name }}_{{ comp_name }}_LIB_TARGETS{{ config_suffix }} \"\")\n set({{ pkg_name }}_{{ comp_name }}_NOT_USED{{ config_suffix }} \"\")\n set({{ pkg_name }}_{{ comp_name }}_LIBS_FRAMEWORKS_DEPS{{ config_suffix }} {{ '${'+pkg_name+'_'+comp_name+'_FRAMEWORKS_FOUND'+config_suffix+'}' }} {{ '${'+pkg_name+'_'+comp_name+'_SYSTEM_LIBS'+config_suffix+'}' }} {{ '${'+pkg_name+'_'+comp_name+'_DEPENDENCIES'+config_suffix+'}' }})\n conan_package_library_targets(\"{{ '${'+pkg_name+'_'+comp_name+'_LIBS'+config_suffix+'}' }}\"\n \"{{ '${'+pkg_name+'_'+comp_name+'_LIB_DIRS'+config_suffix+'}' }}\"\n \"{{ '${'+pkg_name+'_'+comp_name+'_LIBS_FRAMEWORKS_DEPS'+config_suffix+'}' }}\"\n {{ pkg_name }}_{{ comp_name }}_NOT_USED{{ config_suffix }}\n {{ pkg_name }}_{{ comp_name }}_LIB_TARGETS{{ config_suffix }}\n \"{{ config_suffix }}\"\n \"{{ pkg_name }}_{{ comp_name }}\")\n\n set({{ pkg_name }}_{{ comp_name }}_LINK_LIBS{{ config_suffix }} {{ '${'+pkg_name+'_'+comp_name+'_LIB_TARGETS'+config_suffix+'}' }} {{ '${'+pkg_name+'_'+comp_name+'_LIBS_FRAMEWORKS_DEPS'+config_suffix+'}' }})\n {%- endfor %}\n\n\n\n ########## GLOBAL TARGET PROPERTIES {{ configuration }} ########################################\n set_property(TARGET {{target_namespace}}::{{target_namespace}}\n PROPERTY INTERFACE_LINK_LIBRARIES\n $<$<CONFIG:{{configuration}}>:${{'{'}}{{pkg_name}}_LIBRARIES_TARGETS{{config_suffix}}}\n ${{'{'}}{{pkg_name}}_LINKER_FLAGS{{config_suffix}}}> APPEND)\n set_property(TARGET {{target_namespace}}::{{target_namespace}}\n PROPERTY INTERFACE_INCLUDE_DIRECTORIES\n $<$<CONFIG:{{configuration}}>:${{'{'}}{{pkg_name}}_INCLUDE_DIRS{{config_suffix}}}> APPEND)\n set_property(TARGET {{target_namespace}}::{{target_namespace}}\n PROPERTY INTERFACE_COMPILE_DEFINITIONS\n $<$<CONFIG:{{configuration}}>:${{'{'}}{{pkg_name}}_COMPILE_DEFINITIONS{{config_suffix}}}> APPEND)\n set_property(TARGET {{target_namespace}}::{{target_namespace}}\n PROPERTY INTERFACE_COMPILE_OPTIONS\n $<$<CONFIG:{{configuration}}>:${{'{'}}{{pkg_name}}_COMPILE_OPTIONS{{config_suffix}}}> APPEND)\n\n ########## COMPONENTS TARGET PROPERTIES {{ configuration }} ########################################\n\n {%- for comp_name in components_names %}\n\n ########## COMPONENT {{ comp_name }} TARGET PROPERTIES ######################################\n set_property(TARGET {{ target_namespace }}::{{ comp_name }} PROPERTY INTERFACE_LINK_LIBRARIES\n $<$<CONFIG:{{ configuration }}>:{{tvalue(pkg_name, comp_name, 'LINK_LIBS', config_suffix)}}\n {{tvalue(pkg_name, comp_name, 'LINKER_FLAGS', config_suffix)}}> APPEND)\n set_property(TARGET {{ target_namespace }}::{{ comp_name }} PROPERTY INTERFACE_INCLUDE_DIRECTORIES\n $<$<CONFIG:{{ configuration }}>:{{tvalue(pkg_name, comp_name, 'INCLUDE_DIRS', config_suffix)}}> APPEND)\n set_property(TARGET {{ target_namespace }}::{{ comp_name }} PROPERTY INTERFACE_COMPILE_DEFINITIONS\n $<$<CONFIG:{{ configuration }}>:{{tvalue(pkg_name, comp_name, 'COMPILE_DEFINITIONS', config_suffix)}}> APPEND)\n set_property(TARGET {{ target_namespace }}::{{ comp_name }} PROPERTY INTERFACE_COMPILE_OPTIONS\n $<$<CONFIG:{{ configuration }}>:\n {{tvalue(pkg_name, comp_name, 'COMPILE_OPTIONS_C', config_suffix)}}\n {{tvalue(pkg_name, comp_name, 'COMPILE_OPTIONS_CXX', config_suffix)}}> APPEND)\n set({{ pkg_name }}_{{ comp_name }}_TARGET_PROPERTIES TRUE)\n\n {%- endfor %}\n\n \"\"\")\n\n def get_required_components_names(self):\n \"\"\"Returns a list of component_name\"\"\"\n ret = []\n sorted_comps = self.conanfile.new_cpp_info.get_sorted_components()\n for comp_name, comp in sorted_comps.items():\n ret.append(get_component_alias(self.conanfile, comp_name))\n ret.reverse()\n return ret\n\n def get_deps_targets_names(self):\n \"\"\"\n - [{foo}::{bar}, ] of the required\n \"\"\"\n ret = []\n\n # Get a list of dependencies target names\n # Declared cppinfo.requires or .components[].requires\n if self.conanfile.new_cpp_info.required_components:\n for dep_name, component_name in self.conanfile.new_cpp_info.required_components:\n if not dep_name:\n # Internal dep (no another component)\n dep_name = get_target_namespace(self.conanfile)\n req = self.conanfile\n else:\n req = self.conanfile.dependencies.transitive_host_requires[dep_name]\n dep_name = get_target_namespace(req)\n\n component_name = get_component_alias(req, component_name)\n ret.append(\"{}::{}\".format(dep_name, component_name))\n elif self.conanfile.dependencies.host_requires:\n # Regular external \"conanfile.requires\" declared, not cpp_info requires\n ret = [\"{p}::{p}\".format(p=get_target_namespace(r))\n for r in self.conanfile.dependencies.host_requires]\n return ret\n", "path": "conan/tools/cmake/cmakedeps/templates/target_configuration.py"}], "after_files": [{"content": "import textwrap\n\nfrom conan.tools.cmake.cmakedeps.templates import CMakeDepsFileTemplate, get_component_alias, \\\n get_target_namespace\n\n\"\"\"\n\nFooTarget-release.cmake\n\n\"\"\"\n\n\nclass TargetConfigurationTemplate(CMakeDepsFileTemplate):\n\n @property\n def filename(self):\n return \"{}Target-{}.cmake\".format(self.file_name,\n self.cmakedeps.configuration.lower())\n\n @property\n def context(self):\n deps_targets_names = self.get_deps_targets_names() \\\n if not self.conanfile.is_build_context else []\n return {\"pkg_name\": self.pkg_name,\n \"target_namespace\": self.target_namespace,\n \"config_suffix\": self.config_suffix,\n \"deps_targets_names\": \";\".join(deps_targets_names),\n \"components_names\": self.get_required_components_names(),\n \"configuration\": self.cmakedeps.configuration}\n\n @property\n def template(self):\n return textwrap.dedent(\"\"\"\\\n\n {%- macro tvalue(pkg_name, comp_name, var, config_suffix) -%}\n {{'${'+pkg_name+'_'+comp_name+'_'+var+config_suffix+'}'}}\n {%- endmacro -%}\n\n ########### VARIABLES #######################################################################\n #############################################################################################\n\n set({{ pkg_name }}_COMPILE_OPTIONS{{ config_suffix }}\n \"$<$<COMPILE_LANGUAGE:CXX>{{ ':${' }}{{ pkg_name }}_COMPILE_OPTIONS_CXX{{ config_suffix }}}>\"\n \"$<$<COMPILE_LANGUAGE:C>{{ ':${' }}{{ pkg_name }}_COMPILE_OPTIONS_C{{ config_suffix }}}>\")\n\n set({{ pkg_name }}_LINKER_FLAGS{{ config_suffix }}\n \"$<$<STREQUAL{{ ':$' }}<TARGET_PROPERTY:TYPE>,SHARED_LIBRARY>{{ ':${' }}{{ pkg_name }}_SHARED_LINK_FLAGS{{ config_suffix }}}>\"\n \"$<$<STREQUAL{{ ':$' }}<TARGET_PROPERTY:TYPE>,MODULE_LIBRARY>{{ ':${' }}{{ pkg_name }}_SHARED_LINK_FLAGS{{ config_suffix }}}>\"\n \"$<$<STREQUAL{{ ':$' }}<TARGET_PROPERTY:TYPE>,EXECUTABLE>{{ ':${' }}{{ pkg_name }}_EXE_LINK_FLAGS{{ config_suffix }}}>\")\n\n set({{ pkg_name }}_FRAMEWORKS_FOUND{{ config_suffix }} \"\") # Will be filled later\n conan_find_apple_frameworks({{ pkg_name }}_FRAMEWORKS_FOUND{{ config_suffix }} \"{{ '${' }}{{ pkg_name }}_FRAMEWORKS{{ config_suffix }}}\" \"{{ '${' }}{{ pkg_name }}_FRAMEWORK_DIRS{{ config_suffix }}}\")\n\n # Gather all the libraries that should be linked to the targets (do not touch existing variables)\n set(_{{ pkg_name }}_DEPENDENCIES{{ config_suffix }} \"{{ '${' }}{{ pkg_name }}_FRAMEWORKS_FOUND{{ config_suffix }}} {{ '${' }}{{ pkg_name }}_SYSTEM_LIBS{{ config_suffix }}} {{ deps_targets_names }}\")\n\n set({{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} \"\") # Will be filled later\n set({{ pkg_name }}_LIBRARIES{{ config_suffix }} \"\") # Will be filled later\n conan_package_library_targets(\"{{ '${' }}{{ pkg_name }}_LIBS{{ config_suffix }}}\" # libraries\n \"{{ '${' }}{{ pkg_name }}_LIB_DIRS{{ config_suffix }}}\" # package_libdir\n \"{{ '${' }}_{{ pkg_name }}_DEPENDENCIES{{ config_suffix }}}\" # deps\n {{ pkg_name }}_LIBRARIES{{ config_suffix }} # out_libraries\n {{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} # out_libraries_targets\n \"{{ config_suffix }}\" # config_suffix\n \"{{ pkg_name }}\") # package_name\n\n foreach(_FRAMEWORK {{ '${' }}{{ pkg_name }}_FRAMEWORKS_FOUND{{ config_suffix }}})\n list(APPEND {{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} ${_FRAMEWORK})\n list(APPEND {{ pkg_name }}_LIBRARIES{{ config_suffix }} ${_FRAMEWORK})\n endforeach()\n\n foreach(_SYSTEM_LIB {{ '${' }}{{ pkg_name }}_SYSTEM_LIBS{{ config_suffix }}})\n list(APPEND {{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} ${_SYSTEM_LIB})\n list(APPEND {{ pkg_name }}_LIBRARIES{{ config_suffix }} ${_SYSTEM_LIB})\n endforeach()\n\n # We need to add our requirements too\n set({{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }} {{ '\"${' }}{{ pkg_name }}_LIBRARIES_TARGETS{{ config_suffix }}{{ '};' }}{{ deps_targets_names }}\")\n set({{ pkg_name }}_LIBRARIES{{ config_suffix }} {{ '\"${' }}{{ pkg_name }}_LIBRARIES{{ config_suffix }}{{ '};' }}{{ deps_targets_names }}\")\n\n # FIXME: What is the result of this for multi-config? All configs adding themselves to path?\n set(CMAKE_MODULE_PATH {{ '${' }}{{ pkg_name }}_BUILD_DIRS{{ config_suffix }}} {{ '${' }}CMAKE_MODULE_PATH})\n set(CMAKE_PREFIX_PATH {{ '${' }}{{ pkg_name }}_BUILD_DIRS{{ config_suffix }}} {{ '${' }}CMAKE_PREFIX_PATH})\n\n {%- for comp_name in components_names %}\n\n ########## COMPONENT {{ comp_name }} FIND LIBRARIES & FRAMEWORKS / DYNAMIC VARS #############\n\n set({{ pkg_name }}_{{ comp_name }}_FRAMEWORKS_FOUND{{ config_suffix }} \"\")\n conan_find_apple_frameworks({{ pkg_name }}_{{ comp_name }}_FRAMEWORKS_FOUND{{ config_suffix }} \"{{ '${'+pkg_name+'_'+comp_name+'_FRAMEWORKS'+config_suffix+'}' }}\" \"{{ '${'+pkg_name+'_'+comp_name+'_FRAMEWORK_DIRS'+config_suffix+'}' }}\")\n\n set({{ pkg_name }}_{{ comp_name }}_LIB_TARGETS{{ config_suffix }} \"\")\n set({{ pkg_name }}_{{ comp_name }}_NOT_USED{{ config_suffix }} \"\")\n set({{ pkg_name }}_{{ comp_name }}_LIBS_FRAMEWORKS_DEPS{{ config_suffix }} {{ '${'+pkg_name+'_'+comp_name+'_FRAMEWORKS_FOUND'+config_suffix+'}' }} {{ '${'+pkg_name+'_'+comp_name+'_SYSTEM_LIBS'+config_suffix+'}' }} {{ '${'+pkg_name+'_'+comp_name+'_DEPENDENCIES'+config_suffix+'}' }})\n conan_package_library_targets(\"{{ '${'+pkg_name+'_'+comp_name+'_LIBS'+config_suffix+'}' }}\"\n \"{{ '${'+pkg_name+'_'+comp_name+'_LIB_DIRS'+config_suffix+'}' }}\"\n \"{{ '${'+pkg_name+'_'+comp_name+'_LIBS_FRAMEWORKS_DEPS'+config_suffix+'}' }}\"\n {{ pkg_name }}_{{ comp_name }}_NOT_USED{{ config_suffix }}\n {{ pkg_name }}_{{ comp_name }}_LIB_TARGETS{{ config_suffix }}\n \"{{ config_suffix }}\"\n \"{{ pkg_name }}_{{ comp_name }}\")\n\n set({{ pkg_name }}_{{ comp_name }}_LINK_LIBS{{ config_suffix }} {{ '${'+pkg_name+'_'+comp_name+'_LIB_TARGETS'+config_suffix+'}' }} {{ '${'+pkg_name+'_'+comp_name+'_LIBS_FRAMEWORKS_DEPS'+config_suffix+'}' }})\n {%- endfor %}\n\n\n\n ########## GLOBAL TARGET PROPERTIES {{ configuration }} ########################################\n set_property(TARGET {{target_namespace}}::{{target_namespace}}\n PROPERTY INTERFACE_LINK_LIBRARIES\n $<$<CONFIG:{{configuration}}>:${{'{'}}{{pkg_name}}_LIBRARIES_TARGETS{{config_suffix}}}\n ${{'{'}}{{pkg_name}}_LINKER_FLAGS{{config_suffix}}}> APPEND)\n set_property(TARGET {{target_namespace}}::{{target_namespace}}\n PROPERTY INTERFACE_INCLUDE_DIRECTORIES\n $<$<CONFIG:{{configuration}}>:${{'{'}}{{pkg_name}}_INCLUDE_DIRS{{config_suffix}}}> APPEND)\n set_property(TARGET {{target_namespace}}::{{target_namespace}}\n PROPERTY INTERFACE_COMPILE_DEFINITIONS\n $<$<CONFIG:{{configuration}}>:${{'{'}}{{pkg_name}}_COMPILE_DEFINITIONS{{config_suffix}}}> APPEND)\n set_property(TARGET {{target_namespace}}::{{target_namespace}}\n PROPERTY INTERFACE_COMPILE_OPTIONS\n $<$<CONFIG:{{configuration}}>:${{'{'}}{{pkg_name}}_COMPILE_OPTIONS{{config_suffix}}}> APPEND)\n\n ########## COMPONENTS TARGET PROPERTIES {{ configuration }} ########################################\n\n {%- for comp_name in components_names %}\n\n ########## COMPONENT {{ comp_name }} TARGET PROPERTIES ######################################\n set_property(TARGET {{ target_namespace }}::{{ comp_name }} PROPERTY INTERFACE_LINK_LIBRARIES\n $<$<CONFIG:{{ configuration }}>:{{tvalue(pkg_name, comp_name, 'LINK_LIBS', config_suffix)}}\n {{tvalue(pkg_name, comp_name, 'LINKER_FLAGS', config_suffix)}}> APPEND)\n set_property(TARGET {{ target_namespace }}::{{ comp_name }} PROPERTY INTERFACE_INCLUDE_DIRECTORIES\n $<$<CONFIG:{{ configuration }}>:{{tvalue(pkg_name, comp_name, 'INCLUDE_DIRS', config_suffix)}}> APPEND)\n set_property(TARGET {{ target_namespace }}::{{ comp_name }} PROPERTY INTERFACE_COMPILE_DEFINITIONS\n $<$<CONFIG:{{ configuration }}>:{{tvalue(pkg_name, comp_name, 'COMPILE_DEFINITIONS', config_suffix)}}> APPEND)\n set_property(TARGET {{ target_namespace }}::{{ comp_name }} PROPERTY INTERFACE_COMPILE_OPTIONS\n $<$<CONFIG:{{ configuration }}>:\n {{tvalue(pkg_name, comp_name, 'COMPILE_OPTIONS_C', config_suffix)}}\n {{tvalue(pkg_name, comp_name, 'COMPILE_OPTIONS_CXX', config_suffix)}}> APPEND)\n set({{ pkg_name }}_{{ comp_name }}_TARGET_PROPERTIES TRUE)\n\n {%- endfor %}\n\n \"\"\")\n\n def get_required_components_names(self):\n \"\"\"Returns a list of component_name\"\"\"\n ret = []\n sorted_comps = self.conanfile.new_cpp_info.get_sorted_components()\n for comp_name, comp in sorted_comps.items():\n ret.append(get_component_alias(self.conanfile, comp_name))\n ret.reverse()\n return ret\n\n def get_deps_targets_names(self):\n \"\"\"\n - [{foo}::{bar}, ] of the required\n \"\"\"\n ret = []\n\n # Get a list of dependencies target names\n # Declared cppinfo.requires or .components[].requires\n if self.conanfile.new_cpp_info.required_components:\n for dep_name, component_name in self.conanfile.new_cpp_info.required_components:\n if not dep_name:\n # Internal dep (no another component)\n dep_name = get_target_namespace(self.conanfile)\n req = self.conanfile\n else:\n req = self.conanfile.dependencies.transitive_host_requires[dep_name]\n dep_name = get_target_namespace(req)\n\n component_name = get_component_alias(req, component_name)\n ret.append(\"{}::{}\".format(dep_name, component_name))\n elif self.conanfile.dependencies.host_requires:\n # Regular external \"conanfile.requires\" declared, not cpp_info requires\n ret = [\"{p}::{p}\".format(p=get_target_namespace(r))\n for r in self.conanfile.dependencies.host_requires]\n return ret\n", "path": "conan/tools/cmake/cmakedeps/templates/target_configuration.py"}]} | 3,424 | 206 |
gh_patches_debug_22807 | rasdani/github-patches | git_diff | aws-cloudformation__cfn-lint-907 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Double in mapping thrown E7001 error
*cfn-lint version: cfn-lint 0.20.1*
*Description of issue.*
When a mapping value is a double (ex. 1.1) it returns the error `E7001:Mapping [map] has invalid property at [property]`
Examples:
With double value:

Changed to Int:

Example CFT: [environment.yaml.txt](https://github.com/aws-cloudformation/cfn-python-lint/files/3179852/environment.yaml.txt)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cfnlint/rules/mappings/Configuration.py`
Content:
```
1 """
2 Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.
3
4 Permission is hereby granted, free of charge, to any person obtaining a copy of this
5 software and associated documentation files (the "Software"), to deal in the Software
6 without restriction, including without limitation the rights to use, copy, modify,
7 merge, publish, distribute, sublicense, and/or sell copies of the Software, and to
8 permit persons to whom the Software is furnished to do so.
9
10 THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
11 INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
12 PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
13 HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
14 OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
15 SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
16 """
17 import six
18 from cfnlint import CloudFormationLintRule
19 from cfnlint import RuleMatch
20
21
22 class Configuration(CloudFormationLintRule):
23 """Check if Mappings are configured correctly"""
24 id = 'E7001'
25 shortdesc = 'Mappings are appropriately configured'
26 description = 'Check if Mappings are properly configured'
27 source_url = 'https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/mappings-section-structure.html'
28 tags = ['mappings']
29
30 def match(self, cfn):
31 """Check CloudFormation Parameters"""
32
33 matches = []
34
35 mappings = cfn.template.get('Mappings', {})
36 if mappings:
37 for mapname, mapobj in mappings.items():
38 if not isinstance(mapobj, dict):
39 message = 'Mapping {0} has invalid property'
40 matches.append(RuleMatch(
41 ['Mappings', mapname],
42 message.format(mapname)
43 ))
44 else:
45 for firstkey in mapobj:
46 firstkeyobj = mapobj[firstkey]
47 if not isinstance(firstkeyobj, dict):
48 message = 'Mapping {0} has invalid property at {1}'
49 matches.append(RuleMatch(
50 ['Mappings', mapname, firstkey],
51 message.format(mapname, firstkeyobj)
52 ))
53 else:
54 for secondkey in firstkeyobj:
55 if not isinstance(
56 firstkeyobj[secondkey],
57 (six.string_types, list, six.integer_types)):
58 message = 'Mapping {0} has invalid property at {1}'
59 matches.append(RuleMatch(
60 ['Mappings', mapname, firstkey, secondkey],
61 message.format(mapname, secondkey)
62 ))
63
64 return matches
65
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/cfnlint/rules/mappings/Configuration.py b/src/cfnlint/rules/mappings/Configuration.py
--- a/src/cfnlint/rules/mappings/Configuration.py
+++ b/src/cfnlint/rules/mappings/Configuration.py
@@ -32,6 +32,8 @@
matches = []
+ valid_map_types = (six.string_types, list, six.integer_types, float)
+
mappings = cfn.template.get('Mappings', {})
if mappings:
for mapname, mapobj in mappings.items():
@@ -53,8 +55,7 @@
else:
for secondkey in firstkeyobj:
if not isinstance(
- firstkeyobj[secondkey],
- (six.string_types, list, six.integer_types)):
+ firstkeyobj[secondkey], valid_map_types):
message = 'Mapping {0} has invalid property at {1}'
matches.append(RuleMatch(
['Mappings', mapname, firstkey, secondkey],
| {"golden_diff": "diff --git a/src/cfnlint/rules/mappings/Configuration.py b/src/cfnlint/rules/mappings/Configuration.py\n--- a/src/cfnlint/rules/mappings/Configuration.py\n+++ b/src/cfnlint/rules/mappings/Configuration.py\n@@ -32,6 +32,8 @@\n \n matches = []\n \n+ valid_map_types = (six.string_types, list, six.integer_types, float)\n+\n mappings = cfn.template.get('Mappings', {})\n if mappings:\n for mapname, mapobj in mappings.items():\n@@ -53,8 +55,7 @@\n else:\n for secondkey in firstkeyobj:\n if not isinstance(\n- firstkeyobj[secondkey],\n- (six.string_types, list, six.integer_types)):\n+ firstkeyobj[secondkey], valid_map_types):\n message = 'Mapping {0} has invalid property at {1}'\n matches.append(RuleMatch(\n ['Mappings', mapname, firstkey, secondkey],\n", "issue": "Double in mapping thrown E7001 error\n*cfn-lint version: cfn-lint 0.20.1*\r\n\r\n*Description of issue.*\r\nWhen a mapping value is a double (ex. 1.1) it returns the error `E7001:Mapping [map] has invalid property at [property]`\r\n\r\nExamples:\r\nWith double value:\r\n\r\n\r\nChanged to Int:\r\n\r\n\r\nExample CFT: [environment.yaml.txt](https://github.com/aws-cloudformation/cfn-python-lint/files/3179852/environment.yaml.txt)\r\n\n", "before_files": [{"content": "\"\"\"\n Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n\n Permission is hereby granted, free of charge, to any person obtaining a copy of this\n software and associated documentation files (the \"Software\"), to deal in the Software\n without restriction, including without limitation the rights to use, copy, modify,\n merge, publish, distribute, sublicense, and/or sell copies of the Software, and to\n permit persons to whom the Software is furnished to do so.\n\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,\n INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A\n PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT\n HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION\n OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\"\"\"\nimport six\nfrom cfnlint import CloudFormationLintRule\nfrom cfnlint import RuleMatch\n\n\nclass Configuration(CloudFormationLintRule):\n \"\"\"Check if Mappings are configured correctly\"\"\"\n id = 'E7001'\n shortdesc = 'Mappings are appropriately configured'\n description = 'Check if Mappings are properly configured'\n source_url = 'https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/mappings-section-structure.html'\n tags = ['mappings']\n\n def match(self, cfn):\n \"\"\"Check CloudFormation Parameters\"\"\"\n\n matches = []\n\n mappings = cfn.template.get('Mappings', {})\n if mappings:\n for mapname, mapobj in mappings.items():\n if not isinstance(mapobj, dict):\n message = 'Mapping {0} has invalid property'\n matches.append(RuleMatch(\n ['Mappings', mapname],\n message.format(mapname)\n ))\n else:\n for firstkey in mapobj:\n firstkeyobj = mapobj[firstkey]\n if not isinstance(firstkeyobj, dict):\n message = 'Mapping {0} has invalid property at {1}'\n matches.append(RuleMatch(\n ['Mappings', mapname, firstkey],\n message.format(mapname, firstkeyobj)\n ))\n else:\n for secondkey in firstkeyobj:\n if not isinstance(\n firstkeyobj[secondkey],\n (six.string_types, list, six.integer_types)):\n message = 'Mapping {0} has invalid property at {1}'\n matches.append(RuleMatch(\n ['Mappings', mapname, firstkey, secondkey],\n message.format(mapname, secondkey)\n ))\n\n return matches\n", "path": "src/cfnlint/rules/mappings/Configuration.py"}], "after_files": [{"content": "\"\"\"\n Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n\n Permission is hereby granted, free of charge, to any person obtaining a copy of this\n software and associated documentation files (the \"Software\"), to deal in the Software\n without restriction, including without limitation the rights to use, copy, modify,\n merge, publish, distribute, sublicense, and/or sell copies of the Software, and to\n permit persons to whom the Software is furnished to do so.\n\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,\n INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A\n PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT\n HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION\n OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\"\"\"\nimport six\nfrom cfnlint import CloudFormationLintRule\nfrom cfnlint import RuleMatch\n\n\nclass Configuration(CloudFormationLintRule):\n \"\"\"Check if Mappings are configured correctly\"\"\"\n id = 'E7001'\n shortdesc = 'Mappings are appropriately configured'\n description = 'Check if Mappings are properly configured'\n source_url = 'https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/mappings-section-structure.html'\n tags = ['mappings']\n\n def match(self, cfn):\n \"\"\"Check CloudFormation Parameters\"\"\"\n\n matches = []\n\n valid_map_types = (six.string_types, list, six.integer_types, float)\n\n mappings = cfn.template.get('Mappings', {})\n if mappings:\n for mapname, mapobj in mappings.items():\n if not isinstance(mapobj, dict):\n message = 'Mapping {0} has invalid property'\n matches.append(RuleMatch(\n ['Mappings', mapname],\n message.format(mapname)\n ))\n else:\n for firstkey in mapobj:\n firstkeyobj = mapobj[firstkey]\n if not isinstance(firstkeyobj, dict):\n message = 'Mapping {0} has invalid property at {1}'\n matches.append(RuleMatch(\n ['Mappings', mapname, firstkey],\n message.format(mapname, firstkeyobj)\n ))\n else:\n for secondkey in firstkeyobj:\n if not isinstance(\n firstkeyobj[secondkey], valid_map_types):\n message = 'Mapping {0} has invalid property at {1}'\n matches.append(RuleMatch(\n ['Mappings', mapname, firstkey, secondkey],\n message.format(mapname, secondkey)\n ))\n\n return matches\n", "path": "src/cfnlint/rules/mappings/Configuration.py"}]} | 1,197 | 215 |
gh_patches_debug_23136 | rasdani/github-patches | git_diff | ivy-llc__ivy-26394 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
index_add
index_add function
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/frontends/paddle/manipulation.py`
Content:
```
1 # global
2 import ivy
3 from ivy.functional.frontends.paddle.func_wrapper import (
4 to_ivy_arrays_and_back,
5 )
6 from ivy.func_wrapper import (
7 with_unsupported_dtypes,
8 with_supported_dtypes,
9 with_supported_device_and_dtypes,
10 )
11
12
13 @with_unsupported_dtypes({"2.5.2 and below": ("float16", "bfloat16")}, "paddle")
14 @to_ivy_arrays_and_back
15 def abs(x, name=None):
16 return ivy.abs(x)
17
18
19 @with_supported_dtypes(
20 {"2.5.2 and below": ("bool", "float32", "float64", "int32", "int64")},
21 "paddle",
22 )
23 @to_ivy_arrays_and_back
24 def broadcast_to(x, shape, name=None):
25 return ivy.broadcast_to(x, shape)
26
27
28 @with_supported_dtypes(
29 {
30 "2.5.2 and below": (
31 "bool",
32 "float16",
33 "float32",
34 "float64",
35 "int32",
36 "int64",
37 "uint8",
38 )
39 },
40 "paddle",
41 )
42 @to_ivy_arrays_and_back
43 def cast(x, dtype):
44 return ivy.astype(x, dtype)
45
46
47 @with_unsupported_dtypes({"2.5.2 and below": ("int8", "int16")}, "paddle")
48 @to_ivy_arrays_and_back
49 def concat(x, axis, name=None):
50 return ivy.concat(x, axis=axis)
51
52
53 @with_supported_dtypes(
54 {"2.5.2 and below": ("bool", "float32", "float64", "int32", "int64")},
55 "paddle",
56 )
57 @to_ivy_arrays_and_back
58 def expand(x, shape, name=None):
59 return ivy.expand(x, shape)
60
61
62 @with_unsupported_dtypes(
63 {"2.5.2 and below": ("int8", "uint8", "int16", "float16")},
64 "paddle",
65 )
66 @to_ivy_arrays_and_back
67 def flip(x, axis, name=None):
68 return ivy.flip(x, axis=axis)
69
70
71 @with_supported_dtypes(
72 {"2.5.2 and below": ("bool", "float32", "float64", "int32", "int64")},
73 "paddle",
74 )
75 @to_ivy_arrays_and_back
76 def gather(params, indices, axis=-1, batch_dims=0, name=None):
77 return ivy.gather(params, indices, axis=axis, batch_dims=batch_dims)
78
79
80 @with_unsupported_dtypes(
81 {"2.5.2 and below": ("int8", "uint8", "int16", "uint16", "float16", "bfloat16")},
82 "paddle",
83 )
84 @to_ivy_arrays_and_back
85 def gather_nd(x, index, name=None):
86 return ivy.gather_nd(x, index)
87
88
89 @to_ivy_arrays_and_back
90 def put_along_axis(arr, indices, values, axis, reduce="assign"):
91 result = ivy.put_along_axis(arr, indices, values, axis)
92 return result
93
94
95 @with_supported_dtypes(
96 {"2.5.2 and below": ("int32", "int64", "float32", "float64")},
97 "paddle",
98 )
99 @to_ivy_arrays_and_back
100 def repeat_interleave(x, repeats, axis=None, name=None):
101 return ivy.repeat(x, repeats, axis=axis)
102
103
104 @to_ivy_arrays_and_back
105 def reshape(x, shape, name=None):
106 return ivy.reshape(x, shape)
107
108
109 @with_supported_dtypes(
110 {
111 "2.5.0 and below": (
112 "float32",
113 "float64",
114 "int32",
115 "int64",
116 "complex64",
117 "complex128",
118 )
119 },
120 "paddle",
121 )
122 @to_ivy_arrays_and_back
123 def roll(x, shifts, axis=None, name=None):
124 return ivy.roll(x, shifts, axis=axis)
125
126
127 @with_supported_device_and_dtypes(
128 {
129 "2.5.2 and above": {
130 "cpu": (
131 "bool",
132 "int32",
133 "int64",
134 "float32",
135 "float64",
136 ),
137 "gpu": ("float16",),
138 },
139 },
140 "paddle",
141 )
142 @to_ivy_arrays_and_back
143 def rot90(x, k=1, axes=(0, 1), name=None):
144 return ivy.rot90(x, k=k, axes=axes)
145
146
147 @with_unsupported_dtypes(
148 {"2.5.2 and below": ("int16", "complex64", "complex128")},
149 "paddle",
150 )
151 @to_ivy_arrays_and_back
152 def split(x, num_or_sections, axis=0, name=None):
153 return ivy.split(x, num_or_size_splits=num_or_sections, axis=axis)
154
155
156 @with_unsupported_dtypes(
157 {"2.5.2 and below": ("float16", "bfloat16", "int8", "int16")},
158 "paddle",
159 )
160 @to_ivy_arrays_and_back
161 def squeeze(x, axis=None, name=None):
162 return ivy.squeeze(x, axis=axis)
163
164
165 @to_ivy_arrays_and_back
166 def stack(x, axis=0, name=None):
167 return ivy.stack(x, axis=axis)
168
169
170 def take_along_axis(arr, indices, axis):
171 return ivy.take_along_axis(arr, indices, axis)
172
173
174 @with_unsupported_dtypes(
175 {"2.5.2 and below": ("int8", "uint8", "int16", "float16")},
176 "paddle",
177 )
178 @to_ivy_arrays_and_back
179 def tile(x, repeat_times, name=None):
180 return ivy.tile(x, repeats=repeat_times)
181
182
183 @to_ivy_arrays_and_back
184 def tolist(x):
185 return ivy.to_list(x)
186
187
188 @with_supported_dtypes(
189 {"2.5.2 and below": ("bool", "int32", "int64", "float16", "float32", "float64")},
190 "paddle",
191 )
192 @to_ivy_arrays_and_back
193 def unbind(input, axis=0):
194 shape = list(input.shape)
195 num_splits = shape[axis]
196 shape.pop(axis)
197 return tuple(x.reshape(tuple(shape)) for x in split(input, num_splits, axis=axis))
198
199
200 @with_supported_dtypes(
201 {"2.5.2 and below": ("bool", "int32", "int64", "float16", "float32", "float64")},
202 "paddle",
203 )
204 @to_ivy_arrays_and_back
205 def unique_consecutive(x, axis=0):
206 return ivy.unique_consecutive(x, axis=axis)
207
208
209 @with_supported_dtypes(
210 {
211 "2.5.2 and below": (
212 "float32",
213 "float64",
214 "int32",
215 "int64",
216 )
217 },
218 "paddle",
219 )
220 @to_ivy_arrays_and_back
221 def unstack(x, axis=0, name=None):
222 return ivy.unstack(x, axis=axis)
223
224
225 absolute = abs
226
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ivy/functional/frontends/paddle/manipulation.py b/ivy/functional/frontends/paddle/manipulation.py
--- a/ivy/functional/frontends/paddle/manipulation.py
+++ b/ivy/functional/frontends/paddle/manipulation.py
@@ -86,6 +86,37 @@
return ivy.gather_nd(x, index)
+@with_supported_dtypes(
+ {"2.5.1 and below": ("bool", "int32", "int64", "float16", "float32", "float64")},
+ "paddle",
+)
+@to_ivy_arrays_and_back
+def index_add(x, index, axis, value, *, name=None):
+ x = ivy.swapaxes(x, axis, 0)
+ value = ivy.swapaxes(value, axis, 0)
+ _to_adds = []
+ index = sorted(zip(ivy.to_list(index), range(len(index))), key=(lambda i: i[0]))
+ while index:
+ _curr_idx = index[0][0]
+ while len(_to_adds) < _curr_idx:
+ _to_adds.append(ivy.zeros_like(value[0]))
+ _to_add_cum = ivy.get_item(value, index[0][1])
+ while (len(index)) > 1 and (index[0][0] == index[1][0]):
+ _to_add_cum = _to_add_cum + ivy.get_item(value, index.pop(1)[1])
+ index.pop(0)
+ _to_adds.append(_to_add_cum)
+ while len(_to_adds) < x.shape[0]:
+ _to_adds.append(ivy.zeros_like(value[0]))
+ _to_adds = ivy.stack(_to_adds)
+ if len(x.shape) < 2:
+ # Added this line due to the paddle backend treating scalars as 1-d arrays
+ _to_adds = ivy.flatten(_to_adds)
+
+ ret = ivy.add(x, _to_adds)
+ ret = ivy.swapaxes(ret, axis, 0)
+ return ret
+
+
@to_ivy_arrays_and_back
def put_along_axis(arr, indices, values, axis, reduce="assign"):
result = ivy.put_along_axis(arr, indices, values, axis)
| {"golden_diff": "diff --git a/ivy/functional/frontends/paddle/manipulation.py b/ivy/functional/frontends/paddle/manipulation.py\n--- a/ivy/functional/frontends/paddle/manipulation.py\n+++ b/ivy/functional/frontends/paddle/manipulation.py\n@@ -86,6 +86,37 @@\n return ivy.gather_nd(x, index)\n \n \n+@with_supported_dtypes(\n+ {\"2.5.1 and below\": (\"bool\", \"int32\", \"int64\", \"float16\", \"float32\", \"float64\")},\n+ \"paddle\",\n+)\n+@to_ivy_arrays_and_back\n+def index_add(x, index, axis, value, *, name=None):\n+ x = ivy.swapaxes(x, axis, 0)\n+ value = ivy.swapaxes(value, axis, 0)\n+ _to_adds = []\n+ index = sorted(zip(ivy.to_list(index), range(len(index))), key=(lambda i: i[0]))\n+ while index:\n+ _curr_idx = index[0][0]\n+ while len(_to_adds) < _curr_idx:\n+ _to_adds.append(ivy.zeros_like(value[0]))\n+ _to_add_cum = ivy.get_item(value, index[0][1])\n+ while (len(index)) > 1 and (index[0][0] == index[1][0]):\n+ _to_add_cum = _to_add_cum + ivy.get_item(value, index.pop(1)[1])\n+ index.pop(0)\n+ _to_adds.append(_to_add_cum)\n+ while len(_to_adds) < x.shape[0]:\n+ _to_adds.append(ivy.zeros_like(value[0]))\n+ _to_adds = ivy.stack(_to_adds)\n+ if len(x.shape) < 2:\n+ # Added this line due to the paddle backend treating scalars as 1-d arrays\n+ _to_adds = ivy.flatten(_to_adds)\n+\n+ ret = ivy.add(x, _to_adds)\n+ ret = ivy.swapaxes(ret, axis, 0)\n+ return ret\n+\n+\n @to_ivy_arrays_and_back\n def put_along_axis(arr, indices, values, axis, reduce=\"assign\"):\n result = ivy.put_along_axis(arr, indices, values, axis)\n", "issue": "index_add\nindex_add function\n", "before_files": [{"content": "# global\nimport ivy\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\nfrom ivy.func_wrapper import (\n with_unsupported_dtypes,\n with_supported_dtypes,\n with_supported_device_and_dtypes,\n)\n\n\n@with_unsupported_dtypes({\"2.5.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef abs(x, name=None):\n return ivy.abs(x)\n\n\n@with_supported_dtypes(\n {\"2.5.2 and below\": (\"bool\", \"float32\", \"float64\", \"int32\", \"int64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef broadcast_to(x, shape, name=None):\n return ivy.broadcast_to(x, shape)\n\n\n@with_supported_dtypes(\n {\n \"2.5.2 and below\": (\n \"bool\",\n \"float16\",\n \"float32\",\n \"float64\",\n \"int32\",\n \"int64\",\n \"uint8\",\n )\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef cast(x, dtype):\n return ivy.astype(x, dtype)\n\n\n@with_unsupported_dtypes({\"2.5.2 and below\": (\"int8\", \"int16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef concat(x, axis, name=None):\n return ivy.concat(x, axis=axis)\n\n\n@with_supported_dtypes(\n {\"2.5.2 and below\": (\"bool\", \"float32\", \"float64\", \"int32\", \"int64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef expand(x, shape, name=None):\n return ivy.expand(x, shape)\n\n\n@with_unsupported_dtypes(\n {\"2.5.2 and below\": (\"int8\", \"uint8\", \"int16\", \"float16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef flip(x, axis, name=None):\n return ivy.flip(x, axis=axis)\n\n\n@with_supported_dtypes(\n {\"2.5.2 and below\": (\"bool\", \"float32\", \"float64\", \"int32\", \"int64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef gather(params, indices, axis=-1, batch_dims=0, name=None):\n return ivy.gather(params, indices, axis=axis, batch_dims=batch_dims)\n\n\n@with_unsupported_dtypes(\n {\"2.5.2 and below\": (\"int8\", \"uint8\", \"int16\", \"uint16\", \"float16\", \"bfloat16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef gather_nd(x, index, name=None):\n return ivy.gather_nd(x, index)\n\n\n@to_ivy_arrays_and_back\ndef put_along_axis(arr, indices, values, axis, reduce=\"assign\"):\n result = ivy.put_along_axis(arr, indices, values, axis)\n return result\n\n\n@with_supported_dtypes(\n {\"2.5.2 and below\": (\"int32\", \"int64\", \"float32\", \"float64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef repeat_interleave(x, repeats, axis=None, name=None):\n return ivy.repeat(x, repeats, axis=axis)\n\n\n@to_ivy_arrays_and_back\ndef reshape(x, shape, name=None):\n return ivy.reshape(x, shape)\n\n\n@with_supported_dtypes(\n {\n \"2.5.0 and below\": (\n \"float32\",\n \"float64\",\n \"int32\",\n \"int64\",\n \"complex64\",\n \"complex128\",\n )\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef roll(x, shifts, axis=None, name=None):\n return ivy.roll(x, shifts, axis=axis)\n\n\n@with_supported_device_and_dtypes(\n {\n \"2.5.2 and above\": {\n \"cpu\": (\n \"bool\",\n \"int32\",\n \"int64\",\n \"float32\",\n \"float64\",\n ),\n \"gpu\": (\"float16\",),\n },\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef rot90(x, k=1, axes=(0, 1), name=None):\n return ivy.rot90(x, k=k, axes=axes)\n\n\n@with_unsupported_dtypes(\n {\"2.5.2 and below\": (\"int16\", \"complex64\", \"complex128\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef split(x, num_or_sections, axis=0, name=None):\n return ivy.split(x, num_or_size_splits=num_or_sections, axis=axis)\n\n\n@with_unsupported_dtypes(\n {\"2.5.2 and below\": (\"float16\", \"bfloat16\", \"int8\", \"int16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef squeeze(x, axis=None, name=None):\n return ivy.squeeze(x, axis=axis)\n\n\n@to_ivy_arrays_and_back\ndef stack(x, axis=0, name=None):\n return ivy.stack(x, axis=axis)\n\n\ndef take_along_axis(arr, indices, axis):\n return ivy.take_along_axis(arr, indices, axis)\n\n\n@with_unsupported_dtypes(\n {\"2.5.2 and below\": (\"int8\", \"uint8\", \"int16\", \"float16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef tile(x, repeat_times, name=None):\n return ivy.tile(x, repeats=repeat_times)\n\n\n@to_ivy_arrays_and_back\ndef tolist(x):\n return ivy.to_list(x)\n\n\n@with_supported_dtypes(\n {\"2.5.2 and below\": (\"bool\", \"int32\", \"int64\", \"float16\", \"float32\", \"float64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef unbind(input, axis=0):\n shape = list(input.shape)\n num_splits = shape[axis]\n shape.pop(axis)\n return tuple(x.reshape(tuple(shape)) for x in split(input, num_splits, axis=axis))\n\n\n@with_supported_dtypes(\n {\"2.5.2 and below\": (\"bool\", \"int32\", \"int64\", \"float16\", \"float32\", \"float64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef unique_consecutive(x, axis=0):\n return ivy.unique_consecutive(x, axis=axis)\n\n\n@with_supported_dtypes(\n {\n \"2.5.2 and below\": (\n \"float32\",\n \"float64\",\n \"int32\",\n \"int64\",\n )\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef unstack(x, axis=0, name=None):\n return ivy.unstack(x, axis=axis)\n\n\nabsolute = abs\n", "path": "ivy/functional/frontends/paddle/manipulation.py"}], "after_files": [{"content": "# global\nimport ivy\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\nfrom ivy.func_wrapper import (\n with_unsupported_dtypes,\n with_supported_dtypes,\n with_supported_device_and_dtypes,\n)\n\n\n@with_unsupported_dtypes({\"2.5.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef abs(x, name=None):\n return ivy.abs(x)\n\n\n@with_supported_dtypes(\n {\"2.5.2 and below\": (\"bool\", \"float32\", \"float64\", \"int32\", \"int64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef broadcast_to(x, shape, name=None):\n return ivy.broadcast_to(x, shape)\n\n\n@with_supported_dtypes(\n {\n \"2.5.2 and below\": (\n \"bool\",\n \"float16\",\n \"float32\",\n \"float64\",\n \"int32\",\n \"int64\",\n \"uint8\",\n )\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef cast(x, dtype):\n return ivy.astype(x, dtype)\n\n\n@with_unsupported_dtypes({\"2.5.2 and below\": (\"int8\", \"int16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef concat(x, axis, name=None):\n return ivy.concat(x, axis=axis)\n\n\n@with_supported_dtypes(\n {\"2.5.2 and below\": (\"bool\", \"float32\", \"float64\", \"int32\", \"int64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef expand(x, shape, name=None):\n return ivy.expand(x, shape)\n\n\n@with_unsupported_dtypes(\n {\"2.5.2 and below\": (\"int8\", \"uint8\", \"int16\", \"float16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef flip(x, axis, name=None):\n return ivy.flip(x, axis=axis)\n\n\n@with_supported_dtypes(\n {\"2.5.2 and below\": (\"bool\", \"float32\", \"float64\", \"int32\", \"int64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef gather(params, indices, axis=-1, batch_dims=0, name=None):\n return ivy.gather(params, indices, axis=axis, batch_dims=batch_dims)\n\n\n@with_unsupported_dtypes(\n {\"2.5.2 and below\": (\"int8\", \"uint8\", \"int16\", \"uint16\", \"float16\", \"bfloat16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef gather_nd(x, index, name=None):\n return ivy.gather_nd(x, index)\n\n\n@with_supported_dtypes(\n {\"2.5.1 and below\": (\"bool\", \"int32\", \"int64\", \"float16\", \"float32\", \"float64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef index_add(x, index, axis, value, *, name=None):\n x = ivy.swapaxes(x, axis, 0)\n value = ivy.swapaxes(value, axis, 0)\n _to_adds = []\n index = sorted(zip(ivy.to_list(index), range(len(index))), key=(lambda i: i[0]))\n while index:\n _curr_idx = index[0][0]\n while len(_to_adds) < _curr_idx:\n _to_adds.append(ivy.zeros_like(value[0]))\n _to_add_cum = ivy.get_item(value, index[0][1])\n while (len(index)) > 1 and (index[0][0] == index[1][0]):\n _to_add_cum = _to_add_cum + ivy.get_item(value, index.pop(1)[1])\n index.pop(0)\n _to_adds.append(_to_add_cum)\n while len(_to_adds) < x.shape[0]:\n _to_adds.append(ivy.zeros_like(value[0]))\n _to_adds = ivy.stack(_to_adds)\n if len(x.shape) < 2:\n # Added this line due to the paddle backend treating scalars as 1-d arrays\n _to_adds = ivy.flatten(_to_adds)\n\n ret = ivy.add(x, _to_adds)\n ret = ivy.swapaxes(ret, axis, 0)\n return ret\n\n\n@to_ivy_arrays_and_back\ndef put_along_axis(arr, indices, values, axis, reduce=\"assign\"):\n result = ivy.put_along_axis(arr, indices, values, axis)\n return result\n\n\n@with_supported_dtypes(\n {\"2.5.2 and below\": (\"int32\", \"int64\", \"float32\", \"float64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef repeat_interleave(x, repeats, axis=None, name=None):\n return ivy.repeat(x, repeats, axis=axis)\n\n\n@to_ivy_arrays_and_back\ndef reshape(x, shape, name=None):\n return ivy.reshape(x, shape)\n\n\n@with_supported_dtypes(\n {\n \"2.5.0 and below\": (\n \"float32\",\n \"float64\",\n \"int32\",\n \"int64\",\n \"complex64\",\n \"complex128\",\n )\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef roll(x, shifts, axis=None, name=None):\n return ivy.roll(x, shifts, axis=axis)\n\n\n@with_supported_device_and_dtypes(\n {\n \"2.5.2 and above\": {\n \"cpu\": (\n \"bool\",\n \"int32\",\n \"int64\",\n \"float32\",\n \"float64\",\n ),\n \"gpu\": (\"float16\",),\n },\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef rot90(x, k=1, axes=(0, 1), name=None):\n return ivy.rot90(x, k=k, axes=axes)\n\n\n@with_unsupported_dtypes(\n {\"2.5.2 and below\": (\"int16\", \"complex64\", \"complex128\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef split(x, num_or_sections, axis=0, name=None):\n return ivy.split(x, num_or_size_splits=num_or_sections, axis=axis)\n\n\n@with_unsupported_dtypes(\n {\"2.5.2 and below\": (\"float16\", \"bfloat16\", \"int8\", \"int16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef squeeze(x, axis=None, name=None):\n return ivy.squeeze(x, axis=axis)\n\n\n@to_ivy_arrays_and_back\ndef stack(x, axis=0, name=None):\n return ivy.stack(x, axis=axis)\n\n\ndef take_along_axis(arr, indices, axis):\n return ivy.take_along_axis(arr, indices, axis)\n\n\n@with_unsupported_dtypes(\n {\"2.5.2 and below\": (\"int8\", \"uint8\", \"int16\", \"float16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef tile(x, repeat_times, name=None):\n return ivy.tile(x, repeats=repeat_times)\n\n\n@to_ivy_arrays_and_back\ndef tolist(x):\n return ivy.to_list(x)\n\n\n@with_supported_dtypes(\n {\"2.5.2 and below\": (\"bool\", \"int32\", \"int64\", \"float16\", \"float32\", \"float64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef unbind(input, axis=0):\n shape = list(input.shape)\n num_splits = shape[axis]\n shape.pop(axis)\n return tuple(x.reshape(tuple(shape)) for x in split(input, num_splits, axis=axis))\n\n\n@with_supported_dtypes(\n {\"2.5.2 and below\": (\"bool\", \"int32\", \"int64\", \"float16\", \"float32\", \"float64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef unique_consecutive(x, axis=0):\n return ivy.unique_consecutive(x, axis=axis)\n\n\n@with_supported_dtypes(\n {\n \"2.5.2 and below\": (\n \"float32\",\n \"float64\",\n \"int32\",\n \"int64\",\n )\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef unstack(x, axis=0, name=None):\n return ivy.unstack(x, axis=axis)\n\n\nabsolute = abs\n", "path": "ivy/functional/frontends/paddle/manipulation.py"}]} | 2,475 | 531 |
gh_patches_debug_29326 | rasdani/github-patches | git_diff | cloudtools__troposphere-1102 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Prefix is optional for
I assume `True` here means that `Prefix` is mandatory:
```py
class S3DestinationConfiguration(AWSProperty):
props = {
'BucketARN': (basestring, True),
'BufferingHints': (BufferingHints, True),
'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),
'CompressionFormat': (basestring, True),
'EncryptionConfiguration': (EncryptionConfiguration, False),
'Prefix': (basestring, True),
'RoleARN': (basestring, True),
}
class ExtendedS3DestinationConfiguration(AWSProperty):
props = {
'BucketARN': (basestring, True),
'BufferingHints': (BufferingHints, True),
'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),
'CompressionFormat': (basestring, True),
'EncryptionConfiguration': (EncryptionConfiguration, False),
'Prefix': (basestring, True),
'ProcessingConfiguration': (ProcessingConfiguration, False),
'RoleARN': (basestring, True),
'S3BackupConfiguration': (S3DestinationConfiguration, False),
'S3BackupMode': (s3_backup_mode_extended_s3_validator, False),
}
```
However, [`Prefix` is optional](https://docs.aws.amazon.com/firehose/latest/APIReference/API_S3DestinationConfiguration.html).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `troposphere/firehose.py`
Content:
```
1 # Copyright (c) 2016-2017, troposphere project
2 # All rights reserved.
3 #
4 # See LICENSE file for full license.
5
6 from . import AWSObject, AWSProperty
7 from .validators import boolean, positive_integer
8
9
10 def processor_type_validator(x):
11 valid_types = ["Lambda"]
12 if x not in valid_types:
13 raise ValueError("Type must be one of: %s" %
14 ", ".join(valid_types))
15 return x
16
17
18 def delivery_stream_type_validator(x):
19 valid_types = ["DirectPut", "KinesisStreamAsSource"]
20 if x not in valid_types:
21 raise ValueError("DeliveryStreamType must be one of: %s" %
22 ", ".join(valid_types))
23 return x
24
25
26 def index_rotation_period_validator(x):
27 valid_types = ["NoRotation", "OneHour", "OneDay", "OneWeek", "OneMonth"]
28 if x not in valid_types:
29 raise ValueError("IndexRotationPeriod must be one of: %s" %
30 ", ".join(valid_types))
31 return x
32
33
34 def s3_backup_mode_elastic_search_validator(x):
35 valid_types = ["FailedDocumentsOnly", "AllDocuments"]
36 if x not in valid_types:
37 raise ValueError("S3BackupMode must be one of: %s" %
38 ", ".join(valid_types))
39 return x
40
41
42 def s3_backup_mode_extended_s3_validator(x):
43 valid_types = ["Disabled", "Enabled"]
44 if x not in valid_types:
45 raise ValueError("S3BackupMode must be one of: %s" %
46 ", ".join(valid_types))
47 return x
48
49
50 class BufferingHints(AWSProperty):
51 props = {
52 'IntervalInSeconds': (positive_integer, True),
53 'SizeInMBs': (positive_integer, True)
54 }
55
56
57 class CloudWatchLoggingOptions(AWSProperty):
58 props = {
59 'Enabled': (boolean, False),
60 'LogGroupName': (basestring, False), # Conditional
61 'LogStreamName': (basestring, False), # Conditional
62 }
63
64
65 class RetryOptions(AWSProperty):
66 props = {
67 'DurationInSeconds': (positive_integer, True),
68 }
69
70
71 class KMSEncryptionConfig(AWSProperty):
72 props = {
73 'AWSKMSKeyARN': (basestring, True),
74 }
75
76
77 class EncryptionConfiguration(AWSProperty):
78 props = {
79 'KMSEncryptionConfig': (KMSEncryptionConfig, False),
80 'NoEncryptionConfig': (basestring, False),
81 }
82
83
84 class S3Configuration(AWSProperty):
85 props = {
86 'BucketARN': (basestring, True),
87 'BufferingHints': (BufferingHints, True),
88 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),
89 'CompressionFormat': (basestring, True),
90 'EncryptionConfiguration': (EncryptionConfiguration, False),
91 'Prefix': (basestring, True),
92 'RoleARN': (basestring, True)
93 }
94
95
96 class CopyCommand(AWSProperty):
97 props = {
98 'CopyOptions': (basestring, False),
99 'DataTableColumns': (basestring, False),
100 'DataTableName': (basestring, True),
101 }
102
103
104 class ProcessorParameter(AWSProperty):
105 props = {
106 'ParameterName': (basestring, True),
107 'ParameterValue': (basestring, True),
108 }
109
110
111 class Processor(AWSProperty):
112 props = {
113 'Parameters': ([ProcessorParameter], True),
114 'Type': (processor_type_validator, True),
115 }
116
117
118 class ProcessingConfiguration(AWSProperty):
119 props = {
120 'Enabled': (boolean, True),
121 'Processors': ([Processor], True),
122 }
123
124
125 class ElasticsearchDestinationConfiguration(AWSProperty):
126 props = {
127 'BufferingHints': (BufferingHints, True),
128 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),
129 'DomainARN': (basestring, True),
130 'IndexName': (basestring, True),
131 'IndexRotationPeriod': (index_rotation_period_validator, True),
132 'ProcessingConfiguration': (ProcessingConfiguration, False),
133 'RetryOptions': (RetryOptions, False),
134 'RoleARN': (basestring, True),
135 'S3BackupMode': (s3_backup_mode_elastic_search_validator, True),
136 'S3Configuration': (S3Configuration, False),
137 'TypeName': (basestring, True),
138 }
139
140
141 class RedshiftDestinationConfiguration(AWSProperty):
142 props = {
143 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),
144 'ClusterJDBCURL': (basestring, True),
145 'CopyCommand': (CopyCommand, True),
146 'Password': (basestring, True),
147 'ProcessingConfiguration': (ProcessingConfiguration, False),
148 'RoleARN': (basestring, True),
149 'S3Configuration': (S3Configuration, True),
150 'Username': (basestring, True),
151 }
152
153
154 class S3DestinationConfiguration(AWSProperty):
155 props = {
156 'BucketARN': (basestring, True),
157 'BufferingHints': (BufferingHints, True),
158 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),
159 'CompressionFormat': (basestring, True),
160 'EncryptionConfiguration': (EncryptionConfiguration, False),
161 'Prefix': (basestring, True),
162 'RoleARN': (basestring, True),
163 }
164
165
166 class ExtendedS3DestinationConfiguration(AWSProperty):
167 props = {
168 'BucketARN': (basestring, True),
169 'BufferingHints': (BufferingHints, True),
170 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),
171 'CompressionFormat': (basestring, True),
172 'EncryptionConfiguration': (EncryptionConfiguration, False),
173 'Prefix': (basestring, True),
174 'ProcessingConfiguration': (ProcessingConfiguration, False),
175 'RoleARN': (basestring, True),
176 'S3BackupConfiguration': (S3DestinationConfiguration, False),
177 'S3BackupMode': (s3_backup_mode_extended_s3_validator, False),
178 }
179
180
181 class KinesisStreamSourceConfiguration(AWSProperty):
182 props = {
183 'KinesisStreamARN': (basestring, True),
184 'RoleARN': (basestring, True)
185 }
186
187
188 class SplunkRetryOptions(AWSProperty):
189 props = {
190 'DurationInSeconds': (positive_integer, True),
191 }
192
193
194 class SplunkDestinationConfiguration(AWSProperty):
195 props = {
196 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),
197 'HECAcknowledgmentTimeoutInSeconds': (positive_integer, False),
198 'HECEndpoint': (basestring, True),
199 'HECEndpointType': (basestring, True),
200 'HECToken': (basestring, True),
201 'ProcessingConfiguration': (ProcessingConfiguration, False),
202 'RetryOptions': (SplunkRetryOptions, False),
203 'S3BackupMode': (basestring, False),
204 'S3Configuration': (S3DestinationConfiguration, True),
205 }
206
207
208 class DeliveryStream(AWSObject):
209 resource_type = "AWS::KinesisFirehose::DeliveryStream"
210
211 props = {
212 'DeliveryStreamName': (basestring, False),
213 'DeliveryStreamType': (delivery_stream_type_validator, False),
214 'ElasticsearchDestinationConfiguration': (ElasticsearchDestinationConfiguration, False), # noqa
215 'ExtendedS3DestinationConfiguration': (ExtendedS3DestinationConfiguration, False), # noqa
216 'KinesisStreamSourceConfiguration': (KinesisStreamSourceConfiguration, False), # noqa
217 'RedshiftDestinationConfiguration': (RedshiftDestinationConfiguration, False), # noqa
218 'S3DestinationConfiguration': (S3DestinationConfiguration, False),
219 'SplunkDestinationConfiguration':
220 (SplunkDestinationConfiguration, False),
221 }
222
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/troposphere/firehose.py b/troposphere/firehose.py
--- a/troposphere/firehose.py
+++ b/troposphere/firehose.py
@@ -88,7 +88,7 @@
'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),
'CompressionFormat': (basestring, True),
'EncryptionConfiguration': (EncryptionConfiguration, False),
- 'Prefix': (basestring, True),
+ 'Prefix': (basestring, False),
'RoleARN': (basestring, True)
}
@@ -158,7 +158,7 @@
'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),
'CompressionFormat': (basestring, True),
'EncryptionConfiguration': (EncryptionConfiguration, False),
- 'Prefix': (basestring, True),
+ 'Prefix': (basestring, False),
'RoleARN': (basestring, True),
}
@@ -170,7 +170,7 @@
'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),
'CompressionFormat': (basestring, True),
'EncryptionConfiguration': (EncryptionConfiguration, False),
- 'Prefix': (basestring, True),
+ 'Prefix': (basestring, False),
'ProcessingConfiguration': (ProcessingConfiguration, False),
'RoleARN': (basestring, True),
'S3BackupConfiguration': (S3DestinationConfiguration, False),
| {"golden_diff": "diff --git a/troposphere/firehose.py b/troposphere/firehose.py\n--- a/troposphere/firehose.py\n+++ b/troposphere/firehose.py\n@@ -88,7 +88,7 @@\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'CompressionFormat': (basestring, True),\n 'EncryptionConfiguration': (EncryptionConfiguration, False),\n- 'Prefix': (basestring, True),\n+ 'Prefix': (basestring, False),\n 'RoleARN': (basestring, True)\n }\n \n@@ -158,7 +158,7 @@\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'CompressionFormat': (basestring, True),\n 'EncryptionConfiguration': (EncryptionConfiguration, False),\n- 'Prefix': (basestring, True),\n+ 'Prefix': (basestring, False),\n 'RoleARN': (basestring, True),\n }\n \n@@ -170,7 +170,7 @@\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'CompressionFormat': (basestring, True),\n 'EncryptionConfiguration': (EncryptionConfiguration, False),\n- 'Prefix': (basestring, True),\n+ 'Prefix': (basestring, False),\n 'ProcessingConfiguration': (ProcessingConfiguration, False),\n 'RoleARN': (basestring, True),\n 'S3BackupConfiguration': (S3DestinationConfiguration, False),\n", "issue": "Prefix is optional for \nI assume `True` here means that `Prefix` is mandatory:\r\n\r\n```py\r\nclass S3DestinationConfiguration(AWSProperty):\r\n props = {\r\n 'BucketARN': (basestring, True),\r\n 'BufferingHints': (BufferingHints, True),\r\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\r\n 'CompressionFormat': (basestring, True),\r\n 'EncryptionConfiguration': (EncryptionConfiguration, False),\r\n 'Prefix': (basestring, True),\r\n 'RoleARN': (basestring, True),\r\n }\r\n\r\nclass ExtendedS3DestinationConfiguration(AWSProperty):\r\n props = {\r\n 'BucketARN': (basestring, True),\r\n 'BufferingHints': (BufferingHints, True),\r\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\r\n 'CompressionFormat': (basestring, True),\r\n 'EncryptionConfiguration': (EncryptionConfiguration, False),\r\n 'Prefix': (basestring, True),\r\n 'ProcessingConfiguration': (ProcessingConfiguration, False),\r\n 'RoleARN': (basestring, True),\r\n 'S3BackupConfiguration': (S3DestinationConfiguration, False),\r\n 'S3BackupMode': (s3_backup_mode_extended_s3_validator, False),\r\n }\r\n```\r\nHowever, [`Prefix` is optional](https://docs.aws.amazon.com/firehose/latest/APIReference/API_S3DestinationConfiguration.html).\n", "before_files": [{"content": "# Copyright (c) 2016-2017, troposphere project\n# All rights reserved.\n#\n# See LICENSE file for full license.\n\nfrom . import AWSObject, AWSProperty\nfrom .validators import boolean, positive_integer\n\n\ndef processor_type_validator(x):\n valid_types = [\"Lambda\"]\n if x not in valid_types:\n raise ValueError(\"Type must be one of: %s\" %\n \", \".join(valid_types))\n return x\n\n\ndef delivery_stream_type_validator(x):\n valid_types = [\"DirectPut\", \"KinesisStreamAsSource\"]\n if x not in valid_types:\n raise ValueError(\"DeliveryStreamType must be one of: %s\" %\n \", \".join(valid_types))\n return x\n\n\ndef index_rotation_period_validator(x):\n valid_types = [\"NoRotation\", \"OneHour\", \"OneDay\", \"OneWeek\", \"OneMonth\"]\n if x not in valid_types:\n raise ValueError(\"IndexRotationPeriod must be one of: %s\" %\n \", \".join(valid_types))\n return x\n\n\ndef s3_backup_mode_elastic_search_validator(x):\n valid_types = [\"FailedDocumentsOnly\", \"AllDocuments\"]\n if x not in valid_types:\n raise ValueError(\"S3BackupMode must be one of: %s\" %\n \", \".join(valid_types))\n return x\n\n\ndef s3_backup_mode_extended_s3_validator(x):\n valid_types = [\"Disabled\", \"Enabled\"]\n if x not in valid_types:\n raise ValueError(\"S3BackupMode must be one of: %s\" %\n \", \".join(valid_types))\n return x\n\n\nclass BufferingHints(AWSProperty):\n props = {\n 'IntervalInSeconds': (positive_integer, True),\n 'SizeInMBs': (positive_integer, True)\n }\n\n\nclass CloudWatchLoggingOptions(AWSProperty):\n props = {\n 'Enabled': (boolean, False),\n 'LogGroupName': (basestring, False), # Conditional\n 'LogStreamName': (basestring, False), # Conditional\n }\n\n\nclass RetryOptions(AWSProperty):\n props = {\n 'DurationInSeconds': (positive_integer, True),\n }\n\n\nclass KMSEncryptionConfig(AWSProperty):\n props = {\n 'AWSKMSKeyARN': (basestring, True),\n }\n\n\nclass EncryptionConfiguration(AWSProperty):\n props = {\n 'KMSEncryptionConfig': (KMSEncryptionConfig, False),\n 'NoEncryptionConfig': (basestring, False),\n }\n\n\nclass S3Configuration(AWSProperty):\n props = {\n 'BucketARN': (basestring, True),\n 'BufferingHints': (BufferingHints, True),\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'CompressionFormat': (basestring, True),\n 'EncryptionConfiguration': (EncryptionConfiguration, False),\n 'Prefix': (basestring, True),\n 'RoleARN': (basestring, True)\n }\n\n\nclass CopyCommand(AWSProperty):\n props = {\n 'CopyOptions': (basestring, False),\n 'DataTableColumns': (basestring, False),\n 'DataTableName': (basestring, True),\n }\n\n\nclass ProcessorParameter(AWSProperty):\n props = {\n 'ParameterName': (basestring, True),\n 'ParameterValue': (basestring, True),\n }\n\n\nclass Processor(AWSProperty):\n props = {\n 'Parameters': ([ProcessorParameter], True),\n 'Type': (processor_type_validator, True),\n }\n\n\nclass ProcessingConfiguration(AWSProperty):\n props = {\n 'Enabled': (boolean, True),\n 'Processors': ([Processor], True),\n }\n\n\nclass ElasticsearchDestinationConfiguration(AWSProperty):\n props = {\n 'BufferingHints': (BufferingHints, True),\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'DomainARN': (basestring, True),\n 'IndexName': (basestring, True),\n 'IndexRotationPeriod': (index_rotation_period_validator, True),\n 'ProcessingConfiguration': (ProcessingConfiguration, False),\n 'RetryOptions': (RetryOptions, False),\n 'RoleARN': (basestring, True),\n 'S3BackupMode': (s3_backup_mode_elastic_search_validator, True),\n 'S3Configuration': (S3Configuration, False),\n 'TypeName': (basestring, True),\n }\n\n\nclass RedshiftDestinationConfiguration(AWSProperty):\n props = {\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'ClusterJDBCURL': (basestring, True),\n 'CopyCommand': (CopyCommand, True),\n 'Password': (basestring, True),\n 'ProcessingConfiguration': (ProcessingConfiguration, False),\n 'RoleARN': (basestring, True),\n 'S3Configuration': (S3Configuration, True),\n 'Username': (basestring, True),\n }\n\n\nclass S3DestinationConfiguration(AWSProperty):\n props = {\n 'BucketARN': (basestring, True),\n 'BufferingHints': (BufferingHints, True),\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'CompressionFormat': (basestring, True),\n 'EncryptionConfiguration': (EncryptionConfiguration, False),\n 'Prefix': (basestring, True),\n 'RoleARN': (basestring, True),\n }\n\n\nclass ExtendedS3DestinationConfiguration(AWSProperty):\n props = {\n 'BucketARN': (basestring, True),\n 'BufferingHints': (BufferingHints, True),\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'CompressionFormat': (basestring, True),\n 'EncryptionConfiguration': (EncryptionConfiguration, False),\n 'Prefix': (basestring, True),\n 'ProcessingConfiguration': (ProcessingConfiguration, False),\n 'RoleARN': (basestring, True),\n 'S3BackupConfiguration': (S3DestinationConfiguration, False),\n 'S3BackupMode': (s3_backup_mode_extended_s3_validator, False),\n }\n\n\nclass KinesisStreamSourceConfiguration(AWSProperty):\n props = {\n 'KinesisStreamARN': (basestring, True),\n 'RoleARN': (basestring, True)\n }\n\n\nclass SplunkRetryOptions(AWSProperty):\n props = {\n 'DurationInSeconds': (positive_integer, True),\n }\n\n\nclass SplunkDestinationConfiguration(AWSProperty):\n props = {\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'HECAcknowledgmentTimeoutInSeconds': (positive_integer, False),\n 'HECEndpoint': (basestring, True),\n 'HECEndpointType': (basestring, True),\n 'HECToken': (basestring, True),\n 'ProcessingConfiguration': (ProcessingConfiguration, False),\n 'RetryOptions': (SplunkRetryOptions, False),\n 'S3BackupMode': (basestring, False),\n 'S3Configuration': (S3DestinationConfiguration, True),\n }\n\n\nclass DeliveryStream(AWSObject):\n resource_type = \"AWS::KinesisFirehose::DeliveryStream\"\n\n props = {\n 'DeliveryStreamName': (basestring, False),\n 'DeliveryStreamType': (delivery_stream_type_validator, False),\n 'ElasticsearchDestinationConfiguration': (ElasticsearchDestinationConfiguration, False), # noqa\n 'ExtendedS3DestinationConfiguration': (ExtendedS3DestinationConfiguration, False), # noqa\n 'KinesisStreamSourceConfiguration': (KinesisStreamSourceConfiguration, False), # noqa\n 'RedshiftDestinationConfiguration': (RedshiftDestinationConfiguration, False), # noqa\n 'S3DestinationConfiguration': (S3DestinationConfiguration, False),\n 'SplunkDestinationConfiguration':\n (SplunkDestinationConfiguration, False),\n }\n", "path": "troposphere/firehose.py"}], "after_files": [{"content": "# Copyright (c) 2016-2017, troposphere project\n# All rights reserved.\n#\n# See LICENSE file for full license.\n\nfrom . import AWSObject, AWSProperty\nfrom .validators import boolean, positive_integer\n\n\ndef processor_type_validator(x):\n valid_types = [\"Lambda\"]\n if x not in valid_types:\n raise ValueError(\"Type must be one of: %s\" %\n \", \".join(valid_types))\n return x\n\n\ndef delivery_stream_type_validator(x):\n valid_types = [\"DirectPut\", \"KinesisStreamAsSource\"]\n if x not in valid_types:\n raise ValueError(\"DeliveryStreamType must be one of: %s\" %\n \", \".join(valid_types))\n return x\n\n\ndef index_rotation_period_validator(x):\n valid_types = [\"NoRotation\", \"OneHour\", \"OneDay\", \"OneWeek\", \"OneMonth\"]\n if x not in valid_types:\n raise ValueError(\"IndexRotationPeriod must be one of: %s\" %\n \", \".join(valid_types))\n return x\n\n\ndef s3_backup_mode_elastic_search_validator(x):\n valid_types = [\"FailedDocumentsOnly\", \"AllDocuments\"]\n if x not in valid_types:\n raise ValueError(\"S3BackupMode must be one of: %s\" %\n \", \".join(valid_types))\n return x\n\n\ndef s3_backup_mode_extended_s3_validator(x):\n valid_types = [\"Disabled\", \"Enabled\"]\n if x not in valid_types:\n raise ValueError(\"S3BackupMode must be one of: %s\" %\n \", \".join(valid_types))\n return x\n\n\nclass BufferingHints(AWSProperty):\n props = {\n 'IntervalInSeconds': (positive_integer, True),\n 'SizeInMBs': (positive_integer, True)\n }\n\n\nclass CloudWatchLoggingOptions(AWSProperty):\n props = {\n 'Enabled': (boolean, False),\n 'LogGroupName': (basestring, False), # Conditional\n 'LogStreamName': (basestring, False), # Conditional\n }\n\n\nclass RetryOptions(AWSProperty):\n props = {\n 'DurationInSeconds': (positive_integer, True),\n }\n\n\nclass KMSEncryptionConfig(AWSProperty):\n props = {\n 'AWSKMSKeyARN': (basestring, True),\n }\n\n\nclass EncryptionConfiguration(AWSProperty):\n props = {\n 'KMSEncryptionConfig': (KMSEncryptionConfig, False),\n 'NoEncryptionConfig': (basestring, False),\n }\n\n\nclass S3Configuration(AWSProperty):\n props = {\n 'BucketARN': (basestring, True),\n 'BufferingHints': (BufferingHints, True),\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'CompressionFormat': (basestring, True),\n 'EncryptionConfiguration': (EncryptionConfiguration, False),\n 'Prefix': (basestring, False),\n 'RoleARN': (basestring, True)\n }\n\n\nclass CopyCommand(AWSProperty):\n props = {\n 'CopyOptions': (basestring, False),\n 'DataTableColumns': (basestring, False),\n 'DataTableName': (basestring, True),\n }\n\n\nclass ProcessorParameter(AWSProperty):\n props = {\n 'ParameterName': (basestring, True),\n 'ParameterValue': (basestring, True),\n }\n\n\nclass Processor(AWSProperty):\n props = {\n 'Parameters': ([ProcessorParameter], True),\n 'Type': (processor_type_validator, True),\n }\n\n\nclass ProcessingConfiguration(AWSProperty):\n props = {\n 'Enabled': (boolean, True),\n 'Processors': ([Processor], True),\n }\n\n\nclass ElasticsearchDestinationConfiguration(AWSProperty):\n props = {\n 'BufferingHints': (BufferingHints, True),\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'DomainARN': (basestring, True),\n 'IndexName': (basestring, True),\n 'IndexRotationPeriod': (index_rotation_period_validator, True),\n 'ProcessingConfiguration': (ProcessingConfiguration, False),\n 'RetryOptions': (RetryOptions, False),\n 'RoleARN': (basestring, True),\n 'S3BackupMode': (s3_backup_mode_elastic_search_validator, True),\n 'S3Configuration': (S3Configuration, False),\n 'TypeName': (basestring, True),\n }\n\n\nclass RedshiftDestinationConfiguration(AWSProperty):\n props = {\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'ClusterJDBCURL': (basestring, True),\n 'CopyCommand': (CopyCommand, True),\n 'Password': (basestring, True),\n 'ProcessingConfiguration': (ProcessingConfiguration, False),\n 'RoleARN': (basestring, True),\n 'S3Configuration': (S3Configuration, True),\n 'Username': (basestring, True),\n }\n\n\nclass S3DestinationConfiguration(AWSProperty):\n props = {\n 'BucketARN': (basestring, True),\n 'BufferingHints': (BufferingHints, True),\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'CompressionFormat': (basestring, True),\n 'EncryptionConfiguration': (EncryptionConfiguration, False),\n 'Prefix': (basestring, False),\n 'RoleARN': (basestring, True),\n }\n\n\nclass ExtendedS3DestinationConfiguration(AWSProperty):\n props = {\n 'BucketARN': (basestring, True),\n 'BufferingHints': (BufferingHints, True),\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'CompressionFormat': (basestring, True),\n 'EncryptionConfiguration': (EncryptionConfiguration, False),\n 'Prefix': (basestring, False),\n 'ProcessingConfiguration': (ProcessingConfiguration, False),\n 'RoleARN': (basestring, True),\n 'S3BackupConfiguration': (S3DestinationConfiguration, False),\n 'S3BackupMode': (s3_backup_mode_extended_s3_validator, False),\n }\n\n\nclass KinesisStreamSourceConfiguration(AWSProperty):\n props = {\n 'KinesisStreamARN': (basestring, True),\n 'RoleARN': (basestring, True)\n }\n\n\nclass SplunkRetryOptions(AWSProperty):\n props = {\n 'DurationInSeconds': (positive_integer, True),\n }\n\n\nclass SplunkDestinationConfiguration(AWSProperty):\n props = {\n 'CloudWatchLoggingOptions': (CloudWatchLoggingOptions, False),\n 'HECAcknowledgmentTimeoutInSeconds': (positive_integer, False),\n 'HECEndpoint': (basestring, True),\n 'HECEndpointType': (basestring, True),\n 'HECToken': (basestring, True),\n 'ProcessingConfiguration': (ProcessingConfiguration, False),\n 'RetryOptions': (SplunkRetryOptions, False),\n 'S3BackupMode': (basestring, False),\n 'S3Configuration': (S3DestinationConfiguration, True),\n }\n\n\nclass DeliveryStream(AWSObject):\n resource_type = \"AWS::KinesisFirehose::DeliveryStream\"\n\n props = {\n 'DeliveryStreamName': (basestring, False),\n 'DeliveryStreamType': (delivery_stream_type_validator, False),\n 'ElasticsearchDestinationConfiguration': (ElasticsearchDestinationConfiguration, False), # noqa\n 'ExtendedS3DestinationConfiguration': (ExtendedS3DestinationConfiguration, False), # noqa\n 'KinesisStreamSourceConfiguration': (KinesisStreamSourceConfiguration, False), # noqa\n 'RedshiftDestinationConfiguration': (RedshiftDestinationConfiguration, False), # noqa\n 'S3DestinationConfiguration': (S3DestinationConfiguration, False),\n 'SplunkDestinationConfiguration':\n (SplunkDestinationConfiguration, False),\n }\n", "path": "troposphere/firehose.py"}]} | 2,798 | 322 |
gh_patches_debug_21935 | rasdani/github-patches | git_diff | cisagov__manage.get.gov-1321 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Modify domain invitation script to process "friends of the show" first, then incrementally process others
### Issue description
We have a few domain managers that we'll invite to the registrar first. Let's modify the domain invitation script to send to a defined set of domains first.
We also shouldn't blast thousands of emails out to the internet, but incrementally roll them out.
### Acceptance criteria
- [ ] Invitation script works with a product owner-specified list of domains/contacts (before sending to everyone else)
- [ ] Script slow rolls out invitations. Could be percentage-based (1/2/5/10/20/45/75/100) or volume-based (a few hundred at a time)
### Additional context
_No response_
### Links to other issues
🔄 Related to PR #1038
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/registrar/management/commands/send_domain_invitations.py`
Content:
```
1 """Data migration: Send domain invitations once to existing customers."""
2
3 import logging
4 import copy
5
6 from django.core.management import BaseCommand
7 from registrar.models import TransitionDomain
8 from ...utility.email import send_templated_email, EmailSendingError
9 from typing import List
10
11 logger = logging.getLogger(__name__)
12
13
14 class Command(BaseCommand):
15 help = "Send domain invitations once to existing customers."
16
17 # this array is used to store and process the transition_domains
18 transition_domains: List[str] = []
19 # this array is used to store domains with errors, which are not
20 # sent emails; this array is used to update the succesful
21 # transition_domains to email_sent=True, and also to report
22 # out errors
23 domains_with_errors: List[str] = []
24 # this array is used to store email_context; each item in the array
25 # contains the context for a single email; single emails may be 1
26 # or more transition_domains, as they are grouped by username
27 emails_to_send: List[str] = []
28
29 def add_arguments(self, parser):
30 """Add command line arguments."""
31 parser.add_argument(
32 "-s",
33 "--send_emails",
34 action="store_true",
35 default=False,
36 dest="send_emails",
37 help="Send emails ",
38 )
39
40 def handle(self, **options):
41 """Process the objects in TransitionDomain."""
42
43 logger.info("checking domains and preparing emails")
44 # Get all TransitionDomain objects
45 self.transition_domains = TransitionDomain.objects.filter(
46 email_sent=False,
47 ).order_by("username")
48 logger.info("Found %d transition domains", len(self.transition_domains))
49
50 self.build_emails_to_send_array()
51 logger.info("Prepared %d emails to send", len(self.emails_to_send))
52
53 if options["send_emails"]:
54 logger.info("about to send emails")
55 self.send_emails()
56 logger.info("done sending emails")
57
58 self.update_domains_as_sent()
59
60 logger.info("done sending emails and updating transition_domains")
61 else:
62 logger.info("not sending emails")
63 for email_context in self.emails_to_send:
64 logger.info(
65 "would send email to %s for %s",
66 email_context["email"],
67 email_context["domains"],
68 )
69
70 def build_emails_to_send_array(self):
71 """this method sends emails to distinct usernames"""
72
73 # data structure to hold email context for a single email;
74 # transition_domains ordered by username, a single email_context
75 # may include information from more than one transition_domain
76 email_context = {"email": ""}
77
78 # loop through all transition_domains; group them by username
79 # into emails_to_send_array
80 for transition_domain in self.transition_domains:
81 # attempt to get the domain from domain objects; if there is
82 # an error getting the domain, skip this domain and add it to
83 # domains_with_errors
84 try:
85 # if prior username does not match current username
86 if not email_context["email"] or email_context["email"] != transition_domain.username:
87 # if not first in list of transition_domains
88 if email_context["email"]:
89 # append the email context to the emails_to_send array
90 self.emails_to_send.append(copy.deepcopy(email_context))
91 email_context["domains"] = []
92 email_context["email"] = transition_domain.username
93 email_context["domains"].append(transition_domain.domain_name)
94 except Exception as err:
95 # error condition if domain not in database
96 self.domains_with_errors.append(copy.deepcopy(transition_domain.domain_name))
97 logger.error(f"error retrieving domain {transition_domain.domain_name}: {err}")
98 # if there are at least one more transition domains than errors,
99 # then append one more item
100 if len(self.transition_domains) > len(self.domains_with_errors):
101 self.emails_to_send.append(email_context)
102
103 def send_emails(self):
104 if len(self.emails_to_send) > 0:
105 for email_data in self.emails_to_send:
106 self.send_email(email_data)
107 else:
108 logger.info("no emails to send")
109
110 def send_email(self, email_data):
111 try:
112 send_templated_email(
113 "emails/transition_domain_invitation.txt",
114 "emails/transition_domain_invitation_subject.txt",
115 to_address=email_data["email"],
116 context={
117 "domains": email_data["domains"],
118 },
119 )
120 # success message is logged
121 logger.info(
122 f"email sent successfully to {email_data['email']} for "
123 f"{[domain for domain in email_data['domains']]}"
124 )
125 except EmailSendingError as err:
126 logger.error(
127 f"email did not send successfully to {email_data['email']} "
128 f"for {[domain for domain in email_data['domains']]}"
129 f": {err}"
130 )
131 # if email failed to send, set error in domains_with_errors for each
132 # domain in the email so that transition domain email_sent is not set
133 # to True
134 for domain in email_data["domains"]:
135 self.domains_with_errors.append(domain)
136
137 def update_domains_as_sent(self):
138 """set email_sent to True in all transition_domains which have
139 been processed successfully"""
140 for transition_domain in self.transition_domains:
141 if transition_domain.domain_name not in self.domains_with_errors:
142 transition_domain.email_sent = True
143 transition_domain.save()
144
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/registrar/management/commands/send_domain_invitations.py b/src/registrar/management/commands/send_domain_invitations.py
--- a/src/registrar/management/commands/send_domain_invitations.py
+++ b/src/registrar/management/commands/send_domain_invitations.py
@@ -37,14 +37,24 @@
help="Send emails ",
)
+ parser.add_argument("emails", nargs="*", help="Email addresses to send invitations to")
+
def handle(self, **options):
"""Process the objects in TransitionDomain."""
logger.info("checking domains and preparing emails")
- # Get all TransitionDomain objects
- self.transition_domains = TransitionDomain.objects.filter(
- email_sent=False,
- ).order_by("username")
+
+ if options["emails"]:
+ # this option is a list of email addresses
+ self.transition_domains = TransitionDomain.objects.filter(
+ username__in=options["emails"],
+ email_sent=False,
+ ).order_by("username")
+ else:
+ # Get all TransitionDomain objects
+ self.transition_domains = TransitionDomain.objects.filter(
+ email_sent=False,
+ ).order_by("username")
logger.info("Found %d transition domains", len(self.transition_domains))
self.build_emails_to_send_array()
| {"golden_diff": "diff --git a/src/registrar/management/commands/send_domain_invitations.py b/src/registrar/management/commands/send_domain_invitations.py\n--- a/src/registrar/management/commands/send_domain_invitations.py\n+++ b/src/registrar/management/commands/send_domain_invitations.py\n@@ -37,14 +37,24 @@\n help=\"Send emails \",\n )\n \n+ parser.add_argument(\"emails\", nargs=\"*\", help=\"Email addresses to send invitations to\")\n+\n def handle(self, **options):\n \"\"\"Process the objects in TransitionDomain.\"\"\"\n \n logger.info(\"checking domains and preparing emails\")\n- # Get all TransitionDomain objects\n- self.transition_domains = TransitionDomain.objects.filter(\n- email_sent=False,\n- ).order_by(\"username\")\n+\n+ if options[\"emails\"]:\n+ # this option is a list of email addresses\n+ self.transition_domains = TransitionDomain.objects.filter(\n+ username__in=options[\"emails\"],\n+ email_sent=False,\n+ ).order_by(\"username\")\n+ else:\n+ # Get all TransitionDomain objects\n+ self.transition_domains = TransitionDomain.objects.filter(\n+ email_sent=False,\n+ ).order_by(\"username\")\n logger.info(\"Found %d transition domains\", len(self.transition_domains))\n \n self.build_emails_to_send_array()\n", "issue": "Modify domain invitation script to process \"friends of the show\" first, then incrementally process others\n### Issue description\n\nWe have a few domain managers that we'll invite to the registrar first. Let's modify the domain invitation script to send to a defined set of domains first. \r\n\r\nWe also shouldn't blast thousands of emails out to the internet, but incrementally roll them out.\n\n### Acceptance criteria\n\n- [ ] Invitation script works with a product owner-specified list of domains/contacts (before sending to everyone else)\r\n- [ ] Script slow rolls out invitations. Could be percentage-based (1/2/5/10/20/45/75/100) or volume-based (a few hundred at a time)\n\n### Additional context\n\n_No response_\n\n### Links to other issues\n\n\ud83d\udd04 Related to PR #1038\n", "before_files": [{"content": "\"\"\"Data migration: Send domain invitations once to existing customers.\"\"\"\n\nimport logging\nimport copy\n\nfrom django.core.management import BaseCommand\nfrom registrar.models import TransitionDomain\nfrom ...utility.email import send_templated_email, EmailSendingError\nfrom typing import List\n\nlogger = logging.getLogger(__name__)\n\n\nclass Command(BaseCommand):\n help = \"Send domain invitations once to existing customers.\"\n\n # this array is used to store and process the transition_domains\n transition_domains: List[str] = []\n # this array is used to store domains with errors, which are not\n # sent emails; this array is used to update the succesful\n # transition_domains to email_sent=True, and also to report\n # out errors\n domains_with_errors: List[str] = []\n # this array is used to store email_context; each item in the array\n # contains the context for a single email; single emails may be 1\n # or more transition_domains, as they are grouped by username\n emails_to_send: List[str] = []\n\n def add_arguments(self, parser):\n \"\"\"Add command line arguments.\"\"\"\n parser.add_argument(\n \"-s\",\n \"--send_emails\",\n action=\"store_true\",\n default=False,\n dest=\"send_emails\",\n help=\"Send emails \",\n )\n\n def handle(self, **options):\n \"\"\"Process the objects in TransitionDomain.\"\"\"\n\n logger.info(\"checking domains and preparing emails\")\n # Get all TransitionDomain objects\n self.transition_domains = TransitionDomain.objects.filter(\n email_sent=False,\n ).order_by(\"username\")\n logger.info(\"Found %d transition domains\", len(self.transition_domains))\n\n self.build_emails_to_send_array()\n logger.info(\"Prepared %d emails to send\", len(self.emails_to_send))\n\n if options[\"send_emails\"]:\n logger.info(\"about to send emails\")\n self.send_emails()\n logger.info(\"done sending emails\")\n\n self.update_domains_as_sent()\n\n logger.info(\"done sending emails and updating transition_domains\")\n else:\n logger.info(\"not sending emails\")\n for email_context in self.emails_to_send:\n logger.info(\n \"would send email to %s for %s\",\n email_context[\"email\"],\n email_context[\"domains\"],\n )\n\n def build_emails_to_send_array(self):\n \"\"\"this method sends emails to distinct usernames\"\"\"\n\n # data structure to hold email context for a single email;\n # transition_domains ordered by username, a single email_context\n # may include information from more than one transition_domain\n email_context = {\"email\": \"\"}\n\n # loop through all transition_domains; group them by username\n # into emails_to_send_array\n for transition_domain in self.transition_domains:\n # attempt to get the domain from domain objects; if there is\n # an error getting the domain, skip this domain and add it to\n # domains_with_errors\n try:\n # if prior username does not match current username\n if not email_context[\"email\"] or email_context[\"email\"] != transition_domain.username:\n # if not first in list of transition_domains\n if email_context[\"email\"]:\n # append the email context to the emails_to_send array\n self.emails_to_send.append(copy.deepcopy(email_context))\n email_context[\"domains\"] = []\n email_context[\"email\"] = transition_domain.username\n email_context[\"domains\"].append(transition_domain.domain_name)\n except Exception as err:\n # error condition if domain not in database\n self.domains_with_errors.append(copy.deepcopy(transition_domain.domain_name))\n logger.error(f\"error retrieving domain {transition_domain.domain_name}: {err}\")\n # if there are at least one more transition domains than errors,\n # then append one more item\n if len(self.transition_domains) > len(self.domains_with_errors):\n self.emails_to_send.append(email_context)\n\n def send_emails(self):\n if len(self.emails_to_send) > 0:\n for email_data in self.emails_to_send:\n self.send_email(email_data)\n else:\n logger.info(\"no emails to send\")\n\n def send_email(self, email_data):\n try:\n send_templated_email(\n \"emails/transition_domain_invitation.txt\",\n \"emails/transition_domain_invitation_subject.txt\",\n to_address=email_data[\"email\"],\n context={\n \"domains\": email_data[\"domains\"],\n },\n )\n # success message is logged\n logger.info(\n f\"email sent successfully to {email_data['email']} for \"\n f\"{[domain for domain in email_data['domains']]}\"\n )\n except EmailSendingError as err:\n logger.error(\n f\"email did not send successfully to {email_data['email']} \"\n f\"for {[domain for domain in email_data['domains']]}\"\n f\": {err}\"\n )\n # if email failed to send, set error in domains_with_errors for each\n # domain in the email so that transition domain email_sent is not set\n # to True\n for domain in email_data[\"domains\"]:\n self.domains_with_errors.append(domain)\n\n def update_domains_as_sent(self):\n \"\"\"set email_sent to True in all transition_domains which have\n been processed successfully\"\"\"\n for transition_domain in self.transition_domains:\n if transition_domain.domain_name not in self.domains_with_errors:\n transition_domain.email_sent = True\n transition_domain.save()\n", "path": "src/registrar/management/commands/send_domain_invitations.py"}], "after_files": [{"content": "\"\"\"Data migration: Send domain invitations once to existing customers.\"\"\"\n\nimport logging\nimport copy\n\nfrom django.core.management import BaseCommand\nfrom registrar.models import TransitionDomain\nfrom ...utility.email import send_templated_email, EmailSendingError\nfrom typing import List\n\nlogger = logging.getLogger(__name__)\n\n\nclass Command(BaseCommand):\n help = \"Send domain invitations once to existing customers.\"\n\n # this array is used to store and process the transition_domains\n transition_domains: List[str] = []\n # this array is used to store domains with errors, which are not\n # sent emails; this array is used to update the succesful\n # transition_domains to email_sent=True, and also to report\n # out errors\n domains_with_errors: List[str] = []\n # this array is used to store email_context; each item in the array\n # contains the context for a single email; single emails may be 1\n # or more transition_domains, as they are grouped by username\n emails_to_send: List[str] = []\n\n def add_arguments(self, parser):\n \"\"\"Add command line arguments.\"\"\"\n parser.add_argument(\n \"-s\",\n \"--send_emails\",\n action=\"store_true\",\n default=False,\n dest=\"send_emails\",\n help=\"Send emails \",\n )\n\n parser.add_argument(\"emails\", nargs=\"*\", help=\"Email addresses to send invitations to\")\n\n def handle(self, **options):\n \"\"\"Process the objects in TransitionDomain.\"\"\"\n\n logger.info(\"checking domains and preparing emails\")\n\n if options[\"emails\"]:\n # this option is a list of email addresses\n self.transition_domains = TransitionDomain.objects.filter(\n username__in=options[\"emails\"],\n email_sent=False,\n ).order_by(\"username\")\n else:\n # Get all TransitionDomain objects\n self.transition_domains = TransitionDomain.objects.filter(\n email_sent=False,\n ).order_by(\"username\")\n logger.info(\"Found %d transition domains\", len(self.transition_domains))\n\n self.build_emails_to_send_array()\n logger.info(\"Prepared %d emails to send\", len(self.emails_to_send))\n\n if options[\"send_emails\"]:\n logger.info(\"about to send emails\")\n self.send_emails()\n logger.info(\"done sending emails\")\n\n self.update_domains_as_sent()\n\n logger.info(\"done sending emails and updating transition_domains\")\n else:\n logger.info(\"not sending emails\")\n for email_context in self.emails_to_send:\n logger.info(\n \"would send email to %s for %s\",\n email_context[\"email\"],\n email_context[\"domains\"],\n )\n\n def build_emails_to_send_array(self):\n \"\"\"this method sends emails to distinct usernames\"\"\"\n\n # data structure to hold email context for a single email;\n # transition_domains ordered by username, a single email_context\n # may include information from more than one transition_domain\n email_context = {\"email\": \"\"}\n\n # loop through all transition_domains; group them by username\n # into emails_to_send_array\n for transition_domain in self.transition_domains:\n # attempt to get the domain from domain objects; if there is\n # an error getting the domain, skip this domain and add it to\n # domains_with_errors\n try:\n # if prior username does not match current username\n if not email_context[\"email\"] or email_context[\"email\"] != transition_domain.username:\n # if not first in list of transition_domains\n if email_context[\"email\"]:\n # append the email context to the emails_to_send array\n self.emails_to_send.append(copy.deepcopy(email_context))\n email_context[\"domains\"] = []\n email_context[\"email\"] = transition_domain.username\n email_context[\"domains\"].append(transition_domain.domain_name)\n except Exception as err:\n # error condition if domain not in database\n self.domains_with_errors.append(copy.deepcopy(transition_domain.domain_name))\n logger.error(f\"error retrieving domain {transition_domain.domain_name}: {err}\")\n # if there are at least one more transition domains than errors,\n # then append one more item\n if len(self.transition_domains) > len(self.domains_with_errors):\n self.emails_to_send.append(email_context)\n\n def send_emails(self):\n if len(self.emails_to_send) > 0:\n for email_data in self.emails_to_send:\n self.send_email(email_data)\n else:\n logger.info(\"no emails to send\")\n\n def send_email(self, email_data):\n try:\n send_templated_email(\n \"emails/transition_domain_invitation.txt\",\n \"emails/transition_domain_invitation_subject.txt\",\n to_address=email_data[\"email\"],\n context={\n \"domains\": email_data[\"domains\"],\n },\n )\n # success message is logged\n logger.info(\n f\"email sent successfully to {email_data['email']} for \"\n f\"{[domain for domain in email_data['domains']]}\"\n )\n except EmailSendingError as err:\n logger.error(\n f\"email did not send successfully to {email_data['email']} \"\n f\"for {[domain for domain in email_data['domains']]}\"\n f\": {err}\"\n )\n # if email failed to send, set error in domains_with_errors for each\n # domain in the email so that transition domain email_sent is not set\n # to True\n for domain in email_data[\"domains\"]:\n self.domains_with_errors.append(domain)\n\n def update_domains_as_sent(self):\n \"\"\"set email_sent to True in all transition_domains which have\n been processed successfully\"\"\"\n for transition_domain in self.transition_domains:\n if transition_domain.domain_name not in self.domains_with_errors:\n transition_domain.email_sent = True\n transition_domain.save()\n", "path": "src/registrar/management/commands/send_domain_invitations.py"}]} | 1,915 | 282 |
gh_patches_debug_21002 | rasdani/github-patches | git_diff | fossasia__open-event-server-4664 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Server is not found locally after #4643
**I'm submitting a ...** (check one with "x")
- [x] bug report
- [ ] feature request
- [ ] support request => Please do not submit support requests here, instead ask your query in out Gitter channel at https://gitter.im/fossasia/open-event-server
After #4643, the server cannot be accessed via the URL `localhost:5000` or any other means. This has lead to Travis build failing and inability to test things locally as well. Revert this behaviour and reopen the original issue
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `config.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 import os
3 from envparse import env
4
5 env.read_envfile()
6
7 basedir = os.path.abspath(os.path.dirname(__file__))
8
9 VERSION_NAME = '2.0.0-alpha.1'
10
11 LANGUAGES = {
12 'en': 'English',
13 'bn': 'Bengali/Bangla',
14 'zh_Hans': 'Chinese (Simplified)',
15 'zh_Hant': 'Chinese (Traditional)',
16 'fr': 'French',
17 'de': 'German',
18 'id': 'Indonesian',
19 'ko': 'Korean',
20 'pl': 'Polish',
21 'es': 'Spanish',
22 'th': 'Thai',
23 'vi': 'Vietnamese',
24 'hi': 'Hindi',
25 'ja': 'Japanese',
26 'ru': 'Russian',
27 }
28
29
30 class Config(object):
31 """
32 The base configuration option. Contains the defaults.
33 """
34
35 DEBUG = False
36
37 DEVELOPMENT = False
38 STAGING = False
39 PRODUCTION = False
40 TESTING = False
41
42 CACHING = False
43 PROFILE = False
44 SQLALCHEMY_RECORD_QUERIES = False
45
46 FLASK_ADMIN_SWATCH = 'lumen'
47
48 VERSION = VERSION_NAME
49 SQLALCHEMY_TRACK_MODIFICATIONS = True
50 ERROR_404_HELP = False
51 CSRF_ENABLED = True
52 SERVER_NAME = env('SERVER_NAME', default='127.0.0.1:5000')
53 FALLBACK_PORT = 80
54 CORS_HEADERS = 'Content-Type'
55 SQLALCHEMY_DATABASE_URI = env('DATABASE_URL', default=None)
56 SERVE_STATIC = env.bool('SERVE_STATIC', default=False)
57 DATABASE_QUERY_TIMEOUT = 0.1
58 SENTRY_DSN = env('SENTRY_DSN', default=None)
59 ENABLE_ELASTICSEARCH = env.bool('ENABLE_ELASTICSEARCH', default=False)
60 ELASTICSEARCH_HOST = env('ELASTICSEARCH_HOST', default='localhost:9200')
61 REDIS_URL = env('REDIS_URL', default='redis://localhost:6379/0')
62
63 # API configs
64 SOFT_DELETE = True
65 PROPOGATE_ERROR = env.bool('PROPOGATE_ERROR', default=False)
66 DASHERIZE_API = True
67 API_PROPOGATE_UNCAUGHT_EXCEPTIONS = env.bool('API_PROPOGATE_UNCAUGHT_EXCEPTIONS', default=True)
68 ETAG = True
69
70 if not SQLALCHEMY_DATABASE_URI:
71 print('`DATABASE_URL` either not exported or empty')
72 exit()
73
74 BASE_DIR = basedir
75 FORCE_SSL = os.getenv('FORCE_SSL', 'no') == 'yes'
76
77 if SERVE_STATIC:
78 UPLOADS_FOLDER = BASE_DIR + '/static/uploads/'
79 TEMP_UPLOADS_FOLDER = BASE_DIR + '/static/uploads/temp/'
80 UPLOAD_FOLDER = UPLOADS_FOLDER
81 STATIC_URL = '/static/'
82 STATIC_ROOT = 'staticfiles'
83 STATICFILES_DIRS = (os.path.join(BASE_DIR, 'static'),)
84
85 if FORCE_SSL:
86 PREFERRED_URL_SCHEME = 'https'
87
88
89 class ProductionConfig(Config):
90 """
91 The configuration for a production environment
92 """
93
94 MINIFY_PAGE = True
95 PRODUCTION = True
96 CACHING = True
97
98 # if force on
99
100
101 class StagingConfig(ProductionConfig):
102 """
103 The configuration for a staging environment
104 """
105
106 PRODUCTION = False
107 STAGING = True
108
109
110 class DevelopmentConfig(Config):
111 """
112 The configuration for a development environment
113 """
114
115 DEVELOPMENT = True
116 DEBUG = True
117 CACHING = True
118 PROPOGATE_ERROR = True
119
120 # Test database performance
121 SQLALCHEMY_RECORD_QUERIES = True
122
123
124 class TestingConfig(Config):
125 """
126 The configuration for a test suit
127 """
128 TESTING = True
129 CELERY_ALWAYS_EAGER = True
130 CELERY_EAGER_PROPAGATES_EXCEPTIONS = True
131 SQLALCHEMY_RECORD_QUERIES = True
132 DEBUG_TB_ENABLED = False
133 BROKER_BACKEND = 'memory'
134 SQLALCHEMY_DATABASE_URI = env('TEST_DATABASE_URL', default=None)
135 PROPOGATE_ERROR = True
136
```
Path: `manage.py`
Content:
```
1 from __future__ import print_function
2 from flask_script import Server
3 import os
4
5 from app.api.helpers.db import save_to_db
6 from app.models.event import Event, get_new_event_identifier
7 from app import manager
8 from app import current_app as app
9 from app.models import db
10 from app.models.speaker import Speaker
11 from populate_db import populate
12 from flask_migrate import stamp
13 from sqlalchemy.engine import reflection
14
15 from tests.unittests.auth_helper import create_super_admin
16
17
18 @manager.command
19 def list_routes():
20 import urllib
21
22 output = []
23 for rule in app.url_map.iter_rules():
24 methods = ','.join(rule.methods)
25 line = urllib.unquote("{:50s} {:20s} {}".format(
26 rule.endpoint, methods, rule))
27 output.append(line)
28
29 for line in sorted(output):
30 print(line)
31
32
33 @manager.command
34 def add_event_identifier():
35 events = Event.query.all()
36 for event in events:
37 event.identifier = get_new_event_identifier()
38 save_to_db(event)
39
40
41 @manager.option('-e', '--event', help='Event ID. Eg. 1')
42 def fix_speaker_images(event):
43 from app.helpers.sessions_speakers.speakers import speaker_image_sizes
44 from app.helpers.sessions_speakers.speakers import save_resized_photo
45 import urllib
46 from app.helpers.storage import generate_hash
47 event_id = int(event)
48 image_sizes = speaker_image_sizes()
49 speakers = Speaker.query.filter_by(event_id=event_id).all()
50 for speaker in speakers:
51 if speaker.photo and speaker.photo.strip() != '':
52 file_relative_path = 'static/media/temp/' + generate_hash(str(speaker.id)) + '.jpg'
53 file_path = app.config['BASE_DIR'] + '/' + file_relative_path
54 urllib.urlretrieve(speaker.photo, file_path)
55 speaker.small = save_resized_photo(file_path, event_id, speaker.id, 'small', image_sizes)
56 speaker.thumbnail = save_resized_photo(file_path, event_id, speaker.id, 'thumbnail', image_sizes)
57 speaker.icon = save_resized_photo(file_path, event_id, speaker.id, 'icon', image_sizes)
58 db.session.add(speaker)
59 os.remove(file_path)
60 print("Downloaded " + speaker.photo + " into " + file_relative_path)
61 print("Processed - " + str(speaker.id))
62 db.session.commit()
63
64
65 @manager.option('-c', '--credentials', help='Super admin credentials. Eg. username:password')
66 def initialize_db(credentials):
67 with app.app_context():
68 populate_data = True
69 inspector = reflection.Inspector.from_engine(db.engine)
70 table_name = 'events'
71 table_names = inspector.get_table_names()
72 print("[LOG] Existing tables:")
73 print("[LOG] " + ','.join(table_names))
74 if table_name not in table_names:
75 print("[LOG] Table not found. Attempting creation")
76 try:
77 db.create_all()
78 stamp()
79 except:
80 populate_data = False
81 print("[LOG] Could not create tables. Either database does not exist or tables already created")
82 if populate_data:
83 credentials = credentials.split(":")
84 create_super_admin(credentials[0], credentials[1])
85 populate()
86 else:
87 print("[LOG] Tables already exist. Skipping data population & creation.")
88
89
90 @manager.command
91 def prepare_kubernetes_db():
92 with app.app_context():
93 initialize_db('[email protected]:fossasia')
94
95
96 if __name__ == "__main__":
97 host_port = app.config['SERVER_NAME'].split(":")
98 host = host_port[0]
99 port = int(host_port[1] if len(host_port) > 1 else app.config['FALLBACK_PORT'])
100 manager.add_command("runserver", Server(host=host, port=port))
101 manager.run()
102
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/config.py b/config.py
--- a/config.py
+++ b/config.py
@@ -49,8 +49,7 @@
SQLALCHEMY_TRACK_MODIFICATIONS = True
ERROR_404_HELP = False
CSRF_ENABLED = True
- SERVER_NAME = env('SERVER_NAME', default='127.0.0.1:5000')
- FALLBACK_PORT = 80
+ SERVER_NAME = env('SERVER_NAME', default=None)
CORS_HEADERS = 'Content-Type'
SQLALCHEMY_DATABASE_URI = env('DATABASE_URL', default=None)
SERVE_STATIC = env.bool('SERVE_STATIC', default=False)
diff --git a/manage.py b/manage.py
--- a/manage.py
+++ b/manage.py
@@ -1,5 +1,4 @@
from __future__ import print_function
-from flask_script import Server
import os
from app.api.helpers.db import save_to_db
@@ -94,8 +93,4 @@
if __name__ == "__main__":
- host_port = app.config['SERVER_NAME'].split(":")
- host = host_port[0]
- port = int(host_port[1] if len(host_port) > 1 else app.config['FALLBACK_PORT'])
- manager.add_command("runserver", Server(host=host, port=port))
manager.run()
| {"golden_diff": "diff --git a/config.py b/config.py\n--- a/config.py\n+++ b/config.py\n@@ -49,8 +49,7 @@\n SQLALCHEMY_TRACK_MODIFICATIONS = True\n ERROR_404_HELP = False\n CSRF_ENABLED = True\n- SERVER_NAME = env('SERVER_NAME', default='127.0.0.1:5000')\n- FALLBACK_PORT = 80\n+ SERVER_NAME = env('SERVER_NAME', default=None)\n CORS_HEADERS = 'Content-Type'\n SQLALCHEMY_DATABASE_URI = env('DATABASE_URL', default=None)\n SERVE_STATIC = env.bool('SERVE_STATIC', default=False)\ndiff --git a/manage.py b/manage.py\n--- a/manage.py\n+++ b/manage.py\n@@ -1,5 +1,4 @@\n from __future__ import print_function\n-from flask_script import Server\n import os\n \n from app.api.helpers.db import save_to_db\n@@ -94,8 +93,4 @@\n \n \n if __name__ == \"__main__\":\n- host_port = app.config['SERVER_NAME'].split(\":\")\n- host = host_port[0]\n- port = int(host_port[1] if len(host_port) > 1 else app.config['FALLBACK_PORT'])\n- manager.add_command(\"runserver\", Server(host=host, port=port))\n manager.run()\n", "issue": "Server is not found locally after #4643\n**I'm submitting a ...** (check one with \"x\")\r\n- [x] bug report\r\n- [ ] feature request\r\n- [ ] support request => Please do not submit support requests here, instead ask your query in out Gitter channel at https://gitter.im/fossasia/open-event-server\r\n\r\nAfter #4643, the server cannot be accessed via the URL `localhost:5000` or any other means. This has lead to Travis build failing and inability to test things locally as well. Revert this behaviour and reopen the original issue\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nimport os\nfrom envparse import env\n\nenv.read_envfile()\n\nbasedir = os.path.abspath(os.path.dirname(__file__))\n\nVERSION_NAME = '2.0.0-alpha.1'\n\nLANGUAGES = {\n 'en': 'English',\n 'bn': 'Bengali/Bangla',\n 'zh_Hans': 'Chinese (Simplified)',\n 'zh_Hant': 'Chinese (Traditional)',\n 'fr': 'French',\n 'de': 'German',\n 'id': 'Indonesian',\n 'ko': 'Korean',\n 'pl': 'Polish',\n 'es': 'Spanish',\n 'th': 'Thai',\n 'vi': 'Vietnamese',\n 'hi': 'Hindi',\n 'ja': 'Japanese',\n 'ru': 'Russian',\n}\n\n\nclass Config(object):\n \"\"\"\n The base configuration option. Contains the defaults.\n \"\"\"\n\n DEBUG = False\n\n DEVELOPMENT = False\n STAGING = False\n PRODUCTION = False\n TESTING = False\n\n CACHING = False\n PROFILE = False\n SQLALCHEMY_RECORD_QUERIES = False\n\n FLASK_ADMIN_SWATCH = 'lumen'\n\n VERSION = VERSION_NAME\n SQLALCHEMY_TRACK_MODIFICATIONS = True\n ERROR_404_HELP = False\n CSRF_ENABLED = True\n SERVER_NAME = env('SERVER_NAME', default='127.0.0.1:5000')\n FALLBACK_PORT = 80\n CORS_HEADERS = 'Content-Type'\n SQLALCHEMY_DATABASE_URI = env('DATABASE_URL', default=None)\n SERVE_STATIC = env.bool('SERVE_STATIC', default=False)\n DATABASE_QUERY_TIMEOUT = 0.1\n SENTRY_DSN = env('SENTRY_DSN', default=None)\n ENABLE_ELASTICSEARCH = env.bool('ENABLE_ELASTICSEARCH', default=False)\n ELASTICSEARCH_HOST = env('ELASTICSEARCH_HOST', default='localhost:9200')\n REDIS_URL = env('REDIS_URL', default='redis://localhost:6379/0')\n\n # API configs\n SOFT_DELETE = True\n PROPOGATE_ERROR = env.bool('PROPOGATE_ERROR', default=False)\n DASHERIZE_API = True\n API_PROPOGATE_UNCAUGHT_EXCEPTIONS = env.bool('API_PROPOGATE_UNCAUGHT_EXCEPTIONS', default=True)\n ETAG = True\n\n if not SQLALCHEMY_DATABASE_URI:\n print('`DATABASE_URL` either not exported or empty')\n exit()\n\n BASE_DIR = basedir\n FORCE_SSL = os.getenv('FORCE_SSL', 'no') == 'yes'\n\n if SERVE_STATIC:\n UPLOADS_FOLDER = BASE_DIR + '/static/uploads/'\n TEMP_UPLOADS_FOLDER = BASE_DIR + '/static/uploads/temp/'\n UPLOAD_FOLDER = UPLOADS_FOLDER\n STATIC_URL = '/static/'\n STATIC_ROOT = 'staticfiles'\n STATICFILES_DIRS = (os.path.join(BASE_DIR, 'static'),)\n\n if FORCE_SSL:\n PREFERRED_URL_SCHEME = 'https'\n\n\nclass ProductionConfig(Config):\n \"\"\"\n The configuration for a production environment\n \"\"\"\n\n MINIFY_PAGE = True\n PRODUCTION = True\n CACHING = True\n\n # if force on\n\n\nclass StagingConfig(ProductionConfig):\n \"\"\"\n The configuration for a staging environment\n \"\"\"\n\n PRODUCTION = False\n STAGING = True\n\n\nclass DevelopmentConfig(Config):\n \"\"\"\n The configuration for a development environment\n \"\"\"\n\n DEVELOPMENT = True\n DEBUG = True\n CACHING = True\n PROPOGATE_ERROR = True\n\n # Test database performance\n SQLALCHEMY_RECORD_QUERIES = True\n\n\nclass TestingConfig(Config):\n \"\"\"\n The configuration for a test suit\n \"\"\"\n TESTING = True\n CELERY_ALWAYS_EAGER = True\n CELERY_EAGER_PROPAGATES_EXCEPTIONS = True\n SQLALCHEMY_RECORD_QUERIES = True\n DEBUG_TB_ENABLED = False\n BROKER_BACKEND = 'memory'\n SQLALCHEMY_DATABASE_URI = env('TEST_DATABASE_URL', default=None)\n PROPOGATE_ERROR = True\n", "path": "config.py"}, {"content": "from __future__ import print_function\nfrom flask_script import Server\nimport os\n\nfrom app.api.helpers.db import save_to_db\nfrom app.models.event import Event, get_new_event_identifier\nfrom app import manager\nfrom app import current_app as app\nfrom app.models import db\nfrom app.models.speaker import Speaker\nfrom populate_db import populate\nfrom flask_migrate import stamp\nfrom sqlalchemy.engine import reflection\n\nfrom tests.unittests.auth_helper import create_super_admin\n\n\[email protected]\ndef list_routes():\n import urllib\n\n output = []\n for rule in app.url_map.iter_rules():\n methods = ','.join(rule.methods)\n line = urllib.unquote(\"{:50s} {:20s} {}\".format(\n rule.endpoint, methods, rule))\n output.append(line)\n\n for line in sorted(output):\n print(line)\n\n\[email protected]\ndef add_event_identifier():\n events = Event.query.all()\n for event in events:\n event.identifier = get_new_event_identifier()\n save_to_db(event)\n\n\[email protected]('-e', '--event', help='Event ID. Eg. 1')\ndef fix_speaker_images(event):\n from app.helpers.sessions_speakers.speakers import speaker_image_sizes\n from app.helpers.sessions_speakers.speakers import save_resized_photo\n import urllib\n from app.helpers.storage import generate_hash\n event_id = int(event)\n image_sizes = speaker_image_sizes()\n speakers = Speaker.query.filter_by(event_id=event_id).all()\n for speaker in speakers:\n if speaker.photo and speaker.photo.strip() != '':\n file_relative_path = 'static/media/temp/' + generate_hash(str(speaker.id)) + '.jpg'\n file_path = app.config['BASE_DIR'] + '/' + file_relative_path\n urllib.urlretrieve(speaker.photo, file_path)\n speaker.small = save_resized_photo(file_path, event_id, speaker.id, 'small', image_sizes)\n speaker.thumbnail = save_resized_photo(file_path, event_id, speaker.id, 'thumbnail', image_sizes)\n speaker.icon = save_resized_photo(file_path, event_id, speaker.id, 'icon', image_sizes)\n db.session.add(speaker)\n os.remove(file_path)\n print(\"Downloaded \" + speaker.photo + \" into \" + file_relative_path)\n print(\"Processed - \" + str(speaker.id))\n db.session.commit()\n\n\[email protected]('-c', '--credentials', help='Super admin credentials. Eg. username:password')\ndef initialize_db(credentials):\n with app.app_context():\n populate_data = True\n inspector = reflection.Inspector.from_engine(db.engine)\n table_name = 'events'\n table_names = inspector.get_table_names()\n print(\"[LOG] Existing tables:\")\n print(\"[LOG] \" + ','.join(table_names))\n if table_name not in table_names:\n print(\"[LOG] Table not found. Attempting creation\")\n try:\n db.create_all()\n stamp()\n except:\n populate_data = False\n print(\"[LOG] Could not create tables. Either database does not exist or tables already created\")\n if populate_data:\n credentials = credentials.split(\":\")\n create_super_admin(credentials[0], credentials[1])\n populate()\n else:\n print(\"[LOG] Tables already exist. Skipping data population & creation.\")\n\n\[email protected]\ndef prepare_kubernetes_db():\n with app.app_context():\n initialize_db('[email protected]:fossasia')\n\n\nif __name__ == \"__main__\":\n host_port = app.config['SERVER_NAME'].split(\":\")\n host = host_port[0]\n port = int(host_port[1] if len(host_port) > 1 else app.config['FALLBACK_PORT'])\n manager.add_command(\"runserver\", Server(host=host, port=port))\n manager.run()\n", "path": "manage.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nimport os\nfrom envparse import env\n\nenv.read_envfile()\n\nbasedir = os.path.abspath(os.path.dirname(__file__))\n\nVERSION_NAME = '2.0.0-alpha.1'\n\nLANGUAGES = {\n 'en': 'English',\n 'bn': 'Bengali/Bangla',\n 'zh_Hans': 'Chinese (Simplified)',\n 'zh_Hant': 'Chinese (Traditional)',\n 'fr': 'French',\n 'de': 'German',\n 'id': 'Indonesian',\n 'ko': 'Korean',\n 'pl': 'Polish',\n 'es': 'Spanish',\n 'th': 'Thai',\n 'vi': 'Vietnamese',\n 'hi': 'Hindi',\n 'ja': 'Japanese',\n 'ru': 'Russian',\n}\n\n\nclass Config(object):\n \"\"\"\n The base configuration option. Contains the defaults.\n \"\"\"\n\n DEBUG = False\n\n DEVELOPMENT = False\n STAGING = False\n PRODUCTION = False\n TESTING = False\n\n CACHING = False\n PROFILE = False\n SQLALCHEMY_RECORD_QUERIES = False\n\n FLASK_ADMIN_SWATCH = 'lumen'\n\n VERSION = VERSION_NAME\n SQLALCHEMY_TRACK_MODIFICATIONS = True\n ERROR_404_HELP = False\n CSRF_ENABLED = True\n SERVER_NAME = env('SERVER_NAME', default=None)\n CORS_HEADERS = 'Content-Type'\n SQLALCHEMY_DATABASE_URI = env('DATABASE_URL', default=None)\n SERVE_STATIC = env.bool('SERVE_STATIC', default=False)\n DATABASE_QUERY_TIMEOUT = 0.1\n SENTRY_DSN = env('SENTRY_DSN', default=None)\n ENABLE_ELASTICSEARCH = env.bool('ENABLE_ELASTICSEARCH', default=False)\n ELASTICSEARCH_HOST = env('ELASTICSEARCH_HOST', default='localhost:9200')\n REDIS_URL = env('REDIS_URL', default='redis://localhost:6379/0')\n\n # API configs\n SOFT_DELETE = True\n PROPOGATE_ERROR = env.bool('PROPOGATE_ERROR', default=False)\n DASHERIZE_API = True\n API_PROPOGATE_UNCAUGHT_EXCEPTIONS = env.bool('API_PROPOGATE_UNCAUGHT_EXCEPTIONS', default=True)\n ETAG = True\n\n if not SQLALCHEMY_DATABASE_URI:\n print('`DATABASE_URL` either not exported or empty')\n exit()\n\n BASE_DIR = basedir\n FORCE_SSL = os.getenv('FORCE_SSL', 'no') == 'yes'\n\n if SERVE_STATIC:\n UPLOADS_FOLDER = BASE_DIR + '/static/uploads/'\n TEMP_UPLOADS_FOLDER = BASE_DIR + '/static/uploads/temp/'\n UPLOAD_FOLDER = UPLOADS_FOLDER\n STATIC_URL = '/static/'\n STATIC_ROOT = 'staticfiles'\n STATICFILES_DIRS = (os.path.join(BASE_DIR, 'static'),)\n\n if FORCE_SSL:\n PREFERRED_URL_SCHEME = 'https'\n\n\nclass ProductionConfig(Config):\n \"\"\"\n The configuration for a production environment\n \"\"\"\n\n MINIFY_PAGE = True\n PRODUCTION = True\n CACHING = True\n\n # if force on\n\n\nclass StagingConfig(ProductionConfig):\n \"\"\"\n The configuration for a staging environment\n \"\"\"\n\n PRODUCTION = False\n STAGING = True\n\n\nclass DevelopmentConfig(Config):\n \"\"\"\n The configuration for a development environment\n \"\"\"\n\n DEVELOPMENT = True\n DEBUG = True\n CACHING = True\n PROPOGATE_ERROR = True\n\n # Test database performance\n SQLALCHEMY_RECORD_QUERIES = True\n\n\nclass TestingConfig(Config):\n \"\"\"\n The configuration for a test suit\n \"\"\"\n TESTING = True\n CELERY_ALWAYS_EAGER = True\n CELERY_EAGER_PROPAGATES_EXCEPTIONS = True\n SQLALCHEMY_RECORD_QUERIES = True\n DEBUG_TB_ENABLED = False\n BROKER_BACKEND = 'memory'\n SQLALCHEMY_DATABASE_URI = env('TEST_DATABASE_URL', default=None)\n PROPOGATE_ERROR = True\n", "path": "config.py"}, {"content": "from __future__ import print_function\nimport os\n\nfrom app.api.helpers.db import save_to_db\nfrom app.models.event import Event, get_new_event_identifier\nfrom app import manager\nfrom app import current_app as app\nfrom app.models import db\nfrom app.models.speaker import Speaker\nfrom populate_db import populate\nfrom flask_migrate import stamp\nfrom sqlalchemy.engine import reflection\n\nfrom tests.unittests.auth_helper import create_super_admin\n\n\[email protected]\ndef list_routes():\n import urllib\n\n output = []\n for rule in app.url_map.iter_rules():\n methods = ','.join(rule.methods)\n line = urllib.unquote(\"{:50s} {:20s} {}\".format(\n rule.endpoint, methods, rule))\n output.append(line)\n\n for line in sorted(output):\n print(line)\n\n\[email protected]\ndef add_event_identifier():\n events = Event.query.all()\n for event in events:\n event.identifier = get_new_event_identifier()\n save_to_db(event)\n\n\[email protected]('-e', '--event', help='Event ID. Eg. 1')\ndef fix_speaker_images(event):\n from app.helpers.sessions_speakers.speakers import speaker_image_sizes\n from app.helpers.sessions_speakers.speakers import save_resized_photo\n import urllib\n from app.helpers.storage import generate_hash\n event_id = int(event)\n image_sizes = speaker_image_sizes()\n speakers = Speaker.query.filter_by(event_id=event_id).all()\n for speaker in speakers:\n if speaker.photo and speaker.photo.strip() != '':\n file_relative_path = 'static/media/temp/' + generate_hash(str(speaker.id)) + '.jpg'\n file_path = app.config['BASE_DIR'] + '/' + file_relative_path\n urllib.urlretrieve(speaker.photo, file_path)\n speaker.small = save_resized_photo(file_path, event_id, speaker.id, 'small', image_sizes)\n speaker.thumbnail = save_resized_photo(file_path, event_id, speaker.id, 'thumbnail', image_sizes)\n speaker.icon = save_resized_photo(file_path, event_id, speaker.id, 'icon', image_sizes)\n db.session.add(speaker)\n os.remove(file_path)\n print(\"Downloaded \" + speaker.photo + \" into \" + file_relative_path)\n print(\"Processed - \" + str(speaker.id))\n db.session.commit()\n\n\[email protected]('-c', '--credentials', help='Super admin credentials. Eg. username:password')\ndef initialize_db(credentials):\n with app.app_context():\n populate_data = True\n inspector = reflection.Inspector.from_engine(db.engine)\n table_name = 'events'\n table_names = inspector.get_table_names()\n print(\"[LOG] Existing tables:\")\n print(\"[LOG] \" + ','.join(table_names))\n if table_name not in table_names:\n print(\"[LOG] Table not found. Attempting creation\")\n try:\n db.create_all()\n stamp()\n except:\n populate_data = False\n print(\"[LOG] Could not create tables. Either database does not exist or tables already created\")\n if populate_data:\n credentials = credentials.split(\":\")\n create_super_admin(credentials[0], credentials[1])\n populate()\n else:\n print(\"[LOG] Tables already exist. Skipping data population & creation.\")\n\n\[email protected]\ndef prepare_kubernetes_db():\n with app.app_context():\n initialize_db('[email protected]:fossasia')\n\n\nif __name__ == \"__main__\":\n manager.run()\n", "path": "manage.py"}]} | 2,630 | 294 |
gh_patches_debug_11456 | rasdani/github-patches | git_diff | GeotrekCE__Geotrek-admin-1334 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
SEARCH_PATH for Geotrek DB user
Since Geotrek 0.28, tables and functions have be moved to different schemas, which is a very good point (https://github.com/makinacorpus/Geotrek/releases/tag/v0.28.0).
Schemas are not mentionned in triggers which is OK too, as Django is doing it in his connexions so it is not a problem for GEOTREK applications.
It gets a problem when you try to edit or insert a data from an external tool (QGIS, Talend...).
You have to change the db_user search_path so that he can find tables and functions not only in public schemas.
It could be interesting to do it during GEOTREK installation for the Geotrek DB user mentionned in settings :
ALTER USER $geotrek_db_user SET
search_path=public,django,geotrek,gestion,rando,zonage,foncier,tourisme;
Of course if you are using another user to edit datas in external tools, you will have to do it manually the first time.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `geotrek/common/utils/postgresql.py`
Content:
```
1 import re
2 import os
3 import logging
4 import traceback
5 from functools import wraps
6
7 from django.db import connection, models
8 from django.conf import settings
9 from django.db.models import get_app, get_models
10
11
12 logger = logging.getLogger(__name__)
13
14
15 def debug_pg_notices(f):
16
17 @wraps(f)
18 def wrapped(*args, **kwargs):
19 before = len(connection.connection.notices) if connection.connection else 0
20 try:
21 r = f(*args, **kwargs)
22 finally:
23 # Show triggers output
24 allnotices = []
25 current = ''
26 if connection.connection:
27 notices = []
28 for notice in connection.connection.notices[before:]:
29 try:
30 notice, context = notice.split('CONTEXT:', 1)
31 context = re.sub("\s+", " ", context)
32 except ValueError:
33 context = ''
34 notices.append((context, notice))
35 if context != current:
36 allnotices.append(notices)
37 notices = []
38 current = context
39 allnotices.append(notices)
40 current = ''
41 for notices in allnotices:
42 for context, notice in notices:
43 if context != current:
44 if context != '':
45 logger.debug('Context %s...:' % context.strip()[:80])
46 current = context
47 notice = notice.replace('NOTICE: ', '')
48 prefix = '' if context == '' else ' '
49 logger.debug('%s%s' % (prefix, notice.strip()))
50 return r
51
52 return wrapped
53
54
55 def load_sql_files(app_label):
56 """
57 Look for SQL files in Django app, and load them into database.
58 We remove RAISE NOTICE instructions from SQL outside unit testing
59 since they lead to interpolation errors of '%' character in python.
60 """
61 app_dir = os.path.dirname(models.get_app(app_label).__file__)
62 sql_dir = os.path.normpath(os.path.join(app_dir, 'sql'))
63 if not os.path.exists(sql_dir):
64 logger.debug("No SQL folder for %s" % app_label)
65 return
66
67 r = re.compile(r'^.*\.sql$')
68 sql_files = [os.path.join(sql_dir, f)
69 for f in os.listdir(sql_dir)
70 if r.match(f) is not None]
71 sql_files.sort()
72
73 if len(sql_files) == 0:
74 logger.warning("Empty folder %s" % sql_dir)
75
76 cursor = connection.cursor()
77 for sql_file in sql_files:
78 try:
79 logger.info("Loading initial SQL data from '%s'" % sql_file)
80 f = open(sql_file)
81 sql = f.read()
82 f.close()
83 if not settings.TEST:
84 # Remove RAISE NOTICE (/!\ only one-liners)
85 sql = re.sub(r"\n.*RAISE NOTICE.*\n", "\n", sql)
86 # TODO: this is the ugliest driver hack ever
87 sql = sql.replace('%', '%%')
88
89 # Replace curly braces with settings values
90 pattern = re.compile(r'{{\s*(.*)\s*}}')
91 for m in pattern.finditer(sql):
92 value = getattr(settings, m.group(1))
93 sql = sql.replace(m.group(0), unicode(value))
94 cursor.execute(sql)
95 except Exception as e:
96 logger.critical("Failed to install custom SQL file '%s': %s\n" %
97 (sql_file, e))
98 traceback.print_exc()
99 raise
100
101
102 def move_models_to_schemas(app_label):
103 """
104 Move models tables to PostgreSQL schemas.
105
106 Views, functions and triggers will be moved in Geotrek app SQL files.
107 """
108 app = get_app(app_label)
109 default_schema = settings.DATABASE_SCHEMAS.get('default')
110 app_schema = settings.DATABASE_SCHEMAS.get(app_label, default_schema)
111
112 table_schemas = {}
113 for model in get_models(app):
114 model_name = model._meta.module_name
115 table_name = model._meta.db_table
116 model_schema = settings.DATABASE_SCHEMAS.get(model_name, app_schema)
117 table_schemas.setdefault(model_schema, []).append(table_name)
118
119 for m2m_field in model._meta.many_to_many:
120 table_name = m2m_field.db_table
121 if table_name:
122 table_schemas[model_schema].append(table_name)
123
124 cursor = connection.cursor()
125
126 for schema_name in table_schemas.keys():
127 try:
128 sql = "CREATE SCHEMA %s;" % model_schema
129 cursor.execute(sql)
130 logger.info("Created schema %s" % model_schema)
131 except Exception:
132 logger.debug("Schema %s already exists." % model_schema)
133
134 for schema_name, tables in table_schemas.items():
135 for table_name in tables:
136 try:
137 sql = "ALTER TABLE %s SET SCHEMA %s;" % (table_name, schema_name)
138 cursor.execute(sql)
139 logger.info("Moved %s to schema %s" % (table_name, schema_name))
140 except Exception:
141 logger.debug("Table %s already in schema %s" % (table_name, schema_name))
142
143 # For Django, search_path is set in connection options.
144 # But when accessing the database using QGis or ETL, search_path must be
145 # set database level (for all users, and for this database only).
146 if app_label == 'common':
147 dbname = settings.DATABASES['default']['NAME']
148 search_path = 'public,%s' % ','.join(set(settings.DATABASE_SCHEMAS.values()))
149 sql = "ALTER DATABASE %s SET search_path=%s;" % (dbname, search_path)
150 cursor.execute(sql)
151
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/geotrek/common/utils/postgresql.py b/geotrek/common/utils/postgresql.py
--- a/geotrek/common/utils/postgresql.py
+++ b/geotrek/common/utils/postgresql.py
@@ -145,6 +145,7 @@
# set database level (for all users, and for this database only).
if app_label == 'common':
dbname = settings.DATABASES['default']['NAME']
+ dbuser = settings.DATABASES['default']['USER']
search_path = 'public,%s' % ','.join(set(settings.DATABASE_SCHEMAS.values()))
- sql = "ALTER DATABASE %s SET search_path=%s;" % (dbname, search_path)
+ sql = "ALTER ROLE %s IN DATABASE %s SET search_path=%s;" % (dbuser, dbname, search_path)
cursor.execute(sql)
| {"golden_diff": "diff --git a/geotrek/common/utils/postgresql.py b/geotrek/common/utils/postgresql.py\n--- a/geotrek/common/utils/postgresql.py\n+++ b/geotrek/common/utils/postgresql.py\n@@ -145,6 +145,7 @@\n # set database level (for all users, and for this database only).\n if app_label == 'common':\n dbname = settings.DATABASES['default']['NAME']\n+ dbuser = settings.DATABASES['default']['USER']\n search_path = 'public,%s' % ','.join(set(settings.DATABASE_SCHEMAS.values()))\n- sql = \"ALTER DATABASE %s SET search_path=%s;\" % (dbname, search_path)\n+ sql = \"ALTER ROLE %s IN DATABASE %s SET search_path=%s;\" % (dbuser, dbname, search_path)\n cursor.execute(sql)\n", "issue": "SEARCH_PATH for Geotrek DB user\nSince Geotrek 0.28, tables and functions have be moved to different schemas, which is a very good point (https://github.com/makinacorpus/Geotrek/releases/tag/v0.28.0).\n\nSchemas are not mentionned in triggers which is OK too, as Django is doing it in his connexions so it is not a problem for GEOTREK applications.\n\nIt gets a problem when you try to edit or insert a data from an external tool (QGIS, Talend...). \nYou have to change the db_user search_path so that he can find tables and functions not only in public schemas.\n\nIt could be interesting to do it during GEOTREK installation for the Geotrek DB user mentionned in settings : \n\nALTER USER $geotrek_db_user SET \nsearch_path=public,django,geotrek,gestion,rando,zonage,foncier,tourisme; \n\nOf course if you are using another user to edit datas in external tools, you will have to do it manually the first time. \n\n", "before_files": [{"content": "import re\nimport os\nimport logging\nimport traceback\nfrom functools import wraps\n\nfrom django.db import connection, models\nfrom django.conf import settings\nfrom django.db.models import get_app, get_models\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef debug_pg_notices(f):\n\n @wraps(f)\n def wrapped(*args, **kwargs):\n before = len(connection.connection.notices) if connection.connection else 0\n try:\n r = f(*args, **kwargs)\n finally:\n # Show triggers output\n allnotices = []\n current = ''\n if connection.connection:\n notices = []\n for notice in connection.connection.notices[before:]:\n try:\n notice, context = notice.split('CONTEXT:', 1)\n context = re.sub(\"\\s+\", \" \", context)\n except ValueError:\n context = ''\n notices.append((context, notice))\n if context != current:\n allnotices.append(notices)\n notices = []\n current = context\n allnotices.append(notices)\n current = ''\n for notices in allnotices:\n for context, notice in notices:\n if context != current:\n if context != '':\n logger.debug('Context %s...:' % context.strip()[:80])\n current = context\n notice = notice.replace('NOTICE: ', '')\n prefix = '' if context == '' else ' '\n logger.debug('%s%s' % (prefix, notice.strip()))\n return r\n\n return wrapped\n\n\ndef load_sql_files(app_label):\n \"\"\"\n Look for SQL files in Django app, and load them into database.\n We remove RAISE NOTICE instructions from SQL outside unit testing\n since they lead to interpolation errors of '%' character in python.\n \"\"\"\n app_dir = os.path.dirname(models.get_app(app_label).__file__)\n sql_dir = os.path.normpath(os.path.join(app_dir, 'sql'))\n if not os.path.exists(sql_dir):\n logger.debug(\"No SQL folder for %s\" % app_label)\n return\n\n r = re.compile(r'^.*\\.sql$')\n sql_files = [os.path.join(sql_dir, f)\n for f in os.listdir(sql_dir)\n if r.match(f) is not None]\n sql_files.sort()\n\n if len(sql_files) == 0:\n logger.warning(\"Empty folder %s\" % sql_dir)\n\n cursor = connection.cursor()\n for sql_file in sql_files:\n try:\n logger.info(\"Loading initial SQL data from '%s'\" % sql_file)\n f = open(sql_file)\n sql = f.read()\n f.close()\n if not settings.TEST:\n # Remove RAISE NOTICE (/!\\ only one-liners)\n sql = re.sub(r\"\\n.*RAISE NOTICE.*\\n\", \"\\n\", sql)\n # TODO: this is the ugliest driver hack ever\n sql = sql.replace('%', '%%')\n\n # Replace curly braces with settings values\n pattern = re.compile(r'{{\\s*(.*)\\s*}}')\n for m in pattern.finditer(sql):\n value = getattr(settings, m.group(1))\n sql = sql.replace(m.group(0), unicode(value))\n cursor.execute(sql)\n except Exception as e:\n logger.critical(\"Failed to install custom SQL file '%s': %s\\n\" %\n (sql_file, e))\n traceback.print_exc()\n raise\n\n\ndef move_models_to_schemas(app_label):\n \"\"\"\n Move models tables to PostgreSQL schemas.\n\n Views, functions and triggers will be moved in Geotrek app SQL files.\n \"\"\"\n app = get_app(app_label)\n default_schema = settings.DATABASE_SCHEMAS.get('default')\n app_schema = settings.DATABASE_SCHEMAS.get(app_label, default_schema)\n\n table_schemas = {}\n for model in get_models(app):\n model_name = model._meta.module_name\n table_name = model._meta.db_table\n model_schema = settings.DATABASE_SCHEMAS.get(model_name, app_schema)\n table_schemas.setdefault(model_schema, []).append(table_name)\n\n for m2m_field in model._meta.many_to_many:\n table_name = m2m_field.db_table\n if table_name:\n table_schemas[model_schema].append(table_name)\n\n cursor = connection.cursor()\n\n for schema_name in table_schemas.keys():\n try:\n sql = \"CREATE SCHEMA %s;\" % model_schema\n cursor.execute(sql)\n logger.info(\"Created schema %s\" % model_schema)\n except Exception:\n logger.debug(\"Schema %s already exists.\" % model_schema)\n\n for schema_name, tables in table_schemas.items():\n for table_name in tables:\n try:\n sql = \"ALTER TABLE %s SET SCHEMA %s;\" % (table_name, schema_name)\n cursor.execute(sql)\n logger.info(\"Moved %s to schema %s\" % (table_name, schema_name))\n except Exception:\n logger.debug(\"Table %s already in schema %s\" % (table_name, schema_name))\n\n # For Django, search_path is set in connection options.\n # But when accessing the database using QGis or ETL, search_path must be\n # set database level (for all users, and for this database only).\n if app_label == 'common':\n dbname = settings.DATABASES['default']['NAME']\n search_path = 'public,%s' % ','.join(set(settings.DATABASE_SCHEMAS.values()))\n sql = \"ALTER DATABASE %s SET search_path=%s;\" % (dbname, search_path)\n cursor.execute(sql)\n", "path": "geotrek/common/utils/postgresql.py"}], "after_files": [{"content": "import re\nimport os\nimport logging\nimport traceback\nfrom functools import wraps\n\nfrom django.db import connection, models\nfrom django.conf import settings\nfrom django.db.models import get_app, get_models\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef debug_pg_notices(f):\n\n @wraps(f)\n def wrapped(*args, **kwargs):\n before = len(connection.connection.notices) if connection.connection else 0\n try:\n r = f(*args, **kwargs)\n finally:\n # Show triggers output\n allnotices = []\n current = ''\n if connection.connection:\n notices = []\n for notice in connection.connection.notices[before:]:\n try:\n notice, context = notice.split('CONTEXT:', 1)\n context = re.sub(\"\\s+\", \" \", context)\n except ValueError:\n context = ''\n notices.append((context, notice))\n if context != current:\n allnotices.append(notices)\n notices = []\n current = context\n allnotices.append(notices)\n current = ''\n for notices in allnotices:\n for context, notice in notices:\n if context != current:\n if context != '':\n logger.debug('Context %s...:' % context.strip()[:80])\n current = context\n notice = notice.replace('NOTICE: ', '')\n prefix = '' if context == '' else ' '\n logger.debug('%s%s' % (prefix, notice.strip()))\n return r\n\n return wrapped\n\n\ndef load_sql_files(app_label):\n \"\"\"\n Look for SQL files in Django app, and load them into database.\n We remove RAISE NOTICE instructions from SQL outside unit testing\n since they lead to interpolation errors of '%' character in python.\n \"\"\"\n app_dir = os.path.dirname(models.get_app(app_label).__file__)\n sql_dir = os.path.normpath(os.path.join(app_dir, 'sql'))\n if not os.path.exists(sql_dir):\n logger.debug(\"No SQL folder for %s\" % app_label)\n return\n\n r = re.compile(r'^.*\\.sql$')\n sql_files = [os.path.join(sql_dir, f)\n for f in os.listdir(sql_dir)\n if r.match(f) is not None]\n sql_files.sort()\n\n if len(sql_files) == 0:\n logger.warning(\"Empty folder %s\" % sql_dir)\n\n cursor = connection.cursor()\n for sql_file in sql_files:\n try:\n logger.info(\"Loading initial SQL data from '%s'\" % sql_file)\n f = open(sql_file)\n sql = f.read()\n f.close()\n if not settings.TEST:\n # Remove RAISE NOTICE (/!\\ only one-liners)\n sql = re.sub(r\"\\n.*RAISE NOTICE.*\\n\", \"\\n\", sql)\n # TODO: this is the ugliest driver hack ever\n sql = sql.replace('%', '%%')\n\n # Replace curly braces with settings values\n pattern = re.compile(r'{{\\s*(.*)\\s*}}')\n for m in pattern.finditer(sql):\n value = getattr(settings, m.group(1))\n sql = sql.replace(m.group(0), unicode(value))\n cursor.execute(sql)\n except Exception as e:\n logger.critical(\"Failed to install custom SQL file '%s': %s\\n\" %\n (sql_file, e))\n traceback.print_exc()\n raise\n\n\ndef move_models_to_schemas(app_label):\n \"\"\"\n Move models tables to PostgreSQL schemas.\n\n Views, functions and triggers will be moved in Geotrek app SQL files.\n \"\"\"\n app = get_app(app_label)\n default_schema = settings.DATABASE_SCHEMAS.get('default')\n app_schema = settings.DATABASE_SCHEMAS.get(app_label, default_schema)\n\n table_schemas = {}\n for model in get_models(app):\n model_name = model._meta.module_name\n table_name = model._meta.db_table\n model_schema = settings.DATABASE_SCHEMAS.get(model_name, app_schema)\n table_schemas.setdefault(model_schema, []).append(table_name)\n\n for m2m_field in model._meta.many_to_many:\n table_name = m2m_field.db_table\n if table_name:\n table_schemas[model_schema].append(table_name)\n\n cursor = connection.cursor()\n\n for schema_name in table_schemas.keys():\n try:\n sql = \"CREATE SCHEMA %s;\" % model_schema\n cursor.execute(sql)\n logger.info(\"Created schema %s\" % model_schema)\n except Exception:\n logger.debug(\"Schema %s already exists.\" % model_schema)\n\n for schema_name, tables in table_schemas.items():\n for table_name in tables:\n try:\n sql = \"ALTER TABLE %s SET SCHEMA %s;\" % (table_name, schema_name)\n cursor.execute(sql)\n logger.info(\"Moved %s to schema %s\" % (table_name, schema_name))\n except Exception:\n logger.debug(\"Table %s already in schema %s\" % (table_name, schema_name))\n\n # For Django, search_path is set in connection options.\n # But when accessing the database using QGis or ETL, search_path must be\n # set database level (for all users, and for this database only).\n if app_label == 'common':\n dbname = settings.DATABASES['default']['NAME']\n dbuser = settings.DATABASES['default']['USER']\n search_path = 'public,%s' % ','.join(set(settings.DATABASE_SCHEMAS.values()))\n sql = \"ALTER ROLE %s IN DATABASE %s SET search_path=%s;\" % (dbuser, dbname, search_path)\n cursor.execute(sql)\n", "path": "geotrek/common/utils/postgresql.py"}]} | 2,037 | 185 |
gh_patches_debug_40390 | rasdani/github-patches | git_diff | jupyterhub__jupyterhub-349 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Default shell to bash
Right now, if users are created by jupyterhub with useradd on a stock Ubuntu, there is no default shell. This leads to `/bin/sh` being used rather than bash. When the web terminal is opened all readline related actions (tab, up, down) don't work. I know I can configure this in `/etc/default/useradd`, but we jut added an option to set the home directory base path. Should we default the shell to bash in the useradd CLI call? Maybe add an config option that defaults to `/bin/bash`. This behavior means that a pretty standard config on Ubuntu will have a (slightly) broken terminal.
I could have sworn that we did something on this previously, when I was debugging things with my Active Directory setup where `SHELL` wasn't being set. But I don't remember what we ended up doing (I may have just set it myself).
Thoughts?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `jupyterhub/auth.py`
Content:
```
1 """Simple PAM authenticator"""
2
3 # Copyright (c) IPython Development Team.
4 # Distributed under the terms of the Modified BSD License.
5
6 from grp import getgrnam
7 import pwd
8 from subprocess import check_call, check_output, CalledProcessError
9
10 from tornado import gen
11 import pamela
12
13 from traitlets.config import LoggingConfigurable
14 from traitlets import Bool, Set, Unicode, Any
15
16 from .handlers.login import LoginHandler
17 from .utils import url_path_join
18
19 class Authenticator(LoggingConfigurable):
20 """A class for authentication.
21
22 The API is one method, `authenticate`, a tornado gen.coroutine.
23 """
24
25 db = Any()
26 admin_users = Set(config=True,
27 help="""set of usernames of admin users
28
29 If unspecified, only the user that launches the server will be admin.
30 """
31 )
32 whitelist = Set(config=True,
33 help="""Username whitelist.
34
35 Use this to restrict which users can login.
36 If empty, allow any user to attempt login.
37 """
38 )
39 custom_html = Unicode('',
40 help="""HTML login form for custom handlers.
41 Override in form-based custom authenticators
42 that don't use username+password,
43 or need custom branding.
44 """
45 )
46 login_service = Unicode('',
47 help="""Name of the login service for external
48 login services (e.g. 'GitHub').
49 """
50 )
51
52 @gen.coroutine
53 def authenticate(self, handler, data):
54 """Authenticate a user with login form data.
55
56 This must be a tornado gen.coroutine.
57 It must return the username on successful authentication,
58 and return None on failed authentication.
59 """
60
61 def pre_spawn_start(self, user, spawner):
62 """Hook called before spawning a user's server.
63
64 Can be used to do auth-related startup, e.g. opening PAM sessions.
65 """
66
67 def post_spawn_stop(self, user, spawner):
68 """Hook called after stopping a user container.
69
70 Can be used to do auth-related cleanup, e.g. closing PAM sessions.
71 """
72
73 def check_whitelist(self, user):
74 """
75 Return True if the whitelist is empty or user is in the whitelist.
76 """
77 # Parens aren't necessary here, but they make this easier to parse.
78 return (not self.whitelist) or (user in self.whitelist)
79
80 def add_user(self, user):
81 """Add a new user
82
83 By default, this just adds the user to the whitelist.
84
85 Subclasses may do more extensive things,
86 such as adding actual unix users.
87 """
88 if self.whitelist:
89 self.whitelist.add(user.name)
90
91 def delete_user(self, user):
92 """Triggered when a user is deleted.
93
94 Removes the user from the whitelist.
95 """
96 self.whitelist.discard(user.name)
97
98 def login_url(self, base_url):
99 """Override to register a custom login handler"""
100 return url_path_join(base_url, 'login')
101
102 def logout_url(self, base_url):
103 """Override to register a custom logout handler"""
104 return url_path_join(base_url, 'logout')
105
106 def get_handlers(self, app):
107 """Return any custom handlers the authenticator needs to register
108
109 (e.g. for OAuth)
110 """
111 return [
112 ('/login', LoginHandler),
113 ]
114
115 class LocalAuthenticator(Authenticator):
116 """Base class for Authenticators that work with local *ix users
117
118 Checks for local users, and can attempt to create them if they exist.
119 """
120
121 create_system_users = Bool(False, config=True,
122 help="""If a user is added that doesn't exist on the system,
123 should I try to create the system user?
124 """
125 )
126 system_user_home = Unicode('', config=True,
127 help="""Specify root home directory for users if different from system default.
128
129 Passed to `useradd -b`.
130 If specified, when system users are created their home directories will be created in
131
132 system_user_home/username
133
134 Only has an effect when users are created with `create_system_users=True`.
135 """
136 )
137
138 group_whitelist = Set(
139 config=True,
140 help="Automatically whitelist anyone in this group.",
141 )
142
143 def _group_whitelist_changed(self, name, old, new):
144 if self.whitelist:
145 self.log.warn(
146 "Ignoring username whitelist because group whitelist supplied!"
147 )
148
149 def check_whitelist(self, username):
150 if self.group_whitelist:
151 return self.check_group_whitelist(username)
152 else:
153 return super().check_whitelist(username)
154
155 def check_group_whitelist(self, username):
156 if not self.group_whitelist:
157 return False
158 for grnam in self.group_whitelist:
159 try:
160 group = getgrnam(grnam)
161 except KeyError:
162 self.log.error('No such group: [%s]' % grnam)
163 continue
164 if username in group.gr_mem:
165 return True
166 return False
167
168 @gen.coroutine
169 def add_user(self, user):
170 """Add a new user
171
172 By default, this just adds the user to the whitelist.
173
174 Subclasses may do more extensive things,
175 such as adding actual unix users.
176 """
177 user_exists = yield gen.maybe_future(self.system_user_exists(user))
178 if not user_exists:
179 if self.create_system_users:
180 yield gen.maybe_future(self.add_system_user(user))
181 else:
182 raise KeyError("User %s does not exist." % user.name)
183
184 yield gen.maybe_future(super().add_user(user))
185
186 @staticmethod
187 def system_user_exists(user):
188 """Check if the user exists on the system"""
189 try:
190 pwd.getpwnam(user.name)
191 except KeyError:
192 return False
193 else:
194 return True
195
196 def add_system_user(self, user):
197 """Create a new *ix user on the system. Works on FreeBSD and Linux, at least."""
198 name = user.name
199 for useradd in (
200 ['pw', 'useradd', '-m'],
201 ['useradd', '-m'],
202 ):
203 try:
204 check_output(['which', useradd[0]])
205 except CalledProcessError:
206 continue
207 else:
208 break
209 else:
210 raise RuntimeError("I don't know how to add users on this system.")
211
212 cmd = useradd
213 if self.system_user_home:
214 cmd = cmd + ['-b', self.system_user_home]
215 check_call(cmd + [name])
216
217
218 class PAMAuthenticator(LocalAuthenticator):
219 """Authenticate local *ix users with PAM"""
220 encoding = Unicode('utf8', config=True,
221 help="""The encoding to use for PAM"""
222 )
223 service = Unicode('login', config=True,
224 help="""The PAM service to use for authentication."""
225 )
226
227 @gen.coroutine
228 def authenticate(self, handler, data):
229 """Authenticate with PAM, and return the username if login is successful.
230
231 Return None otherwise.
232 """
233 username = data['username']
234 if not self.check_whitelist(username):
235 return
236 try:
237 pamela.authenticate(username, data['password'], service=self.service)
238 except pamela.PAMError as e:
239 if handler is not None:
240 self.log.warn("PAM Authentication failed (@%s): %s", handler.request.remote_ip, e)
241 else:
242 self.log.warn("PAM Authentication failed: %s", e)
243 else:
244 return username
245
246 def pre_spawn_start(self, user, spawner):
247 """Open PAM session for user"""
248 try:
249 pamela.open_session(user.name, service=self.service)
250 except pamela.PAMError as e:
251 self.log.warn("Failed to open PAM session for %s: %s", user.name, e)
252
253 def post_spawn_stop(self, user, spawner):
254 """Close PAM session for user"""
255 try:
256 pamela.close_session(user.name, service=self.service)
257 except pamela.PAMError as e:
258 self.log.warn("Failed to close PAM session for %s: %s", user.name, e)
259
260
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/jupyterhub/auth.py b/jupyterhub/auth.py
--- a/jupyterhub/auth.py
+++ b/jupyterhub/auth.py
@@ -4,8 +4,11 @@
# Distributed under the terms of the Modified BSD License.
from grp import getgrnam
+import pipes
import pwd
-from subprocess import check_call, check_output, CalledProcessError
+from shutil import which
+import sys
+from subprocess import check_call
from tornado import gen
import pamela
@@ -15,6 +18,7 @@
from .handlers.login import LoginHandler
from .utils import url_path_join
+from .traitlets import Command
class Authenticator(LoggingConfigurable):
"""A class for authentication.
@@ -123,17 +127,36 @@
should I try to create the system user?
"""
)
- system_user_home = Unicode('', config=True,
- help="""Specify root home directory for users if different from system default.
+ add_user_cmd = Command(config=True,
+ help="""The command to use for creating users as a list of strings.
- Passed to `useradd -b`.
- If specified, when system users are created their home directories will be created in
+ For each element in the list, the string USERNAME will be replaced with
+ the user's username. The username will also be appended as the final argument.
- system_user_home/username
+ For Linux, the default value is:
- Only has an effect when users are created with `create_system_users=True`.
+ ['adduser', '-q', '--gecos', '""', '--disabled-password']
+
+ To specify a custom home directory, set this to:
+
+ ['adduser', '-q', '--gecos', '""', '--home', '/customhome/USERNAME', '--disabled-password']
+
+ This will run the command:
+
+ adduser -q --gecos "" --home /customhome/river --disabled-password river
+
+ when the user 'river' is created.
"""
)
+ def _add_user_cmd_default(self):
+ if sys.platform == 'darwin':
+ raise ValueError("I don't know how to create users on OS X")
+ elif which('pw'):
+ # Probably BSD
+ return ['pw', 'useradd', '-m']
+ else:
+ # This appears to be the Linux non-interactive adduser command:
+ return ['adduser', '-q', '--gecos', '""', '--disabled-password']
group_whitelist = Set(
config=True,
@@ -196,23 +219,9 @@
def add_system_user(self, user):
"""Create a new *ix user on the system. Works on FreeBSD and Linux, at least."""
name = user.name
- for useradd in (
- ['pw', 'useradd', '-m'],
- ['useradd', '-m'],
- ):
- try:
- check_output(['which', useradd[0]])
- except CalledProcessError:
- continue
- else:
- break
- else:
- raise RuntimeError("I don't know how to add users on this system.")
-
- cmd = useradd
- if self.system_user_home:
- cmd = cmd + ['-b', self.system_user_home]
- check_call(cmd + [name])
+ cmd = [ arg.replace('USERNAME', name) for arg in self.add_user_cmd ] + [name]
+ self.log.info("Creating user: %s", ' '.join(map(pipes.quote, cmd)))
+ check_call(cmd)
class PAMAuthenticator(LocalAuthenticator):
| {"golden_diff": "diff --git a/jupyterhub/auth.py b/jupyterhub/auth.py\n--- a/jupyterhub/auth.py\n+++ b/jupyterhub/auth.py\n@@ -4,8 +4,11 @@\n # Distributed under the terms of the Modified BSD License.\n \n from grp import getgrnam\n+import pipes\n import pwd\n-from subprocess import check_call, check_output, CalledProcessError\n+from shutil import which\n+import sys\n+from subprocess import check_call\n \n from tornado import gen\n import pamela\n@@ -15,6 +18,7 @@\n \n from .handlers.login import LoginHandler\n from .utils import url_path_join\n+from .traitlets import Command\n \n class Authenticator(LoggingConfigurable):\n \"\"\"A class for authentication.\n@@ -123,17 +127,36 @@\n should I try to create the system user?\n \"\"\"\n )\n- system_user_home = Unicode('', config=True,\n- help=\"\"\"Specify root home directory for users if different from system default.\n+ add_user_cmd = Command(config=True,\n+ help=\"\"\"The command to use for creating users as a list of strings.\n \n- Passed to `useradd -b`.\n- If specified, when system users are created their home directories will be created in\n+ For each element in the list, the string USERNAME will be replaced with\n+ the user's username. The username will also be appended as the final argument.\n \n- system_user_home/username\n+ For Linux, the default value is:\n \n- Only has an effect when users are created with `create_system_users=True`.\n+ ['adduser', '-q', '--gecos', '\"\"', '--disabled-password']\n+ \n+ To specify a custom home directory, set this to:\n+ \n+ ['adduser', '-q', '--gecos', '\"\"', '--home', '/customhome/USERNAME', '--disabled-password']\n+\n+ This will run the command:\n+\n+ adduser -q --gecos \"\" --home /customhome/river --disabled-password river\n+ \n+ when the user 'river' is created.\n \"\"\"\n )\n+ def _add_user_cmd_default(self):\n+ if sys.platform == 'darwin':\n+ raise ValueError(\"I don't know how to create users on OS X\")\n+ elif which('pw'):\n+ # Probably BSD\n+ return ['pw', 'useradd', '-m']\n+ else:\n+ # This appears to be the Linux non-interactive adduser command:\n+ return ['adduser', '-q', '--gecos', '\"\"', '--disabled-password']\n \n group_whitelist = Set(\n config=True,\n@@ -196,23 +219,9 @@\n def add_system_user(self, user):\n \"\"\"Create a new *ix user on the system. Works on FreeBSD and Linux, at least.\"\"\"\n name = user.name\n- for useradd in (\n- ['pw', 'useradd', '-m'],\n- ['useradd', '-m'],\n- ):\n- try:\n- check_output(['which', useradd[0]])\n- except CalledProcessError:\n- continue\n- else:\n- break\n- else:\n- raise RuntimeError(\"I don't know how to add users on this system.\")\n- \n- cmd = useradd\n- if self.system_user_home:\n- cmd = cmd + ['-b', self.system_user_home]\n- check_call(cmd + [name])\n+ cmd = [ arg.replace('USERNAME', name) for arg in self.add_user_cmd ] + [name]\n+ self.log.info(\"Creating user: %s\", ' '.join(map(pipes.quote, cmd)))\n+ check_call(cmd)\n \n \n class PAMAuthenticator(LocalAuthenticator):\n", "issue": "Default shell to bash\nRight now, if users are created by jupyterhub with useradd on a stock Ubuntu, there is no default shell. This leads to `/bin/sh` being used rather than bash. When the web terminal is opened all readline related actions (tab, up, down) don't work. I know I can configure this in `/etc/default/useradd`, but we jut added an option to set the home directory base path. Should we default the shell to bash in the useradd CLI call? Maybe add an config option that defaults to `/bin/bash`. This behavior means that a pretty standard config on Ubuntu will have a (slightly) broken terminal.\n\nI could have sworn that we did something on this previously, when I was debugging things with my Active Directory setup where `SHELL` wasn't being set. But I don't remember what we ended up doing (I may have just set it myself).\n\nThoughts?\n\n", "before_files": [{"content": "\"\"\"Simple PAM authenticator\"\"\"\n\n# Copyright (c) IPython Development Team.\n# Distributed under the terms of the Modified BSD License.\n\nfrom grp import getgrnam\nimport pwd\nfrom subprocess import check_call, check_output, CalledProcessError\n\nfrom tornado import gen\nimport pamela\n\nfrom traitlets.config import LoggingConfigurable\nfrom traitlets import Bool, Set, Unicode, Any\n\nfrom .handlers.login import LoginHandler\nfrom .utils import url_path_join\n\nclass Authenticator(LoggingConfigurable):\n \"\"\"A class for authentication.\n \n The API is one method, `authenticate`, a tornado gen.coroutine.\n \"\"\"\n \n db = Any()\n admin_users = Set(config=True,\n help=\"\"\"set of usernames of admin users\n\n If unspecified, only the user that launches the server will be admin.\n \"\"\"\n )\n whitelist = Set(config=True,\n help=\"\"\"Username whitelist.\n \n Use this to restrict which users can login.\n If empty, allow any user to attempt login.\n \"\"\"\n )\n custom_html = Unicode('',\n help=\"\"\"HTML login form for custom handlers.\n Override in form-based custom authenticators\n that don't use username+password,\n or need custom branding.\n \"\"\"\n )\n login_service = Unicode('',\n help=\"\"\"Name of the login service for external\n login services (e.g. 'GitHub').\n \"\"\"\n )\n \n @gen.coroutine\n def authenticate(self, handler, data):\n \"\"\"Authenticate a user with login form data.\n \n This must be a tornado gen.coroutine.\n It must return the username on successful authentication,\n and return None on failed authentication.\n \"\"\"\n\n def pre_spawn_start(self, user, spawner):\n \"\"\"Hook called before spawning a user's server.\n \n Can be used to do auth-related startup, e.g. opening PAM sessions.\n \"\"\"\n \n def post_spawn_stop(self, user, spawner):\n \"\"\"Hook called after stopping a user container.\n \n Can be used to do auth-related cleanup, e.g. closing PAM sessions.\n \"\"\"\n \n def check_whitelist(self, user):\n \"\"\"\n Return True if the whitelist is empty or user is in the whitelist.\n \"\"\"\n # Parens aren't necessary here, but they make this easier to parse.\n return (not self.whitelist) or (user in self.whitelist)\n\n def add_user(self, user):\n \"\"\"Add a new user\n \n By default, this just adds the user to the whitelist.\n \n Subclasses may do more extensive things,\n such as adding actual unix users.\n \"\"\"\n if self.whitelist:\n self.whitelist.add(user.name)\n \n def delete_user(self, user):\n \"\"\"Triggered when a user is deleted.\n \n Removes the user from the whitelist.\n \"\"\"\n self.whitelist.discard(user.name)\n \n def login_url(self, base_url):\n \"\"\"Override to register a custom login handler\"\"\"\n return url_path_join(base_url, 'login')\n \n def logout_url(self, base_url):\n \"\"\"Override to register a custom logout handler\"\"\"\n return url_path_join(base_url, 'logout')\n \n def get_handlers(self, app):\n \"\"\"Return any custom handlers the authenticator needs to register\n \n (e.g. for OAuth)\n \"\"\"\n return [\n ('/login', LoginHandler),\n ]\n\nclass LocalAuthenticator(Authenticator):\n \"\"\"Base class for Authenticators that work with local *ix users\n \n Checks for local users, and can attempt to create them if they exist.\n \"\"\"\n \n create_system_users = Bool(False, config=True,\n help=\"\"\"If a user is added that doesn't exist on the system,\n should I try to create the system user?\n \"\"\"\n )\n system_user_home = Unicode('', config=True,\n help=\"\"\"Specify root home directory for users if different from system default.\n \n Passed to `useradd -b`.\n If specified, when system users are created their home directories will be created in\n \n system_user_home/username\n \n Only has an effect when users are created with `create_system_users=True`.\n \"\"\"\n )\n\n group_whitelist = Set(\n config=True,\n help=\"Automatically whitelist anyone in this group.\",\n )\n\n def _group_whitelist_changed(self, name, old, new):\n if self.whitelist:\n self.log.warn(\n \"Ignoring username whitelist because group whitelist supplied!\"\n )\n\n def check_whitelist(self, username):\n if self.group_whitelist:\n return self.check_group_whitelist(username)\n else:\n return super().check_whitelist(username)\n\n def check_group_whitelist(self, username):\n if not self.group_whitelist:\n return False\n for grnam in self.group_whitelist:\n try:\n group = getgrnam(grnam)\n except KeyError:\n self.log.error('No such group: [%s]' % grnam)\n continue\n if username in group.gr_mem:\n return True\n return False\n\n @gen.coroutine\n def add_user(self, user):\n \"\"\"Add a new user\n \n By default, this just adds the user to the whitelist.\n \n Subclasses may do more extensive things,\n such as adding actual unix users.\n \"\"\"\n user_exists = yield gen.maybe_future(self.system_user_exists(user))\n if not user_exists:\n if self.create_system_users:\n yield gen.maybe_future(self.add_system_user(user))\n else:\n raise KeyError(\"User %s does not exist.\" % user.name)\n \n yield gen.maybe_future(super().add_user(user))\n \n @staticmethod\n def system_user_exists(user):\n \"\"\"Check if the user exists on the system\"\"\"\n try:\n pwd.getpwnam(user.name)\n except KeyError:\n return False\n else:\n return True\n \n def add_system_user(self, user):\n \"\"\"Create a new *ix user on the system. Works on FreeBSD and Linux, at least.\"\"\"\n name = user.name\n for useradd in (\n ['pw', 'useradd', '-m'],\n ['useradd', '-m'],\n ):\n try:\n check_output(['which', useradd[0]])\n except CalledProcessError:\n continue\n else:\n break\n else:\n raise RuntimeError(\"I don't know how to add users on this system.\")\n \n cmd = useradd\n if self.system_user_home:\n cmd = cmd + ['-b', self.system_user_home]\n check_call(cmd + [name])\n\n\nclass PAMAuthenticator(LocalAuthenticator):\n \"\"\"Authenticate local *ix users with PAM\"\"\"\n encoding = Unicode('utf8', config=True,\n help=\"\"\"The encoding to use for PAM\"\"\"\n )\n service = Unicode('login', config=True,\n help=\"\"\"The PAM service to use for authentication.\"\"\"\n )\n \n @gen.coroutine\n def authenticate(self, handler, data):\n \"\"\"Authenticate with PAM, and return the username if login is successful.\n \n Return None otherwise.\n \"\"\"\n username = data['username']\n if not self.check_whitelist(username):\n return\n try:\n pamela.authenticate(username, data['password'], service=self.service)\n except pamela.PAMError as e:\n if handler is not None:\n self.log.warn(\"PAM Authentication failed (@%s): %s\", handler.request.remote_ip, e)\n else:\n self.log.warn(\"PAM Authentication failed: %s\", e)\n else:\n return username\n \n def pre_spawn_start(self, user, spawner):\n \"\"\"Open PAM session for user\"\"\"\n try:\n pamela.open_session(user.name, service=self.service)\n except pamela.PAMError as e:\n self.log.warn(\"Failed to open PAM session for %s: %s\", user.name, e)\n \n def post_spawn_stop(self, user, spawner):\n \"\"\"Close PAM session for user\"\"\"\n try:\n pamela.close_session(user.name, service=self.service)\n except pamela.PAMError as e:\n self.log.warn(\"Failed to close PAM session for %s: %s\", user.name, e)\n \n", "path": "jupyterhub/auth.py"}], "after_files": [{"content": "\"\"\"Simple PAM authenticator\"\"\"\n\n# Copyright (c) IPython Development Team.\n# Distributed under the terms of the Modified BSD License.\n\nfrom grp import getgrnam\nimport pipes\nimport pwd\nfrom shutil import which\nimport sys\nfrom subprocess import check_call\n\nfrom tornado import gen\nimport pamela\n\nfrom traitlets.config import LoggingConfigurable\nfrom traitlets import Bool, Set, Unicode, Any\n\nfrom .handlers.login import LoginHandler\nfrom .utils import url_path_join\nfrom .traitlets import Command\n\nclass Authenticator(LoggingConfigurable):\n \"\"\"A class for authentication.\n \n The API is one method, `authenticate`, a tornado gen.coroutine.\n \"\"\"\n \n db = Any()\n admin_users = Set(config=True,\n help=\"\"\"set of usernames of admin users\n\n If unspecified, only the user that launches the server will be admin.\n \"\"\"\n )\n whitelist = Set(config=True,\n help=\"\"\"Username whitelist.\n \n Use this to restrict which users can login.\n If empty, allow any user to attempt login.\n \"\"\"\n )\n custom_html = Unicode('',\n help=\"\"\"HTML login form for custom handlers.\n Override in form-based custom authenticators\n that don't use username+password,\n or need custom branding.\n \"\"\"\n )\n login_service = Unicode('',\n help=\"\"\"Name of the login service for external\n login services (e.g. 'GitHub').\n \"\"\"\n )\n \n @gen.coroutine\n def authenticate(self, handler, data):\n \"\"\"Authenticate a user with login form data.\n \n This must be a tornado gen.coroutine.\n It must return the username on successful authentication,\n and return None on failed authentication.\n \"\"\"\n\n def pre_spawn_start(self, user, spawner):\n \"\"\"Hook called before spawning a user's server.\n \n Can be used to do auth-related startup, e.g. opening PAM sessions.\n \"\"\"\n \n def post_spawn_stop(self, user, spawner):\n \"\"\"Hook called after stopping a user container.\n \n Can be used to do auth-related cleanup, e.g. closing PAM sessions.\n \"\"\"\n \n def check_whitelist(self, user):\n \"\"\"\n Return True if the whitelist is empty or user is in the whitelist.\n \"\"\"\n # Parens aren't necessary here, but they make this easier to parse.\n return (not self.whitelist) or (user in self.whitelist)\n\n def add_user(self, user):\n \"\"\"Add a new user\n \n By default, this just adds the user to the whitelist.\n \n Subclasses may do more extensive things,\n such as adding actual unix users.\n \"\"\"\n if self.whitelist:\n self.whitelist.add(user.name)\n \n def delete_user(self, user):\n \"\"\"Triggered when a user is deleted.\n \n Removes the user from the whitelist.\n \"\"\"\n self.whitelist.discard(user.name)\n \n def login_url(self, base_url):\n \"\"\"Override to register a custom login handler\"\"\"\n return url_path_join(base_url, 'login')\n \n def logout_url(self, base_url):\n \"\"\"Override to register a custom logout handler\"\"\"\n return url_path_join(base_url, 'logout')\n \n def get_handlers(self, app):\n \"\"\"Return any custom handlers the authenticator needs to register\n \n (e.g. for OAuth)\n \"\"\"\n return [\n ('/login', LoginHandler),\n ]\n\nclass LocalAuthenticator(Authenticator):\n \"\"\"Base class for Authenticators that work with local *ix users\n \n Checks for local users, and can attempt to create them if they exist.\n \"\"\"\n \n create_system_users = Bool(False, config=True,\n help=\"\"\"If a user is added that doesn't exist on the system,\n should I try to create the system user?\n \"\"\"\n )\n add_user_cmd = Command(config=True,\n help=\"\"\"The command to use for creating users as a list of strings.\n \n For each element in the list, the string USERNAME will be replaced with\n the user's username. The username will also be appended as the final argument.\n \n For Linux, the default value is:\n \n ['adduser', '-q', '--gecos', '\"\"', '--disabled-password']\n \n To specify a custom home directory, set this to:\n \n ['adduser', '-q', '--gecos', '\"\"', '--home', '/customhome/USERNAME', '--disabled-password']\n\n This will run the command:\n\n adduser -q --gecos \"\" --home /customhome/river --disabled-password river\n \n when the user 'river' is created.\n \"\"\"\n )\n def _add_user_cmd_default(self):\n if sys.platform == 'darwin':\n raise ValueError(\"I don't know how to create users on OS X\")\n elif which('pw'):\n # Probably BSD\n return ['pw', 'useradd', '-m']\n else:\n # This appears to be the Linux non-interactive adduser command:\n return ['adduser', '-q', '--gecos', '\"\"', '--disabled-password']\n\n group_whitelist = Set(\n config=True,\n help=\"Automatically whitelist anyone in this group.\",\n )\n\n def _group_whitelist_changed(self, name, old, new):\n if self.whitelist:\n self.log.warn(\n \"Ignoring username whitelist because group whitelist supplied!\"\n )\n\n def check_whitelist(self, username):\n if self.group_whitelist:\n return self.check_group_whitelist(username)\n else:\n return super().check_whitelist(username)\n\n def check_group_whitelist(self, username):\n if not self.group_whitelist:\n return False\n for grnam in self.group_whitelist:\n try:\n group = getgrnam(grnam)\n except KeyError:\n self.log.error('No such group: [%s]' % grnam)\n continue\n if username in group.gr_mem:\n return True\n return False\n\n @gen.coroutine\n def add_user(self, user):\n \"\"\"Add a new user\n \n By default, this just adds the user to the whitelist.\n \n Subclasses may do more extensive things,\n such as adding actual unix users.\n \"\"\"\n user_exists = yield gen.maybe_future(self.system_user_exists(user))\n if not user_exists:\n if self.create_system_users:\n yield gen.maybe_future(self.add_system_user(user))\n else:\n raise KeyError(\"User %s does not exist.\" % user.name)\n \n yield gen.maybe_future(super().add_user(user))\n \n @staticmethod\n def system_user_exists(user):\n \"\"\"Check if the user exists on the system\"\"\"\n try:\n pwd.getpwnam(user.name)\n except KeyError:\n return False\n else:\n return True\n \n def add_system_user(self, user):\n \"\"\"Create a new *ix user on the system. Works on FreeBSD and Linux, at least.\"\"\"\n name = user.name\n cmd = [ arg.replace('USERNAME', name) for arg in self.add_user_cmd ] + [name]\n self.log.info(\"Creating user: %s\", ' '.join(map(pipes.quote, cmd)))\n check_call(cmd)\n\n\nclass PAMAuthenticator(LocalAuthenticator):\n \"\"\"Authenticate local *ix users with PAM\"\"\"\n encoding = Unicode('utf8', config=True,\n help=\"\"\"The encoding to use for PAM\"\"\"\n )\n service = Unicode('login', config=True,\n help=\"\"\"The PAM service to use for authentication.\"\"\"\n )\n \n @gen.coroutine\n def authenticate(self, handler, data):\n \"\"\"Authenticate with PAM, and return the username if login is successful.\n \n Return None otherwise.\n \"\"\"\n username = data['username']\n if not self.check_whitelist(username):\n return\n try:\n pamela.authenticate(username, data['password'], service=self.service)\n except pamela.PAMError as e:\n if handler is not None:\n self.log.warn(\"PAM Authentication failed (@%s): %s\", handler.request.remote_ip, e)\n else:\n self.log.warn(\"PAM Authentication failed: %s\", e)\n else:\n return username\n \n def pre_spawn_start(self, user, spawner):\n \"\"\"Open PAM session for user\"\"\"\n try:\n pamela.open_session(user.name, service=self.service)\n except pamela.PAMError as e:\n self.log.warn(\"Failed to open PAM session for %s: %s\", user.name, e)\n \n def post_spawn_stop(self, user, spawner):\n \"\"\"Close PAM session for user\"\"\"\n try:\n pamela.close_session(user.name, service=self.service)\n except pamela.PAMError as e:\n self.log.warn(\"Failed to close PAM session for %s: %s\", user.name, e)\n \n", "path": "jupyterhub/auth.py"}]} | 2,868 | 816 |
gh_patches_debug_39150 | rasdani/github-patches | git_diff | python-pillow__Pillow-3673 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
DDS Plugin cannot handle uncompressed files
Loading a DDS file without compression gives me the following error:
```
File "<SNIP>\PIL\Image.py", line 2676, in open
im = _open_core(fp, filename, prefix)
File "<SNIP>\PIL\Image.py", line 2658, in _open_core
im = factory(fp, filename)
File "<SNIP>\PIL\ImageFile.py", line 103, in __init__
self._open()
File "<SNIP>\PIL\DdsImagePlugin.py", line 158, in _open
(fourcc))
NotImplementedError: Unimplemented pixel format b'\x00\x00\x00\x00'
```
To demonstrate the issue, I created a simple DDS file as demonstration. Calling `Image.open()` on that file will not work.
[uncompressed_dds.zip](https://github.com/python-pillow/Pillow/files/2872238/uncompressed_dds.zip)
Looking over the source code of DdsImagePlugin, it seems that only FourCC DDS files are supported.
Pillow version: 5.4.1
Python version: 3.7.2
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/PIL/DdsImagePlugin.py`
Content:
```
1 """
2 A Pillow loader for .dds files (S3TC-compressed aka DXTC)
3 Jerome Leclanche <[email protected]>
4
5 Documentation:
6 https://web.archive.org/web/20170802060935/http://oss.sgi.com/projects/ogl-sample/registry/EXT/texture_compression_s3tc.txt
7
8 The contents of this file are hereby released in the public domain (CC0)
9 Full text of the CC0 license:
10 https://creativecommons.org/publicdomain/zero/1.0/
11 """
12
13 import struct
14 from io import BytesIO
15 from . import Image, ImageFile
16
17
18 # Magic ("DDS ")
19 DDS_MAGIC = 0x20534444
20
21 # DDS flags
22 DDSD_CAPS = 0x1
23 DDSD_HEIGHT = 0x2
24 DDSD_WIDTH = 0x4
25 DDSD_PITCH = 0x8
26 DDSD_PIXELFORMAT = 0x1000
27 DDSD_MIPMAPCOUNT = 0x20000
28 DDSD_LINEARSIZE = 0x80000
29 DDSD_DEPTH = 0x800000
30
31 # DDS caps
32 DDSCAPS_COMPLEX = 0x8
33 DDSCAPS_TEXTURE = 0x1000
34 DDSCAPS_MIPMAP = 0x400000
35
36 DDSCAPS2_CUBEMAP = 0x200
37 DDSCAPS2_CUBEMAP_POSITIVEX = 0x400
38 DDSCAPS2_CUBEMAP_NEGATIVEX = 0x800
39 DDSCAPS2_CUBEMAP_POSITIVEY = 0x1000
40 DDSCAPS2_CUBEMAP_NEGATIVEY = 0x2000
41 DDSCAPS2_CUBEMAP_POSITIVEZ = 0x4000
42 DDSCAPS2_CUBEMAP_NEGATIVEZ = 0x8000
43 DDSCAPS2_VOLUME = 0x200000
44
45 # Pixel Format
46 DDPF_ALPHAPIXELS = 0x1
47 DDPF_ALPHA = 0x2
48 DDPF_FOURCC = 0x4
49 DDPF_PALETTEINDEXED8 = 0x20
50 DDPF_RGB = 0x40
51 DDPF_LUMINANCE = 0x20000
52
53
54 # dds.h
55
56 DDS_FOURCC = DDPF_FOURCC
57 DDS_RGB = DDPF_RGB
58 DDS_RGBA = DDPF_RGB | DDPF_ALPHAPIXELS
59 DDS_LUMINANCE = DDPF_LUMINANCE
60 DDS_LUMINANCEA = DDPF_LUMINANCE | DDPF_ALPHAPIXELS
61 DDS_ALPHA = DDPF_ALPHA
62 DDS_PAL8 = DDPF_PALETTEINDEXED8
63
64 DDS_HEADER_FLAGS_TEXTURE = (DDSD_CAPS | DDSD_HEIGHT | DDSD_WIDTH |
65 DDSD_PIXELFORMAT)
66 DDS_HEADER_FLAGS_MIPMAP = DDSD_MIPMAPCOUNT
67 DDS_HEADER_FLAGS_VOLUME = DDSD_DEPTH
68 DDS_HEADER_FLAGS_PITCH = DDSD_PITCH
69 DDS_HEADER_FLAGS_LINEARSIZE = DDSD_LINEARSIZE
70
71 DDS_HEIGHT = DDSD_HEIGHT
72 DDS_WIDTH = DDSD_WIDTH
73
74 DDS_SURFACE_FLAGS_TEXTURE = DDSCAPS_TEXTURE
75 DDS_SURFACE_FLAGS_MIPMAP = DDSCAPS_COMPLEX | DDSCAPS_MIPMAP
76 DDS_SURFACE_FLAGS_CUBEMAP = DDSCAPS_COMPLEX
77
78 DDS_CUBEMAP_POSITIVEX = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_POSITIVEX
79 DDS_CUBEMAP_NEGATIVEX = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_NEGATIVEX
80 DDS_CUBEMAP_POSITIVEY = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_POSITIVEY
81 DDS_CUBEMAP_NEGATIVEY = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_NEGATIVEY
82 DDS_CUBEMAP_POSITIVEZ = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_POSITIVEZ
83 DDS_CUBEMAP_NEGATIVEZ = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_NEGATIVEZ
84
85
86 # DXT1
87 DXT1_FOURCC = 0x31545844
88
89 # DXT3
90 DXT3_FOURCC = 0x33545844
91
92 # DXT5
93 DXT5_FOURCC = 0x35545844
94
95
96 # dxgiformat.h
97
98 DXGI_FORMAT_BC7_TYPELESS = 97
99 DXGI_FORMAT_BC7_UNORM = 98
100 DXGI_FORMAT_BC7_UNORM_SRGB = 99
101
102
103 class DdsImageFile(ImageFile.ImageFile):
104 format = "DDS"
105 format_description = "DirectDraw Surface"
106
107 def _open(self):
108 magic, header_size = struct.unpack("<II", self.fp.read(8))
109 if header_size != 124:
110 raise IOError("Unsupported header size %r" % (header_size))
111 header_bytes = self.fp.read(header_size - 4)
112 if len(header_bytes) != 120:
113 raise IOError("Incomplete header: %s bytes" % len(header_bytes))
114 header = BytesIO(header_bytes)
115
116 flags, height, width = struct.unpack("<3I", header.read(12))
117 self._size = (width, height)
118 self.mode = "RGBA"
119
120 pitch, depth, mipmaps = struct.unpack("<3I", header.read(12))
121 struct.unpack("<11I", header.read(44)) # reserved
122
123 # pixel format
124 pfsize, pfflags = struct.unpack("<2I", header.read(8))
125 fourcc = header.read(4)
126 bitcount, rmask, gmask, bmask, amask = struct.unpack("<5I",
127 header.read(20))
128
129 data_start = header_size + 4
130 n = 0
131 if fourcc == b"DXT1":
132 self.pixel_format = "DXT1"
133 n = 1
134 elif fourcc == b"DXT3":
135 self.pixel_format = "DXT3"
136 n = 2
137 elif fourcc == b"DXT5":
138 self.pixel_format = "DXT5"
139 n = 3
140 elif fourcc == b"DX10":
141 data_start += 20
142 # ignoring flags which pertain to volume textures and cubemaps
143 dxt10 = BytesIO(self.fp.read(20))
144 dxgi_format, dimension = struct.unpack("<II", dxt10.read(8))
145 if dxgi_format in (DXGI_FORMAT_BC7_TYPELESS,
146 DXGI_FORMAT_BC7_UNORM):
147 self.pixel_format = "BC7"
148 n = 7
149 elif dxgi_format == DXGI_FORMAT_BC7_UNORM_SRGB:
150 self.pixel_format = "BC7"
151 self.im_info["gamma"] = 1/2.2
152 n = 7
153 else:
154 raise NotImplementedError("Unimplemented DXGI format %d" %
155 (dxgi_format))
156 else:
157 raise NotImplementedError("Unimplemented pixel format %r" %
158 (fourcc))
159
160 self.tile = [
161 ("bcn", (0, 0) + self.size, data_start, (n))
162 ]
163
164 def load_seek(self, pos):
165 pass
166
167
168 def _validate(prefix):
169 return prefix[:4] == b"DDS "
170
171
172 Image.register_open(DdsImageFile.format, DdsImageFile, _validate)
173 Image.register_extension(DdsImageFile.format, ".dds")
174
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/PIL/DdsImagePlugin.py b/src/PIL/DdsImagePlugin.py
--- a/src/PIL/DdsImagePlugin.py
+++ b/src/PIL/DdsImagePlugin.py
@@ -123,43 +123,52 @@
# pixel format
pfsize, pfflags = struct.unpack("<2I", header.read(8))
fourcc = header.read(4)
- bitcount, rmask, gmask, bmask, amask = struct.unpack("<5I",
- header.read(20))
-
- data_start = header_size + 4
- n = 0
- if fourcc == b"DXT1":
- self.pixel_format = "DXT1"
- n = 1
- elif fourcc == b"DXT3":
- self.pixel_format = "DXT3"
- n = 2
- elif fourcc == b"DXT5":
- self.pixel_format = "DXT5"
- n = 3
- elif fourcc == b"DX10":
- data_start += 20
- # ignoring flags which pertain to volume textures and cubemaps
- dxt10 = BytesIO(self.fp.read(20))
- dxgi_format, dimension = struct.unpack("<II", dxt10.read(8))
- if dxgi_format in (DXGI_FORMAT_BC7_TYPELESS,
- DXGI_FORMAT_BC7_UNORM):
- self.pixel_format = "BC7"
- n = 7
- elif dxgi_format == DXGI_FORMAT_BC7_UNORM_SRGB:
- self.pixel_format = "BC7"
- self.im_info["gamma"] = 1/2.2
- n = 7
- else:
- raise NotImplementedError("Unimplemented DXGI format %d" %
- (dxgi_format))
+ bitcount, = struct.unpack("<I", header.read(4))
+ masks = struct.unpack("<4I", header.read(16))
+ if pfflags & 0x40:
+ # DDPF_RGB - Texture contains uncompressed RGB data
+ masks = {mask: ["R", "G", "B", "A"][i] for i, mask in enumerate(masks)}
+ rawmode = ""
+ if bitcount == 32:
+ rawmode += masks[0xff000000]
+ rawmode += masks[0xff0000] + masks[0xff00] + masks[0xff]
+
+ self.tile = [("raw", (0, 0) + self.size, 0, (rawmode, 0, 1))]
else:
- raise NotImplementedError("Unimplemented pixel format %r" %
- (fourcc))
+ data_start = header_size + 4
+ n = 0
+ if fourcc == b"DXT1":
+ self.pixel_format = "DXT1"
+ n = 1
+ elif fourcc == b"DXT3":
+ self.pixel_format = "DXT3"
+ n = 2
+ elif fourcc == b"DXT5":
+ self.pixel_format = "DXT5"
+ n = 3
+ elif fourcc == b"DX10":
+ data_start += 20
+ # ignoring flags which pertain to volume textures and cubemaps
+ dxt10 = BytesIO(self.fp.read(20))
+ dxgi_format, dimension = struct.unpack("<II", dxt10.read(8))
+ if dxgi_format in (DXGI_FORMAT_BC7_TYPELESS,
+ DXGI_FORMAT_BC7_UNORM):
+ self.pixel_format = "BC7"
+ n = 7
+ elif dxgi_format == DXGI_FORMAT_BC7_UNORM_SRGB:
+ self.pixel_format = "BC7"
+ self.im_info["gamma"] = 1/2.2
+ n = 7
+ else:
+ raise NotImplementedError("Unimplemented DXGI format %d" %
+ (dxgi_format))
+ else:
+ raise NotImplementedError("Unimplemented pixel format %r" %
+ (fourcc))
- self.tile = [
- ("bcn", (0, 0) + self.size, data_start, (n))
- ]
+ self.tile = [
+ ("bcn", (0, 0) + self.size, data_start, (n))
+ ]
def load_seek(self, pos):
pass
| {"golden_diff": "diff --git a/src/PIL/DdsImagePlugin.py b/src/PIL/DdsImagePlugin.py\n--- a/src/PIL/DdsImagePlugin.py\n+++ b/src/PIL/DdsImagePlugin.py\n@@ -123,43 +123,52 @@\n # pixel format\n pfsize, pfflags = struct.unpack(\"<2I\", header.read(8))\n fourcc = header.read(4)\n- bitcount, rmask, gmask, bmask, amask = struct.unpack(\"<5I\",\n- header.read(20))\n-\n- data_start = header_size + 4\n- n = 0\n- if fourcc == b\"DXT1\":\n- self.pixel_format = \"DXT1\"\n- n = 1\n- elif fourcc == b\"DXT3\":\n- self.pixel_format = \"DXT3\"\n- n = 2\n- elif fourcc == b\"DXT5\":\n- self.pixel_format = \"DXT5\"\n- n = 3\n- elif fourcc == b\"DX10\":\n- data_start += 20\n- # ignoring flags which pertain to volume textures and cubemaps\n- dxt10 = BytesIO(self.fp.read(20))\n- dxgi_format, dimension = struct.unpack(\"<II\", dxt10.read(8))\n- if dxgi_format in (DXGI_FORMAT_BC7_TYPELESS,\n- DXGI_FORMAT_BC7_UNORM):\n- self.pixel_format = \"BC7\"\n- n = 7\n- elif dxgi_format == DXGI_FORMAT_BC7_UNORM_SRGB:\n- self.pixel_format = \"BC7\"\n- self.im_info[\"gamma\"] = 1/2.2\n- n = 7\n- else:\n- raise NotImplementedError(\"Unimplemented DXGI format %d\" %\n- (dxgi_format))\n+ bitcount, = struct.unpack(\"<I\", header.read(4))\n+ masks = struct.unpack(\"<4I\", header.read(16))\n+ if pfflags & 0x40:\n+ # DDPF_RGB - Texture contains uncompressed RGB data\n+ masks = {mask: [\"R\", \"G\", \"B\", \"A\"][i] for i, mask in enumerate(masks)}\n+ rawmode = \"\"\n+ if bitcount == 32:\n+ rawmode += masks[0xff000000]\n+ rawmode += masks[0xff0000] + masks[0xff00] + masks[0xff]\n+\n+ self.tile = [(\"raw\", (0, 0) + self.size, 0, (rawmode, 0, 1))]\n else:\n- raise NotImplementedError(\"Unimplemented pixel format %r\" %\n- (fourcc))\n+ data_start = header_size + 4\n+ n = 0\n+ if fourcc == b\"DXT1\":\n+ self.pixel_format = \"DXT1\"\n+ n = 1\n+ elif fourcc == b\"DXT3\":\n+ self.pixel_format = \"DXT3\"\n+ n = 2\n+ elif fourcc == b\"DXT5\":\n+ self.pixel_format = \"DXT5\"\n+ n = 3\n+ elif fourcc == b\"DX10\":\n+ data_start += 20\n+ # ignoring flags which pertain to volume textures and cubemaps\n+ dxt10 = BytesIO(self.fp.read(20))\n+ dxgi_format, dimension = struct.unpack(\"<II\", dxt10.read(8))\n+ if dxgi_format in (DXGI_FORMAT_BC7_TYPELESS,\n+ DXGI_FORMAT_BC7_UNORM):\n+ self.pixel_format = \"BC7\"\n+ n = 7\n+ elif dxgi_format == DXGI_FORMAT_BC7_UNORM_SRGB:\n+ self.pixel_format = \"BC7\"\n+ self.im_info[\"gamma\"] = 1/2.2\n+ n = 7\n+ else:\n+ raise NotImplementedError(\"Unimplemented DXGI format %d\" %\n+ (dxgi_format))\n+ else:\n+ raise NotImplementedError(\"Unimplemented pixel format %r\" %\n+ (fourcc))\n \n- self.tile = [\n- (\"bcn\", (0, 0) + self.size, data_start, (n))\n- ]\n+ self.tile = [\n+ (\"bcn\", (0, 0) + self.size, data_start, (n))\n+ ]\n \n def load_seek(self, pos):\n pass\n", "issue": "DDS Plugin cannot handle uncompressed files\nLoading a DDS file without compression gives me the following error:\r\n```\r\n File \"<SNIP>\\PIL\\Image.py\", line 2676, in open\r\n im = _open_core(fp, filename, prefix)\r\n File \"<SNIP>\\PIL\\Image.py\", line 2658, in _open_core\r\n im = factory(fp, filename)\r\n File \"<SNIP>\\PIL\\ImageFile.py\", line 103, in __init__\r\n self._open()\r\n File \"<SNIP>\\PIL\\DdsImagePlugin.py\", line 158, in _open\r\n (fourcc))\r\nNotImplementedError: Unimplemented pixel format b'\\x00\\x00\\x00\\x00'\r\n```\r\n\r\nTo demonstrate the issue, I created a simple DDS file as demonstration. Calling `Image.open()` on that file will not work. \r\n[uncompressed_dds.zip](https://github.com/python-pillow/Pillow/files/2872238/uncompressed_dds.zip)\r\n\r\nLooking over the source code of DdsImagePlugin, it seems that only FourCC DDS files are supported.\r\n\r\nPillow version: 5.4.1\r\nPython version: 3.7.2\n", "before_files": [{"content": "\"\"\"\nA Pillow loader for .dds files (S3TC-compressed aka DXTC)\nJerome Leclanche <[email protected]>\n\nDocumentation:\n https://web.archive.org/web/20170802060935/http://oss.sgi.com/projects/ogl-sample/registry/EXT/texture_compression_s3tc.txt\n\nThe contents of this file are hereby released in the public domain (CC0)\nFull text of the CC0 license:\n https://creativecommons.org/publicdomain/zero/1.0/\n\"\"\"\n\nimport struct\nfrom io import BytesIO\nfrom . import Image, ImageFile\n\n\n# Magic (\"DDS \")\nDDS_MAGIC = 0x20534444\n\n# DDS flags\nDDSD_CAPS = 0x1\nDDSD_HEIGHT = 0x2\nDDSD_WIDTH = 0x4\nDDSD_PITCH = 0x8\nDDSD_PIXELFORMAT = 0x1000\nDDSD_MIPMAPCOUNT = 0x20000\nDDSD_LINEARSIZE = 0x80000\nDDSD_DEPTH = 0x800000\n\n# DDS caps\nDDSCAPS_COMPLEX = 0x8\nDDSCAPS_TEXTURE = 0x1000\nDDSCAPS_MIPMAP = 0x400000\n\nDDSCAPS2_CUBEMAP = 0x200\nDDSCAPS2_CUBEMAP_POSITIVEX = 0x400\nDDSCAPS2_CUBEMAP_NEGATIVEX = 0x800\nDDSCAPS2_CUBEMAP_POSITIVEY = 0x1000\nDDSCAPS2_CUBEMAP_NEGATIVEY = 0x2000\nDDSCAPS2_CUBEMAP_POSITIVEZ = 0x4000\nDDSCAPS2_CUBEMAP_NEGATIVEZ = 0x8000\nDDSCAPS2_VOLUME = 0x200000\n\n# Pixel Format\nDDPF_ALPHAPIXELS = 0x1\nDDPF_ALPHA = 0x2\nDDPF_FOURCC = 0x4\nDDPF_PALETTEINDEXED8 = 0x20\nDDPF_RGB = 0x40\nDDPF_LUMINANCE = 0x20000\n\n\n# dds.h\n\nDDS_FOURCC = DDPF_FOURCC\nDDS_RGB = DDPF_RGB\nDDS_RGBA = DDPF_RGB | DDPF_ALPHAPIXELS\nDDS_LUMINANCE = DDPF_LUMINANCE\nDDS_LUMINANCEA = DDPF_LUMINANCE | DDPF_ALPHAPIXELS\nDDS_ALPHA = DDPF_ALPHA\nDDS_PAL8 = DDPF_PALETTEINDEXED8\n\nDDS_HEADER_FLAGS_TEXTURE = (DDSD_CAPS | DDSD_HEIGHT | DDSD_WIDTH |\n DDSD_PIXELFORMAT)\nDDS_HEADER_FLAGS_MIPMAP = DDSD_MIPMAPCOUNT\nDDS_HEADER_FLAGS_VOLUME = DDSD_DEPTH\nDDS_HEADER_FLAGS_PITCH = DDSD_PITCH\nDDS_HEADER_FLAGS_LINEARSIZE = DDSD_LINEARSIZE\n\nDDS_HEIGHT = DDSD_HEIGHT\nDDS_WIDTH = DDSD_WIDTH\n\nDDS_SURFACE_FLAGS_TEXTURE = DDSCAPS_TEXTURE\nDDS_SURFACE_FLAGS_MIPMAP = DDSCAPS_COMPLEX | DDSCAPS_MIPMAP\nDDS_SURFACE_FLAGS_CUBEMAP = DDSCAPS_COMPLEX\n\nDDS_CUBEMAP_POSITIVEX = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_POSITIVEX\nDDS_CUBEMAP_NEGATIVEX = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_NEGATIVEX\nDDS_CUBEMAP_POSITIVEY = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_POSITIVEY\nDDS_CUBEMAP_NEGATIVEY = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_NEGATIVEY\nDDS_CUBEMAP_POSITIVEZ = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_POSITIVEZ\nDDS_CUBEMAP_NEGATIVEZ = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_NEGATIVEZ\n\n\n# DXT1\nDXT1_FOURCC = 0x31545844\n\n# DXT3\nDXT3_FOURCC = 0x33545844\n\n# DXT5\nDXT5_FOURCC = 0x35545844\n\n\n# dxgiformat.h\n\nDXGI_FORMAT_BC7_TYPELESS = 97\nDXGI_FORMAT_BC7_UNORM = 98\nDXGI_FORMAT_BC7_UNORM_SRGB = 99\n\n\nclass DdsImageFile(ImageFile.ImageFile):\n format = \"DDS\"\n format_description = \"DirectDraw Surface\"\n\n def _open(self):\n magic, header_size = struct.unpack(\"<II\", self.fp.read(8))\n if header_size != 124:\n raise IOError(\"Unsupported header size %r\" % (header_size))\n header_bytes = self.fp.read(header_size - 4)\n if len(header_bytes) != 120:\n raise IOError(\"Incomplete header: %s bytes\" % len(header_bytes))\n header = BytesIO(header_bytes)\n\n flags, height, width = struct.unpack(\"<3I\", header.read(12))\n self._size = (width, height)\n self.mode = \"RGBA\"\n\n pitch, depth, mipmaps = struct.unpack(\"<3I\", header.read(12))\n struct.unpack(\"<11I\", header.read(44)) # reserved\n\n # pixel format\n pfsize, pfflags = struct.unpack(\"<2I\", header.read(8))\n fourcc = header.read(4)\n bitcount, rmask, gmask, bmask, amask = struct.unpack(\"<5I\",\n header.read(20))\n\n data_start = header_size + 4\n n = 0\n if fourcc == b\"DXT1\":\n self.pixel_format = \"DXT1\"\n n = 1\n elif fourcc == b\"DXT3\":\n self.pixel_format = \"DXT3\"\n n = 2\n elif fourcc == b\"DXT5\":\n self.pixel_format = \"DXT5\"\n n = 3\n elif fourcc == b\"DX10\":\n data_start += 20\n # ignoring flags which pertain to volume textures and cubemaps\n dxt10 = BytesIO(self.fp.read(20))\n dxgi_format, dimension = struct.unpack(\"<II\", dxt10.read(8))\n if dxgi_format in (DXGI_FORMAT_BC7_TYPELESS,\n DXGI_FORMAT_BC7_UNORM):\n self.pixel_format = \"BC7\"\n n = 7\n elif dxgi_format == DXGI_FORMAT_BC7_UNORM_SRGB:\n self.pixel_format = \"BC7\"\n self.im_info[\"gamma\"] = 1/2.2\n n = 7\n else:\n raise NotImplementedError(\"Unimplemented DXGI format %d\" %\n (dxgi_format))\n else:\n raise NotImplementedError(\"Unimplemented pixel format %r\" %\n (fourcc))\n\n self.tile = [\n (\"bcn\", (0, 0) + self.size, data_start, (n))\n ]\n\n def load_seek(self, pos):\n pass\n\n\ndef _validate(prefix):\n return prefix[:4] == b\"DDS \"\n\n\nImage.register_open(DdsImageFile.format, DdsImageFile, _validate)\nImage.register_extension(DdsImageFile.format, \".dds\")\n", "path": "src/PIL/DdsImagePlugin.py"}], "after_files": [{"content": "\"\"\"\nA Pillow loader for .dds files (S3TC-compressed aka DXTC)\nJerome Leclanche <[email protected]>\n\nDocumentation:\n https://web.archive.org/web/20170802060935/http://oss.sgi.com/projects/ogl-sample/registry/EXT/texture_compression_s3tc.txt\n\nThe contents of this file are hereby released in the public domain (CC0)\nFull text of the CC0 license:\n https://creativecommons.org/publicdomain/zero/1.0/\n\"\"\"\n\nimport struct\nfrom io import BytesIO\nfrom . import Image, ImageFile\n\n\n# Magic (\"DDS \")\nDDS_MAGIC = 0x20534444\n\n# DDS flags\nDDSD_CAPS = 0x1\nDDSD_HEIGHT = 0x2\nDDSD_WIDTH = 0x4\nDDSD_PITCH = 0x8\nDDSD_PIXELFORMAT = 0x1000\nDDSD_MIPMAPCOUNT = 0x20000\nDDSD_LINEARSIZE = 0x80000\nDDSD_DEPTH = 0x800000\n\n# DDS caps\nDDSCAPS_COMPLEX = 0x8\nDDSCAPS_TEXTURE = 0x1000\nDDSCAPS_MIPMAP = 0x400000\n\nDDSCAPS2_CUBEMAP = 0x200\nDDSCAPS2_CUBEMAP_POSITIVEX = 0x400\nDDSCAPS2_CUBEMAP_NEGATIVEX = 0x800\nDDSCAPS2_CUBEMAP_POSITIVEY = 0x1000\nDDSCAPS2_CUBEMAP_NEGATIVEY = 0x2000\nDDSCAPS2_CUBEMAP_POSITIVEZ = 0x4000\nDDSCAPS2_CUBEMAP_NEGATIVEZ = 0x8000\nDDSCAPS2_VOLUME = 0x200000\n\n# Pixel Format\nDDPF_ALPHAPIXELS = 0x1\nDDPF_ALPHA = 0x2\nDDPF_FOURCC = 0x4\nDDPF_PALETTEINDEXED8 = 0x20\nDDPF_RGB = 0x40\nDDPF_LUMINANCE = 0x20000\n\n\n# dds.h\n\nDDS_FOURCC = DDPF_FOURCC\nDDS_RGB = DDPF_RGB\nDDS_RGBA = DDPF_RGB | DDPF_ALPHAPIXELS\nDDS_LUMINANCE = DDPF_LUMINANCE\nDDS_LUMINANCEA = DDPF_LUMINANCE | DDPF_ALPHAPIXELS\nDDS_ALPHA = DDPF_ALPHA\nDDS_PAL8 = DDPF_PALETTEINDEXED8\n\nDDS_HEADER_FLAGS_TEXTURE = (DDSD_CAPS | DDSD_HEIGHT | DDSD_WIDTH |\n DDSD_PIXELFORMAT)\nDDS_HEADER_FLAGS_MIPMAP = DDSD_MIPMAPCOUNT\nDDS_HEADER_FLAGS_VOLUME = DDSD_DEPTH\nDDS_HEADER_FLAGS_PITCH = DDSD_PITCH\nDDS_HEADER_FLAGS_LINEARSIZE = DDSD_LINEARSIZE\n\nDDS_HEIGHT = DDSD_HEIGHT\nDDS_WIDTH = DDSD_WIDTH\n\nDDS_SURFACE_FLAGS_TEXTURE = DDSCAPS_TEXTURE\nDDS_SURFACE_FLAGS_MIPMAP = DDSCAPS_COMPLEX | DDSCAPS_MIPMAP\nDDS_SURFACE_FLAGS_CUBEMAP = DDSCAPS_COMPLEX\n\nDDS_CUBEMAP_POSITIVEX = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_POSITIVEX\nDDS_CUBEMAP_NEGATIVEX = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_NEGATIVEX\nDDS_CUBEMAP_POSITIVEY = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_POSITIVEY\nDDS_CUBEMAP_NEGATIVEY = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_NEGATIVEY\nDDS_CUBEMAP_POSITIVEZ = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_POSITIVEZ\nDDS_CUBEMAP_NEGATIVEZ = DDSCAPS2_CUBEMAP | DDSCAPS2_CUBEMAP_NEGATIVEZ\n\n\n# DXT1\nDXT1_FOURCC = 0x31545844\n\n# DXT3\nDXT3_FOURCC = 0x33545844\n\n# DXT5\nDXT5_FOURCC = 0x35545844\n\n\n# dxgiformat.h\n\nDXGI_FORMAT_BC7_TYPELESS = 97\nDXGI_FORMAT_BC7_UNORM = 98\nDXGI_FORMAT_BC7_UNORM_SRGB = 99\n\n\nclass DdsImageFile(ImageFile.ImageFile):\n format = \"DDS\"\n format_description = \"DirectDraw Surface\"\n\n def _open(self):\n magic, header_size = struct.unpack(\"<II\", self.fp.read(8))\n if header_size != 124:\n raise IOError(\"Unsupported header size %r\" % (header_size))\n header_bytes = self.fp.read(header_size - 4)\n if len(header_bytes) != 120:\n raise IOError(\"Incomplete header: %s bytes\" % len(header_bytes))\n header = BytesIO(header_bytes)\n\n flags, height, width = struct.unpack(\"<3I\", header.read(12))\n self._size = (width, height)\n self.mode = \"RGBA\"\n\n pitch, depth, mipmaps = struct.unpack(\"<3I\", header.read(12))\n struct.unpack(\"<11I\", header.read(44)) # reserved\n\n # pixel format\n pfsize, pfflags = struct.unpack(\"<2I\", header.read(8))\n fourcc = header.read(4)\n bitcount, = struct.unpack(\"<I\", header.read(4))\n masks = struct.unpack(\"<4I\", header.read(16))\n if pfflags & 0x40:\n # DDPF_RGB - Texture contains uncompressed RGB data\n masks = {mask: [\"R\", \"G\", \"B\", \"A\"][i] for i, mask in enumerate(masks)}\n rawmode = \"\"\n if bitcount == 32:\n rawmode += masks[0xff000000]\n rawmode += masks[0xff0000] + masks[0xff00] + masks[0xff]\n\n self.tile = [(\"raw\", (0, 0) + self.size, 0, (rawmode, 0, 1))]\n else:\n data_start = header_size + 4\n n = 0\n if fourcc == b\"DXT1\":\n self.pixel_format = \"DXT1\"\n n = 1\n elif fourcc == b\"DXT3\":\n self.pixel_format = \"DXT3\"\n n = 2\n elif fourcc == b\"DXT5\":\n self.pixel_format = \"DXT5\"\n n = 3\n elif fourcc == b\"DX10\":\n data_start += 20\n # ignoring flags which pertain to volume textures and cubemaps\n dxt10 = BytesIO(self.fp.read(20))\n dxgi_format, dimension = struct.unpack(\"<II\", dxt10.read(8))\n if dxgi_format in (DXGI_FORMAT_BC7_TYPELESS,\n DXGI_FORMAT_BC7_UNORM):\n self.pixel_format = \"BC7\"\n n = 7\n elif dxgi_format == DXGI_FORMAT_BC7_UNORM_SRGB:\n self.pixel_format = \"BC7\"\n self.im_info[\"gamma\"] = 1/2.2\n n = 7\n else:\n raise NotImplementedError(\"Unimplemented DXGI format %d\" %\n (dxgi_format))\n else:\n raise NotImplementedError(\"Unimplemented pixel format %r\" %\n (fourcc))\n\n self.tile = [\n (\"bcn\", (0, 0) + self.size, data_start, (n))\n ]\n\n def load_seek(self, pos):\n pass\n\n\ndef _validate(prefix):\n return prefix[:4] == b\"DDS \"\n\n\nImage.register_open(DdsImageFile.format, DdsImageFile, _validate)\nImage.register_extension(DdsImageFile.format, \".dds\")\n", "path": "src/PIL/DdsImagePlugin.py"}]} | 2,669 | 1,014 |
gh_patches_debug_2439 | rasdani/github-patches | git_diff | ephios-dev__ephios-384 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Cannot delete section
As a planner, I cannot delete an existing section from a shift with the section_based signup method
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ephios/plugins/basesignup/signup/section_based.py`
Content:
```
1 import uuid
2 from functools import cached_property
3 from itertools import groupby
4 from operator import itemgetter
5
6 from django import forms
7 from django.contrib import messages
8 from django.core.exceptions import ValidationError
9 from django.shortcuts import redirect
10 from django.template.loader import get_template
11 from django.urls import reverse
12 from django.utils.translation import gettext_lazy as _
13 from django.views.generic import FormView
14 from django_select2.forms import Select2MultipleWidget
15 from dynamic_preferences.registries import global_preferences_registry
16
17 from ephios.core.models import AbstractParticipation, Qualification
18 from ephios.core.signup import (
19 AbstractParticipant,
20 BaseDispositionParticipationForm,
21 BaseSignupMethod,
22 BaseSignupView,
23 ParticipationError,
24 )
25
26
27 def sections_participant_qualifies_for(sections, participant: AbstractParticipant):
28 available_qualification_ids = set(q.id for q in participant.collect_all_qualifications())
29 return [
30 section
31 for section in sections
32 if set(section["qualifications"]) <= available_qualification_ids
33 ]
34
35
36 class SectionBasedDispositionParticipationForm(BaseDispositionParticipationForm):
37 disposition_participation_template = "basesignup/section_based/fragment_participant.html"
38
39 section = forms.ChoiceField(
40 label=_("Section"),
41 required=False, # only required if participation is confirmed
42 widget=forms.Select(
43 attrs={"data-show-for-state": str(AbstractParticipation.States.CONFIRMED)}
44 ),
45 )
46
47 def __init__(self, **kwargs):
48 super().__init__(**kwargs)
49 sections = self.shift.signup_method.configuration.sections
50 qualified_sections = list(
51 sections_participant_qualifies_for(
52 sections,
53 self.instance.participant,
54 )
55 )
56 unqualified_sections = [
57 section for section in sections if section not in qualified_sections
58 ]
59 self.fields["section"].choices = [("", "---")]
60 if qualified_sections:
61 self.fields["section"].choices += [
62 (
63 _("qualified"),
64 [(section["uuid"], section["title"]) for section in qualified_sections],
65 )
66 ]
67 if unqualified_sections:
68 self.fields["section"].choices += [
69 (
70 _("unqualified"),
71 [(section["uuid"], section["title"]) for section in unqualified_sections],
72 )
73 ]
74 if preferred_section_uuid := self.instance.data.get("preferred_section_uuid"):
75 self.fields["section"].initial = preferred_section_uuid
76 self.preferred_section = next(
77 filter(lambda section: section["uuid"] == preferred_section_uuid, sections), None
78 )
79 if initial := self.instance.data.get("dispatched_section_uuid"):
80 self.fields["section"].initial = initial
81
82 def clean(self):
83 super().clean()
84 if (
85 self.cleaned_data["state"] == AbstractParticipation.States.CONFIRMED
86 and not self.cleaned_data["section"]
87 ):
88 self.add_error(
89 "section",
90 ValidationError(_("You must select a section when confirming a participation.")),
91 )
92
93 def save(self, commit=True):
94 self.instance.data["dispatched_section_uuid"] = self.cleaned_data["section"]
95 super().save(commit)
96
97
98 class SectionForm(forms.Form):
99 title = forms.CharField(label=_("Title"), required=True)
100 qualifications = forms.ModelMultipleChoiceField(
101 label=_("Required Qualifications"),
102 queryset=Qualification.objects.all(),
103 widget=Select2MultipleWidget,
104 required=False,
105 )
106 min_count = forms.IntegerField(label=_("min amount"), min_value=0, required=True)
107 uuid = forms.CharField(widget=forms.HiddenInput, required=False)
108
109 def clean_uuid(self):
110 return self.cleaned_data.get("uuid") or uuid.uuid4()
111
112
113 SectionsFormset = forms.formset_factory(
114 SectionForm, can_delete=True, min_num=1, validate_min=1, extra=0
115 )
116
117
118 class SectionBasedConfigurationForm(forms.Form):
119 def __init__(self, data=None, **kwargs):
120 super().__init__(data, **kwargs)
121 self.sections_formset = SectionsFormset(
122 data=data,
123 initial=self.initial.get("sections", list()),
124 prefix="sections",
125 )
126
127 def clean_sections(self):
128 if not self.sections_formset.is_valid():
129 raise ValidationError(_("The sections aren't configured correctly."))
130
131 sections = [
132 {
133 key: form.cleaned_data[key]
134 for key in ("title", "qualifications", "min_count", "uuid")
135 }
136 for form in self.sections_formset
137 ]
138 return sections
139
140
141 class SectionSignupForm(forms.Form):
142 section = forms.ChoiceField(
143 label=_("Preferred Section"),
144 widget=forms.RadioSelect,
145 required=False,
146 # choices are set as (uuid, title) of section
147 )
148
149
150 class SectionBasedSignupView(FormView, BaseSignupView):
151 template_name = "basesignup/section_based/signup.html"
152
153 @cached_property
154 def sections_participant_qualifies_for(self):
155 return sections_participant_qualifies_for(
156 self.method.configuration.sections, self.participant
157 )
158
159 def get_form(self, form_class=None):
160 form = SectionSignupForm(self.request.POST)
161 form.fields["section"].choices = [
162 (section["uuid"], section["title"])
163 for section in self.sections_participant_qualifies_for
164 ]
165 return form
166
167 def get_context_data(self, **kwargs):
168 kwargs.setdefault("shift", self.shift)
169 kwargs.setdefault(
170 "unqualified_sections",
171 [
172 section["title"]
173 for section in self.method.configuration.sections
174 if section not in self.sections_participant_qualifies_for
175 ],
176 )
177 return super().get_context_data(**kwargs)
178
179 def form_valid(self, form):
180 return super().signup_pressed(preferred_section_uuid=form.cleaned_data.get("section"))
181
182 def signup_pressed(self, **kwargs):
183 if not self.method.configuration.choose_preferred_section:
184 # do straight signup if choosing is not enabled
185 return super().signup_pressed(**kwargs)
186
187 if not self.method.can_sign_up(self.participant):
188 # redirect a misled request
189 messages.warning(self.request, _("You can not sign up for this shift."))
190 return redirect(self.participant.reverse_event_detail(self.shift.event))
191
192 # all good, redirect to the form
193 return redirect(self.participant.reverse_signup_action(self.shift))
194
195
196 class SectionBasedSignupMethod(BaseSignupMethod):
197 slug = "section_based"
198 verbose_name = _("Apply for sections")
199 description = _(
200 """This method lets you define sections for which people can choose from.
201 Sections contain qualifications that helpers need to fulfil."""
202 )
203 registration_button_text = _("Request")
204 signup_success_message = _("You have successfully requested a participation for {shift}.")
205 signup_error_message = _("Requesting a participation failed: {error}")
206
207 configuration_form_class = SectionBasedConfigurationForm
208 signup_view_class = SectionBasedSignupView
209
210 disposition_participation_form_class = SectionBasedDispositionParticipationForm
211
212 def get_configuration_fields(self):
213 return {
214 **super().get_configuration_fields(),
215 "choose_preferred_section": {
216 "formfield": forms.BooleanField(
217 label=_("Ask participants for a preferred section"),
218 help_text=_("This only makes sense if you configure multiple sections."),
219 widget=forms.CheckboxInput,
220 required=False,
221 ),
222 "default": False,
223 },
224 "sections": {
225 "formfield": forms.Field(
226 label=_("Structure"),
227 widget=forms.HiddenInput,
228 required=False,
229 ),
230 "default": [],
231 },
232 }
233
234 def get_participant_count_bounds(self):
235 return sum(section.get("min_count") or 0 for section in self.configuration.sections), None
236
237 @staticmethod
238 def check_qualification(method, participant):
239 if not sections_participant_qualifies_for(method.configuration.sections, participant):
240 return ParticipationError(_("You are not qualified."))
241
242 @property
243 def _signup_checkers(self):
244 return super()._signup_checkers + [self.check_qualification]
245
246 # pylint: disable=arguments-differ
247 def _configure_participation(
248 self, participation: AbstractParticipation, preferred_section_uuid=None, **kwargs
249 ) -> AbstractParticipation:
250 participation.data["preferred_section_uuid"] = preferred_section_uuid
251 if preferred_section_uuid:
252 # reset dispatch decision, as that would have overwritten the preferred choice
253 participation.data["dispatched_section_uuid"] = None
254 participation.state = AbstractParticipation.States.REQUESTED
255 return participation
256
257 def render_configuration_form(self, *args, form=None, **kwargs):
258 form = form or self.get_configuration_form(*args, **kwargs)
259 template = get_template("basesignup/section_based/configuration_form.html").render(
260 {"form": form}
261 )
262 return template
263
264 def _get_sections_with_users(self):
265 relevant_qualification_categories = global_preferences_registry.manager()[
266 "general__relevant_qualification_categories"
267 ]
268 section_by_uuid = {section["uuid"]: section for section in self.configuration.sections}
269 # get name and preferred section uuid for confirmed participants
270 # if they have a section assigned and we have that section on record
271 confirmed_participations = [
272 {
273 "name": str(participation.participant),
274 "relevant_qualifications": ", ".join(
275 participation.participant.qualifications.filter(
276 category__in=relevant_qualification_categories
277 ).values_list("abbreviation", flat=True)
278 ),
279 "uuid": dispatched_section_uuid,
280 }
281 for participation in self.shift.participations.filter(
282 state=AbstractParticipation.States.CONFIRMED
283 )
284 if (dispatched_section_uuid := participation.data.get("dispatched_section_uuid"))
285 and dispatched_section_uuid in section_by_uuid
286 ]
287 # group by section and do some stats
288 sections_with_users = [
289 (
290 section_by_uuid.pop(uuid),
291 [[user["name"], user["relevant_qualifications"]] for user in group],
292 )
293 for uuid, group in groupby(
294 sorted(confirmed_participations, key=itemgetter("uuid")), itemgetter("uuid")
295 )
296 ]
297 # add sections without participants
298 sections_with_users += [(section, None) for section in section_by_uuid.values()]
299 return sections_with_users
300
301 def render_shift_state(self, request):
302 return get_template("basesignup/section_based/fragment_state.html").render(
303 {
304 "shift": self.shift,
305 "requested_participations": (
306 self.shift.participations.filter(state=AbstractParticipation.States.REQUESTED)
307 ),
308 "sections_with_users": self._get_sections_with_users(),
309 "disposition_url": (
310 reverse(
311 "core:shift_disposition",
312 kwargs=dict(pk=self.shift.pk),
313 )
314 if request.user.has_perm("core.change_event", obj=self.shift.event)
315 else None
316 ),
317 }
318 )
319
320 def get_participation_display(self):
321 confirmed_sections_with_users = self._get_sections_with_users()
322 participation_display = []
323 for section, users in confirmed_sections_with_users:
324 if users:
325 participation_display += [[user[0], user[1], section["title"]] for user in users]
326 if not users or len(users) < section["min_count"]:
327 required_qualifications = ", ".join(
328 Qualification.objects.filter(pk__in=section["qualifications"]).values_list(
329 "abbreviation", flat=True
330 )
331 )
332 participation_display += [["", required_qualifications, section["title"]]] * (
333 section["min_count"] - (len(users) if users else 0)
334 )
335 return participation_display
336
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ephios/plugins/basesignup/signup/section_based.py b/ephios/plugins/basesignup/signup/section_based.py
--- a/ephios/plugins/basesignup/signup/section_based.py
+++ b/ephios/plugins/basesignup/signup/section_based.py
@@ -134,6 +134,7 @@
for key in ("title", "qualifications", "min_count", "uuid")
}
for form in self.sections_formset
+ if not form.cleaned_data.get("DELETE")
]
return sections
| {"golden_diff": "diff --git a/ephios/plugins/basesignup/signup/section_based.py b/ephios/plugins/basesignup/signup/section_based.py\n--- a/ephios/plugins/basesignup/signup/section_based.py\n+++ b/ephios/plugins/basesignup/signup/section_based.py\n@@ -134,6 +134,7 @@\n for key in (\"title\", \"qualifications\", \"min_count\", \"uuid\")\n }\n for form in self.sections_formset\n+ if not form.cleaned_data.get(\"DELETE\")\n ]\n return sections\n", "issue": "Cannot delete section\nAs a planner, I cannot delete an existing section from a shift with the section_based signup method\n", "before_files": [{"content": "import uuid\nfrom functools import cached_property\nfrom itertools import groupby\nfrom operator import itemgetter\n\nfrom django import forms\nfrom django.contrib import messages\nfrom django.core.exceptions import ValidationError\nfrom django.shortcuts import redirect\nfrom django.template.loader import get_template\nfrom django.urls import reverse\nfrom django.utils.translation import gettext_lazy as _\nfrom django.views.generic import FormView\nfrom django_select2.forms import Select2MultipleWidget\nfrom dynamic_preferences.registries import global_preferences_registry\n\nfrom ephios.core.models import AbstractParticipation, Qualification\nfrom ephios.core.signup import (\n AbstractParticipant,\n BaseDispositionParticipationForm,\n BaseSignupMethod,\n BaseSignupView,\n ParticipationError,\n)\n\n\ndef sections_participant_qualifies_for(sections, participant: AbstractParticipant):\n available_qualification_ids = set(q.id for q in participant.collect_all_qualifications())\n return [\n section\n for section in sections\n if set(section[\"qualifications\"]) <= available_qualification_ids\n ]\n\n\nclass SectionBasedDispositionParticipationForm(BaseDispositionParticipationForm):\n disposition_participation_template = \"basesignup/section_based/fragment_participant.html\"\n\n section = forms.ChoiceField(\n label=_(\"Section\"),\n required=False, # only required if participation is confirmed\n widget=forms.Select(\n attrs={\"data-show-for-state\": str(AbstractParticipation.States.CONFIRMED)}\n ),\n )\n\n def __init__(self, **kwargs):\n super().__init__(**kwargs)\n sections = self.shift.signup_method.configuration.sections\n qualified_sections = list(\n sections_participant_qualifies_for(\n sections,\n self.instance.participant,\n )\n )\n unqualified_sections = [\n section for section in sections if section not in qualified_sections\n ]\n self.fields[\"section\"].choices = [(\"\", \"---\")]\n if qualified_sections:\n self.fields[\"section\"].choices += [\n (\n _(\"qualified\"),\n [(section[\"uuid\"], section[\"title\"]) for section in qualified_sections],\n )\n ]\n if unqualified_sections:\n self.fields[\"section\"].choices += [\n (\n _(\"unqualified\"),\n [(section[\"uuid\"], section[\"title\"]) for section in unqualified_sections],\n )\n ]\n if preferred_section_uuid := self.instance.data.get(\"preferred_section_uuid\"):\n self.fields[\"section\"].initial = preferred_section_uuid\n self.preferred_section = next(\n filter(lambda section: section[\"uuid\"] == preferred_section_uuid, sections), None\n )\n if initial := self.instance.data.get(\"dispatched_section_uuid\"):\n self.fields[\"section\"].initial = initial\n\n def clean(self):\n super().clean()\n if (\n self.cleaned_data[\"state\"] == AbstractParticipation.States.CONFIRMED\n and not self.cleaned_data[\"section\"]\n ):\n self.add_error(\n \"section\",\n ValidationError(_(\"You must select a section when confirming a participation.\")),\n )\n\n def save(self, commit=True):\n self.instance.data[\"dispatched_section_uuid\"] = self.cleaned_data[\"section\"]\n super().save(commit)\n\n\nclass SectionForm(forms.Form):\n title = forms.CharField(label=_(\"Title\"), required=True)\n qualifications = forms.ModelMultipleChoiceField(\n label=_(\"Required Qualifications\"),\n queryset=Qualification.objects.all(),\n widget=Select2MultipleWidget,\n required=False,\n )\n min_count = forms.IntegerField(label=_(\"min amount\"), min_value=0, required=True)\n uuid = forms.CharField(widget=forms.HiddenInput, required=False)\n\n def clean_uuid(self):\n return self.cleaned_data.get(\"uuid\") or uuid.uuid4()\n\n\nSectionsFormset = forms.formset_factory(\n SectionForm, can_delete=True, min_num=1, validate_min=1, extra=0\n)\n\n\nclass SectionBasedConfigurationForm(forms.Form):\n def __init__(self, data=None, **kwargs):\n super().__init__(data, **kwargs)\n self.sections_formset = SectionsFormset(\n data=data,\n initial=self.initial.get(\"sections\", list()),\n prefix=\"sections\",\n )\n\n def clean_sections(self):\n if not self.sections_formset.is_valid():\n raise ValidationError(_(\"The sections aren't configured correctly.\"))\n\n sections = [\n {\n key: form.cleaned_data[key]\n for key in (\"title\", \"qualifications\", \"min_count\", \"uuid\")\n }\n for form in self.sections_formset\n ]\n return sections\n\n\nclass SectionSignupForm(forms.Form):\n section = forms.ChoiceField(\n label=_(\"Preferred Section\"),\n widget=forms.RadioSelect,\n required=False,\n # choices are set as (uuid, title) of section\n )\n\n\nclass SectionBasedSignupView(FormView, BaseSignupView):\n template_name = \"basesignup/section_based/signup.html\"\n\n @cached_property\n def sections_participant_qualifies_for(self):\n return sections_participant_qualifies_for(\n self.method.configuration.sections, self.participant\n )\n\n def get_form(self, form_class=None):\n form = SectionSignupForm(self.request.POST)\n form.fields[\"section\"].choices = [\n (section[\"uuid\"], section[\"title\"])\n for section in self.sections_participant_qualifies_for\n ]\n return form\n\n def get_context_data(self, **kwargs):\n kwargs.setdefault(\"shift\", self.shift)\n kwargs.setdefault(\n \"unqualified_sections\",\n [\n section[\"title\"]\n for section in self.method.configuration.sections\n if section not in self.sections_participant_qualifies_for\n ],\n )\n return super().get_context_data(**kwargs)\n\n def form_valid(self, form):\n return super().signup_pressed(preferred_section_uuid=form.cleaned_data.get(\"section\"))\n\n def signup_pressed(self, **kwargs):\n if not self.method.configuration.choose_preferred_section:\n # do straight signup if choosing is not enabled\n return super().signup_pressed(**kwargs)\n\n if not self.method.can_sign_up(self.participant):\n # redirect a misled request\n messages.warning(self.request, _(\"You can not sign up for this shift.\"))\n return redirect(self.participant.reverse_event_detail(self.shift.event))\n\n # all good, redirect to the form\n return redirect(self.participant.reverse_signup_action(self.shift))\n\n\nclass SectionBasedSignupMethod(BaseSignupMethod):\n slug = \"section_based\"\n verbose_name = _(\"Apply for sections\")\n description = _(\n \"\"\"This method lets you define sections for which people can choose from.\n Sections contain qualifications that helpers need to fulfil.\"\"\"\n )\n registration_button_text = _(\"Request\")\n signup_success_message = _(\"You have successfully requested a participation for {shift}.\")\n signup_error_message = _(\"Requesting a participation failed: {error}\")\n\n configuration_form_class = SectionBasedConfigurationForm\n signup_view_class = SectionBasedSignupView\n\n disposition_participation_form_class = SectionBasedDispositionParticipationForm\n\n def get_configuration_fields(self):\n return {\n **super().get_configuration_fields(),\n \"choose_preferred_section\": {\n \"formfield\": forms.BooleanField(\n label=_(\"Ask participants for a preferred section\"),\n help_text=_(\"This only makes sense if you configure multiple sections.\"),\n widget=forms.CheckboxInput,\n required=False,\n ),\n \"default\": False,\n },\n \"sections\": {\n \"formfield\": forms.Field(\n label=_(\"Structure\"),\n widget=forms.HiddenInput,\n required=False,\n ),\n \"default\": [],\n },\n }\n\n def get_participant_count_bounds(self):\n return sum(section.get(\"min_count\") or 0 for section in self.configuration.sections), None\n\n @staticmethod\n def check_qualification(method, participant):\n if not sections_participant_qualifies_for(method.configuration.sections, participant):\n return ParticipationError(_(\"You are not qualified.\"))\n\n @property\n def _signup_checkers(self):\n return super()._signup_checkers + [self.check_qualification]\n\n # pylint: disable=arguments-differ\n def _configure_participation(\n self, participation: AbstractParticipation, preferred_section_uuid=None, **kwargs\n ) -> AbstractParticipation:\n participation.data[\"preferred_section_uuid\"] = preferred_section_uuid\n if preferred_section_uuid:\n # reset dispatch decision, as that would have overwritten the preferred choice\n participation.data[\"dispatched_section_uuid\"] = None\n participation.state = AbstractParticipation.States.REQUESTED\n return participation\n\n def render_configuration_form(self, *args, form=None, **kwargs):\n form = form or self.get_configuration_form(*args, **kwargs)\n template = get_template(\"basesignup/section_based/configuration_form.html\").render(\n {\"form\": form}\n )\n return template\n\n def _get_sections_with_users(self):\n relevant_qualification_categories = global_preferences_registry.manager()[\n \"general__relevant_qualification_categories\"\n ]\n section_by_uuid = {section[\"uuid\"]: section for section in self.configuration.sections}\n # get name and preferred section uuid for confirmed participants\n # if they have a section assigned and we have that section on record\n confirmed_participations = [\n {\n \"name\": str(participation.participant),\n \"relevant_qualifications\": \", \".join(\n participation.participant.qualifications.filter(\n category__in=relevant_qualification_categories\n ).values_list(\"abbreviation\", flat=True)\n ),\n \"uuid\": dispatched_section_uuid,\n }\n for participation in self.shift.participations.filter(\n state=AbstractParticipation.States.CONFIRMED\n )\n if (dispatched_section_uuid := participation.data.get(\"dispatched_section_uuid\"))\n and dispatched_section_uuid in section_by_uuid\n ]\n # group by section and do some stats\n sections_with_users = [\n (\n section_by_uuid.pop(uuid),\n [[user[\"name\"], user[\"relevant_qualifications\"]] for user in group],\n )\n for uuid, group in groupby(\n sorted(confirmed_participations, key=itemgetter(\"uuid\")), itemgetter(\"uuid\")\n )\n ]\n # add sections without participants\n sections_with_users += [(section, None) for section in section_by_uuid.values()]\n return sections_with_users\n\n def render_shift_state(self, request):\n return get_template(\"basesignup/section_based/fragment_state.html\").render(\n {\n \"shift\": self.shift,\n \"requested_participations\": (\n self.shift.participations.filter(state=AbstractParticipation.States.REQUESTED)\n ),\n \"sections_with_users\": self._get_sections_with_users(),\n \"disposition_url\": (\n reverse(\n \"core:shift_disposition\",\n kwargs=dict(pk=self.shift.pk),\n )\n if request.user.has_perm(\"core.change_event\", obj=self.shift.event)\n else None\n ),\n }\n )\n\n def get_participation_display(self):\n confirmed_sections_with_users = self._get_sections_with_users()\n participation_display = []\n for section, users in confirmed_sections_with_users:\n if users:\n participation_display += [[user[0], user[1], section[\"title\"]] for user in users]\n if not users or len(users) < section[\"min_count\"]:\n required_qualifications = \", \".join(\n Qualification.objects.filter(pk__in=section[\"qualifications\"]).values_list(\n \"abbreviation\", flat=True\n )\n )\n participation_display += [[\"\", required_qualifications, section[\"title\"]]] * (\n section[\"min_count\"] - (len(users) if users else 0)\n )\n return participation_display\n", "path": "ephios/plugins/basesignup/signup/section_based.py"}], "after_files": [{"content": "import uuid\nfrom functools import cached_property\nfrom itertools import groupby\nfrom operator import itemgetter\n\nfrom django import forms\nfrom django.contrib import messages\nfrom django.core.exceptions import ValidationError\nfrom django.shortcuts import redirect\nfrom django.template.loader import get_template\nfrom django.urls import reverse\nfrom django.utils.translation import gettext_lazy as _\nfrom django.views.generic import FormView\nfrom django_select2.forms import Select2MultipleWidget\nfrom dynamic_preferences.registries import global_preferences_registry\n\nfrom ephios.core.models import AbstractParticipation, Qualification\nfrom ephios.core.signup import (\n AbstractParticipant,\n BaseDispositionParticipationForm,\n BaseSignupMethod,\n BaseSignupView,\n ParticipationError,\n)\n\n\ndef sections_participant_qualifies_for(sections, participant: AbstractParticipant):\n available_qualification_ids = set(q.id for q in participant.collect_all_qualifications())\n return [\n section\n for section in sections\n if set(section[\"qualifications\"]) <= available_qualification_ids\n ]\n\n\nclass SectionBasedDispositionParticipationForm(BaseDispositionParticipationForm):\n disposition_participation_template = \"basesignup/section_based/fragment_participant.html\"\n\n section = forms.ChoiceField(\n label=_(\"Section\"),\n required=False, # only required if participation is confirmed\n widget=forms.Select(\n attrs={\"data-show-for-state\": str(AbstractParticipation.States.CONFIRMED)}\n ),\n )\n\n def __init__(self, **kwargs):\n super().__init__(**kwargs)\n sections = self.shift.signup_method.configuration.sections\n qualified_sections = list(\n sections_participant_qualifies_for(\n sections,\n self.instance.participant,\n )\n )\n unqualified_sections = [\n section for section in sections if section not in qualified_sections\n ]\n self.fields[\"section\"].choices = [(\"\", \"---\")]\n if qualified_sections:\n self.fields[\"section\"].choices += [\n (\n _(\"qualified\"),\n [(section[\"uuid\"], section[\"title\"]) for section in qualified_sections],\n )\n ]\n if unqualified_sections:\n self.fields[\"section\"].choices += [\n (\n _(\"unqualified\"),\n [(section[\"uuid\"], section[\"title\"]) for section in unqualified_sections],\n )\n ]\n if preferred_section_uuid := self.instance.data.get(\"preferred_section_uuid\"):\n self.fields[\"section\"].initial = preferred_section_uuid\n self.preferred_section = next(\n filter(lambda section: section[\"uuid\"] == preferred_section_uuid, sections), None\n )\n if initial := self.instance.data.get(\"dispatched_section_uuid\"):\n self.fields[\"section\"].initial = initial\n\n def clean(self):\n super().clean()\n if (\n self.cleaned_data[\"state\"] == AbstractParticipation.States.CONFIRMED\n and not self.cleaned_data[\"section\"]\n ):\n self.add_error(\n \"section\",\n ValidationError(_(\"You must select a section when confirming a participation.\")),\n )\n\n def save(self, commit=True):\n self.instance.data[\"dispatched_section_uuid\"] = self.cleaned_data[\"section\"]\n super().save(commit)\n\n\nclass SectionForm(forms.Form):\n title = forms.CharField(label=_(\"Title\"), required=True)\n qualifications = forms.ModelMultipleChoiceField(\n label=_(\"Required Qualifications\"),\n queryset=Qualification.objects.all(),\n widget=Select2MultipleWidget,\n required=False,\n )\n min_count = forms.IntegerField(label=_(\"min amount\"), min_value=0, required=True)\n uuid = forms.CharField(widget=forms.HiddenInput, required=False)\n\n def clean_uuid(self):\n return self.cleaned_data.get(\"uuid\") or uuid.uuid4()\n\n\nSectionsFormset = forms.formset_factory(\n SectionForm, can_delete=True, min_num=1, validate_min=1, extra=0\n)\n\n\nclass SectionBasedConfigurationForm(forms.Form):\n def __init__(self, data=None, **kwargs):\n super().__init__(data, **kwargs)\n self.sections_formset = SectionsFormset(\n data=data,\n initial=self.initial.get(\"sections\", list()),\n prefix=\"sections\",\n )\n\n def clean_sections(self):\n if not self.sections_formset.is_valid():\n raise ValidationError(_(\"The sections aren't configured correctly.\"))\n\n sections = [\n {\n key: form.cleaned_data[key]\n for key in (\"title\", \"qualifications\", \"min_count\", \"uuid\")\n }\n for form in self.sections_formset\n if not form.cleaned_data.get(\"DELETE\")\n ]\n return sections\n\n\nclass SectionSignupForm(forms.Form):\n section = forms.ChoiceField(\n label=_(\"Preferred Section\"),\n widget=forms.RadioSelect,\n required=False,\n # choices are set as (uuid, title) of section\n )\n\n\nclass SectionBasedSignupView(FormView, BaseSignupView):\n template_name = \"basesignup/section_based/signup.html\"\n\n @cached_property\n def sections_participant_qualifies_for(self):\n return sections_participant_qualifies_for(\n self.method.configuration.sections, self.participant\n )\n\n def get_form(self, form_class=None):\n form = SectionSignupForm(self.request.POST)\n form.fields[\"section\"].choices = [\n (section[\"uuid\"], section[\"title\"])\n for section in self.sections_participant_qualifies_for\n ]\n return form\n\n def get_context_data(self, **kwargs):\n kwargs.setdefault(\"shift\", self.shift)\n kwargs.setdefault(\n \"unqualified_sections\",\n [\n section[\"title\"]\n for section in self.method.configuration.sections\n if section not in self.sections_participant_qualifies_for\n ],\n )\n return super().get_context_data(**kwargs)\n\n def form_valid(self, form):\n return super().signup_pressed(preferred_section_uuid=form.cleaned_data.get(\"section\"))\n\n def signup_pressed(self, **kwargs):\n if not self.method.configuration.choose_preferred_section:\n # do straight signup if choosing is not enabled\n return super().signup_pressed(**kwargs)\n\n if not self.method.can_sign_up(self.participant):\n # redirect a misled request\n messages.warning(self.request, _(\"You can not sign up for this shift.\"))\n return redirect(self.participant.reverse_event_detail(self.shift.event))\n\n # all good, redirect to the form\n return redirect(self.participant.reverse_signup_action(self.shift))\n\n\nclass SectionBasedSignupMethod(BaseSignupMethod):\n slug = \"section_based\"\n verbose_name = _(\"Apply for sections\")\n description = _(\n \"\"\"This method lets you define sections for which people can choose from.\n Sections contain qualifications that helpers need to fulfil.\"\"\"\n )\n registration_button_text = _(\"Request\")\n signup_success_message = _(\"You have successfully requested a participation for {shift}.\")\n signup_error_message = _(\"Requesting a participation failed: {error}\")\n\n configuration_form_class = SectionBasedConfigurationForm\n signup_view_class = SectionBasedSignupView\n\n disposition_participation_form_class = SectionBasedDispositionParticipationForm\n\n def get_configuration_fields(self):\n return {\n **super().get_configuration_fields(),\n \"choose_preferred_section\": {\n \"formfield\": forms.BooleanField(\n label=_(\"Ask participants for a preferred section\"),\n help_text=_(\"This only makes sense if you configure multiple sections.\"),\n widget=forms.CheckboxInput,\n required=False,\n ),\n \"default\": False,\n },\n \"sections\": {\n \"formfield\": forms.Field(\n label=_(\"Structure\"),\n widget=forms.HiddenInput,\n required=False,\n ),\n \"default\": [],\n },\n }\n\n def get_participant_count_bounds(self):\n return sum(section.get(\"min_count\") or 0 for section in self.configuration.sections), None\n\n @staticmethod\n def check_qualification(method, participant):\n if not sections_participant_qualifies_for(method.configuration.sections, participant):\n return ParticipationError(_(\"You are not qualified.\"))\n\n @property\n def _signup_checkers(self):\n return super()._signup_checkers + [self.check_qualification]\n\n # pylint: disable=arguments-differ\n def _configure_participation(\n self, participation: AbstractParticipation, preferred_section_uuid=None, **kwargs\n ) -> AbstractParticipation:\n participation.data[\"preferred_section_uuid\"] = preferred_section_uuid\n if preferred_section_uuid:\n # reset dispatch decision, as that would have overwritten the preferred choice\n participation.data[\"dispatched_section_uuid\"] = None\n participation.state = AbstractParticipation.States.REQUESTED\n return participation\n\n def render_configuration_form(self, *args, form=None, **kwargs):\n form = form or self.get_configuration_form(*args, **kwargs)\n template = get_template(\"basesignup/section_based/configuration_form.html\").render(\n {\"form\": form}\n )\n return template\n\n def _get_sections_with_users(self):\n relevant_qualification_categories = global_preferences_registry.manager()[\n \"general__relevant_qualification_categories\"\n ]\n section_by_uuid = {section[\"uuid\"]: section for section in self.configuration.sections}\n # get name and preferred section uuid for confirmed participants\n # if they have a section assigned and we have that section on record\n confirmed_participations = [\n {\n \"name\": str(participation.participant),\n \"relevant_qualifications\": \", \".join(\n participation.participant.qualifications.filter(\n category__in=relevant_qualification_categories\n ).values_list(\"abbreviation\", flat=True)\n ),\n \"uuid\": dispatched_section_uuid,\n }\n for participation in self.shift.participations.filter(\n state=AbstractParticipation.States.CONFIRMED\n )\n if (dispatched_section_uuid := participation.data.get(\"dispatched_section_uuid\"))\n and dispatched_section_uuid in section_by_uuid\n ]\n # group by section and do some stats\n sections_with_users = [\n (\n section_by_uuid.pop(uuid),\n [[user[\"name\"], user[\"relevant_qualifications\"]] for user in group],\n )\n for uuid, group in groupby(\n sorted(confirmed_participations, key=itemgetter(\"uuid\")), itemgetter(\"uuid\")\n )\n ]\n # add sections without participants\n sections_with_users += [(section, None) for section in section_by_uuid.values()]\n return sections_with_users\n\n def render_shift_state(self, request):\n return get_template(\"basesignup/section_based/fragment_state.html\").render(\n {\n \"shift\": self.shift,\n \"requested_participations\": (\n self.shift.participations.filter(state=AbstractParticipation.States.REQUESTED)\n ),\n \"sections_with_users\": self._get_sections_with_users(),\n \"disposition_url\": (\n reverse(\n \"core:shift_disposition\",\n kwargs=dict(pk=self.shift.pk),\n )\n if request.user.has_perm(\"core.change_event\", obj=self.shift.event)\n else None\n ),\n }\n )\n\n def get_participation_display(self):\n confirmed_sections_with_users = self._get_sections_with_users()\n participation_display = []\n for section, users in confirmed_sections_with_users:\n if users:\n participation_display += [[user[0], user[1], section[\"title\"]] for user in users]\n if not users or len(users) < section[\"min_count\"]:\n required_qualifications = \", \".join(\n Qualification.objects.filter(pk__in=section[\"qualifications\"]).values_list(\n \"abbreviation\", flat=True\n )\n )\n participation_display += [[\"\", required_qualifications, section[\"title\"]]] * (\n section[\"min_count\"] - (len(users) if users else 0)\n )\n return participation_display\n", "path": "ephios/plugins/basesignup/signup/section_based.py"}]} | 3,630 | 124 |
gh_patches_debug_29592 | rasdani/github-patches | git_diff | e-valuation__EvaP-1484 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Locked questionnaires failing in editor form
#1445 introduced locked questionnaires. However, they are not dealt with correctly in the evaluation editor form. When initially opening the form, the locked questionnaires are correctly selected but are not handled correctly when saving the form.
Steps to reproduce:
1. As manager, assign a locked questionnaire as the only general questionnaire for an evaluation.
2. Enable the evaluation for editor review.
3. As editor, open the evaluation form and try to save it. Saving will fail with an error for the field "General questionnaires" ("This field is required.").
The locked questionnaire should count as a selected questionnaire and the form should be saved.
A test should be added for this use case.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `evap/contributor/forms.py`
Content:
```
1 from datetime import datetime, timedelta
2 import logging
3
4 from django import forms
5 from django.conf import settings
6 from django.db.models import Q
7 from django.forms.widgets import CheckboxSelectMultiple
8 from django.utils.translation import gettext_lazy as _
9 from evap.evaluation.forms import UserModelMultipleChoiceField, UserModelChoiceField
10 from evap.evaluation.models import Course, Evaluation, Questionnaire, UserProfile
11 from evap.evaluation.tools import date_to_datetime
12 from evap.staff.forms import ContributionForm
13
14 logger = logging.getLogger(__name__)
15
16
17 class EvaluationForm(forms.ModelForm):
18 general_questionnaires = forms.ModelMultipleChoiceField(queryset=None, widget=CheckboxSelectMultiple, label=_("General questionnaires"))
19 course = forms.ModelChoiceField(Course.objects.all(), disabled=True, required=False, widget=forms.HiddenInput())
20 name_de_field = forms.CharField(label=_("Name (German)"), disabled=True, required=False)
21 name_en_field = forms.CharField(label=_("Name (English)"), disabled=True, required=False)
22
23 class Meta:
24 model = Evaluation
25 fields = ('name_de_field', 'name_en_field', 'vote_start_datetime', 'vote_end_date', 'general_questionnaires', 'course')
26
27 def __init__(self, *args, **kwargs):
28 super().__init__(*args, **kwargs)
29
30 self.fields['name_de_field'].initial = self.instance.full_name_de
31 self.fields['name_en_field'].initial = self.instance.full_name_en
32
33 self.fields['general_questionnaires'].queryset = Questionnaire.objects.general_questionnaires().filter(
34 Q(visibility=Questionnaire.Visibility.EDITORS) | Q(contributions__evaluation=self.instance)).distinct()
35
36 self.fields['vote_start_datetime'].localize = True
37 self.fields['vote_end_date'].localize = True
38
39 if self.instance.general_contribution:
40 self.fields['general_questionnaires'].initial = [q.pk for q in self.instance.general_contribution.questionnaires.all()]
41
42 if not self.instance.allow_editors_to_edit:
43 for field in self._meta.fields:
44 self.fields[field].disabled = True
45
46 def clean(self):
47 super().clean()
48
49 vote_start_datetime = self.cleaned_data.get('vote_start_datetime')
50 vote_end_date = self.cleaned_data.get('vote_end_date')
51 if vote_start_datetime and vote_end_date:
52 if vote_start_datetime.date() > vote_end_date:
53 self.add_error("vote_start_datetime", "")
54 self.add_error("vote_end_date", _("The first day of evaluation must be before the last one."))
55
56 def clean_vote_end_date(self):
57 vote_end_date = self.cleaned_data.get('vote_end_date')
58
59 # The actual deadline is EVALUATION_END_OFFSET_HOURS:00 AM of the day after vote_end_date.
60 # Therefore an evaluation date 24h + EVALUATION_END_OFFSET_HOURS in the past would technically still be in the future.
61 if vote_end_date and date_to_datetime(vote_end_date) + timedelta(hours=24 + settings.EVALUATION_END_OFFSET_HOURS) < datetime.now():
62 raise forms.ValidationError(_("The last day of evaluation must be in the future."))
63 return vote_end_date
64
65 def clean_general_questionnaires(self):
66 # Ensure all locked questionnaires still have the same status (included or not)
67 locked_qs = self.fields['general_questionnaires'].queryset.filter(is_locked=True)
68
69 not_locked = [q for q in self.cleaned_data.get('general_questionnaires') if q not in locked_qs]
70 locked = [q.pk for q in self.instance.general_contribution.questionnaires.filter(is_locked=True)]
71
72 return not_locked + locked
73
74 def save(self, *args, **kw):
75 evaluation = super().save(*args, **kw)
76 evaluation.general_contribution.questionnaires.set(self.cleaned_data.get('general_questionnaires'))
77 return evaluation
78
79
80 class EditorContributionForm(ContributionForm):
81 def __init__(self, *args, **kwargs):
82 super().__init__(*args, **kwargs)
83
84 existing_contributor_pk = self.instance.contributor.pk if self.instance.contributor else None
85
86 self.fields['questionnaires'].queryset = Questionnaire.objects.contributor_questionnaires().filter(
87 Q(visibility=Questionnaire.Visibility.EDITORS) | Q(contributions__evaluation=self.evaluation)).distinct()
88 self.fields['contributor'].queryset = UserProfile.objects.filter(
89 (Q(is_active=True) & Q(is_proxy_user=False)) | Q(pk=existing_contributor_pk)
90 )
91
92
93 class DelegatesForm(forms.ModelForm):
94 delegates = UserModelMultipleChoiceField(queryset=UserProfile.objects.exclude(is_active=False).exclude(is_proxy_user=True),
95 required=False)
96
97 class Meta:
98 model = UserProfile
99 fields = ('delegates',)
100 field_classes = {
101 'delegates': UserModelMultipleChoiceField,
102 }
103
104 def __init__(self, *args, **kwargs):
105 super().__init__(*args, **kwargs)
106
107 def save(self, *args, **kw):
108 super().save(*args, **kw)
109 logger.info('User "{}" edited the settings.'.format(self.instance.email))
110
111
112 class DelegateSelectionForm(forms.Form):
113 delegate_to = UserModelChoiceField(label=_("Delegate to"),
114 queryset=UserProfile.objects.exclude(is_active=False).exclude(is_proxy_user=True))
115
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/evap/contributor/forms.py b/evap/contributor/forms.py
--- a/evap/contributor/forms.py
+++ b/evap/contributor/forms.py
@@ -15,7 +15,7 @@
class EvaluationForm(forms.ModelForm):
- general_questionnaires = forms.ModelMultipleChoiceField(queryset=None, widget=CheckboxSelectMultiple, label=_("General questionnaires"))
+ general_questionnaires = forms.ModelMultipleChoiceField(queryset=None, required=False, widget=CheckboxSelectMultiple, label=_("General questionnaires"))
course = forms.ModelChoiceField(Course.objects.all(), disabled=True, required=False, widget=forms.HiddenInput())
name_de_field = forms.CharField(label=_("Name (German)"), disabled=True, required=False)
name_en_field = forms.CharField(label=_("Name (English)"), disabled=True, required=False)
@@ -64,10 +64,14 @@
def clean_general_questionnaires(self):
# Ensure all locked questionnaires still have the same status (included or not)
- locked_qs = self.fields['general_questionnaires'].queryset.filter(is_locked=True)
+ not_locked = []
+ if self.cleaned_data.get('general_questionnaires'):
+ not_locked = list(self.cleaned_data.get('general_questionnaires').filter(is_locked=False))
- not_locked = [q for q in self.cleaned_data.get('general_questionnaires') if q not in locked_qs]
- locked = [q.pk for q in self.instance.general_contribution.questionnaires.filter(is_locked=True)]
+ locked = list(self.instance.general_contribution.questionnaires.filter(is_locked=True))
+
+ if not not_locked + locked:
+ self.add_error("general_questionnaires", _("At least one questionnaire must be selected."))
return not_locked + locked
| {"golden_diff": "diff --git a/evap/contributor/forms.py b/evap/contributor/forms.py\n--- a/evap/contributor/forms.py\n+++ b/evap/contributor/forms.py\n@@ -15,7 +15,7 @@\n \n \n class EvaluationForm(forms.ModelForm):\n- general_questionnaires = forms.ModelMultipleChoiceField(queryset=None, widget=CheckboxSelectMultiple, label=_(\"General questionnaires\"))\n+ general_questionnaires = forms.ModelMultipleChoiceField(queryset=None, required=False, widget=CheckboxSelectMultiple, label=_(\"General questionnaires\"))\n course = forms.ModelChoiceField(Course.objects.all(), disabled=True, required=False, widget=forms.HiddenInput())\n name_de_field = forms.CharField(label=_(\"Name (German)\"), disabled=True, required=False)\n name_en_field = forms.CharField(label=_(\"Name (English)\"), disabled=True, required=False)\n@@ -64,10 +64,14 @@\n \n def clean_general_questionnaires(self):\n # Ensure all locked questionnaires still have the same status (included or not)\n- locked_qs = self.fields['general_questionnaires'].queryset.filter(is_locked=True)\n+ not_locked = []\n+ if self.cleaned_data.get('general_questionnaires'):\n+ not_locked = list(self.cleaned_data.get('general_questionnaires').filter(is_locked=False))\n \n- not_locked = [q for q in self.cleaned_data.get('general_questionnaires') if q not in locked_qs]\n- locked = [q.pk for q in self.instance.general_contribution.questionnaires.filter(is_locked=True)]\n+ locked = list(self.instance.general_contribution.questionnaires.filter(is_locked=True))\n+\n+ if not not_locked + locked:\n+ self.add_error(\"general_questionnaires\", _(\"At least one questionnaire must be selected.\"))\n \n return not_locked + locked\n", "issue": "Locked questionnaires failing in editor form\n#1445 introduced locked questionnaires. However, they are not dealt with correctly in the evaluation editor form. When initially opening the form, the locked questionnaires are correctly selected but are not handled correctly when saving the form.\r\n\r\nSteps to reproduce:\r\n1. As manager, assign a locked questionnaire as the only general questionnaire for an evaluation.\r\n2. Enable the evaluation for editor review.\r\n3. As editor, open the evaluation form and try to save it. Saving will fail with an error for the field \"General questionnaires\" (\"This field is required.\").\r\n\r\nThe locked questionnaire should count as a selected questionnaire and the form should be saved.\r\nA test should be added for this use case.\n", "before_files": [{"content": "from datetime import datetime, timedelta\nimport logging\n\nfrom django import forms\nfrom django.conf import settings\nfrom django.db.models import Q\nfrom django.forms.widgets import CheckboxSelectMultiple\nfrom django.utils.translation import gettext_lazy as _\nfrom evap.evaluation.forms import UserModelMultipleChoiceField, UserModelChoiceField\nfrom evap.evaluation.models import Course, Evaluation, Questionnaire, UserProfile\nfrom evap.evaluation.tools import date_to_datetime\nfrom evap.staff.forms import ContributionForm\n\nlogger = logging.getLogger(__name__)\n\n\nclass EvaluationForm(forms.ModelForm):\n general_questionnaires = forms.ModelMultipleChoiceField(queryset=None, widget=CheckboxSelectMultiple, label=_(\"General questionnaires\"))\n course = forms.ModelChoiceField(Course.objects.all(), disabled=True, required=False, widget=forms.HiddenInput())\n name_de_field = forms.CharField(label=_(\"Name (German)\"), disabled=True, required=False)\n name_en_field = forms.CharField(label=_(\"Name (English)\"), disabled=True, required=False)\n\n class Meta:\n model = Evaluation\n fields = ('name_de_field', 'name_en_field', 'vote_start_datetime', 'vote_end_date', 'general_questionnaires', 'course')\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n self.fields['name_de_field'].initial = self.instance.full_name_de\n self.fields['name_en_field'].initial = self.instance.full_name_en\n\n self.fields['general_questionnaires'].queryset = Questionnaire.objects.general_questionnaires().filter(\n Q(visibility=Questionnaire.Visibility.EDITORS) | Q(contributions__evaluation=self.instance)).distinct()\n\n self.fields['vote_start_datetime'].localize = True\n self.fields['vote_end_date'].localize = True\n\n if self.instance.general_contribution:\n self.fields['general_questionnaires'].initial = [q.pk for q in self.instance.general_contribution.questionnaires.all()]\n\n if not self.instance.allow_editors_to_edit:\n for field in self._meta.fields:\n self.fields[field].disabled = True\n\n def clean(self):\n super().clean()\n\n vote_start_datetime = self.cleaned_data.get('vote_start_datetime')\n vote_end_date = self.cleaned_data.get('vote_end_date')\n if vote_start_datetime and vote_end_date:\n if vote_start_datetime.date() > vote_end_date:\n self.add_error(\"vote_start_datetime\", \"\")\n self.add_error(\"vote_end_date\", _(\"The first day of evaluation must be before the last one.\"))\n\n def clean_vote_end_date(self):\n vote_end_date = self.cleaned_data.get('vote_end_date')\n\n # The actual deadline is EVALUATION_END_OFFSET_HOURS:00 AM of the day after vote_end_date.\n # Therefore an evaluation date 24h + EVALUATION_END_OFFSET_HOURS in the past would technically still be in the future.\n if vote_end_date and date_to_datetime(vote_end_date) + timedelta(hours=24 + settings.EVALUATION_END_OFFSET_HOURS) < datetime.now():\n raise forms.ValidationError(_(\"The last day of evaluation must be in the future.\"))\n return vote_end_date\n\n def clean_general_questionnaires(self):\n # Ensure all locked questionnaires still have the same status (included or not)\n locked_qs = self.fields['general_questionnaires'].queryset.filter(is_locked=True)\n\n not_locked = [q for q in self.cleaned_data.get('general_questionnaires') if q not in locked_qs]\n locked = [q.pk for q in self.instance.general_contribution.questionnaires.filter(is_locked=True)]\n\n return not_locked + locked\n\n def save(self, *args, **kw):\n evaluation = super().save(*args, **kw)\n evaluation.general_contribution.questionnaires.set(self.cleaned_data.get('general_questionnaires'))\n return evaluation\n\n\nclass EditorContributionForm(ContributionForm):\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n existing_contributor_pk = self.instance.contributor.pk if self.instance.contributor else None\n\n self.fields['questionnaires'].queryset = Questionnaire.objects.contributor_questionnaires().filter(\n Q(visibility=Questionnaire.Visibility.EDITORS) | Q(contributions__evaluation=self.evaluation)).distinct()\n self.fields['contributor'].queryset = UserProfile.objects.filter(\n (Q(is_active=True) & Q(is_proxy_user=False)) | Q(pk=existing_contributor_pk)\n )\n\n\nclass DelegatesForm(forms.ModelForm):\n delegates = UserModelMultipleChoiceField(queryset=UserProfile.objects.exclude(is_active=False).exclude(is_proxy_user=True),\n required=False)\n\n class Meta:\n model = UserProfile\n fields = ('delegates',)\n field_classes = {\n 'delegates': UserModelMultipleChoiceField,\n }\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n def save(self, *args, **kw):\n super().save(*args, **kw)\n logger.info('User \"{}\" edited the settings.'.format(self.instance.email))\n\n\nclass DelegateSelectionForm(forms.Form):\n delegate_to = UserModelChoiceField(label=_(\"Delegate to\"),\n queryset=UserProfile.objects.exclude(is_active=False).exclude(is_proxy_user=True))\n", "path": "evap/contributor/forms.py"}], "after_files": [{"content": "from datetime import datetime, timedelta\nimport logging\n\nfrom django import forms\nfrom django.conf import settings\nfrom django.db.models import Q\nfrom django.forms.widgets import CheckboxSelectMultiple\nfrom django.utils.translation import gettext_lazy as _\nfrom evap.evaluation.forms import UserModelMultipleChoiceField, UserModelChoiceField\nfrom evap.evaluation.models import Course, Evaluation, Questionnaire, UserProfile\nfrom evap.evaluation.tools import date_to_datetime\nfrom evap.staff.forms import ContributionForm\n\nlogger = logging.getLogger(__name__)\n\n\nclass EvaluationForm(forms.ModelForm):\n general_questionnaires = forms.ModelMultipleChoiceField(queryset=None, required=False, widget=CheckboxSelectMultiple, label=_(\"General questionnaires\"))\n course = forms.ModelChoiceField(Course.objects.all(), disabled=True, required=False, widget=forms.HiddenInput())\n name_de_field = forms.CharField(label=_(\"Name (German)\"), disabled=True, required=False)\n name_en_field = forms.CharField(label=_(\"Name (English)\"), disabled=True, required=False)\n\n class Meta:\n model = Evaluation\n fields = ('name_de_field', 'name_en_field', 'vote_start_datetime', 'vote_end_date', 'general_questionnaires', 'course')\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n self.fields['name_de_field'].initial = self.instance.full_name_de\n self.fields['name_en_field'].initial = self.instance.full_name_en\n\n self.fields['general_questionnaires'].queryset = Questionnaire.objects.general_questionnaires().filter(\n Q(visibility=Questionnaire.Visibility.EDITORS) | Q(contributions__evaluation=self.instance)).distinct()\n\n self.fields['vote_start_datetime'].localize = True\n self.fields['vote_end_date'].localize = True\n\n if self.instance.general_contribution:\n self.fields['general_questionnaires'].initial = [q.pk for q in self.instance.general_contribution.questionnaires.all()]\n\n if not self.instance.allow_editors_to_edit:\n for field in self._meta.fields:\n self.fields[field].disabled = True\n\n def clean(self):\n super().clean()\n\n vote_start_datetime = self.cleaned_data.get('vote_start_datetime')\n vote_end_date = self.cleaned_data.get('vote_end_date')\n if vote_start_datetime and vote_end_date:\n if vote_start_datetime.date() > vote_end_date:\n self.add_error(\"vote_start_datetime\", \"\")\n self.add_error(\"vote_end_date\", _(\"The first day of evaluation must be before the last one.\"))\n\n def clean_vote_end_date(self):\n vote_end_date = self.cleaned_data.get('vote_end_date')\n\n # The actual deadline is EVALUATION_END_OFFSET_HOURS:00 AM of the day after vote_end_date.\n # Therefore an evaluation date 24h + EVALUATION_END_OFFSET_HOURS in the past would technically still be in the future.\n if vote_end_date and date_to_datetime(vote_end_date) + timedelta(hours=24 + settings.EVALUATION_END_OFFSET_HOURS) < datetime.now():\n raise forms.ValidationError(_(\"The last day of evaluation must be in the future.\"))\n return vote_end_date\n\n def clean_general_questionnaires(self):\n # Ensure all locked questionnaires still have the same status (included or not)\n not_locked = []\n if self.cleaned_data.get('general_questionnaires'):\n not_locked = list(self.cleaned_data.get('general_questionnaires').filter(is_locked=False))\n\n locked = list(self.instance.general_contribution.questionnaires.filter(is_locked=True))\n\n if not not_locked + locked:\n self.add_error(\"general_questionnaires\", _(\"At least one questionnaire must be selected.\"))\n\n return not_locked + locked\n\n def save(self, *args, **kw):\n evaluation = super().save(*args, **kw)\n evaluation.general_contribution.questionnaires.set(self.cleaned_data.get('general_questionnaires'))\n return evaluation\n\n\nclass EditorContributionForm(ContributionForm):\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n existing_contributor_pk = self.instance.contributor.pk if self.instance.contributor else None\n\n self.fields['questionnaires'].queryset = Questionnaire.objects.contributor_questionnaires().filter(\n Q(visibility=Questionnaire.Visibility.EDITORS) | Q(contributions__evaluation=self.evaluation)).distinct()\n self.fields['contributor'].queryset = UserProfile.objects.filter(\n (Q(is_active=True) & Q(is_proxy_user=False)) | Q(pk=existing_contributor_pk)\n )\n\n\nclass DelegatesForm(forms.ModelForm):\n delegates = UserModelMultipleChoiceField(queryset=UserProfile.objects.exclude(is_active=False).exclude(is_proxy_user=True),\n required=False)\n\n class Meta:\n model = UserProfile\n fields = ('delegates',)\n field_classes = {\n 'delegates': UserModelMultipleChoiceField,\n }\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n def save(self, *args, **kw):\n super().save(*args, **kw)\n logger.info('User \"{}\" edited the settings.'.format(self.instance.email))\n\n\nclass DelegateSelectionForm(forms.Form):\n delegate_to = UserModelChoiceField(label=_(\"Delegate to\"),\n queryset=UserProfile.objects.exclude(is_active=False).exclude(is_proxy_user=True))\n", "path": "evap/contributor/forms.py"}]} | 1,768 | 384 |
gh_patches_debug_25270 | rasdani/github-patches | git_diff | hpcaitech__ColossalAI-2635 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[tensor] fix some unittests
[tensor] fix some unittests
[tensor] fix some unittests
[DOC]: the sphinx theme is too old
### 📚 The doc issue
As stated in #2579 , we want to use Read the Docs to host our documentation. In this way, tutorials and API documentations will be visited from a single entry. This issue will mainly discuss the appearance of the RTD website. Ideally, we should use Tailwind for style consistency. However, it can take some time to implement a tailwind-based theme, therefore, we should use an existing theme which looks more modern first.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `docs/conf.py`
Content:
```
1 # Configuration file for the Sphinx documentation builder.
2 #
3 # This file only contains a selection of the most common options. For a full
4 # list see the documentation:
5 # https://www.sphinx-doc.org/en/master/usage/configuration.html
6
7 # -- Path setup --------------------------------------------------------------
8
9 import datetime
10 # If extensions (or modules to document with autodoc) are in another directory,
11 # add these directories to sys.path here. If the directory is relative to the
12 # documentation root, use os.path.abspath to make it absolute, like shown here.
13 #
14 import os
15 import sys
16
17 sys.path.insert(0, os.path.abspath('..'))
18
19 # -- Project information -----------------------------------------------------
20
21 project = 'Colossal-AI'
22 copyright = f'{datetime.datetime.now().year}, HPC-AI Tech'
23 author = 'HPC-AI Technology Inc.'
24
25 # The full version, including alpha/beta/rc tags
26 release = '0.0.1'
27
28
29 # -- General configuration ---------------------------------------------------
30
31 # Add any Sphinx extension module names here, as strings. They can be
32 # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
33 # ones.
34 extensions = [
35 'sphinx.ext.autodoc',
36 'sphinx.ext.mathjax',
37 'sphinx.ext.napoleon',
38 'sphinx.ext.linkcode',
39 'myst_parser',
40 ]
41
42 # Disable docstring inheritance
43 autodoc_inherit_docstrings = False
44
45 # Disable displaying type annotations, these can be very verbose
46 autodoc_typehints = 'none'
47
48 # Enable overriding of function signatures in the first line of the docstring.
49 autodoc_docstring_signature = True
50 autodoc_default_options = {
51 'member-order': 'bysource',
52 }
53
54 # Add any paths that contain templates here, relative to this directory.
55 templates_path = ['_templates']
56
57 # List of patterns, relative to source directory, that match files and
58 # directories to ignore when looking for source files.
59 # This pattern also affects html_static_path and html_extra_path.
60 exclude_patterns = ['.build', 'Thumbs.db', '.DS_Store']
61
62 # -- Options for HTML output -------------------------------------------------
63
64 # The theme to use for HTML and HTML Help pages. See the documentation for
65 # a list of builtin themes.
66 #
67 html_theme = 'sphinx_rtd_theme'
68 html_show_sourcelink = False
69 html_theme_options = {
70 'navigation_depth': 3,
71 }
72
73 html_context = {
74 'display_github': False,
75 'github_user': 'hpcaitech',
76 'github_repo': 'ColossalAI',
77 # 'github_version': 'master/docs/',
78 }
79
80 # Add any paths that contain custom static files (such as style sheets) here,
81 # relative to this directory. They are copied after the builtin static files,
82 # so a file named "default.css" will overwrite the builtin "default.css".
83 html_static_path = ['_static']
84
85 html_css_files = [
86 'css/rtd_theme.css',
87 ]
88
89 # -- Extension configuration -------------------------------------------------
90 source_suffix = ['.rst', '.md', '.MD']
91
92 import inspect
93 import colossalai
94 def linkcode_resolve(domain, info):
95 """
96 Determine the URL corresponding to Python object
97 """
98 if domain != 'py':
99 return None
100
101 modname = info['module']
102 fullname = info['fullname']
103
104 submod = sys.modules.get(modname)
105 if submod is None:
106 return None
107
108 obj = submod
109 for part in fullname.split('.'):
110 try:
111 obj = getattr(obj, part)
112 except Exception:
113 return None
114
115 try:
116 fn = inspect.getsourcefile(obj)
117 except Exception:
118 fn = None
119 if not fn:
120 return None
121
122 try:
123 source, lineno = inspect.findsource(obj)
124 except Exception:
125 lineno = None
126
127 if lineno:
128 linespec = "#L%d" % (lineno + 1)
129 else:
130 linespec = ""
131
132 fn = os.path.relpath(fn, start=os.path.dirname(colossalai.__file__))
133
134 github = "https://github.com/hpcaitech/ColossalAI/blob/main/colossalai/{}{}"
135 return github.format(fn, linespec)
136
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/docs/conf.py b/docs/conf.py
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -23,8 +23,7 @@
author = 'HPC-AI Technology Inc.'
# The full version, including alpha/beta/rc tags
-release = '0.0.1'
-
+# release = '0.0.1'
# -- General configuration ---------------------------------------------------
@@ -64,14 +63,14 @@
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
-html_theme = 'sphinx_rtd_theme'
+html_theme = 'sphinx_book_theme'
html_show_sourcelink = False
html_theme_options = {
'navigation_depth': 3,
}
html_context = {
- 'display_github': False,
+ 'display_github': True,
'github_user': 'hpcaitech',
'github_repo': 'ColossalAI',
# 'github_version': 'master/docs/',
@@ -90,7 +89,10 @@
source_suffix = ['.rst', '.md', '.MD']
import inspect
+
import colossalai
+
+
def linkcode_resolve(domain, info):
"""
Determine the URL corresponding to Python object
| {"golden_diff": "diff --git a/docs/conf.py b/docs/conf.py\n--- a/docs/conf.py\n+++ b/docs/conf.py\n@@ -23,8 +23,7 @@\n author = 'HPC-AI Technology Inc.'\n \n # The full version, including alpha/beta/rc tags\n-release = '0.0.1'\n-\n+# release = '0.0.1'\n \n # -- General configuration ---------------------------------------------------\n \n@@ -64,14 +63,14 @@\n # The theme to use for HTML and HTML Help pages. See the documentation for\n # a list of builtin themes.\n #\n-html_theme = 'sphinx_rtd_theme'\n+html_theme = 'sphinx_book_theme'\n html_show_sourcelink = False\n html_theme_options = {\n 'navigation_depth': 3,\n }\n \n html_context = {\n- 'display_github': False,\n+ 'display_github': True,\n 'github_user': 'hpcaitech',\n 'github_repo': 'ColossalAI',\n # 'github_version': 'master/docs/',\n@@ -90,7 +89,10 @@\n source_suffix = ['.rst', '.md', '.MD']\n \n import inspect\n+\n import colossalai\n+\n+\n def linkcode_resolve(domain, info):\n \"\"\"\n Determine the URL corresponding to Python object\n", "issue": "[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[DOC]: the sphinx theme is too old\n### \ud83d\udcda The doc issue\n\nAs stated in #2579 , we want to use Read the Docs to host our documentation. In this way, tutorials and API documentations will be visited from a single entry. This issue will mainly discuss the appearance of the RTD website. Ideally, we should use Tailwind for style consistency. However, it can take some time to implement a tailwind-based theme, therefore, we should use an existing theme which looks more modern first.\n", "before_files": [{"content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\nimport datetime\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\nimport os\nimport sys\n\nsys.path.insert(0, os.path.abspath('..'))\n\n# -- Project information -----------------------------------------------------\n\nproject = 'Colossal-AI'\ncopyright = f'{datetime.datetime.now().year}, HPC-AI Tech'\nauthor = 'HPC-AI Technology Inc.'\n\n# The full version, including alpha/beta/rc tags\nrelease = '0.0.1'\n\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n 'sphinx.ext.autodoc',\n 'sphinx.ext.mathjax',\n 'sphinx.ext.napoleon',\n 'sphinx.ext.linkcode',\n 'myst_parser',\n]\n\n# Disable docstring inheritance\nautodoc_inherit_docstrings = False\n\n# Disable displaying type annotations, these can be very verbose\nautodoc_typehints = 'none'\n\n# Enable overriding of function signatures in the first line of the docstring.\nautodoc_docstring_signature = True\nautodoc_default_options = {\n 'member-order': 'bysource',\n}\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = ['.build', 'Thumbs.db', '.DS_Store']\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nhtml_theme = 'sphinx_rtd_theme'\nhtml_show_sourcelink = False\nhtml_theme_options = {\n 'navigation_depth': 3,\n}\n\nhtml_context = {\n 'display_github': False,\n 'github_user': 'hpcaitech',\n 'github_repo': 'ColossalAI',\n # 'github_version': 'master/docs/',\n}\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\nhtml_css_files = [\n 'css/rtd_theme.css',\n]\n\n# -- Extension configuration -------------------------------------------------\nsource_suffix = ['.rst', '.md', '.MD']\n\nimport inspect\nimport colossalai\ndef linkcode_resolve(domain, info):\n \"\"\"\n Determine the URL corresponding to Python object\n \"\"\"\n if domain != 'py':\n return None\n\n modname = info['module']\n fullname = info['fullname']\n\n submod = sys.modules.get(modname)\n if submod is None:\n return None\n\n obj = submod\n for part in fullname.split('.'):\n try:\n obj = getattr(obj, part)\n except Exception:\n return None\n\n try:\n fn = inspect.getsourcefile(obj)\n except Exception:\n fn = None\n if not fn:\n return None\n\n try:\n source, lineno = inspect.findsource(obj)\n except Exception:\n lineno = None\n\n if lineno:\n linespec = \"#L%d\" % (lineno + 1)\n else:\n linespec = \"\"\n\n fn = os.path.relpath(fn, start=os.path.dirname(colossalai.__file__))\n\n github = \"https://github.com/hpcaitech/ColossalAI/blob/main/colossalai/{}{}\"\n return github.format(fn, linespec)\n", "path": "docs/conf.py"}], "after_files": [{"content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\nimport datetime\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\nimport os\nimport sys\n\nsys.path.insert(0, os.path.abspath('..'))\n\n# -- Project information -----------------------------------------------------\n\nproject = 'Colossal-AI'\ncopyright = f'{datetime.datetime.now().year}, HPC-AI Tech'\nauthor = 'HPC-AI Technology Inc.'\n\n# The full version, including alpha/beta/rc tags\n# release = '0.0.1'\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n 'sphinx.ext.autodoc',\n 'sphinx.ext.mathjax',\n 'sphinx.ext.napoleon',\n 'sphinx.ext.linkcode',\n 'myst_parser',\n]\n\n# Disable docstring inheritance\nautodoc_inherit_docstrings = False\n\n# Disable displaying type annotations, these can be very verbose\nautodoc_typehints = 'none'\n\n# Enable overriding of function signatures in the first line of the docstring.\nautodoc_docstring_signature = True\nautodoc_default_options = {\n 'member-order': 'bysource',\n}\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = ['.build', 'Thumbs.db', '.DS_Store']\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nhtml_theme = 'sphinx_book_theme'\nhtml_show_sourcelink = False\nhtml_theme_options = {\n 'navigation_depth': 3,\n}\n\nhtml_context = {\n 'display_github': True,\n 'github_user': 'hpcaitech',\n 'github_repo': 'ColossalAI',\n # 'github_version': 'master/docs/',\n}\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\nhtml_css_files = [\n 'css/rtd_theme.css',\n]\n\n# -- Extension configuration -------------------------------------------------\nsource_suffix = ['.rst', '.md', '.MD']\n\nimport inspect\n\nimport colossalai\n\n\ndef linkcode_resolve(domain, info):\n \"\"\"\n Determine the URL corresponding to Python object\n \"\"\"\n if domain != 'py':\n return None\n\n modname = info['module']\n fullname = info['fullname']\n\n submod = sys.modules.get(modname)\n if submod is None:\n return None\n\n obj = submod\n for part in fullname.split('.'):\n try:\n obj = getattr(obj, part)\n except Exception:\n return None\n\n try:\n fn = inspect.getsourcefile(obj)\n except Exception:\n fn = None\n if not fn:\n return None\n\n try:\n source, lineno = inspect.findsource(obj)\n except Exception:\n lineno = None\n\n if lineno:\n linespec = \"#L%d\" % (lineno + 1)\n else:\n linespec = \"\"\n\n fn = os.path.relpath(fn, start=os.path.dirname(colossalai.__file__))\n\n github = \"https://github.com/hpcaitech/ColossalAI/blob/main/colossalai/{}{}\"\n return github.format(fn, linespec)\n", "path": "docs/conf.py"}]} | 1,589 | 279 |
gh_patches_debug_25916 | rasdani/github-patches | git_diff | nf-core__tools-381 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
problem with nfcore_cache.sqlite within /tmp
Hi all,
I think will be a nice idea to have the nfcore_cache.sqlite within a subfolder in tmp because if two users use the program at the same time the privileges will prevent to use the tool.
For example I cannot even use nf-core --help
Luca
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nf_core/utils.py`
Content:
```
1 #!/usr/bin/env python
2 """
3 Common utility functions for the nf-core python package.
4 """
5
6 import datetime
7 import json
8 import logging
9 import os
10 import subprocess
11 import tempfile
12
13 def fetch_wf_config(wf_path, wf=None):
14 """Uses Nextflow to retrieve the the configuration variables
15 from a Nextflow workflow.
16
17 Args:
18 wf_path (str): Nextflow workflow file system path.
19
20 Returns:
21 dict: Workflow configuration settings.
22 """
23
24 config = dict()
25 cache_fn = None
26 cache_basedir = None
27 cache_path = None
28
29 # Build a cache directory if we can
30 if os.path.isdir(os.path.join(os.getenv("HOME"), '.nextflow')):
31 cache_basedir = os.path.join(os.getenv("HOME"), '.nextflow', 'nf-core')
32 if not os.path.isdir(cache_basedir):
33 os.mkdir(cache_basedir)
34
35 # If we're given a workflow object with a commit, see if we have a cached copy
36 if cache_basedir and wf and wf.full_name and wf.commit_sha:
37 cache_fn = '{}-{}.json'.format(wf.full_name.replace(os.path.sep, '-'), wf.commit_sha)
38 cache_path = os.path.join(cache_basedir, cache_fn)
39 if os.path.isfile(cache_path):
40 logging.debug("Found a config cache, loading: {}".format(cache_path))
41 with open(cache_path, 'r') as fh:
42 config = json.load(fh)
43 return config
44
45
46 # Call `nextflow config` and pipe stderr to /dev/null
47 try:
48 with open(os.devnull, 'w') as devnull:
49 nfconfig_raw = subprocess.check_output(['nextflow', 'config', '-flat', wf_path], stderr=devnull)
50 except OSError as e:
51 if e.errno == os.errno.ENOENT:
52 raise AssertionError("It looks like Nextflow is not installed. It is required for most nf-core functions.")
53 except subprocess.CalledProcessError as e:
54 raise AssertionError("`nextflow config` returned non-zero error code: %s,\n %s", e.returncode, e.output)
55 else:
56 for l in nfconfig_raw.splitlines():
57 ul = l.decode('utf-8')
58 k, v = ul.split(' = ', 1)
59 config[k] = v
60
61 # If we can, save a cached copy
62 if cache_path:
63 logging.debug("Saving config cache: {}".format(cache_path))
64 with open(cache_path, 'w') as fh:
65 json.dump(config, fh, indent=4)
66
67 return config
68
69
70 def setup_requests_cachedir():
71 """Sets up local caching for faster remote HTTP requests.
72
73 Caching directory will be generated by tempfile.gettempdir() under
74 a nfcore_cache subdir.
75 """
76 # Only import it if we need it
77 import requests_cache
78
79 cachedir = os.path.join(tempfile.gettempdir(), 'nfcore_cache')
80 if not os.path.exists(cachedir):
81 os.mkdir(cachedir)
82 requests_cache.install_cache(
83 os.path.join(cachedir, 'nfcore_cache'),
84 expire_after=datetime.timedelta(hours=1),
85 backend='sqlite',
86 )
87 # Make world-writeable so that multi-user installations work
88 os.chmod(cachedir, 0o777)
89 os.chmod(os.path.join(cachedir, 'nfcore_cache.sqlite'), 0o777)
90
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/nf_core/utils.py b/nf_core/utils.py
--- a/nf_core/utils.py
+++ b/nf_core/utils.py
@@ -8,7 +8,6 @@
import logging
import os
import subprocess
-import tempfile
def fetch_wf_config(wf_path, wf=None):
"""Uses Nextflow to retrieve the the configuration variables
@@ -70,20 +69,18 @@
def setup_requests_cachedir():
"""Sets up local caching for faster remote HTTP requests.
- Caching directory will be generated by tempfile.gettempdir() under
- a nfcore_cache subdir.
+ Caching directory will be set up in the user's home directory under
+ a .nfcore_cache subdir.
"""
# Only import it if we need it
import requests_cache
+
- cachedir = os.path.join(tempfile.gettempdir(), 'nfcore_cache')
+ cachedir = os.path.join(os.getenv("HOME"), os.path.join('.nfcore', 'cache'))
if not os.path.exists(cachedir):
- os.mkdir(cachedir)
+ os.makedirs(cachedir)
requests_cache.install_cache(
- os.path.join(cachedir, 'nfcore_cache'),
+ os.path.join(cachedir, 'github_info'),
expire_after=datetime.timedelta(hours=1),
backend='sqlite',
)
- # Make world-writeable so that multi-user installations work
- os.chmod(cachedir, 0o777)
- os.chmod(os.path.join(cachedir, 'nfcore_cache.sqlite'), 0o777)
| {"golden_diff": "diff --git a/nf_core/utils.py b/nf_core/utils.py\n--- a/nf_core/utils.py\n+++ b/nf_core/utils.py\n@@ -8,7 +8,6 @@\n import logging\n import os\n import subprocess\n-import tempfile\n \n def fetch_wf_config(wf_path, wf=None):\n \"\"\"Uses Nextflow to retrieve the the configuration variables\n@@ -70,20 +69,18 @@\n def setup_requests_cachedir():\n \"\"\"Sets up local caching for faster remote HTTP requests.\n \n- Caching directory will be generated by tempfile.gettempdir() under\n- a nfcore_cache subdir.\n+ Caching directory will be set up in the user's home directory under\n+ a .nfcore_cache subdir.\n \"\"\"\n # Only import it if we need it\n import requests_cache\n+ \n \n- cachedir = os.path.join(tempfile.gettempdir(), 'nfcore_cache')\n+ cachedir = os.path.join(os.getenv(\"HOME\"), os.path.join('.nfcore', 'cache'))\n if not os.path.exists(cachedir):\n- os.mkdir(cachedir)\n+ os.makedirs(cachedir)\n requests_cache.install_cache(\n- os.path.join(cachedir, 'nfcore_cache'),\n+ os.path.join(cachedir, 'github_info'),\n expire_after=datetime.timedelta(hours=1),\n backend='sqlite',\n )\n- # Make world-writeable so that multi-user installations work\n- os.chmod(cachedir, 0o777)\n- os.chmod(os.path.join(cachedir, 'nfcore_cache.sqlite'), 0o777)\n", "issue": "problem with nfcore_cache.sqlite within /tmp\nHi all,\r\nI think will be a nice idea to have the nfcore_cache.sqlite within a subfolder in tmp because if two users use the program at the same time the privileges will prevent to use the tool.\r\n\r\nFor example I cannot even use nf-core --help \r\n\r\nLuca\n", "before_files": [{"content": "#!/usr/bin/env python\n\"\"\"\nCommon utility functions for the nf-core python package.\n\"\"\"\n\nimport datetime\nimport json\nimport logging\nimport os\nimport subprocess\nimport tempfile\n\ndef fetch_wf_config(wf_path, wf=None):\n \"\"\"Uses Nextflow to retrieve the the configuration variables\n from a Nextflow workflow.\n\n Args:\n wf_path (str): Nextflow workflow file system path.\n\n Returns:\n dict: Workflow configuration settings.\n \"\"\"\n\n config = dict()\n cache_fn = None\n cache_basedir = None\n cache_path = None\n\n # Build a cache directory if we can\n if os.path.isdir(os.path.join(os.getenv(\"HOME\"), '.nextflow')):\n cache_basedir = os.path.join(os.getenv(\"HOME\"), '.nextflow', 'nf-core')\n if not os.path.isdir(cache_basedir):\n os.mkdir(cache_basedir)\n\n # If we're given a workflow object with a commit, see if we have a cached copy\n if cache_basedir and wf and wf.full_name and wf.commit_sha:\n cache_fn = '{}-{}.json'.format(wf.full_name.replace(os.path.sep, '-'), wf.commit_sha)\n cache_path = os.path.join(cache_basedir, cache_fn)\n if os.path.isfile(cache_path):\n logging.debug(\"Found a config cache, loading: {}\".format(cache_path))\n with open(cache_path, 'r') as fh:\n config = json.load(fh)\n return config\n\n\n # Call `nextflow config` and pipe stderr to /dev/null\n try:\n with open(os.devnull, 'w') as devnull:\n nfconfig_raw = subprocess.check_output(['nextflow', 'config', '-flat', wf_path], stderr=devnull)\n except OSError as e:\n if e.errno == os.errno.ENOENT:\n raise AssertionError(\"It looks like Nextflow is not installed. It is required for most nf-core functions.\")\n except subprocess.CalledProcessError as e:\n raise AssertionError(\"`nextflow config` returned non-zero error code: %s,\\n %s\", e.returncode, e.output)\n else:\n for l in nfconfig_raw.splitlines():\n ul = l.decode('utf-8')\n k, v = ul.split(' = ', 1)\n config[k] = v\n\n # If we can, save a cached copy\n if cache_path:\n logging.debug(\"Saving config cache: {}\".format(cache_path))\n with open(cache_path, 'w') as fh:\n json.dump(config, fh, indent=4)\n\n return config\n\n\ndef setup_requests_cachedir():\n \"\"\"Sets up local caching for faster remote HTTP requests.\n\n Caching directory will be generated by tempfile.gettempdir() under\n a nfcore_cache subdir.\n \"\"\"\n # Only import it if we need it\n import requests_cache\n\n cachedir = os.path.join(tempfile.gettempdir(), 'nfcore_cache')\n if not os.path.exists(cachedir):\n os.mkdir(cachedir)\n requests_cache.install_cache(\n os.path.join(cachedir, 'nfcore_cache'),\n expire_after=datetime.timedelta(hours=1),\n backend='sqlite',\n )\n # Make world-writeable so that multi-user installations work\n os.chmod(cachedir, 0o777)\n os.chmod(os.path.join(cachedir, 'nfcore_cache.sqlite'), 0o777)\n", "path": "nf_core/utils.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\"\"\"\nCommon utility functions for the nf-core python package.\n\"\"\"\n\nimport datetime\nimport json\nimport logging\nimport os\nimport subprocess\n\ndef fetch_wf_config(wf_path, wf=None):\n \"\"\"Uses Nextflow to retrieve the the configuration variables\n from a Nextflow workflow.\n\n Args:\n wf_path (str): Nextflow workflow file system path.\n\n Returns:\n dict: Workflow configuration settings.\n \"\"\"\n\n config = dict()\n cache_fn = None\n cache_basedir = None\n cache_path = None\n\n # Build a cache directory if we can\n if os.path.isdir(os.path.join(os.getenv(\"HOME\"), '.nextflow')):\n cache_basedir = os.path.join(os.getenv(\"HOME\"), '.nextflow', 'nf-core')\n if not os.path.isdir(cache_basedir):\n os.mkdir(cache_basedir)\n\n # If we're given a workflow object with a commit, see if we have a cached copy\n if cache_basedir and wf and wf.full_name and wf.commit_sha:\n cache_fn = '{}-{}.json'.format(wf.full_name.replace(os.path.sep, '-'), wf.commit_sha)\n cache_path = os.path.join(cache_basedir, cache_fn)\n if os.path.isfile(cache_path):\n logging.debug(\"Found a config cache, loading: {}\".format(cache_path))\n with open(cache_path, 'r') as fh:\n config = json.load(fh)\n return config\n\n\n # Call `nextflow config` and pipe stderr to /dev/null\n try:\n with open(os.devnull, 'w') as devnull:\n nfconfig_raw = subprocess.check_output(['nextflow', 'config', '-flat', wf_path], stderr=devnull)\n except OSError as e:\n if e.errno == os.errno.ENOENT:\n raise AssertionError(\"It looks like Nextflow is not installed. It is required for most nf-core functions.\")\n except subprocess.CalledProcessError as e:\n raise AssertionError(\"`nextflow config` returned non-zero error code: %s,\\n %s\", e.returncode, e.output)\n else:\n for l in nfconfig_raw.splitlines():\n ul = l.decode('utf-8')\n k, v = ul.split(' = ', 1)\n config[k] = v\n\n # If we can, save a cached copy\n if cache_path:\n logging.debug(\"Saving config cache: {}\".format(cache_path))\n with open(cache_path, 'w') as fh:\n json.dump(config, fh, indent=4)\n\n return config\n\n\ndef setup_requests_cachedir():\n \"\"\"Sets up local caching for faster remote HTTP requests.\n\n Caching directory will be set up in the user's home directory under\n a .nfcore_cache subdir.\n \"\"\"\n # Only import it if we need it\n import requests_cache\n \n\n cachedir = os.path.join(os.getenv(\"HOME\"), os.path.join('.nfcore', 'cache'))\n if not os.path.exists(cachedir):\n os.makedirs(cachedir)\n requests_cache.install_cache(\n os.path.join(cachedir, 'github_info'),\n expire_after=datetime.timedelta(hours=1),\n backend='sqlite',\n )\n", "path": "nf_core/utils.py"}]} | 1,235 | 352 |
gh_patches_debug_22168 | rasdani/github-patches | git_diff | freedomofpress__securedrop-7120 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
development server no longer hot-reloads on file changes
## Description
After #6563, `make dev` no longer hot-reloads on file changes, because the immutable `SDConfig` doesn't carry the `env` attribute from `config.py`. (This regression was last encountered in #5594.)
## Steps to Reproduce
Same as #5594.
## Expected Behavior
Same as #5594.
## Actual Behavior
Same as #5594.
## Comments
The fix in <https://github.com/freedomofpress/securedrop/issues/6669#issuecomment-1526129678> is trivial, but I want to take a moment to see if there's an easy way to test for this regression (fool me once, fool me twice, etc.).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `securedrop/sdconfig.py`
Content:
```
1 from dataclasses import dataclass
2 from importlib import import_module
3 from pathlib import Path
4 from typing import Dict, List, Optional
5
6 FALLBACK_LOCALE = "en_US"
7
8 DEFAULT_SECUREDROP_ROOT = Path(__file__).absolute().parent
9
10
11 @dataclass(frozen=True)
12 class _FlaskAppConfig:
13 """Config fields that are common to the Journalist and Source interfaces."""
14
15 SESSION_COOKIE_NAME: str
16 SECRET_KEY: str
17
18 DEBUG: bool
19 TESTING: bool
20 WTF_CSRF_ENABLED: bool
21
22 # Use MAX_CONTENT_LENGTH to mimic the behavior of Apache's LimitRequestBody
23 # in the development environment. See #1714.
24 MAX_CONTENT_LENGTH: int
25
26 # This is recommended for performance, and also resolves #369
27 USE_X_SENDFILE: bool
28
29
30 @dataclass(frozen=True)
31 class JournalistInterfaceConfig(_FlaskAppConfig):
32 # Additional config for JI Redis sessions
33 SESSION_SIGNER_SALT: str = "js_session"
34 SESSION_KEY_PREFIX: str = "js_session:"
35 SESSION_LIFETIME: int = 2 * 60 * 60
36 SESSION_RENEW_COUNT: int = 5
37
38
39 @dataclass(frozen=True)
40 class SourceInterfaceConfig(_FlaskAppConfig):
41 pass
42
43
44 @dataclass(frozen=True)
45 class SecureDropConfig:
46 JOURNALIST_APP_FLASK_CONFIG_CLS: JournalistInterfaceConfig
47 SOURCE_APP_FLASK_CONFIG_CLS: SourceInterfaceConfig
48
49 GPG_KEY_DIR: Path
50 JOURNALIST_KEY: str
51 SCRYPT_GPG_PEPPER: str
52 SCRYPT_ID_PEPPER: str
53 SCRYPT_PARAMS: Dict[str, int]
54
55 SECUREDROP_DATA_ROOT: Path
56
57 DATABASE_FILE: Path # Path to the sqlite DB file
58
59 SECUREDROP_ROOT: Path
60 STATIC_DIR: Path
61 TRANSLATION_DIRS: Path
62 SOURCE_TEMPLATES_DIR: Path
63 JOURNALIST_TEMPLATES_DIR: Path
64 NOUNS: Path
65 ADJECTIVES: Path
66
67 DEFAULT_LOCALE: str
68 SUPPORTED_LOCALES: List[str]
69
70 SESSION_EXPIRATION_MINUTES: float
71
72 RQ_WORKER_NAME: str
73
74 @property
75 def TEMP_DIR(self) -> Path:
76 # We use a directory under the SECUREDROP_DATA_ROOT instead of `/tmp` because
77 # we need to expose this directory via X-Send-File, and want to minimize the
78 # potential for exposing unintended files.
79 return self.SECUREDROP_DATA_ROOT / "tmp"
80
81 @property
82 def STORE_DIR(self) -> Path:
83 return self.SECUREDROP_DATA_ROOT / "store"
84
85 @property
86 def DATABASE_URI(self) -> str:
87 return f"sqlite:///{self.DATABASE_FILE}"
88
89 @classmethod
90 def get_current(cls) -> "SecureDropConfig":
91 global _current_config
92 if _current_config is None:
93 # Retrieve the config by parsing it from ./config.py
94 _current_config = _parse_config_from_file(config_module_name="config")
95 return _current_config
96
97
98 _current_config: Optional[SecureDropConfig] = None
99
100
101 def _parse_config_from_file(config_module_name: str) -> SecureDropConfig:
102 """Parse the config from a config.py file."""
103 config_from_local_file = import_module(config_module_name)
104
105 # Parse the local config; as there are SD instances with very old config files
106 # the parsing logic here has to assume some values might be missing, and hence
107 # set default values for such config entries
108 final_default_locale = getattr(config_from_local_file, "DEFAULT_LOCALE", FALLBACK_LOCALE)
109 final_supp_locales = getattr(config_from_local_file, "SUPPORTED_LOCALES", [FALLBACK_LOCALE])
110 final_sess_expiration_mins = getattr(config_from_local_file, "SESSION_EXPIRATION_MINUTES", 120)
111
112 final_worker_name = getattr(config_from_local_file, "RQ_WORKER_NAME", "default")
113
114 final_scrypt_params = getattr(
115 config_from_local_file, "SCRYPT_PARAMS", dict(N=2**14, r=8, p=1)
116 )
117
118 try:
119 final_securedrop_root = Path(config_from_local_file.SECUREDROP_ROOT)
120 except AttributeError:
121 final_securedrop_root = DEFAULT_SECUREDROP_ROOT
122
123 try:
124 final_securedrop_data_root = Path(config_from_local_file.SECUREDROP_DATA_ROOT)
125 except AttributeError:
126 final_securedrop_data_root = Path("/var/lib/securedrop")
127
128 try:
129 final_db_file = Path(config_from_local_file.DATABASE_FILE)
130 except AttributeError:
131 final_db_file = final_securedrop_data_root / "db.sqlite"
132
133 try:
134 final_gpg_key_dir = Path(config_from_local_file.GPG_KEY_DIR)
135 except AttributeError:
136 final_gpg_key_dir = final_securedrop_data_root / "keys"
137
138 try:
139 final_nouns = Path(config_from_local_file.NOUNS)
140 except AttributeError:
141 final_nouns = final_securedrop_root / "dictionaries" / "nouns.txt"
142
143 try:
144 final_adjectives = Path(config_from_local_file.ADJECTIVES)
145 except AttributeError:
146 final_adjectives = final_securedrop_root / "dictionaries" / "adjectives.txt"
147
148 try:
149 final_static_dir = Path(config_from_local_file.STATIC_DIR) # type: ignore
150 except AttributeError:
151 final_static_dir = final_securedrop_root / "static"
152
153 try:
154 final_transl_dir = Path(config_from_local_file.TRANSLATION_DIRS) # type: ignore
155 except AttributeError:
156 final_transl_dir = final_securedrop_root / "translations"
157
158 try:
159 final_source_tmpl_dir = Path(config_from_local_file.SOURCE_TEMPLATES_DIR)
160 except AttributeError:
161 final_source_tmpl_dir = final_securedrop_root / "source_templates"
162
163 try:
164 final_journ_tmpl_dir = Path(config_from_local_file.JOURNALIST_TEMPLATES_DIR)
165 except AttributeError:
166 final_journ_tmpl_dir = final_securedrop_root / "journalist_templates"
167
168 # Parse the Flask configurations
169 journ_flask_config = config_from_local_file.JournalistInterfaceFlaskConfig
170 parsed_journ_flask_config = JournalistInterfaceConfig(
171 SECRET_KEY=journ_flask_config.SECRET_KEY,
172 SESSION_COOKIE_NAME=getattr(journ_flask_config, "SESSION_COOKIE_NAME", "js"),
173 DEBUG=getattr(journ_flask_config, "DEBUG", False),
174 TESTING=getattr(journ_flask_config, "TESTING", False),
175 WTF_CSRF_ENABLED=getattr(journ_flask_config, "WTF_CSRF_ENABLED", True),
176 MAX_CONTENT_LENGTH=getattr(journ_flask_config, "MAX_CONTENT_LENGTH", 524288000),
177 USE_X_SENDFILE=getattr(journ_flask_config, "USE_X_SENDFILE", False),
178 )
179 source_flask_config = config_from_local_file.SourceInterfaceFlaskConfig
180 parsed_source_flask_config = SourceInterfaceConfig(
181 SECRET_KEY=source_flask_config.SECRET_KEY,
182 SESSION_COOKIE_NAME=getattr(journ_flask_config, "SESSION_COOKIE_NAME", "ss"),
183 DEBUG=getattr(journ_flask_config, "DEBUG", False),
184 TESTING=getattr(journ_flask_config, "TESTING", False),
185 WTF_CSRF_ENABLED=getattr(journ_flask_config, "WTF_CSRF_ENABLED", True),
186 MAX_CONTENT_LENGTH=getattr(journ_flask_config, "MAX_CONTENT_LENGTH", 524288000),
187 USE_X_SENDFILE=getattr(journ_flask_config, "USE_X_SENDFILE", False),
188 )
189
190 return SecureDropConfig(
191 JOURNALIST_APP_FLASK_CONFIG_CLS=parsed_journ_flask_config,
192 SOURCE_APP_FLASK_CONFIG_CLS=parsed_source_flask_config,
193 GPG_KEY_DIR=final_gpg_key_dir,
194 JOURNALIST_KEY=config_from_local_file.JOURNALIST_KEY,
195 SCRYPT_GPG_PEPPER=config_from_local_file.SCRYPT_GPG_PEPPER,
196 SCRYPT_ID_PEPPER=config_from_local_file.SCRYPT_ID_PEPPER,
197 SCRYPT_PARAMS=final_scrypt_params,
198 SECUREDROP_DATA_ROOT=final_securedrop_data_root,
199 SECUREDROP_ROOT=final_securedrop_root,
200 DATABASE_FILE=final_db_file,
201 STATIC_DIR=final_static_dir,
202 TRANSLATION_DIRS=final_transl_dir,
203 SOURCE_TEMPLATES_DIR=final_source_tmpl_dir,
204 JOURNALIST_TEMPLATES_DIR=final_journ_tmpl_dir,
205 NOUNS=final_nouns,
206 ADJECTIVES=final_adjectives,
207 DEFAULT_LOCALE=final_default_locale,
208 SUPPORTED_LOCALES=final_supp_locales,
209 SESSION_EXPIRATION_MINUTES=final_sess_expiration_mins,
210 RQ_WORKER_NAME=final_worker_name,
211 )
212
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/securedrop/sdconfig.py b/securedrop/sdconfig.py
--- a/securedrop/sdconfig.py
+++ b/securedrop/sdconfig.py
@@ -71,6 +71,8 @@
RQ_WORKER_NAME: str
+ env: str = "prod"
+
@property
def TEMP_DIR(self) -> Path:
# We use a directory under the SECUREDROP_DATA_ROOT instead of `/tmp` because
@@ -115,6 +117,8 @@
config_from_local_file, "SCRYPT_PARAMS", dict(N=2**14, r=8, p=1)
)
+ env = getattr(config_from_local_file, "env", "prod")
+
try:
final_securedrop_root = Path(config_from_local_file.SECUREDROP_ROOT)
except AttributeError:
@@ -188,6 +192,7 @@
)
return SecureDropConfig(
+ env=env,
JOURNALIST_APP_FLASK_CONFIG_CLS=parsed_journ_flask_config,
SOURCE_APP_FLASK_CONFIG_CLS=parsed_source_flask_config,
GPG_KEY_DIR=final_gpg_key_dir,
| {"golden_diff": "diff --git a/securedrop/sdconfig.py b/securedrop/sdconfig.py\n--- a/securedrop/sdconfig.py\n+++ b/securedrop/sdconfig.py\n@@ -71,6 +71,8 @@\n \n RQ_WORKER_NAME: str\n \n+ env: str = \"prod\"\n+\n @property\n def TEMP_DIR(self) -> Path:\n # We use a directory under the SECUREDROP_DATA_ROOT instead of `/tmp` because\n@@ -115,6 +117,8 @@\n config_from_local_file, \"SCRYPT_PARAMS\", dict(N=2**14, r=8, p=1)\n )\n \n+ env = getattr(config_from_local_file, \"env\", \"prod\")\n+\n try:\n final_securedrop_root = Path(config_from_local_file.SECUREDROP_ROOT)\n except AttributeError:\n@@ -188,6 +192,7 @@\n )\n \n return SecureDropConfig(\n+ env=env,\n JOURNALIST_APP_FLASK_CONFIG_CLS=parsed_journ_flask_config,\n SOURCE_APP_FLASK_CONFIG_CLS=parsed_source_flask_config,\n GPG_KEY_DIR=final_gpg_key_dir,\n", "issue": "development server no longer hot-reloads on file changes\n## Description\r\n\r\nAfter #6563, `make dev` no longer hot-reloads on file changes, because the immutable `SDConfig` doesn't carry the `env` attribute from `config.py`. (This regression was last encountered in #5594.)\r\n\r\n## Steps to Reproduce\r\n\r\nSame as #5594.\r\n\r\n## Expected Behavior\r\n\r\nSame as #5594.\r\n\r\n## Actual Behavior\r\n\r\nSame as #5594.\r\n\r\n## Comments\r\n\r\nThe fix in <https://github.com/freedomofpress/securedrop/issues/6669#issuecomment-1526129678> is trivial, but I want to take a moment to see if there's an easy way to test for this regression (fool me once, fool me twice, etc.).\n", "before_files": [{"content": "from dataclasses import dataclass\nfrom importlib import import_module\nfrom pathlib import Path\nfrom typing import Dict, List, Optional\n\nFALLBACK_LOCALE = \"en_US\"\n\nDEFAULT_SECUREDROP_ROOT = Path(__file__).absolute().parent\n\n\n@dataclass(frozen=True)\nclass _FlaskAppConfig:\n \"\"\"Config fields that are common to the Journalist and Source interfaces.\"\"\"\n\n SESSION_COOKIE_NAME: str\n SECRET_KEY: str\n\n DEBUG: bool\n TESTING: bool\n WTF_CSRF_ENABLED: bool\n\n # Use MAX_CONTENT_LENGTH to mimic the behavior of Apache's LimitRequestBody\n # in the development environment. See #1714.\n MAX_CONTENT_LENGTH: int\n\n # This is recommended for performance, and also resolves #369\n USE_X_SENDFILE: bool\n\n\n@dataclass(frozen=True)\nclass JournalistInterfaceConfig(_FlaskAppConfig):\n # Additional config for JI Redis sessions\n SESSION_SIGNER_SALT: str = \"js_session\"\n SESSION_KEY_PREFIX: str = \"js_session:\"\n SESSION_LIFETIME: int = 2 * 60 * 60\n SESSION_RENEW_COUNT: int = 5\n\n\n@dataclass(frozen=True)\nclass SourceInterfaceConfig(_FlaskAppConfig):\n pass\n\n\n@dataclass(frozen=True)\nclass SecureDropConfig:\n JOURNALIST_APP_FLASK_CONFIG_CLS: JournalistInterfaceConfig\n SOURCE_APP_FLASK_CONFIG_CLS: SourceInterfaceConfig\n\n GPG_KEY_DIR: Path\n JOURNALIST_KEY: str\n SCRYPT_GPG_PEPPER: str\n SCRYPT_ID_PEPPER: str\n SCRYPT_PARAMS: Dict[str, int]\n\n SECUREDROP_DATA_ROOT: Path\n\n DATABASE_FILE: Path # Path to the sqlite DB file\n\n SECUREDROP_ROOT: Path\n STATIC_DIR: Path\n TRANSLATION_DIRS: Path\n SOURCE_TEMPLATES_DIR: Path\n JOURNALIST_TEMPLATES_DIR: Path\n NOUNS: Path\n ADJECTIVES: Path\n\n DEFAULT_LOCALE: str\n SUPPORTED_LOCALES: List[str]\n\n SESSION_EXPIRATION_MINUTES: float\n\n RQ_WORKER_NAME: str\n\n @property\n def TEMP_DIR(self) -> Path:\n # We use a directory under the SECUREDROP_DATA_ROOT instead of `/tmp` because\n # we need to expose this directory via X-Send-File, and want to minimize the\n # potential for exposing unintended files.\n return self.SECUREDROP_DATA_ROOT / \"tmp\"\n\n @property\n def STORE_DIR(self) -> Path:\n return self.SECUREDROP_DATA_ROOT / \"store\"\n\n @property\n def DATABASE_URI(self) -> str:\n return f\"sqlite:///{self.DATABASE_FILE}\"\n\n @classmethod\n def get_current(cls) -> \"SecureDropConfig\":\n global _current_config\n if _current_config is None:\n # Retrieve the config by parsing it from ./config.py\n _current_config = _parse_config_from_file(config_module_name=\"config\")\n return _current_config\n\n\n_current_config: Optional[SecureDropConfig] = None\n\n\ndef _parse_config_from_file(config_module_name: str) -> SecureDropConfig:\n \"\"\"Parse the config from a config.py file.\"\"\"\n config_from_local_file = import_module(config_module_name)\n\n # Parse the local config; as there are SD instances with very old config files\n # the parsing logic here has to assume some values might be missing, and hence\n # set default values for such config entries\n final_default_locale = getattr(config_from_local_file, \"DEFAULT_LOCALE\", FALLBACK_LOCALE)\n final_supp_locales = getattr(config_from_local_file, \"SUPPORTED_LOCALES\", [FALLBACK_LOCALE])\n final_sess_expiration_mins = getattr(config_from_local_file, \"SESSION_EXPIRATION_MINUTES\", 120)\n\n final_worker_name = getattr(config_from_local_file, \"RQ_WORKER_NAME\", \"default\")\n\n final_scrypt_params = getattr(\n config_from_local_file, \"SCRYPT_PARAMS\", dict(N=2**14, r=8, p=1)\n )\n\n try:\n final_securedrop_root = Path(config_from_local_file.SECUREDROP_ROOT)\n except AttributeError:\n final_securedrop_root = DEFAULT_SECUREDROP_ROOT\n\n try:\n final_securedrop_data_root = Path(config_from_local_file.SECUREDROP_DATA_ROOT)\n except AttributeError:\n final_securedrop_data_root = Path(\"/var/lib/securedrop\")\n\n try:\n final_db_file = Path(config_from_local_file.DATABASE_FILE)\n except AttributeError:\n final_db_file = final_securedrop_data_root / \"db.sqlite\"\n\n try:\n final_gpg_key_dir = Path(config_from_local_file.GPG_KEY_DIR)\n except AttributeError:\n final_gpg_key_dir = final_securedrop_data_root / \"keys\"\n\n try:\n final_nouns = Path(config_from_local_file.NOUNS)\n except AttributeError:\n final_nouns = final_securedrop_root / \"dictionaries\" / \"nouns.txt\"\n\n try:\n final_adjectives = Path(config_from_local_file.ADJECTIVES)\n except AttributeError:\n final_adjectives = final_securedrop_root / \"dictionaries\" / \"adjectives.txt\"\n\n try:\n final_static_dir = Path(config_from_local_file.STATIC_DIR) # type: ignore\n except AttributeError:\n final_static_dir = final_securedrop_root / \"static\"\n\n try:\n final_transl_dir = Path(config_from_local_file.TRANSLATION_DIRS) # type: ignore\n except AttributeError:\n final_transl_dir = final_securedrop_root / \"translations\"\n\n try:\n final_source_tmpl_dir = Path(config_from_local_file.SOURCE_TEMPLATES_DIR)\n except AttributeError:\n final_source_tmpl_dir = final_securedrop_root / \"source_templates\"\n\n try:\n final_journ_tmpl_dir = Path(config_from_local_file.JOURNALIST_TEMPLATES_DIR)\n except AttributeError:\n final_journ_tmpl_dir = final_securedrop_root / \"journalist_templates\"\n\n # Parse the Flask configurations\n journ_flask_config = config_from_local_file.JournalistInterfaceFlaskConfig\n parsed_journ_flask_config = JournalistInterfaceConfig(\n SECRET_KEY=journ_flask_config.SECRET_KEY,\n SESSION_COOKIE_NAME=getattr(journ_flask_config, \"SESSION_COOKIE_NAME\", \"js\"),\n DEBUG=getattr(journ_flask_config, \"DEBUG\", False),\n TESTING=getattr(journ_flask_config, \"TESTING\", False),\n WTF_CSRF_ENABLED=getattr(journ_flask_config, \"WTF_CSRF_ENABLED\", True),\n MAX_CONTENT_LENGTH=getattr(journ_flask_config, \"MAX_CONTENT_LENGTH\", 524288000),\n USE_X_SENDFILE=getattr(journ_flask_config, \"USE_X_SENDFILE\", False),\n )\n source_flask_config = config_from_local_file.SourceInterfaceFlaskConfig\n parsed_source_flask_config = SourceInterfaceConfig(\n SECRET_KEY=source_flask_config.SECRET_KEY,\n SESSION_COOKIE_NAME=getattr(journ_flask_config, \"SESSION_COOKIE_NAME\", \"ss\"),\n DEBUG=getattr(journ_flask_config, \"DEBUG\", False),\n TESTING=getattr(journ_flask_config, \"TESTING\", False),\n WTF_CSRF_ENABLED=getattr(journ_flask_config, \"WTF_CSRF_ENABLED\", True),\n MAX_CONTENT_LENGTH=getattr(journ_flask_config, \"MAX_CONTENT_LENGTH\", 524288000),\n USE_X_SENDFILE=getattr(journ_flask_config, \"USE_X_SENDFILE\", False),\n )\n\n return SecureDropConfig(\n JOURNALIST_APP_FLASK_CONFIG_CLS=parsed_journ_flask_config,\n SOURCE_APP_FLASK_CONFIG_CLS=parsed_source_flask_config,\n GPG_KEY_DIR=final_gpg_key_dir,\n JOURNALIST_KEY=config_from_local_file.JOURNALIST_KEY,\n SCRYPT_GPG_PEPPER=config_from_local_file.SCRYPT_GPG_PEPPER,\n SCRYPT_ID_PEPPER=config_from_local_file.SCRYPT_ID_PEPPER,\n SCRYPT_PARAMS=final_scrypt_params,\n SECUREDROP_DATA_ROOT=final_securedrop_data_root,\n SECUREDROP_ROOT=final_securedrop_root,\n DATABASE_FILE=final_db_file,\n STATIC_DIR=final_static_dir,\n TRANSLATION_DIRS=final_transl_dir,\n SOURCE_TEMPLATES_DIR=final_source_tmpl_dir,\n JOURNALIST_TEMPLATES_DIR=final_journ_tmpl_dir,\n NOUNS=final_nouns,\n ADJECTIVES=final_adjectives,\n DEFAULT_LOCALE=final_default_locale,\n SUPPORTED_LOCALES=final_supp_locales,\n SESSION_EXPIRATION_MINUTES=final_sess_expiration_mins,\n RQ_WORKER_NAME=final_worker_name,\n )\n", "path": "securedrop/sdconfig.py"}], "after_files": [{"content": "from dataclasses import dataclass\nfrom importlib import import_module\nfrom pathlib import Path\nfrom typing import Dict, List, Optional\n\nFALLBACK_LOCALE = \"en_US\"\n\nDEFAULT_SECUREDROP_ROOT = Path(__file__).absolute().parent\n\n\n@dataclass(frozen=True)\nclass _FlaskAppConfig:\n \"\"\"Config fields that are common to the Journalist and Source interfaces.\"\"\"\n\n SESSION_COOKIE_NAME: str\n SECRET_KEY: str\n\n DEBUG: bool\n TESTING: bool\n WTF_CSRF_ENABLED: bool\n\n # Use MAX_CONTENT_LENGTH to mimic the behavior of Apache's LimitRequestBody\n # in the development environment. See #1714.\n MAX_CONTENT_LENGTH: int\n\n # This is recommended for performance, and also resolves #369\n USE_X_SENDFILE: bool\n\n\n@dataclass(frozen=True)\nclass JournalistInterfaceConfig(_FlaskAppConfig):\n # Additional config for JI Redis sessions\n SESSION_SIGNER_SALT: str = \"js_session\"\n SESSION_KEY_PREFIX: str = \"js_session:\"\n SESSION_LIFETIME: int = 2 * 60 * 60\n SESSION_RENEW_COUNT: int = 5\n\n\n@dataclass(frozen=True)\nclass SourceInterfaceConfig(_FlaskAppConfig):\n pass\n\n\n@dataclass(frozen=True)\nclass SecureDropConfig:\n JOURNALIST_APP_FLASK_CONFIG_CLS: JournalistInterfaceConfig\n SOURCE_APP_FLASK_CONFIG_CLS: SourceInterfaceConfig\n\n GPG_KEY_DIR: Path\n JOURNALIST_KEY: str\n SCRYPT_GPG_PEPPER: str\n SCRYPT_ID_PEPPER: str\n SCRYPT_PARAMS: Dict[str, int]\n\n SECUREDROP_DATA_ROOT: Path\n\n DATABASE_FILE: Path # Path to the sqlite DB file\n\n SECUREDROP_ROOT: Path\n STATIC_DIR: Path\n TRANSLATION_DIRS: Path\n SOURCE_TEMPLATES_DIR: Path\n JOURNALIST_TEMPLATES_DIR: Path\n NOUNS: Path\n ADJECTIVES: Path\n\n DEFAULT_LOCALE: str\n SUPPORTED_LOCALES: List[str]\n\n SESSION_EXPIRATION_MINUTES: float\n\n RQ_WORKER_NAME: str\n\n env: str = \"prod\"\n\n @property\n def TEMP_DIR(self) -> Path:\n # We use a directory under the SECUREDROP_DATA_ROOT instead of `/tmp` because\n # we need to expose this directory via X-Send-File, and want to minimize the\n # potential for exposing unintended files.\n return self.SECUREDROP_DATA_ROOT / \"tmp\"\n\n @property\n def STORE_DIR(self) -> Path:\n return self.SECUREDROP_DATA_ROOT / \"store\"\n\n @property\n def DATABASE_URI(self) -> str:\n return f\"sqlite:///{self.DATABASE_FILE}\"\n\n @classmethod\n def get_current(cls) -> \"SecureDropConfig\":\n global _current_config\n if _current_config is None:\n # Retrieve the config by parsing it from ./config.py\n _current_config = _parse_config_from_file(config_module_name=\"config\")\n return _current_config\n\n\n_current_config: Optional[SecureDropConfig] = None\n\n\ndef _parse_config_from_file(config_module_name: str) -> SecureDropConfig:\n \"\"\"Parse the config from a config.py file.\"\"\"\n config_from_local_file = import_module(config_module_name)\n\n # Parse the local config; as there are SD instances with very old config files\n # the parsing logic here has to assume some values might be missing, and hence\n # set default values for such config entries\n final_default_locale = getattr(config_from_local_file, \"DEFAULT_LOCALE\", FALLBACK_LOCALE)\n final_supp_locales = getattr(config_from_local_file, \"SUPPORTED_LOCALES\", [FALLBACK_LOCALE])\n final_sess_expiration_mins = getattr(config_from_local_file, \"SESSION_EXPIRATION_MINUTES\", 120)\n\n final_worker_name = getattr(config_from_local_file, \"RQ_WORKER_NAME\", \"default\")\n\n final_scrypt_params = getattr(\n config_from_local_file, \"SCRYPT_PARAMS\", dict(N=2**14, r=8, p=1)\n )\n\n env = getattr(config_from_local_file, \"env\", \"prod\")\n\n try:\n final_securedrop_root = Path(config_from_local_file.SECUREDROP_ROOT)\n except AttributeError:\n final_securedrop_root = DEFAULT_SECUREDROP_ROOT\n\n try:\n final_securedrop_data_root = Path(config_from_local_file.SECUREDROP_DATA_ROOT)\n except AttributeError:\n final_securedrop_data_root = Path(\"/var/lib/securedrop\")\n\n try:\n final_db_file = Path(config_from_local_file.DATABASE_FILE)\n except AttributeError:\n final_db_file = final_securedrop_data_root / \"db.sqlite\"\n\n try:\n final_gpg_key_dir = Path(config_from_local_file.GPG_KEY_DIR)\n except AttributeError:\n final_gpg_key_dir = final_securedrop_data_root / \"keys\"\n\n try:\n final_nouns = Path(config_from_local_file.NOUNS)\n except AttributeError:\n final_nouns = final_securedrop_root / \"dictionaries\" / \"nouns.txt\"\n\n try:\n final_adjectives = Path(config_from_local_file.ADJECTIVES)\n except AttributeError:\n final_adjectives = final_securedrop_root / \"dictionaries\" / \"adjectives.txt\"\n\n try:\n final_static_dir = Path(config_from_local_file.STATIC_DIR) # type: ignore\n except AttributeError:\n final_static_dir = final_securedrop_root / \"static\"\n\n try:\n final_transl_dir = Path(config_from_local_file.TRANSLATION_DIRS) # type: ignore\n except AttributeError:\n final_transl_dir = final_securedrop_root / \"translations\"\n\n try:\n final_source_tmpl_dir = Path(config_from_local_file.SOURCE_TEMPLATES_DIR)\n except AttributeError:\n final_source_tmpl_dir = final_securedrop_root / \"source_templates\"\n\n try:\n final_journ_tmpl_dir = Path(config_from_local_file.JOURNALIST_TEMPLATES_DIR)\n except AttributeError:\n final_journ_tmpl_dir = final_securedrop_root / \"journalist_templates\"\n\n # Parse the Flask configurations\n journ_flask_config = config_from_local_file.JournalistInterfaceFlaskConfig\n parsed_journ_flask_config = JournalistInterfaceConfig(\n SECRET_KEY=journ_flask_config.SECRET_KEY,\n SESSION_COOKIE_NAME=getattr(journ_flask_config, \"SESSION_COOKIE_NAME\", \"js\"),\n DEBUG=getattr(journ_flask_config, \"DEBUG\", False),\n TESTING=getattr(journ_flask_config, \"TESTING\", False),\n WTF_CSRF_ENABLED=getattr(journ_flask_config, \"WTF_CSRF_ENABLED\", True),\n MAX_CONTENT_LENGTH=getattr(journ_flask_config, \"MAX_CONTENT_LENGTH\", 524288000),\n USE_X_SENDFILE=getattr(journ_flask_config, \"USE_X_SENDFILE\", False),\n )\n source_flask_config = config_from_local_file.SourceInterfaceFlaskConfig\n parsed_source_flask_config = SourceInterfaceConfig(\n SECRET_KEY=source_flask_config.SECRET_KEY,\n SESSION_COOKIE_NAME=getattr(journ_flask_config, \"SESSION_COOKIE_NAME\", \"ss\"),\n DEBUG=getattr(journ_flask_config, \"DEBUG\", False),\n TESTING=getattr(journ_flask_config, \"TESTING\", False),\n WTF_CSRF_ENABLED=getattr(journ_flask_config, \"WTF_CSRF_ENABLED\", True),\n MAX_CONTENT_LENGTH=getattr(journ_flask_config, \"MAX_CONTENT_LENGTH\", 524288000),\n USE_X_SENDFILE=getattr(journ_flask_config, \"USE_X_SENDFILE\", False),\n )\n\n return SecureDropConfig(\n env=env,\n JOURNALIST_APP_FLASK_CONFIG_CLS=parsed_journ_flask_config,\n SOURCE_APP_FLASK_CONFIG_CLS=parsed_source_flask_config,\n GPG_KEY_DIR=final_gpg_key_dir,\n JOURNALIST_KEY=config_from_local_file.JOURNALIST_KEY,\n SCRYPT_GPG_PEPPER=config_from_local_file.SCRYPT_GPG_PEPPER,\n SCRYPT_ID_PEPPER=config_from_local_file.SCRYPT_ID_PEPPER,\n SCRYPT_PARAMS=final_scrypt_params,\n SECUREDROP_DATA_ROOT=final_securedrop_data_root,\n SECUREDROP_ROOT=final_securedrop_root,\n DATABASE_FILE=final_db_file,\n STATIC_DIR=final_static_dir,\n TRANSLATION_DIRS=final_transl_dir,\n SOURCE_TEMPLATES_DIR=final_source_tmpl_dir,\n JOURNALIST_TEMPLATES_DIR=final_journ_tmpl_dir,\n NOUNS=final_nouns,\n ADJECTIVES=final_adjectives,\n DEFAULT_LOCALE=final_default_locale,\n SUPPORTED_LOCALES=final_supp_locales,\n SESSION_EXPIRATION_MINUTES=final_sess_expiration_mins,\n RQ_WORKER_NAME=final_worker_name,\n )\n", "path": "securedrop/sdconfig.py"}]} | 2,932 | 265 |
gh_patches_debug_7282 | rasdani/github-patches | git_diff | frappe__frappe-21703 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Portal Sidebar not displaying in v14-beta.3
The portal sidebar does not display consistently on different kinds of routes. On some pages, the sidebar is not rendered, while on others it is. This has been tested on both existing sites and fresh installs.
/me (no sidebar)

/orders (single link on sidebar)

/orders/SPECIFIC-DOC (full sidebar)

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `frappe/website/page_renderers/template_page.py`
Content:
```
1 import os
2 from importlib.machinery import all_suffixes
3
4 import click
5
6 import frappe
7 from frappe.website.page_renderers.base_template_page import BaseTemplatePage
8 from frappe.website.router import get_base_template, get_page_info
9 from frappe.website.utils import (
10 cache_html,
11 extract_comment_tag,
12 extract_title,
13 get_frontmatter,
14 get_next_link,
15 get_sidebar_items,
16 get_toc,
17 is_binary_file,
18 )
19
20 PY_LOADER_SUFFIXES = tuple(all_suffixes())
21
22 WEBPAGE_PY_MODULE_PROPERTIES = (
23 "base_template_path",
24 "template",
25 "no_cache",
26 "sitemap",
27 "condition_field",
28 )
29
30 COMMENT_PROPERTY_KEY_VALUE_MAP = {
31 "no-breadcrumbs": ("no_breadcrumbs", 1),
32 "show-sidebar": ("show_sidebar", 1),
33 "add-breadcrumbs": ("add_breadcrumbs", 1),
34 "no-header": ("no_header", 1),
35 "add-next-prev-links": ("add_next_prev_links", 1),
36 "no-cache": ("no_cache", 1),
37 "no-sitemap": ("sitemap", 0),
38 "sitemap": ("sitemap", 1),
39 }
40
41
42 class TemplatePage(BaseTemplatePage):
43 def __init__(self, path, http_status_code=None):
44 super().__init__(path=path, http_status_code=http_status_code)
45 self.set_template_path()
46
47 def set_template_path(self):
48 """
49 Searches for file matching the path in the /www
50 and /templates/pages folders and sets path if match is found
51 """
52 folders = get_start_folders()
53 for app in reversed(frappe.get_installed_apps()):
54 app_path = frappe.get_app_path(app)
55
56 for dirname in folders:
57 search_path = os.path.join(app_path, dirname, self.path)
58 for file_path in self.get_index_path_options(search_path):
59 if os.path.isfile(file_path) and not is_binary_file(file_path):
60 self.app = app
61 self.app_path = app_path
62 self.file_dir = dirname
63 self.basename = os.path.splitext(file_path)[0]
64 self.template_path = os.path.relpath(file_path, self.app_path)
65 self.basepath = os.path.dirname(file_path)
66 self.filename = os.path.basename(file_path)
67 self.name = os.path.splitext(self.filename)[0]
68 return
69
70 def can_render(self):
71 return (
72 hasattr(self, "template_path")
73 and self.template_path
74 and not self.template_path.endswith(PY_LOADER_SUFFIXES)
75 )
76
77 @staticmethod
78 def get_index_path_options(search_path):
79 return (
80 frappe.as_unicode(f"{search_path}{d}") for d in ("", ".html", ".md", "/index.html", "/index.md")
81 )
82
83 def render(self):
84 html = self.get_html()
85 html = self.add_csrf_token(html)
86 return self.build_response(html)
87
88 @cache_html
89 def get_html(self):
90 # context object should be separate from self for security
91 # because it will be accessed via the user defined template
92 self.init_context()
93
94 self.set_pymodule()
95 self.update_context()
96 self.setup_template_source()
97 self.load_colocated_files()
98 self.set_properties_from_source()
99 self.post_process_context()
100
101 html = self.render_template()
102 html = self.update_toc(html)
103
104 return html
105
106 def post_process_context(self):
107 self.set_user_info()
108 self.add_sidebar_and_breadcrumbs()
109 super().post_process_context()
110
111 def add_sidebar_and_breadcrumbs(self):
112 if self.basepath:
113 self.context.sidebar_items = get_sidebar_items(self.context.website_sidebar, self.basepath)
114
115 if self.context.add_breadcrumbs and not self.context.parents:
116 parent_path = os.path.dirname(self.path)
117 if self.path.endswith("index"):
118 # in case of index page move one directory up for parent path
119 parent_path = os.path.dirname(parent_path)
120
121 for parent_file_path in self.get_index_path_options(parent_path):
122 parent_file_path = os.path.join(self.app_path, self.file_dir, parent_file_path)
123 if os.path.isfile(parent_file_path):
124 parent_page_context = get_page_info(parent_file_path, self.app, self.file_dir)
125 if parent_page_context:
126 self.context.parents = [
127 dict(route=os.path.dirname(self.path), title=parent_page_context.title)
128 ]
129 break
130
131 def set_pymodule(self):
132 """
133 A template may have a python module with a `get_context` method along with it in the
134 same folder. Also the hyphens will be coverted to underscore for python module names.
135 This method sets the pymodule_name if it exists.
136 """
137 template_basepath = os.path.splitext(self.template_path)[0]
138 self.pymodule_name = None
139
140 # replace - with _ in the internal modules names
141 self.pymodule_path = os.path.join(
142 os.path.dirname(template_basepath),
143 os.path.basename(template_basepath.replace("-", "_")) + ".py",
144 )
145
146 if os.path.exists(os.path.join(self.app_path, self.pymodule_path)):
147 self.pymodule_name = self.app + "." + self.pymodule_path.replace(os.path.sep, ".")[:-3]
148
149 def setup_template_source(self):
150 """Setup template source, frontmatter and markdown conversion"""
151 self.source = self.get_raw_template()
152 self.extract_frontmatter()
153 self.convert_from_markdown()
154
155 def update_context(self):
156 self.set_page_properties()
157 self.context.build_version = frappe.utils.get_build_version()
158
159 if self.pymodule_name:
160 self.pymodule = frappe.get_module(self.pymodule_name)
161 self.set_pymodule_properties()
162
163 data = self.run_pymodule_method("get_context")
164 # some methods may return a "context" object
165 if data:
166 self.context.update(data)
167 # TODO: self.context.children = self.run_pymodule_method('get_children')
168
169 self.context.developer_mode = frappe.conf.developer_mode
170 if self.context.http_status_code:
171 self.http_status_code = self.context.http_status_code
172
173 def set_pymodule_properties(self):
174 for prop in WEBPAGE_PY_MODULE_PROPERTIES:
175 if hasattr(self.pymodule, prop):
176 self.context[prop] = getattr(self.pymodule, prop)
177
178 def set_page_properties(self):
179 self.context.base_template = self.context.base_template or get_base_template(self.path)
180 self.context.basepath = self.basepath
181 self.context.basename = self.basename
182 self.context.name = self.name
183 self.context.path = self.path
184 self.context.route = self.path
185 self.context.template = self.template_path
186
187 def set_properties_from_source(self):
188 if not self.source:
189 return
190 context = self.context
191 if not context.title:
192 context.title = extract_title(self.source, self.path)
193
194 base_template = extract_comment_tag(self.source, "base_template")
195 if base_template:
196 context.base_template = base_template
197
198 if (
199 context.base_template
200 and "{%- extends" not in self.source
201 and "{% extends" not in self.source
202 and "</body>" not in self.source
203 ):
204 self.source = """{{% extends "{0}" %}}
205 {{% block page_content %}}{1}{{% endblock %}}""".format(
206 context.base_template, self.source
207 )
208
209 self.set_properties_via_comments()
210
211 def set_properties_via_comments(self):
212 for comment, (context_key, value) in COMMENT_PROPERTY_KEY_VALUE_MAP.items():
213 comment_tag = f"<!-- {comment} -->"
214 if comment_tag in self.source:
215 self.context[context_key] = value
216 click.echo(f"\n⚠️ DEPRECATION WARNING: {comment_tag} will be deprecated on 2021-12-31.")
217 click.echo(f"Please remove it from {self.template_path} in {self.app}")
218
219 def run_pymodule_method(self, method_name):
220 if hasattr(self.pymodule, method_name):
221 import inspect
222
223 method = getattr(self.pymodule, method_name)
224 if inspect.getfullargspec(method).args:
225 return method(self.context)
226 else:
227 return method()
228
229 def render_template(self):
230 if self.template_path.endswith("min.js"):
231 html = self.source # static
232 else:
233 if self.context.safe_render is not None:
234 safe_render = self.context.safe_render
235 else:
236 safe_render = True
237
238 html = frappe.render_template(self.source, self.context, safe_render=safe_render)
239
240 return html
241
242 def extends_template(self):
243 return self.template_path.endswith((".html", ".md")) and (
244 "{%- extends" in self.source or "{% extends" in self.source
245 )
246
247 def get_raw_template(self):
248 return frappe.get_jloader().get_source(frappe.get_jenv(), self.context.template)[0]
249
250 def load_colocated_files(self):
251 """load co-located css/js files with the same name"""
252 js_path = self.basename + ".js"
253 if os.path.exists(js_path) and "{% block script %}" not in self.source:
254 self.context.colocated_js = self.get_colocated_file(js_path)
255
256 css_path = self.basename + ".css"
257 if os.path.exists(css_path) and "{% block style %}" not in self.source:
258 self.context.colocated_css = self.get_colocated_file(css_path)
259
260 def get_colocated_file(self, path):
261 with open(path, encoding="utf-8") as f:
262 return f.read()
263
264 def extract_frontmatter(self):
265 if not self.template_path.endswith((".md", ".html")):
266 return
267
268 try:
269 # values will be used to update self
270 res = get_frontmatter(self.source)
271 if res["attributes"]:
272 self.context.update(res["attributes"])
273 self.source = res["body"]
274 except Exception:
275 pass
276
277 def convert_from_markdown(self):
278 if self.template_path.endswith(".md"):
279 self.source = frappe.utils.md_to_html(self.source)
280 self.context.page_toc_html = self.source.toc_html
281
282 if not self.context.show_sidebar:
283 self.source = '<div class="from-markdown">' + self.source + "</div>"
284
285 def update_toc(self, html):
286 if "{index}" in html:
287 html = html.replace("{index}", get_toc(self.path))
288
289 if "{next}" in html:
290 html = html.replace("{next}", get_next_link(self.path))
291
292 return html
293
294 def set_standard_path(self, path):
295 self.app = "frappe"
296 self.app_path = frappe.get_app_path("frappe")
297 self.path = path
298 self.template_path = f"www/{path}.html"
299
300 def set_missing_values(self):
301 super().set_missing_values()
302 # for backward compatibility
303 self.context.docs_base_url = "/docs"
304
305 def set_user_info(self):
306 from frappe.utils.user import get_fullname_and_avatar
307
308 info = get_fullname_and_avatar(frappe.session.user)
309 self.context["fullname"] = info.fullname
310 self.context["user_image"] = info.avatar
311 self.context["user"] = info.name
312
313
314 def get_start_folders():
315 return frappe.local.flags.web_pages_folders or ("www", "templates/pages")
316
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/frappe/website/page_renderers/template_page.py b/frappe/website/page_renderers/template_page.py
--- a/frappe/website/page_renderers/template_page.py
+++ b/frappe/website/page_renderers/template_page.py
@@ -109,8 +109,7 @@
super().post_process_context()
def add_sidebar_and_breadcrumbs(self):
- if self.basepath:
- self.context.sidebar_items = get_sidebar_items(self.context.website_sidebar, self.basepath)
+ self.context.sidebar_items = get_sidebar_items(self.context.website_sidebar, self.basepath)
if self.context.add_breadcrumbs and not self.context.parents:
parent_path = os.path.dirname(self.path)
| {"golden_diff": "diff --git a/frappe/website/page_renderers/template_page.py b/frappe/website/page_renderers/template_page.py\n--- a/frappe/website/page_renderers/template_page.py\n+++ b/frappe/website/page_renderers/template_page.py\n@@ -109,8 +109,7 @@\n \t\tsuper().post_process_context()\n \n \tdef add_sidebar_and_breadcrumbs(self):\n-\t\tif self.basepath:\n-\t\t\tself.context.sidebar_items = get_sidebar_items(self.context.website_sidebar, self.basepath)\n+\t\tself.context.sidebar_items = get_sidebar_items(self.context.website_sidebar, self.basepath)\n \n \t\tif self.context.add_breadcrumbs and not self.context.parents:\n \t\t\tparent_path = os.path.dirname(self.path)\n", "issue": "Portal Sidebar not displaying in v14-beta.3\nThe portal sidebar does not display consistently on different kinds of routes. On some pages, the sidebar is not rendered, while on others it is. This has been tested on both existing sites and fresh installs.\r\n\r\n/me (no sidebar)\r\n\r\n\r\n\r\n/orders (single link on sidebar)\r\n\r\n\r\n\r\n/orders/SPECIFIC-DOC (full sidebar)\r\n\r\n\r\n\n", "before_files": [{"content": "import os\nfrom importlib.machinery import all_suffixes\n\nimport click\n\nimport frappe\nfrom frappe.website.page_renderers.base_template_page import BaseTemplatePage\nfrom frappe.website.router import get_base_template, get_page_info\nfrom frappe.website.utils import (\n\tcache_html,\n\textract_comment_tag,\n\textract_title,\n\tget_frontmatter,\n\tget_next_link,\n\tget_sidebar_items,\n\tget_toc,\n\tis_binary_file,\n)\n\nPY_LOADER_SUFFIXES = tuple(all_suffixes())\n\nWEBPAGE_PY_MODULE_PROPERTIES = (\n\t\"base_template_path\",\n\t\"template\",\n\t\"no_cache\",\n\t\"sitemap\",\n\t\"condition_field\",\n)\n\nCOMMENT_PROPERTY_KEY_VALUE_MAP = {\n\t\"no-breadcrumbs\": (\"no_breadcrumbs\", 1),\n\t\"show-sidebar\": (\"show_sidebar\", 1),\n\t\"add-breadcrumbs\": (\"add_breadcrumbs\", 1),\n\t\"no-header\": (\"no_header\", 1),\n\t\"add-next-prev-links\": (\"add_next_prev_links\", 1),\n\t\"no-cache\": (\"no_cache\", 1),\n\t\"no-sitemap\": (\"sitemap\", 0),\n\t\"sitemap\": (\"sitemap\", 1),\n}\n\n\nclass TemplatePage(BaseTemplatePage):\n\tdef __init__(self, path, http_status_code=None):\n\t\tsuper().__init__(path=path, http_status_code=http_status_code)\n\t\tself.set_template_path()\n\n\tdef set_template_path(self):\n\t\t\"\"\"\n\t\tSearches for file matching the path in the /www\n\t\tand /templates/pages folders and sets path if match is found\n\t\t\"\"\"\n\t\tfolders = get_start_folders()\n\t\tfor app in reversed(frappe.get_installed_apps()):\n\t\t\tapp_path = frappe.get_app_path(app)\n\n\t\t\tfor dirname in folders:\n\t\t\t\tsearch_path = os.path.join(app_path, dirname, self.path)\n\t\t\t\tfor file_path in self.get_index_path_options(search_path):\n\t\t\t\t\tif os.path.isfile(file_path) and not is_binary_file(file_path):\n\t\t\t\t\t\tself.app = app\n\t\t\t\t\t\tself.app_path = app_path\n\t\t\t\t\t\tself.file_dir = dirname\n\t\t\t\t\t\tself.basename = os.path.splitext(file_path)[0]\n\t\t\t\t\t\tself.template_path = os.path.relpath(file_path, self.app_path)\n\t\t\t\t\t\tself.basepath = os.path.dirname(file_path)\n\t\t\t\t\t\tself.filename = os.path.basename(file_path)\n\t\t\t\t\t\tself.name = os.path.splitext(self.filename)[0]\n\t\t\t\t\t\treturn\n\n\tdef can_render(self):\n\t\treturn (\n\t\t\thasattr(self, \"template_path\")\n\t\t\tand self.template_path\n\t\t\tand not self.template_path.endswith(PY_LOADER_SUFFIXES)\n\t\t)\n\n\t@staticmethod\n\tdef get_index_path_options(search_path):\n\t\treturn (\n\t\t\tfrappe.as_unicode(f\"{search_path}{d}\") for d in (\"\", \".html\", \".md\", \"/index.html\", \"/index.md\")\n\t\t)\n\n\tdef render(self):\n\t\thtml = self.get_html()\n\t\thtml = self.add_csrf_token(html)\n\t\treturn self.build_response(html)\n\n\t@cache_html\n\tdef get_html(self):\n\t\t# context object should be separate from self for security\n\t\t# because it will be accessed via the user defined template\n\t\tself.init_context()\n\n\t\tself.set_pymodule()\n\t\tself.update_context()\n\t\tself.setup_template_source()\n\t\tself.load_colocated_files()\n\t\tself.set_properties_from_source()\n\t\tself.post_process_context()\n\n\t\thtml = self.render_template()\n\t\thtml = self.update_toc(html)\n\n\t\treturn html\n\n\tdef post_process_context(self):\n\t\tself.set_user_info()\n\t\tself.add_sidebar_and_breadcrumbs()\n\t\tsuper().post_process_context()\n\n\tdef add_sidebar_and_breadcrumbs(self):\n\t\tif self.basepath:\n\t\t\tself.context.sidebar_items = get_sidebar_items(self.context.website_sidebar, self.basepath)\n\n\t\tif self.context.add_breadcrumbs and not self.context.parents:\n\t\t\tparent_path = os.path.dirname(self.path)\n\t\t\tif self.path.endswith(\"index\"):\n\t\t\t\t# in case of index page move one directory up for parent path\n\t\t\t\tparent_path = os.path.dirname(parent_path)\n\n\t\t\tfor parent_file_path in self.get_index_path_options(parent_path):\n\t\t\t\tparent_file_path = os.path.join(self.app_path, self.file_dir, parent_file_path)\n\t\t\t\tif os.path.isfile(parent_file_path):\n\t\t\t\t\tparent_page_context = get_page_info(parent_file_path, self.app, self.file_dir)\n\t\t\t\t\tif parent_page_context:\n\t\t\t\t\t\tself.context.parents = [\n\t\t\t\t\t\t\tdict(route=os.path.dirname(self.path), title=parent_page_context.title)\n\t\t\t\t\t\t]\n\t\t\t\t\tbreak\n\n\tdef set_pymodule(self):\n\t\t\"\"\"\n\t\tA template may have a python module with a `get_context` method along with it in the\n\t\tsame folder. Also the hyphens will be coverted to underscore for python module names.\n\t\tThis method sets the pymodule_name if it exists.\n\t\t\"\"\"\n\t\ttemplate_basepath = os.path.splitext(self.template_path)[0]\n\t\tself.pymodule_name = None\n\n\t\t# replace - with _ in the internal modules names\n\t\tself.pymodule_path = os.path.join(\n\t\t\tos.path.dirname(template_basepath),\n\t\t\tos.path.basename(template_basepath.replace(\"-\", \"_\")) + \".py\",\n\t\t)\n\n\t\tif os.path.exists(os.path.join(self.app_path, self.pymodule_path)):\n\t\t\tself.pymodule_name = self.app + \".\" + self.pymodule_path.replace(os.path.sep, \".\")[:-3]\n\n\tdef setup_template_source(self):\n\t\t\"\"\"Setup template source, frontmatter and markdown conversion\"\"\"\n\t\tself.source = self.get_raw_template()\n\t\tself.extract_frontmatter()\n\t\tself.convert_from_markdown()\n\n\tdef update_context(self):\n\t\tself.set_page_properties()\n\t\tself.context.build_version = frappe.utils.get_build_version()\n\n\t\tif self.pymodule_name:\n\t\t\tself.pymodule = frappe.get_module(self.pymodule_name)\n\t\t\tself.set_pymodule_properties()\n\n\t\t\tdata = self.run_pymodule_method(\"get_context\")\n\t\t\t# some methods may return a \"context\" object\n\t\t\tif data:\n\t\t\t\tself.context.update(data)\n\t\t\t# TODO: self.context.children = self.run_pymodule_method('get_children')\n\n\t\tself.context.developer_mode = frappe.conf.developer_mode\n\t\tif self.context.http_status_code:\n\t\t\tself.http_status_code = self.context.http_status_code\n\n\tdef set_pymodule_properties(self):\n\t\tfor prop in WEBPAGE_PY_MODULE_PROPERTIES:\n\t\t\tif hasattr(self.pymodule, prop):\n\t\t\t\tself.context[prop] = getattr(self.pymodule, prop)\n\n\tdef set_page_properties(self):\n\t\tself.context.base_template = self.context.base_template or get_base_template(self.path)\n\t\tself.context.basepath = self.basepath\n\t\tself.context.basename = self.basename\n\t\tself.context.name = self.name\n\t\tself.context.path = self.path\n\t\tself.context.route = self.path\n\t\tself.context.template = self.template_path\n\n\tdef set_properties_from_source(self):\n\t\tif not self.source:\n\t\t\treturn\n\t\tcontext = self.context\n\t\tif not context.title:\n\t\t\tcontext.title = extract_title(self.source, self.path)\n\n\t\tbase_template = extract_comment_tag(self.source, \"base_template\")\n\t\tif base_template:\n\t\t\tcontext.base_template = base_template\n\n\t\tif (\n\t\t\tcontext.base_template\n\t\t\tand \"{%- extends\" not in self.source\n\t\t\tand \"{% extends\" not in self.source\n\t\t\tand \"</body>\" not in self.source\n\t\t):\n\t\t\tself.source = \"\"\"{{% extends \"{0}\" %}}\n\t\t\t\t{{% block page_content %}}{1}{{% endblock %}}\"\"\".format(\n\t\t\t\tcontext.base_template, self.source\n\t\t\t)\n\n\t\tself.set_properties_via_comments()\n\n\tdef set_properties_via_comments(self):\n\t\tfor comment, (context_key, value) in COMMENT_PROPERTY_KEY_VALUE_MAP.items():\n\t\t\tcomment_tag = f\"<!-- {comment} -->\"\n\t\t\tif comment_tag in self.source:\n\t\t\t\tself.context[context_key] = value\n\t\t\t\tclick.echo(f\"\\n\u26a0\ufe0f DEPRECATION WARNING: {comment_tag} will be deprecated on 2021-12-31.\")\n\t\t\t\tclick.echo(f\"Please remove it from {self.template_path} in {self.app}\")\n\n\tdef run_pymodule_method(self, method_name):\n\t\tif hasattr(self.pymodule, method_name):\n\t\t\timport inspect\n\n\t\t\tmethod = getattr(self.pymodule, method_name)\n\t\t\tif inspect.getfullargspec(method).args:\n\t\t\t\treturn method(self.context)\n\t\t\telse:\n\t\t\t\treturn method()\n\n\tdef render_template(self):\n\t\tif self.template_path.endswith(\"min.js\"):\n\t\t\thtml = self.source # static\n\t\telse:\n\t\t\tif self.context.safe_render is not None:\n\t\t\t\tsafe_render = self.context.safe_render\n\t\t\telse:\n\t\t\t\tsafe_render = True\n\n\t\t\thtml = frappe.render_template(self.source, self.context, safe_render=safe_render)\n\n\t\treturn html\n\n\tdef extends_template(self):\n\t\treturn self.template_path.endswith((\".html\", \".md\")) and (\n\t\t\t\"{%- extends\" in self.source or \"{% extends\" in self.source\n\t\t)\n\n\tdef get_raw_template(self):\n\t\treturn frappe.get_jloader().get_source(frappe.get_jenv(), self.context.template)[0]\n\n\tdef load_colocated_files(self):\n\t\t\"\"\"load co-located css/js files with the same name\"\"\"\n\t\tjs_path = self.basename + \".js\"\n\t\tif os.path.exists(js_path) and \"{% block script %}\" not in self.source:\n\t\t\tself.context.colocated_js = self.get_colocated_file(js_path)\n\n\t\tcss_path = self.basename + \".css\"\n\t\tif os.path.exists(css_path) and \"{% block style %}\" not in self.source:\n\t\t\tself.context.colocated_css = self.get_colocated_file(css_path)\n\n\tdef get_colocated_file(self, path):\n\t\twith open(path, encoding=\"utf-8\") as f:\n\t\t\treturn f.read()\n\n\tdef extract_frontmatter(self):\n\t\tif not self.template_path.endswith((\".md\", \".html\")):\n\t\t\treturn\n\n\t\ttry:\n\t\t\t# values will be used to update self\n\t\t\tres = get_frontmatter(self.source)\n\t\t\tif res[\"attributes\"]:\n\t\t\t\tself.context.update(res[\"attributes\"])\n\t\t\t\tself.source = res[\"body\"]\n\t\texcept Exception:\n\t\t\tpass\n\n\tdef convert_from_markdown(self):\n\t\tif self.template_path.endswith(\".md\"):\n\t\t\tself.source = frappe.utils.md_to_html(self.source)\n\t\t\tself.context.page_toc_html = self.source.toc_html\n\n\t\t\tif not self.context.show_sidebar:\n\t\t\t\tself.source = '<div class=\"from-markdown\">' + self.source + \"</div>\"\n\n\tdef update_toc(self, html):\n\t\tif \"{index}\" in html:\n\t\t\thtml = html.replace(\"{index}\", get_toc(self.path))\n\n\t\tif \"{next}\" in html:\n\t\t\thtml = html.replace(\"{next}\", get_next_link(self.path))\n\n\t\treturn html\n\n\tdef set_standard_path(self, path):\n\t\tself.app = \"frappe\"\n\t\tself.app_path = frappe.get_app_path(\"frappe\")\n\t\tself.path = path\n\t\tself.template_path = f\"www/{path}.html\"\n\n\tdef set_missing_values(self):\n\t\tsuper().set_missing_values()\n\t\t# for backward compatibility\n\t\tself.context.docs_base_url = \"/docs\"\n\n\tdef set_user_info(self):\n\t\tfrom frappe.utils.user import get_fullname_and_avatar\n\n\t\tinfo = get_fullname_and_avatar(frappe.session.user)\n\t\tself.context[\"fullname\"] = info.fullname\n\t\tself.context[\"user_image\"] = info.avatar\n\t\tself.context[\"user\"] = info.name\n\n\ndef get_start_folders():\n\treturn frappe.local.flags.web_pages_folders or (\"www\", \"templates/pages\")\n", "path": "frappe/website/page_renderers/template_page.py"}], "after_files": [{"content": "import os\nfrom importlib.machinery import all_suffixes\n\nimport click\n\nimport frappe\nfrom frappe.website.page_renderers.base_template_page import BaseTemplatePage\nfrom frappe.website.router import get_base_template, get_page_info\nfrom frappe.website.utils import (\n\tcache_html,\n\textract_comment_tag,\n\textract_title,\n\tget_frontmatter,\n\tget_next_link,\n\tget_sidebar_items,\n\tget_toc,\n\tis_binary_file,\n)\n\nPY_LOADER_SUFFIXES = tuple(all_suffixes())\n\nWEBPAGE_PY_MODULE_PROPERTIES = (\n\t\"base_template_path\",\n\t\"template\",\n\t\"no_cache\",\n\t\"sitemap\",\n\t\"condition_field\",\n)\n\nCOMMENT_PROPERTY_KEY_VALUE_MAP = {\n\t\"no-breadcrumbs\": (\"no_breadcrumbs\", 1),\n\t\"show-sidebar\": (\"show_sidebar\", 1),\n\t\"add-breadcrumbs\": (\"add_breadcrumbs\", 1),\n\t\"no-header\": (\"no_header\", 1),\n\t\"add-next-prev-links\": (\"add_next_prev_links\", 1),\n\t\"no-cache\": (\"no_cache\", 1),\n\t\"no-sitemap\": (\"sitemap\", 0),\n\t\"sitemap\": (\"sitemap\", 1),\n}\n\n\nclass TemplatePage(BaseTemplatePage):\n\tdef __init__(self, path, http_status_code=None):\n\t\tsuper().__init__(path=path, http_status_code=http_status_code)\n\t\tself.set_template_path()\n\n\tdef set_template_path(self):\n\t\t\"\"\"\n\t\tSearches for file matching the path in the /www\n\t\tand /templates/pages folders and sets path if match is found\n\t\t\"\"\"\n\t\tfolders = get_start_folders()\n\t\tfor app in reversed(frappe.get_installed_apps()):\n\t\t\tapp_path = frappe.get_app_path(app)\n\n\t\t\tfor dirname in folders:\n\t\t\t\tsearch_path = os.path.join(app_path, dirname, self.path)\n\t\t\t\tfor file_path in self.get_index_path_options(search_path):\n\t\t\t\t\tif os.path.isfile(file_path) and not is_binary_file(file_path):\n\t\t\t\t\t\tself.app = app\n\t\t\t\t\t\tself.app_path = app_path\n\t\t\t\t\t\tself.file_dir = dirname\n\t\t\t\t\t\tself.basename = os.path.splitext(file_path)[0]\n\t\t\t\t\t\tself.template_path = os.path.relpath(file_path, self.app_path)\n\t\t\t\t\t\tself.basepath = os.path.dirname(file_path)\n\t\t\t\t\t\tself.filename = os.path.basename(file_path)\n\t\t\t\t\t\tself.name = os.path.splitext(self.filename)[0]\n\t\t\t\t\t\treturn\n\n\tdef can_render(self):\n\t\treturn (\n\t\t\thasattr(self, \"template_path\")\n\t\t\tand self.template_path\n\t\t\tand not self.template_path.endswith(PY_LOADER_SUFFIXES)\n\t\t)\n\n\t@staticmethod\n\tdef get_index_path_options(search_path):\n\t\treturn (\n\t\t\tfrappe.as_unicode(f\"{search_path}{d}\") for d in (\"\", \".html\", \".md\", \"/index.html\", \"/index.md\")\n\t\t)\n\n\tdef render(self):\n\t\thtml = self.get_html()\n\t\thtml = self.add_csrf_token(html)\n\t\treturn self.build_response(html)\n\n\t@cache_html\n\tdef get_html(self):\n\t\t# context object should be separate from self for security\n\t\t# because it will be accessed via the user defined template\n\t\tself.init_context()\n\n\t\tself.set_pymodule()\n\t\tself.update_context()\n\t\tself.setup_template_source()\n\t\tself.load_colocated_files()\n\t\tself.set_properties_from_source()\n\t\tself.post_process_context()\n\n\t\thtml = self.render_template()\n\t\thtml = self.update_toc(html)\n\n\t\treturn html\n\n\tdef post_process_context(self):\n\t\tself.set_user_info()\n\t\tself.add_sidebar_and_breadcrumbs()\n\t\tsuper().post_process_context()\n\n\tdef add_sidebar_and_breadcrumbs(self):\n\t\tself.context.sidebar_items = get_sidebar_items(self.context.website_sidebar, self.basepath)\n\n\t\tif self.context.add_breadcrumbs and not self.context.parents:\n\t\t\tparent_path = os.path.dirname(self.path)\n\t\t\tif self.path.endswith(\"index\"):\n\t\t\t\t# in case of index page move one directory up for parent path\n\t\t\t\tparent_path = os.path.dirname(parent_path)\n\n\t\t\tfor parent_file_path in self.get_index_path_options(parent_path):\n\t\t\t\tparent_file_path = os.path.join(self.app_path, self.file_dir, parent_file_path)\n\t\t\t\tif os.path.isfile(parent_file_path):\n\t\t\t\t\tparent_page_context = get_page_info(parent_file_path, self.app, self.file_dir)\n\t\t\t\t\tif parent_page_context:\n\t\t\t\t\t\tself.context.parents = [\n\t\t\t\t\t\t\tdict(route=os.path.dirname(self.path), title=parent_page_context.title)\n\t\t\t\t\t\t]\n\t\t\t\t\tbreak\n\n\tdef set_pymodule(self):\n\t\t\"\"\"\n\t\tA template may have a python module with a `get_context` method along with it in the\n\t\tsame folder. Also the hyphens will be coverted to underscore for python module names.\n\t\tThis method sets the pymodule_name if it exists.\n\t\t\"\"\"\n\t\ttemplate_basepath = os.path.splitext(self.template_path)[0]\n\t\tself.pymodule_name = None\n\n\t\t# replace - with _ in the internal modules names\n\t\tself.pymodule_path = os.path.join(\n\t\t\tos.path.dirname(template_basepath),\n\t\t\tos.path.basename(template_basepath.replace(\"-\", \"_\")) + \".py\",\n\t\t)\n\n\t\tif os.path.exists(os.path.join(self.app_path, self.pymodule_path)):\n\t\t\tself.pymodule_name = self.app + \".\" + self.pymodule_path.replace(os.path.sep, \".\")[:-3]\n\n\tdef setup_template_source(self):\n\t\t\"\"\"Setup template source, frontmatter and markdown conversion\"\"\"\n\t\tself.source = self.get_raw_template()\n\t\tself.extract_frontmatter()\n\t\tself.convert_from_markdown()\n\n\tdef update_context(self):\n\t\tself.set_page_properties()\n\t\tself.context.build_version = frappe.utils.get_build_version()\n\n\t\tif self.pymodule_name:\n\t\t\tself.pymodule = frappe.get_module(self.pymodule_name)\n\t\t\tself.set_pymodule_properties()\n\n\t\t\tdata = self.run_pymodule_method(\"get_context\")\n\t\t\t# some methods may return a \"context\" object\n\t\t\tif data:\n\t\t\t\tself.context.update(data)\n\t\t\t# TODO: self.context.children = self.run_pymodule_method('get_children')\n\n\t\tself.context.developer_mode = frappe.conf.developer_mode\n\t\tif self.context.http_status_code:\n\t\t\tself.http_status_code = self.context.http_status_code\n\n\tdef set_pymodule_properties(self):\n\t\tfor prop in WEBPAGE_PY_MODULE_PROPERTIES:\n\t\t\tif hasattr(self.pymodule, prop):\n\t\t\t\tself.context[prop] = getattr(self.pymodule, prop)\n\n\tdef set_page_properties(self):\n\t\tself.context.base_template = self.context.base_template or get_base_template(self.path)\n\t\tself.context.basepath = self.basepath\n\t\tself.context.basename = self.basename\n\t\tself.context.name = self.name\n\t\tself.context.path = self.path\n\t\tself.context.route = self.path\n\t\tself.context.template = self.template_path\n\n\tdef set_properties_from_source(self):\n\t\tif not self.source:\n\t\t\treturn\n\t\tcontext = self.context\n\t\tif not context.title:\n\t\t\tcontext.title = extract_title(self.source, self.path)\n\n\t\tbase_template = extract_comment_tag(self.source, \"base_template\")\n\t\tif base_template:\n\t\t\tcontext.base_template = base_template\n\n\t\tif (\n\t\t\tcontext.base_template\n\t\t\tand \"{%- extends\" not in self.source\n\t\t\tand \"{% extends\" not in self.source\n\t\t\tand \"</body>\" not in self.source\n\t\t):\n\t\t\tself.source = \"\"\"{{% extends \"{0}\" %}}\n\t\t\t\t{{% block page_content %}}{1}{{% endblock %}}\"\"\".format(\n\t\t\t\tcontext.base_template, self.source\n\t\t\t)\n\n\t\tself.set_properties_via_comments()\n\n\tdef set_properties_via_comments(self):\n\t\tfor comment, (context_key, value) in COMMENT_PROPERTY_KEY_VALUE_MAP.items():\n\t\t\tcomment_tag = f\"<!-- {comment} -->\"\n\t\t\tif comment_tag in self.source:\n\t\t\t\tself.context[context_key] = value\n\t\t\t\tclick.echo(f\"\\n\u26a0\ufe0f DEPRECATION WARNING: {comment_tag} will be deprecated on 2021-12-31.\")\n\t\t\t\tclick.echo(f\"Please remove it from {self.template_path} in {self.app}\")\n\n\tdef run_pymodule_method(self, method_name):\n\t\tif hasattr(self.pymodule, method_name):\n\t\t\timport inspect\n\n\t\t\tmethod = getattr(self.pymodule, method_name)\n\t\t\tif inspect.getfullargspec(method).args:\n\t\t\t\treturn method(self.context)\n\t\t\telse:\n\t\t\t\treturn method()\n\n\tdef render_template(self):\n\t\tif self.template_path.endswith(\"min.js\"):\n\t\t\thtml = self.source # static\n\t\telse:\n\t\t\tif self.context.safe_render is not None:\n\t\t\t\tsafe_render = self.context.safe_render\n\t\t\telse:\n\t\t\t\tsafe_render = True\n\n\t\t\thtml = frappe.render_template(self.source, self.context, safe_render=safe_render)\n\n\t\treturn html\n\n\tdef extends_template(self):\n\t\treturn self.template_path.endswith((\".html\", \".md\")) and (\n\t\t\t\"{%- extends\" in self.source or \"{% extends\" in self.source\n\t\t)\n\n\tdef get_raw_template(self):\n\t\treturn frappe.get_jloader().get_source(frappe.get_jenv(), self.context.template)[0]\n\n\tdef load_colocated_files(self):\n\t\t\"\"\"load co-located css/js files with the same name\"\"\"\n\t\tjs_path = self.basename + \".js\"\n\t\tif os.path.exists(js_path) and \"{% block script %}\" not in self.source:\n\t\t\tself.context.colocated_js = self.get_colocated_file(js_path)\n\n\t\tcss_path = self.basename + \".css\"\n\t\tif os.path.exists(css_path) and \"{% block style %}\" not in self.source:\n\t\t\tself.context.colocated_css = self.get_colocated_file(css_path)\n\n\tdef get_colocated_file(self, path):\n\t\twith open(path, encoding=\"utf-8\") as f:\n\t\t\treturn f.read()\n\n\tdef extract_frontmatter(self):\n\t\tif not self.template_path.endswith((\".md\", \".html\")):\n\t\t\treturn\n\n\t\ttry:\n\t\t\t# values will be used to update self\n\t\t\tres = get_frontmatter(self.source)\n\t\t\tif res[\"attributes\"]:\n\t\t\t\tself.context.update(res[\"attributes\"])\n\t\t\t\tself.source = res[\"body\"]\n\t\texcept Exception:\n\t\t\tpass\n\n\tdef convert_from_markdown(self):\n\t\tif self.template_path.endswith(\".md\"):\n\t\t\tself.source = frappe.utils.md_to_html(self.source)\n\t\t\tself.context.page_toc_html = self.source.toc_html\n\n\t\t\tif not self.context.show_sidebar:\n\t\t\t\tself.source = '<div class=\"from-markdown\">' + self.source + \"</div>\"\n\n\tdef update_toc(self, html):\n\t\tif \"{index}\" in html:\n\t\t\thtml = html.replace(\"{index}\", get_toc(self.path))\n\n\t\tif \"{next}\" in html:\n\t\t\thtml = html.replace(\"{next}\", get_next_link(self.path))\n\n\t\treturn html\n\n\tdef set_standard_path(self, path):\n\t\tself.app = \"frappe\"\n\t\tself.app_path = frappe.get_app_path(\"frappe\")\n\t\tself.path = path\n\t\tself.template_path = f\"www/{path}.html\"\n\n\tdef set_missing_values(self):\n\t\tsuper().set_missing_values()\n\t\t# for backward compatibility\n\t\tself.context.docs_base_url = \"/docs\"\n\n\tdef set_user_info(self):\n\t\tfrom frappe.utils.user import get_fullname_and_avatar\n\n\t\tinfo = get_fullname_and_avatar(frappe.session.user)\n\t\tself.context[\"fullname\"] = info.fullname\n\t\tself.context[\"user_image\"] = info.avatar\n\t\tself.context[\"user\"] = info.name\n\n\ndef get_start_folders():\n\treturn frappe.local.flags.web_pages_folders or (\"www\", \"templates/pages\")\n", "path": "frappe/website/page_renderers/template_page.py"}]} | 3,929 | 152 |
gh_patches_debug_30135 | rasdani/github-patches | git_diff | NVIDIA__NVFlare-645 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove external 'zip' dependency from DistributionBuilder (zip_password=True)
Hi there, first off I want to acknowledge your efforts towards making an incredible tool, thank you!
The issue: I faced the exception *FileNotFoundError: [Errno 2] No such file or directory: 'zip'* when trying to specify `zip_password: true` for the DistributionBuilder args while running `provision -p project.yml`, and tracked the cause to `zip` not being installed on the local machine, which was a problem since installing additional software is not allowed on that machine.
## Replicate
```bash
# Launch a docker container with image python:3.8 running interactive bash
docker run -it --rm python:3.8 bash
# === Within docker container ===
pip install nvflare
provision # create example project.yml
# Set zip_password to true
sed -i 's/zip_password: false/zip_password: true/' project.yml
provision -p project.yml
```
We will find the following output:
```
Project yaml file: /project.yml.
Exception raised during provision. Incomplete prod_n folder removed.
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/nvflare/lighter/spec.py", line 166, in provision
b.build(study, ctx)
File "/usr/local/lib/python3.8/site-packages/nvflare/lighter/impl/workspace.py", line 109, in build
subprocess.run(run_args)
File "/usr/local/lib/python3.8/subprocess.py", line 493, in run
with Popen(*popenargs, **kwargs) as process:
File "/usr/local/lib/python3.8/subprocess.py", line 858, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "/usr/local/lib/python3.8/subprocess.py", line 1704, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'zip'
```
Whereas it'll work if we install zip
```bash
# === Within docker container ===
apt-get update && apt-get install -y zip
provision -p project.yml
```
## Fix
The build function of the DistributionBuilder uses `shutil` to make the non-password protected zip but relies on `subprocess.run(['zip', ...])` in order to create password-protected files, since shutil doesn't support passwords.
I'd propose adding a package requirement of something like `pyminizip` or `pyzipper` to handle password protection.
https://github.com/NVIDIA/NVFlare/blob/c8d51bfb534c02eff874a760c47086c87cbff59f/nvflare/lighter/impl/workspace.py#L100-L113
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nvflare/lighter/impl/workspace.py`
Content:
```
1 # Copyright (c) 2021-2022, NVIDIA CORPORATION. All rights reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import os
16 import pathlib
17 import shutil
18 import subprocess
19
20 from nvflare.lighter.spec import Builder, Project
21 from nvflare.lighter.utils import generate_password
22
23
24 class WorkspaceBuilder(Builder):
25 def __init__(self, template_file):
26 """Manages the folder structure for provisioned projects.
27
28 Sets the template_file containing scripts and configs to put into startup folders, creates directories for the
29 participants, and moves the provisioned project to the final location at the end
30 ($WORKSPACE/$PROJECT_NAME/prod_XX). WorkspaceBuilder manages and sets the number in prod_XX by incrementing from
31 the last time provision was run for this project in this workspace, starting with 00 to a max of 99.
32
33 Each time the provisioning tool runs, it requires a workspace folder in the local file system. The workspace
34 will have the following folder structure:
35
36 .. code-block:: text
37
38 $WORKSPACE/ <--- this is assigned by -w option of provision command (default is workspace)
39 $PROJECT_NAME/ <--- this is the name value in the project.yml file
40 prod_00/ <--- a new prod_NN folder is created if provision does not have any errors.
41 prod_01/
42 ...
43 resources/ <--- this folder stores resources for other builders to load
44 state/ <--- this folder stores persistent information (such as certificates) so subsequent runs of the provision command can load the state back.
45 wip/ <--- this is only used during runtime, and will be removed when the provision command exits
46
47 Args:
48 template_file: name of template file containing scripts and configs to put into startup folders
49 """
50 self.template_file = template_file
51
52 def _make_dir(self, dirs):
53 for dir in dirs:
54 if not os.path.exists(dir):
55 os.makedirs(dir)
56
57 def initialize(self, ctx):
58 workspace_dir = ctx["workspace"]
59 prod_dirs = [_ for _ in os.listdir(workspace_dir) if _.startswith("prod_")]
60 last = -1
61 for dir in prod_dirs:
62 stage = int(dir.split("_")[-1])
63 if stage > last:
64 last = stage
65 ctx["last_prod_stage"] = last
66 template_file_full_path = os.path.join(self.get_resources_dir(ctx), self.template_file)
67 file_path = pathlib.Path(__file__).parent.absolute()
68 shutil.copyfile(os.path.join(file_path, self.template_file), template_file_full_path)
69 ctx["template_file"] = self.template_file
70
71 def build(self, project: Project, ctx: dict):
72 dirs = [self.get_kit_dir(p, ctx) for p in project.participants]
73 self._make_dir(dirs)
74
75 def finalize(self, ctx: dict):
76 if ctx["last_prod_stage"] >= 99:
77 print(f"Please clean up {ctx['workspace']} by removing prod_N folders")
78 print("After clean-up, rerun the provision command.")
79 else:
80 current_prod_stage = str(ctx["last_prod_stage"] + 1).zfill(2)
81 current_prod_dir = os.path.join(ctx["workspace"], f"prod_{current_prod_stage}")
82 shutil.move(self.get_wip_dir(ctx), current_prod_dir)
83 ctx.pop("wip_dir", None)
84 print(f"Generated results can be found under {current_prod_dir}. Builder's wip folder removed.")
85 ctx["current_prod_dir"] = current_prod_dir
86
87
88 class DistributionBuilder(Builder):
89 def __init__(self, zip_password=False):
90 """Build the zip files for each folder.
91
92 Creates the zip files containing the archives for each startup kit. It will add password protection if the
93 argument (zip_password) is true.
94
95 Args:
96 zip_password: if true, will create zipped packages with passwords
97 """
98 self.zip_password = zip_password
99
100 def build(self, project: Project, ctx: dict):
101 wip_dir = self.get_wip_dir(ctx)
102 dirs = [name for name in os.listdir(wip_dir) if os.path.isdir(os.path.join(wip_dir, name))]
103 for dir in dirs:
104 dest_zip_file = os.path.join(wip_dir, f"{dir}")
105 if self.zip_password:
106 pw = generate_password()
107 run_args = ["zip", "-rq", "-P", pw, dest_zip_file + ".zip", ".", "-i", "startup/*"]
108 os.chdir(dest_zip_file)
109 subprocess.run(run_args)
110 os.chdir(os.path.join(dest_zip_file, ".."))
111 print(f"Password {pw} on {dir}.zip")
112 else:
113 shutil.make_archive(dest_zip_file, "zip", root_dir=os.path.join(wip_dir, dir), base_dir="startup")
114
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/nvflare/lighter/impl/workspace.py b/nvflare/lighter/impl/workspace.py
--- a/nvflare/lighter/impl/workspace.py
+++ b/nvflare/lighter/impl/workspace.py
@@ -98,6 +98,15 @@
self.zip_password = zip_password
def build(self, project: Project, ctx: dict):
+ """Create a zip for each individual folder.
+ Note that if zip_password is True, the zip command will be used to encrypt zip files. Users have to to
+ install this zip utility before provisioning. In Ubuntu system, use this command to install zip utility:
+ sudo apt-get install zip
+
+ Args:
+ project (Project): project instance
+ ctx (dict): the provision context
+ """
wip_dir = self.get_wip_dir(ctx)
dirs = [name for name in os.listdir(wip_dir) if os.path.isdir(os.path.join(wip_dir, name))]
for dir in dirs:
@@ -106,8 +115,12 @@
pw = generate_password()
run_args = ["zip", "-rq", "-P", pw, dest_zip_file + ".zip", ".", "-i", "startup/*"]
os.chdir(dest_zip_file)
- subprocess.run(run_args)
- os.chdir(os.path.join(dest_zip_file, ".."))
- print(f"Password {pw} on {dir}.zip")
+ try:
+ subprocess.run(run_args)
+ print(f"Password {pw} on {dir}.zip")
+ except FileNotFoundError as e:
+ raise RuntimeError("Unable to zip folders with password. Maybe the zip utility is not installed.")
+ finally:
+ os.chdir(os.path.join(dest_zip_file, ".."))
else:
shutil.make_archive(dest_zip_file, "zip", root_dir=os.path.join(wip_dir, dir), base_dir="startup")
| {"golden_diff": "diff --git a/nvflare/lighter/impl/workspace.py b/nvflare/lighter/impl/workspace.py\n--- a/nvflare/lighter/impl/workspace.py\n+++ b/nvflare/lighter/impl/workspace.py\n@@ -98,6 +98,15 @@\n self.zip_password = zip_password\n \n def build(self, project: Project, ctx: dict):\n+ \"\"\"Create a zip for each individual folder.\n+ Note that if zip_password is True, the zip command will be used to encrypt zip files. Users have to to\n+ install this zip utility before provisioning. In Ubuntu system, use this command to install zip utility:\n+ sudo apt-get install zip\n+\n+ Args:\n+ project (Project): project instance\n+ ctx (dict): the provision context\n+ \"\"\"\n wip_dir = self.get_wip_dir(ctx)\n dirs = [name for name in os.listdir(wip_dir) if os.path.isdir(os.path.join(wip_dir, name))]\n for dir in dirs:\n@@ -106,8 +115,12 @@\n pw = generate_password()\n run_args = [\"zip\", \"-rq\", \"-P\", pw, dest_zip_file + \".zip\", \".\", \"-i\", \"startup/*\"]\n os.chdir(dest_zip_file)\n- subprocess.run(run_args)\n- os.chdir(os.path.join(dest_zip_file, \"..\"))\n- print(f\"Password {pw} on {dir}.zip\")\n+ try:\n+ subprocess.run(run_args)\n+ print(f\"Password {pw} on {dir}.zip\")\n+ except FileNotFoundError as e:\n+ raise RuntimeError(\"Unable to zip folders with password. Maybe the zip utility is not installed.\")\n+ finally:\n+ os.chdir(os.path.join(dest_zip_file, \"..\"))\n else:\n shutil.make_archive(dest_zip_file, \"zip\", root_dir=os.path.join(wip_dir, dir), base_dir=\"startup\")\n", "issue": "Remove external 'zip' dependency from DistributionBuilder (zip_password=True)\nHi there, first off I want to acknowledge your efforts towards making an incredible tool, thank you!\r\n\r\nThe issue: I faced the exception *FileNotFoundError: [Errno 2] No such file or directory: 'zip'* when trying to specify `zip_password: true` for the DistributionBuilder args while running `provision -p project.yml`, and tracked the cause to `zip` not being installed on the local machine, which was a problem since installing additional software is not allowed on that machine.\r\n\r\n## Replicate\r\n```bash\r\n# Launch a docker container with image python:3.8 running interactive bash\r\ndocker run -it --rm python:3.8 bash\r\n\r\n# === Within docker container ===\r\npip install nvflare\r\nprovision # create example project.yml\r\n\r\n# Set zip_password to true\r\nsed -i 's/zip_password: false/zip_password: true/' project.yml\r\n\r\nprovision -p project.yml\r\n```\r\nWe will find the following output:\r\n```\r\nProject yaml file: /project.yml.\r\nException raised during provision. Incomplete prod_n folder removed.\r\nTraceback (most recent call last):\r\n File \"/usr/local/lib/python3.8/site-packages/nvflare/lighter/spec.py\", line 166, in provision\r\n b.build(study, ctx)\r\n File \"/usr/local/lib/python3.8/site-packages/nvflare/lighter/impl/workspace.py\", line 109, in build\r\n subprocess.run(run_args)\r\n File \"/usr/local/lib/python3.8/subprocess.py\", line 493, in run\r\n with Popen(*popenargs, **kwargs) as process:\r\n File \"/usr/local/lib/python3.8/subprocess.py\", line 858, in __init__\r\n self._execute_child(args, executable, preexec_fn, close_fds,\r\n File \"/usr/local/lib/python3.8/subprocess.py\", line 1704, in _execute_child\r\n raise child_exception_type(errno_num, err_msg, err_filename)\r\nFileNotFoundError: [Errno 2] No such file or directory: 'zip'\r\n```\r\nWhereas it'll work if we install zip\r\n```bash\r\n# === Within docker container ===\r\napt-get update && apt-get install -y zip\r\nprovision -p project.yml\r\n```\r\n\r\n## Fix\r\nThe build function of the DistributionBuilder uses `shutil` to make the non-password protected zip but relies on `subprocess.run(['zip', ...])` in order to create password-protected files, since shutil doesn't support passwords.\r\n\r\nI'd propose adding a package requirement of something like `pyminizip` or `pyzipper` to handle password protection.\r\n\r\nhttps://github.com/NVIDIA/NVFlare/blob/c8d51bfb534c02eff874a760c47086c87cbff59f/nvflare/lighter/impl/workspace.py#L100-L113\n", "before_files": [{"content": "# Copyright (c) 2021-2022, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport pathlib\nimport shutil\nimport subprocess\n\nfrom nvflare.lighter.spec import Builder, Project\nfrom nvflare.lighter.utils import generate_password\n\n\nclass WorkspaceBuilder(Builder):\n def __init__(self, template_file):\n \"\"\"Manages the folder structure for provisioned projects.\n\n Sets the template_file containing scripts and configs to put into startup folders, creates directories for the\n participants, and moves the provisioned project to the final location at the end\n ($WORKSPACE/$PROJECT_NAME/prod_XX). WorkspaceBuilder manages and sets the number in prod_XX by incrementing from\n the last time provision was run for this project in this workspace, starting with 00 to a max of 99.\n\n Each time the provisioning tool runs, it requires a workspace folder in the local file system. The workspace\n will have the following folder structure:\n\n .. code-block:: text\n\n $WORKSPACE/ <--- this is assigned by -w option of provision command (default is workspace)\n $PROJECT_NAME/ <--- this is the name value in the project.yml file\n prod_00/ <--- a new prod_NN folder is created if provision does not have any errors.\n prod_01/\n ...\n resources/ <--- this folder stores resources for other builders to load\n state/ <--- this folder stores persistent information (such as certificates) so subsequent runs of the provision command can load the state back.\n wip/ <--- this is only used during runtime, and will be removed when the provision command exits\n\n Args:\n template_file: name of template file containing scripts and configs to put into startup folders\n \"\"\"\n self.template_file = template_file\n\n def _make_dir(self, dirs):\n for dir in dirs:\n if not os.path.exists(dir):\n os.makedirs(dir)\n\n def initialize(self, ctx):\n workspace_dir = ctx[\"workspace\"]\n prod_dirs = [_ for _ in os.listdir(workspace_dir) if _.startswith(\"prod_\")]\n last = -1\n for dir in prod_dirs:\n stage = int(dir.split(\"_\")[-1])\n if stage > last:\n last = stage\n ctx[\"last_prod_stage\"] = last\n template_file_full_path = os.path.join(self.get_resources_dir(ctx), self.template_file)\n file_path = pathlib.Path(__file__).parent.absolute()\n shutil.copyfile(os.path.join(file_path, self.template_file), template_file_full_path)\n ctx[\"template_file\"] = self.template_file\n\n def build(self, project: Project, ctx: dict):\n dirs = [self.get_kit_dir(p, ctx) for p in project.participants]\n self._make_dir(dirs)\n\n def finalize(self, ctx: dict):\n if ctx[\"last_prod_stage\"] >= 99:\n print(f\"Please clean up {ctx['workspace']} by removing prod_N folders\")\n print(\"After clean-up, rerun the provision command.\")\n else:\n current_prod_stage = str(ctx[\"last_prod_stage\"] + 1).zfill(2)\n current_prod_dir = os.path.join(ctx[\"workspace\"], f\"prod_{current_prod_stage}\")\n shutil.move(self.get_wip_dir(ctx), current_prod_dir)\n ctx.pop(\"wip_dir\", None)\n print(f\"Generated results can be found under {current_prod_dir}. Builder's wip folder removed.\")\n ctx[\"current_prod_dir\"] = current_prod_dir\n\n\nclass DistributionBuilder(Builder):\n def __init__(self, zip_password=False):\n \"\"\"Build the zip files for each folder.\n\n Creates the zip files containing the archives for each startup kit. It will add password protection if the\n argument (zip_password) is true.\n\n Args:\n zip_password: if true, will create zipped packages with passwords\n \"\"\"\n self.zip_password = zip_password\n\n def build(self, project: Project, ctx: dict):\n wip_dir = self.get_wip_dir(ctx)\n dirs = [name for name in os.listdir(wip_dir) if os.path.isdir(os.path.join(wip_dir, name))]\n for dir in dirs:\n dest_zip_file = os.path.join(wip_dir, f\"{dir}\")\n if self.zip_password:\n pw = generate_password()\n run_args = [\"zip\", \"-rq\", \"-P\", pw, dest_zip_file + \".zip\", \".\", \"-i\", \"startup/*\"]\n os.chdir(dest_zip_file)\n subprocess.run(run_args)\n os.chdir(os.path.join(dest_zip_file, \"..\"))\n print(f\"Password {pw} on {dir}.zip\")\n else:\n shutil.make_archive(dest_zip_file, \"zip\", root_dir=os.path.join(wip_dir, dir), base_dir=\"startup\")\n", "path": "nvflare/lighter/impl/workspace.py"}], "after_files": [{"content": "# Copyright (c) 2021-2022, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport pathlib\nimport shutil\nimport subprocess\n\nfrom nvflare.lighter.spec import Builder, Project\nfrom nvflare.lighter.utils import generate_password\n\n\nclass WorkspaceBuilder(Builder):\n def __init__(self, template_file):\n \"\"\"Manages the folder structure for provisioned projects.\n\n Sets the template_file containing scripts and configs to put into startup folders, creates directories for the\n participants, and moves the provisioned project to the final location at the end\n ($WORKSPACE/$PROJECT_NAME/prod_XX). WorkspaceBuilder manages and sets the number in prod_XX by incrementing from\n the last time provision was run for this project in this workspace, starting with 00 to a max of 99.\n\n Each time the provisioning tool runs, it requires a workspace folder in the local file system. The workspace\n will have the following folder structure:\n\n .. code-block:: text\n\n $WORKSPACE/ <--- this is assigned by -w option of provision command (default is workspace)\n $PROJECT_NAME/ <--- this is the name value in the project.yml file\n prod_00/ <--- a new prod_NN folder is created if provision does not have any errors.\n prod_01/\n ...\n resources/ <--- this folder stores resources for other builders to load\n state/ <--- this folder stores persistent information (such as certificates) so subsequent runs of the provision command can load the state back.\n wip/ <--- this is only used during runtime, and will be removed when the provision command exits\n\n Args:\n template_file: name of template file containing scripts and configs to put into startup folders\n \"\"\"\n self.template_file = template_file\n\n def _make_dir(self, dirs):\n for dir in dirs:\n if not os.path.exists(dir):\n os.makedirs(dir)\n\n def initialize(self, ctx):\n workspace_dir = ctx[\"workspace\"]\n prod_dirs = [_ for _ in os.listdir(workspace_dir) if _.startswith(\"prod_\")]\n last = -1\n for dir in prod_dirs:\n stage = int(dir.split(\"_\")[-1])\n if stage > last:\n last = stage\n ctx[\"last_prod_stage\"] = last\n template_file_full_path = os.path.join(self.get_resources_dir(ctx), self.template_file)\n file_path = pathlib.Path(__file__).parent.absolute()\n shutil.copyfile(os.path.join(file_path, self.template_file), template_file_full_path)\n ctx[\"template_file\"] = self.template_file\n\n def build(self, project: Project, ctx: dict):\n dirs = [self.get_kit_dir(p, ctx) for p in project.participants]\n self._make_dir(dirs)\n\n def finalize(self, ctx: dict):\n if ctx[\"last_prod_stage\"] >= 99:\n print(f\"Please clean up {ctx['workspace']} by removing prod_N folders\")\n print(\"After clean-up, rerun the provision command.\")\n else:\n current_prod_stage = str(ctx[\"last_prod_stage\"] + 1).zfill(2)\n current_prod_dir = os.path.join(ctx[\"workspace\"], f\"prod_{current_prod_stage}\")\n shutil.move(self.get_wip_dir(ctx), current_prod_dir)\n ctx.pop(\"wip_dir\", None)\n print(f\"Generated results can be found under {current_prod_dir}. Builder's wip folder removed.\")\n ctx[\"current_prod_dir\"] = current_prod_dir\n\n\nclass DistributionBuilder(Builder):\n def __init__(self, zip_password=False):\n \"\"\"Build the zip files for each folder.\n\n Creates the zip files containing the archives for each startup kit. It will add password protection if the\n argument (zip_password) is true.\n\n Args:\n zip_password: if true, will create zipped packages with passwords\n \"\"\"\n self.zip_password = zip_password\n\n def build(self, project: Project, ctx: dict):\n \"\"\"Create a zip for each individual folder.\n Note that if zip_password is True, the zip command will be used to encrypt zip files. Users have to to\n install this zip utility before provisioning. In Ubuntu system, use this command to install zip utility:\n sudo apt-get install zip\n\n Args:\n project (Project): project instance\n ctx (dict): the provision context\n \"\"\"\n wip_dir = self.get_wip_dir(ctx)\n dirs = [name for name in os.listdir(wip_dir) if os.path.isdir(os.path.join(wip_dir, name))]\n for dir in dirs:\n dest_zip_file = os.path.join(wip_dir, f\"{dir}\")\n if self.zip_password:\n pw = generate_password()\n run_args = [\"zip\", \"-rq\", \"-P\", pw, dest_zip_file + \".zip\", \".\", \"-i\", \"startup/*\"]\n os.chdir(dest_zip_file)\n try:\n subprocess.run(run_args)\n print(f\"Password {pw} on {dir}.zip\")\n except FileNotFoundError as e:\n raise RuntimeError(\"Unable to zip folders with password. Maybe the zip utility is not installed.\")\n finally:\n os.chdir(os.path.join(dest_zip_file, \"..\"))\n else:\n shutil.make_archive(dest_zip_file, \"zip\", root_dir=os.path.join(wip_dir, dir), base_dir=\"startup\")\n", "path": "nvflare/lighter/impl/workspace.py"}]} | 2,300 | 416 |
gh_patches_debug_26270 | rasdani/github-patches | git_diff | e-valuation__EvaP-2036 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Translations in Javascript and Typescript
When writing Javascript and Typescript in separate, non-HTML files, we can't use the Django template functions `trans`, `blocktrans`, etc. anymore. We have worked around this by putting translated strings into the DOM and accessing them via Javascript then.
Instead of doing this, we want to have a unified approach where the use-site can just write `trans("The server is not responding.")` or so. There are two possible approaches:
1. DIY: We have a function `trans(english: string, to: Language = window.LANGUAGE): string` with `type Language = "English" | "German"`. This function looks up the string in a global dictionary (for example `window.translationDictionary` or so). I am not sure what it should do if the string is not present, probably return the English string and emit a warning? This dictionary would be defined in a script tag in a HTML file, something like (possibly with an implementation that doesn't repeat the strings a little less):
```html
<script type="text/javascript">
window.translationDictionary = {
"de": {
{% language 'de' %}
"The server is not responding": "{% trans 'The server is not responding' %}",
{% endlanguage %}
}
};
</script>
```
2. Use Django's builtin functionality: There is a builtin way that configures an extra endpoint to make all translations available (https://docs.djangoproject.com/en/4.2/topics/i18n/translation/#internationalization-in-javascript-code). A plus is that it also supports `ngettext` and so on. It seems like it can also detect all strings used in translations, but the setup may be a bit tricky with Typescript thrown into the mix.
I think I prefer the first approach, but maybe we encounter difficulties with it or decide that we will need `ngettext` etc. in the future and go with the Django versions directly.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `evap/development/management/commands/translate.py`
Content:
```
1 from django.core.management import call_command
2 from django.core.management.base import BaseCommand
3
4
5 class Command(BaseCommand):
6 args = ""
7 help = 'Execute "makemessages --locale=de --ignore=node_modules/*"'
8
9 def handle(self, *args, **options):
10 self.stdout.write('Executing "manage.py makemessages --locale=de --ignore=node_modules/*"')
11 call_command("makemessages", "--locale=de", "--ignore=node_modules/*")
12
```
Path: `evap/urls.py`
Content:
```
1 import django.contrib.auth.views
2 from django.conf import settings
3 from django.urls import include, path
4
5 urlpatterns = [
6 path("", include('evap.evaluation.urls')),
7 path("staff/", include('evap.staff.urls')),
8 path("results/", include('evap.results.urls')),
9 path("student/", include('evap.student.urls')),
10 path("contributor/", include('evap.contributor.urls')),
11 path("rewards/", include('evap.rewards.urls')),
12 path("grades/", include('evap.grades.urls')),
13
14 path("logout", django.contrib.auth.views.LogoutView.as_view(next_page="/"), name="django-auth-logout"),
15 path("oidc/", include('mozilla_django_oidc.urls')),
16 ]
17
18 if settings.DEBUG:
19 urlpatterns += [path('development/', include('evap.development.urls'))]
20
21 if settings.ENABLE_DEBUG_TOOLBAR:
22 # pylint does not correctly evaluate this if, so it will raise an import-error on
23 # GitHub actions and a useless-suppression on a vagrant setup. Ignore both cases.
24 import debug_toolbar # pylint: disable=import-error, useless-suppression
25 urlpatterns += [path('__debug__/', include(debug_toolbar.urls))]
26
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/evap/development/management/commands/translate.py b/evap/development/management/commands/translate.py
--- a/evap/development/management/commands/translate.py
+++ b/evap/development/management/commands/translate.py
@@ -9,3 +9,11 @@
def handle(self, *args, **options):
self.stdout.write('Executing "manage.py makemessages --locale=de --ignore=node_modules/*"')
call_command("makemessages", "--locale=de", "--ignore=node_modules/*")
+ call_command(
+ "makemessages",
+ "--domain=djangojs",
+ "--extension=js,ts",
+ "--locale=de",
+ "--ignore=node_modules/*",
+ "--ignore=evap/static/js/*.min.js",
+ )
diff --git a/evap/urls.py b/evap/urls.py
--- a/evap/urls.py
+++ b/evap/urls.py
@@ -1,6 +1,9 @@
import django.contrib.auth.views
from django.conf import settings
from django.urls import include, path
+from django.views.i18n import JavaScriptCatalog
+
+from evap.middleware import no_login_required
urlpatterns = [
path("", include('evap.evaluation.urls')),
@@ -13,6 +16,8 @@
path("logout", django.contrib.auth.views.LogoutView.as_view(next_page="/"), name="django-auth-logout"),
path("oidc/", include('mozilla_django_oidc.urls')),
+
+ path("catalog.js", no_login_required(JavaScriptCatalog.as_view()), name="javascript-catalog"),
]
if settings.DEBUG:
| {"golden_diff": "diff --git a/evap/development/management/commands/translate.py b/evap/development/management/commands/translate.py\n--- a/evap/development/management/commands/translate.py\n+++ b/evap/development/management/commands/translate.py\n@@ -9,3 +9,11 @@\n def handle(self, *args, **options):\n self.stdout.write('Executing \"manage.py makemessages --locale=de --ignore=node_modules/*\"')\n call_command(\"makemessages\", \"--locale=de\", \"--ignore=node_modules/*\")\n+ call_command(\n+ \"makemessages\",\n+ \"--domain=djangojs\",\n+ \"--extension=js,ts\",\n+ \"--locale=de\",\n+ \"--ignore=node_modules/*\",\n+ \"--ignore=evap/static/js/*.min.js\",\n+ )\ndiff --git a/evap/urls.py b/evap/urls.py\n--- a/evap/urls.py\n+++ b/evap/urls.py\n@@ -1,6 +1,9 @@\n import django.contrib.auth.views\n from django.conf import settings\n from django.urls import include, path\n+from django.views.i18n import JavaScriptCatalog\n+\n+from evap.middleware import no_login_required\n \n urlpatterns = [\n path(\"\", include('evap.evaluation.urls')),\n@@ -13,6 +16,8 @@\n \n path(\"logout\", django.contrib.auth.views.LogoutView.as_view(next_page=\"/\"), name=\"django-auth-logout\"),\n path(\"oidc/\", include('mozilla_django_oidc.urls')),\n+\n+ path(\"catalog.js\", no_login_required(JavaScriptCatalog.as_view()), name=\"javascript-catalog\"),\n ]\n \n if settings.DEBUG:\n", "issue": "Translations in Javascript and Typescript\nWhen writing Javascript and Typescript in separate, non-HTML files, we can't use the Django template functions `trans`, `blocktrans`, etc. anymore. We have worked around this by putting translated strings into the DOM and accessing them via Javascript then.\r\n\r\nInstead of doing this, we want to have a unified approach where the use-site can just write `trans(\"The server is not responding.\")` or so. There are two possible approaches:\r\n\r\n1. DIY: We have a function `trans(english: string, to: Language = window.LANGUAGE): string` with `type Language = \"English\" | \"German\"`. This function looks up the string in a global dictionary (for example `window.translationDictionary` or so). I am not sure what it should do if the string is not present, probably return the English string and emit a warning? This dictionary would be defined in a script tag in a HTML file, something like (possibly with an implementation that doesn't repeat the strings a little less):\r\n```html\r\n<script type=\"text/javascript\">\r\n window.translationDictionary = {\r\n \"de\": {\r\n {% language 'de' %}\r\n \"The server is not responding\": \"{% trans 'The server is not responding' %}\",\r\n {% endlanguage %}\r\n }\r\n };\r\n</script>\r\n```\r\n2. Use Django's builtin functionality: There is a builtin way that configures an extra endpoint to make all translations available (https://docs.djangoproject.com/en/4.2/topics/i18n/translation/#internationalization-in-javascript-code). A plus is that it also supports `ngettext` and so on. It seems like it can also detect all strings used in translations, but the setup may be a bit tricky with Typescript thrown into the mix.\r\n\r\nI think I prefer the first approach, but maybe we encounter difficulties with it or decide that we will need `ngettext` etc. in the future and go with the Django versions directly.\n", "before_files": [{"content": "from django.core.management import call_command\nfrom django.core.management.base import BaseCommand\n\n\nclass Command(BaseCommand):\n args = \"\"\n help = 'Execute \"makemessages --locale=de --ignore=node_modules/*\"'\n\n def handle(self, *args, **options):\n self.stdout.write('Executing \"manage.py makemessages --locale=de --ignore=node_modules/*\"')\n call_command(\"makemessages\", \"--locale=de\", \"--ignore=node_modules/*\")\n", "path": "evap/development/management/commands/translate.py"}, {"content": "import django.contrib.auth.views\nfrom django.conf import settings\nfrom django.urls import include, path\n\nurlpatterns = [\n path(\"\", include('evap.evaluation.urls')),\n path(\"staff/\", include('evap.staff.urls')),\n path(\"results/\", include('evap.results.urls')),\n path(\"student/\", include('evap.student.urls')),\n path(\"contributor/\", include('evap.contributor.urls')),\n path(\"rewards/\", include('evap.rewards.urls')),\n path(\"grades/\", include('evap.grades.urls')),\n\n path(\"logout\", django.contrib.auth.views.LogoutView.as_view(next_page=\"/\"), name=\"django-auth-logout\"),\n path(\"oidc/\", include('mozilla_django_oidc.urls')),\n]\n\nif settings.DEBUG:\n urlpatterns += [path('development/', include('evap.development.urls'))]\n\n if settings.ENABLE_DEBUG_TOOLBAR:\n # pylint does not correctly evaluate this if, so it will raise an import-error on\n # GitHub actions and a useless-suppression on a vagrant setup. Ignore both cases.\n import debug_toolbar # pylint: disable=import-error, useless-suppression\n urlpatterns += [path('__debug__/', include(debug_toolbar.urls))]\n", "path": "evap/urls.py"}], "after_files": [{"content": "from django.core.management import call_command\nfrom django.core.management.base import BaseCommand\n\n\nclass Command(BaseCommand):\n args = \"\"\n help = 'Execute \"makemessages --locale=de --ignore=node_modules/*\"'\n\n def handle(self, *args, **options):\n self.stdout.write('Executing \"manage.py makemessages --locale=de --ignore=node_modules/*\"')\n call_command(\"makemessages\", \"--locale=de\", \"--ignore=node_modules/*\")\n call_command(\n \"makemessages\",\n \"--domain=djangojs\",\n \"--extension=js,ts\",\n \"--locale=de\",\n \"--ignore=node_modules/*\",\n \"--ignore=evap/static/js/*.min.js\",\n )\n", "path": "evap/development/management/commands/translate.py"}, {"content": "import django.contrib.auth.views\nfrom django.conf import settings\nfrom django.urls import include, path\nfrom django.views.i18n import JavaScriptCatalog\n\nfrom evap.middleware import no_login_required\n\nurlpatterns = [\n path(\"\", include('evap.evaluation.urls')),\n path(\"staff/\", include('evap.staff.urls')),\n path(\"results/\", include('evap.results.urls')),\n path(\"student/\", include('evap.student.urls')),\n path(\"contributor/\", include('evap.contributor.urls')),\n path(\"rewards/\", include('evap.rewards.urls')),\n path(\"grades/\", include('evap.grades.urls')),\n\n path(\"logout\", django.contrib.auth.views.LogoutView.as_view(next_page=\"/\"), name=\"django-auth-logout\"),\n path(\"oidc/\", include('mozilla_django_oidc.urls')),\n\n path(\"catalog.js\", no_login_required(JavaScriptCatalog.as_view()), name=\"javascript-catalog\"),\n]\n\nif settings.DEBUG:\n urlpatterns += [path('development/', include('evap.development.urls'))]\n\n if settings.ENABLE_DEBUG_TOOLBAR:\n # pylint does not correctly evaluate this if, so it will raise an import-error on\n # GitHub actions and a useless-suppression on a vagrant setup. Ignore both cases.\n import debug_toolbar # pylint: disable=import-error, useless-suppression\n urlpatterns += [path('__debug__/', include(debug_toolbar.urls))]\n", "path": "evap/urls.py"}]} | 1,103 | 369 |
gh_patches_debug_1560 | rasdani/github-patches | git_diff | NVIDIA__TransformerEngine-813 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`warnings.simplefilter('default')` in global scope causes excessive DeprecationWarnings
https://github.com/NVIDIA/TransformerEngine/blob/f85553ea369da15fd726ab279818e415be48a228/transformer_engine/common/utils.py#L9
Importing the `transformer_engine.common.utils` resets the warning filters to default settings using `warnings.simplefilter('default')` in the global scope. This results in the console being flooded with DeprecationWarnings, which are normally ignored by Python by default.
Would it be possible to move setting the warning filter config to a more controlled scope in this module?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `transformer_engine/common/utils.py`
Content:
```
1 # Copyright (c) 2022-2024, NVIDIA CORPORATION & AFFILIATES. All rights reserved.
2 #
3 # See LICENSE for license information.
4 """The utilities for Transformer Engine"""
5 import inspect
6 import warnings
7 from enum import Enum
8
9 warnings.simplefilter('default')
10
11
12 class DeprecatedEnum: # pylint: disable=too-few-public-methods
13 """DeprecatedEnum"""
14
15 def __init__(self, enum_cls, msg):
16 self.enum_cls = enum_cls
17 self.msg = msg
18
19 def __iter__(self):
20 return iter(list(self.enum_cls.__members__.values()))
21
22 def __getattr__(self, name):
23 if name in self.enum_cls.__members__:
24 warnings.warn(self.msg, DeprecationWarning)
25 return self.enum_cls.__members__[name]
26 raise AttributeError(f"{self.enum_cls} does not contain {name}")
27
28
29 def deprecate_wrapper(obj, msg):
30 """Deprecate wrapper"""
31 if inspect.isclass(obj):
32 if issubclass(obj, Enum):
33 return DeprecatedEnum(obj, msg)
34
35 class DeprecatedCls(obj): # pylint: disable=too-few-public-methods
36 """DeprecatedCls"""
37
38 def __init__(self, *args, **kwargs):
39 warnings.warn(msg, DeprecationWarning)
40 super().__init__(*args, **kwargs)
41
42 return DeprecatedCls
43
44 if inspect.isfunction(obj):
45
46 def deprecated(*args, **kwargs):
47 warnings.warn(msg, DeprecationWarning)
48 return obj(*args, **kwargs)
49
50 return deprecated
51
52 raise NotImplementedError(
53 f"deprecate_cls_wrapper only support Class and Function, but got {type(obj)}.")
54
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/transformer_engine/common/utils.py b/transformer_engine/common/utils.py
--- a/transformer_engine/common/utils.py
+++ b/transformer_engine/common/utils.py
@@ -6,7 +6,8 @@
import warnings
from enum import Enum
-warnings.simplefilter('default')
+warnings.filterwarnings(
+ "module", category=DeprecationWarning, module="transformer_engine.common.utils")
class DeprecatedEnum: # pylint: disable=too-few-public-methods
| {"golden_diff": "diff --git a/transformer_engine/common/utils.py b/transformer_engine/common/utils.py\n--- a/transformer_engine/common/utils.py\n+++ b/transformer_engine/common/utils.py\n@@ -6,7 +6,8 @@\n import warnings\n from enum import Enum\n \n-warnings.simplefilter('default')\n+warnings.filterwarnings(\n+ \"module\", category=DeprecationWarning, module=\"transformer_engine.common.utils\")\n \n \n class DeprecatedEnum: # pylint: disable=too-few-public-methods\n", "issue": "`warnings.simplefilter('default')` in global scope causes excessive DeprecationWarnings\nhttps://github.com/NVIDIA/TransformerEngine/blob/f85553ea369da15fd726ab279818e415be48a228/transformer_engine/common/utils.py#L9\r\n\r\nImporting the `transformer_engine.common.utils` resets the warning filters to default settings using `warnings.simplefilter('default')` in the global scope. This results in the console being flooded with DeprecationWarnings, which are normally ignored by Python by default.\r\n\r\nWould it be possible to move setting the warning filter config to a more controlled scope in this module?\n", "before_files": [{"content": "# Copyright (c) 2022-2024, NVIDIA CORPORATION & AFFILIATES. All rights reserved.\n#\n# See LICENSE for license information.\n\"\"\"The utilities for Transformer Engine\"\"\"\nimport inspect\nimport warnings\nfrom enum import Enum\n\nwarnings.simplefilter('default')\n\n\nclass DeprecatedEnum: # pylint: disable=too-few-public-methods\n \"\"\"DeprecatedEnum\"\"\"\n\n def __init__(self, enum_cls, msg):\n self.enum_cls = enum_cls\n self.msg = msg\n\n def __iter__(self):\n return iter(list(self.enum_cls.__members__.values()))\n\n def __getattr__(self, name):\n if name in self.enum_cls.__members__:\n warnings.warn(self.msg, DeprecationWarning)\n return self.enum_cls.__members__[name]\n raise AttributeError(f\"{self.enum_cls} does not contain {name}\")\n\n\ndef deprecate_wrapper(obj, msg):\n \"\"\"Deprecate wrapper\"\"\"\n if inspect.isclass(obj):\n if issubclass(obj, Enum):\n return DeprecatedEnum(obj, msg)\n\n class DeprecatedCls(obj): # pylint: disable=too-few-public-methods\n \"\"\"DeprecatedCls\"\"\"\n\n def __init__(self, *args, **kwargs):\n warnings.warn(msg, DeprecationWarning)\n super().__init__(*args, **kwargs)\n\n return DeprecatedCls\n\n if inspect.isfunction(obj):\n\n def deprecated(*args, **kwargs):\n warnings.warn(msg, DeprecationWarning)\n return obj(*args, **kwargs)\n\n return deprecated\n\n raise NotImplementedError(\n f\"deprecate_cls_wrapper only support Class and Function, but got {type(obj)}.\")\n", "path": "transformer_engine/common/utils.py"}], "after_files": [{"content": "# Copyright (c) 2022-2024, NVIDIA CORPORATION & AFFILIATES. All rights reserved.\n#\n# See LICENSE for license information.\n\"\"\"The utilities for Transformer Engine\"\"\"\nimport inspect\nimport warnings\nfrom enum import Enum\n\nwarnings.filterwarnings(\n \"module\", category=DeprecationWarning, module=\"transformer_engine.common.utils\")\n\n\nclass DeprecatedEnum: # pylint: disable=too-few-public-methods\n \"\"\"DeprecatedEnum\"\"\"\n\n def __init__(self, enum_cls, msg):\n self.enum_cls = enum_cls\n self.msg = msg\n\n def __iter__(self):\n return iter(list(self.enum_cls.__members__.values()))\n\n def __getattr__(self, name):\n if name in self.enum_cls.__members__:\n warnings.warn(self.msg, DeprecationWarning)\n return self.enum_cls.__members__[name]\n raise AttributeError(f\"{self.enum_cls} does not contain {name}\")\n\n\ndef deprecate_wrapper(obj, msg):\n \"\"\"Deprecate wrapper\"\"\"\n if inspect.isclass(obj):\n if issubclass(obj, Enum):\n return DeprecatedEnum(obj, msg)\n\n class DeprecatedCls(obj): # pylint: disable=too-few-public-methods\n \"\"\"DeprecatedCls\"\"\"\n\n def __init__(self, *args, **kwargs):\n warnings.warn(msg, DeprecationWarning)\n super().__init__(*args, **kwargs)\n\n return DeprecatedCls\n\n if inspect.isfunction(obj):\n\n def deprecated(*args, **kwargs):\n warnings.warn(msg, DeprecationWarning)\n return obj(*args, **kwargs)\n\n return deprecated\n\n raise NotImplementedError(\n f\"deprecate_cls_wrapper only support Class and Function, but got {type(obj)}.\")\n", "path": "transformer_engine/common/utils.py"}]} | 868 | 108 |
gh_patches_debug_76 | rasdani/github-patches | git_diff | streamlit__streamlit-2570 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
URL markup does not get generated as a link
# Summary
URLs used to generate an anchor tag automatically in markup. Now it does not
# Steps to reproduce
Code snippet:
```
st.write(f"""
As always, thank you to [all our contributors](https://github.com/streamlit/streamlit/graphs/contributors) who help make Streamlit awesome!
---
### Connect With Us
- We can be found at https://streamlit.io and https://twitter.com/streamlit
- Come by
[the forums](https://discuss.streamlit.io/c/official-announcements/6) if you'd like to ask questions,
post awesome apps, or just say hi!
""")
```
## Expected behavior:
[0.73](https://share.streamlit.io/streamlit/release-demos/0.73/0.73/streamlit_app.py)

## Actual behavior:
[0.74](https://share.streamlit.io/streamlit/release-demos/0.74/0.74/streamlit_app.py)

## Is this a regression?
Yes as of 0.74
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lib/setup.py`
Content:
```
1 import os
2 import setuptools
3 import sys
4
5 from setuptools.command.install import install
6
7 try:
8 from pipenv.project import Project
9 from pipenv.utils import convert_deps_to_pip
10 except:
11 exit_msg = (
12 "pipenv is required to package Streamlit. Please install pipenv and try again"
13 )
14 sys.exit(exit_msg)
15
16 VERSION = "0.74.0" # PEP-440
17
18 NAME = "streamlit"
19
20 DESCRIPTION = "The fastest way to build data apps in Python"
21
22 LONG_DESCRIPTION = (
23 "Streamlit's open-source app framework is the easiest way "
24 "for data scientists and machine learning engineers to "
25 "create beautiful, performant apps in only a few hours! "
26 "All in pure Python. All for free."
27 )
28
29 pipfile = Project(chdir=False).parsed_pipfile
30
31 packages = pipfile["packages"].copy()
32 requirements = convert_deps_to_pip(packages, r=False)
33
34
35 class VerifyVersionCommand(install):
36 """Custom command to verify that the git tag matches our version"""
37
38 description = "verify that the git tag matches our version"
39
40 def run(self):
41 tag = os.getenv("CIRCLE_TAG")
42
43 if tag != VERSION:
44 info = "Git tag: {0} does not match the version of this app: {1}".format(
45 tag, VERSION
46 )
47 sys.exit(info)
48
49
50 setuptools.setup(
51 name=NAME,
52 version=VERSION,
53 description=DESCRIPTION,
54 long_description=LONG_DESCRIPTION,
55 url="https://streamlit.io",
56 author="Streamlit Inc",
57 author_email="[email protected]",
58 python_requires=">=3.6",
59 license="Apache 2",
60 packages=setuptools.find_packages(exclude=["tests", "tests.*"]),
61 # Requirements
62 install_requires=requirements,
63 zip_safe=False, # install source files not egg
64 include_package_data=True, # copy html and friends
65 entry_points={"console_scripts": ["streamlit = streamlit.cli:main"]},
66 # For Windows so that streamlit * commands work ie.
67 # - streamlit version
68 # - streamlit hello
69 scripts=["bin/streamlit.cmd"],
70 cmdclass={
71 "verify": VerifyVersionCommand,
72 },
73 )
74
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lib/setup.py b/lib/setup.py
--- a/lib/setup.py
+++ b/lib/setup.py
@@ -13,7 +13,7 @@
)
sys.exit(exit_msg)
-VERSION = "0.74.0" # PEP-440
+VERSION = "0.74.1" # PEP-440
NAME = "streamlit"
| {"golden_diff": "diff --git a/lib/setup.py b/lib/setup.py\n--- a/lib/setup.py\n+++ b/lib/setup.py\n@@ -13,7 +13,7 @@\n )\n sys.exit(exit_msg)\n \n-VERSION = \"0.74.0\" # PEP-440\n+VERSION = \"0.74.1\" # PEP-440\n \n NAME = \"streamlit\"\n", "issue": "URL markup does not get generated as a link\n# Summary\r\nURLs used to generate an anchor tag automatically in markup. Now it does not\r\n\r\n\r\n# Steps to reproduce\r\nCode snippet:\r\n\r\n```\r\nst.write(f\"\"\"\r\n As always, thank you to [all our contributors](https://github.com/streamlit/streamlit/graphs/contributors) who help make Streamlit awesome!\r\n\r\n ---\r\n\r\n ### Connect With Us\r\n\r\n - We can be found at https://streamlit.io and https://twitter.com/streamlit\r\n - Come by\r\n [the forums](https://discuss.streamlit.io/c/official-announcements/6) if you'd like to ask questions,\r\n post awesome apps, or just say hi!\r\n \"\"\")\r\n```\r\n\r\n## Expected behavior:\r\n[0.73](https://share.streamlit.io/streamlit/release-demos/0.73/0.73/streamlit_app.py)\r\n\r\n\r\n\r\n## Actual behavior:\r\n[0.74](https://share.streamlit.io/streamlit/release-demos/0.74/0.74/streamlit_app.py)\r\n\r\n\r\n\r\n## Is this a regression?\r\nYes as of 0.74\r\n\n", "before_files": [{"content": "import os\nimport setuptools\nimport sys\n\nfrom setuptools.command.install import install\n\ntry:\n from pipenv.project import Project\n from pipenv.utils import convert_deps_to_pip\nexcept:\n exit_msg = (\n \"pipenv is required to package Streamlit. Please install pipenv and try again\"\n )\n sys.exit(exit_msg)\n\nVERSION = \"0.74.0\" # PEP-440\n\nNAME = \"streamlit\"\n\nDESCRIPTION = \"The fastest way to build data apps in Python\"\n\nLONG_DESCRIPTION = (\n \"Streamlit's open-source app framework is the easiest way \"\n \"for data scientists and machine learning engineers to \"\n \"create beautiful, performant apps in only a few hours! \"\n \"All in pure Python. All for free.\"\n)\n\npipfile = Project(chdir=False).parsed_pipfile\n\npackages = pipfile[\"packages\"].copy()\nrequirements = convert_deps_to_pip(packages, r=False)\n\n\nclass VerifyVersionCommand(install):\n \"\"\"Custom command to verify that the git tag matches our version\"\"\"\n\n description = \"verify that the git tag matches our version\"\n\n def run(self):\n tag = os.getenv(\"CIRCLE_TAG\")\n\n if tag != VERSION:\n info = \"Git tag: {0} does not match the version of this app: {1}\".format(\n tag, VERSION\n )\n sys.exit(info)\n\n\nsetuptools.setup(\n name=NAME,\n version=VERSION,\n description=DESCRIPTION,\n long_description=LONG_DESCRIPTION,\n url=\"https://streamlit.io\",\n author=\"Streamlit Inc\",\n author_email=\"[email protected]\",\n python_requires=\">=3.6\",\n license=\"Apache 2\",\n packages=setuptools.find_packages(exclude=[\"tests\", \"tests.*\"]),\n # Requirements\n install_requires=requirements,\n zip_safe=False, # install source files not egg\n include_package_data=True, # copy html and friends\n entry_points={\"console_scripts\": [\"streamlit = streamlit.cli:main\"]},\n # For Windows so that streamlit * commands work ie.\n # - streamlit version\n # - streamlit hello\n scripts=[\"bin/streamlit.cmd\"],\n cmdclass={\n \"verify\": VerifyVersionCommand,\n },\n)\n", "path": "lib/setup.py"}], "after_files": [{"content": "import os\nimport setuptools\nimport sys\n\nfrom setuptools.command.install import install\n\ntry:\n from pipenv.project import Project\n from pipenv.utils import convert_deps_to_pip\nexcept:\n exit_msg = (\n \"pipenv is required to package Streamlit. Please install pipenv and try again\"\n )\n sys.exit(exit_msg)\n\nVERSION = \"0.74.1\" # PEP-440\n\nNAME = \"streamlit\"\n\nDESCRIPTION = \"The fastest way to build data apps in Python\"\n\nLONG_DESCRIPTION = (\n \"Streamlit's open-source app framework is the easiest way \"\n \"for data scientists and machine learning engineers to \"\n \"create beautiful, performant apps in only a few hours! \"\n \"All in pure Python. All for free.\"\n)\n\npipfile = Project(chdir=False).parsed_pipfile\n\npackages = pipfile[\"packages\"].copy()\nrequirements = convert_deps_to_pip(packages, r=False)\n\n\nclass VerifyVersionCommand(install):\n \"\"\"Custom command to verify that the git tag matches our version\"\"\"\n\n description = \"verify that the git tag matches our version\"\n\n def run(self):\n tag = os.getenv(\"CIRCLE_TAG\")\n\n if tag != VERSION:\n info = \"Git tag: {0} does not match the version of this app: {1}\".format(\n tag, VERSION\n )\n sys.exit(info)\n\n\nsetuptools.setup(\n name=NAME,\n version=VERSION,\n description=DESCRIPTION,\n long_description=LONG_DESCRIPTION,\n url=\"https://streamlit.io\",\n author=\"Streamlit Inc\",\n author_email=\"[email protected]\",\n python_requires=\">=3.6\",\n license=\"Apache 2\",\n packages=setuptools.find_packages(exclude=[\"tests\", \"tests.*\"]),\n # Requirements\n install_requires=requirements,\n zip_safe=False, # install source files not egg\n include_package_data=True, # copy html and friends\n entry_points={\"console_scripts\": [\"streamlit = streamlit.cli:main\"]},\n # For Windows so that streamlit * commands work ie.\n # - streamlit version\n # - streamlit hello\n scripts=[\"bin/streamlit.cmd\"],\n cmdclass={\n \"verify\": VerifyVersionCommand,\n },\n)\n", "path": "lib/setup.py"}]} | 1,260 | 91 |
gh_patches_debug_19874 | rasdani/github-patches | git_diff | saleor__saleor-10283 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
checkoutCreate mutation issue - { "code": "REQUIRED", "field": "country", "message": "This field cannot be blank." }
### What I'm trying to achieve
try to add checkout step with createCheckout mutation.
### Steps to reproduce the problem
<!-- Adding logs from the console, as well as query/response help us fix the bug faster -->
1. use docker-platform, deploy with docker compose, delete saleor folder and replace it by saleor folder with 3.1 branch clone
2. use playgraphl to test mutation request (checkoutCreate)
### What I expected to happen
i make test on saleor demo site : https://demo.saleor.io/graphql/
```bash
mutation CheckoutCreate {
checkoutCreate(
input: { channel: "default-channel", email: "[email protected]", lines: [] }
)
{ errors {
code
field
message
}
checkout {
id
token
created
}
}
}
```
result on : https://demo.saleor.io/graphql/
```bash
{
"data": {
"checkoutCreate": {
"errors": [],
"checkout": {
"id": "Q2hlY2tvdXQ6MDQ2MmQwMzQtZGJmYi00MTg1LWExZTMtMWUwYTU2YWMxYjJi",
"token": "0462d034-dbfb-4185-a1e3-1e0a56ac1b2b",
"created": "2021-09-17T13:17:33.994853+00:00"
}
}
}
}
```
# this is fine for me but ....
When i try the samething on my local machine (deploy with docker compose)
i get this:
```bash
{
"data": {
"checkoutCreate": {
"errors": [
{
"code": "REQUIRED",
"field": "country",
"message": "This field cannot be blank."
}
],
"checkout": null
}
}
}
```
i want to get checkoutID and token and the system ask me to add some country field.....
**System information**
<!-- Provide the version of Saleor or whether you're using it from the `master` branch. If using Saleor Dashboard or Storefront, provide their versions too. -->
Saleor version:
- [ ] dev (current master)
- [ X] 3.0
- [ ] 2.11
- [ ] 2.10
Operating system:
- [ ] Windows
- [ X] Linux
- [ ] MacOS
- [ ] Other
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `saleor/channel/migrations/0001_initial.py`
Content:
```
1 # Generated by Django 3.0.6 on 2020-06-16 07:54
2
3 from django.conf import settings
4 from django.db import migrations, models
5 from django.db.models.signals import post_migrate
6 from django.apps import apps as registry
7
8
9 def assing_permissions(apps, schema_editor):
10 def on_migrations_complete(sender=None, **kwargs):
11 Group = apps.get_model("auth", "Group")
12 Permission = apps.get_model("auth", "Permission")
13 ContentType = apps.get_model("contenttypes", "ContentType")
14
15 ct, _ = ContentType.objects.get_or_create(app_label="channel", model="channel")
16 manage_channels, _ = Permission.objects.get_or_create(
17 name="Manage channels.", content_type=ct, codename="manage_channels"
18 )
19
20 for group in Group.objects.iterator():
21 group.permissions.add(manage_channels)
22
23 sender = registry.get_app_config("channel")
24 post_migrate.connect(on_migrations_complete, weak=False, sender=sender)
25
26
27 def get_default_currency(Checkout, Order, Product, ShippingMethod, Voucher):
28 latest_product = Product.objects.order_by("-pk").first()
29 if latest_product:
30 return latest_product.currency
31 latest_voucher = Voucher.objects.order_by("-pk").first()
32 if latest_voucher:
33 return latest_voucher.currency
34 latest_shipping_method = ShippingMethod.objects.order_by("-pk").first()
35 if latest_shipping_method:
36 return latest_shipping_method.currency
37 latest_order = Order.objects.order_by("-pk").first()
38 if latest_order:
39 return latest_order.currency
40 latest_checkout = Checkout.objects.order_by("-pk").first()
41 if latest_checkout:
42 return latest_checkout.currency
43 return None
44
45
46 def create_default_channel(apps, schema_editor):
47 Channel = apps.get_model("channel", "Channel")
48 Checkout = apps.get_model("checkout", "Checkout")
49 Order = apps.get_model("order", "Order")
50 Product = apps.get_model("product", "Product")
51 ShippingMethod = apps.get_model("shipping", "ShippingMethod")
52 Voucher = apps.get_model("discount", "Voucher")
53
54 default_currency = get_default_currency(
55 Checkout, Order, Product, ShippingMethod, Voucher
56 )
57 if default_currency:
58 Channel.objects.create(
59 name="Default channel",
60 slug=settings.DEFAULT_CHANNEL_SLUG,
61 currency_code=default_currency,
62 is_active=True,
63 )
64
65
66 class Migration(migrations.Migration):
67
68 initial = True
69
70 dependencies = [
71 ("checkout", "0025_auto_20200221_0257"),
72 ("discount", "0019_auto_20200217_0350"),
73 ("order", "0084_auto_20200522_0522"),
74 ("product", "0118_populate_product_variant_price"),
75 ("shipping", "0018_default_zones_countries"),
76 ]
77
78 operations = [
79 migrations.CreateModel(
80 name="Channel",
81 fields=[
82 (
83 "id",
84 models.AutoField(
85 auto_created=True,
86 primary_key=True,
87 serialize=False,
88 verbose_name="ID",
89 ),
90 ),
91 ("name", models.CharField(max_length=250)),
92 ("slug", models.SlugField(max_length=255, unique=True)),
93 ("is_active", models.BooleanField(default=False)),
94 (
95 "currency_code",
96 models.CharField(max_length=settings.DEFAULT_CURRENCY_CODE_LENGTH),
97 ),
98 ],
99 options={
100 "ordering": ("slug",),
101 "permissions": (("manage_channels", "Manage channels."),),
102 },
103 ),
104 migrations.RunPython(create_default_channel, migrations.RunPython.noop),
105 migrations.RunPython(assing_permissions, migrations.RunPython.noop),
106 ]
107
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/saleor/channel/migrations/0001_initial.py b/saleor/channel/migrations/0001_initial.py
--- a/saleor/channel/migrations/0001_initial.py
+++ b/saleor/channel/migrations/0001_initial.py
@@ -1,5 +1,6 @@
# Generated by Django 3.0.6 on 2020-06-16 07:54
+import os
from django.conf import settings
from django.db import migrations, models
from django.db.models.signals import post_migrate
@@ -54,12 +55,14 @@
default_currency = get_default_currency(
Checkout, Order, Product, ShippingMethod, Voucher
)
+ default_country = os.environ.get("DEFAULT_COUNTRY", "US")
if default_currency:
Channel.objects.create(
name="Default channel",
slug=settings.DEFAULT_CHANNEL_SLUG,
currency_code=default_currency,
is_active=True,
+ default_country=default_country,
)
| {"golden_diff": "diff --git a/saleor/channel/migrations/0001_initial.py b/saleor/channel/migrations/0001_initial.py\n--- a/saleor/channel/migrations/0001_initial.py\n+++ b/saleor/channel/migrations/0001_initial.py\n@@ -1,5 +1,6 @@\n # Generated by Django 3.0.6 on 2020-06-16 07:54\n \n+import os\n from django.conf import settings\n from django.db import migrations, models\n from django.db.models.signals import post_migrate\n@@ -54,12 +55,14 @@\n default_currency = get_default_currency(\n Checkout, Order, Product, ShippingMethod, Voucher\n )\n+ default_country = os.environ.get(\"DEFAULT_COUNTRY\", \"US\")\n if default_currency:\n Channel.objects.create(\n name=\"Default channel\",\n slug=settings.DEFAULT_CHANNEL_SLUG,\n currency_code=default_currency,\n is_active=True,\n+ default_country=default_country,\n )\n", "issue": "checkoutCreate mutation issue - { \"code\": \"REQUIRED\", \"field\": \"country\", \"message\": \"This field cannot be blank.\" }\n### What I'm trying to achieve\r\ntry to add checkout step with createCheckout mutation.\r\n\r\n### Steps to reproduce the problem\r\n<!-- Adding logs from the console, as well as query/response help us fix the bug faster -->\r\n1. use docker-platform, deploy with docker compose, delete saleor folder and replace it by saleor folder with 3.1 branch clone\r\n2. use playgraphl to test mutation request (checkoutCreate)\r\n\r\n### What I expected to happen\r\ni make test on saleor demo site : https://demo.saleor.io/graphql/\r\n\r\n```bash\r\nmutation CheckoutCreate {\r\n checkoutCreate(\r\n input: { channel: \"default-channel\", email: \"[email protected]\", lines: [] }\r\n ) \r\n { errors {\r\n code\r\n field\r\n message\r\n }\r\n checkout {\r\n id\r\n token\r\n created\r\n \r\n }\r\n }\r\n}\r\n\r\n```\r\nresult on : https://demo.saleor.io/graphql/\r\n```bash\r\n{\r\n \"data\": {\r\n \"checkoutCreate\": {\r\n \"errors\": [],\r\n \"checkout\": {\r\n \"id\": \"Q2hlY2tvdXQ6MDQ2MmQwMzQtZGJmYi00MTg1LWExZTMtMWUwYTU2YWMxYjJi\",\r\n \"token\": \"0462d034-dbfb-4185-a1e3-1e0a56ac1b2b\",\r\n \"created\": \"2021-09-17T13:17:33.994853+00:00\"\r\n }\r\n }\r\n }\r\n}\r\n\r\n\r\n```\r\n\r\n# this is fine for me but ....\r\nWhen i try the samething on my local machine (deploy with docker compose)\r\ni get this:\r\n\r\n```bash\r\n\r\n{\r\n \"data\": {\r\n \"checkoutCreate\": {\r\n \"errors\": [\r\n {\r\n \"code\": \"REQUIRED\",\r\n \"field\": \"country\",\r\n \"message\": \"This field cannot be blank.\"\r\n }\r\n ],\r\n \"checkout\": null\r\n }\r\n }\r\n}\r\n\r\n```\r\ni want to get checkoutID and token and the system ask me to add some country field.....\r\n\r\n**System information**\r\n<!-- Provide the version of Saleor or whether you're using it from the `master` branch. If using Saleor Dashboard or Storefront, provide their versions too. -->\r\nSaleor version:\r\n- [ ] dev (current master)\r\n- [ X] 3.0\r\n- [ ] 2.11\r\n- [ ] 2.10\r\n\r\nOperating system:\r\n- [ ] Windows\r\n- [ X] Linux\r\n- [ ] MacOS\r\n- [ ] Other\r\n\n", "before_files": [{"content": "# Generated by Django 3.0.6 on 2020-06-16 07:54\n\nfrom django.conf import settings\nfrom django.db import migrations, models\nfrom django.db.models.signals import post_migrate\nfrom django.apps import apps as registry\n\n\ndef assing_permissions(apps, schema_editor):\n def on_migrations_complete(sender=None, **kwargs):\n Group = apps.get_model(\"auth\", \"Group\")\n Permission = apps.get_model(\"auth\", \"Permission\")\n ContentType = apps.get_model(\"contenttypes\", \"ContentType\")\n\n ct, _ = ContentType.objects.get_or_create(app_label=\"channel\", model=\"channel\")\n manage_channels, _ = Permission.objects.get_or_create(\n name=\"Manage channels.\", content_type=ct, codename=\"manage_channels\"\n )\n\n for group in Group.objects.iterator():\n group.permissions.add(manage_channels)\n\n sender = registry.get_app_config(\"channel\")\n post_migrate.connect(on_migrations_complete, weak=False, sender=sender)\n\n\ndef get_default_currency(Checkout, Order, Product, ShippingMethod, Voucher):\n latest_product = Product.objects.order_by(\"-pk\").first()\n if latest_product:\n return latest_product.currency\n latest_voucher = Voucher.objects.order_by(\"-pk\").first()\n if latest_voucher:\n return latest_voucher.currency\n latest_shipping_method = ShippingMethod.objects.order_by(\"-pk\").first()\n if latest_shipping_method:\n return latest_shipping_method.currency\n latest_order = Order.objects.order_by(\"-pk\").first()\n if latest_order:\n return latest_order.currency\n latest_checkout = Checkout.objects.order_by(\"-pk\").first()\n if latest_checkout:\n return latest_checkout.currency\n return None\n\n\ndef create_default_channel(apps, schema_editor):\n Channel = apps.get_model(\"channel\", \"Channel\")\n Checkout = apps.get_model(\"checkout\", \"Checkout\")\n Order = apps.get_model(\"order\", \"Order\")\n Product = apps.get_model(\"product\", \"Product\")\n ShippingMethod = apps.get_model(\"shipping\", \"ShippingMethod\")\n Voucher = apps.get_model(\"discount\", \"Voucher\")\n\n default_currency = get_default_currency(\n Checkout, Order, Product, ShippingMethod, Voucher\n )\n if default_currency:\n Channel.objects.create(\n name=\"Default channel\",\n slug=settings.DEFAULT_CHANNEL_SLUG,\n currency_code=default_currency,\n is_active=True,\n )\n\n\nclass Migration(migrations.Migration):\n\n initial = True\n\n dependencies = [\n (\"checkout\", \"0025_auto_20200221_0257\"),\n (\"discount\", \"0019_auto_20200217_0350\"),\n (\"order\", \"0084_auto_20200522_0522\"),\n (\"product\", \"0118_populate_product_variant_price\"),\n (\"shipping\", \"0018_default_zones_countries\"),\n ]\n\n operations = [\n migrations.CreateModel(\n name=\"Channel\",\n fields=[\n (\n \"id\",\n models.AutoField(\n auto_created=True,\n primary_key=True,\n serialize=False,\n verbose_name=\"ID\",\n ),\n ),\n (\"name\", models.CharField(max_length=250)),\n (\"slug\", models.SlugField(max_length=255, unique=True)),\n (\"is_active\", models.BooleanField(default=False)),\n (\n \"currency_code\",\n models.CharField(max_length=settings.DEFAULT_CURRENCY_CODE_LENGTH),\n ),\n ],\n options={\n \"ordering\": (\"slug\",),\n \"permissions\": ((\"manage_channels\", \"Manage channels.\"),),\n },\n ),\n migrations.RunPython(create_default_channel, migrations.RunPython.noop),\n migrations.RunPython(assing_permissions, migrations.RunPython.noop),\n ]\n", "path": "saleor/channel/migrations/0001_initial.py"}], "after_files": [{"content": "# Generated by Django 3.0.6 on 2020-06-16 07:54\n\nimport os\nfrom django.conf import settings\nfrom django.db import migrations, models\nfrom django.db.models.signals import post_migrate\nfrom django.apps import apps as registry\n\n\ndef assing_permissions(apps, schema_editor):\n def on_migrations_complete(sender=None, **kwargs):\n Group = apps.get_model(\"auth\", \"Group\")\n Permission = apps.get_model(\"auth\", \"Permission\")\n ContentType = apps.get_model(\"contenttypes\", \"ContentType\")\n\n ct, _ = ContentType.objects.get_or_create(app_label=\"channel\", model=\"channel\")\n manage_channels, _ = Permission.objects.get_or_create(\n name=\"Manage channels.\", content_type=ct, codename=\"manage_channels\"\n )\n\n for group in Group.objects.iterator():\n group.permissions.add(manage_channels)\n\n sender = registry.get_app_config(\"channel\")\n post_migrate.connect(on_migrations_complete, weak=False, sender=sender)\n\n\ndef get_default_currency(Checkout, Order, Product, ShippingMethod, Voucher):\n latest_product = Product.objects.order_by(\"-pk\").first()\n if latest_product:\n return latest_product.currency\n latest_voucher = Voucher.objects.order_by(\"-pk\").first()\n if latest_voucher:\n return latest_voucher.currency\n latest_shipping_method = ShippingMethod.objects.order_by(\"-pk\").first()\n if latest_shipping_method:\n return latest_shipping_method.currency\n latest_order = Order.objects.order_by(\"-pk\").first()\n if latest_order:\n return latest_order.currency\n latest_checkout = Checkout.objects.order_by(\"-pk\").first()\n if latest_checkout:\n return latest_checkout.currency\n return None\n\n\ndef create_default_channel(apps, schema_editor):\n Channel = apps.get_model(\"channel\", \"Channel\")\n Checkout = apps.get_model(\"checkout\", \"Checkout\")\n Order = apps.get_model(\"order\", \"Order\")\n Product = apps.get_model(\"product\", \"Product\")\n ShippingMethod = apps.get_model(\"shipping\", \"ShippingMethod\")\n Voucher = apps.get_model(\"discount\", \"Voucher\")\n\n default_currency = get_default_currency(\n Checkout, Order, Product, ShippingMethod, Voucher\n )\n default_country = os.environ.get(\"DEFAULT_COUNTRY\", \"US\")\n if default_currency:\n Channel.objects.create(\n name=\"Default channel\",\n slug=settings.DEFAULT_CHANNEL_SLUG,\n currency_code=default_currency,\n is_active=True,\n default_country=default_country,\n )\n\n\nclass Migration(migrations.Migration):\n\n initial = True\n\n dependencies = [\n (\"checkout\", \"0025_auto_20200221_0257\"),\n (\"discount\", \"0019_auto_20200217_0350\"),\n (\"order\", \"0084_auto_20200522_0522\"),\n (\"product\", \"0118_populate_product_variant_price\"),\n (\"shipping\", \"0018_default_zones_countries\"),\n ]\n\n operations = [\n migrations.CreateModel(\n name=\"Channel\",\n fields=[\n (\n \"id\",\n models.AutoField(\n auto_created=True,\n primary_key=True,\n serialize=False,\n verbose_name=\"ID\",\n ),\n ),\n (\"name\", models.CharField(max_length=250)),\n (\"slug\", models.SlugField(max_length=255, unique=True)),\n (\"is_active\", models.BooleanField(default=False)),\n (\n \"currency_code\",\n models.CharField(max_length=settings.DEFAULT_CURRENCY_CODE_LENGTH),\n ),\n ],\n options={\n \"ordering\": (\"slug\",),\n \"permissions\": ((\"manage_channels\", \"Manage channels.\"),),\n },\n ),\n migrations.RunPython(create_default_channel, migrations.RunPython.noop),\n migrations.RunPython(assing_permissions, migrations.RunPython.noop),\n ]\n", "path": "saleor/channel/migrations/0001_initial.py"}]} | 1,905 | 227 |
gh_patches_debug_37257 | rasdani/github-patches | git_diff | svthalia__concrexit-3722 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Lock admin panel behind 2FA
### What?
<!-- A clear and concise high-level description of what you want to happen. -->
lock the admin panel behind the 2FA functionality
### Why?
<!-- A clear and concise motivation why we should consider implementing this. -->
Admin panel has sensitive data so it should be protected. So requiring 2FA makes sense.
### How?
<!-- Optionally some guidance, ideas, context. -->
Probably nice to have a decorator to be able to lock other things of the site behind 2FA in the future.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `website/thaliawebsite/admin.py`
Content:
```
1 """Settings for the admin site."""
2 from django.contrib import admin
3 from django.utils.translation import gettext_lazy as _
4
5 admin.site.site_header = _("Thalia administration")
6 admin.site.site_title = _("Thalia")
7
```
Path: `website/thaliawebsite/views.py`
Content:
```
1 """General views for the website."""
2
3 from django.contrib.admin.views.decorators import staff_member_required
4 from django.contrib.auth.views import LogoutView as BaseLogoutView
5 from django.contrib.auth.views import PasswordResetView
6 from django.core.exceptions import PermissionDenied
7 from django.http import HttpResponse, HttpResponseForbidden
8 from django.shortcuts import redirect
9 from django.utils.decorators import method_decorator
10 from django.views.generic import ListView, TemplateView
11 from django.views.generic.base import View
12
13 from django_ratelimit.decorators import ratelimit
14 from two_factor.views import LoginView
15
16
17 class IndexView(TemplateView):
18 template_name = "index.html"
19
20
21 @method_decorator(staff_member_required, "dispatch")
22 class TestCrashView(View):
23 """Test view to intentionally crash to test the error handling."""
24
25 def dispatch(self, request, *args, **kwargs) -> HttpResponse:
26 if not request.user.is_superuser:
27 return HttpResponseForbidden("This is not for you")
28 raise Exception("Test exception")
29
30
31 class PagedView(ListView):
32 """A ListView with automatic pagination."""
33
34 def get_context_data(self, **kwargs) -> dict:
35 context = super().get_context_data(**kwargs)
36 page = context["page_obj"].number
37 paginator = context["paginator"]
38
39 # Show the two pages before and after the current page
40 page_range_start = max(1, page - 2)
41 page_range_stop = min(page + 3, paginator.num_pages + 1)
42
43 # Add extra pages if we show less than 5 pages
44 page_range_start = min(page_range_start, page_range_stop - 5)
45 page_range_start = max(1, page_range_start)
46
47 # Add extra pages if we still show less than 5 pages
48 page_range_stop = max(page_range_stop, page_range_start + 5)
49 page_range_stop = min(page_range_stop, paginator.num_pages + 1)
50
51 page_range = range(page_range_start, page_range_stop)
52
53 querydict = self.request.GET.copy()
54
55 if "page" in querydict:
56 del querydict["page"]
57
58 context.update(
59 {
60 "page_range": page_range,
61 "base_url": f"{self.request.path}?{querydict.urlencode()}&"
62 if querydict
63 else f"{self.request.path}?",
64 }
65 )
66
67 return context
68
69
70 class RateLimitedPasswordResetView(PasswordResetView):
71 @method_decorator(ratelimit(key="ip", rate="5/h"))
72 def post(self, request, *args, **kwargs):
73 return super().post(request, *args, **kwargs)
74
75
76 class RateLimitedLoginView(LoginView):
77 @method_decorator(ratelimit(key="ip", rate="30/h"))
78 @method_decorator(ratelimit(key="post:username", rate="30/h"))
79 def post(self, request, *args, **kwargs):
80 return super().post(request, *args, **kwargs)
81
82
83 class LogoutView(BaseLogoutView):
84 # Allow GET logout still (this was deprecated in Django 5.0).
85 http_method_names = ["get", "post", "options"]
86
87 def get(self, request, *args, **kwargs):
88 return self.post(request, *args, **kwargs)
89
90
91 def rate_limited_view(request, *args, **kwargs):
92 return HttpResponse("You are rate limited", status=429)
93
94
95 def admin_unauthorized_view(request):
96 if not request.member:
97 url = "/user/account/login"
98 args = request.META.get("QUERY_STRING", "")
99 if args:
100 url = f"{url}?{args}"
101 return redirect(url)
102 elif not request.member.is_staff and not request.member.is_superuser:
103 raise PermissionDenied("You are not allowed to access the administration page.")
104 else:
105 return redirect(request.GET.get("next", "/"))
106
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/website/thaliawebsite/admin.py b/website/thaliawebsite/admin.py
--- a/website/thaliawebsite/admin.py
+++ b/website/thaliawebsite/admin.py
@@ -1,6 +1,17 @@
"""Settings for the admin site."""
+
+from django.conf import settings
from django.contrib import admin
from django.utils.translation import gettext_lazy as _
-admin.site.site_header = _("Thalia administration")
-admin.site.site_title = _("Thalia")
+from django_otp import user_has_device
+
+
+class ThaliaAdminSite(admin.AdminSite):
+ site_header = _("Thalia administration")
+ site_title = _("Thalia")
+
+ def has_permission(self, request):
+ return super().has_permission(request) and (
+ settings.DEBUG or user_has_device(request.user)
+ )
diff --git a/website/thaliawebsite/views.py b/website/thaliawebsite/views.py
--- a/website/thaliawebsite/views.py
+++ b/website/thaliawebsite/views.py
@@ -1,5 +1,6 @@
"""General views for the website."""
+from django.contrib import messages
from django.contrib.admin.views.decorators import staff_member_required
from django.contrib.auth.views import LogoutView as BaseLogoutView
from django.contrib.auth.views import PasswordResetView
@@ -10,6 +11,7 @@
from django.views.generic import ListView, TemplateView
from django.views.generic.base import View
+from django_otp import user_has_device
from django_ratelimit.decorators import ratelimit
from two_factor.views import LoginView
@@ -58,9 +60,11 @@
context.update(
{
"page_range": page_range,
- "base_url": f"{self.request.path}?{querydict.urlencode()}&"
- if querydict
- else f"{self.request.path}?",
+ "base_url": (
+ f"{self.request.path}?{querydict.urlencode()}&"
+ if querydict
+ else f"{self.request.path}?"
+ ),
}
)
@@ -101,5 +105,11 @@
return redirect(url)
elif not request.member.is_staff and not request.member.is_superuser:
raise PermissionDenied("You are not allowed to access the administration page.")
+ elif not user_has_device(request.member):
+ messages.error(
+ request,
+ "You need to set up two-factor authentication to access the administration page.",
+ )
+ return redirect("two_factor:setup")
else:
return redirect(request.GET.get("next", "/"))
| {"golden_diff": "diff --git a/website/thaliawebsite/admin.py b/website/thaliawebsite/admin.py\n--- a/website/thaliawebsite/admin.py\n+++ b/website/thaliawebsite/admin.py\n@@ -1,6 +1,17 @@\n \"\"\"Settings for the admin site.\"\"\"\n+\n+from django.conf import settings\n from django.contrib import admin\n from django.utils.translation import gettext_lazy as _\n \n-admin.site.site_header = _(\"Thalia administration\")\n-admin.site.site_title = _(\"Thalia\")\n+from django_otp import user_has_device\n+\n+\n+class ThaliaAdminSite(admin.AdminSite):\n+ site_header = _(\"Thalia administration\")\n+ site_title = _(\"Thalia\")\n+\n+ def has_permission(self, request):\n+ return super().has_permission(request) and (\n+ settings.DEBUG or user_has_device(request.user)\n+ )\ndiff --git a/website/thaliawebsite/views.py b/website/thaliawebsite/views.py\n--- a/website/thaliawebsite/views.py\n+++ b/website/thaliawebsite/views.py\n@@ -1,5 +1,6 @@\n \"\"\"General views for the website.\"\"\"\n \n+from django.contrib import messages\n from django.contrib.admin.views.decorators import staff_member_required\n from django.contrib.auth.views import LogoutView as BaseLogoutView\n from django.contrib.auth.views import PasswordResetView\n@@ -10,6 +11,7 @@\n from django.views.generic import ListView, TemplateView\n from django.views.generic.base import View\n \n+from django_otp import user_has_device\n from django_ratelimit.decorators import ratelimit\n from two_factor.views import LoginView\n \n@@ -58,9 +60,11 @@\n context.update(\n {\n \"page_range\": page_range,\n- \"base_url\": f\"{self.request.path}?{querydict.urlencode()}&\"\n- if querydict\n- else f\"{self.request.path}?\",\n+ \"base_url\": (\n+ f\"{self.request.path}?{querydict.urlencode()}&\"\n+ if querydict\n+ else f\"{self.request.path}?\"\n+ ),\n }\n )\n \n@@ -101,5 +105,11 @@\n return redirect(url)\n elif not request.member.is_staff and not request.member.is_superuser:\n raise PermissionDenied(\"You are not allowed to access the administration page.\")\n+ elif not user_has_device(request.member):\n+ messages.error(\n+ request,\n+ \"You need to set up two-factor authentication to access the administration page.\",\n+ )\n+ return redirect(\"two_factor:setup\")\n else:\n return redirect(request.GET.get(\"next\", \"/\"))\n", "issue": "Lock admin panel behind 2FA\n### What?\r\n<!-- A clear and concise high-level description of what you want to happen. -->\r\nlock the admin panel behind the 2FA functionality\r\n\r\n### Why?\r\n<!-- A clear and concise motivation why we should consider implementing this. -->\r\nAdmin panel has sensitive data so it should be protected. So requiring 2FA makes sense.\r\n\r\n### How?\r\n<!-- Optionally some guidance, ideas, context. -->\r\nProbably nice to have a decorator to be able to lock other things of the site behind 2FA in the future.\r\n\r\n\n", "before_files": [{"content": "\"\"\"Settings for the admin site.\"\"\"\nfrom django.contrib import admin\nfrom django.utils.translation import gettext_lazy as _\n\nadmin.site.site_header = _(\"Thalia administration\")\nadmin.site.site_title = _(\"Thalia\")\n", "path": "website/thaliawebsite/admin.py"}, {"content": "\"\"\"General views for the website.\"\"\"\n\nfrom django.contrib.admin.views.decorators import staff_member_required\nfrom django.contrib.auth.views import LogoutView as BaseLogoutView\nfrom django.contrib.auth.views import PasswordResetView\nfrom django.core.exceptions import PermissionDenied\nfrom django.http import HttpResponse, HttpResponseForbidden\nfrom django.shortcuts import redirect\nfrom django.utils.decorators import method_decorator\nfrom django.views.generic import ListView, TemplateView\nfrom django.views.generic.base import View\n\nfrom django_ratelimit.decorators import ratelimit\nfrom two_factor.views import LoginView\n\n\nclass IndexView(TemplateView):\n template_name = \"index.html\"\n\n\n@method_decorator(staff_member_required, \"dispatch\")\nclass TestCrashView(View):\n \"\"\"Test view to intentionally crash to test the error handling.\"\"\"\n\n def dispatch(self, request, *args, **kwargs) -> HttpResponse:\n if not request.user.is_superuser:\n return HttpResponseForbidden(\"This is not for you\")\n raise Exception(\"Test exception\")\n\n\nclass PagedView(ListView):\n \"\"\"A ListView with automatic pagination.\"\"\"\n\n def get_context_data(self, **kwargs) -> dict:\n context = super().get_context_data(**kwargs)\n page = context[\"page_obj\"].number\n paginator = context[\"paginator\"]\n\n # Show the two pages before and after the current page\n page_range_start = max(1, page - 2)\n page_range_stop = min(page + 3, paginator.num_pages + 1)\n\n # Add extra pages if we show less than 5 pages\n page_range_start = min(page_range_start, page_range_stop - 5)\n page_range_start = max(1, page_range_start)\n\n # Add extra pages if we still show less than 5 pages\n page_range_stop = max(page_range_stop, page_range_start + 5)\n page_range_stop = min(page_range_stop, paginator.num_pages + 1)\n\n page_range = range(page_range_start, page_range_stop)\n\n querydict = self.request.GET.copy()\n\n if \"page\" in querydict:\n del querydict[\"page\"]\n\n context.update(\n {\n \"page_range\": page_range,\n \"base_url\": f\"{self.request.path}?{querydict.urlencode()}&\"\n if querydict\n else f\"{self.request.path}?\",\n }\n )\n\n return context\n\n\nclass RateLimitedPasswordResetView(PasswordResetView):\n @method_decorator(ratelimit(key=\"ip\", rate=\"5/h\"))\n def post(self, request, *args, **kwargs):\n return super().post(request, *args, **kwargs)\n\n\nclass RateLimitedLoginView(LoginView):\n @method_decorator(ratelimit(key=\"ip\", rate=\"30/h\"))\n @method_decorator(ratelimit(key=\"post:username\", rate=\"30/h\"))\n def post(self, request, *args, **kwargs):\n return super().post(request, *args, **kwargs)\n\n\nclass LogoutView(BaseLogoutView):\n # Allow GET logout still (this was deprecated in Django 5.0).\n http_method_names = [\"get\", \"post\", \"options\"]\n\n def get(self, request, *args, **kwargs):\n return self.post(request, *args, **kwargs)\n\n\ndef rate_limited_view(request, *args, **kwargs):\n return HttpResponse(\"You are rate limited\", status=429)\n\n\ndef admin_unauthorized_view(request):\n if not request.member:\n url = \"/user/account/login\"\n args = request.META.get(\"QUERY_STRING\", \"\")\n if args:\n url = f\"{url}?{args}\"\n return redirect(url)\n elif not request.member.is_staff and not request.member.is_superuser:\n raise PermissionDenied(\"You are not allowed to access the administration page.\")\n else:\n return redirect(request.GET.get(\"next\", \"/\"))\n", "path": "website/thaliawebsite/views.py"}], "after_files": [{"content": "\"\"\"Settings for the admin site.\"\"\"\n\nfrom django.conf import settings\nfrom django.contrib import admin\nfrom django.utils.translation import gettext_lazy as _\n\nfrom django_otp import user_has_device\n\n\nclass ThaliaAdminSite(admin.AdminSite):\n site_header = _(\"Thalia administration\")\n site_title = _(\"Thalia\")\n\n def has_permission(self, request):\n return super().has_permission(request) and (\n settings.DEBUG or user_has_device(request.user)\n )\n", "path": "website/thaliawebsite/admin.py"}, {"content": "\"\"\"General views for the website.\"\"\"\n\nfrom django.contrib import messages\nfrom django.contrib.admin.views.decorators import staff_member_required\nfrom django.contrib.auth.views import LogoutView as BaseLogoutView\nfrom django.contrib.auth.views import PasswordResetView\nfrom django.core.exceptions import PermissionDenied\nfrom django.http import HttpResponse, HttpResponseForbidden\nfrom django.shortcuts import redirect\nfrom django.utils.decorators import method_decorator\nfrom django.views.generic import ListView, TemplateView\nfrom django.views.generic.base import View\n\nfrom django_otp import user_has_device\nfrom django_ratelimit.decorators import ratelimit\nfrom two_factor.views import LoginView\n\n\nclass IndexView(TemplateView):\n template_name = \"index.html\"\n\n\n@method_decorator(staff_member_required, \"dispatch\")\nclass TestCrashView(View):\n \"\"\"Test view to intentionally crash to test the error handling.\"\"\"\n\n def dispatch(self, request, *args, **kwargs) -> HttpResponse:\n if not request.user.is_superuser:\n return HttpResponseForbidden(\"This is not for you\")\n raise Exception(\"Test exception\")\n\n\nclass PagedView(ListView):\n \"\"\"A ListView with automatic pagination.\"\"\"\n\n def get_context_data(self, **kwargs) -> dict:\n context = super().get_context_data(**kwargs)\n page = context[\"page_obj\"].number\n paginator = context[\"paginator\"]\n\n # Show the two pages before and after the current page\n page_range_start = max(1, page - 2)\n page_range_stop = min(page + 3, paginator.num_pages + 1)\n\n # Add extra pages if we show less than 5 pages\n page_range_start = min(page_range_start, page_range_stop - 5)\n page_range_start = max(1, page_range_start)\n\n # Add extra pages if we still show less than 5 pages\n page_range_stop = max(page_range_stop, page_range_start + 5)\n page_range_stop = min(page_range_stop, paginator.num_pages + 1)\n\n page_range = range(page_range_start, page_range_stop)\n\n querydict = self.request.GET.copy()\n\n if \"page\" in querydict:\n del querydict[\"page\"]\n\n context.update(\n {\n \"page_range\": page_range,\n \"base_url\": (\n f\"{self.request.path}?{querydict.urlencode()}&\"\n if querydict\n else f\"{self.request.path}?\"\n ),\n }\n )\n\n return context\n\n\nclass RateLimitedPasswordResetView(PasswordResetView):\n @method_decorator(ratelimit(key=\"ip\", rate=\"5/h\"))\n def post(self, request, *args, **kwargs):\n return super().post(request, *args, **kwargs)\n\n\nclass RateLimitedLoginView(LoginView):\n @method_decorator(ratelimit(key=\"ip\", rate=\"30/h\"))\n @method_decorator(ratelimit(key=\"post:username\", rate=\"30/h\"))\n def post(self, request, *args, **kwargs):\n return super().post(request, *args, **kwargs)\n\n\nclass LogoutView(BaseLogoutView):\n # Allow GET logout still (this was deprecated in Django 5.0).\n http_method_names = [\"get\", \"post\", \"options\"]\n\n def get(self, request, *args, **kwargs):\n return self.post(request, *args, **kwargs)\n\n\ndef rate_limited_view(request, *args, **kwargs):\n return HttpResponse(\"You are rate limited\", status=429)\n\n\ndef admin_unauthorized_view(request):\n if not request.member:\n url = \"/user/account/login\"\n args = request.META.get(\"QUERY_STRING\", \"\")\n if args:\n url = f\"{url}?{args}\"\n return redirect(url)\n elif not request.member.is_staff and not request.member.is_superuser:\n raise PermissionDenied(\"You are not allowed to access the administration page.\")\n elif not user_has_device(request.member):\n messages.error(\n request,\n \"You need to set up two-factor authentication to access the administration page.\",\n )\n return redirect(\"two_factor:setup\")\n else:\n return redirect(request.GET.get(\"next\", \"/\"))\n", "path": "website/thaliawebsite/views.py"}]} | 1,473 | 565 |
gh_patches_debug_39220 | rasdani/github-patches | git_diff | Cog-Creators__Red-DiscordBot-1485 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[V3 Image] [p]imgur search errors out
### Type:
- [ ] Suggestion
- [x] Bug
### Brief description of the problem
`[p]imgur search` produces an error
### Expected behavior
It should give links to images
### Actual behavior
`Error in command 'imgur search'. Check your console or logs for details.`
### Steps to reproduce
1. do `[p]imgur search cats`
2. get error
Traceback:
```py
Exception in command 'imgur search'
Traceback (most recent call last):
File "/home/palm/redv3/lib/python3.5/site-packages/discord/ext/commands/core.py", line 62, in wrapped
ret = yield from coro(*args, **kwargs)
File "/home/palm/redv3/lib/python3.5/site-packages/redbot/cogs/image/image.py", line 47, in imgur_search
data = await search_get.json()
File "/home/palm/redv3/lib/python3.5/site-packages/aiohttp/client_reqrep.py", line 730, in json
headers=self.headers)
aiohttp.client_exceptions.ClientResponseError: 0, message='Attempt to decode JSON with unexpected mimetype: text/html; charset=utf-8'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `redbot/cogs/image/image.py`
Content:
```
1 from random import shuffle
2
3 import aiohttp
4 from discord.ext import commands
5
6 from redbot.core.i18n import CogI18n
7 from redbot.core import checks, Config
8
9 _ = CogI18n("Image", __file__)
10
11 GIPHY_API_KEY = "dc6zaTOxFJmzC"
12
13
14 class Image:
15 """Image related commands."""
16 default_global = {
17 "imgur_client_id": None
18 }
19
20 def __init__(self, bot):
21 self.bot = bot
22 self.settings = Config.get_conf(self, identifier=2652104208, force_registration=True)
23 self.settings.register_global(**self.default_global)
24 self.session = aiohttp.ClientSession()
25 self.imgur_base_url = "https://api.imgur.com/3/"
26
27 def __unload(self):
28 self.session.close()
29
30 @commands.group(name="imgur")
31 @commands.guild_only()
32 async def _imgur(self, ctx):
33 """Retrieves pictures from imgur
34
35 Make sure to set the client ID using
36 [p]imgurcreds"""
37 if ctx.invoked_subcommand is None:
38 await ctx.send_help()
39
40 @_imgur.command(name="search")
41 async def imgur_search(self, ctx, *, term: str):
42 """Searches Imgur for the specified term and returns up to 3 results"""
43 url = self.imgur_base_url + "time/all/0"
44 params = {"q": term}
45 headers = {"Authorization": "Client-ID {}".format(await self.settings.imgur_client_id())}
46 async with self.session.get(url, headers=headers, data=params) as search_get:
47 data = await search_get.json()
48
49 if data["success"]:
50 results = data["data"]
51 if not results:
52 await ctx.send(_("Your search returned no results"))
53 return
54 shuffle(results)
55 msg = _("Search results...\n")
56 for r in results[:3]:
57 msg += r["gifv"] if "gifv" in r else r["link"]
58 msg += "\n"
59 await ctx.send(msg)
60 else:
61 await ctx.send(_("Something went wrong. Error code is {}").format(data["status"]))
62
63 @_imgur.command(name="subreddit")
64 async def imgur_subreddit(self, ctx, subreddit: str, sort_type: str="top", window: str="day"):
65 """Gets images from the specified subreddit section
66
67 Sort types: new, top
68 Time windows: day, week, month, year, all"""
69 sort_type = sort_type.lower()
70 window = window.lower()
71
72 if sort_type not in ("new", "top"):
73 await ctx.send(_("Only 'new' and 'top' are a valid sort type."))
74 return
75 elif window not in ("day", "week", "month", "year", "all"):
76 await ctx.send_help()
77 return
78
79 if sort_type == "new":
80 sort = "time"
81 elif sort_type == "top":
82 sort = "top"
83
84 links = []
85 headers = {"Authorization": "Client-ID {}".format(await self.settings.imgur_client_id())}
86 url = self.imgur_base_url + "r/{}/{}/{}/0".format(subreddit, sort, window)
87
88 async with self.session.get(url, headers=headers) as sub_get:
89 data = await sub_get.json()
90
91 if data["success"]:
92 items = data["data"]
93 if items:
94 for item in items[:3]:
95 link = item["gifv"] if "gifv" in item else item["link"]
96 links.append("{}\n{}".format(item["title"], link))
97
98 if links:
99 await ctx.send("\n".join(links))
100 else:
101 await ctx.send(_("No results found."))
102 else:
103 await ctx.send(_("Something went wrong. Error code is {}").format(data["status"]))
104
105 @checks.is_owner()
106 @commands.command()
107 async def imgurcreds(self, ctx, imgur_client_id: str):
108 """Sets the imgur client id
109 You will need an account on Imgur to get this
110
111 You can get these by visiting https://api.imgur.com/oauth2/addclient
112 and filling out the form. Enter a name for the application, select
113 'Anonymous usage without user authorization' for the auth type,
114 leave the app website blank, enter a valid email address, and
115 enter a description. Check the box for the captcha, then click Next.
116 Your client ID will be on the page that loads"""
117 await self.settings.imgur_client_id.set(imgur_client_id)
118 await ctx.send(_("Set the imgur client id!"))
119
120 @commands.command(pass_context=True, no_pm=True)
121 async def gif(self, ctx, *keywords):
122 """Retrieves first search result from giphy"""
123 if keywords:
124 keywords = "+".join(keywords)
125 else:
126 await ctx.send_help()
127 return
128
129 url = ("http://api.giphy.com/v1/gifs/search?&api_key={}&q={}"
130 "".format(GIPHY_API_KEY, keywords))
131
132 async with self.session.get(url) as r:
133 result = await r.json()
134 if r.status == 200:
135 if result["data"]:
136 await ctx.send(result["data"][0]["url"])
137 else:
138 await ctx.send(_("No results found."))
139 else:
140 await ctx.send(_("Error contacting the API"))
141
142 @commands.command(pass_context=True, no_pm=True)
143 async def gifr(self, ctx, *keywords):
144 """Retrieves a random gif from a giphy search"""
145 if keywords:
146 keywords = "+".join(keywords)
147 else:
148 await ctx.send_help()
149 return
150
151 url = ("http://api.giphy.com/v1/gifs/random?&api_key={}&tag={}"
152 "".format(GIPHY_API_KEY, keywords))
153
154 async with self.session.get(url) as r:
155 result = await r.json()
156 if r.status == 200:
157 if result["data"]:
158 await ctx.send(result["data"]["url"])
159 else:
160 await ctx.send(_("No results found."))
161 else:
162 await ctx.send(_("Error contacting the API"))
163
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/redbot/cogs/image/image.py b/redbot/cogs/image/image.py
--- a/redbot/cogs/image/image.py
+++ b/redbot/cogs/image/image.py
@@ -28,7 +28,6 @@
self.session.close()
@commands.group(name="imgur")
- @commands.guild_only()
async def _imgur(self, ctx):
"""Retrieves pictures from imgur
@@ -40,10 +39,16 @@
@_imgur.command(name="search")
async def imgur_search(self, ctx, *, term: str):
"""Searches Imgur for the specified term and returns up to 3 results"""
- url = self.imgur_base_url + "time/all/0"
+ url = self.imgur_base_url + "gallery/search/time/all/0"
params = {"q": term}
- headers = {"Authorization": "Client-ID {}".format(await self.settings.imgur_client_id())}
- async with self.session.get(url, headers=headers, data=params) as search_get:
+ imgur_client_id = await self.settings.imgur_client_id()
+ if not imgur_client_id:
+ await ctx.send(
+ _("A client ID has not been set! Please set one with {}").format(
+ "`{}imgurcreds`".format(ctx.prefix)))
+ return
+ headers = {"Authorization": "Client-ID {}".format(imgur_client_id)}
+ async with self.session.get(url, headers=headers, params=params) as search_get:
data = await search_get.json()
if data["success"]:
@@ -81,9 +86,16 @@
elif sort_type == "top":
sort = "top"
+ imgur_client_id = await self.settings.imgur_client_id()
+ if not imgur_client_id:
+ await ctx.send(
+ _("A client ID has not been set! Please set one with {}").format(
+ "`{}imgurcreds`".format(ctx.prefix)))
+ return
+
links = []
- headers = {"Authorization": "Client-ID {}".format(await self.settings.imgur_client_id())}
- url = self.imgur_base_url + "r/{}/{}/{}/0".format(subreddit, sort, window)
+ headers = {"Authorization": "Client-ID {}".format(imgur_client_id)}
+ url = self.imgur_base_url + "gallery/r/{}/{}/{}/0".format(subreddit, sort, window)
async with self.session.get(url, headers=headers) as sub_get:
data = await sub_get.json()
@@ -111,6 +123,7 @@
You can get these by visiting https://api.imgur.com/oauth2/addclient
and filling out the form. Enter a name for the application, select
'Anonymous usage without user authorization' for the auth type,
+ set the authorization callback url to 'https://localhost'
leave the app website blank, enter a valid email address, and
enter a description. Check the box for the captcha, then click Next.
Your client ID will be on the page that loads"""
| {"golden_diff": "diff --git a/redbot/cogs/image/image.py b/redbot/cogs/image/image.py\n--- a/redbot/cogs/image/image.py\n+++ b/redbot/cogs/image/image.py\n@@ -28,7 +28,6 @@\n self.session.close()\n \n @commands.group(name=\"imgur\")\n- @commands.guild_only()\n async def _imgur(self, ctx):\n \"\"\"Retrieves pictures from imgur\n \n@@ -40,10 +39,16 @@\n @_imgur.command(name=\"search\")\n async def imgur_search(self, ctx, *, term: str):\n \"\"\"Searches Imgur for the specified term and returns up to 3 results\"\"\"\n- url = self.imgur_base_url + \"time/all/0\"\n+ url = self.imgur_base_url + \"gallery/search/time/all/0\"\n params = {\"q\": term}\n- headers = {\"Authorization\": \"Client-ID {}\".format(await self.settings.imgur_client_id())}\n- async with self.session.get(url, headers=headers, data=params) as search_get:\n+ imgur_client_id = await self.settings.imgur_client_id()\n+ if not imgur_client_id:\n+ await ctx.send(\n+ _(\"A client ID has not been set! Please set one with {}\").format(\n+ \"`{}imgurcreds`\".format(ctx.prefix)))\n+ return\n+ headers = {\"Authorization\": \"Client-ID {}\".format(imgur_client_id)}\n+ async with self.session.get(url, headers=headers, params=params) as search_get:\n data = await search_get.json()\n \n if data[\"success\"]:\n@@ -81,9 +86,16 @@\n elif sort_type == \"top\":\n sort = \"top\"\n \n+ imgur_client_id = await self.settings.imgur_client_id()\n+ if not imgur_client_id:\n+ await ctx.send(\n+ _(\"A client ID has not been set! Please set one with {}\").format(\n+ \"`{}imgurcreds`\".format(ctx.prefix)))\n+ return\n+\n links = []\n- headers = {\"Authorization\": \"Client-ID {}\".format(await self.settings.imgur_client_id())}\n- url = self.imgur_base_url + \"r/{}/{}/{}/0\".format(subreddit, sort, window)\n+ headers = {\"Authorization\": \"Client-ID {}\".format(imgur_client_id)}\n+ url = self.imgur_base_url + \"gallery/r/{}/{}/{}/0\".format(subreddit, sort, window)\n \n async with self.session.get(url, headers=headers) as sub_get:\n data = await sub_get.json()\n@@ -111,6 +123,7 @@\n You can get these by visiting https://api.imgur.com/oauth2/addclient\n and filling out the form. Enter a name for the application, select\n 'Anonymous usage without user authorization' for the auth type,\n+ set the authorization callback url to 'https://localhost'\n leave the app website blank, enter a valid email address, and\n enter a description. Check the box for the captcha, then click Next.\n Your client ID will be on the page that loads\"\"\"\n", "issue": "[V3 Image] [p]imgur search errors out\n### Type:\r\n\r\n- [ ] Suggestion\r\n- [x] Bug\r\n\r\n### Brief description of the problem\r\n`[p]imgur search` produces an error\r\n### Expected behavior\r\nIt should give links to images\r\n### Actual behavior\r\n`Error in command 'imgur search'. Check your console or logs for details.`\r\n### Steps to reproduce\r\n\r\n1. do `[p]imgur search cats`\r\n2. get error\r\n\r\nTraceback:\r\n\r\n```py\r\nException in command 'imgur search'\r\nTraceback (most recent call last):\r\n File \"/home/palm/redv3/lib/python3.5/site-packages/discord/ext/commands/core.py\", line 62, in wrapped\r\n ret = yield from coro(*args, **kwargs)\r\n File \"/home/palm/redv3/lib/python3.5/site-packages/redbot/cogs/image/image.py\", line 47, in imgur_search\r\n data = await search_get.json()\r\n File \"/home/palm/redv3/lib/python3.5/site-packages/aiohttp/client_reqrep.py\", line 730, in json\r\n headers=self.headers)\r\naiohttp.client_exceptions.ClientResponseError: 0, message='Attempt to decode JSON with unexpected mimetype: text/html; charset=utf-8'\r\n```\n", "before_files": [{"content": "from random import shuffle\n\nimport aiohttp\nfrom discord.ext import commands\n\nfrom redbot.core.i18n import CogI18n\nfrom redbot.core import checks, Config\n\n_ = CogI18n(\"Image\", __file__)\n\nGIPHY_API_KEY = \"dc6zaTOxFJmzC\"\n\n\nclass Image:\n \"\"\"Image related commands.\"\"\"\n default_global = {\n \"imgur_client_id\": None\n }\n\n def __init__(self, bot):\n self.bot = bot\n self.settings = Config.get_conf(self, identifier=2652104208, force_registration=True)\n self.settings.register_global(**self.default_global)\n self.session = aiohttp.ClientSession()\n self.imgur_base_url = \"https://api.imgur.com/3/\"\n\n def __unload(self):\n self.session.close()\n\n @commands.group(name=\"imgur\")\n @commands.guild_only()\n async def _imgur(self, ctx):\n \"\"\"Retrieves pictures from imgur\n\n Make sure to set the client ID using\n [p]imgurcreds\"\"\"\n if ctx.invoked_subcommand is None:\n await ctx.send_help()\n\n @_imgur.command(name=\"search\")\n async def imgur_search(self, ctx, *, term: str):\n \"\"\"Searches Imgur for the specified term and returns up to 3 results\"\"\"\n url = self.imgur_base_url + \"time/all/0\"\n params = {\"q\": term}\n headers = {\"Authorization\": \"Client-ID {}\".format(await self.settings.imgur_client_id())}\n async with self.session.get(url, headers=headers, data=params) as search_get:\n data = await search_get.json()\n\n if data[\"success\"]:\n results = data[\"data\"]\n if not results:\n await ctx.send(_(\"Your search returned no results\"))\n return\n shuffle(results)\n msg = _(\"Search results...\\n\")\n for r in results[:3]:\n msg += r[\"gifv\"] if \"gifv\" in r else r[\"link\"]\n msg += \"\\n\"\n await ctx.send(msg)\n else:\n await ctx.send(_(\"Something went wrong. Error code is {}\").format(data[\"status\"]))\n\n @_imgur.command(name=\"subreddit\")\n async def imgur_subreddit(self, ctx, subreddit: str, sort_type: str=\"top\", window: str=\"day\"):\n \"\"\"Gets images from the specified subreddit section\n\n Sort types: new, top\n Time windows: day, week, month, year, all\"\"\"\n sort_type = sort_type.lower()\n window = window.lower()\n\n if sort_type not in (\"new\", \"top\"):\n await ctx.send(_(\"Only 'new' and 'top' are a valid sort type.\"))\n return\n elif window not in (\"day\", \"week\", \"month\", \"year\", \"all\"):\n await ctx.send_help()\n return\n\n if sort_type == \"new\":\n sort = \"time\"\n elif sort_type == \"top\":\n sort = \"top\"\n\n links = []\n headers = {\"Authorization\": \"Client-ID {}\".format(await self.settings.imgur_client_id())}\n url = self.imgur_base_url + \"r/{}/{}/{}/0\".format(subreddit, sort, window)\n\n async with self.session.get(url, headers=headers) as sub_get:\n data = await sub_get.json()\n\n if data[\"success\"]:\n items = data[\"data\"]\n if items:\n for item in items[:3]:\n link = item[\"gifv\"] if \"gifv\" in item else item[\"link\"]\n links.append(\"{}\\n{}\".format(item[\"title\"], link))\n\n if links:\n await ctx.send(\"\\n\".join(links))\n else:\n await ctx.send(_(\"No results found.\"))\n else:\n await ctx.send(_(\"Something went wrong. Error code is {}\").format(data[\"status\"]))\n\n @checks.is_owner()\n @commands.command()\n async def imgurcreds(self, ctx, imgur_client_id: str):\n \"\"\"Sets the imgur client id\n You will need an account on Imgur to get this\n\n You can get these by visiting https://api.imgur.com/oauth2/addclient\n and filling out the form. Enter a name for the application, select\n 'Anonymous usage without user authorization' for the auth type,\n leave the app website blank, enter a valid email address, and\n enter a description. Check the box for the captcha, then click Next.\n Your client ID will be on the page that loads\"\"\"\n await self.settings.imgur_client_id.set(imgur_client_id)\n await ctx.send(_(\"Set the imgur client id!\"))\n\n @commands.command(pass_context=True, no_pm=True)\n async def gif(self, ctx, *keywords):\n \"\"\"Retrieves first search result from giphy\"\"\"\n if keywords:\n keywords = \"+\".join(keywords)\n else:\n await ctx.send_help()\n return\n\n url = (\"http://api.giphy.com/v1/gifs/search?&api_key={}&q={}\"\n \"\".format(GIPHY_API_KEY, keywords))\n\n async with self.session.get(url) as r:\n result = await r.json()\n if r.status == 200:\n if result[\"data\"]:\n await ctx.send(result[\"data\"][0][\"url\"])\n else:\n await ctx.send(_(\"No results found.\"))\n else:\n await ctx.send(_(\"Error contacting the API\"))\n\n @commands.command(pass_context=True, no_pm=True)\n async def gifr(self, ctx, *keywords):\n \"\"\"Retrieves a random gif from a giphy search\"\"\"\n if keywords:\n keywords = \"+\".join(keywords)\n else:\n await ctx.send_help()\n return\n\n url = (\"http://api.giphy.com/v1/gifs/random?&api_key={}&tag={}\"\n \"\".format(GIPHY_API_KEY, keywords))\n\n async with self.session.get(url) as r:\n result = await r.json()\n if r.status == 200:\n if result[\"data\"]:\n await ctx.send(result[\"data\"][\"url\"])\n else:\n await ctx.send(_(\"No results found.\"))\n else:\n await ctx.send(_(\"Error contacting the API\"))\n", "path": "redbot/cogs/image/image.py"}], "after_files": [{"content": "from random import shuffle\n\nimport aiohttp\nfrom discord.ext import commands\n\nfrom redbot.core.i18n import CogI18n\nfrom redbot.core import checks, Config\n\n_ = CogI18n(\"Image\", __file__)\n\nGIPHY_API_KEY = \"dc6zaTOxFJmzC\"\n\n\nclass Image:\n \"\"\"Image related commands.\"\"\"\n default_global = {\n \"imgur_client_id\": None\n }\n\n def __init__(self, bot):\n self.bot = bot\n self.settings = Config.get_conf(self, identifier=2652104208, force_registration=True)\n self.settings.register_global(**self.default_global)\n self.session = aiohttp.ClientSession()\n self.imgur_base_url = \"https://api.imgur.com/3/\"\n\n def __unload(self):\n self.session.close()\n\n @commands.group(name=\"imgur\")\n async def _imgur(self, ctx):\n \"\"\"Retrieves pictures from imgur\n\n Make sure to set the client ID using\n [p]imgurcreds\"\"\"\n if ctx.invoked_subcommand is None:\n await ctx.send_help()\n\n @_imgur.command(name=\"search\")\n async def imgur_search(self, ctx, *, term: str):\n \"\"\"Searches Imgur for the specified term and returns up to 3 results\"\"\"\n url = self.imgur_base_url + \"gallery/search/time/all/0\"\n params = {\"q\": term}\n imgur_client_id = await self.settings.imgur_client_id()\n if not imgur_client_id:\n await ctx.send(\n _(\"A client ID has not been set! Please set one with {}\").format(\n \"`{}imgurcreds`\".format(ctx.prefix)))\n return\n headers = {\"Authorization\": \"Client-ID {}\".format(imgur_client_id)}\n async with self.session.get(url, headers=headers, params=params) as search_get:\n data = await search_get.json()\n\n if data[\"success\"]:\n results = data[\"data\"]\n if not results:\n await ctx.send(_(\"Your search returned no results\"))\n return\n shuffle(results)\n msg = _(\"Search results...\\n\")\n for r in results[:3]:\n msg += r[\"gifv\"] if \"gifv\" in r else r[\"link\"]\n msg += \"\\n\"\n await ctx.send(msg)\n else:\n await ctx.send(_(\"Something went wrong. Error code is {}\").format(data[\"status\"]))\n\n @_imgur.command(name=\"subreddit\")\n async def imgur_subreddit(self, ctx, subreddit: str, sort_type: str=\"top\", window: str=\"day\"):\n \"\"\"Gets images from the specified subreddit section\n\n Sort types: new, top\n Time windows: day, week, month, year, all\"\"\"\n sort_type = sort_type.lower()\n window = window.lower()\n\n if sort_type not in (\"new\", \"top\"):\n await ctx.send(_(\"Only 'new' and 'top' are a valid sort type.\"))\n return\n elif window not in (\"day\", \"week\", \"month\", \"year\", \"all\"):\n await ctx.send_help()\n return\n\n if sort_type == \"new\":\n sort = \"time\"\n elif sort_type == \"top\":\n sort = \"top\"\n\n imgur_client_id = await self.settings.imgur_client_id()\n if not imgur_client_id:\n await ctx.send(\n _(\"A client ID has not been set! Please set one with {}\").format(\n \"`{}imgurcreds`\".format(ctx.prefix)))\n return\n\n links = []\n headers = {\"Authorization\": \"Client-ID {}\".format(imgur_client_id)}\n url = self.imgur_base_url + \"gallery/r/{}/{}/{}/0\".format(subreddit, sort, window)\n\n async with self.session.get(url, headers=headers) as sub_get:\n data = await sub_get.json()\n\n if data[\"success\"]:\n items = data[\"data\"]\n if items:\n for item in items[:3]:\n link = item[\"gifv\"] if \"gifv\" in item else item[\"link\"]\n links.append(\"{}\\n{}\".format(item[\"title\"], link))\n\n if links:\n await ctx.send(\"\\n\".join(links))\n else:\n await ctx.send(_(\"No results found.\"))\n else:\n await ctx.send(_(\"Something went wrong. Error code is {}\").format(data[\"status\"]))\n\n @checks.is_owner()\n @commands.command()\n async def imgurcreds(self, ctx, imgur_client_id: str):\n \"\"\"Sets the imgur client id\n You will need an account on Imgur to get this\n\n You can get these by visiting https://api.imgur.com/oauth2/addclient\n and filling out the form. Enter a name for the application, select\n 'Anonymous usage without user authorization' for the auth type,\n set the authorization callback url to 'https://localhost'\n leave the app website blank, enter a valid email address, and\n enter a description. Check the box for the captcha, then click Next.\n Your client ID will be on the page that loads\"\"\"\n await self.settings.imgur_client_id.set(imgur_client_id)\n await ctx.send(_(\"Set the imgur client id!\"))\n\n @commands.command(pass_context=True, no_pm=True)\n async def gif(self, ctx, *keywords):\n \"\"\"Retrieves first search result from giphy\"\"\"\n if keywords:\n keywords = \"+\".join(keywords)\n else:\n await ctx.send_help()\n return\n\n url = (\"http://api.giphy.com/v1/gifs/search?&api_key={}&q={}\"\n \"\".format(GIPHY_API_KEY, keywords))\n\n async with self.session.get(url) as r:\n result = await r.json()\n if r.status == 200:\n if result[\"data\"]:\n await ctx.send(result[\"data\"][0][\"url\"])\n else:\n await ctx.send(_(\"No results found.\"))\n else:\n await ctx.send(_(\"Error contacting the API\"))\n\n @commands.command(pass_context=True, no_pm=True)\n async def gifr(self, ctx, *keywords):\n \"\"\"Retrieves a random gif from a giphy search\"\"\"\n if keywords:\n keywords = \"+\".join(keywords)\n else:\n await ctx.send_help()\n return\n\n url = (\"http://api.giphy.com/v1/gifs/random?&api_key={}&tag={}\"\n \"\".format(GIPHY_API_KEY, keywords))\n\n async with self.session.get(url) as r:\n result = await r.json()\n if r.status == 200:\n if result[\"data\"]:\n await ctx.send(result[\"data\"][\"url\"])\n else:\n await ctx.send(_(\"No results found.\"))\n else:\n await ctx.send(_(\"Error contacting the API\"))\n", "path": "redbot/cogs/image/image.py"}]} | 2,273 | 671 |
gh_patches_debug_5956 | rasdani/github-patches | git_diff | freedomofpress__securedrop-3311 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[RFE] Remove the sizegrip from the statusbar of the GUI tool
# Feature request
## Description
There should not be any sizegrip available in the statusbar of the GUI tool as suggested in https://github.com/freedomofpress/securedrop/pull/3300#discussion_r184066922
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `journalist_gui/journalist_gui/SecureDropUpdater.py`
Content:
```
1 #!/usr/bin/python
2 from PyQt5 import QtGui, QtWidgets
3 from PyQt5.QtCore import QThread, pyqtSignal
4 import subprocess
5 import os
6 import pexpect
7
8 from journalist_gui import updaterUI, strings, resources_rc # noqa
9
10
11 LOCK_LOCATION = "/home/amnesia/Persistent/securedrop/securedrop_update.lock" # noqa
12
13
14 class SetupThread(QThread):
15 signal = pyqtSignal('PyQt_PyObject')
16
17 def __init__(self):
18 QThread.__init__(self)
19 self.output = ""
20 self.update_success = False
21 self.failure_reason = ""
22
23 def run(self):
24 sdadmin_path = '/home/amnesia/Persistent/securedrop/securedrop-admin'
25 update_command = [sdadmin_path, 'setup']
26
27 # Create lock so we resume failed updates on reboot.
28 # Don't create the lock if it already exists.
29 if not os.path.exists(LOCK_LOCATION):
30 open(LOCK_LOCATION, 'a').close()
31
32 try:
33 self.output = subprocess.check_output(
34 update_command,
35 stderr=subprocess.STDOUT).decode('utf-8')
36 if 'Failed to install' in self.output:
37 self.update_success = False
38 self.failure_reason = strings.update_failed_generic_reason
39 else:
40 self.update_success = True
41 except subprocess.CalledProcessError as e:
42 self.output += e.output.decode('utf-8')
43 self.update_success = False
44 self.failure_reason = strings.update_failed_generic_reason
45 result = {'status': self.update_success,
46 'output': self.output,
47 'failure_reason': self.failure_reason}
48 self.signal.emit(result)
49
50
51 # This thread will handle the ./securedrop-admin update command
52 class UpdateThread(QThread):
53 signal = pyqtSignal('PyQt_PyObject')
54
55 def __init__(self):
56 QThread.__init__(self)
57 self.output = ""
58 self.update_success = False
59 self.failure_reason = ""
60
61 def run(self):
62 sdadmin_path = '/home/amnesia/Persistent/securedrop/securedrop-admin'
63 update_command = [sdadmin_path, 'update']
64 try:
65 self.output = subprocess.check_output(
66 update_command,
67 stderr=subprocess.STDOUT).decode('utf-8')
68 if "Signature verification successful" in self.output:
69 self.update_success = True
70 else:
71 self.failure_reason = strings.update_failed_generic_reason
72 except subprocess.CalledProcessError as e:
73 self.update_success = False
74 self.output += e.output.decode('utf-8')
75 if 'Signature verification failed' in self.output:
76 self.failure_reason = strings.update_failed_sig_failure
77 else:
78 self.failure_reason = strings.update_failed_generic_reason
79 result = {'status': self.update_success,
80 'output': self.output,
81 'failure_reason': self.failure_reason}
82 self.signal.emit(result)
83
84
85 # This thread will handle the ./securedrop-admin tailsconfig command
86 class TailsconfigThread(QThread):
87 signal = pyqtSignal('PyQt_PyObject')
88
89 def __init__(self):
90 QThread.__init__(self)
91 self.output = ""
92 self.update_success = False
93 self.failure_reason = ""
94 self.sudo_password = ""
95
96 def run(self):
97 tailsconfig_command = ("/home/amnesia/Persistent/"
98 "securedrop/securedrop-admin "
99 "tailsconfig")
100 try:
101 child = pexpect.spawn(tailsconfig_command)
102 child.expect('SUDO password:')
103 self.output += child.before.decode('utf-8')
104 child.sendline(self.sudo_password)
105 child.expect(pexpect.EOF)
106 self.output += child.before.decode('utf-8')
107
108 # For Tailsconfig to be considered a success, we expect no
109 # failures in the Ansible output.
110 if 'failed=0' not in self.output:
111 self.update_success = False
112 self.failure_reason = strings.tailsconfig_failed_generic_reason # noqa
113 else:
114 self.update_success = True
115 except pexpect.exceptions.TIMEOUT:
116 self.update_success = False
117 self.failure_reason = strings.tailsconfig_failed_sudo_password
118
119 except subprocess.CalledProcessError:
120 self.update_success = False
121 self.failure_reason = strings.tailsconfig_failed_generic_reason
122 result = {'status': self.update_success,
123 'output': self.output,
124 'failure_reason': self.failure_reason}
125 self.signal.emit(result)
126
127
128 class UpdaterApp(QtWidgets.QMainWindow, updaterUI.Ui_MainWindow):
129
130 def __init__(self, parent=None):
131 super(UpdaterApp, self).__init__(parent)
132 self.setupUi(self)
133 self.output = strings.initial_text_box
134 self.plainTextEdit.setPlainText(self.output)
135 self.update_success = False
136
137 pixmap = QtGui.QPixmap(":/images/static/banner.png")
138 self.label_2.setPixmap(pixmap)
139 self.label_2.setScaledContents(True)
140
141 self.progressBar.setProperty("value", 0)
142 self.setWindowTitle(strings.window_title)
143 self.setWindowIcon(QtGui.QIcon(':/images/static/securedrop_icon.png'))
144 self.label.setText(strings.update_in_progress)
145
146 self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab),
147 strings.main_tab)
148 self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab_2),
149 strings.output_tab)
150
151 # Connect buttons to their functions.
152 self.pushButton.setText(strings.install_later_button)
153 self.pushButton.setStyleSheet("""background-color: lightgrey;
154 min-height: 2em;
155 border-radius: 10px""")
156 self.pushButton.clicked.connect(self.close)
157 self.pushButton_2.setText(strings.install_update_button)
158 self.pushButton_2.setStyleSheet("""background-color: #E6FFEB;
159 min-height: 2em;
160 border-radius: 10px;""")
161 self.pushButton_2.clicked.connect(self.update_securedrop)
162 self.update_thread = UpdateThread()
163 self.update_thread.signal.connect(self.update_status)
164 self.tails_thread = TailsconfigThread()
165 self.tails_thread.signal.connect(self.tails_status)
166 self.setup_thread = SetupThread()
167 self.setup_thread.signal.connect(self.setup_status)
168
169 # At the end of this function, we will try to do tailsconfig.
170 # A new slot will handle tailsconfig output
171 def setup_status(self, result):
172 "This is the slot for setup thread"
173 self.output += result['output']
174 self.update_success = result['status']
175 self.failure_reason = result['failure_reason']
176 self.progressBar.setProperty("value", 60)
177 self.plainTextEdit.setPlainText(self.output)
178 self.plainTextEdit.setReadOnly = True
179 if not self.update_success: # Failed to do setup update
180 self.pushButton.setEnabled(True)
181 self.pushButton_2.setEnabled(True)
182 self.update_status_bar_and_output(self.failure_reason)
183 self.progressBar.setProperty("value", 0)
184 self.alert_failure(self.failure_reason)
185 return
186 self.progressBar.setProperty("value", 70)
187 self.call_tailsconfig()
188
189 # This will update the output text after the git commands.
190 def update_status(self, result):
191 "This is the slot for update thread"
192 self.output += result['output']
193 self.update_success = result['status']
194 self.failure_reason = result['failure_reason']
195 self.progressBar.setProperty("value", 40)
196 self.plainTextEdit.setPlainText(self.output)
197 self.plainTextEdit.setReadOnly = True
198 self.progressBar.setProperty("value", 50)
199 self.update_status_bar_and_output(strings.doing_setup)
200 self.setup_thread.start()
201
202 def update_status_bar_and_output(self, status_message):
203 """This method updates the status bar and the output window with the
204 status_message."""
205 self.statusbar.showMessage(status_message)
206 self.output += status_message + '\n'
207 self.plainTextEdit.setPlainText(self.output)
208
209 def call_tailsconfig(self):
210 # Now let us work on tailsconfig part
211 if self.update_success:
212 # Get sudo password and add an enter key as tailsconfig command
213 # expects
214 sudo_password = self.get_sudo_password()
215 if not sudo_password:
216 self.update_success = False
217 self.failure_reason = strings.missing_sudo_password
218 self.on_failure()
219 return
220 self.tails_thread.sudo_password = sudo_password + '\n'
221 self.update_status_bar_and_output(strings.updating_tails_env)
222 self.tails_thread.start()
223 else:
224 self.on_failure()
225
226 def tails_status(self, result):
227 "This is the slot for Tailsconfig thread"
228 self.output += result['output']
229 self.update_success = result['status']
230 self.failure_reason = result['failure_reason']
231 self.plainTextEdit.setPlainText(self.output)
232 self.progressBar.setProperty("value", 80)
233 if self.update_success:
234 # Remove lock
235 os.remove(LOCK_LOCATION)
236 self.update_status_bar_and_output(strings.finished)
237 self.progressBar.setProperty("value", 100)
238 self.alert_success()
239 else:
240 self.on_failure()
241
242 def on_failure(self):
243 self.update_status_bar_and_output(self.failure_reason)
244 self.alert_failure(self.failure_reason)
245 # Now everything is done, enable the button.
246 self.pushButton.setEnabled(True)
247 self.pushButton_2.setEnabled(True)
248 self.progressBar.setProperty("value", 0)
249
250 def update_securedrop(self):
251 self.pushButton_2.setEnabled(False)
252 self.pushButton.setEnabled(False)
253 self.progressBar.setProperty("value", 10)
254 self.update_status_bar_and_output(strings.fetching_update)
255 self.update_thread.start()
256
257 def alert_success(self):
258 self.success_dialog = QtWidgets.QMessageBox()
259 self.success_dialog.setIcon(QtWidgets.QMessageBox.Information)
260 self.success_dialog.setText(strings.finished_dialog_message)
261 self.success_dialog.setWindowTitle(strings.finished_dialog_title)
262 self.success_dialog.show()
263
264 def alert_failure(self, failure_reason):
265 self.error_dialog = QtWidgets.QMessageBox()
266 self.error_dialog.setIcon(QtWidgets.QMessageBox.Critical)
267 self.error_dialog.setText(self.failure_reason)
268 self.error_dialog.setWindowTitle(strings.update_failed_dialog_title)
269 self.error_dialog.show()
270
271 def get_sudo_password(self):
272 sudo_password, ok_is_pressed = QtWidgets.QInputDialog.getText(
273 self, "Tails Administrator password", strings.sudo_password_text,
274 QtWidgets.QLineEdit.Password, "")
275 if ok_is_pressed and sudo_password:
276 return sudo_password
277 else:
278 return None
279
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/journalist_gui/journalist_gui/SecureDropUpdater.py b/journalist_gui/journalist_gui/SecureDropUpdater.py
--- a/journalist_gui/journalist_gui/SecureDropUpdater.py
+++ b/journalist_gui/journalist_gui/SecureDropUpdater.py
@@ -130,6 +130,7 @@
def __init__(self, parent=None):
super(UpdaterApp, self).__init__(parent)
self.setupUi(self)
+ self.statusbar.setSizeGripEnabled(False)
self.output = strings.initial_text_box
self.plainTextEdit.setPlainText(self.output)
self.update_success = False
| {"golden_diff": "diff --git a/journalist_gui/journalist_gui/SecureDropUpdater.py b/journalist_gui/journalist_gui/SecureDropUpdater.py\n--- a/journalist_gui/journalist_gui/SecureDropUpdater.py\n+++ b/journalist_gui/journalist_gui/SecureDropUpdater.py\n@@ -130,6 +130,7 @@\n def __init__(self, parent=None):\n super(UpdaterApp, self).__init__(parent)\n self.setupUi(self)\n+ self.statusbar.setSizeGripEnabled(False)\n self.output = strings.initial_text_box\n self.plainTextEdit.setPlainText(self.output)\n self.update_success = False\n", "issue": "[RFE] Remove the sizegrip from the statusbar of the GUI tool\n# Feature request\r\n\r\n## Description\r\n\r\nThere should not be any sizegrip available in the statusbar of the GUI tool as suggested in https://github.com/freedomofpress/securedrop/pull/3300#discussion_r184066922\n", "before_files": [{"content": "#!/usr/bin/python\nfrom PyQt5 import QtGui, QtWidgets\nfrom PyQt5.QtCore import QThread, pyqtSignal\nimport subprocess\nimport os\nimport pexpect\n\nfrom journalist_gui import updaterUI, strings, resources_rc # noqa\n\n\nLOCK_LOCATION = \"/home/amnesia/Persistent/securedrop/securedrop_update.lock\" # noqa\n\n\nclass SetupThread(QThread):\n signal = pyqtSignal('PyQt_PyObject')\n\n def __init__(self):\n QThread.__init__(self)\n self.output = \"\"\n self.update_success = False\n self.failure_reason = \"\"\n\n def run(self):\n sdadmin_path = '/home/amnesia/Persistent/securedrop/securedrop-admin'\n update_command = [sdadmin_path, 'setup']\n\n # Create lock so we resume failed updates on reboot.\n # Don't create the lock if it already exists.\n if not os.path.exists(LOCK_LOCATION):\n open(LOCK_LOCATION, 'a').close()\n\n try:\n self.output = subprocess.check_output(\n update_command,\n stderr=subprocess.STDOUT).decode('utf-8')\n if 'Failed to install' in self.output:\n self.update_success = False\n self.failure_reason = strings.update_failed_generic_reason\n else:\n self.update_success = True\n except subprocess.CalledProcessError as e:\n self.output += e.output.decode('utf-8')\n self.update_success = False\n self.failure_reason = strings.update_failed_generic_reason\n result = {'status': self.update_success,\n 'output': self.output,\n 'failure_reason': self.failure_reason}\n self.signal.emit(result)\n\n\n# This thread will handle the ./securedrop-admin update command\nclass UpdateThread(QThread):\n signal = pyqtSignal('PyQt_PyObject')\n\n def __init__(self):\n QThread.__init__(self)\n self.output = \"\"\n self.update_success = False\n self.failure_reason = \"\"\n\n def run(self):\n sdadmin_path = '/home/amnesia/Persistent/securedrop/securedrop-admin'\n update_command = [sdadmin_path, 'update']\n try:\n self.output = subprocess.check_output(\n update_command,\n stderr=subprocess.STDOUT).decode('utf-8')\n if \"Signature verification successful\" in self.output:\n self.update_success = True\n else:\n self.failure_reason = strings.update_failed_generic_reason\n except subprocess.CalledProcessError as e:\n self.update_success = False\n self.output += e.output.decode('utf-8')\n if 'Signature verification failed' in self.output:\n self.failure_reason = strings.update_failed_sig_failure\n else:\n self.failure_reason = strings.update_failed_generic_reason\n result = {'status': self.update_success,\n 'output': self.output,\n 'failure_reason': self.failure_reason}\n self.signal.emit(result)\n\n\n# This thread will handle the ./securedrop-admin tailsconfig command\nclass TailsconfigThread(QThread):\n signal = pyqtSignal('PyQt_PyObject')\n\n def __init__(self):\n QThread.__init__(self)\n self.output = \"\"\n self.update_success = False\n self.failure_reason = \"\"\n self.sudo_password = \"\"\n\n def run(self):\n tailsconfig_command = (\"/home/amnesia/Persistent/\"\n \"securedrop/securedrop-admin \"\n \"tailsconfig\")\n try:\n child = pexpect.spawn(tailsconfig_command)\n child.expect('SUDO password:')\n self.output += child.before.decode('utf-8')\n child.sendline(self.sudo_password)\n child.expect(pexpect.EOF)\n self.output += child.before.decode('utf-8')\n\n # For Tailsconfig to be considered a success, we expect no\n # failures in the Ansible output.\n if 'failed=0' not in self.output:\n self.update_success = False\n self.failure_reason = strings.tailsconfig_failed_generic_reason # noqa\n else:\n self.update_success = True\n except pexpect.exceptions.TIMEOUT:\n self.update_success = False\n self.failure_reason = strings.tailsconfig_failed_sudo_password\n\n except subprocess.CalledProcessError:\n self.update_success = False\n self.failure_reason = strings.tailsconfig_failed_generic_reason\n result = {'status': self.update_success,\n 'output': self.output,\n 'failure_reason': self.failure_reason}\n self.signal.emit(result)\n\n\nclass UpdaterApp(QtWidgets.QMainWindow, updaterUI.Ui_MainWindow):\n\n def __init__(self, parent=None):\n super(UpdaterApp, self).__init__(parent)\n self.setupUi(self)\n self.output = strings.initial_text_box\n self.plainTextEdit.setPlainText(self.output)\n self.update_success = False\n\n pixmap = QtGui.QPixmap(\":/images/static/banner.png\")\n self.label_2.setPixmap(pixmap)\n self.label_2.setScaledContents(True)\n\n self.progressBar.setProperty(\"value\", 0)\n self.setWindowTitle(strings.window_title)\n self.setWindowIcon(QtGui.QIcon(':/images/static/securedrop_icon.png'))\n self.label.setText(strings.update_in_progress)\n\n self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab),\n strings.main_tab)\n self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab_2),\n strings.output_tab)\n\n # Connect buttons to their functions.\n self.pushButton.setText(strings.install_later_button)\n self.pushButton.setStyleSheet(\"\"\"background-color: lightgrey;\n min-height: 2em;\n border-radius: 10px\"\"\")\n self.pushButton.clicked.connect(self.close)\n self.pushButton_2.setText(strings.install_update_button)\n self.pushButton_2.setStyleSheet(\"\"\"background-color: #E6FFEB;\n min-height: 2em;\n border-radius: 10px;\"\"\")\n self.pushButton_2.clicked.connect(self.update_securedrop)\n self.update_thread = UpdateThread()\n self.update_thread.signal.connect(self.update_status)\n self.tails_thread = TailsconfigThread()\n self.tails_thread.signal.connect(self.tails_status)\n self.setup_thread = SetupThread()\n self.setup_thread.signal.connect(self.setup_status)\n\n # At the end of this function, we will try to do tailsconfig.\n # A new slot will handle tailsconfig output\n def setup_status(self, result):\n \"This is the slot for setup thread\"\n self.output += result['output']\n self.update_success = result['status']\n self.failure_reason = result['failure_reason']\n self.progressBar.setProperty(\"value\", 60)\n self.plainTextEdit.setPlainText(self.output)\n self.plainTextEdit.setReadOnly = True\n if not self.update_success: # Failed to do setup update\n self.pushButton.setEnabled(True)\n self.pushButton_2.setEnabled(True)\n self.update_status_bar_and_output(self.failure_reason)\n self.progressBar.setProperty(\"value\", 0)\n self.alert_failure(self.failure_reason)\n return\n self.progressBar.setProperty(\"value\", 70)\n self.call_tailsconfig()\n\n # This will update the output text after the git commands.\n def update_status(self, result):\n \"This is the slot for update thread\"\n self.output += result['output']\n self.update_success = result['status']\n self.failure_reason = result['failure_reason']\n self.progressBar.setProperty(\"value\", 40)\n self.plainTextEdit.setPlainText(self.output)\n self.plainTextEdit.setReadOnly = True\n self.progressBar.setProperty(\"value\", 50)\n self.update_status_bar_and_output(strings.doing_setup)\n self.setup_thread.start()\n\n def update_status_bar_and_output(self, status_message):\n \"\"\"This method updates the status bar and the output window with the\n status_message.\"\"\"\n self.statusbar.showMessage(status_message)\n self.output += status_message + '\\n'\n self.plainTextEdit.setPlainText(self.output)\n\n def call_tailsconfig(self):\n # Now let us work on tailsconfig part\n if self.update_success:\n # Get sudo password and add an enter key as tailsconfig command\n # expects\n sudo_password = self.get_sudo_password()\n if not sudo_password:\n self.update_success = False\n self.failure_reason = strings.missing_sudo_password\n self.on_failure()\n return\n self.tails_thread.sudo_password = sudo_password + '\\n'\n self.update_status_bar_and_output(strings.updating_tails_env)\n self.tails_thread.start()\n else:\n self.on_failure()\n\n def tails_status(self, result):\n \"This is the slot for Tailsconfig thread\"\n self.output += result['output']\n self.update_success = result['status']\n self.failure_reason = result['failure_reason']\n self.plainTextEdit.setPlainText(self.output)\n self.progressBar.setProperty(\"value\", 80)\n if self.update_success:\n # Remove lock\n os.remove(LOCK_LOCATION)\n self.update_status_bar_and_output(strings.finished)\n self.progressBar.setProperty(\"value\", 100)\n self.alert_success()\n else:\n self.on_failure()\n\n def on_failure(self):\n self.update_status_bar_and_output(self.failure_reason)\n self.alert_failure(self.failure_reason)\n # Now everything is done, enable the button.\n self.pushButton.setEnabled(True)\n self.pushButton_2.setEnabled(True)\n self.progressBar.setProperty(\"value\", 0)\n\n def update_securedrop(self):\n self.pushButton_2.setEnabled(False)\n self.pushButton.setEnabled(False)\n self.progressBar.setProperty(\"value\", 10)\n self.update_status_bar_and_output(strings.fetching_update)\n self.update_thread.start()\n\n def alert_success(self):\n self.success_dialog = QtWidgets.QMessageBox()\n self.success_dialog.setIcon(QtWidgets.QMessageBox.Information)\n self.success_dialog.setText(strings.finished_dialog_message)\n self.success_dialog.setWindowTitle(strings.finished_dialog_title)\n self.success_dialog.show()\n\n def alert_failure(self, failure_reason):\n self.error_dialog = QtWidgets.QMessageBox()\n self.error_dialog.setIcon(QtWidgets.QMessageBox.Critical)\n self.error_dialog.setText(self.failure_reason)\n self.error_dialog.setWindowTitle(strings.update_failed_dialog_title)\n self.error_dialog.show()\n\n def get_sudo_password(self):\n sudo_password, ok_is_pressed = QtWidgets.QInputDialog.getText(\n self, \"Tails Administrator password\", strings.sudo_password_text,\n QtWidgets.QLineEdit.Password, \"\")\n if ok_is_pressed and sudo_password:\n return sudo_password\n else:\n return None\n", "path": "journalist_gui/journalist_gui/SecureDropUpdater.py"}], "after_files": [{"content": "#!/usr/bin/python\nfrom PyQt5 import QtGui, QtWidgets\nfrom PyQt5.QtCore import QThread, pyqtSignal\nimport subprocess\nimport os\nimport pexpect\n\nfrom journalist_gui import updaterUI, strings, resources_rc # noqa\n\n\nLOCK_LOCATION = \"/home/amnesia/Persistent/securedrop/securedrop_update.lock\" # noqa\n\n\nclass SetupThread(QThread):\n signal = pyqtSignal('PyQt_PyObject')\n\n def __init__(self):\n QThread.__init__(self)\n self.output = \"\"\n self.update_success = False\n self.failure_reason = \"\"\n\n def run(self):\n sdadmin_path = '/home/amnesia/Persistent/securedrop/securedrop-admin'\n update_command = [sdadmin_path, 'setup']\n\n # Create lock so we resume failed updates on reboot.\n # Don't create the lock if it already exists.\n if not os.path.exists(LOCK_LOCATION):\n open(LOCK_LOCATION, 'a').close()\n\n try:\n self.output = subprocess.check_output(\n update_command,\n stderr=subprocess.STDOUT).decode('utf-8')\n if 'Failed to install' in self.output:\n self.update_success = False\n self.failure_reason = strings.update_failed_generic_reason\n else:\n self.update_success = True\n except subprocess.CalledProcessError as e:\n self.output += e.output.decode('utf-8')\n self.update_success = False\n self.failure_reason = strings.update_failed_generic_reason\n result = {'status': self.update_success,\n 'output': self.output,\n 'failure_reason': self.failure_reason}\n self.signal.emit(result)\n\n\n# This thread will handle the ./securedrop-admin update command\nclass UpdateThread(QThread):\n signal = pyqtSignal('PyQt_PyObject')\n\n def __init__(self):\n QThread.__init__(self)\n self.output = \"\"\n self.update_success = False\n self.failure_reason = \"\"\n\n def run(self):\n sdadmin_path = '/home/amnesia/Persistent/securedrop/securedrop-admin'\n update_command = [sdadmin_path, 'update']\n try:\n self.output = subprocess.check_output(\n update_command,\n stderr=subprocess.STDOUT).decode('utf-8')\n if \"Signature verification successful\" in self.output:\n self.update_success = True\n else:\n self.failure_reason = strings.update_failed_generic_reason\n except subprocess.CalledProcessError as e:\n self.update_success = False\n self.output += e.output.decode('utf-8')\n if 'Signature verification failed' in self.output:\n self.failure_reason = strings.update_failed_sig_failure\n else:\n self.failure_reason = strings.update_failed_generic_reason\n result = {'status': self.update_success,\n 'output': self.output,\n 'failure_reason': self.failure_reason}\n self.signal.emit(result)\n\n\n# This thread will handle the ./securedrop-admin tailsconfig command\nclass TailsconfigThread(QThread):\n signal = pyqtSignal('PyQt_PyObject')\n\n def __init__(self):\n QThread.__init__(self)\n self.output = \"\"\n self.update_success = False\n self.failure_reason = \"\"\n self.sudo_password = \"\"\n\n def run(self):\n tailsconfig_command = (\"/home/amnesia/Persistent/\"\n \"securedrop/securedrop-admin \"\n \"tailsconfig\")\n try:\n child = pexpect.spawn(tailsconfig_command)\n child.expect('SUDO password:')\n self.output += child.before.decode('utf-8')\n child.sendline(self.sudo_password)\n child.expect(pexpect.EOF)\n self.output += child.before.decode('utf-8')\n\n # For Tailsconfig to be considered a success, we expect no\n # failures in the Ansible output.\n if 'failed=0' not in self.output:\n self.update_success = False\n self.failure_reason = strings.tailsconfig_failed_generic_reason # noqa\n else:\n self.update_success = True\n except pexpect.exceptions.TIMEOUT:\n self.update_success = False\n self.failure_reason = strings.tailsconfig_failed_sudo_password\n\n except subprocess.CalledProcessError:\n self.update_success = False\n self.failure_reason = strings.tailsconfig_failed_generic_reason\n result = {'status': self.update_success,\n 'output': self.output,\n 'failure_reason': self.failure_reason}\n self.signal.emit(result)\n\n\nclass UpdaterApp(QtWidgets.QMainWindow, updaterUI.Ui_MainWindow):\n\n def __init__(self, parent=None):\n super(UpdaterApp, self).__init__(parent)\n self.setupUi(self)\n self.statusbar.setSizeGripEnabled(False)\n self.output = strings.initial_text_box\n self.plainTextEdit.setPlainText(self.output)\n self.update_success = False\n\n pixmap = QtGui.QPixmap(\":/images/static/banner.png\")\n self.label_2.setPixmap(pixmap)\n self.label_2.setScaledContents(True)\n\n self.progressBar.setProperty(\"value\", 0)\n self.setWindowTitle(strings.window_title)\n self.setWindowIcon(QtGui.QIcon(':/images/static/securedrop_icon.png'))\n self.label.setText(strings.update_in_progress)\n\n self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab),\n strings.main_tab)\n self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab_2),\n strings.output_tab)\n\n # Connect buttons to their functions.\n self.pushButton.setText(strings.install_later_button)\n self.pushButton.setStyleSheet(\"\"\"background-color: lightgrey;\n min-height: 2em;\n border-radius: 10px\"\"\")\n self.pushButton.clicked.connect(self.close)\n self.pushButton_2.setText(strings.install_update_button)\n self.pushButton_2.setStyleSheet(\"\"\"background-color: #E6FFEB;\n min-height: 2em;\n border-radius: 10px;\"\"\")\n self.pushButton_2.clicked.connect(self.update_securedrop)\n self.update_thread = UpdateThread()\n self.update_thread.signal.connect(self.update_status)\n self.tails_thread = TailsconfigThread()\n self.tails_thread.signal.connect(self.tails_status)\n self.setup_thread = SetupThread()\n self.setup_thread.signal.connect(self.setup_status)\n\n # At the end of this function, we will try to do tailsconfig.\n # A new slot will handle tailsconfig output\n def setup_status(self, result):\n \"This is the slot for setup thread\"\n self.output += result['output']\n self.update_success = result['status']\n self.failure_reason = result['failure_reason']\n self.progressBar.setProperty(\"value\", 60)\n self.plainTextEdit.setPlainText(self.output)\n self.plainTextEdit.setReadOnly = True\n if not self.update_success: # Failed to do setup update\n self.pushButton.setEnabled(True)\n self.pushButton_2.setEnabled(True)\n self.update_status_bar_and_output(self.failure_reason)\n self.progressBar.setProperty(\"value\", 0)\n self.alert_failure(self.failure_reason)\n return\n self.progressBar.setProperty(\"value\", 70)\n self.call_tailsconfig()\n\n # This will update the output text after the git commands.\n def update_status(self, result):\n \"This is the slot for update thread\"\n self.output += result['output']\n self.update_success = result['status']\n self.failure_reason = result['failure_reason']\n self.progressBar.setProperty(\"value\", 40)\n self.plainTextEdit.setPlainText(self.output)\n self.plainTextEdit.setReadOnly = True\n self.progressBar.setProperty(\"value\", 50)\n self.update_status_bar_and_output(strings.doing_setup)\n self.setup_thread.start()\n\n def update_status_bar_and_output(self, status_message):\n \"\"\"This method updates the status bar and the output window with the\n status_message.\"\"\"\n self.statusbar.showMessage(status_message)\n self.output += status_message + '\\n'\n self.plainTextEdit.setPlainText(self.output)\n\n def call_tailsconfig(self):\n # Now let us work on tailsconfig part\n if self.update_success:\n # Get sudo password and add an enter key as tailsconfig command\n # expects\n sudo_password = self.get_sudo_password()\n if not sudo_password:\n self.update_success = False\n self.failure_reason = strings.missing_sudo_password\n self.on_failure()\n return\n self.tails_thread.sudo_password = sudo_password + '\\n'\n self.update_status_bar_and_output(strings.updating_tails_env)\n self.tails_thread.start()\n else:\n self.on_failure()\n\n def tails_status(self, result):\n \"This is the slot for Tailsconfig thread\"\n self.output += result['output']\n self.update_success = result['status']\n self.failure_reason = result['failure_reason']\n self.plainTextEdit.setPlainText(self.output)\n self.progressBar.setProperty(\"value\", 80)\n if self.update_success:\n # Remove lock\n os.remove(LOCK_LOCATION)\n self.update_status_bar_and_output(strings.finished)\n self.progressBar.setProperty(\"value\", 100)\n self.alert_success()\n else:\n self.on_failure()\n\n def on_failure(self):\n self.update_status_bar_and_output(self.failure_reason)\n self.alert_failure(self.failure_reason)\n # Now everything is done, enable the button.\n self.pushButton.setEnabled(True)\n self.pushButton_2.setEnabled(True)\n self.progressBar.setProperty(\"value\", 0)\n\n def update_securedrop(self):\n self.pushButton_2.setEnabled(False)\n self.pushButton.setEnabled(False)\n self.progressBar.setProperty(\"value\", 10)\n self.update_status_bar_and_output(strings.fetching_update)\n self.update_thread.start()\n\n def alert_success(self):\n self.success_dialog = QtWidgets.QMessageBox()\n self.success_dialog.setIcon(QtWidgets.QMessageBox.Information)\n self.success_dialog.setText(strings.finished_dialog_message)\n self.success_dialog.setWindowTitle(strings.finished_dialog_title)\n self.success_dialog.show()\n\n def alert_failure(self, failure_reason):\n self.error_dialog = QtWidgets.QMessageBox()\n self.error_dialog.setIcon(QtWidgets.QMessageBox.Critical)\n self.error_dialog.setText(self.failure_reason)\n self.error_dialog.setWindowTitle(strings.update_failed_dialog_title)\n self.error_dialog.show()\n\n def get_sudo_password(self):\n sudo_password, ok_is_pressed = QtWidgets.QInputDialog.getText(\n self, \"Tails Administrator password\", strings.sudo_password_text,\n QtWidgets.QLineEdit.Password, \"\")\n if ok_is_pressed and sudo_password:\n return sudo_password\n else:\n return None\n", "path": "journalist_gui/journalist_gui/SecureDropUpdater.py"}]} | 3,289 | 143 |
gh_patches_debug_20172 | rasdani/github-patches | git_diff | opendatacube__datacube-core-1177 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
cannot import name 'clock_gettime' from 'time' (unknown location)
### Expected behaviour
Import time
### Actual behaviour
File "C:\ProgramData\Anaconda3\lib\site-packages\datacube\drivers\postgres\_connections.py", line 20, in <module>
from time import clock_gettime, CLOCK_REALTIME
ImportError: cannot import name 'clock_gettime' from 'time' (unknown location)
cannot import name 'clock_gettime' from 'time' (unknown location)
### Steps to reproduce the behaviour
from ._connections import PostgresDb
### Environment information
Python 3.8.11
* Which ``datacube --version`` are you using?
Unknown, it's referenced in geocube on conda-forge
* What datacube deployment/enviornment are you running against?
n/a
Additional notes:
time.clock has been removed in Python 3.8 after being deprecated in Python 3.3. The recommendation is to change clock by either time.perf_counter or time.process_time.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `datacube/drivers/postgres/_connections.py`
Content:
```
1 # This file is part of the Open Data Cube, see https://opendatacube.org for more information
2 #
3 # Copyright (c) 2015-2020 ODC Contributors
4 # SPDX-License-Identifier: Apache-2.0
5
6 # We often have one-arg-per column, so these checks aren't so useful.
7 # pylint: disable=too-many-arguments,too-many-public-methods
8
9 # SQLAlchemy queries require "column == None", not "column is None" due to operator overloading:
10 # pylint: disable=singleton-comparison
11
12 """
13 Postgres connection and setup
14 """
15 import json
16 import logging
17 import os
18 import re
19 from contextlib import contextmanager
20 from time import clock_gettime, CLOCK_REALTIME
21 from typing import Callable, Optional, Union
22
23 from sqlalchemy import event, create_engine, text
24 from sqlalchemy.engine import Engine
25 from sqlalchemy.engine.url import URL as EngineUrl
26
27 import datacube
28 from datacube.index.exceptions import IndexSetupError
29 from datacube.utils import jsonify_document
30
31 from . import _api
32 from . import _core
33
34 _LIB_ID = 'agdc-' + str(datacube.__version__)
35
36 _LOG = logging.getLogger(__name__)
37
38 try:
39 import pwd
40
41 DEFAULT_DB_USER = pwd.getpwuid(os.geteuid()).pw_name # type: Optional[str]
42 except (ImportError, KeyError):
43 # No default on Windows and some other systems
44 DEFAULT_DB_USER = None
45 DEFAULT_DB_PORT = 5432
46 DEFAULT_IAM_AUTH = False
47 DEFAULT_IAM_TIMEOUT = 600
48
49
50 class PostgresDb(object):
51 """
52 A thin database access api.
53
54 It exists so that higher level modules are not tied to SQLAlchemy, connections or specifics of database-access.
55
56 (and can be unit tested without any actual databases)
57
58 Thread safe: the only shared state is the (thread-safe) sqlalchemy connection pool.
59
60 But not multiprocess safe once the first connections are made! A connection must not be shared between multiple
61 processes. You can call close() before forking if you know no other threads currently hold connections,
62 or else use a separate instance of this class in each process.
63 """
64
65 def __init__(self, engine):
66 # We don't recommend using this constructor directly as it may change.
67 # Use static methods PostgresDb.create() or PostgresDb.from_config()
68 self._engine = engine
69
70 @classmethod
71 def from_config(cls, config, application_name=None, validate_connection=True):
72 app_name = cls._expand_app_name(application_name)
73
74 return PostgresDb.create(
75 config['db_hostname'],
76 config['db_database'],
77 config.get('db_username', DEFAULT_DB_USER),
78 config.get('db_password', None),
79 config.get('db_port', DEFAULT_DB_PORT),
80 application_name=app_name,
81 validate=validate_connection,
82 iam_rds_auth=bool(config.get("db_iam_authentication", DEFAULT_IAM_AUTH)),
83 iam_rds_timeout=int(config.get("db_iam_timeout", DEFAULT_IAM_TIMEOUT)),
84 pool_timeout=int(config.get('db_connection_timeout', 60)),
85 # pass config?
86 )
87
88 @classmethod
89 def create(cls, hostname, database, username=None, password=None, port=None,
90 application_name=None, validate=True,
91 iam_rds_auth=False, iam_rds_timeout=600,
92 # pass config?
93 pool_timeout=60):
94 mk_url = getattr(EngineUrl, 'create', EngineUrl)
95 engine = cls._create_engine(
96 mk_url(
97 'postgresql',
98 host=hostname, database=database, port=port,
99 username=username, password=password,
100 ),
101 application_name=application_name,
102 iam_rds_auth=iam_rds_auth,
103 iam_rds_timeout=iam_rds_timeout,
104 pool_timeout=pool_timeout)
105 if validate:
106 if not _core.database_exists(engine):
107 raise IndexSetupError('\n\nNo DB schema exists. Have you run init?\n\t{init_command}'.format(
108 init_command='datacube system init'
109 ))
110
111 if not _core.schema_is_latest(engine):
112 raise IndexSetupError(
113 '\n\nDB schema is out of date. '
114 'An administrator must run init:\n\t{init_command}'.format(
115 init_command='datacube -v system init'
116 ))
117 return PostgresDb(engine)
118
119 @staticmethod
120 def _create_engine(url, application_name=None, iam_rds_auth=False, iam_rds_timeout=600, pool_timeout=60):
121 engine = create_engine(
122 url,
123 echo=False,
124 echo_pool=False,
125
126 # 'AUTOCOMMIT' here means READ-COMMITTED isolation level with autocommit on.
127 # When a transaction is needed we will do an explicit begin/commit.
128 isolation_level='AUTOCOMMIT',
129 json_serializer=_to_json,
130 # If a connection is idle for this many seconds, SQLAlchemy will renew it rather
131 # than assuming it's still open. Allows servers to close idle connections without clients
132 # getting errors.
133 pool_recycle=pool_timeout,
134 connect_args={'application_name': application_name}
135 )
136
137 if iam_rds_auth:
138 from datacube.utils.aws import obtain_new_iam_auth_token
139 handle_dynamic_token_authentication(engine, obtain_new_iam_auth_token, timeout=iam_rds_timeout, url=url)
140
141 return engine
142
143 @property
144 def url(self) -> EngineUrl:
145 return self._engine.url
146
147 @staticmethod
148 def get_db_username(config):
149 try:
150 return config['db_username']
151 except KeyError:
152 return DEFAULT_DB_USER
153
154 def close(self):
155 """
156 Close any idle connections in the pool.
157
158 This is good practice if you are keeping this object in scope
159 but wont be using it for a while.
160
161 Connections should not be shared between processes, so this should be called
162 before forking if the same instance will be used.
163
164 (connections are normally closed automatically when this object is
165 garbage collected)
166 """
167 self._engine.dispose()
168
169 @classmethod
170 def _expand_app_name(cls, application_name):
171 """
172 >>> PostgresDb._expand_app_name(None) #doctest: +ELLIPSIS
173 'agdc-...'
174 >>> PostgresDb._expand_app_name('') #doctest: +ELLIPSIS
175 'agdc-...'
176 >>> PostgresDb._expand_app_name('cli') #doctest: +ELLIPSIS
177 'cli agdc-...'
178 >>> PostgresDb._expand_app_name('a b.c/d')
179 'a-b-c-d agdc-...'
180 >>> PostgresDb._expand_app_name(5)
181 Traceback (most recent call last):
182 ...
183 TypeError: Application name must be a string
184 """
185 full_name = _LIB_ID
186 if application_name:
187 if not isinstance(application_name, str):
188 raise TypeError('Application name must be a string')
189
190 full_name = re.sub('[^0-9a-zA-Z]+', '-', application_name) + ' ' + full_name
191
192 if len(full_name) > 64:
193 _LOG.warning('Application name is too long: Truncating to %s chars', (64 - len(_LIB_ID) - 1))
194 return full_name[-64:]
195
196 def init(self, with_permissions=True):
197 """
198 Init a new database (if not already set up).
199
200 :return: If it was newly created.
201 """
202 is_new = _core.ensure_db(self._engine, with_permissions=with_permissions)
203 if not is_new:
204 _core.update_schema(self._engine)
205
206 return is_new
207
208 @contextmanager
209 def connect(self):
210 """
211 Borrow a connection from the pool.
212
213 The name connect() is misleading: it will not create a new connection if one is already available in the pool.
214
215 Callers should minimise the amount of time they hold onto their connections. If they're doing anything between
216 calls to the DB (such as opening files, or waiting on user input), it's better to return the connection
217 to the pool beforehand.
218
219 The connection can raise errors if not following this advice ("server closed the connection unexpectedly"),
220 as some servers will aggressively close idle connections (eg. DEA's NCI servers). It also prevents the
221 connection from being reused while borrowed.
222 """
223 with self._engine.connect() as connection:
224 yield _api.PostgresDbAPI(connection)
225 connection.close()
226
227 @contextmanager
228 def begin(self):
229 """
230 Start a transaction.
231
232 Returns an instance that will maintain a single connection in a transaction.
233
234 Call commit() or rollback() to complete the transaction or use a context manager:
235
236 with db.begin() as trans:
237 trans.insert_dataset(...)
238
239 (Don't share an instance between threads)
240
241 :rtype: PostgresDBAPI
242 """
243 with self._engine.connect() as connection:
244 connection.execute(text('BEGIN'))
245 try:
246 yield _api.PostgresDbAPI(connection)
247 connection.execute(text('COMMIT'))
248 except Exception: # pylint: disable=broad-except
249 connection.execute(text('ROLLBACK'))
250 raise
251 finally:
252 connection.close()
253
254 def give_me_a_connection(self):
255 return self._engine.connect()
256
257 @classmethod
258 def get_dataset_fields(cls, metadata_type_definition):
259 return _api.get_dataset_fields(metadata_type_definition)
260
261 def __repr__(self):
262 return "PostgresDb<engine={!r}>".format(self._engine)
263
264
265 def handle_dynamic_token_authentication(engine: Engine,
266 new_token: Callable[..., str],
267 timeout: Union[float, int] = 600,
268 **kwargs) -> None:
269 last_token = [None]
270 last_token_time = [0.0]
271
272 @event.listens_for(engine, "do_connect")
273 def override_new_connection(dialect, conn_rec, cargs, cparams):
274 # Handle IAM authentication
275 now = clock_gettime(CLOCK_REALTIME)
276 if now - last_token_time[0] > timeout:
277 last_token[0] = new_token(**kwargs)
278 last_token_time[0] = now
279 cparams["password"] = last_token[0]
280
281
282 def _to_json(o):
283 # Postgres <=9.5 doesn't support NaN and Infinity
284 fixedup = jsonify_document(o)
285 return json.dumps(fixedup, default=_json_fallback)
286
287
288 def _json_fallback(obj):
289 """Fallback json serialiser."""
290 raise TypeError("Type not serializable: {}".format(type(obj)))
291
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/datacube/drivers/postgres/_connections.py b/datacube/drivers/postgres/_connections.py
--- a/datacube/drivers/postgres/_connections.py
+++ b/datacube/drivers/postgres/_connections.py
@@ -17,7 +17,6 @@
import os
import re
from contextlib import contextmanager
-from time import clock_gettime, CLOCK_REALTIME
from typing import Callable, Optional, Union
from sqlalchemy import event, create_engine, text
@@ -272,6 +271,10 @@
@event.listens_for(engine, "do_connect")
def override_new_connection(dialect, conn_rec, cargs, cparams):
# Handle IAM authentication
+ # Importing here because the function `clock_gettime` is not available on Windows
+ # which shouldn't be a problem, because boto3 auth is mostly used on AWS.
+ from time import clock_gettime, CLOCK_REALTIME
+
now = clock_gettime(CLOCK_REALTIME)
if now - last_token_time[0] > timeout:
last_token[0] = new_token(**kwargs)
| {"golden_diff": "diff --git a/datacube/drivers/postgres/_connections.py b/datacube/drivers/postgres/_connections.py\n--- a/datacube/drivers/postgres/_connections.py\n+++ b/datacube/drivers/postgres/_connections.py\n@@ -17,7 +17,6 @@\n import os\n import re\n from contextlib import contextmanager\n-from time import clock_gettime, CLOCK_REALTIME\n from typing import Callable, Optional, Union\n \n from sqlalchemy import event, create_engine, text\n@@ -272,6 +271,10 @@\n @event.listens_for(engine, \"do_connect\")\n def override_new_connection(dialect, conn_rec, cargs, cparams):\n # Handle IAM authentication\n+ # Importing here because the function `clock_gettime` is not available on Windows\n+ # which shouldn't be a problem, because boto3 auth is mostly used on AWS.\n+ from time import clock_gettime, CLOCK_REALTIME\n+\n now = clock_gettime(CLOCK_REALTIME)\n if now - last_token_time[0] > timeout:\n last_token[0] = new_token(**kwargs)\n", "issue": " cannot import name 'clock_gettime' from 'time' (unknown location)\n### Expected behaviour\r\nImport time\r\n\r\n### Actual behaviour\r\n\r\n File \"C:\\ProgramData\\Anaconda3\\lib\\site-packages\\datacube\\drivers\\postgres\\_connections.py\", line 20, in <module>\r\n from time import clock_gettime, CLOCK_REALTIME\r\n\r\nImportError: cannot import name 'clock_gettime' from 'time' (unknown location)\r\n\r\ncannot import name 'clock_gettime' from 'time' (unknown location)\r\n\r\n\r\n### Steps to reproduce the behaviour\r\n from ._connections import PostgresDb\r\n\r\n### Environment information\r\nPython 3.8.11\r\n* Which ``datacube --version`` are you using?\r\nUnknown, it's referenced in geocube on conda-forge\r\n* What datacube deployment/enviornment are you running against?\r\nn/a\r\n\r\nAdditional notes:\r\ntime.clock has been removed in Python 3.8 after being deprecated in Python 3.3. The recommendation is to change clock by either time.perf_counter or time.process_time.\r\n\n", "before_files": [{"content": "# This file is part of the Open Data Cube, see https://opendatacube.org for more information\n#\n# Copyright (c) 2015-2020 ODC Contributors\n# SPDX-License-Identifier: Apache-2.0\n\n# We often have one-arg-per column, so these checks aren't so useful.\n# pylint: disable=too-many-arguments,too-many-public-methods\n\n# SQLAlchemy queries require \"column == None\", not \"column is None\" due to operator overloading:\n# pylint: disable=singleton-comparison\n\n\"\"\"\nPostgres connection and setup\n\"\"\"\nimport json\nimport logging\nimport os\nimport re\nfrom contextlib import contextmanager\nfrom time import clock_gettime, CLOCK_REALTIME\nfrom typing import Callable, Optional, Union\n\nfrom sqlalchemy import event, create_engine, text\nfrom sqlalchemy.engine import Engine\nfrom sqlalchemy.engine.url import URL as EngineUrl\n\nimport datacube\nfrom datacube.index.exceptions import IndexSetupError\nfrom datacube.utils import jsonify_document\n\nfrom . import _api\nfrom . import _core\n\n_LIB_ID = 'agdc-' + str(datacube.__version__)\n\n_LOG = logging.getLogger(__name__)\n\ntry:\n import pwd\n\n DEFAULT_DB_USER = pwd.getpwuid(os.geteuid()).pw_name # type: Optional[str]\nexcept (ImportError, KeyError):\n # No default on Windows and some other systems\n DEFAULT_DB_USER = None\nDEFAULT_DB_PORT = 5432\nDEFAULT_IAM_AUTH = False\nDEFAULT_IAM_TIMEOUT = 600\n\n\nclass PostgresDb(object):\n \"\"\"\n A thin database access api.\n\n It exists so that higher level modules are not tied to SQLAlchemy, connections or specifics of database-access.\n\n (and can be unit tested without any actual databases)\n\n Thread safe: the only shared state is the (thread-safe) sqlalchemy connection pool.\n\n But not multiprocess safe once the first connections are made! A connection must not be shared between multiple\n processes. You can call close() before forking if you know no other threads currently hold connections,\n or else use a separate instance of this class in each process.\n \"\"\"\n\n def __init__(self, engine):\n # We don't recommend using this constructor directly as it may change.\n # Use static methods PostgresDb.create() or PostgresDb.from_config()\n self._engine = engine\n\n @classmethod\n def from_config(cls, config, application_name=None, validate_connection=True):\n app_name = cls._expand_app_name(application_name)\n\n return PostgresDb.create(\n config['db_hostname'],\n config['db_database'],\n config.get('db_username', DEFAULT_DB_USER),\n config.get('db_password', None),\n config.get('db_port', DEFAULT_DB_PORT),\n application_name=app_name,\n validate=validate_connection,\n iam_rds_auth=bool(config.get(\"db_iam_authentication\", DEFAULT_IAM_AUTH)),\n iam_rds_timeout=int(config.get(\"db_iam_timeout\", DEFAULT_IAM_TIMEOUT)),\n pool_timeout=int(config.get('db_connection_timeout', 60)),\n # pass config?\n )\n\n @classmethod\n def create(cls, hostname, database, username=None, password=None, port=None,\n application_name=None, validate=True,\n iam_rds_auth=False, iam_rds_timeout=600,\n # pass config?\n pool_timeout=60):\n mk_url = getattr(EngineUrl, 'create', EngineUrl)\n engine = cls._create_engine(\n mk_url(\n 'postgresql',\n host=hostname, database=database, port=port,\n username=username, password=password,\n ),\n application_name=application_name,\n iam_rds_auth=iam_rds_auth,\n iam_rds_timeout=iam_rds_timeout,\n pool_timeout=pool_timeout)\n if validate:\n if not _core.database_exists(engine):\n raise IndexSetupError('\\n\\nNo DB schema exists. Have you run init?\\n\\t{init_command}'.format(\n init_command='datacube system init'\n ))\n\n if not _core.schema_is_latest(engine):\n raise IndexSetupError(\n '\\n\\nDB schema is out of date. '\n 'An administrator must run init:\\n\\t{init_command}'.format(\n init_command='datacube -v system init'\n ))\n return PostgresDb(engine)\n\n @staticmethod\n def _create_engine(url, application_name=None, iam_rds_auth=False, iam_rds_timeout=600, pool_timeout=60):\n engine = create_engine(\n url,\n echo=False,\n echo_pool=False,\n\n # 'AUTOCOMMIT' here means READ-COMMITTED isolation level with autocommit on.\n # When a transaction is needed we will do an explicit begin/commit.\n isolation_level='AUTOCOMMIT',\n json_serializer=_to_json,\n # If a connection is idle for this many seconds, SQLAlchemy will renew it rather\n # than assuming it's still open. Allows servers to close idle connections without clients\n # getting errors.\n pool_recycle=pool_timeout,\n connect_args={'application_name': application_name}\n )\n\n if iam_rds_auth:\n from datacube.utils.aws import obtain_new_iam_auth_token\n handle_dynamic_token_authentication(engine, obtain_new_iam_auth_token, timeout=iam_rds_timeout, url=url)\n\n return engine\n\n @property\n def url(self) -> EngineUrl:\n return self._engine.url\n\n @staticmethod\n def get_db_username(config):\n try:\n return config['db_username']\n except KeyError:\n return DEFAULT_DB_USER\n\n def close(self):\n \"\"\"\n Close any idle connections in the pool.\n\n This is good practice if you are keeping this object in scope\n but wont be using it for a while.\n\n Connections should not be shared between processes, so this should be called\n before forking if the same instance will be used.\n\n (connections are normally closed automatically when this object is\n garbage collected)\n \"\"\"\n self._engine.dispose()\n\n @classmethod\n def _expand_app_name(cls, application_name):\n \"\"\"\n >>> PostgresDb._expand_app_name(None) #doctest: +ELLIPSIS\n 'agdc-...'\n >>> PostgresDb._expand_app_name('') #doctest: +ELLIPSIS\n 'agdc-...'\n >>> PostgresDb._expand_app_name('cli') #doctest: +ELLIPSIS\n 'cli agdc-...'\n >>> PostgresDb._expand_app_name('a b.c/d')\n 'a-b-c-d agdc-...'\n >>> PostgresDb._expand_app_name(5)\n Traceback (most recent call last):\n ...\n TypeError: Application name must be a string\n \"\"\"\n full_name = _LIB_ID\n if application_name:\n if not isinstance(application_name, str):\n raise TypeError('Application name must be a string')\n\n full_name = re.sub('[^0-9a-zA-Z]+', '-', application_name) + ' ' + full_name\n\n if len(full_name) > 64:\n _LOG.warning('Application name is too long: Truncating to %s chars', (64 - len(_LIB_ID) - 1))\n return full_name[-64:]\n\n def init(self, with_permissions=True):\n \"\"\"\n Init a new database (if not already set up).\n\n :return: If it was newly created.\n \"\"\"\n is_new = _core.ensure_db(self._engine, with_permissions=with_permissions)\n if not is_new:\n _core.update_schema(self._engine)\n\n return is_new\n\n @contextmanager\n def connect(self):\n \"\"\"\n Borrow a connection from the pool.\n\n The name connect() is misleading: it will not create a new connection if one is already available in the pool.\n\n Callers should minimise the amount of time they hold onto their connections. If they're doing anything between\n calls to the DB (such as opening files, or waiting on user input), it's better to return the connection\n to the pool beforehand.\n\n The connection can raise errors if not following this advice (\"server closed the connection unexpectedly\"),\n as some servers will aggressively close idle connections (eg. DEA's NCI servers). It also prevents the\n connection from being reused while borrowed.\n \"\"\"\n with self._engine.connect() as connection:\n yield _api.PostgresDbAPI(connection)\n connection.close()\n\n @contextmanager\n def begin(self):\n \"\"\"\n Start a transaction.\n\n Returns an instance that will maintain a single connection in a transaction.\n\n Call commit() or rollback() to complete the transaction or use a context manager:\n\n with db.begin() as trans:\n trans.insert_dataset(...)\n\n (Don't share an instance between threads)\n\n :rtype: PostgresDBAPI\n \"\"\"\n with self._engine.connect() as connection:\n connection.execute(text('BEGIN'))\n try:\n yield _api.PostgresDbAPI(connection)\n connection.execute(text('COMMIT'))\n except Exception: # pylint: disable=broad-except\n connection.execute(text('ROLLBACK'))\n raise\n finally:\n connection.close()\n\n def give_me_a_connection(self):\n return self._engine.connect()\n\n @classmethod\n def get_dataset_fields(cls, metadata_type_definition):\n return _api.get_dataset_fields(metadata_type_definition)\n\n def __repr__(self):\n return \"PostgresDb<engine={!r}>\".format(self._engine)\n\n\ndef handle_dynamic_token_authentication(engine: Engine,\n new_token: Callable[..., str],\n timeout: Union[float, int] = 600,\n **kwargs) -> None:\n last_token = [None]\n last_token_time = [0.0]\n\n @event.listens_for(engine, \"do_connect\")\n def override_new_connection(dialect, conn_rec, cargs, cparams):\n # Handle IAM authentication\n now = clock_gettime(CLOCK_REALTIME)\n if now - last_token_time[0] > timeout:\n last_token[0] = new_token(**kwargs)\n last_token_time[0] = now\n cparams[\"password\"] = last_token[0]\n\n\ndef _to_json(o):\n # Postgres <=9.5 doesn't support NaN and Infinity\n fixedup = jsonify_document(o)\n return json.dumps(fixedup, default=_json_fallback)\n\n\ndef _json_fallback(obj):\n \"\"\"Fallback json serialiser.\"\"\"\n raise TypeError(\"Type not serializable: {}\".format(type(obj)))\n", "path": "datacube/drivers/postgres/_connections.py"}], "after_files": [{"content": "# This file is part of the Open Data Cube, see https://opendatacube.org for more information\n#\n# Copyright (c) 2015-2020 ODC Contributors\n# SPDX-License-Identifier: Apache-2.0\n\n# We often have one-arg-per column, so these checks aren't so useful.\n# pylint: disable=too-many-arguments,too-many-public-methods\n\n# SQLAlchemy queries require \"column == None\", not \"column is None\" due to operator overloading:\n# pylint: disable=singleton-comparison\n\n\"\"\"\nPostgres connection and setup\n\"\"\"\nimport json\nimport logging\nimport os\nimport re\nfrom contextlib import contextmanager\nfrom typing import Callable, Optional, Union\n\nfrom sqlalchemy import event, create_engine, text\nfrom sqlalchemy.engine import Engine\nfrom sqlalchemy.engine.url import URL as EngineUrl\n\nimport datacube\nfrom datacube.index.exceptions import IndexSetupError\nfrom datacube.utils import jsonify_document\n\nfrom . import _api\nfrom . import _core\n\n_LIB_ID = 'agdc-' + str(datacube.__version__)\n\n_LOG = logging.getLogger(__name__)\n\ntry:\n import pwd\n\n DEFAULT_DB_USER = pwd.getpwuid(os.geteuid()).pw_name # type: Optional[str]\nexcept (ImportError, KeyError):\n # No default on Windows and some other systems\n DEFAULT_DB_USER = None\nDEFAULT_DB_PORT = 5432\nDEFAULT_IAM_AUTH = False\nDEFAULT_IAM_TIMEOUT = 600\n\n\nclass PostgresDb(object):\n \"\"\"\n A thin database access api.\n\n It exists so that higher level modules are not tied to SQLAlchemy, connections or specifics of database-access.\n\n (and can be unit tested without any actual databases)\n\n Thread safe: the only shared state is the (thread-safe) sqlalchemy connection pool.\n\n But not multiprocess safe once the first connections are made! A connection must not be shared between multiple\n processes. You can call close() before forking if you know no other threads currently hold connections,\n or else use a separate instance of this class in each process.\n \"\"\"\n\n def __init__(self, engine):\n # We don't recommend using this constructor directly as it may change.\n # Use static methods PostgresDb.create() or PostgresDb.from_config()\n self._engine = engine\n\n @classmethod\n def from_config(cls, config, application_name=None, validate_connection=True):\n app_name = cls._expand_app_name(application_name)\n\n return PostgresDb.create(\n config['db_hostname'],\n config['db_database'],\n config.get('db_username', DEFAULT_DB_USER),\n config.get('db_password', None),\n config.get('db_port', DEFAULT_DB_PORT),\n application_name=app_name,\n validate=validate_connection,\n iam_rds_auth=bool(config.get(\"db_iam_authentication\", DEFAULT_IAM_AUTH)),\n iam_rds_timeout=int(config.get(\"db_iam_timeout\", DEFAULT_IAM_TIMEOUT)),\n pool_timeout=int(config.get('db_connection_timeout', 60)),\n # pass config?\n )\n\n @classmethod\n def create(cls, hostname, database, username=None, password=None, port=None,\n application_name=None, validate=True,\n iam_rds_auth=False, iam_rds_timeout=600,\n # pass config?\n pool_timeout=60):\n mk_url = getattr(EngineUrl, 'create', EngineUrl)\n engine = cls._create_engine(\n mk_url(\n 'postgresql',\n host=hostname, database=database, port=port,\n username=username, password=password,\n ),\n application_name=application_name,\n iam_rds_auth=iam_rds_auth,\n iam_rds_timeout=iam_rds_timeout,\n pool_timeout=pool_timeout)\n if validate:\n if not _core.database_exists(engine):\n raise IndexSetupError('\\n\\nNo DB schema exists. Have you run init?\\n\\t{init_command}'.format(\n init_command='datacube system init'\n ))\n\n if not _core.schema_is_latest(engine):\n raise IndexSetupError(\n '\\n\\nDB schema is out of date. '\n 'An administrator must run init:\\n\\t{init_command}'.format(\n init_command='datacube -v system init'\n ))\n return PostgresDb(engine)\n\n @staticmethod\n def _create_engine(url, application_name=None, iam_rds_auth=False, iam_rds_timeout=600, pool_timeout=60):\n engine = create_engine(\n url,\n echo=False,\n echo_pool=False,\n\n # 'AUTOCOMMIT' here means READ-COMMITTED isolation level with autocommit on.\n # When a transaction is needed we will do an explicit begin/commit.\n isolation_level='AUTOCOMMIT',\n json_serializer=_to_json,\n # If a connection is idle for this many seconds, SQLAlchemy will renew it rather\n # than assuming it's still open. Allows servers to close idle connections without clients\n # getting errors.\n pool_recycle=pool_timeout,\n connect_args={'application_name': application_name}\n )\n\n if iam_rds_auth:\n from datacube.utils.aws import obtain_new_iam_auth_token\n handle_dynamic_token_authentication(engine, obtain_new_iam_auth_token, timeout=iam_rds_timeout, url=url)\n\n return engine\n\n @property\n def url(self) -> EngineUrl:\n return self._engine.url\n\n @staticmethod\n def get_db_username(config):\n try:\n return config['db_username']\n except KeyError:\n return DEFAULT_DB_USER\n\n def close(self):\n \"\"\"\n Close any idle connections in the pool.\n\n This is good practice if you are keeping this object in scope\n but wont be using it for a while.\n\n Connections should not be shared between processes, so this should be called\n before forking if the same instance will be used.\n\n (connections are normally closed automatically when this object is\n garbage collected)\n \"\"\"\n self._engine.dispose()\n\n @classmethod\n def _expand_app_name(cls, application_name):\n \"\"\"\n >>> PostgresDb._expand_app_name(None) #doctest: +ELLIPSIS\n 'agdc-...'\n >>> PostgresDb._expand_app_name('') #doctest: +ELLIPSIS\n 'agdc-...'\n >>> PostgresDb._expand_app_name('cli') #doctest: +ELLIPSIS\n 'cli agdc-...'\n >>> PostgresDb._expand_app_name('a b.c/d')\n 'a-b-c-d agdc-...'\n >>> PostgresDb._expand_app_name(5)\n Traceback (most recent call last):\n ...\n TypeError: Application name must be a string\n \"\"\"\n full_name = _LIB_ID\n if application_name:\n if not isinstance(application_name, str):\n raise TypeError('Application name must be a string')\n\n full_name = re.sub('[^0-9a-zA-Z]+', '-', application_name) + ' ' + full_name\n\n if len(full_name) > 64:\n _LOG.warning('Application name is too long: Truncating to %s chars', (64 - len(_LIB_ID) - 1))\n return full_name[-64:]\n\n def init(self, with_permissions=True):\n \"\"\"\n Init a new database (if not already set up).\n\n :return: If it was newly created.\n \"\"\"\n is_new = _core.ensure_db(self._engine, with_permissions=with_permissions)\n if not is_new:\n _core.update_schema(self._engine)\n\n return is_new\n\n @contextmanager\n def connect(self):\n \"\"\"\n Borrow a connection from the pool.\n\n The name connect() is misleading: it will not create a new connection if one is already available in the pool.\n\n Callers should minimise the amount of time they hold onto their connections. If they're doing anything between\n calls to the DB (such as opening files, or waiting on user input), it's better to return the connection\n to the pool beforehand.\n\n The connection can raise errors if not following this advice (\"server closed the connection unexpectedly\"),\n as some servers will aggressively close idle connections (eg. DEA's NCI servers). It also prevents the\n connection from being reused while borrowed.\n \"\"\"\n with self._engine.connect() as connection:\n yield _api.PostgresDbAPI(connection)\n connection.close()\n\n @contextmanager\n def begin(self):\n \"\"\"\n Start a transaction.\n\n Returns an instance that will maintain a single connection in a transaction.\n\n Call commit() or rollback() to complete the transaction or use a context manager:\n\n with db.begin() as trans:\n trans.insert_dataset(...)\n\n (Don't share an instance between threads)\n\n :rtype: PostgresDBAPI\n \"\"\"\n with self._engine.connect() as connection:\n connection.execute(text('BEGIN'))\n try:\n yield _api.PostgresDbAPI(connection)\n connection.execute(text('COMMIT'))\n except Exception: # pylint: disable=broad-except\n connection.execute(text('ROLLBACK'))\n raise\n finally:\n connection.close()\n\n def give_me_a_connection(self):\n return self._engine.connect()\n\n @classmethod\n def get_dataset_fields(cls, metadata_type_definition):\n return _api.get_dataset_fields(metadata_type_definition)\n\n def __repr__(self):\n return \"PostgresDb<engine={!r}>\".format(self._engine)\n\n\ndef handle_dynamic_token_authentication(engine: Engine,\n new_token: Callable[..., str],\n timeout: Union[float, int] = 600,\n **kwargs) -> None:\n last_token = [None]\n last_token_time = [0.0]\n\n @event.listens_for(engine, \"do_connect\")\n def override_new_connection(dialect, conn_rec, cargs, cparams):\n # Handle IAM authentication\n # Importing here because the function `clock_gettime` is not available on Windows\n # which shouldn't be a problem, because boto3 auth is mostly used on AWS.\n from time import clock_gettime, CLOCK_REALTIME\n\n now = clock_gettime(CLOCK_REALTIME)\n if now - last_token_time[0] > timeout:\n last_token[0] = new_token(**kwargs)\n last_token_time[0] = now\n cparams[\"password\"] = last_token[0]\n\n\ndef _to_json(o):\n # Postgres <=9.5 doesn't support NaN and Infinity\n fixedup = jsonify_document(o)\n return json.dumps(fixedup, default=_json_fallback)\n\n\ndef _json_fallback(obj):\n \"\"\"Fallback json serialiser.\"\"\"\n raise TypeError(\"Type not serializable: {}\".format(type(obj)))\n", "path": "datacube/drivers/postgres/_connections.py"}]} | 3,529 | 235 |
gh_patches_debug_3792 | rasdani/github-patches | git_diff | mitmproxy__mitmproxy-2875 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Mitmproxy crashes, when we try to write a value from file into grideditor cell
##### Steps to reproduce the problem:
1. Create a file. Put inside:
```
abc
```
2. Run mitmproxy.
3. Press `n` -> `Enter` -> `Enter` -> `e` -> `cookies`. You will get into `cookies` grideditor.
4. Press `a` -> `Esc` -> `r` (or `R`, also relevant for it).
5. Input the path to the file with `abc`: `console.grideditor.load` `~/your_file_with_abc` -> `Enter`.
You will see:
```
Traceback (most recent call last):
File "/home/kajoj/Mitmproxy/mitmproxy/mitmproxy/tools/console/master.py", line 216, in run
self.loop.run()
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 278, in run
self._run()
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 376, in _run
self.event_loop.run()
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 682, in run
self._loop()
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 710, in _loop
self._entering_idle()
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 671, in _entering_idle
callback()
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 564, in entering_idle
self.draw_screen()
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 578, in draw_screen
canvas = self._topmost_widget.render(self.screen_size, focus=True)
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py", line 141, in cached_render
canv = fn(self, size, focus=focus)
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/container.py", line 1083, in render
focus and self.focus_part == 'body')
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py", line 141, in cached_render
canv = fn(self, size, focus=focus)
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/decoration.py", line 225, in render
canv = self._original_widget.render(size, focus=focus)
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py", line 141, in cached_render
canv = fn(self, size, focus=focus)
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/container.py", line 1083, in render
focus and self.focus_part == 'body')
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py", line 141, in cached_render
canv = fn(self, size, focus=focus)
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py", line 1750, in render
canv = get_delegate(self).render(size, focus=focus)
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py", line 141, in cached_render
canv = fn(self, size, focus=focus)
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py", line 1750, in render
canv = get_delegate(self).render(size, focus=focus)
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py", line 141, in cached_render
canv = fn(self, size, focus=focus)
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/container.py", line 1083, in render
focus and self.focus_part == 'body')
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py", line 141, in cached_render
canv = fn(self, size, focus=focus)
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/listbox.py", line 455, in render
(maxcol, maxrow), focus=focus)
File "/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/listbox.py", line 340, in calculate_visible
focus_widget, focus_pos = self.body.get_focus()
File "/home/kajoj/Mitmproxy/mitmproxy/mitmproxy/tools/console/grideditor/base.py", line 227, in get_focus
self.lst[self.focus]
File "/home/kajoj/Mitmproxy/mitmproxy/mitmproxy/tools/console/grideditor/base.py", line 83, in __init__
w = self.editor.columns[i].Display(v)
File "/home/kajoj/Mitmproxy/mitmproxy/mitmproxy/tools/console/grideditor/col_text.py", line 18, in Display
return TDisplay(data, self.encoding_args)
File "/home/kajoj/Mitmproxy/mitmproxy/mitmproxy/tools/console/grideditor/col_text.py", line 31, in __init__
super().__init__(data.encode(*self.encoding_args))
AttributeError: 'bytes' object has no attribute 'encode'
```
##### Any other comments? What have you tried so far?
This bug is relevant for cookies, form, path, query and set-cookies grideditors.
I didn't check carefully, but it seems to be relevant for v2.0.2 as well.
##### System information
Mitmproxy: 3.0.0.dev113 (commit 93425d4)
Python: 3.5.2
OpenSSL: OpenSSL 1.1.0g 2 Nov 2017
Platform: Linux-4.4.0-112-generic-x86_64-with-Ubuntu-16.04-xenial
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mitmproxy/tools/console/grideditor/col_text.py`
Content:
```
1 """
2 Welcome to the encoding dance!
3
4 In a nutshell, text columns are actually a proxy class for byte columns,
5 which just encode/decodes contents.
6 """
7
8 from mitmproxy.tools.console import signals
9 from mitmproxy.tools.console.grideditor import col_bytes
10
11
12 class Column(col_bytes.Column):
13 def __init__(self, heading, encoding="utf8", errors="surrogateescape"):
14 super().__init__(heading)
15 self.encoding_args = encoding, errors
16
17 def Display(self, data):
18 return TDisplay(data, self.encoding_args)
19
20 def Edit(self, data):
21 return TEdit(data, self.encoding_args)
22
23 def blank(self):
24 return ""
25
26
27 # This is the same for both edit and display.
28 class EncodingMixin:
29 def __init__(self, data, encoding_args):
30 self.encoding_args = encoding_args
31 super().__init__(data.encode(*self.encoding_args))
32
33 def get_data(self):
34 data = super().get_data()
35 try:
36 return data.decode(*self.encoding_args)
37 except ValueError:
38 signals.status_message.send(
39 self,
40 message="Invalid encoding.",
41 expire=1000
42 )
43 raise
44
45
46 # urwid forces a different name for a subclass.
47 class TDisplay(EncodingMixin, col_bytes.Display):
48 pass
49
50
51 class TEdit(EncodingMixin, col_bytes.Edit):
52 pass
53
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mitmproxy/tools/console/grideditor/col_text.py b/mitmproxy/tools/console/grideditor/col_text.py
--- a/mitmproxy/tools/console/grideditor/col_text.py
+++ b/mitmproxy/tools/console/grideditor/col_text.py
@@ -28,7 +28,7 @@
class EncodingMixin:
def __init__(self, data, encoding_args):
self.encoding_args = encoding_args
- super().__init__(data.encode(*self.encoding_args))
+ super().__init__(data.__str__().encode(*self.encoding_args))
def get_data(self):
data = super().get_data()
| {"golden_diff": "diff --git a/mitmproxy/tools/console/grideditor/col_text.py b/mitmproxy/tools/console/grideditor/col_text.py\n--- a/mitmproxy/tools/console/grideditor/col_text.py\n+++ b/mitmproxy/tools/console/grideditor/col_text.py\n@@ -28,7 +28,7 @@\n class EncodingMixin:\n def __init__(self, data, encoding_args):\n self.encoding_args = encoding_args\n- super().__init__(data.encode(*self.encoding_args))\n+ super().__init__(data.__str__().encode(*self.encoding_args))\n \n def get_data(self):\n data = super().get_data()\n", "issue": "Mitmproxy crashes, when we try to write a value from file into grideditor cell\n##### Steps to reproduce the problem:\r\n\r\n1. Create a file. Put inside:\r\n```\r\nabc\r\n```\r\n2. Run mitmproxy.\r\n3. Press `n` -> `Enter` -> `Enter` -> `e` -> `cookies`. You will get into `cookies` grideditor.\r\n4. Press `a` -> `Esc` -> `r` (or `R`, also relevant for it).\r\n5. Input the path to the file with `abc`: `console.grideditor.load` `~/your_file_with_abc` -> `Enter`.\r\n\r\nYou will see:\r\n```\r\nTraceback (most recent call last):\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/mitmproxy/tools/console/master.py\", line 216, in run\r\n self.loop.run()\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py\", line 278, in run\r\n self._run()\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py\", line 376, in _run\r\n self.event_loop.run()\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py\", line 682, in run\r\n self._loop()\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py\", line 710, in _loop\r\n self._entering_idle()\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py\", line 671, in _entering_idle\r\n callback()\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py\", line 564, in entering_idle\r\n self.draw_screen()\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py\", line 578, in draw_screen\r\n canvas = self._topmost_widget.render(self.screen_size, focus=True)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py\", line 141, in cached_render\r\n canv = fn(self, size, focus=focus)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/container.py\", line 1083, in render\r\n focus and self.focus_part == 'body')\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py\", line 141, in cached_render\r\n canv = fn(self, size, focus=focus)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/decoration.py\", line 225, in render\r\n canv = self._original_widget.render(size, focus=focus)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py\", line 141, in cached_render\r\n canv = fn(self, size, focus=focus)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/container.py\", line 1083, in render\r\n focus and self.focus_part == 'body')\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py\", line 141, in cached_render\r\n canv = fn(self, size, focus=focus)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py\", line 1750, in render\r\n canv = get_delegate(self).render(size, focus=focus)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py\", line 141, in cached_render\r\n canv = fn(self, size, focus=focus)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py\", line 1750, in render\r\n canv = get_delegate(self).render(size, focus=focus)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py\", line 141, in cached_render\r\n canv = fn(self, size, focus=focus)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/container.py\", line 1083, in render\r\n focus and self.focus_part == 'body')\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/widget.py\", line 141, in cached_render\r\n canv = fn(self, size, focus=focus)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/listbox.py\", line 455, in render\r\n (maxcol, maxrow), focus=focus)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/venv/lib/python3.5/site-packages/urwid/listbox.py\", line 340, in calculate_visible\r\n focus_widget, focus_pos = self.body.get_focus()\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/mitmproxy/tools/console/grideditor/base.py\", line 227, in get_focus\r\n self.lst[self.focus]\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/mitmproxy/tools/console/grideditor/base.py\", line 83, in __init__\r\n w = self.editor.columns[i].Display(v)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/mitmproxy/tools/console/grideditor/col_text.py\", line 18, in Display\r\n return TDisplay(data, self.encoding_args)\r\n File \"/home/kajoj/Mitmproxy/mitmproxy/mitmproxy/tools/console/grideditor/col_text.py\", line 31, in __init__\r\n super().__init__(data.encode(*self.encoding_args))\r\nAttributeError: 'bytes' object has no attribute 'encode'\r\n```\r\n\r\n##### Any other comments? What have you tried so far?\r\n\r\nThis bug is relevant for cookies, form, path, query and set-cookies grideditors.\r\nI didn't check carefully, but it seems to be relevant for v2.0.2 as well.\r\n\r\n\r\n##### System information\r\n\r\nMitmproxy: 3.0.0.dev113 (commit 93425d4) \r\nPython: 3.5.2\r\nOpenSSL: OpenSSL 1.1.0g 2 Nov 2017\r\nPlatform: Linux-4.4.0-112-generic-x86_64-with-Ubuntu-16.04-xenial\r\n\r\n\n", "before_files": [{"content": "\"\"\"\nWelcome to the encoding dance!\n\nIn a nutshell, text columns are actually a proxy class for byte columns,\nwhich just encode/decodes contents.\n\"\"\"\n\nfrom mitmproxy.tools.console import signals\nfrom mitmproxy.tools.console.grideditor import col_bytes\n\n\nclass Column(col_bytes.Column):\n def __init__(self, heading, encoding=\"utf8\", errors=\"surrogateescape\"):\n super().__init__(heading)\n self.encoding_args = encoding, errors\n\n def Display(self, data):\n return TDisplay(data, self.encoding_args)\n\n def Edit(self, data):\n return TEdit(data, self.encoding_args)\n\n def blank(self):\n return \"\"\n\n\n# This is the same for both edit and display.\nclass EncodingMixin:\n def __init__(self, data, encoding_args):\n self.encoding_args = encoding_args\n super().__init__(data.encode(*self.encoding_args))\n\n def get_data(self):\n data = super().get_data()\n try:\n return data.decode(*self.encoding_args)\n except ValueError:\n signals.status_message.send(\n self,\n message=\"Invalid encoding.\",\n expire=1000\n )\n raise\n\n\n# urwid forces a different name for a subclass.\nclass TDisplay(EncodingMixin, col_bytes.Display):\n pass\n\n\nclass TEdit(EncodingMixin, col_bytes.Edit):\n pass\n", "path": "mitmproxy/tools/console/grideditor/col_text.py"}], "after_files": [{"content": "\"\"\"\nWelcome to the encoding dance!\n\nIn a nutshell, text columns are actually a proxy class for byte columns,\nwhich just encode/decodes contents.\n\"\"\"\n\nfrom mitmproxy.tools.console import signals\nfrom mitmproxy.tools.console.grideditor import col_bytes\n\n\nclass Column(col_bytes.Column):\n def __init__(self, heading, encoding=\"utf8\", errors=\"surrogateescape\"):\n super().__init__(heading)\n self.encoding_args = encoding, errors\n\n def Display(self, data):\n return TDisplay(data, self.encoding_args)\n\n def Edit(self, data):\n return TEdit(data, self.encoding_args)\n\n def blank(self):\n return \"\"\n\n\n# This is the same for both edit and display.\nclass EncodingMixin:\n def __init__(self, data, encoding_args):\n self.encoding_args = encoding_args\n super().__init__(data.__str__().encode(*self.encoding_args))\n\n def get_data(self):\n data = super().get_data()\n try:\n return data.decode(*self.encoding_args)\n except ValueError:\n signals.status_message.send(\n self,\n message=\"Invalid encoding.\",\n expire=1000\n )\n raise\n\n\n# urwid forces a different name for a subclass.\nclass TDisplay(EncodingMixin, col_bytes.Display):\n pass\n\n\nclass TEdit(EncodingMixin, col_bytes.Edit):\n pass\n", "path": "mitmproxy/tools/console/grideditor/col_text.py"}]} | 2,274 | 137 |
gh_patches_debug_19618 | rasdani/github-patches | git_diff | getsentry__sentry-9691 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Member roles cannot delete own saved search
Users with their role as `Member` cannot delete/remove their own Saved Search:
- Members can save a search query and it will only be shown to the Member that created it.
- Members cannot delete their own saved search.
- Other users cannot see another Member's saved search, not even Admins, Managers, or Owners. Since no one else can see another Member's saved search, Admins and above cannot delete them either.
- If a Member is updated to a different role, they cannot see their own saved searches unless they're back as a Member.
cc @getsentry/workflow (fyi @getsentry/cops)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/sentry/api/endpoints/project_search_details.py`
Content:
```
1 from __future__ import absolute_import
2
3 from rest_framework import serializers
4 from rest_framework.response import Response
5
6 from sentry.api.bases.project import ProjectEndpoint, RelaxedSearchPermission
7 from sentry.api.exceptions import ResourceDoesNotExist
8 from sentry.api.serializers import serialize
9 from sentry.models import SavedSearch, SavedSearchUserDefault
10
11
12 class LimitedSavedSearchSerializer(serializers.Serializer):
13 isUserDefault = serializers.BooleanField(required=False)
14
15
16 class SavedSearchSerializer(serializers.Serializer):
17 name = serializers.CharField(max_length=128, required=True)
18 query = serializers.CharField(required=True)
19 isDefault = serializers.BooleanField(required=False)
20 isUserDefault = serializers.BooleanField(required=False)
21
22
23 class ProjectSearchDetailsEndpoint(ProjectEndpoint):
24 permission_classes = (RelaxedSearchPermission, )
25
26 def get(self, request, project, search_id):
27 """
28 Retrieve a saved search
29
30 Return details on an individual saved search.
31
32 {method} {path}
33
34 """
35 try:
36 search = SavedSearch.objects.get(
37 project=project,
38 id=search_id,
39 )
40 except SavedSearch.DoesNotExist:
41 raise ResourceDoesNotExist
42
43 return Response(serialize(search, request.user))
44
45 def put(self, request, project, search_id):
46 """
47 Update a saved search
48
49 Update a saved search.
50
51 {method} {path}
52 {{
53 "name: "Unresolved",
54 "query": "is:unresolved",
55 "dateSavedSearchd": "2015-05-11T02:23:10Z"
56 }}
57
58 """
59 try:
60 search = SavedSearch.objects.get(
61 project=project,
62 id=search_id,
63 )
64 except SavedSearch.DoesNotExist:
65 raise ResourceDoesNotExist
66
67 has_team_scope = any(
68 request.access.has_team_scope(team, 'project:write') for team in project.teams.all()
69 )
70 if has_team_scope:
71 serializer = SavedSearchSerializer(data=request.DATA, partial=True)
72 else:
73 serializer = LimitedSavedSearchSerializer(data=request.DATA, partial=True)
74
75 if not serializer.is_valid():
76 return Response(serializer.errors, status=400)
77
78 result = serializer.object
79
80 kwargs = {}
81 if result.get('name'):
82 kwargs['name'] = result['name']
83 if result.get('query'):
84 kwargs['query'] = result['query']
85 if result.get('isDefault'):
86 kwargs['is_default'] = result['isDefault']
87
88 if kwargs:
89 search.update(**kwargs)
90
91 if result.get('isDefault'):
92 SavedSearch.objects.filter(
93 project=project,
94 ).exclude(id=search_id).update(is_default=False)
95
96 if result.get('isUserDefault'):
97 SavedSearchUserDefault.objects.create_or_update(
98 user=request.user, project=project, values={
99 'savedsearch': search,
100 }
101 )
102
103 return Response(serialize(search, request.user))
104
105 def delete(self, request, project, search_id):
106 """
107 Delete a saved search
108
109 Permanently remove a saved search.
110
111 {method} {path}
112
113 """
114 try:
115 search = SavedSearch.objects.get(
116 project=project,
117 id=search_id,
118 )
119 except SavedSearch.DoesNotExist:
120 raise ResourceDoesNotExist
121
122 search.delete()
123
124 return Response(status=204)
125
```
Path: `src/sentry/api/bases/project.py`
Content:
```
1 from __future__ import absolute_import
2
3 from rest_framework.response import Response
4
5 from sentry import roles
6 from sentry.api.base import Endpoint
7 from sentry.api.exceptions import ResourceDoesNotExist, ProjectMoved
8 from sentry.app import raven
9 from sentry.auth.superuser import is_active_superuser
10 from sentry.models import OrganizationMember, Project, ProjectStatus, ProjectRedirect
11
12 from .organization import OrganizationPermission
13 from .team import has_team_permission
14
15
16 class ProjectPermission(OrganizationPermission):
17 scope_map = {
18 'GET': ['project:read', 'project:write', 'project:admin'],
19 'POST': ['project:write', 'project:admin'],
20 'PUT': ['project:write', 'project:admin'],
21 'DELETE': ['project:admin'],
22 }
23
24 def has_object_permission(self, request, view, project):
25 result = super(ProjectPermission,
26 self).has_object_permission(request, view, project.organization)
27
28 if not result:
29 return result
30
31 if project.teams.exists():
32 return any(
33 has_team_permission(request, team, self.scope_map) for team in project.teams.all()
34 )
35 elif request.user.is_authenticated():
36 # this is only for team-less projects
37 if is_active_superuser(request):
38 return True
39 try:
40 role = OrganizationMember.objects.filter(
41 organization=project.organization,
42 user=request.user,
43 ).values_list('role', flat=True).get()
44 except OrganizationMember.DoesNotExist:
45 # this should probably never happen?
46 return False
47
48 return roles.get(role).is_global
49
50 return False
51
52
53 class StrictProjectPermission(ProjectPermission):
54 scope_map = {
55 'GET': ['project:write', 'project:admin'],
56 'POST': ['project:write', 'project:admin'],
57 'PUT': ['project:write', 'project:admin'],
58 'DELETE': ['project:admin'],
59 }
60
61
62 class ProjectReleasePermission(ProjectPermission):
63 scope_map = {
64 'GET': ['project:read', 'project:write', 'project:admin', 'project:releases'],
65 'POST': ['project:write', 'project:admin', 'project:releases'],
66 'PUT': ['project:write', 'project:admin', 'project:releases'],
67 'DELETE': ['project:admin', 'project:releases'],
68 }
69
70
71 class ProjectEventPermission(ProjectPermission):
72 scope_map = {
73 'GET': ['event:read', 'event:write', 'event:admin'],
74 'POST': ['event:write', 'event:admin'],
75 'PUT': ['event:write', 'event:admin'],
76 'DELETE': ['event:admin'],
77 }
78
79
80 class ProjectSettingPermission(ProjectPermission):
81 scope_map = {
82 'GET': ['project:read', 'project:write', 'project:admin'],
83 'POST': ['project:write', 'project:admin'],
84 'PUT': ['project:write', 'project:admin'],
85 'DELETE': ['project:write', 'project:admin'],
86 }
87
88
89 class RelaxedSearchPermission(ProjectPermission):
90 scope_map = {
91 'GET': ['project:read', 'project:write', 'project:admin'],
92 # members can do writes
93 'POST': ['project:write', 'project:admin', 'project:read'],
94 'PUT': ['project:write', 'project:admin', 'project:read'],
95 'DELETE': ['project:admin'],
96 }
97
98
99 class ProjectEndpoint(Endpoint):
100 permission_classes = (ProjectPermission, )
101
102 def convert_args(self, request, organization_slug, project_slug, *args, **kwargs):
103 try:
104 project = Project.objects.filter(
105 organization__slug=organization_slug,
106 slug=project_slug,
107 ).select_related('organization').prefetch_related('teams').get()
108 except Project.DoesNotExist:
109 try:
110 # Project may have been renamed
111 redirect = ProjectRedirect.objects.select_related('project')
112 redirect = redirect.get(
113 organization__slug=organization_slug,
114 redirect_slug=project_slug
115 )
116
117 # get full path so that we keep query strings
118 requested_url = request.get_full_path()
119 new_url = requested_url.replace(
120 'projects/%s/%s/' %
121 (organization_slug, project_slug), 'projects/%s/%s/' %
122 (organization_slug, redirect.project.slug))
123
124 # Resource was moved/renamed if the requested url is different than the new url
125 if requested_url != new_url:
126 raise ProjectMoved(new_url, redirect.project.slug)
127
128 # otherwise project doesn't exist
129 raise ResourceDoesNotExist
130 except ProjectRedirect.DoesNotExist:
131 raise ResourceDoesNotExist
132
133 if project.status != ProjectStatus.VISIBLE:
134 raise ResourceDoesNotExist
135
136 self.check_object_permissions(request, project)
137
138 raven.tags_context({
139 'project': project.id,
140 'organization': project.organization_id,
141 })
142
143 request._request.organization = project.organization
144
145 kwargs['project'] = project
146 return (args, kwargs)
147
148 def handle_exception(self, request, exc):
149 if isinstance(exc, ProjectMoved):
150 response = Response({
151 'slug': exc.detail['extra']['slug'],
152 'detail': exc.detail
153 }, status=exc.status_code)
154 response['Location'] = exc.detail['extra']['url']
155 return response
156 return super(ProjectEndpoint, self).handle_exception(request, exc)
157
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/sentry/api/bases/project.py b/src/sentry/api/bases/project.py
--- a/src/sentry/api/bases/project.py
+++ b/src/sentry/api/bases/project.py
@@ -92,7 +92,8 @@
# members can do writes
'POST': ['project:write', 'project:admin', 'project:read'],
'PUT': ['project:write', 'project:admin', 'project:read'],
- 'DELETE': ['project:admin'],
+ # members can delete their own searches
+ 'DELETE': ['project:read', 'project:write', 'project:admin'],
}
diff --git a/src/sentry/api/endpoints/project_search_details.py b/src/sentry/api/endpoints/project_search_details.py
--- a/src/sentry/api/endpoints/project_search_details.py
+++ b/src/sentry/api/endpoints/project_search_details.py
@@ -119,6 +119,14 @@
except SavedSearch.DoesNotExist:
raise ResourceDoesNotExist
- search.delete()
+ is_search_owner = request.user and request.user == search.owner
- return Response(status=204)
+ if request.access.has_scope('project:write'):
+ if not search.owner or is_search_owner:
+ search.delete()
+ return Response(status=204)
+ elif is_search_owner:
+ search.delete()
+ return Response(status=204)
+
+ return Response(status=403)
| {"golden_diff": "diff --git a/src/sentry/api/bases/project.py b/src/sentry/api/bases/project.py\n--- a/src/sentry/api/bases/project.py\n+++ b/src/sentry/api/bases/project.py\n@@ -92,7 +92,8 @@\n # members can do writes\n 'POST': ['project:write', 'project:admin', 'project:read'],\n 'PUT': ['project:write', 'project:admin', 'project:read'],\n- 'DELETE': ['project:admin'],\n+ # members can delete their own searches\n+ 'DELETE': ['project:read', 'project:write', 'project:admin'],\n }\n \n \ndiff --git a/src/sentry/api/endpoints/project_search_details.py b/src/sentry/api/endpoints/project_search_details.py\n--- a/src/sentry/api/endpoints/project_search_details.py\n+++ b/src/sentry/api/endpoints/project_search_details.py\n@@ -119,6 +119,14 @@\n except SavedSearch.DoesNotExist:\n raise ResourceDoesNotExist\n \n- search.delete()\n+ is_search_owner = request.user and request.user == search.owner\n \n- return Response(status=204)\n+ if request.access.has_scope('project:write'):\n+ if not search.owner or is_search_owner:\n+ search.delete()\n+ return Response(status=204)\n+ elif is_search_owner:\n+ search.delete()\n+ return Response(status=204)\n+\n+ return Response(status=403)\n", "issue": "Member roles cannot delete own saved search\nUsers with their role as `Member` cannot delete/remove their own Saved Search: \r\n\r\n- Members can save a search query and it will only be shown to the Member that created it.\r\n- Members cannot delete their own saved search.\r\n- Other users cannot see another Member's saved search, not even Admins, Managers, or Owners. Since no one else can see another Member's saved search, Admins and above cannot delete them either.\r\n- If a Member is updated to a different role, they cannot see their own saved searches unless they're back as a Member.\r\n\r\ncc @getsentry/workflow (fyi @getsentry/cops) \n", "before_files": [{"content": "from __future__ import absolute_import\n\nfrom rest_framework import serializers\nfrom rest_framework.response import Response\n\nfrom sentry.api.bases.project import ProjectEndpoint, RelaxedSearchPermission\nfrom sentry.api.exceptions import ResourceDoesNotExist\nfrom sentry.api.serializers import serialize\nfrom sentry.models import SavedSearch, SavedSearchUserDefault\n\n\nclass LimitedSavedSearchSerializer(serializers.Serializer):\n isUserDefault = serializers.BooleanField(required=False)\n\n\nclass SavedSearchSerializer(serializers.Serializer):\n name = serializers.CharField(max_length=128, required=True)\n query = serializers.CharField(required=True)\n isDefault = serializers.BooleanField(required=False)\n isUserDefault = serializers.BooleanField(required=False)\n\n\nclass ProjectSearchDetailsEndpoint(ProjectEndpoint):\n permission_classes = (RelaxedSearchPermission, )\n\n def get(self, request, project, search_id):\n \"\"\"\n Retrieve a saved search\n\n Return details on an individual saved search.\n\n {method} {path}\n\n \"\"\"\n try:\n search = SavedSearch.objects.get(\n project=project,\n id=search_id,\n )\n except SavedSearch.DoesNotExist:\n raise ResourceDoesNotExist\n\n return Response(serialize(search, request.user))\n\n def put(self, request, project, search_id):\n \"\"\"\n Update a saved search\n\n Update a saved search.\n\n {method} {path}\n {{\n \"name: \"Unresolved\",\n \"query\": \"is:unresolved\",\n \"dateSavedSearchd\": \"2015-05-11T02:23:10Z\"\n }}\n\n \"\"\"\n try:\n search = SavedSearch.objects.get(\n project=project,\n id=search_id,\n )\n except SavedSearch.DoesNotExist:\n raise ResourceDoesNotExist\n\n has_team_scope = any(\n request.access.has_team_scope(team, 'project:write') for team in project.teams.all()\n )\n if has_team_scope:\n serializer = SavedSearchSerializer(data=request.DATA, partial=True)\n else:\n serializer = LimitedSavedSearchSerializer(data=request.DATA, partial=True)\n\n if not serializer.is_valid():\n return Response(serializer.errors, status=400)\n\n result = serializer.object\n\n kwargs = {}\n if result.get('name'):\n kwargs['name'] = result['name']\n if result.get('query'):\n kwargs['query'] = result['query']\n if result.get('isDefault'):\n kwargs['is_default'] = result['isDefault']\n\n if kwargs:\n search.update(**kwargs)\n\n if result.get('isDefault'):\n SavedSearch.objects.filter(\n project=project,\n ).exclude(id=search_id).update(is_default=False)\n\n if result.get('isUserDefault'):\n SavedSearchUserDefault.objects.create_or_update(\n user=request.user, project=project, values={\n 'savedsearch': search,\n }\n )\n\n return Response(serialize(search, request.user))\n\n def delete(self, request, project, search_id):\n \"\"\"\n Delete a saved search\n\n Permanently remove a saved search.\n\n {method} {path}\n\n \"\"\"\n try:\n search = SavedSearch.objects.get(\n project=project,\n id=search_id,\n )\n except SavedSearch.DoesNotExist:\n raise ResourceDoesNotExist\n\n search.delete()\n\n return Response(status=204)\n", "path": "src/sentry/api/endpoints/project_search_details.py"}, {"content": "from __future__ import absolute_import\n\nfrom rest_framework.response import Response\n\nfrom sentry import roles\nfrom sentry.api.base import Endpoint\nfrom sentry.api.exceptions import ResourceDoesNotExist, ProjectMoved\nfrom sentry.app import raven\nfrom sentry.auth.superuser import is_active_superuser\nfrom sentry.models import OrganizationMember, Project, ProjectStatus, ProjectRedirect\n\nfrom .organization import OrganizationPermission\nfrom .team import has_team_permission\n\n\nclass ProjectPermission(OrganizationPermission):\n scope_map = {\n 'GET': ['project:read', 'project:write', 'project:admin'],\n 'POST': ['project:write', 'project:admin'],\n 'PUT': ['project:write', 'project:admin'],\n 'DELETE': ['project:admin'],\n }\n\n def has_object_permission(self, request, view, project):\n result = super(ProjectPermission,\n self).has_object_permission(request, view, project.organization)\n\n if not result:\n return result\n\n if project.teams.exists():\n return any(\n has_team_permission(request, team, self.scope_map) for team in project.teams.all()\n )\n elif request.user.is_authenticated():\n # this is only for team-less projects\n if is_active_superuser(request):\n return True\n try:\n role = OrganizationMember.objects.filter(\n organization=project.organization,\n user=request.user,\n ).values_list('role', flat=True).get()\n except OrganizationMember.DoesNotExist:\n # this should probably never happen?\n return False\n\n return roles.get(role).is_global\n\n return False\n\n\nclass StrictProjectPermission(ProjectPermission):\n scope_map = {\n 'GET': ['project:write', 'project:admin'],\n 'POST': ['project:write', 'project:admin'],\n 'PUT': ['project:write', 'project:admin'],\n 'DELETE': ['project:admin'],\n }\n\n\nclass ProjectReleasePermission(ProjectPermission):\n scope_map = {\n 'GET': ['project:read', 'project:write', 'project:admin', 'project:releases'],\n 'POST': ['project:write', 'project:admin', 'project:releases'],\n 'PUT': ['project:write', 'project:admin', 'project:releases'],\n 'DELETE': ['project:admin', 'project:releases'],\n }\n\n\nclass ProjectEventPermission(ProjectPermission):\n scope_map = {\n 'GET': ['event:read', 'event:write', 'event:admin'],\n 'POST': ['event:write', 'event:admin'],\n 'PUT': ['event:write', 'event:admin'],\n 'DELETE': ['event:admin'],\n }\n\n\nclass ProjectSettingPermission(ProjectPermission):\n scope_map = {\n 'GET': ['project:read', 'project:write', 'project:admin'],\n 'POST': ['project:write', 'project:admin'],\n 'PUT': ['project:write', 'project:admin'],\n 'DELETE': ['project:write', 'project:admin'],\n }\n\n\nclass RelaxedSearchPermission(ProjectPermission):\n scope_map = {\n 'GET': ['project:read', 'project:write', 'project:admin'],\n # members can do writes\n 'POST': ['project:write', 'project:admin', 'project:read'],\n 'PUT': ['project:write', 'project:admin', 'project:read'],\n 'DELETE': ['project:admin'],\n }\n\n\nclass ProjectEndpoint(Endpoint):\n permission_classes = (ProjectPermission, )\n\n def convert_args(self, request, organization_slug, project_slug, *args, **kwargs):\n try:\n project = Project.objects.filter(\n organization__slug=organization_slug,\n slug=project_slug,\n ).select_related('organization').prefetch_related('teams').get()\n except Project.DoesNotExist:\n try:\n # Project may have been renamed\n redirect = ProjectRedirect.objects.select_related('project')\n redirect = redirect.get(\n organization__slug=organization_slug,\n redirect_slug=project_slug\n )\n\n # get full path so that we keep query strings\n requested_url = request.get_full_path()\n new_url = requested_url.replace(\n 'projects/%s/%s/' %\n (organization_slug, project_slug), 'projects/%s/%s/' %\n (organization_slug, redirect.project.slug))\n\n # Resource was moved/renamed if the requested url is different than the new url\n if requested_url != new_url:\n raise ProjectMoved(new_url, redirect.project.slug)\n\n # otherwise project doesn't exist\n raise ResourceDoesNotExist\n except ProjectRedirect.DoesNotExist:\n raise ResourceDoesNotExist\n\n if project.status != ProjectStatus.VISIBLE:\n raise ResourceDoesNotExist\n\n self.check_object_permissions(request, project)\n\n raven.tags_context({\n 'project': project.id,\n 'organization': project.organization_id,\n })\n\n request._request.organization = project.organization\n\n kwargs['project'] = project\n return (args, kwargs)\n\n def handle_exception(self, request, exc):\n if isinstance(exc, ProjectMoved):\n response = Response({\n 'slug': exc.detail['extra']['slug'],\n 'detail': exc.detail\n }, status=exc.status_code)\n response['Location'] = exc.detail['extra']['url']\n return response\n return super(ProjectEndpoint, self).handle_exception(request, exc)\n", "path": "src/sentry/api/bases/project.py"}], "after_files": [{"content": "from __future__ import absolute_import\n\nfrom rest_framework import serializers\nfrom rest_framework.response import Response\n\nfrom sentry.api.bases.project import ProjectEndpoint, RelaxedSearchPermission\nfrom sentry.api.exceptions import ResourceDoesNotExist\nfrom sentry.api.serializers import serialize\nfrom sentry.models import SavedSearch, SavedSearchUserDefault\n\n\nclass LimitedSavedSearchSerializer(serializers.Serializer):\n isUserDefault = serializers.BooleanField(required=False)\n\n\nclass SavedSearchSerializer(serializers.Serializer):\n name = serializers.CharField(max_length=128, required=True)\n query = serializers.CharField(required=True)\n isDefault = serializers.BooleanField(required=False)\n isUserDefault = serializers.BooleanField(required=False)\n\n\nclass ProjectSearchDetailsEndpoint(ProjectEndpoint):\n permission_classes = (RelaxedSearchPermission, )\n\n def get(self, request, project, search_id):\n \"\"\"\n Retrieve a saved search\n\n Return details on an individual saved search.\n\n {method} {path}\n\n \"\"\"\n try:\n search = SavedSearch.objects.get(\n project=project,\n id=search_id,\n )\n except SavedSearch.DoesNotExist:\n raise ResourceDoesNotExist\n\n return Response(serialize(search, request.user))\n\n def put(self, request, project, search_id):\n \"\"\"\n Update a saved search\n\n Update a saved search.\n\n {method} {path}\n {{\n \"name: \"Unresolved\",\n \"query\": \"is:unresolved\",\n \"dateSavedSearchd\": \"2015-05-11T02:23:10Z\"\n }}\n\n \"\"\"\n try:\n search = SavedSearch.objects.get(\n project=project,\n id=search_id,\n )\n except SavedSearch.DoesNotExist:\n raise ResourceDoesNotExist\n\n has_team_scope = any(\n request.access.has_team_scope(team, 'project:write') for team in project.teams.all()\n )\n if has_team_scope:\n serializer = SavedSearchSerializer(data=request.DATA, partial=True)\n else:\n serializer = LimitedSavedSearchSerializer(data=request.DATA, partial=True)\n\n if not serializer.is_valid():\n return Response(serializer.errors, status=400)\n\n result = serializer.object\n\n kwargs = {}\n if result.get('name'):\n kwargs['name'] = result['name']\n if result.get('query'):\n kwargs['query'] = result['query']\n if result.get('isDefault'):\n kwargs['is_default'] = result['isDefault']\n\n if kwargs:\n search.update(**kwargs)\n\n if result.get('isDefault'):\n SavedSearch.objects.filter(\n project=project,\n ).exclude(id=search_id).update(is_default=False)\n\n if result.get('isUserDefault'):\n SavedSearchUserDefault.objects.create_or_update(\n user=request.user, project=project, values={\n 'savedsearch': search,\n }\n )\n\n return Response(serialize(search, request.user))\n\n def delete(self, request, project, search_id):\n \"\"\"\n Delete a saved search\n\n Permanently remove a saved search.\n\n {method} {path}\n\n \"\"\"\n try:\n search = SavedSearch.objects.get(\n project=project,\n id=search_id,\n )\n except SavedSearch.DoesNotExist:\n raise ResourceDoesNotExist\n\n is_search_owner = request.user and request.user == search.owner\n\n if request.access.has_scope('project:write'):\n if not search.owner or is_search_owner:\n search.delete()\n return Response(status=204)\n elif is_search_owner:\n search.delete()\n return Response(status=204)\n\n return Response(status=403)\n", "path": "src/sentry/api/endpoints/project_search_details.py"}, {"content": "from __future__ import absolute_import\n\nfrom rest_framework.response import Response\n\nfrom sentry import roles\nfrom sentry.api.base import Endpoint\nfrom sentry.api.exceptions import ResourceDoesNotExist, ProjectMoved\nfrom sentry.app import raven\nfrom sentry.auth.superuser import is_active_superuser\nfrom sentry.models import OrganizationMember, Project, ProjectStatus, ProjectRedirect\n\nfrom .organization import OrganizationPermission\nfrom .team import has_team_permission\n\n\nclass ProjectPermission(OrganizationPermission):\n scope_map = {\n 'GET': ['project:read', 'project:write', 'project:admin'],\n 'POST': ['project:write', 'project:admin'],\n 'PUT': ['project:write', 'project:admin'],\n 'DELETE': ['project:admin'],\n }\n\n def has_object_permission(self, request, view, project):\n result = super(ProjectPermission,\n self).has_object_permission(request, view, project.organization)\n\n if not result:\n return result\n\n if project.teams.exists():\n return any(\n has_team_permission(request, team, self.scope_map) for team in project.teams.all()\n )\n elif request.user.is_authenticated():\n # this is only for team-less projects\n if is_active_superuser(request):\n return True\n try:\n role = OrganizationMember.objects.filter(\n organization=project.organization,\n user=request.user,\n ).values_list('role', flat=True).get()\n except OrganizationMember.DoesNotExist:\n # this should probably never happen?\n return False\n\n return roles.get(role).is_global\n\n return False\n\n\nclass StrictProjectPermission(ProjectPermission):\n scope_map = {\n 'GET': ['project:write', 'project:admin'],\n 'POST': ['project:write', 'project:admin'],\n 'PUT': ['project:write', 'project:admin'],\n 'DELETE': ['project:admin'],\n }\n\n\nclass ProjectReleasePermission(ProjectPermission):\n scope_map = {\n 'GET': ['project:read', 'project:write', 'project:admin', 'project:releases'],\n 'POST': ['project:write', 'project:admin', 'project:releases'],\n 'PUT': ['project:write', 'project:admin', 'project:releases'],\n 'DELETE': ['project:admin', 'project:releases'],\n }\n\n\nclass ProjectEventPermission(ProjectPermission):\n scope_map = {\n 'GET': ['event:read', 'event:write', 'event:admin'],\n 'POST': ['event:write', 'event:admin'],\n 'PUT': ['event:write', 'event:admin'],\n 'DELETE': ['event:admin'],\n }\n\n\nclass ProjectSettingPermission(ProjectPermission):\n scope_map = {\n 'GET': ['project:read', 'project:write', 'project:admin'],\n 'POST': ['project:write', 'project:admin'],\n 'PUT': ['project:write', 'project:admin'],\n 'DELETE': ['project:write', 'project:admin'],\n }\n\n\nclass RelaxedSearchPermission(ProjectPermission):\n scope_map = {\n 'GET': ['project:read', 'project:write', 'project:admin'],\n # members can do writes\n 'POST': ['project:write', 'project:admin', 'project:read'],\n 'PUT': ['project:write', 'project:admin', 'project:read'],\n # members can delete their own searches\n 'DELETE': ['project:read', 'project:write', 'project:admin'],\n }\n\n\nclass ProjectEndpoint(Endpoint):\n permission_classes = (ProjectPermission, )\n\n def convert_args(self, request, organization_slug, project_slug, *args, **kwargs):\n try:\n project = Project.objects.filter(\n organization__slug=organization_slug,\n slug=project_slug,\n ).select_related('organization').prefetch_related('teams').get()\n except Project.DoesNotExist:\n try:\n # Project may have been renamed\n redirect = ProjectRedirect.objects.select_related('project')\n redirect = redirect.get(\n organization__slug=organization_slug,\n redirect_slug=project_slug\n )\n\n # get full path so that we keep query strings\n requested_url = request.get_full_path()\n new_url = requested_url.replace(\n 'projects/%s/%s/' %\n (organization_slug, project_slug), 'projects/%s/%s/' %\n (organization_slug, redirect.project.slug))\n\n # Resource was moved/renamed if the requested url is different than the new url\n if requested_url != new_url:\n raise ProjectMoved(new_url, redirect.project.slug)\n\n # otherwise project doesn't exist\n raise ResourceDoesNotExist\n except ProjectRedirect.DoesNotExist:\n raise ResourceDoesNotExist\n\n if project.status != ProjectStatus.VISIBLE:\n raise ResourceDoesNotExist\n\n self.check_object_permissions(request, project)\n\n raven.tags_context({\n 'project': project.id,\n 'organization': project.organization_id,\n })\n\n request._request.organization = project.organization\n\n kwargs['project'] = project\n return (args, kwargs)\n\n def handle_exception(self, request, exc):\n if isinstance(exc, ProjectMoved):\n response = Response({\n 'slug': exc.detail['extra']['slug'],\n 'detail': exc.detail\n }, status=exc.status_code)\n response['Location'] = exc.detail['extra']['url']\n return response\n return super(ProjectEndpoint, self).handle_exception(request, exc)\n", "path": "src/sentry/api/bases/project.py"}]} | 2,924 | 324 |
gh_patches_debug_35183 | rasdani/github-patches | git_diff | mitmproxy__mitmproxy-2971 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
User's script error handler function
As discussed with @cortesi on slack, right now whenever a user's script throws an error due to various reasons, it is being handled at different places differently.
Therefore we can have a consistent error handler function which can be invoked whenever there is an error
This will also handle #2837 #2838 #2839
### Function
Signature
`script_error(path, message, lineno, exception)`
What function will do
>"Error in script XXX:NNN MMM” where XXX is the path as specified by the user (the .path attribute of Script), NNN is a line number if we have one, and MMM is a short message
The idea here is to display the above mentioned message in the console app and display the traceback related to the error in the event log.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mitmproxy/addons/script.py`
Content:
```
1 import os
2 import importlib.util
3 import importlib.machinery
4 import time
5 import sys
6 import types
7 import typing
8
9 from mitmproxy import addonmanager
10 from mitmproxy import exceptions
11 from mitmproxy import flow
12 from mitmproxy import command
13 from mitmproxy import eventsequence
14 from mitmproxy import ctx
15 import mitmproxy.types as mtypes
16
17
18 def load_script(path: str) -> types.ModuleType:
19 fullname = "__mitmproxy_script__.{}".format(
20 os.path.splitext(os.path.basename(path))[0]
21 )
22 # the fullname is not unique among scripts, so if there already is an existing script with said
23 # fullname, remove it.
24 sys.modules.pop(fullname, None)
25 oldpath = sys.path
26 sys.path.insert(0, os.path.dirname(path))
27 try:
28 loader = importlib.machinery.SourceFileLoader(fullname, path)
29 spec = importlib.util.spec_from_loader(fullname, loader=loader)
30 m = importlib.util.module_from_spec(spec)
31 loader.exec_module(m)
32 if not getattr(m, "name", None):
33 m.name = path # type: ignore
34 return m
35 finally:
36 sys.path[:] = oldpath
37
38
39 class Script:
40 """
41 An addon that manages a single script.
42 """
43 ReloadInterval = 2
44
45 def __init__(self, path):
46 self.name = "scriptmanager:" + path
47 self.path = path
48 self.fullpath = os.path.expanduser(
49 path.strip("'\" ")
50 )
51 self.ns = None
52
53 self.last_load = 0
54 self.last_mtime = 0
55 if not os.path.isfile(self.fullpath):
56 raise exceptions.OptionsError('No such script: "%s"' % self.fullpath)
57
58 @property
59 def addons(self):
60 return [self.ns] if self.ns else []
61
62 def tick(self):
63 if time.time() - self.last_load > self.ReloadInterval:
64 try:
65 mtime = os.stat(self.fullpath).st_mtime
66 except FileNotFoundError:
67 scripts = list(ctx.options.scripts)
68 scripts.remove(self.path)
69 ctx.options.update(scripts=scripts)
70 return
71
72 if mtime > self.last_mtime:
73 ctx.log.info("Loading script: %s" % self.path)
74 if self.ns:
75 ctx.master.addons.remove(self.ns)
76 self.ns = None
77 with addonmanager.safecall():
78 ns = load_script(self.fullpath)
79 ctx.master.addons.register(ns)
80 self.ns = ns
81 if self.ns:
82 # We're already running, so we have to explicitly register and
83 # configure the addon
84 ctx.master.addons.invoke_addon(self.ns, "running")
85 ctx.master.addons.invoke_addon(
86 self.ns,
87 "configure",
88 ctx.options.keys()
89 )
90 self.last_load = time.time()
91 self.last_mtime = mtime
92
93
94 class ScriptLoader:
95 """
96 An addon that manages loading scripts from options.
97 """
98 def __init__(self):
99 self.is_running = False
100 self.addons = []
101
102 def load(self, loader):
103 loader.add_option(
104 "scripts", typing.Sequence[str], [],
105 """
106 Execute a script.
107 """
108 )
109
110 def running(self):
111 self.is_running = True
112
113 @command.command("script.run")
114 def script_run(self, flows: typing.Sequence[flow.Flow], path: mtypes.Path) -> None:
115 """
116 Run a script on the specified flows. The script is loaded with
117 default options, and all lifecycle events for each flow are
118 simulated.
119 """
120 try:
121 s = Script(path)
122 l = addonmanager.Loader(ctx.master)
123 ctx.master.addons.invoke_addon(s, "load", l)
124 ctx.master.addons.invoke_addon(s, "configure", ctx.options.keys())
125 # Script is loaded on the first tick
126 ctx.master.addons.invoke_addon(s, "tick")
127 for f in flows:
128 for evt, arg in eventsequence.iterate(f):
129 ctx.master.addons.invoke_addon(s, evt, arg)
130 except exceptions.OptionsError as e:
131 raise exceptions.CommandError("Error running script: %s" % e) from e
132
133 def configure(self, updated):
134 if "scripts" in updated:
135 for s in ctx.options.scripts:
136 if ctx.options.scripts.count(s) > 1:
137 raise exceptions.OptionsError("Duplicate script: %s" % s)
138
139 for a in self.addons[:]:
140 if a.path not in ctx.options.scripts:
141 ctx.log.info("Un-loading script: %s" % a.name)
142 ctx.master.addons.remove(a)
143 self.addons.remove(a)
144
145 # The machinations below are to ensure that:
146 # - Scripts remain in the same order
147 # - Scripts are not initialized un-necessarily. If only a
148 # script's order in the script list has changed, it is just
149 # moved.
150
151 current = {}
152 for a in self.addons:
153 current[a.path] = a
154
155 ordered = []
156 newscripts = []
157 for s in ctx.options.scripts:
158 if s in current:
159 ordered.append(current[s])
160 else:
161 sc = Script(s)
162 ordered.append(sc)
163 newscripts.append(sc)
164
165 self.addons = ordered
166
167 for s in newscripts:
168 ctx.master.addons.register(s)
169 if self.is_running:
170 # If we're already running, we configure and tell the addon
171 # we're up and running.
172 ctx.master.addons.invoke_addon(s, "running")
173
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mitmproxy/addons/script.py b/mitmproxy/addons/script.py
--- a/mitmproxy/addons/script.py
+++ b/mitmproxy/addons/script.py
@@ -5,6 +5,7 @@
import sys
import types
import typing
+import traceback
from mitmproxy import addonmanager
from mitmproxy import exceptions
@@ -36,6 +37,25 @@
sys.path[:] = oldpath
+def script_error_handler(path, exc, msg="", tb=False):
+ """
+ Handles all the user's script errors with
+ an optional traceback
+ """
+ exception = type(exc).__name__
+ if msg:
+ exception = msg
+ lineno = ""
+ if hasattr(exc, "lineno"):
+ lineno = str(exc.lineno)
+ log_msg = "in Script {}:{} {}".format(path, lineno, exception)
+ if tb:
+ etype, value, tback = sys.exc_info()
+ tback = addonmanager.cut_traceback(tback, "invoke_addon")
+ log_msg = log_msg.join(["\n"] + traceback.format_exception(etype, value, tback))
+ ctx.log.error(log_msg)
+
+
class Script:
"""
An addon that manages a single script.
@@ -53,7 +73,7 @@
self.last_load = 0
self.last_mtime = 0
if not os.path.isfile(self.fullpath):
- raise exceptions.OptionsError('No such script: "%s"' % self.fullpath)
+ raise exceptions.OptionsError('No such script')
@property
def addons(self):
@@ -128,13 +148,13 @@
for evt, arg in eventsequence.iterate(f):
ctx.master.addons.invoke_addon(s, evt, arg)
except exceptions.OptionsError as e:
- raise exceptions.CommandError("Error running script: %s" % e) from e
+ script_error_handler(path, e, msg=str(e))
def configure(self, updated):
if "scripts" in updated:
for s in ctx.options.scripts:
if ctx.options.scripts.count(s) > 1:
- raise exceptions.OptionsError("Duplicate script: %s" % s)
+ raise exceptions.OptionsError("Duplicate script")
for a in self.addons[:]:
if a.path not in ctx.options.scripts:
| {"golden_diff": "diff --git a/mitmproxy/addons/script.py b/mitmproxy/addons/script.py\n--- a/mitmproxy/addons/script.py\n+++ b/mitmproxy/addons/script.py\n@@ -5,6 +5,7 @@\n import sys\n import types\n import typing\n+import traceback\n \n from mitmproxy import addonmanager\n from mitmproxy import exceptions\n@@ -36,6 +37,25 @@\n sys.path[:] = oldpath\n \n \n+def script_error_handler(path, exc, msg=\"\", tb=False):\n+ \"\"\"\n+ Handles all the user's script errors with\n+ an optional traceback\n+ \"\"\"\n+ exception = type(exc).__name__\n+ if msg:\n+ exception = msg\n+ lineno = \"\"\n+ if hasattr(exc, \"lineno\"):\n+ lineno = str(exc.lineno)\n+ log_msg = \"in Script {}:{} {}\".format(path, lineno, exception)\n+ if tb:\n+ etype, value, tback = sys.exc_info()\n+ tback = addonmanager.cut_traceback(tback, \"invoke_addon\")\n+ log_msg = log_msg.join([\"\\n\"] + traceback.format_exception(etype, value, tback))\n+ ctx.log.error(log_msg)\n+\n+\n class Script:\n \"\"\"\n An addon that manages a single script.\n@@ -53,7 +73,7 @@\n self.last_load = 0\n self.last_mtime = 0\n if not os.path.isfile(self.fullpath):\n- raise exceptions.OptionsError('No such script: \"%s\"' % self.fullpath)\n+ raise exceptions.OptionsError('No such script')\n \n @property\n def addons(self):\n@@ -128,13 +148,13 @@\n for evt, arg in eventsequence.iterate(f):\n ctx.master.addons.invoke_addon(s, evt, arg)\n except exceptions.OptionsError as e:\n- raise exceptions.CommandError(\"Error running script: %s\" % e) from e\n+ script_error_handler(path, e, msg=str(e))\n \n def configure(self, updated):\n if \"scripts\" in updated:\n for s in ctx.options.scripts:\n if ctx.options.scripts.count(s) > 1:\n- raise exceptions.OptionsError(\"Duplicate script: %s\" % s)\n+ raise exceptions.OptionsError(\"Duplicate script\")\n \n for a in self.addons[:]:\n if a.path not in ctx.options.scripts:\n", "issue": "User's script error handler function\nAs discussed with @cortesi on slack, right now whenever a user's script throws an error due to various reasons, it is being handled at different places differently.\r\nTherefore we can have a consistent error handler function which can be invoked whenever there is an error\r\nThis will also handle #2837 #2838 #2839 \r\n### Function\r\n\r\nSignature\r\n`script_error(path, message, lineno, exception)`\r\n\r\nWhat function will do\r\n>\"Error in script XXX:NNN MMM\u201d where XXX is the path as specified by the user (the .path attribute of Script), NNN is a line number if we have one, and MMM is a short message\r\n\r\nThe idea here is to display the above mentioned message in the console app and display the traceback related to the error in the event log. \n", "before_files": [{"content": "import os\nimport importlib.util\nimport importlib.machinery\nimport time\nimport sys\nimport types\nimport typing\n\nfrom mitmproxy import addonmanager\nfrom mitmproxy import exceptions\nfrom mitmproxy import flow\nfrom mitmproxy import command\nfrom mitmproxy import eventsequence\nfrom mitmproxy import ctx\nimport mitmproxy.types as mtypes\n\n\ndef load_script(path: str) -> types.ModuleType:\n fullname = \"__mitmproxy_script__.{}\".format(\n os.path.splitext(os.path.basename(path))[0]\n )\n # the fullname is not unique among scripts, so if there already is an existing script with said\n # fullname, remove it.\n sys.modules.pop(fullname, None)\n oldpath = sys.path\n sys.path.insert(0, os.path.dirname(path))\n try:\n loader = importlib.machinery.SourceFileLoader(fullname, path)\n spec = importlib.util.spec_from_loader(fullname, loader=loader)\n m = importlib.util.module_from_spec(spec)\n loader.exec_module(m)\n if not getattr(m, \"name\", None):\n m.name = path # type: ignore\n return m\n finally:\n sys.path[:] = oldpath\n\n\nclass Script:\n \"\"\"\n An addon that manages a single script.\n \"\"\"\n ReloadInterval = 2\n\n def __init__(self, path):\n self.name = \"scriptmanager:\" + path\n self.path = path\n self.fullpath = os.path.expanduser(\n path.strip(\"'\\\" \")\n )\n self.ns = None\n\n self.last_load = 0\n self.last_mtime = 0\n if not os.path.isfile(self.fullpath):\n raise exceptions.OptionsError('No such script: \"%s\"' % self.fullpath)\n\n @property\n def addons(self):\n return [self.ns] if self.ns else []\n\n def tick(self):\n if time.time() - self.last_load > self.ReloadInterval:\n try:\n mtime = os.stat(self.fullpath).st_mtime\n except FileNotFoundError:\n scripts = list(ctx.options.scripts)\n scripts.remove(self.path)\n ctx.options.update(scripts=scripts)\n return\n\n if mtime > self.last_mtime:\n ctx.log.info(\"Loading script: %s\" % self.path)\n if self.ns:\n ctx.master.addons.remove(self.ns)\n self.ns = None\n with addonmanager.safecall():\n ns = load_script(self.fullpath)\n ctx.master.addons.register(ns)\n self.ns = ns\n if self.ns:\n # We're already running, so we have to explicitly register and\n # configure the addon\n ctx.master.addons.invoke_addon(self.ns, \"running\")\n ctx.master.addons.invoke_addon(\n self.ns,\n \"configure\",\n ctx.options.keys()\n )\n self.last_load = time.time()\n self.last_mtime = mtime\n\n\nclass ScriptLoader:\n \"\"\"\n An addon that manages loading scripts from options.\n \"\"\"\n def __init__(self):\n self.is_running = False\n self.addons = []\n\n def load(self, loader):\n loader.add_option(\n \"scripts\", typing.Sequence[str], [],\n \"\"\"\n Execute a script.\n \"\"\"\n )\n\n def running(self):\n self.is_running = True\n\n @command.command(\"script.run\")\n def script_run(self, flows: typing.Sequence[flow.Flow], path: mtypes.Path) -> None:\n \"\"\"\n Run a script on the specified flows. The script is loaded with\n default options, and all lifecycle events for each flow are\n simulated.\n \"\"\"\n try:\n s = Script(path)\n l = addonmanager.Loader(ctx.master)\n ctx.master.addons.invoke_addon(s, \"load\", l)\n ctx.master.addons.invoke_addon(s, \"configure\", ctx.options.keys())\n # Script is loaded on the first tick\n ctx.master.addons.invoke_addon(s, \"tick\")\n for f in flows:\n for evt, arg in eventsequence.iterate(f):\n ctx.master.addons.invoke_addon(s, evt, arg)\n except exceptions.OptionsError as e:\n raise exceptions.CommandError(\"Error running script: %s\" % e) from e\n\n def configure(self, updated):\n if \"scripts\" in updated:\n for s in ctx.options.scripts:\n if ctx.options.scripts.count(s) > 1:\n raise exceptions.OptionsError(\"Duplicate script: %s\" % s)\n\n for a in self.addons[:]:\n if a.path not in ctx.options.scripts:\n ctx.log.info(\"Un-loading script: %s\" % a.name)\n ctx.master.addons.remove(a)\n self.addons.remove(a)\n\n # The machinations below are to ensure that:\n # - Scripts remain in the same order\n # - Scripts are not initialized un-necessarily. If only a\n # script's order in the script list has changed, it is just\n # moved.\n\n current = {}\n for a in self.addons:\n current[a.path] = a\n\n ordered = []\n newscripts = []\n for s in ctx.options.scripts:\n if s in current:\n ordered.append(current[s])\n else:\n sc = Script(s)\n ordered.append(sc)\n newscripts.append(sc)\n\n self.addons = ordered\n\n for s in newscripts:\n ctx.master.addons.register(s)\n if self.is_running:\n # If we're already running, we configure and tell the addon\n # we're up and running.\n ctx.master.addons.invoke_addon(s, \"running\")\n", "path": "mitmproxy/addons/script.py"}], "after_files": [{"content": "import os\nimport importlib.util\nimport importlib.machinery\nimport time\nimport sys\nimport types\nimport typing\nimport traceback\n\nfrom mitmproxy import addonmanager\nfrom mitmproxy import exceptions\nfrom mitmproxy import flow\nfrom mitmproxy import command\nfrom mitmproxy import eventsequence\nfrom mitmproxy import ctx\nimport mitmproxy.types as mtypes\n\n\ndef load_script(path: str) -> types.ModuleType:\n fullname = \"__mitmproxy_script__.{}\".format(\n os.path.splitext(os.path.basename(path))[0]\n )\n # the fullname is not unique among scripts, so if there already is an existing script with said\n # fullname, remove it.\n sys.modules.pop(fullname, None)\n oldpath = sys.path\n sys.path.insert(0, os.path.dirname(path))\n try:\n loader = importlib.machinery.SourceFileLoader(fullname, path)\n spec = importlib.util.spec_from_loader(fullname, loader=loader)\n m = importlib.util.module_from_spec(spec)\n loader.exec_module(m)\n if not getattr(m, \"name\", None):\n m.name = path # type: ignore\n return m\n finally:\n sys.path[:] = oldpath\n\n\ndef script_error_handler(path, exc, msg=\"\", tb=False):\n \"\"\"\n Handles all the user's script errors with\n an optional traceback\n \"\"\"\n exception = type(exc).__name__\n if msg:\n exception = msg\n lineno = \"\"\n if hasattr(exc, \"lineno\"):\n lineno = str(exc.lineno)\n log_msg = \"in Script {}:{} {}\".format(path, lineno, exception)\n if tb:\n etype, value, tback = sys.exc_info()\n tback = addonmanager.cut_traceback(tback, \"invoke_addon\")\n log_msg = log_msg.join([\"\\n\"] + traceback.format_exception(etype, value, tback))\n ctx.log.error(log_msg)\n\n\nclass Script:\n \"\"\"\n An addon that manages a single script.\n \"\"\"\n ReloadInterval = 2\n\n def __init__(self, path):\n self.name = \"scriptmanager:\" + path\n self.path = path\n self.fullpath = os.path.expanduser(\n path.strip(\"'\\\" \")\n )\n self.ns = None\n\n self.last_load = 0\n self.last_mtime = 0\n if not os.path.isfile(self.fullpath):\n raise exceptions.OptionsError('No such script')\n\n @property\n def addons(self):\n return [self.ns] if self.ns else []\n\n def tick(self):\n if time.time() - self.last_load > self.ReloadInterval:\n try:\n mtime = os.stat(self.fullpath).st_mtime\n except FileNotFoundError:\n scripts = list(ctx.options.scripts)\n scripts.remove(self.path)\n ctx.options.update(scripts=scripts)\n return\n\n if mtime > self.last_mtime:\n ctx.log.info(\"Loading script: %s\" % self.path)\n if self.ns:\n ctx.master.addons.remove(self.ns)\n self.ns = None\n with addonmanager.safecall():\n ns = load_script(self.fullpath)\n ctx.master.addons.register(ns)\n self.ns = ns\n if self.ns:\n # We're already running, so we have to explicitly register and\n # configure the addon\n ctx.master.addons.invoke_addon(self.ns, \"running\")\n ctx.master.addons.invoke_addon(\n self.ns,\n \"configure\",\n ctx.options.keys()\n )\n self.last_load = time.time()\n self.last_mtime = mtime\n\n\nclass ScriptLoader:\n \"\"\"\n An addon that manages loading scripts from options.\n \"\"\"\n def __init__(self):\n self.is_running = False\n self.addons = []\n\n def load(self, loader):\n loader.add_option(\n \"scripts\", typing.Sequence[str], [],\n \"\"\"\n Execute a script.\n \"\"\"\n )\n\n def running(self):\n self.is_running = True\n\n @command.command(\"script.run\")\n def script_run(self, flows: typing.Sequence[flow.Flow], path: mtypes.Path) -> None:\n \"\"\"\n Run a script on the specified flows. The script is loaded with\n default options, and all lifecycle events for each flow are\n simulated.\n \"\"\"\n try:\n s = Script(path)\n l = addonmanager.Loader(ctx.master)\n ctx.master.addons.invoke_addon(s, \"load\", l)\n ctx.master.addons.invoke_addon(s, \"configure\", ctx.options.keys())\n # Script is loaded on the first tick\n ctx.master.addons.invoke_addon(s, \"tick\")\n for f in flows:\n for evt, arg in eventsequence.iterate(f):\n ctx.master.addons.invoke_addon(s, evt, arg)\n except exceptions.OptionsError as e:\n script_error_handler(path, e, msg=str(e))\n\n def configure(self, updated):\n if \"scripts\" in updated:\n for s in ctx.options.scripts:\n if ctx.options.scripts.count(s) > 1:\n raise exceptions.OptionsError(\"Duplicate script\")\n\n for a in self.addons[:]:\n if a.path not in ctx.options.scripts:\n ctx.log.info(\"Un-loading script: %s\" % a.name)\n ctx.master.addons.remove(a)\n self.addons.remove(a)\n\n # The machinations below are to ensure that:\n # - Scripts remain in the same order\n # - Scripts are not initialized un-necessarily. If only a\n # script's order in the script list has changed, it is just\n # moved.\n\n current = {}\n for a in self.addons:\n current[a.path] = a\n\n ordered = []\n newscripts = []\n for s in ctx.options.scripts:\n if s in current:\n ordered.append(current[s])\n else:\n sc = Script(s)\n ordered.append(sc)\n newscripts.append(sc)\n\n self.addons = ordered\n\n for s in newscripts:\n ctx.master.addons.register(s)\n if self.is_running:\n # If we're already running, we configure and tell the addon\n # we're up and running.\n ctx.master.addons.invoke_addon(s, \"running\")\n", "path": "mitmproxy/addons/script.py"}]} | 2,067 | 529 |
gh_patches_debug_170 | rasdani/github-patches | git_diff | pydantic__pydantic-4418 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
V1.10 release
To do/decide:
* [x] #2557 - **merged**
* [x] #2745 - needs some tweaks, but we need to decide if it's a good idea before V2
* [x] #2190 - **deferred**
* [x] cherry pick stuff from v1.9 branch, maybe just history #4350
* [x] #3346
* [x] #3593 - **deferred**
* [x] #3946
* [x] #4028 - **API will change in v2**
* [x] #4354
* [x] #4216
* [x] #4191
* [x] #3941 - revert or fix
* [x] #4339
* [x] #4356
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pydantic/version.py`
Content:
```
1 __all__ = 'compiled', 'VERSION', 'version_info'
2
3 VERSION = '1.9.2'
4
5 try:
6 import cython # type: ignore
7 except ImportError:
8 compiled: bool = False
9 else: # pragma: no cover
10 try:
11 compiled = cython.compiled
12 except AttributeError:
13 compiled = False
14
15
16 def version_info() -> str:
17 import platform
18 import sys
19 from importlib import import_module
20 from pathlib import Path
21
22 optional_deps = []
23 for p in ('devtools', 'dotenv', 'email-validator', 'typing-extensions'):
24 try:
25 import_module(p.replace('-', '_'))
26 except ImportError:
27 continue
28 optional_deps.append(p)
29
30 info = {
31 'pydantic version': VERSION,
32 'pydantic compiled': compiled,
33 'install path': Path(__file__).resolve().parent,
34 'python version': sys.version,
35 'platform': platform.platform(),
36 'optional deps. installed': optional_deps,
37 }
38 return '\n'.join('{:>30} {}'.format(k + ':', str(v).replace('\n', ' ')) for k, v in info.items())
39
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pydantic/version.py b/pydantic/version.py
--- a/pydantic/version.py
+++ b/pydantic/version.py
@@ -1,6 +1,6 @@
__all__ = 'compiled', 'VERSION', 'version_info'
-VERSION = '1.9.2'
+VERSION = '1.10.0a1'
try:
import cython # type: ignore
| {"golden_diff": "diff --git a/pydantic/version.py b/pydantic/version.py\n--- a/pydantic/version.py\n+++ b/pydantic/version.py\n@@ -1,6 +1,6 @@\n __all__ = 'compiled', 'VERSION', 'version_info'\n \n-VERSION = '1.9.2'\n+VERSION = '1.10.0a1'\n \n try:\n import cython # type: ignore\n", "issue": "V1.10 release\nTo do/decide:\r\n* [x] #2557 - **merged**\r\n* [x] #2745 - needs some tweaks, but we need to decide if it's a good idea before V2\r\n* [x] #2190 - **deferred**\r\n* [x] cherry pick stuff from v1.9 branch, maybe just history #4350\r\n* [x] #3346\r\n* [x] #3593 - **deferred**\r\n* [x] #3946\r\n* [x] #4028 - **API will change in v2**\r\n* [x] #4354\r\n* [x] #4216\r\n* [x] #4191\r\n* [x] #3941 - revert or fix\r\n* [x] #4339\r\n* [x] #4356\n", "before_files": [{"content": "__all__ = 'compiled', 'VERSION', 'version_info'\n\nVERSION = '1.9.2'\n\ntry:\n import cython # type: ignore\nexcept ImportError:\n compiled: bool = False\nelse: # pragma: no cover\n try:\n compiled = cython.compiled\n except AttributeError:\n compiled = False\n\n\ndef version_info() -> str:\n import platform\n import sys\n from importlib import import_module\n from pathlib import Path\n\n optional_deps = []\n for p in ('devtools', 'dotenv', 'email-validator', 'typing-extensions'):\n try:\n import_module(p.replace('-', '_'))\n except ImportError:\n continue\n optional_deps.append(p)\n\n info = {\n 'pydantic version': VERSION,\n 'pydantic compiled': compiled,\n 'install path': Path(__file__).resolve().parent,\n 'python version': sys.version,\n 'platform': platform.platform(),\n 'optional deps. installed': optional_deps,\n }\n return '\\n'.join('{:>30} {}'.format(k + ':', str(v).replace('\\n', ' ')) for k, v in info.items())\n", "path": "pydantic/version.py"}], "after_files": [{"content": "__all__ = 'compiled', 'VERSION', 'version_info'\n\nVERSION = '1.10.0a1'\n\ntry:\n import cython # type: ignore\nexcept ImportError:\n compiled: bool = False\nelse: # pragma: no cover\n try:\n compiled = cython.compiled\n except AttributeError:\n compiled = False\n\n\ndef version_info() -> str:\n import platform\n import sys\n from importlib import import_module\n from pathlib import Path\n\n optional_deps = []\n for p in ('devtools', 'dotenv', 'email-validator', 'typing-extensions'):\n try:\n import_module(p.replace('-', '_'))\n except ImportError:\n continue\n optional_deps.append(p)\n\n info = {\n 'pydantic version': VERSION,\n 'pydantic compiled': compiled,\n 'install path': Path(__file__).resolve().parent,\n 'python version': sys.version,\n 'platform': platform.platform(),\n 'optional deps. installed': optional_deps,\n }\n return '\\n'.join('{:>30} {}'.format(k + ':', str(v).replace('\\n', ' ')) for k, v in info.items())\n", "path": "pydantic/version.py"}]} | 792 | 93 |
gh_patches_debug_15235 | rasdani/github-patches | git_diff | conan-io__conan-center-index-10038 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[request] libusb/1.0.25
### Package Details
* Package Name/Version: **libusb/1.0.25**
* Changelog: **https://github.com/libusb/libusb/blob/master/ChangeLog**
2022-01-31: v1.0.25
* Linux: Fix regression with some particular devices
* Linux: Fix regression with libusb_handle_events_timeout_completed()
* Linux: Fix regression with cpu usage in libusb_bulk_transfer
* Darwin (macOS): Add support for detaching kernel drivers with authorization.
* Darwin (macOS): Do not drop partial data on timeout.
* Darwin (macOS): Silence pipe error in set_interface_alt_setting().
* Windows: Fix HID backend missing byte
* Windows: Fix segfault with libusbk driver
* Windows: Fix regression when using libusb0 driver
* Windows: Support LIBUSB_TRANSFER_ADD_ZERO_PACKET on winusb
* New NO_DEVICE_DISCOVERY option replaces WEAK_AUTHORITY option
* Various other bug fixes and improvements
PR follows
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `recipes/libusb/all/conanfile.py`
Content:
```
1 from conans import ConanFile, AutoToolsBuildEnvironment, MSBuild, tools
2 from conans.errors import ConanInvalidConfiguration
3 import os
4 import re
5
6 required_conan_version = ">=1.33.0"
7
8
9 class LibUSBConan(ConanFile):
10 name = "libusb"
11 description = "A cross-platform library to access USB devices"
12 license = "LGPL-2.1"
13 homepage = "https://github.com/libusb/libusb"
14 url = "https://github.com/conan-io/conan-center-index"
15 topics = ("conan", "libusb", "usb", "device")
16 settings = "os", "compiler", "build_type", "arch"
17 options = {
18 "shared": [True, False],
19 "fPIC": [True, False],
20 "enable_udev": [True, False],
21 }
22 default_options = {
23 "shared": False,
24 "fPIC": True,
25 "enable_udev": True,
26 }
27 _autotools = None
28
29 @property
30 def _source_subfolder(self):
31 return "source_subfolder"
32
33 @property
34 def _is_mingw(self):
35 return self.settings.os == "Windows" and self.settings.compiler == "gcc"
36
37 @property
38 def _is_msvc(self):
39 return self.settings.os == "Windows" and self.settings.compiler == "Visual Studio"
40
41 @property
42 def _settings_build(self):
43 return self.settings_build if hasattr(self, "settings_build") else self.settings
44
45 def config_options(self):
46 if self.settings.os == "Windows":
47 del self.options.fPIC
48 if self.settings.os not in ["Linux", "Android"]:
49 del self.options.enable_udev
50 # FIXME: enable_udev should be True for Android, but libudev recipe is missing
51 if self.settings.os == "Android":
52 self.options.enable_udev = False
53
54 def configure(self):
55 if self.options.shared:
56 del self.options.fPIC
57 del self.settings.compiler.libcxx
58 del self.settings.compiler.cppstd
59
60 def build_requirements(self):
61 if self._settings_build.os == "Windows" and not self._is_msvc and not tools.get_env("CONAN_BASH_PATH"):
62 self.build_requires("msys2/cci.latest")
63
64 def requirements(self):
65 if self.settings.os == "Linux":
66 if self.options.enable_udev:
67 self.requires("libudev/system")
68
69 def source(self):
70 tools.get(**self.conan_data["sources"][self.version],
71 destination=self._source_subfolder, strip_root=True)
72
73 def _build_visual_studio(self):
74 with tools.chdir(self._source_subfolder):
75 # Assume we're using the latest Visual Studio and default to libusb_2019.sln
76 # (or libusb_2017.sln for libusb < 1.0.24).
77 # If we're not using the latest Visual Studio, select an appropriate solution file.
78 solution_msvc_year = 2019 if tools.Version(self.version) >= "1.0.24" else 2017
79
80 solution_msvc_year = {
81 "11": 2012,
82 "12": 2013,
83 "14": 2015,
84 "15": 2017
85 }.get(str(self.settings.compiler.version), solution_msvc_year)
86
87 solution_file = os.path.join("msvc", "libusb_{}.sln".format(solution_msvc_year))
88 platforms = {"x86":"Win32"}
89 properties = {
90 # Enable LTO when CFLAGS contains -GL
91 "WholeProgramOptimization": "true" if any(re.finditer("(^| )[/-]GL($| )", tools.get_env("CFLAGS", ""))) else "false",
92 }
93 msbuild = MSBuild(self)
94 build_type = "Debug" if self.settings.build_type == "Debug" else "Release"
95 msbuild.build(solution_file, platforms=platforms, upgrade_project=False, properties=properties, build_type=build_type)
96
97 def _configure_autotools(self):
98 if not self._autotools:
99 self._autotools = AutoToolsBuildEnvironment(self, win_bash=tools.os_info.is_windows)
100 configure_args = ["--enable-shared" if self.options.shared else "--disable-shared"]
101 configure_args.append("--enable-static" if not self.options.shared else "--disable-static")
102 if self.settings.os in ["Linux", "Android"]:
103 configure_args.append("--enable-udev" if self.options.enable_udev else "--disable-udev")
104 elif self._is_mingw:
105 if self.settings.arch == "x86_64":
106 configure_args.append("--host=x86_64-w64-mingw32")
107 elif self.settings.arch == "x86":
108 configure_args.append("--build=i686-w64-mingw32")
109 configure_args.append("--host=i686-w64-mingw32")
110 self._autotools.configure(args=configure_args, configure_dir=self._source_subfolder)
111 return self._autotools
112
113 def build(self):
114 if self._is_msvc:
115 if tools.Version(self.version) < "1.0.24":
116 for vcxproj in ["fxload_2017", "getopt_2017", "hotplugtest_2017", "libusb_dll_2017",
117 "libusb_static_2017", "listdevs_2017", "stress_2017", "testlibusb_2017", "xusb_2017"]:
118 vcxproj_path = os.path.join(self._source_subfolder, "msvc", "%s.vcxproj" % vcxproj)
119 tools.replace_in_file(vcxproj_path, "<WindowsTargetPlatformVersion>10.0.16299.0</WindowsTargetPlatformVersion>", "")
120 self._build_visual_studio()
121 else:
122 autotools = self._configure_autotools()
123 autotools.make()
124
125 def _package_visual_studio(self):
126 self.copy(pattern="libusb.h", dst=os.path.join("include", "libusb-1.0"), src=os.path.join(self._source_subfolder, "libusb"), keep_path=False)
127 arch = "x64" if self.settings.arch == "x86_64" else "Win32"
128 source_dir = os.path.join(self._source_subfolder, arch, str(self.settings.build_type), "dll" if self.options.shared else "lib")
129 if self.options.shared:
130 self.copy(pattern="libusb-1.0.dll", dst="bin", src=source_dir, keep_path=False)
131 self.copy(pattern="libusb-1.0.lib", dst="lib", src=source_dir, keep_path=False)
132 self.copy(pattern="libusb-usbdk-1.0.dll", dst="bin", src=source_dir, keep_path=False)
133 self.copy(pattern="libusb-usbdk-1.0.lib", dst="lib", src=source_dir, keep_path=False)
134 else:
135 self.copy(pattern="libusb-1.0.lib", dst="lib", src=source_dir, keep_path=False)
136 self.copy(pattern="libusb-usbdk-1.0.lib", dst="lib", src=source_dir, keep_path=False)
137
138 def package(self):
139 self.copy("COPYING", src=self._source_subfolder, dst="licenses", keep_path=False)
140 if self._is_msvc:
141 self._package_visual_studio()
142 else:
143 autotools = self._configure_autotools()
144 autotools.install()
145 tools.rmdir(os.path.join(self.package_folder, "lib", "pkgconfig"))
146 tools.remove_files_by_mask(os.path.join(self.package_folder, "lib"), "*.la")
147
148 def package_info(self):
149 self.cpp_info.names["pkg_config"] = "libusb-1.0"
150 self.cpp_info.libs = tools.collect_libs(self)
151 self.cpp_info.includedirs.append(os.path.join("include", "libusb-1.0"))
152 if self.settings.os in ["Linux", "FreeBSD"]:
153 self.cpp_info.system_libs.append("pthread")
154 elif self.settings.os == "Macos":
155 self.cpp_info.system_libs = ["objc"]
156 self.cpp_info.frameworks = ["IOKit", "CoreFoundation"]
157 elif self.settings.os == "Windows":
158 self.cpp_info.system_libs = ["advapi32"]
159
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/recipes/libusb/all/conanfile.py b/recipes/libusb/all/conanfile.py
--- a/recipes/libusb/all/conanfile.py
+++ b/recipes/libusb/all/conanfile.py
@@ -1,5 +1,4 @@
from conans import ConanFile, AutoToolsBuildEnvironment, MSBuild, tools
-from conans.errors import ConanInvalidConfiguration
import os
import re
@@ -153,6 +152,6 @@
self.cpp_info.system_libs.append("pthread")
elif self.settings.os == "Macos":
self.cpp_info.system_libs = ["objc"]
- self.cpp_info.frameworks = ["IOKit", "CoreFoundation"]
+ self.cpp_info.frameworks = ["IOKit", "CoreFoundation", "Security"]
elif self.settings.os == "Windows":
self.cpp_info.system_libs = ["advapi32"]
| {"golden_diff": "diff --git a/recipes/libusb/all/conanfile.py b/recipes/libusb/all/conanfile.py\n--- a/recipes/libusb/all/conanfile.py\n+++ b/recipes/libusb/all/conanfile.py\n@@ -1,5 +1,4 @@\n from conans import ConanFile, AutoToolsBuildEnvironment, MSBuild, tools\n-from conans.errors import ConanInvalidConfiguration\n import os\n import re\n \n@@ -153,6 +152,6 @@\n self.cpp_info.system_libs.append(\"pthread\")\n elif self.settings.os == \"Macos\":\n self.cpp_info.system_libs = [\"objc\"]\n- self.cpp_info.frameworks = [\"IOKit\", \"CoreFoundation\"]\n+ self.cpp_info.frameworks = [\"IOKit\", \"CoreFoundation\", \"Security\"]\n elif self.settings.os == \"Windows\":\n self.cpp_info.system_libs = [\"advapi32\"]\n", "issue": "[request] libusb/1.0.25\n### Package Details\r\n * Package Name/Version: **libusb/1.0.25**\r\n * Changelog: **https://github.com/libusb/libusb/blob/master/ChangeLog**\r\n\r\n\r\n2022-01-31: v1.0.25\r\n* Linux: Fix regression with some particular devices\r\n* Linux: Fix regression with libusb_handle_events_timeout_completed()\r\n* Linux: Fix regression with cpu usage in libusb_bulk_transfer\r\n* Darwin (macOS): Add support for detaching kernel drivers with authorization.\r\n* Darwin (macOS): Do not drop partial data on timeout.\r\n* Darwin (macOS): Silence pipe error in set_interface_alt_setting().\r\n* Windows: Fix HID backend missing byte\r\n* Windows: Fix segfault with libusbk driver\r\n* Windows: Fix regression when using libusb0 driver\r\n* Windows: Support LIBUSB_TRANSFER_ADD_ZERO_PACKET on winusb\r\n* New NO_DEVICE_DISCOVERY option replaces WEAK_AUTHORITY option\r\n* Various other bug fixes and improvements\r\n\r\nPR follows\n", "before_files": [{"content": "from conans import ConanFile, AutoToolsBuildEnvironment, MSBuild, tools\nfrom conans.errors import ConanInvalidConfiguration\nimport os\nimport re\n\nrequired_conan_version = \">=1.33.0\"\n\n\nclass LibUSBConan(ConanFile):\n name = \"libusb\"\n description = \"A cross-platform library to access USB devices\"\n license = \"LGPL-2.1\"\n homepage = \"https://github.com/libusb/libusb\"\n url = \"https://github.com/conan-io/conan-center-index\"\n topics = (\"conan\", \"libusb\", \"usb\", \"device\")\n settings = \"os\", \"compiler\", \"build_type\", \"arch\"\n options = {\n \"shared\": [True, False],\n \"fPIC\": [True, False],\n \"enable_udev\": [True, False],\n }\n default_options = {\n \"shared\": False,\n \"fPIC\": True,\n \"enable_udev\": True,\n }\n _autotools = None\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n @property\n def _is_mingw(self):\n return self.settings.os == \"Windows\" and self.settings.compiler == \"gcc\"\n\n @property\n def _is_msvc(self):\n return self.settings.os == \"Windows\" and self.settings.compiler == \"Visual Studio\"\n\n @property\n def _settings_build(self):\n return self.settings_build if hasattr(self, \"settings_build\") else self.settings\n\n def config_options(self):\n if self.settings.os == \"Windows\":\n del self.options.fPIC\n if self.settings.os not in [\"Linux\", \"Android\"]:\n del self.options.enable_udev\n # FIXME: enable_udev should be True for Android, but libudev recipe is missing\n if self.settings.os == \"Android\":\n self.options.enable_udev = False\n\n def configure(self):\n if self.options.shared:\n del self.options.fPIC\n del self.settings.compiler.libcxx\n del self.settings.compiler.cppstd\n\n def build_requirements(self):\n if self._settings_build.os == \"Windows\" and not self._is_msvc and not tools.get_env(\"CONAN_BASH_PATH\"):\n self.build_requires(\"msys2/cci.latest\")\n\n def requirements(self):\n if self.settings.os == \"Linux\":\n if self.options.enable_udev:\n self.requires(\"libudev/system\")\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version],\n destination=self._source_subfolder, strip_root=True)\n\n def _build_visual_studio(self):\n with tools.chdir(self._source_subfolder):\n # Assume we're using the latest Visual Studio and default to libusb_2019.sln\n # (or libusb_2017.sln for libusb < 1.0.24).\n # If we're not using the latest Visual Studio, select an appropriate solution file.\n solution_msvc_year = 2019 if tools.Version(self.version) >= \"1.0.24\" else 2017\n\n solution_msvc_year = {\n \"11\": 2012,\n \"12\": 2013,\n \"14\": 2015,\n \"15\": 2017\n }.get(str(self.settings.compiler.version), solution_msvc_year)\n\n solution_file = os.path.join(\"msvc\", \"libusb_{}.sln\".format(solution_msvc_year))\n platforms = {\"x86\":\"Win32\"}\n properties = {\n # Enable LTO when CFLAGS contains -GL\n \"WholeProgramOptimization\": \"true\" if any(re.finditer(\"(^| )[/-]GL($| )\", tools.get_env(\"CFLAGS\", \"\"))) else \"false\",\n }\n msbuild = MSBuild(self)\n build_type = \"Debug\" if self.settings.build_type == \"Debug\" else \"Release\"\n msbuild.build(solution_file, platforms=platforms, upgrade_project=False, properties=properties, build_type=build_type)\n\n def _configure_autotools(self):\n if not self._autotools:\n self._autotools = AutoToolsBuildEnvironment(self, win_bash=tools.os_info.is_windows)\n configure_args = [\"--enable-shared\" if self.options.shared else \"--disable-shared\"]\n configure_args.append(\"--enable-static\" if not self.options.shared else \"--disable-static\")\n if self.settings.os in [\"Linux\", \"Android\"]:\n configure_args.append(\"--enable-udev\" if self.options.enable_udev else \"--disable-udev\")\n elif self._is_mingw:\n if self.settings.arch == \"x86_64\":\n configure_args.append(\"--host=x86_64-w64-mingw32\")\n elif self.settings.arch == \"x86\":\n configure_args.append(\"--build=i686-w64-mingw32\")\n configure_args.append(\"--host=i686-w64-mingw32\")\n self._autotools.configure(args=configure_args, configure_dir=self._source_subfolder)\n return self._autotools\n\n def build(self):\n if self._is_msvc:\n if tools.Version(self.version) < \"1.0.24\":\n for vcxproj in [\"fxload_2017\", \"getopt_2017\", \"hotplugtest_2017\", \"libusb_dll_2017\",\n \"libusb_static_2017\", \"listdevs_2017\", \"stress_2017\", \"testlibusb_2017\", \"xusb_2017\"]:\n vcxproj_path = os.path.join(self._source_subfolder, \"msvc\", \"%s.vcxproj\" % vcxproj)\n tools.replace_in_file(vcxproj_path, \"<WindowsTargetPlatformVersion>10.0.16299.0</WindowsTargetPlatformVersion>\", \"\")\n self._build_visual_studio()\n else:\n autotools = self._configure_autotools()\n autotools.make()\n\n def _package_visual_studio(self):\n self.copy(pattern=\"libusb.h\", dst=os.path.join(\"include\", \"libusb-1.0\"), src=os.path.join(self._source_subfolder, \"libusb\"), keep_path=False)\n arch = \"x64\" if self.settings.arch == \"x86_64\" else \"Win32\"\n source_dir = os.path.join(self._source_subfolder, arch, str(self.settings.build_type), \"dll\" if self.options.shared else \"lib\")\n if self.options.shared:\n self.copy(pattern=\"libusb-1.0.dll\", dst=\"bin\", src=source_dir, keep_path=False)\n self.copy(pattern=\"libusb-1.0.lib\", dst=\"lib\", src=source_dir, keep_path=False)\n self.copy(pattern=\"libusb-usbdk-1.0.dll\", dst=\"bin\", src=source_dir, keep_path=False)\n self.copy(pattern=\"libusb-usbdk-1.0.lib\", dst=\"lib\", src=source_dir, keep_path=False)\n else:\n self.copy(pattern=\"libusb-1.0.lib\", dst=\"lib\", src=source_dir, keep_path=False)\n self.copy(pattern=\"libusb-usbdk-1.0.lib\", dst=\"lib\", src=source_dir, keep_path=False)\n\n def package(self):\n self.copy(\"COPYING\", src=self._source_subfolder, dst=\"licenses\", keep_path=False)\n if self._is_msvc:\n self._package_visual_studio()\n else:\n autotools = self._configure_autotools()\n autotools.install()\n tools.rmdir(os.path.join(self.package_folder, \"lib\", \"pkgconfig\"))\n tools.remove_files_by_mask(os.path.join(self.package_folder, \"lib\"), \"*.la\")\n\n def package_info(self):\n self.cpp_info.names[\"pkg_config\"] = \"libusb-1.0\"\n self.cpp_info.libs = tools.collect_libs(self)\n self.cpp_info.includedirs.append(os.path.join(\"include\", \"libusb-1.0\"))\n if self.settings.os in [\"Linux\", \"FreeBSD\"]:\n self.cpp_info.system_libs.append(\"pthread\")\n elif self.settings.os == \"Macos\":\n self.cpp_info.system_libs = [\"objc\"]\n self.cpp_info.frameworks = [\"IOKit\", \"CoreFoundation\"]\n elif self.settings.os == \"Windows\":\n self.cpp_info.system_libs = [\"advapi32\"]\n", "path": "recipes/libusb/all/conanfile.py"}], "after_files": [{"content": "from conans import ConanFile, AutoToolsBuildEnvironment, MSBuild, tools\nimport os\nimport re\n\nrequired_conan_version = \">=1.33.0\"\n\n\nclass LibUSBConan(ConanFile):\n name = \"libusb\"\n description = \"A cross-platform library to access USB devices\"\n license = \"LGPL-2.1\"\n homepage = \"https://github.com/libusb/libusb\"\n url = \"https://github.com/conan-io/conan-center-index\"\n topics = (\"conan\", \"libusb\", \"usb\", \"device\")\n settings = \"os\", \"compiler\", \"build_type\", \"arch\"\n options = {\n \"shared\": [True, False],\n \"fPIC\": [True, False],\n \"enable_udev\": [True, False],\n }\n default_options = {\n \"shared\": False,\n \"fPIC\": True,\n \"enable_udev\": True,\n }\n _autotools = None\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n @property\n def _is_mingw(self):\n return self.settings.os == \"Windows\" and self.settings.compiler == \"gcc\"\n\n @property\n def _is_msvc(self):\n return self.settings.os == \"Windows\" and self.settings.compiler == \"Visual Studio\"\n\n @property\n def _settings_build(self):\n return self.settings_build if hasattr(self, \"settings_build\") else self.settings\n\n def config_options(self):\n if self.settings.os == \"Windows\":\n del self.options.fPIC\n if self.settings.os not in [\"Linux\", \"Android\"]:\n del self.options.enable_udev\n # FIXME: enable_udev should be True for Android, but libudev recipe is missing\n if self.settings.os == \"Android\":\n self.options.enable_udev = False\n\n def configure(self):\n if self.options.shared:\n del self.options.fPIC\n del self.settings.compiler.libcxx\n del self.settings.compiler.cppstd\n\n def build_requirements(self):\n if self._settings_build.os == \"Windows\" and not self._is_msvc and not tools.get_env(\"CONAN_BASH_PATH\"):\n self.build_requires(\"msys2/cci.latest\")\n\n def requirements(self):\n if self.settings.os == \"Linux\":\n if self.options.enable_udev:\n self.requires(\"libudev/system\")\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version],\n destination=self._source_subfolder, strip_root=True)\n\n def _build_visual_studio(self):\n with tools.chdir(self._source_subfolder):\n # Assume we're using the latest Visual Studio and default to libusb_2019.sln\n # (or libusb_2017.sln for libusb < 1.0.24).\n # If we're not using the latest Visual Studio, select an appropriate solution file.\n solution_msvc_year = 2019 if tools.Version(self.version) >= \"1.0.24\" else 2017\n\n solution_msvc_year = {\n \"11\": 2012,\n \"12\": 2013,\n \"14\": 2015,\n \"15\": 2017\n }.get(str(self.settings.compiler.version), solution_msvc_year)\n\n solution_file = os.path.join(\"msvc\", \"libusb_{}.sln\".format(solution_msvc_year))\n platforms = {\"x86\":\"Win32\"}\n properties = {\n # Enable LTO when CFLAGS contains -GL\n \"WholeProgramOptimization\": \"true\" if any(re.finditer(\"(^| )[/-]GL($| )\", tools.get_env(\"CFLAGS\", \"\"))) else \"false\",\n }\n msbuild = MSBuild(self)\n build_type = \"Debug\" if self.settings.build_type == \"Debug\" else \"Release\"\n msbuild.build(solution_file, platforms=platforms, upgrade_project=False, properties=properties, build_type=build_type)\n\n def _configure_autotools(self):\n if not self._autotools:\n self._autotools = AutoToolsBuildEnvironment(self, win_bash=tools.os_info.is_windows)\n configure_args = [\"--enable-shared\" if self.options.shared else \"--disable-shared\"]\n configure_args.append(\"--enable-static\" if not self.options.shared else \"--disable-static\")\n if self.settings.os in [\"Linux\", \"Android\"]:\n configure_args.append(\"--enable-udev\" if self.options.enable_udev else \"--disable-udev\")\n elif self._is_mingw:\n if self.settings.arch == \"x86_64\":\n configure_args.append(\"--host=x86_64-w64-mingw32\")\n elif self.settings.arch == \"x86\":\n configure_args.append(\"--build=i686-w64-mingw32\")\n configure_args.append(\"--host=i686-w64-mingw32\")\n self._autotools.configure(args=configure_args, configure_dir=self._source_subfolder)\n return self._autotools\n\n def build(self):\n if self._is_msvc:\n if tools.Version(self.version) < \"1.0.24\":\n for vcxproj in [\"fxload_2017\", \"getopt_2017\", \"hotplugtest_2017\", \"libusb_dll_2017\",\n \"libusb_static_2017\", \"listdevs_2017\", \"stress_2017\", \"testlibusb_2017\", \"xusb_2017\"]:\n vcxproj_path = os.path.join(self._source_subfolder, \"msvc\", \"%s.vcxproj\" % vcxproj)\n tools.replace_in_file(vcxproj_path, \"<WindowsTargetPlatformVersion>10.0.16299.0</WindowsTargetPlatformVersion>\", \"\")\n self._build_visual_studio()\n else:\n autotools = self._configure_autotools()\n autotools.make()\n\n def _package_visual_studio(self):\n self.copy(pattern=\"libusb.h\", dst=os.path.join(\"include\", \"libusb-1.0\"), src=os.path.join(self._source_subfolder, \"libusb\"), keep_path=False)\n arch = \"x64\" if self.settings.arch == \"x86_64\" else \"Win32\"\n source_dir = os.path.join(self._source_subfolder, arch, str(self.settings.build_type), \"dll\" if self.options.shared else \"lib\")\n if self.options.shared:\n self.copy(pattern=\"libusb-1.0.dll\", dst=\"bin\", src=source_dir, keep_path=False)\n self.copy(pattern=\"libusb-1.0.lib\", dst=\"lib\", src=source_dir, keep_path=False)\n self.copy(pattern=\"libusb-usbdk-1.0.dll\", dst=\"bin\", src=source_dir, keep_path=False)\n self.copy(pattern=\"libusb-usbdk-1.0.lib\", dst=\"lib\", src=source_dir, keep_path=False)\n else:\n self.copy(pattern=\"libusb-1.0.lib\", dst=\"lib\", src=source_dir, keep_path=False)\n self.copy(pattern=\"libusb-usbdk-1.0.lib\", dst=\"lib\", src=source_dir, keep_path=False)\n\n def package(self):\n self.copy(\"COPYING\", src=self._source_subfolder, dst=\"licenses\", keep_path=False)\n if self._is_msvc:\n self._package_visual_studio()\n else:\n autotools = self._configure_autotools()\n autotools.install()\n tools.rmdir(os.path.join(self.package_folder, \"lib\", \"pkgconfig\"))\n tools.remove_files_by_mask(os.path.join(self.package_folder, \"lib\"), \"*.la\")\n\n def package_info(self):\n self.cpp_info.names[\"pkg_config\"] = \"libusb-1.0\"\n self.cpp_info.libs = tools.collect_libs(self)\n self.cpp_info.includedirs.append(os.path.join(\"include\", \"libusb-1.0\"))\n if self.settings.os in [\"Linux\", \"FreeBSD\"]:\n self.cpp_info.system_libs.append(\"pthread\")\n elif self.settings.os == \"Macos\":\n self.cpp_info.system_libs = [\"objc\"]\n self.cpp_info.frameworks = [\"IOKit\", \"CoreFoundation\", \"Security\"]\n elif self.settings.os == \"Windows\":\n self.cpp_info.system_libs = [\"advapi32\"]\n", "path": "recipes/libusb/all/conanfile.py"}]} | 2,736 | 194 |
gh_patches_debug_31538 | rasdani/github-patches | git_diff | microsoft__botbuilder-python-291 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove Mention Support
**Describe the solution you'd like**
Add Remove Mention support as JS and C#. See ActivityExtensions for a reference of Mention related methods. To remove mentions from Activity.Text, see ActivityExtensions.RemoveMentionText and ActivityExtensions.RemoveRecipientMention. Note that in JS it is TurnContext.removeMentionText.
**Describe alternatives you've considered**
None
**Additional context**
I have implemented SkypeMentionNormalizeMiddleware on all platforms to correct Skype mentions. The user could still make use of this middleware, but would have to manually remove the mention from Activity.Text to have the same functionality as the other platforms.
[enhancement]
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libraries/botbuilder-core/botbuilder/core/turn_context.py`
Content:
```
1 # Copyright (c) Microsoft Corporation. All rights reserved.
2 # Licensed under the MIT License.
3
4 from copy import copy
5 from typing import List, Callable, Union, Dict
6 from botbuilder.schema import Activity, ConversationReference, ResourceResponse
7
8
9 class TurnContext:
10 def __init__(self, adapter_or_context, request: Activity = None):
11 """
12 Creates a new TurnContext instance.
13 :param adapter_or_context:
14 :param request:
15 """
16 if isinstance(adapter_or_context, TurnContext):
17 adapter_or_context.copy_to(self)
18 else:
19 self.adapter = adapter_or_context
20 self._activity = request
21 self.responses: List[Activity] = []
22 self._services: dict = {}
23 self._on_send_activities: Callable[
24 ["TurnContext", List[Activity], Callable], List[ResourceResponse]
25 ] = []
26 self._on_update_activity: Callable[
27 ["TurnContext", Activity, Callable], ResourceResponse
28 ] = []
29 self._on_delete_activity: Callable[
30 ["TurnContext", ConversationReference, Callable], None
31 ] = []
32 self._responded: bool = False
33
34 if self.adapter is None:
35 raise TypeError("TurnContext must be instantiated with an adapter.")
36 if self.activity is None:
37 raise TypeError(
38 "TurnContext must be instantiated with a request parameter of type Activity."
39 )
40
41 self._turn_state = {}
42
43 @property
44 def turn_state(self) -> Dict[str, object]:
45 return self._turn_state
46
47 def copy_to(self, context: "TurnContext") -> None:
48 """
49 Called when this TurnContext instance is passed into the constructor of a new TurnContext
50 instance. Can be overridden in derived classes.
51 :param context:
52 :return:
53 """
54 for attribute in [
55 "adapter",
56 "activity",
57 "_responded",
58 "_services",
59 "_on_send_activities",
60 "_on_update_activity",
61 "_on_delete_activity",
62 ]:
63 setattr(context, attribute, getattr(self, attribute))
64
65 @property
66 def activity(self):
67 """
68 The received activity.
69 :return:
70 """
71 return self._activity
72
73 @activity.setter
74 def activity(self, value):
75 """
76 Used to set TurnContext._activity when a context object is created. Only takes instances of Activities.
77 :param value:
78 :return:
79 """
80 if not isinstance(value, Activity):
81 raise TypeError(
82 "TurnContext: cannot set `activity` to a type other than Activity."
83 )
84 self._activity = value
85
86 @property
87 def responded(self) -> bool:
88 """
89 If `true` at least one response has been sent for the current turn of conversation.
90 :return:
91 """
92 return self._responded
93
94 @responded.setter
95 def responded(self, value: bool):
96 if not value:
97 raise ValueError("TurnContext: cannot set TurnContext.responded to False.")
98 self._responded = True
99
100 @property
101 def services(self):
102 """
103 Map of services and other values cached for the lifetime of the turn.
104 :return:
105 """
106 return self._services
107
108 def get(self, key: str) -> object:
109 if not key or not isinstance(key, str):
110 raise TypeError('"key" must be a valid string.')
111 try:
112 return self._services[key]
113 except KeyError:
114 raise KeyError("%s not found in TurnContext._services." % key)
115
116 def has(self, key: str) -> bool:
117 """
118 Returns True is set() has been called for a key. The cached value may be of type 'None'.
119 :param key:
120 :return:
121 """
122 if key in self._services:
123 return True
124 return False
125
126 def set(self, key: str, value: object) -> None:
127 """
128 Caches a value for the lifetime of the current turn.
129 :param key:
130 :param value:
131 :return:
132 """
133 if not key or not isinstance(key, str):
134 raise KeyError('"key" must be a valid string.')
135
136 self._services[key] = value
137
138 async def send_activity(
139 self, *activity_or_text: Union[Activity, str]
140 ) -> ResourceResponse:
141 """
142 Sends a single activity or message to the user.
143 :param activity_or_text:
144 :return:
145 """
146 reference = TurnContext.get_conversation_reference(self.activity)
147
148 output = [
149 TurnContext.apply_conversation_reference(
150 Activity(text=a, type="message") if isinstance(a, str) else a, reference
151 )
152 for a in activity_or_text
153 ]
154 for activity in output:
155 if not activity.input_hint:
156 activity.input_hint = "acceptingInput"
157
158 async def callback(context: "TurnContext", output):
159 responses = await context.adapter.send_activities(context, output)
160 context._responded = True # pylint: disable=protected-access
161 return responses
162
163 result = await self._emit(
164 self._on_send_activities, output, callback(self, output)
165 )
166
167 return result[0] if result else ResourceResponse()
168
169 async def update_activity(self, activity: Activity):
170 """
171 Replaces an existing activity.
172 :param activity:
173 :return:
174 """
175 return await self._emit(
176 self._on_update_activity,
177 activity,
178 self.adapter.update_activity(self, activity),
179 )
180
181 async def delete_activity(self, id_or_reference: Union[str, ConversationReference]):
182 """
183 Deletes an existing activity.
184 :param id_or_reference:
185 :return:
186 """
187 if isinstance(id_or_reference, str):
188 reference = TurnContext.get_conversation_reference(self.activity)
189 reference.activity_id = id_or_reference
190 else:
191 reference = id_or_reference
192 return await self._emit(
193 self._on_delete_activity,
194 reference,
195 self.adapter.delete_activity(self, reference),
196 )
197
198 def on_send_activities(self, handler) -> "TurnContext":
199 """
200 Registers a handler to be notified of and potentially intercept the sending of activities.
201 :param handler:
202 :return:
203 """
204 self._on_send_activities.append(handler)
205 return self
206
207 def on_update_activity(self, handler) -> "TurnContext":
208 """
209 Registers a handler to be notified of and potentially intercept an activity being updated.
210 :param handler:
211 :return:
212 """
213 self._on_update_activity.append(handler)
214 return self
215
216 def on_delete_activity(self, handler) -> "TurnContext":
217 """
218 Registers a handler to be notified of and potentially intercept an activity being deleted.
219 :param handler:
220 :return:
221 """
222 self._on_delete_activity.append(handler)
223 return self
224
225 async def _emit(self, plugins, arg, logic):
226 handlers = copy(plugins)
227
228 async def emit_next(i: int):
229 context = self
230 try:
231 if i < len(handlers):
232
233 async def next_handler():
234 await emit_next(i + 1)
235
236 await handlers[i](context, arg, next_handler)
237
238 except Exception as error:
239 raise error
240
241 await emit_next(0)
242 # This should be changed to `return await logic()`
243 return await logic
244
245 @staticmethod
246 def get_conversation_reference(activity: Activity) -> ConversationReference:
247 """
248 Returns the conversation reference for an activity. This can be saved as a plain old JSON
249 object and then later used to message the user proactively.
250
251 Usage Example:
252 reference = TurnContext.get_conversation_reference(context.request)
253 :param activity:
254 :return:
255 """
256 return ConversationReference(
257 activity_id=activity.id,
258 user=copy(activity.from_property),
259 bot=copy(activity.recipient),
260 conversation=copy(activity.conversation),
261 channel_id=activity.channel_id,
262 service_url=activity.service_url,
263 )
264
265 @staticmethod
266 def apply_conversation_reference(
267 activity: Activity, reference: ConversationReference, is_incoming: bool = False
268 ) -> Activity:
269 """
270 Updates an activity with the delivery information from a conversation reference. Calling
271 this after get_conversation_reference on an incoming activity
272 will properly address the reply to a received activity.
273 :param activity:
274 :param reference:
275 :param is_incoming:
276 :return:
277 """
278 activity.channel_id = reference.channel_id
279 activity.service_url = reference.service_url
280 activity.conversation = reference.conversation
281 if is_incoming:
282 activity.from_property = reference.user
283 activity.recipient = reference.bot
284 if reference.activity_id:
285 activity.id = reference.activity_id
286 else:
287 activity.from_property = reference.bot
288 activity.recipient = reference.user
289 if reference.activity_id:
290 activity.reply_to_id = reference.activity_id
291
292 return activity
293
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/libraries/botbuilder-core/botbuilder/core/turn_context.py b/libraries/botbuilder-core/botbuilder/core/turn_context.py
--- a/libraries/botbuilder-core/botbuilder/core/turn_context.py
+++ b/libraries/botbuilder-core/botbuilder/core/turn_context.py
@@ -1,9 +1,10 @@
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.
+import re
from copy import copy
from typing import List, Callable, Union, Dict
-from botbuilder.schema import Activity, ConversationReference, ResourceResponse
+from botbuilder.schema import Activity, ConversationReference, Mention, ResourceResponse
class TurnContext:
@@ -290,3 +291,44 @@
activity.reply_to_id = reference.activity_id
return activity
+
+ @staticmethod
+ def get_reply_conversation_reference(
+ activity: Activity, reply: ResourceResponse
+ ) -> ConversationReference:
+ reference: ConversationReference = TurnContext.get_conversation_reference(
+ activity
+ )
+
+ # Update the reference with the new outgoing Activity's id.
+ reference.activity_id = reply.id
+
+ return reference
+
+ @staticmethod
+ def remove_recipient_mention(activity: Activity) -> str:
+ return TurnContext.remove_mention_text(activity, activity.recipient.id)
+
+ @staticmethod
+ def remove_mention_text(activity: Activity, identifier: str) -> str:
+ mentions = TurnContext.get_mentions(activity)
+ for mention in mentions:
+ if mention.mentioned.id == identifier:
+ mention_name_match = re.match(
+ r"<at(.*)>(.*?)<\/at>", mention.text, re.IGNORECASE
+ )
+ if mention_name_match:
+ activity.text = re.sub(
+ mention_name_match.groups()[1], "", activity.text
+ )
+ activity.text = re.sub(r"<at><\/at>", "", activity.text)
+ return activity.text
+
+ @staticmethod
+ def get_mentions(activity: Activity) -> List[Mention]:
+ result: List[Mention] = []
+ if activity.entities is not None:
+ for entity in activity.entities:
+ if entity.type.lower() == "mention":
+ result.append(entity)
+ return result
| {"golden_diff": "diff --git a/libraries/botbuilder-core/botbuilder/core/turn_context.py b/libraries/botbuilder-core/botbuilder/core/turn_context.py\n--- a/libraries/botbuilder-core/botbuilder/core/turn_context.py\n+++ b/libraries/botbuilder-core/botbuilder/core/turn_context.py\n@@ -1,9 +1,10 @@\n # Copyright (c) Microsoft Corporation. All rights reserved.\n # Licensed under the MIT License.\n \n+import re\n from copy import copy\n from typing import List, Callable, Union, Dict\n-from botbuilder.schema import Activity, ConversationReference, ResourceResponse\n+from botbuilder.schema import Activity, ConversationReference, Mention, ResourceResponse\n \n \n class TurnContext:\n@@ -290,3 +291,44 @@\n activity.reply_to_id = reference.activity_id\n \n return activity\n+\n+ @staticmethod\n+ def get_reply_conversation_reference(\n+ activity: Activity, reply: ResourceResponse\n+ ) -> ConversationReference:\n+ reference: ConversationReference = TurnContext.get_conversation_reference(\n+ activity\n+ )\n+\n+ # Update the reference with the new outgoing Activity's id.\n+ reference.activity_id = reply.id\n+\n+ return reference\n+\n+ @staticmethod\n+ def remove_recipient_mention(activity: Activity) -> str:\n+ return TurnContext.remove_mention_text(activity, activity.recipient.id)\n+\n+ @staticmethod\n+ def remove_mention_text(activity: Activity, identifier: str) -> str:\n+ mentions = TurnContext.get_mentions(activity)\n+ for mention in mentions:\n+ if mention.mentioned.id == identifier:\n+ mention_name_match = re.match(\n+ r\"<at(.*)>(.*?)<\\/at>\", mention.text, re.IGNORECASE\n+ )\n+ if mention_name_match:\n+ activity.text = re.sub(\n+ mention_name_match.groups()[1], \"\", activity.text\n+ )\n+ activity.text = re.sub(r\"<at><\\/at>\", \"\", activity.text)\n+ return activity.text\n+\n+ @staticmethod\n+ def get_mentions(activity: Activity) -> List[Mention]:\n+ result: List[Mention] = []\n+ if activity.entities is not None:\n+ for entity in activity.entities:\n+ if entity.type.lower() == \"mention\":\n+ result.append(entity)\n+ return result\n", "issue": "Remove Mention Support\n**Describe the solution you'd like**\r\nAdd Remove Mention support as JS and C#. See ActivityExtensions for a reference of Mention related methods. To remove mentions from Activity.Text, see ActivityExtensions.RemoveMentionText and ActivityExtensions.RemoveRecipientMention. Note that in JS it is TurnContext.removeMentionText.\r\n\r\n**Describe alternatives you've considered**\r\nNone\r\n\r\n**Additional context**\r\nI have implemented SkypeMentionNormalizeMiddleware on all platforms to correct Skype mentions. The user could still make use of this middleware, but would have to manually remove the mention from Activity.Text to have the same functionality as the other platforms.\r\n\r\n[enhancement]\r\n\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nfrom copy import copy\nfrom typing import List, Callable, Union, Dict\nfrom botbuilder.schema import Activity, ConversationReference, ResourceResponse\n\n\nclass TurnContext:\n def __init__(self, adapter_or_context, request: Activity = None):\n \"\"\"\n Creates a new TurnContext instance.\n :param adapter_or_context:\n :param request:\n \"\"\"\n if isinstance(adapter_or_context, TurnContext):\n adapter_or_context.copy_to(self)\n else:\n self.adapter = adapter_or_context\n self._activity = request\n self.responses: List[Activity] = []\n self._services: dict = {}\n self._on_send_activities: Callable[\n [\"TurnContext\", List[Activity], Callable], List[ResourceResponse]\n ] = []\n self._on_update_activity: Callable[\n [\"TurnContext\", Activity, Callable], ResourceResponse\n ] = []\n self._on_delete_activity: Callable[\n [\"TurnContext\", ConversationReference, Callable], None\n ] = []\n self._responded: bool = False\n\n if self.adapter is None:\n raise TypeError(\"TurnContext must be instantiated with an adapter.\")\n if self.activity is None:\n raise TypeError(\n \"TurnContext must be instantiated with a request parameter of type Activity.\"\n )\n\n self._turn_state = {}\n\n @property\n def turn_state(self) -> Dict[str, object]:\n return self._turn_state\n\n def copy_to(self, context: \"TurnContext\") -> None:\n \"\"\"\n Called when this TurnContext instance is passed into the constructor of a new TurnContext\n instance. Can be overridden in derived classes.\n :param context:\n :return:\n \"\"\"\n for attribute in [\n \"adapter\",\n \"activity\",\n \"_responded\",\n \"_services\",\n \"_on_send_activities\",\n \"_on_update_activity\",\n \"_on_delete_activity\",\n ]:\n setattr(context, attribute, getattr(self, attribute))\n\n @property\n def activity(self):\n \"\"\"\n The received activity.\n :return:\n \"\"\"\n return self._activity\n\n @activity.setter\n def activity(self, value):\n \"\"\"\n Used to set TurnContext._activity when a context object is created. Only takes instances of Activities.\n :param value:\n :return:\n \"\"\"\n if not isinstance(value, Activity):\n raise TypeError(\n \"TurnContext: cannot set `activity` to a type other than Activity.\"\n )\n self._activity = value\n\n @property\n def responded(self) -> bool:\n \"\"\"\n If `true` at least one response has been sent for the current turn of conversation.\n :return:\n \"\"\"\n return self._responded\n\n @responded.setter\n def responded(self, value: bool):\n if not value:\n raise ValueError(\"TurnContext: cannot set TurnContext.responded to False.\")\n self._responded = True\n\n @property\n def services(self):\n \"\"\"\n Map of services and other values cached for the lifetime of the turn.\n :return:\n \"\"\"\n return self._services\n\n def get(self, key: str) -> object:\n if not key or not isinstance(key, str):\n raise TypeError('\"key\" must be a valid string.')\n try:\n return self._services[key]\n except KeyError:\n raise KeyError(\"%s not found in TurnContext._services.\" % key)\n\n def has(self, key: str) -> bool:\n \"\"\"\n Returns True is set() has been called for a key. The cached value may be of type 'None'.\n :param key:\n :return:\n \"\"\"\n if key in self._services:\n return True\n return False\n\n def set(self, key: str, value: object) -> None:\n \"\"\"\n Caches a value for the lifetime of the current turn.\n :param key:\n :param value:\n :return:\n \"\"\"\n if not key or not isinstance(key, str):\n raise KeyError('\"key\" must be a valid string.')\n\n self._services[key] = value\n\n async def send_activity(\n self, *activity_or_text: Union[Activity, str]\n ) -> ResourceResponse:\n \"\"\"\n Sends a single activity or message to the user.\n :param activity_or_text:\n :return:\n \"\"\"\n reference = TurnContext.get_conversation_reference(self.activity)\n\n output = [\n TurnContext.apply_conversation_reference(\n Activity(text=a, type=\"message\") if isinstance(a, str) else a, reference\n )\n for a in activity_or_text\n ]\n for activity in output:\n if not activity.input_hint:\n activity.input_hint = \"acceptingInput\"\n\n async def callback(context: \"TurnContext\", output):\n responses = await context.adapter.send_activities(context, output)\n context._responded = True # pylint: disable=protected-access\n return responses\n\n result = await self._emit(\n self._on_send_activities, output, callback(self, output)\n )\n\n return result[0] if result else ResourceResponse()\n\n async def update_activity(self, activity: Activity):\n \"\"\"\n Replaces an existing activity.\n :param activity:\n :return:\n \"\"\"\n return await self._emit(\n self._on_update_activity,\n activity,\n self.adapter.update_activity(self, activity),\n )\n\n async def delete_activity(self, id_or_reference: Union[str, ConversationReference]):\n \"\"\"\n Deletes an existing activity.\n :param id_or_reference:\n :return:\n \"\"\"\n if isinstance(id_or_reference, str):\n reference = TurnContext.get_conversation_reference(self.activity)\n reference.activity_id = id_or_reference\n else:\n reference = id_or_reference\n return await self._emit(\n self._on_delete_activity,\n reference,\n self.adapter.delete_activity(self, reference),\n )\n\n def on_send_activities(self, handler) -> \"TurnContext\":\n \"\"\"\n Registers a handler to be notified of and potentially intercept the sending of activities.\n :param handler:\n :return:\n \"\"\"\n self._on_send_activities.append(handler)\n return self\n\n def on_update_activity(self, handler) -> \"TurnContext\":\n \"\"\"\n Registers a handler to be notified of and potentially intercept an activity being updated.\n :param handler:\n :return:\n \"\"\"\n self._on_update_activity.append(handler)\n return self\n\n def on_delete_activity(self, handler) -> \"TurnContext\":\n \"\"\"\n Registers a handler to be notified of and potentially intercept an activity being deleted.\n :param handler:\n :return:\n \"\"\"\n self._on_delete_activity.append(handler)\n return self\n\n async def _emit(self, plugins, arg, logic):\n handlers = copy(plugins)\n\n async def emit_next(i: int):\n context = self\n try:\n if i < len(handlers):\n\n async def next_handler():\n await emit_next(i + 1)\n\n await handlers[i](context, arg, next_handler)\n\n except Exception as error:\n raise error\n\n await emit_next(0)\n # This should be changed to `return await logic()`\n return await logic\n\n @staticmethod\n def get_conversation_reference(activity: Activity) -> ConversationReference:\n \"\"\"\n Returns the conversation reference for an activity. This can be saved as a plain old JSON\n object and then later used to message the user proactively.\n\n Usage Example:\n reference = TurnContext.get_conversation_reference(context.request)\n :param activity:\n :return:\n \"\"\"\n return ConversationReference(\n activity_id=activity.id,\n user=copy(activity.from_property),\n bot=copy(activity.recipient),\n conversation=copy(activity.conversation),\n channel_id=activity.channel_id,\n service_url=activity.service_url,\n )\n\n @staticmethod\n def apply_conversation_reference(\n activity: Activity, reference: ConversationReference, is_incoming: bool = False\n ) -> Activity:\n \"\"\"\n Updates an activity with the delivery information from a conversation reference. Calling\n this after get_conversation_reference on an incoming activity\n will properly address the reply to a received activity.\n :param activity:\n :param reference:\n :param is_incoming:\n :return:\n \"\"\"\n activity.channel_id = reference.channel_id\n activity.service_url = reference.service_url\n activity.conversation = reference.conversation\n if is_incoming:\n activity.from_property = reference.user\n activity.recipient = reference.bot\n if reference.activity_id:\n activity.id = reference.activity_id\n else:\n activity.from_property = reference.bot\n activity.recipient = reference.user\n if reference.activity_id:\n activity.reply_to_id = reference.activity_id\n\n return activity\n", "path": "libraries/botbuilder-core/botbuilder/core/turn_context.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nimport re\nfrom copy import copy\nfrom typing import List, Callable, Union, Dict\nfrom botbuilder.schema import Activity, ConversationReference, Mention, ResourceResponse\n\n\nclass TurnContext:\n def __init__(self, adapter_or_context, request: Activity = None):\n \"\"\"\n Creates a new TurnContext instance.\n :param adapter_or_context:\n :param request:\n \"\"\"\n if isinstance(adapter_or_context, TurnContext):\n adapter_or_context.copy_to(self)\n else:\n self.adapter = adapter_or_context\n self._activity = request\n self.responses: List[Activity] = []\n self._services: dict = {}\n self._on_send_activities: Callable[\n [\"TurnContext\", List[Activity], Callable], List[ResourceResponse]\n ] = []\n self._on_update_activity: Callable[\n [\"TurnContext\", Activity, Callable], ResourceResponse\n ] = []\n self._on_delete_activity: Callable[\n [\"TurnContext\", ConversationReference, Callable], None\n ] = []\n self._responded: bool = False\n\n if self.adapter is None:\n raise TypeError(\"TurnContext must be instantiated with an adapter.\")\n if self.activity is None:\n raise TypeError(\n \"TurnContext must be instantiated with a request parameter of type Activity.\"\n )\n\n self._turn_state = {}\n\n @property\n def turn_state(self) -> Dict[str, object]:\n return self._turn_state\n\n def copy_to(self, context: \"TurnContext\") -> None:\n \"\"\"\n Called when this TurnContext instance is passed into the constructor of a new TurnContext\n instance. Can be overridden in derived classes.\n :param context:\n :return:\n \"\"\"\n for attribute in [\n \"adapter\",\n \"activity\",\n \"_responded\",\n \"_services\",\n \"_on_send_activities\",\n \"_on_update_activity\",\n \"_on_delete_activity\",\n ]:\n setattr(context, attribute, getattr(self, attribute))\n\n @property\n def activity(self):\n \"\"\"\n The received activity.\n :return:\n \"\"\"\n return self._activity\n\n @activity.setter\n def activity(self, value):\n \"\"\"\n Used to set TurnContext._activity when a context object is created. Only takes instances of Activities.\n :param value:\n :return:\n \"\"\"\n if not isinstance(value, Activity):\n raise TypeError(\n \"TurnContext: cannot set `activity` to a type other than Activity.\"\n )\n self._activity = value\n\n @property\n def responded(self) -> bool:\n \"\"\"\n If `true` at least one response has been sent for the current turn of conversation.\n :return:\n \"\"\"\n return self._responded\n\n @responded.setter\n def responded(self, value: bool):\n if not value:\n raise ValueError(\"TurnContext: cannot set TurnContext.responded to False.\")\n self._responded = True\n\n @property\n def services(self):\n \"\"\"\n Map of services and other values cached for the lifetime of the turn.\n :return:\n \"\"\"\n return self._services\n\n def get(self, key: str) -> object:\n if not key or not isinstance(key, str):\n raise TypeError('\"key\" must be a valid string.')\n try:\n return self._services[key]\n except KeyError:\n raise KeyError(\"%s not found in TurnContext._services.\" % key)\n\n def has(self, key: str) -> bool:\n \"\"\"\n Returns True is set() has been called for a key. The cached value may be of type 'None'.\n :param key:\n :return:\n \"\"\"\n if key in self._services:\n return True\n return False\n\n def set(self, key: str, value: object) -> None:\n \"\"\"\n Caches a value for the lifetime of the current turn.\n :param key:\n :param value:\n :return:\n \"\"\"\n if not key or not isinstance(key, str):\n raise KeyError('\"key\" must be a valid string.')\n\n self._services[key] = value\n\n async def send_activity(\n self, *activity_or_text: Union[Activity, str]\n ) -> ResourceResponse:\n \"\"\"\n Sends a single activity or message to the user.\n :param activity_or_text:\n :return:\n \"\"\"\n reference = TurnContext.get_conversation_reference(self.activity)\n\n output = [\n TurnContext.apply_conversation_reference(\n Activity(text=a, type=\"message\") if isinstance(a, str) else a, reference\n )\n for a in activity_or_text\n ]\n for activity in output:\n if not activity.input_hint:\n activity.input_hint = \"acceptingInput\"\n\n async def callback(context: \"TurnContext\", output):\n responses = await context.adapter.send_activities(context, output)\n context._responded = True # pylint: disable=protected-access\n return responses\n\n result = await self._emit(\n self._on_send_activities, output, callback(self, output)\n )\n\n return result[0] if result else ResourceResponse()\n\n async def update_activity(self, activity: Activity):\n \"\"\"\n Replaces an existing activity.\n :param activity:\n :return:\n \"\"\"\n return await self._emit(\n self._on_update_activity,\n activity,\n self.adapter.update_activity(self, activity),\n )\n\n async def delete_activity(self, id_or_reference: Union[str, ConversationReference]):\n \"\"\"\n Deletes an existing activity.\n :param id_or_reference:\n :return:\n \"\"\"\n if isinstance(id_or_reference, str):\n reference = TurnContext.get_conversation_reference(self.activity)\n reference.activity_id = id_or_reference\n else:\n reference = id_or_reference\n return await self._emit(\n self._on_delete_activity,\n reference,\n self.adapter.delete_activity(self, reference),\n )\n\n def on_send_activities(self, handler) -> \"TurnContext\":\n \"\"\"\n Registers a handler to be notified of and potentially intercept the sending of activities.\n :param handler:\n :return:\n \"\"\"\n self._on_send_activities.append(handler)\n return self\n\n def on_update_activity(self, handler) -> \"TurnContext\":\n \"\"\"\n Registers a handler to be notified of and potentially intercept an activity being updated.\n :param handler:\n :return:\n \"\"\"\n self._on_update_activity.append(handler)\n return self\n\n def on_delete_activity(self, handler) -> \"TurnContext\":\n \"\"\"\n Registers a handler to be notified of and potentially intercept an activity being deleted.\n :param handler:\n :return:\n \"\"\"\n self._on_delete_activity.append(handler)\n return self\n\n async def _emit(self, plugins, arg, logic):\n handlers = copy(plugins)\n\n async def emit_next(i: int):\n context = self\n try:\n if i < len(handlers):\n\n async def next_handler():\n await emit_next(i + 1)\n\n await handlers[i](context, arg, next_handler)\n\n except Exception as error:\n raise error\n\n await emit_next(0)\n # This should be changed to `return await logic()`\n return await logic\n\n @staticmethod\n def get_conversation_reference(activity: Activity) -> ConversationReference:\n \"\"\"\n Returns the conversation reference for an activity. This can be saved as a plain old JSON\n object and then later used to message the user proactively.\n\n Usage Example:\n reference = TurnContext.get_conversation_reference(context.request)\n :param activity:\n :return:\n \"\"\"\n return ConversationReference(\n activity_id=activity.id,\n user=copy(activity.from_property),\n bot=copy(activity.recipient),\n conversation=copy(activity.conversation),\n channel_id=activity.channel_id,\n service_url=activity.service_url,\n )\n\n @staticmethod\n def apply_conversation_reference(\n activity: Activity, reference: ConversationReference, is_incoming: bool = False\n ) -> Activity:\n \"\"\"\n Updates an activity with the delivery information from a conversation reference. Calling\n this after get_conversation_reference on an incoming activity\n will properly address the reply to a received activity.\n :param activity:\n :param reference:\n :param is_incoming:\n :return:\n \"\"\"\n activity.channel_id = reference.channel_id\n activity.service_url = reference.service_url\n activity.conversation = reference.conversation\n if is_incoming:\n activity.from_property = reference.user\n activity.recipient = reference.bot\n if reference.activity_id:\n activity.id = reference.activity_id\n else:\n activity.from_property = reference.bot\n activity.recipient = reference.user\n if reference.activity_id:\n activity.reply_to_id = reference.activity_id\n\n return activity\n\n @staticmethod\n def get_reply_conversation_reference(\n activity: Activity, reply: ResourceResponse\n ) -> ConversationReference:\n reference: ConversationReference = TurnContext.get_conversation_reference(\n activity\n )\n\n # Update the reference with the new outgoing Activity's id.\n reference.activity_id = reply.id\n\n return reference\n\n @staticmethod\n def remove_recipient_mention(activity: Activity) -> str:\n return TurnContext.remove_mention_text(activity, activity.recipient.id)\n\n @staticmethod\n def remove_mention_text(activity: Activity, identifier: str) -> str:\n mentions = TurnContext.get_mentions(activity)\n for mention in mentions:\n if mention.mentioned.id == identifier:\n mention_name_match = re.match(\n r\"<at(.*)>(.*?)<\\/at>\", mention.text, re.IGNORECASE\n )\n if mention_name_match:\n activity.text = re.sub(\n mention_name_match.groups()[1], \"\", activity.text\n )\n activity.text = re.sub(r\"<at><\\/at>\", \"\", activity.text)\n return activity.text\n\n @staticmethod\n def get_mentions(activity: Activity) -> List[Mention]:\n result: List[Mention] = []\n if activity.entities is not None:\n for entity in activity.entities:\n if entity.type.lower() == \"mention\":\n result.append(entity)\n return result\n", "path": "libraries/botbuilder-core/botbuilder/core/turn_context.py"}]} | 3,066 | 509 |
gh_patches_debug_554 | rasdani/github-patches | git_diff | scikit-image__scikit-image-353 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Please add info how to run the skimage unit tests at the end of the installation instructions
I couldn't find instructions how to run the skimage unit tests.
First I tried
```
python -c 'import skimage; skimage.test()
```
which ran 287 tests and gave 16 errors, all the same:
```
ImportError: cannot import name BytesIO
```
Then I tried
```
nosetests --exe skimage
```
which ran 490 tests, no error.
Full output is here: https://gist.github.com/3832077
Apparently it is important to not use `skimage.test()`, but `nosetests` instead?
Could you please add this info somewhere, the first place I would have looked is at the end of http://skimage.org/docs/dev/install.html ( or make "nosetests" or "run tests" in the sphinx search find the appropriate command to run).
Thanks!
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `skimage/__init__.py`
Content:
```
1 """Image Processing SciKit (Toolbox for SciPy)
2
3 ``scikits-image`` (a.k.a. ``skimage``) is a collection of algorithms for image
4 processing and computer vision.
5
6 The main package of ``skimage`` only provides a few utilities for converting
7 between image data types; for most features, you need to import one of the
8 following subpackages:
9
10 Subpackages
11 -----------
12 color
13 Color space conversion.
14 data
15 Test images and example data.
16 draw
17 Image drawing primitives (lines, text, etc.).
18 exposure
19 Image intensity adjustment (e.g., histogram equalization).
20 feature
21 Feature detection (e.g. texture analysis, corners, etc.).
22 filter
23 Sharpening, edge finding, denoising, etc.
24 graph
25 Graph-theoretic operations, e.g. dynamic programming (shortest paths).
26 io
27 Reading, saving, and displaying images and video.
28 measure
29 Measurement of image properties, e.g., similarity and contours.
30 morphology
31 Morphological operations, e.g. opening or skeletonization.
32 segmentation
33 Splitting an image into self-similar regions.
34 transform
35 Geometric and other transforms, e.g. rotation or the Radon transform.
36 util
37 Generic utilities.
38
39 Utility Functions
40 -----------------
41 get_log
42 Returns the ``skimage`` log. Use this to print debug output.
43 img_as_float
44 Convert an image to floating point format, with values in [0, 1].
45 img_as_uint
46 Convert an image to unsigned integer format, with values in [0, 65535].
47 img_as_int
48 Convert an image to signed integer format, with values in [-32768, 32767].
49 img_as_ubyte
50 Convert an image to unsigned byte format, with values in [0, 255].
51
52 """
53
54 import os.path as _osp
55
56 pkg_dir = _osp.abspath(_osp.dirname(__file__))
57 data_dir = _osp.join(pkg_dir, 'data')
58
59 try:
60 from .version import version as __version__
61 except ImportError:
62 __version__ = "unbuilt-dev"
63
64
65 def _setup_test(verbose=False):
66 import functools
67
68 args = ['', '--exe', '-w', pkg_dir]
69 if verbose:
70 args.extend(['-v', '-s'])
71
72 try:
73 import nose as _nose
74 except ImportError:
75 def broken_test_func():
76 """This would invoke the skimage test suite, but nose couldn't be
77 imported so the test suite can not run.
78 """
79 raise ImportError("Could not load nose. Unit tests not available.")
80 return broken_test_func
81 else:
82 f = functools.partial(_nose.run, 'skimage', argv=args)
83 f.__doc__ = 'Invoke the skimage test suite.'
84 return f
85
86
87 test = _setup_test()
88 test_verbose = _setup_test(verbose=True)
89
90
91 def get_log(name=None):
92 """Return a console logger.
93
94 Output may be sent to the logger using the `debug`, `info`, `warning`,
95 `error` and `critical` methods.
96
97 Parameters
98 ----------
99 name : str
100 Name of the log.
101
102 References
103 ----------
104 .. [1] Logging facility for Python,
105 http://docs.python.org/library/logging.html
106
107 """
108 import logging
109
110 if name is None:
111 name = 'skimage'
112 else:
113 name = 'skimage.' + name
114
115 log = logging.getLogger(name)
116 return log
117
118
119 def _setup_log():
120 """Configure root logger.
121
122 """
123 import logging
124 import sys
125
126 formatter = logging.Formatter(
127 '%(name)s: %(levelname)s: %(message)s'
128 )
129
130 try:
131 handler = logging.StreamHandler(stream=sys.stdout)
132 except TypeError:
133 handler = logging.StreamHandler(strm=sys.stdout)
134 handler.setFormatter(formatter)
135
136 log = get_log()
137 log.addHandler(handler)
138 log.setLevel(logging.WARNING)
139 log.propagate = False
140
141 _setup_log()
142
143 from .util.dtype import *
144
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/skimage/__init__.py b/skimage/__init__.py
--- a/skimage/__init__.py
+++ b/skimage/__init__.py
@@ -65,7 +65,7 @@
def _setup_test(verbose=False):
import functools
- args = ['', '--exe', '-w', pkg_dir]
+ args = ['', pkg_dir, '--exe']
if verbose:
args.extend(['-v', '-s'])
| {"golden_diff": "diff --git a/skimage/__init__.py b/skimage/__init__.py\n--- a/skimage/__init__.py\n+++ b/skimage/__init__.py\n@@ -65,7 +65,7 @@\n def _setup_test(verbose=False):\n import functools\n \n- args = ['', '--exe', '-w', pkg_dir]\n+ args = ['', pkg_dir, '--exe']\n if verbose:\n args.extend(['-v', '-s'])\n", "issue": "Please add info how to run the skimage unit tests at the end of the installation instructions\nI couldn't find instructions how to run the skimage unit tests.\n\nFirst I tried\n\n```\npython -c 'import skimage; skimage.test()\n```\n\nwhich ran 287 tests and gave 16 errors, all the same:\n\n```\nImportError: cannot import name BytesIO\n```\n\nThen I tried\n\n```\nnosetests --exe skimage\n```\n\nwhich ran 490 tests, no error.\n\nFull output is here: https://gist.github.com/3832077\n\nApparently it is important to not use `skimage.test()`, but `nosetests` instead?\nCould you please add this info somewhere, the first place I would have looked is at the end of http://skimage.org/docs/dev/install.html ( or make \"nosetests\" or \"run tests\" in the sphinx search find the appropriate command to run).\n\nThanks!\n\n", "before_files": [{"content": "\"\"\"Image Processing SciKit (Toolbox for SciPy)\n\n``scikits-image`` (a.k.a. ``skimage``) is a collection of algorithms for image\nprocessing and computer vision.\n\nThe main package of ``skimage`` only provides a few utilities for converting\nbetween image data types; for most features, you need to import one of the\nfollowing subpackages:\n\nSubpackages\n-----------\ncolor\n Color space conversion.\ndata\n Test images and example data.\ndraw\n Image drawing primitives (lines, text, etc.).\nexposure\n Image intensity adjustment (e.g., histogram equalization).\nfeature\n Feature detection (e.g. texture analysis, corners, etc.).\nfilter\n Sharpening, edge finding, denoising, etc.\ngraph\n Graph-theoretic operations, e.g. dynamic programming (shortest paths).\nio\n Reading, saving, and displaying images and video.\nmeasure\n Measurement of image properties, e.g., similarity and contours.\nmorphology\n Morphological operations, e.g. opening or skeletonization.\nsegmentation\n Splitting an image into self-similar regions.\ntransform\n Geometric and other transforms, e.g. rotation or the Radon transform.\nutil\n Generic utilities.\n\nUtility Functions\n-----------------\nget_log\n Returns the ``skimage`` log. Use this to print debug output.\nimg_as_float\n Convert an image to floating point format, with values in [0, 1].\nimg_as_uint\n Convert an image to unsigned integer format, with values in [0, 65535].\nimg_as_int\n Convert an image to signed integer format, with values in [-32768, 32767].\nimg_as_ubyte\n Convert an image to unsigned byte format, with values in [0, 255].\n\n\"\"\"\n\nimport os.path as _osp\n\npkg_dir = _osp.abspath(_osp.dirname(__file__))\ndata_dir = _osp.join(pkg_dir, 'data')\n\ntry:\n from .version import version as __version__\nexcept ImportError:\n __version__ = \"unbuilt-dev\"\n\n\ndef _setup_test(verbose=False):\n import functools\n\n args = ['', '--exe', '-w', pkg_dir]\n if verbose:\n args.extend(['-v', '-s'])\n\n try:\n import nose as _nose\n except ImportError:\n def broken_test_func():\n \"\"\"This would invoke the skimage test suite, but nose couldn't be\n imported so the test suite can not run.\n \"\"\"\n raise ImportError(\"Could not load nose. Unit tests not available.\")\n return broken_test_func\n else:\n f = functools.partial(_nose.run, 'skimage', argv=args)\n f.__doc__ = 'Invoke the skimage test suite.'\n return f\n\n\ntest = _setup_test()\ntest_verbose = _setup_test(verbose=True)\n\n\ndef get_log(name=None):\n \"\"\"Return a console logger.\n\n Output may be sent to the logger using the `debug`, `info`, `warning`,\n `error` and `critical` methods.\n\n Parameters\n ----------\n name : str\n Name of the log.\n\n References\n ----------\n .. [1] Logging facility for Python,\n http://docs.python.org/library/logging.html\n\n \"\"\"\n import logging\n\n if name is None:\n name = 'skimage'\n else:\n name = 'skimage.' + name\n\n log = logging.getLogger(name)\n return log\n\n\ndef _setup_log():\n \"\"\"Configure root logger.\n\n \"\"\"\n import logging\n import sys\n\n formatter = logging.Formatter(\n '%(name)s: %(levelname)s: %(message)s'\n )\n\n try:\n handler = logging.StreamHandler(stream=sys.stdout)\n except TypeError:\n handler = logging.StreamHandler(strm=sys.stdout)\n handler.setFormatter(formatter)\n\n log = get_log()\n log.addHandler(handler)\n log.setLevel(logging.WARNING)\n log.propagate = False\n\n_setup_log()\n\nfrom .util.dtype import *\n", "path": "skimage/__init__.py"}], "after_files": [{"content": "\"\"\"Image Processing SciKit (Toolbox for SciPy)\n\n``scikits-image`` (a.k.a. ``skimage``) is a collection of algorithms for image\nprocessing and computer vision.\n\nThe main package of ``skimage`` only provides a few utilities for converting\nbetween image data types; for most features, you need to import one of the\nfollowing subpackages:\n\nSubpackages\n-----------\ncolor\n Color space conversion.\ndata\n Test images and example data.\ndraw\n Image drawing primitives (lines, text, etc.).\nexposure\n Image intensity adjustment (e.g., histogram equalization).\nfeature\n Feature detection (e.g. texture analysis, corners, etc.).\nfilter\n Sharpening, edge finding, denoising, etc.\ngraph\n Graph-theoretic operations, e.g. dynamic programming (shortest paths).\nio\n Reading, saving, and displaying images and video.\nmeasure\n Measurement of image properties, e.g., similarity and contours.\nmorphology\n Morphological operations, e.g. opening or skeletonization.\nsegmentation\n Splitting an image into self-similar regions.\ntransform\n Geometric and other transforms, e.g. rotation or the Radon transform.\nutil\n Generic utilities.\n\nUtility Functions\n-----------------\nget_log\n Returns the ``skimage`` log. Use this to print debug output.\nimg_as_float\n Convert an image to floating point format, with values in [0, 1].\nimg_as_uint\n Convert an image to unsigned integer format, with values in [0, 65535].\nimg_as_int\n Convert an image to signed integer format, with values in [-32768, 32767].\nimg_as_ubyte\n Convert an image to unsigned byte format, with values in [0, 255].\n\n\"\"\"\n\nimport os.path as _osp\n\npkg_dir = _osp.abspath(_osp.dirname(__file__))\ndata_dir = _osp.join(pkg_dir, 'data')\n\ntry:\n from .version import version as __version__\nexcept ImportError:\n __version__ = \"unbuilt-dev\"\n\n\ndef _setup_test(verbose=False):\n import functools\n\n args = ['', pkg_dir, '--exe']\n if verbose:\n args.extend(['-v', '-s'])\n\n try:\n import nose as _nose\n except ImportError:\n def broken_test_func():\n \"\"\"This would invoke the skimage test suite, but nose couldn't be\n imported so the test suite can not run.\n \"\"\"\n raise ImportError(\"Could not load nose. Unit tests not available.\")\n return broken_test_func\n else:\n f = functools.partial(_nose.run, 'skimage', argv=args)\n f.__doc__ = 'Invoke the skimage test suite.'\n return f\n\n\ntest = _setup_test()\ntest_verbose = _setup_test(verbose=True)\n\n\ndef get_log(name=None):\n \"\"\"Return a console logger.\n\n Output may be sent to the logger using the `debug`, `info`, `warning`,\n `error` and `critical` methods.\n\n Parameters\n ----------\n name : str\n Name of the log.\n\n References\n ----------\n .. [1] Logging facility for Python,\n http://docs.python.org/library/logging.html\n\n \"\"\"\n import logging\n\n if name is None:\n name = 'skimage'\n else:\n name = 'skimage.' + name\n\n log = logging.getLogger(name)\n return log\n\n\ndef _setup_log():\n \"\"\"Configure root logger.\n\n \"\"\"\n import logging\n import sys\n\n formatter = logging.Formatter(\n '%(name)s: %(levelname)s: %(message)s'\n )\n\n try:\n handler = logging.StreamHandler(stream=sys.stdout)\n except TypeError:\n handler = logging.StreamHandler(strm=sys.stdout)\n handler.setFormatter(formatter)\n\n log = get_log()\n log.addHandler(handler)\n log.setLevel(logging.WARNING)\n log.propagate = False\n\n_setup_log()\n\nfrom .util.dtype import *\n", "path": "skimage/__init__.py"}]} | 1,658 | 102 |
gh_patches_debug_28801 | rasdani/github-patches | git_diff | nautobot__nautobot-3925 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Removing credentials from a previously synced GitRepository that requires them may result in hangs
### Environment
* Nautobot version (Docker tag too if applicable): 1.5.21
### Steps to Reproduce
1. For a Git repository that requires authentication (such as a private GitHub repository), configure the `GitRepository` in Nautobot, with appropriate credentials, mark it as providing `Jobs`, and sync it successfully (which happens in the Celery worker environment).
2. Edit the `GitRepository` to remove the credentials and resync it. The resync fails as expected (again, in the Celery worker environment.)
3. Restart the Nautobot server (during startup, the `post_upgrade` signal will trigger a check and potential sync of all Git repositories in the Nautobot server environment in order to ensure that Jobs are properly discovered).
### Expected Behavior
Nautobot server to start up, perhaps with logs indicating that the repository could not be synced.
### Observed Behavior
Nautobot server startup hangs, apparently because GitPython received and didn't handle the Git `Username:` prompt and is waiting indefinitely for user input that will never come.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nautobot/utilities/git.py`
Content:
```
1 """General-purpose Git utilities."""
2
3 from collections import namedtuple
4 import logging
5 import os
6
7 from git import Repo
8
9
10 logger = logging.getLogger("nautobot.utilities.git")
11
12 # namedtuple takes a git log diff status and its accompanying text.
13 GitDiffLog = namedtuple("GitDiffLog", ["status", "text"])
14
15 # 'A' and 'D' status are swapped because of the way the repo.git.diff was implemented
16 # e.g. 'A' actually stands for Addition but in this case is Deletion
17 GIT_STATUS_MAP = {
18 "A": "Deletion",
19 "M": "Modification",
20 "C": "Copy",
21 "D": "Addition",
22 "R": "Renaming",
23 "T": "File Type Changed",
24 "U": "File Unmerged",
25 "X": "Unknown",
26 }
27
28
29 def swap_status_initials(data):
30 """Swap Git status initials with its equivalent."""
31 initial, text = data.split("\t")
32 return GitDiffLog(status=GIT_STATUS_MAP.get(initial), text=text)
33
34
35 def convert_git_diff_log_to_list(logs):
36 """
37 Convert Git diff log into a list splitted by \\n
38
39 Example:
40 >>> git_log = "M\tindex.html\nR\tsample.txt"
41 >>> print(convert_git_diff_log_to_list(git_log))
42 ["Modification - index.html", "Renaming - sample.txt"]
43 """
44 logs = logs.split("\n")
45 return [swap_status_initials(line) for line in logs]
46
47
48 class BranchDoesNotExist(Exception):
49 pass
50
51
52 class GitRepo:
53 def __init__(self, path, url, clone_initially=True):
54 """
55 Ensure that we have a clone of the given remote Git repository URL at the given local directory path.
56
57 Args:
58 path (str): path to git repo
59 url (str): git repo url
60 clone_initially (bool): True if the repo needs to be cloned
61 """
62 if os.path.isdir(path):
63 self.repo = Repo(path=path)
64 elif clone_initially:
65 self.repo = Repo.clone_from(url, to_path=path)
66 else:
67 self.repo = Repo.init(path)
68 self.repo.create_remote("origin", url=url)
69
70 if url not in self.repo.remotes.origin.urls:
71 self.repo.remotes.origin.set_url(url)
72
73 def fetch(self):
74 self.repo.remotes.origin.fetch()
75
76 def checkout(self, branch, commit_hexsha=None):
77 """
78 Check out the given branch, and optionally the specified commit within that branch.
79 """
80 # Short-circuit logic - do we already have this commit checked out?
81 if commit_hexsha and commit_hexsha == self.repo.head.commit.hexsha:
82 logger.debug(f"Commit {commit_hexsha} is already checked out.")
83 return commit_hexsha
84
85 self.fetch()
86 if commit_hexsha:
87 # Sanity check - GitPython doesn't provide a handy API for this so we just call a raw Git command:
88 # $ git branch origin/<branch> --remotes --contains <commit>
89 # prints the branch name if it DOES contain the commit, and nothing if it DOES NOT contain the commit.
90 # Since we did a `fetch` and not a `pull` above, we need to check for the commit in the remote origin
91 # branch, not the local (not-yet-updated) branch.
92 if branch not in self.repo.git.branch(f"origin/{branch}", "--remotes", "--contains", commit_hexsha):
93 raise RuntimeError(f"Requested to check out commit `{commit_hexsha}`, but it's not in branch {branch}!")
94 logger.info(f"Checking out commit `{commit_hexsha}` on branch `{branch}`...")
95 self.repo.git.checkout(commit_hexsha)
96 return commit_hexsha
97
98 if branch in self.repo.heads:
99 branch_head = self.repo.heads[branch]
100 else:
101 try:
102 branch_head = self.repo.create_head(branch, self.repo.remotes.origin.refs[branch])
103 branch_head.set_tracking_branch(self.repo.remotes.origin.refs[branch])
104 except IndexError as git_error:
105 logger.error(
106 "Branch %s does not exist at %s. %s", branch, list(self.repo.remotes.origin.urls)[0], git_error
107 )
108 raise BranchDoesNotExist(
109 f"Please create branch '{branch}' in upstream and try again."
110 f" If this is a new repo, please add a commit before syncing. {git_error}"
111 )
112
113 logger.info(f"Checking out latest commit on branch `{branch}`...")
114 branch_head.checkout()
115 # No specific commit hash was given, so make sure we get the latest from origin
116 # We would use repo.remotes.origin.pull() here, but that will fail in the case where someone has
117 # force-pushed to the upstream repo since the last time we did a pull. To be safe, we reset instead.
118 self.repo.head.reset(f"origin/{branch}", index=True, working_tree=True)
119 commit_hexsha = self.repo.head.reference.commit.hexsha
120 logger.info(f"Latest commit on branch `{branch}` is `{commit_hexsha}`")
121 return commit_hexsha
122
123 def diff_remote(self, branch):
124 logger.debug("Fetching from remote.")
125 self.fetch()
126
127 try:
128 self.repo.remotes.origin.refs[branch]
129 except IndexError as git_error:
130 logger.error(
131 "Branch %s does not exist at %s. %s", branch, list(self.repo.remotes.origin.urls)[0], git_error
132 )
133 raise BranchDoesNotExist(
134 f"Please create branch '{branch}' in upstream and try again."
135 f" If this is a new repo, please add a commit before syncing. {git_error}"
136 )
137
138 logger.debug("Getting diff between local branch and remote branch")
139 diff = self.repo.git.diff("--name-status", f"origin/{branch}")
140 if diff: # if diff is not empty
141 return convert_git_diff_log_to_list(diff)
142 logger.debug("No Difference")
143 return []
144
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/nautobot/utilities/git.py b/nautobot/utilities/git.py
--- a/nautobot/utilities/git.py
+++ b/nautobot/utilities/git.py
@@ -25,6 +25,11 @@
"X": "Unknown",
}
+# Environment variables to set on appropriate `git` CLI calls
+GIT_ENVIRONMENT = {
+ "GIT_TERMINAL_PROMPT": "0", # never prompt for user input such as credentials - important to avoid hangs!
+}
+
def swap_status_initials(data):
"""Swap Git status initials with its equivalent."""
@@ -59,10 +64,12 @@
url (str): git repo url
clone_initially (bool): True if the repo needs to be cloned
"""
- if os.path.isdir(path):
+ if os.path.isdir(path) and os.path.isdir(os.path.join(path, ".git")):
self.repo = Repo(path=path)
elif clone_initially:
- self.repo = Repo.clone_from(url, to_path=path)
+ # Don't log `url` as it may include authentication details.
+ logger.debug("Cloning git repository to %s...", path)
+ self.repo = Repo.clone_from(url, to_path=path, env=GIT_ENVIRONMENT)
else:
self.repo = Repo.init(path)
self.repo.create_remote("origin", url=url)
@@ -71,7 +78,8 @@
self.repo.remotes.origin.set_url(url)
def fetch(self):
- self.repo.remotes.origin.fetch()
+ with self.repo.git.custom_environment(**GIT_ENVIRONMENT):
+ self.repo.remotes.origin.fetch()
def checkout(self, branch, commit_hexsha=None):
"""
| {"golden_diff": "diff --git a/nautobot/utilities/git.py b/nautobot/utilities/git.py\n--- a/nautobot/utilities/git.py\n+++ b/nautobot/utilities/git.py\n@@ -25,6 +25,11 @@\n \"X\": \"Unknown\",\n }\n \n+# Environment variables to set on appropriate `git` CLI calls\n+GIT_ENVIRONMENT = {\n+ \"GIT_TERMINAL_PROMPT\": \"0\", # never prompt for user input such as credentials - important to avoid hangs!\n+}\n+\n \n def swap_status_initials(data):\n \"\"\"Swap Git status initials with its equivalent.\"\"\"\n@@ -59,10 +64,12 @@\n url (str): git repo url\n clone_initially (bool): True if the repo needs to be cloned\n \"\"\"\n- if os.path.isdir(path):\n+ if os.path.isdir(path) and os.path.isdir(os.path.join(path, \".git\")):\n self.repo = Repo(path=path)\n elif clone_initially:\n- self.repo = Repo.clone_from(url, to_path=path)\n+ # Don't log `url` as it may include authentication details.\n+ logger.debug(\"Cloning git repository to %s...\", path)\n+ self.repo = Repo.clone_from(url, to_path=path, env=GIT_ENVIRONMENT)\n else:\n self.repo = Repo.init(path)\n self.repo.create_remote(\"origin\", url=url)\n@@ -71,7 +78,8 @@\n self.repo.remotes.origin.set_url(url)\n \n def fetch(self):\n- self.repo.remotes.origin.fetch()\n+ with self.repo.git.custom_environment(**GIT_ENVIRONMENT):\n+ self.repo.remotes.origin.fetch()\n \n def checkout(self, branch, commit_hexsha=None):\n \"\"\"\n", "issue": "Removing credentials from a previously synced GitRepository that requires them may result in hangs\n### Environment\r\n* Nautobot version (Docker tag too if applicable): 1.5.21\r\n\r\n### Steps to Reproduce\r\n1. For a Git repository that requires authentication (such as a private GitHub repository), configure the `GitRepository` in Nautobot, with appropriate credentials, mark it as providing `Jobs`, and sync it successfully (which happens in the Celery worker environment).\r\n2. Edit the `GitRepository` to remove the credentials and resync it. The resync fails as expected (again, in the Celery worker environment.)\r\n3. Restart the Nautobot server (during startup, the `post_upgrade` signal will trigger a check and potential sync of all Git repositories in the Nautobot server environment in order to ensure that Jobs are properly discovered).\r\n\r\n### Expected Behavior\r\n\r\nNautobot server to start up, perhaps with logs indicating that the repository could not be synced.\r\n\r\n### Observed Behavior\r\n\r\nNautobot server startup hangs, apparently because GitPython received and didn't handle the Git `Username:` prompt and is waiting indefinitely for user input that will never come.\n", "before_files": [{"content": "\"\"\"General-purpose Git utilities.\"\"\"\n\nfrom collections import namedtuple\nimport logging\nimport os\n\nfrom git import Repo\n\n\nlogger = logging.getLogger(\"nautobot.utilities.git\")\n\n# namedtuple takes a git log diff status and its accompanying text.\nGitDiffLog = namedtuple(\"GitDiffLog\", [\"status\", \"text\"])\n\n# 'A' and 'D' status are swapped because of the way the repo.git.diff was implemented\n# e.g. 'A' actually stands for Addition but in this case is Deletion\nGIT_STATUS_MAP = {\n \"A\": \"Deletion\",\n \"M\": \"Modification\",\n \"C\": \"Copy\",\n \"D\": \"Addition\",\n \"R\": \"Renaming\",\n \"T\": \"File Type Changed\",\n \"U\": \"File Unmerged\",\n \"X\": \"Unknown\",\n}\n\n\ndef swap_status_initials(data):\n \"\"\"Swap Git status initials with its equivalent.\"\"\"\n initial, text = data.split(\"\\t\")\n return GitDiffLog(status=GIT_STATUS_MAP.get(initial), text=text)\n\n\ndef convert_git_diff_log_to_list(logs):\n \"\"\"\n Convert Git diff log into a list splitted by \\\\n\n\n Example:\n >>> git_log = \"M\\tindex.html\\nR\\tsample.txt\"\n >>> print(convert_git_diff_log_to_list(git_log))\n [\"Modification - index.html\", \"Renaming - sample.txt\"]\n \"\"\"\n logs = logs.split(\"\\n\")\n return [swap_status_initials(line) for line in logs]\n\n\nclass BranchDoesNotExist(Exception):\n pass\n\n\nclass GitRepo:\n def __init__(self, path, url, clone_initially=True):\n \"\"\"\n Ensure that we have a clone of the given remote Git repository URL at the given local directory path.\n\n Args:\n path (str): path to git repo\n url (str): git repo url\n clone_initially (bool): True if the repo needs to be cloned\n \"\"\"\n if os.path.isdir(path):\n self.repo = Repo(path=path)\n elif clone_initially:\n self.repo = Repo.clone_from(url, to_path=path)\n else:\n self.repo = Repo.init(path)\n self.repo.create_remote(\"origin\", url=url)\n\n if url not in self.repo.remotes.origin.urls:\n self.repo.remotes.origin.set_url(url)\n\n def fetch(self):\n self.repo.remotes.origin.fetch()\n\n def checkout(self, branch, commit_hexsha=None):\n \"\"\"\n Check out the given branch, and optionally the specified commit within that branch.\n \"\"\"\n # Short-circuit logic - do we already have this commit checked out?\n if commit_hexsha and commit_hexsha == self.repo.head.commit.hexsha:\n logger.debug(f\"Commit {commit_hexsha} is already checked out.\")\n return commit_hexsha\n\n self.fetch()\n if commit_hexsha:\n # Sanity check - GitPython doesn't provide a handy API for this so we just call a raw Git command:\n # $ git branch origin/<branch> --remotes --contains <commit>\n # prints the branch name if it DOES contain the commit, and nothing if it DOES NOT contain the commit.\n # Since we did a `fetch` and not a `pull` above, we need to check for the commit in the remote origin\n # branch, not the local (not-yet-updated) branch.\n if branch not in self.repo.git.branch(f\"origin/{branch}\", \"--remotes\", \"--contains\", commit_hexsha):\n raise RuntimeError(f\"Requested to check out commit `{commit_hexsha}`, but it's not in branch {branch}!\")\n logger.info(f\"Checking out commit `{commit_hexsha}` on branch `{branch}`...\")\n self.repo.git.checkout(commit_hexsha)\n return commit_hexsha\n\n if branch in self.repo.heads:\n branch_head = self.repo.heads[branch]\n else:\n try:\n branch_head = self.repo.create_head(branch, self.repo.remotes.origin.refs[branch])\n branch_head.set_tracking_branch(self.repo.remotes.origin.refs[branch])\n except IndexError as git_error:\n logger.error(\n \"Branch %s does not exist at %s. %s\", branch, list(self.repo.remotes.origin.urls)[0], git_error\n )\n raise BranchDoesNotExist(\n f\"Please create branch '{branch}' in upstream and try again.\"\n f\" If this is a new repo, please add a commit before syncing. {git_error}\"\n )\n\n logger.info(f\"Checking out latest commit on branch `{branch}`...\")\n branch_head.checkout()\n # No specific commit hash was given, so make sure we get the latest from origin\n # We would use repo.remotes.origin.pull() here, but that will fail in the case where someone has\n # force-pushed to the upstream repo since the last time we did a pull. To be safe, we reset instead.\n self.repo.head.reset(f\"origin/{branch}\", index=True, working_tree=True)\n commit_hexsha = self.repo.head.reference.commit.hexsha\n logger.info(f\"Latest commit on branch `{branch}` is `{commit_hexsha}`\")\n return commit_hexsha\n\n def diff_remote(self, branch):\n logger.debug(\"Fetching from remote.\")\n self.fetch()\n\n try:\n self.repo.remotes.origin.refs[branch]\n except IndexError as git_error:\n logger.error(\n \"Branch %s does not exist at %s. %s\", branch, list(self.repo.remotes.origin.urls)[0], git_error\n )\n raise BranchDoesNotExist(\n f\"Please create branch '{branch}' in upstream and try again.\"\n f\" If this is a new repo, please add a commit before syncing. {git_error}\"\n )\n\n logger.debug(\"Getting diff between local branch and remote branch\")\n diff = self.repo.git.diff(\"--name-status\", f\"origin/{branch}\")\n if diff: # if diff is not empty\n return convert_git_diff_log_to_list(diff)\n logger.debug(\"No Difference\")\n return []\n", "path": "nautobot/utilities/git.py"}], "after_files": [{"content": "\"\"\"General-purpose Git utilities.\"\"\"\n\nfrom collections import namedtuple\nimport logging\nimport os\n\nfrom git import Repo\n\n\nlogger = logging.getLogger(\"nautobot.utilities.git\")\n\n# namedtuple takes a git log diff status and its accompanying text.\nGitDiffLog = namedtuple(\"GitDiffLog\", [\"status\", \"text\"])\n\n# 'A' and 'D' status are swapped because of the way the repo.git.diff was implemented\n# e.g. 'A' actually stands for Addition but in this case is Deletion\nGIT_STATUS_MAP = {\n \"A\": \"Deletion\",\n \"M\": \"Modification\",\n \"C\": \"Copy\",\n \"D\": \"Addition\",\n \"R\": \"Renaming\",\n \"T\": \"File Type Changed\",\n \"U\": \"File Unmerged\",\n \"X\": \"Unknown\",\n}\n\n# Environment variables to set on appropriate `git` CLI calls\nGIT_ENVIRONMENT = {\n \"GIT_TERMINAL_PROMPT\": \"0\", # never prompt for user input such as credentials - important to avoid hangs!\n}\n\n\ndef swap_status_initials(data):\n \"\"\"Swap Git status initials with its equivalent.\"\"\"\n initial, text = data.split(\"\\t\")\n return GitDiffLog(status=GIT_STATUS_MAP.get(initial), text=text)\n\n\ndef convert_git_diff_log_to_list(logs):\n \"\"\"\n Convert Git diff log into a list splitted by \\\\n\n\n Example:\n >>> git_log = \"M\\tindex.html\\nR\\tsample.txt\"\n >>> print(convert_git_diff_log_to_list(git_log))\n [\"Modification - index.html\", \"Renaming - sample.txt\"]\n \"\"\"\n logs = logs.split(\"\\n\")\n return [swap_status_initials(line) for line in logs]\n\n\nclass BranchDoesNotExist(Exception):\n pass\n\n\nclass GitRepo:\n def __init__(self, path, url, clone_initially=True):\n \"\"\"\n Ensure that we have a clone of the given remote Git repository URL at the given local directory path.\n\n Args:\n path (str): path to git repo\n url (str): git repo url\n clone_initially (bool): True if the repo needs to be cloned\n \"\"\"\n if os.path.isdir(path) and os.path.isdir(os.path.join(path, \".git\")):\n self.repo = Repo(path=path)\n elif clone_initially:\n # Don't log `url` as it may include authentication details.\n logger.debug(\"Cloning git repository to %s...\", path)\n self.repo = Repo.clone_from(url, to_path=path, env=GIT_ENVIRONMENT)\n else:\n self.repo = Repo.init(path)\n self.repo.create_remote(\"origin\", url=url)\n\n if url not in self.repo.remotes.origin.urls:\n self.repo.remotes.origin.set_url(url)\n\n def fetch(self):\n with self.repo.git.custom_environment(**GIT_ENVIRONMENT):\n self.repo.remotes.origin.fetch()\n\n def checkout(self, branch, commit_hexsha=None):\n \"\"\"\n Check out the given branch, and optionally the specified commit within that branch.\n \"\"\"\n # Short-circuit logic - do we already have this commit checked out?\n if commit_hexsha and commit_hexsha == self.repo.head.commit.hexsha:\n logger.debug(f\"Commit {commit_hexsha} is already checked out.\")\n return commit_hexsha\n\n self.fetch()\n if commit_hexsha:\n # Sanity check - GitPython doesn't provide a handy API for this so we just call a raw Git command:\n # $ git branch origin/<branch> --remotes --contains <commit>\n # prints the branch name if it DOES contain the commit, and nothing if it DOES NOT contain the commit.\n # Since we did a `fetch` and not a `pull` above, we need to check for the commit in the remote origin\n # branch, not the local (not-yet-updated) branch.\n if branch not in self.repo.git.branch(f\"origin/{branch}\", \"--remotes\", \"--contains\", commit_hexsha):\n raise RuntimeError(f\"Requested to check out commit `{commit_hexsha}`, but it's not in branch {branch}!\")\n logger.info(f\"Checking out commit `{commit_hexsha}` on branch `{branch}`...\")\n self.repo.git.checkout(commit_hexsha)\n return commit_hexsha\n\n if branch in self.repo.heads:\n branch_head = self.repo.heads[branch]\n else:\n try:\n branch_head = self.repo.create_head(branch, self.repo.remotes.origin.refs[branch])\n branch_head.set_tracking_branch(self.repo.remotes.origin.refs[branch])\n except IndexError as git_error:\n logger.error(\n \"Branch %s does not exist at %s. %s\", branch, list(self.repo.remotes.origin.urls)[0], git_error\n )\n raise BranchDoesNotExist(\n f\"Please create branch '{branch}' in upstream and try again.\"\n f\" If this is a new repo, please add a commit before syncing. {git_error}\"\n )\n\n logger.info(f\"Checking out latest commit on branch `{branch}`...\")\n branch_head.checkout()\n # No specific commit hash was given, so make sure we get the latest from origin\n # We would use repo.remotes.origin.pull() here, but that will fail in the case where someone has\n # force-pushed to the upstream repo since the last time we did a pull. To be safe, we reset instead.\n self.repo.head.reset(f\"origin/{branch}\", index=True, working_tree=True)\n commit_hexsha = self.repo.head.reference.commit.hexsha\n logger.info(f\"Latest commit on branch `{branch}` is `{commit_hexsha}`\")\n return commit_hexsha\n\n def diff_remote(self, branch):\n logger.debug(\"Fetching from remote.\")\n self.fetch()\n\n try:\n self.repo.remotes.origin.refs[branch]\n except IndexError as git_error:\n logger.error(\n \"Branch %s does not exist at %s. %s\", branch, list(self.repo.remotes.origin.urls)[0], git_error\n )\n raise BranchDoesNotExist(\n f\"Please create branch '{branch}' in upstream and try again.\"\n f\" If this is a new repo, please add a commit before syncing. {git_error}\"\n )\n\n logger.debug(\"Getting diff between local branch and remote branch\")\n diff = self.repo.git.diff(\"--name-status\", f\"origin/{branch}\")\n if diff: # if diff is not empty\n return convert_git_diff_log_to_list(diff)\n logger.debug(\"No Difference\")\n return []\n", "path": "nautobot/utilities/git.py"}]} | 2,114 | 375 |
gh_patches_debug_36057 | rasdani/github-patches | git_diff | PennyLaneAI__pennylane-1581 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Change order of technical description and list of functions in documentation
Three modules, [`kernels`](https://pennylane.readthedocs.io/en/latest/code/qml_kernels.html), [`grouping`](https://pennylane.readthedocs.io/en/latest/code/qml_grouping.html), and [`qaoa`](https://pennylane.readthedocs.io/en/latest/code/qml_qaoa.html) have their module documentation ordered such that there is first a lengthy description of the theory, and the actual list of functions comes after. We should update the docs of these modules so that the functions appear *first*, and the technical details come afterwards (as was recently discussed in #1160). This will improve readability of the documentation and make it easier to find the details of a desired function.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pennylane/qaoa/__init__.py`
Content:
```
1 # Copyright 2018-2021 Xanadu Quantum Technologies Inc.
2
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6
7 # http://www.apache.org/licenses/LICENSE-2.0
8
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 r"""
15 This module contains functionality to construct QAOA workflows in PennyLane.
16 """
17
18 from .mixers import *
19 from .cost import *
20 from .layers import *
21 import pennylane.qaoa.cycle
22
```
Path: `pennylane/kernels/__init__.py`
Content:
```
1 # Copyright 2018-2021 Xanadu Quantum Technologies Inc.
2
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6
7 # http://www.apache.org/licenses/LICENSE-2.0
8
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 r"""
15 This subpackage defines functions that relate to quantum kernel methods.
16 On one hand this includes functions to call a quantum kernel systematically
17 on training and test datasets to obtain the *kernel matrix*.
18 On the other hand it provides postprocessing methods for those kernel
19 matrices which can be used to mitigate device noise and sampling errors.
20
21 Given a kernel
22
23 .. math ::
24
25 k: \mathbb{R}^d \times \mathbb{R}^d \to \mathbb{R}, \quad
26 (x_1, x_2)\mapsto k(x_1, x_2)
27
28 the kernel matrix of :math:`k` on a training dataset
29 :math:`\{(x_1, y_1),\cdots (x_n, y_n)\}` with :math:`x_i\in\mathbb{R}^d`
30 and :math:`y_i\in\{-1, 1\}` is defined as
31
32 .. math ::
33
34 K_{ij} = k(x_i, x_j).
35
36 For valid kernels, this is a real symmetric positive semi-definite matrix.
37 We also define the *ideal kernel matrix* for the training dataset which
38 perfectly predicts whether two points have identical labels or not:
39
40 .. math ::
41
42 K^\ast_{ij} = y_i y_j
43
44 We can measure the similarity between :math:`K` and :math:`K^\ast`,
45 through the *kernel polarity* which can be expressed as the Frobenius inner
46 product between the two matrices:
47
48 .. math ::
49
50 \operatorname{P}(k) = \langle K^\ast, K \rangle_F = \sum_{i,j=1}^n y_i y_j k(x_i, x_j)
51
52 Additionally, there is the *kernel-target alignment*, which is the normalized
53 counterpart to the kernel polarity:
54
55 .. math ::
56
57 \operatorname{TA}(k) &= \frac{P(k)}{\lVert K^\ast \rVert_F\;\lVert K \rVert_F}\\
58 \lVert K\rVert_F &= \sqrt{\sum_{i,j=1}^n k(x_i, x_j)^2}\\
59 \lVert K^\ast\rVert_F &= \sqrt{\sum_{i,j=1}^n (y_iy_j)^2}
60
61 For datasets with different numbers of training points per class the labels are rescaled
62 by the number of datapoints in the respective class to avoid that kernel polarity and
63 kernel-target alignment are dominated by the properties of the kernel for just a single class.
64
65 Given a callable kernel function, all these quantities can readily be computed
66 using the methods in this module.
67 """
68 from .cost_functions import (
69 polarity,
70 target_alignment,
71 )
72 from .postprocessing import (
73 threshold_matrix,
74 displace_matrix,
75 flip_matrix,
76 closest_psd_matrix,
77 mitigate_depolarizing_noise,
78 )
79 from .utils import (
80 kernel_matrix,
81 square_kernel_matrix,
82 )
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pennylane/kernels/__init__.py b/pennylane/kernels/__init__.py
--- a/pennylane/kernels/__init__.py
+++ b/pennylane/kernels/__init__.py
@@ -13,58 +13,8 @@
# limitations under the License.
r"""
This subpackage defines functions that relate to quantum kernel methods.
-On one hand this includes functions to call a quantum kernel systematically
-on training and test datasets to obtain the *kernel matrix*.
-On the other hand it provides postprocessing methods for those kernel
-matrices which can be used to mitigate device noise and sampling errors.
-
-Given a kernel
-
-.. math ::
-
- k: \mathbb{R}^d \times \mathbb{R}^d \to \mathbb{R}, \quad
- (x_1, x_2)\mapsto k(x_1, x_2)
-
-the kernel matrix of :math:`k` on a training dataset
-:math:`\{(x_1, y_1),\cdots (x_n, y_n)\}` with :math:`x_i\in\mathbb{R}^d`
-and :math:`y_i\in\{-1, 1\}` is defined as
-
-.. math ::
-
- K_{ij} = k(x_i, x_j).
-
-For valid kernels, this is a real symmetric positive semi-definite matrix.
-We also define the *ideal kernel matrix* for the training dataset which
-perfectly predicts whether two points have identical labels or not:
-
-.. math ::
-
- K^\ast_{ij} = y_i y_j
-
-We can measure the similarity between :math:`K` and :math:`K^\ast`,
-through the *kernel polarity* which can be expressed as the Frobenius inner
-product between the two matrices:
-
-.. math ::
-
- \operatorname{P}(k) = \langle K^\ast, K \rangle_F = \sum_{i,j=1}^n y_i y_j k(x_i, x_j)
-
-Additionally, there is the *kernel-target alignment*, which is the normalized
-counterpart to the kernel polarity:
-
-.. math ::
-
- \operatorname{TA}(k) &= \frac{P(k)}{\lVert K^\ast \rVert_F\;\lVert K \rVert_F}\\
- \lVert K\rVert_F &= \sqrt{\sum_{i,j=1}^n k(x_i, x_j)^2}\\
- \lVert K^\ast\rVert_F &= \sqrt{\sum_{i,j=1}^n (y_iy_j)^2}
-
-For datasets with different numbers of training points per class the labels are rescaled
-by the number of datapoints in the respective class to avoid that kernel polarity and
-kernel-target alignment are dominated by the properties of the kernel for just a single class.
-
-Given a callable kernel function, all these quantities can readily be computed
-using the methods in this module.
"""
+
from .cost_functions import (
polarity,
target_alignment,
diff --git a/pennylane/qaoa/__init__.py b/pennylane/qaoa/__init__.py
--- a/pennylane/qaoa/__init__.py
+++ b/pennylane/qaoa/__init__.py
@@ -12,10 +12,11 @@
# See the License for the specific language governing permissions and
# limitations under the License.
r"""
-This module contains functionality to construct QAOA workflows in PennyLane.
+This module provides a collection of methods that help in the construction of
+QAOA workflows.
"""
+import pennylane.qaoa.cycle
from .mixers import *
from .cost import *
from .layers import *
-import pennylane.qaoa.cycle
| {"golden_diff": "diff --git a/pennylane/kernels/__init__.py b/pennylane/kernels/__init__.py\n--- a/pennylane/kernels/__init__.py\n+++ b/pennylane/kernels/__init__.py\n@@ -13,58 +13,8 @@\n # limitations under the License.\n r\"\"\"\n This subpackage defines functions that relate to quantum kernel methods.\n-On one hand this includes functions to call a quantum kernel systematically\n-on training and test datasets to obtain the *kernel matrix*.\n-On the other hand it provides postprocessing methods for those kernel\n-matrices which can be used to mitigate device noise and sampling errors.\n-\n-Given a kernel\n-\n-.. math ::\n-\n- k: \\mathbb{R}^d \\times \\mathbb{R}^d \\to \\mathbb{R}, \\quad\n- (x_1, x_2)\\mapsto k(x_1, x_2)\n-\n-the kernel matrix of :math:`k` on a training dataset\n-:math:`\\{(x_1, y_1),\\cdots (x_n, y_n)\\}` with :math:`x_i\\in\\mathbb{R}^d`\n-and :math:`y_i\\in\\{-1, 1\\}` is defined as\n-\n-.. math ::\n-\n- K_{ij} = k(x_i, x_j).\n-\n-For valid kernels, this is a real symmetric positive semi-definite matrix.\n-We also define the *ideal kernel matrix* for the training dataset which\n-perfectly predicts whether two points have identical labels or not:\n-\n-.. math ::\n-\n- K^\\ast_{ij} = y_i y_j\n-\n-We can measure the similarity between :math:`K` and :math:`K^\\ast`,\n-through the *kernel polarity* which can be expressed as the Frobenius inner\n-product between the two matrices:\n-\n-.. math ::\n-\n- \\operatorname{P}(k) = \\langle K^\\ast, K \\rangle_F = \\sum_{i,j=1}^n y_i y_j k(x_i, x_j)\n-\n-Additionally, there is the *kernel-target alignment*, which is the normalized\n-counterpart to the kernel polarity:\n-\n-.. math ::\n-\n- \\operatorname{TA}(k) &= \\frac{P(k)}{\\lVert K^\\ast \\rVert_F\\;\\lVert K \\rVert_F}\\\\\n- \\lVert K\\rVert_F &= \\sqrt{\\sum_{i,j=1}^n k(x_i, x_j)^2}\\\\\n- \\lVert K^\\ast\\rVert_F &= \\sqrt{\\sum_{i,j=1}^n (y_iy_j)^2}\n-\n-For datasets with different numbers of training points per class the labels are rescaled\n-by the number of datapoints in the respective class to avoid that kernel polarity and\n-kernel-target alignment are dominated by the properties of the kernel for just a single class.\n-\n-Given a callable kernel function, all these quantities can readily be computed\n-using the methods in this module.\n \"\"\"\n+\n from .cost_functions import (\n polarity,\n target_alignment,\ndiff --git a/pennylane/qaoa/__init__.py b/pennylane/qaoa/__init__.py\n--- a/pennylane/qaoa/__init__.py\n+++ b/pennylane/qaoa/__init__.py\n@@ -12,10 +12,11 @@\n # See the License for the specific language governing permissions and\n # limitations under the License.\n r\"\"\"\n-This module contains functionality to construct QAOA workflows in PennyLane.\n+This module provides a collection of methods that help in the construction of\n+QAOA workflows.\n \"\"\"\n \n+import pennylane.qaoa.cycle\n from .mixers import *\n from .cost import *\n from .layers import *\n-import pennylane.qaoa.cycle\n", "issue": "Change order of technical description and list of functions in documentation\nThree modules, [`kernels`](https://pennylane.readthedocs.io/en/latest/code/qml_kernels.html), [`grouping`](https://pennylane.readthedocs.io/en/latest/code/qml_grouping.html), and [`qaoa`](https://pennylane.readthedocs.io/en/latest/code/qml_qaoa.html) have their module documentation ordered such that there is first a lengthy description of the theory, and the actual list of functions comes after. We should update the docs of these modules so that the functions appear *first*, and the technical details come afterwards (as was recently discussed in #1160). This will improve readability of the documentation and make it easier to find the details of a desired function.\n", "before_files": [{"content": "# Copyright 2018-2021 Xanadu Quantum Technologies Inc.\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n\n# http://www.apache.org/licenses/LICENSE-2.0\n\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nr\"\"\"\nThis module contains functionality to construct QAOA workflows in PennyLane.\n\"\"\"\n\nfrom .mixers import *\nfrom .cost import *\nfrom .layers import *\nimport pennylane.qaoa.cycle\n", "path": "pennylane/qaoa/__init__.py"}, {"content": "# Copyright 2018-2021 Xanadu Quantum Technologies Inc.\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n\n# http://www.apache.org/licenses/LICENSE-2.0\n\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nr\"\"\"\nThis subpackage defines functions that relate to quantum kernel methods.\nOn one hand this includes functions to call a quantum kernel systematically\non training and test datasets to obtain the *kernel matrix*.\nOn the other hand it provides postprocessing methods for those kernel\nmatrices which can be used to mitigate device noise and sampling errors.\n\nGiven a kernel\n\n.. math ::\n\n k: \\mathbb{R}^d \\times \\mathbb{R}^d \\to \\mathbb{R}, \\quad\n (x_1, x_2)\\mapsto k(x_1, x_2)\n\nthe kernel matrix of :math:`k` on a training dataset\n:math:`\\{(x_1, y_1),\\cdots (x_n, y_n)\\}` with :math:`x_i\\in\\mathbb{R}^d`\nand :math:`y_i\\in\\{-1, 1\\}` is defined as\n\n.. math ::\n\n K_{ij} = k(x_i, x_j).\n\nFor valid kernels, this is a real symmetric positive semi-definite matrix.\nWe also define the *ideal kernel matrix* for the training dataset which\nperfectly predicts whether two points have identical labels or not:\n\n.. math ::\n\n K^\\ast_{ij} = y_i y_j\n\nWe can measure the similarity between :math:`K` and :math:`K^\\ast`,\nthrough the *kernel polarity* which can be expressed as the Frobenius inner\nproduct between the two matrices:\n\n.. math ::\n\n \\operatorname{P}(k) = \\langle K^\\ast, K \\rangle_F = \\sum_{i,j=1}^n y_i y_j k(x_i, x_j)\n\nAdditionally, there is the *kernel-target alignment*, which is the normalized\ncounterpart to the kernel polarity:\n\n.. math ::\n\n \\operatorname{TA}(k) &= \\frac{P(k)}{\\lVert K^\\ast \\rVert_F\\;\\lVert K \\rVert_F}\\\\\n \\lVert K\\rVert_F &= \\sqrt{\\sum_{i,j=1}^n k(x_i, x_j)^2}\\\\\n \\lVert K^\\ast\\rVert_F &= \\sqrt{\\sum_{i,j=1}^n (y_iy_j)^2}\n\nFor datasets with different numbers of training points per class the labels are rescaled\nby the number of datapoints in the respective class to avoid that kernel polarity and\nkernel-target alignment are dominated by the properties of the kernel for just a single class.\n\nGiven a callable kernel function, all these quantities can readily be computed\nusing the methods in this module.\n\"\"\"\nfrom .cost_functions import (\n polarity,\n target_alignment,\n)\nfrom .postprocessing import (\n threshold_matrix,\n displace_matrix,\n flip_matrix,\n closest_psd_matrix,\n mitigate_depolarizing_noise,\n)\nfrom .utils import (\n kernel_matrix,\n square_kernel_matrix,\n)\n", "path": "pennylane/kernels/__init__.py"}], "after_files": [{"content": "# Copyright 2018-2021 Xanadu Quantum Technologies Inc.\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n\n# http://www.apache.org/licenses/LICENSE-2.0\n\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nr\"\"\"\nThis module provides a collection of methods that help in the construction of\nQAOA workflows.\n\"\"\"\n\nimport pennylane.qaoa.cycle\nfrom .mixers import *\nfrom .cost import *\nfrom .layers import *\n", "path": "pennylane/qaoa/__init__.py"}, {"content": "# Copyright 2018-2021 Xanadu Quantum Technologies Inc.\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n\n# http://www.apache.org/licenses/LICENSE-2.0\n\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nr\"\"\"\nThis subpackage defines functions that relate to quantum kernel methods.\n\"\"\"\n\nfrom .cost_functions import (\n polarity,\n target_alignment,\n)\nfrom .postprocessing import (\n threshold_matrix,\n displace_matrix,\n flip_matrix,\n closest_psd_matrix,\n mitigate_depolarizing_noise,\n)\nfrom .utils import (\n kernel_matrix,\n square_kernel_matrix,\n)\n", "path": "pennylane/kernels/__init__.py"}]} | 1,603 | 848 |
gh_patches_debug_3510 | rasdani/github-patches | git_diff | aio-libs__aiohttp-2657 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Adding routes to subapps is broken after attaching subapp to the parent app
## Long story short
If one adds routes to subapp before `app.add_subapp` accessing them works well, otherwise it results in 404.
## Expected behaviour
It's naturally expected that routing will work well regardless of the order of function calls.
## Actual behaviour
Given `sub_app_cbv.py`:
```python
import sys
import aiohttp.web
API_URL_PREFIX = r'/api/v1'
class PingView(aiohttp.web.View):
async def get(self):
return aiohttp.web.Response(text='pong')
def just_app_with_routes_def(app, api_app):
app.router.add_route('GET', API_URL_PREFIX + r'/ping/', PingView)
def sub_app_after_routes_def(app, api_app):
api_app.router.add_route('GET', r'/ping/', PingView)
app.add_subapp(API_URL_PREFIX, api_app)
def sub_app_before_routes_def(app, api_app):
app.add_subapp(API_URL_PREFIX, api_app)
api_app.router.add_route('GET', r'/ping/', PingView)
def run_app(coro_name):
print(f'Going to run {coro_name}')
setup_app = globals()[coro_name]
app = aiohttp.web.Application()
api_app = aiohttp.web.Application()
setup_app(app, api_app)
aiohttp.web.run_app(app)
def main():
if len(sys.argv) < 2:
raise RuntimeError('Supply one cli argument plz [just_app_with_routes_def|sub_app_after_routes_def|sub_app_before_routes_def]')
coro_name = sys.argv[1]
run_app(coro_name)
__name__ == '__main__' and main()
```
##### Demo 1
```shell
$ python sub_app_cbv.py just_app_with_routes_def
Going to run just_app_with_routes_def
======== Running on http://0.0.0.0:8080 ========
(Press CTRL+C to quit)
```
```shell
$ http :8080/api/v1/ping/
HTTP/1.1 200 OK
Content-Length: 4
Content-Type: text/plain; charset=utf-8
Date: Thu, 11 Jan 2018 23:17:52 GMT
Server: Python/3.6 aiohttp/2.3.7
pong
```
##### Demo 2
```shell
$ python sub_app_cbv.py sub_app_after_routes_def
Going to run sub_app_after_routes_def
======== Running on http://0.0.0.0:8080 ========
(Press CTRL+C to quit)
```
```shell
$ http :8080/api/v1/ping/
HTTP/1.1 200 OK
Content-Length: 4
Content-Type: text/plain; charset=utf-8
Date: Thu, 11 Jan 2018 23:20:58 GMT
Server: Python/3.6 aiohttp/2.3.7
pong
```
##### Demo 3
```shell
$ python sub_app_cbv.py sub_app_before_routes_def
Going to run sub_app_before_routes_def
======== Running on http://0.0.0.0:8080 ========
(Press CTRL+C to quit)
```
```shell
$ http :8080/api/v1/ping/
HTTP/1.1 404 Not Found
Content-Length: 14
Content-Type: text/plain; charset=utf-8
Date: Thu, 11 Jan 2018 23:21:41 GMT
Server: Python/3.6 aiohttp/2.3.7
404: Not Found
```
## Steps to reproduce
See above.
## Your environment
Gentoo laptop, pyenv, 3.6. Not really env-related.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `aiohttp/web_app.py`
Content:
```
1 import asyncio
2 import warnings
3 from collections import MutableMapping
4 from functools import partial
5
6 from . import hdrs
7 from .abc import AbstractAccessLogger, AbstractMatchInfo, AbstractRouter
8 from .frozenlist import FrozenList
9 from .helpers import AccessLogger
10 from .log import web_logger
11 from .signals import Signal
12 from .web_middlewares import _fix_request_current_app
13 from .web_request import Request
14 from .web_response import StreamResponse
15 from .web_server import Server
16 from .web_urldispatcher import PrefixedSubAppResource, UrlDispatcher
17
18
19 class Application(MutableMapping):
20 def __init__(self, *,
21 logger=web_logger,
22 router=None,
23 middlewares=(),
24 handler_args=None,
25 client_max_size=1024**2,
26 loop=None,
27 debug=...):
28 if router is None:
29 router = UrlDispatcher()
30 assert isinstance(router, AbstractRouter), router
31
32 if loop is not None:
33 warnings.warn("loop argument is deprecated", DeprecationWarning,
34 stacklevel=2)
35
36 self._debug = debug
37 self._router = router
38 self._loop = loop
39 self._handler_args = handler_args
40 self.logger = logger
41
42 self._middlewares = FrozenList(middlewares)
43 self._state = {}
44 self._frozen = False
45 self._subapps = []
46
47 self._on_response_prepare = Signal(self)
48 self._on_startup = Signal(self)
49 self._on_shutdown = Signal(self)
50 self._on_cleanup = Signal(self)
51 self._client_max_size = client_max_size
52
53 # MutableMapping API
54
55 def __eq__(self, other):
56 return self is other
57
58 def __getitem__(self, key):
59 return self._state[key]
60
61 def _check_frozen(self):
62 if self._frozen:
63 warnings.warn("Changing state of started or joined "
64 "application is deprecated",
65 DeprecationWarning,
66 stacklevel=3)
67
68 def __setitem__(self, key, value):
69 self._check_frozen()
70 self._state[key] = value
71
72 def __delitem__(self, key):
73 self._check_frozen()
74 del self._state[key]
75
76 def __len__(self):
77 return len(self._state)
78
79 def __iter__(self):
80 return iter(self._state)
81
82 ########
83 @property
84 def loop(self):
85 return self._loop
86
87 def _set_loop(self, loop):
88 if loop is None:
89 loop = asyncio.get_event_loop()
90 if self._loop is not None and self._loop is not loop:
91 raise RuntimeError(
92 "web.Application instance initialized with different loop")
93
94 self._loop = loop
95
96 # set loop debug
97 if self._debug is ...:
98 self._debug = loop.get_debug()
99
100 # set loop to sub applications
101 for subapp in self._subapps:
102 subapp._set_loop(loop)
103
104 @property
105 def frozen(self):
106 return self._frozen
107
108 def freeze(self):
109 if self._frozen:
110 return
111
112 self._frozen = True
113 self._middlewares.freeze()
114 self._router.freeze()
115 self._on_response_prepare.freeze()
116 self._on_startup.freeze()
117 self._on_shutdown.freeze()
118 self._on_cleanup.freeze()
119 self._middlewares_handlers = tuple(self._prepare_middleware())
120
121 # If current app and any subapp do not have middlewares avoid run all
122 # of the code footprint that it implies, which have a middleware
123 # hardcoded per app that sets up the current_app attribute. If no
124 # middlewares are configured the handler will receive the proper
125 # current_app without needing all of this code.
126 self._run_middlewares = True if self.middlewares else False
127
128 for subapp in self._subapps:
129 subapp.freeze()
130 self._run_middlewares =\
131 self._run_middlewares or subapp._run_middlewares
132
133 @property
134 def debug(self):
135 return self._debug
136
137 def _reg_subapp_signals(self, subapp):
138
139 def reg_handler(signame):
140 subsig = getattr(subapp, signame)
141
142 async def handler(app):
143 await subsig.send(subapp)
144 appsig = getattr(self, signame)
145 appsig.append(handler)
146
147 reg_handler('on_startup')
148 reg_handler('on_shutdown')
149 reg_handler('on_cleanup')
150
151 def add_subapp(self, prefix, subapp):
152 if self.frozen:
153 raise RuntimeError(
154 "Cannot add sub application to frozen application")
155 if subapp.frozen:
156 raise RuntimeError("Cannot add frozen application")
157 if prefix.endswith('/'):
158 prefix = prefix[:-1]
159 if prefix in ('', '/'):
160 raise ValueError("Prefix cannot be empty")
161
162 resource = PrefixedSubAppResource(prefix, subapp)
163 self.router.register_resource(resource)
164 self._reg_subapp_signals(subapp)
165 self._subapps.append(subapp)
166 if self._loop is not None:
167 subapp._set_loop(self._loop)
168 return resource
169
170 @property
171 def on_response_prepare(self):
172 return self._on_response_prepare
173
174 @property
175 def on_startup(self):
176 return self._on_startup
177
178 @property
179 def on_shutdown(self):
180 return self._on_shutdown
181
182 @property
183 def on_cleanup(self):
184 return self._on_cleanup
185
186 @property
187 def router(self):
188 return self._router
189
190 @property
191 def middlewares(self):
192 return self._middlewares
193
194 def make_handler(self, *,
195 loop=None,
196 access_log_class=AccessLogger,
197 **kwargs):
198
199 if not issubclass(access_log_class, AbstractAccessLogger):
200 raise TypeError(
201 'access_log_class must be subclass of '
202 'aiohttp.abc.AbstractAccessLogger, got {}'.format(
203 access_log_class))
204
205 self._set_loop(loop)
206 self.freeze()
207
208 kwargs['debug'] = self.debug
209 if self._handler_args:
210 for k, v in self._handler_args.items():
211 kwargs[k] = v
212
213 return Server(self._handle, request_factory=self._make_request,
214 access_log_class=access_log_class,
215 loop=self.loop, **kwargs)
216
217 async def startup(self):
218 """Causes on_startup signal
219
220 Should be called in the event loop along with the request handler.
221 """
222 await self.on_startup.send(self)
223
224 async def shutdown(self):
225 """Causes on_shutdown signal
226
227 Should be called before cleanup()
228 """
229 await self.on_shutdown.send(self)
230
231 async def cleanup(self):
232 """Causes on_cleanup signal
233
234 Should be called after shutdown()
235 """
236 await self.on_cleanup.send(self)
237
238 def _make_request(self, message, payload, protocol, writer, task,
239 _cls=Request):
240 return _cls(
241 message, payload, protocol, writer, task,
242 self._loop,
243 client_max_size=self._client_max_size)
244
245 def _prepare_middleware(self):
246 for m in reversed(self._middlewares):
247 if getattr(m, '__middleware_version__', None) == 1:
248 yield m, True
249 else:
250 warnings.warn('old-style middleware "{!r}" deprecated, '
251 'see #2252'.format(m),
252 DeprecationWarning, stacklevel=2)
253 yield m, False
254
255 yield _fix_request_current_app(self), True
256
257 async def _handle(self, request):
258 match_info = await self._router.resolve(request)
259 assert isinstance(match_info, AbstractMatchInfo), match_info
260 match_info.add_app(self)
261
262 if __debug__:
263 match_info.freeze()
264
265 resp = None
266 request._match_info = match_info
267 expect = request.headers.get(hdrs.EXPECT)
268 if expect:
269 resp = await match_info.expect_handler(request)
270 await request.writer.drain()
271
272 if resp is None:
273 handler = match_info.handler
274
275 if self._run_middlewares:
276 for app in match_info.apps[::-1]:
277 for m, new_style in app._middlewares_handlers:
278 if new_style:
279 handler = partial(m, handler=handler)
280 else:
281 handler = await m(app, handler)
282
283 resp = await handler(request)
284
285 assert isinstance(resp, StreamResponse), \
286 ("Handler {!r} should return response instance, "
287 "got {!r} [middlewares {!r}]").format(
288 match_info.handler, type(resp),
289 [middleware
290 for app in match_info.apps
291 for middleware in app.middlewares])
292 return resp
293
294 def __call__(self):
295 """gunicorn compatibility"""
296 return self
297
298 def __repr__(self):
299 return "<Application 0x{:x}>".format(id(self))
300
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/aiohttp/web_app.py b/aiohttp/web_app.py
--- a/aiohttp/web_app.py
+++ b/aiohttp/web_app.py
@@ -163,6 +163,7 @@
self.router.register_resource(resource)
self._reg_subapp_signals(subapp)
self._subapps.append(subapp)
+ subapp.freeze()
if self._loop is not None:
subapp._set_loop(self._loop)
return resource
| {"golden_diff": "diff --git a/aiohttp/web_app.py b/aiohttp/web_app.py\n--- a/aiohttp/web_app.py\n+++ b/aiohttp/web_app.py\n@@ -163,6 +163,7 @@\n self.router.register_resource(resource)\n self._reg_subapp_signals(subapp)\n self._subapps.append(subapp)\n+ subapp.freeze()\n if self._loop is not None:\n subapp._set_loop(self._loop)\n return resource\n", "issue": "Adding routes to subapps is broken after attaching subapp to the parent app\n## Long story short\r\n\r\nIf one adds routes to subapp before `app.add_subapp` accessing them works well, otherwise it results in 404.\r\n\r\n## Expected behaviour\r\n\r\nIt's naturally expected that routing will work well regardless of the order of function calls.\r\n\r\n## Actual behaviour\r\n\r\nGiven `sub_app_cbv.py`:\r\n```python\r\nimport sys\r\nimport aiohttp.web\r\n\r\nAPI_URL_PREFIX = r'/api/v1'\r\n\r\nclass PingView(aiohttp.web.View):\r\n async def get(self):\r\n return aiohttp.web.Response(text='pong')\r\n\r\ndef just_app_with_routes_def(app, api_app):\r\n app.router.add_route('GET', API_URL_PREFIX + r'/ping/', PingView)\r\n\r\ndef sub_app_after_routes_def(app, api_app):\r\n api_app.router.add_route('GET', r'/ping/', PingView)\r\n app.add_subapp(API_URL_PREFIX, api_app)\r\n\r\ndef sub_app_before_routes_def(app, api_app):\r\n app.add_subapp(API_URL_PREFIX, api_app)\r\n api_app.router.add_route('GET', r'/ping/', PingView)\r\n\r\ndef run_app(coro_name):\r\n print(f'Going to run {coro_name}')\r\n setup_app = globals()[coro_name]\r\n app = aiohttp.web.Application()\r\n api_app = aiohttp.web.Application()\r\n setup_app(app, api_app)\r\n aiohttp.web.run_app(app)\r\n\r\ndef main():\r\n if len(sys.argv) < 2:\r\n raise RuntimeError('Supply one cli argument plz [just_app_with_routes_def|sub_app_after_routes_def|sub_app_before_routes_def]')\r\n coro_name = sys.argv[1]\r\n run_app(coro_name)\r\n\r\n__name__ == '__main__' and main()\r\n```\r\n\r\n##### Demo 1\r\n\r\n```shell\r\n$ python sub_app_cbv.py just_app_with_routes_def\r\nGoing to run just_app_with_routes_def\r\n======== Running on http://0.0.0.0:8080 ========\r\n(Press CTRL+C to quit)\r\n```\r\n\r\n```shell\r\n$ http :8080/api/v1/ping/\r\nHTTP/1.1 200 OK\r\nContent-Length: 4\r\nContent-Type: text/plain; charset=utf-8\r\nDate: Thu, 11 Jan 2018 23:17:52 GMT\r\nServer: Python/3.6 aiohttp/2.3.7\r\npong\r\n```\r\n\r\n##### Demo 2\r\n\r\n```shell\r\n$ python sub_app_cbv.py sub_app_after_routes_def\r\nGoing to run sub_app_after_routes_def\r\n======== Running on http://0.0.0.0:8080 ========\r\n(Press CTRL+C to quit)\r\n```\r\n\r\n```shell\r\n$ http :8080/api/v1/ping/\r\nHTTP/1.1 200 OK\r\nContent-Length: 4\r\nContent-Type: text/plain; charset=utf-8\r\nDate: Thu, 11 Jan 2018 23:20:58 GMT\r\nServer: Python/3.6 aiohttp/2.3.7\r\npong\r\n```\r\n\r\n##### Demo 3\r\n\r\n```shell\r\n$ python sub_app_cbv.py sub_app_before_routes_def\r\nGoing to run sub_app_before_routes_def\r\n======== Running on http://0.0.0.0:8080 ========\r\n(Press CTRL+C to quit)\r\n```\r\n\r\n```shell\r\n$ http :8080/api/v1/ping/\r\nHTTP/1.1 404 Not Found\r\nContent-Length: 14\r\nContent-Type: text/plain; charset=utf-8\r\nDate: Thu, 11 Jan 2018 23:21:41 GMT\r\nServer: Python/3.6 aiohttp/2.3.7\r\n404: Not Found\r\n```\r\n\r\n## Steps to reproduce\r\n\r\nSee above.\r\n\r\n## Your environment\r\n\r\nGentoo laptop, pyenv, 3.6. Not really env-related.\r\n\n", "before_files": [{"content": "import asyncio\nimport warnings\nfrom collections import MutableMapping\nfrom functools import partial\n\nfrom . import hdrs\nfrom .abc import AbstractAccessLogger, AbstractMatchInfo, AbstractRouter\nfrom .frozenlist import FrozenList\nfrom .helpers import AccessLogger\nfrom .log import web_logger\nfrom .signals import Signal\nfrom .web_middlewares import _fix_request_current_app\nfrom .web_request import Request\nfrom .web_response import StreamResponse\nfrom .web_server import Server\nfrom .web_urldispatcher import PrefixedSubAppResource, UrlDispatcher\n\n\nclass Application(MutableMapping):\n def __init__(self, *,\n logger=web_logger,\n router=None,\n middlewares=(),\n handler_args=None,\n client_max_size=1024**2,\n loop=None,\n debug=...):\n if router is None:\n router = UrlDispatcher()\n assert isinstance(router, AbstractRouter), router\n\n if loop is not None:\n warnings.warn(\"loop argument is deprecated\", DeprecationWarning,\n stacklevel=2)\n\n self._debug = debug\n self._router = router\n self._loop = loop\n self._handler_args = handler_args\n self.logger = logger\n\n self._middlewares = FrozenList(middlewares)\n self._state = {}\n self._frozen = False\n self._subapps = []\n\n self._on_response_prepare = Signal(self)\n self._on_startup = Signal(self)\n self._on_shutdown = Signal(self)\n self._on_cleanup = Signal(self)\n self._client_max_size = client_max_size\n\n # MutableMapping API\n\n def __eq__(self, other):\n return self is other\n\n def __getitem__(self, key):\n return self._state[key]\n\n def _check_frozen(self):\n if self._frozen:\n warnings.warn(\"Changing state of started or joined \"\n \"application is deprecated\",\n DeprecationWarning,\n stacklevel=3)\n\n def __setitem__(self, key, value):\n self._check_frozen()\n self._state[key] = value\n\n def __delitem__(self, key):\n self._check_frozen()\n del self._state[key]\n\n def __len__(self):\n return len(self._state)\n\n def __iter__(self):\n return iter(self._state)\n\n ########\n @property\n def loop(self):\n return self._loop\n\n def _set_loop(self, loop):\n if loop is None:\n loop = asyncio.get_event_loop()\n if self._loop is not None and self._loop is not loop:\n raise RuntimeError(\n \"web.Application instance initialized with different loop\")\n\n self._loop = loop\n\n # set loop debug\n if self._debug is ...:\n self._debug = loop.get_debug()\n\n # set loop to sub applications\n for subapp in self._subapps:\n subapp._set_loop(loop)\n\n @property\n def frozen(self):\n return self._frozen\n\n def freeze(self):\n if self._frozen:\n return\n\n self._frozen = True\n self._middlewares.freeze()\n self._router.freeze()\n self._on_response_prepare.freeze()\n self._on_startup.freeze()\n self._on_shutdown.freeze()\n self._on_cleanup.freeze()\n self._middlewares_handlers = tuple(self._prepare_middleware())\n\n # If current app and any subapp do not have middlewares avoid run all\n # of the code footprint that it implies, which have a middleware\n # hardcoded per app that sets up the current_app attribute. If no\n # middlewares are configured the handler will receive the proper\n # current_app without needing all of this code.\n self._run_middlewares = True if self.middlewares else False\n\n for subapp in self._subapps:\n subapp.freeze()\n self._run_middlewares =\\\n self._run_middlewares or subapp._run_middlewares\n\n @property\n def debug(self):\n return self._debug\n\n def _reg_subapp_signals(self, subapp):\n\n def reg_handler(signame):\n subsig = getattr(subapp, signame)\n\n async def handler(app):\n await subsig.send(subapp)\n appsig = getattr(self, signame)\n appsig.append(handler)\n\n reg_handler('on_startup')\n reg_handler('on_shutdown')\n reg_handler('on_cleanup')\n\n def add_subapp(self, prefix, subapp):\n if self.frozen:\n raise RuntimeError(\n \"Cannot add sub application to frozen application\")\n if subapp.frozen:\n raise RuntimeError(\"Cannot add frozen application\")\n if prefix.endswith('/'):\n prefix = prefix[:-1]\n if prefix in ('', '/'):\n raise ValueError(\"Prefix cannot be empty\")\n\n resource = PrefixedSubAppResource(prefix, subapp)\n self.router.register_resource(resource)\n self._reg_subapp_signals(subapp)\n self._subapps.append(subapp)\n if self._loop is not None:\n subapp._set_loop(self._loop)\n return resource\n\n @property\n def on_response_prepare(self):\n return self._on_response_prepare\n\n @property\n def on_startup(self):\n return self._on_startup\n\n @property\n def on_shutdown(self):\n return self._on_shutdown\n\n @property\n def on_cleanup(self):\n return self._on_cleanup\n\n @property\n def router(self):\n return self._router\n\n @property\n def middlewares(self):\n return self._middlewares\n\n def make_handler(self, *,\n loop=None,\n access_log_class=AccessLogger,\n **kwargs):\n\n if not issubclass(access_log_class, AbstractAccessLogger):\n raise TypeError(\n 'access_log_class must be subclass of '\n 'aiohttp.abc.AbstractAccessLogger, got {}'.format(\n access_log_class))\n\n self._set_loop(loop)\n self.freeze()\n\n kwargs['debug'] = self.debug\n if self._handler_args:\n for k, v in self._handler_args.items():\n kwargs[k] = v\n\n return Server(self._handle, request_factory=self._make_request,\n access_log_class=access_log_class,\n loop=self.loop, **kwargs)\n\n async def startup(self):\n \"\"\"Causes on_startup signal\n\n Should be called in the event loop along with the request handler.\n \"\"\"\n await self.on_startup.send(self)\n\n async def shutdown(self):\n \"\"\"Causes on_shutdown signal\n\n Should be called before cleanup()\n \"\"\"\n await self.on_shutdown.send(self)\n\n async def cleanup(self):\n \"\"\"Causes on_cleanup signal\n\n Should be called after shutdown()\n \"\"\"\n await self.on_cleanup.send(self)\n\n def _make_request(self, message, payload, protocol, writer, task,\n _cls=Request):\n return _cls(\n message, payload, protocol, writer, task,\n self._loop,\n client_max_size=self._client_max_size)\n\n def _prepare_middleware(self):\n for m in reversed(self._middlewares):\n if getattr(m, '__middleware_version__', None) == 1:\n yield m, True\n else:\n warnings.warn('old-style middleware \"{!r}\" deprecated, '\n 'see #2252'.format(m),\n DeprecationWarning, stacklevel=2)\n yield m, False\n\n yield _fix_request_current_app(self), True\n\n async def _handle(self, request):\n match_info = await self._router.resolve(request)\n assert isinstance(match_info, AbstractMatchInfo), match_info\n match_info.add_app(self)\n\n if __debug__:\n match_info.freeze()\n\n resp = None\n request._match_info = match_info\n expect = request.headers.get(hdrs.EXPECT)\n if expect:\n resp = await match_info.expect_handler(request)\n await request.writer.drain()\n\n if resp is None:\n handler = match_info.handler\n\n if self._run_middlewares:\n for app in match_info.apps[::-1]:\n for m, new_style in app._middlewares_handlers:\n if new_style:\n handler = partial(m, handler=handler)\n else:\n handler = await m(app, handler)\n\n resp = await handler(request)\n\n assert isinstance(resp, StreamResponse), \\\n (\"Handler {!r} should return response instance, \"\n \"got {!r} [middlewares {!r}]\").format(\n match_info.handler, type(resp),\n [middleware\n for app in match_info.apps\n for middleware in app.middlewares])\n return resp\n\n def __call__(self):\n \"\"\"gunicorn compatibility\"\"\"\n return self\n\n def __repr__(self):\n return \"<Application 0x{:x}>\".format(id(self))\n", "path": "aiohttp/web_app.py"}], "after_files": [{"content": "import asyncio\nimport warnings\nfrom collections import MutableMapping\nfrom functools import partial\n\nfrom . import hdrs\nfrom .abc import AbstractAccessLogger, AbstractMatchInfo, AbstractRouter\nfrom .frozenlist import FrozenList\nfrom .helpers import AccessLogger\nfrom .log import web_logger\nfrom .signals import Signal\nfrom .web_middlewares import _fix_request_current_app\nfrom .web_request import Request\nfrom .web_response import StreamResponse\nfrom .web_server import Server\nfrom .web_urldispatcher import PrefixedSubAppResource, UrlDispatcher\n\n\nclass Application(MutableMapping):\n def __init__(self, *,\n logger=web_logger,\n router=None,\n middlewares=(),\n handler_args=None,\n client_max_size=1024**2,\n loop=None,\n debug=...):\n if router is None:\n router = UrlDispatcher()\n assert isinstance(router, AbstractRouter), router\n\n if loop is not None:\n warnings.warn(\"loop argument is deprecated\", DeprecationWarning,\n stacklevel=2)\n\n self._debug = debug\n self._router = router\n self._loop = loop\n self._handler_args = handler_args\n self.logger = logger\n\n self._middlewares = FrozenList(middlewares)\n self._state = {}\n self._frozen = False\n self._subapps = []\n\n self._on_response_prepare = Signal(self)\n self._on_startup = Signal(self)\n self._on_shutdown = Signal(self)\n self._on_cleanup = Signal(self)\n self._client_max_size = client_max_size\n\n # MutableMapping API\n\n def __eq__(self, other):\n return self is other\n\n def __getitem__(self, key):\n return self._state[key]\n\n def _check_frozen(self):\n if self._frozen:\n warnings.warn(\"Changing state of started or joined \"\n \"application is deprecated\",\n DeprecationWarning,\n stacklevel=3)\n\n def __setitem__(self, key, value):\n self._check_frozen()\n self._state[key] = value\n\n def __delitem__(self, key):\n self._check_frozen()\n del self._state[key]\n\n def __len__(self):\n return len(self._state)\n\n def __iter__(self):\n return iter(self._state)\n\n ########\n @property\n def loop(self):\n return self._loop\n\n def _set_loop(self, loop):\n if loop is None:\n loop = asyncio.get_event_loop()\n if self._loop is not None and self._loop is not loop:\n raise RuntimeError(\n \"web.Application instance initialized with different loop\")\n\n self._loop = loop\n\n # set loop debug\n if self._debug is ...:\n self._debug = loop.get_debug()\n\n # set loop to sub applications\n for subapp in self._subapps:\n subapp._set_loop(loop)\n\n @property\n def frozen(self):\n return self._frozen\n\n def freeze(self):\n if self._frozen:\n return\n\n self._frozen = True\n self._middlewares.freeze()\n self._router.freeze()\n self._on_response_prepare.freeze()\n self._on_startup.freeze()\n self._on_shutdown.freeze()\n self._on_cleanup.freeze()\n self._middlewares_handlers = tuple(self._prepare_middleware())\n\n # If current app and any subapp do not have middlewares avoid run all\n # of the code footprint that it implies, which have a middleware\n # hardcoded per app that sets up the current_app attribute. If no\n # middlewares are configured the handler will receive the proper\n # current_app without needing all of this code.\n self._run_middlewares = True if self.middlewares else False\n\n for subapp in self._subapps:\n subapp.freeze()\n self._run_middlewares =\\\n self._run_middlewares or subapp._run_middlewares\n\n @property\n def debug(self):\n return self._debug\n\n def _reg_subapp_signals(self, subapp):\n\n def reg_handler(signame):\n subsig = getattr(subapp, signame)\n\n async def handler(app):\n await subsig.send(subapp)\n appsig = getattr(self, signame)\n appsig.append(handler)\n\n reg_handler('on_startup')\n reg_handler('on_shutdown')\n reg_handler('on_cleanup')\n\n def add_subapp(self, prefix, subapp):\n if self.frozen:\n raise RuntimeError(\n \"Cannot add sub application to frozen application\")\n if subapp.frozen:\n raise RuntimeError(\"Cannot add frozen application\")\n if prefix.endswith('/'):\n prefix = prefix[:-1]\n if prefix in ('', '/'):\n raise ValueError(\"Prefix cannot be empty\")\n\n resource = PrefixedSubAppResource(prefix, subapp)\n self.router.register_resource(resource)\n self._reg_subapp_signals(subapp)\n self._subapps.append(subapp)\n subapp.freeze()\n if self._loop is not None:\n subapp._set_loop(self._loop)\n return resource\n\n @property\n def on_response_prepare(self):\n return self._on_response_prepare\n\n @property\n def on_startup(self):\n return self._on_startup\n\n @property\n def on_shutdown(self):\n return self._on_shutdown\n\n @property\n def on_cleanup(self):\n return self._on_cleanup\n\n @property\n def router(self):\n return self._router\n\n @property\n def middlewares(self):\n return self._middlewares\n\n def make_handler(self, *,\n loop=None,\n access_log_class=AccessLogger,\n **kwargs):\n\n if not issubclass(access_log_class, AbstractAccessLogger):\n raise TypeError(\n 'access_log_class must be subclass of '\n 'aiohttp.abc.AbstractAccessLogger, got {}'.format(\n access_log_class))\n\n self._set_loop(loop)\n self.freeze()\n\n kwargs['debug'] = self.debug\n if self._handler_args:\n for k, v in self._handler_args.items():\n kwargs[k] = v\n\n return Server(self._handle, request_factory=self._make_request,\n access_log_class=access_log_class,\n loop=self.loop, **kwargs)\n\n async def startup(self):\n \"\"\"Causes on_startup signal\n\n Should be called in the event loop along with the request handler.\n \"\"\"\n await self.on_startup.send(self)\n\n async def shutdown(self):\n \"\"\"Causes on_shutdown signal\n\n Should be called before cleanup()\n \"\"\"\n await self.on_shutdown.send(self)\n\n async def cleanup(self):\n \"\"\"Causes on_cleanup signal\n\n Should be called after shutdown()\n \"\"\"\n await self.on_cleanup.send(self)\n\n def _make_request(self, message, payload, protocol, writer, task,\n _cls=Request):\n return _cls(\n message, payload, protocol, writer, task,\n self._loop,\n client_max_size=self._client_max_size)\n\n def _prepare_middleware(self):\n for m in reversed(self._middlewares):\n if getattr(m, '__middleware_version__', None) == 1:\n yield m, True\n else:\n warnings.warn('old-style middleware \"{!r}\" deprecated, '\n 'see #2252'.format(m),\n DeprecationWarning, stacklevel=2)\n yield m, False\n\n yield _fix_request_current_app(self), True\n\n async def _handle(self, request):\n match_info = await self._router.resolve(request)\n assert isinstance(match_info, AbstractMatchInfo), match_info\n match_info.add_app(self)\n\n if __debug__:\n match_info.freeze()\n\n resp = None\n request._match_info = match_info\n expect = request.headers.get(hdrs.EXPECT)\n if expect:\n resp = await match_info.expect_handler(request)\n await request.writer.drain()\n\n if resp is None:\n handler = match_info.handler\n\n if self._run_middlewares:\n for app in match_info.apps[::-1]:\n for m, new_style in app._middlewares_handlers:\n if new_style:\n handler = partial(m, handler=handler)\n else:\n handler = await m(app, handler)\n\n resp = await handler(request)\n\n assert isinstance(resp, StreamResponse), \\\n (\"Handler {!r} should return response instance, \"\n \"got {!r} [middlewares {!r}]\").format(\n match_info.handler, type(resp),\n [middleware\n for app in match_info.apps\n for middleware in app.middlewares])\n return resp\n\n def __call__(self):\n \"\"\"gunicorn compatibility\"\"\"\n return self\n\n def __repr__(self):\n return \"<Application 0x{:x}>\".format(id(self))\n", "path": "aiohttp/web_app.py"}]} | 3,824 | 105 |
gh_patches_debug_41388 | rasdani/github-patches | git_diff | deepset-ai__haystack-7988 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add `max_retries` and `timeout` params to all `AzureOpenAI` classes
**Is your feature request related to a problem? Please describe.**
Currently all `OpenAI` related classes (e.g. `OpenAIDocumentEmbedder`, `OpenAIChatGenerator`) can be initialised by setting `max_retries` and `timeout` params.
The corresponding `AzureOpenAI` don't always have the same params.
**Describe the solution you'd like**
It would be nice to have these params in the `AzureOpenAI` classes
**Describe alternatives you've considered**
Subclass `AzureOpenAI` and create custom components.
**Additional context**
cc @anakin87 :)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `haystack/components/generators/chat/azure.py`
Content:
```
1 # SPDX-FileCopyrightText: 2022-present deepset GmbH <[email protected]>
2 #
3 # SPDX-License-Identifier: Apache-2.0
4
5 import os
6 from typing import Any, Callable, Dict, Optional
7
8 # pylint: disable=import-error
9 from openai.lib.azure import AzureOpenAI
10
11 from haystack import component, default_from_dict, default_to_dict, logging
12 from haystack.components.generators.chat import OpenAIChatGenerator
13 from haystack.dataclasses import StreamingChunk
14 from haystack.utils import Secret, deserialize_callable, deserialize_secrets_inplace, serialize_callable
15
16 logger = logging.getLogger(__name__)
17
18
19 @component
20 class AzureOpenAIChatGenerator(OpenAIChatGenerator):
21 """
22 A Chat Generator component that uses the Azure OpenAI API to generate text.
23
24 Enables text generation using OpenAI's large language models (LLMs) on Azure. It supports `gpt-4` and
25 `gpt-3.5-turbo` family of models accessed through the chat completions API endpoint.
26
27 Users can pass any text generation parameters valid for the `openai.ChatCompletion.create` method
28 directly to this component via the `generation_kwargs` parameter in `__init__` or the `generation_kwargs`
29 parameter in `run` method.
30
31 For more details on OpenAI models deployed on Azure, refer to the Microsoft
32 [documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/).
33
34 Key Features and Compatibility:
35 - Primary Compatibility: Designed to work seamlessly with the OpenAI API Chat Completion endpoint.
36 - Streaming Support: Supports streaming responses from the OpenAI API Chat Completion endpoint.
37 - Customizability: Supports all parameters supported by the OpenAI API Chat Completion endpoint.
38
39 Input and Output Format:
40 - ChatMessage Format: This component uses the ChatMessage format for structuring both input and output, ensuring
41 coherent and contextually relevant responses in chat-based text generation scenarios.
42 - Details on the ChatMessage format can be found [here](https://docs.haystack.deepset.ai/v2.0/docs/data-classes#chatmessage).
43
44
45 Usage example:
46
47 ```python
48 from haystack.components.generators.chat import AzureOpenAIGenerator
49 from haystack.dataclasses import ChatMessage
50 from haystack.utils import Secret
51
52 messages = [ChatMessage.from_user("What's Natural Language Processing?")]
53
54 client = AzureOpenAIGenerator(
55 azure_endpoint="<Your Azure endpoint e.g. `https://your-company.azure.openai.com/>",
56 api_key=Secret.from_token("<your-api-key>"),
57 azure_deployment="<this a model name, e.g. gpt-35-turbo>")
58 response = client.run(messages)
59 print(response)
60 ```
61
62 ```
63 {'replies':
64 [ChatMessage(content='Natural Language Processing (NLP) is a branch of artificial intelligence that focuses on
65 enabling computers to understand, interpret, and generate human language in a way that is useful.',
66 role=<ChatRole.ASSISTANT: 'assistant'>, name=None,
67 meta={'model': 'gpt-3.5-turbo-0613', 'index': 0, 'finish_reason': 'stop',
68 'usage': {'prompt_tokens': 15, 'completion_tokens': 36, 'total_tokens': 51}})]
69 }
70 ```
71 """
72
73 # pylint: disable=super-init-not-called
74 def __init__(
75 self,
76 azure_endpoint: Optional[str] = None,
77 api_version: Optional[str] = "2023-05-15",
78 azure_deployment: Optional[str] = "gpt-35-turbo",
79 api_key: Optional[Secret] = Secret.from_env_var("AZURE_OPENAI_API_KEY", strict=False),
80 azure_ad_token: Optional[Secret] = Secret.from_env_var("AZURE_OPENAI_AD_TOKEN", strict=False),
81 organization: Optional[str] = None,
82 streaming_callback: Optional[Callable[[StreamingChunk], None]] = None,
83 timeout: Optional[float] = None,
84 generation_kwargs: Optional[Dict[str, Any]] = None,
85 ):
86 """
87 Initialize the Azure OpenAI Chat Generator component.
88
89 :param azure_endpoint: The endpoint of the deployed model, e.g. `"https://example-resource.azure.openai.com/"`
90 :param api_version: The version of the API to use. Defaults to 2023-05-15
91 :param azure_deployment: The deployment of the model, usually the model name.
92 :param api_key: The API key to use for authentication.
93 :param azure_ad_token: [Azure Active Directory token](https://www.microsoft.com/en-us/security/business/identity-access/microsoft-entra-id)
94 :param organization: The Organization ID, defaults to `None`. See
95 [production best practices](https://platform.openai.com/docs/guides/production-best-practices/setting-up-your-organization).
96 :param streaming_callback: A callback function that is called when a new token is received from the stream.
97 The callback function accepts StreamingChunk as an argument.
98 :param generation_kwargs: Other parameters to use for the model. These parameters are all sent directly to
99 the OpenAI endpoint. See OpenAI [documentation](https://platform.openai.com/docs/api-reference/chat) for
100 more details.
101 Some of the supported parameters:
102 - `max_tokens`: The maximum number of tokens the output text can have.
103 - `temperature`: What sampling temperature to use. Higher values mean the model will take more risks.
104 Try 0.9 for more creative applications and 0 (argmax sampling) for ones with a well-defined answer.
105 - `top_p`: An alternative to sampling with temperature, called nucleus sampling, where the model
106 considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens
107 comprising the top 10% probability mass are considered.
108 - `n`: How many completions to generate for each prompt. For example, if the LLM gets 3 prompts and n is 2,
109 it will generate two completions for each of the three prompts, ending up with 6 completions in total.
110 - `stop`: One or more sequences after which the LLM should stop generating tokens.
111 - `presence_penalty`: What penalty to apply if a token is already present at all. Bigger values mean
112 the model will be less likely to repeat the same token in the text.
113 - `frequency_penalty`: What penalty to apply if a token has already been generated in the text.
114 Bigger values mean the model will be less likely to repeat the same token in the text.
115 - `logit_bias`: Add a logit bias to specific tokens. The keys of the dictionary are tokens, and the
116 values are the bias to add to that token.
117 """
118 # We intentionally do not call super().__init__ here because we only need to instantiate the client to interact
119 # with the API.
120
121 # Why is this here?
122 # AzureOpenAI init is forcing us to use an init method that takes either base_url or azure_endpoint as not
123 # None init parameters. This way we accommodate the use case where env var AZURE_OPENAI_ENDPOINT is set instead
124 # of passing it as a parameter.
125 azure_endpoint = azure_endpoint or os.environ.get("AZURE_OPENAI_ENDPOINT")
126 if not azure_endpoint:
127 raise ValueError("Please provide an Azure endpoint or set the environment variable AZURE_OPENAI_ENDPOINT.")
128
129 if api_key is None and azure_ad_token is None:
130 raise ValueError("Please provide an API key or an Azure Active Directory token.")
131
132 # The check above makes mypy incorrectly infer that api_key is never None,
133 # which propagates the incorrect type.
134 self.api_key = api_key # type: ignore
135 self.azure_ad_token = azure_ad_token
136 self.generation_kwargs = generation_kwargs or {}
137 self.streaming_callback = streaming_callback
138 self.api_version = api_version
139 self.azure_endpoint = azure_endpoint
140 self.azure_deployment = azure_deployment
141 self.organization = organization
142 self.model = azure_deployment or "gpt-35-turbo"
143 self.timeout = timeout
144
145 self.client = AzureOpenAI(
146 api_version=api_version,
147 azure_endpoint=azure_endpoint,
148 azure_deployment=azure_deployment,
149 api_key=api_key.resolve_value() if api_key is not None else None,
150 azure_ad_token=azure_ad_token.resolve_value() if azure_ad_token is not None else None,
151 organization=organization,
152 )
153
154 def to_dict(self) -> Dict[str, Any]:
155 """
156 Serialize this component to a dictionary.
157
158 :returns:
159 The serialized component as a dictionary.
160 """
161 callback_name = serialize_callable(self.streaming_callback) if self.streaming_callback else None
162 return default_to_dict(
163 self,
164 azure_endpoint=self.azure_endpoint,
165 azure_deployment=self.azure_deployment,
166 organization=self.organization,
167 api_version=self.api_version,
168 streaming_callback=callback_name,
169 generation_kwargs=self.generation_kwargs,
170 timeout=self.timeout,
171 api_key=self.api_key.to_dict() if self.api_key is not None else None,
172 azure_ad_token=self.azure_ad_token.to_dict() if self.azure_ad_token is not None else None,
173 )
174
175 @classmethod
176 def from_dict(cls, data: Dict[str, Any]) -> "AzureOpenAIChatGenerator":
177 """
178 Deserialize this component from a dictionary.
179
180 :param data: The dictionary representation of this component.
181 :returns:
182 The deserialized component instance.
183 """
184 deserialize_secrets_inplace(data["init_parameters"], keys=["api_key", "azure_ad_token"])
185 init_params = data.get("init_parameters", {})
186 serialized_callback_handler = init_params.get("streaming_callback")
187 if serialized_callback_handler:
188 data["init_parameters"]["streaming_callback"] = deserialize_callable(serialized_callback_handler)
189 return default_from_dict(cls, data)
190
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/haystack/components/generators/chat/azure.py b/haystack/components/generators/chat/azure.py
--- a/haystack/components/generators/chat/azure.py
+++ b/haystack/components/generators/chat/azure.py
@@ -81,6 +81,7 @@
organization: Optional[str] = None,
streaming_callback: Optional[Callable[[StreamingChunk], None]] = None,
timeout: Optional[float] = None,
+ max_retries: Optional[int] = None,
generation_kwargs: Optional[Dict[str, Any]] = None,
):
"""
@@ -95,6 +96,10 @@
[production best practices](https://platform.openai.com/docs/guides/production-best-practices/setting-up-your-organization).
:param streaming_callback: A callback function that is called when a new token is received from the stream.
The callback function accepts StreamingChunk as an argument.
+ :param timeout: The timeout in seconds to be passed to the underlying `AzureOpenAI` client, if not set it is
+ inferred from the `OPENAI_TIMEOUT` environment variable or set to 30.
+ :param max_retries: Maximum retries to establish a connection with AzureOpenAI if it returns an internal error,
+ if not set it is inferred from the `OPENAI_MAX_RETRIES` environment variable or set to 5.
:param generation_kwargs: Other parameters to use for the model. These parameters are all sent directly to
the OpenAI endpoint. See OpenAI [documentation](https://platform.openai.com/docs/api-reference/chat) for
more details.
@@ -140,7 +145,8 @@
self.azure_deployment = azure_deployment
self.organization = organization
self.model = azure_deployment or "gpt-35-turbo"
- self.timeout = timeout
+ self.timeout = timeout or float(os.environ.get("OPENAI_TIMEOUT", 30.0))
+ self.max_retries = max_retries or int(os.environ.get("OPENAI_MAX_RETRIES", 5))
self.client = AzureOpenAI(
api_version=api_version,
@@ -149,6 +155,8 @@
api_key=api_key.resolve_value() if api_key is not None else None,
azure_ad_token=azure_ad_token.resolve_value() if azure_ad_token is not None else None,
organization=organization,
+ timeout=self.timeout,
+ max_retries=self.max_retries,
)
def to_dict(self) -> Dict[str, Any]:
@@ -168,6 +176,7 @@
streaming_callback=callback_name,
generation_kwargs=self.generation_kwargs,
timeout=self.timeout,
+ max_retries=self.max_retries,
api_key=self.api_key.to_dict() if self.api_key is not None else None,
azure_ad_token=self.azure_ad_token.to_dict() if self.azure_ad_token is not None else None,
)
| {"golden_diff": "diff --git a/haystack/components/generators/chat/azure.py b/haystack/components/generators/chat/azure.py\n--- a/haystack/components/generators/chat/azure.py\n+++ b/haystack/components/generators/chat/azure.py\n@@ -81,6 +81,7 @@\n organization: Optional[str] = None,\n streaming_callback: Optional[Callable[[StreamingChunk], None]] = None,\n timeout: Optional[float] = None,\n+ max_retries: Optional[int] = None,\n generation_kwargs: Optional[Dict[str, Any]] = None,\n ):\n \"\"\"\n@@ -95,6 +96,10 @@\n [production best practices](https://platform.openai.com/docs/guides/production-best-practices/setting-up-your-organization).\n :param streaming_callback: A callback function that is called when a new token is received from the stream.\n The callback function accepts StreamingChunk as an argument.\n+ :param timeout: The timeout in seconds to be passed to the underlying `AzureOpenAI` client, if not set it is\n+ inferred from the `OPENAI_TIMEOUT` environment variable or set to 30.\n+ :param max_retries: Maximum retries to establish a connection with AzureOpenAI if it returns an internal error,\n+ if not set it is inferred from the `OPENAI_MAX_RETRIES` environment variable or set to 5.\n :param generation_kwargs: Other parameters to use for the model. These parameters are all sent directly to\n the OpenAI endpoint. See OpenAI [documentation](https://platform.openai.com/docs/api-reference/chat) for\n more details.\n@@ -140,7 +145,8 @@\n self.azure_deployment = azure_deployment\n self.organization = organization\n self.model = azure_deployment or \"gpt-35-turbo\"\n- self.timeout = timeout\n+ self.timeout = timeout or float(os.environ.get(\"OPENAI_TIMEOUT\", 30.0))\n+ self.max_retries = max_retries or int(os.environ.get(\"OPENAI_MAX_RETRIES\", 5))\n \n self.client = AzureOpenAI(\n api_version=api_version,\n@@ -149,6 +155,8 @@\n api_key=api_key.resolve_value() if api_key is not None else None,\n azure_ad_token=azure_ad_token.resolve_value() if azure_ad_token is not None else None,\n organization=organization,\n+ timeout=self.timeout,\n+ max_retries=self.max_retries,\n )\n \n def to_dict(self) -> Dict[str, Any]:\n@@ -168,6 +176,7 @@\n streaming_callback=callback_name,\n generation_kwargs=self.generation_kwargs,\n timeout=self.timeout,\n+ max_retries=self.max_retries,\n api_key=self.api_key.to_dict() if self.api_key is not None else None,\n azure_ad_token=self.azure_ad_token.to_dict() if self.azure_ad_token is not None else None,\n )\n", "issue": "Add `max_retries` and `timeout` params to all `AzureOpenAI` classes\n**Is your feature request related to a problem? Please describe.**\r\n\r\nCurrently all `OpenAI` related classes (e.g. `OpenAIDocumentEmbedder`, `OpenAIChatGenerator`) can be initialised by setting `max_retries` and `timeout` params.\r\n\r\nThe corresponding `AzureOpenAI` don't always have the same params.\r\n\r\n**Describe the solution you'd like**\r\n\r\nIt would be nice to have these params in the `AzureOpenAI` classes\r\n\r\n**Describe alternatives you've considered**\r\n\r\nSubclass `AzureOpenAI` and create custom components.\r\n\r\n**Additional context**\r\n\r\ncc @anakin87 :)\n", "before_files": [{"content": "# SPDX-FileCopyrightText: 2022-present deepset GmbH <[email protected]>\n#\n# SPDX-License-Identifier: Apache-2.0\n\nimport os\nfrom typing import Any, Callable, Dict, Optional\n\n# pylint: disable=import-error\nfrom openai.lib.azure import AzureOpenAI\n\nfrom haystack import component, default_from_dict, default_to_dict, logging\nfrom haystack.components.generators.chat import OpenAIChatGenerator\nfrom haystack.dataclasses import StreamingChunk\nfrom haystack.utils import Secret, deserialize_callable, deserialize_secrets_inplace, serialize_callable\n\nlogger = logging.getLogger(__name__)\n\n\n@component\nclass AzureOpenAIChatGenerator(OpenAIChatGenerator):\n \"\"\"\n A Chat Generator component that uses the Azure OpenAI API to generate text.\n\n Enables text generation using OpenAI's large language models (LLMs) on Azure. It supports `gpt-4` and\n `gpt-3.5-turbo` family of models accessed through the chat completions API endpoint.\n\n Users can pass any text generation parameters valid for the `openai.ChatCompletion.create` method\n directly to this component via the `generation_kwargs` parameter in `__init__` or the `generation_kwargs`\n parameter in `run` method.\n\n For more details on OpenAI models deployed on Azure, refer to the Microsoft\n [documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/).\n\n Key Features and Compatibility:\n - Primary Compatibility: Designed to work seamlessly with the OpenAI API Chat Completion endpoint.\n - Streaming Support: Supports streaming responses from the OpenAI API Chat Completion endpoint.\n - Customizability: Supports all parameters supported by the OpenAI API Chat Completion endpoint.\n\n Input and Output Format:\n - ChatMessage Format: This component uses the ChatMessage format for structuring both input and output, ensuring\n coherent and contextually relevant responses in chat-based text generation scenarios.\n - Details on the ChatMessage format can be found [here](https://docs.haystack.deepset.ai/v2.0/docs/data-classes#chatmessage).\n\n\n Usage example:\n\n ```python\n from haystack.components.generators.chat import AzureOpenAIGenerator\n from haystack.dataclasses import ChatMessage\n from haystack.utils import Secret\n\n messages = [ChatMessage.from_user(\"What's Natural Language Processing?\")]\n\n client = AzureOpenAIGenerator(\n azure_endpoint=\"<Your Azure endpoint e.g. `https://your-company.azure.openai.com/>\",\n api_key=Secret.from_token(\"<your-api-key>\"),\n azure_deployment=\"<this a model name, e.g. gpt-35-turbo>\")\n response = client.run(messages)\n print(response)\n ```\n\n ```\n {'replies':\n [ChatMessage(content='Natural Language Processing (NLP) is a branch of artificial intelligence that focuses on\n enabling computers to understand, interpret, and generate human language in a way that is useful.',\n role=<ChatRole.ASSISTANT: 'assistant'>, name=None,\n meta={'model': 'gpt-3.5-turbo-0613', 'index': 0, 'finish_reason': 'stop',\n 'usage': {'prompt_tokens': 15, 'completion_tokens': 36, 'total_tokens': 51}})]\n }\n ```\n \"\"\"\n\n # pylint: disable=super-init-not-called\n def __init__(\n self,\n azure_endpoint: Optional[str] = None,\n api_version: Optional[str] = \"2023-05-15\",\n azure_deployment: Optional[str] = \"gpt-35-turbo\",\n api_key: Optional[Secret] = Secret.from_env_var(\"AZURE_OPENAI_API_KEY\", strict=False),\n azure_ad_token: Optional[Secret] = Secret.from_env_var(\"AZURE_OPENAI_AD_TOKEN\", strict=False),\n organization: Optional[str] = None,\n streaming_callback: Optional[Callable[[StreamingChunk], None]] = None,\n timeout: Optional[float] = None,\n generation_kwargs: Optional[Dict[str, Any]] = None,\n ):\n \"\"\"\n Initialize the Azure OpenAI Chat Generator component.\n\n :param azure_endpoint: The endpoint of the deployed model, e.g. `\"https://example-resource.azure.openai.com/\"`\n :param api_version: The version of the API to use. Defaults to 2023-05-15\n :param azure_deployment: The deployment of the model, usually the model name.\n :param api_key: The API key to use for authentication.\n :param azure_ad_token: [Azure Active Directory token](https://www.microsoft.com/en-us/security/business/identity-access/microsoft-entra-id)\n :param organization: The Organization ID, defaults to `None`. See\n [production best practices](https://platform.openai.com/docs/guides/production-best-practices/setting-up-your-organization).\n :param streaming_callback: A callback function that is called when a new token is received from the stream.\n The callback function accepts StreamingChunk as an argument.\n :param generation_kwargs: Other parameters to use for the model. These parameters are all sent directly to\n the OpenAI endpoint. See OpenAI [documentation](https://platform.openai.com/docs/api-reference/chat) for\n more details.\n Some of the supported parameters:\n - `max_tokens`: The maximum number of tokens the output text can have.\n - `temperature`: What sampling temperature to use. Higher values mean the model will take more risks.\n Try 0.9 for more creative applications and 0 (argmax sampling) for ones with a well-defined answer.\n - `top_p`: An alternative to sampling with temperature, called nucleus sampling, where the model\n considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens\n comprising the top 10% probability mass are considered.\n - `n`: How many completions to generate for each prompt. For example, if the LLM gets 3 prompts and n is 2,\n it will generate two completions for each of the three prompts, ending up with 6 completions in total.\n - `stop`: One or more sequences after which the LLM should stop generating tokens.\n - `presence_penalty`: What penalty to apply if a token is already present at all. Bigger values mean\n the model will be less likely to repeat the same token in the text.\n - `frequency_penalty`: What penalty to apply if a token has already been generated in the text.\n Bigger values mean the model will be less likely to repeat the same token in the text.\n - `logit_bias`: Add a logit bias to specific tokens. The keys of the dictionary are tokens, and the\n values are the bias to add to that token.\n \"\"\"\n # We intentionally do not call super().__init__ here because we only need to instantiate the client to interact\n # with the API.\n\n # Why is this here?\n # AzureOpenAI init is forcing us to use an init method that takes either base_url or azure_endpoint as not\n # None init parameters. This way we accommodate the use case where env var AZURE_OPENAI_ENDPOINT is set instead\n # of passing it as a parameter.\n azure_endpoint = azure_endpoint or os.environ.get(\"AZURE_OPENAI_ENDPOINT\")\n if not azure_endpoint:\n raise ValueError(\"Please provide an Azure endpoint or set the environment variable AZURE_OPENAI_ENDPOINT.\")\n\n if api_key is None and azure_ad_token is None:\n raise ValueError(\"Please provide an API key or an Azure Active Directory token.\")\n\n # The check above makes mypy incorrectly infer that api_key is never None,\n # which propagates the incorrect type.\n self.api_key = api_key # type: ignore\n self.azure_ad_token = azure_ad_token\n self.generation_kwargs = generation_kwargs or {}\n self.streaming_callback = streaming_callback\n self.api_version = api_version\n self.azure_endpoint = azure_endpoint\n self.azure_deployment = azure_deployment\n self.organization = organization\n self.model = azure_deployment or \"gpt-35-turbo\"\n self.timeout = timeout\n\n self.client = AzureOpenAI(\n api_version=api_version,\n azure_endpoint=azure_endpoint,\n azure_deployment=azure_deployment,\n api_key=api_key.resolve_value() if api_key is not None else None,\n azure_ad_token=azure_ad_token.resolve_value() if azure_ad_token is not None else None,\n organization=organization,\n )\n\n def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serialize this component to a dictionary.\n\n :returns:\n The serialized component as a dictionary.\n \"\"\"\n callback_name = serialize_callable(self.streaming_callback) if self.streaming_callback else None\n return default_to_dict(\n self,\n azure_endpoint=self.azure_endpoint,\n azure_deployment=self.azure_deployment,\n organization=self.organization,\n api_version=self.api_version,\n streaming_callback=callback_name,\n generation_kwargs=self.generation_kwargs,\n timeout=self.timeout,\n api_key=self.api_key.to_dict() if self.api_key is not None else None,\n azure_ad_token=self.azure_ad_token.to_dict() if self.azure_ad_token is not None else None,\n )\n\n @classmethod\n def from_dict(cls, data: Dict[str, Any]) -> \"AzureOpenAIChatGenerator\":\n \"\"\"\n Deserialize this component from a dictionary.\n\n :param data: The dictionary representation of this component.\n :returns:\n The deserialized component instance.\n \"\"\"\n deserialize_secrets_inplace(data[\"init_parameters\"], keys=[\"api_key\", \"azure_ad_token\"])\n init_params = data.get(\"init_parameters\", {})\n serialized_callback_handler = init_params.get(\"streaming_callback\")\n if serialized_callback_handler:\n data[\"init_parameters\"][\"streaming_callback\"] = deserialize_callable(serialized_callback_handler)\n return default_from_dict(cls, data)\n", "path": "haystack/components/generators/chat/azure.py"}], "after_files": [{"content": "# SPDX-FileCopyrightText: 2022-present deepset GmbH <[email protected]>\n#\n# SPDX-License-Identifier: Apache-2.0\n\nimport os\nfrom typing import Any, Callable, Dict, Optional\n\n# pylint: disable=import-error\nfrom openai.lib.azure import AzureOpenAI\n\nfrom haystack import component, default_from_dict, default_to_dict, logging\nfrom haystack.components.generators.chat import OpenAIChatGenerator\nfrom haystack.dataclasses import StreamingChunk\nfrom haystack.utils import Secret, deserialize_callable, deserialize_secrets_inplace, serialize_callable\n\nlogger = logging.getLogger(__name__)\n\n\n@component\nclass AzureOpenAIChatGenerator(OpenAIChatGenerator):\n \"\"\"\n A Chat Generator component that uses the Azure OpenAI API to generate text.\n\n Enables text generation using OpenAI's large language models (LLMs) on Azure. It supports `gpt-4` and\n `gpt-3.5-turbo` family of models accessed through the chat completions API endpoint.\n\n Users can pass any text generation parameters valid for the `openai.ChatCompletion.create` method\n directly to this component via the `generation_kwargs` parameter in `__init__` or the `generation_kwargs`\n parameter in `run` method.\n\n For more details on OpenAI models deployed on Azure, refer to the Microsoft\n [documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/).\n\n Key Features and Compatibility:\n - Primary Compatibility: Designed to work seamlessly with the OpenAI API Chat Completion endpoint.\n - Streaming Support: Supports streaming responses from the OpenAI API Chat Completion endpoint.\n - Customizability: Supports all parameters supported by the OpenAI API Chat Completion endpoint.\n\n Input and Output Format:\n - ChatMessage Format: This component uses the ChatMessage format for structuring both input and output, ensuring\n coherent and contextually relevant responses in chat-based text generation scenarios.\n - Details on the ChatMessage format can be found [here](https://docs.haystack.deepset.ai/v2.0/docs/data-classes#chatmessage).\n\n\n Usage example:\n\n ```python\n from haystack.components.generators.chat import AzureOpenAIGenerator\n from haystack.dataclasses import ChatMessage\n from haystack.utils import Secret\n\n messages = [ChatMessage.from_user(\"What's Natural Language Processing?\")]\n\n client = AzureOpenAIGenerator(\n azure_endpoint=\"<Your Azure endpoint e.g. `https://your-company.azure.openai.com/>\",\n api_key=Secret.from_token(\"<your-api-key>\"),\n azure_deployment=\"<this a model name, e.g. gpt-35-turbo>\")\n response = client.run(messages)\n print(response)\n ```\n\n ```\n {'replies':\n [ChatMessage(content='Natural Language Processing (NLP) is a branch of artificial intelligence that focuses on\n enabling computers to understand, interpret, and generate human language in a way that is useful.',\n role=<ChatRole.ASSISTANT: 'assistant'>, name=None,\n meta={'model': 'gpt-3.5-turbo-0613', 'index': 0, 'finish_reason': 'stop',\n 'usage': {'prompt_tokens': 15, 'completion_tokens': 36, 'total_tokens': 51}})]\n }\n ```\n \"\"\"\n\n # pylint: disable=super-init-not-called\n def __init__(\n self,\n azure_endpoint: Optional[str] = None,\n api_version: Optional[str] = \"2023-05-15\",\n azure_deployment: Optional[str] = \"gpt-35-turbo\",\n api_key: Optional[Secret] = Secret.from_env_var(\"AZURE_OPENAI_API_KEY\", strict=False),\n azure_ad_token: Optional[Secret] = Secret.from_env_var(\"AZURE_OPENAI_AD_TOKEN\", strict=False),\n organization: Optional[str] = None,\n streaming_callback: Optional[Callable[[StreamingChunk], None]] = None,\n timeout: Optional[float] = None,\n max_retries: Optional[int] = None,\n generation_kwargs: Optional[Dict[str, Any]] = None,\n ):\n \"\"\"\n Initialize the Azure OpenAI Chat Generator component.\n\n :param azure_endpoint: The endpoint of the deployed model, e.g. `\"https://example-resource.azure.openai.com/\"`\n :param api_version: The version of the API to use. Defaults to 2023-05-15\n :param azure_deployment: The deployment of the model, usually the model name.\n :param api_key: The API key to use for authentication.\n :param azure_ad_token: [Azure Active Directory token](https://www.microsoft.com/en-us/security/business/identity-access/microsoft-entra-id)\n :param organization: The Organization ID, defaults to `None`. See\n [production best practices](https://platform.openai.com/docs/guides/production-best-practices/setting-up-your-organization).\n :param streaming_callback: A callback function that is called when a new token is received from the stream.\n The callback function accepts StreamingChunk as an argument.\n :param timeout: The timeout in seconds to be passed to the underlying `AzureOpenAI` client, if not set it is\n inferred from the `OPENAI_TIMEOUT` environment variable or set to 30.\n :param max_retries: Maximum retries to establish a connection with AzureOpenAI if it returns an internal error,\n if not set it is inferred from the `OPENAI_MAX_RETRIES` environment variable or set to 5.\n :param generation_kwargs: Other parameters to use for the model. These parameters are all sent directly to\n the OpenAI endpoint. See OpenAI [documentation](https://platform.openai.com/docs/api-reference/chat) for\n more details.\n Some of the supported parameters:\n - `max_tokens`: The maximum number of tokens the output text can have.\n - `temperature`: What sampling temperature to use. Higher values mean the model will take more risks.\n Try 0.9 for more creative applications and 0 (argmax sampling) for ones with a well-defined answer.\n - `top_p`: An alternative to sampling with temperature, called nucleus sampling, where the model\n considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens\n comprising the top 10% probability mass are considered.\n - `n`: How many completions to generate for each prompt. For example, if the LLM gets 3 prompts and n is 2,\n it will generate two completions for each of the three prompts, ending up with 6 completions in total.\n - `stop`: One or more sequences after which the LLM should stop generating tokens.\n - `presence_penalty`: What penalty to apply if a token is already present at all. Bigger values mean\n the model will be less likely to repeat the same token in the text.\n - `frequency_penalty`: What penalty to apply if a token has already been generated in the text.\n Bigger values mean the model will be less likely to repeat the same token in the text.\n - `logit_bias`: Add a logit bias to specific tokens. The keys of the dictionary are tokens, and the\n values are the bias to add to that token.\n \"\"\"\n # We intentionally do not call super().__init__ here because we only need to instantiate the client to interact\n # with the API.\n\n # Why is this here?\n # AzureOpenAI init is forcing us to use an init method that takes either base_url or azure_endpoint as not\n # None init parameters. This way we accommodate the use case where env var AZURE_OPENAI_ENDPOINT is set instead\n # of passing it as a parameter.\n azure_endpoint = azure_endpoint or os.environ.get(\"AZURE_OPENAI_ENDPOINT\")\n if not azure_endpoint:\n raise ValueError(\"Please provide an Azure endpoint or set the environment variable AZURE_OPENAI_ENDPOINT.\")\n\n if api_key is None and azure_ad_token is None:\n raise ValueError(\"Please provide an API key or an Azure Active Directory token.\")\n\n # The check above makes mypy incorrectly infer that api_key is never None,\n # which propagates the incorrect type.\n self.api_key = api_key # type: ignore\n self.azure_ad_token = azure_ad_token\n self.generation_kwargs = generation_kwargs or {}\n self.streaming_callback = streaming_callback\n self.api_version = api_version\n self.azure_endpoint = azure_endpoint\n self.azure_deployment = azure_deployment\n self.organization = organization\n self.model = azure_deployment or \"gpt-35-turbo\"\n self.timeout = timeout or float(os.environ.get(\"OPENAI_TIMEOUT\", 30.0))\n self.max_retries = max_retries or int(os.environ.get(\"OPENAI_MAX_RETRIES\", 5))\n\n self.client = AzureOpenAI(\n api_version=api_version,\n azure_endpoint=azure_endpoint,\n azure_deployment=azure_deployment,\n api_key=api_key.resolve_value() if api_key is not None else None,\n azure_ad_token=azure_ad_token.resolve_value() if azure_ad_token is not None else None,\n organization=organization,\n timeout=self.timeout,\n max_retries=self.max_retries,\n )\n\n def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serialize this component to a dictionary.\n\n :returns:\n The serialized component as a dictionary.\n \"\"\"\n callback_name = serialize_callable(self.streaming_callback) if self.streaming_callback else None\n return default_to_dict(\n self,\n azure_endpoint=self.azure_endpoint,\n azure_deployment=self.azure_deployment,\n organization=self.organization,\n api_version=self.api_version,\n streaming_callback=callback_name,\n generation_kwargs=self.generation_kwargs,\n timeout=self.timeout,\n max_retries=self.max_retries,\n api_key=self.api_key.to_dict() if self.api_key is not None else None,\n azure_ad_token=self.azure_ad_token.to_dict() if self.azure_ad_token is not None else None,\n )\n\n @classmethod\n def from_dict(cls, data: Dict[str, Any]) -> \"AzureOpenAIChatGenerator\":\n \"\"\"\n Deserialize this component from a dictionary.\n\n :param data: The dictionary representation of this component.\n :returns:\n The deserialized component instance.\n \"\"\"\n deserialize_secrets_inplace(data[\"init_parameters\"], keys=[\"api_key\", \"azure_ad_token\"])\n init_params = data.get(\"init_parameters\", {})\n serialized_callback_handler = init_params.get(\"streaming_callback\")\n if serialized_callback_handler:\n data[\"init_parameters\"][\"streaming_callback\"] = deserialize_callable(serialized_callback_handler)\n return default_from_dict(cls, data)\n", "path": "haystack/components/generators/chat/azure.py"}]} | 3,017 | 649 |
gh_patches_debug_34754 | rasdani/github-patches | git_diff | akvo__akvo-rsr-1531 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add organisation filter for maps API resources
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `akvo/rest/views/project_update_location.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Akvo RSR is covered by the GNU Affero General Public License.
4 # See more details in the license.txt file located at the root folder of the Akvo RSR module.
5 # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
6
7
8 from akvo.rsr.models import ProjectUpdateLocation
9 from ..serializers import ProjectUpdateLocationSerializer, MapProjectUpdateLocationSerializer
10 from ..viewsets import BaseRSRViewSet
11
12
13 class ProjectUpdateLocationViewSet(BaseRSRViewSet):
14 """
15 API endpoint that allows organisation locations to be viewed or edited.
16 """
17 queryset = ProjectUpdateLocation.objects.all()
18 serializer_class = ProjectUpdateLocationSerializer
19
20
21 class MapProjectUpdateLocationViewSet(BaseRSRViewSet):
22
23 """Returns a resource tailored for generating a map of update locations.
24
25 Allowed parameters are:
26 limit (default 100 / max 500), and
27 location_target__project (filter on project ID)
28 """
29
30 filter_fields = ('location_target__project', )
31 max_paginate_by = 500
32 paginate_by = 100
33 queryset = ProjectUpdateLocation.objects.select_related(
34 'location_target',
35 'location_target__project').only(
36 'id', 'latitude', 'longitude',
37 'location_target__id', 'location_target__project', 'location_target__title',
38 'location_target__photo', 'location_target__video')
39 serializer_class = MapProjectUpdateLocationSerializer
40
```
Path: `akvo/rest/views/organisation_location.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 """Akvo RSR is covered by the GNU Affero General Public License.
3 See more details in the license.txt file located at the root folder of the Akvo RSR module.
4 For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
5 """
6
7 from akvo.rsr.models import OrganisationLocation
8 from ..serializers import OrganisationLocationSerializer, MapOrganisationLocationSerializer
9 from ..viewsets import BaseRSRViewSet
10
11
12 class OrganisationLocationViewSet(BaseRSRViewSet):
13 """
14 API endpoint that allows organisation locations to be viewed or edited.
15 """
16 queryset = OrganisationLocation.objects.all()
17 serializer_class = OrganisationLocationSerializer
18
19
20 class MapOrganisationLocationViewSet(BaseRSRViewSet):
21
22 """Returns a resource tailored for generating a map of organisation locations.
23
24 Allowed parameters are:
25 limit (default 100 / max 500),
26 location_target (filter on organisation ID), and
27 country (filter on country ID)
28 """
29
30 filter_fields = ('location_target', 'country')
31 max_paginate_by = 500
32 paginate_by = 100
33 queryset = OrganisationLocation.objects.select_related(
34 'location_target', 'country').only(
35 'id', 'latitude', 'longitude',
36 'location_target__id', 'location_target__name',
37 'location_target__logo',
38 'country')
39 serializer_class = MapOrganisationLocationSerializer
40
```
Path: `akvo/rest/views/project_location.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 """Akvo RSR is covered by the GNU Affero General Public License.
3 See more details in the license.txt file located at the root folder of the Akvo RSR module.
4 For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
5 """
6
7 from akvo.rsr.models import ProjectLocation
8 from ..serializers import ProjectLocationSerializer, MapProjectLocationSerializer
9 from ..viewsets import BaseRSRViewSet
10
11
12 class ProjectLocationViewSet(BaseRSRViewSet):
13 """
14 """
15 queryset = ProjectLocation.objects.all()
16 serializer_class = ProjectLocationSerializer
17 filter_fields = ('location_target', 'country', )
18
19
20 class MapProjectLocationViewSet(BaseRSRViewSet):
21
22 """Returns a resource tailored for generating a map of project locations.
23
24 Allowed parameters are:
25 limit (default 100 / max 500),
26 location_target (filter on project ID), and
27 country (filter on country ID)
28 """
29
30 filter_fields = ('location_target', 'country')
31 max_paginate_by = 500
32 paginate_by = 100
33 queryset = ProjectLocation.objects.select_related(
34 'location_target', 'country').only(
35 'id', 'latitude', 'longitude',
36 'location_target__id', 'location_target__title',
37 'location_target__current_image',
38 'country')
39 serializer_class = MapProjectLocationSerializer
40
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/akvo/rest/views/organisation_location.py b/akvo/rest/views/organisation_location.py
--- a/akvo/rest/views/organisation_location.py
+++ b/akvo/rest/views/organisation_location.py
@@ -22,9 +22,9 @@
"""Returns a resource tailored for generating a map of organisation locations.
Allowed parameters are:
- limit (default 100 / max 500),
- location_target (filter on organisation ID), and
- country (filter on country ID)
+ __limit__ (default 100 / max 500),
+ __location_target__ (filter on organisation ID), and
+ __country__ (filter on country ID)
"""
filter_fields = ('location_target', 'country')
diff --git a/akvo/rest/views/project_location.py b/akvo/rest/views/project_location.py
--- a/akvo/rest/views/project_location.py
+++ b/akvo/rest/views/project_location.py
@@ -22,12 +22,17 @@
"""Returns a resource tailored for generating a map of project locations.
Allowed parameters are:
- limit (default 100 / max 500),
- location_target (filter on project ID), and
- country (filter on country ID)
+ __limit__ (default 100 / max 500),
+ __location_target__ (filter on project ID),
+ __location_target\__partners__ (filter on organisation ID), and
+ __country__ (filter on country ID)
"""
- filter_fields = ('location_target', 'country')
+ filter_fields = (
+ 'location_target',
+ 'location_target__partners',
+ 'country'
+ )
max_paginate_by = 500
paginate_by = 100
queryset = ProjectLocation.objects.select_related(
diff --git a/akvo/rest/views/project_update_location.py b/akvo/rest/views/project_update_location.py
--- a/akvo/rest/views/project_update_location.py
+++ b/akvo/rest/views/project_update_location.py
@@ -23,11 +23,18 @@
"""Returns a resource tailored for generating a map of update locations.
Allowed parameters are:
- limit (default 100 / max 500), and
- location_target__project (filter on project ID)
+ __limit__ (default 100 / max 500),
+ __location_target\__project__ (filter on project ID),
+ __location_target\__project\__partners__
+ (filter on organisation ID of the projects' organisations),
+ __location_target\__user\__employers__ (filter on organisation ID of the users' organisations)
"""
- filter_fields = ('location_target__project', )
+ filter_fields = (
+ 'location_target__project',
+ 'location_target__project__partners',
+ 'location_target__user__employers'
+ )
max_paginate_by = 500
paginate_by = 100
queryset = ProjectUpdateLocation.objects.select_related(
| {"golden_diff": "diff --git a/akvo/rest/views/organisation_location.py b/akvo/rest/views/organisation_location.py\n--- a/akvo/rest/views/organisation_location.py\n+++ b/akvo/rest/views/organisation_location.py\n@@ -22,9 +22,9 @@\n \"\"\"Returns a resource tailored for generating a map of organisation locations.\n \n Allowed parameters are:\n- limit (default 100 / max 500),\n- location_target (filter on organisation ID), and\n- country (filter on country ID)\n+ __limit__ (default 100 / max 500),\n+ __location_target__ (filter on organisation ID), and\n+ __country__ (filter on country ID)\n \"\"\"\n \n filter_fields = ('location_target', 'country')\ndiff --git a/akvo/rest/views/project_location.py b/akvo/rest/views/project_location.py\n--- a/akvo/rest/views/project_location.py\n+++ b/akvo/rest/views/project_location.py\n@@ -22,12 +22,17 @@\n \"\"\"Returns a resource tailored for generating a map of project locations.\n \n Allowed parameters are:\n- limit (default 100 / max 500),\n- location_target (filter on project ID), and\n- country (filter on country ID)\n+ __limit__ (default 100 / max 500),\n+ __location_target__ (filter on project ID),\n+ __location_target\\__partners__ (filter on organisation ID), and\n+ __country__ (filter on country ID)\n \"\"\"\n \n- filter_fields = ('location_target', 'country')\n+ filter_fields = (\n+ 'location_target',\n+ 'location_target__partners',\n+ 'country'\n+ )\n max_paginate_by = 500\n paginate_by = 100\n queryset = ProjectLocation.objects.select_related(\ndiff --git a/akvo/rest/views/project_update_location.py b/akvo/rest/views/project_update_location.py\n--- a/akvo/rest/views/project_update_location.py\n+++ b/akvo/rest/views/project_update_location.py\n@@ -23,11 +23,18 @@\n \"\"\"Returns a resource tailored for generating a map of update locations.\n \n Allowed parameters are:\n- limit (default 100 / max 500), and\n- location_target__project (filter on project ID)\n+ __limit__ (default 100 / max 500),\n+ __location_target\\__project__ (filter on project ID),\n+ __location_target\\__project\\__partners__\n+ (filter on organisation ID of the projects' organisations),\n+ __location_target\\__user\\__employers__ (filter on organisation ID of the users' organisations)\n \"\"\"\n \n- filter_fields = ('location_target__project', )\n+ filter_fields = (\n+ 'location_target__project',\n+ 'location_target__project__partners',\n+ 'location_target__user__employers'\n+ )\n max_paginate_by = 500\n paginate_by = 100\n queryset = ProjectUpdateLocation.objects.select_related(\n", "issue": "Add organisation filter for maps API resources\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\n\nfrom akvo.rsr.models import ProjectUpdateLocation\nfrom ..serializers import ProjectUpdateLocationSerializer, MapProjectUpdateLocationSerializer\nfrom ..viewsets import BaseRSRViewSet\n\n\nclass ProjectUpdateLocationViewSet(BaseRSRViewSet):\n \"\"\"\n API endpoint that allows organisation locations to be viewed or edited.\n \"\"\"\n queryset = ProjectUpdateLocation.objects.all()\n serializer_class = ProjectUpdateLocationSerializer\n\n\nclass MapProjectUpdateLocationViewSet(BaseRSRViewSet):\n\n \"\"\"Returns a resource tailored for generating a map of update locations.\n\n Allowed parameters are:\n limit (default 100 / max 500), and\n location_target__project (filter on project ID)\n \"\"\"\n\n filter_fields = ('location_target__project', )\n max_paginate_by = 500\n paginate_by = 100\n queryset = ProjectUpdateLocation.objects.select_related(\n 'location_target',\n 'location_target__project').only(\n 'id', 'latitude', 'longitude',\n 'location_target__id', 'location_target__project', 'location_target__title',\n 'location_target__photo', 'location_target__video')\n serializer_class = MapProjectUpdateLocationSerializer\n", "path": "akvo/rest/views/project_update_location.py"}, {"content": "# -*- coding: utf-8 -*-\n\"\"\"Akvo RSR is covered by the GNU Affero General Public License.\nSee more details in the license.txt file located at the root folder of the Akvo RSR module.\nFor additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\"\"\"\n\nfrom akvo.rsr.models import OrganisationLocation\nfrom ..serializers import OrganisationLocationSerializer, MapOrganisationLocationSerializer\nfrom ..viewsets import BaseRSRViewSet\n\n\nclass OrganisationLocationViewSet(BaseRSRViewSet):\n \"\"\"\n API endpoint that allows organisation locations to be viewed or edited.\n \"\"\"\n queryset = OrganisationLocation.objects.all()\n serializer_class = OrganisationLocationSerializer\n\n\nclass MapOrganisationLocationViewSet(BaseRSRViewSet):\n\n \"\"\"Returns a resource tailored for generating a map of organisation locations.\n\n Allowed parameters are:\n limit (default 100 / max 500),\n location_target (filter on organisation ID), and\n country (filter on country ID)\n \"\"\"\n\n filter_fields = ('location_target', 'country')\n max_paginate_by = 500\n paginate_by = 100\n queryset = OrganisationLocation.objects.select_related(\n 'location_target', 'country').only(\n 'id', 'latitude', 'longitude',\n 'location_target__id', 'location_target__name',\n 'location_target__logo',\n 'country')\n serializer_class = MapOrganisationLocationSerializer\n", "path": "akvo/rest/views/organisation_location.py"}, {"content": "# -*- coding: utf-8 -*-\n\"\"\"Akvo RSR is covered by the GNU Affero General Public License.\nSee more details in the license.txt file located at the root folder of the Akvo RSR module.\nFor additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\"\"\"\n\nfrom akvo.rsr.models import ProjectLocation\nfrom ..serializers import ProjectLocationSerializer, MapProjectLocationSerializer\nfrom ..viewsets import BaseRSRViewSet\n\n\nclass ProjectLocationViewSet(BaseRSRViewSet):\n \"\"\"\n \"\"\"\n queryset = ProjectLocation.objects.all()\n serializer_class = ProjectLocationSerializer\n filter_fields = ('location_target', 'country', )\n\n\nclass MapProjectLocationViewSet(BaseRSRViewSet):\n\n \"\"\"Returns a resource tailored for generating a map of project locations.\n\n Allowed parameters are:\n limit (default 100 / max 500),\n location_target (filter on project ID), and\n country (filter on country ID)\n \"\"\"\n\n filter_fields = ('location_target', 'country')\n max_paginate_by = 500\n paginate_by = 100\n queryset = ProjectLocation.objects.select_related(\n 'location_target', 'country').only(\n 'id', 'latitude', 'longitude',\n 'location_target__id', 'location_target__title',\n 'location_target__current_image',\n 'country')\n serializer_class = MapProjectLocationSerializer\n", "path": "akvo/rest/views/project_location.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\n\nfrom akvo.rsr.models import ProjectUpdateLocation\nfrom ..serializers import ProjectUpdateLocationSerializer, MapProjectUpdateLocationSerializer\nfrom ..viewsets import BaseRSRViewSet\n\n\nclass ProjectUpdateLocationViewSet(BaseRSRViewSet):\n \"\"\"\n API endpoint that allows organisation locations to be viewed or edited.\n \"\"\"\n queryset = ProjectUpdateLocation.objects.all()\n serializer_class = ProjectUpdateLocationSerializer\n\n\nclass MapProjectUpdateLocationViewSet(BaseRSRViewSet):\n\n \"\"\"Returns a resource tailored for generating a map of update locations.\n\n Allowed parameters are:\n __limit__ (default 100 / max 500),\n __location_target\\__project__ (filter on project ID),\n __location_target\\__project\\__partners__\n (filter on organisation ID of the projects' organisations),\n __location_target\\__user\\__employers__ (filter on organisation ID of the users' organisations)\n \"\"\"\n\n filter_fields = (\n 'location_target__project',\n 'location_target__project__partners',\n 'location_target__user__employers'\n )\n max_paginate_by = 500\n paginate_by = 100\n queryset = ProjectUpdateLocation.objects.select_related(\n 'location_target',\n 'location_target__project').only(\n 'id', 'latitude', 'longitude',\n 'location_target__id', 'location_target__project', 'location_target__title',\n 'location_target__photo', 'location_target__video')\n serializer_class = MapProjectUpdateLocationSerializer\n", "path": "akvo/rest/views/project_update_location.py"}, {"content": "# -*- coding: utf-8 -*-\n\"\"\"Akvo RSR is covered by the GNU Affero General Public License.\nSee more details in the license.txt file located at the root folder of the Akvo RSR module.\nFor additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\"\"\"\n\nfrom akvo.rsr.models import OrganisationLocation\nfrom ..serializers import OrganisationLocationSerializer, MapOrganisationLocationSerializer\nfrom ..viewsets import BaseRSRViewSet\n\n\nclass OrganisationLocationViewSet(BaseRSRViewSet):\n \"\"\"\n API endpoint that allows organisation locations to be viewed or edited.\n \"\"\"\n queryset = OrganisationLocation.objects.all()\n serializer_class = OrganisationLocationSerializer\n\n\nclass MapOrganisationLocationViewSet(BaseRSRViewSet):\n\n \"\"\"Returns a resource tailored for generating a map of organisation locations.\n\n Allowed parameters are:\n __limit__ (default 100 / max 500),\n __location_target__ (filter on organisation ID), and\n __country__ (filter on country ID)\n \"\"\"\n\n filter_fields = ('location_target', 'country')\n max_paginate_by = 500\n paginate_by = 100\n queryset = OrganisationLocation.objects.select_related(\n 'location_target', 'country').only(\n 'id', 'latitude', 'longitude',\n 'location_target__id', 'location_target__name',\n 'location_target__logo',\n 'country')\n serializer_class = MapOrganisationLocationSerializer\n", "path": "akvo/rest/views/organisation_location.py"}, {"content": "# -*- coding: utf-8 -*-\n\"\"\"Akvo RSR is covered by the GNU Affero General Public License.\nSee more details in the license.txt file located at the root folder of the Akvo RSR module.\nFor additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\"\"\"\n\nfrom akvo.rsr.models import ProjectLocation\nfrom ..serializers import ProjectLocationSerializer, MapProjectLocationSerializer\nfrom ..viewsets import BaseRSRViewSet\n\n\nclass ProjectLocationViewSet(BaseRSRViewSet):\n \"\"\"\n \"\"\"\n queryset = ProjectLocation.objects.all()\n serializer_class = ProjectLocationSerializer\n filter_fields = ('location_target', 'country', )\n\n\nclass MapProjectLocationViewSet(BaseRSRViewSet):\n\n \"\"\"Returns a resource tailored for generating a map of project locations.\n\n Allowed parameters are:\n __limit__ (default 100 / max 500),\n __location_target__ (filter on project ID),\n __location_target\\__partners__ (filter on organisation ID), and\n __country__ (filter on country ID)\n \"\"\"\n\n filter_fields = (\n 'location_target',\n 'location_target__partners',\n 'country'\n )\n max_paginate_by = 500\n paginate_by = 100\n queryset = ProjectLocation.objects.select_related(\n 'location_target', 'country').only(\n 'id', 'latitude', 'longitude',\n 'location_target__id', 'location_target__title',\n 'location_target__current_image',\n 'country')\n serializer_class = MapProjectLocationSerializer\n", "path": "akvo/rest/views/project_location.py"}]} | 1,465 | 696 |
gh_patches_debug_23553 | rasdani/github-patches | git_diff | conda__conda-6849 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
conda env update fails with pip dependencies if Python version changes
**I'm submitting a...**
- [x] bug report
- [ ] feature request
**update:** only the root env appears to be affected
## Current Behavior
`conda env update -f environment.yml` fails if both of these conditions are met:
1. The Python version changes
2. There is at least one dependency to be installed with pip
The error looks like:
```
Unable to install package for pip.
Please double check and ensure you dependencies file has
the correct spelling. You might also try installing the
conda-env-pip package to see if provides the required
installer.
```
This is the same error but not the same bug as #4985. pip is installed and specified as a dependency, but switching Python version in the same transaction results in conda failing to find pip.
### Steps to Reproduce
Create an environment.yml that contains Python and at least one pip dependency (doesn't matter if it's already installed or not):
```yaml
# environment.yml
dependencies:
- python=3.5
- pip
- pip:
- wheel
```
Setup a root env with a *different* Python:
```
conda install python=3.6 pip
```
and then try to update it with the environment file
```
conda env update -n root -f environment.yml
```
## Expected Behavior
`conda env update` should succeed.
<!-- What do you think should happen? -->
## Environment Information
Run in docker with [these files](https://gist.github.com/7d234cbcf69df2c7cd68523c9327db71).
<details open><summary><code>`conda info`</code></summary><p>
<!-- between the ticks below, paste the output of 'conda info' -->
```
active environment : None
user config file : /root/.condarc
populated config files :
conda version : 4.4.8
conda-build version : not installed
python version : 3.6.2.final.0
base environment : /opt/conda (writable)
channel URLs : https://repo.continuum.io/pkgs/main/linux-64
https://repo.continuum.io/pkgs/main/noarch
https://repo.continuum.io/pkgs/free/linux-64
https://repo.continuum.io/pkgs/free/noarch
https://repo.continuum.io/pkgs/r/linux-64
https://repo.continuum.io/pkgs/r/noarch
https://repo.continuum.io/pkgs/pro/linux-64
https://repo.continuum.io/pkgs/pro/noarch
package cache : /opt/conda/pkgs
/root/.conda/pkgs
envs directories : /opt/conda/envs
/root/.conda/envs
platform : linux-64
user-agent : conda/4.4.8 requests/2.18.4 CPython/3.6.2 Linux/4.9.75-linuxkit-aufs debian/8 glibc/2.19
UID:GID : 0:0
netrc file : None
offline mode : False
```
</p></details>
conda env update fails with pip dependencies if Python version changes
**I'm submitting a...**
- [x] bug report
- [ ] feature request
**update:** only the root env appears to be affected
## Current Behavior
`conda env update -f environment.yml` fails if both of these conditions are met:
1. The Python version changes
2. There is at least one dependency to be installed with pip
The error looks like:
```
Unable to install package for pip.
Please double check and ensure you dependencies file has
the correct spelling. You might also try installing the
conda-env-pip package to see if provides the required
installer.
```
This is the same error but not the same bug as #4985. pip is installed and specified as a dependency, but switching Python version in the same transaction results in conda failing to find pip.
### Steps to Reproduce
Create an environment.yml that contains Python and at least one pip dependency (doesn't matter if it's already installed or not):
```yaml
# environment.yml
dependencies:
- python=3.5
- pip
- pip:
- wheel
```
Setup a root env with a *different* Python:
```
conda install python=3.6 pip
```
and then try to update it with the environment file
```
conda env update -n root -f environment.yml
```
## Expected Behavior
`conda env update` should succeed.
<!-- What do you think should happen? -->
## Environment Information
Run in docker with [these files](https://gist.github.com/7d234cbcf69df2c7cd68523c9327db71).
<details open><summary><code>`conda info`</code></summary><p>
<!-- between the ticks below, paste the output of 'conda info' -->
```
active environment : None
user config file : /root/.condarc
populated config files :
conda version : 4.4.8
conda-build version : not installed
python version : 3.6.2.final.0
base environment : /opt/conda (writable)
channel URLs : https://repo.continuum.io/pkgs/main/linux-64
https://repo.continuum.io/pkgs/main/noarch
https://repo.continuum.io/pkgs/free/linux-64
https://repo.continuum.io/pkgs/free/noarch
https://repo.continuum.io/pkgs/r/linux-64
https://repo.continuum.io/pkgs/r/noarch
https://repo.continuum.io/pkgs/pro/linux-64
https://repo.continuum.io/pkgs/pro/noarch
package cache : /opt/conda/pkgs
/root/.conda/pkgs
envs directories : /opt/conda/envs
/root/.conda/envs
platform : linux-64
user-agent : conda/4.4.8 requests/2.18.4 CPython/3.6.2 Linux/4.9.75-linuxkit-aufs debian/8 glibc/2.19
UID:GID : 0:0
netrc file : None
offline mode : False
```
</p></details>
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `conda_env/cli/main_update.py`
Content:
```
1 from argparse import RawDescriptionHelpFormatter
2 import os
3 import sys
4 import textwrap
5
6 from conda._vendor.auxlib.path import expand
7 from conda.cli import install as cli_install
8 from conda.cli.conda_argparse import add_parser_json, add_parser_prefix
9 from conda.misc import touch_nonadmin
10 from .common import get_prefix
11 from .. import exceptions, specs as install_specs
12 from ..exceptions import CondaEnvException
13 from ..installers.base import InvalidInstaller, get_installer
14
15 description = """
16 Update the current environment based on environment file
17 """
18
19 example = """
20 examples:
21 conda env update
22 conda env update -n=foo
23 conda env update -f=/path/to/environment.yml
24 conda env update --name=foo --file=environment.yml
25 conda env update vader/deathstar
26 """
27
28
29 def configure_parser(sub_parsers):
30 p = sub_parsers.add_parser(
31 'update',
32 formatter_class=RawDescriptionHelpFormatter,
33 description=description,
34 help=description,
35 epilog=example,
36 )
37 add_parser_prefix(p)
38 p.add_argument(
39 '-f', '--file',
40 action='store',
41 help='environment definition (default: environment.yml)',
42 default='environment.yml',
43 )
44 p.add_argument(
45 '--prune',
46 action='store_true',
47 default=False,
48 help='remove installed packages not defined in environment.yml',
49 )
50 p.add_argument(
51 '-q', '--quiet',
52 action='store_true',
53 default=False,
54 )
55 p.add_argument(
56 'remote_definition',
57 help='remote environment definition / IPython notebook',
58 action='store',
59 default=None,
60 nargs='?'
61 )
62 add_parser_json(p)
63 p.set_defaults(func='.main_update.execute')
64
65
66 def execute(args, parser):
67 name = args.remote_definition or args.name
68
69 try:
70 spec = install_specs.detect(name=name, filename=expand(args.file),
71 directory=os.getcwd())
72 env = spec.environment
73 except exceptions.SpecNotFound:
74 raise
75
76 if not (args.name or args.prefix):
77 if not env.name:
78 # Note, this is a hack fofr get_prefix that assumes argparse results
79 # TODO Refactor common.get_prefix
80 name = os.environ.get('CONDA_DEFAULT_ENV', False)
81 if not name:
82 msg = "Unable to determine environment\n\n"
83 msg += textwrap.dedent("""
84 Please re-run this command with one of the following options:
85
86 * Provide an environment name via --name or -n
87 * Re-run this command inside an activated conda environment.""").lstrip()
88 # TODO Add json support
89 raise CondaEnvException(msg)
90
91 # Note: stubbing out the args object as all of the
92 # conda.cli.common code thinks that name will always
93 # be specified.
94 args.name = env.name
95
96 prefix = get_prefix(args, search=False)
97 # CAN'T Check with this function since it assumes we will create prefix.
98 # cli_install.check_prefix(prefix, json=args.json)
99
100 # TODO, add capability
101 # common.ensure_override_channels_requires_channel(args)
102 # channel_urls = args.channel or ()
103
104 for installer_type, specs in env.dependencies.items():
105 try:
106 installer = get_installer(installer_type)
107 installer.install(prefix, specs, args, env)
108 except InvalidInstaller:
109 sys.stderr.write(textwrap.dedent("""
110 Unable to install package for {0}.
111
112 Please double check and ensure you dependencies file has
113 the correct spelling. You might also try installing the
114 conda-env-{0} package to see if provides the required
115 installer.
116 """).lstrip().format(installer_type)
117 )
118 return -1
119
120 touch_nonadmin(prefix)
121 cli_install.print_activate(args.name if args.name else prefix)
122
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/conda_env/cli/main_update.py b/conda_env/cli/main_update.py
--- a/conda_env/cli/main_update.py
+++ b/conda_env/cli/main_update.py
@@ -101,11 +101,15 @@
# common.ensure_override_channels_requires_channel(args)
# channel_urls = args.channel or ()
- for installer_type, specs in env.dependencies.items():
+ # create installers before running any of them
+ # to avoid failure to import after the file being deleted
+ # e.g. due to conda_env being upgraded or Python version switched.
+ installers = {}
+
+ for installer_type in env.dependencies:
try:
- installer = get_installer(installer_type)
- installer.install(prefix, specs, args, env)
- except InvalidInstaller:
+ installers[installer_type] = get_installer(installer_type)
+ except InvalidInstaller as e:
sys.stderr.write(textwrap.dedent("""
Unable to install package for {0}.
@@ -117,5 +121,9 @@
)
return -1
+ for installer_type, specs in env.dependencies.items():
+ installer = installers[installer_type]
+ installer.install(prefix, specs, args, env)
+
touch_nonadmin(prefix)
cli_install.print_activate(args.name if args.name else prefix)
| {"golden_diff": "diff --git a/conda_env/cli/main_update.py b/conda_env/cli/main_update.py\n--- a/conda_env/cli/main_update.py\n+++ b/conda_env/cli/main_update.py\n@@ -101,11 +101,15 @@\n # common.ensure_override_channels_requires_channel(args)\n # channel_urls = args.channel or ()\n \n- for installer_type, specs in env.dependencies.items():\n+ # create installers before running any of them\n+ # to avoid failure to import after the file being deleted\n+ # e.g. due to conda_env being upgraded or Python version switched.\n+ installers = {}\n+\n+ for installer_type in env.dependencies:\n try:\n- installer = get_installer(installer_type)\n- installer.install(prefix, specs, args, env)\n- except InvalidInstaller:\n+ installers[installer_type] = get_installer(installer_type)\n+ except InvalidInstaller as e:\n sys.stderr.write(textwrap.dedent(\"\"\"\n Unable to install package for {0}.\n \n@@ -117,5 +121,9 @@\n )\n return -1\n \n+ for installer_type, specs in env.dependencies.items():\n+ installer = installers[installer_type]\n+ installer.install(prefix, specs, args, env)\n+\n touch_nonadmin(prefix)\n cli_install.print_activate(args.name if args.name else prefix)\n", "issue": "conda env update fails with pip dependencies if Python version changes\n**I'm submitting a...**\r\n - [x] bug report\r\n - [ ] feature request\r\n\r\n**update:** only the root env appears to be affected\r\n\r\n## Current Behavior\r\n\r\n`conda env update -f environment.yml` fails if both of these conditions are met:\r\n\r\n1. The Python version changes\r\n2. There is at least one dependency to be installed with pip\r\n\r\nThe error looks like:\r\n\r\n```\r\nUnable to install package for pip.\r\n\r\nPlease double check and ensure you dependencies file has\r\nthe correct spelling. You might also try installing the\r\nconda-env-pip package to see if provides the required\r\ninstaller.\r\n```\r\n\r\nThis is the same error but not the same bug as #4985. pip is installed and specified as a dependency, but switching Python version in the same transaction results in conda failing to find pip.\r\n\r\n### Steps to Reproduce\r\n\r\nCreate an environment.yml that contains Python and at least one pip dependency (doesn't matter if it's already installed or not):\r\n\r\n```yaml\r\n# environment.yml\r\ndependencies:\r\n - python=3.5\r\n - pip\r\n - pip:\r\n - wheel\r\n```\r\n\r\nSetup a root env with a *different* Python:\r\n\r\n```\r\nconda install python=3.6 pip\r\n```\r\n\r\nand then try to update it with the environment file\r\n\r\n```\r\nconda env update -n root -f environment.yml\r\n```\r\n\r\n\r\n## Expected Behavior\r\n\r\n`conda env update` should succeed.\r\n<!-- What do you think should happen? -->\r\n\r\n\r\n## Environment Information\r\n\r\nRun in docker with [these files](https://gist.github.com/7d234cbcf69df2c7cd68523c9327db71).\r\n\r\n<details open><summary><code>`conda info`</code></summary><p>\r\n<!-- between the ticks below, paste the output of 'conda info' -->\r\n\r\n```\r\n active environment : None\r\n user config file : /root/.condarc\r\n populated config files :\r\n conda version : 4.4.8\r\n conda-build version : not installed\r\n python version : 3.6.2.final.0\r\n base environment : /opt/conda (writable)\r\n channel URLs : https://repo.continuum.io/pkgs/main/linux-64\r\n https://repo.continuum.io/pkgs/main/noarch\r\n https://repo.continuum.io/pkgs/free/linux-64\r\n https://repo.continuum.io/pkgs/free/noarch\r\n https://repo.continuum.io/pkgs/r/linux-64\r\n https://repo.continuum.io/pkgs/r/noarch\r\n https://repo.continuum.io/pkgs/pro/linux-64\r\n https://repo.continuum.io/pkgs/pro/noarch\r\n package cache : /opt/conda/pkgs\r\n /root/.conda/pkgs\r\n envs directories : /opt/conda/envs\r\n /root/.conda/envs\r\n platform : linux-64\r\n user-agent : conda/4.4.8 requests/2.18.4 CPython/3.6.2 Linux/4.9.75-linuxkit-aufs debian/8 glibc/2.19\r\n UID:GID : 0:0\r\n netrc file : None\r\n offline mode : False\r\n```\r\n</p></details>\r\n\r\n\nconda env update fails with pip dependencies if Python version changes\n**I'm submitting a...**\r\n - [x] bug report\r\n - [ ] feature request\r\n\r\n**update:** only the root env appears to be affected\r\n\r\n## Current Behavior\r\n\r\n`conda env update -f environment.yml` fails if both of these conditions are met:\r\n\r\n1. The Python version changes\r\n2. There is at least one dependency to be installed with pip\r\n\r\nThe error looks like:\r\n\r\n```\r\nUnable to install package for pip.\r\n\r\nPlease double check and ensure you dependencies file has\r\nthe correct spelling. You might also try installing the\r\nconda-env-pip package to see if provides the required\r\ninstaller.\r\n```\r\n\r\nThis is the same error but not the same bug as #4985. pip is installed and specified as a dependency, but switching Python version in the same transaction results in conda failing to find pip.\r\n\r\n### Steps to Reproduce\r\n\r\nCreate an environment.yml that contains Python and at least one pip dependency (doesn't matter if it's already installed or not):\r\n\r\n```yaml\r\n# environment.yml\r\ndependencies:\r\n - python=3.5\r\n - pip\r\n - pip:\r\n - wheel\r\n```\r\n\r\nSetup a root env with a *different* Python:\r\n\r\n```\r\nconda install python=3.6 pip\r\n```\r\n\r\nand then try to update it with the environment file\r\n\r\n```\r\nconda env update -n root -f environment.yml\r\n```\r\n\r\n\r\n## Expected Behavior\r\n\r\n`conda env update` should succeed.\r\n<!-- What do you think should happen? -->\r\n\r\n\r\n## Environment Information\r\n\r\nRun in docker with [these files](https://gist.github.com/7d234cbcf69df2c7cd68523c9327db71).\r\n\r\n<details open><summary><code>`conda info`</code></summary><p>\r\n<!-- between the ticks below, paste the output of 'conda info' -->\r\n\r\n```\r\n active environment : None\r\n user config file : /root/.condarc\r\n populated config files :\r\n conda version : 4.4.8\r\n conda-build version : not installed\r\n python version : 3.6.2.final.0\r\n base environment : /opt/conda (writable)\r\n channel URLs : https://repo.continuum.io/pkgs/main/linux-64\r\n https://repo.continuum.io/pkgs/main/noarch\r\n https://repo.continuum.io/pkgs/free/linux-64\r\n https://repo.continuum.io/pkgs/free/noarch\r\n https://repo.continuum.io/pkgs/r/linux-64\r\n https://repo.continuum.io/pkgs/r/noarch\r\n https://repo.continuum.io/pkgs/pro/linux-64\r\n https://repo.continuum.io/pkgs/pro/noarch\r\n package cache : /opt/conda/pkgs\r\n /root/.conda/pkgs\r\n envs directories : /opt/conda/envs\r\n /root/.conda/envs\r\n platform : linux-64\r\n user-agent : conda/4.4.8 requests/2.18.4 CPython/3.6.2 Linux/4.9.75-linuxkit-aufs debian/8 glibc/2.19\r\n UID:GID : 0:0\r\n netrc file : None\r\n offline mode : False\r\n```\r\n</p></details>\r\n\r\n\n", "before_files": [{"content": "from argparse import RawDescriptionHelpFormatter\nimport os\nimport sys\nimport textwrap\n\nfrom conda._vendor.auxlib.path import expand\nfrom conda.cli import install as cli_install\nfrom conda.cli.conda_argparse import add_parser_json, add_parser_prefix\nfrom conda.misc import touch_nonadmin\nfrom .common import get_prefix\nfrom .. import exceptions, specs as install_specs\nfrom ..exceptions import CondaEnvException\nfrom ..installers.base import InvalidInstaller, get_installer\n\ndescription = \"\"\"\nUpdate the current environment based on environment file\n\"\"\"\n\nexample = \"\"\"\nexamples:\n conda env update\n conda env update -n=foo\n conda env update -f=/path/to/environment.yml\n conda env update --name=foo --file=environment.yml\n conda env update vader/deathstar\n\"\"\"\n\n\ndef configure_parser(sub_parsers):\n p = sub_parsers.add_parser(\n 'update',\n formatter_class=RawDescriptionHelpFormatter,\n description=description,\n help=description,\n epilog=example,\n )\n add_parser_prefix(p)\n p.add_argument(\n '-f', '--file',\n action='store',\n help='environment definition (default: environment.yml)',\n default='environment.yml',\n )\n p.add_argument(\n '--prune',\n action='store_true',\n default=False,\n help='remove installed packages not defined in environment.yml',\n )\n p.add_argument(\n '-q', '--quiet',\n action='store_true',\n default=False,\n )\n p.add_argument(\n 'remote_definition',\n help='remote environment definition / IPython notebook',\n action='store',\n default=None,\n nargs='?'\n )\n add_parser_json(p)\n p.set_defaults(func='.main_update.execute')\n\n\ndef execute(args, parser):\n name = args.remote_definition or args.name\n\n try:\n spec = install_specs.detect(name=name, filename=expand(args.file),\n directory=os.getcwd())\n env = spec.environment\n except exceptions.SpecNotFound:\n raise\n\n if not (args.name or args.prefix):\n if not env.name:\n # Note, this is a hack fofr get_prefix that assumes argparse results\n # TODO Refactor common.get_prefix\n name = os.environ.get('CONDA_DEFAULT_ENV', False)\n if not name:\n msg = \"Unable to determine environment\\n\\n\"\n msg += textwrap.dedent(\"\"\"\n Please re-run this command with one of the following options:\n\n * Provide an environment name via --name or -n\n * Re-run this command inside an activated conda environment.\"\"\").lstrip()\n # TODO Add json support\n raise CondaEnvException(msg)\n\n # Note: stubbing out the args object as all of the\n # conda.cli.common code thinks that name will always\n # be specified.\n args.name = env.name\n\n prefix = get_prefix(args, search=False)\n # CAN'T Check with this function since it assumes we will create prefix.\n # cli_install.check_prefix(prefix, json=args.json)\n\n # TODO, add capability\n # common.ensure_override_channels_requires_channel(args)\n # channel_urls = args.channel or ()\n\n for installer_type, specs in env.dependencies.items():\n try:\n installer = get_installer(installer_type)\n installer.install(prefix, specs, args, env)\n except InvalidInstaller:\n sys.stderr.write(textwrap.dedent(\"\"\"\n Unable to install package for {0}.\n\n Please double check and ensure you dependencies file has\n the correct spelling. You might also try installing the\n conda-env-{0} package to see if provides the required\n installer.\n \"\"\").lstrip().format(installer_type)\n )\n return -1\n\n touch_nonadmin(prefix)\n cli_install.print_activate(args.name if args.name else prefix)\n", "path": "conda_env/cli/main_update.py"}], "after_files": [{"content": "from argparse import RawDescriptionHelpFormatter\nimport os\nimport sys\nimport textwrap\n\nfrom conda._vendor.auxlib.path import expand\nfrom conda.cli import install as cli_install\nfrom conda.cli.conda_argparse import add_parser_json, add_parser_prefix\nfrom conda.misc import touch_nonadmin\nfrom .common import get_prefix\nfrom .. import exceptions, specs as install_specs\nfrom ..exceptions import CondaEnvException\nfrom ..installers.base import InvalidInstaller, get_installer\n\ndescription = \"\"\"\nUpdate the current environment based on environment file\n\"\"\"\n\nexample = \"\"\"\nexamples:\n conda env update\n conda env update -n=foo\n conda env update -f=/path/to/environment.yml\n conda env update --name=foo --file=environment.yml\n conda env update vader/deathstar\n\"\"\"\n\n\ndef configure_parser(sub_parsers):\n p = sub_parsers.add_parser(\n 'update',\n formatter_class=RawDescriptionHelpFormatter,\n description=description,\n help=description,\n epilog=example,\n )\n add_parser_prefix(p)\n p.add_argument(\n '-f', '--file',\n action='store',\n help='environment definition (default: environment.yml)',\n default='environment.yml',\n )\n p.add_argument(\n '--prune',\n action='store_true',\n default=False,\n help='remove installed packages not defined in environment.yml',\n )\n p.add_argument(\n '-q', '--quiet',\n action='store_true',\n default=False,\n )\n p.add_argument(\n 'remote_definition',\n help='remote environment definition / IPython notebook',\n action='store',\n default=None,\n nargs='?'\n )\n add_parser_json(p)\n p.set_defaults(func='.main_update.execute')\n\n\ndef execute(args, parser):\n name = args.remote_definition or args.name\n\n try:\n spec = install_specs.detect(name=name, filename=expand(args.file),\n directory=os.getcwd())\n env = spec.environment\n except exceptions.SpecNotFound:\n raise\n\n if not (args.name or args.prefix):\n if not env.name:\n # Note, this is a hack fofr get_prefix that assumes argparse results\n # TODO Refactor common.get_prefix\n name = os.environ.get('CONDA_DEFAULT_ENV', False)\n if not name:\n msg = \"Unable to determine environment\\n\\n\"\n msg += textwrap.dedent(\"\"\"\n Please re-run this command with one of the following options:\n\n * Provide an environment name via --name or -n\n * Re-run this command inside an activated conda environment.\"\"\").lstrip()\n # TODO Add json support\n raise CondaEnvException(msg)\n\n # Note: stubbing out the args object as all of the\n # conda.cli.common code thinks that name will always\n # be specified.\n args.name = env.name\n\n prefix = get_prefix(args, search=False)\n # CAN'T Check with this function since it assumes we will create prefix.\n # cli_install.check_prefix(prefix, json=args.json)\n\n # TODO, add capability\n # common.ensure_override_channels_requires_channel(args)\n # channel_urls = args.channel or ()\n\n # create installers before running any of them\n # to avoid failure to import after the file being deleted\n # e.g. due to conda_env being upgraded or Python version switched.\n installers = {}\n\n for installer_type in env.dependencies:\n try:\n installers[installer_type] = get_installer(installer_type)\n except InvalidInstaller as e:\n sys.stderr.write(textwrap.dedent(\"\"\"\n Unable to install package for {0}.\n\n Please double check and ensure you dependencies file has\n the correct spelling. You might also try installing the\n conda-env-{0} package to see if provides the required\n installer.\n \"\"\").lstrip().format(installer_type)\n )\n return -1\n\n for installer_type, specs in env.dependencies.items():\n installer = installers[installer_type]\n installer.install(prefix, specs, args, env)\n\n touch_nonadmin(prefix)\n cli_install.print_activate(args.name if args.name else prefix)\n", "path": "conda_env/cli/main_update.py"}]} | 2,814 | 299 |
gh_patches_debug_12820 | rasdani/github-patches | git_diff | ESMCI__cime-2508 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
settings from env_test are not being reapplied to tests
It seems that some recent change causes the settings from env_test.xml not to be applied when a test is resubmitted. This is supposed to be tested in scripts_regression_tests so a second question is - why is that test not failing?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scripts/lib/CIME/case/case_submit.py`
Content:
```
1 #!/usr/bin/env python
2
3 """
4 case.submit - Submit a cesm workflow to the queueing system or run it
5 if there is no queueing system. A cesm workflow may include multiple
6 jobs.
7 submit, check_case and check_da_settings are members of class Case in file case.py
8 """
9 import socket
10 from CIME.XML.standard_module_setup import *
11 from CIME.utils import expect, run_and_log_case_status, verbatim_success_msg
12 from CIME.locked_files import unlock_file, lock_file
13 from CIME.test_status import *
14
15 logger = logging.getLogger(__name__)
16
17 def _submit(case, job=None, no_batch=False, prereq=None, resubmit=False,
18 skip_pnl=False, mail_user=None, mail_type=None, batch_args=None):
19 if job is None:
20 job = case.get_primary_job()
21
22 rundir = case.get_value("RUNDIR")
23 continue_run = case.get_value("CONTINUE_RUN")
24 expect(os.path.isdir(rundir) or not continue_run,
25 " CONTINUE_RUN is true but RUNDIR {} does not exist".format(rundir))
26
27 # if case.submit is called with the no_batch flag then we assume that this
28 # flag will stay in effect for the duration of the RESUBMITs
29 env_batch = case.get_env("batch")
30 if resubmit:
31 if env_batch.get_batch_system_type() == "none":
32 no_batch = True
33
34 # This is a resubmission, do not reinitialize test values
35 if job == "case.test":
36 case.set_value("IS_FIRST_RUN", False)
37
38 resub = case.get_value("RESUBMIT")
39 logger.info("Submitting job '{}', resubmit={:d}".format(job, resub))
40 case.set_value("RESUBMIT", resub-1)
41 if case.get_value("RESUBMIT_SETS_CONTINUE_RUN"):
42 case.set_value("CONTINUE_RUN", True)
43
44 else:
45 if job == "case.test":
46 case.set_value("IS_FIRST_RUN", True)
47
48 if no_batch:
49 batch_system = "none"
50 else:
51 batch_system = env_batch.get_batch_system_type()
52
53 case.set_value("BATCH_SYSTEM", batch_system)
54
55 env_batch_has_changed = False
56 try:
57 case.check_lockedfile(os.path.basename(env_batch.filename))
58 except SystemExit:
59 env_batch_has_changed = True
60
61 if env_batch.get_batch_system_type() != "none" and env_batch_has_changed:
62 # May need to regen batch files if user made batch setting changes (e.g. walltime, queue, etc)
63 logger.warning(\
64 """
65 env_batch.xml appears to have changed, regenerating batch scripts
66 manual edits to these file will be lost!
67 """)
68 env_batch.make_all_batch_files(case)
69
70 unlock_file(os.path.basename(env_batch.filename))
71 lock_file(os.path.basename(env_batch.filename))
72
73 if job == case.get_primary_job():
74 case.check_case()
75 case.check_DA_settings()
76 if case.get_value("MACH") == "mira":
77 with open(".original_host", "w") as fd:
78 fd.write( socket.gethostname())
79
80 #Load Modules
81 case.load_env()
82
83 case.flush()
84
85 logger.warning("submit_jobs {}".format(job))
86 job_ids = case.submit_jobs(no_batch=no_batch, job=job, skip_pnl=skip_pnl,
87 prereq=prereq, mail_user=mail_user,
88 mail_type=mail_type, batch_args=batch_args)
89
90 xml_jobids = []
91 for jobname, jobid in job_ids.items():
92 logger.info("Submitted job {} with id {}".format(jobname, jobid))
93 if jobid:
94 xml_jobids.append("{}:{}".format(jobname, jobid))
95
96 xml_jobid_text = ", ".join(xml_jobids)
97 if xml_jobid_text:
98 case.set_value("JOB_IDS", xml_jobid_text)
99
100 return xml_jobid_text
101
102 def submit(self, job=None, no_batch=False, prereq=None, resubmit=False,
103 skip_pnl=False, mail_user=None, mail_type=None, batch_args=None):
104 if self.get_value("TEST"):
105 caseroot = self.get_value("CASEROOT")
106 casebaseid = self.get_value("CASEBASEID")
107 # This should take care of the race condition where the submitted job
108 # begins immediately and tries to set RUN phase. We proactively assume
109 # a passed SUBMIT phase. If this state is already PASS, don't set it again
110 # because then we'll lose RUN phase info if it's there. This info is important
111 # for system_tests_common to know if it needs to reinitialize the test or not.
112 with TestStatus(test_dir=caseroot, test_name=casebaseid) as ts:
113 phase_status = ts.get_status(SUBMIT_PHASE)
114 if phase_status != TEST_PASS_STATUS:
115 ts.set_status(SUBMIT_PHASE, TEST_PASS_STATUS)
116
117 try:
118 functor = lambda: _submit(self, job=job, no_batch=no_batch, prereq=prereq,
119 resubmit=resubmit, skip_pnl=skip_pnl,
120 mail_user=mail_user, mail_type=mail_type,
121 batch_args=batch_args)
122 run_and_log_case_status(functor, "case.submit", caseroot=self.get_value("CASEROOT"),
123 custom_success_msg_functor=verbatim_success_msg)
124 except:
125 # If something failed in the batch system, make sure to mark
126 # the test as failed if we are running a test.
127 if self.get_value("TEST"):
128 with TestStatus(test_dir=caseroot, test_name=casebaseid) as ts:
129 ts.set_status(SUBMIT_PHASE, TEST_FAIL_STATUS)
130
131 raise
132
133 def check_case(self):
134 self.check_lockedfiles()
135 self.create_namelists() # Must be called before check_all_input_data
136 logger.info("Checking that inputdata is available as part of case submission")
137 self.check_all_input_data()
138
139 expect(self.get_value("BUILD_COMPLETE"), "Build complete is "
140 "not True please rebuild the model by calling case.build")
141 logger.info("Check case OK")
142
143 def check_DA_settings(self):
144 script = self.get_value("DATA_ASSIMILATION_SCRIPT")
145 cycles = self.get_value("DATA_ASSIMILATION_CYCLES")
146 if len(script) > 0 and os.path.isfile(script) and cycles > 0:
147 logger.info("Data Assimilation enabled using script {} with {:d} cycles".format(script,cycles))
148
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/scripts/lib/CIME/case/case_submit.py b/scripts/lib/CIME/case/case_submit.py
--- a/scripts/lib/CIME/case/case_submit.py
+++ b/scripts/lib/CIME/case/case_submit.py
@@ -20,9 +20,10 @@
job = case.get_primary_job()
rundir = case.get_value("RUNDIR")
- continue_run = case.get_value("CONTINUE_RUN")
- expect(os.path.isdir(rundir) or not continue_run,
- " CONTINUE_RUN is true but RUNDIR {} does not exist".format(rundir))
+ if job != "case.test":
+ continue_run = case.get_value("CONTINUE_RUN")
+ expect(os.path.isdir(rundir) or not continue_run,
+ " CONTINUE_RUN is true but RUNDIR {} does not exist".format(rundir))
# if case.submit is called with the no_batch flag then we assume that this
# flag will stay in effect for the duration of the RESUBMITs
| {"golden_diff": "diff --git a/scripts/lib/CIME/case/case_submit.py b/scripts/lib/CIME/case/case_submit.py\n--- a/scripts/lib/CIME/case/case_submit.py\n+++ b/scripts/lib/CIME/case/case_submit.py\n@@ -20,9 +20,10 @@\n job = case.get_primary_job()\n \n rundir = case.get_value(\"RUNDIR\")\n- continue_run = case.get_value(\"CONTINUE_RUN\")\n- expect(os.path.isdir(rundir) or not continue_run,\n- \" CONTINUE_RUN is true but RUNDIR {} does not exist\".format(rundir))\n+ if job != \"case.test\":\n+ continue_run = case.get_value(\"CONTINUE_RUN\")\n+ expect(os.path.isdir(rundir) or not continue_run,\n+ \" CONTINUE_RUN is true but RUNDIR {} does not exist\".format(rundir))\n \n # if case.submit is called with the no_batch flag then we assume that this\n # flag will stay in effect for the duration of the RESUBMITs\n", "issue": "settings from env_test are not being reapplied to tests\nIt seems that some recent change causes the settings from env_test.xml not to be applied when a test is resubmitted. This is supposed to be tested in scripts_regression_tests so a second question is - why is that test not failing?\n", "before_files": [{"content": "#!/usr/bin/env python\n\n\"\"\"\ncase.submit - Submit a cesm workflow to the queueing system or run it\nif there is no queueing system. A cesm workflow may include multiple\njobs.\nsubmit, check_case and check_da_settings are members of class Case in file case.py\n\"\"\"\nimport socket\nfrom CIME.XML.standard_module_setup import *\nfrom CIME.utils import expect, run_and_log_case_status, verbatim_success_msg\nfrom CIME.locked_files import unlock_file, lock_file\nfrom CIME.test_status import *\n\nlogger = logging.getLogger(__name__)\n\ndef _submit(case, job=None, no_batch=False, prereq=None, resubmit=False,\n skip_pnl=False, mail_user=None, mail_type=None, batch_args=None):\n if job is None:\n job = case.get_primary_job()\n\n rundir = case.get_value(\"RUNDIR\")\n continue_run = case.get_value(\"CONTINUE_RUN\")\n expect(os.path.isdir(rundir) or not continue_run,\n \" CONTINUE_RUN is true but RUNDIR {} does not exist\".format(rundir))\n\n # if case.submit is called with the no_batch flag then we assume that this\n # flag will stay in effect for the duration of the RESUBMITs\n env_batch = case.get_env(\"batch\")\n if resubmit:\n if env_batch.get_batch_system_type() == \"none\":\n no_batch = True\n\n # This is a resubmission, do not reinitialize test values\n if job == \"case.test\":\n case.set_value(\"IS_FIRST_RUN\", False)\n\n resub = case.get_value(\"RESUBMIT\")\n logger.info(\"Submitting job '{}', resubmit={:d}\".format(job, resub))\n case.set_value(\"RESUBMIT\", resub-1)\n if case.get_value(\"RESUBMIT_SETS_CONTINUE_RUN\"):\n case.set_value(\"CONTINUE_RUN\", True)\n\n else:\n if job == \"case.test\":\n case.set_value(\"IS_FIRST_RUN\", True)\n\n if no_batch:\n batch_system = \"none\"\n else:\n batch_system = env_batch.get_batch_system_type()\n\n case.set_value(\"BATCH_SYSTEM\", batch_system)\n\n env_batch_has_changed = False\n try:\n case.check_lockedfile(os.path.basename(env_batch.filename))\n except SystemExit:\n env_batch_has_changed = True\n\n if env_batch.get_batch_system_type() != \"none\" and env_batch_has_changed:\n # May need to regen batch files if user made batch setting changes (e.g. walltime, queue, etc)\n logger.warning(\\\n\"\"\"\nenv_batch.xml appears to have changed, regenerating batch scripts\nmanual edits to these file will be lost!\n\"\"\")\n env_batch.make_all_batch_files(case)\n\n unlock_file(os.path.basename(env_batch.filename))\n lock_file(os.path.basename(env_batch.filename))\n\n if job == case.get_primary_job():\n case.check_case()\n case.check_DA_settings()\n if case.get_value(\"MACH\") == \"mira\":\n with open(\".original_host\", \"w\") as fd:\n fd.write( socket.gethostname())\n\n #Load Modules\n case.load_env()\n\n case.flush()\n\n logger.warning(\"submit_jobs {}\".format(job))\n job_ids = case.submit_jobs(no_batch=no_batch, job=job, skip_pnl=skip_pnl,\n prereq=prereq, mail_user=mail_user,\n mail_type=mail_type, batch_args=batch_args)\n\n xml_jobids = []\n for jobname, jobid in job_ids.items():\n logger.info(\"Submitted job {} with id {}\".format(jobname, jobid))\n if jobid:\n xml_jobids.append(\"{}:{}\".format(jobname, jobid))\n\n xml_jobid_text = \", \".join(xml_jobids)\n if xml_jobid_text:\n case.set_value(\"JOB_IDS\", xml_jobid_text)\n\n return xml_jobid_text\n\ndef submit(self, job=None, no_batch=False, prereq=None, resubmit=False,\n skip_pnl=False, mail_user=None, mail_type=None, batch_args=None):\n if self.get_value(\"TEST\"):\n caseroot = self.get_value(\"CASEROOT\")\n casebaseid = self.get_value(\"CASEBASEID\")\n # This should take care of the race condition where the submitted job\n # begins immediately and tries to set RUN phase. We proactively assume\n # a passed SUBMIT phase. If this state is already PASS, don't set it again\n # because then we'll lose RUN phase info if it's there. This info is important\n # for system_tests_common to know if it needs to reinitialize the test or not.\n with TestStatus(test_dir=caseroot, test_name=casebaseid) as ts:\n phase_status = ts.get_status(SUBMIT_PHASE)\n if phase_status != TEST_PASS_STATUS:\n ts.set_status(SUBMIT_PHASE, TEST_PASS_STATUS)\n\n try:\n functor = lambda: _submit(self, job=job, no_batch=no_batch, prereq=prereq,\n resubmit=resubmit, skip_pnl=skip_pnl,\n mail_user=mail_user, mail_type=mail_type,\n batch_args=batch_args)\n run_and_log_case_status(functor, \"case.submit\", caseroot=self.get_value(\"CASEROOT\"),\n custom_success_msg_functor=verbatim_success_msg)\n except:\n # If something failed in the batch system, make sure to mark\n # the test as failed if we are running a test.\n if self.get_value(\"TEST\"):\n with TestStatus(test_dir=caseroot, test_name=casebaseid) as ts:\n ts.set_status(SUBMIT_PHASE, TEST_FAIL_STATUS)\n\n raise\n\ndef check_case(self):\n self.check_lockedfiles()\n self.create_namelists() # Must be called before check_all_input_data\n logger.info(\"Checking that inputdata is available as part of case submission\")\n self.check_all_input_data()\n\n expect(self.get_value(\"BUILD_COMPLETE\"), \"Build complete is \"\n \"not True please rebuild the model by calling case.build\")\n logger.info(\"Check case OK\")\n\ndef check_DA_settings(self):\n script = self.get_value(\"DATA_ASSIMILATION_SCRIPT\")\n cycles = self.get_value(\"DATA_ASSIMILATION_CYCLES\")\n if len(script) > 0 and os.path.isfile(script) and cycles > 0:\n logger.info(\"Data Assimilation enabled using script {} with {:d} cycles\".format(script,cycles))\n", "path": "scripts/lib/CIME/case/case_submit.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\n\"\"\"\ncase.submit - Submit a cesm workflow to the queueing system or run it\nif there is no queueing system. A cesm workflow may include multiple\njobs.\nsubmit, check_case and check_da_settings are members of class Case in file case.py\n\"\"\"\nimport socket\nfrom CIME.XML.standard_module_setup import *\nfrom CIME.utils import expect, run_and_log_case_status, verbatim_success_msg\nfrom CIME.locked_files import unlock_file, lock_file\nfrom CIME.test_status import *\n\nlogger = logging.getLogger(__name__)\n\ndef _submit(case, job=None, no_batch=False, prereq=None, resubmit=False,\n skip_pnl=False, mail_user=None, mail_type=None, batch_args=None):\n if job is None:\n job = case.get_primary_job()\n\n rundir = case.get_value(\"RUNDIR\")\n if job != \"case.test\":\n continue_run = case.get_value(\"CONTINUE_RUN\")\n expect(os.path.isdir(rundir) or not continue_run,\n \" CONTINUE_RUN is true but RUNDIR {} does not exist\".format(rundir))\n\n # if case.submit is called with the no_batch flag then we assume that this\n # flag will stay in effect for the duration of the RESUBMITs\n env_batch = case.get_env(\"batch\")\n if resubmit:\n if env_batch.get_batch_system_type() == \"none\":\n no_batch = True\n\n # This is a resubmission, do not reinitialize test values\n if job == \"case.test\":\n case.set_value(\"IS_FIRST_RUN\", False)\n\n resub = case.get_value(\"RESUBMIT\")\n logger.info(\"Submitting job '{}', resubmit={:d}\".format(job, resub))\n case.set_value(\"RESUBMIT\", resub-1)\n if case.get_value(\"RESUBMIT_SETS_CONTINUE_RUN\"):\n case.set_value(\"CONTINUE_RUN\", True)\n\n else:\n if job == \"case.test\":\n case.set_value(\"IS_FIRST_RUN\", True)\n\n if no_batch:\n batch_system = \"none\"\n else:\n batch_system = env_batch.get_batch_system_type()\n\n case.set_value(\"BATCH_SYSTEM\", batch_system)\n\n env_batch_has_changed = False\n try:\n case.check_lockedfile(os.path.basename(env_batch.filename))\n except SystemExit:\n env_batch_has_changed = True\n\n if env_batch.get_batch_system_type() != \"none\" and env_batch_has_changed:\n # May need to regen batch files if user made batch setting changes (e.g. walltime, queue, etc)\n logger.warning(\\\n\"\"\"\nenv_batch.xml appears to have changed, regenerating batch scripts\nmanual edits to these file will be lost!\n\"\"\")\n env_batch.make_all_batch_files(case)\n\n unlock_file(os.path.basename(env_batch.filename))\n lock_file(os.path.basename(env_batch.filename))\n\n if job == case.get_primary_job():\n case.check_case()\n case.check_DA_settings()\n if case.get_value(\"MACH\") == \"mira\":\n with open(\".original_host\", \"w\") as fd:\n fd.write( socket.gethostname())\n\n #Load Modules\n case.load_env()\n\n case.flush()\n\n logger.warning(\"submit_jobs {}\".format(job))\n job_ids = case.submit_jobs(no_batch=no_batch, job=job, skip_pnl=skip_pnl,\n prereq=prereq, mail_user=mail_user,\n mail_type=mail_type, batch_args=batch_args)\n\n xml_jobids = []\n for jobname, jobid in job_ids.items():\n logger.info(\"Submitted job {} with id {}\".format(jobname, jobid))\n if jobid:\n xml_jobids.append(\"{}:{}\".format(jobname, jobid))\n\n xml_jobid_text = \", \".join(xml_jobids)\n if xml_jobid_text:\n case.set_value(\"JOB_IDS\", xml_jobid_text)\n\n return xml_jobid_text\n\ndef submit(self, job=None, no_batch=False, prereq=None, resubmit=False,\n skip_pnl=False, mail_user=None, mail_type=None, batch_args=None):\n if self.get_value(\"TEST\"):\n caseroot = self.get_value(\"CASEROOT\")\n casebaseid = self.get_value(\"CASEBASEID\")\n # This should take care of the race condition where the submitted job\n # begins immediately and tries to set RUN phase. We proactively assume\n # a passed SUBMIT phase. If this state is already PASS, don't set it again\n # because then we'll lose RUN phase info if it's there. This info is important\n # for system_tests_common to know if it needs to reinitialize the test or not.\n with TestStatus(test_dir=caseroot, test_name=casebaseid) as ts:\n phase_status = ts.get_status(SUBMIT_PHASE)\n if phase_status != TEST_PASS_STATUS:\n ts.set_status(SUBMIT_PHASE, TEST_PASS_STATUS)\n\n try:\n functor = lambda: _submit(self, job=job, no_batch=no_batch, prereq=prereq,\n resubmit=resubmit, skip_pnl=skip_pnl,\n mail_user=mail_user, mail_type=mail_type,\n batch_args=batch_args)\n run_and_log_case_status(functor, \"case.submit\", caseroot=self.get_value(\"CASEROOT\"),\n custom_success_msg_functor=verbatim_success_msg)\n except:\n # If something failed in the batch system, make sure to mark\n # the test as failed if we are running a test.\n if self.get_value(\"TEST\"):\n with TestStatus(test_dir=caseroot, test_name=casebaseid) as ts:\n ts.set_status(SUBMIT_PHASE, TEST_FAIL_STATUS)\n\n raise\n\ndef check_case(self):\n self.check_lockedfiles()\n self.create_namelists() # Must be called before check_all_input_data\n logger.info(\"Checking that inputdata is available as part of case submission\")\n self.check_all_input_data()\n\n expect(self.get_value(\"BUILD_COMPLETE\"), \"Build complete is \"\n \"not True please rebuild the model by calling case.build\")\n logger.info(\"Check case OK\")\n\ndef check_DA_settings(self):\n script = self.get_value(\"DATA_ASSIMILATION_SCRIPT\")\n cycles = self.get_value(\"DATA_ASSIMILATION_CYCLES\")\n if len(script) > 0 and os.path.isfile(script) and cycles > 0:\n logger.info(\"Data Assimilation enabled using script {} with {:d} cycles\".format(script,cycles))\n", "path": "scripts/lib/CIME/case/case_submit.py"}]} | 2,076 | 228 |
gh_patches_debug_10696 | rasdani/github-patches | git_diff | Kinto__kinto-1138 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Enforce the permission endpoint when the admin plugin is included.
Enforce the permission endpoint when the admin plugin is included.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kinto/__init__.py`
Content:
```
1 import pkg_resources
2 import logging
3
4 import kinto.core
5 from pyramid.config import Configurator
6 from pyramid.settings import asbool
7 from pyramid.security import Authenticated, Everyone
8
9 from kinto.authorization import RouteFactory
10
11
12 # Module version, as defined in PEP-0396.
13 __version__ = pkg_resources.get_distribution(__package__).version
14
15 # Implemented HTTP API Version
16 HTTP_API_VERSION = '1.16'
17
18 # Main kinto logger
19 logger = logging.getLogger(__name__)
20
21
22 DEFAULT_SETTINGS = {
23 'flush_endpoint_enabled': False,
24 'retry_after_seconds': 3,
25 'cache_backend': 'kinto.core.cache.memory',
26 'permission_backend': 'kinto.core.permission.memory',
27 'storage_backend': 'kinto.core.storage.memory',
28 'project_docs': 'https://kinto.readthedocs.io/',
29 'bucket_create_principals': Authenticated,
30 'permissions_read_principals': Everyone,
31 'multiauth.authorization_policy': (
32 'kinto.authorization.AuthorizationPolicy'),
33 'experimental_collection_schema_validation': False,
34 'experimental_permissions_endpoint': False,
35 'http_api_version': HTTP_API_VERSION,
36 'bucket_id_generator': 'kinto.views.NameGenerator',
37 'collection_id_generator': 'kinto.views.NameGenerator',
38 'group_id_generator': 'kinto.views.NameGenerator',
39 'record_id_generator': 'kinto.views.RelaxedUUID'
40 }
41
42
43 def main(global_config, config=None, **settings):
44 if not config:
45 config = Configurator(settings=settings, root_factory=RouteFactory)
46
47 # Force project name, since it determines settings prefix.
48 config.add_settings({'kinto.project_name': 'kinto'})
49
50 kinto.core.initialize(config,
51 version=__version__,
52 default_settings=DEFAULT_SETTINGS)
53
54 settings = config.get_settings()
55
56 # Expose capability
57 schema_enabled = asbool(
58 settings['experimental_collection_schema_validation']
59 )
60 if schema_enabled:
61 config.add_api_capability(
62 "schema",
63 description="Validates collection records with JSON schemas.",
64 url="https://kinto.readthedocs.io/en/latest/api/1.x/"
65 "collections.html#collection-json-schema")
66
67 # Scan Kinto views.
68 kwargs = {}
69
70 flush_enabled = asbool(settings['flush_endpoint_enabled'])
71 if flush_enabled:
72 config.add_api_capability(
73 "flush_endpoint",
74 description="The __flush__ endpoint can be used to remove all "
75 "data from all backends.",
76 url="https://kinto.readthedocs.io/en/latest/configuration/"
77 "settings.html#activating-the-flush-endpoint")
78 else:
79 kwargs['ignore'] = ['kinto.views.flush']
80
81 # Permissions endpoint enabled if permission backend is setup.
82 permissions_endpoint_enabled = (
83 asbool(settings['experimental_permissions_endpoint']) and
84 hasattr(config.registry, 'permission'))
85 if permissions_endpoint_enabled:
86 config.add_api_capability(
87 "permissions_endpoint",
88 description="The permissions endpoint can be used to list all "
89 "user objects permissions.",
90 url="https://kinto.readthedocs.io/en/latest/configuration/"
91 "settings.html#activating-the-permissions-endpoint")
92 else:
93 kwargs.setdefault('ignore', []).append('kinto.views.permissions')
94
95 config.scan("kinto.views", **kwargs)
96
97 app = config.make_wsgi_app()
98
99 # Install middleware (no-op if disabled)
100 return kinto.core.install_middlewares(app, settings)
101
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/kinto/__init__.py b/kinto/__init__.py
--- a/kinto/__init__.py
+++ b/kinto/__init__.py
@@ -79,8 +79,9 @@
kwargs['ignore'] = ['kinto.views.flush']
# Permissions endpoint enabled if permission backend is setup.
+ is_admin_enabled = 'kinto.plugins.admin' in settings['includes']
permissions_endpoint_enabled = (
- asbool(settings['experimental_permissions_endpoint']) and
+ (is_admin_enabled or asbool(settings['experimental_permissions_endpoint'])) and
hasattr(config.registry, 'permission'))
if permissions_endpoint_enabled:
config.add_api_capability(
| {"golden_diff": "diff --git a/kinto/__init__.py b/kinto/__init__.py\n--- a/kinto/__init__.py\n+++ b/kinto/__init__.py\n@@ -79,8 +79,9 @@\n kwargs['ignore'] = ['kinto.views.flush']\n \n # Permissions endpoint enabled if permission backend is setup.\n+ is_admin_enabled = 'kinto.plugins.admin' in settings['includes']\n permissions_endpoint_enabled = (\n- asbool(settings['experimental_permissions_endpoint']) and\n+ (is_admin_enabled or asbool(settings['experimental_permissions_endpoint'])) and\n hasattr(config.registry, 'permission'))\n if permissions_endpoint_enabled:\n config.add_api_capability(\n", "issue": "Enforce the permission endpoint when the admin plugin is included.\n\nEnforce the permission endpoint when the admin plugin is included.\n\n", "before_files": [{"content": "import pkg_resources\nimport logging\n\nimport kinto.core\nfrom pyramid.config import Configurator\nfrom pyramid.settings import asbool\nfrom pyramid.security import Authenticated, Everyone\n\nfrom kinto.authorization import RouteFactory\n\n\n# Module version, as defined in PEP-0396.\n__version__ = pkg_resources.get_distribution(__package__).version\n\n# Implemented HTTP API Version\nHTTP_API_VERSION = '1.16'\n\n# Main kinto logger\nlogger = logging.getLogger(__name__)\n\n\nDEFAULT_SETTINGS = {\n 'flush_endpoint_enabled': False,\n 'retry_after_seconds': 3,\n 'cache_backend': 'kinto.core.cache.memory',\n 'permission_backend': 'kinto.core.permission.memory',\n 'storage_backend': 'kinto.core.storage.memory',\n 'project_docs': 'https://kinto.readthedocs.io/',\n 'bucket_create_principals': Authenticated,\n 'permissions_read_principals': Everyone,\n 'multiauth.authorization_policy': (\n 'kinto.authorization.AuthorizationPolicy'),\n 'experimental_collection_schema_validation': False,\n 'experimental_permissions_endpoint': False,\n 'http_api_version': HTTP_API_VERSION,\n 'bucket_id_generator': 'kinto.views.NameGenerator',\n 'collection_id_generator': 'kinto.views.NameGenerator',\n 'group_id_generator': 'kinto.views.NameGenerator',\n 'record_id_generator': 'kinto.views.RelaxedUUID'\n}\n\n\ndef main(global_config, config=None, **settings):\n if not config:\n config = Configurator(settings=settings, root_factory=RouteFactory)\n\n # Force project name, since it determines settings prefix.\n config.add_settings({'kinto.project_name': 'kinto'})\n\n kinto.core.initialize(config,\n version=__version__,\n default_settings=DEFAULT_SETTINGS)\n\n settings = config.get_settings()\n\n # Expose capability\n schema_enabled = asbool(\n settings['experimental_collection_schema_validation']\n )\n if schema_enabled:\n config.add_api_capability(\n \"schema\",\n description=\"Validates collection records with JSON schemas.\",\n url=\"https://kinto.readthedocs.io/en/latest/api/1.x/\"\n \"collections.html#collection-json-schema\")\n\n # Scan Kinto views.\n kwargs = {}\n\n flush_enabled = asbool(settings['flush_endpoint_enabled'])\n if flush_enabled:\n config.add_api_capability(\n \"flush_endpoint\",\n description=\"The __flush__ endpoint can be used to remove all \"\n \"data from all backends.\",\n url=\"https://kinto.readthedocs.io/en/latest/configuration/\"\n \"settings.html#activating-the-flush-endpoint\")\n else:\n kwargs['ignore'] = ['kinto.views.flush']\n\n # Permissions endpoint enabled if permission backend is setup.\n permissions_endpoint_enabled = (\n asbool(settings['experimental_permissions_endpoint']) and\n hasattr(config.registry, 'permission'))\n if permissions_endpoint_enabled:\n config.add_api_capability(\n \"permissions_endpoint\",\n description=\"The permissions endpoint can be used to list all \"\n \"user objects permissions.\",\n url=\"https://kinto.readthedocs.io/en/latest/configuration/\"\n \"settings.html#activating-the-permissions-endpoint\")\n else:\n kwargs.setdefault('ignore', []).append('kinto.views.permissions')\n\n config.scan(\"kinto.views\", **kwargs)\n\n app = config.make_wsgi_app()\n\n # Install middleware (no-op if disabled)\n return kinto.core.install_middlewares(app, settings)\n", "path": "kinto/__init__.py"}], "after_files": [{"content": "import pkg_resources\nimport logging\n\nimport kinto.core\nfrom pyramid.config import Configurator\nfrom pyramid.settings import asbool\nfrom pyramid.security import Authenticated, Everyone\n\nfrom kinto.authorization import RouteFactory\n\n\n# Module version, as defined in PEP-0396.\n__version__ = pkg_resources.get_distribution(__package__).version\n\n# Implemented HTTP API Version\nHTTP_API_VERSION = '1.16'\n\n# Main kinto logger\nlogger = logging.getLogger(__name__)\n\n\nDEFAULT_SETTINGS = {\n 'flush_endpoint_enabled': False,\n 'retry_after_seconds': 3,\n 'cache_backend': 'kinto.core.cache.memory',\n 'permission_backend': 'kinto.core.permission.memory',\n 'storage_backend': 'kinto.core.storage.memory',\n 'project_docs': 'https://kinto.readthedocs.io/',\n 'bucket_create_principals': Authenticated,\n 'permissions_read_principals': Everyone,\n 'multiauth.authorization_policy': (\n 'kinto.authorization.AuthorizationPolicy'),\n 'experimental_collection_schema_validation': False,\n 'experimental_permissions_endpoint': False,\n 'http_api_version': HTTP_API_VERSION,\n 'bucket_id_generator': 'kinto.views.NameGenerator',\n 'collection_id_generator': 'kinto.views.NameGenerator',\n 'group_id_generator': 'kinto.views.NameGenerator',\n 'record_id_generator': 'kinto.views.RelaxedUUID'\n}\n\n\ndef main(global_config, config=None, **settings):\n if not config:\n config = Configurator(settings=settings, root_factory=RouteFactory)\n\n # Force project name, since it determines settings prefix.\n config.add_settings({'kinto.project_name': 'kinto'})\n\n kinto.core.initialize(config,\n version=__version__,\n default_settings=DEFAULT_SETTINGS)\n\n settings = config.get_settings()\n\n # Expose capability\n schema_enabled = asbool(\n settings['experimental_collection_schema_validation']\n )\n if schema_enabled:\n config.add_api_capability(\n \"schema\",\n description=\"Validates collection records with JSON schemas.\",\n url=\"https://kinto.readthedocs.io/en/latest/api/1.x/\"\n \"collections.html#collection-json-schema\")\n\n # Scan Kinto views.\n kwargs = {}\n\n flush_enabled = asbool(settings['flush_endpoint_enabled'])\n if flush_enabled:\n config.add_api_capability(\n \"flush_endpoint\",\n description=\"The __flush__ endpoint can be used to remove all \"\n \"data from all backends.\",\n url=\"https://kinto.readthedocs.io/en/latest/configuration/\"\n \"settings.html#activating-the-flush-endpoint\")\n else:\n kwargs['ignore'] = ['kinto.views.flush']\n\n # Permissions endpoint enabled if permission backend is setup.\n is_admin_enabled = 'kinto.plugins.admin' in settings['includes']\n permissions_endpoint_enabled = (\n (is_admin_enabled or asbool(settings['experimental_permissions_endpoint'])) and\n hasattr(config.registry, 'permission'))\n if permissions_endpoint_enabled:\n config.add_api_capability(\n \"permissions_endpoint\",\n description=\"The permissions endpoint can be used to list all \"\n \"user objects permissions.\",\n url=\"https://kinto.readthedocs.io/en/latest/configuration/\"\n \"settings.html#activating-the-permissions-endpoint\")\n else:\n kwargs.setdefault('ignore', []).append('kinto.views.permissions')\n\n config.scan(\"kinto.views\", **kwargs)\n\n app = config.make_wsgi_app()\n\n # Install middleware (no-op if disabled)\n return kinto.core.install_middlewares(app, settings)\n", "path": "kinto/__init__.py"}]} | 1,212 | 144 |
gh_patches_debug_6238 | rasdani/github-patches | git_diff | piskvorky__gensim-2106 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Turn off support of Google Style docstrings
**All docstrings should be refactored first**
To prevent contributors from using Google Style docstrings, we need to set
`napoleon_google_docstring = False`,
[like explained here](https://samnicholls.net/2016/06/15/how-to-sphinx-readthedocs/).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `docs/src/conf.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 #
3 # gensim documentation build configuration file, created by
4 # sphinx-quickstart on Wed Mar 17 13:42:21 2010.
5 #
6 # This file is execfile()d with the current directory set to its containing dir.
7 #
8 # Note that not all possible configuration values are present in this
9 # autogenerated file.
10 #
11 # All configuration values have a default; values that are commented out
12 # serve to show the default.
13
14 import os
15 import sys
16
17 # If extensions (or modules to document with autodoc) are in another directory,
18 # add these directories to sys.path here. If the directory is relative to the
19 # documentation root, use os.path.abspath to make it absolute, like shown here.
20 sys.path.append(os.path.abspath('.'))
21
22 # -- General configuration -----------------------------------------------------
23
24 html_theme = 'gensim_theme'
25
26 # Add any Sphinx extension module names here, as strings. They can be extensions
27 # coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
28 extensions = ['sphinx.ext.autodoc', 'sphinxcontrib.napoleon', 'sphinx.ext.imgmath', 'sphinxcontrib.programoutput']
29 autoclass_content = "both"
30
31 # Add any paths that contain templates here, relative to this directory.
32 templates_path = ['_templates']
33
34 # The suffix of source filenames.
35 source_suffix = '.rst'
36
37 # The encoding of source files.
38 # source_encoding = 'utf-8'
39
40 # The master toctree document.
41 master_doc = 'indextoc'
42
43 # Additional templates that should be rendered to pages, maps page names to
44 # template names.
45 html_additional_pages = {'index': './_templates/indexcontent.html'}
46
47 # General information about the project.
48 project = u'gensim'
49 copyright = u'2009-now, Radim Řehůřek <me(at)radimrehurek.com>'
50
51 # The version info for the project you're documenting, acts as replacement for
52 # |version| and |release|, also used in various other places throughout the
53 # built documents.
54 #
55 # The short X.Y version.
56 version = '3.4'
57 # The full version, including alpha/beta/rc tags.
58 release = '3.4.0'
59
60 # The language for content autogenerated by Sphinx. Refer to documentation
61 # for a list of supported languages.
62 # language = None
63
64 # There are two options for replacing |today|: either, you set today to some
65 # non-false value, then it is used:
66 # today = ''
67 # Else, today_fmt is used as the format for a strftime call.
68 # today_fmt = '%B %d, %Y'
69
70 # List of documents that shouldn't be included in the build.
71 # unused_docs = []
72
73 # List of directories, relative to source directory, that shouldn't be searched
74 # for source files.
75 exclude_trees = ['_build']
76
77 # The reST default role (used for this markup: `text`) to use for all documents.
78 # default_role = None
79
80 # If true, '()' will be appended to :func: etc. cross-reference text.
81 # add_function_parentheses = True
82
83 # If true, the current module name will be prepended to all description
84 # unit titles (such as .. function::).
85 # add_module_names = True
86
87 # If true, sectionauthor and moduleauthor directives will be shown in the
88 # output. They are ignored by default.
89 # show_authors = False
90
91 # The name of the Pygments (syntax highlighting) style to use.
92 pygments_style = 'sphinx'
93
94 # A list of ignored prefixes for module index sorting.
95 # modindex_common_prefix = []
96
97
98 # -- Options for HTML output ---------------------------------------------------
99
100 # The theme to use for HTML and HTML Help pages. Major themes that come with
101 # Sphinx are currently 'default' and 'sphinxdoc'.
102 # html_theme = 'default'
103
104 # Theme options are theme-specific and customize the look and feel of a theme
105 # further. For a list of options available for each theme, see the
106 # documentation.
107 # main_colour = "#ffbbbb"
108
109 html_theme_options = {
110 # "rightsidebar": "false",
111 # "stickysidebar": "true",
112 # "bodyfont": "'Lucida Grande', 'Lucida Sans Unicode', 'Geneva', 'Verdana', 'sans-serif'",
113 # "headfont": "'Lucida Grande', 'Lucida Sans Unicode', 'Geneva', 'Verdana', 'sans-serif'",
114 # "sidebarbgcolor": "fuckyou",
115 # "footerbgcolor": "#771111",
116 # "relbarbgcolor": "#993333",
117 # "sidebartextcolor": "#000000",
118 # "sidebarlinkcolor": "#330000",
119 # "codebgcolor": "#fffff0",
120 # "headtextcolor": "#000080",
121 # "headbgcolor": "#f0f0ff",
122 # "bgcolor": "#ffffff",
123 }
124
125
126 # Add any paths that contain custom themes here, relative to this directory.
127 html_theme_path = ['.']
128
129 # The name for this set of Sphinx documents. If None, it defaults to
130 # "<project> v<release> documentation".
131 html_title = "gensim"
132
133 # A shorter title for the navigation bar. Default is the same as html_title.
134 # html_short_title = ''
135
136 # The name of an image file (relative to this directory) to place at the top
137 # of the sidebar.
138 # html_logo = None
139
140 # The name of an image file (within the static path) to use as favicon of the
141 # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
142 # pixels large.
143 html_favicon = '_static/favicon.ico'
144
145 # Add any paths that contain custom static files (such as style sheets) here,
146 # relative to this directory. They are copied after the builtin static files,
147 # so a file named "default.css" will overwrite the builtin "default.css".
148 html_static_path = ['_static']
149
150 # If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
151 # using the given strftime format.
152 html_last_updated_fmt = '%b %d, %Y'
153
154 # If true, SmartyPants will be used to convert quotes and dashes to
155 # typographically correct entities.
156 # html_use_smartypants = True
157
158 # Custom sidebar templates, maps document names to template names.
159 html_sidebars = {} # {'index': ['download.html', 'globaltoc.html', 'searchbox.html', 'indexsidebar.html']}
160 # html_sidebars = {'index': ['globaltoc.html', 'searchbox.html']}
161
162 # If false, no module index is generated.
163 # html_use_modindex = True
164
165 # If false, no index is generated.
166 # html_use_index = True
167
168 # If true, the index is split into individual pages for each letter.
169 html_split_index = False
170
171 # If true, links to the reST sources are added to the pages.
172 html_show_sourcelink = False
173
174 html_domain_indices = False
175
176 # If true, an OpenSearch description file will be output, and all pages will
177 # contain a <link> tag referring to it. The value of this option must be the
178 # base URL from which the finished HTML is served.
179 # html_use_opensearch = ''
180
181 # If nonempty, this is the file name suffix for HTML files (e.g. ".xhtml").
182 # html_file_suffix = ''
183
184 # Output file base name for HTML help builder.
185 htmlhelp_basename = 'gensimdoc'
186
187 html_show_sphinx = False
188
189 # -- Options for LaTeX output --------------------------------------------------
190
191 # The paper size ('letter' or 'a4').
192 # latex_paper_size = 'letter'
193
194 # The font size ('10pt', '11pt' or '12pt').
195 # latex_font_size = '10pt'
196
197 # Grouping the document tree into LaTeX files. List of tuples
198 # (source start file, target name, title, author, documentclass [howto/manual]).
199 latex_documents = [('index', 'gensim.tex', u'gensim Documentation', u'Radim Řehůřek', 'manual')]
200
201 # The name of an image file (relative to this directory) to place at the top of
202 # the title page.
203 # latex_logo = None
204
205 # For "manual" documents, if this is true, then toplevel headings are parts,
206 # not chapters.
207 latex_use_parts = False
208
209 # Additional stuff for the LaTeX preamble.
210 # latex_preamble = ''
211
212 # Documents to append as an appendix to all manuals.
213 # latex_appendices = []
214
215 # If false, no module index is generated.
216 # latex_use_modindex = True
217
218 suppress_warnings = ['image.nonlocal_uri', 'ref.citation', 'ref.footnote']
219
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/docs/src/conf.py b/docs/src/conf.py
--- a/docs/src/conf.py
+++ b/docs/src/conf.py
@@ -28,6 +28,8 @@
extensions = ['sphinx.ext.autodoc', 'sphinxcontrib.napoleon', 'sphinx.ext.imgmath', 'sphinxcontrib.programoutput']
autoclass_content = "both"
+napoleon_google_docstring = False # Disable support for google-style docstring
+
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
| {"golden_diff": "diff --git a/docs/src/conf.py b/docs/src/conf.py\n--- a/docs/src/conf.py\n+++ b/docs/src/conf.py\n@@ -28,6 +28,8 @@\n extensions = ['sphinx.ext.autodoc', 'sphinxcontrib.napoleon', 'sphinx.ext.imgmath', 'sphinxcontrib.programoutput']\n autoclass_content = \"both\"\n \n+napoleon_google_docstring = False # Disable support for google-style docstring\n+\n # Add any paths that contain templates here, relative to this directory.\n templates_path = ['_templates']\n", "issue": "Turn off support of Google Style docstrings\n**All docstrings should be refactored first**\r\n\r\nTo prevent contributors from using Google Style docstrings, we need to set\r\n\r\n`napoleon_google_docstring = False`,\r\n\r\n[like explained here](https://samnicholls.net/2016/06/15/how-to-sphinx-readthedocs/).\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# gensim documentation build configuration file, created by\n# sphinx-quickstart on Wed Mar 17 13:42:21 2010.\n#\n# This file is execfile()d with the current directory set to its containing dir.\n#\n# Note that not all possible configuration values are present in this\n# autogenerated file.\n#\n# All configuration values have a default; values that are commented out\n# serve to show the default.\n\nimport os\nimport sys\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\nsys.path.append(os.path.abspath('.'))\n\n# -- General configuration -----------------------------------------------------\n\nhtml_theme = 'gensim_theme'\n\n# Add any Sphinx extension module names here, as strings. They can be extensions\n# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.\nextensions = ['sphinx.ext.autodoc', 'sphinxcontrib.napoleon', 'sphinx.ext.imgmath', 'sphinxcontrib.programoutput']\nautoclass_content = \"both\"\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# The suffix of source filenames.\nsource_suffix = '.rst'\n\n# The encoding of source files.\n# source_encoding = 'utf-8'\n\n# The master toctree document.\nmaster_doc = 'indextoc'\n\n# Additional templates that should be rendered to pages, maps page names to\n# template names.\nhtml_additional_pages = {'index': './_templates/indexcontent.html'}\n\n# General information about the project.\nproject = u'gensim'\ncopyright = u'2009-now, Radim \u0158eh\u016f\u0159ek <me(at)radimrehurek.com>'\n\n# The version info for the project you're documenting, acts as replacement for\n# |version| and |release|, also used in various other places throughout the\n# built documents.\n#\n# The short X.Y version.\nversion = '3.4'\n# The full version, including alpha/beta/rc tags.\nrelease = '3.4.0'\n\n# The language for content autogenerated by Sphinx. Refer to documentation\n# for a list of supported languages.\n# language = None\n\n# There are two options for replacing |today|: either, you set today to some\n# non-false value, then it is used:\n# today = ''\n# Else, today_fmt is used as the format for a strftime call.\n# today_fmt = '%B %d, %Y'\n\n# List of documents that shouldn't be included in the build.\n# unused_docs = []\n\n# List of directories, relative to source directory, that shouldn't be searched\n# for source files.\nexclude_trees = ['_build']\n\n# The reST default role (used for this markup: `text`) to use for all documents.\n# default_role = None\n\n# If true, '()' will be appended to :func: etc. cross-reference text.\n# add_function_parentheses = True\n\n# If true, the current module name will be prepended to all description\n# unit titles (such as .. function::).\n# add_module_names = True\n\n# If true, sectionauthor and moduleauthor directives will be shown in the\n# output. They are ignored by default.\n# show_authors = False\n\n# The name of the Pygments (syntax highlighting) style to use.\npygments_style = 'sphinx'\n\n# A list of ignored prefixes for module index sorting.\n# modindex_common_prefix = []\n\n\n# -- Options for HTML output ---------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. Major themes that come with\n# Sphinx are currently 'default' and 'sphinxdoc'.\n# html_theme = 'default'\n\n# Theme options are theme-specific and customize the look and feel of a theme\n# further. For a list of options available for each theme, see the\n# documentation.\n# main_colour = \"#ffbbbb\"\n\nhtml_theme_options = {\n# \"rightsidebar\": \"false\",\n# \"stickysidebar\": \"true\",\n# \"bodyfont\": \"'Lucida Grande', 'Lucida Sans Unicode', 'Geneva', 'Verdana', 'sans-serif'\",\n# \"headfont\": \"'Lucida Grande', 'Lucida Sans Unicode', 'Geneva', 'Verdana', 'sans-serif'\",\n# \"sidebarbgcolor\": \"fuckyou\",\n# \"footerbgcolor\": \"#771111\",\n# \"relbarbgcolor\": \"#993333\",\n# \"sidebartextcolor\": \"#000000\",\n# \"sidebarlinkcolor\": \"#330000\",\n# \"codebgcolor\": \"#fffff0\",\n# \"headtextcolor\": \"#000080\",\n# \"headbgcolor\": \"#f0f0ff\",\n# \"bgcolor\": \"#ffffff\",\n}\n\n\n# Add any paths that contain custom themes here, relative to this directory.\nhtml_theme_path = ['.']\n\n# The name for this set of Sphinx documents. If None, it defaults to\n# \"<project> v<release> documentation\".\nhtml_title = \"gensim\"\n\n# A shorter title for the navigation bar. Default is the same as html_title.\n# html_short_title = ''\n\n# The name of an image file (relative to this directory) to place at the top\n# of the sidebar.\n# html_logo = None\n\n# The name of an image file (within the static path) to use as favicon of the\n# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32\n# pixels large.\nhtml_favicon = '_static/favicon.ico'\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\n# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,\n# using the given strftime format.\nhtml_last_updated_fmt = '%b %d, %Y'\n\n# If true, SmartyPants will be used to convert quotes and dashes to\n# typographically correct entities.\n# html_use_smartypants = True\n\n# Custom sidebar templates, maps document names to template names.\nhtml_sidebars = {} # {'index': ['download.html', 'globaltoc.html', 'searchbox.html', 'indexsidebar.html']}\n# html_sidebars = {'index': ['globaltoc.html', 'searchbox.html']}\n\n# If false, no module index is generated.\n# html_use_modindex = True\n\n# If false, no index is generated.\n# html_use_index = True\n\n# If true, the index is split into individual pages for each letter.\nhtml_split_index = False\n\n# If true, links to the reST sources are added to the pages.\nhtml_show_sourcelink = False\n\nhtml_domain_indices = False\n\n# If true, an OpenSearch description file will be output, and all pages will\n# contain a <link> tag referring to it. The value of this option must be the\n# base URL from which the finished HTML is served.\n# html_use_opensearch = ''\n\n# If nonempty, this is the file name suffix for HTML files (e.g. \".xhtml\").\n# html_file_suffix = ''\n\n# Output file base name for HTML help builder.\nhtmlhelp_basename = 'gensimdoc'\n\nhtml_show_sphinx = False\n\n# -- Options for LaTeX output --------------------------------------------------\n\n# The paper size ('letter' or 'a4').\n# latex_paper_size = 'letter'\n\n# The font size ('10pt', '11pt' or '12pt').\n# latex_font_size = '10pt'\n\n# Grouping the document tree into LaTeX files. List of tuples\n# (source start file, target name, title, author, documentclass [howto/manual]).\nlatex_documents = [('index', 'gensim.tex', u'gensim Documentation', u'Radim \u0158eh\u016f\u0159ek', 'manual')]\n\n# The name of an image file (relative to this directory) to place at the top of\n# the title page.\n# latex_logo = None\n\n# For \"manual\" documents, if this is true, then toplevel headings are parts,\n# not chapters.\nlatex_use_parts = False\n\n# Additional stuff for the LaTeX preamble.\n# latex_preamble = ''\n\n# Documents to append as an appendix to all manuals.\n# latex_appendices = []\n\n# If false, no module index is generated.\n# latex_use_modindex = True\n\nsuppress_warnings = ['image.nonlocal_uri', 'ref.citation', 'ref.footnote']\n", "path": "docs/src/conf.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# gensim documentation build configuration file, created by\n# sphinx-quickstart on Wed Mar 17 13:42:21 2010.\n#\n# This file is execfile()d with the current directory set to its containing dir.\n#\n# Note that not all possible configuration values are present in this\n# autogenerated file.\n#\n# All configuration values have a default; values that are commented out\n# serve to show the default.\n\nimport os\nimport sys\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\nsys.path.append(os.path.abspath('.'))\n\n# -- General configuration -----------------------------------------------------\n\nhtml_theme = 'gensim_theme'\n\n# Add any Sphinx extension module names here, as strings. They can be extensions\n# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.\nextensions = ['sphinx.ext.autodoc', 'sphinxcontrib.napoleon', 'sphinx.ext.imgmath', 'sphinxcontrib.programoutput']\nautoclass_content = \"both\"\n\nnapoleon_google_docstring = False # Disable support for google-style docstring\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# The suffix of source filenames.\nsource_suffix = '.rst'\n\n# The encoding of source files.\n# source_encoding = 'utf-8'\n\n# The master toctree document.\nmaster_doc = 'indextoc'\n\n# Additional templates that should be rendered to pages, maps page names to\n# template names.\nhtml_additional_pages = {'index': './_templates/indexcontent.html'}\n\n# General information about the project.\nproject = u'gensim'\ncopyright = u'2009-now, Radim \u0158eh\u016f\u0159ek <me(at)radimrehurek.com>'\n\n# The version info for the project you're documenting, acts as replacement for\n# |version| and |release|, also used in various other places throughout the\n# built documents.\n#\n# The short X.Y version.\nversion = '3.4'\n# The full version, including alpha/beta/rc tags.\nrelease = '3.4.0'\n\n# The language for content autogenerated by Sphinx. Refer to documentation\n# for a list of supported languages.\n# language = None\n\n# There are two options for replacing |today|: either, you set today to some\n# non-false value, then it is used:\n# today = ''\n# Else, today_fmt is used as the format for a strftime call.\n# today_fmt = '%B %d, %Y'\n\n# List of documents that shouldn't be included in the build.\n# unused_docs = []\n\n# List of directories, relative to source directory, that shouldn't be searched\n# for source files.\nexclude_trees = ['_build']\n\n# The reST default role (used for this markup: `text`) to use for all documents.\n# default_role = None\n\n# If true, '()' will be appended to :func: etc. cross-reference text.\n# add_function_parentheses = True\n\n# If true, the current module name will be prepended to all description\n# unit titles (such as .. function::).\n# add_module_names = True\n\n# If true, sectionauthor and moduleauthor directives will be shown in the\n# output. They are ignored by default.\n# show_authors = False\n\n# The name of the Pygments (syntax highlighting) style to use.\npygments_style = 'sphinx'\n\n# A list of ignored prefixes for module index sorting.\n# modindex_common_prefix = []\n\n\n# -- Options for HTML output ---------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. Major themes that come with\n# Sphinx are currently 'default' and 'sphinxdoc'.\n# html_theme = 'default'\n\n# Theme options are theme-specific and customize the look and feel of a theme\n# further. For a list of options available for each theme, see the\n# documentation.\n# main_colour = \"#ffbbbb\"\n\nhtml_theme_options = {\n# \"rightsidebar\": \"false\",\n# \"stickysidebar\": \"true\",\n# \"bodyfont\": \"'Lucida Grande', 'Lucida Sans Unicode', 'Geneva', 'Verdana', 'sans-serif'\",\n# \"headfont\": \"'Lucida Grande', 'Lucida Sans Unicode', 'Geneva', 'Verdana', 'sans-serif'\",\n# \"sidebarbgcolor\": \"fuckyou\",\n# \"footerbgcolor\": \"#771111\",\n# \"relbarbgcolor\": \"#993333\",\n# \"sidebartextcolor\": \"#000000\",\n# \"sidebarlinkcolor\": \"#330000\",\n# \"codebgcolor\": \"#fffff0\",\n# \"headtextcolor\": \"#000080\",\n# \"headbgcolor\": \"#f0f0ff\",\n# \"bgcolor\": \"#ffffff\",\n}\n\n\n# Add any paths that contain custom themes here, relative to this directory.\nhtml_theme_path = ['.']\n\n# The name for this set of Sphinx documents. If None, it defaults to\n# \"<project> v<release> documentation\".\nhtml_title = \"gensim\"\n\n# A shorter title for the navigation bar. Default is the same as html_title.\n# html_short_title = ''\n\n# The name of an image file (relative to this directory) to place at the top\n# of the sidebar.\n# html_logo = None\n\n# The name of an image file (within the static path) to use as favicon of the\n# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32\n# pixels large.\nhtml_favicon = '_static/favicon.ico'\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\n# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,\n# using the given strftime format.\nhtml_last_updated_fmt = '%b %d, %Y'\n\n# If true, SmartyPants will be used to convert quotes and dashes to\n# typographically correct entities.\n# html_use_smartypants = True\n\n# Custom sidebar templates, maps document names to template names.\nhtml_sidebars = {} # {'index': ['download.html', 'globaltoc.html', 'searchbox.html', 'indexsidebar.html']}\n# html_sidebars = {'index': ['globaltoc.html', 'searchbox.html']}\n\n# If false, no module index is generated.\n# html_use_modindex = True\n\n# If false, no index is generated.\n# html_use_index = True\n\n# If true, the index is split into individual pages for each letter.\nhtml_split_index = False\n\n# If true, links to the reST sources are added to the pages.\nhtml_show_sourcelink = False\n\nhtml_domain_indices = False\n\n# If true, an OpenSearch description file will be output, and all pages will\n# contain a <link> tag referring to it. The value of this option must be the\n# base URL from which the finished HTML is served.\n# html_use_opensearch = ''\n\n# If nonempty, this is the file name suffix for HTML files (e.g. \".xhtml\").\n# html_file_suffix = ''\n\n# Output file base name for HTML help builder.\nhtmlhelp_basename = 'gensimdoc'\n\nhtml_show_sphinx = False\n\n# -- Options for LaTeX output --------------------------------------------------\n\n# The paper size ('letter' or 'a4').\n# latex_paper_size = 'letter'\n\n# The font size ('10pt', '11pt' or '12pt').\n# latex_font_size = '10pt'\n\n# Grouping the document tree into LaTeX files. List of tuples\n# (source start file, target name, title, author, documentclass [howto/manual]).\nlatex_documents = [('index', 'gensim.tex', u'gensim Documentation', u'Radim \u0158eh\u016f\u0159ek', 'manual')]\n\n# The name of an image file (relative to this directory) to place at the top of\n# the title page.\n# latex_logo = None\n\n# For \"manual\" documents, if this is true, then toplevel headings are parts,\n# not chapters.\nlatex_use_parts = False\n\n# Additional stuff for the LaTeX preamble.\n# latex_preamble = ''\n\n# Documents to append as an appendix to all manuals.\n# latex_appendices = []\n\n# If false, no module index is generated.\n# latex_use_modindex = True\n\nsuppress_warnings = ['image.nonlocal_uri', 'ref.citation', 'ref.footnote']\n", "path": "docs/src/conf.py"}]} | 2,795 | 120 |
gh_patches_debug_31308 | rasdani/github-patches | git_diff | dask__distributed-4984 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Drop down tile to reveal "secret" dashboards
We're accumulating a lot of _secret_ dashboard pages https://github.com/dask/distributed/blob/c2557938e6c4175534031cba5ca5ac9d2cdc95f7/distributed/dashboard/scheduler.py#L82-L119
although most are not easily accessible from the UI. Most of the pages are not useful for the ordinary user and are only relevant for specific edge cases or debugging. hence, it makes sense that they are not promoted as a top-level dashboard page.
However, at least for debugging purposes, I would really appreciate if these pages were a bit easier to navigate. In particular I'm looking for a way which doesn't require me to know the exact endpoint for an individual plot and requires me to type it into my browser.
I would propose to add a drop down menu / button which can be used to browse all _hidden_ dashboard pages.
Disclaimer: I can't implement this. I barely know what bokeh is.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `distributed/dashboard/scheduler.py`
Content:
```
1 from urllib.parse import urljoin
2
3 from tornado import web
4 from tornado.ioloop import IOLoop
5
6 try:
7 import numpy as np
8 except ImportError:
9 np = False
10
11 from .components.nvml import gpu_doc # noqa: 1708
12 from .components.nvml import NVML_ENABLED, gpu_memory_doc, gpu_utilization_doc
13 from .components.scheduler import (
14 AggregateAction,
15 BandwidthTypes,
16 BandwidthWorkers,
17 ComputePerKey,
18 CurrentLoad,
19 MemoryByKey,
20 NBytes,
21 NBytesCluster,
22 Occupancy,
23 SystemMonitor,
24 TaskGraph,
25 TaskGroupGraph,
26 TaskProgress,
27 TaskStream,
28 WorkerTable,
29 events_doc,
30 graph_doc,
31 individual_doc,
32 individual_profile_doc,
33 individual_profile_server_doc,
34 profile_doc,
35 profile_server_doc,
36 status_doc,
37 stealing_doc,
38 systemmonitor_doc,
39 tasks_doc,
40 tg_graph_doc,
41 workers_doc,
42 )
43 from .core import BokehApplication
44 from .worker import counters_doc
45
46 template_variables = {
47 "pages": [
48 "status",
49 "workers",
50 "tasks",
51 "system",
52 "profile",
53 "graph",
54 "groups",
55 "info",
56 ]
57 }
58
59 if NVML_ENABLED:
60 template_variables["pages"].insert(4, "gpu")
61
62
63 def connect(application, http_server, scheduler, prefix=""):
64 bokeh_app = BokehApplication(
65 applications, scheduler, prefix=prefix, template_variables=template_variables
66 )
67 application.add_application(bokeh_app)
68 bokeh_app.initialize(IOLoop.current())
69
70 bokeh_app.add_handlers(
71 r".*",
72 [
73 (
74 r"/",
75 web.RedirectHandler,
76 {"url": urljoin((prefix or "").strip("/") + "/", r"status")},
77 )
78 ],
79 )
80
81
82 applications = {
83 "/system": systemmonitor_doc,
84 "/stealing": stealing_doc,
85 "/workers": workers_doc,
86 "/events": events_doc,
87 "/counters": counters_doc,
88 "/tasks": tasks_doc,
89 "/status": status_doc,
90 "/profile": profile_doc,
91 "/profile-server": profile_server_doc,
92 "/graph": graph_doc,
93 "/groups": tg_graph_doc,
94 "/gpu": gpu_doc,
95 "/individual-task-stream": individual_doc(
96 TaskStream, 100, n_rectangles=1000, clear_interval="10s"
97 ),
98 "/individual-progress": individual_doc(TaskProgress, 100, height=160),
99 "/individual-graph": individual_doc(TaskGraph, 200),
100 "/individual-groups": individual_doc(TaskGroupGraph, 200),
101 "/individual-nbytes": individual_doc(NBytes, 100),
102 "/individual-nbytes-cluster": individual_doc(NBytesCluster, 100),
103 "/individual-cpu": individual_doc(CurrentLoad, 100, fig_attr="cpu_figure"),
104 "/individual-nprocessing": individual_doc(
105 CurrentLoad, 100, fig_attr="processing_figure"
106 ),
107 "/individual-occupancy": individual_doc(Occupancy, 100),
108 "/individual-workers": individual_doc(WorkerTable, 500),
109 "/individual-bandwidth-types": individual_doc(BandwidthTypes, 500),
110 "/individual-bandwidth-workers": individual_doc(BandwidthWorkers, 500),
111 "/individual-memory-by-key": individual_doc(MemoryByKey, 500),
112 "/individual-compute-time-per-key": individual_doc(ComputePerKey, 500),
113 "/individual-aggregate-time-per-action": individual_doc(AggregateAction, 500),
114 "/individual-scheduler-system": individual_doc(SystemMonitor, 500),
115 "/individual-profile": individual_profile_doc,
116 "/individual-profile-server": individual_profile_server_doc,
117 "/individual-gpu-memory": gpu_memory_doc,
118 "/individual-gpu-utilization": gpu_utilization_doc,
119 }
120
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/distributed/dashboard/scheduler.py b/distributed/dashboard/scheduler.py
--- a/distributed/dashboard/scheduler.py
+++ b/distributed/dashboard/scheduler.py
@@ -43,42 +43,6 @@
from .core import BokehApplication
from .worker import counters_doc
-template_variables = {
- "pages": [
- "status",
- "workers",
- "tasks",
- "system",
- "profile",
- "graph",
- "groups",
- "info",
- ]
-}
-
-if NVML_ENABLED:
- template_variables["pages"].insert(4, "gpu")
-
-
-def connect(application, http_server, scheduler, prefix=""):
- bokeh_app = BokehApplication(
- applications, scheduler, prefix=prefix, template_variables=template_variables
- )
- application.add_application(bokeh_app)
- bokeh_app.initialize(IOLoop.current())
-
- bokeh_app.add_handlers(
- r".*",
- [
- (
- r"/",
- web.RedirectHandler,
- {"url": urljoin((prefix or "").strip("/") + "/", r"status")},
- )
- ],
- )
-
-
applications = {
"/system": systemmonitor_doc,
"/stealing": stealing_doc,
@@ -117,3 +81,40 @@
"/individual-gpu-memory": gpu_memory_doc,
"/individual-gpu-utilization": gpu_utilization_doc,
}
+
+
+template_variables = {
+ "pages": [
+ "status",
+ "workers",
+ "tasks",
+ "system",
+ "profile",
+ "graph",
+ "groups",
+ "info",
+ ],
+ "plots": [x.replace("/", "") for x in applications if "individual" in x],
+}
+
+if NVML_ENABLED:
+ template_variables["pages"].insert(4, "gpu")
+
+
+def connect(application, http_server, scheduler, prefix=""):
+ bokeh_app = BokehApplication(
+ applications, scheduler, prefix=prefix, template_variables=template_variables
+ )
+ application.add_application(bokeh_app)
+ bokeh_app.initialize(IOLoop.current())
+
+ bokeh_app.add_handlers(
+ r".*",
+ [
+ (
+ r"/",
+ web.RedirectHandler,
+ {"url": urljoin((prefix or "").strip("/") + "/", r"status")},
+ )
+ ],
+ )
| {"golden_diff": "diff --git a/distributed/dashboard/scheduler.py b/distributed/dashboard/scheduler.py\n--- a/distributed/dashboard/scheduler.py\n+++ b/distributed/dashboard/scheduler.py\n@@ -43,42 +43,6 @@\n from .core import BokehApplication\n from .worker import counters_doc\n \n-template_variables = {\n- \"pages\": [\n- \"status\",\n- \"workers\",\n- \"tasks\",\n- \"system\",\n- \"profile\",\n- \"graph\",\n- \"groups\",\n- \"info\",\n- ]\n-}\n-\n-if NVML_ENABLED:\n- template_variables[\"pages\"].insert(4, \"gpu\")\n-\n-\n-def connect(application, http_server, scheduler, prefix=\"\"):\n- bokeh_app = BokehApplication(\n- applications, scheduler, prefix=prefix, template_variables=template_variables\n- )\n- application.add_application(bokeh_app)\n- bokeh_app.initialize(IOLoop.current())\n-\n- bokeh_app.add_handlers(\n- r\".*\",\n- [\n- (\n- r\"/\",\n- web.RedirectHandler,\n- {\"url\": urljoin((prefix or \"\").strip(\"/\") + \"/\", r\"status\")},\n- )\n- ],\n- )\n-\n-\n applications = {\n \"/system\": systemmonitor_doc,\n \"/stealing\": stealing_doc,\n@@ -117,3 +81,40 @@\n \"/individual-gpu-memory\": gpu_memory_doc,\n \"/individual-gpu-utilization\": gpu_utilization_doc,\n }\n+\n+\n+template_variables = {\n+ \"pages\": [\n+ \"status\",\n+ \"workers\",\n+ \"tasks\",\n+ \"system\",\n+ \"profile\",\n+ \"graph\",\n+ \"groups\",\n+ \"info\",\n+ ],\n+ \"plots\": [x.replace(\"/\", \"\") for x in applications if \"individual\" in x],\n+}\n+\n+if NVML_ENABLED:\n+ template_variables[\"pages\"].insert(4, \"gpu\")\n+\n+\n+def connect(application, http_server, scheduler, prefix=\"\"):\n+ bokeh_app = BokehApplication(\n+ applications, scheduler, prefix=prefix, template_variables=template_variables\n+ )\n+ application.add_application(bokeh_app)\n+ bokeh_app.initialize(IOLoop.current())\n+\n+ bokeh_app.add_handlers(\n+ r\".*\",\n+ [\n+ (\n+ r\"/\",\n+ web.RedirectHandler,\n+ {\"url\": urljoin((prefix or \"\").strip(\"/\") + \"/\", r\"status\")},\n+ )\n+ ],\n+ )\n", "issue": "Drop down tile to reveal \"secret\" dashboards\nWe're accumulating a lot of _secret_ dashboard pages https://github.com/dask/distributed/blob/c2557938e6c4175534031cba5ca5ac9d2cdc95f7/distributed/dashboard/scheduler.py#L82-L119\r\nalthough most are not easily accessible from the UI. Most of the pages are not useful for the ordinary user and are only relevant for specific edge cases or debugging. hence, it makes sense that they are not promoted as a top-level dashboard page.\r\n\r\nHowever, at least for debugging purposes, I would really appreciate if these pages were a bit easier to navigate. In particular I'm looking for a way which doesn't require me to know the exact endpoint for an individual plot and requires me to type it into my browser.\r\n\r\nI would propose to add a drop down menu / button which can be used to browse all _hidden_ dashboard pages.\r\n\r\nDisclaimer: I can't implement this. I barely know what bokeh is.\n", "before_files": [{"content": "from urllib.parse import urljoin\n\nfrom tornado import web\nfrom tornado.ioloop import IOLoop\n\ntry:\n import numpy as np\nexcept ImportError:\n np = False\n\nfrom .components.nvml import gpu_doc # noqa: 1708\nfrom .components.nvml import NVML_ENABLED, gpu_memory_doc, gpu_utilization_doc\nfrom .components.scheduler import (\n AggregateAction,\n BandwidthTypes,\n BandwidthWorkers,\n ComputePerKey,\n CurrentLoad,\n MemoryByKey,\n NBytes,\n NBytesCluster,\n Occupancy,\n SystemMonitor,\n TaskGraph,\n TaskGroupGraph,\n TaskProgress,\n TaskStream,\n WorkerTable,\n events_doc,\n graph_doc,\n individual_doc,\n individual_profile_doc,\n individual_profile_server_doc,\n profile_doc,\n profile_server_doc,\n status_doc,\n stealing_doc,\n systemmonitor_doc,\n tasks_doc,\n tg_graph_doc,\n workers_doc,\n)\nfrom .core import BokehApplication\nfrom .worker import counters_doc\n\ntemplate_variables = {\n \"pages\": [\n \"status\",\n \"workers\",\n \"tasks\",\n \"system\",\n \"profile\",\n \"graph\",\n \"groups\",\n \"info\",\n ]\n}\n\nif NVML_ENABLED:\n template_variables[\"pages\"].insert(4, \"gpu\")\n\n\ndef connect(application, http_server, scheduler, prefix=\"\"):\n bokeh_app = BokehApplication(\n applications, scheduler, prefix=prefix, template_variables=template_variables\n )\n application.add_application(bokeh_app)\n bokeh_app.initialize(IOLoop.current())\n\n bokeh_app.add_handlers(\n r\".*\",\n [\n (\n r\"/\",\n web.RedirectHandler,\n {\"url\": urljoin((prefix or \"\").strip(\"/\") + \"/\", r\"status\")},\n )\n ],\n )\n\n\napplications = {\n \"/system\": systemmonitor_doc,\n \"/stealing\": stealing_doc,\n \"/workers\": workers_doc,\n \"/events\": events_doc,\n \"/counters\": counters_doc,\n \"/tasks\": tasks_doc,\n \"/status\": status_doc,\n \"/profile\": profile_doc,\n \"/profile-server\": profile_server_doc,\n \"/graph\": graph_doc,\n \"/groups\": tg_graph_doc,\n \"/gpu\": gpu_doc,\n \"/individual-task-stream\": individual_doc(\n TaskStream, 100, n_rectangles=1000, clear_interval=\"10s\"\n ),\n \"/individual-progress\": individual_doc(TaskProgress, 100, height=160),\n \"/individual-graph\": individual_doc(TaskGraph, 200),\n \"/individual-groups\": individual_doc(TaskGroupGraph, 200),\n \"/individual-nbytes\": individual_doc(NBytes, 100),\n \"/individual-nbytes-cluster\": individual_doc(NBytesCluster, 100),\n \"/individual-cpu\": individual_doc(CurrentLoad, 100, fig_attr=\"cpu_figure\"),\n \"/individual-nprocessing\": individual_doc(\n CurrentLoad, 100, fig_attr=\"processing_figure\"\n ),\n \"/individual-occupancy\": individual_doc(Occupancy, 100),\n \"/individual-workers\": individual_doc(WorkerTable, 500),\n \"/individual-bandwidth-types\": individual_doc(BandwidthTypes, 500),\n \"/individual-bandwidth-workers\": individual_doc(BandwidthWorkers, 500),\n \"/individual-memory-by-key\": individual_doc(MemoryByKey, 500),\n \"/individual-compute-time-per-key\": individual_doc(ComputePerKey, 500),\n \"/individual-aggregate-time-per-action\": individual_doc(AggregateAction, 500),\n \"/individual-scheduler-system\": individual_doc(SystemMonitor, 500),\n \"/individual-profile\": individual_profile_doc,\n \"/individual-profile-server\": individual_profile_server_doc,\n \"/individual-gpu-memory\": gpu_memory_doc,\n \"/individual-gpu-utilization\": gpu_utilization_doc,\n}\n", "path": "distributed/dashboard/scheduler.py"}], "after_files": [{"content": "from urllib.parse import urljoin\n\nfrom tornado import web\nfrom tornado.ioloop import IOLoop\n\ntry:\n import numpy as np\nexcept ImportError:\n np = False\n\nfrom .components.nvml import gpu_doc # noqa: 1708\nfrom .components.nvml import NVML_ENABLED, gpu_memory_doc, gpu_utilization_doc\nfrom .components.scheduler import (\n AggregateAction,\n BandwidthTypes,\n BandwidthWorkers,\n ComputePerKey,\n CurrentLoad,\n MemoryByKey,\n NBytes,\n NBytesCluster,\n Occupancy,\n SystemMonitor,\n TaskGraph,\n TaskGroupGraph,\n TaskProgress,\n TaskStream,\n WorkerTable,\n events_doc,\n graph_doc,\n individual_doc,\n individual_profile_doc,\n individual_profile_server_doc,\n profile_doc,\n profile_server_doc,\n status_doc,\n stealing_doc,\n systemmonitor_doc,\n tasks_doc,\n tg_graph_doc,\n workers_doc,\n)\nfrom .core import BokehApplication\nfrom .worker import counters_doc\n\napplications = {\n \"/system\": systemmonitor_doc,\n \"/stealing\": stealing_doc,\n \"/workers\": workers_doc,\n \"/events\": events_doc,\n \"/counters\": counters_doc,\n \"/tasks\": tasks_doc,\n \"/status\": status_doc,\n \"/profile\": profile_doc,\n \"/profile-server\": profile_server_doc,\n \"/graph\": graph_doc,\n \"/groups\": tg_graph_doc,\n \"/gpu\": gpu_doc,\n \"/individual-task-stream\": individual_doc(\n TaskStream, 100, n_rectangles=1000, clear_interval=\"10s\"\n ),\n \"/individual-progress\": individual_doc(TaskProgress, 100, height=160),\n \"/individual-graph\": individual_doc(TaskGraph, 200),\n \"/individual-groups\": individual_doc(TaskGroupGraph, 200),\n \"/individual-nbytes\": individual_doc(NBytes, 100),\n \"/individual-nbytes-cluster\": individual_doc(NBytesCluster, 100),\n \"/individual-cpu\": individual_doc(CurrentLoad, 100, fig_attr=\"cpu_figure\"),\n \"/individual-nprocessing\": individual_doc(\n CurrentLoad, 100, fig_attr=\"processing_figure\"\n ),\n \"/individual-occupancy\": individual_doc(Occupancy, 100),\n \"/individual-workers\": individual_doc(WorkerTable, 500),\n \"/individual-bandwidth-types\": individual_doc(BandwidthTypes, 500),\n \"/individual-bandwidth-workers\": individual_doc(BandwidthWorkers, 500),\n \"/individual-memory-by-key\": individual_doc(MemoryByKey, 500),\n \"/individual-compute-time-per-key\": individual_doc(ComputePerKey, 500),\n \"/individual-aggregate-time-per-action\": individual_doc(AggregateAction, 500),\n \"/individual-scheduler-system\": individual_doc(SystemMonitor, 500),\n \"/individual-profile\": individual_profile_doc,\n \"/individual-profile-server\": individual_profile_server_doc,\n \"/individual-gpu-memory\": gpu_memory_doc,\n \"/individual-gpu-utilization\": gpu_utilization_doc,\n}\n\n\ntemplate_variables = {\n \"pages\": [\n \"status\",\n \"workers\",\n \"tasks\",\n \"system\",\n \"profile\",\n \"graph\",\n \"groups\",\n \"info\",\n ],\n \"plots\": [x.replace(\"/\", \"\") for x in applications if \"individual\" in x],\n}\n\nif NVML_ENABLED:\n template_variables[\"pages\"].insert(4, \"gpu\")\n\n\ndef connect(application, http_server, scheduler, prefix=\"\"):\n bokeh_app = BokehApplication(\n applications, scheduler, prefix=prefix, template_variables=template_variables\n )\n application.add_application(bokeh_app)\n bokeh_app.initialize(IOLoop.current())\n\n bokeh_app.add_handlers(\n r\".*\",\n [\n (\n r\"/\",\n web.RedirectHandler,\n {\"url\": urljoin((prefix or \"\").strip(\"/\") + \"/\", r\"status\")},\n )\n ],\n )\n", "path": "distributed/dashboard/scheduler.py"}]} | 1,592 | 548 |
gh_patches_debug_24350 | rasdani/github-patches | git_diff | Qiskit__qiskit-1118 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Can not combine the Result object from the same backend (statevector)
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Informations
- **Qiskit Terra version**: the master branch
- **Python version**: 3.6.5
- **Operating system**: macOS 10.13
### What is the current behavior?
raise error
```
Traceback (most recent call last):
File "/Users/rchen/Developer/Quantum/qiskit-terra/qiskit/result/_result.py", line 125, in __add__
copy_of_self += other
File "/Users/rchen/Developer/Quantum/qiskit-terra/qiskit/result/_result.py", line 108, in __iadd__
raise QISKitError('Result objects from different backends cannot be combined.')
qiskit._qiskiterror.QISKitError: 'Result objects from different backends cannot be combined.'
```
### Steps to reproduce the problem
Code
```python
from qiskit import QuantumRegister, QuantumCircuit, ClassicalRegister
import qiskit as qk
import numpy as np
num_qubits = 2
q = QuantumRegister(num_qubits, name='q')
c = ClassicalRegister(num_qubits, name='c')
circuits = QuantumCircuit(q, c)
param_idx = 0
for qubit in range(num_qubits):
circuits.u3(0.0, 0.0, 0.0, q[qubit])
circuits.u1(3.0, q[qubit])
# circuits.measure(q, c)
my_backend = qk.Aer.get_backend('statevector_simulator')
qobj = qk.compile(circuits=circuits, backend=my_backend)
job = my_backend.run(qobj)
result_a = job.result()
qobj = qk.compile(circuits=circuits, backend=my_backend)
job = my_backend.run(qobj)
result_b = job.result()
result = result_a + result_b
```
### What is the expected behavior?
Result objects are combined without error
### Suggested solutions
None
Note: If I change the backend to `qasm_simulator`, there is no error.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `qiskit/backends/aer/statevector_simulator.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Copyright 2017, IBM.
4 #
5 # This source code is licensed under the Apache License, Version 2.0 found in
6 # the LICENSE.txt file in the root directory of this source tree.
7
8 # pylint: disable=invalid-name
9
10 """
11 Interface to C++ quantum circuit simulator with realistic noise.
12 """
13
14 import logging
15 import uuid
16
17 from qiskit.qobj import QobjInstruction
18 from .qasm_simulator import QasmSimulator
19 from ._simulatorerror import SimulatorError
20 from .aerjob import AerJob
21
22 logger = logging.getLogger(__name__)
23
24
25 class StatevectorSimulator(QasmSimulator):
26 """C++ statevector simulator"""
27
28 DEFAULT_CONFIGURATION = {
29 'name': 'statevector_simulator',
30 'url': 'https://github.com/QISKit/qiskit-terra/src/qasm-simulator-cpp',
31 'simulator': True,
32 'local': True,
33 'description': 'A C++ statevector simulator for qobj files',
34 'coupling_map': 'all-to-all',
35 'basis_gates': 'u1,u2,u3,cx,cz,id,x,y,z,h,s,sdg,t,tdg,rzz,load,save,snapshot'
36 }
37
38 def __init__(self, configuration=None, provider=None):
39 super().__init__(configuration=configuration or self.DEFAULT_CONFIGURATION.copy(),
40 provider=provider)
41
42 def run(self, qobj):
43 """Run a qobj on the the backend."""
44 job_id = str(uuid.uuid4())
45 aer_job = AerJob(self, job_id, self._run_job, qobj)
46 aer_job.submit()
47 return aer_job
48
49 def _run_job(self, job_id, qobj):
50 """Run a Qobj on the backend."""
51 self._validate(qobj)
52 final_state_key = 32767 # Internal key for final state snapshot
53 # Add final snapshots to circuits
54 for experiment in qobj.experiments:
55 experiment.instructions.append(
56 QobjInstruction(name='snapshot', params=[final_state_key])
57 )
58 result = super()._run_job(job_id, qobj)
59 # Replace backend name with current backend
60 result.backend_name = self.name
61 # Extract final state snapshot and move to 'statevector' data field
62 for experiment_result in result.results.values():
63 snapshots = experiment_result.snapshots
64 if str(final_state_key) in snapshots:
65 final_state_key = str(final_state_key)
66 # Pop off final snapshot added above
67 final_state = snapshots.pop(final_state_key, None)
68 final_state = final_state['statevector'][0]
69 # Add final state to results data
70 experiment_result.data['statevector'] = final_state
71 # Remove snapshot dict if empty
72 if snapshots == {}:
73 experiment_result.data.pop('snapshots', None)
74 return result
75
76 def _validate(self, qobj):
77 """Semantic validations of the qobj which cannot be done via schemas.
78 Some of these may later move to backend schemas.
79
80 1. No shots
81 2. No measurements in the middle
82 """
83 if qobj.config.shots != 1:
84 logger.info("statevector simulator only supports 1 shot. "
85 "Setting shots=1.")
86 qobj.config.shots = 1
87 for experiment in qobj.experiments:
88 if getattr(experiment.config, 'shots', 1) != 1:
89 logger.info("statevector simulator only supports 1 shot. "
90 "Setting shots=1 for circuit %s.", experiment.name)
91 experiment.config.shots = 1
92 for op in experiment.instructions:
93 if op.name in ['measure', 'reset']:
94 raise SimulatorError(
95 "In circuit {}: statevector simulator does not support "
96 "measure or reset.".format(experiment.header.name))
97
```
Path: `qiskit/backends/aer/statevector_simulator_py.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Copyright 2017, IBM.
4 #
5 # This source code is licensed under the Apache License, Version 2.0 found in
6 # the LICENSE.txt file in the root directory of this source tree.
7
8 # pylint: disable=invalid-name
9
10 """Contains a (slow) python statevector simulator.
11
12 It simulates the statevector through a quantum circuit. It is exponential in
13 the number of qubits.
14
15 We advise using the c++ simulator or online simulator for larger size systems.
16
17 The input is a qobj dictionary and the output is a Result object.
18
19 The input qobj to this simulator has no shots, no measures, no reset, no noise.
20 """
21 import logging
22 import uuid
23
24 from qiskit.backends.aer.aerjob import AerJob
25 from qiskit.backends.aer._simulatorerror import SimulatorError
26 from qiskit.qobj import QobjInstruction
27 from .qasm_simulator_py import QasmSimulatorPy
28
29 logger = logging.getLogger(__name__)
30
31
32 class StatevectorSimulatorPy(QasmSimulatorPy):
33 """Python statevector simulator."""
34
35 DEFAULT_CONFIGURATION = {
36 'name': 'statevector_simulator_py',
37 'url': 'https://github.com/QISKit/qiskit-terra',
38 'simulator': True,
39 'local': True,
40 'description': 'A Python statevector simulator for qobj files',
41 'coupling_map': 'all-to-all',
42 'basis_gates': 'u1,u2,u3,cx,id,snapshot'
43 }
44
45 def __init__(self, configuration=None, provider=None):
46 super().__init__(configuration=configuration or self.DEFAULT_CONFIGURATION.copy(),
47 provider=provider)
48
49 def run(self, qobj):
50 """Run qobj asynchronously.
51
52 Args:
53 qobj (dict): job description
54
55 Returns:
56 AerJob: derived from BaseJob
57 """
58 job_id = str(uuid.uuid4())
59 aer_job = AerJob(self, job_id, self._run_job, qobj)
60 aer_job.submit()
61 return aer_job
62
63 def _run_job(self, job_id, qobj):
64 """Run a Qobj on the backend."""
65 self._validate(qobj)
66 final_state_key = 32767 # Internal key for final state snapshot
67 # Add final snapshots to circuits
68 for experiment in qobj.experiments:
69 experiment.instructions.append(
70 QobjInstruction(name='snapshot', params=[final_state_key])
71 )
72 result = super()._run_job(job_id, qobj)
73 # Replace backend name with current backend
74 result.backend_name = self.name
75 # Extract final state snapshot and move to 'statevector' data field
76 for experiment_result in result.results.values():
77 snapshots = experiment_result.snapshots
78 if str(final_state_key) in snapshots:
79 final_state_key = str(final_state_key)
80 # Pop off final snapshot added above
81 final_state = snapshots.pop(final_state_key, None)
82 final_state = final_state['statevector'][0]
83 # Add final state to results data
84 experiment_result.data['statevector'] = final_state
85 # Remove snapshot dict if empty
86 if snapshots == {}:
87 experiment_result.data.pop('snapshots', None)
88 return result
89
90 def _validate(self, qobj):
91 """Semantic validations of the qobj which cannot be done via schemas.
92 Some of these may later move to backend schemas.
93
94 1. No shots
95 2. No measurements in the middle
96 """
97 if qobj.config.shots != 1:
98 logger.info("statevector simulator only supports 1 shot. "
99 "Setting shots=1.")
100 qobj.config.shots = 1
101 for experiment in qobj.experiments:
102 if getattr(experiment.config, 'shots', 1) != 1:
103 logger.info("statevector simulator only supports 1 shot. "
104 "Setting shots=1 for circuit %s.", experiment.name)
105 experiment.config.shots = 1
106 for op in experiment.instructions:
107 if op.name in ['measure', 'reset']:
108 raise SimulatorError(
109 "In circuit {}: statevector simulator does not support "
110 "measure or reset.".format(experiment.header.name))
111
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/qiskit/backends/aer/statevector_simulator.py b/qiskit/backends/aer/statevector_simulator.py
--- a/qiskit/backends/aer/statevector_simulator.py
+++ b/qiskit/backends/aer/statevector_simulator.py
@@ -56,8 +56,6 @@
QobjInstruction(name='snapshot', params=[final_state_key])
)
result = super()._run_job(job_id, qobj)
- # Replace backend name with current backend
- result.backend_name = self.name
# Extract final state snapshot and move to 'statevector' data field
for experiment_result in result.results.values():
snapshots = experiment_result.snapshots
diff --git a/qiskit/backends/aer/statevector_simulator_py.py b/qiskit/backends/aer/statevector_simulator_py.py
--- a/qiskit/backends/aer/statevector_simulator_py.py
+++ b/qiskit/backends/aer/statevector_simulator_py.py
@@ -70,8 +70,6 @@
QobjInstruction(name='snapshot', params=[final_state_key])
)
result = super()._run_job(job_id, qobj)
- # Replace backend name with current backend
- result.backend_name = self.name
# Extract final state snapshot and move to 'statevector' data field
for experiment_result in result.results.values():
snapshots = experiment_result.snapshots
| {"golden_diff": "diff --git a/qiskit/backends/aer/statevector_simulator.py b/qiskit/backends/aer/statevector_simulator.py\n--- a/qiskit/backends/aer/statevector_simulator.py\n+++ b/qiskit/backends/aer/statevector_simulator.py\n@@ -56,8 +56,6 @@\n QobjInstruction(name='snapshot', params=[final_state_key])\n )\n result = super()._run_job(job_id, qobj)\n- # Replace backend name with current backend\n- result.backend_name = self.name\n # Extract final state snapshot and move to 'statevector' data field\n for experiment_result in result.results.values():\n snapshots = experiment_result.snapshots\ndiff --git a/qiskit/backends/aer/statevector_simulator_py.py b/qiskit/backends/aer/statevector_simulator_py.py\n--- a/qiskit/backends/aer/statevector_simulator_py.py\n+++ b/qiskit/backends/aer/statevector_simulator_py.py\n@@ -70,8 +70,6 @@\n QobjInstruction(name='snapshot', params=[final_state_key])\n )\n result = super()._run_job(job_id, qobj)\n- # Replace backend name with current backend\n- result.backend_name = self.name\n # Extract final state snapshot and move to 'statevector' data field\n for experiment_result in result.results.values():\n snapshots = experiment_result.snapshots\n", "issue": "Can not combine the Result object from the same backend (statevector)\n<!-- \u26a0\ufe0f If you do not respect this template, your issue will be closed -->\r\n<!-- \u26a0\ufe0f Make sure to browse the opened and closed issues -->\r\n\r\n### Informations\r\n\r\n- **Qiskit Terra version**: the master branch\r\n- **Python version**: 3.6.5\r\n- **Operating system**: macOS 10.13\r\n\r\n### What is the current behavior?\r\nraise error\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/Users/rchen/Developer/Quantum/qiskit-terra/qiskit/result/_result.py\", line 125, in __add__\r\n copy_of_self += other\r\n File \"/Users/rchen/Developer/Quantum/qiskit-terra/qiskit/result/_result.py\", line 108, in __iadd__\r\n raise QISKitError('Result objects from different backends cannot be combined.')\r\nqiskit._qiskiterror.QISKitError: 'Result objects from different backends cannot be combined.'\r\n```\r\n\r\n### Steps to reproduce the problem\r\nCode\r\n```python\r\nfrom qiskit import QuantumRegister, QuantumCircuit, ClassicalRegister\r\nimport qiskit as qk\r\nimport numpy as np\r\n\r\nnum_qubits = 2\r\n\r\nq = QuantumRegister(num_qubits, name='q')\r\nc = ClassicalRegister(num_qubits, name='c')\r\ncircuits = QuantumCircuit(q, c)\r\nparam_idx = 0\r\nfor qubit in range(num_qubits):\r\n circuits.u3(0.0, 0.0, 0.0, q[qubit])\r\n circuits.u1(3.0, q[qubit])\r\n\r\n# circuits.measure(q, c)\r\n\r\n\r\nmy_backend = qk.Aer.get_backend('statevector_simulator')\r\nqobj = qk.compile(circuits=circuits, backend=my_backend)\r\njob = my_backend.run(qobj)\r\nresult_a = job.result()\r\n\r\nqobj = qk.compile(circuits=circuits, backend=my_backend)\r\njob = my_backend.run(qobj)\r\nresult_b = job.result()\r\n\r\nresult = result_a + result_b\r\n\r\n```\r\n\r\n\r\n### What is the expected behavior?\r\nResult objects are combined without error\r\n\r\n\r\n### Suggested solutions\r\nNone\r\n\r\nNote: If I change the backend to `qasm_simulator`, there is no error.\r\n\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Copyright 2017, IBM.\n#\n# This source code is licensed under the Apache License, Version 2.0 found in\n# the LICENSE.txt file in the root directory of this source tree.\n\n# pylint: disable=invalid-name\n\n\"\"\"\nInterface to C++ quantum circuit simulator with realistic noise.\n\"\"\"\n\nimport logging\nimport uuid\n\nfrom qiskit.qobj import QobjInstruction\nfrom .qasm_simulator import QasmSimulator\nfrom ._simulatorerror import SimulatorError\nfrom .aerjob import AerJob\n\nlogger = logging.getLogger(__name__)\n\n\nclass StatevectorSimulator(QasmSimulator):\n \"\"\"C++ statevector simulator\"\"\"\n\n DEFAULT_CONFIGURATION = {\n 'name': 'statevector_simulator',\n 'url': 'https://github.com/QISKit/qiskit-terra/src/qasm-simulator-cpp',\n 'simulator': True,\n 'local': True,\n 'description': 'A C++ statevector simulator for qobj files',\n 'coupling_map': 'all-to-all',\n 'basis_gates': 'u1,u2,u3,cx,cz,id,x,y,z,h,s,sdg,t,tdg,rzz,load,save,snapshot'\n }\n\n def __init__(self, configuration=None, provider=None):\n super().__init__(configuration=configuration or self.DEFAULT_CONFIGURATION.copy(),\n provider=provider)\n\n def run(self, qobj):\n \"\"\"Run a qobj on the the backend.\"\"\"\n job_id = str(uuid.uuid4())\n aer_job = AerJob(self, job_id, self._run_job, qobj)\n aer_job.submit()\n return aer_job\n\n def _run_job(self, job_id, qobj):\n \"\"\"Run a Qobj on the backend.\"\"\"\n self._validate(qobj)\n final_state_key = 32767 # Internal key for final state snapshot\n # Add final snapshots to circuits\n for experiment in qobj.experiments:\n experiment.instructions.append(\n QobjInstruction(name='snapshot', params=[final_state_key])\n )\n result = super()._run_job(job_id, qobj)\n # Replace backend name with current backend\n result.backend_name = self.name\n # Extract final state snapshot and move to 'statevector' data field\n for experiment_result in result.results.values():\n snapshots = experiment_result.snapshots\n if str(final_state_key) in snapshots:\n final_state_key = str(final_state_key)\n # Pop off final snapshot added above\n final_state = snapshots.pop(final_state_key, None)\n final_state = final_state['statevector'][0]\n # Add final state to results data\n experiment_result.data['statevector'] = final_state\n # Remove snapshot dict if empty\n if snapshots == {}:\n experiment_result.data.pop('snapshots', None)\n return result\n\n def _validate(self, qobj):\n \"\"\"Semantic validations of the qobj which cannot be done via schemas.\n Some of these may later move to backend schemas.\n\n 1. No shots\n 2. No measurements in the middle\n \"\"\"\n if qobj.config.shots != 1:\n logger.info(\"statevector simulator only supports 1 shot. \"\n \"Setting shots=1.\")\n qobj.config.shots = 1\n for experiment in qobj.experiments:\n if getattr(experiment.config, 'shots', 1) != 1:\n logger.info(\"statevector simulator only supports 1 shot. \"\n \"Setting shots=1 for circuit %s.\", experiment.name)\n experiment.config.shots = 1\n for op in experiment.instructions:\n if op.name in ['measure', 'reset']:\n raise SimulatorError(\n \"In circuit {}: statevector simulator does not support \"\n \"measure or reset.\".format(experiment.header.name))\n", "path": "qiskit/backends/aer/statevector_simulator.py"}, {"content": "# -*- coding: utf-8 -*-\n\n# Copyright 2017, IBM.\n#\n# This source code is licensed under the Apache License, Version 2.0 found in\n# the LICENSE.txt file in the root directory of this source tree.\n\n# pylint: disable=invalid-name\n\n\"\"\"Contains a (slow) python statevector simulator.\n\nIt simulates the statevector through a quantum circuit. It is exponential in\nthe number of qubits.\n\nWe advise using the c++ simulator or online simulator for larger size systems.\n\nThe input is a qobj dictionary and the output is a Result object.\n\nThe input qobj to this simulator has no shots, no measures, no reset, no noise.\n\"\"\"\nimport logging\nimport uuid\n\nfrom qiskit.backends.aer.aerjob import AerJob\nfrom qiskit.backends.aer._simulatorerror import SimulatorError\nfrom qiskit.qobj import QobjInstruction\nfrom .qasm_simulator_py import QasmSimulatorPy\n\nlogger = logging.getLogger(__name__)\n\n\nclass StatevectorSimulatorPy(QasmSimulatorPy):\n \"\"\"Python statevector simulator.\"\"\"\n\n DEFAULT_CONFIGURATION = {\n 'name': 'statevector_simulator_py',\n 'url': 'https://github.com/QISKit/qiskit-terra',\n 'simulator': True,\n 'local': True,\n 'description': 'A Python statevector simulator for qobj files',\n 'coupling_map': 'all-to-all',\n 'basis_gates': 'u1,u2,u3,cx,id,snapshot'\n }\n\n def __init__(self, configuration=None, provider=None):\n super().__init__(configuration=configuration or self.DEFAULT_CONFIGURATION.copy(),\n provider=provider)\n\n def run(self, qobj):\n \"\"\"Run qobj asynchronously.\n\n Args:\n qobj (dict): job description\n\n Returns:\n AerJob: derived from BaseJob\n \"\"\"\n job_id = str(uuid.uuid4())\n aer_job = AerJob(self, job_id, self._run_job, qobj)\n aer_job.submit()\n return aer_job\n\n def _run_job(self, job_id, qobj):\n \"\"\"Run a Qobj on the backend.\"\"\"\n self._validate(qobj)\n final_state_key = 32767 # Internal key for final state snapshot\n # Add final snapshots to circuits\n for experiment in qobj.experiments:\n experiment.instructions.append(\n QobjInstruction(name='snapshot', params=[final_state_key])\n )\n result = super()._run_job(job_id, qobj)\n # Replace backend name with current backend\n result.backend_name = self.name\n # Extract final state snapshot and move to 'statevector' data field\n for experiment_result in result.results.values():\n snapshots = experiment_result.snapshots\n if str(final_state_key) in snapshots:\n final_state_key = str(final_state_key)\n # Pop off final snapshot added above\n final_state = snapshots.pop(final_state_key, None)\n final_state = final_state['statevector'][0]\n # Add final state to results data\n experiment_result.data['statevector'] = final_state\n # Remove snapshot dict if empty\n if snapshots == {}:\n experiment_result.data.pop('snapshots', None)\n return result\n\n def _validate(self, qobj):\n \"\"\"Semantic validations of the qobj which cannot be done via schemas.\n Some of these may later move to backend schemas.\n\n 1. No shots\n 2. No measurements in the middle\n \"\"\"\n if qobj.config.shots != 1:\n logger.info(\"statevector simulator only supports 1 shot. \"\n \"Setting shots=1.\")\n qobj.config.shots = 1\n for experiment in qobj.experiments:\n if getattr(experiment.config, 'shots', 1) != 1:\n logger.info(\"statevector simulator only supports 1 shot. \"\n \"Setting shots=1 for circuit %s.\", experiment.name)\n experiment.config.shots = 1\n for op in experiment.instructions:\n if op.name in ['measure', 'reset']:\n raise SimulatorError(\n \"In circuit {}: statevector simulator does not support \"\n \"measure or reset.\".format(experiment.header.name))\n", "path": "qiskit/backends/aer/statevector_simulator_py.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Copyright 2017, IBM.\n#\n# This source code is licensed under the Apache License, Version 2.0 found in\n# the LICENSE.txt file in the root directory of this source tree.\n\n# pylint: disable=invalid-name\n\n\"\"\"\nInterface to C++ quantum circuit simulator with realistic noise.\n\"\"\"\n\nimport logging\nimport uuid\n\nfrom qiskit.qobj import QobjInstruction\nfrom .qasm_simulator import QasmSimulator\nfrom ._simulatorerror import SimulatorError\nfrom .aerjob import AerJob\n\nlogger = logging.getLogger(__name__)\n\n\nclass StatevectorSimulator(QasmSimulator):\n \"\"\"C++ statevector simulator\"\"\"\n\n DEFAULT_CONFIGURATION = {\n 'name': 'statevector_simulator',\n 'url': 'https://github.com/QISKit/qiskit-terra/src/qasm-simulator-cpp',\n 'simulator': True,\n 'local': True,\n 'description': 'A C++ statevector simulator for qobj files',\n 'coupling_map': 'all-to-all',\n 'basis_gates': 'u1,u2,u3,cx,cz,id,x,y,z,h,s,sdg,t,tdg,rzz,load,save,snapshot'\n }\n\n def __init__(self, configuration=None, provider=None):\n super().__init__(configuration=configuration or self.DEFAULT_CONFIGURATION.copy(),\n provider=provider)\n\n def run(self, qobj):\n \"\"\"Run a qobj on the the backend.\"\"\"\n job_id = str(uuid.uuid4())\n aer_job = AerJob(self, job_id, self._run_job, qobj)\n aer_job.submit()\n return aer_job\n\n def _run_job(self, job_id, qobj):\n \"\"\"Run a Qobj on the backend.\"\"\"\n self._validate(qobj)\n final_state_key = 32767 # Internal key for final state snapshot\n # Add final snapshots to circuits\n for experiment in qobj.experiments:\n experiment.instructions.append(\n QobjInstruction(name='snapshot', params=[final_state_key])\n )\n result = super()._run_job(job_id, qobj)\n # Extract final state snapshot and move to 'statevector' data field\n for experiment_result in result.results.values():\n snapshots = experiment_result.snapshots\n if str(final_state_key) in snapshots:\n final_state_key = str(final_state_key)\n # Pop off final snapshot added above\n final_state = snapshots.pop(final_state_key, None)\n final_state = final_state['statevector'][0]\n # Add final state to results data\n experiment_result.data['statevector'] = final_state\n # Remove snapshot dict if empty\n if snapshots == {}:\n experiment_result.data.pop('snapshots', None)\n return result\n\n def _validate(self, qobj):\n \"\"\"Semantic validations of the qobj which cannot be done via schemas.\n Some of these may later move to backend schemas.\n\n 1. No shots\n 2. No measurements in the middle\n \"\"\"\n if qobj.config.shots != 1:\n logger.info(\"statevector simulator only supports 1 shot. \"\n \"Setting shots=1.\")\n qobj.config.shots = 1\n for experiment in qobj.experiments:\n if getattr(experiment.config, 'shots', 1) != 1:\n logger.info(\"statevector simulator only supports 1 shot. \"\n \"Setting shots=1 for circuit %s.\", experiment.name)\n experiment.config.shots = 1\n for op in experiment.instructions:\n if op.name in ['measure', 'reset']:\n raise SimulatorError(\n \"In circuit {}: statevector simulator does not support \"\n \"measure or reset.\".format(experiment.header.name))\n", "path": "qiskit/backends/aer/statevector_simulator.py"}, {"content": "# -*- coding: utf-8 -*-\n\n# Copyright 2017, IBM.\n#\n# This source code is licensed under the Apache License, Version 2.0 found in\n# the LICENSE.txt file in the root directory of this source tree.\n\n# pylint: disable=invalid-name\n\n\"\"\"Contains a (slow) python statevector simulator.\n\nIt simulates the statevector through a quantum circuit. It is exponential in\nthe number of qubits.\n\nWe advise using the c++ simulator or online simulator for larger size systems.\n\nThe input is a qobj dictionary and the output is a Result object.\n\nThe input qobj to this simulator has no shots, no measures, no reset, no noise.\n\"\"\"\nimport logging\nimport uuid\n\nfrom qiskit.backends.aer.aerjob import AerJob\nfrom qiskit.backends.aer._simulatorerror import SimulatorError\nfrom qiskit.qobj import QobjInstruction\nfrom .qasm_simulator_py import QasmSimulatorPy\n\nlogger = logging.getLogger(__name__)\n\n\nclass StatevectorSimulatorPy(QasmSimulatorPy):\n \"\"\"Python statevector simulator.\"\"\"\n\n DEFAULT_CONFIGURATION = {\n 'name': 'statevector_simulator_py',\n 'url': 'https://github.com/QISKit/qiskit-terra',\n 'simulator': True,\n 'local': True,\n 'description': 'A Python statevector simulator for qobj files',\n 'coupling_map': 'all-to-all',\n 'basis_gates': 'u1,u2,u3,cx,id,snapshot'\n }\n\n def __init__(self, configuration=None, provider=None):\n super().__init__(configuration=configuration or self.DEFAULT_CONFIGURATION.copy(),\n provider=provider)\n\n def run(self, qobj):\n \"\"\"Run qobj asynchronously.\n\n Args:\n qobj (dict): job description\n\n Returns:\n AerJob: derived from BaseJob\n \"\"\"\n job_id = str(uuid.uuid4())\n aer_job = AerJob(self, job_id, self._run_job, qobj)\n aer_job.submit()\n return aer_job\n\n def _run_job(self, job_id, qobj):\n \"\"\"Run a Qobj on the backend.\"\"\"\n self._validate(qobj)\n final_state_key = 32767 # Internal key for final state snapshot\n # Add final snapshots to circuits\n for experiment in qobj.experiments:\n experiment.instructions.append(\n QobjInstruction(name='snapshot', params=[final_state_key])\n )\n result = super()._run_job(job_id, qobj)\n # Extract final state snapshot and move to 'statevector' data field\n for experiment_result in result.results.values():\n snapshots = experiment_result.snapshots\n if str(final_state_key) in snapshots:\n final_state_key = str(final_state_key)\n # Pop off final snapshot added above\n final_state = snapshots.pop(final_state_key, None)\n final_state = final_state['statevector'][0]\n # Add final state to results data\n experiment_result.data['statevector'] = final_state\n # Remove snapshot dict if empty\n if snapshots == {}:\n experiment_result.data.pop('snapshots', None)\n return result\n\n def _validate(self, qobj):\n \"\"\"Semantic validations of the qobj which cannot be done via schemas.\n Some of these may later move to backend schemas.\n\n 1. No shots\n 2. No measurements in the middle\n \"\"\"\n if qobj.config.shots != 1:\n logger.info(\"statevector simulator only supports 1 shot. \"\n \"Setting shots=1.\")\n qobj.config.shots = 1\n for experiment in qobj.experiments:\n if getattr(experiment.config, 'shots', 1) != 1:\n logger.info(\"statevector simulator only supports 1 shot. \"\n \"Setting shots=1 for circuit %s.\", experiment.name)\n experiment.config.shots = 1\n for op in experiment.instructions:\n if op.name in ['measure', 'reset']:\n raise SimulatorError(\n \"In circuit {}: statevector simulator does not support \"\n \"measure or reset.\".format(experiment.header.name))\n", "path": "qiskit/backends/aer/statevector_simulator_py.py"}]} | 2,940 | 310 |
gh_patches_debug_28338 | rasdani/github-patches | git_diff | chainer__chainer-1511 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
get_cifar100() does not work
`get_cifar100()` causes the following error.
```
>>> chainer.datasets.get_cifar100()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/chainer/datasets/cifar.py", line 84, in get_cifar100
raw = _retrieve_cifar('cifar-100')
File "/usr/local/lib/python2.7/dist-packages/chainer/datasets/cifar.py", line 145, in _retrieve_cifar
return download.cache_or_load_file(path, creator, numpy.load)
File "/usr/local/lib/python2.7/dist-packages/chainer/dataset/download.py", line 145, in cache_or_load_file
content = creator(temp_path)
File "/usr/local/lib/python2.7/dist-packages/chainer/datasets/cifar.py", line 127, in creator
d = _pickle_load(archive.extractfile(file_name))
File "/usr/lib/python2.7/tarfile.py", line 2143, in extractfile
tarinfo = self.getmember(member)
File "/usr/lib/python2.7/tarfile.py", line 1827, in getmember
raise KeyError("filename %r not found" % name)
KeyError: "filename 'cifar-100-batches-py/data_batch_1' not found"
```
cifar-100's directory structure seems to be different from cifar-10's.
```
$ tar xvzf cifar-100-python.tar.gz
cifar-100-python/
cifar-100-python/file.txt~
cifar-100-python/train
cifar-100-python/test
cifar-100-python/meta
$ tar xvzf cifar-10-python.tar.gz
cifar-10-batches-py/
cifar-10-batches-py/data_batch_4
cifar-10-batches-py/readme.html
cifar-10-batches-py/test_batch
cifar-10-batches-py/data_batch_3
cifar-10-batches-py/batches.meta
cifar-10-batches-py/data_batch_2
cifar-10-batches-py/data_batch_5
cifar-10-batches-py/data_batch_1
```
They should not be retrieved with the same logic.
https://github.com/pfnet/chainer/blob/master/chainer/datasets/cifar.py#L126
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `chainer/datasets/cifar.py`
Content:
```
1 import os
2 import sys
3 import tarfile
4
5 import numpy
6 import six.moves.cPickle as pickle
7
8 from chainer.dataset import download
9 from chainer.datasets import tuple_dataset
10
11
12 def get_cifar10(withlabel=True, ndim=3, scale=1.):
13 """Gets the CIFAR-10 dataset.
14
15 `CIFAR-10 <https://www.cs.toronto.edu/~kriz/cifar.html>`_ is a set of small
16 natural images. Each example is an RGB color image of size 32x32,
17 classified into 10 groups. In the original images, each component of pixels
18 is represented by one-byte unsigned integer. This function scales the
19 components to floating point values in the interval ``[0, scale]``.
20
21 This function returns the training set and the test set of the official
22 CIFAR-10 dataset. If ``withlabel`` is ``True``, each dataset consists of
23 tuples of images and labels, otherwise it only consists of images.
24
25 Args:
26 withlabel (bool): If ``True``, it returns datasets with labels. In this
27 case, each example is a tuple of an image and a label. Otherwise,
28 the datasets only contain images.
29 ndim (int): Number of dimensions of each image. The shape of each image
30 is determined depending on ndim as follows:
31
32 - ``ndim == 1``: the shape is ``(3072,)``
33 - ``ndim == 3``: the shape is ``(3, 32, 32)``
34
35 scale (float): Pixel value scale. If it is 1 (default), pixels are
36 scaled to the interval ``[0, 1]``.
37
38 Returns:
39 A tuple of two datasets. If ``withlabel`` is ``True``, both datasets
40 are :class:`~chainer.datasets.TupleDataset` instances. Otherwise, both
41 datasets are arrays of images.
42
43 """
44 raw = _retrieve_cifar('cifar-10')
45 train = _preprocess_cifar(raw['train_x'], raw['train_y'],
46 withlabel, ndim, scale)
47 test = _preprocess_cifar(raw['test_x'], raw['test_y'],
48 withlabel, ndim, scale)
49 return train, test
50
51
52 def get_cifar100(withlabel=True, ndim=3, scale=1.):
53 """Gets the CIFAR-100 dataset.
54
55 `CIFAR-100 <https://www.cs.toronto.edu/~kriz/cifar.html>`_ is a set of
56 small natural images. Each example is an RGB color image of size 32x32,
57 classified into 100 groups. In the original images, each component
58 pixels is represented by one-byte unsigned integer. This function scales
59 the components to floating point values in the interval ``[0, scale]``.
60
61 This function returns the training set and the test set of the official
62 CIFAR-100 dataset. If ``withlabel`` is ``True``, each dataset consists of
63 tuples of images and labels, otherwise it only consists of images.
64
65 Args:
66 withlabel (bool): If ``True``, it returns datasets with labels. In this
67 case, each example is a tuple of an image and a label. Otherwise,
68 the datasets only contain images.
69 ndim (int): Number of dimensions of each image. The shape of each image
70 is determined depending on ndim as follows:
71
72 - ``ndim == 1``: the shape is ``(3072,)``
73 - ``ndim == 3``: the shape is ``(3, 32, 32)``
74
75 scale (float): Pixel value scale. If it is 1 (default), pixels are
76 scaled to the interval ``[0, 1]``.
77
78 Returns:
79 A tuple of two datasets. If ``withlabel`` is ``True``, both
80 are :class:`~chainer.datasets.TupleDataset` instances. Otherwise, both
81 datasets are arrays of images.
82
83 """
84 raw = _retrieve_cifar('cifar-100')
85 train = _preprocess_cifar(raw['train_x'], raw['train_y'],
86 withlabel, ndim, scale)
87 test = _preprocess_cifar(raw['test_x'], raw['test_y'],
88 withlabel, ndim, scale)
89 return train, test
90
91
92 def _preprocess_cifar(images, labels, withlabel, ndim, scale):
93 if ndim == 1:
94 images = images.reshape(-1, 3072)
95 elif ndim == 3:
96 images = images.reshape(-1, 3, 32, 32)
97 else:
98 raise ValueError('invalid ndim for CIFAR dataset')
99 images = images.astype(numpy.float32)
100 images *= scale / 255.
101
102 if withlabel:
103 labels = labels.astype(numpy.int32)
104 return tuple_dataset.TupleDataset(images, labels)
105 else:
106 return images
107
108
109 def _retrieve_cifar(name):
110 root = download.get_dataset_directory('pfnet/chainer/cifar')
111 path = os.path.join(root, '{}.npz'.format(name))
112 url = 'https://www.cs.toronto.edu/~kriz/{}-python.tar.gz'.format(name)
113
114 def creator(path):
115 archive_path = download.cached_download(url)
116
117 train_x = numpy.empty((5, 10000, 3072), dtype=numpy.uint8)
118 train_y = numpy.empty((5, 10000), dtype=numpy.uint8)
119 test_y = numpy.empty(10000, dtype=numpy.uint8)
120
121 dir_name = '{}-batches-py'.format(name)
122
123 with tarfile.open(archive_path, 'r:gz') as archive:
124 # training set
125 for i in range(5):
126 file_name = '{}/data_batch_{}'.format(dir_name, i + 1)
127 d = _pickle_load(archive.extractfile(file_name))
128 train_x[i] = d['data']
129 train_y[i] = d['labels']
130
131 # test set
132 file_name = '{}/test_batch'.format(dir_name)
133 d = _pickle_load(archive.extractfile(file_name))
134 test_x = d['data']
135 test_y[...] = d['labels'] # copy to array
136
137 train_x = train_x.reshape(50000, 3072)
138 train_y = train_y.reshape(50000)
139
140 numpy.savez_compressed(path, train_x=train_x, train_y=train_y,
141 test_x=test_x, test_y=test_y)
142 return {'train_x': train_x, 'train_y': train_y,
143 'test_x': test_x, 'test_y': test_y}
144
145 return download.cache_or_load_file(path, creator, numpy.load)
146
147
148 def _pickle_load(f):
149 if sys.version_info > (3, ):
150 # python3
151 return pickle.load(f, encoding='latin-1')
152 else:
153 # python2
154 return pickle.load(f)
155
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/chainer/datasets/cifar.py b/chainer/datasets/cifar.py
--- a/chainer/datasets/cifar.py
+++ b/chainer/datasets/cifar.py
@@ -81,7 +81,7 @@
datasets are arrays of images.
"""
- raw = _retrieve_cifar('cifar-100')
+ raw = _retrieve_cifar_100()
train = _preprocess_cifar(raw['train_x'], raw['train_y'],
withlabel, ndim, scale)
test = _preprocess_cifar(raw['test_x'], raw['test_y'],
@@ -106,6 +106,32 @@
return images
+def _retrieve_cifar_100():
+ root = download.get_dataset_directory('pfnet/chainer/cifar')
+ path = os.path.join(root, 'cifar-100.npz')
+ url = 'https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz'
+
+ def creator(path):
+
+ def load(archive, file_name):
+ d = _pickle_load(archive.extractfile(file_name))
+ x = d['data'].reshape((-1, 3072))
+ y = numpy.array(d['fine_labels'], dtype=numpy.uint8)
+ return x, y
+
+ archive_path = download.cached_download(url)
+ with tarfile.open(archive_path, 'r:gz') as archive:
+ train_x, train_y = load(archive, 'cifar-100-python/train')
+ test_x, test_y = load(archive, 'cifar-100-python/test')
+
+ numpy.savez_compressed(path, train_x=train_x, train_y=train_y,
+ test_x=test_x, test_y=test_y)
+ return {'train_x': train_x, 'train_y': train_y,
+ 'test_x': test_x, 'test_y': test_y}
+
+ return download.cache_or_load_file(path, creator, numpy.load)
+
+
def _retrieve_cifar(name):
root = download.get_dataset_directory('pfnet/chainer/cifar')
path = os.path.join(root, '{}.npz'.format(name))
| {"golden_diff": "diff --git a/chainer/datasets/cifar.py b/chainer/datasets/cifar.py\n--- a/chainer/datasets/cifar.py\n+++ b/chainer/datasets/cifar.py\n@@ -81,7 +81,7 @@\n datasets are arrays of images.\n \n \"\"\"\n- raw = _retrieve_cifar('cifar-100')\n+ raw = _retrieve_cifar_100()\n train = _preprocess_cifar(raw['train_x'], raw['train_y'],\n withlabel, ndim, scale)\n test = _preprocess_cifar(raw['test_x'], raw['test_y'],\n@@ -106,6 +106,32 @@\n return images\n \n \n+def _retrieve_cifar_100():\n+ root = download.get_dataset_directory('pfnet/chainer/cifar')\n+ path = os.path.join(root, 'cifar-100.npz')\n+ url = 'https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz'\n+\n+ def creator(path):\n+\n+ def load(archive, file_name):\n+ d = _pickle_load(archive.extractfile(file_name))\n+ x = d['data'].reshape((-1, 3072))\n+ y = numpy.array(d['fine_labels'], dtype=numpy.uint8)\n+ return x, y\n+\n+ archive_path = download.cached_download(url)\n+ with tarfile.open(archive_path, 'r:gz') as archive:\n+ train_x, train_y = load(archive, 'cifar-100-python/train')\n+ test_x, test_y = load(archive, 'cifar-100-python/test')\n+\n+ numpy.savez_compressed(path, train_x=train_x, train_y=train_y,\n+ test_x=test_x, test_y=test_y)\n+ return {'train_x': train_x, 'train_y': train_y,\n+ 'test_x': test_x, 'test_y': test_y}\n+\n+ return download.cache_or_load_file(path, creator, numpy.load)\n+\n+\n def _retrieve_cifar(name):\n root = download.get_dataset_directory('pfnet/chainer/cifar')\n path = os.path.join(root, '{}.npz'.format(name))\n", "issue": "get_cifar100() does not work\n`get_cifar100()` causes the following error.\n\n```\n>>> chainer.datasets.get_cifar100()\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/usr/local/lib/python2.7/dist-packages/chainer/datasets/cifar.py\", line 84, in get_cifar100\n raw = _retrieve_cifar('cifar-100')\n File \"/usr/local/lib/python2.7/dist-packages/chainer/datasets/cifar.py\", line 145, in _retrieve_cifar\n return download.cache_or_load_file(path, creator, numpy.load)\n File \"/usr/local/lib/python2.7/dist-packages/chainer/dataset/download.py\", line 145, in cache_or_load_file\n content = creator(temp_path)\n File \"/usr/local/lib/python2.7/dist-packages/chainer/datasets/cifar.py\", line 127, in creator\n d = _pickle_load(archive.extractfile(file_name))\n File \"/usr/lib/python2.7/tarfile.py\", line 2143, in extractfile\n tarinfo = self.getmember(member)\n File \"/usr/lib/python2.7/tarfile.py\", line 1827, in getmember\n raise KeyError(\"filename %r not found\" % name)\nKeyError: \"filename 'cifar-100-batches-py/data_batch_1' not found\"\n```\n\ncifar-100's directory structure seems to be different from cifar-10's.\n\n```\n$ tar xvzf cifar-100-python.tar.gz \ncifar-100-python/\ncifar-100-python/file.txt~\ncifar-100-python/train\ncifar-100-python/test\ncifar-100-python/meta\n\n$ tar xvzf cifar-10-python.tar.gz \ncifar-10-batches-py/\ncifar-10-batches-py/data_batch_4\ncifar-10-batches-py/readme.html\ncifar-10-batches-py/test_batch\ncifar-10-batches-py/data_batch_3\ncifar-10-batches-py/batches.meta\ncifar-10-batches-py/data_batch_2\ncifar-10-batches-py/data_batch_5\ncifar-10-batches-py/data_batch_1\n```\n\nThey should not be retrieved with the same logic.\nhttps://github.com/pfnet/chainer/blob/master/chainer/datasets/cifar.py#L126\n\n", "before_files": [{"content": "import os\nimport sys\nimport tarfile\n\nimport numpy\nimport six.moves.cPickle as pickle\n\nfrom chainer.dataset import download\nfrom chainer.datasets import tuple_dataset\n\n\ndef get_cifar10(withlabel=True, ndim=3, scale=1.):\n \"\"\"Gets the CIFAR-10 dataset.\n\n `CIFAR-10 <https://www.cs.toronto.edu/~kriz/cifar.html>`_ is a set of small\n natural images. Each example is an RGB color image of size 32x32,\n classified into 10 groups. In the original images, each component of pixels\n is represented by one-byte unsigned integer. This function scales the\n components to floating point values in the interval ``[0, scale]``.\n\n This function returns the training set and the test set of the official\n CIFAR-10 dataset. If ``withlabel`` is ``True``, each dataset consists of\n tuples of images and labels, otherwise it only consists of images.\n\n Args:\n withlabel (bool): If ``True``, it returns datasets with labels. In this\n case, each example is a tuple of an image and a label. Otherwise,\n the datasets only contain images.\n ndim (int): Number of dimensions of each image. The shape of each image\n is determined depending on ndim as follows:\n\n - ``ndim == 1``: the shape is ``(3072,)``\n - ``ndim == 3``: the shape is ``(3, 32, 32)``\n\n scale (float): Pixel value scale. If it is 1 (default), pixels are\n scaled to the interval ``[0, 1]``.\n\n Returns:\n A tuple of two datasets. If ``withlabel`` is ``True``, both datasets\n are :class:`~chainer.datasets.TupleDataset` instances. Otherwise, both\n datasets are arrays of images.\n\n \"\"\"\n raw = _retrieve_cifar('cifar-10')\n train = _preprocess_cifar(raw['train_x'], raw['train_y'],\n withlabel, ndim, scale)\n test = _preprocess_cifar(raw['test_x'], raw['test_y'],\n withlabel, ndim, scale)\n return train, test\n\n\ndef get_cifar100(withlabel=True, ndim=3, scale=1.):\n \"\"\"Gets the CIFAR-100 dataset.\n\n `CIFAR-100 <https://www.cs.toronto.edu/~kriz/cifar.html>`_ is a set of\n small natural images. Each example is an RGB color image of size 32x32,\n classified into 100 groups. In the original images, each component\n pixels is represented by one-byte unsigned integer. This function scales\n the components to floating point values in the interval ``[0, scale]``.\n\n This function returns the training set and the test set of the official\n CIFAR-100 dataset. If ``withlabel`` is ``True``, each dataset consists of\n tuples of images and labels, otherwise it only consists of images.\n\n Args:\n withlabel (bool): If ``True``, it returns datasets with labels. In this\n case, each example is a tuple of an image and a label. Otherwise,\n the datasets only contain images.\n ndim (int): Number of dimensions of each image. The shape of each image\n is determined depending on ndim as follows:\n\n - ``ndim == 1``: the shape is ``(3072,)``\n - ``ndim == 3``: the shape is ``(3, 32, 32)``\n\n scale (float): Pixel value scale. If it is 1 (default), pixels are\n scaled to the interval ``[0, 1]``.\n\n Returns:\n A tuple of two datasets. If ``withlabel`` is ``True``, both\n are :class:`~chainer.datasets.TupleDataset` instances. Otherwise, both\n datasets are arrays of images.\n\n \"\"\"\n raw = _retrieve_cifar('cifar-100')\n train = _preprocess_cifar(raw['train_x'], raw['train_y'],\n withlabel, ndim, scale)\n test = _preprocess_cifar(raw['test_x'], raw['test_y'],\n withlabel, ndim, scale)\n return train, test\n\n\ndef _preprocess_cifar(images, labels, withlabel, ndim, scale):\n if ndim == 1:\n images = images.reshape(-1, 3072)\n elif ndim == 3:\n images = images.reshape(-1, 3, 32, 32)\n else:\n raise ValueError('invalid ndim for CIFAR dataset')\n images = images.astype(numpy.float32)\n images *= scale / 255.\n\n if withlabel:\n labels = labels.astype(numpy.int32)\n return tuple_dataset.TupleDataset(images, labels)\n else:\n return images\n\n\ndef _retrieve_cifar(name):\n root = download.get_dataset_directory('pfnet/chainer/cifar')\n path = os.path.join(root, '{}.npz'.format(name))\n url = 'https://www.cs.toronto.edu/~kriz/{}-python.tar.gz'.format(name)\n\n def creator(path):\n archive_path = download.cached_download(url)\n\n train_x = numpy.empty((5, 10000, 3072), dtype=numpy.uint8)\n train_y = numpy.empty((5, 10000), dtype=numpy.uint8)\n test_y = numpy.empty(10000, dtype=numpy.uint8)\n\n dir_name = '{}-batches-py'.format(name)\n\n with tarfile.open(archive_path, 'r:gz') as archive:\n # training set\n for i in range(5):\n file_name = '{}/data_batch_{}'.format(dir_name, i + 1)\n d = _pickle_load(archive.extractfile(file_name))\n train_x[i] = d['data']\n train_y[i] = d['labels']\n\n # test set\n file_name = '{}/test_batch'.format(dir_name)\n d = _pickle_load(archive.extractfile(file_name))\n test_x = d['data']\n test_y[...] = d['labels'] # copy to array\n\n train_x = train_x.reshape(50000, 3072)\n train_y = train_y.reshape(50000)\n\n numpy.savez_compressed(path, train_x=train_x, train_y=train_y,\n test_x=test_x, test_y=test_y)\n return {'train_x': train_x, 'train_y': train_y,\n 'test_x': test_x, 'test_y': test_y}\n\n return download.cache_or_load_file(path, creator, numpy.load)\n\n\ndef _pickle_load(f):\n if sys.version_info > (3, ):\n # python3\n return pickle.load(f, encoding='latin-1')\n else:\n # python2\n return pickle.load(f)\n", "path": "chainer/datasets/cifar.py"}], "after_files": [{"content": "import os\nimport sys\nimport tarfile\n\nimport numpy\nimport six.moves.cPickle as pickle\n\nfrom chainer.dataset import download\nfrom chainer.datasets import tuple_dataset\n\n\ndef get_cifar10(withlabel=True, ndim=3, scale=1.):\n \"\"\"Gets the CIFAR-10 dataset.\n\n `CIFAR-10 <https://www.cs.toronto.edu/~kriz/cifar.html>`_ is a set of small\n natural images. Each example is an RGB color image of size 32x32,\n classified into 10 groups. In the original images, each component of pixels\n is represented by one-byte unsigned integer. This function scales the\n components to floating point values in the interval ``[0, scale]``.\n\n This function returns the training set and the test set of the official\n CIFAR-10 dataset. If ``withlabel`` is ``True``, each dataset consists of\n tuples of images and labels, otherwise it only consists of images.\n\n Args:\n withlabel (bool): If ``True``, it returns datasets with labels. In this\n case, each example is a tuple of an image and a label. Otherwise,\n the datasets only contain images.\n ndim (int): Number of dimensions of each image. The shape of each image\n is determined depending on ndim as follows:\n\n - ``ndim == 1``: the shape is ``(3072,)``\n - ``ndim == 3``: the shape is ``(3, 32, 32)``\n\n scale (float): Pixel value scale. If it is 1 (default), pixels are\n scaled to the interval ``[0, 1]``.\n\n Returns:\n A tuple of two datasets. If ``withlabel`` is ``True``, both datasets\n are :class:`~chainer.datasets.TupleDataset` instances. Otherwise, both\n datasets are arrays of images.\n\n \"\"\"\n raw = _retrieve_cifar('cifar-10')\n train = _preprocess_cifar(raw['train_x'], raw['train_y'],\n withlabel, ndim, scale)\n test = _preprocess_cifar(raw['test_x'], raw['test_y'],\n withlabel, ndim, scale)\n return train, test\n\n\ndef get_cifar100(withlabel=True, ndim=3, scale=1.):\n \"\"\"Gets the CIFAR-100 dataset.\n\n `CIFAR-100 <https://www.cs.toronto.edu/~kriz/cifar.html>`_ is a set of\n small natural images. Each example is an RGB color image of size 32x32,\n classified into 100 groups. In the original images, each component\n pixels is represented by one-byte unsigned integer. This function scales\n the components to floating point values in the interval ``[0, scale]``.\n\n This function returns the training set and the test set of the official\n CIFAR-100 dataset. If ``withlabel`` is ``True``, each dataset consists of\n tuples of images and labels, otherwise it only consists of images.\n\n Args:\n withlabel (bool): If ``True``, it returns datasets with labels. In this\n case, each example is a tuple of an image and a label. Otherwise,\n the datasets only contain images.\n ndim (int): Number of dimensions of each image. The shape of each image\n is determined depending on ndim as follows:\n\n - ``ndim == 1``: the shape is ``(3072,)``\n - ``ndim == 3``: the shape is ``(3, 32, 32)``\n\n scale (float): Pixel value scale. If it is 1 (default), pixels are\n scaled to the interval ``[0, 1]``.\n\n Returns:\n A tuple of two datasets. If ``withlabel`` is ``True``, both\n are :class:`~chainer.datasets.TupleDataset` instances. Otherwise, both\n datasets are arrays of images.\n\n \"\"\"\n raw = _retrieve_cifar_100()\n train = _preprocess_cifar(raw['train_x'], raw['train_y'],\n withlabel, ndim, scale)\n test = _preprocess_cifar(raw['test_x'], raw['test_y'],\n withlabel, ndim, scale)\n return train, test\n\n\ndef _preprocess_cifar(images, labels, withlabel, ndim, scale):\n if ndim == 1:\n images = images.reshape(-1, 3072)\n elif ndim == 3:\n images = images.reshape(-1, 3, 32, 32)\n else:\n raise ValueError('invalid ndim for CIFAR dataset')\n images = images.astype(numpy.float32)\n images *= scale / 255.\n\n if withlabel:\n labels = labels.astype(numpy.int32)\n return tuple_dataset.TupleDataset(images, labels)\n else:\n return images\n\n\ndef _retrieve_cifar_100():\n root = download.get_dataset_directory('pfnet/chainer/cifar')\n path = os.path.join(root, 'cifar-100.npz')\n url = 'https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz'\n\n def creator(path):\n\n def load(archive, file_name):\n d = _pickle_load(archive.extractfile(file_name))\n x = d['data'].reshape((-1, 3072))\n y = numpy.array(d['fine_labels'], dtype=numpy.uint8)\n return x, y\n\n archive_path = download.cached_download(url)\n with tarfile.open(archive_path, 'r:gz') as archive:\n train_x, train_y = load(archive, 'cifar-100-python/train')\n test_x, test_y = load(archive, 'cifar-100-python/test')\n\n numpy.savez_compressed(path, train_x=train_x, train_y=train_y,\n test_x=test_x, test_y=test_y)\n return {'train_x': train_x, 'train_y': train_y,\n 'test_x': test_x, 'test_y': test_y}\n\n return download.cache_or_load_file(path, creator, numpy.load)\n\n\ndef _retrieve_cifar(name):\n root = download.get_dataset_directory('pfnet/chainer/cifar')\n path = os.path.join(root, '{}.npz'.format(name))\n url = 'https://www.cs.toronto.edu/~kriz/{}-python.tar.gz'.format(name)\n\n def creator(path):\n archive_path = download.cached_download(url)\n\n train_x = numpy.empty((5, 10000, 3072), dtype=numpy.uint8)\n train_y = numpy.empty((5, 10000), dtype=numpy.uint8)\n test_y = numpy.empty(10000, dtype=numpy.uint8)\n\n dir_name = '{}-batches-py'.format(name)\n\n with tarfile.open(archive_path, 'r:gz') as archive:\n # training set\n for i in range(5):\n file_name = '{}/data_batch_{}'.format(dir_name, i + 1)\n d = _pickle_load(archive.extractfile(file_name))\n train_x[i] = d['data']\n train_y[i] = d['labels']\n\n # test set\n file_name = '{}/test_batch'.format(dir_name)\n d = _pickle_load(archive.extractfile(file_name))\n test_x = d['data']\n test_y[...] = d['labels'] # copy to array\n\n train_x = train_x.reshape(50000, 3072)\n train_y = train_y.reshape(50000)\n\n numpy.savez_compressed(path, train_x=train_x, train_y=train_y,\n test_x=test_x, test_y=test_y)\n return {'train_x': train_x, 'train_y': train_y,\n 'test_x': test_x, 'test_y': test_y}\n\n return download.cache_or_load_file(path, creator, numpy.load)\n\n\ndef _pickle_load(f):\n if sys.version_info > (3, ):\n # python3\n return pickle.load(f, encoding='latin-1')\n else:\n # python2\n return pickle.load(f)\n", "path": "chainer/datasets/cifar.py"}]} | 2,779 | 497 |
gh_patches_debug_13027 | rasdani/github-patches | git_diff | Gallopsled__pwntools-282 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
2 doctests in proc fail on debian
We have some tests that assume pid 1 is init , and they fail on debian versions that use systemd. I can't really think of tests that are less platform-specific, any suggestions?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pwnlib/util/proc.py`
Content:
```
1 import time, errno, logging
2 from .. import tubes
3
4 try:
5 import psutil
6 _ok_import = True
7 except ImportError:
8 _ok_import = False
9
10 log = logging.getLogger(__name__)
11
12 if _ok_import:
13 all_pids = psutil.pids
14
15 def pidof(target):
16 """pidof(target) -> int list
17
18 Get PID(s) of `target`. The returned PID(s) depends on the type of `target`:
19
20 - :class:`str`: PIDs of all processes with a name matching `target`.
21
22 - :class:`pwnlib.tubes.process.process`: singleton list of the PID of `target`.
23
24 - :class:`pwnlib.tubes.sock.sock`: singleton list of the PID at the
25 remote end of `target` if it is running on the host. Otherwise an
26 empty list.
27
28 Args:
29 target(object): The target whose PID(s) to find.
30
31 Returns:
32 A list of found PIDs.
33 """
34 if isinstance(target, tubes.sock.sock):
35 local = target.sock.getsockname()
36 remote = target.sock.getpeername()
37
38 def match(p):
39 return (c.raddr, c.laddr, c.status) == (local, remote, 'ESTABLISHED')
40
41 return [c.pid for c in psutil.net_connections() if match(c)]
42
43 elif isinstance(target, tubes.process.process):
44 return [target.proc.pid]
45
46 else:
47 return pid_by_name(target)
48
49 def pid_by_name(name):
50 """pid_by_name(name) -> int list
51
52 Args:
53 name (str): Name of program.
54
55 Returns:
56 List of PIDs matching `name` sorted by lifetime, youngest to oldest.
57
58 Example:
59 >>> 1 in pid_by_name('init')
60 True
61 >>> os.getpid() in pid_by_name(name(os.getpid()))
62 True
63 """
64 def match(p):
65 if p.name() == name:
66 return True
67 try:
68 if p.exe() == name:
69 return True
70 except:
71 pass
72 return False
73
74 return [p.pid for p in psutil.process_iter() if match(p)]
75
76 def name(pid):
77 """name(pid) -> str
78
79 Args:
80 pid (int): PID of the process.
81
82 Returns:
83 Name of process as listed in ``/proc/<pid>/status``.
84
85 Example:
86 >>> name(1)
87 'init'
88 """
89 return psutil.Process(pid).name()
90
91 def parent(pid):
92 """parent(pid) -> int
93
94 Args:
95 pid (int): PID of the process.
96
97 Returns:
98 Parent PID as listed in ``/proc/<pid>/status`` under ``PPid``,
99 or 0 if there is not parent.
100 """
101 try:
102 return psutil.Process(pid).parent().pid
103 except:
104 return 0
105
106 def children(ppid):
107 """children(ppid) -> int list
108
109 Args:
110 pid (int): PID of the process.
111
112 Returns:
113 List of PIDs of whose parent process is `pid`.
114 """
115 return [p.pid for p in psutil.Process(ppid).children()]
116
117 def ancestors(pid):
118 """ancestors(pid) -> int list
119
120 Args:
121 pid (int): PID of the process.
122
123 Returns:
124 List of PIDs of whose parent process is `pid` or an ancestor of `pid`.
125 """
126 pids = []
127 while pid != 0:
128 pids.append(pid)
129 pid = parent(pid)
130 return pids
131
132 def descendants(pid):
133 """descendants(pid) -> dict
134
135 Args:
136 pid (int): PID of the process.
137
138 Returns:
139 Dictionary mapping the PID of each child of `pid` to it's descendants.
140 """
141 this_pid = pid
142 allpids = all_pids()
143 ppids = {}
144 def _parent(pid):
145 if pid not in ppids:
146 ppids[pid] = parent(pid)
147 return ppids[pid]
148 def _children(ppid):
149 return [pid for pid in allpids if _parent(pid) == ppid]
150 def _loop(ppid):
151 return {pid: _loop(pid) for pid in _children(ppid)}
152 return _loop(pid)
153
154 def exe(pid):
155 """exe(pid) -> str
156
157 Args:
158 pid (int): PID of the process.
159
160 Returns:
161 The path of the binary of the process. I.e. what ``/proc/<pid>/exe`` points to.
162 """
163 return psutil.Process(pid).exe()
164
165 def cwd(pid):
166 """cwd(pid) -> str
167
168 Args:
169 pid (int): PID of the process.
170
171 Returns:
172 The path of the process's current working directory. I.e. what
173 ``/proc/<pid>/cwd`` points to.
174 """
175 return psutil.Process(pid).cwd()
176
177 def cmdline(pid):
178 """cmdline(pid) -> str list
179
180 Args:
181 pid (int): PID of the process.
182
183 Returns:
184 A list of the fields in ``/proc/<pid>/cmdline``.
185 """
186 return psutil.Process(pid).cmdline()
187
188 def stat(pid):
189 """stat(pid) -> str list
190
191 Args:
192 pid (int): PID of the process.
193
194 Returns:
195 A list of the values in ``/proc/<pid>/stat``, with the exception that ``(`` and ``)`` has been removed from around the process name.
196 """
197 with open('/proc/%d/stat' % pid) as fd:
198 s = fd.read()
199 # filenames can have ( and ) in them, dammit
200 i = s.find('(')
201 j = s.rfind(')')
202 name = s[i+1:j]
203 return s[:i].split() + [name] + s[j+1:].split()
204
205 def starttime(pid):
206 """starttime(pid) -> float
207
208 Args:
209 pid (int): PID of the process.
210
211 Returns:
212 The time (in seconds) the process started after system boot
213 """
214 return psutil.Process(pid).create_time() - psutil.boot_time()
215
216 def status(pid):
217 """status(pid) -> dict
218
219 Get the status of a process.
220
221 Args:
222 pid (int): PID of the process.
223
224 Returns:
225 The contents of ``/proc/<pid>/status`` as a dictionary.
226 """
227 out = {}
228 try:
229 with open('/proc/%d/status' % pid) as fd:
230 for line in fd:
231 i = line.index(':')
232 key = line[:i]
233 val = line[i + 2:-1] # initial :\t and trailing \n
234 out[key] = val
235 except OSError as e:
236 if e.errno == errno.ENOENT:
237 raise ValueError('No process with PID %d' % pid)
238 else:
239 raise
240 return out
241
242 def tracer(pid):
243 """tracer(pid) -> int
244
245 Args:
246 pid (int): PID of the process.
247
248 Returns:
249 PID of the process tracing `pid`, or None if no `pid` is not being traced.
250
251 Example:
252 >>> tracer(os.getpid()) is None
253 True
254 """
255 tpid = int(status(pid)['TracerPid'])
256 return tpid if tpid > 0 else None
257
258 def state(pid):
259 """state(pid) -> str
260
261 Args:
262 pid (int): PID of the process.
263
264 Returns:
265 State of the process as listed in ``/proc/<pid>/status``. See `proc(5)` for details.
266
267 Example:
268 >>> state(os.getpid())
269 'R (running)'
270 """
271 return status(pid)['State']
272
273 def wait_for_debugger(pid):
274 """wait_for_debugger(pid) -> None
275
276 Sleeps until the process with PID `pid` is being traced.
277
278 Args:
279 pid (int): PID of the process.
280
281 Returns:
282 None
283 """
284 with log.waitfor('Waiting for debugger') as l:
285 while tracer(pid) is None:
286 time.sleep(0.01)
287 l.success()
288
289 if not _ok_import:
290 def _make_stub(func):
291 func.__doc__ = 'Stubbed out function, because psutil is not available.'
292 return func
293
294 @_make_stub
295 def all_pids():
296 log.error("Called stubbed-out function. Get psutil to work on your platform, then come back.")
297
298 @_make_stub
299 def pidof(target):
300 log.error("Called stubbed-out function. Get psutil to work on your platform, then come back.")
301
302 @_make_stub
303 def pid_by_name(name):
304 log.error("Called stubbed-out function. Get psutil to work on your platform, then come back.")
305
306 @_make_stub
307 def name(pid):
308 log.error("Called stubbed-out function. Get psutil to work on your platform, then come back.")
309
310 @_make_stub
311 def parent(pid):
312 log.error("Called stubbed-out function. Get psutil to work on your platform, then come back.")
313
314 @_make_stub
315 def children(ppid):
316 log.error("Called stubbed-out function. Get psutil to work on your platform, then come back.")
317
318 @_make_stub
319 def ancestors(pid):
320 log.error("Called stubbed-out function. Get psutil to work on your platform, then come back.")
321
322 @_make_stub
323 def descendants(pid):
324 log.error("Called stubbed-out function. Get psutil to work on your platform, then come back.")
325
326 @_make_stub
327 def exe(pid):
328 log.error("Called stubbed-out function. Get psutil to work on your platform, then come back.")
329
330 @_make_stub
331 def cwd(pid):
332 log.error("Called stubbed-out function. Get psutil to work on your platform, then come back.")
333
334 @_make_stub
335 def cmdline(pid):
336 log.error("Called stubbed-out function. Get psutil to work on your platform, then come back.")
337
338 @_make_stub
339 def starttime(pid):
340 log.error("Called stubbed-out function. Get psutil to work on your platform, then come back.")
341
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pwnlib/util/proc.py b/pwnlib/util/proc.py
--- a/pwnlib/util/proc.py
+++ b/pwnlib/util/proc.py
@@ -56,8 +56,6 @@
List of PIDs matching `name` sorted by lifetime, youngest to oldest.
Example:
- >>> 1 in pid_by_name('init')
- True
>>> os.getpid() in pid_by_name(name(os.getpid()))
True
"""
@@ -83,8 +81,8 @@
Name of process as listed in ``/proc/<pid>/status``.
Example:
- >>> name(1)
- 'init'
+ >>> name(os.getpid()) == os.path.basename(sys.argv[0])
+ True
"""
return psutil.Process(pid).name()
| {"golden_diff": "diff --git a/pwnlib/util/proc.py b/pwnlib/util/proc.py\n--- a/pwnlib/util/proc.py\n+++ b/pwnlib/util/proc.py\n@@ -56,8 +56,6 @@\n List of PIDs matching `name` sorted by lifetime, youngest to oldest.\n \n Example:\n- >>> 1 in pid_by_name('init')\n- True\n >>> os.getpid() in pid_by_name(name(os.getpid()))\n True\n \"\"\"\n@@ -83,8 +81,8 @@\n Name of process as listed in ``/proc/<pid>/status``.\n \n Example:\n- >>> name(1)\n- 'init'\n+ >>> name(os.getpid()) == os.path.basename(sys.argv[0])\n+ True\n \"\"\"\n return psutil.Process(pid).name()\n", "issue": "2 doctests in proc fail on debian\nWe have some tests that assume pid 1 is init , and they fail on debian versions that use systemd. I can't really think of tests that are less platform-specific, any suggestions?\n\n", "before_files": [{"content": "import time, errno, logging\nfrom .. import tubes\n\ntry:\n import psutil\n _ok_import = True\nexcept ImportError:\n _ok_import = False\n\nlog = logging.getLogger(__name__)\n\nif _ok_import:\n all_pids = psutil.pids\n\ndef pidof(target):\n \"\"\"pidof(target) -> int list\n\n Get PID(s) of `target`. The returned PID(s) depends on the type of `target`:\n\n - :class:`str`: PIDs of all processes with a name matching `target`.\n\n - :class:`pwnlib.tubes.process.process`: singleton list of the PID of `target`.\n\n - :class:`pwnlib.tubes.sock.sock`: singleton list of the PID at the\n remote end of `target` if it is running on the host. Otherwise an\n empty list.\n\n Args:\n target(object): The target whose PID(s) to find.\n\n Returns:\n A list of found PIDs.\n \"\"\"\n if isinstance(target, tubes.sock.sock):\n local = target.sock.getsockname()\n remote = target.sock.getpeername()\n\n def match(p):\n return (c.raddr, c.laddr, c.status) == (local, remote, 'ESTABLISHED')\n\n return [c.pid for c in psutil.net_connections() if match(c)]\n\n elif isinstance(target, tubes.process.process):\n return [target.proc.pid]\n\n else:\n return pid_by_name(target)\n\ndef pid_by_name(name):\n \"\"\"pid_by_name(name) -> int list\n\n Args:\n name (str): Name of program.\n\n Returns:\n List of PIDs matching `name` sorted by lifetime, youngest to oldest.\n\n Example:\n >>> 1 in pid_by_name('init')\n True\n >>> os.getpid() in pid_by_name(name(os.getpid()))\n True\n \"\"\"\n def match(p):\n if p.name() == name:\n return True\n try:\n if p.exe() == name:\n return True\n except:\n pass\n return False\n\n return [p.pid for p in psutil.process_iter() if match(p)]\n\ndef name(pid):\n \"\"\"name(pid) -> str\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n Name of process as listed in ``/proc/<pid>/status``.\n\n Example:\n >>> name(1)\n 'init'\n \"\"\"\n return psutil.Process(pid).name()\n\ndef parent(pid):\n \"\"\"parent(pid) -> int\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n Parent PID as listed in ``/proc/<pid>/status`` under ``PPid``,\n or 0 if there is not parent.\n \"\"\"\n try:\n return psutil.Process(pid).parent().pid\n except:\n return 0\n\ndef children(ppid):\n \"\"\"children(ppid) -> int list\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n List of PIDs of whose parent process is `pid`.\n \"\"\"\n return [p.pid for p in psutil.Process(ppid).children()]\n\ndef ancestors(pid):\n \"\"\"ancestors(pid) -> int list\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n List of PIDs of whose parent process is `pid` or an ancestor of `pid`.\n \"\"\"\n pids = []\n while pid != 0:\n pids.append(pid)\n pid = parent(pid)\n return pids\n\ndef descendants(pid):\n \"\"\"descendants(pid) -> dict\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n Dictionary mapping the PID of each child of `pid` to it's descendants.\n \"\"\"\n this_pid = pid\n allpids = all_pids()\n ppids = {}\n def _parent(pid):\n if pid not in ppids:\n ppids[pid] = parent(pid)\n return ppids[pid]\n def _children(ppid):\n return [pid for pid in allpids if _parent(pid) == ppid]\n def _loop(ppid):\n return {pid: _loop(pid) for pid in _children(ppid)}\n return _loop(pid)\n\ndef exe(pid):\n \"\"\"exe(pid) -> str\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n The path of the binary of the process. I.e. what ``/proc/<pid>/exe`` points to.\n \"\"\"\n return psutil.Process(pid).exe()\n\ndef cwd(pid):\n \"\"\"cwd(pid) -> str\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n The path of the process's current working directory. I.e. what\n ``/proc/<pid>/cwd`` points to.\n \"\"\"\n return psutil.Process(pid).cwd()\n\ndef cmdline(pid):\n \"\"\"cmdline(pid) -> str list\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n A list of the fields in ``/proc/<pid>/cmdline``.\n \"\"\"\n return psutil.Process(pid).cmdline()\n\ndef stat(pid):\n \"\"\"stat(pid) -> str list\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n A list of the values in ``/proc/<pid>/stat``, with the exception that ``(`` and ``)`` has been removed from around the process name.\n \"\"\"\n with open('/proc/%d/stat' % pid) as fd:\n s = fd.read()\n # filenames can have ( and ) in them, dammit\n i = s.find('(')\n j = s.rfind(')')\n name = s[i+1:j]\n return s[:i].split() + [name] + s[j+1:].split()\n\ndef starttime(pid):\n \"\"\"starttime(pid) -> float\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n The time (in seconds) the process started after system boot\n \"\"\"\n return psutil.Process(pid).create_time() - psutil.boot_time()\n\ndef status(pid):\n \"\"\"status(pid) -> dict\n\n Get the status of a process.\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n The contents of ``/proc/<pid>/status`` as a dictionary.\n \"\"\"\n out = {}\n try:\n with open('/proc/%d/status' % pid) as fd:\n for line in fd:\n i = line.index(':')\n key = line[:i]\n val = line[i + 2:-1] # initial :\\t and trailing \\n\n out[key] = val\n except OSError as e:\n if e.errno == errno.ENOENT:\n raise ValueError('No process with PID %d' % pid)\n else:\n raise\n return out\n\ndef tracer(pid):\n \"\"\"tracer(pid) -> int\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n PID of the process tracing `pid`, or None if no `pid` is not being traced.\n\n Example:\n >>> tracer(os.getpid()) is None\n True\n \"\"\"\n tpid = int(status(pid)['TracerPid'])\n return tpid if tpid > 0 else None\n\ndef state(pid):\n \"\"\"state(pid) -> str\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n State of the process as listed in ``/proc/<pid>/status``. See `proc(5)` for details.\n\n Example:\n >>> state(os.getpid())\n 'R (running)'\n \"\"\"\n return status(pid)['State']\n\ndef wait_for_debugger(pid):\n \"\"\"wait_for_debugger(pid) -> None\n\n Sleeps until the process with PID `pid` is being traced.\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n None\n \"\"\"\n with log.waitfor('Waiting for debugger') as l:\n while tracer(pid) is None:\n time.sleep(0.01)\n l.success()\n\nif not _ok_import:\n def _make_stub(func):\n func.__doc__ = 'Stubbed out function, because psutil is not available.'\n return func\n\n @_make_stub\n def all_pids():\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def pidof(target):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def pid_by_name(name):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def name(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def parent(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def children(ppid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def ancestors(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def descendants(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def exe(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def cwd(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def cmdline(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def starttime(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n", "path": "pwnlib/util/proc.py"}], "after_files": [{"content": "import time, errno, logging\nfrom .. import tubes\n\ntry:\n import psutil\n _ok_import = True\nexcept ImportError:\n _ok_import = False\n\nlog = logging.getLogger(__name__)\n\nif _ok_import:\n all_pids = psutil.pids\n\ndef pidof(target):\n \"\"\"pidof(target) -> int list\n\n Get PID(s) of `target`. The returned PID(s) depends on the type of `target`:\n\n - :class:`str`: PIDs of all processes with a name matching `target`.\n\n - :class:`pwnlib.tubes.process.process`: singleton list of the PID of `target`.\n\n - :class:`pwnlib.tubes.sock.sock`: singleton list of the PID at the\n remote end of `target` if it is running on the host. Otherwise an\n empty list.\n\n Args:\n target(object): The target whose PID(s) to find.\n\n Returns:\n A list of found PIDs.\n \"\"\"\n if isinstance(target, tubes.sock.sock):\n local = target.sock.getsockname()\n remote = target.sock.getpeername()\n\n def match(p):\n return (c.raddr, c.laddr, c.status) == (local, remote, 'ESTABLISHED')\n\n return [c.pid for c in psutil.net_connections() if match(c)]\n\n elif isinstance(target, tubes.process.process):\n return [target.proc.pid]\n\n else:\n return pid_by_name(target)\n\ndef pid_by_name(name):\n \"\"\"pid_by_name(name) -> int list\n\n Args:\n name (str): Name of program.\n\n Returns:\n List of PIDs matching `name` sorted by lifetime, youngest to oldest.\n\n Example:\n >>> os.getpid() in pid_by_name(name(os.getpid()))\n True\n \"\"\"\n def match(p):\n if p.name() == name:\n return True\n try:\n if p.exe() == name:\n return True\n except:\n pass\n return False\n\n return [p.pid for p in psutil.process_iter() if match(p)]\n\ndef name(pid):\n \"\"\"name(pid) -> str\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n Name of process as listed in ``/proc/<pid>/status``.\n\n Example:\n >>> name(os.getpid()) == os.path.basename(sys.argv[0])\n True\n \"\"\"\n return psutil.Process(pid).name()\n\ndef parent(pid):\n \"\"\"parent(pid) -> int\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n Parent PID as listed in ``/proc/<pid>/status`` under ``PPid``,\n or 0 if there is not parent.\n \"\"\"\n try:\n return psutil.Process(pid).parent().pid\n except:\n return 0\n\ndef children(ppid):\n \"\"\"children(ppid) -> int list\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n List of PIDs of whose parent process is `pid`.\n \"\"\"\n return [p.pid for p in psutil.Process(ppid).children()]\n\ndef ancestors(pid):\n \"\"\"ancestors(pid) -> int list\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n List of PIDs of whose parent process is `pid` or an ancestor of `pid`.\n \"\"\"\n pids = []\n while pid != 0:\n pids.append(pid)\n pid = parent(pid)\n return pids\n\ndef descendants(pid):\n \"\"\"descendants(pid) -> dict\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n Dictionary mapping the PID of each child of `pid` to it's descendants.\n \"\"\"\n this_pid = pid\n allpids = all_pids()\n ppids = {}\n def _parent(pid):\n if pid not in ppids:\n ppids[pid] = parent(pid)\n return ppids[pid]\n def _children(ppid):\n return [pid for pid in allpids if _parent(pid) == ppid]\n def _loop(ppid):\n return {pid: _loop(pid) for pid in _children(ppid)}\n return _loop(pid)\n\ndef exe(pid):\n \"\"\"exe(pid) -> str\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n The path of the binary of the process. I.e. what ``/proc/<pid>/exe`` points to.\n \"\"\"\n return psutil.Process(pid).exe()\n\ndef cwd(pid):\n \"\"\"cwd(pid) -> str\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n The path of the process's current working directory. I.e. what\n ``/proc/<pid>/cwd`` points to.\n \"\"\"\n return psutil.Process(pid).cwd()\n\ndef cmdline(pid):\n \"\"\"cmdline(pid) -> str list\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n A list of the fields in ``/proc/<pid>/cmdline``.\n \"\"\"\n return psutil.Process(pid).cmdline()\n\ndef stat(pid):\n \"\"\"stat(pid) -> str list\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n A list of the values in ``/proc/<pid>/stat``, with the exception that ``(`` and ``)`` has been removed from around the process name.\n \"\"\"\n with open('/proc/%d/stat' % pid) as fd:\n s = fd.read()\n # filenames can have ( and ) in them, dammit\n i = s.find('(')\n j = s.rfind(')')\n name = s[i+1:j]\n return s[:i].split() + [name] + s[j+1:].split()\n\ndef starttime(pid):\n \"\"\"starttime(pid) -> float\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n The time (in seconds) the process started after system boot\n \"\"\"\n return psutil.Process(pid).create_time() - psutil.boot_time()\n\ndef status(pid):\n \"\"\"status(pid) -> dict\n\n Get the status of a process.\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n The contents of ``/proc/<pid>/status`` as a dictionary.\n \"\"\"\n out = {}\n try:\n with open('/proc/%d/status' % pid) as fd:\n for line in fd:\n i = line.index(':')\n key = line[:i]\n val = line[i + 2:-1] # initial :\\t and trailing \\n\n out[key] = val\n except OSError as e:\n if e.errno == errno.ENOENT:\n raise ValueError('No process with PID %d' % pid)\n else:\n raise\n return out\n\ndef tracer(pid):\n \"\"\"tracer(pid) -> int\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n PID of the process tracing `pid`, or None if no `pid` is not being traced.\n\n Example:\n >>> tracer(os.getpid()) is None\n True\n \"\"\"\n tpid = int(status(pid)['TracerPid'])\n return tpid if tpid > 0 else None\n\ndef state(pid):\n \"\"\"state(pid) -> str\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n State of the process as listed in ``/proc/<pid>/status``. See `proc(5)` for details.\n\n Example:\n >>> state(os.getpid())\n 'R (running)'\n \"\"\"\n return status(pid)['State']\n\ndef wait_for_debugger(pid):\n \"\"\"wait_for_debugger(pid) -> None\n\n Sleeps until the process with PID `pid` is being traced.\n\n Args:\n pid (int): PID of the process.\n\n Returns:\n None\n \"\"\"\n with log.waitfor('Waiting for debugger') as l:\n while tracer(pid) is None:\n time.sleep(0.01)\n l.success()\n\nif not _ok_import:\n def _make_stub(func):\n func.__doc__ = 'Stubbed out function, because psutil is not available.'\n return func\n\n @_make_stub\n def all_pids():\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def pidof(target):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def pid_by_name(name):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def name(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def parent(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def children(ppid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def ancestors(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def descendants(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def exe(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def cwd(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def cmdline(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n\n @_make_stub\n def starttime(pid):\n log.error(\"Called stubbed-out function. Get psutil to work on your platform, then come back.\")\n", "path": "pwnlib/util/proc.py"}]} | 3,466 | 185 |
gh_patches_debug_54796 | rasdani/github-patches | git_diff | encode__httpx-1357 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ASGITransport does not correctly simulate raw_path in the scope
I'm trying to switch Datasette's internal tests over to using `httpx` with `AsyncClient`.
This has almost worked perfectly, but I've run into one problem: it looks like the `ASGITransport` class used by the `AsyncClient(app=asgi_app)` mechanism does not correctly simulate the `raw_path` and `path` keys.
Here's the code in question: https://github.com/encode/httpx/blob/92ca4d0cc654859fc2257c492e55d8752370d427/httpx/_transports/asgi.py#L82-L97
As you can see, it's not populating `raw_path` even though that's part of the ASGI spec.
This matters for Datasette because it supports this URL: https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv - which refers to a SQLite database table called `table/with/slashes.csv` (a weird table name but that's test cases for you). The way it does this is through careful decoding of the `raw_path` ASGI scope variable.
Here are my notes when I first ran into this limitation of ASGITransport: https://github.com/simonw/datasette/pull/1000#issuecomment-705945591
ASGITransport does not correctly simulate raw_path in the scope
I'm trying to switch Datasette's internal tests over to using `httpx` with `AsyncClient`.
This has almost worked perfectly, but I've run into one problem: it looks like the `ASGITransport` class used by the `AsyncClient(app=asgi_app)` mechanism does not correctly simulate the `raw_path` and `path` keys.
Here's the code in question: https://github.com/encode/httpx/blob/92ca4d0cc654859fc2257c492e55d8752370d427/httpx/_transports/asgi.py#L82-L97
As you can see, it's not populating `raw_path` even though that's part of the ASGI spec.
This matters for Datasette because it supports this URL: https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv - which refers to a SQLite database table called `table/with/slashes.csv` (a weird table name but that's test cases for you). The way it does this is through careful decoding of the `raw_path` ASGI scope variable.
Here are my notes when I first ran into this limitation of ASGITransport: https://github.com/simonw/datasette/pull/1000#issuecomment-705945591
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `httpx/_transports/asgi.py`
Content:
```
1 from typing import TYPE_CHECKING, Callable, List, Optional, Tuple, Union
2 from urllib.parse import unquote
3
4 import httpcore
5 import sniffio
6
7 if TYPE_CHECKING: # pragma: no cover
8 import asyncio
9
10 import trio
11
12 Event = Union[asyncio.Event, trio.Event]
13
14
15 def create_event() -> "Event":
16 if sniffio.current_async_library() == "trio":
17 import trio
18
19 return trio.Event()
20 else:
21 import asyncio
22
23 return asyncio.Event()
24
25
26 class ASGITransport(httpcore.AsyncHTTPTransport):
27 """
28 A custom AsyncTransport that handles sending requests directly to an ASGI app.
29 The simplest way to use this functionality is to use the `app` argument.
30
31 ```
32 client = httpx.AsyncClient(app=app)
33 ```
34
35 Alternatively, you can setup the transport instance explicitly.
36 This allows you to include any additional configuration arguments specific
37 to the ASGITransport class:
38
39 ```
40 transport = httpx.ASGITransport(
41 app=app,
42 root_path="/submount",
43 client=("1.2.3.4", 123)
44 )
45 client = httpx.AsyncClient(transport=transport)
46 ```
47
48 Arguments:
49
50 * `app` - The ASGI application.
51 * `raise_app_exceptions` - Boolean indicating if exceptions in the application
52 should be raised. Default to `True`. Can be set to `False` for use cases
53 such as testing the content of a client 500 response.
54 * `root_path` - The root path on which the ASGI application should be mounted.
55 * `client` - A two-tuple indicating the client IP and port of incoming requests.
56 ```
57 """
58
59 def __init__(
60 self,
61 app: Callable,
62 raise_app_exceptions: bool = True,
63 root_path: str = "",
64 client: Tuple[str, int] = ("127.0.0.1", 123),
65 ) -> None:
66 self.app = app
67 self.raise_app_exceptions = raise_app_exceptions
68 self.root_path = root_path
69 self.client = client
70
71 async def arequest(
72 self,
73 method: bytes,
74 url: Tuple[bytes, bytes, Optional[int], bytes],
75 headers: List[Tuple[bytes, bytes]] = None,
76 stream: httpcore.AsyncByteStream = None,
77 ext: dict = None,
78 ) -> Tuple[int, List[Tuple[bytes, bytes]], httpcore.AsyncByteStream, dict]:
79 headers = [] if headers is None else headers
80 stream = httpcore.PlainByteStream(content=b"") if stream is None else stream
81
82 # ASGI scope.
83 scheme, host, port, full_path = url
84 path, _, query = full_path.partition(b"?")
85 scope = {
86 "type": "http",
87 "asgi": {"version": "3.0"},
88 "http_version": "1.1",
89 "method": method.decode(),
90 "headers": [(k.lower(), v) for (k, v) in headers],
91 "scheme": scheme.decode("ascii"),
92 "path": unquote(path.decode("ascii")),
93 "query_string": query,
94 "server": (host.decode("ascii"), port),
95 "client": self.client,
96 "root_path": self.root_path,
97 }
98
99 # Request.
100 request_body_chunks = stream.__aiter__()
101 request_complete = False
102
103 # Response.
104 status_code = None
105 response_headers = None
106 body_parts = []
107 response_started = False
108 response_complete = create_event()
109
110 # ASGI callables.
111
112 async def receive() -> dict:
113 nonlocal request_complete
114
115 if request_complete:
116 await response_complete.wait()
117 return {"type": "http.disconnect"}
118
119 try:
120 body = await request_body_chunks.__anext__()
121 except StopAsyncIteration:
122 request_complete = True
123 return {"type": "http.request", "body": b"", "more_body": False}
124 return {"type": "http.request", "body": body, "more_body": True}
125
126 async def send(message: dict) -> None:
127 nonlocal status_code, response_headers, response_started
128
129 if message["type"] == "http.response.start":
130 assert not response_started
131
132 status_code = message["status"]
133 response_headers = message.get("headers", [])
134 response_started = True
135
136 elif message["type"] == "http.response.body":
137 assert not response_complete.is_set()
138 body = message.get("body", b"")
139 more_body = message.get("more_body", False)
140
141 if body and method != b"HEAD":
142 body_parts.append(body)
143
144 if not more_body:
145 response_complete.set()
146
147 try:
148 await self.app(scope, receive, send)
149 except Exception:
150 if self.raise_app_exceptions or not response_complete.is_set():
151 raise
152
153 assert response_complete.is_set()
154 assert status_code is not None
155 assert response_headers is not None
156
157 stream = httpcore.PlainByteStream(content=b"".join(body_parts))
158 ext = {}
159
160 return (status_code, response_headers, stream, ext)
161
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/httpx/_transports/asgi.py b/httpx/_transports/asgi.py
--- a/httpx/_transports/asgi.py
+++ b/httpx/_transports/asgi.py
@@ -90,6 +90,7 @@
"headers": [(k.lower(), v) for (k, v) in headers],
"scheme": scheme.decode("ascii"),
"path": unquote(path.decode("ascii")),
+ "raw_path": path,
"query_string": query,
"server": (host.decode("ascii"), port),
"client": self.client,
| {"golden_diff": "diff --git a/httpx/_transports/asgi.py b/httpx/_transports/asgi.py\n--- a/httpx/_transports/asgi.py\n+++ b/httpx/_transports/asgi.py\n@@ -90,6 +90,7 @@\n \"headers\": [(k.lower(), v) for (k, v) in headers],\n \"scheme\": scheme.decode(\"ascii\"),\n \"path\": unquote(path.decode(\"ascii\")),\n+ \"raw_path\": path,\n \"query_string\": query,\n \"server\": (host.decode(\"ascii\"), port),\n \"client\": self.client,\n", "issue": "ASGITransport does not correctly simulate raw_path in the scope\nI'm trying to switch Datasette's internal tests over to using `httpx` with `AsyncClient`.\r\n\r\nThis has almost worked perfectly, but I've run into one problem: it looks like the `ASGITransport` class used by the `AsyncClient(app=asgi_app)` mechanism does not correctly simulate the `raw_path` and `path` keys.\r\n\r\nHere's the code in question: https://github.com/encode/httpx/blob/92ca4d0cc654859fc2257c492e55d8752370d427/httpx/_transports/asgi.py#L82-L97\r\n\r\nAs you can see, it's not populating `raw_path` even though that's part of the ASGI spec.\r\n\r\nThis matters for Datasette because it supports this URL: https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv - which refers to a SQLite database table called `table/with/slashes.csv` (a weird table name but that's test cases for you). The way it does this is through careful decoding of the `raw_path` ASGI scope variable.\r\n\r\nHere are my notes when I first ran into this limitation of ASGITransport: https://github.com/simonw/datasette/pull/1000#issuecomment-705945591\nASGITransport does not correctly simulate raw_path in the scope\nI'm trying to switch Datasette's internal tests over to using `httpx` with `AsyncClient`.\r\n\r\nThis has almost worked perfectly, but I've run into one problem: it looks like the `ASGITransport` class used by the `AsyncClient(app=asgi_app)` mechanism does not correctly simulate the `raw_path` and `path` keys.\r\n\r\nHere's the code in question: https://github.com/encode/httpx/blob/92ca4d0cc654859fc2257c492e55d8752370d427/httpx/_transports/asgi.py#L82-L97\r\n\r\nAs you can see, it's not populating `raw_path` even though that's part of the ASGI spec.\r\n\r\nThis matters for Datasette because it supports this URL: https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv - which refers to a SQLite database table called `table/with/slashes.csv` (a weird table name but that's test cases for you). The way it does this is through careful decoding of the `raw_path` ASGI scope variable.\r\n\r\nHere are my notes when I first ran into this limitation of ASGITransport: https://github.com/simonw/datasette/pull/1000#issuecomment-705945591\n", "before_files": [{"content": "from typing import TYPE_CHECKING, Callable, List, Optional, Tuple, Union\nfrom urllib.parse import unquote\n\nimport httpcore\nimport sniffio\n\nif TYPE_CHECKING: # pragma: no cover\n import asyncio\n\n import trio\n\n Event = Union[asyncio.Event, trio.Event]\n\n\ndef create_event() -> \"Event\":\n if sniffio.current_async_library() == \"trio\":\n import trio\n\n return trio.Event()\n else:\n import asyncio\n\n return asyncio.Event()\n\n\nclass ASGITransport(httpcore.AsyncHTTPTransport):\n \"\"\"\n A custom AsyncTransport that handles sending requests directly to an ASGI app.\n The simplest way to use this functionality is to use the `app` argument.\n\n ```\n client = httpx.AsyncClient(app=app)\n ```\n\n Alternatively, you can setup the transport instance explicitly.\n This allows you to include any additional configuration arguments specific\n to the ASGITransport class:\n\n ```\n transport = httpx.ASGITransport(\n app=app,\n root_path=\"/submount\",\n client=(\"1.2.3.4\", 123)\n )\n client = httpx.AsyncClient(transport=transport)\n ```\n\n Arguments:\n\n * `app` - The ASGI application.\n * `raise_app_exceptions` - Boolean indicating if exceptions in the application\n should be raised. Default to `True`. Can be set to `False` for use cases\n such as testing the content of a client 500 response.\n * `root_path` - The root path on which the ASGI application should be mounted.\n * `client` - A two-tuple indicating the client IP and port of incoming requests.\n ```\n \"\"\"\n\n def __init__(\n self,\n app: Callable,\n raise_app_exceptions: bool = True,\n root_path: str = \"\",\n client: Tuple[str, int] = (\"127.0.0.1\", 123),\n ) -> None:\n self.app = app\n self.raise_app_exceptions = raise_app_exceptions\n self.root_path = root_path\n self.client = client\n\n async def arequest(\n self,\n method: bytes,\n url: Tuple[bytes, bytes, Optional[int], bytes],\n headers: List[Tuple[bytes, bytes]] = None,\n stream: httpcore.AsyncByteStream = None,\n ext: dict = None,\n ) -> Tuple[int, List[Tuple[bytes, bytes]], httpcore.AsyncByteStream, dict]:\n headers = [] if headers is None else headers\n stream = httpcore.PlainByteStream(content=b\"\") if stream is None else stream\n\n # ASGI scope.\n scheme, host, port, full_path = url\n path, _, query = full_path.partition(b\"?\")\n scope = {\n \"type\": \"http\",\n \"asgi\": {\"version\": \"3.0\"},\n \"http_version\": \"1.1\",\n \"method\": method.decode(),\n \"headers\": [(k.lower(), v) for (k, v) in headers],\n \"scheme\": scheme.decode(\"ascii\"),\n \"path\": unquote(path.decode(\"ascii\")),\n \"query_string\": query,\n \"server\": (host.decode(\"ascii\"), port),\n \"client\": self.client,\n \"root_path\": self.root_path,\n }\n\n # Request.\n request_body_chunks = stream.__aiter__()\n request_complete = False\n\n # Response.\n status_code = None\n response_headers = None\n body_parts = []\n response_started = False\n response_complete = create_event()\n\n # ASGI callables.\n\n async def receive() -> dict:\n nonlocal request_complete\n\n if request_complete:\n await response_complete.wait()\n return {\"type\": \"http.disconnect\"}\n\n try:\n body = await request_body_chunks.__anext__()\n except StopAsyncIteration:\n request_complete = True\n return {\"type\": \"http.request\", \"body\": b\"\", \"more_body\": False}\n return {\"type\": \"http.request\", \"body\": body, \"more_body\": True}\n\n async def send(message: dict) -> None:\n nonlocal status_code, response_headers, response_started\n\n if message[\"type\"] == \"http.response.start\":\n assert not response_started\n\n status_code = message[\"status\"]\n response_headers = message.get(\"headers\", [])\n response_started = True\n\n elif message[\"type\"] == \"http.response.body\":\n assert not response_complete.is_set()\n body = message.get(\"body\", b\"\")\n more_body = message.get(\"more_body\", False)\n\n if body and method != b\"HEAD\":\n body_parts.append(body)\n\n if not more_body:\n response_complete.set()\n\n try:\n await self.app(scope, receive, send)\n except Exception:\n if self.raise_app_exceptions or not response_complete.is_set():\n raise\n\n assert response_complete.is_set()\n assert status_code is not None\n assert response_headers is not None\n\n stream = httpcore.PlainByteStream(content=b\"\".join(body_parts))\n ext = {}\n\n return (status_code, response_headers, stream, ext)\n", "path": "httpx/_transports/asgi.py"}], "after_files": [{"content": "from typing import TYPE_CHECKING, Callable, List, Optional, Tuple, Union\nfrom urllib.parse import unquote\n\nimport httpcore\nimport sniffio\n\nif TYPE_CHECKING: # pragma: no cover\n import asyncio\n\n import trio\n\n Event = Union[asyncio.Event, trio.Event]\n\n\ndef create_event() -> \"Event\":\n if sniffio.current_async_library() == \"trio\":\n import trio\n\n return trio.Event()\n else:\n import asyncio\n\n return asyncio.Event()\n\n\nclass ASGITransport(httpcore.AsyncHTTPTransport):\n \"\"\"\n A custom AsyncTransport that handles sending requests directly to an ASGI app.\n The simplest way to use this functionality is to use the `app` argument.\n\n ```\n client = httpx.AsyncClient(app=app)\n ```\n\n Alternatively, you can setup the transport instance explicitly.\n This allows you to include any additional configuration arguments specific\n to the ASGITransport class:\n\n ```\n transport = httpx.ASGITransport(\n app=app,\n root_path=\"/submount\",\n client=(\"1.2.3.4\", 123)\n )\n client = httpx.AsyncClient(transport=transport)\n ```\n\n Arguments:\n\n * `app` - The ASGI application.\n * `raise_app_exceptions` - Boolean indicating if exceptions in the application\n should be raised. Default to `True`. Can be set to `False` for use cases\n such as testing the content of a client 500 response.\n * `root_path` - The root path on which the ASGI application should be mounted.\n * `client` - A two-tuple indicating the client IP and port of incoming requests.\n ```\n \"\"\"\n\n def __init__(\n self,\n app: Callable,\n raise_app_exceptions: bool = True,\n root_path: str = \"\",\n client: Tuple[str, int] = (\"127.0.0.1\", 123),\n ) -> None:\n self.app = app\n self.raise_app_exceptions = raise_app_exceptions\n self.root_path = root_path\n self.client = client\n\n async def arequest(\n self,\n method: bytes,\n url: Tuple[bytes, bytes, Optional[int], bytes],\n headers: List[Tuple[bytes, bytes]] = None,\n stream: httpcore.AsyncByteStream = None,\n ext: dict = None,\n ) -> Tuple[int, List[Tuple[bytes, bytes]], httpcore.AsyncByteStream, dict]:\n headers = [] if headers is None else headers\n stream = httpcore.PlainByteStream(content=b\"\") if stream is None else stream\n\n # ASGI scope.\n scheme, host, port, full_path = url\n path, _, query = full_path.partition(b\"?\")\n scope = {\n \"type\": \"http\",\n \"asgi\": {\"version\": \"3.0\"},\n \"http_version\": \"1.1\",\n \"method\": method.decode(),\n \"headers\": [(k.lower(), v) for (k, v) in headers],\n \"scheme\": scheme.decode(\"ascii\"),\n \"path\": unquote(path.decode(\"ascii\")),\n \"raw_path\": path,\n \"query_string\": query,\n \"server\": (host.decode(\"ascii\"), port),\n \"client\": self.client,\n \"root_path\": self.root_path,\n }\n\n # Request.\n request_body_chunks = stream.__aiter__()\n request_complete = False\n\n # Response.\n status_code = None\n response_headers = None\n body_parts = []\n response_started = False\n response_complete = create_event()\n\n # ASGI callables.\n\n async def receive() -> dict:\n nonlocal request_complete\n\n if request_complete:\n await response_complete.wait()\n return {\"type\": \"http.disconnect\"}\n\n try:\n body = await request_body_chunks.__anext__()\n except StopAsyncIteration:\n request_complete = True\n return {\"type\": \"http.request\", \"body\": b\"\", \"more_body\": False}\n return {\"type\": \"http.request\", \"body\": body, \"more_body\": True}\n\n async def send(message: dict) -> None:\n nonlocal status_code, response_headers, response_started\n\n if message[\"type\"] == \"http.response.start\":\n assert not response_started\n\n status_code = message[\"status\"]\n response_headers = message.get(\"headers\", [])\n response_started = True\n\n elif message[\"type\"] == \"http.response.body\":\n assert not response_complete.is_set()\n body = message.get(\"body\", b\"\")\n more_body = message.get(\"more_body\", False)\n\n if body and method != b\"HEAD\":\n body_parts.append(body)\n\n if not more_body:\n response_complete.set()\n\n try:\n await self.app(scope, receive, send)\n except Exception:\n if self.raise_app_exceptions or not response_complete.is_set():\n raise\n\n assert response_complete.is_set()\n assert status_code is not None\n assert response_headers is not None\n\n stream = httpcore.PlainByteStream(content=b\"\".join(body_parts))\n ext = {}\n\n return (status_code, response_headers, stream, ext)\n", "path": "httpx/_transports/asgi.py"}]} | 2,393 | 129 |
gh_patches_debug_10125 | rasdani/github-patches | git_diff | electricitymaps__electricitymaps-contrib-1060 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
MISO <-> Canada interconnector
..needs to be updated as it still pointing to Montana instead of MISO

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `parsers/CA_ON.py`
Content:
```
1 #!/usr/bin/env python3
2
3 # The arrow library is used to handle datetimes
4 import arrow
5 # The request library is used to fetch content through HTTP
6 import requests
7
8 from bs4 import BeautifulSoup
9
10 MAP_GENERATION = {
11 'BIOFUEL': 'biomass',
12 'GAS': 'gas',
13 'HYDRO': 'hydro',
14 'NUCLEAR': 'nuclear',
15 'SOLAR': 'solar',
16 'WIND': 'wind'
17 }
18
19 timezone = 'Canada/Eastern'
20
21
22 def fetch_production(country_code='CA-ON', session=None):
23 """Requests the last known production mix (in MW) of a given country
24
25 Arguments:
26 country_code (optional) -- used in case a parser is able to fetch multiple countries
27 session (optional) -- request session passed in order to re-use an existing session
28
29 Return:
30 A dictionary in the form:
31 {
32 'countryCode': 'FR',
33 'datetime': '2017-01-01T00:00:00Z',
34 'production': {
35 'biomass': 0.0,
36 'coal': 0.0,
37 'gas': 0.0,
38 'hydro': 0.0,
39 'nuclear': null,
40 'oil': 0.0,
41 'solar': 0.0,
42 'wind': 0.0,
43 'geothermal': 0.0,
44 'unknown': 0.0
45 },
46 'storage': {
47 'hydro': -10.0,
48 },
49 'source': 'mysource.com'
50 }
51 """
52 r = session or requests.session()
53 url = 'http://www.ieso.ca/-/media/files/ieso/uploaded/chart/generation_fuel_type_multiday.xml?la=en'
54 response = r.get(url)
55 soup = BeautifulSoup(response.text, 'html.parser')
56
57 data = {}
58
59 start_datetime = arrow.get(
60 arrow.get(soup.find_all('startdate')[0].contents[0]).datetime, timezone)
61
62 # Iterate over all datasets (production types)
63 for item in soup.find_all('dataset'):
64 key = item.attrs['series']
65 for rowIndex, row in enumerate(item.find_all('value')):
66 if not len(row.contents):
67 continue
68 if rowIndex not in data:
69 data[rowIndex] = {
70 'datetime': start_datetime.replace(hours=+rowIndex).datetime,
71 'countryCode': country_code,
72 'production': {
73 'coal': 0
74 },
75 'storage': {},
76 'source': 'ieso.ca',
77 }
78 data[rowIndex]['production'][MAP_GENERATION[key]] = \
79 float(row.contents[0])
80
81 return [data[k] for k in sorted(data.keys())]
82
83
84 def fetch_price(country_code='CA-ON', session=None):
85 """Requests the last known power price of a given country
86
87 Arguments:
88 country_code (optional) -- used in case a parser is able to fetch multiple countries
89 session (optional) -- request session passed in order to re-use an existing session
90
91 Return:
92 A dictionary in the form:
93 {
94 'countryCode': 'FR',
95 'currency': EUR,
96 'datetime': '2017-01-01T00:00:00Z',
97 'price': 0.0,
98 'source': 'mysource.com'
99 }
100 """
101
102 r = session or requests.session()
103 url = 'http://www.ieso.ca/-/media/files/ieso/uploaded/chart/price_multiday.xml?la=en'
104 response = r.get(url)
105 soup = BeautifulSoup(response.text, 'html.parser')
106
107 data = {}
108
109 start_datetime = arrow.get(
110 arrow.get(soup.find_all('startdate')[0].contents[0]).datetime, timezone)
111
112 # Iterate over all datasets (production types)
113 for item in soup.find_all('dataset'):
114 key = item.attrs['series']
115 if key != 'HOEP':
116 continue
117 for rowIndex, row in enumerate(item.find_all('value')):
118 if not len(row.contents):
119 continue
120 if rowIndex not in data:
121 data[rowIndex] = {
122 'datetime': start_datetime.replace(hours=+rowIndex).datetime,
123 'countryCode': country_code,
124 'currency': 'CAD',
125 'source': 'ieso.ca',
126 }
127 data[rowIndex]['price'] = \
128 float(row.contents[0])
129
130 return [data[k] for k in sorted(data.keys())]
131
132 return data
133
134
135 def fetch_exchange(country_code1, country_code2, session=None):
136 """Requests the last known power exchange (in MW) between two countries
137
138 Arguments:
139 country_code (optional) -- used in case a parser is able to fetch multiple countries
140 session (optional) -- request session passed in order to re-use an existing session
141
142 Return:
143 A dictionary in the form:
144 {
145 'sortedCountryCodes': 'DK->NO',
146 'datetime': '2017-01-01T00:00:00Z',
147 'netFlow': 0.0,
148 'source': 'mysource.com'
149 }
150 """
151
152 r = session or requests.session()
153 url = 'http://live.gridwatch.ca/WebServices/GridWatchWebApp.asmx/GetHomeViewData_v2'
154 response = r.get(url)
155 obj = response.json()
156 exchanges = obj['intertieLineData']
157
158 sortedCountryCodes = '->'.join(sorted([country_code1, country_code2]))
159 # Everything -> CA_ON corresponds to an import to ON
160 # In the data, "net" represents an export
161 # So everything -> CA_ON must be reversed
162 if sortedCountryCodes == 'CA-MB->CA-ON':
163 keys = ['MANITOBA', 'MANITOBA SK']
164 direction = -1
165 elif sortedCountryCodes == 'CA-ON->US-NY':
166 keys = ['NEW-YORK']
167 direction = 1
168 elif sortedCountryCodes == 'CA-ON->US-MI':
169 keys = ['MICHIGAN']
170 direction = 1
171 elif sortedCountryCodes == 'CA-ON->US-MN':
172 keys = ['MINNESOTA']
173 direction = 1
174 elif sortedCountryCodes == 'CA-ON->CA-QC':
175 keys = filter(lambda k: k[:2] == 'PQ', exchanges.keys())
176 direction = 1
177 else:
178 raise NotImplementedError('This exchange pair is not implemented')
179
180 data = {
181 'datetime': max(map(lambda x: arrow.get(arrow.get(
182 exchanges[x]['dateReported']).datetime, timezone).datetime, keys)),
183 'sortedCountryCodes': sortedCountryCodes,
184 'netFlow': sum(map(lambda x: float(exchanges[x]['net'].replace(',', '')), keys)) * direction,
185 'source': 'gridwatch.ca'
186 }
187
188 return data
189
190
191 if __name__ == '__main__':
192 """Main method, never used by the Electricity Map backend, but handy for testing."""
193
194 print('fetch_production() ->')
195 print(fetch_production())
196 print('fetch_price() ->')
197 print(fetch_price())
198 print('fetch_exchange("CA-ON", "US-NY") ->')
199 print(fetch_exchange("CA-ON", "US-NY"))
200
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/parsers/CA_ON.py b/parsers/CA_ON.py
--- a/parsers/CA_ON.py
+++ b/parsers/CA_ON.py
@@ -165,11 +165,8 @@
elif sortedCountryCodes == 'CA-ON->US-NY':
keys = ['NEW-YORK']
direction = 1
- elif sortedCountryCodes == 'CA-ON->US-MI':
- keys = ['MICHIGAN']
- direction = 1
- elif sortedCountryCodes == 'CA-ON->US-MN':
- keys = ['MINNESOTA']
+ elif sortedCountryCodes == 'CA-ON->US-MISO':
+ keys = ['MICHIGAN', 'MINNESOTA']
direction = 1
elif sortedCountryCodes == 'CA-ON->CA-QC':
keys = filter(lambda k: k[:2] == 'PQ', exchanges.keys())
| {"golden_diff": "diff --git a/parsers/CA_ON.py b/parsers/CA_ON.py\n--- a/parsers/CA_ON.py\n+++ b/parsers/CA_ON.py\n@@ -165,11 +165,8 @@\n elif sortedCountryCodes == 'CA-ON->US-NY':\n keys = ['NEW-YORK']\n direction = 1\n- elif sortedCountryCodes == 'CA-ON->US-MI':\n- keys = ['MICHIGAN']\n- direction = 1\n- elif sortedCountryCodes == 'CA-ON->US-MN':\n- keys = ['MINNESOTA']\n+ elif sortedCountryCodes == 'CA-ON->US-MISO':\n+ keys = ['MICHIGAN', 'MINNESOTA']\n direction = 1\n elif sortedCountryCodes == 'CA-ON->CA-QC':\n keys = filter(lambda k: k[:2] == 'PQ', exchanges.keys())\n", "issue": "MISO <-> Canada interconnector\n..needs to be updated as it still pointing to Montana instead of MISO\r\n\r\n\n", "before_files": [{"content": "#!/usr/bin/env python3\n\n# The arrow library is used to handle datetimes\nimport arrow\n# The request library is used to fetch content through HTTP\nimport requests\n\nfrom bs4 import BeautifulSoup\n\nMAP_GENERATION = {\n 'BIOFUEL': 'biomass',\n 'GAS': 'gas',\n 'HYDRO': 'hydro',\n 'NUCLEAR': 'nuclear',\n 'SOLAR': 'solar',\n 'WIND': 'wind'\n}\n\ntimezone = 'Canada/Eastern'\n\n\ndef fetch_production(country_code='CA-ON', session=None):\n \"\"\"Requests the last known production mix (in MW) of a given country\n\n Arguments:\n country_code (optional) -- used in case a parser is able to fetch multiple countries\n session (optional) -- request session passed in order to re-use an existing session\n\n Return:\n A dictionary in the form:\n {\n 'countryCode': 'FR',\n 'datetime': '2017-01-01T00:00:00Z',\n 'production': {\n 'biomass': 0.0,\n 'coal': 0.0,\n 'gas': 0.0,\n 'hydro': 0.0,\n 'nuclear': null,\n 'oil': 0.0,\n 'solar': 0.0,\n 'wind': 0.0,\n 'geothermal': 0.0,\n 'unknown': 0.0\n },\n 'storage': {\n 'hydro': -10.0,\n },\n 'source': 'mysource.com'\n }\n \"\"\"\n r = session or requests.session()\n url = 'http://www.ieso.ca/-/media/files/ieso/uploaded/chart/generation_fuel_type_multiday.xml?la=en'\n response = r.get(url)\n soup = BeautifulSoup(response.text, 'html.parser')\n\n data = {}\n\n start_datetime = arrow.get(\n arrow.get(soup.find_all('startdate')[0].contents[0]).datetime, timezone)\n\n # Iterate over all datasets (production types)\n for item in soup.find_all('dataset'):\n key = item.attrs['series']\n for rowIndex, row in enumerate(item.find_all('value')):\n if not len(row.contents):\n continue\n if rowIndex not in data:\n data[rowIndex] = {\n 'datetime': start_datetime.replace(hours=+rowIndex).datetime,\n 'countryCode': country_code,\n 'production': {\n 'coal': 0\n },\n 'storage': {},\n 'source': 'ieso.ca',\n }\n data[rowIndex]['production'][MAP_GENERATION[key]] = \\\n float(row.contents[0])\n\n return [data[k] for k in sorted(data.keys())]\n\n\ndef fetch_price(country_code='CA-ON', session=None):\n \"\"\"Requests the last known power price of a given country\n\n Arguments:\n country_code (optional) -- used in case a parser is able to fetch multiple countries\n session (optional) -- request session passed in order to re-use an existing session\n\n Return:\n A dictionary in the form:\n {\n 'countryCode': 'FR',\n 'currency': EUR,\n 'datetime': '2017-01-01T00:00:00Z',\n 'price': 0.0,\n 'source': 'mysource.com'\n }\n \"\"\"\n\n r = session or requests.session()\n url = 'http://www.ieso.ca/-/media/files/ieso/uploaded/chart/price_multiday.xml?la=en'\n response = r.get(url)\n soup = BeautifulSoup(response.text, 'html.parser')\n\n data = {}\n\n start_datetime = arrow.get(\n arrow.get(soup.find_all('startdate')[0].contents[0]).datetime, timezone)\n\n # Iterate over all datasets (production types)\n for item in soup.find_all('dataset'):\n key = item.attrs['series']\n if key != 'HOEP':\n continue\n for rowIndex, row in enumerate(item.find_all('value')):\n if not len(row.contents):\n continue\n if rowIndex not in data:\n data[rowIndex] = {\n 'datetime': start_datetime.replace(hours=+rowIndex).datetime,\n 'countryCode': country_code,\n 'currency': 'CAD',\n 'source': 'ieso.ca',\n }\n data[rowIndex]['price'] = \\\n float(row.contents[0])\n\n return [data[k] for k in sorted(data.keys())]\n\n return data\n\n\ndef fetch_exchange(country_code1, country_code2, session=None):\n \"\"\"Requests the last known power exchange (in MW) between two countries\n\n Arguments:\n country_code (optional) -- used in case a parser is able to fetch multiple countries\n session (optional) -- request session passed in order to re-use an existing session\n\n Return:\n A dictionary in the form:\n {\n 'sortedCountryCodes': 'DK->NO',\n 'datetime': '2017-01-01T00:00:00Z',\n 'netFlow': 0.0,\n 'source': 'mysource.com'\n }\n \"\"\"\n\n r = session or requests.session()\n url = 'http://live.gridwatch.ca/WebServices/GridWatchWebApp.asmx/GetHomeViewData_v2'\n response = r.get(url)\n obj = response.json()\n exchanges = obj['intertieLineData']\n\n sortedCountryCodes = '->'.join(sorted([country_code1, country_code2]))\n # Everything -> CA_ON corresponds to an import to ON\n # In the data, \"net\" represents an export\n # So everything -> CA_ON must be reversed\n if sortedCountryCodes == 'CA-MB->CA-ON':\n keys = ['MANITOBA', 'MANITOBA SK']\n direction = -1\n elif sortedCountryCodes == 'CA-ON->US-NY':\n keys = ['NEW-YORK']\n direction = 1\n elif sortedCountryCodes == 'CA-ON->US-MI':\n keys = ['MICHIGAN']\n direction = 1\n elif sortedCountryCodes == 'CA-ON->US-MN':\n keys = ['MINNESOTA']\n direction = 1\n elif sortedCountryCodes == 'CA-ON->CA-QC':\n keys = filter(lambda k: k[:2] == 'PQ', exchanges.keys())\n direction = 1\n else:\n raise NotImplementedError('This exchange pair is not implemented')\n\n data = {\n 'datetime': max(map(lambda x: arrow.get(arrow.get(\n exchanges[x]['dateReported']).datetime, timezone).datetime, keys)),\n 'sortedCountryCodes': sortedCountryCodes,\n 'netFlow': sum(map(lambda x: float(exchanges[x]['net'].replace(',', '')), keys)) * direction,\n 'source': 'gridwatch.ca'\n }\n\n return data\n\n\nif __name__ == '__main__':\n \"\"\"Main method, never used by the Electricity Map backend, but handy for testing.\"\"\"\n\n print('fetch_production() ->')\n print(fetch_production())\n print('fetch_price() ->')\n print(fetch_price())\n print('fetch_exchange(\"CA-ON\", \"US-NY\") ->')\n print(fetch_exchange(\"CA-ON\", \"US-NY\"))\n", "path": "parsers/CA_ON.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\n# The arrow library is used to handle datetimes\nimport arrow\n# The request library is used to fetch content through HTTP\nimport requests\n\nfrom bs4 import BeautifulSoup\n\nMAP_GENERATION = {\n 'BIOFUEL': 'biomass',\n 'GAS': 'gas',\n 'HYDRO': 'hydro',\n 'NUCLEAR': 'nuclear',\n 'SOLAR': 'solar',\n 'WIND': 'wind'\n}\n\ntimezone = 'Canada/Eastern'\n\n\ndef fetch_production(country_code='CA-ON', session=None):\n \"\"\"Requests the last known production mix (in MW) of a given country\n\n Arguments:\n country_code (optional) -- used in case a parser is able to fetch multiple countries\n session (optional) -- request session passed in order to re-use an existing session\n\n Return:\n A dictionary in the form:\n {\n 'countryCode': 'FR',\n 'datetime': '2017-01-01T00:00:00Z',\n 'production': {\n 'biomass': 0.0,\n 'coal': 0.0,\n 'gas': 0.0,\n 'hydro': 0.0,\n 'nuclear': null,\n 'oil': 0.0,\n 'solar': 0.0,\n 'wind': 0.0,\n 'geothermal': 0.0,\n 'unknown': 0.0\n },\n 'storage': {\n 'hydro': -10.0,\n },\n 'source': 'mysource.com'\n }\n \"\"\"\n r = session or requests.session()\n url = 'http://www.ieso.ca/-/media/files/ieso/uploaded/chart/generation_fuel_type_multiday.xml?la=en'\n response = r.get(url)\n soup = BeautifulSoup(response.text, 'html.parser')\n\n data = {}\n\n start_datetime = arrow.get(\n arrow.get(soup.find_all('startdate')[0].contents[0]).datetime, timezone)\n\n # Iterate over all datasets (production types)\n for item in soup.find_all('dataset'):\n key = item.attrs['series']\n for rowIndex, row in enumerate(item.find_all('value')):\n if not len(row.contents):\n continue\n if rowIndex not in data:\n data[rowIndex] = {\n 'datetime': start_datetime.replace(hours=+rowIndex).datetime,\n 'countryCode': country_code,\n 'production': {\n 'coal': 0\n },\n 'storage': {},\n 'source': 'ieso.ca',\n }\n data[rowIndex]['production'][MAP_GENERATION[key]] = \\\n float(row.contents[0])\n\n return [data[k] for k in sorted(data.keys())]\n\n\ndef fetch_price(country_code='CA-ON', session=None):\n \"\"\"Requests the last known power price of a given country\n\n Arguments:\n country_code (optional) -- used in case a parser is able to fetch multiple countries\n session (optional) -- request session passed in order to re-use an existing session\n\n Return:\n A dictionary in the form:\n {\n 'countryCode': 'FR',\n 'currency': EUR,\n 'datetime': '2017-01-01T00:00:00Z',\n 'price': 0.0,\n 'source': 'mysource.com'\n }\n \"\"\"\n\n r = session or requests.session()\n url = 'http://www.ieso.ca/-/media/files/ieso/uploaded/chart/price_multiday.xml?la=en'\n response = r.get(url)\n soup = BeautifulSoup(response.text, 'html.parser')\n\n data = {}\n\n start_datetime = arrow.get(\n arrow.get(soup.find_all('startdate')[0].contents[0]).datetime, timezone)\n\n # Iterate over all datasets (production types)\n for item in soup.find_all('dataset'):\n key = item.attrs['series']\n if key != 'HOEP':\n continue\n for rowIndex, row in enumerate(item.find_all('value')):\n if not len(row.contents):\n continue\n if rowIndex not in data:\n data[rowIndex] = {\n 'datetime': start_datetime.replace(hours=+rowIndex).datetime,\n 'countryCode': country_code,\n 'currency': 'CAD',\n 'source': 'ieso.ca',\n }\n data[rowIndex]['price'] = \\\n float(row.contents[0])\n\n return [data[k] for k in sorted(data.keys())]\n\n return data\n\n\ndef fetch_exchange(country_code1, country_code2, session=None):\n \"\"\"Requests the last known power exchange (in MW) between two countries\n\n Arguments:\n country_code (optional) -- used in case a parser is able to fetch multiple countries\n session (optional) -- request session passed in order to re-use an existing session\n\n Return:\n A dictionary in the form:\n {\n 'sortedCountryCodes': 'DK->NO',\n 'datetime': '2017-01-01T00:00:00Z',\n 'netFlow': 0.0,\n 'source': 'mysource.com'\n }\n \"\"\"\n\n r = session or requests.session()\n url = 'http://live.gridwatch.ca/WebServices/GridWatchWebApp.asmx/GetHomeViewData_v2'\n response = r.get(url)\n obj = response.json()\n exchanges = obj['intertieLineData']\n\n sortedCountryCodes = '->'.join(sorted([country_code1, country_code2]))\n # Everything -> CA_ON corresponds to an import to ON\n # In the data, \"net\" represents an export\n # So everything -> CA_ON must be reversed\n if sortedCountryCodes == 'CA-MB->CA-ON':\n keys = ['MANITOBA', 'MANITOBA SK']\n direction = -1\n elif sortedCountryCodes == 'CA-ON->US-NY':\n keys = ['NEW-YORK']\n direction = 1\n elif sortedCountryCodes == 'CA-ON->US-MISO':\n keys = ['MICHIGAN', 'MINNESOTA']\n direction = 1\n elif sortedCountryCodes == 'CA-ON->CA-QC':\n keys = filter(lambda k: k[:2] == 'PQ', exchanges.keys())\n direction = 1\n else:\n raise NotImplementedError('This exchange pair is not implemented')\n\n data = {\n 'datetime': max(map(lambda x: arrow.get(arrow.get(\n exchanges[x]['dateReported']).datetime, timezone).datetime, keys)),\n 'sortedCountryCodes': sortedCountryCodes,\n 'netFlow': sum(map(lambda x: float(exchanges[x]['net'].replace(',', '')), keys)) * direction,\n 'source': 'gridwatch.ca'\n }\n\n return data\n\n\nif __name__ == '__main__':\n \"\"\"Main method, never used by the Electricity Map backend, but handy for testing.\"\"\"\n\n print('fetch_production() ->')\n print(fetch_production())\n print('fetch_price() ->')\n print(fetch_price())\n print('fetch_exchange(\"CA-ON\", \"US-NY\") ->')\n print(fetch_exchange(\"CA-ON\", \"US-NY\"))\n", "path": "parsers/CA_ON.py"}]} | 2,456 | 209 |
gh_patches_debug_18228 | rasdani/github-patches | git_diff | bridgecrewio__checkov-3393 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[ERROR] Failed to run check: Ensure Application Gateway WAF prevents message lookup in Log4j2.
**Describe the issue**
While performing checkov scan on our terraform dir, we are running into this error:
`2022-07-18 10:09:32,794 [MainThread ] [ERROR] Failed to run check: Ensure Application Gateway WAF prevents message lookup in Log4j2. See CVE-2021-44228 aka log4jshell for configuration: {'custom_rules': [[]], 'location': ['westeurope'], 'managed_rules': [{'exclusion': [[]], 'managed_rule_set': [{'rule_group_override': [[]], 'type': ['OWASP'], 'version': ['3.1'], '__startline__': [1], '__endline__': [1], 'start_line': [0], 'end_line': [0]}], '__startline__': [1], '__endline__': [1], 'start_line': [0], 'end_line': [0]}], 'name': ['waf-acc-weu-001'], 'policy_settings': [{'enabled': [True], 'file_upload_limit_in_mb': [100], 'max_request_body_size_in_kb': [128], 'mode': ['Prevention'], 'request_body_check': [True], '__startline__': [1], '__endline__': [1], 'start_line': [0], 'end_line': [0]}], 'resource_group_name': ['rg-acceptance-weu-001'], 'tags': [None], 'timeouts': [None], '__startline__': [1], '__endline__': [1], 'start_line': [0], 'end_line': [0], 'references_': ['data.azurerm_resource_group.resourcegroup.name', 'data.azurerm_resource_group.resourcegroup'], '__address__': 'azurerm_web_application_firewall_policy.agw_waf_policy_aks_acc'} at file: /tf/plan.json`
**Examples**
When looking into this page: https://docs.bridgecrew.io/docs/ensure-application-gateway-waf-prevents-message-lookup-in-log4j2, we saw a code example, but the thing is, we have exactly the same code,:
```
resource "azurerm_web_application_firewall_policy" "agw_waf_policy_aks_acc" {
name = "waf-acc-weu-001"
resource_group_name = data.azurerm_resource_group.kbc_acc_resourcegroup.name
location = var.location
policy_settings {
enabled = true
mode = "Prevention"
request_body_check = true
# Set to 200 because ...
file_upload_limit_in_mb = 200
# Set to 2000 because ...
max_request_body_size_in_kb = 2000
}
managed_rules
managed_rule_set {
type = "OWASP"
version = "3.2"
}
}
}
}
```
**Version (please complete the following information):**
- bridgecrew/checkov docker image without version specified
**Additional context**
Traceback:
```
Process ForkProcess-6:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
self.run()
File "/usr/local/lib/python3.8/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/usr/local/lib/python3.8/site-packages/checkov/common/parallelizer/parallel_runner.py", line 37, in func_wrapper
result = original_func(item)
File "/usr/local/lib/python3.8/site-packages/checkov/common/runners/runner_registry.py", line 76, in _parallel_run
return runner.run(
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/plan_runner.py", line 72, in run
self.check_tf_definition(report, root_folder, runner_filter)
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/plan_runner.py", line 90, in check_tf_definition
self.run_block(definition[block_type], None, full_file_path, root_folder, report, scanned_file,
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/plan_runner.py", line 110, in run_block
results = registry.scan(scanned_file, entity, [], runner_filter)
File "/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check_registry.py", line 124, in scan
result = self.run_check(check, entity_configuration, entity_name, entity_type, scanned_file, skip_info)
File "/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check_registry.py", line 138, in run_check
result = check.run(
File "/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check.py", line 75, in run
check_result["result"] = self.scan_entity_conf(entity_configuration, entity_type)
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/base_resource_check.py", line 43, in scan_entity_conf
return self.scan_resource_conf(conf)
File "/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py", line 35, in scan_resource_conf
if rule_override.get("rule_group_name") == ["REQUEST-944-APPLICATION-ATTACK-JAVA"]:
File "/usr/local/lib/python3.8/site-packages/checkov/common/parsers/node.py", line 189, in __getattr__
raise TemplateAttributeError(f'{name} is invalid')
checkov.common.parsers.node.TemplateAttributeError: get is invalid
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py`
Content:
```
1 from typing import Dict, Any
2
3 from checkov.common.models.enums import CheckCategories, CheckResult
4 from checkov.common.util.type_forcers import force_list
5 from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
6
7
8 class AppGatewayWAFACLCVE202144228(BaseResourceCheck):
9 def __init__(self) -> None:
10 name = "Ensure Application Gateway WAF prevents message lookup in Log4j2. See CVE-2021-44228 aka log4jshell"
11 id = "CKV_AZURE_135"
12 supported_resources = ("azurerm_web_application_firewall_policy",)
13 categories = (CheckCategories.APPLICATION_SECURITY,)
14 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
15
16 def scan_resource_conf(self, conf: Dict[str, Any]) -> CheckResult:
17 self.evaluated_keys = ["managed_rules"]
18 managed_rules = conf.get("managed_rules")
19 if managed_rules:
20 managed_rule_sets = managed_rules[0].get("managed_rule_set") or []
21 for idx_rule_set, rule_set in enumerate(force_list(managed_rule_sets)):
22 self.evaluated_keys = [
23 f"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/type",
24 f"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/version",
25 ]
26 if rule_set.get("type", ["OWASP"]) == ["OWASP"] and rule_set.get("version") in (["3.1"], ["3.2"]):
27 rule_overrides = rule_set.get("rule_group_override") or []
28 for idx_override, rule_override in enumerate(force_list(rule_overrides)):
29 self.evaluated_keys.extend(
30 [
31 f"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/rule_group_override/[{idx_override}]/rule_group_name",
32 f"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/rule_group_override/[{idx_override}]/disabled_rules",
33 ]
34 )
35 if rule_override.get("rule_group_name") == ["REQUEST-944-APPLICATION-ATTACK-JAVA"]:
36 disabled_rules = rule_override.get("disabled_rules") or []
37 if isinstance(disabled_rules, list) and "944240" in force_list(disabled_rules[0]):
38 return CheckResult.FAILED
39
40 return CheckResult.PASSED
41
42 return CheckResult.FAILED
43
44
45 check = AppGatewayWAFACLCVE202144228()
46
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py b/checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py
--- a/checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py
+++ b/checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py
@@ -32,7 +32,7 @@
f"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/rule_group_override/[{idx_override}]/disabled_rules",
]
)
- if rule_override.get("rule_group_name") == ["REQUEST-944-APPLICATION-ATTACK-JAVA"]:
+ if isinstance(rule_override, dict) and rule_override.get("rule_group_name") == ["REQUEST-944-APPLICATION-ATTACK-JAVA"]:
disabled_rules = rule_override.get("disabled_rules") or []
if isinstance(disabled_rules, list) and "944240" in force_list(disabled_rules[0]):
return CheckResult.FAILED
| {"golden_diff": "diff --git a/checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py b/checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py\n--- a/checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py\n+++ b/checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py\n@@ -32,7 +32,7 @@\n f\"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/rule_group_override/[{idx_override}]/disabled_rules\",\n ]\n )\n- if rule_override.get(\"rule_group_name\") == [\"REQUEST-944-APPLICATION-ATTACK-JAVA\"]:\n+ if isinstance(rule_override, dict) and rule_override.get(\"rule_group_name\") == [\"REQUEST-944-APPLICATION-ATTACK-JAVA\"]:\n disabled_rules = rule_override.get(\"disabled_rules\") or []\n if isinstance(disabled_rules, list) and \"944240\" in force_list(disabled_rules[0]):\n return CheckResult.FAILED\n", "issue": "[ERROR] Failed to run check: Ensure Application Gateway WAF prevents message lookup in Log4j2.\n**Describe the issue**\r\nWhile performing checkov scan on our terraform dir, we are running into this error:\r\n\r\n`2022-07-18 10:09:32,794 [MainThread ] [ERROR] Failed to run check: Ensure Application Gateway WAF prevents message lookup in Log4j2. See CVE-2021-44228 aka log4jshell for configuration: {'custom_rules': [[]], 'location': ['westeurope'], 'managed_rules': [{'exclusion': [[]], 'managed_rule_set': [{'rule_group_override': [[]], 'type': ['OWASP'], 'version': ['3.1'], '__startline__': [1], '__endline__': [1], 'start_line': [0], 'end_line': [0]}], '__startline__': [1], '__endline__': [1], 'start_line': [0], 'end_line': [0]}], 'name': ['waf-acc-weu-001'], 'policy_settings': [{'enabled': [True], 'file_upload_limit_in_mb': [100], 'max_request_body_size_in_kb': [128], 'mode': ['Prevention'], 'request_body_check': [True], '__startline__': [1], '__endline__': [1], 'start_line': [0], 'end_line': [0]}], 'resource_group_name': ['rg-acceptance-weu-001'], 'tags': [None], 'timeouts': [None], '__startline__': [1], '__endline__': [1], 'start_line': [0], 'end_line': [0], 'references_': ['data.azurerm_resource_group.resourcegroup.name', 'data.azurerm_resource_group.resourcegroup'], '__address__': 'azurerm_web_application_firewall_policy.agw_waf_policy_aks_acc'} at file: /tf/plan.json`\r\n\r\n**Examples**\r\nWhen looking into this page: https://docs.bridgecrew.io/docs/ensure-application-gateway-waf-prevents-message-lookup-in-log4j2, we saw a code example, but the thing is, we have exactly the same code,:\r\n\r\n```\r\nresource \"azurerm_web_application_firewall_policy\" \"agw_waf_policy_aks_acc\" {\r\n name = \"waf-acc-weu-001\"\r\n resource_group_name = data.azurerm_resource_group.kbc_acc_resourcegroup.name\r\n location = var.location\r\n\r\n policy_settings {\r\n enabled = true\r\n mode = \"Prevention\"\r\n request_body_check = true\r\n # Set to 200 because ...\r\n file_upload_limit_in_mb = 200\r\n # Set to 2000 because ...\r\n max_request_body_size_in_kb = 2000\r\n }\r\n\r\n managed_rules \r\n\r\n managed_rule_set {\r\n type = \"OWASP\"\r\n version = \"3.2\"\r\n }\r\n }\r\n }\r\n}\r\n\r\n```\r\n\r\n**Version (please complete the following information):**\r\n - bridgecrew/checkov docker image without version specified\r\n\r\n**Additional context**\r\nTraceback:\r\n\r\n```\r\nProcess ForkProcess-6:\r\nTraceback (most recent call last):\r\n File \"/usr/local/lib/python3.8/multiprocessing/process.py\", line 315, in _bootstrap\r\n self.run()\r\n File \"/usr/local/lib/python3.8/multiprocessing/process.py\", line 108, in run\r\n self._target(*self._args, **self._kwargs)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/parallelizer/parallel_runner.py\", line 37, in func_wrapper\r\n result = original_func(item)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/runners/runner_registry.py\", line 76, in _parallel_run\r\n return runner.run(\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/plan_runner.py\", line 72, in run\r\n self.check_tf_definition(report, root_folder, runner_filter)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/plan_runner.py\", line 90, in check_tf_definition\r\n self.run_block(definition[block_type], None, full_file_path, root_folder, report, scanned_file,\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/plan_runner.py\", line 110, in run_block\r\n results = registry.scan(scanned_file, entity, [], runner_filter)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check_registry.py\", line 124, in scan\r\n result = self.run_check(check, entity_configuration, entity_name, entity_type, scanned_file, skip_info)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check_registry.py\", line 138, in run_check\r\n result = check.run(\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/checks/base_check.py\", line 75, in run\r\n check_result[\"result\"] = self.scan_entity_conf(entity_configuration, entity_type)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/base_resource_check.py\", line 43, in scan_entity_conf\r\n return self.scan_resource_conf(conf)\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py\", line 35, in scan_resource_conf\r\n if rule_override.get(\"rule_group_name\") == [\"REQUEST-944-APPLICATION-ATTACK-JAVA\"]:\r\n File \"/usr/local/lib/python3.8/site-packages/checkov/common/parsers/node.py\", line 189, in __getattr__\r\n raise TemplateAttributeError(f'{name} is invalid')\r\ncheckov.common.parsers.node.TemplateAttributeError: get is invalid\r\n\r\n```\n", "before_files": [{"content": "from typing import Dict, Any\n\nfrom checkov.common.models.enums import CheckCategories, CheckResult\nfrom checkov.common.util.type_forcers import force_list\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n\n\nclass AppGatewayWAFACLCVE202144228(BaseResourceCheck):\n def __init__(self) -> None:\n name = \"Ensure Application Gateway WAF prevents message lookup in Log4j2. See CVE-2021-44228 aka log4jshell\"\n id = \"CKV_AZURE_135\"\n supported_resources = (\"azurerm_web_application_firewall_policy\",)\n categories = (CheckCategories.APPLICATION_SECURITY,)\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf: Dict[str, Any]) -> CheckResult:\n self.evaluated_keys = [\"managed_rules\"]\n managed_rules = conf.get(\"managed_rules\")\n if managed_rules:\n managed_rule_sets = managed_rules[0].get(\"managed_rule_set\") or []\n for idx_rule_set, rule_set in enumerate(force_list(managed_rule_sets)):\n self.evaluated_keys = [\n f\"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/type\",\n f\"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/version\",\n ]\n if rule_set.get(\"type\", [\"OWASP\"]) == [\"OWASP\"] and rule_set.get(\"version\") in ([\"3.1\"], [\"3.2\"]):\n rule_overrides = rule_set.get(\"rule_group_override\") or []\n for idx_override, rule_override in enumerate(force_list(rule_overrides)):\n self.evaluated_keys.extend(\n [\n f\"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/rule_group_override/[{idx_override}]/rule_group_name\",\n f\"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/rule_group_override/[{idx_override}]/disabled_rules\",\n ]\n )\n if rule_override.get(\"rule_group_name\") == [\"REQUEST-944-APPLICATION-ATTACK-JAVA\"]:\n disabled_rules = rule_override.get(\"disabled_rules\") or []\n if isinstance(disabled_rules, list) and \"944240\" in force_list(disabled_rules[0]):\n return CheckResult.FAILED\n\n return CheckResult.PASSED\n\n return CheckResult.FAILED\n\n\ncheck = AppGatewayWAFACLCVE202144228()\n", "path": "checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py"}], "after_files": [{"content": "from typing import Dict, Any\n\nfrom checkov.common.models.enums import CheckCategories, CheckResult\nfrom checkov.common.util.type_forcers import force_list\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n\n\nclass AppGatewayWAFACLCVE202144228(BaseResourceCheck):\n def __init__(self) -> None:\n name = \"Ensure Application Gateway WAF prevents message lookup in Log4j2. See CVE-2021-44228 aka log4jshell\"\n id = \"CKV_AZURE_135\"\n supported_resources = (\"azurerm_web_application_firewall_policy\",)\n categories = (CheckCategories.APPLICATION_SECURITY,)\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf: Dict[str, Any]) -> CheckResult:\n self.evaluated_keys = [\"managed_rules\"]\n managed_rules = conf.get(\"managed_rules\")\n if managed_rules:\n managed_rule_sets = managed_rules[0].get(\"managed_rule_set\") or []\n for idx_rule_set, rule_set in enumerate(force_list(managed_rule_sets)):\n self.evaluated_keys = [\n f\"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/type\",\n f\"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/version\",\n ]\n if rule_set.get(\"type\", [\"OWASP\"]) == [\"OWASP\"] and rule_set.get(\"version\") in ([\"3.1\"], [\"3.2\"]):\n rule_overrides = rule_set.get(\"rule_group_override\") or []\n for idx_override, rule_override in enumerate(force_list(rule_overrides)):\n self.evaluated_keys.extend(\n [\n f\"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/rule_group_override/[{idx_override}]/rule_group_name\",\n f\"managed_rules/[0]/managed_rule_set[{idx_rule_set}]/rule_group_override/[{idx_override}]/disabled_rules\",\n ]\n )\n if isinstance(rule_override, dict) and rule_override.get(\"rule_group_name\") == [\"REQUEST-944-APPLICATION-ATTACK-JAVA\"]:\n disabled_rules = rule_override.get(\"disabled_rules\") or []\n if isinstance(disabled_rules, list) and \"944240\" in force_list(disabled_rules[0]):\n return CheckResult.FAILED\n\n return CheckResult.PASSED\n\n return CheckResult.FAILED\n\n\ncheck = AppGatewayWAFACLCVE202144228()\n", "path": "checkov/terraform/checks/resource/azure/AppGatewayWAFACLCVE202144228.py"}]} | 2,236 | 268 |
gh_patches_debug_36677 | rasdani/github-patches | git_diff | sublimelsp__LSP-693 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Some servers provide tooltips for ignored scopes
The vscode-json-languageserver provides tooltips for json keys, but the `string` scope is ignored
https://github.com/tomv564/LSP/blob/1836426c85826f20de73e50ab285a948eebbeba4/plugin/hover.py#L21
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `plugin/hover.py`
Content:
```
1 import mdpopups
2 import sublime
3 import sublime_plugin
4 import webbrowser
5 from html import escape
6 try:
7 from typing import List, Optional, Any, Dict
8 assert List and Optional and Any and Dict
9 except ImportError:
10 pass
11
12 from .core.configurations import is_supported_syntax
13 from .diagnostics import get_point_diagnostics
14 from .core.registry import session_for_view, LspTextCommand
15 from .core.protocol import Request, DiagnosticSeverity
16 from .core.documents import get_document_position
17 from .core.popups import popup_css, popup_class
18 from .core.settings import client_configs
19
20 SUBLIME_WORD_MASK = 515
21 NO_HOVER_SCOPES = 'comment, string'
22
23
24 class HoverHandler(sublime_plugin.ViewEventListener):
25 def __init__(self, view):
26 self.view = view
27
28 @classmethod
29 def is_applicable(cls, settings):
30 syntax = settings.get('syntax')
31 return syntax and is_supported_syntax(syntax, client_configs.all)
32
33 def on_hover(self, point, hover_zone):
34 if hover_zone != sublime.HOVER_TEXT or self.view.is_popup_visible():
35 return
36 self.view.run_command("lsp_hover", {"point": point})
37
38
39 _test_contents = [] # type: List[str]
40
41
42 class_for_severity = {
43 DiagnosticSeverity.Error: 'errors',
44 DiagnosticSeverity.Warning: 'warnings',
45 DiagnosticSeverity.Information: 'info',
46 DiagnosticSeverity.Hint: 'hints'
47 }
48
49
50 class GotoKind:
51
52 __slots__ = ("lsp_name", "label", "subl_cmd_name")
53
54 def __init__(self, lsp_name: str, label: str, subl_cmd_name: str) -> None:
55 self.lsp_name = lsp_name
56 self.label = label
57 self.subl_cmd_name = subl_cmd_name
58
59
60 goto_kinds = [
61 GotoKind("definition", "Definition", "definition"),
62 GotoKind("typeDefinition", "Type Definition", "type_definition"),
63 GotoKind("declaration", "Declaration", "declaration"),
64 GotoKind("implementation", "Implementation", "implementation")
65 ]
66
67
68 class LspHoverCommand(LspTextCommand):
69 def __init__(self, view):
70 super().__init__(view)
71
72 def is_likely_at_symbol(self, point):
73 word_at_sel = self.view.classify(point)
74 return word_at_sel & SUBLIME_WORD_MASK and not self.view.match_selector(point, NO_HOVER_SCOPES)
75
76 def run(self, edit, point=None):
77 if point is None:
78 point = self.view.sel()[0].begin()
79 if self.is_likely_at_symbol(point):
80 self.request_symbol_hover(point)
81 point_diagnostics = get_point_diagnostics(self.view, point)
82 if point_diagnostics:
83 self.show_hover(point, self.diagnostics_content(point_diagnostics))
84
85 def request_symbol_hover(self, point) -> None:
86 session = session_for_view(self.view, point)
87 if session:
88 if session.has_capability('hoverProvider'):
89 document_position = get_document_position(self.view, point)
90 if document_position:
91 if session.client:
92 session.client.send_request(
93 Request.hover(document_position),
94 lambda response: self.handle_response(response, point))
95
96 def handle_response(self, response: 'Optional[Any]', point) -> None:
97 all_content = ""
98
99 point_diagnostics = get_point_diagnostics(self.view, point)
100 if point_diagnostics:
101 all_content += self.diagnostics_content(point_diagnostics)
102
103 all_content += self.hover_content(point, response)
104 all_content += self.symbol_actions_content()
105
106 _test_contents.clear()
107 _test_contents.append(all_content) # for testing only
108 self.show_hover(point, all_content)
109
110 def symbol_actions_content(self):
111 actions = []
112 for goto_kind in goto_kinds:
113 if self.has_client_with_capability(goto_kind.lsp_name + "Provider"):
114 actions.append("<a href='{}'>{}</a>".format(goto_kind.lsp_name, goto_kind.label))
115 if self.has_client_with_capability('referencesProvider'):
116 actions.append("<a href='{}'>{}</a>".format('references', 'References'))
117 if self.has_client_with_capability('renameProvider'):
118 actions.append("<a href='{}'>{}</a>".format('rename', 'Rename'))
119 return "<p>" + " | ".join(actions) + "</p>"
120
121 def format_diagnostic(self, diagnostic):
122 if diagnostic.source:
123 return "<pre>[{}] {}</pre>".format(diagnostic.source, escape(diagnostic.message, False))
124 else:
125 return "<pre>{}</pre>".format(escape(diagnostic.message, False))
126
127 def diagnostics_content(self, diagnostics):
128 by_severity = {} # type: Dict[int, List[str]]
129 for diagnostic in diagnostics:
130 by_severity.setdefault(diagnostic.severity, []).append(self.format_diagnostic(diagnostic))
131 formatted = []
132 for severity, items in by_severity.items():
133 formatted.append("<div class='{}'>".format(class_for_severity[severity]))
134 formatted.extend(items)
135 formatted.append("<a href='{}'>{}</a>".format('code-actions',
136 'Code Actions'))
137 formatted.append("</div>")
138
139 return "".join(formatted)
140
141 def hover_content(self, point, response: 'Optional[Any]') -> str:
142 contents = ["No description available."]
143 if isinstance(response, dict):
144 # Flow returns None sometimes
145 # See: https://github.com/flowtype/flow-language-server/issues/51
146 response_content = response.get('contents')
147 if response_content:
148 if isinstance(response_content, list):
149 contents = response_content
150 else:
151 contents = [response_content]
152
153 formatted = []
154 for item in contents:
155 value = ""
156 language = None
157 if isinstance(item, str):
158 value = item
159 else:
160 value = item.get("value")
161 language = item.get("language")
162 if language:
163 formatted.append("```{}\n{}\n```\n".format(language, value))
164 else:
165 formatted.append(value)
166
167 return mdpopups.md2html(self.view, "\n".join(formatted))
168
169 def show_hover(self, point, contents):
170 mdpopups.show_popup(
171 self.view,
172 contents,
173 css=popup_css,
174 md=False,
175 flags=sublime.HIDE_ON_MOUSE_MOVE_AWAY,
176 location=point,
177 wrapper_class=popup_class,
178 max_width=800,
179 on_navigate=lambda href: self.on_hover_navigate(href, point))
180
181 def on_hover_navigate(self, href, point):
182 for goto_kind in goto_kinds:
183 if href == goto_kind.lsp_name:
184 self.run_command_from_point(point, "lsp_symbol_" + goto_kind.subl_cmd_name)
185 return
186 if href == 'references':
187 self.run_command_from_point(point, "lsp_symbol_references")
188 elif href == 'rename':
189 self.run_command_from_point(point, "lsp_symbol_rename")
190 elif href == 'code-actions':
191 self.run_command_from_point(point, "lsp_code_actions")
192 else:
193 webbrowser.open_new_tab(href)
194
195 def run_command_from_point(self, point, command_name):
196 sel = self.view.sel()
197 sel.clear()
198 sel.add(sublime.Region(point, point))
199 self.view.run_command(command_name)
200
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/plugin/hover.py b/plugin/hover.py
--- a/plugin/hover.py
+++ b/plugin/hover.py
@@ -18,7 +18,6 @@
from .core.settings import client_configs
SUBLIME_WORD_MASK = 515
-NO_HOVER_SCOPES = 'comment, string'
class HoverHandler(sublime_plugin.ViewEventListener):
@@ -71,7 +70,7 @@
def is_likely_at_symbol(self, point):
word_at_sel = self.view.classify(point)
- return word_at_sel & SUBLIME_WORD_MASK and not self.view.match_selector(point, NO_HOVER_SCOPES)
+ return word_at_sel & SUBLIME_WORD_MASK
def run(self, edit, point=None):
if point is None:
@@ -101,11 +100,14 @@
all_content += self.diagnostics_content(point_diagnostics)
all_content += self.hover_content(point, response)
- all_content += self.symbol_actions_content()
+ if all_content:
+ all_content += self.symbol_actions_content()
_test_contents.clear()
_test_contents.append(all_content) # for testing only
- self.show_hover(point, all_content)
+
+ if all_content:
+ self.show_hover(point, all_content)
def symbol_actions_content(self):
actions = []
@@ -139,10 +141,8 @@
return "".join(formatted)
def hover_content(self, point, response: 'Optional[Any]') -> str:
- contents = ["No description available."]
+ contents = [] # type: List[Any]
if isinstance(response, dict):
- # Flow returns None sometimes
- # See: https://github.com/flowtype/flow-language-server/issues/51
response_content = response.get('contents')
if response_content:
if isinstance(response_content, list):
@@ -164,7 +164,10 @@
else:
formatted.append(value)
- return mdpopups.md2html(self.view, "\n".join(formatted))
+ if formatted:
+ return mdpopups.md2html(self.view, "\n".join(formatted))
+
+ return ""
def show_hover(self, point, contents):
mdpopups.show_popup(
| {"golden_diff": "diff --git a/plugin/hover.py b/plugin/hover.py\n--- a/plugin/hover.py\n+++ b/plugin/hover.py\n@@ -18,7 +18,6 @@\n from .core.settings import client_configs\n \n SUBLIME_WORD_MASK = 515\n-NO_HOVER_SCOPES = 'comment, string'\n \n \n class HoverHandler(sublime_plugin.ViewEventListener):\n@@ -71,7 +70,7 @@\n \n def is_likely_at_symbol(self, point):\n word_at_sel = self.view.classify(point)\n- return word_at_sel & SUBLIME_WORD_MASK and not self.view.match_selector(point, NO_HOVER_SCOPES)\n+ return word_at_sel & SUBLIME_WORD_MASK\n \n def run(self, edit, point=None):\n if point is None:\n@@ -101,11 +100,14 @@\n all_content += self.diagnostics_content(point_diagnostics)\n \n all_content += self.hover_content(point, response)\n- all_content += self.symbol_actions_content()\n+ if all_content:\n+ all_content += self.symbol_actions_content()\n \n _test_contents.clear()\n _test_contents.append(all_content) # for testing only\n- self.show_hover(point, all_content)\n+\n+ if all_content:\n+ self.show_hover(point, all_content)\n \n def symbol_actions_content(self):\n actions = []\n@@ -139,10 +141,8 @@\n return \"\".join(formatted)\n \n def hover_content(self, point, response: 'Optional[Any]') -> str:\n- contents = [\"No description available.\"]\n+ contents = [] # type: List[Any]\n if isinstance(response, dict):\n- # Flow returns None sometimes\n- # See: https://github.com/flowtype/flow-language-server/issues/51\n response_content = response.get('contents')\n if response_content:\n if isinstance(response_content, list):\n@@ -164,7 +164,10 @@\n else:\n formatted.append(value)\n \n- return mdpopups.md2html(self.view, \"\\n\".join(formatted))\n+ if formatted:\n+ return mdpopups.md2html(self.view, \"\\n\".join(formatted))\n+\n+ return \"\"\n \n def show_hover(self, point, contents):\n mdpopups.show_popup(\n", "issue": "Some servers provide tooltips for ignored scopes\nThe vscode-json-languageserver provides tooltips for json keys, but the `string` scope is ignored\r\nhttps://github.com/tomv564/LSP/blob/1836426c85826f20de73e50ab285a948eebbeba4/plugin/hover.py#L21\r\n\n", "before_files": [{"content": "import mdpopups\nimport sublime\nimport sublime_plugin\nimport webbrowser\nfrom html import escape\ntry:\n from typing import List, Optional, Any, Dict\n assert List and Optional and Any and Dict\nexcept ImportError:\n pass\n\nfrom .core.configurations import is_supported_syntax\nfrom .diagnostics import get_point_diagnostics\nfrom .core.registry import session_for_view, LspTextCommand\nfrom .core.protocol import Request, DiagnosticSeverity\nfrom .core.documents import get_document_position\nfrom .core.popups import popup_css, popup_class\nfrom .core.settings import client_configs\n\nSUBLIME_WORD_MASK = 515\nNO_HOVER_SCOPES = 'comment, string'\n\n\nclass HoverHandler(sublime_plugin.ViewEventListener):\n def __init__(self, view):\n self.view = view\n\n @classmethod\n def is_applicable(cls, settings):\n syntax = settings.get('syntax')\n return syntax and is_supported_syntax(syntax, client_configs.all)\n\n def on_hover(self, point, hover_zone):\n if hover_zone != sublime.HOVER_TEXT or self.view.is_popup_visible():\n return\n self.view.run_command(\"lsp_hover\", {\"point\": point})\n\n\n_test_contents = [] # type: List[str]\n\n\nclass_for_severity = {\n DiagnosticSeverity.Error: 'errors',\n DiagnosticSeverity.Warning: 'warnings',\n DiagnosticSeverity.Information: 'info',\n DiagnosticSeverity.Hint: 'hints'\n}\n\n\nclass GotoKind:\n\n __slots__ = (\"lsp_name\", \"label\", \"subl_cmd_name\")\n\n def __init__(self, lsp_name: str, label: str, subl_cmd_name: str) -> None:\n self.lsp_name = lsp_name\n self.label = label\n self.subl_cmd_name = subl_cmd_name\n\n\ngoto_kinds = [\n GotoKind(\"definition\", \"Definition\", \"definition\"),\n GotoKind(\"typeDefinition\", \"Type Definition\", \"type_definition\"),\n GotoKind(\"declaration\", \"Declaration\", \"declaration\"),\n GotoKind(\"implementation\", \"Implementation\", \"implementation\")\n]\n\n\nclass LspHoverCommand(LspTextCommand):\n def __init__(self, view):\n super().__init__(view)\n\n def is_likely_at_symbol(self, point):\n word_at_sel = self.view.classify(point)\n return word_at_sel & SUBLIME_WORD_MASK and not self.view.match_selector(point, NO_HOVER_SCOPES)\n\n def run(self, edit, point=None):\n if point is None:\n point = self.view.sel()[0].begin()\n if self.is_likely_at_symbol(point):\n self.request_symbol_hover(point)\n point_diagnostics = get_point_diagnostics(self.view, point)\n if point_diagnostics:\n self.show_hover(point, self.diagnostics_content(point_diagnostics))\n\n def request_symbol_hover(self, point) -> None:\n session = session_for_view(self.view, point)\n if session:\n if session.has_capability('hoverProvider'):\n document_position = get_document_position(self.view, point)\n if document_position:\n if session.client:\n session.client.send_request(\n Request.hover(document_position),\n lambda response: self.handle_response(response, point))\n\n def handle_response(self, response: 'Optional[Any]', point) -> None:\n all_content = \"\"\n\n point_diagnostics = get_point_diagnostics(self.view, point)\n if point_diagnostics:\n all_content += self.diagnostics_content(point_diagnostics)\n\n all_content += self.hover_content(point, response)\n all_content += self.symbol_actions_content()\n\n _test_contents.clear()\n _test_contents.append(all_content) # for testing only\n self.show_hover(point, all_content)\n\n def symbol_actions_content(self):\n actions = []\n for goto_kind in goto_kinds:\n if self.has_client_with_capability(goto_kind.lsp_name + \"Provider\"):\n actions.append(\"<a href='{}'>{}</a>\".format(goto_kind.lsp_name, goto_kind.label))\n if self.has_client_with_capability('referencesProvider'):\n actions.append(\"<a href='{}'>{}</a>\".format('references', 'References'))\n if self.has_client_with_capability('renameProvider'):\n actions.append(\"<a href='{}'>{}</a>\".format('rename', 'Rename'))\n return \"<p>\" + \" | \".join(actions) + \"</p>\"\n\n def format_diagnostic(self, diagnostic):\n if diagnostic.source:\n return \"<pre>[{}] {}</pre>\".format(diagnostic.source, escape(diagnostic.message, False))\n else:\n return \"<pre>{}</pre>\".format(escape(diagnostic.message, False))\n\n def diagnostics_content(self, diagnostics):\n by_severity = {} # type: Dict[int, List[str]]\n for diagnostic in diagnostics:\n by_severity.setdefault(diagnostic.severity, []).append(self.format_diagnostic(diagnostic))\n formatted = []\n for severity, items in by_severity.items():\n formatted.append(\"<div class='{}'>\".format(class_for_severity[severity]))\n formatted.extend(items)\n formatted.append(\"<a href='{}'>{}</a>\".format('code-actions',\n 'Code Actions'))\n formatted.append(\"</div>\")\n\n return \"\".join(formatted)\n\n def hover_content(self, point, response: 'Optional[Any]') -> str:\n contents = [\"No description available.\"]\n if isinstance(response, dict):\n # Flow returns None sometimes\n # See: https://github.com/flowtype/flow-language-server/issues/51\n response_content = response.get('contents')\n if response_content:\n if isinstance(response_content, list):\n contents = response_content\n else:\n contents = [response_content]\n\n formatted = []\n for item in contents:\n value = \"\"\n language = None\n if isinstance(item, str):\n value = item\n else:\n value = item.get(\"value\")\n language = item.get(\"language\")\n if language:\n formatted.append(\"```{}\\n{}\\n```\\n\".format(language, value))\n else:\n formatted.append(value)\n\n return mdpopups.md2html(self.view, \"\\n\".join(formatted))\n\n def show_hover(self, point, contents):\n mdpopups.show_popup(\n self.view,\n contents,\n css=popup_css,\n md=False,\n flags=sublime.HIDE_ON_MOUSE_MOVE_AWAY,\n location=point,\n wrapper_class=popup_class,\n max_width=800,\n on_navigate=lambda href: self.on_hover_navigate(href, point))\n\n def on_hover_navigate(self, href, point):\n for goto_kind in goto_kinds:\n if href == goto_kind.lsp_name:\n self.run_command_from_point(point, \"lsp_symbol_\" + goto_kind.subl_cmd_name)\n return\n if href == 'references':\n self.run_command_from_point(point, \"lsp_symbol_references\")\n elif href == 'rename':\n self.run_command_from_point(point, \"lsp_symbol_rename\")\n elif href == 'code-actions':\n self.run_command_from_point(point, \"lsp_code_actions\")\n else:\n webbrowser.open_new_tab(href)\n\n def run_command_from_point(self, point, command_name):\n sel = self.view.sel()\n sel.clear()\n sel.add(sublime.Region(point, point))\n self.view.run_command(command_name)\n", "path": "plugin/hover.py"}], "after_files": [{"content": "import mdpopups\nimport sublime\nimport sublime_plugin\nimport webbrowser\nfrom html import escape\ntry:\n from typing import List, Optional, Any, Dict\n assert List and Optional and Any and Dict\nexcept ImportError:\n pass\n\nfrom .core.configurations import is_supported_syntax\nfrom .diagnostics import get_point_diagnostics\nfrom .core.registry import session_for_view, LspTextCommand\nfrom .core.protocol import Request, DiagnosticSeverity\nfrom .core.documents import get_document_position\nfrom .core.popups import popup_css, popup_class\nfrom .core.settings import client_configs\n\nSUBLIME_WORD_MASK = 515\n\n\nclass HoverHandler(sublime_plugin.ViewEventListener):\n def __init__(self, view):\n self.view = view\n\n @classmethod\n def is_applicable(cls, settings):\n syntax = settings.get('syntax')\n return syntax and is_supported_syntax(syntax, client_configs.all)\n\n def on_hover(self, point, hover_zone):\n if hover_zone != sublime.HOVER_TEXT or self.view.is_popup_visible():\n return\n self.view.run_command(\"lsp_hover\", {\"point\": point})\n\n\n_test_contents = [] # type: List[str]\n\n\nclass_for_severity = {\n DiagnosticSeverity.Error: 'errors',\n DiagnosticSeverity.Warning: 'warnings',\n DiagnosticSeverity.Information: 'info',\n DiagnosticSeverity.Hint: 'hints'\n}\n\n\nclass GotoKind:\n\n __slots__ = (\"lsp_name\", \"label\", \"subl_cmd_name\")\n\n def __init__(self, lsp_name: str, label: str, subl_cmd_name: str) -> None:\n self.lsp_name = lsp_name\n self.label = label\n self.subl_cmd_name = subl_cmd_name\n\n\ngoto_kinds = [\n GotoKind(\"definition\", \"Definition\", \"definition\"),\n GotoKind(\"typeDefinition\", \"Type Definition\", \"type_definition\"),\n GotoKind(\"declaration\", \"Declaration\", \"declaration\"),\n GotoKind(\"implementation\", \"Implementation\", \"implementation\")\n]\n\n\nclass LspHoverCommand(LspTextCommand):\n def __init__(self, view):\n super().__init__(view)\n\n def is_likely_at_symbol(self, point):\n word_at_sel = self.view.classify(point)\n return word_at_sel & SUBLIME_WORD_MASK\n\n def run(self, edit, point=None):\n if point is None:\n point = self.view.sel()[0].begin()\n if self.is_likely_at_symbol(point):\n self.request_symbol_hover(point)\n point_diagnostics = get_point_diagnostics(self.view, point)\n if point_diagnostics:\n self.show_hover(point, self.diagnostics_content(point_diagnostics))\n\n def request_symbol_hover(self, point) -> None:\n session = session_for_view(self.view, point)\n if session:\n if session.has_capability('hoverProvider'):\n document_position = get_document_position(self.view, point)\n if document_position:\n if session.client:\n session.client.send_request(\n Request.hover(document_position),\n lambda response: self.handle_response(response, point))\n\n def handle_response(self, response: 'Optional[Any]', point) -> None:\n all_content = \"\"\n\n point_diagnostics = get_point_diagnostics(self.view, point)\n if point_diagnostics:\n all_content += self.diagnostics_content(point_diagnostics)\n\n all_content += self.hover_content(point, response)\n if all_content:\n all_content += self.symbol_actions_content()\n\n _test_contents.clear()\n _test_contents.append(all_content) # for testing only\n\n if all_content:\n self.show_hover(point, all_content)\n\n def symbol_actions_content(self):\n actions = []\n for goto_kind in goto_kinds:\n if self.has_client_with_capability(goto_kind.lsp_name + \"Provider\"):\n actions.append(\"<a href='{}'>{}</a>\".format(goto_kind.lsp_name, goto_kind.label))\n if self.has_client_with_capability('referencesProvider'):\n actions.append(\"<a href='{}'>{}</a>\".format('references', 'References'))\n if self.has_client_with_capability('renameProvider'):\n actions.append(\"<a href='{}'>{}</a>\".format('rename', 'Rename'))\n return \"<p>\" + \" | \".join(actions) + \"</p>\"\n\n def format_diagnostic(self, diagnostic):\n if diagnostic.source:\n return \"<pre>[{}] {}</pre>\".format(diagnostic.source, escape(diagnostic.message, False))\n else:\n return \"<pre>{}</pre>\".format(escape(diagnostic.message, False))\n\n def diagnostics_content(self, diagnostics):\n by_severity = {} # type: Dict[int, List[str]]\n for diagnostic in diagnostics:\n by_severity.setdefault(diagnostic.severity, []).append(self.format_diagnostic(diagnostic))\n formatted = []\n for severity, items in by_severity.items():\n formatted.append(\"<div class='{}'>\".format(class_for_severity[severity]))\n formatted.extend(items)\n formatted.append(\"<a href='{}'>{}</a>\".format('code-actions',\n 'Code Actions'))\n formatted.append(\"</div>\")\n\n return \"\".join(formatted)\n\n def hover_content(self, point, response: 'Optional[Any]') -> str:\n contents = [] # type: List[Any]\n if isinstance(response, dict):\n response_content = response.get('contents')\n if response_content:\n if isinstance(response_content, list):\n contents = response_content\n else:\n contents = [response_content]\n\n formatted = []\n for item in contents:\n value = \"\"\n language = None\n if isinstance(item, str):\n value = item\n else:\n value = item.get(\"value\")\n language = item.get(\"language\")\n if language:\n formatted.append(\"```{}\\n{}\\n```\\n\".format(language, value))\n else:\n formatted.append(value)\n\n if formatted:\n return mdpopups.md2html(self.view, \"\\n\".join(formatted))\n\n return \"\"\n\n def show_hover(self, point, contents):\n mdpopups.show_popup(\n self.view,\n contents,\n css=popup_css,\n md=False,\n flags=sublime.HIDE_ON_MOUSE_MOVE_AWAY,\n location=point,\n wrapper_class=popup_class,\n max_width=800,\n on_navigate=lambda href: self.on_hover_navigate(href, point))\n\n def on_hover_navigate(self, href, point):\n for goto_kind in goto_kinds:\n if href == goto_kind.lsp_name:\n self.run_command_from_point(point, \"lsp_symbol_\" + goto_kind.subl_cmd_name)\n return\n if href == 'references':\n self.run_command_from_point(point, \"lsp_symbol_references\")\n elif href == 'rename':\n self.run_command_from_point(point, \"lsp_symbol_rename\")\n elif href == 'code-actions':\n self.run_command_from_point(point, \"lsp_code_actions\")\n else:\n webbrowser.open_new_tab(href)\n\n def run_command_from_point(self, point, command_name):\n sel = self.view.sel()\n sel.clear()\n sel.add(sublime.Region(point, point))\n self.view.run_command(command_name)\n", "path": "plugin/hover.py"}]} | 2,414 | 509 |
gh_patches_debug_16198 | rasdani/github-patches | git_diff | numpy__numpy-13688 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
DOC: numpy.random.sample and numpy.random.random_sample
I just noticed in the docs that the page for `numpy.random.sample` indicates that the function should be called as `numpy.random.random_sample`. I understand that this may just indicate that the function may be called as either `sample` or `random_sample`, but it does come across as a mistake when first viewing the page. Perhaps make it more explicit that `random_sample` is an alias of `sample`? Or is this the accepted practice for functions that have aliases?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `numpy/random/__init__.py`
Content:
```
1 """
2 ========================
3 Random Number Generation
4 ========================
5
6 Instantiate a BitGenerator and wrap it in a Generator
7 which will convert the uniform stream to a number of distributions. The "bare"
8 functions are kept for legacy code, they should be called with the newer API
9 via ``np.random.Generator().function`` instead
10
11 ==================== =========================================================
12 Utility functions
13 -------------------- ---------------------------------------------------------
14 random Uniformly distributed floats over ``[0, 1)``
15 integers Uniformly distributed integers, replaces ``randint``
16 bytes Uniformly distributed random bytes.
17 permutation Randomly permute a sequence / generate a random sequence.
18 shuffle Randomly permute a sequence in place.
19 seed Seed the random number generator.
20 choice Random sample from 1-D array.
21 ==================== =========================================================
22
23 ==================== =========================================================
24 Compatibility
25 functions - removed
26 in the new API
27 -------------------- ---------------------------------------------------------
28 rand Uniformly distributed values.
29 randn Normally distributed values.
30 ranf Uniformly distributed floating point numbers.
31 random_integers Uniformly distributed integers in a given range.
32 (deprecated, use ``integers(..., closed=True)`` instead)
33 random_sample Alias for `random_sample`
34 randint Uniformly distributed integers in a given range
35 ==================== =========================================================
36
37 ==================== =========================================================
38 Univariate
39 distributions
40 -------------------- ---------------------------------------------------------
41 beta Beta distribution over ``[0, 1]``.
42 binomial Binomial distribution.
43 chisquare :math:`\\chi^2` distribution.
44 exponential Exponential distribution.
45 f F (Fisher-Snedecor) distribution.
46 gamma Gamma distribution.
47 geometric Geometric distribution.
48 gumbel Gumbel distribution.
49 hypergeometric Hypergeometric distribution.
50 laplace Laplace distribution.
51 logistic Logistic distribution.
52 lognormal Log-normal distribution.
53 logseries Logarithmic series distribution.
54 negative_binomial Negative binomial distribution.
55 noncentral_chisquare Non-central chi-square distribution.
56 noncentral_f Non-central F distribution.
57 normal Normal / Gaussian distribution.
58 pareto Pareto distribution.
59 poisson Poisson distribution.
60 power Power distribution.
61 rayleigh Rayleigh distribution.
62 triangular Triangular distribution.
63 uniform Uniform distribution.
64 vonmises Von Mises circular distribution.
65 wald Wald (inverse Gaussian) distribution.
66 weibull Weibull distribution.
67 zipf Zipf's distribution over ranked data.
68 ==================== =========================================================
69
70 ==================== ==========================================================
71 Multivariate
72 distributions
73 -------------------- ----------------------------------------------------------
74 dirichlet Multivariate generalization of Beta distribution.
75 multinomial Multivariate generalization of the binomial distribution.
76 multivariate_normal Multivariate generalization of the normal distribution.
77 ==================== ==========================================================
78
79 ==================== =========================================================
80 Standard
81 distributions
82 -------------------- ---------------------------------------------------------
83 standard_cauchy Standard Cauchy-Lorentz distribution.
84 standard_exponential Standard exponential distribution.
85 standard_gamma Standard Gamma distribution.
86 standard_normal Standard normal distribution.
87 standard_t Standard Student's t-distribution.
88 ==================== =========================================================
89
90 ==================== =========================================================
91 Internal functions
92 -------------------- ---------------------------------------------------------
93 get_state Get tuple representing internal state of generator.
94 set_state Set state of generator.
95 ==================== =========================================================
96
97 ============================================= ===
98 BitGenerator Streams that work with Generator
99 --------------------------------------------- ---
100 MT19937
101 DSFMT
102 PCG32
103 PCG64
104 Philox
105 ThreeFry
106 Xoshiro256
107 Xoshiro512
108 ============================================= ===
109
110 """
111 from __future__ import division, absolute_import, print_function
112
113 __all__ = [
114 'beta',
115 'binomial',
116 'bytes',
117 'chisquare',
118 'choice',
119 'dirichlet',
120 'exponential',
121 'f',
122 'gamma',
123 'geometric',
124 'get_state',
125 'gumbel',
126 'hypergeometric',
127 'laplace',
128 'logistic',
129 'lognormal',
130 'logseries',
131 'multinomial',
132 'multivariate_normal',
133 'negative_binomial',
134 'noncentral_chisquare',
135 'noncentral_f',
136 'normal',
137 'pareto',
138 'permutation',
139 'poisson',
140 'power',
141 'rand',
142 'randint',
143 'randn',
144 'random_integers',
145 'random_sample',
146 'rayleigh',
147 'seed',
148 'set_state',
149 'shuffle',
150 'standard_cauchy',
151 'standard_exponential',
152 'standard_gamma',
153 'standard_normal',
154 'standard_t',
155 'triangular',
156 'uniform',
157 'vonmises',
158 'wald',
159 'weibull',
160 'zipf',
161 ]
162
163 from . import mtrand
164 from .mtrand import *
165 from .dsfmt import DSFMT
166 from .generator import Generator
167 from .mt19937 import MT19937
168 from .pcg32 import PCG32
169 from .pcg64 import PCG64
170 from .philox import Philox
171 from .threefry import ThreeFry
172 from .xoshiro256 import Xoshiro256
173 from .xoshiro512 import Xoshiro512
174 from .mtrand import RandomState
175
176 __all__ += ['Generator', 'DSFMT', 'MT19937', 'Philox', 'PCG64', 'PCG32',
177 'ThreeFry', 'Xoshiro256', 'Xoshiro512', 'RandomState']
178
179 # Some aliases:
180 ranf = random = sample = random_sample
181 __all__.extend(['ranf', 'random', 'sample'])
182
183
184 def __RandomState_ctor():
185 """Return a RandomState instance.
186
187 This function exists solely to assist (un)pickling.
188
189 Note that the state of the RandomState returned here is irrelevant, as this function's
190 entire purpose is to return a newly allocated RandomState whose state pickle can set.
191 Consequently the RandomState returned by this function is a freshly allocated copy
192 with a seed=0.
193
194 See https://github.com/numpy/numpy/issues/4763 for a detailed discussion
195
196 """
197 return RandomState(seed=0)
198
199
200 from numpy._pytesttester import PytestTester
201 test = PytestTester(__name__)
202 del PytestTester
203
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/numpy/random/__init__.py b/numpy/random/__init__.py
--- a/numpy/random/__init__.py
+++ b/numpy/random/__init__.py
@@ -141,9 +141,12 @@
'rand',
'randint',
'randn',
+ 'random',
'random_integers',
'random_sample',
+ 'ranf',
'rayleigh',
+ 'sample',
'seed',
'set_state',
'shuffle',
@@ -176,10 +179,6 @@
__all__ += ['Generator', 'DSFMT', 'MT19937', 'Philox', 'PCG64', 'PCG32',
'ThreeFry', 'Xoshiro256', 'Xoshiro512', 'RandomState']
-# Some aliases:
-ranf = random = sample = random_sample
-__all__.extend(['ranf', 'random', 'sample'])
-
def __RandomState_ctor():
"""Return a RandomState instance.
| {"golden_diff": "diff --git a/numpy/random/__init__.py b/numpy/random/__init__.py\n--- a/numpy/random/__init__.py\n+++ b/numpy/random/__init__.py\n@@ -141,9 +141,12 @@\n 'rand',\n 'randint',\n 'randn',\n+ 'random',\n 'random_integers',\n 'random_sample',\n+ 'ranf',\n 'rayleigh',\n+ 'sample',\n 'seed',\n 'set_state',\n 'shuffle',\n@@ -176,10 +179,6 @@\n __all__ += ['Generator', 'DSFMT', 'MT19937', 'Philox', 'PCG64', 'PCG32',\n 'ThreeFry', 'Xoshiro256', 'Xoshiro512', 'RandomState']\n \n-# Some aliases:\n-ranf = random = sample = random_sample\n-__all__.extend(['ranf', 'random', 'sample'])\n-\n \n def __RandomState_ctor():\n \"\"\"Return a RandomState instance.\n", "issue": "DOC: numpy.random.sample and numpy.random.random_sample\nI just noticed in the docs that the page for `numpy.random.sample` indicates that the function should be called as `numpy.random.random_sample`. I understand that this may just indicate that the function may be called as either `sample` or `random_sample`, but it does come across as a mistake when first viewing the page. Perhaps make it more explicit that `random_sample` is an alias of `sample`? Or is this the accepted practice for functions that have aliases?\n", "before_files": [{"content": "\"\"\"\n========================\nRandom Number Generation\n========================\n\nInstantiate a BitGenerator and wrap it in a Generator\nwhich will convert the uniform stream to a number of distributions. The \"bare\"\nfunctions are kept for legacy code, they should be called with the newer API\nvia ``np.random.Generator().function`` instead\n\n==================== =========================================================\nUtility functions\n-------------------- ---------------------------------------------------------\nrandom Uniformly distributed floats over ``[0, 1)``\nintegers Uniformly distributed integers, replaces ``randint``\nbytes Uniformly distributed random bytes.\npermutation Randomly permute a sequence / generate a random sequence.\nshuffle Randomly permute a sequence in place.\nseed Seed the random number generator.\nchoice Random sample from 1-D array.\n==================== =========================================================\n\n==================== =========================================================\nCompatibility\nfunctions - removed\nin the new API\n-------------------- ---------------------------------------------------------\nrand Uniformly distributed values.\nrandn Normally distributed values.\nranf Uniformly distributed floating point numbers.\nrandom_integers Uniformly distributed integers in a given range.\n (deprecated, use ``integers(..., closed=True)`` instead)\nrandom_sample Alias for `random_sample`\nrandint Uniformly distributed integers in a given range\n==================== =========================================================\n\n==================== =========================================================\nUnivariate\ndistributions\n-------------------- ---------------------------------------------------------\nbeta Beta distribution over ``[0, 1]``.\nbinomial Binomial distribution.\nchisquare :math:`\\\\chi^2` distribution.\nexponential Exponential distribution.\nf F (Fisher-Snedecor) distribution.\ngamma Gamma distribution.\ngeometric Geometric distribution.\ngumbel Gumbel distribution.\nhypergeometric Hypergeometric distribution.\nlaplace Laplace distribution.\nlogistic Logistic distribution.\nlognormal Log-normal distribution.\nlogseries Logarithmic series distribution.\nnegative_binomial Negative binomial distribution.\nnoncentral_chisquare Non-central chi-square distribution.\nnoncentral_f Non-central F distribution.\nnormal Normal / Gaussian distribution.\npareto Pareto distribution.\npoisson Poisson distribution.\npower Power distribution.\nrayleigh Rayleigh distribution.\ntriangular Triangular distribution.\nuniform Uniform distribution.\nvonmises Von Mises circular distribution.\nwald Wald (inverse Gaussian) distribution.\nweibull Weibull distribution.\nzipf Zipf's distribution over ranked data.\n==================== =========================================================\n\n==================== ==========================================================\nMultivariate\ndistributions\n-------------------- ----------------------------------------------------------\ndirichlet Multivariate generalization of Beta distribution.\nmultinomial Multivariate generalization of the binomial distribution.\nmultivariate_normal Multivariate generalization of the normal distribution.\n==================== ==========================================================\n\n==================== =========================================================\nStandard\ndistributions\n-------------------- ---------------------------------------------------------\nstandard_cauchy Standard Cauchy-Lorentz distribution.\nstandard_exponential Standard exponential distribution.\nstandard_gamma Standard Gamma distribution.\nstandard_normal Standard normal distribution.\nstandard_t Standard Student's t-distribution.\n==================== =========================================================\n\n==================== =========================================================\nInternal functions\n-------------------- ---------------------------------------------------------\nget_state Get tuple representing internal state of generator.\nset_state Set state of generator.\n==================== =========================================================\n\n============================================= ===\nBitGenerator Streams that work with Generator\n--------------------------------------------- ---\nMT19937\nDSFMT\nPCG32\nPCG64\nPhilox\nThreeFry\nXoshiro256\nXoshiro512\n============================================= ===\n\n\"\"\"\nfrom __future__ import division, absolute_import, print_function\n\n__all__ = [\n 'beta',\n 'binomial',\n 'bytes',\n 'chisquare',\n 'choice',\n 'dirichlet',\n 'exponential',\n 'f',\n 'gamma',\n 'geometric',\n 'get_state',\n 'gumbel',\n 'hypergeometric',\n 'laplace',\n 'logistic',\n 'lognormal',\n 'logseries',\n 'multinomial',\n 'multivariate_normal',\n 'negative_binomial',\n 'noncentral_chisquare',\n 'noncentral_f',\n 'normal',\n 'pareto',\n 'permutation',\n 'poisson',\n 'power',\n 'rand',\n 'randint',\n 'randn',\n 'random_integers',\n 'random_sample',\n 'rayleigh',\n 'seed',\n 'set_state',\n 'shuffle',\n 'standard_cauchy',\n 'standard_exponential',\n 'standard_gamma',\n 'standard_normal',\n 'standard_t',\n 'triangular',\n 'uniform',\n 'vonmises',\n 'wald',\n 'weibull',\n 'zipf',\n]\n\nfrom . import mtrand\nfrom .mtrand import *\nfrom .dsfmt import DSFMT\nfrom .generator import Generator\nfrom .mt19937 import MT19937\nfrom .pcg32 import PCG32\nfrom .pcg64 import PCG64\nfrom .philox import Philox\nfrom .threefry import ThreeFry\nfrom .xoshiro256 import Xoshiro256\nfrom .xoshiro512 import Xoshiro512\nfrom .mtrand import RandomState\n\n__all__ += ['Generator', 'DSFMT', 'MT19937', 'Philox', 'PCG64', 'PCG32',\n 'ThreeFry', 'Xoshiro256', 'Xoshiro512', 'RandomState']\n\n# Some aliases:\nranf = random = sample = random_sample\n__all__.extend(['ranf', 'random', 'sample'])\n\n\ndef __RandomState_ctor():\n \"\"\"Return a RandomState instance.\n\n This function exists solely to assist (un)pickling.\n\n Note that the state of the RandomState returned here is irrelevant, as this function's\n entire purpose is to return a newly allocated RandomState whose state pickle can set.\n Consequently the RandomState returned by this function is a freshly allocated copy\n with a seed=0.\n\n See https://github.com/numpy/numpy/issues/4763 for a detailed discussion\n\n \"\"\"\n return RandomState(seed=0)\n\n\nfrom numpy._pytesttester import PytestTester\ntest = PytestTester(__name__)\ndel PytestTester\n", "path": "numpy/random/__init__.py"}], "after_files": [{"content": "\"\"\"\n========================\nRandom Number Generation\n========================\n\nInstantiate a BitGenerator and wrap it in a Generator\nwhich will convert the uniform stream to a number of distributions. The \"bare\"\nfunctions are kept for legacy code, they should be called with the newer API\nvia ``np.random.Generator().function`` instead\n\n==================== =========================================================\nUtility functions\n-------------------- ---------------------------------------------------------\nrandom Uniformly distributed floats over ``[0, 1)``\nintegers Uniformly distributed integers, replaces ``randint``\nbytes Uniformly distributed random bytes.\npermutation Randomly permute a sequence / generate a random sequence.\nshuffle Randomly permute a sequence in place.\nseed Seed the random number generator.\nchoice Random sample from 1-D array.\n==================== =========================================================\n\n==================== =========================================================\nCompatibility\nfunctions - removed\nin the new API\n-------------------- ---------------------------------------------------------\nrand Uniformly distributed values.\nrandn Normally distributed values.\nranf Uniformly distributed floating point numbers.\nrandom_integers Uniformly distributed integers in a given range.\n (deprecated, use ``integers(..., closed=True)`` instead)\nrandom_sample Alias for `random_sample`\nrandint Uniformly distributed integers in a given range\n==================== =========================================================\n\n==================== =========================================================\nUnivariate\ndistributions\n-------------------- ---------------------------------------------------------\nbeta Beta distribution over ``[0, 1]``.\nbinomial Binomial distribution.\nchisquare :math:`\\\\chi^2` distribution.\nexponential Exponential distribution.\nf F (Fisher-Snedecor) distribution.\ngamma Gamma distribution.\ngeometric Geometric distribution.\ngumbel Gumbel distribution.\nhypergeometric Hypergeometric distribution.\nlaplace Laplace distribution.\nlogistic Logistic distribution.\nlognormal Log-normal distribution.\nlogseries Logarithmic series distribution.\nnegative_binomial Negative binomial distribution.\nnoncentral_chisquare Non-central chi-square distribution.\nnoncentral_f Non-central F distribution.\nnormal Normal / Gaussian distribution.\npareto Pareto distribution.\npoisson Poisson distribution.\npower Power distribution.\nrayleigh Rayleigh distribution.\ntriangular Triangular distribution.\nuniform Uniform distribution.\nvonmises Von Mises circular distribution.\nwald Wald (inverse Gaussian) distribution.\nweibull Weibull distribution.\nzipf Zipf's distribution over ranked data.\n==================== =========================================================\n\n==================== ==========================================================\nMultivariate\ndistributions\n-------------------- ----------------------------------------------------------\ndirichlet Multivariate generalization of Beta distribution.\nmultinomial Multivariate generalization of the binomial distribution.\nmultivariate_normal Multivariate generalization of the normal distribution.\n==================== ==========================================================\n\n==================== =========================================================\nStandard\ndistributions\n-------------------- ---------------------------------------------------------\nstandard_cauchy Standard Cauchy-Lorentz distribution.\nstandard_exponential Standard exponential distribution.\nstandard_gamma Standard Gamma distribution.\nstandard_normal Standard normal distribution.\nstandard_t Standard Student's t-distribution.\n==================== =========================================================\n\n==================== =========================================================\nInternal functions\n-------------------- ---------------------------------------------------------\nget_state Get tuple representing internal state of generator.\nset_state Set state of generator.\n==================== =========================================================\n\n============================================= ===\nBitGenerator Streams that work with Generator\n--------------------------------------------- ---\nMT19937\nDSFMT\nPCG32\nPCG64\nPhilox\nThreeFry\nXoshiro256\nXoshiro512\n============================================= ===\n\n\"\"\"\nfrom __future__ import division, absolute_import, print_function\n\n__all__ = [\n 'beta',\n 'binomial',\n 'bytes',\n 'chisquare',\n 'choice',\n 'dirichlet',\n 'exponential',\n 'f',\n 'gamma',\n 'geometric',\n 'get_state',\n 'gumbel',\n 'hypergeometric',\n 'laplace',\n 'logistic',\n 'lognormal',\n 'logseries',\n 'multinomial',\n 'multivariate_normal',\n 'negative_binomial',\n 'noncentral_chisquare',\n 'noncentral_f',\n 'normal',\n 'pareto',\n 'permutation',\n 'poisson',\n 'power',\n 'rand',\n 'randint',\n 'randn',\n 'random',\n 'random_integers',\n 'random_sample',\n 'ranf',\n 'rayleigh',\n 'sample',\n 'seed',\n 'set_state',\n 'shuffle',\n 'standard_cauchy',\n 'standard_exponential',\n 'standard_gamma',\n 'standard_normal',\n 'standard_t',\n 'triangular',\n 'uniform',\n 'vonmises',\n 'wald',\n 'weibull',\n 'zipf',\n]\n\nfrom . import mtrand\nfrom .mtrand import *\nfrom .dsfmt import DSFMT\nfrom .generator import Generator\nfrom .mt19937 import MT19937\nfrom .pcg32 import PCG32\nfrom .pcg64 import PCG64\nfrom .philox import Philox\nfrom .threefry import ThreeFry\nfrom .xoshiro256 import Xoshiro256\nfrom .xoshiro512 import Xoshiro512\nfrom .mtrand import RandomState\n\n__all__ += ['Generator', 'DSFMT', 'MT19937', 'Philox', 'PCG64', 'PCG32',\n 'ThreeFry', 'Xoshiro256', 'Xoshiro512', 'RandomState']\n\n\ndef __RandomState_ctor():\n \"\"\"Return a RandomState instance.\n\n This function exists solely to assist (un)pickling.\n\n Note that the state of the RandomState returned here is irrelevant, as this function's\n entire purpose is to return a newly allocated RandomState whose state pickle can set.\n Consequently the RandomState returned by this function is a freshly allocated copy\n with a seed=0.\n\n See https://github.com/numpy/numpy/issues/4763 for a detailed discussion\n\n \"\"\"\n return RandomState(seed=0)\n\n\nfrom numpy._pytesttester import PytestTester\ntest = PytestTester(__name__)\ndel PytestTester\n", "path": "numpy/random/__init__.py"}]} | 2,201 | 239 |
gh_patches_debug_66191 | rasdani/github-patches | git_diff | nipy__nipype-3634 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ENH: add STC partial volume correction to PETPVC interface
### Summary
Partial Volume Correction using Single-target correction (STC) has been added to PETPVC since the Nipype PETPVC interface was created, and it would therefore be ideal if this could be added to the interface as well.
### Actual behavior
The interface should include the 'STC' option for the 'pvc' flag.
### Expected behavior
### How to replicate the behavior
### Script/Workflow details
Please put URL to code or code here (if not too long).
### Platform details:
<!-- Please run the following code from your shell and place the output between the triple ticks, below.
python -c "import nipype; from pprint import pprint; pprint(nipype.get_info())"
-->
```
```
### Execution environment
Choose one
- Container [Tag: ???]
- My python environment inside container [Base Tag: ???]
- My python environment outside container
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nipype/interfaces/petpvc.py`
Content:
```
1 # emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: nil -*-
2 # vi: set ft=python sts=4 ts=4 sw=4 et:
3 """PETPVC is a toolbox for partial volume correction in positron emission tomography."""
4 import os
5
6 from .base import (
7 TraitedSpec,
8 CommandLineInputSpec,
9 CommandLine,
10 File,
11 isdefined,
12 traits,
13 )
14 from ..utils.filemanip import fname_presuffix
15 from ..external.due import BibTeX
16
17 pvc_methods = [
18 "GTM",
19 "IY",
20 "IY+RL",
21 "IY+VC",
22 "LABBE",
23 "LABBE+MTC",
24 "LABBE+MTC+RL",
25 "LABBE+MTC+VC",
26 "LABBE+RBV",
27 "LABBE+RBV+RL",
28 "LABBE+RBV+VC",
29 "MG",
30 "MG+RL",
31 "MG+VC",
32 "MTC",
33 "MTC+RL",
34 "MTC+VC",
35 "RBV",
36 "RBV+RL",
37 "RBV+VC",
38 "RL",
39 "VC",
40 ]
41
42
43 class PETPVCInputSpec(CommandLineInputSpec):
44 in_file = File(desc="PET image file", exists=True, mandatory=True, argstr="-i %s")
45 out_file = File(desc="Output file", genfile=True, hash_files=False, argstr="-o %s")
46 mask_file = File(
47 desc="Mask image file", exists=True, mandatory=True, argstr="-m %s"
48 )
49 pvc = traits.Enum(
50 pvc_methods,
51 mandatory=True,
52 argstr="-p %s",
53 desc="""\
54 Desired PVC method:
55
56 * Geometric transfer matrix -- ``GTM``
57 * Labbe approach -- ``LABBE``
58 * Richardson-Lucy -- ``RL``
59 * Van-Cittert -- ``VC``
60 * Region-based voxel-wise correction -- ``RBV``
61 * RBV with Labbe -- ``LABBE+RBV``
62 * RBV with Van-Cittert -- ``RBV+VC``
63 * RBV with Richardson-Lucy -- ``RBV+RL``
64 * RBV with Labbe and Van-Cittert -- ``LABBE+RBV+VC``
65 * RBV with Labbe and Richardson-Lucy -- ``LABBE+RBV+RL``
66 * Multi-target correction -- ``MTC``
67 * MTC with Labbe -- ``LABBE+MTC``
68 * MTC with Van-Cittert -- ``MTC+VC``
69 * MTC with Richardson-Lucy -- ``MTC+RL``
70 * MTC with Labbe and Van-Cittert -- ``LABBE+MTC+VC``
71 * MTC with Labbe and Richardson-Lucy -- ``LABBE+MTC+RL``
72 * Iterative Yang -- ``IY``
73 * Iterative Yang with Van-Cittert -- ``IY+VC``
74 * Iterative Yang with Richardson-Lucy -- ``IY+RL``
75 * Muller Gartner -- ``MG``
76 * Muller Gartner with Van-Cittert -- ``MG+VC``
77 * Muller Gartner with Richardson-Lucy -- ``MG+RL``
78
79 """,
80 )
81 fwhm_x = traits.Float(
82 desc="The full-width at half maximum in mm along x-axis",
83 mandatory=True,
84 argstr="-x %.4f",
85 )
86 fwhm_y = traits.Float(
87 desc="The full-width at half maximum in mm along y-axis",
88 mandatory=True,
89 argstr="-y %.4f",
90 )
91 fwhm_z = traits.Float(
92 desc="The full-width at half maximum in mm along z-axis",
93 mandatory=True,
94 argstr="-z %.4f",
95 )
96 debug = traits.Bool(
97 desc="Prints debug information",
98 usedefault=True,
99 default_value=False,
100 argstr="-d",
101 )
102 n_iter = traits.Int(
103 desc="Number of iterations", default_value=10, usedefault=True, argstr="-n %d"
104 )
105 n_deconv = traits.Int(
106 desc="Number of deconvolution iterations",
107 default_value=10,
108 usedefault=True,
109 argstr="-k %d",
110 )
111 alpha = traits.Float(
112 desc="Alpha value", default_value=1.5, usedefault=True, argstr="-a %.4f"
113 )
114 stop_crit = traits.Float(
115 desc="Stopping criterion", default_value=0.01, usedefault=True, argstr="-s %.4f"
116 )
117
118
119 class PETPVCOutputSpec(TraitedSpec):
120 out_file = File(desc="Output file")
121
122
123 class PETPVC(CommandLine):
124 """Use PETPVC for partial volume correction of PET images.
125
126 PETPVC ([1]_, [2]_) is a software from the Nuclear Medicine Department
127 of the UCL University Hospital, London, UK.
128
129 Examples
130 --------
131 >>> from ..testing import example_data
132 >>> #TODO get data for PETPVC
133 >>> pvc = PETPVC()
134 >>> pvc.inputs.in_file = 'pet.nii.gz'
135 >>> pvc.inputs.mask_file = 'tissues.nii.gz'
136 >>> pvc.inputs.out_file = 'pet_pvc_rbv.nii.gz'
137 >>> pvc.inputs.pvc = 'RBV'
138 >>> pvc.inputs.fwhm_x = 2.0
139 >>> pvc.inputs.fwhm_y = 2.0
140 >>> pvc.inputs.fwhm_z = 2.0
141 >>> outs = pvc.run() #doctest: +SKIP
142
143 References
144 ----------
145 .. [1] K. Erlandsson, I. Buvat, P. H. Pretorius, B. A. Thomas, and B. F. Hutton,
146 "A review of partial volume correction techniques for emission tomography
147 and their applications in neurology, cardiology and oncology," Phys. Med.
148 Biol., vol. 57, no. 21, p. R119, 2012.
149 .. [2] https://github.com/UCL/PETPVC
150
151 """
152
153 input_spec = PETPVCInputSpec
154 output_spec = PETPVCOutputSpec
155 _cmd = "petpvc"
156
157 _references = [
158 {
159 "entry": BibTeX(
160 "@article{0031-9155-61-22-7975,"
161 "author={Benjamin A Thomas and Vesna Cuplov and Alexandre Bousse and "
162 "Adriana Mendes and Kris Thielemans and Brian F Hutton and Kjell Erlandsson},"
163 "title={PETPVC: a toolbox for performing partial volume correction "
164 "techniques in positron emission tomography},"
165 "journal={Physics in Medicine and Biology},"
166 "volume={61},"
167 "number={22},"
168 "pages={7975},"
169 "url={http://stacks.iop.org/0031-9155/61/i=22/a=7975},"
170 "doi={https://doi.org/10.1088/0031-9155/61/22/7975},"
171 "year={2016},"
172 "}"
173 ),
174 "description": "PETPVC software implementation publication",
175 "tags": ["implementation"],
176 }
177 ]
178
179 def _list_outputs(self):
180 outputs = self.output_spec().get()
181 outputs["out_file"] = self.inputs.out_file
182 if not isdefined(outputs["out_file"]):
183 method_name = self.inputs.pvc.lower()
184 outputs["out_file"] = self._gen_fname(
185 self.inputs.in_file, suffix=f"_{method_name}_pvc"
186 )
187
188 outputs["out_file"] = os.path.abspath(outputs["out_file"])
189 return outputs
190
191 def _gen_fname(
192 self, basename, cwd=None, suffix=None, change_ext=True, ext=".nii.gz"
193 ):
194 """Generate a filename based on the given parameters.
195
196 The filename will take the form: cwd/basename<suffix><ext>.
197 If change_ext is True, it will use the extensions specified in
198 <instance>inputs.output_type.
199
200 Parameters
201 ----------
202 basename : str
203 Filename to base the new filename on.
204 cwd : str
205 Path to prefix to the new filename. (default is os.getcwd())
206 suffix : str
207 Suffix to add to the `basename`. (defaults is '' )
208 change_ext : bool
209 Flag to change the filename extension to the given `ext`.
210 (Default is False)
211
212 Returns
213 -------
214 fname : str
215 New filename based on given parameters.
216
217 """
218 if basename == "":
219 msg = "Unable to generate filename for command %s. " % self.cmd
220 msg += "basename is not set!"
221 raise ValueError(msg)
222 if cwd is None:
223 cwd = os.getcwd()
224 if change_ext:
225 if suffix:
226 suffix = "".join((suffix, ext))
227 else:
228 suffix = ext
229 if suffix is None:
230 suffix = ""
231 fname = fname_presuffix(basename, suffix=suffix, use_ext=False, newpath=cwd)
232 return fname
233
234 def _gen_filename(self, name):
235 if name == "out_file":
236 return self._list_outputs()["out_file"]
237 return None
238
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/nipype/interfaces/petpvc.py b/nipype/interfaces/petpvc.py
--- a/nipype/interfaces/petpvc.py
+++ b/nipype/interfaces/petpvc.py
@@ -37,6 +37,7 @@
"RBV+VC",
"RL",
"VC",
+ "STC",
]
@@ -75,6 +76,7 @@
* Muller Gartner -- ``MG``
* Muller Gartner with Van-Cittert -- ``MG+VC``
* Muller Gartner with Richardson-Lucy -- ``MG+RL``
+ * Single-target correction -- ``STC``
""",
)
| {"golden_diff": "diff --git a/nipype/interfaces/petpvc.py b/nipype/interfaces/petpvc.py\n--- a/nipype/interfaces/petpvc.py\n+++ b/nipype/interfaces/petpvc.py\n@@ -37,6 +37,7 @@\n \"RBV+VC\",\n \"RL\",\n \"VC\",\n+ \"STC\",\n ]\n \n \n@@ -75,6 +76,7 @@\n * Muller Gartner -- ``MG``\n * Muller Gartner with Van-Cittert -- ``MG+VC``\n * Muller Gartner with Richardson-Lucy -- ``MG+RL``\n+ * Single-target correction -- ``STC``\n \n \"\"\",\n )\n", "issue": "ENH: add STC partial volume correction to PETPVC interface\n### Summary\r\nPartial Volume Correction using Single-target correction (STC) has been added to PETPVC since the Nipype PETPVC interface was created, and it would therefore be ideal if this could be added to the interface as well.\r\n\r\n### Actual behavior\r\nThe interface should include the 'STC' option for the 'pvc' flag.\r\n\r\n### Expected behavior\r\n\r\n### How to replicate the behavior\r\n\r\n### Script/Workflow details\r\n\r\nPlease put URL to code or code here (if not too long).\r\n\r\n### Platform details:\r\n\r\n<!-- Please run the following code from your shell and place the output between the triple ticks, below.\r\npython -c \"import nipype; from pprint import pprint; pprint(nipype.get_info())\"\r\n-->\r\n\r\n```\r\n\r\n```\r\n\r\n### Execution environment\r\n\r\nChoose one\r\n- Container [Tag: ???]\r\n- My python environment inside container [Base Tag: ???]\r\n- My python environment outside container\r\n\n", "before_files": [{"content": "# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: nil -*-\n# vi: set ft=python sts=4 ts=4 sw=4 et:\n\"\"\"PETPVC is a toolbox for partial volume correction in positron emission tomography.\"\"\"\nimport os\n\nfrom .base import (\n TraitedSpec,\n CommandLineInputSpec,\n CommandLine,\n File,\n isdefined,\n traits,\n)\nfrom ..utils.filemanip import fname_presuffix\nfrom ..external.due import BibTeX\n\npvc_methods = [\n \"GTM\",\n \"IY\",\n \"IY+RL\",\n \"IY+VC\",\n \"LABBE\",\n \"LABBE+MTC\",\n \"LABBE+MTC+RL\",\n \"LABBE+MTC+VC\",\n \"LABBE+RBV\",\n \"LABBE+RBV+RL\",\n \"LABBE+RBV+VC\",\n \"MG\",\n \"MG+RL\",\n \"MG+VC\",\n \"MTC\",\n \"MTC+RL\",\n \"MTC+VC\",\n \"RBV\",\n \"RBV+RL\",\n \"RBV+VC\",\n \"RL\",\n \"VC\",\n]\n\n\nclass PETPVCInputSpec(CommandLineInputSpec):\n in_file = File(desc=\"PET image file\", exists=True, mandatory=True, argstr=\"-i %s\")\n out_file = File(desc=\"Output file\", genfile=True, hash_files=False, argstr=\"-o %s\")\n mask_file = File(\n desc=\"Mask image file\", exists=True, mandatory=True, argstr=\"-m %s\"\n )\n pvc = traits.Enum(\n pvc_methods,\n mandatory=True,\n argstr=\"-p %s\",\n desc=\"\"\"\\\nDesired PVC method:\n\n * Geometric transfer matrix -- ``GTM``\n * Labbe approach -- ``LABBE``\n * Richardson-Lucy -- ``RL``\n * Van-Cittert -- ``VC``\n * Region-based voxel-wise correction -- ``RBV``\n * RBV with Labbe -- ``LABBE+RBV``\n * RBV with Van-Cittert -- ``RBV+VC``\n * RBV with Richardson-Lucy -- ``RBV+RL``\n * RBV with Labbe and Van-Cittert -- ``LABBE+RBV+VC``\n * RBV with Labbe and Richardson-Lucy -- ``LABBE+RBV+RL``\n * Multi-target correction -- ``MTC``\n * MTC with Labbe -- ``LABBE+MTC``\n * MTC with Van-Cittert -- ``MTC+VC``\n * MTC with Richardson-Lucy -- ``MTC+RL``\n * MTC with Labbe and Van-Cittert -- ``LABBE+MTC+VC``\n * MTC with Labbe and Richardson-Lucy -- ``LABBE+MTC+RL``\n * Iterative Yang -- ``IY``\n * Iterative Yang with Van-Cittert -- ``IY+VC``\n * Iterative Yang with Richardson-Lucy -- ``IY+RL``\n * Muller Gartner -- ``MG``\n * Muller Gartner with Van-Cittert -- ``MG+VC``\n * Muller Gartner with Richardson-Lucy -- ``MG+RL``\n\n\"\"\",\n )\n fwhm_x = traits.Float(\n desc=\"The full-width at half maximum in mm along x-axis\",\n mandatory=True,\n argstr=\"-x %.4f\",\n )\n fwhm_y = traits.Float(\n desc=\"The full-width at half maximum in mm along y-axis\",\n mandatory=True,\n argstr=\"-y %.4f\",\n )\n fwhm_z = traits.Float(\n desc=\"The full-width at half maximum in mm along z-axis\",\n mandatory=True,\n argstr=\"-z %.4f\",\n )\n debug = traits.Bool(\n desc=\"Prints debug information\",\n usedefault=True,\n default_value=False,\n argstr=\"-d\",\n )\n n_iter = traits.Int(\n desc=\"Number of iterations\", default_value=10, usedefault=True, argstr=\"-n %d\"\n )\n n_deconv = traits.Int(\n desc=\"Number of deconvolution iterations\",\n default_value=10,\n usedefault=True,\n argstr=\"-k %d\",\n )\n alpha = traits.Float(\n desc=\"Alpha value\", default_value=1.5, usedefault=True, argstr=\"-a %.4f\"\n )\n stop_crit = traits.Float(\n desc=\"Stopping criterion\", default_value=0.01, usedefault=True, argstr=\"-s %.4f\"\n )\n\n\nclass PETPVCOutputSpec(TraitedSpec):\n out_file = File(desc=\"Output file\")\n\n\nclass PETPVC(CommandLine):\n \"\"\"Use PETPVC for partial volume correction of PET images.\n\n PETPVC ([1]_, [2]_) is a software from the Nuclear Medicine Department\n of the UCL University Hospital, London, UK.\n\n Examples\n --------\n >>> from ..testing import example_data\n >>> #TODO get data for PETPVC\n >>> pvc = PETPVC()\n >>> pvc.inputs.in_file = 'pet.nii.gz'\n >>> pvc.inputs.mask_file = 'tissues.nii.gz'\n >>> pvc.inputs.out_file = 'pet_pvc_rbv.nii.gz'\n >>> pvc.inputs.pvc = 'RBV'\n >>> pvc.inputs.fwhm_x = 2.0\n >>> pvc.inputs.fwhm_y = 2.0\n >>> pvc.inputs.fwhm_z = 2.0\n >>> outs = pvc.run() #doctest: +SKIP\n\n References\n ----------\n .. [1] K. Erlandsson, I. Buvat, P. H. Pretorius, B. A. Thomas, and B. F. Hutton,\n \"A review of partial volume correction techniques for emission tomography\n and their applications in neurology, cardiology and oncology,\" Phys. Med.\n Biol., vol. 57, no. 21, p. R119, 2012.\n .. [2] https://github.com/UCL/PETPVC\n\n \"\"\"\n\n input_spec = PETPVCInputSpec\n output_spec = PETPVCOutputSpec\n _cmd = \"petpvc\"\n\n _references = [\n {\n \"entry\": BibTeX(\n \"@article{0031-9155-61-22-7975,\"\n \"author={Benjamin A Thomas and Vesna Cuplov and Alexandre Bousse and \"\n \"Adriana Mendes and Kris Thielemans and Brian F Hutton and Kjell Erlandsson},\"\n \"title={PETPVC: a toolbox for performing partial volume correction \"\n \"techniques in positron emission tomography},\"\n \"journal={Physics in Medicine and Biology},\"\n \"volume={61},\"\n \"number={22},\"\n \"pages={7975},\"\n \"url={http://stacks.iop.org/0031-9155/61/i=22/a=7975},\"\n \"doi={https://doi.org/10.1088/0031-9155/61/22/7975},\"\n \"year={2016},\"\n \"}\"\n ),\n \"description\": \"PETPVC software implementation publication\",\n \"tags\": [\"implementation\"],\n }\n ]\n\n def _list_outputs(self):\n outputs = self.output_spec().get()\n outputs[\"out_file\"] = self.inputs.out_file\n if not isdefined(outputs[\"out_file\"]):\n method_name = self.inputs.pvc.lower()\n outputs[\"out_file\"] = self._gen_fname(\n self.inputs.in_file, suffix=f\"_{method_name}_pvc\"\n )\n\n outputs[\"out_file\"] = os.path.abspath(outputs[\"out_file\"])\n return outputs\n\n def _gen_fname(\n self, basename, cwd=None, suffix=None, change_ext=True, ext=\".nii.gz\"\n ):\n \"\"\"Generate a filename based on the given parameters.\n\n The filename will take the form: cwd/basename<suffix><ext>.\n If change_ext is True, it will use the extensions specified in\n <instance>inputs.output_type.\n\n Parameters\n ----------\n basename : str\n Filename to base the new filename on.\n cwd : str\n Path to prefix to the new filename. (default is os.getcwd())\n suffix : str\n Suffix to add to the `basename`. (defaults is '' )\n change_ext : bool\n Flag to change the filename extension to the given `ext`.\n (Default is False)\n\n Returns\n -------\n fname : str\n New filename based on given parameters.\n\n \"\"\"\n if basename == \"\":\n msg = \"Unable to generate filename for command %s. \" % self.cmd\n msg += \"basename is not set!\"\n raise ValueError(msg)\n if cwd is None:\n cwd = os.getcwd()\n if change_ext:\n if suffix:\n suffix = \"\".join((suffix, ext))\n else:\n suffix = ext\n if suffix is None:\n suffix = \"\"\n fname = fname_presuffix(basename, suffix=suffix, use_ext=False, newpath=cwd)\n return fname\n\n def _gen_filename(self, name):\n if name == \"out_file\":\n return self._list_outputs()[\"out_file\"]\n return None\n", "path": "nipype/interfaces/petpvc.py"}], "after_files": [{"content": "# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: nil -*-\n# vi: set ft=python sts=4 ts=4 sw=4 et:\n\"\"\"PETPVC is a toolbox for partial volume correction in positron emission tomography.\"\"\"\nimport os\n\nfrom .base import (\n TraitedSpec,\n CommandLineInputSpec,\n CommandLine,\n File,\n isdefined,\n traits,\n)\nfrom ..utils.filemanip import fname_presuffix\nfrom ..external.due import BibTeX\n\npvc_methods = [\n \"GTM\",\n \"IY\",\n \"IY+RL\",\n \"IY+VC\",\n \"LABBE\",\n \"LABBE+MTC\",\n \"LABBE+MTC+RL\",\n \"LABBE+MTC+VC\",\n \"LABBE+RBV\",\n \"LABBE+RBV+RL\",\n \"LABBE+RBV+VC\",\n \"MG\",\n \"MG+RL\",\n \"MG+VC\",\n \"MTC\",\n \"MTC+RL\",\n \"MTC+VC\",\n \"RBV\",\n \"RBV+RL\",\n \"RBV+VC\",\n \"RL\",\n \"VC\",\n \"STC\",\n]\n\n\nclass PETPVCInputSpec(CommandLineInputSpec):\n in_file = File(desc=\"PET image file\", exists=True, mandatory=True, argstr=\"-i %s\")\n out_file = File(desc=\"Output file\", genfile=True, hash_files=False, argstr=\"-o %s\")\n mask_file = File(\n desc=\"Mask image file\", exists=True, mandatory=True, argstr=\"-m %s\"\n )\n pvc = traits.Enum(\n pvc_methods,\n mandatory=True,\n argstr=\"-p %s\",\n desc=\"\"\"\\\nDesired PVC method:\n\n * Geometric transfer matrix -- ``GTM``\n * Labbe approach -- ``LABBE``\n * Richardson-Lucy -- ``RL``\n * Van-Cittert -- ``VC``\n * Region-based voxel-wise correction -- ``RBV``\n * RBV with Labbe -- ``LABBE+RBV``\n * RBV with Van-Cittert -- ``RBV+VC``\n * RBV with Richardson-Lucy -- ``RBV+RL``\n * RBV with Labbe and Van-Cittert -- ``LABBE+RBV+VC``\n * RBV with Labbe and Richardson-Lucy -- ``LABBE+RBV+RL``\n * Multi-target correction -- ``MTC``\n * MTC with Labbe -- ``LABBE+MTC``\n * MTC with Van-Cittert -- ``MTC+VC``\n * MTC with Richardson-Lucy -- ``MTC+RL``\n * MTC with Labbe and Van-Cittert -- ``LABBE+MTC+VC``\n * MTC with Labbe and Richardson-Lucy -- ``LABBE+MTC+RL``\n * Iterative Yang -- ``IY``\n * Iterative Yang with Van-Cittert -- ``IY+VC``\n * Iterative Yang with Richardson-Lucy -- ``IY+RL``\n * Muller Gartner -- ``MG``\n * Muller Gartner with Van-Cittert -- ``MG+VC``\n * Muller Gartner with Richardson-Lucy -- ``MG+RL``\n * Single-target correction -- ``STC``\n\n\"\"\",\n )\n fwhm_x = traits.Float(\n desc=\"The full-width at half maximum in mm along x-axis\",\n mandatory=True,\n argstr=\"-x %.4f\",\n )\n fwhm_y = traits.Float(\n desc=\"The full-width at half maximum in mm along y-axis\",\n mandatory=True,\n argstr=\"-y %.4f\",\n )\n fwhm_z = traits.Float(\n desc=\"The full-width at half maximum in mm along z-axis\",\n mandatory=True,\n argstr=\"-z %.4f\",\n )\n debug = traits.Bool(\n desc=\"Prints debug information\",\n usedefault=True,\n default_value=False,\n argstr=\"-d\",\n )\n n_iter = traits.Int(\n desc=\"Number of iterations\", default_value=10, usedefault=True, argstr=\"-n %d\"\n )\n n_deconv = traits.Int(\n desc=\"Number of deconvolution iterations\",\n default_value=10,\n usedefault=True,\n argstr=\"-k %d\",\n )\n alpha = traits.Float(\n desc=\"Alpha value\", default_value=1.5, usedefault=True, argstr=\"-a %.4f\"\n )\n stop_crit = traits.Float(\n desc=\"Stopping criterion\", default_value=0.01, usedefault=True, argstr=\"-s %.4f\"\n )\n\n\nclass PETPVCOutputSpec(TraitedSpec):\n out_file = File(desc=\"Output file\")\n\n\nclass PETPVC(CommandLine):\n \"\"\"Use PETPVC for partial volume correction of PET images.\n\n PETPVC ([1]_, [2]_) is a software from the Nuclear Medicine Department\n of the UCL University Hospital, London, UK.\n\n Examples\n --------\n >>> from ..testing import example_data\n >>> #TODO get data for PETPVC\n >>> pvc = PETPVC()\n >>> pvc.inputs.in_file = 'pet.nii.gz'\n >>> pvc.inputs.mask_file = 'tissues.nii.gz'\n >>> pvc.inputs.out_file = 'pet_pvc_rbv.nii.gz'\n >>> pvc.inputs.pvc = 'RBV'\n >>> pvc.inputs.fwhm_x = 2.0\n >>> pvc.inputs.fwhm_y = 2.0\n >>> pvc.inputs.fwhm_z = 2.0\n >>> outs = pvc.run() #doctest: +SKIP\n\n References\n ----------\n .. [1] K. Erlandsson, I. Buvat, P. H. Pretorius, B. A. Thomas, and B. F. Hutton,\n \"A review of partial volume correction techniques for emission tomography\n and their applications in neurology, cardiology and oncology,\" Phys. Med.\n Biol., vol. 57, no. 21, p. R119, 2012.\n .. [2] https://github.com/UCL/PETPVC\n\n \"\"\"\n\n input_spec = PETPVCInputSpec\n output_spec = PETPVCOutputSpec\n _cmd = \"petpvc\"\n\n _references = [\n {\n \"entry\": BibTeX(\n \"@article{0031-9155-61-22-7975,\"\n \"author={Benjamin A Thomas and Vesna Cuplov and Alexandre Bousse and \"\n \"Adriana Mendes and Kris Thielemans and Brian F Hutton and Kjell Erlandsson},\"\n \"title={PETPVC: a toolbox for performing partial volume correction \"\n \"techniques in positron emission tomography},\"\n \"journal={Physics in Medicine and Biology},\"\n \"volume={61},\"\n \"number={22},\"\n \"pages={7975},\"\n \"url={http://stacks.iop.org/0031-9155/61/i=22/a=7975},\"\n \"doi={https://doi.org/10.1088/0031-9155/61/22/7975},\"\n \"year={2016},\"\n \"}\"\n ),\n \"description\": \"PETPVC software implementation publication\",\n \"tags\": [\"implementation\"],\n }\n ]\n\n def _list_outputs(self):\n outputs = self.output_spec().get()\n outputs[\"out_file\"] = self.inputs.out_file\n if not isdefined(outputs[\"out_file\"]):\n method_name = self.inputs.pvc.lower()\n outputs[\"out_file\"] = self._gen_fname(\n self.inputs.in_file, suffix=f\"_{method_name}_pvc\"\n )\n\n outputs[\"out_file\"] = os.path.abspath(outputs[\"out_file\"])\n return outputs\n\n def _gen_fname(\n self, basename, cwd=None, suffix=None, change_ext=True, ext=\".nii.gz\"\n ):\n \"\"\"Generate a filename based on the given parameters.\n\n The filename will take the form: cwd/basename<suffix><ext>.\n If change_ext is True, it will use the extensions specified in\n <instance>inputs.output_type.\n\n Parameters\n ----------\n basename : str\n Filename to base the new filename on.\n cwd : str\n Path to prefix to the new filename. (default is os.getcwd())\n suffix : str\n Suffix to add to the `basename`. (defaults is '' )\n change_ext : bool\n Flag to change the filename extension to the given `ext`.\n (Default is False)\n\n Returns\n -------\n fname : str\n New filename based on given parameters.\n\n \"\"\"\n if basename == \"\":\n msg = \"Unable to generate filename for command %s. \" % self.cmd\n msg += \"basename is not set!\"\n raise ValueError(msg)\n if cwd is None:\n cwd = os.getcwd()\n if change_ext:\n if suffix:\n suffix = \"\".join((suffix, ext))\n else:\n suffix = ext\n if suffix is None:\n suffix = \"\"\n fname = fname_presuffix(basename, suffix=suffix, use_ext=False, newpath=cwd)\n return fname\n\n def _gen_filename(self, name):\n if name == \"out_file\":\n return self._list_outputs()[\"out_file\"]\n return None\n", "path": "nipype/interfaces/petpvc.py"}]} | 3,171 | 155 |
gh_patches_debug_43549 | rasdani/github-patches | git_diff | hpcaitech__ColossalAI-4544 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[tensor] fix some unittests
[tensor] fix some unittests
[tensor] fix some unittests
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `colossalai/shardformer/shard/sharder.py`
Content:
```
1 from types import MethodType
2 from typing import Any, Callable, Dict, List, Optional, Set, Union
3
4 import torch.nn as nn
5 from torch import Tensor
6
7 from colossalai.lazy import LazyInitContext
8
9 from .._utils import getattr_, setattr_
10 from ..policies.auto_policy import get_autopolicy
11 from ..policies.base_policy import Policy, SubModuleReplacementDescription
12 from .shard_config import ShardConfig
13 from .utils import set_tensors_to_none
14
15 __all__ = ['ModelSharder', 'shard_model']
16
17
18 class ModelSharder(object):
19 r"""
20 Shard the original huggingface model according to the policy
21
22 Args:
23 policy (:class:`Policy`): The policy to shard the model
24 model (:class:`torch.Module`): The model to shard
25 shard_config: The setting of distributed model
26 """
27
28 def __init__(self, model: nn.Module, policy: Policy, shard_config: ShardConfig = None) -> None:
29 self.model = model
30 self.policy = get_autopolicy(self.model) if policy is None else policy
31 self.shard_config = shard_config
32
33 def shard(self) -> List[Dict[int, Tensor]]:
34 r"""
35 Shard the model according to the policy
36 """
37 self.policy.set_model(self.model)
38 self.policy.set_shard_config(self.shard_config)
39 self._preprocess()
40 # get shared params before release unheld layers, this avoid misjudgement of shared params (None is None)
41 shared_params = self.policy.get_shared_params()
42 held_layers = self._release_unheld_layers()
43 self._replace_module(include=held_layers)
44 self._materialize()
45 self._postprocess()
46 return shared_params
47
48 def _preprocess(self) -> None:
49 self.model = self.policy.preprocess()
50
51 def _postprocess(self) -> None:
52 self.model = self.policy.postprocess()
53
54 def _replace_module(self, include: Optional[Set[nn.Module]] = None) -> None:
55 r"""
56 Replace the module according to the policy, and replace the module one by one
57
58 Args:
59 model (:class:`torch.nn.Module`): The model to shard
60 """
61 module_descriptions = self.policy.module_policy()
62 for layer_cls, module_description in module_descriptions.items():
63 attr_replacement = module_description.attribute_replacement
64 param_replacement = module_description.param_replacement
65 sub_module_replacement = module_description.sub_module_replacement
66 method_replacement = module_description.method_replacement
67 self._recursive_replace_layer(self.model,
68 layer_cls,
69 attr_replacement,
70 param_replacement,
71 method_replacement,
72 sub_module_replacement,
73 include=include)
74
75 def _recursive_replace_layer(
76 self,
77 module: nn.Module,
78 origin_cls: Union[str, nn.Module],
79 attr_replacement: Dict[str, Any],
80 param_replacement: List[Callable],
81 method_replacement: Dict[str, Callable],
82 sub_module_replacement: List[SubModuleReplacementDescription],
83 include: Optional[Set[nn.Module]] = None,
84 ) -> None:
85 r"""
86 Reverse the replace layer operation
87
88 Args:
89 module (torch.nn.Module): The object of layer to shard
90 origin_cls (Union[str, torch.nn.Module]): The origin layer class or a string of layer class name
91 attr_replacement (Dict[str, Any]): The attribute dict to modify
92 param_replacement (List[Callable]): The function list to get parameter shard information in policy
93 method_replacement (Dict[str, Callable]): Key is the method name, value is the method for replacement
94 sub_module_replacement ((List[SubModuleReplacementDescription]): The function list to get sub module shard information in policy
95 """
96 # released layers are not shardable
97 can_replace_param_or_layer = include is None or module in include
98 if (isinstance(origin_cls, str) and origin_cls == module.__class__.__name__) or \
99 (module.__class__ == origin_cls):
100 if attr_replacement is not None:
101 self._replace_attr(module, attr_replacement)
102
103 if param_replacement is not None and can_replace_param_or_layer:
104 self._replace_param(module, param_replacement)
105
106 if method_replacement is not None:
107 self._replace_method(module, method_replacement)
108
109 if sub_module_replacement is not None and can_replace_param_or_layer:
110 self._replace_sub_module(module, sub_module_replacement)
111
112 for name, child in module.named_children():
113 self._recursive_replace_layer(child,
114 origin_cls,
115 attr_replacement,
116 param_replacement,
117 method_replacement,
118 sub_module_replacement,
119 include=include)
120
121 def _replace_attr(
122 self,
123 module: nn.Module,
124 attr_replacement: Dict[str, Any],
125 ) -> None:
126 r"""
127 Replace the attribute of the layer
128
129 Args:
130 module (:class:`torch.nn.Module`): The object of layer to shard
131 attr_replacement (Dict): The attribute dict to modify
132 """
133 for k, v in attr_replacement.items():
134 setattr_(module, k, v, ignore=True)
135
136 def _replace_param(
137 self,
138 module: nn.Module,
139 param_replacement: List[Callable],
140 ) -> None:
141 r"""
142 Replace the parameter of the layer
143
144 Args:
145 module (:class:`torch.nn.Module`): The object of layer to shard
146 param_replacement (List[Callable]): The function list to get parameter shard information in policy
147 """
148 for param_func in param_replacement:
149 param_func(module)
150
151 def _replace_method(self, module: nn.Module, method_replacement: Dict[str, Callable]):
152 for method_name, new_method in method_replacement.items():
153 # bind the new method to the module
154 bound_method = MethodType(new_method, module)
155 setattr(module, method_name, bound_method)
156
157 def _replace_sub_module(
158 self,
159 org_layer: nn.Module,
160 sub_module_replacement: List[SubModuleReplacementDescription],
161 ) -> None:
162 r"""
163 Shard one layer according to the policy, the layer should be the same class as the key in policy's argument_policy return dict
164
165 Args:
166 org_layer (torch.nn.Module): The origin layer object to shard
167 sub_module_replacement (List[SubModuleReplacementDescription]): The sub module replacement description list
168
169 """
170 for description in sub_module_replacement:
171 suffix = description.suffix
172 target_module = description.target_module
173 kwargs = {} if description.kwargs is None else description.kwargs
174
175 assert target_module is not None, 'target_module should not be None'
176
177 # TODO: support different parallel mode
178 native_sub_module = getattr_(org_layer, suffix, ignore=True)
179
180 assert not isinstance(native_sub_module, target_module), \
181 f"The module with suffix {suffix} has been replaced, please check the policy"
182
183 # if it is None and we are allowed to ignore this module
184 # just skip
185 if description.ignore_if_not_exist and native_sub_module is None:
186 continue
187
188 try:
189 replace_layer = target_module.from_native_module(native_sub_module,
190 self.shard_config.tensor_parallel_process_group,
191 **kwargs)
192 except Exception as e:
193 raise RuntimeError(
194 f"Failed to replace {suffix} of type {native_sub_module.__class__.__qualname__}"
195 f" with {target_module.__qualname__} with the exception: {e}. "
196 "Please check your model configuration or sharding policy, you can set up an issue for us to help you as well."
197 )
198
199 setattr_(org_layer, suffix, replace_layer)
200
201 def _get_recursive_held_layers(self, held_layers: Optional[List[nn.Module]]) -> Optional[List[nn.Module]]:
202
203 def collect_sub_modules(module: nn.Module):
204 if module is None:
205 return
206 recursive_held_layers.append(module)
207 for name, child in module.named_children():
208 collect_sub_modules(child)
209
210 recursive_held_layers = []
211 for module in held_layers:
212 collect_sub_modules(module)
213 return recursive_held_layers
214
215 def _release_unheld_layers(self) -> Optional[Set[nn.Module]]:
216 r"""
217 Release the unheld layers in the model
218 """
219 if self.shard_config and self.shard_config.pipeline_stage_manager:
220 held_layers = self.policy.get_held_layers()
221 set_tensors_to_none(self.model, exclude=set(held_layers))
222 return set(self._get_recursive_held_layers(held_layers))
223 return None
224
225 def _materialize(self) -> None:
226 r"""
227 Materialize the model if lazy initialization is used
228 """
229 LazyInitContext.materialize(self.model)
230
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/colossalai/shardformer/shard/sharder.py b/colossalai/shardformer/shard/sharder.py
--- a/colossalai/shardformer/shard/sharder.py
+++ b/colossalai/shardformer/shard/sharder.py
@@ -92,22 +92,21 @@
param_replacement (List[Callable]): The function list to get parameter shard information in policy
method_replacement (Dict[str, Callable]): Key is the method name, value is the method for replacement
sub_module_replacement ((List[SubModuleReplacementDescription]): The function list to get sub module shard information in policy
+ include (Set[nn.Module], optional): The set of modules to keep on current device when pipeline parallel is enabled. Defaults to None
"""
- # released layers are not shardable
- can_replace_param_or_layer = include is None or module in include
if (isinstance(origin_cls, str) and origin_cls == module.__class__.__name__) or \
(module.__class__ == origin_cls):
if attr_replacement is not None:
self._replace_attr(module, attr_replacement)
- if param_replacement is not None and can_replace_param_or_layer:
+ if param_replacement is not None and (include is None or module in include):
self._replace_param(module, param_replacement)
if method_replacement is not None:
self._replace_method(module, method_replacement)
- if sub_module_replacement is not None and can_replace_param_or_layer:
- self._replace_sub_module(module, sub_module_replacement)
+ if sub_module_replacement is not None:
+ self._replace_sub_module(module, sub_module_replacement, include)
for name, child in module.named_children():
self._recursive_replace_layer(child,
@@ -154,18 +153,17 @@
bound_method = MethodType(new_method, module)
setattr(module, method_name, bound_method)
- def _replace_sub_module(
- self,
- org_layer: nn.Module,
- sub_module_replacement: List[SubModuleReplacementDescription],
- ) -> None:
+ def _replace_sub_module(self,
+ org_layer: nn.Module,
+ sub_module_replacement: List[SubModuleReplacementDescription],
+ include: Optional[Set[nn.Module]] = None) -> None:
r"""
Shard one layer according to the policy, the layer should be the same class as the key in policy's argument_policy return dict
Args:
org_layer (torch.nn.Module): The origin layer object to shard
sub_module_replacement (List[SubModuleReplacementDescription]): The sub module replacement description list
-
+ include (Set[nn.Module], optional): The set of modules to keep on current device when pipeline parallel is enabled. Defaults to None
"""
for description in sub_module_replacement:
suffix = description.suffix
@@ -174,9 +172,12 @@
assert target_module is not None, 'target_module should not be None'
- # TODO: support different parallel mode
native_sub_module = getattr_(org_layer, suffix, ignore=True)
+ # Skip replacement if submodule is not kept by current device when pipeline parallel is enabled.
+ if (include is not None) and (native_sub_module is not None) and (native_sub_module not in include):
+ continue
+
assert not isinstance(native_sub_module, target_module), \
f"The module with suffix {suffix} has been replaced, please check the policy"
| {"golden_diff": "diff --git a/colossalai/shardformer/shard/sharder.py b/colossalai/shardformer/shard/sharder.py\n--- a/colossalai/shardformer/shard/sharder.py\n+++ b/colossalai/shardformer/shard/sharder.py\n@@ -92,22 +92,21 @@\n param_replacement (List[Callable]): The function list to get parameter shard information in policy\n method_replacement (Dict[str, Callable]): Key is the method name, value is the method for replacement\n sub_module_replacement ((List[SubModuleReplacementDescription]): The function list to get sub module shard information in policy\n+ include (Set[nn.Module], optional): The set of modules to keep on current device when pipeline parallel is enabled. Defaults to None\n \"\"\"\n- # released layers are not shardable\n- can_replace_param_or_layer = include is None or module in include\n if (isinstance(origin_cls, str) and origin_cls == module.__class__.__name__) or \\\n (module.__class__ == origin_cls):\n if attr_replacement is not None:\n self._replace_attr(module, attr_replacement)\n \n- if param_replacement is not None and can_replace_param_or_layer:\n+ if param_replacement is not None and (include is None or module in include):\n self._replace_param(module, param_replacement)\n \n if method_replacement is not None:\n self._replace_method(module, method_replacement)\n \n- if sub_module_replacement is not None and can_replace_param_or_layer:\n- self._replace_sub_module(module, sub_module_replacement)\n+ if sub_module_replacement is not None:\n+ self._replace_sub_module(module, sub_module_replacement, include)\n \n for name, child in module.named_children():\n self._recursive_replace_layer(child,\n@@ -154,18 +153,17 @@\n bound_method = MethodType(new_method, module)\n setattr(module, method_name, bound_method)\n \n- def _replace_sub_module(\n- self,\n- org_layer: nn.Module,\n- sub_module_replacement: List[SubModuleReplacementDescription],\n- ) -> None:\n+ def _replace_sub_module(self,\n+ org_layer: nn.Module,\n+ sub_module_replacement: List[SubModuleReplacementDescription],\n+ include: Optional[Set[nn.Module]] = None) -> None:\n r\"\"\"\n Shard one layer according to the policy, the layer should be the same class as the key in policy's argument_policy return dict\n \n Args:\n org_layer (torch.nn.Module): The origin layer object to shard\n sub_module_replacement (List[SubModuleReplacementDescription]): The sub module replacement description list\n-\n+ include (Set[nn.Module], optional): The set of modules to keep on current device when pipeline parallel is enabled. Defaults to None\n \"\"\"\n for description in sub_module_replacement:\n suffix = description.suffix\n@@ -174,9 +172,12 @@\n \n assert target_module is not None, 'target_module should not be None'\n \n- # TODO: support different parallel mode\n native_sub_module = getattr_(org_layer, suffix, ignore=True)\n \n+ # Skip replacement if submodule is not kept by current device when pipeline parallel is enabled.\n+ if (include is not None) and (native_sub_module is not None) and (native_sub_module not in include):\n+ continue\n+\n assert not isinstance(native_sub_module, target_module), \\\n f\"The module with suffix {suffix} has been replaced, please check the policy\"\n", "issue": "[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n", "before_files": [{"content": "from types import MethodType\nfrom typing import Any, Callable, Dict, List, Optional, Set, Union\n\nimport torch.nn as nn\nfrom torch import Tensor\n\nfrom colossalai.lazy import LazyInitContext\n\nfrom .._utils import getattr_, setattr_\nfrom ..policies.auto_policy import get_autopolicy\nfrom ..policies.base_policy import Policy, SubModuleReplacementDescription\nfrom .shard_config import ShardConfig\nfrom .utils import set_tensors_to_none\n\n__all__ = ['ModelSharder', 'shard_model']\n\n\nclass ModelSharder(object):\n r\"\"\"\n Shard the original huggingface model according to the policy\n\n Args:\n policy (:class:`Policy`): The policy to shard the model\n model (:class:`torch.Module`): The model to shard\n shard_config: The setting of distributed model\n \"\"\"\n\n def __init__(self, model: nn.Module, policy: Policy, shard_config: ShardConfig = None) -> None:\n self.model = model\n self.policy = get_autopolicy(self.model) if policy is None else policy\n self.shard_config = shard_config\n\n def shard(self) -> List[Dict[int, Tensor]]:\n r\"\"\"\n Shard the model according to the policy\n \"\"\"\n self.policy.set_model(self.model)\n self.policy.set_shard_config(self.shard_config)\n self._preprocess()\n # get shared params before release unheld layers, this avoid misjudgement of shared params (None is None)\n shared_params = self.policy.get_shared_params()\n held_layers = self._release_unheld_layers()\n self._replace_module(include=held_layers)\n self._materialize()\n self._postprocess()\n return shared_params\n\n def _preprocess(self) -> None:\n self.model = self.policy.preprocess()\n\n def _postprocess(self) -> None:\n self.model = self.policy.postprocess()\n\n def _replace_module(self, include: Optional[Set[nn.Module]] = None) -> None:\n r\"\"\"\n Replace the module according to the policy, and replace the module one by one\n\n Args:\n model (:class:`torch.nn.Module`): The model to shard\n \"\"\"\n module_descriptions = self.policy.module_policy()\n for layer_cls, module_description in module_descriptions.items():\n attr_replacement = module_description.attribute_replacement\n param_replacement = module_description.param_replacement\n sub_module_replacement = module_description.sub_module_replacement\n method_replacement = module_description.method_replacement\n self._recursive_replace_layer(self.model,\n layer_cls,\n attr_replacement,\n param_replacement,\n method_replacement,\n sub_module_replacement,\n include=include)\n\n def _recursive_replace_layer(\n self,\n module: nn.Module,\n origin_cls: Union[str, nn.Module],\n attr_replacement: Dict[str, Any],\n param_replacement: List[Callable],\n method_replacement: Dict[str, Callable],\n sub_module_replacement: List[SubModuleReplacementDescription],\n include: Optional[Set[nn.Module]] = None,\n ) -> None:\n r\"\"\"\n Reverse the replace layer operation\n\n Args:\n module (torch.nn.Module): The object of layer to shard\n origin_cls (Union[str, torch.nn.Module]): The origin layer class or a string of layer class name\n attr_replacement (Dict[str, Any]): The attribute dict to modify\n param_replacement (List[Callable]): The function list to get parameter shard information in policy\n method_replacement (Dict[str, Callable]): Key is the method name, value is the method for replacement\n sub_module_replacement ((List[SubModuleReplacementDescription]): The function list to get sub module shard information in policy\n \"\"\"\n # released layers are not shardable\n can_replace_param_or_layer = include is None or module in include\n if (isinstance(origin_cls, str) and origin_cls == module.__class__.__name__) or \\\n (module.__class__ == origin_cls):\n if attr_replacement is not None:\n self._replace_attr(module, attr_replacement)\n\n if param_replacement is not None and can_replace_param_or_layer:\n self._replace_param(module, param_replacement)\n\n if method_replacement is not None:\n self._replace_method(module, method_replacement)\n\n if sub_module_replacement is not None and can_replace_param_or_layer:\n self._replace_sub_module(module, sub_module_replacement)\n\n for name, child in module.named_children():\n self._recursive_replace_layer(child,\n origin_cls,\n attr_replacement,\n param_replacement,\n method_replacement,\n sub_module_replacement,\n include=include)\n\n def _replace_attr(\n self,\n module: nn.Module,\n attr_replacement: Dict[str, Any],\n ) -> None:\n r\"\"\"\n Replace the attribute of the layer\n\n Args:\n module (:class:`torch.nn.Module`): The object of layer to shard\n attr_replacement (Dict): The attribute dict to modify\n \"\"\"\n for k, v in attr_replacement.items():\n setattr_(module, k, v, ignore=True)\n\n def _replace_param(\n self,\n module: nn.Module,\n param_replacement: List[Callable],\n ) -> None:\n r\"\"\"\n Replace the parameter of the layer\n\n Args:\n module (:class:`torch.nn.Module`): The object of layer to shard\n param_replacement (List[Callable]): The function list to get parameter shard information in policy\n \"\"\"\n for param_func in param_replacement:\n param_func(module)\n\n def _replace_method(self, module: nn.Module, method_replacement: Dict[str, Callable]):\n for method_name, new_method in method_replacement.items():\n # bind the new method to the module\n bound_method = MethodType(new_method, module)\n setattr(module, method_name, bound_method)\n\n def _replace_sub_module(\n self,\n org_layer: nn.Module,\n sub_module_replacement: List[SubModuleReplacementDescription],\n ) -> None:\n r\"\"\"\n Shard one layer according to the policy, the layer should be the same class as the key in policy's argument_policy return dict\n\n Args:\n org_layer (torch.nn.Module): The origin layer object to shard\n sub_module_replacement (List[SubModuleReplacementDescription]): The sub module replacement description list\n\n \"\"\"\n for description in sub_module_replacement:\n suffix = description.suffix\n target_module = description.target_module\n kwargs = {} if description.kwargs is None else description.kwargs\n\n assert target_module is not None, 'target_module should not be None'\n\n # TODO: support different parallel mode\n native_sub_module = getattr_(org_layer, suffix, ignore=True)\n\n assert not isinstance(native_sub_module, target_module), \\\n f\"The module with suffix {suffix} has been replaced, please check the policy\"\n\n # if it is None and we are allowed to ignore this module\n # just skip\n if description.ignore_if_not_exist and native_sub_module is None:\n continue\n\n try:\n replace_layer = target_module.from_native_module(native_sub_module,\n self.shard_config.tensor_parallel_process_group,\n **kwargs)\n except Exception as e:\n raise RuntimeError(\n f\"Failed to replace {suffix} of type {native_sub_module.__class__.__qualname__}\"\n f\" with {target_module.__qualname__} with the exception: {e}. \"\n \"Please check your model configuration or sharding policy, you can set up an issue for us to help you as well.\"\n )\n\n setattr_(org_layer, suffix, replace_layer)\n\n def _get_recursive_held_layers(self, held_layers: Optional[List[nn.Module]]) -> Optional[List[nn.Module]]:\n\n def collect_sub_modules(module: nn.Module):\n if module is None:\n return\n recursive_held_layers.append(module)\n for name, child in module.named_children():\n collect_sub_modules(child)\n\n recursive_held_layers = []\n for module in held_layers:\n collect_sub_modules(module)\n return recursive_held_layers\n\n def _release_unheld_layers(self) -> Optional[Set[nn.Module]]:\n r\"\"\"\n Release the unheld layers in the model\n \"\"\"\n if self.shard_config and self.shard_config.pipeline_stage_manager:\n held_layers = self.policy.get_held_layers()\n set_tensors_to_none(self.model, exclude=set(held_layers))\n return set(self._get_recursive_held_layers(held_layers))\n return None\n\n def _materialize(self) -> None:\n r\"\"\"\n Materialize the model if lazy initialization is used\n \"\"\"\n LazyInitContext.materialize(self.model)\n", "path": "colossalai/shardformer/shard/sharder.py"}], "after_files": [{"content": "from types import MethodType\nfrom typing import Any, Callable, Dict, List, Optional, Set, Union\n\nimport torch.nn as nn\nfrom torch import Tensor\n\nfrom colossalai.lazy import LazyInitContext\n\nfrom .._utils import getattr_, setattr_\nfrom ..policies.auto_policy import get_autopolicy\nfrom ..policies.base_policy import Policy, SubModuleReplacementDescription\nfrom .shard_config import ShardConfig\nfrom .utils import set_tensors_to_none\n\n__all__ = ['ModelSharder', 'shard_model']\n\n\nclass ModelSharder(object):\n r\"\"\"\n Shard the original huggingface model according to the policy\n\n Args:\n policy (:class:`Policy`): The policy to shard the model\n model (:class:`torch.Module`): The model to shard\n shard_config: The setting of distributed model\n \"\"\"\n\n def __init__(self, model: nn.Module, policy: Policy, shard_config: ShardConfig = None) -> None:\n self.model = model\n self.policy = get_autopolicy(self.model) if policy is None else policy\n self.shard_config = shard_config\n\n def shard(self) -> List[Dict[int, Tensor]]:\n r\"\"\"\n Shard the model according to the policy\n \"\"\"\n self.policy.set_model(self.model)\n self.policy.set_shard_config(self.shard_config)\n self._preprocess()\n # get shared params before release unheld layers, this avoid misjudgement of shared params (None is None)\n shared_params = self.policy.get_shared_params()\n held_layers = self._release_unheld_layers()\n self._replace_module(include=held_layers)\n self._materialize()\n self._postprocess()\n return shared_params\n\n def _preprocess(self) -> None:\n self.model = self.policy.preprocess()\n\n def _postprocess(self) -> None:\n self.model = self.policy.postprocess()\n\n def _replace_module(self, include: Optional[Set[nn.Module]] = None) -> None:\n r\"\"\"\n Replace the module according to the policy, and replace the module one by one\n\n Args:\n model (:class:`torch.nn.Module`): The model to shard\n \"\"\"\n module_descriptions = self.policy.module_policy()\n for layer_cls, module_description in module_descriptions.items():\n attr_replacement = module_description.attribute_replacement\n param_replacement = module_description.param_replacement\n sub_module_replacement = module_description.sub_module_replacement\n method_replacement = module_description.method_replacement\n self._recursive_replace_layer(self.model,\n layer_cls,\n attr_replacement,\n param_replacement,\n method_replacement,\n sub_module_replacement,\n include=include)\n\n def _recursive_replace_layer(\n self,\n module: nn.Module,\n origin_cls: Union[str, nn.Module],\n attr_replacement: Dict[str, Any],\n param_replacement: List[Callable],\n method_replacement: Dict[str, Callable],\n sub_module_replacement: List[SubModuleReplacementDescription],\n include: Optional[Set[nn.Module]] = None,\n ) -> None:\n r\"\"\"\n Reverse the replace layer operation\n\n Args:\n module (torch.nn.Module): The object of layer to shard\n origin_cls (Union[str, torch.nn.Module]): The origin layer class or a string of layer class name\n attr_replacement (Dict[str, Any]): The attribute dict to modify\n param_replacement (List[Callable]): The function list to get parameter shard information in policy\n method_replacement (Dict[str, Callable]): Key is the method name, value is the method for replacement\n sub_module_replacement ((List[SubModuleReplacementDescription]): The function list to get sub module shard information in policy\n include (Set[nn.Module], optional): The set of modules to keep on current device when pipeline parallel is enabled. Defaults to None\n \"\"\"\n if (isinstance(origin_cls, str) and origin_cls == module.__class__.__name__) or \\\n (module.__class__ == origin_cls):\n if attr_replacement is not None:\n self._replace_attr(module, attr_replacement)\n\n if param_replacement is not None and (include is None or module in include):\n self._replace_param(module, param_replacement)\n\n if method_replacement is not None:\n self._replace_method(module, method_replacement)\n\n if sub_module_replacement is not None:\n self._replace_sub_module(module, sub_module_replacement, include)\n\n for name, child in module.named_children():\n self._recursive_replace_layer(child,\n origin_cls,\n attr_replacement,\n param_replacement,\n method_replacement,\n sub_module_replacement,\n include=include)\n\n def _replace_attr(\n self,\n module: nn.Module,\n attr_replacement: Dict[str, Any],\n ) -> None:\n r\"\"\"\n Replace the attribute of the layer\n\n Args:\n module (:class:`torch.nn.Module`): The object of layer to shard\n attr_replacement (Dict): The attribute dict to modify\n \"\"\"\n for k, v in attr_replacement.items():\n setattr_(module, k, v, ignore=True)\n\n def _replace_param(\n self,\n module: nn.Module,\n param_replacement: List[Callable],\n ) -> None:\n r\"\"\"\n Replace the parameter of the layer\n\n Args:\n module (:class:`torch.nn.Module`): The object of layer to shard\n param_replacement (List[Callable]): The function list to get parameter shard information in policy\n \"\"\"\n for param_func in param_replacement:\n param_func(module)\n\n def _replace_method(self, module: nn.Module, method_replacement: Dict[str, Callable]):\n for method_name, new_method in method_replacement.items():\n # bind the new method to the module\n bound_method = MethodType(new_method, module)\n setattr(module, method_name, bound_method)\n\n def _replace_sub_module(self,\n org_layer: nn.Module,\n sub_module_replacement: List[SubModuleReplacementDescription],\n include: Optional[Set[nn.Module]] = None) -> None:\n r\"\"\"\n Shard one layer according to the policy, the layer should be the same class as the key in policy's argument_policy return dict\n\n Args:\n org_layer (torch.nn.Module): The origin layer object to shard\n sub_module_replacement (List[SubModuleReplacementDescription]): The sub module replacement description list\n include (Set[nn.Module], optional): The set of modules to keep on current device when pipeline parallel is enabled. Defaults to None\n \"\"\"\n for description in sub_module_replacement:\n suffix = description.suffix\n target_module = description.target_module\n kwargs = {} if description.kwargs is None else description.kwargs\n\n assert target_module is not None, 'target_module should not be None'\n\n native_sub_module = getattr_(org_layer, suffix, ignore=True)\n\n # Skip replacement if submodule is not kept by current device when pipeline parallel is enabled.\n if (include is not None) and (native_sub_module is not None) and (native_sub_module not in include):\n continue\n\n assert not isinstance(native_sub_module, target_module), \\\n f\"The module with suffix {suffix} has been replaced, please check the policy\"\n\n # if it is None and we are allowed to ignore this module\n # just skip\n if description.ignore_if_not_exist and native_sub_module is None:\n continue\n\n try:\n replace_layer = target_module.from_native_module(native_sub_module,\n self.shard_config.tensor_parallel_process_group,\n **kwargs)\n except Exception as e:\n raise RuntimeError(\n f\"Failed to replace {suffix} of type {native_sub_module.__class__.__qualname__}\"\n f\" with {target_module.__qualname__} with the exception: {e}. \"\n \"Please check your model configuration or sharding policy, you can set up an issue for us to help you as well.\"\n )\n\n setattr_(org_layer, suffix, replace_layer)\n\n def _get_recursive_held_layers(self, held_layers: Optional[List[nn.Module]]) -> Optional[List[nn.Module]]:\n\n def collect_sub_modules(module: nn.Module):\n if module is None:\n return\n recursive_held_layers.append(module)\n for name, child in module.named_children():\n collect_sub_modules(child)\n\n recursive_held_layers = []\n for module in held_layers:\n collect_sub_modules(module)\n return recursive_held_layers\n\n def _release_unheld_layers(self) -> Optional[Set[nn.Module]]:\n r\"\"\"\n Release the unheld layers in the model\n \"\"\"\n if self.shard_config and self.shard_config.pipeline_stage_manager:\n held_layers = self.policy.get_held_layers()\n set_tensors_to_none(self.model, exclude=set(held_layers))\n return set(self._get_recursive_held_layers(held_layers))\n return None\n\n def _materialize(self) -> None:\n r\"\"\"\n Materialize the model if lazy initialization is used\n \"\"\"\n LazyInitContext.materialize(self.model)\n", "path": "colossalai/shardformer/shard/sharder.py"}]} | 2,756 | 787 |
gh_patches_debug_26807 | rasdani/github-patches | git_diff | facebookresearch__hydra-911 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug] Not able to Access Parent Fields in instantiate Interpolation
# 🐛 Bug
## Description
Followup on #388 the parent fields seem to be getting lost again. If I have a field that is an interpolation of a field higher up in the hierarchy, I can print out the value fine but I can't use it within instiantiate.
## Checklist
- [x] I checked on the latest version of Hydra
- [x] I created a minimal repro
## To reproduce
** Minimal Code/Config snippet to reproduce **
**Minimal** code snippet which should print three times the same integer value.
The first print is the parent field
The second print accesses the child field, which is an interpolation of the parent field
The third print uses instantiate to create an object which takes the child field as a parameter and prints from that object
Before the third print happens the exception is thrown
```
import time
import hydra
import submitit
class GetsTheInteger:
def __init__(self, same_integer):
self.intval = intval
@hydra.main(config_name="test2.yaml")
def main(cfg) -> None:
print(cfg.data.integer)
print(cfg.data.test.same_integer)
g = hydra.utils.instantiate(cfg.data.test)
print(g.intval)
if __name__ == "__main__":
main()
```
** Stack trace/error message **
```
Traceback (most recent call last):
File "/private/home/mehrlich/.conda/envs/qgac/lib/python3.7/site-packages/hydra/_internal/utils.py", line 203, in run_and_report
return func()
File "/private/home/mehrlich/.conda/envs/qgac/lib/python3.7/site-packages/hydra/_internal/utils.py", line 355, in <lambda>
overrides=args.overrides,
File "/private/home/mehrlich/.conda/envs/qgac/lib/python3.7/site-packages/hydra/_internal/hydra.py", line 110, in run
job_subdir_key=None,
File "/private/home/mehrlich/.conda/envs/qgac/lib/python3.7/site-packages/hydra/core/utils.py", line 123, in run_job
ret.return_value = task_function(task_cfg)
File "test2.py", line 15, in main
g = hydra.utils.instantiate(cfg.data.test)
File "/private/home/mehrlich/.conda/envs/qgac/lib/python3.7/site-packages/hydra/utils.py", line 68, in call
raise HydraException(f"Error calling '{cls}' : {e}") from e
hydra.errors.HydraException: Error calling 'test2.GetsTheInteger' : str interpolation key 'data.integer' not found
full_key: same_integer
reference_type=Any
object_type=dict
```
## Expected Behavior
No crash, can instantiate objects whose parameters depend on interpolations of parent fields
## System information
- **Hydra Version** : git master
- **Python version** : 3.7
- **Virtual environment type and version** : Conda
- **Operating system** : Ubuntu 18.04 (fair cluster)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `hydra/utils.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
2 import copy
3 import logging.config
4 import os
5 from pathlib import Path
6 from typing import Any, Callable
7
8 from omegaconf import DictConfig, OmegaConf
9 from omegaconf._utils import is_structured_config
10
11 from hydra._internal.utils import (
12 _call_callable,
13 _get_cls_name,
14 _instantiate_class,
15 _locate,
16 )
17 from hydra.core.hydra_config import HydraConfig
18 from hydra.errors import HydraException, InstantiationException
19 from hydra.types import TargetConf
20
21 log = logging.getLogger(__name__)
22
23
24 def call(config: Any, *args: Any, **kwargs: Any) -> Any:
25 """
26 :param config: An object describing what to call and what params to use. needs to have a _target_ field.
27 :param args: optional positional parameters pass-through
28 :param kwargs: optional named parameters pass-through
29 :return: the return value from the specified class or method
30 """
31
32 if OmegaConf.is_none(config):
33 return None
34
35 if isinstance(config, TargetConf) and config._target_ == "???":
36 # Specific check to give a good warning about failure to annotate _target_ as a string.
37 raise InstantiationException(
38 f"Missing value for {type(config).__name__}._target_. Check that it's properly annotated and overridden."
39 f"\nA common problem is forgetting to annotate _target_ as a string : '_target_: str = ...'"
40 )
41
42 if (
43 isinstance(config, dict)
44 or OmegaConf.is_config(config)
45 or is_structured_config(config)
46 ):
47 config = OmegaConf.structured(config)
48 else:
49 raise HydraException(f"Unsupported config type : {type(config).__name__}")
50
51 cls = "<unknown>"
52 try:
53 assert isinstance(config, DictConfig)
54 # make a copy to ensure we do not change the provided object
55 config = copy.deepcopy(config)
56 OmegaConf.set_readonly(config, False)
57 OmegaConf.set_struct(config, False)
58 cls = _get_cls_name(config)
59 type_or_callable = _locate(cls)
60 if isinstance(type_or_callable, type):
61 return _instantiate_class(type_or_callable, config, *args, **kwargs)
62 else:
63 assert callable(type_or_callable)
64 return _call_callable(type_or_callable, config, *args, **kwargs)
65 except InstantiationException as e:
66 raise e
67 except Exception as e:
68 raise HydraException(f"Error calling '{cls}' : {e}") from e
69
70
71 # Alias for call
72 instantiate = call
73
74
75 def get_class(path: str) -> type:
76 try:
77 cls = _locate(path)
78 if not isinstance(cls, type):
79 raise ValueError(f"Located non-class in {path} : {type(cls).__name__}")
80 return cls
81 except Exception as e:
82 log.error(f"Error initializing class at {path} : {e}")
83 raise e
84
85
86 def get_method(path: str) -> Callable[..., Any]:
87 try:
88 cl = _locate(path)
89 if not callable(cl):
90 raise ValueError(f"Non callable object located : {type(cl).__name__}")
91 return cl
92 except Exception as e:
93 log.error(f"Error getting callable at {path} : {e}")
94 raise e
95
96
97 # Alias for get_method
98 get_static_method = get_method
99
100
101 def get_original_cwd() -> str:
102 """
103 :return: the original working directory the Hydra application was launched from
104 """
105 if not HydraConfig.initialized():
106 raise ValueError(
107 "get_original_cwd() must only be used after HydraConfig is initialized"
108 )
109 ret = HydraConfig.get().runtime.cwd
110 assert ret is not None and isinstance(ret, str)
111 return ret
112
113
114 def to_absolute_path(path: str) -> str:
115 """
116 converts the specified path to be absolute path.
117 if the input path is relative, it's interpreted as relative to the original working directory
118 if it's absolute, it's returned as is
119 :param path: path to convert
120 :return:
121 """
122 p = Path(path)
123 if not HydraConfig.initialized():
124 base = Path(os.getcwd())
125 else:
126 base = Path(get_original_cwd())
127 if p.is_absolute():
128 ret = p
129 else:
130 ret = base / p
131 return str(ret)
132
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/hydra/utils.py b/hydra/utils.py
--- a/hydra/utils.py
+++ b/hydra/utils.py
@@ -1,5 +1,4 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
-import copy
import logging.config
import os
from pathlib import Path
@@ -39,20 +38,22 @@
f"\nA common problem is forgetting to annotate _target_ as a string : '_target_: str = ...'"
)
- if (
+ if not (
isinstance(config, dict)
or OmegaConf.is_config(config)
or is_structured_config(config)
):
- config = OmegaConf.structured(config)
- else:
raise HydraException(f"Unsupported config type : {type(config).__name__}")
+ # make a copy to ensure we do not change the provided object
+ config_copy = OmegaConf.structured(config)
+ if OmegaConf.is_config(config):
+ config_copy._set_parent(config._get_parent())
+ config = config_copy
+
cls = "<unknown>"
try:
assert isinstance(config, DictConfig)
- # make a copy to ensure we do not change the provided object
- config = copy.deepcopy(config)
OmegaConf.set_readonly(config, False)
OmegaConf.set_struct(config, False)
cls = _get_cls_name(config)
| {"golden_diff": "diff --git a/hydra/utils.py b/hydra/utils.py\n--- a/hydra/utils.py\n+++ b/hydra/utils.py\n@@ -1,5 +1,4 @@\n # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\n-import copy\n import logging.config\n import os\n from pathlib import Path\n@@ -39,20 +38,22 @@\n f\"\\nA common problem is forgetting to annotate _target_ as a string : '_target_: str = ...'\"\n )\n \n- if (\n+ if not (\n isinstance(config, dict)\n or OmegaConf.is_config(config)\n or is_structured_config(config)\n ):\n- config = OmegaConf.structured(config)\n- else:\n raise HydraException(f\"Unsupported config type : {type(config).__name__}\")\n \n+ # make a copy to ensure we do not change the provided object\n+ config_copy = OmegaConf.structured(config)\n+ if OmegaConf.is_config(config):\n+ config_copy._set_parent(config._get_parent())\n+ config = config_copy\n+\n cls = \"<unknown>\"\n try:\n assert isinstance(config, DictConfig)\n- # make a copy to ensure we do not change the provided object\n- config = copy.deepcopy(config)\n OmegaConf.set_readonly(config, False)\n OmegaConf.set_struct(config, False)\n cls = _get_cls_name(config)\n", "issue": "[Bug] Not able to Access Parent Fields in instantiate Interpolation\n# \ud83d\udc1b Bug\r\n## Description\r\n\r\nFollowup on #388 the parent fields seem to be getting lost again. If I have a field that is an interpolation of a field higher up in the hierarchy, I can print out the value fine but I can't use it within instiantiate.\r\n\r\n## Checklist\r\n- [x] I checked on the latest version of Hydra\r\n- [x] I created a minimal repro\r\n\r\n## To reproduce\r\n\r\n** Minimal Code/Config snippet to reproduce **\r\n\r\n**Minimal** code snippet which should print three times the same integer value. \r\n\r\nThe first print is the parent field\r\nThe second print accesses the child field, which is an interpolation of the parent field\r\nThe third print uses instantiate to create an object which takes the child field as a parameter and prints from that object\r\n\r\nBefore the third print happens the exception is thrown\r\n\r\n```\r\nimport time\r\nimport hydra\r\nimport submitit\r\n\r\nclass GetsTheInteger:\r\n def __init__(self, same_integer):\r\n self.intval = intval\r\n \r\n\r\[email protected](config_name=\"test2.yaml\")\r\ndef main(cfg) -> None:\r\n print(cfg.data.integer)\r\n print(cfg.data.test.same_integer)\r\n\r\n g = hydra.utils.instantiate(cfg.data.test)\r\n print(g.intval) \r\n\r\n\r\nif __name__ == \"__main__\":\r\n main()\r\n```\r\n\r\n** Stack trace/error message **\r\n```\r\nTraceback (most recent call last):\r\n File \"/private/home/mehrlich/.conda/envs/qgac/lib/python3.7/site-packages/hydra/_internal/utils.py\", line 203, in run_and_report\r\n return func()\r\n File \"/private/home/mehrlich/.conda/envs/qgac/lib/python3.7/site-packages/hydra/_internal/utils.py\", line 355, in <lambda>\r\n overrides=args.overrides,\r\n File \"/private/home/mehrlich/.conda/envs/qgac/lib/python3.7/site-packages/hydra/_internal/hydra.py\", line 110, in run\r\n job_subdir_key=None,\r\n File \"/private/home/mehrlich/.conda/envs/qgac/lib/python3.7/site-packages/hydra/core/utils.py\", line 123, in run_job\r\n ret.return_value = task_function(task_cfg)\r\n File \"test2.py\", line 15, in main\r\n g = hydra.utils.instantiate(cfg.data.test)\r\n File \"/private/home/mehrlich/.conda/envs/qgac/lib/python3.7/site-packages/hydra/utils.py\", line 68, in call\r\n raise HydraException(f\"Error calling '{cls}' : {e}\") from e\r\nhydra.errors.HydraException: Error calling 'test2.GetsTheInteger' : str interpolation key 'data.integer' not found\r\n full_key: same_integer\r\n reference_type=Any\r\n object_type=dict\r\n```\r\n\r\n## Expected Behavior\r\n\r\nNo crash, can instantiate objects whose parameters depend on interpolations of parent fields\r\n\r\n## System information\r\n- **Hydra Version** : git master\r\n- **Python version** : 3.7\r\n- **Virtual environment type and version** : Conda\r\n- **Operating system** : Ubuntu 18.04 (fair cluster)\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\nimport copy\nimport logging.config\nimport os\nfrom pathlib import Path\nfrom typing import Any, Callable\n\nfrom omegaconf import DictConfig, OmegaConf\nfrom omegaconf._utils import is_structured_config\n\nfrom hydra._internal.utils import (\n _call_callable,\n _get_cls_name,\n _instantiate_class,\n _locate,\n)\nfrom hydra.core.hydra_config import HydraConfig\nfrom hydra.errors import HydraException, InstantiationException\nfrom hydra.types import TargetConf\n\nlog = logging.getLogger(__name__)\n\n\ndef call(config: Any, *args: Any, **kwargs: Any) -> Any:\n \"\"\"\n :param config: An object describing what to call and what params to use. needs to have a _target_ field.\n :param args: optional positional parameters pass-through\n :param kwargs: optional named parameters pass-through\n :return: the return value from the specified class or method\n \"\"\"\n\n if OmegaConf.is_none(config):\n return None\n\n if isinstance(config, TargetConf) and config._target_ == \"???\":\n # Specific check to give a good warning about failure to annotate _target_ as a string.\n raise InstantiationException(\n f\"Missing value for {type(config).__name__}._target_. Check that it's properly annotated and overridden.\"\n f\"\\nA common problem is forgetting to annotate _target_ as a string : '_target_: str = ...'\"\n )\n\n if (\n isinstance(config, dict)\n or OmegaConf.is_config(config)\n or is_structured_config(config)\n ):\n config = OmegaConf.structured(config)\n else:\n raise HydraException(f\"Unsupported config type : {type(config).__name__}\")\n\n cls = \"<unknown>\"\n try:\n assert isinstance(config, DictConfig)\n # make a copy to ensure we do not change the provided object\n config = copy.deepcopy(config)\n OmegaConf.set_readonly(config, False)\n OmegaConf.set_struct(config, False)\n cls = _get_cls_name(config)\n type_or_callable = _locate(cls)\n if isinstance(type_or_callable, type):\n return _instantiate_class(type_or_callable, config, *args, **kwargs)\n else:\n assert callable(type_or_callable)\n return _call_callable(type_or_callable, config, *args, **kwargs)\n except InstantiationException as e:\n raise e\n except Exception as e:\n raise HydraException(f\"Error calling '{cls}' : {e}\") from e\n\n\n# Alias for call\ninstantiate = call\n\n\ndef get_class(path: str) -> type:\n try:\n cls = _locate(path)\n if not isinstance(cls, type):\n raise ValueError(f\"Located non-class in {path} : {type(cls).__name__}\")\n return cls\n except Exception as e:\n log.error(f\"Error initializing class at {path} : {e}\")\n raise e\n\n\ndef get_method(path: str) -> Callable[..., Any]:\n try:\n cl = _locate(path)\n if not callable(cl):\n raise ValueError(f\"Non callable object located : {type(cl).__name__}\")\n return cl\n except Exception as e:\n log.error(f\"Error getting callable at {path} : {e}\")\n raise e\n\n\n# Alias for get_method\nget_static_method = get_method\n\n\ndef get_original_cwd() -> str:\n \"\"\"\n :return: the original working directory the Hydra application was launched from\n \"\"\"\n if not HydraConfig.initialized():\n raise ValueError(\n \"get_original_cwd() must only be used after HydraConfig is initialized\"\n )\n ret = HydraConfig.get().runtime.cwd\n assert ret is not None and isinstance(ret, str)\n return ret\n\n\ndef to_absolute_path(path: str) -> str:\n \"\"\"\n converts the specified path to be absolute path.\n if the input path is relative, it's interpreted as relative to the original working directory\n if it's absolute, it's returned as is\n :param path: path to convert\n :return:\n \"\"\"\n p = Path(path)\n if not HydraConfig.initialized():\n base = Path(os.getcwd())\n else:\n base = Path(get_original_cwd())\n if p.is_absolute():\n ret = p\n else:\n ret = base / p\n return str(ret)\n", "path": "hydra/utils.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\nimport logging.config\nimport os\nfrom pathlib import Path\nfrom typing import Any, Callable\n\nfrom omegaconf import DictConfig, OmegaConf\nfrom omegaconf._utils import is_structured_config\n\nfrom hydra._internal.utils import (\n _call_callable,\n _get_cls_name,\n _instantiate_class,\n _locate,\n)\nfrom hydra.core.hydra_config import HydraConfig\nfrom hydra.errors import HydraException, InstantiationException\nfrom hydra.types import TargetConf\n\nlog = logging.getLogger(__name__)\n\n\ndef call(config: Any, *args: Any, **kwargs: Any) -> Any:\n \"\"\"\n :param config: An object describing what to call and what params to use. needs to have a _target_ field.\n :param args: optional positional parameters pass-through\n :param kwargs: optional named parameters pass-through\n :return: the return value from the specified class or method\n \"\"\"\n\n if OmegaConf.is_none(config):\n return None\n\n if isinstance(config, TargetConf) and config._target_ == \"???\":\n # Specific check to give a good warning about failure to annotate _target_ as a string.\n raise InstantiationException(\n f\"Missing value for {type(config).__name__}._target_. Check that it's properly annotated and overridden.\"\n f\"\\nA common problem is forgetting to annotate _target_ as a string : '_target_: str = ...'\"\n )\n\n if not (\n isinstance(config, dict)\n or OmegaConf.is_config(config)\n or is_structured_config(config)\n ):\n raise HydraException(f\"Unsupported config type : {type(config).__name__}\")\n\n # make a copy to ensure we do not change the provided object\n config_copy = OmegaConf.structured(config)\n if OmegaConf.is_config(config):\n config_copy._set_parent(config._get_parent())\n config = config_copy\n\n cls = \"<unknown>\"\n try:\n assert isinstance(config, DictConfig)\n OmegaConf.set_readonly(config, False)\n OmegaConf.set_struct(config, False)\n cls = _get_cls_name(config)\n type_or_callable = _locate(cls)\n if isinstance(type_or_callable, type):\n return _instantiate_class(type_or_callable, config, *args, **kwargs)\n else:\n assert callable(type_or_callable)\n return _call_callable(type_or_callable, config, *args, **kwargs)\n except InstantiationException as e:\n raise e\n except Exception as e:\n raise HydraException(f\"Error calling '{cls}' : {e}\") from e\n\n\n# Alias for call\ninstantiate = call\n\n\ndef get_class(path: str) -> type:\n try:\n cls = _locate(path)\n if not isinstance(cls, type):\n raise ValueError(f\"Located non-class in {path} : {type(cls).__name__}\")\n return cls\n except Exception as e:\n log.error(f\"Error initializing class at {path} : {e}\")\n raise e\n\n\ndef get_method(path: str) -> Callable[..., Any]:\n try:\n cl = _locate(path)\n if not callable(cl):\n raise ValueError(f\"Non callable object located : {type(cl).__name__}\")\n return cl\n except Exception as e:\n log.error(f\"Error getting callable at {path} : {e}\")\n raise e\n\n\n# Alias for get_method\nget_static_method = get_method\n\n\ndef get_original_cwd() -> str:\n \"\"\"\n :return: the original working directory the Hydra application was launched from\n \"\"\"\n if not HydraConfig.initialized():\n raise ValueError(\n \"get_original_cwd() must only be used after HydraConfig is initialized\"\n )\n ret = HydraConfig.get().runtime.cwd\n assert ret is not None and isinstance(ret, str)\n return ret\n\n\ndef to_absolute_path(path: str) -> str:\n \"\"\"\n converts the specified path to be absolute path.\n if the input path is relative, it's interpreted as relative to the original working directory\n if it's absolute, it's returned as is\n :param path: path to convert\n :return:\n \"\"\"\n p = Path(path)\n if not HydraConfig.initialized():\n base = Path(os.getcwd())\n else:\n base = Path(get_original_cwd())\n if p.is_absolute():\n ret = p\n else:\n ret = base / p\n return str(ret)\n", "path": "hydra/utils.py"}]} | 2,197 | 306 |
gh_patches_debug_28060 | rasdani/github-patches | git_diff | dynaconf__dynaconf-131 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
YAML.load without a loader is deprecated for security purposes
We've started seeing the following warning:
```
lib/python3.6/site-packages/dynaconf/loaders/base.py:95: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
```
See here: https://github.com/yaml/pyyaml/wiki/PyYAML-yaml.load(input)-Deprecation
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `dynaconf/loaders/yaml_loader.py`
Content:
```
1 # coding: utf-8
2 import io
3 from pathlib import Path
4 from dynaconf import default_settings
5 from dynaconf.loaders.base import BaseLoader
6 from dynaconf.constants import YAML_EXTENSIONS
7 from dynaconf.utils import object_merge
8 try:
9 import yaml
10 except ImportError as e: # pragma: no cover
11 yaml = None
12
13
14 def load(obj, env=None, silent=True, key=None, filename=None):
15 """
16 Reads and loads in to "obj" a single key or all keys from source file.
17
18 :param obj: the settings instance
19 :param env: settings current env default='development'
20 :param silent: if errors should raise
21 :param key: if defined load a single key, else load all in env
22 :param filename: Optional custom filename to load
23 :return: None
24 """
25 if yaml is None: # pragma: no cover
26 BaseLoader.warn_not_installed(obj, 'yaml')
27 return
28
29 loader = BaseLoader(
30 obj=obj,
31 env=env,
32 identifier='yaml',
33 extensions=YAML_EXTENSIONS,
34 file_reader=yaml.load,
35 string_reader=yaml.load
36 )
37 loader.load(filename=filename, key=key, silent=silent)
38
39
40 def write(settings_path, settings_data, merge=True):
41 """Write data to a settings file.
42
43 :param settings_path: the filepath
44 :param settings_data: a dictionary with data
45 :param merge: boolean if existing file should be merged with new data
46 """
47 settings_path = Path(settings_path)
48 if settings_path.exists() and merge: # pragma: no cover
49 object_merge(
50 yaml.load(
51 io.open(
52 str(settings_path),
53 encoding=default_settings.ENCODING_FOR_DYNACONF
54 )
55 ),
56 settings_data
57 )
58
59 yaml.dump(
60 settings_data,
61 io.open(
62 str(settings_path), 'w',
63 encoding=default_settings.ENCODING_FOR_DYNACONF
64 )
65 )
66
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/dynaconf/loaders/yaml_loader.py b/dynaconf/loaders/yaml_loader.py
--- a/dynaconf/loaders/yaml_loader.py
+++ b/dynaconf/loaders/yaml_loader.py
@@ -1,10 +1,13 @@
# coding: utf-8
import io
+import os
from pathlib import Path
+from warnings import warn
from dynaconf import default_settings
from dynaconf.loaders.base import BaseLoader
from dynaconf.constants import YAML_EXTENSIONS
from dynaconf.utils import object_merge
+
try:
import yaml
except ImportError as e: # pragma: no cover
@@ -26,13 +29,25 @@
BaseLoader.warn_not_installed(obj, 'yaml')
return
+ # Resolve the loaders
+ # https://github.com/yaml/pyyaml/wiki/PyYAML-yaml.load(input)-Deprecation
+ # Possible values are `safe_load, full_load, unsafe_load, load`
+ yaml_loader_name = os.environ.get('YAML_LOADER_FOR_DYNACONF', 'full_load')
+ yaml_reader = getattr(yaml, yaml_loader_name, yaml.load)
+ if yaml_reader.__name__ == 'unsafe_load': # pragma: no cover
+ warn(
+ "yaml.unsafe_load is deprecated."
+ " Please read https://msg.pyyaml.org/load for full details."
+ " Try to use full_load or safe_load."
+ )
+
loader = BaseLoader(
obj=obj,
env=env,
identifier='yaml',
extensions=YAML_EXTENSIONS,
- file_reader=yaml.load,
- string_reader=yaml.load
+ file_reader=yaml_reader,
+ string_reader=yaml_reader
)
loader.load(filename=filename, key=key, silent=silent)
| {"golden_diff": "diff --git a/dynaconf/loaders/yaml_loader.py b/dynaconf/loaders/yaml_loader.py\n--- a/dynaconf/loaders/yaml_loader.py\n+++ b/dynaconf/loaders/yaml_loader.py\n@@ -1,10 +1,13 @@\n # coding: utf-8\n import io\n+import os\n from pathlib import Path\n+from warnings import warn\n from dynaconf import default_settings\n from dynaconf.loaders.base import BaseLoader\n from dynaconf.constants import YAML_EXTENSIONS\n from dynaconf.utils import object_merge\n+\n try:\n import yaml\n except ImportError as e: # pragma: no cover\n@@ -26,13 +29,25 @@\n BaseLoader.warn_not_installed(obj, 'yaml')\n return\n \n+ # Resolve the loaders\n+ # https://github.com/yaml/pyyaml/wiki/PyYAML-yaml.load(input)-Deprecation\n+ # Possible values are `safe_load, full_load, unsafe_load, load`\n+ yaml_loader_name = os.environ.get('YAML_LOADER_FOR_DYNACONF', 'full_load')\n+ yaml_reader = getattr(yaml, yaml_loader_name, yaml.load)\n+ if yaml_reader.__name__ == 'unsafe_load': # pragma: no cover\n+ warn(\n+ \"yaml.unsafe_load is deprecated.\"\n+ \" Please read https://msg.pyyaml.org/load for full details.\"\n+ \" Try to use full_load or safe_load.\"\n+ )\n+\n loader = BaseLoader(\n obj=obj,\n env=env,\n identifier='yaml',\n extensions=YAML_EXTENSIONS,\n- file_reader=yaml.load,\n- string_reader=yaml.load\n+ file_reader=yaml_reader,\n+ string_reader=yaml_reader\n )\n loader.load(filename=filename, key=key, silent=silent)\n", "issue": "YAML.load without a loader is deprecated for security purposes\nWe've started seeing the following warning:\r\n```\r\nlib/python3.6/site-packages/dynaconf/loaders/base.py:95: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.\r\n```\r\n\r\nSee here: https://github.com/yaml/pyyaml/wiki/PyYAML-yaml.load(input)-Deprecation\n", "before_files": [{"content": "# coding: utf-8\nimport io\nfrom pathlib import Path\nfrom dynaconf import default_settings\nfrom dynaconf.loaders.base import BaseLoader\nfrom dynaconf.constants import YAML_EXTENSIONS\nfrom dynaconf.utils import object_merge\ntry:\n import yaml\nexcept ImportError as e: # pragma: no cover\n yaml = None\n\n\ndef load(obj, env=None, silent=True, key=None, filename=None):\n \"\"\"\n Reads and loads in to \"obj\" a single key or all keys from source file.\n\n :param obj: the settings instance\n :param env: settings current env default='development'\n :param silent: if errors should raise\n :param key: if defined load a single key, else load all in env\n :param filename: Optional custom filename to load\n :return: None\n \"\"\"\n if yaml is None: # pragma: no cover\n BaseLoader.warn_not_installed(obj, 'yaml')\n return\n\n loader = BaseLoader(\n obj=obj,\n env=env,\n identifier='yaml',\n extensions=YAML_EXTENSIONS,\n file_reader=yaml.load,\n string_reader=yaml.load\n )\n loader.load(filename=filename, key=key, silent=silent)\n\n\ndef write(settings_path, settings_data, merge=True):\n \"\"\"Write data to a settings file.\n\n :param settings_path: the filepath\n :param settings_data: a dictionary with data\n :param merge: boolean if existing file should be merged with new data\n \"\"\"\n settings_path = Path(settings_path)\n if settings_path.exists() and merge: # pragma: no cover\n object_merge(\n yaml.load(\n io.open(\n str(settings_path),\n encoding=default_settings.ENCODING_FOR_DYNACONF\n )\n ),\n settings_data\n )\n\n yaml.dump(\n settings_data,\n io.open(\n str(settings_path), 'w',\n encoding=default_settings.ENCODING_FOR_DYNACONF\n )\n )\n", "path": "dynaconf/loaders/yaml_loader.py"}], "after_files": [{"content": "# coding: utf-8\nimport io\nimport os\nfrom pathlib import Path\nfrom warnings import warn\nfrom dynaconf import default_settings\nfrom dynaconf.loaders.base import BaseLoader\nfrom dynaconf.constants import YAML_EXTENSIONS\nfrom dynaconf.utils import object_merge\n\ntry:\n import yaml\nexcept ImportError as e: # pragma: no cover\n yaml = None\n\n\ndef load(obj, env=None, silent=True, key=None, filename=None):\n \"\"\"\n Reads and loads in to \"obj\" a single key or all keys from source file.\n\n :param obj: the settings instance\n :param env: settings current env default='development'\n :param silent: if errors should raise\n :param key: if defined load a single key, else load all in env\n :param filename: Optional custom filename to load\n :return: None\n \"\"\"\n if yaml is None: # pragma: no cover\n BaseLoader.warn_not_installed(obj, 'yaml')\n return\n\n # Resolve the loaders\n # https://github.com/yaml/pyyaml/wiki/PyYAML-yaml.load(input)-Deprecation\n # Possible values are `safe_load, full_load, unsafe_load, load`\n yaml_loader_name = os.environ.get('YAML_LOADER_FOR_DYNACONF', 'full_load')\n yaml_reader = getattr(yaml, yaml_loader_name, yaml.load)\n if yaml_reader.__name__ == 'unsafe_load': # pragma: no cover\n warn(\n \"yaml.unsafe_load is deprecated.\"\n \" Please read https://msg.pyyaml.org/load for full details.\"\n \" Try to use full_load or safe_load.\"\n )\n\n loader = BaseLoader(\n obj=obj,\n env=env,\n identifier='yaml',\n extensions=YAML_EXTENSIONS,\n file_reader=yaml_reader,\n string_reader=yaml_reader\n )\n loader.load(filename=filename, key=key, silent=silent)\n\n\ndef write(settings_path, settings_data, merge=True):\n \"\"\"Write data to a settings file.\n\n :param settings_path: the filepath\n :param settings_data: a dictionary with data\n :param merge: boolean if existing file should be merged with new data\n \"\"\"\n settings_path = Path(settings_path)\n if settings_path.exists() and merge: # pragma: no cover\n object_merge(\n yaml.load(\n io.open(\n str(settings_path),\n encoding=default_settings.ENCODING_FOR_DYNACONF\n )\n ),\n settings_data\n )\n\n yaml.dump(\n settings_data,\n io.open(\n str(settings_path), 'w',\n encoding=default_settings.ENCODING_FOR_DYNACONF\n )\n )\n", "path": "dynaconf/loaders/yaml_loader.py"}]} | 925 | 401 |
gh_patches_debug_41208 | rasdani/github-patches | git_diff | akvo__akvo-rsr-3751 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Audit log disaggregation categories and labels
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `akvo/rest/views/indicator_dimension_name.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Akvo RSR is covered by the GNU Affero General Public License.
4 # See more details in the license.txt file located at the root folder of the Akvo RSR module.
5 # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
6
7
8 from akvo.rsr.models import IndicatorDimensionName
9
10 from ..serializers import IndicatorDimensionNameSerializer
11 from ..viewsets import PublicProjectViewSet
12
13
14 class IndicatorDimensionNameViewSet(PublicProjectViewSet):
15 """
16 """
17 queryset = IndicatorDimensionName.objects.prefetch_related('dimension_values')
18 serializer_class = IndicatorDimensionNameSerializer
19 project_relation = 'project__'
20
```
Path: `akvo/rest/views/indicator_dimension_value.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Akvo RSR is covered by the GNU Affero General Public License.
4 # See more details in the license.txt file located at the root folder of the Akvo RSR module.
5 # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
6
7
8 from akvo.rsr.models import IndicatorDimensionValue
9
10 from ..serializers import IndicatorDimensionValueSerializer
11 from ..viewsets import PublicProjectViewSet
12
13
14 class IndicatorDimensionValueViewSet(PublicProjectViewSet):
15 """
16 """
17 queryset = IndicatorDimensionValue.objects.all()
18 serializer_class = IndicatorDimensionValueSerializer
19 project_relation = 'name__project__'
20
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/akvo/rest/views/indicator_dimension_name.py b/akvo/rest/views/indicator_dimension_name.py
--- a/akvo/rest/views/indicator_dimension_name.py
+++ b/akvo/rest/views/indicator_dimension_name.py
@@ -5,6 +5,8 @@
# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
+from django.contrib.admin.models import LogEntry, ADDITION, CHANGE, DELETION
+from django.contrib.contenttypes.models import ContentType
from akvo.rsr.models import IndicatorDimensionName
from ..serializers import IndicatorDimensionNameSerializer
@@ -17,3 +19,31 @@
queryset = IndicatorDimensionName.objects.prefetch_related('dimension_values')
serializer_class = IndicatorDimensionNameSerializer
project_relation = 'project__'
+
+ def create(self, request, *args, **kwargs):
+ response = super(IndicatorDimensionNameViewSet, self).create(request, *args, **kwargs)
+ self._log_action(ADDITION, response.data, str(request.data))
+ return response
+
+ def update(self, request, *args, **kwargs):
+ response = super(IndicatorDimensionNameViewSet, self).update(request, *args, **kwargs)
+ self._log_action(CHANGE, response.data, str(request.data))
+ return response
+
+ def destroy(self, request, *args, **kwargs):
+ instance = self.get_object()
+ data = {'id': instance.id, 'name': instance.name}
+ response = super(IndicatorDimensionNameViewSet, self).destroy(request, *args, **kwargs)
+ self._log_action(DELETION, data)
+ return response
+
+ def _log_action(self, action_flag, instance, message=''):
+ user = self.request.user
+ LogEntry.objects.log_action(
+ user_id=user.pk,
+ content_type_id=ContentType.objects.get_for_model(IndicatorDimensionName).pk,
+ object_id=instance['id'],
+ object_repr=str(instance),
+ action_flag=action_flag,
+ change_message=message
+ )
diff --git a/akvo/rest/views/indicator_dimension_value.py b/akvo/rest/views/indicator_dimension_value.py
--- a/akvo/rest/views/indicator_dimension_value.py
+++ b/akvo/rest/views/indicator_dimension_value.py
@@ -5,6 +5,8 @@
# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
+from django.contrib.admin.models import LogEntry, ADDITION, CHANGE, DELETION
+from django.contrib.contenttypes.models import ContentType
from akvo.rsr.models import IndicatorDimensionValue
from ..serializers import IndicatorDimensionValueSerializer
@@ -17,3 +19,31 @@
queryset = IndicatorDimensionValue.objects.all()
serializer_class = IndicatorDimensionValueSerializer
project_relation = 'name__project__'
+
+ def create(self, request, *args, **kwargs):
+ response = super(IndicatorDimensionValueViewSet, self).create(request, *args, **kwargs)
+ self._log_action(ADDITION, response.data, str(request.data))
+ return response
+
+ def update(self, request, *args, **kwargs):
+ response = super(IndicatorDimensionValueViewSet, self).update(request, *args, **kwargs)
+ self._log_action(CHANGE, response.data, str(request.data))
+ return response
+
+ def destroy(self, request, *args, **kwargs):
+ instance = self.get_object()
+ data = {'id': instance.id, 'value': instance.value}
+ response = super(IndicatorDimensionValueViewSet, self).destroy(request, *args, **kwargs)
+ self._log_action(DELETION, data)
+ return response
+
+ def _log_action(self, action_flag, instance, message=''):
+ user = self.request.user
+ LogEntry.objects.log_action(
+ user_id=user.pk,
+ content_type_id=ContentType.objects.get_for_model(IndicatorDimensionValue).pk,
+ object_id=instance['id'],
+ object_repr=str(instance),
+ action_flag=action_flag,
+ change_message=message
+ )
| {"golden_diff": "diff --git a/akvo/rest/views/indicator_dimension_name.py b/akvo/rest/views/indicator_dimension_name.py\n--- a/akvo/rest/views/indicator_dimension_name.py\n+++ b/akvo/rest/views/indicator_dimension_name.py\n@@ -5,6 +5,8 @@\n # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n \n \n+from django.contrib.admin.models import LogEntry, ADDITION, CHANGE, DELETION\n+from django.contrib.contenttypes.models import ContentType\n from akvo.rsr.models import IndicatorDimensionName\n \n from ..serializers import IndicatorDimensionNameSerializer\n@@ -17,3 +19,31 @@\n queryset = IndicatorDimensionName.objects.prefetch_related('dimension_values')\n serializer_class = IndicatorDimensionNameSerializer\n project_relation = 'project__'\n+\n+ def create(self, request, *args, **kwargs):\n+ response = super(IndicatorDimensionNameViewSet, self).create(request, *args, **kwargs)\n+ self._log_action(ADDITION, response.data, str(request.data))\n+ return response\n+\n+ def update(self, request, *args, **kwargs):\n+ response = super(IndicatorDimensionNameViewSet, self).update(request, *args, **kwargs)\n+ self._log_action(CHANGE, response.data, str(request.data))\n+ return response\n+\n+ def destroy(self, request, *args, **kwargs):\n+ instance = self.get_object()\n+ data = {'id': instance.id, 'name': instance.name}\n+ response = super(IndicatorDimensionNameViewSet, self).destroy(request, *args, **kwargs)\n+ self._log_action(DELETION, data)\n+ return response\n+\n+ def _log_action(self, action_flag, instance, message=''):\n+ user = self.request.user\n+ LogEntry.objects.log_action(\n+ user_id=user.pk,\n+ content_type_id=ContentType.objects.get_for_model(IndicatorDimensionName).pk,\n+ object_id=instance['id'],\n+ object_repr=str(instance),\n+ action_flag=action_flag,\n+ change_message=message\n+ )\ndiff --git a/akvo/rest/views/indicator_dimension_value.py b/akvo/rest/views/indicator_dimension_value.py\n--- a/akvo/rest/views/indicator_dimension_value.py\n+++ b/akvo/rest/views/indicator_dimension_value.py\n@@ -5,6 +5,8 @@\n # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n \n \n+from django.contrib.admin.models import LogEntry, ADDITION, CHANGE, DELETION\n+from django.contrib.contenttypes.models import ContentType\n from akvo.rsr.models import IndicatorDimensionValue\n \n from ..serializers import IndicatorDimensionValueSerializer\n@@ -17,3 +19,31 @@\n queryset = IndicatorDimensionValue.objects.all()\n serializer_class = IndicatorDimensionValueSerializer\n project_relation = 'name__project__'\n+\n+ def create(self, request, *args, **kwargs):\n+ response = super(IndicatorDimensionValueViewSet, self).create(request, *args, **kwargs)\n+ self._log_action(ADDITION, response.data, str(request.data))\n+ return response\n+\n+ def update(self, request, *args, **kwargs):\n+ response = super(IndicatorDimensionValueViewSet, self).update(request, *args, **kwargs)\n+ self._log_action(CHANGE, response.data, str(request.data))\n+ return response\n+\n+ def destroy(self, request, *args, **kwargs):\n+ instance = self.get_object()\n+ data = {'id': instance.id, 'value': instance.value}\n+ response = super(IndicatorDimensionValueViewSet, self).destroy(request, *args, **kwargs)\n+ self._log_action(DELETION, data)\n+ return response\n+\n+ def _log_action(self, action_flag, instance, message=''):\n+ user = self.request.user\n+ LogEntry.objects.log_action(\n+ user_id=user.pk,\n+ content_type_id=ContentType.objects.get_for_model(IndicatorDimensionValue).pk,\n+ object_id=instance['id'],\n+ object_repr=str(instance),\n+ action_flag=action_flag,\n+ change_message=message\n+ )\n", "issue": "Audit log disaggregation categories and labels\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\n\nfrom akvo.rsr.models import IndicatorDimensionName\n\nfrom ..serializers import IndicatorDimensionNameSerializer\nfrom ..viewsets import PublicProjectViewSet\n\n\nclass IndicatorDimensionNameViewSet(PublicProjectViewSet):\n \"\"\"\n \"\"\"\n queryset = IndicatorDimensionName.objects.prefetch_related('dimension_values')\n serializer_class = IndicatorDimensionNameSerializer\n project_relation = 'project__'\n", "path": "akvo/rest/views/indicator_dimension_name.py"}, {"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\n\nfrom akvo.rsr.models import IndicatorDimensionValue\n\nfrom ..serializers import IndicatorDimensionValueSerializer\nfrom ..viewsets import PublicProjectViewSet\n\n\nclass IndicatorDimensionValueViewSet(PublicProjectViewSet):\n \"\"\"\n \"\"\"\n queryset = IndicatorDimensionValue.objects.all()\n serializer_class = IndicatorDimensionValueSerializer\n project_relation = 'name__project__'\n", "path": "akvo/rest/views/indicator_dimension_value.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\n\nfrom django.contrib.admin.models import LogEntry, ADDITION, CHANGE, DELETION\nfrom django.contrib.contenttypes.models import ContentType\nfrom akvo.rsr.models import IndicatorDimensionName\n\nfrom ..serializers import IndicatorDimensionNameSerializer\nfrom ..viewsets import PublicProjectViewSet\n\n\nclass IndicatorDimensionNameViewSet(PublicProjectViewSet):\n \"\"\"\n \"\"\"\n queryset = IndicatorDimensionName.objects.prefetch_related('dimension_values')\n serializer_class = IndicatorDimensionNameSerializer\n project_relation = 'project__'\n\n def create(self, request, *args, **kwargs):\n response = super(IndicatorDimensionNameViewSet, self).create(request, *args, **kwargs)\n self._log_action(ADDITION, response.data, str(request.data))\n return response\n\n def update(self, request, *args, **kwargs):\n response = super(IndicatorDimensionNameViewSet, self).update(request, *args, **kwargs)\n self._log_action(CHANGE, response.data, str(request.data))\n return response\n\n def destroy(self, request, *args, **kwargs):\n instance = self.get_object()\n data = {'id': instance.id, 'name': instance.name}\n response = super(IndicatorDimensionNameViewSet, self).destroy(request, *args, **kwargs)\n self._log_action(DELETION, data)\n return response\n\n def _log_action(self, action_flag, instance, message=''):\n user = self.request.user\n LogEntry.objects.log_action(\n user_id=user.pk,\n content_type_id=ContentType.objects.get_for_model(IndicatorDimensionName).pk,\n object_id=instance['id'],\n object_repr=str(instance),\n action_flag=action_flag,\n change_message=message\n )\n", "path": "akvo/rest/views/indicator_dimension_name.py"}, {"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\n\nfrom django.contrib.admin.models import LogEntry, ADDITION, CHANGE, DELETION\nfrom django.contrib.contenttypes.models import ContentType\nfrom akvo.rsr.models import IndicatorDimensionValue\n\nfrom ..serializers import IndicatorDimensionValueSerializer\nfrom ..viewsets import PublicProjectViewSet\n\n\nclass IndicatorDimensionValueViewSet(PublicProjectViewSet):\n \"\"\"\n \"\"\"\n queryset = IndicatorDimensionValue.objects.all()\n serializer_class = IndicatorDimensionValueSerializer\n project_relation = 'name__project__'\n\n def create(self, request, *args, **kwargs):\n response = super(IndicatorDimensionValueViewSet, self).create(request, *args, **kwargs)\n self._log_action(ADDITION, response.data, str(request.data))\n return response\n\n def update(self, request, *args, **kwargs):\n response = super(IndicatorDimensionValueViewSet, self).update(request, *args, **kwargs)\n self._log_action(CHANGE, response.data, str(request.data))\n return response\n\n def destroy(self, request, *args, **kwargs):\n instance = self.get_object()\n data = {'id': instance.id, 'value': instance.value}\n response = super(IndicatorDimensionValueViewSet, self).destroy(request, *args, **kwargs)\n self._log_action(DELETION, data)\n return response\n\n def _log_action(self, action_flag, instance, message=''):\n user = self.request.user\n LogEntry.objects.log_action(\n user_id=user.pk,\n content_type_id=ContentType.objects.get_for_model(IndicatorDimensionValue).pk,\n object_id=instance['id'],\n object_repr=str(instance),\n action_flag=action_flag,\n change_message=message\n )\n", "path": "akvo/rest/views/indicator_dimension_value.py"}]} | 646 | 925 |
gh_patches_debug_61680 | rasdani/github-patches | git_diff | joke2k__faker-48 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Capital O missing an umlaut
Hello, I noticed in faker/Providers/De_de/internet.py in the _to_ascii method, the capital O is missing an umlaut.
It should be: ('Ö', 'Oe')
Currently:
replacements = (
('ä', 'ae'), ('Ä', 'Ae'),
('ö', 'oe'), ('O', 'Oe'),
('ü', 'ue'), ('Ü', 'Ue'),
('ß', 'ss')
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `faker/providers/de_DE/internet.py`
Content:
```
1 # coding=utf-8
2 from __future__ import unicode_literals
3 from ..internet import Provider as InternetProvider
4
5 import re
6
7
8 class Provider(InternetProvider):
9
10 free_email_domains = (
11 'web.de', 'gmail.com', 'hotmail.de', 'yahoo.de', 'googlemail.com',
12 'aol.de', 'gmx.de'
13 )
14 tlds = ('com', 'com', 'com', 'net', 'org', 'de', 'de', 'de')
15
16 @staticmethod
17 def _to_ascii(string):
18 replacements = (
19 ('ä', 'ae'), ('Ä', 'Ae'),
20 ('ö', 'oe'), ('O', 'Oe'),
21 ('ü', 'ue'), ('Ü', 'Ue'),
22 ('ß', 'ss')
23 )
24 for search, replace in replacements:
25 string = string.replace(search, replace)
26
27 return string
28
29 def user_name(self):
30 pattern = self.random_element(self.user_name_formats)
31 return self._to_ascii(
32 self.bothify(self.generator.parse(pattern)
33 ).lower())
34
35 def domain_word(self):
36 company = self.generator.format('company')
37 company_elements = company.split(' ')
38 company = self._to_ascii(company_elements.pop(0))
39 return re.sub(r'\W', '', company).lower()
40
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/faker/providers/de_DE/internet.py b/faker/providers/de_DE/internet.py
--- a/faker/providers/de_DE/internet.py
+++ b/faker/providers/de_DE/internet.py
@@ -17,7 +17,7 @@
def _to_ascii(string):
replacements = (
('ä', 'ae'), ('Ä', 'Ae'),
- ('ö', 'oe'), ('O', 'Oe'),
+ ('ö', 'oe'), ('Ö', 'Oe'),
('ü', 'ue'), ('Ü', 'Ue'),
('ß', 'ss')
)
| {"golden_diff": "diff --git a/faker/providers/de_DE/internet.py b/faker/providers/de_DE/internet.py\n--- a/faker/providers/de_DE/internet.py\n+++ b/faker/providers/de_DE/internet.py\n@@ -17,7 +17,7 @@\n def _to_ascii(string):\n replacements = (\n ('\u00e4', 'ae'), ('\u00c4', 'Ae'),\n- ('\u00f6', 'oe'), ('O', 'Oe'),\n+ ('\u00f6', 'oe'), ('\u00d6', 'Oe'),\n ('\u00fc', 'ue'), ('\u00dc', 'Ue'),\n ('\u00df', 'ss')\n )\n", "issue": "Capital O missing an umlaut\nHello, I noticed in faker/Providers/De_de/internet.py in the _to_ascii method, the capital O is missing an umlaut. \n\nIt should be: ('\u00d6', 'Oe') \n\nCurrently:\nreplacements = (\n ('\u00e4', 'ae'), ('\u00c4', 'Ae'),\n ('\u00f6', 'oe'), ('O', 'Oe'),\n ('\u00fc', 'ue'), ('\u00dc', 'Ue'),\n ('\u00df', 'ss')\n\n", "before_files": [{"content": "# coding=utf-8\nfrom __future__ import unicode_literals\nfrom ..internet import Provider as InternetProvider\n\nimport re\n\n\nclass Provider(InternetProvider):\n\n free_email_domains = (\n 'web.de', 'gmail.com', 'hotmail.de', 'yahoo.de', 'googlemail.com',\n 'aol.de', 'gmx.de'\n )\n tlds = ('com', 'com', 'com', 'net', 'org', 'de', 'de', 'de')\n\n @staticmethod\n def _to_ascii(string):\n replacements = (\n ('\u00e4', 'ae'), ('\u00c4', 'Ae'),\n ('\u00f6', 'oe'), ('O', 'Oe'),\n ('\u00fc', 'ue'), ('\u00dc', 'Ue'),\n ('\u00df', 'ss')\n )\n for search, replace in replacements:\n string = string.replace(search, replace)\n\n return string\n\n def user_name(self):\n pattern = self.random_element(self.user_name_formats)\n return self._to_ascii(\n self.bothify(self.generator.parse(pattern)\n ).lower())\n\n def domain_word(self):\n company = self.generator.format('company')\n company_elements = company.split(' ')\n company = self._to_ascii(company_elements.pop(0))\n return re.sub(r'\\W', '', company).lower()\n", "path": "faker/providers/de_DE/internet.py"}], "after_files": [{"content": "# coding=utf-8\nfrom __future__ import unicode_literals\nfrom ..internet import Provider as InternetProvider\n\nimport re\n\n\nclass Provider(InternetProvider):\n\n free_email_domains = (\n 'web.de', 'gmail.com', 'hotmail.de', 'yahoo.de', 'googlemail.com',\n 'aol.de', 'gmx.de'\n )\n tlds = ('com', 'com', 'com', 'net', 'org', 'de', 'de', 'de')\n\n @staticmethod\n def _to_ascii(string):\n replacements = (\n ('\u00e4', 'ae'), ('\u00c4', 'Ae'),\n ('\u00f6', 'oe'), ('\u00d6', 'Oe'),\n ('\u00fc', 'ue'), ('\u00dc', 'Ue'),\n ('\u00df', 'ss')\n )\n for search, replace in replacements:\n string = string.replace(search, replace)\n\n return string\n\n def user_name(self):\n pattern = self.random_element(self.user_name_formats)\n return self._to_ascii(\n self.bothify(self.generator.parse(pattern)\n ).lower())\n\n def domain_word(self):\n company = self.generator.format('company')\n company_elements = company.split(' ')\n company = self._to_ascii(company_elements.pop(0))\n return re.sub(r'\\W', '', company).lower()\n", "path": "faker/providers/de_DE/internet.py"}]} | 722 | 134 |
gh_patches_debug_5954 | rasdani/github-patches | git_diff | mathesar-foundation__mathesar-2609 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Issues with installation process that connects an existing DB
- [x] Tester Marius reports (server credentials in Upwork)
- [ ] It seems that even if you select existing database, it still tries to start a docker container for the database, creating a conflict?
- [x] Tester Mohammad reports an error as well ([details here](https://docs.google.com/document/d/15m9eZFocAsU1V9inLKxC6i_KQxMdu28snRrBPOrf5Hk/edit))
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `db/install.py`
Content:
```
1 from sqlalchemy import text
2 from sqlalchemy.exc import OperationalError
3
4 from db import engine
5 from db.types import install
6
7
8 def install_mathesar(
9 database_name, username, password, hostname, port, skip_confirm
10 ):
11 """Create database and install Mathesar on it."""
12 user_db_engine = engine.create_future_engine(
13 username, password, hostname, database_name, port,
14 connect_args={"connect_timeout": 10}
15 )
16 try:
17 user_db_engine.connect()
18 print(f"Installing Mathesar on preexisting PostgreSQL database {database_name} at host {hostname}...")
19 install.install_mathesar_on_database(user_db_engine)
20 user_db_engine.dispose()
21 except OperationalError:
22 database_created = _create_database(
23 database_name=database_name,
24 hostname=hostname,
25 username=username,
26 password=password,
27 port=port,
28 skip_confirm=skip_confirm
29 )
30 if database_created:
31 print(f"Installing Mathesar on PostgreSQL database {database_name} at host {hostname}...")
32 install.install_mathesar_on_database(user_db_engine)
33 user_db_engine.dispose()
34 else:
35 print(f"Skipping installing on DB with key {database_name}.")
36
37
38 def _create_database(database_name, hostname, username, password, port, skip_confirm=True):
39 if skip_confirm is True:
40 create_database = "y"
41 else:
42 create_database = input(
43 f"Create a new Database called {database_name}? (y/n) > "
44 )
45 if create_database.lower() in ["y", "yes"]:
46 # We need to connect to an existing database inorder to create a new Database.
47 # So we use the default Database `postgres` that comes with postgres.
48 # TODO Throw correct error when the default postgres database does not exists(which is very rare but still possible)
49 root_database = "postgres"
50 root_db_engine = engine.create_future_engine(
51 username, password, hostname, root_database, port,
52 connect_args={"connect_timeout": 10}
53 )
54 with root_db_engine.connect() as conn:
55 conn.execution_options(isolation_level="AUTOCOMMIT")
56 conn.execute(text(f"CREATE DATABASE {database_name}"))
57 root_db_engine.dispose()
58 print(f"Created DB is {database_name}.")
59 return True
60 else:
61 print(f"Database {database_name} not created!")
62 return False
63
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/db/install.py b/db/install.py
--- a/db/install.py
+++ b/db/install.py
@@ -53,7 +53,7 @@
)
with root_db_engine.connect() as conn:
conn.execution_options(isolation_level="AUTOCOMMIT")
- conn.execute(text(f"CREATE DATABASE {database_name}"))
+ conn.execute(text(f'CREATE DATABASE "{database_name}"'))
root_db_engine.dispose()
print(f"Created DB is {database_name}.")
return True
| {"golden_diff": "diff --git a/db/install.py b/db/install.py\n--- a/db/install.py\n+++ b/db/install.py\n@@ -53,7 +53,7 @@\n )\n with root_db_engine.connect() as conn:\n conn.execution_options(isolation_level=\"AUTOCOMMIT\")\n- conn.execute(text(f\"CREATE DATABASE {database_name}\"))\n+ conn.execute(text(f'CREATE DATABASE \"{database_name}\"'))\n root_db_engine.dispose()\n print(f\"Created DB is {database_name}.\")\n return True\n", "issue": "Issues with installation process that connects an existing DB\n- [x] Tester Marius reports (server credentials in Upwork)\r\n - [ ] It seems that even if you select existing database, it still tries to start a docker container for the database, creating a conflict?\r\n- [x] Tester Mohammad reports an error as well ([details here](https://docs.google.com/document/d/15m9eZFocAsU1V9inLKxC6i_KQxMdu28snRrBPOrf5Hk/edit))\n", "before_files": [{"content": "from sqlalchemy import text\nfrom sqlalchemy.exc import OperationalError\n\nfrom db import engine\nfrom db.types import install\n\n\ndef install_mathesar(\n database_name, username, password, hostname, port, skip_confirm\n):\n \"\"\"Create database and install Mathesar on it.\"\"\"\n user_db_engine = engine.create_future_engine(\n username, password, hostname, database_name, port,\n connect_args={\"connect_timeout\": 10}\n )\n try:\n user_db_engine.connect()\n print(f\"Installing Mathesar on preexisting PostgreSQL database {database_name} at host {hostname}...\")\n install.install_mathesar_on_database(user_db_engine)\n user_db_engine.dispose()\n except OperationalError:\n database_created = _create_database(\n database_name=database_name,\n hostname=hostname,\n username=username,\n password=password,\n port=port,\n skip_confirm=skip_confirm\n )\n if database_created:\n print(f\"Installing Mathesar on PostgreSQL database {database_name} at host {hostname}...\")\n install.install_mathesar_on_database(user_db_engine)\n user_db_engine.dispose()\n else:\n print(f\"Skipping installing on DB with key {database_name}.\")\n\n\ndef _create_database(database_name, hostname, username, password, port, skip_confirm=True):\n if skip_confirm is True:\n create_database = \"y\"\n else:\n create_database = input(\n f\"Create a new Database called {database_name}? (y/n) > \"\n )\n if create_database.lower() in [\"y\", \"yes\"]:\n # We need to connect to an existing database inorder to create a new Database.\n # So we use the default Database `postgres` that comes with postgres.\n # TODO Throw correct error when the default postgres database does not exists(which is very rare but still possible)\n root_database = \"postgres\"\n root_db_engine = engine.create_future_engine(\n username, password, hostname, root_database, port,\n connect_args={\"connect_timeout\": 10}\n )\n with root_db_engine.connect() as conn:\n conn.execution_options(isolation_level=\"AUTOCOMMIT\")\n conn.execute(text(f\"CREATE DATABASE {database_name}\"))\n root_db_engine.dispose()\n print(f\"Created DB is {database_name}.\")\n return True\n else:\n print(f\"Database {database_name} not created!\")\n return False\n", "path": "db/install.py"}], "after_files": [{"content": "from sqlalchemy import text\nfrom sqlalchemy.exc import OperationalError\n\nfrom db import engine\nfrom db.types import install\n\n\ndef install_mathesar(\n database_name, username, password, hostname, port, skip_confirm\n):\n \"\"\"Create database and install Mathesar on it.\"\"\"\n user_db_engine = engine.create_future_engine(\n username, password, hostname, database_name, port,\n connect_args={\"connect_timeout\": 10}\n )\n try:\n user_db_engine.connect()\n print(f\"Installing Mathesar on preexisting PostgreSQL database {database_name} at host {hostname}...\")\n install.install_mathesar_on_database(user_db_engine)\n user_db_engine.dispose()\n except OperationalError:\n database_created = _create_database(\n database_name=database_name,\n hostname=hostname,\n username=username,\n password=password,\n port=port,\n skip_confirm=skip_confirm\n )\n if database_created:\n print(f\"Installing Mathesar on PostgreSQL database {database_name} at host {hostname}...\")\n install.install_mathesar_on_database(user_db_engine)\n user_db_engine.dispose()\n else:\n print(f\"Skipping installing on DB with key {database_name}.\")\n\n\ndef _create_database(database_name, hostname, username, password, port, skip_confirm=True):\n if skip_confirm is True:\n create_database = \"y\"\n else:\n create_database = input(\n f\"Create a new Database called {database_name}? (y/n) > \"\n )\n if create_database.lower() in [\"y\", \"yes\"]:\n # We need to connect to an existing database inorder to create a new Database.\n # So we use the default Database `postgres` that comes with postgres.\n # TODO Throw correct error when the default postgres database does not exists(which is very rare but still possible)\n root_database = \"postgres\"\n root_db_engine = engine.create_future_engine(\n username, password, hostname, root_database, port,\n connect_args={\"connect_timeout\": 10}\n )\n with root_db_engine.connect() as conn:\n conn.execution_options(isolation_level=\"AUTOCOMMIT\")\n conn.execute(text(f'CREATE DATABASE \"{database_name}\"'))\n root_db_engine.dispose()\n print(f\"Created DB is {database_name}.\")\n return True\n else:\n print(f\"Database {database_name} not created!\")\n return False\n", "path": "db/install.py"}]} | 986 | 111 |
gh_patches_debug_41704 | rasdani/github-patches | git_diff | Flexget__Flexget-2525 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
qBittorrent 4.2.0 can't work
Hi
I've upgrade to qBittorrent 4.2.0 and flexget can't add new tasks to qBittorrent.
FlexGet version: 3.0.11
Python version: 3.7.5
qBittorrent 4.2.0 can't work
Hi
I've upgrade to qBittorrent 4.2.0 and flexget can't add new tasks to qBittorrent.
FlexGet version: 3.0.11
Python version: 3.7.5
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `flexget/plugins/clients/qbittorrent.py`
Content:
```
1 import logging
2 import os
3
4 from requests import Session
5 from requests.exceptions import RequestException
6
7 from flexget import plugin
8 from flexget.event import event
9 from flexget.utils.template import RenderError
10
11 log = logging.getLogger('qbittorrent')
12
13
14 class OutputQBitTorrent:
15 """
16 Example:
17
18 qbittorrent:
19 username: <USERNAME> (default: (none))
20 password: <PASSWORD> (default: (none))
21 host: <HOSTNAME> (default: localhost)
22 port: <PORT> (default: 8080)
23 use_ssl: <SSL> (default: False)
24 verify_cert: <VERIFY> (default: True)
25 path: <OUTPUT_DIR> (default: (none))
26 label: <LABEL> (default: (none))
27 maxupspeed: <torrent upload speed limit> (default: 0)
28 maxdownspeed: <torrent download speed limit> (default: 0)
29 add_paused: <ADD_PAUSED> (default: False)
30 """
31
32 schema = {
33 'anyOf': [
34 {'type': 'boolean'},
35 {
36 'type': 'object',
37 'properties': {
38 'username': {'type': 'string'},
39 'password': {'type': 'string'},
40 'host': {'type': 'string'},
41 'port': {'type': 'integer'},
42 'use_ssl': {'type': 'boolean'},
43 'verify_cert': {'type': 'boolean'},
44 'path': {'type': 'string'},
45 'label': {'type': 'string'},
46 'maxupspeed': {'type': 'integer'},
47 'maxdownspeed': {'type': 'integer'},
48 'fail_html': {'type': 'boolean'},
49 'add_paused': {'type': 'boolean'},
50 },
51 'additionalProperties': False,
52 },
53 ]
54 }
55
56 def _request(self, method, url, msg_on_fail=None, **kwargs):
57 try:
58 response = self.session.request(method, url, **kwargs)
59 if response == 'Fails.':
60 msg = (
61 'Failure. URL: {}, data: {}'.format(url, kwargs)
62 if not msg_on_fail
63 else msg_on_fail
64 )
65 else:
66 return response
67 except RequestException as e:
68 msg = str(e)
69 raise plugin.PluginError(
70 'Error when trying to send request to qBittorrent: {}'.format(msg)
71 )
72
73 def connect(self, config):
74 """
75 Connect to qBittorrent Web UI. Username and password not necessary
76 if 'Bypass authentication for localhost' is checked and host is
77 'localhost'.
78 """
79 self.session = Session()
80 self.url = '{}://{}:{}'.format(
81 'https' if config['use_ssl'] else 'http', config['host'], config['port']
82 )
83 if config.get('username') and config.get('password'):
84 data = {'username': config['username'], 'password': config['password']}
85 self._request(
86 'post',
87 self.url + '/login',
88 data=data,
89 msg_on_fail='Authentication failed.',
90 verify=config['verify_cert'],
91 )
92 log.debug('Successfully connected to qBittorrent')
93 self.connected = True
94
95 def add_torrent_file(self, file_path, data, verify_cert):
96 if not self.connected:
97 raise plugin.PluginError('Not connected.')
98 multipart_data = {k: (None, v) for k, v in data.items()}
99 with open(file_path, 'rb') as f:
100 multipart_data['torrents'] = f
101 self._request(
102 'post',
103 self.url + '/command/upload',
104 msg_on_fail='Failed to add file to qBittorrent',
105 files=multipart_data,
106 verify=verify_cert,
107 )
108 log.debug('Added torrent file %s to qBittorrent', file_path)
109
110 def add_torrent_url(self, url, data, verify_cert):
111 if not self.connected:
112 raise plugin.PluginError('Not connected.')
113 data['urls'] = url
114 multipart_data = {k: (None, v) for k, v in data.items()}
115 self._request(
116 'post',
117 self.url + '/command/download',
118 msg_on_fail='Failed to add file to qBittorrent',
119 files=multipart_data,
120 verify=verify_cert,
121 )
122 log.debug('Added url %s to qBittorrent', url)
123
124 def prepare_config(self, config):
125 if isinstance(config, bool):
126 config = {'enabled': config}
127 config.setdefault('enabled', True)
128 config.setdefault('host', 'localhost')
129 config.setdefault('port', 8080)
130 config.setdefault('use_ssl', False)
131 config.setdefault('verify_cert', True)
132 config.setdefault('label', '')
133 config.setdefault('maxupspeed', 0)
134 config.setdefault('maxdownspeed', 0)
135 config.setdefault('fail_html', True)
136 return config
137
138 def add_entries(self, task, config):
139 for entry in task.accepted:
140 form_data = {}
141 try:
142 save_path = entry.render(entry.get('path', config.get('path', '')))
143 if save_path:
144 form_data['savepath'] = save_path
145 except RenderError as e:
146 log.error('Error setting path for %s: %s', entry['title'], e)
147
148 label = entry.get('label', config.get('label'))
149 if label:
150 form_data['label'] = label # qBittorrent v3.3.3-
151 form_data['category'] = label # qBittorrent v3.3.4+
152
153 add_paused = entry.get('add_paused', config.get('add_paused'))
154 if add_paused:
155 form_data['paused'] = 'true'
156
157 maxupspeed = entry.get('maxupspeed', config.get('maxupspeed'))
158 if maxupspeed:
159 form_data['upLimit'] = maxupspeed * 1024
160
161 maxdownspeed = entry.get('maxdownspeed', config.get('maxdownspeed'))
162 if maxdownspeed:
163 form_data['dlLimit'] = maxdownspeed * 1024
164
165 is_magnet = entry['url'].startswith('magnet:')
166
167 if task.manager.options.test:
168 log.info('Test mode.')
169 log.info('Would add torrent to qBittorrent with:')
170 if not is_magnet:
171 log.info('File: %s', entry.get('file'))
172 else:
173 log.info('Url: %s', entry.get('url'))
174 log.info('Save path: %s', form_data.get('savepath'))
175 log.info('Label: %s', form_data.get('label'))
176 log.info('Paused: %s', form_data.get('paused', 'false'))
177 if maxupspeed:
178 log.info('Upload Speed Limit: %d', form_data.get('upLimit'))
179 if maxdownspeed:
180 log.info('Download Speed Limit: %d', form_data.get('dlLimit'))
181 continue
182
183 if not is_magnet:
184 if 'file' not in entry:
185 entry.fail('File missing?')
186 continue
187 if not os.path.exists(entry['file']):
188 tmp_path = os.path.join(task.manager.config_base, 'temp')
189 log.debug('entry: %s', entry)
190 log.debug('temp: %s', ', '.join(os.listdir(tmp_path)))
191 entry.fail("Downloaded temp file '%s' doesn't exist!?" % entry['file'])
192 continue
193 self.add_torrent_file(entry['file'], form_data, config['verify_cert'])
194 else:
195 self.add_torrent_url(entry['url'], form_data, config['verify_cert'])
196
197 @plugin.priority(120)
198 def on_task_download(self, task, config):
199 """
200 Call download plugin to generate torrent files to load into
201 qBittorrent.
202 """
203 config = self.prepare_config(config)
204 if not config['enabled']:
205 return
206 if 'download' not in task.config:
207 download = plugin.get('download', self)
208 download.get_temp_files(task, handle_magnets=True, fail_html=config['fail_html'])
209
210 @plugin.priority(135)
211 def on_task_output(self, task, config):
212 """Add torrents to qBittorrent at exit."""
213 if task.accepted:
214 config = self.prepare_config(config)
215 self.connect(config)
216 self.add_entries(task, config)
217
218
219 @event('plugin.register')
220 def register_plugin():
221 plugin.register(OutputQBitTorrent, 'qbittorrent', api_ver=2)
222
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/flexget/plugins/clients/qbittorrent.py b/flexget/plugins/clients/qbittorrent.py
--- a/flexget/plugins/clients/qbittorrent.py
+++ b/flexget/plugins/clients/qbittorrent.py
@@ -56,14 +56,41 @@
def _request(self, method, url, msg_on_fail=None, **kwargs):
try:
response = self.session.request(method, url, **kwargs)
- if response == 'Fails.':
+ if response.text == "Ok.":
+ return response
+ else:
msg = (
'Failure. URL: {}, data: {}'.format(url, kwargs)
if not msg_on_fail
else msg_on_fail
)
- else:
- return response
+ except RequestException as e:
+ msg = str(e)
+ raise plugin.PluginError(
+ 'Error when trying to send request to qBittorrent: {}'.format(msg)
+ )
+
+ def check_api_version(self, msg_on_fail):
+ try:
+ url = self.url + "/api/v2/app/webapiVersion"
+ response = self.session.request('get', url)
+ if response.status_code != 404:
+ self.api_url_login = '/api/v2/auth/login'
+ self.api_url_add = '/api/v2/torrents/add'
+ return response
+
+ url = self.url + "/version/api"
+ response = self.session.request('get', url)
+ if response.status_code != 404:
+ self.api_url_login = '/login'
+ self.api_url_add = '/command/upload'
+ return response
+
+ msg = (
+ 'Failure. URL: {}'.format(url)
+ if not msg_on_fail
+ else msg_on_fail
+ )
except RequestException as e:
msg = str(e)
raise plugin.PluginError(
@@ -80,11 +107,12 @@
self.url = '{}://{}:{}'.format(
'https' if config['use_ssl'] else 'http', config['host'], config['port']
)
+ self.check_api_version('Check API version failed.')
if config.get('username') and config.get('password'):
data = {'username': config['username'], 'password': config['password']}
self._request(
'post',
- self.url + '/login',
+ self.url + self.api_url_login,
data=data,
msg_on_fail='Authentication failed.',
verify=config['verify_cert'],
@@ -100,7 +128,7 @@
multipart_data['torrents'] = f
self._request(
'post',
- self.url + '/command/upload',
+ self.url + self.api_url_add,
msg_on_fail='Failed to add file to qBittorrent',
files=multipart_data,
verify=verify_cert,
@@ -114,7 +142,7 @@
multipart_data = {k: (None, v) for k, v in data.items()}
self._request(
'post',
- self.url + '/command/download',
+ self.url + self.api_url_add,
msg_on_fail='Failed to add file to qBittorrent',
files=multipart_data,
verify=verify_cert,
| {"golden_diff": "diff --git a/flexget/plugins/clients/qbittorrent.py b/flexget/plugins/clients/qbittorrent.py\n--- a/flexget/plugins/clients/qbittorrent.py\n+++ b/flexget/plugins/clients/qbittorrent.py\n@@ -56,14 +56,41 @@\n def _request(self, method, url, msg_on_fail=None, **kwargs):\n try:\n response = self.session.request(method, url, **kwargs)\n- if response == 'Fails.':\n+ if response.text == \"Ok.\":\n+ return response \n+ else:\n msg = (\n 'Failure. URL: {}, data: {}'.format(url, kwargs)\n if not msg_on_fail\n else msg_on_fail\n )\n- else:\n- return response\n+ except RequestException as e:\n+ msg = str(e)\n+ raise plugin.PluginError(\n+ 'Error when trying to send request to qBittorrent: {}'.format(msg)\n+ )\n+ \n+ def check_api_version(self, msg_on_fail):\n+ try:\n+ url = self.url + \"/api/v2/app/webapiVersion\"\n+ response = self.session.request('get', url)\n+ if response.status_code != 404:\n+ self.api_url_login = '/api/v2/auth/login'\n+ self.api_url_add = '/api/v2/torrents/add'\n+ return response \n+ \n+ url = self.url + \"/version/api\"\n+ response = self.session.request('get', url)\n+ if response.status_code != 404:\n+ self.api_url_login = '/login'\n+ self.api_url_add = '/command/upload'\n+ return response \n+ \n+ msg = (\n+ 'Failure. URL: {}'.format(url)\n+ if not msg_on_fail\n+ else msg_on_fail\n+ )\n except RequestException as e:\n msg = str(e)\n raise plugin.PluginError(\n@@ -80,11 +107,12 @@\n self.url = '{}://{}:{}'.format(\n 'https' if config['use_ssl'] else 'http', config['host'], config['port']\n )\n+ self.check_api_version('Check API version failed.')\n if config.get('username') and config.get('password'):\n data = {'username': config['username'], 'password': config['password']}\n self._request(\n 'post',\n- self.url + '/login',\n+ self.url + self.api_url_login,\n data=data,\n msg_on_fail='Authentication failed.',\n verify=config['verify_cert'],\n@@ -100,7 +128,7 @@\n multipart_data['torrents'] = f\n self._request(\n 'post',\n- self.url + '/command/upload',\n+ self.url + self.api_url_add,\n msg_on_fail='Failed to add file to qBittorrent',\n files=multipart_data,\n verify=verify_cert,\n@@ -114,7 +142,7 @@\n multipart_data = {k: (None, v) for k, v in data.items()}\n self._request(\n 'post',\n- self.url + '/command/download',\n+ self.url + self.api_url_add,\n msg_on_fail='Failed to add file to qBittorrent',\n files=multipart_data,\n verify=verify_cert,\n", "issue": "qBittorrent 4.2.0 can't work\nHi\r\n I've upgrade to qBittorrent 4.2.0 and flexget can't add new tasks to qBittorrent.\r\n \r\nFlexGet version: 3.0.11\r\nPython version: 3.7.5\nqBittorrent 4.2.0 can't work\nHi\r\n I've upgrade to qBittorrent 4.2.0 and flexget can't add new tasks to qBittorrent.\r\n \r\nFlexGet version: 3.0.11\r\nPython version: 3.7.5\n", "before_files": [{"content": "import logging\nimport os\n\nfrom requests import Session\nfrom requests.exceptions import RequestException\n\nfrom flexget import plugin\nfrom flexget.event import event\nfrom flexget.utils.template import RenderError\n\nlog = logging.getLogger('qbittorrent')\n\n\nclass OutputQBitTorrent:\n \"\"\"\n Example:\n\n qbittorrent:\n username: <USERNAME> (default: (none))\n password: <PASSWORD> (default: (none))\n host: <HOSTNAME> (default: localhost)\n port: <PORT> (default: 8080)\n use_ssl: <SSL> (default: False)\n verify_cert: <VERIFY> (default: True)\n path: <OUTPUT_DIR> (default: (none))\n label: <LABEL> (default: (none))\n maxupspeed: <torrent upload speed limit> (default: 0)\n maxdownspeed: <torrent download speed limit> (default: 0)\n add_paused: <ADD_PAUSED> (default: False)\n \"\"\"\n\n schema = {\n 'anyOf': [\n {'type': 'boolean'},\n {\n 'type': 'object',\n 'properties': {\n 'username': {'type': 'string'},\n 'password': {'type': 'string'},\n 'host': {'type': 'string'},\n 'port': {'type': 'integer'},\n 'use_ssl': {'type': 'boolean'},\n 'verify_cert': {'type': 'boolean'},\n 'path': {'type': 'string'},\n 'label': {'type': 'string'},\n 'maxupspeed': {'type': 'integer'},\n 'maxdownspeed': {'type': 'integer'},\n 'fail_html': {'type': 'boolean'},\n 'add_paused': {'type': 'boolean'},\n },\n 'additionalProperties': False,\n },\n ]\n }\n\n def _request(self, method, url, msg_on_fail=None, **kwargs):\n try:\n response = self.session.request(method, url, **kwargs)\n if response == 'Fails.':\n msg = (\n 'Failure. URL: {}, data: {}'.format(url, kwargs)\n if not msg_on_fail\n else msg_on_fail\n )\n else:\n return response\n except RequestException as e:\n msg = str(e)\n raise plugin.PluginError(\n 'Error when trying to send request to qBittorrent: {}'.format(msg)\n )\n\n def connect(self, config):\n \"\"\"\n Connect to qBittorrent Web UI. Username and password not necessary\n if 'Bypass authentication for localhost' is checked and host is\n 'localhost'.\n \"\"\"\n self.session = Session()\n self.url = '{}://{}:{}'.format(\n 'https' if config['use_ssl'] else 'http', config['host'], config['port']\n )\n if config.get('username') and config.get('password'):\n data = {'username': config['username'], 'password': config['password']}\n self._request(\n 'post',\n self.url + '/login',\n data=data,\n msg_on_fail='Authentication failed.',\n verify=config['verify_cert'],\n )\n log.debug('Successfully connected to qBittorrent')\n self.connected = True\n\n def add_torrent_file(self, file_path, data, verify_cert):\n if not self.connected:\n raise plugin.PluginError('Not connected.')\n multipart_data = {k: (None, v) for k, v in data.items()}\n with open(file_path, 'rb') as f:\n multipart_data['torrents'] = f\n self._request(\n 'post',\n self.url + '/command/upload',\n msg_on_fail='Failed to add file to qBittorrent',\n files=multipart_data,\n verify=verify_cert,\n )\n log.debug('Added torrent file %s to qBittorrent', file_path)\n\n def add_torrent_url(self, url, data, verify_cert):\n if not self.connected:\n raise plugin.PluginError('Not connected.')\n data['urls'] = url\n multipart_data = {k: (None, v) for k, v in data.items()}\n self._request(\n 'post',\n self.url + '/command/download',\n msg_on_fail='Failed to add file to qBittorrent',\n files=multipart_data,\n verify=verify_cert,\n )\n log.debug('Added url %s to qBittorrent', url)\n\n def prepare_config(self, config):\n if isinstance(config, bool):\n config = {'enabled': config}\n config.setdefault('enabled', True)\n config.setdefault('host', 'localhost')\n config.setdefault('port', 8080)\n config.setdefault('use_ssl', False)\n config.setdefault('verify_cert', True)\n config.setdefault('label', '')\n config.setdefault('maxupspeed', 0)\n config.setdefault('maxdownspeed', 0)\n config.setdefault('fail_html', True)\n return config\n\n def add_entries(self, task, config):\n for entry in task.accepted:\n form_data = {}\n try:\n save_path = entry.render(entry.get('path', config.get('path', '')))\n if save_path:\n form_data['savepath'] = save_path\n except RenderError as e:\n log.error('Error setting path for %s: %s', entry['title'], e)\n\n label = entry.get('label', config.get('label'))\n if label:\n form_data['label'] = label # qBittorrent v3.3.3-\n form_data['category'] = label # qBittorrent v3.3.4+\n\n add_paused = entry.get('add_paused', config.get('add_paused'))\n if add_paused:\n form_data['paused'] = 'true'\n\n maxupspeed = entry.get('maxupspeed', config.get('maxupspeed'))\n if maxupspeed:\n form_data['upLimit'] = maxupspeed * 1024\n\n maxdownspeed = entry.get('maxdownspeed', config.get('maxdownspeed'))\n if maxdownspeed:\n form_data['dlLimit'] = maxdownspeed * 1024\n\n is_magnet = entry['url'].startswith('magnet:')\n\n if task.manager.options.test:\n log.info('Test mode.')\n log.info('Would add torrent to qBittorrent with:')\n if not is_magnet:\n log.info('File: %s', entry.get('file'))\n else:\n log.info('Url: %s', entry.get('url'))\n log.info('Save path: %s', form_data.get('savepath'))\n log.info('Label: %s', form_data.get('label'))\n log.info('Paused: %s', form_data.get('paused', 'false'))\n if maxupspeed:\n log.info('Upload Speed Limit: %d', form_data.get('upLimit'))\n if maxdownspeed:\n log.info('Download Speed Limit: %d', form_data.get('dlLimit'))\n continue\n\n if not is_magnet:\n if 'file' not in entry:\n entry.fail('File missing?')\n continue\n if not os.path.exists(entry['file']):\n tmp_path = os.path.join(task.manager.config_base, 'temp')\n log.debug('entry: %s', entry)\n log.debug('temp: %s', ', '.join(os.listdir(tmp_path)))\n entry.fail(\"Downloaded temp file '%s' doesn't exist!?\" % entry['file'])\n continue\n self.add_torrent_file(entry['file'], form_data, config['verify_cert'])\n else:\n self.add_torrent_url(entry['url'], form_data, config['verify_cert'])\n\n @plugin.priority(120)\n def on_task_download(self, task, config):\n \"\"\"\n Call download plugin to generate torrent files to load into\n qBittorrent.\n \"\"\"\n config = self.prepare_config(config)\n if not config['enabled']:\n return\n if 'download' not in task.config:\n download = plugin.get('download', self)\n download.get_temp_files(task, handle_magnets=True, fail_html=config['fail_html'])\n\n @plugin.priority(135)\n def on_task_output(self, task, config):\n \"\"\"Add torrents to qBittorrent at exit.\"\"\"\n if task.accepted:\n config = self.prepare_config(config)\n self.connect(config)\n self.add_entries(task, config)\n\n\n@event('plugin.register')\ndef register_plugin():\n plugin.register(OutputQBitTorrent, 'qbittorrent', api_ver=2)\n", "path": "flexget/plugins/clients/qbittorrent.py"}], "after_files": [{"content": "import logging\nimport os\n\nfrom requests import Session\nfrom requests.exceptions import RequestException\n\nfrom flexget import plugin\nfrom flexget.event import event\nfrom flexget.utils.template import RenderError\n\nlog = logging.getLogger('qbittorrent')\n\n\nclass OutputQBitTorrent:\n \"\"\"\n Example:\n\n qbittorrent:\n username: <USERNAME> (default: (none))\n password: <PASSWORD> (default: (none))\n host: <HOSTNAME> (default: localhost)\n port: <PORT> (default: 8080)\n use_ssl: <SSL> (default: False)\n verify_cert: <VERIFY> (default: True)\n path: <OUTPUT_DIR> (default: (none))\n label: <LABEL> (default: (none))\n maxupspeed: <torrent upload speed limit> (default: 0)\n maxdownspeed: <torrent download speed limit> (default: 0)\n add_paused: <ADD_PAUSED> (default: False)\n \"\"\"\n\n schema = {\n 'anyOf': [\n {'type': 'boolean'},\n {\n 'type': 'object',\n 'properties': {\n 'username': {'type': 'string'},\n 'password': {'type': 'string'},\n 'host': {'type': 'string'},\n 'port': {'type': 'integer'},\n 'use_ssl': {'type': 'boolean'},\n 'verify_cert': {'type': 'boolean'},\n 'path': {'type': 'string'},\n 'label': {'type': 'string'},\n 'maxupspeed': {'type': 'integer'},\n 'maxdownspeed': {'type': 'integer'},\n 'fail_html': {'type': 'boolean'},\n 'add_paused': {'type': 'boolean'},\n },\n 'additionalProperties': False,\n },\n ]\n }\n\n def _request(self, method, url, msg_on_fail=None, **kwargs):\n try:\n response = self.session.request(method, url, **kwargs)\n if response.text == \"Ok.\":\n return response \n else:\n msg = (\n 'Failure. URL: {}, data: {}'.format(url, kwargs)\n if not msg_on_fail\n else msg_on_fail\n )\n except RequestException as e:\n msg = str(e)\n raise plugin.PluginError(\n 'Error when trying to send request to qBittorrent: {}'.format(msg)\n )\n \n def check_api_version(self, msg_on_fail):\n try:\n url = self.url + \"/api/v2/app/webapiVersion\"\n response = self.session.request('get', url)\n if response.status_code != 404:\n self.api_url_login = '/api/v2/auth/login'\n self.api_url_add = '/api/v2/torrents/add'\n return response \n \n url = self.url + \"/version/api\"\n response = self.session.request('get', url)\n if response.status_code != 404:\n self.api_url_login = '/login'\n self.api_url_add = '/command/upload'\n return response \n \n msg = (\n 'Failure. URL: {}'.format(url)\n if not msg_on_fail\n else msg_on_fail\n )\n except RequestException as e:\n msg = str(e)\n raise plugin.PluginError(\n 'Error when trying to send request to qBittorrent: {}'.format(msg)\n )\n\n def connect(self, config):\n \"\"\"\n Connect to qBittorrent Web UI. Username and password not necessary\n if 'Bypass authentication for localhost' is checked and host is\n 'localhost'.\n \"\"\"\n self.session = Session()\n self.url = '{}://{}:{}'.format(\n 'https' if config['use_ssl'] else 'http', config['host'], config['port']\n )\n self.check_api_version('Check API version failed.')\n if config.get('username') and config.get('password'):\n data = {'username': config['username'], 'password': config['password']}\n self._request(\n 'post',\n self.url + self.api_url_login,\n data=data,\n msg_on_fail='Authentication failed.',\n verify=config['verify_cert'],\n )\n log.debug('Successfully connected to qBittorrent')\n self.connected = True\n\n def add_torrent_file(self, file_path, data, verify_cert):\n if not self.connected:\n raise plugin.PluginError('Not connected.')\n multipart_data = {k: (None, v) for k, v in data.items()}\n with open(file_path, 'rb') as f:\n multipart_data['torrents'] = f\n self._request(\n 'post',\n self.url + self.api_url_add,\n msg_on_fail='Failed to add file to qBittorrent',\n files=multipart_data,\n verify=verify_cert,\n )\n log.debug('Added torrent file %s to qBittorrent', file_path)\n\n def add_torrent_url(self, url, data, verify_cert):\n if not self.connected:\n raise plugin.PluginError('Not connected.')\n data['urls'] = url\n multipart_data = {k: (None, v) for k, v in data.items()}\n self._request(\n 'post',\n self.url + self.api_url_add,\n msg_on_fail='Failed to add file to qBittorrent',\n files=multipart_data,\n verify=verify_cert,\n )\n log.debug('Added url %s to qBittorrent', url)\n\n def prepare_config(self, config):\n if isinstance(config, bool):\n config = {'enabled': config}\n config.setdefault('enabled', True)\n config.setdefault('host', 'localhost')\n config.setdefault('port', 8080)\n config.setdefault('use_ssl', False)\n config.setdefault('verify_cert', True)\n config.setdefault('label', '')\n config.setdefault('maxupspeed', 0)\n config.setdefault('maxdownspeed', 0)\n config.setdefault('fail_html', True)\n return config\n\n def add_entries(self, task, config):\n for entry in task.accepted:\n form_data = {}\n try:\n save_path = entry.render(entry.get('path', config.get('path', '')))\n if save_path:\n form_data['savepath'] = save_path\n except RenderError as e:\n log.error('Error setting path for %s: %s', entry['title'], e)\n\n label = entry.get('label', config.get('label'))\n if label:\n form_data['label'] = label # qBittorrent v3.3.3-\n form_data['category'] = label # qBittorrent v3.3.4+\n\n add_paused = entry.get('add_paused', config.get('add_paused'))\n if add_paused:\n form_data['paused'] = 'true'\n\n maxupspeed = entry.get('maxupspeed', config.get('maxupspeed'))\n if maxupspeed:\n form_data['upLimit'] = maxupspeed * 1024\n\n maxdownspeed = entry.get('maxdownspeed', config.get('maxdownspeed'))\n if maxdownspeed:\n form_data['dlLimit'] = maxdownspeed * 1024\n\n is_magnet = entry['url'].startswith('magnet:')\n\n if task.manager.options.test:\n log.info('Test mode.')\n log.info('Would add torrent to qBittorrent with:')\n if not is_magnet:\n log.info('File: %s', entry.get('file'))\n else:\n log.info('Url: %s', entry.get('url'))\n log.info('Save path: %s', form_data.get('savepath'))\n log.info('Label: %s', form_data.get('label'))\n log.info('Paused: %s', form_data.get('paused', 'false'))\n if maxupspeed:\n log.info('Upload Speed Limit: %d', form_data.get('upLimit'))\n if maxdownspeed:\n log.info('Download Speed Limit: %d', form_data.get('dlLimit'))\n continue\n\n if not is_magnet:\n if 'file' not in entry:\n entry.fail('File missing?')\n continue\n if not os.path.exists(entry['file']):\n tmp_path = os.path.join(task.manager.config_base, 'temp')\n log.debug('entry: %s', entry)\n log.debug('temp: %s', ', '.join(os.listdir(tmp_path)))\n entry.fail(\"Downloaded temp file '%s' doesn't exist!?\" % entry['file'])\n continue\n self.add_torrent_file(entry['file'], form_data, config['verify_cert'])\n else:\n self.add_torrent_url(entry['url'], form_data, config['verify_cert'])\n\n @plugin.priority(120)\n def on_task_download(self, task, config):\n \"\"\"\n Call download plugin to generate torrent files to load into\n qBittorrent.\n \"\"\"\n config = self.prepare_config(config)\n if not config['enabled']:\n return\n if 'download' not in task.config:\n download = plugin.get('download', self)\n download.get_temp_files(task, handle_magnets=True, fail_html=config['fail_html'])\n\n @plugin.priority(135)\n def on_task_output(self, task, config):\n \"\"\"Add torrents to qBittorrent at exit.\"\"\"\n if task.accepted:\n config = self.prepare_config(config)\n self.connect(config)\n self.add_entries(task, config)\n\n\n@event('plugin.register')\ndef register_plugin():\n plugin.register(OutputQBitTorrent, 'qbittorrent', api_ver=2)\n", "path": "flexget/plugins/clients/qbittorrent.py"}]} | 2,815 | 737 |
gh_patches_debug_31675 | rasdani/github-patches | git_diff | pyload__pyload-1369 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Uplea plugin out of date
Hi,
any download from uplea.com fails:
pyLoad reports success on downloading but actually only the HTML page giving acces to download is downloaded...
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `module/plugins/hoster/UpleaCom.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 import re
4
5 from urlparse import urljoin
6
7 from module.plugins.internal.XFSHoster import XFSHoster, create_getInfo
8
9
10 class UpleaCom(XFSHoster):
11 __name__ = "UpleaCom"
12 __type__ = "hoster"
13 __version__ = "0.06"
14
15 __pattern__ = r'https?://(?:www\.)?uplea\.com/dl/\w{15}'
16
17 __description__ = """Uplea.com hoster plugin"""
18 __license__ = "GPLv3"
19 __authors__ = [("Redleon", None)]
20
21
22 NAME_PATTERN = r'class="agmd size18">(?P<N>.+?)<'
23 SIZE_PATTERN = r'size14">(?P<S>[\d.,]+) (?P<U>[\w^_])</span>'
24
25 OFFLINE_PATTERN = r'>You followed an invalid or expired link'
26
27 LINK_PATTERN = r'"(http?://\w+\.uplea\.com/anonym/.*?)"'
28
29 WAIT_PATTERN = r'timeText:([\d.]+),'
30 STEP_PATTERN = r'<a href="(/step/.+)">'
31
32
33 def setup(self):
34 self.multiDL = False
35 self.chunkLimit = 1
36 self.resumeDownload = True
37
38
39 def handleFree(self, pyfile):
40 m = re.search(self.STEP_PATTERN, self.html)
41 if m is None:
42 self.error(_("STEP_PATTERN not found"))
43
44 self.html = self.load(urljoin("http://uplea.com/", m.group(1)))
45
46 m = re.search(self.WAIT_PATTERN, self.html)
47 if m:
48 self.wait(m.group(1), True)
49 self.retry()
50
51 m = re.search(self.LINK_PATTERN, self.html)
52 if m is None:
53 self.error(_("LINK_PATTERN not found"))
54
55 self.link = m.group(1)
56 self.wait(15)
57
58
59 getInfo = create_getInfo(UpleaCom)
60
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/module/plugins/hoster/UpleaCom.py b/module/plugins/hoster/UpleaCom.py
--- a/module/plugins/hoster/UpleaCom.py
+++ b/module/plugins/hoster/UpleaCom.py
@@ -10,23 +10,26 @@
class UpleaCom(XFSHoster):
__name__ = "UpleaCom"
__type__ = "hoster"
- __version__ = "0.06"
+ __version__ = "0.07"
__pattern__ = r'https?://(?:www\.)?uplea\.com/dl/\w{15}'
__description__ = """Uplea.com hoster plugin"""
__license__ = "GPLv3"
- __authors__ = [("Redleon", None)]
+ __authors__ = [("Redleon", None),
+ ("GammaC0de", None)]
NAME_PATTERN = r'class="agmd size18">(?P<N>.+?)<'
- SIZE_PATTERN = r'size14">(?P<S>[\d.,]+) (?P<U>[\w^_])</span>'
+ SIZE_PATTERN = r'size14">(?P<S>[\d.,]+) (?P<U>[\w^_]+?)</span>'
+ SIZE_REPLACEMENTS = [('Ko','KB'), ('Mo','MB'), ('Go','GB')]
OFFLINE_PATTERN = r'>You followed an invalid or expired link'
+ PREMIUM_PATTERN = r'You need to have a Premium subscription to download this file'
- LINK_PATTERN = r'"(http?://\w+\.uplea\.com/anonym/.*?)"'
+ LINK_PATTERN = r'"(https?://\w+\.uplea\.com/anonym/.*?)"'
- WAIT_PATTERN = r'timeText:([\d.]+),'
+ WAIT_PATTERN = r'timeText: ?([\d.]+),'
STEP_PATTERN = r'<a href="(/step/.+)">'
@@ -45,9 +48,14 @@
m = re.search(self.WAIT_PATTERN, self.html)
if m:
+ self.logDebug(_("Waiting %s seconds") % m.group(1))
self.wait(m.group(1), True)
self.retry()
+ m = re.search(self.PREMIUM_PATTERN, self.html)
+ if m:
+ self.error(_("This URL requires a premium account"))
+
m = re.search(self.LINK_PATTERN, self.html)
if m is None:
self.error(_("LINK_PATTERN not found"))
| {"golden_diff": "diff --git a/module/plugins/hoster/UpleaCom.py b/module/plugins/hoster/UpleaCom.py\n--- a/module/plugins/hoster/UpleaCom.py\n+++ b/module/plugins/hoster/UpleaCom.py\n@@ -10,23 +10,26 @@\n class UpleaCom(XFSHoster):\n __name__ = \"UpleaCom\"\n __type__ = \"hoster\"\n- __version__ = \"0.06\"\n+ __version__ = \"0.07\"\n \n __pattern__ = r'https?://(?:www\\.)?uplea\\.com/dl/\\w{15}'\n \n __description__ = \"\"\"Uplea.com hoster plugin\"\"\"\n __license__ = \"GPLv3\"\n- __authors__ = [(\"Redleon\", None)]\n+ __authors__ = [(\"Redleon\", None),\n+ (\"GammaC0de\", None)]\n \n \n NAME_PATTERN = r'class=\"agmd size18\">(?P<N>.+?)<'\n- SIZE_PATTERN = r'size14\">(?P<S>[\\d.,]+) (?P<U>[\\w^_])</span>'\n+ SIZE_PATTERN = r'size14\">(?P<S>[\\d.,]+) (?P<U>[\\w^_]+?)</span>'\n+ SIZE_REPLACEMENTS = [('Ko','KB'), ('Mo','MB'), ('Go','GB')]\n \n OFFLINE_PATTERN = r'>You followed an invalid or expired link'\n+ PREMIUM_PATTERN = r'You need to have a Premium subscription to download this file'\n \n- LINK_PATTERN = r'\"(http?://\\w+\\.uplea\\.com/anonym/.*?)\"'\n+ LINK_PATTERN = r'\"(https?://\\w+\\.uplea\\.com/anonym/.*?)\"'\n \n- WAIT_PATTERN = r'timeText:([\\d.]+),'\n+ WAIT_PATTERN = r'timeText: ?([\\d.]+),'\n STEP_PATTERN = r'<a href=\"(/step/.+)\">'\n \n \n@@ -45,9 +48,14 @@\n \n m = re.search(self.WAIT_PATTERN, self.html)\n if m:\n+ self.logDebug(_(\"Waiting %s seconds\") % m.group(1))\n self.wait(m.group(1), True)\n self.retry()\n \n+ m = re.search(self.PREMIUM_PATTERN, self.html)\n+ if m:\n+ self.error(_(\"This URL requires a premium account\"))\n+\n m = re.search(self.LINK_PATTERN, self.html)\n if m is None:\n self.error(_(\"LINK_PATTERN not found\"))\n", "issue": "Uplea plugin out of date\nHi,\nany download from uplea.com fails:\npyLoad reports success on downloading but actually only the HTML page giving acces to download is downloaded...\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\nimport re\n\nfrom urlparse import urljoin\n\nfrom module.plugins.internal.XFSHoster import XFSHoster, create_getInfo\n\n\nclass UpleaCom(XFSHoster):\n __name__ = \"UpleaCom\"\n __type__ = \"hoster\"\n __version__ = \"0.06\"\n\n __pattern__ = r'https?://(?:www\\.)?uplea\\.com/dl/\\w{15}'\n\n __description__ = \"\"\"Uplea.com hoster plugin\"\"\"\n __license__ = \"GPLv3\"\n __authors__ = [(\"Redleon\", None)]\n\n\n NAME_PATTERN = r'class=\"agmd size18\">(?P<N>.+?)<'\n SIZE_PATTERN = r'size14\">(?P<S>[\\d.,]+) (?P<U>[\\w^_])</span>'\n\n OFFLINE_PATTERN = r'>You followed an invalid or expired link'\n\n LINK_PATTERN = r'\"(http?://\\w+\\.uplea\\.com/anonym/.*?)\"'\n\n WAIT_PATTERN = r'timeText:([\\d.]+),'\n STEP_PATTERN = r'<a href=\"(/step/.+)\">'\n\n\n def setup(self):\n self.multiDL = False\n self.chunkLimit = 1\n self.resumeDownload = True\n\n\n def handleFree(self, pyfile):\n m = re.search(self.STEP_PATTERN, self.html)\n if m is None:\n self.error(_(\"STEP_PATTERN not found\"))\n\n self.html = self.load(urljoin(\"http://uplea.com/\", m.group(1)))\n\n m = re.search(self.WAIT_PATTERN, self.html)\n if m:\n self.wait(m.group(1), True)\n self.retry()\n\n m = re.search(self.LINK_PATTERN, self.html)\n if m is None:\n self.error(_(\"LINK_PATTERN not found\"))\n\n self.link = m.group(1)\n self.wait(15)\n\n\ngetInfo = create_getInfo(UpleaCom)\n", "path": "module/plugins/hoster/UpleaCom.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\nimport re\n\nfrom urlparse import urljoin\n\nfrom module.plugins.internal.XFSHoster import XFSHoster, create_getInfo\n\n\nclass UpleaCom(XFSHoster):\n __name__ = \"UpleaCom\"\n __type__ = \"hoster\"\n __version__ = \"0.07\"\n\n __pattern__ = r'https?://(?:www\\.)?uplea\\.com/dl/\\w{15}'\n\n __description__ = \"\"\"Uplea.com hoster plugin\"\"\"\n __license__ = \"GPLv3\"\n __authors__ = [(\"Redleon\", None),\n (\"GammaC0de\", None)]\n\n\n NAME_PATTERN = r'class=\"agmd size18\">(?P<N>.+?)<'\n SIZE_PATTERN = r'size14\">(?P<S>[\\d.,]+) (?P<U>[\\w^_]+?)</span>'\n SIZE_REPLACEMENTS = [('Ko','KB'), ('Mo','MB'), ('Go','GB')]\n\n OFFLINE_PATTERN = r'>You followed an invalid or expired link'\n PREMIUM_PATTERN = r'You need to have a Premium subscription to download this file'\n\n LINK_PATTERN = r'\"(https?://\\w+\\.uplea\\.com/anonym/.*?)\"'\n\n WAIT_PATTERN = r'timeText: ?([\\d.]+),'\n STEP_PATTERN = r'<a href=\"(/step/.+)\">'\n\n\n def setup(self):\n self.multiDL = False\n self.chunkLimit = 1\n self.resumeDownload = True\n\n\n def handleFree(self, pyfile):\n m = re.search(self.STEP_PATTERN, self.html)\n if m is None:\n self.error(_(\"STEP_PATTERN not found\"))\n\n self.html = self.load(urljoin(\"http://uplea.com/\", m.group(1)))\n\n m = re.search(self.WAIT_PATTERN, self.html)\n if m:\n self.logDebug(_(\"Waiting %s seconds\") % m.group(1))\n self.wait(m.group(1), True)\n self.retry()\n\n m = re.search(self.PREMIUM_PATTERN, self.html)\n if m:\n self.error(_(\"This URL requires a premium account\"))\n\n m = re.search(self.LINK_PATTERN, self.html)\n if m is None:\n self.error(_(\"LINK_PATTERN not found\"))\n\n self.link = m.group(1)\n self.wait(15)\n\n\ngetInfo = create_getInfo(UpleaCom)\n", "path": "module/plugins/hoster/UpleaCom.py"}]} | 864 | 582 |
gh_patches_debug_30586 | rasdani/github-patches | git_diff | superduper-io__superduper-1947 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[MISC] The cosine method is slow to compute.
The Cosine method is slow in computation because it involves data transformation for the vector matrix every time, which results in significant time consumption.
```python
def cosine(x, y):
'''
Cosine similarity function for vector search
'''
x = x.astype(float)
y = y.astype(float)
x = x / numpy.linalg.norm(x, axis=1)[:, None]
y = y / numpy.linalg.norm(y, axis=1)[:, None]
return dot(x, y)
```
we need to preprocess all the incoming matrices of cosine.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `superduperdb/vector_search/in_memory.py`
Content:
```
1 import typing as t
2
3 import numpy
4
5 from superduperdb import logging
6 from superduperdb.vector_search.base import BaseVectorSearcher, VectorItem, measures
7
8
9 class InMemoryVectorSearcher(BaseVectorSearcher):
10 """
11 Simple hash-set for looking up with vector similarity.
12
13 :param identifier: Unique string identifier of index
14 :param h: array/ tensor of vectors
15 :param index: list of IDs
16 :param measure: measure to assess similarity
17 """
18
19 name = 'vanilla'
20
21 def __init__(
22 self,
23 identifier: str,
24 dimensions: int,
25 h: t.Optional[numpy.ndarray] = None,
26 index: t.Optional[t.List[str]] = None,
27 measure: t.Union[str, t.Callable] = 'cosine',
28 ):
29 self.identifier = identifier
30 self.dimensions = dimensions
31 self._cache: t.Sequence[VectorItem] = []
32 self._CACHE_SIZE = 10000
33
34 if h is not None:
35 assert index is not None
36 self._setup(h, index)
37 else:
38 self.h = None
39 self.index = None
40 self.lookup = None
41
42 self.measure = measure
43 if isinstance(measure, str):
44 self.measure = measures[measure]
45
46 self.identifier = identifier
47
48 def __len__(self):
49 return self.h.shape[0]
50
51 def _setup(self, h, index):
52 self.h = numpy.array(h) if not isinstance(h, numpy.ndarray) else h
53 self.index = index
54 self.lookup = dict(zip(index, range(len(index))))
55
56 def find_nearest_from_id(self, _id, n=100):
57 self.post_create()
58 return self.find_nearest_from_array(self.h[self.lookup[_id]], n=n)
59
60 def find_nearest_from_array(self, h, n=100, within_ids=None):
61 self.post_create()
62 h = self.to_numpy(h)[None, :]
63 if within_ids:
64 ix = list(map(self.lookup.__getitem__, within_ids))
65 similarities = self.measure(h, self.h[ix, :]) # mypy: ignore
66 else:
67 similarities = self.measure(h, self.h) # mypy: ignore
68 similarities = similarities[0, :]
69 logging.debug(similarities)
70 scores = -numpy.sort(-similarities)
71 ## different ways of handling
72 if within_ids:
73 top_n_idxs = numpy.argsort(-similarities)[:n]
74 ix = [ix[i] for i in top_n_idxs]
75 else:
76 ix = numpy.argsort(-similarities)[:n]
77 ix = ix.tolist()
78 scores = scores.tolist()
79 _ids = [self.index[i] for i in ix]
80 return _ids, scores
81
82 def add(self, items: t.Sequence[VectorItem]) -> None:
83 if len(self._cache) < self._CACHE_SIZE:
84 for item in items:
85 self._cache.append(item)
86 else:
87 self._add(self._cache)
88 self._cache = []
89
90 def post_create(self):
91 if self._cache:
92 self._add(self._cache)
93 self._cache = []
94
95 def _add(self, items: t.Sequence[VectorItem]) -> None:
96 index = [item.id for item in items]
97 h = numpy.stack([item.vector for item in items])
98
99 if self.h is not None:
100 old_not_in_new = list(set(self.index) - set(index))
101 ix_old = [self.lookup[_id] for _id in old_not_in_new]
102 h = numpy.concatenate((self.h[ix_old], h), axis=0)
103 index = [self.index[i] for i in ix_old] + index
104
105 return self._setup(h, index)
106
107 def delete(self, ids):
108 self.post_create()
109 ix = list(map(self.lookup.__getitem__, ids))
110 h = numpy.delete(self.h, ix, axis=0)
111 index = [_id for _id in self.index if _id not in set(ids)]
112 self._setup(h, index)
113
```
Path: `superduperdb/vector_search/base.py`
Content:
```
1 from __future__ import annotations
2
3 import enum
4 import typing as t
5 from abc import ABC, abstractmethod
6 from dataclasses import dataclass, field
7
8 import numpy
9 import numpy.typing
10
11 if t.TYPE_CHECKING:
12 from superduperdb.components.vector_index import VectorIndex
13
14
15 class BaseVectorSearcher(ABC):
16 @classmethod
17 def from_component(cls, vi: 'VectorIndex'):
18 return cls(
19 identifier=vi.identifier, dimensions=vi.dimensions, measure=vi.measure
20 )
21
22 @abstractmethod
23 def __init__(
24 self,
25 identifier: str,
26 dimensions: int,
27 h: t.Optional[numpy.ndarray] = None,
28 index: t.Optional[t.List[str]] = None,
29 measure: t.Optional[str] = None,
30 ):
31 pass
32
33 @abstractmethod
34 def __len__(self):
35 pass
36
37 @staticmethod
38 def to_numpy(h):
39 if isinstance(h, numpy.ndarray):
40 return h
41 if hasattr(h, 'numpy'):
42 return h.numpy()
43 if isinstance(h, list):
44 return numpy.array(h)
45 raise ValueError(str(h))
46
47 @staticmethod
48 def to_list(h):
49 if hasattr(h, 'tolist'):
50 return h.tolist()
51 if isinstance(h, list):
52 return h
53 raise ValueError(str(h))
54
55 @abstractmethod
56 def add(self, items: t.Sequence[VectorItem]) -> None:
57 """
58 Add items to the index.
59
60 :param items: t.Sequence of VectorItems
61 """
62
63 @abstractmethod
64 def delete(self, ids: t.Sequence[str]) -> None:
65 """
66 Remove items from the index
67
68 :param ids: t.Sequence of ids of vectors.
69 """
70
71 @abstractmethod
72 def find_nearest_from_id(
73 self,
74 _id,
75 n: int = 100,
76 within_ids: t.Sequence[str] = (),
77 ) -> t.Tuple[t.List[str], t.List[float]]:
78 """
79 Find the nearest vectors to the vector with the given id.
80
81 :param _id: id of the vector
82 :param n: number of nearest vectors to return
83 """
84
85 @abstractmethod
86 def find_nearest_from_array(
87 self,
88 h: numpy.typing.ArrayLike,
89 n: int = 100,
90 within_ids: t.Sequence[str] = (),
91 ) -> t.Tuple[t.List[str], t.List[float]]:
92 """
93 Find the nearest vectors to the given vector.
94
95 :param h: vector
96 :param n: number of nearest vectors to return
97 """
98
99 def post_create(self):
100 """
101 This method is used for searchers which requires
102 to perform a task after all vectors have been added
103 """
104
105
106 class VectorIndexMeasureType(str, enum.Enum):
107 cosine = 'cosine'
108 css = 'css'
109 dot = 'dot'
110 l2 = 'l2'
111
112
113 @dataclass(frozen=True)
114 class VectorSearchConfig:
115 '''
116 Represents search config which helps initiate a vector
117 searcher class.
118 '''
119
120 id: str
121 dimensions: int
122 measure: VectorIndexMeasureType = VectorIndexMeasureType.l2
123 parameters: t.Mapping[str, t.Any] = field(default_factory=dict)
124
125
126 @dataclass(frozen=True)
127 class VectorItem:
128 '''
129 Class for representing a vector in vector search with
130 id and vector.
131
132 '''
133
134 id: str
135 vector: numpy.ndarray
136
137 @classmethod
138 def create(
139 cls,
140 *,
141 id: str,
142 vector: numpy.typing.ArrayLike,
143 ) -> VectorItem:
144 return VectorItem(id=id, vector=BaseVectorSearcher.to_numpy(vector))
145
146 def to_dict(self) -> t.Dict:
147 return {'id': self.id, 'vector': self.vector}
148
149
150 @dataclass(frozen=True)
151 class VectorSearchResult:
152 '''
153 Dataclass for representing vector search results with
154 `id` and `score`.
155 '''
156
157 id: str
158 score: float
159
160
161 def l2(x, y):
162 '''
163 L2 function for vector similarity search
164 '''
165 return numpy.array([-numpy.linalg.norm(x - y, axis=1)])
166
167
168 def dot(x, y):
169 '''
170 Dot function for vector similarity search
171 '''
172 return numpy.dot(x, y.T)
173
174
175 def cosine(x, y):
176 '''
177 Cosine similarity function for vector search
178 '''
179 x = x.astype(float)
180 y = y.astype(float)
181 x = x / numpy.linalg.norm(x, axis=1)[:, None]
182 y = y / numpy.linalg.norm(y, axis=1)[:, None]
183 return dot(x, y)
184
185
186 measures = {'cosine': cosine, 'dot': dot, 'l2': l2}
187
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/superduperdb/vector_search/base.py b/superduperdb/vector_search/base.py
--- a/superduperdb/vector_search/base.py
+++ b/superduperdb/vector_search/base.py
@@ -176,10 +176,9 @@
'''
Cosine similarity function for vector search
'''
- x = x.astype(float)
- y = y.astype(float)
x = x / numpy.linalg.norm(x, axis=1)[:, None]
- y = y / numpy.linalg.norm(y, axis=1)[:, None]
+ # y which implies all vectors in vectordatabase
+ # has normalized vectors.
return dot(x, y)
diff --git a/superduperdb/vector_search/in_memory.py b/superduperdb/vector_search/in_memory.py
--- a/superduperdb/vector_search/in_memory.py
+++ b/superduperdb/vector_search/in_memory.py
@@ -31,6 +31,10 @@
self._cache: t.Sequence[VectorItem] = []
self._CACHE_SIZE = 10000
+ self.measure = measure
+ if isinstance(measure, str):
+ self.measure = measures[measure]
+
if h is not None:
assert index is not None
self._setup(h, index)
@@ -39,17 +43,19 @@
self.index = None
self.lookup = None
- self.measure = measure
- if isinstance(measure, str):
- self.measure = measures[measure]
-
self.identifier = identifier
def __len__(self):
return self.h.shape[0]
def _setup(self, h, index):
- self.h = numpy.array(h) if not isinstance(h, numpy.ndarray) else h
+ h = numpy.array(h) if not isinstance(h, numpy.ndarray) else h
+
+ if self.measure == 'cosine':
+ # Normalization is required for cosine, hence preparing
+ # all vectors in advance.
+ h = h / numpy.linalg.norm(h, axis=1)[:, None]
+ self.h = h
self.index = index
self.lookup = dict(zip(index, range(len(index))))
| {"golden_diff": "diff --git a/superduperdb/vector_search/base.py b/superduperdb/vector_search/base.py\n--- a/superduperdb/vector_search/base.py\n+++ b/superduperdb/vector_search/base.py\n@@ -176,10 +176,9 @@\n '''\n Cosine similarity function for vector search\n '''\n- x = x.astype(float)\n- y = y.astype(float)\n x = x / numpy.linalg.norm(x, axis=1)[:, None]\n- y = y / numpy.linalg.norm(y, axis=1)[:, None]\n+ # y which implies all vectors in vectordatabase\n+ # has normalized vectors.\n return dot(x, y)\n \n \ndiff --git a/superduperdb/vector_search/in_memory.py b/superduperdb/vector_search/in_memory.py\n--- a/superduperdb/vector_search/in_memory.py\n+++ b/superduperdb/vector_search/in_memory.py\n@@ -31,6 +31,10 @@\n self._cache: t.Sequence[VectorItem] = []\n self._CACHE_SIZE = 10000\n \n+ self.measure = measure\n+ if isinstance(measure, str):\n+ self.measure = measures[measure]\n+\n if h is not None:\n assert index is not None\n self._setup(h, index)\n@@ -39,17 +43,19 @@\n self.index = None\n self.lookup = None\n \n- self.measure = measure\n- if isinstance(measure, str):\n- self.measure = measures[measure]\n-\n self.identifier = identifier\n \n def __len__(self):\n return self.h.shape[0]\n \n def _setup(self, h, index):\n- self.h = numpy.array(h) if not isinstance(h, numpy.ndarray) else h\n+ h = numpy.array(h) if not isinstance(h, numpy.ndarray) else h\n+\n+ if self.measure == 'cosine':\n+ # Normalization is required for cosine, hence preparing\n+ # all vectors in advance.\n+ h = h / numpy.linalg.norm(h, axis=1)[:, None]\n+ self.h = h\n self.index = index\n self.lookup = dict(zip(index, range(len(index))))\n", "issue": "[MISC] The cosine method is slow to compute.\nThe Cosine method is slow in computation because it involves data transformation for the vector matrix every time, which results in significant time consumption.\r\n\r\n```python\r\ndef cosine(x, y):\r\n '''\r\n Cosine similarity function for vector search\r\n '''\r\n x = x.astype(float)\r\n y = y.astype(float)\r\n x = x / numpy.linalg.norm(x, axis=1)[:, None]\r\n y = y / numpy.linalg.norm(y, axis=1)[:, None]\r\n return dot(x, y)\r\n\r\n```\r\n\r\n\r\nwe need to preprocess all the incoming matrices of cosine.\n", "before_files": [{"content": "import typing as t\n\nimport numpy\n\nfrom superduperdb import logging\nfrom superduperdb.vector_search.base import BaseVectorSearcher, VectorItem, measures\n\n\nclass InMemoryVectorSearcher(BaseVectorSearcher):\n \"\"\"\n Simple hash-set for looking up with vector similarity.\n\n :param identifier: Unique string identifier of index\n :param h: array/ tensor of vectors\n :param index: list of IDs\n :param measure: measure to assess similarity\n \"\"\"\n\n name = 'vanilla'\n\n def __init__(\n self,\n identifier: str,\n dimensions: int,\n h: t.Optional[numpy.ndarray] = None,\n index: t.Optional[t.List[str]] = None,\n measure: t.Union[str, t.Callable] = 'cosine',\n ):\n self.identifier = identifier\n self.dimensions = dimensions\n self._cache: t.Sequence[VectorItem] = []\n self._CACHE_SIZE = 10000\n\n if h is not None:\n assert index is not None\n self._setup(h, index)\n else:\n self.h = None\n self.index = None\n self.lookup = None\n\n self.measure = measure\n if isinstance(measure, str):\n self.measure = measures[measure]\n\n self.identifier = identifier\n\n def __len__(self):\n return self.h.shape[0]\n\n def _setup(self, h, index):\n self.h = numpy.array(h) if not isinstance(h, numpy.ndarray) else h\n self.index = index\n self.lookup = dict(zip(index, range(len(index))))\n\n def find_nearest_from_id(self, _id, n=100):\n self.post_create()\n return self.find_nearest_from_array(self.h[self.lookup[_id]], n=n)\n\n def find_nearest_from_array(self, h, n=100, within_ids=None):\n self.post_create()\n h = self.to_numpy(h)[None, :]\n if within_ids:\n ix = list(map(self.lookup.__getitem__, within_ids))\n similarities = self.measure(h, self.h[ix, :]) # mypy: ignore\n else:\n similarities = self.measure(h, self.h) # mypy: ignore\n similarities = similarities[0, :]\n logging.debug(similarities)\n scores = -numpy.sort(-similarities)\n ## different ways of handling\n if within_ids:\n top_n_idxs = numpy.argsort(-similarities)[:n]\n ix = [ix[i] for i in top_n_idxs]\n else:\n ix = numpy.argsort(-similarities)[:n]\n ix = ix.tolist()\n scores = scores.tolist()\n _ids = [self.index[i] for i in ix]\n return _ids, scores\n\n def add(self, items: t.Sequence[VectorItem]) -> None:\n if len(self._cache) < self._CACHE_SIZE:\n for item in items:\n self._cache.append(item)\n else:\n self._add(self._cache)\n self._cache = []\n\n def post_create(self):\n if self._cache:\n self._add(self._cache)\n self._cache = []\n\n def _add(self, items: t.Sequence[VectorItem]) -> None:\n index = [item.id for item in items]\n h = numpy.stack([item.vector for item in items])\n\n if self.h is not None:\n old_not_in_new = list(set(self.index) - set(index))\n ix_old = [self.lookup[_id] for _id in old_not_in_new]\n h = numpy.concatenate((self.h[ix_old], h), axis=0)\n index = [self.index[i] for i in ix_old] + index\n\n return self._setup(h, index)\n\n def delete(self, ids):\n self.post_create()\n ix = list(map(self.lookup.__getitem__, ids))\n h = numpy.delete(self.h, ix, axis=0)\n index = [_id for _id in self.index if _id not in set(ids)]\n self._setup(h, index)\n", "path": "superduperdb/vector_search/in_memory.py"}, {"content": "from __future__ import annotations\n\nimport enum\nimport typing as t\nfrom abc import ABC, abstractmethod\nfrom dataclasses import dataclass, field\n\nimport numpy\nimport numpy.typing\n\nif t.TYPE_CHECKING:\n from superduperdb.components.vector_index import VectorIndex\n\n\nclass BaseVectorSearcher(ABC):\n @classmethod\n def from_component(cls, vi: 'VectorIndex'):\n return cls(\n identifier=vi.identifier, dimensions=vi.dimensions, measure=vi.measure\n )\n\n @abstractmethod\n def __init__(\n self,\n identifier: str,\n dimensions: int,\n h: t.Optional[numpy.ndarray] = None,\n index: t.Optional[t.List[str]] = None,\n measure: t.Optional[str] = None,\n ):\n pass\n\n @abstractmethod\n def __len__(self):\n pass\n\n @staticmethod\n def to_numpy(h):\n if isinstance(h, numpy.ndarray):\n return h\n if hasattr(h, 'numpy'):\n return h.numpy()\n if isinstance(h, list):\n return numpy.array(h)\n raise ValueError(str(h))\n\n @staticmethod\n def to_list(h):\n if hasattr(h, 'tolist'):\n return h.tolist()\n if isinstance(h, list):\n return h\n raise ValueError(str(h))\n\n @abstractmethod\n def add(self, items: t.Sequence[VectorItem]) -> None:\n \"\"\"\n Add items to the index.\n\n :param items: t.Sequence of VectorItems\n \"\"\"\n\n @abstractmethod\n def delete(self, ids: t.Sequence[str]) -> None:\n \"\"\"\n Remove items from the index\n\n :param ids: t.Sequence of ids of vectors.\n \"\"\"\n\n @abstractmethod\n def find_nearest_from_id(\n self,\n _id,\n n: int = 100,\n within_ids: t.Sequence[str] = (),\n ) -> t.Tuple[t.List[str], t.List[float]]:\n \"\"\"\n Find the nearest vectors to the vector with the given id.\n\n :param _id: id of the vector\n :param n: number of nearest vectors to return\n \"\"\"\n\n @abstractmethod\n def find_nearest_from_array(\n self,\n h: numpy.typing.ArrayLike,\n n: int = 100,\n within_ids: t.Sequence[str] = (),\n ) -> t.Tuple[t.List[str], t.List[float]]:\n \"\"\"\n Find the nearest vectors to the given vector.\n\n :param h: vector\n :param n: number of nearest vectors to return\n \"\"\"\n\n def post_create(self):\n \"\"\"\n This method is used for searchers which requires\n to perform a task after all vectors have been added\n \"\"\"\n\n\nclass VectorIndexMeasureType(str, enum.Enum):\n cosine = 'cosine'\n css = 'css'\n dot = 'dot'\n l2 = 'l2'\n\n\n@dataclass(frozen=True)\nclass VectorSearchConfig:\n '''\n Represents search config which helps initiate a vector\n searcher class.\n '''\n\n id: str\n dimensions: int\n measure: VectorIndexMeasureType = VectorIndexMeasureType.l2\n parameters: t.Mapping[str, t.Any] = field(default_factory=dict)\n\n\n@dataclass(frozen=True)\nclass VectorItem:\n '''\n Class for representing a vector in vector search with\n id and vector.\n\n '''\n\n id: str\n vector: numpy.ndarray\n\n @classmethod\n def create(\n cls,\n *,\n id: str,\n vector: numpy.typing.ArrayLike,\n ) -> VectorItem:\n return VectorItem(id=id, vector=BaseVectorSearcher.to_numpy(vector))\n\n def to_dict(self) -> t.Dict:\n return {'id': self.id, 'vector': self.vector}\n\n\n@dataclass(frozen=True)\nclass VectorSearchResult:\n '''\n Dataclass for representing vector search results with\n `id` and `score`.\n '''\n\n id: str\n score: float\n\n\ndef l2(x, y):\n '''\n L2 function for vector similarity search\n '''\n return numpy.array([-numpy.linalg.norm(x - y, axis=1)])\n\n\ndef dot(x, y):\n '''\n Dot function for vector similarity search\n '''\n return numpy.dot(x, y.T)\n\n\ndef cosine(x, y):\n '''\n Cosine similarity function for vector search\n '''\n x = x.astype(float)\n y = y.astype(float)\n x = x / numpy.linalg.norm(x, axis=1)[:, None]\n y = y / numpy.linalg.norm(y, axis=1)[:, None]\n return dot(x, y)\n\n\nmeasures = {'cosine': cosine, 'dot': dot, 'l2': l2}\n", "path": "superduperdb/vector_search/base.py"}], "after_files": [{"content": "import typing as t\n\nimport numpy\n\nfrom superduperdb import logging\nfrom superduperdb.vector_search.base import BaseVectorSearcher, VectorItem, measures\n\n\nclass InMemoryVectorSearcher(BaseVectorSearcher):\n \"\"\"\n Simple hash-set for looking up with vector similarity.\n\n :param identifier: Unique string identifier of index\n :param h: array/ tensor of vectors\n :param index: list of IDs\n :param measure: measure to assess similarity\n \"\"\"\n\n name = 'vanilla'\n\n def __init__(\n self,\n identifier: str,\n dimensions: int,\n h: t.Optional[numpy.ndarray] = None,\n index: t.Optional[t.List[str]] = None,\n measure: t.Union[str, t.Callable] = 'cosine',\n ):\n self.identifier = identifier\n self.dimensions = dimensions\n self._cache: t.Sequence[VectorItem] = []\n self._CACHE_SIZE = 10000\n\n self.measure = measure\n if isinstance(measure, str):\n self.measure = measures[measure]\n\n if h is not None:\n assert index is not None\n self._setup(h, index)\n else:\n self.h = None\n self.index = None\n self.lookup = None\n\n self.identifier = identifier\n\n def __len__(self):\n return self.h.shape[0]\n\n def _setup(self, h, index):\n h = numpy.array(h) if not isinstance(h, numpy.ndarray) else h\n\n if self.measure == 'cosine':\n # Normalization is required for cosine, hence preparing\n # all vectors in advance.\n h = h / numpy.linalg.norm(h, axis=1)[:, None]\n self.h = h\n self.index = index\n self.lookup = dict(zip(index, range(len(index))))\n\n def find_nearest_from_id(self, _id, n=100):\n self.post_create()\n return self.find_nearest_from_array(self.h[self.lookup[_id]], n=n)\n\n def find_nearest_from_array(self, h, n=100, within_ids=None):\n self.post_create()\n h = self.to_numpy(h)[None, :]\n if within_ids:\n ix = list(map(self.lookup.__getitem__, within_ids))\n similarities = self.measure(h, self.h[ix, :]) # mypy: ignore\n else:\n similarities = self.measure(h, self.h) # mypy: ignore\n similarities = similarities[0, :]\n logging.debug(similarities)\n scores = -numpy.sort(-similarities)\n ## different ways of handling\n if within_ids:\n top_n_idxs = numpy.argsort(-similarities)[:n]\n ix = [ix[i] for i in top_n_idxs]\n else:\n ix = numpy.argsort(-similarities)[:n]\n ix = ix.tolist()\n scores = scores.tolist()\n _ids = [self.index[i] for i in ix]\n return _ids, scores\n\n def add(self, items: t.Sequence[VectorItem]) -> None:\n if len(self._cache) < self._CACHE_SIZE:\n for item in items:\n self._cache.append(item)\n else:\n self._add(self._cache)\n self._cache = []\n\n def post_create(self):\n if self._cache:\n self._add(self._cache)\n self._cache = []\n\n def _add(self, items: t.Sequence[VectorItem]) -> None:\n index = [item.id for item in items]\n h = numpy.stack([item.vector for item in items])\n\n if self.h is not None:\n old_not_in_new = list(set(self.index) - set(index))\n ix_old = [self.lookup[_id] for _id in old_not_in_new]\n h = numpy.concatenate((self.h[ix_old], h), axis=0)\n index = [self.index[i] for i in ix_old] + index\n\n return self._setup(h, index)\n\n def delete(self, ids):\n self.post_create()\n ix = list(map(self.lookup.__getitem__, ids))\n h = numpy.delete(self.h, ix, axis=0)\n index = [_id for _id in self.index if _id not in set(ids)]\n self._setup(h, index)\n", "path": "superduperdb/vector_search/in_memory.py"}, {"content": "from __future__ import annotations\n\nimport enum\nimport typing as t\nfrom abc import ABC, abstractmethod\nfrom dataclasses import dataclass, field\n\nimport numpy\nimport numpy.typing\n\nif t.TYPE_CHECKING:\n from superduperdb.components.vector_index import VectorIndex\n\n\nclass BaseVectorSearcher(ABC):\n @classmethod\n def from_component(cls, vi: 'VectorIndex'):\n return cls(\n identifier=vi.identifier, dimensions=vi.dimensions, measure=vi.measure\n )\n\n @abstractmethod\n def __init__(\n self,\n identifier: str,\n dimensions: int,\n h: t.Optional[numpy.ndarray] = None,\n index: t.Optional[t.List[str]] = None,\n measure: t.Optional[str] = None,\n ):\n pass\n\n @abstractmethod\n def __len__(self):\n pass\n\n @staticmethod\n def to_numpy(h):\n if isinstance(h, numpy.ndarray):\n return h\n if hasattr(h, 'numpy'):\n return h.numpy()\n if isinstance(h, list):\n return numpy.array(h)\n raise ValueError(str(h))\n\n @staticmethod\n def to_list(h):\n if hasattr(h, 'tolist'):\n return h.tolist()\n if isinstance(h, list):\n return h\n raise ValueError(str(h))\n\n @abstractmethod\n def add(self, items: t.Sequence[VectorItem]) -> None:\n \"\"\"\n Add items to the index.\n\n :param items: t.Sequence of VectorItems\n \"\"\"\n\n @abstractmethod\n def delete(self, ids: t.Sequence[str]) -> None:\n \"\"\"\n Remove items from the index\n\n :param ids: t.Sequence of ids of vectors.\n \"\"\"\n\n @abstractmethod\n def find_nearest_from_id(\n self,\n _id,\n n: int = 100,\n within_ids: t.Sequence[str] = (),\n ) -> t.Tuple[t.List[str], t.List[float]]:\n \"\"\"\n Find the nearest vectors to the vector with the given id.\n\n :param _id: id of the vector\n :param n: number of nearest vectors to return\n \"\"\"\n\n @abstractmethod\n def find_nearest_from_array(\n self,\n h: numpy.typing.ArrayLike,\n n: int = 100,\n within_ids: t.Sequence[str] = (),\n ) -> t.Tuple[t.List[str], t.List[float]]:\n \"\"\"\n Find the nearest vectors to the given vector.\n\n :param h: vector\n :param n: number of nearest vectors to return\n \"\"\"\n\n def post_create(self):\n \"\"\"\n This method is used for searchers which requires\n to perform a task after all vectors have been added\n \"\"\"\n\n\nclass VectorIndexMeasureType(str, enum.Enum):\n cosine = 'cosine'\n css = 'css'\n dot = 'dot'\n l2 = 'l2'\n\n\n@dataclass(frozen=True)\nclass VectorSearchConfig:\n '''\n Represents search config which helps initiate a vector\n searcher class.\n '''\n\n id: str\n dimensions: int\n measure: VectorIndexMeasureType = VectorIndexMeasureType.l2\n parameters: t.Mapping[str, t.Any] = field(default_factory=dict)\n\n\n@dataclass(frozen=True)\nclass VectorItem:\n '''\n Class for representing a vector in vector search with\n id and vector.\n\n '''\n\n id: str\n vector: numpy.ndarray\n\n @classmethod\n def create(\n cls,\n *,\n id: str,\n vector: numpy.typing.ArrayLike,\n ) -> VectorItem:\n return VectorItem(id=id, vector=BaseVectorSearcher.to_numpy(vector))\n\n def to_dict(self) -> t.Dict:\n return {'id': self.id, 'vector': self.vector}\n\n\n@dataclass(frozen=True)\nclass VectorSearchResult:\n '''\n Dataclass for representing vector search results with\n `id` and `score`.\n '''\n\n id: str\n score: float\n\n\ndef l2(x, y):\n '''\n L2 function for vector similarity search\n '''\n return numpy.array([-numpy.linalg.norm(x - y, axis=1)])\n\n\ndef dot(x, y):\n '''\n Dot function for vector similarity search\n '''\n return numpy.dot(x, y.T)\n\n\ndef cosine(x, y):\n '''\n Cosine similarity function for vector search\n '''\n x = x / numpy.linalg.norm(x, axis=1)[:, None]\n # y which implies all vectors in vectordatabase\n # has normalized vectors.\n return dot(x, y)\n\n\nmeasures = {'cosine': cosine, 'dot': dot, 'l2': l2}\n", "path": "superduperdb/vector_search/base.py"}]} | 3,017 | 492 |
gh_patches_debug_1520 | rasdani/github-patches | git_diff | alltheplaces__alltheplaces-316 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Wendy's
e.g. https://locations.wendys.com/jamestown-ny-3438
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/wendys.py`
Content:
```
1 import scrapy
2 import re
3 import json
4 from locations.items import GeojsonPointItem
5
6 DAY_MAPPING = {
7 'Monday': 'Mo',
8 'Tuesday': 'Tu',
9 'Wednesday': 'We',
10 'Thursday': 'Th',
11 'Friday': 'Fr',
12 'Saturday': 'Sa',
13 'Sunday': 'Su'
14 }
15
16
17 class WendysSpider(scrapy.Spider):
18
19 name = "wendys"
20 allowed_domains = ["locations.wendys.com"]
21 download_delay = 0
22 download_timeout = 30
23 start_urls = (
24 'https://locations.wendys.com',
25 )
26
27 def handle_error(self, failure):
28 self.log("Request failed: %s" % failure.request)
29 def parse_day(self, day):
30 return DAY_MAPPING[day.strip()]
31 def parse_times(self, times):
32 hours_to = [x.strip() for x in times.split('-')]
33 cleaned_times = []
34
35 for hour in hours_to:
36 if re.search('pm$', hour):
37 hour = re.sub('pm', '', hour).strip()
38 hour_min = hour.split(":")
39 if int(hour_min[0]) < 12:
40 hour_min[0] = str(12 + int(hour_min[0]))
41 cleaned_times.append(":".join(hour_min))
42
43 if re.search('am$', hour):
44 hour = re.sub('am', '', hour).strip()
45 hour_min = hour.split(":")
46 if len(hour_min[0]) <2:
47 hour_min[0] = hour_min[0].zfill(2)
48 else:
49 hour_min[0] = str(int(hour_min[0]))
50
51 cleaned_times.append(":".join(hour_min))
52 return "-".join(cleaned_times)
53
54 def parse_hours(self, lis):
55 hours = []
56 for li in lis:
57 day = li.xpath('./span[@class="day"]/text()').extract()[1]
58 times = li.xpath('./span[2]/text()').extract_first()
59 if times and day:
60 parsed_time = self.parse_times(times)
61 parsed_day = self.parse_day(day)
62 hours.append(parsed_day + ' ' + parsed_time)
63
64 return "; ".join(hours)
65 def parse_stores(self, response):
66 page_content = response.body_as_unicode()
67 json_content = re.findall('li.data.results =[^;]+' , page_content)
68 if len(json_content)>0:
69 json_content = json_content[0].replace('li.data.results =' ,'')
70 json_data = json.loads(json_content)
71 properties = {
72 'addr_full': json_data[0]['address'],
73 'phone':json_data[0]['phone'],
74 'city': json_data[0]['city'],
75 'state':json_data[0]['state'],
76 'postcode': json_data[0]['postal'],
77 'ref': json_data[0]['id'],
78 'website': response.url,
79 'lat': json_data[0]['lat'],
80 'lon': json_data[0]['lon'],
81 }
82 hours = self.parse_hours(response.xpath('//div[@class="hours"]/ol/li'))
83 if hours:
84 properties['opening_hours'] = hours
85
86 yield GeojsonPointItem(**properties)
87
88 def parse_city_stores(self, response):
89 stores = response.xpath('//div[@class="col-xs-12 col-lg-10 col-lg-offset-1"]/article/ul/li/a/@href').extract()
90 for store in stores:
91 if store:
92 yield scrapy.Request(response.urljoin(store), callback=self.parse_stores ,errback=self.handle_error)
93
94 def parse_state(self, response):
95 city_urls = response.xpath('//div[@class="col-xs-12 col-lg-10 col-lg-offset-1"]/article/div[@class="col"]/ul/li/a/@href').extract()
96 for path in city_urls:
97 yield scrapy.Request(response.urljoin(path), callback=self.parse_city_stores ,errback=self.handle_error)
98
99 def parse(self, response):
100 urls = response.xpath('//div[@class="col-xs-12 col-lg-10 col-lg-offset-1"]/article/div[@class="col"]/ul/li/a/@href').extract()
101 for path in urls:
102 yield scrapy.Request(response.urljoin(path), callback=self.parse_state ,errback=self.handle_error)
103
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/locations/spiders/wendys.py b/locations/spiders/wendys.py
--- a/locations/spiders/wendys.py
+++ b/locations/spiders/wendys.py
@@ -18,7 +18,7 @@
name = "wendys"
allowed_domains = ["locations.wendys.com"]
- download_delay = 0
+ download_delay = 0.5
download_timeout = 30
start_urls = (
'https://locations.wendys.com',
| {"golden_diff": "diff --git a/locations/spiders/wendys.py b/locations/spiders/wendys.py\n--- a/locations/spiders/wendys.py\n+++ b/locations/spiders/wendys.py\n@@ -18,7 +18,7 @@\n \n name = \"wendys\"\n allowed_domains = [\"locations.wendys.com\"]\n- download_delay = 0\n+ download_delay = 0.5\n download_timeout = 30\n start_urls = (\n 'https://locations.wendys.com',\n", "issue": "Wendy's\ne.g. https://locations.wendys.com/jamestown-ny-3438\n", "before_files": [{"content": "import scrapy\nimport re\nimport json\nfrom locations.items import GeojsonPointItem\n\nDAY_MAPPING = {\n 'Monday': 'Mo',\n 'Tuesday': 'Tu',\n 'Wednesday': 'We',\n 'Thursday': 'Th',\n 'Friday': 'Fr',\n 'Saturday': 'Sa',\n 'Sunday': 'Su'\n}\n\n\nclass WendysSpider(scrapy.Spider):\n\n name = \"wendys\"\n allowed_domains = [\"locations.wendys.com\"]\n download_delay = 0\n download_timeout = 30\n start_urls = (\n 'https://locations.wendys.com',\n )\n\n def handle_error(self, failure):\n self.log(\"Request failed: %s\" % failure.request)\n def parse_day(self, day):\n return DAY_MAPPING[day.strip()]\n def parse_times(self, times):\n hours_to = [x.strip() for x in times.split('-')]\n cleaned_times = []\n\n for hour in hours_to:\n if re.search('pm$', hour):\n hour = re.sub('pm', '', hour).strip()\n hour_min = hour.split(\":\")\n if int(hour_min[0]) < 12:\n hour_min[0] = str(12 + int(hour_min[0]))\n cleaned_times.append(\":\".join(hour_min))\n\n if re.search('am$', hour):\n hour = re.sub('am', '', hour).strip()\n hour_min = hour.split(\":\")\n if len(hour_min[0]) <2:\n hour_min[0] = hour_min[0].zfill(2)\n else:\n hour_min[0] = str(int(hour_min[0]))\n\n cleaned_times.append(\":\".join(hour_min))\n return \"-\".join(cleaned_times)\n\n def parse_hours(self, lis):\n hours = []\n for li in lis:\n day = li.xpath('./span[@class=\"day\"]/text()').extract()[1]\n times = li.xpath('./span[2]/text()').extract_first()\n if times and day:\n parsed_time = self.parse_times(times)\n parsed_day = self.parse_day(day)\n hours.append(parsed_day + ' ' + parsed_time)\n\n return \"; \".join(hours)\n def parse_stores(self, response):\n page_content = response.body_as_unicode()\n json_content = re.findall('li.data.results =[^;]+' , page_content)\n if len(json_content)>0:\n json_content = json_content[0].replace('li.data.results =' ,'')\n json_data = json.loads(json_content)\n properties = {\n 'addr_full': json_data[0]['address'],\n 'phone':json_data[0]['phone'],\n 'city': json_data[0]['city'],\n 'state':json_data[0]['state'],\n 'postcode': json_data[0]['postal'],\n 'ref': json_data[0]['id'],\n 'website': response.url,\n 'lat': json_data[0]['lat'],\n 'lon': json_data[0]['lon'],\n }\n hours = self.parse_hours(response.xpath('//div[@class=\"hours\"]/ol/li'))\n if hours:\n properties['opening_hours'] = hours\n\n yield GeojsonPointItem(**properties)\n\n def parse_city_stores(self, response):\n stores = response.xpath('//div[@class=\"col-xs-12 col-lg-10 col-lg-offset-1\"]/article/ul/li/a/@href').extract()\n for store in stores:\n if store:\n yield scrapy.Request(response.urljoin(store), callback=self.parse_stores ,errback=self.handle_error)\n\n def parse_state(self, response):\n city_urls = response.xpath('//div[@class=\"col-xs-12 col-lg-10 col-lg-offset-1\"]/article/div[@class=\"col\"]/ul/li/a/@href').extract()\n for path in city_urls:\n yield scrapy.Request(response.urljoin(path), callback=self.parse_city_stores ,errback=self.handle_error)\n\n def parse(self, response):\n urls = response.xpath('//div[@class=\"col-xs-12 col-lg-10 col-lg-offset-1\"]/article/div[@class=\"col\"]/ul/li/a/@href').extract()\n for path in urls:\n yield scrapy.Request(response.urljoin(path), callback=self.parse_state ,errback=self.handle_error)\n", "path": "locations/spiders/wendys.py"}], "after_files": [{"content": "import scrapy\nimport re\nimport json\nfrom locations.items import GeojsonPointItem\n\nDAY_MAPPING = {\n 'Monday': 'Mo',\n 'Tuesday': 'Tu',\n 'Wednesday': 'We',\n 'Thursday': 'Th',\n 'Friday': 'Fr',\n 'Saturday': 'Sa',\n 'Sunday': 'Su'\n}\n\n\nclass WendysSpider(scrapy.Spider):\n\n name = \"wendys\"\n allowed_domains = [\"locations.wendys.com\"]\n download_delay = 0.5\n download_timeout = 30\n start_urls = (\n 'https://locations.wendys.com',\n )\n\n def handle_error(self, failure):\n self.log(\"Request failed: %s\" % failure.request)\n def parse_day(self, day):\n return DAY_MAPPING[day.strip()]\n def parse_times(self, times):\n hours_to = [x.strip() for x in times.split('-')]\n cleaned_times = []\n\n for hour in hours_to:\n if re.search('pm$', hour):\n hour = re.sub('pm', '', hour).strip()\n hour_min = hour.split(\":\")\n if int(hour_min[0]) < 12:\n hour_min[0] = str(12 + int(hour_min[0]))\n cleaned_times.append(\":\".join(hour_min))\n\n if re.search('am$', hour):\n hour = re.sub('am', '', hour).strip()\n hour_min = hour.split(\":\")\n if len(hour_min[0]) <2:\n hour_min[0] = hour_min[0].zfill(2)\n else:\n hour_min[0] = str(int(hour_min[0]))\n\n cleaned_times.append(\":\".join(hour_min))\n return \"-\".join(cleaned_times)\n\n def parse_hours(self, lis):\n hours = []\n for li in lis:\n day = li.xpath('./span[@class=\"day\"]/text()').extract()[1]\n times = li.xpath('./span[2]/text()').extract_first()\n if times and day:\n parsed_time = self.parse_times(times)\n parsed_day = self.parse_day(day)\n hours.append(parsed_day + ' ' + parsed_time)\n\n return \"; \".join(hours)\n def parse_stores(self, response):\n page_content = response.body_as_unicode()\n json_content = re.findall('li.data.results =[^;]+' , page_content)\n if len(json_content)>0:\n json_content = json_content[0].replace('li.data.results =' ,'')\n json_data = json.loads(json_content)\n properties = {\n 'addr_full': json_data[0]['address'],\n 'phone':json_data[0]['phone'],\n 'city': json_data[0]['city'],\n 'state':json_data[0]['state'],\n 'postcode': json_data[0]['postal'],\n 'ref': json_data[0]['id'],\n 'website': response.url,\n 'lat': json_data[0]['lat'],\n 'lon': json_data[0]['lon'],\n }\n hours = self.parse_hours(response.xpath('//div[@class=\"hours\"]/ol/li'))\n if hours:\n properties['opening_hours'] = hours\n\n yield GeojsonPointItem(**properties)\n\n def parse_city_stores(self, response):\n stores = response.xpath('//div[@class=\"col-xs-12 col-lg-10 col-lg-offset-1\"]/article/ul/li/a/@href').extract()\n for store in stores:\n if store:\n yield scrapy.Request(response.urljoin(store), callback=self.parse_stores ,errback=self.handle_error)\n\n def parse_state(self, response):\n city_urls = response.xpath('//div[@class=\"col-xs-12 col-lg-10 col-lg-offset-1\"]/article/div[@class=\"col\"]/ul/li/a/@href').extract()\n for path in city_urls:\n yield scrapy.Request(response.urljoin(path), callback=self.parse_city_stores ,errback=self.handle_error)\n\n def parse(self, response):\n urls = response.xpath('//div[@class=\"col-xs-12 col-lg-10 col-lg-offset-1\"]/article/div[@class=\"col\"]/ul/li/a/@href').extract()\n for path in urls:\n yield scrapy.Request(response.urljoin(path), callback=self.parse_state ,errback=self.handle_error)\n", "path": "locations/spiders/wendys.py"}]} | 1,411 | 116 |
gh_patches_debug_8029 | rasdani/github-patches | git_diff | ipython__ipython-6931 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
console config not written
assigning to @minrk who said: "Oh jeez, I don't wanna fix that right now". Marking it two-point-oh.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `IPython/core/profileapp.py`
Content:
```
1 # encoding: utf-8
2 """
3 An application for managing IPython profiles.
4
5 To be invoked as the `ipython profile` subcommand.
6
7 Authors:
8
9 * Min RK
10
11 """
12 from __future__ import print_function
13
14 #-----------------------------------------------------------------------------
15 # Copyright (C) 2008 The IPython Development Team
16 #
17 # Distributed under the terms of the BSD License. The full license is in
18 # the file COPYING, distributed as part of this software.
19 #-----------------------------------------------------------------------------
20
21 #-----------------------------------------------------------------------------
22 # Imports
23 #-----------------------------------------------------------------------------
24
25 import os
26
27 from IPython.config.application import Application
28 from IPython.core.application import (
29 BaseIPythonApplication, base_flags
30 )
31 from IPython.core.profiledir import ProfileDir
32 from IPython.utils.importstring import import_item
33 from IPython.utils.path import get_ipython_dir, get_ipython_package_dir
34 from IPython.utils import py3compat
35 from IPython.utils.traitlets import Unicode, Bool, Dict
36
37 #-----------------------------------------------------------------------------
38 # Constants
39 #-----------------------------------------------------------------------------
40
41 create_help = """Create an IPython profile by name
42
43 Create an ipython profile directory by its name or
44 profile directory path. Profile directories contain
45 configuration, log and security related files and are named
46 using the convention 'profile_<name>'. By default they are
47 located in your ipython directory. Once created, you will
48 can edit the configuration files in the profile
49 directory to configure IPython. Most users will create a
50 profile directory by name,
51 `ipython profile create myprofile`, which will put the directory
52 in `<ipython_dir>/profile_myprofile`.
53 """
54 list_help = """List available IPython profiles
55
56 List all available profiles, by profile location, that can
57 be found in the current working directly or in the ipython
58 directory. Profile directories are named using the convention
59 'profile_<profile>'.
60 """
61 profile_help = """Manage IPython profiles
62
63 Profile directories contain
64 configuration, log and security related files and are named
65 using the convention 'profile_<name>'. By default they are
66 located in your ipython directory. You can create profiles
67 with `ipython profile create <name>`, or see the profiles you
68 already have with `ipython profile list`
69
70 To get started configuring IPython, simply do:
71
72 $> ipython profile create
73
74 and IPython will create the default profile in <ipython_dir>/profile_default,
75 where you can edit ipython_config.py to start configuring IPython.
76
77 """
78
79 _list_examples = "ipython profile list # list all profiles"
80
81 _create_examples = """
82 ipython profile create foo # create profile foo w/ default config files
83 ipython profile create foo --reset # restage default config files over current
84 ipython profile create foo --parallel # also stage parallel config files
85 """
86
87 _main_examples = """
88 ipython profile create -h # show the help string for the create subcommand
89 ipython profile list -h # show the help string for the list subcommand
90
91 ipython locate profile foo # print the path to the directory for profile 'foo'
92 """
93
94 #-----------------------------------------------------------------------------
95 # Profile Application Class (for `ipython profile` subcommand)
96 #-----------------------------------------------------------------------------
97
98
99 def list_profiles_in(path):
100 """list profiles in a given root directory"""
101 files = os.listdir(path)
102 profiles = []
103 for f in files:
104 try:
105 full_path = os.path.join(path, f)
106 except UnicodeError:
107 continue
108 if os.path.isdir(full_path) and f.startswith('profile_'):
109 profiles.append(f.split('_',1)[-1])
110 return profiles
111
112
113 def list_bundled_profiles():
114 """list profiles that are bundled with IPython."""
115 path = os.path.join(get_ipython_package_dir(), u'config', u'profile')
116 files = os.listdir(path)
117 profiles = []
118 for profile in files:
119 full_path = os.path.join(path, profile)
120 if os.path.isdir(full_path) and profile != "__pycache__":
121 profiles.append(profile)
122 return profiles
123
124
125 class ProfileLocate(BaseIPythonApplication):
126 description = """print the path to an IPython profile dir"""
127
128 def parse_command_line(self, argv=None):
129 super(ProfileLocate, self).parse_command_line(argv)
130 if self.extra_args:
131 self.profile = self.extra_args[0]
132
133 def start(self):
134 print(self.profile_dir.location)
135
136
137 class ProfileList(Application):
138 name = u'ipython-profile'
139 description = list_help
140 examples = _list_examples
141
142 aliases = Dict({
143 'ipython-dir' : 'ProfileList.ipython_dir',
144 'log-level' : 'Application.log_level',
145 })
146 flags = Dict(dict(
147 debug = ({'Application' : {'log_level' : 0}},
148 "Set Application.log_level to 0, maximizing log output."
149 )
150 ))
151
152 ipython_dir = Unicode(get_ipython_dir(), config=True,
153 help="""
154 The name of the IPython directory. This directory is used for logging
155 configuration (through profiles), history storage, etc. The default
156 is usually $HOME/.ipython. This options can also be specified through
157 the environment variable IPYTHONDIR.
158 """
159 )
160
161
162 def _print_profiles(self, profiles):
163 """print list of profiles, indented."""
164 for profile in profiles:
165 print(' %s' % profile)
166
167 def list_profile_dirs(self):
168 profiles = list_bundled_profiles()
169 if profiles:
170 print()
171 print("Available profiles in IPython:")
172 self._print_profiles(profiles)
173 print()
174 print(" The first request for a bundled profile will copy it")
175 print(" into your IPython directory (%s)," % self.ipython_dir)
176 print(" where you can customize it.")
177
178 profiles = list_profiles_in(self.ipython_dir)
179 if profiles:
180 print()
181 print("Available profiles in %s:" % self.ipython_dir)
182 self._print_profiles(profiles)
183
184 profiles = list_profiles_in(py3compat.getcwd())
185 if profiles:
186 print()
187 print("Available profiles in current directory (%s):" % py3compat.getcwd())
188 self._print_profiles(profiles)
189
190 print()
191 print("To use any of the above profiles, start IPython with:")
192 print(" ipython --profile=<name>")
193 print()
194
195 def start(self):
196 self.list_profile_dirs()
197
198
199 create_flags = {}
200 create_flags.update(base_flags)
201 # don't include '--init' flag, which implies running profile create in other apps
202 create_flags.pop('init')
203 create_flags['reset'] = ({'ProfileCreate': {'overwrite' : True}},
204 "reset config files in this profile to the defaults.")
205 create_flags['parallel'] = ({'ProfileCreate': {'parallel' : True}},
206 "Include the config files for parallel "
207 "computing apps (ipengine, ipcontroller, etc.)")
208
209
210 class ProfileCreate(BaseIPythonApplication):
211 name = u'ipython-profile'
212 description = create_help
213 examples = _create_examples
214 auto_create = Bool(True, config=False)
215 def _log_format_default(self):
216 return "[%(name)s] %(message)s"
217
218 def _copy_config_files_default(self):
219 return True
220
221 parallel = Bool(False, config=True,
222 help="whether to include parallel computing config files")
223 def _parallel_changed(self, name, old, new):
224 parallel_files = [ 'ipcontroller_config.py',
225 'ipengine_config.py',
226 'ipcluster_config.py'
227 ]
228 if new:
229 for cf in parallel_files:
230 self.config_files.append(cf)
231 else:
232 for cf in parallel_files:
233 if cf in self.config_files:
234 self.config_files.remove(cf)
235
236 def parse_command_line(self, argv):
237 super(ProfileCreate, self).parse_command_line(argv)
238 # accept positional arg as profile name
239 if self.extra_args:
240 self.profile = self.extra_args[0]
241
242 flags = Dict(create_flags)
243
244 classes = [ProfileDir]
245
246 def _import_app(self, app_path):
247 """import an app class"""
248 app = None
249 name = app_path.rsplit('.', 1)[-1]
250 try:
251 app = import_item(app_path)
252 except ImportError:
253 self.log.info("Couldn't import %s, config file will be excluded", name)
254 except Exception:
255 self.log.warn('Unexpected error importing %s', name, exc_info=True)
256 return app
257
258 def init_config_files(self):
259 super(ProfileCreate, self).init_config_files()
260 # use local imports, since these classes may import from here
261 from IPython.terminal.ipapp import TerminalIPythonApp
262 apps = [TerminalIPythonApp]
263 for app_path in (
264 'IPython.kernel.zmq.kernelapp.IPKernelApp',
265 'IPython.qt.console.qtconsoleapp.IPythonQtConsoleApp',
266 'IPython.html.notebookapp.NotebookApp',
267 'IPython.nbconvert.nbconvertapp.NbConvertApp',
268 ):
269 app = self._import_app(app_path)
270 if app is not None:
271 apps.append(app)
272 if self.parallel:
273 from IPython.parallel.apps.ipcontrollerapp import IPControllerApp
274 from IPython.parallel.apps.ipengineapp import IPEngineApp
275 from IPython.parallel.apps.ipclusterapp import IPClusterStart
276 from IPython.parallel.apps.iploggerapp import IPLoggerApp
277 apps.extend([
278 IPControllerApp,
279 IPEngineApp,
280 IPClusterStart,
281 IPLoggerApp,
282 ])
283 for App in apps:
284 app = App()
285 app.config.update(self.config)
286 app.log = self.log
287 app.overwrite = self.overwrite
288 app.copy_config_files=True
289 app.ipython_dir=self.ipython_dir
290 app.profile_dir=self.profile_dir
291 app.init_config_files()
292
293 def stage_default_config_file(self):
294 pass
295
296
297 class ProfileApp(Application):
298 name = u'ipython profile'
299 description = profile_help
300 examples = _main_examples
301
302 subcommands = Dict(dict(
303 create = (ProfileCreate, ProfileCreate.description.splitlines()[0]),
304 list = (ProfileList, ProfileList.description.splitlines()[0]),
305 locate = (ProfileLocate, ProfileLocate.description.splitlines()[0]),
306 ))
307
308 def start(self):
309 if self.subapp is None:
310 print("No subcommand specified. Must specify one of: %s"%(self.subcommands.keys()))
311 print()
312 self.print_description()
313 self.print_subcommands()
314 self.exit(1)
315 else:
316 return self.subapp.start()
317
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/IPython/core/profileapp.py b/IPython/core/profileapp.py
--- a/IPython/core/profileapp.py
+++ b/IPython/core/profileapp.py
@@ -262,6 +262,7 @@
apps = [TerminalIPythonApp]
for app_path in (
'IPython.kernel.zmq.kernelapp.IPKernelApp',
+ 'IPython.terminal.console.app.ZMQTerminalIPythonApp',
'IPython.qt.console.qtconsoleapp.IPythonQtConsoleApp',
'IPython.html.notebookapp.NotebookApp',
'IPython.nbconvert.nbconvertapp.NbConvertApp',
| {"golden_diff": "diff --git a/IPython/core/profileapp.py b/IPython/core/profileapp.py\n--- a/IPython/core/profileapp.py\n+++ b/IPython/core/profileapp.py\n@@ -262,6 +262,7 @@\n apps = [TerminalIPythonApp]\n for app_path in (\n 'IPython.kernel.zmq.kernelapp.IPKernelApp',\n+ 'IPython.terminal.console.app.ZMQTerminalIPythonApp',\n 'IPython.qt.console.qtconsoleapp.IPythonQtConsoleApp',\n 'IPython.html.notebookapp.NotebookApp',\n 'IPython.nbconvert.nbconvertapp.NbConvertApp',\n", "issue": "console config not written\nassigning to @minrk who said: \"Oh jeez, I don't wanna fix that right now\". Marking it two-point-oh.\n\n", "before_files": [{"content": "# encoding: utf-8\n\"\"\"\nAn application for managing IPython profiles.\n\nTo be invoked as the `ipython profile` subcommand.\n\nAuthors:\n\n* Min RK\n\n\"\"\"\nfrom __future__ import print_function\n\n#-----------------------------------------------------------------------------\n# Copyright (C) 2008 The IPython Development Team\n#\n# Distributed under the terms of the BSD License. The full license is in\n# the file COPYING, distributed as part of this software.\n#-----------------------------------------------------------------------------\n\n#-----------------------------------------------------------------------------\n# Imports\n#-----------------------------------------------------------------------------\n\nimport os\n\nfrom IPython.config.application import Application\nfrom IPython.core.application import (\n BaseIPythonApplication, base_flags\n)\nfrom IPython.core.profiledir import ProfileDir\nfrom IPython.utils.importstring import import_item\nfrom IPython.utils.path import get_ipython_dir, get_ipython_package_dir\nfrom IPython.utils import py3compat\nfrom IPython.utils.traitlets import Unicode, Bool, Dict\n\n#-----------------------------------------------------------------------------\n# Constants\n#-----------------------------------------------------------------------------\n\ncreate_help = \"\"\"Create an IPython profile by name\n\nCreate an ipython profile directory by its name or\nprofile directory path. Profile directories contain\nconfiguration, log and security related files and are named\nusing the convention 'profile_<name>'. By default they are\nlocated in your ipython directory. Once created, you will\ncan edit the configuration files in the profile\ndirectory to configure IPython. Most users will create a\nprofile directory by name,\n`ipython profile create myprofile`, which will put the directory\nin `<ipython_dir>/profile_myprofile`.\n\"\"\"\nlist_help = \"\"\"List available IPython profiles\n\nList all available profiles, by profile location, that can\nbe found in the current working directly or in the ipython\ndirectory. Profile directories are named using the convention\n'profile_<profile>'.\n\"\"\"\nprofile_help = \"\"\"Manage IPython profiles\n\nProfile directories contain\nconfiguration, log and security related files and are named\nusing the convention 'profile_<name>'. By default they are\nlocated in your ipython directory. You can create profiles\nwith `ipython profile create <name>`, or see the profiles you\nalready have with `ipython profile list`\n\nTo get started configuring IPython, simply do:\n\n$> ipython profile create\n\nand IPython will create the default profile in <ipython_dir>/profile_default,\nwhere you can edit ipython_config.py to start configuring IPython.\n\n\"\"\"\n\n_list_examples = \"ipython profile list # list all profiles\"\n\n_create_examples = \"\"\"\nipython profile create foo # create profile foo w/ default config files\nipython profile create foo --reset # restage default config files over current\nipython profile create foo --parallel # also stage parallel config files\n\"\"\"\n\n_main_examples = \"\"\"\nipython profile create -h # show the help string for the create subcommand\nipython profile list -h # show the help string for the list subcommand\n\nipython locate profile foo # print the path to the directory for profile 'foo'\n\"\"\"\n\n#-----------------------------------------------------------------------------\n# Profile Application Class (for `ipython profile` subcommand)\n#-----------------------------------------------------------------------------\n\n\ndef list_profiles_in(path):\n \"\"\"list profiles in a given root directory\"\"\"\n files = os.listdir(path)\n profiles = []\n for f in files:\n try:\n full_path = os.path.join(path, f)\n except UnicodeError:\n continue\n if os.path.isdir(full_path) and f.startswith('profile_'):\n profiles.append(f.split('_',1)[-1])\n return profiles\n\n\ndef list_bundled_profiles():\n \"\"\"list profiles that are bundled with IPython.\"\"\"\n path = os.path.join(get_ipython_package_dir(), u'config', u'profile')\n files = os.listdir(path)\n profiles = []\n for profile in files:\n full_path = os.path.join(path, profile)\n if os.path.isdir(full_path) and profile != \"__pycache__\":\n profiles.append(profile)\n return profiles\n\n\nclass ProfileLocate(BaseIPythonApplication):\n description = \"\"\"print the path to an IPython profile dir\"\"\"\n \n def parse_command_line(self, argv=None):\n super(ProfileLocate, self).parse_command_line(argv)\n if self.extra_args:\n self.profile = self.extra_args[0]\n \n def start(self):\n print(self.profile_dir.location)\n\n\nclass ProfileList(Application):\n name = u'ipython-profile'\n description = list_help\n examples = _list_examples\n\n aliases = Dict({\n 'ipython-dir' : 'ProfileList.ipython_dir',\n 'log-level' : 'Application.log_level',\n })\n flags = Dict(dict(\n debug = ({'Application' : {'log_level' : 0}},\n \"Set Application.log_level to 0, maximizing log output.\"\n )\n ))\n\n ipython_dir = Unicode(get_ipython_dir(), config=True,\n help=\"\"\"\n The name of the IPython directory. This directory is used for logging\n configuration (through profiles), history storage, etc. The default\n is usually $HOME/.ipython. This options can also be specified through\n the environment variable IPYTHONDIR.\n \"\"\"\n )\n\n\n def _print_profiles(self, profiles):\n \"\"\"print list of profiles, indented.\"\"\"\n for profile in profiles:\n print(' %s' % profile)\n\n def list_profile_dirs(self):\n profiles = list_bundled_profiles()\n if profiles:\n print()\n print(\"Available profiles in IPython:\")\n self._print_profiles(profiles)\n print()\n print(\" The first request for a bundled profile will copy it\")\n print(\" into your IPython directory (%s),\" % self.ipython_dir)\n print(\" where you can customize it.\")\n \n profiles = list_profiles_in(self.ipython_dir)\n if profiles:\n print()\n print(\"Available profiles in %s:\" % self.ipython_dir)\n self._print_profiles(profiles)\n \n profiles = list_profiles_in(py3compat.getcwd())\n if profiles:\n print()\n print(\"Available profiles in current directory (%s):\" % py3compat.getcwd())\n self._print_profiles(profiles)\n \n print()\n print(\"To use any of the above profiles, start IPython with:\")\n print(\" ipython --profile=<name>\")\n print()\n\n def start(self):\n self.list_profile_dirs()\n\n\ncreate_flags = {}\ncreate_flags.update(base_flags)\n# don't include '--init' flag, which implies running profile create in other apps\ncreate_flags.pop('init')\ncreate_flags['reset'] = ({'ProfileCreate': {'overwrite' : True}},\n \"reset config files in this profile to the defaults.\")\ncreate_flags['parallel'] = ({'ProfileCreate': {'parallel' : True}},\n \"Include the config files for parallel \"\n \"computing apps (ipengine, ipcontroller, etc.)\")\n\n\nclass ProfileCreate(BaseIPythonApplication):\n name = u'ipython-profile'\n description = create_help\n examples = _create_examples\n auto_create = Bool(True, config=False)\n def _log_format_default(self):\n return \"[%(name)s] %(message)s\"\n\n def _copy_config_files_default(self):\n return True\n\n parallel = Bool(False, config=True,\n help=\"whether to include parallel computing config files\")\n def _parallel_changed(self, name, old, new):\n parallel_files = [ 'ipcontroller_config.py',\n 'ipengine_config.py',\n 'ipcluster_config.py'\n ]\n if new:\n for cf in parallel_files:\n self.config_files.append(cf)\n else:\n for cf in parallel_files:\n if cf in self.config_files:\n self.config_files.remove(cf)\n\n def parse_command_line(self, argv):\n super(ProfileCreate, self).parse_command_line(argv)\n # accept positional arg as profile name\n if self.extra_args:\n self.profile = self.extra_args[0]\n\n flags = Dict(create_flags)\n\n classes = [ProfileDir]\n \n def _import_app(self, app_path):\n \"\"\"import an app class\"\"\"\n app = None\n name = app_path.rsplit('.', 1)[-1]\n try:\n app = import_item(app_path)\n except ImportError:\n self.log.info(\"Couldn't import %s, config file will be excluded\", name)\n except Exception:\n self.log.warn('Unexpected error importing %s', name, exc_info=True)\n return app\n\n def init_config_files(self):\n super(ProfileCreate, self).init_config_files()\n # use local imports, since these classes may import from here\n from IPython.terminal.ipapp import TerminalIPythonApp\n apps = [TerminalIPythonApp]\n for app_path in (\n 'IPython.kernel.zmq.kernelapp.IPKernelApp',\n 'IPython.qt.console.qtconsoleapp.IPythonQtConsoleApp',\n 'IPython.html.notebookapp.NotebookApp',\n 'IPython.nbconvert.nbconvertapp.NbConvertApp',\n ):\n app = self._import_app(app_path)\n if app is not None:\n apps.append(app)\n if self.parallel:\n from IPython.parallel.apps.ipcontrollerapp import IPControllerApp\n from IPython.parallel.apps.ipengineapp import IPEngineApp\n from IPython.parallel.apps.ipclusterapp import IPClusterStart\n from IPython.parallel.apps.iploggerapp import IPLoggerApp\n apps.extend([\n IPControllerApp,\n IPEngineApp,\n IPClusterStart,\n IPLoggerApp,\n ])\n for App in apps:\n app = App()\n app.config.update(self.config)\n app.log = self.log\n app.overwrite = self.overwrite\n app.copy_config_files=True\n app.ipython_dir=self.ipython_dir\n app.profile_dir=self.profile_dir\n app.init_config_files()\n\n def stage_default_config_file(self):\n pass\n\n\nclass ProfileApp(Application):\n name = u'ipython profile'\n description = profile_help\n examples = _main_examples\n\n subcommands = Dict(dict(\n create = (ProfileCreate, ProfileCreate.description.splitlines()[0]),\n list = (ProfileList, ProfileList.description.splitlines()[0]),\n locate = (ProfileLocate, ProfileLocate.description.splitlines()[0]),\n ))\n\n def start(self):\n if self.subapp is None:\n print(\"No subcommand specified. Must specify one of: %s\"%(self.subcommands.keys()))\n print()\n self.print_description()\n self.print_subcommands()\n self.exit(1)\n else:\n return self.subapp.start()\n", "path": "IPython/core/profileapp.py"}], "after_files": [{"content": "# encoding: utf-8\n\"\"\"\nAn application for managing IPython profiles.\n\nTo be invoked as the `ipython profile` subcommand.\n\nAuthors:\n\n* Min RK\n\n\"\"\"\nfrom __future__ import print_function\n\n#-----------------------------------------------------------------------------\n# Copyright (C) 2008 The IPython Development Team\n#\n# Distributed under the terms of the BSD License. The full license is in\n# the file COPYING, distributed as part of this software.\n#-----------------------------------------------------------------------------\n\n#-----------------------------------------------------------------------------\n# Imports\n#-----------------------------------------------------------------------------\n\nimport os\n\nfrom IPython.config.application import Application\nfrom IPython.core.application import (\n BaseIPythonApplication, base_flags\n)\nfrom IPython.core.profiledir import ProfileDir\nfrom IPython.utils.importstring import import_item\nfrom IPython.utils.path import get_ipython_dir, get_ipython_package_dir\nfrom IPython.utils import py3compat\nfrom IPython.utils.traitlets import Unicode, Bool, Dict\n\n#-----------------------------------------------------------------------------\n# Constants\n#-----------------------------------------------------------------------------\n\ncreate_help = \"\"\"Create an IPython profile by name\n\nCreate an ipython profile directory by its name or\nprofile directory path. Profile directories contain\nconfiguration, log and security related files and are named\nusing the convention 'profile_<name>'. By default they are\nlocated in your ipython directory. Once created, you will\ncan edit the configuration files in the profile\ndirectory to configure IPython. Most users will create a\nprofile directory by name,\n`ipython profile create myprofile`, which will put the directory\nin `<ipython_dir>/profile_myprofile`.\n\"\"\"\nlist_help = \"\"\"List available IPython profiles\n\nList all available profiles, by profile location, that can\nbe found in the current working directly or in the ipython\ndirectory. Profile directories are named using the convention\n'profile_<profile>'.\n\"\"\"\nprofile_help = \"\"\"Manage IPython profiles\n\nProfile directories contain\nconfiguration, log and security related files and are named\nusing the convention 'profile_<name>'. By default they are\nlocated in your ipython directory. You can create profiles\nwith `ipython profile create <name>`, or see the profiles you\nalready have with `ipython profile list`\n\nTo get started configuring IPython, simply do:\n\n$> ipython profile create\n\nand IPython will create the default profile in <ipython_dir>/profile_default,\nwhere you can edit ipython_config.py to start configuring IPython.\n\n\"\"\"\n\n_list_examples = \"ipython profile list # list all profiles\"\n\n_create_examples = \"\"\"\nipython profile create foo # create profile foo w/ default config files\nipython profile create foo --reset # restage default config files over current\nipython profile create foo --parallel # also stage parallel config files\n\"\"\"\n\n_main_examples = \"\"\"\nipython profile create -h # show the help string for the create subcommand\nipython profile list -h # show the help string for the list subcommand\n\nipython locate profile foo # print the path to the directory for profile 'foo'\n\"\"\"\n\n#-----------------------------------------------------------------------------\n# Profile Application Class (for `ipython profile` subcommand)\n#-----------------------------------------------------------------------------\n\n\ndef list_profiles_in(path):\n \"\"\"list profiles in a given root directory\"\"\"\n files = os.listdir(path)\n profiles = []\n for f in files:\n try:\n full_path = os.path.join(path, f)\n except UnicodeError:\n continue\n if os.path.isdir(full_path) and f.startswith('profile_'):\n profiles.append(f.split('_',1)[-1])\n return profiles\n\n\ndef list_bundled_profiles():\n \"\"\"list profiles that are bundled with IPython.\"\"\"\n path = os.path.join(get_ipython_package_dir(), u'config', u'profile')\n files = os.listdir(path)\n profiles = []\n for profile in files:\n full_path = os.path.join(path, profile)\n if os.path.isdir(full_path) and profile != \"__pycache__\":\n profiles.append(profile)\n return profiles\n\n\nclass ProfileLocate(BaseIPythonApplication):\n description = \"\"\"print the path to an IPython profile dir\"\"\"\n \n def parse_command_line(self, argv=None):\n super(ProfileLocate, self).parse_command_line(argv)\n if self.extra_args:\n self.profile = self.extra_args[0]\n \n def start(self):\n print(self.profile_dir.location)\n\n\nclass ProfileList(Application):\n name = u'ipython-profile'\n description = list_help\n examples = _list_examples\n\n aliases = Dict({\n 'ipython-dir' : 'ProfileList.ipython_dir',\n 'log-level' : 'Application.log_level',\n })\n flags = Dict(dict(\n debug = ({'Application' : {'log_level' : 0}},\n \"Set Application.log_level to 0, maximizing log output.\"\n )\n ))\n\n ipython_dir = Unicode(get_ipython_dir(), config=True,\n help=\"\"\"\n The name of the IPython directory. This directory is used for logging\n configuration (through profiles), history storage, etc. The default\n is usually $HOME/.ipython. This options can also be specified through\n the environment variable IPYTHONDIR.\n \"\"\"\n )\n\n\n def _print_profiles(self, profiles):\n \"\"\"print list of profiles, indented.\"\"\"\n for profile in profiles:\n print(' %s' % profile)\n\n def list_profile_dirs(self):\n profiles = list_bundled_profiles()\n if profiles:\n print()\n print(\"Available profiles in IPython:\")\n self._print_profiles(profiles)\n print()\n print(\" The first request for a bundled profile will copy it\")\n print(\" into your IPython directory (%s),\" % self.ipython_dir)\n print(\" where you can customize it.\")\n \n profiles = list_profiles_in(self.ipython_dir)\n if profiles:\n print()\n print(\"Available profiles in %s:\" % self.ipython_dir)\n self._print_profiles(profiles)\n \n profiles = list_profiles_in(py3compat.getcwd())\n if profiles:\n print()\n print(\"Available profiles in current directory (%s):\" % py3compat.getcwd())\n self._print_profiles(profiles)\n \n print()\n print(\"To use any of the above profiles, start IPython with:\")\n print(\" ipython --profile=<name>\")\n print()\n\n def start(self):\n self.list_profile_dirs()\n\n\ncreate_flags = {}\ncreate_flags.update(base_flags)\n# don't include '--init' flag, which implies running profile create in other apps\ncreate_flags.pop('init')\ncreate_flags['reset'] = ({'ProfileCreate': {'overwrite' : True}},\n \"reset config files in this profile to the defaults.\")\ncreate_flags['parallel'] = ({'ProfileCreate': {'parallel' : True}},\n \"Include the config files for parallel \"\n \"computing apps (ipengine, ipcontroller, etc.)\")\n\n\nclass ProfileCreate(BaseIPythonApplication):\n name = u'ipython-profile'\n description = create_help\n examples = _create_examples\n auto_create = Bool(True, config=False)\n def _log_format_default(self):\n return \"[%(name)s] %(message)s\"\n\n def _copy_config_files_default(self):\n return True\n\n parallel = Bool(False, config=True,\n help=\"whether to include parallel computing config files\")\n def _parallel_changed(self, name, old, new):\n parallel_files = [ 'ipcontroller_config.py',\n 'ipengine_config.py',\n 'ipcluster_config.py'\n ]\n if new:\n for cf in parallel_files:\n self.config_files.append(cf)\n else:\n for cf in parallel_files:\n if cf in self.config_files:\n self.config_files.remove(cf)\n\n def parse_command_line(self, argv):\n super(ProfileCreate, self).parse_command_line(argv)\n # accept positional arg as profile name\n if self.extra_args:\n self.profile = self.extra_args[0]\n\n flags = Dict(create_flags)\n\n classes = [ProfileDir]\n \n def _import_app(self, app_path):\n \"\"\"import an app class\"\"\"\n app = None\n name = app_path.rsplit('.', 1)[-1]\n try:\n app = import_item(app_path)\n except ImportError:\n self.log.info(\"Couldn't import %s, config file will be excluded\", name)\n except Exception:\n self.log.warn('Unexpected error importing %s', name, exc_info=True)\n return app\n\n def init_config_files(self):\n super(ProfileCreate, self).init_config_files()\n # use local imports, since these classes may import from here\n from IPython.terminal.ipapp import TerminalIPythonApp\n apps = [TerminalIPythonApp]\n for app_path in (\n 'IPython.kernel.zmq.kernelapp.IPKernelApp',\n 'IPython.terminal.console.app.ZMQTerminalIPythonApp',\n 'IPython.qt.console.qtconsoleapp.IPythonQtConsoleApp',\n 'IPython.html.notebookapp.NotebookApp',\n 'IPython.nbconvert.nbconvertapp.NbConvertApp',\n ):\n app = self._import_app(app_path)\n if app is not None:\n apps.append(app)\n if self.parallel:\n from IPython.parallel.apps.ipcontrollerapp import IPControllerApp\n from IPython.parallel.apps.ipengineapp import IPEngineApp\n from IPython.parallel.apps.ipclusterapp import IPClusterStart\n from IPython.parallel.apps.iploggerapp import IPLoggerApp\n apps.extend([\n IPControllerApp,\n IPEngineApp,\n IPClusterStart,\n IPLoggerApp,\n ])\n for App in apps:\n app = App()\n app.config.update(self.config)\n app.log = self.log\n app.overwrite = self.overwrite\n app.copy_config_files=True\n app.ipython_dir=self.ipython_dir\n app.profile_dir=self.profile_dir\n app.init_config_files()\n\n def stage_default_config_file(self):\n pass\n\n\nclass ProfileApp(Application):\n name = u'ipython profile'\n description = profile_help\n examples = _main_examples\n\n subcommands = Dict(dict(\n create = (ProfileCreate, ProfileCreate.description.splitlines()[0]),\n list = (ProfileList, ProfileList.description.splitlines()[0]),\n locate = (ProfileLocate, ProfileLocate.description.splitlines()[0]),\n ))\n\n def start(self):\n if self.subapp is None:\n print(\"No subcommand specified. Must specify one of: %s\"%(self.subcommands.keys()))\n print()\n self.print_description()\n self.print_subcommands()\n self.exit(1)\n else:\n return self.subapp.start()\n", "path": "IPython/core/profileapp.py"}]} | 3,413 | 136 |
gh_patches_debug_6762 | rasdani/github-patches | git_diff | microsoft__DeepSpeed-4770 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
nv-sd CI test failure
The Nightly CI for https://github.com/microsoft/DeepSpeed/actions/runs/7073374056 failed.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `deepspeed/model_implementations/diffusers/unet.py`
Content:
```
1 # Copyright (c) Microsoft Corporation.
2 # SPDX-License-Identifier: Apache-2.0
3
4 # DeepSpeed Team
5
6 import torch
7 from ..features.cuda_graph import CUDAGraph
8
9
10 class DSUNet(CUDAGraph, torch.nn.Module):
11
12 def __init__(self, unet, enable_cuda_graph=True):
13 super().__init__(enable_cuda_graph=enable_cuda_graph)
14 self.unet = unet
15 # SD pipeline accesses this attribute
16 self.in_channels = unet.in_channels
17 self.device = self.unet.device
18 self.dtype = self.unet.dtype
19 self.config = self.unet.config
20 self.fwd_count = 0
21 self.unet.requires_grad_(requires_grad=False)
22 self.unet.to(memory_format=torch.channels_last)
23 self.cuda_graph_created = False
24
25 def _graph_replay(self, *inputs, **kwargs):
26 for i in range(len(inputs)):
27 if torch.is_tensor(inputs[i]):
28 self.static_inputs[i].copy_(inputs[i])
29 for k in kwargs:
30 if torch.is_tensor(kwargs[k]):
31 self.static_kwargs[k].copy_(kwargs[k])
32 self._cuda_graphs.replay()
33 return self.static_output
34
35 def forward(self, *inputs, **kwargs):
36 if self.enable_cuda_graph:
37 if self.cuda_graph_created:
38 outputs = self._graph_replay(*inputs, **kwargs)
39 else:
40 self._create_cuda_graph(*inputs, **kwargs)
41 outputs = self._graph_replay(*inputs, **kwargs)
42 return outputs
43 else:
44 return self._forward(*inputs, **kwargs)
45
46 def _create_cuda_graph(self, *inputs, **kwargs):
47 # warmup to create the workspace and cublas handle
48 cuda_stream = torch.cuda.Stream()
49 cuda_stream.wait_stream(torch.cuda.current_stream())
50 with torch.cuda.stream(cuda_stream):
51 for i in range(3):
52 ret = self._forward(*inputs, **kwargs)
53 torch.cuda.current_stream().wait_stream(cuda_stream)
54
55 # create cuda_graph and assign static_inputs and static_outputs
56 self._cuda_graphs = torch.cuda.CUDAGraph()
57 self.static_inputs = inputs
58 self.static_kwargs = kwargs
59
60 with torch.cuda.graph(self._cuda_graphs):
61 self.static_output = self._forward(*self.static_inputs, **self.static_kwargs)
62
63 self.cuda_graph_created = True
64
65 def _forward(self,
66 sample,
67 timestamp,
68 encoder_hidden_states,
69 return_dict=True,
70 cross_attention_kwargs=None,
71 timestep_cond=None):
72 if cross_attention_kwargs:
73 return self.unet(sample,
74 timestamp,
75 encoder_hidden_states,
76 return_dict,
77 cross_attention_kwargs=cross_attention_kwargs)
78 else:
79 return self.unet(sample, timestamp, encoder_hidden_states, return_dict)
80
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/deepspeed/model_implementations/diffusers/unet.py b/deepspeed/model_implementations/diffusers/unet.py
--- a/deepspeed/model_implementations/diffusers/unet.py
+++ b/deepspeed/model_implementations/diffusers/unet.py
@@ -68,7 +68,8 @@
encoder_hidden_states,
return_dict=True,
cross_attention_kwargs=None,
- timestep_cond=None):
+ timestep_cond=None,
+ added_cond_kwargs=None):
if cross_attention_kwargs:
return self.unet(sample,
timestamp,
| {"golden_diff": "diff --git a/deepspeed/model_implementations/diffusers/unet.py b/deepspeed/model_implementations/diffusers/unet.py\n--- a/deepspeed/model_implementations/diffusers/unet.py\n+++ b/deepspeed/model_implementations/diffusers/unet.py\n@@ -68,7 +68,8 @@\n encoder_hidden_states,\n return_dict=True,\n cross_attention_kwargs=None,\n- timestep_cond=None):\n+ timestep_cond=None,\n+ added_cond_kwargs=None):\n if cross_attention_kwargs:\n return self.unet(sample,\n timestamp,\n", "issue": "nv-sd CI test failure\nThe Nightly CI for https://github.com/microsoft/DeepSpeed/actions/runs/7073374056 failed.\n\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation.\n# SPDX-License-Identifier: Apache-2.0\n\n# DeepSpeed Team\n\nimport torch\nfrom ..features.cuda_graph import CUDAGraph\n\n\nclass DSUNet(CUDAGraph, torch.nn.Module):\n\n def __init__(self, unet, enable_cuda_graph=True):\n super().__init__(enable_cuda_graph=enable_cuda_graph)\n self.unet = unet\n # SD pipeline accesses this attribute\n self.in_channels = unet.in_channels\n self.device = self.unet.device\n self.dtype = self.unet.dtype\n self.config = self.unet.config\n self.fwd_count = 0\n self.unet.requires_grad_(requires_grad=False)\n self.unet.to(memory_format=torch.channels_last)\n self.cuda_graph_created = False\n\n def _graph_replay(self, *inputs, **kwargs):\n for i in range(len(inputs)):\n if torch.is_tensor(inputs[i]):\n self.static_inputs[i].copy_(inputs[i])\n for k in kwargs:\n if torch.is_tensor(kwargs[k]):\n self.static_kwargs[k].copy_(kwargs[k])\n self._cuda_graphs.replay()\n return self.static_output\n\n def forward(self, *inputs, **kwargs):\n if self.enable_cuda_graph:\n if self.cuda_graph_created:\n outputs = self._graph_replay(*inputs, **kwargs)\n else:\n self._create_cuda_graph(*inputs, **kwargs)\n outputs = self._graph_replay(*inputs, **kwargs)\n return outputs\n else:\n return self._forward(*inputs, **kwargs)\n\n def _create_cuda_graph(self, *inputs, **kwargs):\n # warmup to create the workspace and cublas handle\n cuda_stream = torch.cuda.Stream()\n cuda_stream.wait_stream(torch.cuda.current_stream())\n with torch.cuda.stream(cuda_stream):\n for i in range(3):\n ret = self._forward(*inputs, **kwargs)\n torch.cuda.current_stream().wait_stream(cuda_stream)\n\n # create cuda_graph and assign static_inputs and static_outputs\n self._cuda_graphs = torch.cuda.CUDAGraph()\n self.static_inputs = inputs\n self.static_kwargs = kwargs\n\n with torch.cuda.graph(self._cuda_graphs):\n self.static_output = self._forward(*self.static_inputs, **self.static_kwargs)\n\n self.cuda_graph_created = True\n\n def _forward(self,\n sample,\n timestamp,\n encoder_hidden_states,\n return_dict=True,\n cross_attention_kwargs=None,\n timestep_cond=None):\n if cross_attention_kwargs:\n return self.unet(sample,\n timestamp,\n encoder_hidden_states,\n return_dict,\n cross_attention_kwargs=cross_attention_kwargs)\n else:\n return self.unet(sample, timestamp, encoder_hidden_states, return_dict)\n", "path": "deepspeed/model_implementations/diffusers/unet.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation.\n# SPDX-License-Identifier: Apache-2.0\n\n# DeepSpeed Team\n\nimport torch\nfrom ..features.cuda_graph import CUDAGraph\n\n\nclass DSUNet(CUDAGraph, torch.nn.Module):\n\n def __init__(self, unet, enable_cuda_graph=True):\n super().__init__(enable_cuda_graph=enable_cuda_graph)\n self.unet = unet\n # SD pipeline accesses this attribute\n self.in_channels = unet.in_channels\n self.device = self.unet.device\n self.dtype = self.unet.dtype\n self.config = self.unet.config\n self.fwd_count = 0\n self.unet.requires_grad_(requires_grad=False)\n self.unet.to(memory_format=torch.channels_last)\n self.cuda_graph_created = False\n\n def _graph_replay(self, *inputs, **kwargs):\n for i in range(len(inputs)):\n if torch.is_tensor(inputs[i]):\n self.static_inputs[i].copy_(inputs[i])\n for k in kwargs:\n if torch.is_tensor(kwargs[k]):\n self.static_kwargs[k].copy_(kwargs[k])\n self._cuda_graphs.replay()\n return self.static_output\n\n def forward(self, *inputs, **kwargs):\n if self.enable_cuda_graph:\n if self.cuda_graph_created:\n outputs = self._graph_replay(*inputs, **kwargs)\n else:\n self._create_cuda_graph(*inputs, **kwargs)\n outputs = self._graph_replay(*inputs, **kwargs)\n return outputs\n else:\n return self._forward(*inputs, **kwargs)\n\n def _create_cuda_graph(self, *inputs, **kwargs):\n # warmup to create the workspace and cublas handle\n cuda_stream = torch.cuda.Stream()\n cuda_stream.wait_stream(torch.cuda.current_stream())\n with torch.cuda.stream(cuda_stream):\n for i in range(3):\n ret = self._forward(*inputs, **kwargs)\n torch.cuda.current_stream().wait_stream(cuda_stream)\n\n # create cuda_graph and assign static_inputs and static_outputs\n self._cuda_graphs = torch.cuda.CUDAGraph()\n self.static_inputs = inputs\n self.static_kwargs = kwargs\n\n with torch.cuda.graph(self._cuda_graphs):\n self.static_output = self._forward(*self.static_inputs, **self.static_kwargs)\n\n self.cuda_graph_created = True\n\n def _forward(self,\n sample,\n timestamp,\n encoder_hidden_states,\n return_dict=True,\n cross_attention_kwargs=None,\n timestep_cond=None,\n added_cond_kwargs=None):\n if cross_attention_kwargs:\n return self.unet(sample,\n timestamp,\n encoder_hidden_states,\n return_dict,\n cross_attention_kwargs=cross_attention_kwargs)\n else:\n return self.unet(sample, timestamp, encoder_hidden_states, return_dict)\n", "path": "deepspeed/model_implementations/diffusers/unet.py"}]} | 1,053 | 127 |
gh_patches_debug_460 | rasdani/github-patches | git_diff | gratipay__gratipay.com-3013 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Twitter asks for authorization even though I've already authorized Gittip
As of #1369 Twitter is now asking me to authorize Giitip even though I've already done so.

<bountysource-plugin>
---
Want to back this issue? **[Place a bounty on it!](https://www.bountysource.com/issues/1428788-twitter-asks-for-authorization-even-though-i-ve-already-authorized-gittip?utm_campaign=plugin&utm_content=tracker%2F85909&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F85909&utm_medium=issues&utm_source=github).
</bountysource-plugin>
Twitter asks for authorization even though I've already authorized Gittip
As of #1369 Twitter is now asking me to authorize Giitip even though I've already done so.

<bountysource-plugin>
---
Want to back this issue? **[Place a bounty on it!](https://www.bountysource.com/issues/1428788-twitter-asks-for-authorization-even-though-i-ve-already-authorized-gittip?utm_campaign=plugin&utm_content=tracker%2F85909&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F85909&utm_medium=issues&utm_source=github).
</bountysource-plugin>
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gratipay/elsewhere/twitter.py`
Content:
```
1 from __future__ import absolute_import, division, print_function, unicode_literals
2
3 from gratipay.elsewhere import PlatformOAuth1
4 from gratipay.elsewhere._extractors import key, not_available
5
6
7 class Twitter(PlatformOAuth1):
8
9 # Platform attributes
10 name = 'twitter'
11 display_name = 'Twitter'
12 account_url = 'https://twitter.com/{user_name}'
13
14 # Auth attributes
15 auth_url = 'https://api.twitter.com'
16
17 # API attributes
18 api_format = 'json'
19 api_url = 'https://api.twitter.com/1.1'
20 api_user_info_path = '/users/show.json?screen_name={user_name}'
21 api_user_self_info_path = '/account/verify_credentials.json'
22 ratelimit_headers_prefix = 'x-rate-limit-'
23
24 # User info extractors
25 x_user_id = key('id')
26 x_user_name = key('screen_name')
27 x_display_name = key('name')
28 x_email = not_available
29 x_avatar_url = key('profile_image_url_https',
30 clean=lambda v: v.replace('_normal.', '.'))
31
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/gratipay/elsewhere/twitter.py b/gratipay/elsewhere/twitter.py
--- a/gratipay/elsewhere/twitter.py
+++ b/gratipay/elsewhere/twitter.py
@@ -13,6 +13,7 @@
# Auth attributes
auth_url = 'https://api.twitter.com'
+ authorize_path = '/oauth/authenticate'
# API attributes
api_format = 'json'
| {"golden_diff": "diff --git a/gratipay/elsewhere/twitter.py b/gratipay/elsewhere/twitter.py\n--- a/gratipay/elsewhere/twitter.py\n+++ b/gratipay/elsewhere/twitter.py\n@@ -13,6 +13,7 @@\n \n # Auth attributes\n auth_url = 'https://api.twitter.com'\n+ authorize_path = '/oauth/authenticate'\n \n # API attributes\n api_format = 'json'\n", "issue": "Twitter asks for authorization even though I've already authorized Gittip\nAs of #1369 Twitter is now asking me to authorize Giitip even though I've already done so.\n\n\n\n<bountysource-plugin>\n\n---\n\nWant to back this issue? **[Place a bounty on it!](https://www.bountysource.com/issues/1428788-twitter-asks-for-authorization-even-though-i-ve-already-authorized-gittip?utm_campaign=plugin&utm_content=tracker%2F85909&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F85909&utm_medium=issues&utm_source=github).\n</bountysource-plugin>\n\nTwitter asks for authorization even though I've already authorized Gittip\nAs of #1369 Twitter is now asking me to authorize Giitip even though I've already done so.\n\n\n\n<bountysource-plugin>\n\n---\n\nWant to back this issue? **[Place a bounty on it!](https://www.bountysource.com/issues/1428788-twitter-asks-for-authorization-even-though-i-ve-already-authorized-gittip?utm_campaign=plugin&utm_content=tracker%2F85909&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F85909&utm_medium=issues&utm_source=github).\n</bountysource-plugin>\n\n", "before_files": [{"content": "from __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom gratipay.elsewhere import PlatformOAuth1\nfrom gratipay.elsewhere._extractors import key, not_available\n\n\nclass Twitter(PlatformOAuth1):\n\n # Platform attributes\n name = 'twitter'\n display_name = 'Twitter'\n account_url = 'https://twitter.com/{user_name}'\n\n # Auth attributes\n auth_url = 'https://api.twitter.com'\n\n # API attributes\n api_format = 'json'\n api_url = 'https://api.twitter.com/1.1'\n api_user_info_path = '/users/show.json?screen_name={user_name}'\n api_user_self_info_path = '/account/verify_credentials.json'\n ratelimit_headers_prefix = 'x-rate-limit-'\n\n # User info extractors\n x_user_id = key('id')\n x_user_name = key('screen_name')\n x_display_name = key('name')\n x_email = not_available\n x_avatar_url = key('profile_image_url_https',\n clean=lambda v: v.replace('_normal.', '.'))\n", "path": "gratipay/elsewhere/twitter.py"}], "after_files": [{"content": "from __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom gratipay.elsewhere import PlatformOAuth1\nfrom gratipay.elsewhere._extractors import key, not_available\n\n\nclass Twitter(PlatformOAuth1):\n\n # Platform attributes\n name = 'twitter'\n display_name = 'Twitter'\n account_url = 'https://twitter.com/{user_name}'\n\n # Auth attributes\n auth_url = 'https://api.twitter.com'\n authorize_path = '/oauth/authenticate'\n\n # API attributes\n api_format = 'json'\n api_url = 'https://api.twitter.com/1.1'\n api_user_info_path = '/users/show.json?screen_name={user_name}'\n api_user_self_info_path = '/account/verify_credentials.json'\n ratelimit_headers_prefix = 'x-rate-limit-'\n\n # User info extractors\n x_user_id = key('id')\n x_user_name = key('screen_name')\n x_display_name = key('name')\n x_email = not_available\n x_avatar_url = key('profile_image_url_https',\n clean=lambda v: v.replace('_normal.', '.'))\n", "path": "gratipay/elsewhere/twitter.py"}]} | 1,069 | 96 |
gh_patches_debug_12961 | rasdani/github-patches | git_diff | Lightning-AI__torchmetrics-985 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Calibration error does not support negative logits
Hi there,
It appears that `CalibrationError` currently does not support some of the input data types mentioned in [this table](https://torchmetrics.readthedocs.io/en/stable/pages/classification.html#input-types) from the docs. In particular, it seems to break when fed with
- `preds=logits` where `logits` is a `(N, C)` float32 tensor with potentially negative values.
- `preds=predictions` where `predictions` is a `(N,)` int tensor with the predicted labels.
It still works with softmax-ed logits (`preds=logits.softmax(-1)`) or, generally, with and `(N,)`-dimensional float32 tensors (i.e. the binary input in the input data table mentioned above).
To reproduce:
```python
N, C = 10, 3
targets = torch.randint(C, (N,))
# (N, C) non-negative: works
preds = torch.rand((N, C)) # non-negative
CalibrationError()(preds=preds, target=targets)
# (N, C) potentially negative: fails
preds = torch.randn((N, C)) # potetially negative
CalibrationError()(preds=preds, target=targets)
# (N,) int type: fails
CalibrationError()(preds=targets, target=targets)
# (N,) float type non-negative: works
preds = torch.rand((N,)) # binary non-negative
CalibrationError()(preds=preds, target=targets)
# (N,) float type potentially negative: fails
preds = torch.randn((N,)) # binary potentially negative
CalibrationError()(preds=preds, target=targets)
```
Torchmetrics: v0.8
Pytorch: v1.11
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torchmetrics/functional/classification/calibration_error.py`
Content:
```
1 # Copyright The PyTorch Lightning team.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 from typing import Tuple
15
16 import torch
17 from torch import Tensor
18
19 from torchmetrics.utilities.checks import _input_format_classification
20 from torchmetrics.utilities.enums import DataType
21 from torchmetrics.utilities.imports import _TORCH_GREATER_EQUAL_1_8
22
23
24 def _binning_with_loop(
25 confidences: Tensor, accuracies: Tensor, bin_boundaries: Tensor
26 ) -> Tuple[Tensor, Tensor, Tensor]:
27 """Compute calibration bins using for loops. Use for pytorch < 1.6.
28
29 Args:
30 confidences: The confidence (i.e. predicted prob) of the top1 prediction.
31 accuracies: 1.0 if the top-1 prediction was correct, 0.0 otherwise.
32 bin_boundaries: Bin boundaries separating the ``linspace`` from 0 to 1.
33
34 Returns:
35 tuple with binned accuracy, binned confidence and binned probabilities
36 """
37 conf_bin = torch.zeros_like(bin_boundaries)
38 acc_bin = torch.zeros_like(bin_boundaries)
39 prop_bin = torch.zeros_like(bin_boundaries)
40 for i, (bin_lower, bin_upper) in enumerate(zip(bin_boundaries[:-1], bin_boundaries[1:])):
41 # Calculated confidence and accuracy in each bin
42 in_bin = confidences.gt(bin_lower.item()) * confidences.le(bin_upper.item())
43 prop_in_bin = in_bin.float().mean()
44 if prop_in_bin.item() > 0:
45 acc_bin[i] = accuracies[in_bin].float().mean()
46 conf_bin[i] = confidences[in_bin].mean()
47 prop_bin[i] = prop_in_bin
48 return acc_bin, conf_bin, prop_bin
49
50
51 def _binning_bucketize(
52 confidences: Tensor, accuracies: Tensor, bin_boundaries: Tensor
53 ) -> Tuple[Tensor, Tensor, Tensor]:
54 """Compute calibration bins using ``torch.bucketize``. Use for pytorch >= 1.6.
55
56 Args:
57 confidences: The confidence (i.e. predicted prob) of the top1 prediction.
58 accuracies: 1.0 if the top-1 prediction was correct, 0.0 otherwise.
59 bin_boundaries: Bin boundaries separating the ``linspace`` from 0 to 1.
60
61 Returns:
62 tuple with binned accuracy, binned confidence and binned probabilities
63 """
64 acc_bin = torch.zeros(len(bin_boundaries) - 1, device=confidences.device, dtype=confidences.dtype)
65 conf_bin = torch.zeros(len(bin_boundaries) - 1, device=confidences.device, dtype=confidences.dtype)
66 count_bin = torch.zeros(len(bin_boundaries) - 1, device=confidences.device, dtype=confidences.dtype)
67
68 indices = torch.bucketize(confidences, bin_boundaries) - 1
69
70 count_bin.scatter_add_(dim=0, index=indices, src=torch.ones_like(confidences))
71
72 conf_bin.scatter_add_(dim=0, index=indices, src=confidences)
73 conf_bin = torch.nan_to_num(conf_bin / count_bin)
74
75 acc_bin.scatter_add_(dim=0, index=indices, src=accuracies)
76 acc_bin = torch.nan_to_num(acc_bin / count_bin)
77
78 prop_bin = count_bin / count_bin.sum()
79 return acc_bin, conf_bin, prop_bin
80
81
82 def _ce_compute(
83 confidences: Tensor,
84 accuracies: Tensor,
85 bin_boundaries: Tensor,
86 norm: str = "l1",
87 debias: bool = False,
88 ) -> Tensor:
89 """Computes the calibration error given the provided bin boundaries and norm.
90
91 Args:
92 confidences: The confidence (i.e. predicted prob) of the top1 prediction.
93 accuracies: 1.0 if the top-1 prediction was correct, 0.0 otherwise.
94 bin_boundaries: Bin boundaries separating the ``linspace`` from 0 to 1.
95 norm: Norm function to use when computing calibration error. Defaults to "l1".
96 debias: Apply debiasing to L2 norm computation as in
97 `Verified Uncertainty Calibration`_. Defaults to False.
98
99 Raises:
100 ValueError: If an unsupported norm function is provided.
101
102 Returns:
103 Tensor: Calibration error scalar.
104 """
105 if norm not in {"l1", "l2", "max"}:
106 raise ValueError(f"Norm {norm} is not supported. Please select from l1, l2, or max. ")
107
108 if _TORCH_GREATER_EQUAL_1_8:
109 acc_bin, conf_bin, prop_bin = _binning_bucketize(confidences, accuracies, bin_boundaries)
110 else:
111 acc_bin, conf_bin, prop_bin = _binning_with_loop(confidences, accuracies, bin_boundaries)
112
113 if norm == "l1":
114 ce = torch.sum(torch.abs(acc_bin - conf_bin) * prop_bin)
115 elif norm == "max":
116 ce = torch.max(torch.abs(acc_bin - conf_bin))
117 elif norm == "l2":
118 ce = torch.sum(torch.pow(acc_bin - conf_bin, 2) * prop_bin)
119 # NOTE: debiasing is disabled in the wrapper functions. This implementation differs from that in sklearn.
120 if debias:
121 # the order here (acc_bin - 1 ) vs (1 - acc_bin) is flipped from
122 # the equation in Verified Uncertainty Prediction (Kumar et al 2019)/
123 debias_bins = (acc_bin * (acc_bin - 1) * prop_bin) / (prop_bin * accuracies.size()[0] - 1)
124 ce += torch.sum(torch.nan_to_num(debias_bins)) # replace nans with zeros if nothing appeared in a bin
125 ce = torch.sqrt(ce) if ce > 0 else torch.tensor(0)
126 return ce
127
128
129 def _ce_update(preds: Tensor, target: Tensor) -> Tuple[Tensor, Tensor]:
130 """Given a predictions and targets tensor, computes the confidences of the top-1 prediction and records their
131 correctness.
132
133 Args:
134 preds: Input ``softmaxed`` predictions.
135 target: Labels.
136
137 Raises:
138 ValueError: If the dataset shape is not binary, multiclass, or multidimensional-multiclass.
139
140 Returns:
141 tuple with confidences and accuracies
142 """
143 _, _, mode = _input_format_classification(preds, target)
144
145 if mode == DataType.BINARY:
146 confidences, accuracies = preds, target
147 elif mode == DataType.MULTICLASS:
148 confidences, predictions = preds.max(dim=1)
149 accuracies = predictions.eq(target)
150 elif mode == DataType.MULTIDIM_MULTICLASS:
151 # reshape tensors
152 # for preds, move the class dimension to the final axis and flatten the rest
153 confidences, predictions = torch.transpose(preds, 1, -1).flatten(0, -2).max(dim=1)
154 # for targets, just flatten the target
155 accuracies = predictions.eq(target.flatten())
156 else:
157 raise ValueError(
158 f"Calibration error is not well-defined for data with size {preds.size()} and targets {target.size()}."
159 )
160 # must be cast to float for ddp allgather to work
161 return confidences.float(), accuracies.float()
162
163
164 def calibration_error(preds: Tensor, target: Tensor, n_bins: int = 15, norm: str = "l1") -> Tensor:
165 r"""`Computes the Top-label Calibration Error`_
166
167 Three different norms are implemented, each corresponding to variations on the calibration error metric.
168
169 L1 norm (Expected Calibration Error)
170
171 .. math::
172 \text{ECE} = \sum_i^N b_i \|(p_i - c_i)\|
173
174 Infinity norm (Maximum Calibration Error)
175
176 .. math::
177 \text{MCE} = \max_{i} (p_i - c_i)
178
179 L2 norm (Root Mean Square Calibration Error)
180
181 .. math::
182 \text{RMSCE} = \sqrt{\sum_i^N b_i(p_i - c_i)^2}
183
184 Where :math:`p_i` is the top-1 prediction accuracy in bin :math:`i`,
185 :math:`c_i` is the average confidence of predictions in bin :math:`i`, and
186 :math:`b_i` is the fraction of data points in bin :math:`i`.
187
188 .. note:
189 L2-norm debiasing is not yet supported.
190
191 Args:
192 preds: Model output probabilities.
193 target: Ground-truth target class labels.
194 n_bins: Number of bins to use when computing t.
195 norm: Norm used to compare empirical and expected probability bins.
196 Defaults to "l1", or Expected Calibration Error.
197 """
198 if norm not in ("l1", "l2", "max"):
199 raise ValueError(f"Norm {norm} is not supported. Please select from l1, l2, or max. ")
200
201 if not isinstance(n_bins, int) or n_bins <= 0:
202 raise ValueError(f"Expected argument `n_bins` to be a int larger than 0 but got {n_bins}")
203
204 confidences, accuracies = _ce_update(preds, target)
205
206 bin_boundaries = torch.linspace(0, 1, n_bins + 1, dtype=torch.float, device=preds.device)
207
208 return _ce_compute(confidences, accuracies, bin_boundaries, norm=norm)
209
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/torchmetrics/functional/classification/calibration_error.py b/torchmetrics/functional/classification/calibration_error.py
--- a/torchmetrics/functional/classification/calibration_error.py
+++ b/torchmetrics/functional/classification/calibration_error.py
@@ -143,8 +143,12 @@
_, _, mode = _input_format_classification(preds, target)
if mode == DataType.BINARY:
+ if not ((0 <= preds) * (preds <= 1)).all():
+ preds = preds.sigmoid()
confidences, accuracies = preds, target
elif mode == DataType.MULTICLASS:
+ if not ((0 <= preds) * (preds <= 1)).all():
+ preds = preds.softmax(dim=1)
confidences, predictions = preds.max(dim=1)
accuracies = predictions.eq(target)
elif mode == DataType.MULTIDIM_MULTICLASS:
| {"golden_diff": "diff --git a/torchmetrics/functional/classification/calibration_error.py b/torchmetrics/functional/classification/calibration_error.py\n--- a/torchmetrics/functional/classification/calibration_error.py\n+++ b/torchmetrics/functional/classification/calibration_error.py\n@@ -143,8 +143,12 @@\n _, _, mode = _input_format_classification(preds, target)\n \n if mode == DataType.BINARY:\n+ if not ((0 <= preds) * (preds <= 1)).all():\n+ preds = preds.sigmoid()\n confidences, accuracies = preds, target\n elif mode == DataType.MULTICLASS:\n+ if not ((0 <= preds) * (preds <= 1)).all():\n+ preds = preds.softmax(dim=1)\n confidences, predictions = preds.max(dim=1)\n accuracies = predictions.eq(target)\n elif mode == DataType.MULTIDIM_MULTICLASS:\n", "issue": "Calibration error does not support negative logits\nHi there,\r\n\r\nIt appears that `CalibrationError` currently does not support some of the input data types mentioned in [this table](https://torchmetrics.readthedocs.io/en/stable/pages/classification.html#input-types) from the docs. In particular, it seems to break when fed with \r\n\r\n- `preds=logits` where `logits` is a `(N, C)` float32 tensor with potentially negative values.\r\n- `preds=predictions` where `predictions` is a `(N,)` int tensor with the predicted labels.\r\n\r\nIt still works with softmax-ed logits (`preds=logits.softmax(-1)`) or, generally, with and `(N,)`-dimensional float32 tensors (i.e. the binary input in the input data table mentioned above).\r\n\r\nTo reproduce:\r\n\r\n```python\r\nN, C = 10, 3\r\ntargets = torch.randint(C, (N,))\r\n\r\n# (N, C) non-negative: works\r\npreds = torch.rand((N, C)) # non-negative\r\nCalibrationError()(preds=preds, target=targets)\r\n\r\n# (N, C) potentially negative: fails\r\npreds = torch.randn((N, C)) # potetially negative\r\nCalibrationError()(preds=preds, target=targets)\r\n\r\n# (N,) int type: fails\r\nCalibrationError()(preds=targets, target=targets)\r\n\r\n# (N,) float type non-negative: works\r\npreds = torch.rand((N,)) # binary non-negative\r\nCalibrationError()(preds=preds, target=targets)\r\n\r\n# (N,) float type potentially negative: fails\r\npreds = torch.randn((N,)) # binary potentially negative\r\nCalibrationError()(preds=preds, target=targets)\r\n```\r\n\r\nTorchmetrics: v0.8\r\nPytorch: v1.11\n", "before_files": [{"content": "# Copyright The PyTorch Lightning team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Tuple\n\nimport torch\nfrom torch import Tensor\n\nfrom torchmetrics.utilities.checks import _input_format_classification\nfrom torchmetrics.utilities.enums import DataType\nfrom torchmetrics.utilities.imports import _TORCH_GREATER_EQUAL_1_8\n\n\ndef _binning_with_loop(\n confidences: Tensor, accuracies: Tensor, bin_boundaries: Tensor\n) -> Tuple[Tensor, Tensor, Tensor]:\n \"\"\"Compute calibration bins using for loops. Use for pytorch < 1.6.\n\n Args:\n confidences: The confidence (i.e. predicted prob) of the top1 prediction.\n accuracies: 1.0 if the top-1 prediction was correct, 0.0 otherwise.\n bin_boundaries: Bin boundaries separating the ``linspace`` from 0 to 1.\n\n Returns:\n tuple with binned accuracy, binned confidence and binned probabilities\n \"\"\"\n conf_bin = torch.zeros_like(bin_boundaries)\n acc_bin = torch.zeros_like(bin_boundaries)\n prop_bin = torch.zeros_like(bin_boundaries)\n for i, (bin_lower, bin_upper) in enumerate(zip(bin_boundaries[:-1], bin_boundaries[1:])):\n # Calculated confidence and accuracy in each bin\n in_bin = confidences.gt(bin_lower.item()) * confidences.le(bin_upper.item())\n prop_in_bin = in_bin.float().mean()\n if prop_in_bin.item() > 0:\n acc_bin[i] = accuracies[in_bin].float().mean()\n conf_bin[i] = confidences[in_bin].mean()\n prop_bin[i] = prop_in_bin\n return acc_bin, conf_bin, prop_bin\n\n\ndef _binning_bucketize(\n confidences: Tensor, accuracies: Tensor, bin_boundaries: Tensor\n) -> Tuple[Tensor, Tensor, Tensor]:\n \"\"\"Compute calibration bins using ``torch.bucketize``. Use for pytorch >= 1.6.\n\n Args:\n confidences: The confidence (i.e. predicted prob) of the top1 prediction.\n accuracies: 1.0 if the top-1 prediction was correct, 0.0 otherwise.\n bin_boundaries: Bin boundaries separating the ``linspace`` from 0 to 1.\n\n Returns:\n tuple with binned accuracy, binned confidence and binned probabilities\n \"\"\"\n acc_bin = torch.zeros(len(bin_boundaries) - 1, device=confidences.device, dtype=confidences.dtype)\n conf_bin = torch.zeros(len(bin_boundaries) - 1, device=confidences.device, dtype=confidences.dtype)\n count_bin = torch.zeros(len(bin_boundaries) - 1, device=confidences.device, dtype=confidences.dtype)\n\n indices = torch.bucketize(confidences, bin_boundaries) - 1\n\n count_bin.scatter_add_(dim=0, index=indices, src=torch.ones_like(confidences))\n\n conf_bin.scatter_add_(dim=0, index=indices, src=confidences)\n conf_bin = torch.nan_to_num(conf_bin / count_bin)\n\n acc_bin.scatter_add_(dim=0, index=indices, src=accuracies)\n acc_bin = torch.nan_to_num(acc_bin / count_bin)\n\n prop_bin = count_bin / count_bin.sum()\n return acc_bin, conf_bin, prop_bin\n\n\ndef _ce_compute(\n confidences: Tensor,\n accuracies: Tensor,\n bin_boundaries: Tensor,\n norm: str = \"l1\",\n debias: bool = False,\n) -> Tensor:\n \"\"\"Computes the calibration error given the provided bin boundaries and norm.\n\n Args:\n confidences: The confidence (i.e. predicted prob) of the top1 prediction.\n accuracies: 1.0 if the top-1 prediction was correct, 0.0 otherwise.\n bin_boundaries: Bin boundaries separating the ``linspace`` from 0 to 1.\n norm: Norm function to use when computing calibration error. Defaults to \"l1\".\n debias: Apply debiasing to L2 norm computation as in\n `Verified Uncertainty Calibration`_. Defaults to False.\n\n Raises:\n ValueError: If an unsupported norm function is provided.\n\n Returns:\n Tensor: Calibration error scalar.\n \"\"\"\n if norm not in {\"l1\", \"l2\", \"max\"}:\n raise ValueError(f\"Norm {norm} is not supported. Please select from l1, l2, or max. \")\n\n if _TORCH_GREATER_EQUAL_1_8:\n acc_bin, conf_bin, prop_bin = _binning_bucketize(confidences, accuracies, bin_boundaries)\n else:\n acc_bin, conf_bin, prop_bin = _binning_with_loop(confidences, accuracies, bin_boundaries)\n\n if norm == \"l1\":\n ce = torch.sum(torch.abs(acc_bin - conf_bin) * prop_bin)\n elif norm == \"max\":\n ce = torch.max(torch.abs(acc_bin - conf_bin))\n elif norm == \"l2\":\n ce = torch.sum(torch.pow(acc_bin - conf_bin, 2) * prop_bin)\n # NOTE: debiasing is disabled in the wrapper functions. This implementation differs from that in sklearn.\n if debias:\n # the order here (acc_bin - 1 ) vs (1 - acc_bin) is flipped from\n # the equation in Verified Uncertainty Prediction (Kumar et al 2019)/\n debias_bins = (acc_bin * (acc_bin - 1) * prop_bin) / (prop_bin * accuracies.size()[0] - 1)\n ce += torch.sum(torch.nan_to_num(debias_bins)) # replace nans with zeros if nothing appeared in a bin\n ce = torch.sqrt(ce) if ce > 0 else torch.tensor(0)\n return ce\n\n\ndef _ce_update(preds: Tensor, target: Tensor) -> Tuple[Tensor, Tensor]:\n \"\"\"Given a predictions and targets tensor, computes the confidences of the top-1 prediction and records their\n correctness.\n\n Args:\n preds: Input ``softmaxed`` predictions.\n target: Labels.\n\n Raises:\n ValueError: If the dataset shape is not binary, multiclass, or multidimensional-multiclass.\n\n Returns:\n tuple with confidences and accuracies\n \"\"\"\n _, _, mode = _input_format_classification(preds, target)\n\n if mode == DataType.BINARY:\n confidences, accuracies = preds, target\n elif mode == DataType.MULTICLASS:\n confidences, predictions = preds.max(dim=1)\n accuracies = predictions.eq(target)\n elif mode == DataType.MULTIDIM_MULTICLASS:\n # reshape tensors\n # for preds, move the class dimension to the final axis and flatten the rest\n confidences, predictions = torch.transpose(preds, 1, -1).flatten(0, -2).max(dim=1)\n # for targets, just flatten the target\n accuracies = predictions.eq(target.flatten())\n else:\n raise ValueError(\n f\"Calibration error is not well-defined for data with size {preds.size()} and targets {target.size()}.\"\n )\n # must be cast to float for ddp allgather to work\n return confidences.float(), accuracies.float()\n\n\ndef calibration_error(preds: Tensor, target: Tensor, n_bins: int = 15, norm: str = \"l1\") -> Tensor:\n r\"\"\"`Computes the Top-label Calibration Error`_\n\n Three different norms are implemented, each corresponding to variations on the calibration error metric.\n\n L1 norm (Expected Calibration Error)\n\n .. math::\n \\text{ECE} = \\sum_i^N b_i \\|(p_i - c_i)\\|\n\n Infinity norm (Maximum Calibration Error)\n\n .. math::\n \\text{MCE} = \\max_{i} (p_i - c_i)\n\n L2 norm (Root Mean Square Calibration Error)\n\n .. math::\n \\text{RMSCE} = \\sqrt{\\sum_i^N b_i(p_i - c_i)^2}\n\n Where :math:`p_i` is the top-1 prediction accuracy in bin :math:`i`,\n :math:`c_i` is the average confidence of predictions in bin :math:`i`, and\n :math:`b_i` is the fraction of data points in bin :math:`i`.\n\n .. note:\n L2-norm debiasing is not yet supported.\n\n Args:\n preds: Model output probabilities.\n target: Ground-truth target class labels.\n n_bins: Number of bins to use when computing t.\n norm: Norm used to compare empirical and expected probability bins.\n Defaults to \"l1\", or Expected Calibration Error.\n \"\"\"\n if norm not in (\"l1\", \"l2\", \"max\"):\n raise ValueError(f\"Norm {norm} is not supported. Please select from l1, l2, or max. \")\n\n if not isinstance(n_bins, int) or n_bins <= 0:\n raise ValueError(f\"Expected argument `n_bins` to be a int larger than 0 but got {n_bins}\")\n\n confidences, accuracies = _ce_update(preds, target)\n\n bin_boundaries = torch.linspace(0, 1, n_bins + 1, dtype=torch.float, device=preds.device)\n\n return _ce_compute(confidences, accuracies, bin_boundaries, norm=norm)\n", "path": "torchmetrics/functional/classification/calibration_error.py"}], "after_files": [{"content": "# Copyright The PyTorch Lightning team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Tuple\n\nimport torch\nfrom torch import Tensor\n\nfrom torchmetrics.utilities.checks import _input_format_classification\nfrom torchmetrics.utilities.enums import DataType\nfrom torchmetrics.utilities.imports import _TORCH_GREATER_EQUAL_1_8\n\n\ndef _binning_with_loop(\n confidences: Tensor, accuracies: Tensor, bin_boundaries: Tensor\n) -> Tuple[Tensor, Tensor, Tensor]:\n \"\"\"Compute calibration bins using for loops. Use for pytorch < 1.6.\n\n Args:\n confidences: The confidence (i.e. predicted prob) of the top1 prediction.\n accuracies: 1.0 if the top-1 prediction was correct, 0.0 otherwise.\n bin_boundaries: Bin boundaries separating the ``linspace`` from 0 to 1.\n\n Returns:\n tuple with binned accuracy, binned confidence and binned probabilities\n \"\"\"\n conf_bin = torch.zeros_like(bin_boundaries)\n acc_bin = torch.zeros_like(bin_boundaries)\n prop_bin = torch.zeros_like(bin_boundaries)\n for i, (bin_lower, bin_upper) in enumerate(zip(bin_boundaries[:-1], bin_boundaries[1:])):\n # Calculated confidence and accuracy in each bin\n in_bin = confidences.gt(bin_lower.item()) * confidences.le(bin_upper.item())\n prop_in_bin = in_bin.float().mean()\n if prop_in_bin.item() > 0:\n acc_bin[i] = accuracies[in_bin].float().mean()\n conf_bin[i] = confidences[in_bin].mean()\n prop_bin[i] = prop_in_bin\n return acc_bin, conf_bin, prop_bin\n\n\ndef _binning_bucketize(\n confidences: Tensor, accuracies: Tensor, bin_boundaries: Tensor\n) -> Tuple[Tensor, Tensor, Tensor]:\n \"\"\"Compute calibration bins using ``torch.bucketize``. Use for pytorch >= 1.6.\n\n Args:\n confidences: The confidence (i.e. predicted prob) of the top1 prediction.\n accuracies: 1.0 if the top-1 prediction was correct, 0.0 otherwise.\n bin_boundaries: Bin boundaries separating the ``linspace`` from 0 to 1.\n\n Returns:\n tuple with binned accuracy, binned confidence and binned probabilities\n \"\"\"\n acc_bin = torch.zeros(len(bin_boundaries) - 1, device=confidences.device, dtype=confidences.dtype)\n conf_bin = torch.zeros(len(bin_boundaries) - 1, device=confidences.device, dtype=confidences.dtype)\n count_bin = torch.zeros(len(bin_boundaries) - 1, device=confidences.device, dtype=confidences.dtype)\n\n indices = torch.bucketize(confidences, bin_boundaries) - 1\n\n count_bin.scatter_add_(dim=0, index=indices, src=torch.ones_like(confidences))\n\n conf_bin.scatter_add_(dim=0, index=indices, src=confidences)\n conf_bin = torch.nan_to_num(conf_bin / count_bin)\n\n acc_bin.scatter_add_(dim=0, index=indices, src=accuracies)\n acc_bin = torch.nan_to_num(acc_bin / count_bin)\n\n prop_bin = count_bin / count_bin.sum()\n return acc_bin, conf_bin, prop_bin\n\n\ndef _ce_compute(\n confidences: Tensor,\n accuracies: Tensor,\n bin_boundaries: Tensor,\n norm: str = \"l1\",\n debias: bool = False,\n) -> Tensor:\n \"\"\"Computes the calibration error given the provided bin boundaries and norm.\n\n Args:\n confidences: The confidence (i.e. predicted prob) of the top1 prediction.\n accuracies: 1.0 if the top-1 prediction was correct, 0.0 otherwise.\n bin_boundaries: Bin boundaries separating the ``linspace`` from 0 to 1.\n norm: Norm function to use when computing calibration error. Defaults to \"l1\".\n debias: Apply debiasing to L2 norm computation as in\n `Verified Uncertainty Calibration`_. Defaults to False.\n\n Raises:\n ValueError: If an unsupported norm function is provided.\n\n Returns:\n Tensor: Calibration error scalar.\n \"\"\"\n if norm not in {\"l1\", \"l2\", \"max\"}:\n raise ValueError(f\"Norm {norm} is not supported. Please select from l1, l2, or max. \")\n\n if _TORCH_GREATER_EQUAL_1_8:\n acc_bin, conf_bin, prop_bin = _binning_bucketize(confidences, accuracies, bin_boundaries)\n else:\n acc_bin, conf_bin, prop_bin = _binning_with_loop(confidences, accuracies, bin_boundaries)\n\n if norm == \"l1\":\n ce = torch.sum(torch.abs(acc_bin - conf_bin) * prop_bin)\n elif norm == \"max\":\n ce = torch.max(torch.abs(acc_bin - conf_bin))\n elif norm == \"l2\":\n ce = torch.sum(torch.pow(acc_bin - conf_bin, 2) * prop_bin)\n # NOTE: debiasing is disabled in the wrapper functions. This implementation differs from that in sklearn.\n if debias:\n # the order here (acc_bin - 1 ) vs (1 - acc_bin) is flipped from\n # the equation in Verified Uncertainty Prediction (Kumar et al 2019)/\n debias_bins = (acc_bin * (acc_bin - 1) * prop_bin) / (prop_bin * accuracies.size()[0] - 1)\n ce += torch.sum(torch.nan_to_num(debias_bins)) # replace nans with zeros if nothing appeared in a bin\n ce = torch.sqrt(ce) if ce > 0 else torch.tensor(0)\n return ce\n\n\ndef _ce_update(preds: Tensor, target: Tensor) -> Tuple[Tensor, Tensor]:\n \"\"\"Given a predictions and targets tensor, computes the confidences of the top-1 prediction and records their\n correctness.\n\n Args:\n preds: Input ``softmaxed`` predictions.\n target: Labels.\n\n Raises:\n ValueError: If the dataset shape is not binary, multiclass, or multidimensional-multiclass.\n\n Returns:\n tuple with confidences and accuracies\n \"\"\"\n _, _, mode = _input_format_classification(preds, target)\n\n if mode == DataType.BINARY:\n if not ((0 <= preds) * (preds <= 1)).all():\n preds = preds.sigmoid()\n confidences, accuracies = preds, target\n elif mode == DataType.MULTICLASS:\n if not ((0 <= preds) * (preds <= 1)).all():\n preds = preds.softmax(dim=1)\n confidences, predictions = preds.max(dim=1)\n accuracies = predictions.eq(target)\n elif mode == DataType.MULTIDIM_MULTICLASS:\n # reshape tensors\n # for preds, move the class dimension to the final axis and flatten the rest\n confidences, predictions = torch.transpose(preds, 1, -1).flatten(0, -2).max(dim=1)\n # for targets, just flatten the target\n accuracies = predictions.eq(target.flatten())\n else:\n raise ValueError(\n f\"Calibration error is not well-defined for data with size {preds.size()} and targets {target.size()}.\"\n )\n # must be cast to float for ddp allgather to work\n return confidences.float(), accuracies.float()\n\n\ndef calibration_error(preds: Tensor, target: Tensor, n_bins: int = 15, norm: str = \"l1\") -> Tensor:\n r\"\"\"`Computes the Top-label Calibration Error`_\n\n Three different norms are implemented, each corresponding to variations on the calibration error metric.\n\n L1 norm (Expected Calibration Error)\n\n .. math::\n \\text{ECE} = \\sum_i^N b_i \\|(p_i - c_i)\\|\n\n Infinity norm (Maximum Calibration Error)\n\n .. math::\n \\text{MCE} = \\max_{i} (p_i - c_i)\n\n L2 norm (Root Mean Square Calibration Error)\n\n .. math::\n \\text{RMSCE} = \\sqrt{\\sum_i^N b_i(p_i - c_i)^2}\n\n Where :math:`p_i` is the top-1 prediction accuracy in bin :math:`i`,\n :math:`c_i` is the average confidence of predictions in bin :math:`i`, and\n :math:`b_i` is the fraction of data points in bin :math:`i`.\n\n .. note:\n L2-norm debiasing is not yet supported.\n\n Args:\n preds: Model output probabilities.\n target: Ground-truth target class labels.\n n_bins: Number of bins to use when computing t.\n norm: Norm used to compare empirical and expected probability bins.\n Defaults to \"l1\", or Expected Calibration Error.\n \"\"\"\n if norm not in (\"l1\", \"l2\", \"max\"):\n raise ValueError(f\"Norm {norm} is not supported. Please select from l1, l2, or max. \")\n\n if not isinstance(n_bins, int) or n_bins <= 0:\n raise ValueError(f\"Expected argument `n_bins` to be a int larger than 0 but got {n_bins}\")\n\n confidences, accuracies = _ce_update(preds, target)\n\n bin_boundaries = torch.linspace(0, 1, n_bins + 1, dtype=torch.float, device=preds.device)\n\n return _ce_compute(confidences, accuracies, bin_boundaries, norm=norm)\n", "path": "torchmetrics/functional/classification/calibration_error.py"}]} | 3,359 | 203 |
gh_patches_debug_2788 | rasdani/github-patches | git_diff | lutris__lutris-4038 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bottom panel switches to a different game when it stops
If another game is running and you switch to a different one, and the first game is closed by itself (like quitting it manually, closing the game window), not through Lutris, the bottom panel will switch to that stopped game all by itself, without user's interaction:

It should be noted that so far, only I can reproduce this, for some bizarre reason.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lutris/gui/widgets/game_bar.py`
Content:
```
1 from datetime import datetime
2 from gettext import gettext as _
3
4 from gi.repository import GObject, Gtk, Pango
5
6 from lutris import runners, services
7 from lutris.database.games import get_game_by_field, get_game_for_service
8 from lutris.game import Game
9 from lutris.gui.widgets.utils import get_link_button
10 from lutris.util.strings import gtk_safe
11
12
13 class GameBar(Gtk.Box):
14 def __init__(self, db_game, game_actions, application):
15 """Create the game bar with a database row"""
16 super().__init__(orientation=Gtk.Orientation.VERTICAL, visible=True,
17 margin_top=12,
18 margin_left=12,
19 margin_bottom=12,
20 margin_right=12,
21 spacing=6)
22 GObject.add_emission_hook(Game, "game-start", self.on_game_state_changed)
23 GObject.add_emission_hook(Game, "game-started", self.on_game_state_changed)
24 GObject.add_emission_hook(Game, "game-stopped", self.on_game_state_changed)
25 GObject.add_emission_hook(Game, "game-updated", self.on_game_state_changed)
26 GObject.add_emission_hook(Game, "game-removed", self.on_game_state_changed)
27 GObject.add_emission_hook(Game, "game-installed", self.on_game_state_changed)
28
29 self.set_margin_bottom(12)
30 self.game_actions = game_actions
31 self.db_game = db_game
32 self.service = None
33 if db_game.get("service"):
34 try:
35 self.service = services.SERVICES[db_game["service"]]()
36 except KeyError:
37 pass
38
39 game_id = None
40 if "service_id" in db_game:
41 self.appid = db_game["service_id"]
42 game_id = db_game["id"]
43 elif self.service:
44 self.appid = db_game["appid"]
45 if self.service.id == "lutris":
46 game = get_game_by_field(self.appid, field="slug")
47 else:
48 game = get_game_for_service(self.service.id, self.appid)
49 if game:
50 game_id = game["id"]
51 if game_id:
52 self.game = application.get_game_by_id(game_id) or Game(game_id)
53 else:
54 self.game = Game()
55 self.game.name = db_game["name"]
56 self.game.slug = db_game["slug"]
57 self.game.appid = self.appid
58 self.game.service = self.service.id if self.service else None
59 game_actions.set_game(self.game)
60 self.update_view()
61
62 def clear_view(self):
63 """Clears all widgets from the container"""
64 for child in self.get_children():
65 child.destroy()
66
67 def update_view(self):
68 """Populate the view with widgets"""
69 game_label = self.get_game_name_label()
70 game_label.set_halign(Gtk.Align.START)
71 self.pack_start(game_label, False, False, 0)
72
73 hbox = Gtk.Box(Gtk.Orientation.HORIZONTAL, spacing=6)
74 self.pack_start(hbox, False, False, 0)
75
76 self.play_button = self.get_play_button()
77 hbox.pack_start(self.play_button, False, False, 0)
78
79 if self.game.is_installed:
80 hbox.pack_start(self.get_runner_button(), False, False, 0)
81 hbox.pack_start(self.get_platform_label(), False, False, 0)
82 if self.game.lastplayed:
83 hbox.pack_start(self.get_last_played_label(), False, False, 0)
84 if self.game.playtime:
85 hbox.pack_start(self.get_playtime_label(), False, False, 0)
86 hbox.show_all()
87
88 def get_popover(self, buttons, parent):
89 """Return the popover widget containing a list of link buttons"""
90 if not buttons:
91 return None
92 popover = Gtk.Popover()
93 vbox = Gtk.Box(orientation=Gtk.Orientation.VERTICAL, visible=True)
94
95 for action in buttons:
96 vbox.pack_end(buttons[action], False, False, 1)
97 popover.add(vbox)
98 popover.set_position(Gtk.PositionType.TOP)
99 popover.set_constrain_to(Gtk.PopoverConstraint.NONE)
100 popover.set_relative_to(parent)
101 return popover
102
103 def get_game_name_label(self):
104 """Return the label with the game's title"""
105 title_label = Gtk.Label(visible=True)
106 title_label.set_ellipsize(Pango.EllipsizeMode.END)
107 title_label.set_markup("<span font_desc='16'><b>%s</b></span>" % gtk_safe(self.game.name))
108 return title_label
109
110 def get_runner_button(self):
111 icon_name = self.game.runner.name + "-symbolic"
112 runner_icon = Gtk.Image.new_from_icon_name(icon_name, Gtk.IconSize.MENU)
113 runner_icon.show()
114 box = Gtk.HBox(visible=True)
115 runner_button = Gtk.Button(visible=True)
116 popover = self.get_popover(self.get_runner_buttons(), runner_button)
117 if popover:
118 runner_button.set_image(runner_icon)
119 popover_button = Gtk.MenuButton(visible=True)
120 popover_button.set_size_request(32, 32)
121 popover_button.props.direction = Gtk.ArrowType.UP
122 popover_button.set_popover(popover)
123 runner_button.connect("clicked", lambda _x: popover_button.emit("clicked"))
124 box.add(runner_button)
125 box.add(popover_button)
126 style_context = box.get_style_context()
127 style_context.add_class("linked")
128 else:
129 runner_icon.set_margin_left(49)
130 runner_icon.set_margin_right(6)
131 box.add(runner_icon)
132 return box
133
134 def get_platform_label(self):
135 platform_label = Gtk.Label(visible=True)
136 platform_label.set_size_request(120, -1)
137 platform_label.set_alignment(0, 0.5)
138 platform = gtk_safe(self.game.platform)
139 platform_label.set_tooltip_markup(platform)
140 platform_label.set_markup(_("Platform:\n<b>%s</b>") % platform)
141 platform_label.set_property("ellipsize", Pango.EllipsizeMode.END)
142 return platform_label
143
144 def get_playtime_label(self):
145 """Return the label containing the playtime info"""
146 playtime_label = Gtk.Label(visible=True)
147 playtime_label.set_size_request(120, -1)
148 playtime_label.set_alignment(0, 0.5)
149 playtime_label.set_markup(_("Time played:\n<b>%s</b>") % self.game.formatted_playtime)
150 return playtime_label
151
152 def get_last_played_label(self):
153 """Return the label containing the last played info"""
154 last_played_label = Gtk.Label(visible=True)
155 last_played_label.set_size_request(120, -1)
156 last_played_label.set_alignment(0, 0.5)
157 lastplayed = datetime.fromtimestamp(self.game.lastplayed)
158 last_played_label.set_markup(_("Last played:\n<b>%s</b>") % lastplayed.strftime("%x"))
159 return last_played_label
160
161 def get_popover_button(self):
162 """Return the popover button+menu for the Play button"""
163 popover_button = Gtk.MenuButton(visible=True)
164 popover_button.set_size_request(32, 32)
165 popover_button.props.direction = Gtk.ArrowType.UP
166
167 return popover_button
168
169 def get_popover_box(self):
170 """Return a container for a button + a popover button attached to it"""
171 box = Gtk.HBox(visible=True)
172 style_context = box.get_style_context()
173 style_context.add_class("linked")
174 return box
175
176 def get_locate_installed_game_button(self):
177 """Return a button to locate an existing install"""
178 button = get_link_button("Locate installed game")
179 button.show()
180 button.connect("clicked", self.game_actions.on_locate_installed_game, self.game)
181 return {"locate": button}
182
183 def get_play_button(self):
184 """Return the widget for install/play/stop and game config"""
185 button = Gtk.Button(visible=True)
186 button.set_size_request(120, 32)
187 box = self.get_popover_box()
188 popover_button = self.get_popover_button()
189 if self.game.is_installed:
190 if self.game.state == self.game.STATE_STOPPED:
191 button.set_label(_("Play"))
192 button.connect("clicked", self.game_actions.on_game_launch)
193 elif self.game.state == self.game.STATE_LAUNCHING:
194 button.set_label(_("Launching"))
195 button.set_sensitive(False)
196 else:
197 button.set_label(_("Stop"))
198 button.connect("clicked", self.game_actions.on_game_stop)
199 else:
200 button.set_label(_("Install"))
201 button.connect("clicked", self.game_actions.on_install_clicked)
202 if self.service:
203 if self.service.local:
204 # Local services don't show an install dialog, they can be launched directly
205 button.set_label(_("Play"))
206 if self.service.drm_free:
207 button.set_size_request(84, 32)
208 box.add(button)
209 popover = self.get_popover(self.get_locate_installed_game_button(), popover_button)
210 popover_button.set_popover(popover)
211 box.add(popover_button)
212 return box
213 return button
214 button.set_size_request(84, 32)
215 box.add(button)
216 popover = self.get_popover(self.get_game_buttons(), popover_button)
217 popover_button.set_popover(popover)
218 box.add(popover_button)
219 return box
220
221 def get_game_buttons(self):
222 """Return a dictionary of buttons to use in the panel"""
223 displayed = self.game_actions.get_displayed_entries()
224 buttons = {}
225 for action in self.game_actions.get_game_actions():
226 action_id, label, callback = action
227 if action_id in ("play", "stop", "install"):
228 continue
229 button = get_link_button(label)
230 if displayed.get(action_id):
231 button.show()
232 else:
233 button.hide()
234 buttons[action_id] = button
235 button.connect("clicked", self.on_link_button_clicked, callback)
236 return buttons
237
238 def get_runner_buttons(self):
239 buttons = {}
240 if self.game.runner_name and self.game.is_installed:
241 runner = runners.import_runner(self.game.runner_name)(self.game.config)
242 for entry in runner.context_menu_entries:
243 name, label, callback = entry
244 button = get_link_button(label)
245 button.show()
246 button.connect("clicked", self.on_link_button_clicked, callback)
247 buttons[name] = button
248 return buttons
249
250 def on_link_button_clicked(self, button, callback):
251 """Callback for link buttons. Closes the popover then runs the actual action"""
252 popover = button.get_parent().get_parent()
253 popover.popdown()
254 callback(button)
255
256 def on_install_clicked(self, button):
257 """Handler for installing service games"""
258 self.service.install(self.db_game)
259
260 def on_game_state_changed(self, game):
261 """Handler called when the game has changed state"""
262 if (
263 game.id == self.game.id
264 or game.appid == self.appid
265 ):
266 self.game = game
267 else:
268 return True
269 self.clear_view()
270 self.update_view()
271 return True
272
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lutris/gui/widgets/game_bar.py b/lutris/gui/widgets/game_bar.py
--- a/lutris/gui/widgets/game_bar.py
+++ b/lutris/gui/widgets/game_bar.py
@@ -261,7 +261,7 @@
"""Handler called when the game has changed state"""
if (
game.id == self.game.id
- or game.appid == self.appid
+ or (self.appid and game.appid == self.appid)
):
self.game = game
else:
| {"golden_diff": "diff --git a/lutris/gui/widgets/game_bar.py b/lutris/gui/widgets/game_bar.py\n--- a/lutris/gui/widgets/game_bar.py\n+++ b/lutris/gui/widgets/game_bar.py\n@@ -261,7 +261,7 @@\n \"\"\"Handler called when the game has changed state\"\"\"\n if (\n game.id == self.game.id\n- or game.appid == self.appid\n+ or (self.appid and game.appid == self.appid)\n ):\n self.game = game\n else:\n", "issue": "Bottom panel switches to a different game when it stops\nIf another game is running and you switch to a different one, and the first game is closed by itself (like quitting it manually, closing the game window), not through Lutris, the bottom panel will switch to that stopped game all by itself, without user's interaction:\r\n\r\n\r\nIt should be noted that so far, only I can reproduce this, for some bizarre reason.\r\n\n", "before_files": [{"content": "from datetime import datetime\nfrom gettext import gettext as _\n\nfrom gi.repository import GObject, Gtk, Pango\n\nfrom lutris import runners, services\nfrom lutris.database.games import get_game_by_field, get_game_for_service\nfrom lutris.game import Game\nfrom lutris.gui.widgets.utils import get_link_button\nfrom lutris.util.strings import gtk_safe\n\n\nclass GameBar(Gtk.Box):\n def __init__(self, db_game, game_actions, application):\n \"\"\"Create the game bar with a database row\"\"\"\n super().__init__(orientation=Gtk.Orientation.VERTICAL, visible=True,\n margin_top=12,\n margin_left=12,\n margin_bottom=12,\n margin_right=12,\n spacing=6)\n GObject.add_emission_hook(Game, \"game-start\", self.on_game_state_changed)\n GObject.add_emission_hook(Game, \"game-started\", self.on_game_state_changed)\n GObject.add_emission_hook(Game, \"game-stopped\", self.on_game_state_changed)\n GObject.add_emission_hook(Game, \"game-updated\", self.on_game_state_changed)\n GObject.add_emission_hook(Game, \"game-removed\", self.on_game_state_changed)\n GObject.add_emission_hook(Game, \"game-installed\", self.on_game_state_changed)\n\n self.set_margin_bottom(12)\n self.game_actions = game_actions\n self.db_game = db_game\n self.service = None\n if db_game.get(\"service\"):\n try:\n self.service = services.SERVICES[db_game[\"service\"]]()\n except KeyError:\n pass\n\n game_id = None\n if \"service_id\" in db_game:\n self.appid = db_game[\"service_id\"]\n game_id = db_game[\"id\"]\n elif self.service:\n self.appid = db_game[\"appid\"]\n if self.service.id == \"lutris\":\n game = get_game_by_field(self.appid, field=\"slug\")\n else:\n game = get_game_for_service(self.service.id, self.appid)\n if game:\n game_id = game[\"id\"]\n if game_id:\n self.game = application.get_game_by_id(game_id) or Game(game_id)\n else:\n self.game = Game()\n self.game.name = db_game[\"name\"]\n self.game.slug = db_game[\"slug\"]\n self.game.appid = self.appid\n self.game.service = self.service.id if self.service else None\n game_actions.set_game(self.game)\n self.update_view()\n\n def clear_view(self):\n \"\"\"Clears all widgets from the container\"\"\"\n for child in self.get_children():\n child.destroy()\n\n def update_view(self):\n \"\"\"Populate the view with widgets\"\"\"\n game_label = self.get_game_name_label()\n game_label.set_halign(Gtk.Align.START)\n self.pack_start(game_label, False, False, 0)\n\n hbox = Gtk.Box(Gtk.Orientation.HORIZONTAL, spacing=6)\n self.pack_start(hbox, False, False, 0)\n\n self.play_button = self.get_play_button()\n hbox.pack_start(self.play_button, False, False, 0)\n\n if self.game.is_installed:\n hbox.pack_start(self.get_runner_button(), False, False, 0)\n hbox.pack_start(self.get_platform_label(), False, False, 0)\n if self.game.lastplayed:\n hbox.pack_start(self.get_last_played_label(), False, False, 0)\n if self.game.playtime:\n hbox.pack_start(self.get_playtime_label(), False, False, 0)\n hbox.show_all()\n\n def get_popover(self, buttons, parent):\n \"\"\"Return the popover widget containing a list of link buttons\"\"\"\n if not buttons:\n return None\n popover = Gtk.Popover()\n vbox = Gtk.Box(orientation=Gtk.Orientation.VERTICAL, visible=True)\n\n for action in buttons:\n vbox.pack_end(buttons[action], False, False, 1)\n popover.add(vbox)\n popover.set_position(Gtk.PositionType.TOP)\n popover.set_constrain_to(Gtk.PopoverConstraint.NONE)\n popover.set_relative_to(parent)\n return popover\n\n def get_game_name_label(self):\n \"\"\"Return the label with the game's title\"\"\"\n title_label = Gtk.Label(visible=True)\n title_label.set_ellipsize(Pango.EllipsizeMode.END)\n title_label.set_markup(\"<span font_desc='16'><b>%s</b></span>\" % gtk_safe(self.game.name))\n return title_label\n\n def get_runner_button(self):\n icon_name = self.game.runner.name + \"-symbolic\"\n runner_icon = Gtk.Image.new_from_icon_name(icon_name, Gtk.IconSize.MENU)\n runner_icon.show()\n box = Gtk.HBox(visible=True)\n runner_button = Gtk.Button(visible=True)\n popover = self.get_popover(self.get_runner_buttons(), runner_button)\n if popover:\n runner_button.set_image(runner_icon)\n popover_button = Gtk.MenuButton(visible=True)\n popover_button.set_size_request(32, 32)\n popover_button.props.direction = Gtk.ArrowType.UP\n popover_button.set_popover(popover)\n runner_button.connect(\"clicked\", lambda _x: popover_button.emit(\"clicked\"))\n box.add(runner_button)\n box.add(popover_button)\n style_context = box.get_style_context()\n style_context.add_class(\"linked\")\n else:\n runner_icon.set_margin_left(49)\n runner_icon.set_margin_right(6)\n box.add(runner_icon)\n return box\n\n def get_platform_label(self):\n platform_label = Gtk.Label(visible=True)\n platform_label.set_size_request(120, -1)\n platform_label.set_alignment(0, 0.5)\n platform = gtk_safe(self.game.platform)\n platform_label.set_tooltip_markup(platform)\n platform_label.set_markup(_(\"Platform:\\n<b>%s</b>\") % platform)\n platform_label.set_property(\"ellipsize\", Pango.EllipsizeMode.END)\n return platform_label\n\n def get_playtime_label(self):\n \"\"\"Return the label containing the playtime info\"\"\"\n playtime_label = Gtk.Label(visible=True)\n playtime_label.set_size_request(120, -1)\n playtime_label.set_alignment(0, 0.5)\n playtime_label.set_markup(_(\"Time played:\\n<b>%s</b>\") % self.game.formatted_playtime)\n return playtime_label\n\n def get_last_played_label(self):\n \"\"\"Return the label containing the last played info\"\"\"\n last_played_label = Gtk.Label(visible=True)\n last_played_label.set_size_request(120, -1)\n last_played_label.set_alignment(0, 0.5)\n lastplayed = datetime.fromtimestamp(self.game.lastplayed)\n last_played_label.set_markup(_(\"Last played:\\n<b>%s</b>\") % lastplayed.strftime(\"%x\"))\n return last_played_label\n\n def get_popover_button(self):\n \"\"\"Return the popover button+menu for the Play button\"\"\"\n popover_button = Gtk.MenuButton(visible=True)\n popover_button.set_size_request(32, 32)\n popover_button.props.direction = Gtk.ArrowType.UP\n\n return popover_button\n\n def get_popover_box(self):\n \"\"\"Return a container for a button + a popover button attached to it\"\"\"\n box = Gtk.HBox(visible=True)\n style_context = box.get_style_context()\n style_context.add_class(\"linked\")\n return box\n\n def get_locate_installed_game_button(self):\n \"\"\"Return a button to locate an existing install\"\"\"\n button = get_link_button(\"Locate installed game\")\n button.show()\n button.connect(\"clicked\", self.game_actions.on_locate_installed_game, self.game)\n return {\"locate\": button}\n\n def get_play_button(self):\n \"\"\"Return the widget for install/play/stop and game config\"\"\"\n button = Gtk.Button(visible=True)\n button.set_size_request(120, 32)\n box = self.get_popover_box()\n popover_button = self.get_popover_button()\n if self.game.is_installed:\n if self.game.state == self.game.STATE_STOPPED:\n button.set_label(_(\"Play\"))\n button.connect(\"clicked\", self.game_actions.on_game_launch)\n elif self.game.state == self.game.STATE_LAUNCHING:\n button.set_label(_(\"Launching\"))\n button.set_sensitive(False)\n else:\n button.set_label(_(\"Stop\"))\n button.connect(\"clicked\", self.game_actions.on_game_stop)\n else:\n button.set_label(_(\"Install\"))\n button.connect(\"clicked\", self.game_actions.on_install_clicked)\n if self.service:\n if self.service.local:\n # Local services don't show an install dialog, they can be launched directly\n button.set_label(_(\"Play\"))\n if self.service.drm_free:\n button.set_size_request(84, 32)\n box.add(button)\n popover = self.get_popover(self.get_locate_installed_game_button(), popover_button)\n popover_button.set_popover(popover)\n box.add(popover_button)\n return box\n return button\n button.set_size_request(84, 32)\n box.add(button)\n popover = self.get_popover(self.get_game_buttons(), popover_button)\n popover_button.set_popover(popover)\n box.add(popover_button)\n return box\n\n def get_game_buttons(self):\n \"\"\"Return a dictionary of buttons to use in the panel\"\"\"\n displayed = self.game_actions.get_displayed_entries()\n buttons = {}\n for action in self.game_actions.get_game_actions():\n action_id, label, callback = action\n if action_id in (\"play\", \"stop\", \"install\"):\n continue\n button = get_link_button(label)\n if displayed.get(action_id):\n button.show()\n else:\n button.hide()\n buttons[action_id] = button\n button.connect(\"clicked\", self.on_link_button_clicked, callback)\n return buttons\n\n def get_runner_buttons(self):\n buttons = {}\n if self.game.runner_name and self.game.is_installed:\n runner = runners.import_runner(self.game.runner_name)(self.game.config)\n for entry in runner.context_menu_entries:\n name, label, callback = entry\n button = get_link_button(label)\n button.show()\n button.connect(\"clicked\", self.on_link_button_clicked, callback)\n buttons[name] = button\n return buttons\n\n def on_link_button_clicked(self, button, callback):\n \"\"\"Callback for link buttons. Closes the popover then runs the actual action\"\"\"\n popover = button.get_parent().get_parent()\n popover.popdown()\n callback(button)\n\n def on_install_clicked(self, button):\n \"\"\"Handler for installing service games\"\"\"\n self.service.install(self.db_game)\n\n def on_game_state_changed(self, game):\n \"\"\"Handler called when the game has changed state\"\"\"\n if (\n game.id == self.game.id\n or game.appid == self.appid\n ):\n self.game = game\n else:\n return True\n self.clear_view()\n self.update_view()\n return True\n", "path": "lutris/gui/widgets/game_bar.py"}], "after_files": [{"content": "from datetime import datetime\nfrom gettext import gettext as _\n\nfrom gi.repository import GObject, Gtk, Pango\n\nfrom lutris import runners, services\nfrom lutris.database.games import get_game_by_field, get_game_for_service\nfrom lutris.game import Game\nfrom lutris.gui.widgets.utils import get_link_button\nfrom lutris.util.strings import gtk_safe\n\n\nclass GameBar(Gtk.Box):\n def __init__(self, db_game, game_actions, application):\n \"\"\"Create the game bar with a database row\"\"\"\n super().__init__(orientation=Gtk.Orientation.VERTICAL, visible=True,\n margin_top=12,\n margin_left=12,\n margin_bottom=12,\n margin_right=12,\n spacing=6)\n GObject.add_emission_hook(Game, \"game-start\", self.on_game_state_changed)\n GObject.add_emission_hook(Game, \"game-started\", self.on_game_state_changed)\n GObject.add_emission_hook(Game, \"game-stopped\", self.on_game_state_changed)\n GObject.add_emission_hook(Game, \"game-updated\", self.on_game_state_changed)\n GObject.add_emission_hook(Game, \"game-removed\", self.on_game_state_changed)\n GObject.add_emission_hook(Game, \"game-installed\", self.on_game_state_changed)\n\n self.set_margin_bottom(12)\n self.game_actions = game_actions\n self.db_game = db_game\n self.service = None\n if db_game.get(\"service\"):\n try:\n self.service = services.SERVICES[db_game[\"service\"]]()\n except KeyError:\n pass\n\n game_id = None\n if \"service_id\" in db_game:\n self.appid = db_game[\"service_id\"]\n game_id = db_game[\"id\"]\n elif self.service:\n self.appid = db_game[\"appid\"]\n if self.service.id == \"lutris\":\n game = get_game_by_field(self.appid, field=\"slug\")\n else:\n game = get_game_for_service(self.service.id, self.appid)\n if game:\n game_id = game[\"id\"]\n if game_id:\n self.game = application.get_game_by_id(game_id) or Game(game_id)\n else:\n self.game = Game()\n self.game.name = db_game[\"name\"]\n self.game.slug = db_game[\"slug\"]\n self.game.appid = self.appid\n self.game.service = self.service.id if self.service else None\n game_actions.set_game(self.game)\n self.update_view()\n\n def clear_view(self):\n \"\"\"Clears all widgets from the container\"\"\"\n for child in self.get_children():\n child.destroy()\n\n def update_view(self):\n \"\"\"Populate the view with widgets\"\"\"\n game_label = self.get_game_name_label()\n game_label.set_halign(Gtk.Align.START)\n self.pack_start(game_label, False, False, 0)\n\n hbox = Gtk.Box(Gtk.Orientation.HORIZONTAL, spacing=6)\n self.pack_start(hbox, False, False, 0)\n\n self.play_button = self.get_play_button()\n hbox.pack_start(self.play_button, False, False, 0)\n\n if self.game.is_installed:\n hbox.pack_start(self.get_runner_button(), False, False, 0)\n hbox.pack_start(self.get_platform_label(), False, False, 0)\n if self.game.lastplayed:\n hbox.pack_start(self.get_last_played_label(), False, False, 0)\n if self.game.playtime:\n hbox.pack_start(self.get_playtime_label(), False, False, 0)\n hbox.show_all()\n\n def get_popover(self, buttons, parent):\n \"\"\"Return the popover widget containing a list of link buttons\"\"\"\n if not buttons:\n return None\n popover = Gtk.Popover()\n vbox = Gtk.Box(orientation=Gtk.Orientation.VERTICAL, visible=True)\n\n for action in buttons:\n vbox.pack_end(buttons[action], False, False, 1)\n popover.add(vbox)\n popover.set_position(Gtk.PositionType.TOP)\n popover.set_constrain_to(Gtk.PopoverConstraint.NONE)\n popover.set_relative_to(parent)\n return popover\n\n def get_game_name_label(self):\n \"\"\"Return the label with the game's title\"\"\"\n title_label = Gtk.Label(visible=True)\n title_label.set_ellipsize(Pango.EllipsizeMode.END)\n title_label.set_markup(\"<span font_desc='16'><b>%s</b></span>\" % gtk_safe(self.game.name))\n return title_label\n\n def get_runner_button(self):\n icon_name = self.game.runner.name + \"-symbolic\"\n runner_icon = Gtk.Image.new_from_icon_name(icon_name, Gtk.IconSize.MENU)\n runner_icon.show()\n box = Gtk.HBox(visible=True)\n runner_button = Gtk.Button(visible=True)\n popover = self.get_popover(self.get_runner_buttons(), runner_button)\n if popover:\n runner_button.set_image(runner_icon)\n popover_button = Gtk.MenuButton(visible=True)\n popover_button.set_size_request(32, 32)\n popover_button.props.direction = Gtk.ArrowType.UP\n popover_button.set_popover(popover)\n runner_button.connect(\"clicked\", lambda _x: popover_button.emit(\"clicked\"))\n box.add(runner_button)\n box.add(popover_button)\n style_context = box.get_style_context()\n style_context.add_class(\"linked\")\n else:\n runner_icon.set_margin_left(49)\n runner_icon.set_margin_right(6)\n box.add(runner_icon)\n return box\n\n def get_platform_label(self):\n platform_label = Gtk.Label(visible=True)\n platform_label.set_size_request(120, -1)\n platform_label.set_alignment(0, 0.5)\n platform = gtk_safe(self.game.platform)\n platform_label.set_tooltip_markup(platform)\n platform_label.set_markup(_(\"Platform:\\n<b>%s</b>\") % platform)\n platform_label.set_property(\"ellipsize\", Pango.EllipsizeMode.END)\n return platform_label\n\n def get_playtime_label(self):\n \"\"\"Return the label containing the playtime info\"\"\"\n playtime_label = Gtk.Label(visible=True)\n playtime_label.set_size_request(120, -1)\n playtime_label.set_alignment(0, 0.5)\n playtime_label.set_markup(_(\"Time played:\\n<b>%s</b>\") % self.game.formatted_playtime)\n return playtime_label\n\n def get_last_played_label(self):\n \"\"\"Return the label containing the last played info\"\"\"\n last_played_label = Gtk.Label(visible=True)\n last_played_label.set_size_request(120, -1)\n last_played_label.set_alignment(0, 0.5)\n lastplayed = datetime.fromtimestamp(self.game.lastplayed)\n last_played_label.set_markup(_(\"Last played:\\n<b>%s</b>\") % lastplayed.strftime(\"%x\"))\n return last_played_label\n\n def get_popover_button(self):\n \"\"\"Return the popover button+menu for the Play button\"\"\"\n popover_button = Gtk.MenuButton(visible=True)\n popover_button.set_size_request(32, 32)\n popover_button.props.direction = Gtk.ArrowType.UP\n\n return popover_button\n\n def get_popover_box(self):\n \"\"\"Return a container for a button + a popover button attached to it\"\"\"\n box = Gtk.HBox(visible=True)\n style_context = box.get_style_context()\n style_context.add_class(\"linked\")\n return box\n\n def get_locate_installed_game_button(self):\n \"\"\"Return a button to locate an existing install\"\"\"\n button = get_link_button(\"Locate installed game\")\n button.show()\n button.connect(\"clicked\", self.game_actions.on_locate_installed_game, self.game)\n return {\"locate\": button}\n\n def get_play_button(self):\n \"\"\"Return the widget for install/play/stop and game config\"\"\"\n button = Gtk.Button(visible=True)\n button.set_size_request(120, 32)\n box = self.get_popover_box()\n popover_button = self.get_popover_button()\n if self.game.is_installed:\n if self.game.state == self.game.STATE_STOPPED:\n button.set_label(_(\"Play\"))\n button.connect(\"clicked\", self.game_actions.on_game_launch)\n elif self.game.state == self.game.STATE_LAUNCHING:\n button.set_label(_(\"Launching\"))\n button.set_sensitive(False)\n else:\n button.set_label(_(\"Stop\"))\n button.connect(\"clicked\", self.game_actions.on_game_stop)\n else:\n button.set_label(_(\"Install\"))\n button.connect(\"clicked\", self.game_actions.on_install_clicked)\n if self.service:\n if self.service.local:\n # Local services don't show an install dialog, they can be launched directly\n button.set_label(_(\"Play\"))\n if self.service.drm_free:\n button.set_size_request(84, 32)\n box.add(button)\n popover = self.get_popover(self.get_locate_installed_game_button(), popover_button)\n popover_button.set_popover(popover)\n box.add(popover_button)\n return box\n return button\n button.set_size_request(84, 32)\n box.add(button)\n popover = self.get_popover(self.get_game_buttons(), popover_button)\n popover_button.set_popover(popover)\n box.add(popover_button)\n return box\n\n def get_game_buttons(self):\n \"\"\"Return a dictionary of buttons to use in the panel\"\"\"\n displayed = self.game_actions.get_displayed_entries()\n buttons = {}\n for action in self.game_actions.get_game_actions():\n action_id, label, callback = action\n if action_id in (\"play\", \"stop\", \"install\"):\n continue\n button = get_link_button(label)\n if displayed.get(action_id):\n button.show()\n else:\n button.hide()\n buttons[action_id] = button\n button.connect(\"clicked\", self.on_link_button_clicked, callback)\n return buttons\n\n def get_runner_buttons(self):\n buttons = {}\n if self.game.runner_name and self.game.is_installed:\n runner = runners.import_runner(self.game.runner_name)(self.game.config)\n for entry in runner.context_menu_entries:\n name, label, callback = entry\n button = get_link_button(label)\n button.show()\n button.connect(\"clicked\", self.on_link_button_clicked, callback)\n buttons[name] = button\n return buttons\n\n def on_link_button_clicked(self, button, callback):\n \"\"\"Callback for link buttons. Closes the popover then runs the actual action\"\"\"\n popover = button.get_parent().get_parent()\n popover.popdown()\n callback(button)\n\n def on_install_clicked(self, button):\n \"\"\"Handler for installing service games\"\"\"\n self.service.install(self.db_game)\n\n def on_game_state_changed(self, game):\n \"\"\"Handler called when the game has changed state\"\"\"\n if (\n game.id == self.game.id\n or (self.appid and game.appid == self.appid)\n ):\n self.game = game\n else:\n return True\n self.clear_view()\n self.update_view()\n return True\n", "path": "lutris/gui/widgets/game_bar.py"}]} | 3,483 | 118 |
gh_patches_debug_16571 | rasdani/github-patches | git_diff | geopandas__geopandas-854 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Deprecation Warning with fiona 1.8b1
using a `debian:buster` docker image
installed Fiona with
> pip install git+https://github.com/Toblerity/[email protected]
I got this __warning__ today:
```python
/usr/local/lib/python2.7/dist-packages/geopandas/io/file.py:108: FionaDeprecationWarning: Use fiona.Env() instead.
with fiona.drivers():
No handlers could be found for logger "rasterio._gdal"
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `geopandas/io/file.py`
Content:
```
1 import os
2
3 import fiona
4 import numpy as np
5 import six
6
7 from geopandas import GeoDataFrame, GeoSeries
8
9 # Adapted from pandas.io.common
10 if six.PY3:
11 from urllib.request import urlopen as _urlopen
12 from urllib.parse import urlparse as parse_url
13 from urllib.parse import uses_relative, uses_netloc, uses_params
14 else:
15 from urllib2 import urlopen as _urlopen
16 from urlparse import urlparse as parse_url
17 from urlparse import uses_relative, uses_netloc, uses_params
18
19 _VALID_URLS = set(uses_relative + uses_netloc + uses_params)
20 _VALID_URLS.discard('')
21
22
23 def _is_url(url):
24 """Check to see if *url* has a valid protocol."""
25 try:
26 return parse_url(url).scheme in _VALID_URLS
27 except:
28 return False
29
30
31 def read_file(filename, bbox=None, **kwargs):
32 """
33 Returns a GeoDataFrame from a file or URL.
34
35 Parameters
36 ----------
37 filename: str
38 Either the absolute or relative path to the file or URL to
39 be opened.
40 bbox : tuple | GeoDataFrame or GeoSeries, default None
41 Filter features by given bounding box, GeoSeries, or GeoDataFrame.
42 CRS mis-matches are resolved if given a GeoSeries or GeoDataFrame.
43 **kwargs:
44 Keyword args to be passed to the `open` or `BytesCollection` method
45 in the fiona library when opening the file. For more information on
46 possible keywords, type:
47 ``import fiona; help(fiona.open)``
48
49 Examples
50 --------
51 >>> df = geopandas.read_file("nybb.shp")
52
53 Returns
54 -------
55 geodataframe : GeoDataFrame
56 """
57 if _is_url(filename):
58 req = _urlopen(filename)
59 path_or_bytes = req.read()
60 reader = fiona.BytesCollection
61 else:
62 path_or_bytes = filename
63 reader = fiona.open
64
65 with reader(path_or_bytes, **kwargs) as features:
66 crs = features.crs
67 if bbox is not None:
68 if isinstance(bbox, GeoDataFrame) or isinstance(bbox, GeoSeries):
69 bbox = tuple(bbox.to_crs(crs).total_bounds)
70 assert len(bbox) == 4
71 f_filt = features.filter(bbox=bbox)
72 else:
73 f_filt = features
74
75 columns = list(features.meta["schema"]["properties"]) + ["geometry"]
76 gdf = GeoDataFrame.from_features(f_filt, crs=crs, columns=columns)
77
78 return gdf
79
80
81 def to_file(df, filename, driver="ESRI Shapefile", schema=None,
82 **kwargs):
83 """
84 Write this GeoDataFrame to an OGR data source
85
86 A dictionary of supported OGR providers is available via:
87 >>> import fiona
88 >>> fiona.supported_drivers
89
90 Parameters
91 ----------
92 df : GeoDataFrame to be written
93 filename : string
94 File path or file handle to write to.
95 driver : string, default 'ESRI Shapefile'
96 The OGR format driver used to write the vector file.
97 schema : dict, default None
98 If specified, the schema dictionary is passed to Fiona to
99 better control how the file is written. If None, GeoPandas
100 will determine the schema based on each column's dtype
101
102 The *kwargs* are passed to fiona.open and can be used to write
103 to multi-layer data, store data within archives (zip files), etc.
104 """
105 if schema is None:
106 schema = infer_schema(df)
107 filename = os.path.abspath(os.path.expanduser(filename))
108 with fiona.drivers():
109 with fiona.open(filename, 'w', driver=driver, crs=df.crs,
110 schema=schema, **kwargs) as colxn:
111 colxn.writerecords(df.iterfeatures())
112
113
114 def infer_schema(df):
115 try:
116 from collections import OrderedDict
117 except ImportError:
118 from ordereddict import OrderedDict
119
120 def convert_type(column, in_type):
121 if in_type == object:
122 return 'str'
123 out_type = type(np.asscalar(np.zeros(1, in_type))).__name__
124 if out_type == 'long':
125 out_type = 'int'
126 if out_type == 'bool':
127 raise ValueError('column "{}" is boolean type, '.format(column) +
128 'which is unsupported in file writing. '
129 'Consider casting the column to int type.')
130 return out_type
131
132 properties = OrderedDict([
133 (col, convert_type(col, _type)) for col, _type in
134 zip(df.columns, df.dtypes) if col != df._geometry_column_name
135 ])
136
137 if df.empty:
138 raise ValueError("Cannot write empty DataFrame to file.")
139
140 geom_type = _common_geom_type(df)
141
142 if not geom_type:
143 raise ValueError("Geometry column cannot contain mutiple "
144 "geometry types when writing to file.")
145
146 schema = {'geometry': geom_type, 'properties': properties}
147
148 return schema
149
150
151 def _common_geom_type(df):
152 # Need to check geom_types before we write to file...
153 # Some (most?) providers expect a single geometry type:
154 # Point, LineString, or Polygon
155 geom_types = df.geometry.geom_type.unique()
156
157 from os.path import commonprefix
158 # use reversed geom types and commonprefix to find the common suffix,
159 # then reverse the result to get back to a geom type
160 geom_type = commonprefix([g[::-1] for g in geom_types if g])[::-1]
161 if not geom_type:
162 return None
163
164 if df.geometry.has_z.any():
165 geom_type = "3D " + geom_type
166
167 return geom_type
168
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/geopandas/io/file.py b/geopandas/io/file.py
--- a/geopandas/io/file.py
+++ b/geopandas/io/file.py
@@ -4,6 +4,11 @@
import numpy as np
import six
+try:
+ from fiona import Env as fiona_env
+except ImportError:
+ from fiona import drivers as fiona_env
+
from geopandas import GeoDataFrame, GeoSeries
# Adapted from pandas.io.common
@@ -105,7 +110,7 @@
if schema is None:
schema = infer_schema(df)
filename = os.path.abspath(os.path.expanduser(filename))
- with fiona.drivers():
+ with fiona_env():
with fiona.open(filename, 'w', driver=driver, crs=df.crs,
schema=schema, **kwargs) as colxn:
colxn.writerecords(df.iterfeatures())
| {"golden_diff": "diff --git a/geopandas/io/file.py b/geopandas/io/file.py\n--- a/geopandas/io/file.py\n+++ b/geopandas/io/file.py\n@@ -4,6 +4,11 @@\n import numpy as np\n import six\n \n+try:\n+ from fiona import Env as fiona_env\n+except ImportError:\n+ from fiona import drivers as fiona_env\n+\n from geopandas import GeoDataFrame, GeoSeries\n \n # Adapted from pandas.io.common\n@@ -105,7 +110,7 @@\n if schema is None:\n schema = infer_schema(df)\n filename = os.path.abspath(os.path.expanduser(filename))\n- with fiona.drivers():\n+ with fiona_env():\n with fiona.open(filename, 'w', driver=driver, crs=df.crs,\n schema=schema, **kwargs) as colxn:\n colxn.writerecords(df.iterfeatures())\n", "issue": "Deprecation Warning with fiona 1.8b1\nusing a `debian:buster` docker image\r\n\r\ninstalled Fiona with \r\n> pip install git+https://github.com/Toblerity/[email protected]\r\n\r\nI got this __warning__ today: \r\n```python\r\n/usr/local/lib/python2.7/dist-packages/geopandas/io/file.py:108: FionaDeprecationWarning: Use fiona.Env() instead.\r\n with fiona.drivers():\r\nNo handlers could be found for logger \"rasterio._gdal\"\r\n```\n", "before_files": [{"content": "import os\n\nimport fiona\nimport numpy as np\nimport six\n\nfrom geopandas import GeoDataFrame, GeoSeries\n\n# Adapted from pandas.io.common\nif six.PY3:\n from urllib.request import urlopen as _urlopen\n from urllib.parse import urlparse as parse_url\n from urllib.parse import uses_relative, uses_netloc, uses_params\nelse:\n from urllib2 import urlopen as _urlopen\n from urlparse import urlparse as parse_url\n from urlparse import uses_relative, uses_netloc, uses_params\n\n_VALID_URLS = set(uses_relative + uses_netloc + uses_params)\n_VALID_URLS.discard('')\n\n\ndef _is_url(url):\n \"\"\"Check to see if *url* has a valid protocol.\"\"\"\n try:\n return parse_url(url).scheme in _VALID_URLS\n except:\n return False\n\n\ndef read_file(filename, bbox=None, **kwargs):\n \"\"\"\n Returns a GeoDataFrame from a file or URL.\n\n Parameters\n ----------\n filename: str\n Either the absolute or relative path to the file or URL to\n be opened.\n bbox : tuple | GeoDataFrame or GeoSeries, default None\n Filter features by given bounding box, GeoSeries, or GeoDataFrame.\n CRS mis-matches are resolved if given a GeoSeries or GeoDataFrame.\n **kwargs:\n Keyword args to be passed to the `open` or `BytesCollection` method\n in the fiona library when opening the file. For more information on\n possible keywords, type:\n ``import fiona; help(fiona.open)``\n\n Examples\n --------\n >>> df = geopandas.read_file(\"nybb.shp\")\n\n Returns\n -------\n geodataframe : GeoDataFrame\n \"\"\"\n if _is_url(filename):\n req = _urlopen(filename)\n path_or_bytes = req.read()\n reader = fiona.BytesCollection\n else:\n path_or_bytes = filename\n reader = fiona.open\n\n with reader(path_or_bytes, **kwargs) as features:\n crs = features.crs\n if bbox is not None:\n if isinstance(bbox, GeoDataFrame) or isinstance(bbox, GeoSeries):\n bbox = tuple(bbox.to_crs(crs).total_bounds)\n assert len(bbox) == 4\n f_filt = features.filter(bbox=bbox)\n else:\n f_filt = features\n\n columns = list(features.meta[\"schema\"][\"properties\"]) + [\"geometry\"]\n gdf = GeoDataFrame.from_features(f_filt, crs=crs, columns=columns)\n\n return gdf\n\n\ndef to_file(df, filename, driver=\"ESRI Shapefile\", schema=None,\n **kwargs):\n \"\"\"\n Write this GeoDataFrame to an OGR data source\n\n A dictionary of supported OGR providers is available via:\n >>> import fiona\n >>> fiona.supported_drivers\n\n Parameters\n ----------\n df : GeoDataFrame to be written\n filename : string\n File path or file handle to write to.\n driver : string, default 'ESRI Shapefile'\n The OGR format driver used to write the vector file.\n schema : dict, default None\n If specified, the schema dictionary is passed to Fiona to\n better control how the file is written. If None, GeoPandas\n will determine the schema based on each column's dtype\n\n The *kwargs* are passed to fiona.open and can be used to write\n to multi-layer data, store data within archives (zip files), etc.\n \"\"\"\n if schema is None:\n schema = infer_schema(df)\n filename = os.path.abspath(os.path.expanduser(filename))\n with fiona.drivers():\n with fiona.open(filename, 'w', driver=driver, crs=df.crs,\n schema=schema, **kwargs) as colxn:\n colxn.writerecords(df.iterfeatures())\n\n\ndef infer_schema(df):\n try:\n from collections import OrderedDict\n except ImportError:\n from ordereddict import OrderedDict\n\n def convert_type(column, in_type):\n if in_type == object:\n return 'str'\n out_type = type(np.asscalar(np.zeros(1, in_type))).__name__\n if out_type == 'long':\n out_type = 'int'\n if out_type == 'bool':\n raise ValueError('column \"{}\" is boolean type, '.format(column) +\n 'which is unsupported in file writing. '\n 'Consider casting the column to int type.')\n return out_type\n\n properties = OrderedDict([\n (col, convert_type(col, _type)) for col, _type in\n zip(df.columns, df.dtypes) if col != df._geometry_column_name\n ])\n\n if df.empty:\n raise ValueError(\"Cannot write empty DataFrame to file.\")\n\n geom_type = _common_geom_type(df)\n \n if not geom_type:\n raise ValueError(\"Geometry column cannot contain mutiple \"\n \"geometry types when writing to file.\")\n\n schema = {'geometry': geom_type, 'properties': properties}\n\n return schema\n\n\ndef _common_geom_type(df):\n # Need to check geom_types before we write to file...\n # Some (most?) providers expect a single geometry type:\n # Point, LineString, or Polygon\n geom_types = df.geometry.geom_type.unique()\n\n from os.path import commonprefix\n # use reversed geom types and commonprefix to find the common suffix,\n # then reverse the result to get back to a geom type\n geom_type = commonprefix([g[::-1] for g in geom_types if g])[::-1]\n if not geom_type:\n return None\n\n if df.geometry.has_z.any():\n geom_type = \"3D \" + geom_type\n\n return geom_type\n", "path": "geopandas/io/file.py"}], "after_files": [{"content": "import os\n\nimport fiona\nimport numpy as np\nimport six\n\ntry:\n from fiona import Env as fiona_env\nexcept ImportError:\n from fiona import drivers as fiona_env\n\nfrom geopandas import GeoDataFrame, GeoSeries\n\n# Adapted from pandas.io.common\nif six.PY3:\n from urllib.request import urlopen as _urlopen\n from urllib.parse import urlparse as parse_url\n from urllib.parse import uses_relative, uses_netloc, uses_params\nelse:\n from urllib2 import urlopen as _urlopen\n from urlparse import urlparse as parse_url\n from urlparse import uses_relative, uses_netloc, uses_params\n\n_VALID_URLS = set(uses_relative + uses_netloc + uses_params)\n_VALID_URLS.discard('')\n\n\ndef _is_url(url):\n \"\"\"Check to see if *url* has a valid protocol.\"\"\"\n try:\n return parse_url(url).scheme in _VALID_URLS\n except:\n return False\n\n\ndef read_file(filename, bbox=None, **kwargs):\n \"\"\"\n Returns a GeoDataFrame from a file or URL.\n\n Parameters\n ----------\n filename: str\n Either the absolute or relative path to the file or URL to\n be opened.\n bbox : tuple | GeoDataFrame or GeoSeries, default None\n Filter features by given bounding box, GeoSeries, or GeoDataFrame.\n CRS mis-matches are resolved if given a GeoSeries or GeoDataFrame.\n **kwargs:\n Keyword args to be passed to the `open` or `BytesCollection` method\n in the fiona library when opening the file. For more information on\n possible keywords, type:\n ``import fiona; help(fiona.open)``\n\n Examples\n --------\n >>> df = geopandas.read_file(\"nybb.shp\")\n\n Returns\n -------\n geodataframe : GeoDataFrame\n \"\"\"\n if _is_url(filename):\n req = _urlopen(filename)\n path_or_bytes = req.read()\n reader = fiona.BytesCollection\n else:\n path_or_bytes = filename\n reader = fiona.open\n\n with reader(path_or_bytes, **kwargs) as features:\n crs = features.crs\n if bbox is not None:\n if isinstance(bbox, GeoDataFrame) or isinstance(bbox, GeoSeries):\n bbox = tuple(bbox.to_crs(crs).total_bounds)\n assert len(bbox) == 4\n f_filt = features.filter(bbox=bbox)\n else:\n f_filt = features\n\n columns = list(features.meta[\"schema\"][\"properties\"]) + [\"geometry\"]\n gdf = GeoDataFrame.from_features(f_filt, crs=crs, columns=columns)\n\n return gdf\n\n\ndef to_file(df, filename, driver=\"ESRI Shapefile\", schema=None,\n **kwargs):\n \"\"\"\n Write this GeoDataFrame to an OGR data source\n\n A dictionary of supported OGR providers is available via:\n >>> import fiona\n >>> fiona.supported_drivers\n\n Parameters\n ----------\n df : GeoDataFrame to be written\n filename : string\n File path or file handle to write to.\n driver : string, default 'ESRI Shapefile'\n The OGR format driver used to write the vector file.\n schema : dict, default None\n If specified, the schema dictionary is passed to Fiona to\n better control how the file is written. If None, GeoPandas\n will determine the schema based on each column's dtype\n\n The *kwargs* are passed to fiona.open and can be used to write\n to multi-layer data, store data within archives (zip files), etc.\n \"\"\"\n if schema is None:\n schema = infer_schema(df)\n filename = os.path.abspath(os.path.expanduser(filename))\n with fiona_env():\n with fiona.open(filename, 'w', driver=driver, crs=df.crs,\n schema=schema, **kwargs) as colxn:\n colxn.writerecords(df.iterfeatures())\n\n\ndef infer_schema(df):\n try:\n from collections import OrderedDict\n except ImportError:\n from ordereddict import OrderedDict\n\n def convert_type(column, in_type):\n if in_type == object:\n return 'str'\n out_type = type(np.asscalar(np.zeros(1, in_type))).__name__\n if out_type == 'long':\n out_type = 'int'\n if out_type == 'bool':\n raise ValueError('column \"{}\" is boolean type, '.format(column) +\n 'which is unsupported in file writing. '\n 'Consider casting the column to int type.')\n return out_type\n\n properties = OrderedDict([\n (col, convert_type(col, _type)) for col, _type in\n zip(df.columns, df.dtypes) if col != df._geometry_column_name\n ])\n\n if df.empty:\n raise ValueError(\"Cannot write empty DataFrame to file.\")\n\n geom_type = _common_geom_type(df)\n \n if not geom_type:\n raise ValueError(\"Geometry column cannot contain mutiple \"\n \"geometry types when writing to file.\")\n\n schema = {'geometry': geom_type, 'properties': properties}\n\n return schema\n\n\ndef _common_geom_type(df):\n # Need to check geom_types before we write to file...\n # Some (most?) providers expect a single geometry type:\n # Point, LineString, or Polygon\n geom_types = df.geometry.geom_type.unique()\n\n from os.path import commonprefix\n # use reversed geom types and commonprefix to find the common suffix,\n # then reverse the result to get back to a geom type\n geom_type = commonprefix([g[::-1] for g in geom_types if g])[::-1]\n if not geom_type:\n return None\n\n if df.geometry.has_z.any():\n geom_type = \"3D \" + geom_type\n\n return geom_type\n", "path": "geopandas/io/file.py"}]} | 2,018 | 203 |
gh_patches_debug_24607 | rasdani/github-patches | git_diff | streamlink__streamlink-3185 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
tv360.com.tr no playable stream
## Bug Report
- [x] This is a bug report and I have read the contribution guidelines.
### Description
can't find playable stream.
### Expected / Actual behavior
stream supposed to be found
### Reproduction steps / Explicit stream URLs to test
``` 1. streamlink https://www.tv360.com.tr/canli-yayin ```
### Log output
```
[cli][debug] OS: Windows 10
[cli][debug] Python: 3.8.2
[cli][debug] Streamlink: 1.5.0
[cli][debug] Requests(2.24.0), Socks(1.7.1), Websocket(0.57.0)
[cli][info] Found matching plugin tv360 for URL tv360.com.tr/canli-yayin
error: No playable streams found on this URL: tv360.com.tr/canli-yayin
```
### Additional comments, screenshots, etc.
[Love Streamlink? Please consider supporting our collective. Thanks!](https://opencollective.com/streamlink/donate)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/streamlink/plugins/tv360.py`
Content:
```
1 from __future__ import print_function
2
3 import re
4
5 from streamlink.plugin import Plugin
6 from streamlink.plugin.api import validate
7 from streamlink.stream import HLSStream
8
9
10 class TV360(Plugin):
11 url_re = re.compile(r"https?://(?:www.)?tv360.com.tr/canli-yayin")
12 hls_re = re.compile(r'''hls.loadSource\(["'](http.*m3u8)["']\)''', re.DOTALL)
13
14 hls_schema = validate.Schema(
15 validate.transform(hls_re.search),
16 validate.any(None, validate.all(validate.get(1)))
17 )
18
19 @classmethod
20 def can_handle_url(cls, url):
21 return cls.url_re.match(url) is not None
22
23 def _get_streams(self):
24 res = self.session.http.get(self.url)
25 hls_url = self.hls_re.search(res.text)
26
27 if hls_url:
28 return HLSStream.parse_variant_playlist(self.session, hls_url.group(1))
29
30
31 __plugin__ = TV360
32
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/streamlink/plugins/tv360.py b/src/streamlink/plugins/tv360.py
--- a/src/streamlink/plugins/tv360.py
+++ b/src/streamlink/plugins/tv360.py
@@ -1,5 +1,3 @@
-from __future__ import print_function
-
import re
from streamlink.plugin import Plugin
@@ -9,11 +7,11 @@
class TV360(Plugin):
url_re = re.compile(r"https?://(?:www.)?tv360.com.tr/canli-yayin")
- hls_re = re.compile(r'''hls.loadSource\(["'](http.*m3u8)["']\)''', re.DOTALL)
+ hls_re = re.compile(r'''src="(http.*m3u8)"''')
hls_schema = validate.Schema(
validate.transform(hls_re.search),
- validate.any(None, validate.all(validate.get(1)))
+ validate.any(None, validate.all(validate.get(1), validate.url()))
)
@classmethod
@@ -21,11 +19,10 @@
return cls.url_re.match(url) is not None
def _get_streams(self):
- res = self.session.http.get(self.url)
- hls_url = self.hls_re.search(res.text)
+ hls_url = self.session.http.get(self.url, schema=self.hls_schema)
if hls_url:
- return HLSStream.parse_variant_playlist(self.session, hls_url.group(1))
+ return HLSStream.parse_variant_playlist(self.session, hls_url)
__plugin__ = TV360
| {"golden_diff": "diff --git a/src/streamlink/plugins/tv360.py b/src/streamlink/plugins/tv360.py\n--- a/src/streamlink/plugins/tv360.py\n+++ b/src/streamlink/plugins/tv360.py\n@@ -1,5 +1,3 @@\n-from __future__ import print_function\n-\n import re\n \n from streamlink.plugin import Plugin\n@@ -9,11 +7,11 @@\n \n class TV360(Plugin):\n url_re = re.compile(r\"https?://(?:www.)?tv360.com.tr/canli-yayin\")\n- hls_re = re.compile(r'''hls.loadSource\\([\"'](http.*m3u8)[\"']\\)''', re.DOTALL)\n+ hls_re = re.compile(r'''src=\"(http.*m3u8)\"''')\n \n hls_schema = validate.Schema(\n validate.transform(hls_re.search),\n- validate.any(None, validate.all(validate.get(1)))\n+ validate.any(None, validate.all(validate.get(1), validate.url()))\n )\n \n @classmethod\n@@ -21,11 +19,10 @@\n return cls.url_re.match(url) is not None\n \n def _get_streams(self):\n- res = self.session.http.get(self.url)\n- hls_url = self.hls_re.search(res.text)\n+ hls_url = self.session.http.get(self.url, schema=self.hls_schema)\n \n if hls_url:\n- return HLSStream.parse_variant_playlist(self.session, hls_url.group(1))\n+ return HLSStream.parse_variant_playlist(self.session, hls_url)\n \n \n __plugin__ = TV360\n", "issue": "tv360.com.tr no playable stream\n## Bug Report\r\n- [x] This is a bug report and I have read the contribution guidelines.\r\n\r\n### Description\r\n\r\ncan't find playable stream.\r\n\r\n### Expected / Actual behavior\r\n\r\nstream supposed to be found\r\n\r\n### Reproduction steps / Explicit stream URLs to test\r\n\r\n``` 1. streamlink https://www.tv360.com.tr/canli-yayin ```\r\n\r\n### Log output\r\n\r\n```\r\n[cli][debug] OS: Windows 10\r\n[cli][debug] Python: 3.8.2\r\n[cli][debug] Streamlink: 1.5.0\r\n[cli][debug] Requests(2.24.0), Socks(1.7.1), Websocket(0.57.0)\r\n[cli][info] Found matching plugin tv360 for URL tv360.com.tr/canli-yayin\r\nerror: No playable streams found on this URL: tv360.com.tr/canli-yayin\r\n```\r\n\r\n\r\n### Additional comments, screenshots, etc.\r\n\r\n\r\n\r\n[Love Streamlink? Please consider supporting our collective. Thanks!](https://opencollective.com/streamlink/donate)\r\n\n", "before_files": [{"content": "from __future__ import print_function\n\nimport re\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream import HLSStream\n\n\nclass TV360(Plugin):\n url_re = re.compile(r\"https?://(?:www.)?tv360.com.tr/canli-yayin\")\n hls_re = re.compile(r'''hls.loadSource\\([\"'](http.*m3u8)[\"']\\)''', re.DOTALL)\n\n hls_schema = validate.Schema(\n validate.transform(hls_re.search),\n validate.any(None, validate.all(validate.get(1)))\n )\n\n @classmethod\n def can_handle_url(cls, url):\n return cls.url_re.match(url) is not None\n\n def _get_streams(self):\n res = self.session.http.get(self.url)\n hls_url = self.hls_re.search(res.text)\n\n if hls_url:\n return HLSStream.parse_variant_playlist(self.session, hls_url.group(1))\n\n\n__plugin__ = TV360\n", "path": "src/streamlink/plugins/tv360.py"}], "after_files": [{"content": "import re\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream import HLSStream\n\n\nclass TV360(Plugin):\n url_re = re.compile(r\"https?://(?:www.)?tv360.com.tr/canli-yayin\")\n hls_re = re.compile(r'''src=\"(http.*m3u8)\"''')\n\n hls_schema = validate.Schema(\n validate.transform(hls_re.search),\n validate.any(None, validate.all(validate.get(1), validate.url()))\n )\n\n @classmethod\n def can_handle_url(cls, url):\n return cls.url_re.match(url) is not None\n\n def _get_streams(self):\n hls_url = self.session.http.get(self.url, schema=self.hls_schema)\n\n if hls_url:\n return HLSStream.parse_variant_playlist(self.session, hls_url)\n\n\n__plugin__ = TV360\n", "path": "src/streamlink/plugins/tv360.py"}]} | 805 | 363 |
gh_patches_debug_7942 | rasdani/github-patches | git_diff | borgbackup__borg-3134 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
1.1.0: using a logging config causes exception
I've been using Borg 1.0.x with a [simple logging config](https://github.com/borgbackup/borg/files/1369192/logging.conf.txt) to get a logging behaviour suitable for cronjobs, that is: everything goes to the logfile, errors and warnings also go to stderr (and are sent via mail, by cron). Using the same logging config with Borg 1.1.0 causes an exception:
```
2017-10-09 06:05:09 [ERROR] Local Exception
2017-10-09 06:05:09 [ERROR] Traceback (most recent call last):
File "borg/archiver.py", line 4024, in main
File "borg/archiver.py", line 3952, in run
File "borg/archiver.py", line 130, in wrapper
File "borg/remote.py", line 562, in __init__
File "borg/remote.py", line 699, in call
File "borg/remote.py", line 841, in call_many
File "borg/remote.py", line 989, in handle_remote_line
AttributeError: 'Logger' object has no attribute 'json'
```
When not using a logging config, `setup_logging` will set up two loggers, the second one explicitly named `borg` and having a custom `json` attribute (see logging.py, lines 97-99). While I could add a `borg` logger to my logging config there seems to be no way to add the required custom `json` attribute within the fileConfig format.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/borg/logger.py`
Content:
```
1 """logging facilities
2
3 The way to use this is as follows:
4
5 * each module declares its own logger, using:
6
7 from .logger import create_logger
8 logger = create_logger()
9
10 * then each module uses logger.info/warning/debug/etc according to the
11 level it believes is appropriate:
12
13 logger.debug('debugging info for developers or power users')
14 logger.info('normal, informational output')
15 logger.warning('warn about a non-fatal error or sth else')
16 logger.error('a fatal error')
17
18 ... and so on. see the `logging documentation
19 <https://docs.python.org/3/howto/logging.html#when-to-use-logging>`_
20 for more information
21
22 * console interaction happens on stderr, that includes interactive
23 reporting functions like `help`, `info` and `list`
24
25 * ...except ``input()`` is special, because we can't control the
26 stream it is using, unfortunately. we assume that it won't clutter
27 stdout, because interaction would be broken then anyways
28
29 * what is output on INFO level is additionally controlled by commandline
30 flags
31 """
32
33 import inspect
34 import json
35 import logging
36 import logging.config
37 import logging.handlers # needed for handlers defined there being configurable in logging.conf file
38 import os
39 import warnings
40
41 configured = False
42
43 # use something like this to ignore warnings:
44 # warnings.filterwarnings('ignore', r'... regex for warning message to ignore ...')
45
46
47 def _log_warning(message, category, filename, lineno, file=None, line=None):
48 # for warnings, we just want to use the logging system, not stderr or other files
49 msg = "{0}:{1}: {2}: {3}".format(filename, lineno, category.__name__, message)
50 logger = create_logger(__name__)
51 # Note: the warning will look like coming from here,
52 # but msg contains info about where it really comes from
53 logger.warning(msg)
54
55
56 def setup_logging(stream=None, conf_fname=None, env_var='BORG_LOGGING_CONF', level='info', is_serve=False, json=False):
57 """setup logging module according to the arguments provided
58
59 if conf_fname is given (or the config file name can be determined via
60 the env_var, if given): load this logging configuration.
61
62 otherwise, set up a stream handler logger on stderr (by default, if no
63 stream is provided).
64
65 if is_serve == True, we configure a special log format as expected by
66 the borg client log message interceptor.
67 """
68 global configured
69 err_msg = None
70 if env_var:
71 conf_fname = os.environ.get(env_var, conf_fname)
72 if conf_fname:
73 try:
74 conf_fname = os.path.abspath(conf_fname)
75 # we open the conf file here to be able to give a reasonable
76 # error message in case of failure (if we give the filename to
77 # fileConfig(), it silently ignores unreadable files and gives
78 # unhelpful error msgs like "No section: 'formatters'"):
79 with open(conf_fname) as f:
80 logging.config.fileConfig(f)
81 configured = True
82 logger = logging.getLogger(__name__)
83 logger.debug('using logging configuration read from "{0}"'.format(conf_fname))
84 warnings.showwarning = _log_warning
85 return None
86 except Exception as err: # XXX be more precise
87 err_msg = str(err)
88 # if we did not / not successfully load a logging configuration, fallback to this:
89 logger = logging.getLogger('')
90 handler = logging.StreamHandler(stream)
91 if is_serve and not json:
92 fmt = '$LOG %(levelname)s %(name)s Remote: %(message)s'
93 else:
94 fmt = '%(message)s'
95 formatter = JsonFormatter(fmt) if json else logging.Formatter(fmt)
96 handler.setFormatter(formatter)
97 borg_logger = logging.getLogger('borg')
98 borg_logger.formatter = formatter
99 borg_logger.json = json
100 if configured and logger.handlers:
101 # The RepositoryServer can call setup_logging a second time to adjust the output
102 # mode from text-ish is_serve to json is_serve.
103 # Thus, remove the previously installed handler, if any.
104 logger.handlers[0].close()
105 logger.handlers.clear()
106 logger.addHandler(handler)
107 logger.setLevel(level.upper())
108 configured = True
109 logger = logging.getLogger(__name__)
110 if err_msg:
111 logger.warning('setup_logging for "{0}" failed with "{1}".'.format(conf_fname, err_msg))
112 logger.debug('using builtin fallback logging configuration')
113 warnings.showwarning = _log_warning
114 return handler
115
116
117 def find_parent_module():
118 """find the name of a the first module calling this module
119
120 if we cannot find it, we return the current module's name
121 (__name__) instead.
122 """
123 try:
124 frame = inspect.currentframe().f_back
125 module = inspect.getmodule(frame)
126 while module is None or module.__name__ == __name__:
127 frame = frame.f_back
128 module = inspect.getmodule(frame)
129 return module.__name__
130 except AttributeError:
131 # somehow we failed to find our module
132 # return the logger module name by default
133 return __name__
134
135
136 def create_logger(name=None):
137 """lazily create a Logger object with the proper path, which is returned by
138 find_parent_module() by default, or is provided via the commandline
139
140 this is really a shortcut for:
141
142 logger = logging.getLogger(__name__)
143
144 we use it to avoid errors and provide a more standard API.
145
146 We must create the logger lazily, because this is usually called from
147 module level (and thus executed at import time - BEFORE setup_logging()
148 was called). By doing it lazily we can do the setup first, we just have to
149 be careful not to call any logger methods before the setup_logging() call.
150 If you try, you'll get an exception.
151 """
152 class LazyLogger:
153 def __init__(self, name=None):
154 self.__name = name or find_parent_module()
155 self.__real_logger = None
156
157 @property
158 def __logger(self):
159 if self.__real_logger is None:
160 if not configured:
161 raise Exception("tried to call a logger before setup_logging() was called")
162 self.__real_logger = logging.getLogger(self.__name)
163 if self.__name.startswith('borg.debug.') and self.__real_logger.level == logging.NOTSET:
164 self.__real_logger.setLevel('WARNING')
165 return self.__real_logger
166
167 def getChild(self, suffix):
168 return LazyLogger(self.__name + '.' + suffix)
169
170 def setLevel(self, *args, **kw):
171 return self.__logger.setLevel(*args, **kw)
172
173 def log(self, *args, **kw):
174 if 'msgid' in kw:
175 kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')
176 return self.__logger.log(*args, **kw)
177
178 def exception(self, *args, **kw):
179 if 'msgid' in kw:
180 kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')
181 return self.__logger.exception(*args, **kw)
182
183 def debug(self, *args, **kw):
184 if 'msgid' in kw:
185 kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')
186 return self.__logger.debug(*args, **kw)
187
188 def info(self, *args, **kw):
189 if 'msgid' in kw:
190 kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')
191 return self.__logger.info(*args, **kw)
192
193 def warning(self, *args, **kw):
194 if 'msgid' in kw:
195 kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')
196 return self.__logger.warning(*args, **kw)
197
198 def error(self, *args, **kw):
199 if 'msgid' in kw:
200 kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')
201 return self.__logger.error(*args, **kw)
202
203 def critical(self, *args, **kw):
204 if 'msgid' in kw:
205 kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')
206 return self.__logger.critical(*args, **kw)
207
208 return LazyLogger(name)
209
210
211 class JsonFormatter(logging.Formatter):
212 RECORD_ATTRIBUTES = (
213 'levelname',
214 'name',
215 'message',
216 # msgid is an attribute we made up in Borg to expose a non-changing handle for log messages
217 'msgid',
218 )
219
220 # Other attributes that are not very useful but do exist:
221 # processName, process, relativeCreated, stack_info, thread, threadName
222 # msg == message
223 # *args* are the unformatted arguments passed to the logger function, not useful now,
224 # become useful if sanitized properly (must be JSON serializable) in the code +
225 # fixed message IDs are assigned.
226 # exc_info, exc_text are generally uninteresting because the message will have that
227
228 def format(self, record):
229 super().format(record)
230 data = {
231 'type': 'log_message',
232 'time': record.created,
233 'message': '',
234 'levelname': 'CRITICAL',
235 }
236 for attr in self.RECORD_ATTRIBUTES:
237 value = getattr(record, attr, None)
238 if value:
239 data[attr] = value
240 return json.dumps(data)
241
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/borg/logger.py b/src/borg/logger.py
--- a/src/borg/logger.py
+++ b/src/borg/logger.py
@@ -80,6 +80,8 @@
logging.config.fileConfig(f)
configured = True
logger = logging.getLogger(__name__)
+ borg_logger = logging.getLogger('borg')
+ borg_logger.json = json
logger.debug('using logging configuration read from "{0}"'.format(conf_fname))
warnings.showwarning = _log_warning
return None
| {"golden_diff": "diff --git a/src/borg/logger.py b/src/borg/logger.py\n--- a/src/borg/logger.py\n+++ b/src/borg/logger.py\n@@ -80,6 +80,8 @@\n logging.config.fileConfig(f)\n configured = True\n logger = logging.getLogger(__name__)\n+ borg_logger = logging.getLogger('borg')\n+ borg_logger.json = json\n logger.debug('using logging configuration read from \"{0}\"'.format(conf_fname))\n warnings.showwarning = _log_warning\n return None\n", "issue": "1.1.0: using a logging config causes exception\nI've been using Borg 1.0.x with a [simple logging config](https://github.com/borgbackup/borg/files/1369192/logging.conf.txt) to get a logging behaviour suitable for cronjobs, that is: everything goes to the logfile, errors and warnings also go to stderr (and are sent via mail, by cron). Using the same logging config with Borg 1.1.0 causes an exception:\r\n\r\n```\r\n2017-10-09 06:05:09 [ERROR] Local Exception\r\n2017-10-09 06:05:09 [ERROR] Traceback (most recent call last):\r\n File \"borg/archiver.py\", line 4024, in main\r\n File \"borg/archiver.py\", line 3952, in run\r\n File \"borg/archiver.py\", line 130, in wrapper\r\n File \"borg/remote.py\", line 562, in __init__\r\n File \"borg/remote.py\", line 699, in call\r\n File \"borg/remote.py\", line 841, in call_many\r\n File \"borg/remote.py\", line 989, in handle_remote_line\r\nAttributeError: 'Logger' object has no attribute 'json'\r\n```\r\n\r\nWhen not using a logging config, `setup_logging` will set up two loggers, the second one explicitly named `borg` and having a custom `json` attribute (see logging.py, lines 97-99). While I could add a `borg` logger to my logging config there seems to be no way to add the required custom `json` attribute within the fileConfig format.\n", "before_files": [{"content": "\"\"\"logging facilities\n\nThe way to use this is as follows:\n\n* each module declares its own logger, using:\n\n from .logger import create_logger\n logger = create_logger()\n\n* then each module uses logger.info/warning/debug/etc according to the\n level it believes is appropriate:\n\n logger.debug('debugging info for developers or power users')\n logger.info('normal, informational output')\n logger.warning('warn about a non-fatal error or sth else')\n logger.error('a fatal error')\n\n ... and so on. see the `logging documentation\n <https://docs.python.org/3/howto/logging.html#when-to-use-logging>`_\n for more information\n\n* console interaction happens on stderr, that includes interactive\n reporting functions like `help`, `info` and `list`\n\n* ...except ``input()`` is special, because we can't control the\n stream it is using, unfortunately. we assume that it won't clutter\n stdout, because interaction would be broken then anyways\n\n* what is output on INFO level is additionally controlled by commandline\n flags\n\"\"\"\n\nimport inspect\nimport json\nimport logging\nimport logging.config\nimport logging.handlers # needed for handlers defined there being configurable in logging.conf file\nimport os\nimport warnings\n\nconfigured = False\n\n# use something like this to ignore warnings:\n# warnings.filterwarnings('ignore', r'... regex for warning message to ignore ...')\n\n\ndef _log_warning(message, category, filename, lineno, file=None, line=None):\n # for warnings, we just want to use the logging system, not stderr or other files\n msg = \"{0}:{1}: {2}: {3}\".format(filename, lineno, category.__name__, message)\n logger = create_logger(__name__)\n # Note: the warning will look like coming from here,\n # but msg contains info about where it really comes from\n logger.warning(msg)\n\n\ndef setup_logging(stream=None, conf_fname=None, env_var='BORG_LOGGING_CONF', level='info', is_serve=False, json=False):\n \"\"\"setup logging module according to the arguments provided\n\n if conf_fname is given (or the config file name can be determined via\n the env_var, if given): load this logging configuration.\n\n otherwise, set up a stream handler logger on stderr (by default, if no\n stream is provided).\n\n if is_serve == True, we configure a special log format as expected by\n the borg client log message interceptor.\n \"\"\"\n global configured\n err_msg = None\n if env_var:\n conf_fname = os.environ.get(env_var, conf_fname)\n if conf_fname:\n try:\n conf_fname = os.path.abspath(conf_fname)\n # we open the conf file here to be able to give a reasonable\n # error message in case of failure (if we give the filename to\n # fileConfig(), it silently ignores unreadable files and gives\n # unhelpful error msgs like \"No section: 'formatters'\"):\n with open(conf_fname) as f:\n logging.config.fileConfig(f)\n configured = True\n logger = logging.getLogger(__name__)\n logger.debug('using logging configuration read from \"{0}\"'.format(conf_fname))\n warnings.showwarning = _log_warning\n return None\n except Exception as err: # XXX be more precise\n err_msg = str(err)\n # if we did not / not successfully load a logging configuration, fallback to this:\n logger = logging.getLogger('')\n handler = logging.StreamHandler(stream)\n if is_serve and not json:\n fmt = '$LOG %(levelname)s %(name)s Remote: %(message)s'\n else:\n fmt = '%(message)s'\n formatter = JsonFormatter(fmt) if json else logging.Formatter(fmt)\n handler.setFormatter(formatter)\n borg_logger = logging.getLogger('borg')\n borg_logger.formatter = formatter\n borg_logger.json = json\n if configured and logger.handlers:\n # The RepositoryServer can call setup_logging a second time to adjust the output\n # mode from text-ish is_serve to json is_serve.\n # Thus, remove the previously installed handler, if any.\n logger.handlers[0].close()\n logger.handlers.clear()\n logger.addHandler(handler)\n logger.setLevel(level.upper())\n configured = True\n logger = logging.getLogger(__name__)\n if err_msg:\n logger.warning('setup_logging for \"{0}\" failed with \"{1}\".'.format(conf_fname, err_msg))\n logger.debug('using builtin fallback logging configuration')\n warnings.showwarning = _log_warning\n return handler\n\n\ndef find_parent_module():\n \"\"\"find the name of a the first module calling this module\n\n if we cannot find it, we return the current module's name\n (__name__) instead.\n \"\"\"\n try:\n frame = inspect.currentframe().f_back\n module = inspect.getmodule(frame)\n while module is None or module.__name__ == __name__:\n frame = frame.f_back\n module = inspect.getmodule(frame)\n return module.__name__\n except AttributeError:\n # somehow we failed to find our module\n # return the logger module name by default\n return __name__\n\n\ndef create_logger(name=None):\n \"\"\"lazily create a Logger object with the proper path, which is returned by\n find_parent_module() by default, or is provided via the commandline\n\n this is really a shortcut for:\n\n logger = logging.getLogger(__name__)\n\n we use it to avoid errors and provide a more standard API.\n\n We must create the logger lazily, because this is usually called from\n module level (and thus executed at import time - BEFORE setup_logging()\n was called). By doing it lazily we can do the setup first, we just have to\n be careful not to call any logger methods before the setup_logging() call.\n If you try, you'll get an exception.\n \"\"\"\n class LazyLogger:\n def __init__(self, name=None):\n self.__name = name or find_parent_module()\n self.__real_logger = None\n\n @property\n def __logger(self):\n if self.__real_logger is None:\n if not configured:\n raise Exception(\"tried to call a logger before setup_logging() was called\")\n self.__real_logger = logging.getLogger(self.__name)\n if self.__name.startswith('borg.debug.') and self.__real_logger.level == logging.NOTSET:\n self.__real_logger.setLevel('WARNING')\n return self.__real_logger\n\n def getChild(self, suffix):\n return LazyLogger(self.__name + '.' + suffix)\n\n def setLevel(self, *args, **kw):\n return self.__logger.setLevel(*args, **kw)\n\n def log(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.log(*args, **kw)\n\n def exception(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.exception(*args, **kw)\n\n def debug(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.debug(*args, **kw)\n\n def info(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.info(*args, **kw)\n\n def warning(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.warning(*args, **kw)\n\n def error(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.error(*args, **kw)\n\n def critical(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.critical(*args, **kw)\n\n return LazyLogger(name)\n\n\nclass JsonFormatter(logging.Formatter):\n RECORD_ATTRIBUTES = (\n 'levelname',\n 'name',\n 'message',\n # msgid is an attribute we made up in Borg to expose a non-changing handle for log messages\n 'msgid',\n )\n\n # Other attributes that are not very useful but do exist:\n # processName, process, relativeCreated, stack_info, thread, threadName\n # msg == message\n # *args* are the unformatted arguments passed to the logger function, not useful now,\n # become useful if sanitized properly (must be JSON serializable) in the code +\n # fixed message IDs are assigned.\n # exc_info, exc_text are generally uninteresting because the message will have that\n\n def format(self, record):\n super().format(record)\n data = {\n 'type': 'log_message',\n 'time': record.created,\n 'message': '',\n 'levelname': 'CRITICAL',\n }\n for attr in self.RECORD_ATTRIBUTES:\n value = getattr(record, attr, None)\n if value:\n data[attr] = value\n return json.dumps(data)\n", "path": "src/borg/logger.py"}], "after_files": [{"content": "\"\"\"logging facilities\n\nThe way to use this is as follows:\n\n* each module declares its own logger, using:\n\n from .logger import create_logger\n logger = create_logger()\n\n* then each module uses logger.info/warning/debug/etc according to the\n level it believes is appropriate:\n\n logger.debug('debugging info for developers or power users')\n logger.info('normal, informational output')\n logger.warning('warn about a non-fatal error or sth else')\n logger.error('a fatal error')\n\n ... and so on. see the `logging documentation\n <https://docs.python.org/3/howto/logging.html#when-to-use-logging>`_\n for more information\n\n* console interaction happens on stderr, that includes interactive\n reporting functions like `help`, `info` and `list`\n\n* ...except ``input()`` is special, because we can't control the\n stream it is using, unfortunately. we assume that it won't clutter\n stdout, because interaction would be broken then anyways\n\n* what is output on INFO level is additionally controlled by commandline\n flags\n\"\"\"\n\nimport inspect\nimport json\nimport logging\nimport logging.config\nimport logging.handlers # needed for handlers defined there being configurable in logging.conf file\nimport os\nimport warnings\n\nconfigured = False\n\n# use something like this to ignore warnings:\n# warnings.filterwarnings('ignore', r'... regex for warning message to ignore ...')\n\n\ndef _log_warning(message, category, filename, lineno, file=None, line=None):\n # for warnings, we just want to use the logging system, not stderr or other files\n msg = \"{0}:{1}: {2}: {3}\".format(filename, lineno, category.__name__, message)\n logger = create_logger(__name__)\n # Note: the warning will look like coming from here,\n # but msg contains info about where it really comes from\n logger.warning(msg)\n\n\ndef setup_logging(stream=None, conf_fname=None, env_var='BORG_LOGGING_CONF', level='info', is_serve=False, json=False):\n \"\"\"setup logging module according to the arguments provided\n\n if conf_fname is given (or the config file name can be determined via\n the env_var, if given): load this logging configuration.\n\n otherwise, set up a stream handler logger on stderr (by default, if no\n stream is provided).\n\n if is_serve == True, we configure a special log format as expected by\n the borg client log message interceptor.\n \"\"\"\n global configured\n err_msg = None\n if env_var:\n conf_fname = os.environ.get(env_var, conf_fname)\n if conf_fname:\n try:\n conf_fname = os.path.abspath(conf_fname)\n # we open the conf file here to be able to give a reasonable\n # error message in case of failure (if we give the filename to\n # fileConfig(), it silently ignores unreadable files and gives\n # unhelpful error msgs like \"No section: 'formatters'\"):\n with open(conf_fname) as f:\n logging.config.fileConfig(f)\n configured = True\n logger = logging.getLogger(__name__)\n borg_logger = logging.getLogger('borg')\n borg_logger.json = json\n logger.debug('using logging configuration read from \"{0}\"'.format(conf_fname))\n warnings.showwarning = _log_warning\n return None\n except Exception as err: # XXX be more precise\n err_msg = str(err)\n # if we did not / not successfully load a logging configuration, fallback to this:\n logger = logging.getLogger('')\n handler = logging.StreamHandler(stream)\n if is_serve and not json:\n fmt = '$LOG %(levelname)s %(name)s Remote: %(message)s'\n else:\n fmt = '%(message)s'\n formatter = JsonFormatter(fmt) if json else logging.Formatter(fmt)\n handler.setFormatter(formatter)\n borg_logger = logging.getLogger('borg')\n borg_logger.formatter = formatter\n borg_logger.json = json\n if configured and logger.handlers:\n # The RepositoryServer can call setup_logging a second time to adjust the output\n # mode from text-ish is_serve to json is_serve.\n # Thus, remove the previously installed handler, if any.\n logger.handlers[0].close()\n logger.handlers.clear()\n logger.addHandler(handler)\n logger.setLevel(level.upper())\n configured = True\n logger = logging.getLogger(__name__)\n if err_msg:\n logger.warning('setup_logging for \"{0}\" failed with \"{1}\".'.format(conf_fname, err_msg))\n logger.debug('using builtin fallback logging configuration')\n warnings.showwarning = _log_warning\n return handler\n\n\ndef find_parent_module():\n \"\"\"find the name of a the first module calling this module\n\n if we cannot find it, we return the current module's name\n (__name__) instead.\n \"\"\"\n try:\n frame = inspect.currentframe().f_back\n module = inspect.getmodule(frame)\n while module is None or module.__name__ == __name__:\n frame = frame.f_back\n module = inspect.getmodule(frame)\n return module.__name__\n except AttributeError:\n # somehow we failed to find our module\n # return the logger module name by default\n return __name__\n\n\ndef create_logger(name=None):\n \"\"\"lazily create a Logger object with the proper path, which is returned by\n find_parent_module() by default, or is provided via the commandline\n\n this is really a shortcut for:\n\n logger = logging.getLogger(__name__)\n\n we use it to avoid errors and provide a more standard API.\n\n We must create the logger lazily, because this is usually called from\n module level (and thus executed at import time - BEFORE setup_logging()\n was called). By doing it lazily we can do the setup first, we just have to\n be careful not to call any logger methods before the setup_logging() call.\n If you try, you'll get an exception.\n \"\"\"\n class LazyLogger:\n def __init__(self, name=None):\n self.__name = name or find_parent_module()\n self.__real_logger = None\n\n @property\n def __logger(self):\n if self.__real_logger is None:\n if not configured:\n raise Exception(\"tried to call a logger before setup_logging() was called\")\n self.__real_logger = logging.getLogger(self.__name)\n if self.__name.startswith('borg.debug.') and self.__real_logger.level == logging.NOTSET:\n self.__real_logger.setLevel('WARNING')\n return self.__real_logger\n\n def getChild(self, suffix):\n return LazyLogger(self.__name + '.' + suffix)\n\n def setLevel(self, *args, **kw):\n return self.__logger.setLevel(*args, **kw)\n\n def log(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.log(*args, **kw)\n\n def exception(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.exception(*args, **kw)\n\n def debug(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.debug(*args, **kw)\n\n def info(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.info(*args, **kw)\n\n def warning(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.warning(*args, **kw)\n\n def error(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.error(*args, **kw)\n\n def critical(self, *args, **kw):\n if 'msgid' in kw:\n kw.setdefault('extra', {})['msgid'] = kw.pop('msgid')\n return self.__logger.critical(*args, **kw)\n\n return LazyLogger(name)\n\n\nclass JsonFormatter(logging.Formatter):\n RECORD_ATTRIBUTES = (\n 'levelname',\n 'name',\n 'message',\n # msgid is an attribute we made up in Borg to expose a non-changing handle for log messages\n 'msgid',\n )\n\n # Other attributes that are not very useful but do exist:\n # processName, process, relativeCreated, stack_info, thread, threadName\n # msg == message\n # *args* are the unformatted arguments passed to the logger function, not useful now,\n # become useful if sanitized properly (must be JSON serializable) in the code +\n # fixed message IDs are assigned.\n # exc_info, exc_text are generally uninteresting because the message will have that\n\n def format(self, record):\n super().format(record)\n data = {\n 'type': 'log_message',\n 'time': record.created,\n 'message': '',\n 'levelname': 'CRITICAL',\n }\n for attr in self.RECORD_ATTRIBUTES:\n value = getattr(record, attr, None)\n if value:\n data[attr] = value\n return json.dumps(data)\n", "path": "src/borg/logger.py"}]} | 3,285 | 114 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.