| problem_id (stringlengths 18-22) | source (stringclasses 1 value) | task_type (stringclasses 1 value) | in_source_id (stringlengths 13-58) | prompt (stringlengths 1.1k-10.2k) | golden_diff (stringlengths 151-4.94k) | verification_info (stringlengths 582-21k) | num_tokens (int64 271-2.05k) | num_tokens_diff (int64 47-1.02k) |
|---|---|---|---|---|---|---|---|---|
gh_patches_debug_555 | rasdani/github-patches | git_diff | pex-tool__pex-753 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Release 1.6.9
On the docket:
+ [x] Fix pkg resource early import #750
+ [x] Fix `sys.path` scrubbing of pex extras modules. #752
Maybe on the docket:
+ [ ] Problem running PEX files with restricted permissions #746
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pex/version.py`
Content:
```
1 # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 __version__ = '1.6.8'
5
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pex/version.py b/pex/version.py
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '1.6.8'
+__version__ = '1.6.9'
| {"golden_diff": "diff --git a/pex/version.py b/pex/version.py\n--- a/pex/version.py\n+++ b/pex/version.py\n@@ -1,4 +1,4 @@\n # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n # Licensed under the Apache License, Version 2.0 (see LICENSE).\n \n-__version__ = '1.6.8'\n+__version__ = '1.6.9'\n", "issue": "Release 1.6.9\nOn the docket:\r\n+ [x] Fix pkg resource early import #750\r\n+ [x] Fix `sys.path` scrubbing of pex extras modules. #752\r\n\r\nMaybe on the docket:\r\n+ [ ] Problem running PEX files with restricted permissions #746\n", "before_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.8'\n", "path": "pex/version.py"}], "after_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.9'\n", "path": "pex/version.py"}]} | 377 | 94 |
gh_patches_debug_785 | rasdani/github-patches | git_diff | translate__pootle-5666 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Silence dev checks in the admin UI as well
There's the following section in the Pootle dev config:
``` Python
# Silence the DEBUG check on dev servers
SILENCED_SYSTEM_CHECKS = [
'pootle.W004', # python-levenstein not installed
'pootle.W005', # DEBUG = True
'pootle.W010', # DEFAULT_FROM_EMAIL has default setting
'pootle.W011', # POOTLE_CONTACT_EMAIL has default setting
]
```
When running Pootle, I still get these notifications in the admin UI among others:
> `/!\` DEBUG mode is on. Do not do this in production! Set DEBUG = False in Pootle settings
>
> `/!\` POOTLE_CONTACT_EMAIL is using the following default setting 'info@YOUR_DOMAIN.com'. POOTLE_CONTACT_EMAIL is the address that will receive messages sent by the contact form.
>
> `/!\` DEFAULT_FROM_EMAIL is using the following default setting 'webmaster@localhost'. DEFAULT_FROM_EMAIL is used in all outgoing Pootle email. Don't forget to review your mail server settings.
I think it might make sense to silence them consistently not only in the console, but in the admin UI as well.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pootle/apps/pootle_app/views/admin/dashboard.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 #
3 # Copyright (C) Pootle contributors.
4 #
5 # This file is a part of the Pootle project. It is distributed under the GPL3
6 # or later license. See the LICENSE file for a copy of the license and the
7 # AUTHORS file for copyright and authorship information.
8
9 import locale
10 import os
11
12 from redis.exceptions import ConnectionError
13
14 from django.contrib.auth import get_user_model
15 from django.core.cache import cache
16 from django.shortcuts import render
17
18 from django_rq.queues import get_failed_queue, get_queue
19 from django_rq.workers import Worker
20
21 from pootle.core.decorators import admin_required
22 from pootle.i18n.gettext import ugettext as _, ungettext
23 from pootle_statistics.models import Submission
24 from pootle_store.models import Suggestion
25
26
27 def _format_numbers(numbers):
28 for k in numbers.keys():
29 formatted_number = locale.format("%d", numbers[k], grouping=True)
30 # Under Windows, formatted number must be converted to Unicode
31 if os.name == 'nt':
32 formatted_number = formatted_number.decode(
33 locale.getpreferredencoding()
34 )
35 numbers[k] = formatted_number
36
37
38 def server_stats():
39 User = get_user_model()
40 result = cache.get("server_stats")
41 if result is None:
42 result = {}
43 result['user_count'] = max(User.objects.filter(
44 is_active=True).count()-2, 0)
45 # 'default' and 'nobody' might be counted
46 # FIXME: the special users should not be retuned with is_active
47 result['submission_count'] = Submission.objects.count()
48 result['pending_count'] = Suggestion.objects.pending().count()
49 cache.set("server_stats", result, 86400)
50 _format_numbers(result)
51 return result
52
53
54 def rq_stats():
55 queue = get_queue()
56 failed_queue = get_failed_queue()
57 try:
58 workers = Worker.all(queue.connection)
59 except ConnectionError:
60 return None
61
62 num_workers = len(workers)
63 is_running = len(queue.connection.smembers(Worker.redis_workers_keys)) > 0
64 if is_running:
65 # Translators: this refers to the status of the background job worker
66 status_msg = ungettext('Running (%d worker)', 'Running (%d workers)',
67 num_workers) % num_workers
68 else:
69 # Translators: this refers to the status of the background job worker
70 status_msg = _('Stopped')
71
72 result = {
73 'job_count': queue.count,
74 'failed_job_count': failed_queue.count,
75 'is_running': is_running,
76 'status_msg': status_msg,
77 }
78
79 return result
80
81
82 def checks():
83 from django.core.checks.registry import registry
84
85 return registry.run_checks()
86
87
88 @admin_required
89 def view(request):
90 ctx = {
91 'page': 'admin-dashboard',
92 'server_stats': server_stats(),
93 'rq_stats': rq_stats(),
94 'checks': checks(),
95 }
96 return render(request, "admin/dashboard.html", ctx)
97
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pootle/apps/pootle_app/views/admin/dashboard.py b/pootle/apps/pootle_app/views/admin/dashboard.py
--- a/pootle/apps/pootle_app/views/admin/dashboard.py
+++ b/pootle/apps/pootle_app/views/admin/dashboard.py
@@ -82,7 +82,7 @@
def checks():
from django.core.checks.registry import registry
- return registry.run_checks()
+ return [e for e in registry.run_checks() if not e.is_silenced()]
@admin_required
| {"golden_diff": "diff --git a/pootle/apps/pootle_app/views/admin/dashboard.py b/pootle/apps/pootle_app/views/admin/dashboard.py\n--- a/pootle/apps/pootle_app/views/admin/dashboard.py\n+++ b/pootle/apps/pootle_app/views/admin/dashboard.py\n@@ -82,7 +82,7 @@\n def checks():\n from django.core.checks.registry import registry\n \n- return registry.run_checks()\n+ return [e for e in registry.run_checks() if not e.is_silenced()]\n \n \n @admin_required\n", "issue": "Silence dev checks in the admin UI as well\nThere's the following section in the Pootle dev config:\n\n``` Python\n# Silence the DEBUG check on dev servers\nSILENCED_SYSTEM_CHECKS = [\n 'pootle.W004', # python-levenstein not installed\n 'pootle.W005', # DEBUG = True\n 'pootle.W010', # DEFAULT_FROM_EMAIL has default setting\n 'pootle.W011', # POOTLE_CONTACT_EMAIL has default setting\n]\n```\n\nWhen running Pootle, I still get these notifications in the admin UI among others:\n\n> `/!\\` DEBUG mode is on. Do not do this in production! Set DEBUG = False in Pootle settings\n> \n> `/!\\` POOTLE_CONTACT_EMAIL is using the following default setting 'info@YOUR_DOMAIN.com'. POOTLE_CONTACT_EMAIL is the address that will receive messages sent by the contact form.\n> \n> `/!\\` DEFAULT_FROM_EMAIL is using the following default setting 'webmaster@localhost'. DEFAULT_FROM_EMAIL is used in all outgoing Pootle email. Don't forget to review your mail server settings.\n\nI think it might make sense to silence them consistently not only in the console, but in the admin UI as well.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nimport locale\nimport os\n\nfrom redis.exceptions import ConnectionError\n\nfrom django.contrib.auth import get_user_model\nfrom django.core.cache import cache\nfrom django.shortcuts import render\n\nfrom django_rq.queues import get_failed_queue, get_queue\nfrom django_rq.workers import Worker\n\nfrom pootle.core.decorators import admin_required\nfrom pootle.i18n.gettext import ugettext as _, ungettext\nfrom pootle_statistics.models import Submission\nfrom pootle_store.models import Suggestion\n\n\ndef _format_numbers(numbers):\n for k in numbers.keys():\n formatted_number = locale.format(\"%d\", numbers[k], grouping=True)\n # Under Windows, formatted number must be converted to Unicode\n if os.name == 'nt':\n formatted_number = formatted_number.decode(\n locale.getpreferredencoding()\n )\n numbers[k] = formatted_number\n\n\ndef server_stats():\n User = get_user_model()\n result = cache.get(\"server_stats\")\n if result is None:\n result = {}\n result['user_count'] = max(User.objects.filter(\n is_active=True).count()-2, 0)\n # 'default' and 'nobody' might be counted\n # FIXME: the special users should not be retuned with is_active\n result['submission_count'] = Submission.objects.count()\n result['pending_count'] = Suggestion.objects.pending().count()\n cache.set(\"server_stats\", result, 86400)\n _format_numbers(result)\n return result\n\n\ndef rq_stats():\n queue = get_queue()\n failed_queue = get_failed_queue()\n try:\n workers = Worker.all(queue.connection)\n except ConnectionError:\n return None\n\n num_workers = len(workers)\n is_running = len(queue.connection.smembers(Worker.redis_workers_keys)) > 0\n if is_running:\n # Translators: this refers to the 
status of the background job worker\n status_msg = ungettext('Running (%d worker)', 'Running (%d workers)',\n num_workers) % num_workers\n else:\n # Translators: this refers to the status of the background job worker\n status_msg = _('Stopped')\n\n result = {\n 'job_count': queue.count,\n 'failed_job_count': failed_queue.count,\n 'is_running': is_running,\n 'status_msg': status_msg,\n }\n\n return result\n\n\ndef checks():\n from django.core.checks.registry import registry\n\n return registry.run_checks()\n\n\n@admin_required\ndef view(request):\n ctx = {\n 'page': 'admin-dashboard',\n 'server_stats': server_stats(),\n 'rq_stats': rq_stats(),\n 'checks': checks(),\n }\n return render(request, \"admin/dashboard.html\", ctx)\n", "path": "pootle/apps/pootle_app/views/admin/dashboard.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nimport locale\nimport os\n\nfrom redis.exceptions import ConnectionError\n\nfrom django.contrib.auth import get_user_model\nfrom django.core.cache import cache\nfrom django.shortcuts import render\n\nfrom django_rq.queues import get_failed_queue, get_queue\nfrom django_rq.workers import Worker\n\nfrom pootle.core.decorators import admin_required\nfrom pootle.i18n.gettext import ugettext as _, ungettext\nfrom pootle_statistics.models import Submission\nfrom pootle_store.models import Suggestion\n\n\ndef _format_numbers(numbers):\n for k in numbers.keys():\n formatted_number = locale.format(\"%d\", numbers[k], grouping=True)\n # Under Windows, formatted number must be converted to Unicode\n if os.name == 'nt':\n formatted_number = formatted_number.decode(\n locale.getpreferredencoding()\n )\n numbers[k] = formatted_number\n\n\ndef server_stats():\n User = get_user_model()\n result = cache.get(\"server_stats\")\n if result is None:\n result = {}\n result['user_count'] = max(User.objects.filter(\n is_active=True).count()-2, 0)\n # 'default' and 'nobody' might be counted\n # FIXME: the special users should not be retuned with is_active\n result['submission_count'] = Submission.objects.count()\n result['pending_count'] = Suggestion.objects.pending().count()\n cache.set(\"server_stats\", result, 86400)\n _format_numbers(result)\n return result\n\n\ndef rq_stats():\n queue = get_queue()\n failed_queue = get_failed_queue()\n try:\n workers = Worker.all(queue.connection)\n except ConnectionError:\n return None\n\n num_workers = len(workers)\n is_running = len(queue.connection.smembers(Worker.redis_workers_keys)) > 0\n if is_running:\n # Translators: this refers to the status of the background job worker\n status_msg = ungettext('Running (%d worker)', 'Running (%d workers)',\n num_workers) % num_workers\n else:\n # Translators: this refers to the status of the background job worker\n status_msg = _('Stopped')\n\n result = {\n 'job_count': queue.count,\n 'failed_job_count': failed_queue.count,\n 'is_running': is_running,\n 'status_msg': status_msg,\n }\n\n return result\n\n\ndef checks():\n from django.core.checks.registry import registry\n\n return [e for e in registry.run_checks() if not e.is_silenced()]\n\n\n@admin_required\ndef view(request):\n ctx = {\n 'page': 'admin-dashboard',\n 'server_stats': server_stats(),\n 'rq_stats': rq_stats(),\n 'checks': checks(),\n }\n return render(request, 
\"admin/dashboard.html\", ctx)\n", "path": "pootle/apps/pootle_app/views/admin/dashboard.py"}]} | 1,387 | 117 |
gh_patches_debug_3666 | rasdani/github-patches | git_diff | carpentries__amy-245 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Use Bower for JS, CSS dependencies
We need something to manage our JavaScript (and perhaps CSS) dependencies and this tool is called [Bower](http://bower.io).
@wking [wrote](https://github.com/swcarpentry/amy/pull/239#issuecomment-89585714) some handy instructions on getting Bower up and running.
Bower packages we'd use for now:
- `bootstrap`
- `bootstrap-datepicker`
(This should land after #239)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `amy/settings.py`
Content:
```
1 """
2 Django settings for amy project.
3
4 For more information on this file, see
5 https://docs.djangoproject.com/en/1.7/topics/settings/
6
7 For the full list of settings and their values, see
8 https://docs.djangoproject.com/en/1.7/ref/settings/
9 """
10
11 # Build paths inside the project like this: os.path.join(BASE_DIR, ...)
12 import os
13 import json
14
15 from django.conf import global_settings
16
17 BASE_DIR = os.path.dirname(os.path.dirname(__file__))
18
19
20 # Quick-start development settings - unsuitable for production
21 # See https://docs.djangoproject.com/en/1.7/howto/deployment/checklist/
22
23
24 # SECURITY WARNING: don't run with DEBUG turned on in production!
25 DEBUG = json.loads(os.environ.get('AMY_DEBUG', 'true'))
26 # For deployment in production:
27 # AMY_DEBUG=false AMY_SECRET_KEY="..." ./manage.py runserver ...
28
29 if DEBUG:
30 SECRET_KEY = '3l$35+@a%g!(^y^98oi%ei+%+yvtl3y0k^_7-fmx2oj09-ac5@'
31 else:
32 SECRET_KEY = None
33 SECRET_KEY = os.environ.get('AMY_SECRET_KEY', SECRET_KEY)
34
35
36 TEMPLATE_DEBUG = True
37
38 ALLOWED_HOSTS = [
39 'software-carpentry.org',
40 'software-carpentry.org.'
41 ]
42
43
44 # Application definition
45
46 INSTALLED_APPS = (
47 'django.contrib.auth',
48 'django.contrib.contenttypes',
49 'django.contrib.sessions',
50 'django.contrib.messages',
51 'django.contrib.staticfiles',
52 'workshops',
53 # this should be after 'workshops' because templates in
54 # 'templates/registration/' clash
55 'django.contrib.admin',
56 'crispy_forms',
57 )
58
59 CRISPY_TEMPLATE_PACK = 'bootstrap3'
60
61 MIDDLEWARE_CLASSES = (
62 'django.contrib.sessions.middleware.SessionMiddleware',
63 'django.middleware.common.CommonMiddleware',
64 'django.middleware.csrf.CsrfViewMiddleware',
65 'django.contrib.auth.middleware.AuthenticationMiddleware',
66 'django.contrib.auth.middleware.SessionAuthenticationMiddleware',
67 'django.contrib.messages.middleware.MessageMiddleware',
68 'django.middleware.clickjacking.XFrameOptionsMiddleware',
69 )
70
71 # WARNING: in Django 1.8 processors get moved from ``django.core`` to
72 # ``django.template``.
73 TEMPLATE_CONTEXT_PROCESSORS = global_settings.TEMPLATE_CONTEXT_PROCESSORS + (
74 "django.core.context_processors.request",
75 )
76
77 ROOT_URLCONF = 'amy.urls'
78
79 WSGI_APPLICATION = 'amy.wsgi.application'
80
81 from django.contrib.messages import constants as message_constants
82 MESSAGE_TAGS = {
83 message_constants.INFO: 'alert-info',
84 message_constants.SUCCESS: 'alert-success',
85 message_constants.WARNING: 'alert-warning',
86 message_constants.ERROR: 'alert-danger',
87 }
88
89
90 # Database
91 # https://docs.djangoproject.com/en/1.7/ref/settings/#databases
92
93 DATABASES = {
94 'default': {
95 'ENGINE': 'django.db.backends.sqlite3',
96 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
97 }
98 }
99
100 # Authentication
101
102 AUTH_USER_MODEL = 'workshops.Person'
103
104 # Internationalization
105 # https://docs.djangoproject.com/en/1.7/topics/i18n/
106
107 LANGUAGE_CODE = 'en-us'
108
109 TIME_ZONE = 'EST'
110
111 USE_I18N = True
112
113 USE_L10N = True
114
115 USE_TZ = True
116
117
118 # Static files (CSS, JavaScript, Images)
119 # https://docs.djangoproject.com/en/1.7/howto/static-files/
120
121 STATIC_URL = '/static/'
122
123 # Warn viewers of invalid template strings
124 TEMPLATE_STRING_IF_INVALID = 'XXX-unset-variable-XXX'
125
126 # if "next" (or "?next") variable is not set when logging in, redirect to
127 # workshops
128 LOGIN_REDIRECT_URL = '/workshops/'
129
130 # here's where @login_required redirects to:
131 LOGIN_URL = '/account/login/'
132
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/amy/settings.py b/amy/settings.py
--- a/amy/settings.py
+++ b/amy/settings.py
@@ -119,6 +119,10 @@
# https://docs.djangoproject.com/en/1.7/howto/static-files/
STATIC_URL = '/static/'
+STATIC_ROOT = os.path.join(BASE_DIR, 'static')
+STATICFILES_DIRS = (
+ os.path.join(BASE_DIR, 'bower_components'),
+)
# Warn viewers of invalid template strings
TEMPLATE_STRING_IF_INVALID = 'XXX-unset-variable-XXX'
| {"golden_diff": "diff --git a/amy/settings.py b/amy/settings.py\n--- a/amy/settings.py\n+++ b/amy/settings.py\n@@ -119,6 +119,10 @@\n # https://docs.djangoproject.com/en/1.7/howto/static-files/\n \n STATIC_URL = '/static/'\n+STATIC_ROOT = os.path.join(BASE_DIR, 'static')\n+STATICFILES_DIRS = (\n+ os.path.join(BASE_DIR, 'bower_components'),\n+)\n \n # Warn viewers of invalid template strings\n TEMPLATE_STRING_IF_INVALID = 'XXX-unset-variable-XXX'\n", "issue": "Use Bower for JS, CSS dependencies\nWe need something to manage our JavaScript (and perhaps CSS) dependencies and this tool is called [Bower](http://bower.io).\n\n@wking [wrote](https://github.com/swcarpentry/amy/pull/239#issuecomment-89585714) some handy instructions on getting Bower up and running.\n\nBower packages we'd use for now:\n- `bootstrap`\n- `bootstrap-datepicker`\n\n(This should land after #239)\n\n", "before_files": [{"content": "\"\"\"\nDjango settings for amy project.\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/1.7/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/en/1.7/ref/settings/\n\"\"\"\n\n# Build paths inside the project like this: os.path.join(BASE_DIR, ...)\nimport os\nimport json\n\nfrom django.conf import global_settings\n\nBASE_DIR = os.path.dirname(os.path.dirname(__file__))\n\n\n# Quick-start development settings - unsuitable for production\n# See https://docs.djangoproject.com/en/1.7/howto/deployment/checklist/\n\n\n# SECURITY WARNING: don't run with DEBUG turned on in production!\nDEBUG = json.loads(os.environ.get('AMY_DEBUG', 'true'))\n# For deployment in production:\n# AMY_DEBUG=false AMY_SECRET_KEY=\"...\" ./manage.py runserver ...\n\nif DEBUG:\n SECRET_KEY = '3l$35+@a%g!(^y^98oi%ei+%+yvtl3y0k^_7-fmx2oj09-ac5@'\nelse:\n SECRET_KEY = None\nSECRET_KEY = os.environ.get('AMY_SECRET_KEY', SECRET_KEY)\n\n\nTEMPLATE_DEBUG = True\n\nALLOWED_HOSTS = [\n 'software-carpentry.org',\n 'software-carpentry.org.'\n]\n\n\n# Application definition\n\nINSTALLED_APPS = (\n 'django.contrib.auth',\n 'django.contrib.contenttypes',\n 'django.contrib.sessions',\n 'django.contrib.messages',\n 'django.contrib.staticfiles',\n 'workshops',\n # this should be after 'workshops' because templates in\n # 'templates/registration/' clash\n 'django.contrib.admin',\n 'crispy_forms',\n)\n\nCRISPY_TEMPLATE_PACK = 'bootstrap3'\n\nMIDDLEWARE_CLASSES = (\n 'django.contrib.sessions.middleware.SessionMiddleware',\n 'django.middleware.common.CommonMiddleware',\n 'django.middleware.csrf.CsrfViewMiddleware',\n 'django.contrib.auth.middleware.AuthenticationMiddleware',\n 'django.contrib.auth.middleware.SessionAuthenticationMiddleware',\n 'django.contrib.messages.middleware.MessageMiddleware',\n 'django.middleware.clickjacking.XFrameOptionsMiddleware',\n)\n\n# WARNING: in Django 1.8 processors get moved from ``django.core`` to\n# ``django.template``.\nTEMPLATE_CONTEXT_PROCESSORS = global_settings.TEMPLATE_CONTEXT_PROCESSORS + (\n \"django.core.context_processors.request\",\n)\n\nROOT_URLCONF = 'amy.urls'\n\nWSGI_APPLICATION = 'amy.wsgi.application'\n\nfrom django.contrib.messages import constants as message_constants\nMESSAGE_TAGS = {\n message_constants.INFO: 'alert-info',\n message_constants.SUCCESS: 'alert-success',\n message_constants.WARNING: 'alert-warning',\n message_constants.ERROR: 'alert-danger',\n}\n\n\n# Database\n# https://docs.djangoproject.com/en/1.7/ref/settings/#databases\n\nDATABASES = {\n 'default': {\n 'ENGINE': 
'django.db.backends.sqlite3',\n 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),\n }\n}\n\n# Authentication\n\nAUTH_USER_MODEL = 'workshops.Person'\n\n# Internationalization\n# https://docs.djangoproject.com/en/1.7/topics/i18n/\n\nLANGUAGE_CODE = 'en-us'\n\nTIME_ZONE = 'EST'\n\nUSE_I18N = True\n\nUSE_L10N = True\n\nUSE_TZ = True\n\n\n# Static files (CSS, JavaScript, Images)\n# https://docs.djangoproject.com/en/1.7/howto/static-files/\n\nSTATIC_URL = '/static/'\n\n# Warn viewers of invalid template strings\nTEMPLATE_STRING_IF_INVALID = 'XXX-unset-variable-XXX'\n\n# if \"next\" (or \"?next\") variable is not set when logging in, redirect to\n# workshops\nLOGIN_REDIRECT_URL = '/workshops/'\n\n# here's where @login_required redirects to:\nLOGIN_URL = '/account/login/'\n", "path": "amy/settings.py"}], "after_files": [{"content": "\"\"\"\nDjango settings for amy project.\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/1.7/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/en/1.7/ref/settings/\n\"\"\"\n\n# Build paths inside the project like this: os.path.join(BASE_DIR, ...)\nimport os\nimport json\n\nfrom django.conf import global_settings\n\nBASE_DIR = os.path.dirname(os.path.dirname(__file__))\n\n\n# Quick-start development settings - unsuitable for production\n# See https://docs.djangoproject.com/en/1.7/howto/deployment/checklist/\n\n\n# SECURITY WARNING: don't run with DEBUG turned on in production!\nDEBUG = json.loads(os.environ.get('AMY_DEBUG', 'true'))\n# For deployment in production:\n# AMY_DEBUG=false AMY_SECRET_KEY=\"...\" ./manage.py runserver ...\n\nif DEBUG:\n SECRET_KEY = '3l$35+@a%g!(^y^98oi%ei+%+yvtl3y0k^_7-fmx2oj09-ac5@'\nelse:\n SECRET_KEY = None\nSECRET_KEY = os.environ.get('AMY_SECRET_KEY', SECRET_KEY)\n\n\nTEMPLATE_DEBUG = True\n\nALLOWED_HOSTS = [\n 'software-carpentry.org',\n 'software-carpentry.org.'\n]\n\n\n# Application definition\n\nINSTALLED_APPS = (\n 'django.contrib.auth',\n 'django.contrib.contenttypes',\n 'django.contrib.sessions',\n 'django.contrib.messages',\n 'django.contrib.staticfiles',\n 'workshops',\n # this should be after 'workshops' because templates in\n # 'templates/registration/' clash\n 'django.contrib.admin',\n 'crispy_forms',\n)\n\nCRISPY_TEMPLATE_PACK = 'bootstrap3'\n\nMIDDLEWARE_CLASSES = (\n 'django.contrib.sessions.middleware.SessionMiddleware',\n 'django.middleware.common.CommonMiddleware',\n 'django.middleware.csrf.CsrfViewMiddleware',\n 'django.contrib.auth.middleware.AuthenticationMiddleware',\n 'django.contrib.auth.middleware.SessionAuthenticationMiddleware',\n 'django.contrib.messages.middleware.MessageMiddleware',\n 'django.middleware.clickjacking.XFrameOptionsMiddleware',\n)\n\n# WARNING: in Django 1.8 processors get moved from ``django.core`` to\n# ``django.template``.\nTEMPLATE_CONTEXT_PROCESSORS = global_settings.TEMPLATE_CONTEXT_PROCESSORS + (\n \"django.core.context_processors.request\",\n)\n\nROOT_URLCONF = 'amy.urls'\n\nWSGI_APPLICATION = 'amy.wsgi.application'\n\nfrom django.contrib.messages import constants as message_constants\nMESSAGE_TAGS = {\n message_constants.INFO: 'alert-info',\n message_constants.SUCCESS: 'alert-success',\n message_constants.WARNING: 'alert-warning',\n message_constants.ERROR: 'alert-danger',\n}\n\n\n# Database\n# https://docs.djangoproject.com/en/1.7/ref/settings/#databases\n\nDATABASES = {\n 'default': {\n 'ENGINE': 'django.db.backends.sqlite3',\n 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),\n }\n}\n\n# 
Authentication\n\nAUTH_USER_MODEL = 'workshops.Person'\n\n# Internationalization\n# https://docs.djangoproject.com/en/1.7/topics/i18n/\n\nLANGUAGE_CODE = 'en-us'\n\nTIME_ZONE = 'EST'\n\nUSE_I18N = True\n\nUSE_L10N = True\n\nUSE_TZ = True\n\n\n# Static files (CSS, JavaScript, Images)\n# https://docs.djangoproject.com/en/1.7/howto/static-files/\n\nSTATIC_URL = '/static/'\nSTATIC_ROOT = os.path.join(BASE_DIR, 'static')\nSTATICFILES_DIRS = (\n os.path.join(BASE_DIR, 'bower_components'),\n)\n\n# Warn viewers of invalid template strings\nTEMPLATE_STRING_IF_INVALID = 'XXX-unset-variable-XXX'\n\n# if \"next\" (or \"?next\") variable is not set when logging in, redirect to\n# workshops\nLOGIN_REDIRECT_URL = '/workshops/'\n\n# here's where @login_required redirects to:\nLOGIN_URL = '/account/login/'\n", "path": "amy/settings.py"}]} | 1,485 | 121 |
gh_patches_debug_36040 | rasdani/github-patches | git_diff | geopandas__geopandas-1088 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
DOC: update the contextily example in the gallery
The example about adding a background map to your plot using contextily at https://geopandas.readthedocs.io/en/latest/gallery/plotting_basemap_background.html#sphx-glr-gallery-plotting-basemap-background-py can be updated with the latest contextily (https://github.com/darribas/contextily/releases/tag/v1.0rc2).
For example, the `add_basemap` function is now available in contextily itself.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/plotting_basemap_background.py`
Content:
```
1 """
2 Adding a background map to plots
3 --------------------------------
4
5 This example shows how you can add a background basemap to plots created
6 with the geopandas ``.plot()`` method. This makes use of the
7 `contextily <https://github.com/darribas/contextily>`__ package to retrieve
8 web map tiles from several sources (OpenStreetMap, Stamen).
9
10 """
11 # sphinx_gallery_thumbnail_number = 3
12 import geopandas
13
14 ###############################################################################
15 # Let's use the NYC borough boundary data that is available in geopandas
16 # datasets. Plotting this gives the following result:
17
18 df = geopandas.read_file(geopandas.datasets.get_path('nybb'))
19 ax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')
20
21 ###############################################################################
22 # Convert the data to Web Mercator
23 # ================================
24 #
25 # Web map tiles are typically provided in
26 # `Web Mercator <https://en.wikipedia.org/wiki/Web_Mercator>`__
27 # (`EPSG 3857 <https://epsg.io/3857>`__), so we need to make sure to convert
28 # our data first to the same CRS to combine our polygons and background tiles
29 # in the same map:
30
31 df = df.to_crs(epsg=3857)
32
33 ###############################################################################
34 # Contextily helper function
35 # ==========================
36 #
37 # We define a small helper function that uses
38 # `contextily <https://github.com/darribas/contextily>`__ to add a map
39 # as background to an existing plot:
40
41 import contextily as ctx
42
43 def add_basemap(ax, zoom, url='http://tile.stamen.com/terrain/tileZ/tileX/tileY.png'):
44 xmin, xmax, ymin, ymax = ax.axis()
45 basemap, extent = ctx.bounds2img(xmin, ymin, xmax, ymax, zoom=zoom, url=url)
46 ax.imshow(basemap, extent=extent, interpolation='bilinear')
47 # restore original x/y limits
48 ax.axis((xmin, xmax, ymin, ymax))
49
50 ###############################################################################
51 # Add background tiles to plot
52 # ============================
53 #
54 # Now we can use the above function to easily add a background map to our
55 # plot. The `zoom` keyword is required and let's you specify the detail of the
56 # map tiles (be careful to not specify a too high `zoom` level, as this can
57 # result in a large download):
58
59 ax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')
60 add_basemap(ax, zoom=10)
61
62 ###############################################################################
63 # By default, contextily uses the Stamen Terrain style. We can specify a
64 # different style using ``ctx.sources``:
65
66 ax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')
67 add_basemap(ax, zoom=11, url=ctx.sources.ST_TONER_LITE)
68 ax.set_axis_off()
69
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/examples/plotting_basemap_background.py b/examples/plotting_basemap_background.py
--- a/examples/plotting_basemap_background.py
+++ b/examples/plotting_basemap_background.py
@@ -1,5 +1,5 @@
"""
-Adding a background map to plots
+Adding a background map to plots
--------------------------------
This example shows how you can add a background basemap to plots created
@@ -31,38 +31,31 @@
df = df.to_crs(epsg=3857)
###############################################################################
-# Contextily helper function
-# ==========================
-#
-# We define a small helper function that uses
-# `contextily <https://github.com/darribas/contextily>`__ to add a map
-# as background to an existing plot:
import contextily as ctx
-def add_basemap(ax, zoom, url='http://tile.stamen.com/terrain/tileZ/tileX/tileY.png'):
- xmin, xmax, ymin, ymax = ax.axis()
- basemap, extent = ctx.bounds2img(xmin, ymin, xmax, ymax, zoom=zoom, url=url)
- ax.imshow(basemap, extent=extent, interpolation='bilinear')
- # restore original x/y limits
- ax.axis((xmin, xmax, ymin, ymax))
-
###############################################################################
# Add background tiles to plot
# ============================
#
-# Now we can use the above function to easily add a background map to our
-# plot. The `zoom` keyword is required and let's you specify the detail of the
-# map tiles (be careful to not specify a too high `zoom` level, as this can
-# result in a large download):
+# We can use `add_basemap` function of contextily to easily add a background
+# map to our plot. :
+
+ax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')
+ctx.add_basemap(ax)
+
+###############################################################################
+# We can control the detail of the map tiles using the optional `zoom` keyword
+# (be careful to not specify a too high `zoom` level,
+# as this can result in a large download).:
ax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')
-add_basemap(ax, zoom=10)
+ctx.add_basemap(ax, zoom=12)
###############################################################################
# By default, contextily uses the Stamen Terrain style. We can specify a
-# different style using ``ctx.sources``:
+# different style using ``ctx.providers``:
ax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')
-add_basemap(ax, zoom=11, url=ctx.sources.ST_TONER_LITE)
+ctx.add_basemap(ax, url=ctx.providers.Stamen.TonerLite)
ax.set_axis_off()
| {"golden_diff": "diff --git a/examples/plotting_basemap_background.py b/examples/plotting_basemap_background.py\n--- a/examples/plotting_basemap_background.py\n+++ b/examples/plotting_basemap_background.py\n@@ -1,5 +1,5 @@\n \"\"\"\n-Adding a background map to plots \n+Adding a background map to plots\n --------------------------------\n \n This example shows how you can add a background basemap to plots created\n@@ -31,38 +31,31 @@\n df = df.to_crs(epsg=3857)\n \n ###############################################################################\n-# Contextily helper function\n-# ==========================\n-#\n-# We define a small helper function that uses\n-# `contextily <https://github.com/darribas/contextily>`__ to add a map\n-# as background to an existing plot: \n \n import contextily as ctx\n \n-def add_basemap(ax, zoom, url='http://tile.stamen.com/terrain/tileZ/tileX/tileY.png'):\n- xmin, xmax, ymin, ymax = ax.axis()\n- basemap, extent = ctx.bounds2img(xmin, ymin, xmax, ymax, zoom=zoom, url=url)\n- ax.imshow(basemap, extent=extent, interpolation='bilinear')\n- # restore original x/y limits\n- ax.axis((xmin, xmax, ymin, ymax))\n-\n ###############################################################################\n # Add background tiles to plot\n # ============================\n #\n-# Now we can use the above function to easily add a background map to our\n-# plot. The `zoom` keyword is required and let's you specify the detail of the\n-# map tiles (be careful to not specify a too high `zoom` level, as this can\n-# result in a large download):\n+# We can use `add_basemap` function of contextily to easily add a background\n+# map to our plot. :\n+\n+ax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')\n+ctx.add_basemap(ax)\n+\n+###############################################################################\n+# We can control the detail of the map tiles using the optional `zoom` keyword\n+# (be careful to not specify a too high `zoom` level,\n+# as this can result in a large download).:\n \n ax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')\n-add_basemap(ax, zoom=10)\n+ctx.add_basemap(ax, zoom=12)\n \n ###############################################################################\n # By default, contextily uses the Stamen Terrain style. We can specify a\n-# different style using ``ctx.sources``:\n+# different style using ``ctx.providers``:\n \n ax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')\n-add_basemap(ax, zoom=11, url=ctx.sources.ST_TONER_LITE)\n+ctx.add_basemap(ax, url=ctx.providers.Stamen.TonerLite)\n ax.set_axis_off()\n", "issue": "DOC: update the contextily example in the gallery\nThe example about adding a background map to your plot using contextily at https://geopandas.readthedocs.io/en/latest/gallery/plotting_basemap_background.html#sphx-glr-gallery-plotting-basemap-background-py can be updated with the latest contextily (https://github.com/darribas/contextily/releases/tag/v1.0rc2).\r\n\r\nFor example, the `add_basemap` function is now available in contextily itself.\n", "before_files": [{"content": "\"\"\"\nAdding a background map to plots \n--------------------------------\n\nThis example shows how you can add a background basemap to plots created\nwith the geopandas ``.plot()`` method. 
This makes use of the\n`contextily <https://github.com/darribas/contextily>`__ package to retrieve\nweb map tiles from several sources (OpenStreetMap, Stamen).\n\n\"\"\"\n# sphinx_gallery_thumbnail_number = 3\nimport geopandas\n\n###############################################################################\n# Let's use the NYC borough boundary data that is available in geopandas\n# datasets. Plotting this gives the following result:\n\ndf = geopandas.read_file(geopandas.datasets.get_path('nybb'))\nax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')\n\n###############################################################################\n# Convert the data to Web Mercator\n# ================================\n#\n# Web map tiles are typically provided in\n# `Web Mercator <https://en.wikipedia.org/wiki/Web_Mercator>`__\n# (`EPSG 3857 <https://epsg.io/3857>`__), so we need to make sure to convert\n# our data first to the same CRS to combine our polygons and background tiles\n# in the same map:\n\ndf = df.to_crs(epsg=3857)\n\n###############################################################################\n# Contextily helper function\n# ==========================\n#\n# We define a small helper function that uses\n# `contextily <https://github.com/darribas/contextily>`__ to add a map\n# as background to an existing plot: \n\nimport contextily as ctx\n\ndef add_basemap(ax, zoom, url='http://tile.stamen.com/terrain/tileZ/tileX/tileY.png'):\n xmin, xmax, ymin, ymax = ax.axis()\n basemap, extent = ctx.bounds2img(xmin, ymin, xmax, ymax, zoom=zoom, url=url)\n ax.imshow(basemap, extent=extent, interpolation='bilinear')\n # restore original x/y limits\n ax.axis((xmin, xmax, ymin, ymax))\n\n###############################################################################\n# Add background tiles to plot\n# ============================\n#\n# Now we can use the above function to easily add a background map to our\n# plot. The `zoom` keyword is required and let's you specify the detail of the\n# map tiles (be careful to not specify a too high `zoom` level, as this can\n# result in a large download):\n\nax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')\nadd_basemap(ax, zoom=10)\n\n###############################################################################\n# By default, contextily uses the Stamen Terrain style. We can specify a\n# different style using ``ctx.sources``:\n\nax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')\nadd_basemap(ax, zoom=11, url=ctx.sources.ST_TONER_LITE)\nax.set_axis_off()\n", "path": "examples/plotting_basemap_background.py"}], "after_files": [{"content": "\"\"\"\nAdding a background map to plots\n--------------------------------\n\nThis example shows how you can add a background basemap to plots created\nwith the geopandas ``.plot()`` method. This makes use of the\n`contextily <https://github.com/darribas/contextily>`__ package to retrieve\nweb map tiles from several sources (OpenStreetMap, Stamen).\n\n\"\"\"\n# sphinx_gallery_thumbnail_number = 3\nimport geopandas\n\n###############################################################################\n# Let's use the NYC borough boundary data that is available in geopandas\n# datasets. 
Plotting this gives the following result:\n\ndf = geopandas.read_file(geopandas.datasets.get_path('nybb'))\nax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')\n\n###############################################################################\n# Convert the data to Web Mercator\n# ================================\n#\n# Web map tiles are typically provided in\n# `Web Mercator <https://en.wikipedia.org/wiki/Web_Mercator>`__\n# (`EPSG 3857 <https://epsg.io/3857>`__), so we need to make sure to convert\n# our data first to the same CRS to combine our polygons and background tiles\n# in the same map:\n\ndf = df.to_crs(epsg=3857)\n\n###############################################################################\n\nimport contextily as ctx\n\n###############################################################################\n# Add background tiles to plot\n# ============================\n#\n# We can use `add_basemap` function of contextily to easily add a background\n# map to our plot. :\n\nax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')\nctx.add_basemap(ax)\n\n###############################################################################\n# We can control the detail of the map tiles using the optional `zoom` keyword\n# (be careful to not specify a too high `zoom` level,\n# as this can result in a large download).:\n\nax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')\nctx.add_basemap(ax, zoom=12)\n\n###############################################################################\n# By default, contextily uses the Stamen Terrain style. We can specify a\n# different style using ``ctx.providers``:\n\nax = df.plot(figsize=(10, 10), alpha=0.5, edgecolor='k')\nctx.add_basemap(ax, url=ctx.providers.Stamen.TonerLite)\nax.set_axis_off()\n", "path": "examples/plotting_basemap_background.py"}]} | 1,144 | 646 |
gh_patches_debug_36226 | rasdani/github-patches | git_diff | kubeflow__pipelines-2205 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
SDK feature request: support soft constraint for scheduling a component to a preemptible node pool
From a pipeline, it would be useful to have an option to schedule onto a pre-emptible node if such a node pool is available, otherwise fall back to using non-preemptible.
(Under the hood this can apparently be done with node affinity and `preferredDuringSchedulingIgnoredDuringExecution`).
Ning's suggestion: expose config in the `gcp.use_preemptible_nodepool()` call to specify whether the constraint should be hard or soft.
/assign @gaoning777 (as per discussion)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sdk/python/kfp/gcp.py`
Content:
```
1 # Copyright 2018 Google LLC
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from kubernetes.client import V1Toleration
16
17 def use_gcp_secret(secret_name='user-gcp-sa', secret_file_path_in_volume=None, volume_name=None, secret_volume_mount_path='/secret/gcp-credentials'):
18 """An operator that configures the container to use GCP service account.
19
20 The user-gcp-sa secret is created as part of the kubeflow deployment that
21 stores the access token for kubeflow user service account.
22
23 With this service account, the container has a range of GCP APIs to
24 access to. This service account is automatically created as part of the
25 kubeflow deployment.
26
27 For the list of the GCP APIs this service account can access to, check
28 https://github.com/kubeflow/kubeflow/blob/7b0db0d92d65c0746ac52b000cbc290dac7c62b1/deployment/gke/deployment_manager_configs/iam_bindings_template.yaml#L18
29
30 If you want to call the GCP APIs in a different project, grant the kf-user
31 service account access permission.
32 """
33
34 # permitted values for secret_name = ['admin-gcp-sa', 'user-gcp-sa']
35 if secret_file_path_in_volume is None:
36 secret_file_path_in_volume = '/' + secret_name + '.json'
37
38 if volume_name is None:
39 volume_name = 'gcp-credentials-' + secret_name
40
41 else:
42 import warnings
43 warnings.warn('The volume_name parameter is deprecated and will be removed in next release. The volume names are now generated automatically.', DeprecationWarning)
44
45 def _use_gcp_secret(task):
46 from kubernetes import client as k8s_client
47 return (
48 task
49 .add_volume(
50 k8s_client.V1Volume(
51 name=volume_name,
52 secret=k8s_client.V1SecretVolumeSource(
53 secret_name=secret_name,
54 )
55 )
56 )
57 .add_volume_mount(
58 k8s_client.V1VolumeMount(
59 name=volume_name,
60 mount_path=secret_volume_mount_path,
61 )
62 )
63 .add_env_variable(
64 k8s_client.V1EnvVar(
65 name='GOOGLE_APPLICATION_CREDENTIALS',
66 value=secret_volume_mount_path + secret_file_path_in_volume,
67 )
68 )
69 .add_env_variable(
70 k8s_client.V1EnvVar(
71 name='CLOUDSDK_AUTH_CREDENTIAL_FILE_OVERRIDE',
72 value=secret_volume_mount_path + secret_file_path_in_volume,
73 )
74 ) # Set GCloud Credentials by using the env var override.
75 # TODO: Is there a better way for GCloud to pick up the credential?
76 )
77
78 return _use_gcp_secret
79
80 def use_tpu(tpu_cores: int, tpu_resource: str, tf_version: str):
81 """An operator that configures GCP TPU spec in a container op.
82
83 Args:
84 tpu_cores: Required. The number of cores of TPU resource.
85 For example, the value can be '8', '32', '128', etc.
86 Check more details at: https://cloud.google.com/tpu/docs/kubernetes-engine-setup#pod-spec.
87 tpu_resource: Required. The resource name of the TPU resource.
88 For example, the value can be 'v2', 'preemptible-v1', 'v3' or 'preemptible-v3'.
89 Check more details at: https://cloud.google.com/tpu/docs/kubernetes-engine-setup#pod-spec.
90 tf_version: Required. The TensorFlow version that the TPU nodes use.
91 For example, the value can be '1.12', '1.11', '1.9' or '1.8'.
92 Check more details at: https://cloud.google.com/tpu/docs/supported-versions.
93 """
94
95 def _set_tpu_spec(task):
96 task.add_pod_annotation('tf-version.cloud-tpus.google.com', tf_version)
97 task.add_resource_limit('cloud-tpus.google.com/{}'.format(tpu_resource), str(tpu_cores))
98 return task
99
100 return _set_tpu_spec
101
102 def use_preemptible_nodepool(toleration: V1Toleration = V1Toleration(effect='NoSchedule',
103 key='preemptible',
104 operator='Equal',
105 value='true')):
106 """An operator that configures the GKE preemptible in a container op.
107 """
108
109 def _set_preemptible(task):
110 task.add_toleration(toleration)
111 task.add_node_selector_constraint("cloud.google.com/gke-preemptible", "true")
112 return task
113
114 return _set_preemptible
115
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/sdk/python/kfp/gcp.py b/sdk/python/kfp/gcp.py
--- a/sdk/python/kfp/gcp.py
+++ b/sdk/python/kfp/gcp.py
@@ -12,7 +12,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-from kubernetes.client import V1Toleration
+from kubernetes.client import V1Toleration, V1Affinity, V1NodeAffinity, \
+ V1NodeSelector, V1NodeSelectorTerm, V1NodeSelectorRequirement, V1PreferredSchedulingTerm
def use_gcp_secret(secret_name='user-gcp-sa', secret_file_path_in_volume=None, volume_name=None, secret_volume_mount_path='/secret/gcp-credentials'):
"""An operator that configures the container to use GCP service account.
@@ -102,13 +103,31 @@
def use_preemptible_nodepool(toleration: V1Toleration = V1Toleration(effect='NoSchedule',
key='preemptible',
operator='Equal',
- value='true')):
+ value='true'),
+ hard_constraint: bool = False):
"""An operator that configures the GKE preemptible in a container op.
+ Args:
+ toleration (V1Toleration): toleration to pods, default is the preemptible label.
+ hard_constraint (bool): the constraint of scheduling the pods on preemptible
+ nodepools is hard. (Default: False)
"""
def _set_preemptible(task):
task.add_toleration(toleration)
- task.add_node_selector_constraint("cloud.google.com/gke-preemptible", "true")
+ node_selector_term = V1NodeSelectorTerm(match_expressions=[
+ V1NodeSelectorRequirement(key='cloud.google.com/gke-preemptible',
+ operator='In',
+ values=['true'])]
+ )
+ if hard_constraint:
+ node_affinity = V1NodeAffinity(required_during_scheduling_ignored_during_execution=
+ V1NodeSelector(node_selector_terms=[node_selector_term]))
+ else:
+ node_affinity = V1NodeAffinity(preferred_during_scheduling_ignored_during_execution=
+ V1PreferredSchedulingTerm(preference=node_selector_term,
+ weight=50))
+ affinity = V1Affinity(node_affinity=node_affinity)
+ task.add_affinity(affinity=affinity)
return task
return _set_preemptible
| {"golden_diff": "diff --git a/sdk/python/kfp/gcp.py b/sdk/python/kfp/gcp.py\n--- a/sdk/python/kfp/gcp.py\n+++ b/sdk/python/kfp/gcp.py\n@@ -12,7 +12,8 @@\n # See the License for the specific language governing permissions and\n # limitations under the License.\n \n-from kubernetes.client import V1Toleration\n+from kubernetes.client import V1Toleration, V1Affinity, V1NodeAffinity, \\\n+ V1NodeSelector, V1NodeSelectorTerm, V1NodeSelectorRequirement, V1PreferredSchedulingTerm\n \n def use_gcp_secret(secret_name='user-gcp-sa', secret_file_path_in_volume=None, volume_name=None, secret_volume_mount_path='/secret/gcp-credentials'):\n \"\"\"An operator that configures the container to use GCP service account.\n@@ -102,13 +103,31 @@\n def use_preemptible_nodepool(toleration: V1Toleration = V1Toleration(effect='NoSchedule',\n key='preemptible',\n operator='Equal',\n- value='true')):\n+ value='true'),\n+ hard_constraint: bool = False):\n \"\"\"An operator that configures the GKE preemptible in a container op.\n+ Args:\n+ toleration (V1Toleration): toleration to pods, default is the preemptible label.\n+ hard_constraint (bool): the constraint of scheduling the pods on preemptible\n+ nodepools is hard. (Default: False)\n \"\"\"\n \n def _set_preemptible(task):\n task.add_toleration(toleration)\n- task.add_node_selector_constraint(\"cloud.google.com/gke-preemptible\", \"true\")\n+ node_selector_term = V1NodeSelectorTerm(match_expressions=[\n+ V1NodeSelectorRequirement(key='cloud.google.com/gke-preemptible',\n+ operator='In',\n+ values=['true'])]\n+ )\n+ if hard_constraint:\n+ node_affinity = V1NodeAffinity(required_during_scheduling_ignored_during_execution=\n+ V1NodeSelector(node_selector_terms=[node_selector_term]))\n+ else:\n+ node_affinity = V1NodeAffinity(preferred_during_scheduling_ignored_during_execution=\n+ V1PreferredSchedulingTerm(preference=node_selector_term,\n+ weight=50))\n+ affinity = V1Affinity(node_affinity=node_affinity)\n+ task.add_affinity(affinity=affinity)\n return task\n \n return _set_preemptible\n", "issue": "SDK feature request: support soft constraint for scheduling a component to a preemptible node pool\nFrom a pipeline, it would be useful to have an option to schedule onto a pre-emptible node if such a node pool is available, otherwise fall back to using non-preemptible.\r\n(Under the hood this can apparently be done with node affinity and `preferredDuringSchedulingIgnoredDuringExecution`).\r\n\r\nNing's suggestion: expose config in the `gcp.use_preemptible_nodepool()` call to specify whether the constraint should be hard or soft.\r\n\r\n/assign @gaoning777 (as per discussion)\n", "before_files": [{"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom kubernetes.client import V1Toleration\n\ndef use_gcp_secret(secret_name='user-gcp-sa', secret_file_path_in_volume=None, volume_name=None, secret_volume_mount_path='/secret/gcp-credentials'):\n \"\"\"An operator that configures the container to use GCP service account.\n\n The 
user-gcp-sa secret is created as part of the kubeflow deployment that\n stores the access token for kubeflow user service account.\n\n With this service account, the container has a range of GCP APIs to\n access to. This service account is automatically created as part of the\n kubeflow deployment.\n\n For the list of the GCP APIs this service account can access to, check\n https://github.com/kubeflow/kubeflow/blob/7b0db0d92d65c0746ac52b000cbc290dac7c62b1/deployment/gke/deployment_manager_configs/iam_bindings_template.yaml#L18\n\n If you want to call the GCP APIs in a different project, grant the kf-user\n service account access permission.\n \"\"\"\n\n # permitted values for secret_name = ['admin-gcp-sa', 'user-gcp-sa']\n if secret_file_path_in_volume is None:\n secret_file_path_in_volume = '/' + secret_name + '.json'\n\n if volume_name is None:\n volume_name = 'gcp-credentials-' + secret_name\n\n else:\n import warnings\n warnings.warn('The volume_name parameter is deprecated and will be removed in next release. The volume names are now generated automatically.', DeprecationWarning)\n \n def _use_gcp_secret(task):\n from kubernetes import client as k8s_client\n return (\n task\n .add_volume(\n k8s_client.V1Volume(\n name=volume_name,\n secret=k8s_client.V1SecretVolumeSource(\n secret_name=secret_name,\n )\n )\n )\n .add_volume_mount(\n k8s_client.V1VolumeMount(\n name=volume_name,\n mount_path=secret_volume_mount_path,\n )\n )\n .add_env_variable(\n k8s_client.V1EnvVar(\n name='GOOGLE_APPLICATION_CREDENTIALS',\n value=secret_volume_mount_path + secret_file_path_in_volume,\n )\n )\n .add_env_variable(\n k8s_client.V1EnvVar(\n name='CLOUDSDK_AUTH_CREDENTIAL_FILE_OVERRIDE',\n value=secret_volume_mount_path + secret_file_path_in_volume,\n )\n ) # Set GCloud Credentials by using the env var override.\n # TODO: Is there a better way for GCloud to pick up the credential?\n )\n \n return _use_gcp_secret\n\ndef use_tpu(tpu_cores: int, tpu_resource: str, tf_version: str):\n \"\"\"An operator that configures GCP TPU spec in a container op.\n\n Args:\n tpu_cores: Required. The number of cores of TPU resource. \n For example, the value can be '8', '32', '128', etc.\n Check more details at: https://cloud.google.com/tpu/docs/kubernetes-engine-setup#pod-spec.\n tpu_resource: Required. The resource name of the TPU resource. \n For example, the value can be 'v2', 'preemptible-v1', 'v3' or 'preemptible-v3'.\n Check more details at: https://cloud.google.com/tpu/docs/kubernetes-engine-setup#pod-spec.\n tf_version: Required. 
The TensorFlow version that the TPU nodes use.\n For example, the value can be '1.12', '1.11', '1.9' or '1.8'.\n Check more details at: https://cloud.google.com/tpu/docs/supported-versions.\n \"\"\"\n\n def _set_tpu_spec(task):\n task.add_pod_annotation('tf-version.cloud-tpus.google.com', tf_version)\n task.add_resource_limit('cloud-tpus.google.com/{}'.format(tpu_resource), str(tpu_cores))\n return task\n\n return _set_tpu_spec\n\ndef use_preemptible_nodepool(toleration: V1Toleration = V1Toleration(effect='NoSchedule',\n key='preemptible',\n operator='Equal',\n value='true')):\n \"\"\"An operator that configures the GKE preemptible in a container op.\n \"\"\"\n\n def _set_preemptible(task):\n task.add_toleration(toleration)\n task.add_node_selector_constraint(\"cloud.google.com/gke-preemptible\", \"true\")\n return task\n\n return _set_preemptible\n", "path": "sdk/python/kfp/gcp.py"}], "after_files": [{"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom kubernetes.client import V1Toleration, V1Affinity, V1NodeAffinity, \\\n V1NodeSelector, V1NodeSelectorTerm, V1NodeSelectorRequirement, V1PreferredSchedulingTerm\n\ndef use_gcp_secret(secret_name='user-gcp-sa', secret_file_path_in_volume=None, volume_name=None, secret_volume_mount_path='/secret/gcp-credentials'):\n \"\"\"An operator that configures the container to use GCP service account.\n\n The user-gcp-sa secret is created as part of the kubeflow deployment that\n stores the access token for kubeflow user service account.\n\n With this service account, the container has a range of GCP APIs to\n access to. This service account is automatically created as part of the\n kubeflow deployment.\n\n For the list of the GCP APIs this service account can access to, check\n https://github.com/kubeflow/kubeflow/blob/7b0db0d92d65c0746ac52b000cbc290dac7c62b1/deployment/gke/deployment_manager_configs/iam_bindings_template.yaml#L18\n\n If you want to call the GCP APIs in a different project, grant the kf-user\n service account access permission.\n \"\"\"\n\n # permitted values for secret_name = ['admin-gcp-sa', 'user-gcp-sa']\n if secret_file_path_in_volume is None:\n secret_file_path_in_volume = '/' + secret_name + '.json'\n\n if volume_name is None:\n volume_name = 'gcp-credentials-' + secret_name\n\n else:\n import warnings\n warnings.warn('The volume_name parameter is deprecated and will be removed in next release. 
The volume names are now generated automatically.', DeprecationWarning)\n \n def _use_gcp_secret(task):\n from kubernetes import client as k8s_client\n return (\n task\n .add_volume(\n k8s_client.V1Volume(\n name=volume_name,\n secret=k8s_client.V1SecretVolumeSource(\n secret_name=secret_name,\n )\n )\n )\n .add_volume_mount(\n k8s_client.V1VolumeMount(\n name=volume_name,\n mount_path=secret_volume_mount_path,\n )\n )\n .add_env_variable(\n k8s_client.V1EnvVar(\n name='GOOGLE_APPLICATION_CREDENTIALS',\n value=secret_volume_mount_path + secret_file_path_in_volume,\n )\n )\n .add_env_variable(\n k8s_client.V1EnvVar(\n name='CLOUDSDK_AUTH_CREDENTIAL_FILE_OVERRIDE',\n value=secret_volume_mount_path + secret_file_path_in_volume,\n )\n ) # Set GCloud Credentials by using the env var override.\n # TODO: Is there a better way for GCloud to pick up the credential?\n )\n \n return _use_gcp_secret\n\ndef use_tpu(tpu_cores: int, tpu_resource: str, tf_version: str):\n \"\"\"An operator that configures GCP TPU spec in a container op.\n\n Args:\n tpu_cores: Required. The number of cores of TPU resource. \n For example, the value can be '8', '32', '128', etc.\n Check more details at: https://cloud.google.com/tpu/docs/kubernetes-engine-setup#pod-spec.\n tpu_resource: Required. The resource name of the TPU resource. \n For example, the value can be 'v2', 'preemptible-v1', 'v3' or 'preemptible-v3'.\n Check more details at: https://cloud.google.com/tpu/docs/kubernetes-engine-setup#pod-spec.\n tf_version: Required. The TensorFlow version that the TPU nodes use.\n For example, the value can be '1.12', '1.11', '1.9' or '1.8'.\n Check more details at: https://cloud.google.com/tpu/docs/supported-versions.\n \"\"\"\n\n def _set_tpu_spec(task):\n task.add_pod_annotation('tf-version.cloud-tpus.google.com', tf_version)\n task.add_resource_limit('cloud-tpus.google.com/{}'.format(tpu_resource), str(tpu_cores))\n return task\n\n return _set_tpu_spec\n\ndef use_preemptible_nodepool(toleration: V1Toleration = V1Toleration(effect='NoSchedule',\n key='preemptible',\n operator='Equal',\n value='true'),\n hard_constraint: bool = False):\n \"\"\"An operator that configures the GKE preemptible in a container op.\n Args:\n toleration (V1Toleration): toleration to pods, default is the preemptible label.\n hard_constraint (bool): the constraint of scheduling the pods on preemptible\n nodepools is hard. (Default: False)\n \"\"\"\n\n def _set_preemptible(task):\n task.add_toleration(toleration)\n node_selector_term = V1NodeSelectorTerm(match_expressions=[\n V1NodeSelectorRequirement(key='cloud.google.com/gke-preemptible',\n operator='In',\n values=['true'])]\n )\n if hard_constraint:\n node_affinity = V1NodeAffinity(required_during_scheduling_ignored_during_execution=\n V1NodeSelector(node_selector_terms=[node_selector_term]))\n else:\n node_affinity = V1NodeAffinity(preferred_during_scheduling_ignored_during_execution=\n V1PreferredSchedulingTerm(preference=node_selector_term,\n weight=50))\n affinity = V1Affinity(node_affinity=node_affinity)\n task.add_affinity(affinity=affinity)\n return task\n\n return _set_preemptible\n", "path": "sdk/python/kfp/gcp.py"}]} | 1,778 | 552 |
gh_patches_debug_14806 | rasdani/github-patches | git_diff | koxudaxi__datamodel-code-generator-1826 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Importing code generator changes behavior of pyyaml
When importing ``datamodel_code_generator``, the behaviour of ``pyyaml`` changes: after the import, datetime objects are suddenly deserialized from YAML as `str`, whereas before they were deserialized as `datetime` objects.
## Steps to reproduce:
### create clean venv
```bash
python3 -m venv blub
blub/bin/pip install datamodel-code-generator pyyaml
```
### Run this reproducer with the interpreter from the venv
```python
import yaml
import tempfile
import datetime
tmp=tempfile.NamedTemporaryFile(mode='w')
def write_read():
tmp=tempfile.NamedTemporaryFile(mode='w')
yaml.safe_dump({"datetime": datetime.datetime.now()}, tmp)
dumped_reloaded = yaml.safe_load(open(tmp.name,"r"))
print(dumped_reloaded)
print("before importing datamodel-code-generator:")
write_read()
import datamodel_code_generator
print("\nafter importing datamodel-code-generator:")
write_read()
```
**Actual output:** (time string + object may change depending on run time)
```
before importing datamodel-code-generator:
{'datetime': datetime.datetime(2024, 1, 30, 9, 51, 19, 612368)}
after importing datamodel-code-generator:
{'datetime': '2024-01-30 09:51:20.019991'}
```
**Expected output:** (time string + object may change depending on run time):
```
before importing datamodel-code-generator:
{'datetime': datetime.datetime(2024, 1, 30, 9, 51, 19, 612368)}
after importing datamodel-code-generator:
{'datetime': datetime.datetime(2024, 1, 30, 9, 51, 20, 019991)}
```
The reason seems to be that the SafeLoader of ``yaml`` is modified globally by ``datamodel-code-generator``:
https://github.com/koxudaxi/datamodel-code-generator/blob/acc6bf604b13626f22fc123d72ae08ff0a114155/datamodel_code_generator/util.py#L72
--- END ISSUE ---
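A minimal sketch of the non-invasive alternative to the global mutation described above: register the timestamp-as-string constructor on a private loader subclass and leave `SafeLoader` itself alone. The subclass name is invented here, and this is only an illustration of the pattern, not the fix the project actually adopted.

```python
import yaml

try:
    from yaml import CSafeLoader as SafeLoader
except ImportError:
    from yaml import SafeLoader


class _StrTimestampLoader(SafeLoader):
    """Private loader that reads YAML timestamps as plain strings."""


# add_constructor() copies the constructor table onto the subclass before
# modifying it, so the stock SafeLoader (and yaml.safe_load) stays untouched.
_StrTimestampLoader.add_constructor(
    'tag:yaml.org,2002:timestamp',
    SafeLoader.yaml_constructors['tag:yaml.org,2002:str'],
)

doc = "when: 2024-01-30 09:51:20"
assert isinstance(yaml.load(doc, Loader=_StrTimestampLoader)['when'], str)
assert not isinstance(yaml.safe_load(doc)['when'], str)  # still a datetime
```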
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `datamodel_code_generator/util.py`
Content:
```
1 from __future__ import annotations
2
3 from pathlib import Path
4 from typing import TYPE_CHECKING, Any, Callable, Dict, TypeVar
5
6 import pydantic
7 from packaging import version
8 from pydantic import BaseModel as _BaseModel
9
10 PYDANTIC_VERSION = version.parse(
11 pydantic.VERSION if isinstance(pydantic.VERSION, str) else str(pydantic.VERSION)
12 )
13
14 PYDANTIC_V2: bool = PYDANTIC_VERSION >= version.parse('2.0b3')
15
16 if TYPE_CHECKING:
17 cached_property = property
18 from yaml import SafeLoader
19
20 Protocol = object
21 runtime_checkable: Callable[..., Any]
22
23 from typing_extensions import Literal
24
25 def load_toml(path: Path) -> Dict[str, Any]:
26 ...
27
28 else:
29 try:
30 from typing import Protocol
31 except ImportError:
32 from typing_extensions import Protocol # noqa
33 try:
34 from typing import runtime_checkable
35 except ImportError:
36 from typing_extensions import runtime_checkable # noqa
37 try:
38 from yaml import CSafeLoader as SafeLoader
39 except ImportError: # pragma: no cover
40 from yaml import SafeLoader
41
42 try:
43 from functools import cached_property
44 except ImportError:
45 _NOT_FOUND = object()
46
47 class cached_property:
48 def __init__(self, func: Callable) -> None:
49 self.func: Callable = func
50 self.__doc__: Any = func.__doc__
51
52 def __get__(self, instance: Any, owner: Any = None) -> Any:
53 value = instance.__dict__.get(self.func.__name__, _NOT_FOUND)
54 if value is _NOT_FOUND: # pragma: no cover
55 value = instance.__dict__[self.func.__name__] = self.func(instance)
56 return value
57
58 try:
59 import tomllib
60
61 def load_toml(path: Path) -> Dict[str, Any]:
62 with path.open('rb') as f:
63 return tomllib.load(f)
64
65 except ImportError:
66 import toml
67
68 def load_toml(path: Path) -> Dict[str, Any]:
69 return toml.load(path)
70
71
72 SafeLoader.yaml_constructors[
73 'tag:yaml.org,2002:timestamp'
74 ] = SafeLoader.yaml_constructors['tag:yaml.org,2002:str']
75
76 Model = TypeVar('Model', bound=_BaseModel)
77
78
79 def model_validator(
80 mode: Literal['before', 'after'] = 'after',
81 ) -> Callable[[Callable[[Model, Any], Any]], Callable[[Model, Any], Any]]:
82 def inner(method: Callable[[Model, Any], Any]) -> Callable[[Model, Any], Any]:
83 if PYDANTIC_V2:
84 from pydantic import model_validator as model_validator_v2
85
86 return model_validator_v2(mode=mode)(method) # type: ignore
87 else:
88 from pydantic import root_validator
89
90 return root_validator(method, pre=mode == 'before') # type: ignore
91
92 return inner
93
94
95 def field_validator(
96 field_name: str,
97 *fields: str,
98 mode: Literal['before', 'after'] = 'after',
99 ) -> Callable[[Any], Callable[[Model, Any], Any]]:
100 def inner(method: Callable[[Model, Any], Any]) -> Callable[[Model, Any], Any]:
101 if PYDANTIC_V2:
102 from pydantic import field_validator as field_validator_v2
103
104 return field_validator_v2(field_name, *fields, mode=mode)(method) # type: ignore
105 else:
106 from pydantic import validator
107
108 return validator(field_name, *fields, pre=mode == 'before')(method) # type: ignore
109
110 return inner
111
112
113 if PYDANTIC_V2:
114 from pydantic import ConfigDict as ConfigDict
115 else:
116 ConfigDict = dict # type: ignore
117
118
119 class BaseModel(_BaseModel):
120 if PYDANTIC_V2:
121 model_config = ConfigDict(strict=False)
122
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/datamodel_code_generator/util.py b/datamodel_code_generator/util.py
--- a/datamodel_code_generator/util.py
+++ b/datamodel_code_generator/util.py
@@ -1,5 +1,6 @@
from __future__ import annotations
+import copy
from pathlib import Path
from typing import TYPE_CHECKING, Any, Callable, Dict, TypeVar
@@ -69,9 +70,13 @@
return toml.load(path)
-SafeLoader.yaml_constructors[
- 'tag:yaml.org,2002:timestamp'
-] = SafeLoader.yaml_constructors['tag:yaml.org,2002:str']
+SafeLoaderTemp = copy.deepcopy(SafeLoader)
+SafeLoaderTemp.yaml_constructors = copy.deepcopy(SafeLoader.yaml_constructors)
+SafeLoaderTemp.add_constructor(
+ 'tag:yaml.org,2002:timestamp',
+ SafeLoaderTemp.yaml_constructors['tag:yaml.org,2002:str'],
+)
+SafeLoader = SafeLoaderTemp
Model = TypeVar('Model', bound=_BaseModel)
| {"golden_diff": "diff --git a/datamodel_code_generator/util.py b/datamodel_code_generator/util.py\n--- a/datamodel_code_generator/util.py\n+++ b/datamodel_code_generator/util.py\n@@ -1,5 +1,6 @@\n from __future__ import annotations\n \n+import copy\n from pathlib import Path\n from typing import TYPE_CHECKING, Any, Callable, Dict, TypeVar\n \n@@ -69,9 +70,13 @@\n return toml.load(path)\n \n \n-SafeLoader.yaml_constructors[\n- 'tag:yaml.org,2002:timestamp'\n-] = SafeLoader.yaml_constructors['tag:yaml.org,2002:str']\n+SafeLoaderTemp = copy.deepcopy(SafeLoader)\n+SafeLoaderTemp.yaml_constructors = copy.deepcopy(SafeLoader.yaml_constructors)\n+SafeLoaderTemp.add_constructor(\n+ 'tag:yaml.org,2002:timestamp',\n+ SafeLoaderTemp.yaml_constructors['tag:yaml.org,2002:str'],\n+)\n+SafeLoader = SafeLoaderTemp\n \n Model = TypeVar('Model', bound=_BaseModel)\n", "issue": "Importing code generator changes behavior of pyyaml\nWhen importing ``datamodel_code_generator``, the behaviour of ``pyyaml`` is changed after the import, datetime objects suddenly are deserialized from yaml as `str` whereas before they were deserialized as `datetime` objects.\r\n\r\n## Steps to reproduce:\r\n### create clean venv\r\n```bash\r\npython3 -m venv blub\r\nblub/bin/pip install datamodel-code-generator pyyaml\r\n```\r\n\r\n### Run this reproducer with the interpreter from the venv\r\n```python\r\nimport yaml\r\nimport tempfile\r\nimport datetime\r\n\r\ntmp=tempfile.NamedTemporaryFile(mode='w')\r\n\r\ndef write_read():\r\n tmp=tempfile.NamedTemporaryFile(mode='w')\r\n yaml.safe_dump({\"datetime\": datetime.datetime.now()}, tmp)\r\n dumped_reloaded = yaml.safe_load(open(tmp.name,\"r\"))\r\n print(dumped_reloaded)\r\n\r\nprint(\"before importing datamodel-code-generator:\")\r\nwrite_read()\r\n\r\nimport datamodel_code_generator\r\nprint(\"\\nafter importing datamodel-code-generator:\")\r\nwrite_read()\r\n```\r\n\r\n**Actual output:** (time string + object may change depending on run time)\r\n```\r\nbefore importing datamodel-code-generator:\r\n{'datetime': datetime.datetime(2024, 1, 30, 9, 51, 19, 612368)}\r\n\r\nafter importing datamodel-code-generator:\r\n{'datetime': '2024-01-30 09:51:20.019991'}\r\n```\r\n\r\n**Expected output:** (time string + object may change depending on run time):\r\n```\r\nbefore importing datamodel-code-generator:\r\n{'datetime': datetime.datetime(2024, 1, 30, 9, 51, 19, 612368)}\r\n\r\nafter importing datamodel-code-generator:\r\n{'datetime': datetime.datetime(2024, 1, 30, 9, 51, 20, 019991)}\r\n```\r\n\r\nThe reason seems to be that the SafeLoader of ``yaml`` is modified globally by ``datamodel-code-generator``\r\nhttps://github.com/koxudaxi/datamodel-code-generator/blob/acc6bf604b13626f22fc123d72ae08ff0a114155/datamodel_code_generator/util.py#L72\r\n\r\n\n", "before_files": [{"content": "from __future__ import annotations\n\nfrom pathlib import Path\nfrom typing import TYPE_CHECKING, Any, Callable, Dict, TypeVar\n\nimport pydantic\nfrom packaging import version\nfrom pydantic import BaseModel as _BaseModel\n\nPYDANTIC_VERSION = version.parse(\n pydantic.VERSION if isinstance(pydantic.VERSION, str) else str(pydantic.VERSION)\n)\n\nPYDANTIC_V2: bool = PYDANTIC_VERSION >= version.parse('2.0b3')\n\nif TYPE_CHECKING:\n cached_property = property\n from yaml import SafeLoader\n\n Protocol = object\n runtime_checkable: Callable[..., Any]\n\n from typing_extensions import Literal\n\n def load_toml(path: Path) -> Dict[str, Any]:\n ...\n\nelse:\n try:\n from typing import Protocol\n except ImportError:\n 
from typing_extensions import Protocol # noqa\n try:\n from typing import runtime_checkable\n except ImportError:\n from typing_extensions import runtime_checkable # noqa\n try:\n from yaml import CSafeLoader as SafeLoader\n except ImportError: # pragma: no cover\n from yaml import SafeLoader\n\n try:\n from functools import cached_property\n except ImportError:\n _NOT_FOUND = object()\n\n class cached_property:\n def __init__(self, func: Callable) -> None:\n self.func: Callable = func\n self.__doc__: Any = func.__doc__\n\n def __get__(self, instance: Any, owner: Any = None) -> Any:\n value = instance.__dict__.get(self.func.__name__, _NOT_FOUND)\n if value is _NOT_FOUND: # pragma: no cover\n value = instance.__dict__[self.func.__name__] = self.func(instance)\n return value\n\n try:\n import tomllib\n\n def load_toml(path: Path) -> Dict[str, Any]:\n with path.open('rb') as f:\n return tomllib.load(f)\n\n except ImportError:\n import toml\n\n def load_toml(path: Path) -> Dict[str, Any]:\n return toml.load(path)\n\n\nSafeLoader.yaml_constructors[\n 'tag:yaml.org,2002:timestamp'\n] = SafeLoader.yaml_constructors['tag:yaml.org,2002:str']\n\nModel = TypeVar('Model', bound=_BaseModel)\n\n\ndef model_validator(\n mode: Literal['before', 'after'] = 'after',\n) -> Callable[[Callable[[Model, Any], Any]], Callable[[Model, Any], Any]]:\n def inner(method: Callable[[Model, Any], Any]) -> Callable[[Model, Any], Any]:\n if PYDANTIC_V2:\n from pydantic import model_validator as model_validator_v2\n\n return model_validator_v2(mode=mode)(method) # type: ignore\n else:\n from pydantic import root_validator\n\n return root_validator(method, pre=mode == 'before') # type: ignore\n\n return inner\n\n\ndef field_validator(\n field_name: str,\n *fields: str,\n mode: Literal['before', 'after'] = 'after',\n) -> Callable[[Any], Callable[[Model, Any], Any]]:\n def inner(method: Callable[[Model, Any], Any]) -> Callable[[Model, Any], Any]:\n if PYDANTIC_V2:\n from pydantic import field_validator as field_validator_v2\n\n return field_validator_v2(field_name, *fields, mode=mode)(method) # type: ignore\n else:\n from pydantic import validator\n\n return validator(field_name, *fields, pre=mode == 'before')(method) # type: ignore\n\n return inner\n\n\nif PYDANTIC_V2:\n from pydantic import ConfigDict as ConfigDict\nelse:\n ConfigDict = dict # type: ignore\n\n\nclass BaseModel(_BaseModel):\n if PYDANTIC_V2:\n model_config = ConfigDict(strict=False)\n", "path": "datamodel_code_generator/util.py"}], "after_files": [{"content": "from __future__ import annotations\n\nimport copy\nfrom pathlib import Path\nfrom typing import TYPE_CHECKING, Any, Callable, Dict, TypeVar\n\nimport pydantic\nfrom packaging import version\nfrom pydantic import BaseModel as _BaseModel\n\nPYDANTIC_VERSION = version.parse(\n pydantic.VERSION if isinstance(pydantic.VERSION, str) else str(pydantic.VERSION)\n)\n\nPYDANTIC_V2: bool = PYDANTIC_VERSION >= version.parse('2.0b3')\n\nif TYPE_CHECKING:\n cached_property = property\n from yaml import SafeLoader\n\n Protocol = object\n runtime_checkable: Callable[..., Any]\n\n from typing_extensions import Literal\n\n def load_toml(path: Path) -> Dict[str, Any]:\n ...\n\nelse:\n try:\n from typing import Protocol\n except ImportError:\n from typing_extensions import Protocol # noqa\n try:\n from typing import runtime_checkable\n except ImportError:\n from typing_extensions import runtime_checkable # noqa\n try:\n from yaml import CSafeLoader as SafeLoader\n except ImportError: # pragma: no cover\n from yaml import 
SafeLoader\n\n try:\n from functools import cached_property\n except ImportError:\n _NOT_FOUND = object()\n\n class cached_property:\n def __init__(self, func: Callable) -> None:\n self.func: Callable = func\n self.__doc__: Any = func.__doc__\n\n def __get__(self, instance: Any, owner: Any = None) -> Any:\n value = instance.__dict__.get(self.func.__name__, _NOT_FOUND)\n if value is _NOT_FOUND: # pragma: no cover\n value = instance.__dict__[self.func.__name__] = self.func(instance)\n return value\n\n try:\n import tomllib\n\n def load_toml(path: Path) -> Dict[str, Any]:\n with path.open('rb') as f:\n return tomllib.load(f)\n\n except ImportError:\n import toml\n\n def load_toml(path: Path) -> Dict[str, Any]:\n return toml.load(path)\n\n\nSafeLoaderTemp = copy.deepcopy(SafeLoader)\nSafeLoaderTemp.yaml_constructors = copy.deepcopy(SafeLoader.yaml_constructors)\nSafeLoaderTemp.add_constructor(\n 'tag:yaml.org,2002:timestamp',\n SafeLoaderTemp.yaml_constructors['tag:yaml.org,2002:str'],\n)\nSafeLoader = SafeLoaderTemp\n\nModel = TypeVar('Model', bound=_BaseModel)\n\n\ndef model_validator(\n mode: Literal['before', 'after'] = 'after',\n) -> Callable[[Callable[[Model, Any], Any]], Callable[[Model, Any], Any]]:\n def inner(method: Callable[[Model, Any], Any]) -> Callable[[Model, Any], Any]:\n if PYDANTIC_V2:\n from pydantic import model_validator as model_validator_v2\n\n return model_validator_v2(mode=mode)(method) # type: ignore\n else:\n from pydantic import root_validator\n\n return root_validator(method, pre=mode == 'before') # type: ignore\n\n return inner\n\n\ndef field_validator(\n field_name: str,\n *fields: str,\n mode: Literal['before', 'after'] = 'after',\n) -> Callable[[Any], Callable[[Model, Any], Any]]:\n def inner(method: Callable[[Model, Any], Any]) -> Callable[[Model, Any], Any]:\n if PYDANTIC_V2:\n from pydantic import field_validator as field_validator_v2\n\n return field_validator_v2(field_name, *fields, mode=mode)(method) # type: ignore\n else:\n from pydantic import validator\n\n return validator(field_name, *fields, pre=mode == 'before')(method) # type: ignore\n\n return inner\n\n\nif PYDANTIC_V2:\n from pydantic import ConfigDict as ConfigDict\nelse:\n ConfigDict = dict # type: ignore\n\n\nclass BaseModel(_BaseModel):\n if PYDANTIC_V2:\n model_config = ConfigDict(strict=False)\n", "path": "datamodel_code_generator/util.py"}]} | 1,910 | 233 |
gh_patches_debug_17208 | rasdani/github-patches | git_diff | hpcaitech__ColossalAI-2704 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[tensor] fix some unittests
[tensor] fix some unittests
[tensor] fix some unittests
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `applications/ChatGPT/setup.py`
Content:
```
1 from setuptools import find_packages, setup
2
3
4 def fetch_requirements(path):
5 with open(path, 'r') as fd:
6 return [r.strip() for r in fd.readlines()]
7
8
9 def fetch_readme():
10 with open('README.md', encoding='utf-8') as f:
11 return f.read()
12
13
14 def fetch_version():
15 with open('version.txt', 'r') as f:
16 return f.read().strip()
17
18
19 setup(
20 name='chatgpt',
21 version=fetch_version(),
22 packages=find_packages(exclude=(
23 'tests',
24 'benchmarks',
25 'requirements',
26 '*.egg-info',
27 )),
28 description='A RLFH implementation (ChatGPT) powered by ColossalAI',
29 long_description=fetch_readme(),
30 long_description_content_type='text/markdown',
31 license='Apache Software License 2.0',
32 url='https://github.com/hpcaitech/ChatGPT',
33 install_requires=fetch_requirements('requirements/requirements.txt'),
34 python_requires='>=3.6',
35 classifiers=[
36 'Programming Language :: Python :: 3',
37 'License :: OSI Approved :: Apache Software License',
38 'Environment :: GPU :: NVIDIA CUDA',
39 'Topic :: Scientific/Engineering :: Artificial Intelligence',
40 'Topic :: System :: Distributed Computing',
41 ],
42 )
43
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/applications/ChatGPT/setup.py b/applications/ChatGPT/setup.py
--- a/applications/ChatGPT/setup.py
+++ b/applications/ChatGPT/setup.py
@@ -22,7 +22,6 @@
packages=find_packages(exclude=(
'tests',
'benchmarks',
- 'requirements',
'*.egg-info',
)),
description='A RLFH implementation (ChatGPT) powered by ColossalAI',
@@ -30,7 +29,7 @@
long_description_content_type='text/markdown',
license='Apache Software License 2.0',
url='https://github.com/hpcaitech/ChatGPT',
- install_requires=fetch_requirements('requirements/requirements.txt'),
+ install_requires=fetch_requirements('requirements.txt'),
python_requires='>=3.6',
classifiers=[
'Programming Language :: Python :: 3',
| {"golden_diff": "diff --git a/applications/ChatGPT/setup.py b/applications/ChatGPT/setup.py\n--- a/applications/ChatGPT/setup.py\n+++ b/applications/ChatGPT/setup.py\n@@ -22,7 +22,6 @@\n packages=find_packages(exclude=(\n 'tests',\n 'benchmarks',\n- 'requirements',\n '*.egg-info',\n )),\n description='A RLFH implementation (ChatGPT) powered by ColossalAI',\n@@ -30,7 +29,7 @@\n long_description_content_type='text/markdown',\n license='Apache Software License 2.0',\n url='https://github.com/hpcaitech/ChatGPT',\n- install_requires=fetch_requirements('requirements/requirements.txt'),\n+ install_requires=fetch_requirements('requirements.txt'),\n python_requires='>=3.6',\n classifiers=[\n 'Programming Language :: Python :: 3',\n", "issue": "[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n", "before_files": [{"content": "from setuptools import find_packages, setup\n\n\ndef fetch_requirements(path):\n with open(path, 'r') as fd:\n return [r.strip() for r in fd.readlines()]\n\n\ndef fetch_readme():\n with open('README.md', encoding='utf-8') as f:\n return f.read()\n\n\ndef fetch_version():\n with open('version.txt', 'r') as f:\n return f.read().strip()\n\n\nsetup(\n name='chatgpt',\n version=fetch_version(),\n packages=find_packages(exclude=(\n 'tests',\n 'benchmarks',\n 'requirements',\n '*.egg-info',\n )),\n description='A RLFH implementation (ChatGPT) powered by ColossalAI',\n long_description=fetch_readme(),\n long_description_content_type='text/markdown',\n license='Apache Software License 2.0',\n url='https://github.com/hpcaitech/ChatGPT',\n install_requires=fetch_requirements('requirements/requirements.txt'),\n python_requires='>=3.6',\n classifiers=[\n 'Programming Language :: Python :: 3',\n 'License :: OSI Approved :: Apache Software License',\n 'Environment :: GPU :: NVIDIA CUDA',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Topic :: System :: Distributed Computing',\n ],\n)\n", "path": "applications/ChatGPT/setup.py"}], "after_files": [{"content": "from setuptools import find_packages, setup\n\n\ndef fetch_requirements(path):\n with open(path, 'r') as fd:\n return [r.strip() for r in fd.readlines()]\n\n\ndef fetch_readme():\n with open('README.md', encoding='utf-8') as f:\n return f.read()\n\n\ndef fetch_version():\n with open('version.txt', 'r') as f:\n return f.read().strip()\n\n\nsetup(\n name='chatgpt',\n version=fetch_version(),\n packages=find_packages(exclude=(\n 'tests',\n 'benchmarks',\n '*.egg-info',\n )),\n description='A RLFH implementation (ChatGPT) powered by ColossalAI',\n long_description=fetch_readme(),\n long_description_content_type='text/markdown',\n license='Apache Software License 2.0',\n url='https://github.com/hpcaitech/ChatGPT',\n install_requires=fetch_requirements('requirements.txt'),\n python_requires='>=3.6',\n classifiers=[\n 'Programming Language :: Python :: 3',\n 'License :: OSI Approved :: Apache Software License',\n 'Environment :: GPU :: NVIDIA CUDA',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Topic :: System :: Distributed Computing',\n ],\n)\n", "path": "applications/ChatGPT/setup.py"}]} | 639 | 198 |
gh_patches_debug_49256 | rasdani/github-patches | git_diff | pyro-ppl__pyro-388 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Test batch_size parameter is correctly applied for all distributions
With #323, we now support the `batch_size` parameter for both the class and functional forms of the distributions. Let us add a simple test to validate that this works as expected.
--- END ISSUE ---
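A rough sketch of the kind of check being asked for. The import path `pyro.distributions`, the constructor signature, and the old-style `Variable` API are assumptions inferred from the file shown below, not a tested snippet from the project:

```python
import torch
from torch.autograd import Variable
import pyro.distributions as dist  # assumed import path

mu = Variable(torch.zeros(3))
sigma = Variable(torch.ones(3))
d = dist.LogNormal(mu, sigma, batch_size=5)

s = d.sample()
assert s.size() == (5, 3)          # batch_size must show up in the sample shape
assert float(s.data.var()) > 0.0   # batch entries should carry independent noise
```

With the original `sample()` drawing a single scalar `eps`, the second assertion would fail, which is exactly the regression such a test is meant to catch.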
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pyro/distributions/log_normal.py`
Content:
```
1 import numpy as np
2 import torch
3 from torch.autograd import Variable
4
5 from pyro.distributions.distribution import Distribution
6
7
8 class LogNormal(Distribution):
9 """
10 :param mu: mean *(vector)*
11 :param sigma: standard deviations *(vector (0, Infinity))*
12
13 A distribution over probability vectors obtained by exp-transforming a random
14 variable drawn from ``Normal({mu: mu, sigma: sigma})``.
15 """
16 reparameterized = True
17
18 def __init__(self, mu, sigma, batch_size=None, *args, **kwargs):
19 """
20 Params:
21 `mu` - mean
22 `sigma` - root variance
23 """
24 self.mu = mu
25 self.sigma = sigma
26 if mu.size() != sigma.size():
27 raise ValueError("Expected mu.size() == sigma.size(), but got {} vs {}"
28 .format(mu.size(), sigma.size()))
29 if mu.dim() == 1 and batch_size is not None:
30 self.mu = mu.expand(batch_size, mu.size(0))
31 self.sigma = sigma.expand(batch_size, sigma.size(0))
32 super(LogNormal, self).__init__(*args, **kwargs)
33
34 def batch_shape(self, x=None):
35 event_dim = 1
36 mu = self.mu
37 if x is not None and x.size() != mu.size():
38 mu = self.mu.expand(x.size()[:-event_dim] + self.event_shape())
39 return mu.size()[:-event_dim]
40
41 def event_shape(self):
42 event_dim = 1
43 return self.mu.size()[-event_dim:]
44
45 def shape(self, x=None):
46 return self.batch_shape(x) + self.event_shape()
47
48 def sample(self):
49 """
50 Reparameterized log-normal sampler.
51 """
52 eps = Variable(torch.randn(1).type_as(self.mu.data))
53 z = self.mu + self.sigma * eps
54 return torch.exp(z)
55
56 def batch_log_pdf(self, x):
57 """
58 log-normal log-likelihood
59 """
60 mu = self.mu.expand(self.shape(x))
61 sigma = self.sigma.expand(self.shape(x))
62 ll_1 = Variable(torch.Tensor([-0.5 * np.log(2.0 * np.pi)])
63 .type_as(mu.data).expand_as(x))
64 ll_2 = -torch.log(sigma * x)
65 ll_3 = -0.5 * torch.pow((torch.log(x) - mu) / sigma, 2.0)
66 batch_log_pdf = torch.sum(ll_1 + ll_2 + ll_3, -1)
67 batch_log_pdf_shape = self.batch_shape(x) + (1,)
68 return batch_log_pdf.contiguous().view(batch_log_pdf_shape)
69
70 def analytic_mean(self):
71 return torch.exp(self.mu + 0.5 * torch.pow(self.sigma, 2.0))
72
73 def analytic_var(self):
74 return (torch.exp(torch.pow(self.sigma, 2.0)) - Variable(torch.ones(1))) * \
75 torch.pow(self.analytic_mean(), 2)
76
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pyro/distributions/log_normal.py b/pyro/distributions/log_normal.py
--- a/pyro/distributions/log_normal.py
+++ b/pyro/distributions/log_normal.py
@@ -49,7 +49,7 @@
"""
Reparameterized log-normal sampler.
"""
- eps = Variable(torch.randn(1).type_as(self.mu.data))
+ eps = Variable(torch.randn(self.mu.size()).type_as(self.mu.data))
z = self.mu + self.sigma * eps
return torch.exp(z)
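The one-line change above matters because `torch.randn(1)` draws a single scalar that broadcasting then copies across every element of a batched `mu`, so all batch entries receive identical noise, while `torch.randn(self.mu.size())` draws one value per element. A tiny stand-alone illustration in modern PyTorch (not project code):

```python
import torch

mu = torch.zeros(4, 3)                      # e.g. batch_size=4, event size 3
shared = mu + torch.randn(1)                # old behaviour: identical entries
per_element = mu + torch.randn(mu.size())   # patched behaviour

assert float(shared.var()) == 0.0           # no variation across the batch
assert per_element.shape == (4, 3)
```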
| {"golden_diff": "diff --git a/pyro/distributions/log_normal.py b/pyro/distributions/log_normal.py\n--- a/pyro/distributions/log_normal.py\n+++ b/pyro/distributions/log_normal.py\n@@ -49,7 +49,7 @@\n \"\"\"\n Reparameterized log-normal sampler.\n \"\"\"\n- eps = Variable(torch.randn(1).type_as(self.mu.data))\n+ eps = Variable(torch.randn(self.mu.size()).type_as(self.mu.data))\n z = self.mu + self.sigma * eps\n return torch.exp(z)\n", "issue": "Test batch_size parameter is correctly applied for all distributions\nWith #323, we are now supporting `batch_size` parameter for both the class and functional forms of both the distributions. Let us add a simple test to validate that this is working as expected. \n", "before_files": [{"content": "import numpy as np\nimport torch\nfrom torch.autograd import Variable\n\nfrom pyro.distributions.distribution import Distribution\n\n\nclass LogNormal(Distribution):\n \"\"\"\n :param mu: mean *(vector)*\n :param sigma: standard deviations *(vector (0, Infinity))*\n\n A distribution over probability vectors obtained by exp-transforming a random\n variable drawn from ``Normal({mu: mu, sigma: sigma})``.\n \"\"\"\n reparameterized = True\n\n def __init__(self, mu, sigma, batch_size=None, *args, **kwargs):\n \"\"\"\n Params:\n `mu` - mean\n `sigma` - root variance\n \"\"\"\n self.mu = mu\n self.sigma = sigma\n if mu.size() != sigma.size():\n raise ValueError(\"Expected mu.size() == sigma.size(), but got {} vs {}\"\n .format(mu.size(), sigma.size()))\n if mu.dim() == 1 and batch_size is not None:\n self.mu = mu.expand(batch_size, mu.size(0))\n self.sigma = sigma.expand(batch_size, sigma.size(0))\n super(LogNormal, self).__init__(*args, **kwargs)\n\n def batch_shape(self, x=None):\n event_dim = 1\n mu = self.mu\n if x is not None and x.size() != mu.size():\n mu = self.mu.expand(x.size()[:-event_dim] + self.event_shape())\n return mu.size()[:-event_dim]\n\n def event_shape(self):\n event_dim = 1\n return self.mu.size()[-event_dim:]\n\n def shape(self, x=None):\n return self.batch_shape(x) + self.event_shape()\n\n def sample(self):\n \"\"\"\n Reparameterized log-normal sampler.\n \"\"\"\n eps = Variable(torch.randn(1).type_as(self.mu.data))\n z = self.mu + self.sigma * eps\n return torch.exp(z)\n\n def batch_log_pdf(self, x):\n \"\"\"\n log-normal log-likelihood\n \"\"\"\n mu = self.mu.expand(self.shape(x))\n sigma = self.sigma.expand(self.shape(x))\n ll_1 = Variable(torch.Tensor([-0.5 * np.log(2.0 * np.pi)])\n .type_as(mu.data).expand_as(x))\n ll_2 = -torch.log(sigma * x)\n ll_3 = -0.5 * torch.pow((torch.log(x) - mu) / sigma, 2.0)\n batch_log_pdf = torch.sum(ll_1 + ll_2 + ll_3, -1)\n batch_log_pdf_shape = self.batch_shape(x) + (1,)\n return batch_log_pdf.contiguous().view(batch_log_pdf_shape)\n\n def analytic_mean(self):\n return torch.exp(self.mu + 0.5 * torch.pow(self.sigma, 2.0))\n\n def analytic_var(self):\n return (torch.exp(torch.pow(self.sigma, 2.0)) - Variable(torch.ones(1))) * \\\n torch.pow(self.analytic_mean(), 2)\n", "path": "pyro/distributions/log_normal.py"}], "after_files": [{"content": "import numpy as np\nimport torch\nfrom torch.autograd import Variable\n\nfrom pyro.distributions.distribution import Distribution\n\n\nclass LogNormal(Distribution):\n \"\"\"\n :param mu: mean *(vector)*\n :param sigma: standard deviations *(vector (0, Infinity))*\n\n A distribution over probability vectors obtained by exp-transforming a random\n variable drawn from ``Normal({mu: mu, sigma: sigma})``.\n \"\"\"\n reparameterized = True\n\n def __init__(self, mu, 
sigma, batch_size=None, *args, **kwargs):\n \"\"\"\n Params:\n `mu` - mean\n `sigma` - root variance\n \"\"\"\n self.mu = mu\n self.sigma = sigma\n if mu.size() != sigma.size():\n raise ValueError(\"Expected mu.size() == sigma.size(), but got {} vs {}\"\n .format(mu.size(), sigma.size()))\n if mu.dim() == 1 and batch_size is not None:\n self.mu = mu.expand(batch_size, mu.size(0))\n self.sigma = sigma.expand(batch_size, sigma.size(0))\n super(LogNormal, self).__init__(*args, **kwargs)\n\n def batch_shape(self, x=None):\n event_dim = 1\n mu = self.mu\n if x is not None and x.size() != mu.size():\n mu = self.mu.expand(x.size()[:-event_dim] + self.event_shape())\n return mu.size()[:-event_dim]\n\n def event_shape(self):\n event_dim = 1\n return self.mu.size()[-event_dim:]\n\n def shape(self, x=None):\n return self.batch_shape(x) + self.event_shape()\n\n def sample(self):\n \"\"\"\n Reparameterized log-normal sampler.\n \"\"\"\n eps = Variable(torch.randn(self.mu.size()).type_as(self.mu.data))\n z = self.mu + self.sigma * eps\n return torch.exp(z)\n\n def batch_log_pdf(self, x):\n \"\"\"\n log-normal log-likelihood\n \"\"\"\n mu = self.mu.expand(self.shape(x))\n sigma = self.sigma.expand(self.shape(x))\n ll_1 = Variable(torch.Tensor([-0.5 * np.log(2.0 * np.pi)])\n .type_as(mu.data).expand_as(x))\n ll_2 = -torch.log(sigma * x)\n ll_3 = -0.5 * torch.pow((torch.log(x) - mu) / sigma, 2.0)\n batch_log_pdf = torch.sum(ll_1 + ll_2 + ll_3, -1)\n batch_log_pdf_shape = self.batch_shape(x) + (1,)\n return batch_log_pdf.contiguous().view(batch_log_pdf_shape)\n\n def analytic_mean(self):\n return torch.exp(self.mu + 0.5 * torch.pow(self.sigma, 2.0))\n\n def analytic_var(self):\n return (torch.exp(torch.pow(self.sigma, 2.0)) - Variable(torch.ones(1))) * \\\n torch.pow(self.analytic_mean(), 2)\n", "path": "pyro/distributions/log_normal.py"}]} | 1,107 | 115 |
gh_patches_debug_32700 | rasdani/github-patches | git_diff | mne-tools__mne-python-6252 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Cannot pickle mne.preprocessing.ica.ICA anymore
With the current master (installed via `pip install git+https://github.com/mne-tools/mne-python.git#egg=mne`), I get an error when pickling an `mne.preprocessing.ica.ICA` object:
```
import pickle
import mne
print(mne.__version__)
ica = mne.preprocessing.ICA()
raw = mne.io.read_raw_edf("/Users/clemens/Downloads/testfiles/S001R04.edf", preload=True)
ica.fit(raw)
pickle.loads(pickle.dumps(ica))
# Traceback (most recent call last):
# File "/Users/clemens/Repositories/mnelab/.venv/lib/python3.7/site-packages/IPython/core/interactiveshell.py", line 3296, in run_code
# exec(code_obj, self.user_global_ns, self.user_ns)
# File "<ipython-input-8-a056623ee708>", line 1, in <module>
# pickle.loads(pickle.dumps(ica))
# TypeError: __new__() missing 1 required positional argument: 'val'
```
The same code works with the latest released version 0.17.2. I didn't see any change to the `ICA` object that might be causing this (but maybe I'm missing something). If indeed nothing changed, could this behavior be due to how pip installs stuff from GitHub (vs. "proper" packages from PyPI)? If so, how can I fix this (I need to pickle because I want to use the `multiprocessing` module)?
--- END ISSUE ---
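The failure mode in the traceback is typical of subclassing an immutable built-in with a `__new__` that takes extra arguments: pickle recreates such objects through `__new__`, and unless the class says what to pass via `__getnewargs__`, the extra parameter is missing at load time. A self-contained sketch with hypothetical names (not MNE code):

```python
import pickle


class NamedInt(int):
    """int subclass carrying a human-readable name."""

    def __new__(cls, name, val):
        out = int.__new__(cls, val)
        out._name = name
        return out

    def __getnewargs__(self):
        # Tell pickle which arguments to feed back into __new__; without
        # this, unpickling raises a TypeError about missing positional
        # arguments, much like the one reported above.
        return self._name, int(self)


restored = pickle.loads(pickle.dumps(NamedInt('kind', 3)))
assert restored == 3 and restored._name == 'kind'
```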
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mne/utils/_bunch.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 """Bunch-related classes."""
3 # Authors: Alexandre Gramfort <[email protected]>
4 # Eric Larson <[email protected]>
5 #
6 # License: BSD (3-clause)
7
8 from copy import deepcopy
9
10
11 ###############################################################################
12 # Create a Bunch class that acts like a struct (mybunch.key = val )
13
14 class Bunch(dict):
15 """Dictionnary-like object thatexposes its keys as attributes."""
16
17 def __init__(self, **kwargs): # noqa: D102
18 dict.__init__(self, kwargs)
19 self.__dict__ = self
20
21
22 ###############################################################################
23 # A protected version that prevents overwriting
24
25 class BunchConst(Bunch):
26 """Class to prevent us from re-defining constants (DRY)."""
27
28 def __setattr__(self, attr, val): # noqa: D105
29 if attr != '__dict__' and hasattr(self, attr):
30 raise AttributeError('Attribute "%s" already set' % attr)
31 super().__setattr__(attr, val)
32
33
34 ###############################################################################
35 # A version that tweaks the __repr__ of its values based on keys
36
37 class BunchConstNamed(BunchConst):
38 """Class to provide nice __repr__ for our integer constants.
39
40 Only supports string keys and int or float values.
41 """
42
43 def __setattr__(self, attr, val): # noqa: D105
44 assert isinstance(attr, str)
45 if isinstance(val, int):
46 val = NamedInt(attr, val)
47 elif isinstance(val, float):
48 val = NamedFloat(attr, val)
49 else:
50 assert isinstance(val, BunchConstNamed), type(val)
51 super().__setattr__(attr, val)
52
53
54 class Named(object):
55 """Provide shared methods for giving named-representation subclasses."""
56
57 def __str__(self): # noqa: D105
58 return '%s (%s)' % (super().__str__(), self._name)
59
60 __repr__ = __str__
61
62 # see https://stackoverflow.com/a/15774013/2175965
63 def __copy__(self): # noqa: D105
64 cls = self.__class__
65 result = cls.__new__(cls)
66 result.__dict__.update(self.__dict__)
67 return result
68
69 def __deepcopy__(self, memo): # noqa: D105
70 cls = self.__class__
71 result = cls.__new__(cls, self._name, self)
72 memo[id(self)] = result
73 for k, v in self.__dict__.items():
74 setattr(result, k, deepcopy(v, memo))
75 return result
76
77
78 class NamedInt(Named, int):
79 """Int with a name in __repr__."""
80
81 def __new__(cls, name, val): # noqa: D102,D105
82 out = int.__new__(cls, val)
83 out._name = name
84 return out
85
86
87 class NamedFloat(Named, float):
88 """Float with a name in __repr__."""
89
90 def __new__(cls, name, val): # noqa: D102,D105
91 out = float.__new__(cls, val)
92 out._name = name
93 return out
94
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mne/utils/_bunch.py b/mne/utils/_bunch.py
--- a/mne/utils/_bunch.py
+++ b/mne/utils/_bunch.py
@@ -9,10 +9,10 @@
###############################################################################
-# Create a Bunch class that acts like a struct (mybunch.key = val )
+# Create a Bunch class that acts like a struct (mybunch.key = val)
class Bunch(dict):
- """Dictionnary-like object thatexposes its keys as attributes."""
+ """Dictionary-like object that exposes its keys as attributes."""
def __init__(self, **kwargs): # noqa: D102
dict.__init__(self, kwargs)
@@ -51,9 +51,14 @@
super().__setattr__(attr, val)
-class Named(object):
+class _Named(object):
"""Provide shared methods for giving named-representation subclasses."""
+ def __new__(cls, name, val): # noqa: D102,D105
+ out = _named_subclass(cls).__new__(cls, val)
+ out._name = name
+ return out
+
def __str__(self): # noqa: D105
return '%s (%s)' % (super().__str__(), self._name)
@@ -74,20 +79,25 @@
setattr(result, k, deepcopy(v, memo))
return result
+ def __getnewargs__(self): # noqa: D105
+ return self._name, _named_subclass(self)(self)
+
+
+def _named_subclass(klass):
+ if not isinstance(klass, type):
+ klass = klass.__class__
+ subklass = klass.mro()[-2]
+ assert subklass in (int, float)
+ return subklass
+
-class NamedInt(Named, int):
+class NamedInt(_Named, int):
"""Int with a name in __repr__."""
- def __new__(cls, name, val): # noqa: D102,D105
- out = int.__new__(cls, val)
- out._name = name
- return out
+ pass
-class NamedFloat(Named, float):
+class NamedFloat(_Named, float):
"""Float with a name in __repr__."""
- def __new__(cls, name, val): # noqa: D102,D105
- out = float.__new__(cls, val)
- out._name = name
- return out
+ pass
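One detail of the patch worth spelling out: `_named_subclass` relies on the method resolution order of these classes ending in `(..., int or float, object)`, so `mro()[-2]` always picks out the underlying built-in type, which `__getnewargs__` then uses to hand pickle a plain numeric value. A quick stand-alone check of that assumption (simplified stand-in classes):

```python
class _Named:            # simplified stand-in for the mixin in the patch
    pass


class NamedInt(_Named, int):
    pass


assert NamedInt.__mro__ == (NamedInt, _Named, int, object)
assert NamedInt.mro()[-2] is int   # what _named_subclass() would return
```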
| {"golden_diff": "diff --git a/mne/utils/_bunch.py b/mne/utils/_bunch.py\n--- a/mne/utils/_bunch.py\n+++ b/mne/utils/_bunch.py\n@@ -9,10 +9,10 @@\n \n \n ###############################################################################\n-# Create a Bunch class that acts like a struct (mybunch.key = val )\n+# Create a Bunch class that acts like a struct (mybunch.key = val)\n \n class Bunch(dict):\n- \"\"\"Dictionnary-like object thatexposes its keys as attributes.\"\"\"\n+ \"\"\"Dictionary-like object that exposes its keys as attributes.\"\"\"\n \n def __init__(self, **kwargs): # noqa: D102\n dict.__init__(self, kwargs)\n@@ -51,9 +51,14 @@\n super().__setattr__(attr, val)\n \n \n-class Named(object):\n+class _Named(object):\n \"\"\"Provide shared methods for giving named-representation subclasses.\"\"\"\n \n+ def __new__(cls, name, val): # noqa: D102,D105\n+ out = _named_subclass(cls).__new__(cls, val)\n+ out._name = name\n+ return out\n+\n def __str__(self): # noqa: D105\n return '%s (%s)' % (super().__str__(), self._name)\n \n@@ -74,20 +79,25 @@\n setattr(result, k, deepcopy(v, memo))\n return result\n \n+ def __getnewargs__(self): # noqa: D105\n+ return self._name, _named_subclass(self)(self)\n+\n+\n+def _named_subclass(klass):\n+ if not isinstance(klass, type):\n+ klass = klass.__class__\n+ subklass = klass.mro()[-2]\n+ assert subklass in (int, float)\n+ return subklass\n+\n \n-class NamedInt(Named, int):\n+class NamedInt(_Named, int):\n \"\"\"Int with a name in __repr__.\"\"\"\n \n- def __new__(cls, name, val): # noqa: D102,D105\n- out = int.__new__(cls, val)\n- out._name = name\n- return out\n+ pass\n \n \n-class NamedFloat(Named, float):\n+class NamedFloat(_Named, float):\n \"\"\"Float with a name in __repr__.\"\"\"\n \n- def __new__(cls, name, val): # noqa: D102,D105\n- out = float.__new__(cls, val)\n- out._name = name\n- return out\n+ pass\n", "issue": "Cannot pickle mne.preprocessing.ica.ICA anymore\nWith the current master (installed via `pip install git+https://github.com/mne-tools/mne-python.git#egg=mne`), I get an error when pickling an `mne.preprocessing.ica.ICA` object:\r\n\r\n```\r\nimport pickle\r\nimport mne\r\n\r\n\r\nprint(mne.__version__)\r\nica = mne.preprocessing.ICA()\r\nraw = mne.io.read_raw_edf(\"/Users/clemens/Downloads/testfiles/S001R04.edf\", preload=True)\r\nica.fit(raw)\r\npickle.loads(pickle.dumps(ica))\r\n# Traceback (most recent call last):\r\n# File \"/Users/clemens/Repositories/mnelab/.venv/lib/python3.7/site-packages/IPython/core/interactiveshell.py\", line 3296, in run_code\r\n# exec(code_obj, self.user_global_ns, self.user_ns)\r\n# File \"<ipython-input-8-a056623ee708>\", line 1, in <module>\r\n# pickle.loads(pickle.dumps(ica))\r\n# TypeError: __new__() missing 1 required positional argument: 'val'\r\n```\r\n\r\nThe same code works with the latest released version 0.17.2. I didn't see any change to the `ICA` object that might be causing this (but maybe I'm missing something). If indeed nothing changed, could this behavior be due to how pip installs stuff from GitHub (vs. \"proper\" packages from PyPI)? 
If so, how can I fix this (I need to pickle because I want to use the `multiprocessing` module)?\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"Bunch-related classes.\"\"\"\n# Authors: Alexandre Gramfort <[email protected]>\n# Eric Larson <[email protected]>\n#\n# License: BSD (3-clause)\n\nfrom copy import deepcopy\n\n\n###############################################################################\n# Create a Bunch class that acts like a struct (mybunch.key = val )\n\nclass Bunch(dict):\n \"\"\"Dictionnary-like object thatexposes its keys as attributes.\"\"\"\n\n def __init__(self, **kwargs): # noqa: D102\n dict.__init__(self, kwargs)\n self.__dict__ = self\n\n\n###############################################################################\n# A protected version that prevents overwriting\n\nclass BunchConst(Bunch):\n \"\"\"Class to prevent us from re-defining constants (DRY).\"\"\"\n\n def __setattr__(self, attr, val): # noqa: D105\n if attr != '__dict__' and hasattr(self, attr):\n raise AttributeError('Attribute \"%s\" already set' % attr)\n super().__setattr__(attr, val)\n\n\n###############################################################################\n# A version that tweaks the __repr__ of its values based on keys\n\nclass BunchConstNamed(BunchConst):\n \"\"\"Class to provide nice __repr__ for our integer constants.\n\n Only supports string keys and int or float values.\n \"\"\"\n\n def __setattr__(self, attr, val): # noqa: D105\n assert isinstance(attr, str)\n if isinstance(val, int):\n val = NamedInt(attr, val)\n elif isinstance(val, float):\n val = NamedFloat(attr, val)\n else:\n assert isinstance(val, BunchConstNamed), type(val)\n super().__setattr__(attr, val)\n\n\nclass Named(object):\n \"\"\"Provide shared methods for giving named-representation subclasses.\"\"\"\n\n def __str__(self): # noqa: D105\n return '%s (%s)' % (super().__str__(), self._name)\n\n __repr__ = __str__\n\n # see https://stackoverflow.com/a/15774013/2175965\n def __copy__(self): # noqa: D105\n cls = self.__class__\n result = cls.__new__(cls)\n result.__dict__.update(self.__dict__)\n return result\n\n def __deepcopy__(self, memo): # noqa: D105\n cls = self.__class__\n result = cls.__new__(cls, self._name, self)\n memo[id(self)] = result\n for k, v in self.__dict__.items():\n setattr(result, k, deepcopy(v, memo))\n return result\n\n\nclass NamedInt(Named, int):\n \"\"\"Int with a name in __repr__.\"\"\"\n\n def __new__(cls, name, val): # noqa: D102,D105\n out = int.__new__(cls, val)\n out._name = name\n return out\n\n\nclass NamedFloat(Named, float):\n \"\"\"Float with a name in __repr__.\"\"\"\n\n def __new__(cls, name, val): # noqa: D102,D105\n out = float.__new__(cls, val)\n out._name = name\n return out\n", "path": "mne/utils/_bunch.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"Bunch-related classes.\"\"\"\n# Authors: Alexandre Gramfort <[email protected]>\n# Eric Larson <[email protected]>\n#\n# License: BSD (3-clause)\n\nfrom copy import deepcopy\n\n\n###############################################################################\n# Create a Bunch class that acts like a struct (mybunch.key = val)\n\nclass Bunch(dict):\n \"\"\"Dictionary-like object that exposes its keys as attributes.\"\"\"\n\n def __init__(self, **kwargs): # noqa: D102\n dict.__init__(self, kwargs)\n self.__dict__ = self\n\n\n###############################################################################\n# A protected version that prevents overwriting\n\nclass BunchConst(Bunch):\n \"\"\"Class 
to prevent us from re-defining constants (DRY).\"\"\"\n\n def __setattr__(self, attr, val): # noqa: D105\n if attr != '__dict__' and hasattr(self, attr):\n raise AttributeError('Attribute \"%s\" already set' % attr)\n super().__setattr__(attr, val)\n\n\n###############################################################################\n# A version that tweaks the __repr__ of its values based on keys\n\nclass BunchConstNamed(BunchConst):\n \"\"\"Class to provide nice __repr__ for our integer constants.\n\n Only supports string keys and int or float values.\n \"\"\"\n\n def __setattr__(self, attr, val): # noqa: D105\n assert isinstance(attr, str)\n if isinstance(val, int):\n val = NamedInt(attr, val)\n elif isinstance(val, float):\n val = NamedFloat(attr, val)\n else:\n assert isinstance(val, BunchConstNamed), type(val)\n super().__setattr__(attr, val)\n\n\nclass _Named(object):\n \"\"\"Provide shared methods for giving named-representation subclasses.\"\"\"\n\n def __new__(cls, name, val): # noqa: D102,D105\n out = _named_subclass(cls).__new__(cls, val)\n out._name = name\n return out\n\n def __str__(self): # noqa: D105\n return '%s (%s)' % (super().__str__(), self._name)\n\n __repr__ = __str__\n\n # see https://stackoverflow.com/a/15774013/2175965\n def __copy__(self): # noqa: D105\n cls = self.__class__\n result = cls.__new__(cls)\n result.__dict__.update(self.__dict__)\n return result\n\n def __deepcopy__(self, memo): # noqa: D105\n cls = self.__class__\n result = cls.__new__(cls, self._name, self)\n memo[id(self)] = result\n for k, v in self.__dict__.items():\n setattr(result, k, deepcopy(v, memo))\n return result\n\n def __getnewargs__(self): # noqa: D105\n return self._name, _named_subclass(self)(self)\n\n\ndef _named_subclass(klass):\n if not isinstance(klass, type):\n klass = klass.__class__\n subklass = klass.mro()[-2]\n assert subklass in (int, float)\n return subklass\n\n\nclass NamedInt(_Named, int):\n \"\"\"Int with a name in __repr__.\"\"\"\n\n pass\n\n\nclass NamedFloat(_Named, float):\n \"\"\"Float with a name in __repr__.\"\"\"\n\n pass\n", "path": "mne/utils/_bunch.py"}]} | 1,514 | 583 |
gh_patches_debug_21333 | rasdani/github-patches | git_diff | netket__netket-846 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Docs for sigma operators
The documentation of the ladder operator `netket.operator.spin.sigmap` says `σ^+ = σ^x + i σ^y`, but in fact it is defined as `σ^+ = 1/2(σ^x + i σ^y)` in the code.
The same error appears in the documentation of the `netket.operator.spin.sigmam` operator.
--- END ISSUE ---
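For a single spin-1/2 site the two conventions differ only by a factor of two, which is easy to verify from the standard Pauli matrices (textbook definitions, not taken from the NetKet sources):

```latex
\sigma^x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad
\sigma^y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \qquad
\tfrac{1}{2}\left(\sigma^x + i\,\sigma^y\right) = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \qquad
\sigma^x + i\,\sigma^y = \begin{pmatrix} 0 & 2 \\ 0 & 0 \end{pmatrix}.
```

The matrix that `sigmap` builds for a two-level site (see the `np.sqrt(S2 - m * (m + 1))` entries in the file below) is the one with a single 1 in the upper corner, so the 1/2 factor belongs in the docstrings, exactly as reported.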
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `netket/operator/spin.py`
Content:
```
1 # Copyright 2021 The NetKet Authors - All rights reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from netket.hilbert import AbstractHilbert
16 from netket.utils.types import DType
17
18 from ._local_operator import LocalOperator as _LocalOperator
19
20
21 def sigmax(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:
22 """
23 Builds the :math:`\\sigma^x` operator acting on the `site`-th of the Hilbert
24 space `hilbert`.
25
26 If `hilbert` is a non-Spin space of local dimension M, it is considered
27 as a (M-1)/2 - spin space.
28
29 :param hilbert: The hilbert space
30 :param site: the site on which this operator acts
31 :return: a nk.operator.LocalOperator
32 """
33 import numpy as np
34
35 N = hilbert.size_at_index(site)
36 S = (N - 1) / 2
37
38 D = [np.sqrt((S + 1) * 2 * a - a * (a + 1)) for a in np.arange(1, N)]
39 mat = np.diag(D, 1) + np.diag(D, -1)
40 return _LocalOperator(hilbert, mat, [site], dtype=dtype)
41
42
43 def sigmay(
44 hilbert: AbstractHilbert, site: int, dtype: DType = complex
45 ) -> _LocalOperator:
46 """
47 Builds the :math:`\\sigma^y` operator acting on the `site`-th of the Hilbert
48 space `hilbert`.
49
50 If `hilbert` is a non-Spin space of local dimension M, it is considered
51 as a (M-1)/2 - spin space.
52
53 :param hilbert: The hilbert space
54 :param site: the site on which this operator acts
55 :return: a nk.operator.LocalOperator
56 """
57 import numpy as np
58
59 N = hilbert.size_at_index(site)
60 S = (N - 1) / 2
61
62 D = np.array([1j * np.sqrt((S + 1) * 2 * a - a * (a + 1)) for a in np.arange(1, N)])
63 mat = np.diag(D, -1) + np.diag(-D, 1)
64 return _LocalOperator(hilbert, mat, [site], dtype=dtype)
65
66
67 def sigmaz(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:
68 """
69 Builds the :math:`\\sigma^z` operator acting on the `site`-th of the Hilbert
70 space `hilbert`.
71
72 If `hilbert` is a non-Spin space of local dimension M, it is considered
73 as a (M-1)/2 - spin space.
74
75 :param hilbert: The hilbert space
76 :param site: the site on which this operator acts
77 :return: a nk.operator.LocalOperator
78 """
79 import numpy as np
80
81 N = hilbert.size_at_index(site)
82 S = (N - 1) / 2
83
84 D = np.array([2 * m for m in np.arange(S, -(S + 1), -1)])
85 mat = np.diag(D, 0)
86 return _LocalOperator(hilbert, mat, [site], dtype=dtype)
87
88
89 def sigmam(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:
90 """
91 Builds the :math:`\\sigma^{-} = \\sigma^x - i \\sigma^y` operator acting on the
92 `site`-th of the Hilbert space `hilbert`.
93
94 If `hilbert` is a non-Spin space of local dimension M, it is considered
95 as a (M-1)/2 - spin space.
96
97 :param hilbert: The hilbert space
98 :param site: the site on which this operator acts
99 :return: a nk.operator.LocalOperator
100 """
101 import numpy as np
102
103 N = hilbert.size_at_index(site)
104 S = (N - 1) / 2
105
106 S2 = (S + 1) * S
107 D = np.array([np.sqrt(S2 - m * (m - 1)) for m in np.arange(S, -S, -1)])
108 mat = np.diag(D, -1)
109 return _LocalOperator(hilbert, mat, [site], dtype=dtype)
110
111
112 def sigmap(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:
113 """
114 Builds the :math:`\\sigma^{+} = \\sigma^x + i \\sigma^y` operator acting on the
115 `site`-th of the Hilbert space `hilbert`.
116
117 If `hilbert` is a non-Spin space of local dimension M, it is considered
118 as a (M-1)/2 - spin space.
119
120 :param hilbert: The hilbert space
121 :param site: the site on which this operator acts
122 :return: a nk.operator.LocalOperator
123 """
124 import numpy as np
125
126 N = hilbert.size_at_index(site)
127 S = (N - 1) / 2
128
129 S2 = (S + 1) * S
130 D = np.array([np.sqrt(S2 - m * (m + 1)) for m in np.arange(S - 1, -(S + 1), -1)])
131 mat = np.diag(D, 1)
132 return _LocalOperator(hilbert, mat, [site], dtype=dtype)
133
134
135 # clean up the module
136 del AbstractHilbert, DType
137
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/netket/operator/spin.py b/netket/operator/spin.py
--- a/netket/operator/spin.py
+++ b/netket/operator/spin.py
@@ -88,7 +88,7 @@
def sigmam(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:
"""
- Builds the :math:`\\sigma^{-} = \\sigma^x - i \\sigma^y` operator acting on the
+ Builds the :math:`\\sigma^{-} = \\frac{1}{2}(\\sigma^x - i \\sigma^y)` operator acting on the
`site`-th of the Hilbert space `hilbert`.
If `hilbert` is a non-Spin space of local dimension M, it is considered
@@ -111,7 +111,7 @@
def sigmap(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:
"""
- Builds the :math:`\\sigma^{+} = \\sigma^x + i \\sigma^y` operator acting on the
+ Builds the :math:`\\sigma^{+} = \\frac{1}{2}(\\sigma^x + i \\sigma^y)` operator acting on the
`site`-th of the Hilbert space `hilbert`.
If `hilbert` is a non-Spin space of local dimension M, it is considered
| {"golden_diff": "diff --git a/netket/operator/spin.py b/netket/operator/spin.py\n--- a/netket/operator/spin.py\n+++ b/netket/operator/spin.py\n@@ -88,7 +88,7 @@\n \n def sigmam(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:\n \"\"\"\n- Builds the :math:`\\\\sigma^{-} = \\\\sigma^x - i \\\\sigma^y` operator acting on the\n+ Builds the :math:`\\\\sigma^{-} = \\\\frac{1}{2}(\\\\sigma^x - i \\\\sigma^y)` operator acting on the\n `site`-th of the Hilbert space `hilbert`.\n \n If `hilbert` is a non-Spin space of local dimension M, it is considered\n@@ -111,7 +111,7 @@\n \n def sigmap(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:\n \"\"\"\n- Builds the :math:`\\\\sigma^{+} = \\\\sigma^x + i \\\\sigma^y` operator acting on the\n+ Builds the :math:`\\\\sigma^{+} = \\\\frac{1}{2}(\\\\sigma^x + i \\\\sigma^y)` operator acting on the\n `site`-th of the Hilbert space `hilbert`.\n \n If `hilbert` is a non-Spin space of local dimension M, it is considered\n", "issue": "Docs for sigma operators\nThe documentation of the ladder operator `netket.operator.spin.sigmap` says `\u03c3^+ = \u03c3^x + i \u03c3^y`, but in fact it is defined as `\u03c3^+ = 1/2(\u03c3^x + i \u03c3^y)` in the code.\r\n\r\nThe same error is in the documentation of `netket.operator.spin.sigmam` operator.\n", "before_files": [{"content": "# Copyright 2021 The NetKet Authors - All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom netket.hilbert import AbstractHilbert\nfrom netket.utils.types import DType\n\nfrom ._local_operator import LocalOperator as _LocalOperator\n\n\ndef sigmax(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:\n \"\"\"\n Builds the :math:`\\\\sigma^x` operator acting on the `site`-th of the Hilbert\n space `hilbert`.\n\n If `hilbert` is a non-Spin space of local dimension M, it is considered\n as a (M-1)/2 - spin space.\n\n :param hilbert: The hilbert space\n :param site: the site on which this operator acts\n :return: a nk.operator.LocalOperator\n \"\"\"\n import numpy as np\n\n N = hilbert.size_at_index(site)\n S = (N - 1) / 2\n\n D = [np.sqrt((S + 1) * 2 * a - a * (a + 1)) for a in np.arange(1, N)]\n mat = np.diag(D, 1) + np.diag(D, -1)\n return _LocalOperator(hilbert, mat, [site], dtype=dtype)\n\n\ndef sigmay(\n hilbert: AbstractHilbert, site: int, dtype: DType = complex\n) -> _LocalOperator:\n \"\"\"\n Builds the :math:`\\\\sigma^y` operator acting on the `site`-th of the Hilbert\n space `hilbert`.\n\n If `hilbert` is a non-Spin space of local dimension M, it is considered\n as a (M-1)/2 - spin space.\n\n :param hilbert: The hilbert space\n :param site: the site on which this operator acts\n :return: a nk.operator.LocalOperator\n \"\"\"\n import numpy as np\n\n N = hilbert.size_at_index(site)\n S = (N - 1) / 2\n\n D = np.array([1j * np.sqrt((S + 1) * 2 * a - a * (a + 1)) for a in np.arange(1, N)])\n mat = np.diag(D, -1) + np.diag(-D, 1)\n return _LocalOperator(hilbert, mat, [site], 
dtype=dtype)\n\n\ndef sigmaz(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:\n \"\"\"\n Builds the :math:`\\\\sigma^z` operator acting on the `site`-th of the Hilbert\n space `hilbert`.\n\n If `hilbert` is a non-Spin space of local dimension M, it is considered\n as a (M-1)/2 - spin space.\n\n :param hilbert: The hilbert space\n :param site: the site on which this operator acts\n :return: a nk.operator.LocalOperator\n \"\"\"\n import numpy as np\n\n N = hilbert.size_at_index(site)\n S = (N - 1) / 2\n\n D = np.array([2 * m for m in np.arange(S, -(S + 1), -1)])\n mat = np.diag(D, 0)\n return _LocalOperator(hilbert, mat, [site], dtype=dtype)\n\n\ndef sigmam(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:\n \"\"\"\n Builds the :math:`\\\\sigma^{-} = \\\\sigma^x - i \\\\sigma^y` operator acting on the\n `site`-th of the Hilbert space `hilbert`.\n\n If `hilbert` is a non-Spin space of local dimension M, it is considered\n as a (M-1)/2 - spin space.\n\n :param hilbert: The hilbert space\n :param site: the site on which this operator acts\n :return: a nk.operator.LocalOperator\n \"\"\"\n import numpy as np\n\n N = hilbert.size_at_index(site)\n S = (N - 1) / 2\n\n S2 = (S + 1) * S\n D = np.array([np.sqrt(S2 - m * (m - 1)) for m in np.arange(S, -S, -1)])\n mat = np.diag(D, -1)\n return _LocalOperator(hilbert, mat, [site], dtype=dtype)\n\n\ndef sigmap(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:\n \"\"\"\n Builds the :math:`\\\\sigma^{+} = \\\\sigma^x + i \\\\sigma^y` operator acting on the\n `site`-th of the Hilbert space `hilbert`.\n\n If `hilbert` is a non-Spin space of local dimension M, it is considered\n as a (M-1)/2 - spin space.\n\n :param hilbert: The hilbert space\n :param site: the site on which this operator acts\n :return: a nk.operator.LocalOperator\n \"\"\"\n import numpy as np\n\n N = hilbert.size_at_index(site)\n S = (N - 1) / 2\n\n S2 = (S + 1) * S\n D = np.array([np.sqrt(S2 - m * (m + 1)) for m in np.arange(S - 1, -(S + 1), -1)])\n mat = np.diag(D, 1)\n return _LocalOperator(hilbert, mat, [site], dtype=dtype)\n\n\n# clean up the module\ndel AbstractHilbert, DType\n", "path": "netket/operator/spin.py"}], "after_files": [{"content": "# Copyright 2021 The NetKet Authors - All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom netket.hilbert import AbstractHilbert\nfrom netket.utils.types import DType\n\nfrom ._local_operator import LocalOperator as _LocalOperator\n\n\ndef sigmax(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:\n \"\"\"\n Builds the :math:`\\\\sigma^x` operator acting on the `site`-th of the Hilbert\n space `hilbert`.\n\n If `hilbert` is a non-Spin space of local dimension M, it is considered\n as a (M-1)/2 - spin space.\n\n :param hilbert: The hilbert space\n :param site: the site on which this operator acts\n :return: a nk.operator.LocalOperator\n \"\"\"\n import numpy as np\n\n N = hilbert.size_at_index(site)\n S = (N - 1) / 
2\n\n D = [np.sqrt((S + 1) * 2 * a - a * (a + 1)) for a in np.arange(1, N)]\n mat = np.diag(D, 1) + np.diag(D, -1)\n return _LocalOperator(hilbert, mat, [site], dtype=dtype)\n\n\ndef sigmay(\n hilbert: AbstractHilbert, site: int, dtype: DType = complex\n) -> _LocalOperator:\n \"\"\"\n Builds the :math:`\\\\sigma^y` operator acting on the `site`-th of the Hilbert\n space `hilbert`.\n\n If `hilbert` is a non-Spin space of local dimension M, it is considered\n as a (M-1)/2 - spin space.\n\n :param hilbert: The hilbert space\n :param site: the site on which this operator acts\n :return: a nk.operator.LocalOperator\n \"\"\"\n import numpy as np\n\n N = hilbert.size_at_index(site)\n S = (N - 1) / 2\n\n D = np.array([1j * np.sqrt((S + 1) * 2 * a - a * (a + 1)) for a in np.arange(1, N)])\n mat = np.diag(D, -1) + np.diag(-D, 1)\n return _LocalOperator(hilbert, mat, [site], dtype=dtype)\n\n\ndef sigmaz(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:\n \"\"\"\n Builds the :math:`\\\\sigma^z` operator acting on the `site`-th of the Hilbert\n space `hilbert`.\n\n If `hilbert` is a non-Spin space of local dimension M, it is considered\n as a (M-1)/2 - spin space.\n\n :param hilbert: The hilbert space\n :param site: the site on which this operator acts\n :return: a nk.operator.LocalOperator\n \"\"\"\n import numpy as np\n\n N = hilbert.size_at_index(site)\n S = (N - 1) / 2\n\n D = np.array([2 * m for m in np.arange(S, -(S + 1), -1)])\n mat = np.diag(D, 0)\n return _LocalOperator(hilbert, mat, [site], dtype=dtype)\n\n\ndef sigmam(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:\n \"\"\"\n Builds the :math:`\\\\sigma^{-} = \\\\frac{1}{2}(\\\\sigma^x - i \\\\sigma^y)` operator acting on the\n `site`-th of the Hilbert space `hilbert`.\n\n If `hilbert` is a non-Spin space of local dimension M, it is considered\n as a (M-1)/2 - spin space.\n\n :param hilbert: The hilbert space\n :param site: the site on which this operator acts\n :return: a nk.operator.LocalOperator\n \"\"\"\n import numpy as np\n\n N = hilbert.size_at_index(site)\n S = (N - 1) / 2\n\n S2 = (S + 1) * S\n D = np.array([np.sqrt(S2 - m * (m - 1)) for m in np.arange(S, -S, -1)])\n mat = np.diag(D, -1)\n return _LocalOperator(hilbert, mat, [site], dtype=dtype)\n\n\ndef sigmap(hilbert: AbstractHilbert, site: int, dtype: DType = float) -> _LocalOperator:\n \"\"\"\n Builds the :math:`\\\\sigma^{+} = \\\\frac{1}{2}(\\\\sigma^x + i \\\\sigma^y)` operator acting on the\n `site`-th of the Hilbert space `hilbert`.\n\n If `hilbert` is a non-Spin space of local dimension M, it is considered\n as a (M-1)/2 - spin space.\n\n :param hilbert: The hilbert space\n :param site: the site on which this operator acts\n :return: a nk.operator.LocalOperator\n \"\"\"\n import numpy as np\n\n N = hilbert.size_at_index(site)\n S = (N - 1) / 2\n\n S2 = (S + 1) * S\n D = np.array([np.sqrt(S2 - m * (m + 1)) for m in np.arange(S - 1, -(S + 1), -1)])\n mat = np.diag(D, 1)\n return _LocalOperator(hilbert, mat, [site], dtype=dtype)\n\n\n# clean up the module\ndel AbstractHilbert, DType\n", "path": "netket/operator/spin.py"}]} | 2,047 | 327 |
gh_patches_debug_14974 | rasdani/github-patches | git_diff | enthought__chaco-423 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Demo world_map.py does not work on python 2
**Problem Description**
urllib.request deprecated in Python 2
**Reproduction Steps:**
Run world_map.py
**Expected behavior:**
Traceback (most recent call last):
File "world_map.py", line 15, in <module>
import urllib.request, urllib.parse, urllib.error
ImportError: No module named request
**OS, Python version:** [Enter OS name and Python version]
OSX, Python 2.7
xref: #385
--- END ISSUE ---
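For context, the module being imported only exists on Python 3; a common compatibility pattern (sketched below with illustrative names, using either a try/except fallback or the `six` package) keeps the script importable on both interpreters:

```python
# Python 2/3 compatible access to urlretrieve; urllib.request is Python 3 only.
try:
    from urllib.request import urlretrieve  # Python 3
except ImportError:
    from urllib import urlretrieve  # Python 2

# Equivalent, if the six package is available:
# from six.moves.urllib.request import urlretrieve

# Usage stays the same on both versions:
# urlretrieve(image_url, image_path)
```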
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/demo/world_map.py`
Content:
```
1 #!/usr/bin/env python
2 """
3 Displays a world map with locations plotted on top. Locations are expected to
4 be tuples of latitude, longitude where West and South are expressed as
5 negative values.
6 - Mousewheel up and down zooms the plot in and out.
7 - Pressing "z" brings up the Zoom Box, and you can click-drag a rectangular
8 region to zoom. If you use a sequence of zoom boxes, pressing alt-left-arrow
9 and alt-right-arrow moves you forwards and backwards through the "zoom
10 history".
11 """
12
13 # Standard library imports
14 import os.path
15 import urllib.request, urllib.parse, urllib.error
16
17 # Major library imports
18 import numpy
19
20 # ETS imports
21 from chaco.api import Plot, ArrayPlotData, ImageData
22 from chaco.tools.api import ZoomTool
23 from enable.component_editor import ComponentEditor
24 from traits.api import HasTraits, Instance, Str
25 from traitsui.api import Item, View
26
27 class WorldMapPlot(HasTraits):
28
29 ### Public Traits ##########################################################
30
31 # The plot which will be displayed
32 plot = Instance(Plot)
33
34 # The URL which points to the world map image to be downloaded
35 image_url = Str("http://eoimages.gsfc.nasa.gov/ve//2433/land_shallow_topo_2048.jpg")
36
37
38 ### Private Traits #########################################################
39
40 # The path to where the image exists on the filesystem
41 image_path = Str()
42
43 # The view
44 traits_view = View(Item('plot', editor=ComponentEditor(),
45 width=800, height=400, show_label=False),
46 resizable=True)
47
48 #---------------------------------------------------------------------------
49 # Public interface
50 #---------------------------------------------------------------------------
51
52 def __init__(self, **kw):
53 super(WorldMapPlot, self).__init__(**kw)
54
55 self._download_map_image()
56 image = ImageData.fromfile(self.image_path)
57
58 # For now, the locations are hardcoded, though this can be changed
59 # eassily to take command line args, read from a file, or by other
60 # means
61 austin_loc = (30.16, -97.44)
62
63 locations_x = numpy.array([austin_loc[1]])
64 locations_y = numpy.array([austin_loc[0]])
65
66 # transform each of the locations to the image data space, including
67 # moving the origin from bottom left to top left
68 locations_x = (locations_x + 180) * image.data.shape[1]/360
69 locations_y = (locations_y*-1 + 90) * image.data.shape[0]/180
70
71 # Create the plott data, adding the image and the locations
72 plot_data = ArrayPlotData()
73 plot_data.set_data("imagedata", image._data)
74 plot_data.set_data("locations_x", locations_x)
75 plot_data.set_data("locations_y", locations_y)
76
77 # Create the plot with the origin as top left, which matches
78 # how the image data is aligned
79 self.plot = Plot(plot_data, default_origin="top left")
80 self.plot.img_plot('imagedata')
81
82 # Plot the locations as a scatter plot to be overlayed on top
83 # of the map
84 loc_plot = self.plot.plot(('locations_x', 'locations_y'),
85 type='scatter', size=3, color='yellow',
86 marker='dot')[0]
87
88 loc_plot.x_mapper.range.high = image.data.shape[1]
89 loc_plot.x_mapper.range.low = 0
90 loc_plot.y_mapper.range.high = image.data.shape[0]
91 loc_plot.y_mapper.range.low = -0
92
93 # set up any tools, in this case just the zoom tool
94 zoom = ZoomTool(component=self.plot, tool_mode="box", always_on=False)
95 self.plot.overlays.append(zoom)
96
97 #---------------------------------------------------------------------------
98 # Protected interface
99 #---------------------------------------------------------------------------
100
101 def _download_map_image(self):
102 """ Downloads a map from the image_url attribute. This is done
103 primarily to keep the redistributable Chaco package as small
104 as possible
105 """
106 example_dir = os.path.dirname(__file__)
107 self.image_path = os.path.join(example_dir, 'data',
108 os.path.split(self.image_url)[1])
109
110 if not os.path.exists(self.image_path):
111 print("Downloading map image")
112 urllib.request.urlretrieve(self.image_url, self.image_path)
113
114 #===============================================================================
115 # demo object that is used by the demo.py application.
116 #===============================================================================
117 demo = WorldMapPlot()
118
119 if __name__ == "__main__":
120 demo.configure_traits()
121
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/examples/demo/world_map.py b/examples/demo/world_map.py
--- a/examples/demo/world_map.py
+++ b/examples/demo/world_map.py
@@ -12,7 +12,7 @@
# Standard library imports
import os.path
-import urllib.request, urllib.parse, urllib.error
+from six.moves.urllib import request
# Major library imports
import numpy
@@ -109,7 +109,7 @@
if not os.path.exists(self.image_path):
print("Downloading map image")
- urllib.request.urlretrieve(self.image_url, self.image_path)
+ request.urlretrieve(self.image_url, self.image_path)
#===============================================================================
# demo object that is used by the demo.py application.
| {"golden_diff": "diff --git a/examples/demo/world_map.py b/examples/demo/world_map.py\n--- a/examples/demo/world_map.py\n+++ b/examples/demo/world_map.py\n@@ -12,7 +12,7 @@\n \n # Standard library imports\n import os.path\n-import urllib.request, urllib.parse, urllib.error\n+from six.moves.urllib import request\n \n # Major library imports\n import numpy\n@@ -109,7 +109,7 @@\n \n if not os.path.exists(self.image_path):\n print(\"Downloading map image\")\n- urllib.request.urlretrieve(self.image_url, self.image_path)\n+ request.urlretrieve(self.image_url, self.image_path)\n \n #===============================================================================\n # demo object that is used by the demo.py application.\n", "issue": "Demo world_map.py does not work on python 2\n**Problem Description**\r\nurllib.request deprecated in Python 2\r\n**Reproduction Steps:**\r\nRun world_map.py\r\n\r\n**Expected behavior:**\r\n\r\nTraceback (most recent call last):\r\n File \"world_map.py\", line 15, in <module>\r\n import urllib.request, urllib.parse, urllib.error\r\nImportError: No module named request\r\n\r\n**OS, Python version:** [Enter OS name and Python version]\r\nOSX, Python 2.7\r\n\r\nxref: #385\n", "before_files": [{"content": "#!/usr/bin/env python\n\"\"\"\nDisplays a world map with locations plotted on top. Locations are expected to\nbe tuples of latitude, longitude where West and South are expressed as\nnegative values.\n - Mousewheel up and down zooms the plot in and out.\n - Pressing \"z\" brings up the Zoom Box, and you can click-drag a rectangular\n region to zoom. If you use a sequence of zoom boxes, pressing alt-left-arrow\n and alt-right-arrow moves you forwards and backwards through the \"zoom\n history\".\n\"\"\"\n\n# Standard library imports\nimport os.path\nimport urllib.request, urllib.parse, urllib.error\n\n# Major library imports\nimport numpy\n\n# ETS imports\nfrom chaco.api import Plot, ArrayPlotData, ImageData\nfrom chaco.tools.api import ZoomTool\nfrom enable.component_editor import ComponentEditor\nfrom traits.api import HasTraits, Instance, Str\nfrom traitsui.api import Item, View\n\nclass WorldMapPlot(HasTraits):\n\n ### Public Traits ##########################################################\n\n # The plot which will be displayed\n plot = Instance(Plot)\n\n # The URL which points to the world map image to be downloaded\n image_url = Str(\"http://eoimages.gsfc.nasa.gov/ve//2433/land_shallow_topo_2048.jpg\")\n\n\n ### Private Traits #########################################################\n\n # The path to where the image exists on the filesystem\n image_path = Str()\n\n # The view\n traits_view = View(Item('plot', editor=ComponentEditor(),\n width=800, height=400, show_label=False),\n resizable=True)\n\n #---------------------------------------------------------------------------\n # Public interface\n #---------------------------------------------------------------------------\n\n def __init__(self, **kw):\n super(WorldMapPlot, self).__init__(**kw)\n\n self._download_map_image()\n image = ImageData.fromfile(self.image_path)\n\n # For now, the locations are hardcoded, though this can be changed\n # eassily to take command line args, read from a file, or by other\n # means\n austin_loc = (30.16, -97.44)\n\n locations_x = numpy.array([austin_loc[1]])\n locations_y = numpy.array([austin_loc[0]])\n\n # transform each of the locations to the image data space, including\n # moving the origin from bottom left to top left\n locations_x = (locations_x + 180) * 
image.data.shape[1]/360\n locations_y = (locations_y*-1 + 90) * image.data.shape[0]/180\n\n # Create the plott data, adding the image and the locations\n plot_data = ArrayPlotData()\n plot_data.set_data(\"imagedata\", image._data)\n plot_data.set_data(\"locations_x\", locations_x)\n plot_data.set_data(\"locations_y\", locations_y)\n\n # Create the plot with the origin as top left, which matches\n # how the image data is aligned\n self.plot = Plot(plot_data, default_origin=\"top left\")\n self.plot.img_plot('imagedata')\n\n # Plot the locations as a scatter plot to be overlayed on top\n # of the map\n loc_plot = self.plot.plot(('locations_x', 'locations_y'),\n type='scatter', size=3, color='yellow',\n marker='dot')[0]\n\n loc_plot.x_mapper.range.high = image.data.shape[1]\n loc_plot.x_mapper.range.low = 0\n loc_plot.y_mapper.range.high = image.data.shape[0]\n loc_plot.y_mapper.range.low = -0\n\n # set up any tools, in this case just the zoom tool\n zoom = ZoomTool(component=self.plot, tool_mode=\"box\", always_on=False)\n self.plot.overlays.append(zoom)\n\n #---------------------------------------------------------------------------\n # Protected interface\n #---------------------------------------------------------------------------\n\n def _download_map_image(self):\n \"\"\" Downloads a map from the image_url attribute. This is done\n primarily to keep the redistributable Chaco package as small\n as possible\n \"\"\"\n example_dir = os.path.dirname(__file__)\n self.image_path = os.path.join(example_dir, 'data',\n os.path.split(self.image_url)[1])\n\n if not os.path.exists(self.image_path):\n print(\"Downloading map image\")\n urllib.request.urlretrieve(self.image_url, self.image_path)\n\n#===============================================================================\n# demo object that is used by the demo.py application.\n#===============================================================================\ndemo = WorldMapPlot()\n\nif __name__ == \"__main__\":\n demo.configure_traits()\n", "path": "examples/demo/world_map.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\"\"\"\nDisplays a world map with locations plotted on top. Locations are expected to\nbe tuples of latitude, longitude where West and South are expressed as\nnegative values.\n - Mousewheel up and down zooms the plot in and out.\n - Pressing \"z\" brings up the Zoom Box, and you can click-drag a rectangular\n region to zoom. 
If you use a sequence of zoom boxes, pressing alt-left-arrow\n and alt-right-arrow moves you forwards and backwards through the \"zoom\n history\".\n\"\"\"\n\n# Standard library imports\nimport os.path\nfrom six.moves.urllib import request\n\n# Major library imports\nimport numpy\n\n# ETS imports\nfrom chaco.api import Plot, ArrayPlotData, ImageData\nfrom chaco.tools.api import ZoomTool\nfrom enable.component_editor import ComponentEditor\nfrom traits.api import HasTraits, Instance, Str\nfrom traitsui.api import Item, View\n\nclass WorldMapPlot(HasTraits):\n\n ### Public Traits ##########################################################\n\n # The plot which will be displayed\n plot = Instance(Plot)\n\n # The URL which points to the world map image to be downloaded\n image_url = Str(\"http://eoimages.gsfc.nasa.gov/ve//2433/land_shallow_topo_2048.jpg\")\n\n\n ### Private Traits #########################################################\n\n # The path to where the image exists on the filesystem\n image_path = Str()\n\n # The view\n traits_view = View(Item('plot', editor=ComponentEditor(),\n width=800, height=400, show_label=False),\n resizable=True)\n\n #---------------------------------------------------------------------------\n # Public interface\n #---------------------------------------------------------------------------\n\n def __init__(self, **kw):\n super(WorldMapPlot, self).__init__(**kw)\n\n self._download_map_image()\n image = ImageData.fromfile(self.image_path)\n\n # For now, the locations are hardcoded, though this can be changed\n # eassily to take command line args, read from a file, or by other\n # means\n austin_loc = (30.16, -97.44)\n\n locations_x = numpy.array([austin_loc[1]])\n locations_y = numpy.array([austin_loc[0]])\n\n # transform each of the locations to the image data space, including\n # moving the origin from bottom left to top left\n locations_x = (locations_x + 180) * image.data.shape[1]/360\n locations_y = (locations_y*-1 + 90) * image.data.shape[0]/180\n\n # Create the plott data, adding the image and the locations\n plot_data = ArrayPlotData()\n plot_data.set_data(\"imagedata\", image._data)\n plot_data.set_data(\"locations_x\", locations_x)\n plot_data.set_data(\"locations_y\", locations_y)\n\n # Create the plot with the origin as top left, which matches\n # how the image data is aligned\n self.plot = Plot(plot_data, default_origin=\"top left\")\n self.plot.img_plot('imagedata')\n\n # Plot the locations as a scatter plot to be overlayed on top\n # of the map\n loc_plot = self.plot.plot(('locations_x', 'locations_y'),\n type='scatter', size=3, color='yellow',\n marker='dot')[0]\n\n loc_plot.x_mapper.range.high = image.data.shape[1]\n loc_plot.x_mapper.range.low = 0\n loc_plot.y_mapper.range.high = image.data.shape[0]\n loc_plot.y_mapper.range.low = -0\n\n # set up any tools, in this case just the zoom tool\n zoom = ZoomTool(component=self.plot, tool_mode=\"box\", always_on=False)\n self.plot.overlays.append(zoom)\n\n #---------------------------------------------------------------------------\n # Protected interface\n #---------------------------------------------------------------------------\n\n def _download_map_image(self):\n \"\"\" Downloads a map from the image_url attribute. 
This is done\n primarily to keep the redistributable Chaco package as small\n as possible\n \"\"\"\n example_dir = os.path.dirname(__file__)\n self.image_path = os.path.join(example_dir, 'data',\n os.path.split(self.image_url)[1])\n\n if not os.path.exists(self.image_path):\n print(\"Downloading map image\")\n request.urlretrieve(self.image_url, self.image_path)\n\n#===============================================================================\n# demo object that is used by the demo.py application.\n#===============================================================================\ndemo = WorldMapPlot()\n\nif __name__ == \"__main__\":\n demo.configure_traits()\n", "path": "examples/demo/world_map.py"}]} | 1,631 | 156 |
gh_patches_debug_37650 | rasdani/github-patches | git_diff | mirumee__ariadne-30 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
make_executable_schema resolvers arg should accept dict of dicts or list of dicts of dicts
The `make_executable_schema` utility should optionally take list of dicts of dicts (AKA "resolvers map"), this would allow larger projects to easily split and compose resolvers as needed:
```python
from ariadne import make_executable_schema
from products.graphql import resolvers as products_resolvers
from users.graphql import resolvers as users_resolvers
typedefs = "..."
resolvers = [products_resolvers, users_resolvers]
schema = make_executable_schema(typedefs, resolvers)
```
This task will likely require #13 to be done first, so we are 100% certain that all resolver mappings are dicts.
--- END ISSUE ---
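One way to support such a "list of resolver maps" is to merge the dicts before binding them to the schema; the helper below only illustrates that merge step (the function name is not part of Ariadne's API):

```python
# Merge several resolver maps ({"TypeName": {"fieldName": resolver}}) into one dict.
from collections import defaultdict


def merge_resolver_maps(resolver_maps):
    merged = defaultdict(dict)
    for resolver_map in resolver_maps:
        for type_name, field_resolvers in resolver_map.items():
            merged[type_name].update(field_resolvers)
    return dict(merged)


products_resolvers = {"Query": {"products": lambda obj, info: []}}
users_resolvers = {"Query": {"users": lambda obj, info: []}}

merged = merge_resolver_maps([products_resolvers, users_resolvers])
assert set(merged["Query"]) == {"products", "users"}
```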
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ariadne/executable_schema.py`
Content:
```
1 from typing import List, Union
2
3 from graphql import GraphQLSchema
4
5 from .build_schema import build_schema_from_type_definitions
6 from .resolvers import add_resolve_functions_to_schema
7
8
9 def make_executable_schema(
10 type_defs: Union[str, List[str]], resolvers: dict
11 ) -> GraphQLSchema:
12 schema = build_schema_from_type_definitions(type_defs)
13 add_resolve_functions_to_schema(schema, resolvers)
14 return schema
15
```
Path: `ariadne/build_schema.py`
Content:
```
1 from typing import List, Union
2
3 from graphql import GraphQLSchema, parse
4 from graphql.language.ast import (
5 Document,
6 ObjectTypeDefinition,
7 OperationTypeDefinition,
8 Name,
9 NamedType,
10 SchemaDefinition,
11 )
12 from graphql.utils.build_ast_schema import build_ast_schema
13
14
15 def build_schema_from_type_definitions(
16 type_defs: Union[str, List[str]]
17 ) -> GraphQLSchema:
18 if isinstance(type_defs, list):
19 type_defs = concatenate_type_defs(type_defs)
20
21 document = parse(type_defs)
22
23 if not document_has_schema(document):
24 schema_definition = build_default_schema(document)
25 document.definitions.append(schema_definition)
26
27 return build_ast_schema(document)
28
29
30 def concatenate_type_defs(type_defs: List[str]) -> str:
31 resolved_type_defs = []
32 for type_def in type_defs:
33 resolved_type_defs.append(type_def.strip())
34 return "\n\n".join(resolved_type_defs)
35
36
37 def build_default_schema(document: Document) -> SchemaDefinition:
38 defined_types = [
39 td.name.value
40 for td in document.definitions
41 if isinstance(td, ObjectTypeDefinition)
42 ]
43 operations = []
44 if "Query" in defined_types:
45 operations.append(
46 OperationTypeDefinition("query", type=NamedType(name=Name("Query")))
47 )
48 if "Mutation" in defined_types:
49 operations.append(
50 OperationTypeDefinition("mutation", type=NamedType(name=Name("Mutation")))
51 )
52 if "Subscription" in defined_types:
53 operations.append(
54 OperationTypeDefinition(
55 "subscription", type=NamedType(name=Name("Subscription"))
56 )
57 )
58 return SchemaDefinition(operation_types=operations, directives=[])
59
60
61 def document_has_schema(document: Document) -> bool:
62 return any(isinstance(td, SchemaDefinition) for td in document.definitions)
63
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ariadne/build_schema.py b/ariadne/build_schema.py
--- a/ariadne/build_schema.py
+++ b/ariadne/build_schema.py
@@ -1,5 +1,3 @@
-from typing import List, Union
-
from graphql import GraphQLSchema, parse
from graphql.language.ast import (
Document,
@@ -12,12 +10,7 @@
from graphql.utils.build_ast_schema import build_ast_schema
-def build_schema_from_type_definitions(
- type_defs: Union[str, List[str]]
-) -> GraphQLSchema:
- if isinstance(type_defs, list):
- type_defs = concatenate_type_defs(type_defs)
-
+def build_schema_from_type_definitions(type_defs: str) -> GraphQLSchema:
document = parse(type_defs)
if not document_has_schema(document):
@@ -27,13 +20,6 @@
return build_ast_schema(document)
-def concatenate_type_defs(type_defs: List[str]) -> str:
- resolved_type_defs = []
- for type_def in type_defs:
- resolved_type_defs.append(type_def.strip())
- return "\n\n".join(resolved_type_defs)
-
-
def build_default_schema(document: Document) -> SchemaDefinition:
defined_types = [
td.name.value
diff --git a/ariadne/executable_schema.py b/ariadne/executable_schema.py
--- a/ariadne/executable_schema.py
+++ b/ariadne/executable_schema.py
@@ -1,4 +1,6 @@
-from typing import List, Union
+from collections import defaultdict
+from itertools import chain
+from typing import Iterator, List, Union
from graphql import GraphQLSchema
@@ -6,9 +8,39 @@
from .resolvers import add_resolve_functions_to_schema
+def decompose_maps(resolvers_maps: List[dict]) -> Iterator[tuple]:
+ def flatten(rm):
+ for key, value in rm.items():
+ for resolver_name, resolver in value.items():
+ yield (key, resolver_name, resolver)
+
+ return chain.from_iterable(flatten(m) for m in resolvers_maps)
+
+
+def merge_resolvers(resolver_list: Iterator[tuple]) -> dict:
+ output = defaultdict(dict) # type: dict
+ for key, resolver_name, resolver in resolver_list:
+ output[key][resolver_name] = resolver
+ return output
+
+
+def join_type_defs(type_defs: List[str]) -> str:
+ return "\n\n".join(t.strip() for t in type_defs)
+
+
def make_executable_schema(
- type_defs: Union[str, List[str]], resolvers: dict
+ type_defs: Union[str, List[str]], resolvers: Union[dict, List[dict]]
) -> GraphQLSchema:
+ if isinstance(type_defs, list):
+ type_defs = join_type_defs(type_defs)
+
schema = build_schema_from_type_definitions(type_defs)
- add_resolve_functions_to_schema(schema, resolvers)
+
+ if isinstance(resolvers, list):
+ add_resolve_functions_to_schema(
+ schema, merge_resolvers(decompose_maps(resolvers))
+ )
+ elif isinstance(resolvers, dict):
+ add_resolve_functions_to_schema(schema, resolvers)
+
return schema
| {"golden_diff": "diff --git a/ariadne/build_schema.py b/ariadne/build_schema.py\n--- a/ariadne/build_schema.py\n+++ b/ariadne/build_schema.py\n@@ -1,5 +1,3 @@\n-from typing import List, Union\n-\n from graphql import GraphQLSchema, parse\n from graphql.language.ast import (\n Document,\n@@ -12,12 +10,7 @@\n from graphql.utils.build_ast_schema import build_ast_schema\n \n \n-def build_schema_from_type_definitions(\n- type_defs: Union[str, List[str]]\n-) -> GraphQLSchema:\n- if isinstance(type_defs, list):\n- type_defs = concatenate_type_defs(type_defs)\n-\n+def build_schema_from_type_definitions(type_defs: str) -> GraphQLSchema:\n document = parse(type_defs)\n \n if not document_has_schema(document):\n@@ -27,13 +20,6 @@\n return build_ast_schema(document)\n \n \n-def concatenate_type_defs(type_defs: List[str]) -> str:\n- resolved_type_defs = []\n- for type_def in type_defs:\n- resolved_type_defs.append(type_def.strip())\n- return \"\\n\\n\".join(resolved_type_defs)\n-\n-\n def build_default_schema(document: Document) -> SchemaDefinition:\n defined_types = [\n td.name.value\ndiff --git a/ariadne/executable_schema.py b/ariadne/executable_schema.py\n--- a/ariadne/executable_schema.py\n+++ b/ariadne/executable_schema.py\n@@ -1,4 +1,6 @@\n-from typing import List, Union\n+from collections import defaultdict\n+from itertools import chain\n+from typing import Iterator, List, Union\n \n from graphql import GraphQLSchema\n \n@@ -6,9 +8,39 @@\n from .resolvers import add_resolve_functions_to_schema\n \n \n+def decompose_maps(resolvers_maps: List[dict]) -> Iterator[tuple]:\n+ def flatten(rm):\n+ for key, value in rm.items():\n+ for resolver_name, resolver in value.items():\n+ yield (key, resolver_name, resolver)\n+\n+ return chain.from_iterable(flatten(m) for m in resolvers_maps)\n+\n+\n+def merge_resolvers(resolver_list: Iterator[tuple]) -> dict:\n+ output = defaultdict(dict) # type: dict\n+ for key, resolver_name, resolver in resolver_list:\n+ output[key][resolver_name] = resolver\n+ return output\n+\n+\n+def join_type_defs(type_defs: List[str]) -> str:\n+ return \"\\n\\n\".join(t.strip() for t in type_defs)\n+\n+\n def make_executable_schema(\n- type_defs: Union[str, List[str]], resolvers: dict\n+ type_defs: Union[str, List[str]], resolvers: Union[dict, List[dict]]\n ) -> GraphQLSchema:\n+ if isinstance(type_defs, list):\n+ type_defs = join_type_defs(type_defs)\n+\n schema = build_schema_from_type_definitions(type_defs)\n- add_resolve_functions_to_schema(schema, resolvers)\n+\n+ if isinstance(resolvers, list):\n+ add_resolve_functions_to_schema(\n+ schema, merge_resolvers(decompose_maps(resolvers))\n+ )\n+ elif isinstance(resolvers, dict):\n+ add_resolve_functions_to_schema(schema, resolvers)\n+\n return schema\n", "issue": "make_executable_schema resolvers arg should accept dict of dicts or list of dicts of dicts\nThe `make_executable_schema` utility should optionally take list of dicts of dicts (AKA \"resolvers map\"), this would allow larger projects to easily split and compose resolvers as needed:\r\n\r\n```python\r\nfrom ariadne import make_executable_schema\r\n\r\nfrom products.graphql import resolvers as products_resolvers\r\nfrom users.graphql import resolvers as users_resolvers\r\n\r\n\r\ntypedefs = \"...\"\r\nresolvers = [products_resolvers, users_resolvers]\r\n\r\nschema = make_executable_schema(typedefs, resolvers)\r\n```\r\n\r\nThis task will likely require #13 to be done first, so we are 100% certain that all resolver mappings are dicts.\n", "before_files": [{"content": "from typing 
import List, Union\n\nfrom graphql import GraphQLSchema\n\nfrom .build_schema import build_schema_from_type_definitions\nfrom .resolvers import add_resolve_functions_to_schema\n\n\ndef make_executable_schema(\n type_defs: Union[str, List[str]], resolvers: dict\n) -> GraphQLSchema:\n schema = build_schema_from_type_definitions(type_defs)\n add_resolve_functions_to_schema(schema, resolvers)\n return schema\n", "path": "ariadne/executable_schema.py"}, {"content": "from typing import List, Union\n\nfrom graphql import GraphQLSchema, parse\nfrom graphql.language.ast import (\n Document,\n ObjectTypeDefinition,\n OperationTypeDefinition,\n Name,\n NamedType,\n SchemaDefinition,\n)\nfrom graphql.utils.build_ast_schema import build_ast_schema\n\n\ndef build_schema_from_type_definitions(\n type_defs: Union[str, List[str]]\n) -> GraphQLSchema:\n if isinstance(type_defs, list):\n type_defs = concatenate_type_defs(type_defs)\n\n document = parse(type_defs)\n\n if not document_has_schema(document):\n schema_definition = build_default_schema(document)\n document.definitions.append(schema_definition)\n\n return build_ast_schema(document)\n\n\ndef concatenate_type_defs(type_defs: List[str]) -> str:\n resolved_type_defs = []\n for type_def in type_defs:\n resolved_type_defs.append(type_def.strip())\n return \"\\n\\n\".join(resolved_type_defs)\n\n\ndef build_default_schema(document: Document) -> SchemaDefinition:\n defined_types = [\n td.name.value\n for td in document.definitions\n if isinstance(td, ObjectTypeDefinition)\n ]\n operations = []\n if \"Query\" in defined_types:\n operations.append(\n OperationTypeDefinition(\"query\", type=NamedType(name=Name(\"Query\")))\n )\n if \"Mutation\" in defined_types:\n operations.append(\n OperationTypeDefinition(\"mutation\", type=NamedType(name=Name(\"Mutation\")))\n )\n if \"Subscription\" in defined_types:\n operations.append(\n OperationTypeDefinition(\n \"subscription\", type=NamedType(name=Name(\"Subscription\"))\n )\n )\n return SchemaDefinition(operation_types=operations, directives=[])\n\n\ndef document_has_schema(document: Document) -> bool:\n return any(isinstance(td, SchemaDefinition) for td in document.definitions)\n", "path": "ariadne/build_schema.py"}], "after_files": [{"content": "from collections import defaultdict\nfrom itertools import chain\nfrom typing import Iterator, List, Union\n\nfrom graphql import GraphQLSchema\n\nfrom .build_schema import build_schema_from_type_definitions\nfrom .resolvers import add_resolve_functions_to_schema\n\n\ndef decompose_maps(resolvers_maps: List[dict]) -> Iterator[tuple]:\n def flatten(rm):\n for key, value in rm.items():\n for resolver_name, resolver in value.items():\n yield (key, resolver_name, resolver)\n\n return chain.from_iterable(flatten(m) for m in resolvers_maps)\n\n\ndef merge_resolvers(resolver_list: Iterator[tuple]) -> dict:\n output = defaultdict(dict) # type: dict\n for key, resolver_name, resolver in resolver_list:\n output[key][resolver_name] = resolver\n return output\n\n\ndef join_type_defs(type_defs: List[str]) -> str:\n return \"\\n\\n\".join(t.strip() for t in type_defs)\n\n\ndef make_executable_schema(\n type_defs: Union[str, List[str]], resolvers: Union[dict, List[dict]]\n) -> GraphQLSchema:\n if isinstance(type_defs, list):\n type_defs = join_type_defs(type_defs)\n\n schema = build_schema_from_type_definitions(type_defs)\n\n if isinstance(resolvers, list):\n add_resolve_functions_to_schema(\n schema, merge_resolvers(decompose_maps(resolvers))\n )\n elif isinstance(resolvers, dict):\n 
add_resolve_functions_to_schema(schema, resolvers)\n\n return schema\n", "path": "ariadne/executable_schema.py"}, {"content": "from graphql import GraphQLSchema, parse\nfrom graphql.language.ast import (\n Document,\n ObjectTypeDefinition,\n OperationTypeDefinition,\n Name,\n NamedType,\n SchemaDefinition,\n)\nfrom graphql.utils.build_ast_schema import build_ast_schema\n\n\ndef build_schema_from_type_definitions(type_defs: str) -> GraphQLSchema:\n document = parse(type_defs)\n\n if not document_has_schema(document):\n schema_definition = build_default_schema(document)\n document.definitions.append(schema_definition)\n\n return build_ast_schema(document)\n\n\ndef build_default_schema(document: Document) -> SchemaDefinition:\n defined_types = [\n td.name.value\n for td in document.definitions\n if isinstance(td, ObjectTypeDefinition)\n ]\n operations = []\n if \"Query\" in defined_types:\n operations.append(\n OperationTypeDefinition(\"query\", type=NamedType(name=Name(\"Query\")))\n )\n if \"Mutation\" in defined_types:\n operations.append(\n OperationTypeDefinition(\"mutation\", type=NamedType(name=Name(\"Mutation\")))\n )\n if \"Subscription\" in defined_types:\n operations.append(\n OperationTypeDefinition(\n \"subscription\", type=NamedType(name=Name(\"Subscription\"))\n )\n )\n return SchemaDefinition(operation_types=operations, directives=[])\n\n\ndef document_has_schema(document: Document) -> bool:\n return any(isinstance(td, SchemaDefinition) for td in document.definitions)\n", "path": "ariadne/build_schema.py"}]} | 1,041 | 709 |
gh_patches_debug_18394 | rasdani/github-patches | git_diff | vega__altair-1888 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
HTML renderer should use uuid in chart divs
Currently, divs are generated via chart counts. This means that if a kernel is restarted in the notebook, new charts may not display correctly.
--- END ISSUE ---
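The gist of the requested change, sketched on a stripped-down class (names are illustrative), is to derive the div id from `uuid4()` instead of an instance counter, so ids never collide with divs left over from a previous kernel:

```python
import uuid


class HTMLRendererSketch:
    """Minimal stand-in for the renderer: unique output div on every access."""

    def __init__(self, output_div="altair-viz-{}"):
        self._output_div = output_div

    @property
    def output_div(self):
        return self._output_div.format(uuid.uuid4().hex)


renderer = HTMLRendererSketch()
print(renderer.output_div)  # e.g. "altair-viz-9f1c2e..." -- different every call
```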
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `altair/utils/display.py`
Content:
```
1 import json
2 import pkgutil
3 import textwrap
4 from typing import Callable, Dict
5
6 from jsonschema import validate
7
8 from .plugin_registry import PluginRegistry
9 from .mimebundle import spec_to_mimebundle
10
11
12 # ==============================================================================
13 # Renderer registry
14 # ==============================================================================
15 MimeBundleType = Dict[str, object]
16 RendererType = Callable[..., MimeBundleType]
17
18
19 class RendererRegistry(PluginRegistry[RendererType]):
20 entrypoint_err_messages = {
21 'notebook': textwrap.dedent(
22 """
23 To use the 'notebook' renderer, you must install the vega package
24 and the associated Jupyter extension.
25 See https://altair-viz.github.io/getting_started/installation.html
26 for more information.
27 """),
28 'vegascope': textwrap.dedent(
29 """
30 To use the 'vegascope' renderer, you must install the vegascope
31 package; see http://github.com/diana-hep/vegascope/
32 for more information.
33 """),
34 }
35
36 def set_embed_options(self, defaultStyle=None, renderer=None,
37 width=None, height=None, padding=None,
38 scaleFactor=None, actions=None, **kwargs):
39 """Set options for embeddings of Vega & Vega-Lite charts.
40
41 Options are fully documented at https://github.com/vega/vega-embed.
42 Similar to the `enable()` method, this can be used as either
43 a persistent global switch, or as a temporary local setting using
44 a context manager (i.e. a `with` statement).
45
46 Parameters
47 ----------
48 defaultStyle : bool or string
49 Specify a default stylesheet for embed actions.
50 renderer : string
51 The renderer to use for the view. One of "canvas" (default) or "svg"
52 width : integer
53 The view width in pixels
54 height : integer
55 The view height in pixels
56 padding : integer
57 The view padding in pixels
58 scaleFactor : number
59 The number by which to multiply the width and height (default 1)
60 of an exported PNG or SVG image.
61 actions : bool or dict
62 Determines if action links ("Export as PNG/SVG", "View Source",
63 "View Vega" (only for Vega-Lite), "Open in Vega Editor") are
64 included with the embedded view. If the value is true, all action
65 links will be shown and none if the value is false. This property
66 can take a key-value mapping object that maps keys (export, source,
67 compiled, editor) to boolean values for determining if
68 each action link should be shown.
69 **kwargs :
70 Additional options are passed directly to embed options.
71 """
72 options = {'defaultStyle': defaultStyle, 'renderer': renderer,
73 'width': width, 'height': height, 'padding': padding,
74 'scaleFactor': scaleFactor, 'actions': actions}
75 kwargs.update({key: val for key, val in options.items()
76 if val is not None})
77 return self.enable(None, embed_options=kwargs)
78
79
80 # ==============================================================================
81 # VegaLite v1/v2 renderer logic
82 # ==============================================================================
83
84
85 class Displayable(object):
86 """A base display class for VegaLite v1/v2.
87
88 This class takes a VegaLite v1/v2 spec and does the following:
89
90 1. Optionally validates the spec against a schema.
91 2. Uses the RendererPlugin to grab a renderer and call it when the
92 IPython/Jupyter display method (_repr_mimebundle_) is called.
93
94 The spec passed to this class must be fully schema compliant and already
95 have the data portion of the spec fully processed and ready to serialize.
96 In practice, this means, the data portion of the spec should have been passed
97 through appropriate data model transformers.
98 """
99
100 renderers = None
101 schema_path = ('altair', '')
102
103 def __init__(self, spec, validate=False):
104 # type: (dict, bool) -> None
105 self.spec = spec
106 self.validate = validate
107 self._validate()
108
109 def _validate(self):
110 # type: () -> None
111 """Validate the spec against the schema."""
112 schema_dict = json.loads(pkgutil.get_data(*self.schema_path).decode('utf-8'))
113 validate(self.spec, schema_dict)
114
115 def _repr_mimebundle_(self, include, exclude):
116 """Return a MIME bundle for display in Jupyter frontends."""
117 if self.renderers is not None:
118 return self.renderers.get()(self.spec)
119 else:
120 return {}
121
122
123 def default_renderer_base(spec, mime_type, str_repr, **options):
124 """A default renderer for Vega or VegaLite that works for modern frontends.
125
126 This renderer works with modern frontends (JupyterLab, nteract) that know
127 how to render the custom VegaLite MIME type listed above.
128 """
129 assert isinstance(spec, dict)
130 bundle = {}
131 metadata = {}
132
133 bundle[mime_type] = spec
134 bundle['text/plain'] = str_repr
135 if options:
136 metadata[mime_type] = options
137 return bundle, metadata
138
139
140 def json_renderer_base(spec, str_repr, **options):
141 """A renderer that returns a MIME type of application/json.
142
143 In JupyterLab/nteract this is rendered as a nice JSON tree.
144 """
145 return default_renderer_base(spec, mime_type='application/json',
146 str_repr=str_repr, **options)
147
148
149 class HTMLRenderer(object):
150 """Object to render charts as HTML, with a unique output div each time"""
151 def __init__(self, output_div='altair-viz-{}', **kwargs):
152 self._output_div = output_div
153 self._output_count = 0
154 self.kwargs = kwargs
155
156 @property
157 def output_div(self):
158 self._output_count += 1
159 return self._output_div.format(self._output_count)
160
161 def __call__(self, spec, **metadata):
162 kwargs = self.kwargs.copy()
163 kwargs.update(metadata)
164 return spec_to_mimebundle(spec, format='html',
165 output_div=self.output_div, **kwargs)
166
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/altair/utils/display.py b/altair/utils/display.py
--- a/altair/utils/display.py
+++ b/altair/utils/display.py
@@ -2,6 +2,7 @@
import pkgutil
import textwrap
from typing import Callable, Dict
+import uuid
from jsonschema import validate
@@ -150,13 +151,11 @@
"""Object to render charts as HTML, with a unique output div each time"""
def __init__(self, output_div='altair-viz-{}', **kwargs):
self._output_div = output_div
- self._output_count = 0
self.kwargs = kwargs
@property
def output_div(self):
- self._output_count += 1
- return self._output_div.format(self._output_count)
+ return self._output_div.format(uuid.uuid4().hex)
def __call__(self, spec, **metadata):
kwargs = self.kwargs.copy()
| {"golden_diff": "diff --git a/altair/utils/display.py b/altair/utils/display.py\n--- a/altair/utils/display.py\n+++ b/altair/utils/display.py\n@@ -2,6 +2,7 @@\n import pkgutil\n import textwrap\n from typing import Callable, Dict\n+import uuid\n \n from jsonschema import validate\n \n@@ -150,13 +151,11 @@\n \"\"\"Object to render charts as HTML, with a unique output div each time\"\"\"\n def __init__(self, output_div='altair-viz-{}', **kwargs):\n self._output_div = output_div\n- self._output_count = 0\n self.kwargs = kwargs\n \n @property\n def output_div(self):\n- self._output_count += 1\n- return self._output_div.format(self._output_count)\n+ return self._output_div.format(uuid.uuid4().hex)\n \n def __call__(self, spec, **metadata):\n kwargs = self.kwargs.copy()\n", "issue": "HTML renderer should use uuid in chart divs\nCurrently, divs are generated via chart counts. This means that if a kernel is restarted in the notebook, new charts may not display correctly.\n", "before_files": [{"content": "import json\nimport pkgutil\nimport textwrap\nfrom typing import Callable, Dict\n\nfrom jsonschema import validate\n\nfrom .plugin_registry import PluginRegistry\nfrom .mimebundle import spec_to_mimebundle\n\n\n# ==============================================================================\n# Renderer registry\n# ==============================================================================\nMimeBundleType = Dict[str, object]\nRendererType = Callable[..., MimeBundleType]\n\n\nclass RendererRegistry(PluginRegistry[RendererType]):\n entrypoint_err_messages = {\n 'notebook': textwrap.dedent(\n \"\"\"\n To use the 'notebook' renderer, you must install the vega package\n and the associated Jupyter extension.\n See https://altair-viz.github.io/getting_started/installation.html\n for more information.\n \"\"\"),\n 'vegascope': textwrap.dedent(\n \"\"\"\n To use the 'vegascope' renderer, you must install the vegascope\n package; see http://github.com/diana-hep/vegascope/\n for more information.\n \"\"\"),\n }\n\n def set_embed_options(self, defaultStyle=None, renderer=None,\n width=None, height=None, padding=None,\n scaleFactor=None, actions=None, **kwargs):\n \"\"\"Set options for embeddings of Vega & Vega-Lite charts.\n\n Options are fully documented at https://github.com/vega/vega-embed.\n Similar to the `enable()` method, this can be used as either\n a persistent global switch, or as a temporary local setting using\n a context manager (i.e. a `with` statement).\n\n Parameters\n ----------\n defaultStyle : bool or string\n Specify a default stylesheet for embed actions.\n renderer : string\n The renderer to use for the view. One of \"canvas\" (default) or \"svg\"\n width : integer\n The view width in pixels\n height : integer\n The view height in pixels\n padding : integer\n The view padding in pixels\n scaleFactor : number\n The number by which to multiply the width and height (default 1)\n of an exported PNG or SVG image.\n actions : bool or dict\n Determines if action links (\"Export as PNG/SVG\", \"View Source\",\n \"View Vega\" (only for Vega-Lite), \"Open in Vega Editor\") are\n included with the embedded view. If the value is true, all action\n links will be shown and none if the value is false. 
This property\n can take a key-value mapping object that maps keys (export, source,\n compiled, editor) to boolean values for determining if\n each action link should be shown.\n **kwargs :\n Additional options are passed directly to embed options.\n \"\"\"\n options = {'defaultStyle': defaultStyle, 'renderer': renderer,\n 'width': width, 'height': height, 'padding': padding,\n 'scaleFactor': scaleFactor, 'actions': actions}\n kwargs.update({key: val for key, val in options.items()\n if val is not None})\n return self.enable(None, embed_options=kwargs)\n\n\n# ==============================================================================\n# VegaLite v1/v2 renderer logic\n# ==============================================================================\n\n\nclass Displayable(object):\n \"\"\"A base display class for VegaLite v1/v2.\n\n This class takes a VegaLite v1/v2 spec and does the following:\n\n 1. Optionally validates the spec against a schema.\n 2. Uses the RendererPlugin to grab a renderer and call it when the\n IPython/Jupyter display method (_repr_mimebundle_) is called.\n\n The spec passed to this class must be fully schema compliant and already\n have the data portion of the spec fully processed and ready to serialize.\n In practice, this means, the data portion of the spec should have been passed\n through appropriate data model transformers.\n \"\"\"\n\n renderers = None\n schema_path = ('altair', '')\n\n def __init__(self, spec, validate=False):\n # type: (dict, bool) -> None\n self.spec = spec\n self.validate = validate\n self._validate()\n\n def _validate(self):\n # type: () -> None\n \"\"\"Validate the spec against the schema.\"\"\"\n schema_dict = json.loads(pkgutil.get_data(*self.schema_path).decode('utf-8'))\n validate(self.spec, schema_dict)\n\n def _repr_mimebundle_(self, include, exclude):\n \"\"\"Return a MIME bundle for display in Jupyter frontends.\"\"\"\n if self.renderers is not None:\n return self.renderers.get()(self.spec)\n else:\n return {}\n\n\ndef default_renderer_base(spec, mime_type, str_repr, **options):\n \"\"\"A default renderer for Vega or VegaLite that works for modern frontends.\n\n This renderer works with modern frontends (JupyterLab, nteract) that know\n how to render the custom VegaLite MIME type listed above.\n \"\"\"\n assert isinstance(spec, dict)\n bundle = {}\n metadata = {}\n\n bundle[mime_type] = spec\n bundle['text/plain'] = str_repr\n if options:\n metadata[mime_type] = options\n return bundle, metadata\n\n\ndef json_renderer_base(spec, str_repr, **options):\n \"\"\"A renderer that returns a MIME type of application/json.\n\n In JupyterLab/nteract this is rendered as a nice JSON tree.\n \"\"\"\n return default_renderer_base(spec, mime_type='application/json',\n str_repr=str_repr, **options)\n\n\nclass HTMLRenderer(object):\n \"\"\"Object to render charts as HTML, with a unique output div each time\"\"\"\n def __init__(self, output_div='altair-viz-{}', **kwargs):\n self._output_div = output_div\n self._output_count = 0\n self.kwargs = kwargs\n\n @property\n def output_div(self):\n self._output_count += 1\n return self._output_div.format(self._output_count)\n\n def __call__(self, spec, **metadata):\n kwargs = self.kwargs.copy()\n kwargs.update(metadata)\n return spec_to_mimebundle(spec, format='html',\n output_div=self.output_div, **kwargs)\n", "path": "altair/utils/display.py"}], "after_files": [{"content": "import json\nimport pkgutil\nimport textwrap\nfrom typing import Callable, Dict\nimport uuid\n\nfrom jsonschema import 
validate\n\nfrom .plugin_registry import PluginRegistry\nfrom .mimebundle import spec_to_mimebundle\n\n\n# ==============================================================================\n# Renderer registry\n# ==============================================================================\nMimeBundleType = Dict[str, object]\nRendererType = Callable[..., MimeBundleType]\n\n\nclass RendererRegistry(PluginRegistry[RendererType]):\n entrypoint_err_messages = {\n 'notebook': textwrap.dedent(\n \"\"\"\n To use the 'notebook' renderer, you must install the vega package\n and the associated Jupyter extension.\n See https://altair-viz.github.io/getting_started/installation.html\n for more information.\n \"\"\"),\n 'vegascope': textwrap.dedent(\n \"\"\"\n To use the 'vegascope' renderer, you must install the vegascope\n package; see http://github.com/diana-hep/vegascope/\n for more information.\n \"\"\"),\n }\n\n def set_embed_options(self, defaultStyle=None, renderer=None,\n width=None, height=None, padding=None,\n scaleFactor=None, actions=None, **kwargs):\n \"\"\"Set options for embeddings of Vega & Vega-Lite charts.\n\n Options are fully documented at https://github.com/vega/vega-embed.\n Similar to the `enable()` method, this can be used as either\n a persistent global switch, or as a temporary local setting using\n a context manager (i.e. a `with` statement).\n\n Parameters\n ----------\n defaultStyle : bool or string\n Specify a default stylesheet for embed actions.\n renderer : string\n The renderer to use for the view. One of \"canvas\" (default) or \"svg\"\n width : integer\n The view width in pixels\n height : integer\n The view height in pixels\n padding : integer\n The view padding in pixels\n scaleFactor : number\n The number by which to multiply the width and height (default 1)\n of an exported PNG or SVG image.\n actions : bool or dict\n Determines if action links (\"Export as PNG/SVG\", \"View Source\",\n \"View Vega\" (only for Vega-Lite), \"Open in Vega Editor\") are\n included with the embedded view. If the value is true, all action\n links will be shown and none if the value is false. This property\n can take a key-value mapping object that maps keys (export, source,\n compiled, editor) to boolean values for determining if\n each action link should be shown.\n **kwargs :\n Additional options are passed directly to embed options.\n \"\"\"\n options = {'defaultStyle': defaultStyle, 'renderer': renderer,\n 'width': width, 'height': height, 'padding': padding,\n 'scaleFactor': scaleFactor, 'actions': actions}\n kwargs.update({key: val for key, val in options.items()\n if val is not None})\n return self.enable(None, embed_options=kwargs)\n\n\n# ==============================================================================\n# VegaLite v1/v2 renderer logic\n# ==============================================================================\n\n\nclass Displayable(object):\n \"\"\"A base display class for VegaLite v1/v2.\n\n This class takes a VegaLite v1/v2 spec and does the following:\n\n 1. Optionally validates the spec against a schema.\n 2. 
Uses the RendererPlugin to grab a renderer and call it when the\n IPython/Jupyter display method (_repr_mimebundle_) is called.\n\n The spec passed to this class must be fully schema compliant and already\n have the data portion of the spec fully processed and ready to serialize.\n In practice, this means, the data portion of the spec should have been passed\n through appropriate data model transformers.\n \"\"\"\n\n renderers = None\n schema_path = ('altair', '')\n\n def __init__(self, spec, validate=False):\n # type: (dict, bool) -> None\n self.spec = spec\n self.validate = validate\n self._validate()\n\n def _validate(self):\n # type: () -> None\n \"\"\"Validate the spec against the schema.\"\"\"\n schema_dict = json.loads(pkgutil.get_data(*self.schema_path).decode('utf-8'))\n validate(self.spec, schema_dict)\n\n def _repr_mimebundle_(self, include, exclude):\n \"\"\"Return a MIME bundle for display in Jupyter frontends.\"\"\"\n if self.renderers is not None:\n return self.renderers.get()(self.spec)\n else:\n return {}\n\n\ndef default_renderer_base(spec, mime_type, str_repr, **options):\n \"\"\"A default renderer for Vega or VegaLite that works for modern frontends.\n\n This renderer works with modern frontends (JupyterLab, nteract) that know\n how to render the custom VegaLite MIME type listed above.\n \"\"\"\n assert isinstance(spec, dict)\n bundle = {}\n metadata = {}\n\n bundle[mime_type] = spec\n bundle['text/plain'] = str_repr\n if options:\n metadata[mime_type] = options\n return bundle, metadata\n\n\ndef json_renderer_base(spec, str_repr, **options):\n \"\"\"A renderer that returns a MIME type of application/json.\n\n In JupyterLab/nteract this is rendered as a nice JSON tree.\n \"\"\"\n return default_renderer_base(spec, mime_type='application/json',\n str_repr=str_repr, **options)\n\n\nclass HTMLRenderer(object):\n \"\"\"Object to render charts as HTML, with a unique output div each time\"\"\"\n def __init__(self, output_div='altair-viz-{}', **kwargs):\n self._output_div = output_div\n self.kwargs = kwargs\n\n @property\n def output_div(self):\n return self._output_div.format(uuid.uuid4().hex)\n\n def __call__(self, spec, **metadata):\n kwargs = self.kwargs.copy()\n kwargs.update(metadata)\n return spec_to_mimebundle(spec, format='html',\n output_div=self.output_div, **kwargs)\n", "path": "altair/utils/display.py"}]} | 2,001 | 216 |
gh_patches_debug_36406 | rasdani/github-patches | git_diff | cornellius-gp__gpytorch-1565 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Docs] Deep Sigma Point Processes
Hi,
The DSPPs documentation makes it seem like they are coded up very similarly to the standard Deep GPs, and this does seem true for [two layers](https://docs.gpytorch.ai/en/stable/examples/05_Deep_Gaussian_Processes/Deep_Sigma_Point_Processes.html).
However, if I try to add a third layer (a second hidden layer), update the input and output dims, and make sure the forward call is correct, the loss function returns a vector rather than a single value (which the optimiser obviously doesn't like).
I'm guessing these losses are meant to be weighted according to the quadrature points. Is this true? Perhaps this could be made clearer in the documentation (or, alternatively, should this be done automatically in the backend?)
--- END ISSUE ---
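
On the question raised above: the per-quadrature-site terms are indeed meant to be combined with mixture weights; `DSPP.quad_weights` in the file below holds the normalised log-weights, and GPyTorch provides a DSPP-specific marginal log-likelihood (`gpytorch.mlls.DeepPredictiveLogLikelihood`) that applies them. A minimal sketch of that weighting, with hypothetical tensor names and shapes chosen for illustration rather than taken from GPyTorch internals:

```python
import torch


def dspp_objective(quad_weights: torch.Tensor, site_log_liks: torch.Tensor) -> torch.Tensor:
    """quad_weights: (Q,) normalised log mixture weights.
    site_log_liks: (Q, N) per-site, per-datapoint log-likelihood terms."""
    # log sum_q exp(log w_q + log p_q(y_n)) gives one value per data point ...
    per_point = (quad_weights.unsqueeze(-1) + site_log_liks).logsumexp(dim=0)
    # ... which is then reduced to the scalar an optimiser expects.
    return per_point.mean()
```

The golden diff further down fixes the other half of the reported problem: how intermediate layers propagate the quadrature batch dimension when more than one hidden layer is stacked.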
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gpytorch/models/deep_gps/dspp.py`
Content:
```
1 import torch
2
3 from gpytorch import settings
4 from gpytorch.distributions import MultitaskMultivariateNormal
5 from gpytorch.lazy import BlockDiagLazyTensor
6
7 from ..approximate_gp import ApproximateGP
8 from .deep_gp import DeepGP, DeepGPLayer
9
10
11 class DSPPLayer(DeepGPLayer):
12 """
13 Represents a layer in a DSPP where inference is performed using the techniques defined in Jankowiak et al., 2020.
14
15 From an end user perspective, the functionality and usage of a DSPPLayer is essentially identical to that of a
16 DeepGPLayer. It is therefore recommended that you review the documentation for DeepGPLayer.
17
18 :param ~gpytorch.variational.VariationalStrategy variational_strategy: Strategy for
19 changing q(u) -> q(f) (see other VI docs)
20 :param int input_dims: Dimensionality of input data expected by each GP
21 :param int output_dims: (default None) Number of GPs in this layer, equivalent to
22 output dimensionality. If set to `None`, then the output dimension will be squashed.
23 :param int num_quad_sites: Number of quadrature sites to use. Also the number of Gaussians in the mixture output
24 by this layer.
25
26 Again, refer to the documentation for DeepGPLayer or our example notebooks for full details on what calling a
27 DSPPLayer module does. The high level overview is that if a tensor `x` is `n x d` then
28
29 >>> hidden_gp2(hidden_gp1(x))
30
31 will return a `num_quad_sites` by `output_dims` set of Gaussians, where for each output dim the first batch dim
32 represents a weighted mixture of `num_quad_sites` Gaussians with weights given by DSPP.quad_weights (see DSPP below)
33 """
34
35 def __init__(self, variational_strategy, input_dims, output_dims, num_quad_sites=3, quad_sites=None):
36 super().__init__(variational_strategy, input_dims, output_dims)
37
38 self.num_quad_sites = num_quad_sites
39
40 # Pass in previous_layer.quad_sites if you want to share quad_sites across layers.
41 if quad_sites is not None:
42 self.quad_sites = quad_sites
43 else:
44 self.quad_sites = torch.nn.Parameter(torch.randn(num_quad_sites, input_dims))
45
46 def __call__(self, inputs, are_samples=False, expand_for_quadgrid=True, **kwargs):
47 if isinstance(inputs, MultitaskMultivariateNormal):
48 # inputs is definitely in the second layer, and mean is n x t
49 mus, sigmas = inputs.mean, inputs.variance.sqrt()
50
51 if expand_for_quadgrid:
52 xi_mus = mus.unsqueeze(0) # 1 x n x t
53 xi_sigmas = sigmas.unsqueeze(0) # 1 x n x t
54 else:
55 xi_mus = mus
56 xi_sigmas = sigmas
57
58 # unsqueeze sigmas to 1 x n x t, locations from [q] to Q^T x 1 x T.
59 # Broadcasted result will be Q^T x N x T
60 qg = self.quad_sites.view([self.num_quad_sites] + [1] * (xi_mus.dim() - 2) + [self.input_dims])
61 xi_sigmas = xi_sigmas * qg
62
63 inputs = xi_mus + xi_sigmas # q^t x n x t
64 if settings.debug.on():
65 if not torch.is_tensor(inputs):
66 raise ValueError(
67 "`inputs` should either be a MultitaskMultivariateNormal or a Tensor, got "
68 f"{inputs.__class__.__Name__}"
69 )
70
71 if inputs.size(-1) != self.input_dims:
72 raise RuntimeError(
73 f"Input shape did not match self.input_dims. Got total feature dims [{inputs.size(-1)}],"
74 f" expected [{self.input_dims}]"
75 )
76
77 # Repeat the input for all possible outputs
78 if self.output_dims is not None:
79 inputs = inputs.unsqueeze(-3)
80 inputs = inputs.expand(*inputs.shape[:-3], self.output_dims, *inputs.shape[-2:])
81 # Now run samples through the GP
82 output = ApproximateGP.__call__(self, inputs, **kwargs)
83
84 if self.num_quad_sites > 0:
85 if self.output_dims is not None and not isinstance(output, MultitaskMultivariateNormal):
86 mean = output.loc.transpose(-1, -2)
87 covar = BlockDiagLazyTensor(output.lazy_covariance_matrix, block_dim=-3)
88 output = MultitaskMultivariateNormal(mean, covar, interleaved=False)
89 else:
90 output = output.loc.transpose(-1, -2) # this layer provides noiseless kernel interpolation
91
92 return output
93
94
95 class DSPP(DeepGP):
96 """
97 A container module to build a DSPP
98 This module should contain :obj:`~gpytorch.models.deep_gps.DSPPLayer`
99 modules, and can also contain other modules as well.
100
101 This Module contains an additional set of parameters, `raw_quad_weights`, that represent the mixture weights for
102 the output distribution.
103 """
104
105 def __init__(self, num_quad_sites):
106 super().__init__()
107 self.num_quad_sites = num_quad_sites
108 self.register_parameter("raw_quad_weights", torch.nn.Parameter(torch.randn(self.num_quad_sites)))
109
110 @property
111 def quad_weights(self):
112 qwd = self.raw_quad_weights
113 return qwd - qwd.logsumexp(dim=-1)
114
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/gpytorch/models/deep_gps/dspp.py b/gpytorch/models/deep_gps/dspp.py
--- a/gpytorch/models/deep_gps/dspp.py
+++ b/gpytorch/models/deep_gps/dspp.py
@@ -43,24 +43,18 @@
else:
self.quad_sites = torch.nn.Parameter(torch.randn(num_quad_sites, input_dims))
- def __call__(self, inputs, are_samples=False, expand_for_quadgrid=True, **kwargs):
+ def __call__(self, inputs, **kwargs):
if isinstance(inputs, MultitaskMultivariateNormal):
- # inputs is definitely in the second layer, and mean is n x t
+ # This is for subsequent layers. We apply quadrature here
+ # Mean, stdv are q x ... x n x t
mus, sigmas = inputs.mean, inputs.variance.sqrt()
+ qg = self.quad_sites.view([self.num_quad_sites] + [1] * (mus.dim() - 2) + [self.input_dims])
+ sigmas = sigmas * qg
+ inputs = mus + sigmas # q^t x n x t
+ deterministic_inputs = False
+ else:
+ deterministic_inputs = True
- if expand_for_quadgrid:
- xi_mus = mus.unsqueeze(0) # 1 x n x t
- xi_sigmas = sigmas.unsqueeze(0) # 1 x n x t
- else:
- xi_mus = mus
- xi_sigmas = sigmas
-
- # unsqueeze sigmas to 1 x n x t, locations from [q] to Q^T x 1 x T.
- # Broadcasted result will be Q^T x N x T
- qg = self.quad_sites.view([self.num_quad_sites] + [1] * (xi_mus.dim() - 2) + [self.input_dims])
- xi_sigmas = xi_sigmas * qg
-
- inputs = xi_mus + xi_sigmas # q^t x n x t
if settings.debug.on():
if not torch.is_tensor(inputs):
raise ValueError(
@@ -78,9 +72,15 @@
if self.output_dims is not None:
inputs = inputs.unsqueeze(-3)
inputs = inputs.expand(*inputs.shape[:-3], self.output_dims, *inputs.shape[-2:])
+
# Now run samples through the GP
output = ApproximateGP.__call__(self, inputs, **kwargs)
+ # If this is the first layer (deterministic inputs), expand the output
+ # This allows quadrature to be applied to future layers
+ if deterministic_inputs:
+ output = output.expand(torch.Size([self.num_quad_sites]) + output.batch_shape)
+
if self.num_quad_sites > 0:
if self.output_dims is not None and not isinstance(output, MultitaskMultivariateNormal):
mean = output.loc.transpose(-1, -2)
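
In shape terms, the patch makes the first (deterministic-input) layer expand its output over the quadrature sites, so every later layer receives a leading batch dimension of size `num_quad_sites` and applies the same broadcasting rule. A tiny runnable illustration of that broadcasting step, with arbitrary example sizes rather than anything taken from a real model:

```python
import torch

Q, n, t = 8, 16, 2               # num_quad_sites, data points, feature dims (made up)
mus = torch.randn(Q, n, t)       # means coming out of the previous DSPP layer
sigmas = torch.rand(Q, n, t)     # stddevs coming out of the previous DSPP layer
quad_sites = torch.randn(Q, t)   # one site vector per quadrature point

# Mirrors the patched __call__: sites are viewed as (Q, 1, t) and broadcast
# against the incoming (Q, n, t) batch, keeping a single leading Q dimension.
inputs = mus + sigmas * quad_sites.view(Q, 1, t)
assert inputs.shape == (Q, n, t)
```

Because the leading `Q` dimension now survives through every hidden layer, the DSPP-aware likelihood can collapse it with `quad_weights` into a scalar loss even for three or more layers.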
| {"golden_diff": "diff --git a/gpytorch/models/deep_gps/dspp.py b/gpytorch/models/deep_gps/dspp.py\n--- a/gpytorch/models/deep_gps/dspp.py\n+++ b/gpytorch/models/deep_gps/dspp.py\n@@ -43,24 +43,18 @@\n else:\n self.quad_sites = torch.nn.Parameter(torch.randn(num_quad_sites, input_dims))\n \n- def __call__(self, inputs, are_samples=False, expand_for_quadgrid=True, **kwargs):\n+ def __call__(self, inputs, **kwargs):\n if isinstance(inputs, MultitaskMultivariateNormal):\n- # inputs is definitely in the second layer, and mean is n x t\n+ # This is for subsequent layers. We apply quadrature here\n+ # Mean, stdv are q x ... x n x t\n mus, sigmas = inputs.mean, inputs.variance.sqrt()\n+ qg = self.quad_sites.view([self.num_quad_sites] + [1] * (mus.dim() - 2) + [self.input_dims])\n+ sigmas = sigmas * qg\n+ inputs = mus + sigmas # q^t x n x t\n+ deterministic_inputs = False\n+ else:\n+ deterministic_inputs = True\n \n- if expand_for_quadgrid:\n- xi_mus = mus.unsqueeze(0) # 1 x n x t\n- xi_sigmas = sigmas.unsqueeze(0) # 1 x n x t\n- else:\n- xi_mus = mus\n- xi_sigmas = sigmas\n-\n- # unsqueeze sigmas to 1 x n x t, locations from [q] to Q^T x 1 x T.\n- # Broadcasted result will be Q^T x N x T\n- qg = self.quad_sites.view([self.num_quad_sites] + [1] * (xi_mus.dim() - 2) + [self.input_dims])\n- xi_sigmas = xi_sigmas * qg\n-\n- inputs = xi_mus + xi_sigmas # q^t x n x t\n if settings.debug.on():\n if not torch.is_tensor(inputs):\n raise ValueError(\n@@ -78,9 +72,15 @@\n if self.output_dims is not None:\n inputs = inputs.unsqueeze(-3)\n inputs = inputs.expand(*inputs.shape[:-3], self.output_dims, *inputs.shape[-2:])\n+\n # Now run samples through the GP\n output = ApproximateGP.__call__(self, inputs, **kwargs)\n \n+ # If this is the first layer (deterministic inputs), expand the output\n+ # This allows quadrature to be applied to future layers\n+ if deterministic_inputs:\n+ output = output.expand(torch.Size([self.num_quad_sites]) + output.batch_shape)\n+\n if self.num_quad_sites > 0:\n if self.output_dims is not None and not isinstance(output, MultitaskMultivariateNormal):\n mean = output.loc.transpose(-1, -2)\n", "issue": "[Docs] Deep Sigma Point Processes\nHi,\r\n\r\nThe DSPPs documentaiton makes it seem like theyare coded up very similarly to the standard Deep GPs, and this does seem true for[ two layers](https://docs.gpytorch.ai/en/stable/examples/05_Deep_Gaussian_Processes/Deep_Sigma_Point_Processes.html). \r\n\r\nHowever, if I try to add a third layer (a second hidden layer) and update the input and output dims and make sure the forward call is correct, the loss function returns a vector rather than a single value (which the optimsier obviously doesn't like).\r\n\r\nI'm guessing these losses are meant to be weighted according to the quadrature points. Is this true? Perhpas this could be more clear in the documentation (or alternatively, should this be done automatically in the backend?)\n", "before_files": [{"content": "import torch\n\nfrom gpytorch import settings\nfrom gpytorch.distributions import MultitaskMultivariateNormal\nfrom gpytorch.lazy import BlockDiagLazyTensor\n\nfrom ..approximate_gp import ApproximateGP\nfrom .deep_gp import DeepGP, DeepGPLayer\n\n\nclass DSPPLayer(DeepGPLayer):\n \"\"\"\n Represents a layer in a DSPP where inference is performed using the techniques defined in Jankowiak et al., 2020.\n\n From an end user perspective, the functionality and usage of a DSPPLayer is essentially identical to that of a\n DeepGPLayer. 
It is therefore recommended that you review the documentation for DeepGPLayer.\n\n :param ~gpytorch.variational.VariationalStrategy variational_strategy: Strategy for\n changing q(u) -> q(f) (see other VI docs)\n :param int input_dims: Dimensionality of input data expected by each GP\n :param int output_dims: (default None) Number of GPs in this layer, equivalent to\n output dimensionality. If set to `None`, then the output dimension will be squashed.\n :param int num_quad_sites: Number of quadrature sites to use. Also the number of Gaussians in the mixture output\n by this layer.\n\n Again, refer to the documentation for DeepGPLayer or our example notebooks for full details on what calling a\n DSPPLayer module does. The high level overview is that if a tensor `x` is `n x d` then\n\n >>> hidden_gp2(hidden_gp1(x))\n\n will return a `num_quad_sites` by `output_dims` set of Gaussians, where for each output dim the first batch dim\n represents a weighted mixture of `num_quad_sites` Gaussians with weights given by DSPP.quad_weights (see DSPP below)\n \"\"\"\n\n def __init__(self, variational_strategy, input_dims, output_dims, num_quad_sites=3, quad_sites=None):\n super().__init__(variational_strategy, input_dims, output_dims)\n\n self.num_quad_sites = num_quad_sites\n\n # Pass in previous_layer.quad_sites if you want to share quad_sites across layers.\n if quad_sites is not None:\n self.quad_sites = quad_sites\n else:\n self.quad_sites = torch.nn.Parameter(torch.randn(num_quad_sites, input_dims))\n\n def __call__(self, inputs, are_samples=False, expand_for_quadgrid=True, **kwargs):\n if isinstance(inputs, MultitaskMultivariateNormal):\n # inputs is definitely in the second layer, and mean is n x t\n mus, sigmas = inputs.mean, inputs.variance.sqrt()\n\n if expand_for_quadgrid:\n xi_mus = mus.unsqueeze(0) # 1 x n x t\n xi_sigmas = sigmas.unsqueeze(0) # 1 x n x t\n else:\n xi_mus = mus\n xi_sigmas = sigmas\n\n # unsqueeze sigmas to 1 x n x t, locations from [q] to Q^T x 1 x T.\n # Broadcasted result will be Q^T x N x T\n qg = self.quad_sites.view([self.num_quad_sites] + [1] * (xi_mus.dim() - 2) + [self.input_dims])\n xi_sigmas = xi_sigmas * qg\n\n inputs = xi_mus + xi_sigmas # q^t x n x t\n if settings.debug.on():\n if not torch.is_tensor(inputs):\n raise ValueError(\n \"`inputs` should either be a MultitaskMultivariateNormal or a Tensor, got \"\n f\"{inputs.__class__.__Name__}\"\n )\n\n if inputs.size(-1) != self.input_dims:\n raise RuntimeError(\n f\"Input shape did not match self.input_dims. 
Got total feature dims [{inputs.size(-1)}],\"\n f\" expected [{self.input_dims}]\"\n )\n\n # Repeat the input for all possible outputs\n if self.output_dims is not None:\n inputs = inputs.unsqueeze(-3)\n inputs = inputs.expand(*inputs.shape[:-3], self.output_dims, *inputs.shape[-2:])\n # Now run samples through the GP\n output = ApproximateGP.__call__(self, inputs, **kwargs)\n\n if self.num_quad_sites > 0:\n if self.output_dims is not None and not isinstance(output, MultitaskMultivariateNormal):\n mean = output.loc.transpose(-1, -2)\n covar = BlockDiagLazyTensor(output.lazy_covariance_matrix, block_dim=-3)\n output = MultitaskMultivariateNormal(mean, covar, interleaved=False)\n else:\n output = output.loc.transpose(-1, -2) # this layer provides noiseless kernel interpolation\n\n return output\n\n\nclass DSPP(DeepGP):\n \"\"\"\n A container module to build a DSPP\n This module should contain :obj:`~gpytorch.models.deep_gps.DSPPLayer`\n modules, and can also contain other modules as well.\n\n This Module contains an additional set of parameters, `raw_quad_weights`, that represent the mixture weights for\n the output distribution.\n \"\"\"\n\n def __init__(self, num_quad_sites):\n super().__init__()\n self.num_quad_sites = num_quad_sites\n self.register_parameter(\"raw_quad_weights\", torch.nn.Parameter(torch.randn(self.num_quad_sites)))\n\n @property\n def quad_weights(self):\n qwd = self.raw_quad_weights\n return qwd - qwd.logsumexp(dim=-1)\n", "path": "gpytorch/models/deep_gps/dspp.py"}], "after_files": [{"content": "import torch\n\nfrom gpytorch import settings\nfrom gpytorch.distributions import MultitaskMultivariateNormal\nfrom gpytorch.lazy import BlockDiagLazyTensor\n\nfrom ..approximate_gp import ApproximateGP\nfrom .deep_gp import DeepGP, DeepGPLayer\n\n\nclass DSPPLayer(DeepGPLayer):\n \"\"\"\n Represents a layer in a DSPP where inference is performed using the techniques defined in Jankowiak et al., 2020.\n\n From an end user perspective, the functionality and usage of a DSPPLayer is essentially identical to that of a\n DeepGPLayer. It is therefore recommended that you review the documentation for DeepGPLayer.\n\n :param ~gpytorch.variational.VariationalStrategy variational_strategy: Strategy for\n changing q(u) -> q(f) (see other VI docs)\n :param int input_dims: Dimensionality of input data expected by each GP\n :param int output_dims: (default None) Number of GPs in this layer, equivalent to\n output dimensionality. If set to `None`, then the output dimension will be squashed.\n :param int num_quad_sites: Number of quadrature sites to use. Also the number of Gaussians in the mixture output\n by this layer.\n\n Again, refer to the documentation for DeepGPLayer or our example notebooks for full details on what calling a\n DSPPLayer module does. 
The high level overview is that if a tensor `x` is `n x d` then\n\n >>> hidden_gp2(hidden_gp1(x))\n\n will return a `num_quad_sites` by `output_dims` set of Gaussians, where for each output dim the first batch dim\n represents a weighted mixture of `num_quad_sites` Gaussians with weights given by DSPP.quad_weights (see DSPP below)\n \"\"\"\n\n def __init__(self, variational_strategy, input_dims, output_dims, num_quad_sites=3, quad_sites=None):\n super().__init__(variational_strategy, input_dims, output_dims)\n\n self.num_quad_sites = num_quad_sites\n\n # Pass in previous_layer.quad_sites if you want to share quad_sites across layers.\n if quad_sites is not None:\n self.quad_sites = quad_sites\n else:\n self.quad_sites = torch.nn.Parameter(torch.randn(num_quad_sites, input_dims))\n\n def __call__(self, inputs, **kwargs):\n if isinstance(inputs, MultitaskMultivariateNormal):\n # This is for subsequent layers. We apply quadrature here\n # Mean, stdv are q x ... x n x t\n mus, sigmas = inputs.mean, inputs.variance.sqrt()\n qg = self.quad_sites.view([self.num_quad_sites] + [1] * (mus.dim() - 2) + [self.input_dims])\n sigmas = sigmas * qg\n inputs = mus + sigmas # q^t x n x t\n deterministic_inputs = False\n else:\n deterministic_inputs = True\n\n if settings.debug.on():\n if not torch.is_tensor(inputs):\n raise ValueError(\n \"`inputs` should either be a MultitaskMultivariateNormal or a Tensor, got \"\n f\"{inputs.__class__.__Name__}\"\n )\n\n if inputs.size(-1) != self.input_dims:\n raise RuntimeError(\n f\"Input shape did not match self.input_dims. Got total feature dims [{inputs.size(-1)}],\"\n f\" expected [{self.input_dims}]\"\n )\n\n # Repeat the input for all possible outputs\n if self.output_dims is not None:\n inputs = inputs.unsqueeze(-3)\n inputs = inputs.expand(*inputs.shape[:-3], self.output_dims, *inputs.shape[-2:])\n\n # Now run samples through the GP\n output = ApproximateGP.__call__(self, inputs, **kwargs)\n\n # If this is the first layer (deterministic inputs), expand the output\n # This allows quadrature to be applied to future layers\n if deterministic_inputs:\n output = output.expand(torch.Size([self.num_quad_sites]) + output.batch_shape)\n\n if self.num_quad_sites > 0:\n if self.output_dims is not None and not isinstance(output, MultitaskMultivariateNormal):\n mean = output.loc.transpose(-1, -2)\n covar = BlockDiagLazyTensor(output.lazy_covariance_matrix, block_dim=-3)\n output = MultitaskMultivariateNormal(mean, covar, interleaved=False)\n else:\n output = output.loc.transpose(-1, -2) # this layer provides noiseless kernel interpolation\n\n return output\n\n\nclass DSPP(DeepGP):\n \"\"\"\n A container module to build a DSPP\n This module should contain :obj:`~gpytorch.models.deep_gps.DSPPLayer`\n modules, and can also contain other modules as well.\n\n This Module contains an additional set of parameters, `raw_quad_weights`, that represent the mixture weights for\n the output distribution.\n \"\"\"\n\n def __init__(self, num_quad_sites):\n super().__init__()\n self.num_quad_sites = num_quad_sites\n self.register_parameter(\"raw_quad_weights\", torch.nn.Parameter(torch.randn(self.num_quad_sites)))\n\n @property\n def quad_weights(self):\n qwd = self.raw_quad_weights\n return qwd - qwd.logsumexp(dim=-1)\n", "path": "gpytorch/models/deep_gps/dspp.py"}]} | 1,877 | 671 |
gh_patches_debug_10331 | rasdani/github-patches | git_diff | pytorch__vision-2086 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Pretrained models for FCN and DeepLabV3
Is it possible to provide the pretrained weights for DeepLabV3 and FCN?
Right now, when I do:
`model = torchvision.models.segmentation.fcn_resnet50(pretrained=True)`
I receive this:
> NotImplementedError: pretrained fcn_resnet50_coco is not supported as of now
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torchvision/models/segmentation/segmentation.py`
Content:
```
1 from .._utils import IntermediateLayerGetter
2 from ..utils import load_state_dict_from_url
3 from .. import resnet
4 from .deeplabv3 import DeepLabHead, DeepLabV3
5 from .fcn import FCN, FCNHead
6
7
8 __all__ = ['fcn_resnet50', 'fcn_resnet101', 'deeplabv3_resnet50', 'deeplabv3_resnet101']
9
10
11 model_urls = {
12 'fcn_resnet50_coco': None,
13 'fcn_resnet101_coco': 'https://download.pytorch.org/models/fcn_resnet101_coco-7ecb50ca.pth',
14 'deeplabv3_resnet50_coco': None,
15 'deeplabv3_resnet101_coco': 'https://download.pytorch.org/models/deeplabv3_resnet101_coco-586e9e4e.pth',
16 }
17
18
19 def _segm_resnet(name, backbone_name, num_classes, aux, pretrained_backbone=True):
20 backbone = resnet.__dict__[backbone_name](
21 pretrained=pretrained_backbone,
22 replace_stride_with_dilation=[False, True, True])
23
24 return_layers = {'layer4': 'out'}
25 if aux:
26 return_layers['layer3'] = 'aux'
27 backbone = IntermediateLayerGetter(backbone, return_layers=return_layers)
28
29 aux_classifier = None
30 if aux:
31 inplanes = 1024
32 aux_classifier = FCNHead(inplanes, num_classes)
33
34 model_map = {
35 'deeplabv3': (DeepLabHead, DeepLabV3),
36 'fcn': (FCNHead, FCN),
37 }
38 inplanes = 2048
39 classifier = model_map[name][0](inplanes, num_classes)
40 base_model = model_map[name][1]
41
42 model = base_model(backbone, classifier, aux_classifier)
43 return model
44
45
46 def _load_model(arch_type, backbone, pretrained, progress, num_classes, aux_loss, **kwargs):
47 if pretrained:
48 aux_loss = True
49 model = _segm_resnet(arch_type, backbone, num_classes, aux_loss, **kwargs)
50 if pretrained:
51 arch = arch_type + '_' + backbone + '_coco'
52 model_url = model_urls[arch]
53 if model_url is None:
54 raise NotImplementedError('pretrained {} is not supported as of now'.format(arch))
55 else:
56 state_dict = load_state_dict_from_url(model_url, progress=progress)
57 model.load_state_dict(state_dict)
58 return model
59
60
61 def fcn_resnet50(pretrained=False, progress=True,
62 num_classes=21, aux_loss=None, **kwargs):
63 """Constructs a Fully-Convolutional Network model with a ResNet-50 backbone.
64
65 Args:
66 pretrained (bool): If True, returns a model pre-trained on COCO train2017 which
67 contains the same classes as Pascal VOC
68 progress (bool): If True, displays a progress bar of the download to stderr
69 """
70 return _load_model('fcn', 'resnet50', pretrained, progress, num_classes, aux_loss, **kwargs)
71
72
73 def fcn_resnet101(pretrained=False, progress=True,
74 num_classes=21, aux_loss=None, **kwargs):
75 """Constructs a Fully-Convolutional Network model with a ResNet-101 backbone.
76
77 Args:
78 pretrained (bool): If True, returns a model pre-trained on COCO train2017 which
79 contains the same classes as Pascal VOC
80 progress (bool): If True, displays a progress bar of the download to stderr
81 """
82 return _load_model('fcn', 'resnet101', pretrained, progress, num_classes, aux_loss, **kwargs)
83
84
85 def deeplabv3_resnet50(pretrained=False, progress=True,
86 num_classes=21, aux_loss=None, **kwargs):
87 """Constructs a DeepLabV3 model with a ResNet-50 backbone.
88
89 Args:
90 pretrained (bool): If True, returns a model pre-trained on COCO train2017 which
91 contains the same classes as Pascal VOC
92 progress (bool): If True, displays a progress bar of the download to stderr
93 """
94 return _load_model('deeplabv3', 'resnet50', pretrained, progress, num_classes, aux_loss, **kwargs)
95
96
97 def deeplabv3_resnet101(pretrained=False, progress=True,
98 num_classes=21, aux_loss=None, **kwargs):
99 """Constructs a DeepLabV3 model with a ResNet-101 backbone.
100
101 Args:
102 pretrained (bool): If True, returns a model pre-trained on COCO train2017 which
103 contains the same classes as Pascal VOC
104 progress (bool): If True, displays a progress bar of the download to stderr
105 """
106 return _load_model('deeplabv3', 'resnet101', pretrained, progress, num_classes, aux_loss, **kwargs)
107
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/torchvision/models/segmentation/segmentation.py b/torchvision/models/segmentation/segmentation.py
--- a/torchvision/models/segmentation/segmentation.py
+++ b/torchvision/models/segmentation/segmentation.py
@@ -9,9 +9,9 @@
model_urls = {
- 'fcn_resnet50_coco': None,
+ 'fcn_resnet50_coco': 'https://download.pytorch.org/models/fcn_resnet50_coco-1167a1af.pth',
'fcn_resnet101_coco': 'https://download.pytorch.org/models/fcn_resnet101_coco-7ecb50ca.pth',
- 'deeplabv3_resnet50_coco': None,
+ 'deeplabv3_resnet50_coco': 'https://download.pytorch.org/models/deeplabv3_resnet50_coco-cd0a2569.pth',
'deeplabv3_resnet101_coco': 'https://download.pytorch.org/models/deeplabv3_resnet101_coco-586e9e4e.pth',
}
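
With checkpoint URLs now registered for the ResNet-50 variants, the reproduction from the issue should download COCO-trained weights instead of raising `NotImplementedError`. A quick sanity check, assuming network access and a torchvision build that includes these URLs:

```python
import torch
import torchvision

model = torchvision.models.segmentation.fcn_resnet50(pretrained=True)
model.eval()

# The builders above use num_classes=21 (Pascal-VOC-style labels), so the
# output logits should have 21 channels at the input resolution.
with torch.no_grad():
    out = model(torch.rand(1, 3, 224, 224))["out"]
print(out.shape)  # expected: torch.Size([1, 21, 224, 224])
```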
| {"golden_diff": "diff --git a/torchvision/models/segmentation/segmentation.py b/torchvision/models/segmentation/segmentation.py\n--- a/torchvision/models/segmentation/segmentation.py\n+++ b/torchvision/models/segmentation/segmentation.py\n@@ -9,9 +9,9 @@\n \n \n model_urls = {\n- 'fcn_resnet50_coco': None,\n+ 'fcn_resnet50_coco': 'https://download.pytorch.org/models/fcn_resnet50_coco-1167a1af.pth',\n 'fcn_resnet101_coco': 'https://download.pytorch.org/models/fcn_resnet101_coco-7ecb50ca.pth',\n- 'deeplabv3_resnet50_coco': None,\n+ 'deeplabv3_resnet50_coco': 'https://download.pytorch.org/models/deeplabv3_resnet50_coco-cd0a2569.pth',\n 'deeplabv3_resnet101_coco': 'https://download.pytorch.org/models/deeplabv3_resnet101_coco-586e9e4e.pth',\n }\n", "issue": "pretrain models for FCN and DeepLabV3\nIs it possible to provide the pretrain weight for deeplabv3 and FCN?\r\nright now when i do:\r\n`model = torchvision.models.segmentation.fcn_resnet50(pretrained=True)`\r\nI receive this:\r\n\r\n> NotImplementedError: pretrained fcn_resnet50_coco is not supported as of now\n", "before_files": [{"content": "from .._utils import IntermediateLayerGetter\nfrom ..utils import load_state_dict_from_url\nfrom .. import resnet\nfrom .deeplabv3 import DeepLabHead, DeepLabV3\nfrom .fcn import FCN, FCNHead\n\n\n__all__ = ['fcn_resnet50', 'fcn_resnet101', 'deeplabv3_resnet50', 'deeplabv3_resnet101']\n\n\nmodel_urls = {\n 'fcn_resnet50_coco': None,\n 'fcn_resnet101_coco': 'https://download.pytorch.org/models/fcn_resnet101_coco-7ecb50ca.pth',\n 'deeplabv3_resnet50_coco': None,\n 'deeplabv3_resnet101_coco': 'https://download.pytorch.org/models/deeplabv3_resnet101_coco-586e9e4e.pth',\n}\n\n\ndef _segm_resnet(name, backbone_name, num_classes, aux, pretrained_backbone=True):\n backbone = resnet.__dict__[backbone_name](\n pretrained=pretrained_backbone,\n replace_stride_with_dilation=[False, True, True])\n\n return_layers = {'layer4': 'out'}\n if aux:\n return_layers['layer3'] = 'aux'\n backbone = IntermediateLayerGetter(backbone, return_layers=return_layers)\n\n aux_classifier = None\n if aux:\n inplanes = 1024\n aux_classifier = FCNHead(inplanes, num_classes)\n\n model_map = {\n 'deeplabv3': (DeepLabHead, DeepLabV3),\n 'fcn': (FCNHead, FCN),\n }\n inplanes = 2048\n classifier = model_map[name][0](inplanes, num_classes)\n base_model = model_map[name][1]\n\n model = base_model(backbone, classifier, aux_classifier)\n return model\n\n\ndef _load_model(arch_type, backbone, pretrained, progress, num_classes, aux_loss, **kwargs):\n if pretrained:\n aux_loss = True\n model = _segm_resnet(arch_type, backbone, num_classes, aux_loss, **kwargs)\n if pretrained:\n arch = arch_type + '_' + backbone + '_coco'\n model_url = model_urls[arch]\n if model_url is None:\n raise NotImplementedError('pretrained {} is not supported as of now'.format(arch))\n else:\n state_dict = load_state_dict_from_url(model_url, progress=progress)\n model.load_state_dict(state_dict)\n return model\n\n\ndef fcn_resnet50(pretrained=False, progress=True,\n num_classes=21, aux_loss=None, **kwargs):\n \"\"\"Constructs a Fully-Convolutional Network model with a ResNet-50 backbone.\n\n Args:\n pretrained (bool): If True, returns a model pre-trained on COCO train2017 which\n contains the same classes as Pascal VOC\n progress (bool): If True, displays a progress bar of the download to stderr\n \"\"\"\n return _load_model('fcn', 'resnet50', pretrained, progress, num_classes, aux_loss, **kwargs)\n\n\ndef fcn_resnet101(pretrained=False, progress=True,\n num_classes=21, 
aux_loss=None, **kwargs):\n \"\"\"Constructs a Fully-Convolutional Network model with a ResNet-101 backbone.\n\n Args:\n pretrained (bool): If True, returns a model pre-trained on COCO train2017 which\n contains the same classes as Pascal VOC\n progress (bool): If True, displays a progress bar of the download to stderr\n \"\"\"\n return _load_model('fcn', 'resnet101', pretrained, progress, num_classes, aux_loss, **kwargs)\n\n\ndef deeplabv3_resnet50(pretrained=False, progress=True,\n num_classes=21, aux_loss=None, **kwargs):\n \"\"\"Constructs a DeepLabV3 model with a ResNet-50 backbone.\n\n Args:\n pretrained (bool): If True, returns a model pre-trained on COCO train2017 which\n contains the same classes as Pascal VOC\n progress (bool): If True, displays a progress bar of the download to stderr\n \"\"\"\n return _load_model('deeplabv3', 'resnet50', pretrained, progress, num_classes, aux_loss, **kwargs)\n\n\ndef deeplabv3_resnet101(pretrained=False, progress=True,\n num_classes=21, aux_loss=None, **kwargs):\n \"\"\"Constructs a DeepLabV3 model with a ResNet-101 backbone.\n\n Args:\n pretrained (bool): If True, returns a model pre-trained on COCO train2017 which\n contains the same classes as Pascal VOC\n progress (bool): If True, displays a progress bar of the download to stderr\n \"\"\"\n return _load_model('deeplabv3', 'resnet101', pretrained, progress, num_classes, aux_loss, **kwargs)\n", "path": "torchvision/models/segmentation/segmentation.py"}], "after_files": [{"content": "from .._utils import IntermediateLayerGetter\nfrom ..utils import load_state_dict_from_url\nfrom .. import resnet\nfrom .deeplabv3 import DeepLabHead, DeepLabV3\nfrom .fcn import FCN, FCNHead\n\n\n__all__ = ['fcn_resnet50', 'fcn_resnet101', 'deeplabv3_resnet50', 'deeplabv3_resnet101']\n\n\nmodel_urls = {\n 'fcn_resnet50_coco': 'https://download.pytorch.org/models/fcn_resnet50_coco-1167a1af.pth',\n 'fcn_resnet101_coco': 'https://download.pytorch.org/models/fcn_resnet101_coco-7ecb50ca.pth',\n 'deeplabv3_resnet50_coco': 'https://download.pytorch.org/models/deeplabv3_resnet50_coco-cd0a2569.pth',\n 'deeplabv3_resnet101_coco': 'https://download.pytorch.org/models/deeplabv3_resnet101_coco-586e9e4e.pth',\n}\n\n\ndef _segm_resnet(name, backbone_name, num_classes, aux, pretrained_backbone=True):\n backbone = resnet.__dict__[backbone_name](\n pretrained=pretrained_backbone,\n replace_stride_with_dilation=[False, True, True])\n\n return_layers = {'layer4': 'out'}\n if aux:\n return_layers['layer3'] = 'aux'\n backbone = IntermediateLayerGetter(backbone, return_layers=return_layers)\n\n aux_classifier = None\n if aux:\n inplanes = 1024\n aux_classifier = FCNHead(inplanes, num_classes)\n\n model_map = {\n 'deeplabv3': (DeepLabHead, DeepLabV3),\n 'fcn': (FCNHead, FCN),\n }\n inplanes = 2048\n classifier = model_map[name][0](inplanes, num_classes)\n base_model = model_map[name][1]\n\n model = base_model(backbone, classifier, aux_classifier)\n return model\n\n\ndef _load_model(arch_type, backbone, pretrained, progress, num_classes, aux_loss, **kwargs):\n if pretrained:\n aux_loss = True\n model = _segm_resnet(arch_type, backbone, num_classes, aux_loss, **kwargs)\n if pretrained:\n arch = arch_type + '_' + backbone + '_coco'\n model_url = model_urls[arch]\n if model_url is None:\n raise NotImplementedError('pretrained {} is not supported as of now'.format(arch))\n else:\n state_dict = load_state_dict_from_url(model_url, progress=progress)\n model.load_state_dict(state_dict)\n return model\n\n\ndef fcn_resnet50(pretrained=False, 
progress=True,\n num_classes=21, aux_loss=None, **kwargs):\n \"\"\"Constructs a Fully-Convolutional Network model with a ResNet-50 backbone.\n\n Args:\n pretrained (bool): If True, returns a model pre-trained on COCO train2017 which\n contains the same classes as Pascal VOC\n progress (bool): If True, displays a progress bar of the download to stderr\n \"\"\"\n return _load_model('fcn', 'resnet50', pretrained, progress, num_classes, aux_loss, **kwargs)\n\n\ndef fcn_resnet101(pretrained=False, progress=True,\n num_classes=21, aux_loss=None, **kwargs):\n \"\"\"Constructs a Fully-Convolutional Network model with a ResNet-101 backbone.\n\n Args:\n pretrained (bool): If True, returns a model pre-trained on COCO train2017 which\n contains the same classes as Pascal VOC\n progress (bool): If True, displays a progress bar of the download to stderr\n \"\"\"\n return _load_model('fcn', 'resnet101', pretrained, progress, num_classes, aux_loss, **kwargs)\n\n\ndef deeplabv3_resnet50(pretrained=False, progress=True,\n num_classes=21, aux_loss=None, **kwargs):\n \"\"\"Constructs a DeepLabV3 model with a ResNet-50 backbone.\n\n Args:\n pretrained (bool): If True, returns a model pre-trained on COCO train2017 which\n contains the same classes as Pascal VOC\n progress (bool): If True, displays a progress bar of the download to stderr\n \"\"\"\n return _load_model('deeplabv3', 'resnet50', pretrained, progress, num_classes, aux_loss, **kwargs)\n\n\ndef deeplabv3_resnet101(pretrained=False, progress=True,\n num_classes=21, aux_loss=None, **kwargs):\n \"\"\"Constructs a DeepLabV3 model with a ResNet-101 backbone.\n\n Args:\n pretrained (bool): If True, returns a model pre-trained on COCO train2017 which\n contains the same classes as Pascal VOC\n progress (bool): If True, displays a progress bar of the download to stderr\n \"\"\"\n return _load_model('deeplabv3', 'resnet101', pretrained, progress, num_classes, aux_loss, **kwargs)\n", "path": "torchvision/models/segmentation/segmentation.py"}]} | 1,687 | 268 |
gh_patches_debug_7787 | rasdani/github-patches | git_diff | qtile__qtile-1682 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Firefox CPU usage escalates and then it stops responding, but only when run in qtile
Hi,
When starting Firefox (firefox 69.0-1, qtile 0.14.2-2), its CPU usage starts rising to the point where the interface becomes unresponsive; only parts of the Firefox interface are drawn, and after about 5 minutes the window is not visible at all. Maybe the screenshots will make things clearer. 
screenshots with description:
https://imgur.com/a/V17wJrM
Nothing is there in the logs (I only checked .local/share/qtile/qtile.log), and this does not happen for me under i3 or bspwm. I also use optirun with my browsers, but it does not matter if I start Firefox without it. This bug happens every time I use Firefox; I use a lot of GTK3, GTK2, Qt4, Qt5, and Electron apps, and it only happens with Firefox on qtile.
I am not sure what other information I can provide.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libqtile/widget/keyboardlayout.py`
Content:
```
1 # Copyright (c) 2013 Jacob Mourelos
2 # Copyright (c) 2014 Shepilov Vladislav
3 # Copyright (c) 2014-2015 Sean Vig
4 # Copyright (c) 2014 Tycho Andersen
5 # Copyright (c) 2019 zordsdavini
6 #
7 # Permission is hereby granted, free of charge, to any person obtaining a copy
8 # of this software and associated documentation files (the "Software"), to deal
9 # in the Software without restriction, including without limitation the rights
10 # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
11 # copies of the Software, and to permit persons to whom the Software is
12 # furnished to do so, subject to the following conditions:
13 #
14 # The above copyright notice and this permission notice shall be included in
15 # all copies or substantial portions of the Software.
16 #
17 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
18 # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
19 # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
20 # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
21 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
22 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
23 # SOFTWARE.
24
25 import re
26 from subprocess import CalledProcessError
27
28 from libqtile.log_utils import logger
29 from libqtile.widget import base
30
31 kb_layout_regex = re.compile(r'layout:\s+(?P<layout>\w+)')
32 kb_variant_regex = re.compile(r'variant:\s+(?P<variant>\w+)')
33
34
35 class KeyboardLayout(base.InLoopPollText):
36 """Widget for changing and displaying the current keyboard layout
37
38 It requires setxkbmap to be available in the system.
39 """
40 orientations = base.ORIENTATION_HORIZONTAL
41 defaults = [
42 ("update_interval", 1, "Update time in seconds."),
43 ("configured_keyboards", ["us"], "A list of predefined keyboard layouts "
44 "represented as strings. For example: "
45 "['us', 'us colemak', 'es', 'fr']."),
46 ("option", None, "string of setxkbmap option. Ex., 'compose:menu,grp_led:scroll'"),
47 ]
48
49 def __init__(self, **config):
50 base.InLoopPollText.__init__(self, **config)
51 self.add_defaults(KeyboardLayout.defaults)
52
53 def button_press(self, x, y, button):
54 if button == 1:
55 self.next_keyboard()
56
57 def next_keyboard(self):
58 """Set the next layout in the list of configured keyboard layouts as
59 new current layout in use
60
61 If the current keyboard layout is not in the list, it will set as new
62 layout the first one in the list.
63 """
64
65 current_keyboard = self.keyboard
66 if current_keyboard in self.configured_keyboards:
67 # iterate the list circularly
68 next_keyboard = self.configured_keyboards[
69 (self.configured_keyboards.index(current_keyboard) + 1) %
70 len(self.configured_keyboards)]
71 else:
72 next_keyboard = self.configured_keyboards[0]
73
74 self.keyboard = next_keyboard
75
76 self.tick()
77
78 def poll(self):
79 return self.keyboard.upper()
80
81 def get_keyboard_layout(self, setxkbmap_output):
82 match_layout = kb_layout_regex.search(setxkbmap_output)
83 match_variant = kb_variant_regex.search(setxkbmap_output)
84
85 if match_layout is None:
86 return 'ERR'
87
88 kb = match_layout.group('layout')
89 if match_variant:
90 kb += " " + match_variant.group('variant')
91 return kb
92
93 @property
94 def keyboard(self):
95 """Return the currently used keyboard layout as a string
96
97 Examples: "us", "us dvorak". In case of error returns "unknown".
98 """
99 try:
100 command = 'setxkbmap -verbose 10'
101 setxkbmap_output = self.call_process(command.split(' '))
102 keyboard = self.get_keyboard_layout(setxkbmap_output)
103 return str(keyboard)
104 except CalledProcessError as e:
105 logger.error('Can not get the keyboard layout: {0}'.format(e))
106 except OSError as e:
107 logger.error('Please, check that xset is available: {0}'.format(e))
108 return "unknown"
109
110 @keyboard.setter
111 def keyboard(self, keyboard):
112 command = ['setxkbmap']
113 command.extend(keyboard.split(" "))
114 if self.option:
115 command.extend(['-option', self.option])
116 try:
117 self.call_process(command)
118 except CalledProcessError as e:
119 logger.error('Can not change the keyboard layout: {0}'.format(e))
120 except OSError as e:
121 logger.error('Please, check that setxkbmap is available: {0}'.format(e))
122
123 def cmd_next_keyboard(self):
124 """Select next keyboard layout"""
125 self.next_keyboard()
126
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/libqtile/widget/keyboardlayout.py b/libqtile/widget/keyboardlayout.py
--- a/libqtile/widget/keyboardlayout.py
+++ b/libqtile/widget/keyboardlayout.py
@@ -97,7 +97,7 @@
Examples: "us", "us dvorak". In case of error returns "unknown".
"""
try:
- command = 'setxkbmap -verbose 10'
+ command = 'setxkbmap -verbose 10 -query'
setxkbmap_output = self.call_process(command.split(' '))
keyboard = self.get_keyboard_layout(setxkbmap_output)
return str(keyboard)
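
The one-flag change matters because `setxkbmap -query` is the read-only way to ask the X server for its current rules, layout, and variant; the widget only needs to read that state on each poll, and the `layout:`/`variant:` regexes defined earlier in the file already match the `-query` output format. A small standalone check of that parsing, using made-up sample output rather than a live X session:

```python
import re

kb_layout_regex = re.compile(r"layout:\s+(?P<layout>\w+)")
kb_variant_regex = re.compile(r"variant:\s+(?P<variant>\w+)")

# Illustrative `setxkbmap -query`-style output; a real session also prints
# rules/model/options lines and, with -verbose 10, extra diagnostics.
sample = "rules:      evdev\nmodel:      pc105\nlayout:     us\nvariant:    dvorak\n"

layout = kb_layout_regex.search(sample).group("layout")
variant = kb_variant_regex.search(sample).group("variant")
print(layout, variant)  # us dvorak
```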
| {"golden_diff": "diff --git a/libqtile/widget/keyboardlayout.py b/libqtile/widget/keyboardlayout.py\n--- a/libqtile/widget/keyboardlayout.py\n+++ b/libqtile/widget/keyboardlayout.py\n@@ -97,7 +97,7 @@\n Examples: \"us\", \"us dvorak\". In case of error returns \"unknown\".\n \"\"\"\n try:\n- command = 'setxkbmap -verbose 10'\n+ command = 'setxkbmap -verbose 10 -query'\n setxkbmap_output = self.call_process(command.split(' '))\n keyboard = self.get_keyboard_layout(setxkbmap_output)\n return str(keyboard)\n", "issue": "Firefox cpu usage escalates and then stops responding only when run in qtile\nHi, \r\n\r\nWhen starting firefox (firefox 69.0-1, qtile 0.14.2-2) its cpu usage starts rising to the point were the interface becomes unresponsive, also only parts of firefox interface are drawn, but like after 5 mins the window is not visible at all, may be the screenshots will make things clearer. \r\n\r\nscreenshots with description:\r\nhttps://imgur.com/a/V17wJrM\r\n\r\nNothing is there in the logs (I only checked .local/share/qtile/qtile.log), this does not happen with me under i3 or bspwm. I also use optirun with my browsers but it does not matter if I start firefox without it. this bug happens every time I use firefox, and I use a lot of gtk3, gtk2, qt4, qt5, electron apps and it only happens with firefox on qtile only.\r\n\r\nI am not sure what other information I can provide.\r\n\n", "before_files": [{"content": "# Copyright (c) 2013 Jacob Mourelos\n# Copyright (c) 2014 Shepilov Vladislav\n# Copyright (c) 2014-2015 Sean Vig\n# Copyright (c) 2014 Tycho Andersen\n# Copyright (c) 2019 zordsdavini\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\n\nimport re\nfrom subprocess import CalledProcessError\n\nfrom libqtile.log_utils import logger\nfrom libqtile.widget import base\n\nkb_layout_regex = re.compile(r'layout:\\s+(?P<layout>\\w+)')\nkb_variant_regex = re.compile(r'variant:\\s+(?P<variant>\\w+)')\n\n\nclass KeyboardLayout(base.InLoopPollText):\n \"\"\"Widget for changing and displaying the current keyboard layout\n\n It requires setxkbmap to be available in the system.\n \"\"\"\n orientations = base.ORIENTATION_HORIZONTAL\n defaults = [\n (\"update_interval\", 1, \"Update time in seconds.\"),\n (\"configured_keyboards\", [\"us\"], \"A list of predefined keyboard layouts \"\n \"represented as strings. For example: \"\n \"['us', 'us colemak', 'es', 'fr'].\"),\n (\"option\", None, \"string of setxkbmap option. 
Ex., 'compose:menu,grp_led:scroll'\"),\n ]\n\n def __init__(self, **config):\n base.InLoopPollText.__init__(self, **config)\n self.add_defaults(KeyboardLayout.defaults)\n\n def button_press(self, x, y, button):\n if button == 1:\n self.next_keyboard()\n\n def next_keyboard(self):\n \"\"\"Set the next layout in the list of configured keyboard layouts as\n new current layout in use\n\n If the current keyboard layout is not in the list, it will set as new\n layout the first one in the list.\n \"\"\"\n\n current_keyboard = self.keyboard\n if current_keyboard in self.configured_keyboards:\n # iterate the list circularly\n next_keyboard = self.configured_keyboards[\n (self.configured_keyboards.index(current_keyboard) + 1) %\n len(self.configured_keyboards)]\n else:\n next_keyboard = self.configured_keyboards[0]\n\n self.keyboard = next_keyboard\n\n self.tick()\n\n def poll(self):\n return self.keyboard.upper()\n\n def get_keyboard_layout(self, setxkbmap_output):\n match_layout = kb_layout_regex.search(setxkbmap_output)\n match_variant = kb_variant_regex.search(setxkbmap_output)\n\n if match_layout is None:\n return 'ERR'\n\n kb = match_layout.group('layout')\n if match_variant:\n kb += \" \" + match_variant.group('variant')\n return kb\n\n @property\n def keyboard(self):\n \"\"\"Return the currently used keyboard layout as a string\n\n Examples: \"us\", \"us dvorak\". In case of error returns \"unknown\".\n \"\"\"\n try:\n command = 'setxkbmap -verbose 10'\n setxkbmap_output = self.call_process(command.split(' '))\n keyboard = self.get_keyboard_layout(setxkbmap_output)\n return str(keyboard)\n except CalledProcessError as e:\n logger.error('Can not get the keyboard layout: {0}'.format(e))\n except OSError as e:\n logger.error('Please, check that xset is available: {0}'.format(e))\n return \"unknown\"\n\n @keyboard.setter\n def keyboard(self, keyboard):\n command = ['setxkbmap']\n command.extend(keyboard.split(\" \"))\n if self.option:\n command.extend(['-option', self.option])\n try:\n self.call_process(command)\n except CalledProcessError as e:\n logger.error('Can not change the keyboard layout: {0}'.format(e))\n except OSError as e:\n logger.error('Please, check that setxkbmap is available: {0}'.format(e))\n\n def cmd_next_keyboard(self):\n \"\"\"Select next keyboard layout\"\"\"\n self.next_keyboard()\n", "path": "libqtile/widget/keyboardlayout.py"}], "after_files": [{"content": "# Copyright (c) 2013 Jacob Mourelos\n# Copyright (c) 2014 Shepilov Vladislav\n# Copyright (c) 2014-2015 Sean Vig\n# Copyright (c) 2014 Tycho Andersen\n# Copyright (c) 2019 zordsdavini\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\n\nimport re\nfrom subprocess import CalledProcessError\n\nfrom libqtile.log_utils import logger\nfrom libqtile.widget import base\n\nkb_layout_regex = re.compile(r'layout:\\s+(?P<layout>\\w+)')\nkb_variant_regex = re.compile(r'variant:\\s+(?P<variant>\\w+)')\n\n\nclass KeyboardLayout(base.InLoopPollText):\n \"\"\"Widget for changing and displaying the current keyboard layout\n\n It requires setxkbmap to be available in the system.\n \"\"\"\n orientations = base.ORIENTATION_HORIZONTAL\n defaults = [\n (\"update_interval\", 1, \"Update time in seconds.\"),\n (\"configured_keyboards\", [\"us\"], \"A list of predefined keyboard layouts \"\n \"represented as strings. For example: \"\n \"['us', 'us colemak', 'es', 'fr'].\"),\n (\"option\", None, \"string of setxkbmap option. Ex., 'compose:menu,grp_led:scroll'\"),\n ]\n\n def __init__(self, **config):\n base.InLoopPollText.__init__(self, **config)\n self.add_defaults(KeyboardLayout.defaults)\n\n def button_press(self, x, y, button):\n if button == 1:\n self.next_keyboard()\n\n def next_keyboard(self):\n \"\"\"Set the next layout in the list of configured keyboard layouts as\n new current layout in use\n\n If the current keyboard layout is not in the list, it will set as new\n layout the first one in the list.\n \"\"\"\n\n current_keyboard = self.keyboard\n if current_keyboard in self.configured_keyboards:\n # iterate the list circularly\n next_keyboard = self.configured_keyboards[\n (self.configured_keyboards.index(current_keyboard) + 1) %\n len(self.configured_keyboards)]\n else:\n next_keyboard = self.configured_keyboards[0]\n\n self.keyboard = next_keyboard\n\n self.tick()\n\n def poll(self):\n return self.keyboard.upper()\n\n def get_keyboard_layout(self, setxkbmap_output):\n match_layout = kb_layout_regex.search(setxkbmap_output)\n match_variant = kb_variant_regex.search(setxkbmap_output)\n\n if match_layout is None:\n return 'ERR'\n\n kb = match_layout.group('layout')\n if match_variant:\n kb += \" \" + match_variant.group('variant')\n return kb\n\n @property\n def keyboard(self):\n \"\"\"Return the currently used keyboard layout as a string\n\n Examples: \"us\", \"us dvorak\". In case of error returns \"unknown\".\n \"\"\"\n try:\n command = 'setxkbmap -verbose 10 -query'\n setxkbmap_output = self.call_process(command.split(' '))\n keyboard = self.get_keyboard_layout(setxkbmap_output)\n return str(keyboard)\n except CalledProcessError as e:\n logger.error('Can not get the keyboard layout: {0}'.format(e))\n except OSError as e:\n logger.error('Please, check that xset is available: {0}'.format(e))\n return \"unknown\"\n\n @keyboard.setter\n def keyboard(self, keyboard):\n command = ['setxkbmap']\n command.extend(keyboard.split(\" \"))\n if self.option:\n command.extend(['-option', self.option])\n try:\n self.call_process(command)\n except CalledProcessError as e:\n logger.error('Can not change the keyboard layout: {0}'.format(e))\n except OSError as e:\n logger.error('Please, check that setxkbmap is available: {0}'.format(e))\n\n def cmd_next_keyboard(self):\n \"\"\"Select next keyboard layout\"\"\"\n self.next_keyboard()\n", "path": "libqtile/widget/keyboardlayout.py"}]} | 1,844 | 147 |
gh_patches_debug_28995 | rasdani/github-patches | git_diff | vacanza__python-holidays-1020 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update MANIFEST.in
Clean up and make sure `sdist` contains all needed files.
--- END ISSUE ---
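The issue above is terse, so purely as an illustration: the completeness of an `sdist` is typically driven by `MANIFEST.in`. A minimal sketch of the kind of entries involved, assuming the translation sources under `holidays/locale` and the helper script under `scripts/l10n` shown in the files below are what needs to ship (the actual `MANIFEST.in` contents are not part of this record), might be:
```
# Hypothetical MANIFEST.in sketch -- not taken from the repository.
include LICENSE
recursive-include holidays/locale *.po
recursive-include scripts/l10n *.py
```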
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # python-holidays
2 # ---------------
3 # A fast, efficient Python library for generating country, province and state
4 # specific sets of holidays on the fly. It aims to make determining whether a
5 # specific date is a holiday as fast and flexible as possible.
6 #
7 # Authors: dr-prodigy <[email protected]> (c) 2017-2023
8 # ryanss <[email protected]> (c) 2014-2017
9 # Website: https://github.com/dr-prodigy/python-holidays
10 # License: MIT (see LICENSE file)
11
12 import os
13 import subprocess
14 import sys
15 from pathlib import Path
16
17 from setuptools import setup
18
19
20 def generate_mo_files():
21 """Looks up for .po files and generates respective .mo files."""
22 for po_path in Path(os.path.join("holidays", "locale")).rglob("*.po"):
23 po_file = str(po_path)
24 mo_file = po_file.replace(".po", ".mo")
25
26 if os.path.exists(mo_file):
27 os.unlink(mo_file)
28 subprocess.run(
29 (
30 sys.executable,
31 os.path.join("scripts", "l10n", "msgfmt.py"),
32 "-o",
33 mo_file,
34 po_file,
35 ),
36 check=True,
37 )
38
39
40 if __name__ == "__main__":
41 generate_mo_files()
42 setup()
43
```
Path: `scripts/l10n/generate_mo_files.py`
Content:
```
1 #!/usr/bin/env python3
2
3 # python-holidays
4 # ---------------
5 # A fast, efficient Python library for generating country, province and state
6 # specific sets of holidays on the fly. It aims to make determining whether a
7 # specific date is a holiday as fast and flexible as possible.
8 #
9 # Authors: dr-prodigy <[email protected]> (c) 2017-2023
10 # ryanss <[email protected]> (c) 2014-2017
11 # Website: https://github.com/dr-prodigy/python-holidays
12 # License: MIT (see LICENSE file)
13
14 import os
15 import subprocess
16 import sys
17 from pathlib import Path
18
19
20 class MOGenerator:
21 """Creates .mo files for supported country/market entities."""
22
23 def run(self):
24 """Runs the .mo files generation process."""
25 # Delete old files.
26 for mo_file in Path(os.path.join("holidays", "locale")).rglob("*.mo"):
27 os.unlink(str(mo_file))
28
29 # Create new files.
30 for po_path in Path(os.path.join("holidays", "locale")).rglob("*.po"):
31 po_file = str(po_path)
32 mo_file = po_file.replace(".po", ".mo")
33 subprocess.run(
34 (
35 sys.executable,
36 os.path.join("scripts", "l10n", "msgfmt.py"),
37 "-o",
38 mo_file,
39 po_file,
40 ),
41 check=True,
42 )
43
44
45 if __name__ == "__main__":
46 MOGenerator().run()
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/scripts/l10n/generate_mo_files.py b/scripts/l10n/generate_mo_files.py
--- a/scripts/l10n/generate_mo_files.py
+++ b/scripts/l10n/generate_mo_files.py
@@ -22,14 +22,12 @@
def run(self):
"""Runs the .mo files generation process."""
- # Delete old files.
- for mo_file in Path(os.path.join("holidays", "locale")).rglob("*.mo"):
- os.unlink(str(mo_file))
-
- # Create new files.
for po_path in Path(os.path.join("holidays", "locale")).rglob("*.po"):
po_file = str(po_path)
mo_file = po_file.replace(".po", ".mo")
+
+ if os.path.exists(mo_file):
+ os.unlink(mo_file)
subprocess.run(
(
sys.executable,
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -9,34 +9,7 @@
# Website: https://github.com/dr-prodigy/python-holidays
# License: MIT (see LICENSE file)
-import os
-import subprocess
-import sys
-from pathlib import Path
-
from setuptools import setup
-
-def generate_mo_files():
- """Looks up for .po files and generates respective .mo files."""
- for po_path in Path(os.path.join("holidays", "locale")).rglob("*.po"):
- po_file = str(po_path)
- mo_file = po_file.replace(".po", ".mo")
-
- if os.path.exists(mo_file):
- os.unlink(mo_file)
- subprocess.run(
- (
- sys.executable,
- os.path.join("scripts", "l10n", "msgfmt.py"),
- "-o",
- mo_file,
- po_file,
- ),
- check=True,
- )
-
-
if __name__ == "__main__":
- generate_mo_files()
setup()
| {"golden_diff": "diff --git a/scripts/l10n/generate_mo_files.py b/scripts/l10n/generate_mo_files.py\n--- a/scripts/l10n/generate_mo_files.py\n+++ b/scripts/l10n/generate_mo_files.py\n@@ -22,14 +22,12 @@\n \n def run(self):\n \"\"\"Runs the .mo files generation process.\"\"\"\n- # Delete old files.\n- for mo_file in Path(os.path.join(\"holidays\", \"locale\")).rglob(\"*.mo\"):\n- os.unlink(str(mo_file))\n-\n- # Create new files.\n for po_path in Path(os.path.join(\"holidays\", \"locale\")).rglob(\"*.po\"):\n po_file = str(po_path)\n mo_file = po_file.replace(\".po\", \".mo\")\n+\n+ if os.path.exists(mo_file):\n+ os.unlink(mo_file)\n subprocess.run(\n (\n sys.executable,\ndiff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -9,34 +9,7 @@\n # Website: https://github.com/dr-prodigy/python-holidays\n # License: MIT (see LICENSE file)\n \n-import os\n-import subprocess\n-import sys\n-from pathlib import Path\n-\n from setuptools import setup\n \n-\n-def generate_mo_files():\n- \"\"\"Looks up for .po files and generates respective .mo files.\"\"\"\n- for po_path in Path(os.path.join(\"holidays\", \"locale\")).rglob(\"*.po\"):\n- po_file = str(po_path)\n- mo_file = po_file.replace(\".po\", \".mo\")\n-\n- if os.path.exists(mo_file):\n- os.unlink(mo_file)\n- subprocess.run(\n- (\n- sys.executable,\n- os.path.join(\"scripts\", \"l10n\", \"msgfmt.py\"),\n- \"-o\",\n- mo_file,\n- po_file,\n- ),\n- check=True,\n- )\n-\n-\n if __name__ == \"__main__\":\n- generate_mo_files()\n setup()\n", "issue": "Update MANIFEST.in\nClean up and make sure `sdist` contains all needed files.\n", "before_files": [{"content": "# python-holidays\n# ---------------\n# A fast, efficient Python library for generating country, province and state\n# specific sets of holidays on the fly. It aims to make determining whether a\n# specific date is a holiday as fast and flexible as possible.\n#\n# Authors: dr-prodigy <[email protected]> (c) 2017-2023\n# ryanss <[email protected]> (c) 2014-2017\n# Website: https://github.com/dr-prodigy/python-holidays\n# License: MIT (see LICENSE file)\n\nimport os\nimport subprocess\nimport sys\nfrom pathlib import Path\n\nfrom setuptools import setup\n\n\ndef generate_mo_files():\n \"\"\"Looks up for .po files and generates respective .mo files.\"\"\"\n for po_path in Path(os.path.join(\"holidays\", \"locale\")).rglob(\"*.po\"):\n po_file = str(po_path)\n mo_file = po_file.replace(\".po\", \".mo\")\n\n if os.path.exists(mo_file):\n os.unlink(mo_file)\n subprocess.run(\n (\n sys.executable,\n os.path.join(\"scripts\", \"l10n\", \"msgfmt.py\"),\n \"-o\",\n mo_file,\n po_file,\n ),\n check=True,\n )\n\n\nif __name__ == \"__main__\":\n generate_mo_files()\n setup()\n", "path": "setup.py"}, {"content": "#!/usr/bin/env python3\n\n# python-holidays\n# ---------------\n# A fast, efficient Python library for generating country, province and state\n# specific sets of holidays on the fly. 
It aims to make determining whether a\n# specific date is a holiday as fast and flexible as possible.\n#\n# Authors: dr-prodigy <[email protected]> (c) 2017-2023\n# ryanss <[email protected]> (c) 2014-2017\n# Website: https://github.com/dr-prodigy/python-holidays\n# License: MIT (see LICENSE file)\n\nimport os\nimport subprocess\nimport sys\nfrom pathlib import Path\n\n\nclass MOGenerator:\n \"\"\"Creates .mo files for supported country/market entities.\"\"\"\n\n def run(self):\n \"\"\"Runs the .mo files generation process.\"\"\"\n # Delete old files.\n for mo_file in Path(os.path.join(\"holidays\", \"locale\")).rglob(\"*.mo\"):\n os.unlink(str(mo_file))\n\n # Create new files.\n for po_path in Path(os.path.join(\"holidays\", \"locale\")).rglob(\"*.po\"):\n po_file = str(po_path)\n mo_file = po_file.replace(\".po\", \".mo\")\n subprocess.run(\n (\n sys.executable,\n os.path.join(\"scripts\", \"l10n\", \"msgfmt.py\"),\n \"-o\",\n mo_file,\n po_file,\n ),\n check=True,\n )\n\n\nif __name__ == \"__main__\":\n MOGenerator().run()\n", "path": "scripts/l10n/generate_mo_files.py"}], "after_files": [{"content": "# python-holidays\n# ---------------\n# A fast, efficient Python library for generating country, province and state\n# specific sets of holidays on the fly. It aims to make determining whether a\n# specific date is a holiday as fast and flexible as possible.\n#\n# Authors: dr-prodigy <[email protected]> (c) 2017-2023\n# ryanss <[email protected]> (c) 2014-2017\n# Website: https://github.com/dr-prodigy/python-holidays\n# License: MIT (see LICENSE file)\n\nfrom setuptools import setup\n\nif __name__ == \"__main__\":\n setup()\n", "path": "setup.py"}, {"content": "#!/usr/bin/env python3\n\n# python-holidays\n# ---------------\n# A fast, efficient Python library for generating country, province and state\n# specific sets of holidays on the fly. It aims to make determining whether a\n# specific date is a holiday as fast and flexible as possible.\n#\n# Authors: dr-prodigy <[email protected]> (c) 2017-2023\n# ryanss <[email protected]> (c) 2014-2017\n# Website: https://github.com/dr-prodigy/python-holidays\n# License: MIT (see LICENSE file)\n\nimport os\nimport subprocess\nimport sys\nfrom pathlib import Path\n\n\nclass MOGenerator:\n \"\"\"Creates .mo files for supported country/market entities.\"\"\"\n\n def run(self):\n \"\"\"Runs the .mo files generation process.\"\"\"\n for po_path in Path(os.path.join(\"holidays\", \"locale\")).rglob(\"*.po\"):\n po_file = str(po_path)\n mo_file = po_file.replace(\".po\", \".mo\")\n\n if os.path.exists(mo_file):\n os.unlink(mo_file)\n subprocess.run(\n (\n sys.executable,\n os.path.join(\"scripts\", \"l10n\", \"msgfmt.py\"),\n \"-o\",\n mo_file,\n po_file,\n ),\n check=True,\n )\n\n\nif __name__ == \"__main__\":\n MOGenerator().run()\n", "path": "scripts/l10n/generate_mo_files.py"}]} | 1,128 | 445 |
gh_patches_debug_35484 | rasdani/github-patches | git_diff | DDMAL__CantusDB-271 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
"My sources" panel should appear on homepage
In OldCantus on the homepage when a user is logged in, a list of sources appears in a "My sources" box in the left sidebar. This box replaces the "jump to source" and "what's new" boxes that are visible when a user is not logged in.
In NewCantus, the "jump to source" and "what's new" boxes appear whether or not a user is logged in. Behavior on NewCantus should be changed to match that of OldCantus
--- END ISSUE ---
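As a reading aid only, the sidebar behaviour described above maps naturally onto a template-level toggle. A minimal Django-template sketch, assuming a `my_sources` simple tag and sidebar include names that are illustrative rather than taken from the project:
```
{# Hypothetical sidebar logic; the tag and include names are assumptions. #}
{% if user.is_authenticated %}
    {% my_sources user %}
{% else %}
    {% include "jump_to_source.html" %}
    {% include "whats_new.html" %}
{% endif %}
```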
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `django/cantusdb_project/main_app/templatetags/helper_tags.py`
Content:
```
1 import calendar
2 from typing import Union, Optional
3 from django.utils.http import urlencode
4 from django import template
5 from main_app.models import Source
6 from articles.models import Article
7 from django.utils.safestring import mark_safe
8
9 register = template.Library()
10
11
12 @register.simple_tag(takes_context=False)
13 def recent_articles():
14 articles = Article.objects.order_by("-date_created")[:5]
15 list_item_template = '<li><a href="{url}">{title}</a><br><small>{date}</small></li>'
16 list_items = [
17 list_item_template.format(
18 url=a.get_absolute_url(),
19 title=a.title,
20 date=a.date_created.strftime("%x"),
21 )
22 for a
23 in articles
24 ]
25 list_items_string = "".join(list_items)
26 recent_articles_string = "<ul>{lis}</ul>".format(lis=list_items_string)
27 return mark_safe(recent_articles_string)
28
29
30 @register.filter(name="month_to_string")
31 def month_to_string(value: Optional[Union[str, int]]) -> Optional[Union[str, int]]:
32 """Converts month number to textual representation, 3 letters (Jan, Mar, etc)"""
33 if type(value) == int and value in range(1, 13):
34 return calendar.month_abbr[value]
35 else:
36 return value
37
38
39 @register.simple_tag(takes_context=True)
40 def url_add_get_params(context, **kwargs):
41 query = context["request"].GET.copy()
42 # accounts for the situations where there may be two paginations in one page
43 if "page" in kwargs:
44 query.pop("page", None)
45 if "page2" in kwargs:
46 query.pop("page2", None)
47 query.update(kwargs)
48 return query.urlencode()
49
50
51 @register.simple_tag(takes_context=False)
52 def source_links():
53 sources = (
54 Source.objects.filter(public=True, visible=True, segment__id=4063)
55 .exclude(siglum=None)
56 .values("siglum", "id")
57 .order_by("siglum")
58 )
59 options = ""
60 # <option value="source1">Source 1</option>
61 # <option value="source2">Source 2</option>
62 # <option value="source3">Source 3</option>
63 for source in sources:
64 option_str = (
65 f"<option value=source/{source['id']}>{source['siglum']}</option>\n"
66 )
67 options += option_str
68
69 return mark_safe(options)
70
71
72
73 @register.filter
74 def classname(obj):
75 """
76 Returns the name of the object's class
77 A use-case is: {% if object|classname == "Notation" %}
78 """
79 return obj.__class__.__name__
80
81 @register.filter
82 def admin_url_name(class_name, action):
83 """
84 Accepts a class name and an action (either "change" or "delete") as arguments.
85 Returns the name of the URL for changing/deleting an object in the admin interface.
86 """
87 class_name = class_name.lower()
88 action = action.lower()
89
90 return f"admin:main_app_{class_name}_{action}"
91
92 @register.filter(name='has_group')
93 def has_group(user, group_name):
94 return user.groups.filter(name=group_name).exists()
95
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/django/cantusdb_project/main_app/templatetags/helper_tags.py b/django/cantusdb_project/main_app/templatetags/helper_tags.py
--- a/django/cantusdb_project/main_app/templatetags/helper_tags.py
+++ b/django/cantusdb_project/main_app/templatetags/helper_tags.py
@@ -5,6 +5,7 @@
from main_app.models import Source
from articles.models import Article
from django.utils.safestring import mark_safe
+from django.urls import reverse
register = template.Library()
@@ -27,6 +28,57 @@
return mark_safe(recent_articles_string)
[email protected]_tag(takes_context=False)
+def my_sources(user):
+ def make_source_detail_link_with_siglum(source):
+ id = source.id
+ siglum = source.rism_siglum
+ url = reverse("source-detail", args=[id])
+ link = '<a href="{}">{}</a>'.format(url, siglum)
+ return link
+ def make_source_detail_link_with_title(source):
+ id = source.id
+ title = source.title
+ url = reverse("source-detail", args=[id])
+ link = '<a href="{}">{}</a>'.format(url, title)
+ return link
+ def make_add_new_chants_link(source):
+ id = source.id
+ url = reverse("chant-create", args=[id])
+ link = '<a href="{}">+ Add new chant</a>'.format(url)
+ return link
+ def make_edit_chants_link(source):
+ id = source.id
+ url = reverse("source-edit-volpiano", args=[id])
+ link = '<a href="{}">Edit chants (Fulltext & Volpiano editor)</a>'.format(url)
+ return link
+ def make_links_for_source(source):
+ link_with_siglum = make_source_detail_link_with_siglum(source)
+ link_with_title = make_source_detail_link_with_title(source)
+ add_new_chants_link = make_add_new_chants_link(source)
+ edit_chants_link = make_edit_chants_link(source)
+ template = """{sigl}<br>
+ <small>
+ <b>{title}</b><br>
+ {add}<br>
+ {edit}<br>
+ </small>
+ """
+ links_string = template.format(
+ sigl=link_with_siglum,
+ title=link_with_title,
+ add=add_new_chants_link,
+ edit=edit_chants_link,
+ )
+ return links_string
+ MAX_SOURCES_TO_DISPLAY = 6
+ sources = list(user.sources_user_can_edit.all())[:MAX_SOURCES_TO_DISPLAY]
+ source_links = [make_links_for_source(source) for source in sources]
+ list_items = ["<li>{}</li>".format(link) for link in source_links]
+ joined_list_items = "".join(list_items)
+ links_ul = "<ul>{}</ul>".format(joined_list_items)
+ return mark_safe(links_ul)
+
@register.filter(name="month_to_string")
def month_to_string(value: Optional[Union[str, int]]) -> Optional[Union[str, int]]:
"""Converts month number to textual representation, 3 letters (Jan, Mar, etc)"""
| {"golden_diff": "diff --git a/django/cantusdb_project/main_app/templatetags/helper_tags.py b/django/cantusdb_project/main_app/templatetags/helper_tags.py\n--- a/django/cantusdb_project/main_app/templatetags/helper_tags.py\n+++ b/django/cantusdb_project/main_app/templatetags/helper_tags.py\n@@ -5,6 +5,7 @@\n from main_app.models import Source\n from articles.models import Article\n from django.utils.safestring import mark_safe\n+from django.urls import reverse\n \n register = template.Library()\n \n@@ -27,6 +28,57 @@\n return mark_safe(recent_articles_string)\n \n \[email protected]_tag(takes_context=False)\n+def my_sources(user):\n+ def make_source_detail_link_with_siglum(source):\n+ id = source.id\n+ siglum = source.rism_siglum\n+ url = reverse(\"source-detail\", args=[id])\n+ link = '<a href=\"{}\">{}</a>'.format(url, siglum)\n+ return link\n+ def make_source_detail_link_with_title(source):\n+ id = source.id\n+ title = source.title\n+ url = reverse(\"source-detail\", args=[id])\n+ link = '<a href=\"{}\">{}</a>'.format(url, title) \n+ return link\n+ def make_add_new_chants_link(source):\n+ id = source.id\n+ url = reverse(\"chant-create\", args=[id])\n+ link = '<a href=\"{}\">+ Add new chant</a>'.format(url) \n+ return link\n+ def make_edit_chants_link(source):\n+ id = source.id\n+ url = reverse(\"source-edit-volpiano\", args=[id])\n+ link = '<a href=\"{}\">Edit chants (Fulltext & Volpiano editor)</a>'.format(url) \n+ return link\n+ def make_links_for_source(source):\n+ link_with_siglum = make_source_detail_link_with_siglum(source)\n+ link_with_title = make_source_detail_link_with_title(source)\n+ add_new_chants_link = make_add_new_chants_link(source)\n+ edit_chants_link = make_edit_chants_link(source)\n+ template = \"\"\"{sigl}<br>\n+ <small>\n+ <b>{title}</b><br>\n+ {add}<br>\n+ {edit}<br>\n+ </small>\n+ \"\"\"\n+ links_string = template.format(\n+ sigl=link_with_siglum,\n+ title=link_with_title,\n+ add=add_new_chants_link,\n+ edit=edit_chants_link,\n+ )\n+ return links_string\n+ MAX_SOURCES_TO_DISPLAY = 6\n+ sources = list(user.sources_user_can_edit.all())[:MAX_SOURCES_TO_DISPLAY]\n+ source_links = [make_links_for_source(source) for source in sources]\n+ list_items = [\"<li>{}</li>\".format(link) for link in source_links]\n+ joined_list_items = \"\".join(list_items)\n+ links_ul = \"<ul>{}</ul>\".format(joined_list_items)\n+ return mark_safe(links_ul)\n+\n @register.filter(name=\"month_to_string\")\n def month_to_string(value: Optional[Union[str, int]]) -> Optional[Union[str, int]]:\n \"\"\"Converts month number to textual representation, 3 letters (Jan, Mar, etc)\"\"\"\n", "issue": "\"My sources\" panel should appear on homepage\nIn OldCantus on the homepage when a user is logged in, a list of sources appears in a \"My sources\" box in the left sidebar. This box replaces the \"jump to source\" and \"what's new\" boxes that are visible when a user is not logged in.\r\n\r\nIn NewCantus, the \"jump to source\" and \"what's new\" boxes appear whether or not a user is logged in. 
Behavior on NewCantus should be changed to match that of OldCantus\n", "before_files": [{"content": "import calendar\nfrom typing import Union, Optional\nfrom django.utils.http import urlencode\nfrom django import template\nfrom main_app.models import Source\nfrom articles.models import Article\nfrom django.utils.safestring import mark_safe\n\nregister = template.Library()\n\n\[email protected]_tag(takes_context=False)\ndef recent_articles():\n articles = Article.objects.order_by(\"-date_created\")[:5]\n list_item_template = '<li><a href=\"{url}\">{title}</a><br><small>{date}</small></li>'\n list_items = [\n list_item_template.format(\n url=a.get_absolute_url(),\n title=a.title,\n date=a.date_created.strftime(\"%x\"),\n )\n for a\n in articles\n ]\n list_items_string = \"\".join(list_items)\n recent_articles_string = \"<ul>{lis}</ul>\".format(lis=list_items_string)\n return mark_safe(recent_articles_string)\n\n\[email protected](name=\"month_to_string\")\ndef month_to_string(value: Optional[Union[str, int]]) -> Optional[Union[str, int]]:\n \"\"\"Converts month number to textual representation, 3 letters (Jan, Mar, etc)\"\"\"\n if type(value) == int and value in range(1, 13):\n return calendar.month_abbr[value]\n else:\n return value\n\n\[email protected]_tag(takes_context=True)\ndef url_add_get_params(context, **kwargs):\n query = context[\"request\"].GET.copy()\n # accounts for the situations where there may be two paginations in one page\n if \"page\" in kwargs:\n query.pop(\"page\", None)\n if \"page2\" in kwargs:\n query.pop(\"page2\", None)\n query.update(kwargs)\n return query.urlencode()\n\n\[email protected]_tag(takes_context=False)\ndef source_links():\n sources = (\n Source.objects.filter(public=True, visible=True, segment__id=4063)\n .exclude(siglum=None)\n .values(\"siglum\", \"id\")\n .order_by(\"siglum\")\n )\n options = \"\"\n # <option value=\"source1\">Source 1</option>\n # <option value=\"source2\">Source 2</option>\n # <option value=\"source3\">Source 3</option>\n for source in sources:\n option_str = (\n f\"<option value=source/{source['id']}>{source['siglum']}</option>\\n\"\n )\n options += option_str\n\n return mark_safe(options)\n\n\n\[email protected]\ndef classname(obj):\n \"\"\"\n Returns the name of the object's class\n A use-case is: {% if object|classname == \"Notation\" %}\n \"\"\"\n return obj.__class__.__name__\n\[email protected]\ndef admin_url_name(class_name, action):\n \"\"\"\n Accepts a class name and an action (either \"change\" or \"delete\") as arguments.\n Returns the name of the URL for changing/deleting an object in the admin interface.\n \"\"\"\n class_name = class_name.lower()\n action = action.lower()\n\n return f\"admin:main_app_{class_name}_{action}\"\n\[email protected](name='has_group') \ndef has_group(user, group_name):\n return user.groups.filter(name=group_name).exists() \n", "path": "django/cantusdb_project/main_app/templatetags/helper_tags.py"}], "after_files": [{"content": "import calendar\nfrom typing import Union, Optional\nfrom django.utils.http import urlencode\nfrom django import template\nfrom main_app.models import Source\nfrom articles.models import Article\nfrom django.utils.safestring import mark_safe\nfrom django.urls import reverse\n\nregister = template.Library()\n\n\[email protected]_tag(takes_context=False)\ndef recent_articles():\n articles = Article.objects.order_by(\"-date_created\")[:5]\n list_item_template = '<li><a href=\"{url}\">{title}</a><br><small>{date}</small></li>'\n list_items = [\n 
list_item_template.format(\n url=a.get_absolute_url(),\n title=a.title,\n date=a.date_created.strftime(\"%x\"),\n )\n for a\n in articles\n ]\n list_items_string = \"\".join(list_items)\n recent_articles_string = \"<ul>{lis}</ul>\".format(lis=list_items_string)\n return mark_safe(recent_articles_string)\n\n\[email protected]_tag(takes_context=False)\ndef my_sources(user):\n def make_source_detail_link_with_siglum(source):\n id = source.id\n siglum = source.rism_siglum\n url = reverse(\"source-detail\", args=[id])\n link = '<a href=\"{}\">{}</a>'.format(url, siglum)\n return link\n def make_source_detail_link_with_title(source):\n id = source.id\n title = source.title\n url = reverse(\"source-detail\", args=[id])\n link = '<a href=\"{}\">{}</a>'.format(url, title) \n return link\n def make_add_new_chants_link(source):\n id = source.id\n url = reverse(\"chant-create\", args=[id])\n link = '<a href=\"{}\">+ Add new chant</a>'.format(url) \n return link\n def make_edit_chants_link(source):\n id = source.id\n url = reverse(\"source-edit-volpiano\", args=[id])\n link = '<a href=\"{}\">Edit chants (Fulltext & Volpiano editor)</a>'.format(url) \n return link\n def make_links_for_source(source):\n link_with_siglum = make_source_detail_link_with_siglum(source)\n link_with_title = make_source_detail_link_with_title(source)\n add_new_chants_link = make_add_new_chants_link(source)\n edit_chants_link = make_edit_chants_link(source)\n template = \"\"\"{sigl}<br>\n <small>\n <b>{title}</b><br>\n {add}<br>\n {edit}<br>\n </small>\n \"\"\"\n links_string = template.format(\n sigl=link_with_siglum,\n title=link_with_title,\n add=add_new_chants_link,\n edit=edit_chants_link,\n )\n return links_string\n MAX_SOURCES_TO_DISPLAY = 6\n sources = list(user.sources_user_can_edit.all())[:MAX_SOURCES_TO_DISPLAY]\n source_links = [make_links_for_source(source) for source in sources]\n list_items = [\"<li>{}</li>\".format(link) for link in source_links]\n joined_list_items = \"\".join(list_items)\n links_ul = \"<ul>{}</ul>\".format(joined_list_items)\n return mark_safe(links_ul)\n\[email protected](name=\"month_to_string\")\ndef month_to_string(value: Optional[Union[str, int]]) -> Optional[Union[str, int]]:\n \"\"\"Converts month number to textual representation, 3 letters (Jan, Mar, etc)\"\"\"\n if type(value) == int and value in range(1, 13):\n return calendar.month_abbr[value]\n else:\n return value\n\n\[email protected]_tag(takes_context=True)\ndef url_add_get_params(context, **kwargs):\n query = context[\"request\"].GET.copy()\n # accounts for the situations where there may be two paginations in one page\n if \"page\" in kwargs:\n query.pop(\"page\", None)\n if \"page2\" in kwargs:\n query.pop(\"page2\", None)\n query.update(kwargs)\n return query.urlencode()\n\n\[email protected]_tag(takes_context=False)\ndef source_links():\n sources = (\n Source.objects.filter(public=True, visible=True, segment__id=4063)\n .exclude(siglum=None)\n .values(\"siglum\", \"id\")\n .order_by(\"siglum\")\n )\n options = \"\"\n # <option value=\"source1\">Source 1</option>\n # <option value=\"source2\">Source 2</option>\n # <option value=\"source3\">Source 3</option>\n for source in sources:\n option_str = (\n f\"<option value=source/{source['id']}>{source['siglum']}</option>\\n\"\n )\n options += option_str\n\n return mark_safe(options)\n\n\n\[email protected]\ndef classname(obj):\n \"\"\"\n Returns the name of the object's class\n A use-case is: {% if object|classname == \"Notation\" %}\n \"\"\"\n return 
obj.__class__.__name__\n\[email protected]\ndef admin_url_name(class_name, action):\n \"\"\"\n Accepts a class name and an action (either \"change\" or \"delete\") as arguments.\n Returns the name of the URL for changing/deleting an object in the admin interface.\n \"\"\"\n class_name = class_name.lower()\n action = action.lower()\n\n return f\"admin:main_app_{class_name}_{action}\"\n\[email protected](name='has_group') \ndef has_group(user, group_name):\n return user.groups.filter(name=group_name).exists() \n", "path": "django/cantusdb_project/main_app/templatetags/helper_tags.py"}]} | 1,268 | 751 |
gh_patches_debug_24026 | rasdani/github-patches | git_diff | mesonbuild__meson-4627 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
run_target improperly handles command if 'python' is in meson's installation path
My build environment is RHEL 7.6 with the python 3.6 Red Hat software collection enabled in my .bashrc. Meson was installed by running ``pip3.6 install .`` as root in the ``meson-0.49.0/`` folder from a fresh download. (This issue also happens in meson 48.)
I was setting up a custom run_target to run cppcheck with the command
```
run_target('cppcheck', command : ['cppcheck', '--project=' + join_paths(meson.build_root(), 'compile_commands.json')])
```
but noticed that it was failing to run with the following error message:
```
Could not execute command "--project=[build dir]/compile_commands.json". File not found.
FAILED: meson-cppcheck
/opt/rh/rh-python36/root/usr/bin/meson --internal commandrunner [src dir] [build dir] '' /opt/rh/rh-python36/root/usr/bin/meson cppcheck --project=[build dir]/compile_commands.json
ninja: build stopped: subcommand failed
```
I traced the issue for a bit and the issue appears to be in ``commandrunner.py`` line 61 (ironically the line with the comment "Hack." haha): it looks for the presence of the string ``python`` in the meson path, presumably to catch instances of meson being launched with a ``python3 meson`` call. However, it ignores the possibility that python is simply a part of the path to the meson executable. When this happens, the commandrunner script grabs the fifth argument and assumes that it's part of the meson command, which is wrong.
I'm currently working around the issue by inserting an empty argument into the run_target command, like this:
```
run_target('cppcheck', command : ['', 'cppcheck', '--project=' + join_paths(meson.build_root(), 'compile_commands.json')])
```
--- END ISSUE ---
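The diagnosis above comes down to a substring test being too loose. A small Python sketch of the failure mode, using the path from the error output and the end-anchored pattern that the accompanying patch introduces:
```
# Why the 'python' substring test misfires (path taken from the error output above).
import re

meson_cmd = "/opt/rh/rh-python36/root/usr/bin/meson"

print("python" in meson_cmd)  # True: the *path* contains "python", so meson is mistaken for a launcher

# An end-anchored check only matches an actual interpreter executable:
launcher = re.compile(r"python(3|3\.\d+)?(\.exe)?$")
print(bool(launcher.search(meson_cmd)))           # False
print(bool(launcher.search("/usr/bin/python3")))  # True
```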
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mesonbuild/scripts/commandrunner.py`
Content:
```
1 # Copyright 2014 The Meson development team
2
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6
7 # http://www.apache.org/licenses/LICENSE-2.0
8
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """This program is a wrapper to run external commands. It determines
16 what to run, sets up the environment and executes the command."""
17
18 import sys, os, subprocess, shutil, shlex
19
20 def run_command(source_dir, build_dir, subdir, meson_command, command, arguments):
21 env = {'MESON_SOURCE_ROOT': source_dir,
22 'MESON_BUILD_ROOT': build_dir,
23 'MESON_SUBDIR': subdir,
24 'MESONINTROSPECT': ' '.join([shlex.quote(x) for x in meson_command + ['introspect']]),
25 }
26 cwd = os.path.join(source_dir, subdir)
27 child_env = os.environ.copy()
28 child_env.update(env)
29
30 # Is the command an executable in path?
31 exe = shutil.which(command)
32 if exe is not None:
33 command_array = [exe] + arguments
34 else:# No? Maybe it is a script in the source tree.
35 fullpath = os.path.join(source_dir, subdir, command)
36 command_array = [fullpath] + arguments
37 try:
38 return subprocess.Popen(command_array, env=child_env, cwd=cwd)
39 except FileNotFoundError:
40 print('Could not execute command "%s". File not found.' % command)
41 sys.exit(1)
42 except PermissionError:
43 print('Could not execute command "%s". File not executable.' % command)
44 sys.exit(1)
45 except OSError as err:
46 print('Could not execute command "{}": {}'.format(command, err))
47 sys.exit(1)
48 except subprocess.SubprocessError as err:
49 print('Could not execute command "{}": {}'.format(command, err))
50 sys.exit(1)
51
52
53 def run(args):
54 if len(args) < 4:
55 print('commandrunner.py <source dir> <build dir> <subdir> <command> [arguments]')
56 return 1
57 src_dir = args[0]
58 build_dir = args[1]
59 subdir = args[2]
60 meson_command = args[3]
61 if 'python' in meson_command: # Hack.
62 meson_command = [meson_command, args[4]]
63 command = args[5]
64 arguments = args[6:]
65 else:
66 meson_command = [meson_command]
67 command = args[4]
68 arguments = args[5:]
69 pc = run_command(src_dir, build_dir, subdir, meson_command, command, arguments)
70 pc.wait()
71 return pc.returncode
72
73 if __name__ == '__main__':
74 sys.exit(run(sys.argv[1:]))
75
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mesonbuild/scripts/commandrunner.py b/mesonbuild/scripts/commandrunner.py
--- a/mesonbuild/scripts/commandrunner.py
+++ b/mesonbuild/scripts/commandrunner.py
@@ -16,6 +16,7 @@
what to run, sets up the environment and executes the command."""
import sys, os, subprocess, shutil, shlex
+import re
def run_command(source_dir, build_dir, subdir, meson_command, command, arguments):
env = {'MESON_SOURCE_ROOT': source_dir,
@@ -49,6 +50,9 @@
print('Could not execute command "{}": {}'.format(command, err))
sys.exit(1)
+def is_python_command(cmdname):
+ end_py_regex = r'python(3|3\.\d+)?(\.exe)?$'
+ return re.search(end_py_regex, cmdname) is not None
def run(args):
if len(args) < 4:
@@ -58,7 +62,7 @@
build_dir = args[1]
subdir = args[2]
meson_command = args[3]
- if 'python' in meson_command: # Hack.
+ if is_python_command(meson_command):
meson_command = [meson_command, args[4]]
command = args[5]
arguments = args[6:]
| {"golden_diff": "diff --git a/mesonbuild/scripts/commandrunner.py b/mesonbuild/scripts/commandrunner.py\n--- a/mesonbuild/scripts/commandrunner.py\n+++ b/mesonbuild/scripts/commandrunner.py\n@@ -16,6 +16,7 @@\n what to run, sets up the environment and executes the command.\"\"\"\n \n import sys, os, subprocess, shutil, shlex\n+import re\n \n def run_command(source_dir, build_dir, subdir, meson_command, command, arguments):\n env = {'MESON_SOURCE_ROOT': source_dir,\n@@ -49,6 +50,9 @@\n print('Could not execute command \"{}\": {}'.format(command, err))\n sys.exit(1)\n \n+def is_python_command(cmdname):\n+ end_py_regex = r'python(3|3\\.\\d+)?(\\.exe)?$'\n+ return re.search(end_py_regex, cmdname) is not None\n \n def run(args):\n if len(args) < 4:\n@@ -58,7 +62,7 @@\n build_dir = args[1]\n subdir = args[2]\n meson_command = args[3]\n- if 'python' in meson_command: # Hack.\n+ if is_python_command(meson_command):\n meson_command = [meson_command, args[4]]\n command = args[5]\n arguments = args[6:]\n", "issue": "run_target improperly handles command if 'python' is in meson's installation path\nMy build environment is RHEL 7.6 with the python 3.6 Red Hat software collection enabled in my .bashrc. Meson was installed by running ``pip3.6 install .`` as root in the ``meson-0.49.0/`` folder from a fresh download. (This issue also happens in meson 48.)\r\n\r\nI was setting up a custom run_target to run cppcheck with the command\r\n```\r\nrun_target('cppcheck', command : ['cppcheck', '--project=' + join_paths(meson.build_root(), 'compile_commands.json')])\r\n```\r\nbut noticed that it was failing to run with the following error message:\r\n```\r\nCould not execute command \"--project=[build dir]/compile_commands.json\". File not found.\r\nFAILED: meson-cppcheck\r\n/opt/rh/rh-python36/root/usr/bin/meson --internal commandrunner [src dir] [build dir] '' /opt/rh/rh-python36/root/usr/bin/meson cppcheck --project=[build dir]/compile_commands.json\r\nninja: build stopped: subcommand failed\r\n```\r\n\r\nI traced the issue for a bit and it the issue appears to be in ``commandrunner.py`` line 61 (ironically the line with the comment \"Hack.\" haha): it looks for the presence of the string ``python`` in the meson path, presumably to catch instances of meson being launched with a ``python3 meson`` call. However it ignores the possibility that python is simply a part of the path to the meson executable. When this happens, the commandrunner script grabs the fifth argument and assumes that it's part of the meson command, which is wrong.\r\n\r\nI'm currently working around the issue by inserting an empty argument into the run_target command, like this:\r\n```\r\nrun_target('cppcheck', command : ['', 'cppcheck', '--project=' + join_paths(meson.build_root(), 'compile_commands.json')])\r\n```\n", "before_files": [{"content": "# Copyright 2014 The Meson development team\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n\n# http://www.apache.org/licenses/LICENSE-2.0\n\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"This program is a wrapper to run external commands. 
It determines\nwhat to run, sets up the environment and executes the command.\"\"\"\n\nimport sys, os, subprocess, shutil, shlex\n\ndef run_command(source_dir, build_dir, subdir, meson_command, command, arguments):\n env = {'MESON_SOURCE_ROOT': source_dir,\n 'MESON_BUILD_ROOT': build_dir,\n 'MESON_SUBDIR': subdir,\n 'MESONINTROSPECT': ' '.join([shlex.quote(x) for x in meson_command + ['introspect']]),\n }\n cwd = os.path.join(source_dir, subdir)\n child_env = os.environ.copy()\n child_env.update(env)\n\n # Is the command an executable in path?\n exe = shutil.which(command)\n if exe is not None:\n command_array = [exe] + arguments\n else:# No? Maybe it is a script in the source tree.\n fullpath = os.path.join(source_dir, subdir, command)\n command_array = [fullpath] + arguments\n try:\n return subprocess.Popen(command_array, env=child_env, cwd=cwd)\n except FileNotFoundError:\n print('Could not execute command \"%s\". File not found.' % command)\n sys.exit(1)\n except PermissionError:\n print('Could not execute command \"%s\". File not executable.' % command)\n sys.exit(1)\n except OSError as err:\n print('Could not execute command \"{}\": {}'.format(command, err))\n sys.exit(1)\n except subprocess.SubprocessError as err:\n print('Could not execute command \"{}\": {}'.format(command, err))\n sys.exit(1)\n\n\ndef run(args):\n if len(args) < 4:\n print('commandrunner.py <source dir> <build dir> <subdir> <command> [arguments]')\n return 1\n src_dir = args[0]\n build_dir = args[1]\n subdir = args[2]\n meson_command = args[3]\n if 'python' in meson_command: # Hack.\n meson_command = [meson_command, args[4]]\n command = args[5]\n arguments = args[6:]\n else:\n meson_command = [meson_command]\n command = args[4]\n arguments = args[5:]\n pc = run_command(src_dir, build_dir, subdir, meson_command, command, arguments)\n pc.wait()\n return pc.returncode\n\nif __name__ == '__main__':\n sys.exit(run(sys.argv[1:]))\n", "path": "mesonbuild/scripts/commandrunner.py"}], "after_files": [{"content": "# Copyright 2014 The Meson development team\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n\n# http://www.apache.org/licenses/LICENSE-2.0\n\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"This program is a wrapper to run external commands. It determines\nwhat to run, sets up the environment and executes the command.\"\"\"\n\nimport sys, os, subprocess, shutil, shlex\nimport re\n\ndef run_command(source_dir, build_dir, subdir, meson_command, command, arguments):\n env = {'MESON_SOURCE_ROOT': source_dir,\n 'MESON_BUILD_ROOT': build_dir,\n 'MESON_SUBDIR': subdir,\n 'MESONINTROSPECT': ' '.join([shlex.quote(x) for x in meson_command + ['introspect']]),\n }\n cwd = os.path.join(source_dir, subdir)\n child_env = os.environ.copy()\n child_env.update(env)\n\n # Is the command an executable in path?\n exe = shutil.which(command)\n if exe is not None:\n command_array = [exe] + arguments\n else:# No? 
Maybe it is a script in the source tree.\n fullpath = os.path.join(source_dir, subdir, command)\n command_array = [fullpath] + arguments\n try:\n return subprocess.Popen(command_array, env=child_env, cwd=cwd)\n except FileNotFoundError:\n print('Could not execute command \"%s\". File not found.' % command)\n sys.exit(1)\n except PermissionError:\n print('Could not execute command \"%s\". File not executable.' % command)\n sys.exit(1)\n except OSError as err:\n print('Could not execute command \"{}\": {}'.format(command, err))\n sys.exit(1)\n except subprocess.SubprocessError as err:\n print('Could not execute command \"{}\": {}'.format(command, err))\n sys.exit(1)\n\ndef is_python_command(cmdname):\n end_py_regex = r'python(3|3\\.\\d+)?(\\.exe)?$'\n return re.search(end_py_regex, cmdname) is not None\n\ndef run(args):\n if len(args) < 4:\n print('commandrunner.py <source dir> <build dir> <subdir> <command> [arguments]')\n return 1\n src_dir = args[0]\n build_dir = args[1]\n subdir = args[2]\n meson_command = args[3]\n if is_python_command(meson_command):\n meson_command = [meson_command, args[4]]\n command = args[5]\n arguments = args[6:]\n else:\n meson_command = [meson_command]\n command = args[4]\n arguments = args[5:]\n pc = run_command(src_dir, build_dir, subdir, meson_command, command, arguments)\n pc.wait()\n return pc.returncode\n\nif __name__ == '__main__':\n sys.exit(run(sys.argv[1:]))\n", "path": "mesonbuild/scripts/commandrunner.py"}]} | 1,512 | 295 |
gh_patches_debug_61898 | rasdani/github-patches | git_diff | opendatacube__datacube-core-534 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Pin rasterio >= 1.0.2
There are some issues with rasterio that are fixed in 1.0.2 (See: https://github.com/opendatacube/datacube-core/pull/531#pullrequestreview-143845097).
Files to change:
- setup.py
- requirements.txt
- rtd-environment.yml (this also needs the conda-forge/label/dev channel removed with the change)
--- END ISSUE ---
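Only `setup.py` appears in the code segments below, so, as a hedged illustration of the other two files named above (their real contents are not shown in this record), the companion edits would be along these lines:
```
# requirements.txt -- illustrative line only
rasterio>=1.0.2

# rtd-environment.yml -- illustrative fragment; note the conda-forge/label/dev channel removed
channels:
  - conda-forge
dependencies:
  - rasterio>=1.0.2
```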
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2
3 import versioneer
4 from setuptools import setup, find_packages
5 import os
6
7 tests_require = [
8 'compliance-checker',
9 'hypothesis',
10 'mock',
11 'objgraph',
12 'pycodestyle',
13 'pylint',
14 'pytest',
15 'pytest-cov',
16 'pytest-timeout',
17 ]
18
19 extras_require = {
20 'performance': ['ciso8601', 'bottleneck'],
21 'interactive': ['matplotlib', 'fiona'],
22 'distributed': ['distributed', 'dask[distributed]'],
23 'doc': ['Sphinx', 'setuptools'],
24 'replicas': ['paramiko', 'sshtunnel', 'tqdm'],
25 'celery': ['celery>=4', 'redis'],
26 's3': ['boto3', 'SharedArray', 'pathos', 'zstandard'],
27 'test': tests_require,
28 }
29 # An 'all' option, following ipython naming conventions.
30 extras_require['all'] = sorted(set(sum(extras_require.values(), [])))
31
32 extra_plugins = dict(read=[], write=[], index=[])
33
34 if os.name != 'nt':
35 extra_plugins['read'].extend([
36 's3aio = datacube.drivers.s3.driver:reader_driver_init [s3]',
37 's3aio_test = datacube.drivers.s3.driver:reader_test_driver_init [s3]',
38 ])
39 extra_plugins['write'].extend([
40 's3aio = datacube.drivers.s3.driver:writer_driver_init [s3]',
41 's3aio_test = datacube.drivers.s3.driver:writer_test_driver_init [s3]',
42 ])
43
44 extra_plugins['index'].extend([
45 's3aio_index = datacube.drivers.s3aio_index:index_driver_init [s3]',
46 ])
47
48 setup(
49 name='datacube',
50 version=versioneer.get_version(),
51 cmdclass=versioneer.get_cmdclass(),
52 python_requires='>=3.5.2',
53
54 url='https://github.com/opendatacube/datacube-core',
55 author='Open Data Cube',
56 maintainer='Open Data Cube',
57 maintainer_email='',
58 description='An analysis environment for satellite and other earth observation data',
59 long_description=open('README.rst').read(),
60 license='Apache License 2.0',
61 classifiers=[
62 "Development Status :: 4 - Beta",
63 "Intended Audience :: Developers",
64 "Intended Audience :: Science/Research",
65 "License :: OSI Approved :: Apache Software License",
66 "Natural Language :: English",
67 "Operating System :: MacOS :: MacOS X",
68 "Operating System :: POSIX",
69 "Operating System :: POSIX :: BSD",
70 "Operating System :: POSIX :: Linux",
71 "Operating System :: Microsoft :: Windows",
72 "Programming Language :: Python",
73 "Programming Language :: Python :: 3",
74 "Programming Language :: Python :: 3.5",
75 "Programming Language :: Python :: 3.6",
76 "Topic :: Scientific/Engineering :: GIS",
77 "Topic :: Scientific/Engineering :: Information Analysis",
78 ],
79
80 packages=find_packages(
81 exclude=('tests', 'tests.*',
82 'integration_tests', 'integration_tests.*')
83 ),
84 package_data={
85 '': ['*.yaml', '*/*.yaml'],
86 },
87 scripts=[
88 'datacube_apps/scripts/pbs_helpers.sh'
89 ],
90 setup_requires=[
91 'pytest-runner'
92 ],
93 install_requires=[
94 'affine',
95 'cachetools',
96 'click>=5.0',
97 'cloudpickle>=0.4',
98 'dask[array]',
99 'gdal>=1.9',
100 'jsonschema',
101 'netcdf4',
102 'numpy',
103 'psycopg2',
104 'pypeg2',
105 'python-dateutil',
106 'pyyaml',
107 'rasterio~=1.0',
108 'singledispatch',
109 'sqlalchemy',
110 'toolz',
111 'xarray>=0.9', # >0.9 fixes most problems with `crs` attributes being lost
112 ],
113 extras_require=extras_require,
114 tests_require=tests_require,
115
116 entry_points={
117 'console_scripts': [
118 'datacube = datacube.scripts.cli_app:cli',
119 'datacube-search = datacube.scripts.search_tool:cli',
120 'datacube-stacker = datacube_apps.stacker:main',
121 'datacube-worker = datacube.execution.worker:main',
122 'datacube-fixer = datacube_apps.stacker:fixer_main',
123 'datacube-ncml = datacube_apps.ncml:ncml_app',
124 'pixeldrill = datacube_apps.pixeldrill:main [interactive]',
125 'movie_generator = datacube_apps.movie_generator:main',
126 'datacube-simple-replica = datacube_apps.simple_replica:replicate [replicas]'
127 ],
128 'datacube.plugins.io.read': [
129 'netcdf = datacube.drivers.netcdf.driver:reader_driver_init',
130 *extra_plugins['read'],
131 ],
132 'datacube.plugins.io.write': [
133 'netcdf = datacube.drivers.netcdf.driver:writer_driver_init',
134 *extra_plugins['write'],
135 ],
136 'datacube.plugins.index': [
137 'default = datacube.index.index:index_driver_init',
138 *extra_plugins['index'],
139 ],
140 },
141 )
142
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -104,7 +104,7 @@
'pypeg2',
'python-dateutil',
'pyyaml',
- 'rasterio~=1.0',
+ 'rasterio>=1.0.2', # Multi-band re-project fixed in that version
'singledispatch',
'sqlalchemy',
'toolz',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -104,7 +104,7 @@\n 'pypeg2',\n 'python-dateutil',\n 'pyyaml',\n- 'rasterio~=1.0',\n+ 'rasterio>=1.0.2', # Multi-band re-project fixed in that version\n 'singledispatch',\n 'sqlalchemy',\n 'toolz',\n", "issue": "Pin rasterio >= 1.0.2\nThere are some issues with rasterio that are fixed with 1.0.2 (See: https://github.com/opendatacube/datacube-core/pull/531#pullrequestreview-143845097).\r\n\r\nFiles to change:\r\n- setup.py\r\n- requirements.txt\r\n- rtd-environment.yml (this also needs the conda-forge/label/dev channel removed with the change)\n", "before_files": [{"content": "#!/usr/bin/env python\n\nimport versioneer\nfrom setuptools import setup, find_packages\nimport os\n\ntests_require = [\n 'compliance-checker',\n 'hypothesis',\n 'mock',\n 'objgraph',\n 'pycodestyle',\n 'pylint',\n 'pytest',\n 'pytest-cov',\n 'pytest-timeout',\n]\n\nextras_require = {\n 'performance': ['ciso8601', 'bottleneck'],\n 'interactive': ['matplotlib', 'fiona'],\n 'distributed': ['distributed', 'dask[distributed]'],\n 'doc': ['Sphinx', 'setuptools'],\n 'replicas': ['paramiko', 'sshtunnel', 'tqdm'],\n 'celery': ['celery>=4', 'redis'],\n 's3': ['boto3', 'SharedArray', 'pathos', 'zstandard'],\n 'test': tests_require,\n}\n# An 'all' option, following ipython naming conventions.\nextras_require['all'] = sorted(set(sum(extras_require.values(), [])))\n\nextra_plugins = dict(read=[], write=[], index=[])\n\nif os.name != 'nt':\n extra_plugins['read'].extend([\n 's3aio = datacube.drivers.s3.driver:reader_driver_init [s3]',\n 's3aio_test = datacube.drivers.s3.driver:reader_test_driver_init [s3]',\n ])\n extra_plugins['write'].extend([\n 's3aio = datacube.drivers.s3.driver:writer_driver_init [s3]',\n 's3aio_test = datacube.drivers.s3.driver:writer_test_driver_init [s3]',\n ])\n\n extra_plugins['index'].extend([\n 's3aio_index = datacube.drivers.s3aio_index:index_driver_init [s3]',\n ])\n\nsetup(\n name='datacube',\n version=versioneer.get_version(),\n cmdclass=versioneer.get_cmdclass(),\n python_requires='>=3.5.2',\n\n url='https://github.com/opendatacube/datacube-core',\n author='Open Data Cube',\n maintainer='Open Data Cube',\n maintainer_email='',\n description='An analysis environment for satellite and other earth observation data',\n long_description=open('README.rst').read(),\n license='Apache License 2.0',\n classifiers=[\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Natural Language :: English\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Operating System :: POSIX\",\n \"Operating System :: POSIX :: BSD\",\n \"Operating System :: POSIX :: Linux\",\n \"Operating System :: Microsoft :: Windows\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Topic :: Scientific/Engineering :: GIS\",\n \"Topic :: Scientific/Engineering :: Information Analysis\",\n ],\n\n packages=find_packages(\n exclude=('tests', 'tests.*',\n 'integration_tests', 'integration_tests.*')\n ),\n package_data={\n '': ['*.yaml', '*/*.yaml'],\n },\n scripts=[\n 'datacube_apps/scripts/pbs_helpers.sh'\n ],\n setup_requires=[\n 'pytest-runner'\n ],\n install_requires=[\n 'affine',\n 'cachetools',\n 'click>=5.0',\n 'cloudpickle>=0.4',\n 'dask[array]',\n 'gdal>=1.9',\n 'jsonschema',\n 'netcdf4',\n 'numpy',\n 
'psycopg2',\n 'pypeg2',\n 'python-dateutil',\n 'pyyaml',\n 'rasterio~=1.0',\n 'singledispatch',\n 'sqlalchemy',\n 'toolz',\n 'xarray>=0.9', # >0.9 fixes most problems with `crs` attributes being lost\n ],\n extras_require=extras_require,\n tests_require=tests_require,\n\n entry_points={\n 'console_scripts': [\n 'datacube = datacube.scripts.cli_app:cli',\n 'datacube-search = datacube.scripts.search_tool:cli',\n 'datacube-stacker = datacube_apps.stacker:main',\n 'datacube-worker = datacube.execution.worker:main',\n 'datacube-fixer = datacube_apps.stacker:fixer_main',\n 'datacube-ncml = datacube_apps.ncml:ncml_app',\n 'pixeldrill = datacube_apps.pixeldrill:main [interactive]',\n 'movie_generator = datacube_apps.movie_generator:main',\n 'datacube-simple-replica = datacube_apps.simple_replica:replicate [replicas]'\n ],\n 'datacube.plugins.io.read': [\n 'netcdf = datacube.drivers.netcdf.driver:reader_driver_init',\n *extra_plugins['read'],\n ],\n 'datacube.plugins.io.write': [\n 'netcdf = datacube.drivers.netcdf.driver:writer_driver_init',\n *extra_plugins['write'],\n ],\n 'datacube.plugins.index': [\n 'default = datacube.index.index:index_driver_init',\n *extra_plugins['index'],\n ],\n },\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\nimport versioneer\nfrom setuptools import setup, find_packages\nimport os\n\ntests_require = [\n 'compliance-checker',\n 'hypothesis',\n 'mock',\n 'objgraph',\n 'pycodestyle',\n 'pylint',\n 'pytest',\n 'pytest-cov',\n 'pytest-timeout',\n]\n\nextras_require = {\n 'performance': ['ciso8601', 'bottleneck'],\n 'interactive': ['matplotlib', 'fiona'],\n 'distributed': ['distributed', 'dask[distributed]'],\n 'doc': ['Sphinx', 'setuptools'],\n 'replicas': ['paramiko', 'sshtunnel', 'tqdm'],\n 'celery': ['celery>=4', 'redis'],\n 's3': ['boto3', 'SharedArray', 'pathos', 'zstandard'],\n 'test': tests_require,\n}\n# An 'all' option, following ipython naming conventions.\nextras_require['all'] = sorted(set(sum(extras_require.values(), [])))\n\nextra_plugins = dict(read=[], write=[], index=[])\n\nif os.name != 'nt':\n extra_plugins['read'].extend([\n 's3aio = datacube.drivers.s3.driver:reader_driver_init [s3]',\n 's3aio_test = datacube.drivers.s3.driver:reader_test_driver_init [s3]',\n ])\n extra_plugins['write'].extend([\n 's3aio = datacube.drivers.s3.driver:writer_driver_init [s3]',\n 's3aio_test = datacube.drivers.s3.driver:writer_test_driver_init [s3]',\n ])\n\n extra_plugins['index'].extend([\n 's3aio_index = datacube.drivers.s3aio_index:index_driver_init [s3]',\n ])\n\nsetup(\n name='datacube',\n version=versioneer.get_version(),\n cmdclass=versioneer.get_cmdclass(),\n python_requires='>=3.5.2',\n\n url='https://github.com/opendatacube/datacube-core',\n author='Open Data Cube',\n maintainer='Open Data Cube',\n maintainer_email='',\n description='An analysis environment for satellite and other earth observation data',\n long_description=open('README.rst').read(),\n license='Apache License 2.0',\n classifiers=[\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Natural Language :: English\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Operating System :: POSIX\",\n \"Operating System :: POSIX :: BSD\",\n \"Operating System :: POSIX :: Linux\",\n \"Operating System :: Microsoft :: Windows\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python 
:: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Topic :: Scientific/Engineering :: GIS\",\n \"Topic :: Scientific/Engineering :: Information Analysis\",\n ],\n\n packages=find_packages(\n exclude=('tests', 'tests.*',\n 'integration_tests', 'integration_tests.*')\n ),\n package_data={\n '': ['*.yaml', '*/*.yaml'],\n },\n scripts=[\n 'datacube_apps/scripts/pbs_helpers.sh'\n ],\n setup_requires=[\n 'pytest-runner'\n ],\n install_requires=[\n 'affine',\n 'cachetools',\n 'click>=5.0',\n 'cloudpickle>=0.4',\n 'dask[array]',\n 'gdal>=1.9',\n 'jsonschema',\n 'netcdf4',\n 'numpy',\n 'psycopg2',\n 'pypeg2',\n 'python-dateutil',\n 'pyyaml',\n 'rasterio>=1.0.2', # Multi-band re-project fixed in that version\n 'singledispatch',\n 'sqlalchemy',\n 'toolz',\n 'xarray>=0.9', # >0.9 fixes most problems with `crs` attributes being lost\n ],\n extras_require=extras_require,\n tests_require=tests_require,\n\n entry_points={\n 'console_scripts': [\n 'datacube = datacube.scripts.cli_app:cli',\n 'datacube-search = datacube.scripts.search_tool:cli',\n 'datacube-stacker = datacube_apps.stacker:main',\n 'datacube-worker = datacube.execution.worker:main',\n 'datacube-fixer = datacube_apps.stacker:fixer_main',\n 'datacube-ncml = datacube_apps.ncml:ncml_app',\n 'pixeldrill = datacube_apps.pixeldrill:main [interactive]',\n 'movie_generator = datacube_apps.movie_generator:main',\n 'datacube-simple-replica = datacube_apps.simple_replica:replicate [replicas]'\n ],\n 'datacube.plugins.io.read': [\n 'netcdf = datacube.drivers.netcdf.driver:reader_driver_init',\n *extra_plugins['read'],\n ],\n 'datacube.plugins.io.write': [\n 'netcdf = datacube.drivers.netcdf.driver:writer_driver_init',\n *extra_plugins['write'],\n ],\n 'datacube.plugins.index': [\n 'default = datacube.index.index:index_driver_init',\n *extra_plugins['index'],\n ],\n },\n)\n", "path": "setup.py"}]} | 1,825 | 104 |
gh_patches_debug_7604 | rasdani/github-patches | git_diff | robocorp__rpaframework-617 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`RPA.Windows.Get Elements` not copying the `robocorp_click_offset` attribute
Should make available this attribute in all siblings found elements, otherwise the `Click` keyword will fail when using offsets:
<img width="841" alt="Screen Shot 2022-08-26 at 4 48 38 PM" src="https://user-images.githubusercontent.com/709053/187146331-32d1ea86-7da7-4afe-b8ec-ea9b5289db70.png">
<img width="1391" alt="Screen Shot 2022-08-26 at 4 48 20 PM" src="https://user-images.githubusercontent.com/709053/187146342-b6ae653f-0ae7-4870-bdc9-edba1ef83533.png">
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `packages/windows/src/RPA/Windows/keywords/locators.py`
Content:
```
1 from typing import List, Optional
2
3 from RPA.core.windows.locators import (
4 Locator,
5 LocatorMethods,
6 WindowsElement,
7 )
8
9 from RPA.Windows.keywords import keyword
10 from RPA.Windows.keywords.context import with_timeout
11
12
13 class LocatorKeywords(LocatorMethods):
14 """Keywords for handling Windows locators"""
15
16 @keyword
17 @with_timeout
18 def get_element(
19 self,
20 locator: Optional[Locator] = None,
21 search_depth: int = 8,
22 root_element: Optional[WindowsElement] = None,
23 timeout: Optional[float] = None, # pylint: disable=unused-argument
24 ) -> WindowsElement:
25 """Get Control element defined by the locator.
26
27 Returned element can be used instead of a locator string for
28 keywords accepting `locator`.
29
30 Keyword ``Get Attribute`` can be used to read element attribute values.
31
32 If `locator` is *None* then returned `element` will be in order of preference:
33
34 1. anchor element if that has been set with `Set Anchor`
35 2. current active window if that has been set with `Control Window`
36 3. final option is the `Desktop`
37
38 :param locator: locator as a string or as an element
39 :param search_depth: how deep the element search will traverse (default 8)
40 :param root_element: can be used to set search root element
41 :param timeout: float value in seconds, see keyword
42 ``Set Global Timeout``
43 :return: WindowsElement object
44
45 Example:
46
47 .. code-block:: robotframework
48
49 ${element} = Get Element name:"RichEdit Control"
50 Set Value ${element} note to myself
51 """
52 # NOTE(cmiN): Timeout is automatically set to `None` in the upper call by the
53 # `with_timeout` decorator, so we alter the behaviour (context timeout
54 # setting) at this level only.
55 return super().get_element(
56 locator=locator,
57 search_depth=search_depth,
58 root_element=root_element,
59 timeout=timeout,
60 )
61
62 @keyword
63 @with_timeout
64 def get_elements(
65 self,
66 locator: Optional[Locator] = None,
67 search_depth: int = 8,
68 root_element: Optional[WindowsElement] = None,
69 timeout: Optional[float] = None, # pylint: disable=unused-argument
70 ) -> List[WindowsElement]:
71 """Get list of elements matching locator.
72
73 :param locator: locator as a string or as an element
74 :param search_depth: how deep the element search will traverse (default 8)
75 :param root_element: can be used to set search root element
76 :param timeout: float value in seconds, see keyword
77 ``Set Global Timeout``
78 :return: list of WindowsElement objects
79
80 Example:
81
82 .. code-block:: robotframework
83
84 Set Anchor id:DataGrid
85 ${elements}= Get Elements type:HeaderItem
86 FOR ${el} IN @{elements}
87 Log To Console ${el.Name}
88 END
89 """
90 elements = []
91 initial_element = element = self.get_element(
92 locator, search_depth, root_element
93 )
94 elements.append(initial_element)
95 while True:
96 next_control = element.item.GetNextSiblingControl()
97 if next_control:
98 element = WindowsElement(next_control, locator)
99 if initial_element.is_sibling(element):
100 elements.append(element)
101 else:
102 break
103 return elements
104
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/packages/windows/src/RPA/Windows/keywords/locators.py b/packages/windows/src/RPA/Windows/keywords/locators.py
--- a/packages/windows/src/RPA/Windows/keywords/locators.py
+++ b/packages/windows/src/RPA/Windows/keywords/locators.py
@@ -97,6 +97,9 @@
if next_control:
element = WindowsElement(next_control, locator)
if initial_element.is_sibling(element):
+ element.item.robocorp_click_offset = (
+ initial_element.item.robocorp_click_offset
+ )
elements.append(element)
else:
break
| {"golden_diff": "diff --git a/packages/windows/src/RPA/Windows/keywords/locators.py b/packages/windows/src/RPA/Windows/keywords/locators.py\n--- a/packages/windows/src/RPA/Windows/keywords/locators.py\n+++ b/packages/windows/src/RPA/Windows/keywords/locators.py\n@@ -97,6 +97,9 @@\n if next_control:\n element = WindowsElement(next_control, locator)\n if initial_element.is_sibling(element):\n+ element.item.robocorp_click_offset = (\n+ initial_element.item.robocorp_click_offset\n+ )\n elements.append(element)\n else:\n break\n", "issue": "`RPA.Windows.Get Elements` not copying the `robocorp_click_offset` attribute\nShould make available this attribute in all siblings found elements, otherwise the `Click` keyword will fail when using offsets:\r\n\r\n<img width=\"841\" alt=\"Screen Shot 2022-08-26 at 4 48 38 PM\" src=\"https://user-images.githubusercontent.com/709053/187146331-32d1ea86-7da7-4afe-b8ec-ea9b5289db70.png\">\r\n<img width=\"1391\" alt=\"Screen Shot 2022-08-26 at 4 48 20 PM\" src=\"https://user-images.githubusercontent.com/709053/187146342-b6ae653f-0ae7-4870-bdc9-edba1ef83533.png\">\r\n\n", "before_files": [{"content": "from typing import List, Optional\n\nfrom RPA.core.windows.locators import (\n Locator,\n LocatorMethods,\n WindowsElement,\n)\n\nfrom RPA.Windows.keywords import keyword\nfrom RPA.Windows.keywords.context import with_timeout\n\n\nclass LocatorKeywords(LocatorMethods):\n \"\"\"Keywords for handling Windows locators\"\"\"\n\n @keyword\n @with_timeout\n def get_element(\n self,\n locator: Optional[Locator] = None,\n search_depth: int = 8,\n root_element: Optional[WindowsElement] = None,\n timeout: Optional[float] = None, # pylint: disable=unused-argument\n ) -> WindowsElement:\n \"\"\"Get Control element defined by the locator.\n\n Returned element can be used instead of a locator string for\n keywords accepting `locator`.\n\n Keyword ``Get Attribute`` can be used to read element attribute values.\n\n If `locator` is *None* then returned `element` will be in order of preference:\n\n 1. anchor element if that has been set with `Set Anchor`\n 2. current active window if that has been set with `Control Window`\n 3. final option is the `Desktop`\n\n :param locator: locator as a string or as an element\n :param search_depth: how deep the element search will traverse (default 8)\n :param root_element: can be used to set search root element\n :param timeout: float value in seconds, see keyword\n ``Set Global Timeout``\n :return: WindowsElement object\n\n Example:\n\n .. 
code-block:: robotframework\n\n ${element} = Get Element name:\"RichEdit Control\"\n Set Value ${element} note to myself\n \"\"\"\n # NOTE(cmiN): Timeout is automatically set to `None` in the upper call by the\n # `with_timeout` decorator, so we alter the behaviour (context timeout\n # setting) at this level only.\n return super().get_element(\n locator=locator,\n search_depth=search_depth,\n root_element=root_element,\n timeout=timeout,\n )\n\n @keyword\n @with_timeout\n def get_elements(\n self,\n locator: Optional[Locator] = None,\n search_depth: int = 8,\n root_element: Optional[WindowsElement] = None,\n timeout: Optional[float] = None, # pylint: disable=unused-argument\n ) -> List[WindowsElement]:\n \"\"\"Get list of elements matching locator.\n\n :param locator: locator as a string or as an element\n :param search_depth: how deep the element search will traverse (default 8)\n :param root_element: can be used to set search root element\n :param timeout: float value in seconds, see keyword\n ``Set Global Timeout``\n :return: list of WindowsElement objects\n\n Example:\n\n .. code-block:: robotframework\n\n Set Anchor id:DataGrid\n ${elements}= Get Elements type:HeaderItem\n FOR ${el} IN @{elements}\n Log To Console ${el.Name}\n END\n \"\"\"\n elements = []\n initial_element = element = self.get_element(\n locator, search_depth, root_element\n )\n elements.append(initial_element)\n while True:\n next_control = element.item.GetNextSiblingControl()\n if next_control:\n element = WindowsElement(next_control, locator)\n if initial_element.is_sibling(element):\n elements.append(element)\n else:\n break\n return elements\n", "path": "packages/windows/src/RPA/Windows/keywords/locators.py"}], "after_files": [{"content": "from typing import List, Optional\n\nfrom RPA.core.windows.locators import (\n Locator,\n LocatorMethods,\n WindowsElement,\n)\n\nfrom RPA.Windows.keywords import keyword\nfrom RPA.Windows.keywords.context import with_timeout\n\n\nclass LocatorKeywords(LocatorMethods):\n \"\"\"Keywords for handling Windows locators\"\"\"\n\n @keyword\n @with_timeout\n def get_element(\n self,\n locator: Optional[Locator] = None,\n search_depth: int = 8,\n root_element: Optional[WindowsElement] = None,\n timeout: Optional[float] = None, # pylint: disable=unused-argument\n ) -> WindowsElement:\n \"\"\"Get Control element defined by the locator.\n\n Returned element can be used instead of a locator string for\n keywords accepting `locator`.\n\n Keyword ``Get Attribute`` can be used to read element attribute values.\n\n If `locator` is *None* then returned `element` will be in order of preference:\n\n 1. anchor element if that has been set with `Set Anchor`\n 2. current active window if that has been set with `Control Window`\n 3. final option is the `Desktop`\n\n :param locator: locator as a string or as an element\n :param search_depth: how deep the element search will traverse (default 8)\n :param root_element: can be used to set search root element\n :param timeout: float value in seconds, see keyword\n ``Set Global Timeout``\n :return: WindowsElement object\n\n Example:\n\n .. 
code-block:: robotframework\n\n ${element} = Get Element name:\"RichEdit Control\"\n Set Value ${element} note to myself\n \"\"\"\n # NOTE(cmiN): Timeout is automatically set to `None` in the upper call by the\n # `with_timeout` decorator, so we alter the behaviour (context timeout\n # setting) at this level only.\n return super().get_element(\n locator=locator,\n search_depth=search_depth,\n root_element=root_element,\n timeout=timeout,\n )\n\n @keyword\n @with_timeout\n def get_elements(\n self,\n locator: Optional[Locator] = None,\n search_depth: int = 8,\n root_element: Optional[WindowsElement] = None,\n timeout: Optional[float] = None, # pylint: disable=unused-argument\n ) -> List[WindowsElement]:\n \"\"\"Get list of elements matching locator.\n\n :param locator: locator as a string or as an element\n :param search_depth: how deep the element search will traverse (default 8)\n :param root_element: can be used to set search root element\n :param timeout: float value in seconds, see keyword\n ``Set Global Timeout``\n :return: list of WindowsElement objects\n\n Example:\n\n .. code-block:: robotframework\n\n Set Anchor id:DataGrid\n ${elements}= Get Elements type:HeaderItem\n FOR ${el} IN @{elements}\n Log To Console ${el.Name}\n END\n \"\"\"\n elements = []\n initial_element = element = self.get_element(\n locator, search_depth, root_element\n )\n elements.append(initial_element)\n while True:\n next_control = element.item.GetNextSiblingControl()\n if next_control:\n element = WindowsElement(next_control, locator)\n if initial_element.is_sibling(element):\n element.item.robocorp_click_offset = (\n initial_element.item.robocorp_click_offset\n )\n elements.append(element)\n else:\n break\n return elements\n", "path": "packages/windows/src/RPA/Windows/keywords/locators.py"}]} | 1,453 | 137 |
gh_patches_debug_29378 | rasdani/github-patches | git_diff | bridgecrewio__checkov-5018 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
CKV_AWS_338 does not account for 0 (keep forever) & has insufficient error message
**Describe the issue**
CKV_AWS_338 is a check for the number of retention days for a Cloudwatch log group. However, the actual check source code requires not just being specified, but also to be at least 1 year.
1 - The message "Ensure that CloudWatch Log Group specifies retention days" does not reflect the need to not only be specified, but also be of sufficient length (or provide ref to web page providing more detail on the check)
2 - 0 is a valid value and means that logs are always retained and never expire, which should result in passing the "must retain for at least 1 year" rule.
**Examples**
Please share an example code sample (in the IaC of your choice) + the expected outcomes.
**Version (please complete the following information):**
2.3.212
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/aws/CloudWatchLogGroupRetentionYear.py`
Content:
```
1 from __future__ import annotations
2
3 from typing import Any
4
5 from checkov.common.models.enums import CheckCategories, CheckResult
6 from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
7
8
9 class CloudWatchLogGroupRetentionYear(BaseResourceCheck):
10 def __init__(self) -> None:
11 """
12 NIST.800-53.r5 AU-10, NIST.800-53.r5 AU-11, NIST.800-53.r5 AU-6(3), NIST.800-53.r5 AU-6(4),
13 NIST.800-53.r5 CA-7, NIST.800-53.r5 SI-12
14 CloudWatch log groups should be retained for at least 1 year
15 """
16 name = "Ensure that CloudWatch Log Group specifies retention days"
17 id = "CKV_AWS_338"
18 supported_resource = ("aws_cloudwatch_log_group",)
19 categories = (CheckCategories.LOGGING,)
20 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resource)
21
22 def scan_resource_conf(self, conf: dict[str, list[Any]]) -> CheckResult:
23 retention = conf.get("retention_in_days")
24 if retention and isinstance(retention, list):
25 retention = retention[0]
26 if not isinstance(retention, int):
27 # probably a dependent variable
28 return CheckResult.UNKNOWN
29
30 if retention >= 365:
31 return CheckResult.PASSED
32
33 return CheckResult.FAILED
34
35
36 check = CloudWatchLogGroupRetentionYear()
37
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/checkov/terraform/checks/resource/aws/CloudWatchLogGroupRetentionYear.py b/checkov/terraform/checks/resource/aws/CloudWatchLogGroupRetentionYear.py
--- a/checkov/terraform/checks/resource/aws/CloudWatchLogGroupRetentionYear.py
+++ b/checkov/terraform/checks/resource/aws/CloudWatchLogGroupRetentionYear.py
@@ -3,7 +3,8 @@
from typing import Any
from checkov.common.models.enums import CheckCategories, CheckResult
-from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
+from checkov.terraform.checks.resource.base_resource_check import \
+ BaseResourceCheck
class CloudWatchLogGroupRetentionYear(BaseResourceCheck):
@@ -13,7 +14,7 @@
NIST.800-53.r5 CA-7, NIST.800-53.r5 SI-12
CloudWatch log groups should be retained for at least 1 year
"""
- name = "Ensure that CloudWatch Log Group specifies retention days"
+ name = "Ensure CloudWatch log groups retains logs for at least 1 year"
id = "CKV_AWS_338"
supported_resource = ("aws_cloudwatch_log_group",)
categories = (CheckCategories.LOGGING,)
@@ -26,8 +27,8 @@
if not isinstance(retention, int):
# probably a dependent variable
return CheckResult.UNKNOWN
-
- if retention >= 365:
+ # If you select 0, the events in the log group are always retained and never expire.
+ if retention == 0 or retention >= 365:
return CheckResult.PASSED
return CheckResult.FAILED
| {"golden_diff": "diff --git a/checkov/terraform/checks/resource/aws/CloudWatchLogGroupRetentionYear.py b/checkov/terraform/checks/resource/aws/CloudWatchLogGroupRetentionYear.py\n--- a/checkov/terraform/checks/resource/aws/CloudWatchLogGroupRetentionYear.py\n+++ b/checkov/terraform/checks/resource/aws/CloudWatchLogGroupRetentionYear.py\n@@ -3,7 +3,8 @@\n from typing import Any\n \n from checkov.common.models.enums import CheckCategories, CheckResult\n-from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n+from checkov.terraform.checks.resource.base_resource_check import \\\n+ BaseResourceCheck\n \n \n class CloudWatchLogGroupRetentionYear(BaseResourceCheck):\n@@ -13,7 +14,7 @@\n NIST.800-53.r5 CA-7, NIST.800-53.r5 SI-12\n CloudWatch log groups should be retained for at least 1 year\n \"\"\"\n- name = \"Ensure that CloudWatch Log Group specifies retention days\"\n+ name = \"Ensure CloudWatch log groups retains logs for at least 1 year\"\n id = \"CKV_AWS_338\"\n supported_resource = (\"aws_cloudwatch_log_group\",)\n categories = (CheckCategories.LOGGING,)\n@@ -26,8 +27,8 @@\n if not isinstance(retention, int):\n # probably a dependent variable\n return CheckResult.UNKNOWN\n-\n- if retention >= 365:\n+ # If you select 0, the events in the log group are always retained and never expire.\n+ if retention == 0 or retention >= 365:\n return CheckResult.PASSED\n \n return CheckResult.FAILED\n", "issue": "CKV_AWS_338 does not account for 0 (keep forever) & has insufficient error message\n**Describe the issue**\r\nCKV_AWS_338 is a check for the number of retention days for a Cloudwatch log group. However, the actual check source code requires not just being specified, but also to be at least 1 year.\r\n\r\n1 - The message \"Ensure that CloudWatch Log Group specifies retention days\" does not reflect the need to not only be specified, but also be of sufficient length (or provide ref to web page providing more detail on the check)\r\n\r\n2 - 0 is a valid value and means that logs are always retained and never expire, which should result in passing the \"must retain for at least 1 year\" rule.\r\n\r\n**Examples**\r\nPlease share an example code sample (in the IaC of your choice) + the expected outcomes.\r\n\r\n**Version (please complete the following information):**\r\n2.3.212\n", "before_files": [{"content": "from __future__ import annotations\n\nfrom typing import Any\n\nfrom checkov.common.models.enums import CheckCategories, CheckResult\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\n\n\nclass CloudWatchLogGroupRetentionYear(BaseResourceCheck):\n def __init__(self) -> None:\n \"\"\"\n NIST.800-53.r5 AU-10, NIST.800-53.r5 AU-11, NIST.800-53.r5 AU-6(3), NIST.800-53.r5 AU-6(4),\n NIST.800-53.r5 CA-7, NIST.800-53.r5 SI-12\n CloudWatch log groups should be retained for at least 1 year\n \"\"\"\n name = \"Ensure that CloudWatch Log Group specifies retention days\"\n id = \"CKV_AWS_338\"\n supported_resource = (\"aws_cloudwatch_log_group\",)\n categories = (CheckCategories.LOGGING,)\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resource)\n\n def scan_resource_conf(self, conf: dict[str, list[Any]]) -> CheckResult:\n retention = conf.get(\"retention_in_days\")\n if retention and isinstance(retention, list):\n retention = retention[0]\n if not isinstance(retention, int):\n # probably a dependent variable\n return CheckResult.UNKNOWN\n\n if retention >= 365:\n return CheckResult.PASSED\n\n return 
CheckResult.FAILED\n\n\ncheck = CloudWatchLogGroupRetentionYear()\n", "path": "checkov/terraform/checks/resource/aws/CloudWatchLogGroupRetentionYear.py"}], "after_files": [{"content": "from __future__ import annotations\n\nfrom typing import Any\n\nfrom checkov.common.models.enums import CheckCategories, CheckResult\nfrom checkov.terraform.checks.resource.base_resource_check import \\\n BaseResourceCheck\n\n\nclass CloudWatchLogGroupRetentionYear(BaseResourceCheck):\n def __init__(self) -> None:\n \"\"\"\n NIST.800-53.r5 AU-10, NIST.800-53.r5 AU-11, NIST.800-53.r5 AU-6(3), NIST.800-53.r5 AU-6(4),\n NIST.800-53.r5 CA-7, NIST.800-53.r5 SI-12\n CloudWatch log groups should be retained for at least 1 year\n \"\"\"\n name = \"Ensure CloudWatch log groups retains logs for at least 1 year\"\n id = \"CKV_AWS_338\"\n supported_resource = (\"aws_cloudwatch_log_group\",)\n categories = (CheckCategories.LOGGING,)\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resource)\n\n def scan_resource_conf(self, conf: dict[str, list[Any]]) -> CheckResult:\n retention = conf.get(\"retention_in_days\")\n if retention and isinstance(retention, list):\n retention = retention[0]\n if not isinstance(retention, int):\n # probably a dependent variable\n return CheckResult.UNKNOWN\n # If you select 0, the events in the log group are always retained and never expire.\n if retention == 0 or retention >= 365:\n return CheckResult.PASSED\n\n return CheckResult.FAILED\n\n\ncheck = CloudWatchLogGroupRetentionYear()\n", "path": "checkov/terraform/checks/resource/aws/CloudWatchLogGroupRetentionYear.py"}]} | 897 | 379 |
gh_patches_debug_35330 | rasdani/github-patches | git_diff | iterative__dvc-1531 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
daemon: address dvc directly
Our daemon is using `sys.executable -m dvc` to launch a new dvc process, but a more proper way would be to address dvc directly, by dynamically computing its location ourselves from `__file__`. See discussion https://github.com/iterative/dvc/issues/1509#issuecomment-457415307 .
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `dvc/daemon.py`
Content:
```
1 """Launch `dvc daemon` command in a separate detached process."""
2
3 import os
4 import sys
5 from subprocess import Popen
6
7 import dvc.logger as logger
8 from dvc.utils import is_binary, fix_env
9
10
11 CREATE_NEW_PROCESS_GROUP = 0x00000200
12 DETACHED_PROCESS = 0x00000008
13
14
15 def _spawn_windows(cmd):
16 from subprocess import STARTUPINFO, STARTF_USESHOWWINDOW
17
18 creationflags = CREATE_NEW_PROCESS_GROUP | DETACHED_PROCESS
19
20 startupinfo = STARTUPINFO()
21 startupinfo.dwFlags |= STARTF_USESHOWWINDOW
22
23 Popen(cmd,
24 env=fix_env(),
25 close_fds=True,
26 shell=False,
27 creationflags=creationflags,
28 startupinfo=startupinfo).communicate()
29
30
31 def _spawn_posix(cmd):
32 # NOTE: using os._exit instead of sys.exit, because dvc built
33 # with PyInstaller has trouble with SystemExit exeption and throws
34 # errors such as "[26338] Failed to execute script __main__"
35 try:
36 pid = os.fork()
37 if pid > 0:
38 return
39 except OSError:
40 logger.error("failed at first fork")
41 os._exit(1) # pylint: disable=protected-access
42
43 os.setsid()
44 os.umask(0)
45
46 try:
47 pid = os.fork()
48 if pid > 0:
49 os._exit(0) # pylint: disable=protected-access
50 except OSError:
51 logger.error("failed at second fork")
52 os._exit(1) # pylint: disable=protected-access
53
54 sys.stdin.close()
55 sys.stdout.close()
56 sys.stderr.close()
57
58 Popen(cmd, env=fix_env(), close_fds=True, shell=False).communicate()
59
60 os._exit(0) # pylint: disable=protected-access
61
62
63 def daemon(args):
64 """Launch a `dvc daemon` command in a detached process.
65
66 Args:
67 args (list): list of arguments to append to `dvc daemon` command.
68 """
69 cmd = [sys.executable]
70 if not is_binary():
71 cmd += ['-m', 'dvc']
72 cmd += ['daemon', '-q'] + args
73
74 logger.debug("Trying to spawn '{}'".format(cmd))
75
76 if os.name == 'nt':
77 _spawn_windows(cmd)
78 elif os.name == 'posix':
79 _spawn_posix(cmd)
80 else:
81 raise NotImplementedError
82
83 logger.debug("Spawned '{}'".format(cmd))
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/dvc/daemon.py b/dvc/daemon.py
--- a/dvc/daemon.py
+++ b/dvc/daemon.py
@@ -2,6 +2,7 @@
import os
import sys
+import inspect
from subprocess import Popen
import dvc.logger as logger
@@ -12,7 +13,7 @@
DETACHED_PROCESS = 0x00000008
-def _spawn_windows(cmd):
+def _spawn_windows(cmd, env):
from subprocess import STARTUPINFO, STARTF_USESHOWWINDOW
creationflags = CREATE_NEW_PROCESS_GROUP | DETACHED_PROCESS
@@ -21,14 +22,14 @@
startupinfo.dwFlags |= STARTF_USESHOWWINDOW
Popen(cmd,
- env=fix_env(),
+ env=env,
close_fds=True,
shell=False,
creationflags=creationflags,
startupinfo=startupinfo).communicate()
-def _spawn_posix(cmd):
+def _spawn_posix(cmd, env):
# NOTE: using os._exit instead of sys.exit, because dvc built
# with PyInstaller has trouble with SystemExit exeption and throws
# errors such as "[26338] Failed to execute script __main__"
@@ -55,7 +56,7 @@
sys.stdout.close()
sys.stderr.close()
- Popen(cmd, env=fix_env(), close_fds=True, shell=False).communicate()
+ Popen(cmd, env=env, close_fds=True, shell=False).communicate()
os._exit(0) # pylint: disable=protected-access
@@ -71,12 +72,16 @@
cmd += ['-m', 'dvc']
cmd += ['daemon', '-q'] + args
- logger.debug("Trying to spawn '{}'".format(cmd))
+ env = fix_env()
+ file_path = os.path.abspath(inspect.stack()[0][1])
+ env['PYTHONPATH'] = os.path.dirname(os.path.dirname(file_path))
+
+ logger.debug("Trying to spawn '{}' with env '{}'".format(cmd, env))
if os.name == 'nt':
- _spawn_windows(cmd)
+ _spawn_windows(cmd, env)
elif os.name == 'posix':
- _spawn_posix(cmd)
+ _spawn_posix(cmd, env)
else:
raise NotImplementedError
| {"golden_diff": "diff --git a/dvc/daemon.py b/dvc/daemon.py\n--- a/dvc/daemon.py\n+++ b/dvc/daemon.py\n@@ -2,6 +2,7 @@\n \n import os\n import sys\n+import inspect\n from subprocess import Popen\n \n import dvc.logger as logger\n@@ -12,7 +13,7 @@\n DETACHED_PROCESS = 0x00000008\n \n \n-def _spawn_windows(cmd):\n+def _spawn_windows(cmd, env):\n from subprocess import STARTUPINFO, STARTF_USESHOWWINDOW\n \n creationflags = CREATE_NEW_PROCESS_GROUP | DETACHED_PROCESS\n@@ -21,14 +22,14 @@\n startupinfo.dwFlags |= STARTF_USESHOWWINDOW\n \n Popen(cmd,\n- env=fix_env(),\n+ env=env,\n close_fds=True,\n shell=False,\n creationflags=creationflags,\n startupinfo=startupinfo).communicate()\n \n \n-def _spawn_posix(cmd):\n+def _spawn_posix(cmd, env):\n # NOTE: using os._exit instead of sys.exit, because dvc built\n # with PyInstaller has trouble with SystemExit exeption and throws\n # errors such as \"[26338] Failed to execute script __main__\"\n@@ -55,7 +56,7 @@\n sys.stdout.close()\n sys.stderr.close()\n \n- Popen(cmd, env=fix_env(), close_fds=True, shell=False).communicate()\n+ Popen(cmd, env=env, close_fds=True, shell=False).communicate()\n \n os._exit(0) # pylint: disable=protected-access\n \n@@ -71,12 +72,16 @@\n cmd += ['-m', 'dvc']\n cmd += ['daemon', '-q'] + args\n \n- logger.debug(\"Trying to spawn '{}'\".format(cmd))\n+ env = fix_env()\n+ file_path = os.path.abspath(inspect.stack()[0][1])\n+ env['PYTHONPATH'] = os.path.dirname(os.path.dirname(file_path))\n+\n+ logger.debug(\"Trying to spawn '{}' with env '{}'\".format(cmd, env))\n \n if os.name == 'nt':\n- _spawn_windows(cmd)\n+ _spawn_windows(cmd, env)\n elif os.name == 'posix':\n- _spawn_posix(cmd)\n+ _spawn_posix(cmd, env)\n else:\n raise NotImplementedError\n", "issue": "daemon: address dvc directly\nOur daemon is using `sys.executable -m dvc` to launch a new dvc process, but a more proper way would be to address dvc directly, by dynamically computing its location ourselves from `__file__`. 
See discussion https://github.com/iterative/dvc/issues/1509#issuecomment-457415307 .\n", "before_files": [{"content": "\"\"\"Launch `dvc daemon` command in a separate detached process.\"\"\"\n\nimport os\nimport sys\nfrom subprocess import Popen\n\nimport dvc.logger as logger\nfrom dvc.utils import is_binary, fix_env\n\n\nCREATE_NEW_PROCESS_GROUP = 0x00000200\nDETACHED_PROCESS = 0x00000008\n\n\ndef _spawn_windows(cmd):\n from subprocess import STARTUPINFO, STARTF_USESHOWWINDOW\n\n creationflags = CREATE_NEW_PROCESS_GROUP | DETACHED_PROCESS\n\n startupinfo = STARTUPINFO()\n startupinfo.dwFlags |= STARTF_USESHOWWINDOW\n\n Popen(cmd,\n env=fix_env(),\n close_fds=True,\n shell=False,\n creationflags=creationflags,\n startupinfo=startupinfo).communicate()\n\n\ndef _spawn_posix(cmd):\n # NOTE: using os._exit instead of sys.exit, because dvc built\n # with PyInstaller has trouble with SystemExit exeption and throws\n # errors such as \"[26338] Failed to execute script __main__\"\n try:\n pid = os.fork()\n if pid > 0:\n return\n except OSError:\n logger.error(\"failed at first fork\")\n os._exit(1) # pylint: disable=protected-access\n\n os.setsid()\n os.umask(0)\n\n try:\n pid = os.fork()\n if pid > 0:\n os._exit(0) # pylint: disable=protected-access\n except OSError:\n logger.error(\"failed at second fork\")\n os._exit(1) # pylint: disable=protected-access\n\n sys.stdin.close()\n sys.stdout.close()\n sys.stderr.close()\n\n Popen(cmd, env=fix_env(), close_fds=True, shell=False).communicate()\n\n os._exit(0) # pylint: disable=protected-access\n\n\ndef daemon(args):\n \"\"\"Launch a `dvc daemon` command in a detached process.\n\n Args:\n args (list): list of arguments to append to `dvc daemon` command.\n \"\"\"\n cmd = [sys.executable]\n if not is_binary():\n cmd += ['-m', 'dvc']\n cmd += ['daemon', '-q'] + args\n\n logger.debug(\"Trying to spawn '{}'\".format(cmd))\n\n if os.name == 'nt':\n _spawn_windows(cmd)\n elif os.name == 'posix':\n _spawn_posix(cmd)\n else:\n raise NotImplementedError\n\n logger.debug(\"Spawned '{}'\".format(cmd))\n", "path": "dvc/daemon.py"}], "after_files": [{"content": "\"\"\"Launch `dvc daemon` command in a separate detached process.\"\"\"\n\nimport os\nimport sys\nimport inspect\nfrom subprocess import Popen\n\nimport dvc.logger as logger\nfrom dvc.utils import is_binary, fix_env\n\n\nCREATE_NEW_PROCESS_GROUP = 0x00000200\nDETACHED_PROCESS = 0x00000008\n\n\ndef _spawn_windows(cmd, env):\n from subprocess import STARTUPINFO, STARTF_USESHOWWINDOW\n\n creationflags = CREATE_NEW_PROCESS_GROUP | DETACHED_PROCESS\n\n startupinfo = STARTUPINFO()\n startupinfo.dwFlags |= STARTF_USESHOWWINDOW\n\n Popen(cmd,\n env=env,\n close_fds=True,\n shell=False,\n creationflags=creationflags,\n startupinfo=startupinfo).communicate()\n\n\ndef _spawn_posix(cmd, env):\n # NOTE: using os._exit instead of sys.exit, because dvc built\n # with PyInstaller has trouble with SystemExit exeption and throws\n # errors such as \"[26338] Failed to execute script __main__\"\n try:\n pid = os.fork()\n if pid > 0:\n return\n except OSError:\n logger.error(\"failed at first fork\")\n os._exit(1) # pylint: disable=protected-access\n\n os.setsid()\n os.umask(0)\n\n try:\n pid = os.fork()\n if pid > 0:\n os._exit(0) # pylint: disable=protected-access\n except OSError:\n logger.error(\"failed at second fork\")\n os._exit(1) # pylint: disable=protected-access\n\n sys.stdin.close()\n sys.stdout.close()\n sys.stderr.close()\n\n Popen(cmd, env=env, close_fds=True, shell=False).communicate()\n\n os._exit(0) # pylint: 
disable=protected-access\n\n\ndef daemon(args):\n \"\"\"Launch a `dvc daemon` command in a detached process.\n\n Args:\n args (list): list of arguments to append to `dvc daemon` command.\n \"\"\"\n cmd = [sys.executable]\n if not is_binary():\n cmd += ['-m', 'dvc']\n cmd += ['daemon', '-q'] + args\n\n env = fix_env()\n file_path = os.path.abspath(inspect.stack()[0][1])\n env['PYTHONPATH'] = os.path.dirname(os.path.dirname(file_path))\n\n logger.debug(\"Trying to spawn '{}' with env '{}'\".format(cmd, env))\n\n if os.name == 'nt':\n _spawn_windows(cmd, env)\n elif os.name == 'posix':\n _spawn_posix(cmd, env)\n else:\n raise NotImplementedError\n\n logger.debug(\"Spawned '{}'\".format(cmd))\n", "path": "dvc/daemon.py"}]} | 1,062 | 525 |
gh_patches_debug_27533 | rasdani/github-patches | git_diff | akvo__akvo-rsr-5271 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Feature Request: Document the task management
### What are you trying to do?
Understand how the task management functions
### Describe the solution you'd like
A technical documentation of how it works.
### Have you consider alternatives?
_No response_
### Additional context
_No response_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `akvo/rsr/management/commands/django_q_probettp.py`
Content:
```
1 #!/usr/bin/env python3
2 # -*- coding: utf-8 -*-
3
4 # Akvo Reporting is covered by the GNU Affero General Public License.
5 # See more details in the license.txt file located at the root folder of the Akvo RSR module.
6 # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
7
8 """
9 Provides a localhost HTTP server to query the local status of the django-q cluster
10 """
11 import logging
12 import signal
13 import socket
14 from http.server import BaseHTTPRequestHandler, HTTPServer
15
16 from django.core.management.base import BaseCommand
17 from django_q.conf import Conf
18 from django_q.status import Stat
19
20 logger = logging.getLogger(__name__)
21
22
23 class Command(BaseCommand):
24 help = __doc__
25
26 def handle(self, *args, **options):
27 server = HTTPServer(("localhost", 8080), DjangoQRequestHandler)
28
29 def handle_end(*_):
30 logger.info("Stopping server")
31 server.shutdown()
32
33 signal.signal(signal.SIGINT, handle_end)
34 signal.signal(signal.SIGTERM, handle_end)
35
36 logger.info("Starting server...")
37 server.serve_forever()
38
39
40 class DjangoQRequestHandler(BaseHTTPRequestHandler):
41
42 def do_GET(self):
43 hostname = socket.gethostname()
44
45 # Find local cluster
46 local_stat = next(iter(stat for stat in Stat.get_all() if stat.host == hostname), None)
47 if local_stat:
48 message = local_stat.status
49 else:
50 message = Conf.STOPPED
51 logger.info(f"Probe response {message}")
52 self.send_response(200)
53 self.end_headers()
54 self.wfile.write(message.encode())
55
56 def log_message(self, format: str, *args) -> None:
57 logger.debug(format, *args)
58
```
Path: `akvo/cache/heartbeat.py`
Content:
```
1 import datetime
2 import logging
3 from threading import Event, Thread
4 from typing import Union
5
6 from django.core.cache import cache
7
8
9 class CacheHeartbeat(Thread):
10 """
11 Thread to update set a cache key with a max life and refresh it as long as the thread is alive
12
13 The thread can be ended by setting the `event_end` flag
14 """
15
16 def __init__(self, cache_key: str, key_timeout: float = 30.0, beat_interval: int = 3):
17 """
18 :param cache_key: The cache key to keep alive
19 :param key_timeout: How long the cache key should live without the heartbeat thread
20 :param beat_interval: How often per timeout the key should "beat"
21 """
22 super().__init__()
23 self.cache_key = cache_key
24 self.event_end = Event()
25 self.key_timeout = key_timeout
26 self.beat_interval = beat_interval
27
28 def run(self) -> None:
29 logger = logging.getLogger("akvo.rsr.CacheHeartBeat")
30 logger.info("Starting cache heartbeat for '%s' with timeout %s", self.cache_key, self.key_timeout)
31 self.event_end.clear()
32 while not self.event_end.is_set():
33 # Refresh the heartbeat
34 self.set_cache_value()
35 self.event_end.wait(self.key_timeout / self.beat_interval)
36
37 cache.delete(self.cache_key)
38 logger.info("Ended cache heartbeat for '%s'", self.cache_key)
39
40 def set_cache_value(self):
41 cache.set(self.cache_key, self.get_calc_value(), self.key_timeout)
42
43 def get_calc_value(self) -> Union[str, int, float]:
44 return datetime.datetime.utcnow().timestamp()
45
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/akvo/cache/heartbeat.py b/akvo/cache/heartbeat.py
--- a/akvo/cache/heartbeat.py
+++ b/akvo/cache/heartbeat.py
@@ -8,7 +8,7 @@
class CacheHeartbeat(Thread):
"""
- Thread to update set a cache key with a max life and refresh it as long as the thread is alive
+ Thread to set a cache key with a max life and refresh it as long as the thread is alive
The thread can be ended by setting the `event_end` flag
"""
@@ -38,7 +38,13 @@
logger.info("Ended cache heartbeat for '%s'", self.cache_key)
def set_cache_value(self):
- cache.set(self.cache_key, self.get_calc_value(), self.key_timeout)
+ """
+ Set the cache key and its value
+ """
+ cache.set(self.cache_key, self.calc_cache_value(), self.key_timeout)
- def get_calc_value(self) -> Union[str, int, float]:
+ def calc_cache_value(self) -> Union[str, int, float]:
+ """
+ Calculate a value to be used when setting the cache key
+ """
return datetime.datetime.utcnow().timestamp()
diff --git a/akvo/rsr/management/commands/django_q_probettp.py b/akvo/rsr/management/commands/django_q_probettp.py
--- a/akvo/rsr/management/commands/django_q_probettp.py
+++ b/akvo/rsr/management/commands/django_q_probettp.py
@@ -38,8 +38,14 @@
class DjangoQRequestHandler(BaseHTTPRequestHandler):
+ """
+ A handler to be used with HTTPServer to get the status of the local django-q cluster
+ """
def do_GET(self):
+ """
+ Handle GET requests to return a simple string indicating the status of the django-q cluster
+ """
hostname = socket.gethostname()
# Find local cluster
| {"golden_diff": "diff --git a/akvo/cache/heartbeat.py b/akvo/cache/heartbeat.py\n--- a/akvo/cache/heartbeat.py\n+++ b/akvo/cache/heartbeat.py\n@@ -8,7 +8,7 @@\n \n class CacheHeartbeat(Thread):\n \"\"\"\n- Thread to update set a cache key with a max life and refresh it as long as the thread is alive\n+ Thread to set a cache key with a max life and refresh it as long as the thread is alive\n \n The thread can be ended by setting the `event_end` flag\n \"\"\"\n@@ -38,7 +38,13 @@\n logger.info(\"Ended cache heartbeat for '%s'\", self.cache_key)\n \n def set_cache_value(self):\n- cache.set(self.cache_key, self.get_calc_value(), self.key_timeout)\n+ \"\"\"\n+ Set the cache key and its value\n+ \"\"\"\n+ cache.set(self.cache_key, self.calc_cache_value(), self.key_timeout)\n \n- def get_calc_value(self) -> Union[str, int, float]:\n+ def calc_cache_value(self) -> Union[str, int, float]:\n+ \"\"\"\n+ Calculate a value to be used when setting the cache key\n+ \"\"\"\n return datetime.datetime.utcnow().timestamp()\ndiff --git a/akvo/rsr/management/commands/django_q_probettp.py b/akvo/rsr/management/commands/django_q_probettp.py\n--- a/akvo/rsr/management/commands/django_q_probettp.py\n+++ b/akvo/rsr/management/commands/django_q_probettp.py\n@@ -38,8 +38,14 @@\n \n \n class DjangoQRequestHandler(BaseHTTPRequestHandler):\n+ \"\"\"\n+ A handler to be used with HTTPServer to get the status of the local django-q cluster\n+ \"\"\"\n \n def do_GET(self):\n+ \"\"\"\n+ Handle GET requests to return a simple string indicating the status of the django-q cluster\n+ \"\"\"\n hostname = socket.gethostname()\n \n # Find local cluster\n", "issue": "Feature Request: Document the task management\n### What are you trying to do?\n\nUnderstand how the task management functions\n\n### Describe the solution you'd like\n\nA technical documentation of how it works.\n\n### Have you consider alternatives?\n\n_No response_\n\n### Additional context\n\n_No response_\n", "before_files": [{"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\n# Akvo Reporting is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\n\"\"\"\nProvides a localhost HTTP server to query the local status of the django-q cluster\n\"\"\"\nimport logging\nimport signal\nimport socket\nfrom http.server import BaseHTTPRequestHandler, HTTPServer\n\nfrom django.core.management.base import BaseCommand\nfrom django_q.conf import Conf\nfrom django_q.status import Stat\n\nlogger = logging.getLogger(__name__)\n\n\nclass Command(BaseCommand):\n help = __doc__\n\n def handle(self, *args, **options):\n server = HTTPServer((\"localhost\", 8080), DjangoQRequestHandler)\n\n def handle_end(*_):\n logger.info(\"Stopping server\")\n server.shutdown()\n\n signal.signal(signal.SIGINT, handle_end)\n signal.signal(signal.SIGTERM, handle_end)\n\n logger.info(\"Starting server...\")\n server.serve_forever()\n\n\nclass DjangoQRequestHandler(BaseHTTPRequestHandler):\n\n def do_GET(self):\n hostname = socket.gethostname()\n\n # Find local cluster\n local_stat = next(iter(stat for stat in Stat.get_all() if stat.host == hostname), None)\n if local_stat:\n message = local_stat.status\n else:\n message = Conf.STOPPED\n logger.info(f\"Probe response {message}\")\n self.send_response(200)\n self.end_headers()\n self.wfile.write(message.encode())\n\n def log_message(self, format: str, 
*args) -> None:\n logger.debug(format, *args)\n", "path": "akvo/rsr/management/commands/django_q_probettp.py"}, {"content": "import datetime\nimport logging\nfrom threading import Event, Thread\nfrom typing import Union\n\nfrom django.core.cache import cache\n\n\nclass CacheHeartbeat(Thread):\n \"\"\"\n Thread to update set a cache key with a max life and refresh it as long as the thread is alive\n\n The thread can be ended by setting the `event_end` flag\n \"\"\"\n\n def __init__(self, cache_key: str, key_timeout: float = 30.0, beat_interval: int = 3):\n \"\"\"\n :param cache_key: The cache key to keep alive\n :param key_timeout: How long the cache key should live without the heartbeat thread\n :param beat_interval: How often per timeout the key should \"beat\"\n \"\"\"\n super().__init__()\n self.cache_key = cache_key\n self.event_end = Event()\n self.key_timeout = key_timeout\n self.beat_interval = beat_interval\n\n def run(self) -> None:\n logger = logging.getLogger(\"akvo.rsr.CacheHeartBeat\")\n logger.info(\"Starting cache heartbeat for '%s' with timeout %s\", self.cache_key, self.key_timeout)\n self.event_end.clear()\n while not self.event_end.is_set():\n # Refresh the heartbeat\n self.set_cache_value()\n self.event_end.wait(self.key_timeout / self.beat_interval)\n\n cache.delete(self.cache_key)\n logger.info(\"Ended cache heartbeat for '%s'\", self.cache_key)\n\n def set_cache_value(self):\n cache.set(self.cache_key, self.get_calc_value(), self.key_timeout)\n\n def get_calc_value(self) -> Union[str, int, float]:\n return datetime.datetime.utcnow().timestamp()\n", "path": "akvo/cache/heartbeat.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\n# Akvo Reporting is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\n\"\"\"\nProvides a localhost HTTP server to query the local status of the django-q cluster\n\"\"\"\nimport logging\nimport signal\nimport socket\nfrom http.server import BaseHTTPRequestHandler, HTTPServer\n\nfrom django.core.management.base import BaseCommand\nfrom django_q.conf import Conf\nfrom django_q.status import Stat\n\nlogger = logging.getLogger(__name__)\n\n\nclass Command(BaseCommand):\n help = __doc__\n\n def handle(self, *args, **options):\n server = HTTPServer((\"localhost\", 8080), DjangoQRequestHandler)\n\n def handle_end(*_):\n logger.info(\"Stopping server\")\n server.shutdown()\n\n signal.signal(signal.SIGINT, handle_end)\n signal.signal(signal.SIGTERM, handle_end)\n\n logger.info(\"Starting server...\")\n server.serve_forever()\n\n\nclass DjangoQRequestHandler(BaseHTTPRequestHandler):\n \"\"\"\n A handler to be used with HTTPServer to get the status of the local django-q cluster\n \"\"\"\n\n def do_GET(self):\n \"\"\"\n Handle GET requests to return a simple string indicating the status of the django-q cluster\n \"\"\"\n hostname = socket.gethostname()\n\n # Find local cluster\n local_stat = next(iter(stat for stat in Stat.get_all() if stat.host == hostname), None)\n if local_stat:\n message = local_stat.status\n else:\n message = Conf.STOPPED\n logger.info(f\"Probe response {message}\")\n self.send_response(200)\n self.end_headers()\n self.wfile.write(message.encode())\n\n def log_message(self, format: str, *args) -> None:\n logger.debug(format, *args)\n", "path": "akvo/rsr/management/commands/django_q_probettp.py"}, 
{"content": "import datetime\nimport logging\nfrom threading import Event, Thread\nfrom typing import Union\n\nfrom django.core.cache import cache\n\n\nclass CacheHeartbeat(Thread):\n \"\"\"\n Thread to set a cache key with a max life and refresh it as long as the thread is alive\n\n The thread can be ended by setting the `event_end` flag\n \"\"\"\n\n def __init__(self, cache_key: str, key_timeout: float = 30.0, beat_interval: int = 3):\n \"\"\"\n :param cache_key: The cache key to keep alive\n :param key_timeout: How long the cache key should live without the heartbeat thread\n :param beat_interval: How often per timeout the key should \"beat\"\n \"\"\"\n super().__init__()\n self.cache_key = cache_key\n self.event_end = Event()\n self.key_timeout = key_timeout\n self.beat_interval = beat_interval\n\n def run(self) -> None:\n logger = logging.getLogger(\"akvo.rsr.CacheHeartBeat\")\n logger.info(\"Starting cache heartbeat for '%s' with timeout %s\", self.cache_key, self.key_timeout)\n self.event_end.clear()\n while not self.event_end.is_set():\n # Refresh the heartbeat\n self.set_cache_value()\n self.event_end.wait(self.key_timeout / self.beat_interval)\n\n cache.delete(self.cache_key)\n logger.info(\"Ended cache heartbeat for '%s'\", self.cache_key)\n\n def set_cache_value(self):\n \"\"\"\n Set the cache key and its value\n \"\"\"\n cache.set(self.cache_key, self.calc_cache_value(), self.key_timeout)\n\n def calc_cache_value(self) -> Union[str, int, float]:\n \"\"\"\n Calculate a value to be used when setting the cache key\n \"\"\"\n return datetime.datetime.utcnow().timestamp()\n", "path": "akvo/cache/heartbeat.py"}]} | 1,265 | 442 |
gh_patches_debug_11081 | rasdani/github-patches | git_diff | deeppavlov__DeepPavlov-635 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add an environment variable to skip nltk.download()
Add an environment variable to skip nltk.download()
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `deeppavlov/models/__init__.py`
Content:
```
1 # Copyright 2017 Neural Networks and Deep Learning lab, MIPT
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import nltk
16
17 from deeppavlov.core.common.prints import RedirectedPrints
18
19
20 with RedirectedPrints():
21 nltk.download('punkt')
22 nltk.download('stopwords')
23 nltk.download('perluniprops')
24 nltk.download('nonbreaking_prefixes')
25
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/deeppavlov/models/__init__.py b/deeppavlov/models/__init__.py
--- a/deeppavlov/models/__init__.py
+++ b/deeppavlov/models/__init__.py
@@ -12,13 +12,14 @@
# See the License for the specific language governing permissions and
# limitations under the License.
+import os
import nltk
from deeppavlov.core.common.prints import RedirectedPrints
-
-with RedirectedPrints():
- nltk.download('punkt')
- nltk.download('stopwords')
- nltk.download('perluniprops')
- nltk.download('nonbreaking_prefixes')
+if not os.environ.get('DP_SKIP_NLTK_DOWNLOAD'):
+ with RedirectedPrints():
+ nltk.download('punkt')
+ nltk.download('stopwords')
+ nltk.download('perluniprops')
+ nltk.download('nonbreaking_prefixes')
| {"golden_diff": "diff --git a/deeppavlov/models/__init__.py b/deeppavlov/models/__init__.py\n--- a/deeppavlov/models/__init__.py\n+++ b/deeppavlov/models/__init__.py\n@@ -12,13 +12,14 @@\n # See the License for the specific language governing permissions and\n # limitations under the License.\n \n+import os\n import nltk\n \n from deeppavlov.core.common.prints import RedirectedPrints\n \n-\n-with RedirectedPrints():\n- nltk.download('punkt')\n- nltk.download('stopwords')\n- nltk.download('perluniprops')\n- nltk.download('nonbreaking_prefixes')\n+if not os.environ.get('DP_SKIP_NLTK_DOWNLOAD'):\n+ with RedirectedPrints():\n+ nltk.download('punkt')\n+ nltk.download('stopwords')\n+ nltk.download('perluniprops')\n+ nltk.download('nonbreaking_prefixes')\n", "issue": "Add an environment variable to skip nltk.download()\n\nAdd an environment variable to skip nltk.download()\n\n", "before_files": [{"content": "# Copyright 2017 Neural Networks and Deep Learning lab, MIPT\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport nltk\n\nfrom deeppavlov.core.common.prints import RedirectedPrints\n\n\nwith RedirectedPrints():\n nltk.download('punkt')\n nltk.download('stopwords')\n nltk.download('perluniprops')\n nltk.download('nonbreaking_prefixes')\n", "path": "deeppavlov/models/__init__.py"}], "after_files": [{"content": "# Copyright 2017 Neural Networks and Deep Learning lab, MIPT\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport nltk\n\nfrom deeppavlov.core.common.prints import RedirectedPrints\n\nif not os.environ.get('DP_SKIP_NLTK_DOWNLOAD'):\n with RedirectedPrints():\n nltk.download('punkt')\n nltk.download('stopwords')\n nltk.download('perluniprops')\n nltk.download('nonbreaking_prefixes')\n", "path": "deeppavlov/models/__init__.py"}]} | 518 | 206 |
gh_patches_debug_2610 | rasdani/github-patches | git_diff | openai__openai-python-1007 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Missing default value to logprobs in openai.types.chat.chat_completion.Choice
### Confirm this is an issue with the Python library and not an underlying OpenAI API
- [X] This is an issue with the Python library
### Describe the bug
#980 added token `logprobs` to chat completions of type `Optional[ChoiceLogprobs]` in [`openai.types.chat.chat_completion.Choice`](https://github.com/openai/openai-python/blob/3ad4e8bc9d89d7a81586bf598289ff62b0a339b9/src/openai/types/chat/chat_completion.py#L33) and [`openai.types.chat.chat_completion_chunk.Choice`](https://github.com/openai/openai-python/blob/3ad4e8bc9d89d7a81586bf598289ff62b0a339b9/src/openai/types/chat/chat_completion_chunk.py#L97). In the latter, the default value is set to `None`, while in the former it is not set. This causes backward compatibility problems with code written for versions prior to 1.5.0.
### To Reproduce
Execution of the following code fails:
```python
from openai.types.chat.chat_completion import ChatCompletionMessage, Choice
msg = ChatCompletionMessage(role="assistant", content="")
Choice(
index=0,
finish_reason="stop",
message=msg,
)
```
The output
```
----> 1 Choice(
2 index=0,
3 finish_reason="stop",
4 message=msg,
5 )
File /.venv-3.10/lib/python3.10/site-packages/pydantic/main.py:164, in BaseModel.__init__(__pydantic_self__, **data)
162 # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
163 __tracebackhide__ = True
--> 164 __pydantic_self__.__pydantic_validator__.validate_python(data, self_instance=__pydantic_self__)
ValidationError: 1 validation error for Choice
logprobs
Field required [type=missing, input_value={'index': 0, 'finish_reas...=None, tool_calls=None)}, input_type=dict]
For further information visit https://errors.pydantic.dev/2.5/v/missing
```
Setting `logprobs` to `None` fixes the problem.
```python
from openai.types.chat.chat_completion import ChatCompletionMessage, Choice
msg = ChatCompletionMessage(role="assistant", content="")
Choice(
index=0,
finish_reason="stop",
message=msg,
logprobs=None # added line
)
```
### Code snippets
```Python
see above
```
### OS
Linux
### Python version
Python 3.10.13
### Library version
openai 1.6.0
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/openai/types/chat/chat_completion.py`
Content:
```
1 # File generated from our OpenAPI spec by Stainless.
2
3 from typing import List, Optional
4 from typing_extensions import Literal
5
6 from ..._models import BaseModel
7 from ..completion_usage import CompletionUsage
8 from .chat_completion_message import ChatCompletionMessage
9 from .chat_completion_token_logprob import ChatCompletionTokenLogprob
10
11 __all__ = ["ChatCompletion", "Choice", "ChoiceLogprobs"]
12
13
14 class ChoiceLogprobs(BaseModel):
15 content: Optional[List[ChatCompletionTokenLogprob]]
16 """A list of message content tokens with log probability information."""
17
18
19 class Choice(BaseModel):
20 finish_reason: Literal["stop", "length", "tool_calls", "content_filter", "function_call"]
21 """The reason the model stopped generating tokens.
22
23 This will be `stop` if the model hit a natural stop point or a provided stop
24 sequence, `length` if the maximum number of tokens specified in the request was
25 reached, `content_filter` if content was omitted due to a flag from our content
26 filters, `tool_calls` if the model called a tool, or `function_call`
27 (deprecated) if the model called a function.
28 """
29
30 index: int
31 """The index of the choice in the list of choices."""
32
33 logprobs: Optional[ChoiceLogprobs]
34 """Log probability information for the choice."""
35
36 message: ChatCompletionMessage
37 """A chat completion message generated by the model."""
38
39
40 class ChatCompletion(BaseModel):
41 id: str
42 """A unique identifier for the chat completion."""
43
44 choices: List[Choice]
45 """A list of chat completion choices.
46
47 Can be more than one if `n` is greater than 1.
48 """
49
50 created: int
51 """The Unix timestamp (in seconds) of when the chat completion was created."""
52
53 model: str
54 """The model used for the chat completion."""
55
56 object: Literal["chat.completion"]
57 """The object type, which is always `chat.completion`."""
58
59 system_fingerprint: Optional[str] = None
60 """This fingerprint represents the backend configuration that the model runs with.
61
62 Can be used in conjunction with the `seed` request parameter to understand when
63 backend changes have been made that might impact determinism.
64 """
65
66 usage: Optional[CompletionUsage] = None
67 """Usage statistics for the completion request."""
68
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/openai/types/chat/chat_completion.py b/src/openai/types/chat/chat_completion.py
--- a/src/openai/types/chat/chat_completion.py
+++ b/src/openai/types/chat/chat_completion.py
@@ -30,7 +30,7 @@
index: int
"""The index of the choice in the list of choices."""
- logprobs: Optional[ChoiceLogprobs]
+ logprobs: Optional[ChoiceLogprobs] = None
"""Log probability information for the choice."""
message: ChatCompletionMessage
| {"golden_diff": "diff --git a/src/openai/types/chat/chat_completion.py b/src/openai/types/chat/chat_completion.py\n--- a/src/openai/types/chat/chat_completion.py\n+++ b/src/openai/types/chat/chat_completion.py\n@@ -30,7 +30,7 @@\n index: int\n \"\"\"The index of the choice in the list of choices.\"\"\"\n \n- logprobs: Optional[ChoiceLogprobs]\n+ logprobs: Optional[ChoiceLogprobs] = None\n \"\"\"Log probability information for the choice.\"\"\"\n \n message: ChatCompletionMessage\n", "issue": "Missing default value to logprobs in openai.types.chat.chat_completion.Choice\n### Confirm this is an issue with the Python library and not an underlying OpenAI API\r\n\r\n- [X] This is an issue with the Python library\r\n\r\n### Describe the bug\r\n\r\n#980 added token `logprobs` to chat completions of type `Optional[ChoiceLogprobs]` in [`openai.types.chat.chat_completion.Choice`](https://github.com/openai/openai-python/blob/3ad4e8bc9d89d7a81586bf598289ff62b0a339b9/src/openai/types/chat/chat_completion.py#L33) and [`openai.types.chat.chat_completion_chunk.Choice`](https://github.com/openai/openai-python/blob/3ad4e8bc9d89d7a81586bf598289ff62b0a339b9/src/openai/types/chat/chat_completion_chunk.py#L97). In the latter, the default value is set to `None`, while in the former it is not set. This causes backward compatibility problems with code written for versions prior to 1.5.0.\r\n\r\n\r\n\r\n### To Reproduce\r\n\r\nExecution of the following code fails:\r\n\r\n```python \r\nfrom openai.types.chat.chat_completion import ChatCompletionMessage, Choice\r\n\r\nmsg = ChatCompletionMessage(role=\"assistant\", content=\"\")\r\n\r\nChoice(\r\n index=0,\r\n finish_reason=\"stop\",\r\n message=msg,\r\n)\r\n```\r\nThe output\r\n```\r\n----> 1 Choice(\r\n 2 index=0,\r\n 3 finish_reason=\"stop\",\r\n 4 message=msg,\r\n 5 )\r\n\r\nFile /.venv-3.10/lib/python3.10/site-packages/pydantic/main.py:164, in BaseModel.__init__(__pydantic_self__, **data)\r\n 162 # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks\r\n 163 __tracebackhide__ = True\r\n--> 164 __pydantic_self__.__pydantic_validator__.validate_python(data, self_instance=__pydantic_self__)\r\n\r\nValidationError: 1 validation error for Choice\r\nlogprobs\r\n Field required [type=missing, input_value={'index': 0, 'finish_reas...=None, tool_calls=None)}, input_type=dict]\r\n For further information visit https://errors.pydantic.dev/2.5/v/missing\r\n```\r\nSetting `logprobs` to `None` fixes the problem.\r\n```python \r\nfrom openai.types.chat.chat_completion import ChatCompletionMessage, Choice\r\n\r\nmsg = ChatCompletionMessage(role=\"assistant\", content=\"\")\r\n\r\nChoice(\r\n index=0,\r\n finish_reason=\"stop\",\r\n message=msg,\r\n logprobs=None # added line\r\n)\r\n```\r\n\r\n### Code snippets\r\n\r\n```Python\r\nsee above\r\n```\r\n\r\n\r\n### OS\r\n\r\nLinux\r\n\r\n### Python version\r\n\r\nPython 3.10.13\r\n\r\n### Library version\r\n\r\nopenai 1.6.0\n", "before_files": [{"content": "# File generated from our OpenAPI spec by Stainless.\n\nfrom typing import List, Optional\nfrom typing_extensions import Literal\n\nfrom ..._models import BaseModel\nfrom ..completion_usage import CompletionUsage\nfrom .chat_completion_message import ChatCompletionMessage\nfrom .chat_completion_token_logprob import ChatCompletionTokenLogprob\n\n__all__ = [\"ChatCompletion\", \"Choice\", \"ChoiceLogprobs\"]\n\n\nclass ChoiceLogprobs(BaseModel):\n content: Optional[List[ChatCompletionTokenLogprob]]\n \"\"\"A list of message content tokens 
with log probability information.\"\"\"\n\n\nclass Choice(BaseModel):\n finish_reason: Literal[\"stop\", \"length\", \"tool_calls\", \"content_filter\", \"function_call\"]\n \"\"\"The reason the model stopped generating tokens.\n\n This will be `stop` if the model hit a natural stop point or a provided stop\n sequence, `length` if the maximum number of tokens specified in the request was\n reached, `content_filter` if content was omitted due to a flag from our content\n filters, `tool_calls` if the model called a tool, or `function_call`\n (deprecated) if the model called a function.\n \"\"\"\n\n index: int\n \"\"\"The index of the choice in the list of choices.\"\"\"\n\n logprobs: Optional[ChoiceLogprobs]\n \"\"\"Log probability information for the choice.\"\"\"\n\n message: ChatCompletionMessage\n \"\"\"A chat completion message generated by the model.\"\"\"\n\n\nclass ChatCompletion(BaseModel):\n id: str\n \"\"\"A unique identifier for the chat completion.\"\"\"\n\n choices: List[Choice]\n \"\"\"A list of chat completion choices.\n\n Can be more than one if `n` is greater than 1.\n \"\"\"\n\n created: int\n \"\"\"The Unix timestamp (in seconds) of when the chat completion was created.\"\"\"\n\n model: str\n \"\"\"The model used for the chat completion.\"\"\"\n\n object: Literal[\"chat.completion\"]\n \"\"\"The object type, which is always `chat.completion`.\"\"\"\n\n system_fingerprint: Optional[str] = None\n \"\"\"This fingerprint represents the backend configuration that the model runs with.\n\n Can be used in conjunction with the `seed` request parameter to understand when\n backend changes have been made that might impact determinism.\n \"\"\"\n\n usage: Optional[CompletionUsage] = None\n \"\"\"Usage statistics for the completion request.\"\"\"\n", "path": "src/openai/types/chat/chat_completion.py"}], "after_files": [{"content": "# File generated from our OpenAPI spec by Stainless.\n\nfrom typing import List, Optional\nfrom typing_extensions import Literal\n\nfrom ..._models import BaseModel\nfrom ..completion_usage import CompletionUsage\nfrom .chat_completion_message import ChatCompletionMessage\nfrom .chat_completion_token_logprob import ChatCompletionTokenLogprob\n\n__all__ = [\"ChatCompletion\", \"Choice\", \"ChoiceLogprobs\"]\n\n\nclass ChoiceLogprobs(BaseModel):\n content: Optional[List[ChatCompletionTokenLogprob]]\n \"\"\"A list of message content tokens with log probability information.\"\"\"\n\n\nclass Choice(BaseModel):\n finish_reason: Literal[\"stop\", \"length\", \"tool_calls\", \"content_filter\", \"function_call\"]\n \"\"\"The reason the model stopped generating tokens.\n\n This will be `stop` if the model hit a natural stop point or a provided stop\n sequence, `length` if the maximum number of tokens specified in the request was\n reached, `content_filter` if content was omitted due to a flag from our content\n filters, `tool_calls` if the model called a tool, or `function_call`\n (deprecated) if the model called a function.\n \"\"\"\n\n index: int\n \"\"\"The index of the choice in the list of choices.\"\"\"\n\n logprobs: Optional[ChoiceLogprobs] = None\n \"\"\"Log probability information for the choice.\"\"\"\n\n message: ChatCompletionMessage\n \"\"\"A chat completion message generated by the model.\"\"\"\n\n\nclass ChatCompletion(BaseModel):\n id: str\n \"\"\"A unique identifier for the chat completion.\"\"\"\n\n choices: List[Choice]\n \"\"\"A list of chat completion choices.\n\n Can be more than one if `n` is greater than 1.\n \"\"\"\n\n created: int\n 
\"\"\"The Unix timestamp (in seconds) of when the chat completion was created.\"\"\"\n\n model: str\n \"\"\"The model used for the chat completion.\"\"\"\n\n object: Literal[\"chat.completion\"]\n \"\"\"The object type, which is always `chat.completion`.\"\"\"\n\n system_fingerprint: Optional[str] = None\n \"\"\"This fingerprint represents the backend configuration that the model runs with.\n\n Can be used in conjunction with the `seed` request parameter to understand when\n backend changes have been made that might impact determinism.\n \"\"\"\n\n usage: Optional[CompletionUsage] = None\n \"\"\"Usage statistics for the completion request.\"\"\"\n", "path": "src/openai/types/chat/chat_completion.py"}]} | 1,557 | 121 |
gh_patches_debug_13241 | rasdani/github-patches | git_diff | pypa__pip-12173 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
23.2: Unable to add zsh completion as documented
### Description
Attempting to follow the steps in the [documentation for adding zsh shell completion](https://pip.pypa.io/en/stable/user_guide/#command-completion) raises an error regarding the use of `compadd` outside of a completion function. This seems to be new with 23.2 and added in https://github.com/pypa/pip/pull/11417.
### Expected behavior
As in previous versions, following the documented steps should successfully add completion for pip.
### pip version
23.2
### Python version
3.11.4
### OS
macOS 12.6.8
### How to Reproduce
As documented:
1. Execute `python -m pip completion --zsh >> ~/.zprofile` or add eval "`pip completion --zsh`" to `.zprofile` or `.zshrc`
2. Start a new shell
### Output
From startup:
```
/dev/fd/16:compadd:4: can only be called from completion function
```
### Code of Conduct
- [X] I agree to follow the [PSF Code of Conduct](https://www.python.org/psf/conduct/).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/pip/_internal/commands/completion.py`
Content:
```
1 import sys
2 import textwrap
3 from optparse import Values
4 from typing import List
5
6 from pip._internal.cli.base_command import Command
7 from pip._internal.cli.status_codes import SUCCESS
8 from pip._internal.utils.misc import get_prog
9
10 BASE_COMPLETION = """
11 # pip {shell} completion start{script}# pip {shell} completion end
12 """
13
14 COMPLETION_SCRIPTS = {
15 "bash": """
16 _pip_completion()
17 {{
18 COMPREPLY=( $( COMP_WORDS="${{COMP_WORDS[*]}}" \\
19 COMP_CWORD=$COMP_CWORD \\
20 PIP_AUTO_COMPLETE=1 $1 2>/dev/null ) )
21 }}
22 complete -o default -F _pip_completion {prog}
23 """,
24 "zsh": """
25 #compdef -P pip[0-9.]#
26 compadd $( COMP_WORDS="$words[*]" \\
27 COMP_CWORD=$((CURRENT-1)) \\
28 PIP_AUTO_COMPLETE=1 $words[1] 2>/dev/null )
29 """,
30 "fish": """
31 function __fish_complete_pip
32 set -lx COMP_WORDS (commandline -o) ""
33 set -lx COMP_CWORD ( \\
34 math (contains -i -- (commandline -t) $COMP_WORDS)-1 \\
35 )
36 set -lx PIP_AUTO_COMPLETE 1
37 string split \\ -- (eval $COMP_WORDS[1])
38 end
39 complete -fa "(__fish_complete_pip)" -c {prog}
40 """,
41 "powershell": """
42 if ((Test-Path Function:\\TabExpansion) -and -not `
43 (Test-Path Function:\\_pip_completeBackup)) {{
44 Rename-Item Function:\\TabExpansion _pip_completeBackup
45 }}
46 function TabExpansion($line, $lastWord) {{
47 $lastBlock = [regex]::Split($line, '[|;]')[-1].TrimStart()
48 if ($lastBlock.StartsWith("{prog} ")) {{
49 $Env:COMP_WORDS=$lastBlock
50 $Env:COMP_CWORD=$lastBlock.Split().Length - 1
51 $Env:PIP_AUTO_COMPLETE=1
52 (& {prog}).Split()
53 Remove-Item Env:COMP_WORDS
54 Remove-Item Env:COMP_CWORD
55 Remove-Item Env:PIP_AUTO_COMPLETE
56 }}
57 elseif (Test-Path Function:\\_pip_completeBackup) {{
58 # Fall back on existing tab expansion
59 _pip_completeBackup $line $lastWord
60 }}
61 }}
62 """,
63 }
64
65
66 class CompletionCommand(Command):
67 """A helper command to be used for command completion."""
68
69 ignore_require_venv = True
70
71 def add_options(self) -> None:
72 self.cmd_opts.add_option(
73 "--bash",
74 "-b",
75 action="store_const",
76 const="bash",
77 dest="shell",
78 help="Emit completion code for bash",
79 )
80 self.cmd_opts.add_option(
81 "--zsh",
82 "-z",
83 action="store_const",
84 const="zsh",
85 dest="shell",
86 help="Emit completion code for zsh",
87 )
88 self.cmd_opts.add_option(
89 "--fish",
90 "-f",
91 action="store_const",
92 const="fish",
93 dest="shell",
94 help="Emit completion code for fish",
95 )
96 self.cmd_opts.add_option(
97 "--powershell",
98 "-p",
99 action="store_const",
100 const="powershell",
101 dest="shell",
102 help="Emit completion code for powershell",
103 )
104
105 self.parser.insert_option_group(0, self.cmd_opts)
106
107 def run(self, options: Values, args: List[str]) -> int:
108 """Prints the completion code of the given shell"""
109 shells = COMPLETION_SCRIPTS.keys()
110 shell_options = ["--" + shell for shell in sorted(shells)]
111 if options.shell in shells:
112 script = textwrap.dedent(
113 COMPLETION_SCRIPTS.get(options.shell, "").format(prog=get_prog())
114 )
115 print(BASE_COMPLETION.format(script=script, shell=options.shell))
116 return SUCCESS
117 else:
118 sys.stderr.write(
119 "ERROR: You must pass {}\n".format(" or ".join(shell_options))
120 )
121 return SUCCESS
122
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/pip/_internal/commands/completion.py b/src/pip/_internal/commands/completion.py
--- a/src/pip/_internal/commands/completion.py
+++ b/src/pip/_internal/commands/completion.py
@@ -23,9 +23,18 @@
""",
"zsh": """
#compdef -P pip[0-9.]#
- compadd $( COMP_WORDS="$words[*]" \\
- COMP_CWORD=$((CURRENT-1)) \\
- PIP_AUTO_COMPLETE=1 $words[1] 2>/dev/null )
+ __pip() {{
+ compadd $( COMP_WORDS="$words[*]" \\
+ COMP_CWORD=$((CURRENT-1)) \\
+ PIP_AUTO_COMPLETE=1 $words[1] 2>/dev/null )
+ }}
+ if [[ $zsh_eval_context[-1] == loadautofunc ]]; then
+ # autoload from fpath, call function directly
+ __pip "$@"
+ else
+ # eval/source/. command, register function for later
+ compdef __pip -P 'pip[0-9.]#'
+ fi
""",
"fish": """
function __fish_complete_pip
| {"golden_diff": "diff --git a/src/pip/_internal/commands/completion.py b/src/pip/_internal/commands/completion.py\n--- a/src/pip/_internal/commands/completion.py\n+++ b/src/pip/_internal/commands/completion.py\n@@ -23,9 +23,18 @@\n \"\"\",\n \"zsh\": \"\"\"\n #compdef -P pip[0-9.]#\n- compadd $( COMP_WORDS=\"$words[*]\" \\\\\n- COMP_CWORD=$((CURRENT-1)) \\\\\n- PIP_AUTO_COMPLETE=1 $words[1] 2>/dev/null )\n+ __pip() {{\n+ compadd $( COMP_WORDS=\"$words[*]\" \\\\\n+ COMP_CWORD=$((CURRENT-1)) \\\\\n+ PIP_AUTO_COMPLETE=1 $words[1] 2>/dev/null )\n+ }}\n+ if [[ $zsh_eval_context[-1] == loadautofunc ]]; then\n+ # autoload from fpath, call function directly\n+ __pip \"$@\"\n+ else\n+ # eval/source/. command, register function for later\n+ compdef __pip -P 'pip[0-9.]#'\n+ fi\n \"\"\",\n \"fish\": \"\"\"\n function __fish_complete_pip\n", "issue": "23.2: Unable to add zsh completion as documented\n### Description\r\n\r\nAttempting to follow the steps in the [documentation for adding zsh shell completion](https://pip.pypa.io/en/stable/user_guide/#command-completion) raises an error regarding the use of `compadd` outside of a completion function. This seems to be new with 23.2 and added in https://github.com/pypa/pip/pull/11417.\r\n\r\n### Expected behavior\r\n\r\nAs in previous versions, following the documented steps should successfully add completion for pip.\r\n\r\n### pip version\r\n\r\n23.2\r\n\r\n### Python version\r\n\r\n3.11.4\r\n\r\n### OS\r\n\r\nmacOS 12.6.8\r\n\r\n### How to Reproduce\r\n\r\nAs documented:\r\n1. Execute `python -m pip completion --zsh >> ~/.zprofile` or add eval \"`pip completion --zsh`\" to `.zprofile` or `.zshrc`\r\n2. Start a new shell\r\n\r\n### Output\r\n\r\nFrom startup:\r\n```\r\n/dev/fd/16:compadd:4: can only be called from completion function\r\n```\r\n\r\n### Code of Conduct\r\n\r\n- [X] I agree to follow the [PSF Code of Conduct](https://www.python.org/psf/conduct/).\n", "before_files": [{"content": "import sys\nimport textwrap\nfrom optparse import Values\nfrom typing import List\n\nfrom pip._internal.cli.base_command import Command\nfrom pip._internal.cli.status_codes import SUCCESS\nfrom pip._internal.utils.misc import get_prog\n\nBASE_COMPLETION = \"\"\"\n# pip {shell} completion start{script}# pip {shell} completion end\n\"\"\"\n\nCOMPLETION_SCRIPTS = {\n \"bash\": \"\"\"\n _pip_completion()\n {{\n COMPREPLY=( $( COMP_WORDS=\"${{COMP_WORDS[*]}}\" \\\\\n COMP_CWORD=$COMP_CWORD \\\\\n PIP_AUTO_COMPLETE=1 $1 2>/dev/null ) )\n }}\n complete -o default -F _pip_completion {prog}\n \"\"\",\n \"zsh\": \"\"\"\n #compdef -P pip[0-9.]#\n compadd $( COMP_WORDS=\"$words[*]\" \\\\\n COMP_CWORD=$((CURRENT-1)) \\\\\n PIP_AUTO_COMPLETE=1 $words[1] 2>/dev/null )\n \"\"\",\n \"fish\": \"\"\"\n function __fish_complete_pip\n set -lx COMP_WORDS (commandline -o) \"\"\n set -lx COMP_CWORD ( \\\\\n math (contains -i -- (commandline -t) $COMP_WORDS)-1 \\\\\n )\n set -lx PIP_AUTO_COMPLETE 1\n string split \\\\ -- (eval $COMP_WORDS[1])\n end\n complete -fa \"(__fish_complete_pip)\" -c {prog}\n \"\"\",\n \"powershell\": \"\"\"\n if ((Test-Path Function:\\\\TabExpansion) -and -not `\n (Test-Path Function:\\\\_pip_completeBackup)) {{\n Rename-Item Function:\\\\TabExpansion _pip_completeBackup\n }}\n function TabExpansion($line, $lastWord) {{\n $lastBlock = [regex]::Split($line, '[|;]')[-1].TrimStart()\n if ($lastBlock.StartsWith(\"{prog} \")) {{\n $Env:COMP_WORDS=$lastBlock\n $Env:COMP_CWORD=$lastBlock.Split().Length - 1\n $Env:PIP_AUTO_COMPLETE=1\n (& {prog}).Split()\n Remove-Item 
Env:COMP_WORDS\n Remove-Item Env:COMP_CWORD\n Remove-Item Env:PIP_AUTO_COMPLETE\n }}\n elseif (Test-Path Function:\\\\_pip_completeBackup) {{\n # Fall back on existing tab expansion\n _pip_completeBackup $line $lastWord\n }}\n }}\n \"\"\",\n}\n\n\nclass CompletionCommand(Command):\n \"\"\"A helper command to be used for command completion.\"\"\"\n\n ignore_require_venv = True\n\n def add_options(self) -> None:\n self.cmd_opts.add_option(\n \"--bash\",\n \"-b\",\n action=\"store_const\",\n const=\"bash\",\n dest=\"shell\",\n help=\"Emit completion code for bash\",\n )\n self.cmd_opts.add_option(\n \"--zsh\",\n \"-z\",\n action=\"store_const\",\n const=\"zsh\",\n dest=\"shell\",\n help=\"Emit completion code for zsh\",\n )\n self.cmd_opts.add_option(\n \"--fish\",\n \"-f\",\n action=\"store_const\",\n const=\"fish\",\n dest=\"shell\",\n help=\"Emit completion code for fish\",\n )\n self.cmd_opts.add_option(\n \"--powershell\",\n \"-p\",\n action=\"store_const\",\n const=\"powershell\",\n dest=\"shell\",\n help=\"Emit completion code for powershell\",\n )\n\n self.parser.insert_option_group(0, self.cmd_opts)\n\n def run(self, options: Values, args: List[str]) -> int:\n \"\"\"Prints the completion code of the given shell\"\"\"\n shells = COMPLETION_SCRIPTS.keys()\n shell_options = [\"--\" + shell for shell in sorted(shells)]\n if options.shell in shells:\n script = textwrap.dedent(\n COMPLETION_SCRIPTS.get(options.shell, \"\").format(prog=get_prog())\n )\n print(BASE_COMPLETION.format(script=script, shell=options.shell))\n return SUCCESS\n else:\n sys.stderr.write(\n \"ERROR: You must pass {}\\n\".format(\" or \".join(shell_options))\n )\n return SUCCESS\n", "path": "src/pip/_internal/commands/completion.py"}], "after_files": [{"content": "import sys\nimport textwrap\nfrom optparse import Values\nfrom typing import List\n\nfrom pip._internal.cli.base_command import Command\nfrom pip._internal.cli.status_codes import SUCCESS\nfrom pip._internal.utils.misc import get_prog\n\nBASE_COMPLETION = \"\"\"\n# pip {shell} completion start{script}# pip {shell} completion end\n\"\"\"\n\nCOMPLETION_SCRIPTS = {\n \"bash\": \"\"\"\n _pip_completion()\n {{\n COMPREPLY=( $( COMP_WORDS=\"${{COMP_WORDS[*]}}\" \\\\\n COMP_CWORD=$COMP_CWORD \\\\\n PIP_AUTO_COMPLETE=1 $1 2>/dev/null ) )\n }}\n complete -o default -F _pip_completion {prog}\n \"\"\",\n \"zsh\": \"\"\"\n #compdef -P pip[0-9.]#\n __pip() {{\n compadd $( COMP_WORDS=\"$words[*]\" \\\\\n COMP_CWORD=$((CURRENT-1)) \\\\\n PIP_AUTO_COMPLETE=1 $words[1] 2>/dev/null )\n }}\n if [[ $zsh_eval_context[-1] == loadautofunc ]]; then\n # autoload from fpath, call function directly\n __pip \"$@\"\n else\n # eval/source/. 
command, register function for later\n compdef __pip -P 'pip[0-9.]#'\n fi\n \"\"\",\n \"fish\": \"\"\"\n function __fish_complete_pip\n set -lx COMP_WORDS (commandline -o) \"\"\n set -lx COMP_CWORD ( \\\\\n math (contains -i -- (commandline -t) $COMP_WORDS)-1 \\\\\n )\n set -lx PIP_AUTO_COMPLETE 1\n string split \\\\ -- (eval $COMP_WORDS[1])\n end\n complete -fa \"(__fish_complete_pip)\" -c {prog}\n \"\"\",\n \"powershell\": \"\"\"\n if ((Test-Path Function:\\\\TabExpansion) -and -not `\n (Test-Path Function:\\\\_pip_completeBackup)) {{\n Rename-Item Function:\\\\TabExpansion _pip_completeBackup\n }}\n function TabExpansion($line, $lastWord) {{\n $lastBlock = [regex]::Split($line, '[|;]')[-1].TrimStart()\n if ($lastBlock.StartsWith(\"{prog} \")) {{\n $Env:COMP_WORDS=$lastBlock\n $Env:COMP_CWORD=$lastBlock.Split().Length - 1\n $Env:PIP_AUTO_COMPLETE=1\n (& {prog}).Split()\n Remove-Item Env:COMP_WORDS\n Remove-Item Env:COMP_CWORD\n Remove-Item Env:PIP_AUTO_COMPLETE\n }}\n elseif (Test-Path Function:\\\\_pip_completeBackup) {{\n # Fall back on existing tab expansion\n _pip_completeBackup $line $lastWord\n }}\n }}\n \"\"\",\n}\n\n\nclass CompletionCommand(Command):\n \"\"\"A helper command to be used for command completion.\"\"\"\n\n ignore_require_venv = True\n\n def add_options(self) -> None:\n self.cmd_opts.add_option(\n \"--bash\",\n \"-b\",\n action=\"store_const\",\n const=\"bash\",\n dest=\"shell\",\n help=\"Emit completion code for bash\",\n )\n self.cmd_opts.add_option(\n \"--zsh\",\n \"-z\",\n action=\"store_const\",\n const=\"zsh\",\n dest=\"shell\",\n help=\"Emit completion code for zsh\",\n )\n self.cmd_opts.add_option(\n \"--fish\",\n \"-f\",\n action=\"store_const\",\n const=\"fish\",\n dest=\"shell\",\n help=\"Emit completion code for fish\",\n )\n self.cmd_opts.add_option(\n \"--powershell\",\n \"-p\",\n action=\"store_const\",\n const=\"powershell\",\n dest=\"shell\",\n help=\"Emit completion code for powershell\",\n )\n\n self.parser.insert_option_group(0, self.cmd_opts)\n\n def run(self, options: Values, args: List[str]) -> int:\n \"\"\"Prints the completion code of the given shell\"\"\"\n shells = COMPLETION_SCRIPTS.keys()\n shell_options = [\"--\" + shell for shell in sorted(shells)]\n if options.shell in shells:\n script = textwrap.dedent(\n COMPLETION_SCRIPTS.get(options.shell, \"\").format(prog=get_prog())\n )\n print(BASE_COMPLETION.format(script=script, shell=options.shell))\n return SUCCESS\n else:\n sys.stderr.write(\n \"ERROR: You must pass {}\\n\".format(\" or \".join(shell_options))\n )\n return SUCCESS\n", "path": "src/pip/_internal/commands/completion.py"}]} | 1,696 | 273 |
gh_patches_debug_27686 | rasdani/github-patches | git_diff | xonsh__xonsh-861 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
COMPLETIONS_MENU_ROWS not working
Hello
My value for `COMPLETIONS_MENU_ROWS` is 5 (the default), but it does not seem to work. My terminal grows until the bottom of the window, and once it has reached the last line I don't get completions anymore.
I use gnome-terminal on Ubuntu 14.04.
Thank you for your help!
EDIT: I installed HEAD from the git repo. I still have the problem.
EDIT2+3: I also have the problem with xterm and Terminator (it seems terminal-independent).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `xonsh/ptk/completer.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 """Completer implementation to use with prompt_toolkit."""
3 import os
4 import builtins
5
6 from prompt_toolkit.layout.dimension import LayoutDimension
7 from prompt_toolkit.completion import Completer, Completion
8
9
10 class PromptToolkitCompleter(Completer):
11 """Simple prompt_toolkit Completer object.
12
13 It just redirects requests to normal Xonsh completer.
14 """
15
16 def __init__(self, completer, ctx):
17 """Takes instance of xonsh.completer.Completer and dict with context."""
18 self.completer = completer
19 self.ctx = ctx
20
21 def get_completions(self, document, complete_event):
22 """Returns a generator for list of completions."""
23
24 # Only generate completions when the user hits tab.
25 if complete_event.completion_requested:
26 line = document.current_line.lstrip()
27 endidx = document.cursor_position_col
28 begidx = line[:endidx].rfind(' ') + 1 if line[:endidx].rfind(' ') >= 0 else 0
29 prefix = line[begidx:endidx]
30 completions, l = self.completer.complete(prefix,
31 line,
32 begidx,
33 endidx,
34 self.ctx)
35 if len(completions) <= 1:
36 pass
37 elif len(os.path.commonprefix(completions)) <= len(prefix):
38 self.reserve_space()
39 for comp in completions:
40 yield Completion(comp, -l)
41
42 def reserve_space(self):
43 cli = builtins.__xonsh_shell__.shell.prompter.cli
44 #TODO remove after next prompt_toolkit release
45 try:
46 #old layout to be removed at next ptk release
47 window = cli.application.layout.children[1].children[1].content
48 except AttributeError:
49 #new layout to become default
50 window = cli.application.layout.children[1].content
51 if window and window.render_info:
52 h = window.render_info.content_height
53 r = builtins.__xonsh_env__.get('COMPLETIONS_MENU_ROWS')
54 size = h + r
55 def comp_height(cli):
56 # If there is an autocompletion menu to be shown, make sure that o
57 # layout has at least a minimal height in order to display it.
58 if not cli.is_done:
59 return LayoutDimension(min=size)
60 else:
61 return LayoutDimension()
62 window._height = comp_height
63
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/xonsh/ptk/completer.py b/xonsh/ptk/completer.py
--- a/xonsh/ptk/completer.py
+++ b/xonsh/ptk/completer.py
@@ -2,6 +2,7 @@
"""Completer implementation to use with prompt_toolkit."""
import os
import builtins
+import xonsh.shell
from prompt_toolkit.layout.dimension import LayoutDimension
from prompt_toolkit.completion import Completer, Completion
@@ -41,13 +42,18 @@
def reserve_space(self):
cli = builtins.__xonsh_shell__.shell.prompter.cli
- #TODO remove after next prompt_toolkit release
- try:
- #old layout to be removed at next ptk release
- window = cli.application.layout.children[1].children[1].content
- except AttributeError:
- #new layout to become default
- window = cli.application.layout.children[1].content
+ if xonsh.shell.prompt_toolkit_version().startswith("1.0"):
+ # This is the layout for ptk 1.0
+ window = cli.application.layout.children[0].content.children[1]
+ else:
+ #TODO remove after next prompt_toolkit release
+ try:
+ #old layout to be removed at next ptk release
+ window = cli.application.layout.children[1].children[1].content
+ except AttributeError:
+ #new layout to become default
+ window = cli.application.layout.children[1].content
+
if window and window.render_info:
h = window.render_info.content_height
r = builtins.__xonsh_env__.get('COMPLETIONS_MENU_ROWS')
| {"golden_diff": "diff --git a/xonsh/ptk/completer.py b/xonsh/ptk/completer.py\n--- a/xonsh/ptk/completer.py\n+++ b/xonsh/ptk/completer.py\n@@ -2,6 +2,7 @@\n \"\"\"Completer implementation to use with prompt_toolkit.\"\"\"\n import os\n import builtins\n+import xonsh.shell\n \n from prompt_toolkit.layout.dimension import LayoutDimension\n from prompt_toolkit.completion import Completer, Completion\n@@ -41,13 +42,18 @@\n \n def reserve_space(self):\n cli = builtins.__xonsh_shell__.shell.prompter.cli\n- #TODO remove after next prompt_toolkit release\n- try:\n- #old layout to be removed at next ptk release\n- window = cli.application.layout.children[1].children[1].content\n- except AttributeError:\n- #new layout to become default\n- window = cli.application.layout.children[1].content\n+ if xonsh.shell.prompt_toolkit_version().startswith(\"1.0\"):\n+ # This is the layout for ptk 1.0\n+ window = cli.application.layout.children[0].content.children[1]\n+ else:\n+ #TODO remove after next prompt_toolkit release\n+ try:\n+ #old layout to be removed at next ptk release\n+ window = cli.application.layout.children[1].children[1].content\n+ except AttributeError:\n+ #new layout to become default\n+ window = cli.application.layout.children[1].content\n+\n if window and window.render_info:\n h = window.render_info.content_height\n r = builtins.__xonsh_env__.get('COMPLETIONS_MENU_ROWS')\n", "issue": "COMPLETIONS_MENU_ROWS not working\nHello\n\nMy value for `COMPLETIONS_MENU_ROWS` is 5 (default). But it does not seem to work. My terminal grow until the bottom of the window and when it has reached the last line then I don't get completions anymore.\n\nI use gnome-terminal on Ubuntu 14.04\n\nThank you for your help !\n\nEDIT: I installed HEAD from the git repo. I still have the problem\n\nEDIT2+3: I also have the problem with xtrem and Terminator (It seems Terminal independant)\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"Completer implementation to use with prompt_toolkit.\"\"\"\nimport os\nimport builtins\n\nfrom prompt_toolkit.layout.dimension import LayoutDimension\nfrom prompt_toolkit.completion import Completer, Completion\n\n\nclass PromptToolkitCompleter(Completer):\n \"\"\"Simple prompt_toolkit Completer object.\n\n It just redirects requests to normal Xonsh completer.\n \"\"\"\n\n def __init__(self, completer, ctx):\n \"\"\"Takes instance of xonsh.completer.Completer and dict with context.\"\"\"\n self.completer = completer\n self.ctx = ctx\n\n def get_completions(self, document, complete_event):\n \"\"\"Returns a generator for list of completions.\"\"\"\n\n # Only generate completions when the user hits tab.\n if complete_event.completion_requested:\n line = document.current_line.lstrip()\n endidx = document.cursor_position_col\n begidx = line[:endidx].rfind(' ') + 1 if line[:endidx].rfind(' ') >= 0 else 0\n prefix = line[begidx:endidx]\n completions, l = self.completer.complete(prefix,\n line,\n begidx,\n endidx,\n self.ctx)\n if len(completions) <= 1:\n pass\n elif len(os.path.commonprefix(completions)) <= len(prefix):\n self.reserve_space()\n for comp in completions:\n yield Completion(comp, -l)\n\n def reserve_space(self):\n cli = builtins.__xonsh_shell__.shell.prompter.cli\n #TODO remove after next prompt_toolkit release\n try:\n #old layout to be removed at next ptk release\n window = cli.application.layout.children[1].children[1].content\n except AttributeError:\n #new layout to become default\n window = cli.application.layout.children[1].content\n if window and 
window.render_info:\n h = window.render_info.content_height\n r = builtins.__xonsh_env__.get('COMPLETIONS_MENU_ROWS')\n size = h + r\n def comp_height(cli):\n # If there is an autocompletion menu to be shown, make sure that o\n # layout has at least a minimal height in order to display it.\n if not cli.is_done:\n return LayoutDimension(min=size)\n else:\n return LayoutDimension()\n window._height = comp_height\n", "path": "xonsh/ptk/completer.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"Completer implementation to use with prompt_toolkit.\"\"\"\nimport os\nimport builtins\nimport xonsh.shell\n\nfrom prompt_toolkit.layout.dimension import LayoutDimension\nfrom prompt_toolkit.completion import Completer, Completion\n\n\nclass PromptToolkitCompleter(Completer):\n \"\"\"Simple prompt_toolkit Completer object.\n\n It just redirects requests to normal Xonsh completer.\n \"\"\"\n\n def __init__(self, completer, ctx):\n \"\"\"Takes instance of xonsh.completer.Completer and dict with context.\"\"\"\n self.completer = completer\n self.ctx = ctx\n\n def get_completions(self, document, complete_event):\n \"\"\"Returns a generator for list of completions.\"\"\"\n\n # Only generate completions when the user hits tab.\n if complete_event.completion_requested:\n line = document.current_line.lstrip()\n endidx = document.cursor_position_col\n begidx = line[:endidx].rfind(' ') + 1 if line[:endidx].rfind(' ') >= 0 else 0\n prefix = line[begidx:endidx]\n completions, l = self.completer.complete(prefix,\n line,\n begidx,\n endidx,\n self.ctx)\n if len(completions) <= 1:\n pass\n elif len(os.path.commonprefix(completions)) <= len(prefix):\n self.reserve_space()\n for comp in completions:\n yield Completion(comp, -l)\n\n def reserve_space(self):\n cli = builtins.__xonsh_shell__.shell.prompter.cli\n if xonsh.shell.prompt_toolkit_version().startswith(\"1.0\"):\n # This is the layout for ptk 1.0\n window = cli.application.layout.children[0].content.children[1]\n else:\n #TODO remove after next prompt_toolkit release\n try:\n #old layout to be removed at next ptk release\n window = cli.application.layout.children[1].children[1].content\n except AttributeError:\n #new layout to become default\n window = cli.application.layout.children[1].content\n\n if window and window.render_info:\n h = window.render_info.content_height\n r = builtins.__xonsh_env__.get('COMPLETIONS_MENU_ROWS')\n size = h + r\n def comp_height(cli):\n # If there is an autocompletion menu to be shown, make sure that o\n # layout has at least a minimal height in order to display it.\n if not cli.is_done:\n return LayoutDimension(min=size)\n else:\n return LayoutDimension()\n window._height = comp_height\n", "path": "xonsh/ptk/completer.py"}]} | 1,020 | 383 |
gh_patches_debug_37069 | rasdani/github-patches | git_diff | TheAlgorithms__Python-8674 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
There is a missing check for the "associativity" of the operators in the stacks/infix_to_postfix_conversion.py file.
### Repository commit
11582943a555ae3b6a22938df6d3645b0327562e
### Python version (python --version)
Python 3.10.6
### Dependencies version (pip freeze)
astroid==2.11.3
bcc==0.18.0
black==22.3.0
blinker==1.4
Brlapi==0.8.3
certifi==2020.6.20
chardet==4.0.0
### Expected behavior
Hi. In the [data_structures/stacks/infix_to_postfix_conversion.py](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/infix_to_postfix_conversion.py) file, when we pass the `"2^3^2"` argument to the `infix_to_postfix` function, it should return `'2 3 2 ^ ^'`. That evaluates to `512`, which is the correct result.
### Actual behavior
Currently `infix_to_postfix("2^3^2")` returns `'2 3 ^ 2 ^'`, which is wrong: evaluating it gives `64`.
The reason is that there is currently no check for the "associativity" of the operators. In particular, the `^` operator causes the problem here: it has `Right-To-Left` associativity.
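As a quick sanity check of the two readings (using Python's `**`, which is also right-associative):

```python
2 ** (3 ** 2)  # right-associative reading, matches '2 3 2 ^ ^' -> 512
(2 ** 3) ** 2  # left-associative reading, matches '2 3 ^ 2 ^'  -> 64
```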
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `data_structures/stacks/infix_to_postfix_conversion.py`
Content:
```
1 """
2 https://en.wikipedia.org/wiki/Infix_notation
3 https://en.wikipedia.org/wiki/Reverse_Polish_notation
4 https://en.wikipedia.org/wiki/Shunting-yard_algorithm
5 """
6
7 from .balanced_parentheses import balanced_parentheses
8 from .stack import Stack
9
10
11 def precedence(char: str) -> int:
12 """
13 Return integer value representing an operator's precedence, or
14 order of operation.
15 https://en.wikipedia.org/wiki/Order_of_operations
16 """
17 return {"+": 1, "-": 1, "*": 2, "/": 2, "^": 3}.get(char, -1)
18
19
20 def infix_to_postfix(expression_str: str) -> str:
21 """
22 >>> infix_to_postfix("(1*(2+3)+4))")
23 Traceback (most recent call last):
24 ...
25 ValueError: Mismatched parentheses
26 >>> infix_to_postfix("")
27 ''
28 >>> infix_to_postfix("3+2")
29 '3 2 +'
30 >>> infix_to_postfix("(3+4)*5-6")
31 '3 4 + 5 * 6 -'
32 >>> infix_to_postfix("(1+2)*3/4-5")
33 '1 2 + 3 * 4 / 5 -'
34 >>> infix_to_postfix("a+b*c+(d*e+f)*g")
35 'a b c * + d e * f + g * +'
36 >>> infix_to_postfix("x^y/(5*z)+2")
37 'x y ^ 5 z * / 2 +'
38 """
39 if not balanced_parentheses(expression_str):
40 raise ValueError("Mismatched parentheses")
41 stack: Stack[str] = Stack()
42 postfix = []
43 for char in expression_str:
44 if char.isalpha() or char.isdigit():
45 postfix.append(char)
46 elif char == "(":
47 stack.push(char)
48 elif char == ")":
49 while not stack.is_empty() and stack.peek() != "(":
50 postfix.append(stack.pop())
51 stack.pop()
52 else:
53 while not stack.is_empty() and precedence(char) <= precedence(stack.peek()):
54 postfix.append(stack.pop())
55 stack.push(char)
56 while not stack.is_empty():
57 postfix.append(stack.pop())
58 return " ".join(postfix)
59
60
61 if __name__ == "__main__":
62 from doctest import testmod
63
64 testmod()
65 expression = "a+b*(c^d-e)^(f+g*h)-i"
66
67 print("Infix to Postfix Notation demonstration:\n")
68 print("Infix notation: " + expression)
69 print("Postfix notation: " + infix_to_postfix(expression))
70
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/data_structures/stacks/infix_to_postfix_conversion.py b/data_structures/stacks/infix_to_postfix_conversion.py
--- a/data_structures/stacks/infix_to_postfix_conversion.py
+++ b/data_structures/stacks/infix_to_postfix_conversion.py
@@ -4,9 +4,26 @@
https://en.wikipedia.org/wiki/Shunting-yard_algorithm
"""
+from typing import Literal
+
from .balanced_parentheses import balanced_parentheses
from .stack import Stack
+PRECEDENCES: dict[str, int] = {
+ "+": 1,
+ "-": 1,
+ "*": 2,
+ "/": 2,
+ "^": 3,
+}
+ASSOCIATIVITIES: dict[str, Literal["LR", "RL"]] = {
+ "+": "LR",
+ "-": "LR",
+ "*": "LR",
+ "/": "LR",
+ "^": "RL",
+}
+
def precedence(char: str) -> int:
"""
@@ -14,7 +31,15 @@
order of operation.
https://en.wikipedia.org/wiki/Order_of_operations
"""
- return {"+": 1, "-": 1, "*": 2, "/": 2, "^": 3}.get(char, -1)
+ return PRECEDENCES.get(char, -1)
+
+
+def associativity(char: str) -> Literal["LR", "RL"]:
+ """
+ Return the associativity of the operator `char`.
+ https://en.wikipedia.org/wiki/Operator_associativity
+ """
+ return ASSOCIATIVITIES[char]
def infix_to_postfix(expression_str: str) -> str:
@@ -35,6 +60,8 @@
'a b c * + d e * f + g * +'
>>> infix_to_postfix("x^y/(5*z)+2")
'x y ^ 5 z * / 2 +'
+ >>> infix_to_postfix("2^3^2")
+ '2 3 2 ^ ^'
"""
if not balanced_parentheses(expression_str):
raise ValueError("Mismatched parentheses")
@@ -50,9 +77,26 @@
postfix.append(stack.pop())
stack.pop()
else:
- while not stack.is_empty() and precedence(char) <= precedence(stack.peek()):
+ while True:
+ if stack.is_empty():
+ stack.push(char)
+ break
+
+ char_precedence = precedence(char)
+ tos_precedence = precedence(stack.peek())
+
+ if char_precedence > tos_precedence:
+ stack.push(char)
+ break
+ if char_precedence < tos_precedence:
+ postfix.append(stack.pop())
+ continue
+ # Precedences are equal
+ if associativity(char) == "RL":
+ stack.push(char)
+ break
postfix.append(stack.pop())
- stack.push(char)
+
while not stack.is_empty():
postfix.append(stack.pop())
return " ".join(postfix)
| {"golden_diff": "diff --git a/data_structures/stacks/infix_to_postfix_conversion.py b/data_structures/stacks/infix_to_postfix_conversion.py\n--- a/data_structures/stacks/infix_to_postfix_conversion.py\n+++ b/data_structures/stacks/infix_to_postfix_conversion.py\n@@ -4,9 +4,26 @@\n https://en.wikipedia.org/wiki/Shunting-yard_algorithm\n \"\"\"\n \n+from typing import Literal\n+\n from .balanced_parentheses import balanced_parentheses\n from .stack import Stack\n \n+PRECEDENCES: dict[str, int] = {\n+ \"+\": 1,\n+ \"-\": 1,\n+ \"*\": 2,\n+ \"/\": 2,\n+ \"^\": 3,\n+}\n+ASSOCIATIVITIES: dict[str, Literal[\"LR\", \"RL\"]] = {\n+ \"+\": \"LR\",\n+ \"-\": \"LR\",\n+ \"*\": \"LR\",\n+ \"/\": \"LR\",\n+ \"^\": \"RL\",\n+}\n+\n \n def precedence(char: str) -> int:\n \"\"\"\n@@ -14,7 +31,15 @@\n order of operation.\n https://en.wikipedia.org/wiki/Order_of_operations\n \"\"\"\n- return {\"+\": 1, \"-\": 1, \"*\": 2, \"/\": 2, \"^\": 3}.get(char, -1)\n+ return PRECEDENCES.get(char, -1)\n+\n+\n+def associativity(char: str) -> Literal[\"LR\", \"RL\"]:\n+ \"\"\"\n+ Return the associativity of the operator `char`.\n+ https://en.wikipedia.org/wiki/Operator_associativity\n+ \"\"\"\n+ return ASSOCIATIVITIES[char]\n \n \n def infix_to_postfix(expression_str: str) -> str:\n@@ -35,6 +60,8 @@\n 'a b c * + d e * f + g * +'\n >>> infix_to_postfix(\"x^y/(5*z)+2\")\n 'x y ^ 5 z * / 2 +'\n+ >>> infix_to_postfix(\"2^3^2\")\n+ '2 3 2 ^ ^'\n \"\"\"\n if not balanced_parentheses(expression_str):\n raise ValueError(\"Mismatched parentheses\")\n@@ -50,9 +77,26 @@\n postfix.append(stack.pop())\n stack.pop()\n else:\n- while not stack.is_empty() and precedence(char) <= precedence(stack.peek()):\n+ while True:\n+ if stack.is_empty():\n+ stack.push(char)\n+ break\n+\n+ char_precedence = precedence(char)\n+ tos_precedence = precedence(stack.peek())\n+\n+ if char_precedence > tos_precedence:\n+ stack.push(char)\n+ break\n+ if char_precedence < tos_precedence:\n+ postfix.append(stack.pop())\n+ continue\n+ # Precedences are equal\n+ if associativity(char) == \"RL\":\n+ stack.push(char)\n+ break\n postfix.append(stack.pop())\n- stack.push(char)\n+\n while not stack.is_empty():\n postfix.append(stack.pop())\n return \" \".join(postfix)\n", "issue": "There is a missing check for the \"associativity\" of the operators in the stacks/infix_to_postfix_conversion.py file.\n### Repository commit\r\n\r\n11582943a555ae3b6a22938df6d3645b0327562e\r\n\r\n### Python version (python --version)\r\n\r\nPython 3.10.6\r\n\r\n### Dependencies version (pip freeze)\r\n\r\nastroid==2.11.3\r\nbcc==0.18.0\r\nblack==22.3.0\r\nblinker==1.4\r\nBrlapi==0.8.3\r\ncertifi==2020.6.20\r\nchardet==4.0.0\r\n\r\n\r\n### Expected behavior\r\n\r\nHi, In the [data_structures/stacks/infix_to_postfix_conversion.py](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/infix_to_postfix_conversion.py) file, when we pass `\"2^3^2\"` argument to the `infix_to_postfix` function, it should return `'2 3 2 ^ ^'` back. This results in number `512` which is the correct calculation.\r\n\r\n### Actual behavior\r\n\r\nCurrently `infix_to_postfix(\"2^3^2\")` returns `'2 3 ^ 2 ^'` which is wrong. If we calculate it, it returns number `64`.\r\n\r\nThe reason behind this is that currently we don't have any check for the \"associativity\" of the operators. In particular `^` operator causes problem here. It has `Right-To-Left` associativity. 
\nThere is a missing check for the \"associativity\" of the operators in the stacks/infix_to_postfix_conversion.py file.\n### Repository commit\r\n\r\n11582943a555ae3b6a22938df6d3645b0327562e\r\n\r\n### Python version (python --version)\r\n\r\nPython 3.10.6\r\n\r\n### Dependencies version (pip freeze)\r\n\r\nastroid==2.11.3\r\nbcc==0.18.0\r\nblack==22.3.0\r\nblinker==1.4\r\nBrlapi==0.8.3\r\ncertifi==2020.6.20\r\nchardet==4.0.0\r\n\r\n\r\n### Expected behavior\r\n\r\nHi, In the [data_structures/stacks/infix_to_postfix_conversion.py](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/infix_to_postfix_conversion.py) file, when we pass `\"2^3^2\"` argument to the `infix_to_postfix` function, it should return `'2 3 2 ^ ^'` back. This results in number `512` which is the correct calculation.\r\n\r\n### Actual behavior\r\n\r\nCurrently `infix_to_postfix(\"2^3^2\")` returns `'2 3 ^ 2 ^'` which is wrong. If we calculate it, it returns number `64`.\r\n\r\nThe reason behind this is that currently we don't have any check for the \"associativity\" of the operators. In particular `^` operator causes problem here. It has `Right-To-Left` associativity. \n", "before_files": [{"content": "\"\"\"\nhttps://en.wikipedia.org/wiki/Infix_notation\nhttps://en.wikipedia.org/wiki/Reverse_Polish_notation\nhttps://en.wikipedia.org/wiki/Shunting-yard_algorithm\n\"\"\"\n\nfrom .balanced_parentheses import balanced_parentheses\nfrom .stack import Stack\n\n\ndef precedence(char: str) -> int:\n \"\"\"\n Return integer value representing an operator's precedence, or\n order of operation.\n https://en.wikipedia.org/wiki/Order_of_operations\n \"\"\"\n return {\"+\": 1, \"-\": 1, \"*\": 2, \"/\": 2, \"^\": 3}.get(char, -1)\n\n\ndef infix_to_postfix(expression_str: str) -> str:\n \"\"\"\n >>> infix_to_postfix(\"(1*(2+3)+4))\")\n Traceback (most recent call last):\n ...\n ValueError: Mismatched parentheses\n >>> infix_to_postfix(\"\")\n ''\n >>> infix_to_postfix(\"3+2\")\n '3 2 +'\n >>> infix_to_postfix(\"(3+4)*5-6\")\n '3 4 + 5 * 6 -'\n >>> infix_to_postfix(\"(1+2)*3/4-5\")\n '1 2 + 3 * 4 / 5 -'\n >>> infix_to_postfix(\"a+b*c+(d*e+f)*g\")\n 'a b c * + d e * f + g * +'\n >>> infix_to_postfix(\"x^y/(5*z)+2\")\n 'x y ^ 5 z * / 2 +'\n \"\"\"\n if not balanced_parentheses(expression_str):\n raise ValueError(\"Mismatched parentheses\")\n stack: Stack[str] = Stack()\n postfix = []\n for char in expression_str:\n if char.isalpha() or char.isdigit():\n postfix.append(char)\n elif char == \"(\":\n stack.push(char)\n elif char == \")\":\n while not stack.is_empty() and stack.peek() != \"(\":\n postfix.append(stack.pop())\n stack.pop()\n else:\n while not stack.is_empty() and precedence(char) <= precedence(stack.peek()):\n postfix.append(stack.pop())\n stack.push(char)\n while not stack.is_empty():\n postfix.append(stack.pop())\n return \" \".join(postfix)\n\n\nif __name__ == \"__main__\":\n from doctest import testmod\n\n testmod()\n expression = \"a+b*(c^d-e)^(f+g*h)-i\"\n\n print(\"Infix to Postfix Notation demonstration:\\n\")\n print(\"Infix notation: \" + expression)\n print(\"Postfix notation: \" + infix_to_postfix(expression))\n", "path": "data_structures/stacks/infix_to_postfix_conversion.py"}], "after_files": [{"content": "\"\"\"\nhttps://en.wikipedia.org/wiki/Infix_notation\nhttps://en.wikipedia.org/wiki/Reverse_Polish_notation\nhttps://en.wikipedia.org/wiki/Shunting-yard_algorithm\n\"\"\"\n\nfrom typing import Literal\n\nfrom .balanced_parentheses import balanced_parentheses\nfrom .stack import 
Stack\n\nPRECEDENCES: dict[str, int] = {\n \"+\": 1,\n \"-\": 1,\n \"*\": 2,\n \"/\": 2,\n \"^\": 3,\n}\nASSOCIATIVITIES: dict[str, Literal[\"LR\", \"RL\"]] = {\n \"+\": \"LR\",\n \"-\": \"LR\",\n \"*\": \"LR\",\n \"/\": \"LR\",\n \"^\": \"RL\",\n}\n\n\ndef precedence(char: str) -> int:\n \"\"\"\n Return integer value representing an operator's precedence, or\n order of operation.\n https://en.wikipedia.org/wiki/Order_of_operations\n \"\"\"\n return PRECEDENCES.get(char, -1)\n\n\ndef associativity(char: str) -> Literal[\"LR\", \"RL\"]:\n \"\"\"\n Return the associativity of the operator `char`.\n https://en.wikipedia.org/wiki/Operator_associativity\n \"\"\"\n return ASSOCIATIVITIES[char]\n\n\ndef infix_to_postfix(expression_str: str) -> str:\n \"\"\"\n >>> infix_to_postfix(\"(1*(2+3)+4))\")\n Traceback (most recent call last):\n ...\n ValueError: Mismatched parentheses\n >>> infix_to_postfix(\"\")\n ''\n >>> infix_to_postfix(\"3+2\")\n '3 2 +'\n >>> infix_to_postfix(\"(3+4)*5-6\")\n '3 4 + 5 * 6 -'\n >>> infix_to_postfix(\"(1+2)*3/4-5\")\n '1 2 + 3 * 4 / 5 -'\n >>> infix_to_postfix(\"a+b*c+(d*e+f)*g\")\n 'a b c * + d e * f + g * +'\n >>> infix_to_postfix(\"x^y/(5*z)+2\")\n 'x y ^ 5 z * / 2 +'\n >>> infix_to_postfix(\"2^3^2\")\n '2 3 2 ^ ^'\n \"\"\"\n if not balanced_parentheses(expression_str):\n raise ValueError(\"Mismatched parentheses\")\n stack: Stack[str] = Stack()\n postfix = []\n for char in expression_str:\n if char.isalpha() or char.isdigit():\n postfix.append(char)\n elif char == \"(\":\n stack.push(char)\n elif char == \")\":\n while not stack.is_empty() and stack.peek() != \"(\":\n postfix.append(stack.pop())\n stack.pop()\n else:\n while True:\n if stack.is_empty():\n stack.push(char)\n break\n\n char_precedence = precedence(char)\n tos_precedence = precedence(stack.peek())\n\n if char_precedence > tos_precedence:\n stack.push(char)\n break\n if char_precedence < tos_precedence:\n postfix.append(stack.pop())\n continue\n # Precedences are equal\n if associativity(char) == \"RL\":\n stack.push(char)\n break\n postfix.append(stack.pop())\n\n while not stack.is_empty():\n postfix.append(stack.pop())\n return \" \".join(postfix)\n\n\nif __name__ == \"__main__\":\n from doctest import testmod\n\n testmod()\n expression = \"a+b*(c^d-e)^(f+g*h)-i\"\n\n print(\"Infix to Postfix Notation demonstration:\\n\")\n print(\"Infix notation: \" + expression)\n print(\"Postfix notation: \" + infix_to_postfix(expression))\n", "path": "data_structures/stacks/infix_to_postfix_conversion.py"}]} | 1,677 | 687 |
gh_patches_debug_24866 | rasdani/github-patches | git_diff | facebookresearch__hydra-914 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update the Example Launcher Plugin to mention restoring the singletons
The singletons should be restored when performing multi-process launching.
Something like:
```python
# In the launcher's launch:
state = Singleton.get_state()
# In the remote process
Singleton.set_state(state)
```
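A rough sketch of where those calls would sit in a launcher plugin (assuming the `Singleton` utility from `hydra.core.singleton`; `submit_to_remote` and `run_remote_job` are made-up placeholders for whatever transport the plugin uses):

```python
from hydra.core.singleton import Singleton


def launch(self, job_overrides, initial_job_idx):
    # Main process: capture the singleton state before fanning out.
    state = Singleton.get_state()
    for idx, overrides in enumerate(job_overrides):
        submit_to_remote(run_remote_job, state, overrides)  # hypothetical transport


def run_remote_job(state, overrides):
    # Remote process: restore the singletons before touching any Hydra machinery.
    Singleton.set_state(state)
    ...
```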
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `plugins/examples/example_launcher_plugin/hydra_plugins/example_launcher_plugin/example_launcher.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
2 import logging
3 from pathlib import Path
4 from typing import Optional, Sequence
5
6 from hydra.core.config_loader import ConfigLoader
7 from hydra.core.config_search_path import ConfigSearchPath
8 from hydra.core.hydra_config import HydraConfig
9 from hydra.core.utils import (
10 JobReturn,
11 configure_log,
12 filter_overrides,
13 run_job,
14 setup_globals,
15 )
16 from hydra.plugins.launcher import Launcher
17 from hydra.plugins.search_path_plugin import SearchPathPlugin
18 from hydra.types import TaskFunction
19 from omegaconf import DictConfig, open_dict
20
21 # IMPORTANT:
22 # If your plugin imports any module that takes more than a fraction of a second to import,
23 # Import the module lazily (typically inside launch()).
24 # Installed plugins are imported during Hydra initialization and plugins that are slow to import plugins will slow
25 # the startup of ALL hydra applications.
26 # Another approach is to place heavy includes in a file prefixed by _, such as _core.py:
27 # Hydra will not look for plugin in such files and will not import them during plugin discovery.
28
29
30 log = logging.getLogger(__name__)
31
32
33 class ExampleLauncherSearchPathPlugin(SearchPathPlugin):
34 """
35 This plugin is allowing configuration files provided by the ExampleLauncher plugin to be discovered
36 and used once the ExampleLauncher plugin is installed
37 """
38
39 def manipulate_search_path(self, search_path: ConfigSearchPath) -> None:
40 # Appends the search path for this plugin to the end of the search path
41 search_path.append(
42 "hydra-example-launcher", "pkg://hydra_plugins.example_launcher_plugin.conf"
43 )
44
45
46 class ExampleLauncher(Launcher):
47 def __init__(self, foo: str, bar: str) -> None:
48 self.config: Optional[DictConfig] = None
49 self.config_loader: Optional[ConfigLoader] = None
50 self.task_function: Optional[TaskFunction] = None
51
52 # foo and var are coming from the the plugin's configuration
53 self.foo = foo
54 self.bar = bar
55
56 def setup(
57 self,
58 config: DictConfig,
59 config_loader: ConfigLoader,
60 task_function: TaskFunction,
61 ) -> None:
62 self.config = config
63 self.config_loader = config_loader
64 self.task_function = task_function
65
66 def launch(
67 self, job_overrides: Sequence[Sequence[str]], initial_job_idx: int
68 ) -> Sequence[JobReturn]:
69 """
70 :param job_overrides: a List of List<String>, where each inner list is the arguments for one job run.
71 :param initial_job_idx: Initial job idx in batch.
72 :return: an array of return values from run_job with indexes corresponding to the input list indexes.
73 """
74 setup_globals()
75 assert self.config is not None
76 assert self.config_loader is not None
77 assert self.task_function is not None
78
79 configure_log(self.config.hydra.hydra_logging, self.config.hydra.verbose)
80 sweep_dir = Path(str(self.config.hydra.sweep.dir))
81 sweep_dir.mkdir(parents=True, exist_ok=True)
82 log.info(
83 f"Example Launcher(foo={self.foo}, bar={self.bar}) is launching {len(job_overrides)} jobs locally"
84 )
85 log.info(f"Sweep output dir : {sweep_dir}")
86 runs = []
87
88 for idx, overrides in enumerate(job_overrides):
89 idx = initial_job_idx + idx
90 lst = " ".join(filter_overrides(overrides))
91 log.info(f"\t#{idx} : {lst}")
92 sweep_config = self.config_loader.load_sweep_config(
93 self.config, list(overrides)
94 )
95 with open_dict(sweep_config):
96 # This typically coming from the underlying scheduler (SLURM_JOB_ID for instance)
97 # In that case, it will not be available here because we are still in the main process.
98 # but instead should be populated remotely before calling the task_function.
99 sweep_config.hydra.job.id = f"job_id_for_{idx}"
100 sweep_config.hydra.job.num = idx
101 HydraConfig.instance().set_config(sweep_config)
102
103 ret = run_job(
104 config=sweep_config,
105 task_function=self.task_function,
106 job_dir_key="hydra.sweep.dir",
107 job_subdir_key="hydra.sweep.subdir",
108 )
109 runs.append(ret)
110 # reconfigure the logging subsystem for Hydra as the run_job call configured it for the Job.
111 # This is needed for launchers that calls run_job in the same process and not spawn a new one.
112 configure_log(self.config.hydra.hydra_logging, self.config.hydra.verbose)
113 return runs
114
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/plugins/examples/example_launcher_plugin/hydra_plugins/example_launcher_plugin/example_launcher.py b/plugins/examples/example_launcher_plugin/hydra_plugins/example_launcher_plugin/example_launcher.py
--- a/plugins/examples/example_launcher_plugin/hydra_plugins/example_launcher_plugin/example_launcher.py
+++ b/plugins/examples/example_launcher_plugin/hydra_plugins/example_launcher_plugin/example_launcher.py
@@ -6,6 +6,7 @@
from hydra.core.config_loader import ConfigLoader
from hydra.core.config_search_path import ConfigSearchPath
from hydra.core.hydra_config import HydraConfig
+from hydra.core.singleton import Singleton
from hydra.core.utils import (
JobReturn,
configure_log,
@@ -100,6 +101,17 @@
sweep_config.hydra.job.num = idx
HydraConfig.instance().set_config(sweep_config)
+ # If your launcher is executing code in a different process, it is important to restore
+ # the singleton state in the new process.
+ # To do this, you will likely need to serialize the singleton state along with the other
+ # parameters passed to the child process.
+
+ # happening on launcher process
+ state = Singleton.get_state()
+
+ # happening on the spawned process
+ Singleton.set_state(state)
+
ret = run_job(
config=sweep_config,
task_function=self.task_function,
| {"golden_diff": "diff --git a/plugins/examples/example_launcher_plugin/hydra_plugins/example_launcher_plugin/example_launcher.py b/plugins/examples/example_launcher_plugin/hydra_plugins/example_launcher_plugin/example_launcher.py\n--- a/plugins/examples/example_launcher_plugin/hydra_plugins/example_launcher_plugin/example_launcher.py\n+++ b/plugins/examples/example_launcher_plugin/hydra_plugins/example_launcher_plugin/example_launcher.py\n@@ -6,6 +6,7 @@\n from hydra.core.config_loader import ConfigLoader\n from hydra.core.config_search_path import ConfigSearchPath\n from hydra.core.hydra_config import HydraConfig\n+from hydra.core.singleton import Singleton\n from hydra.core.utils import (\n JobReturn,\n configure_log,\n@@ -100,6 +101,17 @@\n sweep_config.hydra.job.num = idx\n HydraConfig.instance().set_config(sweep_config)\n \n+ # If your launcher is executing code in a different process, it is important to restore\n+ # the singleton state in the new process.\n+ # To do this, you will likely need to serialize the singleton state along with the other\n+ # parameters passed to the child process.\n+\n+ # happening on launcher process\n+ state = Singleton.get_state()\n+\n+ # happening on the spawned process\n+ Singleton.set_state(state)\n+\n ret = run_job(\n config=sweep_config,\n task_function=self.task_function,\n", "issue": "Update Example Launcher Plugin mention restoring the singletons\nThe singletons should be restored when performing multi-process launching.\r\nsomething \r\n\r\n```python\r\n# In the launcher's launch:\r\nstate = Singleton.get_state()\r\n\r\n# In the remote process\r\nSingleton.set_state(state)\r\n```\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\nimport logging\nfrom pathlib import Path\nfrom typing import Optional, Sequence\n\nfrom hydra.core.config_loader import ConfigLoader\nfrom hydra.core.config_search_path import ConfigSearchPath\nfrom hydra.core.hydra_config import HydraConfig\nfrom hydra.core.utils import (\n JobReturn,\n configure_log,\n filter_overrides,\n run_job,\n setup_globals,\n)\nfrom hydra.plugins.launcher import Launcher\nfrom hydra.plugins.search_path_plugin import SearchPathPlugin\nfrom hydra.types import TaskFunction\nfrom omegaconf import DictConfig, open_dict\n\n# IMPORTANT:\n# If your plugin imports any module that takes more than a fraction of a second to import,\n# Import the module lazily (typically inside launch()).\n# Installed plugins are imported during Hydra initialization and plugins that are slow to import plugins will slow\n# the startup of ALL hydra applications.\n# Another approach is to place heavy includes in a file prefixed by _, such as _core.py:\n# Hydra will not look for plugin in such files and will not import them during plugin discovery.\n\n\nlog = logging.getLogger(__name__)\n\n\nclass ExampleLauncherSearchPathPlugin(SearchPathPlugin):\n \"\"\"\n This plugin is allowing configuration files provided by the ExampleLauncher plugin to be discovered\n and used once the ExampleLauncher plugin is installed\n \"\"\"\n\n def manipulate_search_path(self, search_path: ConfigSearchPath) -> None:\n # Appends the search path for this plugin to the end of the search path\n search_path.append(\n \"hydra-example-launcher\", \"pkg://hydra_plugins.example_launcher_plugin.conf\"\n )\n\n\nclass ExampleLauncher(Launcher):\n def __init__(self, foo: str, bar: str) -> None:\n self.config: Optional[DictConfig] = None\n self.config_loader: Optional[ConfigLoader] = None\n 
self.task_function: Optional[TaskFunction] = None\n\n # foo and var are coming from the the plugin's configuration\n self.foo = foo\n self.bar = bar\n\n def setup(\n self,\n config: DictConfig,\n config_loader: ConfigLoader,\n task_function: TaskFunction,\n ) -> None:\n self.config = config\n self.config_loader = config_loader\n self.task_function = task_function\n\n def launch(\n self, job_overrides: Sequence[Sequence[str]], initial_job_idx: int\n ) -> Sequence[JobReturn]:\n \"\"\"\n :param job_overrides: a List of List<String>, where each inner list is the arguments for one job run.\n :param initial_job_idx: Initial job idx in batch.\n :return: an array of return values from run_job with indexes corresponding to the input list indexes.\n \"\"\"\n setup_globals()\n assert self.config is not None\n assert self.config_loader is not None\n assert self.task_function is not None\n\n configure_log(self.config.hydra.hydra_logging, self.config.hydra.verbose)\n sweep_dir = Path(str(self.config.hydra.sweep.dir))\n sweep_dir.mkdir(parents=True, exist_ok=True)\n log.info(\n f\"Example Launcher(foo={self.foo}, bar={self.bar}) is launching {len(job_overrides)} jobs locally\"\n )\n log.info(f\"Sweep output dir : {sweep_dir}\")\n runs = []\n\n for idx, overrides in enumerate(job_overrides):\n idx = initial_job_idx + idx\n lst = \" \".join(filter_overrides(overrides))\n log.info(f\"\\t#{idx} : {lst}\")\n sweep_config = self.config_loader.load_sweep_config(\n self.config, list(overrides)\n )\n with open_dict(sweep_config):\n # This typically coming from the underlying scheduler (SLURM_JOB_ID for instance)\n # In that case, it will not be available here because we are still in the main process.\n # but instead should be populated remotely before calling the task_function.\n sweep_config.hydra.job.id = f\"job_id_for_{idx}\"\n sweep_config.hydra.job.num = idx\n HydraConfig.instance().set_config(sweep_config)\n\n ret = run_job(\n config=sweep_config,\n task_function=self.task_function,\n job_dir_key=\"hydra.sweep.dir\",\n job_subdir_key=\"hydra.sweep.subdir\",\n )\n runs.append(ret)\n # reconfigure the logging subsystem for Hydra as the run_job call configured it for the Job.\n # This is needed for launchers that calls run_job in the same process and not spawn a new one.\n configure_log(self.config.hydra.hydra_logging, self.config.hydra.verbose)\n return runs\n", "path": "plugins/examples/example_launcher_plugin/hydra_plugins/example_launcher_plugin/example_launcher.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. 
All Rights Reserved\nimport logging\nfrom pathlib import Path\nfrom typing import Optional, Sequence\n\nfrom hydra.core.config_loader import ConfigLoader\nfrom hydra.core.config_search_path import ConfigSearchPath\nfrom hydra.core.hydra_config import HydraConfig\nfrom hydra.core.singleton import Singleton\nfrom hydra.core.utils import (\n JobReturn,\n configure_log,\n filter_overrides,\n run_job,\n setup_globals,\n)\nfrom hydra.plugins.launcher import Launcher\nfrom hydra.plugins.search_path_plugin import SearchPathPlugin\nfrom hydra.types import TaskFunction\nfrom omegaconf import DictConfig, open_dict\n\n# IMPORTANT:\n# If your plugin imports any module that takes more than a fraction of a second to import,\n# Import the module lazily (typically inside launch()).\n# Installed plugins are imported during Hydra initialization and plugins that are slow to import plugins will slow\n# the startup of ALL hydra applications.\n# Another approach is to place heavy includes in a file prefixed by _, such as _core.py:\n# Hydra will not look for plugin in such files and will not import them during plugin discovery.\n\n\nlog = logging.getLogger(__name__)\n\n\nclass ExampleLauncherSearchPathPlugin(SearchPathPlugin):\n \"\"\"\n This plugin is allowing configuration files provided by the ExampleLauncher plugin to be discovered\n and used once the ExampleLauncher plugin is installed\n \"\"\"\n\n def manipulate_search_path(self, search_path: ConfigSearchPath) -> None:\n # Appends the search path for this plugin to the end of the search path\n search_path.append(\n \"hydra-example-launcher\", \"pkg://hydra_plugins.example_launcher_plugin.conf\"\n )\n\n\nclass ExampleLauncher(Launcher):\n def __init__(self, foo: str, bar: str) -> None:\n self.config: Optional[DictConfig] = None\n self.config_loader: Optional[ConfigLoader] = None\n self.task_function: Optional[TaskFunction] = None\n\n # foo and var are coming from the the plugin's configuration\n self.foo = foo\n self.bar = bar\n\n def setup(\n self,\n config: DictConfig,\n config_loader: ConfigLoader,\n task_function: TaskFunction,\n ) -> None:\n self.config = config\n self.config_loader = config_loader\n self.task_function = task_function\n\n def launch(\n self, job_overrides: Sequence[Sequence[str]], initial_job_idx: int\n ) -> Sequence[JobReturn]:\n \"\"\"\n :param job_overrides: a List of List<String>, where each inner list is the arguments for one job run.\n :param initial_job_idx: Initial job idx in batch.\n :return: an array of return values from run_job with indexes corresponding to the input list indexes.\n \"\"\"\n setup_globals()\n assert self.config is not None\n assert self.config_loader is not None\n assert self.task_function is not None\n\n configure_log(self.config.hydra.hydra_logging, self.config.hydra.verbose)\n sweep_dir = Path(str(self.config.hydra.sweep.dir))\n sweep_dir.mkdir(parents=True, exist_ok=True)\n log.info(\n f\"Example Launcher(foo={self.foo}, bar={self.bar}) is launching {len(job_overrides)} jobs locally\"\n )\n log.info(f\"Sweep output dir : {sweep_dir}\")\n runs = []\n\n for idx, overrides in enumerate(job_overrides):\n idx = initial_job_idx + idx\n lst = \" \".join(filter_overrides(overrides))\n log.info(f\"\\t#{idx} : {lst}\")\n sweep_config = self.config_loader.load_sweep_config(\n self.config, list(overrides)\n )\n with open_dict(sweep_config):\n # This typically coming from the underlying scheduler (SLURM_JOB_ID for instance)\n # In that case, it will not be available here because we are still in the main process.\n # 
but instead should be populated remotely before calling the task_function.\n sweep_config.hydra.job.id = f\"job_id_for_{idx}\"\n sweep_config.hydra.job.num = idx\n HydraConfig.instance().set_config(sweep_config)\n\n # If your launcher is executing code in a different process, it is important to restore\n # the singleton state in the new process.\n # To do this, you will likely need to serialize the singleton state along with the other\n # parameters passed to the child process.\n\n # happening on launcher process\n state = Singleton.get_state()\n\n # happening on the spawned process\n Singleton.set_state(state)\n\n ret = run_job(\n config=sweep_config,\n task_function=self.task_function,\n job_dir_key=\"hydra.sweep.dir\",\n job_subdir_key=\"hydra.sweep.subdir\",\n )\n runs.append(ret)\n # reconfigure the logging subsystem for Hydra as the run_job call configured it for the Job.\n # This is needed for launchers that calls run_job in the same process and not spawn a new one.\n configure_log(self.config.hydra.hydra_logging, self.config.hydra.verbose)\n return runs\n", "path": "plugins/examples/example_launcher_plugin/hydra_plugins/example_launcher_plugin/example_launcher.py"}]} | 1,553 | 292 |
gh_patches_debug_30573 | rasdani/github-patches | git_diff | wagtail__wagtail-1357 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
StreamField EmbedBlock renders the url only
Given the following setup:
```
class EmbedOnImage(StructBlock):
embed = EmbedBlock()
background_image = ImageChooserBlock(required=False)
class Meta:
template = 'streamfield_blocks/embed_on_image.html'
```
`{{ self.embed }}` outputs only the embed URL in `streamfield_blocks/embed_on_image.html`.
And `EmbedBlock.render_basic` is never called: https://github.com/torchbox/wagtail/blob/master/wagtail/wagtailembeds/blocks.py#L8
Also, if I apply the embed filter too (`{{ self.embed|embed }}`), it outputs nothing, because the embed filter won't call `embed_to_frontend_html` either, which would use the correct template to render it: https://github.com/torchbox/wagtail/blob/master/wagtail/wagtailembeds/templatetags/wagtailembeds_tags.py#L14
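
One direction for a fix (a sketch only; the class name and method set here are illustrative) is to give the block a richer native value whose string form is the embed HTML, so that `{{ self.embed }}` renders the embed rather than the bare URL:

```python
from django.utils.encoding import python_2_unicode_compatible

from wagtail.wagtailcore import blocks
from wagtail.wagtailembeds.format import embed_to_frontend_html


@python_2_unicode_compatible
class EmbedValue(object):
    """Keeps the URL available while rendering as the embed HTML."""

    def __init__(self, url):
        self.url = url

    def __str__(self):
        return embed_to_frontend_html(self.url)


class EmbedBlock(blocks.URLBlock):
    def to_python(self, value):
        # The stored JSON value is a URL string; expose it as an EmbedValue.
        return EmbedValue(value) if value else None

    def get_prep_value(self, value):
        # Serialise back to the plain URL string.
        return value.url if value else ''
```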
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wagtail/wagtailembeds/blocks.py`
Content:
```
1 from wagtail.wagtailcore import blocks
2
3 from wagtail.wagtailembeds.format import embed_to_frontend_html
4
5
6 class EmbedBlock(blocks.URLBlock):
7 def render_basic(self, value):
8 return embed_to_frontend_html(value)
9
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/wagtail/wagtailembeds/blocks.py b/wagtail/wagtailembeds/blocks.py
--- a/wagtail/wagtailembeds/blocks.py
+++ b/wagtail/wagtailembeds/blocks.py
@@ -1,8 +1,62 @@
+from django.utils.encoding import python_2_unicode_compatible
+
from wagtail.wagtailcore import blocks
from wagtail.wagtailembeds.format import embed_to_frontend_html
+@python_2_unicode_compatible
+class EmbedValue(object):
+ """
+ Native value of an EmbedBlock. Should, at minimum, have a 'url' property
+ and render as the embed HTML when rendered in a template.
+ NB We don't use a wagtailembeds.model.Embed object for this, because
+ we want to be able to do {{ value.url|embed:max_width=500 }} without
+ doing a redundant fetch of the embed at the default width.
+ """
+ def __init__(self, url):
+ self.url = url
+
+ def __str__(self):
+ return embed_to_frontend_html(self.url)
+
+
class EmbedBlock(blocks.URLBlock):
- def render_basic(self, value):
- return embed_to_frontend_html(value)
+ def get_default(self):
+ # Allow specifying the default for an EmbedBlock as either an EmbedValue or a string (or None).
+ if not self.meta.default:
+ return None
+ elif isinstance(self.meta.default, EmbedValue):
+ return self.meta.default
+ else:
+ # assume default has been passed as a string
+ return EmbedValue(self.meta.default)
+
+ def to_python(self, value):
+ # The JSON representation of an EmbedBlock's value is a URL string;
+ # this should be converted to an EmbedValue (or None).
+ if not value:
+ return None
+ else:
+ return EmbedValue(value)
+
+ def get_prep_value(self, value):
+ # serialisable value should be a URL string
+ if value is None:
+ return ''
+ else:
+ return value.url
+
+ def value_for_form(self, value):
+ # the value to be handled by the URLField is a plain URL string (or the empty string)
+ if value is None:
+ return ''
+ else:
+ return value.url
+
+ def value_from_form(self, value):
+ # convert the value returned from the form (a URL string) to an EmbedValue (or None)
+ if not value:
+ return None
+ else:
+ return EmbedValue(value)
| {"golden_diff": "diff --git a/wagtail/wagtailembeds/blocks.py b/wagtail/wagtailembeds/blocks.py\n--- a/wagtail/wagtailembeds/blocks.py\n+++ b/wagtail/wagtailembeds/blocks.py\n@@ -1,8 +1,62 @@\n+from django.utils.encoding import python_2_unicode_compatible\n+\n from wagtail.wagtailcore import blocks\n \n from wagtail.wagtailembeds.format import embed_to_frontend_html\n \n \n+@python_2_unicode_compatible\n+class EmbedValue(object):\n+ \"\"\"\n+ Native value of an EmbedBlock. Should, at minimum, have a 'url' property\n+ and render as the embed HTML when rendered in a template.\n+ NB We don't use a wagtailembeds.model.Embed object for this, because\n+ we want to be able to do {{ value.url|embed:max_width=500 }} without\n+ doing a redundant fetch of the embed at the default width.\n+ \"\"\"\n+ def __init__(self, url):\n+ self.url = url\n+\n+ def __str__(self):\n+ return embed_to_frontend_html(self.url)\n+\n+\n class EmbedBlock(blocks.URLBlock):\n- def render_basic(self, value):\n- return embed_to_frontend_html(value)\n+ def get_default(self):\n+ # Allow specifying the default for an EmbedBlock as either an EmbedValue or a string (or None).\n+ if not self.meta.default:\n+ return None\n+ elif isinstance(self.meta.default, EmbedValue):\n+ return self.meta.default\n+ else:\n+ # assume default has been passed as a string\n+ return EmbedValue(self.meta.default)\n+\n+ def to_python(self, value):\n+ # The JSON representation of an EmbedBlock's value is a URL string;\n+ # this should be converted to an EmbedValue (or None).\n+ if not value:\n+ return None\n+ else:\n+ return EmbedValue(value)\n+\n+ def get_prep_value(self, value):\n+ # serialisable value should be a URL string\n+ if value is None:\n+ return ''\n+ else:\n+ return value.url\n+\n+ def value_for_form(self, value):\n+ # the value to be handled by the URLField is a plain URL string (or the empty string)\n+ if value is None:\n+ return ''\n+ else:\n+ return value.url\n+\n+ def value_from_form(self, value):\n+ # convert the value returned from the form (a URL string) to an EmbedValue (or None)\n+ if not value:\n+ return None\n+ else:\n+ return EmbedValue(value)\n", "issue": "StreamField EmbedBlock renders the url only\nGiven the following setup:\n\n```\nclass EmbedOnImage(StructBlock):\n embed = EmbedBlock()\n background_image = ImageChooserBlock(required=False)\n\n class Meta:\n template = 'streamfield_blocks/embed_on_image.html'\n```\n\n`{{ self.embed }}` outputs the embed url only in `streamfield_blocks/embed_on_image.html`.\n\nAnd `EmbedBlock.render_basic` is never called: https://github.com/torchbox/wagtail/blob/master/wagtail/wagtailembeds/blocks.py#L8\n\nAlso, if I apply the embed filter too: `{{ self.embed|embed }}` then it outputs nothing because the embed filter won't call `embed_to_frontend_html` either, which would use the correct template to render it: https://github.com/torchbox/wagtail/blob/master/wagtail/wagtailembeds/templatetags/wagtailembeds_tags.py#L14\n\n", "before_files": [{"content": "from wagtail.wagtailcore import blocks\n\nfrom wagtail.wagtailembeds.format import embed_to_frontend_html\n\n\nclass EmbedBlock(blocks.URLBlock):\n def render_basic(self, value):\n return embed_to_frontend_html(value)\n", "path": "wagtail/wagtailembeds/blocks.py"}], "after_files": [{"content": "from django.utils.encoding import python_2_unicode_compatible\n\nfrom wagtail.wagtailcore import blocks\n\nfrom wagtail.wagtailembeds.format import embed_to_frontend_html\n\n\n@python_2_unicode_compatible\nclass EmbedValue(object):\n \"\"\"\n Native 
value of an EmbedBlock. Should, at minimum, have a 'url' property\n and render as the embed HTML when rendered in a template.\n NB We don't use a wagtailembeds.model.Embed object for this, because\n we want to be able to do {{ value.url|embed:max_width=500 }} without\n doing a redundant fetch of the embed at the default width.\n \"\"\"\n def __init__(self, url):\n self.url = url\n\n def __str__(self):\n return embed_to_frontend_html(self.url)\n\n\nclass EmbedBlock(blocks.URLBlock):\n def get_default(self):\n # Allow specifying the default for an EmbedBlock as either an EmbedValue or a string (or None).\n if not self.meta.default:\n return None\n elif isinstance(self.meta.default, EmbedValue):\n return self.meta.default\n else:\n # assume default has been passed as a string\n return EmbedValue(self.meta.default)\n\n def to_python(self, value):\n # The JSON representation of an EmbedBlock's value is a URL string;\n # this should be converted to an EmbedValue (or None).\n if not value:\n return None\n else:\n return EmbedValue(value)\n\n def get_prep_value(self, value):\n # serialisable value should be a URL string\n if value is None:\n return ''\n else:\n return value.url\n\n def value_for_form(self, value):\n # the value to be handled by the URLField is a plain URL string (or the empty string)\n if value is None:\n return ''\n else:\n return value.url\n\n def value_from_form(self, value):\n # convert the value returned from the form (a URL string) to an EmbedValue (or None)\n if not value:\n return None\n else:\n return EmbedValue(value)\n", "path": "wagtail/wagtailembeds/blocks.py"}]} | 536 | 600 |
gh_patches_debug_30740 | rasdani/github-patches | git_diff | translate__translate-3520 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add traits and variants support in l20nfile
L20n traits allow storing value variants and additional information for a unit. We need to add such support when parsing `.ftl` files and converting resources.
```
file-menu =
[xul/label] File
[xul/accesskey] F
```
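
As a rough sketch of the intended behaviour (assuming the `FTLSerializer` interface used elsewhere in this store, with `dumpPattern` for the value and `dumpMembers` for traits), the unit value for such an entity would combine the pattern with its traits:

```python
from l20n.format.serializer import FTLSerializer as L20nSerializer


def dump_entity_value(entity):
    """Return the entity's pattern, with any traits indented beneath it."""
    serializer = L20nSerializer()
    value = serializer.dumpPattern(entity['value'])
    if len(entity['traits']):
        traits = serializer.dumpMembers(entity['traits'], 2)
        return u'{}\n{}'.format(value, traits)
    return value
```

A `file-menu` entity parsed from the snippet above would then keep both the `xul/label` and `xul/accesskey` variants in its translation unit instead of dropping them.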
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `translate/storage/l20n.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 #
3 # Copyright 2016 Zuza Software Foundation
4 #
5 # This file is part of translate.
6 #
7 # translate is free software; you can redistribute it and/or modify
8 # it under the terms of the GNU General Public License as published by
9 # the Free Software Foundation; either version 2 of the License, or
10 # (at your option) any later version.
11 #
12 # translate is distributed in the hope that it will be useful,
13 # but WITHOUT ANY WARRANTY; without even the implied warranty of
14 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15 # GNU General Public License for more details.
16 #
17 # You should have received a copy of the GNU General Public License
18 # along with this program; if not, see <http://www.gnu.org/licenses/>.
19
20 from __future__ import absolute_import
21
22 from codecs import iterencode
23
24 from l20n.format.parser import FTLParser as L20nParser
25 from l20n.format.serializer import FTLSerializer as L20nSerializer
26
27 from translate.storage import base
28
29
30 class l20nunit(base.TranslationUnit):
31 """Single L20n Entity"""
32
33 def __init__(self, source='', id='', comment=''):
34 super(l20nunit, self).__init__(source)
35 self.id = id
36 self.value = source
37 self.comment = comment
38
39 # Note that source and target are equivalent for monolingual units
40 def getsource(self):
41 return self.value
42
43 def setsource(self, source):
44 self.value = source
45
46 source = property(getsource, setsource)
47
48 def gettarget(self):
49 return self.value
50
51 def settarget(self, target):
52 self.value = target
53
54 target = property(gettarget, settarget)
55
56 def getid(self):
57 return self.id
58
59 def setid(self, new_id):
60 self.id = new_id
61
62 def getoutput(self):
63 return u"%s = %s\n" % (self.id, self.value)
64
65
66 class l20nfile(base.TranslationStore):
67 UnitClass = l20nunit
68 encoding = 'utf8'
69
70 def __init__(self, inputfile=None):
71 super(l20nfile, self).__init__(unitclass=self.UnitClass)
72 self.filename = getattr(inputfile, 'name', '')
73 if inputfile is not None:
74 l20nsrc = inputfile.read()
75 self.parse(l20nsrc)
76 self.makeindex()
77
78 def parse(self, l20nsrc):
79 text, encoding = self.detect_encoding(
80 l20nsrc, default_encodings=[self.encoding])
81 if not text:
82 raise IOError("Cannot detect encoding for %s." % (self.filename or
83 "given string"))
84 l20nsrc = text
85
86 parser = L20nParser()
87 ast, errors = parser.parseResource(l20nsrc)
88
89 for entry in ast['body']:
90 if entry['type'] == 'Entity':
91 translation = L20nSerializer().dumpPattern(entry['value'])
92 comment = ''
93 if entry['comment']:
94 comment = entry['comment']['content']
95
96 newl20n = l20nunit(
97 source=translation,
98 id=entry['id']['name'],
99 comment=comment
100 )
101 self.addunit(newl20n)
102
103 def serialize(self, out):
104 """Write the units back to file."""
105 # Thanks to iterencode, a possible BOM is written only once
106 for chunk in iterencode((unit.getoutput() for unit in self.units), self.encoding):
107 out.write(chunk)
108
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/translate/storage/l20n.py b/translate/storage/l20n.py
--- a/translate/storage/l20n.py
+++ b/translate/storage/l20n.py
@@ -27,6 +27,17 @@
from translate.storage import base
+def dump_l20n_entity_value(entity):
+ serializer = L20nSerializer()
+ value = serializer.dumpPattern(entity['value'])
+
+ if len(entity['traits']):
+ traits = serializer.dumpMembers(entity['traits'], 2)
+ return u'{}\n{}'.format(value, traits)
+
+ return value
+
+
class l20nunit(base.TranslationUnit):
"""Single L20n Entity"""
@@ -75,6 +86,19 @@
self.parse(l20nsrc)
self.makeindex()
+ def parse_entity(self, entity):
+ translation = dump_l20n_entity_value(entity)
+ comment = ''
+ if entity['comment']:
+ comment = entity['comment']['content']
+
+ newl20n = l20nunit(
+ source=translation,
+ id=entity['id']['name'],
+ comment=comment
+ )
+ self.addunit(newl20n)
+
def parse(self, l20nsrc):
text, encoding = self.detect_encoding(
l20nsrc, default_encodings=[self.encoding])
@@ -88,17 +112,7 @@
for entry in ast['body']:
if entry['type'] == 'Entity':
- translation = L20nSerializer().dumpPattern(entry['value'])
- comment = ''
- if entry['comment']:
- comment = entry['comment']['content']
-
- newl20n = l20nunit(
- source=translation,
- id=entry['id']['name'],
- comment=comment
- )
- self.addunit(newl20n)
+ self.parse_entity(entry)
def serialize(self, out):
"""Write the units back to file."""
| {"golden_diff": "diff --git a/translate/storage/l20n.py b/translate/storage/l20n.py\n--- a/translate/storage/l20n.py\n+++ b/translate/storage/l20n.py\n@@ -27,6 +27,17 @@\n from translate.storage import base\n \n \n+def dump_l20n_entity_value(entity):\n+ serializer = L20nSerializer()\n+ value = serializer.dumpPattern(entity['value'])\n+\n+ if len(entity['traits']):\n+ traits = serializer.dumpMembers(entity['traits'], 2)\n+ return u'{}\\n{}'.format(value, traits)\n+\n+ return value\n+\n+\n class l20nunit(base.TranslationUnit):\n \"\"\"Single L20n Entity\"\"\"\n \n@@ -75,6 +86,19 @@\n self.parse(l20nsrc)\n self.makeindex()\n \n+ def parse_entity(self, entity):\n+ translation = dump_l20n_entity_value(entity)\n+ comment = ''\n+ if entity['comment']:\n+ comment = entity['comment']['content']\n+\n+ newl20n = l20nunit(\n+ source=translation,\n+ id=entity['id']['name'],\n+ comment=comment\n+ )\n+ self.addunit(newl20n)\n+\n def parse(self, l20nsrc):\n text, encoding = self.detect_encoding(\n l20nsrc, default_encodings=[self.encoding])\n@@ -88,17 +112,7 @@\n \n for entry in ast['body']:\n if entry['type'] == 'Entity':\n- translation = L20nSerializer().dumpPattern(entry['value'])\n- comment = ''\n- if entry['comment']:\n- comment = entry['comment']['content']\n-\n- newl20n = l20nunit(\n- source=translation,\n- id=entry['id']['name'],\n- comment=comment\n- )\n- self.addunit(newl20n)\n+ self.parse_entity(entry)\n \n def serialize(self, out):\n \"\"\"Write the units back to file.\"\"\"\n", "issue": "Add traits and variants support in l20nfile\nL20n traits allow storing value variants and additional information for a unit. We need to add such support when parsing `.ftl` and convert resources.\r\n\r\n```\r\nfile-menu =\r\n [xul/label] File\r\n [xul/accesskey] F\r\n```\r\n\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright 2016 Zuza Software Foundation\n#\n# This file is part of translate.\n#\n# translate is free software; you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation; either version 2 of the License, or\n# (at your option) any later version.\n#\n# translate is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with this program; if not, see <http://www.gnu.org/licenses/>.\n\nfrom __future__ import absolute_import\n\nfrom codecs import iterencode\n\nfrom l20n.format.parser import FTLParser as L20nParser\nfrom l20n.format.serializer import FTLSerializer as L20nSerializer\n\nfrom translate.storage import base\n\n\nclass l20nunit(base.TranslationUnit):\n \"\"\"Single L20n Entity\"\"\"\n\n def __init__(self, source='', id='', comment=''):\n super(l20nunit, self).__init__(source)\n self.id = id\n self.value = source\n self.comment = comment\n\n # Note that source and target are equivalent for monolingual units\n def getsource(self):\n return self.value\n\n def setsource(self, source):\n self.value = source\n\n source = property(getsource, setsource)\n\n def gettarget(self):\n return self.value\n\n def settarget(self, target):\n self.value = target\n\n target = property(gettarget, settarget)\n\n def getid(self):\n return self.id\n\n def setid(self, new_id):\n self.id = new_id\n\n def getoutput(self):\n return u\"%s = %s\\n\" % (self.id, self.value)\n\n\nclass l20nfile(base.TranslationStore):\n UnitClass = l20nunit\n encoding = 'utf8'\n\n def __init__(self, inputfile=None):\n super(l20nfile, self).__init__(unitclass=self.UnitClass)\n self.filename = getattr(inputfile, 'name', '')\n if inputfile is not None:\n l20nsrc = inputfile.read()\n self.parse(l20nsrc)\n self.makeindex()\n\n def parse(self, l20nsrc):\n text, encoding = self.detect_encoding(\n l20nsrc, default_encodings=[self.encoding])\n if not text:\n raise IOError(\"Cannot detect encoding for %s.\" % (self.filename or\n \"given string\"))\n l20nsrc = text\n\n parser = L20nParser()\n ast, errors = parser.parseResource(l20nsrc)\n\n for entry in ast['body']:\n if entry['type'] == 'Entity':\n translation = L20nSerializer().dumpPattern(entry['value'])\n comment = ''\n if entry['comment']:\n comment = entry['comment']['content']\n\n newl20n = l20nunit(\n source=translation,\n id=entry['id']['name'],\n comment=comment\n )\n self.addunit(newl20n)\n\n def serialize(self, out):\n \"\"\"Write the units back to file.\"\"\"\n # Thanks to iterencode, a possible BOM is written only once\n for chunk in iterencode((unit.getoutput() for unit in self.units), self.encoding):\n out.write(chunk)\n", "path": "translate/storage/l20n.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright 2016 Zuza Software Foundation\n#\n# This file is part of translate.\n#\n# translate is free software; you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation; either version 2 of the License, or\n# (at your option) any later version.\n#\n# translate is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with this program; if not, see <http://www.gnu.org/licenses/>.\n\nfrom __future__ import absolute_import\n\nfrom codecs import iterencode\n\nfrom l20n.format.parser import FTLParser as L20nParser\nfrom l20n.format.serializer import FTLSerializer as L20nSerializer\n\nfrom translate.storage import base\n\n\ndef dump_l20n_entity_value(entity):\n serializer = L20nSerializer()\n value = serializer.dumpPattern(entity['value'])\n\n if len(entity['traits']):\n traits = serializer.dumpMembers(entity['traits'], 2)\n return u'{}\\n{}'.format(value, traits)\n\n return value\n\n\nclass l20nunit(base.TranslationUnit):\n \"\"\"Single L20n Entity\"\"\"\n\n def __init__(self, source='', id='', comment=''):\n super(l20nunit, self).__init__(source)\n self.id = id\n self.value = source\n self.comment = comment\n\n # Note that source and target are equivalent for monolingual units\n def getsource(self):\n return self.value\n\n def setsource(self, source):\n self.value = source\n\n source = property(getsource, setsource)\n\n def gettarget(self):\n return self.value\n\n def settarget(self, target):\n self.value = target\n\n target = property(gettarget, settarget)\n\n def getid(self):\n return self.id\n\n def setid(self, new_id):\n self.id = new_id\n\n def getoutput(self):\n return u\"%s = %s\\n\" % (self.id, self.value)\n\n\nclass l20nfile(base.TranslationStore):\n UnitClass = l20nunit\n encoding = 'utf8'\n\n def __init__(self, inputfile=None):\n super(l20nfile, self).__init__(unitclass=self.UnitClass)\n self.filename = getattr(inputfile, 'name', '')\n if inputfile is not None:\n l20nsrc = inputfile.read()\n self.parse(l20nsrc)\n self.makeindex()\n\n def parse_entity(self, entity):\n translation = dump_l20n_entity_value(entity)\n comment = ''\n if entity['comment']:\n comment = entity['comment']['content']\n\n newl20n = l20nunit(\n source=translation,\n id=entity['id']['name'],\n comment=comment\n )\n self.addunit(newl20n)\n\n def parse(self, l20nsrc):\n text, encoding = self.detect_encoding(\n l20nsrc, default_encodings=[self.encoding])\n if not text:\n raise IOError(\"Cannot detect encoding for %s.\" % (self.filename or\n \"given string\"))\n l20nsrc = text\n\n parser = L20nParser()\n ast, errors = parser.parseResource(l20nsrc)\n\n for entry in ast['body']:\n if entry['type'] == 'Entity':\n self.parse_entity(entry)\n\n def serialize(self, out):\n \"\"\"Write the units back to file.\"\"\"\n # Thanks to iterencode, a possible BOM is written only once\n for chunk in iterencode((unit.getoutput() for unit in self.units), self.encoding):\n out.write(chunk)\n", "path": "translate/storage/l20n.py"}]} | 1,346 | 465 |
gh_patches_debug_7268 | rasdani/github-patches | git_diff | facebookresearch__hydra-165 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
output/hydra.yaml is wrong
hydra.yaml in the output directory is malformed (not rooted at `hydra.`).
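
To make the expectation concrete (keys abbreviated and purely illustrative), the file currently comes out with the hydra settings at the top level, whereas everything should be nested under a single `hydra` key:

```python
# What currently lands in output/hydra.yaml (hydra settings at the top level):
malformed = {"job": ..., "sweep": ..., "overrides": ...}

# What the saved file should contain (everything rooted under "hydra"):
expected = {"hydra": {"job": ..., "sweep": ..., "overrides": ...}}
```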
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `hydra/plugins/common/utils.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
2
3 import copy
4 import logging
5 import os
6 import re
7 import sys
8 from time import strftime, localtime
9
10 import six
11 from omegaconf import OmegaConf, DictConfig
12
13 # pylint: disable=C0103
14 log = logging.getLogger(__name__)
15
16
17 def configure_log(log_config, verbose=None):
18 if log_config is not None:
19 conf = log_config.to_container(resolve=True)
20 logging.config.dictConfig(conf)
21 else:
22 # default logging to stdout
23 root = logging.getLogger()
24 root.setLevel(logging.INFO)
25 handler = logging.StreamHandler(sys.stdout)
26 formatter = logging.Formatter(
27 "[%(asctime)s][%(name)s][%(levelname)s] - %(message)s"
28 )
29 handler.setFormatter(formatter)
30 root.addHandler(handler)
31
32 if verbose is not None:
33 if verbose == "root":
34 logging.getLogger().setLevel(logging.DEBUG)
35 for logger in verbose.split(","):
36 logging.getLogger(logger).setLevel(logging.DEBUG)
37
38
39 def save_config(cfg, filename):
40 with open(os.path.join(filename), "w") as file:
41 file.write(cfg.pretty())
42
43
44 def get_overrides_dirname(lst, exclude_keys=[]):
45 lst = [x for x in lst if x not in exclude_keys]
46 lst.sort()
47 return re.sub(pattern="[=]", repl="=", string=",".join(lst))
48
49
50 def filter_overrides(overrides):
51 """
52 :param overrides: overrides list
53 :return: returning a new overrides list with all the keys starting with hydra. fitlered.
54 """
55 return [x for x in overrides if not x.startswith("hydra.")]
56
57
58 def run_job(config, task_function, verbose, job_dir_key, job_subdir_key):
59 old_cwd = os.getcwd()
60 working_dir = str(config.select(job_dir_key))
61 if job_subdir_key is not None:
62 # evaluate job_subdir_key lazily.
63 # this is running on the client side in sweep and contains things such as job:id which
64 # are only available there.
65 subdir = str(config.select(job_subdir_key))
66 working_dir = os.path.join(working_dir, subdir)
67 try:
68 ret = JobReturn()
69 ret.working_dir = working_dir
70 task_cfg = copy.deepcopy(config)
71 hydra_cfg = task_cfg["hydra"]
72 del task_cfg["hydra"]
73 ret.cfg = task_cfg
74 ret.hydra_cfg = copy.deepcopy(HydraConfig())
75 ret.overrides = config.hydra.overrides.task.to_container()
76 if not os.path.exists(working_dir):
77 os.makedirs(working_dir)
78 os.chdir(working_dir)
79 configure_log(config.hydra.job_logging, verbose)
80
81 save_config(task_cfg, "config.yaml")
82 save_config(hydra_cfg, "hydra.yaml")
83 save_config(config.hydra.overrides.task, "overrides.yaml")
84 ret.return_value = task_function(task_cfg)
85 ret.task_name = JobRuntime().get("name")
86 return ret
87 finally:
88 os.chdir(old_cwd)
89
90
91 def get_valid_filename(s):
92 s = str(s).strip().replace(" ", "_")
93 return re.sub(r"(?u)[^-\w.]", "", s)
94
95
96 def setup_globals():
97 try:
98 OmegaConf.register_resolver(
99 "now", lambda pattern: strftime(pattern, localtime())
100 )
101
102 def job_error(x):
103 raise Exception(
104 "job:{} is no longer available. use hydra.job.{}".format(x, x)
105 )
106
107 OmegaConf.register_resolver("job", job_error)
108
109 except AssertionError:
110 # calling it again in no_workers mode will throw. safe to ignore.
111 pass
112
113
114 class JobReturn:
115 def __init__(self):
116 self.overrides = None
117 self.return_value = None
118 self.cfg = None
119 self.hydra_cfg = None
120 self.working_dir = None
121 self.task_name = None
122
123
124 class Singleton(type):
125 _instances = {}
126
127 def __call__(cls, *args, **kwargs):
128 if cls not in cls._instances:
129 cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
130 return cls._instances[cls]
131
132 @staticmethod
133 def get_state():
134 return Singleton._instances
135
136 @staticmethod
137 def set_state(instances):
138 Singleton._instances = instances
139
140
141 @six.add_metaclass(Singleton)
142 class JobRuntime:
143 def __init__(self):
144 self.conf = OmegaConf.create()
145 self.set("name", "UNKNOWN_NAME")
146
147 def get(self, key):
148 ret = self.conf.select(key)
149 if ret is None:
150 raise KeyError("Key not found in {}: {}".format(type(self).__name__, key))
151 return ret
152
153 def set(self, key, value):
154 log.debug("Setting {}:{}={}".format(type(self).__name__, key, value))
155 self.conf[key] = value
156
157
158 @six.add_metaclass(Singleton)
159 class HydraConfig(DictConfig):
160 def __init__(self):
161 super(HydraConfig, self).__init__(content={})
162 self.hydra = None
163
164 def set_config(self, cfg):
165 try:
166 OmegaConf.set_readonly(self, False)
167 self.hydra = copy.deepcopy(cfg.hydra)
168 finally:
169 OmegaConf.set_readonly(self, True)
170
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/hydra/plugins/common/utils.py b/hydra/plugins/common/utils.py
--- a/hydra/plugins/common/utils.py
+++ b/hydra/plugins/common/utils.py
@@ -68,7 +68,8 @@
ret = JobReturn()
ret.working_dir = working_dir
task_cfg = copy.deepcopy(config)
- hydra_cfg = task_cfg["hydra"]
+ # TODO: update this after https://github.com/omry/omegaconf/issues/42 is resolved
+ hydra_cfg = OmegaConf.create({"hydra": task_cfg["hydra"]});
del task_cfg["hydra"]
ret.cfg = task_cfg
ret.hydra_cfg = copy.deepcopy(HydraConfig())
| {"golden_diff": "diff --git a/hydra/plugins/common/utils.py b/hydra/plugins/common/utils.py\n--- a/hydra/plugins/common/utils.py\n+++ b/hydra/plugins/common/utils.py\n@@ -68,7 +68,8 @@\n ret = JobReturn()\n ret.working_dir = working_dir\n task_cfg = copy.deepcopy(config)\n- hydra_cfg = task_cfg[\"hydra\"]\n+ # TODO: update this after https://github.com/omry/omegaconf/issues/42 is resolved\n+ hydra_cfg = OmegaConf.create({\"hydra\": task_cfg[\"hydra\"]});\n del task_cfg[\"hydra\"]\n ret.cfg = task_cfg\n ret.hydra_cfg = copy.deepcopy(HydraConfig())\n", "issue": "output/hydra.yaml is wrong\nhydra.yaml in output directory is malformed (not rooted at hydra.)\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\n\nimport copy\nimport logging\nimport os\nimport re\nimport sys\nfrom time import strftime, localtime\n\nimport six\nfrom omegaconf import OmegaConf, DictConfig\n\n# pylint: disable=C0103\nlog = logging.getLogger(__name__)\n\n\ndef configure_log(log_config, verbose=None):\n if log_config is not None:\n conf = log_config.to_container(resolve=True)\n logging.config.dictConfig(conf)\n else:\n # default logging to stdout\n root = logging.getLogger()\n root.setLevel(logging.INFO)\n handler = logging.StreamHandler(sys.stdout)\n formatter = logging.Formatter(\n \"[%(asctime)s][%(name)s][%(levelname)s] - %(message)s\"\n )\n handler.setFormatter(formatter)\n root.addHandler(handler)\n\n if verbose is not None:\n if verbose == \"root\":\n logging.getLogger().setLevel(logging.DEBUG)\n for logger in verbose.split(\",\"):\n logging.getLogger(logger).setLevel(logging.DEBUG)\n\n\ndef save_config(cfg, filename):\n with open(os.path.join(filename), \"w\") as file:\n file.write(cfg.pretty())\n\n\ndef get_overrides_dirname(lst, exclude_keys=[]):\n lst = [x for x in lst if x not in exclude_keys]\n lst.sort()\n return re.sub(pattern=\"[=]\", repl=\"=\", string=\",\".join(lst))\n\n\ndef filter_overrides(overrides):\n \"\"\"\n :param overrides: overrides list\n :return: returning a new overrides list with all the keys starting with hydra. 
fitlered.\n \"\"\"\n return [x for x in overrides if not x.startswith(\"hydra.\")]\n\n\ndef run_job(config, task_function, verbose, job_dir_key, job_subdir_key):\n old_cwd = os.getcwd()\n working_dir = str(config.select(job_dir_key))\n if job_subdir_key is not None:\n # evaluate job_subdir_key lazily.\n # this is running on the client side in sweep and contains things such as job:id which\n # are only available there.\n subdir = str(config.select(job_subdir_key))\n working_dir = os.path.join(working_dir, subdir)\n try:\n ret = JobReturn()\n ret.working_dir = working_dir\n task_cfg = copy.deepcopy(config)\n hydra_cfg = task_cfg[\"hydra\"]\n del task_cfg[\"hydra\"]\n ret.cfg = task_cfg\n ret.hydra_cfg = copy.deepcopy(HydraConfig())\n ret.overrides = config.hydra.overrides.task.to_container()\n if not os.path.exists(working_dir):\n os.makedirs(working_dir)\n os.chdir(working_dir)\n configure_log(config.hydra.job_logging, verbose)\n\n save_config(task_cfg, \"config.yaml\")\n save_config(hydra_cfg, \"hydra.yaml\")\n save_config(config.hydra.overrides.task, \"overrides.yaml\")\n ret.return_value = task_function(task_cfg)\n ret.task_name = JobRuntime().get(\"name\")\n return ret\n finally:\n os.chdir(old_cwd)\n\n\ndef get_valid_filename(s):\n s = str(s).strip().replace(\" \", \"_\")\n return re.sub(r\"(?u)[^-\\w.]\", \"\", s)\n\n\ndef setup_globals():\n try:\n OmegaConf.register_resolver(\n \"now\", lambda pattern: strftime(pattern, localtime())\n )\n\n def job_error(x):\n raise Exception(\n \"job:{} is no longer available. use hydra.job.{}\".format(x, x)\n )\n\n OmegaConf.register_resolver(\"job\", job_error)\n\n except AssertionError:\n # calling it again in no_workers mode will throw. safe to ignore.\n pass\n\n\nclass JobReturn:\n def __init__(self):\n self.overrides = None\n self.return_value = None\n self.cfg = None\n self.hydra_cfg = None\n self.working_dir = None\n self.task_name = None\n\n\nclass Singleton(type):\n _instances = {}\n\n def __call__(cls, *args, **kwargs):\n if cls not in cls._instances:\n cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)\n return cls._instances[cls]\n\n @staticmethod\n def get_state():\n return Singleton._instances\n\n @staticmethod\n def set_state(instances):\n Singleton._instances = instances\n\n\[email protected]_metaclass(Singleton)\nclass JobRuntime:\n def __init__(self):\n self.conf = OmegaConf.create()\n self.set(\"name\", \"UNKNOWN_NAME\")\n\n def get(self, key):\n ret = self.conf.select(key)\n if ret is None:\n raise KeyError(\"Key not found in {}: {}\".format(type(self).__name__, key))\n return ret\n\n def set(self, key, value):\n log.debug(\"Setting {}:{}={}\".format(type(self).__name__, key, value))\n self.conf[key] = value\n\n\[email protected]_metaclass(Singleton)\nclass HydraConfig(DictConfig):\n def __init__(self):\n super(HydraConfig, self).__init__(content={})\n self.hydra = None\n\n def set_config(self, cfg):\n try:\n OmegaConf.set_readonly(self, False)\n self.hydra = copy.deepcopy(cfg.hydra)\n finally:\n OmegaConf.set_readonly(self, True)\n", "path": "hydra/plugins/common/utils.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. 
All Rights Reserved\n\nimport copy\nimport logging\nimport os\nimport re\nimport sys\nfrom time import strftime, localtime\n\nimport six\nfrom omegaconf import OmegaConf, DictConfig\n\n# pylint: disable=C0103\nlog = logging.getLogger(__name__)\n\n\ndef configure_log(log_config, verbose=None):\n if log_config is not None:\n conf = log_config.to_container(resolve=True)\n logging.config.dictConfig(conf)\n else:\n # default logging to stdout\n root = logging.getLogger()\n root.setLevel(logging.INFO)\n handler = logging.StreamHandler(sys.stdout)\n formatter = logging.Formatter(\n \"[%(asctime)s][%(name)s][%(levelname)s] - %(message)s\"\n )\n handler.setFormatter(formatter)\n root.addHandler(handler)\n\n if verbose is not None:\n if verbose == \"root\":\n logging.getLogger().setLevel(logging.DEBUG)\n for logger in verbose.split(\",\"):\n logging.getLogger(logger).setLevel(logging.DEBUG)\n\n\ndef save_config(cfg, filename):\n with open(os.path.join(filename), \"w\") as file:\n file.write(cfg.pretty())\n\n\ndef get_overrides_dirname(lst, exclude_keys=[]):\n lst = [x for x in lst if x not in exclude_keys]\n lst.sort()\n return re.sub(pattern=\"[=]\", repl=\"=\", string=\",\".join(lst))\n\n\ndef filter_overrides(overrides):\n \"\"\"\n :param overrides: overrides list\n :return: returning a new overrides list with all the keys starting with hydra. fitlered.\n \"\"\"\n return [x for x in overrides if not x.startswith(\"hydra.\")]\n\n\ndef run_job(config, task_function, verbose, job_dir_key, job_subdir_key):\n old_cwd = os.getcwd()\n working_dir = str(config.select(job_dir_key))\n if job_subdir_key is not None:\n # evaluate job_subdir_key lazily.\n # this is running on the client side in sweep and contains things such as job:id which\n # are only available there.\n subdir = str(config.select(job_subdir_key))\n working_dir = os.path.join(working_dir, subdir)\n try:\n ret = JobReturn()\n ret.working_dir = working_dir\n task_cfg = copy.deepcopy(config)\n # TODO: update this after https://github.com/omry/omegaconf/issues/42 is resolved\n hydra_cfg = OmegaConf.create({\"hydra\": task_cfg[\"hydra\"]});\n del task_cfg[\"hydra\"]\n ret.cfg = task_cfg\n ret.hydra_cfg = copy.deepcopy(HydraConfig())\n ret.overrides = config.hydra.overrides.task.to_container()\n if not os.path.exists(working_dir):\n os.makedirs(working_dir)\n os.chdir(working_dir)\n configure_log(config.hydra.job_logging, verbose)\n\n save_config(task_cfg, \"config.yaml\")\n save_config(hydra_cfg, \"hydra.yaml\")\n save_config(config.hydra.overrides.task, \"overrides.yaml\")\n ret.return_value = task_function(task_cfg)\n ret.task_name = JobRuntime().get(\"name\")\n return ret\n finally:\n os.chdir(old_cwd)\n\n\ndef get_valid_filename(s):\n s = str(s).strip().replace(\" \", \"_\")\n return re.sub(r\"(?u)[^-\\w.]\", \"\", s)\n\n\ndef setup_globals():\n try:\n OmegaConf.register_resolver(\n \"now\", lambda pattern: strftime(pattern, localtime())\n )\n\n def job_error(x):\n raise Exception(\n \"job:{} is no longer available. use hydra.job.{}\".format(x, x)\n )\n\n OmegaConf.register_resolver(\"job\", job_error)\n\n except AssertionError:\n # calling it again in no_workers mode will throw. 
safe to ignore.\n pass\n\n\nclass JobReturn:\n def __init__(self):\n self.overrides = None\n self.return_value = None\n self.cfg = None\n self.hydra_cfg = None\n self.working_dir = None\n self.task_name = None\n\n\nclass Singleton(type):\n _instances = {}\n\n def __call__(cls, *args, **kwargs):\n if cls not in cls._instances:\n cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)\n return cls._instances[cls]\n\n @staticmethod\n def get_state():\n return Singleton._instances\n\n @staticmethod\n def set_state(instances):\n Singleton._instances = instances\n\n\[email protected]_metaclass(Singleton)\nclass JobRuntime:\n def __init__(self):\n self.conf = OmegaConf.create()\n self.set(\"name\", \"UNKNOWN_NAME\")\n\n def get(self, key):\n ret = self.conf.select(key)\n if ret is None:\n raise KeyError(\"Key not found in {}: {}\".format(type(self).__name__, key))\n return ret\n\n def set(self, key, value):\n log.debug(\"Setting {}:{}={}\".format(type(self).__name__, key, value))\n self.conf[key] = value\n\n\[email protected]_metaclass(Singleton)\nclass HydraConfig(DictConfig):\n def __init__(self):\n super(HydraConfig, self).__init__(content={})\n self.hydra = None\n\n def set_config(self, cfg):\n try:\n OmegaConf.set_readonly(self, False)\n self.hydra = copy.deepcopy(cfg.hydra)\n finally:\n OmegaConf.set_readonly(self, True)\n", "path": "hydra/plugins/common/utils.py"}]} | 1,844 | 160 |
gh_patches_debug_19703 | rasdani/github-patches | git_diff | certbot__certbot-5349 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Can't install new certificates with certbot-apache under Gentoo since version 0.20.0
It seems that version 0.20.0 has a major architecture revision from 0.19.0, in particular regarding OS-specific overrides. But the Gentoo overrides have a flaw when trying to install a new certificate: at one point it tries to run "apache2ctl -t -D DUMP_MODULES", which is no good at all under Gentoo because apache2ctl is a dummy script redirecting to /etc/init.d/apache2. It generates this error:
`Error in checking parameter list: /etc/init.d/apache2: invalid option -- 't'`
## My operating system is (include version):
Gentoo
## I installed Certbot with (certbot-auto, OS package manager, pip, etc):
OS package manager (emerge) with testing (~amd64) flag
## I ran this command and it produced this output:
```
certbot -d test.example.com
Saving debug log to /var/log/letsencrypt/letsencrypt.log
Plugins selected: Authenticator apache, Installer apache
Obtaining a new certificate
Performing the following challenges:
tls-sni-01 challenge for test.example.com
Waiting for verification...
Cleaning up challenges
Error in checking parameter list: /etc/init.d/apache2: invalid option -- 't'
Apache is unable to check whether or not the module is loaded because Apache is misconfigured.
```
## Certbot's behavior differed from what I expected because:
Certbot did not install the certificate because it failed to run apache2ctl -t -D DUMP_MODULES
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `certbot-apache/certbot_apache/override_gentoo.py`
Content:
```
1 """ Distribution specific override class for Gentoo Linux """
2 import pkg_resources
3
4 import zope.interface
5
6 from certbot import interfaces
7
8 from certbot_apache import apache_util
9 from certbot_apache import configurator
10 from certbot_apache import parser
11
12 @zope.interface.provider(interfaces.IPluginFactory)
13 class GentooConfigurator(configurator.ApacheConfigurator):
14 """Gentoo specific ApacheConfigurator override class"""
15
16 OS_DEFAULTS = dict(
17 server_root="/etc/apache2",
18 vhost_root="/etc/apache2/vhosts.d",
19 vhost_files="*.conf",
20 logs_root="/var/log/apache2",
21 version_cmd=['/usr/sbin/apache2', '-v'],
22 apache_cmd="apache2ctl",
23 restart_cmd=['apache2ctl', 'graceful'],
24 conftest_cmd=['apache2ctl', 'configtest'],
25 enmod=None,
26 dismod=None,
27 le_vhost_ext="-le-ssl.conf",
28 handle_mods=False,
29 handle_sites=False,
30 challenge_location="/etc/apache2/vhosts.d",
31 MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
32 "certbot_apache", "options-ssl-apache.conf")
33 )
34
35 def get_parser(self):
36 """Initializes the ApacheParser"""
37 return GentooParser(
38 self.aug, self.conf("server-root"), self.conf("vhost-root"),
39 self.version, configurator=self)
40
41
42 class GentooParser(parser.ApacheParser):
43 """Gentoo specific ApacheParser override class"""
44 def __init__(self, *args, **kwargs):
45 # Gentoo specific configuration file for Apache2
46 self.apacheconfig_filep = "/etc/conf.d/apache2"
47 super(GentooParser, self).__init__(*args, **kwargs)
48
49 def update_runtime_variables(self):
50 """ Override for update_runtime_variables for custom parsing """
51 self.parse_sysconfig_var()
52
53 def parse_sysconfig_var(self):
54 """ Parses Apache CLI options from Gentoo configuration file """
55 defines = apache_util.parse_define_file(self.apacheconfig_filep,
56 "APACHE2_OPTS")
57 for k in defines.keys():
58 self.variables[k] = defines[k]
59
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/certbot-apache/certbot_apache/override_gentoo.py b/certbot-apache/certbot_apache/override_gentoo.py
--- a/certbot-apache/certbot_apache/override_gentoo.py
+++ b/certbot-apache/certbot_apache/override_gentoo.py
@@ -49,6 +49,7 @@
def update_runtime_variables(self):
""" Override for update_runtime_variables for custom parsing """
self.parse_sysconfig_var()
+ self.update_modules()
def parse_sysconfig_var(self):
""" Parses Apache CLI options from Gentoo configuration file """
@@ -56,3 +57,10 @@
"APACHE2_OPTS")
for k in defines.keys():
self.variables[k] = defines[k]
+
+ def update_modules(self):
+ """Get loaded modules from httpd process, and add them to DOM"""
+ mod_cmd = [self.configurator.constant("apache_cmd"), "modules"]
+ matches = self.parse_from_subprocess(mod_cmd, r"(.*)_module")
+ for mod in matches:
+ self.add_mod(mod.strip())
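Editor's note on the patch above: the new `update_modules` helper avoids the failing `apache2ctl -t -D DUMP_MODULES` invocation by asking the Gentoo wrapper for its module list (`apache2ctl modules`) and parsing the `*_module` names. A self-contained sketch of that idea outside the certbot classes follows; the exact output format of the command is an assumption here.

```python
import re
import subprocess

def list_apache_modules(apache_cmd="apache2ctl"):
    """Return module name prefixes reported by ``apache2ctl modules``."""
    # The Gentoo wrapper rejects httpd-style flags such as -t -D DUMP_MODULES,
    # so the patch relies on its "modules" subcommand instead.
    output = subprocess.run([apache_cmd, "modules"],
                            capture_output=True, text=True, check=False).stdout
    # Lines are expected to look roughly like " ssl_module (shared)".
    return [name.strip() for name in re.findall(r"(.*)_module", output)]

if __name__ == "__main__":
    print(list_apache_modules())
```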
| {"golden_diff": "diff --git a/certbot-apache/certbot_apache/override_gentoo.py b/certbot-apache/certbot_apache/override_gentoo.py\n--- a/certbot-apache/certbot_apache/override_gentoo.py\n+++ b/certbot-apache/certbot_apache/override_gentoo.py\n@@ -49,6 +49,7 @@\n def update_runtime_variables(self):\n \"\"\" Override for update_runtime_variables for custom parsing \"\"\"\n self.parse_sysconfig_var()\n+ self.update_modules()\n \n def parse_sysconfig_var(self):\n \"\"\" Parses Apache CLI options from Gentoo configuration file \"\"\"\n@@ -56,3 +57,10 @@\n \"APACHE2_OPTS\")\n for k in defines.keys():\n self.variables[k] = defines[k]\n+\n+ def update_modules(self):\n+ \"\"\"Get loaded modules from httpd process, and add them to DOM\"\"\"\n+ mod_cmd = [self.configurator.constant(\"apache_cmd\"), \"modules\"]\n+ matches = self.parse_from_subprocess(mod_cmd, r\"(.*)_module\")\n+ for mod in matches:\n+ self.add_mod(mod.strip())\n", "issue": "Can't install new certificates with certbot-apache under Gentoo since version 0.20.0\nIt seems that version 0.20.0 has a major architecture revision from 0.19.0, in particular regarding OS specific overrides. But the Gentoo overrides have a flaw when trying to install a new certificate : at one point it tries to run \"apache2ctl -t -D DUMP_MODULES\", which is no good at all under Gentoo because apache2ctl is a dummy script redirecting to /etc/init.d/apache2. It generates this error :\r\n\r\n`Error in checking parameter list: /etc/init.d/apache2: invalid option -- 't'`\r\n\r\n## My operating system is (include version):\r\nGentoo\r\n\r\n## I installed Certbot with (certbot-auto, OS package manager, pip, etc):\r\nOS package manager (emerge) with testing (~amd64) flag\r\n\r\n## I ran this command and it produced this output:\r\n```\r\ncertbot -d test.example.com\r\nSaving debug log to /var/log/letsencrypt/letsencrypt.log\r\nPlugins selected: Authenticator apache, Installer apache\r\nObtaining a new certificate\r\nPerforming the following challenges:\r\ntls-sni-01 challenge for test.example.com\r\nWaiting for verification...\r\nCleaning up challenges\r\nError in checking parameter list: /etc/init.d/apache2: invalid option -- 't'\r\n\r\nApache is unable to check whether or not the module is loaded because Apache is misconfigured.\r\n```\r\n\r\n## Certbot's behavior differed from what I expected because:\r\nCertbot did not install the certificate because it failed to run apache2ctl -t -D DUMP_MODULES\n", "before_files": [{"content": "\"\"\" Distribution specific override class for Gentoo Linux \"\"\"\nimport pkg_resources\n\nimport zope.interface\n\nfrom certbot import interfaces\n\nfrom certbot_apache import apache_util\nfrom certbot_apache import configurator\nfrom certbot_apache import parser\n\[email protected](interfaces.IPluginFactory)\nclass GentooConfigurator(configurator.ApacheConfigurator):\n \"\"\"Gentoo specific ApacheConfigurator override class\"\"\"\n\n OS_DEFAULTS = dict(\n server_root=\"/etc/apache2\",\n vhost_root=\"/etc/apache2/vhosts.d\",\n vhost_files=\"*.conf\",\n logs_root=\"/var/log/apache2\",\n version_cmd=['/usr/sbin/apache2', '-v'],\n apache_cmd=\"apache2ctl\",\n restart_cmd=['apache2ctl', 'graceful'],\n conftest_cmd=['apache2ctl', 'configtest'],\n enmod=None,\n dismod=None,\n le_vhost_ext=\"-le-ssl.conf\",\n handle_mods=False,\n handle_sites=False,\n challenge_location=\"/etc/apache2/vhosts.d\",\n MOD_SSL_CONF_SRC=pkg_resources.resource_filename(\n \"certbot_apache\", \"options-ssl-apache.conf\")\n )\n\n def get_parser(self):\n 
\"\"\"Initializes the ApacheParser\"\"\"\n return GentooParser(\n self.aug, self.conf(\"server-root\"), self.conf(\"vhost-root\"),\n self.version, configurator=self)\n\n\nclass GentooParser(parser.ApacheParser):\n \"\"\"Gentoo specific ApacheParser override class\"\"\"\n def __init__(self, *args, **kwargs):\n # Gentoo specific configuration file for Apache2\n self.apacheconfig_filep = \"/etc/conf.d/apache2\"\n super(GentooParser, self).__init__(*args, **kwargs)\n\n def update_runtime_variables(self):\n \"\"\" Override for update_runtime_variables for custom parsing \"\"\"\n self.parse_sysconfig_var()\n\n def parse_sysconfig_var(self):\n \"\"\" Parses Apache CLI options from Gentoo configuration file \"\"\"\n defines = apache_util.parse_define_file(self.apacheconfig_filep,\n \"APACHE2_OPTS\")\n for k in defines.keys():\n self.variables[k] = defines[k]\n", "path": "certbot-apache/certbot_apache/override_gentoo.py"}], "after_files": [{"content": "\"\"\" Distribution specific override class for Gentoo Linux \"\"\"\nimport pkg_resources\n\nimport zope.interface\n\nfrom certbot import interfaces\n\nfrom certbot_apache import apache_util\nfrom certbot_apache import configurator\nfrom certbot_apache import parser\n\[email protected](interfaces.IPluginFactory)\nclass GentooConfigurator(configurator.ApacheConfigurator):\n \"\"\"Gentoo specific ApacheConfigurator override class\"\"\"\n\n OS_DEFAULTS = dict(\n server_root=\"/etc/apache2\",\n vhost_root=\"/etc/apache2/vhosts.d\",\n vhost_files=\"*.conf\",\n logs_root=\"/var/log/apache2\",\n version_cmd=['/usr/sbin/apache2', '-v'],\n apache_cmd=\"apache2ctl\",\n restart_cmd=['apache2ctl', 'graceful'],\n conftest_cmd=['apache2ctl', 'configtest'],\n enmod=None,\n dismod=None,\n le_vhost_ext=\"-le-ssl.conf\",\n handle_mods=False,\n handle_sites=False,\n challenge_location=\"/etc/apache2/vhosts.d\",\n MOD_SSL_CONF_SRC=pkg_resources.resource_filename(\n \"certbot_apache\", \"options-ssl-apache.conf\")\n )\n\n def get_parser(self):\n \"\"\"Initializes the ApacheParser\"\"\"\n return GentooParser(\n self.aug, self.conf(\"server-root\"), self.conf(\"vhost-root\"),\n self.version, configurator=self)\n\n\nclass GentooParser(parser.ApacheParser):\n \"\"\"Gentoo specific ApacheParser override class\"\"\"\n def __init__(self, *args, **kwargs):\n # Gentoo specific configuration file for Apache2\n self.apacheconfig_filep = \"/etc/conf.d/apache2\"\n super(GentooParser, self).__init__(*args, **kwargs)\n\n def update_runtime_variables(self):\n \"\"\" Override for update_runtime_variables for custom parsing \"\"\"\n self.parse_sysconfig_var()\n self.update_modules()\n\n def parse_sysconfig_var(self):\n \"\"\" Parses Apache CLI options from Gentoo configuration file \"\"\"\n defines = apache_util.parse_define_file(self.apacheconfig_filep,\n \"APACHE2_OPTS\")\n for k in defines.keys():\n self.variables[k] = defines[k]\n\n def update_modules(self):\n \"\"\"Get loaded modules from httpd process, and add them to DOM\"\"\"\n mod_cmd = [self.configurator.constant(\"apache_cmd\"), \"modules\"]\n matches = self.parse_from_subprocess(mod_cmd, r\"(.*)_module\")\n for mod in matches:\n self.add_mod(mod.strip())\n", "path": "certbot-apache/certbot_apache/override_gentoo.py"}]} | 1,185 | 251 |
gh_patches_debug_2097 | rasdani/github-patches | git_diff | sosreport__sos-1100 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[rabbitmq] does not elide password in /etc/rabbitmq.config
Hi,
Sosreport, as shipped in sos-3.2-36.eo7ost.1.noarch, includes `/etc/rabbitmq/rabbitmq.conf` with the `default_user` and `default_pass` values left visible instead of hidden.
Regards,
Pablo
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sos/plugins/rabbitmq.py`
Content:
```
1 # This program is free software; you can redistribute it and/or modify
2 # it under the terms of the GNU General Public License as published by
3 # the Free Software Foundation; either version 2 of the License, or
4 # (at your option) any later version.
5
6 # This program is distributed in the hope that it will be useful,
7 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # GNU General Public License for more details.
10
11 # You should have received a copy of the GNU General Public License along
12 # with this program; if not, write to the Free Software Foundation, Inc.,
13 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
14
15 from sos.plugins import Plugin, RedHatPlugin, DebianPlugin, UbuntuPlugin
16
17
18 class RabbitMQ(Plugin, RedHatPlugin, DebianPlugin, UbuntuPlugin):
19 """RabbitMQ messaging service
20 """
21 plugin_name = 'rabbitmq'
22 profiles = ('services',)
23 var_puppet_gen = "/var/lib/config-data/puppet-generated/rabbitmq"
24 files = (
25 '/etc/rabbitmq/rabbitmq.conf',
26 var_puppet_gen + '/etc/rabbitmq/rabbitmq.config'
27 )
28 packages = ('rabbitmq-server',)
29
30 def setup(self):
31 container_status = self.get_command_output(
32 "docker ps -a --format='{{ .Names }}'")
33
34 in_container = False
35 container_names = []
36 if container_status['status'] == 0:
37 for line in container_status['output'].splitlines():
38 if line.startswith("rabbitmq"):
39 in_container = True
40 container_names.append(line)
41
42 if in_container:
43 for container in container_names:
44 self.add_cmd_output('docker logs {0}'.format(container))
45 self.add_cmd_output(
46 'docker exec -t {0} rabbitmqctl report'
47 .format(container)
48 )
49 else:
50 self.add_cmd_output("rabbitmqctl report")
51
52 self.add_copy_spec([
53 "/etc/rabbitmq/*",
54 self.var_puppet_gen + "/etc/rabbitmq/*",
55 self.var_puppet_gen + "/etc/security/limits.d/",
56 self.var_puppet_gen + "/etc/systemd/"
57 ])
58 self.add_copy_spec([
59 "/var/log/rabbitmq/*",
60 "/var/log/containers/rabbitmq/*"
61 ], sizelimit=self.get_option('log_size'))
62
63
64 # vim: set et ts=4 sw=4 :
65
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/sos/plugins/rabbitmq.py b/sos/plugins/rabbitmq.py
--- a/sos/plugins/rabbitmq.py
+++ b/sos/plugins/rabbitmq.py
@@ -60,5 +60,8 @@
"/var/log/containers/rabbitmq/*"
], sizelimit=self.get_option('log_size'))
+ def postproc(self):
+ self.do_file_sub("/etc/rabbitmq/rabbitmq.conf",
+ r"(\s*default_pass\s*,\s*)\S+", r"\1<<***>>},")
# vim: set et ts=4 sw=4 :
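To make the substitution above concrete: `do_file_sub` effectively applies a regex replacement to the collected copy of the file, so its effect can be previewed with plain `re.sub`. The config line below is made up for illustration and uses the classic Erlang-term format.

```python
import re

line = '  {default_pass, <<"guest">>},'          # invented rabbitmq.config entry
pattern = r"(\s*default_pass\s*,\s*)\S+"
print(re.sub(pattern, r"\1<<***>>},", line))
# prints:   {default_pass, <<***>>},
```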
| {"golden_diff": "diff --git a/sos/plugins/rabbitmq.py b/sos/plugins/rabbitmq.py\n--- a/sos/plugins/rabbitmq.py\n+++ b/sos/plugins/rabbitmq.py\n@@ -60,5 +60,8 @@\n \"/var/log/containers/rabbitmq/*\"\n ], sizelimit=self.get_option('log_size'))\n \n+ def postproc(self):\n+ self.do_file_sub(\"/etc/rabbitmq/rabbitmq.conf\",\n+ r\"(\\s*default_pass\\s*,\\s*)\\S+\", r\"\\1<<***>>},\")\n \n # vim: set et ts=4 sw=4 :\n", "issue": "[rabbitmq] does not elide password in /etc/rabbitmq.config\nHi,\nSosreport as in sos-3.2-36.eo7ost.1.noarch includes in `/etc/rabbitmq/rabbitmq.conf` `default_user` and `default_pass` without being hidden.\n\nRegards,\nPablo\n\n", "before_files": [{"content": "# This program is free software; you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation; either version 2 of the License, or\n# (at your option) any later version.\n\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n\n# You should have received a copy of the GNU General Public License along\n# with this program; if not, write to the Free Software Foundation, Inc.,\n# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.\n\nfrom sos.plugins import Plugin, RedHatPlugin, DebianPlugin, UbuntuPlugin\n\n\nclass RabbitMQ(Plugin, RedHatPlugin, DebianPlugin, UbuntuPlugin):\n \"\"\"RabbitMQ messaging service\n \"\"\"\n plugin_name = 'rabbitmq'\n profiles = ('services',)\n var_puppet_gen = \"/var/lib/config-data/puppet-generated/rabbitmq\"\n files = (\n '/etc/rabbitmq/rabbitmq.conf',\n var_puppet_gen + '/etc/rabbitmq/rabbitmq.config'\n )\n packages = ('rabbitmq-server',)\n\n def setup(self):\n container_status = self.get_command_output(\n \"docker ps -a --format='{{ .Names }}'\")\n\n in_container = False\n container_names = []\n if container_status['status'] == 0:\n for line in container_status['output'].splitlines():\n if line.startswith(\"rabbitmq\"):\n in_container = True\n container_names.append(line)\n\n if in_container:\n for container in container_names:\n self.add_cmd_output('docker logs {0}'.format(container))\n self.add_cmd_output(\n 'docker exec -t {0} rabbitmqctl report'\n .format(container)\n )\n else:\n self.add_cmd_output(\"rabbitmqctl report\")\n\n self.add_copy_spec([\n \"/etc/rabbitmq/*\",\n self.var_puppet_gen + \"/etc/rabbitmq/*\",\n self.var_puppet_gen + \"/etc/security/limits.d/\",\n self.var_puppet_gen + \"/etc/systemd/\"\n ])\n self.add_copy_spec([\n \"/var/log/rabbitmq/*\",\n \"/var/log/containers/rabbitmq/*\"\n ], sizelimit=self.get_option('log_size'))\n\n\n# vim: set et ts=4 sw=4 :\n", "path": "sos/plugins/rabbitmq.py"}], "after_files": [{"content": "# This program is free software; you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation; either version 2 of the License, or\n# (at your option) any later version.\n\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\n# GNU General Public License for more details.\n\n# You should have received a copy of the GNU General Public License along\n# with this program; if not, write to the Free Software Foundation, Inc.,\n# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.\n\nfrom sos.plugins import Plugin, RedHatPlugin, DebianPlugin, UbuntuPlugin\n\n\nclass RabbitMQ(Plugin, RedHatPlugin, DebianPlugin, UbuntuPlugin):\n \"\"\"RabbitMQ messaging service\n \"\"\"\n plugin_name = 'rabbitmq'\n profiles = ('services',)\n var_puppet_gen = \"/var/lib/config-data/puppet-generated/rabbitmq\"\n files = (\n '/etc/rabbitmq/rabbitmq.conf',\n var_puppet_gen + '/etc/rabbitmq/rabbitmq.config'\n )\n packages = ('rabbitmq-server',)\n\n def setup(self):\n container_status = self.get_command_output(\n \"docker ps -a --format='{{ .Names }}'\")\n\n in_container = False\n container_names = []\n if container_status['status'] == 0:\n for line in container_status['output'].splitlines():\n if line.startswith(\"rabbitmq\"):\n in_container = True\n container_names.append(line)\n\n if in_container:\n for container in container_names:\n self.add_cmd_output('docker logs {0}'.format(container))\n self.add_cmd_output(\n 'docker exec -t {0} rabbitmqctl report'\n .format(container)\n )\n else:\n self.add_cmd_output(\"rabbitmqctl report\")\n\n self.add_copy_spec([\n \"/etc/rabbitmq/*\",\n self.var_puppet_gen + \"/etc/rabbitmq/*\",\n self.var_puppet_gen + \"/etc/security/limits.d/\",\n self.var_puppet_gen + \"/etc/systemd/\"\n ])\n self.add_copy_spec([\n \"/var/log/rabbitmq/*\",\n \"/var/log/containers/rabbitmq/*\"\n ], sizelimit=self.get_option('log_size'))\n\n def postproc(self):\n self.do_file_sub(\"/etc/rabbitmq/rabbitmq.conf\",\n r\"(\\s*default_pass\\s*,\\s*)\\S+\", r\"\\1<<***>>},\")\n\n# vim: set et ts=4 sw=4 :\n", "path": "sos/plugins/rabbitmq.py"}]} | 996 | 137 |
gh_patches_debug_62860 | rasdani/github-patches | git_diff | conan-io__conan-center-index-7774 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[request] sigslot/1.2.1
### Package Details
* Package Name/Version: **sigslot/1.2.1**
* Changelog: **https://github.com/palacaze/sigslot/releases/tag/v1.2.1**
The above-mentioned version is newly released by the upstream project and not yet available as a recipe. Please add this version.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `recipes/sigslot/all/conanfile.py`
Content:
```
1 import os
2
3 from conans import ConanFile, tools
4 from conans.errors import ConanInvalidConfiguration
5
6
7 class SigslotConan(ConanFile):
8 name = "sigslot"
9 description = "Sigslot is a header-only, thread safe implementation of signal-slots for C++."
10 topics = ("signal", "slot", "c++14", "header-only")
11 url = "https://github.com/conan-io/conan-center-index"
12 homepage = "https://github.com/palacaze/sigslot"
13 license = "MIT"
14 settings = "compiler", "os"
15 no_copy_source = True
16
17 @property
18 def _source_subfolder(self):
19 return "source_subfolder"
20
21 def configure(self):
22 minimal_cpp_standard = "14"
23 if self.settings.compiler.cppstd:
24 tools.check_min_cppstd(self, minimal_cpp_standard)
25 minimal_version = {
26 "gcc": "5",
27 "clang": "3.4",
28 "apple-clang": "10",
29 "Visual Studio": "15" # 14 is not supported by the library
30 }
31 compiler = str(self.settings.compiler)
32 if compiler not in minimal_version:
33 self.output.warn(
34 "%s recipe lacks information about the %s compiler standard version support" % (self.name, compiler))
35 self.output.warn(
36 "%s requires a compiler that supports at least C++%s" % (self.name, minimal_cpp_standard))
37 return
38 version = tools.Version(self.settings.compiler.version)
39 if version < minimal_version[compiler]:
40 raise ConanInvalidConfiguration("%s requires a compiler that supports at least C++%s" % (self.name, minimal_cpp_standard))
41
42 def source(self):
43 tools.get(**self.conan_data["sources"][self.version])
44 extracted_dir = "sigslot-" + self.version
45 os.rename(extracted_dir, self._source_subfolder)
46
47 def package(self):
48 self.copy(pattern="LICENSE", src=self._source_subfolder, dst="licenses")
49 self.copy(pattern="signal.hpp", src=os.path.join(self._source_subfolder, "include", "sigslot"), dst=os.path.join("include", "sigslot"))
50
51 def package_id(self):
52 self.info.header_only()
53
54 def package_info(self):
55 self.cpp_info.filenames["cmake_find_package"] = "PalSigslot"
56 self.cpp_info.filenames["cmake_find_package_multi"] = "PalSigslot"
57 self.cpp_info.names["cmake_find_package"] = "Pal"
58 self.cpp_info.names["cmake_find_package_multi"] = "Pal"
59
60 self.cpp_info.components["_sigslot"].libs = []
61 self.cpp_info.components["_sigslot"].names["cmake_find_package"] = "Sigslot"
62 self.cpp_info.components["_sigslot"].names["cmake_find_package_multi"] = "Sigslot"
63
64 if self.settings.os == "Linux":
65 self.cpp_info.components["_sigslot"].system_libs.append("pthread")
66 if self.settings.os == "Windows":
67 if self.settings.compiler in ("Visual Studio", "clang"):
68 self.cpp_info.components["_sigslot"].exelinkflags.append('/OPT:NOICF')
69
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/recipes/sigslot/all/conanfile.py b/recipes/sigslot/all/conanfile.py
--- a/recipes/sigslot/all/conanfile.py
+++ b/recipes/sigslot/all/conanfile.py
@@ -65,4 +65,4 @@
self.cpp_info.components["_sigslot"].system_libs.append("pthread")
if self.settings.os == "Windows":
if self.settings.compiler in ("Visual Studio", "clang"):
- self.cpp_info.components["_sigslot"].exelinkflags.append('/OPT:NOICF')
+ self.cpp_info.components["_sigslot"].exelinkflags.append('-OPT:NOICF')
| {"golden_diff": "diff --git a/recipes/sigslot/all/conanfile.py b/recipes/sigslot/all/conanfile.py\n--- a/recipes/sigslot/all/conanfile.py\n+++ b/recipes/sigslot/all/conanfile.py\n@@ -65,4 +65,4 @@\n self.cpp_info.components[\"_sigslot\"].system_libs.append(\"pthread\")\n if self.settings.os == \"Windows\":\n if self.settings.compiler in (\"Visual Studio\", \"clang\"):\n- self.cpp_info.components[\"_sigslot\"].exelinkflags.append('/OPT:NOICF')\n+ self.cpp_info.components[\"_sigslot\"].exelinkflags.append('-OPT:NOICF')\n", "issue": "[request] sigslot/1.2.1\n### Package Details\r\n * Package Name/Version: **sigslot/1.2.1**\r\n * Changelog: **https://github.com/palacaze/sigslot/releases/tag/v1.2.1**\r\n\r\n\r\nThe above mentioned version is newly released by the upstream project and not yet available as a recipe. Please add this version.\r\n\n", "before_files": [{"content": "import os\n\nfrom conans import ConanFile, tools\nfrom conans.errors import ConanInvalidConfiguration\n\n\nclass SigslotConan(ConanFile):\n name = \"sigslot\"\n description = \"Sigslot is a header-only, thread safe implementation of signal-slots for C++.\"\n topics = (\"signal\", \"slot\", \"c++14\", \"header-only\")\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://github.com/palacaze/sigslot\"\n license = \"MIT\"\n settings = \"compiler\", \"os\"\n no_copy_source = True\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n def configure(self):\n minimal_cpp_standard = \"14\"\n if self.settings.compiler.cppstd:\n tools.check_min_cppstd(self, minimal_cpp_standard)\n minimal_version = {\n \"gcc\": \"5\",\n \"clang\": \"3.4\",\n \"apple-clang\": \"10\",\n \"Visual Studio\": \"15\" # 14 is not supported by the library\n }\n compiler = str(self.settings.compiler)\n if compiler not in minimal_version:\n self.output.warn(\n \"%s recipe lacks information about the %s compiler standard version support\" % (self.name, compiler))\n self.output.warn(\n \"%s requires a compiler that supports at least C++%s\" % (self.name, minimal_cpp_standard))\n return\n version = tools.Version(self.settings.compiler.version)\n if version < minimal_version[compiler]:\n raise ConanInvalidConfiguration(\"%s requires a compiler that supports at least C++%s\" % (self.name, minimal_cpp_standard))\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n extracted_dir = \"sigslot-\" + self.version\n os.rename(extracted_dir, self._source_subfolder)\n\n def package(self):\n self.copy(pattern=\"LICENSE\", src=self._source_subfolder, dst=\"licenses\")\n self.copy(pattern=\"signal.hpp\", src=os.path.join(self._source_subfolder, \"include\", \"sigslot\"), dst=os.path.join(\"include\", \"sigslot\"))\n\n def package_id(self):\n self.info.header_only()\n\n def package_info(self):\n self.cpp_info.filenames[\"cmake_find_package\"] = \"PalSigslot\"\n self.cpp_info.filenames[\"cmake_find_package_multi\"] = \"PalSigslot\"\n self.cpp_info.names[\"cmake_find_package\"] = \"Pal\"\n self.cpp_info.names[\"cmake_find_package_multi\"] = \"Pal\"\n\n self.cpp_info.components[\"_sigslot\"].libs = []\n self.cpp_info.components[\"_sigslot\"].names[\"cmake_find_package\"] = \"Sigslot\"\n self.cpp_info.components[\"_sigslot\"].names[\"cmake_find_package_multi\"] = \"Sigslot\"\n\n if self.settings.os == \"Linux\":\n self.cpp_info.components[\"_sigslot\"].system_libs.append(\"pthread\")\n if self.settings.os == \"Windows\":\n if self.settings.compiler in (\"Visual Studio\", \"clang\"):\n 
self.cpp_info.components[\"_sigslot\"].exelinkflags.append('/OPT:NOICF')\n", "path": "recipes/sigslot/all/conanfile.py"}], "after_files": [{"content": "import os\n\nfrom conans import ConanFile, tools\nfrom conans.errors import ConanInvalidConfiguration\n\n\nclass SigslotConan(ConanFile):\n name = \"sigslot\"\n description = \"Sigslot is a header-only, thread safe implementation of signal-slots for C++.\"\n topics = (\"signal\", \"slot\", \"c++14\", \"header-only\")\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://github.com/palacaze/sigslot\"\n license = \"MIT\"\n settings = \"compiler\", \"os\"\n no_copy_source = True\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n def configure(self):\n minimal_cpp_standard = \"14\"\n if self.settings.compiler.cppstd:\n tools.check_min_cppstd(self, minimal_cpp_standard)\n minimal_version = {\n \"gcc\": \"5\",\n \"clang\": \"3.4\",\n \"apple-clang\": \"10\",\n \"Visual Studio\": \"15\" # 14 is not supported by the library\n }\n compiler = str(self.settings.compiler)\n if compiler not in minimal_version:\n self.output.warn(\n \"%s recipe lacks information about the %s compiler standard version support\" % (self.name, compiler))\n self.output.warn(\n \"%s requires a compiler that supports at least C++%s\" % (self.name, minimal_cpp_standard))\n return\n version = tools.Version(self.settings.compiler.version)\n if version < minimal_version[compiler]:\n raise ConanInvalidConfiguration(\"%s requires a compiler that supports at least C++%s\" % (self.name, minimal_cpp_standard))\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n extracted_dir = \"sigslot-\" + self.version\n os.rename(extracted_dir, self._source_subfolder)\n\n def package(self):\n self.copy(pattern=\"LICENSE\", src=self._source_subfolder, dst=\"licenses\")\n self.copy(pattern=\"signal.hpp\", src=os.path.join(self._source_subfolder, \"include\", \"sigslot\"), dst=os.path.join(\"include\", \"sigslot\"))\n\n def package_id(self):\n self.info.header_only()\n\n def package_info(self):\n self.cpp_info.filenames[\"cmake_find_package\"] = \"PalSigslot\"\n self.cpp_info.filenames[\"cmake_find_package_multi\"] = \"PalSigslot\"\n self.cpp_info.names[\"cmake_find_package\"] = \"Pal\"\n self.cpp_info.names[\"cmake_find_package_multi\"] = \"Pal\"\n\n self.cpp_info.components[\"_sigslot\"].libs = []\n self.cpp_info.components[\"_sigslot\"].names[\"cmake_find_package\"] = \"Sigslot\"\n self.cpp_info.components[\"_sigslot\"].names[\"cmake_find_package_multi\"] = \"Sigslot\"\n\n if self.settings.os == \"Linux\":\n self.cpp_info.components[\"_sigslot\"].system_libs.append(\"pthread\")\n if self.settings.os == \"Windows\":\n if self.settings.compiler in (\"Visual Studio\", \"clang\"):\n self.cpp_info.components[\"_sigslot\"].exelinkflags.append('-OPT:NOICF')\n", "path": "recipes/sigslot/all/conanfile.py"}]} | 1,162 | 144 |
gh_patches_debug_11690 | rasdani/github-patches | git_diff | searxng__searxng-3369 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
remove ask engine
Reasons:
1. User Tracking via query parameters.
Example Query: [!ask best museums in Köln](https://searx.be/search?q=%21ask+best+museums+in+K%C3%B6ln&category_none=&language=auto&time_range=&safesearch=0&theme=simple)
   Notice the ```euid=some_id``` parameter at the end of most URLs (a small illustration follows this list).
2. Very bad results - Something must have changed.
Example Query: [!ask "yt-dlp"](https://searx.be/search?q=%21ask+%22yt-dlp%22&category_none=&language=auto&time_range=&safesearch=0&theme=simple)
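Editor's note: the patch for this entry (further down) shortens result URLs with `item['url'].split('&ueid')[0]`, dropping everything from the tracking parameter onward; note the issue text writes `euid` while the diff strips `&ueid`. A tiny standalone illustration with an invented URL:

```python
tracked = "https://www.example.org/some/result?found=1&ueid=0123456789abcdef"  # invented URL
clean = tracked.split("&ueid")[0]
print(clean)  # https://www.example.org/some/result?found=1
```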
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `searx/engines/ask.py`
Content:
```
1 # SPDX-License-Identifier: AGPL-3.0-or-later
2 """Ask.com"""
3
4 from urllib.parse import urlencode
5 import dateutil
6 from lxml import html
7 from searx import utils
8
9 # Metadata
10 about = {
11 "website": "https://www.ask.com/",
12 "wikidata_id": 'Q847564',
13 "official_api_documentation": None,
14 "use_official_api": False,
15 "require_api_key": False,
16 "results": "HTML",
17 }
18
19 # Engine Configuration
20 categories = ['general']
21 paging = True
22
23 # Base URL
24 base_url = "https://www.ask.com/web"
25
26
27 def request(query, params):
28
29 query_params = {
30 "q": query,
31 "page": params["pageno"],
32 }
33
34 params["url"] = f"{base_url}?{urlencode(query_params)}"
35 return params
36
37
38 def response(resp):
39
40 start_tag = 'window.MESON.initialState = {'
41 end_tag = '}};'
42
43 dom = html.fromstring(resp.text)
44 script = utils.eval_xpath_getindex(dom, '//script', 0, default=None).text
45
46 pos = script.index(start_tag) + len(start_tag) - 1
47 script = script[pos:]
48 pos = script.index(end_tag) + len(end_tag) - 1
49 script = script[:pos]
50
51 json_resp = utils.js_variable_to_python(script)
52
53 results = []
54
55 for item in json_resp['search']['webResults']['results']:
56
57 pubdate_original = item.get('pubdate_original')
58 if pubdate_original:
59 pubdate_original = dateutil.parser.parse(pubdate_original)
60 metadata = [item.get(field) for field in ['category_l1', 'catsy'] if item.get(field)]
61
62 results.append(
63 {
64 "url": item['url'],
65 "title": item['title'],
66 "content": item['abstract'],
67 "publishedDate": pubdate_original,
68 # "img_src": item.get('image_url') or None, # these are not thumbs / to large
69 "metadata": ' | '.join(metadata),
70 }
71 )
72
73 return results
74
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/searx/engines/ask.py b/searx/engines/ask.py
--- a/searx/engines/ask.py
+++ b/searx/engines/ask.py
@@ -19,6 +19,7 @@
# Engine Configuration
categories = ['general']
paging = True
+max_page = 5
# Base URL
base_url = "https://www.ask.com/web"
@@ -61,7 +62,7 @@
results.append(
{
- "url": item['url'],
+ "url": item['url'].split('&ueid')[0],
"title": item['title'],
"content": item['abstract'],
"publishedDate": pubdate_original,
| {"golden_diff": "diff --git a/searx/engines/ask.py b/searx/engines/ask.py\n--- a/searx/engines/ask.py\n+++ b/searx/engines/ask.py\n@@ -19,6 +19,7 @@\n # Engine Configuration\n categories = ['general']\n paging = True\n+max_page = 5\n \n # Base URL\n base_url = \"https://www.ask.com/web\"\n@@ -61,7 +62,7 @@\n \n results.append(\n {\n- \"url\": item['url'],\n+ \"url\": item['url'].split('&ueid')[0],\n \"title\": item['title'],\n \"content\": item['abstract'],\n \"publishedDate\": pubdate_original,\n", "issue": "remove ask engine\nReasons:\r\n\r\n1. User Tracking via query parameters.\r\n\r\n Example Query: [!ask best museums in K\u00f6ln](https://searx.be/search?q=%21ask+best+museums+in+K%C3%B6ln&category_none=&language=auto&time_range=&safesearch=0&theme=simple)\r\n\r\n Notice the ```euid=some_id``` at the end for most urls.\r\n\r\n2. Very bad results - Something must have changed.\r\n\r\n Example Query: [!ask \"yt-dlp\"](https://searx.be/search?q=%21ask+%22yt-dlp%22&category_none=&language=auto&time_range=&safesearch=0&theme=simple)\r\n\r\n\n", "before_files": [{"content": "# SPDX-License-Identifier: AGPL-3.0-or-later\n\"\"\"Ask.com\"\"\"\n\nfrom urllib.parse import urlencode\nimport dateutil\nfrom lxml import html\nfrom searx import utils\n\n# Metadata\nabout = {\n \"website\": \"https://www.ask.com/\",\n \"wikidata_id\": 'Q847564',\n \"official_api_documentation\": None,\n \"use_official_api\": False,\n \"require_api_key\": False,\n \"results\": \"HTML\",\n}\n\n# Engine Configuration\ncategories = ['general']\npaging = True\n\n# Base URL\nbase_url = \"https://www.ask.com/web\"\n\n\ndef request(query, params):\n\n query_params = {\n \"q\": query,\n \"page\": params[\"pageno\"],\n }\n\n params[\"url\"] = f\"{base_url}?{urlencode(query_params)}\"\n return params\n\n\ndef response(resp):\n\n start_tag = 'window.MESON.initialState = {'\n end_tag = '}};'\n\n dom = html.fromstring(resp.text)\n script = utils.eval_xpath_getindex(dom, '//script', 0, default=None).text\n\n pos = script.index(start_tag) + len(start_tag) - 1\n script = script[pos:]\n pos = script.index(end_tag) + len(end_tag) - 1\n script = script[:pos]\n\n json_resp = utils.js_variable_to_python(script)\n\n results = []\n\n for item in json_resp['search']['webResults']['results']:\n\n pubdate_original = item.get('pubdate_original')\n if pubdate_original:\n pubdate_original = dateutil.parser.parse(pubdate_original)\n metadata = [item.get(field) for field in ['category_l1', 'catsy'] if item.get(field)]\n\n results.append(\n {\n \"url\": item['url'],\n \"title\": item['title'],\n \"content\": item['abstract'],\n \"publishedDate\": pubdate_original,\n # \"img_src\": item.get('image_url') or None, # these are not thumbs / to large\n \"metadata\": ' | '.join(metadata),\n }\n )\n\n return results\n", "path": "searx/engines/ask.py"}], "after_files": [{"content": "# SPDX-License-Identifier: AGPL-3.0-or-later\n\"\"\"Ask.com\"\"\"\n\nfrom urllib.parse import urlencode\nimport dateutil\nfrom lxml import html\nfrom searx import utils\n\n# Metadata\nabout = {\n \"website\": \"https://www.ask.com/\",\n \"wikidata_id\": 'Q847564',\n \"official_api_documentation\": None,\n \"use_official_api\": False,\n \"require_api_key\": False,\n \"results\": \"HTML\",\n}\n\n# Engine Configuration\ncategories = ['general']\npaging = True\nmax_page = 5\n\n# Base URL\nbase_url = \"https://www.ask.com/web\"\n\n\ndef request(query, params):\n\n query_params = {\n \"q\": query,\n \"page\": params[\"pageno\"],\n }\n\n params[\"url\"] = 
f\"{base_url}?{urlencode(query_params)}\"\n return params\n\n\ndef response(resp):\n\n start_tag = 'window.MESON.initialState = {'\n end_tag = '}};'\n\n dom = html.fromstring(resp.text)\n script = utils.eval_xpath_getindex(dom, '//script', 0, default=None).text\n\n pos = script.index(start_tag) + len(start_tag) - 1\n script = script[pos:]\n pos = script.index(end_tag) + len(end_tag) - 1\n script = script[:pos]\n\n json_resp = utils.js_variable_to_python(script)\n\n results = []\n\n for item in json_resp['search']['webResults']['results']:\n\n pubdate_original = item.get('pubdate_original')\n if pubdate_original:\n pubdate_original = dateutil.parser.parse(pubdate_original)\n metadata = [item.get(field) for field in ['category_l1', 'catsy'] if item.get(field)]\n\n results.append(\n {\n \"url\": item['url'].split('&ueid')[0],\n \"title\": item['title'],\n \"content\": item['abstract'],\n \"publishedDate\": pubdate_original,\n # \"img_src\": item.get('image_url') or None, # these are not thumbs / to large\n \"metadata\": ' | '.join(metadata),\n }\n )\n\n return results\n", "path": "searx/engines/ask.py"}]} | 1,032 | 164 |
gh_patches_debug_11039 | rasdani/github-patches | git_diff | frappe__frappe-23799 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
PostgreSQL - permission denied for schema public
<!--
Welcome to the Frappe Framework issue tracker! Before creating an issue, please heed the following:
1. This tracker should only be used to report bugs and request features / enhancements to Frappe
- For questions and general support, use https://stackoverflow.com/questions/tagged/frappe
   - For documentation issues, refer to https://frappeframework.com/docs/user/en or the developer cheatsheet https://github.com/frappe/frappe/wiki/Developer-Cheatsheet
2. Use the search function before creating a new issue. Duplicates will be closed and directed to
the original discussion.
3. When making a bug report, make sure you provide all required information. The easier it is for
maintainers to reproduce, the faster it'll be fixed.
4. If you think you know what the reason for the bug is, share it with us. Maybe put in a PR 😉
-->
## Description of the issue
When PostgreSQL version 15 or greater is used, the `bench new-site` command fails with an insufficient-permission error from PostgreSQL. This is due to the change introduced in version 15: `PostgreSQL 15 also revokes the CREATE permission from all users except a database owner from the public (or default) schema.`
## Context information (for bug reports)
**Output of `bench version`**
```
5.19.0
```
## Steps to reproduce the issue
1. Configure Frappe to use PostgreSQL version which is 15 or greater (16.1 in my case)
2. Attempt to perform `bench new-site` (Ex: ERPNext)
### Observed result
Error from Postgres:
```
ERROR: permission denied for schema public at character 14
```
Error from Frappe:
```
psql:/home/frappe/frappe-bench/apps/frappe/frappe/database/postgres/framework_postgres.sql:72: ERROR: permission denied for schema public
```
### Expected result
Installation should proceed without any errors
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `frappe/database/postgres/setup_db.py`
Content:
```
1 import os
2
3 import frappe
4 from frappe import _
5
6
7 def setup_database():
8 root_conn = get_root_connection(frappe.flags.root_login, frappe.flags.root_password)
9 root_conn.commit()
10 root_conn.sql("end")
11 root_conn.sql(f"DROP DATABASE IF EXISTS `{frappe.conf.db_name}`")
12 root_conn.sql(f"DROP USER IF EXISTS {frappe.conf.db_name}")
13 root_conn.sql(f"CREATE DATABASE `{frappe.conf.db_name}`")
14 root_conn.sql(f"CREATE user {frappe.conf.db_name} password '{frappe.conf.db_password}'")
15 root_conn.sql("GRANT ALL PRIVILEGES ON DATABASE `{0}` TO {0}".format(frappe.conf.db_name))
16 root_conn.close()
17
18
19 def bootstrap_database(db_name, verbose, source_sql=None):
20 frappe.connect(db_name=db_name)
21 import_db_from_sql(source_sql, verbose)
22 frappe.connect(db_name=db_name)
23
24 if "tabDefaultValue" not in frappe.db.get_tables():
25 import sys
26
27 from click import secho
28
29 secho(
30 "Table 'tabDefaultValue' missing in the restored site. "
31 "This may be due to incorrect permissions or the result of a restore from a bad backup file. "
32 "Database not installed correctly.",
33 fg="red",
34 )
35 sys.exit(1)
36
37
38 def import_db_from_sql(source_sql=None, verbose=False):
39 import shlex
40 from shutil import which
41
42 from frappe.database import get_command
43 from frappe.utils import execute_in_shell
44
45 # bootstrap db
46 if not source_sql:
47 source_sql = os.path.join(os.path.dirname(__file__), "framework_postgres.sql")
48
49 pv = which("pv")
50
51 command = []
52
53 if pv:
54 command.extend([pv, source_sql, "|"])
55 source = []
56 print("Restoring Database file...")
57 else:
58 source = ["-f", source_sql]
59
60 bin, args, bin_name = get_command(
61 host=frappe.conf.db_host,
62 port=frappe.conf.db_port,
63 user=frappe.conf.db_name,
64 password=frappe.conf.db_password,
65 db_name=frappe.conf.db_name,
66 )
67
68 if not bin:
69 frappe.throw(
70 _("{} not found in PATH! This is required to restore the database.").format(bin_name),
71 exc=frappe.ExecutableNotFound,
72 )
73 command.append(bin)
74 command.append(shlex.join(args))
75 command.extend(source)
76 execute_in_shell(" ".join(command), check_exit_code=True, verbose=verbose)
77 frappe.cache.delete_keys("") # Delete all keys associated with this site.
78
79
80 def get_root_connection(root_login=None, root_password=None):
81 if not frappe.local.flags.root_connection:
82 if not root_login:
83 root_login = frappe.conf.get("root_login") or None
84
85 if not root_login:
86 root_login = input("Enter postgres super user: ")
87
88 if not root_password:
89 root_password = frappe.conf.get("root_password") or None
90
91 if not root_password:
92 from getpass import getpass
93
94 root_password = getpass("Postgres super user password: ")
95
96 frappe.local.flags.root_connection = frappe.database.get_db(
97 host=frappe.conf.db_host,
98 port=frappe.conf.db_port,
99 user=root_login,
100 password=root_password,
101 )
102
103 return frappe.local.flags.root_connection
104
105
106 def drop_user_and_database(db_name, root_login, root_password):
107 root_conn = get_root_connection(
108 frappe.flags.root_login or root_login, frappe.flags.root_password or root_password
109 )
110 root_conn.commit()
111 root_conn.sql(
112 "SELECT pg_terminate_backend (pg_stat_activity.pid) FROM pg_stat_activity WHERE pg_stat_activity.datname = %s",
113 (db_name,),
114 )
115 root_conn.sql("end")
116 root_conn.sql(f"DROP DATABASE IF EXISTS {db_name}")
117 root_conn.sql(f"DROP USER IF EXISTS {db_name}")
118
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/frappe/database/postgres/setup_db.py b/frappe/database/postgres/setup_db.py
--- a/frappe/database/postgres/setup_db.py
+++ b/frappe/database/postgres/setup_db.py
@@ -13,6 +13,9 @@
root_conn.sql(f"CREATE DATABASE `{frappe.conf.db_name}`")
root_conn.sql(f"CREATE user {frappe.conf.db_name} password '{frappe.conf.db_password}'")
root_conn.sql("GRANT ALL PRIVILEGES ON DATABASE `{0}` TO {0}".format(frappe.conf.db_name))
+ psql_version = root_conn.sql(f"SELECT VERSION()", as_dict=True)
+ if psql_version and psql_version[0].get("version", "PostgreSQL 14").split()[1] >= "15":
+ root_conn.sql("ALTER DATABASE `{0}` OWNER TO {0}".format(frappe.conf.db_name))
root_conn.close()
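To unpack the added check: `SELECT VERSION()` returns a banner string whose second word is the server version, and making the site role the owner of its database restores CREATE on the `public` schema, which PostgreSQL 15 no longer grants to ordinary users. A standalone sketch of the same logic; the banner text and database name are invented.

```python
# Hypothetical banner as returned by SELECT VERSION() on a PostgreSQL 16 server.
banner = "PostgreSQL 16.1 (Ubuntu 16.1-1) on x86_64-pc-linux-gnu, compiled by gcc"

version = banner.split()[1]        # -> "16.1"
if version >= "15":                # plain string comparison, mirroring the patch
    # The database owner keeps CREATE on the public schema in PostgreSQL 15+.
    print('ALTER DATABASE "my_site_db" OWNER TO "my_site_db";')
```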
| {"golden_diff": "diff --git a/frappe/database/postgres/setup_db.py b/frappe/database/postgres/setup_db.py\n--- a/frappe/database/postgres/setup_db.py\n+++ b/frappe/database/postgres/setup_db.py\n@@ -13,6 +13,9 @@\n \troot_conn.sql(f\"CREATE DATABASE `{frappe.conf.db_name}`\")\n \troot_conn.sql(f\"CREATE user {frappe.conf.db_name} password '{frappe.conf.db_password}'\")\n \troot_conn.sql(\"GRANT ALL PRIVILEGES ON DATABASE `{0}` TO {0}\".format(frappe.conf.db_name))\n+\tpsql_version = root_conn.sql(f\"SELECT VERSION()\", as_dict=True)\n+\tif psql_version and psql_version[0].get(\"version\", \"PostgreSQL 14\").split()[1] >= \"15\":\n+\t\troot_conn.sql(\"ALTER DATABASE `{0}` OWNER TO {0}\".format(frappe.conf.db_name))\n \troot_conn.close()\n", "issue": "PostgreSQL - permission denied for schema public\n<!--\r\nWelcome to the Frappe Framework issue tracker! Before creating an issue, please heed the following:\r\n\r\n1. This tracker should only be used to report bugs and request features / enhancements to Frappe\r\n - For questions and general support, use https://stackoverflow.com/questions/tagged/frappe\r\n - For documentation issues, refer to https://frappeframework.com/docs/user/en or the developer cheetsheet https://github.com/frappe/frappe/wiki/Developer-Cheatsheet\r\n2. Use the search function before creating a new issue. Duplicates will be closed and directed to\r\n the original discussion.\r\n3. When making a bug report, make sure you provide all required information. The easier it is for\r\n maintainers to reproduce, the faster it'll be fixed.\r\n4. If you think you know what the reason for the bug is, share it with us. Maybe put in a PR \ud83d\ude09\r\n-->\r\n\r\n## Description of the issue\r\nWhen PostgreSQL version 15 or greater is used, the `bench new-site` command files with insufficient permission error from PostgreSQL. This is due to the change introduced, where `PostgreSQL 15 also revokes the CREATE permission from all users except a database owner from the public (or default) schema.`\r\n\r\n## Context information (for bug reports)\r\n\r\n**Output of `bench version`**\r\n```\r\n5.19.0\r\n```\r\n\r\n## Steps to reproduce the issue\r\n\r\n1. Configure Frappe to use PostgreSQL version which is 15 or greater (16.1 in my case)\r\n2. 
Attempt to perform `bench new-site` (Ex: ERPNext)\r\n\r\n### Observed result\r\nError from Postgres:\r\n```\r\nERROR: permission denied for schema public at character 14\r\n```\r\n\r\nError from Frappe:\r\n```\r\npsql:/home/frappe/frappe-bench/apps/frappe/frappe/database/postgres/framework_postgres.sql:72: ERROR: permission denied for schema public\r\n```\r\n\r\n### Expected result\r\nInstallation should proceed without any errors\r\n\r\n\r\n\n", "before_files": [{"content": "import os\n\nimport frappe\nfrom frappe import _\n\n\ndef setup_database():\n\troot_conn = get_root_connection(frappe.flags.root_login, frappe.flags.root_password)\n\troot_conn.commit()\n\troot_conn.sql(\"end\")\n\troot_conn.sql(f\"DROP DATABASE IF EXISTS `{frappe.conf.db_name}`\")\n\troot_conn.sql(f\"DROP USER IF EXISTS {frappe.conf.db_name}\")\n\troot_conn.sql(f\"CREATE DATABASE `{frappe.conf.db_name}`\")\n\troot_conn.sql(f\"CREATE user {frappe.conf.db_name} password '{frappe.conf.db_password}'\")\n\troot_conn.sql(\"GRANT ALL PRIVILEGES ON DATABASE `{0}` TO {0}\".format(frappe.conf.db_name))\n\troot_conn.close()\n\n\ndef bootstrap_database(db_name, verbose, source_sql=None):\n\tfrappe.connect(db_name=db_name)\n\timport_db_from_sql(source_sql, verbose)\n\tfrappe.connect(db_name=db_name)\n\n\tif \"tabDefaultValue\" not in frappe.db.get_tables():\n\t\timport sys\n\n\t\tfrom click import secho\n\n\t\tsecho(\n\t\t\t\"Table 'tabDefaultValue' missing in the restored site. \"\n\t\t\t\"This may be due to incorrect permissions or the result of a restore from a bad backup file. \"\n\t\t\t\"Database not installed correctly.\",\n\t\t\tfg=\"red\",\n\t\t)\n\t\tsys.exit(1)\n\n\ndef import_db_from_sql(source_sql=None, verbose=False):\n\timport shlex\n\tfrom shutil import which\n\n\tfrom frappe.database import get_command\n\tfrom frappe.utils import execute_in_shell\n\n\t# bootstrap db\n\tif not source_sql:\n\t\tsource_sql = os.path.join(os.path.dirname(__file__), \"framework_postgres.sql\")\n\n\tpv = which(\"pv\")\n\n\tcommand = []\n\n\tif pv:\n\t\tcommand.extend([pv, source_sql, \"|\"])\n\t\tsource = []\n\t\tprint(\"Restoring Database file...\")\n\telse:\n\t\tsource = [\"-f\", source_sql]\n\n\tbin, args, bin_name = get_command(\n\t\thost=frappe.conf.db_host,\n\t\tport=frappe.conf.db_port,\n\t\tuser=frappe.conf.db_name,\n\t\tpassword=frappe.conf.db_password,\n\t\tdb_name=frappe.conf.db_name,\n\t)\n\n\tif not bin:\n\t\tfrappe.throw(\n\t\t\t_(\"{} not found in PATH! 
This is required to restore the database.\").format(bin_name),\n\t\t\texc=frappe.ExecutableNotFound,\n\t\t)\n\tcommand.append(bin)\n\tcommand.append(shlex.join(args))\n\tcommand.extend(source)\n\texecute_in_shell(\" \".join(command), check_exit_code=True, verbose=verbose)\n\tfrappe.cache.delete_keys(\"\") # Delete all keys associated with this site.\n\n\ndef get_root_connection(root_login=None, root_password=None):\n\tif not frappe.local.flags.root_connection:\n\t\tif not root_login:\n\t\t\troot_login = frappe.conf.get(\"root_login\") or None\n\n\t\tif not root_login:\n\t\t\troot_login = input(\"Enter postgres super user: \")\n\n\t\tif not root_password:\n\t\t\troot_password = frappe.conf.get(\"root_password\") or None\n\n\t\tif not root_password:\n\t\t\tfrom getpass import getpass\n\n\t\t\troot_password = getpass(\"Postgres super user password: \")\n\n\t\tfrappe.local.flags.root_connection = frappe.database.get_db(\n\t\t\thost=frappe.conf.db_host,\n\t\t\tport=frappe.conf.db_port,\n\t\t\tuser=root_login,\n\t\t\tpassword=root_password,\n\t\t)\n\n\treturn frappe.local.flags.root_connection\n\n\ndef drop_user_and_database(db_name, root_login, root_password):\n\troot_conn = get_root_connection(\n\t\tfrappe.flags.root_login or root_login, frappe.flags.root_password or root_password\n\t)\n\troot_conn.commit()\n\troot_conn.sql(\n\t\t\"SELECT pg_terminate_backend (pg_stat_activity.pid) FROM pg_stat_activity WHERE pg_stat_activity.datname = %s\",\n\t\t(db_name,),\n\t)\n\troot_conn.sql(\"end\")\n\troot_conn.sql(f\"DROP DATABASE IF EXISTS {db_name}\")\n\troot_conn.sql(f\"DROP USER IF EXISTS {db_name}\")\n", "path": "frappe/database/postgres/setup_db.py"}], "after_files": [{"content": "import os\n\nimport frappe\nfrom frappe import _\n\n\ndef setup_database():\n\troot_conn = get_root_connection(frappe.flags.root_login, frappe.flags.root_password)\n\troot_conn.commit()\n\troot_conn.sql(\"end\")\n\troot_conn.sql(f\"DROP DATABASE IF EXISTS `{frappe.conf.db_name}`\")\n\troot_conn.sql(f\"DROP USER IF EXISTS {frappe.conf.db_name}\")\n\troot_conn.sql(f\"CREATE DATABASE `{frappe.conf.db_name}`\")\n\troot_conn.sql(f\"CREATE user {frappe.conf.db_name} password '{frappe.conf.db_password}'\")\n\troot_conn.sql(\"GRANT ALL PRIVILEGES ON DATABASE `{0}` TO {0}\".format(frappe.conf.db_name))\n\tpsql_version = root_conn.sql(f\"SELECT VERSION()\", as_dict=True)\n\tif psql_version and psql_version[0].get(\"version\", \"PostgreSQL 14\").split()[1] >= \"15\":\n\t\troot_conn.sql(\"ALTER DATABASE `{0}` OWNER TO {0}\".format(frappe.conf.db_name))\n\troot_conn.close()\n\n\ndef bootstrap_database(db_name, verbose, source_sql=None):\n\tfrappe.connect(db_name=db_name)\n\timport_db_from_sql(source_sql, verbose)\n\tfrappe.connect(db_name=db_name)\n\n\tif \"tabDefaultValue\" not in frappe.db.get_tables():\n\t\timport sys\n\n\t\tfrom click import secho\n\n\t\tsecho(\n\t\t\t\"Table 'tabDefaultValue' missing in the restored site. \"\n\t\t\t\"This may be due to incorrect permissions or the result of a restore from a bad backup file. 
\"\n\t\t\t\"Database not installed correctly.\",\n\t\t\tfg=\"red\",\n\t\t)\n\t\tsys.exit(1)\n\n\ndef import_db_from_sql(source_sql=None, verbose=False):\n\timport shlex\n\tfrom shutil import which\n\n\tfrom frappe.database import get_command\n\tfrom frappe.utils import execute_in_shell\n\n\t# bootstrap db\n\tif not source_sql:\n\t\tsource_sql = os.path.join(os.path.dirname(__file__), \"framework_postgres.sql\")\n\n\tpv = which(\"pv\")\n\n\tcommand = []\n\n\tif pv:\n\t\tcommand.extend([pv, source_sql, \"|\"])\n\t\tsource = []\n\t\tprint(\"Restoring Database file...\")\n\telse:\n\t\tsource = [\"-f\", source_sql]\n\n\tbin, args, bin_name = get_command(\n\t\thost=frappe.conf.db_host,\n\t\tport=frappe.conf.db_port,\n\t\tuser=frappe.conf.db_name,\n\t\tpassword=frappe.conf.db_password,\n\t\tdb_name=frappe.conf.db_name,\n\t)\n\n\tif not bin:\n\t\tfrappe.throw(\n\t\t\t_(\"{} not found in PATH! This is required to restore the database.\").format(bin_name),\n\t\t\texc=frappe.ExecutableNotFound,\n\t\t)\n\tcommand.append(bin)\n\tcommand.append(shlex.join(args))\n\tcommand.extend(source)\n\texecute_in_shell(\" \".join(command), check_exit_code=True, verbose=verbose)\n\tfrappe.cache.delete_keys(\"\") # Delete all keys associated with this site.\n\n\ndef get_root_connection(root_login=None, root_password=None):\n\tif not frappe.local.flags.root_connection:\n\t\tif not root_login:\n\t\t\troot_login = frappe.conf.get(\"root_login\") or None\n\n\t\tif not root_login:\n\t\t\troot_login = input(\"Enter postgres super user: \")\n\n\t\tif not root_password:\n\t\t\troot_password = frappe.conf.get(\"root_password\") or None\n\n\t\tif not root_password:\n\t\t\tfrom getpass import getpass\n\n\t\t\troot_password = getpass(\"Postgres super user password: \")\n\n\t\tfrappe.local.flags.root_connection = frappe.database.get_db(\n\t\t\thost=frappe.conf.db_host,\n\t\t\tport=frappe.conf.db_port,\n\t\t\tuser=root_login,\n\t\t\tpassword=root_password,\n\t\t)\n\n\treturn frappe.local.flags.root_connection\n\n\ndef drop_user_and_database(db_name, root_login, root_password):\n\troot_conn = get_root_connection(\n\t\tfrappe.flags.root_login or root_login, frappe.flags.root_password or root_password\n\t)\n\troot_conn.commit()\n\troot_conn.sql(\n\t\t\"SELECT pg_terminate_backend (pg_stat_activity.pid) FROM pg_stat_activity WHERE pg_stat_activity.datname = %s\",\n\t\t(db_name,),\n\t)\n\troot_conn.sql(\"end\")\n\troot_conn.sql(f\"DROP DATABASE IF EXISTS {db_name}\")\n\troot_conn.sql(f\"DROP USER IF EXISTS {db_name}\")\n", "path": "frappe/database/postgres/setup_db.py"}]} | 1,827 | 202 |
gh_patches_debug_3277 | rasdani/github-patches | git_diff | spack__spack-11755 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Installing rclone leaves a go folder outside spack root
Spack version: commit a45c4ef33de (if no change on relevant part, it is basically the head).
OS distribution: Ubuntu server 18.04.
After `spack install rclone`, there is a folder named go outside spack installation root as `~/go/` (the spack root is `~/spack/`). I have no idea about go lang, but it seems to be the go packages directory, denoted as `$GOPATH`.
I wonder whether it is necessary to leave anything outside the spack root, or whether it would be simple to move this GOPATH inside the spack root by patching the relevant packages (such as go or rclone).
BTW,
```bash
module show go-1.11.5-gcc-7.4.0-gm
--------------------------------------------------------------------------------------------------------------------------------
/home/ubuntu/spack/share/spack/modules/linux-ubuntu18.04-x86_64/go-1.11.5-gcc-7.4.0-gm:
--------------------------------------------------------------------------------------------------------------------------------
whatis("The golang compiler and build environment ")
prepend_path("PATH","/home/ubuntu/spack/opt/spack/linux-ubuntu18.04-x86_64/gcc-7.4.0/go-1.11.5-gmcvqdwr6az4hxbicr375if5fttk6sna/bin")
prepend_path("LD_LIBRARY_PATH","/home/ubuntu/spack/opt/spack/linux-ubuntu18.04-x86_64/gcc-7.4.0/go-1.11.5-gmcvqdwr6az4hxbicr375if5fttk6sna/lib")
prepend_path("LIBRARY_PATH","/home/ubuntu/spack/opt/spack/linux-ubuntu18.04-x86_64/gcc-7.4.0/go-1.11.5-gmcvqdwr6az4hxbicr375if5fttk6sna/lib")
prepend_path("CMAKE_PREFIX_PATH","/home/ubuntu/spack/opt/spack/linux-ubuntu18.04-x86_64/gcc-7.4.0/go-1.11.5-gmcvqdwr6az4hxbicr375if5fttk6sna/")
help([[The golang compiler and build environment
]])
```
The auto generated module file for go by spack has no modification on `$GOPATH`.
Correct me if I have any misunderstanding on go or its way of doing things, since literally I know nothing about it.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `var/spack/repos/builtin/packages/rclone/package.py`
Content:
```
1 # Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
2 # Spack Project Developers. See the top-level COPYRIGHT file for details.
3 #
4 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
5
6 from spack import *
7
8
9 class Rclone(Package):
10 """Rclone is a command line program to sync files and directories
11 to and from various cloud storage providers"""
12
13 homepage = "http://rclone.org"
14 url = "https://github.com/ncw/rclone/releases/download/v1.43/rclone-v1.43.tar.gz"
15
16 version('1.43', sha256='d30527b00cecb4e5e7188dddb78e5cec62d67cf2422dab82190db58512b5a4e3')
17
18 depends_on("go", type='build')
19
20 def install(self, spec, prefix):
21 go('build')
22 mkdirp(prefix.bin)
23 install('rclone', prefix.bin)
24
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/var/spack/repos/builtin/packages/rclone/package.py b/var/spack/repos/builtin/packages/rclone/package.py
--- a/var/spack/repos/builtin/packages/rclone/package.py
+++ b/var/spack/repos/builtin/packages/rclone/package.py
@@ -17,6 +17,10 @@
depends_on("go", type='build')
+ def setup_environment(self, spack_env, run_env):
+ # Point GOPATH at the top of the staging dir for the build step.
+ spack_env.prepend_path('GOPATH', self.stage.path)
+
def install(self, spec, prefix):
go('build')
mkdirp(prefix.bin)
| {"golden_diff": "diff --git a/var/spack/repos/builtin/packages/rclone/package.py b/var/spack/repos/builtin/packages/rclone/package.py\n--- a/var/spack/repos/builtin/packages/rclone/package.py\n+++ b/var/spack/repos/builtin/packages/rclone/package.py\n@@ -17,6 +17,10 @@\n \n depends_on(\"go\", type='build')\n \n+ def setup_environment(self, spack_env, run_env):\n+ # Point GOPATH at the top of the staging dir for the build step.\n+ spack_env.prepend_path('GOPATH', self.stage.path)\n+\n def install(self, spec, prefix):\n go('build')\n mkdirp(prefix.bin)\n", "issue": "Installing rclone leaves a go folder outside spack root\nSpack version: commit a45c4ef33de (if no change on relevant part, it is basically the head).\r\nOS distribution: Ubuntu server 18.04.\r\n\r\nAfter `spack install rclone`, there is a folder named go outside spack installation root as `~/go/` (the spack root is `~/spack/`). I have no idea about go lang, but it seems to be the go packages directory, denoted as `$GOPATH`.\r\n\r\nI wonder whether it is necessary to leave something outside spack root or should it be simple to move such GOPATH within spack root by hacking some repos (such as go or rclone).\r\n\r\nBTW, \r\n```bash\r\nmodule show go-1.11.5-gcc-7.4.0-gm\r\n--------------------------------------------------------------------------------------------------------------------------------\r\n /home/ubuntu/spack/share/spack/modules/linux-ubuntu18.04-x86_64/go-1.11.5-gcc-7.4.0-gm:\r\n--------------------------------------------------------------------------------------------------------------------------------\r\nwhatis(\"The golang compiler and build environment \")\r\nprepend_path(\"PATH\",\"/home/ubuntu/spack/opt/spack/linux-ubuntu18.04-x86_64/gcc-7.4.0/go-1.11.5-gmcvqdwr6az4hxbicr375if5fttk6sna/bin\")\r\nprepend_path(\"LD_LIBRARY_PATH\",\"/home/ubuntu/spack/opt/spack/linux-ubuntu18.04-x86_64/gcc-7.4.0/go-1.11.5-gmcvqdwr6az4hxbicr375if5fttk6sna/lib\")\r\nprepend_path(\"LIBRARY_PATH\",\"/home/ubuntu/spack/opt/spack/linux-ubuntu18.04-x86_64/gcc-7.4.0/go-1.11.5-gmcvqdwr6az4hxbicr375if5fttk6sna/lib\")\r\nprepend_path(\"CMAKE_PREFIX_PATH\",\"/home/ubuntu/spack/opt/spack/linux-ubuntu18.04-x86_64/gcc-7.4.0/go-1.11.5-gmcvqdwr6az4hxbicr375if5fttk6sna/\")\r\nhelp([[The golang compiler and build environment\r\n]])\r\n```\r\nThe auto generated module file for go by spack has no modification on `$GOPATH`.\r\n\r\nCorrect me if I have any misunderstanding on go or its way of doing things, since literally I know nothing about it.\n", "before_files": [{"content": "# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other\n# Spack Project Developers. See the top-level COPYRIGHT file for details.\n#\n# SPDX-License-Identifier: (Apache-2.0 OR MIT)\n\nfrom spack import *\n\n\nclass Rclone(Package):\n \"\"\"Rclone is a command line program to sync files and directories\n to and from various cloud storage providers\"\"\"\n\n homepage = \"http://rclone.org\"\n url = \"https://github.com/ncw/rclone/releases/download/v1.43/rclone-v1.43.tar.gz\"\n\n version('1.43', sha256='d30527b00cecb4e5e7188dddb78e5cec62d67cf2422dab82190db58512b5a4e3')\n\n depends_on(\"go\", type='build')\n\n def install(self, spec, prefix):\n go('build')\n mkdirp(prefix.bin)\n install('rclone', prefix.bin)\n", "path": "var/spack/repos/builtin/packages/rclone/package.py"}], "after_files": [{"content": "# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other\n# Spack Project Developers. 
See the top-level COPYRIGHT file for details.\n#\n# SPDX-License-Identifier: (Apache-2.0 OR MIT)\n\nfrom spack import *\n\n\nclass Rclone(Package):\n \"\"\"Rclone is a command line program to sync files and directories\n to and from various cloud storage providers\"\"\"\n\n homepage = \"http://rclone.org\"\n url = \"https://github.com/ncw/rclone/releases/download/v1.43/rclone-v1.43.tar.gz\"\n\n version('1.43', sha256='d30527b00cecb4e5e7188dddb78e5cec62d67cf2422dab82190db58512b5a4e3')\n\n depends_on(\"go\", type='build')\n\n def setup_environment(self, spack_env, run_env):\n # Point GOPATH at the top of the staging dir for the build step.\n spack_env.prepend_path('GOPATH', self.stage.path)\n\n def install(self, spec, prefix):\n go('build')\n mkdirp(prefix.bin)\n install('rclone', prefix.bin)\n", "path": "var/spack/repos/builtin/packages/rclone/package.py"}]} | 1,118 | 149 |
gh_patches_debug_30001 | rasdani/github-patches | git_diff | getsentry__sentry-5722 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[plugin] Regression: Webhook Test Button
Somehow during transfer to the new plugin architecture, the webhook plugin's Test button has disappeared.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/sentry/api/endpoints/project_plugin_details.py`
Content:
```
1 from __future__ import absolute_import
2
3 import six
4
5 from django import forms
6 from django.core.urlresolvers import reverse
7 from rest_framework import serializers
8 from rest_framework.response import Response
9
10 from sentry.exceptions import PluginError, PluginIdentityRequired
11 from sentry.plugins import plugins
12 from sentry.api.bases.project import ProjectEndpoint
13 from sentry.api.exceptions import ResourceDoesNotExist
14 from sentry.api.serializers import serialize
15 from sentry.api.serializers.models.plugin import (
16 PluginSerializer, PluginWithConfigSerializer, serialize_field
17 )
18 from sentry.signals import plugin_enabled
19
20 ERR_ALWAYS_ENABLED = 'This plugin is always enabled.'
21 ERR_FIELD_REQUIRED = 'This field is required.'
22
23 OK_UPDATED = 'Successfully updated configuration.'
24
25
26 class ProjectPluginDetailsEndpoint(ProjectEndpoint):
27 def _get_plugin(self, plugin_id):
28 try:
29 return plugins.get(plugin_id)
30 except KeyError:
31 raise ResourceDoesNotExist
32
33 def get(self, request, project, plugin_id):
34 plugin = self._get_plugin(plugin_id)
35
36 try:
37 context = serialize(plugin, request.user, PluginWithConfigSerializer(project))
38 except PluginIdentityRequired as e:
39 context = serialize(plugin, request.user, PluginSerializer(project))
40 context['config_error'] = e.message
41 context['auth_url'] = reverse('socialauth_associate', args=[plugin.slug])
42
43 return Response(context)
44
45 def post(self, request, project, plugin_id):
46 """
47 Enable plugin
48 """
49 plugin = self._get_plugin(plugin_id)
50
51 if not plugin.can_disable:
52 return Response({'detail': ERR_ALWAYS_ENABLED}, status=400)
53
54 plugin.enable(project)
55
56 return Response(status=201)
57
58 def delete(self, request, project, plugin_id):
59 """
60 Disable plugin
61 """
62 plugin = self._get_plugin(plugin_id)
63
64 if not plugin.can_disable:
65 return Response({'detail': ERR_ALWAYS_ENABLED}, status=400)
66
67 plugin.disable(project)
68
69 return Response(status=204)
70
71 def put(self, request, project, plugin_id):
72 plugin = self._get_plugin(plugin_id)
73
74 config = [
75 serialize_field(project, plugin, c)
76 for c in plugin.get_config(
77 project=project,
78 user=request.user,
79 )
80 ]
81
82 cleaned = {}
83 errors = {}
84 for field in config:
85 key = field['name']
86 value = request.DATA.get(key)
87
88 if field.get('required') and not value:
89 errors[key] = ERR_FIELD_REQUIRED
90
91 try:
92 value = plugin.validate_config_field(
93 project=project,
94 name=key,
95 value=value,
96 actor=request.user,
97 )
98 except (forms.ValidationError, serializers.ValidationError, PluginError) as e:
99 errors[key] = e.message
100
101 if not errors.get(key):
102 cleaned[key] = value
103
104 if not errors:
105 try:
106 cleaned = plugin.validate_config(
107 project=project,
108 config=cleaned,
109 actor=request.user,
110 )
111 except PluginError as e:
112 errors['__all__'] = e.message
113
114 if errors:
115 return Response(
116 {
117 'errors': errors,
118 }, status=400
119 )
120
121 for key, value in six.iteritems(cleaned):
122 if value is None:
123 plugin.unset_option(
124 project=project,
125 key=key,
126 )
127 else:
128 plugin.set_option(
129 project=project,
130 key=key,
131 value=value,
132 )
133
134 context = serialize(plugin, request.user, PluginWithConfigSerializer(project))
135
136 plugin_enabled.send(plugin=plugin, project=project, user=request.user, sender=self)
137
138 return Response(context)
139
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/sentry/api/endpoints/project_plugin_details.py b/src/sentry/api/endpoints/project_plugin_details.py
--- a/src/sentry/api/endpoints/project_plugin_details.py
+++ b/src/sentry/api/endpoints/project_plugin_details.py
@@ -1,11 +1,13 @@
from __future__ import absolute_import
+import logging
import six
from django import forms
from django.core.urlresolvers import reverse
from rest_framework import serializers
from rest_framework.response import Response
+from requests.exceptions import HTTPError
from sentry.exceptions import PluginError, PluginIdentityRequired
from sentry.plugins import plugins
@@ -44,10 +46,26 @@
def post(self, request, project, plugin_id):
"""
- Enable plugin
+ Enable plugin or Test plugin
"""
plugin = self._get_plugin(plugin_id)
+ if request.DATA.get('test') and plugin.is_testable():
+ try:
+ test_results = plugin.test_configuration(project)
+ except Exception as exc:
+ if isinstance(exc, HTTPError):
+ test_results = '%s\n%s' % (exc, exc.response.text[:256])
+ elif hasattr(exc, 'read') and callable(exc.read):
+ test_results = '%s\n%s' % (exc, exc.read()[:256])
+ else:
+ logging.exception('Plugin(%s) raised an error during test',
+ plugin_id)
+ test_results = 'There was an internal error with the Plugin'
+ if not test_results:
+ test_results = 'No errors returned'
+ return Response({'detail': test_results}, status=200)
+
if not plugin.can_disable:
return Response({'detail': ERR_ALWAYS_ENABLED}, status=400)
| {"golden_diff": "diff --git a/src/sentry/api/endpoints/project_plugin_details.py b/src/sentry/api/endpoints/project_plugin_details.py\n--- a/src/sentry/api/endpoints/project_plugin_details.py\n+++ b/src/sentry/api/endpoints/project_plugin_details.py\n@@ -1,11 +1,13 @@\n from __future__ import absolute_import\n \n+import logging\n import six\n \n from django import forms\n from django.core.urlresolvers import reverse\n from rest_framework import serializers\n from rest_framework.response import Response\n+from requests.exceptions import HTTPError\n \n from sentry.exceptions import PluginError, PluginIdentityRequired\n from sentry.plugins import plugins\n@@ -44,10 +46,26 @@\n \n def post(self, request, project, plugin_id):\n \"\"\"\n- Enable plugin\n+ Enable plugin or Test plugin\n \"\"\"\n plugin = self._get_plugin(plugin_id)\n \n+ if request.DATA.get('test') and plugin.is_testable():\n+ try:\n+ test_results = plugin.test_configuration(project)\n+ except Exception as exc:\n+ if isinstance(exc, HTTPError):\n+ test_results = '%s\\n%s' % (exc, exc.response.text[:256])\n+ elif hasattr(exc, 'read') and callable(exc.read):\n+ test_results = '%s\\n%s' % (exc, exc.read()[:256])\n+ else:\n+ logging.exception('Plugin(%s) raised an error during test',\n+ plugin_id)\n+ test_results = 'There was an internal error with the Plugin'\n+ if not test_results:\n+ test_results = 'No errors returned'\n+ return Response({'detail': test_results}, status=200)\n+\n if not plugin.can_disable:\n return Response({'detail': ERR_ALWAYS_ENABLED}, status=400)\n", "issue": "[plugin] Regression: Webhook Test Button\nSomehow during transfer to the new plugin architecture, the webhook plugin's Test button has disappeared.\n", "before_files": [{"content": "from __future__ import absolute_import\n\nimport six\n\nfrom django import forms\nfrom django.core.urlresolvers import reverse\nfrom rest_framework import serializers\nfrom rest_framework.response import Response\n\nfrom sentry.exceptions import PluginError, PluginIdentityRequired\nfrom sentry.plugins import plugins\nfrom sentry.api.bases.project import ProjectEndpoint\nfrom sentry.api.exceptions import ResourceDoesNotExist\nfrom sentry.api.serializers import serialize\nfrom sentry.api.serializers.models.plugin import (\n PluginSerializer, PluginWithConfigSerializer, serialize_field\n)\nfrom sentry.signals import plugin_enabled\n\nERR_ALWAYS_ENABLED = 'This plugin is always enabled.'\nERR_FIELD_REQUIRED = 'This field is required.'\n\nOK_UPDATED = 'Successfully updated configuration.'\n\n\nclass ProjectPluginDetailsEndpoint(ProjectEndpoint):\n def _get_plugin(self, plugin_id):\n try:\n return plugins.get(plugin_id)\n except KeyError:\n raise ResourceDoesNotExist\n\n def get(self, request, project, plugin_id):\n plugin = self._get_plugin(plugin_id)\n\n try:\n context = serialize(plugin, request.user, PluginWithConfigSerializer(project))\n except PluginIdentityRequired as e:\n context = serialize(plugin, request.user, PluginSerializer(project))\n context['config_error'] = e.message\n context['auth_url'] = reverse('socialauth_associate', args=[plugin.slug])\n\n return Response(context)\n\n def post(self, request, project, plugin_id):\n \"\"\"\n Enable plugin\n \"\"\"\n plugin = self._get_plugin(plugin_id)\n\n if not plugin.can_disable:\n return Response({'detail': ERR_ALWAYS_ENABLED}, status=400)\n\n plugin.enable(project)\n\n return Response(status=201)\n\n def delete(self, request, project, plugin_id):\n \"\"\"\n Disable plugin\n \"\"\"\n plugin = 
self._get_plugin(plugin_id)\n\n if not plugin.can_disable:\n return Response({'detail': ERR_ALWAYS_ENABLED}, status=400)\n\n plugin.disable(project)\n\n return Response(status=204)\n\n def put(self, request, project, plugin_id):\n plugin = self._get_plugin(plugin_id)\n\n config = [\n serialize_field(project, plugin, c)\n for c in plugin.get_config(\n project=project,\n user=request.user,\n )\n ]\n\n cleaned = {}\n errors = {}\n for field in config:\n key = field['name']\n value = request.DATA.get(key)\n\n if field.get('required') and not value:\n errors[key] = ERR_FIELD_REQUIRED\n\n try:\n value = plugin.validate_config_field(\n project=project,\n name=key,\n value=value,\n actor=request.user,\n )\n except (forms.ValidationError, serializers.ValidationError, PluginError) as e:\n errors[key] = e.message\n\n if not errors.get(key):\n cleaned[key] = value\n\n if not errors:\n try:\n cleaned = plugin.validate_config(\n project=project,\n config=cleaned,\n actor=request.user,\n )\n except PluginError as e:\n errors['__all__'] = e.message\n\n if errors:\n return Response(\n {\n 'errors': errors,\n }, status=400\n )\n\n for key, value in six.iteritems(cleaned):\n if value is None:\n plugin.unset_option(\n project=project,\n key=key,\n )\n else:\n plugin.set_option(\n project=project,\n key=key,\n value=value,\n )\n\n context = serialize(plugin, request.user, PluginWithConfigSerializer(project))\n\n plugin_enabled.send(plugin=plugin, project=project, user=request.user, sender=self)\n\n return Response(context)\n", "path": "src/sentry/api/endpoints/project_plugin_details.py"}], "after_files": [{"content": "from __future__ import absolute_import\n\nimport logging\nimport six\n\nfrom django import forms\nfrom django.core.urlresolvers import reverse\nfrom rest_framework import serializers\nfrom rest_framework.response import Response\nfrom requests.exceptions import HTTPError\n\nfrom sentry.exceptions import PluginError, PluginIdentityRequired\nfrom sentry.plugins import plugins\nfrom sentry.api.bases.project import ProjectEndpoint\nfrom sentry.api.exceptions import ResourceDoesNotExist\nfrom sentry.api.serializers import serialize\nfrom sentry.api.serializers.models.plugin import (\n PluginSerializer, PluginWithConfigSerializer, serialize_field\n)\nfrom sentry.signals import plugin_enabled\n\nERR_ALWAYS_ENABLED = 'This plugin is always enabled.'\nERR_FIELD_REQUIRED = 'This field is required.'\n\nOK_UPDATED = 'Successfully updated configuration.'\n\n\nclass ProjectPluginDetailsEndpoint(ProjectEndpoint):\n def _get_plugin(self, plugin_id):\n try:\n return plugins.get(plugin_id)\n except KeyError:\n raise ResourceDoesNotExist\n\n def get(self, request, project, plugin_id):\n plugin = self._get_plugin(plugin_id)\n\n try:\n context = serialize(plugin, request.user, PluginWithConfigSerializer(project))\n except PluginIdentityRequired as e:\n context = serialize(plugin, request.user, PluginSerializer(project))\n context['config_error'] = e.message\n context['auth_url'] = reverse('socialauth_associate', args=[plugin.slug])\n\n return Response(context)\n\n def post(self, request, project, plugin_id):\n \"\"\"\n Enable plugin or Test plugin\n \"\"\"\n plugin = self._get_plugin(plugin_id)\n\n if request.DATA.get('test') and plugin.is_testable():\n try:\n test_results = plugin.test_configuration(project)\n except Exception as exc:\n if isinstance(exc, HTTPError):\n test_results = '%s\\n%s' % (exc, exc.response.text[:256])\n elif hasattr(exc, 'read') and callable(exc.read):\n test_results = '%s\\n%s' % (exc, 
exc.read()[:256])\n else:\n logging.exception('Plugin(%s) raised an error during test',\n plugin_id)\n test_results = 'There was an internal error with the Plugin'\n if not test_results:\n test_results = 'No errors returned'\n return Response({'detail': test_results}, status=200)\n\n if not plugin.can_disable:\n return Response({'detail': ERR_ALWAYS_ENABLED}, status=400)\n\n plugin.enable(project)\n\n return Response(status=201)\n\n def delete(self, request, project, plugin_id):\n \"\"\"\n Disable plugin\n \"\"\"\n plugin = self._get_plugin(plugin_id)\n\n if not plugin.can_disable:\n return Response({'detail': ERR_ALWAYS_ENABLED}, status=400)\n\n plugin.disable(project)\n\n return Response(status=204)\n\n def put(self, request, project, plugin_id):\n plugin = self._get_plugin(plugin_id)\n\n config = [\n serialize_field(project, plugin, c)\n for c in plugin.get_config(\n project=project,\n user=request.user,\n )\n ]\n\n cleaned = {}\n errors = {}\n for field in config:\n key = field['name']\n value = request.DATA.get(key)\n\n if field.get('required') and not value:\n errors[key] = ERR_FIELD_REQUIRED\n\n try:\n value = plugin.validate_config_field(\n project=project,\n name=key,\n value=value,\n actor=request.user,\n )\n except (forms.ValidationError, serializers.ValidationError, PluginError) as e:\n errors[key] = e.message\n\n if not errors.get(key):\n cleaned[key] = value\n\n if not errors:\n try:\n cleaned = plugin.validate_config(\n project=project,\n config=cleaned,\n actor=request.user,\n )\n except PluginError as e:\n errors['__all__'] = e.message\n\n if errors:\n return Response(\n {\n 'errors': errors,\n }, status=400\n )\n\n for key, value in six.iteritems(cleaned):\n if value is None:\n plugin.unset_option(\n project=project,\n key=key,\n )\n else:\n plugin.set_option(\n project=project,\n key=key,\n value=value,\n )\n\n context = serialize(plugin, request.user, PluginWithConfigSerializer(project))\n\n plugin_enabled.send(plugin=plugin, project=project, user=request.user, sender=self)\n\n return Response(context)\n", "path": "src/sentry/api/endpoints/project_plugin_details.py"}]} | 1,392 | 387 |
gh_patches_debug_15019 | rasdani/github-patches | git_diff | Mailu__Mailu-1305 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Certwatcher does not work with symlinks (on Kubernetes)
I need to recheck this, but I just had this issue... Will report in 3 Months
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/nginx/certwatcher.py`
Content:
```
1 #!/usr/bin/python3
2 """
3 Certificate watcher which reloads nginx or reconfigures it, depending on what
4 happens to externally supplied certificates. Only executed by start.py in case
5 of TLS_FLAVOR=[mail, cert]
6 """
7
8 from os.path import exists, split as path_split
9 from os import system
10 import time
11 from watchdog.observers import Observer
12 from watchdog.events import FileSystemEventHandler, FileDeletedEvent, \
13 FileCreatedEvent, FileModifiedEvent, FileMovedEvent
14
15 class ChangeHandler(FileSystemEventHandler):
16 "watchdog-handler listening on any event, executing the correct configuration/reload steps"
17 @staticmethod
18 def reload_nginx():
19 "merely reload nginx without re-configuring everything"
20 if exists("/var/run/nginx.pid"):
21 print("Reloading a running nginx")
22 system("nginx -s reload")
23
24 @staticmethod
25 def reexec_config():
26 "execute a reconfiguration of the system, which also reloads"
27 print("Reconfiguring system")
28 system("/config.py")
29
30 def on_any_event(self, event):
31 "event-listener checking if the affected files are the cert-files we're interested in"
32 if event.is_directory:
33 return
34
35 filename = path_split(event.src_path)[-1]
36 if isinstance(event, FileMovedEvent):
37 filename = path_split(event.dest_path)[-1]
38
39 if filename in ['cert.pem', 'key.pem']:
40 # all cases except for FileModified need re-configure
41 if isinstance(event, (FileCreatedEvent, FileMovedEvent, FileDeletedEvent)):
42 ChangeHandler.reexec_config()
43 # file modification needs only a nginx reload without config.py
44 elif isinstance(event, FileModifiedEvent):
45 ChangeHandler.reload_nginx()
46 # cert files have been moved away, re-configure
47 elif isinstance(event, FileMovedEvent) and path_split(event.src_path)[-1] in ['cert.pem', 'key.pem']:
48 ChangeHandler.reexec_config()
49
50
51 if __name__ == '__main__':
52 observer = Observer()
53 handler = ChangeHandler()
54 observer.schedule(handler, "/certs", recursive=False)
55 observer.start()
56
57 try:
58 while True:
59 time.sleep(1)
60 except KeyboardInterrupt:
61 observer.stop()
62
63 observer.join()
64
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/core/nginx/certwatcher.py b/core/nginx/certwatcher.py
--- a/core/nginx/certwatcher.py
+++ b/core/nginx/certwatcher.py
@@ -8,7 +8,7 @@
from os.path import exists, split as path_split
from os import system
import time
-from watchdog.observers import Observer
+from watchdog.observers.polling import PollingObserver
from watchdog.events import FileSystemEventHandler, FileDeletedEvent, \
FileCreatedEvent, FileModifiedEvent, FileMovedEvent
@@ -49,7 +49,7 @@
if __name__ == '__main__':
- observer = Observer()
+ observer = PollingObserver()
handler = ChangeHandler()
observer.schedule(handler, "/certs", recursive=False)
observer.start()
| {"golden_diff": "diff --git a/core/nginx/certwatcher.py b/core/nginx/certwatcher.py\n--- a/core/nginx/certwatcher.py\n+++ b/core/nginx/certwatcher.py\n@@ -8,7 +8,7 @@\n from os.path import exists, split as path_split\n from os import system\n import time\n-from watchdog.observers import Observer\n+from watchdog.observers.polling import PollingObserver\n from watchdog.events import FileSystemEventHandler, FileDeletedEvent, \\\n FileCreatedEvent, FileModifiedEvent, FileMovedEvent\n \n@@ -49,7 +49,7 @@\n \n \n if __name__ == '__main__':\n- observer = Observer()\n+ observer = PollingObserver()\n handler = ChangeHandler()\n observer.schedule(handler, \"/certs\", recursive=False)\n observer.start()\n", "issue": "Certwatcher does not work with symlinks (on Kubernetes)\nI need to recheck this, but I just had this issue... Will report in 3 Months\n", "before_files": [{"content": "#!/usr/bin/python3\n\"\"\"\nCertificate watcher which reloads nginx or reconfigures it, depending on what\nhappens to externally supplied certificates. Only executed by start.py in case\nof TLS_FLAVOR=[mail, cert]\n\"\"\"\n\nfrom os.path import exists, split as path_split\nfrom os import system\nimport time\nfrom watchdog.observers import Observer\nfrom watchdog.events import FileSystemEventHandler, FileDeletedEvent, \\\n FileCreatedEvent, FileModifiedEvent, FileMovedEvent\n\nclass ChangeHandler(FileSystemEventHandler):\n \"watchdog-handler listening on any event, executing the correct configuration/reload steps\"\n @staticmethod\n def reload_nginx():\n \"merely reload nginx without re-configuring everything\"\n if exists(\"/var/run/nginx.pid\"):\n print(\"Reloading a running nginx\")\n system(\"nginx -s reload\")\n\n @staticmethod\n def reexec_config():\n \"execute a reconfiguration of the system, which also reloads\"\n print(\"Reconfiguring system\")\n system(\"/config.py\")\n\n def on_any_event(self, event):\n \"event-listener checking if the affected files are the cert-files we're interested in\"\n if event.is_directory:\n return\n\n filename = path_split(event.src_path)[-1]\n if isinstance(event, FileMovedEvent):\n filename = path_split(event.dest_path)[-1]\n\n if filename in ['cert.pem', 'key.pem']:\n # all cases except for FileModified need re-configure\n if isinstance(event, (FileCreatedEvent, FileMovedEvent, FileDeletedEvent)):\n ChangeHandler.reexec_config()\n # file modification needs only a nginx reload without config.py\n elif isinstance(event, FileModifiedEvent):\n ChangeHandler.reload_nginx()\n # cert files have been moved away, re-configure\n elif isinstance(event, FileMovedEvent) and path_split(event.src_path)[-1] in ['cert.pem', 'key.pem']:\n ChangeHandler.reexec_config()\n\n\nif __name__ == '__main__':\n observer = Observer()\n handler = ChangeHandler()\n observer.schedule(handler, \"/certs\", recursive=False)\n observer.start()\n\n try:\n while True:\n time.sleep(1)\n except KeyboardInterrupt:\n observer.stop()\n\n observer.join()\n", "path": "core/nginx/certwatcher.py"}], "after_files": [{"content": "#!/usr/bin/python3\n\"\"\"\nCertificate watcher which reloads nginx or reconfigures it, depending on what\nhappens to externally supplied certificates. 
Only executed by start.py in case\nof TLS_FLAVOR=[mail, cert]\n\"\"\"\n\nfrom os.path import exists, split as path_split\nfrom os import system\nimport time\nfrom watchdog.observers.polling import PollingObserver\nfrom watchdog.events import FileSystemEventHandler, FileDeletedEvent, \\\n FileCreatedEvent, FileModifiedEvent, FileMovedEvent\n\nclass ChangeHandler(FileSystemEventHandler):\n \"watchdog-handler listening on any event, executing the correct configuration/reload steps\"\n @staticmethod\n def reload_nginx():\n \"merely reload nginx without re-configuring everything\"\n if exists(\"/var/run/nginx.pid\"):\n print(\"Reloading a running nginx\")\n system(\"nginx -s reload\")\n\n @staticmethod\n def reexec_config():\n \"execute a reconfiguration of the system, which also reloads\"\n print(\"Reconfiguring system\")\n system(\"/config.py\")\n\n def on_any_event(self, event):\n \"event-listener checking if the affected files are the cert-files we're interested in\"\n if event.is_directory:\n return\n\n filename = path_split(event.src_path)[-1]\n if isinstance(event, FileMovedEvent):\n filename = path_split(event.dest_path)[-1]\n\n if filename in ['cert.pem', 'key.pem']:\n # all cases except for FileModified need re-configure\n if isinstance(event, (FileCreatedEvent, FileMovedEvent, FileDeletedEvent)):\n ChangeHandler.reexec_config()\n # file modification needs only a nginx reload without config.py\n elif isinstance(event, FileModifiedEvent):\n ChangeHandler.reload_nginx()\n # cert files have been moved away, re-configure\n elif isinstance(event, FileMovedEvent) and path_split(event.src_path)[-1] in ['cert.pem', 'key.pem']:\n ChangeHandler.reexec_config()\n\n\nif __name__ == '__main__':\n observer = PollingObserver()\n handler = ChangeHandler()\n observer.schedule(handler, \"/certs\", recursive=False)\n observer.start()\n\n try:\n while True:\n time.sleep(1)\n except KeyboardInterrupt:\n observer.stop()\n\n observer.join()\n", "path": "core/nginx/certwatcher.py"}]} | 891 | 170 |
gh_patches_debug_4511 | rasdani/github-patches | git_diff | zigpy__zha-device-handlers-704 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Xiaomi Mijia Light Sensor (GZCGQ01LM) in ZHA not detecting battery level anymore (it did in previous versions lower than 2020.12.7)
Hi, the original report is from lustreking (aka Steve Small); I (yfands) can confirm Steve's findings: indeed, no battery level.
Steve posted his issue [here](https://community.home-assistant.io/t/xiaomi-mijia-zha-not-detecting-battery-level/257827) but got no answer there, so I am trying it here. His original post has two pictures: one from when all was well, and a second after the update to 2020.12.7, when it was no longer working; the difference is the class the device was assigned to.
The Problem
A while ago I got a Xiaomi Mijia Light Sensor (GZCGQ01LM), and it worked well for me, so I eventually ordered 2 more. Before I got them, I updated to the current version of HA (2020.12.7), as well as migrated HA to a new RPi.
When I tried to add them, ZHA found the illuminance entity, but not the battery entity, which my previous sensor had.
In ZHA I see the new sensors listed as Xiaomi, while the previous sensor’s manufacturer was listed as lumi. The model for both is lumi.sen_ill.mgl01
Under the zigbee info for the new ones it says “Power Source: Mains” which is obviously incorrect!
Here are the zigbee details of the good and bad ones (see his original link). The configuration below is from me (yfands); in short, ZHA found the illuminance entity, but not the battery entity anymore!
## System Health
version | 2020.12.2
-- | --
installation_type | Home Assistant OS
dev | false
hassio | true
docker | true
virtualenv | false
python_version | 3.8.6
os_name | Linux
os_version | 5.4.79-v8
arch | aarch64
timezone | Europe/Amsterdam
<details><summary>Home Assistant Community Store</summary>
GitHub API | ok
-- | --
Github API Calls Remaining | 4646
Installed Version | 1.9.0
Stage | running
Available Repositories | 711
Installed Repositories | 29
</details>
<details><summary>Home Assistant Cloud</summary>
logged_in | false
-- | --
can_reach_cert_server | ok
can_reach_cloud_auth | ok
can_reach_cloud | ok
</details>
<details><summary>Home Assistant Supervisor</summary>
host_os | Home Assistant OS 5.9
-- | --
update_channel | stable
supervisor_version | 2020.12.7
docker_version | 19.03.13
disk_total | 27.2 GB
disk_used | 5.5 GB
healthy | true
supported | true
board | rpi4-64
supervisor_api | ok
version_api | ok
installed_addons | File editor (5.2.0), Samba share (9.3.0), Terminal & SSH (8.10.0), Mosquitto broker (5.1), Node-RED (7.2.11)
</details>
<details><summary>Lovelace</summary>
dashboards | 1
-- | --
mode | storage
views | 10
resources | 20
</details>
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `zhaquirks/xiaomi/aqara/illumination.py`
Content:
```
1 """Quirk for lumi.sen_ill.mgl01 illumination sensor."""
2 import logging
3
4 from zigpy.profiles import zha
5 from zigpy.zcl.clusters.general import Basic, Identify
6 from zigpy.zcl.clusters.measurement import IlluminanceMeasurement
7 from zigpy.zdo.types import NodeDescriptor
8
9 from .. import LUMI, BasicCluster, XiaomiCustomDevice
10 from ... import PowerConfigurationCluster
11 from ...const import (
12 DEVICE_TYPE,
13 ENDPOINTS,
14 INPUT_CLUSTERS,
15 MODELS_INFO,
16 NODE_DESCRIPTOR,
17 OUTPUT_CLUSTERS,
18 PROFILE_ID,
19 )
20
21 _LOGGER = logging.getLogger(__name__)
22
23
24 class Illumination(XiaomiCustomDevice):
25 """Aqara LUMI lumi.sen_ill.mgl01."""
26
27 signature = {
28 # <SimpleDescriptor endpoint=1 profile=260 device_type=262
29 # device_version=1
30 # input_clusters=[0, 1024, 3, 1]
31 # output_clusters=[3]>
32 MODELS_INFO: [(LUMI, "lumi.sen_ill.mgl01")],
33 ENDPOINTS: {
34 1: {
35 PROFILE_ID: zha.PROFILE_ID,
36 DEVICE_TYPE: zha.DeviceType.LIGHT_SENSOR,
37 INPUT_CLUSTERS: [
38 Basic.cluster_id,
39 Identify.cluster_id,
40 IlluminanceMeasurement.cluster_id,
41 PowerConfigurationCluster.cluster_id,
42 ],
43 OUTPUT_CLUSTERS: [Identify.cluster_id],
44 }
45 },
46 }
47
48 replacement = {
49 NODE_DESCRIPTOR: NodeDescriptor(
50 0x02, 0x40, 0x80, 0x115F, 0x7F, 0x0064, 0x2C00, 0x0064, 0x00
51 ),
52 ENDPOINTS: {
53 1: {
54 PROFILE_ID: zha.PROFILE_ID,
55 DEVICE_TYPE: zha.DeviceType.LIGHT_SENSOR,
56 INPUT_CLUSTERS: [
57 BasicCluster,
58 Identify.cluster_id,
59 IlluminanceMeasurement.cluster_id,
60 PowerConfigurationCluster,
61 ],
62 OUTPUT_CLUSTERS: [Identify.cluster_id],
63 }
64 },
65 }
66
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/zhaquirks/xiaomi/aqara/illumination.py b/zhaquirks/xiaomi/aqara/illumination.py
--- a/zhaquirks/xiaomi/aqara/illumination.py
+++ b/zhaquirks/xiaomi/aqara/illumination.py
@@ -29,7 +29,7 @@
# device_version=1
# input_clusters=[0, 1024, 3, 1]
# output_clusters=[3]>
- MODELS_INFO: [(LUMI, "lumi.sen_ill.mgl01")],
+ MODELS_INFO: [(LUMI, "lumi.sen_ill.mgl01"), ("XIAOMI", "lumi.sen_ill.mgl01")],
ENDPOINTS: {
1: {
PROFILE_ID: zha.PROFILE_ID,
| {"golden_diff": "diff --git a/zhaquirks/xiaomi/aqara/illumination.py b/zhaquirks/xiaomi/aqara/illumination.py\n--- a/zhaquirks/xiaomi/aqara/illumination.py\n+++ b/zhaquirks/xiaomi/aqara/illumination.py\n@@ -29,7 +29,7 @@\n # device_version=1\n # input_clusters=[0, 1024, 3, 1]\n # output_clusters=[3]>\n- MODELS_INFO: [(LUMI, \"lumi.sen_ill.mgl01\")],\n+ MODELS_INFO: [(LUMI, \"lumi.sen_ill.mgl01\"), (\"XIAOMI\", \"lumi.sen_ill.mgl01\")],\n ENDPOINTS: {\n 1: {\n PROFILE_ID: zha.PROFILE_ID,\n", "issue": "Xiaomi Mijia Light Sensor (GZCGQ01LM) in ZHA not detecting battery level anymore (it did in previous versions lower than 2020.12.7)\nHai The original call is from ( lustreking aka Steve Small) , I (yfands) can confirm Steve's findings indeed no battery level.\r\nSteve posted his issue [here ](https://community.home-assistant.io/t/xiaomi-mijia-zha-not-detecting-battery-level/257827)but no answer there, so iam trying it here. In his original call there are 2 pictures one when all was well and the second picture after the update to 2020.12.7, the one that is not working any more, the difference is the class the device was assigned to.\r\n\r\nThe Problem\r\nA while ago I got a Xiaomi Mijia Light Sensor (GZCGQ01LM), and it worked well for me, so I eventually ordered 2 more. Before I got them, I updated to the current version of HA (2020.12.7), as well as migrated HA to a new RPi.\r\n\r\nWhen I tried to add them, ZHA found the illuminance entity, but not the battery entity, which my previous sensor had.\r\n\r\nIn ZHA I see the new sensors listed as Xiaomi, while the previous sensor\u2019s manufacturer was listed as lumi. The model for both is lumi.sen_ill.mgl01\r\n\r\nUnder the zigbee info for the new ones it says \u201cPower Source: Mains\u201d which is obviously incorrect!\r\nHere are the zigbee details of the good and bad ones, (see his original link).. 
The configuration below is from me (yfands), so to be short in ZHA found the illuminance entity, but not the battery entity anymore!\r\n\r\n## System Health\r\n\r\nversion | 2020.12.2\r\n-- | --\r\ninstallation_type | Home Assistant OS\r\ndev | false\r\nhassio | true\r\ndocker | true\r\nvirtualenv | false\r\npython_version | 3.8.6\r\nos_name | Linux\r\nos_version | 5.4.79-v8\r\narch | aarch64\r\ntimezone | Europe/Amsterdam\r\n\r\n<details><summary>Home Assistant Community Store</summary>\r\n\r\nGitHub API | ok\r\n-- | --\r\nGithub API Calls Remaining | 4646\r\nInstalled Version | 1.9.0\r\nStage | running\r\nAvailable Repositories | 711\r\nInstalled Repositories | 29\r\n\r\n</details>\r\n\r\n<details><summary>Home Assistant Cloud</summary>\r\n\r\nlogged_in | false\r\n-- | --\r\ncan_reach_cert_server | ok\r\ncan_reach_cloud_auth | ok\r\ncan_reach_cloud | ok\r\n\r\n</details>\r\n\r\n<details><summary>Home Assistant Supervisor</summary>\r\n\r\nhost_os | Home Assistant OS 5.9\r\n-- | --\r\nupdate_channel | stable\r\nsupervisor_version | 2020.12.7\r\ndocker_version | 19.03.13\r\ndisk_total | 27.2 GB\r\ndisk_used | 5.5 GB\r\nhealthy | true\r\nsupported | true\r\nboard | rpi4-64\r\nsupervisor_api | ok\r\nversion_api | ok\r\ninstalled_addons | File editor (5.2.0), Samba share (9.3.0), Terminal & SSH (8.10.0), Mosquitto broker (5.1), Node-RED (7.2.11)\r\n\r\n</details>\r\n\r\n<details><summary>Lovelace</summary>\r\n\r\ndashboards | 1\r\n-- | --\r\nmode | storage\r\nviews | 10\r\nresources | 20\r\n\r\n</details>\r\n\n", "before_files": [{"content": "\"\"\"Quirk for lumi.sen_ill.mgl01 illumination sensor.\"\"\"\nimport logging\n\nfrom zigpy.profiles import zha\nfrom zigpy.zcl.clusters.general import Basic, Identify\nfrom zigpy.zcl.clusters.measurement import IlluminanceMeasurement\nfrom zigpy.zdo.types import NodeDescriptor\n\nfrom .. import LUMI, BasicCluster, XiaomiCustomDevice\nfrom ... import PowerConfigurationCluster\nfrom ...const import (\n DEVICE_TYPE,\n ENDPOINTS,\n INPUT_CLUSTERS,\n MODELS_INFO,\n NODE_DESCRIPTOR,\n OUTPUT_CLUSTERS,\n PROFILE_ID,\n)\n\n_LOGGER = logging.getLogger(__name__)\n\n\nclass Illumination(XiaomiCustomDevice):\n \"\"\"Aqara LUMI lumi.sen_ill.mgl01.\"\"\"\n\n signature = {\n # <SimpleDescriptor endpoint=1 profile=260 device_type=262\n # device_version=1\n # input_clusters=[0, 1024, 3, 1]\n # output_clusters=[3]>\n MODELS_INFO: [(LUMI, \"lumi.sen_ill.mgl01\")],\n ENDPOINTS: {\n 1: {\n PROFILE_ID: zha.PROFILE_ID,\n DEVICE_TYPE: zha.DeviceType.LIGHT_SENSOR,\n INPUT_CLUSTERS: [\n Basic.cluster_id,\n Identify.cluster_id,\n IlluminanceMeasurement.cluster_id,\n PowerConfigurationCluster.cluster_id,\n ],\n OUTPUT_CLUSTERS: [Identify.cluster_id],\n }\n },\n }\n\n replacement = {\n NODE_DESCRIPTOR: NodeDescriptor(\n 0x02, 0x40, 0x80, 0x115F, 0x7F, 0x0064, 0x2C00, 0x0064, 0x00\n ),\n ENDPOINTS: {\n 1: {\n PROFILE_ID: zha.PROFILE_ID,\n DEVICE_TYPE: zha.DeviceType.LIGHT_SENSOR,\n INPUT_CLUSTERS: [\n BasicCluster,\n Identify.cluster_id,\n IlluminanceMeasurement.cluster_id,\n PowerConfigurationCluster,\n ],\n OUTPUT_CLUSTERS: [Identify.cluster_id],\n }\n },\n }\n", "path": "zhaquirks/xiaomi/aqara/illumination.py"}], "after_files": [{"content": "\"\"\"Quirk for lumi.sen_ill.mgl01 illumination sensor.\"\"\"\nimport logging\n\nfrom zigpy.profiles import zha\nfrom zigpy.zcl.clusters.general import Basic, Identify\nfrom zigpy.zcl.clusters.measurement import IlluminanceMeasurement\nfrom zigpy.zdo.types import NodeDescriptor\n\nfrom .. import LUMI, BasicCluster, XiaomiCustomDevice\nfrom ... 
import PowerConfigurationCluster\nfrom ...const import (\n DEVICE_TYPE,\n ENDPOINTS,\n INPUT_CLUSTERS,\n MODELS_INFO,\n NODE_DESCRIPTOR,\n OUTPUT_CLUSTERS,\n PROFILE_ID,\n)\n\n_LOGGER = logging.getLogger(__name__)\n\n\nclass Illumination(XiaomiCustomDevice):\n \"\"\"Aqara LUMI lumi.sen_ill.mgl01.\"\"\"\n\n signature = {\n # <SimpleDescriptor endpoint=1 profile=260 device_type=262\n # device_version=1\n # input_clusters=[0, 1024, 3, 1]\n # output_clusters=[3]>\n MODELS_INFO: [(LUMI, \"lumi.sen_ill.mgl01\"), (\"XIAOMI\", \"lumi.sen_ill.mgl01\")],\n ENDPOINTS: {\n 1: {\n PROFILE_ID: zha.PROFILE_ID,\n DEVICE_TYPE: zha.DeviceType.LIGHT_SENSOR,\n INPUT_CLUSTERS: [\n Basic.cluster_id,\n Identify.cluster_id,\n IlluminanceMeasurement.cluster_id,\n PowerConfigurationCluster.cluster_id,\n ],\n OUTPUT_CLUSTERS: [Identify.cluster_id],\n }\n },\n }\n\n replacement = {\n NODE_DESCRIPTOR: NodeDescriptor(\n 0x02, 0x40, 0x80, 0x115F, 0x7F, 0x0064, 0x2C00, 0x0064, 0x00\n ),\n ENDPOINTS: {\n 1: {\n PROFILE_ID: zha.PROFILE_ID,\n DEVICE_TYPE: zha.DeviceType.LIGHT_SENSOR,\n INPUT_CLUSTERS: [\n BasicCluster,\n Identify.cluster_id,\n IlluminanceMeasurement.cluster_id,\n PowerConfigurationCluster,\n ],\n OUTPUT_CLUSTERS: [Identify.cluster_id],\n }\n },\n }\n", "path": "zhaquirks/xiaomi/aqara/illumination.py"}]} | 1,678 | 199 |
gh_patches_debug_661 | rasdani/github-patches | git_diff | pex-tool__pex-2055 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Release 2.1.122
On the docket:
+ [x] Support the latest Pip releases: 22.3.1 & 23.0 #2056
+ [x] Lock sdists with prepare-metadata-for-build-wheel. #2053
+ [x] Fix `execute_parallel` "leaking" a thread. #2052
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pex/version.py`
Content:
```
1 # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 __version__ = "2.1.121"
5
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pex/version.py b/pex/version.py
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.121"
+__version__ = "2.1.122"
| {"golden_diff": "diff --git a/pex/version.py b/pex/version.py\n--- a/pex/version.py\n+++ b/pex/version.py\n@@ -1,4 +1,4 @@\n # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n # Licensed under the Apache License, Version 2.0 (see LICENSE).\n \n-__version__ = \"2.1.121\"\n+__version__ = \"2.1.122\"\n", "issue": "Release 2.1.122\nOn the docket:\r\n+ [x] Support the latest Pip releases: 22.3.1 & 23.0 #2056\r\n+ [x] Lock sdists with prepare-metadata-for-build-wheel. #2053\r\n+ [x] Fix `execute_parallel` \"leaking\" a thread. #2052\r\n\n", "before_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.121\"\n", "path": "pex/version.py"}], "after_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.122\"\n", "path": "pex/version.py"}]} | 395 | 98 |
gh_patches_debug_16138 | rasdani/github-patches | git_diff | uccser__cs-unplugged-718 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Workaround Crowdin bug where integer yaml keys are not preserved
When downloading in-context localisation files, integer keys in YAML files are not preserved. This is only an issue in the file `topics/content/en/programming-challenges-structure-difficulties.yaml`, which uses the difficulty number as the key.
As a workaround, we can use the string value of the integer as the key, i.e. `"0"` instead of `0`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `csunplugged/topics/management/commands/_ProgrammingChallengesStructureLoader.py`
Content:
```
1 """Custom loader for loading structure of programming challenges."""
2
3 import os
4 from django.db import transaction
5 from utils.errors.MissingRequiredFieldError import MissingRequiredFieldError
6 from utils.TranslatableModelLoader import TranslatableModelLoader
7 from topics.models import ProgrammingChallengeLanguage, ProgrammingChallengeDifficulty
8
9
10 class ProgrammingChallengesStructureLoader(TranslatableModelLoader):
11 """Custom loader for loading structure of programming challenges."""
12
13 @transaction.atomic
14 def load(self):
15 """Load the content for structure of programming challenges.
16
17 Raises:
18 MissingRequiredFieldError: when no object can be found with the matching
19 attribute.
20 """
21 structure = self.load_yaml_file(self.structure_file_path)
22
23 prog_languages = structure.get("languages", None)
24 difficulty_levels = structure.get("difficulties", None)
25 if None in [prog_languages, difficulty_levels]:
26 raise MissingRequiredFieldError(
27 self.structure_file_path,
28 ["lanugages", "difficulties"],
29 "Programming Challenge Structure"
30 )
31
32 # Add "-languages" to the structure filename
33 prog_languages_translation_filename = "{}-languages.yaml".format(
34 os.path.splitext(self.structure_filename)[0]
35 )
36 prog_languages_translations = self.get_yaml_translations(
37 prog_languages_translation_filename,
38 required_slugs=prog_languages.keys(),
39 required_fields=["name"]
40 )
41
42 for (prog_language, prog_language_data) in prog_languages.items():
43
44 if prog_language_data is None:
45 raise MissingRequiredFieldError(
46 self.structure_file_path,
47 ["number"],
48 "Programming Challenge Language"
49 )
50
51 # Check for required fields
52 prog_language_number = prog_language_data.get("number", None)
53 if prog_language_number is None:
54 raise MissingRequiredFieldError(
55 self.structure_file_path,
56 ["number"],
57 "Programming Challenge Language"
58 )
59
60 # Check if icon is given
61 if "icon" in prog_language_data:
62 prog_language_icon = prog_language_data["icon"]
63 else:
64 prog_language_icon = None
65
66 new_prog_language = ProgrammingChallengeLanguage(
67 slug=prog_language,
68 number=prog_language_number,
69 icon=prog_language_icon
70 )
71
72 translations = prog_languages_translations.get(prog_language, dict())
73 self.populate_translations(new_prog_language, translations)
74 self.mark_translation_availability(new_prog_language, required_fields=["name"])
75 new_prog_language.save()
76
77 self.log("Added programming language: {}".format(new_prog_language.__str__()))
78
79 # Add "-languages" to the structure filename
80 difficulties_translation_filename = "{}-difficulties.yaml".format(
81 os.path.splitext(self.structure_filename)[0]
82 )
83 difficulties_translations = self.get_yaml_translations(
84 difficulties_translation_filename,
85 required_slugs=difficulty_levels,
86 required_fields=["name"],
87 )
88
89 for difficulty in difficulty_levels:
90
91 new_difficulty = ProgrammingChallengeDifficulty(
92 level=int(difficulty),
93 )
94
95 translations = difficulties_translations.get(difficulty, dict())
96 self.populate_translations(new_difficulty, translations)
97 self.mark_translation_availability(new_difficulty, required_fields=["name"])
98 new_difficulty.save()
99
100 self.log("Added programming difficulty level: {}".format(new_difficulty.__str__()))
101
102 self.log("")
103
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/csunplugged/topics/management/commands/_ProgrammingChallengesStructureLoader.py b/csunplugged/topics/management/commands/_ProgrammingChallengesStructureLoader.py
--- a/csunplugged/topics/management/commands/_ProgrammingChallengesStructureLoader.py
+++ b/csunplugged/topics/management/commands/_ProgrammingChallengesStructureLoader.py
@@ -86,13 +86,13 @@
required_fields=["name"],
)
- for difficulty in difficulty_levels:
+ for level, difficulty_slug in enumerate(difficulty_levels):
new_difficulty = ProgrammingChallengeDifficulty(
- level=int(difficulty),
+ level=level,
)
- translations = difficulties_translations.get(difficulty, dict())
+ translations = difficulties_translations.get(difficulty_slug, dict())
self.populate_translations(new_difficulty, translations)
self.mark_translation_availability(new_difficulty, required_fields=["name"])
new_difficulty.save()
| {"golden_diff": "diff --git a/csunplugged/topics/management/commands/_ProgrammingChallengesStructureLoader.py b/csunplugged/topics/management/commands/_ProgrammingChallengesStructureLoader.py\n--- a/csunplugged/topics/management/commands/_ProgrammingChallengesStructureLoader.py\n+++ b/csunplugged/topics/management/commands/_ProgrammingChallengesStructureLoader.py\n@@ -86,13 +86,13 @@\n required_fields=[\"name\"],\n )\n \n- for difficulty in difficulty_levels:\n+ for level, difficulty_slug in enumerate(difficulty_levels):\n \n new_difficulty = ProgrammingChallengeDifficulty(\n- level=int(difficulty),\n+ level=level,\n )\n \n- translations = difficulties_translations.get(difficulty, dict())\n+ translations = difficulties_translations.get(difficulty_slug, dict())\n self.populate_translations(new_difficulty, translations)\n self.mark_translation_availability(new_difficulty, required_fields=[\"name\"])\n new_difficulty.save()\n", "issue": "Workaround Crowdin bug where integer yaml keys are not preserved\nWhen downloading in-context localisation files, integer keys in YAML files are not preserved. This is only an issue in the file `topics/content/en/programming-challenges-structure-difficulties.yaml`, which uses the difficulty number as the key.\r\n\r\nAs a work around, we can use the string value of the integer as the key, i.e. `\"0\"` instead of `0`\nWorkaround Crowdin bug where integer yaml keys are not preserved\nWhen downloading in-context localisation files, integer keys in YAML files are not preserved. This is only an issue in the file `topics/content/en/programming-challenges-structure-difficulties.yaml`, which uses the difficulty number as the key.\r\n\r\nAs a work around, we can use the string value of the integer as the key, i.e. `\"0\"` instead of `0`\n", "before_files": [{"content": "\"\"\"Custom loader for loading structure of programming challenges.\"\"\"\n\nimport os\nfrom django.db import transaction\nfrom utils.errors.MissingRequiredFieldError import MissingRequiredFieldError\nfrom utils.TranslatableModelLoader import TranslatableModelLoader\nfrom topics.models import ProgrammingChallengeLanguage, ProgrammingChallengeDifficulty\n\n\nclass ProgrammingChallengesStructureLoader(TranslatableModelLoader):\n \"\"\"Custom loader for loading structure of programming challenges.\"\"\"\n\n @transaction.atomic\n def load(self):\n \"\"\"Load the content for structure of programming challenges.\n\n Raises:\n MissingRequiredFieldError: when no object can be found with the matching\n attribute.\n \"\"\"\n structure = self.load_yaml_file(self.structure_file_path)\n\n prog_languages = structure.get(\"languages\", None)\n difficulty_levels = structure.get(\"difficulties\", None)\n if None in [prog_languages, difficulty_levels]:\n raise MissingRequiredFieldError(\n self.structure_file_path,\n [\"lanugages\", \"difficulties\"],\n \"Programming Challenge Structure\"\n )\n\n # Add \"-languages\" to the structure filename\n prog_languages_translation_filename = \"{}-languages.yaml\".format(\n os.path.splitext(self.structure_filename)[0]\n )\n prog_languages_translations = self.get_yaml_translations(\n prog_languages_translation_filename,\n required_slugs=prog_languages.keys(),\n required_fields=[\"name\"]\n )\n\n for (prog_language, prog_language_data) in prog_languages.items():\n\n if prog_language_data is None:\n raise MissingRequiredFieldError(\n self.structure_file_path,\n [\"number\"],\n \"Programming Challenge Language\"\n )\n\n # Check for required fields\n prog_language_number = 
prog_language_data.get(\"number\", None)\n if prog_language_number is None:\n raise MissingRequiredFieldError(\n self.structure_file_path,\n [\"number\"],\n \"Programming Challenge Language\"\n )\n\n # Check if icon is given\n if \"icon\" in prog_language_data:\n prog_language_icon = prog_language_data[\"icon\"]\n else:\n prog_language_icon = None\n\n new_prog_language = ProgrammingChallengeLanguage(\n slug=prog_language,\n number=prog_language_number,\n icon=prog_language_icon\n )\n\n translations = prog_languages_translations.get(prog_language, dict())\n self.populate_translations(new_prog_language, translations)\n self.mark_translation_availability(new_prog_language, required_fields=[\"name\"])\n new_prog_language.save()\n\n self.log(\"Added programming language: {}\".format(new_prog_language.__str__()))\n\n # Add \"-languages\" to the structure filename\n difficulties_translation_filename = \"{}-difficulties.yaml\".format(\n os.path.splitext(self.structure_filename)[0]\n )\n difficulties_translations = self.get_yaml_translations(\n difficulties_translation_filename,\n required_slugs=difficulty_levels,\n required_fields=[\"name\"],\n )\n\n for difficulty in difficulty_levels:\n\n new_difficulty = ProgrammingChallengeDifficulty(\n level=int(difficulty),\n )\n\n translations = difficulties_translations.get(difficulty, dict())\n self.populate_translations(new_difficulty, translations)\n self.mark_translation_availability(new_difficulty, required_fields=[\"name\"])\n new_difficulty.save()\n\n self.log(\"Added programming difficulty level: {}\".format(new_difficulty.__str__()))\n\n self.log(\"\")\n", "path": "csunplugged/topics/management/commands/_ProgrammingChallengesStructureLoader.py"}], "after_files": [{"content": "\"\"\"Custom loader for loading structure of programming challenges.\"\"\"\n\nimport os\nfrom django.db import transaction\nfrom utils.errors.MissingRequiredFieldError import MissingRequiredFieldError\nfrom utils.TranslatableModelLoader import TranslatableModelLoader\nfrom topics.models import ProgrammingChallengeLanguage, ProgrammingChallengeDifficulty\n\n\nclass ProgrammingChallengesStructureLoader(TranslatableModelLoader):\n \"\"\"Custom loader for loading structure of programming challenges.\"\"\"\n\n @transaction.atomic\n def load(self):\n \"\"\"Load the content for structure of programming challenges.\n\n Raises:\n MissingRequiredFieldError: when no object can be found with the matching\n attribute.\n \"\"\"\n structure = self.load_yaml_file(self.structure_file_path)\n\n prog_languages = structure.get(\"languages\", None)\n difficulty_levels = structure.get(\"difficulties\", None)\n if None in [prog_languages, difficulty_levels]:\n raise MissingRequiredFieldError(\n self.structure_file_path,\n [\"lanugages\", \"difficulties\"],\n \"Programming Challenge Structure\"\n )\n\n # Add \"-languages\" to the structure filename\n prog_languages_translation_filename = \"{}-languages.yaml\".format(\n os.path.splitext(self.structure_filename)[0]\n )\n prog_languages_translations = self.get_yaml_translations(\n prog_languages_translation_filename,\n required_slugs=prog_languages.keys(),\n required_fields=[\"name\"]\n )\n\n for (prog_language, prog_language_data) in prog_languages.items():\n\n if prog_language_data is None:\n raise MissingRequiredFieldError(\n self.structure_file_path,\n [\"number\"],\n \"Programming Challenge Language\"\n )\n\n # Check for required fields\n prog_language_number = prog_language_data.get(\"number\", None)\n if prog_language_number is None:\n raise 
MissingRequiredFieldError(\n self.structure_file_path,\n [\"number\"],\n \"Programming Challenge Language\"\n )\n\n # Check if icon is given\n if \"icon\" in prog_language_data:\n prog_language_icon = prog_language_data[\"icon\"]\n else:\n prog_language_icon = None\n\n new_prog_language = ProgrammingChallengeLanguage(\n slug=prog_language,\n number=prog_language_number,\n icon=prog_language_icon\n )\n\n translations = prog_languages_translations.get(prog_language, dict())\n self.populate_translations(new_prog_language, translations)\n self.mark_translation_availability(new_prog_language, required_fields=[\"name\"])\n new_prog_language.save()\n\n self.log(\"Added programming language: {}\".format(new_prog_language.__str__()))\n\n # Add \"-languages\" to the structure filename\n difficulties_translation_filename = \"{}-difficulties.yaml\".format(\n os.path.splitext(self.structure_filename)[0]\n )\n difficulties_translations = self.get_yaml_translations(\n difficulties_translation_filename,\n required_slugs=difficulty_levels,\n required_fields=[\"name\"],\n )\n\n for level, difficulty_slug in enumerate(difficulty_levels):\n\n new_difficulty = ProgrammingChallengeDifficulty(\n level=level,\n )\n\n translations = difficulties_translations.get(difficulty_slug, dict())\n self.populate_translations(new_difficulty, translations)\n self.mark_translation_availability(new_difficulty, required_fields=[\"name\"])\n new_difficulty.save()\n\n self.log(\"Added programming difficulty level: {}\".format(new_difficulty.__str__()))\n\n self.log(\"\")\n", "path": "csunplugged/topics/management/commands/_ProgrammingChallengesStructureLoader.py"}]} | 1,343 | 205 |
gh_patches_debug_1183 | rasdani/github-patches | git_diff | bokeh__bokeh-2235 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
VBoxForm broken
Added a `float:left` to fix `sliders.py` which broke stock app example worse.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/plotting/file/slider.py`
Content:
```
1
2 from bokeh.io import vform
3 from bokeh.plotting import figure, hplot, output_file, show, vplot, ColumnDataSource
4 from bokeh.models.actions import Callback
5 from bokeh.models.widgets import Slider
6
7 import numpy as np
8
9 x = np.linspace(0, 10, 500)
10 y = np.sin(x)
11
12 source = ColumnDataSource(data=dict(x=x, y=y))
13
14
15 plot = figure(y_range=(-10, 10), plot_width=400, plot_height=400)
16 plot.line('x', 'y', source=source, line_width=3, line_alpha=0.6)
17
18 callback = Callback(args=dict(source=source), code="""
19 var data = source.get('data');
20 var A = amp.get('value')
21 var k = freq.get('value')
22 var phi = phase.get('value')
23 var B = offset.get('value')
24 x = data['x']
25 y = data['y']
26 for (i = 0; i < x.length; i++) {
27 y[i] = B + A*Math.sin(k*x[i]+phi);
28 }
29 source.trigger('change');
30 """)
31
32 amp_slider = Slider(start=0.1, end=10, value=1, step=.1, title="Amplitude", callback=callback)
33 callback.args["amp"] = amp_slider
34
35 freq_slider = Slider(start=0.1, end=10, value=1, step=.1, title="Frequency", callback=callback)
36 callback.args["freq"] = freq_slider
37
38 phase_slider = Slider(start=0, end=6.4, value=0, step=.1, title="Phase", callback=callback)
39 callback.args["phase"] = phase_slider
40
41 offset_slider = Slider(start=-5, end=5, value=0, step=.1, title="Offset", callback=callback)
42 callback.args["offset"] = offset_slider
43
44 layout = hplot(
45 vform(amp_slider, freq_slider, phase_slider, offset_slider),
46 plot
47 )
48
49 output_file("slider.html")
50
51 show(layout)
52
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/examples/plotting/file/slider.py b/examples/plotting/file/slider.py
--- a/examples/plotting/file/slider.py
+++ b/examples/plotting/file/slider.py
@@ -42,8 +42,8 @@
callback.args["offset"] = offset_slider
layout = hplot(
+ plot,
vform(amp_slider, freq_slider, phase_slider, offset_slider),
- plot
)
output_file("slider.html")
| {"golden_diff": "diff --git a/examples/plotting/file/slider.py b/examples/plotting/file/slider.py\n--- a/examples/plotting/file/slider.py\n+++ b/examples/plotting/file/slider.py\n@@ -42,8 +42,8 @@\n callback.args[\"offset\"] = offset_slider\n \n layout = hplot(\n+ plot,\n vform(amp_slider, freq_slider, phase_slider, offset_slider),\n- plot\n )\n \n output_file(\"slider.html\")\n", "issue": "VBoxForm broken\nAdded a `float:left` to fix `sliders.py` which broke stock app example worse. \n\n", "before_files": [{"content": "\nfrom bokeh.io import vform\nfrom bokeh.plotting import figure, hplot, output_file, show, vplot, ColumnDataSource\nfrom bokeh.models.actions import Callback\nfrom bokeh.models.widgets import Slider\n\nimport numpy as np\n\nx = np.linspace(0, 10, 500)\ny = np.sin(x)\n\nsource = ColumnDataSource(data=dict(x=x, y=y))\n\n\nplot = figure(y_range=(-10, 10), plot_width=400, plot_height=400)\nplot.line('x', 'y', source=source, line_width=3, line_alpha=0.6)\n\ncallback = Callback(args=dict(source=source), code=\"\"\"\n var data = source.get('data');\n var A = amp.get('value')\n var k = freq.get('value')\n var phi = phase.get('value')\n var B = offset.get('value')\n x = data['x']\n y = data['y']\n for (i = 0; i < x.length; i++) {\n y[i] = B + A*Math.sin(k*x[i]+phi);\n }\n source.trigger('change');\n\"\"\")\n\namp_slider = Slider(start=0.1, end=10, value=1, step=.1, title=\"Amplitude\", callback=callback)\ncallback.args[\"amp\"] = amp_slider\n\nfreq_slider = Slider(start=0.1, end=10, value=1, step=.1, title=\"Frequency\", callback=callback)\ncallback.args[\"freq\"] = freq_slider\n\nphase_slider = Slider(start=0, end=6.4, value=0, step=.1, title=\"Phase\", callback=callback)\ncallback.args[\"phase\"] = phase_slider\n\noffset_slider = Slider(start=-5, end=5, value=0, step=.1, title=\"Offset\", callback=callback)\ncallback.args[\"offset\"] = offset_slider\n\nlayout = hplot(\n vform(amp_slider, freq_slider, phase_slider, offset_slider),\n plot\n)\n\noutput_file(\"slider.html\")\n\nshow(layout)\n", "path": "examples/plotting/file/slider.py"}], "after_files": [{"content": "\nfrom bokeh.io import vform\nfrom bokeh.plotting import figure, hplot, output_file, show, vplot, ColumnDataSource\nfrom bokeh.models.actions import Callback\nfrom bokeh.models.widgets import Slider\n\nimport numpy as np\n\nx = np.linspace(0, 10, 500)\ny = np.sin(x)\n\nsource = ColumnDataSource(data=dict(x=x, y=y))\n\n\nplot = figure(y_range=(-10, 10), plot_width=400, plot_height=400)\nplot.line('x', 'y', source=source, line_width=3, line_alpha=0.6)\n\ncallback = Callback(args=dict(source=source), code=\"\"\"\n var data = source.get('data');\n var A = amp.get('value')\n var k = freq.get('value')\n var phi = phase.get('value')\n var B = offset.get('value')\n x = data['x']\n y = data['y']\n for (i = 0; i < x.length; i++) {\n y[i] = B + A*Math.sin(k*x[i]+phi);\n }\n source.trigger('change');\n\"\"\")\n\namp_slider = Slider(start=0.1, end=10, value=1, step=.1, title=\"Amplitude\", callback=callback)\ncallback.args[\"amp\"] = amp_slider\n\nfreq_slider = Slider(start=0.1, end=10, value=1, step=.1, title=\"Frequency\", callback=callback)\ncallback.args[\"freq\"] = freq_slider\n\nphase_slider = Slider(start=0, end=6.4, value=0, step=.1, title=\"Phase\", callback=callback)\ncallback.args[\"phase\"] = phase_slider\n\noffset_slider = Slider(start=-5, end=5, value=0, step=.1, title=\"Offset\", callback=callback)\ncallback.args[\"offset\"] = offset_slider\n\nlayout = hplot(\n plot,\n vform(amp_slider, freq_slider, phase_slider, 
offset_slider),\n)\n\noutput_file(\"slider.html\")\n\nshow(layout)\n", "path": "examples/plotting/file/slider.py"}]} | 836 | 98 |
gh_patches_debug_2661 | rasdani/github-patches | git_diff | cowrie__cowrie-920 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
output_localsyslog exceptions.KeyError: 'isError'
After pulling the most recent version of cowrie to some of my honeypots, I get this error when a new connection I enabled [output_localsyslog] with configuration below:
```
[output_localsyslog]
enabled = true
facility = LOCAL5
format = text
```
The log error shows this:
```
2018-10-11T18:29:01.778300+0000 [twisted.logger._observer#critical] Temporarily disabling observer LegacyLogObserverWrapper(<bound method Output.emit of <cowrie.output.localsyslog.Output object at 0xb55ae7b0>>) due to exception: [Failure instance: Traceback: <type 'exceptions.KeyError'>: 'isError'
/opt/cowrie/src/cowrie/core/checkers.py:110:checkUserPass
/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/python/threadable.py:53:sync
/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/python/log.py:286:msg
/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/logger/_legacy.py:154:publishToNewObserver
--- <exception caught here> ---
/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/logger/_observer.py:131:__call__
/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/logger/_legacy.py:93:__call__
/opt/cowrie/src/cowrie/core/output.py:209:emit
/opt/cowrie/src/cowrie/output/localsyslog.py:65:write
/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/python/syslog.py:76:emit
]
Traceback (most recent call last):
File "/opt/cowrie/src/cowrie/core/checkers.py", line 110, in checkUserPass
password=thepassword)
File "/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/python/threadable.py", line 53, in sync
return function(self, *args, **kwargs)
File "/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/python/log.py", line 286, in msg
_publishNew(self._publishPublisher, actualEventDict, textFromEventDict)
File "/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/logger/_legacy.py", line 154, in publishToNewObserver
observer(eventDict)
--- <exception caught here> ---
File "/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/logger/_observer.py", line 131, in __call__
observer(event)
File "/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/logger/_legacy.py", line 93, in __call__
self.legacyObserver(event)
File "/opt/cowrie/src/cowrie/core/output.py", line 209, in emit
self.write(ev)
File "/opt/cowrie/src/cowrie/output/localsyslog.py", line 65, in write
self.syslog.emit(logentry)
File "/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/python/syslog.py", line 76, in emit
if eventDict['isError']:
exceptions.KeyError: 'isError'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cowrie/output/localsyslog.py`
Content:
```
1 # Copyright (c) 2015 Michel Oosterhof <[email protected]>
2 # All rights reserved.
3 #
4 # Redistribution and use in source and binary forms, with or without
5 # modification, are permitted provided that the following conditions
6 # are met:
7 #
8 # 1. Redistributions of source code must retain the above copyright
9 # notice, this list of conditions and the following disclaimer.
10 # 2. Redistributions in binary form must reproduce the above copyright
11 # notice, this list of conditions and the following disclaimer in the
12 # documentation and/or other materials provided with the distribution.
13 # 3. The names of the author(s) may not be used to endorse or promote
14 # products derived from this software without specific prior written
15 # permission.
16 #
17 # THIS SOFTWARE IS PROVIDED BY THE AUTHORS ``AS IS'' AND ANY EXPRESS OR
18 # IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES
19 # OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
20 # IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY DIRECT, INDIRECT,
21 # INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
22 # BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
23 # LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED
24 # AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
25 # OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
26 # OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
27 # SUCH DAMAGE.
28
29 from __future__ import absolute_import, division
30
31 import syslog
32
33 import twisted.python.syslog
34
35 import cowrie.core.cef
36 import cowrie.core.output
37 from cowrie.core.config import CONFIG
38
39
40 class Output(cowrie.core.output.Output):
41
42 def __init__(self):
43 facilityString = CONFIG.get('output_localsyslog', 'facility')
44 self.format = CONFIG.get('output_localsyslog', 'format')
45 self.facility = vars(syslog)['LOG_' + facilityString]
46 self.syslog = twisted.python.syslog.SyslogObserver(prefix='cowrie', facility=self.facility)
47 cowrie.core.output.Output.__init__(self)
48
49 def start(self):
50 pass
51
52 def stop(self):
53 pass
54
55 def write(self, logentry):
56 if self.format == 'cef':
57 self.syslog.emit({
58 'message': cowrie.core.cef.formatCef(logentry),
59 'isError': False,
60 'system': 'cowrie'
61 })
62 else:
63 # message appears with additional spaces if message key is defined
64 logentry['message'] = [logentry['message']]
65 self.syslog.emit(logentry)
66
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/cowrie/output/localsyslog.py b/src/cowrie/output/localsyslog.py
--- a/src/cowrie/output/localsyslog.py
+++ b/src/cowrie/output/localsyslog.py
@@ -53,6 +53,9 @@
pass
def write(self, logentry):
+ if 'isError' not in logentry:
+ logentry['isError'] = False
+
if self.format == 'cef':
self.syslog.emit({
'message': cowrie.core.cef.formatCef(logentry),
| {"golden_diff": "diff --git a/src/cowrie/output/localsyslog.py b/src/cowrie/output/localsyslog.py\n--- a/src/cowrie/output/localsyslog.py\n+++ b/src/cowrie/output/localsyslog.py\n@@ -53,6 +53,9 @@\n pass\n \n def write(self, logentry):\n+ if 'isError' not in logentry:\n+ logentry['isError'] = False\n+\n if self.format == 'cef':\n self.syslog.emit({\n 'message': cowrie.core.cef.formatCef(logentry),\n", "issue": "output_localsyslog exceptions.KeyError: 'isError'\nAfter pulling the most recent version of cowrie to some of my honeypots, I get this error when a new connection I enabled [output_localsyslog] with configuration below:\r\n```\r\n[output_localsyslog]\r\nenabled = true\r\nfacility = LOCAL5\r\nformat = text\r\n```\r\nThe log error shows this:\r\n```\r\n2018-10-11T18:29:01.778300+0000 [twisted.logger._observer#critical] Temporarily disabling observer LegacyLogObserverWrapper(<bound method Output.emit of <cowrie.output.localsyslog.Output object at 0xb55ae7b0>>) due to exception: [Failure instance: Traceback: <type 'exceptions.KeyError'>: 'isError'\r\n\t/opt/cowrie/src/cowrie/core/checkers.py:110:checkUserPass\r\n\t/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/python/threadable.py:53:sync\r\n\t/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/python/log.py:286:msg\r\n\t/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/logger/_legacy.py:154:publishToNewObserver\r\n\t--- <exception caught here> ---\r\n\t/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/logger/_observer.py:131:__call__\r\n\t/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/logger/_legacy.py:93:__call__\r\n\t/opt/cowrie/src/cowrie/core/output.py:209:emit\r\n\t/opt/cowrie/src/cowrie/output/localsyslog.py:65:write\r\n\t/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/python/syslog.py:76:emit\r\n\t]\r\n\tTraceback (most recent call last):\r\n\t File \"/opt/cowrie/src/cowrie/core/checkers.py\", line 110, in checkUserPass\r\n\t password=thepassword)\r\n\t File \"/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/python/threadable.py\", line 53, in sync\r\n\t return function(self, *args, **kwargs)\r\n\t File \"/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/python/log.py\", line 286, in msg\r\n\t _publishNew(self._publishPublisher, actualEventDict, textFromEventDict)\r\n\t File \"/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/logger/_legacy.py\", line 154, in publishToNewObserver\r\n\t observer(eventDict)\r\n\t--- <exception caught here> ---\r\n\t File \"/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/logger/_observer.py\", line 131, in __call__\r\n\t observer(event)\r\n\t File \"/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/logger/_legacy.py\", line 93, in __call__\r\n\t self.legacyObserver(event)\r\n\t File \"/opt/cowrie/src/cowrie/core/output.py\", line 209, in emit\r\n\t self.write(ev)\r\n\t File \"/opt/cowrie/src/cowrie/output/localsyslog.py\", line 65, in write\r\n\t self.syslog.emit(logentry)\r\n\t File \"/opt/cowrie/cowrie-env/local/lib/python2.7/site-packages/twisted/python/syslog.py\", line 76, in emit\r\n\t if eventDict['isError']:\r\n\texceptions.KeyError: 'isError'\r\n```\n", "before_files": [{"content": "# Copyright (c) 2015 Michel Oosterhof <[email protected]>\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions\n# are 
met:\n#\n# 1. Redistributions of source code must retain the above copyright\n# notice, this list of conditions and the following disclaimer.\n# 2. Redistributions in binary form must reproduce the above copyright\n# notice, this list of conditions and the following disclaimer in the\n# documentation and/or other materials provided with the distribution.\n# 3. The names of the author(s) may not be used to endorse or promote\n# products derived from this software without specific prior written\n# permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE AUTHORS ``AS IS'' AND ANY EXPRESS OR\n# IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES\n# OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.\n# IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY DIRECT, INDIRECT,\n# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,\n# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;\n# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED\n# AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\n# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY\n# OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF\n# SUCH DAMAGE.\n\nfrom __future__ import absolute_import, division\n\nimport syslog\n\nimport twisted.python.syslog\n\nimport cowrie.core.cef\nimport cowrie.core.output\nfrom cowrie.core.config import CONFIG\n\n\nclass Output(cowrie.core.output.Output):\n\n def __init__(self):\n facilityString = CONFIG.get('output_localsyslog', 'facility')\n self.format = CONFIG.get('output_localsyslog', 'format')\n self.facility = vars(syslog)['LOG_' + facilityString]\n self.syslog = twisted.python.syslog.SyslogObserver(prefix='cowrie', facility=self.facility)\n cowrie.core.output.Output.__init__(self)\n\n def start(self):\n pass\n\n def stop(self):\n pass\n\n def write(self, logentry):\n if self.format == 'cef':\n self.syslog.emit({\n 'message': cowrie.core.cef.formatCef(logentry),\n 'isError': False,\n 'system': 'cowrie'\n })\n else:\n # message appears with additional spaces if message key is defined\n logentry['message'] = [logentry['message']]\n self.syslog.emit(logentry)\n", "path": "src/cowrie/output/localsyslog.py"}], "after_files": [{"content": "# Copyright (c) 2015 Michel Oosterhof <[email protected]>\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions\n# are met:\n#\n# 1. Redistributions of source code must retain the above copyright\n# notice, this list of conditions and the following disclaimer.\n# 2. Redistributions in binary form must reproduce the above copyright\n# notice, this list of conditions and the following disclaimer in the\n# documentation and/or other materials provided with the distribution.\n# 3. 
The names of the author(s) may not be used to endorse or promote\n# products derived from this software without specific prior written\n# permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE AUTHORS ``AS IS'' AND ANY EXPRESS OR\n# IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES\n# OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.\n# IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY DIRECT, INDIRECT,\n# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,\n# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;\n# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED\n# AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\n# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY\n# OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF\n# SUCH DAMAGE.\n\nfrom __future__ import absolute_import, division\n\nimport syslog\n\nimport twisted.python.syslog\n\nimport cowrie.core.cef\nimport cowrie.core.output\nfrom cowrie.core.config import CONFIG\n\n\nclass Output(cowrie.core.output.Output):\n\n def __init__(self):\n facilityString = CONFIG.get('output_localsyslog', 'facility')\n self.format = CONFIG.get('output_localsyslog', 'format')\n self.facility = vars(syslog)['LOG_' + facilityString]\n self.syslog = twisted.python.syslog.SyslogObserver(prefix='cowrie', facility=self.facility)\n cowrie.core.output.Output.__init__(self)\n\n def start(self):\n pass\n\n def stop(self):\n pass\n\n def write(self, logentry):\n if 'isError' not in logentry:\n logentry['isError'] = False\n\n if self.format == 'cef':\n self.syslog.emit({\n 'message': cowrie.core.cef.formatCef(logentry),\n 'isError': False,\n 'system': 'cowrie'\n })\n else:\n # message appears with additional spaces if message key is defined\n logentry['message'] = [logentry['message']]\n self.syslog.emit(logentry)\n", "path": "src/cowrie/output/localsyslog.py"}]} | 1,788 | 128 |
gh_patches_debug_4834 | rasdani/github-patches | git_diff | opensearch-project__opensearch-build-724 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug]: Broken UTs on `main` branch
### Describe the bug
Seeing failures in these 2 UTs on main branch
```
Run pipenv run coverage run -m pytest --cov=./src --cov-report=xml
F....................................................................... [ 25%]
........................................................................ [ 50%]
........................................................................ [ 76%]
.................................................................... [100%]
=================================== FAILURES ===================================
__________________________ TestRunAssemble.test_main ___________________________
self = <tests.test_run_assemble.TestRunAssemble testMethod=test_main>
mock_copy = <MagicMock name='copy2' id='140398777086032'>
mock_temp = <MagicMock name='TemporaryDirectory' id='140398777095120'>
mock_recorder = <MagicMock name='BundleRecorder' id='140398777104272'>
mock_bundles = <MagicMock name='Bundles' id='140398777121680'>
mocks = (<MagicMock name='getcwd' id='140398777147472'>, <MagicMock name='makedirs' id='140398777161040'>, <MagicMock name='chdir' id='140398777178384'>)
mock_bundle = <MagicMock name='Bundles.create()' id='140398776678672'>
@patch("os.chdir")
@patch("os.makedirs")
@patch("os.getcwd", return_value="curdir")
@patch("argparse._sys.argv", ["run_assemble.py", BUILD_MANIFEST])
@patch("run_assemble.Bundles", return_value=MagicMock())
@patch("run_assemble.BundleRecorder", return_value=MagicMock())
@patch("tempfile.TemporaryDirectory")
@patch("shutil.copy2")
def test_main(self, mock_copy, mock_temp, mock_recorder, mock_bundles, *mocks):
mock_temp.return_value.__enter__.return_value = tempfile.gettempdir()
mock_bundle = MagicMock(archive_path="path")
mock_bundles.create.return_value = mock_bundle
> main()
tests/test_run_assemble.py:47:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
src/run_assemble.py:48: in main
exit(1)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Use exit() or Ctrl-D (i.e. EOF) to exit, code = 1
def __call__(self, code=None):
# Shells like IDLE catch the SystemExit, but listen when their
# stdin wrapper is closed.
try:
sys.stdin.close()
except:
pass
> raise SystemExit(code)
E SystemExit: 1
/opt/hostedtoolcache/Python/3.7.12/x64/lib/python3.7/_sitebuiltins.py:26: SystemExit
------------------------------ Captured log call -------------------------------
ERROR root:run_assemble.py:46 No installation script found at path: /home/runner/work/opensearch-build/opensearch-build/release/tar/linux/opensearch-tar-install.sh
---------- coverage: platform linux, python 3.7.12-final-0 -----------
Coverage XML written to file coverage.xml
=========================== short test summary info ============================
FAILED tests/test_run_assemble.py::TestRunAssemble::test_main - SystemExit: 1
1 failed, 283 passed in 31.93s
Error: Process completed with exit code 1.
```
### To reproduce
Run ut coverage analysis on repo `opensearch-build`, branch `main`
```
pipenv run coverage run -m pytest --cov=./src --cov-report=xml
```
### Expected behavior
No UT failures
### Screenshots
If applicable, add screenshots to help explain your problem.
### Host / Environment
_No response_
### Additional context
_No response_
### Relevant log output
_No response_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/run_assemble.py`
Content:
```
1 #!/usr/bin/env python
2
3 # SPDX-License-Identifier: Apache-2.0
4 #
5 # The OpenSearch Contributors require contributions made to
6 # this file be licensed under the Apache-2.0 license or a
7 # compatible open source license.
8
9 import argparse
10 import logging
11 import os
12 import shutil
13 import sys
14 import tempfile
15
16 from assemble_workflow.bundle_recorder import BundleRecorder
17 from assemble_workflow.bundles import Bundles
18 from manifests.build_manifest import BuildManifest
19 from system import console
20
21
22 def main():
23 parser = argparse.ArgumentParser(description="Assemble an OpenSearch Bundle")
24 parser.add_argument("manifest", type=argparse.FileType("r"), help="Manifest file.")
25 parser.add_argument(
26 "-v",
27 "--verbose",
28 help="Show more verbose output.",
29 action="store_const",
30 default=logging.INFO,
31 const=logging.DEBUG,
32 dest="logging_level",
33 )
34 args = parser.parse_args()
35
36 console.configure(level=args.logging_level)
37
38 tarball_installation_script = os.path.realpath(
39 os.path.join(
40 os.path.dirname(os.path.abspath(__file__)),
41 "../release/tar/linux/opensearch-tar-install.sh",
42 )
43 )
44 if not os.path.isfile(tarball_installation_script):
45 logging.error(
46 f"No installation script found at path: {tarball_installation_script}"
47 )
48 exit(1)
49
50 build_manifest = BuildManifest.from_file(args.manifest)
51 build = build_manifest.build
52 artifacts_dir = os.path.dirname(os.path.realpath(args.manifest.name))
53 output_dir = os.path.join(os.getcwd(), "bundle")
54 os.makedirs(output_dir, exist_ok=True)
55
56 with tempfile.TemporaryDirectory() as work_dir:
57 logging.info(
58 f"Bundling {build.name} ({build.architecture}) into {output_dir} ..."
59 )
60
61 os.chdir(work_dir)
62
63 bundle_recorder = BundleRecorder(build, output_dir, artifacts_dir)
64
65 bundle = Bundles.create(build_manifest, artifacts_dir, bundle_recorder)
66
67 bundle.install_plugins()
68 logging.info(f"Installed plugins: {bundle.installed_plugins}")
69
70 # Copy the tar installation script into the bundle
71 shutil.copy2(
72 tarball_installation_script,
73 os.path.join(
74 bundle.archive_path, os.path.basename(tarball_installation_script)
75 ),
76 )
77
78 # Save a copy of the manifest inside of the tar
79 bundle_recorder.write_manifest(bundle.archive_path)
80 bundle.build_tar(output_dir)
81
82 bundle_recorder.write_manifest(output_dir)
83
84 logging.info("Done.")
85
86
87 if __name__ == "__main__":
88 sys.exit(main())
89
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/run_assemble.py b/src/run_assemble.py
--- a/src/run_assemble.py
+++ b/src/run_assemble.py
@@ -38,7 +38,7 @@
tarball_installation_script = os.path.realpath(
os.path.join(
os.path.dirname(os.path.abspath(__file__)),
- "../release/tar/linux/opensearch-tar-install.sh",
+ "../scripts/legacy/tar/linux/opensearch-tar-install.sh",
)
)
if not os.path.isfile(tarball_installation_script):
| {"golden_diff": "diff --git a/src/run_assemble.py b/src/run_assemble.py\n--- a/src/run_assemble.py\n+++ b/src/run_assemble.py\n@@ -38,7 +38,7 @@\n tarball_installation_script = os.path.realpath(\n os.path.join(\n os.path.dirname(os.path.abspath(__file__)),\n- \"../release/tar/linux/opensearch-tar-install.sh\",\n+ \"../scripts/legacy/tar/linux/opensearch-tar-install.sh\",\n )\n )\n if not os.path.isfile(tarball_installation_script):\n", "issue": "[Bug]: Broken UTs on `main` branch\n### Describe the bug\r\n\r\nSeeing failures in these 2 UTs on main branch \r\n\r\n```\r\nRun pipenv run coverage run -m pytest --cov=./src --cov-report=xml\r\nF....................................................................... [ 25%]\r\n........................................................................ [ 50%]\r\n........................................................................ [ 76%]\r\n.................................................................... [100%]\r\n=================================== FAILURES ===================================\r\n__________________________ TestRunAssemble.test_main ___________________________\r\n\r\nself = <tests.test_run_assemble.TestRunAssemble testMethod=test_main>\r\nmock_copy = <MagicMock name='copy2' id='140398777086032'>\r\nmock_temp = <MagicMock name='TemporaryDirectory' id='140398777095120'>\r\nmock_recorder = <MagicMock name='BundleRecorder' id='140398777104272'>\r\nmock_bundles = <MagicMock name='Bundles' id='140398777121680'>\r\nmocks = (<MagicMock name='getcwd' id='140398777147472'>, <MagicMock name='makedirs' id='140398777161040'>, <MagicMock name='chdir' id='140398777178384'>)\r\nmock_bundle = <MagicMock name='Bundles.create()' id='140398776678672'>\r\n\r\n @patch(\"os.chdir\")\r\n @patch(\"os.makedirs\")\r\n @patch(\"os.getcwd\", return_value=\"curdir\")\r\n @patch(\"argparse._sys.argv\", [\"run_assemble.py\", BUILD_MANIFEST])\r\n @patch(\"run_assemble.Bundles\", return_value=MagicMock())\r\n @patch(\"run_assemble.BundleRecorder\", return_value=MagicMock())\r\n @patch(\"tempfile.TemporaryDirectory\")\r\n @patch(\"shutil.copy2\")\r\n def test_main(self, mock_copy, mock_temp, mock_recorder, mock_bundles, *mocks):\r\n mock_temp.return_value.__enter__.return_value = tempfile.gettempdir()\r\n mock_bundle = MagicMock(archive_path=\"path\")\r\n mock_bundles.create.return_value = mock_bundle\r\n \r\n> main()\r\n\r\ntests/test_run_assemble.py:47: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ \r\nsrc/run_assemble.py:48: in main\r\n exit(1)\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ \r\n\r\nself = Use exit() or Ctrl-D (i.e. 
EOF) to exit, code = 1\r\n\r\n def __call__(self, code=None):\r\n # Shells like IDLE catch the SystemExit, but listen when their\r\n # stdin wrapper is closed.\r\n try:\r\n sys.stdin.close()\r\n except:\r\n pass\r\n> raise SystemExit(code)\r\nE SystemExit: 1\r\n\r\n/opt/hostedtoolcache/Python/3.7.12/x64/lib/python3.7/_sitebuiltins.py:26: SystemExit\r\n------------------------------ Captured log call -------------------------------\r\nERROR root:run_assemble.py:46 No installation script found at path: /home/runner/work/opensearch-build/opensearch-build/release/tar/linux/opensearch-tar-install.sh\r\n\r\n---------- coverage: platform linux, python 3.7.12-final-0 -----------\r\nCoverage XML written to file coverage.xml\r\n\r\n=========================== short test summary info ============================\r\nFAILED tests/test_run_assemble.py::TestRunAssemble::test_main - SystemExit: 1\r\n1 failed, 283 passed in 31.93s\r\nError: Process completed with exit code 1.\r\n```\r\n\r\n### To reproduce\r\n\r\nRun ut coverage analysis on repo `opensearch-build`, branch `main`\r\n\r\n```\r\npipenv run coverage run -m pytest --cov=./src --cov-report=xml\r\n```\r\n\r\n### Expected behavior\r\n\r\nNo UT failures\r\n\r\n### Screenshots\r\n\r\nIf applicable, add screenshots to help explain your problem.\r\n\r\n### Host / Environment\r\n\r\n_No response_\r\n\r\n### Additional context\r\n\r\n_No response_\r\n\r\n### Relevant log output\r\n\r\n_No response_\n", "before_files": [{"content": "#!/usr/bin/env python\n\n# SPDX-License-Identifier: Apache-2.0\n#\n# The OpenSearch Contributors require contributions made to\n# this file be licensed under the Apache-2.0 license or a\n# compatible open source license.\n\nimport argparse\nimport logging\nimport os\nimport shutil\nimport sys\nimport tempfile\n\nfrom assemble_workflow.bundle_recorder import BundleRecorder\nfrom assemble_workflow.bundles import Bundles\nfrom manifests.build_manifest import BuildManifest\nfrom system import console\n\n\ndef main():\n parser = argparse.ArgumentParser(description=\"Assemble an OpenSearch Bundle\")\n parser.add_argument(\"manifest\", type=argparse.FileType(\"r\"), help=\"Manifest file.\")\n parser.add_argument(\n \"-v\",\n \"--verbose\",\n help=\"Show more verbose output.\",\n action=\"store_const\",\n default=logging.INFO,\n const=logging.DEBUG,\n dest=\"logging_level\",\n )\n args = parser.parse_args()\n\n console.configure(level=args.logging_level)\n\n tarball_installation_script = os.path.realpath(\n os.path.join(\n os.path.dirname(os.path.abspath(__file__)),\n \"../release/tar/linux/opensearch-tar-install.sh\",\n )\n )\n if not os.path.isfile(tarball_installation_script):\n logging.error(\n f\"No installation script found at path: {tarball_installation_script}\"\n )\n exit(1)\n\n build_manifest = BuildManifest.from_file(args.manifest)\n build = build_manifest.build\n artifacts_dir = os.path.dirname(os.path.realpath(args.manifest.name))\n output_dir = os.path.join(os.getcwd(), \"bundle\")\n os.makedirs(output_dir, exist_ok=True)\n\n with tempfile.TemporaryDirectory() as work_dir:\n logging.info(\n f\"Bundling {build.name} ({build.architecture}) into {output_dir} ...\"\n )\n\n os.chdir(work_dir)\n\n bundle_recorder = BundleRecorder(build, output_dir, artifacts_dir)\n\n bundle = Bundles.create(build_manifest, artifacts_dir, bundle_recorder)\n\n bundle.install_plugins()\n logging.info(f\"Installed plugins: {bundle.installed_plugins}\")\n\n # Copy the tar installation script into the bundle\n shutil.copy2(\n 
tarball_installation_script,\n os.path.join(\n bundle.archive_path, os.path.basename(tarball_installation_script)\n ),\n )\n\n # Save a copy of the manifest inside of the tar\n bundle_recorder.write_manifest(bundle.archive_path)\n bundle.build_tar(output_dir)\n\n bundle_recorder.write_manifest(output_dir)\n\n logging.info(\"Done.\")\n\n\nif __name__ == \"__main__\":\n sys.exit(main())\n", "path": "src/run_assemble.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\n# SPDX-License-Identifier: Apache-2.0\n#\n# The OpenSearch Contributors require contributions made to\n# this file be licensed under the Apache-2.0 license or a\n# compatible open source license.\n\nimport argparse\nimport logging\nimport os\nimport shutil\nimport sys\nimport tempfile\n\nfrom assemble_workflow.bundle_recorder import BundleRecorder\nfrom assemble_workflow.bundles import Bundles\nfrom manifests.build_manifest import BuildManifest\nfrom system import console\n\n\ndef main():\n parser = argparse.ArgumentParser(description=\"Assemble an OpenSearch Bundle\")\n parser.add_argument(\"manifest\", type=argparse.FileType(\"r\"), help=\"Manifest file.\")\n parser.add_argument(\n \"-v\",\n \"--verbose\",\n help=\"Show more verbose output.\",\n action=\"store_const\",\n default=logging.INFO,\n const=logging.DEBUG,\n dest=\"logging_level\",\n )\n args = parser.parse_args()\n\n console.configure(level=args.logging_level)\n\n tarball_installation_script = os.path.realpath(\n os.path.join(\n os.path.dirname(os.path.abspath(__file__)),\n \"../scripts/legacy/tar/linux/opensearch-tar-install.sh\",\n )\n )\n if not os.path.isfile(tarball_installation_script):\n logging.error(\n f\"No installation script found at path: {tarball_installation_script}\"\n )\n exit(1)\n\n build_manifest = BuildManifest.from_file(args.manifest)\n build = build_manifest.build\n artifacts_dir = os.path.dirname(os.path.realpath(args.manifest.name))\n output_dir = os.path.join(os.getcwd(), \"bundle\")\n os.makedirs(output_dir, exist_ok=True)\n\n with tempfile.TemporaryDirectory() as work_dir:\n logging.info(\n f\"Bundling {build.name} ({build.architecture}) into {output_dir} ...\"\n )\n\n os.chdir(work_dir)\n\n bundle_recorder = BundleRecorder(build, output_dir, artifacts_dir)\n\n bundle = Bundles.create(build_manifest, artifacts_dir, bundle_recorder)\n\n bundle.install_plugins()\n logging.info(f\"Installed plugins: {bundle.installed_plugins}\")\n\n # Copy the tar installation script into the bundle\n shutil.copy2(\n tarball_installation_script,\n os.path.join(\n bundle.archive_path, os.path.basename(tarball_installation_script)\n ),\n )\n\n # Save a copy of the manifest inside of the tar\n bundle_recorder.write_manifest(bundle.archive_path)\n bundle.build_tar(output_dir)\n\n bundle_recorder.write_manifest(output_dir)\n\n logging.info(\"Done.\")\n\n\nif __name__ == \"__main__\":\n sys.exit(main())\n", "path": "src/run_assemble.py"}]} | 1,960 | 120 |
gh_patches_debug_21852 | rasdani/github-patches | git_diff | wagtail__wagtail-7315 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Site history report n+1s
While checking Scout APM for a client project with lots of different authors and overall activity, I've noticed that the site history report has a few N+1s.

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wagtail/admin/views/reports/audit_logging.py`
Content:
```
1 import datetime
2
3 import django_filters
4
5 from django import forms
6 from django.db.models import Q, Subquery
7 from django.utils.encoding import force_str
8 from django.utils.translation import gettext_lazy as _
9
10 from wagtail.admin.filters import DateRangePickerWidget, WagtailFilterSet
11 from wagtail.core.log_actions import page_log_action_registry
12 from wagtail.core.models import Page, PageLogEntry, UserPagePermissionsProxy
13
14 from .base import ReportView
15
16
17 class SiteHistoryReportFilterSet(WagtailFilterSet):
18 action = django_filters.ChoiceFilter(choices=page_log_action_registry.get_choices)
19 hide_commenting_actions = django_filters.BooleanFilter(
20 label=_('Hide commenting actions'),
21 method='filter_hide_commenting_actions',
22 widget=forms.CheckboxInput,
23 )
24 timestamp = django_filters.DateFromToRangeFilter(label=_('Date'), widget=DateRangePickerWidget)
25 label = django_filters.CharFilter(label=_('Title'), lookup_expr='icontains')
26 user = django_filters.ModelChoiceFilter(
27 field_name='user', queryset=lambda request: PageLogEntry.objects.all().get_users()
28 )
29
30 def filter_hide_commenting_actions(self, queryset, name, value):
31 if value:
32 queryset = queryset.exclude(
33 action__startswith='wagtail.comments'
34 )
35 return queryset
36
37 class Meta:
38 model = PageLogEntry
39 fields = ['label', 'action', 'user', 'timestamp', 'hide_commenting_actions']
40
41
42 class LogEntriesView(ReportView):
43 template_name = 'wagtailadmin/reports/site_history.html'
44 title = _('Site history')
45 header_icon = 'history'
46 filterset_class = SiteHistoryReportFilterSet
47
48 export_headings = {
49 "object_id": _("ID"),
50 "title": _("Title"),
51 "object_verbose_name": _("Type"),
52 "action": _("Action type"),
53 "timestamp": _("Date/Time")
54 }
55 list_export = [
56 "object_id",
57 "label",
58 "object_verbose_name",
59 "action",
60 "timestamp"
61 ]
62
63 def __init__(self, **kwargs):
64 super().__init__(**kwargs)
65
66 self.custom_field_preprocess['action'] = {
67 self.FORMAT_CSV: self.get_action_label,
68 self.FORMAT_XLSX: self.get_action_label
69 }
70
71 def get_filename(self):
72 return "audit-log-{}".format(
73 datetime.datetime.today().strftime("%Y-%m-%d")
74 )
75
76 def get_queryset(self):
77 q = Q(
78 page__in=UserPagePermissionsProxy(self.request.user).explorable_pages().values_list('pk', flat=True)
79 )
80
81 root_page_permissions = Page.get_first_root_node().permissions_for_user(self.request.user)
82 if (
83 self.request.user.is_superuser
84 or root_page_permissions.can_add_subpage() or root_page_permissions.can_edit()
85 ):
86 # Include deleted entries
87 q = q | Q(page_id__in=Subquery(
88 PageLogEntry.objects.filter(deleted=True).values('page_id')
89 ))
90
91 return PageLogEntry.objects.filter(q)
92
93 def get_action_label(self, action):
94 from wagtail.core.log_actions import page_log_action_registry
95 return force_str(page_log_action_registry.get_action_label(action))
96
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/wagtail/admin/views/reports/audit_logging.py b/wagtail/admin/views/reports/audit_logging.py
--- a/wagtail/admin/views/reports/audit_logging.py
+++ b/wagtail/admin/views/reports/audit_logging.py
@@ -74,9 +74,7 @@
)
def get_queryset(self):
- q = Q(
- page__in=UserPagePermissionsProxy(self.request.user).explorable_pages().values_list('pk', flat=True)
- )
+ q = Q(page__in=UserPagePermissionsProxy(self.request.user).explorable_pages())
root_page_permissions = Page.get_first_root_node().permissions_for_user(self.request.user)
if (
@@ -88,7 +86,11 @@
PageLogEntry.objects.filter(deleted=True).values('page_id')
))
- return PageLogEntry.objects.filter(q)
+ # Using prefech_related() on page, as select_related() generates an INNER JOIN,
+ # which filters out entries for deleted pages
+ return PageLogEntry.objects.filter(q).select_related(
+ 'user', 'user__wagtail_userprofile'
+ ).prefetch_related('page')
def get_action_label(self, action):
from wagtail.core.log_actions import page_log_action_registry
| {"golden_diff": "diff --git a/wagtail/admin/views/reports/audit_logging.py b/wagtail/admin/views/reports/audit_logging.py\n--- a/wagtail/admin/views/reports/audit_logging.py\n+++ b/wagtail/admin/views/reports/audit_logging.py\n@@ -74,9 +74,7 @@\n )\n \n def get_queryset(self):\n- q = Q(\n- page__in=UserPagePermissionsProxy(self.request.user).explorable_pages().values_list('pk', flat=True)\n- )\n+ q = Q(page__in=UserPagePermissionsProxy(self.request.user).explorable_pages())\n \n root_page_permissions = Page.get_first_root_node().permissions_for_user(self.request.user)\n if (\n@@ -88,7 +86,11 @@\n PageLogEntry.objects.filter(deleted=True).values('page_id')\n ))\n \n- return PageLogEntry.objects.filter(q)\n+ # Using prefech_related() on page, as select_related() generates an INNER JOIN,\n+ # which filters out entries for deleted pages\n+ return PageLogEntry.objects.filter(q).select_related(\n+ 'user', 'user__wagtail_userprofile'\n+ ).prefetch_related('page')\n \n def get_action_label(self, action):\n from wagtail.core.log_actions import page_log_action_registry\n", "issue": "Site history report n+1s\nWhile checking Scout APM for a client project with lots of different authors and overall activity, I've noticed that the site history report has a few N+1s.\r\n\r\n\r\n\n", "before_files": [{"content": "import datetime\n\nimport django_filters\n\nfrom django import forms\nfrom django.db.models import Q, Subquery\nfrom django.utils.encoding import force_str\nfrom django.utils.translation import gettext_lazy as _\n\nfrom wagtail.admin.filters import DateRangePickerWidget, WagtailFilterSet\nfrom wagtail.core.log_actions import page_log_action_registry\nfrom wagtail.core.models import Page, PageLogEntry, UserPagePermissionsProxy\n\nfrom .base import ReportView\n\n\nclass SiteHistoryReportFilterSet(WagtailFilterSet):\n action = django_filters.ChoiceFilter(choices=page_log_action_registry.get_choices)\n hide_commenting_actions = django_filters.BooleanFilter(\n label=_('Hide commenting actions'),\n method='filter_hide_commenting_actions',\n widget=forms.CheckboxInput,\n )\n timestamp = django_filters.DateFromToRangeFilter(label=_('Date'), widget=DateRangePickerWidget)\n label = django_filters.CharFilter(label=_('Title'), lookup_expr='icontains')\n user = django_filters.ModelChoiceFilter(\n field_name='user', queryset=lambda request: PageLogEntry.objects.all().get_users()\n )\n\n def filter_hide_commenting_actions(self, queryset, name, value):\n if value:\n queryset = queryset.exclude(\n action__startswith='wagtail.comments'\n )\n return queryset\n\n class Meta:\n model = PageLogEntry\n fields = ['label', 'action', 'user', 'timestamp', 'hide_commenting_actions']\n\n\nclass LogEntriesView(ReportView):\n template_name = 'wagtailadmin/reports/site_history.html'\n title = _('Site history')\n header_icon = 'history'\n filterset_class = SiteHistoryReportFilterSet\n\n export_headings = {\n \"object_id\": _(\"ID\"),\n \"title\": _(\"Title\"),\n \"object_verbose_name\": _(\"Type\"),\n \"action\": _(\"Action type\"),\n \"timestamp\": _(\"Date/Time\")\n }\n list_export = [\n \"object_id\",\n \"label\",\n \"object_verbose_name\",\n \"action\",\n \"timestamp\"\n ]\n\n def __init__(self, **kwargs):\n super().__init__(**kwargs)\n\n self.custom_field_preprocess['action'] = {\n self.FORMAT_CSV: self.get_action_label,\n self.FORMAT_XLSX: self.get_action_label\n }\n\n def get_filename(self):\n return \"audit-log-{}\".format(\n datetime.datetime.today().strftime(\"%Y-%m-%d\")\n )\n\n def get_queryset(self):\n q = Q(\n 
page__in=UserPagePermissionsProxy(self.request.user).explorable_pages().values_list('pk', flat=True)\n )\n\n root_page_permissions = Page.get_first_root_node().permissions_for_user(self.request.user)\n if (\n self.request.user.is_superuser\n or root_page_permissions.can_add_subpage() or root_page_permissions.can_edit()\n ):\n # Include deleted entries\n q = q | Q(page_id__in=Subquery(\n PageLogEntry.objects.filter(deleted=True).values('page_id')\n ))\n\n return PageLogEntry.objects.filter(q)\n\n def get_action_label(self, action):\n from wagtail.core.log_actions import page_log_action_registry\n return force_str(page_log_action_registry.get_action_label(action))\n", "path": "wagtail/admin/views/reports/audit_logging.py"}], "after_files": [{"content": "import datetime\n\nimport django_filters\n\nfrom django import forms\nfrom django.db.models import Q, Subquery\nfrom django.utils.encoding import force_str\nfrom django.utils.translation import gettext_lazy as _\n\nfrom wagtail.admin.filters import DateRangePickerWidget, WagtailFilterSet\nfrom wagtail.core.log_actions import page_log_action_registry\nfrom wagtail.core.models import Page, PageLogEntry, UserPagePermissionsProxy\n\nfrom .base import ReportView\n\n\nclass SiteHistoryReportFilterSet(WagtailFilterSet):\n action = django_filters.ChoiceFilter(choices=page_log_action_registry.get_choices)\n hide_commenting_actions = django_filters.BooleanFilter(\n label=_('Hide commenting actions'),\n method='filter_hide_commenting_actions',\n widget=forms.CheckboxInput,\n )\n timestamp = django_filters.DateFromToRangeFilter(label=_('Date'), widget=DateRangePickerWidget)\n label = django_filters.CharFilter(label=_('Title'), lookup_expr='icontains')\n user = django_filters.ModelChoiceFilter(\n field_name='user', queryset=lambda request: PageLogEntry.objects.all().get_users()\n )\n\n def filter_hide_commenting_actions(self, queryset, name, value):\n if value:\n queryset = queryset.exclude(\n action__startswith='wagtail.comments'\n )\n return queryset\n\n class Meta:\n model = PageLogEntry\n fields = ['label', 'action', 'user', 'timestamp', 'hide_commenting_actions']\n\n\nclass LogEntriesView(ReportView):\n template_name = 'wagtailadmin/reports/site_history.html'\n title = _('Site history')\n header_icon = 'history'\n filterset_class = SiteHistoryReportFilterSet\n\n export_headings = {\n \"object_id\": _(\"ID\"),\n \"title\": _(\"Title\"),\n \"object_verbose_name\": _(\"Type\"),\n \"action\": _(\"Action type\"),\n \"timestamp\": _(\"Date/Time\")\n }\n list_export = [\n \"object_id\",\n \"label\",\n \"object_verbose_name\",\n \"action\",\n \"timestamp\"\n ]\n\n def __init__(self, **kwargs):\n super().__init__(**kwargs)\n\n self.custom_field_preprocess['action'] = {\n self.FORMAT_CSV: self.get_action_label,\n self.FORMAT_XLSX: self.get_action_label\n }\n\n def get_filename(self):\n return \"audit-log-{}\".format(\n datetime.datetime.today().strftime(\"%Y-%m-%d\")\n )\n\n def get_queryset(self):\n q = Q(page__in=UserPagePermissionsProxy(self.request.user).explorable_pages())\n\n root_page_permissions = Page.get_first_root_node().permissions_for_user(self.request.user)\n if (\n self.request.user.is_superuser\n or root_page_permissions.can_add_subpage() or root_page_permissions.can_edit()\n ):\n # Include deleted entries\n q = q | Q(page_id__in=Subquery(\n PageLogEntry.objects.filter(deleted=True).values('page_id')\n ))\n\n # Using prefech_related() on page, as select_related() generates an INNER JOIN,\n # which filters out entries for deleted pages\n return 
PageLogEntry.objects.filter(q).select_related(\n 'user', 'user__wagtail_userprofile'\n ).prefetch_related('page')\n\n def get_action_label(self, action):\n from wagtail.core.log_actions import page_log_action_registry\n return force_str(page_log_action_registry.get_action_label(action))\n", "path": "wagtail/admin/views/reports/audit_logging.py"}]} | 1,251 | 285 |
gh_patches_debug_25226 | rasdani/github-patches | git_diff | rotki__rotki-1397 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Install roboto mono
## Abstract
We should install the mono version of our main font in order to increase legibility and proper alignment of content in certain scenarios.
## Specification
https://www.npmjs.com/package/typeface-roboto-mono install, require it where we require roboto, use it in classes where a mono font is required to line things up properly in data tables / cards / etc.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `rotkehlchen/chain/ethereum/transactions.py`
Content:
```
1 import logging
2 from collections import defaultdict
3 from typing import Dict, List, Optional
4
5 from rotkehlchen.db.dbhandler import DBHandler
6 from rotkehlchen.db.ranges import DBQueryRanges
7 from rotkehlchen.errors import RemoteError
8 from rotkehlchen.externalapis.etherscan import Etherscan
9 from rotkehlchen.logging import RotkehlchenLogsAdapter
10 from rotkehlchen.typing import ChecksumEthAddress, EthereumTransaction, Timestamp
11 from rotkehlchen.user_messages import MessagesAggregator
12 from rotkehlchen.utils.interfaces import LockableQueryObject, protect_with_lock
13
14 logger = logging.getLogger(__name__)
15 log = RotkehlchenLogsAdapter(logger)
16
17 FREE_ETH_TX_LIMIT = 500
18
19
20 class EthTransactions(LockableQueryObject):
21
22 def __init__(
23 self,
24 database: DBHandler,
25 etherscan: Etherscan,
26 msg_aggregator: MessagesAggregator,
27 ) -> None:
28 super().__init__()
29 self.database = database
30 self.etherscan = etherscan
31 self.msg_aggregator = msg_aggregator
32 self.tx_per_address: Dict[ChecksumEthAddress, int] = defaultdict(int)
33
34 def _single_address_query_transactions(
35 self,
36 address: ChecksumEthAddress,
37 start_ts: Timestamp,
38 end_ts: Timestamp,
39 with_limit: bool,
40 ) -> List[EthereumTransaction]:
41 self.tx_per_address[address] = 0
42 transactions = self.database.get_ethereum_transactions(
43 from_ts=start_ts,
44 to_ts=end_ts,
45 address=address,
46 )
47 ranges = DBQueryRanges(self.database)
48 ranges_to_query = ranges.get_location_query_ranges(
49 location_string=f'ethtxs_{address}',
50 start_ts=start_ts,
51 end_ts=end_ts,
52 )
53 new_transactions = []
54 for query_start_ts, query_end_ts in ranges_to_query:
55 for internal in (False, True):
56 try:
57 new_transactions.extend(self.etherscan.get_transactions(
58 account=address,
59 internal=internal,
60 from_ts=query_start_ts,
61 to_ts=query_end_ts,
62 ))
63 except RemoteError as e:
64 self.msg_aggregator.add_error(
65 f'Got error "{str(e)}" while querying ethereum transactions '
66 f'from Etherscan. Transactions not added to the DB '
67 f'from_ts: {query_start_ts} '
68 f'to_ts: {query_end_ts} '
69 f'internal: {internal}',
70 )
71
72 # add new transactions to the DB
73 if new_transactions != []:
74 self.database.add_ethereum_transactions(new_transactions, from_etherscan=True)
75 # and also set the last queried timestamps for the address
76 ranges.update_used_query_range(
77 location_string=f'ethtxs_{address}',
78 start_ts=start_ts,
79 end_ts=end_ts,
80 ranges_to_query=ranges_to_query,
81 )
82 # finally append them to the already returned DB transactions
83 transactions.extend(new_transactions)
84
85 if with_limit:
86 transactions_queried_so_far = sum(x for _, x in self.tx_per_address.items())
87 remaining_num_tx = FREE_ETH_TX_LIMIT - transactions_queried_so_far
88 returning_tx_length = min(remaining_num_tx, len(transactions))
89 # Note down how many we got for this address
90 self.tx_per_address[address] = returning_tx_length
91 return transactions[:returning_tx_length]
92
93 return transactions
94
95 @protect_with_lock()
96 def query(
97 self,
98 address: Optional[ChecksumEthAddress],
99 from_ts: Timestamp,
100 to_ts: Timestamp,
101 with_limit: bool = False,
102 recent_first: bool = False,
103 ) -> List[EthereumTransaction]:
104 """Queries for all transactions (normal AND internal) of all ethereum accounts.
105 Returns a list of all transactions of all accounts sorted by time.
106
107 If `with_limit` is true then the api limit is applied
108
109 if `recent_first` is true then the transactions are returned with the most
110 recent first on the list
111
112 May raise:
113 - RemoteError if etherscan is used and there is a problem with reaching it or
114 with parsing the response.
115 """
116 transactions: List[EthereumTransaction] = []
117
118 if address is not None:
119 accounts = [address]
120 else:
121 accounts = self.database.get_blockchain_accounts().eth
122
123 for address in accounts:
124 transactions.extend(self._single_address_query_transactions(
125 address=address,
126 start_ts=from_ts,
127 end_ts=to_ts,
128 with_limit=with_limit,
129 ))
130
131 transactions.sort(key=lambda tx: tx.timestamp, reverse=recent_first)
132 return transactions
133
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/rotkehlchen/chain/ethereum/transactions.py b/rotkehlchen/chain/ethereum/transactions.py
--- a/rotkehlchen/chain/ethereum/transactions.py
+++ b/rotkehlchen/chain/ethereum/transactions.py
@@ -72,6 +72,15 @@
# add new transactions to the DB
if new_transactions != []:
self.database.add_ethereum_transactions(new_transactions, from_etherscan=True)
+ # And since at least for now the increasingly negative nonce for the internal
+ # transactions happens only in the DB writing, requery the entire batch from
+ # the DB to get the updated transactions
+ transactions = self.database.get_ethereum_transactions(
+ from_ts=start_ts,
+ to_ts=end_ts,
+ address=address,
+ )
+
# and also set the last queried timestamps for the address
ranges.update_used_query_range(
location_string=f'ethtxs_{address}',
@@ -79,8 +88,6 @@
end_ts=end_ts,
ranges_to_query=ranges_to_query,
)
- # finally append them to the already returned DB transactions
- transactions.extend(new_transactions)
if with_limit:
transactions_queried_so_far = sum(x for _, x in self.tx_per_address.items())
| {"golden_diff": "diff --git a/rotkehlchen/chain/ethereum/transactions.py b/rotkehlchen/chain/ethereum/transactions.py\n--- a/rotkehlchen/chain/ethereum/transactions.py\n+++ b/rotkehlchen/chain/ethereum/transactions.py\n@@ -72,6 +72,15 @@\n # add new transactions to the DB\n if new_transactions != []:\n self.database.add_ethereum_transactions(new_transactions, from_etherscan=True)\n+ # And since at least for now the increasingly negative nonce for the internal\n+ # transactions happens only in the DB writing, requery the entire batch from\n+ # the DB to get the updated transactions\n+ transactions = self.database.get_ethereum_transactions(\n+ from_ts=start_ts,\n+ to_ts=end_ts,\n+ address=address,\n+ )\n+\n # and also set the last queried timestamps for the address\n ranges.update_used_query_range(\n location_string=f'ethtxs_{address}',\n@@ -79,8 +88,6 @@\n end_ts=end_ts,\n ranges_to_query=ranges_to_query,\n )\n- # finally append them to the already returned DB transactions\n- transactions.extend(new_transactions)\n \n if with_limit:\n transactions_queried_so_far = sum(x for _, x in self.tx_per_address.items())\n", "issue": "Install roboto mono\n## Abstract\r\n\r\nWe should install the mono version of our main font in order to increase legibility and proper alignment of content in certain scenarios.\r\n\r\n## Specification\r\n\r\nhttps://www.npmjs.com/package/typeface-roboto-mono install, require it where we require roboto, use it in classes where a mono font is required to line things up properly in data tables / cards / etc.\n", "before_files": [{"content": "import logging\nfrom collections import defaultdict\nfrom typing import Dict, List, Optional\n\nfrom rotkehlchen.db.dbhandler import DBHandler\nfrom rotkehlchen.db.ranges import DBQueryRanges\nfrom rotkehlchen.errors import RemoteError\nfrom rotkehlchen.externalapis.etherscan import Etherscan\nfrom rotkehlchen.logging import RotkehlchenLogsAdapter\nfrom rotkehlchen.typing import ChecksumEthAddress, EthereumTransaction, Timestamp\nfrom rotkehlchen.user_messages import MessagesAggregator\nfrom rotkehlchen.utils.interfaces import LockableQueryObject, protect_with_lock\n\nlogger = logging.getLogger(__name__)\nlog = RotkehlchenLogsAdapter(logger)\n\nFREE_ETH_TX_LIMIT = 500\n\n\nclass EthTransactions(LockableQueryObject):\n\n def __init__(\n self,\n database: DBHandler,\n etherscan: Etherscan,\n msg_aggregator: MessagesAggregator,\n ) -> None:\n super().__init__()\n self.database = database\n self.etherscan = etherscan\n self.msg_aggregator = msg_aggregator\n self.tx_per_address: Dict[ChecksumEthAddress, int] = defaultdict(int)\n\n def _single_address_query_transactions(\n self,\n address: ChecksumEthAddress,\n start_ts: Timestamp,\n end_ts: Timestamp,\n with_limit: bool,\n ) -> List[EthereumTransaction]:\n self.tx_per_address[address] = 0\n transactions = self.database.get_ethereum_transactions(\n from_ts=start_ts,\n to_ts=end_ts,\n address=address,\n )\n ranges = DBQueryRanges(self.database)\n ranges_to_query = ranges.get_location_query_ranges(\n location_string=f'ethtxs_{address}',\n start_ts=start_ts,\n end_ts=end_ts,\n )\n new_transactions = []\n for query_start_ts, query_end_ts in ranges_to_query:\n for internal in (False, True):\n try:\n new_transactions.extend(self.etherscan.get_transactions(\n account=address,\n internal=internal,\n from_ts=query_start_ts,\n to_ts=query_end_ts,\n ))\n except RemoteError as e:\n self.msg_aggregator.add_error(\n f'Got error \"{str(e)}\" while querying ethereum transactions '\n f'from Etherscan. 
Transactions not added to the DB '\n f'from_ts: {query_start_ts} '\n f'to_ts: {query_end_ts} '\n f'internal: {internal}',\n )\n\n # add new transactions to the DB\n if new_transactions != []:\n self.database.add_ethereum_transactions(new_transactions, from_etherscan=True)\n # and also set the last queried timestamps for the address\n ranges.update_used_query_range(\n location_string=f'ethtxs_{address}',\n start_ts=start_ts,\n end_ts=end_ts,\n ranges_to_query=ranges_to_query,\n )\n # finally append them to the already returned DB transactions\n transactions.extend(new_transactions)\n\n if with_limit:\n transactions_queried_so_far = sum(x for _, x in self.tx_per_address.items())\n remaining_num_tx = FREE_ETH_TX_LIMIT - transactions_queried_so_far\n returning_tx_length = min(remaining_num_tx, len(transactions))\n # Note down how many we got for this address\n self.tx_per_address[address] = returning_tx_length\n return transactions[:returning_tx_length]\n\n return transactions\n\n @protect_with_lock()\n def query(\n self,\n address: Optional[ChecksumEthAddress],\n from_ts: Timestamp,\n to_ts: Timestamp,\n with_limit: bool = False,\n recent_first: bool = False,\n ) -> List[EthereumTransaction]:\n \"\"\"Queries for all transactions (normal AND internal) of all ethereum accounts.\n Returns a list of all transactions of all accounts sorted by time.\n\n If `with_limit` is true then the api limit is applied\n\n if `recent_first` is true then the transactions are returned with the most\n recent first on the list\n\n May raise:\n - RemoteError if etherscan is used and there is a problem with reaching it or\n with parsing the response.\n \"\"\"\n transactions: List[EthereumTransaction] = []\n\n if address is not None:\n accounts = [address]\n else:\n accounts = self.database.get_blockchain_accounts().eth\n\n for address in accounts:\n transactions.extend(self._single_address_query_transactions(\n address=address,\n start_ts=from_ts,\n end_ts=to_ts,\n with_limit=with_limit,\n ))\n\n transactions.sort(key=lambda tx: tx.timestamp, reverse=recent_first)\n return transactions\n", "path": "rotkehlchen/chain/ethereum/transactions.py"}], "after_files": [{"content": "import logging\nfrom collections import defaultdict\nfrom typing import Dict, List, Optional\n\nfrom rotkehlchen.db.dbhandler import DBHandler\nfrom rotkehlchen.db.ranges import DBQueryRanges\nfrom rotkehlchen.errors import RemoteError\nfrom rotkehlchen.externalapis.etherscan import Etherscan\nfrom rotkehlchen.logging import RotkehlchenLogsAdapter\nfrom rotkehlchen.typing import ChecksumEthAddress, EthereumTransaction, Timestamp\nfrom rotkehlchen.user_messages import MessagesAggregator\nfrom rotkehlchen.utils.interfaces import LockableQueryObject, protect_with_lock\n\nlogger = logging.getLogger(__name__)\nlog = RotkehlchenLogsAdapter(logger)\n\nFREE_ETH_TX_LIMIT = 500\n\n\nclass EthTransactions(LockableQueryObject):\n\n def __init__(\n self,\n database: DBHandler,\n etherscan: Etherscan,\n msg_aggregator: MessagesAggregator,\n ) -> None:\n super().__init__()\n self.database = database\n self.etherscan = etherscan\n self.msg_aggregator = msg_aggregator\n self.tx_per_address: Dict[ChecksumEthAddress, int] = defaultdict(int)\n\n def _single_address_query_transactions(\n self,\n address: ChecksumEthAddress,\n start_ts: Timestamp,\n end_ts: Timestamp,\n with_limit: bool,\n ) -> List[EthereumTransaction]:\n self.tx_per_address[address] = 0\n transactions = self.database.get_ethereum_transactions(\n from_ts=start_ts,\n to_ts=end_ts,\n address=address,\n 
)\n ranges = DBQueryRanges(self.database)\n ranges_to_query = ranges.get_location_query_ranges(\n location_string=f'ethtxs_{address}',\n start_ts=start_ts,\n end_ts=end_ts,\n )\n new_transactions = []\n for query_start_ts, query_end_ts in ranges_to_query:\n for internal in (False, True):\n try:\n new_transactions.extend(self.etherscan.get_transactions(\n account=address,\n internal=internal,\n from_ts=query_start_ts,\n to_ts=query_end_ts,\n ))\n except RemoteError as e:\n self.msg_aggregator.add_error(\n f'Got error \"{str(e)}\" while querying ethereum transactions '\n f'from Etherscan. Transactions not added to the DB '\n f'from_ts: {query_start_ts} '\n f'to_ts: {query_end_ts} '\n f'internal: {internal}',\n )\n\n # add new transactions to the DB\n if new_transactions != []:\n self.database.add_ethereum_transactions(new_transactions, from_etherscan=True)\n # And since at least for now the increasingly negative nonce for the internal\n # transactions happens only in the DB writing, requery the entire batch from\n # the DB to get the updated transactions\n transactions = self.database.get_ethereum_transactions(\n from_ts=start_ts,\n to_ts=end_ts,\n address=address,\n )\n\n # and also set the last queried timestamps for the address\n ranges.update_used_query_range(\n location_string=f'ethtxs_{address}',\n start_ts=start_ts,\n end_ts=end_ts,\n ranges_to_query=ranges_to_query,\n )\n\n if with_limit:\n transactions_queried_so_far = sum(x for _, x in self.tx_per_address.items())\n remaining_num_tx = FREE_ETH_TX_LIMIT - transactions_queried_so_far\n returning_tx_length = min(remaining_num_tx, len(transactions))\n # Note down how many we got for this address\n self.tx_per_address[address] = returning_tx_length\n return transactions[:returning_tx_length]\n\n return transactions\n\n @protect_with_lock()\n def query(\n self,\n address: Optional[ChecksumEthAddress],\n from_ts: Timestamp,\n to_ts: Timestamp,\n with_limit: bool = False,\n recent_first: bool = False,\n ) -> List[EthereumTransaction]:\n \"\"\"Queries for all transactions (normal AND internal) of all ethereum accounts.\n Returns a list of all transactions of all accounts sorted by time.\n\n If `with_limit` is true then the api limit is applied\n\n if `recent_first` is true then the transactions are returned with the most\n recent first on the list\n\n May raise:\n - RemoteError if etherscan is used and there is a problem with reaching it or\n with parsing the response.\n \"\"\"\n transactions: List[EthereumTransaction] = []\n\n if address is not None:\n accounts = [address]\n else:\n accounts = self.database.get_blockchain_accounts().eth\n\n for address in accounts:\n transactions.extend(self._single_address_query_transactions(\n address=address,\n start_ts=from_ts,\n end_ts=to_ts,\n with_limit=with_limit,\n ))\n\n transactions.sort(key=lambda tx: tx.timestamp, reverse=recent_first)\n return transactions\n", "path": "rotkehlchen/chain/ethereum/transactions.py"}]} | 1,643 | 291 |
gh_patches_debug_17132 | rasdani/github-patches | git_diff | scoutapp__scout_apm_python-299 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Don't attempt to start on Windows (until supported)
As of right now, we error out when attempting to start on a Microsoft Windows system.
We should detect that the OS is not supported, and not start at all (and log that fact).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/scout_apm/core/__init__.py`
Content:
```
1 # coding=utf-8
2 from __future__ import absolute_import, division, print_function, unicode_literals
3
4 import logging
5 from os import getpid
6
7 from scout_apm.core.config import ScoutConfig
8 from scout_apm.core.context import AgentContext
9 from scout_apm.core.core_agent_manager import CoreAgentManager
10 from scout_apm.core.instrument_manager import InstrumentManager
11 from scout_apm.core.metadata import AppMetadata
12
13 try:
14 from scout_apm.core import objtrace
15 except ImportError:
16 objtrace = None
17
18 logger = logging.getLogger(__name__)
19
20
21 def install(*args, **kwargs):
22 if "config" in kwargs:
23 ScoutConfig().set(**kwargs["config"])
24 context = AgentContext.build(config=ScoutConfig())
25
26 if not context.config.value("monitor"):
27 logger.info(
28 "APM Not Launching on PID: %s - Configuration 'monitor' is not true",
29 getpid(),
30 )
31 return False
32
33 InstrumentManager().install_all()
34
35 if objtrace is not None:
36 objtrace.enable()
37
38 logger.debug("APM Launching on PID: %s", getpid())
39 launched = CoreAgentManager().launch()
40
41 AppMetadata.report()
42 if launched:
43 AgentContext.socket().stop()
44
45 return True
46
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/scout_apm/core/__init__.py b/src/scout_apm/core/__init__.py
--- a/src/scout_apm/core/__init__.py
+++ b/src/scout_apm/core/__init__.py
@@ -2,6 +2,7 @@
from __future__ import absolute_import, division, print_function, unicode_literals
import logging
+import sys
from os import getpid
from scout_apm.core.config import ScoutConfig
@@ -23,6 +24,10 @@
ScoutConfig().set(**kwargs["config"])
context = AgentContext.build(config=ScoutConfig())
+ if sys.platform == "win32":
+ logger.info("APM Not Launching on PID: %s - Windows is not supported", getpid())
+ return False
+
if not context.config.value("monitor"):
logger.info(
"APM Not Launching on PID: %s - Configuration 'monitor' is not true",
| {"golden_diff": "diff --git a/src/scout_apm/core/__init__.py b/src/scout_apm/core/__init__.py\n--- a/src/scout_apm/core/__init__.py\n+++ b/src/scout_apm/core/__init__.py\n@@ -2,6 +2,7 @@\n from __future__ import absolute_import, division, print_function, unicode_literals\n \n import logging\n+import sys\n from os import getpid\n \n from scout_apm.core.config import ScoutConfig\n@@ -23,6 +24,10 @@\n ScoutConfig().set(**kwargs[\"config\"])\n context = AgentContext.build(config=ScoutConfig())\n \n+ if sys.platform == \"win32\":\n+ logger.info(\"APM Not Launching on PID: %s - Windows is not supported\", getpid())\n+ return False\n+\n if not context.config.value(\"monitor\"):\n logger.info(\n \"APM Not Launching on PID: %s - Configuration 'monitor' is not true\",\n", "issue": "Don't attempt to start on Windows (until supported)\nAs of right now, we error out when attempting to start on a Microsoft Windows system.\r\n\r\nWe should detect that the OS is not supported, and not start at all (and log that fact).\r\n\n", "before_files": [{"content": "# coding=utf-8\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport logging\nfrom os import getpid\n\nfrom scout_apm.core.config import ScoutConfig\nfrom scout_apm.core.context import AgentContext\nfrom scout_apm.core.core_agent_manager import CoreAgentManager\nfrom scout_apm.core.instrument_manager import InstrumentManager\nfrom scout_apm.core.metadata import AppMetadata\n\ntry:\n from scout_apm.core import objtrace\nexcept ImportError:\n objtrace = None\n\nlogger = logging.getLogger(__name__)\n\n\ndef install(*args, **kwargs):\n if \"config\" in kwargs:\n ScoutConfig().set(**kwargs[\"config\"])\n context = AgentContext.build(config=ScoutConfig())\n\n if not context.config.value(\"monitor\"):\n logger.info(\n \"APM Not Launching on PID: %s - Configuration 'monitor' is not true\",\n getpid(),\n )\n return False\n\n InstrumentManager().install_all()\n\n if objtrace is not None:\n objtrace.enable()\n\n logger.debug(\"APM Launching on PID: %s\", getpid())\n launched = CoreAgentManager().launch()\n\n AppMetadata.report()\n if launched:\n AgentContext.socket().stop()\n\n return True\n", "path": "src/scout_apm/core/__init__.py"}], "after_files": [{"content": "# coding=utf-8\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport logging\nimport sys\nfrom os import getpid\n\nfrom scout_apm.core.config import ScoutConfig\nfrom scout_apm.core.context import AgentContext\nfrom scout_apm.core.core_agent_manager import CoreAgentManager\nfrom scout_apm.core.instrument_manager import InstrumentManager\nfrom scout_apm.core.metadata import AppMetadata\n\ntry:\n from scout_apm.core import objtrace\nexcept ImportError:\n objtrace = None\n\nlogger = logging.getLogger(__name__)\n\n\ndef install(*args, **kwargs):\n if \"config\" in kwargs:\n ScoutConfig().set(**kwargs[\"config\"])\n context = AgentContext.build(config=ScoutConfig())\n\n if sys.platform == \"win32\":\n logger.info(\"APM Not Launching on PID: %s - Windows is not supported\", getpid())\n return False\n\n if not context.config.value(\"monitor\"):\n logger.info(\n \"APM Not Launching on PID: %s - Configuration 'monitor' is not true\",\n getpid(),\n )\n return False\n\n InstrumentManager().install_all()\n\n if objtrace is not None:\n objtrace.enable()\n\n logger.debug(\"APM Launching on PID: %s\", getpid())\n launched = CoreAgentManager().launch()\n\n AppMetadata.report()\n if launched:\n AgentContext.socket().stop()\n\n return 
True\n", "path": "src/scout_apm/core/__init__.py"}]} | 669 | 210 |
gh_patches_debug_18254 | rasdani/github-patches | git_diff | streamlit__streamlit-2470 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Watchdog error on install (Mac)
I've tried to install it in a MAC using PIP and PIP3 and I get the same error:
Command "/usr/local/opt/python/bin/python3.7 -u -c "import setuptools, tokenize;__file__='/private/var/folders/0s/rkvsfhzn2930_3902mpks9lm0000gn/T/pip-install-3cscl6kq/watchdog/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /private/var/folders/0s/rkvsfhzn2930_3902mpks9lm0000gn/T/pip-record-qr49cvd0/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /private/var/folders/0s/rkvsfhzn2930_3902mpks9lm0000gn/T/pip-install-3cscl6kq/watchdog/
I'm using:
macOS Mojave 10.14.5
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lib/setup.py`
Content:
```
1 import os
2 import platform
3 import setuptools
4 import subprocess
5 import sys
6
7 from setuptools.command.install import install
8
9 try:
10 from pipenv.project import Project
11 from pipenv.utils import convert_deps_to_pip
12 except:
13 exit_msg = (
14 "pipenv is required to package Streamlit. Please install pipenv and try again"
15 )
16 sys.exit(exit_msg)
17
18 VERSION = "0.72.0" # PEP-440
19
20 NAME = "streamlit"
21
22 DESCRIPTION = "The fastest way to build data apps in Python"
23
24 LONG_DESCRIPTION = (
25 "Streamlit's open-source app framework is the easiest way "
26 "for data scientists and machine learning engineers to "
27 "create beautiful, performant apps in only a few hours! "
28 "All in pure Python. All for free."
29 )
30
31 pipfile = Project(chdir=False).parsed_pipfile
32
33 packages = pipfile["packages"].copy()
34 requirements = convert_deps_to_pip(packages, r=False)
35
36 # Check whether xcode tools are available before making watchdog a
37 # dependency (only if the current system is a Mac).
38 if platform.system() == "Darwin":
39 has_xcode = subprocess.call(["xcode-select", "--version"], shell=False) == 0
40 has_gcc = subprocess.call(["gcc", "--version"], shell=False) == 0
41
42 if not (has_xcode and has_gcc):
43 try:
44 requirements.remove("watchdog")
45 except ValueError:
46 pass
47
48
49 class VerifyVersionCommand(install):
50 """Custom command to verify that the git tag matches our version"""
51
52 description = "verify that the git tag matches our version"
53
54 def run(self):
55 tag = os.getenv("CIRCLE_TAG")
56
57 if tag != VERSION:
58 info = "Git tag: {0} does not match the version of this app: {1}".format(
59 tag, VERSION
60 )
61 sys.exit(info)
62
63
64 setuptools.setup(
65 name=NAME,
66 version=VERSION,
67 description=DESCRIPTION,
68 long_description=LONG_DESCRIPTION,
69 url="https://streamlit.io",
70 author="Streamlit Inc",
71 author_email="[email protected]",
72 python_requires=">=3.6",
73 license="Apache 2",
74 packages=setuptools.find_packages(exclude=["tests", "tests.*"]),
75 # Requirements
76 install_requires=requirements,
77 zip_safe=False, # install source files not egg
78 include_package_data=True, # copy html and friends
79 entry_points={"console_scripts": ["streamlit = streamlit.cli:main"]},
80 # For Windows so that streamlit * commands work ie.
81 # - streamlit version
82 # - streamlit hello
83 scripts=["bin/streamlit.cmd"],
84 cmdclass={
85 "verify": VerifyVersionCommand,
86 },
87 )
88
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lib/setup.py b/lib/setup.py
--- a/lib/setup.py
+++ b/lib/setup.py
@@ -1,7 +1,5 @@
import os
-import platform
import setuptools
-import subprocess
import sys
from setuptools.command.install import install
@@ -33,18 +31,6 @@
packages = pipfile["packages"].copy()
requirements = convert_deps_to_pip(packages, r=False)
-# Check whether xcode tools are available before making watchdog a
-# dependency (only if the current system is a Mac).
-if platform.system() == "Darwin":
- has_xcode = subprocess.call(["xcode-select", "--version"], shell=False) == 0
- has_gcc = subprocess.call(["gcc", "--version"], shell=False) == 0
-
- if not (has_xcode and has_gcc):
- try:
- requirements.remove("watchdog")
- except ValueError:
- pass
-
class VerifyVersionCommand(install):
"""Custom command to verify that the git tag matches our version"""
| {"golden_diff": "diff --git a/lib/setup.py b/lib/setup.py\n--- a/lib/setup.py\n+++ b/lib/setup.py\n@@ -1,7 +1,5 @@\n import os\n-import platform\n import setuptools\n-import subprocess\n import sys\n \n from setuptools.command.install import install\n@@ -33,18 +31,6 @@\n packages = pipfile[\"packages\"].copy()\n requirements = convert_deps_to_pip(packages, r=False)\n \n-# Check whether xcode tools are available before making watchdog a\n-# dependency (only if the current system is a Mac).\n-if platform.system() == \"Darwin\":\n- has_xcode = subprocess.call([\"xcode-select\", \"--version\"], shell=False) == 0\n- has_gcc = subprocess.call([\"gcc\", \"--version\"], shell=False) == 0\n-\n- if not (has_xcode and has_gcc):\n- try:\n- requirements.remove(\"watchdog\")\n- except ValueError:\n- pass\n-\n \n class VerifyVersionCommand(install):\n \"\"\"Custom command to verify that the git tag matches our version\"\"\"\n", "issue": "Watchdog error on install (Mac)\nI've tried to install it in a MAC using PIP and PIP3 and I get the same error:\r\n\r\nCommand \"/usr/local/opt/python/bin/python3.7 -u -c \"import setuptools, tokenize;__file__='/private/var/folders/0s/rkvsfhzn2930_3902mpks9lm0000gn/T/pip-install-3cscl6kq/watchdog/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\\r\\n', '\\n');f.close();exec(compile(code, __file__, 'exec'))\" install --record /private/var/folders/0s/rkvsfhzn2930_3902mpks9lm0000gn/T/pip-record-qr49cvd0/install-record.txt --single-version-externally-managed --compile\" failed with error code 1 in /private/var/folders/0s/rkvsfhzn2930_3902mpks9lm0000gn/T/pip-install-3cscl6kq/watchdog/\r\n\r\nI'm using: \r\nmacOS Mojave 10.14.5\r\n\n", "before_files": [{"content": "import os\nimport platform\nimport setuptools\nimport subprocess\nimport sys\n\nfrom setuptools.command.install import install\n\ntry:\n from pipenv.project import Project\n from pipenv.utils import convert_deps_to_pip\nexcept:\n exit_msg = (\n \"pipenv is required to package Streamlit. Please install pipenv and try again\"\n )\n sys.exit(exit_msg)\n\nVERSION = \"0.72.0\" # PEP-440\n\nNAME = \"streamlit\"\n\nDESCRIPTION = \"The fastest way to build data apps in Python\"\n\nLONG_DESCRIPTION = (\n \"Streamlit's open-source app framework is the easiest way \"\n \"for data scientists and machine learning engineers to \"\n \"create beautiful, performant apps in only a few hours! \"\n \"All in pure Python. 
All for free.\"\n)\n\npipfile = Project(chdir=False).parsed_pipfile\n\npackages = pipfile[\"packages\"].copy()\nrequirements = convert_deps_to_pip(packages, r=False)\n\n# Check whether xcode tools are available before making watchdog a\n# dependency (only if the current system is a Mac).\nif platform.system() == \"Darwin\":\n has_xcode = subprocess.call([\"xcode-select\", \"--version\"], shell=False) == 0\n has_gcc = subprocess.call([\"gcc\", \"--version\"], shell=False) == 0\n\n if not (has_xcode and has_gcc):\n try:\n requirements.remove(\"watchdog\")\n except ValueError:\n pass\n\n\nclass VerifyVersionCommand(install):\n \"\"\"Custom command to verify that the git tag matches our version\"\"\"\n\n description = \"verify that the git tag matches our version\"\n\n def run(self):\n tag = os.getenv(\"CIRCLE_TAG\")\n\n if tag != VERSION:\n info = \"Git tag: {0} does not match the version of this app: {1}\".format(\n tag, VERSION\n )\n sys.exit(info)\n\n\nsetuptools.setup(\n name=NAME,\n version=VERSION,\n description=DESCRIPTION,\n long_description=LONG_DESCRIPTION,\n url=\"https://streamlit.io\",\n author=\"Streamlit Inc\",\n author_email=\"[email protected]\",\n python_requires=\">=3.6\",\n license=\"Apache 2\",\n packages=setuptools.find_packages(exclude=[\"tests\", \"tests.*\"]),\n # Requirements\n install_requires=requirements,\n zip_safe=False, # install source files not egg\n include_package_data=True, # copy html and friends\n entry_points={\"console_scripts\": [\"streamlit = streamlit.cli:main\"]},\n # For Windows so that streamlit * commands work ie.\n # - streamlit version\n # - streamlit hello\n scripts=[\"bin/streamlit.cmd\"],\n cmdclass={\n \"verify\": VerifyVersionCommand,\n },\n)\n", "path": "lib/setup.py"}], "after_files": [{"content": "import os\nimport setuptools\nimport sys\n\nfrom setuptools.command.install import install\n\ntry:\n from pipenv.project import Project\n from pipenv.utils import convert_deps_to_pip\nexcept:\n exit_msg = (\n \"pipenv is required to package Streamlit. Please install pipenv and try again\"\n )\n sys.exit(exit_msg)\n\nVERSION = \"0.72.0\" # PEP-440\n\nNAME = \"streamlit\"\n\nDESCRIPTION = \"The fastest way to build data apps in Python\"\n\nLONG_DESCRIPTION = (\n \"Streamlit's open-source app framework is the easiest way \"\n \"for data scientists and machine learning engineers to \"\n \"create beautiful, performant apps in only a few hours! \"\n \"All in pure Python. 
All for free.\"\n)\n\npipfile = Project(chdir=False).parsed_pipfile\n\npackages = pipfile[\"packages\"].copy()\nrequirements = convert_deps_to_pip(packages, r=False)\n\n\nclass VerifyVersionCommand(install):\n \"\"\"Custom command to verify that the git tag matches our version\"\"\"\n\n description = \"verify that the git tag matches our version\"\n\n def run(self):\n tag = os.getenv(\"CIRCLE_TAG\")\n\n if tag != VERSION:\n info = \"Git tag: {0} does not match the version of this app: {1}\".format(\n tag, VERSION\n )\n sys.exit(info)\n\n\nsetuptools.setup(\n name=NAME,\n version=VERSION,\n description=DESCRIPTION,\n long_description=LONG_DESCRIPTION,\n url=\"https://streamlit.io\",\n author=\"Streamlit Inc\",\n author_email=\"[email protected]\",\n python_requires=\">=3.6\",\n license=\"Apache 2\",\n packages=setuptools.find_packages(exclude=[\"tests\", \"tests.*\"]),\n # Requirements\n install_requires=requirements,\n zip_safe=False, # install source files not egg\n include_package_data=True, # copy html and friends\n entry_points={\"console_scripts\": [\"streamlit = streamlit.cli:main\"]},\n # For Windows so that streamlit * commands work ie.\n # - streamlit version\n # - streamlit hello\n scripts=[\"bin/streamlit.cmd\"],\n cmdclass={\n \"verify\": VerifyVersionCommand,\n },\n)\n", "path": "lib/setup.py"}]} | 1,302 | 230 |
gh_patches_debug_25882 | rasdani/github-patches | git_diff | mdn__kuma-6927 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
T - Update robots.txt (remove sites)
**Summary**
There are several page types blocked in our robots.txt, which means that Google can't crawl those pages to see that we don't want them indexed.
**Steps To Reproduce (STR)**
1. Go to Search Console
2. Go to Coverage > Valid with Warnings > Indexed, though blocked by robots.txt
3. Alternatively: https://developer.mozilla.org/robots.txt
**Actual behavior**
Google has a link to https://developer.mozilla.org/en-US/docs/tag/SpeechRecognitionResult (for example), but that URL is blocked in robots.txt, so it can't follow the link to see that it redirects to https://wiki.developer.mozilla.org/en-US/docs/tag/SpeechRecognitionResult which has a `no-index` directive.
**Expected behavior**
All page types that redirect to the wiki should be removed from the robots.txt of the read-only site, since the wiki has a `no-index` directive on all content already.
**Additional context**
The reason to do this is so we can see actually problematic content show up in our search console reports, instead of this noise. Search console only shows up to 1000 pages as problematic, but there are currently more than 10k warnings, so we might be missing large issues.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kuma/landing/views.py`
Content:
```
1 from django.conf import settings
2 from django.http import HttpResponse
3 from django.shortcuts import redirect, render
4 from django.views import static
5 from django.views.decorators.cache import never_cache
6 from django.views.generic import RedirectView
7
8 from kuma.core.decorators import ensure_wiki_domain, shared_cache_control
9 from kuma.core.utils import is_wiki
10 from kuma.feeder.models import Bundle
11 from kuma.feeder.sections import SECTION_HACKS
12 from kuma.search.models import Filter
13
14 from .utils import favicon_url
15
16
17 @shared_cache_control
18 def contribute_json(request):
19 return static.serve(request, "contribute.json", document_root=settings.ROOT)
20
21
22 @shared_cache_control
23 def home(request):
24 """Home page."""
25 context = {}
26 # Need for both wiki and react homepage
27 context["updates"] = list(Bundle.objects.recent_entries(SECTION_HACKS.updates)[:5])
28
29 # The default template name
30 template_name = "landing/react_homepage.html"
31 if is_wiki(request):
32 template_name = "landing/homepage.html"
33 context["default_filters"] = Filter.objects.default_filters()
34 return render(request, template_name, context)
35
36
37 @ensure_wiki_domain
38 @never_cache
39 def maintenance_mode(request):
40 if settings.MAINTENANCE_MODE:
41 return render(request, "landing/maintenance-mode.html")
42 else:
43 return redirect("home")
44
45
46 @ensure_wiki_domain
47 @shared_cache_control
48 def promote_buttons(request):
49 """Bug 646192: MDN affiliate buttons"""
50 return render(request, "landing/promote_buttons.html")
51
52
53 ROBOTS_ALL_ALLOWED_TXT = """\
54 User-agent: *
55 Sitemap: https://wiki.developer.mozilla.org/sitemap.xml
56
57 Disallow:
58 """
59
60 ROBOTS_ALLOWED_TXT = """\
61 User-agent: *
62 Sitemap: https://developer.mozilla.org/sitemap.xml
63
64 Disallow: /admin/
65 Disallow: /api/
66 Disallow: /*/dashboards/*
67 Disallow: /*docs/feeds
68 Disallow: /*docs/templates
69 Disallow: /*docs*Template:
70 Disallow: /*docs/all
71 Disallow: /*docs/tag*
72 Disallow: /*docs/needs-review*
73 Disallow: /*docs/localization-tag*
74 Disallow: /*docs/with-errors
75 Disallow: /*docs/without-parent
76 Disallow: /*docs/top-level
77 Disallow: /*docs/new
78 Disallow: /*docs/get-documents
79 Disallow: /*docs/submit_akismet_spam
80 Disallow: /*docs/load*
81 Disallow: /*docs/Experiment:*
82 Disallow: /*$api
83 Disallow: /*$compare
84 Disallow: /*$revision
85 Disallow: /*$history
86 Disallow: /*$children
87 Disallow: /*$flag
88 Disallow: /*$locales
89 Disallow: /*$toc
90 Disallow: /*$move
91 Disallow: /*$quick-review
92 Disallow: /*$samples
93 Disallow: /*$revert
94 Disallow: /*$repair_breadcrumbs
95 Disallow: /*$delete
96 Disallow: /*$restore
97 Disallow: /*$purge
98 Disallow: /*$subscribe
99 Disallow: /*$subscribe_to_tree
100 Disallow: /*$vote
101 Disallow: /*docs.json
102 Disallow: /*docs/ckeditor_config.js
103 Disallow: /*/files/
104 Disallow: /media
105 Disallow: /*move-requested
106 Disallow: /*preview-wiki-content
107 Disallow: /*profiles*/edit
108 Disallow: /skins
109 Disallow: /*type=feed
110 Disallow: /*users/
111 """ + "\n".join(
112 "Disallow: /{locale}/search".format(locale=locale)
113 for locale in settings.ENABLED_LOCALES
114 )
115
116 ROBOTS_GO_AWAY_TXT = """\
117 User-Agent: *
118 Disallow: /
119 """
120
121
122 @shared_cache_control
123 def robots_txt(request):
124 """Serve robots.txt that allows or forbids robots."""
125 host = request.get_host()
126 if host in settings.ALLOW_ROBOTS_DOMAINS:
127 robots = ""
128 elif host in settings.ALLOW_ROBOTS_WEB_DOMAINS:
129 if host == settings.WIKI_HOST:
130 robots = ROBOTS_ALL_ALLOWED_TXT
131 else:
132 robots = ROBOTS_ALLOWED_TXT
133 else:
134 robots = ROBOTS_GO_AWAY_TXT
135 return HttpResponse(robots, content_type="text/plain")
136
137
138 class FaviconRedirect(RedirectView):
139 """Redirect to the favicon in the static img folder (bug 1402497)"""
140
141 def get_redirect_url(self, *args, **kwargs):
142 return favicon_url()
143
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/kuma/landing/views.py b/kuma/landing/views.py
--- a/kuma/landing/views.py
+++ b/kuma/landing/views.py
@@ -61,52 +61,14 @@
User-agent: *
Sitemap: https://developer.mozilla.org/sitemap.xml
-Disallow: /admin/
Disallow: /api/
-Disallow: /*/dashboards/*
-Disallow: /*docs/feeds
-Disallow: /*docs/templates
-Disallow: /*docs*Template:
-Disallow: /*docs/all
-Disallow: /*docs/tag*
-Disallow: /*docs/needs-review*
-Disallow: /*docs/localization-tag*
-Disallow: /*docs/with-errors
-Disallow: /*docs/without-parent
-Disallow: /*docs/top-level
-Disallow: /*docs/new
Disallow: /*docs/get-documents
-Disallow: /*docs/submit_akismet_spam
-Disallow: /*docs/load*
Disallow: /*docs/Experiment:*
-Disallow: /*$api
-Disallow: /*$compare
-Disallow: /*$revision
-Disallow: /*$history
Disallow: /*$children
-Disallow: /*$flag
-Disallow: /*$locales
-Disallow: /*$toc
-Disallow: /*$move
-Disallow: /*$quick-review
-Disallow: /*$samples
-Disallow: /*$revert
-Disallow: /*$repair_breadcrumbs
-Disallow: /*$delete
-Disallow: /*$restore
-Disallow: /*$purge
-Disallow: /*$subscribe
-Disallow: /*$subscribe_to_tree
-Disallow: /*$vote
Disallow: /*docs.json
-Disallow: /*docs/ckeditor_config.js
Disallow: /*/files/
Disallow: /media
-Disallow: /*move-requested
-Disallow: /*preview-wiki-content
Disallow: /*profiles*/edit
-Disallow: /skins
-Disallow: /*type=feed
Disallow: /*users/
""" + "\n".join(
"Disallow: /{locale}/search".format(locale=locale)
| {"golden_diff": "diff --git a/kuma/landing/views.py b/kuma/landing/views.py\n--- a/kuma/landing/views.py\n+++ b/kuma/landing/views.py\n@@ -61,52 +61,14 @@\n User-agent: *\n Sitemap: https://developer.mozilla.org/sitemap.xml\n \n-Disallow: /admin/\n Disallow: /api/\n-Disallow: /*/dashboards/*\n-Disallow: /*docs/feeds\n-Disallow: /*docs/templates\n-Disallow: /*docs*Template:\n-Disallow: /*docs/all\n-Disallow: /*docs/tag*\n-Disallow: /*docs/needs-review*\n-Disallow: /*docs/localization-tag*\n-Disallow: /*docs/with-errors\n-Disallow: /*docs/without-parent\n-Disallow: /*docs/top-level\n-Disallow: /*docs/new\n Disallow: /*docs/get-documents\n-Disallow: /*docs/submit_akismet_spam\n-Disallow: /*docs/load*\n Disallow: /*docs/Experiment:*\n-Disallow: /*$api\n-Disallow: /*$compare\n-Disallow: /*$revision\n-Disallow: /*$history\n Disallow: /*$children\n-Disallow: /*$flag\n-Disallow: /*$locales\n-Disallow: /*$toc\n-Disallow: /*$move\n-Disallow: /*$quick-review\n-Disallow: /*$samples\n-Disallow: /*$revert\n-Disallow: /*$repair_breadcrumbs\n-Disallow: /*$delete\n-Disallow: /*$restore\n-Disallow: /*$purge\n-Disallow: /*$subscribe\n-Disallow: /*$subscribe_to_tree\n-Disallow: /*$vote\n Disallow: /*docs.json\n-Disallow: /*docs/ckeditor_config.js\n Disallow: /*/files/\n Disallow: /media\n-Disallow: /*move-requested\n-Disallow: /*preview-wiki-content\n Disallow: /*profiles*/edit\n-Disallow: /skins\n-Disallow: /*type=feed\n Disallow: /*users/\n \"\"\" + \"\\n\".join(\n \"Disallow: /{locale}/search\".format(locale=locale)\n", "issue": "T - Update robots.txt (remove sites)\n**Summary**\r\nThere are several page types that we have in our robots.txt, which means that Google can't crawl it to see that we don't want those pages to be indexed.\r\n\r\n\r\n**Steps To Reproduce (STR)**\r\n1. Go to Search Console\r\n2. go to Coverage > Valid with Warnings > Indexed, though blocked by robots.txt\r\n3. Alternatively: https://developer.mozilla.org/robots.txt\r\n\r\n\r\n**Actual behavior**\r\nGoogle has a link to https://developer.mozilla.org/en-US/docs/tag/SpeechRecognitionResult (for example), but that URL is blocked in robots.txt, so it can't follow the link to see that it redirects to https://wiki.developer.mozilla.org/en-US/docs/tag/SpeechRecognitionResult which has a `no-index` directive.\r\n\r\n\r\n**Expected behavior**\r\nAll page types that redirect to the wiki should be removed from the robots.txt of the read-only site, since the wiki has a `no-index` directive on all content already.\r\n\r\n\r\n**Additional context**\r\nThe reason to do this is so we can see actually problematic content show up in our search console reports, instead of this noise. 
Search console only shows up to 1000 pages as problematic, but there are currently more than 10k warnings, so we might be missing large issues.\r\n\n", "before_files": [{"content": "from django.conf import settings\nfrom django.http import HttpResponse\nfrom django.shortcuts import redirect, render\nfrom django.views import static\nfrom django.views.decorators.cache import never_cache\nfrom django.views.generic import RedirectView\n\nfrom kuma.core.decorators import ensure_wiki_domain, shared_cache_control\nfrom kuma.core.utils import is_wiki\nfrom kuma.feeder.models import Bundle\nfrom kuma.feeder.sections import SECTION_HACKS\nfrom kuma.search.models import Filter\n\nfrom .utils import favicon_url\n\n\n@shared_cache_control\ndef contribute_json(request):\n return static.serve(request, \"contribute.json\", document_root=settings.ROOT)\n\n\n@shared_cache_control\ndef home(request):\n \"\"\"Home page.\"\"\"\n context = {}\n # Need for both wiki and react homepage\n context[\"updates\"] = list(Bundle.objects.recent_entries(SECTION_HACKS.updates)[:5])\n\n # The default template name\n template_name = \"landing/react_homepage.html\"\n if is_wiki(request):\n template_name = \"landing/homepage.html\"\n context[\"default_filters\"] = Filter.objects.default_filters()\n return render(request, template_name, context)\n\n\n@ensure_wiki_domain\n@never_cache\ndef maintenance_mode(request):\n if settings.MAINTENANCE_MODE:\n return render(request, \"landing/maintenance-mode.html\")\n else:\n return redirect(\"home\")\n\n\n@ensure_wiki_domain\n@shared_cache_control\ndef promote_buttons(request):\n \"\"\"Bug 646192: MDN affiliate buttons\"\"\"\n return render(request, \"landing/promote_buttons.html\")\n\n\nROBOTS_ALL_ALLOWED_TXT = \"\"\"\\\nUser-agent: *\nSitemap: https://wiki.developer.mozilla.org/sitemap.xml\n\nDisallow:\n\"\"\"\n\nROBOTS_ALLOWED_TXT = \"\"\"\\\nUser-agent: *\nSitemap: https://developer.mozilla.org/sitemap.xml\n\nDisallow: /admin/\nDisallow: /api/\nDisallow: /*/dashboards/*\nDisallow: /*docs/feeds\nDisallow: /*docs/templates\nDisallow: /*docs*Template:\nDisallow: /*docs/all\nDisallow: /*docs/tag*\nDisallow: /*docs/needs-review*\nDisallow: /*docs/localization-tag*\nDisallow: /*docs/with-errors\nDisallow: /*docs/without-parent\nDisallow: /*docs/top-level\nDisallow: /*docs/new\nDisallow: /*docs/get-documents\nDisallow: /*docs/submit_akismet_spam\nDisallow: /*docs/load*\nDisallow: /*docs/Experiment:*\nDisallow: /*$api\nDisallow: /*$compare\nDisallow: /*$revision\nDisallow: /*$history\nDisallow: /*$children\nDisallow: /*$flag\nDisallow: /*$locales\nDisallow: /*$toc\nDisallow: /*$move\nDisallow: /*$quick-review\nDisallow: /*$samples\nDisallow: /*$revert\nDisallow: /*$repair_breadcrumbs\nDisallow: /*$delete\nDisallow: /*$restore\nDisallow: /*$purge\nDisallow: /*$subscribe\nDisallow: /*$subscribe_to_tree\nDisallow: /*$vote\nDisallow: /*docs.json\nDisallow: /*docs/ckeditor_config.js\nDisallow: /*/files/\nDisallow: /media\nDisallow: /*move-requested\nDisallow: /*preview-wiki-content\nDisallow: /*profiles*/edit\nDisallow: /skins\nDisallow: /*type=feed\nDisallow: /*users/\n\"\"\" + \"\\n\".join(\n \"Disallow: /{locale}/search\".format(locale=locale)\n for locale in settings.ENABLED_LOCALES\n)\n\nROBOTS_GO_AWAY_TXT = \"\"\"\\\nUser-Agent: *\nDisallow: /\n\"\"\"\n\n\n@shared_cache_control\ndef robots_txt(request):\n \"\"\"Serve robots.txt that allows or forbids robots.\"\"\"\n host = request.get_host()\n if host in settings.ALLOW_ROBOTS_DOMAINS:\n robots = \"\"\n elif host in 
settings.ALLOW_ROBOTS_WEB_DOMAINS:\n if host == settings.WIKI_HOST:\n robots = ROBOTS_ALL_ALLOWED_TXT\n else:\n robots = ROBOTS_ALLOWED_TXT\n else:\n robots = ROBOTS_GO_AWAY_TXT\n return HttpResponse(robots, content_type=\"text/plain\")\n\n\nclass FaviconRedirect(RedirectView):\n \"\"\"Redirect to the favicon in the static img folder (bug 1402497)\"\"\"\n\n def get_redirect_url(self, *args, **kwargs):\n return favicon_url()\n", "path": "kuma/landing/views.py"}], "after_files": [{"content": "from django.conf import settings\nfrom django.http import HttpResponse\nfrom django.shortcuts import redirect, render\nfrom django.views import static\nfrom django.views.decorators.cache import never_cache\nfrom django.views.generic import RedirectView\n\nfrom kuma.core.decorators import ensure_wiki_domain, shared_cache_control\nfrom kuma.core.utils import is_wiki\nfrom kuma.feeder.models import Bundle\nfrom kuma.feeder.sections import SECTION_HACKS\nfrom kuma.search.models import Filter\n\nfrom .utils import favicon_url\n\n\n@shared_cache_control\ndef contribute_json(request):\n return static.serve(request, \"contribute.json\", document_root=settings.ROOT)\n\n\n@shared_cache_control\ndef home(request):\n \"\"\"Home page.\"\"\"\n context = {}\n # Need for both wiki and react homepage\n context[\"updates\"] = list(Bundle.objects.recent_entries(SECTION_HACKS.updates)[:5])\n\n # The default template name\n template_name = \"landing/react_homepage.html\"\n if is_wiki(request):\n template_name = \"landing/homepage.html\"\n context[\"default_filters\"] = Filter.objects.default_filters()\n return render(request, template_name, context)\n\n\n@ensure_wiki_domain\n@never_cache\ndef maintenance_mode(request):\n if settings.MAINTENANCE_MODE:\n return render(request, \"landing/maintenance-mode.html\")\n else:\n return redirect(\"home\")\n\n\n@ensure_wiki_domain\n@shared_cache_control\ndef promote_buttons(request):\n \"\"\"Bug 646192: MDN affiliate buttons\"\"\"\n return render(request, \"landing/promote_buttons.html\")\n\n\nROBOTS_ALL_ALLOWED_TXT = \"\"\"\\\nUser-agent: *\nSitemap: https://wiki.developer.mozilla.org/sitemap.xml\n\nDisallow:\n\"\"\"\n\nROBOTS_ALLOWED_TXT = \"\"\"\\\nUser-agent: *\nSitemap: https://developer.mozilla.org/sitemap.xml\n\nDisallow: /api/\nDisallow: /*docs/get-documents\nDisallow: /*docs/Experiment:*\nDisallow: /*$children\nDisallow: /*docs.json\nDisallow: /*/files/\nDisallow: /media\nDisallow: /*profiles*/edit\nDisallow: /*users/\n\"\"\" + \"\\n\".join(\n \"Disallow: /{locale}/search\".format(locale=locale)\n for locale in settings.ENABLED_LOCALES\n)\n\nROBOTS_GO_AWAY_TXT = \"\"\"\\\nUser-Agent: *\nDisallow: /\n\"\"\"\n\n\n@shared_cache_control\ndef robots_txt(request):\n \"\"\"Serve robots.txt that allows or forbids robots.\"\"\"\n host = request.get_host()\n if host in settings.ALLOW_ROBOTS_DOMAINS:\n robots = \"\"\n elif host in settings.ALLOW_ROBOTS_WEB_DOMAINS:\n if host == settings.WIKI_HOST:\n robots = ROBOTS_ALL_ALLOWED_TXT\n else:\n robots = ROBOTS_ALLOWED_TXT\n else:\n robots = ROBOTS_GO_AWAY_TXT\n return HttpResponse(robots, content_type=\"text/plain\")\n\n\nclass FaviconRedirect(RedirectView):\n \"\"\"Redirect to the favicon in the static img folder (bug 1402497)\"\"\"\n\n def get_redirect_url(self, *args, **kwargs):\n return favicon_url()\n", "path": "kuma/landing/views.py"}]} | 1,845 | 448 |
gh_patches_debug_27712 | rasdani/github-patches | git_diff | pyca__cryptography-10277 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
cryptography_vectors is missing from `test` extra
`cryptography_vectors` is missing from the `test` extra in `pyproject.toml`, so when tests are run from an sdist with the `test` extra installed, testing fails and `cryptography_vectors` needs to be installed manually to make the tests pass.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `release.py`
Content:
```
1 # This file is dual licensed under the terms of the Apache License, Version
2 # 2.0, and the BSD License. See the LICENSE file in the root of this repository
3 # for complete details.
4
5 import pathlib
6 import re
7 import subprocess
8
9 import click
10 import tomllib
11 from packaging.version import Version
12
13
14 def run(*args: str) -> None:
15 print(f"[running] {list(args)}")
16 subprocess.check_call(list(args))
17
18
19 @click.group()
20 def cli():
21 pass
22
23
24 @cli.command()
25 def release() -> None:
26 base_dir = pathlib.Path(__file__).parent
27 with (base_dir / "pyproject.toml").open("rb") as f:
28 pyproject = tomllib.load(f)
29 version = pyproject["project"]["version"]
30
31 if Version(version).is_prerelease:
32 raise RuntimeError(
33 f"Can't release, pyproject.toml version is pre-release: {version}"
34 )
35
36 # Tag and push the tag (this will trigger the wheel builder in Actions)
37 run("git", "tag", "-s", version, "-m", f"{version} release")
38 run("git", "push", "--tags", "[email protected]:pyca/cryptography.git")
39
40
41 def replace_version(
42 p: pathlib.Path, variable_name: str, new_version: str
43 ) -> None:
44 content = p.read_text()
45
46 pattern = rf"^{variable_name}\s*=\s*.*$"
47 match = re.search(pattern, content, re.MULTILINE)
48 assert match is not None
49
50 start, end = match.span()
51 new_content = (
52 content[:start] + f'{variable_name} = "{new_version}"' + content[end:]
53 )
54
55 p.write_text(new_content)
56
57
58 @cli.command()
59 @click.argument("new_version")
60 def bump_version(new_version: str) -> None:
61 base_dir = pathlib.Path(__file__).parent
62
63 replace_version(base_dir / "pyproject.toml", "version", new_version)
64 replace_version(
65 base_dir / "src/cryptography/__about__.py", "__version__", new_version
66 )
67 replace_version(
68 base_dir / "vectors/pyproject.toml",
69 "version",
70 new_version,
71 )
72 replace_version(
73 base_dir / "vectors/cryptography_vectors/__about__.py",
74 "__version__",
75 new_version,
76 )
77
78
79 if __name__ == "__main__":
80 cli()
81
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/release.py b/release.py
--- a/release.py
+++ b/release.py
@@ -38,23 +38,24 @@
run("git", "push", "--tags", "[email protected]:pyca/cryptography.git")
-def replace_version(
- p: pathlib.Path, variable_name: str, new_version: str
-) -> None:
+def replace_pattern(p: pathlib.Path, pattern: str, replacement: str) -> None:
content = p.read_text()
-
- pattern = rf"^{variable_name}\s*=\s*.*$"
match = re.search(pattern, content, re.MULTILINE)
assert match is not None
start, end = match.span()
- new_content = (
- content[:start] + f'{variable_name} = "{new_version}"' + content[end:]
- )
-
+ new_content = content[:start] + replacement + content[end:]
p.write_text(new_content)
+def replace_version(
+ p: pathlib.Path, variable_name: str, new_version: str
+) -> None:
+ replace_pattern(
+ p, rf"^{variable_name}\s*=\s*.*$", f'{variable_name} = "{new_version}"'
+ )
+
+
@cli.command()
@click.argument("new_version")
def bump_version(new_version: str) -> None:
@@ -75,6 +76,19 @@
new_version,
)
+ if Version(new_version).is_prerelease:
+ replace_pattern(
+ base_dir / "pyproject.toml",
+ r'"cryptography_vectors(==.*?)?"',
+ '"cryptography_vectors"',
+ )
+ else:
+ replace_pattern(
+ base_dir / "pyproject.toml",
+ r'"cryptography_vectors(==.*?)?"',
+ f'"cryptography_vectors=={new_version}"',
+ )
+
if __name__ == "__main__":
cli()
| {"golden_diff": "diff --git a/release.py b/release.py\n--- a/release.py\n+++ b/release.py\n@@ -38,23 +38,24 @@\n run(\"git\", \"push\", \"--tags\", \"[email protected]:pyca/cryptography.git\")\n \n \n-def replace_version(\n- p: pathlib.Path, variable_name: str, new_version: str\n-) -> None:\n+def replace_pattern(p: pathlib.Path, pattern: str, replacement: str) -> None:\n content = p.read_text()\n-\n- pattern = rf\"^{variable_name}\\s*=\\s*.*$\"\n match = re.search(pattern, content, re.MULTILINE)\n assert match is not None\n \n start, end = match.span()\n- new_content = (\n- content[:start] + f'{variable_name} = \"{new_version}\"' + content[end:]\n- )\n-\n+ new_content = content[:start] + replacement + content[end:]\n p.write_text(new_content)\n \n \n+def replace_version(\n+ p: pathlib.Path, variable_name: str, new_version: str\n+) -> None:\n+ replace_pattern(\n+ p, rf\"^{variable_name}\\s*=\\s*.*$\", f'{variable_name} = \"{new_version}\"'\n+ )\n+\n+\n @cli.command()\n @click.argument(\"new_version\")\n def bump_version(new_version: str) -> None:\n@@ -75,6 +76,19 @@\n new_version,\n )\n \n+ if Version(new_version).is_prerelease:\n+ replace_pattern(\n+ base_dir / \"pyproject.toml\",\n+ r'\"cryptography_vectors(==.*?)?\"',\n+ '\"cryptography_vectors\"',\n+ )\n+ else:\n+ replace_pattern(\n+ base_dir / \"pyproject.toml\",\n+ r'\"cryptography_vectors(==.*?)?\"',\n+ f'\"cryptography_vectors=={new_version}\"',\n+ )\n+\n \n if __name__ == \"__main__\":\n cli()\n", "issue": "cryptography_vectors is missing from `test` extra\nThere is missing `cryptography_vectors` in the `test` extra in `pyproject.toml` so when tests are run from sdist with the `test` extra installed then testing fails and `cryptography_vectors` needs to be installed manually to make tests passing.\n", "before_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. 
See the LICENSE file in the root of this repository\n# for complete details.\n\nimport pathlib\nimport re\nimport subprocess\n\nimport click\nimport tomllib\nfrom packaging.version import Version\n\n\ndef run(*args: str) -> None:\n print(f\"[running] {list(args)}\")\n subprocess.check_call(list(args))\n\n\[email protected]()\ndef cli():\n pass\n\n\[email protected]()\ndef release() -> None:\n base_dir = pathlib.Path(__file__).parent\n with (base_dir / \"pyproject.toml\").open(\"rb\") as f:\n pyproject = tomllib.load(f)\n version = pyproject[\"project\"][\"version\"]\n\n if Version(version).is_prerelease:\n raise RuntimeError(\n f\"Can't release, pyproject.toml version is pre-release: {version}\"\n )\n\n # Tag and push the tag (this will trigger the wheel builder in Actions)\n run(\"git\", \"tag\", \"-s\", version, \"-m\", f\"{version} release\")\n run(\"git\", \"push\", \"--tags\", \"[email protected]:pyca/cryptography.git\")\n\n\ndef replace_version(\n p: pathlib.Path, variable_name: str, new_version: str\n) -> None:\n content = p.read_text()\n\n pattern = rf\"^{variable_name}\\s*=\\s*.*$\"\n match = re.search(pattern, content, re.MULTILINE)\n assert match is not None\n\n start, end = match.span()\n new_content = (\n content[:start] + f'{variable_name} = \"{new_version}\"' + content[end:]\n )\n\n p.write_text(new_content)\n\n\[email protected]()\[email protected](\"new_version\")\ndef bump_version(new_version: str) -> None:\n base_dir = pathlib.Path(__file__).parent\n\n replace_version(base_dir / \"pyproject.toml\", \"version\", new_version)\n replace_version(\n base_dir / \"src/cryptography/__about__.py\", \"__version__\", new_version\n )\n replace_version(\n base_dir / \"vectors/pyproject.toml\",\n \"version\",\n new_version,\n )\n replace_version(\n base_dir / \"vectors/cryptography_vectors/__about__.py\",\n \"__version__\",\n new_version,\n )\n\n\nif __name__ == \"__main__\":\n cli()\n", "path": "release.py"}], "after_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. 
See the LICENSE file in the root of this repository\n# for complete details.\n\nimport pathlib\nimport re\nimport subprocess\n\nimport click\nimport tomllib\nfrom packaging.version import Version\n\n\ndef run(*args: str) -> None:\n print(f\"[running] {list(args)}\")\n subprocess.check_call(list(args))\n\n\[email protected]()\ndef cli():\n pass\n\n\[email protected]()\ndef release() -> None:\n base_dir = pathlib.Path(__file__).parent\n with (base_dir / \"pyproject.toml\").open(\"rb\") as f:\n pyproject = tomllib.load(f)\n version = pyproject[\"project\"][\"version\"]\n\n if Version(version).is_prerelease:\n raise RuntimeError(\n f\"Can't release, pyproject.toml version is pre-release: {version}\"\n )\n\n # Tag and push the tag (this will trigger the wheel builder in Actions)\n run(\"git\", \"tag\", \"-s\", version, \"-m\", f\"{version} release\")\n run(\"git\", \"push\", \"--tags\", \"[email protected]:pyca/cryptography.git\")\n\n\ndef replace_pattern(p: pathlib.Path, pattern: str, replacement: str) -> None:\n content = p.read_text()\n match = re.search(pattern, content, re.MULTILINE)\n assert match is not None\n\n start, end = match.span()\n new_content = content[:start] + replacement + content[end:]\n p.write_text(new_content)\n\n\ndef replace_version(\n p: pathlib.Path, variable_name: str, new_version: str\n) -> None:\n replace_pattern(\n p, rf\"^{variable_name}\\s*=\\s*.*$\", f'{variable_name} = \"{new_version}\"'\n )\n\n\[email protected]()\[email protected](\"new_version\")\ndef bump_version(new_version: str) -> None:\n base_dir = pathlib.Path(__file__).parent\n\n replace_version(base_dir / \"pyproject.toml\", \"version\", new_version)\n replace_version(\n base_dir / \"src/cryptography/__about__.py\", \"__version__\", new_version\n )\n replace_version(\n base_dir / \"vectors/pyproject.toml\",\n \"version\",\n new_version,\n )\n replace_version(\n base_dir / \"vectors/cryptography_vectors/__about__.py\",\n \"__version__\",\n new_version,\n )\n\n if Version(new_version).is_prerelease:\n replace_pattern(\n base_dir / \"pyproject.toml\",\n r'\"cryptography_vectors(==.*?)?\"',\n '\"cryptography_vectors\"',\n )\n else:\n replace_pattern(\n base_dir / \"pyproject.toml\",\n r'\"cryptography_vectors(==.*?)?\"',\n f'\"cryptography_vectors=={new_version}\"',\n )\n\n\nif __name__ == \"__main__\":\n cli()\n", "path": "release.py"}]} | 1,009 | 429 |
gh_patches_debug_58135 | rasdani/github-patches | git_diff | liqd__a4-meinberlin-1579 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Untranslated "Plans"

Maybe we just remove it since there is already a header
--- END ISSUE ---
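A minimal sketch of the labelling pattern involved, using Django's lazy translation utilities; the form and field names below are illustrative, not taken from the project:
```
from django import forms
from django.utils.translation import ugettext_lazy as _


class PlansDashboardForm(forms.Form):
    # Without an explicit label Django derives "Plans" from the field name,
    # and that auto-generated string never goes through the translation
    # catalogs; wrapping it in ugettext_lazy makes the heading translatable.
    plans = forms.MultipleChoiceField(choices=(), label=_('Plans'))
```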
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `meinberlin/apps/plans/forms.py`
Content:
```
1 from django import forms
2 from django.conf import settings
3 from django.db.models import Q
4 from django.utils.translation import ugettext_lazy as _
5
6 from adhocracy4.dashboard.components.forms import ProjectDashboardForm
7 from adhocracy4.maps import widgets as maps_widgets
8 from adhocracy4.projects import models as project_models
9 from meinberlin.apps.contrib import widgets as contrib_widgets
10
11 from . import models
12
13
14 def get_theme_options():
15 return models.Plan.objects\
16 .filter(~Q(theme=''))\
17 .order_by('theme')\
18 .values_list('theme', flat=True)\
19 .distinct()
20
21
22 class PlanForm(forms.ModelForm):
23
24 class Meta:
25 model = models.Plan
26 fields = [
27 'title',
28 'description_image',
29 'contact',
30 'point',
31 'point_label',
32 'district',
33 'cost',
34 'description',
35 'theme',
36 'status',
37 'participation']
38 widgets = {
39 'point': maps_widgets.MapChoosePointWidget(
40 polygon=settings.BERLIN_POLYGON),
41 'theme': contrib_widgets.TextWithDatalistWidget(attrs={
42 'options': get_theme_options
43 })
44 }
45 error_messages = {
46 'point': {
47 'required': _('Please locate the plan on the map.')
48 }
49 }
50
51 def __init__(self, *args, **kwargs):
52 super().__init__(*args, **kwargs)
53 self.fields['district'].empty_label = _('City wide')
54
55
56 class CustomMultipleChoiceField(forms.ModelMultipleChoiceField):
57
58 widget = forms.RadioSelect
59
60 def clean(self, value):
61 if value is None:
62 return super().clean([])
63 return super().clean([value])
64
65
66 class ProjectPlansDashboardForm(ProjectDashboardForm):
67 plans = CustomMultipleChoiceField(queryset=None)
68
69 class Meta:
70 model = project_models.Project
71 fields = ['plans']
72 required_for_project_publish = ['plans']
73
74 def save(self, commit=False):
75 plans = self.cleaned_data['plans']
76 self.instance.plans.set(plans)
77
78 def __init__(self, *args, **kwargs):
79 super().__init__(*args, **kwargs)
80 self.initial['plans'] = self.instance.plans.all()
81 self.fields['plans'
82 ].queryset = self.instance.organisation.plan_set.all()
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/meinberlin/apps/plans/forms.py b/meinberlin/apps/plans/forms.py
--- a/meinberlin/apps/plans/forms.py
+++ b/meinberlin/apps/plans/forms.py
@@ -64,7 +64,8 @@
class ProjectPlansDashboardForm(ProjectDashboardForm):
- plans = CustomMultipleChoiceField(queryset=None)
+ plans = CustomMultipleChoiceField(queryset=None,
+ label=_('Plans'))
class Meta:
model = project_models.Project
| {"golden_diff": "diff --git a/meinberlin/apps/plans/forms.py b/meinberlin/apps/plans/forms.py\n--- a/meinberlin/apps/plans/forms.py\n+++ b/meinberlin/apps/plans/forms.py\n@@ -64,7 +64,8 @@\n \n \n class ProjectPlansDashboardForm(ProjectDashboardForm):\n- plans = CustomMultipleChoiceField(queryset=None)\n+ plans = CustomMultipleChoiceField(queryset=None,\n+ label=_('Plans'))\n \n class Meta:\n model = project_models.Project\n", "issue": "Untranslated \"Plans\"\n\r\n\r\n\r\nMaybe we just remove it since there is already a header\n", "before_files": [{"content": "from django import forms\nfrom django.conf import settings\nfrom django.db.models import Q\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom adhocracy4.dashboard.components.forms import ProjectDashboardForm\nfrom adhocracy4.maps import widgets as maps_widgets\nfrom adhocracy4.projects import models as project_models\nfrom meinberlin.apps.contrib import widgets as contrib_widgets\n\nfrom . import models\n\n\ndef get_theme_options():\n return models.Plan.objects\\\n .filter(~Q(theme=''))\\\n .order_by('theme')\\\n .values_list('theme', flat=True)\\\n .distinct()\n\n\nclass PlanForm(forms.ModelForm):\n\n class Meta:\n model = models.Plan\n fields = [\n 'title',\n 'description_image',\n 'contact',\n 'point',\n 'point_label',\n 'district',\n 'cost',\n 'description',\n 'theme',\n 'status',\n 'participation']\n widgets = {\n 'point': maps_widgets.MapChoosePointWidget(\n polygon=settings.BERLIN_POLYGON),\n 'theme': contrib_widgets.TextWithDatalistWidget(attrs={\n 'options': get_theme_options\n })\n }\n error_messages = {\n 'point': {\n 'required': _('Please locate the plan on the map.')\n }\n }\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self.fields['district'].empty_label = _('City wide')\n\n\nclass CustomMultipleChoiceField(forms.ModelMultipleChoiceField):\n\n widget = forms.RadioSelect\n\n def clean(self, value):\n if value is None:\n return super().clean([])\n return super().clean([value])\n\n\nclass ProjectPlansDashboardForm(ProjectDashboardForm):\n plans = CustomMultipleChoiceField(queryset=None)\n\n class Meta:\n model = project_models.Project\n fields = ['plans']\n required_for_project_publish = ['plans']\n\n def save(self, commit=False):\n plans = self.cleaned_data['plans']\n self.instance.plans.set(plans)\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self.initial['plans'] = self.instance.plans.all()\n self.fields['plans'\n ].queryset = self.instance.organisation.plan_set.all()\n", "path": "meinberlin/apps/plans/forms.py"}], "after_files": [{"content": "from django import forms\nfrom django.conf import settings\nfrom django.db.models import Q\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom adhocracy4.dashboard.components.forms import ProjectDashboardForm\nfrom adhocracy4.maps import widgets as maps_widgets\nfrom adhocracy4.projects import models as project_models\nfrom meinberlin.apps.contrib import widgets as contrib_widgets\n\nfrom . 
import models\n\n\ndef get_theme_options():\n return models.Plan.objects\\\n .filter(~Q(theme=''))\\\n .order_by('theme')\\\n .values_list('theme', flat=True)\\\n .distinct()\n\n\nclass PlanForm(forms.ModelForm):\n\n class Meta:\n model = models.Plan\n fields = [\n 'title',\n 'description_image',\n 'contact',\n 'point',\n 'point_label',\n 'district',\n 'cost',\n 'description',\n 'theme',\n 'status',\n 'participation']\n widgets = {\n 'point': maps_widgets.MapChoosePointWidget(\n polygon=settings.BERLIN_POLYGON),\n 'theme': contrib_widgets.TextWithDatalistWidget(attrs={\n 'options': get_theme_options\n })\n }\n error_messages = {\n 'point': {\n 'required': _('Please locate the plan on the map.')\n }\n }\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self.fields['district'].empty_label = _('City wide')\n\n\nclass CustomMultipleChoiceField(forms.ModelMultipleChoiceField):\n\n widget = forms.RadioSelect\n\n def clean(self, value):\n if value is None:\n return super().clean([])\n return super().clean([value])\n\n\nclass ProjectPlansDashboardForm(ProjectDashboardForm):\n plans = CustomMultipleChoiceField(queryset=None,\n label=_('Plans'))\n\n class Meta:\n model = project_models.Project\n fields = ['plans']\n required_for_project_publish = ['plans']\n\n def save(self, commit=False):\n plans = self.cleaned_data['plans']\n self.instance.plans.set(plans)\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self.initial['plans'] = self.instance.plans.all()\n self.fields['plans'\n ].queryset = self.instance.organisation.plan_set.all()\n", "path": "meinberlin/apps/plans/forms.py"}]} | 1,019 | 112 |
gh_patches_debug_16587 | rasdani/github-patches | git_diff | freedomofpress__securedrop-5839 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Several testinfra tests failing against 1.8.0-rc1 Focal production instances:
## Description
`./securedrop-admin verify` against a 1.8.0-rc1 instance results in 11 test failures (of which 2 are expected)
```
============================================================================================= short test summary info =============================================================================================
FAILED app/test_apparmor.py::test_apparmor_enforced[paramiko:/app-xenial] - AssertionError: assert 'xenial' in ' /usr/bin/man\n /usr/lib/NetworkManager/nm-dhcp-client.action\n /usr/lib/NetworkManager/nm...
FAILED app/test_apparmor.py::test_apparmor_enforced[paramiko:/app-focal] - AssertionError: assert 'focal' in ' /usr/bin/man\n /usr/lib/NetworkManager/nm-dhcp-client.action\n /usr/lib/NetworkManager/nm-d...
FAILED app-code/test_securedrop_rqrequeue.py::test_securedrop_rqrequeue_service[paramiko:/app] - assert '[Unit]\nDesc...user.target\n' == '[Unit]\nDesc...user.target\n'
FAILED app-code/test_securedrop_rqworker.py::test_securedrop_rqworker_service[paramiko:/app] - assert '[Unit]\nDesc...user.target\n' == '[Unit]\nDesc...user.target\n'
FAILED app-code/test_securedrop_shredder_configuration.py::test_securedrop_shredder_service[paramiko:/app] - assert '[Unit]\nDesc...user.target\n' == '[Unit]\nDesc...user.target\n'
FAILED app-code/test_securedrop_source_deleter_configuration.py::test_securedrop_source_deleter_service[paramiko:/app] - assert '[Unit]\nDesc...user.target\n' == '[Unit]\nDesc...user.target\n'
FAILED app-code/test_securedrop_app_code.py::test_securedrop_application_apt_dependencies[paramiko:/app-libpython3.5] - AssertionError: assert False
(expected failure) FAILED common/test_fpf_apt_repo.py::test_fpf_apt_repo_present[paramiko:/app] - AssertionError: Unexpected exit code 2 for CommandResult(command=b"grep -qs -- '^deb \\[arch=amd64\\] https://apt\\.freedom\\.pre...
(expected failure) FAILED common/test_fpf_apt_repo.py::test_fpf_apt_repo_present[paramiko:/mon] - AssertionError: Unexpected exit code 2 for CommandResult(command=b"grep -qs -- '^deb \\[arch=amd64\\] https://apt\\.freedom\\.pre...
FAILED common/test_grsecurity.py::test_paxctld_focal[paramiko:/mon] - RuntimeError: Ansible module is only available with ansible connection backend
FAILED common/test_grsecurity.py::test_paxctld_focal[paramiko:/app] - RuntimeError: Ansible module is only available with ansible connection backend
============================================================= 11 failed, 441 passed, 7 skipped, 3 xfailed, 1 xpassed, 8 warnings in 842.46s (0:14:02) =============================================================
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `molecule/testinfra/conftest.py`
Content:
```
1 """
2 Configuration for TestInfra test suite for SecureDrop.
3 Handles importing host-specific test vars, so test functions
4 can be reused across multiple hosts, with varied targets.
5
6 Vars should be placed in `testinfra/vars/<hostname>.yml`.
7 """
8
9 import io
10 import os
11 import yaml
12 from typing import Any, Dict
13
14 import testutils
15
16
17 # The config tests target staging by default. It's possible to override
18 # for e.g. prod, but the associated vars files are not yet ported.
19 target_host = os.environ.get('SECUREDROP_TESTINFRA_TARGET_HOST', 'staging')
20
21
22 def securedrop_import_testinfra_vars(hostname, with_header=False):
23 """
24 Import vars from a YAML file to populate tests with host-specific
25 values used in checks. For instance, the SecureDrop docroot will
26 be under /vagrant in development, but /var/www/securedrop in staging.
27
28 Vars must be stored in `testinfra/vars/<hostname>.yml`.
29 """
30 filepath = os.path.join(os.path.dirname(__file__), "vars", hostname+".yml")
31 with io.open(filepath, 'r') as f:
32 hostvars = yaml.safe_load(f)
33
34 # Testing against both Focal and Xenial must be supported for now in both
35 # staging scenarios, and in prod via `USE_FOCAL=1 ./securedrop-admin verify`
36 testing_focal = False
37 scenario_env = "MOLECULE_SCENARIO_NAME"
38 if scenario_env in os.environ and os.environ.get(scenario_env).endswith("focal"):
39 testing_focal = True
40 if "USE_FOCAL" in os.environ:
41 testing_focal = True
42
43 if testing_focal:
44 hostvars['securedrop_venv_site_packages'] = hostvars["securedrop_venv_site_packages"].format("3.8") # noqa: E501
45 hostvars['python_version'] = "3.8"
46 else:
47 hostvars['securedrop_venv_site_packages'] = hostvars["securedrop_venv_site_packages"].format("3.5") # noqa: E501
48 hostvars['python_version'] = "3.5"
49
50 if with_header:
51 hostvars = dict(securedrop_test_vars=hostvars)
52
53 return hostvars
54
55
56 class TestVars(dict):
57 managed_attrs = {} # type: Dict[str, Any]
58
59 def __init__(self, initial: Dict[str, Any]) -> None:
60 self.securedrop_target_distribution = os.environ.get("SECUREDROP_TARGET_DISTRIBUTION")
61 self.managed_attrs.update(initial)
62
63 def __getattr__(self, name: str) -> Any:
64 """
65 If the requested attribute names a dict in managed_attrs and that
66 contains a key with the name of the target distribution,
67 e.g. "focal", return that. Otherwise return the entire item
68 under the requested name.
69 """
70 try:
71 attr = self.managed_attrs[name]
72 if isinstance(attr, dict) and self.securedrop_target_distribution in attr:
73 return attr[self.securedrop_target_distribution]
74 return attr
75 except KeyError:
76 raise AttributeError(name)
77
78 def __str__(self) -> str:
79 return str(self.managed_attrs)
80
81
82 testutils.securedrop_test_vars = TestVars(securedrop_import_testinfra_vars(target_host))
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/molecule/testinfra/conftest.py b/molecule/testinfra/conftest.py
--- a/molecule/testinfra/conftest.py
+++ b/molecule/testinfra/conftest.py
@@ -43,9 +43,11 @@
if testing_focal:
hostvars['securedrop_venv_site_packages'] = hostvars["securedrop_venv_site_packages"].format("3.8") # noqa: E501
hostvars['python_version'] = "3.8"
+ hostvars['apparmor_enforce_actual'] = hostvars['apparmor_enforce']['focal']
else:
hostvars['securedrop_venv_site_packages'] = hostvars["securedrop_venv_site_packages"].format("3.5") # noqa: E501
hostvars['python_version'] = "3.5"
+ hostvars['apparmor_enforce_actual'] = hostvars['apparmor_enforce']['xenial']
if with_header:
hostvars = dict(securedrop_test_vars=hostvars)
| {"golden_diff": "diff --git a/molecule/testinfra/conftest.py b/molecule/testinfra/conftest.py\n--- a/molecule/testinfra/conftest.py\n+++ b/molecule/testinfra/conftest.py\n@@ -43,9 +43,11 @@\n if testing_focal:\n hostvars['securedrop_venv_site_packages'] = hostvars[\"securedrop_venv_site_packages\"].format(\"3.8\") # noqa: E501\n hostvars['python_version'] = \"3.8\"\n+ hostvars['apparmor_enforce_actual'] = hostvars['apparmor_enforce']['focal']\n else:\n hostvars['securedrop_venv_site_packages'] = hostvars[\"securedrop_venv_site_packages\"].format(\"3.5\") # noqa: E501\n hostvars['python_version'] = \"3.5\"\n+ hostvars['apparmor_enforce_actual'] = hostvars['apparmor_enforce']['xenial']\n \n if with_header:\n hostvars = dict(securedrop_test_vars=hostvars)\n", "issue": "Several testinfra tests failing against 1.8.0-rc1 Focal production instances:\n## Description\r\n`./securedrop-admin verify` against a 1.8.0-rc1 instance results in 11 test failures (of which 2 are expected)\r\n\r\n```\r\n============================================================================================= short test summary info =============================================================================================\r\nFAILED app/test_apparmor.py::test_apparmor_enforced[paramiko:/app-xenial] - AssertionError: assert 'xenial' in ' /usr/bin/man\\n /usr/lib/NetworkManager/nm-dhcp-client.action\\n /usr/lib/NetworkManager/nm...\r\nFAILED app/test_apparmor.py::test_apparmor_enforced[paramiko:/app-focal] - AssertionError: assert 'focal' in ' /usr/bin/man\\n /usr/lib/NetworkManager/nm-dhcp-client.action\\n /usr/lib/NetworkManager/nm-d...\r\nFAILED app-code/test_securedrop_rqrequeue.py::test_securedrop_rqrequeue_service[paramiko:/app] - assert '[Unit]\\nDesc...user.target\\n' == '[Unit]\\nDesc...user.target\\n'\r\nFAILED app-code/test_securedrop_rqworker.py::test_securedrop_rqworker_service[paramiko:/app] - assert '[Unit]\\nDesc...user.target\\n' == '[Unit]\\nDesc...user.target\\n'\r\nFAILED app-code/test_securedrop_shredder_configuration.py::test_securedrop_shredder_service[paramiko:/app] - assert '[Unit]\\nDesc...user.target\\n' == '[Unit]\\nDesc...user.target\\n'\r\nFAILED app-code/test_securedrop_source_deleter_configuration.py::test_securedrop_source_deleter_service[paramiko:/app] - assert '[Unit]\\nDesc...user.target\\n' == '[Unit]\\nDesc...user.target\\n'\r\nFAILED app-code/test_securedrop_app_code.py::test_securedrop_application_apt_dependencies[paramiko:/app-libpython3.5] - AssertionError: assert False\r\n(expected failure) FAILED common/test_fpf_apt_repo.py::test_fpf_apt_repo_present[paramiko:/app] - AssertionError: Unexpected exit code 2 for CommandResult(command=b\"grep -qs -- '^deb \\\\[arch=amd64\\\\] https://apt\\\\.freedom\\\\.pre...\r\n(expected failure) FAILED common/test_fpf_apt_repo.py::test_fpf_apt_repo_present[paramiko:/mon] - AssertionError: Unexpected exit code 2 for CommandResult(command=b\"grep -qs -- '^deb \\\\[arch=amd64\\\\] https://apt\\\\.freedom\\\\.pre...\r\nFAILED common/test_grsecurity.py::test_paxctld_focal[paramiko:/mon] - RuntimeError: Ansible module is only available with ansible connection backend\r\nFAILED common/test_grsecurity.py::test_paxctld_focal[paramiko:/app] - RuntimeError: Ansible module is only available with ansible connection backend\r\n============================================================= 11 failed, 441 passed, 7 skipped, 3 xfailed, 1 xpassed, 8 warnings in 842.46s (0:14:02) =============================================================\r\n```\r\n\n", 
"before_files": [{"content": "\"\"\"\nConfiguration for TestInfra test suite for SecureDrop.\nHandles importing host-specific test vars, so test functions\ncan be reused across multiple hosts, with varied targets.\n\nVars should be placed in `testinfra/vars/<hostname>.yml`.\n\"\"\"\n\nimport io\nimport os\nimport yaml\nfrom typing import Any, Dict\n\nimport testutils\n\n\n# The config tests target staging by default. It's possible to override\n# for e.g. prod, but the associated vars files are not yet ported.\ntarget_host = os.environ.get('SECUREDROP_TESTINFRA_TARGET_HOST', 'staging')\n\n\ndef securedrop_import_testinfra_vars(hostname, with_header=False):\n \"\"\"\n Import vars from a YAML file to populate tests with host-specific\n values used in checks. For instance, the SecureDrop docroot will\n be under /vagrant in development, but /var/www/securedrop in staging.\n\n Vars must be stored in `testinfra/vars/<hostname>.yml`.\n \"\"\"\n filepath = os.path.join(os.path.dirname(__file__), \"vars\", hostname+\".yml\")\n with io.open(filepath, 'r') as f:\n hostvars = yaml.safe_load(f)\n\n # Testing against both Focal and Xenial must be supported for now in both\n # staging scenarios, and in prod via `USE_FOCAL=1 ./securedrop-admin verify`\n testing_focal = False\n scenario_env = \"MOLECULE_SCENARIO_NAME\"\n if scenario_env in os.environ and os.environ.get(scenario_env).endswith(\"focal\"):\n testing_focal = True\n if \"USE_FOCAL\" in os.environ:\n testing_focal = True\n\n if testing_focal:\n hostvars['securedrop_venv_site_packages'] = hostvars[\"securedrop_venv_site_packages\"].format(\"3.8\") # noqa: E501\n hostvars['python_version'] = \"3.8\"\n else:\n hostvars['securedrop_venv_site_packages'] = hostvars[\"securedrop_venv_site_packages\"].format(\"3.5\") # noqa: E501\n hostvars['python_version'] = \"3.5\"\n\n if with_header:\n hostvars = dict(securedrop_test_vars=hostvars)\n\n return hostvars\n\n\nclass TestVars(dict):\n managed_attrs = {} # type: Dict[str, Any]\n\n def __init__(self, initial: Dict[str, Any]) -> None:\n self.securedrop_target_distribution = os.environ.get(\"SECUREDROP_TARGET_DISTRIBUTION\")\n self.managed_attrs.update(initial)\n\n def __getattr__(self, name: str) -> Any:\n \"\"\"\n If the requested attribute names a dict in managed_attrs and that\n contains a key with the name of the target distribution,\n e.g. \"focal\", return that. Otherwise return the entire item\n under the requested name.\n \"\"\"\n try:\n attr = self.managed_attrs[name]\n if isinstance(attr, dict) and self.securedrop_target_distribution in attr:\n return attr[self.securedrop_target_distribution]\n return attr\n except KeyError:\n raise AttributeError(name)\n\n def __str__(self) -> str:\n return str(self.managed_attrs)\n\n\ntestutils.securedrop_test_vars = TestVars(securedrop_import_testinfra_vars(target_host))\n", "path": "molecule/testinfra/conftest.py"}], "after_files": [{"content": "\"\"\"\nConfiguration for TestInfra test suite for SecureDrop.\nHandles importing host-specific test vars, so test functions\ncan be reused across multiple hosts, with varied targets.\n\nVars should be placed in `testinfra/vars/<hostname>.yml`.\n\"\"\"\n\nimport io\nimport os\nimport yaml\nfrom typing import Any, Dict\n\nimport testutils\n\n\n# The config tests target staging by default. It's possible to override\n# for e.g. 
prod, but the associated vars files are not yet ported.\ntarget_host = os.environ.get('SECUREDROP_TESTINFRA_TARGET_HOST', 'staging')\n\n\ndef securedrop_import_testinfra_vars(hostname, with_header=False):\n \"\"\"\n Import vars from a YAML file to populate tests with host-specific\n values used in checks. For instance, the SecureDrop docroot will\n be under /vagrant in development, but /var/www/securedrop in staging.\n\n Vars must be stored in `testinfra/vars/<hostname>.yml`.\n \"\"\"\n filepath = os.path.join(os.path.dirname(__file__), \"vars\", hostname+\".yml\")\n with io.open(filepath, 'r') as f:\n hostvars = yaml.safe_load(f)\n\n # Testing against both Focal and Xenial must be supported for now in both\n # staging scenarios, and in prod via `USE_FOCAL=1 ./securedrop-admin verify`\n testing_focal = False\n scenario_env = \"MOLECULE_SCENARIO_NAME\"\n if scenario_env in os.environ and os.environ.get(scenario_env).endswith(\"focal\"):\n testing_focal = True\n if \"USE_FOCAL\" in os.environ:\n testing_focal = True\n\n if testing_focal:\n hostvars['securedrop_venv_site_packages'] = hostvars[\"securedrop_venv_site_packages\"].format(\"3.8\") # noqa: E501\n hostvars['python_version'] = \"3.8\"\n hostvars['apparmor_enforce_actual'] = hostvars['apparmor_enforce']['focal']\n else:\n hostvars['securedrop_venv_site_packages'] = hostvars[\"securedrop_venv_site_packages\"].format(\"3.5\") # noqa: E501\n hostvars['python_version'] = \"3.5\"\n hostvars['apparmor_enforce_actual'] = hostvars['apparmor_enforce']['xenial']\n\n if with_header:\n hostvars = dict(securedrop_test_vars=hostvars)\n\n return hostvars\n\n\nclass TestVars(dict):\n managed_attrs = {} # type: Dict[str, Any]\n\n def __init__(self, initial: Dict[str, Any]) -> None:\n self.securedrop_target_distribution = os.environ.get(\"SECUREDROP_TARGET_DISTRIBUTION\")\n self.managed_attrs.update(initial)\n\n def __getattr__(self, name: str) -> Any:\n \"\"\"\n If the requested attribute names a dict in managed_attrs and that\n contains a key with the name of the target distribution,\n e.g. \"focal\", return that. Otherwise return the entire item\n under the requested name.\n \"\"\"\n try:\n attr = self.managed_attrs[name]\n if isinstance(attr, dict) and self.securedrop_target_distribution in attr:\n return attr[self.securedrop_target_distribution]\n return attr\n except KeyError:\n raise AttributeError(name)\n\n def __str__(self) -> str:\n return str(self.managed_attrs)\n\n\ntestutils.securedrop_test_vars = TestVars(securedrop_import_testinfra_vars(target_host))\n", "path": "molecule/testinfra/conftest.py"}]} | 1,813 | 237 |
gh_patches_debug_18447 | rasdani/github-patches | git_diff | facebookresearch__CompilerGym-442 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Observation 'Bitcode' without going through a temporary file.
## 🚀 Feature
Have an observation which just returns the bytes of the bitcode, rather than the file. Going through a temporary file is a (tiny) bit annoying.
```
env.observation["Bitcode"] -> bytes
env.bitcode == env.observation["Bitcode"]
```
## Motivation
I'm passing bitcode to LLVM tools. They will accept bitcode on stdin and emit it to stdout. No need for a temporary file.
## Pitch
Don't use a file when a `bytes` will do?
## Alternatives
Keep using a file.
--- END ISSUE ---
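A minimal sketch of that workflow, assuming the bytes-valued `Bitcode` observation proposed above; the `llvm-v0` environment id is CompilerGym's LLVM environment, and the `llvm-dis` invocation is illustrative of an LLVM tool that reads bitcode on stdin and writes to stdout:
```
import subprocess

import compiler_gym  # assumes the CompilerGym package is installed

env = compiler_gym.make("llvm-v0")
env.reset()

# Proposed behaviour from the issue: the observation is the raw bitcode bytes.
bitcode = env.observation["Bitcode"]

# Pipe the bytes straight into an LLVM tool instead of round-tripping through
# a temporary .bc file; with "-" llvm-dis reads stdin and "-o -" writes stdout.
result = subprocess.run(["llvm-dis", "-", "-o", "-"], input=bitcode,
                        capture_output=True, check=True)
print(result.stdout.decode())  # textual IR emitted on stdout
```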
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `compiler_gym/spaces/sequence.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates.
2 #
3 # This source code is licensed under the MIT license found in the
4 # LICENSE file in the root directory of this source tree.
5 from typing import Optional, Tuple
6
7 from gym.spaces import Space
8
9 from compiler_gym.spaces.scalar import Scalar
10
11
12 class Sequence(Space):
13 """A sequence of values. Each element of the sequence is of `dtype`. The
14 length of the sequence is bounded by `size_range`.
15
16 Example:
17
18 ::
19
20 >>> space = Sequence(size_range=(0, None), dtype=str)
21 >>> space.contains("Hello, world!")
22 True
23
24 ::
25
26 >>> space = Sequence(size_range=(256, 256), dtype=bytes)
27 >>> space.contains("Hello, world!")
28 False
29
30 :ivar size_range: A tuple indicating the `(lower, upper)` bounds for
31 sequence lengths. An upper bound of `None` means no upper bound. All
32 sequences must have a lower bound of length >= 0.
33 :ivar dtype: The data type for each element in a sequence.
34 :ivar opaque_data_format: An optional string describing an opaque data
35 format, e.g. a data structure that is serialized to a string/binary
36 array for transmission to the client. It is up to the client and service
37 to agree on how to decode observations using this value. For example,
38 an opaque_data_format of `string_json` could be used to indicate that
39 the observation is a string-serialized JSON value.
40 """
41
42 def __init__(
43 self,
44 size_range: Tuple[int, Optional[int]] = (0, None),
45 dtype=bytes,
46 opaque_data_format: Optional[str] = None,
47 scalar_range: Optional[Scalar] = None,
48 ):
49 """Constructor.
50
51 :param size_range: A tuple indicating the `(lower, upper)` bounds for
52 sequence lengths. An upper bound of `None` means no upper bound. All
53 sequences must have a lower bound of length >= 0.
54 :param dtype: The data type for each element in a sequence.
55 :param opaque_data_format: An optional string describing an opaque data
56 format, e.g. a data structure that is serialized to a string/binary
57 array for transmission to the client. It is up to the client and
58 service to agree on how to decode observations using this value. For
59 example, an opaque_data_format of `string_json` could be used to
60 indicate that the observation is a string-serialized JSON value.
61 :param scalar_range: If specified, this denotes the legal range of
62 each element in the sequence. This is enforced by
63 :meth:`contains() <compiler_gym.spaces.Sequence.contains>` checks.
64 """
65 self.size_range = size_range
66 self.dtype = dtype
67 self.opaque_data_format = opaque_data_format
68 self.scalar_range = scalar_range
69
70 def __repr__(self) -> str:
71 upper_bound = "inf" if self.size_range[1] is None else self.size_range[1]
72 d = f" -> {self.opaque_data_format}" if self.opaque_data_format else ""
73 return (
74 f"{self.dtype.__name__}_list<>[{int(self.size_range[0])},{upper_bound}]){d}"
75 )
76
77 def contains(self, x):
78 lower_bound = self.size_range[0]
79 upper_bound = float("inf") if self.size_range[1] is None else self.size_range[1]
80 if not (lower_bound <= len(x) <= upper_bound):
81 return False
82 for element in x:
83 if not isinstance(element, self.dtype):
84 return False
85
86 # Run the bounds check on every scalar element, if there is a scalar
87 # range specified.
88 if self.scalar_range:
89 return all(self.scalar_range.contains(s) for s in x)
90
91 return True
92
93 def sample(self):
94 """
95 .. warning::
96 The `Sequence` space cannot be sampled from.
97
98 :raises NotImplementedError: Not supported.
99 """
100 raise NotImplementedError
101
102 def __eq__(self, other):
103 if not isinstance(other, Sequence):
104 return False
105 return (
106 self.size_range == other.size_range
107 and self.dtype == other.dtype
108 and self.opaque_data_format == other.opaque_data_format
109 )
110
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/compiler_gym/spaces/sequence.py b/compiler_gym/spaces/sequence.py
--- a/compiler_gym/spaces/sequence.py
+++ b/compiler_gym/spaces/sequence.py
@@ -79,9 +79,18 @@
upper_bound = float("inf") if self.size_range[1] is None else self.size_range[1]
if not (lower_bound <= len(x) <= upper_bound):
return False
- for element in x:
- if not isinstance(element, self.dtype):
+
+ # TODO(cummins): The dtype API is inconsistent. When dtype=str or
+ # dtype=bytes, we expect this to be the type of the entire sequence. But
+ # for dtype=int, we expect this to be the type of each element. We
+ # should distinguish these differences better.
+ if self.dtype in {str, bytes}:
+ if not isinstance(x, self.dtype):
return False
+ else:
+ for element in x:
+ if not isinstance(element, self.dtype):
+ return False
# Run the bounds check on every scalar element, if there is a scalar
# range specified.
| {"golden_diff": "diff --git a/compiler_gym/spaces/sequence.py b/compiler_gym/spaces/sequence.py\n--- a/compiler_gym/spaces/sequence.py\n+++ b/compiler_gym/spaces/sequence.py\n@@ -79,9 +79,18 @@\n upper_bound = float(\"inf\") if self.size_range[1] is None else self.size_range[1]\n if not (lower_bound <= len(x) <= upper_bound):\n return False\n- for element in x:\n- if not isinstance(element, self.dtype):\n+\n+ # TODO(cummins): The dtype API is inconsistent. When dtype=str or\n+ # dtype=bytes, we expect this to be the type of the entire sequence. But\n+ # for dtype=int, we expect this to be the type of each element. We\n+ # should distinguish these differences better.\n+ if self.dtype in {str, bytes}:\n+ if not isinstance(x, self.dtype):\n return False\n+ else:\n+ for element in x:\n+ if not isinstance(element, self.dtype):\n+ return False\n \n # Run the bounds check on every scalar element, if there is a scalar\n # range specified.\n", "issue": "Observation 'Bitcode' without going through a temporary file.\n## \ud83d\ude80 Feature\r\n\r\nHave an observation which just returns the bytes of the bitcode, rather than the file. Going through a temporary file is a (tiny) bit annoying.\r\n\r\n```\r\nenv.observation[\"Bitcode\"] -> bytes\r\nenv.bitcode == env.observation[\"Bitcode\"]\r\n```\r\n## Motivation\r\n\r\nI'm passing bitcode to LLVM tools. They will accept bitcode on stdin and emit it to stdout. No need for a temporary file.\r\n\r\n## Pitch\r\n\r\nDon't use a file when a `bytes` will do? \r\n\r\n## Alternatives\r\n\r\nKeep using a file.\r\n\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates.\n#\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\nfrom typing import Optional, Tuple\n\nfrom gym.spaces import Space\n\nfrom compiler_gym.spaces.scalar import Scalar\n\n\nclass Sequence(Space):\n \"\"\"A sequence of values. Each element of the sequence is of `dtype`. The\n length of the sequence is bounded by `size_range`.\n\n Example:\n\n ::\n\n >>> space = Sequence(size_range=(0, None), dtype=str)\n >>> space.contains(\"Hello, world!\")\n True\n\n ::\n\n >>> space = Sequence(size_range=(256, 256), dtype=bytes)\n >>> space.contains(\"Hello, world!\")\n False\n\n :ivar size_range: A tuple indicating the `(lower, upper)` bounds for\n sequence lengths. An upper bound of `None` means no upper bound. All\n sequences must have a lower bound of length >= 0.\n :ivar dtype: The data type for each element in a sequence.\n :ivar opaque_data_format: An optional string describing an opaque data\n format, e.g. a data structure that is serialized to a string/binary\n array for transmission to the client. It is up to the client and service\n to agree on how to decode observations using this value. For example,\n an opaque_data_format of `string_json` could be used to indicate that\n the observation is a string-serialized JSON value.\n \"\"\"\n\n def __init__(\n self,\n size_range: Tuple[int, Optional[int]] = (0, None),\n dtype=bytes,\n opaque_data_format: Optional[str] = None,\n scalar_range: Optional[Scalar] = None,\n ):\n \"\"\"Constructor.\n\n :param size_range: A tuple indicating the `(lower, upper)` bounds for\n sequence lengths. An upper bound of `None` means no upper bound. All\n sequences must have a lower bound of length >= 0.\n :param dtype: The data type for each element in a sequence.\n :param opaque_data_format: An optional string describing an opaque data\n format, e.g. 
a data structure that is serialized to a string/binary\n array for transmission to the client. It is up to the client and\n service to agree on how to decode observations using this value. For\n example, an opaque_data_format of `string_json` could be used to\n indicate that the observation is a string-serialized JSON value.\n :param scalar_range: If specified, this denotes the legal range of\n each element in the sequence. This is enforced by\n :meth:`contains() <compiler_gym.spaces.Sequence.contains>` checks.\n \"\"\"\n self.size_range = size_range\n self.dtype = dtype\n self.opaque_data_format = opaque_data_format\n self.scalar_range = scalar_range\n\n def __repr__(self) -> str:\n upper_bound = \"inf\" if self.size_range[1] is None else self.size_range[1]\n d = f\" -> {self.opaque_data_format}\" if self.opaque_data_format else \"\"\n return (\n f\"{self.dtype.__name__}_list<>[{int(self.size_range[0])},{upper_bound}]){d}\"\n )\n\n def contains(self, x):\n lower_bound = self.size_range[0]\n upper_bound = float(\"inf\") if self.size_range[1] is None else self.size_range[1]\n if not (lower_bound <= len(x) <= upper_bound):\n return False\n for element in x:\n if not isinstance(element, self.dtype):\n return False\n\n # Run the bounds check on every scalar element, if there is a scalar\n # range specified.\n if self.scalar_range:\n return all(self.scalar_range.contains(s) for s in x)\n\n return True\n\n def sample(self):\n \"\"\"\n .. warning::\n The `Sequence` space cannot be sampled from.\n\n :raises NotImplementedError: Not supported.\n \"\"\"\n raise NotImplementedError\n\n def __eq__(self, other):\n if not isinstance(other, Sequence):\n return False\n return (\n self.size_range == other.size_range\n and self.dtype == other.dtype\n and self.opaque_data_format == other.opaque_data_format\n )\n", "path": "compiler_gym/spaces/sequence.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates.\n#\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\nfrom typing import Optional, Tuple\n\nfrom gym.spaces import Space\n\nfrom compiler_gym.spaces.scalar import Scalar\n\n\nclass Sequence(Space):\n \"\"\"A sequence of values. Each element of the sequence is of `dtype`. The\n length of the sequence is bounded by `size_range`.\n\n Example:\n\n ::\n\n >>> space = Sequence(size_range=(0, None), dtype=str)\n >>> space.contains(\"Hello, world!\")\n True\n\n ::\n\n >>> space = Sequence(size_range=(256, 256), dtype=bytes)\n >>> space.contains(\"Hello, world!\")\n False\n\n :ivar size_range: A tuple indicating the `(lower, upper)` bounds for\n sequence lengths. An upper bound of `None` means no upper bound. All\n sequences must have a lower bound of length >= 0.\n :ivar dtype: The data type for each element in a sequence.\n :ivar opaque_data_format: An optional string describing an opaque data\n format, e.g. a data structure that is serialized to a string/binary\n array for transmission to the client. It is up to the client and service\n to agree on how to decode observations using this value. 
For example,\n an opaque_data_format of `string_json` could be used to indicate that\n the observation is a string-serialized JSON value.\n \"\"\"\n\n def __init__(\n self,\n size_range: Tuple[int, Optional[int]] = (0, None),\n dtype=bytes,\n opaque_data_format: Optional[str] = None,\n scalar_range: Optional[Scalar] = None,\n ):\n \"\"\"Constructor.\n\n :param size_range: A tuple indicating the `(lower, upper)` bounds for\n sequence lengths. An upper bound of `None` means no upper bound. All\n sequences must have a lower bound of length >= 0.\n :param dtype: The data type for each element in a sequence.\n :param opaque_data_format: An optional string describing an opaque data\n format, e.g. a data structure that is serialized to a string/binary\n array for transmission to the client. It is up to the client and\n service to agree on how to decode observations using this value. For\n example, an opaque_data_format of `string_json` could be used to\n indicate that the observation is a string-serialized JSON value.\n :param scalar_range: If specified, this denotes the legal range of\n each element in the sequence. This is enforced by\n :meth:`contains() <compiler_gym.spaces.Sequence.contains>` checks.\n \"\"\"\n self.size_range = size_range\n self.dtype = dtype\n self.opaque_data_format = opaque_data_format\n self.scalar_range = scalar_range\n\n def __repr__(self) -> str:\n upper_bound = \"inf\" if self.size_range[1] is None else self.size_range[1]\n d = f\" -> {self.opaque_data_format}\" if self.opaque_data_format else \"\"\n return (\n f\"{self.dtype.__name__}_list<>[{int(self.size_range[0])},{upper_bound}]){d}\"\n )\n\n def contains(self, x):\n lower_bound = self.size_range[0]\n upper_bound = float(\"inf\") if self.size_range[1] is None else self.size_range[1]\n if not (lower_bound <= len(x) <= upper_bound):\n return False\n\n # TODO(cummins): The dtype API is inconsistent. When dtype=str or\n # dtype=bytes, we expect this to be the type of the entire sequence. But\n # for dtype=int, we expect this to be the type of each element. We\n # should distinguish these differences better.\n if self.dtype in {str, bytes}:\n if not isinstance(x, self.dtype):\n return False\n else:\n for element in x:\n if not isinstance(element, self.dtype):\n return False\n\n # Run the bounds check on every scalar element, if there is a scalar\n # range specified.\n if self.scalar_range:\n return all(self.scalar_range.contains(s) for s in x)\n\n return True\n\n def sample(self):\n \"\"\"\n .. warning::\n The `Sequence` space cannot be sampled from.\n\n :raises NotImplementedError: Not supported.\n \"\"\"\n raise NotImplementedError\n\n def __eq__(self, other):\n if not isinstance(other, Sequence):\n return False\n return (\n self.size_range == other.size_range\n and self.dtype == other.dtype\n and self.opaque_data_format == other.opaque_data_format\n )\n", "path": "compiler_gym/spaces/sequence.py"}]} | 1,562 | 260 |
gh_patches_debug_67277 | rasdani/github-patches | git_diff | litestar-org__litestar-2244 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
StaticFilesConfig and virtual directories
I'm trying to write a ``FileSystemProtocol`` to load files from the package data using [importlib_resources](https://importlib-resources.readthedocs.io/en/latest/using.html#). But because ``directories`` is defined as ``DirectoryPath``, pydantic checks if the given directories exist in the local filesystem.
That assumption does not hold in general, especially for any kind of virtual filesystem (e.g. a zipped package), so I think this existence check should be relaxed to support virtual filesystems.
https://github.com/starlite-api/starlite/blob/9bb6dcd57c10a591377cf8e3a537e9292566d5b9/starlite/config/static_files.py#L32
--- END ISSUE ---
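A minimal sketch of why the on-disk existence check is too strict, assuming pydantic-v1 `DirectoryPath` semantics and an importlib-based loader; the package and resource names are illustrative:
```
from importlib import resources  # importlib_resources backport on older Pythons

# Package data can live inside a zip or wheel, so this Traversable does not
# have to correspond to a real directory on the local filesystem.
static_root = resources.files("my_package").joinpath("static")


def get(path: str) -> bytes:
    # A FileSystemProtocol-style loader can still hand back the file bytes...
    return static_root.joinpath(path).read_bytes()

# ...but a config field typed as pydantic's DirectoryPath insists on an
# existing on-disk directory, so such virtual roots fail validation.
```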
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `litestar/cli/main.py`
Content:
```
1 from __future__ import annotations
2
3 import sys
4 from pathlib import Path
5 from typing import TYPE_CHECKING
6
7 from ._utils import RICH_CLICK_INSTALLED, LitestarEnv, LitestarExtensionGroup
8 from .commands import core, schema, sessions
9
10 if TYPE_CHECKING or not RICH_CLICK_INSTALLED: # pragma: no cover
11 import click
12 from click import Context, group, option, pass_context
13 from click import Path as ClickPath
14 else:
15 import rich_click as click
16 from rich_click import Context, group, option, pass_context
17 from rich_click import Path as ClickPath
18 from rich_click.cli import patch as rich_click_patch
19
20 rich_click_patch()
21 click.rich_click.USE_RICH_MARKUP = True
22 click.rich_click.USE_MARKDOWN = False
23 click.rich_click.SHOW_ARGUMENTS = True
24 click.rich_click.GROUP_ARGUMENTS_OPTIONS = True
25 click.rich_click.SHOW_ARGUMENTS = True
26 click.rich_click.GROUP_ARGUMENTS_OPTIONS = True
27 click.rich_click.STYLE_ERRORS_SUGGESTION = "magenta italic"
28 click.rich_click.ERRORS_SUGGESTION = ""
29 click.rich_click.ERRORS_EPILOGUE = ""
30 click.rich_click.MAX_WIDTH = 100
31 click.rich_click.SHOW_METAVARS_COLUMN = True
32 click.rich_click.APPEND_METAVARS_HELP = True
33
34
35 __all__ = ("litestar_group",)
36
37
38 @group(cls=LitestarExtensionGroup, context_settings={"help_option_names": ["-h", "--help"]})
39 @option("--app", "app_path", help="Module path to a Litestar application")
40 @option(
41 "--app-dir",
42 help="Look for APP in the specified directory, by adding this to the PYTHONPATH. Defaults to the current working directory.",
43 default=None,
44 type=ClickPath(dir_okay=True, file_okay=False, path_type=Path),
45 show_default=False,
46 )
47 @pass_context
48 def litestar_group(ctx: Context, app_path: str | None, app_dir: Path | None = None) -> None:
49 """Litestar CLI."""
50 sys.path.append(str(app_dir))
51
52 if ctx.obj is None: # env has not been loaded yet, so we can lazy load it
53 ctx.obj = lambda: LitestarEnv.from_env(app_path)
54
55
56 # add sub commands here
57
58 litestar_group.add_command(core.info_command)
59 litestar_group.add_command(core.run_command)
60 litestar_group.add_command(core.routes_command)
61 litestar_group.add_command(core.version_command)
62 litestar_group.add_command(sessions.sessions_group)
63 litestar_group.add_command(schema.schema_group)
64
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/litestar/cli/main.py b/litestar/cli/main.py
--- a/litestar/cli/main.py
+++ b/litestar/cli/main.py
@@ -27,7 +27,7 @@
click.rich_click.STYLE_ERRORS_SUGGESTION = "magenta italic"
click.rich_click.ERRORS_SUGGESTION = ""
click.rich_click.ERRORS_EPILOGUE = ""
- click.rich_click.MAX_WIDTH = 100
+ click.rich_click.MAX_WIDTH = 80
click.rich_click.SHOW_METAVARS_COLUMN = True
click.rich_click.APPEND_METAVARS_HELP = True
| {"golden_diff": "diff --git a/litestar/cli/main.py b/litestar/cli/main.py\n--- a/litestar/cli/main.py\n+++ b/litestar/cli/main.py\n@@ -27,7 +27,7 @@\n click.rich_click.STYLE_ERRORS_SUGGESTION = \"magenta italic\"\n click.rich_click.ERRORS_SUGGESTION = \"\"\n click.rich_click.ERRORS_EPILOGUE = \"\"\n- click.rich_click.MAX_WIDTH = 100\n+ click.rich_click.MAX_WIDTH = 80\n click.rich_click.SHOW_METAVARS_COLUMN = True\n click.rich_click.APPEND_METAVARS_HELP = True\n", "issue": "StaticFilesConfig and virtual directories\nI'm trying to write a ``FileSystemProtocol`` to load files from the package data using [importlib_resources](https://importlib-resources.readthedocs.io/en/latest/using.html#). But because ``directories`` is defined as ``DirectoryPath``, pydantic checks if the given directories exist in the local filesystem. \r\n\r\nThis is not generally true, especially in any kind of virtual filesystem (e.g. a zipped package). I think this condition should be relaxed to support virtual filesystems.\r\n\r\nhttps://github.com/starlite-api/starlite/blob/9bb6dcd57c10a591377cf8e3a537e9292566d5b9/starlite/config/static_files.py#L32\n", "before_files": [{"content": "from __future__ import annotations\n\nimport sys\nfrom pathlib import Path\nfrom typing import TYPE_CHECKING\n\nfrom ._utils import RICH_CLICK_INSTALLED, LitestarEnv, LitestarExtensionGroup\nfrom .commands import core, schema, sessions\n\nif TYPE_CHECKING or not RICH_CLICK_INSTALLED: # pragma: no cover\n import click\n from click import Context, group, option, pass_context\n from click import Path as ClickPath\nelse:\n import rich_click as click\n from rich_click import Context, group, option, pass_context\n from rich_click import Path as ClickPath\n from rich_click.cli import patch as rich_click_patch\n\n rich_click_patch()\n click.rich_click.USE_RICH_MARKUP = True\n click.rich_click.USE_MARKDOWN = False\n click.rich_click.SHOW_ARGUMENTS = True\n click.rich_click.GROUP_ARGUMENTS_OPTIONS = True\n click.rich_click.SHOW_ARGUMENTS = True\n click.rich_click.GROUP_ARGUMENTS_OPTIONS = True\n click.rich_click.STYLE_ERRORS_SUGGESTION = \"magenta italic\"\n click.rich_click.ERRORS_SUGGESTION = \"\"\n click.rich_click.ERRORS_EPILOGUE = \"\"\n click.rich_click.MAX_WIDTH = 100\n click.rich_click.SHOW_METAVARS_COLUMN = True\n click.rich_click.APPEND_METAVARS_HELP = True\n\n\n__all__ = (\"litestar_group\",)\n\n\n@group(cls=LitestarExtensionGroup, context_settings={\"help_option_names\": [\"-h\", \"--help\"]})\n@option(\"--app\", \"app_path\", help=\"Module path to a Litestar application\")\n@option(\n \"--app-dir\",\n help=\"Look for APP in the specified directory, by adding this to the PYTHONPATH. 
Defaults to the current working directory.\",\n default=None,\n type=ClickPath(dir_okay=True, file_okay=False, path_type=Path),\n show_default=False,\n)\n@pass_context\ndef litestar_group(ctx: Context, app_path: str | None, app_dir: Path | None = None) -> None:\n \"\"\"Litestar CLI.\"\"\"\n sys.path.append(str(app_dir))\n\n if ctx.obj is None: # env has not been loaded yet, so we can lazy load it\n ctx.obj = lambda: LitestarEnv.from_env(app_path)\n\n\n# add sub commands here\n\nlitestar_group.add_command(core.info_command)\nlitestar_group.add_command(core.run_command)\nlitestar_group.add_command(core.routes_command)\nlitestar_group.add_command(core.version_command)\nlitestar_group.add_command(sessions.sessions_group)\nlitestar_group.add_command(schema.schema_group)\n", "path": "litestar/cli/main.py"}], "after_files": [{"content": "from __future__ import annotations\n\nimport sys\nfrom pathlib import Path\nfrom typing import TYPE_CHECKING\n\nfrom ._utils import RICH_CLICK_INSTALLED, LitestarEnv, LitestarExtensionGroup\nfrom .commands import core, schema, sessions\n\nif TYPE_CHECKING or not RICH_CLICK_INSTALLED: # pragma: no cover\n import click\n from click import Context, group, option, pass_context\n from click import Path as ClickPath\nelse:\n import rich_click as click\n from rich_click import Context, group, option, pass_context\n from rich_click import Path as ClickPath\n from rich_click.cli import patch as rich_click_patch\n\n rich_click_patch()\n click.rich_click.USE_RICH_MARKUP = True\n click.rich_click.USE_MARKDOWN = False\n click.rich_click.SHOW_ARGUMENTS = True\n click.rich_click.GROUP_ARGUMENTS_OPTIONS = True\n click.rich_click.SHOW_ARGUMENTS = True\n click.rich_click.GROUP_ARGUMENTS_OPTIONS = True\n click.rich_click.STYLE_ERRORS_SUGGESTION = \"magenta italic\"\n click.rich_click.ERRORS_SUGGESTION = \"\"\n click.rich_click.ERRORS_EPILOGUE = \"\"\n click.rich_click.MAX_WIDTH = 80\n click.rich_click.SHOW_METAVARS_COLUMN = True\n click.rich_click.APPEND_METAVARS_HELP = True\n\n\n__all__ = (\"litestar_group\",)\n\n\n@group(cls=LitestarExtensionGroup, context_settings={\"help_option_names\": [\"-h\", \"--help\"]})\n@option(\"--app\", \"app_path\", help=\"Module path to a Litestar application\")\n@option(\n \"--app-dir\",\n help=\"Look for APP in the specified directory, by adding this to the PYTHONPATH. Defaults to the current working directory.\",\n default=None,\n type=ClickPath(dir_okay=True, file_okay=False, path_type=Path),\n show_default=False,\n)\n@pass_context\ndef litestar_group(ctx: Context, app_path: str | None, app_dir: Path | None = None) -> None:\n \"\"\"Litestar CLI.\"\"\"\n sys.path.append(str(app_dir))\n\n if ctx.obj is None: # env has not been loaded yet, so we can lazy load it\n ctx.obj = lambda: LitestarEnv.from_env(app_path)\n\n\n# add sub commands here\n\nlitestar_group.add_command(core.info_command)\nlitestar_group.add_command(core.run_command)\nlitestar_group.add_command(core.routes_command)\nlitestar_group.add_command(core.version_command)\nlitestar_group.add_command(sessions.sessions_group)\nlitestar_group.add_command(schema.schema_group)\n", "path": "litestar/cli/main.py"}]} | 1,124 | 147 |
gh_patches_debug_6898 | rasdani/github-patches | git_diff | mozilla__bugbug-98 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Unnecessary string formatting when saving the model
The string formatting at https://github.com/mozilla/bugbug/blob/ab007344c5c799c08739b2d56f334afcaf242919/bugbug/model.py#L111 is not needed; we can use the variable directly.
--- END ISSUE ---
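The pattern in question, sketched with an illustrative variable name rather than the exact line from `model.py`: formatting a lone variable through an f-string just rebuilds the same string, so the variable can be passed straight through.
```
from sklearn.externals import joblib


def save(model, path):
    # f'{path}' produces a string identical to path itself, so pass the
    # variable directly instead of joblib.dump(model, f'{path}').
    joblib.dump(model, path)
```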
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `bugbug/model.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # This Source Code Form is subject to the terms of the Mozilla Public
3 # License, v. 2.0. If a copy of the MPL was not distributed with this file,
4 # You can obtain one at http://mozilla.org/MPL/2.0/.
5
6 import numpy as np
7 import shap
8 from imblearn.metrics import classification_report_imbalanced
9 from imblearn.under_sampling import RandomUnderSampler
10 from sklearn import metrics
11 from sklearn.externals import joblib
12 from sklearn.feature_extraction.text import TfidfVectorizer
13 from sklearn.model_selection import cross_val_score
14 from sklearn.model_selection import train_test_split
15
16 from bugbug import bugzilla
17 from bugbug.nlp import SpacyVectorizer
18
19
20 class Model():
21 def __init__(self, lemmatization=False):
22 if lemmatization:
23 self.text_vectorizer = SpacyVectorizer
24 else:
25 self.text_vectorizer = TfidfVectorizer
26
27 self.undersampling_enabled = True
28 self.cross_validation_enabled = True
29
30 def get_feature_names(self):
31 return []
32
33 def train(self):
34 classes = self.get_labels()
35 class_names = sorted(list(set(classes.values())), reverse=True)
36
37 # Get bugs, filtering out those for which we have no labels.
38 def bugs():
39 return (bug for bug in bugzilla.get_bugs() if bug['id'] in classes)
40
41 # Calculate labels.
42 y = np.array([classes[bug['id']] for bug in bugs()])
43
44 # Extract features from the bugs.
45 X = self.extraction_pipeline.fit_transform(bugs())
46
47 print(f'X: {X.shape}, y: {y.shape}')
48
49 # Split dataset in training and test.
50 X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=0)
51
52 if self.undersampling_enabled:
53 # Under-sample the majority classes, as the datasets are imbalanced.
54 X_train, y_train = RandomUnderSampler(random_state=0).fit_sample(X_train, y_train)
55
56 print(f'X_train: {X_train.shape}, y_train: {y_train.shape}')
57 print(f'X_test: {X_test.shape}, y_test: {y_test.shape}')
58
59 # Use k-fold cross validation to evaluate results.
60 if self.cross_validation_enabled:
61 scores = cross_val_score(self.clf, X_train, y_train, cv=5)
62 print(f'CV Accuracy: f{scores.mean()} (+/- {scores.std() * 2})')
63
64 # Evaluate results on the test set.
65 self.clf.fit(X_train, y_train)
66
67 feature_names = self.get_feature_names()
68 if len(feature_names):
69 explainer = shap.TreeExplainer(self.clf)
70 shap_values = explainer.shap_values(X_train)
71
72 print('Feature ranking (top 20 features):')
73 # Calculate the values that represent the fraction of the model output variability attributable
74 # to each feature across the whole dataset.
75 shap_sums = np.abs(shap_values).sum(0)
76 rel_shap_sums = shap_sums / shap_sums.sum()
77 indices = np.argsort(rel_shap_sums)[::-1][:20]
78 for i, index in enumerate(indices):
79 print(f'{i + 1}. \'{feature_names[index]}\' ({rel_shap_sums[index]})')
80
81 y_pred = self.clf.predict(X_test)
82
83 print(f'No confidence threshold - {len(y_test)} classified')
84 print(metrics.confusion_matrix(y_test, y_pred, labels=class_names))
85 print(classification_report_imbalanced(y_test, y_pred, labels=class_names))
86
87 # Evaluate results on the test set for some confidence thresholds.
88 for confidence_threshold in [0.6, 0.7, 0.8, 0.9]:
89 y_pred_probas = self.clf.predict_proba(X_test)
90
91 y_test_filter = []
92 y_pred_filter = []
93 for i in range(0, len(y_test)):
94 argmax = np.argmax(y_pred_probas[i])
95 if y_pred_probas[i][argmax] < confidence_threshold:
96 continue
97
98 y_test_filter.append(y_test[i])
99 y_pred_filter.append(argmax)
100
101 y_pred_filter = self.clf._le.inverse_transform(y_pred_filter)
102
103 print(f'\nConfidence threshold > {confidence_threshold} - {len(y_test_filter)} classified')
104 print(metrics.confusion_matrix(y_test_filter, y_pred_filter, labels=class_names))
105 print(classification_report_imbalanced(y_test_filter, y_pred_filter, labels=class_names))
106
107 joblib.dump(self, f'{self.__class__.__name__.lower()}')
108
109 @staticmethod
110 def load(model_file_name):
111 return joblib.load(model_file_name)
112
113 def overwrite_classes(self, bugs, classes, probabilities):
114 return classes
115
116 def classify(self, bugs, probabilities=False, importances=False):
117 assert bugs is not None
118 assert self.extraction_pipeline is not None and self.clf is not None, 'The module needs to be initialized first'
119
120 if not isinstance(bugs, list):
121 bugs = [bugs]
122
123 assert isinstance(bugs[0], dict)
124
125 X = self.extraction_pipeline.transform(bugs)
126 if probabilities:
127 classes = self.clf.predict_proba(X)
128 else:
129 classes = self.clf.predict(X)
130
131 classes = self.overwrite_classes(bugs, classes, probabilities)
132
133 if importances:
134 explainer = shap.TreeExplainer(self.clf)
135 shap_values = explainer.shap_values(X)
136
137 shap_sums = shap_values.sum(0)
138 abs_shap_sums = np.abs(shap_sums)
139 rel_shap_sums = abs_shap_sums / abs_shap_sums.sum()
140 indices = np.argsort(abs_shap_sums)[::-1]
141 importances = [(index, shap_sums[index] > 0, rel_shap_sums[index]) for index in indices]
142
143 return classes, importances
144
145 return classes
146
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/bugbug/model.py b/bugbug/model.py
--- a/bugbug/model.py
+++ b/bugbug/model.py
@@ -104,7 +104,7 @@
print(metrics.confusion_matrix(y_test_filter, y_pred_filter, labels=class_names))
print(classification_report_imbalanced(y_test_filter, y_pred_filter, labels=class_names))
- joblib.dump(self, f'{self.__class__.__name__.lower()}')
+ joblib.dump(self, self.__class__.__name__.lower())
@staticmethod
def load(model_file_name):
| {"golden_diff": "diff --git a/bugbug/model.py b/bugbug/model.py\n--- a/bugbug/model.py\n+++ b/bugbug/model.py\n@@ -104,7 +104,7 @@\n print(metrics.confusion_matrix(y_test_filter, y_pred_filter, labels=class_names))\n print(classification_report_imbalanced(y_test_filter, y_pred_filter, labels=class_names))\n \n- joblib.dump(self, f'{self.__class__.__name__.lower()}')\n+ joblib.dump(self, self.__class__.__name__.lower())\n \n @staticmethod\n def load(model_file_name):\n", "issue": "Unnecessary string formatting when saving the model\nhttps://github.com/mozilla/bugbug/blob/ab007344c5c799c08739b2d56f334afcaf242919/bugbug/model.py#L111 is not needed, we can use the variable directly.\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport numpy as np\nimport shap\nfrom imblearn.metrics import classification_report_imbalanced\nfrom imblearn.under_sampling import RandomUnderSampler\nfrom sklearn import metrics\nfrom sklearn.externals import joblib\nfrom sklearn.feature_extraction.text import TfidfVectorizer\nfrom sklearn.model_selection import cross_val_score\nfrom sklearn.model_selection import train_test_split\n\nfrom bugbug import bugzilla\nfrom bugbug.nlp import SpacyVectorizer\n\n\nclass Model():\n def __init__(self, lemmatization=False):\n if lemmatization:\n self.text_vectorizer = SpacyVectorizer\n else:\n self.text_vectorizer = TfidfVectorizer\n\n self.undersampling_enabled = True\n self.cross_validation_enabled = True\n\n def get_feature_names(self):\n return []\n\n def train(self):\n classes = self.get_labels()\n class_names = sorted(list(set(classes.values())), reverse=True)\n\n # Get bugs, filtering out those for which we have no labels.\n def bugs():\n return (bug for bug in bugzilla.get_bugs() if bug['id'] in classes)\n\n # Calculate labels.\n y = np.array([classes[bug['id']] for bug in bugs()])\n\n # Extract features from the bugs.\n X = self.extraction_pipeline.fit_transform(bugs())\n\n print(f'X: {X.shape}, y: {y.shape}')\n\n # Split dataset in training and test.\n X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=0)\n\n if self.undersampling_enabled:\n # Under-sample the majority classes, as the datasets are imbalanced.\n X_train, y_train = RandomUnderSampler(random_state=0).fit_sample(X_train, y_train)\n\n print(f'X_train: {X_train.shape}, y_train: {y_train.shape}')\n print(f'X_test: {X_test.shape}, y_test: {y_test.shape}')\n\n # Use k-fold cross validation to evaluate results.\n if self.cross_validation_enabled:\n scores = cross_val_score(self.clf, X_train, y_train, cv=5)\n print(f'CV Accuracy: f{scores.mean()} (+/- {scores.std() * 2})')\n\n # Evaluate results on the test set.\n self.clf.fit(X_train, y_train)\n\n feature_names = self.get_feature_names()\n if len(feature_names):\n explainer = shap.TreeExplainer(self.clf)\n shap_values = explainer.shap_values(X_train)\n\n print('Feature ranking (top 20 features):')\n # Calculate the values that represent the fraction of the model output variability attributable\n # to each feature across the whole dataset.\n shap_sums = np.abs(shap_values).sum(0)\n rel_shap_sums = shap_sums / shap_sums.sum()\n indices = np.argsort(rel_shap_sums)[::-1][:20]\n for i, index in enumerate(indices):\n print(f'{i + 1}. 
\\'{feature_names[index]}\\' ({rel_shap_sums[index]})')\n\n y_pred = self.clf.predict(X_test)\n\n print(f'No confidence threshold - {len(y_test)} classified')\n print(metrics.confusion_matrix(y_test, y_pred, labels=class_names))\n print(classification_report_imbalanced(y_test, y_pred, labels=class_names))\n\n # Evaluate results on the test set for some confidence thresholds.\n for confidence_threshold in [0.6, 0.7, 0.8, 0.9]:\n y_pred_probas = self.clf.predict_proba(X_test)\n\n y_test_filter = []\n y_pred_filter = []\n for i in range(0, len(y_test)):\n argmax = np.argmax(y_pred_probas[i])\n if y_pred_probas[i][argmax] < confidence_threshold:\n continue\n\n y_test_filter.append(y_test[i])\n y_pred_filter.append(argmax)\n\n y_pred_filter = self.clf._le.inverse_transform(y_pred_filter)\n\n print(f'\\nConfidence threshold > {confidence_threshold} - {len(y_test_filter)} classified')\n print(metrics.confusion_matrix(y_test_filter, y_pred_filter, labels=class_names))\n print(classification_report_imbalanced(y_test_filter, y_pred_filter, labels=class_names))\n\n joblib.dump(self, f'{self.__class__.__name__.lower()}')\n\n @staticmethod\n def load(model_file_name):\n return joblib.load(model_file_name)\n\n def overwrite_classes(self, bugs, classes, probabilities):\n return classes\n\n def classify(self, bugs, probabilities=False, importances=False):\n assert bugs is not None\n assert self.extraction_pipeline is not None and self.clf is not None, 'The module needs to be initialized first'\n\n if not isinstance(bugs, list):\n bugs = [bugs]\n\n assert isinstance(bugs[0], dict)\n\n X = self.extraction_pipeline.transform(bugs)\n if probabilities:\n classes = self.clf.predict_proba(X)\n else:\n classes = self.clf.predict(X)\n\n classes = self.overwrite_classes(bugs, classes, probabilities)\n\n if importances:\n explainer = shap.TreeExplainer(self.clf)\n shap_values = explainer.shap_values(X)\n\n shap_sums = shap_values.sum(0)\n abs_shap_sums = np.abs(shap_sums)\n rel_shap_sums = abs_shap_sums / abs_shap_sums.sum()\n indices = np.argsort(abs_shap_sums)[::-1]\n importances = [(index, shap_sums[index] > 0, rel_shap_sums[index]) for index in indices]\n\n return classes, importances\n\n return classes\n", "path": "bugbug/model.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. 
If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport numpy as np\nimport shap\nfrom imblearn.metrics import classification_report_imbalanced\nfrom imblearn.under_sampling import RandomUnderSampler\nfrom sklearn import metrics\nfrom sklearn.externals import joblib\nfrom sklearn.feature_extraction.text import TfidfVectorizer\nfrom sklearn.model_selection import cross_val_score\nfrom sklearn.model_selection import train_test_split\n\nfrom bugbug import bugzilla\nfrom bugbug.nlp import SpacyVectorizer\n\n\nclass Model():\n def __init__(self, lemmatization=False):\n if lemmatization:\n self.text_vectorizer = SpacyVectorizer\n else:\n self.text_vectorizer = TfidfVectorizer\n\n self.undersampling_enabled = True\n self.cross_validation_enabled = True\n\n def get_feature_names(self):\n return []\n\n def train(self):\n classes = self.get_labels()\n class_names = sorted(list(set(classes.values())), reverse=True)\n\n # Get bugs, filtering out those for which we have no labels.\n def bugs():\n return (bug for bug in bugzilla.get_bugs() if bug['id'] in classes)\n\n # Calculate labels.\n y = np.array([classes[bug['id']] for bug in bugs()])\n\n # Extract features from the bugs.\n X = self.extraction_pipeline.fit_transform(bugs())\n\n print(f'X: {X.shape}, y: {y.shape}')\n\n # Split dataset in training and test.\n X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=0)\n\n if self.undersampling_enabled:\n # Under-sample the majority classes, as the datasets are imbalanced.\n X_train, y_train = RandomUnderSampler(random_state=0).fit_sample(X_train, y_train)\n\n print(f'X_train: {X_train.shape}, y_train: {y_train.shape}')\n print(f'X_test: {X_test.shape}, y_test: {y_test.shape}')\n\n # Use k-fold cross validation to evaluate results.\n if self.cross_validation_enabled:\n scores = cross_val_score(self.clf, X_train, y_train, cv=5)\n print(f'CV Accuracy: f{scores.mean()} (+/- {scores.std() * 2})')\n\n # Evaluate results on the test set.\n self.clf.fit(X_train, y_train)\n\n feature_names = self.get_feature_names()\n if len(feature_names):\n explainer = shap.TreeExplainer(self.clf)\n shap_values = explainer.shap_values(X_train)\n\n print('Feature ranking (top 20 features):')\n # Calculate the values that represent the fraction of the model output variability attributable\n # to each feature across the whole dataset.\n shap_sums = np.abs(shap_values).sum(0)\n rel_shap_sums = shap_sums / shap_sums.sum()\n indices = np.argsort(rel_shap_sums)[::-1][:20]\n for i, index in enumerate(indices):\n print(f'{i + 1}. 
\\'{feature_names[index]}\\' ({rel_shap_sums[index]})')\n\n y_pred = self.clf.predict(X_test)\n\n print(f'No confidence threshold - {len(y_test)} classified')\n print(metrics.confusion_matrix(y_test, y_pred, labels=class_names))\n print(classification_report_imbalanced(y_test, y_pred, labels=class_names))\n\n # Evaluate results on the test set for some confidence thresholds.\n for confidence_threshold in [0.6, 0.7, 0.8, 0.9]:\n y_pred_probas = self.clf.predict_proba(X_test)\n\n y_test_filter = []\n y_pred_filter = []\n for i in range(0, len(y_test)):\n argmax = np.argmax(y_pred_probas[i])\n if y_pred_probas[i][argmax] < confidence_threshold:\n continue\n\n y_test_filter.append(y_test[i])\n y_pred_filter.append(argmax)\n\n y_pred_filter = self.clf._le.inverse_transform(y_pred_filter)\n\n print(f'\\nConfidence threshold > {confidence_threshold} - {len(y_test_filter)} classified')\n print(metrics.confusion_matrix(y_test_filter, y_pred_filter, labels=class_names))\n print(classification_report_imbalanced(y_test_filter, y_pred_filter, labels=class_names))\n\n joblib.dump(self, self.__class__.__name__.lower())\n\n @staticmethod\n def load(model_file_name):\n return joblib.load(model_file_name)\n\n def overwrite_classes(self, bugs, classes, probabilities):\n return classes\n\n def classify(self, bugs, probabilities=False, importances=False):\n assert bugs is not None\n assert self.extraction_pipeline is not None and self.clf is not None, 'The module needs to be initialized first'\n\n if not isinstance(bugs, list):\n bugs = [bugs]\n\n assert isinstance(bugs[0], dict)\n\n X = self.extraction_pipeline.transform(bugs)\n if probabilities:\n classes = self.clf.predict_proba(X)\n else:\n classes = self.clf.predict(X)\n\n classes = self.overwrite_classes(bugs, classes, probabilities)\n\n if importances:\n explainer = shap.TreeExplainer(self.clf)\n shap_values = explainer.shap_values(X)\n\n shap_sums = shap_values.sum(0)\n abs_shap_sums = np.abs(shap_sums)\n rel_shap_sums = abs_shap_sums / abs_shap_sums.sum()\n indices = np.argsort(abs_shap_sums)[::-1]\n importances = [(index, shap_sums[index] > 0, rel_shap_sums[index]) for index in indices]\n\n return classes, importances\n\n return classes\n", "path": "bugbug/model.py"}]} | 1,982 | 131 |
gh_patches_debug_28868 | rasdani/github-patches | git_diff | streamlit__streamlit-2101 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Month dropdown on DateInput is very slow (~5s to appear)
# Summary
Month dropdown on DateInput is very slow (~5s to appear). See https://i.imgur.com/WkmHYag.gifv
# Steps to reproduce
What are the steps we should take to reproduce the bug:
1. Create `st.date_input('hi')`
2. Click on the date
3. Click on the month inside the date
## Expected behavior:
Month dropdown should appear almost instantly, with no user-perceived lag.
## Actual behavior:
Takes ~5 seconds
## Is this a regression?
Unsure. Does this date input dropdown use our virtualized selectbox?
# Debug info
- Streamlit version: develop
- Python version: 3.7.6
- Using Pipenv
- OS version: Windows 10
- Browser version: Chrome
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lib/streamlit/elements/time_widgets.py`
Content:
```
1 from datetime import datetime, date, time
2
3 from streamlit.proto.TimeInput_pb2 import TimeInput as TimeInputProto
4 from streamlit.proto.DateInput_pb2 import DateInput as DateInputProto
5 from streamlit.errors import StreamlitAPIException
6 from .utils import _get_widget_ui_value
7
8
9 class TimeWidgetsMixin:
10 def time_input(dg, label, value=None, key=None):
11 """Display a time input widget.
12
13 Parameters
14 ----------
15 label : str
16 A short label explaining to the user what this time input is for.
17 value : datetime.time/datetime.datetime
18 The value of this widget when it first renders. This will be
19 cast to str internally. Defaults to the current time.
20 key : str
21 An optional string to use as the unique key for the widget.
22 If this is omitted, a key will be generated for the widget
23 based on its content. Multiple widgets of the same type may
24 not share the same key.
25
26 Returns
27 -------
28 datetime.time
29 The current value of the time input widget.
30
31 Example
32 -------
33 >>> t = st.time_input('Set an alarm for', datetime.time(8, 45))
34 >>> st.write('Alarm is set for', t)
35
36 """
37 # Set value default.
38 if value is None:
39 value = datetime.now().time()
40
41 # Ensure that the value is either datetime/time
42 if not isinstance(value, datetime) and not isinstance(value, time):
43 raise StreamlitAPIException(
44 "The type of the value should be either datetime or time."
45 )
46
47 # Convert datetime to time
48 if isinstance(value, datetime):
49 value = value.time()
50
51 time_input_proto = TimeInputProto()
52 time_input_proto.label = label
53 time_input_proto.default = time.strftime(value, "%H:%M")
54
55 ui_value = _get_widget_ui_value("time_input", time_input_proto, user_key=key)
56 current_value = (
57 datetime.strptime(ui_value, "%H:%M").time()
58 if ui_value is not None
59 else value
60 )
61 return dg._enqueue("time_input", time_input_proto, current_value) # type: ignore
62
63 def date_input(
64 dg,
65 label,
66 value=None,
67 min_value=datetime.min,
68 max_value=None,
69 key=None,
70 ):
71 """Display a date input widget.
72
73 Parameters
74 ----------
75 label : str
76 A short label explaining to the user what this date input is for.
77 value : datetime.date or datetime.datetime or list/tuple of datetime.date or datetime.datetime or None
78 The value of this widget when it first renders. If a list/tuple with
79 0 to 2 date/datetime values is provided, the datepicker will allow
80 users to provide a range. Defaults to today as a single-date picker.
81 min_value : datetime.date or datetime.datetime
82 The minimum selectable date. Defaults to datetime.min.
83 max_value : datetime.date or datetime.datetime
84 The maximum selectable date. Defaults to today+10y.
85 key : str
86 An optional string to use as the unique key for the widget.
87 If this is omitted, a key will be generated for the widget
88 based on its content. Multiple widgets of the same type may
89 not share the same key.
90
91 Returns
92 -------
93 datetime.date
94 The current value of the date input widget.
95
96 Example
97 -------
98 >>> d = st.date_input(
99 ... "When\'s your birthday",
100 ... datetime.date(2019, 7, 6))
101 >>> st.write('Your birthday is:', d)
102
103 """
104 # Set value default.
105 if value is None:
106 value = datetime.now().date()
107
108 single_value = isinstance(value, (date, datetime))
109 range_value = isinstance(value, (list, tuple)) and len(value) in (0, 1, 2)
110 if not single_value and not range_value:
111 raise StreamlitAPIException(
112 "DateInput value should either be an date/datetime or a list/tuple of "
113 "0 - 2 date/datetime values"
114 )
115
116 if single_value:
117 value = [value]
118
119 date_input_proto = DateInputProto()
120 date_input_proto.is_range = range_value
121
122 value = [v.date() if isinstance(v, datetime) else v for v in value]
123
124 date_input_proto.label = label
125 date_input_proto.default[:] = [date.strftime(v, "%Y/%m/%d") for v in value]
126
127 if isinstance(min_value, datetime):
128 min_value = min_value.date()
129
130 date_input_proto.min = date.strftime(min_value, "%Y/%m/%d")
131
132 if max_value is None:
133 today = date.today()
134 max_value = date(today.year + 10, today.month, today.day)
135
136 if isinstance(max_value, datetime):
137 max_value = max_value.date()
138
139 date_input_proto.max = date.strftime(max_value, "%Y/%m/%d")
140
141 ui_value = _get_widget_ui_value("date_input", date_input_proto, user_key=key)
142
143 if ui_value is not None:
144 value = getattr(ui_value, "data")
145 value = [datetime.strptime(v, "%Y/%m/%d").date() for v in value]
146
147 return_value = value[0] if single_value else tuple(value)
148 return dg._enqueue("date_input", date_input_proto, return_value) # type: ignore
149
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lib/streamlit/elements/time_widgets.py b/lib/streamlit/elements/time_widgets.py
--- a/lib/streamlit/elements/time_widgets.py
+++ b/lib/streamlit/elements/time_widgets.py
@@ -61,12 +61,7 @@
return dg._enqueue("time_input", time_input_proto, current_value) # type: ignore
def date_input(
- dg,
- label,
- value=None,
- min_value=datetime.min,
- max_value=None,
- key=None,
+ dg, label, value=None, min_value=None, max_value=None, key=None,
):
"""Display a date input widget.
@@ -79,7 +74,7 @@
0 to 2 date/datetime values is provided, the datepicker will allow
users to provide a range. Defaults to today as a single-date picker.
min_value : datetime.date or datetime.datetime
- The minimum selectable date. Defaults to datetime.min.
+ The minimum selectable date. Defaults to today-10y.
max_value : datetime.date or datetime.datetime
The maximum selectable date. Defaults to today+10y.
key : str
@@ -126,6 +121,9 @@
if isinstance(min_value, datetime):
min_value = min_value.date()
+ elif min_value is None:
+ today = date.today()
+ min_value = date(today.year - 10, today.month, today.day)
date_input_proto.min = date.strftime(min_value, "%Y/%m/%d")
| {"golden_diff": "diff --git a/lib/streamlit/elements/time_widgets.py b/lib/streamlit/elements/time_widgets.py\n--- a/lib/streamlit/elements/time_widgets.py\n+++ b/lib/streamlit/elements/time_widgets.py\n@@ -61,12 +61,7 @@\n return dg._enqueue(\"time_input\", time_input_proto, current_value) # type: ignore\n \n def date_input(\n- dg,\n- label,\n- value=None,\n- min_value=datetime.min,\n- max_value=None,\n- key=None,\n+ dg, label, value=None, min_value=None, max_value=None, key=None,\n ):\n \"\"\"Display a date input widget.\n \n@@ -79,7 +74,7 @@\n 0 to 2 date/datetime values is provided, the datepicker will allow\n users to provide a range. Defaults to today as a single-date picker.\n min_value : datetime.date or datetime.datetime\n- The minimum selectable date. Defaults to datetime.min.\n+ The minimum selectable date. Defaults to today-10y.\n max_value : datetime.date or datetime.datetime\n The maximum selectable date. Defaults to today+10y.\n key : str\n@@ -126,6 +121,9 @@\n \n if isinstance(min_value, datetime):\n min_value = min_value.date()\n+ elif min_value is None:\n+ today = date.today()\n+ min_value = date(today.year - 10, today.month, today.day)\n \n date_input_proto.min = date.strftime(min_value, \"%Y/%m/%d\")\n", "issue": "Month dropdown on DateInput is very slow (~5s to appear)\n# Summary\r\n\r\nMonth dropdown on DateInput is very slow (~5s to appear). See https://i.imgur.com/WkmHYag.gifv\r\n\r\n# Steps to reproduce\r\n\r\nWhat are the steps we should take to reproduce the bug:\r\n\r\n1. Create `st.date_input('hi')`\r\n2. Click on the date\r\n3. Click on the month inside the date\r\n\r\n## Expected behavior:\r\n\r\nMonth dropdown should appear almost instantly, with no user-perceived lag.\r\n\r\n## Actual behavior:\r\n\r\nTakes ~5 seconds\r\n\r\n## Is this a regression?\r\n\r\nUnsure. Does this date input dropdown use our virtualized selectbox?\r\n\r\n# Debug info\r\n\r\n- Streamlit version: develop\r\n- Python version: 3.7.6\r\n- Using Pipenv\r\n- OS version: Windows 10\r\n- Browser version: Chrome\r\n\n", "before_files": [{"content": "from datetime import datetime, date, time\n\nfrom streamlit.proto.TimeInput_pb2 import TimeInput as TimeInputProto\nfrom streamlit.proto.DateInput_pb2 import DateInput as DateInputProto\nfrom streamlit.errors import StreamlitAPIException\nfrom .utils import _get_widget_ui_value\n\n\nclass TimeWidgetsMixin:\n def time_input(dg, label, value=None, key=None):\n \"\"\"Display a time input widget.\n\n Parameters\n ----------\n label : str\n A short label explaining to the user what this time input is for.\n value : datetime.time/datetime.datetime\n The value of this widget when it first renders. This will be\n cast to str internally. Defaults to the current time.\n key : str\n An optional string to use as the unique key for the widget.\n If this is omitted, a key will be generated for the widget\n based on its content. 
Multiple widgets of the same type may\n not share the same key.\n\n Returns\n -------\n datetime.time\n The current value of the time input widget.\n\n Example\n -------\n >>> t = st.time_input('Set an alarm for', datetime.time(8, 45))\n >>> st.write('Alarm is set for', t)\n\n \"\"\"\n # Set value default.\n if value is None:\n value = datetime.now().time()\n\n # Ensure that the value is either datetime/time\n if not isinstance(value, datetime) and not isinstance(value, time):\n raise StreamlitAPIException(\n \"The type of the value should be either datetime or time.\"\n )\n\n # Convert datetime to time\n if isinstance(value, datetime):\n value = value.time()\n\n time_input_proto = TimeInputProto()\n time_input_proto.label = label\n time_input_proto.default = time.strftime(value, \"%H:%M\")\n\n ui_value = _get_widget_ui_value(\"time_input\", time_input_proto, user_key=key)\n current_value = (\n datetime.strptime(ui_value, \"%H:%M\").time()\n if ui_value is not None\n else value\n )\n return dg._enqueue(\"time_input\", time_input_proto, current_value) # type: ignore\n\n def date_input(\n dg,\n label,\n value=None,\n min_value=datetime.min,\n max_value=None,\n key=None,\n ):\n \"\"\"Display a date input widget.\n\n Parameters\n ----------\n label : str\n A short label explaining to the user what this date input is for.\n value : datetime.date or datetime.datetime or list/tuple of datetime.date or datetime.datetime or None\n The value of this widget when it first renders. If a list/tuple with\n 0 to 2 date/datetime values is provided, the datepicker will allow\n users to provide a range. Defaults to today as a single-date picker.\n min_value : datetime.date or datetime.datetime\n The minimum selectable date. Defaults to datetime.min.\n max_value : datetime.date or datetime.datetime\n The maximum selectable date. Defaults to today+10y.\n key : str\n An optional string to use as the unique key for the widget.\n If this is omitted, a key will be generated for the widget\n based on its content. Multiple widgets of the same type may\n not share the same key.\n\n Returns\n -------\n datetime.date\n The current value of the date input widget.\n\n Example\n -------\n >>> d = st.date_input(\n ... \"When\\'s your birthday\",\n ... 
datetime.date(2019, 7, 6))\n >>> st.write('Your birthday is:', d)\n\n \"\"\"\n # Set value default.\n if value is None:\n value = datetime.now().date()\n\n single_value = isinstance(value, (date, datetime))\n range_value = isinstance(value, (list, tuple)) and len(value) in (0, 1, 2)\n if not single_value and not range_value:\n raise StreamlitAPIException(\n \"DateInput value should either be an date/datetime or a list/tuple of \"\n \"0 - 2 date/datetime values\"\n )\n\n if single_value:\n value = [value]\n\n date_input_proto = DateInputProto()\n date_input_proto.is_range = range_value\n\n value = [v.date() if isinstance(v, datetime) else v for v in value]\n\n date_input_proto.label = label\n date_input_proto.default[:] = [date.strftime(v, \"%Y/%m/%d\") for v in value]\n\n if isinstance(min_value, datetime):\n min_value = min_value.date()\n\n date_input_proto.min = date.strftime(min_value, \"%Y/%m/%d\")\n\n if max_value is None:\n today = date.today()\n max_value = date(today.year + 10, today.month, today.day)\n\n if isinstance(max_value, datetime):\n max_value = max_value.date()\n\n date_input_proto.max = date.strftime(max_value, \"%Y/%m/%d\")\n\n ui_value = _get_widget_ui_value(\"date_input\", date_input_proto, user_key=key)\n\n if ui_value is not None:\n value = getattr(ui_value, \"data\")\n value = [datetime.strptime(v, \"%Y/%m/%d\").date() for v in value]\n\n return_value = value[0] if single_value else tuple(value)\n return dg._enqueue(\"date_input\", date_input_proto, return_value) # type: ignore\n", "path": "lib/streamlit/elements/time_widgets.py"}], "after_files": [{"content": "from datetime import datetime, date, time\n\nfrom streamlit.proto.TimeInput_pb2 import TimeInput as TimeInputProto\nfrom streamlit.proto.DateInput_pb2 import DateInput as DateInputProto\nfrom streamlit.errors import StreamlitAPIException\nfrom .utils import _get_widget_ui_value\n\n\nclass TimeWidgetsMixin:\n def time_input(dg, label, value=None, key=None):\n \"\"\"Display a time input widget.\n\n Parameters\n ----------\n label : str\n A short label explaining to the user what this time input is for.\n value : datetime.time/datetime.datetime\n The value of this widget when it first renders. This will be\n cast to str internally. Defaults to the current time.\n key : str\n An optional string to use as the unique key for the widget.\n If this is omitted, a key will be generated for the widget\n based on its content. 
Multiple widgets of the same type may\n not share the same key.\n\n Returns\n -------\n datetime.time\n The current value of the time input widget.\n\n Example\n -------\n >>> t = st.time_input('Set an alarm for', datetime.time(8, 45))\n >>> st.write('Alarm is set for', t)\n\n \"\"\"\n # Set value default.\n if value is None:\n value = datetime.now().time()\n\n # Ensure that the value is either datetime/time\n if not isinstance(value, datetime) and not isinstance(value, time):\n raise StreamlitAPIException(\n \"The type of the value should be either datetime or time.\"\n )\n\n # Convert datetime to time\n if isinstance(value, datetime):\n value = value.time()\n\n time_input_proto = TimeInputProto()\n time_input_proto.label = label\n time_input_proto.default = time.strftime(value, \"%H:%M\")\n\n ui_value = _get_widget_ui_value(\"time_input\", time_input_proto, user_key=key)\n current_value = (\n datetime.strptime(ui_value, \"%H:%M\").time()\n if ui_value is not None\n else value\n )\n return dg._enqueue(\"time_input\", time_input_proto, current_value) # type: ignore\n\n def date_input(\n dg, label, value=None, min_value=None, max_value=None, key=None,\n ):\n \"\"\"Display a date input widget.\n\n Parameters\n ----------\n label : str\n A short label explaining to the user what this date input is for.\n value : datetime.date or datetime.datetime or list/tuple of datetime.date or datetime.datetime or None\n The value of this widget when it first renders. If a list/tuple with\n 0 to 2 date/datetime values is provided, the datepicker will allow\n users to provide a range. Defaults to today as a single-date picker.\n min_value : datetime.date or datetime.datetime\n The minimum selectable date. Defaults to today-10y.\n max_value : datetime.date or datetime.datetime\n The maximum selectable date. Defaults to today+10y.\n key : str\n An optional string to use as the unique key for the widget.\n If this is omitted, a key will be generated for the widget\n based on its content. Multiple widgets of the same type may\n not share the same key.\n\n Returns\n -------\n datetime.date\n The current value of the date input widget.\n\n Example\n -------\n >>> d = st.date_input(\n ... \"When\\'s your birthday\",\n ... 
datetime.date(2019, 7, 6))\n >>> st.write('Your birthday is:', d)\n\n \"\"\"\n # Set value default.\n if value is None:\n value = datetime.now().date()\n\n single_value = isinstance(value, (date, datetime))\n range_value = isinstance(value, (list, tuple)) and len(value) in (0, 1, 2)\n if not single_value and not range_value:\n raise StreamlitAPIException(\n \"DateInput value should either be an date/datetime or a list/tuple of \"\n \"0 - 2 date/datetime values\"\n )\n\n if single_value:\n value = [value]\n\n date_input_proto = DateInputProto()\n date_input_proto.is_range = range_value\n\n value = [v.date() if isinstance(v, datetime) else v for v in value]\n\n date_input_proto.label = label\n date_input_proto.default[:] = [date.strftime(v, \"%Y/%m/%d\") for v in value]\n\n if isinstance(min_value, datetime):\n min_value = min_value.date()\n elif min_value is None:\n today = date.today()\n min_value = date(today.year - 10, today.month, today.day)\n\n date_input_proto.min = date.strftime(min_value, \"%Y/%m/%d\")\n\n if max_value is None:\n today = date.today()\n max_value = date(today.year + 10, today.month, today.day)\n\n if isinstance(max_value, datetime):\n max_value = max_value.date()\n\n date_input_proto.max = date.strftime(max_value, \"%Y/%m/%d\")\n\n ui_value = _get_widget_ui_value(\"date_input\", date_input_proto, user_key=key)\n\n if ui_value is not None:\n value = getattr(ui_value, \"data\")\n value = [datetime.strptime(v, \"%Y/%m/%d\").date() for v in value]\n\n return_value = value[0] if single_value else tuple(value)\n return dg._enqueue(\"date_input\", date_input_proto, return_value) # type: ignore\n", "path": "lib/streamlit/elements/time_widgets.py"}]} | 1,971 | 343 |
gh_patches_debug_40235 | rasdani/github-patches | git_diff | qutebrowser__qutebrowser-3385 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Completion: match terms in any order
[This regex](https://github.com/The-Compiler/qutebrowser/blob/master/qutebrowser/completion/models/sortfilter.py#L75), which is used to filter items for completion, constrains the match to the same order the terms are supplied by the user. I.e., "browser qute" does not match "qutebrowser". According to [this post](http://stackoverflow.com/a/18944925/4180822), this issue can be solved by constructing a better regular expression using positive lookaheads. I think it's worthwhile to examine the performance impact of the solution, or find a different one.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `qutebrowser/completion/models/histcategory.py`
Content:
```
1 # vim: ft=python fileencoding=utf-8 sts=4 sw=4 et:
2
3 # Copyright 2017 Ryan Roden-Corrent (rcorre) <[email protected]>
4 #
5 # This file is part of qutebrowser.
6 #
7 # qutebrowser is free software: you can redistribute it and/or modify
8 # it under the terms of the GNU General Public License as published by
9 # the Free Software Foundation, either version 3 of the License, or
10 # (at your option) any later version.
11 #
12 # qutebrowser is distributed in the hope that it will be useful,
13 # but WITHOUT ANY WARRANTY; without even the implied warranty of
14 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15 # GNU General Public License for more details.
16 #
17 # You should have received a copy of the GNU General Public License
18 # along with qutebrowser. If not, see <http://www.gnu.org/licenses/>.
19
20 """A completion category that queries the SQL History store."""
21
22 import re
23
24 from PyQt5.QtSql import QSqlQueryModel
25
26 from qutebrowser.misc import sql
27 from qutebrowser.utils import debug
28 from qutebrowser.config import config
29
30
31 class HistoryCategory(QSqlQueryModel):
32
33 """A completion category that queries the SQL History store."""
34
35 def __init__(self, *, delete_func=None, parent=None):
36 """Create a new History completion category."""
37 super().__init__(parent=parent)
38 self.name = "History"
39
40 # replace ' in timestamp-format to avoid breaking the query
41 timestamp_format = config.val.completion.timestamp_format
42 timefmt = ("strftime('{}', last_atime, 'unixepoch', 'localtime')"
43 .format(timestamp_format.replace("'", "`")))
44
45 self._query = sql.Query(' '.join([
46 "SELECT url, title, {}".format(timefmt),
47 "FROM CompletionHistory",
48 # the incoming pattern will have literal % and _ escaped with '\'
49 # we need to tell sql to treat '\' as an escape character
50 "WHERE ((url || title) LIKE :pat escape '\\')",
51 self._atime_expr(),
52 "ORDER BY last_atime DESC",
53 ]), forward_only=False)
54
55 # advertise that this model filters by URL and title
56 self.columns_to_filter = [0, 1]
57 self.delete_func = delete_func
58
59 def _atime_expr(self):
60 """If max_items is set, return an expression to limit the query."""
61 max_items = config.val.completion.web_history_max_items
62 # HistoryCategory should not be added to the completion in that case.
63 assert max_items != 0
64
65 if max_items < 0:
66 return ''
67
68 min_atime = sql.Query(' '.join([
69 'SELECT min(last_atime) FROM',
70 '(SELECT last_atime FROM CompletionHistory',
71 'ORDER BY last_atime DESC LIMIT :limit)',
72 ])).run(limit=max_items).value()
73
74 if not min_atime:
75 # if there are no history items, min_atime may be '' (issue #2849)
76 return ''
77
78 return "AND last_atime >= {}".format(min_atime)
79
80 def set_pattern(self, pattern):
81 """Set the pattern used to filter results.
82
83 Args:
84 pattern: string pattern to filter by.
85 """
86 # escape to treat a user input % or _ as a literal, not a wildcard
87 pattern = pattern.replace('%', '\\%')
88 pattern = pattern.replace('_', '\\_')
89 # treat spaces as wildcards to match any of the typed words
90 pattern = re.sub(r' +', '%', pattern)
91 pattern = '%{}%'.format(pattern)
92 with debug.log_time('sql', 'Running completion query'):
93 self._query.run(pat=pattern)
94 self.setQuery(self._query)
95
96 def removeRows(self, row, _count, _parent=None):
97 """Override QAbstractItemModel::removeRows to re-run sql query."""
98 # re-run query to reload updated table
99 with debug.log_time('sql', 'Re-running completion query post-delete'):
100 self._query.run()
101 self.setQuery(self._query)
102 while self.rowCount() < row:
103 self.fetchMore()
104 return True
105
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/qutebrowser/completion/models/histcategory.py b/qutebrowser/completion/models/histcategory.py
--- a/qutebrowser/completion/models/histcategory.py
+++ b/qutebrowser/completion/models/histcategory.py
@@ -19,8 +19,6 @@
"""A completion category that queries the SQL History store."""
-import re
-
from PyQt5.QtSql import QSqlQueryModel
from qutebrowser.misc import sql
@@ -36,21 +34,7 @@
"""Create a new History completion category."""
super().__init__(parent=parent)
self.name = "History"
-
- # replace ' in timestamp-format to avoid breaking the query
- timestamp_format = config.val.completion.timestamp_format
- timefmt = ("strftime('{}', last_atime, 'unixepoch', 'localtime')"
- .format(timestamp_format.replace("'", "`")))
-
- self._query = sql.Query(' '.join([
- "SELECT url, title, {}".format(timefmt),
- "FROM CompletionHistory",
- # the incoming pattern will have literal % and _ escaped with '\'
- # we need to tell sql to treat '\' as an escape character
- "WHERE ((url || title) LIKE :pat escape '\\')",
- self._atime_expr(),
- "ORDER BY last_atime DESC",
- ]), forward_only=False)
+ self._query = None
# advertise that this model filters by URL and title
self.columns_to_filter = [0, 1]
@@ -86,11 +70,36 @@
# escape to treat a user input % or _ as a literal, not a wildcard
pattern = pattern.replace('%', '\\%')
pattern = pattern.replace('_', '\\_')
- # treat spaces as wildcards to match any of the typed words
- pattern = re.sub(r' +', '%', pattern)
- pattern = '%{}%'.format(pattern)
+ words = ['%{}%'.format(w) for w in pattern.split(' ')]
+
+ # build a where clause to match all of the words in any order
+ # given the search term "a b", the WHERE clause would be:
+ # ((url || title) LIKE '%a%') AND ((url || title) LIKE '%b%')
+ where_clause = ' AND '.join(
+ "(url || title) LIKE :{} escape '\\'".format(i)
+ for i in range(len(words)))
+
+ # replace ' in timestamp-format to avoid breaking the query
+ timestamp_format = config.val.completion.timestamp_format
+ timefmt = ("strftime('{}', last_atime, 'unixepoch', 'localtime')"
+ .format(timestamp_format.replace("'", "`")))
+
+ if not self._query or len(words) != len(self._query.boundValues()):
+ # if the number of words changed, we need to generate a new query
+ # otherwise, we can reuse the prepared query for performance
+ self._query = sql.Query(' '.join([
+ "SELECT url, title, {}".format(timefmt),
+ "FROM CompletionHistory",
+ # the incoming pattern will have literal % and _ escaped
+ # we need to tell sql to treat '\' as an escape character
+ 'WHERE ({})'.format(where_clause),
+ self._atime_expr(),
+ "ORDER BY last_atime DESC",
+ ]), forward_only=False)
+
with debug.log_time('sql', 'Running completion query'):
- self._query.run(pat=pattern)
+ self._query.run(**{
+ str(i): w for i, w in enumerate(words)})
self.setQuery(self._query)
def removeRows(self, row, _count, _parent=None):
| {"golden_diff": "diff --git a/qutebrowser/completion/models/histcategory.py b/qutebrowser/completion/models/histcategory.py\n--- a/qutebrowser/completion/models/histcategory.py\n+++ b/qutebrowser/completion/models/histcategory.py\n@@ -19,8 +19,6 @@\n \n \"\"\"A completion category that queries the SQL History store.\"\"\"\n \n-import re\n-\n from PyQt5.QtSql import QSqlQueryModel\n \n from qutebrowser.misc import sql\n@@ -36,21 +34,7 @@\n \"\"\"Create a new History completion category.\"\"\"\n super().__init__(parent=parent)\n self.name = \"History\"\n-\n- # replace ' in timestamp-format to avoid breaking the query\n- timestamp_format = config.val.completion.timestamp_format\n- timefmt = (\"strftime('{}', last_atime, 'unixepoch', 'localtime')\"\n- .format(timestamp_format.replace(\"'\", \"`\")))\n-\n- self._query = sql.Query(' '.join([\n- \"SELECT url, title, {}\".format(timefmt),\n- \"FROM CompletionHistory\",\n- # the incoming pattern will have literal % and _ escaped with '\\'\n- # we need to tell sql to treat '\\' as an escape character\n- \"WHERE ((url || title) LIKE :pat escape '\\\\')\",\n- self._atime_expr(),\n- \"ORDER BY last_atime DESC\",\n- ]), forward_only=False)\n+ self._query = None\n \n # advertise that this model filters by URL and title\n self.columns_to_filter = [0, 1]\n@@ -86,11 +70,36 @@\n # escape to treat a user input % or _ as a literal, not a wildcard\n pattern = pattern.replace('%', '\\\\%')\n pattern = pattern.replace('_', '\\\\_')\n- # treat spaces as wildcards to match any of the typed words\n- pattern = re.sub(r' +', '%', pattern)\n- pattern = '%{}%'.format(pattern)\n+ words = ['%{}%'.format(w) for w in pattern.split(' ')]\n+\n+ # build a where clause to match all of the words in any order\n+ # given the search term \"a b\", the WHERE clause would be:\n+ # ((url || title) LIKE '%a%') AND ((url || title) LIKE '%b%')\n+ where_clause = ' AND '.join(\n+ \"(url || title) LIKE :{} escape '\\\\'\".format(i)\n+ for i in range(len(words)))\n+\n+ # replace ' in timestamp-format to avoid breaking the query\n+ timestamp_format = config.val.completion.timestamp_format\n+ timefmt = (\"strftime('{}', last_atime, 'unixepoch', 'localtime')\"\n+ .format(timestamp_format.replace(\"'\", \"`\")))\n+\n+ if not self._query or len(words) != len(self._query.boundValues()):\n+ # if the number of words changed, we need to generate a new query\n+ # otherwise, we can reuse the prepared query for performance\n+ self._query = sql.Query(' '.join([\n+ \"SELECT url, title, {}\".format(timefmt),\n+ \"FROM CompletionHistory\",\n+ # the incoming pattern will have literal % and _ escaped\n+ # we need to tell sql to treat '\\' as an escape character\n+ 'WHERE ({})'.format(where_clause),\n+ self._atime_expr(),\n+ \"ORDER BY last_atime DESC\",\n+ ]), forward_only=False)\n+\n with debug.log_time('sql', 'Running completion query'):\n- self._query.run(pat=pattern)\n+ self._query.run(**{\n+ str(i): w for i, w in enumerate(words)})\n self.setQuery(self._query)\n \n def removeRows(self, row, _count, _parent=None):\n", "issue": "Completion: match terms in any order\n[This regex](https://github.com/The-Compiler/qutebrowser/blob/master/qutebrowser/completion/models/sortfilter.py#L75), which is used to filter items for completion, constrains the match to the same order the terms are supplied by the user. I.e., \"browser qute\" does not match \"qutebrowser\". 
According to [this post](http://stackoverflow.com/a/18944925/4180822), this issue can be solved by constructing better regular expression using positive lookaheads. I think it's worthwhile to examine the performance impact of the solution, or find a different one.\n\n", "before_files": [{"content": "# vim: ft=python fileencoding=utf-8 sts=4 sw=4 et:\n\n# Copyright 2017 Ryan Roden-Corrent (rcorre) <[email protected]>\n#\n# This file is part of qutebrowser.\n#\n# qutebrowser is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# qutebrowser is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with qutebrowser. If not, see <http://www.gnu.org/licenses/>.\n\n\"\"\"A completion category that queries the SQL History store.\"\"\"\n\nimport re\n\nfrom PyQt5.QtSql import QSqlQueryModel\n\nfrom qutebrowser.misc import sql\nfrom qutebrowser.utils import debug\nfrom qutebrowser.config import config\n\n\nclass HistoryCategory(QSqlQueryModel):\n\n \"\"\"A completion category that queries the SQL History store.\"\"\"\n\n def __init__(self, *, delete_func=None, parent=None):\n \"\"\"Create a new History completion category.\"\"\"\n super().__init__(parent=parent)\n self.name = \"History\"\n\n # replace ' in timestamp-format to avoid breaking the query\n timestamp_format = config.val.completion.timestamp_format\n timefmt = (\"strftime('{}', last_atime, 'unixepoch', 'localtime')\"\n .format(timestamp_format.replace(\"'\", \"`\")))\n\n self._query = sql.Query(' '.join([\n \"SELECT url, title, {}\".format(timefmt),\n \"FROM CompletionHistory\",\n # the incoming pattern will have literal % and _ escaped with '\\'\n # we need to tell sql to treat '\\' as an escape character\n \"WHERE ((url || title) LIKE :pat escape '\\\\')\",\n self._atime_expr(),\n \"ORDER BY last_atime DESC\",\n ]), forward_only=False)\n\n # advertise that this model filters by URL and title\n self.columns_to_filter = [0, 1]\n self.delete_func = delete_func\n\n def _atime_expr(self):\n \"\"\"If max_items is set, return an expression to limit the query.\"\"\"\n max_items = config.val.completion.web_history_max_items\n # HistoryCategory should not be added to the completion in that case.\n assert max_items != 0\n\n if max_items < 0:\n return ''\n\n min_atime = sql.Query(' '.join([\n 'SELECT min(last_atime) FROM',\n '(SELECT last_atime FROM CompletionHistory',\n 'ORDER BY last_atime DESC LIMIT :limit)',\n ])).run(limit=max_items).value()\n\n if not min_atime:\n # if there are no history items, min_atime may be '' (issue #2849)\n return ''\n\n return \"AND last_atime >= {}\".format(min_atime)\n\n def set_pattern(self, pattern):\n \"\"\"Set the pattern used to filter results.\n\n Args:\n pattern: string pattern to filter by.\n \"\"\"\n # escape to treat a user input % or _ as a literal, not a wildcard\n pattern = pattern.replace('%', '\\\\%')\n pattern = pattern.replace('_', '\\\\_')\n # treat spaces as wildcards to match any of the typed words\n pattern = re.sub(r' +', '%', pattern)\n pattern = '%{}%'.format(pattern)\n with debug.log_time('sql', 'Running completion query'):\n self._query.run(pat=pattern)\n 
self.setQuery(self._query)\n\n def removeRows(self, row, _count, _parent=None):\n \"\"\"Override QAbstractItemModel::removeRows to re-run sql query.\"\"\"\n # re-run query to reload updated table\n with debug.log_time('sql', 'Re-running completion query post-delete'):\n self._query.run()\n self.setQuery(self._query)\n while self.rowCount() < row:\n self.fetchMore()\n return True\n", "path": "qutebrowser/completion/models/histcategory.py"}], "after_files": [{"content": "# vim: ft=python fileencoding=utf-8 sts=4 sw=4 et:\n\n# Copyright 2017 Ryan Roden-Corrent (rcorre) <[email protected]>\n#\n# This file is part of qutebrowser.\n#\n# qutebrowser is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# qutebrowser is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with qutebrowser. If not, see <http://www.gnu.org/licenses/>.\n\n\"\"\"A completion category that queries the SQL History store.\"\"\"\n\nfrom PyQt5.QtSql import QSqlQueryModel\n\nfrom qutebrowser.misc import sql\nfrom qutebrowser.utils import debug\nfrom qutebrowser.config import config\n\n\nclass HistoryCategory(QSqlQueryModel):\n\n \"\"\"A completion category that queries the SQL History store.\"\"\"\n\n def __init__(self, *, delete_func=None, parent=None):\n \"\"\"Create a new History completion category.\"\"\"\n super().__init__(parent=parent)\n self.name = \"History\"\n self._query = None\n\n # advertise that this model filters by URL and title\n self.columns_to_filter = [0, 1]\n self.delete_func = delete_func\n\n def _atime_expr(self):\n \"\"\"If max_items is set, return an expression to limit the query.\"\"\"\n max_items = config.val.completion.web_history_max_items\n # HistoryCategory should not be added to the completion in that case.\n assert max_items != 0\n\n if max_items < 0:\n return ''\n\n min_atime = sql.Query(' '.join([\n 'SELECT min(last_atime) FROM',\n '(SELECT last_atime FROM CompletionHistory',\n 'ORDER BY last_atime DESC LIMIT :limit)',\n ])).run(limit=max_items).value()\n\n if not min_atime:\n # if there are no history items, min_atime may be '' (issue #2849)\n return ''\n\n return \"AND last_atime >= {}\".format(min_atime)\n\n def set_pattern(self, pattern):\n \"\"\"Set the pattern used to filter results.\n\n Args:\n pattern: string pattern to filter by.\n \"\"\"\n # escape to treat a user input % or _ as a literal, not a wildcard\n pattern = pattern.replace('%', '\\\\%')\n pattern = pattern.replace('_', '\\\\_')\n words = ['%{}%'.format(w) for w in pattern.split(' ')]\n\n # build a where clause to match all of the words in any order\n # given the search term \"a b\", the WHERE clause would be:\n # ((url || title) LIKE '%a%') AND ((url || title) LIKE '%b%')\n where_clause = ' AND '.join(\n \"(url || title) LIKE :{} escape '\\\\'\".format(i)\n for i in range(len(words)))\n\n # replace ' in timestamp-format to avoid breaking the query\n timestamp_format = config.val.completion.timestamp_format\n timefmt = (\"strftime('{}', last_atime, 'unixepoch', 'localtime')\"\n .format(timestamp_format.replace(\"'\", \"`\")))\n\n if not self._query or len(words) != 
len(self._query.boundValues()):\n # if the number of words changed, we need to generate a new query\n # otherwise, we can reuse the prepared query for performance\n self._query = sql.Query(' '.join([\n \"SELECT url, title, {}\".format(timefmt),\n \"FROM CompletionHistory\",\n # the incoming pattern will have literal % and _ escaped\n # we need to tell sql to treat '\\' as an escape character\n 'WHERE ({})'.format(where_clause),\n self._atime_expr(),\n \"ORDER BY last_atime DESC\",\n ]), forward_only=False)\n\n with debug.log_time('sql', 'Running completion query'):\n self._query.run(**{\n str(i): w for i, w in enumerate(words)})\n self.setQuery(self._query)\n\n def removeRows(self, row, _count, _parent=None):\n \"\"\"Override QAbstractItemModel::removeRows to re-run sql query.\"\"\"\n # re-run query to reload updated table\n with debug.log_time('sql', 'Re-running completion query post-delete'):\n self._query.run()\n self.setQuery(self._query)\n while self.rowCount() < row:\n self.fetchMore()\n return True\n", "path": "qutebrowser/completion/models/histcategory.py"}]} | 1,521 | 821 |
gh_patches_debug_5403 | rasdani/github-patches | git_diff | ivy-llc__ivy-18003 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
fft2
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/frontends/scipy/fft/fft.py`
Content:
```
1 # global
2 import ivy
3 from ivy.functional.frontends.scipy.func_wrapper import (
4 to_ivy_arrays_and_back,
5 )
6
7
8 # fft
9 @to_ivy_arrays_and_back
10 def fft(x, n=None, axis=-1, norm="backward", overwrite_x=False):
11 return ivy.fft(x, axis, norm=norm, n=n)
12
13
14 # ifft
15 @to_ivy_arrays_and_back
16 def ifft(x, n=None, axis=-1, norm="backward", overwrite_x=False):
17 return ivy.ifft(x, axis, norm=norm, n=n)
18
19
20 # dct
21 @to_ivy_arrays_and_back
22 def dct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):
23 return ivy.dct(x, type=type, n=n, axis=axis, norm=norm)
24
25
26 # idct
27 @to_ivy_arrays_and_back
28 def idct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):
29 inverse_type = {1: 1, 2: 3, 3: 2, 4: 4}[type]
30 return ivy.dct(x, type=inverse_type, n=n, axis=axis, norm=norm)
31
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ivy/functional/frontends/scipy/fft/fft.py b/ivy/functional/frontends/scipy/fft/fft.py
--- a/ivy/functional/frontends/scipy/fft/fft.py
+++ b/ivy/functional/frontends/scipy/fft/fft.py
@@ -28,3 +28,8 @@
def idct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):
inverse_type = {1: 1, 2: 3, 3: 2, 4: 4}[type]
return ivy.dct(x, type=inverse_type, n=n, axis=axis, norm=norm)
+
+
+@to_ivy_arrays_and_back
+def fft2(x, s=None, axes=(-2, -1), norm=None, overwrite_x=False):
+ return ivy.fft2(x, s=s, dim=axes, norm=norm)
| {"golden_diff": "diff --git a/ivy/functional/frontends/scipy/fft/fft.py b/ivy/functional/frontends/scipy/fft/fft.py\n--- a/ivy/functional/frontends/scipy/fft/fft.py\n+++ b/ivy/functional/frontends/scipy/fft/fft.py\n@@ -28,3 +28,8 @@\n def idct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):\n inverse_type = {1: 1, 2: 3, 3: 2, 4: 4}[type]\n return ivy.dct(x, type=inverse_type, n=n, axis=axis, norm=norm)\n+\n+\n+@to_ivy_arrays_and_back\n+def fft2(x, s=None, axes=(-2, -1), norm=None, overwrite_x=False):\n+ return ivy.fft2(x, s=s, dim=axes, norm=norm)\n", "issue": "fft2\n\n", "before_files": [{"content": "# global\nimport ivy\nfrom ivy.functional.frontends.scipy.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n# fft\n@to_ivy_arrays_and_back\ndef fft(x, n=None, axis=-1, norm=\"backward\", overwrite_x=False):\n return ivy.fft(x, axis, norm=norm, n=n)\n\n\n# ifft\n@to_ivy_arrays_and_back\ndef ifft(x, n=None, axis=-1, norm=\"backward\", overwrite_x=False):\n return ivy.ifft(x, axis, norm=norm, n=n)\n\n\n# dct\n@to_ivy_arrays_and_back\ndef dct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):\n return ivy.dct(x, type=type, n=n, axis=axis, norm=norm)\n\n\n# idct\n@to_ivy_arrays_and_back\ndef idct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):\n inverse_type = {1: 1, 2: 3, 3: 2, 4: 4}[type]\n return ivy.dct(x, type=inverse_type, n=n, axis=axis, norm=norm)\n", "path": "ivy/functional/frontends/scipy/fft/fft.py"}], "after_files": [{"content": "# global\nimport ivy\nfrom ivy.functional.frontends.scipy.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n# fft\n@to_ivy_arrays_and_back\ndef fft(x, n=None, axis=-1, norm=\"backward\", overwrite_x=False):\n return ivy.fft(x, axis, norm=norm, n=n)\n\n\n# ifft\n@to_ivy_arrays_and_back\ndef ifft(x, n=None, axis=-1, norm=\"backward\", overwrite_x=False):\n return ivy.ifft(x, axis, norm=norm, n=n)\n\n\n# dct\n@to_ivy_arrays_and_back\ndef dct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):\n return ivy.dct(x, type=type, n=n, axis=axis, norm=norm)\n\n\n# idct\n@to_ivy_arrays_and_back\ndef idct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):\n inverse_type = {1: 1, 2: 3, 3: 2, 4: 4}[type]\n return ivy.dct(x, type=inverse_type, n=n, axis=axis, norm=norm)\n\n\n@to_ivy_arrays_and_back\ndef fft2(x, s=None, axes=(-2, -1), norm=None, overwrite_x=False):\n return ivy.fft2(x, s=s, dim=axes, norm=norm)\n", "path": "ivy/functional/frontends/scipy/fft/fft.py"}]} | 606 | 211 |
gh_patches_debug_6100 | rasdani/github-patches | git_diff | conan-io__conan-center-index-2011 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[bug] Some system packages fail to install (xorg/system, opengl/system)
<!--
Please don't forget to update the issue title.
Include all applicable information to help us reproduce your problem.
To help us debug your issue please explain:
--> When configuring a CMake project template that used Conan, I encountered an error while configuring CMake:
`opengl/system: ERROR: while executing system_requirements(): 'SysConfigOpenGLConan' object has no attribute 'warn'`. After attempting a manual installation with `conan install opengl/system@` I still have the same problem and output. I have no idea what could be causing this, and after a few hours of googling I failed to find anything that would explain this. I also get the exact same output when installing `xorg/system`.
### Environment Details (include every applicable attribute)
* Operating System+version: Pop!_OS 20.04 LTS x86_64
* Compiler+version: gcc (Ubuntu 9.3.0-10ubuntu2) 9.3.0
* Conan version: Conan version 1.26.0
* Python version: Python 2.7.18rc1 AND Python 3.8.2
### Steps to reproduce (Include if Applicable)
`conan install opengl/system@`
`conan install xorg/system@`
### Logs (Executed commands with output) (Include/Attach if Applicable)
```
Configuration:
[settings]
arch=x86_64
arch_build=x86_64
build_type=Release
os=Linux
os_build=Linux
[options]
[build_requires]
[env]
Installing package: opengl/system
Requirements
opengl/system from 'conan-center' - Cache
Packages
opengl/system:5ab84d6acfe1f23c4fae0ab88f26e3a396351ac9 - Cache
Installing (downloading, building) binaries...
opengl/system: ERROR: while executing system_requirements(): 'SysConfigOpenGLConan' object has no attribute 'warn'
ERROR: Error in system requirements
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `recipes/opengl/all/conanfile.py`
Content:
```
1 from conans import ConanFile, tools
2 from conans.errors import ConanException
3 import os
4
5
6 class SysConfigOpenGLConan(ConanFile):
7 name = "opengl"
8 version = "system"
9 description = "cross-platform virtual conan package for the OpenGL support"
10 topics = ("conan", "opengl", "gl")
11 url = "https://github.com/conan-io/conan-center-index"
12 homepage = "https://www.opengl.org/"
13 license = "MIT"
14 settings = ("os",)
15
16 def package_id(self):
17 self.info.header_only()
18
19 def _fill_cppinfo_from_pkgconfig(self, name):
20 pkg_config = tools.PkgConfig(name)
21 if not pkg_config.provides:
22 raise ConanException("OpenGL development files aren't available, give up")
23 libs = [lib[2:] for lib in pkg_config.libs_only_l]
24 lib_dirs = [lib[2:] for lib in pkg_config.libs_only_L]
25 ldflags = [flag for flag in pkg_config.libs_only_other]
26 include_dirs = [include[2:] for include in pkg_config.cflags_only_I]
27 cflags = [flag for flag in pkg_config.cflags_only_other if not flag.startswith("-D")]
28 defines = [flag[2:] for flag in pkg_config.cflags_only_other if flag.startswith("-D")]
29
30 self.cpp_info.system_libs.extend(libs)
31 self.cpp_info.libdirs.extend(lib_dirs)
32 self.cpp_info.sharedlinkflags.extend(ldflags)
33 self.cpp_info.exelinkflags.extend(ldflags)
34 self.cpp_info.defines.extend(defines)
35 self.cpp_info.includedirs.extend(include_dirs)
36 self.cpp_info.cflags.extend(cflags)
37 self.cpp_info.cxxflags.extend(cflags)
38
39 def system_requirements(self):
40 if tools.os_info.is_linux and self.settings.os == "Linux":
41 package_tool = tools.SystemPackageTool(conanfile=self, default_mode='verify')
42 if tools.os_info.with_yum:
43 if tools.os_info.linux_distro == "fedora" and tools.os_info.os_version >= "32":
44 packages = ["libglvnd-devel"]
45 else:
46 packages = ["mesa-libGL-devel"]
47 elif tools.os_info.with_apt:
48 ubuntu_20_or_later = tools.os_info.linux_distro == "ubuntu" and tools.os_info.os_version >= "20"
49 debian_11_or_later = tools.os_info.linux_distro == "debian" and tools.os_info.os_version >= "11"
50 if ubuntu_20_or_later or debian_11_or_later:
51 packages = ["libgl-dev"]
52 else:
53 packages = ["libgl1-mesa-dev"]
54 elif tools.os_info.with_pacman:
55 packages = ["libglvnd"]
56 elif tools.os_info.with_zypper:
57 packages = ["Mesa-libGL-devel"]
58 else:
59 self.warn("don't know how to install OpenGL for your distro")
60 package_tool.install(update=True, packages=packages)
61
62 def package_info(self):
63 if self.settings.os == "Macos":
64 self.cpp_info.defines.append("GL_SILENCE_DEPRECATION=1")
65 self.cpp_info.frameworks.append("OpenGL")
66 elif self.settings.os == "Windows":
67 self.cpp_info.system_libs = ["OpenGL32"]
68 elif self.settings.os == "Linux":
69 self._fill_cppinfo_from_pkgconfig('gl')
70
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/recipes/opengl/all/conanfile.py b/recipes/opengl/all/conanfile.py
--- a/recipes/opengl/all/conanfile.py
+++ b/recipes/opengl/all/conanfile.py
@@ -56,7 +56,7 @@
elif tools.os_info.with_zypper:
packages = ["Mesa-libGL-devel"]
else:
- self.warn("don't know how to install OpenGL for your distro")
+ self.output.warn("Don't know how to install OpenGL for your distro.")
package_tool.install(update=True, packages=packages)
def package_info(self):
| {"golden_diff": "diff --git a/recipes/opengl/all/conanfile.py b/recipes/opengl/all/conanfile.py\n--- a/recipes/opengl/all/conanfile.py\n+++ b/recipes/opengl/all/conanfile.py\n@@ -56,7 +56,7 @@\n elif tools.os_info.with_zypper:\n packages = [\"Mesa-libGL-devel\"]\n else:\n- self.warn(\"don't know how to install OpenGL for your distro\")\n+ self.output.warn(\"Don't know how to install OpenGL for your distro.\")\n package_tool.install(update=True, packages=packages)\n \n def package_info(self):\n", "issue": "[bug] Some system packages fail to install (xorg/system, opengl/system)\n<!--\r\n Please don't forget to update the issue title.\r\n Include all applicable information to help us reproduce your problem.\r\n\r\n To help us debug your issue please explain:\r\n--> When configuring a cmake project template that used conan i encountered an error while configuring cmake : \r\n`opengl/system: ERROR: while executing system_requirements(): 'SysConfigOpenGLConan' object has no attribute 'warn'`. After attemping a manual installation with `conan install opengl/system@` i still have the same problem and output. I have no idea what could be causing this and after a few hours of googling i failed to find anything that would explain this. I also get the exact same output when installing `xorg/system`\r\n\r\n### Environment Details (include every applicable attribute)\r\n * Operating System+version: Pop!_OS 20.04 LTS x86_64\r\n * Compiler+version: gcc (Ubuntu 9.3.0-10ubuntu2) 9.3.0\r\n * Conan version: Conan version 1.26.0\r\n * Python version: Python 2.7.18rc1 AND Python 3.8.2\r\n\r\n### Steps to reproduce (Include if Applicable)\r\n`conan install opengl/system@`\r\n`conan install xorg/system@`\r\n\r\n### Logs (Executed commands with output) (Include/Attach if Applicable)\r\n\r\n```\r\nConfiguration:\r\n[settings]\r\narch=x86_64\r\narch_build=x86_64\r\nbuild_type=Release\r\nos=Linux\r\nos_build=Linux\r\n[options]\r\n[build_requires]\r\n[env]\r\n\r\nInstalling package: opengl/system\r\nRequirements\r\n opengl/system from 'conan-center' - Cache\r\nPackages\r\n opengl/system:5ab84d6acfe1f23c4fae0ab88f26e3a396351ac9 - Cache\r\n\r\nInstalling (downloading, building) binaries...\r\nopengl/system: ERROR: while executing system_requirements(): 'SysConfigOpenGLConan' object has no attribute 'warn'\r\nERROR: Error in system requirements\r\n```\r\n\n", "before_files": [{"content": "from conans import ConanFile, tools\nfrom conans.errors import ConanException\nimport os\n\n\nclass SysConfigOpenGLConan(ConanFile):\n name = \"opengl\"\n version = \"system\"\n description = \"cross-platform virtual conan package for the OpenGL support\"\n topics = (\"conan\", \"opengl\", \"gl\")\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://www.opengl.org/\"\n license = \"MIT\"\n settings = (\"os\",)\n\n def package_id(self):\n self.info.header_only()\n\n def _fill_cppinfo_from_pkgconfig(self, name):\n pkg_config = tools.PkgConfig(name)\n if not pkg_config.provides:\n raise ConanException(\"OpenGL development files aren't available, give up\")\n libs = [lib[2:] for lib in pkg_config.libs_only_l]\n lib_dirs = [lib[2:] for lib in pkg_config.libs_only_L]\n ldflags = [flag for flag in pkg_config.libs_only_other]\n include_dirs = [include[2:] for include in pkg_config.cflags_only_I]\n cflags = [flag for flag in pkg_config.cflags_only_other if not flag.startswith(\"-D\")]\n defines = [flag[2:] for flag in pkg_config.cflags_only_other if flag.startswith(\"-D\")]\n\n 
self.cpp_info.system_libs.extend(libs)\n self.cpp_info.libdirs.extend(lib_dirs)\n self.cpp_info.sharedlinkflags.extend(ldflags)\n self.cpp_info.exelinkflags.extend(ldflags)\n self.cpp_info.defines.extend(defines)\n self.cpp_info.includedirs.extend(include_dirs)\n self.cpp_info.cflags.extend(cflags)\n self.cpp_info.cxxflags.extend(cflags)\n\n def system_requirements(self):\n if tools.os_info.is_linux and self.settings.os == \"Linux\":\n package_tool = tools.SystemPackageTool(conanfile=self, default_mode='verify')\n if tools.os_info.with_yum:\n if tools.os_info.linux_distro == \"fedora\" and tools.os_info.os_version >= \"32\":\n packages = [\"libglvnd-devel\"]\n else:\n packages = [\"mesa-libGL-devel\"]\n elif tools.os_info.with_apt:\n ubuntu_20_or_later = tools.os_info.linux_distro == \"ubuntu\" and tools.os_info.os_version >= \"20\"\n debian_11_or_later = tools.os_info.linux_distro == \"debian\" and tools.os_info.os_version >= \"11\"\n if ubuntu_20_or_later or debian_11_or_later:\n packages = [\"libgl-dev\"]\n else:\n packages = [\"libgl1-mesa-dev\"]\n elif tools.os_info.with_pacman:\n packages = [\"libglvnd\"]\n elif tools.os_info.with_zypper:\n packages = [\"Mesa-libGL-devel\"]\n else:\n self.warn(\"don't know how to install OpenGL for your distro\")\n package_tool.install(update=True, packages=packages)\n\n def package_info(self):\n if self.settings.os == \"Macos\":\n self.cpp_info.defines.append(\"GL_SILENCE_DEPRECATION=1\")\n self.cpp_info.frameworks.append(\"OpenGL\")\n elif self.settings.os == \"Windows\":\n self.cpp_info.system_libs = [\"OpenGL32\"]\n elif self.settings.os == \"Linux\":\n self._fill_cppinfo_from_pkgconfig('gl')\n", "path": "recipes/opengl/all/conanfile.py"}], "after_files": [{"content": "from conans import ConanFile, tools\nfrom conans.errors import ConanException\nimport os\n\n\nclass SysConfigOpenGLConan(ConanFile):\n name = \"opengl\"\n version = \"system\"\n description = \"cross-platform virtual conan package for the OpenGL support\"\n topics = (\"conan\", \"opengl\", \"gl\")\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://www.opengl.org/\"\n license = \"MIT\"\n settings = (\"os\",)\n\n def package_id(self):\n self.info.header_only()\n\n def _fill_cppinfo_from_pkgconfig(self, name):\n pkg_config = tools.PkgConfig(name)\n if not pkg_config.provides:\n raise ConanException(\"OpenGL development files aren't available, give up\")\n libs = [lib[2:] for lib in pkg_config.libs_only_l]\n lib_dirs = [lib[2:] for lib in pkg_config.libs_only_L]\n ldflags = [flag for flag in pkg_config.libs_only_other]\n include_dirs = [include[2:] for include in pkg_config.cflags_only_I]\n cflags = [flag for flag in pkg_config.cflags_only_other if not flag.startswith(\"-D\")]\n defines = [flag[2:] for flag in pkg_config.cflags_only_other if flag.startswith(\"-D\")]\n\n self.cpp_info.system_libs.extend(libs)\n self.cpp_info.libdirs.extend(lib_dirs)\n self.cpp_info.sharedlinkflags.extend(ldflags)\n self.cpp_info.exelinkflags.extend(ldflags)\n self.cpp_info.defines.extend(defines)\n self.cpp_info.includedirs.extend(include_dirs)\n self.cpp_info.cflags.extend(cflags)\n self.cpp_info.cxxflags.extend(cflags)\n\n def system_requirements(self):\n if tools.os_info.is_linux and self.settings.os == \"Linux\":\n package_tool = tools.SystemPackageTool(conanfile=self, default_mode='verify')\n if tools.os_info.with_yum:\n if tools.os_info.linux_distro == \"fedora\" and tools.os_info.os_version >= \"32\":\n packages = [\"libglvnd-devel\"]\n else:\n packages = 
[\"mesa-libGL-devel\"]\n elif tools.os_info.with_apt:\n ubuntu_20_or_later = tools.os_info.linux_distro == \"ubuntu\" and tools.os_info.os_version >= \"20\"\n debian_11_or_later = tools.os_info.linux_distro == \"debian\" and tools.os_info.os_version >= \"11\"\n if ubuntu_20_or_later or debian_11_or_later:\n packages = [\"libgl-dev\"]\n else:\n packages = [\"libgl1-mesa-dev\"]\n elif tools.os_info.with_pacman:\n packages = [\"libglvnd\"]\n elif tools.os_info.with_zypper:\n packages = [\"Mesa-libGL-devel\"]\n else:\n self.output.warn(\"Don't know how to install OpenGL for your distro.\")\n package_tool.install(update=True, packages=packages)\n\n def package_info(self):\n if self.settings.os == \"Macos\":\n self.cpp_info.defines.append(\"GL_SILENCE_DEPRECATION=1\")\n self.cpp_info.frameworks.append(\"OpenGL\")\n elif self.settings.os == \"Windows\":\n self.cpp_info.system_libs = [\"OpenGL32\"]\n elif self.settings.os == \"Linux\":\n self._fill_cppinfo_from_pkgconfig('gl')\n", "path": "recipes/opengl/all/conanfile.py"}]} | 1,606 | 136 |
gh_patches_debug_37236 | rasdani/github-patches | git_diff | ansible__awx-11599 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
doc error with inventory plugin for latest awx.awx inventory plugin
### Please confirm the following
- [X] I agree to follow this project's [code of conduct](http://docs.ansible.com/ansible/latest/community/code_of_conduct.html).
- [X] I have checked the [current issues](https://github.com/ansible/awx/issues) for duplicates.
- [X] I understand that AWX is open source software provided for free and that I am not entitled to status updates or other assurances.
### Summary
The documentation for the inventory plugin, awx.awx.controller, was malformed.
rendered webpage for docs is here: https://docs.ansible.com/ansible/latest/collections/awx/awx/controller_inventory.html#ansible-collections-awx-awx-controller-inventory
### AWX version
N/A
### Installation method
openshift
### Modifications
no
### Ansible version
n/a
### Operating system
n/a
### Web browser
Chrome
### Steps to reproduce
go to https://docs.ansible.com/ansible/latest/collections/awx/awx/controller_inventory.html#ansible-collections-awx-awx-controller-inventory
there is docs errors on the webpage
>The documentation for the inventory plugin, awx.awx.controller, was malformed.
### Expected results
no errors
### Actual results

### Additional information
fast track this to the infamous Chris Meyers
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `awx_collection/plugins/doc_fragments/auth_plugin.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Copyright: (c) 2020, Ansible by Red Hat, Inc
4 # GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
5
6 from __future__ import absolute_import, division, print_function
7
8 __metaclass__ = type
9
10
11 class ModuleDocFragment(object):
12
13 # Automation Platform Controller documentation fragment
14 DOCUMENTATION = r'''
15 options:
16 host:
17 description: The network address of your Automation Platform Controller host.
18 env:
19 - name: CONTROLLER_HOST
20 - name: TOWER_HOST
21 deprecated:
22 - removed_from_collection: 'awx.awx.controller'
23 - removed_in: '4.0.0'
24 - why: Collection name change
25 - alternative: 'CONTROLLER_HOST'
26 username:
27 description: The user that you plan to use to access inventories on the controller.
28 env:
29 - name: CONTROLLER_USERNAME
30 - name: TOWER_USERNAME
31 deprecated:
32 - removed_from_collection: 'awx.awx.controller'
33 - removed_in: '4.0.0'
34 - why: Collection name change
35 - alternative: 'CONTROLLER_USERNAME'
36 password:
37 description: The password for your controller user.
38 env:
39 - name: CONTROLLER_PASSWORD
40 - name: TOWER_PASSWORD
41 deprecated:
42 - removed_from_collection: 'awx.awx.controller'
43 - removed_in: '4.0.0'
44 - why: Collection name change
45 - alternative: 'CONTROLLER_PASSWORD'
46 oauth_token:
47 description:
48 - The OAuth token to use.
49 env:
50 - name: CONTROLLER_OAUTH_TOKEN
51 - name: TOWER_OAUTH_TOKEN
52 deprecated:
53 - removed_from_collection: 'awx.awx.controller'
54 - removed_in: '4.0.0'
55 - why: Collection name change
56 - alternative: 'CONTROLLER_OAUTH_TOKEN'
57 verify_ssl:
58 description:
59 - Specify whether Ansible should verify the SSL certificate of the controller host.
60 - Defaults to True, but this is handled by the shared module_utils code
61 type: bool
62 env:
63 - name: CONTROLLER_VERIFY_SSL
64 - name: TOWER_VERIFY_SSL
65 deprecated:
66 - removed_from_collection: 'awx.awx.controller'
67 - removed_in: '4.0.0'
68 - why: Collection name change
69 - alternative: 'CONTROLLER_VERIFY_SSL'
70 aliases: [ validate_certs ]
71
72 notes:
73 - If no I(config_file) is provided we will attempt to use the tower-cli library
74 defaults to find your host information.
75 - I(config_file) should be in the following format
76 host=hostname
77 username=username
78 password=password
79 '''
80
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/awx_collection/plugins/doc_fragments/auth_plugin.py b/awx_collection/plugins/doc_fragments/auth_plugin.py
--- a/awx_collection/plugins/doc_fragments/auth_plugin.py
+++ b/awx_collection/plugins/doc_fragments/auth_plugin.py
@@ -18,42 +18,42 @@
env:
- name: CONTROLLER_HOST
- name: TOWER_HOST
- deprecated:
- - removed_from_collection: 'awx.awx.controller'
- - removed_in: '4.0.0'
- - why: Collection name change
- - alternative: 'CONTROLLER_HOST'
+ deprecated:
+ collection_name: 'awx.awx'
+ version: '4.0.0'
+ why: Collection name change
+ alternatives: 'CONTROLLER_HOST'
username:
description: The user that you plan to use to access inventories on the controller.
env:
- name: CONTROLLER_USERNAME
- name: TOWER_USERNAME
- deprecated:
- - removed_from_collection: 'awx.awx.controller'
- - removed_in: '4.0.0'
- - why: Collection name change
- - alternative: 'CONTROLLER_USERNAME'
+ deprecated:
+ collection_name: 'awx.awx'
+ version: '4.0.0'
+ why: Collection name change
+ alternatives: 'CONTROLLER_USERNAME'
password:
description: The password for your controller user.
env:
- name: CONTROLLER_PASSWORD
- name: TOWER_PASSWORD
- deprecated:
- - removed_from_collection: 'awx.awx.controller'
- - removed_in: '4.0.0'
- - why: Collection name change
- - alternative: 'CONTROLLER_PASSWORD'
+ deprecated:
+ collection_name: 'awx.awx'
+ version: '4.0.0'
+ why: Collection name change
+ alternatives: 'CONTROLLER_PASSWORD'
oauth_token:
description:
- The OAuth token to use.
env:
- name: CONTROLLER_OAUTH_TOKEN
- name: TOWER_OAUTH_TOKEN
- deprecated:
- - removed_from_collection: 'awx.awx.controller'
- - removed_in: '4.0.0'
- - why: Collection name change
- - alternative: 'CONTROLLER_OAUTH_TOKEN'
+ deprecated:
+ collection_name: 'awx.awx'
+ version: '4.0.0'
+ why: Collection name change
+ alternatives: 'CONTROLLER_OAUTH_TOKEN'
verify_ssl:
description:
- Specify whether Ansible should verify the SSL certificate of the controller host.
@@ -62,11 +62,11 @@
env:
- name: CONTROLLER_VERIFY_SSL
- name: TOWER_VERIFY_SSL
- deprecated:
- - removed_from_collection: 'awx.awx.controller'
- - removed_in: '4.0.0'
- - why: Collection name change
- - alternative: 'CONTROLLER_VERIFY_SSL'
+ deprecated:
+ collection_name: 'awx.awx'
+ version: '4.0.0'
+ why: Collection name change
+ alternatives: 'CONTROLLER_VERIFY_SSL'
aliases: [ validate_certs ]
notes:
| {"golden_diff": "diff --git a/awx_collection/plugins/doc_fragments/auth_plugin.py b/awx_collection/plugins/doc_fragments/auth_plugin.py\n--- a/awx_collection/plugins/doc_fragments/auth_plugin.py\n+++ b/awx_collection/plugins/doc_fragments/auth_plugin.py\n@@ -18,42 +18,42 @@\n env:\n - name: CONTROLLER_HOST\n - name: TOWER_HOST\n- deprecated:\n- - removed_from_collection: 'awx.awx.controller'\n- - removed_in: '4.0.0'\n- - why: Collection name change\n- - alternative: 'CONTROLLER_HOST'\n+ deprecated:\n+ collection_name: 'awx.awx'\n+ version: '4.0.0'\n+ why: Collection name change\n+ alternatives: 'CONTROLLER_HOST'\n username:\n description: The user that you plan to use to access inventories on the controller.\n env:\n - name: CONTROLLER_USERNAME\n - name: TOWER_USERNAME\n- deprecated:\n- - removed_from_collection: 'awx.awx.controller'\n- - removed_in: '4.0.0'\n- - why: Collection name change\n- - alternative: 'CONTROLLER_USERNAME'\n+ deprecated:\n+ collection_name: 'awx.awx'\n+ version: '4.0.0'\n+ why: Collection name change\n+ alternatives: 'CONTROLLER_USERNAME'\n password:\n description: The password for your controller user.\n env:\n - name: CONTROLLER_PASSWORD\n - name: TOWER_PASSWORD\n- deprecated:\n- - removed_from_collection: 'awx.awx.controller'\n- - removed_in: '4.0.0'\n- - why: Collection name change\n- - alternative: 'CONTROLLER_PASSWORD'\n+ deprecated:\n+ collection_name: 'awx.awx'\n+ version: '4.0.0'\n+ why: Collection name change\n+ alternatives: 'CONTROLLER_PASSWORD'\n oauth_token:\n description:\n - The OAuth token to use.\n env:\n - name: CONTROLLER_OAUTH_TOKEN\n - name: TOWER_OAUTH_TOKEN\n- deprecated:\n- - removed_from_collection: 'awx.awx.controller'\n- - removed_in: '4.0.0'\n- - why: Collection name change\n- - alternative: 'CONTROLLER_OAUTH_TOKEN'\n+ deprecated:\n+ collection_name: 'awx.awx'\n+ version: '4.0.0'\n+ why: Collection name change\n+ alternatives: 'CONTROLLER_OAUTH_TOKEN'\n verify_ssl:\n description:\n - Specify whether Ansible should verify the SSL certificate of the controller host.\n@@ -62,11 +62,11 @@\n env:\n - name: CONTROLLER_VERIFY_SSL\n - name: TOWER_VERIFY_SSL\n- deprecated:\n- - removed_from_collection: 'awx.awx.controller'\n- - removed_in: '4.0.0'\n- - why: Collection name change\n- - alternative: 'CONTROLLER_VERIFY_SSL'\n+ deprecated:\n+ collection_name: 'awx.awx'\n+ version: '4.0.0'\n+ why: Collection name change\n+ alternatives: 'CONTROLLER_VERIFY_SSL'\n aliases: [ validate_certs ]\n \n notes:\n", "issue": "doc error with inventory plugin for latest awx.awx inventory plugin \n### Please confirm the following\n\n- [X] I agree to follow this project's [code of conduct](http://docs.ansible.com/ansible/latest/community/code_of_conduct.html).\n- [X] I have checked the [current issues](https://github.com/ansible/awx/issues) for duplicates.\n- [X] I understand that AWX is open source software provided for free and that I am not entitled to status updates or other assurances.\n\n### Summary\n\nThe documentation for the inventory plugin, awx.awx.controller, was malformed.\r\n\r\nrendered webpage for docs is here: https://docs.ansible.com/ansible/latest/collections/awx/awx/controller_inventory.html#ansible-collections-awx-awx-controller-inventory\n\n### AWX version\n\nN/A\n\n### Installation method\n\nopenshift\n\n### Modifications\n\nno\n\n### Ansible version\n\nn/a\n\n### Operating system\n\nn/a\n\n### Web browser\n\nChrome\n\n### Steps to reproduce\n\ngo to 
https://docs.ansible.com/ansible/latest/collections/awx/awx/controller_inventory.html#ansible-collections-awx-awx-controller-inventory\r\n\r\nthere is docs errors on the webpage \r\n\r\n>The documentation for the inventory plugin, awx.awx.controller, was malformed.\r\n\r\n\n\n### Expected results\n\nno errors\n\n### Actual results\n\n\r\n\n\n### Additional information\n\nfast track this to infamous Chris Meyers\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Copyright: (c) 2020, Ansible by Red Hat, Inc\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n\n__metaclass__ = type\n\n\nclass ModuleDocFragment(object):\n\n # Automation Platform Controller documentation fragment\n DOCUMENTATION = r'''\noptions:\n host:\n description: The network address of your Automation Platform Controller host.\n env:\n - name: CONTROLLER_HOST\n - name: TOWER_HOST\n deprecated:\n - removed_from_collection: 'awx.awx.controller'\n - removed_in: '4.0.0'\n - why: Collection name change\n - alternative: 'CONTROLLER_HOST'\n username:\n description: The user that you plan to use to access inventories on the controller.\n env:\n - name: CONTROLLER_USERNAME\n - name: TOWER_USERNAME\n deprecated:\n - removed_from_collection: 'awx.awx.controller'\n - removed_in: '4.0.0'\n - why: Collection name change\n - alternative: 'CONTROLLER_USERNAME'\n password:\n description: The password for your controller user.\n env:\n - name: CONTROLLER_PASSWORD\n - name: TOWER_PASSWORD\n deprecated:\n - removed_from_collection: 'awx.awx.controller'\n - removed_in: '4.0.0'\n - why: Collection name change\n - alternative: 'CONTROLLER_PASSWORD'\n oauth_token:\n description:\n - The OAuth token to use.\n env:\n - name: CONTROLLER_OAUTH_TOKEN\n - name: TOWER_OAUTH_TOKEN\n deprecated:\n - removed_from_collection: 'awx.awx.controller'\n - removed_in: '4.0.0'\n - why: Collection name change\n - alternative: 'CONTROLLER_OAUTH_TOKEN'\n verify_ssl:\n description:\n - Specify whether Ansible should verify the SSL certificate of the controller host.\n - Defaults to True, but this is handled by the shared module_utils code\n type: bool\n env:\n - name: CONTROLLER_VERIFY_SSL\n - name: TOWER_VERIFY_SSL\n deprecated:\n - removed_from_collection: 'awx.awx.controller'\n - removed_in: '4.0.0'\n - why: Collection name change\n - alternative: 'CONTROLLER_VERIFY_SSL'\n aliases: [ validate_certs ]\n\nnotes:\n- If no I(config_file) is provided we will attempt to use the tower-cli library\n defaults to find your host information.\n- I(config_file) should be in the following format\n host=hostname\n username=username\n password=password\n'''\n", "path": "awx_collection/plugins/doc_fragments/auth_plugin.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Copyright: (c) 2020, Ansible by Red Hat, Inc\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n\n__metaclass__ = type\n\n\nclass ModuleDocFragment(object):\n\n # Automation Platform Controller documentation fragment\n DOCUMENTATION = r'''\noptions:\n host:\n description: The network address of your Automation Platform Controller host.\n env:\n - name: CONTROLLER_HOST\n - name: TOWER_HOST\n deprecated:\n collection_name: 'awx.awx'\n version: '4.0.0'\n why: Collection name change\n alternatives: 'CONTROLLER_HOST'\n username:\n description: The user that you plan to use to 
access inventories on the controller.\n env:\n - name: CONTROLLER_USERNAME\n - name: TOWER_USERNAME\n deprecated:\n collection_name: 'awx.awx'\n version: '4.0.0'\n why: Collection name change\n alternatives: 'CONTROLLER_USERNAME'\n password:\n description: The password for your controller user.\n env:\n - name: CONTROLLER_PASSWORD\n - name: TOWER_PASSWORD\n deprecated:\n collection_name: 'awx.awx'\n version: '4.0.0'\n why: Collection name change\n alternatives: 'CONTROLLER_PASSWORD'\n oauth_token:\n description:\n - The OAuth token to use.\n env:\n - name: CONTROLLER_OAUTH_TOKEN\n - name: TOWER_OAUTH_TOKEN\n deprecated:\n collection_name: 'awx.awx'\n version: '4.0.0'\n why: Collection name change\n alternatives: 'CONTROLLER_OAUTH_TOKEN'\n verify_ssl:\n description:\n - Specify whether Ansible should verify the SSL certificate of the controller host.\n - Defaults to True, but this is handled by the shared module_utils code\n type: bool\n env:\n - name: CONTROLLER_VERIFY_SSL\n - name: TOWER_VERIFY_SSL\n deprecated:\n collection_name: 'awx.awx'\n version: '4.0.0'\n why: Collection name change\n alternatives: 'CONTROLLER_VERIFY_SSL'\n aliases: [ validate_certs ]\n\nnotes:\n- If no I(config_file) is provided we will attempt to use the tower-cli library\n defaults to find your host information.\n- I(config_file) should be in the following format\n host=hostname\n username=username\n password=password\n'''\n", "path": "awx_collection/plugins/doc_fragments/auth_plugin.py"}]} | 1,384 | 739 |
gh_patches_debug_1537 | rasdani/github-patches | git_diff | mitmproxy__mitmproxy-2051 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
@concurrent annotation doesn't work in the OOP script method
I.E:
```
class SomeClass:
@concurrent [doesn't work, 2 args]
def request(self, flow):
pass
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mitmproxy/script/concurrent.py`
Content:
```
1 """
2 This module provides a @concurrent decorator primitive to
3 offload computations from mitmproxy's main master thread.
4 """
5
6 from mitmproxy import eventsequence
7 from mitmproxy.types import basethread
8
9
10 class ScriptThread(basethread.BaseThread):
11 name = "ScriptThread"
12
13
14 def concurrent(fn):
15 if fn.__name__ not in eventsequence.Events - {"start", "configure", "tick"}:
16 raise NotImplementedError(
17 "Concurrent decorator not supported for '%s' method." % fn.__name__
18 )
19
20 def _concurrent(obj):
21 def run():
22 fn(obj)
23 if obj.reply.state == "taken":
24 if not obj.reply.has_message:
25 obj.reply.ack()
26 obj.reply.commit()
27 obj.reply.take()
28 ScriptThread(
29 "script.concurrent (%s)" % fn.__name__,
30 target=run
31 ).start()
32 return _concurrent
33
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mitmproxy/script/concurrent.py b/mitmproxy/script/concurrent.py
--- a/mitmproxy/script/concurrent.py
+++ b/mitmproxy/script/concurrent.py
@@ -29,4 +29,8 @@
"script.concurrent (%s)" % fn.__name__,
target=run
).start()
- return _concurrent
+ # Support @concurrent for class-based addons
+ if "." in fn.__qualname__:
+ return staticmethod(_concurrent)
+ else:
+ return _concurrent
| {"golden_diff": "diff --git a/mitmproxy/script/concurrent.py b/mitmproxy/script/concurrent.py\n--- a/mitmproxy/script/concurrent.py\n+++ b/mitmproxy/script/concurrent.py\n@@ -29,4 +29,8 @@\n \"script.concurrent (%s)\" % fn.__name__,\n target=run\n ).start()\n- return _concurrent\n+ # Support @concurrent for class-based addons\n+ if \".\" in fn.__qualname__:\n+ return staticmethod(_concurrent)\n+ else:\n+ return _concurrent\n", "issue": "@concurrent annotation doesn't work in the OOP script method\nI.E:\r\n\r\n```\r\nclass SomeClass:\r\n @concurrent [doesn't work, 2 args]\r\n def request(self, flow):\r\n pass\r\n\r\n```\n@concurrent annotation doesn't work in the OOP script method\nI.E:\r\n\r\n```\r\nclass SomeClass:\r\n @concurrent [doesn't work, 2 args]\r\n def request(self, flow):\r\n pass\r\n\r\n```\n", "before_files": [{"content": "\"\"\"\nThis module provides a @concurrent decorator primitive to\noffload computations from mitmproxy's main master thread.\n\"\"\"\n\nfrom mitmproxy import eventsequence\nfrom mitmproxy.types import basethread\n\n\nclass ScriptThread(basethread.BaseThread):\n name = \"ScriptThread\"\n\n\ndef concurrent(fn):\n if fn.__name__ not in eventsequence.Events - {\"start\", \"configure\", \"tick\"}:\n raise NotImplementedError(\n \"Concurrent decorator not supported for '%s' method.\" % fn.__name__\n )\n\n def _concurrent(obj):\n def run():\n fn(obj)\n if obj.reply.state == \"taken\":\n if not obj.reply.has_message:\n obj.reply.ack()\n obj.reply.commit()\n obj.reply.take()\n ScriptThread(\n \"script.concurrent (%s)\" % fn.__name__,\n target=run\n ).start()\n return _concurrent\n", "path": "mitmproxy/script/concurrent.py"}], "after_files": [{"content": "\"\"\"\nThis module provides a @concurrent decorator primitive to\noffload computations from mitmproxy's main master thread.\n\"\"\"\n\nfrom mitmproxy import eventsequence\nfrom mitmproxy.types import basethread\n\n\nclass ScriptThread(basethread.BaseThread):\n name = \"ScriptThread\"\n\n\ndef concurrent(fn):\n if fn.__name__ not in eventsequence.Events - {\"start\", \"configure\", \"tick\"}:\n raise NotImplementedError(\n \"Concurrent decorator not supported for '%s' method.\" % fn.__name__\n )\n\n def _concurrent(obj):\n def run():\n fn(obj)\n if obj.reply.state == \"taken\":\n if not obj.reply.has_message:\n obj.reply.ack()\n obj.reply.commit()\n obj.reply.take()\n ScriptThread(\n \"script.concurrent (%s)\" % fn.__name__,\n target=run\n ).start()\n # Support @concurrent for class-based addons\n if \".\" in fn.__qualname__:\n return staticmethod(_concurrent)\n else:\n return _concurrent\n", "path": "mitmproxy/script/concurrent.py"}]} | 609 | 123 |
gh_patches_debug_10057 | rasdani/github-patches | git_diff | OCA__social-1068 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[15.0] mail_tracking: portal tests fail
<!-- Provide a general summary of the issue in the Title above.
Before opening the issue, please check that there isn't another issue addressing the same problem already created -->
## Module
mail_tracking
## Describe the bug
mail_tracking causes TestPortalWizard.test_portal_wizard_public_user and TestPortalWizard.test_portal_wizard_partner to fail
## To Reproduce
**Affected versions**: 15.0
Steps to reproduce the behavior:
1. init db with portal and mail_tracking
2. run portal tests
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mail_tracking/models/ir_mail_server.py`
Content:
```
1 # Copyright 2016 Antonio Espinosa - <[email protected]>
2 # License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).
3
4 import re
5 import threading
6
7 from odoo import api, models, tools
8
9
10 class IrMailServer(models.Model):
11 _inherit = "ir.mail_server"
12
13 def _tracking_headers_add(self, tracking_email_id, headers):
14 """Allow other addons to add its own tracking SMTP headers"""
15 headers = headers or {}
16 headers["X-Odoo-Database"] = getattr(threading.current_thread(), "dbname", None)
17 headers["X-Odoo-MailTracking-ID"] = tracking_email_id
18 return headers
19
20 def _tracking_email_id_body_get(self, body):
21 body = body or ""
22 # https://regex101.com/r/lW4cB1/2
23 match = re.search(r'<img[^>]*data-odoo-tracking-email=["\']([0-9]*)["\']', body)
24 return str(match.group(1)) if match and match.group(1) else False
25
26 def build_email(
27 self,
28 email_from,
29 email_to,
30 subject,
31 body,
32 email_cc=None,
33 email_bcc=None,
34 reply_to=False,
35 attachments=None,
36 message_id=None,
37 references=None,
38 object_id=False,
39 subtype="plain",
40 headers=None,
41 body_alternative=None,
42 subtype_alternative="plain",
43 ):
44 tracking_email_id = self._tracking_email_id_body_get(body)
45 if tracking_email_id:
46 headers = self._tracking_headers_add(tracking_email_id, headers)
47 msg = super(IrMailServer, self).build_email(
48 email_from,
49 email_to,
50 subject,
51 body,
52 email_cc=email_cc,
53 email_bcc=email_bcc,
54 reply_to=reply_to,
55 attachments=attachments,
56 message_id=message_id,
57 references=references,
58 object_id=object_id,
59 subtype=subtype,
60 headers=headers,
61 body_alternative=body_alternative,
62 subtype_alternative=subtype_alternative,
63 )
64 return msg
65
66 def _tracking_email_get(self, message):
67 try:
68 tracking_email_id = int(
69 message.get(
70 "X-Odoo-MailTracking-ID",
71 # Deprecated tracking header, kept as fallback
72 message["X-Odoo-Tracking-ID"],
73 )
74 )
75 except (TypeError, ValueError, KeyError):
76 tracking_email_id = False
77 return self.env["mail.tracking.email"].browse(tracking_email_id)
78
79 def _smtp_server_get(self, mail_server_id, smtp_server):
80 smtp_server_used = False
81 mail_server = None
82 if mail_server_id:
83 mail_server = self.browse(mail_server_id)
84 elif not smtp_server:
85 mail_server_ids = self.search([], order="sequence", limit=1)
86 mail_server = mail_server_ids[0] if mail_server_ids else None
87 if mail_server:
88 smtp_server_used = mail_server.smtp_host
89 else:
90 smtp_server_used = smtp_server or tools.config.get("smtp_server")
91 return smtp_server_used
92
93 @api.model
94 def send_email(
95 self,
96 message,
97 mail_server_id=None,
98 smtp_server=None,
99 smtp_port=None,
100 smtp_user=None,
101 smtp_password=None,
102 smtp_encryption=None,
103 smtp_debug=False,
104 smtp_session=None,
105 ):
106 message_id = False
107 tracking_email = self._tracking_email_get(message)
108 smtp_server_used = self.sudo()._smtp_server_get(mail_server_id, smtp_server)
109 try:
110 message_id = super(IrMailServer, self).send_email(
111 message,
112 mail_server_id=mail_server_id,
113 smtp_server=smtp_server,
114 smtp_port=smtp_port,
115 smtp_user=smtp_user,
116 smtp_password=smtp_password,
117 smtp_encryption=smtp_encryption,
118 smtp_debug=smtp_debug,
119 smtp_session=smtp_session,
120 )
121 except Exception as e:
122 if tracking_email:
123 tracking_email.smtp_error(self, smtp_server_used, e)
124 if message_id and tracking_email:
125 vals = tracking_email._tracking_sent_prepare(
126 self, smtp_server_used, message, message_id
127 )
128 if vals:
129 self.env["mail.tracking.event"].sudo().create(vals)
130 return message_id
131
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mail_tracking/models/ir_mail_server.py b/mail_tracking/models/ir_mail_server.py
--- a/mail_tracking/models/ir_mail_server.py
+++ b/mail_tracking/models/ir_mail_server.py
@@ -45,10 +45,10 @@
if tracking_email_id:
headers = self._tracking_headers_add(tracking_email_id, headers)
msg = super(IrMailServer, self).build_email(
- email_from,
- email_to,
- subject,
- body,
+ email_from=email_from,
+ email_to=email_to,
+ subject=subject,
+ body=body,
email_cc=email_cc,
email_bcc=email_bcc,
reply_to=reply_to,
| {"golden_diff": "diff --git a/mail_tracking/models/ir_mail_server.py b/mail_tracking/models/ir_mail_server.py\n--- a/mail_tracking/models/ir_mail_server.py\n+++ b/mail_tracking/models/ir_mail_server.py\n@@ -45,10 +45,10 @@\n if tracking_email_id:\n headers = self._tracking_headers_add(tracking_email_id, headers)\n msg = super(IrMailServer, self).build_email(\n- email_from,\n- email_to,\n- subject,\n- body,\n+ email_from=email_from,\n+ email_to=email_to,\n+ subject=subject,\n+ body=body,\n email_cc=email_cc,\n email_bcc=email_bcc,\n reply_to=reply_to,\n", "issue": "[15.0] mail_tracking: portal tests fail\n<!-- Provide a general summary of the issue in the Title above.\r\nBefore opening the issue, please check that there isn't another issue addressing the same problem already created -->\r\n\r\n## Module\r\nmail_tracking\r\n\r\n## Describe the bug\r\nmail_tracking causes TestPortalWizard.test_portal_wizard_public_user and TestPortalWizard.test_portal_wizard_partner to fail\r\n\r\n## To Reproduce\r\n**Affected versions**: 15.0\r\n\r\nSteps to reproduce the behavior:\r\n1. init db with portal and mail_tracking\r\n2. run portal tests\r\n\n", "before_files": [{"content": "# Copyright 2016 Antonio Espinosa - <[email protected]>\n# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).\n\nimport re\nimport threading\n\nfrom odoo import api, models, tools\n\n\nclass IrMailServer(models.Model):\n _inherit = \"ir.mail_server\"\n\n def _tracking_headers_add(self, tracking_email_id, headers):\n \"\"\"Allow other addons to add its own tracking SMTP headers\"\"\"\n headers = headers or {}\n headers[\"X-Odoo-Database\"] = getattr(threading.current_thread(), \"dbname\", None)\n headers[\"X-Odoo-MailTracking-ID\"] = tracking_email_id\n return headers\n\n def _tracking_email_id_body_get(self, body):\n body = body or \"\"\n # https://regex101.com/r/lW4cB1/2\n match = re.search(r'<img[^>]*data-odoo-tracking-email=[\"\\']([0-9]*)[\"\\']', body)\n return str(match.group(1)) if match and match.group(1) else False\n\n def build_email(\n self,\n email_from,\n email_to,\n subject,\n body,\n email_cc=None,\n email_bcc=None,\n reply_to=False,\n attachments=None,\n message_id=None,\n references=None,\n object_id=False,\n subtype=\"plain\",\n headers=None,\n body_alternative=None,\n subtype_alternative=\"plain\",\n ):\n tracking_email_id = self._tracking_email_id_body_get(body)\n if tracking_email_id:\n headers = self._tracking_headers_add(tracking_email_id, headers)\n msg = super(IrMailServer, self).build_email(\n email_from,\n email_to,\n subject,\n body,\n email_cc=email_cc,\n email_bcc=email_bcc,\n reply_to=reply_to,\n attachments=attachments,\n message_id=message_id,\n references=references,\n object_id=object_id,\n subtype=subtype,\n headers=headers,\n body_alternative=body_alternative,\n subtype_alternative=subtype_alternative,\n )\n return msg\n\n def _tracking_email_get(self, message):\n try:\n tracking_email_id = int(\n message.get(\n \"X-Odoo-MailTracking-ID\",\n # Deprecated tracking header, kept as fallback\n message[\"X-Odoo-Tracking-ID\"],\n )\n )\n except (TypeError, ValueError, KeyError):\n tracking_email_id = False\n return self.env[\"mail.tracking.email\"].browse(tracking_email_id)\n\n def _smtp_server_get(self, mail_server_id, smtp_server):\n smtp_server_used = False\n mail_server = None\n if mail_server_id:\n mail_server = self.browse(mail_server_id)\n elif not smtp_server:\n mail_server_ids = self.search([], order=\"sequence\", limit=1)\n mail_server = mail_server_ids[0] if 
mail_server_ids else None\n if mail_server:\n smtp_server_used = mail_server.smtp_host\n else:\n smtp_server_used = smtp_server or tools.config.get(\"smtp_server\")\n return smtp_server_used\n\n @api.model\n def send_email(\n self,\n message,\n mail_server_id=None,\n smtp_server=None,\n smtp_port=None,\n smtp_user=None,\n smtp_password=None,\n smtp_encryption=None,\n smtp_debug=False,\n smtp_session=None,\n ):\n message_id = False\n tracking_email = self._tracking_email_get(message)\n smtp_server_used = self.sudo()._smtp_server_get(mail_server_id, smtp_server)\n try:\n message_id = super(IrMailServer, self).send_email(\n message,\n mail_server_id=mail_server_id,\n smtp_server=smtp_server,\n smtp_port=smtp_port,\n smtp_user=smtp_user,\n smtp_password=smtp_password,\n smtp_encryption=smtp_encryption,\n smtp_debug=smtp_debug,\n smtp_session=smtp_session,\n )\n except Exception as e:\n if tracking_email:\n tracking_email.smtp_error(self, smtp_server_used, e)\n if message_id and tracking_email:\n vals = tracking_email._tracking_sent_prepare(\n self, smtp_server_used, message, message_id\n )\n if vals:\n self.env[\"mail.tracking.event\"].sudo().create(vals)\n return message_id\n", "path": "mail_tracking/models/ir_mail_server.py"}], "after_files": [{"content": "# Copyright 2016 Antonio Espinosa - <[email protected]>\n# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).\n\nimport re\nimport threading\n\nfrom odoo import api, models, tools\n\n\nclass IrMailServer(models.Model):\n _inherit = \"ir.mail_server\"\n\n def _tracking_headers_add(self, tracking_email_id, headers):\n \"\"\"Allow other addons to add its own tracking SMTP headers\"\"\"\n headers = headers or {}\n headers[\"X-Odoo-Database\"] = getattr(threading.current_thread(), \"dbname\", None)\n headers[\"X-Odoo-MailTracking-ID\"] = tracking_email_id\n return headers\n\n def _tracking_email_id_body_get(self, body):\n body = body or \"\"\n # https://regex101.com/r/lW4cB1/2\n match = re.search(r'<img[^>]*data-odoo-tracking-email=[\"\\']([0-9]*)[\"\\']', body)\n return str(match.group(1)) if match and match.group(1) else False\n\n def build_email(\n self,\n email_from,\n email_to,\n subject,\n body,\n email_cc=None,\n email_bcc=None,\n reply_to=False,\n attachments=None,\n message_id=None,\n references=None,\n object_id=False,\n subtype=\"plain\",\n headers=None,\n body_alternative=None,\n subtype_alternative=\"plain\",\n ):\n tracking_email_id = self._tracking_email_id_body_get(body)\n if tracking_email_id:\n headers = self._tracking_headers_add(tracking_email_id, headers)\n msg = super(IrMailServer, self).build_email(\n email_from=email_from,\n email_to=email_to,\n subject=subject,\n body=body,\n email_cc=email_cc,\n email_bcc=email_bcc,\n reply_to=reply_to,\n attachments=attachments,\n message_id=message_id,\n references=references,\n object_id=object_id,\n subtype=subtype,\n headers=headers,\n body_alternative=body_alternative,\n subtype_alternative=subtype_alternative,\n )\n return msg\n\n def _tracking_email_get(self, message):\n try:\n tracking_email_id = int(\n message.get(\n \"X-Odoo-MailTracking-ID\",\n # Deprecated tracking header, kept as fallback\n message[\"X-Odoo-Tracking-ID\"],\n )\n )\n except (TypeError, ValueError, KeyError):\n tracking_email_id = False\n return self.env[\"mail.tracking.email\"].browse(tracking_email_id)\n\n def _smtp_server_get(self, mail_server_id, smtp_server):\n smtp_server_used = False\n mail_server = None\n if mail_server_id:\n mail_server = self.browse(mail_server_id)\n elif not 
smtp_server:\n mail_server_ids = self.search([], order=\"sequence\", limit=1)\n mail_server = mail_server_ids[0] if mail_server_ids else None\n if mail_server:\n smtp_server_used = mail_server.smtp_host\n else:\n smtp_server_used = smtp_server or tools.config.get(\"smtp_server\")\n return smtp_server_used\n\n @api.model\n def send_email(\n self,\n message,\n mail_server_id=None,\n smtp_server=None,\n smtp_port=None,\n smtp_user=None,\n smtp_password=None,\n smtp_encryption=None,\n smtp_debug=False,\n smtp_session=None,\n ):\n message_id = False\n tracking_email = self._tracking_email_get(message)\n smtp_server_used = self.sudo()._smtp_server_get(mail_server_id, smtp_server)\n try:\n message_id = super(IrMailServer, self).send_email(\n message,\n mail_server_id=mail_server_id,\n smtp_server=smtp_server,\n smtp_port=smtp_port,\n smtp_user=smtp_user,\n smtp_password=smtp_password,\n smtp_encryption=smtp_encryption,\n smtp_debug=smtp_debug,\n smtp_session=smtp_session,\n )\n except Exception as e:\n if tracking_email:\n tracking_email.smtp_error(self, smtp_server_used, e)\n if message_id and tracking_email:\n vals = tracking_email._tracking_sent_prepare(\n self, smtp_server_used, message, message_id\n )\n if vals:\n self.env[\"mail.tracking.event\"].sudo().create(vals)\n return message_id\n", "path": "mail_tracking/models/ir_mail_server.py"}]} | 1,607 | 158 |
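
The golden diff in the row above only changes the `super().build_email(...)` call from positional to keyword arguments. A framework-free sketch (hypothetical classes, not Odoo code) of why that matters: keyword forwarding keeps an override correct even if the parent's parameter order changes, whereas positional forwarding silently shifts values into the wrong slots.

```python
# Hypothetical stand-ins (not Odoo classes) for the override pattern in the patch above.
class BaseMailServer:
    # suppose the framework's signature puts email_cc before subject
    def build_email(self, email_from, email_to, email_cc, subject, body):
        return {"cc": email_cc, "subject": subject}


class TrackingMailServer(BaseMailServer):
    def build_email(self, email_from, email_to, subject, body, email_cc=None):
        # positional forwarding would put `subject` into the parent's `email_cc` slot:
        #   super().build_email(email_from, email_to, subject, body, email_cc)
        # keyword forwarding is immune to the parent's parameter order:
        return super().build_email(
            email_from=email_from, email_to=email_to,
            email_cc=email_cc, subject=subject, body=body,
        )


print(TrackingMailServer().build_email("[email protected]", ["[email protected]"], "Hi", "Body"))
# -> {'cc': None, 'subject': 'Hi'}
```
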
gh_patches_debug_35032 | rasdani/github-patches | git_diff | alltheplaces__alltheplaces-1880 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Spider quiznos is broken
During the global build at 2021-05-26-14-42-23, spider **quiznos** failed with **129 features** and **1 errors**.
Here's [the log](https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/logs/quiznos.log) and [the output](https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/output/quiznos.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/output/quiznos.geojson))
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/quiznos.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 import scrapy
3 import json
4 import re
5 import traceback
6
7 from locations.items import GeojsonPointItem
8
9 URL = "http://restaurants.quiznos.com"
10
11 class QuiznosSpider(scrapy.Spider):
12 name = "quiznos"
13 item_attributes = { 'brand': "Quizno's", 'brand_wikidata': "Q1936229" }
14 allowed_domains = [URL]
15 start_urls = (
16 'http://restaurants.quiznos.com/data/stores.json?callback=storeList',
17 )
18
19 def store_hours(self, store_hours):
20 if store_hours == '' or store_hours.lower().find('close') > -1: return ''
21
22 day_groups = []
23 this_day_group = None
24 hour_intervals = []
25
26 interval = store_hours.split(' - ')
27 start_time = interval[0].split(' ')
28 end_time = interval[1].split(' ')
29 start_hour = start_time[0].split(':')
30 end_hour = end_time[0].split(':')
31
32 hour_intervals.append('{}:{}-{}:{}'.format(
33 start_hour[0],
34 start_hour[1],
35 int(end_hour[0]) + 12 if end_time[1] == 'PM' else end_hour[0],
36 end_hour[1],
37 ))
38
39 hours = ','.join(hour_intervals)
40
41 if not this_day_group:
42 this_day_group = {
43 'from_day': 'Su',
44 'to_day': 'Sa',
45 'hours': hours
46 }
47
48 day_groups.append(this_day_group)
49
50 opening_hours = ""
51 if len(day_groups) == 1 and day_groups[0]['hours'] in ('00:00-23:59', '00:00-00:00'):
52 opening_hours = '24/7'
53 else:
54 for day_group in day_groups:
55 if day_group['from_day'] == day_group['to_day']:
56 opening_hours += '{from_day} {hours}; '.format(**day_group)
57 elif day_group['from_day'] == 'Su' and day_group['to_day'] == 'Sa':
58 opening_hours += '{hours}; '.format(**day_group)
59 else:
60 opening_hours += '{from_day}-{to_day} {hours}; '.format(**day_group)
61 opening_hours = opening_hours[:-2]
62
63 return opening_hours
64
65 def parse(self, response):
66 data = response.body_as_unicode()
67 stores = json.loads(re.search(r'storeList\((.*)\)', data).group(1))
68
69 for store in stores:
70
71 yield GeojsonPointItem(
72 lat=store.get('latitude'),
73 lon=store.get('longitude'),
74 ref=str(store.get('storeid')),
75 phone=store.get('phone'),
76 name=store.get('restaurantname'),
77 opening_hours=self.store_hours(store.get('businesshours')),
78 addr_full=store.get('address1'),
79 city=store.get('city'),
80 state=store.get('statecode'),
81 postcode=store.get('zipcode'),
82 website=URL + store.get('url'),
83 )
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/locations/spiders/quiznos.py b/locations/spiders/quiznos.py
--- a/locations/spiders/quiznos.py
+++ b/locations/spiders/quiznos.py
@@ -2,22 +2,24 @@
import scrapy
import json
import re
-import traceback
from locations.items import GeojsonPointItem
-URL = "http://restaurants.quiznos.com"
class QuiznosSpider(scrapy.Spider):
name = "quiznos"
item_attributes = { 'brand': "Quizno's", 'brand_wikidata': "Q1936229" }
- allowed_domains = [URL]
+ allowed_domains = ["https://restaurants.quiznos.com"]
start_urls = (
- 'http://restaurants.quiznos.com/data/stores.json?callback=storeList',
+ 'https://restaurants.quiznos.com/data/stores.json',
)
def store_hours(self, store_hours):
- if store_hours == '' or store_hours.lower().find('close') > -1: return ''
+ if store_hours == '' or store_hours == ' - ' or store_hours.lower().find('close') > -1:
+ return ''
+
+ if store_hours == 'Open 24 Hours - Open 24 Hours':
+ return '24/7'
day_groups = []
this_day_group = None
@@ -67,17 +69,18 @@
stores = json.loads(re.search(r'storeList\((.*)\)', data).group(1))
for store in stores:
+ properties = {
+ "lat": store.get('latitude'),
+ "lon": store.get('longitude'),
+ "ref": str(store.get('storeid')),
+ "phone": store.get('phone'),
+ "name": store.get('restaurantname'),
+ "opening_hours": self.store_hours(store.get('businesshours')),
+ "addr_full": store.get('address1'),
+ "city": store.get('city'),
+ "state": store.get('statecode'),
+ "postcode": store.get('zipcode'),
+ "website": response.urljoin(store.get('url')),
+ }
- yield GeojsonPointItem(
- lat=store.get('latitude'),
- lon=store.get('longitude'),
- ref=str(store.get('storeid')),
- phone=store.get('phone'),
- name=store.get('restaurantname'),
- opening_hours=self.store_hours(store.get('businesshours')),
- addr_full=store.get('address1'),
- city=store.get('city'),
- state=store.get('statecode'),
- postcode=store.get('zipcode'),
- website=URL + store.get('url'),
- )
+ yield GeojsonPointItem(**properties)
| {"golden_diff": "diff --git a/locations/spiders/quiznos.py b/locations/spiders/quiznos.py\n--- a/locations/spiders/quiznos.py\n+++ b/locations/spiders/quiznos.py\n@@ -2,22 +2,24 @@\n import scrapy\n import json\n import re\n-import traceback\n \n from locations.items import GeojsonPointItem\n \n-URL = \"http://restaurants.quiznos.com\"\n \n class QuiznosSpider(scrapy.Spider):\n name = \"quiznos\"\n item_attributes = { 'brand': \"Quizno's\", 'brand_wikidata': \"Q1936229\" }\n- allowed_domains = [URL]\n+ allowed_domains = [\"https://restaurants.quiznos.com\"]\n start_urls = (\n- 'http://restaurants.quiznos.com/data/stores.json?callback=storeList',\n+ 'https://restaurants.quiznos.com/data/stores.json',\n )\n \n def store_hours(self, store_hours):\n- if store_hours == '' or store_hours.lower().find('close') > -1: return ''\n+ if store_hours == '' or store_hours == ' - ' or store_hours.lower().find('close') > -1:\n+ return ''\n+\n+ if store_hours == 'Open 24 Hours - Open 24 Hours':\n+ return '24/7'\n \n day_groups = []\n this_day_group = None\n@@ -67,17 +69,18 @@\n stores = json.loads(re.search(r'storeList\\((.*)\\)', data).group(1))\n \n for store in stores:\n+ properties = {\n+ \"lat\": store.get('latitude'),\n+ \"lon\": store.get('longitude'),\n+ \"ref\": str(store.get('storeid')),\n+ \"phone\": store.get('phone'),\n+ \"name\": store.get('restaurantname'),\n+ \"opening_hours\": self.store_hours(store.get('businesshours')),\n+ \"addr_full\": store.get('address1'),\n+ \"city\": store.get('city'),\n+ \"state\": store.get('statecode'),\n+ \"postcode\": store.get('zipcode'),\n+ \"website\": response.urljoin(store.get('url')),\n+ }\n \n- yield GeojsonPointItem(\n- lat=store.get('latitude'),\n- lon=store.get('longitude'),\n- ref=str(store.get('storeid')),\n- phone=store.get('phone'),\n- name=store.get('restaurantname'),\n- opening_hours=self.store_hours(store.get('businesshours')),\n- addr_full=store.get('address1'),\n- city=store.get('city'),\n- state=store.get('statecode'),\n- postcode=store.get('zipcode'),\n- website=URL + store.get('url'),\n- )\n+ yield GeojsonPointItem(**properties)\n", "issue": "Spider quiznos is broken\nDuring the global build at 2021-05-26-14-42-23, spider **quiznos** failed with **129 features** and **1 errors**.\n\nHere's [the log](https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/logs/quiznos.log) and [the output](https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/output/quiznos.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/output/quiznos.geojson))\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nimport scrapy\nimport json\nimport re\nimport traceback\n\nfrom locations.items import GeojsonPointItem\n\nURL = \"http://restaurants.quiznos.com\"\n\nclass QuiznosSpider(scrapy.Spider):\n name = \"quiznos\"\n item_attributes = { 'brand': \"Quizno's\", 'brand_wikidata': \"Q1936229\" }\n allowed_domains = [URL]\n start_urls = (\n 'http://restaurants.quiznos.com/data/stores.json?callback=storeList',\n )\n\n def store_hours(self, store_hours):\n if store_hours == '' or store_hours.lower().find('close') > -1: return ''\n\n day_groups = []\n this_day_group = None\n hour_intervals = []\n\n interval = store_hours.split(' - ')\n start_time = interval[0].split(' ')\n end_time = interval[1].split(' ')\n start_hour = start_time[0].split(':')\n end_hour = end_time[0].split(':')\n\n hour_intervals.append('{}:{}-{}:{}'.format(\n start_hour[0],\n start_hour[1],\n int(end_hour[0]) + 12 if end_time[1] 
== 'PM' else end_hour[0],\n end_hour[1],\n ))\n\n hours = ','.join(hour_intervals)\n\n if not this_day_group:\n this_day_group = {\n 'from_day': 'Su',\n 'to_day': 'Sa',\n 'hours': hours\n }\n\n day_groups.append(this_day_group)\n\n opening_hours = \"\"\n if len(day_groups) == 1 and day_groups[0]['hours'] in ('00:00-23:59', '00:00-00:00'):\n opening_hours = '24/7'\n else:\n for day_group in day_groups:\n if day_group['from_day'] == day_group['to_day']:\n opening_hours += '{from_day} {hours}; '.format(**day_group)\n elif day_group['from_day'] == 'Su' and day_group['to_day'] == 'Sa':\n opening_hours += '{hours}; '.format(**day_group)\n else:\n opening_hours += '{from_day}-{to_day} {hours}; '.format(**day_group)\n opening_hours = opening_hours[:-2]\n\n return opening_hours\n\n def parse(self, response):\n data = response.body_as_unicode()\n stores = json.loads(re.search(r'storeList\\((.*)\\)', data).group(1))\n\n for store in stores:\n\n yield GeojsonPointItem(\n lat=store.get('latitude'),\n lon=store.get('longitude'),\n ref=str(store.get('storeid')),\n phone=store.get('phone'),\n name=store.get('restaurantname'),\n opening_hours=self.store_hours(store.get('businesshours')),\n addr_full=store.get('address1'),\n city=store.get('city'),\n state=store.get('statecode'),\n postcode=store.get('zipcode'),\n website=URL + store.get('url'),\n )\n", "path": "locations/spiders/quiznos.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nimport scrapy\nimport json\nimport re\n\nfrom locations.items import GeojsonPointItem\n\n\nclass QuiznosSpider(scrapy.Spider):\n name = \"quiznos\"\n item_attributes = { 'brand': \"Quizno's\", 'brand_wikidata': \"Q1936229\" }\n allowed_domains = [\"https://restaurants.quiznos.com\"]\n start_urls = (\n 'https://restaurants.quiznos.com/data/stores.json',\n )\n\n def store_hours(self, store_hours):\n if store_hours == '' or store_hours == ' - ' or store_hours.lower().find('close') > -1:\n return ''\n\n if store_hours == 'Open 24 Hours - Open 24 Hours':\n return '24/7'\n\n day_groups = []\n this_day_group = None\n hour_intervals = []\n\n interval = store_hours.split(' - ')\n start_time = interval[0].split(' ')\n end_time = interval[1].split(' ')\n start_hour = start_time[0].split(':')\n end_hour = end_time[0].split(':')\n\n hour_intervals.append('{}:{}-{}:{}'.format(\n start_hour[0],\n start_hour[1],\n int(end_hour[0]) + 12 if end_time[1] == 'PM' else end_hour[0],\n end_hour[1],\n ))\n\n hours = ','.join(hour_intervals)\n\n if not this_day_group:\n this_day_group = {\n 'from_day': 'Su',\n 'to_day': 'Sa',\n 'hours': hours\n }\n\n day_groups.append(this_day_group)\n\n opening_hours = \"\"\n if len(day_groups) == 1 and day_groups[0]['hours'] in ('00:00-23:59', '00:00-00:00'):\n opening_hours = '24/7'\n else:\n for day_group in day_groups:\n if day_group['from_day'] == day_group['to_day']:\n opening_hours += '{from_day} {hours}; '.format(**day_group)\n elif day_group['from_day'] == 'Su' and day_group['to_day'] == 'Sa':\n opening_hours += '{hours}; '.format(**day_group)\n else:\n opening_hours += '{from_day}-{to_day} {hours}; '.format(**day_group)\n opening_hours = opening_hours[:-2]\n\n return opening_hours\n\n def parse(self, response):\n data = response.body_as_unicode()\n stores = json.loads(re.search(r'storeList\\((.*)\\)', data).group(1))\n\n for store in stores:\n properties = {\n \"lat\": store.get('latitude'),\n \"lon\": store.get('longitude'),\n \"ref\": str(store.get('storeid')),\n \"phone\": store.get('phone'),\n \"name\": store.get('restaurantname'),\n 
\"opening_hours\": self.store_hours(store.get('businesshours')),\n \"addr_full\": store.get('address1'),\n \"city\": store.get('city'),\n \"state\": store.get('statecode'),\n \"postcode\": store.get('zipcode'),\n \"website\": response.urljoin(store.get('url')),\n }\n\n yield GeojsonPointItem(**properties)\n", "path": "locations/spiders/quiznos.py"}]} | 1,278 | 609 |
gh_patches_debug_37893 | rasdani/github-patches | git_diff | hpcaitech__ColossalAI-5532 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[tensor] fix some unittests
[tensor] fix some unittests
[tensor] fix some unittests
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/language/grok-1/inference_tp.py`
Content:
```
1 import time
2
3 import torch
4 from grok1_policy import Grok1ForCausalLMPolicy
5 from transformers import AutoModelForCausalLM, LlamaTokenizerFast
6 from utils import get_defualt_parser, inference, print_output
7
8 import colossalai
9 from colossalai.booster import Booster
10 from colossalai.booster.plugin import HybridParallelPlugin
11 from colossalai.cluster import DistCoordinator
12 from colossalai.lazy import LazyInitContext
13 from colossalai.utils import get_current_device
14
15 if __name__ == "__main__":
16 parser = get_defualt_parser()
17 args = parser.parse_args()
18 start = time.time()
19 colossalai.launch_from_torch({})
20 coordinator = DistCoordinator()
21 plugin = HybridParallelPlugin(
22 tp_size=coordinator.world_size,
23 pp_size=1,
24 precision="bf16",
25 parallel_output=False,
26 custom_policy=Grok1ForCausalLMPolicy(),
27 )
28 booster = Booster(plugin=plugin)
29 torch.set_default_dtype(torch.bfloat16)
30 with LazyInitContext(default_device=get_current_device()):
31 model = AutoModelForCausalLM.from_pretrained(
32 args.pretrained, trust_remote_code=True, torch_dtype=torch.bfloat16
33 )
34 model, *_ = booster.boost(model)
35 model.eval()
36 init_time = time.time() - start
37
38 # A transformers-compatible version of the grok-1 tokenizer by Xenova
39 # https://huggingface.co/Xenova/grok-1-tokenizer
40 tokenizer = LlamaTokenizerFast.from_pretrained("Xenova/grok-1-tokenizer")
41
42 for text in args.text:
43 output = inference(
44 model.unwrap(),
45 tokenizer,
46 text,
47 max_new_tokens=args.max_new_tokens,
48 do_sample=args.do_sample,
49 temperature=args.temperature,
50 top_k=args.top_k,
51 top_p=args.top_p,
52 )
53 if coordinator.is_master():
54 print_output(text, tokenizer.decode(output))
55
56 overall_time = time.time() - start
57 gen_latency = overall_time - init_time
58 avg_gen_latency = gen_latency / len(args.text)
59 coordinator.print_on_master(
60 f"Initializing time: {init_time:.2f} seconds.\n"
61 f"Overall time: {overall_time:.2f} seconds. \n"
62 f"Generation latency: {gen_latency:.2f} seconds. \n"
63 f"Average generation latency: {avg_gen_latency:.2f} seconds. \n"
64 )
65
```
Path: `examples/language/grok-1/inference.py`
Content:
```
1 import time
2
3 import torch
4 from transformers import AutoModelForCausalLM, LlamaTokenizerFast
5 from utils import get_defualt_parser, inference, print_output
6
7 if __name__ == "__main__":
8 parser = get_defualt_parser()
9 args = parser.parse_args()
10 start = time.time()
11 torch.set_default_dtype(torch.bfloat16)
12 model = AutoModelForCausalLM.from_pretrained(
13 args.pretrained,
14 trust_remote_code=True,
15 device_map="auto",
16 torch_dtype=torch.bfloat16,
17 )
18 model.eval()
19 init_time = time.time() - start
20
21 # A transformers-compatible version of the grok-1 tokenizer by Xenova
22 # https://huggingface.co/Xenova/grok-1-tokenizer
23 tokenizer = LlamaTokenizerFast.from_pretrained("Xenova/grok-1-tokenizer")
24
25 for text in args.text:
26 output = inference(
27 model,
28 tokenizer,
29 text,
30 max_new_tokens=args.max_new_tokens,
31 do_sample=args.do_sample,
32 temperature=args.temperature,
33 top_k=args.top_k,
34 top_p=args.top_p,
35 )
36 print_output(text, tokenizer.decode(output))
37
38 overall_time = time.time() - start
39 gen_latency = overall_time - init_time
40 avg_gen_latency = gen_latency / len(args.text)
41 print(
42 f"Initializing time: {init_time:.2f} seconds.\n"
43 f"Overall time: {overall_time:.2f} seconds. \n"
44 f"Generation latency: {gen_latency:.2f} seconds. \n"
45 f"Average generation latency: {avg_gen_latency:.2f} seconds. \n"
46 )
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/examples/language/grok-1/inference.py b/examples/language/grok-1/inference.py
--- a/examples/language/grok-1/inference.py
+++ b/examples/language/grok-1/inference.py
@@ -1,7 +1,7 @@
import time
import torch
-from transformers import AutoModelForCausalLM, LlamaTokenizerFast
+from transformers import AutoModelForCausalLM, AutoTokenizer
from utils import get_defualt_parser, inference, print_output
if __name__ == "__main__":
@@ -9,6 +9,9 @@
args = parser.parse_args()
start = time.time()
torch.set_default_dtype(torch.bfloat16)
+
+ tokenizer = AutoTokenizer.from_pretrained(args.pretrained, trust_remote_code=True)
+
model = AutoModelForCausalLM.from_pretrained(
args.pretrained,
trust_remote_code=True,
@@ -18,10 +21,6 @@
model.eval()
init_time = time.time() - start
- # A transformers-compatible version of the grok-1 tokenizer by Xenova
- # https://huggingface.co/Xenova/grok-1-tokenizer
- tokenizer = LlamaTokenizerFast.from_pretrained("Xenova/grok-1-tokenizer")
-
for text in args.text:
output = inference(
model,
diff --git a/examples/language/grok-1/inference_tp.py b/examples/language/grok-1/inference_tp.py
--- a/examples/language/grok-1/inference_tp.py
+++ b/examples/language/grok-1/inference_tp.py
@@ -2,7 +2,7 @@
import torch
from grok1_policy import Grok1ForCausalLMPolicy
-from transformers import AutoModelForCausalLM, LlamaTokenizerFast
+from transformers import AutoModelForCausalLM, AutoTokenizer
from utils import get_defualt_parser, inference, print_output
import colossalai
@@ -27,6 +27,9 @@
)
booster = Booster(plugin=plugin)
torch.set_default_dtype(torch.bfloat16)
+
+ tokenizer = AutoTokenizer.from_pretrained(args.pretrained, trust_remote_code=True)
+
with LazyInitContext(default_device=get_current_device()):
model = AutoModelForCausalLM.from_pretrained(
args.pretrained, trust_remote_code=True, torch_dtype=torch.bfloat16
@@ -35,10 +38,6 @@
model.eval()
init_time = time.time() - start
- # A transformers-compatible version of the grok-1 tokenizer by Xenova
- # https://huggingface.co/Xenova/grok-1-tokenizer
- tokenizer = LlamaTokenizerFast.from_pretrained("Xenova/grok-1-tokenizer")
-
for text in args.text:
output = inference(
model.unwrap(),
| {"golden_diff": "diff --git a/examples/language/grok-1/inference.py b/examples/language/grok-1/inference.py\n--- a/examples/language/grok-1/inference.py\n+++ b/examples/language/grok-1/inference.py\n@@ -1,7 +1,7 @@\n import time\n \n import torch\n-from transformers import AutoModelForCausalLM, LlamaTokenizerFast\n+from transformers import AutoModelForCausalLM, AutoTokenizer\n from utils import get_defualt_parser, inference, print_output\n \n if __name__ == \"__main__\":\n@@ -9,6 +9,9 @@\n args = parser.parse_args()\n start = time.time()\n torch.set_default_dtype(torch.bfloat16)\n+\n+ tokenizer = AutoTokenizer.from_pretrained(args.pretrained, trust_remote_code=True)\n+\n model = AutoModelForCausalLM.from_pretrained(\n args.pretrained,\n trust_remote_code=True,\n@@ -18,10 +21,6 @@\n model.eval()\n init_time = time.time() - start\n \n- # A transformers-compatible version of the grok-1 tokenizer by Xenova\n- # https://huggingface.co/Xenova/grok-1-tokenizer\n- tokenizer = LlamaTokenizerFast.from_pretrained(\"Xenova/grok-1-tokenizer\")\n-\n for text in args.text:\n output = inference(\n model,\ndiff --git a/examples/language/grok-1/inference_tp.py b/examples/language/grok-1/inference_tp.py\n--- a/examples/language/grok-1/inference_tp.py\n+++ b/examples/language/grok-1/inference_tp.py\n@@ -2,7 +2,7 @@\n \n import torch\n from grok1_policy import Grok1ForCausalLMPolicy\n-from transformers import AutoModelForCausalLM, LlamaTokenizerFast\n+from transformers import AutoModelForCausalLM, AutoTokenizer\n from utils import get_defualt_parser, inference, print_output\n \n import colossalai\n@@ -27,6 +27,9 @@\n )\n booster = Booster(plugin=plugin)\n torch.set_default_dtype(torch.bfloat16)\n+\n+ tokenizer = AutoTokenizer.from_pretrained(args.pretrained, trust_remote_code=True)\n+\n with LazyInitContext(default_device=get_current_device()):\n model = AutoModelForCausalLM.from_pretrained(\n args.pretrained, trust_remote_code=True, torch_dtype=torch.bfloat16\n@@ -35,10 +38,6 @@\n model.eval()\n init_time = time.time() - start\n \n- # A transformers-compatible version of the grok-1 tokenizer by Xenova\n- # https://huggingface.co/Xenova/grok-1-tokenizer\n- tokenizer = LlamaTokenizerFast.from_pretrained(\"Xenova/grok-1-tokenizer\")\n-\n for text in args.text:\n output = inference(\n model.unwrap(),\n", "issue": "[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n", "before_files": [{"content": "import time\n\nimport torch\nfrom grok1_policy import Grok1ForCausalLMPolicy\nfrom transformers import AutoModelForCausalLM, LlamaTokenizerFast\nfrom utils import get_defualt_parser, inference, print_output\n\nimport colossalai\nfrom colossalai.booster import Booster\nfrom colossalai.booster.plugin import HybridParallelPlugin\nfrom colossalai.cluster import DistCoordinator\nfrom colossalai.lazy import LazyInitContext\nfrom colossalai.utils import get_current_device\n\nif __name__ == \"__main__\":\n parser = get_defualt_parser()\n args = parser.parse_args()\n start = time.time()\n colossalai.launch_from_torch({})\n coordinator = DistCoordinator()\n plugin = HybridParallelPlugin(\n tp_size=coordinator.world_size,\n pp_size=1,\n precision=\"bf16\",\n parallel_output=False,\n custom_policy=Grok1ForCausalLMPolicy(),\n )\n booster = Booster(plugin=plugin)\n torch.set_default_dtype(torch.bfloat16)\n with LazyInitContext(default_device=get_current_device()):\n model = AutoModelForCausalLM.from_pretrained(\n args.pretrained, trust_remote_code=True, torch_dtype=torch.bfloat16\n )\n 
model, *_ = booster.boost(model)\n model.eval()\n init_time = time.time() - start\n\n # A transformers-compatible version of the grok-1 tokenizer by Xenova\n # https://huggingface.co/Xenova/grok-1-tokenizer\n tokenizer = LlamaTokenizerFast.from_pretrained(\"Xenova/grok-1-tokenizer\")\n\n for text in args.text:\n output = inference(\n model.unwrap(),\n tokenizer,\n text,\n max_new_tokens=args.max_new_tokens,\n do_sample=args.do_sample,\n temperature=args.temperature,\n top_k=args.top_k,\n top_p=args.top_p,\n )\n if coordinator.is_master():\n print_output(text, tokenizer.decode(output))\n\n overall_time = time.time() - start\n gen_latency = overall_time - init_time\n avg_gen_latency = gen_latency / len(args.text)\n coordinator.print_on_master(\n f\"Initializing time: {init_time:.2f} seconds.\\n\"\n f\"Overall time: {overall_time:.2f} seconds. \\n\"\n f\"Generation latency: {gen_latency:.2f} seconds. \\n\"\n f\"Average generation latency: {avg_gen_latency:.2f} seconds. \\n\"\n )\n", "path": "examples/language/grok-1/inference_tp.py"}, {"content": "import time\n\nimport torch\nfrom transformers import AutoModelForCausalLM, LlamaTokenizerFast\nfrom utils import get_defualt_parser, inference, print_output\n\nif __name__ == \"__main__\":\n parser = get_defualt_parser()\n args = parser.parse_args()\n start = time.time()\n torch.set_default_dtype(torch.bfloat16)\n model = AutoModelForCausalLM.from_pretrained(\n args.pretrained,\n trust_remote_code=True,\n device_map=\"auto\",\n torch_dtype=torch.bfloat16,\n )\n model.eval()\n init_time = time.time() - start\n\n # A transformers-compatible version of the grok-1 tokenizer by Xenova\n # https://huggingface.co/Xenova/grok-1-tokenizer\n tokenizer = LlamaTokenizerFast.from_pretrained(\"Xenova/grok-1-tokenizer\")\n\n for text in args.text:\n output = inference(\n model,\n tokenizer,\n text,\n max_new_tokens=args.max_new_tokens,\n do_sample=args.do_sample,\n temperature=args.temperature,\n top_k=args.top_k,\n top_p=args.top_p,\n )\n print_output(text, tokenizer.decode(output))\n\n overall_time = time.time() - start\n gen_latency = overall_time - init_time\n avg_gen_latency = gen_latency / len(args.text)\n print(\n f\"Initializing time: {init_time:.2f} seconds.\\n\"\n f\"Overall time: {overall_time:.2f} seconds. \\n\"\n f\"Generation latency: {gen_latency:.2f} seconds. \\n\"\n f\"Average generation latency: {avg_gen_latency:.2f} seconds. 
\\n\"\n )\n", "path": "examples/language/grok-1/inference.py"}], "after_files": [{"content": "import time\n\nimport torch\nfrom grok1_policy import Grok1ForCausalLMPolicy\nfrom transformers import AutoModelForCausalLM, AutoTokenizer\nfrom utils import get_defualt_parser, inference, print_output\n\nimport colossalai\nfrom colossalai.booster import Booster\nfrom colossalai.booster.plugin import HybridParallelPlugin\nfrom colossalai.cluster import DistCoordinator\nfrom colossalai.lazy import LazyInitContext\nfrom colossalai.utils import get_current_device\n\nif __name__ == \"__main__\":\n parser = get_defualt_parser()\n args = parser.parse_args()\n start = time.time()\n colossalai.launch_from_torch({})\n coordinator = DistCoordinator()\n plugin = HybridParallelPlugin(\n tp_size=coordinator.world_size,\n pp_size=1,\n precision=\"bf16\",\n parallel_output=False,\n custom_policy=Grok1ForCausalLMPolicy(),\n )\n booster = Booster(plugin=plugin)\n torch.set_default_dtype(torch.bfloat16)\n\n tokenizer = AutoTokenizer.from_pretrained(args.pretrained, trust_remote_code=True)\n\n with LazyInitContext(default_device=get_current_device()):\n model = AutoModelForCausalLM.from_pretrained(\n args.pretrained, trust_remote_code=True, torch_dtype=torch.bfloat16\n )\n model, *_ = booster.boost(model)\n model.eval()\n init_time = time.time() - start\n\n for text in args.text:\n output = inference(\n model.unwrap(),\n tokenizer,\n text,\n max_new_tokens=args.max_new_tokens,\n do_sample=args.do_sample,\n temperature=args.temperature,\n top_k=args.top_k,\n top_p=args.top_p,\n )\n if coordinator.is_master():\n print_output(text, tokenizer.decode(output))\n\n overall_time = time.time() - start\n gen_latency = overall_time - init_time\n avg_gen_latency = gen_latency / len(args.text)\n coordinator.print_on_master(\n f\"Initializing time: {init_time:.2f} seconds.\\n\"\n f\"Overall time: {overall_time:.2f} seconds. \\n\"\n f\"Generation latency: {gen_latency:.2f} seconds. \\n\"\n f\"Average generation latency: {avg_gen_latency:.2f} seconds. \\n\"\n )\n", "path": "examples/language/grok-1/inference_tp.py"}, {"content": "import time\n\nimport torch\nfrom transformers import AutoModelForCausalLM, AutoTokenizer\nfrom utils import get_defualt_parser, inference, print_output\n\nif __name__ == \"__main__\":\n parser = get_defualt_parser()\n args = parser.parse_args()\n start = time.time()\n torch.set_default_dtype(torch.bfloat16)\n\n tokenizer = AutoTokenizer.from_pretrained(args.pretrained, trust_remote_code=True)\n\n model = AutoModelForCausalLM.from_pretrained(\n args.pretrained,\n trust_remote_code=True,\n device_map=\"auto\",\n torch_dtype=torch.bfloat16,\n )\n model.eval()\n init_time = time.time() - start\n\n for text in args.text:\n output = inference(\n model,\n tokenizer,\n text,\n max_new_tokens=args.max_new_tokens,\n do_sample=args.do_sample,\n temperature=args.temperature,\n top_k=args.top_k,\n top_p=args.top_p,\n )\n print_output(text, tokenizer.decode(output))\n\n overall_time = time.time() - start\n gen_latency = overall_time - init_time\n avg_gen_latency = gen_latency / len(args.text)\n print(\n f\"Initializing time: {init_time:.2f} seconds.\\n\"\n f\"Overall time: {overall_time:.2f} seconds. \\n\"\n f\"Generation latency: {gen_latency:.2f} seconds. \\n\"\n f\"Average generation latency: {avg_gen_latency:.2f} seconds. \\n\"\n )\n", "path": "examples/language/grok-1/inference.py"}]} | 1,411 | 622 |
gh_patches_debug_10285 | rasdani/github-patches | git_diff | certbot__certbot-6002 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Certbot crashes if EFF e-mail submission endpoint is unavailable
Certbot will crash if the user's e-mail information can't be submitted to the EFF endpoint:
https://community.letsencrypt.org/t/issue-when-trying-to-create-certificate/59307/1
https://github.com/certbot/certbot/blob/master/certbot/eff.py#L74
It crashes with a somewhat relevant error if the endpoint is unavailable, but a confusing error if the endpoint returns something other than JSON. It should probably not crash at all in either case. If the endpoint specifically says that the user's e-mail address is malformed, that's probably worth stopping to report to the user, but if the endpoint simply doesn't reply usefully, it's probably preferable to continue since I wouldn't think we really want to make users wait in this situation.
(Adding the "bug" tag at least for the case where the endpoint returns a non-JSON reply because the error shown to the user in this case is unhelpful.)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `certbot/eff.py`
Content:
```
1 """Subscribes users to the EFF newsletter."""
2 import logging
3
4 import requests
5 import zope.component
6
7 from certbot import constants
8 from certbot import interfaces
9
10
11 logger = logging.getLogger(__name__)
12
13
14 def handle_subscription(config):
15 """High level function to take care of EFF newsletter subscriptions.
16
17 The user may be asked if they want to sign up for the newsletter if
18 they have not already specified.
19
20 :param .IConfig config: Client configuration.
21
22 """
23 if config.email is None:
24 if config.eff_email:
25 _report_failure("you didn't provide an e-mail address")
26 return
27 if config.eff_email is None:
28 config.eff_email = _want_subscription()
29 if config.eff_email:
30 subscribe(config.email)
31
32
33 def _want_subscription():
34 """Does the user want to be subscribed to the EFF newsletter?
35
36 :returns: True if we should subscribe the user, otherwise, False
37 :rtype: bool
38
39 """
40 prompt = (
41 'Would you be willing to share your email address with the '
42 "Electronic Frontier Foundation, a founding partner of the Let's "
43 'Encrypt project and the non-profit organization that develops '
44 "Certbot? We'd like to send you email about EFF and our work to "
45 'encrypt the web, protect its users and defend digital rights.')
46 display = zope.component.getUtility(interfaces.IDisplay)
47 return display.yesno(prompt, default=False)
48
49
50 def subscribe(email):
51 """Subscribe the user to the EFF mailing list.
52
53 :param str email: the e-mail address to subscribe
54
55 """
56 url = constants.EFF_SUBSCRIBE_URI
57 data = {'data_type': 'json',
58 'email': email,
59 'form_id': 'eff_supporters_library_subscribe_form'}
60 logger.debug('Sending POST request to %s:\n%s', url, data)
61 _check_response(requests.post(url, data=data))
62
63
64 def _check_response(response):
65 """Check for errors in the server's response.
66
67 If an error occurred, it will be reported to the user.
68
69 :param requests.Response response: the server's response to the
70 subscription request
71
72 """
73 logger.debug('Received response:\n%s', response.content)
74 if response.ok:
75 if not response.json()['status']:
76 _report_failure('your e-mail address appears to be invalid')
77 else:
78 _report_failure()
79
80
81 def _report_failure(reason=None):
82 """Notify the user of failing to sign them up for the newsletter.
83
84 :param reason: a phrase describing what the problem was
85 beginning with a lowercase letter and no closing punctuation
86 :type reason: `str` or `None`
87
88 """
89 msg = ['We were unable to subscribe you the EFF mailing list']
90 if reason is not None:
91 msg.append(' because ')
92 msg.append(reason)
93 msg.append('. You can try again later by visiting https://act.eff.org.')
94 reporter = zope.component.getUtility(interfaces.IReporter)
95 reporter.add_message(''.join(msg), reporter.LOW_PRIORITY)
96
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/certbot/eff.py b/certbot/eff.py
--- a/certbot/eff.py
+++ b/certbot/eff.py
@@ -71,11 +71,14 @@
"""
logger.debug('Received response:\n%s', response.content)
- if response.ok:
- if not response.json()['status']:
+ try:
+ response.raise_for_status()
+ if response.json()['status'] == False:
_report_failure('your e-mail address appears to be invalid')
- else:
+ except requests.exceptions.HTTPError:
_report_failure()
+ except (ValueError, KeyError):
+ _report_failure('there was a problem with the server response')
def _report_failure(reason=None):
| {"golden_diff": "diff --git a/certbot/eff.py b/certbot/eff.py\n--- a/certbot/eff.py\n+++ b/certbot/eff.py\n@@ -71,11 +71,14 @@\n \n \"\"\"\n logger.debug('Received response:\\n%s', response.content)\n- if response.ok:\n- if not response.json()['status']:\n+ try:\n+ response.raise_for_status()\n+ if response.json()['status'] == False:\n _report_failure('your e-mail address appears to be invalid')\n- else:\n+ except requests.exceptions.HTTPError:\n _report_failure()\n+ except (ValueError, KeyError):\n+ _report_failure('there was a problem with the server response')\n \n \n def _report_failure(reason=None):\n", "issue": "Certbot crashes if EFF e-mail submission endpoint is unavailable\nCertbot will crash if the user's e-mail information can't be submitted to the EFF endpoint:\r\n\r\nhttps://community.letsencrypt.org/t/issue-when-trying-to-create-certificate/59307/1\r\n\r\nhttps://github.com/certbot/certbot/blob/master/certbot/eff.py#L74\r\n\r\nIt crashes with a somewhat relevant error if the endpoint is unavailable, but a confusing error if the endpoint returns something other than JSON. It should probably not crash at all in either case. If the endpoint specifically says that the user's e-mail address is malformed, that's probably worth stopping to report to the user, but if the endpoint simply doesn't reply usefully, it's probably preferable to continue since I wouldn't think we really want to make users wait in this situation.\r\n\r\n(Adding the \"bug\" tag at least for the case where the endpoint returns a non-JSON reply because the error shown to the user in this case is unhelpful.)\n", "before_files": [{"content": "\"\"\"Subscribes users to the EFF newsletter.\"\"\"\nimport logging\n\nimport requests\nimport zope.component\n\nfrom certbot import constants\nfrom certbot import interfaces\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef handle_subscription(config):\n \"\"\"High level function to take care of EFF newsletter subscriptions.\n\n The user may be asked if they want to sign up for the newsletter if\n they have not already specified.\n\n :param .IConfig config: Client configuration.\n\n \"\"\"\n if config.email is None:\n if config.eff_email:\n _report_failure(\"you didn't provide an e-mail address\")\n return\n if config.eff_email is None:\n config.eff_email = _want_subscription()\n if config.eff_email:\n subscribe(config.email)\n\n\ndef _want_subscription():\n \"\"\"Does the user want to be subscribed to the EFF newsletter?\n\n :returns: True if we should subscribe the user, otherwise, False\n :rtype: bool\n\n \"\"\"\n prompt = (\n 'Would you be willing to share your email address with the '\n \"Electronic Frontier Foundation, a founding partner of the Let's \"\n 'Encrypt project and the non-profit organization that develops '\n \"Certbot? 
We'd like to send you email about EFF and our work to \"\n 'encrypt the web, protect its users and defend digital rights.')\n display = zope.component.getUtility(interfaces.IDisplay)\n return display.yesno(prompt, default=False)\n\n\ndef subscribe(email):\n \"\"\"Subscribe the user to the EFF mailing list.\n\n :param str email: the e-mail address to subscribe\n\n \"\"\"\n url = constants.EFF_SUBSCRIBE_URI\n data = {'data_type': 'json',\n 'email': email,\n 'form_id': 'eff_supporters_library_subscribe_form'}\n logger.debug('Sending POST request to %s:\\n%s', url, data)\n _check_response(requests.post(url, data=data))\n\n\ndef _check_response(response):\n \"\"\"Check for errors in the server's response.\n\n If an error occurred, it will be reported to the user.\n\n :param requests.Response response: the server's response to the\n subscription request\n\n \"\"\"\n logger.debug('Received response:\\n%s', response.content)\n if response.ok:\n if not response.json()['status']:\n _report_failure('your e-mail address appears to be invalid')\n else:\n _report_failure()\n\n\ndef _report_failure(reason=None):\n \"\"\"Notify the user of failing to sign them up for the newsletter.\n\n :param reason: a phrase describing what the problem was\n beginning with a lowercase letter and no closing punctuation\n :type reason: `str` or `None`\n\n \"\"\"\n msg = ['We were unable to subscribe you the EFF mailing list']\n if reason is not None:\n msg.append(' because ')\n msg.append(reason)\n msg.append('. You can try again later by visiting https://act.eff.org.')\n reporter = zope.component.getUtility(interfaces.IReporter)\n reporter.add_message(''.join(msg), reporter.LOW_PRIORITY)\n", "path": "certbot/eff.py"}], "after_files": [{"content": "\"\"\"Subscribes users to the EFF newsletter.\"\"\"\nimport logging\n\nimport requests\nimport zope.component\n\nfrom certbot import constants\nfrom certbot import interfaces\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef handle_subscription(config):\n \"\"\"High level function to take care of EFF newsletter subscriptions.\n\n The user may be asked if they want to sign up for the newsletter if\n they have not already specified.\n\n :param .IConfig config: Client configuration.\n\n \"\"\"\n if config.email is None:\n if config.eff_email:\n _report_failure(\"you didn't provide an e-mail address\")\n return\n if config.eff_email is None:\n config.eff_email = _want_subscription()\n if config.eff_email:\n subscribe(config.email)\n\n\ndef _want_subscription():\n \"\"\"Does the user want to be subscribed to the EFF newsletter?\n\n :returns: True if we should subscribe the user, otherwise, False\n :rtype: bool\n\n \"\"\"\n prompt = (\n 'Would you be willing to share your email address with the '\n \"Electronic Frontier Foundation, a founding partner of the Let's \"\n 'Encrypt project and the non-profit organization that develops '\n \"Certbot? 
We'd like to send you email about EFF and our work to \"\n 'encrypt the web, protect its users and defend digital rights.')\n display = zope.component.getUtility(interfaces.IDisplay)\n return display.yesno(prompt, default=False)\n\n\ndef subscribe(email):\n \"\"\"Subscribe the user to the EFF mailing list.\n\n :param str email: the e-mail address to subscribe\n\n \"\"\"\n url = constants.EFF_SUBSCRIBE_URI\n data = {'data_type': 'json',\n 'email': email,\n 'form_id': 'eff_supporters_library_subscribe_form'}\n logger.debug('Sending POST request to %s:\\n%s', url, data)\n _check_response(requests.post(url, data=data))\n\n\ndef _check_response(response):\n \"\"\"Check for errors in the server's response.\n\n If an error occurred, it will be reported to the user.\n\n :param requests.Response response: the server's response to the\n subscription request\n\n \"\"\"\n logger.debug('Received response:\\n%s', response.content)\n try:\n response.raise_for_status()\n if response.json()['status'] == False:\n _report_failure('your e-mail address appears to be invalid')\n except requests.exceptions.HTTPError:\n _report_failure()\n except (ValueError, KeyError):\n _report_failure('there was a problem with the server response')\n\n\ndef _report_failure(reason=None):\n \"\"\"Notify the user of failing to sign them up for the newsletter.\n\n :param reason: a phrase describing what the problem was\n beginning with a lowercase letter and no closing punctuation\n :type reason: `str` or `None`\n\n \"\"\"\n msg = ['We were unable to subscribe you the EFF mailing list']\n if reason is not None:\n msg.append(' because ')\n msg.append(reason)\n msg.append('. You can try again later by visiting https://act.eff.org.')\n reporter = zope.component.getUtility(interfaces.IReporter)\n reporter.add_message(''.join(msg), reporter.LOW_PRIORITY)\n", "path": "certbot/eff.py"}]} | 1,319 | 166 |
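
Editor's note on the certbot record above (not part of the original dataset row): the fix wraps the response handling in try/except so that an unreachable endpoint or a non-JSON body is reported instead of crashing. The sketch below reproduces that defensive pattern against a stubbed response object; the stub class and its exception are illustrative stand-ins, not certbot or requests APIs.

```python
import json

class FakeResponse:
    """Minimal stand-in for an HTTP response object (illustration only)."""
    def __init__(self, status_code: int, body: str):
        self.status_code = status_code
        self.content = body
    def raise_for_status(self):
        if self.status_code >= 400:
            raise RuntimeError(f"HTTP {self.status_code}")
    def json(self):
        return json.loads(self.content)

def check_response(response) -> str:
    # Report, but never crash, whatever the subscription endpoint returns.
    try:
        response.raise_for_status()
        if response.json()["status"] is False:
            return "invalid e-mail address"
    except RuntimeError:            # transport / HTTP-level failure
        return "endpoint unavailable"
    except (ValueError, KeyError):  # non-JSON body or unexpected schema
        return "unexpected server response"
    return "ok"

print(check_response(FakeResponse(200, "<html>not json</html>")))  # unexpected server response
print(check_response(FakeResponse(503, "")))                       # endpoint unavailable
print(check_response(FakeResponse(200, '{"status": true}')))       # ok
```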
gh_patches_debug_34333 | rasdani/github-patches | git_diff | internetarchive__openlibrary-8896 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove fixed subject line from contact form
The contact form has a drop down for choosing a topic. When patrons submit this form, the submissions all go to the same inbox. The selected topics are usually incorrect, but also make it more difficult to manage the inbox due to most messages looking the same.
### Describe the problem that you'd like solved
Let the patron populate the subject line OR auto-populate the subject line from the message.
### Proposal & Constraints
<!-- What is the proposed solution / implementation? Is there a precedent of this approach succeeding elsewhere? -->
<!-- Which suggestions or requirements should be considered for how feature needs to appear or be implemented? -->
### Additional context
<!-- Add any other context or screenshots about the feature request here. -->
### Stakeholders
<!-- @ tag stakeholders of this bug -->
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `openlibrary/plugins/openlibrary/support.py`
Content:
```
1 import hashlib
2
3 import web
4 import logging
5
6 from infogami import config
7 from infogami.utils import delegate
8 from infogami.utils.view import render_template
9
10 from openlibrary import accounts
11 from openlibrary.core import stats
12 from openlibrary.core.cache import get_memcache
13 from openlibrary.plugins.upstream.addbook import get_recaptcha
14 from openlibrary.utils.dateutil import MINUTE_SECS
15
16 logger = logging.getLogger("openlibrary")
17
18
19 class contact(delegate.page):
20 def GET(self):
21 i = web.input(path=None)
22 user = accounts.get_current_user()
23 email = user and user.email
24
25 hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()
26 has_emailed_recently = get_memcache().get('contact-POST-%s' % hashed_ip)
27 recaptcha = has_emailed_recently and get_recaptcha()
28 return render_template("support", email=email, url=i.path, recaptcha=recaptcha)
29
30 def POST(self):
31 form = web.input()
32 patron_name = form.get("name", "")
33 email = form.get("email", "")
34 topic = form.get("topic", "")
35 description = form.get("question", "")
36 url = form.get("url", "")
37 user = accounts.get_current_user()
38 useragent = web.ctx.env.get("HTTP_USER_AGENT", "")
39 if not all([email, topic, description]):
40 return ""
41
42 hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()
43 has_emailed_recently = get_memcache().get('contact-POST-%s' % hashed_ip)
44 if has_emailed_recently:
45 recap = get_recaptcha()
46 if recap and not recap.validate():
47 return render_template(
48 "message.html",
49 'Recaptcha solution was incorrect',
50 (
51 'Please <a href="javascript:history.back()">go back</a> and try '
52 'again.'
53 ),
54 )
55
56 default_assignees = config.get("support_default_assignees", {})
57 if (topic_key := str(topic.replace(" ", "_").lower())) in default_assignees:
58 assignee = default_assignees.get(topic_key)
59 else:
60 assignee = default_assignees.get("default", "[email protected]")
61 stats.increment("ol.support.all")
62 subject = "Support case *%s*" % topic
63
64 url = web.ctx.home + url
65 displayname = user and user.get_name() or ""
66 username = user and user.get_username() or ""
67
68 message = SUPPORT_EMAIL_TEMPLATE % locals()
69 sendmail(email, assignee, subject, message)
70
71 get_memcache().set(
72 'contact-POST-%s' % hashed_ip, "true", expires=15 * MINUTE_SECS
73 )
74 return render_template("email/case_created", assignee)
75
76
77 def sendmail(from_address, to_address, subject, message):
78 if config.get('dummy_sendmail'):
79 msg = (
80 f'To: {to_address}\n'
81 f'From:{from_address}\n'
82 f'Subject:{subject}\n'
83 f'\n{web.safestr(message)}'
84 )
85
86 logger.info("sending email:\n%s", msg)
87 else:
88 web.sendmail(from_address, to_address, subject, message)
89
90
91 SUPPORT_EMAIL_TEMPLATE = """
92
93 Description:\n
94 %(description)s
95
96 A new support case has been filed by %(displayname)s <%(email)s>.
97
98 Topic: %(topic)s
99 URL: %(url)s
100 User-Agent: %(useragent)s
101 OL-username: %(username)s
102 Patron-name: %(patron_name)s
103 """
104
105
106 def setup():
107 pass
108
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/openlibrary/plugins/openlibrary/support.py b/openlibrary/plugins/openlibrary/support.py
--- a/openlibrary/plugins/openlibrary/support.py
+++ b/openlibrary/plugins/openlibrary/support.py
@@ -32,11 +32,12 @@
patron_name = form.get("name", "")
email = form.get("email", "")
topic = form.get("topic", "")
+ subject_line = form.get('subject', '')
description = form.get("question", "")
url = form.get("url", "")
user = accounts.get_current_user()
useragent = web.ctx.env.get("HTTP_USER_AGENT", "")
- if not all([email, topic, description]):
+ if not all([email, description]):
return ""
hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()
@@ -59,7 +60,7 @@
else:
assignee = default_assignees.get("default", "[email protected]")
stats.increment("ol.support.all")
- subject = "Support case *%s*" % topic
+ subject = "Support case *%s*" % self.prepare_subject_line(subject_line)
url = web.ctx.home + url
displayname = user and user.get_name() or ""
@@ -73,6 +74,14 @@
)
return render_template("email/case_created", assignee)
+ def prepare_subject_line(self, subject, max_length=60):
+ if not subject:
+ return '[no subject]'
+ if len(subject) <= max_length:
+ return subject
+
+ return subject[:max_length]
+
def sendmail(from_address, to_address, subject, message):
if config.get('dummy_sendmail'):
@@ -95,7 +104,7 @@
A new support case has been filed by %(displayname)s <%(email)s>.
-Topic: %(topic)s
+Subject: %(subject_line)s
URL: %(url)s
User-Agent: %(useragent)s
OL-username: %(username)s
| {"golden_diff": "diff --git a/openlibrary/plugins/openlibrary/support.py b/openlibrary/plugins/openlibrary/support.py\n--- a/openlibrary/plugins/openlibrary/support.py\n+++ b/openlibrary/plugins/openlibrary/support.py\n@@ -32,11 +32,12 @@\n patron_name = form.get(\"name\", \"\")\n email = form.get(\"email\", \"\")\n topic = form.get(\"topic\", \"\")\n+ subject_line = form.get('subject', '')\n description = form.get(\"question\", \"\")\n url = form.get(\"url\", \"\")\n user = accounts.get_current_user()\n useragent = web.ctx.env.get(\"HTTP_USER_AGENT\", \"\")\n- if not all([email, topic, description]):\n+ if not all([email, description]):\n return \"\"\n \n hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()\n@@ -59,7 +60,7 @@\n else:\n assignee = default_assignees.get(\"default\", \"[email protected]\")\n stats.increment(\"ol.support.all\")\n- subject = \"Support case *%s*\" % topic\n+ subject = \"Support case *%s*\" % self.prepare_subject_line(subject_line)\n \n url = web.ctx.home + url\n displayname = user and user.get_name() or \"\"\n@@ -73,6 +74,14 @@\n )\n return render_template(\"email/case_created\", assignee)\n \n+ def prepare_subject_line(self, subject, max_length=60):\n+ if not subject:\n+ return '[no subject]'\n+ if len(subject) <= max_length:\n+ return subject\n+\n+ return subject[:max_length]\n+\n \n def sendmail(from_address, to_address, subject, message):\n if config.get('dummy_sendmail'):\n@@ -95,7 +104,7 @@\n \n A new support case has been filed by %(displayname)s <%(email)s>.\n \n-Topic: %(topic)s\n+Subject: %(subject_line)s\n URL: %(url)s\n User-Agent: %(useragent)s\n OL-username: %(username)s\n", "issue": "Remove fixed subject line from contact form\nThe contact form has a drop down for choosing a topic. When patrons submit this form, the submissions all go to the same inbox. The selected topics are usually incorrect, but also make it more difficult to manage the inbox due to most messages looking the same.\r\n\r\n### Describe the problem that you'd like solved\r\nLet the patron populate the subject line OR auto-populate the subject line from the message.\r\n\r\n### Proposal & Constraints\r\n<!-- What is the proposed solution / implementation? Is there a precedent of this approach succeeding elsewhere? -->\r\n\r\n<!-- Which suggestions or requirements should be considered for how feature needs to appear or be implemented? -->\r\n\r\n### Additional context\r\n<!-- Add any other context or screenshots about the feature request here. 
-->\r\n\r\n### Stakeholders\r\n<!-- @ tag stakeholders of this bug -->\r\n\r\n\r\n\n", "before_files": [{"content": "import hashlib\n\nimport web\nimport logging\n\nfrom infogami import config\nfrom infogami.utils import delegate\nfrom infogami.utils.view import render_template\n\nfrom openlibrary import accounts\nfrom openlibrary.core import stats\nfrom openlibrary.core.cache import get_memcache\nfrom openlibrary.plugins.upstream.addbook import get_recaptcha\nfrom openlibrary.utils.dateutil import MINUTE_SECS\n\nlogger = logging.getLogger(\"openlibrary\")\n\n\nclass contact(delegate.page):\n def GET(self):\n i = web.input(path=None)\n user = accounts.get_current_user()\n email = user and user.email\n\n hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()\n has_emailed_recently = get_memcache().get('contact-POST-%s' % hashed_ip)\n recaptcha = has_emailed_recently and get_recaptcha()\n return render_template(\"support\", email=email, url=i.path, recaptcha=recaptcha)\n\n def POST(self):\n form = web.input()\n patron_name = form.get(\"name\", \"\")\n email = form.get(\"email\", \"\")\n topic = form.get(\"topic\", \"\")\n description = form.get(\"question\", \"\")\n url = form.get(\"url\", \"\")\n user = accounts.get_current_user()\n useragent = web.ctx.env.get(\"HTTP_USER_AGENT\", \"\")\n if not all([email, topic, description]):\n return \"\"\n\n hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()\n has_emailed_recently = get_memcache().get('contact-POST-%s' % hashed_ip)\n if has_emailed_recently:\n recap = get_recaptcha()\n if recap and not recap.validate():\n return render_template(\n \"message.html\",\n 'Recaptcha solution was incorrect',\n (\n 'Please <a href=\"javascript:history.back()\">go back</a> and try '\n 'again.'\n ),\n )\n\n default_assignees = config.get(\"support_default_assignees\", {})\n if (topic_key := str(topic.replace(\" \", \"_\").lower())) in default_assignees:\n assignee = default_assignees.get(topic_key)\n else:\n assignee = default_assignees.get(\"default\", \"[email protected]\")\n stats.increment(\"ol.support.all\")\n subject = \"Support case *%s*\" % topic\n\n url = web.ctx.home + url\n displayname = user and user.get_name() or \"\"\n username = user and user.get_username() or \"\"\n\n message = SUPPORT_EMAIL_TEMPLATE % locals()\n sendmail(email, assignee, subject, message)\n\n get_memcache().set(\n 'contact-POST-%s' % hashed_ip, \"true\", expires=15 * MINUTE_SECS\n )\n return render_template(\"email/case_created\", assignee)\n\n\ndef sendmail(from_address, to_address, subject, message):\n if config.get('dummy_sendmail'):\n msg = (\n f'To: {to_address}\\n'\n f'From:{from_address}\\n'\n f'Subject:{subject}\\n'\n f'\\n{web.safestr(message)}'\n )\n\n logger.info(\"sending email:\\n%s\", msg)\n else:\n web.sendmail(from_address, to_address, subject, message)\n\n\nSUPPORT_EMAIL_TEMPLATE = \"\"\"\n\nDescription:\\n\n%(description)s\n\nA new support case has been filed by %(displayname)s <%(email)s>.\n\nTopic: %(topic)s\nURL: %(url)s\nUser-Agent: %(useragent)s\nOL-username: %(username)s\nPatron-name: %(patron_name)s\n\"\"\"\n\n\ndef setup():\n pass\n", "path": "openlibrary/plugins/openlibrary/support.py"}], "after_files": [{"content": "import hashlib\n\nimport web\nimport logging\n\nfrom infogami import config\nfrom infogami.utils import delegate\nfrom infogami.utils.view import render_template\n\nfrom openlibrary import accounts\nfrom openlibrary.core import stats\nfrom openlibrary.core.cache import get_memcache\nfrom 
openlibrary.plugins.upstream.addbook import get_recaptcha\nfrom openlibrary.utils.dateutil import MINUTE_SECS\n\nlogger = logging.getLogger(\"openlibrary\")\n\n\nclass contact(delegate.page):\n def GET(self):\n i = web.input(path=None)\n user = accounts.get_current_user()\n email = user and user.email\n\n hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()\n has_emailed_recently = get_memcache().get('contact-POST-%s' % hashed_ip)\n recaptcha = has_emailed_recently and get_recaptcha()\n return render_template(\"support\", email=email, url=i.path, recaptcha=recaptcha)\n\n def POST(self):\n form = web.input()\n patron_name = form.get(\"name\", \"\")\n email = form.get(\"email\", \"\")\n topic = form.get(\"topic\", \"\")\n subject_line = form.get('subject', '')\n description = form.get(\"question\", \"\")\n url = form.get(\"url\", \"\")\n user = accounts.get_current_user()\n useragent = web.ctx.env.get(\"HTTP_USER_AGENT\", \"\")\n if not all([email, description]):\n return \"\"\n\n hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()\n has_emailed_recently = get_memcache().get('contact-POST-%s' % hashed_ip)\n if has_emailed_recently:\n recap = get_recaptcha()\n if recap and not recap.validate():\n return render_template(\n \"message.html\",\n 'Recaptcha solution was incorrect',\n (\n 'Please <a href=\"javascript:history.back()\">go back</a> and try '\n 'again.'\n ),\n )\n\n default_assignees = config.get(\"support_default_assignees\", {})\n if (topic_key := str(topic.replace(\" \", \"_\").lower())) in default_assignees:\n assignee = default_assignees.get(topic_key)\n else:\n assignee = default_assignees.get(\"default\", \"[email protected]\")\n stats.increment(\"ol.support.all\")\n subject = \"Support case *%s*\" % self.prepare_subject_line(subject_line)\n\n url = web.ctx.home + url\n displayname = user and user.get_name() or \"\"\n username = user and user.get_username() or \"\"\n\n message = SUPPORT_EMAIL_TEMPLATE % locals()\n sendmail(email, assignee, subject, message)\n\n get_memcache().set(\n 'contact-POST-%s' % hashed_ip, \"true\", expires=15 * MINUTE_SECS\n )\n return render_template(\"email/case_created\", assignee)\n\n def prepare_subject_line(self, subject, max_length=60):\n if not subject:\n return '[no subject]'\n if len(subject) <= max_length:\n return subject\n\n return subject[:max_length]\n\n\ndef sendmail(from_address, to_address, subject, message):\n if config.get('dummy_sendmail'):\n msg = (\n f'To: {to_address}\\n'\n f'From:{from_address}\\n'\n f'Subject:{subject}\\n'\n f'\\n{web.safestr(message)}'\n )\n\n logger.info(\"sending email:\\n%s\", msg)\n else:\n web.sendmail(from_address, to_address, subject, message)\n\n\nSUPPORT_EMAIL_TEMPLATE = \"\"\"\n\nDescription:\\n\n%(description)s\n\nA new support case has been filed by %(displayname)s <%(email)s>.\n\nSubject: %(subject_line)s\nURL: %(url)s\nUser-Agent: %(useragent)s\nOL-username: %(username)s\nPatron-name: %(patron_name)s\n\"\"\"\n\n\ndef setup():\n pass\n", "path": "openlibrary/plugins/openlibrary/support.py"}]} | 1,429 | 448 |
gh_patches_debug_30256 | rasdani/github-patches | git_diff | TheAlgorithms__Python-3949 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
web_programming/fetch_github_info.py err
<pre>message: Bad credentials. The API can't be accessed using username/password authentication. Please create a personal access token to access this endpoint: http://github.com/settings/tokens
documentation_url: https://docs.github.com/articles/creating-a-personal-access-token-for-the-command-line
</pre>
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `web_programming/fetch_github_info.py`
Content:
```
1 #!/usr/bin/env python3
2
3 """
4 Created by sarathkaul on 14/11/19
5
6 Basic authentication using an API password is deprecated and will soon no longer work.
7 Visit https://developer.github.com/changes/2020-02-14-deprecating-password-auth
8 for more information around suggested workarounds and removal dates.
9 """
10
11
12 import requests
13
14 _GITHUB_API = "https://api.github.com/user"
15
16
17 def fetch_github_info(auth_user: str, auth_pass: str) -> dict:
18 """
19 Fetch GitHub info of a user using the requests module
20 """
21 return requests.get(_GITHUB_API, auth=(auth_user, auth_pass)).json()
22
23
24 if __name__ == "__main__":
25 for key, value in fetch_github_info("<USER NAME>", "<PASSWORD>").items():
26 print(f"{key}: {value}")
27
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/web_programming/fetch_github_info.py b/web_programming/fetch_github_info.py
--- a/web_programming/fetch_github_info.py
+++ b/web_programming/fetch_github_info.py
@@ -1,26 +1,50 @@
#!/usr/bin/env python3
-
"""
Created by sarathkaul on 14/11/19
+Updated by lawric1 on 24/11/20
-Basic authentication using an API password is deprecated and will soon no longer work.
-Visit https://developer.github.com/changes/2020-02-14-deprecating-password-auth
-for more information around suggested workarounds and removal dates.
-"""
+Authentication will be made via access token.
+To generate your personal access token visit https://github.com/settings/tokens.
+
+NOTE:
+Never hardcode any credential information in the code. Always use an environment
+file to store the private information and use the `os` module to get the information
+during runtime.
+Create a ".env" file in the root directory and write these two lines in that file
+with your token::
+
+#!/usr/bin/env bash
+export USER_TOKEN=""
+"""
+import os
+from typing import Any, Dict
import requests
-_GITHUB_API = "https://api.github.com/user"
+BASE_URL = "https://api.github.com"
+# https://docs.github.com/en/free-pro-team@latest/rest/reference/users#get-the-authenticated-user
+AUTHENTICATED_USER_ENDPOINT = BASE_URL + "/user"
-def fetch_github_info(auth_user: str, auth_pass: str) -> dict:
+# https://github.com/settings/tokens
+USER_TOKEN = os.environ.get("USER_TOKEN", "")
+
+
+def fetch_github_info(auth_token: str) -> Dict[Any, Any]:
"""
Fetch GitHub info of a user using the requests module
"""
- return requests.get(_GITHUB_API, auth=(auth_user, auth_pass)).json()
-
-
-if __name__ == "__main__":
- for key, value in fetch_github_info("<USER NAME>", "<PASSWORD>").items():
- print(f"{key}: {value}")
+ headers = {
+ "Authorization": f"token {auth_token}",
+ "Accept": "application/vnd.github.v3+json",
+ }
+ return requests.get(AUTHENTICATED_USER_ENDPOINT, headers=headers).json()
+
+
+if __name__ == "__main__": # pragma: no cover
+ if USER_TOKEN:
+ for key, value in fetch_github_info(USER_TOKEN).items():
+ print(f"{key}: {value}")
+ else:
+ raise ValueError("'USER_TOKEN' field cannot be empty.")
| {"golden_diff": "diff --git a/web_programming/fetch_github_info.py b/web_programming/fetch_github_info.py\n--- a/web_programming/fetch_github_info.py\n+++ b/web_programming/fetch_github_info.py\n@@ -1,26 +1,50 @@\n #!/usr/bin/env python3\n-\n \"\"\"\n Created by sarathkaul on 14/11/19\n+Updated by lawric1 on 24/11/20\n \n-Basic authentication using an API password is deprecated and will soon no longer work.\n-Visit https://developer.github.com/changes/2020-02-14-deprecating-password-auth\n-for more information around suggested workarounds and removal dates.\n-\"\"\"\n+Authentication will be made via access token.\n+To generate your personal access token visit https://github.com/settings/tokens.\n+\n+NOTE:\n+Never hardcode any credential information in the code. Always use an environment\n+file to store the private information and use the `os` module to get the information\n+during runtime.\n \n+Create a \".env\" file in the root directory and write these two lines in that file\n+with your token::\n+\n+#!/usr/bin/env bash\n+export USER_TOKEN=\"\"\n+\"\"\"\n+import os\n+from typing import Any, Dict\n \n import requests\n \n-_GITHUB_API = \"https://api.github.com/user\"\n+BASE_URL = \"https://api.github.com\"\n \n+# https://docs.github.com/en/free-pro-team@latest/rest/reference/users#get-the-authenticated-user\n+AUTHENTICATED_USER_ENDPOINT = BASE_URL + \"/user\"\n \n-def fetch_github_info(auth_user: str, auth_pass: str) -> dict:\n+# https://github.com/settings/tokens\n+USER_TOKEN = os.environ.get(\"USER_TOKEN\", \"\")\n+\n+\n+def fetch_github_info(auth_token: str) -> Dict[Any, Any]:\n \"\"\"\n Fetch GitHub info of a user using the requests module\n \"\"\"\n- return requests.get(_GITHUB_API, auth=(auth_user, auth_pass)).json()\n-\n-\n-if __name__ == \"__main__\":\n- for key, value in fetch_github_info(\"<USER NAME>\", \"<PASSWORD>\").items():\n- print(f\"{key}: {value}\")\n+ headers = {\n+ \"Authorization\": f\"token {auth_token}\",\n+ \"Accept\": \"application/vnd.github.v3+json\",\n+ }\n+ return requests.get(AUTHENTICATED_USER_ENDPOINT, headers=headers).json()\n+\n+\n+if __name__ == \"__main__\": # pragma: no cover\n+ if USER_TOKEN:\n+ for key, value in fetch_github_info(USER_TOKEN).items():\n+ print(f\"{key}: {value}\")\n+ else:\n+ raise ValueError(\"'USER_TOKEN' field cannot be empty.\")\n", "issue": "web_programming/fetch_github_info.py err\n<pre>message: Bad credentials. The API can't be accessed using username/password authentication. 
Please create a personal access token to access this endpoint: http://github.com/settings/tokens\r\ndocumentation_url: https://docs.github.com/articles/creating-a-personal-access-token-for-the-command-line\r\n</pre>\n", "before_files": [{"content": "#!/usr/bin/env python3\n\n\"\"\"\nCreated by sarathkaul on 14/11/19\n\nBasic authentication using an API password is deprecated and will soon no longer work.\nVisit https://developer.github.com/changes/2020-02-14-deprecating-password-auth\nfor more information around suggested workarounds and removal dates.\n\"\"\"\n\n\nimport requests\n\n_GITHUB_API = \"https://api.github.com/user\"\n\n\ndef fetch_github_info(auth_user: str, auth_pass: str) -> dict:\n \"\"\"\n Fetch GitHub info of a user using the requests module\n \"\"\"\n return requests.get(_GITHUB_API, auth=(auth_user, auth_pass)).json()\n\n\nif __name__ == \"__main__\":\n for key, value in fetch_github_info(\"<USER NAME>\", \"<PASSWORD>\").items():\n print(f\"{key}: {value}\")\n", "path": "web_programming/fetch_github_info.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\"\"\"\nCreated by sarathkaul on 14/11/19\nUpdated by lawric1 on 24/11/20\n\nAuthentication will be made via access token.\nTo generate your personal access token visit https://github.com/settings/tokens.\n\nNOTE:\nNever hardcode any credential information in the code. Always use an environment\nfile to store the private information and use the `os` module to get the information\nduring runtime.\n\nCreate a \".env\" file in the root directory and write these two lines in that file\nwith your token::\n\n#!/usr/bin/env bash\nexport USER_TOKEN=\"\"\n\"\"\"\nimport os\nfrom typing import Any, Dict\n\nimport requests\n\nBASE_URL = \"https://api.github.com\"\n\n# https://docs.github.com/en/free-pro-team@latest/rest/reference/users#get-the-authenticated-user\nAUTHENTICATED_USER_ENDPOINT = BASE_URL + \"/user\"\n\n# https://github.com/settings/tokens\nUSER_TOKEN = os.environ.get(\"USER_TOKEN\", \"\")\n\n\ndef fetch_github_info(auth_token: str) -> Dict[Any, Any]:\n \"\"\"\n Fetch GitHub info of a user using the requests module\n \"\"\"\n headers = {\n \"Authorization\": f\"token {auth_token}\",\n \"Accept\": \"application/vnd.github.v3+json\",\n }\n return requests.get(AUTHENTICATED_USER_ENDPOINT, headers=headers).json()\n\n\nif __name__ == \"__main__\": # pragma: no cover\n if USER_TOKEN:\n for key, value in fetch_github_info(USER_TOKEN).items():\n print(f\"{key}: {value}\")\n else:\n raise ValueError(\"'USER_TOKEN' field cannot be empty.\")\n", "path": "web_programming/fetch_github_info.py"}]} | 574 | 597 |
gh_patches_debug_14459 | rasdani/github-patches | git_diff | sql-machine-learning__elasticdl-268 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[clean up] Should check tf is running eagerly early
so the program can fail early if tf is configured wrong.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `elasticdl/worker/worker.py`
Content:
```
1 import tensorflow as tf
2 from tensorflow.python.ops import math_ops
3 from proto import master_pb2
4 from proto import master_pb2_grpc
5
6
7 class Worker(object):
8 """ElasticDL worker"""
9
10 def __init__(self, model_cls, input_fn, opt_fn, channel=None):
11 """
12 Arguments:
13 model_cls: A class to define the model, which contains funcs
14 get_keras_model: return the keras model defined in the class, with a tf dataset as its input
15 output(data): get model ouput from data as input, either a single output of a dict of outputs
16 loss(output, data): get model loss from output and data as input
17 input_fn: a func to to get a dataset, which can be used as the keras model input
18 dataset = input_fn(dict_of_params)
19 dict_of_params from GetTask for DistributedTrain, from kwargs for LocalTrain
20 opt_fn: a func to get the optimizer
21 channel: grpc channel
22 """
23
24 self._model_cls = model_cls()
25 self._keras_model = self._model_cls.get_keras_model()
26 self._input_fn = input_fn
27 self._opt_fn = opt_fn
28 if channel is None:
29 self._stub = None
30 else:
31 self._stub = master_pb2_grpc.MasterStub(channel)
32 self._model_version = -1
33
34 def get_task(self):
35 # TODO: get task from master
36 pass
37
38 def get_model(self):
39 # TODO: get model from master
40 pass
41
42 def report_task_result(self):
43 # TODO: report task result to master
44 pass
45
46 def report_gradient(self):
47 # TODO: report gradient to ps
48 pass
49
50 def distributed_train(self):
51 # TODO: distributed training
52 pass
53
54 def local_train(self, batch_size, epoch=1, kwargs=None):
55 """
56 Local training for local testing. Must in eager mode.
57 Argments:
58 batch_size: batch size in training
59 epoch: the number of epoch in training
60 kwargs: contains a dict of parameters used in training
61 """
62
63 if not tf.executing_eagerly():
64 raise ValueError('Eager mode is required for LocalTrain')
65
66 dataset = self._input_fn(kwargs)
67 dataset = dataset.repeat(epoch).batch(batch_size)
68 optimizer = self._opt_fn()
69
70 for data in dataset:
71 with tf.GradientTape() as tape:
72 output = self._model_cls.output(data)
73 loss = self._model_cls.loss(output, data)
74 # Add regularization loss if any.
75 # Note: for distributed training, the regularization loss should
76 # be divided by the number of contributing workers, which
77 # might be difficult for elasticdl.
78 if self._keras_model.losses:
79 loss += math_ops.add_n(self._keras_model.losses)
80 grads = tape.gradient(loss, self._keras_model.variables)
81 optimizer.apply_gradients(zip(grads, self._keras_model.variables))
82 print('Loss is ', loss.numpy())
83 pass
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/elasticdl/worker/worker.py b/elasticdl/worker/worker.py
--- a/elasticdl/worker/worker.py
+++ b/elasticdl/worker/worker.py
@@ -1,4 +1,6 @@
import tensorflow as tf
+assert tf.executing_eagerly()
+
from tensorflow.python.ops import math_ops
from proto import master_pb2
from proto import master_pb2_grpc
@@ -60,9 +62,6 @@
kwargs: contains a dict of parameters used in training
"""
- if not tf.executing_eagerly():
- raise ValueError('Eager mode is required for LocalTrain')
-
dataset = self._input_fn(kwargs)
dataset = dataset.repeat(epoch).batch(batch_size)
optimizer = self._opt_fn()
| {"golden_diff": "diff --git a/elasticdl/worker/worker.py b/elasticdl/worker/worker.py\n--- a/elasticdl/worker/worker.py\n+++ b/elasticdl/worker/worker.py\n@@ -1,4 +1,6 @@\n import tensorflow as tf\n+assert tf.executing_eagerly()\n+\n from tensorflow.python.ops import math_ops\n from proto import master_pb2\n from proto import master_pb2_grpc\n@@ -60,9 +62,6 @@\n kwargs: contains a dict of parameters used in training\n \"\"\"\n \n- if not tf.executing_eagerly():\n- raise ValueError('Eager mode is required for LocalTrain')\n-\n dataset = self._input_fn(kwargs)\n dataset = dataset.repeat(epoch).batch(batch_size)\n optimizer = self._opt_fn()\n", "issue": "[clean up] Should check tf is running eagerly early\nso the program can fail early if tf is configured wrong.\n", "before_files": [{"content": "import tensorflow as tf\nfrom tensorflow.python.ops import math_ops\nfrom proto import master_pb2\nfrom proto import master_pb2_grpc\n\n\nclass Worker(object):\n \"\"\"ElasticDL worker\"\"\"\n\n def __init__(self, model_cls, input_fn, opt_fn, channel=None):\n \"\"\"\n Arguments:\n model_cls: A class to define the model, which contains funcs\n get_keras_model: return the keras model defined in the class, with a tf dataset as its input\n output(data): get model ouput from data as input, either a single output of a dict of outputs\n loss(output, data): get model loss from output and data as input\n input_fn: a func to to get a dataset, which can be used as the keras model input\n dataset = input_fn(dict_of_params)\n dict_of_params from GetTask for DistributedTrain, from kwargs for LocalTrain\n opt_fn: a func to get the optimizer \n channel: grpc channel\n \"\"\"\n\n self._model_cls = model_cls()\n self._keras_model = self._model_cls.get_keras_model()\n self._input_fn = input_fn\n self._opt_fn = opt_fn\n if channel is None:\n self._stub = None\n else:\n self._stub = master_pb2_grpc.MasterStub(channel)\n self._model_version = -1\n\n def get_task(self):\n # TODO: get task from master\n pass\n\n def get_model(self):\n # TODO: get model from master\n pass\n\n def report_task_result(self):\n # TODO: report task result to master\n pass\n\n def report_gradient(self):\n # TODO: report gradient to ps\n pass\n\n def distributed_train(self):\n # TODO: distributed training\n pass\n\n def local_train(self, batch_size, epoch=1, kwargs=None):\n \"\"\"\n Local training for local testing. 
Must in eager mode.\n Argments:\n batch_size: batch size in training\n epoch: the number of epoch in training\n kwargs: contains a dict of parameters used in training\n \"\"\"\n\n if not tf.executing_eagerly():\n raise ValueError('Eager mode is required for LocalTrain')\n\n dataset = self._input_fn(kwargs)\n dataset = dataset.repeat(epoch).batch(batch_size)\n optimizer = self._opt_fn()\n\n for data in dataset:\n with tf.GradientTape() as tape:\n output = self._model_cls.output(data)\n loss = self._model_cls.loss(output, data)\n # Add regularization loss if any.\n # Note: for distributed training, the regularization loss should\n # be divided by the number of contributing workers, which\n # might be difficult for elasticdl.\n if self._keras_model.losses:\n loss += math_ops.add_n(self._keras_model.losses)\n grads = tape.gradient(loss, self._keras_model.variables)\n optimizer.apply_gradients(zip(grads, self._keras_model.variables))\n print('Loss is ', loss.numpy())\n pass\n", "path": "elasticdl/worker/worker.py"}], "after_files": [{"content": "import tensorflow as tf\nassert tf.executing_eagerly()\n\nfrom tensorflow.python.ops import math_ops\nfrom proto import master_pb2\nfrom proto import master_pb2_grpc\n\n\nclass Worker(object):\n \"\"\"ElasticDL worker\"\"\"\n\n def __init__(self, model_cls, input_fn, opt_fn, channel=None):\n \"\"\"\n Arguments:\n model_cls: A class to define the model, which contains funcs\n get_keras_model: return the keras model defined in the class, with a tf dataset as its input\n output(data): get model ouput from data as input, either a single output of a dict of outputs\n loss(output, data): get model loss from output and data as input\n input_fn: a func to to get a dataset, which can be used as the keras model input\n dataset = input_fn(dict_of_params)\n dict_of_params from GetTask for DistributedTrain, from kwargs for LocalTrain\n opt_fn: a func to get the optimizer \n channel: grpc channel\n \"\"\"\n\n self._model_cls = model_cls()\n self._keras_model = self._model_cls.get_keras_model()\n self._input_fn = input_fn\n self._opt_fn = opt_fn\n if channel is None:\n self._stub = None\n else:\n self._stub = master_pb2_grpc.MasterStub(channel)\n self._model_version = -1\n\n def get_task(self):\n # TODO: get task from master\n pass\n\n def get_model(self):\n # TODO: get model from master\n pass\n\n def report_task_result(self):\n # TODO: report task result to master\n pass\n\n def report_gradient(self):\n # TODO: report gradient to ps\n pass\n\n def distributed_train(self):\n # TODO: distributed training\n pass\n\n def local_train(self, batch_size, epoch=1, kwargs=None):\n \"\"\"\n Local training for local testing. 
Must in eager mode.\n Argments:\n batch_size: batch size in training\n epoch: the number of epoch in training\n kwargs: contains a dict of parameters used in training\n \"\"\"\n\n dataset = self._input_fn(kwargs)\n dataset = dataset.repeat(epoch).batch(batch_size)\n optimizer = self._opt_fn()\n\n for data in dataset:\n with tf.GradientTape() as tape:\n output = self._model_cls.output(data)\n loss = self._model_cls.loss(output, data)\n # Add regularization loss if any.\n # Note: for distributed training, the regularization loss should\n # be divided by the number of contributing workers, which\n # might be difficult for elasticdl.\n if self._keras_model.losses:\n loss += math_ops.add_n(self._keras_model.losses)\n grads = tape.gradient(loss, self._keras_model.variables)\n optimizer.apply_gradients(zip(grads, self._keras_model.variables))\n print('Loss is ', loss.numpy())\n pass\n", "path": "elasticdl/worker/worker.py"}]} | 1,104 | 174 |
gh_patches_debug_22781 | rasdani/github-patches | git_diff | microsoft__lisa-491 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
VERIFY-BOOT-ERROR-WARNINGS.py: fix test case
Test case is not running properly on redhat. Run the script directly, not through lisav2 to observe this better.
Release: 7.7 Snapshot 5 (most likely 7.6 should also have this behavior)
Issues to be fixed:
- ignore if syslog is not present (that exists only on ubuntu/debian).
[root@rhel77gen2 Linux]# python VERIFY-BOOT-ERROR-WARNINGS.py
2.7.5 (default, Jun 11 2019, 14:33:56)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
grep: /var/log/syslog: No such file or directory
- fix grep search
[root@rhel77gen2 Linux]# grep -nw '/var/log/messages' -e 'mlx5'
[root@rhel77gen2 Linux]# grep /var/log/messages -e mlx5
Jul 29 10:02:27 rhel77gen2 kernel: mlx5_core ae35:00:02.0: firmware version: 14.23.1020
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py`
Content:
```
1 #!/usr/bin/python
2 # Copyright (c) Microsoft Corporation. All rights reserved.
3 # Licensed under the Apache License.
4 from azuremodules import *
5
6 white_list_xml = "ignorable-boot-errors.xml"
7
8
9 def RunTest():
10 UpdateState("TestRunning")
11 RunLog.info("Checking for ERROR and WARNING messages in system logs.")
12 errors = Run("grep -nw '/var/log/syslog' -e 'error' --ignore-case && grep -nw '/var/log/messages' -e 'error' --ignore-case")
13 warnings = Run("grep -nw '/var/log/syslog' -e 'warning' --ignore-case && grep -nw '/var/log/messages' -e 'warning' --ignore-case")
14 failures = Run("grep -nw '/var/log/syslog' -e 'fail' --ignore-case && grep -nw '/var/log/messages' -e 'fail' --ignore-case")
15 if (not errors and not warnings and not failures):
16 RunLog.info('Could not find ERROR/WARNING/FAILURE messages in syslog/messages log file.')
17 ResultLog.info('PASS')
18 else:
19 if white_list_xml and os.path.isfile(white_list_xml):
20 try:
21 import xml.etree.cElementTree as ET
22 except ImportError:
23 import xml.etree.ElementTree as ET
24
25 white_list_file = ET.parse(white_list_xml)
26 xml_root = white_list_file.getroot()
27
28 RunLog.info('Checking ignorable boot ERROR/WARNING/FAILURE messages...')
29 for node in xml_root:
30 if (failures and node.tag == "failures"):
31 failures = RemoveIgnorableMessages(failures, node)
32 if (errors and node.tag == "errors"):
33 errors = RemoveIgnorableMessages(errors, node)
34 if (warnings and node.tag == "warnings"):
35 warnings = RemoveIgnorableMessages(warnings, node)
36
37 if (errors or warnings or failures):
38 RunLog.error('Found ERROR/WARNING/FAILURE messages in logs.')
39 if(errors):
40 SplitLog('Errors', errors)
41 if(warnings):
42 SplitLog('warnings', warnings)
43 if(failures):
44 SplitLog('failures', failures)
45 ResultLog.error('FAIL')
46 else:
47 ResultLog.info('PASS')
48 UpdateState("TestCompleted")
49
50
51 def SplitLog(logType, logValues):
52 for logEntry in logValues:
53 RunLog.info(logType + ': ' + logEntry)
54
55
56 def RemoveIgnorableMessages(messages, keywords_xml_node):
57 message_list = messages.strip().split('\n')
58 valid_list = []
59 for msg in message_list:
60 for keywords in keywords_xml_node:
61 if keywords.text in msg:
62 RunLog.info('Ignorable ERROR/WARNING/FAILURE message: ' + msg)
63 break
64 else:
65 valid_list.append(msg)
66 if len(valid_list) > 0:
67 return valid_list
68 else:
69 return None
70
71 RunTest()
72
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py b/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py
--- a/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py
+++ b/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py
@@ -9,9 +9,9 @@
def RunTest():
UpdateState("TestRunning")
RunLog.info("Checking for ERROR and WARNING messages in system logs.")
- errors = Run("grep -nw '/var/log/syslog' -e 'error' --ignore-case && grep -nw '/var/log/messages' -e 'error' --ignore-case")
- warnings = Run("grep -nw '/var/log/syslog' -e 'warning' --ignore-case && grep -nw '/var/log/messages' -e 'warning' --ignore-case")
- failures = Run("grep -nw '/var/log/syslog' -e 'fail' --ignore-case && grep -nw '/var/log/messages' -e 'fail' --ignore-case")
+ errors = Run("grep -nw '/var/log/syslog' -e 'error' --ignore-case --no-message && grep /var/log/messages -e error --ignore-case --no-message")
+ warnings = Run("grep -nw '/var/log/syslog' -e 'warning' --ignore-case --no-message && grep /var/log/messages -e warning --ignore-case --no-message")
+ failures = Run("grep -nw '/var/log/syslog' -e 'fail' --ignore-case --no-message && grep /var/log/messages -e fail --ignore-case --no-message")
if (not errors and not warnings and not failures):
RunLog.info('Could not find ERROR/WARNING/FAILURE messages in syslog/messages log file.')
ResultLog.info('PASS')
| {"golden_diff": "diff --git a/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py b/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py\n--- a/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py\n+++ b/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py\n@@ -9,9 +9,9 @@\n def RunTest():\n UpdateState(\"TestRunning\")\n RunLog.info(\"Checking for ERROR and WARNING messages in system logs.\")\n- errors = Run(\"grep -nw '/var/log/syslog' -e 'error' --ignore-case && grep -nw '/var/log/messages' -e 'error' --ignore-case\")\n- warnings = Run(\"grep -nw '/var/log/syslog' -e 'warning' --ignore-case && grep -nw '/var/log/messages' -e 'warning' --ignore-case\")\n- failures = Run(\"grep -nw '/var/log/syslog' -e 'fail' --ignore-case && grep -nw '/var/log/messages' -e 'fail' --ignore-case\")\n+ errors = Run(\"grep -nw '/var/log/syslog' -e 'error' --ignore-case --no-message && grep /var/log/messages -e error --ignore-case --no-message\")\n+ warnings = Run(\"grep -nw '/var/log/syslog' -e 'warning' --ignore-case --no-message && grep /var/log/messages -e warning --ignore-case --no-message\")\n+ failures = Run(\"grep -nw '/var/log/syslog' -e 'fail' --ignore-case --no-message && grep /var/log/messages -e fail --ignore-case --no-message\")\n if (not errors and not warnings and not failures):\n RunLog.info('Could not find ERROR/WARNING/FAILURE messages in syslog/messages log file.')\n ResultLog.info('PASS')\n", "issue": "VERIFY-BOOT-ERROR-WARNINGS.py: fix test case\nTest case is not running properly on redhat. Run the script directly, not through lisav2 to observe this better.\r\n\r\nRelease: 7.7 Snapshot 5 (most likely 7.6 should also have this behavior)\r\n\r\nIssues to be fixed:\r\n- ignore if syslog is not present (that exists only on ubuntu/debian).\r\n[root@rhel77gen2 Linux]# python VERIFY-BOOT-ERROR-WARNINGS.py\r\n2.7.5 (default, Jun 11 2019, 14:33:56)\r\n[GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]\r\ngrep: /var/log/syslog: No such file or directory\r\n\r\n- fix grep search\r\n[root@rhel77gen2 Linux]# grep -nw '/var/log/messages' -e 'mlx5'\r\n[root@rhel77gen2 Linux]# grep /var/log/messages -e mlx5\r\nJul 29 10:02:27 rhel77gen2 kernel: mlx5_core ae35:00:02.0: firmware version: 14.23.1020\r\n\n", "before_files": [{"content": "#!/usr/bin/python\n# Copyright (c) Microsoft Corporation. 
All rights reserved.\n# Licensed under the Apache License.\nfrom azuremodules import *\n\nwhite_list_xml = \"ignorable-boot-errors.xml\"\n\n\ndef RunTest():\n UpdateState(\"TestRunning\")\n RunLog.info(\"Checking for ERROR and WARNING messages in system logs.\")\n errors = Run(\"grep -nw '/var/log/syslog' -e 'error' --ignore-case && grep -nw '/var/log/messages' -e 'error' --ignore-case\")\n warnings = Run(\"grep -nw '/var/log/syslog' -e 'warning' --ignore-case && grep -nw '/var/log/messages' -e 'warning' --ignore-case\")\n failures = Run(\"grep -nw '/var/log/syslog' -e 'fail' --ignore-case && grep -nw '/var/log/messages' -e 'fail' --ignore-case\")\n if (not errors and not warnings and not failures):\n RunLog.info('Could not find ERROR/WARNING/FAILURE messages in syslog/messages log file.')\n ResultLog.info('PASS')\n else:\n if white_list_xml and os.path.isfile(white_list_xml):\n try:\n import xml.etree.cElementTree as ET\n except ImportError:\n import xml.etree.ElementTree as ET\n\n white_list_file = ET.parse(white_list_xml)\n xml_root = white_list_file.getroot()\n\n RunLog.info('Checking ignorable boot ERROR/WARNING/FAILURE messages...')\n for node in xml_root:\n if (failures and node.tag == \"failures\"):\n failures = RemoveIgnorableMessages(failures, node)\n if (errors and node.tag == \"errors\"):\n errors = RemoveIgnorableMessages(errors, node)\n if (warnings and node.tag == \"warnings\"):\n warnings = RemoveIgnorableMessages(warnings, node)\n\n if (errors or warnings or failures):\n RunLog.error('Found ERROR/WARNING/FAILURE messages in logs.')\n if(errors):\n SplitLog('Errors', errors)\n if(warnings):\n SplitLog('warnings', warnings)\n if(failures):\n SplitLog('failures', failures)\n ResultLog.error('FAIL')\n else:\n ResultLog.info('PASS')\n UpdateState(\"TestCompleted\")\n\n\ndef SplitLog(logType, logValues):\n for logEntry in logValues:\n RunLog.info(logType + ': ' + logEntry)\n\n\ndef RemoveIgnorableMessages(messages, keywords_xml_node):\n message_list = messages.strip().split('\\n')\n valid_list = []\n for msg in message_list:\n for keywords in keywords_xml_node:\n if keywords.text in msg:\n RunLog.info('Ignorable ERROR/WARNING/FAILURE message: ' + msg)\n break\n else:\n valid_list.append(msg)\n if len(valid_list) > 0:\n return valid_list\n else:\n return None\n\nRunTest()\n", "path": "Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py"}], "after_files": [{"content": "#!/usr/bin/python\n# Copyright (c) Microsoft Corporation. 
All rights reserved.\n# Licensed under the Apache License.\nfrom azuremodules import *\n\nwhite_list_xml = \"ignorable-boot-errors.xml\"\n\n\ndef RunTest():\n UpdateState(\"TestRunning\")\n RunLog.info(\"Checking for ERROR and WARNING messages in system logs.\")\n errors = Run(\"grep -nw '/var/log/syslog' -e 'error' --ignore-case --no-message && grep /var/log/messages -e error --ignore-case --no-message\")\n warnings = Run(\"grep -nw '/var/log/syslog' -e 'warning' --ignore-case --no-message && grep /var/log/messages -e warning --ignore-case --no-message\")\n failures = Run(\"grep -nw '/var/log/syslog' -e 'fail' --ignore-case --no-message && grep /var/log/messages -e fail --ignore-case --no-message\")\n if (not errors and not warnings and not failures):\n RunLog.info('Could not find ERROR/WARNING/FAILURE messages in syslog/messages log file.')\n ResultLog.info('PASS')\n else:\n if white_list_xml and os.path.isfile(white_list_xml):\n try:\n import xml.etree.cElementTree as ET\n except ImportError:\n import xml.etree.ElementTree as ET\n\n white_list_file = ET.parse(white_list_xml)\n xml_root = white_list_file.getroot()\n\n RunLog.info('Checking ignorable boot ERROR/WARNING/FAILURE messages...')\n for node in xml_root:\n if (failures and node.tag == \"failures\"):\n failures = RemoveIgnorableMessages(failures, node)\n if (errors and node.tag == \"errors\"):\n errors = RemoveIgnorableMessages(errors, node)\n if (warnings and node.tag == \"warnings\"):\n warnings = RemoveIgnorableMessages(warnings, node)\n\n if (errors or warnings or failures):\n RunLog.error('Found ERROR/WARNING/FAILURE messages in logs.')\n if(errors):\n SplitLog('Errors', errors)\n if(warnings):\n SplitLog('warnings', warnings)\n if(failures):\n SplitLog('failures', failures)\n ResultLog.error('FAIL')\n else:\n ResultLog.info('PASS')\n UpdateState(\"TestCompleted\")\n\n\ndef SplitLog(logType, logValues):\n for logEntry in logValues:\n RunLog.info(logType + ': ' + logEntry)\n\n\ndef RemoveIgnorableMessages(messages, keywords_xml_node):\n message_list = messages.strip().split('\\n')\n valid_list = []\n for msg in message_list:\n for keywords in keywords_xml_node:\n if keywords.text in msg:\n RunLog.info('Ignorable ERROR/WARNING/FAILURE message: ' + msg)\n break\n else:\n valid_list.append(msg)\n if len(valid_list) > 0:\n return valid_list\n else:\n return None\n\nRunTest()\n", "path": "Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py"}]} | 1,302 | 388 |
gh_patches_debug_23467 | rasdani/github-patches | git_diff | saleor__saleor-809 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Calling get_price_range() on an empty item range
On the product detail page backend doesn't check existence of product variants.
If the product requires variants, but they don't exist, Django throws an exception `Calling get_price_range() on an empty item range`.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `saleor/product/utils.py`
Content:
```
1 from collections import namedtuple
2
3 from ..cart.utils import get_cart_from_request, get_or_create_cart_from_request
4 from ..core.utils import to_local_currency
5 from .forms import get_form_class_for_product
6 from .models.utils import get_attributes_display_map
7 from .models import Product
8
9
10 def products_visible_to_user(user):
11 if (user.is_authenticated() and
12 user.is_active and user.is_staff):
13 return Product.objects.all()
14 else:
15 return Product.objects.get_available_products()
16
17
18 def products_with_details(user):
19 products = products_visible_to_user(user)
20 products = products.prefetch_related('categories', 'images',
21 'variants__stock',
22 'variants__variant_images__image',
23 'attributes__values',
24 'product_class__variant_attributes__values',
25 'product_class__product_attributes__values')
26 return products
27
28
29 def get_product_images(product):
30 """
31 Returns list of product images that will be placed in product gallery
32 """
33 return list(product.images.all())
34
35
36 def products_with_availability(products, discounts, local_currency):
37 for product in products:
38 yield product, get_availability(product, discounts, local_currency)
39
40
41 ProductAvailability = namedtuple(
42 'ProductAvailability', (
43 'available', 'price_range', 'price_range_undiscounted', 'discount',
44 'price_range_local_currency', 'discount_local_currency'))
45
46
47 def get_availability(product, discounts=None, local_currency=None):
48 # In default currency
49 price_range = product.get_price_range(discounts=discounts)
50 undiscounted = product.get_price_range()
51 if undiscounted.min_price > price_range.min_price:
52 discount = undiscounted.min_price - price_range.min_price
53 else:
54 discount = None
55
56 # Local currency
57 if local_currency:
58 price_range_local = to_local_currency(
59 price_range, local_currency)
60 undiscounted_local = to_local_currency(
61 undiscounted, local_currency)
62 if (undiscounted_local and
63 undiscounted_local.min_price > price_range_local.min_price):
64 discount_local_currency = (
65 undiscounted_local.min_price - price_range_local.min_price)
66 else:
67 discount_local_currency = None
68 else:
69 price_range_local = None
70 discount_local_currency = None
71
72 is_available = product.is_in_stock() and product.is_available()
73
74 return ProductAvailability(
75 available=is_available,
76 price_range=price_range,
77 price_range_undiscounted=undiscounted,
78 discount=discount,
79 price_range_local_currency=price_range_local,
80 discount_local_currency=discount_local_currency)
81
82
83 def handle_cart_form(request, product, create_cart=False):
84 if create_cart:
85 cart = get_or_create_cart_from_request(request)
86 else:
87 cart = get_cart_from_request(request)
88
89 form_class = get_form_class_for_product(product)
90 form = form_class(cart=cart, product=product,
91 data=request.POST or None, discounts=request.discounts)
92 return form, cart
93
94
95 def products_for_cart(user):
96 products = products_visible_to_user(user)
97 products = products.prefetch_related(
98 'variants', 'variants__variant_images__image')
99 return products
100
101
102 def get_variant_picker_data(product, discounts=None):
103 availability = get_availability(product, discounts)
104 variants = product.variants.all()
105 data = {'variantAttributes': [], 'variants': []}
106
107 variant_attributes = product.product_class.variant_attributes.all()
108 for attribute in variant_attributes:
109 data['variantAttributes'].append({
110 'pk': attribute.pk,
111 'display': attribute.display,
112 'name': attribute.name,
113 'values': [{'pk': value.pk, 'display': value.display}
114 for value in attribute.values.all()]})
115
116 for variant in variants:
117 price = variant.get_price_per_item(discounts)
118 price_undiscounted = variant.get_price_per_item()
119 variant_data = {
120 'id': variant.id,
121 'price': price.gross,
122 'priceUndiscounted': price_undiscounted.gross,
123 'currency': price.currency,
124 'attributes': variant.attributes}
125 data['variants'].append(variant_data)
126
127 data['availability'] = {
128 'discount': price_as_dict(availability.discount),
129 'priceRange': price_range_as_dict(availability.price_range),
130 'priceRangeUndiscounted': price_range_as_dict(
131 availability.price_range_undiscounted)}
132 return data
133
134
135 def get_product_attributes_data(product):
136 attributes = product.product_class.product_attributes.all()
137 attributes_map = {attribute.pk: attribute for attribute in attributes}
138 values_map = get_attributes_display_map(product, attributes)
139 return {attributes_map.get(attr_pk): value_obj
140 for (attr_pk, value_obj) in values_map.items()}
141
142
143 def price_as_dict(price):
144 if not price:
145 return {}
146 return {'currency': price.currency,
147 'gross': price.gross,
148 'net': price.net}
149
150
151 def price_range_as_dict(price_range):
152 return {'maxPrice': price_as_dict(price_range.max_price),
153 'minPrice': price_as_dict(price_range.min_price)}
154
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/saleor/product/utils.py b/saleor/product/utils.py
--- a/saleor/product/utils.py
+++ b/saleor/product/utils.py
@@ -45,6 +45,15 @@
def get_availability(product, discounts=None, local_currency=None):
+ if not product.variants.exists():
+ return ProductAvailability(
+ available=False,
+ price_range=None,
+ price_range_undiscounted=None,
+ discount=None,
+ price_range_local_currency=None,
+ discount_local_currency=None)
+
# In default currency
price_range = product.get_price_range(discounts=discounts)
undiscounted = product.get_price_range()
@@ -142,12 +151,14 @@
def price_as_dict(price):
if not price:
- return {}
+ return None
return {'currency': price.currency,
'gross': price.gross,
'net': price.net}
def price_range_as_dict(price_range):
+ if not price_range:
+ return None
return {'maxPrice': price_as_dict(price_range.max_price),
'minPrice': price_as_dict(price_range.min_price)}
| {"golden_diff": "diff --git a/saleor/product/utils.py b/saleor/product/utils.py\n--- a/saleor/product/utils.py\n+++ b/saleor/product/utils.py\n@@ -45,6 +45,15 @@\n \n \n def get_availability(product, discounts=None, local_currency=None):\n+ if not product.variants.exists():\n+ return ProductAvailability(\n+ available=False,\n+ price_range=None,\n+ price_range_undiscounted=None,\n+ discount=None,\n+ price_range_local_currency=None,\n+ discount_local_currency=None)\n+\n # In default currency\n price_range = product.get_price_range(discounts=discounts)\n undiscounted = product.get_price_range()\n@@ -142,12 +151,14 @@\n \n def price_as_dict(price):\n if not price:\n- return {}\n+ return None\n return {'currency': price.currency,\n 'gross': price.gross,\n 'net': price.net}\n \n \n def price_range_as_dict(price_range):\n+ if not price_range:\n+ return None\n return {'maxPrice': price_as_dict(price_range.max_price),\n 'minPrice': price_as_dict(price_range.min_price)}\n", "issue": "Calling get_price_range() on an empty item range\nOn the product detail page backend doesn't check existence of product variants.\r\nIf the product requires variants, but they don't exist, Django throws an exception `Calling get_price_range() on an empty item range`.\n", "before_files": [{"content": "from collections import namedtuple\n\nfrom ..cart.utils import get_cart_from_request, get_or_create_cart_from_request\nfrom ..core.utils import to_local_currency\nfrom .forms import get_form_class_for_product\nfrom .models.utils import get_attributes_display_map\nfrom .models import Product\n\n\ndef products_visible_to_user(user):\n if (user.is_authenticated() and\n user.is_active and user.is_staff):\n return Product.objects.all()\n else:\n return Product.objects.get_available_products()\n\n\ndef products_with_details(user):\n products = products_visible_to_user(user)\n products = products.prefetch_related('categories', 'images',\n 'variants__stock',\n 'variants__variant_images__image',\n 'attributes__values',\n 'product_class__variant_attributes__values',\n 'product_class__product_attributes__values')\n return products\n\n\ndef get_product_images(product):\n \"\"\"\n Returns list of product images that will be placed in product gallery\n \"\"\"\n return list(product.images.all())\n\n\ndef products_with_availability(products, discounts, local_currency):\n for product in products:\n yield product, get_availability(product, discounts, local_currency)\n\n\nProductAvailability = namedtuple(\n 'ProductAvailability', (\n 'available', 'price_range', 'price_range_undiscounted', 'discount',\n 'price_range_local_currency', 'discount_local_currency'))\n\n\ndef get_availability(product, discounts=None, local_currency=None):\n # In default currency\n price_range = product.get_price_range(discounts=discounts)\n undiscounted = product.get_price_range()\n if undiscounted.min_price > price_range.min_price:\n discount = undiscounted.min_price - price_range.min_price\n else:\n discount = None\n\n # Local currency\n if local_currency:\n price_range_local = to_local_currency(\n price_range, local_currency)\n undiscounted_local = to_local_currency(\n undiscounted, local_currency)\n if (undiscounted_local and\n undiscounted_local.min_price > price_range_local.min_price):\n discount_local_currency = (\n undiscounted_local.min_price - price_range_local.min_price)\n else:\n discount_local_currency = None\n else:\n price_range_local = None\n discount_local_currency = None\n\n is_available = product.is_in_stock() and product.is_available()\n\n 
return ProductAvailability(\n available=is_available,\n price_range=price_range,\n price_range_undiscounted=undiscounted,\n discount=discount,\n price_range_local_currency=price_range_local,\n discount_local_currency=discount_local_currency)\n\n\ndef handle_cart_form(request, product, create_cart=False):\n if create_cart:\n cart = get_or_create_cart_from_request(request)\n else:\n cart = get_cart_from_request(request)\n\n form_class = get_form_class_for_product(product)\n form = form_class(cart=cart, product=product,\n data=request.POST or None, discounts=request.discounts)\n return form, cart\n\n\ndef products_for_cart(user):\n products = products_visible_to_user(user)\n products = products.prefetch_related(\n 'variants', 'variants__variant_images__image')\n return products\n\n\ndef get_variant_picker_data(product, discounts=None):\n availability = get_availability(product, discounts)\n variants = product.variants.all()\n data = {'variantAttributes': [], 'variants': []}\n\n variant_attributes = product.product_class.variant_attributes.all()\n for attribute in variant_attributes:\n data['variantAttributes'].append({\n 'pk': attribute.pk,\n 'display': attribute.display,\n 'name': attribute.name,\n 'values': [{'pk': value.pk, 'display': value.display}\n for value in attribute.values.all()]})\n\n for variant in variants:\n price = variant.get_price_per_item(discounts)\n price_undiscounted = variant.get_price_per_item()\n variant_data = {\n 'id': variant.id,\n 'price': price.gross,\n 'priceUndiscounted': price_undiscounted.gross,\n 'currency': price.currency,\n 'attributes': variant.attributes}\n data['variants'].append(variant_data)\n\n data['availability'] = {\n 'discount': price_as_dict(availability.discount),\n 'priceRange': price_range_as_dict(availability.price_range),\n 'priceRangeUndiscounted': price_range_as_dict(\n availability.price_range_undiscounted)}\n return data\n\n\ndef get_product_attributes_data(product):\n attributes = product.product_class.product_attributes.all()\n attributes_map = {attribute.pk: attribute for attribute in attributes}\n values_map = get_attributes_display_map(product, attributes)\n return {attributes_map.get(attr_pk): value_obj\n for (attr_pk, value_obj) in values_map.items()}\n\n\ndef price_as_dict(price):\n if not price:\n return {}\n return {'currency': price.currency,\n 'gross': price.gross,\n 'net': price.net}\n\n\ndef price_range_as_dict(price_range):\n return {'maxPrice': price_as_dict(price_range.max_price),\n 'minPrice': price_as_dict(price_range.min_price)}\n", "path": "saleor/product/utils.py"}], "after_files": [{"content": "from collections import namedtuple\n\nfrom ..cart.utils import get_cart_from_request, get_or_create_cart_from_request\nfrom ..core.utils import to_local_currency\nfrom .forms import get_form_class_for_product\nfrom .models.utils import get_attributes_display_map\nfrom .models import Product\n\n\ndef products_visible_to_user(user):\n if (user.is_authenticated() and\n user.is_active and user.is_staff):\n return Product.objects.all()\n else:\n return Product.objects.get_available_products()\n\n\ndef products_with_details(user):\n products = products_visible_to_user(user)\n products = products.prefetch_related('categories', 'images',\n 'variants__stock',\n 'variants__variant_images__image',\n 'attributes__values',\n 'product_class__variant_attributes__values',\n 'product_class__product_attributes__values')\n return products\n\n\ndef get_product_images(product):\n \"\"\"\n Returns list of product images that will be placed in 
product gallery\n \"\"\"\n return list(product.images.all())\n\n\ndef products_with_availability(products, discounts, local_currency):\n for product in products:\n yield product, get_availability(product, discounts, local_currency)\n\n\nProductAvailability = namedtuple(\n 'ProductAvailability', (\n 'available', 'price_range', 'price_range_undiscounted', 'discount',\n 'price_range_local_currency', 'discount_local_currency'))\n\n\ndef get_availability(product, discounts=None, local_currency=None):\n if not product.variants.exists():\n return ProductAvailability(\n available=False,\n price_range=None,\n price_range_undiscounted=None,\n discount=None,\n price_range_local_currency=None,\n discount_local_currency=None)\n\n # In default currency\n price_range = product.get_price_range(discounts=discounts)\n undiscounted = product.get_price_range()\n if undiscounted.min_price > price_range.min_price:\n discount = undiscounted.min_price - price_range.min_price\n else:\n discount = None\n\n # Local currency\n if local_currency:\n price_range_local = to_local_currency(\n price_range, local_currency)\n undiscounted_local = to_local_currency(\n undiscounted, local_currency)\n if (undiscounted_local and\n undiscounted_local.min_price > price_range_local.min_price):\n discount_local_currency = (\n undiscounted_local.min_price - price_range_local.min_price)\n else:\n discount_local_currency = None\n else:\n price_range_local = None\n discount_local_currency = None\n\n is_available = product.is_in_stock() and product.is_available()\n\n return ProductAvailability(\n available=is_available,\n price_range=price_range,\n price_range_undiscounted=undiscounted,\n discount=discount,\n price_range_local_currency=price_range_local,\n discount_local_currency=discount_local_currency)\n\n\ndef handle_cart_form(request, product, create_cart=False):\n if create_cart:\n cart = get_or_create_cart_from_request(request)\n else:\n cart = get_cart_from_request(request)\n\n form_class = get_form_class_for_product(product)\n form = form_class(cart=cart, product=product,\n data=request.POST or None, discounts=request.discounts)\n return form, cart\n\n\ndef products_for_cart(user):\n products = products_visible_to_user(user)\n products = products.prefetch_related(\n 'variants', 'variants__variant_images__image')\n return products\n\n\ndef get_variant_picker_data(product, discounts=None):\n availability = get_availability(product, discounts)\n variants = product.variants.all()\n data = {'variantAttributes': [], 'variants': []}\n\n variant_attributes = product.product_class.variant_attributes.all()\n for attribute in variant_attributes:\n data['variantAttributes'].append({\n 'pk': attribute.pk,\n 'display': attribute.display,\n 'name': attribute.name,\n 'values': [{'pk': value.pk, 'display': value.display}\n for value in attribute.values.all()]})\n\n for variant in variants:\n price = variant.get_price_per_item(discounts)\n price_undiscounted = variant.get_price_per_item()\n variant_data = {\n 'id': variant.id,\n 'price': price.gross,\n 'priceUndiscounted': price_undiscounted.gross,\n 'currency': price.currency,\n 'attributes': variant.attributes}\n data['variants'].append(variant_data)\n\n data['availability'] = {\n 'discount': price_as_dict(availability.discount),\n 'priceRange': price_range_as_dict(availability.price_range),\n 'priceRangeUndiscounted': price_range_as_dict(\n availability.price_range_undiscounted)}\n return data\n\n\ndef get_product_attributes_data(product):\n attributes = 
product.product_class.product_attributes.all()\n attributes_map = {attribute.pk: attribute for attribute in attributes}\n values_map = get_attributes_display_map(product, attributes)\n return {attributes_map.get(attr_pk): value_obj\n for (attr_pk, value_obj) in values_map.items()}\n\n\ndef price_as_dict(price):\n if not price:\n return None\n return {'currency': price.currency,\n 'gross': price.gross,\n 'net': price.net}\n\n\ndef price_range_as_dict(price_range):\n if not price_range:\n return None\n return {'maxPrice': price_as_dict(price_range.max_price),\n 'minPrice': price_as_dict(price_range.min_price)}\n", "path": "saleor/product/utils.py"}]} | 1,765 | 259 |
gh_patches_debug_54265 | rasdani/github-patches | git_diff | hylang__hy-411 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
problem with comment parsing
I was translating some code to Hy from a textbook on Python programming (http://inventwithpython.com/pygame/index.html) and ran into a problem with this direct translation.
```
(import pygame sys)
(import [pygame.locals [*]])
(pygame.init)
(setv *displaysurf* (pygame.display.set_mode (, 400 300)))
(pygame.display.set_caption "Hello World!")
(while True ; main game loop
(do (foreach [event (pygame.event.get)]
(if (= event.type QUIT)
(do (pygame.quit)
(sys.exit))))
(pygame.display.update)))
```
I get a parse error if the end-of-line comment ("main game loop") appears where it does. It works if I remove it.
The following interaction with the prompt also surprised me.
```
=> ; some comment
hy.lex.exceptions.LexException: Could not identify the next token at line -1, column -1
```
Fixing this isn't critical, but it should probably be fixed. I do occasionally type something, realize I need to do something else first, comment it, press Enter, type whatever setup I needed, press Enter, then press Up twice, uncomment the line, and Enter to run it.
problem with comment parsing
I was translating some code to Hy from a textbook on Python programming (http://inventwithpython.com/pygame/index.html) and ran into a problem with this direct translation.
```
(import pygame sys)
(import [pygame.locals [*]])
(pygame.init)
(setv *displaysurf* (pygame.display.set_mode (, 400 300)))
(pygame.display.set_caption "Hello World!")
(while True ; main game loop
(do (foreach [event (pygame.event.get)]
(if (= event.type QUIT)
(do (pygame.quit)
(sys.exit))))
(pygame.display.update)))
```
I get a parse error if the end-of-line comment ("main game loop") appears where it does. It works if I remove it.
The following interaction with the prompt also surprised me.
```
=> ; some comment
hy.lex.exceptions.LexException: Could not identify the next token at line -1, column -1
```
Fixing this isn't critical, but it should probably be fixed. I do occasionally type something, realize I need to do something else first, comment it, press Enter, type whatever setup I needed, press Enter, then press Up twice, uncomment the line, and Enter to run it.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `hy/lex/lexer.py`
Content:
```
1 # Copyright (c) 2013 Nicolas Dandrimont <[email protected]>
2 #
3 # Permission is hereby granted, free of charge, to any person obtaining a
4 # copy of this software and associated documentation files (the "Software"),
5 # to deal in the Software without restriction, including without limitation
6 # the rights to use, copy, modify, merge, publish, distribute, sublicense,
7 # and/or sell copies of the Software, and to permit persons to whom the
8 # Software is furnished to do so, subject to the following conditions:
9 #
10 # The above copyright notice and this permission notice shall be included in
11 # all copies or substantial portions of the Software.
12 #
13 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
14 # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
15 # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
16 # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
17 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
18 # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
19 # DEALINGS IN THE SOFTWARE.
20
21 from rply import LexerGenerator
22
23
24 lg = LexerGenerator()
25
26
27 # A regexp for something that should end a quoting/unquoting operator
28 # i.e. a space or a closing brace/paren/curly
29 end_quote = r'(?![\s\)\]\}])'
30
31
32 lg.add('LPAREN', r'\(')
33 lg.add('RPAREN', r'\)')
34 lg.add('LBRACKET', r'\[')
35 lg.add('RBRACKET', r'\]')
36 lg.add('LCURLY', r'\{')
37 lg.add('RCURLY', r'\}')
38 lg.add('QUOTE', r'\'%s' % end_quote)
39 lg.add('QUASIQUOTE', r'`%s' % end_quote)
40 lg.add('UNQUOTESPLICE', r'~@%s' % end_quote)
41 lg.add('UNQUOTE', r'~%s' % end_quote)
42 lg.add('HASHBANG', r'#!.*[^\r\n]')
43 lg.add('HASHREADER', r'#.')
44
45
46 lg.add('STRING', r'''(?x)
47 (?:u|r|ur|ru)? # prefix
48 " # start string
49 (?:
50 | [^"\\] # non-quote or backslash
51 | \\. # or escaped single character
52 | \\x[0-9a-fA-F]{2} # or escaped raw character
53 | \\u[0-9a-fA-F]{4} # or unicode escape
54 | \\U[0-9a-fA-F]{8} # or long unicode escape
55 )* # one or more times
56 " # end string
57 ''')
58
59
60 lg.add('IDENTIFIER', r'[^()\[\]{}\'"\s;]+')
61
62
63 lg.ignore(r';.*[\r\n]+')
64 lg.ignore(r'\s+')
65
66
67 lexer = lg.build()
68
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/hy/lex/lexer.py b/hy/lex/lexer.py
--- a/hy/lex/lexer.py
+++ b/hy/lex/lexer.py
@@ -60,7 +60,7 @@
lg.add('IDENTIFIER', r'[^()\[\]{}\'"\s;]+')
-lg.ignore(r';.*[\r\n]+')
+lg.ignore(r';.*(?=\r|\n|$)')
lg.ignore(r'\s+')
| {"golden_diff": "diff --git a/hy/lex/lexer.py b/hy/lex/lexer.py\n--- a/hy/lex/lexer.py\n+++ b/hy/lex/lexer.py\n@@ -60,7 +60,7 @@\n lg.add('IDENTIFIER', r'[^()\\[\\]{}\\'\"\\s;]+')\n \n \n-lg.ignore(r';.*[\\r\\n]+')\n+lg.ignore(r';.*(?=\\r|\\n|$)')\n lg.ignore(r'\\s+')\n", "issue": "problem with comment parsing\nI was translating some code to Hy from a textbook on Python programming (http://inventwithpython.com/pygame/index.html) and ran into a problem with this direct translation.\n\n```\n(import pygame sys)\n(import [pygame.locals [*]])\n\n(pygame.init)\n(setv *displaysurf* (pygame.display.set_mode (, 400 300)))\n(pygame.display.set_caption \"Hello World!\")\n(while True ; main game loop\n (do (foreach [event (pygame.event.get)]\n (if (= event.type QUIT)\n (do (pygame.quit)\n (sys.exit))))\n (pygame.display.update)))\n```\n\nI get a parse error if the end-of-line comment (\"main game loop\") appears where it does. It works if I remove it.\n\nThe following interaction with the prompt also surprised me.\n\n```\n=> ; some comment\nhy.lex.exceptions.LexException: Could not identify the next token at line -1, column -1\n```\n\nFixing this isn't critical, but it should probably be fixed. I do occasionally type something, realize I need to do something else first, comment it, press Enter, type whatever setup I needed, press Enter, then press Up twice, uncomment the line, and Enter to run it.\n\nproblem with comment parsing\nI was translating some code to Hy from a textbook on Python programming (http://inventwithpython.com/pygame/index.html) and ran into a problem with this direct translation.\n\n```\n(import pygame sys)\n(import [pygame.locals [*]])\n\n(pygame.init)\n(setv *displaysurf* (pygame.display.set_mode (, 400 300)))\n(pygame.display.set_caption \"Hello World!\")\n(while True ; main game loop\n (do (foreach [event (pygame.event.get)]\n (if (= event.type QUIT)\n (do (pygame.quit)\n (sys.exit))))\n (pygame.display.update)))\n```\n\nI get a parse error if the end-of-line comment (\"main game loop\") appears where it does. It works if I remove it.\n\nThe following interaction with the prompt also surprised me.\n\n```\n=> ; some comment\nhy.lex.exceptions.LexException: Could not identify the next token at line -1, column -1\n```\n\nFixing this isn't critical, but it should probably be fixed. I do occasionally type something, realize I need to do something else first, comment it, press Enter, type whatever setup I needed, press Enter, then press Up twice, uncomment the line, and Enter to run it.\n\n", "before_files": [{"content": "# Copyright (c) 2013 Nicolas Dandrimont <[email protected]>\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the \"Software\"),\n# to deal in the Software without restriction, including without limitation\n# the rights to use, copy, modify, merge, publish, distribute, sublicense,\n# and/or sell copies of the Software, and to permit persons to whom the\n# Software is furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL\n# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING\n# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER\n# DEALINGS IN THE SOFTWARE.\n\nfrom rply import LexerGenerator\n\n\nlg = LexerGenerator()\n\n\n# A regexp for something that should end a quoting/unquoting operator\n# i.e. a space or a closing brace/paren/curly\nend_quote = r'(?![\\s\\)\\]\\}])'\n\n\nlg.add('LPAREN', r'\\(')\nlg.add('RPAREN', r'\\)')\nlg.add('LBRACKET', r'\\[')\nlg.add('RBRACKET', r'\\]')\nlg.add('LCURLY', r'\\{')\nlg.add('RCURLY', r'\\}')\nlg.add('QUOTE', r'\\'%s' % end_quote)\nlg.add('QUASIQUOTE', r'`%s' % end_quote)\nlg.add('UNQUOTESPLICE', r'~@%s' % end_quote)\nlg.add('UNQUOTE', r'~%s' % end_quote)\nlg.add('HASHBANG', r'#!.*[^\\r\\n]')\nlg.add('HASHREADER', r'#.')\n\n\nlg.add('STRING', r'''(?x)\n (?:u|r|ur|ru)? # prefix\n \" # start string\n (?:\n | [^\"\\\\] # non-quote or backslash\n | \\\\. # or escaped single character\n | \\\\x[0-9a-fA-F]{2} # or escaped raw character\n | \\\\u[0-9a-fA-F]{4} # or unicode escape\n | \\\\U[0-9a-fA-F]{8} # or long unicode escape\n )* # one or more times\n \" # end string\n''')\n\n\nlg.add('IDENTIFIER', r'[^()\\[\\]{}\\'\"\\s;]+')\n\n\nlg.ignore(r';.*[\\r\\n]+')\nlg.ignore(r'\\s+')\n\n\nlexer = lg.build()\n", "path": "hy/lex/lexer.py"}], "after_files": [{"content": "# Copyright (c) 2013 Nicolas Dandrimont <[email protected]>\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the \"Software\"),\n# to deal in the Software without restriction, including without limitation\n# the rights to use, copy, modify, merge, publish, distribute, sublicense,\n# and/or sell copies of the Software, and to permit persons to whom the\n# Software is furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL\n# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING\n# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER\n# DEALINGS IN THE SOFTWARE.\n\nfrom rply import LexerGenerator\n\n\nlg = LexerGenerator()\n\n\n# A regexp for something that should end a quoting/unquoting operator\n# i.e. a space or a closing brace/paren/curly\nend_quote = r'(?![\\s\\)\\]\\}])'\n\n\nlg.add('LPAREN', r'\\(')\nlg.add('RPAREN', r'\\)')\nlg.add('LBRACKET', r'\\[')\nlg.add('RBRACKET', r'\\]')\nlg.add('LCURLY', r'\\{')\nlg.add('RCURLY', r'\\}')\nlg.add('QUOTE', r'\\'%s' % end_quote)\nlg.add('QUASIQUOTE', r'`%s' % end_quote)\nlg.add('UNQUOTESPLICE', r'~@%s' % end_quote)\nlg.add('UNQUOTE', r'~%s' % end_quote)\nlg.add('HASHBANG', r'#!.*[^\\r\\n]')\nlg.add('HASHREADER', r'#.')\n\n\nlg.add('STRING', r'''(?x)\n (?:u|r|ur|ru)? # prefix\n \" # start string\n (?:\n | [^\"\\\\] # non-quote or backslash\n | \\\\. 
# or escaped single character\n | \\\\x[0-9a-fA-F]{2} # or escaped raw character\n | \\\\u[0-9a-fA-F]{4} # or unicode escape\n | \\\\U[0-9a-fA-F]{8} # or long unicode escape\n )* # one or more times\n \" # end string\n''')\n\n\nlg.add('IDENTIFIER', r'[^()\\[\\]{}\\'\"\\s;]+')\n\n\nlg.ignore(r';.*(?=\\r|\\n|$)')\nlg.ignore(r'\\s+')\n\n\nlexer = lg.build()\n", "path": "hy/lex/lexer.py"}]} | 1,580 | 102 |
gh_patches_debug_91 | rasdani/github-patches | git_diff | projectmesa__mesa-535 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Issue 523
add networkx to the dependency specified in setup.py for #523 to allow all the tests to run without error when installing with the `pip install -e .` command.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3 import re
4
5 from setuptools import setup, find_packages
6 from codecs import open
7
8 requires = [
9 'tornado >= 4.2, < 5.0.0',
10 'numpy',
11 'pandas',
12 'tqdm',
13 'click',
14 ]
15
16 version = ''
17 with open('mesa/__init__.py', 'r') as fd:
18 version = re.search(r'^__version__\s*=\s*[\'"]([^\'"]*)[\'"]',
19 fd.read(), re.MULTILINE).group(1)
20
21 with open('README.rst', 'rb', encoding='utf-8') as f:
22 readme = f.read()
23
24 setup(
25 name='Mesa',
26 version=version,
27 description="Agent-based modeling (ABM) in Python 3+",
28 long_description=readme,
29 author='Project Mesa Team',
30 author_email='[email protected]',
31 url='https://github.com/projectmesa/mesa',
32 packages=find_packages(),
33 package_data={'mesa': ['visualization/templates/*.html', 'visualization/templates/css/*',
34 'visualization/templates/fonts/*', 'visualization/templates/js/*']},
35 include_package_data=True,
36 install_requires=requires,
37 keywords='agent based modeling model ABM simulation multi-agent',
38 license='Apache 2.0',
39 zip_safe=False,
40 classifiers=[
41 'Topic :: Scientific/Engineering',
42 'Topic :: Scientific/Engineering :: Artificial Life',
43 'Topic :: Scientific/Engineering :: Artificial Intelligence',
44 'Intended Audience :: Science/Research',
45 'Programming Language :: Python :: 3 :: Only',
46 'License :: OSI Approved :: Apache Software License',
47 'Operating System :: OS Independent',
48 'Development Status :: 3 - Alpha',
49 'Natural Language :: English',
50 ],
51 entry_points='''
52 [console_scripts]
53 mesa=mesa.main:cli
54 ''',
55 )
56
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -7,6 +7,7 @@
requires = [
'tornado >= 4.2, < 5.0.0',
+ 'networkx',
'numpy',
'pandas',
'tqdm',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -7,6 +7,7 @@\n \n requires = [\n 'tornado >= 4.2, < 5.0.0',\n+ 'networkx',\n 'numpy',\n 'pandas',\n 'tqdm',\n", "issue": "Issue 523\nadd networkx to the dependency specified in setup.py for #523 to allow all the tests to run without error when installing with the `pip install -e .` command.\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nimport re\n\nfrom setuptools import setup, find_packages\nfrom codecs import open\n\nrequires = [\n 'tornado >= 4.2, < 5.0.0',\n 'numpy',\n 'pandas',\n 'tqdm',\n 'click',\n]\n\nversion = ''\nwith open('mesa/__init__.py', 'r') as fd:\n version = re.search(r'^__version__\\s*=\\s*[\\'\"]([^\\'\"]*)[\\'\"]',\n fd.read(), re.MULTILINE).group(1)\n\nwith open('README.rst', 'rb', encoding='utf-8') as f:\n readme = f.read()\n\nsetup(\n name='Mesa',\n version=version,\n description=\"Agent-based modeling (ABM) in Python 3+\",\n long_description=readme,\n author='Project Mesa Team',\n author_email='[email protected]',\n url='https://github.com/projectmesa/mesa',\n packages=find_packages(),\n package_data={'mesa': ['visualization/templates/*.html', 'visualization/templates/css/*',\n 'visualization/templates/fonts/*', 'visualization/templates/js/*']},\n include_package_data=True,\n install_requires=requires,\n keywords='agent based modeling model ABM simulation multi-agent',\n license='Apache 2.0',\n zip_safe=False,\n classifiers=[\n 'Topic :: Scientific/Engineering',\n 'Topic :: Scientific/Engineering :: Artificial Life',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Intended Audience :: Science/Research',\n 'Programming Language :: Python :: 3 :: Only',\n 'License :: OSI Approved :: Apache Software License',\n 'Operating System :: OS Independent',\n 'Development Status :: 3 - Alpha',\n 'Natural Language :: English',\n ],\n entry_points='''\n [console_scripts]\n mesa=mesa.main:cli\n ''',\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nimport re\n\nfrom setuptools import setup, find_packages\nfrom codecs import open\n\nrequires = [\n 'tornado >= 4.2, < 5.0.0',\n 'networkx',\n 'numpy',\n 'pandas',\n 'tqdm',\n]\n\nversion = ''\nwith open('mesa/__init__.py', 'r') as fd:\n version = re.search(r'^__version__\\s*=\\s*[\\'\"]([^\\'\"]*)[\\'\"]',\n fd.read(), re.MULTILINE).group(1)\n\nwith open('README.rst', 'rb', encoding='utf-8') as f:\n readme = f.read()\n\nsetup(\n name='Mesa',\n version=version,\n description=\"Agent-based modeling (ABM) in Python 3+\",\n long_description=readme,\n author='Project Mesa Team',\n author_email='[email protected]',\n url='https://github.com/projectmesa/mesa',\n packages=find_packages(),\n package_data={'mesa': ['visualization/templates/*.html', 'visualization/templates/css/*',\n 'visualization/templates/fonts/*', 'visualization/templates/js/*']},\n include_package_data=True,\n install_requires=requires,\n keywords='agent based modeling model ABM simulation multi-agent',\n license='Apache 2.0',\n zip_safe=False,\n classifiers=[\n 'Topic :: Scientific/Engineering',\n 'Topic :: Scientific/Engineering :: Artificial Life',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Intended Audience :: Science/Research',\n 'Programming Language :: Python :: 3 :: Only',\n 'License :: OSI Approved :: Apache Software License',\n 'Operating System :: OS Independent',\n 'Development Status :: 3 - Alpha',\n 'Natural Language :: English',\n ],\n)\n", 
"path": "setup.py"}]} | 821 | 73 |
gh_patches_debug_24159 | rasdani/github-patches | git_diff | pytorch__ignite-399 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Improve State documentation
Hi,
I think the documentation of states and the way they're used by engines is not detailed enough.
Currently the only way to know what attributes are available in the default State object, used by the engines generated by convenience methods such as `create_supervised_trainer`, is to look at the source code.
Some of them are mentioned in the [Concept](https://pytorch.org/ignite/concepts.html#state) part of the documentation but there is still a lot of room for confusion about the way they work and some are missing.
For example, no information about the `batch` attribute which without documentation some might assume to correspond to the batch index (kinda like `epoch` for the epoch index) which is actually `iteration`. Same thing for `epoch` which is used implicitly in examples but no mention of it in the default `State` documentation, and you might think it's obvious but then someone will try to create a default `State` and assume `epoch` is a built-in attribute set to 0 at instantiation when in fact you'll have to search through the source code to figure out that it's not a default attribute and is only created by the default `Engine` through a kwarg passed to the `State` class.
Additionally this kind of explicit documentation of the state should be done every time a custom `State` is created otherwise the user is left to guess what attributes are available in it, what they mean and what format they're in. For example, is `output` the return from the forward pass of your model? The loss?
You'll have to look at `Engine` then find out it's the output of the process_function argument. Then figure out that in the case of the `create_supervised_trainer` that function is `_update` which returns `loss.item()` and in the case of `create_supervised_evaluator` returns `(y_pred, y)`.
Thank you, and sorry for the verbosity but without it I was afraid this would be rejected as trivial.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ignite/engine/__init__.py`
Content:
```
1 import torch
2
3 from ignite.engine.engine import Engine, State, Events
4 from ignite.utils import convert_tensor
5
6
7 def _prepare_batch(batch, device=None, non_blocking=False):
8 """Prepare batch for training: pass to a device with options.
9
10 """
11 x, y = batch
12 return (convert_tensor(x, device=device, non_blocking=non_blocking),
13 convert_tensor(y, device=device, non_blocking=non_blocking))
14
15
16 def create_supervised_trainer(model, optimizer, loss_fn,
17 device=None, non_blocking=False,
18 prepare_batch=_prepare_batch):
19 """
20 Factory function for creating a trainer for supervised models.
21
22 Args:
23 model (`torch.nn.Module`): the model to train.
24 optimizer (`torch.optim.Optimizer`): the optimizer to use.
25 loss_fn (torch.nn loss function): the loss function to use.
26 device (str, optional): device type specification (default: None).
27 Applies to both model and batches.
28 non_blocking (bool, optional): if True and this copy is between CPU and GPU, the copy may occur asynchronously
29 with respect to the host. For other cases, this argument has no effect.
30 prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs
31 tuple of tensors `(batch_x, batch_y)`.
32
33 Returns:
34 Engine: a trainer engine with supervised update function.
35 """
36 if device:
37 model.to(device)
38
39 def _update(engine, batch):
40 model.train()
41 optimizer.zero_grad()
42 x, y = prepare_batch(batch, device=device, non_blocking=non_blocking)
43 y_pred = model(x)
44 loss = loss_fn(y_pred, y)
45 loss.backward()
46 optimizer.step()
47 return loss.item()
48
49 return Engine(_update)
50
51
52 def create_supervised_evaluator(model, metrics={},
53 device=None, non_blocking=False,
54 prepare_batch=_prepare_batch):
55 """
56 Factory function for creating an evaluator for supervised models.
57
58 Args:
59 model (`torch.nn.Module`): the model to train.
60 metrics (dict of str - :class:`~ignite.metrics.Metric`): a map of metric names to Metrics.
61 device (str, optional): device type specification (default: None).
62 Applies to both model and batches.
63 non_blocking (bool, optional): if True and this copy is between CPU and GPU, the copy may occur asynchronously
64 with respect to the host. For other cases, this argument has no effect.
65 prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs
66 tuple of tensors `(batch_x, batch_y)`.
67
68 Returns:
69 Engine: an evaluator engine with supervised inference function.
70 """
71 if device:
72 model.to(device)
73
74 def _inference(engine, batch):
75 model.eval()
76 with torch.no_grad():
77 x, y = prepare_batch(batch, device=device, non_blocking=non_blocking)
78 y_pred = model(x)
79 return y_pred, y
80
81 engine = Engine(_inference)
82
83 for name, metric in metrics.items():
84 metric.attach(engine, name)
85
86 return engine
87
```
Path: `ignite/__init__.py`
Content:
```
1 import ignite.engine
2 import ignite.handlers
3 import ignite.metrics
4 import ignite.exceptions
5 import ignite.contrib
6
7 __version__ = '0.1.2'
8
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ignite/__init__.py b/ignite/__init__.py
--- a/ignite/__init__.py
+++ b/ignite/__init__.py
@@ -3,5 +3,6 @@
import ignite.metrics
import ignite.exceptions
import ignite.contrib
+import ignite.utils
__version__ = '0.1.2'
diff --git a/ignite/engine/__init__.py b/ignite/engine/__init__.py
--- a/ignite/engine/__init__.py
+++ b/ignite/engine/__init__.py
@@ -30,6 +30,8 @@
prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs
tuple of tensors `(batch_x, batch_y)`.
+ Note: `engine.state.output` for this engine is the loss of the processed batch.
+
Returns:
Engine: a trainer engine with supervised update function.
"""
@@ -65,6 +67,8 @@
prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs
tuple of tensors `(batch_x, batch_y)`.
+ Note: `engine.state.output` for this engine is a tuple of `(batch_pred, batch_y)`.
+
Returns:
Engine: an evaluator engine with supervised inference function.
"""
| {"golden_diff": "diff --git a/ignite/__init__.py b/ignite/__init__.py\n--- a/ignite/__init__.py\n+++ b/ignite/__init__.py\n@@ -3,5 +3,6 @@\n import ignite.metrics\n import ignite.exceptions\n import ignite.contrib\n+import ignite.utils\n \n __version__ = '0.1.2'\ndiff --git a/ignite/engine/__init__.py b/ignite/engine/__init__.py\n--- a/ignite/engine/__init__.py\n+++ b/ignite/engine/__init__.py\n@@ -30,6 +30,8 @@\n prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs\n tuple of tensors `(batch_x, batch_y)`.\n \n+ Note: `engine.state.output` for this engine is the loss of the processed batch.\n+\n Returns:\n Engine: a trainer engine with supervised update function.\n \"\"\"\n@@ -65,6 +67,8 @@\n prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs\n tuple of tensors `(batch_x, batch_y)`.\n \n+ Note: `engine.state.output` for this engine is a tuple of `(batch_pred, batch_y)`.\n+\n Returns:\n Engine: an evaluator engine with supervised inference function.\n \"\"\"\n", "issue": "Improve State documentation\nHi,\r\n\r\nI think the documentation of states and the way they're used by engines is not detailed enough.\r\nCurrently the only way to know what attributes are available in the default State object, used by the engines generated by convenience methods such as `create_supervised_trainer`, is to look at the source code. \r\nSome of them are mentioned in the [Concept](https://pytorch.org/ignite/concepts.html#state) part of the documentation but there is still a lot of room for confusion about the way they work and some are missing.\r\nFor example, no information about the `batch` attribute which without documentation some might assume to correspond to the batch index (kinda like `epoch` for the epoch index) which is actually `iteration`. Same thing for `epoch` which is used implicitly in examples but no mention of it in the default `State` documentation, and you might think it's obvious but then someone will try to create a default `State` and assume `epoch` is a built-in attribute set to 0 at instantiation when in fact you'll have to search through the source code to figure out that it's not a default attribute and is only created by the default `Engine` through a kwarg passed to the `State` class.\r\n\r\nAdditionally this kind of explicit documentation of the state should be done every time a custom `State` is created otherwise the user is left to guess what attributes are available in it, what they mean and what format they're in. For example, is `output` the return from the forward pass of your model? The loss?\r\nYou'll have to look at `Engine` then find out it's the output of the process_function argument. 
Then figure out that in the case of the `create_supervised_trainer` that function is `_update` which returns `loss.item()` and in the case of `create_supervised_evaluator` returns `(y_pred, y)`.\r\n\r\nThank you, and sorry for the verbosity but without it I was afraid this would be rejected as trivial.\n", "before_files": [{"content": "import torch\n\nfrom ignite.engine.engine import Engine, State, Events\nfrom ignite.utils import convert_tensor\n\n\ndef _prepare_batch(batch, device=None, non_blocking=False):\n \"\"\"Prepare batch for training: pass to a device with options.\n\n \"\"\"\n x, y = batch\n return (convert_tensor(x, device=device, non_blocking=non_blocking),\n convert_tensor(y, device=device, non_blocking=non_blocking))\n\n\ndef create_supervised_trainer(model, optimizer, loss_fn,\n device=None, non_blocking=False,\n prepare_batch=_prepare_batch):\n \"\"\"\n Factory function for creating a trainer for supervised models.\n\n Args:\n model (`torch.nn.Module`): the model to train.\n optimizer (`torch.optim.Optimizer`): the optimizer to use.\n loss_fn (torch.nn loss function): the loss function to use.\n device (str, optional): device type specification (default: None).\n Applies to both model and batches.\n non_blocking (bool, optional): if True and this copy is between CPU and GPU, the copy may occur asynchronously\n with respect to the host. For other cases, this argument has no effect.\n prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs\n tuple of tensors `(batch_x, batch_y)`.\n\n Returns:\n Engine: a trainer engine with supervised update function.\n \"\"\"\n if device:\n model.to(device)\n\n def _update(engine, batch):\n model.train()\n optimizer.zero_grad()\n x, y = prepare_batch(batch, device=device, non_blocking=non_blocking)\n y_pred = model(x)\n loss = loss_fn(y_pred, y)\n loss.backward()\n optimizer.step()\n return loss.item()\n\n return Engine(_update)\n\n\ndef create_supervised_evaluator(model, metrics={},\n device=None, non_blocking=False,\n prepare_batch=_prepare_batch):\n \"\"\"\n Factory function for creating an evaluator for supervised models.\n\n Args:\n model (`torch.nn.Module`): the model to train.\n metrics (dict of str - :class:`~ignite.metrics.Metric`): a map of metric names to Metrics.\n device (str, optional): device type specification (default: None).\n Applies to both model and batches.\n non_blocking (bool, optional): if True and this copy is between CPU and GPU, the copy may occur asynchronously\n with respect to the host. 
For other cases, this argument has no effect.\n prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs\n tuple of tensors `(batch_x, batch_y)`.\n\n Returns:\n Engine: an evaluator engine with supervised inference function.\n \"\"\"\n if device:\n model.to(device)\n\n def _inference(engine, batch):\n model.eval()\n with torch.no_grad():\n x, y = prepare_batch(batch, device=device, non_blocking=non_blocking)\n y_pred = model(x)\n return y_pred, y\n\n engine = Engine(_inference)\n\n for name, metric in metrics.items():\n metric.attach(engine, name)\n\n return engine\n", "path": "ignite/engine/__init__.py"}, {"content": "import ignite.engine\nimport ignite.handlers\nimport ignite.metrics\nimport ignite.exceptions\nimport ignite.contrib\n\n__version__ = '0.1.2'\n", "path": "ignite/__init__.py"}], "after_files": [{"content": "import torch\n\nfrom ignite.engine.engine import Engine, State, Events\nfrom ignite.utils import convert_tensor\n\n\ndef _prepare_batch(batch, device=None, non_blocking=False):\n \"\"\"Prepare batch for training: pass to a device with options.\n\n \"\"\"\n x, y = batch\n return (convert_tensor(x, device=device, non_blocking=non_blocking),\n convert_tensor(y, device=device, non_blocking=non_blocking))\n\n\ndef create_supervised_trainer(model, optimizer, loss_fn,\n device=None, non_blocking=False,\n prepare_batch=_prepare_batch):\n \"\"\"\n Factory function for creating a trainer for supervised models.\n\n Args:\n model (`torch.nn.Module`): the model to train.\n optimizer (`torch.optim.Optimizer`): the optimizer to use.\n loss_fn (torch.nn loss function): the loss function to use.\n device (str, optional): device type specification (default: None).\n Applies to both model and batches.\n non_blocking (bool, optional): if True and this copy is between CPU and GPU, the copy may occur asynchronously\n with respect to the host. For other cases, this argument has no effect.\n prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs\n tuple of tensors `(batch_x, batch_y)`.\n\n Note: `engine.state.output` for this engine is the loss of the processed batch.\n\n Returns:\n Engine: a trainer engine with supervised update function.\n \"\"\"\n if device:\n model.to(device)\n\n def _update(engine, batch):\n model.train()\n optimizer.zero_grad()\n x, y = prepare_batch(batch, device=device, non_blocking=non_blocking)\n y_pred = model(x)\n loss = loss_fn(y_pred, y)\n loss.backward()\n optimizer.step()\n return loss.item()\n\n return Engine(_update)\n\n\ndef create_supervised_evaluator(model, metrics={},\n device=None, non_blocking=False,\n prepare_batch=_prepare_batch):\n \"\"\"\n Factory function for creating an evaluator for supervised models.\n\n Args:\n model (`torch.nn.Module`): the model to train.\n metrics (dict of str - :class:`~ignite.metrics.Metric`): a map of metric names to Metrics.\n device (str, optional): device type specification (default: None).\n Applies to both model and batches.\n non_blocking (bool, optional): if True and this copy is between CPU and GPU, the copy may occur asynchronously\n with respect to the host. 
For other cases, this argument has no effect.\n prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs\n tuple of tensors `(batch_x, batch_y)`.\n\n Note: `engine.state.output` for this engine is a tuple of `(batch_pred, batch_y)`.\n\n Returns:\n Engine: an evaluator engine with supervised inference function.\n \"\"\"\n if device:\n model.to(device)\n\n def _inference(engine, batch):\n model.eval()\n with torch.no_grad():\n x, y = prepare_batch(batch, device=device, non_blocking=non_blocking)\n y_pred = model(x)\n return y_pred, y\n\n engine = Engine(_inference)\n\n for name, metric in metrics.items():\n metric.attach(engine, name)\n\n return engine\n", "path": "ignite/engine/__init__.py"}, {"content": "import ignite.engine\nimport ignite.handlers\nimport ignite.metrics\nimport ignite.exceptions\nimport ignite.contrib\nimport ignite.utils\n\n__version__ = '0.1.2'\n", "path": "ignite/__init__.py"}]} | 1,571 | 290 |
gh_patches_debug_21524 | rasdani/github-patches | git_diff | mindsdb__mindsdb-981 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Route to check connection
At this moment in GUI possible check connection only for already exists integration. Route need for checking connection before integration created.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mindsdb/api/http/namespaces/config.py`
Content:
```
1 import copy
2 import traceback
3
4 from flask import request
5 from flask_restx import Resource, abort
6 from flask import current_app as ca
7
8 from mindsdb.api.http.namespaces.configs.config import ns_conf
9 from mindsdb.interfaces.database.database import DatabaseWrapper
10 from mindsdb.utilities.functions import get_all_models_meta_data
11
12
13 def get_integration(name):
14 integrations = ca.config_obj.get('integrations', {})
15 return integrations.get(name)
16
17
18 @ns_conf.route('/integrations')
19 @ns_conf.param('name', 'List all database integration')
20 class ListIntegration(Resource):
21 @ns_conf.doc('get_integrations')
22 def get(self):
23 return {'integrations': [k for k in ca.config_obj.get('integrations', {})]}
24
25
26 @ns_conf.route('/all_integrations')
27 @ns_conf.param('name', 'List all database integration')
28 class AllIntegration(Resource):
29 @ns_conf.doc('get_all_integrations')
30 def get(self):
31 integrations = copy.deepcopy(
32 ca.config_obj.get('integrations', {})
33 )
34 for integration in integrations.values():
35 if 'password' in integration:
36 integration['password'] = None
37 return integrations
38
39
40 @ns_conf.route('/integrations/<name>')
41 @ns_conf.param('name', 'Database integration')
42 class Integration(Resource):
43 @ns_conf.doc('get_integration')
44 def get(self, name):
45 integration = get_integration(name)
46 if integration is None:
47 abort(404, f'Can\'t find database integration: {name}')
48 integration = copy.deepcopy(integration)
49 if 'password' in integration:
50 integration['password'] = None
51 return integration
52
53 @ns_conf.doc('put_integration')
54 def put(self, name):
55 params = request.json.get('params')
56 if not isinstance(params, dict):
57 abort(400, "type of 'params' must be dict")
58 integration = get_integration(name)
59 if integration is not None:
60 abort(400, f"Integration with name '{name}' already exists")
61 try:
62 if 'enabled' in params:
63 params['publish'] = params['enabled']
64 del params['enabled']
65 ca.config_obj.add_db_integration(name, params)
66
67 mdb = ca.mindsdb_native
68 cst = ca.custom_models
69 model_data_arr = get_all_models_meta_data(mdb, cst)
70 dbw = DatabaseWrapper(ca.config_obj)
71 dbw.register_predictors(model_data_arr)
72 except Exception as e:
73 print(traceback.format_exc())
74 abort(500, f'Error during config update: {str(e)}')
75 return '', 200
76
77 @ns_conf.doc('delete_integration')
78 def delete(self, name):
79 integration = get_integration(name)
80 if integration is None:
81 abort(400, f"Nothing to delete. '{name}' not exists.")
82 try:
83 ca.config_obj.remove_db_integration(name)
84 except Exception as e:
85 print(traceback.format_exc())
86 abort(500, f'Error during integration delete: {str(e)}')
87 return '', 200
88
89 @ns_conf.doc('modify_integration')
90 def post(self, name):
91 params = request.json.get('params')
92 if not isinstance(params, dict):
93 abort(400, "type of 'params' must be dict")
94 integration = get_integration(name)
95 if integration is None:
96 abort(400, f"Nothin to modify. '{name}' not exists.")
97 try:
98 if 'enabled' in params:
99 params['publish'] = params['enabled']
100 del params['enabled']
101 ca.config_obj.modify_db_integration(name, params)
102 DatabaseWrapper(ca.config_obj)
103 except Exception as e:
104 print(traceback.format_exc())
105 abort(500, f'Error during integration modifycation: {str(e)}')
106 return '', 200
107
108
109 @ns_conf.route('/integrations/<name>/check')
110 @ns_conf.param('name', 'Database integration checks')
111 class Check(Resource):
112 @ns_conf.doc('check')
113 def get(self, name):
114 if get_integration(name) is None:
115 abort(404, f'Can\'t find database integration: {name}')
116 dbw = DatabaseWrapper(ca.config_obj)
117 connections = dbw.check_connections()
118 return connections.get(name, False), 200
119
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mindsdb/api/http/namespaces/config.py b/mindsdb/api/http/namespaces/config.py
--- a/mindsdb/api/http/namespaces/config.py
+++ b/mindsdb/api/http/namespaces/config.py
@@ -55,6 +55,11 @@
params = request.json.get('params')
if not isinstance(params, dict):
abort(400, "type of 'params' must be dict")
+
+ is_test = params.get('test', False)
+ if is_test:
+ del params['test']
+
integration = get_integration(name)
if integration is not None:
abort(400, f"Integration with name '{name}' already exists")
@@ -72,6 +77,12 @@
except Exception as e:
print(traceback.format_exc())
abort(500, f'Error during config update: {str(e)}')
+
+ if is_test:
+ cons = dbw.check_connections()
+ ca.config_obj.remove_db_integration(name)
+ return {'success': cons[name]}, 200
+
return '', 200
@ns_conf.doc('delete_integration')
| {"golden_diff": "diff --git a/mindsdb/api/http/namespaces/config.py b/mindsdb/api/http/namespaces/config.py\n--- a/mindsdb/api/http/namespaces/config.py\n+++ b/mindsdb/api/http/namespaces/config.py\n@@ -55,6 +55,11 @@\n params = request.json.get('params')\n if not isinstance(params, dict):\n abort(400, \"type of 'params' must be dict\")\n+\n+ is_test = params.get('test', False)\n+ if is_test:\n+ del params['test']\n+\n integration = get_integration(name)\n if integration is not None:\n abort(400, f\"Integration with name '{name}' already exists\")\n@@ -72,6 +77,12 @@\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during config update: {str(e)}')\n+\n+ if is_test:\n+ cons = dbw.check_connections()\n+ ca.config_obj.remove_db_integration(name)\n+ return {'success': cons[name]}, 200\n+\n return '', 200\n \n @ns_conf.doc('delete_integration')\n", "issue": "Route to check connection\nAt this moment in GUI possible check connection only for already exists integration. Route need for checking connection before integration created.\n", "before_files": [{"content": "import copy\nimport traceback\n\nfrom flask import request\nfrom flask_restx import Resource, abort\nfrom flask import current_app as ca\n\nfrom mindsdb.api.http.namespaces.configs.config import ns_conf\nfrom mindsdb.interfaces.database.database import DatabaseWrapper\nfrom mindsdb.utilities.functions import get_all_models_meta_data\n\n\ndef get_integration(name):\n integrations = ca.config_obj.get('integrations', {})\n return integrations.get(name)\n\n\n@ns_conf.route('/integrations')\n@ns_conf.param('name', 'List all database integration')\nclass ListIntegration(Resource):\n @ns_conf.doc('get_integrations')\n def get(self):\n return {'integrations': [k for k in ca.config_obj.get('integrations', {})]}\n\n\n@ns_conf.route('/all_integrations')\n@ns_conf.param('name', 'List all database integration')\nclass AllIntegration(Resource):\n @ns_conf.doc('get_all_integrations')\n def get(self):\n integrations = copy.deepcopy(\n ca.config_obj.get('integrations', {})\n )\n for integration in integrations.values():\n if 'password' in integration:\n integration['password'] = None\n return integrations\n\n\n@ns_conf.route('/integrations/<name>')\n@ns_conf.param('name', 'Database integration')\nclass Integration(Resource):\n @ns_conf.doc('get_integration')\n def get(self, name):\n integration = get_integration(name)\n if integration is None:\n abort(404, f'Can\\'t find database integration: {name}')\n integration = copy.deepcopy(integration)\n if 'password' in integration:\n integration['password'] = None\n return integration\n\n @ns_conf.doc('put_integration')\n def put(self, name):\n params = request.json.get('params')\n if not isinstance(params, dict):\n abort(400, \"type of 'params' must be dict\")\n integration = get_integration(name)\n if integration is not None:\n abort(400, f\"Integration with name '{name}' already exists\")\n try:\n if 'enabled' in params:\n params['publish'] = params['enabled']\n del params['enabled']\n ca.config_obj.add_db_integration(name, params)\n\n mdb = ca.mindsdb_native\n cst = ca.custom_models\n model_data_arr = get_all_models_meta_data(mdb, cst)\n dbw = DatabaseWrapper(ca.config_obj)\n dbw.register_predictors(model_data_arr)\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during config update: {str(e)}')\n return '', 200\n\n @ns_conf.doc('delete_integration')\n def delete(self, name):\n integration = get_integration(name)\n if integration is None:\n 
abort(400, f\"Nothing to delete. '{name}' not exists.\")\n try:\n ca.config_obj.remove_db_integration(name)\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during integration delete: {str(e)}')\n return '', 200\n\n @ns_conf.doc('modify_integration')\n def post(self, name):\n params = request.json.get('params')\n if not isinstance(params, dict):\n abort(400, \"type of 'params' must be dict\")\n integration = get_integration(name)\n if integration is None:\n abort(400, f\"Nothin to modify. '{name}' not exists.\")\n try:\n if 'enabled' in params:\n params['publish'] = params['enabled']\n del params['enabled']\n ca.config_obj.modify_db_integration(name, params)\n DatabaseWrapper(ca.config_obj)\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during integration modifycation: {str(e)}')\n return '', 200\n\n\n@ns_conf.route('/integrations/<name>/check')\n@ns_conf.param('name', 'Database integration checks')\nclass Check(Resource):\n @ns_conf.doc('check')\n def get(self, name):\n if get_integration(name) is None:\n abort(404, f'Can\\'t find database integration: {name}')\n dbw = DatabaseWrapper(ca.config_obj)\n connections = dbw.check_connections()\n return connections.get(name, False), 200\n", "path": "mindsdb/api/http/namespaces/config.py"}], "after_files": [{"content": "import copy\nimport traceback\n\nfrom flask import request\nfrom flask_restx import Resource, abort\nfrom flask import current_app as ca\n\nfrom mindsdb.api.http.namespaces.configs.config import ns_conf\nfrom mindsdb.interfaces.database.database import DatabaseWrapper\nfrom mindsdb.utilities.functions import get_all_models_meta_data\n\n\ndef get_integration(name):\n integrations = ca.config_obj.get('integrations', {})\n return integrations.get(name)\n\n\n@ns_conf.route('/integrations')\n@ns_conf.param('name', 'List all database integration')\nclass ListIntegration(Resource):\n @ns_conf.doc('get_integrations')\n def get(self):\n return {'integrations': [k for k in ca.config_obj.get('integrations', {})]}\n\n\n@ns_conf.route('/all_integrations')\n@ns_conf.param('name', 'List all database integration')\nclass AllIntegration(Resource):\n @ns_conf.doc('get_all_integrations')\n def get(self):\n integrations = copy.deepcopy(\n ca.config_obj.get('integrations', {})\n )\n for integration in integrations.values():\n if 'password' in integration:\n integration['password'] = None\n return integrations\n\n\n@ns_conf.route('/integrations/<name>')\n@ns_conf.param('name', 'Database integration')\nclass Integration(Resource):\n @ns_conf.doc('get_integration')\n def get(self, name):\n integration = get_integration(name)\n if integration is None:\n abort(404, f'Can\\'t find database integration: {name}')\n integration = copy.deepcopy(integration)\n if 'password' in integration:\n integration['password'] = None\n return integration\n\n @ns_conf.doc('put_integration')\n def put(self, name):\n params = request.json.get('params')\n if not isinstance(params, dict):\n abort(400, \"type of 'params' must be dict\")\n\n is_test = params.get('test', False)\n if is_test:\n del params['test']\n\n integration = get_integration(name)\n if integration is not None:\n abort(400, f\"Integration with name '{name}' already exists\")\n try:\n if 'enabled' in params:\n params['publish'] = params['enabled']\n del params['enabled']\n ca.config_obj.add_db_integration(name, params)\n\n mdb = ca.mindsdb_native\n cst = ca.custom_models\n model_data_arr = get_all_models_meta_data(mdb, cst)\n dbw = 
DatabaseWrapper(ca.config_obj)\n dbw.register_predictors(model_data_arr)\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during config update: {str(e)}')\n\n if is_test:\n cons = dbw.check_connections()\n ca.config_obj.remove_db_integration(name)\n return {'success': cons[name]}, 200\n\n return '', 200\n\n @ns_conf.doc('delete_integration')\n def delete(self, name):\n integration = get_integration(name)\n if integration is None:\n abort(400, f\"Nothing to delete. '{name}' not exists.\")\n try:\n ca.config_obj.remove_db_integration(name)\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during integration delete: {str(e)}')\n return '', 200\n\n @ns_conf.doc('modify_integration')\n def post(self, name):\n params = request.json.get('params')\n if not isinstance(params, dict):\n abort(400, \"type of 'params' must be dict\")\n integration = get_integration(name)\n if integration is None:\n abort(400, f\"Nothin to modify. '{name}' not exists.\")\n try:\n if 'enabled' in params:\n params['publish'] = params['enabled']\n del params['enabled']\n ca.config_obj.modify_db_integration(name, params)\n DatabaseWrapper(ca.config_obj)\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during integration modifycation: {str(e)}')\n return '', 200\n\n\n@ns_conf.route('/integrations/<name>/check')\n@ns_conf.param('name', 'Database integration checks')\nclass Check(Resource):\n @ns_conf.doc('check')\n def get(self, name):\n if get_integration(name) is None:\n abort(404, f'Can\\'t find database integration: {name}')\n dbw = DatabaseWrapper(ca.config_obj)\n connections = dbw.check_connections()\n return connections.get(name, False), 200\n", "path": "mindsdb/api/http/namespaces/config.py"}]} | 1,485 | 261 |
gh_patches_debug_29959 | rasdani/github-patches | git_diff | joke2k__faker-1800 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Latvian ssn generator is not correct
* Faker version: 17.0.0
* OS: Windows
Latvian ssn generator generates ssn in format 'ddd-dd-dddd', which is not correct.
### Steps to reproduce
```
from faker import Faker
fake = Faker('lv_LV')
fake.ssn()
```
### Expected behavior
It should be 12 characters long DDMMYY-CZZZQ e.g. 170334-19862
### Actual behavior
fake.ssn() for latvian locale outputs ssn in format 'ddd-dd-dddd'
'604-87-6475'
'824-21-9489'
'411-57-8156'
'938-77-8408'
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `faker/providers/ssn/lv_LV/__init__.py`
Content:
```
1 from .. import Provider as BaseProvider
2
3
4 class Provider(BaseProvider):
5 """
6 A Faker provider for the Latvian VAT IDs
7 """
8
9 vat_id_formats = ("LV###########",)
10
11 def vat_id(self) -> str:
12 """
13 http://ec.europa.eu/taxation_customs/vies/faq.html#item_11
14 :return: a random Latvian VAT ID
15 """
16
17 return self.bothify(self.random_element(self.vat_id_formats))
18
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/faker/providers/ssn/lv_LV/__init__.py b/faker/providers/ssn/lv_LV/__init__.py
--- a/faker/providers/ssn/lv_LV/__init__.py
+++ b/faker/providers/ssn/lv_LV/__init__.py
@@ -1,7 +1,56 @@
-from .. import Provider as BaseProvider
+import datetime
+from .. import Provider as SsnProvider
+
+
+class Provider(SsnProvider):
+ def ssn(self, min_age: int = 0, max_age: int = 105) -> str:
+ """
+ Returns 11 character Latvian personal identity code (Personas kods).
+ This function assigns random age to person.
+
+ Personal code consists of eleven characters of the form DDMMYYCZZZQ, where
+ DDMMYY is the date of birth, C the century sign, ZZZ the individual
+ number and Q the control character (checksum). The number for the
+ century is either 0 (1800–1899), 1 (1900–1999), or 2 (2000–2099).
+ """
+
+ def _checksum(ssn_without_checksum):
+ weights = [1, 6, 3, 7, 9, 10, 5, 8, 4, 2]
+ weighted_sum = sum(int(digit) * weight for digit, weight in zip(ssn_without_checksum, weights))
+ reminder = (1 - weighted_sum) % 11
+ if reminder == 10:
+ return 0
+ elif reminder < -1:
+ return reminder + 11
+ return reminder
+
+ age = datetime.timedelta(days=self.generator.random.randrange(min_age * 365, max_age * 365))
+ birthday = datetime.date.today() - age
+ ssn_date = "%02d%02d%s" % (
+ birthday.day,
+ birthday.month,
+ str(birthday.year)[-2:],
+ )
+ century = self._get_century_code(birthday.year) # Century
+ suffix = self.generator.random.randrange(111, 999)
+ checksum = _checksum(f"{ssn_date}{century:01d}{suffix:03d}")
+ ssn = f"{ssn_date}-{century:01d}{suffix:03d}{checksum:01d}"
+ return ssn
+
+ @staticmethod
+ def _get_century_code(year: int) -> int:
+ """Returns the century code for a given year"""
+ if 2000 <= year < 3000:
+ code = 2
+ elif 1900 <= year < 2000:
+ code = 1
+ elif 1800 <= year < 1900:
+ code = 0
+ else:
+ raise ValueError("SSN do not support people born before the year 1800 or after the year 2999")
+ return code
-class Provider(BaseProvider):
"""
A Faker provider for the Latvian VAT IDs
"""
| {"golden_diff": "diff --git a/faker/providers/ssn/lv_LV/__init__.py b/faker/providers/ssn/lv_LV/__init__.py\n--- a/faker/providers/ssn/lv_LV/__init__.py\n+++ b/faker/providers/ssn/lv_LV/__init__.py\n@@ -1,7 +1,56 @@\n-from .. import Provider as BaseProvider\n+import datetime\n \n+from .. import Provider as SsnProvider\n+\n+\n+class Provider(SsnProvider):\n+ def ssn(self, min_age: int = 0, max_age: int = 105) -> str:\n+ \"\"\"\n+ Returns 11 character Latvian personal identity code (Personas kods).\n+ This function assigns random age to person.\n+\n+ Personal code consists of eleven characters of the form DDMMYYCZZZQ, where\n+ DDMMYY is the date of birth, C the century sign, ZZZ the individual\n+ number and Q the control character (checksum). The number for the\n+ century is either 0 (1800\u20131899), 1 (1900\u20131999), or 2 (2000\u20132099).\n+ \"\"\"\n+\n+ def _checksum(ssn_without_checksum):\n+ weights = [1, 6, 3, 7, 9, 10, 5, 8, 4, 2]\n+ weighted_sum = sum(int(digit) * weight for digit, weight in zip(ssn_without_checksum, weights))\n+ reminder = (1 - weighted_sum) % 11\n+ if reminder == 10:\n+ return 0\n+ elif reminder < -1:\n+ return reminder + 11\n+ return reminder\n+\n+ age = datetime.timedelta(days=self.generator.random.randrange(min_age * 365, max_age * 365))\n+ birthday = datetime.date.today() - age\n+ ssn_date = \"%02d%02d%s\" % (\n+ birthday.day,\n+ birthday.month,\n+ str(birthday.year)[-2:],\n+ )\n+ century = self._get_century_code(birthday.year) # Century\n+ suffix = self.generator.random.randrange(111, 999)\n+ checksum = _checksum(f\"{ssn_date}{century:01d}{suffix:03d}\")\n+ ssn = f\"{ssn_date}-{century:01d}{suffix:03d}{checksum:01d}\"\n+ return ssn\n+\n+ @staticmethod\n+ def _get_century_code(year: int) -> int:\n+ \"\"\"Returns the century code for a given year\"\"\"\n+ if 2000 <= year < 3000:\n+ code = 2\n+ elif 1900 <= year < 2000:\n+ code = 1\n+ elif 1800 <= year < 1900:\n+ code = 0\n+ else:\n+ raise ValueError(\"SSN do not support people born before the year 1800 or after the year 2999\")\n+ return code\n \n-class Provider(BaseProvider):\n \"\"\"\n A Faker provider for the Latvian VAT IDs\n \"\"\"\n", "issue": "Latvian ssn generator is not correct\n* Faker version: 17.0.0\r\n* OS: Windows\r\n\r\n\r\nLatvian ssn generator generates ssn in format 'ddd-dd-dddd', which is not correct. \r\n\r\n### Steps to reproduce\r\n\r\n```\r\nfrom faker import Faker\r\nfake = Faker('lv_LV')\r\nfake.ssn()\r\n\r\n```\r\n\r\n### Expected behavior\r\n\r\nIt should be 12 characters long DDMMYY-CZZZQ e.g. 170334-19862\r\n\r\n### Actual behavior\r\n\r\nfake.ssn() for latvian locale outputs ssn in format 'ddd-dd-dddd'\r\n'604-87-6475'\r\n'824-21-9489'\r\n'411-57-8156'\r\n'938-77-8408'\r\n\n", "before_files": [{"content": "from .. import Provider as BaseProvider\n\n\nclass Provider(BaseProvider):\n \"\"\"\n A Faker provider for the Latvian VAT IDs\n \"\"\"\n\n vat_id_formats = (\"LV###########\",)\n\n def vat_id(self) -> str:\n \"\"\"\n http://ec.europa.eu/taxation_customs/vies/faq.html#item_11\n :return: a random Latvian VAT ID\n \"\"\"\n\n return self.bothify(self.random_element(self.vat_id_formats))\n", "path": "faker/providers/ssn/lv_LV/__init__.py"}], "after_files": [{"content": "import datetime\n\nfrom .. 
import Provider as SsnProvider\n\n\nclass Provider(SsnProvider):\n def ssn(self, min_age: int = 0, max_age: int = 105) -> str:\n \"\"\"\n Returns 11 character Latvian personal identity code (Personas kods).\n This function assigns random age to person.\n\n Personal code consists of eleven characters of the form DDMMYYCZZZQ, where\n DDMMYY is the date of birth, C the century sign, ZZZ the individual\n number and Q the control character (checksum). The number for the\n century is either 0 (1800\u20131899), 1 (1900\u20131999), or 2 (2000\u20132099).\n \"\"\"\n\n def _checksum(ssn_without_checksum):\n weights = [1, 6, 3, 7, 9, 10, 5, 8, 4, 2]\n weighted_sum = sum(int(digit) * weight for digit, weight in zip(ssn_without_checksum, weights))\n reminder = (1 - weighted_sum) % 11\n if reminder == 10:\n return 0\n elif reminder < -1:\n return reminder + 11\n return reminder\n\n age = datetime.timedelta(days=self.generator.random.randrange(min_age * 365, max_age * 365))\n birthday = datetime.date.today() - age\n ssn_date = \"%02d%02d%s\" % (\n birthday.day,\n birthday.month,\n str(birthday.year)[-2:],\n )\n century = self._get_century_code(birthday.year) # Century\n suffix = self.generator.random.randrange(111, 999)\n checksum = _checksum(f\"{ssn_date}{century:01d}{suffix:03d}\")\n ssn = f\"{ssn_date}-{century:01d}{suffix:03d}{checksum:01d}\"\n return ssn\n\n @staticmethod\n def _get_century_code(year: int) -> int:\n \"\"\"Returns the century code for a given year\"\"\"\n if 2000 <= year < 3000:\n code = 2\n elif 1900 <= year < 2000:\n code = 1\n elif 1800 <= year < 1900:\n code = 0\n else:\n raise ValueError(\"SSN do not support people born before the year 1800 or after the year 2999\")\n return code\n\n \"\"\"\n A Faker provider for the Latvian VAT IDs\n \"\"\"\n\n vat_id_formats = (\"LV###########\",)\n\n def vat_id(self) -> str:\n \"\"\"\n http://ec.europa.eu/taxation_customs/vies/faq.html#item_11\n :return: a random Latvian VAT ID\n \"\"\"\n\n return self.bothify(self.random_element(self.vat_id_formats))\n", "path": "faker/providers/ssn/lv_LV/__init__.py"}]} | 592 | 738 |
gh_patches_debug_11013 | rasdani/github-patches | git_diff | holoviz__panel-6293 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
pn.template attribute not found when lazy loading panel
panel==1.3.8
As Panel takes ~2secs to import I would like to lazy load it in one of my packages. This package provides different utility functions. Only some are panel related.
Lazy loading of modules in Python is defined here https://docs.python.org/3/library/importlib.html#implementing-lazy-imports. When I use that I get an `AttributeError` because the attribute `template` is not defined on `panel`. Without lazy loading this works.
## Minimum reproducible example
`lazy_loader.py`
```python
import importlib.util
import sys
def lazy_import(name):
spec = importlib.util.find_spec(name)
loader = importlib.util.LazyLoader(spec.loader)
spec.loader = loader
module = importlib.util.module_from_spec(spec)
sys.modules[name] = module
loader.exec_module(module)
return module
pn = lazy_import("panel")
# import panel as pn
def extension():
return pn.extension()
```
`app.py`
```python
from lazy_loader import extension
import panel as pn
extension()
pn.template.FastListTemplate(title="Hello", main=["world"]).servable()
```
```bash
panel serve app.py --autoreload --index app
```

¨
```bash
AttributeError: module 'panel' has no attribute 'template'
Traceback (most recent call last):
File "/home/jovyan/repos/aw-lib/.venv/lib/python3.11/site-packages/bokeh/application/handlers/code_runner.py", line 229, in run
exec(self._code, module.__dict__)
File "/home/jovyan/repos/aw-lib/app.py", line 6, in <module>
pn.template.FastListTemplate(title="Hello", main=["world"]).servable()
^^^^^^^^^^^
AttributeError: module 'panel' has no attribute 'template'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `panel/__init__.py`
Content:
```
1 """
2 Panel is a high level app and dashboarding framework
3 ====================================================
4
5 Panel is an open-source Python library that lets you create custom
6 interactive web apps and dashboards by connecting user-defined widgets
7 to plots, images, tables, or text.
8
9 Panel works with the tools you know and ❤️.
10
11 Check out https://panel.holoviz.org/
12
13 .. figure:: https://user-images.githubusercontent.com/42288570/152672367-6c239073-0ea0-4a2b-a4c0-817e8090e877.gif
14 :alt: Panel Dashboard
15
16 Panel Dashboard
17
18 How to develop a Panel app in 3 simple steps
19 --------------------------------------------
20
21 - Write the app
22
23 >>> import panel as pn
24 >>> pn.extension(sizing_mode="stretch_width", template="fast")
25 >>> pn.state.template.param.update(title="My Data App")
26 >>> pn.panel(some_python_object).servable()
27
28 - Run your app
29
30 $ panel serve my_script.py --autoreload --show
31
32 or
33
34 $ panel serve my_notebook.ipynb --autoreload --show
35
36 The app will be available in your browser!
37
38 - Change your code and save it
39
40 The app will reload with your changes!
41
42 You can also add automatic reload to jupyterlab. Check out
43 https://blog.holoviz.org/panel_0.12.0.html#JupyterLab-previews
44
45 To learn more about Panel check out
46 https://panel.holoviz.org/getting_started/index.html
47 """
48 from param import rx
49
50 from . import chat # noqa
51 from . import layout # noqa
52 from . import links # noqa
53 from . import pane # noqa
54 from . import param # noqa
55 from . import pipeline # noqa
56 from . import reactive # noqa
57 from . import viewable # noqa
58 from . import widgets # noqa
59 from .config import __version__, config, panel_extension as extension # noqa
60 from .depends import bind, depends # noqa
61 from .interact import interact # noqa
62 from .io import ( # noqa
63 _jupyter_server_extension_paths, cache, ipywidget, serve, state,
64 )
65 from .layout import ( # noqa
66 Accordion, Card, Column, FlexBox, FloatPanel, GridBox, GridSpec, GridStack,
67 HSpacer, Row, Spacer, Swipe, Tabs, VSpacer, WidgetBox,
68 )
69 from .pane import panel # noqa
70 from .param import Param, ReactiveExpr # noqa
71 from .template import Template # noqa
72 from .widgets import indicators, widget # noqa
73
74 __all__ = (
75 "__version__",
76 "Accordion",
77 "Card",
78 "chat",
79 "Column",
80 "FlexBox",
81 "FloatPanel",
82 "GridBox",
83 "GridSpec",
84 "GridStack",
85 "HSpacer",
86 "Param",
87 "ReactiveExpr",
88 "Row",
89 "Spacer",
90 "Tabs",
91 "Template",
92 "VSpacer",
93 "WidgetBox",
94 "bind",
95 "cache",
96 "config",
97 "depends",
98 "extension",
99 "indicators",
100 "interact",
101 "ipywidget",
102 "layout",
103 "links",
104 "pane",
105 "panel",
106 "param",
107 "pipeline",
108 "rx",
109 "serve",
110 "state",
111 "viewable",
112 "widgets",
113 "widget"
114 )
115
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/panel/__init__.py b/panel/__init__.py
--- a/panel/__init__.py
+++ b/panel/__init__.py
@@ -54,6 +54,7 @@
from . import param # noqa
from . import pipeline # noqa
from . import reactive # noqa
+from . import template # noqa
from . import viewable # noqa
from . import widgets # noqa
from .config import __version__, config, panel_extension as extension # noqa
@@ -108,6 +109,7 @@
"rx",
"serve",
"state",
+ "template",
"viewable",
"widgets",
"widget"
| {"golden_diff": "diff --git a/panel/__init__.py b/panel/__init__.py\n--- a/panel/__init__.py\n+++ b/panel/__init__.py\n@@ -54,6 +54,7 @@\n from . import param # noqa\n from . import pipeline # noqa\n from . import reactive # noqa\n+from . import template # noqa\n from . import viewable # noqa\n from . import widgets # noqa\n from .config import __version__, config, panel_extension as extension # noqa\n@@ -108,6 +109,7 @@\n \"rx\",\n \"serve\",\n \"state\",\n+ \"template\",\n \"viewable\",\n \"widgets\",\n \"widget\"\n", "issue": "pn.template attribute not found when lazy loading panel\npanel==1.3.8\r\n\r\nAs Panel takes ~2secs to import I would like to lazy load it in one of my packages. This package provides different utility functions. Only some are panel related.\r\n\r\nLazy loading of modules in Python is defined here https://docs.python.org/3/library/importlib.html#implementing-lazy-imports. When I use that I get an `AttributeError` because the attribute `template` is not defined on `panel`. Without lazy loading this works.\r\n\r\n## Minimum reproducible example\r\n\r\n`lazy_loader.py`\r\n\r\n```python\r\nimport importlib.util\r\nimport sys\r\n\r\ndef lazy_import(name):\r\n spec = importlib.util.find_spec(name)\r\n loader = importlib.util.LazyLoader(spec.loader)\r\n spec.loader = loader\r\n module = importlib.util.module_from_spec(spec)\r\n sys.modules[name] = module\r\n loader.exec_module(module)\r\n return module\r\n\r\npn = lazy_import(\"panel\")\r\n# import panel as pn\r\n\r\ndef extension():\r\n return pn.extension()\r\n```\r\n\r\n`app.py`\r\n\r\n```python\r\nfrom lazy_loader import extension\r\nimport panel as pn\r\n\r\nextension()\r\n\r\npn.template.FastListTemplate(title=\"Hello\", main=[\"world\"]).servable()\r\n```\r\n\r\n```bash\r\npanel serve app.py --autoreload --index app\r\n```\r\n\r\n\r\n\u00a8\r\n\r\n```bash\r\nAttributeError: module 'panel' has no attribute 'template'\r\n\r\nTraceback (most recent call last):\r\n File \"/home/jovyan/repos/aw-lib/.venv/lib/python3.11/site-packages/bokeh/application/handlers/code_runner.py\", line 229, in run\r\n exec(self._code, module.__dict__)\r\n File \"/home/jovyan/repos/aw-lib/app.py\", line 6, in <module>\r\n pn.template.FastListTemplate(title=\"Hello\", main=[\"world\"]).servable()\r\n ^^^^^^^^^^^\r\nAttributeError: module 'panel' has no attribute 'template'\r\n```\n", "before_files": [{"content": "\"\"\"\nPanel is a high level app and dashboarding framework\n====================================================\n\nPanel is an open-source Python library that lets you create custom\ninteractive web apps and dashboards by connecting user-defined widgets\nto plots, images, tables, or text.\n\nPanel works with the tools you know and \u2764\ufe0f.\n\nCheck out https://panel.holoviz.org/\n\n.. 
figure:: https://user-images.githubusercontent.com/42288570/152672367-6c239073-0ea0-4a2b-a4c0-817e8090e877.gif\n :alt: Panel Dashboard\n\n Panel Dashboard\n\nHow to develop a Panel app in 3 simple steps\n--------------------------------------------\n\n- Write the app\n\n>>> import panel as pn\n>>> pn.extension(sizing_mode=\"stretch_width\", template=\"fast\")\n>>> pn.state.template.param.update(title=\"My Data App\")\n>>> pn.panel(some_python_object).servable()\n\n- Run your app\n\n$ panel serve my_script.py --autoreload --show\n\nor\n\n$ panel serve my_notebook.ipynb --autoreload --show\n\nThe app will be available in your browser!\n\n- Change your code and save it\n\nThe app will reload with your changes!\n\nYou can also add automatic reload to jupyterlab. Check out\nhttps://blog.holoviz.org/panel_0.12.0.html#JupyterLab-previews\n\nTo learn more about Panel check out\nhttps://panel.holoviz.org/getting_started/index.html\n\"\"\"\nfrom param import rx\n\nfrom . import chat # noqa\nfrom . import layout # noqa\nfrom . import links # noqa\nfrom . import pane # noqa\nfrom . import param # noqa\nfrom . import pipeline # noqa\nfrom . import reactive # noqa\nfrom . import viewable # noqa\nfrom . import widgets # noqa\nfrom .config import __version__, config, panel_extension as extension # noqa\nfrom .depends import bind, depends # noqa\nfrom .interact import interact # noqa\nfrom .io import ( # noqa\n _jupyter_server_extension_paths, cache, ipywidget, serve, state,\n)\nfrom .layout import ( # noqa\n Accordion, Card, Column, FlexBox, FloatPanel, GridBox, GridSpec, GridStack,\n HSpacer, Row, Spacer, Swipe, Tabs, VSpacer, WidgetBox,\n)\nfrom .pane import panel # noqa\nfrom .param import Param, ReactiveExpr # noqa\nfrom .template import Template # noqa\nfrom .widgets import indicators, widget # noqa\n\n__all__ = (\n \"__version__\",\n \"Accordion\",\n \"Card\",\n \"chat\",\n \"Column\",\n \"FlexBox\",\n \"FloatPanel\",\n \"GridBox\",\n \"GridSpec\",\n \"GridStack\",\n \"HSpacer\",\n \"Param\",\n \"ReactiveExpr\",\n \"Row\",\n \"Spacer\",\n \"Tabs\",\n \"Template\",\n \"VSpacer\",\n \"WidgetBox\",\n \"bind\",\n \"cache\",\n \"config\",\n \"depends\",\n \"extension\",\n \"indicators\",\n \"interact\",\n \"ipywidget\",\n \"layout\",\n \"links\",\n \"pane\",\n \"panel\",\n \"param\",\n \"pipeline\",\n \"rx\",\n \"serve\",\n \"state\",\n \"viewable\",\n \"widgets\",\n \"widget\"\n)\n", "path": "panel/__init__.py"}], "after_files": [{"content": "\"\"\"\nPanel is a high level app and dashboarding framework\n====================================================\n\nPanel is an open-source Python library that lets you create custom\ninteractive web apps and dashboards by connecting user-defined widgets\nto plots, images, tables, or text.\n\nPanel works with the tools you know and \u2764\ufe0f.\n\nCheck out https://panel.holoviz.org/\n\n.. 
figure:: https://user-images.githubusercontent.com/42288570/152672367-6c239073-0ea0-4a2b-a4c0-817e8090e877.gif\n :alt: Panel Dashboard\n\n Panel Dashboard\n\nHow to develop a Panel app in 3 simple steps\n--------------------------------------------\n\n- Write the app\n\n>>> import panel as pn\n>>> pn.extension(sizing_mode=\"stretch_width\", template=\"fast\")\n>>> pn.state.template.param.update(title=\"My Data App\")\n>>> pn.panel(some_python_object).servable()\n\n- Run your app\n\n$ panel serve my_script.py --autoreload --show\n\nor\n\n$ panel serve my_notebook.ipynb --autoreload --show\n\nThe app will be available in your browser!\n\n- Change your code and save it\n\nThe app will reload with your changes!\n\nYou can also add automatic reload to jupyterlab. Check out\nhttps://blog.holoviz.org/panel_0.12.0.html#JupyterLab-previews\n\nTo learn more about Panel check out\nhttps://panel.holoviz.org/getting_started/index.html\n\"\"\"\nfrom param import rx\n\nfrom . import chat # noqa\nfrom . import layout # noqa\nfrom . import links # noqa\nfrom . import pane # noqa\nfrom . import param # noqa\nfrom . import pipeline # noqa\nfrom . import reactive # noqa\nfrom . import template # noqa\nfrom . import viewable # noqa\nfrom . import widgets # noqa\nfrom .config import __version__, config, panel_extension as extension # noqa\nfrom .depends import bind, depends # noqa\nfrom .interact import interact # noqa\nfrom .io import ( # noqa\n _jupyter_server_extension_paths, cache, ipywidget, serve, state,\n)\nfrom .layout import ( # noqa\n Accordion, Card, Column, FlexBox, FloatPanel, GridBox, GridSpec, GridStack,\n HSpacer, Row, Spacer, Swipe, Tabs, VSpacer, WidgetBox,\n)\nfrom .pane import panel # noqa\nfrom .param import Param, ReactiveExpr # noqa\nfrom .template import Template # noqa\nfrom .widgets import indicators, widget # noqa\n\n__all__ = (\n \"__version__\",\n \"Accordion\",\n \"Card\",\n \"chat\",\n \"Column\",\n \"FlexBox\",\n \"FloatPanel\",\n \"GridBox\",\n \"GridSpec\",\n \"GridStack\",\n \"HSpacer\",\n \"Param\",\n \"ReactiveExpr\",\n \"Row\",\n \"Spacer\",\n \"Tabs\",\n \"Template\",\n \"VSpacer\",\n \"WidgetBox\",\n \"bind\",\n \"cache\",\n \"config\",\n \"depends\",\n \"extension\",\n \"indicators\",\n \"interact\",\n \"ipywidget\",\n \"layout\",\n \"links\",\n \"pane\",\n \"panel\",\n \"param\",\n \"pipeline\",\n \"rx\",\n \"serve\",\n \"state\",\n \"template\",\n \"viewable\",\n \"widgets\",\n \"widget\"\n)\n", "path": "panel/__init__.py"}]} | 1,731 | 159 |
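
For the record above: the golden diff adds an explicit `from . import template` (and a matching `__all__` entry) to `panel/__init__.py`. The issue embedded in this record reports that, when Panel is loaded through `importlib.util.LazyLoader`, `pn.template` raises `AttributeError` unless the submodule is imported eagerly at package level. The sketch below is adapted from that reproducer; it assumes the `panel` package is installed and is only meant as an illustration.

```python
# Minimal lazy-import reproduction, adapted from the issue text in the record above.
# Assumes the `panel` package is installed in the environment.
import importlib.util
import sys


def lazy_import(name):
    # Standard-library LazyLoader recipe: the module body only executes on first
    # attribute access instead of at import time.
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)
    return module


pn = lazy_import("panel")

# With the patched __init__.py (which does `from . import template`), this attribute
# resolves once the lazy module executes; the issue reports that it previously raised
# AttributeError: module 'panel' has no attribute 'template'.
print(pn.template)
```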
gh_patches_debug_40115 | rasdani/github-patches | git_diff | Pylons__pyramid-2902 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
add_translation_dirs behaves the wrong way around
This is a follow-up to #1473. See the reproducer and notes there.
I would propose an API change that changes behaviour of add_translation_dirs to append specs at the end of existing specs instead of prepending it. That way, API users simply need to understand "last spec wins". This matches much closer to the mental model of "add".
It matches the current documented behaviour of how the method is to _behave_ in case of multiple calls.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pyramid/config/i18n.py`
Content:
```
1 from pyramid.interfaces import (
2 ILocaleNegotiator,
3 ITranslationDirectories,
4 )
5
6 from pyramid.exceptions import ConfigurationError
7 from pyramid.path import AssetResolver
8 from pyramid.util import action_method
9
10 class I18NConfiguratorMixin(object):
11 @action_method
12 def set_locale_negotiator(self, negotiator):
13 """
14 Set the :term:`locale negotiator` for this application. The
15 :term:`locale negotiator` is a callable which accepts a
16 :term:`request` object and which returns a :term:`locale
17 name`. The ``negotiator`` argument should be the locale
18 negotiator implementation or a :term:`dotted Python name`
19 which refers to such an implementation.
20
21 Later calls to this method override earlier calls; there can
22 be only one locale negotiator active at a time within an
23 application. See :ref:`activating_translation` for more
24 information.
25
26 .. note::
27
28 Using the ``locale_negotiator`` argument to the
29 :class:`pyramid.config.Configurator` constructor can be used to
30 achieve the same purpose.
31 """
32 def register():
33 self._set_locale_negotiator(negotiator)
34 intr = self.introspectable('locale negotiator', None,
35 self.object_description(negotiator),
36 'locale negotiator')
37 intr['negotiator'] = negotiator
38 self.action(ILocaleNegotiator, register, introspectables=(intr,))
39
40 def _set_locale_negotiator(self, negotiator):
41 locale_negotiator = self.maybe_dotted(negotiator)
42 self.registry.registerUtility(locale_negotiator, ILocaleNegotiator)
43
44 @action_method
45 def add_translation_dirs(self, *specs):
46 """ Add one or more :term:`translation directory` paths to the
47 current configuration state. The ``specs`` argument is a
48 sequence that may contain absolute directory paths
49 (e.g. ``/usr/share/locale``) or :term:`asset specification`
50 names naming a directory path (e.g. ``some.package:locale``)
51 or a combination of the two.
52
53 Example:
54
55 .. code-block:: python
56
57 config.add_translation_dirs('/usr/share/locale',
58 'some.package:locale')
59
60 The translation directories are defined as a list in which
61 translations defined later have precedence over translations defined
62 earlier.
63
64 If multiple specs are provided in a single call to
65 ``add_translation_dirs``, the directories will be inserted in the
66 order they're provided (earlier items are trumped by later items).
67
68 .. warning::
69
70 Consecutive calls to ``add_translation_dirs`` will sort the
71 directories such that the later calls will add folders with
72 lower precedence than earlier calls.
73
74 """
75 introspectables = []
76
77 def register():
78 directories = []
79 resolver = AssetResolver(self.package_name)
80
81 # defer spec resolution until register to allow for asset
82 # overrides to take place in an earlier config phase
83 for spec in specs[::-1]: # reversed
84 # the trailing slash helps match asset overrides for folders
85 if not spec.endswith('/'):
86 spec += '/'
87 asset = resolver.resolve(spec)
88 directory = asset.abspath()
89 if not asset.isdir():
90 raise ConfigurationError('"%s" is not a directory' %
91 directory)
92 intr = self.introspectable('translation directories', directory,
93 spec, 'translation directory')
94 intr['directory'] = directory
95 intr['spec'] = spec
96 introspectables.append(intr)
97 directories.append(directory)
98
99 tdirs = self.registry.queryUtility(ITranslationDirectories)
100 if tdirs is None:
101 tdirs = []
102 self.registry.registerUtility(tdirs, ITranslationDirectories)
103 for directory in directories:
104 tdirs.insert(0, directory)
105
106 self.action(None, register, introspectables=introspectables)
107
108
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pyramid/config/i18n.py b/pyramid/config/i18n.py
--- a/pyramid/config/i18n.py
+++ b/pyramid/config/i18n.py
@@ -42,7 +42,7 @@
self.registry.registerUtility(locale_negotiator, ILocaleNegotiator)
@action_method
- def add_translation_dirs(self, *specs):
+ def add_translation_dirs(self, *specs, **kw):
""" Add one or more :term:`translation directory` paths to the
current configuration state. The ``specs`` argument is a
sequence that may contain absolute directory paths
@@ -61,18 +61,27 @@
translations defined later have precedence over translations defined
earlier.
+ By default, consecutive calls to ``add_translation_dirs`` will add
+ directories to the start of the list. This means later calls to
+ ``add_translation_dirs`` will have their translations trumped by
+ earlier calls. If you explicitly need this call to trump an earlier
+ call then you may set ``override`` to ``True``.
+
If multiple specs are provided in a single call to
``add_translation_dirs``, the directories will be inserted in the
order they're provided (earlier items are trumped by later items).
- .. warning::
+ .. versionchanged:: 1.8
- Consecutive calls to ``add_translation_dirs`` will sort the
- directories such that the later calls will add folders with
- lower precedence than earlier calls.
+ The ``override`` parameter was added to allow a later call
+ to ``add_translation_dirs`` to override an earlier call, inserting
+ folders at the beginning of the translation directory list.
"""
introspectables = []
+ override = kw.pop('override', False)
+ if kw:
+ raise TypeError('invalid keyword arguments: %s', sorted(kw.keys()))
def register():
directories = []
@@ -80,7 +89,7 @@
# defer spec resolution until register to allow for asset
# overrides to take place in an earlier config phase
- for spec in specs[::-1]: # reversed
+ for spec in specs:
# the trailing slash helps match asset overrides for folders
if not spec.endswith('/'):
spec += '/'
@@ -100,8 +109,11 @@
if tdirs is None:
tdirs = []
self.registry.registerUtility(tdirs, ITranslationDirectories)
- for directory in directories:
- tdirs.insert(0, directory)
+ if override:
+ tdirs.extend(directories)
+ else:
+ for directory in reversed(directories):
+ tdirs.insert(0, directory)
self.action(None, register, introspectables=introspectables)
| {"golden_diff": "diff --git a/pyramid/config/i18n.py b/pyramid/config/i18n.py\n--- a/pyramid/config/i18n.py\n+++ b/pyramid/config/i18n.py\n@@ -42,7 +42,7 @@\n self.registry.registerUtility(locale_negotiator, ILocaleNegotiator)\n \n @action_method\n- def add_translation_dirs(self, *specs):\n+ def add_translation_dirs(self, *specs, **kw):\n \"\"\" Add one or more :term:`translation directory` paths to the\n current configuration state. The ``specs`` argument is a\n sequence that may contain absolute directory paths\n@@ -61,18 +61,27 @@\n translations defined later have precedence over translations defined\n earlier.\n \n+ By default, consecutive calls to ``add_translation_dirs`` will add\n+ directories to the start of the list. This means later calls to\n+ ``add_translation_dirs`` will have their translations trumped by\n+ earlier calls. If you explicitly need this call to trump an earlier\n+ call then you may set ``override`` to ``True``.\n+\n If multiple specs are provided in a single call to\n ``add_translation_dirs``, the directories will be inserted in the\n order they're provided (earlier items are trumped by later items).\n \n- .. warning::\n+ .. versionchanged:: 1.8\n \n- Consecutive calls to ``add_translation_dirs`` will sort the\n- directories such that the later calls will add folders with\n- lower precedence than earlier calls.\n+ The ``override`` parameter was added to allow a later call\n+ to ``add_translation_dirs`` to override an earlier call, inserting\n+ folders at the beginning of the translation directory list.\n \n \"\"\"\n introspectables = []\n+ override = kw.pop('override', False)\n+ if kw:\n+ raise TypeError('invalid keyword arguments: %s', sorted(kw.keys()))\n \n def register():\n directories = []\n@@ -80,7 +89,7 @@\n \n # defer spec resolution until register to allow for asset\n # overrides to take place in an earlier config phase\n- for spec in specs[::-1]: # reversed\n+ for spec in specs:\n # the trailing slash helps match asset overrides for folders\n if not spec.endswith('/'):\n spec += '/'\n@@ -100,8 +109,11 @@\n if tdirs is None:\n tdirs = []\n self.registry.registerUtility(tdirs, ITranslationDirectories)\n- for directory in directories:\n- tdirs.insert(0, directory)\n+ if override:\n+ tdirs.extend(directories)\n+ else:\n+ for directory in reversed(directories):\n+ tdirs.insert(0, directory)\n \n self.action(None, register, introspectables=introspectables)\n", "issue": "add_translation_dirs behaves the wrong way around\nThis is a follow-up to #1473. See the reproducer and notes there.\n\nI would propose an API change that changes behaviour of add_translation_dirs to append specs at the end of existing specs instead of prepending it. That way, API users simply need to understand \"last spec wins\". This matches much closer to the mental model of \"add\".\n\nIt matches the current documented behaviour of how the method is to _behave_ in case of multiple calls.\n\n", "before_files": [{"content": "from pyramid.interfaces import (\n ILocaleNegotiator,\n ITranslationDirectories,\n )\n\nfrom pyramid.exceptions import ConfigurationError\nfrom pyramid.path import AssetResolver\nfrom pyramid.util import action_method\n\nclass I18NConfiguratorMixin(object):\n @action_method\n def set_locale_negotiator(self, negotiator):\n \"\"\"\n Set the :term:`locale negotiator` for this application. The\n :term:`locale negotiator` is a callable which accepts a\n :term:`request` object and which returns a :term:`locale\n name`. 
The ``negotiator`` argument should be the locale\n negotiator implementation or a :term:`dotted Python name`\n which refers to such an implementation.\n\n Later calls to this method override earlier calls; there can\n be only one locale negotiator active at a time within an\n application. See :ref:`activating_translation` for more\n information.\n\n .. note::\n\n Using the ``locale_negotiator`` argument to the\n :class:`pyramid.config.Configurator` constructor can be used to\n achieve the same purpose.\n \"\"\"\n def register():\n self._set_locale_negotiator(negotiator)\n intr = self.introspectable('locale negotiator', None,\n self.object_description(negotiator),\n 'locale negotiator')\n intr['negotiator'] = negotiator\n self.action(ILocaleNegotiator, register, introspectables=(intr,))\n\n def _set_locale_negotiator(self, negotiator):\n locale_negotiator = self.maybe_dotted(negotiator)\n self.registry.registerUtility(locale_negotiator, ILocaleNegotiator)\n\n @action_method\n def add_translation_dirs(self, *specs):\n \"\"\" Add one or more :term:`translation directory` paths to the\n current configuration state. The ``specs`` argument is a\n sequence that may contain absolute directory paths\n (e.g. ``/usr/share/locale``) or :term:`asset specification`\n names naming a directory path (e.g. ``some.package:locale``)\n or a combination of the two.\n\n Example:\n\n .. code-block:: python\n\n config.add_translation_dirs('/usr/share/locale',\n 'some.package:locale')\n\n The translation directories are defined as a list in which\n translations defined later have precedence over translations defined\n earlier.\n\n If multiple specs are provided in a single call to\n ``add_translation_dirs``, the directories will be inserted in the\n order they're provided (earlier items are trumped by later items).\n\n .. warning::\n\n Consecutive calls to ``add_translation_dirs`` will sort the\n directories such that the later calls will add folders with\n lower precedence than earlier calls.\n\n \"\"\"\n introspectables = []\n\n def register():\n directories = []\n resolver = AssetResolver(self.package_name)\n\n # defer spec resolution until register to allow for asset\n # overrides to take place in an earlier config phase\n for spec in specs[::-1]: # reversed\n # the trailing slash helps match asset overrides for folders\n if not spec.endswith('/'):\n spec += '/'\n asset = resolver.resolve(spec)\n directory = asset.abspath()\n if not asset.isdir():\n raise ConfigurationError('\"%s\" is not a directory' %\n directory)\n intr = self.introspectable('translation directories', directory,\n spec, 'translation directory')\n intr['directory'] = directory\n intr['spec'] = spec\n introspectables.append(intr)\n directories.append(directory)\n\n tdirs = self.registry.queryUtility(ITranslationDirectories)\n if tdirs is None:\n tdirs = []\n self.registry.registerUtility(tdirs, ITranslationDirectories)\n for directory in directories:\n tdirs.insert(0, directory)\n\n self.action(None, register, introspectables=introspectables)\n\n", "path": "pyramid/config/i18n.py"}], "after_files": [{"content": "from pyramid.interfaces import (\n ILocaleNegotiator,\n ITranslationDirectories,\n )\n\nfrom pyramid.exceptions import ConfigurationError\nfrom pyramid.path import AssetResolver\nfrom pyramid.util import action_method\n\nclass I18NConfiguratorMixin(object):\n @action_method\n def set_locale_negotiator(self, negotiator):\n \"\"\"\n Set the :term:`locale negotiator` for this application. 
The\n :term:`locale negotiator` is a callable which accepts a\n :term:`request` object and which returns a :term:`locale\n name`. The ``negotiator`` argument should be the locale\n negotiator implementation or a :term:`dotted Python name`\n which refers to such an implementation.\n\n Later calls to this method override earlier calls; there can\n be only one locale negotiator active at a time within an\n application. See :ref:`activating_translation` for more\n information.\n\n .. note::\n\n Using the ``locale_negotiator`` argument to the\n :class:`pyramid.config.Configurator` constructor can be used to\n achieve the same purpose.\n \"\"\"\n def register():\n self._set_locale_negotiator(negotiator)\n intr = self.introspectable('locale negotiator', None,\n self.object_description(negotiator),\n 'locale negotiator')\n intr['negotiator'] = negotiator\n self.action(ILocaleNegotiator, register, introspectables=(intr,))\n\n def _set_locale_negotiator(self, negotiator):\n locale_negotiator = self.maybe_dotted(negotiator)\n self.registry.registerUtility(locale_negotiator, ILocaleNegotiator)\n\n @action_method\n def add_translation_dirs(self, *specs, **kw):\n \"\"\" Add one or more :term:`translation directory` paths to the\n current configuration state. The ``specs`` argument is a\n sequence that may contain absolute directory paths\n (e.g. ``/usr/share/locale``) or :term:`asset specification`\n names naming a directory path (e.g. ``some.package:locale``)\n or a combination of the two.\n\n Example:\n\n .. code-block:: python\n\n config.add_translation_dirs('/usr/share/locale',\n 'some.package:locale')\n\n The translation directories are defined as a list in which\n translations defined later have precedence over translations defined\n earlier.\n\n By default, consecutive calls to ``add_translation_dirs`` will add\n directories to the start of the list. This means later calls to\n ``add_translation_dirs`` will have their translations trumped by\n earlier calls. If you explicitly need this call to trump an earlier\n call then you may set ``override`` to ``True``.\n\n If multiple specs are provided in a single call to\n ``add_translation_dirs``, the directories will be inserted in the\n order they're provided (earlier items are trumped by later items).\n\n .. 
versionchanged:: 1.8\n\n The ``override`` parameter was added to allow a later call\n to ``add_translation_dirs`` to override an earlier call, inserting\n folders at the beginning of the translation directory list.\n\n \"\"\"\n introspectables = []\n override = kw.pop('override', False)\n if kw:\n raise TypeError('invalid keyword arguments: %s', sorted(kw.keys()))\n\n def register():\n directories = []\n resolver = AssetResolver(self.package_name)\n\n # defer spec resolution until register to allow for asset\n # overrides to take place in an earlier config phase\n for spec in specs:\n # the trailing slash helps match asset overrides for folders\n if not spec.endswith('/'):\n spec += '/'\n asset = resolver.resolve(spec)\n directory = asset.abspath()\n if not asset.isdir():\n raise ConfigurationError('\"%s\" is not a directory' %\n directory)\n intr = self.introspectable('translation directories', directory,\n spec, 'translation directory')\n intr['directory'] = directory\n intr['spec'] = spec\n introspectables.append(intr)\n directories.append(directory)\n\n tdirs = self.registry.queryUtility(ITranslationDirectories)\n if tdirs is None:\n tdirs = []\n self.registry.registerUtility(tdirs, ITranslationDirectories)\n if override:\n tdirs.extend(directories)\n else:\n for directory in reversed(directories):\n tdirs.insert(0, directory)\n\n self.action(None, register, introspectables=introspectables)\n\n", "path": "pyramid/config/i18n.py"}]} | 1,427 | 628 |
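
A short usage sketch of the `override` keyword introduced by the diff in the record above. The package names are placeholders, and the asset specs would have to point at real `locale` directories by the time the configurator commits; the comments spell out the ordering semantics described in the updated docstring.

```python
# Illustrative only: placeholder asset specs, showing the precedence rules from the diff.
from pyramid.config import Configurator

config = Configurator()

# Default behaviour: directories from this earlier call keep precedence over any
# directories added by later calls to add_translation_dirs().
config.add_translation_dirs('base.package:locale')

# With the new keyword, a later call can explicitly trump the earlier registrations.
config.add_translation_dirs('plugin.package:locale', override=True)
```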
gh_patches_debug_43176 | rasdani/github-patches | git_diff | encode__starlette-134 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Support `shutdown` as a synonym for `cleanup`
* Support either `cleanup` or `shutdown` as the ASGI lifespan message name.
* Update uvicorn to move to shutdown - https://github.com/encode/uvicorn/issues/233
* Finally, after a small period of time, drop `cleanup`
Easy PR for a contributor to jump on would be addressing the first part of this, and supporting either name.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `starlette/lifespan.py`
Content:
```
1 import asyncio
2 import logging
3 import traceback
4 import typing
5 from types import TracebackType
6 from starlette.types import ASGIApp, ASGIInstance, Receive, Message, Send
7
8
9 STATE_TRANSITION_ERROR = "Got invalid state transition on lifespan protocol."
10
11
12 class LifespanHandler:
13 def __init__(self) -> None:
14 self.startup_handlers = [] # type: typing.List[typing.Callable]
15 self.cleanup_handlers = [] # type: typing.List[typing.Callable]
16
17 def on_event(self, event_type: str) -> typing.Callable:
18 def decorator(func: typing.Callable) -> typing.Callable:
19 self.add_event_handler(event_type, func)
20 return func
21
22 return decorator
23
24 def add_event_handler(self, event_type: str, func: typing.Callable) -> None:
25 assert event_type in ("startup", "cleanup")
26
27 if event_type == "startup":
28 self.startup_handlers.append(func)
29 else:
30 self.cleanup_handlers.append(func)
31
32 async def run_startup(self) -> None:
33 for handler in self.startup_handlers:
34 if asyncio.iscoroutinefunction(handler):
35 await handler()
36 else:
37 handler()
38
39 async def run_cleanup(self) -> None:
40 for handler in self.cleanup_handlers:
41 if asyncio.iscoroutinefunction(handler):
42 await handler()
43 else:
44 handler()
45
46 def __call__(self, scope: Message) -> ASGIInstance:
47 assert scope["type"] == "lifespan"
48 return self.run_lifespan
49
50 async def run_lifespan(self, receive: Receive, send: Send) -> None:
51 message = await receive()
52 assert message["type"] == "lifespan.startup"
53 await self.run_startup()
54 await send({"type": "lifespan.startup.complete"})
55 message = await receive()
56 assert message["type"] == "lifespan.cleanup"
57 await self.run_cleanup()
58 await send({"type": "lifespan.cleanup.complete"})
59
60
61 class LifespanContext:
62 def __init__(
63 self, app: ASGIApp, startup_timeout: int = 10, cleanup_timeout: int = 10
64 ) -> None:
65 self.startup_timeout = startup_timeout
66 self.cleanup_timeout = cleanup_timeout
67 self.startup_event = asyncio.Event()
68 self.cleanup_event = asyncio.Event()
69 self.receive_queue = asyncio.Queue() # type: asyncio.Queue
70 self.asgi = app({"type": "lifespan"}) # type: ASGIInstance
71
72 def __enter__(self) -> None:
73 loop = asyncio.get_event_loop()
74 loop.create_task(self.run_lifespan())
75 loop.run_until_complete(self.wait_startup())
76
77 def __exit__(
78 self,
79 exc_type: typing.Type[BaseException],
80 exc: BaseException,
81 tb: TracebackType,
82 ) -> None:
83 loop = asyncio.get_event_loop()
84 loop.run_until_complete(self.wait_cleanup())
85
86 async def run_lifespan(self) -> None:
87 try:
88 await self.asgi(self.receive, self.send)
89 finally:
90 self.startup_event.set()
91 self.cleanup_event.set()
92
93 async def send(self, message: Message) -> None:
94 if message["type"] == "lifespan.startup.complete":
95 assert not self.startup_event.is_set(), STATE_TRANSITION_ERROR
96 assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR
97 self.startup_event.set()
98 else:
99 assert message["type"] == "lifespan.cleanup.complete"
100 assert self.startup_event.is_set(), STATE_TRANSITION_ERROR
101 assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR
102 self.cleanup_event.set()
103
104 async def receive(self) -> Message:
105 return await self.receive_queue.get()
106
107 async def wait_startup(self) -> None:
108 await self.receive_queue.put({"type": "lifespan.startup"})
109 await asyncio.wait_for(self.startup_event.wait(), timeout=self.startup_timeout)
110
111 async def wait_cleanup(self) -> None:
112 await self.receive_queue.put({"type": "lifespan.cleanup"})
113 await asyncio.wait_for(self.cleanup_event.wait(), timeout=self.cleanup_timeout)
114
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/starlette/lifespan.py b/starlette/lifespan.py
--- a/starlette/lifespan.py
+++ b/starlette/lifespan.py
@@ -22,7 +22,7 @@
return decorator
def add_event_handler(self, event_type: str, func: typing.Callable) -> None:
- assert event_type in ("startup", "cleanup")
+ assert event_type in ("startup", "shutdown", "cleanup")
if event_type == "startup":
self.startup_handlers.append(func)
@@ -53,19 +53,26 @@
await self.run_startup()
await send({"type": "lifespan.startup.complete"})
message = await receive()
- assert message["type"] == "lifespan.cleanup"
+ assert (
+ message["type"] == "lifespan.shutdown"
+ or message["type"] == "lifespan.cleanup"
+ )
await self.run_cleanup()
- await send({"type": "lifespan.cleanup.complete"})
+ if message["type"] == "lifespan.shutdown":
+ await send({"type": "lifespan.shutdown.complete"})
+
+ if message["type"] == "lifespan.cleanup":
+ await send({"type": "lifespan.cleanup.complete"}) # pragma: no cover
class LifespanContext:
def __init__(
- self, app: ASGIApp, startup_timeout: int = 10, cleanup_timeout: int = 10
+ self, app: ASGIApp, startup_timeout: int = 10, shutdown_timeout: int = 10
) -> None:
self.startup_timeout = startup_timeout
- self.cleanup_timeout = cleanup_timeout
+ self.shutdown_timeout = shutdown_timeout
self.startup_event = asyncio.Event()
- self.cleanup_event = asyncio.Event()
+ self.shutdown_event = asyncio.Event()
self.receive_queue = asyncio.Queue() # type: asyncio.Queue
self.asgi = app({"type": "lifespan"}) # type: ASGIInstance
@@ -81,25 +88,25 @@
tb: TracebackType,
) -> None:
loop = asyncio.get_event_loop()
- loop.run_until_complete(self.wait_cleanup())
+ loop.run_until_complete(self.wait_shutdown())
async def run_lifespan(self) -> None:
try:
await self.asgi(self.receive, self.send)
finally:
self.startup_event.set()
- self.cleanup_event.set()
+ self.shutdown_event.set()
async def send(self, message: Message) -> None:
if message["type"] == "lifespan.startup.complete":
assert not self.startup_event.is_set(), STATE_TRANSITION_ERROR
- assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR
+ assert not self.shutdown_event.is_set(), STATE_TRANSITION_ERROR
self.startup_event.set()
else:
- assert message["type"] == "lifespan.cleanup.complete"
+ assert message["type"] == "lifespan.shutdown.complete"
assert self.startup_event.is_set(), STATE_TRANSITION_ERROR
- assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR
- self.cleanup_event.set()
+ assert not self.shutdown_event.is_set(), STATE_TRANSITION_ERROR
+ self.shutdown_event.set()
async def receive(self) -> Message:
return await self.receive_queue.get()
@@ -108,6 +115,8 @@
await self.receive_queue.put({"type": "lifespan.startup"})
await asyncio.wait_for(self.startup_event.wait(), timeout=self.startup_timeout)
- async def wait_cleanup(self) -> None:
- await self.receive_queue.put({"type": "lifespan.cleanup"})
- await asyncio.wait_for(self.cleanup_event.wait(), timeout=self.cleanup_timeout)
+ async def wait_shutdown(self) -> None:
+ await self.receive_queue.put({"type": "lifespan.shutdown"})
+ await asyncio.wait_for(
+ self.shutdown_event.wait(), timeout=self.shutdown_timeout
+ )
| {"golden_diff": "diff --git a/starlette/lifespan.py b/starlette/lifespan.py\n--- a/starlette/lifespan.py\n+++ b/starlette/lifespan.py\n@@ -22,7 +22,7 @@\n return decorator\n \n def add_event_handler(self, event_type: str, func: typing.Callable) -> None:\n- assert event_type in (\"startup\", \"cleanup\")\n+ assert event_type in (\"startup\", \"shutdown\", \"cleanup\")\n \n if event_type == \"startup\":\n self.startup_handlers.append(func)\n@@ -53,19 +53,26 @@\n await self.run_startup()\n await send({\"type\": \"lifespan.startup.complete\"})\n message = await receive()\n- assert message[\"type\"] == \"lifespan.cleanup\"\n+ assert (\n+ message[\"type\"] == \"lifespan.shutdown\"\n+ or message[\"type\"] == \"lifespan.cleanup\"\n+ )\n await self.run_cleanup()\n- await send({\"type\": \"lifespan.cleanup.complete\"})\n+ if message[\"type\"] == \"lifespan.shutdown\":\n+ await send({\"type\": \"lifespan.shutdown.complete\"})\n+\n+ if message[\"type\"] == \"lifespan.cleanup\":\n+ await send({\"type\": \"lifespan.cleanup.complete\"}) # pragma: no cover\n \n \n class LifespanContext:\n def __init__(\n- self, app: ASGIApp, startup_timeout: int = 10, cleanup_timeout: int = 10\n+ self, app: ASGIApp, startup_timeout: int = 10, shutdown_timeout: int = 10\n ) -> None:\n self.startup_timeout = startup_timeout\n- self.cleanup_timeout = cleanup_timeout\n+ self.shutdown_timeout = shutdown_timeout\n self.startup_event = asyncio.Event()\n- self.cleanup_event = asyncio.Event()\n+ self.shutdown_event = asyncio.Event()\n self.receive_queue = asyncio.Queue() # type: asyncio.Queue\n self.asgi = app({\"type\": \"lifespan\"}) # type: ASGIInstance\n \n@@ -81,25 +88,25 @@\n tb: TracebackType,\n ) -> None:\n loop = asyncio.get_event_loop()\n- loop.run_until_complete(self.wait_cleanup())\n+ loop.run_until_complete(self.wait_shutdown())\n \n async def run_lifespan(self) -> None:\n try:\n await self.asgi(self.receive, self.send)\n finally:\n self.startup_event.set()\n- self.cleanup_event.set()\n+ self.shutdown_event.set()\n \n async def send(self, message: Message) -> None:\n if message[\"type\"] == \"lifespan.startup.complete\":\n assert not self.startup_event.is_set(), STATE_TRANSITION_ERROR\n- assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR\n+ assert not self.shutdown_event.is_set(), STATE_TRANSITION_ERROR\n self.startup_event.set()\n else:\n- assert message[\"type\"] == \"lifespan.cleanup.complete\"\n+ assert message[\"type\"] == \"lifespan.shutdown.complete\"\n assert self.startup_event.is_set(), STATE_TRANSITION_ERROR\n- assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR\n- self.cleanup_event.set()\n+ assert not self.shutdown_event.is_set(), STATE_TRANSITION_ERROR\n+ self.shutdown_event.set()\n \n async def receive(self) -> Message:\n return await self.receive_queue.get()\n@@ -108,6 +115,8 @@\n await self.receive_queue.put({\"type\": \"lifespan.startup\"})\n await asyncio.wait_for(self.startup_event.wait(), timeout=self.startup_timeout)\n \n- async def wait_cleanup(self) -> None:\n- await self.receive_queue.put({\"type\": \"lifespan.cleanup\"})\n- await asyncio.wait_for(self.cleanup_event.wait(), timeout=self.cleanup_timeout)\n+ async def wait_shutdown(self) -> None:\n+ await self.receive_queue.put({\"type\": \"lifespan.shutdown\"})\n+ await asyncio.wait_for(\n+ self.shutdown_event.wait(), timeout=self.shutdown_timeout\n+ )\n", "issue": "Support `shutdown` as a synonym for `cleanup`\n* Support either `cleanup` or `shutdown` as the ASGI lifespan message name.\r\n* Update uvicorn to move to 
shutdown - https://github.com/encode/uvicorn/issues/233\r\n* Finally, after a small period of time, drop `cleanup`\r\n\r\nEasy PR for a contributor to jump on would be addressing the first part of this, and supporting either name.\n", "before_files": [{"content": "import asyncio\nimport logging\nimport traceback\nimport typing\nfrom types import TracebackType\nfrom starlette.types import ASGIApp, ASGIInstance, Receive, Message, Send\n\n\nSTATE_TRANSITION_ERROR = \"Got invalid state transition on lifespan protocol.\"\n\n\nclass LifespanHandler:\n def __init__(self) -> None:\n self.startup_handlers = [] # type: typing.List[typing.Callable]\n self.cleanup_handlers = [] # type: typing.List[typing.Callable]\n\n def on_event(self, event_type: str) -> typing.Callable:\n def decorator(func: typing.Callable) -> typing.Callable:\n self.add_event_handler(event_type, func)\n return func\n\n return decorator\n\n def add_event_handler(self, event_type: str, func: typing.Callable) -> None:\n assert event_type in (\"startup\", \"cleanup\")\n\n if event_type == \"startup\":\n self.startup_handlers.append(func)\n else:\n self.cleanup_handlers.append(func)\n\n async def run_startup(self) -> None:\n for handler in self.startup_handlers:\n if asyncio.iscoroutinefunction(handler):\n await handler()\n else:\n handler()\n\n async def run_cleanup(self) -> None:\n for handler in self.cleanup_handlers:\n if asyncio.iscoroutinefunction(handler):\n await handler()\n else:\n handler()\n\n def __call__(self, scope: Message) -> ASGIInstance:\n assert scope[\"type\"] == \"lifespan\"\n return self.run_lifespan\n\n async def run_lifespan(self, receive: Receive, send: Send) -> None:\n message = await receive()\n assert message[\"type\"] == \"lifespan.startup\"\n await self.run_startup()\n await send({\"type\": \"lifespan.startup.complete\"})\n message = await receive()\n assert message[\"type\"] == \"lifespan.cleanup\"\n await self.run_cleanup()\n await send({\"type\": \"lifespan.cleanup.complete\"})\n\n\nclass LifespanContext:\n def __init__(\n self, app: ASGIApp, startup_timeout: int = 10, cleanup_timeout: int = 10\n ) -> None:\n self.startup_timeout = startup_timeout\n self.cleanup_timeout = cleanup_timeout\n self.startup_event = asyncio.Event()\n self.cleanup_event = asyncio.Event()\n self.receive_queue = asyncio.Queue() # type: asyncio.Queue\n self.asgi = app({\"type\": \"lifespan\"}) # type: ASGIInstance\n\n def __enter__(self) -> None:\n loop = asyncio.get_event_loop()\n loop.create_task(self.run_lifespan())\n loop.run_until_complete(self.wait_startup())\n\n def __exit__(\n self,\n exc_type: typing.Type[BaseException],\n exc: BaseException,\n tb: TracebackType,\n ) -> None:\n loop = asyncio.get_event_loop()\n loop.run_until_complete(self.wait_cleanup())\n\n async def run_lifespan(self) -> None:\n try:\n await self.asgi(self.receive, self.send)\n finally:\n self.startup_event.set()\n self.cleanup_event.set()\n\n async def send(self, message: Message) -> None:\n if message[\"type\"] == \"lifespan.startup.complete\":\n assert not self.startup_event.is_set(), STATE_TRANSITION_ERROR\n assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR\n self.startup_event.set()\n else:\n assert message[\"type\"] == \"lifespan.cleanup.complete\"\n assert self.startup_event.is_set(), STATE_TRANSITION_ERROR\n assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR\n self.cleanup_event.set()\n\n async def receive(self) -> Message:\n return await self.receive_queue.get()\n\n async def wait_startup(self) -> None:\n await 
self.receive_queue.put({\"type\": \"lifespan.startup\"})\n await asyncio.wait_for(self.startup_event.wait(), timeout=self.startup_timeout)\n\n async def wait_cleanup(self) -> None:\n await self.receive_queue.put({\"type\": \"lifespan.cleanup\"})\n await asyncio.wait_for(self.cleanup_event.wait(), timeout=self.cleanup_timeout)\n", "path": "starlette/lifespan.py"}], "after_files": [{"content": "import asyncio\nimport logging\nimport traceback\nimport typing\nfrom types import TracebackType\nfrom starlette.types import ASGIApp, ASGIInstance, Receive, Message, Send\n\n\nSTATE_TRANSITION_ERROR = \"Got invalid state transition on lifespan protocol.\"\n\n\nclass LifespanHandler:\n def __init__(self) -> None:\n self.startup_handlers = [] # type: typing.List[typing.Callable]\n self.cleanup_handlers = [] # type: typing.List[typing.Callable]\n\n def on_event(self, event_type: str) -> typing.Callable:\n def decorator(func: typing.Callable) -> typing.Callable:\n self.add_event_handler(event_type, func)\n return func\n\n return decorator\n\n def add_event_handler(self, event_type: str, func: typing.Callable) -> None:\n assert event_type in (\"startup\", \"shutdown\", \"cleanup\")\n\n if event_type == \"startup\":\n self.startup_handlers.append(func)\n else:\n self.cleanup_handlers.append(func)\n\n async def run_startup(self) -> None:\n for handler in self.startup_handlers:\n if asyncio.iscoroutinefunction(handler):\n await handler()\n else:\n handler()\n\n async def run_cleanup(self) -> None:\n for handler in self.cleanup_handlers:\n if asyncio.iscoroutinefunction(handler):\n await handler()\n else:\n handler()\n\n def __call__(self, scope: Message) -> ASGIInstance:\n assert scope[\"type\"] == \"lifespan\"\n return self.run_lifespan\n\n async def run_lifespan(self, receive: Receive, send: Send) -> None:\n message = await receive()\n assert message[\"type\"] == \"lifespan.startup\"\n await self.run_startup()\n await send({\"type\": \"lifespan.startup.complete\"})\n message = await receive()\n assert (\n message[\"type\"] == \"lifespan.shutdown\"\n or message[\"type\"] == \"lifespan.cleanup\"\n )\n await self.run_cleanup()\n if message[\"type\"] == \"lifespan.shutdown\":\n await send({\"type\": \"lifespan.shutdown.complete\"})\n\n if message[\"type\"] == \"lifespan.cleanup\":\n await send({\"type\": \"lifespan.cleanup.complete\"}) # pragma: no cover\n\n\nclass LifespanContext:\n def __init__(\n self, app: ASGIApp, startup_timeout: int = 10, shutdown_timeout: int = 10\n ) -> None:\n self.startup_timeout = startup_timeout\n self.shutdown_timeout = shutdown_timeout\n self.startup_event = asyncio.Event()\n self.shutdown_event = asyncio.Event()\n self.receive_queue = asyncio.Queue() # type: asyncio.Queue\n self.asgi = app({\"type\": \"lifespan\"}) # type: ASGIInstance\n\n def __enter__(self) -> None:\n loop = asyncio.get_event_loop()\n loop.create_task(self.run_lifespan())\n loop.run_until_complete(self.wait_startup())\n\n def __exit__(\n self,\n exc_type: typing.Type[BaseException],\n exc: BaseException,\n tb: TracebackType,\n ) -> None:\n loop = asyncio.get_event_loop()\n loop.run_until_complete(self.wait_shutdown())\n\n async def run_lifespan(self) -> None:\n try:\n await self.asgi(self.receive, self.send)\n finally:\n self.startup_event.set()\n self.shutdown_event.set()\n\n async def send(self, message: Message) -> None:\n if message[\"type\"] == \"lifespan.startup.complete\":\n assert not self.startup_event.is_set(), STATE_TRANSITION_ERROR\n assert not self.shutdown_event.is_set(), 
STATE_TRANSITION_ERROR\n self.startup_event.set()\n else:\n assert message[\"type\"] == \"lifespan.shutdown.complete\"\n assert self.startup_event.is_set(), STATE_TRANSITION_ERROR\n assert not self.shutdown_event.is_set(), STATE_TRANSITION_ERROR\n self.shutdown_event.set()\n\n async def receive(self) -> Message:\n return await self.receive_queue.get()\n\n async def wait_startup(self) -> None:\n await self.receive_queue.put({\"type\": \"lifespan.startup\"})\n await asyncio.wait_for(self.startup_event.wait(), timeout=self.startup_timeout)\n\n async def wait_shutdown(self) -> None:\n await self.receive_queue.put({\"type\": \"lifespan.shutdown\"})\n await asyncio.wait_for(\n self.shutdown_event.wait(), timeout=self.shutdown_timeout\n )\n", "path": "starlette/lifespan.py"}]} | 1,484 | 888 |
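
A small sketch against the patched `starlette/lifespan.py` from the record above, showing that event handlers can now be registered under the new `shutdown` name as well as the legacy `cleanup` name (the handler bodies are placeholders).

```python
# Sketch only: handler bodies are stand-ins for real startup/shutdown work.
from starlette.lifespan import LifespanHandler

handler = LifespanHandler()


@handler.on_event("startup")
async def open_resources():
    # e.g. open database connections, warm caches, ...
    ...


@handler.on_event("shutdown")  # accepted after the patch; "cleanup" still works too
async def close_resources():
    # e.g. close connections, flush buffers, ...
    ...
```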
gh_patches_debug_32409 | rasdani/github-patches | git_diff | plone__Products.CMFPlone-1044 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
HTML filtering requires a "re save" on each page for the settings to take effect.
Using: Plone 5.0b4.dev0 (5007)
Been testing the HTML filtering settings using iFrame video.
When I add an "iframe" video the saved page does not show the video (as expected). But when I toggle ON the "Disable HTML Filtering" checkbox the video still does not display on the page (even after emptying cache). If I edit the video page and immediately "Save" (making no edits) the video displays as expected. The reverse is also true. If I have video correctly displaying on the page (with html filtering disabled) and uncheck "Disable HTML Filtering" checkbox in the HTML Filtering panel, go back to the video page and reload the video still displays. If I edit the page and immediately save the page the video properly does not display. Is this the expected behavior?
In addition. If I add "iFrame" to the "Custom tags" dialogue box I cannot get the video to display with "Disable HTML Filtering" checkbox OFF. Even if I do the "Edit" and "Save" trick from above.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `Products/CMFPlone/controlpanel/browser/filter.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 from Products.CMFCore.utils import getToolByName
3 from Products.CMFPlone import PloneMessageFactory as _ # NOQA
4 from Products.CMFPlone.interfaces import IFilterSchema
5 from Products.Five.browser.pagetemplatefile import ViewPageTemplateFile
6 from Products.statusmessages.interfaces import IStatusMessage
7 from plone.autoform.form import AutoExtensibleForm
8 from plone.z3cform import layout
9 from z3c.form import button
10 from z3c.form import form
11 from Products.PortalTransforms.transforms.safe_html import VALID_TAGS
12 from Products.PortalTransforms.transforms.safe_html import NASTY_TAGS
13
14
15 class FilterControlPanel(AutoExtensibleForm, form.EditForm):
16 id = "FilterControlPanel"
17 label = _(u"HTML Filtering Settings")
18 description = _(
19 'description_html_filtering',
20 default=u"HTML generation is heavily cached across Plone. "
21 u"After changing settings here, you may have to edit "
22 u"existing content to see the changes in these filter settings "
23 u"or restart your server.")
24 schema = IFilterSchema
25 form_name = _(u"HTML Filtering Settings")
26 control_panel_view = "filter-controlpanel"
27
28 def updateActions(self): # NOQA
29 """Have to override this because we only have Save, not Cancel
30 """
31 super(FilterControlPanel, self).updateActions()
32 self.actions['save'].addClass("context")
33
34 @button.buttonAndHandler(_(u"Save"), name='save')
35 def handleSave(self, action): # NOQA
36 data, errors = self.extractData()
37 if errors:
38 self.status = self.formErrorsMessage
39 return
40
41 # Save in portal tools
42 safe_html = getattr(
43 getToolByName(self.context, 'portal_transforms'),
44 'safe_html',
45 None)
46
47 nasty_tags = data['nasty_tags']
48 custom_tags = data['custom_tags']
49 stripped_tags = data['stripped_tags']
50
51 valid = safe_html._config['valid_tags']
52
53 # start with removing tags that do not belong in valid tags
54 for value in nasty_tags + stripped_tags:
55 if value in valid:
56 del valid[value]
57 # add in custom tags
58 for custom in custom_tags:
59 if value not in valid:
60 valid[custom] = 1
61 # then, check if something was previously prevented but is no longer
62 for tag in set(VALID_TAGS.keys()) - set(valid.keys()):
63 if tag not in nasty_tags and tag not in stripped_tags:
64 valid[tag] = VALID_TAGS[tag]
65
66 # nasty tags are simple, just set the value here
67 nasty_value = {}
68 for tag in nasty_tags:
69 nasty_value[tag] = NASTY_TAGS.get(tag, VALID_TAGS.get(tag, 1))
70 safe_html._config['nasty_tags'] = nasty_value
71
72 disable_filtering = int(data['disable_filtering'])
73 if disable_filtering != safe_html._config['disable_transform']:
74 safe_html._config['disable_transform'] = disable_filtering
75
76 for attr in ('stripped_combinations', 'class_blacklist', 'stripped_attributes',
77 'style_whitelist'):
78 value = data[attr]
79 if value is None:
80 if attr == 'stripped_combinations':
81 value = {}
82 else:
83 value = []
84 if value != safe_html._config[attr]:
85 safe_html._config[attr] = value
86
87 # always reload the transform
88 safe_html._p_changed = True
89 safe_html.reload()
90
91 self.applyChanges(data)
92 IStatusMessage(self.request).addStatusMessage(
93 _(u"Changes saved."),
94 "info")
95 self.request.response.redirect(self.request.getURL())
96
97
98 class ControlPanelFormWrapper(layout.FormWrapper):
99 """Use this form as the plone.z3cform layout wrapper to get the control
100 panel layout.
101 """
102 index = ViewPageTemplateFile('filter_controlpanel.pt')
103
104
105 FilterControlPanelView = layout.wrap_form(
106 FilterControlPanel, ControlPanelFormWrapper)
107
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/Products/CMFPlone/controlpanel/browser/filter.py b/Products/CMFPlone/controlpanel/browser/filter.py
--- a/Products/CMFPlone/controlpanel/browser/filter.py
+++ b/Products/CMFPlone/controlpanel/browser/filter.py
@@ -15,12 +15,7 @@
class FilterControlPanel(AutoExtensibleForm, form.EditForm):
id = "FilterControlPanel"
label = _(u"HTML Filtering Settings")
- description = _(
- 'description_html_filtering',
- default=u"HTML generation is heavily cached across Plone. "
- u"After changing settings here, you may have to edit "
- u"existing content to see the changes in these filter settings "
- u"or restart your server.")
+ description = ""
schema = IFilterSchema
form_name = _(u"HTML Filtering Settings")
control_panel_view = "filter-controlpanel"
@@ -73,8 +68,8 @@
if disable_filtering != safe_html._config['disable_transform']:
safe_html._config['disable_transform'] = disable_filtering
- for attr in ('stripped_combinations', 'class_blacklist', 'stripped_attributes',
- 'style_whitelist'):
+ for attr in ('stripped_combinations', 'class_blacklist',
+ 'stripped_attributes', 'style_whitelist'):
value = data[attr]
if value is None:
if attr == 'stripped_combinations':
@@ -92,6 +87,11 @@
IStatusMessage(self.request).addStatusMessage(
_(u"Changes saved."),
"info")
+ IStatusMessage(self.request).addStatusMessage(
+ _(u"HTML generation is heavily cached across Plone. You may "
+ u"have to edit existing content or restart your server to see "
+ u"the changes."),
+ "warning")
self.request.response.redirect(self.request.getURL())
| {"golden_diff": "diff --git a/Products/CMFPlone/controlpanel/browser/filter.py b/Products/CMFPlone/controlpanel/browser/filter.py\n--- a/Products/CMFPlone/controlpanel/browser/filter.py\n+++ b/Products/CMFPlone/controlpanel/browser/filter.py\n@@ -15,12 +15,7 @@\n class FilterControlPanel(AutoExtensibleForm, form.EditForm):\n id = \"FilterControlPanel\"\n label = _(u\"HTML Filtering Settings\")\n- description = _(\n- 'description_html_filtering',\n- default=u\"HTML generation is heavily cached across Plone. \"\n- u\"After changing settings here, you may have to edit \"\n- u\"existing content to see the changes in these filter settings \"\n- u\"or restart your server.\")\n+ description = \"\"\n schema = IFilterSchema\n form_name = _(u\"HTML Filtering Settings\")\n control_panel_view = \"filter-controlpanel\"\n@@ -73,8 +68,8 @@\n if disable_filtering != safe_html._config['disable_transform']:\n safe_html._config['disable_transform'] = disable_filtering\n \n- for attr in ('stripped_combinations', 'class_blacklist', 'stripped_attributes',\n- 'style_whitelist'):\n+ for attr in ('stripped_combinations', 'class_blacklist',\n+ 'stripped_attributes', 'style_whitelist'):\n value = data[attr]\n if value is None:\n if attr == 'stripped_combinations':\n@@ -92,6 +87,11 @@\n IStatusMessage(self.request).addStatusMessage(\n _(u\"Changes saved.\"),\n \"info\")\n+ IStatusMessage(self.request).addStatusMessage(\n+ _(u\"HTML generation is heavily cached across Plone. You may \"\n+ u\"have to edit existing content or restart your server to see \"\n+ u\"the changes.\"),\n+ \"warning\")\n self.request.response.redirect(self.request.getURL())\n", "issue": "HTML filtering requires a \"re save\" on each page for the settings to take effect. \nUsing: Plone 5.0b4.dev0 (5007)\n\nBeen testing the HTML filtering settings using iFrame video. \nWhen I add an \"iframe\" video the saved page does not show the video (as expected). But when I toggle ON the \"Disable HTML Filtering\" checkbox the video still does not display on the page (even after emptying cache). If I edit the video page and immediately \"Save\" (making no edits) the video displays as expected. The reverse is also true. If I have video correctly displaying on the page (with html filtering disabled) and uncheck \"Disable HTML Filtering\" checkbox in the HTML Filtering panel, go back to the video page and reload the video still displays. If I edit the page and immediately save the page the video properly does not display. Is this the expected behavior?\n\nIn addition. If I add \"iFrame\" to the \"Custom tags\" dialogue box I cannot get the video to display with \"Disable HTML Filtering\" checkbox OFF. 
Even if I do the \"Edit\" and \"Save\" trick from above.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nfrom Products.CMFCore.utils import getToolByName\nfrom Products.CMFPlone import PloneMessageFactory as _ # NOQA\nfrom Products.CMFPlone.interfaces import IFilterSchema\nfrom Products.Five.browser.pagetemplatefile import ViewPageTemplateFile\nfrom Products.statusmessages.interfaces import IStatusMessage\nfrom plone.autoform.form import AutoExtensibleForm\nfrom plone.z3cform import layout\nfrom z3c.form import button\nfrom z3c.form import form\nfrom Products.PortalTransforms.transforms.safe_html import VALID_TAGS\nfrom Products.PortalTransforms.transforms.safe_html import NASTY_TAGS\n\n\nclass FilterControlPanel(AutoExtensibleForm, form.EditForm):\n id = \"FilterControlPanel\"\n label = _(u\"HTML Filtering Settings\")\n description = _(\n 'description_html_filtering',\n default=u\"HTML generation is heavily cached across Plone. \"\n u\"After changing settings here, you may have to edit \"\n u\"existing content to see the changes in these filter settings \"\n u\"or restart your server.\")\n schema = IFilterSchema\n form_name = _(u\"HTML Filtering Settings\")\n control_panel_view = \"filter-controlpanel\"\n\n def updateActions(self): # NOQA\n \"\"\"Have to override this because we only have Save, not Cancel\n \"\"\"\n super(FilterControlPanel, self).updateActions()\n self.actions['save'].addClass(\"context\")\n\n @button.buttonAndHandler(_(u\"Save\"), name='save')\n def handleSave(self, action): # NOQA\n data, errors = self.extractData()\n if errors:\n self.status = self.formErrorsMessage\n return\n\n # Save in portal tools\n safe_html = getattr(\n getToolByName(self.context, 'portal_transforms'),\n 'safe_html',\n None)\n\n nasty_tags = data['nasty_tags']\n custom_tags = data['custom_tags']\n stripped_tags = data['stripped_tags']\n\n valid = safe_html._config['valid_tags']\n\n # start with removing tags that do not belong in valid tags\n for value in nasty_tags + stripped_tags:\n if value in valid:\n del valid[value]\n # add in custom tags\n for custom in custom_tags:\n if value not in valid:\n valid[custom] = 1\n # then, check if something was previously prevented but is no longer\n for tag in set(VALID_TAGS.keys()) - set(valid.keys()):\n if tag not in nasty_tags and tag not in stripped_tags:\n valid[tag] = VALID_TAGS[tag]\n\n # nasty tags are simple, just set the value here\n nasty_value = {}\n for tag in nasty_tags:\n nasty_value[tag] = NASTY_TAGS.get(tag, VALID_TAGS.get(tag, 1))\n safe_html._config['nasty_tags'] = nasty_value\n\n disable_filtering = int(data['disable_filtering'])\n if disable_filtering != safe_html._config['disable_transform']:\n safe_html._config['disable_transform'] = disable_filtering\n\n for attr in ('stripped_combinations', 'class_blacklist', 'stripped_attributes',\n 'style_whitelist'):\n value = data[attr]\n if value is None:\n if attr == 'stripped_combinations':\n value = {}\n else:\n value = []\n if value != safe_html._config[attr]:\n safe_html._config[attr] = value\n\n # always reload the transform\n safe_html._p_changed = True\n safe_html.reload()\n\n self.applyChanges(data)\n IStatusMessage(self.request).addStatusMessage(\n _(u\"Changes saved.\"),\n \"info\")\n self.request.response.redirect(self.request.getURL())\n\n\nclass ControlPanelFormWrapper(layout.FormWrapper):\n \"\"\"Use this form as the plone.z3cform layout wrapper to get the control\n panel layout.\n \"\"\"\n index = 
ViewPageTemplateFile('filter_controlpanel.pt')\n\n\nFilterControlPanelView = layout.wrap_form(\n FilterControlPanel, ControlPanelFormWrapper)\n", "path": "Products/CMFPlone/controlpanel/browser/filter.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nfrom Products.CMFCore.utils import getToolByName\nfrom Products.CMFPlone import PloneMessageFactory as _ # NOQA\nfrom Products.CMFPlone.interfaces import IFilterSchema\nfrom Products.Five.browser.pagetemplatefile import ViewPageTemplateFile\nfrom Products.statusmessages.interfaces import IStatusMessage\nfrom plone.autoform.form import AutoExtensibleForm\nfrom plone.z3cform import layout\nfrom z3c.form import button\nfrom z3c.form import form\nfrom Products.PortalTransforms.transforms.safe_html import VALID_TAGS\nfrom Products.PortalTransforms.transforms.safe_html import NASTY_TAGS\n\n\nclass FilterControlPanel(AutoExtensibleForm, form.EditForm):\n id = \"FilterControlPanel\"\n label = _(u\"HTML Filtering Settings\")\n description = \"\"\n schema = IFilterSchema\n form_name = _(u\"HTML Filtering Settings\")\n control_panel_view = \"filter-controlpanel\"\n\n def updateActions(self): # NOQA\n \"\"\"Have to override this because we only have Save, not Cancel\n \"\"\"\n super(FilterControlPanel, self).updateActions()\n self.actions['save'].addClass(\"context\")\n\n @button.buttonAndHandler(_(u\"Save\"), name='save')\n def handleSave(self, action): # NOQA\n data, errors = self.extractData()\n if errors:\n self.status = self.formErrorsMessage\n return\n\n # Save in portal tools\n safe_html = getattr(\n getToolByName(self.context, 'portal_transforms'),\n 'safe_html',\n None)\n\n nasty_tags = data['nasty_tags']\n custom_tags = data['custom_tags']\n stripped_tags = data['stripped_tags']\n\n valid = safe_html._config['valid_tags']\n\n # start with removing tags that do not belong in valid tags\n for value in nasty_tags + stripped_tags:\n if value in valid:\n del valid[value]\n # add in custom tags\n for custom in custom_tags:\n if value not in valid:\n valid[custom] = 1\n # then, check if something was previously prevented but is no longer\n for tag in set(VALID_TAGS.keys()) - set(valid.keys()):\n if tag not in nasty_tags and tag not in stripped_tags:\n valid[tag] = VALID_TAGS[tag]\n\n # nasty tags are simple, just set the value here\n nasty_value = {}\n for tag in nasty_tags:\n nasty_value[tag] = NASTY_TAGS.get(tag, VALID_TAGS.get(tag, 1))\n safe_html._config['nasty_tags'] = nasty_value\n\n disable_filtering = int(data['disable_filtering'])\n if disable_filtering != safe_html._config['disable_transform']:\n safe_html._config['disable_transform'] = disable_filtering\n\n for attr in ('stripped_combinations', 'class_blacklist',\n 'stripped_attributes', 'style_whitelist'):\n value = data[attr]\n if value is None:\n if attr == 'stripped_combinations':\n value = {}\n else:\n value = []\n if value != safe_html._config[attr]:\n safe_html._config[attr] = value\n\n # always reload the transform\n safe_html._p_changed = True\n safe_html.reload()\n\n self.applyChanges(data)\n IStatusMessage(self.request).addStatusMessage(\n _(u\"Changes saved.\"),\n \"info\")\n IStatusMessage(self.request).addStatusMessage(\n _(u\"HTML generation is heavily cached across Plone. 
You may \"\n u\"have to edit existing content or restart your server to see \"\n u\"the changes.\"),\n \"warning\")\n self.request.response.redirect(self.request.getURL())\n\n\nclass ControlPanelFormWrapper(layout.FormWrapper):\n \"\"\"Use this form as the plone.z3cform layout wrapper to get the control\n panel layout.\n \"\"\"\n index = ViewPageTemplateFile('filter_controlpanel.pt')\n\n\nFilterControlPanelView = layout.wrap_form(\n FilterControlPanel, ControlPanelFormWrapper)\n", "path": "Products/CMFPlone/controlpanel/browser/filter.py"}]} | 1,597 | 425 |
gh_patches_debug_9181 | rasdani/github-patches | git_diff | certbot__certbot-9218 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
snap_config.prepare_env can hang forever if snapd is malfunctioning
Reported at https://community.letsencrypt.org/t/sudo-certbot-renew-hangs-forever/138649
There's some evidence in that thread that `snapd` can get into a state where it just stops responding to commands (whether via `snap`, `snapctl`, or the REST API directly).
Certbot should guard against the possibility of hanging forever, by setting timeouts on relevant network operations:
https://github.com/certbot/certbot/blob/9ca7f76505b10b2f395ddffc4ddc1cbc8afb516b/certbot/certbot/_internal/snap_config.py#L87-L89
https://github.com/certbot/certbot/blob/9ca7f76505b10b2f395ddffc4ddc1cbc8afb516b/certbot/certbot/_internal/snap_config.py#L56-L57
Edit: to clarify, although the reported issue involves the previous shell-based snap wrapper, I have reproduced the same hang and lack of effective timeout with the current Python implementation.
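
A minimal sketch of the idea, not the project's actual patch: pass an explicit `timeout` to the snapd request so a wedged socket raises instead of blocking forever. The helper name and the 10-second default are placeholders, and it assumes the `_SnapdAdapter` from `snap_config` is already mounted on the session.

```python
from requests import Session
from requests.exceptions import Timeout


def query_snapd_connections(session: Session, timeout_seconds: float = 10.0):
    """Ask snapd for certbot's content-interface connections, failing fast."""
    try:
        # `timeout` bounds the connect and each socket read; without it the
        # call can block indefinitely when snapd stops responding.
        response = session.get(
            'http://snapd/v2/connections?snap=certbot&interface=content',
            timeout=timeout_seconds)
    except Timeout as exc:
        raise RuntimeError('snapd did not answer within {0} seconds: {1}'.format(
            timeout_seconds, exc))
    response.raise_for_status()
    return response.json()
```
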
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `certbot/certbot/_internal/snap_config.py`
Content:
```
1 """Module configuring Certbot in a snap environment"""
2 import logging
3 import socket
4 from typing import Iterable
5 from typing import List
6 from typing import Optional
7
8 from requests import Session
9 from requests.adapters import HTTPAdapter
10 from requests.exceptions import HTTPError
11 from requests.exceptions import RequestException
12
13 from certbot.compat import os
14 from certbot.errors import Error
15
16 try:
17 from urllib3.connection import HTTPConnection
18 from urllib3.connectionpool import HTTPConnectionPool
19 except ImportError:
20 # Stub imports for oldest requirements, that will never be used in snaps.
21 HTTPConnection = object # type: ignore[misc,assignment]
22 HTTPConnectionPool = object # type: ignore[misc,assignment]
23
24
25 _ARCH_TRIPLET_MAP = {
26 'arm64': 'aarch64-linux-gnu',
27 'armhf': 'arm-linux-gnueabihf',
28 'i386': 'i386-linux-gnu',
29 'ppc64el': 'powerpc64le-linux-gnu',
30 'powerpc': 'powerpc-linux-gnu',
31 'amd64': 'x86_64-linux-gnu',
32 's390x': 's390x-linux-gnu',
33 }
34
35 LOGGER = logging.getLogger(__name__)
36
37
38 def prepare_env(cli_args: List[str]) -> List[str]:
39 """
40 Prepare runtime environment for a certbot execution in snap.
41 :param list cli_args: List of command line arguments
42 :return: Update list of command line arguments
43 :rtype: list
44 """
45 snap_arch = os.environ.get('SNAP_ARCH')
46
47 if snap_arch not in _ARCH_TRIPLET_MAP:
48 raise Error('Unrecognized value of SNAP_ARCH: {0}'.format(snap_arch))
49
50 os.environ['CERTBOT_AUGEAS_PATH'] = '{0}/usr/lib/{1}/libaugeas.so.0'.format(
51 os.environ.get('SNAP'), _ARCH_TRIPLET_MAP[snap_arch])
52
53 with Session() as session:
54 session.mount('http://snapd/', _SnapdAdapter())
55
56 try:
57 response = session.get('http://snapd/v2/connections?snap=certbot&interface=content')
58 response.raise_for_status()
59 except RequestException as e:
60 if isinstance(e, HTTPError) and e.response.status_code == 404:
61 LOGGER.error('An error occurred while fetching Certbot snap plugins: '
62 'your version of snapd is outdated.')
63 LOGGER.error('Please run "sudo snap install core; sudo snap refresh core" '
64 'in your terminal and try again.')
65 else:
66 LOGGER.error('An error occurred while fetching Certbot snap plugins: '
67 'make sure the snapd service is running.')
68 raise e
69
70 data = response.json()
71 connections = ['/snap/{0}/current/lib/python3.8/site-packages/'.format(item['slot']['snap'])
72 for item in data.get('result', {}).get('established', [])
73 if item.get('plug', {}).get('plug') == 'plugin'
74 and item.get('plug-attrs', {}).get('content') == 'certbot-1']
75
76 os.environ['CERTBOT_PLUGIN_PATH'] = ':'.join(connections)
77
78 cli_args.append('--preconfigured-renewal')
79
80 return cli_args
81
82
83 class _SnapdConnection(HTTPConnection):
84 def __init__(self) -> None:
85 super().__init__("localhost")
86 self.sock: Optional[socket.socket] = None
87
88 def connect(self) -> None:
89 self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
90 self.sock.connect("/run/snapd.socket")
91
92
93 class _SnapdConnectionPool(HTTPConnectionPool):
94 def __init__(self) -> None:
95 super().__init__("localhost")
96
97 def _new_conn(self) -> _SnapdConnection:
98 return _SnapdConnection()
99
100
101 class _SnapdAdapter(HTTPAdapter):
102 def get_connection(self, url: str,
103 proxies: Optional[Iterable[str]] = None) -> _SnapdConnectionPool:
104 return _SnapdConnectionPool()
105
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/certbot/certbot/_internal/snap_config.py b/certbot/certbot/_internal/snap_config.py
--- a/certbot/certbot/_internal/snap_config.py
+++ b/certbot/certbot/_internal/snap_config.py
@@ -54,7 +54,8 @@
session.mount('http://snapd/', _SnapdAdapter())
try:
- response = session.get('http://snapd/v2/connections?snap=certbot&interface=content')
+ response = session.get('http://snapd/v2/connections?snap=certbot&interface=content',
+ timeout=30.0)
response.raise_for_status()
except RequestException as e:
if isinstance(e, HTTPError) and e.response.status_code == 404:
| {"golden_diff": "diff --git a/certbot/certbot/_internal/snap_config.py b/certbot/certbot/_internal/snap_config.py\n--- a/certbot/certbot/_internal/snap_config.py\n+++ b/certbot/certbot/_internal/snap_config.py\n@@ -54,7 +54,8 @@\n session.mount('http://snapd/', _SnapdAdapter())\n \n try:\n- response = session.get('http://snapd/v2/connections?snap=certbot&interface=content')\n+ response = session.get('http://snapd/v2/connections?snap=certbot&interface=content',\n+ timeout=30.0)\n response.raise_for_status()\n except RequestException as e:\n if isinstance(e, HTTPError) and e.response.status_code == 404:\n", "issue": "snap_config.prepare_env can hang forever if snapd is malfunctioning\nReported at https://community.letsencrypt.org/t/sudo-certbot-renew-hangs-forever/138649\r\n\r\nThere's some evidence in that thread that `snapd` can get into a state where it just stops responding to commands (whether via `snap`, `snapctl`, or the REST API directly).\r\n\r\nCertbot should guard against the possibility of hanging forever, by setting timeouts on relevant network operations:\r\n\r\nhttps://github.com/certbot/certbot/blob/9ca7f76505b10b2f395ddffc4ddc1cbc8afb516b/certbot/certbot/_internal/snap_config.py#L87-L89\r\n\r\nhttps://github.com/certbot/certbot/blob/9ca7f76505b10b2f395ddffc4ddc1cbc8afb516b/certbot/certbot/_internal/snap_config.py#L56-L57\r\n\r\nEdit: to clarify, although the reported issue involves the previous shell-based snap wrapper, I have reproduced the same hang and lack of effective timeout with the current Python implementation.\n", "before_files": [{"content": "\"\"\"Module configuring Certbot in a snap environment\"\"\"\nimport logging\nimport socket\nfrom typing import Iterable\nfrom typing import List\nfrom typing import Optional\n\nfrom requests import Session\nfrom requests.adapters import HTTPAdapter\nfrom requests.exceptions import HTTPError\nfrom requests.exceptions import RequestException\n\nfrom certbot.compat import os\nfrom certbot.errors import Error\n\ntry:\n from urllib3.connection import HTTPConnection\n from urllib3.connectionpool import HTTPConnectionPool\nexcept ImportError:\n # Stub imports for oldest requirements, that will never be used in snaps.\n HTTPConnection = object # type: ignore[misc,assignment]\n HTTPConnectionPool = object # type: ignore[misc,assignment]\n\n\n_ARCH_TRIPLET_MAP = {\n 'arm64': 'aarch64-linux-gnu',\n 'armhf': 'arm-linux-gnueabihf',\n 'i386': 'i386-linux-gnu',\n 'ppc64el': 'powerpc64le-linux-gnu',\n 'powerpc': 'powerpc-linux-gnu',\n 'amd64': 'x86_64-linux-gnu',\n 's390x': 's390x-linux-gnu',\n}\n\nLOGGER = logging.getLogger(__name__)\n\n\ndef prepare_env(cli_args: List[str]) -> List[str]:\n \"\"\"\n Prepare runtime environment for a certbot execution in snap.\n :param list cli_args: List of command line arguments\n :return: Update list of command line arguments\n :rtype: list\n \"\"\"\n snap_arch = os.environ.get('SNAP_ARCH')\n\n if snap_arch not in _ARCH_TRIPLET_MAP:\n raise Error('Unrecognized value of SNAP_ARCH: {0}'.format(snap_arch))\n\n os.environ['CERTBOT_AUGEAS_PATH'] = '{0}/usr/lib/{1}/libaugeas.so.0'.format(\n os.environ.get('SNAP'), _ARCH_TRIPLET_MAP[snap_arch])\n\n with Session() as session:\n session.mount('http://snapd/', _SnapdAdapter())\n\n try:\n response = session.get('http://snapd/v2/connections?snap=certbot&interface=content')\n response.raise_for_status()\n except RequestException as e:\n if isinstance(e, HTTPError) and e.response.status_code == 404:\n LOGGER.error('An error occurred while fetching Certbot snap 
plugins: '\n 'your version of snapd is outdated.')\n LOGGER.error('Please run \"sudo snap install core; sudo snap refresh core\" '\n 'in your terminal and try again.')\n else:\n LOGGER.error('An error occurred while fetching Certbot snap plugins: '\n 'make sure the snapd service is running.')\n raise e\n\n data = response.json()\n connections = ['/snap/{0}/current/lib/python3.8/site-packages/'.format(item['slot']['snap'])\n for item in data.get('result', {}).get('established', [])\n if item.get('plug', {}).get('plug') == 'plugin'\n and item.get('plug-attrs', {}).get('content') == 'certbot-1']\n\n os.environ['CERTBOT_PLUGIN_PATH'] = ':'.join(connections)\n\n cli_args.append('--preconfigured-renewal')\n\n return cli_args\n\n\nclass _SnapdConnection(HTTPConnection):\n def __init__(self) -> None:\n super().__init__(\"localhost\")\n self.sock: Optional[socket.socket] = None\n\n def connect(self) -> None:\n self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)\n self.sock.connect(\"/run/snapd.socket\")\n\n\nclass _SnapdConnectionPool(HTTPConnectionPool):\n def __init__(self) -> None:\n super().__init__(\"localhost\")\n\n def _new_conn(self) -> _SnapdConnection:\n return _SnapdConnection()\n\n\nclass _SnapdAdapter(HTTPAdapter):\n def get_connection(self, url: str,\n proxies: Optional[Iterable[str]] = None) -> _SnapdConnectionPool:\n return _SnapdConnectionPool()\n", "path": "certbot/certbot/_internal/snap_config.py"}], "after_files": [{"content": "\"\"\"Module configuring Certbot in a snap environment\"\"\"\nimport logging\nimport socket\nfrom typing import Iterable\nfrom typing import List\nfrom typing import Optional\n\nfrom requests import Session\nfrom requests.adapters import HTTPAdapter\nfrom requests.exceptions import HTTPError\nfrom requests.exceptions import RequestException\n\nfrom certbot.compat import os\nfrom certbot.errors import Error\n\ntry:\n from urllib3.connection import HTTPConnection\n from urllib3.connectionpool import HTTPConnectionPool\nexcept ImportError:\n # Stub imports for oldest requirements, that will never be used in snaps.\n HTTPConnection = object # type: ignore[misc,assignment]\n HTTPConnectionPool = object # type: ignore[misc,assignment]\n\n\n_ARCH_TRIPLET_MAP = {\n 'arm64': 'aarch64-linux-gnu',\n 'armhf': 'arm-linux-gnueabihf',\n 'i386': 'i386-linux-gnu',\n 'ppc64el': 'powerpc64le-linux-gnu',\n 'powerpc': 'powerpc-linux-gnu',\n 'amd64': 'x86_64-linux-gnu',\n 's390x': 's390x-linux-gnu',\n}\n\nLOGGER = logging.getLogger(__name__)\n\n\ndef prepare_env(cli_args: List[str]) -> List[str]:\n \"\"\"\n Prepare runtime environment for a certbot execution in snap.\n :param list cli_args: List of command line arguments\n :return: Update list of command line arguments\n :rtype: list\n \"\"\"\n snap_arch = os.environ.get('SNAP_ARCH')\n\n if snap_arch not in _ARCH_TRIPLET_MAP:\n raise Error('Unrecognized value of SNAP_ARCH: {0}'.format(snap_arch))\n\n os.environ['CERTBOT_AUGEAS_PATH'] = '{0}/usr/lib/{1}/libaugeas.so.0'.format(\n os.environ.get('SNAP'), _ARCH_TRIPLET_MAP[snap_arch])\n\n with Session() as session:\n session.mount('http://snapd/', _SnapdAdapter())\n\n try:\n response = session.get('http://snapd/v2/connections?snap=certbot&interface=content',\n timeout=30.0)\n response.raise_for_status()\n except RequestException as e:\n if isinstance(e, HTTPError) and e.response.status_code == 404:\n LOGGER.error('An error occurred while fetching Certbot snap plugins: '\n 'your version of snapd is outdated.')\n LOGGER.error('Please run \"sudo snap install core; sudo snap 
refresh core\" '\n 'in your terminal and try again.')\n else:\n LOGGER.error('An error occurred while fetching Certbot snap plugins: '\n 'make sure the snapd service is running.')\n raise e\n\n data = response.json()\n connections = ['/snap/{0}/current/lib/python3.8/site-packages/'.format(item['slot']['snap'])\n for item in data.get('result', {}).get('established', [])\n if item.get('plug', {}).get('plug') == 'plugin'\n and item.get('plug-attrs', {}).get('content') == 'certbot-1']\n\n os.environ['CERTBOT_PLUGIN_PATH'] = ':'.join(connections)\n\n cli_args.append('--preconfigured-renewal')\n\n return cli_args\n\n\nclass _SnapdConnection(HTTPConnection):\n def __init__(self) -> None:\n super().__init__(\"localhost\")\n self.sock: Optional[socket.socket] = None\n\n def connect(self) -> None:\n self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)\n self.sock.connect(\"/run/snapd.socket\")\n\n\nclass _SnapdConnectionPool(HTTPConnectionPool):\n def __init__(self) -> None:\n super().__init__(\"localhost\")\n\n def _new_conn(self) -> _SnapdConnection:\n return _SnapdConnection()\n\n\nclass _SnapdAdapter(HTTPAdapter):\n def get_connection(self, url: str,\n proxies: Optional[Iterable[str]] = None) -> _SnapdConnectionPool:\n return _SnapdConnectionPool()\n", "path": "certbot/certbot/_internal/snap_config.py"}]} | 1,615 | 181 |
gh_patches_debug_2671 | rasdani/github-patches | git_diff | saleor__saleor-1389 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add robots meta tag and "nofollow" link attribute
1. Fragile pages should not be indexed by search engines.
```
<meta name=”robots” content=”nofollow, noindex”>
```
- [x] Add above meta tag to order's confirmation page
2. Pages that bring little to no content value should not be crawled
```
<meta name=”robots” content=”nofollow”>
```
- [x] Add above meta tag to sign in/sign up/cart pages
3. Add link attribute
- [x] Links pointing to the above pages should have the attribute `rel="nofollow"` set
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `saleor/core/views.py`
Content:
```
1 from __future__ import unicode_literals
2
3 from django.template.response import TemplateResponse
4 from django.contrib import messages
5 from django.conf import settings
6 from django.utils.translation import pgettext_lazy
7 from impersonate.views import impersonate as orig_impersonate
8
9 from ..dashboard.views import staff_member_required
10 from ..product.utils import products_with_availability, products_for_homepage
11 from ..userprofile.models import User
12
13
14 def home(request):
15 products = products_for_homepage()[:8]
16 products = products_with_availability(
17 products, discounts=request.discounts, local_currency=request.currency)
18 return TemplateResponse(
19 request, 'home.html',
20 {'products': products, 'parent': None})
21
22
23 @staff_member_required
24 def styleguide(request):
25 return TemplateResponse(request, 'styleguide.html')
26
27
28 def impersonate(request, uid):
29 response = orig_impersonate(request, uid)
30 if request.session.modified:
31 msg = pgettext_lazy(
32 'Impersonation message',
33 'You are now logged as {}'.format(User.objects.get(pk=uid)))
34 messages.success(request, msg)
35 return response
36
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/saleor/core/views.py b/saleor/core/views.py
--- a/saleor/core/views.py
+++ b/saleor/core/views.py
@@ -2,7 +2,6 @@
from django.template.response import TemplateResponse
from django.contrib import messages
-from django.conf import settings
from django.utils.translation import pgettext_lazy
from impersonate.views import impersonate as orig_impersonate
| {"golden_diff": "diff --git a/saleor/core/views.py b/saleor/core/views.py\n--- a/saleor/core/views.py\n+++ b/saleor/core/views.py\n@@ -2,7 +2,6 @@\n \n from django.template.response import TemplateResponse\n from django.contrib import messages\n-from django.conf import settings\n from django.utils.translation import pgettext_lazy\n from impersonate.views import impersonate as orig_impersonate\n", "issue": "Add robots meta tag and \"nofollow\" link attribute\n1. Fragile pages should be not indexed by search engines.\r\n```\r\n<meta name=\u201drobots\u201d content=\u201dnofollow, noindex\u201d>\r\n```\r\n- [x] Add above meta tag to order's confirmation page\r\n\r\n2. Pages that brings no to little content value should not be crawled\r\n```\r\n<meta name=\u201drobots\u201d content=\u201dnofollow\u201d>\r\n```\r\n- [x] Add above meta tag to sign in/sign up/cart pages \r\n3. Add link attribute\r\n- [x] Links pointing to above pages should have set attribute `rel=\"nofollow\"` \n", "before_files": [{"content": "from __future__ import unicode_literals\n\nfrom django.template.response import TemplateResponse\nfrom django.contrib import messages\nfrom django.conf import settings\nfrom django.utils.translation import pgettext_lazy\nfrom impersonate.views import impersonate as orig_impersonate\n\nfrom ..dashboard.views import staff_member_required\nfrom ..product.utils import products_with_availability, products_for_homepage\nfrom ..userprofile.models import User\n\n\ndef home(request):\n products = products_for_homepage()[:8]\n products = products_with_availability(\n products, discounts=request.discounts, local_currency=request.currency)\n return TemplateResponse(\n request, 'home.html',\n {'products': products, 'parent': None})\n\n\n@staff_member_required\ndef styleguide(request):\n return TemplateResponse(request, 'styleguide.html')\n\n\ndef impersonate(request, uid):\n response = orig_impersonate(request, uid)\n if request.session.modified:\n msg = pgettext_lazy(\n 'Impersonation message',\n 'You are now logged as {}'.format(User.objects.get(pk=uid)))\n messages.success(request, msg)\n return response\n", "path": "saleor/core/views.py"}], "after_files": [{"content": "from __future__ import unicode_literals\n\nfrom django.template.response import TemplateResponse\nfrom django.contrib import messages\nfrom django.utils.translation import pgettext_lazy\nfrom impersonate.views import impersonate as orig_impersonate\n\nfrom ..dashboard.views import staff_member_required\nfrom ..product.utils import products_with_availability, products_for_homepage\nfrom ..userprofile.models import User\n\n\ndef home(request):\n products = products_for_homepage()[:8]\n products = products_with_availability(\n products, discounts=request.discounts, local_currency=request.currency)\n return TemplateResponse(\n request, 'home.html',\n {'products': products, 'parent': None})\n\n\n@staff_member_required\ndef styleguide(request):\n return TemplateResponse(request, 'styleguide.html')\n\n\ndef impersonate(request, uid):\n response = orig_impersonate(request, uid)\n if request.session.modified:\n msg = pgettext_lazy(\n 'Impersonation message',\n 'You are now logged as {}'.format(User.objects.get(pk=uid)))\n messages.success(request, msg)\n return response\n", "path": "saleor/core/views.py"}]} | 683 | 89 |
gh_patches_debug_9597 | rasdani/github-patches | git_diff | google__turbinia-1017 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Cron.py unicode decode error
It looks like input evidence caused a Unicode decode error for cron. I think this would only be raised when receiving unreadable input, but I'm filing this in case we want to catch the exception differently or look into it further.
```
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/turbinia-20220216-py3.8.egg/turbinia/workers/__init__.py", line 1005, in run_wrapper
self.result = self.run(evidence, self.result)
File "/usr/local/lib/python3.8/dist-packages/turbinia-20220216-py3.8.egg/turbinia/workers/cron.py", line 54, in run
crontab = input_file.read()
File "/usr/lib/python3.8/codecs.py", line 322, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xdc in position 0: invalid continuation byte
```
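
One possible shape for the guard, sketched under the assumption that an undecodable crontab should fail the task cleanly rather than crash the worker (the helper below is illustrative, not the actual Turbinia fix):

```python
def read_crontab(path):
    """Return crontab text, or raise a clean error for non-UTF-8 input."""
    try:
        with open(path, 'r', encoding='utf-8') as input_file:
            return input_file.read()
    except UnicodeDecodeError as exc:
        # Binary or corrupt evidence: surface a readable failure message
        # instead of an unhandled exception in run().
        raise ValueError('Unable to parse {0:s} as text: {1!s}'.format(path, exc))
```
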
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `turbinia/workers/cron.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # Copyright 2021 Google Inc.
3 #
4 # Licensed under the Apache License, Version 2.0 (the "License");
5 # you may not use this file except in compliance with the License.
6 # You may obtain a copy of the License at
7 #
8 # http://www.apache.org/licenses/LICENSE-2.0
9 #
10 # Unless required by applicable law or agreed to in writing, software
11 # distributed under the License is distributed on an "AS IS" BASIS,
12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 # See the License for the specific language governing permissions and
14 # limitations under the License.
15 """Task for analysing cron files."""
16
17 from __future__ import unicode_literals
18
19 import os
20 import re
21
22 from turbinia.evidence import EvidenceState as state
23 from turbinia.evidence import ReportText
24 from turbinia.lib import text_formatter as fmt
25 from turbinia.workers import TurbiniaTask
26 from turbinia.workers import Priority
27
28
29 class CronAnalysisTask(TurbiniaTask):
30 """Task to analyze crontabs."""
31
32 REQUIRED_STATES = [
33 state.ATTACHED, state.CONTAINER_MOUNTED, state.DECOMPRESSED
34 ]
35
36 def run(self, evidence, result):
37 """Run the cron analysis worker.
38
39 Args:
40 evidence (Evidence object): The evidence we will process.
41 result (TurbiniaTaskResult): The object to place task results into.
42
43 Returns:
44 TurbiniaTaskResult object.
45 """
46 # Where to store the resulting output file.
47 output_file_name = 'cron_analysis.txt'
48 output_file_path = os.path.join(self.output_dir, output_file_name)
49 # Set the output file as the data source for the output evidence.
50 output_evidence = ReportText(source_path=output_file_path)
51
52 # Read the input file
53 with open(evidence.local_path, 'r') as input_file:
54 crontab = input_file.read()
55
56 (report, priority, summary) = self.analyse_crontab(crontab)
57 output_evidence.text_data = report
58 result.report_priority = priority
59 result.report_data = report
60
61 # Write the report to the output file.
62 with open(output_file_path, 'wb') as fh:
63 fh.write(output_evidence.text_data.encode('utf-8'))
64
65 # Add the resulting evidence to the result object.
66 result.add_evidence(output_evidence, evidence.config)
67 result.close(self, success=True, status=summary)
68 return result
69
70 def analyse_crontab(self, crontab):
71 """Analyses a Cron file.
72
73 Args:
74 crontab (str): file content.
75
76 Returns:
77 Tuple(
78 report_text(str): The report data
79 report_priority(int): The priority of the report (0 - 100)
80 summary(str): A summary of the report (used for task status)
81 )
82 """
83 findings = []
84 wget_or_curl = re.compile(r'(wget|curl)', re.IGNORECASE | re.MULTILINE)
85 pipe_to_sh = re.compile(r'\|(.*)sh ', re.IGNORECASE | re.MULTILINE)
86 get_piped_to_sh = re.compile(
87 r'((wget|curl).*\|)+(.*sh)', re.IGNORECASE | re.MULTILINE)
88
89 if re.search(get_piped_to_sh, crontab):
90 findings.append(fmt.bullet('Remote file retrieval piped to a shell.'))
91 elif re.search(wget_or_curl, crontab):
92 findings.append(fmt.bullet('Remote file retrieval'))
93 elif re.search(pipe_to_sh, crontab):
94 findings.append(fmt.bullet('File piped to shell'))
95
96 if findings:
97 summary = 'Potentially backdoored crontab found.'
98 findings.insert(0, fmt.heading4(fmt.bold(summary)))
99 report = '\n'.join(findings)
100 return (report, Priority.HIGH, summary)
101
102 report = 'No issues found in crontabs'
103 return (report, Priority.LOW, report)
104
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/turbinia/workers/cron.py b/turbinia/workers/cron.py
--- a/turbinia/workers/cron.py
+++ b/turbinia/workers/cron.py
@@ -51,7 +51,14 @@
# Read the input file
with open(evidence.local_path, 'r') as input_file:
- crontab = input_file.read()
+ try:
+ crontab = input_file.read()
+ except UnicodeDecodeError as exception:
+ message = 'Error parsing cron file {0:s}: {1!s}'.format(
+ evidence.local_path, exception)
+ result.log(message)
+ result.close(self, success=False, status=message)
+ return result
(report, priority, summary) = self.analyse_crontab(crontab)
output_evidence.text_data = report
| {"golden_diff": "diff --git a/turbinia/workers/cron.py b/turbinia/workers/cron.py\n--- a/turbinia/workers/cron.py\n+++ b/turbinia/workers/cron.py\n@@ -51,7 +51,14 @@\n \n # Read the input file\n with open(evidence.local_path, 'r') as input_file:\n- crontab = input_file.read()\n+ try:\n+ crontab = input_file.read()\n+ except UnicodeDecodeError as exception:\n+ message = 'Error parsing cron file {0:s}: {1!s}'.format(\n+ evidence.local_path, exception)\n+ result.log(message)\n+ result.close(self, success=False, status=message)\n+ return result\n \n (report, priority, summary) = self.analyse_crontab(crontab)\n output_evidence.text_data = report\n", "issue": "Cron.py unicode decode error\nIt looks like input evidence caused a unicode error for cron. I think this would only be raised when receiving unreadable input but filing this if we want to catch this exception differently/look into it further\r\n\r\n```\r\n Traceback (most recent call last):\r\n File \"/usr/local/lib/python3.8/dist-packages/turbinia-20220216-py3.8.egg/turbinia/workers/__init__.py\", line 1005, in run_wrapper\r\n self.result = self.run(evidence, self.result)\r\n File \"/usr/local/lib/python3.8/dist-packages/turbinia-20220216-py3.8.egg/turbinia/workers/cron.py\", line 54, in run\r\n crontab = input_file.read()\r\n File \"/usr/lib/python3.8/codecs.py\", line 322, in decode\r\n (result, consumed) = self._buffer_decode(data, self.errors, final)\r\nUnicodeDecodeError: 'utf-8' codec can't decode byte 0xdc in position 0: invalid continuation byte\r\n```\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright 2021 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Task for analysing cron files.\"\"\"\n\nfrom __future__ import unicode_literals\n\nimport os\nimport re\n\nfrom turbinia.evidence import EvidenceState as state\nfrom turbinia.evidence import ReportText\nfrom turbinia.lib import text_formatter as fmt\nfrom turbinia.workers import TurbiniaTask\nfrom turbinia.workers import Priority\n\n\nclass CronAnalysisTask(TurbiniaTask):\n \"\"\"Task to analyze crontabs.\"\"\"\n\n REQUIRED_STATES = [\n state.ATTACHED, state.CONTAINER_MOUNTED, state.DECOMPRESSED\n ]\n\n def run(self, evidence, result):\n \"\"\"Run the cron analysis worker.\n\n Args:\n evidence (Evidence object): The evidence we will process.\n result (TurbiniaTaskResult): The object to place task results into.\n\n Returns:\n TurbiniaTaskResult object.\n \"\"\"\n # Where to store the resulting output file.\n output_file_name = 'cron_analysis.txt'\n output_file_path = os.path.join(self.output_dir, output_file_name)\n # Set the output file as the data source for the output evidence.\n output_evidence = ReportText(source_path=output_file_path)\n\n # Read the input file\n with open(evidence.local_path, 'r') as input_file:\n crontab = input_file.read()\n\n (report, priority, summary) = self.analyse_crontab(crontab)\n output_evidence.text_data = report\n result.report_priority = priority\n result.report_data = report\n\n # Write the report to the 
output file.\n with open(output_file_path, 'wb') as fh:\n fh.write(output_evidence.text_data.encode('utf-8'))\n\n # Add the resulting evidence to the result object.\n result.add_evidence(output_evidence, evidence.config)\n result.close(self, success=True, status=summary)\n return result\n\n def analyse_crontab(self, crontab):\n \"\"\"Analyses a Cron file.\n\n Args:\n crontab (str): file content.\n\n Returns:\n Tuple(\n report_text(str): The report data\n report_priority(int): The priority of the report (0 - 100)\n summary(str): A summary of the report (used for task status)\n )\n \"\"\"\n findings = []\n wget_or_curl = re.compile(r'(wget|curl)', re.IGNORECASE | re.MULTILINE)\n pipe_to_sh = re.compile(r'\\|(.*)sh ', re.IGNORECASE | re.MULTILINE)\n get_piped_to_sh = re.compile(\n r'((wget|curl).*\\|)+(.*sh)', re.IGNORECASE | re.MULTILINE)\n\n if re.search(get_piped_to_sh, crontab):\n findings.append(fmt.bullet('Remote file retrieval piped to a shell.'))\n elif re.search(wget_or_curl, crontab):\n findings.append(fmt.bullet('Remote file retrieval'))\n elif re.search(pipe_to_sh, crontab):\n findings.append(fmt.bullet('File piped to shell'))\n\n if findings:\n summary = 'Potentially backdoored crontab found.'\n findings.insert(0, fmt.heading4(fmt.bold(summary)))\n report = '\\n'.join(findings)\n return (report, Priority.HIGH, summary)\n\n report = 'No issues found in crontabs'\n return (report, Priority.LOW, report)\n", "path": "turbinia/workers/cron.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright 2021 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Task for analysing cron files.\"\"\"\n\nfrom __future__ import unicode_literals\n\nimport os\nimport re\n\nfrom turbinia.evidence import EvidenceState as state\nfrom turbinia.evidence import ReportText\nfrom turbinia.lib import text_formatter as fmt\nfrom turbinia.workers import TurbiniaTask\nfrom turbinia.workers import Priority\n\n\nclass CronAnalysisTask(TurbiniaTask):\n \"\"\"Task to analyze crontabs.\"\"\"\n\n REQUIRED_STATES = [\n state.ATTACHED, state.CONTAINER_MOUNTED, state.DECOMPRESSED\n ]\n\n def run(self, evidence, result):\n \"\"\"Run the cron analysis worker.\n\n Args:\n evidence (Evidence object): The evidence we will process.\n result (TurbiniaTaskResult): The object to place task results into.\n\n Returns:\n TurbiniaTaskResult object.\n \"\"\"\n # Where to store the resulting output file.\n output_file_name = 'cron_analysis.txt'\n output_file_path = os.path.join(self.output_dir, output_file_name)\n # Set the output file as the data source for the output evidence.\n output_evidence = ReportText(source_path=output_file_path)\n\n # Read the input file\n with open(evidence.local_path, 'r') as input_file:\n try:\n crontab = input_file.read()\n except UnicodeDecodeError as exception:\n message = 'Error parsing cron file {0:s}: {1!s}'.format(\n evidence.local_path, exception)\n result.log(message)\n result.close(self, success=False, status=message)\n return result\n\n (report, priority, summary) = 
self.analyse_crontab(crontab)\n output_evidence.text_data = report\n result.report_priority = priority\n result.report_data = report\n\n # Write the report to the output file.\n with open(output_file_path, 'wb') as fh:\n fh.write(output_evidence.text_data.encode('utf-8'))\n\n # Add the resulting evidence to the result object.\n result.add_evidence(output_evidence, evidence.config)\n result.close(self, success=True, status=summary)\n return result\n\n def analyse_crontab(self, crontab):\n \"\"\"Analyses a Cron file.\n\n Args:\n crontab (str): file content.\n\n Returns:\n Tuple(\n report_text(str): The report data\n report_priority(int): The priority of the report (0 - 100)\n summary(str): A summary of the report (used for task status)\n )\n \"\"\"\n findings = []\n wget_or_curl = re.compile(r'(wget|curl)', re.IGNORECASE | re.MULTILINE)\n pipe_to_sh = re.compile(r'\\|(.*)sh ', re.IGNORECASE | re.MULTILINE)\n get_piped_to_sh = re.compile(\n r'((wget|curl).*\\|)+(.*sh)', re.IGNORECASE | re.MULTILINE)\n\n if re.search(get_piped_to_sh, crontab):\n findings.append(fmt.bullet('Remote file retrieval piped to a shell.'))\n elif re.search(wget_or_curl, crontab):\n findings.append(fmt.bullet('Remote file retrieval'))\n elif re.search(pipe_to_sh, crontab):\n findings.append(fmt.bullet('File piped to shell'))\n\n if findings:\n summary = 'Potentially backdoored crontab found.'\n findings.insert(0, fmt.heading4(fmt.bold(summary)))\n report = '\\n'.join(findings)\n return (report, Priority.HIGH, summary)\n\n report = 'No issues found in crontabs'\n return (report, Priority.LOW, report)\n", "path": "turbinia/workers/cron.py"}]} | 1,592 | 197 |
gh_patches_debug_933 | rasdani/github-patches | git_diff | python-pillow__Pillow-399 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Image opened twice if imagemagick and xv are installed
If ImageMagick and xv are both installed and you call Image.show(), it will open the image twice: once with display and once with xv. This is probably related to the for loop over registered viewers. Not sure why the return 1 isn't working though.
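
A plausible culprit, sketched with a toy stand-in rather than asserted: `Viewer.show()` calls `show_image()` but never returns its value, so the dispatch loop in `ImageShow.show()` receives `None`, treats the first viewer as having failed, and launches the next one as well.

```python
class PatchedViewer(object):
    """Toy stand-in for ImageShow.Viewer illustrating the missing return."""

    def show_image(self, image, **options):
        return 1  # pretend the image was displayed successfully

    def show(self, image, **options):
        # Without `return`, this method evaluates to None, the dispatch loop
        # in ImageShow.show() sees a falsy value, and the next registered
        # viewer (e.g. xv) is tried as well.
        return self.show_image(image, **options)
```
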
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `PIL/ImageShow.py`
Content:
```
1 #
2 # The Python Imaging Library.
3 # $Id$
4 #
5 # im.show() drivers
6 #
7 # History:
8 # 2008-04-06 fl Created
9 #
10 # Copyright (c) Secret Labs AB 2008.
11 #
12 # See the README file for information on usage and redistribution.
13 #
14
15 from __future__ import print_function
16
17 from PIL import Image
18 import os, sys
19
20 _viewers = []
21
22 def register(viewer, order=1):
23 try:
24 if issubclass(viewer, Viewer):
25 viewer = viewer()
26 except TypeError:
27 pass # raised if viewer wasn't a class
28 if order > 0:
29 _viewers.append(viewer)
30 elif order < 0:
31 _viewers.insert(0, viewer)
32
33 ##
34 # Displays a given image.
35 #
36 # @param image An image object.
37 # @param title Optional title. Not all viewers can display the title.
38 # @param **options Additional viewer options.
39 # @return True if a suitable viewer was found, false otherwise.
40
41 def show(image, title=None, **options):
42 for viewer in _viewers:
43 if viewer.show(image, title=title, **options):
44 return 1
45 return 0
46
47 ##
48 # Base class for viewers.
49
50 class Viewer:
51
52 # main api
53
54 def show(self, image, **options):
55
56 # save temporary image to disk
57 if image.mode[:4] == "I;16":
58 # @PIL88 @PIL101
59 # "I;16" isn't an 'official' mode, but we still want to
60 # provide a simple way to show 16-bit images.
61 base = "L"
62 # FIXME: auto-contrast if max() > 255?
63 else:
64 base = Image.getmodebase(image.mode)
65 if base != image.mode and image.mode != "1":
66 image = image.convert(base)
67
68 self.show_image(image, **options)
69
70 # hook methods
71
72 format = None
73
74 def get_format(self, image):
75 # return format name, or None to save as PGM/PPM
76 return self.format
77
78 def get_command(self, file, **options):
79 raise NotImplementedError
80
81 def save_image(self, image):
82 # save to temporary file, and return filename
83 return image._dump(format=self.get_format(image))
84
85 def show_image(self, image, **options):
86 # display given image
87 return self.show_file(self.save_image(image), **options)
88
89 def show_file(self, file, **options):
90 # display given file
91 os.system(self.get_command(file, **options))
92 return 1
93
94 # --------------------------------------------------------------------
95
96 if sys.platform == "win32":
97
98 class WindowsViewer(Viewer):
99 format = "BMP"
100 def get_command(self, file, **options):
101 return ("start /wait %s && ping -n 2 127.0.0.1 >NUL "
102 "&& del /f %s" % (file, file))
103
104 register(WindowsViewer)
105
106 elif sys.platform == "darwin":
107
108 class MacViewer(Viewer):
109 format = "BMP"
110 def get_command(self, file, **options):
111 # on darwin open returns immediately resulting in the temp
112 # file removal while app is opening
113 command = "open -a /Applications/Preview.app"
114 command = "(%s %s; sleep 20; rm -f %s)&" % (command, file, file)
115 return command
116
117 register(MacViewer)
118
119 else:
120
121 # unixoids
122
123 def which(executable):
124 path = os.environ.get("PATH")
125 if not path:
126 return None
127 for dirname in path.split(os.pathsep):
128 filename = os.path.join(dirname, executable)
129 if os.path.isfile(filename):
130 # FIXME: make sure it's executable
131 return filename
132 return None
133
134 class UnixViewer(Viewer):
135 def show_file(self, file, **options):
136 command, executable = self.get_command_ex(file, **options)
137 command = "(%s %s; rm -f %s)&" % (command, file, file)
138 os.system(command)
139 return 1
140
141 # implementations
142
143 class DisplayViewer(UnixViewer):
144 def get_command_ex(self, file, **options):
145 command = executable = "display"
146 return command, executable
147
148 if which("display"):
149 register(DisplayViewer)
150
151 class XVViewer(UnixViewer):
152 def get_command_ex(self, file, title=None, **options):
153 # note: xv is pretty outdated. most modern systems have
154 # imagemagick's display command instead.
155 command = executable = "xv"
156 if title:
157 # FIXME: do full escaping
158 command = command + " -name \"%s\"" % title
159 return command, executable
160
161 if which("xv"):
162 register(XVViewer)
163
164 if __name__ == "__main__":
165 # usage: python ImageShow.py imagefile [title]
166 print(show(Image.open(sys.argv[1]), *sys.argv[2:]))
167
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/PIL/ImageShow.py b/PIL/ImageShow.py
--- a/PIL/ImageShow.py
+++ b/PIL/ImageShow.py
@@ -65,7 +65,7 @@
if base != image.mode and image.mode != "1":
image = image.convert(base)
- self.show_image(image, **options)
+ return self.show_image(image, **options)
# hook methods
| {"golden_diff": "diff --git a/PIL/ImageShow.py b/PIL/ImageShow.py\n--- a/PIL/ImageShow.py\n+++ b/PIL/ImageShow.py\n@@ -65,7 +65,7 @@\n if base != image.mode and image.mode != \"1\":\n image = image.convert(base)\n \n- self.show_image(image, **options)\n+ return self.show_image(image, **options)\n \n # hook methods\n", "issue": "Image opened twice if imagemagick and xv are installed\nIf ImageMagick and xv are both installed and you call Image.show() it will open the image twice, once with display and once with xv. This is probably related to the for loop over registered viewers. Not sure why the return 1 isn't working though.\n\n", "before_files": [{"content": "#\n# The Python Imaging Library.\n# $Id$\n#\n# im.show() drivers\n#\n# History:\n# 2008-04-06 fl Created\n#\n# Copyright (c) Secret Labs AB 2008.\n#\n# See the README file for information on usage and redistribution.\n#\n\nfrom __future__ import print_function\n\nfrom PIL import Image\nimport os, sys\n\n_viewers = []\n\ndef register(viewer, order=1):\n try:\n if issubclass(viewer, Viewer):\n viewer = viewer()\n except TypeError:\n pass # raised if viewer wasn't a class\n if order > 0:\n _viewers.append(viewer)\n elif order < 0:\n _viewers.insert(0, viewer)\n\n##\n# Displays a given image.\n#\n# @param image An image object.\n# @param title Optional title. Not all viewers can display the title.\n# @param **options Additional viewer options.\n# @return True if a suitable viewer was found, false otherwise.\n\ndef show(image, title=None, **options):\n for viewer in _viewers:\n if viewer.show(image, title=title, **options):\n return 1\n return 0\n\n##\n# Base class for viewers.\n\nclass Viewer:\n\n # main api\n\n def show(self, image, **options):\n\n # save temporary image to disk\n if image.mode[:4] == \"I;16\":\n # @PIL88 @PIL101\n # \"I;16\" isn't an 'official' mode, but we still want to\n # provide a simple way to show 16-bit images.\n base = \"L\"\n # FIXME: auto-contrast if max() > 255?\n else:\n base = Image.getmodebase(image.mode)\n if base != image.mode and image.mode != \"1\":\n image = image.convert(base)\n\n self.show_image(image, **options)\n\n # hook methods\n\n format = None\n\n def get_format(self, image):\n # return format name, or None to save as PGM/PPM\n return self.format\n\n def get_command(self, file, **options):\n raise NotImplementedError\n\n def save_image(self, image):\n # save to temporary file, and return filename\n return image._dump(format=self.get_format(image))\n\n def show_image(self, image, **options):\n # display given image\n return self.show_file(self.save_image(image), **options)\n\n def show_file(self, file, **options):\n # display given file\n os.system(self.get_command(file, **options))\n return 1\n\n# --------------------------------------------------------------------\n\nif sys.platform == \"win32\":\n\n class WindowsViewer(Viewer):\n format = \"BMP\"\n def get_command(self, file, **options):\n return (\"start /wait %s && ping -n 2 127.0.0.1 >NUL \"\n \"&& del /f %s\" % (file, file))\n\n register(WindowsViewer)\n\nelif sys.platform == \"darwin\":\n\n class MacViewer(Viewer):\n format = \"BMP\"\n def get_command(self, file, **options):\n # on darwin open returns immediately resulting in the temp\n # file removal while app is opening\n command = \"open -a /Applications/Preview.app\"\n command = \"(%s %s; sleep 20; rm -f %s)&\" % (command, file, file)\n return command\n\n register(MacViewer)\n\nelse:\n\n # unixoids\n\n def which(executable):\n path = os.environ.get(\"PATH\")\n if not path:\n 
return None\n for dirname in path.split(os.pathsep):\n filename = os.path.join(dirname, executable)\n if os.path.isfile(filename):\n # FIXME: make sure it's executable\n return filename\n return None\n\n class UnixViewer(Viewer):\n def show_file(self, file, **options):\n command, executable = self.get_command_ex(file, **options)\n command = \"(%s %s; rm -f %s)&\" % (command, file, file)\n os.system(command)\n return 1\n\n # implementations\n\n class DisplayViewer(UnixViewer):\n def get_command_ex(self, file, **options):\n command = executable = \"display\"\n return command, executable\n\n if which(\"display\"):\n register(DisplayViewer)\n\n class XVViewer(UnixViewer):\n def get_command_ex(self, file, title=None, **options):\n # note: xv is pretty outdated. most modern systems have\n # imagemagick's display command instead.\n command = executable = \"xv\"\n if title:\n # FIXME: do full escaping\n command = command + \" -name \\\"%s\\\"\" % title\n return command, executable\n\n if which(\"xv\"):\n register(XVViewer)\n\nif __name__ == \"__main__\":\n # usage: python ImageShow.py imagefile [title]\n print(show(Image.open(sys.argv[1]), *sys.argv[2:]))\n", "path": "PIL/ImageShow.py"}], "after_files": [{"content": "#\n# The Python Imaging Library.\n# $Id$\n#\n# im.show() drivers\n#\n# History:\n# 2008-04-06 fl Created\n#\n# Copyright (c) Secret Labs AB 2008.\n#\n# See the README file for information on usage and redistribution.\n#\n\nfrom __future__ import print_function\n\nfrom PIL import Image\nimport os, sys\n\n_viewers = []\n\ndef register(viewer, order=1):\n try:\n if issubclass(viewer, Viewer):\n viewer = viewer()\n except TypeError:\n pass # raised if viewer wasn't a class\n if order > 0:\n _viewers.append(viewer)\n elif order < 0:\n _viewers.insert(0, viewer)\n\n##\n# Displays a given image.\n#\n# @param image An image object.\n# @param title Optional title. 
Not all viewers can display the title.\n# @param **options Additional viewer options.\n# @return True if a suitable viewer was found, false otherwise.\n\ndef show(image, title=None, **options):\n for viewer in _viewers:\n if viewer.show(image, title=title, **options):\n return 1\n return 0\n\n##\n# Base class for viewers.\n\nclass Viewer:\n\n # main api\n\n def show(self, image, **options):\n\n # save temporary image to disk\n if image.mode[:4] == \"I;16\":\n # @PIL88 @PIL101\n # \"I;16\" isn't an 'official' mode, but we still want to\n # provide a simple way to show 16-bit images.\n base = \"L\"\n # FIXME: auto-contrast if max() > 255?\n else:\n base = Image.getmodebase(image.mode)\n if base != image.mode and image.mode != \"1\":\n image = image.convert(base)\n\n return self.show_image(image, **options)\n\n # hook methods\n\n format = None\n\n def get_format(self, image):\n # return format name, or None to save as PGM/PPM\n return self.format\n\n def get_command(self, file, **options):\n raise NotImplementedError\n\n def save_image(self, image):\n # save to temporary file, and return filename\n return image._dump(format=self.get_format(image))\n\n def show_image(self, image, **options):\n # display given image\n return self.show_file(self.save_image(image), **options)\n\n def show_file(self, file, **options):\n # display given file\n os.system(self.get_command(file, **options))\n return 1\n\n# --------------------------------------------------------------------\n\nif sys.platform == \"win32\":\n\n class WindowsViewer(Viewer):\n format = \"BMP\"\n def get_command(self, file, **options):\n return (\"start /wait %s && ping -n 2 127.0.0.1 >NUL \"\n \"&& del /f %s\" % (file, file))\n\n register(WindowsViewer)\n\nelif sys.platform == \"darwin\":\n\n class MacViewer(Viewer):\n format = \"BMP\"\n def get_command(self, file, **options):\n # on darwin open returns immediately resulting in the temp\n # file removal while app is opening\n command = \"open -a /Applications/Preview.app\"\n command = \"(%s %s; sleep 20; rm -f %s)&\" % (command, file, file)\n return command\n\n register(MacViewer)\n\nelse:\n\n # unixoids\n\n def which(executable):\n path = os.environ.get(\"PATH\")\n if not path:\n return None\n for dirname in path.split(os.pathsep):\n filename = os.path.join(dirname, executable)\n if os.path.isfile(filename):\n # FIXME: make sure it's executable\n return filename\n return None\n\n class UnixViewer(Viewer):\n def show_file(self, file, **options):\n command, executable = self.get_command_ex(file, **options)\n command = \"(%s %s; rm -f %s)&\" % (command, file, file)\n os.system(command)\n return 1\n\n # implementations\n\n class DisplayViewer(UnixViewer):\n def get_command_ex(self, file, **options):\n command = executable = \"display\"\n return command, executable\n\n if which(\"display\"):\n register(DisplayViewer)\n\n class XVViewer(UnixViewer):\n def get_command_ex(self, file, title=None, **options):\n # note: xv is pretty outdated. most modern systems have\n # imagemagick's display command instead.\n command = executable = \"xv\"\n if title:\n # FIXME: do full escaping\n command = command + \" -name \\\"%s\\\"\" % title\n return command, executable\n\n if which(\"xv\"):\n register(XVViewer)\n\nif __name__ == \"__main__\":\n # usage: python ImageShow.py imagefile [title]\n print(show(Image.open(sys.argv[1]), *sys.argv[2:]))\n", "path": "PIL/ImageShow.py"}]} | 1,844 | 92 |
gh_patches_debug_20118 | rasdani/github-patches | git_diff | google__turbinia-321 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
turbinia_job_graph.py doesn't support new job manager
turbinia_job_graph.py needs to be updated to support the new job manager (from #257).
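
A hedged sketch of the direction (assuming the registry-style manager from #257 exposes `JobsManager.GetJobs()` yielding name/class pairs, which is what the patch below relies on):

```python
from turbinia.jobs import manager as jobs_manager


def iter_job_nodes():
    """Yield (job_name, job_class) pairs from the new job manager."""
    for _, job_class in jobs_manager.JobsManager.GetJobs():
        yield job_class.NAME, job_class
```
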
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `tools/turbinia_job_graph.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # Copyright 2018 Google Inc.
3 #
4 # Licensed under the Apache License, Version 2.0 (the "License");
5 # you may not use this file except in compliance with the License.
6 # You may obtain a copy of the License at
7 #
8 # http://www.apache.org/licenses/LICENSE-2.0
9 #
10 # Unless required by applicable law or agreed to in writing, software
11 # distributed under the License is distributed on an "AS IS" BASIS,
12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 # See the License for the specific language governing permissions and
14 # limitations under the License.
15 """Graph to visualise job/evidence relationships."""
16
17 from __future__ import unicode_literals
18
19 import argparse
20 import graphviz
21
22 from turbinia.jobs import get_jobs as turbinia_jobs
23
24
25 def create_graph():
26 """Create graph of relationships between Turbinia jobs and evidence.
27
28 Returns:
29 Instance of graphviz.dot.Digraph
30 """
31 dot = graphviz.Digraph(comment='Turbinia Evidence graph', format='png')
32 for job in turbinia_jobs():
33 dot.node(job.name)
34 for evidence in job.evidence_input:
35 dot.node(evidence.__name__, shape='box')
36 dot.edge(evidence.__name__, job.name)
37
38 for evidence in job.evidence_output:
39 dot.node(evidence.__name__, shape='box')
40 dot.edge(job.name, evidence.__name__)
41 return dot
42
43
44 if __name__ == '__main__':
45 parser = argparse.ArgumentParser(
46 description='Create Turbinia evidence graph.')
47 parser.add_argument('filename', type=unicode, help='where to save the file')
48 args = parser.parse_args()
49
50 graph = create_graph()
51 output_file = args.filename.replace('.png', '')
52
53 try:
54 rendered_graph = graph.render(filename=output_file, cleanup=True)
55 print('Graph generated and saved to: {0}'.format(rendered_graph))
56 except graphviz.ExecutableNotFound:
57 print('Graphviz is not installed - Run: apt-get install graphviz')
58
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/tools/turbinia_job_graph.py b/tools/turbinia_job_graph.py
--- a/tools/turbinia_job_graph.py
+++ b/tools/turbinia_job_graph.py
@@ -19,7 +19,7 @@
import argparse
import graphviz
-from turbinia.jobs import get_jobs as turbinia_jobs
+from turbinia.jobs import manager as jobs_manager
def create_graph():
@@ -29,15 +29,15 @@
Instance of graphviz.dot.Digraph
"""
dot = graphviz.Digraph(comment='Turbinia Evidence graph', format='png')
- for job in turbinia_jobs():
- dot.node(job.name)
+ for _, job in jobs_manager.JobsManager.GetJobs():
+ dot.node(job.NAME)
for evidence in job.evidence_input:
dot.node(evidence.__name__, shape='box')
- dot.edge(evidence.__name__, job.name)
+ dot.edge(evidence.__name__, job.NAME)
for evidence in job.evidence_output:
dot.node(evidence.__name__, shape='box')
- dot.edge(job.name, evidence.__name__)
+ dot.edge(job.NAME, evidence.__name__)
return dot
| {"golden_diff": "diff --git a/tools/turbinia_job_graph.py b/tools/turbinia_job_graph.py\n--- a/tools/turbinia_job_graph.py\n+++ b/tools/turbinia_job_graph.py\n@@ -19,7 +19,7 @@\n import argparse\n import graphviz\n \n-from turbinia.jobs import get_jobs as turbinia_jobs\n+from turbinia.jobs import manager as jobs_manager\n \n \n def create_graph():\n@@ -29,15 +29,15 @@\n Instance of graphviz.dot.Digraph\n \"\"\"\n dot = graphviz.Digraph(comment='Turbinia Evidence graph', format='png')\n- for job in turbinia_jobs():\n- dot.node(job.name)\n+ for _, job in jobs_manager.JobsManager.GetJobs():\n+ dot.node(job.NAME)\n for evidence in job.evidence_input:\n dot.node(evidence.__name__, shape='box')\n- dot.edge(evidence.__name__, job.name)\n+ dot.edge(evidence.__name__, job.NAME)\n \n for evidence in job.evidence_output:\n dot.node(evidence.__name__, shape='box')\n- dot.edge(job.name, evidence.__name__)\n+ dot.edge(job.NAME, evidence.__name__)\n return dot\n", "issue": "turbinia_job_graph.py doesn't support new job manager\nturbinia_job_graph.py needs to be updated to support the new job manager (from #257).\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright 2018 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Graph to visualise job/evidence relationships.\"\"\"\n\nfrom __future__ import unicode_literals\n\nimport argparse\nimport graphviz\n\nfrom turbinia.jobs import get_jobs as turbinia_jobs\n\n\ndef create_graph():\n \"\"\"Create graph of relationships between Turbinia jobs and evidence.\n\n Returns:\n Instance of graphviz.dot.Digraph\n \"\"\"\n dot = graphviz.Digraph(comment='Turbinia Evidence graph', format='png')\n for job in turbinia_jobs():\n dot.node(job.name)\n for evidence in job.evidence_input:\n dot.node(evidence.__name__, shape='box')\n dot.edge(evidence.__name__, job.name)\n\n for evidence in job.evidence_output:\n dot.node(evidence.__name__, shape='box')\n dot.edge(job.name, evidence.__name__)\n return dot\n\n\nif __name__ == '__main__':\n parser = argparse.ArgumentParser(\n description='Create Turbinia evidence graph.')\n parser.add_argument('filename', type=unicode, help='where to save the file')\n args = parser.parse_args()\n\n graph = create_graph()\n output_file = args.filename.replace('.png', '')\n\n try:\n rendered_graph = graph.render(filename=output_file, cleanup=True)\n print('Graph generated and saved to: {0}'.format(rendered_graph))\n except graphviz.ExecutableNotFound:\n print('Graphviz is not installed - Run: apt-get install graphviz')\n", "path": "tools/turbinia_job_graph.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright 2018 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" 
BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Graph to visualise job/evidence relationships.\"\"\"\n\nfrom __future__ import unicode_literals\n\nimport argparse\nimport graphviz\n\nfrom turbinia.jobs import manager as jobs_manager\n\n\ndef create_graph():\n \"\"\"Create graph of relationships between Turbinia jobs and evidence.\n\n Returns:\n Instance of graphviz.dot.Digraph\n \"\"\"\n dot = graphviz.Digraph(comment='Turbinia Evidence graph', format='png')\n for _, job in jobs_manager.JobsManager.GetJobs():\n dot.node(job.NAME)\n for evidence in job.evidence_input:\n dot.node(evidence.__name__, shape='box')\n dot.edge(evidence.__name__, job.NAME)\n\n for evidence in job.evidence_output:\n dot.node(evidence.__name__, shape='box')\n dot.edge(job.NAME, evidence.__name__)\n return dot\n\n\nif __name__ == '__main__':\n parser = argparse.ArgumentParser(\n description='Create Turbinia evidence graph.')\n parser.add_argument('filename', type=unicode, help='where to save the file')\n args = parser.parse_args()\n\n graph = create_graph()\n output_file = args.filename.replace('.png', '')\n\n try:\n rendered_graph = graph.render(filename=output_file, cleanup=True)\n print('Graph generated and saved to: {0}'.format(rendered_graph))\n except graphviz.ExecutableNotFound:\n print('Graphviz is not installed - Run: apt-get install graphviz')\n", "path": "tools/turbinia_job_graph.py"}]} | 851 | 269 |
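A minimal stand-alone sketch of the job-manager interface that the golden diff above relies on — `GetJobs()` yielding `(name, job_class)` pairs and job classes exposing a `NAME` attribute. Only the method and attribute names come from the diff; `ExampleJob` and its evidence lists are invustrative placeholders, not turbinia's real registry.

```python
# Illustrative stub, not turbinia's actual implementation.
class ExampleJob:
    NAME = "ExampleJob"
    evidence_input = []   # evidence classes this job consumes
    evidence_output = []  # evidence classes this job produces


class JobsManager:
    _registered = {"examplejob": ExampleJob}

    @classmethod
    def GetJobs(cls):
        # Returns (registered_name, job_class) pairs, which is why the patched
        # loop unpacks them as "for _, job in ..." and reads job.NAME.
        return list(cls._registered.items())


for _, job in JobsManager.GetJobs():
    print(job.NAME)  # -> ExampleJob
```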
gh_patches_debug_11740 | rasdani/github-patches | git_diff | locustio__locust-2715 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Locust insists on using IPv6 despite being in an IPv4 stack.
### Prerequisites
- [X] I am using [the latest version of Locust](https://github.com/locustio/locust/releases/)
- [X] I am reporting a bug, not asking a question
### Description
I have an environment that is IPv4-only, but there is an IPv6 address listed in /etc/hosts. When I try to connect to the master using the following command, it results in an error:
```
# locust -f ./p4_basic_code.py --worker --master-host locust-master -L DEBUG
[2024-04-25 06:24:31,138] ip-192-168-0-235.ap-northeast-3.compute.internal/DEBUG/locust.runners: Failed to connect to master locust-master:5557, retry 1/60.
```
By changing [self.socket.setsockopt(zmq.IPV6, 1)](https://github.com/locustio/locust/blob/master/locust/rpc/zmqrpc.py#L18) to 0 resolves the issue.
```
# cat /etc/hosts
127.0.0.1 localhost
192.168.0.235 ip-192-168-0-235.ap-northeast-3.compute.internal
127.255.0.1 locust-master
2600:f0f0:0:0:0:0:0:1 locust-master
```
I recommend detecting whether the current environment supports IPv6 before enabling it.
### Command line
locust -f ./p4_basic_code.py --worker --master-host locust-master -L DEBUG
### Locustfile contents
```python3
N/A
```
### Python version
3.11
### Locust version
2.26.0
### Operating system
Debian 12
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locust/rpc/zmqrpc.py`
Content:
```
1 from locust.exception import RPCError, RPCReceiveError, RPCSendError
2 from locust.util.exception_handler import retry
3
4 import msgpack.exceptions as msgerr
5 import zmq.error as zmqerr
6 import zmq.green as zmq
7
8 from .protocol import Message
9
10
11 class BaseSocket:
12 def __init__(self, sock_type):
13 context = zmq.Context()
14 self.socket = context.socket(sock_type)
15
16 self.socket.setsockopt(zmq.TCP_KEEPALIVE, 1)
17 self.socket.setsockopt(zmq.TCP_KEEPALIVE_IDLE, 30)
18 self.socket.setsockopt(zmq.IPV6, 1)
19
20 @retry()
21 def send(self, msg):
22 try:
23 self.socket.send(msg.serialize(), zmq.NOBLOCK)
24 except zmqerr.ZMQError as e:
25 raise RPCSendError("ZMQ sent failure") from e
26
27 @retry()
28 def send_to_client(self, msg):
29 try:
30 self.socket.send_multipart([msg.node_id.encode(), msg.serialize()])
31 except zmqerr.ZMQError as e:
32 raise RPCSendError("ZMQ sent failure") from e
33
34 def recv(self):
35 try:
36 data = self.socket.recv()
37 msg = Message.unserialize(data)
38 except msgerr.ExtraData as e:
39 raise RPCReceiveError("ZMQ interrupted message") from e
40 except zmqerr.ZMQError as e:
41 raise RPCError("ZMQ network broken") from e
42 return msg
43
44 def recv_from_client(self):
45 try:
46 data = self.socket.recv_multipart()
47 addr = data[0].decode()
48 except UnicodeDecodeError as e:
49 raise RPCReceiveError("ZMQ interrupted or corrupted message") from e
50 except zmqerr.ZMQError as e:
51 raise RPCError("ZMQ network broken") from e
52 try:
53 msg = Message.unserialize(data[1])
54 except (UnicodeDecodeError, msgerr.ExtraData) as e:
55 raise RPCReceiveError("ZMQ interrupted or corrupted message", addr=addr) from e
56 return addr, msg
57
58 def close(self, linger=None):
59 self.socket.close(linger=linger)
60
61
62 class Server(BaseSocket):
63 def __init__(self, host, port):
64 BaseSocket.__init__(self, zmq.ROUTER)
65 if port == 0:
66 self.port = self.socket.bind_to_random_port(f"tcp://{host}")
67 else:
68 try:
69 self.socket.bind("tcp://%s:%i" % (host, port))
70 self.port = port
71 except zmqerr.ZMQError as e:
72 raise RPCError(f"Socket bind failure: {e}")
73
74
75 class Client(BaseSocket):
76 def __init__(self, host, port, identity):
77 BaseSocket.__init__(self, zmq.DEALER)
78 self.socket.setsockopt(zmq.IDENTITY, identity.encode())
79 self.socket.connect("tcp://%s:%i" % (host, port))
80
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/locust/rpc/zmqrpc.py b/locust/rpc/zmqrpc.py
--- a/locust/rpc/zmqrpc.py
+++ b/locust/rpc/zmqrpc.py
@@ -4,6 +4,7 @@
import msgpack.exceptions as msgerr
import zmq.error as zmqerr
import zmq.green as zmq
+from urllib3.util.connection import HAS_IPV6
from .protocol import Message
@@ -15,7 +16,8 @@
self.socket.setsockopt(zmq.TCP_KEEPALIVE, 1)
self.socket.setsockopt(zmq.TCP_KEEPALIVE_IDLE, 30)
- self.socket.setsockopt(zmq.IPV6, 1)
+ if HAS_IPV6:
+ self.socket.setsockopt(zmq.IPV6, 1)
@retry()
def send(self, msg):
| {"golden_diff": "diff --git a/locust/rpc/zmqrpc.py b/locust/rpc/zmqrpc.py\n--- a/locust/rpc/zmqrpc.py\n+++ b/locust/rpc/zmqrpc.py\n@@ -4,6 +4,7 @@\n import msgpack.exceptions as msgerr\n import zmq.error as zmqerr\n import zmq.green as zmq\n+from urllib3.util.connection import HAS_IPV6\n \n from .protocol import Message\n \n@@ -15,7 +16,8 @@\n \n self.socket.setsockopt(zmq.TCP_KEEPALIVE, 1)\n self.socket.setsockopt(zmq.TCP_KEEPALIVE_IDLE, 30)\n- self.socket.setsockopt(zmq.IPV6, 1)\n+ if HAS_IPV6:\n+ self.socket.setsockopt(zmq.IPV6, 1)\n \n @retry()\n def send(self, msg):\n", "issue": "Locust insists on using IPv6 despite being in an IPv4 stack.\n### Prerequisites\r\n\r\n- [X] I am using [the latest version of Locust](https://github.com/locustio/locust/releases/)\r\n- [X] I am reporting a bug, not asking a question\r\n\r\n### Description\r\n\r\nI have an environment that is IPv4-only, but there is an IPv6 address listed in /etc/hosts. When I try to connect to the master using the following command, it results in an error:\r\n\r\n```\r\n# locust -f ./p4_basic_code.py --worker --master-host locust-master -L DEBUG\r\n[2024-04-25 06:24:31,138] ip-192-168-0-235.ap-northeast-3.compute.internal/DEBUG/locust.runners: Failed to connect to master locust-master:5557, retry 1/60.\r\n```\r\n\r\nBy changing [self.socket.setsockopt(zmq.IPV6, 1)](https://github.com/locustio/locust/blob/master/locust/rpc/zmqrpc.py#L18) to 0 resolves the issue.\r\n```\r\n# cat /etc/hosts\r\n127.0.0.1 localhost\r\n192.168.0.235 ip-192-168-0-235.ap-northeast-3.compute.internal\r\n127.255.0.1 locust-master\r\n2600:f0f0:0:0:0:0:0:1 locust-master\r\n```\r\n\r\nI recommend detecting whether the current environment supports IPv6 before enabling it.\r\n\r\n### Command line\r\n\r\nlocust -f ./p4_basic_code.py --worker --master-host locust-master -L DEBUG\r\n\r\n### Locustfile contents\r\n\r\n```python3\r\nN/A\r\n```\r\n\r\n\r\n### Python version\r\n\r\n3.11\r\n\r\n### Locust version\r\n\r\n2.26.0\r\n\r\n### Operating system\r\n\r\nDebian 12\n", "before_files": [{"content": "from locust.exception import RPCError, RPCReceiveError, RPCSendError\nfrom locust.util.exception_handler import retry\n\nimport msgpack.exceptions as msgerr\nimport zmq.error as zmqerr\nimport zmq.green as zmq\n\nfrom .protocol import Message\n\n\nclass BaseSocket:\n def __init__(self, sock_type):\n context = zmq.Context()\n self.socket = context.socket(sock_type)\n\n self.socket.setsockopt(zmq.TCP_KEEPALIVE, 1)\n self.socket.setsockopt(zmq.TCP_KEEPALIVE_IDLE, 30)\n self.socket.setsockopt(zmq.IPV6, 1)\n\n @retry()\n def send(self, msg):\n try:\n self.socket.send(msg.serialize(), zmq.NOBLOCK)\n except zmqerr.ZMQError as e:\n raise RPCSendError(\"ZMQ sent failure\") from e\n\n @retry()\n def send_to_client(self, msg):\n try:\n self.socket.send_multipart([msg.node_id.encode(), msg.serialize()])\n except zmqerr.ZMQError as e:\n raise RPCSendError(\"ZMQ sent failure\") from e\n\n def recv(self):\n try:\n data = self.socket.recv()\n msg = Message.unserialize(data)\n except msgerr.ExtraData as e:\n raise RPCReceiveError(\"ZMQ interrupted message\") from e\n except zmqerr.ZMQError as e:\n raise RPCError(\"ZMQ network broken\") from e\n return msg\n\n def recv_from_client(self):\n try:\n data = self.socket.recv_multipart()\n addr = data[0].decode()\n except UnicodeDecodeError as e:\n raise RPCReceiveError(\"ZMQ interrupted or corrupted message\") from e\n except zmqerr.ZMQError as e:\n raise RPCError(\"ZMQ network broken\") from e\n try:\n msg = 
Message.unserialize(data[1])\n except (UnicodeDecodeError, msgerr.ExtraData) as e:\n raise RPCReceiveError(\"ZMQ interrupted or corrupted message\", addr=addr) from e\n return addr, msg\n\n def close(self, linger=None):\n self.socket.close(linger=linger)\n\n\nclass Server(BaseSocket):\n def __init__(self, host, port):\n BaseSocket.__init__(self, zmq.ROUTER)\n if port == 0:\n self.port = self.socket.bind_to_random_port(f\"tcp://{host}\")\n else:\n try:\n self.socket.bind(\"tcp://%s:%i\" % (host, port))\n self.port = port\n except zmqerr.ZMQError as e:\n raise RPCError(f\"Socket bind failure: {e}\")\n\n\nclass Client(BaseSocket):\n def __init__(self, host, port, identity):\n BaseSocket.__init__(self, zmq.DEALER)\n self.socket.setsockopt(zmq.IDENTITY, identity.encode())\n self.socket.connect(\"tcp://%s:%i\" % (host, port))\n", "path": "locust/rpc/zmqrpc.py"}], "after_files": [{"content": "from locust.exception import RPCError, RPCReceiveError, RPCSendError\nfrom locust.util.exception_handler import retry\n\nimport msgpack.exceptions as msgerr\nimport zmq.error as zmqerr\nimport zmq.green as zmq\nfrom urllib3.util.connection import HAS_IPV6\n\nfrom .protocol import Message\n\n\nclass BaseSocket:\n def __init__(self, sock_type):\n context = zmq.Context()\n self.socket = context.socket(sock_type)\n\n self.socket.setsockopt(zmq.TCP_KEEPALIVE, 1)\n self.socket.setsockopt(zmq.TCP_KEEPALIVE_IDLE, 30)\n if HAS_IPV6:\n self.socket.setsockopt(zmq.IPV6, 1)\n\n @retry()\n def send(self, msg):\n try:\n self.socket.send(msg.serialize(), zmq.NOBLOCK)\n except zmqerr.ZMQError as e:\n raise RPCSendError(\"ZMQ sent failure\") from e\n\n @retry()\n def send_to_client(self, msg):\n try:\n self.socket.send_multipart([msg.node_id.encode(), msg.serialize()])\n except zmqerr.ZMQError as e:\n raise RPCSendError(\"ZMQ sent failure\") from e\n\n def recv(self):\n try:\n data = self.socket.recv()\n msg = Message.unserialize(data)\n except msgerr.ExtraData as e:\n raise RPCReceiveError(\"ZMQ interrupted message\") from e\n except zmqerr.ZMQError as e:\n raise RPCError(\"ZMQ network broken\") from e\n return msg\n\n def recv_from_client(self):\n try:\n data = self.socket.recv_multipart()\n addr = data[0].decode()\n except UnicodeDecodeError as e:\n raise RPCReceiveError(\"ZMQ interrupted or corrupted message\") from e\n except zmqerr.ZMQError as e:\n raise RPCError(\"ZMQ network broken\") from e\n try:\n msg = Message.unserialize(data[1])\n except (UnicodeDecodeError, msgerr.ExtraData) as e:\n raise RPCReceiveError(\"ZMQ interrupted or corrupted message\", addr=addr) from e\n return addr, msg\n\n def close(self, linger=None):\n self.socket.close(linger=linger)\n\n\nclass Server(BaseSocket):\n def __init__(self, host, port):\n BaseSocket.__init__(self, zmq.ROUTER)\n if port == 0:\n self.port = self.socket.bind_to_random_port(f\"tcp://{host}\")\n else:\n try:\n self.socket.bind(\"tcp://%s:%i\" % (host, port))\n self.port = port\n except zmqerr.ZMQError as e:\n raise RPCError(f\"Socket bind failure: {e}\")\n\n\nclass Client(BaseSocket):\n def __init__(self, host, port, identity):\n BaseSocket.__init__(self, zmq.DEALER)\n self.socket.setsockopt(zmq.IDENTITY, identity.encode())\n self.socket.connect(\"tcp://%s:%i\" % (host, port))\n", "path": "locust/rpc/zmqrpc.py"}]} | 1,494 | 196 |
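A rough, self-contained approximation of the `HAS_IPV6` flag the fix above imports from `urllib3.util.connection`: try to bind an IPv6 socket and treat any failure as "no IPv6". The probe urllib3 actually performs may differ in detail; this only shows the idea of checking the stack before calling `setsockopt(zmq.IPV6, 1)`.

```python
import socket


def has_ipv6(host: str = "::1") -> bool:
    """Best-effort check that the local stack can actually bind an IPv6 socket."""
    if not socket.has_ipv6:
        return False
    sock = None
    try:
        sock = socket.socket(socket.AF_INET6)
        sock.bind((host, 0))
        return True
    except OSError:
        return False
    finally:
        if sock is not None:
            sock.close()


print(has_ipv6())  # False in an IPv4-only environment such as the reporter's
```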
gh_patches_debug_1391 | rasdani/github-patches | git_diff | opsdroid__opsdroid-1683 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
skill-seen broken with redis database?
I've been testing opsdroid with a redis database and the seen skill appears to be having problems serializing python datetime objects.
user: when did you last see user?
opsdroid: Whoops there has been an error.
opsdroid: Check the log for details.
this is the opsdroid log with DEBUG logging enabled:
```
notrexroof_1 | DEBUG opsdroid.memory: Putting seen to memory.
notrexroof_1 | DEBUG opsdroid.database.redis: Putting seen into Redis.
notrexroof_1 | ERROR opsdroid.core: Exception when running skill 'seen'.
notrexroof_1 | Traceback (most recent call last):
notrexroof_1 | File "/usr/local/lib/python3.8/site-packages/opsdroid/core.py", line 427, in run_skill
notrexroof_1 | return await skill(self, config, event)
notrexroof_1 | File "/root/.local/share/opsdroid/opsdroid-modules/skill/seen/__init__.py", line 16, in last_seen
notrexroof_1 | await message.respond("I last saw {} {}".format(name, human(seen[name], precision=1)))
notrexroof_1 | File "/root/.local/share/opsdroid/site-packages/ago.py", line 55, in human
notrexroof_1 | delta = get_delta_from_subject(subject)
notrexroof_1 | File "/root/.local/share/opsdroid/site-packages/ago.py", line 16, in get_delta_from_subject
notrexroof_1 | subject = float(subject)
notrexroof_1 | TypeError: float() argument must be a string or a number, not 'dict'
```
I know this hasn't been touched in a few years, but I'm wondering if there is a general issue with serializing objects into a redis database within opsdroid.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `opsdroid/database/redis/__init__.py`
Content:
```
1 """Module for storing data within Redis."""
2 import json
3 import logging
4
5 import aioredis
6 from aioredis import parser
7 from voluptuous import Any
8
9 from opsdroid.database import Database
10 from opsdroid.helper import JSONEncoder, JSONDecoder
11
12 _LOGGER = logging.getLogger(__name__)
13 CONFIG_SCHEMA = {"host": str, "port": Any(int, str), "database": int, "password": str}
14
15
16 class RedisDatabase(Database):
17 """Database class for storing data within a Redis instance."""
18
19 def __init__(self, config, opsdroid=None):
20 """Initialise the redis database.
21
22 Set basic properties of the database. Initialise properties like
23 name, connection arguments, database file, table name and config.
24
25 Args:
26 config (dict): The configuration of the database which consists
27 of `file` and `table` name of the sqlite database
28 specified in `configuration.yaml` file.
29 opsdroid (OpsDroid): An instance of opsdroid.core.
30
31 """
32 super().__init__(config, opsdroid=opsdroid)
33 self.config = config
34 self.client = None
35 self.host = self.config.get("host", "localhost")
36 self.port = self.config.get("port", 6379)
37 self.database = self.config.get("database", 0)
38 self.password = self.config.get("password", None)
39 _LOGGER.debug(_("Loaded Redis database connector."))
40
41 async def connect(self):
42 """Connect to the database.
43
44 This method will connect to a Redis database. By default it will
45 connect to Redis on localhost on port 6379
46
47 """
48 try:
49 self.client = await aioredis.create_pool(
50 address=(self.host, int(self.port)),
51 db=self.database,
52 password=self.password,
53 parser=parser.PyReader,
54 )
55
56 _LOGGER.info(
57 _("Connected to Redis database %s from %s on port %s."),
58 self.database,
59 self.host,
60 self.port,
61 )
62 except OSError:
63 _LOGGER.warning(
64 _("Unable to connect to Redis database on address: %s port: %s."),
65 self.host,
66 self.port,
67 )
68
69 async def put(self, key, data):
70 """Store the data object in Redis against the key.
71
72 Args:
73 key (string): The key to store the data object under.
74 data (object): The data object to store.
75
76 """
77 if self.client:
78 _LOGGER.debug(_("Putting %s into Redis."), key)
79 await self.client.execute("SET", key, json.dumps(data, cls=JSONEncoder))
80
81 async def get(self, key):
82 """Get data from Redis for a given key.
83
84 Args:
85 key (string): The key to lookup in the database.
86
87 Returns:
88 object or None: The data object stored for that key, or None if no
89 object found for that key.
90
91 """
92 if self.client:
93 _LOGGER.debug(_("Getting %s from Redis."), key)
94 data = await self.client.execute("GET", key)
95
96 if data:
97 return json.loads(data, encoding=JSONDecoder)
98
99 return None
100
101 async def delete(self, key):
102 """Delete data from Redis for a given key.
103
104 Args:
105 key (string): The key to delete in the database.
106
107 """
108 if self.client:
109 _LOGGER.debug(_("Deleting %s from Redis."), key)
110 await self.client.execute("DEL", key)
111
112 async def disconnect(self):
113 """Disconnect from the database."""
114 if self.client:
115 self.client.close()
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/opsdroid/database/redis/__init__.py b/opsdroid/database/redis/__init__.py
--- a/opsdroid/database/redis/__init__.py
+++ b/opsdroid/database/redis/__init__.py
@@ -94,7 +94,7 @@
data = await self.client.execute("GET", key)
if data:
- return json.loads(data, encoding=JSONDecoder)
+ return json.loads(data, object_hook=JSONDecoder())
return None
| {"golden_diff": "diff --git a/opsdroid/database/redis/__init__.py b/opsdroid/database/redis/__init__.py\n--- a/opsdroid/database/redis/__init__.py\n+++ b/opsdroid/database/redis/__init__.py\n@@ -94,7 +94,7 @@\n data = await self.client.execute(\"GET\", key)\n \n if data:\n- return json.loads(data, encoding=JSONDecoder)\n+ return json.loads(data, object_hook=JSONDecoder())\n \n return None\n", "issue": "skill-seen broken with redis database?\nI've been testing opsdroid with a redis database and the seen skill appears to be having problems serializing python datetime objects.\r\n\r\nuser: when did you last see user?\r\nopsdroid: Whoops there has been an error.\r\nopsdroid: Check the log for details.\r\n\r\nthis is the opsdroid log with DEBUG logging enabled:\r\n\r\n```\r\nnotrexroof_1 | DEBUG opsdroid.memory: Putting seen to memory.\r\nnotrexroof_1 | DEBUG opsdroid.database.redis: Putting seen into Redis.\r\nnotrexroof_1 | ERROR opsdroid.core: Exception when running skill 'seen'.\r\nnotrexroof_1 | Traceback (most recent call last):\r\nnotrexroof_1 | File \"/usr/local/lib/python3.8/site-packages/opsdroid/core.py\", line 427, in run_skill\r\nnotrexroof_1 | return await skill(self, config, event)\r\nnotrexroof_1 | File \"/root/.local/share/opsdroid/opsdroid-modules/skill/seen/__init__.py\", line 16, in last_seen\r\nnotrexroof_1 | await message.respond(\"I last saw {} {}\".format(name, human(seen[name], precision=1)))\r\nnotrexroof_1 | File \"/root/.local/share/opsdroid/site-packages/ago.py\", line 55, in human\r\nnotrexroof_1 | delta = get_delta_from_subject(subject)\r\nnotrexroof_1 | File \"/root/.local/share/opsdroid/site-packages/ago.py\", line 16, in get_delta_from_subject\r\nnotrexroof_1 | subject = float(subject)\r\nnotrexroof_1 | TypeError: float() argument must be a string or a number, not 'dict'\r\n```\r\n\r\nI know this hasn't been touched in a few years, but I'm wondering if there is a general issue with serializing objects into a redis database within opsdroid.\r\n\r\n\n", "before_files": [{"content": "\"\"\"Module for storing data within Redis.\"\"\"\nimport json\nimport logging\n\nimport aioredis\nfrom aioredis import parser\nfrom voluptuous import Any\n\nfrom opsdroid.database import Database\nfrom opsdroid.helper import JSONEncoder, JSONDecoder\n\n_LOGGER = logging.getLogger(__name__)\nCONFIG_SCHEMA = {\"host\": str, \"port\": Any(int, str), \"database\": int, \"password\": str}\n\n\nclass RedisDatabase(Database):\n \"\"\"Database class for storing data within a Redis instance.\"\"\"\n\n def __init__(self, config, opsdroid=None):\n \"\"\"Initialise the redis database.\n\n Set basic properties of the database. Initialise properties like\n name, connection arguments, database file, table name and config.\n\n Args:\n config (dict): The configuration of the database which consists\n of `file` and `table` name of the sqlite database\n specified in `configuration.yaml` file.\n opsdroid (OpsDroid): An instance of opsdroid.core.\n\n \"\"\"\n super().__init__(config, opsdroid=opsdroid)\n self.config = config\n self.client = None\n self.host = self.config.get(\"host\", \"localhost\")\n self.port = self.config.get(\"port\", 6379)\n self.database = self.config.get(\"database\", 0)\n self.password = self.config.get(\"password\", None)\n _LOGGER.debug(_(\"Loaded Redis database connector.\"))\n\n async def connect(self):\n \"\"\"Connect to the database.\n\n This method will connect to a Redis database. 
By default it will\n connect to Redis on localhost on port 6379\n\n \"\"\"\n try:\n self.client = await aioredis.create_pool(\n address=(self.host, int(self.port)),\n db=self.database,\n password=self.password,\n parser=parser.PyReader,\n )\n\n _LOGGER.info(\n _(\"Connected to Redis database %s from %s on port %s.\"),\n self.database,\n self.host,\n self.port,\n )\n except OSError:\n _LOGGER.warning(\n _(\"Unable to connect to Redis database on address: %s port: %s.\"),\n self.host,\n self.port,\n )\n\n async def put(self, key, data):\n \"\"\"Store the data object in Redis against the key.\n\n Args:\n key (string): The key to store the data object under.\n data (object): The data object to store.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Putting %s into Redis.\"), key)\n await self.client.execute(\"SET\", key, json.dumps(data, cls=JSONEncoder))\n\n async def get(self, key):\n \"\"\"Get data from Redis for a given key.\n\n Args:\n key (string): The key to lookup in the database.\n\n Returns:\n object or None: The data object stored for that key, or None if no\n object found for that key.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Getting %s from Redis.\"), key)\n data = await self.client.execute(\"GET\", key)\n\n if data:\n return json.loads(data, encoding=JSONDecoder)\n\n return None\n\n async def delete(self, key):\n \"\"\"Delete data from Redis for a given key.\n\n Args:\n key (string): The key to delete in the database.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Deleting %s from Redis.\"), key)\n await self.client.execute(\"DEL\", key)\n\n async def disconnect(self):\n \"\"\"Disconnect from the database.\"\"\"\n if self.client:\n self.client.close()\n", "path": "opsdroid/database/redis/__init__.py"}], "after_files": [{"content": "\"\"\"Module for storing data within Redis.\"\"\"\nimport json\nimport logging\n\nimport aioredis\nfrom aioredis import parser\nfrom voluptuous import Any\n\nfrom opsdroid.database import Database\nfrom opsdroid.helper import JSONEncoder, JSONDecoder\n\n_LOGGER = logging.getLogger(__name__)\nCONFIG_SCHEMA = {\"host\": str, \"port\": Any(int, str), \"database\": int, \"password\": str}\n\n\nclass RedisDatabase(Database):\n \"\"\"Database class for storing data within a Redis instance.\"\"\"\n\n def __init__(self, config, opsdroid=None):\n \"\"\"Initialise the redis database.\n\n Set basic properties of the database. Initialise properties like\n name, connection arguments, database file, table name and config.\n\n Args:\n config (dict): The configuration of the database which consists\n of `file` and `table` name of the sqlite database\n specified in `configuration.yaml` file.\n opsdroid (OpsDroid): An instance of opsdroid.core.\n\n \"\"\"\n super().__init__(config, opsdroid=opsdroid)\n self.config = config\n self.client = None\n self.host = self.config.get(\"host\", \"localhost\")\n self.port = self.config.get(\"port\", 6379)\n self.database = self.config.get(\"database\", 0)\n self.password = self.config.get(\"password\", None)\n _LOGGER.debug(_(\"Loaded Redis database connector.\"))\n\n async def connect(self):\n \"\"\"Connect to the database.\n\n This method will connect to a Redis database. 
By default it will\n connect to Redis on localhost on port 6379\n\n \"\"\"\n try:\n self.client = await aioredis.create_pool(\n address=(self.host, int(self.port)),\n db=self.database,\n password=self.password,\n parser=parser.PyReader,\n )\n\n _LOGGER.info(\n _(\"Connected to Redis database %s from %s on port %s.\"),\n self.database,\n self.host,\n self.port,\n )\n except OSError:\n _LOGGER.warning(\n _(\"Unable to connect to Redis database on address: %s port: %s.\"),\n self.host,\n self.port,\n )\n\n async def put(self, key, data):\n \"\"\"Store the data object in Redis against the key.\n\n Args:\n key (string): The key to store the data object under.\n data (object): The data object to store.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Putting %s into Redis.\"), key)\n await self.client.execute(\"SET\", key, json.dumps(data, cls=JSONEncoder))\n\n async def get(self, key):\n \"\"\"Get data from Redis for a given key.\n\n Args:\n key (string): The key to lookup in the database.\n\n Returns:\n object or None: The data object stored for that key, or None if no\n object found for that key.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Getting %s from Redis.\"), key)\n data = await self.client.execute(\"GET\", key)\n\n if data:\n return json.loads(data, object_hook=JSONDecoder())\n\n return None\n\n async def delete(self, key):\n \"\"\"Delete data from Redis for a given key.\n\n Args:\n key (string): The key to delete in the database.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Deleting %s from Redis.\"), key)\n await self.client.execute(\"DEL\", key)\n\n async def disconnect(self):\n \"\"\"Disconnect from the database.\"\"\"\n if self.client:\n self.client.close()\n", "path": "opsdroid/database/redis/__init__.py"}]} | 1,717 | 115 |
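The heart of the fix above is swapping `json.loads(data, encoding=JSONDecoder)` for `json.loads(data, object_hook=JSONDecoder())`: `encoding` is not a meaningful argument to `json.loads`, while `object_hook` is what lets custom types such as `datetime` be revived on the way out of Redis. A minimal round-trip with a hypothetical marker format — opsdroid's own `JSONEncoder`/`JSONDecoder` encode datetimes differently in detail:

```python
import json
from datetime import datetime


class DateTimeEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, datetime):
            return {"__datetime__": obj.isoformat()}
        return super().default(obj)


def datetime_hook(dct):
    # Called for every decoded JSON object; revive our datetime marker.
    if "__datetime__" in dct:
        return datetime.fromisoformat(dct["__datetime__"])
    return dct


payload = json.dumps({"seen": {"alice": datetime(2020, 11, 1, 12, 30)}}, cls=DateTimeEncoder)
restored = json.loads(payload, object_hook=datetime_hook)
assert isinstance(restored["seen"]["alice"], datetime)  # usable by ago.human() again
```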
gh_patches_debug_13451 | rasdani/github-patches | git_diff | CTPUG__wafer-312 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Schedule Editor does not clear extra fields on existing items
When replacing an existing item in the schedule editor, the notes, css_class and details fields are not replaced or cleared.
While leaving css_class untouched can be useful, it is surprising behaviour and usually the wrong thing to do for notes and details. 
I think we must clear notes and details when this happens.
We should probably clear all extra fields when replacing an item, for the sake of predictability.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wafer/schedule/serializers.py`
Content:
```
1 from rest_framework import serializers
2
3 from wafer.talks.models import Talk
4 from wafer.pages.models import Page
5 from wafer.schedule.models import ScheduleItem, Venue, Slot
6
7
8 class ScheduleItemSerializer(serializers.HyperlinkedModelSerializer):
9 page = serializers.PrimaryKeyRelatedField(
10 allow_null=True, queryset=Page.objects.all())
11 talk = serializers.PrimaryKeyRelatedField(
12 allow_null=True, queryset=Talk.objects.all())
13 venue = serializers.PrimaryKeyRelatedField(
14 allow_null=True, queryset=Venue.objects.all())
15 slots = serializers.PrimaryKeyRelatedField(
16 allow_null=True, many=True, queryset=Slot.objects.all())
17
18 class Meta:
19 model = ScheduleItem
20 fields = ('id', 'talk', 'page', 'venue', 'slots')
21
22 def create(self, validated_data):
23 venue_id = validated_data['venue']
24 slots = validated_data['slots']
25 talk = validated_data.get('talk')
26 page = validated_data.get('page')
27
28 try:
29 existing_schedule_item = ScheduleItem.objects.get(
30 venue_id=venue_id, slots__in=slots)
31 except ScheduleItem.DoesNotExist:
32 pass
33 else:
34 existing_schedule_item.talk = talk
35 existing_schedule_item.page = page
36 existing_schedule_item.slots = slots
37 existing_schedule_item.save()
38 return existing_schedule_item
39 return super(ScheduleItemSerializer, self).create(validated_data)
40
41
42
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/wafer/schedule/serializers.py b/wafer/schedule/serializers.py
--- a/wafer/schedule/serializers.py
+++ b/wafer/schedule/serializers.py
@@ -34,6 +34,12 @@
existing_schedule_item.talk = talk
existing_schedule_item.page = page
existing_schedule_item.slots = slots
+ # Clear any existing details that aren't editable by the
+ # schedule edit view
+ existing_schedule_item.details = ''
+ existing_schedule_item.notes = ''
+ existing_schedule_item.css_class = ''
+ existing_schedule_item.expand = False
existing_schedule_item.save()
return existing_schedule_item
return super(ScheduleItemSerializer, self).create(validated_data)
| {"golden_diff": "diff --git a/wafer/schedule/serializers.py b/wafer/schedule/serializers.py\n--- a/wafer/schedule/serializers.py\n+++ b/wafer/schedule/serializers.py\n@@ -34,6 +34,12 @@\n existing_schedule_item.talk = talk\n existing_schedule_item.page = page\n existing_schedule_item.slots = slots\n+ # Clear any existing details that aren't editable by the\n+ # schedule edit view\n+ existing_schedule_item.details = ''\n+ existing_schedule_item.notes = ''\n+ existing_schedule_item.css_class = ''\n+ existing_schedule_item.expand = False\n existing_schedule_item.save()\n return existing_schedule_item\n return super(ScheduleItemSerializer, self).create(validated_data)\n", "issue": "Schedule Editor does not clear extra fields on existing items\nWhen replacing an existing item in the schedule editor, the notes, css_class and details fields are not replaced or cleared.\n\nWhile this can be useful to leave css_class untouched, it is surprising behaviour and usually the wrong thing to do for notes and details. \n\nI think we must clear notes and details when this happens.\n\nWe should probably clear all extra fields when replacing an item, for the sake of predictablity,\n\n", "before_files": [{"content": "from rest_framework import serializers\n\nfrom wafer.talks.models import Talk\nfrom wafer.pages.models import Page\nfrom wafer.schedule.models import ScheduleItem, Venue, Slot\n\n\nclass ScheduleItemSerializer(serializers.HyperlinkedModelSerializer):\n page = serializers.PrimaryKeyRelatedField(\n allow_null=True, queryset=Page.objects.all())\n talk = serializers.PrimaryKeyRelatedField(\n allow_null=True, queryset=Talk.objects.all())\n venue = serializers.PrimaryKeyRelatedField(\n allow_null=True, queryset=Venue.objects.all())\n slots = serializers.PrimaryKeyRelatedField(\n allow_null=True, many=True, queryset=Slot.objects.all())\n\n class Meta:\n model = ScheduleItem\n fields = ('id', 'talk', 'page', 'venue', 'slots')\n\n def create(self, validated_data):\n venue_id = validated_data['venue']\n slots = validated_data['slots']\n talk = validated_data.get('talk')\n page = validated_data.get('page')\n\n try:\n existing_schedule_item = ScheduleItem.objects.get(\n venue_id=venue_id, slots__in=slots)\n except ScheduleItem.DoesNotExist:\n pass\n else:\n existing_schedule_item.talk = talk\n existing_schedule_item.page = page\n existing_schedule_item.slots = slots\n existing_schedule_item.save()\n return existing_schedule_item\n return super(ScheduleItemSerializer, self).create(validated_data)\n\n\n", "path": "wafer/schedule/serializers.py"}], "after_files": [{"content": "from rest_framework import serializers\n\nfrom wafer.talks.models import Talk\nfrom wafer.pages.models import Page\nfrom wafer.schedule.models import ScheduleItem, Venue, Slot\n\n\nclass ScheduleItemSerializer(serializers.HyperlinkedModelSerializer):\n page = serializers.PrimaryKeyRelatedField(\n allow_null=True, queryset=Page.objects.all())\n talk = serializers.PrimaryKeyRelatedField(\n allow_null=True, queryset=Talk.objects.all())\n venue = serializers.PrimaryKeyRelatedField(\n allow_null=True, queryset=Venue.objects.all())\n slots = serializers.PrimaryKeyRelatedField(\n allow_null=True, many=True, queryset=Slot.objects.all())\n\n class Meta:\n model = ScheduleItem\n fields = ('id', 'talk', 'page', 'venue', 'slots')\n\n def create(self, validated_data):\n venue_id = validated_data['venue']\n slots = validated_data['slots']\n talk = validated_data.get('talk')\n page = validated_data.get('page')\n\n try:\n existing_schedule_item = 
ScheduleItem.objects.get(\n venue_id=venue_id, slots__in=slots)\n except ScheduleItem.DoesNotExist:\n pass\n else:\n existing_schedule_item.talk = talk\n existing_schedule_item.page = page\n existing_schedule_item.slots = slots\n # Clear any existing details that aren't editable by the\n # schedule edit view\n existing_schedule_item.details = ''\n existing_schedule_item.notes = ''\n existing_schedule_item.css_class = ''\n existing_schedule_item.expand = False\n existing_schedule_item.save()\n return existing_schedule_item\n return super(ScheduleItemSerializer, self).create(validated_data)\n\n\n", "path": "wafer/schedule/serializers.py"}]} | 730 | 169 |
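The patch above resets `details`, `notes`, `css_class` and `expand` whenever an existing item is overwritten. The same "replacing means resetting everything the editor cannot see" rule can be expressed as data instead of four assignments — a hypothetical refactor for illustration, not what the project actually ships:

```python
# Fields the schedule editor cannot set, with the values to reset them to.
RESET_ON_REPLACE = {"details": "", "notes": "", "css_class": "", "expand": False}


def reset_uneditable_fields(schedule_item) -> None:
    """Clear everything a replaced schedule item should not silently inherit."""
    for field, default in RESET_ON_REPLACE.items():
        setattr(schedule_item, field, default)
```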
gh_patches_debug_827 | rasdani/github-patches | git_diff | elastic__apm-agent-python-1558 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 369: invalid start byte
**Describe the bug**:
Sending a PUT request to FastAPI with an encoded binary file, it returns an error.
**To Reproduce**
1. Configure FastAPI with elasticapm.contrib.starlette
2. Send a PUT request with a binary file
**Environment (please complete the following information)**
- OS: Linux
- Python version: 3.8
- Framework and version: fastapi 0.61.2
- APM Server version: 8
- Agent version: 8
The same problem was resolved in the main code: [#344](https://github.com/elastic/apm-agent-python/issues/334)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `elasticapm/contrib/starlette/utils.py`
Content:
```
1 # BSD 3-Clause License
2 #
3 # Copyright (c) 2019, Elasticsearch BV
4 # All rights reserved.
5 #
6 # Redistribution and use in source and binary forms, with or without
7 # modification, are permitted provided that the following conditions are met:
8 #
9 # * Redistributions of source code must retain the above copyright notice, this
10 # list of conditions and the following disclaimer.
11 #
12 # * Redistributions in binary form must reproduce the above copyright notice,
13 # this list of conditions and the following disclaimer in the documentation
14 # and/or other materials provided with the distribution.
15 #
16 # * Neither the name of the copyright holder nor the names of its
17 # contributors may be used to endorse or promote products derived from
18 # this software without specific prior written permission.
19 #
20 # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
21 # AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
22 # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
23 # DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
24 # FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
25 # DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
26 # SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
27 # CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
28 # OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
29 # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
30 import asyncio
31
32 from starlette.datastructures import Headers
33 from starlette.requests import Request
34 from starlette.types import Message
35
36 from elasticapm.conf import Config, constants
37 from elasticapm.utils import get_url_dict
38
39
40 async def get_data_from_request(request: Request, config: Config, event_type: str) -> dict:
41 """Loads data from incoming request for APM capturing.
42
43 Args:
44 request (Request)
45 config (Config)
46 event_type (str)
47
48 Returns:
49 dict
50 """
51 result = {
52 "method": request.method,
53 "socket": {"remote_address": _get_client_ip(request)},
54 "cookies": request.cookies,
55 }
56 if config.capture_headers:
57 result["headers"] = dict(request.headers)
58
59 if request.method in constants.HTTP_WITH_BODY:
60 if config.capture_body not in ("all", event_type):
61 result["body"] = "[REDACTED]"
62 else:
63 body = None
64 try:
65 body = await get_body(request)
66 except Exception:
67 pass
68 if body is not None:
69 result["body"] = body
70
71 result["url"] = get_url_dict(str(request.url))
72
73 return result
74
75
76 async def get_data_from_response(message: dict, config: Config, event_type: str) -> dict:
77 """Loads data from response for APM capturing.
78
79 Args:
80 message (dict)
81 config (Config)
82 event_type (str)
83
84 Returns:
85 dict
86 """
87 result = {}
88
89 if "status" in message:
90 result["status_code"] = message["status"]
91
92 if config.capture_headers and "headers" in message:
93 headers = Headers(raw=message["headers"])
94 result["headers"] = {key: ";".join(headers.getlist(key)) for key in headers.keys()}
95
96 return result
97
98
99 async def set_body(request: Request, body: bytes):
100 """Overwrites body in Starlette.
101
102 Args:
103 request (Request)
104 body (bytes)
105 """
106
107 async def receive() -> Message:
108 await asyncio.sleep(0)
109 return {"type": "http.request", "body": body}
110
111 request._receive = receive
112
113
114 async def get_body(request: Request) -> str:
115 """Gets body from the request.
116
117 When we consume the body, we replace the streaming mechanism with
118 a mocked version -- this workaround came from
119 https://github.com/encode/starlette/issues/495#issuecomment-513138055
120
121 Args:
122 request (Request)
123
124 Returns:
125 str
126 """
127 body = await request.body()
128 await set_body(request, body)
129
130 request._stream_consumed = False
131
132 return body.decode("utf-8")
133
134
135 async def query_params_to_dict(query_params: str) -> dict:
136 """Transforms query params from URL to dictionary
137
138 Args:
139 query_params (str)
140
141 Returns:
142 dict
143
144 Examples:
145 >>> print(query_params_to_dict(b"key=val&key2=val2"))
146 {"key": "val", "key2": "val2"}
147 """
148 query_params = query_params.split("&")
149 res = {}
150 for param in query_params:
151 key, val = param.split("=")
152 res[key] = val
153
154 return res
155
156
157 def _get_client_ip(request: Request):
158 x_forwarded_for = request.headers.get("HTTP_X_FORWARDED_FOR")
159 if x_forwarded_for:
160 ip = x_forwarded_for.split(",")[0]
161 else:
162 ip = request.headers.get("REMOTE_ADDR")
163 return ip
164
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/elasticapm/contrib/starlette/utils.py b/elasticapm/contrib/starlette/utils.py
--- a/elasticapm/contrib/starlette/utils.py
+++ b/elasticapm/contrib/starlette/utils.py
@@ -129,7 +129,7 @@
request._stream_consumed = False
- return body.decode("utf-8")
+ return body.decode("utf-8", errors="replace")
async def query_params_to_dict(query_params: str) -> dict:
| {"golden_diff": "diff --git a/elasticapm/contrib/starlette/utils.py b/elasticapm/contrib/starlette/utils.py\n--- a/elasticapm/contrib/starlette/utils.py\n+++ b/elasticapm/contrib/starlette/utils.py\n@@ -129,7 +129,7 @@\n \n request._stream_consumed = False\n \n- return body.decode(\"utf-8\")\n+ return body.decode(\"utf-8\", errors=\"replace\")\n \n \n async def query_params_to_dict(query_params: str) -> dict:\n", "issue": "UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 369: invalid start byte\n**Describe the bug**: \r\nSending PUT request to FastAPI with binary file encoded, its returns a error.\r\n\r\n**To Reproduce**\r\n\r\n1. Configure FastAPI with elasticapm.contrib.starlette\r\n2. Send a PUT request with binary file\r\n\r\n**Environment (please complete the following information)**\r\n- OS: Linux\r\n- Python version: 3.8\r\n- Framework and version: fastapi 0.61.2\r\n- APM Server version: 8\r\n- Agent version: 8\r\n\r\nThe same problema was resolved in the main code : [#344](https://github.com/elastic/apm-agent-python/issues/334)\r\n\n", "before_files": [{"content": "# BSD 3-Clause License\n#\n# Copyright (c) 2019, Elasticsearch BV\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n#\n# * Redistributions of source code must retain the above copyright notice, this\n# list of conditions and the following disclaimer.\n#\n# * Redistributions in binary form must reproduce the above copyright notice,\n# this list of conditions and the following disclaimer in the documentation\n# and/or other materials provided with the distribution.\n#\n# * Neither the name of the copyright holder nor the names of its\n# contributors may be used to endorse or promote products derived from\n# this software without specific prior written permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\n# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n# DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\n# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\n# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\n# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\n# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\n# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\nimport asyncio\n\nfrom starlette.datastructures import Headers\nfrom starlette.requests import Request\nfrom starlette.types import Message\n\nfrom elasticapm.conf import Config, constants\nfrom elasticapm.utils import get_url_dict\n\n\nasync def get_data_from_request(request: Request, config: Config, event_type: str) -> dict:\n \"\"\"Loads data from incoming request for APM capturing.\n\n Args:\n request (Request)\n config (Config)\n event_type (str)\n\n Returns:\n dict\n \"\"\"\n result = {\n \"method\": request.method,\n \"socket\": {\"remote_address\": _get_client_ip(request)},\n \"cookies\": request.cookies,\n }\n if config.capture_headers:\n result[\"headers\"] = dict(request.headers)\n\n if request.method in constants.HTTP_WITH_BODY:\n if config.capture_body not in (\"all\", event_type):\n result[\"body\"] = \"[REDACTED]\"\n else:\n body = None\n try:\n body = await get_body(request)\n except Exception:\n pass\n if body is not None:\n result[\"body\"] = body\n\n result[\"url\"] = get_url_dict(str(request.url))\n\n return result\n\n\nasync def get_data_from_response(message: dict, config: Config, event_type: str) -> dict:\n \"\"\"Loads data from response for APM capturing.\n\n Args:\n message (dict)\n config (Config)\n event_type (str)\n\n Returns:\n dict\n \"\"\"\n result = {}\n\n if \"status\" in message:\n result[\"status_code\"] = message[\"status\"]\n\n if config.capture_headers and \"headers\" in message:\n headers = Headers(raw=message[\"headers\"])\n result[\"headers\"] = {key: \";\".join(headers.getlist(key)) for key in headers.keys()}\n\n return result\n\n\nasync def set_body(request: Request, body: bytes):\n \"\"\"Overwrites body in Starlette.\n\n Args:\n request (Request)\n body (bytes)\n \"\"\"\n\n async def receive() -> Message:\n await asyncio.sleep(0)\n return {\"type\": \"http.request\", \"body\": body}\n\n request._receive = receive\n\n\nasync def get_body(request: Request) -> str:\n \"\"\"Gets body from the request.\n\n When we consume the body, we replace the streaming mechanism with\n a mocked version -- this workaround came from\n https://github.com/encode/starlette/issues/495#issuecomment-513138055\n\n Args:\n request (Request)\n\n Returns:\n str\n \"\"\"\n body = await request.body()\n await set_body(request, body)\n\n request._stream_consumed = False\n\n return body.decode(\"utf-8\")\n\n\nasync def query_params_to_dict(query_params: str) -> dict:\n \"\"\"Transforms query params from URL to dictionary\n\n Args:\n query_params (str)\n\n Returns:\n dict\n\n Examples:\n >>> print(query_params_to_dict(b\"key=val&key2=val2\"))\n {\"key\": \"val\", \"key2\": \"val2\"}\n \"\"\"\n query_params = query_params.split(\"&\")\n res = {}\n for param in query_params:\n key, val = param.split(\"=\")\n res[key] = val\n\n return res\n\n\ndef _get_client_ip(request: Request):\n x_forwarded_for = request.headers.get(\"HTTP_X_FORWARDED_FOR\")\n if x_forwarded_for:\n ip = x_forwarded_for.split(\",\")[0]\n else:\n ip = request.headers.get(\"REMOTE_ADDR\")\n 
return ip\n", "path": "elasticapm/contrib/starlette/utils.py"}], "after_files": [{"content": "# BSD 3-Clause License\n#\n# Copyright (c) 2019, Elasticsearch BV\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n#\n# * Redistributions of source code must retain the above copyright notice, this\n# list of conditions and the following disclaimer.\n#\n# * Redistributions in binary form must reproduce the above copyright notice,\n# this list of conditions and the following disclaimer in the documentation\n# and/or other materials provided with the distribution.\n#\n# * Neither the name of the copyright holder nor the names of its\n# contributors may be used to endorse or promote products derived from\n# this software without specific prior written permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\n# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\n# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\n# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\n# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\n# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\n# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\nimport asyncio\n\nfrom starlette.datastructures import Headers\nfrom starlette.requests import Request\nfrom starlette.types import Message\n\nfrom elasticapm.conf import Config, constants\nfrom elasticapm.utils import get_url_dict\n\n\nasync def get_data_from_request(request: Request, config: Config, event_type: str) -> dict:\n \"\"\"Loads data from incoming request for APM capturing.\n\n Args:\n request (Request)\n config (Config)\n event_type (str)\n\n Returns:\n dict\n \"\"\"\n result = {\n \"method\": request.method,\n \"socket\": {\"remote_address\": _get_client_ip(request)},\n \"cookies\": request.cookies,\n }\n if config.capture_headers:\n result[\"headers\"] = dict(request.headers)\n\n if request.method in constants.HTTP_WITH_BODY:\n if config.capture_body not in (\"all\", event_type):\n result[\"body\"] = \"[REDACTED]\"\n else:\n body = None\n try:\n body = await get_body(request)\n except Exception:\n pass\n if body is not None:\n result[\"body\"] = body\n\n result[\"url\"] = get_url_dict(str(request.url))\n\n return result\n\n\nasync def get_data_from_response(message: dict, config: Config, event_type: str) -> dict:\n \"\"\"Loads data from response for APM capturing.\n\n Args:\n message (dict)\n config (Config)\n event_type (str)\n\n Returns:\n dict\n \"\"\"\n result = {}\n\n if \"status\" in message:\n result[\"status_code\"] = message[\"status\"]\n\n if config.capture_headers and \"headers\" in message:\n headers = Headers(raw=message[\"headers\"])\n result[\"headers\"] = {key: \";\".join(headers.getlist(key)) for key in headers.keys()}\n\n return result\n\n\nasync def set_body(request: Request, body: bytes):\n \"\"\"Overwrites body in Starlette.\n\n Args:\n request (Request)\n body (bytes)\n \"\"\"\n\n async def receive() -> Message:\n await asyncio.sleep(0)\n return {\"type\": \"http.request\", \"body\": body}\n\n 
request._receive = receive\n\n\nasync def get_body(request: Request) -> str:\n \"\"\"Gets body from the request.\n\n When we consume the body, we replace the streaming mechanism with\n a mocked version -- this workaround came from\n https://github.com/encode/starlette/issues/495#issuecomment-513138055\n\n Args:\n request (Request)\n\n Returns:\n str\n \"\"\"\n body = await request.body()\n await set_body(request, body)\n\n request._stream_consumed = False\n\n return body.decode(\"utf-8\", errors=\"replace\")\n\n\nasync def query_params_to_dict(query_params: str) -> dict:\n \"\"\"Transforms query params from URL to dictionary\n\n Args:\n query_params (str)\n\n Returns:\n dict\n\n Examples:\n >>> print(query_params_to_dict(b\"key=val&key2=val2\"))\n {\"key\": \"val\", \"key2\": \"val2\"}\n \"\"\"\n query_params = query_params.split(\"&\")\n res = {}\n for param in query_params:\n key, val = param.split(\"=\")\n res[key] = val\n\n return res\n\n\ndef _get_client_ip(request: Request):\n x_forwarded_for = request.headers.get(\"HTTP_X_FORWARDED_FOR\")\n if x_forwarded_for:\n ip = x_forwarded_for.split(\",\")[0]\n else:\n ip = request.headers.get(\"REMOTE_ADDR\")\n return ip\n", "path": "elasticapm/contrib/starlette/utils.py"}]} | 1,938 | 115 |
gh_patches_debug_24514 | rasdani/github-patches | git_diff | bridgecrewio__checkov-2025 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Check CKV_AZURE_50 fails incorrectly for resource 'azurerm_virtual_machine'
CKV_AZURE_50 always fails for the resource "azurerm_virtual_machine" due to an incorrect check of the existence of the attribute "allow_extension_operations":
https://github.com/bridgecrewio/checkov/blob/25388a34231e09ac17b266ad9db0b4c0e806e956/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py#L15
The Terraform resource "azurerm_virtual_machine" does not have an attribute named "allow_extension_operations" (see [Terraform Resouce Docu](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/virtual_machine).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py`
Content:
```
1 from checkov.common.models.enums import CheckCategories, CheckResult
2 from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck
3
4
5 class AzureInstanceExtensions(BaseResourceValueCheck):
6 def __init__(self):
7 name = "Ensure Virtual Machine Extensions are not Installed"
8 id = "CKV_AZURE_50"
9 supported_resources = ['azurerm_virtual_machine', 'azurerm_linux_virtual_machine']
10 categories = [CheckCategories.GENERAL_SECURITY]
11 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources,
12 missing_block_result=CheckResult.PASSED)
13
14 def get_inspected_key(self):
15 return 'allow_extension_operations'
16
17 def get_expected_value(self):
18 return False
19
20
21 check = AzureInstanceExtensions()
22
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py b/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py
--- a/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py
+++ b/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py
@@ -1,20 +1,21 @@
-from checkov.common.models.enums import CheckCategories, CheckResult
+from typing import Any
+
+from checkov.common.models.enums import CheckCategories
from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck
class AzureInstanceExtensions(BaseResourceValueCheck):
- def __init__(self):
+ def __init__(self) -> None:
name = "Ensure Virtual Machine Extensions are not Installed"
id = "CKV_AZURE_50"
- supported_resources = ['azurerm_virtual_machine', 'azurerm_linux_virtual_machine']
+ supported_resources = ["azurerm_linux_virtual_machine", "azurerm_windows_virtual_machine"]
categories = [CheckCategories.GENERAL_SECURITY]
- super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources,
- missing_block_result=CheckResult.PASSED)
+ super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
- def get_inspected_key(self):
- return 'allow_extension_operations'
+ def get_inspected_key(self) -> str:
+ return "allow_extension_operations"
- def get_expected_value(self):
+ def get_expected_value(self) -> Any:
return False
| {"golden_diff": "diff --git a/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py b/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py\n--- a/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py\n+++ b/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py\n@@ -1,20 +1,21 @@\n-from checkov.common.models.enums import CheckCategories, CheckResult\n+from typing import Any\n+\n+from checkov.common.models.enums import CheckCategories\n from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck\n \n \n class AzureInstanceExtensions(BaseResourceValueCheck):\n- def __init__(self):\n+ def __init__(self) -> None:\n name = \"Ensure Virtual Machine Extensions are not Installed\"\n id = \"CKV_AZURE_50\"\n- supported_resources = ['azurerm_virtual_machine', 'azurerm_linux_virtual_machine']\n+ supported_resources = [\"azurerm_linux_virtual_machine\", \"azurerm_windows_virtual_machine\"]\n categories = [CheckCategories.GENERAL_SECURITY]\n- super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources,\n- missing_block_result=CheckResult.PASSED)\n+ super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n \n- def get_inspected_key(self):\n- return 'allow_extension_operations'\n+ def get_inspected_key(self) -> str:\n+ return \"allow_extension_operations\"\n \n- def get_expected_value(self):\n+ def get_expected_value(self) -> Any:\n return False\n", "issue": "Check CKV_AZURE_50 fails incorrectly for resource 'azurerm_virtual_machine'\nCKV_AZURE_50 always fails for the resource \"azurerm_virtual_machine\" due to an incorrect check of the existence of the attribute \"allow_extension_operations\":\r\n\r\nhttps://github.com/bridgecrewio/checkov/blob/25388a34231e09ac17b266ad9db0b4c0e806e956/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py#L15\r\n\r\nThe Terraform resource \"azurerm_virtual_machine\" does not have an attribute named \"allow_extension_operations\" (see [Terraform Resouce Docu](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/virtual_machine).\r\n\r\n\n", "before_files": [{"content": "from checkov.common.models.enums import CheckCategories, CheckResult\nfrom checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck\n\n\nclass AzureInstanceExtensions(BaseResourceValueCheck):\n def __init__(self):\n name = \"Ensure Virtual Machine Extensions are not Installed\"\n id = \"CKV_AZURE_50\"\n supported_resources = ['azurerm_virtual_machine', 'azurerm_linux_virtual_machine']\n categories = [CheckCategories.GENERAL_SECURITY]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources,\n missing_block_result=CheckResult.PASSED)\n\n def get_inspected_key(self):\n return 'allow_extension_operations'\n\n def get_expected_value(self):\n return False\n\n\ncheck = AzureInstanceExtensions()\n", "path": "checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py"}], "after_files": [{"content": "from typing import Any\n\nfrom checkov.common.models.enums import CheckCategories\nfrom checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck\n\n\nclass AzureInstanceExtensions(BaseResourceValueCheck):\n def __init__(self) -> None:\n name = \"Ensure Virtual Machine Extensions are not Installed\"\n id = \"CKV_AZURE_50\"\n supported_resources = [\"azurerm_linux_virtual_machine\", 
\"azurerm_windows_virtual_machine\"]\n categories = [CheckCategories.GENERAL_SECURITY]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def get_inspected_key(self) -> str:\n return \"allow_extension_operations\"\n\n def get_expected_value(self) -> Any:\n return False\n\n\ncheck = AzureInstanceExtensions()\n", "path": "checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py"}]} | 656 | 353 |
gh_patches_debug_9846 | rasdani/github-patches | git_diff | microsoft__playwright-python-472 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Interactive mode (REPL) (v 1.8.0a1) Error
**from playwright.sync_api import sync_playwright
playwright = sync_playwright().start()**
Traceback (most recent call last):
File "C:\Python37\lib\site-packages\playwright\_impl\_transport.py", line 27, in _get_stderr_fileno
return sys.stderr.fileno()
io.UnsupportedOperation: fileno
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<pyshell#1>", line 1, in <module>
playwright = sync_playwright().start()
File "C:\Python37\lib\site-packages\playwright\sync_api\_context_manager.py", line 73, in start
return self.__enter__()
File "C:\Python37\lib\site-packages\playwright\sync_api\_context_manager.py", line 67, in __enter__
dispatcher_fiber.switch()
File "C:\Python37\lib\site-packages\playwright\sync_api\_context_manager.py", line 48, in greenlet_main
loop.run_until_complete(self._connection.run_as_sync())
File "C:\Python37\lib\asyncio\base_events.py", line 587, in run_until_complete
return future.result()
File "C:\Python37\lib\site-packages\playwright\_impl\_connection.py", line 134, in run_as_sync
await self.run()
File "C:\Python37\lib\site-packages\playwright\_impl\_connection.py", line 139, in run
await self._transport.run()
File "C:\Python37\lib\site-packages\playwright\_impl\_transport.py", line 62, in run
stderr=_get_stderr_fileno(),
File "C:\Python37\lib\site-packages\playwright\_impl\_transport.py", line 34, in _get_stderr_fileno
return sys.__stderr__.fileno()
AttributeError: 'NoneType' object has no attribute 'fileno'
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `playwright/_impl/_transport.py`
Content:
```
1 # Copyright (c) Microsoft Corporation.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import asyncio
16 import io
17 import json
18 import os
19 import sys
20 from pathlib import Path
21 from typing import Dict, Optional
22
23
24 # Sourced from: https://github.com/pytest-dev/pytest/blob/da01ee0a4bb0af780167ecd228ab3ad249511302/src/_pytest/faulthandler.py#L69-L77
25 def _get_stderr_fileno() -> Optional[int]:
26 try:
27 return sys.stderr.fileno()
28 except (AttributeError, io.UnsupportedOperation):
29 # pytest-xdist monkeypatches sys.stderr with an object that is not an actual file.
30 # https://docs.python.org/3/library/faulthandler.html#issue-with-file-descriptors
31 # This is potentially dangerous, but the best we can do.
32 if not hasattr(sys, "__stderr__"):
33 return None
34 return sys.__stderr__.fileno()
35
36
37 class Transport:
38 def __init__(self, driver_executable: Path) -> None:
39 super().__init__()
40 self.on_message = lambda _: None
41 self._stopped = False
42 self._driver_executable = driver_executable
43 self._loop: asyncio.AbstractEventLoop
44
45 def stop(self) -> None:
46 self._stopped = True
47 self._output.close()
48
49 async def run(self) -> None:
50 self._loop = asyncio.get_running_loop()
51
52 driver_env = os.environ.copy()
53 # VSCode's JavaScript Debug Terminal provides it but driver/pkg does not support it
54 driver_env.pop("NODE_OPTIONS", None)
55
56 proc = await asyncio.create_subprocess_exec(
57 str(self._driver_executable),
58 "run-driver",
59 env=driver_env,
60 stdin=asyncio.subprocess.PIPE,
61 stdout=asyncio.subprocess.PIPE,
62 stderr=_get_stderr_fileno(),
63 limit=32768,
64 )
65 assert proc.stdout
66 assert proc.stdin
67 self._output = proc.stdin
68
69 while not self._stopped:
70 try:
71 buffer = await proc.stdout.readexactly(4)
72 length = int.from_bytes(buffer, byteorder="little", signed=False)
73 buffer = bytes(0)
74 while length:
75 to_read = min(length, 32768)
76 data = await proc.stdout.readexactly(to_read)
77 length -= to_read
78 if len(buffer):
79 buffer = buffer + data
80 else:
81 buffer = data
82 obj = json.loads(buffer)
83
84 if "DEBUGP" in os.environ: # pragma: no cover
85 print("\x1b[33mRECV>\x1b[0m", json.dumps(obj, indent=2))
86 self.on_message(obj)
87 except asyncio.IncompleteReadError:
88 break
89 await asyncio.sleep(0)
90
91 def send(self, message: Dict) -> None:
92 msg = json.dumps(message)
93 if "DEBUGP" in os.environ: # pragma: no cover
94 print("\x1b[32mSEND>\x1b[0m", json.dumps(message, indent=2))
95 data = msg.encode()
96 self._output.write(
97 len(data).to_bytes(4, byteorder="little", signed=False) + data
98 )
99
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/playwright/_impl/_transport.py b/playwright/_impl/_transport.py
--- a/playwright/_impl/_transport.py
+++ b/playwright/_impl/_transport.py
@@ -29,7 +29,7 @@
# pytest-xdist monkeypatches sys.stderr with an object that is not an actual file.
# https://docs.python.org/3/library/faulthandler.html#issue-with-file-descriptors
# This is potentially dangerous, but the best we can do.
- if not hasattr(sys, "__stderr__"):
+ if not hasattr(sys, "__stderr__") or not sys.__stderr__:
return None
return sys.__stderr__.fileno()
| {"golden_diff": "diff --git a/playwright/_impl/_transport.py b/playwright/_impl/_transport.py\n--- a/playwright/_impl/_transport.py\n+++ b/playwright/_impl/_transport.py\n@@ -29,7 +29,7 @@\n # pytest-xdist monkeypatches sys.stderr with an object that is not an actual file.\n # https://docs.python.org/3/library/faulthandler.html#issue-with-file-descriptors\n # This is potentially dangerous, but the best we can do.\n- if not hasattr(sys, \"__stderr__\"):\n+ if not hasattr(sys, \"__stderr__\") or not sys.__stderr__:\n return None\n return sys.__stderr__.fileno()\n", "issue": "Interactive mode (REPL) (v 1.8.0a1) Error\n**from playwright.sync_api import sync_playwright\r\nplaywright = sync_playwright().start()**\r\n\r\nTraceback (most recent call last):\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\_impl\\_transport.py\", line 27, in _get_stderr_fileno\r\n return sys.stderr.fileno()\r\nio.UnsupportedOperation: fileno\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"<pyshell#1>\", line 1, in <module>\r\n playwright = sync_playwright().start()\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\sync_api\\_context_manager.py\", line 73, in start\r\n return self.__enter__()\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\sync_api\\_context_manager.py\", line 67, in __enter__\r\n dispatcher_fiber.switch()\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\sync_api\\_context_manager.py\", line 48, in greenlet_main\r\n loop.run_until_complete(self._connection.run_as_sync())\r\n File \"C:\\Python37\\lib\\asyncio\\base_events.py\", line 587, in run_until_complete\r\n return future.result()\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\_impl\\_connection.py\", line 134, in run_as_sync\r\n await self.run()\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\_impl\\_connection.py\", line 139, in run\r\n await self._transport.run()\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\_impl\\_transport.py\", line 62, in run\r\n stderr=_get_stderr_fileno(),\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\_impl\\_transport.py\", line 34, in _get_stderr_fileno\r\n return sys.__stderr__.fileno()\r\nAttributeError: 'NoneType' object has no attribute 'fileno'\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport asyncio\nimport io\nimport json\nimport os\nimport sys\nfrom pathlib import Path\nfrom typing import Dict, Optional\n\n\n# Sourced from: https://github.com/pytest-dev/pytest/blob/da01ee0a4bb0af780167ecd228ab3ad249511302/src/_pytest/faulthandler.py#L69-L77\ndef _get_stderr_fileno() -> Optional[int]:\n try:\n return sys.stderr.fileno()\n except (AttributeError, io.UnsupportedOperation):\n # pytest-xdist monkeypatches sys.stderr with an object that is not an actual file.\n # https://docs.python.org/3/library/faulthandler.html#issue-with-file-descriptors\n # This is potentially dangerous, but the 
best we can do.\n if not hasattr(sys, \"__stderr__\"):\n return None\n return sys.__stderr__.fileno()\n\n\nclass Transport:\n def __init__(self, driver_executable: Path) -> None:\n super().__init__()\n self.on_message = lambda _: None\n self._stopped = False\n self._driver_executable = driver_executable\n self._loop: asyncio.AbstractEventLoop\n\n def stop(self) -> None:\n self._stopped = True\n self._output.close()\n\n async def run(self) -> None:\n self._loop = asyncio.get_running_loop()\n\n driver_env = os.environ.copy()\n # VSCode's JavaScript Debug Terminal provides it but driver/pkg does not support it\n driver_env.pop(\"NODE_OPTIONS\", None)\n\n proc = await asyncio.create_subprocess_exec(\n str(self._driver_executable),\n \"run-driver\",\n env=driver_env,\n stdin=asyncio.subprocess.PIPE,\n stdout=asyncio.subprocess.PIPE,\n stderr=_get_stderr_fileno(),\n limit=32768,\n )\n assert proc.stdout\n assert proc.stdin\n self._output = proc.stdin\n\n while not self._stopped:\n try:\n buffer = await proc.stdout.readexactly(4)\n length = int.from_bytes(buffer, byteorder=\"little\", signed=False)\n buffer = bytes(0)\n while length:\n to_read = min(length, 32768)\n data = await proc.stdout.readexactly(to_read)\n length -= to_read\n if len(buffer):\n buffer = buffer + data\n else:\n buffer = data\n obj = json.loads(buffer)\n\n if \"DEBUGP\" in os.environ: # pragma: no cover\n print(\"\\x1b[33mRECV>\\x1b[0m\", json.dumps(obj, indent=2))\n self.on_message(obj)\n except asyncio.IncompleteReadError:\n break\n await asyncio.sleep(0)\n\n def send(self, message: Dict) -> None:\n msg = json.dumps(message)\n if \"DEBUGP\" in os.environ: # pragma: no cover\n print(\"\\x1b[32mSEND>\\x1b[0m\", json.dumps(message, indent=2))\n data = msg.encode()\n self._output.write(\n len(data).to_bytes(4, byteorder=\"little\", signed=False) + data\n )\n", "path": "playwright/_impl/_transport.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport asyncio\nimport io\nimport json\nimport os\nimport sys\nfrom pathlib import Path\nfrom typing import Dict, Optional\n\n\n# Sourced from: https://github.com/pytest-dev/pytest/blob/da01ee0a4bb0af780167ecd228ab3ad249511302/src/_pytest/faulthandler.py#L69-L77\ndef _get_stderr_fileno() -> Optional[int]:\n try:\n return sys.stderr.fileno()\n except (AttributeError, io.UnsupportedOperation):\n # pytest-xdist monkeypatches sys.stderr with an object that is not an actual file.\n # https://docs.python.org/3/library/faulthandler.html#issue-with-file-descriptors\n # This is potentially dangerous, but the best we can do.\n if not hasattr(sys, \"__stderr__\") or not sys.__stderr__:\n return None\n return sys.__stderr__.fileno()\n\n\nclass Transport:\n def __init__(self, driver_executable: Path) -> None:\n super().__init__()\n self.on_message = lambda _: None\n self._stopped = False\n self._driver_executable = driver_executable\n self._loop: asyncio.AbstractEventLoop\n\n def stop(self) -> None:\n self._stopped = True\n 
self._output.close()\n\n async def run(self) -> None:\n self._loop = asyncio.get_running_loop()\n\n driver_env = os.environ.copy()\n # VSCode's JavaScript Debug Terminal provides it but driver/pkg does not support it\n driver_env.pop(\"NODE_OPTIONS\", None)\n\n proc = await asyncio.create_subprocess_exec(\n str(self._driver_executable),\n \"run-driver\",\n env=driver_env,\n stdin=asyncio.subprocess.PIPE,\n stdout=asyncio.subprocess.PIPE,\n stderr=_get_stderr_fileno(),\n limit=32768,\n )\n assert proc.stdout\n assert proc.stdin\n self._output = proc.stdin\n\n while not self._stopped:\n try:\n buffer = await proc.stdout.readexactly(4)\n length = int.from_bytes(buffer, byteorder=\"little\", signed=False)\n buffer = bytes(0)\n while length:\n to_read = min(length, 32768)\n data = await proc.stdout.readexactly(to_read)\n length -= to_read\n if len(buffer):\n buffer = buffer + data\n else:\n buffer = data\n obj = json.loads(buffer)\n\n if \"DEBUGP\" in os.environ: # pragma: no cover\n print(\"\\x1b[33mRECV>\\x1b[0m\", json.dumps(obj, indent=2))\n self.on_message(obj)\n except asyncio.IncompleteReadError:\n break\n await asyncio.sleep(0)\n\n def send(self, message: Dict) -> None:\n msg = json.dumps(message)\n if \"DEBUGP\" in os.environ: # pragma: no cover\n print(\"\\x1b[32mSEND>\\x1b[0m\", json.dumps(message, indent=2))\n data = msg.encode()\n self._output.write(\n len(data).to_bytes(4, byteorder=\"little\", signed=False) + data\n )\n", "path": "playwright/_impl/_transport.py"}]} | 1,754 | 149 |
gh_patches_debug_829 | rasdani/github-patches | git_diff | ivy-llc__ivy-15926 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
log
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/frontends/paddle/tensor/math.py`
Content:
```
1 # global
2 import ivy
3 from ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes
4 from ivy.functional.frontends.paddle.func_wrapper import (
5 to_ivy_arrays_and_back,
6 )
7
8
9 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
10 @to_ivy_arrays_and_back
11 def sin(x, name=None):
12 return ivy.sin(x)
13
14
15 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
16 @to_ivy_arrays_and_back
17 def cos(x, name=None):
18 return ivy.cos(x)
19
20
21 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
22 @to_ivy_arrays_and_back
23 def acos(x, name=None):
24 return ivy.acos(x)
25
26
27 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
28 @to_ivy_arrays_and_back
29 def cosh(x, name=None):
30 return ivy.cosh(x)
31
32
33 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
34 @to_ivy_arrays_and_back
35 def tanh(x, name=None):
36 return ivy.tanh(x)
37
38
39 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
40 @to_ivy_arrays_and_back
41 def acosh(x, name=None):
42 return ivy.acosh(x)
43
44
45 @with_supported_dtypes({"2.4.2 and below": ("float32", "float64")}, "paddle")
46 @to_ivy_arrays_and_back
47 def asin(x, name=None):
48 return ivy.asin(x)
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ivy/functional/frontends/paddle/tensor/math.py b/ivy/functional/frontends/paddle/tensor/math.py
--- a/ivy/functional/frontends/paddle/tensor/math.py
+++ b/ivy/functional/frontends/paddle/tensor/math.py
@@ -46,3 +46,9 @@
@to_ivy_arrays_and_back
def asin(x, name=None):
return ivy.asin(x)
+
+
+@with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
+@to_ivy_arrays_and_back
+def log(x, name=None):
+ return ivy.log(x)
| {"golden_diff": "diff --git a/ivy/functional/frontends/paddle/tensor/math.py b/ivy/functional/frontends/paddle/tensor/math.py\n--- a/ivy/functional/frontends/paddle/tensor/math.py\n+++ b/ivy/functional/frontends/paddle/tensor/math.py\n@@ -46,3 +46,9 @@\n @to_ivy_arrays_and_back\n def asin(x, name=None):\n return ivy.asin(x)\n+\n+\n+@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n+@to_ivy_arrays_and_back\n+def log(x, name=None):\n+ return ivy.log(x)\n", "issue": "log\n\n", "before_files": [{"content": "# global\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef sin(x, name=None):\n return ivy.sin(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef cos(x, name=None):\n return ivy.cos(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef acos(x, name=None):\n return ivy.acos(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef cosh(x, name=None):\n return ivy.cosh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef tanh(x, name=None):\n return ivy.tanh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef acosh(x, name=None):\n return ivy.acosh(x)\n\n\n@with_supported_dtypes({\"2.4.2 and below\": (\"float32\", \"float64\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef asin(x, name=None):\n return ivy.asin(x)\n", "path": "ivy/functional/frontends/paddle/tensor/math.py"}], "after_files": [{"content": "# global\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef sin(x, name=None):\n return ivy.sin(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef cos(x, name=None):\n return ivy.cos(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef acos(x, name=None):\n return ivy.acos(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef cosh(x, name=None):\n return ivy.cosh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef tanh(x, name=None):\n return ivy.tanh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef acosh(x, name=None):\n return ivy.acosh(x)\n\n\n@with_supported_dtypes({\"2.4.2 and below\": (\"float32\", \"float64\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef asin(x, name=None):\n return ivy.asin(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef log(x, name=None):\n return ivy.log(x)\n", "path": "ivy/functional/frontends/paddle/tensor/math.py"}]} | 793 | 
155 |
gh_patches_debug_36271 | rasdani/github-patches | git_diff | opsdroid__opsdroid-1835 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Upgrade redis database backend to use aioredis v2
Looks like v2 of `aioredis` is out and has breaking changes which affect us. In #1809 I've pinned us to v1 for now but we should upgrade things to work with v2.
Specifically importing the parser fails
https://github.com/opsdroid/opsdroid/blob/a45490d1bdceca39b49880e20262b55ea0be101d/opsdroid/database/redis/__init__.py#L6
```python-traceback
ImportError while importing test module '/home/runner/work/opsdroid/opsdroid/opsdroid/database/redis/tests/test_redis.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/opt/hostedtoolcache/Python/3.7.11/x64/lib/python3.7/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
opsdroid/database/redis/tests/test_redis.py:8: in <module>
from opsdroid.database.redis import RedisDatabase
opsdroid/database/redis/__init__.py:6: in <module>
from aioredis import parser
E ImportError: cannot import name 'parser' from 'aioredis' (/home/runner/work/opsdroid/opsdroid/.tox/py37-e2e/lib/python3.7/site-packages/aioredis/__init__.py)
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `opsdroid/database/redis/__init__.py`
Content:
```
1 """Module for storing data within Redis."""
2 import json
3 import logging
4
5 import aioredis
6 from aioredis import parser
7 from voluptuous import Any
8
9 from opsdroid.database import Database
10 from opsdroid.helper import JSONEncoder, JSONDecoder
11
12 _LOGGER = logging.getLogger(__name__)
13 CONFIG_SCHEMA = {"host": str, "port": Any(int, str), "database": int, "password": str}
14
15
16 class RedisDatabase(Database):
17 """Database class for storing data within a Redis instance."""
18
19 def __init__(self, config, opsdroid=None):
20 """Initialise the redis database.
21
22 Set basic properties of the database. Initialise properties like
23 name, connection arguments, database file, table name and config.
24
25 Args:
26 config (dict): The configuration of the database which consists
27 of `file` and `table` name of the sqlite database
28 specified in `configuration.yaml` file.
29 opsdroid (OpsDroid): An instance of opsdroid.core.
30
31 """
32 super().__init__(config, opsdroid=opsdroid)
33 self.config = config
34 self.client = None
35 self.host = self.config.get("host", "localhost")
36 self.port = self.config.get("port", 6379)
37 self.database = self.config.get("database", 0)
38 self.password = self.config.get("password", None)
39 _LOGGER.debug(_("Loaded Redis database connector."))
40
41 async def connect(self):
42 """Connect to the database.
43
44 This method will connect to a Redis database. By default it will
45 connect to Redis on localhost on port 6379
46
47 """
48 try:
49 self.client = await aioredis.create_pool(
50 address=(self.host, int(self.port)),
51 db=self.database,
52 password=self.password,
53 parser=parser.PyReader,
54 )
55
56 _LOGGER.info(
57 _("Connected to Redis database %s from %s on port %s."),
58 self.database,
59 self.host,
60 self.port,
61 )
62 except OSError:
63 _LOGGER.warning(
64 _("Unable to connect to Redis database on address: %s port: %s."),
65 self.host,
66 self.port,
67 )
68
69 async def put(self, key, data):
70 """Store the data object in Redis against the key.
71
72 Args:
73 key (string): The key to store the data object under.
74 data (object): The data object to store.
75
76 """
77 if self.client:
78 _LOGGER.debug(_("Putting %s into Redis."), key)
79 await self.client.execute("SET", key, json.dumps(data, cls=JSONEncoder))
80
81 async def get(self, key):
82 """Get data from Redis for a given key.
83
84 Args:
85 key (string): The key to lookup in the database.
86
87 Returns:
88 object or None: The data object stored for that key, or None if no
89 object found for that key.
90
91 """
92 if self.client:
93 _LOGGER.debug(_("Getting %s from Redis."), key)
94 data = await self.client.execute("GET", key)
95
96 if data:
97 return json.loads(data, object_hook=JSONDecoder())
98
99 return None
100
101 async def delete(self, key):
102 """Delete data from Redis for a given key.
103
104 Args:
105 key (string): The key to delete in the database.
106
107 """
108 if self.client:
109 _LOGGER.debug(_("Deleting %s from Redis."), key)
110 await self.client.execute("DEL", key)
111
112 async def disconnect(self):
113 """Disconnect from the database."""
114 if self.client:
115 self.client.close()
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/opsdroid/database/redis/__init__.py b/opsdroid/database/redis/__init__.py
--- a/opsdroid/database/redis/__init__.py
+++ b/opsdroid/database/redis/__init__.py
@@ -2,8 +2,7 @@
import json
import logging
-import aioredis
-from aioredis import parser
+from aioredis import Redis
from voluptuous import Any
from opsdroid.database import Database
@@ -46,12 +45,13 @@
"""
try:
- self.client = await aioredis.create_pool(
- address=(self.host, int(self.port)),
+ self.client = Redis(
+ host=self.host,
+ port=int(self.port),
db=self.database,
password=self.password,
- parser=parser.PyReader,
)
+ await self.client.ping() # to actually initiate a connection
_LOGGER.info(
_("Connected to Redis database %s from %s on port %s."),
@@ -76,7 +76,9 @@
"""
if self.client:
_LOGGER.debug(_("Putting %s into Redis."), key)
- await self.client.execute("SET", key, json.dumps(data, cls=JSONEncoder))
+ await self.client.execute_command(
+ "SET", key, json.dumps(data, cls=JSONEncoder)
+ )
async def get(self, key):
"""Get data from Redis for a given key.
@@ -91,7 +93,7 @@
"""
if self.client:
_LOGGER.debug(_("Getting %s from Redis."), key)
- data = await self.client.execute("GET", key)
+ data = await self.client.execute_command("GET", key)
if data:
return json.loads(data, object_hook=JSONDecoder())
@@ -107,9 +109,9 @@
"""
if self.client:
_LOGGER.debug(_("Deleting %s from Redis."), key)
- await self.client.execute("DEL", key)
+ await self.client.execute_command("DEL", key)
async def disconnect(self):
"""Disconnect from the database."""
if self.client:
- self.client.close()
+ await self.client.close()
| {"golden_diff": "diff --git a/opsdroid/database/redis/__init__.py b/opsdroid/database/redis/__init__.py\n--- a/opsdroid/database/redis/__init__.py\n+++ b/opsdroid/database/redis/__init__.py\n@@ -2,8 +2,7 @@\n import json\n import logging\n \n-import aioredis\n-from aioredis import parser\n+from aioredis import Redis\n from voluptuous import Any\n \n from opsdroid.database import Database\n@@ -46,12 +45,13 @@\n \n \"\"\"\n try:\n- self.client = await aioredis.create_pool(\n- address=(self.host, int(self.port)),\n+ self.client = Redis(\n+ host=self.host,\n+ port=int(self.port),\n db=self.database,\n password=self.password,\n- parser=parser.PyReader,\n )\n+ await self.client.ping() # to actually initiate a connection\n \n _LOGGER.info(\n _(\"Connected to Redis database %s from %s on port %s.\"),\n@@ -76,7 +76,9 @@\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Putting %s into Redis.\"), key)\n- await self.client.execute(\"SET\", key, json.dumps(data, cls=JSONEncoder))\n+ await self.client.execute_command(\n+ \"SET\", key, json.dumps(data, cls=JSONEncoder)\n+ )\n \n async def get(self, key):\n \"\"\"Get data from Redis for a given key.\n@@ -91,7 +93,7 @@\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Getting %s from Redis.\"), key)\n- data = await self.client.execute(\"GET\", key)\n+ data = await self.client.execute_command(\"GET\", key)\n \n if data:\n return json.loads(data, object_hook=JSONDecoder())\n@@ -107,9 +109,9 @@\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Deleting %s from Redis.\"), key)\n- await self.client.execute(\"DEL\", key)\n+ await self.client.execute_command(\"DEL\", key)\n \n async def disconnect(self):\n \"\"\"Disconnect from the database.\"\"\"\n if self.client:\n- self.client.close()\n+ await self.client.close()\n", "issue": "Upgrade redis database backend to use aioredis v2\nLooks like v2 of `aioredis` is out and has breaking changes which affect us. 
In #1809 I've pinned us to v1 for now but we should upgrade things to work with v2.\r\n\r\nSpecifically importing the parser fails\r\n\r\nhttps://github.com/opsdroid/opsdroid/blob/a45490d1bdceca39b49880e20262b55ea0be101d/opsdroid/database/redis/__init__.py#L6\r\n\r\n```python-traceback\r\n ImportError while importing test module '/home/runner/work/opsdroid/opsdroid/opsdroid/database/redis/tests/test_redis.py'.\r\n Hint: make sure your test modules/packages have valid Python names.\r\n Traceback:\r\n /opt/hostedtoolcache/Python/3.7.11/x64/lib/python3.7/importlib/__init__.py:127: in import_module\r\n return _bootstrap._gcd_import(name[level:], package, level)\r\n opsdroid/database/redis/tests/test_redis.py:8: in <module>\r\n from opsdroid.database.redis import RedisDatabase\r\n opsdroid/database/redis/__init__.py:6: in <module>\r\n from aioredis import parser\r\n E ImportError: cannot import name 'parser' from 'aioredis' (/home/runner/work/opsdroid/opsdroid/.tox/py37-e2e/lib/python3.7/site-packages/aioredis/__init__.py)\r\n```\n", "before_files": [{"content": "\"\"\"Module for storing data within Redis.\"\"\"\nimport json\nimport logging\n\nimport aioredis\nfrom aioredis import parser\nfrom voluptuous import Any\n\nfrom opsdroid.database import Database\nfrom opsdroid.helper import JSONEncoder, JSONDecoder\n\n_LOGGER = logging.getLogger(__name__)\nCONFIG_SCHEMA = {\"host\": str, \"port\": Any(int, str), \"database\": int, \"password\": str}\n\n\nclass RedisDatabase(Database):\n \"\"\"Database class for storing data within a Redis instance.\"\"\"\n\n def __init__(self, config, opsdroid=None):\n \"\"\"Initialise the redis database.\n\n Set basic properties of the database. Initialise properties like\n name, connection arguments, database file, table name and config.\n\n Args:\n config (dict): The configuration of the database which consists\n of `file` and `table` name of the sqlite database\n specified in `configuration.yaml` file.\n opsdroid (OpsDroid): An instance of opsdroid.core.\n\n \"\"\"\n super().__init__(config, opsdroid=opsdroid)\n self.config = config\n self.client = None\n self.host = self.config.get(\"host\", \"localhost\")\n self.port = self.config.get(\"port\", 6379)\n self.database = self.config.get(\"database\", 0)\n self.password = self.config.get(\"password\", None)\n _LOGGER.debug(_(\"Loaded Redis database connector.\"))\n\n async def connect(self):\n \"\"\"Connect to the database.\n\n This method will connect to a Redis database. 
By default it will\n connect to Redis on localhost on port 6379\n\n \"\"\"\n try:\n self.client = await aioredis.create_pool(\n address=(self.host, int(self.port)),\n db=self.database,\n password=self.password,\n parser=parser.PyReader,\n )\n\n _LOGGER.info(\n _(\"Connected to Redis database %s from %s on port %s.\"),\n self.database,\n self.host,\n self.port,\n )\n except OSError:\n _LOGGER.warning(\n _(\"Unable to connect to Redis database on address: %s port: %s.\"),\n self.host,\n self.port,\n )\n\n async def put(self, key, data):\n \"\"\"Store the data object in Redis against the key.\n\n Args:\n key (string): The key to store the data object under.\n data (object): The data object to store.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Putting %s into Redis.\"), key)\n await self.client.execute(\"SET\", key, json.dumps(data, cls=JSONEncoder))\n\n async def get(self, key):\n \"\"\"Get data from Redis for a given key.\n\n Args:\n key (string): The key to lookup in the database.\n\n Returns:\n object or None: The data object stored for that key, or None if no\n object found for that key.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Getting %s from Redis.\"), key)\n data = await self.client.execute(\"GET\", key)\n\n if data:\n return json.loads(data, object_hook=JSONDecoder())\n\n return None\n\n async def delete(self, key):\n \"\"\"Delete data from Redis for a given key.\n\n Args:\n key (string): The key to delete in the database.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Deleting %s from Redis.\"), key)\n await self.client.execute(\"DEL\", key)\n\n async def disconnect(self):\n \"\"\"Disconnect from the database.\"\"\"\n if self.client:\n self.client.close()\n", "path": "opsdroid/database/redis/__init__.py"}], "after_files": [{"content": "\"\"\"Module for storing data within Redis.\"\"\"\nimport json\nimport logging\n\nfrom aioredis import Redis\nfrom voluptuous import Any\n\nfrom opsdroid.database import Database\nfrom opsdroid.helper import JSONEncoder, JSONDecoder\n\n_LOGGER = logging.getLogger(__name__)\nCONFIG_SCHEMA = {\"host\": str, \"port\": Any(int, str), \"database\": int, \"password\": str}\n\n\nclass RedisDatabase(Database):\n \"\"\"Database class for storing data within a Redis instance.\"\"\"\n\n def __init__(self, config, opsdroid=None):\n \"\"\"Initialise the redis database.\n\n Set basic properties of the database. Initialise properties like\n name, connection arguments, database file, table name and config.\n\n Args:\n config (dict): The configuration of the database which consists\n of `file` and `table` name of the sqlite database\n specified in `configuration.yaml` file.\n opsdroid (OpsDroid): An instance of opsdroid.core.\n\n \"\"\"\n super().__init__(config, opsdroid=opsdroid)\n self.config = config\n self.client = None\n self.host = self.config.get(\"host\", \"localhost\")\n self.port = self.config.get(\"port\", 6379)\n self.database = self.config.get(\"database\", 0)\n self.password = self.config.get(\"password\", None)\n _LOGGER.debug(_(\"Loaded Redis database connector.\"))\n\n async def connect(self):\n \"\"\"Connect to the database.\n\n This method will connect to a Redis database. 
By default it will\n connect to Redis on localhost on port 6379\n\n \"\"\"\n try:\n self.client = Redis(\n host=self.host,\n port=int(self.port),\n db=self.database,\n password=self.password,\n )\n await self.client.ping() # to actually initiate a connection\n\n _LOGGER.info(\n _(\"Connected to Redis database %s from %s on port %s.\"),\n self.database,\n self.host,\n self.port,\n )\n except OSError:\n _LOGGER.warning(\n _(\"Unable to connect to Redis database on address: %s port: %s.\"),\n self.host,\n self.port,\n )\n\n async def put(self, key, data):\n \"\"\"Store the data object in Redis against the key.\n\n Args:\n key (string): The key to store the data object under.\n data (object): The data object to store.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Putting %s into Redis.\"), key)\n await self.client.execute_command(\n \"SET\", key, json.dumps(data, cls=JSONEncoder)\n )\n\n async def get(self, key):\n \"\"\"Get data from Redis for a given key.\n\n Args:\n key (string): The key to lookup in the database.\n\n Returns:\n object or None: The data object stored for that key, or None if no\n object found for that key.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Getting %s from Redis.\"), key)\n data = await self.client.execute_command(\"GET\", key)\n\n if data:\n return json.loads(data, object_hook=JSONDecoder())\n\n return None\n\n async def delete(self, key):\n \"\"\"Delete data from Redis for a given key.\n\n Args:\n key (string): The key to delete in the database.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Deleting %s from Redis.\"), key)\n await self.client.execute_command(\"DEL\", key)\n\n async def disconnect(self):\n \"\"\"Disconnect from the database.\"\"\"\n if self.client:\n await self.client.close()\n", "path": "opsdroid/database/redis/__init__.py"}]} | 1,633 | 490 |
gh_patches_debug_28142 | rasdani/github-patches | git_diff | pypa__pip-8617 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fail legacy install if .egg-info directory was not created
**What's the problem this feature will solve?**
Currently on the legacy install code path ([here](https://github.com/pypa/pip/blob/e0239735a3d8a504e00133090dde2210b6564a38/src/pip/_internal/req/req_install.py#L997-L1004)) if execution of `setup.py install ...` didn't result in an `.egg-info` directory being created we trace a warning and continue.
This is bad because we will have installed an uninstallable package and the only penalty is a small warning which is very easy to miss.
**Describe the solution you'd like**
Instead of tracing a warning we should throw an exception.
This prevents additional packages from being installed and possibly hiding the warning from users.
**Alternative Solutions**
Continue tracing the warning as before.
**Additional context**
* [this support query](https://kite.trade/forum/discussion/4335/could-not-find-egg-info-directory-in-install-record-for-kiteconnect) for [this package](https://github.com/zerodhatech/pykiteconnect/blob/7af60a776fbfafb6c30cfbd8d4f6c192618c5af5/setup.py#L48) which does an egg install in its setup.py regardless of setuptools version.
* [this query](https://stackoverflow.com/questions/26091641/what-does-a-could-not-find-egg-info-directory-in-install-record-from-pip-mean) which implicates having an older version of setuptools
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/pip/_internal/operations/install/legacy.py`
Content:
```
1 """Legacy installation process, i.e. `setup.py install`.
2 """
3
4 import logging
5 import os
6 import sys
7 from distutils.util import change_root
8
9 from pip._internal.utils.deprecation import deprecated
10 from pip._internal.utils.logging import indent_log
11 from pip._internal.utils.misc import ensure_dir
12 from pip._internal.utils.setuptools_build import make_setuptools_install_args
13 from pip._internal.utils.subprocess import runner_with_spinner_message
14 from pip._internal.utils.temp_dir import TempDirectory
15 from pip._internal.utils.typing import MYPY_CHECK_RUNNING
16
17 if MYPY_CHECK_RUNNING:
18 from typing import List, Optional, Sequence
19
20 from pip._internal.build_env import BuildEnvironment
21 from pip._internal.models.scheme import Scheme
22
23
24 logger = logging.getLogger(__name__)
25
26
27 class LegacyInstallFailure(Exception):
28 def __init__(self):
29 # type: () -> None
30 self.parent = sys.exc_info()
31
32
33 def install(
34 install_options, # type: List[str]
35 global_options, # type: Sequence[str]
36 root, # type: Optional[str]
37 home, # type: Optional[str]
38 prefix, # type: Optional[str]
39 use_user_site, # type: bool
40 pycompile, # type: bool
41 scheme, # type: Scheme
42 setup_py_path, # type: str
43 isolated, # type: bool
44 req_name, # type: str
45 build_env, # type: BuildEnvironment
46 unpacked_source_directory, # type: str
47 req_description, # type: str
48 ):
49 # type: (...) -> bool
50
51 header_dir = scheme.headers
52
53 with TempDirectory(kind="record") as temp_dir:
54 try:
55 record_filename = os.path.join(temp_dir.path, 'install-record.txt')
56 install_args = make_setuptools_install_args(
57 setup_py_path,
58 global_options=global_options,
59 install_options=install_options,
60 record_filename=record_filename,
61 root=root,
62 prefix=prefix,
63 header_dir=header_dir,
64 home=home,
65 use_user_site=use_user_site,
66 no_user_config=isolated,
67 pycompile=pycompile,
68 )
69
70 runner = runner_with_spinner_message(
71 "Running setup.py install for {}".format(req_name)
72 )
73 with indent_log(), build_env:
74 runner(
75 cmd=install_args,
76 cwd=unpacked_source_directory,
77 )
78
79 if not os.path.exists(record_filename):
80 logger.debug('Record file %s not found', record_filename)
81 # Signal to the caller that we didn't install the new package
82 return False
83
84 except Exception:
85 # Signal to the caller that we didn't install the new package
86 raise LegacyInstallFailure
87
88 # At this point, we have successfully installed the requirement.
89
90 # We intentionally do not use any encoding to read the file because
91 # setuptools writes the file using distutils.file_util.write_file,
92 # which does not specify an encoding.
93 with open(record_filename) as f:
94 record_lines = f.read().splitlines()
95
96 def prepend_root(path):
97 # type: (str) -> str
98 if root is None or not os.path.isabs(path):
99 return path
100 else:
101 return change_root(root, path)
102
103 for line in record_lines:
104 directory = os.path.dirname(line)
105 if directory.endswith('.egg-info'):
106 egg_info_dir = prepend_root(directory)
107 break
108 else:
109 deprecated(
110 reason=(
111 "{} did not indicate that it installed an "
112 ".egg-info directory. Only setup.py projects "
113 "generating .egg-info directories are supported."
114 ).format(req_description),
115 replacement=(
116 "for maintainers: updating the setup.py of {0}. "
117 "For users: contact the maintainers of {0} to let "
118 "them know to update their setup.py.".format(
119 req_name
120 )
121 ),
122 gone_in="20.2",
123 issue=6998,
124 )
125 # FIXME: put the record somewhere
126 return True
127
128 new_lines = []
129 for line in record_lines:
130 filename = line.strip()
131 if os.path.isdir(filename):
132 filename += os.path.sep
133 new_lines.append(
134 os.path.relpath(prepend_root(filename), egg_info_dir)
135 )
136 new_lines.sort()
137 ensure_dir(egg_info_dir)
138 inst_files_path = os.path.join(egg_info_dir, 'installed-files.txt')
139 with open(inst_files_path, 'w') as f:
140 f.write('\n'.join(new_lines) + '\n')
141
142 return True
143
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/pip/_internal/operations/install/legacy.py b/src/pip/_internal/operations/install/legacy.py
--- a/src/pip/_internal/operations/install/legacy.py
+++ b/src/pip/_internal/operations/install/legacy.py
@@ -6,7 +6,7 @@
import sys
from distutils.util import change_root
-from pip._internal.utils.deprecation import deprecated
+from pip._internal.exceptions import InstallationError
from pip._internal.utils.logging import indent_log
from pip._internal.utils.misc import ensure_dir
from pip._internal.utils.setuptools_build import make_setuptools_install_args
@@ -106,24 +106,12 @@
egg_info_dir = prepend_root(directory)
break
else:
- deprecated(
- reason=(
- "{} did not indicate that it installed an "
- ".egg-info directory. Only setup.py projects "
- "generating .egg-info directories are supported."
- ).format(req_description),
- replacement=(
- "for maintainers: updating the setup.py of {0}. "
- "For users: contact the maintainers of {0} to let "
- "them know to update their setup.py.".format(
- req_name
- )
- ),
- gone_in="20.2",
- issue=6998,
- )
- # FIXME: put the record somewhere
- return True
+ message = (
+ "{} did not indicate that it installed an "
+ ".egg-info directory. Only setup.py projects "
+ "generating .egg-info directories are supported."
+ ).format(req_description)
+ raise InstallationError(message)
new_lines = []
for line in record_lines:
| {"golden_diff": "diff --git a/src/pip/_internal/operations/install/legacy.py b/src/pip/_internal/operations/install/legacy.py\n--- a/src/pip/_internal/operations/install/legacy.py\n+++ b/src/pip/_internal/operations/install/legacy.py\n@@ -6,7 +6,7 @@\n import sys\n from distutils.util import change_root\n \n-from pip._internal.utils.deprecation import deprecated\n+from pip._internal.exceptions import InstallationError\n from pip._internal.utils.logging import indent_log\n from pip._internal.utils.misc import ensure_dir\n from pip._internal.utils.setuptools_build import make_setuptools_install_args\n@@ -106,24 +106,12 @@\n egg_info_dir = prepend_root(directory)\n break\n else:\n- deprecated(\n- reason=(\n- \"{} did not indicate that it installed an \"\n- \".egg-info directory. Only setup.py projects \"\n- \"generating .egg-info directories are supported.\"\n- ).format(req_description),\n- replacement=(\n- \"for maintainers: updating the setup.py of {0}. \"\n- \"For users: contact the maintainers of {0} to let \"\n- \"them know to update their setup.py.\".format(\n- req_name\n- )\n- ),\n- gone_in=\"20.2\",\n- issue=6998,\n- )\n- # FIXME: put the record somewhere\n- return True\n+ message = (\n+ \"{} did not indicate that it installed an \"\n+ \".egg-info directory. Only setup.py projects \"\n+ \"generating .egg-info directories are supported.\"\n+ ).format(req_description)\n+ raise InstallationError(message)\n \n new_lines = []\n for line in record_lines:\n", "issue": "Fail legacy install if .egg-info directory was not created\n**What's the problem this feature will solve?**\r\n\r\nCurrently on the legacy install code path ([here](https://github.com/pypa/pip/blob/e0239735a3d8a504e00133090dde2210b6564a38/src/pip/_internal/req/req_install.py#L997-L1004)) if execution of `setup.py install ...` didn't result in an `.egg-info` directory being created we trace a warning and continue.\r\n\r\nThis is bad because we will have installed an uninstallable package and the only penalty is a small warning which is very easy to miss.\r\n\r\n**Describe the solution you'd like**\r\n\r\nInstead of tracing a warning we should throw an exception.\r\n\r\nThis prevents additional packages from being installed and possibly hiding the warning from users.\r\n\r\n**Alternative Solutions**\r\n\r\nContinue tracing the warning as before.\r\n\r\n**Additional context**\r\n\r\n* [this support query](https://kite.trade/forum/discussion/4335/could-not-find-egg-info-directory-in-install-record-for-kiteconnect) for [this package](https://github.com/zerodhatech/pykiteconnect/blob/7af60a776fbfafb6c30cfbd8d4f6c192618c5af5/setup.py#L48) which does an egg install in its setup.py regardless of setuptools version.\r\n* [this query](https://stackoverflow.com/questions/26091641/what-does-a-could-not-find-egg-info-directory-in-install-record-from-pip-mean) which implicates having an older version of setuptools\n", "before_files": [{"content": "\"\"\"Legacy installation process, i.e. 
`setup.py install`.\n\"\"\"\n\nimport logging\nimport os\nimport sys\nfrom distutils.util import change_root\n\nfrom pip._internal.utils.deprecation import deprecated\nfrom pip._internal.utils.logging import indent_log\nfrom pip._internal.utils.misc import ensure_dir\nfrom pip._internal.utils.setuptools_build import make_setuptools_install_args\nfrom pip._internal.utils.subprocess import runner_with_spinner_message\nfrom pip._internal.utils.temp_dir import TempDirectory\nfrom pip._internal.utils.typing import MYPY_CHECK_RUNNING\n\nif MYPY_CHECK_RUNNING:\n from typing import List, Optional, Sequence\n\n from pip._internal.build_env import BuildEnvironment\n from pip._internal.models.scheme import Scheme\n\n\nlogger = logging.getLogger(__name__)\n\n\nclass LegacyInstallFailure(Exception):\n def __init__(self):\n # type: () -> None\n self.parent = sys.exc_info()\n\n\ndef install(\n install_options, # type: List[str]\n global_options, # type: Sequence[str]\n root, # type: Optional[str]\n home, # type: Optional[str]\n prefix, # type: Optional[str]\n use_user_site, # type: bool\n pycompile, # type: bool\n scheme, # type: Scheme\n setup_py_path, # type: str\n isolated, # type: bool\n req_name, # type: str\n build_env, # type: BuildEnvironment\n unpacked_source_directory, # type: str\n req_description, # type: str\n):\n # type: (...) -> bool\n\n header_dir = scheme.headers\n\n with TempDirectory(kind=\"record\") as temp_dir:\n try:\n record_filename = os.path.join(temp_dir.path, 'install-record.txt')\n install_args = make_setuptools_install_args(\n setup_py_path,\n global_options=global_options,\n install_options=install_options,\n record_filename=record_filename,\n root=root,\n prefix=prefix,\n header_dir=header_dir,\n home=home,\n use_user_site=use_user_site,\n no_user_config=isolated,\n pycompile=pycompile,\n )\n\n runner = runner_with_spinner_message(\n \"Running setup.py install for {}\".format(req_name)\n )\n with indent_log(), build_env:\n runner(\n cmd=install_args,\n cwd=unpacked_source_directory,\n )\n\n if not os.path.exists(record_filename):\n logger.debug('Record file %s not found', record_filename)\n # Signal to the caller that we didn't install the new package\n return False\n\n except Exception:\n # Signal to the caller that we didn't install the new package\n raise LegacyInstallFailure\n\n # At this point, we have successfully installed the requirement.\n\n # We intentionally do not use any encoding to read the file because\n # setuptools writes the file using distutils.file_util.write_file,\n # which does not specify an encoding.\n with open(record_filename) as f:\n record_lines = f.read().splitlines()\n\n def prepend_root(path):\n # type: (str) -> str\n if root is None or not os.path.isabs(path):\n return path\n else:\n return change_root(root, path)\n\n for line in record_lines:\n directory = os.path.dirname(line)\n if directory.endswith('.egg-info'):\n egg_info_dir = prepend_root(directory)\n break\n else:\n deprecated(\n reason=(\n \"{} did not indicate that it installed an \"\n \".egg-info directory. Only setup.py projects \"\n \"generating .egg-info directories are supported.\"\n ).format(req_description),\n replacement=(\n \"for maintainers: updating the setup.py of {0}. 
\"\n \"For users: contact the maintainers of {0} to let \"\n \"them know to update their setup.py.\".format(\n req_name\n )\n ),\n gone_in=\"20.2\",\n issue=6998,\n )\n # FIXME: put the record somewhere\n return True\n\n new_lines = []\n for line in record_lines:\n filename = line.strip()\n if os.path.isdir(filename):\n filename += os.path.sep\n new_lines.append(\n os.path.relpath(prepend_root(filename), egg_info_dir)\n )\n new_lines.sort()\n ensure_dir(egg_info_dir)\n inst_files_path = os.path.join(egg_info_dir, 'installed-files.txt')\n with open(inst_files_path, 'w') as f:\n f.write('\\n'.join(new_lines) + '\\n')\n\n return True\n", "path": "src/pip/_internal/operations/install/legacy.py"}], "after_files": [{"content": "\"\"\"Legacy installation process, i.e. `setup.py install`.\n\"\"\"\n\nimport logging\nimport os\nimport sys\nfrom distutils.util import change_root\n\nfrom pip._internal.exceptions import InstallationError\nfrom pip._internal.utils.logging import indent_log\nfrom pip._internal.utils.misc import ensure_dir\nfrom pip._internal.utils.setuptools_build import make_setuptools_install_args\nfrom pip._internal.utils.subprocess import runner_with_spinner_message\nfrom pip._internal.utils.temp_dir import TempDirectory\nfrom pip._internal.utils.typing import MYPY_CHECK_RUNNING\n\nif MYPY_CHECK_RUNNING:\n from typing import List, Optional, Sequence\n\n from pip._internal.build_env import BuildEnvironment\n from pip._internal.models.scheme import Scheme\n\n\nlogger = logging.getLogger(__name__)\n\n\nclass LegacyInstallFailure(Exception):\n def __init__(self):\n # type: () -> None\n self.parent = sys.exc_info()\n\n\ndef install(\n install_options, # type: List[str]\n global_options, # type: Sequence[str]\n root, # type: Optional[str]\n home, # type: Optional[str]\n prefix, # type: Optional[str]\n use_user_site, # type: bool\n pycompile, # type: bool\n scheme, # type: Scheme\n setup_py_path, # type: str\n isolated, # type: bool\n req_name, # type: str\n build_env, # type: BuildEnvironment\n unpacked_source_directory, # type: str\n req_description, # type: str\n):\n # type: (...) 
-> bool\n\n header_dir = scheme.headers\n\n with TempDirectory(kind=\"record\") as temp_dir:\n try:\n record_filename = os.path.join(temp_dir.path, 'install-record.txt')\n install_args = make_setuptools_install_args(\n setup_py_path,\n global_options=global_options,\n install_options=install_options,\n record_filename=record_filename,\n root=root,\n prefix=prefix,\n header_dir=header_dir,\n home=home,\n use_user_site=use_user_site,\n no_user_config=isolated,\n pycompile=pycompile,\n )\n\n runner = runner_with_spinner_message(\n \"Running setup.py install for {}\".format(req_name)\n )\n with indent_log(), build_env:\n runner(\n cmd=install_args,\n cwd=unpacked_source_directory,\n )\n\n if not os.path.exists(record_filename):\n logger.debug('Record file %s not found', record_filename)\n # Signal to the caller that we didn't install the new package\n return False\n\n except Exception:\n # Signal to the caller that we didn't install the new package\n raise LegacyInstallFailure\n\n # At this point, we have successfully installed the requirement.\n\n # We intentionally do not use any encoding to read the file because\n # setuptools writes the file using distutils.file_util.write_file,\n # which does not specify an encoding.\n with open(record_filename) as f:\n record_lines = f.read().splitlines()\n\n def prepend_root(path):\n # type: (str) -> str\n if root is None or not os.path.isabs(path):\n return path\n else:\n return change_root(root, path)\n\n for line in record_lines:\n directory = os.path.dirname(line)\n if directory.endswith('.egg-info'):\n egg_info_dir = prepend_root(directory)\n break\n else:\n message = (\n \"{} did not indicate that it installed an \"\n \".egg-info directory. Only setup.py projects \"\n \"generating .egg-info directories are supported.\"\n ).format(req_description)\n raise InstallationError(message)\n\n new_lines = []\n for line in record_lines:\n filename = line.strip()\n if os.path.isdir(filename):\n filename += os.path.sep\n new_lines.append(\n os.path.relpath(prepend_root(filename), egg_info_dir)\n )\n new_lines.sort()\n ensure_dir(egg_info_dir)\n inst_files_path = os.path.join(egg_info_dir, 'installed-files.txt')\n with open(inst_files_path, 'w') as f:\n f.write('\\n'.join(new_lines) + '\\n')\n\n return True\n", "path": "src/pip/_internal/operations/install/legacy.py"}]} | 1,957 | 378 |
gh_patches_debug_73 | rasdani/github-patches | git_diff | pypa__setuptools-1043 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
No module named 'six'
Flask's Tox/Travis builds all started failing. Looks like a new version of setuptools was just released that has a problem with six.
~~~pytb
Obtaining file:///home/david/Projects/flask
Installing collected packages: Flask
Running setup.py develop for Flask
Complete output from command /home/david/Projects/flask/.tox/py/bin/python3 -c "import setuptools, tokenize;__file__='/home/david/Projects/flask/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" develop --no-deps:
/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py:336: UserWarning: Normalizing '0.13-dev' to '0.13.dev0'
normalized_version,
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/home/david/Projects/flask/setup.py", line 109, in <module>
'''
File "/usr/lib64/python3.6/distutils/core.py", line 134, in setup
ok = dist.parse_command_line()
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py", line 363, in parse_command_line
result = _Distribution.parse_command_line(self)
File "/usr/lib64/python3.6/distutils/dist.py", line 472, in parse_command_line
args = self._parse_command_opts(parser, args)
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py", line 674, in _parse_command_opts
nargs = _Distribution._parse_command_opts(self, parser, args)
File "/usr/lib64/python3.6/distutils/dist.py", line 528, in _parse_command_opts
cmd_class = self.get_command_class(command)
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py", line 495, in get_command_class
self.cmdclass[command] = cmdclass = ep.load()
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2303, in load
return self.resolve()
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2309, in resolve
module = __import__(self.module_name, fromlist=['__name__'], level=0)
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/command/develop.py", line 11, in <module>
from setuptools.command.easy_install import easy_install
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/command/easy_install.py", line 49, in <module>
from setuptools.py27compat import rmtree_safe
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/py27compat.py", line 7, in <module>
import six
ModuleNotFoundError: No module named 'six'
~~~
Example failed build log: https://travis-ci.org/pallets/flask/jobs/238166427#L242
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setuptools/py27compat.py`
Content:
```
1 """
2 Compatibility Support for Python 2.7 and earlier
3 """
4
5 import platform
6
7 import six
8
9
10 def get_all_headers(message, key):
11 """
12 Given an HTTPMessage, return all headers matching a given key.
13 """
14 return message.get_all(key)
15
16
17 if six.PY2:
18 def get_all_headers(message, key):
19 return message.getheaders(key)
20
21
22 linux_py2_ascii = (
23 platform.system() == 'Linux' and
24 six.PY2
25 )
26
27 rmtree_safe = str if linux_py2_ascii else lambda x: x
28 """Workaround for http://bugs.python.org/issue24672"""
29
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setuptools/py27compat.py b/setuptools/py27compat.py
--- a/setuptools/py27compat.py
+++ b/setuptools/py27compat.py
@@ -4,7 +4,7 @@
import platform
-import six
+from setuptools.extern import six
def get_all_headers(message, key):
| {"golden_diff": "diff --git a/setuptools/py27compat.py b/setuptools/py27compat.py\n--- a/setuptools/py27compat.py\n+++ b/setuptools/py27compat.py\n@@ -4,7 +4,7 @@\n \n import platform\n \n-import six\n+from setuptools.extern import six\n \n \n def get_all_headers(message, key):\n", "issue": "No module named 'six'\nFlask's Tox/Travis builds all started failing. Looks like a new version of setuptools was just released that has a problem with six.\r\n\r\n~~~pytb\r\nObtaining file:///home/david/Projects/flask\r\nInstalling collected packages: Flask\r\n Running setup.py develop for Flask\r\n Complete output from command /home/david/Projects/flask/.tox/py/bin/python3 -c \"import setuptools, tokenize;__file__='/home/david/Projects/flask/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\\r\\n', '\\n');f.close();exec(compile(code, __file__, 'exec'))\" develop --no-deps:\r\n /home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py:336: UserWarning: Normalizing '0.13-dev' to '0.13.dev0'\r\n normalized_version,\r\n Traceback (most recent call last):\r\n File \"<string>\", line 1, in <module>\r\n File \"/home/david/Projects/flask/setup.py\", line 109, in <module>\r\n '''\r\n File \"/usr/lib64/python3.6/distutils/core.py\", line 134, in setup\r\n ok = dist.parse_command_line()\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py\", line 363, in parse_command_line\r\n result = _Distribution.parse_command_line(self)\r\n File \"/usr/lib64/python3.6/distutils/dist.py\", line 472, in parse_command_line\r\n args = self._parse_command_opts(parser, args)\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py\", line 674, in _parse_command_opts\r\n nargs = _Distribution._parse_command_opts(self, parser, args)\r\n File \"/usr/lib64/python3.6/distutils/dist.py\", line 528, in _parse_command_opts\r\n cmd_class = self.get_command_class(command)\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py\", line 495, in get_command_class\r\n self.cmdclass[command] = cmdclass = ep.load()\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/pkg_resources/__init__.py\", line 2303, in load\r\n return self.resolve()\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/pkg_resources/__init__.py\", line 2309, in resolve\r\n module = __import__(self.module_name, fromlist=['__name__'], level=0)\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/command/develop.py\", line 11, in <module>\r\n from setuptools.command.easy_install import easy_install\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/command/easy_install.py\", line 49, in <module>\r\n from setuptools.py27compat import rmtree_safe\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/py27compat.py\", line 7, in <module>\r\n import six\r\n ModuleNotFoundError: No module named 'six'\r\n~~~\r\n\r\nExample failed build log: https://travis-ci.org/pallets/flask/jobs/238166427#L242\n", "before_files": [{"content": "\"\"\"\nCompatibility Support for Python 2.7 and earlier\n\"\"\"\n\nimport platform\n\nimport six\n\n\ndef get_all_headers(message, key):\n \"\"\"\n Given an HTTPMessage, return all headers matching a given key.\n \"\"\"\n return message.get_all(key)\n\n\nif six.PY2:\n def get_all_headers(message, key):\n return message.getheaders(key)\n\n\nlinux_py2_ascii = 
(\n platform.system() == 'Linux' and\n six.PY2\n)\n\nrmtree_safe = str if linux_py2_ascii else lambda x: x\n\"\"\"Workaround for http://bugs.python.org/issue24672\"\"\"\n", "path": "setuptools/py27compat.py"}], "after_files": [{"content": "\"\"\"\nCompatibility Support for Python 2.7 and earlier\n\"\"\"\n\nimport platform\n\nfrom setuptools.extern import six\n\n\ndef get_all_headers(message, key):\n \"\"\"\n Given an HTTPMessage, return all headers matching a given key.\n \"\"\"\n return message.get_all(key)\n\n\nif six.PY2:\n def get_all_headers(message, key):\n return message.getheaders(key)\n\n\nlinux_py2_ascii = (\n platform.system() == 'Linux' and\n six.PY2\n)\n\nrmtree_safe = str if linux_py2_ascii else lambda x: x\n\"\"\"Workaround for http://bugs.python.org/issue24672\"\"\"\n", "path": "setuptools/py27compat.py"}]} | 1,254 | 73 |
gh_patches_debug_40886 | rasdani/github-patches | git_diff | pwndbg__pwndbg-900 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
telescope should not fold single values, should probably be configurable
In this screenshot, we should simply display the 0x28 and 0x90 and 0x108 offset in the stack trace. The ellipsis isn't particularly useful here, and actually adds to cognitive load due to having to check how many values are skipped.
This should probably be behind a config, e.g. `telescope-fold-minimum` which is expressed in pointer widths. This will allow people to control the setting to their choosing (or disable folding it with a value of 0 or -1 or something).

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pwndbg/commands/telescope.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3 """
4 Prints out pointer chains starting at some address in memory.
5
6 Generally used to print out the stack or register values.
7 """
8
9 import argparse
10 import collections
11 import math
12
13 import pwndbg.arch
14 import pwndbg.chain
15 import pwndbg.color.telescope as T
16 import pwndbg.color.theme as theme
17 import pwndbg.commands
18 import pwndbg.config
19 import pwndbg.memory
20 import pwndbg.regs
21 import pwndbg.typeinfo
22
23 telescope_lines = pwndbg.config.Parameter('telescope-lines', 8, 'number of lines to printed by the telescope command')
24 skip_repeating_values = pwndbg.config.Parameter('telescope-skip-repeating-val', True,
25 'whether to skip repeating values of the telescope command')
26
27 offset_separator = theme.Parameter('telescope-offset-separator', '│', 'offset separator of the telescope command')
28 offset_delimiter = theme.Parameter('telescope-offset-delimiter', ':', 'offset delimiter of the telescope command')
29 repeating_marker = theme.Parameter('telescope-repeating-marker', '... ↓',
30 'repeating values marker of the telescope command')
31
32
33 parser = argparse.ArgumentParser(description="""
34 Recursively dereferences pointers starting at the specified address
35 ($sp by default)
36 """)
37 parser.add_argument("address", nargs="?", default=None, type=int, help="The address to telescope at.")
38 parser.add_argument("count", nargs="?", default=telescope_lines, type=int, help="The number of lines to show.")
39 @pwndbg.commands.ArgparsedCommand(parser)
40 @pwndbg.commands.OnlyWhenRunning
41 def telescope(address=None, count=telescope_lines, to_string=False):
42 """
43 Recursively dereferences pointers starting at the specified address
44 ($sp by default)
45 """
46 ptrsize = pwndbg.typeinfo.ptrsize
47 if telescope.repeat:
48 address = telescope.last_address + ptrsize
49 telescope.offset += 1
50 else:
51 telescope.offset = 0
52
53 address = int(address if address else pwndbg.regs.sp) & pwndbg.arch.ptrmask
54 count = max(int(count), 1) & pwndbg.arch.ptrmask
55 delimiter = T.delimiter(offset_delimiter)
56 separator = T.separator(offset_separator)
57
58 # Allow invocation of "telescope 20" to dump 20 bytes at the stack pointer
59 if address < pwndbg.memory.MMAP_MIN_ADDR and not pwndbg.memory.peek(address):
60 count = address
61 address = pwndbg.regs.sp
62
63 # Allow invocation of "telescope a b" to dump all bytes from A to B
64 if int(address) <= int(count):
65 # adjust count if it is an address. use ceil division as count is number of
66 # ptrsize values and we don't want to strip out a value if dest is unaligned
67 count -= address
68 count = max(math.ceil(count / ptrsize), 1)
69
70 reg_values = collections.defaultdict(lambda: [])
71 for reg in pwndbg.regs.common:
72 reg_values[pwndbg.regs[reg]].append(reg)
73 # address = pwndbg.memory.poi(pwndbg.typeinfo.ppvoid, address)
74
75 start = address
76 stop = address + (count*ptrsize)
77 step = ptrsize
78
79 # Find all registers which show up in the trace
80 regs = {}
81 for i in range(start, stop, step):
82 values = list(reg_values[i])
83
84 for width in range(1, pwndbg.arch.ptrsize):
85 values.extend('%s-%i' % (r,width) for r in reg_values[i+width])
86
87 regs[i] = ' '.join(values)
88
89 # Find the longest set of register information
90 if regs:
91 longest_regs = max(map(len, regs.values())) + 1
92 else:
93 longest_regs = 0
94
95 # Print everything out
96 result = []
97 last = None
98 skip = False
99 for i,addr in enumerate(range(start, stop, step)):
100 if not pwndbg.memory.peek(addr):
101 result.append("<Could not read memory at %#x>" % addr)
102 break
103
104 # Collapse repeating values.
105 value = pwndbg.memory.pvoid(addr)
106 if skip_repeating_values and last == value:
107 if not skip:
108 result.append(T.repeating_marker('%s' % repeating_marker))
109 skip = True
110 continue
111 last = value
112 skip = False
113
114 line = ' '.join((T.offset("%02x%s%04x%s" % (i + telescope.offset, delimiter,
115 addr - start + (telescope.offset * ptrsize), separator)),
116 T.register(regs[addr].ljust(longest_regs)),
117 pwndbg.chain.format(addr)))
118 result.append(line)
119 telescope.offset += i
120 telescope.last_address = addr
121
122 if not to_string:
123 print('\n'.join(result))
124
125 return result
126
127
128 parser = argparse.ArgumentParser(description='dereferences on stack data with specified count and offset.')
129 parser.add_argument('count', nargs='?', default=8, type=int,
130 help='number of element to dump')
131 parser.add_argument('offset', nargs='?', default=0, type=int,
132 help='Element offset from $sp (support negative offset)')
133
134
135 @pwndbg.commands.ArgparsedCommand(parser)
136 @pwndbg.commands.OnlyWhenRunning
137 def stack(count, offset):
138 ptrsize = pwndbg.typeinfo.ptrsize
139 telescope.repeat = stack.repeat
140 telescope(address=pwndbg.regs.sp + offset * ptrsize, count=count)
141
142
143 telescope.last_address = 0
144 telescope.offset = 0
145
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pwndbg/commands/telescope.py b/pwndbg/commands/telescope.py
--- a/pwndbg/commands/telescope.py
+++ b/pwndbg/commands/telescope.py
@@ -23,6 +23,8 @@
telescope_lines = pwndbg.config.Parameter('telescope-lines', 8, 'number of lines to printed by the telescope command')
skip_repeating_values = pwndbg.config.Parameter('telescope-skip-repeating-val', True,
'whether to skip repeating values of the telescope command')
+skip_repeating_values_minimum = pwndbg.config.Parameter('telescope-skip-repeating-val-minimum', 3,
+ 'minimum amount of repeated values before skipping lines')
offset_separator = theme.Parameter('telescope-offset-separator', '│', 'offset separator of the telescope command')
offset_delimiter = theme.Parameter('telescope-offset-delimiter', ':', 'offset delimiter of the telescope command')
@@ -88,34 +90,47 @@
# Find the longest set of register information
if regs:
- longest_regs = max(map(len, regs.values())) + 1
+ longest_regs = max(map(len, regs.values()))
else:
longest_regs = 0
# Print everything out
result = []
- last = None
- skip = False
- for i,addr in enumerate(range(start, stop, step)):
+ last = None
+ collapse_buffer = []
+ skipped_padding = 2 + len(offset_delimiter) + 4 + len(offset_separator) + 1 + longest_regs + 1 - len(repeating_marker)
+
+ # Collapse repeating values exceeding minimum delta.
+ def collapse_repeating_values():
+ # The first line was already printed, hence increment by 1
+ if collapse_buffer and len(collapse_buffer) + 1 >= skip_repeating_values_minimum:
+ result.append(T.repeating_marker('%s%s%i skipped' % (repeating_marker, ' ' * skipped_padding, len(collapse_buffer))))
+ else:
+ result.extend(collapse_buffer)
+ collapse_buffer.clear()
+
+ for i, addr in enumerate(range(start, stop, step)):
if not pwndbg.memory.peek(addr):
+ collapse_repeating_values()
result.append("<Could not read memory at %#x>" % addr)
break
- # Collapse repeating values.
- value = pwndbg.memory.pvoid(addr)
- if skip_repeating_values and last == value:
- if not skip:
- result.append(T.repeating_marker('%s' % repeating_marker))
- skip = True
- continue
- last = value
- skip = False
-
line = ' '.join((T.offset("%02x%s%04x%s" % (i + telescope.offset, delimiter,
addr - start + (telescope.offset * ptrsize), separator)),
T.register(regs[addr].ljust(longest_regs)),
pwndbg.chain.format(addr)))
+
+ # Buffer repeating values.
+ if skip_repeating_values:
+ value = pwndbg.memory.pvoid(addr)
+ if last == value:
+ collapse_buffer.append(line)
+ continue
+ collapse_repeating_values()
+ last = value
+
result.append(line)
+
telescope.offset += i
telescope.last_address = addr
| {"golden_diff": "diff --git a/pwndbg/commands/telescope.py b/pwndbg/commands/telescope.py\n--- a/pwndbg/commands/telescope.py\n+++ b/pwndbg/commands/telescope.py\n@@ -23,6 +23,8 @@\n telescope_lines = pwndbg.config.Parameter('telescope-lines', 8, 'number of lines to printed by the telescope command')\n skip_repeating_values = pwndbg.config.Parameter('telescope-skip-repeating-val', True,\n 'whether to skip repeating values of the telescope command')\n+skip_repeating_values_minimum = pwndbg.config.Parameter('telescope-skip-repeating-val-minimum', 3,\n+ 'minimum amount of repeated values before skipping lines')\n \n offset_separator = theme.Parameter('telescope-offset-separator', '\u2502', 'offset separator of the telescope command')\n offset_delimiter = theme.Parameter('telescope-offset-delimiter', ':', 'offset delimiter of the telescope command')\n@@ -88,34 +90,47 @@\n \n # Find the longest set of register information\n if regs:\n- longest_regs = max(map(len, regs.values())) + 1\n+ longest_regs = max(map(len, regs.values()))\n else:\n longest_regs = 0\n \n # Print everything out\n result = []\n- last = None\n- skip = False\n- for i,addr in enumerate(range(start, stop, step)):\n+ last = None\n+ collapse_buffer = []\n+ skipped_padding = 2 + len(offset_delimiter) + 4 + len(offset_separator) + 1 + longest_regs + 1 - len(repeating_marker)\n+\n+ # Collapse repeating values exceeding minimum delta.\n+ def collapse_repeating_values():\n+ # The first line was already printed, hence increment by 1\n+ if collapse_buffer and len(collapse_buffer) + 1 >= skip_repeating_values_minimum:\n+ result.append(T.repeating_marker('%s%s%i skipped' % (repeating_marker, ' ' * skipped_padding, len(collapse_buffer))))\n+ else:\n+ result.extend(collapse_buffer)\n+ collapse_buffer.clear()\n+\n+ for i, addr in enumerate(range(start, stop, step)):\n if not pwndbg.memory.peek(addr):\n+ collapse_repeating_values()\n result.append(\"<Could not read memory at %#x>\" % addr)\n break\n \n- # Collapse repeating values.\n- value = pwndbg.memory.pvoid(addr)\n- if skip_repeating_values and last == value:\n- if not skip:\n- result.append(T.repeating_marker('%s' % repeating_marker))\n- skip = True\n- continue\n- last = value\n- skip = False\n-\n line = ' '.join((T.offset(\"%02x%s%04x%s\" % (i + telescope.offset, delimiter,\n addr - start + (telescope.offset * ptrsize), separator)),\n T.register(regs[addr].ljust(longest_regs)),\n pwndbg.chain.format(addr)))\n+\n+ # Buffer repeating values.\n+ if skip_repeating_values:\n+ value = pwndbg.memory.pvoid(addr)\n+ if last == value:\n+ collapse_buffer.append(line)\n+ continue\n+ collapse_repeating_values()\n+ last = value\n+\n result.append(line)\n+\n telescope.offset += i\n telescope.last_address = addr\n", "issue": "telescope should not fold single values, should probably be configurable\nIn this screenshot, we should simply display the 0x28 and 0x90 and 0x108 offset in the stack trace. The ellipsis isn't particularly useful here, and actually adds to cognitive load due to having to check how many values are skipped.\r\n\r\nThis should probably be behind a config, e.g. `telescope-fold-minimum` which is expressed in pointer widths. 
This will allow people to control the setting to their choosing (or disable folding it with a value of 0 or -1 or something).\r\n\r\n\r\n\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\"\"\"\nPrints out pointer chains starting at some address in memory.\n\nGenerally used to print out the stack or register values.\n\"\"\"\n\nimport argparse\nimport collections\nimport math\n\nimport pwndbg.arch\nimport pwndbg.chain\nimport pwndbg.color.telescope as T\nimport pwndbg.color.theme as theme\nimport pwndbg.commands\nimport pwndbg.config\nimport pwndbg.memory\nimport pwndbg.regs\nimport pwndbg.typeinfo\n\ntelescope_lines = pwndbg.config.Parameter('telescope-lines', 8, 'number of lines to printed by the telescope command')\nskip_repeating_values = pwndbg.config.Parameter('telescope-skip-repeating-val', True,\n 'whether to skip repeating values of the telescope command')\n\noffset_separator = theme.Parameter('telescope-offset-separator', '\u2502', 'offset separator of the telescope command')\noffset_delimiter = theme.Parameter('telescope-offset-delimiter', ':', 'offset delimiter of the telescope command')\nrepeating_marker = theme.Parameter('telescope-repeating-marker', '... \u2193',\n 'repeating values marker of the telescope command')\n\n\nparser = argparse.ArgumentParser(description=\"\"\"\n Recursively dereferences pointers starting at the specified address\n ($sp by default)\n \"\"\")\nparser.add_argument(\"address\", nargs=\"?\", default=None, type=int, help=\"The address to telescope at.\")\nparser.add_argument(\"count\", nargs=\"?\", default=telescope_lines, type=int, help=\"The number of lines to show.\")\[email protected](parser)\[email protected]\ndef telescope(address=None, count=telescope_lines, to_string=False):\n \"\"\"\n Recursively dereferences pointers starting at the specified address\n ($sp by default)\n \"\"\"\n ptrsize = pwndbg.typeinfo.ptrsize\n if telescope.repeat:\n address = telescope.last_address + ptrsize\n telescope.offset += 1\n else:\n telescope.offset = 0\n\n address = int(address if address else pwndbg.regs.sp) & pwndbg.arch.ptrmask\n count = max(int(count), 1) & pwndbg.arch.ptrmask\n delimiter = T.delimiter(offset_delimiter)\n separator = T.separator(offset_separator)\n\n # Allow invocation of \"telescope 20\" to dump 20 bytes at the stack pointer\n if address < pwndbg.memory.MMAP_MIN_ADDR and not pwndbg.memory.peek(address):\n count = address\n address = pwndbg.regs.sp\n\n # Allow invocation of \"telescope a b\" to dump all bytes from A to B\n if int(address) <= int(count):\n # adjust count if it is an address. 
use ceil division as count is number of\n # ptrsize values and we don't want to strip out a value if dest is unaligned\n count -= address\n count = max(math.ceil(count / ptrsize), 1)\n\n reg_values = collections.defaultdict(lambda: [])\n for reg in pwndbg.regs.common:\n reg_values[pwndbg.regs[reg]].append(reg)\n # address = pwndbg.memory.poi(pwndbg.typeinfo.ppvoid, address)\n\n start = address\n stop = address + (count*ptrsize)\n step = ptrsize\n\n # Find all registers which show up in the trace\n regs = {}\n for i in range(start, stop, step):\n values = list(reg_values[i])\n\n for width in range(1, pwndbg.arch.ptrsize):\n values.extend('%s-%i' % (r,width) for r in reg_values[i+width])\n\n regs[i] = ' '.join(values)\n\n # Find the longest set of register information\n if regs:\n longest_regs = max(map(len, regs.values())) + 1\n else:\n longest_regs = 0\n\n # Print everything out\n result = []\n last = None\n skip = False\n for i,addr in enumerate(range(start, stop, step)):\n if not pwndbg.memory.peek(addr):\n result.append(\"<Could not read memory at %#x>\" % addr)\n break\n\n # Collapse repeating values.\n value = pwndbg.memory.pvoid(addr)\n if skip_repeating_values and last == value:\n if not skip:\n result.append(T.repeating_marker('%s' % repeating_marker))\n skip = True\n continue\n last = value\n skip = False\n\n line = ' '.join((T.offset(\"%02x%s%04x%s\" % (i + telescope.offset, delimiter,\n addr - start + (telescope.offset * ptrsize), separator)),\n T.register(regs[addr].ljust(longest_regs)),\n pwndbg.chain.format(addr)))\n result.append(line)\n telescope.offset += i\n telescope.last_address = addr\n\n if not to_string:\n print('\\n'.join(result))\n\n return result\n\n\nparser = argparse.ArgumentParser(description='dereferences on stack data with specified count and offset.')\nparser.add_argument('count', nargs='?', default=8, type=int,\n help='number of element to dump')\nparser.add_argument('offset', nargs='?', default=0, type=int,\n help='Element offset from $sp (support negative offset)')\n\n\[email protected](parser)\[email protected]\ndef stack(count, offset):\n ptrsize = pwndbg.typeinfo.ptrsize\n telescope.repeat = stack.repeat\n telescope(address=pwndbg.regs.sp + offset * ptrsize, count=count)\n\n\ntelescope.last_address = 0\ntelescope.offset = 0\n", "path": "pwndbg/commands/telescope.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\"\"\"\nPrints out pointer chains starting at some address in memory.\n\nGenerally used to print out the stack or register values.\n\"\"\"\n\nimport argparse\nimport collections\nimport math\n\nimport pwndbg.arch\nimport pwndbg.chain\nimport pwndbg.color.telescope as T\nimport pwndbg.color.theme as theme\nimport pwndbg.commands\nimport pwndbg.config\nimport pwndbg.memory\nimport pwndbg.regs\nimport pwndbg.typeinfo\n\ntelescope_lines = pwndbg.config.Parameter('telescope-lines', 8, 'number of lines to printed by the telescope command')\nskip_repeating_values = pwndbg.config.Parameter('telescope-skip-repeating-val', True,\n 'whether to skip repeating values of the telescope command')\nskip_repeating_values_minimum = pwndbg.config.Parameter('telescope-skip-repeating-val-minimum', 3,\n 'minimum amount of repeated values before skipping lines')\n\noffset_separator = theme.Parameter('telescope-offset-separator', '\u2502', 'offset separator of the telescope command')\noffset_delimiter = theme.Parameter('telescope-offset-delimiter', ':', 'offset delimiter of the telescope command')\nrepeating_marker = 
theme.Parameter('telescope-repeating-marker', '... \u2193',\n 'repeating values marker of the telescope command')\n\n\nparser = argparse.ArgumentParser(description=\"\"\"\n Recursively dereferences pointers starting at the specified address\n ($sp by default)\n \"\"\")\nparser.add_argument(\"address\", nargs=\"?\", default=None, type=int, help=\"The address to telescope at.\")\nparser.add_argument(\"count\", nargs=\"?\", default=telescope_lines, type=int, help=\"The number of lines to show.\")\[email protected](parser)\[email protected]\ndef telescope(address=None, count=telescope_lines, to_string=False):\n \"\"\"\n Recursively dereferences pointers starting at the specified address\n ($sp by default)\n \"\"\"\n ptrsize = pwndbg.typeinfo.ptrsize\n if telescope.repeat:\n address = telescope.last_address + ptrsize\n telescope.offset += 1\n else:\n telescope.offset = 0\n\n address = int(address if address else pwndbg.regs.sp) & pwndbg.arch.ptrmask\n count = max(int(count), 1) & pwndbg.arch.ptrmask\n delimiter = T.delimiter(offset_delimiter)\n separator = T.separator(offset_separator)\n\n # Allow invocation of \"telescope 20\" to dump 20 bytes at the stack pointer\n if address < pwndbg.memory.MMAP_MIN_ADDR and not pwndbg.memory.peek(address):\n count = address\n address = pwndbg.regs.sp\n\n # Allow invocation of \"telescope a b\" to dump all bytes from A to B\n if int(address) <= int(count):\n # adjust count if it is an address. use ceil division as count is number of\n # ptrsize values and we don't want to strip out a value if dest is unaligned\n count -= address\n count = max(math.ceil(count / ptrsize), 1)\n\n reg_values = collections.defaultdict(lambda: [])\n for reg in pwndbg.regs.common:\n reg_values[pwndbg.regs[reg]].append(reg)\n # address = pwndbg.memory.poi(pwndbg.typeinfo.ppvoid, address)\n\n start = address\n stop = address + (count*ptrsize)\n step = ptrsize\n\n # Find all registers which show up in the trace\n regs = {}\n for i in range(start, stop, step):\n values = list(reg_values[i])\n\n for width in range(1, pwndbg.arch.ptrsize):\n values.extend('%s-%i' % (r,width) for r in reg_values[i+width])\n\n regs[i] = ' '.join(values)\n\n # Find the longest set of register information\n if regs:\n longest_regs = max(map(len, regs.values()))\n else:\n longest_regs = 0\n\n # Print everything out\n result = []\n last = None\n collapse_buffer = []\n skipped_padding = 2 + len(offset_delimiter) + 4 + len(offset_separator) + 1 + longest_regs + 1 - len(repeating_marker)\n\n # Collapse repeating values exceeding minimum delta.\n def collapse_repeating_values():\n # The first line was already printed, hence increment by 1\n if collapse_buffer and len(collapse_buffer) + 1 >= skip_repeating_values_minimum:\n result.append(T.repeating_marker('%s%s%i skipped' % (repeating_marker, ' ' * skipped_padding, len(collapse_buffer))))\n else:\n result.extend(collapse_buffer)\n collapse_buffer.clear()\n\n for i, addr in enumerate(range(start, stop, step)):\n if not pwndbg.memory.peek(addr):\n collapse_repeating_values()\n result.append(\"<Could not read memory at %#x>\" % addr)\n break\n\n line = ' '.join((T.offset(\"%02x%s%04x%s\" % (i + telescope.offset, delimiter,\n addr - start + (telescope.offset * ptrsize), separator)),\n T.register(regs[addr].ljust(longest_regs)),\n pwndbg.chain.format(addr)))\n\n # Buffer repeating values.\n if skip_repeating_values:\n value = pwndbg.memory.pvoid(addr)\n if last == value:\n collapse_buffer.append(line)\n continue\n collapse_repeating_values()\n last = value\n\n 
result.append(line)\n\n telescope.offset += i\n telescope.last_address = addr\n\n if not to_string:\n print('\\n'.join(result))\n\n return result\n\n\nparser = argparse.ArgumentParser(description='dereferences on stack data with specified count and offset.')\nparser.add_argument('count', nargs='?', default=8, type=int,\n help='number of element to dump')\nparser.add_argument('offset', nargs='?', default=0, type=int,\n help='Element offset from $sp (support negative offset)')\n\n\[email protected](parser)\[email protected]\ndef stack(count, offset):\n ptrsize = pwndbg.typeinfo.ptrsize\n telescope.repeat = stack.repeat\n telescope(address=pwndbg.regs.sp + offset * ptrsize, count=count)\n\n\ntelescope.last_address = 0\ntelescope.offset = 0\n", "path": "pwndbg/commands/telescope.py"}]} | 2,019 | 735 |
gh_patches_debug_40595 | rasdani/github-patches | git_diff | hpcaitech__ColossalAI-3159 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[tensor] fix some unittests
[tensor] fix some unittests
[tensor] fix some unittests
[booster] implement accelerator
# Overview
`Accelerator` is an abstraction for the hardware devices used for training. It can be CPU, GPU, TPU, etc. With this abstraction, we can provide room for future extension to more hardware which is compatible with PyTorch.
Wanna track the development progress? Take a look at
proposal: https://github.com/hpcaitech/ColossalAI/discussions/3046
project kanban: https://github.com/orgs/hpcaitech/projects/19
# Goal
The `Accelerator` can move models to the target device. A sample definition is given below and subject to future changes.
```python
class Accelerator:
def __init__(self, device):
self.device = device
def setup_model(self, model) -> torch.nn.Module
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `colossalai/booster/accelerator.py`
Content:
```
1 import torch
2 import torch.nn as nn
3
4 __all__ = ['Accelerator']
5
6
7 class Accelerator:
8
9 def __init__(self, device: torch.device):
10 self.device = device
11
12 def setup_model(self, model: nn.Module) -> nn.Module:
13 # TODO: implement this method
14 pass
15
```
Path: `colossalai/booster/booster.py`
Content:
```
1 from contextlib import contextmanager
2 from typing import Callable, Iterable, Iterator, List, Optional, Tuple, Union
3
4 import torch
5 import torch.nn as nn
6 from torch import Tensor
7 from torch.optim import Optimizer
8 from torch.optim.lr_scheduler import _LRScheduler as LRScheduler
9 from torch.utils.data import DataLoader
10
11 from .mixed_precision import MixedPrecision, mixed_precision_factory
12 from .plugin import Plugin
13
14 __all__ = ['Booster']
15
16
17 class Booster:
18 """
19 Booster is a high-level API for training neural networks. It provides a unified interface for
20 training with different precisio, accelerator, and plugin.
21
22 Examples:
23 >>> colossalai.launch(...)
24 >>> plugin = GeminiPlugin(stage=3, ...)
25 >>> booster = Booster(precision='fp16', plugin=plugin)
26 >>>
27 >>> model = GPT2()
28 >>> optimizer = Adam(model.parameters())
29 >>> dataloader = Dataloader(Dataset)
30 >>> lr_scheduler = LinearWarmupScheduler()
31 >>> criterion = GPTLMLoss()
32 >>>
33 >>> model, optimizer, lr_scheduler, dataloader = booster.boost(model, optimizer, lr_scheduler, dataloader)
34 >>>
35 >>> for epoch in range(max_epochs):
36 >>> for input_ids, attention_mask in dataloader:
37 >>> outputs = model(input_ids, attention_mask)
38 >>> loss = criterion(outputs.logits, input_ids)
39 >>> booster.backward(loss, optimizer)
40 >>> optimizer.step()
41 >>> lr_scheduler.step()
42 >>> optimizer.zero_grad()
43
44
45 Args:
46 device (str or torch.device): The device to run the training. Default: 'cuda'.
47 mixed_precision (str or MixedPrecision): The mixed precision to run the training. Default: None.
48 If the argument is a string, it can be 'fp16', 'fp16_apex', 'bf16', or 'fp8'.
49 'fp16' would use PyTorch AMP while `fp16_apex` would use Nvidia Apex.
50 plugin (Plugin): The plugin to run the training. Default: None.
51 """
52
53 def __init__(self,
54 device: Union[str, torch.device] = 'cuda',
55 mixed_precision: Union[MixedPrecision, str] = None,
56 plugin: Optional[Plugin] = None) -> None:
57 # validate and set precision
58 if isinstance(MixedPrecision, str):
59 # the user will take the default arguments for amp training
60 self.mixed_precision = mixed_precision_factory(mixed_precision)
61 elif isinstance(mixed_precision, MixedPrecision):
62 # the user can customize the arguments by passing the precision object
63 self.mixed_precision = mixed_precision
64 else:
65 raise ValueError(
66 f'Expected the argument mixed_precision to be a string or an instance of Precision, but got {type(mixed_precision)}.'
67 )
68
69 def boost(self, model: nn.Module, optimizer: Optimizer, criterion: Callable, lr_scheduler: LRScheduler,
70 dataloader: DataLoader) -> List[Union[nn.Module, Optimizer, LRScheduler, DataLoader]]:
71 """
72 Boost the model, optimizer, criterion, lr_scheduler, and dataloader.
73
74 Args:
75 model (nn.Module): The model to be boosted.
76 optimizer (Optimizer): The optimizer to be boosted.
77 criterion (Callable): The criterion to be boosted.
78 lr_scheduler (LRScheduler): The lr_scheduler to be boosted.
79 dataloader (DataLoader): The dataloader to be boosted.
80 """
81 # TODO(FrankLeeeee): consider multi-model and multi-optimizer case
82 # TODO(lsg): Add plugin control logic
83 # e.g.
84 # if self.plugin is not None and self.plugin.control_boost:
85 # ...
86 # transform model for mixed precision
87 model, optimizer, criterion = self.mixed_precision.configure(model, optimizer, criterion)
88 return model, optimizer, criterion, lr_scheduler, dataloader
89
90 def backward(self, loss: torch.Tensor, optimizer: Optimizer) -> None:
91 # TODO: implement this method with plugin
92 optimizer.backward(loss)
93
94 def execute_pipeline(self,
95 data_iter: Iterator,
96 model: nn.Module,
97 criterion: Callable[[torch.Tensor], torch.Tensor],
98 optimizer: Optimizer,
99 return_loss: bool = True,
100 return_outputs: bool = False) -> Tuple[Optional[torch.Tensor], ...]:
101 # TODO: implement this method
102 # run pipeline forward backward pass
103 # return loss or outputs if needed
104 pass
105
106 def no_sync(self, model: nn.Module) -> contextmanager:
107 # TODO: implement this method
108 pass
109
110 def save(self,
111 obj: Union[nn.Module, Optimizer, LRScheduler],
112 path_like: str,
113 plan: str = 'torch',
114 **kwargs) -> None:
115 # TODO: implement this method
116 pass
117
118 def load(self,
119 obj: Union[nn.Module, Optimizer, LRScheduler],
120 path_like: str,
121 plan: str = 'torch',
122 **kwargs) -> None:
123 # TODO: implement this method
124 pass
125
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/colossalai/booster/accelerator.py b/colossalai/booster/accelerator.py
--- a/colossalai/booster/accelerator.py
+++ b/colossalai/booster/accelerator.py
@@ -3,12 +3,52 @@
__all__ = ['Accelerator']
+_supported_devices = [
+ 'cpu',
+ 'cuda',
+
+ # To be supported
+ # 'xpu',
+ # 'npu',
+ # 'tpu',
+]
+
class Accelerator:
+ """
+ Accelerator is an abstraction for the hardware device that is used to run the model.
+
+ Args:
+ device (str): The device to be used. Currently only support 'cpu' and 'gpu'.
+ """
- def __init__(self, device: torch.device):
+ def __init__(self, device: str):
self.device = device
- def setup_model(self, model: nn.Module) -> nn.Module:
- # TODO: implement this method
- pass
+ assert self.device in _supported_devices, f"Device {self.device} is not supported yet, supported devices include {_supported_devices}"
+
+ def bind(self):
+ """
+ Set the default device for the current process.
+ """
+ if self.device == 'cpu':
+ pass
+ elif self.device == 'cuda':
+ # TODO(FrankLeeeee): use global environment to check if it is a dist job
+ # if is_distributed:
+ # local_rank = EnvTable().get_local_rank()
+ # torch.cuda.set_device(torch.device(f'cuda:{local_rank}'))
+ torch.cuda.set_device(torch.device('cuda'))
+ pass
+ else:
+ raise ValueError(f"Device {self.device} is not supported yet")
+
+ def configure_model(self, model: nn.Module) -> nn.Module:
+ """
+ Move the model to the device.
+
+ Args:
+ model (nn.Module): The model to be moved.
+ """
+ model = model.to(torch.device(self.device))
+ return model
diff --git a/colossalai/booster/booster.py b/colossalai/booster/booster.py
--- a/colossalai/booster/booster.py
+++ b/colossalai/booster/booster.py
@@ -8,6 +8,7 @@
from torch.optim.lr_scheduler import _LRScheduler as LRScheduler
from torch.utils.data import DataLoader
+from .accelerator import Accelerator
from .mixed_precision import MixedPrecision, mixed_precision_factory
from .plugin import Plugin
@@ -51,9 +52,16 @@
"""
def __init__(self,
- device: Union[str, torch.device] = 'cuda',
+ device: str = 'cuda',
mixed_precision: Union[MixedPrecision, str] = None,
plugin: Optional[Plugin] = None) -> None:
+ # TODO(FrankLeeeee): add plugin control logic
+ # if self.plugin is not None and self.plugin.control_accelerator:
+ # ...
+ # create acclerator
+ self.acceleartor = Accelerator(device)
+ self.acceleartor.set_default_device()
+
# validate and set precision
if isinstance(MixedPrecision, str):
# the user will take the default arguments for amp training
@@ -78,6 +86,11 @@
lr_scheduler (LRScheduler): The lr_scheduler to be boosted.
dataloader (DataLoader): The dataloader to be boosted.
"""
+ # TODO(FrankLeeeee): add plugin control logic
+ # if self.plugin is not None and self.plugin.control_accelerator:
+ # ...
+ model = self.acceleartor.configure_model(model)
+
# TODO(FrankLeeeee): consider multi-model and multi-optimizer case
# TODO(lsg): Add plugin control logic
# e.g.
| {"golden_diff": "diff --git a/colossalai/booster/accelerator.py b/colossalai/booster/accelerator.py\n--- a/colossalai/booster/accelerator.py\n+++ b/colossalai/booster/accelerator.py\n@@ -3,12 +3,52 @@\n \n __all__ = ['Accelerator']\n \n+_supported_devices = [\n+ 'cpu',\n+ 'cuda',\n+\n+ # To be supported\n+ # 'xpu',\n+ # 'npu',\n+ # 'tpu',\n+]\n+\n \n class Accelerator:\n+ \"\"\"\n+ Accelerator is an abstraction for the hardware device that is used to run the model.\n+\n+ Args:\n+ device (str): The device to be used. Currently only support 'cpu' and 'gpu'.\n+ \"\"\"\n \n- def __init__(self, device: torch.device):\n+ def __init__(self, device: str):\n self.device = device\n \n- def setup_model(self, model: nn.Module) -> nn.Module:\n- # TODO: implement this method\n- pass\n+ assert self.device in _supported_devices, f\"Device {self.device} is not supported yet, supported devices include {_supported_devices}\"\n+\n+ def bind(self):\n+ \"\"\"\n+ Set the default device for the current process.\n+ \"\"\"\n+ if self.device == 'cpu':\n+ pass\n+ elif self.device == 'cuda':\n+ # TODO(FrankLeeeee): use global environment to check if it is a dist job\n+ # if is_distributed:\n+ # local_rank = EnvTable().get_local_rank()\n+ # torch.cuda.set_device(torch.device(f'cuda:{local_rank}'))\n+ torch.cuda.set_device(torch.device('cuda'))\n+ pass\n+ else:\n+ raise ValueError(f\"Device {self.device} is not supported yet\")\n+\n+ def configure_model(self, model: nn.Module) -> nn.Module:\n+ \"\"\"\n+ Move the model to the device.\n+\n+ Args:\n+ model (nn.Module): The model to be moved.\n+ \"\"\"\n+ model = model.to(torch.device(self.device))\n+ return model\ndiff --git a/colossalai/booster/booster.py b/colossalai/booster/booster.py\n--- a/colossalai/booster/booster.py\n+++ b/colossalai/booster/booster.py\n@@ -8,6 +8,7 @@\n from torch.optim.lr_scheduler import _LRScheduler as LRScheduler\n from torch.utils.data import DataLoader\n \n+from .accelerator import Accelerator\n from .mixed_precision import MixedPrecision, mixed_precision_factory\n from .plugin import Plugin\n \n@@ -51,9 +52,16 @@\n \"\"\"\n \n def __init__(self,\n- device: Union[str, torch.device] = 'cuda',\n+ device: str = 'cuda',\n mixed_precision: Union[MixedPrecision, str] = None,\n plugin: Optional[Plugin] = None) -> None:\n+ # TODO(FrankLeeeee): add plugin control logic\n+ # if self.plugin is not None and self.plugin.control_accelerator:\n+ # ...\n+ # create acclerator\n+ self.acceleartor = Accelerator(device)\n+ self.acceleartor.set_default_device()\n+\n # validate and set precision\n if isinstance(MixedPrecision, str):\n # the user will take the default arguments for amp training\n@@ -78,6 +86,11 @@\n lr_scheduler (LRScheduler): The lr_scheduler to be boosted.\n dataloader (DataLoader): The dataloader to be boosted.\n \"\"\"\n+ # TODO(FrankLeeeee): add plugin control logic\n+ # if self.plugin is not None and self.plugin.control_accelerator:\n+ # ...\n+ model = self.acceleartor.configure_model(model)\n+\n # TODO(FrankLeeeee): consider multi-model and multi-optimizer case\n # TODO(lsg): Add plugin control logic\n # e.g.\n", "issue": "[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[booster] implement accelerator\n# Overview\n\n\n`Accelerator` is an abstraction for the hardware devices used for training. It can be CPU, GPU, TPU, etc. With this abstraction, we can provide room for future extension to more hardware which is compatible with PyTorch.\n\nWanna track the development progress? 
Take a look at\n\nproposal: https://github.com/hpcaitech/ColossalAI/discussions/3046\nproject kanban: https://github.com/orgs/hpcaitech/projects/19\n\n# Goal\n\nThe `Accelerator` can move models to the target device. A sample definition is given below and subject to future changes.\n\n```python\n\nclass Accelerator:\n\n def __init__(self, device):\n self.device = device\n\n def setup_model(self, model) -> torch.nn.Module\n```\n\n\n", "before_files": [{"content": "import torch\nimport torch.nn as nn\n\n__all__ = ['Accelerator']\n\n\nclass Accelerator:\n\n def __init__(self, device: torch.device):\n self.device = device\n\n def setup_model(self, model: nn.Module) -> nn.Module:\n # TODO: implement this method\n pass\n", "path": "colossalai/booster/accelerator.py"}, {"content": "from contextlib import contextmanager\nfrom typing import Callable, Iterable, Iterator, List, Optional, Tuple, Union\n\nimport torch\nimport torch.nn as nn\nfrom torch import Tensor\nfrom torch.optim import Optimizer\nfrom torch.optim.lr_scheduler import _LRScheduler as LRScheduler\nfrom torch.utils.data import DataLoader\n\nfrom .mixed_precision import MixedPrecision, mixed_precision_factory\nfrom .plugin import Plugin\n\n__all__ = ['Booster']\n\n\nclass Booster:\n \"\"\"\n Booster is a high-level API for training neural networks. It provides a unified interface for\n training with different precisio, accelerator, and plugin.\n\n Examples:\n >>> colossalai.launch(...)\n >>> plugin = GeminiPlugin(stage=3, ...)\n >>> booster = Booster(precision='fp16', plugin=plugin)\n >>>\n >>> model = GPT2()\n >>> optimizer = Adam(model.parameters())\n >>> dataloader = Dataloader(Dataset)\n >>> lr_scheduler = LinearWarmupScheduler()\n >>> criterion = GPTLMLoss()\n >>>\n >>> model, optimizer, lr_scheduler, dataloader = booster.boost(model, optimizer, lr_scheduler, dataloader)\n >>>\n >>> for epoch in range(max_epochs):\n >>> for input_ids, attention_mask in dataloader:\n >>> outputs = model(input_ids, attention_mask)\n >>> loss = criterion(outputs.logits, input_ids)\n >>> booster.backward(loss, optimizer)\n >>> optimizer.step()\n >>> lr_scheduler.step()\n >>> optimizer.zero_grad()\n\n\n Args:\n device (str or torch.device): The device to run the training. Default: 'cuda'.\n mixed_precision (str or MixedPrecision): The mixed precision to run the training. Default: None.\n If the argument is a string, it can be 'fp16', 'fp16_apex', 'bf16', or 'fp8'.\n 'fp16' would use PyTorch AMP while `fp16_apex` would use Nvidia Apex.\n plugin (Plugin): The plugin to run the training. 
Default: None.\n \"\"\"\n\n def __init__(self,\n device: Union[str, torch.device] = 'cuda',\n mixed_precision: Union[MixedPrecision, str] = None,\n plugin: Optional[Plugin] = None) -> None:\n # validate and set precision\n if isinstance(MixedPrecision, str):\n # the user will take the default arguments for amp training\n self.mixed_precision = mixed_precision_factory(mixed_precision)\n elif isinstance(mixed_precision, MixedPrecision):\n # the user can customize the arguments by passing the precision object\n self.mixed_precision = mixed_precision\n else:\n raise ValueError(\n f'Expected the argument mixed_precision to be a string or an instance of Precision, but got {type(mixed_precision)}.'\n )\n\n def boost(self, model: nn.Module, optimizer: Optimizer, criterion: Callable, lr_scheduler: LRScheduler,\n dataloader: DataLoader) -> List[Union[nn.Module, Optimizer, LRScheduler, DataLoader]]:\n \"\"\"\n Boost the model, optimizer, criterion, lr_scheduler, and dataloader.\n\n Args:\n model (nn.Module): The model to be boosted.\n optimizer (Optimizer): The optimizer to be boosted.\n criterion (Callable): The criterion to be boosted.\n lr_scheduler (LRScheduler): The lr_scheduler to be boosted.\n dataloader (DataLoader): The dataloader to be boosted.\n \"\"\"\n # TODO(FrankLeeeee): consider multi-model and multi-optimizer case\n # TODO(lsg): Add plugin control logic\n # e.g.\n # if self.plugin is not None and self.plugin.control_boost:\n # ...\n # transform model for mixed precision\n model, optimizer, criterion = self.mixed_precision.configure(model, optimizer, criterion)\n return model, optimizer, criterion, lr_scheduler, dataloader\n\n def backward(self, loss: torch.Tensor, optimizer: Optimizer) -> None:\n # TODO: implement this method with plugin\n optimizer.backward(loss)\n\n def execute_pipeline(self,\n data_iter: Iterator,\n model: nn.Module,\n criterion: Callable[[torch.Tensor], torch.Tensor],\n optimizer: Optimizer,\n return_loss: bool = True,\n return_outputs: bool = False) -> Tuple[Optional[torch.Tensor], ...]:\n # TODO: implement this method\n # run pipeline forward backward pass\n # return loss or outputs if needed\n pass\n\n def no_sync(self, model: nn.Module) -> contextmanager:\n # TODO: implement this method\n pass\n\n def save(self,\n obj: Union[nn.Module, Optimizer, LRScheduler],\n path_like: str,\n plan: str = 'torch',\n **kwargs) -> None:\n # TODO: implement this method\n pass\n\n def load(self,\n obj: Union[nn.Module, Optimizer, LRScheduler],\n path_like: str,\n plan: str = 'torch',\n **kwargs) -> None:\n # TODO: implement this method\n pass\n", "path": "colossalai/booster/booster.py"}], "after_files": [{"content": "import torch\nimport torch.nn as nn\n\n__all__ = ['Accelerator']\n\n_supported_devices = [\n 'cpu',\n 'cuda',\n\n # To be supported\n # 'xpu',\n # 'npu',\n # 'tpu',\n]\n\n\nclass Accelerator:\n \"\"\"\n Accelerator is an abstraction for the hardware device that is used to run the model.\n\n Args:\n device (str): The device to be used. 
Currently only support 'cpu' and 'gpu'.\n \"\"\"\n\n def __init__(self, device: str):\n self.device = device\n\n assert self.device in _supported_devices, f\"Device {self.device} is not supported yet, supported devices include {_supported_devices}\"\n\n def bind(self):\n \"\"\"\n Set the default device for the current process.\n \"\"\"\n if self.device == 'cpu':\n pass\n elif self.device == 'cuda':\n # TODO(FrankLeeeee): use global environment to check if it is a dist job\n # if is_distributed:\n # local_rank = EnvTable().get_local_rank()\n # torch.cuda.set_device(torch.device(f'cuda:{local_rank}'))\n torch.cuda.set_device(torch.device('cuda'))\n pass\n else:\n raise ValueError(f\"Device {self.device} is not supported yet\")\n\n def configure_model(self, model: nn.Module) -> nn.Module:\n \"\"\"\n Move the model to the device.\n\n Args:\n model (nn.Module): The model to be moved.\n \"\"\"\n model = model.to(torch.device(self.device))\n return model\n", "path": "colossalai/booster/accelerator.py"}, {"content": "from contextlib import contextmanager\nfrom typing import Callable, Iterable, Iterator, List, Optional, Tuple, Union\n\nimport torch\nimport torch.nn as nn\nfrom torch import Tensor\nfrom torch.optim import Optimizer\nfrom torch.optim.lr_scheduler import _LRScheduler as LRScheduler\nfrom torch.utils.data import DataLoader\n\nfrom .accelerator import Accelerator\nfrom .mixed_precision import MixedPrecision, mixed_precision_factory\nfrom .plugin import Plugin\n\n__all__ = ['Booster']\n\n\nclass Booster:\n \"\"\"\n Booster is a high-level API for training neural networks. It provides a unified interface for\n training with different precisio, accelerator, and plugin.\n\n Examples:\n >>> colossalai.launch(...)\n >>> plugin = GeminiPlugin(stage=3, ...)\n >>> booster = Booster(precision='fp16', plugin=plugin)\n >>>\n >>> model = GPT2()\n >>> optimizer = Adam(model.parameters())\n >>> dataloader = Dataloader(Dataset)\n >>> lr_scheduler = LinearWarmupScheduler()\n >>> criterion = GPTLMLoss()\n >>>\n >>> model, optimizer, lr_scheduler, dataloader = booster.boost(model, optimizer, lr_scheduler, dataloader)\n >>>\n >>> for epoch in range(max_epochs):\n >>> for input_ids, attention_mask in dataloader:\n >>> outputs = model(input_ids, attention_mask)\n >>> loss = criterion(outputs.logits, input_ids)\n >>> booster.backward(loss, optimizer)\n >>> optimizer.step()\n >>> lr_scheduler.step()\n >>> optimizer.zero_grad()\n\n\n Args:\n device (str or torch.device): The device to run the training. Default: 'cuda'.\n mixed_precision (str or MixedPrecision): The mixed precision to run the training. Default: None.\n If the argument is a string, it can be 'fp16', 'fp16_apex', 'bf16', or 'fp8'.\n 'fp16' would use PyTorch AMP while `fp16_apex` would use Nvidia Apex.\n plugin (Plugin): The plugin to run the training. 
Default: None.\n \"\"\"\n\n def __init__(self,\n device: str = 'cuda',\n mixed_precision: Union[MixedPrecision, str] = None,\n plugin: Optional[Plugin] = None) -> None:\n # TODO(FrankLeeeee): add plugin control logic\n # if self.plugin is not None and self.plugin.control_accelerator:\n # ...\n # create acclerator\n self.acceleartor = Accelerator(device)\n self.acceleartor.set_default_device()\n\n # validate and set precision\n if isinstance(MixedPrecision, str):\n # the user will take the default arguments for amp training\n self.mixed_precision = mixed_precision_factory(mixed_precision)\n elif isinstance(mixed_precision, MixedPrecision):\n # the user can customize the arguments by passing the precision object\n self.mixed_precision = mixed_precision\n else:\n raise ValueError(\n f'Expected the argument mixed_precision to be a string or an instance of Precision, but got {type(mixed_precision)}.'\n )\n\n def boost(self, model: nn.Module, optimizer: Optimizer, criterion: Callable, lr_scheduler: LRScheduler,\n dataloader: DataLoader) -> List[Union[nn.Module, Optimizer, LRScheduler, DataLoader]]:\n \"\"\"\n Boost the model, optimizer, criterion, lr_scheduler, and dataloader.\n\n Args:\n model (nn.Module): The model to be boosted.\n optimizer (Optimizer): The optimizer to be boosted.\n criterion (Callable): The criterion to be boosted.\n lr_scheduler (LRScheduler): The lr_scheduler to be boosted.\n dataloader (DataLoader): The dataloader to be boosted.\n \"\"\"\n # TODO(FrankLeeeee): add plugin control logic\n # if self.plugin is not None and self.plugin.control_accelerator:\n # ...\n model = self.acceleartor.configure_model(model)\n\n # TODO(FrankLeeeee): consider multi-model and multi-optimizer case\n # TODO(lsg): Add plugin control logic\n # e.g.\n # if self.plugin is not None and self.plugin.control_boost:\n # ...\n # transform model for mixed precision\n model, optimizer, criterion = self.mixed_precision.configure(model, optimizer, criterion)\n return model, optimizer, criterion, lr_scheduler, dataloader\n\n def backward(self, loss: torch.Tensor, optimizer: Optimizer) -> None:\n # TODO: implement this method with plugin\n optimizer.backward(loss)\n\n def execute_pipeline(self,\n data_iter: Iterator,\n model: nn.Module,\n criterion: Callable[[torch.Tensor], torch.Tensor],\n optimizer: Optimizer,\n return_loss: bool = True,\n return_outputs: bool = False) -> Tuple[Optional[torch.Tensor], ...]:\n # TODO: implement this method\n # run pipeline forward backward pass\n # return loss or outputs if needed\n pass\n\n def no_sync(self, model: nn.Module) -> contextmanager:\n # TODO: implement this method\n pass\n\n def save(self,\n obj: Union[nn.Module, Optimizer, LRScheduler],\n path_like: str,\n plan: str = 'torch',\n **kwargs) -> None:\n # TODO: implement this method\n pass\n\n def load(self,\n obj: Union[nn.Module, Optimizer, LRScheduler],\n path_like: str,\n plan: str = 'torch',\n **kwargs) -> None:\n # TODO: implement this method\n pass\n", "path": "colossalai/booster/booster.py"}]} | 1,945 | 892 |
gh_patches_debug_6810 | rasdani/github-patches | git_diff | Pylons__pyramid-2671 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
reify docstring doesn't render properly
The testsetup section is omitted from the rendered docs at http://pyramid.readthedocs.io/en/latest/api/decorator.html#pyramid.decorator.reify
Unfortunately this section is essential for understanding the example.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pyramid/decorator.py`
Content:
```
1 from functools import update_wrapper
2
3
4 class reify(object):
5 """ Use as a class method decorator. It operates almost exactly like the
6 Python ``@property`` decorator, but it puts the result of the method it
7 decorates into the instance dict after the first call, effectively
8 replacing the function it decorates with an instance variable. It is, in
9 Python parlance, a non-data descriptor. An example:
10
11 .. testsetup::
12
13 from pyramid.decorator import reify
14
15 class Foo(object):
16 @reify
17 def jammy(self):
18 print('jammy called')
19 return 1
20
21 And usage of Foo:
22
23 .. doctest::
24
25 >>> f = Foo()
26 >>> v = f.jammy
27 jammy called
28 >>> print(v)
29 1
30 >>> f.jammy
31 1
32 >>> # jammy func not called the second time; it replaced itself with 1
33 >>> # Note: reassignment is possible
34 >>> f.jammy = 2
35 >>> f.jammy
36 2
37 """
38 def __init__(self, wrapped):
39 self.wrapped = wrapped
40 update_wrapper(self, wrapped)
41
42 def __get__(self, inst, objtype=None):
43 if inst is None:
44 return self
45 val = self.wrapped(inst)
46 setattr(inst, self.wrapped.__name__, val)
47 return val
48
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pyramid/decorator.py b/pyramid/decorator.py
--- a/pyramid/decorator.py
+++ b/pyramid/decorator.py
@@ -8,6 +8,16 @@
replacing the function it decorates with an instance variable. It is, in
Python parlance, a non-data descriptor. An example:
+ .. code-block:: python
+
+ from pyramid.decorator import reify
+
+ class Foo(object):
+ @reify
+ def jammy(self):
+ print('jammy called')
+ return 1
+
.. testsetup::
from pyramid.decorator import reify
| {"golden_diff": "diff --git a/pyramid/decorator.py b/pyramid/decorator.py\n--- a/pyramid/decorator.py\n+++ b/pyramid/decorator.py\n@@ -8,6 +8,16 @@\n replacing the function it decorates with an instance variable. It is, in\n Python parlance, a non-data descriptor. An example:\n \n+ .. code-block:: python\n+\n+ from pyramid.decorator import reify\n+\n+ class Foo(object):\n+ @reify\n+ def jammy(self):\n+ print('jammy called')\n+ return 1\n+\n .. testsetup::\n \n from pyramid.decorator import reify\n", "issue": "reify docstring doesn't render properly\nThe testsetup section is omitted from the rendered docs at http://pyramid.readthedocs.io/en/latest/api/decorator.html#pyramid.decorator.reify\n\nUnfortunately this section is essential for understanding the example.\n\n", "before_files": [{"content": "from functools import update_wrapper\n\n\nclass reify(object):\n \"\"\" Use as a class method decorator. It operates almost exactly like the\n Python ``@property`` decorator, but it puts the result of the method it\n decorates into the instance dict after the first call, effectively\n replacing the function it decorates with an instance variable. It is, in\n Python parlance, a non-data descriptor. An example:\n\n .. testsetup::\n\n from pyramid.decorator import reify\n\n class Foo(object):\n @reify\n def jammy(self):\n print('jammy called')\n return 1\n\n And usage of Foo:\n\n .. doctest::\n\n >>> f = Foo()\n >>> v = f.jammy\n jammy called\n >>> print(v)\n 1\n >>> f.jammy\n 1\n >>> # jammy func not called the second time; it replaced itself with 1\n >>> # Note: reassignment is possible\n >>> f.jammy = 2\n >>> f.jammy\n 2\n \"\"\"\n def __init__(self, wrapped):\n self.wrapped = wrapped\n update_wrapper(self, wrapped)\n\n def __get__(self, inst, objtype=None):\n if inst is None:\n return self\n val = self.wrapped(inst)\n setattr(inst, self.wrapped.__name__, val)\n return val\n\n", "path": "pyramid/decorator.py"}], "after_files": [{"content": "from functools import update_wrapper\n\n\nclass reify(object):\n \"\"\" Use as a class method decorator. It operates almost exactly like the\n Python ``@property`` decorator, but it puts the result of the method it\n decorates into the instance dict after the first call, effectively\n replacing the function it decorates with an instance variable. It is, in\n Python parlance, a non-data descriptor. An example:\n\n .. code-block:: python\n\n from pyramid.decorator import reify\n\n class Foo(object):\n @reify\n def jammy(self):\n print('jammy called')\n return 1\n\n .. testsetup::\n\n from pyramid.decorator import reify\n\n class Foo(object):\n @reify\n def jammy(self):\n print('jammy called')\n return 1\n\n And usage of Foo:\n\n .. doctest::\n\n >>> f = Foo()\n >>> v = f.jammy\n jammy called\n >>> print(v)\n 1\n >>> f.jammy\n 1\n >>> # jammy func not called the second time; it replaced itself with 1\n >>> # Note: reassignment is possible\n >>> f.jammy = 2\n >>> f.jammy\n 2\n \"\"\"\n def __init__(self, wrapped):\n self.wrapped = wrapped\n update_wrapper(self, wrapped)\n\n def __get__(self, inst, objtype=None):\n if inst is None:\n return self\n val = self.wrapped(inst)\n setattr(inst, self.wrapped.__name__, val)\n return val\n\n", "path": "pyramid/decorator.py"}]} | 719 | 149 |