hash (stringlengths 40–40) | diff (stringlengths 131–26.7k) | message (stringlengths 7–694) | project (stringlengths 5–67) | split (stringclasses 1 value) | diff_languages (stringlengths 2–24)
---|---|---|---|---|---
36c217f8ea598b39275b380f7dceb50aa2a8f707
|
diff --git a/identify/extensions.py b/identify/extensions.py
index <HASH>..<HASH> 100644
--- a/identify/extensions.py
+++ b/identify/extensions.py
@@ -32,6 +32,7 @@ EXTENSIONS = {
'cxx': {'text', 'c++'},
'dart': {'text', 'dart'},
'def': {'text', 'def'},
+ 'dll': {'binary'},
'dtd': {'text', 'dtd'},
'ear': {'binary', 'zip', 'jar'},
'edn': {'text', 'clojure', 'edn'},
|
Add .dll as a binary file
|
chriskuehl_identify
|
train
|
py
|
209974a1cfe6fdf2e6b6abf6e1a1dc4fad2ea90e
|
diff --git a/config/debugbar.php b/config/debugbar.php
index <HASH>..<HASH> 100644
--- a/config/debugbar.php
+++ b/config/debugbar.php
@@ -180,4 +180,13 @@ return [
*/
'route_prefix' => '_debugbar',
+ /*
+ |--------------------------------------------------------------------------
+ | DebugBar route domain
+ |--------------------------------------------------------------------------
+ |
+ | By default DebugBar route served from the same domain that request served.
+ | To override default domain, specify it as a non-empty value.
+ */
+ 'route_domain' => null,
];
diff --git a/src/ServiceProvider.php b/src/ServiceProvider.php
index <HASH>..<HASH> 100644
--- a/src/ServiceProvider.php
+++ b/src/ServiceProvider.php
@@ -77,6 +77,7 @@ class ServiceProvider extends \Illuminate\Support\ServiceProvider
$routeConfig = [
'namespace' => 'Barryvdh\Debugbar\Controllers',
'prefix' => $this->app['config']->get('debugbar.route_prefix'),
+ 'domain' => $this->app['config']->get('debugbar.route_domain'),
];
$this->getRouter()->group($routeConfig, function($router) {
|
Add route domain config option (#<I>)
|
barryvdh_laravel-debugbar
|
train
|
php,php
|
e4a5951f12eb7b532d453112a6cc974b17426465
|
diff --git a/django_q/tests/test_cluster.py b/django_q/tests/test_cluster.py
index <HASH>..<HASH> 100644
--- a/django_q/tests/test_cluster.py
+++ b/django_q/tests/test_cluster.py
@@ -18,6 +18,7 @@ from django_q.status import Stat
from django_q.brokers import get_broker
from .tasks import multiply
+
class WordClass(object):
def __init__(self):
self.word_list = DEFAULT_WORDLIST
@@ -30,6 +31,7 @@ class WordClass(object):
def broker():
Conf.DISQUE_NODES = None
Conf.IRON_MQ = None
+ Conf.SQS = None
Conf.DJANGO_REDIS = 'default'
return get_broker()
diff --git a/django_q/tests/test_scheduler.py b/django_q/tests/test_scheduler.py
index <HASH>..<HASH> 100644
--- a/django_q/tests/test_scheduler.py
+++ b/django_q/tests/test_scheduler.py
@@ -14,6 +14,10 @@ from django_q.tasks import Schedule, fetch, schedule as create_schedule, queue_s
@pytest.fixture
def broker():
+ Conf.DISQUE_NODES = None
+ Conf.IRON_MQ = None
+ Conf.SQS = None
+ Conf.DJANGO_REDIS = 'default'
return get_broker()
|
Resets SQS conf in case of test failure
|
Koed00_django-q
|
train
|
py,py
|
b21f6d78b3624dcbbc0ddbb7108e80adc08e6ccf
|
diff --git a/apitools/base/py/testing/mock.py b/apitools/base/py/testing/mock.py
index <HASH>..<HASH> 100644
--- a/apitools/base/py/testing/mock.py
+++ b/apitools/base/py/testing/mock.py
@@ -186,6 +186,7 @@ class _MockedMethod(object):
"""A mocked API service method."""
def __init__(self, key, mocked_client, real_method):
+ self.__name__ = real_method.__name__
self.__key = key
self.__mocked_client = mocked_client
self.__real_method = real_method
|
Fixing _MockedMethod so that you can call client.GetMethodsList() when a
client is mocked out using mock.Client().Mock().
|
google_apitools
|
train
|
py
|
ba6630c4f8cea20e2e2092ba9b74e8ae6a0aaff7
|
diff --git a/tchannel/sync/client.py b/tchannel/sync/client.py
index <HASH>..<HASH> 100644
--- a/tchannel/sync/client.py
+++ b/tchannel/sync/client.py
@@ -49,7 +49,7 @@ class TChannel(AsyncTChannel):
# thrift_service is the result of a call to ``thrift_request_builder``
future = tchannel.thrift(
thrift_service.getItem('foo'),
- timeout=1000,
+ timeout=1, # 1 second
)
result = future.result()
|
update the example of sync client in the comments
to use the right timeout unit.
|
uber_tchannel-python
|
train
|
py
|
7fa2607bd12290d7ae84d74b9dcc59b1777f8d58
|
diff --git a/p2p/discover/table.go b/p2p/discover/table.go
index <HASH>..<HASH> 100644
--- a/p2p/discover/table.go
+++ b/p2p/discover/table.go
@@ -25,7 +25,7 @@ const (
hashBits = len(common.Hash{}) * 8
nBuckets = hashBits + 1 // Number of buckets
- maxBondingPingPongs = 10
+ maxBondingPingPongs = 16
)
type Table struct {
|
p2p/discover: bump maxBondingPingPongs to <I>
This should increase the speed a bit because all findnode
results (up to <I>) can be verified at the same time.
|
ethereum_go-ethereum
|
train
|
go
|
9c8c6b0e8a9cf658d7d56b0f6a41d8f348a13f13
|
diff --git a/src/components/dialog/dialog.js b/src/components/dialog/dialog.js
index <HASH>..<HASH> 100644
--- a/src/components/dialog/dialog.js
+++ b/src/components/dialog/dialog.js
@@ -442,7 +442,6 @@ function MdDialogDirective($$rAF, $mdTheming, $mdDialog) {
* `three` into the controller, with the value 3. If `bindToController` is true, they will be
* copied to the controller instead.
* - `bindToController` - `bool`: bind the locals to the controller, instead of passing them in.
- * These values will not be available until after initialization.
* - `resolve` - `{object=}`: Similar to locals, except it takes promises as values, and the
* dialog will not open until all of the promises resolve.
* - `controllerAs` - `{string=}`: An alias to assign the controller to on the scope.
|
Update dialog.js
"These values will not be available until after initialization" has been fixed since <URL>
|
angular_material
|
train
|
js
|
61c0edce3cc55d518b884be9780fdede6f4d3c2b
|
diff --git a/src/Autoloader.php b/src/Autoloader.php
index <HASH>..<HASH> 100644
--- a/src/Autoloader.php
+++ b/src/Autoloader.php
@@ -1,4 +1,13 @@
<?php
+/**
+ * Contains autoloading functionality.
+ *
+ * @package Fragen\Autoloader
+ * @author Andy Fragen <[email protected]>
+ * @license GPL-2.0+
+ * @link http://github.com/afragen/autoloader
+ * @copyright 2015 Andy Fragen
+ */
namespace Fragen;
@@ -18,7 +27,6 @@ if ( ! class_exists( 'Fragen\\Autoloader' ) ) {
* @package Fragen\Autoloader
* @author Andy Fragen <[email protected]>
* @author Barry Hughes <[email protected]>
- * @license GPL-2.0+
* @link http://github.com/afragen/autoloader
* @copyright 2015 Andy Fragen
* @version 2.0.0
|
Created file-level docblock/moved some class-level to file-level.
|
afragen_github-updater
|
train
|
php
|
616d12351f3dab743a1eb3d60812d2da5c7e07c3
|
diff --git a/babel.config.js b/babel.config.js
index <HASH>..<HASH> 100644
--- a/babel.config.js
+++ b/babel.config.js
@@ -1,20 +1,13 @@
module.exports = {
babelrcRoots: [".", "./website/*"],
presets: [["@babel/env", { loose: true }], "@babel/react"],
- plugins: ["dev-expression"],
+ plugins: [
+ "dev-expression",
+ ["@babel/plugin-proposal-class-properties", { loose: true }]
+ ],
env: {
test: {
presets: [["@babel/preset-env", { targets: { node: "current" } }]]
}
- },
- overrides: [
- {
- test: "./packages/react-router/modules/*",
- plugins: [["@babel/plugin-proposal-class-properties", { loose: true }]]
- },
- {
- test: "./packages/react-router-dom/modules/*",
- plugins: [["@babel/plugin-proposal-class-properties", { loose: true }]]
- }
- ]
+ }
};
|
refactor: simplify project-wide babel config
|
ReactTraining_react-router
|
train
|
js
|
c0b847509f6ca7a13880e5ec0529808b09ec79d8
|
diff --git a/components/messaging/messaging-int/src/main/java/org/torquebox/messaging/deployers/TasksDeployer.java b/components/messaging/messaging-int/src/main/java/org/torquebox/messaging/deployers/TasksDeployer.java
index <HASH>..<HASH> 100644
--- a/components/messaging/messaging-int/src/main/java/org/torquebox/messaging/deployers/TasksDeployer.java
+++ b/components/messaging/messaging-int/src/main/java/org/torquebox/messaging/deployers/TasksDeployer.java
@@ -48,7 +48,7 @@ public class TasksDeployer extends AbstractDeployer {
MessageProcessorMetaData processorMetaData = new MessageProcessorMetaData();
processorMetaData.setDestinationName( queue.getName() );
- processorMetaData.setRubyClassName( task.getRubyClassName() );
+ processorMetaData.setRubyClassName( task.getRubyClassName(), task.getLocation() );
AttachmentUtils.multipleAttach(unit, processorMetaData, processorMetaData.getName() );
}
|
Need to include the location so the message processor can find the task.
|
torquebox_torquebox
|
train
|
java
|
c1ba82ab8091a6f782d815ec0fc61dc23eecaaa4
|
diff --git a/index.js b/index.js
index <HASH>..<HASH> 100644
--- a/index.js
+++ b/index.js
@@ -7,7 +7,8 @@ var core = {};
// load core modules from builtin dir
fs.readdirSync(path.resolve(__dirname, 'builtin')).forEach(function(file) {
- core[path.basename(file, '.js')] = path.resolve(__dirname, 'builtin', file);
+ if (file[0] === '_') return;
+ core[path.basename(file, '.js')] = path.resolve(__dirname, 'builtin', file);
});
// manually resolve modules that would otherwise resolve as core
diff --git a/test/node/node-test.js b/test/node/node-test.js
index <HASH>..<HASH> 100644
--- a/test/node/node-test.js
+++ b/test/node/node-test.js
@@ -7,19 +7,25 @@ test('test that all the modules are set', function (t) {
'assert',
'buffer',
'child_process',
+ 'cluster',
'console',
'constants',
'crypto',
'dgram',
+ 'dns',
+ 'domain',
'events',
'fs',
'http',
'https',
'net',
+ 'os',
'path',
'process',
'punycode',
'querystring',
+ 'readline',
+ 'repl',
'stream',
'string_decoder',
'sys',
|
[fix] don't include internal files in module object
|
alexgorbatchev_node-browser-builtins
|
train
|
js,js
|
18a93230042deb6104a54ea5812fd2d93dd2eae2
|
diff --git a/benchexec/runexecutor.py b/benchexec/runexecutor.py
index <HASH>..<HASH> 100644
--- a/benchexec/runexecutor.py
+++ b/benchexec/runexecutor.py
@@ -209,6 +209,15 @@ class RunExecutor(object):
if user is not None:
self._kill_process = self._kill_process_with_sudo
+ # Check if we are allowed to execute 'kill' with dummy signal.
+ sudo_check = _SUDO_ARGS + [user, 'kill', '-0', '0']
+ with subprocess.Popen(sudo_check, stdout=subprocess.PIPE, stderr=subprocess.PIPE) as p:
+ if p.wait():
+ logging.error('Calling "{}" failed with error code {} and the following output: {}\n{}'
+ .format(' '.join(sudo_check), p.returncode,
+ p.stdout.read().decode().strip(),
+ p.stderr.read().decode().strip()))
+ sys.exit('Cannot execute benchmark as user "{0}", please fix your sudo setup.'.format(user))
else:
self._kill_process = util.kill_process
|
In sudo mode, check whether we are allowed to execute commands
before actually executing them to distinguish sudo failures from other
problems.
|
sosy-lab_benchexec
|
train
|
py
|
2c7a999f23b168f85c2b1c06296f617c626fde9d
|
diff --git a/gitlab/objects.py b/gitlab/objects.py
index <HASH>..<HASH> 100644
--- a/gitlab/objects.py
+++ b/gitlab/objects.py
@@ -398,7 +398,9 @@ class GitlabObject(object):
if kwargs:
for k, v in kwargs.items():
- self.__dict__[k] = v
+ # Don't overwrite attributes returned by the server (#171)
+ if k not in self.__dict__ or not self.__dict__[k]:
+ self.__dict__[k] = v
# Special handling for api-objects that don't have id-number in api
# responses. Currently only Labels and Files
|
Don't overwrite attributes returned by the server
Fixes #<I>
|
python-gitlab_python-gitlab
|
train
|
py
|
c82d681950bc375ec31d5ead9d5ed6dc29c44a63
|
diff --git a/pp/test_channel.py b/pp/test_channel.py
index <HASH>..<HASH> 100644
--- a/pp/test_channel.py
+++ b/pp/test_channel.py
@@ -91,11 +91,6 @@ class TestChannel(unittest.TestCase):
name = "newchan",
resolution = "a")
- numb = np.random.uniform(100000)
- self.assertRaises(TypeError, Channel,
- name = "newchan",
- resolution = numb)
-
# Wavelength
numbs = [np.random.uniform(100),
|
Bugfix: test case for channel resolution did not follow previous patch allowing real resolutions.
|
pytroll_satpy
|
train
|
py
|
c3be390926253924de017d9d3b3ac69d5c356438
|
diff --git a/dev.py b/dev.py
index <HASH>..<HASH> 100644
--- a/dev.py
+++ b/dev.py
@@ -46,7 +46,8 @@ tinyCollection.query = tm.Query()
# logger.debug('condition found: {}'.format(condition))
q = {'user_number': {'$gte': 3, '$lte': 10}}
-condition2 = tinyCollection.parse_condition({'$gte': 3, '$lte': 6}, 'user_number')
+#condition2 = tinyCollection.parse_condition({'$gte': 3, '$lte': 6}, 'user_number')
+condition2 = tinyCollection.parse_condition(q)
for c in condition2:
print(c)
diff --git a/tinymongo/tinymongo.py b/tinymongo/tinymongo.py
index <HASH>..<HASH> 100644
--- a/tinymongo/tinymongo.py
+++ b/tinymongo/tinymongo.py
@@ -144,7 +144,7 @@ class TinyMongoCollection(object):
logger.debug('c: {}'.format(conditions))
if isinstance(value, dict):
- yield from (self.parse_condition(value), key, conditions)
+ yield from (self.parse_condition(value, key))
else:
yield conditions
|
The last generator returned appears to be the correct parsing of the conditions
|
schapman1974_tinymongo
|
train
|
py,py
|
496523cb3866a65bd0dc0c4dc9d4f951c4343d6d
|
diff --git a/src/windows/EmailComposerProxyImpl.js b/src/windows/EmailComposerProxyImpl.js
index <HASH>..<HASH> 100644
--- a/src/windows/EmailComposerProxyImpl.js
+++ b/src/windows/EmailComposerProxyImpl.js
@@ -19,7 +19,7 @@ specific language governing permissions and limitations
under the License.
*/
-var proxy = require('de.appplant.cordova.plugin.email-composer.EmailComposerProxy'),
+var proxy = require('cordova-plugin-email-composer.EmailComposerProxy'),
impl = proxy.impl = {},
WinMail = Windows.ApplicationModel.Email;
|
Updating Windows Plugin
The Windows plugin seemed to be looking for the module "cordova-plugin-email-composer".
|
katzer_cordova-plugin-email-composer
|
train
|
js
|
4142d67d55d14e69870c174c359dea9f8f8e2c3a
|
diff --git a/src/main/java/net/ripe/commons/ip/Ipv6PrefixUtils.java b/src/main/java/net/ripe/commons/ip/Ipv6PrefixUtils.java
index <HASH>..<HASH> 100644
--- a/src/main/java/net/ripe/commons/ip/Ipv6PrefixUtils.java
+++ b/src/main/java/net/ripe/commons/ip/Ipv6PrefixUtils.java
@@ -6,8 +6,8 @@ import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
-import com.google.common.base.Optional;
import org.apache.commons.lang3.Validate;
+import com.google.common.base.Optional;
public final class Ipv6PrefixUtils { // TODO(yg): Investigate how to abstract for Ipv4 and Ipv6 in an elegant way.
@@ -64,6 +64,10 @@ public final class Ipv6PrefixUtils { // TODO(yg): Investigate how to abstract fo
};
return findPrefixInRangeWhichFitsPrefixLength(range, prefixLength, comparator);
}
+
+ public static int findMaxPrefixLengthForAddress(Ipv6 address) {
+ return getMaxValidPrefix(address.value());
+ }
private static Optional<Ipv6Range> findPrefixInRangeWhichFitsPrefixLength(Ipv6Range range, int prefixLength, Comparator<Ipv6Range> comparator) {
List<Ipv6Range> prefixes = splitIntoPrefixes(range);
|
added public method to get maxprefix for that Ipv6 address.
|
jgonian_commons-ip-math
|
train
|
java
|
7b53bfb0723d11b7e520230a35aba6191bf0d5da
|
diff --git a/src/language/Linting.js b/src/language/Linting.js
index <HASH>..<HASH> 100644
--- a/src/language/Linting.js
+++ b/src/language/Linting.js
@@ -170,7 +170,7 @@ define(function (require, exports, module) {
if (provider) {
perfTimerLint = PerfUtils.markStart("Linting '" + languageId + "':\t" + currentDoc.file.fullPath);
- var result = provider.scanFile(currentDoc.getText(), currentDoc.fullPath);
+ var result = provider.scanFile(currentDoc.getText(), currentDoc.file.fullPath);
_lastResult = result;
PerfUtils.addMeasurement(perfTimerLint);
|
Fix bug spotted by @ingorichter in #<I> (seems worth having this working
from the get-go)
|
adobe_brackets
|
train
|
js
|
12616e426b1522ef2099701fafdd6ad9f476e8c1
|
diff --git a/daemon_unix.go b/daemon_unix.go
index <HASH>..<HASH> 100644
--- a/daemon_unix.go
+++ b/daemon_unix.go
@@ -8,8 +8,6 @@ import (
"os"
"path/filepath"
"syscall"
-
- "github.com/kardianos/osext"
)
// A Context describes daemon context.
@@ -172,7 +170,7 @@ func (d *Context) closeFiles() (err error) {
}
func (d *Context) prepareEnv() (err error) {
- if d.abspath, err = osext.Executable(); err != nil {
+ if d.abspath, err = os.Executable(); err != nil {
return
}
|
Remove "github.com/kardianos/osext" dependency
Replaced with os.Executable
Fixes #<I>.
|
sevlyar_go-daemon
|
train
|
go
|
ebb6d71ca0c19e72ee08a5240fa3d7154688d189
|
diff --git a/Entity/Administrator.php b/Entity/Administrator.php
index <HASH>..<HASH> 100644
--- a/Entity/Administrator.php
+++ b/Entity/Administrator.php
@@ -371,7 +371,7 @@ class Administrator implements \Serializable, AdvancedUserInterface
/**
* @return boolean
*/
- public function getLocked()
+ public function isLocked()
{
return $this->locked;
}
diff --git a/View/Index/Head/HeadItem.php b/View/Index/Head/HeadItem.php
index <HASH>..<HASH> 100644
--- a/View/Index/Head/HeadItem.php
+++ b/View/Index/Head/HeadItem.php
@@ -53,7 +53,7 @@ class HeadItem
/**
* @return boolean
*/
- public function getSortable()
+ public function isSortable()
{
return $this->sortable;
}
|
Replace bool property getters with issers.
|
DarvinStudio_DarvinAdminBundle
|
train
|
php,php
|
f147cd9e39c2c830f8750118a2973831b4012017
|
diff --git a/ford/__init__.py b/ford/__init__.py
index <HASH>..<HASH> 100644
--- a/ford/__init__.py
+++ b/ford/__init__.py
@@ -234,7 +234,7 @@ FORD will look in the provided paths for a modules.json file.
'encoding': 'utf-8',
}
listopts = ['extensions','fpp_extensions','fixed_extensions','display',
- 'extra_vartypes','src_dir','exclude','exclude_dir',
+ 'extra_vartypes','src_dir','exclude','exclude_dir', 'external',
'macro','include','extra_mods','extra_filetypes','copy_subdir','alias']
# Evaluate paths relative to project file location
if args.warn:
|
Added external to the list options again
|
Fortran-FOSS-Programmers_ford
|
train
|
py
|
6fd36c0339356d5f51ffacce68e1c9a53f6fbfcb
|
diff --git a/src/mg/Ding/Bean/Factory/Driver/BeanXmlDriver.php b/src/mg/Ding/Bean/Factory/Driver/BeanXmlDriver.php
index <HASH>..<HASH> 100644
--- a/src/mg/Ding/Bean/Factory/Driver/BeanXmlDriver.php
+++ b/src/mg/Ding/Bean/Factory/Driver/BeanXmlDriver.php
@@ -213,7 +213,9 @@ class BeanXmlDriver
foreach($this->_simpleXml as $name => $xml) {
$simpleXmlBean = $xml->xpath("//bean[@id='$beanName']");
if (!empty($simpleXmlBean)) {
- $this->_logger->debug('Found ' . $beanName . ' in ' . $name);
+ if ($this->_logger->isDebugEnabled()) {
+ $this->_logger->debug('Found ' . $beanName . ' in ' . $name);
+ }
break;
}
}
|
added missing isDebugEnabled()
|
marcelog_Ding
|
train
|
php
|
fd5f2eb39932f313e5264f6c79b1c073aa7162d4
|
diff --git a/test/test_static_file.rb b/test/test_static_file.rb
index <HASH>..<HASH> 100644
--- a/test/test_static_file.rb
+++ b/test/test_static_file.rb
@@ -112,6 +112,14 @@ class TestStaticFile < JekyllUnitTest
assert_equal Time.new.to_i, @static_file.mtime
end
+ should "only set modified time if not a symlink" do
+ expect(File).to receive(:symlink?).and_return(true)
+ expect(File).not_to receive(:utime)
+ @static_file.write(dest_dir)
+
+ allow(File).to receive(:symlink?).and_call_original
+ end
+
should "known if the source path is modified, when it is" do
sleep 1
modify_dummy_file(@filename)
|
Add failing test for File.utime of a symlink in staticfile.
|
jekyll_jekyll
|
train
|
rb
|
3f57c42ff0dfa3cafe8c77e1c105ce6c4b6f85e3
|
diff --git a/jquery.squirrel.js b/jquery.squirrel.js
index <HASH>..<HASH> 100644
--- a/jquery.squirrel.js
+++ b/jquery.squirrel.js
@@ -115,7 +115,7 @@
// if value is not enumerable then make it enumerable. Then for each value in the array, find the option
// with that value and set the property called selected to true
$.each(typeof(value) !== 'object' ? [value] : value, function(index, option) {
- $elem.find('option[value=' + option + ']:not(:checked)').prop('selected', true).trigger('change');
+ $elem.find('option[value="' + option + '"]:not(:checked)').prop('selected', true).trigger('change');
});
}
});
|
Encapsulated with double quotes
As mentioned here edf<I>
|
jpederson_Squirrel.js
|
train
|
js
|
6b0d94409a3289e15b684ee44ada0dc51df9ebe6
|
diff --git a/azure-mgmt-network/src/main/java/com/microsoft/azure/management/network/VirtualNetworkGatewayConnection.java b/azure-mgmt-network/src/main/java/com/microsoft/azure/management/network/VirtualNetworkGatewayConnection.java
index <HASH>..<HASH> 100644
--- a/azure-mgmt-network/src/main/java/com/microsoft/azure/management/network/VirtualNetworkGatewayConnection.java
+++ b/azure-mgmt-network/src/main/java/com/microsoft/azure/management/network/VirtualNetworkGatewayConnection.java
@@ -178,7 +178,8 @@ public interface VirtualNetworkGatewayConnection extends
* Grouping of virtual network gateway connection update stages.
*/
interface Update extends
- UpdateStages.WithBgp {
+ UpdateStages.WithBgp,
+ UpdateStages.WithSharedKey {
}
/**
@@ -191,5 +192,9 @@ public interface VirtualNetworkGatewayConnection extends
Update withoutBgp();
}
+
+ interface WithSharedKey {
+ Update withSharedKey(String sharedKey);
+ }
}
}
|
Virtual Network Gateway Connections - add update stage for shared key
|
Azure_azure-sdk-for-java
|
train
|
java
|
374eeecfbcabae20da08ce00d0f9268bb67b1e5e
|
diff --git a/js/src/tooltip.js b/js/src/tooltip.js
index <HASH>..<HASH> 100644
--- a/js/src/tooltip.js
+++ b/js/src/tooltip.js
@@ -598,13 +598,7 @@ class Tooltip extends BaseComponent {
}
_isWithActiveTrigger() {
- for (const trigger in this._activeTrigger) {
- if (this._activeTrigger[trigger]) {
- return true
- }
- }
-
- return false
+ return Object.values(this._activeTrigger).includes(true)
}
_getConfig(config) {
|
tooltip.js: use array.includes instead of for iteration (#<I>)
|
twbs_bootstrap
|
train
|
js
|
12c273c2f2ad3967ca6b31da0fd9053818bbec5e
|
diff --git a/chemked/utils.py b/chemked/utils.py
index <HASH>..<HASH> 100644
--- a/chemked/utils.py
+++ b/chemked/utils.py
@@ -49,17 +49,3 @@ SPEC_NAMES = {'nC7H16': 'n-heptane',
'H2': 'hydrogen',
'H2O': 'water',
}
-
-
-def print_species_names():
- """Print species names, internal short name, and InChI identifiers."""
-
- len_longest = max([len(sp) for sp in SPEC_NAMES.values()])
- header = '{:<{}s} Short name\tInChI key'.format('Species name', len_longest)
-
- print(header)
- print('-' * len(header.expandtabs()))
- for spec in SPEC_KEY_REV.items():
- print('{:<{}s} {:10}\t{}'.format(SPEC_NAMES[spec[0]], len_longest,
- spec[0], spec[1])
- )
|
removing unused print_species_names() in utils
|
pr-omethe-us_PyKED
|
train
|
py
|
90da2bdd944cd7b2d12984ca4c579d54c63fe2ba
|
diff --git a/PyFunceble.py b/PyFunceble.py
index <HASH>..<HASH> 100755
--- a/PyFunceble.py
+++ b/PyFunceble.py
@@ -394,7 +394,7 @@ class PyFunceble(object):
"""
def __init__(self, domain=None, file_path=None):
-
+ self.bypass()
ExecutionTime('start')
if domain is not None and domain != '':
@@ -407,6 +407,21 @@ class PyFunceble(object):
Percentage().log()
@classmethod
+ def bypass(cls):
+ """
+ Exit the script if `[PyFunceble skip]` is matched into the currently treated commit message.
+ """
+
+ regex_bypass = r'\[PyFunceble\sskip\]'
+
+ if Settings.travis and Helpers.Regex(
+ Helpers.Command('git log -1').execute(),
+ regex_bypass,
+ return_data=False).match():
+
+ AutoSave(True)
+
+ @classmethod
def print_header(cls):
"""
Decide if we print or not the header.
@@ -2318,7 +2333,7 @@ if __name__ == '__main__':
'-v',
'--version',
action='version',
- version='%(prog)s 0.7.1-beta'
+ version='%(prog)s 0.8.0-beta'
)
ARGS = PARSER.parse_args()
|
Introduction of PyFunceble.bypass(), which is used to skip the execution if `[PyFunceble skip]` is matched into the current commit message
|
funilrys_PyFunceble
|
train
|
py
|
778be5be58ee52ad77c93dd4932e9fa6156dcc90
|
diff --git a/chanbackup/single.go b/chanbackup/single.go
index <HASH>..<HASH> 100644
--- a/chanbackup/single.go
+++ b/chanbackup/single.go
@@ -180,6 +180,24 @@ func NewSingle(channel *channeldb.OpenChannel,
chanID.BlockHeight = channel.BroadcastHeight()
}
+ // If this is a zero-conf channel, we'll need to have separate logic
+ // depending on whether it's confirmed or not. This is because the
+ // ShortChanID is an alias.
+ if channel.IsZeroConf() {
+ // If the channel is confirmed, we'll use the confirmed SCID.
+ if channel.ZeroConfConfirmed() {
+ chanID = channel.ZeroConfRealScid()
+ } else {
+ // Else if the zero-conf channel is unconfirmed, we'll
+ // need to use the broadcast height and zero out the
+ // TxIndex and TxPosition fields. This is so
+ // openChannelShell works properly.
+ chanID.BlockHeight = channel.BroadcastHeight()
+ chanID.TxIndex = 0
+ chanID.TxPosition = 0
+ }
+ }
+
single := Single{
IsInitiator: channel.IsInitiator,
ChainHash: channel.ChainHash,
|
chanbackup: handle Single creation for zero-conf channels
The SCB function NewSingle is now zero-conf aware. Since the confirmed
short channel id may be unknown, it may use the broadcast height.
|
lightningnetwork_lnd
|
train
|
go
|
70d835722d656ef216e80215911ae8a20e63a5ee
|
diff --git a/tracer/__init__.py b/tracer/__init__.py
index <HASH>..<HASH> 100644
--- a/tracer/__init__.py
+++ b/tracer/__init__.py
@@ -1,2 +1,2 @@
-from .tracer import Tracer, TracerMisfollowError
from .runner import Runner
+from .tracer import Tracer, TracerMisfollowError, TracerEnvironmentError
|
Export the TracerEnvironmentError
|
angr_angr
|
train
|
py
|
161c73471dde79fb4c25be7ad8f5712e19f8f09a
|
diff --git a/lib/ResourceOutputStream.php b/lib/ResourceOutputStream.php
index <HASH>..<HASH> 100644
--- a/lib/ResourceOutputStream.php
+++ b/lib/ResourceOutputStream.php
@@ -206,6 +206,10 @@ final class ResourceOutputStream implements OutputStream {
* @return void
*/
public function close() {
+ if ($this->resource) {
+ \stream_socket_shutdown($this->resource, \STREAM_SHUT_WR);
+ }
+
$this->resource = null;
$this->writable = false;
diff --git a/test/ResourceStreamTest.php b/test/ResourceStreamTest.php
index <HASH>..<HASH> 100644
--- a/test/ResourceStreamTest.php
+++ b/test/ResourceStreamTest.php
@@ -17,17 +17,9 @@ class ResourceStreamTest extends TestCase {
$b = new ResourceInputStream($right);
$length = 1 * 1024 * 1024; // 1M
+ $message = \str_repeat(".", $length);
- $message = "";
- for ($i = 0; $i < $length; $i++) {
- $message .= \chr(\mt_rand(33, 125));
- }
-
- $a->end($message)->onResolve(function () use (&$a, &$left) {
- // Let GC close resource
- $a = null;
- $left = null;
- });
+ $a->end($message);
$received = "";
while (null !== $chunk = yield $b->read()) {
|
stream_socket_shutdown resources in ROS on close
|
amphp_byte-stream
|
train
|
php,php
|
99334ef46c492105a739a3d55746794cbb96dc28
|
diff --git a/src/Shell/Task/CsvRelationTask.php b/src/Shell/Task/CsvRelationTask.php
index <HASH>..<HASH> 100644
--- a/src/Shell/Task/CsvRelationTask.php
+++ b/src/Shell/Task/CsvRelationTask.php
@@ -212,7 +212,30 @@ class CsvRelationTask extends BakeTask
*/
private function bakeDatabaseConfig(array $selection, string $path) : bool
{
- $fields = [];
+ $fields = [
+ 'id' => [
+ 'name' => 'id',
+ 'type' => 'uuid',
+ 'required' => true,
+ 'non-searchable' => false,
+ 'unique' => true
+ ],
+ 'created' => [
+ 'name' => 'created',
+ 'type' => 'datetime',
+ 'required' => false,
+ 'non-searchable' => false,
+ 'unique' => false
+ ],
+ 'modified' => [
+ 'name' => 'modified',
+ 'type' => 'datetime',
+ 'required' => false,
+ 'non-searchable' => false,
+ 'unique' => false
+ ],
+ ];
+
foreach ($selection as $module) {
$fields[$this->_modelKey($module)] = [
'name' => $this->_modelKey($module),
|
Add required fields to baked relation migration
This fixes the issue with validation of baked configurations,
where required fields (id/created/modified) are missing from
the result.
|
QoboLtd_cakephp-csv-migrations
|
train
|
php
|
89ca9b2fef7cce93667d078c415a8869c6184d89
|
diff --git a/index.js b/index.js
index <HASH>..<HASH> 100644
--- a/index.js
+++ b/index.js
@@ -124,7 +124,8 @@ function Angular2ConventionsLoader(source, sourcemap) {
var __args = /@(Component|Directive)\({([\s\S]*?)}\)\s*export\s*class\s*([\s\S]+)\s*(extends|implements|{)$/m.exec(src.slice(offset));
if (__args && __args[3]) {
var __className = __args[3].split(' ')[0];
- __selector = dashCase(__className);
+ // if component dash case else [attr]
+ __selector = (decorator === 'Component') ? dashCase(__className) : '[' + __className + ']';
__selector = __selector.replace('-component', '');
metadata = 'selector: "' + selectorPrefix + __selector + '",\n' + metadata;
__args = null;
|
feat: for Directive wrap in [attr] selector
|
AngularClass_angular2-conventions-loader
|
train
|
js
|
9c839bd9ab311fa4d92fb01fda29322bbd0982d1
|
diff --git a/lib/helper/WebDriver.js b/lib/helper/WebDriver.js
index <HASH>..<HASH> 100644
--- a/lib/helper/WebDriver.js
+++ b/lib/helper/WebDriver.js
@@ -1839,12 +1839,24 @@ class WebDriver extends Helper {
// for chrome
if (browser.isW3C) {
- return browser.performActions([
- { type: 'pointerDown', button: 0 },
- {
- type: 'pointerMove', origin: 'pointer', duration: 1000, x: offsetX, y: 0,
- },
- { type: 'pointerUp', button: 0 },
+ const xOffset = await this.grabElementBoundingRect(locator, 'x');
+ const yOffset = await this.grabElementBoundingRect(locator, 'y');
+
+ return browser.performActions([{
+ type: 'pointer',
+ id: 'pointer1',
+ parameters: { pointerType: 'mouse' },
+ actions: [
+ {
+ type: 'pointerMove', origin: 'pointer', duration: 1000, x: xOffset, y: yOffset,
+ },
+ { type: 'pointerDown', button: 0 },
+ {
+ type: 'pointerMove', origin: 'pointer', duration: 1000, x: offsetX, y: 0,
+ },
+ { type: 'pointerUp', button: 0 },
+ ],
+ },
]);
}
|
Action Api Updated for dragSlider method (#<I>)
|
Codeception_CodeceptJS
|
train
|
js
|
50aee591f0917787781ad2b45af3074a63366e60
|
diff --git a/example.py b/example.py
index <HASH>..<HASH> 100755
--- a/example.py
+++ b/example.py
@@ -1,4 +1,6 @@
#!/usr/bin/env python
+from __future__ import print_function
+
from sys import argv
from coinshot import Coinshot, CoinshotException
@@ -19,9 +21,9 @@ try:
coinshot_object.push(**args)
except CoinshotException as e:
- print e[0]['message']
+ print(e[0]['message'])
if e[0].get('details'):
- print "Details:"
+ print("Details:")
for k, v in e[0]['details'].items():
- print "\t%s => %s" % (k, v)
+ print("\t%s => %s" % (k, v))
|
making example.py work with python3
|
charlesthomas_coinshot
|
train
|
py
|
318eb4e65add2d2583dd3c76e0c944286783e633
|
diff --git a/plugins/providers/hyperv/driver.rb b/plugins/providers/hyperv/driver.rb
index <HASH>..<HASH> 100644
--- a/plugins/providers/hyperv/driver.rb
+++ b/plugins/providers/hyperv/driver.rb
@@ -44,11 +44,6 @@ module VagrantPlugins
# TODO: Include other options like if disk is fixed or dymanic in opts hash?
#
- # Example path for default disk location. Should be able to get this
- # for new disks and store them in the same folder
- #
- # C:\Users\vagrant\test\.vagrant\machines\hashicorp\hyperv\Virtual Hard Disks\ubuntu-18.04-amd64.vhdx
- #
# @param [String] path
# @param [Int] size_bytes
# @param [Hash] opts
@@ -78,10 +73,12 @@ module VagrantPlugins
execute(:list_hdds, VmId: @vm_id)
end
+ # @param [String] disk_file_path
def get_disk(disk_file_path)
execute(:get_vhd, DiskFilePath: disk_file_path)
end
+ # @param [String] disk_file_path
def dismount_disk(disk_file_path)
execute(:dismount_vhd, DiskFilePath: disk_file_path)
end
|
Update method docs in hyperv driver
|
hashicorp_vagrant
|
train
|
rb
|
82ffa50cf89672a818c97fa29d603dd2c3c410fb
|
diff --git a/karaage/projects/models.py b/karaage/projects/models.py
index <HASH>..<HASH> 100644
--- a/karaage/projects/models.py
+++ b/karaage/projects/models.py
@@ -99,14 +99,18 @@ class Project(models.Model):
self.is_active = True
self.is_approved = True
self.date_approved = datetime.datetime.today()
- approver = get_current_user()
- self.approved_by = approver.get_profile()
+ try:
+ self.approved_by = get_current_user().get_profile()
+ except:
+ pass
self.save()
def deactivate(self):
self.is_active = False
- deletor = get_current_user()
- self.deleted_by = deletor.get_profile()
+ try:
+ self.deleted_by = get_current_user().get_profile()
+ except:
+ pass
self.date_deleted = datetime.datetime.today()
self.users.clear()
self.save()
|
Don't error if you can't assign approved/deleted user in case done via command line
|
Karaage-Cluster_karaage
|
train
|
py
|
7be760342fa433278ca6271dd7661a3132b96e1a
|
diff --git a/spec/dpl/providers/convox_spec.rb b/spec/dpl/providers/convox_spec.rb
index <HASH>..<HASH> 100644
--- a/spec/dpl/providers/convox_spec.rb
+++ b/spec/dpl/providers/convox_spec.rb
@@ -86,13 +86,17 @@ describe Dpl::Providers::Convox do
end
describe 'given --env "ONE=$one,TWO=two"' do
- it { should have_run 'convox env set ONE="$one" TWO="two" --rack rack --app app --replace' }
+ it { should have_run 'convox env set ONE\=\$one TWO\=two --rack rack --app app --replace' }
+ end
+
+ describe 'given --env "ONE=$one,TWO=two,THREE=three four five"' do
+ it { should have_run 'convox env set ONE\=\$one TWO\=two THREE\=three\ four\ five --rack rack --app app --replace' }
end
describe 'given --env_file .env --env TWO=two', run: false do
file '.env', 'ONE=$one'
before { subject.run }
- it { should have_run 'convox env set ONE="$one" TWO="two" --rack rack --app app --replace' }
+ it { should have_run 'convox env set ONE\=\$one TWO\=two --rack rack --app app --replace' }
end
describe 'missing env file, given --env_file .env', run: false do
|
test: fixed tests for usage of Shellwords's escape instead of ShVars
|
travis-ci_dpl
|
train
|
rb
|
c1f7399a86f61bcf26791b9e0a6cf30b28e2048c
|
diff --git a/crypto/secp256k1/secp256k1_nocgo.go b/crypto/secp256k1/secp256k1_nocgo.go
index <HASH>..<HASH> 100644
--- a/crypto/secp256k1/secp256k1_nocgo.go
+++ b/crypto/secp256k1/secp256k1_nocgo.go
@@ -14,8 +14,7 @@ import (
// see:
// - https://github.com/ethereum/go-ethereum/blob/f9401ae011ddf7f8d2d95020b7446c17f8d98dc1/crypto/signature_nocgo.go#L90-L93
// - https://github.com/ethereum/go-ethereum/blob/f9401ae011ddf7f8d2d95020b7446c17f8d98dc1/crypto/crypto.go#L39
-var secp256k1N, _ = new(big.Int).SetString("fffffffffffffffffffffffffffffffebaaedce6af48a03bbfd25e8cd0364141", 16)
-var secp256k1halfN = new(big.Int).Div(secp256k1N, big.NewInt(2))
+var secp256k1halfN = new(big.Int).Rsh(secp256k1.S256().N, 1)
// Sign creates an ECDSA signature on curve Secp256k1, using SHA256 on the msg.
// The returned signature will be of the form R || S (in lower-S form).
|
review comment: cleaner constant for N/2, delete secp<I>k1N and use (#<I>)
`secp<I>k1.S<I>().N` directly instead
|
tendermint_tendermint
|
train
|
go
|
99fcb794d49bcb5c2391e74d7165c29847552d9c
|
diff --git a/js/huobi.js b/js/huobi.js
index <HASH>..<HASH> 100644
--- a/js/huobi.js
+++ b/js/huobi.js
@@ -6368,7 +6368,8 @@ module.exports = class huobi extends Exchange {
//
const data = this.safeValue (response, 'data');
const settlementRecord = this.safeValue (data, 'settlement_record');
- return this.parseSettlements (settlementRecord, market);
+ const settlements = this.parseSettlements (settlementRecord, market);
+ return this.sortBy (settlements, 'timestamp');
}
parseSettlements (settlements, market) {
|
huobi.fetchSettlementHistory sorted by timestamp
|
ccxt_ccxt
|
train
|
js
|
a4fe5c5b6606f5b320222069f076c73a2eb3d12e
|
diff --git a/actionmailbox/lib/action_mailbox/engine.rb b/actionmailbox/lib/action_mailbox/engine.rb
index <HASH>..<HASH> 100644
--- a/actionmailbox/lib/action_mailbox/engine.rb
+++ b/actionmailbox/lib/action_mailbox/engine.rb
@@ -1,6 +1,11 @@
# frozen_string_literal: true
-require "rails/engine"
+require "rails"
+require "action_controller/railtie"
+require "active_job/railtie"
+require "active_record/railtie"
+require "active_storage/engine"
+
require "action_mailbox"
module ActionMailbox
|
Require railties for all Action Mailbox dependencies
|
rails_rails
|
train
|
rb
|
547fecae11ab04ff518101e240d07480336c7046
|
diff --git a/test/integration/active_record_searchable_test.rb b/test/integration/active_record_searchable_test.rb
index <HASH>..<HASH> 100644
--- a/test/integration/active_record_searchable_test.rb
+++ b/test/integration/active_record_searchable_test.rb
@@ -212,6 +212,24 @@ module Tire
assert_nil results.first
end
+ context "without an explicit per_page" do
+
+ should "not find a missing (second) page" do
+ results = ActiveRecordArticle.search 'test*', :sort => 'title', :page => 2
+ assert_equal 0, results.size
+
+ # WillPaginate
+ #
+ assert_equal 1, results.total_pages
+ assert_equal 2, results.current_page
+ assert_equal 1, results.previous_page
+ assert_equal nil, results.next_page
+
+ assert_nil results.first
+ end
+
+ end
+
end
context "and block searches" do
|
[#<I>] Added test for the bug where you have to include `per_page` in model searches
If you don't specify a `per_page` option then the `page` option has no effect.
|
karmi_retire
|
train
|
rb
|
dd7ffc341ab2d2bed994b91cf0cb02562241bc11
|
diff --git a/pushy/src/main/java/com/turo/pushy/apns/ApnsClientHandler.java b/pushy/src/main/java/com/turo/pushy/apns/ApnsClientHandler.java
index <HASH>..<HASH> 100644
--- a/pushy/src/main/java/com/turo/pushy/apns/ApnsClientHandler.java
+++ b/pushy/src/main/java/com/turo/pushy/apns/ApnsClientHandler.java
@@ -343,7 +343,14 @@ class ApnsClientHandler extends Http2ConnectionHandler implements Http2FrameList
}
@Override
- public void onRstStreamRead(final ChannelHandlerContext ctx, final int streamId, final long errorCode) throws Http2Exception {
+ public void onRstStreamRead(final ChannelHandlerContext context, final int streamId, final long errorCode) throws Http2Exception {
+ if (errorCode == Http2Error.REFUSED_STREAM.code()) {
+ // This can happen if the server reduces MAX_CONCURRENT_STREAMS while we already have notifications in
+ // flight. We may get multiple RST_STREAM frames per stream since we send multiple frames (HEADERS and
+ // DATA) for each push notification, but we should only get one REFUSED_STREAM error; the rest should all be
+ // STREAM_CLOSED.
+ this.retryPushNotificationFromStream(context, streamId);
+ }
}
@Override
|
Retry push notifications on RST_STREAM.
|
relayrides_pushy
|
train
|
java
|
82562497035541e9f13fe54f7707782602c5777b
|
diff --git a/test/Compiler.test.js b/test/Compiler.test.js
index <HASH>..<HASH> 100644
--- a/test/Compiler.test.js
+++ b/test/Compiler.test.js
@@ -8,6 +8,7 @@ const WebpackOptionsDefaulter = require("../lib/WebpackOptionsDefaulter");
const MemoryFs = require("memory-fs");
describe("Compiler", () => {
+ jest.setTimeout(20000);
function compile(entry, options, callback) {
const noOutputPath = !options.output || !options.output.path;
if (!options.mode) options.mode = "production";
|
longer timeouts for Compiler test
|
webpack_webpack
|
train
|
js
|
d250f4449c29fa0f03e206f2e3d3403aaec673dd
|
diff --git a/examples/vector-layer.js b/examples/vector-layer.js
index <HASH>..<HASH> 100644
--- a/examples/vector-layer.js
+++ b/examples/vector-layer.js
@@ -38,7 +38,7 @@ var vector = new ol.layer.Vector({
symbolizers: [
new ol.style.Text({
color: '#bada55',
- name: new ol.Expression('name'),
+ text: new ol.Expression('name'),
fontFamily: 'Calibri,sans-serif',
fontSize: 12
})
|
Updating example to use new properties
|
openlayers_openlayers
|
train
|
js
|
6448cab533c1c0124545f44007f08766bc3530c5
|
diff --git a/packages/orbit-components/src/Modal/index.js b/packages/orbit-components/src/Modal/index.js
index <HASH>..<HASH> 100644
--- a/packages/orbit-components/src/Modal/index.js
+++ b/packages/orbit-components/src/Modal/index.js
@@ -77,8 +77,10 @@ const ModalWrapper = styled.div`
!isMobileFullPage && "12px"}; // TODO: create token
border-top-right-radius: ${({ isMobileFullPage }) =>
!isMobileFullPage && "12px"}; // TODO: create token
- transition: ${transition(["top"], "normal", "ease-in-out")};
- top: ${({ loaded, isMobileFullPage }) => (loaded ? !isMobileFullPage && "32px" : "100%")};
+ transition: ${transition(["transform"], "normal", "ease-in-out")};
+ transform: translateY(
+ ${({ loaded, isMobileFullPage }) => (loaded ? !isMobileFullPage && "32px" : "100%")}
+ );
${onlyIE(css`
/* IE flex bug, the content won't be centered if there is not 'height' property
@@ -88,7 +90,7 @@ const ModalWrapper = styled.div`
${media.largeMobile(css`
position: relative;
- top: 0;
+ transform: none;
max-width: ${getSizeToken};
align-items: center;
`)};
|
fix(Modal): reduce Cumulative Layout Shift
|
kiwicom_orbit-components
|
train
|
js
|
bed2010a7464f1112b740d76c34d1ef153ab6698
|
diff --git a/lib/html/pipeline/camo_filter.rb b/lib/html/pipeline/camo_filter.rb
index <HASH>..<HASH> 100644
--- a/lib/html/pipeline/camo_filter.rb
+++ b/lib/html/pipeline/camo_filter.rb
@@ -20,8 +20,9 @@ module HTML
# Hijacks images in the markup provided, replacing them with URLs that
# go through the github asset proxy.
def call
+ return unless asset_proxy_enabled?
+
doc.search("img").each do |element|
- next if context[:disable_asset_proxy]
next if element['src'].nil?
begin
@@ -55,6 +56,11 @@ module HTML
OpenSSL::HMAC.hexdigest(digest, asset_proxy_secret_key, url)
end
+ # Private: Return true if asset proxy filter should be enabled
+ def asset_proxy_enabled?
+ !context[:disable_asset_proxy]
+ end
+
# Private: the hostname to use for generated asset proxied URLs.
def asset_proxy_host
context[:asset_proxy]
|
Fail fast if asset proxy is disabled
|
jch_html-pipeline
|
train
|
rb
|
edfcf93c320903e57f640e11ff1d6487d105c559
|
diff --git a/tests/build/test_build_request.py b/tests/build/test_build_request.py
index <HASH>..<HASH> 100644
--- a/tests/build/test_build_request.py
+++ b/tests/build/test_build_request.py
@@ -83,8 +83,6 @@ class TestBuildRequest(object):
'git_ref': TEST_GIT_REF,
'user': "john-foo",
'component': "component",
- 'base_image': 'fedora:latest',
- 'name_label': 'fedora/resultingimage',
'registry_uri': "registry.example.com",
'openshift_uri': "http://openshift/",
}
@@ -113,11 +111,8 @@ class TestBuildRequest(object):
kwargs = {
'git_uri': TEST_GIT_URI,
'git_ref': TEST_GIT_REF,
- 'git_branch': TEST_GIT_BRANCH,
'user': "john-foo",
'component': TEST_COMPONENT,
- 'base_image': 'fedora:latest',
- 'name_label': name_label,
'registry_uri': "http://registry.example.com:5000",
'openshift_uri': "http://openshift/",
}
|
Update params for 'simple' build type tests
|
projectatomic_osbs-client
|
train
|
py
|
2f49e88e223d1afc3bf7a4686ee7fc67c51ec3f3
|
diff --git a/src/Graze/Monolog/Handler/RaygunHandler.php b/src/Graze/Monolog/Handler/RaygunHandler.php
index <HASH>..<HASH> 100644
--- a/src/Graze/Monolog/Handler/RaygunHandler.php
+++ b/src/Graze/Monolog/Handler/RaygunHandler.php
@@ -41,7 +41,7 @@ class RaygunHandler extends AbstractProcessingHandler
*/
public function isHandling(array $record)
{
- if(parent::isHandling($record)) {
+ if (parent::isHandling($record)) {
$context = $record['context'];
//Ensure only valid records will be handled and no InvalidArgumentException will be thrown
@@ -50,7 +50,7 @@ class RaygunHandler extends AbstractProcessingHandler
$context['exception'] instanceof \Exception ||
(PHP_VERSION_ID > 70000 && $context['exception'] instanceof \Throwable)
)
- ) || (isset($context['file']) && $context['line'])
+ ) || (isset($context['file']) && isset($context['line']))
) {
return true;
}
|
fix isset for line in raygun handler
|
graze_monolog-extensions
|
train
|
php
|
918dd5165ae95514495bd22ed3a689b0fd2c6012
|
diff --git a/setup.py b/setup.py
index <HASH>..<HASH> 100644
--- a/setup.py
+++ b/setup.py
@@ -20,8 +20,8 @@ from setuptools.command.test import test as TestCommand
with open(pjoin(dirname(__file__), 'icecream', '__init__.py')) as fo:
- VERSION = re.compile(
- r".*__version__ = '(.*?)'", re.S).match(fo.read()).group(1)
+ regex = r".*__version__ = '(.*?)'"
+ VERSION = re.compile(regex, re.S).match(fo.read()).group(1)
class Publish(Command):
|
Refactor setup.py's VERSION extraction.
|
gruns_icecream
|
train
|
py
|
91ee25358a3ce59209633bae6601e02199fb0071
|
diff --git a/rest-model/src/test/java/org/jboss/pnc/restmodel/serialization/BuildResultSerializationTest.java b/rest-model/src/test/java/org/jboss/pnc/restmodel/serialization/BuildResultSerializationTest.java
index <HASH>..<HASH> 100644
--- a/rest-model/src/test/java/org/jboss/pnc/restmodel/serialization/BuildResultSerializationTest.java
+++ b/rest-model/src/test/java/org/jboss/pnc/restmodel/serialization/BuildResultSerializationTest.java
@@ -44,7 +44,7 @@ public class BuildResultSerializationTest {
BuildResult buildResult = BuildResultMock.mock(BuildStatus.SUCCESS);
BuildResultRest buildResultRest = new BuildResultRest(buildResult);
- String buildResultJson = buildResultRest.toString();
+ String buildResultJson = buildResultRest.toFullLogString();
log.debug("BuildResultJson : {}", buildResultJson);
BuildResultRest buildResultRestFromJson = JsonOutputConverterMapper.readValue(buildResultJson, BuildResultRest.class);
|
Fix failed test expecting JSON toString
|
project-ncl_pnc
|
train
|
java
|
76eb220f9b1babc5d2cea85c406454b1f73f99d0
|
diff --git a/playhouse/tests/test_helpers.py b/playhouse/tests/test_helpers.py
index <HASH>..<HASH> 100644
--- a/playhouse/tests/test_helpers.py
+++ b/playhouse/tests/test_helpers.py
@@ -94,6 +94,23 @@ class TestDeclaredDependencies(PeeweeTestCase):
ordering = sort_models_topologically(pmodels)
self.assertEqual(ordering, ordered)
+ def test_declared_dependencies_simple(self):
+ class A(Model): pass
+ class B(Model):
+ class Meta:
+ depends_on = (A,)
+ class C(Model):
+ b = ForeignKeyField(B) # Implicit dependency.
+ class D(Model):
+ class Meta:
+ depends_on = (C,)
+
+ models = [A, B, C, D]
+ ordered = list(models)
+ for pmodels in permutations(models):
+ ordering = sort_models_topologically(pmodels)
+ self.assertEqual(ordering, ordered)
+
def permutations(xs):
if not xs:
|
Add another test for #<I>.
|
coleifer_peewee
|
train
|
py
|
67d9fccc60c1cd6f9acd644989f0a20ed34d7f81
|
diff --git a/src/frontend/org/voltdb/jni/ExecutionEngineIPC.java b/src/frontend/org/voltdb/jni/ExecutionEngineIPC.java
index <HASH>..<HASH> 100644
--- a/src/frontend/org/voltdb/jni/ExecutionEngineIPC.java
+++ b/src/frontend/org/voltdb/jni/ExecutionEngineIPC.java
@@ -1035,6 +1035,9 @@ public class ExecutionEngineIPC extends ExecutionEngine {
m_data.putInt(hostId);
m_data.putInt(drClusterId);
m_data.putInt(defaultDrBufferSize);
+ m_data.putInt(drIgnoreConflicts ? 1 : 0);
+ m_data.putInt(drCrcErrorIgnoreMax);
+ m_data.putInt(drCrcErrorIgnoreFatal ? 1 : 0);
m_data.putLong(EELoggers.getLogLevels());
m_data.putLong(tempTableMemory);
m_data.putInt(createDrReplicatedStream ? 1 : 0);
|
ENG-<I> pass params to voltdbipc (#<I>)
|
VoltDB_voltdb
|
train
|
java
|
c86bfc37124da4660473379f49019c6f7c5906a5
|
diff --git a/troposphere/codebuild.py b/troposphere/codebuild.py
index <HASH>..<HASH> 100644
--- a/troposphere/codebuild.py
+++ b/troposphere/codebuild.py
@@ -61,14 +61,16 @@ class EnvironmentVariable(AWSProperty):
}
def validate(self):
- valid_types = [
- 'PARAMETER_STORE',
- 'PLAINTEXT',
- ]
- env_type = self.properties.get('Type')
- if env_type not in valid_types:
- raise ValueError('EnvironmentVariable Type: must be one of %s' %
- ','.join(valid_types))
+ if 'Type' in self.properties:
+ valid_types = [
+ 'PARAMETER_STORE',
+ 'PLAINTEXT',
+ ]
+ env_type = self.properties.get('Type')
+ if env_type not in valid_types:
+ raise ValueError(
+ 'EnvironmentVariable Type: must be one of %s' %
+ ','.join(valid_types))
class Environment(AWSProperty):
|
type is not required for EnvironmentVariable (#<I>)
|
cloudtools_troposphere
|
train
|
py
|
b86bddb91dfdcb763b6d3791da27b2cd8eda4c6d
|
diff --git a/main/boofcv-recognition/src/main/java/boofcv/alg/fiducial/calib/chess/ChessboardPolygonHelper.java b/main/boofcv-recognition/src/main/java/boofcv/alg/fiducial/calib/chess/ChessboardPolygonHelper.java
index <HASH>..<HASH> 100644
--- a/main/boofcv-recognition/src/main/java/boofcv/alg/fiducial/calib/chess/ChessboardPolygonHelper.java
+++ b/main/boofcv-recognition/src/main/java/boofcv/alg/fiducial/calib/chess/ChessboardPolygonHelper.java
@@ -76,9 +76,11 @@ public class ChessboardPolygonHelper<T extends ImageGray<T>> implements PolygonH
@Override
public void configureBeforePolyline(PointsToPolyline contourToPolyline, boolean touchesBorder) {
if( touchesBorder ) {
+ contourToPolyline.setConvex(false);
contourToPolyline.setMinimumSides(3);
contourToPolyline.setMaximumSides(8);
} else {
+ contourToPolyline.setConvex(true);
contourToPolyline.setMinimumSides(4);
contourToPolyline.setMaximumSides(4);
}
|
fixed issue with it not being able to detect close-up chessboard patterns
|
lessthanoptimal_BoofCV
|
train
|
java
|
454526a9c0eaa9c90c1e58a5a46fdbaca3fa8a86
|
diff --git a/sa/sa.go b/sa/sa.go
index <HASH>..<HASH> 100644
--- a/sa/sa.go
+++ b/sa/sa.go
@@ -173,6 +173,7 @@ func (ssa *SQLStorageAuthority) GetAuthorization(ctx context.Context, id string)
var fa authzModel
err = tx.SelectOne(&fa, fmt.Sprintf("SELECT %s FROM authz WHERE id = ?", authzFields), id)
if err != nil {
+ err = Rollback(tx, err)
return
}
authz = fa.Authorization
|
Fix missing rollback call on error. (#<I>)
|
letsencrypt_boulder
|
train
|
go
|
9bcad6fc09e44343d592ea544bd5496c4ef61e92
|
diff --git a/src/Command/ProjectGetCommand.php b/src/Command/ProjectGetCommand.php
index <HASH>..<HASH> 100644
--- a/src/Command/ProjectGetCommand.php
+++ b/src/Command/ProjectGetCommand.php
@@ -124,14 +124,7 @@ class ProjectGetCommand extends PlatformCommand
$fsHelper = $this->getHelper('fs');
// Prepare to talk to the Platform.sh repository.
- if (isset($project['repository'])) {
- $gitUrl = $project['repository']['url'];
- }
- else {
- $projectUriParts = explode('/', str_replace(array('http://', 'https://'), '', $project['uri']));
- $cluster = $projectUriParts[0];
- $gitUrl = "{$projectId}@git.{$cluster}:{$projectId}.git";
- }
+ $gitUrl = $project->getGitUrl();
$repositoryDir = $directoryName . '/' . LocalProject::REPOSITORY_DIR;
$gitHelper = new GitHelper(new ShellHelper($output));
|
Use client helper to get the Git URL
|
platformsh_platformsh-cli
|
train
|
php
|
e81afed66d9e49fcca3679b663bd7c0319067c31
|
diff --git a/lib/browser/api/web-contents.js b/lib/browser/api/web-contents.js
index <HASH>..<HASH> 100644
--- a/lib/browser/api/web-contents.js
+++ b/lib/browser/api/web-contents.js
@@ -354,9 +354,6 @@ WebContents.prototype._init = function () {
app.emit('renderer-process-crashed', event, this, ...args)
})
- deprecate.event(this, 'did-get-response-details', '-did-get-response-details')
- deprecate.event(this, 'did-get-redirect-request', '-did-get-redirect-request')
-
// The devtools requests the webContents to reload.
this.on('devtools-reload-page', function () {
this.reload()
|
chore: remove dead code (#<I>)
|
electron_electron
|
train
|
js
|
fda45a5b5d37ed858bd2812ad59ca94c5b8f6742
|
diff --git a/Tests/Twig/Extension/SonataAdminExtensionTest.php b/Tests/Twig/Extension/SonataAdminExtensionTest.php
index <HASH>..<HASH> 100644
--- a/Tests/Twig/Extension/SonataAdminExtensionTest.php
+++ b/Tests/Twig/Extension/SonataAdminExtensionTest.php
@@ -89,6 +89,10 @@ class SonataAdminExtensionTest extends \PHPUnit_Framework_TestCase
public function setUp()
{
+ // NEXT_MAJOR: remove this block when dropping symfony < 2.7 support
+ if (!class_exists('Symfony\Bridge\Twig\Extension\AssetExtension')) {
+ $this->markTestSkipped();
+ }
date_default_timezone_set('Europe/London');
$container = $this->getMock('Symfony\Component\DependencyInjection\ContainerInterface');
|
Skip tests for symfony <I>
Twig 2 and symfony <I> are not actually compatible.
|
sonata-project_SonataAdminBundle
|
train
|
php
|
cb4ce5ab0d64f1091e916e1fdf7d76f167656c13
|
diff --git a/xmpp_backends/ejabberdctl.py b/xmpp_backends/ejabberdctl.py
index <HASH>..<HASH> 100644
--- a/xmpp_backends/ejabberdctl.py
+++ b/xmpp_backends/ejabberdctl.py
@@ -219,7 +219,9 @@ class EjabberdctlBackend(EjabberdBackendBase):
def message_user(self, username, domain, subject, message):
"""Currently use send_message_chat and discard subject, because headline messages are not stored by
mod_offline."""
- self.ctl('send_message_chat', domain, '%s@%s' % (username, domain), message)
+ code, out, err = self.ctl('send_message_chat', domain, '%s@%s' % (username, domain), message)
+ if code != 0:
+ raise BackendError(code)
def all_domains(self):
code, out, err = self.ctl('registered_vhosts')
|
check return code of send_message_chat
|
mathiasertl_xmpp-backends
|
train
|
py
|
f568d7ee3b6154628941c018a7ab4485234902c7
|
diff --git a/docs/lib/Home/index.js b/docs/lib/Home/index.js
index <HASH>..<HASH> 100644
--- a/docs/lib/Home/index.js
+++ b/docs/lib/Home/index.js
@@ -21,7 +21,7 @@ export default () => {
Easy to use form validation for <a href="https://github.com/reactstrap/reactstrap">reactstrap</a>
</p>
<p>
- <Button outline color="danger" href="https://github.com/availity/availity-reactstrap-validation">View on Github</Button>
+ <Button outline color="danger" href="https://github.com/availity/availity-reactstrap-validation">View on GitHub</Button>
<Button color="danger" tag={Link} to="/components/">View Components</Button>
</p>
</Col>
diff --git a/docs/lib/UI/Nav.js b/docs/lib/UI/Nav.js
index <HASH>..<HASH> 100644
--- a/docs/lib/UI/Nav.js
+++ b/docs/lib/UI/Nav.js
@@ -48,7 +48,7 @@ export default class UINav extends Component {
</NavItem>
<NavItem>
<NavLink href="https://github.com/availity/availity-reactstrap-validation">
- Github
+ GitHub
</NavLink>
</NavItem>
</Nav>
|
docs(typo): changing Github to GitHub (#<I>)
|
Availity_availity-reactstrap-validation
|
train
|
js,js
|
c5824b00abbc0b919b3c75c4f7b942d1dee736b0
|
diff --git a/memorious/cli.py b/memorious/cli.py
index <HASH>..<HASH> 100644
--- a/memorious/cli.py
+++ b/memorious/cli.py
@@ -1,5 +1,6 @@
import click
import logging
+import time
from tabulate import tabulate
from memorious import settings
@@ -54,11 +55,19 @@ def flush(crawler):
@cli.command()
def process():
- """Start the queue and process tasks as they come"""
+ """Start the queue and process tasks as they come. Blocks while waiting"""
TaskRunner.run()
@cli.command()
+def beat():
+ """Loop and try to run scheduled crawlers at short intervals"""
+ while True:
+ manager.run_scheduled()
+ time.sleep(settings.BEAT_INTERVAL)
+
+
[email protected]()
def list():
"""List the available crawlers."""
crawler_list = []
diff --git a/memorious/settings.py b/memorious/settings.py
index <HASH>..<HASH> 100644
--- a/memorious/settings.py
+++ b/memorious/settings.py
@@ -37,6 +37,9 @@ INCREMENTAL = env_bool('INCREMENTAL', default=True)
# How many days until an incremental crawl expires
EXPIRE = int(env('EXPIRE', 60))
+# How many seconds to wait before trying to run scheduled crawlers
+BEAT_INTERVAL = int(env('BEAT_INTERVAL', 60))
+
# HTTP request configuration
HTTP_CACHE = env_bool('HTTP_CACHE', default=True)
|
command to have a long-running process running scheduled crawlers
|
alephdata_memorious
|
train
|
py,py
|
9b861d6441dd2ddacf96267e7e9378733cdcf610
|
diff --git a/randomized-runner/src/main/java/com/carrotsearch/randomizedtesting/RandomizedRunner.java b/randomized-runner/src/main/java/com/carrotsearch/randomizedtesting/RandomizedRunner.java
index <HASH>..<HASH> 100644
--- a/randomized-runner/src/main/java/com/carrotsearch/randomizedtesting/RandomizedRunner.java
+++ b/randomized-runner/src/main/java/com/carrotsearch/randomizedtesting/RandomizedRunner.java
@@ -771,9 +771,11 @@ public final class RandomizedRunner extends Runner implements Filterable {
current.push(new Randomness(c.seed));
current.setTargetMethod(c.method);
- Boolean ignoredTest = ignored.get(c);
- if (ignoredTest != null && ignoredTest) {
- reportAsIgnored(notifier, groupEvaluator, c);
+ if (ignored.containsKey(c)) {
+ // Ignore the test, but report only if requested.
+ if (ignored.get(c)) {
+ reportAsIgnored(notifier, groupEvaluator, c);
+ }
} else {
runSingleTest(notifier, c, threadLeakControl);
}
|
GH-<I>: corrected the ignore logic.
|
randomizedtesting_randomizedtesting
|
train
|
java
|
b47ed977b4bc3c66db4eb269faaa5cd9c77207a3
|
diff --git a/tests/integration/tests/src/test/java/org/sonar/tests/integration/Struts139IT.java b/tests/integration/tests/src/test/java/org/sonar/tests/integration/Struts139IT.java
index <HASH>..<HASH> 100644
--- a/tests/integration/tests/src/test/java/org/sonar/tests/integration/Struts139IT.java
+++ b/tests/integration/tests/src/test/java/org/sonar/tests/integration/Struts139IT.java
@@ -20,6 +20,7 @@
package org.sonar.tests.integration;
import org.junit.BeforeClass;
+import org.junit.Ignore;
import org.junit.Test;
import org.sonar.api.measures.CoreMetrics;
import org.sonar.wsclient.Sonar;
@@ -189,10 +190,8 @@ public class Struts139IT {
assertThat(version.getCategory(), is("Version"));
}
- /**
- * See http://jira.codehaus.org/browse/SONAR-2041
- */
@Test
+ @Ignore("Not fixed. See http://jira.codehaus.org/browse/SONAR-2041")
public void unknownMetric() {
assertThat(getProjectMeasure("notfound"), nullValue());
assertThat(getCoreModuleMeasure("notfound"), nullValue());
|
remove IT of SONAR-<I> which has not been fixed yet
|
SonarSource_sonarqube
|
train
|
java
|
22f5c149aa71babc6649e811f6ec01c9165d5c80
|
diff --git a/lxc/network_acl.go b/lxc/network_acl.go
index <HASH>..<HASH> 100644
--- a/lxc/network_acl.go
+++ b/lxc/network_acl.go
@@ -69,6 +69,9 @@ func (c *cmdNetworkACL) Command() *cobra.Command {
networkACLRuleCmd := cmdNetworkACLRule{global: c.global, networkACL: c}
cmd.AddCommand(networkACLRuleCmd.Command())
+ // Workaround for subcommand usage errors. See: https://github.com/spf13/cobra/issues/706
+ cmd.Args = cobra.NoArgs
+ cmd.Run = func(cmd *cobra.Command, args []string) { cmd.Usage() }
return cmd
}
|
lxc/network/acl: workaround for subcommand errors
|
lxc_lxd
|
train
|
go
|
b6516174a84d849bd620417dca9e0a81e0d3b5dc
|
diff --git a/python/pyspark/pandas/tests/indexes/test_datetime.py b/python/pyspark/pandas/tests/indexes/test_datetime.py
index <HASH>..<HASH> 100644
--- a/python/pyspark/pandas/tests/indexes/test_datetime.py
+++ b/python/pyspark/pandas/tests/indexes/test_datetime.py
@@ -120,7 +120,7 @@ class DatetimeIndexTest(PandasOnSparkTestCase, TestUtils):
def test_month_name(self):
for psidx, pidx in self.idx_pairs:
- self.assert_eq(psidx.day_name(), pidx.day_name())
+ self.assert_eq(psidx.month_name(), pidx.month_name())
def test_normalize(self):
for psidx, pidx in self.idx_pairs:
|
[SPARK-<I>][PYTHON][TESTS] Change day to month
### What changes were proposed in this pull request?
Right now we have two functions that are testing the same thing.
### Why are the changes needed?
To test both day and month
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Got the green light.
Closes #<I> from bjornjorgensen/change-day-to-month.
|
apache_spark
|
train
|
py
|
76ca90aff330e472e6e38ecc4c8c635d0f7318ec
|
diff --git a/lib/nominet-epp/notification.rb b/lib/nominet-epp/notification.rb
index <HASH>..<HASH> 100644
--- a/lib/nominet-epp/notification.rb
+++ b/lib/nominet-epp/notification.rb
@@ -18,7 +18,7 @@ module NominetEPP
parse_response
end
- undef id if defined?(id)
+ undef id unless RUBY_VERSION >= "1.9"
undef to_s
def type
diff --git a/lib/nominet-epp/responses/contact/info_response.rb b/lib/nominet-epp/responses/contact/info_response.rb
index <HASH>..<HASH> 100644
--- a/lib/nominet-epp/responses/contact/info_response.rb
+++ b/lib/nominet-epp/responses/contact/info_response.rb
@@ -9,7 +9,7 @@ module NominetEPP
ext_inf_data
end
- undef id if defined?(id)
+ undef id unless RUBY_VERSION >= "1.9"
def name
@response.id
|
Fine, compare to Ruby version instead of using defined
I give up trying to do things the proper way
|
m247_nominet-epp
|
train
|
rb,rb
|
ea2c9407d34dd8b8fe1d2c4018a9948015b8245a
|
diff --git a/nosetimer/plugin.py b/nosetimer/plugin.py
index <HASH>..<HASH> 100644
--- a/nosetimer/plugin.py
+++ b/nosetimer/plugin.py
@@ -1,11 +1,14 @@
import logging
-import multiprocessing
import operator
import os
import re
import termcolor
import timeit
+# Windows and Python 2.7 multiprocessing don't marry well.
+if os.name != 'nt':
+ import multiprocessing
+
from nose.plugins import Plugin
log = logging.getLogger('nose.plugin.timer')
@@ -21,7 +24,11 @@ class TimerPlugin(Plugin):
def __init__(self):
super(TimerPlugin, self).__init__()
- self._timed_tests = multiprocessing.Manager().dict()
+
+ if os.name != 'nt':
+ self._timed_tests = multiprocessing.Manager().dict()
+ else:
+ self._timed_tests = {}
def _time_taken(self):
if hasattr(self, '_timer'):
|
Conditionally removed Windows multiprocessing reference
|
mahmoudimus_nose-timer
|
train
|
py
|
973785aa6a618c6d39551892189597e50aeca93f
|
diff --git a/tests/test_periods.py b/tests/test_periods.py
index <HASH>..<HASH> 100644
--- a/tests/test_periods.py
+++ b/tests/test_periods.py
@@ -228,16 +228,19 @@ class TestDay(TestCase):
# This test exercises the case where a Day object is initiatized with
# no date, which causes the Day constructor to call timezone.now(),
# which always uses UTC. This can cause a problem if the desired TZ
- # is not UTC, because the _get_day_range method typecases the
- # tx-aware datetimes object to naive objects.
+ # is not UTC, because the _get_day_range method typecasts the
+ # tz-aware datetime to a naive datetime.
+
+ # To simulate this case, we will create a NY tz date, localize that
+ # date to UTC, then create a Day object with the UTC date and NY TZ
- # To simulate this case, we will initiatize the Day object with a UTC
- # datetime, but pass in a non-UTC tzinfo
NY = pytz.timezone('America/New_York')
+ user_wall_time = datetime.datetime(2015, 11, 4, 21, 30, tzinfo=NY)
+ timezone_now = user_wall_time.astimezone(pytz.utc)
test_day = Day(
events=Event.objects.all(),
- date=datetime.datetime(2015, 11, 5, 2, 30, tzinfo=pytz.utc),
+ date=timezone_now,
tzinfo=NY)
expected_start = datetime.datetime(2015, 11, 4, 5, 00, tzinfo=pytz.utc)
|
cleaning up unit test
fixed typos, tried to make more clear what edge case is being tested
|
llazzaro_django-scheduler
|
train
|
py
|
28130561e812855cc91b0d982529bcf904c8cf58
|
diff --git a/src/main/java/com/sdl/selenium/extjs6/tab/Tab.java b/src/main/java/com/sdl/selenium/extjs6/tab/Tab.java
index <HASH>..<HASH> 100644
--- a/src/main/java/com/sdl/selenium/extjs6/tab/Tab.java
+++ b/src/main/java/com/sdl/selenium/extjs6/tab/Tab.java
@@ -120,6 +120,15 @@ public class Tab extends WebLocator implements ITab {
return activated;
}
+ public boolean doSetActive() {
+ WebLocator inactiveTab = getTitleInactiveEl().setExcludeClasses("x-tab-active");
+ boolean activated = isActive() || inactiveTab.doClick();
+ if (activated) {
+ log.info("doSetActive : " + toString());
+ }
+ return activated;
+ }
+
@Override
public boolean isActive() {
return new ConditionManager(Duration.ofMillis(200)).add(new RenderSuccessCondition(this)).execute().isSuccess();
|
added doSetActive() method in Tab
|
sdl_Testy
|
train
|
java
|
7fee14ec2999960b40a716e29fc32ca396154528
|
diff --git a/sos/report/plugins/ebpf.py b/sos/report/plugins/ebpf.py
index <HASH>..<HASH> 100644
--- a/sos/report/plugins/ebpf.py
+++ b/sos/report/plugins/ebpf.py
@@ -61,7 +61,9 @@ class Ebpf(Plugin, RedHatPlugin, DebianPlugin, UbuntuPlugin):
"bpftool cgroup tree",
# collect list of bpf program attachments in the kernel
# networking subsystem
- "bpftool net list"
+ "bpftool net list",
+ # collect all struct_ops currently existing in the system
+ "bpftool struct_ops dump"
])
# Capture list of bpf program attachments from namespaces
|
[ebpf] Add the output of bpftool struct_ops dump
This patch dumps all struct_ops currently existing in
the system via the command 'bpftool struct_ops dump'.
Resolves: #<I>
|
sosreport_sos
|
train
|
py
|
de6658b902d11f26b17995cee1502284993db3a9
|
diff --git a/src/Symfony/Component/HttpKernel/Profiler/Profiler.php b/src/Symfony/Component/HttpKernel/Profiler/Profiler.php
index <HASH>..<HASH> 100644
--- a/src/Symfony/Component/HttpKernel/Profiler/Profiler.php
+++ b/src/Symfony/Component/HttpKernel/Profiler/Profiler.php
@@ -165,7 +165,7 @@ class Profiler
$profile = new Profile(uniqid());
$profile->setTime(time());
$profile->setUrl($request->getUri());
- $profile->setIp($request->server->get('REMOTE_ADDR'));
+ $profile->setIp($request->getClientIp());
$response->headers->set('X-Debug-Token', $profile->getToken());
|
[Profiler]Use the abstract method to get client IP
|
symfony_symfony
|
train
|
php
|
ae29e5577d2244c64e318d940780b6da6a7658be
|
diff --git a/test/shared.rb b/test/shared.rb
index <HASH>..<HASH> 100644
--- a/test/shared.rb
+++ b/test/shared.rb
@@ -51,6 +51,18 @@ module FriendlyId
end
end
+ test "when validations block save, to_param should return friendly_id rather than nil" do
+ my_model_class = Class.new(model_class)
+ self.class.const_set("Foo", my_model_class)
+ with_instance_of my_model_class do |record|
+ record.update_attribute(my_model_class.friendly_id_config.slug_column, nil)
+ record = my_model_class.find(record.id)
+ record.class.validate Proc.new {errors[:name] = "FAIL"}
+ record.save
+ assert_equal record.to_param, record.friendly_id
+ end
+ end
+
end
module Core
|
Add failing test for issue #<I>
|
norman_friendly_id
|
train
|
rb
|
09f3c7d93bd1738051ca6bbc53b1c49cf14a15f5
|
diff --git a/core/server/src/test/java/tachyon/worker/block/BlockWorkerTest.java b/core/server/src/test/java/tachyon/worker/block/BlockWorkerTest.java
index <HASH>..<HASH> 100644
--- a/core/server/src/test/java/tachyon/worker/block/BlockWorkerTest.java
+++ b/core/server/src/test/java/tachyon/worker/block/BlockWorkerTest.java
@@ -116,6 +116,7 @@ public class BlockWorkerTest {
@After
public void after() throws IOException {
((DataServer) Whitebox.getInternalState(mBlockWorker, "mDataServer")).close();
+ WorkerContext.reset();
}
/**
* Tests the {@link BlockWorker#abortBlock(long, long)} method.
|
[TACHYON-<I>] Reset the WorkerContext after cases
|
Alluxio_alluxio
|
train
|
java
|
79a0daa8bb2bea2ef7b3cc6e796899432f149681
|
diff --git a/templates/genomics/steps/vcf.py b/templates/genomics/steps/vcf.py
index <HASH>..<HASH> 100644
--- a/templates/genomics/steps/vcf.py
+++ b/templates/genomics/steps/vcf.py
@@ -1,12 +1,12 @@
import gzip
import os
+from warnings import warn
-from sofia_.step import Resource, Target
-from lhc.io.vcf_.tools.index import IndexedVcfFile
+from lhc.io.vcf_.index import IndexedVcfFile
from lhc.io.vcf_.iterator import VcfEntryIterator
from lhc.io.vcf_.set_ import VcfSet as VcfSetBase
from lhc.io.vcf_.tools.split_alt import _split_variant
-from warnings import warn
+from sofia_.step import Resource, Target
class VcfIterator(Target):
|
changed genomics template to use new vcf index
|
childsish_sofia
|
train
|
py
|
79792226603a01f69d1aaa46b48b77999175d8f0
|
diff --git a/gpiozero/internal_devices.py b/gpiozero/internal_devices.py
index <HASH>..<HASH> 100644
--- a/gpiozero/internal_devices.py
+++ b/gpiozero/internal_devices.py
@@ -92,6 +92,22 @@ class PolledInternalDevice(InternalDevice):
def event_delay(self, value):
self._event_delay = float(value)
+ def wait_for_active(self, timeout=None):
+ self._start_stop_events(True)
+ try:
+ return super(PolledInternalDevice, self).wait_for_active(timeout)
+ finally:
+ self._start_stop_events(
+ self.when_activated or self.when_deactivated)
+
+ def wait_for_inactive(self, timeout=None):
+ self._start_stop_events(True)
+ try:
+ return super(PolledInternalDevice, self).wait_for_inactive(timeout)
+ finally:
+ self._start_stop_events(
+ self.when_activated or self.when_deactivated)
+
def _watch_value(self):
while not self._event_thread.stopping.wait(self._event_delay):
self._fire_events(self.pin_factory.ticks(), self.is_active)
|
Fix wait-for methods
In my zeal for efficiency the background thread wasn't running when only
the wait-for methods were called!
|
RPi-Distro_python-gpiozero
|
train
|
py
|
cfef199e08c7c8228f910bd2f6ace3697f6294c8
|
diff --git a/allegedb/allegedb/cache.py b/allegedb/allegedb/cache.py
index <HASH>..<HASH> 100644
--- a/allegedb/allegedb/cache.py
+++ b/allegedb/allegedb/cache.py
@@ -389,8 +389,12 @@ class WindowDict(MutableMapping):
def __bool__(self):
return bool(self._past) or bool(self._future)
- def __init__(self, data={}):
- self._past = deque(sorted(data.items()))
+ def __init__(self, data=()):
+ if hasattr(data, 'items'):
+ self._past = deque(sorted(data.items()))
+ else:
+ # assume it's an orderable sequence of pairs
+ self._past = deque(sorted(data))
self._future = deque()
def __iter__(self):
|
Make WindowDict take a sequence of pairs
|
LogicalDash_LiSE
|
train
|
py
|
5cc3dcd08dbab1d8dda1808bfdd12ce76d75baaa
|
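The WindowDict commit above replaces a mutable default argument (`data={}`) with an immutable one and duck-types the input: mappings are detected via `hasattr(data, 'items')`, anything else is assumed to be an orderable sequence of pairs. A minimal standalone sketch of the same pattern (the class name and `_past` field here are illustrative, not LiSE's actual implementation):

```python
from collections import deque


class PairInit:
    """Accept either a mapping or an orderable sequence of (key, value) pairs."""

    def __init__(self, data=()):
        # () is a safe default: unlike {}, it is immutable and cannot be
        # shared mutable state across calls.
        if hasattr(data, 'items'):
            # Mapping: sort its items for deterministic order.
            self._past = deque(sorted(data.items()))
        else:
            # Assume an orderable sequence of pairs.
            self._past = deque(sorted(data))


print(list(PairInit({'b': 2, 'a': 1})._past))      # [('a', 1), ('b', 2)]
print(list(PairInit([('b', 2), ('a', 1)])._past))  # [('a', 1), ('b', 2)]
```

Both construction styles yield the same sorted deque, which is what lets callers pass pre-built pair sequences without wrapping them in a dict first.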
diff --git a/test/test.js b/test/test.js
index <HASH>..<HASH> 100644
--- a/test/test.js
+++ b/test/test.js
@@ -384,7 +384,7 @@ describe('#setSignature()', function() {
{ 'foo': 'bar', 'logs-:a': 2 },
{ 'foo': 'baz', 'logs-:a': 3 },
]).getData().should.eql([
- { foo: 'bar', logs: { a: 1 } },
+ { foo: 'bar', logs: { a: 2 } },
{ foo: 'baz', logs: { a: 3 } }
]);
});
|
fixed tests: new assumption = repeated object nodes will be overwritten with subsequent data
|
kwhitley_treeize
|
train
|
js
|
0f985e9883e96aa7757a6cae3588151c2b0b367d
|
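The treeize test fix above encodes the new assumption that repeated object nodes are overwritten by subsequent data (last write wins: `logs.a` becomes 2, not 1). A hedged sketch of that merge semantics in isolation (function name and shape are illustrative, not treeize's API):

```python
def merge_last_wins(records, key):
    """Collapse records sharing `key`; later records overwrite earlier fields."""
    out = {}
    for rec in records:
        # setdefault keeps first-seen ordering; update lets later data win.
        out.setdefault(rec[key], {}).update(rec)
    return list(out.values())


rows = [
    {'foo': 'bar', 'a': 1},
    {'foo': 'bar', 'a': 2},
    {'foo': 'baz', 'a': 3},
]
print(merge_last_wins(rows, 'foo'))
# [{'foo': 'bar', 'a': 2}, {'foo': 'baz', 'a': 3}]
```

This mirrors the corrected expectation in the test: the second `'bar'` record's value overwrites the first.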
diff --git a/salt/cloud/cli.py b/salt/cloud/cli.py
index <HASH>..<HASH> 100644
--- a/salt/cloud/cli.py
+++ b/salt/cloud/cli.py
@@ -44,8 +44,11 @@ class SaltCloud(parsers.SaltCloudParser):
# Parse shell arguments
self.parse_args()
- salt_master_user = self.config.get('user', salt.utils.get_user())
- if salt_master_user is not None and not check_user(salt_master_user):
+ salt_master_user = self.config.get('user')
+ if salt_master_user is None:
+ salt_master_user = salt.utils.get_user()
+
+ if not check_user(salt_master_user):
self.error(
'If salt-cloud is running on a master machine, salt-cloud '
'needs to run as the same user as the salt-master, {0!r}. If '
|
salt-cloud: None means run as current user
|
saltstack_salt
|
train
|
py
|
a661be2aef160d8bfa3db44601bc8220173ab60c
|
diff --git a/producer.go b/producer.go
index <HASH>..<HASH> 100644
--- a/producer.go
+++ b/producer.go
@@ -78,7 +78,6 @@ func NewProducer(addr string, config *Config) (*Producer, error) {
exitChan: make(chan int),
responseChan: make(chan []byte),
errorChan: make(chan []byte),
- closeChan: make(chan int),
}
return p, nil
}
@@ -223,7 +222,7 @@ func (w *Producer) connect() error {
atomic.StoreInt32(&w.state, StateInit)
return err
}
-
+ w.closeChan = make(chan int)
w.wg.Add(1)
go w.router()
@@ -323,4 +322,8 @@ func (w *Producer) onConnResponse(c *Conn, data []byte) { w.responseChan <- data
func (w *Producer) onConnError(c *Conn, data []byte) { w.errorChan <- data }
func (w *Producer) onConnHeartbeat(c *Conn) {}
func (w *Producer) onConnIOError(c *Conn, err error) { w.close() }
-func (w *Producer) onConnClose(c *Conn) { w.closeChan <- 1 }
+func (w *Producer) onConnClose(c *Conn) {
+ w.guard.Lock()
+ defer w.guard.Unlock()
+ close(w.closeChan)
+}
|
producer: ensure that onConnClose can always make progress
|
nsqio_go-nsq
|
train
|
go
|
24158e04221762514a96bae2679f3c8e84ba1e0c
|
diff --git a/lib/constants.js b/lib/constants.js
index <HASH>..<HASH> 100644
--- a/lib/constants.js
+++ b/lib/constants.js
@@ -139,7 +139,7 @@ var Constants = {
AADConstants : {
WORLD_WIDE_AUTHORITY : 'login.windows.net',
- WELL_KNOWN_AUTHORITY_HOSTS : ['login.windows.net', 'login.microsoftonline.com', 'login.chinacloudapi.cn', 'login.cloudgovapi.us'],
+ WELL_KNOWN_AUTHORITY_HOSTS : ['login.windows.net', 'login.microsoftonline.com', 'login.chinacloudapi.cn', 'login.cloudgovapi.us', 'login.windows-ppe.net'],
INSTANCE_DISCOVERY_ENDPOINT_TEMPLATE : 'https://{authorize_host}/common/discovery/instance?authorization_endpoint={authorize_endpoint}&api-version=1.0',
AUTHORIZE_ENDPOINT_PATH : '/oauth2/authorize',
TOKEN_ENDPOINT_PATH : '/oauth2/token',
|
update well know authoritiy host for testing in ppe
|
AzureAD_azure-activedirectory-library-for-nodejs
|
train
|
js
|
268fe104c00ff0e3ae415fdde0f51f6d8f2c7c30
|
diff --git a/src/main/java/org/takes/rq/RqMultipart.java b/src/main/java/org/takes/rq/RqMultipart.java
index <HASH>..<HASH> 100644
--- a/src/main/java/org/takes/rq/RqMultipart.java
+++ b/src/main/java/org/takes/rq/RqMultipart.java
@@ -398,9 +398,9 @@ public interface RqMultipart extends Request {
/**
* Fake ctor.
- * @param req Fake request header holder
- * @param dispositions Fake request body parts
- * @throws IOException If fails
+ * @param req Fake request header holder
+ * @param dispositions Fake request body parts
+ * @throws IOException If fails
* @todo #252:30min/DEV New ctor should be created and expect
* instances of Request as dispositions. Dispositions should be
* multipart request body holder. New ctor should call
|
#<I> try to fix PMD error
|
yegor256_takes
|
train
|
java
|
ab17e8010aab670d4ac015bca07b3e4c69723800
|
diff --git a/html/pfappserver/root/src/views/Configuration/_composables/useViewCollectionItem.js b/html/pfappserver/root/src/views/Configuration/_composables/useViewCollectionItem.js
index <HASH>..<HASH> 100644
--- a/html/pfappserver/root/src/views/Configuration/_composables/useViewCollectionItem.js
+++ b/html/pfappserver/root/src/views/Configuration/_composables/useViewCollectionItem.js
@@ -105,9 +105,8 @@ export const useViewCollectionItem = (collection, props, context) => {
form.value = {}
reject(e)
})
- }).catch(e => { // meta may not be available, fail silently
+ }).catch(() => { // meta may not be available, fail silently
meta.value = {}
- console.error(e)
getItem().then(item => {
form.value = { ...item } // dereferenced
resolve()
@@ -122,11 +121,9 @@ export const useViewCollectionItem = (collection, props, context) => {
form.value = useItemDefaults(_meta, props, context)
meta.value = _meta
resolve()
- }).catch(e => {
+ }).catch(() => {
form.value = {}
meta.value = {}
- // reject(e)
- console.error(e)
resolve() // meta may not be available, fail silently
})
}
|
fix(admin(js)): remove console statements
|
inverse-inc_packetfence
|
train
|
js
|
9b5cf799bfaeb978c815b07532b9c4360a81acdb
|
diff --git a/src/drawer.js b/src/drawer.js
index <HASH>..<HASH> 100644
--- a/src/drawer.js
+++ b/src/drawer.js
@@ -50,18 +50,16 @@ $.Drawer = function(source, viewport, elmt) {
this.elmt = this._container;
- this._init();
+
+ this._canvas.style.width = "100%";
+ this._canvas.style.height = "100%";
+ this._canvas.style.position = "absolute";
+ this._container.style.textAlign = "left"; // explicit left-align
+ this._container.appendChild(this._canvas);
};
$.Drawer.prototype = {
- _init: function() {
- this._canvas.style.width = "100%";
- this._canvas.style.height = "100%";
- this._canvas.style.position = "absolute";
- this._container.style.textAlign = "left"; // explicit left-align
- this._container.appendChild(this._canvas);
- },
_compareTiles: function(prevBest, tile) {
if (!prevBest) {
return tile;
|
removed _init anti-pattern from drawer constructor
|
openseadragon_openseadragon
|
train
|
js
|
713e715fb3657a274f60c687eec60bad0d9c7da9
|
diff --git a/container/synth.py b/container/synth.py
index <HASH>..<HASH> 100644
--- a/container/synth.py
+++ b/container/synth.py
@@ -25,7 +25,7 @@ gapic = gcp.GAPICGenerator()
library = gapic.py_library(
'container',
'v1',
- config_path='/google/container/artman_container.yaml',
+ config_path='/google/container/artman_container_v1.yaml',
artman_output_name='container-v1')
s.move(library / 'google/cloud/container_v1')
|
Update synth.py yaml location (#<I>)
|
googleapis_google-cloud-python
|
train
|
py
|
1d3ea13a53105106a84c716e20a8cac5a7b6874a
|
diff --git a/src/main/java/io/vlingo/http/resource/ResourceHandler.java b/src/main/java/io/vlingo/http/resource/ResourceHandler.java
index <HASH>..<HASH> 100644
--- a/src/main/java/io/vlingo/http/resource/ResourceHandler.java
+++ b/src/main/java/io/vlingo/http/resource/ResourceHandler.java
@@ -11,7 +11,7 @@ import io.vlingo.actors.Completes;
import io.vlingo.http.Context;
import io.vlingo.http.Response;
-public class ResourceHandler {
+public abstract class ResourceHandler {
protected Context context;
protected ResourceHandler() {
|
Implemented request dispatching with completable responses.
|
vlingo_vlingo-http
|
train
|
java
|
6a58c45bdd6c02afb1ca93a05321f5afdff140e4
|
diff --git a/views/js/assets/strategies.js b/views/js/assets/strategies.js
index <HASH>..<HASH> 100644
--- a/views/js/assets/strategies.js
+++ b/views/js/assets/strategies.js
@@ -76,7 +76,8 @@ define([
taomedia : {
name : 'taomedia',
handle : function handleTaoMedia(url, data){
- var baseUrl = data.baseUrl || '';
+ //either a baseUrl is given or if empty, taomedia resources are managed as relative resources
+ var baseUrl = data.baseUrl || './';
if( (typeof url === 'object' && url.protocol === 'taomedia') ||
(/^taomedia:\/\//.test(url.toString())) ){
return baseUrl + url.toString();
|
taomedia is resolved as a relative without a baseUrl
|
oat-sa_extension-tao-item
|
train
|
js
|
055069628124042c685f226b779d61df09c8abd6
|
diff --git a/inspire_json_merger/comparators.py b/inspire_json_merger/comparators.py
index <HASH>..<HASH> 100644
--- a/inspire_json_merger/comparators.py
+++ b/inspire_json_merger/comparators.py
@@ -86,9 +86,10 @@ def get_pk_comparator(primary_key_fields, normalization_functions=None):
class NoMatch(object):
- """Ensure the result of a normalization function is not a match."""
- def __equals__(self, other):
- return False
+ """Ensure the result of a normalization function is not a match.
+
+ It has no __eq__ method so two instances always test unequal."""
+ pass
class SubdictNormalizer(object):
|
comparators: clarify NoMatch class
|
inspirehep_inspire-json-merger
|
train
|
py
|
a1a558a73fa7e77d7a943c0abb346a5ddf978619
|
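The `NoMatch` commit above relies on a subtle Python detail: the original class defined `__equals__`, which is not a special method Python ever calls, so removing it changes nothing. Without an `__eq__`, instances fall back to identity comparison, so two distinct `NoMatch` instances always test unequal. A minimal demonstration of that default behavior:

```python
class NoMatch:
    """Sentinel whose distinct instances never compare equal.

    With no __eq__ defined, Python falls back to identity comparison,
    so NoMatch() == NoMatch() is always False.
    """
    pass


a, b = NoMatch(), NoMatch()
print(a == b)  # False: different instances, identity comparison
print(a == a)  # True: same instance
```

This is why the sentinel works as a "never matches" normalization result without any method body at all.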
diff --git a/lib/starting_blocks/watcher.rb b/lib/starting_blocks/watcher.rb
index <HASH>..<HASH> 100644
--- a/lib/starting_blocks/watcher.rb
+++ b/lib/starting_blocks/watcher.rb
@@ -25,7 +25,7 @@ module StartingBlocks
StartingBlocks::Watcher.run_it(modified[0]) if modified.count > 0
end
- ::Listen.to location, &callback
+ ::Listen::Listener.new location, &callback
end
def add_it(file_that_changed)
|
Return a listener, not a thread.
|
darrencauthon_starting_blocks
|
train
|
rb
|
bdfc2a57e77f20c2e60dc6127b84252839871791
|
diff --git a/vendor/qunit/qunit.js b/vendor/qunit/qunit.js
index <HASH>..<HASH> 100644
--- a/vendor/qunit/qunit.js
+++ b/vendor/qunit/qunit.js
@@ -1051,7 +1051,7 @@ QUnit.jsDump = (function() {
name:'name',
'class':'className'
},
- HTML:true,//if true, entities are escaped ( <, >, \t, space and \n )
+ HTML:false,//if true, entities are escaped ( <, >, \t, space and \n )
indentChar:' ',//indentation unit
multiline:true //if true, items in a collection, are separated by a \n, else just a space.
};
|
Avoid ugliness in QUnit output.
|
pegjs_pegjs
|
train
|
js
|
f33d806413e43262194f6bcf394bd17acf5df8f9
|
diff --git a/src/test/org/openscience/cdk/smiles/DeduceBondSystemToolTest.java b/src/test/org/openscience/cdk/smiles/DeduceBondSystemToolTest.java
index <HASH>..<HASH> 100644
--- a/src/test/org/openscience/cdk/smiles/DeduceBondSystemToolTest.java
+++ b/src/test/org/openscience/cdk/smiles/DeduceBondSystemToolTest.java
@@ -22,6 +22,7 @@ package org.openscience.cdk.smiles;
import org.junit.Assert;
import org.junit.BeforeClass;
+import org.junit.Ignore;
import org.junit.Test;
import org.openscience.cdk.CDKConstants;
import org.openscience.cdk.CDKTestCase;
@@ -131,7 +132,7 @@ public class DeduceBondSystemToolTest extends CDKTestCase {
/**
* @cdk.bug 3506770
*/
- @Test
+ @Ignore("This is an example structure where this class fails")
public void testLargeBioclipseUseCase() throws Exception {
String smiles = "COc1ccc2[C@@H]3[C@H](COc2c1)C(C)(C)OC4=C3C(=O)C(=O)C5=C4OC(C)(C)[C@@H]6COc7cc(OC)ccc7[C@H]56";
SmilesParser smilesParser = new SmilesParser(DefaultChemObjectBuilder.getInstance());
|
Ignore this failing test case; it was one of the original points at which we decided new tools were needed
Change-Id: Ie<I>cd<I>dd<I>a<I>b3d<I>ab<I>b<I>df<I>
|
cdk_cdk
|
train
|
java
|
f85e3611ce907738486b2a341406322fde7e6921
|
diff --git a/lib/actor-system.js b/lib/actor-system.js
index <HASH>..<HASH> 100644
--- a/lib/actor-system.js
+++ b/lib/actor-system.js
@@ -53,9 +53,9 @@ class ActorSystem {
this.debugPortCounter = 1;
this.log = options.log || new Logger();
this.options = _.clone(options);
-
+
if (options.test) this.log.setLevel(this.log.levels().Error); // Only output errors in tests.
-
+
if (options.debug) {
this.log.setLevel(this.log.levels().Debug); // Overrides test option.
@@ -142,18 +142,19 @@ class ActorSystem {
});
}
}
-
+
this.rootActorPromise = this.rootActorPromise
.tap(() => this._loadConfiguration(options.config))
.tap(actor => {
- return actor.initialize()
- .catch(err => {
- this.log.warn('Actor initialization failed, destroying, actor=' + actor);
+ var initPromise = actor.initialize();
- actor.destroy();
+ initPromise && initPromise.catch(err => {
+ this.log.warn('Actor initialization failed, destroying, actor=' + actor);
- throw err;
- });
+ actor.destroy();
+
+ throw err;
+ });
});
this.forkedProcesses = {};
@@ -827,7 +828,7 @@ class ActorSystem {
if (defaultSystem) {
defaultSystem = new ActorSystem();
}
-
+
return defaultSystem;
}
|
(saymon) Fixed failure after actor initialization.
|
untu_comedy
|
train
|
js
|
c269b7df0dc64aa2ddcb6443c5d025052cd8d07f
|
diff --git a/Neos.Flow/Classes/Core/Bootstrap.php b/Neos.Flow/Classes/Core/Bootstrap.php
index <HASH>..<HASH> 100644
--- a/Neos.Flow/Classes/Core/Bootstrap.php
+++ b/Neos.Flow/Classes/Core/Bootstrap.php
@@ -547,7 +547,7 @@ class Bootstrap
define('FLOW_ONLY_COMPOSER_LOADER', $onlyUseComposerAutoLoaderForPackageClasses);
- define('FLOW_VERSION_BRANCH', 'master');
+ define('FLOW_VERSION_BRANCH', '4.3');
}
/**
|
TASK: Set FLOW_VERSION_BRANCH constant to <I>
|
neos_flow-development-collection
|
train
|
php
|
c3d66e1492c3146c96c2bed9235d911dc77e91ea
|
diff --git a/usb1.py b/usb1.py
index <HASH>..<HASH> 100644
--- a/usb1.py
+++ b/usb1.py
@@ -772,6 +772,10 @@ class USBDevice(object):
config = libusb1.libusb_config_descriptor_p()
result = libusb1.libusb_get_config_descriptor(device_p,
configuration_id, byref(config))
+ if result == libusb1.LIBUSB_ERROR_NOT_FOUND:
+ # Some devices (ex windows' root hubs) tell they have one
+ # configuration, but they have no configuration descriptor.
+ continue
if result:
raise libusb1.USBError(result)
append(config.contents)
|
Work around bogus device descriptors advertising too many configurations.
|
vpelletier_python-libusb1
|
train
|
py
|
8b410f35bb5e4c86a5438c1c70e902bc6bfd1271
|
diff --git a/lib/queue/events.js b/lib/queue/events.js
index <HASH>..<HASH> 100644
--- a/lib/queue/events.js
+++ b/lib/queue/events.js
@@ -47,7 +47,7 @@ exports.add = function(job){
*/
exports.subscribe = function(){
- var client = pool.pubSubClient();
+ var client = pool.pubsubClient();
client.subscribe(exports.key);
client.on('message', this.onMessage);
};
diff --git a/lib/queue/pool.js b/lib/queue/pool.js
index <HASH>..<HASH> 100644
--- a/lib/queue/pool.js
+++ b/lib/queue/pool.js
@@ -23,8 +23,6 @@ exports.maxConnections = 5;
exports.pool = [];
-exports.pubSub = false;
-
/**
* Allocate a redis connection.
*
@@ -42,10 +40,13 @@ exports.alloc = function(){
return client;
};
-exports.pubSubClient = function() {
- if ( ! exports.pubSub)
- {
- exports.pubSub = redis.createClient();
- }
- return exports.pubSub;
-}
\ No newline at end of file
+/**
+ * Return the pubsub-specific redis client.
+ *
+ * @return {RedisClient}
+ * @api private
+ */
+
+exports.pubsubClient = function(){
+ return exports._pubsub || (exports._pubsub = redis.createClient());
+};
\ No newline at end of file
|
Fixed the really weird pubSubClient method that someone wrote
|
Automattic_kue
|
train
|
js,js
|
4cd4ce3d15ab12990ed2a6538d2fe44f8396d75e
|
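The kue commit above rewrites `pubSubClient` as a memoized lazy singleton: `exports._pubsub || (exports._pubsub = redis.createClient())` creates the client on first call and reuses it afterwards. The same idiom in Python, sketched with a generic factory so it does not assume any Redis API:

```python
_client = None


def pubsub_client(factory):
    """Create the shared client on first use, then reuse it (lazy singleton)."""
    global _client
    if _client is None:
        _client = factory()
    return _client


# Any zero-argument factory works; `object` stands in for redis.createClient.
c1 = pubsub_client(object)
c2 = pubsub_client(object)
print(c1 is c2)  # True: the factory ran only once
```

The design point is that the connection is not opened until something actually subscribes, and every subscriber shares one connection.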
diff --git a/src/suppress-warnings.js b/src/suppress-warnings.js
index <HASH>..<HASH> 100644
--- a/src/suppress-warnings.js
+++ b/src/suppress-warnings.js
@@ -1,3 +1,5 @@
+import { decorate } from './private/utils';
+
function suppressedWarningNoop() {
// Warnings are currently suppressed via @suppressWarnings
}
@@ -21,4 +23,4 @@ function handleDescriptor(target, key, descriptor, warningTypes) {
export default function suppressWarnings() {
return decorate(handleDescriptor, arguments);
-}
\ No newline at end of file
+}
|
Missing dependency in suppress-warnings.js
|
jayphelps_core-decorators
|
train
|
js
|
adaed485aca05bb593799bea0de5e7c9eab6baf8
|
diff --git a/lib/discordrb/gateway.rb b/lib/discordrb/gateway.rb
index <HASH>..<HASH> 100644
--- a/lib/discordrb/gateway.rb
+++ b/lib/discordrb/gateway.rb
@@ -338,6 +338,7 @@ module Discordrb
end
def handle_error(e)
+ LOGGER.error(e)
end
def handle_message(msg)
|
Implement an error handler (that simply logs the error)
|
meew0_discordrb
|
train
|
rb
|
99dee3cd36076c4ae9b60b275f6a71c160dc61dc
|
diff --git a/glue/pipeline.py b/glue/pipeline.py
index <HASH>..<HASH> 100644
--- a/glue/pipeline.py
+++ b/glue/pipeline.py
@@ -2007,7 +2007,7 @@ class CondorDAG:
workflow_job.uses(Pegasus.DAX3.File(os.path.basename(f)),link=Pegasus.DAX3.Link.OUTPUT,register=False,transfer=True)
for f in checkpoint_node_file_dict.keys():
- workflow_job.uses(Pegasus.DAX3.File(os.path.basename(f)),link='checkpoint',register=False,transfer=True)
+ workflow_job.uses(Pegasus.DAX3.File(os.path.basename(f)),link=Pegasus.DAX3.Link.CHECKPOINT,register=False,transfer=True)
node_file_dict = dict( input_node_file_dict.items() + output_node_file_dict.items() + checkpoint_node_file_dict.items() )
|
pipeline.py use Pegasus class for Link.CHECKPOINT
|
gwastro_pycbc-glue
|
train
|
py
|
c28ff05b1eaadb7bb4bec1e46cee5ef9bd35d685
|
diff --git a/pyghmi/ipmi/fru.py b/pyghmi/ipmi/fru.py
index <HASH>..<HASH> 100644
--- a/pyghmi/ipmi/fru.py
+++ b/pyghmi/ipmi/fru.py
@@ -172,6 +172,8 @@ class FRU(object):
raise iexc.IpmiException(response['error'], response['code'])
self.rawfru.extend(response['data'][1:])
offset += response['data'][0]
+ if response['data'][0] == 0:
+ break
if offset + chunksize > frusize:
chunksize = frusize - offset
|
Break out of FRU read if zero data returned
If a BMC returns 0 data, then assume end of data has been
reached. Not all implementations accurately advertise FRU
size, so this is a way to detect and break out.
Change-Id: I<I>d<I>f<I>e<I>b<I>ae<I>ee7ee9e
|
openstack_pyghmi
|
train
|
py
|
1facf8cb4b9f8b22d52b8739e0214aa5cb006d70
|
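The pyghmi commit above guards a chunked-read loop against devices that misreport their FRU size: if a read returns zero bytes, the loop assumes end-of-data and breaks instead of looping forever. A self-contained sketch of that loop shape (the `read_chunk` callback is a stand-in for the real IPMI read, not pyghmi's API):

```python
def read_all(read_chunk, total_size, chunksize=16):
    """Read up to total_size bytes in chunks, stopping early on an empty chunk.

    Some devices advertise a larger size than they can deliver, so an empty
    read is treated as end-of-data rather than an error.
    """
    buf = bytearray()
    offset = 0
    while offset < total_size:
        # Trim the final chunk so we never request past the advertised size.
        if offset + chunksize > total_size:
            chunksize = total_size - offset
        chunk = read_chunk(offset, chunksize)
        if not chunk:
            break  # device returned no data: assume the real end was reached
        buf.extend(chunk)
        offset += len(chunk)
    return bytes(buf)


# A fake device that claims 100 bytes but only holds 6.
data = b'abcdef'
print(read_all(lambda off, n: data[off:off + n], 100))  # b'abcdef'
```

Without the empty-chunk break, a misreported size would either loop forever or raise, which is exactly the failure mode the commit works around.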
diff --git a/models/fallahi_eval/assemble_models.py b/models/fallahi_eval/assemble_models.py
index <HASH>..<HASH> 100644
--- a/models/fallahi_eval/assemble_models.py
+++ b/models/fallahi_eval/assemble_models.py
@@ -22,6 +22,16 @@ def run_reading(pmid_fname):
# Download the file and save
+def get_stmt_sif(stmts, fname):
+ with open(fname, 'wt') as fh:
+ for stmt in stmts:
+ agents = [a for a in stmt.agent_list() if a is not None]
+ if len(agents) != 2:
+ continue
+ fh.write('%s,%s,%s\n' %
+ (agents[0].name, stmt.uuid, agents[1].name))
+
+
if __name__ == '__main__':
# Load the data and get the gene names
data = process_data.read_rppa_data()
|
Add a function to get Statement SIFs
|
sorgerlab_indra
|
train
|
py
|