title: "Hardware requirements" linkTitle: "Hardware Requirements" description: "Planning a Sensu deployment? Read this guide to learn about the hardware and networking requirements for running Sensu backends and agents on your organization's infrastructure." weight: 5 version: "5.9" product: "Sensu Go" platformContent: false menu: sensu-go-5.9: parent: installation --- - [Sensu backend requirements](#sensu-backend) - [Sensu agent requirements](#sensu-agent) - [Networking recommendations](#networking-recommendations) - [Cloud recommendations](#cloud-recommendations) ## Sensu backend ### Backend minimum requirements The following configuration is the minimum required to run the Sensu backend, however it is insufficient for production use. See the [recommended configuration](#backend-recommended-configuration) for production recommendations. * 64-bit Intel or AMD CPU * 4 GB RAM * 4 GB free disk space * 10 mbps network link ### Backend recommended configuration The following configuration is recommended as a baseline for production use to ensure a good user and operator experience. Using additional resources (even over-provisioning) further improves stability and scalability. * 64 bit 4-core Intel or AMD CPU * 8 GB RAM * SSD (NVMe or SATA3) * Gigabit ethernet The Sensu backend is typically CPU and storage intensive. In general, its use of these resources scales linearly with the total number of checks executed by all Sensu agents connecting to the backend. The Sensu backend is a massively parallel application that can scale to any number of CPU cores. Provision roughly 1 CPU core for every 50 checks per second (including agent keepalives). Most installations are fine with 4 CPU cores, but larger installations may find that additional CPU cores (8+) are necessary. Every executed Sensu check results in storage writes. When provisioning storage, a good guideline is to have twice as many **sustained disk IOPS** as you expect to have events per second. Don't forget to include agent keepalives in this calculation; each agent publishes a keepalive every 20 seconds. For example, in a cluster of 100 agents, you can expect those agents to consume 10 write IOPS for keepalives. The Sensu backend uses a relatively modest amount of RAM under most circumstances. Larger production deployments use a larger amount of RAM (8+ GB). ## Sensu agent ### Agent minimum requirements The following configuration is the minimum required to run the Sensu agent, however it is insufficient for production use. See the [recommended configuration](#agent-recommended-configuration) for production recommendations. * 386, amd64, or ARM CPU (armv5 minimum) * 128 MB RAM * 10 mbps network link ### Agent recommended configuration The following configuration is recommended as a baseline for production use to ensure a good user and operator experience. * 64 bit 4-core Intel or AMD CPU * 512 MB RAM * Gigabit ethernet The Sensu agent itself is quite lightweight, and should be able to run on all but the most modest hardware. However, since the agent is responsible for executing checks, factor the agent's responsibilities into your hardware provisioning. ## Networking recommendations ### Agent connections Sensu uses WebSockets for communication between the agent and backend. All communication occurs over a single TCP socket. It's recommended that users connect backends and agents via gigabit ethernet, but any somewhat-reliable network link should work (e.g. WiFi and 4G). 
If you see WebSocket timeouts in the backend logs, you may need to use a better network link between the backend and agents. ## Cloud recommendations ### AWS The recommended EC2 instance type and size for Sensu backends running embedded etcd is **M5d.xlarge**. The [M5d instance](https://aws.amazon.com/ec2/instance-types/m5/) provides 4 vCPU, 16 GB of RAM, up to 10 Gbps network connectivity, and a 150 NVMe SSD directly attached to the instance host (optimal for sustained disk IOPS).
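To make the sizing guidance above concrete, here is a small illustrative Python sketch (not part of Sensu; the constants simply encode this guide's rules of thumb) that estimates CPU cores and sustained write IOPS for a planned deployment:

```python
import math

# Planning heuristics from this guide, not Sensu configuration:
# one keepalive per agent every 20 seconds, roughly 1 core per
# 50 checks/second, and 2x sustained write IOPS per event/second.
KEEPALIVE_INTERVAL_SECONDS = 20
CHECKS_PER_SECOND_PER_CORE = 50
IOPS_PER_EVENT_PER_SECOND = 2

def backend_sizing(agent_count, checks_per_second):
    """Estimate CPU cores and sustained write IOPS for one Sensu backend."""
    keepalives_per_second = agent_count / KEEPALIVE_INTERVAL_SECONDS
    events_per_second = checks_per_second + keepalives_per_second
    # The recommended baseline is a 4-core CPU, so never size below that.
    cores = max(4, math.ceil(events_per_second / CHECKS_PER_SECOND_PER_CORE))
    iops = events_per_second * IOPS_PER_EVENT_PER_SECOND
    return {"cpu_cores": int(cores), "sustained_write_iops": int(iops)}

# 100 agents publishing keepalives only: 5 events/s -> 10 write IOPS,
# matching the keepalive example above.
print(backend_sizing(agent_count=100, checks_per_second=0))
```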
{ "content_hash": "ddff008b40c394405a91bd47f558af61", "timestamp": "", "source": "github", "line_count": 108, "max_line_length": 192, "avg_line_length": 37.074074074074076, "alnum_prop": 0.792957042957043, "repo_name": "sensu/sensu-docs", "id": "f5d0144689d388ca1cfa738cfdf956da2ccd4884", "size": "4008", "binary": false, "copies": "1", "ref": "refs/heads/main", "path": "archived/sensu-go/5.9/installation/recommended-hardware.md", "mode": "33188", "license": "mit", "language": [ { "name": "HTML", "bytes": "123020" }, { "name": "JavaScript", "bytes": "61971" }, { "name": "Procfile", "bytes": "14" }, { "name": "Python", "bytes": "3764" }, { "name": "Ruby", "bytes": "4422" }, { "name": "SCSS", "bytes": "32403" }, { "name": "Shell", "bytes": "30924" } ], "symlink_target": "" }
<!--
  #%L
  GwtBootstrap3
  %%
  Copyright (C) 2013 - 2014 GwtBootstrap3
  %%
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License.
  #L%
  -->
<module rename-to='respond'>
    <inherits name="com.google.gwt.resources.Resources"/>
    <source path='client'/>
    <entry-point class="org.gwtbootstrap3.extras.respond.client.RespondEntryPoint"/>
</module>
{ "content_hash": "e26b73f3f2e640de62b5b73be8642c92", "timestamp": "", "source": "github", "line_count": 26, "max_line_length": 84, "avg_line_length": 32.69230769230769, "alnum_prop": 0.7223529411764706, "repo_name": "Sage-Bionetworks/gwtbootstrap3-extras", "id": "f2fbe6010214d40c72f6f62249bb789b512aa0db", "size": "850", "binary": false, "copies": "2", "ref": "refs/heads/master", "path": "src/main/resources/org/gwtbootstrap3/extras/respond/Respond.gwt.xml", "mode": "33188", "license": "apache-2.0", "language": [], "symlink_target": "" }
import sys

try:
    import boto
except ImportError:
    sys.stderr.write('boto required\n')
    raise

from boto.exception import BotoServerError
from distutils.version import LooseVersion
import time
import os


class aws(object):
    def __init__(self, delay=0, maxdelay=16):
        self.rate_limit_delay = delay
        self.rate_limit_maxdelay = maxdelay
        # requires unmerged https://github.com/boto/boto/pull/2898
        self.min_boto_version = '2.35.2'
        self.region = 'us-east-1'
        # instance metadata cache, populated by instance_info()
        self.info = {}
        if not self.validate_version(self.min_boto_version):
            sys.stderr.write("boto >= %s required\n" % self.min_boto_version)
            raise ImportError

        # Import ENV vars if available, fall back to IAM instance-profile
        self.aws_access = None
        self.aws_secret = None
        if 'AWS_ACCESS_KEY' in os.environ:
            self.aws_access = os.environ['AWS_ACCESS_KEY']
        if 'AWS_SECRET_KEY' in os.environ:
            self.aws_secret = os.environ['AWS_SECRET_KEY']

    def access_key(self):
        return self.aws_access

    def secret_key(self):
        return self.aws_secret

    def validate_version(self, version):
        if LooseVersion(boto.Version) < LooseVersion(version):
            return False
        return True

    def wrap(self, awsfunc, *args, **nargs):
        """
        Wrap AWS call with Rate-Limiting backoff
        Gratefully taken from Netflix/security_monkey
        """
        attempts = 0
        while True:
            attempts = attempts + 1
            try:
                if self.rate_limit_delay > 0:
                    time.sleep(self.rate_limit_delay)

                retval = awsfunc(*args, **nargs)

                if self.rate_limit_delay > 0:
                    self.rate_limit_delay = self.rate_limit_delay / 2

                return retval
            except BotoServerError as e:
                if e.error_code == 'Throttling':
                    if self.rate_limit_delay == 0:
                        self.rate_limit_delay = 1
                        sys.stderr.write('rate-limited: attempt %d\n' % attempts)
                    elif self.rate_limit_delay < self.rate_limit_maxdelay:
                        self.rate_limit_delay = self.rate_limit_delay * 2
                        sys.stderr.write('rate-limited: attempt %d\n' % attempts)
                    else:
                        raise e
                elif e.error_code == 'ServiceUnavailable':
                    if self.rate_limit_delay == 0:
                        self.rate_limit_delay = 1
                        sys.stderr.write('api-unavailable: attempt %d\n' % attempts)
                    elif self.rate_limit_delay < self.rate_limit_maxdelay:
                        self.rate_limit_delay = self.rate_limit_delay * 2
                        sys.stderr.write('api-unavailable: attempt %d\n' % attempts)
                    else:
                        raise e
                else:
                    raise e

    def instance_info(self):
        identity = boto.utils.get_instance_identity(timeout=60, num_retries=5)
        self.info['instanceId'] = identity['document'][u'instanceId']
        self.info['availabilityZone'] = \
            identity['document'][u'availabilityZone']
        self.info['region'] = identity['document'][u'region']
        return self.info
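A minimal usage sketch for the wrapper above (an editor's illustration, not part of billow): the EC2 connection call is standard boto API, and the retry behavior comes entirely from wrap(), which doubles the delay on throttling up to maxdelay and halves it again after successes.

# usage_sketch.py -- illustrative only; assumes EC2 credentials/region
# are available via the environment or an instance profile.
import boto.ec2

client = aws()
conn = boto.ec2.connect_to_region(
    client.region,
    aws_access_key_id=client.access_key(),
    aws_secret_access_key=client.secret_key())

# Any boto call can be routed through wrap() to get backoff-on-throttle.
reservations = client.wrap(conn.get_all_reservations)
print(len(reservations))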
{ "content_hash": "2e98bcb10fb6d35158fb9be0b068cb40", "timestamp": "", "source": "github", "line_count": 100, "max_line_length": 74, "avg_line_length": 35.98, "alnum_prop": 0.5113952195664258, "repo_name": "WrathOfChris/billow", "id": "d1bf7801feaaad57405ee717b0874ac67e76e769", "size": "3598", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "billow/aws.py", "mode": "33188", "license": "bsd-2-clause", "language": [ { "name": "Python", "bytes": "141262" } ], "symlink_target": "" }
frappe.upload = {
	make: function(opts) {
		if(!opts.args) opts.args = {};
		var $upload = $(frappe.render_template("upload", {opts:opts})).appendTo(opts.parent);
		var $file_input = $upload.find(".input-upload-file");

		// bind pseudo browse button
		$upload.find(".btn-browse").on("click", function() {
			$file_input.click();
		});

		$file_input.on("change", function() {
			if (this.files.length > 0) {
				$upload.find(".web-link-wrapper").addClass("hidden");

				var $uploaded_file_display = $(repl('<div class="btn-group" role="group">\
					<button type="button" class="btn btn-default btn-sm \
						text-ellipsis uploaded-filename-display">%(filename)s\
					</button>\
					<button type="button" class="btn btn-default btn-sm uploaded-file-remove">\
						&times;</button>\
					</div>', {filename: this.files[0].name}))
					.appendTo($upload.find(".uploaded-filename").removeClass("hidden").empty());

				$uploaded_file_display.find(".uploaded-filename-display").on("click", function() {
					$file_input.click();
				});

				$uploaded_file_display.find(".uploaded-file-remove").on("click", function() {
					$file_input.val("");
					$file_input.trigger("change");
				});

				if(opts.on_select) {
					opts.on_select();
				}
			} else {
				$upload.find(".uploaded-filename").addClass("hidden");
				$upload.find(".web-link-wrapper").removeClass("hidden");
			}
		});

		if(!opts.btn) {
			opts.btn = $('<button class="btn btn-default btn-sm">'
				+ __("Attach") + '</div>').appendTo($upload);
		} else {
			$(opts.btn).unbind("click");
		}

		// get the first file
		opts.btn.click(function() {
			// convert functions to values
			if(opts.get_params) {
				opts.args.params = opts.get_params();
			}

			opts.args.file_url = $upload.find('[name="file_url"]').val();

			var fileobj = $upload.find(":file").get(0).files[0];
			frappe.upload.upload_file(fileobj, opts.args, opts);
		});
	},
	upload_file: function(fileobj, args, opts) {
		if(!fileobj && !args.file_url) {
			if(opts.on_no_attach) {
				opts.on_no_attach();
			} else {
				msgprint(__("Please attach a file or set a URL"));
			}
			return;
		}

		var dataurl = null;
		var _upload_file = function() {
			if (args.file_size) {
				frappe.upload.validate_max_file_size(args.file_size);
			}
			if(opts.on_attach) {
				opts.on_attach(args, dataurl);
			} else {
				var msgbox = msgprint(__("Uploading..."));
				if(opts.start) {
					opts.start();
				}
				var ajax_args = {
					"method": "uploadfile",
					args: args,
					callback: function(r) {
						if(!r._server_messages) msgbox.hide();
						if(r.exc) {
							// if no onerror, assume callback will handle errors
							opts.onerror ? opts.onerror(r) : opts.callback(null, null, r);
							return;
						}
						var attachment = r.message;
						opts.callback(attachment, r);
						$(document).trigger("upload_complete", attachment);
					},
					error: function(r) {
						// if no onerror, assume callback will handle errors
						opts.onerror ? opts.onerror(r) : opts.callback(null, null, r);
						return;
					}
				};

				// copy handlers etc from opts
				$.each(['queued', 'running', "progress", "always", "btn"], function(i, key) {
					if(opts[key]) ajax_args[key] = opts[key];
				});

				return frappe.call(ajax_args);
			}
		};

		if(args.file_url) {
			_upload_file();
		} else {
			var freader = new FileReader();
			freader.onload = function() {
				args.filename = fileobj.name;
				if(opts.options && opts.options.toLowerCase()=="image") {
					if(!frappe.utils.is_image_file(args.filename)) {
						msgprint(__("Only image extensions (.gif, .jpg, .jpeg, .tiff, .png, .svg) allowed"));
						return;
					}
				}

				if((opts.max_width || opts.max_height) && frappe.utils.is_image_file(args.filename)) {
					frappe.utils.resize_image(freader, function(_dataurl) {
						dataurl = _dataurl;
						args.filedata = _dataurl.split(",")[1];
						args.file_size = Math.round(args.filedata.length * 3 / 4);
						_upload_file();
					});
				} else {
					dataurl = freader.result;
					args.filedata = freader.result.split(",")[1];
					args.file_size = fileobj.size;
					_upload_file();
				}
			};
			freader.readAsDataURL(fileobj);
		}
	},
	get_string: function(dataURI) {
		// remove filename
		var parts = dataURI.split(',');
		if(parts[0].indexOf(":")===-1) {
			var a = parts[2];
		} else {
			var a = parts[1];
		}
		return decodeURIComponent(escape(atob(a)));
	},
	validate_max_file_size: function(file_size) {
		var max_file_size = frappe.boot.max_file_size || 5242880;
		if (file_size > max_file_size) {
			// validate max file size
			frappe.throw(__("File size exceeded the maximum allowed size of {0} MB",
				[max_file_size / 1048576]));
		}
	}
};
{ "content_hash": "0619a89c9a948db42beab2e19a706e44", "timestamp": "", "source": "github", "line_count": 173, "max_line_length": 104, "avg_line_length": 27.514450867052023, "alnum_prop": 0.5945378151260504, "repo_name": "jevonearth/frappe", "id": "2aa8bef8d1152cf813288b1de96f28b24da34260", "size": "4889", "binary": false, "copies": "3", "ref": "refs/heads/develop", "path": "frappe/public/js/frappe/upload.js", "mode": "33188", "license": "mit", "language": [ { "name": "CSS", "bytes": "279822" }, { "name": "HTML", "bytes": "1327148" }, { "name": "JavaScript", "bytes": "1089936" }, { "name": "Python", "bytes": "1226972" }, { "name": "Shell", "bytes": "517" } ], "symlink_target": "" }
---
layout: page
author: Wikipedia Contributors
title: ECC Backdoors
link: https://en.wikipedia.org/wiki/Elliptic-curve_cryptography#Backdoors
date: 1111-11-11
lesson: 16
---
{ "content_hash": "f6bbd5dbc4e5c29a90c30ec8b8e9a3f5", "timestamp": "", "source": "github", "line_count": 7, "max_line_length": 73, "avg_line_length": 24.428571428571427, "alnum_prop": 0.7894736842105263, "repo_name": "gabridome/gabridome.github.io", "id": "c32630faf144b8121d1bc01e52935fa0f2b332e5", "size": "175", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "collections/_links/ecc-backdoors.md", "mode": "33188", "license": "mit", "language": [ { "name": "HTML", "bytes": "29800" }, { "name": "JavaScript", "bytes": "9201" }, { "name": "Python", "bytes": "10146" }, { "name": "Ruby", "bytes": "571" }, { "name": "SCSS", "bytes": "43137" }, { "name": "Shell", "bytes": "1132" } ], "symlink_target": "" }
using content::BrowserThread;
using extensions::BluetoothApiSocket;
using extensions::api::bluetooth_socket::ListenOptions;
using extensions::api::bluetooth_socket::SocketInfo;
using extensions::api::bluetooth_socket::SocketProperties;

namespace {

const char kDeviceNotFoundError[] = "Device not found";
const char kInvalidUuidError[] = "Invalid UUID";
const char kPermissionDeniedError[] = "Permission denied";
const char kSocketNotFoundError[] = "Socket not found";

linked_ptr<SocketInfo> CreateSocketInfo(int socket_id,
                                        BluetoothApiSocket* socket) {
  DCHECK(BrowserThread::CurrentlyOn(BluetoothApiSocket::kThreadId));
  linked_ptr<SocketInfo> socket_info(new SocketInfo());
  // This represents what we know about the socket, and does not call through
  // to the system.
  socket_info->socket_id = socket_id;
  if (socket->name()) {
    socket_info->name.reset(new std::string(*socket->name()));
  }
  socket_info->persistent = socket->persistent();
  if (socket->buffer_size() > 0) {
    socket_info->buffer_size.reset(new int(socket->buffer_size()));
  }
  socket_info->paused = socket->paused();
  socket_info->connected = socket->IsConnected();

  if (socket->IsConnected())
    socket_info->address.reset(new std::string(socket->device_address()));
  socket_info->uuid.reset(new std::string(socket->uuid().canonical_value()));

  return socket_info;
}

void SetSocketProperties(BluetoothApiSocket* socket,
                         SocketProperties* properties) {
  if (properties->name.get()) {
    socket->set_name(*properties->name.get());
  }
  if (properties->persistent.get()) {
    socket->set_persistent(*properties->persistent.get());
  }
  if (properties->buffer_size.get()) {
    // buffer size is validated when issuing the actual Recv operation
    // on the socket.
    socket->set_buffer_size(*properties->buffer_size.get());
  }
}

extensions::api::BluetoothSocketEventDispatcher* GetSocketEventDispatcher(
    content::BrowserContext* browser_context) {
  extensions::api::BluetoothSocketEventDispatcher* socket_event_dispatcher =
      extensions::api::BluetoothSocketEventDispatcher::Get(browser_context);
  DCHECK(socket_event_dispatcher)
      << "There is no socket event dispatcher. "
         "If this assertion is failing during a test, then it is likely that "
         "TestExtensionSystem is failing to provide an instance of "
         "BluetoothSocketEventDispatcher.";
  return socket_event_dispatcher;
}

}  // namespace

namespace extensions {
namespace api {

BluetoothSocketAsyncApiFunction::BluetoothSocketAsyncApiFunction() {}

BluetoothSocketAsyncApiFunction::~BluetoothSocketAsyncApiFunction() {}

bool BluetoothSocketAsyncApiFunction::RunAsync() {
  if (!PrePrepare() || !Prepare()) {
    return false;
  }
  AsyncWorkStart();
  return true;
}

bool BluetoothSocketAsyncApiFunction::PrePrepare() {
  if (!BluetoothManifestData::CheckSocketPermitted(GetExtension())) {
    error_ = kPermissionDeniedError;
    return false;
  }

  manager_ = ApiResourceManager<BluetoothApiSocket>::Get(browser_context());
  DCHECK(manager_)
      << "There is no socket manager. "
         "If this assertion is failing during a test, then it is likely that "
         "TestExtensionSystem is failing to provide an instance of "
         "ApiResourceManager<BluetoothApiSocket>.";
  return manager_ != NULL;
}

bool BluetoothSocketAsyncApiFunction::Respond() { return error_.empty(); }

void BluetoothSocketAsyncApiFunction::AsyncWorkCompleted() {
  SendResponse(Respond());
}

void BluetoothSocketAsyncApiFunction::Work() {}

void BluetoothSocketAsyncApiFunction::AsyncWorkStart() {
  Work();
  AsyncWorkCompleted();
}

int BluetoothSocketAsyncApiFunction::AddSocket(BluetoothApiSocket* socket) {
  return manager_->Add(socket);
}

content::BrowserThread::ID BluetoothSocketAsyncApiFunction::work_thread_id()
    const {
  return BluetoothApiSocket::kThreadId;
}

BluetoothApiSocket* BluetoothSocketAsyncApiFunction::GetSocket(
    int api_resource_id) {
  return manager_->Get(extension_id(), api_resource_id);
}

void BluetoothSocketAsyncApiFunction::RemoveSocket(int api_resource_id) {
  manager_->Remove(extension_id(), api_resource_id);
}

base::hash_set<int>* BluetoothSocketAsyncApiFunction::GetSocketIds() {
  return manager_->GetResourceIds(extension_id());
}

BluetoothSocketCreateFunction::BluetoothSocketCreateFunction() {}

BluetoothSocketCreateFunction::~BluetoothSocketCreateFunction() {}

bool BluetoothSocketCreateFunction::Prepare() {
  DCHECK(BrowserThread::CurrentlyOn(BrowserThread::UI));
  params_ = bluetooth_socket::Create::Params::Create(*args_);
  EXTENSION_FUNCTION_VALIDATE(params_.get());
  return true;
}

void BluetoothSocketCreateFunction::Work() {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));

  BluetoothApiSocket* socket = new BluetoothApiSocket(extension_id());

  bluetooth_socket::SocketProperties* properties =
      params_.get()->properties.get();
  if (properties) {
    SetSocketProperties(socket, properties);
  }

  bluetooth_socket::CreateInfo create_info;
  create_info.socket_id = AddSocket(socket);
  results_ = bluetooth_socket::Create::Results::Create(create_info);
  AsyncWorkCompleted();
}

BluetoothSocketUpdateFunction::BluetoothSocketUpdateFunction() {}

BluetoothSocketUpdateFunction::~BluetoothSocketUpdateFunction() {}

bool BluetoothSocketUpdateFunction::Prepare() {
  params_ = bluetooth_socket::Update::Params::Create(*args_);
  EXTENSION_FUNCTION_VALIDATE(params_.get());
  return true;
}

void BluetoothSocketUpdateFunction::Work() {
  BluetoothApiSocket* socket = GetSocket(params_->socket_id);
  if (!socket) {
    error_ = kSocketNotFoundError;
    return;
  }

  SetSocketProperties(socket, &params_.get()->properties);
  results_ = bluetooth_socket::Update::Results::Create();
}

BluetoothSocketSetPausedFunction::BluetoothSocketSetPausedFunction()
    : socket_event_dispatcher_(NULL) {}

BluetoothSocketSetPausedFunction::~BluetoothSocketSetPausedFunction() {}

bool BluetoothSocketSetPausedFunction::Prepare() {
  params_ = bluetooth_socket::SetPaused::Params::Create(*args_);
  EXTENSION_FUNCTION_VALIDATE(params_.get());

  socket_event_dispatcher_ = GetSocketEventDispatcher(browser_context());
  return socket_event_dispatcher_ != NULL;
}

void BluetoothSocketSetPausedFunction::Work() {
  BluetoothApiSocket* socket = GetSocket(params_->socket_id);
  if (!socket) {
    error_ = kSocketNotFoundError;
    return;
  }

  if (socket->paused() != params_->paused) {
    socket->set_paused(params_->paused);
    if (!params_->paused) {
      socket_event_dispatcher_->OnSocketResume(extension_id(),
                                               params_->socket_id);
    }
  }

  results_ = bluetooth_socket::SetPaused::Results::Create();
}

BluetoothSocketListenFunction::BluetoothSocketListenFunction() {}

BluetoothSocketListenFunction::~BluetoothSocketListenFunction() {}

bool BluetoothSocketListenFunction::Prepare() {
  DCHECK(BrowserThread::CurrentlyOn(BrowserThread::UI));
  if (!CreateParams())
    return false;
  socket_event_dispatcher_ = GetSocketEventDispatcher(browser_context());
  return socket_event_dispatcher_ != NULL;
}

void BluetoothSocketListenFunction::AsyncWorkStart() {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));
  device::BluetoothAdapterFactory::GetAdapter(
      base::Bind(&BluetoothSocketListenFunction::OnGetAdapter, this));
}

void BluetoothSocketListenFunction::OnGetAdapter(
    scoped_refptr<device::BluetoothAdapter> adapter) {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));
  BluetoothApiSocket* socket = GetSocket(socket_id());
  if (!socket) {
    error_ = kSocketNotFoundError;
    AsyncWorkCompleted();
    return;
  }

  device::BluetoothUUID bluetooth_uuid(uuid());
  if (!bluetooth_uuid.IsValid()) {
    error_ = kInvalidUuidError;
    AsyncWorkCompleted();
    return;
  }

  BluetoothPermissionRequest param(uuid());
  if (!BluetoothManifestData::CheckRequest(GetExtension(), param)) {
    error_ = kPermissionDeniedError;
    AsyncWorkCompleted();
    return;
  }

  scoped_ptr<std::string> name;
  if (socket->name())
    name.reset(new std::string(*socket->name()));

  CreateService(
      adapter,
      bluetooth_uuid,
      name.Pass(),
      base::Bind(&BluetoothSocketListenFunction::OnCreateService, this),
      base::Bind(&BluetoothSocketListenFunction::OnCreateServiceError, this));
}

void BluetoothSocketListenFunction::OnCreateService(
    scoped_refptr<device::BluetoothSocket> socket) {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));

  // Fetch the socket again since this is not a reference-counted object, and
  // it may have gone away in the meantime (we check earlier to avoid making
  // a connection in the case of an obvious programming error).
  BluetoothApiSocket* api_socket = GetSocket(socket_id());
  if (!api_socket) {
    error_ = kSocketNotFoundError;
    AsyncWorkCompleted();
    return;
  }

  api_socket->AdoptListeningSocket(socket, device::BluetoothUUID(uuid()));
  socket_event_dispatcher_->OnSocketListen(extension_id(), socket_id());

  CreateResults();
  AsyncWorkCompleted();
}

void BluetoothSocketListenFunction::OnCreateServiceError(
    const std::string& message) {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));
  error_ = message;
  AsyncWorkCompleted();
}

BluetoothSocketListenUsingRfcommFunction::
    BluetoothSocketListenUsingRfcommFunction() {}

BluetoothSocketListenUsingRfcommFunction::
    ~BluetoothSocketListenUsingRfcommFunction() {}

int BluetoothSocketListenUsingRfcommFunction::socket_id() const {
  return params_->socket_id;
}

const std::string& BluetoothSocketListenUsingRfcommFunction::uuid() const {
  return params_->uuid;
}

bool BluetoothSocketListenUsingRfcommFunction::CreateParams() {
  params_ = bluetooth_socket::ListenUsingRfcomm::Params::Create(*args_);
  EXTENSION_FUNCTION_VALIDATE(params_.get());
  return true;
}

void BluetoothSocketListenUsingRfcommFunction::CreateService(
    scoped_refptr<device::BluetoothAdapter> adapter,
    const device::BluetoothUUID& uuid,
    scoped_ptr<std::string> name,
    const device::BluetoothAdapter::CreateServiceCallback& callback,
    const device::BluetoothAdapter::CreateServiceErrorCallback&
        error_callback) {
  device::BluetoothAdapter::ServiceOptions service_options;
  service_options.name = name.Pass();

  ListenOptions* options = params_->options.get();
  if (options) {
    if (options->channel.get())
      service_options.channel.reset(new int(*(options->channel)));
  }

  adapter->CreateRfcommService(uuid, service_options, callback,
                               error_callback);
}

void BluetoothSocketListenUsingRfcommFunction::CreateResults() {
  results_ = bluetooth_socket::ListenUsingRfcomm::Results::Create();
}

BluetoothSocketListenUsingL2capFunction::
    BluetoothSocketListenUsingL2capFunction() {}

BluetoothSocketListenUsingL2capFunction::
    ~BluetoothSocketListenUsingL2capFunction() {}

int BluetoothSocketListenUsingL2capFunction::socket_id() const {
  return params_->socket_id;
}

const std::string& BluetoothSocketListenUsingL2capFunction::uuid() const {
  return params_->uuid;
}

bool BluetoothSocketListenUsingL2capFunction::CreateParams() {
  params_ = bluetooth_socket::ListenUsingL2cap::Params::Create(*args_);
  EXTENSION_FUNCTION_VALIDATE(params_.get());
  return true;
}

void BluetoothSocketListenUsingL2capFunction::CreateService(
    scoped_refptr<device::BluetoothAdapter> adapter,
    const device::BluetoothUUID& uuid,
    scoped_ptr<std::string> name,
    const device::BluetoothAdapter::CreateServiceCallback& callback,
    const device::BluetoothAdapter::CreateServiceErrorCallback&
        error_callback) {
  device::BluetoothAdapter::ServiceOptions service_options;
  service_options.name = name.Pass();

  ListenOptions* options = params_->options.get();
  if (options) {
    if (options->psm.get())
      service_options.psm.reset(new int(*(options->psm)));
  }

  adapter->CreateL2capService(uuid, service_options, callback, error_callback);
}

void BluetoothSocketListenUsingL2capFunction::CreateResults() {
  results_ = bluetooth_socket::ListenUsingL2cap::Results::Create();
}

BluetoothSocketConnectFunction::BluetoothSocketConnectFunction() {}

BluetoothSocketConnectFunction::~BluetoothSocketConnectFunction() {}

bool BluetoothSocketConnectFunction::Prepare() {
  DCHECK(BrowserThread::CurrentlyOn(BrowserThread::UI));
  params_ = bluetooth_socket::Connect::Params::Create(*args_);
  EXTENSION_FUNCTION_VALIDATE(params_.get());

  socket_event_dispatcher_ = GetSocketEventDispatcher(browser_context());
  return socket_event_dispatcher_ != NULL;
}

void BluetoothSocketConnectFunction::AsyncWorkStart() {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));
  device::BluetoothAdapterFactory::GetAdapter(
      base::Bind(&BluetoothSocketConnectFunction::OnGetAdapter, this));
}

void BluetoothSocketConnectFunction::OnGetAdapter(
    scoped_refptr<device::BluetoothAdapter> adapter) {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));
  BluetoothApiSocket* socket = GetSocket(params_->socket_id);
  if (!socket) {
    error_ = kSocketNotFoundError;
    AsyncWorkCompleted();
    return;
  }

  device::BluetoothDevice* device = adapter->GetDevice(params_->address);
  if (!device) {
    error_ = kDeviceNotFoundError;
    AsyncWorkCompleted();
    return;
  }

  device::BluetoothUUID uuid(params_->uuid);
  if (!uuid.IsValid()) {
    error_ = kInvalidUuidError;
    AsyncWorkCompleted();
    return;
  }

  BluetoothPermissionRequest param(params_->uuid);
  if (!BluetoothManifestData::CheckRequest(GetExtension(), param)) {
    error_ = kPermissionDeniedError;
    AsyncWorkCompleted();
    return;
  }

  device->ConnectToService(
      uuid,
      base::Bind(&BluetoothSocketConnectFunction::OnConnect, this),
      base::Bind(&BluetoothSocketConnectFunction::OnConnectError, this));
}

void BluetoothSocketConnectFunction::OnConnect(
    scoped_refptr<device::BluetoothSocket> socket) {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));

  // Fetch the socket again since this is not a reference-counted object, and
  // it may have gone away in the meantime (we check earlier to avoid making
  // a connection in the case of an obvious programming error).
  BluetoothApiSocket* api_socket = GetSocket(params_->socket_id);
  if (!api_socket) {
    error_ = kSocketNotFoundError;
    AsyncWorkCompleted();
    return;
  }

  api_socket->AdoptConnectedSocket(socket,
                                   params_->address,
                                   device::BluetoothUUID(params_->uuid));
  socket_event_dispatcher_->OnSocketConnect(extension_id(),
                                            params_->socket_id);

  results_ = bluetooth_socket::Connect::Results::Create();
  AsyncWorkCompleted();
}

void BluetoothSocketConnectFunction::OnConnectError(
    const std::string& message) {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));
  error_ = message;
  AsyncWorkCompleted();
}

BluetoothSocketDisconnectFunction::BluetoothSocketDisconnectFunction() {}

BluetoothSocketDisconnectFunction::~BluetoothSocketDisconnectFunction() {}

bool BluetoothSocketDisconnectFunction::Prepare() {
  DCHECK(BrowserThread::CurrentlyOn(BrowserThread::UI));
  params_ = bluetooth_socket::Disconnect::Params::Create(*args_);
  EXTENSION_FUNCTION_VALIDATE(params_.get());
  return true;
}

void BluetoothSocketDisconnectFunction::AsyncWorkStart() {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));
  BluetoothApiSocket* socket = GetSocket(params_->socket_id);
  if (!socket) {
    error_ = kSocketNotFoundError;
    AsyncWorkCompleted();
    return;
  }

  socket->Disconnect(
      base::Bind(&BluetoothSocketDisconnectFunction::OnSuccess, this));
}

void BluetoothSocketDisconnectFunction::OnSuccess() {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));
  results_ = bluetooth_socket::Disconnect::Results::Create();
  AsyncWorkCompleted();
}

BluetoothSocketCloseFunction::BluetoothSocketCloseFunction() {}

BluetoothSocketCloseFunction::~BluetoothSocketCloseFunction() {}

bool BluetoothSocketCloseFunction::Prepare() {
  params_ = bluetooth_socket::Close::Params::Create(*args_);
  EXTENSION_FUNCTION_VALIDATE(params_.get());
  return true;
}

void BluetoothSocketCloseFunction::Work() {
  BluetoothApiSocket* socket = GetSocket(params_->socket_id);
  if (!socket) {
    error_ = kSocketNotFoundError;
    return;
  }

  RemoveSocket(params_->socket_id);
  results_ = bluetooth_socket::Close::Results::Create();
}

BluetoothSocketSendFunction::BluetoothSocketSendFunction()
    : io_buffer_size_(0) {}

BluetoothSocketSendFunction::~BluetoothSocketSendFunction() {}

bool BluetoothSocketSendFunction::Prepare() {
  DCHECK(BrowserThread::CurrentlyOn(BrowserThread::UI));
  params_ = bluetooth_socket::Send::Params::Create(*args_);
  EXTENSION_FUNCTION_VALIDATE(params_.get());

  io_buffer_size_ = params_->data.size();
  io_buffer_ = new net::WrappedIOBuffer(params_->data.data());
  return true;
}

void BluetoothSocketSendFunction::AsyncWorkStart() {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));
  BluetoothApiSocket* socket = GetSocket(params_->socket_id);
  if (!socket) {
    error_ = kSocketNotFoundError;
    return;
  }

  socket->Send(io_buffer_,
               io_buffer_size_,
               base::Bind(&BluetoothSocketSendFunction::OnSuccess, this),
               base::Bind(&BluetoothSocketSendFunction::OnError, this));
}

void BluetoothSocketSendFunction::OnSuccess(int bytes_sent) {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));
  results_ = bluetooth_socket::Send::Results::Create(bytes_sent);
  AsyncWorkCompleted();
}

void BluetoothSocketSendFunction::OnError(
    BluetoothApiSocket::ErrorReason reason,
    const std::string& message) {
  DCHECK(BrowserThread::CurrentlyOn(work_thread_id()));
  error_ = message;
  AsyncWorkCompleted();
}

BluetoothSocketGetInfoFunction::BluetoothSocketGetInfoFunction() {}

BluetoothSocketGetInfoFunction::~BluetoothSocketGetInfoFunction() {}

bool BluetoothSocketGetInfoFunction::Prepare() {
  params_ = bluetooth_socket::GetInfo::Params::Create(*args_);
  EXTENSION_FUNCTION_VALIDATE(params_.get());
  return true;
}

void BluetoothSocketGetInfoFunction::Work() {
  BluetoothApiSocket* socket = GetSocket(params_->socket_id);
  if (!socket) {
    error_ = kSocketNotFoundError;
    return;
  }

  linked_ptr<bluetooth_socket::SocketInfo> socket_info =
      CreateSocketInfo(params_->socket_id, socket);
  results_ = bluetooth_socket::GetInfo::Results::Create(*socket_info);
}

BluetoothSocketGetSocketsFunction::BluetoothSocketGetSocketsFunction() {}

BluetoothSocketGetSocketsFunction::~BluetoothSocketGetSocketsFunction() {}

bool BluetoothSocketGetSocketsFunction::Prepare() { return true; }

void BluetoothSocketGetSocketsFunction::Work() {
  std::vector<linked_ptr<bluetooth_socket::SocketInfo> > socket_infos;
  base::hash_set<int>* resource_ids = GetSocketIds();
  if (resource_ids != NULL) {
    for (base::hash_set<int>::iterator it = resource_ids->begin();
         it != resource_ids->end();
         ++it) {
      int socket_id = *it;
      BluetoothApiSocket* socket = GetSocket(socket_id);
      if (socket) {
        socket_infos.push_back(CreateSocketInfo(socket_id, socket));
      }
    }
  }
  results_ = bluetooth_socket::GetSockets::Results::Create(socket_infos);
}

}  // namespace api
}  // namespace extensions
{ "content_hash": "90a141f7da68f11335fe7eba0b2ed8be", "timestamp": "", "source": "github", "line_count": 610, "max_line_length": 80, "avg_line_length": 32.014754098360655, "alnum_prop": 0.7286087357263556, "repo_name": "chromium2014/src", "id": "263cd99c61b0edaeddc908ba42953dc91bcd7ea4", "size": "20422", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "chrome/browser/extensions/api/bluetooth_socket/bluetooth_socket_api.cc", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "ASP", "bytes": "853" }, { "name": "AppleScript", "bytes": "6973" }, { "name": "Arduino", "bytes": "464" }, { "name": "Assembly", "bytes": "1889381" }, { "name": "Awk", "bytes": "8660" }, { "name": "C", "bytes": "39993418" }, { "name": "C#", "bytes": "1132" }, { "name": "C++", "bytes": "220757674" }, { "name": "CSS", "bytes": "973910" }, { "name": "Java", "bytes": "6583410" }, { "name": "JavaScript", "bytes": "20967999" }, { "name": "Mercury", "bytes": "9480" }, { "name": "Objective-C", "bytes": "943237" }, { "name": "Objective-C++", "bytes": "7190130" }, { "name": "PHP", "bytes": "97817" }, { "name": "Perl", "bytes": "674461" }, { "name": "Python", "bytes": "10430892" }, { "name": "Rebol", "bytes": "262" }, { "name": "Shell", "bytes": "1337040" }, { "name": "Standard ML", "bytes": "3705" }, { "name": "Tcl", "bytes": "277091" }, { "name": "XSLT", "bytes": "13493" }, { "name": "nesC", "bytes": "15206" } ], "symlink_target": "" }
var http = require('http');
var url = require('url');

function start(route) {
  function onRequest(request, response) {
    var pathname = url.parse(request.url).pathname;
    console.log('Request for ' + pathname + ' received.');

    route(pathname);

    response.writeHead(200, {'Content-Type' : 'text/plain'});
    response.write('Hello World');
    response.end();
  }

  http.createServer(onRequest).listen(5211);
  console.log('Server has started.');
}

exports.start = start;
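A minimal sketch of how this module might be consumed (an editor's illustration; the entry-point file name and the router are assumptions, not part of the original project):

// index.js -- hypothetical entry point sitting next to server5.js.
var server = require('./server5');

// start() calls route(pathname) on every request; a real router would
// dispatch to handlers, while this one just logs the path.
function route(pathname) {
  console.log('About to route a request for ' + pathname);
}

server.start(route);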
{ "content_hash": "4bbf4c72390255f0958908dd740db11a", "timestamp": "", "source": "github", "line_count": 17, "max_line_length": 65, "avg_line_length": 30.235294117647058, "alnum_prop": 0.6264591439688716, "repo_name": "gaodacheng/bitcode", "id": "18589fd9b88ac618cdf2df92d3eec31adc7f7b24", "size": "514", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "nodejs/firstproj/server5.js", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Assembly", "bytes": "230074" }, { "name": "C", "bytes": "35270" }, { "name": "CSS", "bytes": "149365" }, { "name": "HTML", "bytes": "4805" }, { "name": "JavaScript", "bytes": "290481" }, { "name": "PHP", "bytes": "13529" }, { "name": "TeX", "bytes": "19465" } ], "symlink_target": "" }
package com.xym.common.factory.presenter;

import android.support.v7.util.DiffUtil;

import com.xym.common.recycler.RecyclerAdapter;

import net.qiujuer.genius.kit.handler.Run;
import net.qiujuer.genius.kit.handler.runable.Action;

import java.util.List;

/**
 * A simple Presenter wrapper around a RecyclerView
 */
public class BaseRecyclerPresenter<ViewMode, View extends BaseContract.RecyclerView>
        extends BasePresenter<View> {

    public BaseRecyclerPresenter(View view) {
        super(view);
    }

    /**
     * Refresh the view with a batch of new data
     *
     * @param dataList the new data
     */
    protected void refreshData(final List<ViewMode> dataList) {
        Run.onUiAsync(new Action() {
            @Override
            public void call() {
                View view = getView();
                if (view == null)
                    return;

                // Basic data replacement followed by a view refresh
                RecyclerAdapter<ViewMode> adapter = view.getRecyclerAdapter();
                adapter.replace(dataList);
                view.onAdapterDataChanged();
            }
        });
    }

    /**
     * Refresh the view; this operation guarantees the work runs on the main thread
     *
     * @param diffResult a diff result set
     * @param dataList   the actual new data
     */
    protected void refreshData(final DiffUtil.DiffResult diffResult,
                               final List<ViewMode> dataList) {
        Run.onUiAsync(new Action() {
            @Override
            public void call() {
                // Now running on the main thread
                refreshDataOnUiThread(diffResult, dataList);
            }
        });
    }

    private void refreshDataOnUiThread(final DiffUtil.DiffResult diffResult,
                                       final List<ViewMode> dataList) {
        View view = getView();
        if (view == null)
            return;

        RecyclerAdapter<ViewMode> adapter = view.getRecyclerAdapter();

        // Swap the backing data set without notifying the adapter yet
        adapter.getItems().clear();
        adapter.getItems().addAll(dataList);

        // Notify the view so it can refresh any placeholder layout
        view.onAdapterDataChanged();

        // Dispatch the incremental (diff-based) update to the adapter
        diffResult.dispatchUpdatesTo(adapter);
    }
}
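An editor's sketch of one conventional way to feed the DiffResult overload above: compute the diff off the main thread with a DiffUtil.Callback, then hand the result to refreshData(), which hops to the UI thread. The Person model and field names are stand-ins, not part of this project.

import android.support.v7.util.DiffUtil;

import java.util.List;

// Hypothetical diff callback for an assumed "Person" model.
public class PersonDiffCallback extends DiffUtil.Callback {

    public static class Person {
        public final String id;
        public final String name;

        public Person(String id, String name) {
            this.id = id;
            this.name = name;
        }
    }

    private final List<Person> oldList;
    private final List<Person> newList;

    public PersonDiffCallback(List<Person> oldList, List<Person> newList) {
        this.oldList = oldList;
        this.newList = newList;
    }

    @Override
    public int getOldListSize() { return oldList.size(); }

    @Override
    public int getNewListSize() { return newList.size(); }

    @Override
    public boolean areItemsTheSame(int oldPos, int newPos) {
        // Same logical row if the ids match.
        return oldList.get(oldPos).id.equals(newList.get(newPos).id);
    }

    @Override
    public boolean areContentsTheSame(int oldPos, int newPos) {
        // Re-bind only when the displayed fields differ.
        return oldList.get(oldPos).name.equals(newList.get(newPos).name);
    }
}

// Typical call site inside a presenter subclass, off the main thread:
//   DiffUtil.DiffResult result =
//           DiffUtil.calculateDiff(new PersonDiffCallback(oldItems, newItems));
//   refreshData(result, newItems);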
{ "content_hash": "2941a44336d9aef697612459abbdb468", "timestamp": "", "source": "github", "line_count": 76, "max_line_length": 109, "avg_line_length": 26.592105263157894, "alnum_prop": 0.595249876298862, "repo_name": "FZZFVII/pipe", "id": "baa47d6f8d7cc3f42d028be30d102aaf1a89d461", "size": "2273", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "android/Pipe/common/src/main/java/com/xym/common/factory/presenter/BaseRecyclerPresenter.java", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Java", "bytes": "429872" } ], "symlink_target": "" }
'use strict';

angular.module('circleCiApp', [
  'circleCiApp.auth',
  'circleCiApp.admin',
  'circleCiApp.constants',
  'ngCookies',
  'ngResource',
  'ngSanitize',
  'btford.socket-io',
  'ui.router',
  'ui.bootstrap',
  'validation.match'
])
  .config(function($urlRouterProvider, $locationProvider) {
    $urlRouterProvider
      .otherwise('/');

    $locationProvider.html5Mode(true);
  });
{ "content_hash": "3d8bbf2035cc17c36fb425dcefeb3f4b", "timestamp": "", "source": "github", "line_count": 20, "max_line_length": 59, "avg_line_length": 19.85, "alnum_prop": 0.6624685138539043, "repo_name": "rafagomes/poc-circle-ci", "id": "c4f10132d3867f1a1c2a9f611e13f7bb14201bae", "size": "397", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "client/app/app.js", "mode": "33188", "license": "mit", "language": [ { "name": "ApacheConf", "bytes": "24139" }, { "name": "CSS", "bytes": "2071" }, { "name": "HTML", "bytes": "14059" }, { "name": "JavaScript", "bytes": "104677" } ], "symlink_target": "" }
package acceptance_tests

/*
 * File Generated by enaml generator
 * !!! Please do not edit this file !!!
 */

type AcceptanceTestsJob struct {

	/*AcceptanceTests - Descr: Default Timeout Default: <nil>
	 */
	AcceptanceTests *AcceptanceTests `yaml:"acceptance_tests,omitempty"`
}
{ "content_hash": "e37df64efdbfc85866efd35e0cc082bf", "timestamp": "", "source": "github", "line_count": 12, "max_line_length": 69, "avg_line_length": 22.916666666666668, "alnum_prop": 0.7418181818181818, "repo_name": "enaml-ops/ert-plugin", "id": "43b995b27c47d698281de2a3458f0b3b3bb0f9ad", "size": "275", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "enaml-gen/acceptance-tests/acceptancetestsjob.go", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Go", "bytes": "862257" }, { "name": "Shell", "bytes": "521" } ], "symlink_target": "" }
package io.joshworks.snappy.extensions.dashboard;

import java.util.Collection;
import java.util.concurrent.ConcurrentLinkedQueue;

/**
 * Created by Josh Gontijo on 5/14/17.
 */
public class RingBuffer<T> extends ConcurrentLinkedQueue<T> {

    private int pos = 0;
    private final int size;

    private RingBuffer(int size) {
        if (size <= 0) {
            throw new IllegalArgumentException("Size must be greater than zero");
        }
        this.size = size;
    }

    public static <T> RingBuffer<T> ofSize(int size) {
        return new RingBuffer<>(size);
    }

    private void checkPosInsert() {
        if (pos++ >= size) {
            poll();
            pos = size;
        }
    }

    private void checkPosRemove() {
        if (pos > 0) {
            pos--;
        }
    }

    @Override
    public boolean offer(T t) {
        checkPosInsert();
        return super.offer(t);
    }

    @Override
    public T poll() {
        T poll = super.poll();
        checkPosRemove();
        return poll;
    }

    @Override
    public T peek() {
        T peek = super.peek();
        checkPosRemove();
        return peek;
    }

    @Override
    public void clear() {
        pos = 0;
        super.clear();
    }

    @Override
    public boolean remove(Object o) {
        boolean remove = super.remove(o);
        checkPosRemove();
        return remove;
    }

    @Override
    public T remove() {
        T remove = super.remove();
        checkPosRemove();
        return remove;
    }

    @Override
    public boolean containsAll(Collection<?> c) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public boolean removeAll(Collection<?> c) {
        throw new UnsupportedOperationException("Not supported");
    }

    @Override
    public boolean retainAll(Collection<?> c) {
        throw new UnsupportedOperationException("Not supported");
    }
}
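A short usage sketch for the class above (an editor's illustration, not part of the project): once capacity is reached, each additional offer() evicts the oldest element.

import io.joshworks.snappy.extensions.dashboard.RingBuffer;

public class RingBufferDemo {
    public static void main(String[] args) {
        RingBuffer<Integer> buffer = RingBuffer.ofSize(3);
        for (int i = 1; i <= 5; i++) {
            buffer.offer(i);
        }
        // Only the 3 most recent values remain: prints [3, 4, 5].
        System.out.println(buffer);
    }
}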
{ "content_hash": "9dbee1b72065c403b7bc0d3ad0ec3256", "timestamp": "", "source": "github", "line_count": 92, "max_line_length": 94, "avg_line_length": 21.097826086956523, "alnum_prop": 0.5692941782586296, "repo_name": "josueeduardo/snappy", "id": "2442df12a5d37f11a475f300f762d02a20a7d0e0", "size": "1941", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "extensions/admin/src/main/java/io/joshworks/snappy/extensions/dashboard/RingBuffer.java", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "CSS", "bytes": "1133803" }, { "name": "HTML", "bytes": "114839" }, { "name": "Java", "bytes": "794417" }, { "name": "JavaScript", "bytes": "2393747" }, { "name": "Shell", "bytes": "611" } ], "symlink_target": "" }
<?xml version="1.0" encoding="utf-8"?> <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android" android:layout_width="match_parent" android:layout_height="match_parent" android:background="#ebebeb" android:orientation="vertical"> <RelativeLayout android:id="@+id/title" android:layout_width="match_parent" android:layout_height="@dimen/height_top_bar" android:background="@color/common_top_bar_blue"> <ImageView android:id="@+id/iv_back" android:layout_width="40dp" android:layout_height="match_parent" android:layout_alignParentLeft="true" android:layout_centerVertical="true" android:onClick="back" android:paddingBottom="5dp" android:paddingLeft="5dp" android:paddingRight="5dp" android:paddingTop="5dp" android:scaleType="centerInside" android:src="@drawable/fx_top_bar_back" /> <View android:id="@+id/view_temp" android:layout_width="1dp" android:layout_height="match_parent" android:layout_marginBottom="8dp" android:layout_marginTop="8dp" android:layout_toRightOf="@id/iv_back" android:background="#14191A" /> <TextView android:id="@+id/tv_title" android:layout_width="wrap_content" android:layout_height="wrap_content" android:layout_centerVertical="true" android:layout_marginLeft="10dp" android:layout_toRightOf="@id/view_temp" android:text="聊天信息" android:textColor="#ffffff" android:textSize="18sp" /> <TextView android:id="@+id/tv_m_total" android:layout_width="wrap_content" android:layout_height="wrap_content" android:layout_centerVertical="true" android:layout_toRightOf="@+id/tv_title" android:text="(0)" android:textColor="#ffffff" android:textSize="18sp"></TextView> </RelativeLayout> <ScrollView android:layout_width="match_parent" android:layout_height="match_parent"> <LinearLayout android:layout_width="match_parent" android:layout_height="wrap_content" android:background="#EBEBEB" android:orientation="vertical"> <com.fanxin.huangfangyi.main.widget.ExpandGridView android:id="@+id/gridview" android:layout_width="match_parent" android:layout_height="wrap_content" android:layout_marginTop="20dp" android:background="#ffffff" android:columnWidth="56dp" android:gravity="start" android:listSelector="@android:color/transparent" android:numColumns="4" /> <RelativeLayout android:id="@+id/re_change_groupname" android:layout_width="match_parent" android:layout_height="50dip" android:layout_marginTop="20dp" android:background="#ffffff" android:padding="10dp"> <TextView android:layout_width="wrap_content" android:layout_height="wrap_content" android:layout_centerVertical="true" android:text="群名称" android:textColor="#353535" android:textSize="16sp" android:typeface="serif" /> <TextView android:id="@+id/tv_groupname" android:layout_width="wrap_content" android:layout_height="wrap_content" android:layout_alignParentRight="true" android:layout_centerVertical="true" android:layout_marginRight="12dp" android:ellipsize="end" android:maxWidth="200dp" android:singleLine="true" android:textColor="#AAAAAA" android:textSize="12sp" /> </RelativeLayout> <View android:layout_width="match_parent" android:layout_height="0.1dp" android:background="#dadada" /> <RelativeLayout android:visibility="gone" android:id="@+id/rl_switch_chattotop" android:layout_width="match_parent" android:layout_height="50dip" android:background="#ffffff" android:padding="10dip"> <TextView android:layout_width="wrap_content" android:layout_height="wrap_content" android:layout_centerVertical="true" android:text="置顶聊天" android:textColor="#353535" android:textSize="16sp" android:typeface="serif" /> <ImageView 
android:id="@+id/iv_switch_chattotop" android:layout_width="70dp" android:layout_height="22dp" android:layout_alignParentRight="true" android:layout_centerVertical="true" android:background="@drawable/fx_icon_open" android:visibility="invisible" /> <ImageView android:id="@+id/iv_switch_unchattotop" android:layout_width="70dp" android:layout_height="22dp" android:layout_alignParentRight="true" android:layout_centerVertical="true" android:background="@drawable/fx_icon_close" android:visibility="visible" /> </RelativeLayout> <View android:layout_width="match_parent" android:layout_height="0.1dp" android:background="#dadada" /> <RelativeLayout android:id="@+id/rl_switch_block_groupmsg" android:layout_width="match_parent" android:layout_height="50dip" android:background="#ffffff" android:padding="10dip"> <TextView android:layout_width="wrap_content" android:layout_height="wrap_content" android:layout_centerVertical="true" android:text="屏蔽群消息" android:textColor="#353535" android:textSize="16sp" android:typeface="serif" /> <ImageView android:id="@+id/iv_switch_block_groupmsg" android:layout_width="70dp" android:layout_height="22dp" android:layout_alignParentRight="true" android:layout_centerVertical="true" android:background="@drawable/fx_icon_open" android:visibility="invisible" /> <ImageView android:id="@+id/iv_switch_unblock_groupmsg" android:layout_width="70dp" android:layout_height="22dp" android:layout_alignParentRight="true" android:layout_centerVertical="true" android:background="@drawable/fx_icon_close" android:visibility="visible" /> </RelativeLayout> <RelativeLayout android:id="@+id/re_clear" android:layout_width="match_parent" android:layout_height="50dip" android:layout_marginTop="20dp" android:background="#ffffff" android:padding="10dp"> <TextView android:layout_width="wrap_content" android:layout_height="wrap_content" android:layout_centerVertical="true" android:text="清空聊天记录" android:textColor="#353535" android:textSize="16sp" android:typeface="serif" /> </RelativeLayout> <Button android:id="@+id/btn_exit_grp" android:layout_width="match_parent" android:layout_height="wrap_content" android:layout_margin="11dp" android:background="@drawable/fx_bg_btn_green" android:onClick="exitGroup" android:paddingBottom="7dp" android:paddingTop="7dp" android:text="删除并退出" android:textColor="@android:color/white" android:textSize="18sp" /> </LinearLayout> </ScrollView> </LinearLayout>
{ "content_hash": "88ce087121b7342d162d02d170c5ed82", "timestamp": "", "source": "github", "line_count": 228, "max_line_length": 72, "avg_line_length": 39.42982456140351, "alnum_prop": 0.5205784204671857, "repo_name": "cowthan/Ayo2022", "id": "4ea18fd9840a8f70285f5274a95c731d4e80cc2f", "size": "9044", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "ProjWechat/res/layout/fx_activity_chat_setting_group.xml", "mode": "33261", "license": "mit", "language": [ { "name": "CSS", "bytes": "5114" }, { "name": "HTML", "bytes": "2365" }, { "name": "Java", "bytes": "9575777" }, { "name": "JavaScript", "bytes": "27653" }, { "name": "Makefile", "bytes": "891" } ], "symlink_target": "" }
#region License

#endregion

#if REMOTING
using Quartz.Impl.Matchers;

namespace Quartz.Examples.Example12
{
    /// <summary>
    /// This example is a client program that will remotely
    /// talk to the scheduler to schedule a job. In this
    /// example, we will need to use the JDBC Job Store. The
    /// client will connect to the JDBC Job Store remotely to
    /// schedule the job.
    /// </summary>
    /// <author>James House</author>
    /// <author>Bill Kratzer</author>
    /// <author>Marko Lahma (.NET)</author>
    public class RemoteClientJobSchedulingExample : IExample
    {
        public virtual async Task Run()
        {
            // First we must get a reference to a scheduler
            IScheduler sched = await SchedulerBuilder.Create()
                .WithName("RemoteClient")
                .UseDefaultThreadPool(x => x.MaxConcurrency = 5)
                .ProxyToRemoteScheduler("tcp://127.0.0.1:555/QuartzScheduler")
                .BuildScheduler();

            // define the job and ask it to run
            IJobDetail job = JobBuilder.Create<SimpleJob>()
                .WithIdentity("remotelyAddedJob", "default")
                .Build();

            JobDataMap map = job.JobDataMap;
            map.Put("msg", "Your remotely added job has executed!");

            ITrigger trigger = TriggerBuilder.Create()
                .WithIdentity("remotelyAddedTrigger", "default")
                .ForJob(job.Key)
                .WithCronSchedule("/5 * * ? * *")
                .Build();

            // schedule the job
            await sched.ScheduleJob(job, trigger);

            Console.WriteLine("Remote job scheduled, scheduler now has following jobs:");

            var keys = await sched.GetJobKeys(GroupMatcher<JobKey>.AnyGroup());
            foreach (var key in keys)
            {
                Console.WriteLine("\t " + key);
            }
        }
    }
}
#endif
{ "content_hash": "42d3c60674d02831f78db588b922d88d", "timestamp": "", "source": "github", "line_count": 63, "max_line_length": 89, "avg_line_length": 30.714285714285715, "alnum_prop": 0.5710594315245479, "repo_name": "sean-gilliam/quartznet", "id": "d6e493e44ab3e03657e92697a1162305cf77ca9d", "size": "2587", "binary": false, "copies": "2", "ref": "refs/heads/main", "path": "src/Quartz.Examples/12_RemoteClientServerJobScheduling/RemoteClientExample.cs", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Batchfile", "bytes": "207" }, { "name": "C#", "bytes": "3544001" }, { "name": "CSS", "bytes": "9403" }, { "name": "Dockerfile", "bytes": "2663" }, { "name": "HTML", "bytes": "23124" }, { "name": "JavaScript", "bytes": "2077184" }, { "name": "Less", "bytes": "6623" }, { "name": "PowerShell", "bytes": "2953" }, { "name": "SCSS", "bytes": "6883" }, { "name": "Shell", "bytes": "2838" }, { "name": "TSQL", "bytes": "74851" }, { "name": "TypeScript", "bytes": "17242" } ], "symlink_target": "" }
class AddUrlToNewsItem < ActiveRecord::Migration
  def change
    add_column :news_items, :url, :string
  end
end
{ "content_hash": "17d5983b55504a24e8dc6220352bb9ac", "timestamp": "", "source": "github", "line_count": 5, "max_line_length": 48, "avg_line_length": 22.8, "alnum_prop": 0.7368421052631579, "repo_name": "madetech/made-news-engine", "id": "a3c36eb497a071ee0d1cc696f7e58a29c1532d60", "size": "114", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "db/migrate/20130412093135_add_url_to_news_item.rb", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "CSS", "bytes": "546" }, { "name": "HTML", "bytes": "2968" }, { "name": "JavaScript", "bytes": "641" }, { "name": "Ruby", "bytes": "48280" } ], "symlink_target": "" }
<fortis>
  <configurationProvider type="Fortis.Configuration.XmlConfigurationProvider, Fortis"></configurationProvider>
  <models>
    <model name="Fortis.Model" assembly="Fortis.Model, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
    <model name="Project.Model" assembly="Project.Model, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
  </models>
  <fieldTypes>
    <field name="checkbox" type="Fortis.Model.Fields.BooleanFieldWrapper, Fortis" />
    <field name="image" type="Fortis.Model.Fields.ImageFieldWrapper, Fortis" />
    <field name="date" type="Fortis.Model.Fields.DateTimeFieldWrapper, Fortis" />
    <field name="datetime" type="Fortis.Model.Fields.DateTimeFieldWrapper, Fortis" />
    <field name="checklist" type="Fortis.Model.Fields.ListFieldWrapper, Fortis" />
    <field name="treelist" type="Fortis.Model.Fields.ListFieldWrapper, Fortis" />
    <field name="treelistex" type="Fortis.Model.Fields.ListFieldWrapper, Fortis" />
    <field name="multilist" type="Fortis.Model.Fields.ListFieldWrapper, Fortis" />
    <field name="multilist with search" type="Fortis.Model.Fields.ListFieldWrapper, Fortis" />
    <field name="treelist with search" type="Fortis.Model.Fields.ListFieldWrapper, Fortis" />
    <field name="file" type="Fortis.Model.Fields.FileFieldWrapper, Fortis" />
    <field name="droplink" type="Fortis.Model.Fields.LinkFieldWrapper, Fortis" />
    <field name="droptree" type="Fortis.Model.Fields.LinkFieldWrapper, Fortis" />
    <field name="general link" type="Fortis.Model.Fields.GeneralLinkFieldWrapper, Fortis" />
    <field name="general link with search" type="Fortis.Model.Fields.GeneralLinkFieldWrapper, Fortis" />
    <field name="text" type="Fortis.Model.Fields.TextFieldWrapper, Fortis" />
    <field name="single-line text" type="Fortis.Model.Fields.TextFieldWrapper, Fortis" />
    <field name="multi-line text" type="Fortis.Model.Fields.TextFieldWrapper, Fortis" />
    <field name="droplist" type="Fortis.Model.Fields.TextFieldWrapper, Fortis" />
    <field name="rich text" type="Fortis.Model.Fields.RichTextFieldWrapper, Fortis" />
    <field name="number" type="Fortis.Model.Fields.NumberFieldWrapper, Fortis" />
    <field name="integer" type="Fortis.Model.Fields.IntegerFieldWrapper, Fortis" />
    <field name="name value list" type="Fortis.Model.Fields.NameValueListFieldWrapper, Fortis" />
  </fieldTypes>
</fortis>
{ "content_hash": "9d214e831c1e6fa38dbed0ca37a07764", "timestamp": "", "source": "github", "line_count": 35, "max_line_length": 112, "avg_line_length": 67.05714285714286, "alnum_prop": 0.7533020877716233, "repo_name": "Fortis-Collection/fortis", "id": "e76ce1ec6d9900e158d0ac3816c5c17908cf75d5", "size": "2349", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "src/Fortis.Tests/Configuration/TestXmlConfiguration.xml", "mode": "33188", "license": "mit", "language": [ { "name": "C#", "bytes": "197329" }, { "name": "CSS", "bytes": "1636152" } ], "symlink_target": "" }
<!--
Copyright 2020 Google LLC

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    https://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->

<mat-sidenav-container class="sidebar">
  <!-- This sidenav controls the sorting and content type -->
  <mat-sidenav mode="side" opened>
    <mat-list>
      <!-- search options -->
      <div mat-subheader>Search for:</div>
      <mat-selection-list [(ngModel)]="searchOption" (ngModelChange)="onSearchChanged($event)" #search
        [multiple]="false">
        <mat-list-option value='Recipes'>
          <mat-icon mat-list-icon>format_list_numbered</mat-icon>
          Recipes
        </mat-list-option>
        <mat-list-option value='Users'>
          <mat-icon mat-list-icon>person</mat-icon>
          Users
        </mat-list-option>
      </mat-selection-list>
      <mat-divider></mat-divider>
    </mat-list>

    <!-- sort options for recipes -->
    <mat-expansion-panel #recipePanel [disabled]='showResults' *ngIf='searchOption[0] === "Recipes"'>
      <mat-expansion-panel-header>
        <mat-panel-title>
          Sort
        </mat-panel-title>
      </mat-expansion-panel-header>
      <mat-selection-list [(ngModel)]="recipeOption" (ngModelChange)="onRecipeSortChanged($event)" #sortRecipes
        [multiple]="false">
        <mat-list-option value='Time Created'>
          <mat-icon mat-list-icon>access_time</mat-icon>
          Time Created
        </mat-list-option>
        <mat-list-option value='Rating'>
          <mat-icon mat-list-icon>star</mat-icon>
          Rating
        </mat-list-option>
        <mat-list-option value='Name'>
          <mat-icon mat-list-icon>text_format</mat-icon>
          Name
        </mat-list-option>
      </mat-selection-list>
      <mat-list-item>
        <mat-slide-toggle *ngIf='recipeOption[0] === "Time Created"' [(ngModel)]="isChecked">{{ isChecked ?
          'Oldest' : 'Newest' }}</mat-slide-toggle>
        <mat-slide-toggle *ngIf='recipeOption[0] === "Rating"' [(ngModel)]="isChecked">{{ isChecked ?
          'Lowest' : 'Highest' }}</mat-slide-toggle>
        <mat-slide-toggle *ngIf='recipeOption[0] === "Name"' [(ngModel)]="isChecked">{{ isChecked ?
          'Z-A' : 'A-Z' }}</mat-slide-toggle>
      </mat-list-item>
    </mat-expansion-panel>

    <mat-expansion-panel #tagPanel [disabled]='showResults' *ngIf='searchOption[0] === "Recipes"'>
      <mat-expansion-panel-header>
        <mat-panel-title>
          Search Tags
        </mat-panel-title>
      </mat-expansion-panel-header>
      <form [formGroup]="tagSearch" (submit)="getTags()">
        <mat-form-field appearance="outline">
          <input matInput (keyup)='checkEmpty()' formControlName="query" #tagQuery autocomplete="off"
            placeholder="Ex. Vegetarian">
        </mat-form-field>
      </form>
    </mat-expansion-panel>

    <!-- sort options for users -->
    <mat-expansion-panel [disabled]='showResults' #recipePanel *ngIf='searchOption[0] === "Users"'>
      <mat-expansion-panel-header>
        <mat-panel-title>
          Sort
        </mat-panel-title>
      </mat-expansion-panel-header>
      <mat-selection-list [(ngModel)]="userOption" (ngModelChange)="onUserSortChanged($event)" #sortUsers
        [multiple]="false">
        <mat-list-option value='Time Created'>
          <mat-icon mat-list-icon>access_time</mat-icon>
          Time Created
        </mat-list-option>
        <mat-list-option value='Name'>
          <mat-icon mat-list-icon>text_format</mat-icon>
          Name
        </mat-list-option>
      </mat-selection-list>
      <mat-list-item>
        <mat-slide-toggle *ngIf='userOption[0] === "Time Created"' [(ngModel)]="isChecked">{{ isChecked ?
          'Oldest' : 'Newest' }}</mat-slide-toggle>
        <mat-slide-toggle *ngIf='userOption[0] === "Name"' [(ngModel)]="isChecked">{{ isChecked ?
          'Z-A' : 'A-Z' }}</mat-slide-toggle>
      </mat-list-item>
    </mat-expansion-panel>
  </mat-sidenav>

  <!-- the content within the actual page -->
  <mat-sidenav-content>
    <!-- displays the user search bar -->
    <div *ngIf='this.searchOption[0] == "Users"' class="content">
      <ais-instantsearch [config]="configUsers">
        <app-search-box (noQuery)='displayResults($event)' [searchField]='query' contentType='Search for users'
          [delay]='500'></app-search-box>
        <ais-hits *ngIf='showResults'>
          <ng-template let-hits="hits">
            <div class="search-content-container">
              <app-profile-card *ngFor="let user of hits" (click)='goToUser(user.username)' [user]='user'>
              </app-profile-card>
            </div>
          </ng-template>
        </ais-hits>
      </ais-instantsearch>
    </div>

    <!-- displays the recipe search bar -->
    <div *ngIf='this.searchOption[0] == "Recipes"' class="content">
      <ais-instantsearch [config]="configRecipes">
        <app-search-box (noQuery)='displayResults($event)' [searchField]='query' contentType='Search for recipes'
          [delay]='500'></app-search-box>
        <ais-hits *ngIf='showResults'>
          <ng-template let-hits="hits">
            <div class="search-content-container">
              <app-recipe-card *ngFor="let recipe of hits" (click)='goToRecipe(recipe.objectID)' [recipe]='recipe'>
              </app-recipe-card>
            </div>
          </ng-template>
        </ais-hits>
      </ais-instantsearch>
    </div>

    <div *ngIf='!showResults' class="content">
      <app-discover-display [contentType]='this.searchOption[0]' [direction]='this.isChecked' [query]='tagQuery'
        [sortType]='this.searchOption[0] === "Recipes" ? this.recipeOption[0] : this.userOption[0]'>
      </app-discover-display>
    </div>
  </mat-sidenav-content>
</mat-sidenav-container>
{ "content_hash": "d0dc6be060a8771b2a15acc6784442ef", "timestamp": "", "source": "github", "line_count": 136, "max_line_length": 149, "avg_line_length": 44.88235294117647, "alnum_prop": 0.6135321100917431, "repo_name": "googleinterns/step79-2020", "id": "71b6b1f0e348498163a6c806b10776ff26e72775", "size": "6104", "binary": false, "copies": "1", "ref": "refs/heads/main", "path": "angularSTEP/src/app/discover-page/discover-page.component.html", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "CSS", "bytes": "24551" }, { "name": "HTML", "bytes": "59641" }, { "name": "JavaScript", "bytes": "20597" }, { "name": "TypeScript", "bytes": "140220" } ], "symlink_target": "" }
ACCEPTED

#### According to
International Plant Names Index

#### Published in
null

#### Original name
null

### Remarks
null
{ "content_hash": "c9abc80842cbf086ab3697460d01d3bc", "timestamp": "", "source": "github", "line_count": 13, "max_line_length": 31, "avg_line_length": 9.692307692307692, "alnum_prop": 0.7063492063492064, "repo_name": "mdoering/backbone", "id": "644b72495adbfaacf7a80294f7d10a8b54f04a2e", "size": "174", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "life/Plantae/Magnoliophyta/Magnoliopsida/Proteales/Proteaceae/Banksia/Banksia barbigera/README.md", "mode": "33188", "license": "apache-2.0", "language": [], "symlink_target": "" }
```js
// Because AdSense and DoubleClick are both operated by Google and their A4A
// implementations share some behavior in common, part of the logic for this
// implementation is located in the ads/google/a4a directory rather than here.
// Most other ad networks will want to put their A4A code entirely in the
// extensions/amp-ad-network-${NETWORK_NAME}-impl directory.

import {AmpA4A} from '../../amp-a4a/0.1/amp-a4a';
import {
  identityEnabled,
} from './adsense-a4a-config';
import {
  isInManualExperiment,
} from '../../../ads/google/a4a/traffic-experiments';
import {isExperimentOn} from '../../../src/experiments';
import {
  additionalDimensions,
  googleAdUrl,
  isReportingEnabled,
  extractAmpAnalyticsConfig,
  addCsiSignalsToAmpAnalyticsConfig,
  QQID_HEADER,
  maybeAppendErrorParameter,
  getEnclosingContainerTypes,
  ValidAdContainerTypes,
  getIdentityToken,
} from '../../../ads/google/a4a/utils';
import {
  googleLifecycleReporterFactory,
  setGoogleLifecycleVarsFromHeaders,
} from '../../../ads/google/a4a/google-data-reporter';
import {
  installAnchorClickInterceptor,
} from '../../../src/anchor-click-interceptor';
import {removeElement} from '../../../src/dom';
import {getMode} from '../../../src/mode';
import {stringHash32} from '../../../src/string';
import {dev} from '../../../src/log';
import {Services} from '../../../src/services';
import {domFingerprintPlain} from '../../../src/utils/dom-fingerprint';
import {clamp} from '../../../src/utils/math';
import {
  computedStyle,
  setStyle,
  setStyles,
} from '../../../src/style';
import {AdsenseSharedState} from './adsense-shared-state';
import {insertAnalyticsElement} from '../../../src/extension-analytics';
import {
  getAdSenseAmpAutoAdsExpBranch,
} from '../../../ads/google/adsense-amp-auto-ads';

/** @const {string} */
const ADSENSE_BASE_URL = 'https://googleads.g.doubleclick.net/pagead/ads';

/** @const {string} */
const TAG = 'amp-ad-network-adsense-impl';

/**
 * Shared state for AdSense ad slots. This is used primarily for ad request url
 * parameters that depend on previous slots.
 * @const {!AdsenseSharedState}
 */
const sharedState = new AdsenseSharedState();

/** @visibleForTesting */
export function resetSharedState() {
  sharedState.reset();
}

export class AmpAdNetworkAdsenseImpl extends AmpA4A {

  /**
   * @param {!Element} element
   */
  constructor(element) {
    super(element);

    /**
     * @type {!../../../ads/google/a4a/performance.GoogleAdLifecycleReporter}
     */
    this.lifecycleReporter_ = this.lifecycleReporter_ ||
        this.initLifecycleReporter();

    /**
     * A unique identifier for this slot.
     * Not initialized until getAdUrl() is called; updated upon each invocation
     * of getAdUrl().
     * @private {?string}
     */
    this.uniqueSlotId_ = null;

    /**
     * Config to generate amp-analytics element for active view reporting.
     * @type {?JsonObject}
     * @private
     */
    this.ampAnalyticsConfig_ = null;

    /** @private {!../../../src/service/extensions-impl.Extensions} */
    this.extensions_ = Services.extensionsFor(this.win);

    /** @private {?({width, height}|../../../src/layout-rect.LayoutRectDef)} */
    this.size_ = null;

    /**
     * amp-analytics element generated based on this.ampAnalyticsConfig_
     * @private {?Element}
     */
    this.ampAnalyticsElement_ = null;

    /** @private {?string} */
    this.qqid_ = null;

    /**
     * For full-width responsive ads: whether the element has already been
     * aligned to the edges of the viewport.
     * @private {boolean}
     */
    this.responsiveAligned_ = false;

    /**
     * The contents of the data-auto-format attribute, or empty string if the
     * attribute was not set.
     * @private {?string}
     */
    this.autoFormat_ = null;

    /** @private {?Promise<!../../../ads/google/a4a/utils.IdentityToken>} */
    this.identityTokenPromise_ = null;

    /**
     * @private {?boolean} whether preferential rendered AMP creative, null
     * indicates no creative render.
     */
    this.isAmpCreative_ = null;
  }

  /**
   * @return {boolean}
   * @private
   */
  isResponsive_() {
    return this.autoFormat_ == 'rspv';
  }

  /** @override */
  isValidElement() {
    /**
     * isValidElement used to also check that we are in a valid A4A environment,
     * however this is not necessary as that is checked by adsenseIsA4AEnabled,
     * which is always called as part of the upgrade path from an amp-ad element
     * to an amp-ad-adsense element. Thus, if we are an amp-ad, we can be sure
     * that it has been verified.
     */
    return !!this.element.getAttribute('data-ad-client') &&
        this.isAmpAdElement();
  }

  /** @override */
  delayAdRequestEnabled() {
    return true;
  }

  /** @override */
  buildCallback() {
    super.buildCallback();

    this.identityTokenPromise_ = identityEnabled(this.win) ?
        Services.viewerForDoc(this.getAmpDoc()).whenFirstVisible()
            .then(() => getIdentityToken(this.win, this.getAmpDoc())) :
        Promise.resolve(
            /**@type {!../../../ads/google/a4a/utils.IdentityToken}*/({}));

    this.autoFormat_ =
        this.element.getAttribute('data-auto-format') || '';

    if (this.isResponsive_()) {
      // Attempt to resize to the correct height. The width should already be
      // 100vw, but is fixed here so that future resizes of the viewport don't
      // affect it.
      const viewportSize = this.getViewport().getSize();
      return this.attemptChangeSize(
          AmpAdNetworkAdsenseImpl.getResponsiveHeightForContext_(
              viewportSize),
          viewportSize.width);
    }
  }

  /** @override */
  getAdUrl() {
    // TODO: Check for required and allowed parameters. Probably use
    // validateData, from 3p/3p/js, after moving it someplace common.
    const startTime = Date.now();
    const global = this.win;
    let adClientId = this.element.getAttribute('data-ad-client');
    // Ensure client id format: lower case with 'ca-' prefix.
    adClientId = adClientId.toLowerCase();
    if (adClientId.substring(0, 3) != 'ca-') {
      adClientId = 'ca-' + adClientId;
    }
    const adTestOn = this.element.getAttribute('data-adtest') ||
        isInManualExperiment(this.element);
    const width = Number(this.element.getAttribute('width'));
    const height = Number(this.element.getAttribute('height'));
    // Need to ensure these are numbers since width can be set to 'auto'.
    // Checking height just in case.
    const useDefinedSizes =
        isExperimentOn(this.win, 'as-use-attr-for-format') &&
        !this.isResponsive_() &&
        !isNaN(width) && width > 0 &&
        !isNaN(height) && height > 0;
    this.size_ = useDefinedSizes ?
        {width, height} :
        this.getIntersectionElementLayoutBox();
    const format = `${this.size_.width}x${this.size_.height}`;
    const slotId = this.element.getAttribute('data-amp-slot-index');
    // data-amp-slot-index is set by the upgradeCallback method of amp-ad.
    // TODO(bcassels): Uncomment the assertion, fixing the tests.
    // But not all tests arrange to call upgradeCallback.
    // dev().assert(slotId != undefined);
    const adk = this.adKey_(format);
    this.uniqueSlotId_ = slotId + adk;
    const slotname = this.element.getAttribute('data-ad-slot');
    const sharedStateParams = sharedState.addNewSlot(
        format, this.uniqueSlotId_, adClientId, slotname);
    const viewportSize = this.getViewport().getSize();
    this.win['ampAdGoogleIfiCounter'] = this.win['ampAdGoogleIfiCounter'] || 1;
    const enclosingContainers = getEnclosingContainerTypes(this.element);
    const pfx = enclosingContainers.includes(
        ValidAdContainerTypes['AMP-FX-FLYING-CARPET']) ||
        enclosingContainers.includes(ValidAdContainerTypes['AMP-STICKY-AD']);
    const parameters = {
      'client': adClientId,
      format,
      'w': this.size_.width,
      'h': this.size_.height,
      'iu': slotname,
      'adtest': adTestOn ? 'on' : null,
      adk,
      'output': 'html',
      'bc': global.SVGElement && global.document.createElementNS ? '1' : null,
      'ctypes': this.getCtypes_(),
      'host': this.element.getAttribute('data-ad-host'),
      'to': this.element.getAttribute('data-tag-origin'),
      'pv': sharedStateParams.pv,
      'channel': this.element.getAttribute('data-ad-channel'),
      'wgl': global['WebGLRenderingContext'] ? '1' : '0',
      'asnt': this.sentinel,
      'dff': computedStyle(this.win, this.element)['font-family'],
      'prev_fmts': sharedStateParams.prevFmts || null,
      'prev_slotnames': sharedStateParams.prevSlotnames || null,
      'brdim': additionalDimensions(this.win, viewportSize),
      'ifi': this.win['ampAdGoogleIfiCounter']++,
      'rc': this.fromResumeCallback ? 1 : null,
      'rafmt': this.isResponsive_() ? 13 : null,
      'pfx': pfx ? '1' : '0',
    };

    const experimentIds = [];
    const ampAutoAdsBranch = getAdSenseAmpAutoAdsExpBranch(this.win);
    if (ampAutoAdsBranch) {
      experimentIds.push(ampAutoAdsBranch);
    }

    const identityPromise = Services.timerFor(this.win)
        .timeoutPromise(1000, this.identityTokenPromise_)
        .catch(unusedErr => {
          // On error/timeout, proceed.
          return /**@type {!../../../ads/google/a4a/utils.IdentityToken}*/({});
        });
    return identityPromise.then(identity => {
      return googleAdUrl(
          this, ADSENSE_BASE_URL, startTime, Object.assign(
              {
                adsid: identity.token || null,
                jar: identity.jar || null,
                pucrd: identity.pucrd || null,
              },
              parameters),
          experimentIds);
    });
  }

  /** @override */
  onNetworkFailure(error, adUrl) {
    dev().info(TAG, 'network error, attempt adding of error parameter', error);
    return {adUrl: maybeAppendErrorParameter(adUrl, 'n')};
  }

  /** @override */
  extractSize(responseHeaders) {
    setGoogleLifecycleVarsFromHeaders(responseHeaders, this.lifecycleReporter_);
    this.ampAnalyticsConfig_ = extractAmpAnalyticsConfig(this, responseHeaders);
    this.qqid_ = responseHeaders.get(QQID_HEADER);
    if (this.ampAnalyticsConfig_) {
      // Load amp-analytics extensions
      this.extensions_./*OK*/installExtensionForDoc(
          this.getAmpDoc(), 'amp-analytics');
    }
    return this.size_;
  }

  /**
   * @param {string} format
   * @return {string} The ad unit hash key string.
   * @private
   */
  adKey_(format) {
    const element = this.element;
    const slot = element.getAttribute('data-ad-slot') || '';
    const string = `${slot}:${format}:${domFingerprintPlain(element)}`;
    return stringHash32(string);
  }

  /**
   * @return {?string}
   * @private
   */
  getCtypes_() {
    if (!getMode().localDev) {
      return null;
    }
    const ctypesReMatch = /[?&]force_a4a_ctypes=([^&]+)/.exec(
        this.win.location.search);
    // If the RE passes, then length is necessarily > 1.
    if (ctypesReMatch) {
      return ctypesReMatch[1];
    }
    return null;
  }

  /** @override */
  emitLifecycleEvent(eventName, opt_extraVariables) {
    if (opt_extraVariables) {
      this.lifecycleReporter_.setPingParameters(opt_extraVariables);
    }
    this.lifecycleReporter_.sendPing(eventName);
  }

  /**
   * @return {!../../../ads/google/a4a/performance.BaseLifecycleReporter}
   */
  initLifecycleReporter() {
    return googleLifecycleReporterFactory(this);
  }

  /** @override */
  onCreativeRender(creativeMetaData) {
    super.onCreativeRender(creativeMetaData);
    this.isAmpCreative_ = !!creativeMetaData;
    if (creativeMetaData &&
        !creativeMetaData.customElementExtensions.includes('amp-ad-exit')) {
      // Capture phase click handlers on the ad if amp-ad-exit not present
      // (assume it will handle capture).
      dev().assert(this.iframe);
      installAnchorClickInterceptor(
          this.getAmpDoc(), this.iframe.contentWindow);
    }
    if (this.ampAnalyticsConfig_) {
      dev().assert(!this.ampAnalyticsElement_);
      if (isReportingEnabled(this)) {
        addCsiSignalsToAmpAnalyticsConfig(
            this.win,
            this.element,
            this.ampAnalyticsConfig_,
            this.qqid_,
            !!creativeMetaData,
            this.lifecycleReporter_.getDeltaTime(),
            this.lifecycleReporter_.getInitTime());
      }
      this.ampAnalyticsElement_ =
          insertAnalyticsElement(this.element, this.ampAnalyticsConfig_, true);
    }

    this.lifecycleReporter_.addPingsForVisibility(this.element);

    setStyles(dev().assertElement(this.iframe), {
      width: `${this.size_.width}px`,
      height: `${this.size_.height}px`,
    });
  }

  /** @override */
  unlayoutCallback() {
    switch (this.postAdResponseExperimentFeatures['unlayout_exp']) {
      case 'all':
        // Ensure all creatives are removed.
        this.isAmpCreative_ = false;
        break;
      case 'remain':
        if (this.qqid_ && this.isAmpCreative_ === null) {
          // Ad response received but not yet rendered. Note that no fills
          // would fall into this case even if layoutCallback has executed.
          // Assume high probability of continued no fill therefore do not
          // tear down.
          dev().info(TAG, 'unlayoutCallback - unrendered creative can remain');
          return false;
        }
    }
    if (this.isAmpCreative_) {
      // Allow AMP creatives to remain in case SERP viewer swipe back.
      return false;
    }
    const superResult = super.unlayoutCallback();
    this.element.setAttribute('data-amp-slot-index',
        this.win.ampAdSlotIdCounter++);
    this.lifecycleReporter_ = this.initLifecycleReporter();
    if (this.uniqueSlotId_) {
      sharedState.removeSlot(this.uniqueSlotId_);
    }
    if (this.ampAnalyticsElement_) {
      removeElement(this.ampAnalyticsElement_);
      this.ampAnalyticsElement_ = null;
    }
    this.ampAnalyticsConfig_ = null;
    this.qqid_ = null;
    this.isAmpCreative_ = null;
    return superResult;
  }

  /** @override */
  onLayoutMeasure() {
    super.onLayoutMeasure();
    if (this.isResponsive_() && !this.responsiveAligned_) {
      this.responsiveAligned_ = true;
      const layoutBox = this.getLayoutBox();
      // Nudge into the correct horizontal position by changing side margin.
      this.getVsync().run({
        measure: state => {
          state.direction = computedStyle(this.win, this.element)['direction'];
        },
        mutate: state => {
          if (state.direction == 'rtl') {
            setStyle(this.element, 'marginRight', layoutBox.left, 'px');
          } else {
            setStyle(this.element, 'marginLeft', -layoutBox.left, 'px');
          }
        },
      }, {direction: ''});
    }
  }

  /** @override */
  getPreconnectUrls() {
    return ['https://googleads.g.doubleclick.net'];
  }

  /**
   * Calculates the appropriate height for a full-width responsive ad of the
   * given width.
   * @param {!{width: number, height: number}} viewportSize
   * @return {number}
   * @private
   */
  static getResponsiveHeightForContext_(viewportSize) {
    const minHeight = 100;
    const maxHeight = Math.min(300, viewportSize.height);
    // We aim for a 6:5 aspect ratio.
    const idealHeight = Math.round(viewportSize.width / 1.2);
    return clamp(idealHeight, minHeight, maxHeight);
  }
}

AMP.extension(TAG, '0.1', AMP => {
  AMP.registerElement(TAG, AmpAdNetworkAdsenseImpl);
});
```
{ "content_hash": "cb38295ee34bdaad966db192c43a24f1", "timestamp": "", "source": "github", "line_count": 468, "max_line_length": 80, "avg_line_length": 33.16880341880342, "alnum_prop": 0.6396959350640984, "repo_name": "gumgum/amphtml", "id": "a5f61f8b1bf5f8fb4c23f24a011a892ba6cb05fc", "size": "16150", "binary": false, "copies": "2", "ref": "refs/heads/master", "path": "extensions/amp-ad-network-adsense-impl/0.1/amp-ad-network-adsense-impl.js", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "CSS", "bytes": "120510" }, { "name": "Go", "bytes": "7459" }, { "name": "HTML", "bytes": "965198" }, { "name": "Java", "bytes": "36670" }, { "name": "JavaScript", "bytes": "8738015" }, { "name": "Python", "bytes": "75748" }, { "name": "Ruby", "bytes": "12998" }, { "name": "Shell", "bytes": "6942" }, { "name": "Yacc", "bytes": "22292" } ], "symlink_target": "" }
```csharp
#region File header
#endregion

#region Using directives
using System.IO;
#endregion

namespace CG.IO
{
    /// <summary>
    /// This class is a stream backed by a temporary disk file. The disk file
    /// is automatically deleted when the stream is disposed.
    /// </summary>
    public class TempFileStream : Stream
    {
        // *******************************************************************
        // Properties.
        // *******************************************************************

        #region Properties

        /// <summary>
        /// This property contains the name of the underlying base stream.
        /// </summary>
        public string Name
        {
            get { return BaseStream.Name; }
        }

        // *******************************************************************

        /// <summary>
        /// This property contains a reference to a base stream.
        /// </summary>
        protected FileStream BaseStream { get; private set; }

        #endregion

        // *******************************************************************
        // Constructors.
        // *******************************************************************

        #region Constructors

        /// <summary>
        /// This constructor creates a new instance of the <see cref="TempFileStream"/>
        /// class.
        /// </summary>
        public TempFileStream()
        {
            // Create a temporary base stream.
            BaseStream = File.Create(Path.GetTempFileName());
        }

        // *******************************************************************

        /// <summary>
        /// This constructor creates a new instance of the <see cref="TempFileStream"/>
        /// class.
        /// </summary>
        /// <param name="filePath">The path to a file. Note, this file will be
        /// deleted after the stream is closed! Once more, THIS FILE WILL BE
        /// DELETED AFTER THE STREAM IS CLOSED!! DELETED!! KILLED!! EXPUNGED!!
        /// EXCISED!! REMOVED!! OBLITERATED!! ERADICATED!! Are you understanding
        /// the words that are coming out of my mouth?!</param>
        public TempFileStream(string filePath)
        {
            // Validate the parameter before attempting to use it.
            Guard.Instance.ThrowIfInvalidFilePath(filePath, nameof(filePath));

            // Create/open a temporary base stream.
            BaseStream = File.Open(filePath, FileMode.OpenOrCreate);
        }

        #endregion

        // *******************************************************************
        // Stream overrides.
        // *******************************************************************

        #region Stream overrides

        /// <summary>
        /// This property gets a value indicating whether the current stream
        /// supports reading.
        /// </summary>
        public override bool CanRead
        {
            get { return BaseStream.CanRead; }
        }

        // *******************************************************************

        /// <summary>
        /// This property gets a value indicating whether the current stream
        /// supports seeking.
        /// </summary>
        public override bool CanSeek
        {
            get { return BaseStream.CanSeek; }
        }

        // *******************************************************************

        /// <summary>
        /// This property gets a value indicating whether the current stream
        /// supports writing.
        /// </summary>
        public override bool CanWrite
        {
            get { return BaseStream.CanWrite; }
        }

        // *******************************************************************

        /// <summary>
        /// This property gets the length in bytes of the stream.
        /// </summary>
        public override long Length
        {
            get { return BaseStream.Length; }
        }

        // *******************************************************************

        /// <summary>
        /// This property gets or sets the current position of this stream.
        /// </summary>
        public override long Position
        {
            get { return BaseStream.Position; }
            set { BaseStream.Position = value; }
        }

        // *******************************************************************

        /// <summary>
        /// This method clears buffers for this stream and causes any buffered
        /// data to be written to the underlying storage.
        /// </summary>
        public override void Flush()
        {
            BaseStream.Flush();
        }

        // *******************************************************************

        /// <summary>
        /// This method reads a block of bytes from the stream and writes the
        /// data in a given buffer.
        /// </summary>
        /// <param name="buffer">When this method returns, contains the specified
        /// byte array with the values between offset and (offset + count - 1)
        /// replaced by the bytes read from the current source.</param>
        /// <param name="offset">The byte offset in array at which the read bytes
        /// will be placed.</param>
        /// <param name="count">The maximum number of bytes to read.</param>
        /// <returns>The total number of bytes read into the buffer. This might
        /// be less than the number of bytes requested if that number of bytes
        /// are not currently available, or zero if the end of the stream is
        /// reached.</returns>
        public override int Read(byte[] buffer, int offset, int count)
        {
            return BaseStream.Read(buffer, offset, count);
        }

        // *******************************************************************

        /// <summary>
        /// This method sets the current position of this stream to the given
        /// value.
        /// </summary>
        /// <param name="offset">The point relative to origin from which to
        /// begin seeking.</param>
        /// <param name="origin">Specifies the beginning, the end, or the current
        /// position as a reference point for offset, using a value of type
        /// System.IO.SeekOrigin.</param>
        /// <returns>The new position in the stream.</returns>
        public override long Seek(long offset, SeekOrigin origin)
        {
            return BaseStream.Seek(offset, origin);
        }

        // *******************************************************************

        /// <summary>
        /// This method sets the length of this stream to the given value.
        /// </summary>
        /// <param name="value">The new length of the stream.</param>
        public override void SetLength(long value)
        {
            BaseStream.SetLength(value);
        }

        // *******************************************************************

        /// <summary>
        /// This method writes a block of bytes to the file stream.
        /// </summary>
        /// <param name="buffer">The buffer containing data to write to the
        /// stream.</param>
        /// <param name="offset">The zero-based byte offset in array from which
        /// to begin copying bytes to the stream.</param>
        /// <param name="count">The maximum number of bytes to write.</param>
        public override void Write(byte[] buffer, int offset, int count)
        {
            BaseStream.Write(buffer, offset, count);
        }

        // *******************************************************************

        /// <summary>
        /// This method releases the unmanaged resources used by the <see cref="FileStream"/>
        /// and optionally releases the managed resources.
        /// </summary>
        /// <param name="disposing">True to release both managed and unmanaged
        /// resources; false to release only unmanaged resources.</param>
        protected override void Dispose(bool disposing)
        {
            if (disposing)
            {
                var tempPath = BaseStream.Name;
                BaseStream.Dispose();
                if (File.Exists(tempPath))
                    File.Delete(tempPath);
            }
            base.Dispose(disposing);
        }

        #endregion
    }
}
```
{ "content_hash": "9899361aa02094b563fd771d48204da9", "timestamp": "", "source": "github", "line_count": 270, "max_line_length": 93, "avg_line_length": 32.77777777777778, "alnum_prop": 0.44384180790960454, "repo_name": "CodeGator/CG.Core", "id": "4bc06ca976b0b38be49de74924fb671776462bd9", "size": "10142", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "CG.Core/IO/TempFileStream.cs", "mode": "33188", "license": "mit", "language": [ { "name": "Batchfile", "bytes": "380" }, { "name": "C#", "bytes": "309512" } ], "symlink_target": "" }
hs.canvas.turtle
================

An experimental module to add a turtle graphics module to `hs.canvas` in Hammerspoon.

This module creates a "canvas" that responds to turtle-graphics-like commands similar to those found in the Logo language, specifically [BERKELEY LOGO 6.1](https://people.eecs.berkeley.edu/~bh/docs/html/usermanual_6.html#GRAPHICS).

This may or may not ever make it into Hammerspoon itself; it was a diversion and an attempt to revisit some algorithms I worked on to explore Sierpiński curves in an old Hypercard stack I created in my teens.

- - -

*December 27, 2020 Update*

Now uses an off-screen image for generating the display. This shows a slight general speedup, which is really noticeable when moving or resizing the view for complex renders. (If `hs.canvas` ever gets a rewrite, it should consider doing something similar -- setting `needsDisplay = YES` causes the view (or dirty rect, if you specify one) to be cleared, requiring a complete redraw; using an off-screen image allows adding elements in stages and then doing just one image draw in the view in the `drawRect` method. It would require some thought about moving elements, though it may still be worth it: even before I started updating the off-screen image incrementally, there was a noticeable (though smaller) speedup just from rendering off-screen outside of NSView's Graphics Context.)

Todo:

* finish documentation -- it's started, but not done yet
* rethink _background
  * does it need a rename?
  * no way to tell if the _background function is active -- subsequent calls to _background are queued, but other turtle actions aren't
  * queue other actions as well? queries are ok, but anything that changes state isn't safe during a run
  * no way to cancel a running function or query the depth of the queue
* decide on savepict/loadpict and logo conversion
* revisit fill/filled
{ "content_hash": "4ba5b27177d80ad8f6cfc7eb0fb2e18c", "timestamp": "", "source": "github", "line_count": 24, "max_line_length": 730, "avg_line_length": 75.29166666666667, "alnum_prop": 0.7775318206972883, "repo_name": "asmagill/hammerspoon_asm", "id": "107dd701536e75a6c42ab9467444ef50ce53cf1e", "size": "1808", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "turtle/README.md", "mode": "33188", "license": "mit", "language": [ { "name": "C", "bytes": "235183" }, { "name": "Lua", "bytes": "352790" }, { "name": "Makefile", "bytes": "123733" }, { "name": "Objective-C", "bytes": "1227281" } ], "symlink_target": "" }
```html
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="utf-8">
    <link rel="shortcut icon" type="image/x-icon" href="../favicon.ico">
    <link rel="stylesheet" type="text/css" href="../css/font-awesome-4.7.0/css/font-awesome.min.css">
    <link rel="stylesheet" type="text/css" href="../css/fonts.css">
    <link rel="stylesheet" type="text/css" href="../css/sweetalert.css">
    <link rel="stylesheet" type="text/css" href="../css/game.css">
    <title>C.A.S.E.O.</title>
</head>
<body>
    <div class="center-align-container">
        <canvas id="myCanvas" class="myCanvas"></canvas>
    </div>
    <div class="center-align-container" style="font-size:1.4em;margin-top:0;padding-top:0">
        <a href="../menu.html" style="color:rgba(150, 146, 130, 0.7)"><i class="fa fa-bars fa-3x" aria-hidden="true"></i></a>
        <span>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</span>
        <a id="LOCK_BTN" style="color:rgba(150, 146, 130, 0.7);cursor:pointer"><i class="fa fa-lock fa-3x" aria-hidden="true"></i></a>
        <span>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</span>
        <a id="UNDO_BTN" style="color:rgba(150, 146, 130, 0.7);cursor:pointer"><i class="fa fa-times fa-3x" aria-hidden="true"></i></a>
        <span>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</span>
        <a id="BOOKMARK_BTN" style="color:rgba(150, 146, 130, 0.7);cursor:pointer"><i class="fa fa-bookmark-o fa-3x" aria-hidden="true"></i></a>
        <span>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</span>
        <a onclick="ReloadPage()" style="color:#DB7093;cursor:pointer"><i class="fa fa-repeat fa-3x" aria-hidden="true"></i></a>
    </div>
    <script src="../js/jquery-3.2.1.min.js"></script>
    <script src="../js/sweetalert.min.js"></script>
    <script src="../js/js.cookie.js"></script>
    <script src="../js/utils.js"></script>
    <script src="../js/board.js"></script>
    <script>
    $(document).ready(function() {
        let R = 4;
        let C = 3;
        let row_data = [[0, 0, 2], [1, 1, 1], [1, 0, 0], [1, 1, 1], [2, 0, 0]];
        let col_data = [[0, 1, 1, 0], [0, 1, 1, 0], [2, 1, 1, 2], [0, 1, 1, 0]];
        let win_h = $(window).height();
        let win_w = $(window).width();
        let FONT = "Courier, monospace";
        let fh = 2;
        let font_dim = MeasureText("A", true, FONT, fh);
        let char_h = font_dim[1];
        let char_w = font_dim[0];
        let req_h = R * 5 * char_h + (R+1) * char_h + 80;
        let req_w = C * 11 * char_w + (C+1) * char_w + 80;
        while (req_h < win_h && req_w < win_w) {
            ++fh;
            font_dim = MeasureText("A", true, FONT, fh);
            char_h = font_dim[1];
            char_w = font_dim[0];
            req_h = R * 5 * char_h + (R+1) * char_h + 80;
            req_w = C * 11 * char_w + (C+1) * char_w + 80;
        }
        var _b = new board(R, C, "myCanvas", FONT, fh-1);
        _b.RestoreBookmark(row_data, col_data);

        function getMousePos(canvas, evt) {
            let rect = canvas.getBoundingClientRect();
            return {
                x: (evt.clientX-rect.left) / (rect.right-rect.left) * canvas.width,
                y: (evt.clientY-rect.top) / (rect.bottom-rect.top) * canvas.height
            };
        }

        var canvas = document.getElementById("myCanvas");
        var _ctx = canvas.getContext("2d");
        _ctx.strokeStyle = "#E6E6FA";
        _ctx.lineWidth = Math.floor(fh*1.33);
        var _drawing = false;
        var _mousePos = { x : -1, y : -1 };
        var _lastPos = _mousePos;

        $('body').on('contextmenu', '#myCanvas', function(e) {
            // Disable default right-click - it is used for creating helper walls
            return false;
        });

        canvas.addEventListener("mousedown", function (e) {
            _ctx.beginPath();
            _drawing = true;
            _lastPos = getMousePos(canvas, e);
            _ctx.lineJoin = _ctx.lineCap = 'round';
        }, false);

        canvas.addEventListener("mouseup", function (e) {
            if (_drawing) {
                _ctx.closePath();
                _mousePos = getMousePos(canvas, e);
                if (e.which && e.which == 3)
                    _b.AddHelperWall(_mousePos.x, _mousePos.y);
                else
                    _b.Clicked(_mousePos.x, _mousePos.y);
                _lastPos = _mousePos;
                _drawing = false;
            }
        }, false);

        canvas.addEventListener("mousemove", function (e) {
            _mousePos = getMousePos(canvas, e);
        }, false);

        canvas.addEventListener("mouseout", function (e) {
            var mouseEvent = new MouseEvent("mouseup", {});
            canvas.dispatchEvent(mouseEvent);
        }, false);

        function renderCanvas() {
            if (_drawing) {
                if (_lastPos.x > 0 && _lastPos.y > 0 && _mousePos.x > 0 && _mousePos.y > 0) {
                    _ctx.moveTo(_lastPos.x, _lastPos.y);
                    _ctx.lineTo(_mousePos.x, _mousePos.y);
                    _ctx.stroke();
                }
                _lastPos = _mousePos;
                _b.LogDragClick(_mousePos.x, _mousePos.y);
            }
        }

        (function drawLoop () {
            requestAnimFrame(drawLoop);
            renderCanvas();
        })();

        // TOUCH EVENTS
        // Prevent scrolling when touching the canvas
        canvas.addEventListener("touchstart", function (e) {
            e.preventDefault();
            var touch = e.touches[0];
            var mouseEvent = new MouseEvent("mousedown", {
                clientX: touch.clientX,
                clientY: touch.clientY
            });
            canvas.dispatchEvent(mouseEvent);
        }, false);

        canvas.addEventListener("touchend", function (e) {
            var mouseEvent = new MouseEvent("mouseup", {});
            canvas.dispatchEvent(mouseEvent);
        }, false);

        canvas.addEventListener("touchmove", function (e) {
            e.preventDefault();
            var touch = e.touches[0];
            var mouseEvent = new MouseEvent("mousemove", {
                clientX: touch.clientX,
                clientY: touch.clientY
            });
            canvas.dispatchEvent(mouseEvent);
        }, false);

        $("#LOCK_BTN").click(function() {
            _b.ToggleLock();
        });
        $("#UNDO_BTN").click(function() {
            _b.RestoreLockedState();
        });
        $("#BOOKMARK_BTN").click(function() {
            _b.Bookmark();
        });
    });
    </script>
</body>
</html>
```
{ "content_hash": "bd58bd5491a3be8da3c510c01e19afcf", "timestamp": "", "source": "github", "line_count": 166, "max_line_length": 140, "avg_line_length": 36.43975903614458, "alnum_prop": 0.5483551000165316, "repo_name": "grunfeld/grunfeld.github.io", "id": "94b0483f2080873dc509c7c4cc4a1def5b1c8acb", "size": "6049", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "levels/t10.html", "mode": "33188", "license": "mit", "language": [ { "name": "Awk", "bytes": "187" }, { "name": "CSS", "bytes": "188146" }, { "name": "HTML", "bytes": "3317467" }, { "name": "JavaScript", "bytes": "73901" } ], "symlink_target": "" }
```php
<?php

namespace Aego\OAuth2\Client\Provider;

use League\OAuth2\Client\Provider\AbstractProvider;
use League\OAuth2\Client\Provider\Exception\IdentityProviderException;
use League\OAuth2\Client\Token\AccessToken;
use Psr\Http\Message\ResponseInterface;

class Mailru extends AbstractProvider
{
    /**
     * {@inheritdoc}
     */
    public function getBaseAuthorizationUrl()
    {
        return 'https://connect.mail.ru/oauth/authorize';
    }

    /**
     * {@inheritdoc}
     */
    public function getBaseAccessTokenUrl(array $params)
    {
        return 'https://connect.mail.ru/oauth/token';
    }

    /**
     * {@inheritdoc}
     */
    public function getResourceOwnerDetailsUrl(AccessToken $token)
    {
        $param = 'app_id=' . $this->clientId . '&method=users.getInfo&secure=1&session_key=' . $token->getToken();
        $sign = md5(str_replace('&', '', $param) . $this->clientSecret);
        return 'http://www.appsmail.ru/platform/api?' . $param . '&sig=' . $sign;
    }

    /**
     * {@inheritdoc}
     */
    protected function getDefaultScopes()
    {
        return [];
    }

    /**
     * {@inheritdoc}
     */
    protected function checkResponse(ResponseInterface $response, $data)
    {
        if (isset($data['error_code'])) {
            throw new IdentityProviderException($data['error_msg'], $data['error_code'], $response);
        } elseif (isset($data['error'])) {
            throw new IdentityProviderException($data['error'], $response->getStatusCode(), $response);
        }
    }

    /**
     * {@inheritdoc}
     */
    protected function createResourceOwner(array $response, AccessToken $token)
    {
        return new MailruResourceOwner($response);
    }
}
```
{ "content_hash": "5aba91e85daaa63ef15b5ba1c2c7d1b6", "timestamp": "", "source": "github", "line_count": 65, "max_line_length": 114, "avg_line_length": 26.584615384615386, "alnum_prop": 0.6076388888888888, "repo_name": "rakeev/oauth2-mailru", "id": "8c7b808f48c2e45075914199fc3dfdd6f34ebd77", "size": "1728", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "src/Mailru.php", "mode": "33188", "license": "mit", "language": [ { "name": "PHP", "bytes": "10930" } ], "symlink_target": "" }
```php
<?php

namespace Controllers;

use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Models\User;

class DefaultController
{
    private $app;

    /**
     * Constructor
     */
    public function __construct($app)
    {
        $this->app = $app;
    }

    /**
     * Example default action that gets a custom service
     * then renders a twig template
     *
     * @return Response
     */
    public function indexAction()
    {
        $message = $this->app['example_service']->getWelcomeMessage();

        return $this->app['twig']->render('home.twig', array(
            'message' => $message,
        ));
    }
}
```
{ "content_hash": "785d6b6a4b79bb66fdc1a5384f340a5f", "timestamp": "", "source": "github", "line_count": 35, "max_line_length": 70, "avg_line_length": 19.114285714285714, "alnum_prop": 0.5919282511210763, "repo_name": "wellsjo/silex-boilerplate", "id": "4dbd4725449a8c9d085ef77bfe1c82aad9f251f1", "size": "669", "binary": false, "copies": "2", "ref": "refs/heads/master", "path": "app/Controllers/DefaultController.php", "mode": "33188", "license": "mit", "language": [ { "name": "ApacheConf", "bytes": "159" }, { "name": "HTML", "bytes": "117" }, { "name": "PHP", "bytes": "3150" } ], "symlink_target": "" }
```ruby
class Worochi
  # Current version of the gem
  VERSION = '0.0.18'
end
```
{ "content_hash": "c5cb7a877bf0edf2f7d3bde5abd49779", "timestamp": "", "source": "github", "line_count": 4, "max_line_length": 30, "avg_line_length": 17.25, "alnum_prop": 0.6956521739130435, "repo_name": "Pixelapse/worochi", "id": "e64454e44aa2ea86631c8cca98dce674fd01f7c6", "size": "69", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "lib/worochi/version.rb", "mode": "33188", "license": "mit", "language": [ { "name": "Ruby", "bytes": "81841" } ], "symlink_target": "" }
---
id: a103376db3ba46b2d50db289
title: Spinal Tap Case
isRequired: true
challengeType: 5
videoUrl: ''
localeTitle: Caja del grifo espinal
---

## Description
<section id="description">
Convert a string to spinal case. Spinal case is all-lowercase-words-joined-by-dashes.
Remember to use <a href="https://forum.freecodecamp.org/t/how-to-get-help-when-you-are-stuck-coding/19514" target="_blank">Read-Search-Ask</a> if you get stuck. Try to pair program. Write your own code.
</section>

## Instructions
<section id="instructions">
</section>

## Tests
<section id='tests'>

```yml
tests:
  - text: <code>spinalCase(&quot;This Is Spinal Tap&quot;)</code> should return <code>&quot;this-is-spinal-tap&quot;</code>.
    testString: 'assert.deepEqual(spinalCase("This Is Spinal Tap"), "this-is-spinal-tap", "<code>spinalCase("This Is Spinal Tap")</code> should return <code>"this-is-spinal-tap"</code>.");'
  - text: <code>spinalCase(&quot;thisIsSpinalTap&quot;)</code> should return <code>&quot;this-is-spinal-tap&quot;</code>.
    testString: 'assert.strictEqual(spinalCase("thisIsSpinalTap"), "this-is-spinal-tap", "<code>spinalCase("thisIsSpinal<wbr>Tap")</code> should return <code>"this-is-spinal-tap"</code>.");'
  - text: <code>spinalCase(&quot;The_Andy_Griffith_Show&quot;)</code> should return <code>&quot;the-andy-griffith-show&quot;</code>.
    testString: 'assert.strictEqual(spinalCase("The_Andy_Griffith_Show"), "the-andy-griffith-show", "<code>spinalCase("The_Andy_<wbr>Griffith_Show")</code> should return <code>"the-andy-griffith-show"</code>.");'
  - text: <code>spinalCase(&quot;Teletubbies say Eh-oh&quot;)</code> should return <code>&quot;teletubbies-say-eh-oh&quot;</code>.
    testString: 'assert.strictEqual(spinalCase("Teletubbies say Eh-oh"), "teletubbies-say-eh-oh", "<code>spinalCase("Teletubbies say Eh-oh")</code> should return <code>"teletubbies-say-eh-oh"</code>.");'
  - text: <code>spinalCase(&quot;AllThe-small Things&quot;)</code> should return <code>&quot;all-the-small-things&quot;</code>.
    testString: 'assert.strictEqual(spinalCase("AllThe-small Things"), "all-the-small-things", "<code>spinalCase("AllThe-small Things")</code> should return <code>"all-the-small-things"</code>.");'
```

</section>

## Challenge Seed
<section id='challengeSeed'>

<div id='js-seed'>

```js
function spinalCase(str) {
  // "It's such a fine line between stupid, and clever."
  // --David St. Hubbins
  return str;
}

spinalCase('This Is Spinal Tap');
```

</div>

</section>

## Solution
<section id='solution'>

```js
// solution required
```

</section>
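The `Solution` section in the source file is deliberately left as `// solution required`. The sketch below is not part of the original challenge file; it is one possible implementation, written only against the tests listed above, included purely as an illustration.

```js
// One possible spinalCase implementation (illustrative, not the official
// solution): insert a space at each lowercase-to-uppercase boundary, then
// split on spaces, underscores, and dashes, and rejoin with dashes.
function spinalCase(str) {
  return str
    .replace(/([a-z])([A-Z])/g, '$1 $2') // "thisIs" -> "this Is"
    .split(/[\s_-]+/)                    // split on whitespace, "_" and "-"
    .join('-')
    .toLowerCase();
}

console.log(spinalCase('AllThe-small Things')); // "all-the-small-things"
```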
{ "content_hash": "7792a06ce29c650131da6e5cf2703696", "timestamp": "", "source": "github", "line_count": 64, "max_line_length": 365, "avg_line_length": 41.09375, "alnum_prop": 0.7121673003802281, "repo_name": "jonathanihm/freeCodeCamp", "id": "739409b5a89f5401784e3636b0e66beb0c66cd21", "size": "2639", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "curriculum/challenges/spanish/02-javascript-algorithms-and-data-structures/intermediate-algorithm-scripting/spinal-tap-case.spanish.md", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "CSS", "bytes": "193850" }, { "name": "HTML", "bytes": "100244" }, { "name": "JavaScript", "bytes": "522335" } ], "symlink_target": "" }
```yaml
---
uid: SolidEdgeAssembly.StructuralFrame.GetFramePath(System.Int32@,System.Array@)
summary:
remarks:
syntax:
  parameters:
    - id: NumPathSegments
      description: Specifies the number of objects in the PathSegments array.
    - id: PathSegments
      description: Specifies the segments for the frame path.
---
```
{ "content_hash": "3559a7b8b95f3f06d7c7bdcfb6662d42", "timestamp": "", "source": "github", "line_count": 11, "max_line_length": 80, "avg_line_length": 29.363636363636363, "alnum_prop": 0.7306501547987616, "repo_name": "SolidEdgeCommunity/docs", "id": "1c47548a95140abbfa593eeea13d68776e9969c2", "size": "325", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "docfx_project/apidoc/SolidEdgeAssembly.StructuralFrame.GetFramePath.md", "mode": "33188", "license": "mit", "language": [ { "name": "Batchfile", "bytes": "38" }, { "name": "C#", "bytes": "5048212" }, { "name": "C++", "bytes": "2265" }, { "name": "CSS", "bytes": "148" }, { "name": "PowerShell", "bytes": "180" }, { "name": "Smalltalk", "bytes": "1996" }, { "name": "Visual Basic", "bytes": "10236277" } ], "symlink_target": "" }
```js
function searchBusStop(keyword) {
    // Build a predicate ("sieve") that matches bus stops by name.
    var sieve = function(keyword) {
        var helper = function(bs) {
            if (bs instanceof BusStop) {
                var re = new RegExp(keyword);
                if (re.test(bs.name)) {
                    console.log("Match! " + bs.name + "\n");
                    return true;
                } else {
                    return false;
                }
            } else {
                return false;
            }
        };
        return helper;
    };
    var fx = sieve(keyword);
    var res = busStops.filter(fx);
    console.log("%d matches found.", res.length);
    return res;
}
```
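A brief usage sketch (not in the original file): it assumes the global `busStops` array and the `BusStop` type referenced by the function are defined elsewhere in the repo, and the stop name below is only a hypothetical example.

```js
// Assuming busStops has already been populated elsewhere in the app:
var matches = searchBusStop("Blok M"); // the keyword is treated as a RegExp
matches.forEach(function(bs) {
    console.log(bs.name);
});
```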
{ "content_hash": "a30112c139590a99c002e071aa4f9c37", "timestamp": "", "source": "github", "line_count": 25, "max_line_length": 45, "avg_line_length": 19.12, "alnum_prop": 0.5857740585774058, "repo_name": "andhieka/Transjakarta", "id": "2162d23764a4b91796813388fdf50052e3d446a3", "size": "478", "binary": false, "copies": "1", "ref": "refs/heads/gh-pages", "path": "web/scripts/searchBusStop.js", "mode": "33188", "license": "mit", "language": [ { "name": "JavaScript", "bytes": "359552" } ], "symlink_target": "" }
```dockerfile
FROM {{ DOCKER_PREFIX }}/{{ DOCKER_NAME }}-base

CMD ["spark-class", "org.apache.spark.deploy.master.Master"]
```
{ "content_hash": "0c3f84e7478478069cd11b3e0c7903cc", "timestamp": "", "source": "github", "line_count": 3, "max_line_length": 60, "avg_line_length": 36.666666666666664, "alnum_prop": 0.6818181818181818, "repo_name": "zero323/spark-in-a-box", "id": "1c24f30bb171455663793426d056af4f8e21c4e8", "size": "110", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "sparkinabox/templates/dockerfiles/master.Dockerfile", "mode": "33188", "license": "mit", "language": [ { "name": "Makefile", "bytes": "181" }, { "name": "Python", "bytes": "11907" } ], "symlink_target": "" }
```cpp
// NOTE: the includes below were missing from this fragment and have been
// restored so the file is self-contained; the "tarjan" header name is an
// assumption based on the repo layout and may differ in the original.
#include <gtest/gtest.h>

#include <string>
#include <vector>

#include "tarjan.hpp"

TEST(Tarjan, verts_0_edges_0) {
    std::vector<std::string> vertices;
    std::vector<std::vector<size_t>> expected{};
    auto const actual = tarjan(0, [](size_t const &) { return std::vector<size_t>(); });
    ASSERT_EQ(expected, actual);
}

TEST(Tarjan, verts_2_edges_1) {
    std::vector<std::string> vertices;
    vertices.emplace_back("0");
    vertices.emplace_back("1");
    std::vector<std::vector<size_t>> expected{{1}, {0}};
    auto const actual = tarjan(vertices.size(), [&](size_t i) {
        if (i == 0) {
            return std::vector<size_t>{1};
        }
        return std::vector<size_t>{};
    });
    ASSERT_EQ(expected.size(), actual.size());
    for (size_t i = 0; i < expected.size(); ++i) {
        auto const & subExpected = expected[i];
        auto const & subActual = actual[i];
        ASSERT_EQ(subExpected.size(), subActual.size());
        for (size_t j = 0; j < subExpected.size(); ++j) {
            ASSERT_EQ(subExpected[j], subActual[j]);
        }
    }
    EXPECT_EQ(expected, actual);
}

TEST(Tarjan, strongly_connected_verts_2_edges_2) {
    std::vector<std::string> vertices;
    vertices.emplace_back("0");
    vertices.emplace_back("1");
    std::vector<std::vector<size_t>> expected{{1, 0}};
    auto const actual = tarjan(vertices.size(), [&](size_t i) {
        if (i == 0) {
            return std::vector<size_t>{1};
        } else {
            return std::vector<size_t>{0};
        }
        return std::vector<size_t>{};
    });
    ASSERT_EQ(expected.size(), actual.size());
    for (size_t i = 0; i < expected.size(); ++i) {
        auto const & subExpected = expected[i];
        auto const & subActual = actual[i];
        ASSERT_EQ(subExpected.size(), subActual.size());
        for (size_t j = 0; j < subExpected.size(); ++j) {
            ASSERT_EQ(subExpected[j], subActual[j]);
        }
    }
    EXPECT_EQ(expected, actual);
}
```
{ "content_hash": "02537e74b2595f760340e2893b39cd50", "timestamp": "", "source": "github", "line_count": 57, "max_line_length": 85, "avg_line_length": 29.596491228070175, "alnum_prop": 0.6229994072317724, "repo_name": "coder0xff/Plange", "id": "ae9b621dc56e7bf2fee1108573d3ae50c0e158f4", "size": "1739", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "source/utilities/utilities_test/tarjan.test.cpp", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "Batchfile", "bytes": "302" }, { "name": "C++", "bytes": "1211302" }, { "name": "CMake", "bytes": "16045" }, { "name": "CSS", "bytes": "3771" }, { "name": "Hack", "bytes": "1810" }, { "name": "Makefile", "bytes": "203" }, { "name": "PHP", "bytes": "31289" }, { "name": "Python", "bytes": "5829" }, { "name": "Shell", "bytes": "230" } ], "symlink_target": "" }
```js
var EventEmitter = require("eventemitter3"),
    inherits = require("./util/inherits"),
    _isFunction = require("lodash-node/modern/lang/isFunction"),
    _isObject = require("lodash-node/modern/lang/isObject");

function Store(dispatcher) {
  this.dispatcher = dispatcher;
  this.__actions__ = {};
  EventEmitter.call(this);
}

inherits(Store, EventEmitter);

Store.prototype.__handleAction__ = function(action) {
  var handler;
  if (!!(handler = this.__actions__[action.type])) {
    if (_isFunction(handler)) {
      handler.call(this, action.payload, action.type);
    } else if (handler && _isFunction(this[handler])) {
      this[handler].call(this, action.payload, action.type);
    } else {
      throw new Error("The handler for action type " + action.type + " is not a function");
    }
    return true;
  } else {
    return false;
  }
};

Store.prototype.bindActions = function() {
  var actions = Array.prototype.slice.call(arguments);

  if (actions.length > 1 && actions.length % 2 !== 0) {
    throw new Error("bindActions must take an even number of arguments.");
  }

  var bindAction = function(type, handler) {
    if (!handler) {
      throw new Error("The handler for action type " + type + " is falsy");
    }

    this.__actions__[type] = handler;
  }.bind(this);

  if (actions.length === 1 && _isObject(actions[0])) {
    actions = actions[0];
    for (var key in actions) {
      if (actions.hasOwnProperty(key)) {
        bindAction(key, actions[key]);
      }
    }
  } else {
    for (var i = 0; i < actions.length; i += 2) {
      var type = actions[i],
          handler = actions[i+1];

      if (!type) {
        throw new Error("Argument " + (i+1) + " to bindActions is a falsy value");
      }

      bindAction(type, handler);
    }
  }
};

Store.prototype.waitFor = function(stores, fn) {
  this.dispatcher.waitForStores(this, stores, fn.bind(this));
};

module.exports = Store;
```
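A short usage sketch (not part of store.js): application code normally reaches this class through Fluxxor's documented `createStore` helper rather than instantiating `Store` directly; the store name, action type, and payload shape below are hypothetical.

```js
var Fluxxor = require("fluxxor");

// createStore builds a store based on the Store prototype defined above,
// wiring up EventEmitter plus the action bindings declared in initialize().
var TodoStore = Fluxxor.createStore({
  initialize: function() {
    this.todos = [];
    // bindActions also accepts a single map form: {ADD_TODO: "onAddTodo"}
    this.bindActions("ADD_TODO", "onAddTodo");
  },
  onAddTodo: function(payload) {
    this.todos.push({text: payload.text, complete: false});
    this.emit("change"); // inherited from EventEmitter
  }
});
```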
{ "content_hash": "b453c9fa4f5509d8b78a5f88d6c65221", "timestamp": "", "source": "github", "line_count": 70, "max_line_length": 91, "avg_line_length": 27.428571428571427, "alnum_prop": 0.6182291666666667, "repo_name": "niole/NewsCharts", "id": "00f23c5799de114f0aa0126d541d2a93cbcc5f7c", "size": "1920", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "node_modules/fluxxor/lib/store.js", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "CSS", "bytes": "1962" }, { "name": "HTML", "bytes": "2647" }, { "name": "JavaScript", "bytes": "2817219" } ], "symlink_target": "" }
```python
from anormbookmarker.test.test_enviroment import *

with self_contained_session(CONFIG.database_timestamp) as session:
    BASE.metadata.create_all(session.bind)
    # make a tag to make an alias to
    eucalyptus_deglupta = Tag.construct(session=session, tag='Eucalyptus deglupta')
    session.commit()
    # make a tag to use in a conflicting alias for rainbow eucalyptus
    #trees = Tag.construct(session=session, tag='trees')
    #session.commit()
    # make a Alias
    alias = Alias.construct(session=session, tag=eucalyptus_deglupta, alias='rainbow eucalyptus')
    session.commit()
    # make a wordmisspelling for rainbow
    rainbow_wms = WordMisSpelling.construct(session=session, wordmisspelling="rrainbow", word="rainbow")
    # make a Alias with misspelled rrainbow
    alias_rrainbow = Alias.construct(session=session, tag=eucalyptus_deglupta, alias='rrainbow eucalyptus')
    session.commit()
    assert id(alias) == id(alias_rrainbow)
    assert str(alias_rrainbow.tag) == 'Eucalyptus deglupta'
    #try:
    #    # make a duplicate (conflicting) Alias to a different tag (correctly throws ConflictingAliasError)
    #    alias = Alias.construct(session=session, tag=trees, alias='rainbow eucalyptus')
    #except ConflictingAliasError:
    #    print("Correctly throws ConflictingAliasError")
    #session.commit()

db_result = [('select COUNT(*) from alias;', 1),
             ('select COUNT(*) from aliasword;', 2),
             ('select COUNT(*) from bookmark;', 0),
             ('select COUNT(*) from filename;', 0),
             ('select COUNT(*) from tag;', 1),
             ('select COUNT(*) from tag_relationship;', 0),
             ('select COUNT(*) from tagbookmarks;', 0),
             ('select COUNT(*) from tagword;', 2),
             ('select COUNT(*) from word;', 4),
             ('select COUNT(*) from wordmisspelling;', 1)]

check_db_result(config=CONFIG, db_result=db_result)
```
{ "content_hash": "2ea5a718cc1551f0facf66cb08e5ba71", "timestamp": "", "source": "github", "line_count": 46, "max_line_length": 107, "avg_line_length": 41.78260869565217, "alnum_prop": 0.6612903225806451, "repo_name": "jakeogh/anormbookmarker", "id": "7887298ca97402cbb95b98bdf2a7b9cf66dc2bc1", "size": "1946", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "anormbookmarker/test/tests/WordMisSpelling/multiple_word_alias.py", "mode": "33261", "license": "mit", "language": [ { "name": "Python", "bytes": "94877" }, { "name": "Shell", "bytes": "149" } ], "symlink_target": "" }
```cpp
/**
 * \file AnnGameObject.hpp
 * \brief Game Object class
 * \author A. Brainville (Ybalrid)
 */
#pragma once

#include "systemMacro.h"

#include <string>
#include <BtOgrePG.h>

//Annwvyn
#include "AnnTypes.h"
#include "AnnAbstractMovable.hpp"

#pragma warning(default : 4996)

namespace Annwvyn
{
    class AnnDllExport AnnGameObjectManager;
    class AnnBehaviorScript;
    class AnnAudioSource;

    ///An object that exists in the game, graphically and potentially physically
    class AnnDllExport AnnGameObject : public AnnAbstractMovable
    {
    public:
        ///Class constructor
        AnnGameObject();

        ///Deleted AnnGameObject Copy Constructor
        AnnGameObject(const AnnGameObject&) = delete;

        ///Class Destructor. Virtual.
        virtual ~AnnGameObject();

        ///Set position from spatial variables
        /// \param x X component of the position vector
        /// \param y Y component of the position vector
        /// \param z Z component of the position vector
        void setPosition(float x, float y, float z);

        ///Set position from Vector 3D
        /// \param pos 3D position vector. Relative to scene root position
        void setPosition(AnnVect3 pos) override;

        ///Set the world position from a vector
        /// \param pos 3D position vector. Relative to scene root position
        void setWorldPosition(AnnVect3 pos) const;

        ///Set the world position from a few floats
        /// \param x X component of the position vector
        /// \param y Y component of the position vector
        /// \param z Z component of the position vector
        void setWorldPosition(float x, float y, float z) const;

        ///Translate
        /// \param x X component of the translation vector
        /// \param y Y component of the translation vector
        /// \param z Z component of the translation vector
        void translate(float x, float y, float z) const;

        ///Set orientation from Quaternion components
        /// \param w W component of the quaternion
        /// \param x X component of the quaternion
        /// \param y Y component of the quaternion
        /// \param z Z component of the quaternion
        void setOrientation(float w, float x, float y, float z);

        ///Set Orientation from Quaternion
        /// \param orient Quaternion for absolute orientation
        void setOrientation(AnnQuaternion orient) override;

        ///Set the world orientation as a quaternion
        /// \param orient Quaternion for absolute orientation
        void setWorldOrientation(AnnQuaternion orient) const;

        ///Set the world orientation as some floats
        /// \param w W component of the quaternion
        /// \param x X component of the quaternion
        /// \param y Y component of the quaternion
        /// \param z Z component of the quaternion
        void setWorldOrientation(float w, float x, float y, float z) const;

        ///Set scale
        /// \param x X component of the scale vector
        /// \param y Y component of the scale vector
        /// \param z Z component of the scale vector
        /// \param scaleMass If set to true (by default) will update the mass of the rigid body to reflect the change in size (constant density)
        void setScale(float x, float y, float z, bool scaleMass = true) const;

        ///Set scale from Vector 3D
        /// \param scale Relative scaling factor
        /// \param scaleMass Will adjust the mass according to the scaling vector; true by default
        void setScale(AnnVect3 scale, bool scaleMass = true) const;

        ///Get Position
        AnnVect3 getPosition() override;

        ///Get the position in world
        AnnVect3 getWorldPosition() const;

        ///Get Orientation
        AnnQuaternion getOrientation() override;

        ///Get the world orientation
        AnnQuaternion getWorldOrientation() const;

        ///Get scale
        AnnVect3 getScale() const;

        ///Get Ogre Node
        Ogre::SceneNode* getNode() const;

        ///Get the item of this object
        Ogre::Item* getItem() const;

        ///Get the rigid body
        btRigidBody* getBody() const;

        ///Get distance from another object
        /// \param otherObject The object we're counting the distance from
        float getDistance(AnnGameObject* otherObject) const;

        ///Play a sound file
        /// \param name Name of the audio file in a resource location
        /// \param loop If set to true, will play the sound in loop
        /// \param volume Floating point number between 0 and 1 to set the loudness of the sound
        void playSound(const std::string& name, bool loop = false, float volume = 1.0f) const;

        ///Set currently playing animation
        /// \param name Name of the animation as defined by the 3D entity
        void setAnimation(const std::string& name);

        ///Set if we want to play the animation
        /// \param play the playing state we want to apply
        void playAnimation(bool play = true) const;

        ///Loop the animation ?
        /// \param loop the looping state of the animation
        void loopAnimation(bool loop = true) const;

        ///Apply a physical force
        /// \param force Force vector that will be applied to the center of mass of the object
        void applyForce(AnnVect3 force) const;

        ///Apply a physical impulsion
        /// \param impulse the impulsion force
        void applyImpulse(AnnVect3 impulse) const;

        ///Set the linear speed of the object
        /// \param v The linear speed
        void setLinearSpeed(AnnVect3 v) const;

        ///Set the friction coefficient
        ///See the "Results" table from this page : https://www.thoughtspike.com/friction-coefficients-for-bullet-physics/
        /// \param coef friction coef applied to this object's body
        void setFrictionCoef(float coef) const;

        ///Set up Physics
        /// \param mass The mass of the object
        /// \param type The type of shape you want to define for the object
        /// \param hasPlayerCollision If set to true (by default) the object can collide with the player body representation
        void setupPhysics(float mass = 0, phyShapeType type = staticShape, bool hasPlayerCollision = true);

        ///Make the object visible
        void setVisible() const;

        ///Make the object invisible
        void setInvisible() const;

        ///Return the name of the object
        std::string getName() const;

        ///Attach a script to this object
        /// \param scriptName name of a script
        void attachScript(const std::string& scriptName);

        ///Return true if node is attached to the node owned by another AnnGameObject
        bool hasParent() const;

        ///Get the parent Game Object
        std::shared_ptr<AnnGameObject> getParent() const;

        ///Attach an object to this object.
        /// \param child The object you want to attach
        void attachChildObject(std::shared_ptr<AnnGameObject> child) const;

        ///Make the node independent to any GameObject
        void detachFromParent() const;

        ///Recursively check if any parent has a body, if one is found, returns true
        bool checkForBodyInParent();

        ///Recursively check if any child has a body, if one is found, returns true
        bool checkForBodyInChild();

    private:
        ///The GameObjectManager populate the content of this object when it goes through it's initialization
        friend class AnnEngine;
        friend class AnnGameObjectManager;

        //------------------ local utility
        ///Do the actual recursion of checkForBodyInParent
        /// \param obj object to check (for recursion)
        bool parentsHaveBody(AnnGameObject* obj) const;

        ///Do the actual recursion of checkForBodyInChild
        /// \param obj object to check (for recursion)
        static bool childrenHaveBody(AnnGameObject* obj);

        //----------------- engine utility
        ///For engine : set node
        /// \param node ogre SceneNode
        void setNode(Ogre::SceneNode* node);

        ///For engine : set item
        /// \param item Ogre::Item
        void setItem(Ogre::Item* item);

        ///For engine : get elapsed time
        /// \param offsetTime time to add to the animation
        void addAnimationTime(double offsetTime) const;

        ///For engine : update OpenAL source position
        void updateOpenAlPos() const;

        ///SceneNode. This also holds the position/orientation/scale of the object
        Ogre::SceneNode* sceneNode;

        ///Entity
        Ogre::Item* model3D;

        ///Currently selected animation
        Ogre::SkeletonAnimation* currentAnimation;

        ///Bullet shape
        btCollisionShape* collisionShape;

        ///Bullet rigid body
        btRigidBody* rigidBody;

        ///Mass of the rigid body
        float bodyMass;

        ///AnnAudioEngine audioSource;
        std::shared_ptr<AnnAudioSource> audioSource;

        ///Name of the object
        std::string name;

        ///RigidBodyState of this object
        BtOgre::RigidBodyState* state;

        ///list of script objects
        std::vector<std::shared_ptr<AnnBehaviorScript>> scripts;

    public:
        ///Executed after object initialization
        virtual void postInit() {}

        ///Executed at refresh time (each frame)
        virtual void update() {}

        ///Call the update methods of all the script present in the scripts container
        void callUpdateOnScripts();
    };

    using AnnGameObjectPtr = std::shared_ptr<AnnGameObject>;
}
```
{ "content_hash": "34e4dfe65a46f0a1a65d9fd1794c7ec5", "timestamp": "", "source": "github", "line_count": 270, "max_line_length": 138, "avg_line_length": 31.755555555555556, "alnum_prop": 0.7234662934452998, "repo_name": "Ybalrid/Annwvyn", "id": "65c4c6e33aef9e9edc07deb7d1ded4155812e30d", "size": "8574", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "include/AnnGameObject.hpp", "mode": "33188", "license": "mit", "language": [ { "name": "Batchfile", "bytes": "6717" }, { "name": "C", "bytes": "1891" }, { "name": "C++", "bytes": "978781" }, { "name": "CMake", "bytes": "19954" }, { "name": "GLSL", "bytes": "36680" }, { "name": "Inno Setup", "bytes": "1878" }, { "name": "Shell", "bytes": "753" } ], "symlink_target": "" }
package headfirst.factory.pizzafm;

public class ChicagoStylePepperoniPizza extends Pizza {

	public ChicagoStylePepperoniPizza() {
		name = "Chicago Style Pepperoni Pizza";
		dough = "Extra Thick Crust Dough";
		sauce = "Plum Tomato Sauce";

		toppings.add("Shredded Mozzarella Cheese");
		toppings.add("Black Olives");
		toppings.add("Spinach");
		toppings.add("Eggplant");
		toppings.add("Sliced Pepperoni");
	}

	void cut() {
		System.out.println("Cutting the pizza into square slices");
	}
}
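// ---------------------------------------------------------------------------
// Not part of this file: a minimal sketch of the Pizza base class this
// subclass assumes, inferred from the fields assigned in the constructor and
// the cut() override above. The prepare/bake/box steps are assumptions based
// on the factory-method example this package implements, not the book's
// exact code.
// ---------------------------------------------------------------------------
package headfirst.factory.pizzafm;

import java.util.ArrayList;
import java.util.List;

public abstract class Pizza {
	String name;
	String dough;
	String sauce;
	List<String> toppings = new ArrayList<String>();

	void prepare() {
		// assumed lifecycle step: announce the pizza set up by the subclass
		System.out.println("Preparing " + name);
	}

	void bake() {
		System.out.println("Bake for 25 minutes at 350"); // assumed default
	}

	void cut() {
		// default slicing, overridden by ChicagoStylePepperoniPizza above
		System.out.println("Cutting the pizza into diagonal slices");
	}

	void box() {
		System.out.println("Place pizza in official PizzaStore box"); // assumed
	}
}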
{ "content_hash": "953032144486442e55e898b01222ba8b", "timestamp": "", "source": "github", "line_count": 19, "max_line_length": 61, "avg_line_length": 26.105263157894736, "alnum_prop": 0.7258064516129032, "repo_name": "ljaljushkin/PatternPractise", "id": "df8c529af9fe07d1961bdb5480c6d63aa44f6106", "size": "496", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "from_web/HF_DP/src/headfirst/factory/pizzafm/ChicagoStylePepperoniPizza.java", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Java", "bytes": "300998" } ], "symlink_target": "" }
Arduino Play
{ "content_hash": "5411d6feece85df651eb3160e956f8bc", "timestamp": "", "source": "github", "line_count": 2, "max_line_length": 12, "avg_line_length": 7, "alnum_prop": 0.7857142857142857, "repo_name": "cthaeler/Arduino_Projects", "id": "20e13961d2c889ed3fe8ee5481d304bcaafcdaf5", "size": "33", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "README.md", "mode": "33188", "license": "mit", "language": [ { "name": "Arduino", "bytes": "13250" } ], "symlink_target": "" }
package jp.fukushima.namie.town.dojo;

import org.apache.cordova.CordovaActivity;

import android.os.Bundle;
import android.os.Handler;

public class NamieTabletDojo extends CordovaActivity {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Set by <content src="index.html" /> in config.xml
        loadUrl(launchUrl);
    }

    /**
     * Called when a receive error occurs.
     */
    @Override
    public void onReceivedError(int errorCode, String description, String failingUrl) {
        final Handler handler = new Handler();
        handler.postDelayed(new Runnable() {
            @Override
            public void run() {
                // Message: "Namie Dojo" / "The application could not be displayed.
                // Check your signal and restart the application." / "OK"
                displayError("なみえ道場", "アプリケーションを表示できませんでした。電波状態を見直して、起動し直してください。", "OK", true);
            }
        }, 3000);
    }
}
{ "content_hash": "b23edf4a5d0e6d1c7eecff5b14c9e174", "timestamp": "", "source": "github", "line_count": 32, "max_line_length": 81, "avg_line_length": 27, "alnum_prop": 0.6157407407407407, "repo_name": "codefornamie/namie-tablet-html5", "id": "f4225b65f274327b34299d2cc022158ddfcc0933", "size": "1636", "binary": false, "copies": "1", "ref": "refs/heads/develop", "path": "cordova/dojo/platforms/android/src/jp/fukushima/namie/town/dojo/NamieTabletDojo.java", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Batchfile", "bytes": "57251" }, { "name": "C#", "bytes": "646443" }, { "name": "C++", "bytes": "256912" }, { "name": "CSS", "bytes": "209864" }, { "name": "CoffeeScript", "bytes": "9229" }, { "name": "HTML", "bytes": "255450" }, { "name": "Java", "bytes": "6540219" }, { "name": "JavaScript", "bytes": "5111788" }, { "name": "Objective-C", "bytes": "972369" }, { "name": "QML", "bytes": "31292" }, { "name": "Shell", "bytes": "8247" } ], "symlink_target": "" }
package org.wso2.carbon.apimgt.keymgt;

import org.wso2.carbon.apimgt.keymgt.model.SubscriptionDataStore;
import org.wso2.carbon.apimgt.keymgt.model.impl.SubscriptionDataStoreImpl;

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/*
 * This class holds tenant-wise subscription data stores.
 */
public class SubscriptionDataHolder {

    protected Map<String, SubscriptionDataStore> subscriptionStore = new ConcurrentHashMap<>();
    private static SubscriptionDataHolder instance = new SubscriptionDataHolder();

    public static SubscriptionDataHolder getInstance() {
        return instance;
    }

    public SubscriptionDataStore registerTenantSubscriptionStore(String tenantDomain) {
        SubscriptionDataStore tenantStore = subscriptionStore.get(tenantDomain);
        if (tenantStore == null) {
            tenantStore = new SubscriptionDataStoreImpl(tenantDomain);
        }
        subscriptionStore.put(tenantDomain, tenantStore);
        return tenantStore;
    }

    public void initializeSubscriptionStore(String tenantDomain) {
        SubscriptionDataStore tenantStore = subscriptionStore.get(tenantDomain);
        if (tenantStore != null) {
            tenantStore.init();
        }
    }

    public void unregisterTenantSubscriptionStore(String tenantDomain) {
        subscriptionStore.remove(tenantDomain);
    }

    public SubscriptionDataStore getTenantSubscriptionStore(String tenantDomain) {
        return subscriptionStore.get(tenantDomain);
    }
}
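// ---------------------------------------------------------------------------
// Not part of this file: a minimal usage sketch of the holder above.
// "carbon.super" is an illustrative tenant domain, not something the class
// mandates.
// ---------------------------------------------------------------------------
package org.wso2.carbon.apimgt.keymgt;

import org.wso2.carbon.apimgt.keymgt.model.SubscriptionDataStore;

public class SubscriptionDataHolderDemo {
    public static void main(String[] args) {
        SubscriptionDataHolder holder = SubscriptionDataHolder.getInstance();

        // create (or fetch) and register the per-tenant store, then init it
        holder.registerTenantSubscriptionStore("carbon.super");
        holder.initializeSubscriptionStore("carbon.super");

        SubscriptionDataStore store = holder.getTenantSubscriptionStore("carbon.super");
        // ... use the store ...

        holder.unregisterTenantSubscriptionStore("carbon.super");
    }
}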
{ "content_hash": "bf4c1467a56ba8b6e6bd932b38de7896", "timestamp": "", "source": "github", "line_count": 52, "max_line_length": 87, "avg_line_length": 29.346153846153847, "alnum_prop": 0.7326343381389253, "repo_name": "fazlan-nazeem/carbon-apimgt", "id": "3fc78949ac41dcc768ba92e546510a457445f9ac", "size": "2199", "binary": false, "copies": "2", "ref": "refs/heads/master", "path": "components/apimgt/org.wso2.carbon.apimgt.keymgt/src/main/java/org/wso2/carbon/apimgt/keymgt/SubscriptionDataHolder.java", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Batchfile", "bytes": "11203" }, { "name": "CSS", "bytes": "1852737" }, { "name": "HTML", "bytes": "1971572" }, { "name": "Java", "bytes": "12739249" }, { "name": "JavaScript", "bytes": "10823366" }, { "name": "PLSQL", "bytes": "168275" }, { "name": "Shell", "bytes": "33403" }, { "name": "TSQL", "bytes": "605388" } ], "symlink_target": "" }
var mongoose = require('mongoose');
var bcrypt   = require('bcrypt-nodejs');
var Schema   = mongoose.Schema;

// define the schema for our user model
var userSchema = new Schema({
    local: {
        email: String,
        password: String,
    }
});

// generating a hash
userSchema.methods.generateHash = function(password) {
    return bcrypt.hashSync(password, bcrypt.genSaltSync(8), null);
};

// checking whether a candidate password matches the stored hash
userSchema.methods.validPassword = function(password) {
    return bcrypt.compareSync(password, this.local.password);
};

// create the model for users and export to app
module.exports = mongoose.model('User', userSchema);
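// ---------------------------------------------------------------------------
// Not part of this file: a minimal usage sketch of the model above. It
// assumes mongoose.connect() has already been called elsewhere in the app,
// and the require path and credentials are illustrative.
// ---------------------------------------------------------------------------
var User = require('./models/user');

var user = new User();
user.local.email = 'someone@example.com';
// store the bcrypt hash, never the plaintext password
user.local.password = user.generateHash('s3cret');

user.save(function(err) {
    if (err) throw err;
    console.log(user.validPassword('s3cret')); // true
    console.log(user.validPassword('wrong'));  // false
});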
{ "content_hash": "d8a75860692f9ee2e3223e9ad091f82f", "timestamp": "", "source": "github", "line_count": 22, "max_line_length": 63, "avg_line_length": 28.5, "alnum_prop": 0.7432216905901117, "repo_name": "npkumar/conference-spa", "id": "4a246179b5d2a8da05f4280a457891310cd39e25", "size": "657", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "server/models/user.js", "mode": "33188", "license": "mit", "language": [ { "name": "CSS", "bytes": "5361" }, { "name": "JavaScript", "bytes": "10376" } ], "symlink_target": "" }
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>stdpp: Not compatible 👼</title>
    <link rel="shortcut icon" type="image/png" href="../../../../../favicon.png" />
    <link href="../../../../../bootstrap.min.css" rel="stylesheet">
    <link href="../../../../../bootstrap-custom.css" rel="stylesheet">
    <link href="//maxcdn.bootstrapcdn.com/font-awesome/4.2.0/css/font-awesome.min.css" rel="stylesheet">
    <script src="../../../../../moment.min.js"></script>
    <!-- HTML5 Shim and Respond.js IE8 support of HTML5 elements and media queries -->
    <!-- WARNING: Respond.js doesn't work if you view the page via file:// -->
    <!--[if lt IE 9]>
      <script src="https://oss.maxcdn.com/html5shiv/3.7.2/html5shiv.min.js"></script>
      <script src="https://oss.maxcdn.com/respond/1.4.2/respond.min.js"></script>
    <![endif]-->
  </head>
  <body>
    <div class="container">
      <div class="navbar navbar-default" role="navigation">
        <div class="container-fluid">
          <div class="navbar-header">
            <a class="navbar-brand" href="../../../../.."><i class="fa fa-lg fa-flag-checkered"></i> Coq bench</a>
          </div>
          <div id="navbar" class="collapse navbar-collapse">
            <ul class="nav navbar-nav">
              <li><a href="../..">clean / released</a></li>
              <li class="active"><a href="">8.12.2 / stdpp - 1.2.1</a></li>
            </ul>
          </div>
        </div>
      </div>
      <div class="article">
        <div class="row">
          <div class="col-md-12">
            <a href="../..">« Up</a>
            <h1>
              stdpp
              <small>
                1.2.1
                <span class="label label-info">Not compatible 👼</span>
              </small>
            </h1>
            <p>📅 <em><script>document.write(moment("2022-04-22 09:36:58 +0000", "YYYY-MM-DD HH:mm:ss Z").fromNow());</script> (2022-04-22 09:36:58 UTC)</em></p>
            <h2>Context</h2>
            <pre># Packages matching: installed
# Name                # Installed  # Synopsis
base-bigarray         base
base-threads          base
base-unix             base
conf-findutils        1            Virtual package relying on findutils
coq                   8.12.2       Formal proof management system
num                   1.4          The legacy Num library for arbitrary-precision integer and rational arithmetic
ocaml                 4.10.2       The OCaml compiler (virtual package)
ocaml-base-compiler   4.10.2       Official release 4.10.2
ocaml-config          1            OCaml Switch Configuration
ocamlfind             1.9.3        A library manager for OCaml
# opam file:
opam-version: &quot;2.0&quot;
name: &quot;coq-stdpp&quot;
maintainer: &quot;Ralf Jung &lt;[email protected]&gt;&quot;
homepage: &quot;https://gitlab.mpi-sws.org/iris/stdpp&quot;
authors: &quot;Robbert Krebbers, Jacques-Henri Jourdan, Ralf Jung&quot;
bug-reports: &quot;https://gitlab.mpi-sws.org/iris/stdpp/issues&quot;
license: &quot;BSD&quot;
dev-repo: &quot;git+https://gitlab.mpi-sws.org/iris/stdpp.git&quot;
tags: [ &quot;date:2019-08-29&quot; ]
synopsis: &quot;std++ is an extended \&quot;Standard Library\&quot; for Coq&quot;
description: &quot;&quot;&quot;
The key features of this library are as follows:

- It provides a great number of definitions and lemmas for common data
  structures such as lists, finite maps, finite sets, and finite multisets.
- It uses type classes for common notations (like `∅`, `∪`, and Haskell-style
  monad notations) so that these can be overloaded for different data
  structures.
- It uses type classes to keep track of common properties of types, like it
  having decidable equality or being countable or finite.
- Most data structures are represented in canonical ways so that Leibniz
  equality can be used as much as possible (for example, for maps we have
  `m1 = m2` iff `∀ i, m1 !! i = m2 !! i`). On top of that, the library
  provides setoid instances for most types and operations.
- It provides various tactics for common tasks, like an ssreflect inspired
  `done` tactic for finishing trivial goals, a simple breadth-first solver
  `naive_solver`, an equality simplifier `simplify_eq`, a solver
  `solve_proper` for proving compatibility of functions with respect to
  relations, and a solver `set_solver` for goals involving set operations.
- It is entirely dependency- and axiom-free.&quot;&quot;&quot;
depends: [
  &quot;coq&quot; {(&gt;= &quot;8.7.2&quot; &amp; &lt; &quot;8.11~&quot;) | (= &quot;dev&quot;)}
]
build: [make &quot;-j%{jobs}%&quot;]
install: [make &quot;install&quot;]
url {
  src: &quot;https://gitlab.mpi-sws.org/iris/stdpp/-/archive/coq-stdpp-1.2.1.tar.gz&quot;
  checksum: &quot;sha512=8eb04e9a70a9be0b7a9565516d03952ec9c54cb7abe20ddea4d27ceb3c18022fdf2f1409df6988092741a0e5e9086f852dc2eb3e7bdc4f4eff91b2baa72c16af&quot;
}
</pre>
            <h2>Lint</h2>
            <dl class="dl-horizontal">
              <dt>Command</dt>
              <dd><code>true</code></dd>
              <dt>Return code</dt>
              <dd>0</dd>
            </dl>
            <h2>Dry install 🏜️</h2>
            <p>Dry install with the current Coq version:</p>
            <dl class="dl-horizontal">
              <dt>Command</dt>
              <dd><code>opam install -y --show-action coq-stdpp.1.2.1 coq.8.12.2</code></dd>
              <dt>Return code</dt>
              <dd>5120</dd>
              <dt>Output</dt>
              <dd><pre>[NOTE] Package coq is already installed (current version is 8.12.2).
The following dependencies couldn&#39;t be met:
  - coq-stdpp -&gt; coq (&lt; 8.11~ &amp; = dev) -&gt; ocaml &lt; 4.10
    base of this switch (use `--unlock-base&#39; to force)
No solution found, exiting
</pre></dd>
            </dl>
            <p>Dry install without Coq/switch base, to test if the problem was incompatibility with the current Coq/OCaml version:</p>
            <dl class="dl-horizontal">
              <dt>Command</dt>
              <dd><code>opam remove -y coq; opam install -y --show-action --unlock-base coq-stdpp.1.2.1</code></dd>
              <dt>Return code</dt>
              <dd>0</dd>
            </dl>
            <h2>Install dependencies</h2>
            <dl class="dl-horizontal">
              <dt>Command</dt>
              <dd><code>true</code></dd>
              <dt>Return code</dt>
              <dd>0</dd>
              <dt>Duration</dt>
              <dd>0 s</dd>
            </dl>
            <h2>Install 🚀</h2>
            <dl class="dl-horizontal">
              <dt>Command</dt>
              <dd><code>true</code></dd>
              <dt>Return code</dt>
              <dd>0</dd>
              <dt>Duration</dt>
              <dd>0 s</dd>
            </dl>
            <h2>Installation size</h2>
            <p>No files were installed.</p>
            <h2>Uninstall 🧹</h2>
            <dl class="dl-horizontal">
              <dt>Command</dt>
              <dd><code>true</code></dd>
              <dt>Return code</dt>
              <dd>0</dd>
              <dt>Missing removes</dt>
              <dd>none</dd>
              <dt>Wrong removes</dt>
              <dd>none</dd>
            </dl>
          </div>
        </div>
      </div>
      <hr/>
      <div class="footer">
        <p class="text-center">
          Sources are on <a href="https://github.com/coq-bench">GitHub</a> © Guillaume Claret 🐣
        </p>
      </div>
    </div>
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
    <script src="../../../../../bootstrap.min.js"></script>
  </body>
</html>
{ "content_hash": "ce1b0eb44ad988804ce0fc2c0370f85a", "timestamp": "", "source": "github", "line_count": 177, "max_line_length": 159, "avg_line_length": 43.67231638418079, "alnum_prop": 0.5664941785252264, "repo_name": "coq-bench/coq-bench.github.io", "id": "7a40c199bdf11ed2cf4c9a9b06fa74ee9ced2829", "size": "7761", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "clean/Linux-x86_64-4.10.2-2.0.6/released/8.12.2/stdpp/1.2.1.html", "mode": "33188", "license": "mit", "language": [], "symlink_target": "" }
module Teamwork
  module ApplicationHelper
    def self.top_nav_links
      [
        { url: Teamwork::Engine.routes.url_helpers.root_path(locale: ::I18n.locale), name: I18n.t('teamwork.titles.home') },
      ]
    end
  end
end
{ "content_hash": "923dcd0bcb625ad76814f524013dee34", "timestamp": "", "source": "github", "line_count": 9, "max_line_length": 123, "avg_line_length": 25.333333333333332, "alnum_prop": 0.6491228070175439, "repo_name": "itpkg/teamwork", "id": "e514f08f992f6ab4397b14b53695c73798a52b5c", "size": "228", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "app/helpers/teamwork/application_helper.rb", "mode": "33188", "license": "mit", "language": [ { "name": "CSS", "bytes": "3249" }, { "name": "HTML", "bytes": "51502" }, { "name": "JavaScript", "bytes": "13430" }, { "name": "Ruby", "bytes": "41016" } ], "symlink_target": "" }
using UnityEngine;
using System.Collections;
#if UNITY_EDITOR
using UnityEditor;
#endif

namespace UnitySampleAssets.Utility
{
    public class WaypointCircuit : MonoBehaviour
    {
        public WaypointList waypointList = new WaypointList();
        [SerializeField] private bool smoothRoute = true;
        private int numPoints;
        private Vector3[] points;
        private float[] distances;

        public float editorVisualisationSubsteps = 100;

        public float Length { get; private set; }

        public Transform[] Waypoints
        {
            get { return waypointList.items; }
        }

        // this being here will save GC allocs
        private int p0n;
        private int p1n;
        private int p2n;
        private int p3n;

        private float i;
        private Vector3 P0;
        private Vector3 P1;
        private Vector3 P2;
        private Vector3 P3;

        // Use this for initialization
        private void Awake()
        {
            if (Waypoints.Length > 1)
            {
                CachePositionsAndDistances();
            }
            numPoints = Waypoints.Length;
        }

        public RoutePoint GetRoutePoint(float dist)
        {
            // position and direction
            Vector3 p1 = GetRoutePosition(dist);
            Vector3 p2 = GetRoutePosition(dist + 0.1f);
            Vector3 delta = p2 - p1;
            return new RoutePoint(p1, delta.normalized);
        }

        public Vector3 GetRoutePosition(float dist)
        {
            int point = 0;

            if (Length == 0)
            {
                Length = distances[distances.Length - 1];
            }

            dist = Mathf.Repeat(dist, Length);

            while (distances[point] < dist)
            {
                ++point;
            }

            // get nearest two points, ensuring points wrap-around start & end of circuit
            p1n = ((point - 1) + numPoints)%numPoints;
            p2n = point;

            // found point numbers, now find interpolation value between the two middle points
            i = Mathf.InverseLerp(distances[p1n], distances[p2n], dist);

            if (smoothRoute)
            {
                // smooth catmull-rom calculation between the two relevant points

                // get indices for the surrounding 2 points, because
                // four points are required by the catmull-rom function
                p0n = ((point - 2) + numPoints)%numPoints;
                p3n = (point + 1)%numPoints;

                // 2nd point may have been the 'last' point - a dupe of the first,
                // (to give a value of max track distance instead of zero)
                // but now it must be wrapped back to zero if that was the case.
                p2n = p2n%numPoints;

                P0 = points[p0n];
                P1 = points[p1n];
                P2 = points[p2n];
                P3 = points[p3n];

                return CatmullRom(P0, P1, P2, P3, i);
            }
            else
            {
                // simple linear lerp between the two points:
                p1n = ((point - 1) + numPoints)%numPoints;
                p2n = point;

                return Vector3.Lerp(points[p1n], points[p2n], i);
            }
        }

        private Vector3 CatmullRom(Vector3 _P0, Vector3 _P1, Vector3 _P2, Vector3 _P3, float _i)
        {
            // the standard Catmull-Rom spline equation: interpolates between _P1 and _P2,
            // with _P0 and _P3 shaping the entry and exit tangents
            return 0.5f*
                   ((2*_P1) + (-_P0 + _P2)*_i + (2*_P0 - 5*_P1 + 4*_P2 - _P3)*_i*_i +
                    (-_P0 + 3*_P1 - 3*_P2 + _P3)*_i*_i*_i);
        }

        private void CachePositionsAndDistances()
        {
            // transfer the position of each point and distances between points to arrays for
            // speed of lookup at runtime
            points = new Vector3[Waypoints.Length + 1];
            distances = new float[Waypoints.Length + 1];

            float accumulateDistance = 0;
            for (int i = 0; i < points.Length; ++i)
            {
                var t1 = Waypoints[(i)%Waypoints.Length];
                var t2 = Waypoints[(i + 1)%Waypoints.Length];
                if (t1 != null && t2 != null)
                {
                    Vector3 p1 = t1.position;
                    Vector3 p2 = t2.position;
                    points[i] = Waypoints[i%Waypoints.Length].position;
                    distances[i] = accumulateDistance;
                    accumulateDistance += (p1 - p2).magnitude;
                }
            }
        }

        private void OnDrawGizmos()
        {
            DrawGizmos(false);
        }

        private void OnDrawGizmosSelected()
        {
            DrawGizmos(true);
        }

        private void DrawGizmos(bool selected)
        {
            waypointList.circuit = this;
            if (Waypoints.Length > 1)
            {
                numPoints = Waypoints.Length;

                CachePositionsAndDistances();
                Length = distances[distances.Length - 1];

                Gizmos.color = selected ? Color.yellow : new Color(1, 1, 0, 0.5f);
                Vector3 prev = Waypoints[0].position;
                if (smoothRoute)
                {
                    for (float dist = 0; dist < Length; dist += Length/editorVisualisationSubsteps)
                    {
                        Vector3 next = GetRoutePosition(dist + 1);
                        Gizmos.DrawLine(prev, next);
                        prev = next;
                    }
                    Gizmos.DrawLine(prev, Waypoints[0].position);
                }
                else
                {
                    for (int n = 0; n < Waypoints.Length; ++n)
                    {
                        Vector3 next = Waypoints[(n + 1)%Waypoints.Length].position;
                        Gizmos.DrawLine(prev, next);
                        prev = next;
                    }
                }
            }
        }

        [System.Serializable]
        public class WaypointList
        {
            public WaypointCircuit circuit;
            public Transform[] items = new Transform[0];
        }

        public struct RoutePoint
        {
            public Vector3 position;
            public Vector3 direction;

            public RoutePoint(Vector3 position, Vector3 direction)
            {
                this.position = position;
                this.direction = direction;
            }
        }
    }
}

namespace UnitySampleAssets.Utility.Inspector
{
#if UNITY_EDITOR
    [CustomPropertyDrawer(typeof (WaypointCircuit.WaypointList))]
    public class WaypointListDrawer : PropertyDrawer
    {
        private float lineHeight = 18;
        private float spacing = 4;

        public override void OnGUI(Rect position, SerializedProperty property, GUIContent label)
        {
            EditorGUI.BeginProperty(position, label, property);

            float x = position.x;
            float y = position.y;
            float inspectorWidth = position.width;

            // Draw label

            // Don't make child fields be indented
            var indent = EditorGUI.indentLevel;
            EditorGUI.indentLevel = 0;

            var items = property.FindPropertyRelative("items");
            string[] titles = new string[] {"Transform", "", "", ""};
            string[] props = new string[] {"transform", "^", "v", "-"};
            float[] widths = new float[] {.7f, .1f, .1f, .1f};
            float lineHeight = 18;
            bool changedLength = false;
            if (items.arraySize > 0)
            {
                for (int i = -1; i < items.arraySize; ++i)
                {
                    var item = items.GetArrayElementAtIndex(i);

                    float rowX = x;
                    for (int n = 0; n < props.Length; ++n)
                    {
                        float w = widths[n]*inspectorWidth;

                        // Calculate rects
                        Rect rect = new Rect(rowX, y, w, lineHeight);
                        rowX += w;

                        if (i == -1)
                        {
                            EditorGUI.LabelField(rect, titles[n]);
                        }
                        else
                        {
                            if (n == 0)
                            {
                                EditorGUI.ObjectField(rect, item.objectReferenceValue, typeof (Transform), true);
                            }
                            else
                            {
                                if (GUI.Button(rect, props[n]))
                                {
                                    switch (props[n])
                                    {
                                        case "-":
                                            // for object-reference arrays the first delete only
                                            // nulls the slot; the second call actually removes it
                                            items.DeleteArrayElementAtIndex(i);
                                            items.DeleteArrayElementAtIndex(i);
                                            changedLength = true;
                                            break;
                                        case "v":
                                            // move down: bounds check fixed (was `i > 0`)
                                            if (i < items.arraySize - 1) items.MoveArrayElement(i, i + 1);
                                            break;
                                        case "^":
                                            // move up: bounds check fixed (was `i < arraySize - 1`)
                                            if (i > 0) items.MoveArrayElement(i, i - 1);
                                            break;
                                    }
                                }
                            }
                        }
                    }

                    y += lineHeight + spacing;
                    if (changedLength)
                    {
                        break;
                    }
                }
            }
            else
            {
                // add button
                var addButtonRect = new Rect((x + position.width) - widths[widths.Length - 1]*inspectorWidth, y,
                                             widths[widths.Length - 1]*inspectorWidth, lineHeight);
                if (GUI.Button(addButtonRect, "+"))
                {
                    items.InsertArrayElementAtIndex(items.arraySize);
                }

                y += lineHeight + spacing;
            }

            // add all button
            var addAllButtonRect = new Rect(x, y, inspectorWidth, lineHeight);
            if (GUI.Button(addAllButtonRect, "Assign using all child objects"))
            {
                var circuit = property.FindPropertyRelative("circuit").objectReferenceValue as WaypointCircuit;
                var children = new Transform[circuit.transform.childCount];
                int n = 0;
                foreach (Transform child in circuit.transform) children[n++] = child;
                System.Array.Sort(children, new TransformNameComparer());
                circuit.waypointList.items = new Transform[children.Length];
                for (n = 0; n < children.Length; ++n)
                {
                    circuit.waypointList.items[n] = children[n];
                }
            }
            y += lineHeight + spacing;

            // rename all button
            var renameButtonRect = new Rect(x, y, inspectorWidth, lineHeight);
            if (GUI.Button(renameButtonRect, "Auto Rename numerically from this order"))
            {
                var circuit = property.FindPropertyRelative("circuit").objectReferenceValue as WaypointCircuit;
                int n = 0;
                foreach (Transform child in circuit.waypointList.items) child.name = "Waypoint " + (n++).ToString("000");
            }
            y += lineHeight + spacing;

            // Set indent back to what it was
            EditorGUI.indentLevel = indent;
            EditorGUI.EndProperty();
        }

        public override float GetPropertyHeight(SerializedProperty property, GUIContent label)
        {
            SerializedProperty items = property.FindPropertyRelative("items");
            float lineAndSpace = lineHeight + spacing;
            return 40 + (items.arraySize*lineAndSpace) + lineAndSpace;
        }

        // comparer for sorting transforms by name
        public class TransformNameComparer : IComparer
        {
            public int Compare(object x, object y)
            {
                return ((Transform) x).name.CompareTo(((Transform) y).name);
            }
        }
    }
#endif
}
{ "content_hash": "3274d98bfddd122dc0730f95ba8388d8", "timestamp": "", "source": "github", "line_count": 383, "max_line_length": 113, "avg_line_length": 33.7911227154047, "alnum_prop": 0.4480760315252666, "repo_name": "scarlethammergames/showcase", "id": "deabd5af030340a29b74ad604388b2f33e4ac744", "size": "12944", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "Assets/Downloaded/SampleAssets/Utility/WaypointCircuit.cs", "mode": "33188", "license": "mit", "language": [ { "name": "C", "bytes": "33102" }, { "name": "C#", "bytes": "297665" } ], "symlink_target": "" }
Broadlink RM2 network protocol
==============================

Encryption
----------

Packets include AES-based encryption in CBC mode. The initial key is 0x09, 0x76, 0x28, 0x34, 0x3f, 0xe9, 0x9e, 0x23, 0x76, 0x5c, 0x15, 0x13, 0xac, 0xcf, 0x8b, 0x02. The IV is 0x56, 0x2e, 0x17, 0x99, 0x6d, 0x09, 0x3d, 0x28, 0xdd, 0xb3, 0xba, 0x69, 0x5a, 0x2e, 0x6f, 0x58.

Checksum
--------

Construct the packet and set the checksum bytes to zero. Add each byte to the starting value of 0xbeaf, wrapping after 0xffff.

New device setup
----------------

To set up a new Broadlink device while in AP mode, a 136 byte packet needs to be sent to the device as follows:

| Offset | Contents |
|---------|----------|
|0x00-0x19|00|
|0x20-0x21|Checksum as a little-endian 16 bit integer|
|0x26|14 (Always 14)|
|0x44-0x63|SSID Name (zero padding is appended)|
|0x64-0x83|Password (zero padding is appended)|
|0x84|Character length of SSID|
|0x85|Character length of password|
|0x86|Wireless security mode (00 = none, 01 = WEP, 02 = WPA1, 03 = WPA2, 04 = WPA1/2)|
|0x87-0x88|00|

Send this packet as a UDP broadcast to 255.255.255.255 on port 80.

Network discovery
-----------------

To discover Broadlink devices on the local network, send a 48 byte packet with the following contents:

| Offset | Contents |
|---------|----------|
|0x00-0x07|00|
|0x08-0x0b|Current offset from GMT as a little-endian 32 bit integer|
|0x0c-0x0d|Current year as a little-endian 16 bit integer|
|0x0e|Current number of seconds past the minute|
|0x0f|Current number of minutes past the hour|
|0x10|Current number of hours past midnight|
|0x11|Current day of the week (Monday = 1, Tuesday = 2, etc)|
|0x12|Current day in month|
|0x13|Current month|
|0x14-0x17|00|
|0x18-0x1b|Local IP address|
|0x1c-0x1d|Source port as a little-endian 16 bit integer|
|0x1e-0x1f|00|
|0x20-0x21|Checksum as a little-endian 16 bit integer|
|0x22-0x25|00|
|0x26|06|
|0x27-0x2f|00|

Send this packet as a UDP broadcast to 255.255.255.255 on port 80.
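As a minimal Python sketch of the checksum rule and the discovery broadcast described above (an illustration, not the reference implementation; the local IP, the port, and the assumption that the GMT offset is expressed in hours are all placeholders):

```python
# Sketch of the 0xbeaf checksum and the 48-byte discovery broadcast.
import datetime
import socket
import struct
import time

def checksum(packet):
    """Add each byte to 0xbeaf, wrapping after 0xffff."""
    total = 0xbeaf
    for byte in packet:
        total = (total + byte) & 0xffff
    return total

def build_discovery_packet(local_ip='192.168.1.10', local_port=54321):
    now = datetime.datetime.now()
    packet = bytearray(0x30)                           # 48 bytes, zero-filled
    gmt_offset = -time.timezone // 3600                # assumed to be in hours
    struct.pack_into('<i', packet, 0x08, gmt_offset)   # signed, little-endian
    struct.pack_into('<H', packet, 0x0c, now.year)
    packet[0x0e] = now.second
    packet[0x0f] = now.minute
    packet[0x10] = now.hour
    packet[0x11] = now.isoweekday()                    # Monday = 1
    packet[0x12] = now.day
    packet[0x13] = now.month
    packet[0x18:0x1c] = socket.inet_aton(local_ip)
    struct.pack_into('<H', packet, 0x1c, local_port)
    packet[0x26] = 0x06
    # checksum is computed while the checksum bytes are still zero
    struct.pack_into('<H', packet, 0x20, checksum(packet))
    return packet

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.sendto(build_discovery_packet(), ('255.255.255.255', 80))
```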
Response (any unicast response):

| Offset | Contents |
|---------|----------|
|0x34-0x35|Device type as a little-endian 16 bit integer (see device type mapping)|
|0x3a-0x3f|MAC address of the target device|

Device type mapping:

| Device type in response packet | Device type | Treat as |
|---------|----------|----------|
|0|SP1|SP1|
|0x2711|SP2|SP2|
|0x2719 or 0x7919 or 0x271a or 0x791a|Honeywell SP2|SP2|
|0x2720|SPMini|SP2|
|0x753e|SP3|SP2|
|0x2728|SPMini2|SP2|
|0x2733 or 0x273e|OEM branded SPMini|SP2|
|>= 0x7530 and <= 0x7918|OEM branded SPMini2|SP2|
|0x2736|SPMiniPlus|SP2|
|0x2712|RM2|RM|
|0x2737|RM Mini / RM3 Mini Blackbean|RM|
|0x273d|RM Pro Phicomm|RM|
|0x2783|RM2 Home Plus|RM|
|0x277c|RM2 Home Plus GDT|RM|
|0x272a|RM2 Pro Plus|RM|
|0x2787|RM2 Pro Plus2|RM|
|0x278b|RM2 Pro Plus BL|RM|
|0x278f|RM Mini Shate|RM|
|0x2714|A1|A1|
|0x4EB5|MP1|MP1|

Command packet format
---------------------

The command packet header is 56 bytes long with the following format:

|Offset|Contents|
|------|--------|
|0x00|0x5a|
|0x01|0xa5|
|0x02|0xaa|
|0x03|0x55|
|0x04|0x5a|
|0x05|0xa5|
|0x06|0xaa|
|0x07|0x55|
|0x08-0x1f|00|
|0x20-0x21|Checksum of full packet as a little-endian 16 bit integer|
|0x22-0x23|00|
|0x24-0x25|Device type as a little-endian 16 bit integer|
|0x26-0x27|Command code as a little-endian 16 bit integer|
|0x28-0x29|Packet count as a little-endian 16 bit integer|
|0x2a-0x2f|Local MAC address|
|0x30-0x33|Local device ID (obtained during authentication, 00 before authentication)|
|0x34-0x35|Checksum of unencrypted payload as a little-endian 16 bit integer|
|0x36-0x37|00|

The payload is appended immediately after this. The checksum at 0x20 is calculated *after* the payload is appended, and covers the entire packet (including the checksum at 0x34). Therefore:

1. Generate the packet header with both checksum values set to zero
2. Set the checksum initialisation value to 0xbeaf and calculate the checksum of the unencrypted payload. Set 0x34-0x35 to this value.
3. Encrypt and append the payload
4. Set the checksum initialisation value to 0xbeaf and calculate the checksum of the entire packet. Set 0x20-0x21 to this value.

Authorisation
-------------

You must obtain an authorisation key from the device before you can communicate. To do so, generate an 80 byte packet with the following contents:

|Offset|Contents|
|------|--------|
|0x00-0x03|00|
|0x04-0x12|A 15-digit value that represents this device. Broadlink's implementation uses the IMEI.|
|0x13|01|
|0x14-0x2c|00|
|0x2d|0x01|
|0x30-0x7f|NULL-terminated ASCII string containing the device name|

Send this payload with a command value of 0x0065. The response packet will contain an encrypted payload from byte 0x38 onwards. Decrypt this using the default key and IV. The format of the decrypted payload is:

|Offset|Contents|
|------|--------|
|0x00-0x03|Device ID|
|0x04-0x13|Device encryption key|

All further command packets must use this encryption key and device ID.

Entering learning mode
----------------------

Send the following 16 byte payload with a command value of 0x006a:

|Offset|Contents|
|------|--------|
|0x00|0x03|
|0x01-0x0f|0x00|

Reading back data from learning mode
------------------------------------

Send the following 16 byte payload with a command value of 0x006a:

|Offset|Contents|
|------|--------|
|0x00|0x04|
|0x01-0x0f|0x00|

Byte 0x22 of the response contains a little-endian 16 bit error code. If this is 0, a code has been obtained. Bytes 0x38 and onward of the response are encrypted. Decrypt them. Bytes 0x04 and onward of the decrypted payload contain the captured data.

Sending data
------------

Send the following payload with a command value of 0x006a:

|Offset|Contents|
|------|--------|
|0x00|0x02|
|0x01-0x03|0x00|
|0x04|0x26 = IR, 0xb2 = RF 433 MHz, 0xd7 = RF 315 MHz|
|0x05|Repeat count (0 = no repeat, 1 = send twice, …)|
|0x06-0x07|Length of the following data in little endian|
|0x08 ....|Pulse lengths in 2^-15 s units (µs * 269 / 8192 works very well)|
|....|For IR codes, the pulse lengths should be paired as ON, OFF|

Each value is represented by one byte. If the length exceeds one byte then it is stored big endian with a leading 0.

Captures of IR codes from the device will always end with a constant OFF value of `0x00 0x0d 0x05`, but the trailing silence can be anything on transmit. The likely reason for this value is a capped timeout value on detection. The value is about 102 milliseconds.

Example: the header for my Optoma projector is 8920 4450.

    8920 * 269 / 8192 = 0x124
    4450 * 269 / 8192 = 0x92

So the data starts with `0x00 0x01 0x24 0x92 ....`

Todo
----

* Support for other devices using the Broadlink protocol (various smart home devices)
* Figure out what the format of the data packets actually is.
* Deal with the response after AP Mode WiFi network setup.
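As a worked companion to the "Sending data" section above, here is a minimal sketch of the pulse-length encoding (again an illustration, not the library's actual encoder):

```python
def encode_pulses(pulses_us):
    """Convert pulse lengths in microseconds to the payload encoding above."""
    payload = bytearray()
    for us in pulses_us:
        units = us * 269 // 8192          # the suggested 2^-15 s approximation
        if units > 0xff:
            # values over one byte are stored big endian with a leading 0
            payload += bytes([0x00, units >> 8, units & 0xff])
        else:
            payload.append(units)
    return payload

# The Optoma header example from the text: 8920 us -> 0x124, 4450 us -> 0x92
assert encode_pulses([8920, 4450]) == bytearray([0x00, 0x01, 0x24, 0x92])
```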
{ "content_hash": "24fb5340922a8fb96021759adcd87b60", "timestamp": "", "source": "github", "line_count": 204, "max_line_length": 270, "avg_line_length": 33.416666666666664, "alnum_prop": 0.7135103417925773, "repo_name": "mjg59/python-broadlink", "id": "e2825640727c0846de252c2f0509f6a2975410de", "size": "6818", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "protocol.md", "mode": "33188", "license": "mit", "language": [ { "name": "Python", "bytes": "76067" } ], "symlink_target": "" }
using System.Reflection;
using System.Runtime.CompilerServices;
using System.Runtime.InteropServices;

// General Information about an assembly is controlled through the following
// set of attributes. Change these attribute values to modify the information
// associated with an assembly.
[assembly: AssemblyTitle("SimpleConsole")]
[assembly: AssemblyDescription("")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("")]
[assembly: AssemblyProduct("SimpleConsole")]
[assembly: AssemblyCopyright("Copyright © 2014")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]

// Setting ComVisible to false makes the types in this assembly not visible
// to COM components. If you need to access a type in this assembly from
// COM, set the ComVisible attribute to true on that type.
[assembly: ComVisible(false)]

// The following GUID is for the ID of the typelib if this project is exposed to COM
[assembly: Guid("70adbf5a-c74f-43d2-a32c-608951af3051")]

// Version information for an assembly consists of the following four values:
//
//      Major Version
//      Minor Version
//      Build Number
//      Revision
//
// You can specify all the values or you can default the Build and Revision Numbers
// by using the '*' as shown below:
// [assembly: AssemblyVersion("1.0.*")]
[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.0.0.0")]
{ "content_hash": "7b1965dc640141c8289c16ed93045bb8", "timestamp": "", "source": "github", "line_count": 36, "max_line_length": 84, "avg_line_length": 38.861111111111114, "alnum_prop": 0.7455325232308792, "repo_name": "jsmug/Transit", "id": "546bf5a1fadb1a8c260123e3c660491595f1832f", "size": "1402", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "SimpleConsole/Properties/AssemblyInfo.cs", "mode": "33188", "license": "mit", "language": [ { "name": "C#", "bytes": "120590" } ], "symlink_target": "" }
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
  <component name="CompilerConfiguration">
    <resourceExtensions />
    <wildcardResourcePatterns>
      <entry name="!?*.java" />
      <entry name="!?*.form" />
      <entry name="!?*.class" />
      <entry name="!?*.groovy" />
      <entry name="!?*.scala" />
      <entry name="!?*.flex" />
      <entry name="!?*.kt" />
      <entry name="!?*.clj" />
      <entry name="!?*.aj" />
    </wildcardResourcePatterns>
    <annotationProcessing>
      <profile default="true" name="Default" enabled="false">
        <processorPath useClasspath="true" />
      </profile>
      <profile default="false" name="Maven default annotation processors profile" enabled="true">
        <sourceOutputDir name="target/generated-sources/annotations" />
        <sourceTestOutputDir name="target/generated-test-sources/test-annotations" />
        <outputRelativeToContentRoot value="true" />
        <processorPath useClasspath="true" />
        <module name="DBInsertion" />
      </profile>
    </annotationProcessing>
    <bytecodeTargetLevel>
      <module name="DBInsertion" target="1.6" />
    </bytecodeTargetLevel>
  </component>
</project>
{ "content_hash": "de951978d93697c4252ca5823d7ae27c", "timestamp": "", "source": "github", "line_count": 32, "max_line_length": 97, "avg_line_length": 37.4375, "alnum_prop": 0.6151919866444073, "repo_name": "SlavoKrupa/Notifications", "id": "43d4da16be21d80f4e168a5f764b7d6d03096440", "size": "1198", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "Utility/DBInsertion/.idea/compiler.xml", "mode": "33188", "license": "mit", "language": [ { "name": "ASP", "bytes": "104" }, { "name": "C#", "bytes": "11806" }, { "name": "CSS", "bytes": "2158" }, { "name": "HTML", "bytes": "16928" }, { "name": "Java", "bytes": "7569" }, { "name": "JavaScript", "bytes": "30571" }, { "name": "PLpgSQL", "bytes": "2604" }, { "name": "Python", "bytes": "2487" } ], "symlink_target": "" }
import time
from urllib import request
from multiprocessing import Process

timeout = 60

URLS = ['http://www.foxnews.com/',
        'http://www.cnn.com/',
        'http://europe.wsj.com/',
        'http://www.foxnews.com/',
        'http://www.cnn.com/',
        'http://europe.wsj.com/',
        'http://www.foxnews.com/',
        'http://www.cnn.com/',
        'http://europe.wsj.com/'
        ]


def download(url):
    print('downloading:', url)
    s = time.time()
    with request.urlopen(url, timeout=timeout) as response:
        html = response.read()
        print(url, len(html))
    print('in', time.time() - s)


if __name__ == '__main__':
    # the guard keeps child processes from re-running this block when
    # multiprocessing uses the spawn start method (e.g. on Windows)
    start_time = time.time()
    print('start_time', start_time)

    processes = []
    for url in URLS:
        p = Process(target=download, kwargs={'url': url})
        p.start()
        processes.append(p)

    for p in processes:
        p.join()

    end_time = time.time()
    print('end_time', end_time)
    print('total_time', end_time - start_time)
    print('in the end')
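# ---------------------------------------------------------------------------
# Not part of this file: the same fan-out sketched with multiprocessing.Pool,
# an alternative that caps concurrency instead of spawning one process per
# URL. Reuses download() and URLS from above; the pool size is illustrative.
# ---------------------------------------------------------------------------
from multiprocessing import Pool

if __name__ == '__main__':
    with Pool(processes=4) as pool:
        pool.map(download, URLS)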
{ "content_hash": "ac842a55a6d961f54198e076f3eb3c94", "timestamp": "", "source": "github", "line_count": 43, "max_line_length": 58, "avg_line_length": 22.813953488372093, "alnum_prop": 0.5688073394495413, "repo_name": "MortalViews/python-notes", "id": "2d1370213b8c13f00ce7db5417d203ee33ddb78b", "size": "981", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "multi/mutlipr.py", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Python", "bytes": "22704" } ], "symlink_target": "" }
SYNONYM

#### According to
The Catalogue of Life, 3rd January 2011

#### Published in
null

#### Original name
null

### Remarks
null
{ "content_hash": "9b99ebd5ccfbafe5f6d3327961c68de4", "timestamp": "", "source": "github", "line_count": 13, "max_line_length": 39, "avg_line_length": 10.23076923076923, "alnum_prop": 0.6917293233082706, "repo_name": "mdoering/backbone", "id": "e0c49754f046cdff462c7e4b76a877d8ab6f0b8f", "size": "190", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "life/Plantae/Magnoliophyta/Magnoliopsida/Brassicales/Brassicaceae/Polypsecadium/Polypsecadium adscendens/ Syn. Sisymbrium adscendens/README.md", "mode": "33188", "license": "apache-2.0", "language": [], "symlink_target": "" }
using System.Reflection;

// These "special" version numbers are found and replaced at build time.
[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.1.1.1")]
[assembly: AssemblyInformationalVersion("1.0.0")]
{ "content_hash": "533b68b1eac8f0fd778f5186be9a883c", "timestamp": "", "source": "github", "line_count": 6, "max_line_length": 72, "avg_line_length": 38.5, "alnum_prop": 0.7532467532467533, "repo_name": "serilog-trace-listener/SerilogTraceListener", "id": "22492404535d3d079513dc8cb13ebe7ae4ea27d7", "size": "231", "binary": false, "copies": "1", "ref": "refs/heads/dev", "path": "assets/CommonAssemblyInfo.cs", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "C#", "bytes": "35142" }, { "name": "PowerShell", "bytes": "1603" } ], "symlink_target": "" }
/* Get Programming with JavaScript
 * Listing 6.09
 * Getting a string for a player's information
 */

var getPlayerName;
var getPlayerHealth;
var getPlayerPlace;
var getPlayerInfo;
var getBorder;

getPlayerName = function (playerName) {
    return playerName;
};

getPlayerHealth = function (playerName, playerHealth) {
    return playerName + " has health " + playerHealth;
};

getPlayerPlace = function (playerName, playerPlace) {
    return playerName + " is in " + playerPlace;
};

getBorder = function () {
    return "********************";
};

getPlayerInfo = function (playerName, playerPlace, playerHealth) {
    var playerInfo;

    playerInfo = "\n" + getPlayerName(playerName);
    playerInfo += "\n" + getBorder();
    playerInfo += "\n" + getPlayerPlace(playerName, playerPlace);
    playerInfo += "\n" + getPlayerHealth(playerName, playerHealth);
    playerInfo += "\n" + getBorder();
    playerInfo += "\n";

    return playerInfo;
};

console.log(getPlayerInfo("Kandra", "The Dungeon of Doom", 50));


/* Further Adventures
 *
 * 1) Add a second call to getPlayerInfo with
 *    different player information. Log the
 *    returned string to the console.
 *
 * 2) Call getPlayerInfo at the console prompt.
 *
 * 3) What happens if you call getPlayerInfo
 *    without any arguments?
 *
 *    > getPlayerInfo()
 *
 */
{ "content_hash": "98a1bcb82dd13be94eb1a2d2695ff3c5", "timestamp": "", "source": "github", "line_count": 59, "max_line_length": 67, "avg_line_length": 22.64406779661017, "alnum_prop": 0.6729041916167665, "repo_name": "jrlarsen/AdventuresInCode", "id": "fa938bee2071200dcdfcba3ef71323e514893a62", "size": "1338", "binary": false, "copies": "3", "ref": "refs/heads/master", "path": "Ch06_Return_values/listing6.09.js", "mode": "33188", "license": "mit", "language": [ { "name": "CSS", "bytes": "27" }, { "name": "HTML", "bytes": "11429" }, { "name": "JavaScript", "bytes": "192041" } ], "symlink_target": "" }
from __future__ import unicode_literals

from .base import BaseTest

from django.utils import timezone

from stored_messages import mark_read, add_message_for, broadcast_message, mark_all_read
from stored_messages.models import Inbox, MessageArchive
from stored_messages.compat import get_user_model
from stored_messages.backends.exceptions import MessageDoesNotExist

import stored_messages


class TestApi(BaseTest):
    def test_mark_as_read(self):
        self.client.login(username='test_user', password='123456')
        self.client.get('/create')
        inbox = Inbox.objects.filter(user=self.user)
        self.assertEqual(len(inbox), 1)
        mark_read(self.user, inbox.get().message)
        self.assertEqual(Inbox.objects.filter(user=self.user).count(), 0)

    def test_mark_as_read_idempotent(self):
        self.client.login(username='test_user', password='123456')
        self.client.get('/create')
        msg_archive = MessageArchive.objects.filter(user=self.user).get()
        mark_read(self.user, msg_archive.message)
        self.assertRaises(MessageDoesNotExist, mark_read, self.user, msg_archive.message)

    def test_add_message_for(self):
        now = timezone.now() + timezone.timedelta(days=-1)
        url = 'http://example.com/error'
        user2 = get_user_model().objects.create_user("another_user", "[email protected]", "123456")
        add_message_for([user2, self.user], stored_messages.STORED_ERROR, 'Multiple errors',
                        'extra', now, url)
        self.assertEqual(Inbox.objects.count(), 2)
        self.assertEqual(MessageArchive.objects.count(), 2)
        self.assertEqual(Inbox.objects.get(user=user2.id).message.tags, 'extra persisted error')
        self.assertEqual(Inbox.objects.get(user=self.user).message.tags, 'extra persisted error')
        self.assertEqual(Inbox.objects.get(user=user2.id).message.date, now)
        self.assertEqual(Inbox.objects.get(user=self.user).message.date, now)
        self.assertEqual(Inbox.objects.get(user=user2.id).message.url, url)
        self.assertEqual(Inbox.objects.get(user=self.user).message.url, url)
        self.assertEqual(Inbox.objects.get(user=user2.id).message.message, "Multiple errors")
        self.assertEqual(Inbox.objects.get(user=self.user).message.message, "Multiple errors")
        self.assertEqual(MessageArchive.objects.get(user=user2.id).message.message, "Multiple errors")
        self.assertEqual(MessageArchive.objects.get(user=self.user).message.message, "Multiple errors")

    def test_broadcast_message(self):
        user1 = get_user_model().objects.create_user("user1", "[email protected]", "123456")
        user2 = get_user_model().objects.create_user("user2", "[email protected]", "123456")
        user3 = get_user_model().objects.create_user("user3", "[email protected]", "123456")
        now = timezone.now() + timezone.timedelta(days=-1)
        url = 'http://example.com/error'
        broadcast_message(stored_messages.STORED_INFO, 'broadcast test message', 'extra', now, url)
        self.assertEqual(Inbox.objects.get(user=user1.id).message.message, "broadcast test message")
        self.assertEqual(Inbox.objects.get(user=user2.id).message.message, "broadcast test message")
        self.assertEqual(Inbox.objects.get(user=user3.id).message.message, "broadcast test message")
        self.assertEqual(Inbox.objects.get(user=user1.id).message.tags, 'extra persisted info')
        self.assertEqual(Inbox.objects.get(user=user2.id).message.tags, 'extra persisted info')
        self.assertEqual(Inbox.objects.get(user=user3.id).message.tags, 'extra persisted info')
        self.assertEqual(Inbox.objects.get(user=user1.id).message.date, now)
        self.assertEqual(Inbox.objects.get(user=user2.id).message.date, now)
        self.assertEqual(Inbox.objects.get(user=user3.id).message.date, now)
        self.assertEqual(Inbox.objects.get(user=user1.id).message.url, url)
        self.assertEqual(Inbox.objects.get(user=user2.id).message.url, url)
        self.assertEqual(Inbox.objects.get(user=user3.id).message.url, url)
        self.assertEqual(MessageArchive.objects.get(user=user1.id).message.message, "broadcast test message")
        self.assertEqual(MessageArchive.objects.get(user=user2.id).message.message, "broadcast test message")
        self.assertEqual(MessageArchive.objects.get(user=user3.id).message.message, "broadcast test message")

    def test_mark_all_read(self):
        for i in range(20):
            stored_messages.add_message_for([self.user], stored_messages.INFO, "unicode message ❤")

        inbox = Inbox.objects.filter(user=self.user)
        self.assertEqual(len(inbox), 20)

        mark_all_read(self.user)
        self.assertEqual(Inbox.objects.filter(user=self.user).count(), 0)
{ "content_hash": "8945b48e656d116895d465af1609a725", "timestamp": "", "source": "github", "line_count": 92, "max_line_length": 111, "avg_line_length": 52.84782608695652, "alnum_prop": 0.6859317153434801, "repo_name": "nthall/django-stored-messages", "id": "da4f0760a349dfa1a21d7da401f3c30da8315c95", "size": "4888", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "stored_messages/tests/test_api.py", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "HTML", "bytes": "803" }, { "name": "Python", "bytes": "65241" } ], "symlink_target": "" }
package com.intellij.openapi.externalSystem.service.execution;

import com.intellij.execution.Location;
import com.intellij.execution.RunManagerEx;
import com.intellij.execution.RunnerAndConfigurationSettings;
import com.intellij.execution.actions.ConfigurationContext;
import com.intellij.execution.configurations.RunConfiguration;
import com.intellij.execution.junit.RuntimeConfigurationProducer;
import com.intellij.openapi.externalSystem.model.execution.ExternalSystemTaskExecutionSettings;
import com.intellij.openapi.externalSystem.model.execution.ExternalTaskExecutionInfo;
import com.intellij.psi.PsiElement;

import javax.annotation.Nonnull;
import javax.annotation.Nullable;
import java.util.List;

/**
 * @author Denis Zhdanov
 * @since 6/5/13 8:14 PM
 */
public abstract class AbstractExternalSystemRuntimeConfigurationProducer extends RuntimeConfigurationProducer {

  private PsiElement mySourceElement;

  public AbstractExternalSystemRuntimeConfigurationProducer(@Nonnull AbstractExternalSystemTaskConfigurationType type) {
    super(type);
  }

  @Override
  public PsiElement getSourceElement() {
    return mySourceElement;
  }

  @Nullable
  @Override
  protected RunnerAndConfigurationSettings createConfigurationByElement(Location location, ConfigurationContext context) {
    if (!(location instanceof ExternalSystemTaskLocation)) {
      return null;
    }

    ExternalSystemTaskLocation taskLocation = (ExternalSystemTaskLocation)location;
    mySourceElement = taskLocation.getPsiElement();

    RunManagerEx runManager = RunManagerEx.getInstanceEx(taskLocation.getProject());
    RunnerAndConfigurationSettings settings = runManager.createConfiguration("", getConfigurationFactory());
    ExternalSystemRunConfiguration configuration = (ExternalSystemRunConfiguration)settings.getConfiguration();
    ExternalSystemTaskExecutionSettings taskExecutionSettings = configuration.getSettings();

    ExternalTaskExecutionInfo task = taskLocation.getTaskInfo();
    taskExecutionSettings.setExternalProjectPath(task.getSettings().getExternalProjectPath());
    taskExecutionSettings.setTaskNames(task.getSettings().getTaskNames());
    configuration.setName(AbstractExternalSystemTaskConfigurationType.generateName(location.getProject(), taskExecutionSettings));
    return settings;
  }

  @Nullable
  @Override
  protected RunnerAndConfigurationSettings findExistingByElement(Location location,
                                                                 @Nonnull List<RunnerAndConfigurationSettings> existingConfigurationsSettings,
                                                                 ConfigurationContext context) {
    if (!(location instanceof ExternalSystemTaskLocation)) {
      return null;
    }

    ExternalTaskExecutionInfo taskInfo = ((ExternalSystemTaskLocation)location).getTaskInfo();
    for (RunnerAndConfigurationSettings settings : existingConfigurationsSettings) {
      RunConfiguration runConfiguration = settings.getConfiguration();
      if (!(runConfiguration instanceof ExternalSystemRunConfiguration)) {
        continue;
      }
      if (match(taskInfo, ((ExternalSystemRunConfiguration)runConfiguration).getSettings())) {
        return settings;
      }
    }
    return null;
  }

  private static boolean match(@Nonnull ExternalTaskExecutionInfo task, @Nonnull ExternalSystemTaskExecutionSettings settings) {
    if (!task.getSettings().getExternalProjectPath().equals(settings.getExternalProjectPath())) {
      return false;
    }
    List<String> taskNames = settings.getTaskNames();
    return task.getSettings().getTaskNames().equals(taskNames);
  }

  @Override
  public int compareTo(@Nonnull Object o) {
    return PREFERED;
  }
}
{ "content_hash": "b2707eaba5cb294e62d4792fef97a4d5", "timestamp": "", "source": "github", "line_count": 90, "max_line_length": 142, "avg_line_length": 41.32222222222222, "alnum_prop": 0.7714439365420812, "repo_name": "consulo/consulo", "id": "e946ed08f7fd058dfbf83cc3848e2655863d22b4", "size": "4319", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "modules/base/external-system-impl/src/main/java/com/intellij/openapi/externalSystem/service/execution/AbstractExternalSystemRuntimeConfigurationProducer.java", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Batchfile", "bytes": "299" }, { "name": "C", "bytes": "52718" }, { "name": "C++", "bytes": "72795" }, { "name": "CMake", "bytes": "854" }, { "name": "CSS", "bytes": "64655" }, { "name": "Groovy", "bytes": "36006" }, { "name": "HTML", "bytes": "173780" }, { "name": "Java", "bytes": "64026758" }, { "name": "Lex", "bytes": "5909" }, { "name": "Objective-C", "bytes": "23787" }, { "name": "Python", "bytes": "3276" }, { "name": "SCSS", "bytes": "9782" }, { "name": "Shell", "bytes": "5689" }, { "name": "Thrift", "bytes": "1216" }, { "name": "XSLT", "bytes": "49230" } ], "symlink_target": "" }
<!doctype html>
<html lang="en" class="no-js">
  <head>
    <meta charset="utf-8">

    <!-- begin SEO -->
    <title>Philadelphia Beerlytics, Part One - williamstome</title>
    <meta property="og:locale" content="en-US">
    <meta property="og:site_name" content="williamstome">
    <meta property="og:title" content="Philadelphia Beerlytics, Part One">
    <link rel="canonical" href="http://williamstome.github.io//philadelphia-beerlytics-part-one/">
    <meta property="og:url" content="http://williamstome.github.io//philadelphia-beerlytics-part-one/">
    <meta property="og:description" content="As a follow up to my previous Beerlytics posts, I wanted to take a look at what beers are actually on tap at the various bars in a particular city.">
    <meta name="twitter:site" content="@williamstome">
    <meta name="twitter:title" content="Philadelphia Beerlytics, Part One">
    <meta name="twitter:description" content="As a follow up to my previous Beerlytics posts, I wanted to take a look at what beers are actually on tap at the various bars in a particular city.">
    <meta name="twitter:url" content="http://williamstome.github.io//philadelphia-beerlytics-part-one/">
    <meta name="twitter:card" content="summary_large_image">
    <meta name="twitter:image" content="http://williamstome.github.io//images/features/philly_beer.png">
    <meta property="og:image" content="http://williamstome.github.io//images/features/philly_beer.png">
    <meta property="og:type" content="article">
    <meta property="article:published_time" content="2014-12-11T00:00:00+00:00">
    <link rel="next" href="http://williamstome.github.io//philadelphia-beerlytics-part-two-abv/" title="Philadelphia Beerlytics Part Two: ABV">
    <link rel="prev" href="http://williamstome.github.io//could-you-maybe-possibly-explain-indirect-speech-acts/" title="Could You Maybe Possibly Explain Indirect Speech Acts?">
    <script type="application/ld+json">
      {
        "@context": "http://schema.org",
        "@type": "Organization",
        "url": "http://williamstome.github.io/",
        "logo": "http://williamstome.github.io//images/og/default.png"
      }
    </script>
    <script type="application/ld+json">
      {
        "@context" : "http://schema.org",
        "@type" : "Person",
        "name" : "Tom Williams",
        "url" : "http://williamstome.github.io/",
        "sameAs" : ["https://twitter.com/williamstome","https://facebook.com/TomEWilliams","https://www.linkedin.com/in/williamstome","https://scholar.google.com/citations?user=qg_aiScAAAAJ","https://www.mendeley.com/profiles/tom-williams23/","https://www.researchgate.net/profile/Tom_Williams19"]
      }
    </script>
    <!-- end SEO -->

    <link href="http://williamstome.github.io//feed.xml" type="application/atom+xml" rel="alternate" title="williamstome Feed">

    <!-- http://t.co/dKP3o1e -->
    <meta name="HandheldFriendly" content="True">
    <meta name="MobileOptimized" content="320">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">

    <script>
      document.documentElement.className = document.documentElement.className.replace(/\bno-js\b/g, '') + ' js ';
    </script>

    <!-- For all browsers -->
    <link rel="stylesheet" href="http://williamstome.github.io//assets/css/main.css">
    <meta http-equiv="cleartype" content="on">

    <!-- start custom head snippets -->
    <!-- insert favicons. use http://realfavicongenerator.net/ -->
    <!-- end custom head snippets -->
  </head>

  <body>
    <!--[if lt IE 9]>
      <div class="notice--danger align-center" style="margin: 0;">You are using an <strong>outdated</strong> browser. Please <a href="http://browsehappy.com/">upgrade your browser</a> to improve your experience.</div>
    <![endif]-->

    <div class="masthead">
      <div class="masthead__inner-wrap">
        <div class="masthead__menu">
          <nav id="site-nav" class="greedy-nav">
            <button><div class="navicon"></div></button>
            <ul class="visible-links">
              <li class="masthead__menu-item masthead__menu-item--lg"><a href="http://williamstome.github.io//">williamstome</a></li>
              <li class="masthead__menu-item"><a href="http://williamstome.github.io//archive/">archive</a></li>
              <li class="masthead__menu-item"><a href="http://inside.mines.edu/~twilliams/">author</a></li>
              <li class="masthead__menu-item"><a href="http://mirrorlab.mines.edu/">lab</a></li>
              <li class="masthead__menu-item"><a href="http://inside.mines.edu/~twilliams/pdfs/cv.pdf">cv</a></li>
            </ul>
            <ul class="hidden-links hidden"></ul>
          </nav>
        </div>
      </div>
    </div>

    <div class="page__hero" style=" ">
      <img src="http://williamstome.github.io//images/features/philly_beer.png" alt="Philadelphia Beerlytics, Part One" class="page__hero-image">
      <span class="page__hero-caption">Photo credit: <a href="http://www.flickr.com/photos/benjaminlhaas/3343152090"><strong>Benjamin L. Haas</strong></a></span>
    </div>

    <div id="main" role="main">
      <div class="sidebar sticky">
        <div itemscope itemtype="http://schema.org/Person">
          <div class="author__avatar">
            <img src="https://en.gravatar.com/userimage/26616054/e6f3322a7a49fe676683654ec3dbafce.jpg?size=200" alt="Tom Williams">
          </div>
          <div class="author__content">
            <h3 class="author__name">Tom Williams</h3>
            <p class="author__bio">Artificial Intelligence and Human-Robot Interaction Researcher</p>
          </div>
          <div class="author__urls-wrapper">
            <button class="btn btn--inverse">Follow</button>
            <ul class="author__urls social-icons">
              <li><a href="http://inside.mines.edu/~twilliams"><i class="fa fa-fw fa-chain" aria-hidden="true"></i> Homepage</a></li>
              <li><a href="mailto:[email protected]"><i class="fa fa-fw fa-envelope-square" aria-hidden="true"></i> Email</a></li>
              <li><a href="https://scholar.google.com/citations?user=qg_aiScAAAAJ"><i class="fa fa-fw fa-graduation-cap" aria-hidden="true"></i> Google Scholar</a></li>
              <li><a href="https://www.researchgate.net/profile/Tom_Williams19"><i class="fa fa-fw fa-flask" aria-hidden="true"></i> Researchgate</a></li>
              <li><a href="https://twitter.com/williamstome"><i class="fa fa-fw fa-twitter-square" aria-hidden="true"></i> Twitter</a></li>
              <li><a href="https://www.facebook.com/TomEWilliams"><i class="fa fa-fw fa-facebook-square" aria-hidden="true"></i> Facebook</a></li>
              <li><a href="https://www.linkedin.com/in/williamstome"><i class="fa fa-fw fa-linkedin-square" aria-hidden="true"></i> LinkedIn</a></li>
              <li><a href="https://www.youtube.com/user/HVj64bAL66gwgRjQKH-aFQ"><i class="fa fa-fw fa-youtube-square" aria-hidden="true"></i> YouTube</a></li>
            </ul>
          </div>
        </div>
      </div>

      <article class="page" itemscope itemtype="http://schema.org/CreativeWork">
        <meta itemprop="headline" content="Philadelphia Beerlytics, Part One">
        <meta itemprop="description" content="As a follow up to my previous Beerlytics posts, I wanted to take a look at what beers are actually on tap at the various bars in a particular city.">
        <meta itemprop="datePublished" content="December 11, 2014">
        <meta itemprop="dateModified" content="December 13, 2014">

        <div class="page__inner-wrap">
          <header>
            <h1 class="page__title" itemprop="headline">Philadelphia Beerlytics, Part One</h1>
          </header>

          <section class="page__content" itemprop="text">
            <h1 id="the-data">The Data</h1>
            <p>As a
follow up to my <a href="http://williamstome.github.io//high-density-beerlytics-data-visualization-of-beer-neighborhoods/">previous</a> <a href="http://williamstome.github.io//beerlytics-bar-density/">Beerlytics posts</a>, I wanted to take a look at what beers are actually on tap at the various bars in a particular city. Unfortunately, this information isn’t readily available for most cities. Philadelphia, however, has a wonderful website called <a href="http://www.phillytapfinder.com/">PhillyTapFinder</a>, which shows you all the beers on tap at all the bars in Philadelphia. This shows that Philadelphia has 1916 craft taps spread across 112 bars, with 795 unique craft beers on tap. I made good use of this, scraping all the information from the site and then cross-referencing it with data from <a href="http://www.ratebeer.com/">RateBeer</a>, which provides information on beer ratings, ABV, IBU, style, etc.</p>

            <p>Before I go into the analysis, a few caveats:</p>

            <ol>
              <li>All the information I’ve gathered is based on <em>taplists</em>, so bottles are not counted. Thus the incredibly impressive and extensive bottle lists of places like Monk’s Cafe are completely ignored. Sad.</li>
              <li>Philly Tap Finder only seems to include craft beer. This in and of itself shouldn’t be terribly surprising, but it means that beers like Miller Lite won’t show up anywhere on the lists, despite the fact that Miller is probably on tap at a large number of the examined bars. Low-craft beers like Stella, Hoegarden, and Shocktop <em>are</em> included.</li>
              <li>I don’t have any price information. If you want to find the cheapest place to get drunk, you’re out of luck.</li>
              <li>I only have information from a single time slice (in October 2014), so I have no data on tap turnover. In the future I’ll hopefully be able to repeat my scraping over regular intervals to get that kind of information.</li>
            </ol>

            <h1 id="the-lists">The Lists</h1>

            <p>I’ve organized my findings into a set of top- and bottom-ten lists.</p>

            <h2 id="the-best-and-the-worst">The Best and The Worst</h2>

            <p>First things first, which bars have the best beer in Philadelphia? Which have the worst? To find out, I looked at the taplists for each bar in Philadelphia. For each beer in that taplist, I got the average rating for that beer on RateBeer. I then calculated the average average-beer-rating for each bar; a sketch of this computation follows.</p>
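            <p>(I haven’t published the analysis code itself; the snippet below is a minimal sketch of the computation just described, assuming the scraped data has been saved to a hypothetical <code>philly_taps.csv</code> with one row per bar/beer pairing and a RateBeer <code>rating</code> column — both names are illustrative.)</p>

            <pre><code class="language-python"># Minimal sketch, not the actual analysis code: mean RateBeer rating
# per bar (the "average average-beer-rating") and tap counts per bar.
import pandas as pd

taps = pd.read_csv("philly_taps.csv")  # hypothetical columns: bar, beer, rating

bar_scores = taps.groupby("bar")["rating"].mean().sort_values(ascending=False)
print(bar_scores.head(10))  # best average taplist ratings
print(bar_scores.tail(10))  # worst average taplist ratings

tap_counts = taps.groupby("bar").size().sort_values(ascending=False)
print(tap_counts.head(10))  # longest craft taplists
</code></pre>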
# The Lists

I've organized my findings into a set of top- and bottom-ten lists.

## The Best and The Worst

First things first: which bars have the best beer in Philadelphia? Which have the worst? To find out, I looked at the taplist for each bar in Philadelphia, got each beer's average rating on RateBeer, and then calculated the average average-beer-rating for each bar (the computation is sketched in code after the tables below). (Remember, macro-brews are not included in these averages.) Using this methodology, the top ten beer bars are…

|    | Average beer rating: Top 10 | Mean |
|----|-----------------------------|-----:|
| 1  | The Beer Shoppe             | 3.65 |
| 2  | Monks Cafe                  | 3.65 |
| 3  | 320 Market Cafe             | 3.65 |
| 4  | Tria Rittenhouse Square     | 3.64 |
| 5  | Side Bar                    | 3.61 |
| 6  | Tria Washington Square      | 3.60 |
| 7  | Moonshine                   | 3.60 |
| 8  | Jose Pistola's              | 3.60 |
| 9  | Bottles, Packs, Growlers    | 3.60 |
| 10 | Lucky's Last Chance         | 3.59 |

And the bottom ten are…

|     | Average beer rating: Bottom 10 | Mean |
|-----|--------------------------------|-----:|
| 112 | Daly's Irish Pub               | 3.08 |
| 111 | The Victoria Freehouse         | 3.09 |
| 110 | Tatooed Mom                    | 3.11 |
| 109 | Bourbon Branch                 | 3.12 |
| 108 | The Whip Tavern                | 3.12 |
| 107 | McGillin's Olde Ale House      | 3.13 |
| 106 | Glenmorgan bar                 | 3.17 |
| 105 | Misconduct                     | 3.17 |
| 104 | Doobie's Bar                   | 3.18 |
| 103 | Grey Lodge                     | 3.20 |

I'm not surprised at all to see Monks Cafe or Tria on the list of the top ten. However, there are several on the list I'm not familiar with. All the more reason to go back to Philadelphia! On the other side of the coin, I've never heard of "Tatooed Mom" or "Misconduct", but I can't say the names inspired any great confidence. I know you shouldn't judge a bar by its cover, but… Tatooed Mom? Really? Looking at Tatooed Mom's webpage reveals that they claim to have "the best local and regional beer". While they do have a few good beers on tap, their smallish taplist is padded out with the likes of Shocktop and Newcastle. Okay, enough picking on Tatooed Mom.
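Concretely, the ranking is just a group-by and a mean once the joined taplists are in a data frame. A minimal pandas sketch (made-up rows, not my actual code):

```python
import pandas as pd

# One row per (bar, beer) tap, with each beer's average RateBeer score joined in.
# The rows are illustrative, not real data.
taps = pd.DataFrame({
    "bar":    ["Monks Cafe", "Monks Cafe", "Daly's Irish Pub", "Daly's Irish Pub"],
    "beer":   ["Allagash White", "Pumking", "Yuengling Lager", "Guinness"],
    "rating": [3.80, 3.95, 2.90, 3.30],
})

# The "average average-beer-rating": mean of the per-beer averages on each bar's list.
bar_scores = taps.groupby("bar")["rating"].mean().sort_values(ascending=False)

print(bar_scores.head(10))  # the top ten...
print(bar_scores.tail(10))  # ...and the bottom ten
```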
## The Most and The Least

So we know who has the best-curated beer list. Who has the longest? Sometimes size matters. Keep in mind that only taplists were counted, and not bottle lists. This means that powerhouses like Monk's Cafe aren't going to show up, despite the incredible selection afforded by their bottle list. Remember as well that only craft taps are included.

The top 10…

|    | Number of Taps: Top 10 |  # |
|----|------------------------|---:|
| 1  | City Taphouse          | 54 |
| 2  | Bar-ly Chinatown       | 53 |
| 3  | Bru Craft & Wurst      | 39 |
| 4  | Iron Abbey             | 37 |
| 5  | Field House            | 34 |
| 6  | Flanigan's Boathouse   | 33 |
| 7  | Garret Hill Ale House  | 31 |
| 8  | The Pour House Exton   | 30 |
| 9  | Pitcher's Pub          | 29 |
| 10 | Irish Pol              | 27 |
| 10 | Daly's Irish Pub       | 27 |

The bottom ten…

|     | Number of Taps: Bottom 10   |  # |
|-----|-----------------------------|---:|
| 112 | Bobkat Liquors              |  5 |
| 109 | Nodding Head                |  6 |
| 109 | Wrap Shack                  |  6 |
| 109 | Lucky's Last Chance         |  6 |
| 107 | Watkins Drinkery            |  7 |
| 107 | The Beer Shoppe             |  7 |
| 98  | Glenmorgan Bar              |  8 |
| 98  | Billy Murphy's Irish Saloon |  8 |
| 98  | Redwood                     |  8 |
| 98  | Cook and Shaker             |  8 |
| 98  | Royal Tavern                |  8 |
| 98  | Revolution House            |  8 |
| 98  | Tria - Washington Square    |  8 |
| 98  | Tria - Rittenhouse Square   |  8 |
While none of the bars with the most taps appear on the top- or bottom-rated lists, several of the bars with the fewest taps appear on the top-rated list, and one appears on the bottom-rated list.

## Variety of Styles

|    | Largest Variety of Styles |  # |
|----|---------------------------|---:|
| 1  | Bru Craft & Wurst         | 29 |
| 2  | Bar-ly Chinatown          | 28 |
| 3  | Iron Abbey                | 27 |
| 4  | Pitcher's Pub             | 22 |
| 4  | Flanigan's Boathouse      | 22 |
| 6  | The Pour House            | 21 |
| 7  | Field House               | 20 |
| 7  | Varga Bar                 | 20 |
| 7  | The Pour House Exton      | 20 |
| 10 | Capone's                  | 19 |
| 10 | Tria Taproom              | 19 |
| 10 | Isaac Newton's            | 19 |

|     | Smallest Variety of Styles  | # |
|-----|-----------------------------|--:|
| 112 | Bobkat Liquors              | 4 |
| 110 | Lucky's Last Chance         | 5 |
| 110 | Wrap Shack                  | 5 |
| 107 | Billy Murphy's Irish Saloon | 6 |
| 107 | Tria Rittenhouse Square     | 6 |
| 107 | The Beer Shoppe             | 6 |
| 103 | Watkin's Drinkery           | 7 |
| 103 | Franklin's                  | 7 |
| 103 | Royal Tavern                | 7 |
| 103 | Nodding Head                | 7 |

You'll notice that the bar with the most taps (City Taphouse) appears nowhere on the first list. While it has 54 taps, those taps only represent 15 different styles of beer, which I find just a little disappointing.

## Popularity of Styles

This leads to the next category: popularity of styles. Which styles are the most common? I won't bother listing the least common, as there are a very large number of styles with no examples on tap anywhere in the city. For example, any kind of Sake.
Here are the most common categories.

|    | Most Common Styles   |   # |
|----|----------------------|----:|
| 1  | Spice-Herb-Vegetable | 215 |
| 2  | IPA                  | 203 |
| 3  | Oktoberfest/Marzen   | 105 |
| 4  | Witbier              |  88 |
| 5  | Imperial IPA         |  82 |
| 6  | American Pale Ale    |  77 |
| 7  | Cider                |  72 |
| 8  | Saison               |  61 |
| 9  | Fruit Beer Radler    |  56 |
| 10 | Imperial Stout       |  53 |

For more detail, the original post includes an interactive chart (hyper-categories are determined using the RateBeer style guide; the chart was designed using [Sunburst](https://gist.github.com/metmajer/5480307)).

*[Interactive D3 sunburst chart of tap styles omitted.]*
Some of these results genuinely surprised me. Keep in mind this data was gathered in October, so pumpkin and Oktoberfest beers were in full force, as were ciders. That being said, I didn't think that the effects would have been quite so prominent.

## Most Common Beers

Seeing how popular pumpkin beers were, for example, made me wonder exactly which beers were the most common overall. Would it be more pumpkin beers? Well, they're certainly up there:

|   | Most Common Beers                |  # |
|---|----------------------------------|---:|
| 1 | Allagash White                   | 39 |
| 2 | Yuengling Lager                  | 36 |
| 3 | Southern Tier Pumking            | 31 |
| 4 | Guinness                         | 20 |
| 5 | Yards Cape of Good Hope          | 18 |
| 6 | Dogfish Head Punkin Ale          | 17 |
| 7 | Firestone Walker Pivo Hoppy Pils | 16 |
| 7 | Founders Breakfast Stout         | 16 |
| 7 | Founders Dark Penance            | 16 |
| 7 | Yards Philadelphia Pale Ale      | 16 |

Here we see some standards: Yuengling (it's Philly after all), Guinness, and some other local brews (i.e., Yards). Then there are some seasonal beers: Pumking and Punkin. I hope Allagash White is just seasonally popular, but honestly it's such a preferable choice to Blue Moon or Shocktop that I'm not terribly upset. I *am* curious why the Pivo Pils and the two Founders beers happen to be so commonly stocked. They're great beers, but I'm still a little curious why those particular beers are so popular over, say, Sierra Nevada Pale Ale.
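For what it's worth, these frequency counts are one-liners once the taplists are in a frame. Another sketch with made-up rows:

```python
import pandas as pd

# Illustrative (bar, beer, style) tap rows; not real data.
taps = pd.DataFrame({
    "bar":   ["Monks Cafe", "Side Bar", "Moonshine"],
    "beer":  ["Allagash White", "Allagash White", "Southern Tier Pumking"],
    "style": ["Witbier", "Witbier", "Spice-Herb-Vegetable"],
})

# How many taps each beer, and each style, occupies city-wide.
print(taps["beer"].value_counts().head(10))
print(taps["style"].value_counts().head(10))
```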
## Unique Beers

Finally, I decided to look at which bars had beers you couldn't get anywhere else in Philadelphia, specifically by percentage of taps and by number of taps. First, by percentage:

|    | Bars with highest % of taps only available there |  % |
|----|--------------------------------------------------|---:|
| 1  | Nodding Head                                     | 71 |
| 2  | The Trestle Inn                                  | 62 |
| 3  | CJ's Doghouse                                    | 58 |
| 4  | Watkins Drinkery                                 | 57 |
| 5  | Bainbridge St. Barrel House                      | 52 |
| 6  | Churchville inn                                  | 50 |
| 6  | Memphis Taproom                                  | 50 |
| 6  | Sancho Pistolas                                  | 50 |
| 9  | Tria Taproom                                     | 46 |
| 10 | The Victoria Freehouse                           | 45 |
| 10 | Varga Bar                                        | 45 |

Things to note: Nodding Head is a brewpub; they have such a high percentage of unique beers because they brew their own stuff. But, as far as I know, the other bars just have high percentages because they're well curated… well… The Victoria Freehouse has 45% unique beers, but they have the second-lowest average average-beer-rating. So you know where to go if you want low-rated beer you can't find anywhere else! Actually, that's not entirely fair. The real truth is they fancy themselves something of an English pub, and thus stock plenty of hard-to-find beers from the British Isles. This is a cool idea, but unfortunately a lot of British beer isn't terribly well rated, as it tends to be rather weak and, well, uninteresting. That being said, they actually have some pretty good roses among their thorns, including several beers from J.W. Lees and Innis & Gunn. I guess that means you can't judge a bar by its average rating. So much for this post!
|    | Bars with highest # of taps only available there |  # |
|----|--------------------------------------------------|---:|
| 1  | City Tap House                                   | 16 |
| 2  | Bru Craft Wurst                                  | 15 |
| 3  | Bainbridge St. Barrel House                      | 13 |
| 4  | McGillin's Olde Ale House                        | 12 |
| 5  | Tria Taproom                                     | 11 |
| 6  | Capone's                                         | 10 |
| 6  | Churchville Inn                                  | 10 |
| 6  | Iron Abbey                                       | 10 |
| 6  | Varga Bar                                        | 10 |
| 10 | Bar-ly Chinatown                                 |  9 |
| 10 | Memphis Taproom                                  |  9 |
| 10 | Moriarty's Pub                                   |  9 |
| 10 | The Cambridge                                    |  9 |

Looking at the number of unique taps tells a slightly different story. This isn't terribly surprising: if you have enough taps, hopefully you work some unique beers into the mix.
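Both uniqueness lists fall out of counting how many bars pour each beer. A final sketch (illustrative rows only; "House IPA" and friends are invented):

```python
import pandas as pd

taps = pd.DataFrame({
    "bar":  ["Nodding Head", "Nodding Head", "City Tap House", "Bar-ly Chinatown"],
    "beer": ["House IPA", "House Stout", "Allagash White", "Allagash White"],
})

# A tap is "unique" if no other bar in the city pours that beer.
bars_per_beer = taps.groupby("beer")["bar"].nunique()
taps["unique"] = taps["beer"].map(bars_per_beer).eq(1)

unique_taps = taps.groupby("bar")["unique"]
print(unique_taps.sum().sort_values(ascending=False))           # number of unique taps
print((100 * unique_taps.mean()).sort_values(ascending=False))  # percent of taps unique
```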
# For Next Time

And that wraps up Part One of my analysis! I've got lots of cool ideas for further analysis (fueled by reader suggestions) but haven't had time to look into them yet. For next time, I'll hopefully be able to look into:

* Average ABV by bar
* Average IBU by bar
* Average average-beer-rating using *within-style rankings*
* Bars with the most local vs. domestic vs. foreign beer
* Maps, Maps, Maps!

*Tags: beer, data, visualization*
{ "content_hash": "7c833d865c1e7ceb7ff016a62e361883", "timestamp": "", "source": "github", "line_count": 1510, "max_line_length": 295, "avg_line_length": 29.36158940397351, "alnum_prop": 0.5976632984482136, "repo_name": "williamstome/williamstome.github.io", "id": "e6d5872c8e13aa445a9c8eb6bfc1b7f49be57699", "size": "44500", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "philadelphia-beerlytics-part-one/index.html", "mode": "33188", "license": "mit", "language": [ { "name": "HTML", "bytes": "867328" }, { "name": "JavaScript", "bytes": "49133" } ], "symlink_target": "" }
Library jars go here

- *.jar
- *-javadoc.jar
- *-sources.jar
{ "content_hash": "89df1b6da22db8fc03d853e71f0b9bf9", "timestamp": "", "source": "github", "line_count": 4, "max_line_length": 20, "avg_line_length": 15, "alnum_prop": 0.6666666666666666, "repo_name": "IamWaffle/CryptoWatch", "id": "20808278edb34e8d2bad16d81cf8812fc0f51b50", "size": "60", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "Legacy/lib/README.md", "mode": "33188", "license": "mit", "language": [ { "name": "Java", "bytes": "7039" } ], "symlink_target": "" }
[Getting Started: Building a Chrome Extension](https://developer.chrome.com/extensions/getstarted)

* [Hello World](helloworld.md)
* [Budget Manager](budget_manager.md)
{ "content_hash": "77ee852a67180314ec229d00d9442b6a", "timestamp": "", "source": "github", "line_count": 4, "max_line_length": 98, "avg_line_length": 42, "alnum_prop": 0.7797619047619048, "repo_name": "lechuckroh/devdocs", "id": "4cf484606380cfc6e7ee47ab41e593c4925bc898", "size": "198", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "chrome_ext/README.md", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Shell", "bytes": "3134" } ], "symlink_target": "" }
.class public interface abstract Lcom/android/camera/ui/StrangerPickClearContainer$PickClearListener;
.super Ljava/lang/Object;
.source "StrangerPickClearContainer.java"


# annotations
.annotation system Ldalvik/annotation/EnclosingClass;
    value = Lcom/android/camera/ui/StrangerPickClearContainer;
.end annotation

.annotation system Ldalvik/annotation/InnerClass;
    accessFlags = 0x609
    name = "PickClearListener"
.end annotation


# virtual methods
.method public abstract onCancelClick()V
.end method

.method public abstract onPositionTouch(FF)Z
.end method

.method public abstract onPreviewImage([BII)V
.end method

.method public abstract onSaveClick()V
.end method
{ "content_hash": "07c785a39796968593a1aa40de4bf3a8", "timestamp": "", "source": "github", "line_count": 28, "max_line_length": 101, "avg_line_length": 24.392857142857142, "alnum_prop": 0.808199121522694, "repo_name": "baidurom/devices-Coolpad8720L", "id": "df7f4cf23958fbb2c60daf1b28d35f199e265643", "size": "683", "binary": false, "copies": "1", "ref": "refs/heads/coron-4.3", "path": "CP_Gallery3D/smali/com/android/camera/ui/StrangerPickClearContainer$PickClearListener.smali", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Makefile", "bytes": "13619" }, { "name": "Shell", "bytes": "1917" } ], "symlink_target": "" }
package controllers

import neo4j.models.user.UserRole
import neo4j.models.require.{OtherEffort, Requirement, Project}
import neo4j.services.Neo4JServiceProvider
import play.api.data.Form
import play.api.data.Forms._
import scala.collection.JavaConversions._
import org.apache.commons.lang.StringUtils
import neo4j.repositories.RequirementRepository.RequirementInfo
import play.api.mvc.AnyContent
import groovy.lang.{Binding, GroovyShell, Script}

object EffortController extends BaseController {

  val groovyEngine = new GroovyShell(new Binding())

  def addEffort(id: Long) = AuthenticatedLoggingAction(UserRole.USER) { implicit request =>
    val project = Neo4JServiceProvider.get().projectRepository.findOne(id)
    val user = PlaySession.getUser
    // TODO contributors ?
    if (project != null && project.author.id == user.id) {
      effortForm.bindFromRequest.fold(
        formWithErrors => Ok(views.html.require.effortEditDialog(-1L, project.id, formWithErrors, "create")),
        value => {
          val effort = OtherEffort.create(value.effortName, value.effortDescription, value.effortEffort, user, project)
          Ok(routes.RequirementController.requirementListIdEffort(project.id).url)
        })
    } else {
      Ok(routes.RequirementController.requirementList().url)
    }
  }

  def effortEditPanel(projectId: Long, id: Long) = AuthenticatedLoggingAction(UserRole.USER) { implicit request =>
    if (id > 0) {
      val effort = Neo4JServiceProvider.get().otherEffortRepository.findOne(id)
      val user = PlaySession.getUser
      // TODO contributors
      if (effort != null && effort.author.id == user.id) {
        val caseEffort = CaseEffort(effort.id, effort.name, effort.description, effort.effort)
        Ok(views.html.require.effortEditDialog(id, projectId, effortForm.fill(caseEffort), "edit"))
      } else {
        Ok(views.html.require.effortEditDialog(-1L, projectId, effortForm, "create"))
      }
    } else {
      Ok(views.html.require.effortEditDialog(-1L, projectId, effortForm, "create"))
    }
  }

  def editEffort(id: Long) = AuthenticatedLoggingAction(UserRole.USER) { implicit request =>
    val effort = Neo4JServiceProvider.get().otherEffortRepository.findOne(id)
    val user = PlaySession.getUser
    // TODO contributors
    if (effort != null && effort.author.id == user.id) {
      effortForm.bindFromRequest.fold(
        formWithErrors => Ok(views.html.require.effortEditDialog(id, effort.project.id, formWithErrors, "edit")),
        value => {
          effort.name = value.effortName
          effort.description = value.effortDescription
          effort.effort = value.effortEffort
          Neo4JServiceProvider.get().otherEffortRepository.save(effort)
          Ok(routes.RequirementController.requirementListIdEffort(effort.project.id).url)
        }
      )
    } else {
      Ok(routes.RequirementController.requirementList().url)
    }
  }

  def deleteEffort(id: Long) = AuthenticatedLoggingAction(UserRole.USER) { implicit request =>
    val effort = Neo4JServiceProvider.get().otherEffortRepository.findOne(id)
    val user = PlaySession.getUser
    // Only dereference effort.project after the null check; the original read
    // effort.project.id before checking, which would NPE for unknown ids.
    if (effort != null && effort.author.id == user.id) {
      val projectId = effort.project.id
      Neo4JServiceProvider.get().otherEffortRepository.delete(effort)
      Ok(routes.RequirementController.requirementListIdEffort(projectId).url)
    } else if (effort != null) {
      Ok(routes.RequirementController.requirementListIdEffort(effort.project.id).url)
    } else {
      Ok(routes.RequirementController.requirementList().url)
    }
  }

  case class CaseEffort(effortId: Long, effortName: String, effortDescription: String, effortEffort: String)

  val effortForm: Form[CaseEffort] = Form(
    mapping(
      "effortId" -> default(longNumber, -1L),
      "effortName" -> nonEmptyText,
      "effortDescription" -> text,
      // Blank efforts are accepted; anything else must evaluate as a Groovy expression.
      "effortEffort" -> text.verifying("error.format", effort => {
        if (StringUtils.isBlank(effort)) {
          true
        } else {
          try {
            calcGroovyExpression(effort, 10)
            true
          } catch {
            case e: Exception => false
          }
        }
      })
    )(CaseEffort.apply)(CaseEffort.unapply))

  // Evaluates an effort expression as Groovy, binding "h" to the hourly rate:
  // "2h + 15" becomes "2*h + 15", so calcGroovyExpression("2h + 15", 10.0) == 35.0.
  def calcGroovyExpression(effort: String, hourlyRate: Double): Double = {
    val script = groovyEngine.parse(effort.replace(",", ".").replace("h", "*h"))
    script.setProperty("h", hourlyRate)
    val result = script.run()
    if (result.isInstanceOf[Int]) {
      1.0 * result.asInstanceOf[Int]
    } else {
      result.asInstanceOf[Double]
    }
  }
}
{ "content_hash": "d0500e2d8573e934321f5d0e34fda941", "timestamp": "", "source": "github", "line_count": 115, "max_line_length": 121, "avg_line_length": 38.904347826086955, "alnum_prop": 0.672552525704068, "repo_name": "unterstein/requireweb", "id": "1c27565b2dbd3e37a51cc4511cd5992b789fde3f", "size": "5109", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "app/controllers/EffortController.scala", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "CSS", "bytes": "27049" }, { "name": "Java", "bytes": "43545" }, { "name": "JavaScript", "bytes": "16207" }, { "name": "Scala", "bytes": "46822" }, { "name": "Shell", "bytes": "238" } ], "symlink_target": "" }
package org.apache.druid.indexing.kafka.supervisor; import com.fasterxml.jackson.core.JsonProcessingException; import com.fasterxml.jackson.databind.ObjectMapper; import com.google.common.base.Optional; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; import com.google.common.collect.ImmutableSet; import com.google.common.util.concurrent.Futures; import com.google.common.util.concurrent.ListenableFuture; import org.apache.curator.test.TestingCluster; import org.apache.druid.data.input.InputFormat; import org.apache.druid.data.input.impl.DimensionSchema; import org.apache.druid.data.input.impl.DimensionsSpec; import org.apache.druid.data.input.impl.JsonInputFormat; import org.apache.druid.data.input.impl.StringDimensionSchema; import org.apache.druid.data.input.impl.TimestampSpec; import org.apache.druid.data.input.kafka.KafkaRecordEntity; import org.apache.druid.indexer.TaskLocation; import org.apache.druid.indexer.TaskStatus; import org.apache.druid.indexing.common.TaskInfoProvider; import org.apache.druid.indexing.common.TestUtils; import org.apache.druid.indexing.common.task.RealtimeIndexTask; import org.apache.druid.indexing.common.task.Task; import org.apache.druid.indexing.kafka.KafkaConsumerConfigs; import org.apache.druid.indexing.kafka.KafkaDataSourceMetadata; import org.apache.druid.indexing.kafka.KafkaIndexTask; import org.apache.druid.indexing.kafka.KafkaIndexTaskClient; import org.apache.druid.indexing.kafka.KafkaIndexTaskClientFactory; import org.apache.druid.indexing.kafka.KafkaIndexTaskIOConfig; import org.apache.druid.indexing.kafka.KafkaIndexTaskTuningConfig; import org.apache.druid.indexing.kafka.KafkaRecordSupplier; import org.apache.druid.indexing.kafka.test.TestBroker; import org.apache.druid.indexing.overlord.DataSourceMetadata; import org.apache.druid.indexing.overlord.IndexerMetadataStorageCoordinator; import org.apache.druid.indexing.overlord.TaskMaster; import org.apache.druid.indexing.overlord.TaskQueue; import org.apache.druid.indexing.overlord.TaskRunner; import org.apache.druid.indexing.overlord.TaskRunnerListener; import org.apache.druid.indexing.overlord.TaskRunnerWorkItem; import org.apache.druid.indexing.overlord.TaskStorage; import org.apache.druid.indexing.overlord.supervisor.SupervisorReport; import org.apache.druid.indexing.overlord.supervisor.SupervisorStateManager; import org.apache.druid.indexing.overlord.supervisor.SupervisorStateManagerConfig; import org.apache.druid.indexing.overlord.supervisor.autoscaler.SupervisorTaskAutoScaler; import org.apache.druid.indexing.seekablestream.SeekableStreamEndSequenceNumbers; import org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskRunner.Status; import org.apache.druid.indexing.seekablestream.SeekableStreamIndexTaskTuningConfig; import org.apache.druid.indexing.seekablestream.SeekableStreamStartSequenceNumbers; import org.apache.druid.indexing.seekablestream.common.RecordSupplier; import org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisorSpec; import org.apache.druid.indexing.seekablestream.supervisor.SeekableStreamSupervisorStateManager; import org.apache.druid.indexing.seekablestream.supervisor.TaskReportData; import org.apache.druid.indexing.seekablestream.supervisor.autoscaler.LagBasedAutoScalerConfig; import org.apache.druid.java.util.common.DateTimes; import org.apache.druid.java.util.common.ISE; import org.apache.druid.java.util.common.StringUtils; import org.apache.druid.java.util.common.granularity.Granularities; 
import org.apache.druid.java.util.common.parsers.JSONPathSpec; import org.apache.druid.java.util.emitter.EmittingLogger; import org.apache.druid.query.aggregation.AggregatorFactory; import org.apache.druid.query.aggregation.CountAggregatorFactory; import org.apache.druid.segment.TestHelper; import org.apache.druid.segment.incremental.ParseExceptionReport; import org.apache.druid.segment.incremental.RowIngestionMetersFactory; import org.apache.druid.segment.indexing.DataSchema; import org.apache.druid.segment.indexing.RealtimeIOConfig; import org.apache.druid.segment.indexing.granularity.UniformGranularitySpec; import org.apache.druid.segment.realtime.FireDepartment; import org.apache.druid.server.metrics.DruidMonitorSchedulerConfig; import org.apache.druid.server.metrics.ExceptionCapturingServiceEmitter; import org.apache.druid.server.metrics.NoopServiceEmitter; import org.apache.kafka.clients.admin.Admin; import org.apache.kafka.clients.admin.NewPartitions; import org.apache.kafka.clients.admin.NewTopic; import org.apache.kafka.clients.consumer.KafkaConsumer; import org.apache.kafka.clients.producer.KafkaProducer; import org.apache.kafka.clients.producer.ProducerRecord; import org.apache.kafka.common.serialization.ByteArrayDeserializer; import org.apache.kafka.common.serialization.Deserializer; import org.easymock.Capture; import org.easymock.CaptureType; import org.easymock.EasyMock; import org.easymock.EasyMockSupport; import org.easymock.IAnswer; import org.joda.time.DateTime; import org.joda.time.Duration; import org.joda.time.Period; import org.junit.After; import org.junit.AfterClass; import org.junit.Assert; import org.junit.Before; import org.junit.BeforeClass; import org.junit.Test; import org.junit.runner.RunWith; import org.junit.runners.Parameterized; import java.io.File; import java.io.IOException; import java.util.ArrayList; import java.util.Collection; import java.util.Collections; import java.util.HashMap; import java.util.List; import java.util.Map; import java.util.Properties; import java.util.TreeMap; import java.util.concurrent.Executor; @RunWith(Parameterized.class) public class KafkaSupervisorTest extends EasyMockSupport { private static final ObjectMapper OBJECT_MAPPER = TestHelper.makeJsonMapper(); private static final InputFormat INPUT_FORMAT = new JsonInputFormat( new JSONPathSpec(true, ImmutableList.of()), ImmutableMap.of(), false ); private static final String TOPIC_PREFIX = "testTopic"; private static final String DATASOURCE = "testDS"; private static final int NUM_PARTITIONS = 3; private static final int TEST_CHAT_THREADS = 3; private static final long TEST_CHAT_RETRIES = 9L; private static final Period TEST_HTTP_TIMEOUT = new Period("PT10S"); private static final Period TEST_SHUTDOWN_TIMEOUT = new Period("PT80S"); private static TestingCluster zkServer; private static TestBroker kafkaServer; private static String kafkaHost; private static DataSchema dataSchema; private static int topicPostfix; private final int numThreads; private TestableKafkaSupervisor supervisor; private TaskStorage taskStorage; private TaskMaster taskMaster; private TaskRunner taskRunner; private IndexerMetadataStorageCoordinator indexerMetadataStorageCoordinator; private KafkaIndexTaskClient taskClient; private TaskQueue taskQueue; private String topic; private RowIngestionMetersFactory rowIngestionMetersFactory; private ExceptionCapturingServiceEmitter serviceEmitter; private SupervisorStateManagerConfig supervisorConfig; private KafkaSupervisorIngestionSpec ingestionSchema; private 
static String getTopic() { //noinspection StringConcatenationMissingWhitespace return TOPIC_PREFIX + topicPostfix++; } @Parameterized.Parameters(name = "numThreads = {0}") public static Iterable<Object[]> constructorFeeder() { return ImmutableList.of(new Object[]{1}, new Object[]{8}); } public KafkaSupervisorTest(int numThreads) { this.numThreads = numThreads; } @BeforeClass public static void setupClass() throws Exception { zkServer = new TestingCluster(1); zkServer.start(); kafkaServer = new TestBroker( zkServer.getConnectString(), null, 1, ImmutableMap.of( "num.partitions", String.valueOf(NUM_PARTITIONS), "auto.create.topics.enable", String.valueOf(false) ) ); kafkaServer.start(); kafkaHost = StringUtils.format("localhost:%d", kafkaServer.getPort()); dataSchema = getDataSchema(DATASOURCE); } @Before public void setupTest() { taskStorage = createMock(TaskStorage.class); taskMaster = createMock(TaskMaster.class); taskRunner = createMock(TaskRunner.class); indexerMetadataStorageCoordinator = createMock(IndexerMetadataStorageCoordinator.class); taskClient = createMock(KafkaIndexTaskClient.class); taskQueue = createMock(TaskQueue.class); topic = getTopic(); rowIngestionMetersFactory = new TestUtils().getRowIngestionMetersFactory(); serviceEmitter = new ExceptionCapturingServiceEmitter(); EmittingLogger.registerEmitter(serviceEmitter); supervisorConfig = new SupervisorStateManagerConfig(); ingestionSchema = EasyMock.createMock(KafkaSupervisorIngestionSpec.class); } @After public void tearDownTest() { supervisor = null; } @AfterClass public static void tearDownClass() throws IOException { kafkaServer.close(); kafkaServer = null; zkServer.stop(); zkServer = null; } @Test public void testNoInitialStateWithAutoscaler() throws Exception { KafkaIndexTaskClientFactory taskClientFactory = new KafkaIndexTaskClientFactory( null, null ) { @Override public KafkaIndexTaskClient build( TaskInfoProvider taskInfoProvider, String dataSource, int numThreads, Duration httpTimeout, long numRetries ) { Assert.assertEquals(TEST_CHAT_THREADS, numThreads); Assert.assertEquals(TEST_HTTP_TIMEOUT.toStandardDuration(), httpTimeout); Assert.assertEquals(TEST_CHAT_RETRIES, numRetries); return taskClient; } }; HashMap<String, Object> autoScalerConfig = new HashMap<>(); autoScalerConfig.put("enableTaskAutoScaler", true); autoScalerConfig.put("lagCollectionIntervalMillis", 500); autoScalerConfig.put("lagCollectionRangeMillis", 500); autoScalerConfig.put("scaleOutThreshold", 0); autoScalerConfig.put("triggerScaleOutFractionThreshold", 0.0); autoScalerConfig.put("scaleInThreshold", 1000000); autoScalerConfig.put("triggerScaleInFractionThreshold", 0.8); autoScalerConfig.put("scaleActionStartDelayMillis", 0); autoScalerConfig.put("scaleActionPeriodMillis", 100); autoScalerConfig.put("taskCountMax", 2); autoScalerConfig.put("taskCountMin", 1); autoScalerConfig.put("scaleInStep", 1); autoScalerConfig.put("scaleOutStep", 2); autoScalerConfig.put("minTriggerScaleActionFrequencyMillis", 1200000); final Map<String, Object> consumerProperties = KafkaConsumerConfigs.getConsumerProperties(); consumerProperties.put("myCustomKey", "myCustomValue"); consumerProperties.put("bootstrap.servers", kafkaHost); KafkaSupervisorIOConfig kafkaSupervisorIOConfig = new KafkaSupervisorIOConfig( topic, INPUT_FORMAT, 1, 1, new Period("PT1H"), consumerProperties, OBJECT_MAPPER.convertValue(autoScalerConfig, LagBasedAutoScalerConfig.class), KafkaSupervisorIOConfig.DEFAULT_POLL_TIMEOUT_MILLIS, new Period("P1D"), new Period("PT30S"), true, new 
Period("PT30M"), null, null, null ); final KafkaSupervisorTuningConfig tuningConfigOri = new KafkaSupervisorTuningConfig( null, 1000, null, null, 50000, null, new Period("P1Y"), new File("/test"), null, null, null, false, null, false, null, numThreads, TEST_CHAT_THREADS, TEST_CHAT_RETRIES, TEST_HTTP_TIMEOUT, TEST_SHUTDOWN_TIMEOUT, null, null, null, null, null ); EasyMock.expect(ingestionSchema.getIOConfig()).andReturn(kafkaSupervisorIOConfig).anyTimes(); EasyMock.expect(ingestionSchema.getDataSchema()).andReturn(dataSchema).anyTimes(); EasyMock.expect(ingestionSchema.getTuningConfig()).andReturn(tuningConfigOri).anyTimes(); EasyMock.replay(ingestionSchema); SeekableStreamSupervisorSpec testableSupervisorSpec = new KafkaSupervisorSpec( ingestionSchema, dataSchema, tuningConfigOri, kafkaSupervisorIOConfig, null, false, taskStorage, taskMaster, indexerMetadataStorageCoordinator, taskClientFactory, OBJECT_MAPPER, new NoopServiceEmitter(), new DruidMonitorSchedulerConfig(), rowIngestionMetersFactory, new SupervisorStateManagerConfig() ); supervisor = new TestableKafkaSupervisor( taskStorage, taskMaster, indexerMetadataStorageCoordinator, taskClientFactory, OBJECT_MAPPER, (KafkaSupervisorSpec) testableSupervisorSpec, rowIngestionMetersFactory ); SupervisorTaskAutoScaler autoscaler = testableSupervisorSpec.createAutoscaler(supervisor); final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig(); addSomeEvents(1); Capture<KafkaIndexTask> captured = Capture.newInstance(); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskMaster.getSupervisorManager()).andReturn(Optional.absent()).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); int taskCountBeforeScale = supervisor.getIoConfig().getTaskCount(); Assert.assertEquals(1, taskCountBeforeScale); autoscaler.start(); supervisor.runInternal(); Thread.sleep(1 * 1000); verifyAll(); int taskCountAfterScale = supervisor.getIoConfig().getTaskCount(); Assert.assertEquals(2, taskCountAfterScale); KafkaIndexTask task = captured.getValue(); Assert.assertEquals(KafkaSupervisorTest.dataSchema, task.getDataSchema()); Assert.assertEquals(tuningConfig.convertToTaskTuningConfig(), task.getTuningConfig()); KafkaIndexTaskIOConfig taskConfig = task.getIOConfig(); Assert.assertEquals(kafkaHost, taskConfig.getConsumerProperties().get("bootstrap.servers")); Assert.assertEquals("myCustomValue", taskConfig.getConsumerProperties().get("myCustomKey")); Assert.assertEquals("sequenceName-0", taskConfig.getBaseSequenceName()); Assert.assertTrue("isUseTransaction", taskConfig.isUseTransaction()); Assert.assertFalse("minimumMessageTime", taskConfig.getMinimumMessageTime().isPresent()); Assert.assertFalse("maximumMessageTime", taskConfig.getMaximumMessageTime().isPresent()); Assert.assertEquals(topic, taskConfig.getStartSequenceNumbers().getStream()); Assert.assertEquals(0L, (long) taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0)); Assert.assertEquals(0L, 
(long) taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(1)); Assert.assertEquals(0L, (long) taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2)); Assert.assertEquals(topic, taskConfig.getEndSequenceNumbers().getStream()); Assert.assertEquals( Long.MAX_VALUE, (long) taskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(0) ); Assert.assertEquals( Long.MAX_VALUE, (long) taskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(1) ); Assert.assertEquals( Long.MAX_VALUE, (long) taskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(2) ); autoscaler.reset(); autoscaler.stop(); } @Test public void testCreateBaseTaskContexts() throws JsonProcessingException { supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null); final Map<String, Object> contexts = supervisor.createIndexTasks( 1, "seq", OBJECT_MAPPER, new TreeMap<>(), new KafkaIndexTaskIOConfig( 0, "seq", new SeekableStreamStartSequenceNumbers<>("test", Collections.emptyMap(), Collections.emptySet()), new SeekableStreamEndSequenceNumbers<>("test", Collections.emptyMap()), Collections.emptyMap(), null, null, null, null, INPUT_FORMAT ), new KafkaIndexTaskTuningConfig( null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null ), null ).get(0).getContext(); final Boolean contextValue = (Boolean) contexts.get("IS_INCREMENTAL_HANDOFF_SUPPORTED"); Assert.assertNotNull(contextValue); Assert.assertTrue(contextValue); } @Test public void testNoInitialState() throws Exception { supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null); final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig(); addSomeEvents(1); Capture<KafkaIndexTask> captured = Capture.newInstance(); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); KafkaIndexTask task = captured.getValue(); Assert.assertEquals(dataSchema, task.getDataSchema()); Assert.assertEquals(tuningConfig.convertToTaskTuningConfig(), task.getTuningConfig()); KafkaIndexTaskIOConfig taskConfig = task.getIOConfig(); Assert.assertEquals(kafkaHost, taskConfig.getConsumerProperties().get("bootstrap.servers")); Assert.assertEquals("myCustomValue", taskConfig.getConsumerProperties().get("myCustomKey")); Assert.assertEquals("sequenceName-0", taskConfig.getBaseSequenceName()); Assert.assertTrue("isUseTransaction", taskConfig.isUseTransaction()); Assert.assertFalse("minimumMessageTime", taskConfig.getMinimumMessageTime().isPresent()); Assert.assertFalse("maximumMessageTime", taskConfig.getMaximumMessageTime().isPresent()); Assert.assertEquals(topic, taskConfig.getStartSequenceNumbers().getStream()); Assert.assertEquals(0L, (long) taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0)); Assert.assertEquals(0L, (long) 
taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(1)); Assert.assertEquals(0L, (long) taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2)); Assert.assertEquals(topic, taskConfig.getEndSequenceNumbers().getStream()); Assert.assertEquals( Long.MAX_VALUE, (long) taskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(0) ); Assert.assertEquals( Long.MAX_VALUE, (long) taskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(1) ); Assert.assertEquals( Long.MAX_VALUE, (long) taskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(2) ); } @Test public void testSkipOffsetGaps() throws Exception { supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null); addSomeEvents(1); Capture<KafkaIndexTask> captured = Capture.newInstance(); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); } @Test public void testMultiTask() throws Exception { supervisor = getTestableSupervisor(1, 2, true, "PT1H", null, null); addSomeEvents(1); Capture<KafkaIndexTask> captured = Capture.newInstance(CaptureType.ALL); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.absent()).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(2); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); KafkaIndexTask task1 = captured.getValues().get(0); Assert.assertEquals(2, task1.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().size()); Assert.assertEquals(2, task1.getIOConfig().getEndSequenceNumbers().getPartitionSequenceNumberMap().size()); Assert.assertEquals( 0L, task1.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue() ); Assert.assertEquals( Long.MAX_VALUE, task1.getIOConfig().getEndSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue() ); Assert.assertEquals( 0L, task1.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue() ); Assert.assertEquals( Long.MAX_VALUE, task1.getIOConfig().getEndSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue() ); KafkaIndexTask task2 = captured.getValues().get(1); Assert.assertEquals(1, task2.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().size()); Assert.assertEquals(1, task2.getIOConfig().getEndSequenceNumbers().getPartitionSequenceNumberMap().size()); Assert.assertEquals( 0L, task2.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue() ); Assert.assertEquals( 
  @Test
  public void testMultiTask() throws Exception
  {
    supervisor = getTestableSupervisor(1, 2, true, "PT1H", null, null);
    addSomeEvents(1);

    Capture<KafkaIndexTask> captured = Capture.newInstance(CaptureType.ALL);
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.absent()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(2);
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    KafkaIndexTask task1 = captured.getValues().get(0);
    Assert.assertEquals(2, task1.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().size());
    Assert.assertEquals(2, task1.getIOConfig().getEndSequenceNumbers().getPartitionSequenceNumberMap().size());
    Assert.assertEquals(0L, task1.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue());
    Assert.assertEquals(Long.MAX_VALUE, task1.getIOConfig().getEndSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue());
    Assert.assertEquals(0L, task1.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue());
    Assert.assertEquals(Long.MAX_VALUE, task1.getIOConfig().getEndSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue());

    KafkaIndexTask task2 = captured.getValues().get(1);
    Assert.assertEquals(1, task2.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().size());
    Assert.assertEquals(1, task2.getIOConfig().getEndSequenceNumbers().getPartitionSequenceNumberMap().size());
    Assert.assertEquals(0L, task2.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue());
    Assert.assertEquals(Long.MAX_VALUE, task2.getIOConfig().getEndSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue());
  }

  @Test
  public void testReplicas() throws Exception
  {
    supervisor = getTestableSupervisor(2, 1, true, "PT1H", null, null);
    addSomeEvents(1);

    Capture<KafkaIndexTask> captured = Capture.newInstance(CaptureType.ALL);
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.absent()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(2);
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    KafkaIndexTask task1 = captured.getValues().get(0);
    Assert.assertEquals(3, task1.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().size());
    Assert.assertEquals(3, task1.getIOConfig().getEndSequenceNumbers().getPartitionSequenceNumberMap().size());
    Assert.assertEquals(0L, task1.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue());
    Assert.assertEquals(0L, task1.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue());
    Assert.assertEquals(0L, task1.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue());

    KafkaIndexTask task2 = captured.getValues().get(1);
    Assert.assertEquals(3, task2.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().size());
    Assert.assertEquals(3, task2.getIOConfig().getEndSequenceNumbers().getPartitionSequenceNumberMap().size());
    Assert.assertEquals(0L, task2.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue());
    Assert.assertEquals(0L, task2.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue());
    Assert.assertEquals(0L, task2.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue());
  }

  @Test
  public void testLateMessageRejectionPeriod() throws Exception
  {
    supervisor = getTestableSupervisor(2, 1, true, "PT1H", new Period("PT1H"), null);
    addSomeEvents(1);

    Capture<KafkaIndexTask> captured = Capture.newInstance(CaptureType.ALL);
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.absent()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(2);
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    KafkaIndexTask task1 = captured.getValues().get(0);
    KafkaIndexTask task2 = captured.getValues().get(1);
    Assert.assertTrue("minimumMessageTime", task1.getIOConfig().getMinimumMessageTime().get().plusMinutes(59).isBeforeNow());
    Assert.assertTrue("minimumMessageTime", task1.getIOConfig().getMinimumMessageTime().get().plusMinutes(61).isAfterNow());
    Assert.assertEquals(task1.getIOConfig().getMinimumMessageTime().get(), task2.getIOConfig().getMinimumMessageTime().get());
  }
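
  /**
   * With an earlyMessageRejectionPeriod of PT1H on top of a PT1H task duration, maximumMessageTime
   * should land roughly two hours from now, and both replicas should agree on it.
   */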
  @Test
  public void testEarlyMessageRejectionPeriod() throws Exception
  {
    supervisor = getTestableSupervisor(2, 1, true, "PT1H", null, new Period("PT1H"));
    addSomeEvents(1);

    Capture<KafkaIndexTask> captured = Capture.newInstance(CaptureType.ALL);
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.absent()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(2);
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    KafkaIndexTask task1 = captured.getValues().get(0);
    KafkaIndexTask task2 = captured.getValues().get(1);
    Assert.assertTrue("maximumMessageTime", task1.getIOConfig().getMaximumMessageTime().get().minusMinutes(59 + 60).isAfterNow());
    Assert.assertTrue("maximumMessageTime", task1.getIOConfig().getMaximumMessageTime().get().minusMinutes(61 + 60).isBeforeNow());
    Assert.assertEquals(task1.getIOConfig().getMaximumMessageTime().get(), task2.getIOConfig().getMaximumMessageTime().get());
  }

  /**
   * Test generating the starting offsets from the partition high water marks in Kafka.
   */
  @Test
  public void testLatestOffset() throws Exception
  {
    supervisor = getTestableSupervisor(1, 1, false, "PT1H", null, null);
    addSomeEvents(1100);

    Capture<KafkaIndexTask> captured = Capture.newInstance();
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.absent()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true);
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    KafkaIndexTask task = captured.getValue();
    Assert.assertEquals(1101L, task.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue());
    Assert.assertEquals(1101L, task.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue());
    Assert.assertEquals(1101L, task.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue());
  }

  /**
   * Test if partitionIds get updated.
   */
  @Test
  public void testPartitionIdsUpdates() throws Exception
  {
    supervisor = getTestableSupervisor(1, 1, false, "PT1H", null, null);
    addSomeEvents(1100);

    Capture<KafkaIndexTask> captured = Capture.newInstance();
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.absent()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true);
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    Assert.assertFalse(supervisor.isPartitionIdsEmpty());
  }
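
  /**
   * Even when the supervisor is configured to start from the latest offsets, partitions discovered
   * after startup should be read from their earliest offset so no data is skipped.
   */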
  @Test
  public void testAlwaysUsesEarliestOffsetForNewlyDiscoveredPartitions() throws Exception
  {
    supervisor = getTestableSupervisor(1, 1, false, "PT1H", null, null);
    addSomeEvents(9);

    Capture<KafkaIndexTask> captured = Capture.newInstance();
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.absent()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true);
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    KafkaIndexTask task = captured.getValue();
    Assert.assertEquals(10, task.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue());
    Assert.assertEquals(10, task.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue());
    Assert.assertEquals(10, task.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue());

    addMoreEvents(9, 6);
    EasyMock.reset(taskQueue, taskStorage);
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    Capture<KafkaIndexTask> tmp = Capture.newInstance();
    EasyMock.expect(taskQueue.add(EasyMock.capture(tmp))).andReturn(true);
    EasyMock.replay(taskStorage, taskQueue);
    supervisor.runInternal();
    verifyAll();

    EasyMock.reset(taskQueue, taskStorage);
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    Capture<KafkaIndexTask> newcaptured = Capture.newInstance();
    EasyMock.expect(taskQueue.add(EasyMock.capture(newcaptured))).andReturn(true);
    EasyMock.replay(taskStorage, taskQueue);
    supervisor.runInternal();
    verifyAll();

    // verify that the newly discovered partitions start from the earliest offset
    task = newcaptured.getValue();
    Assert.assertEquals(0, task.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(3).longValue());
    Assert.assertEquals(0, task.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(4).longValue());
    Assert.assertEquals(0, task.getIOConfig().getStartSequenceNumbers().getPartitionSequenceNumberMap().get(5).longValue());
  }

  /**
   * Test generating the starting offsets from the partition data stored in druid_dataSource,
   * which contains the offsets of the last built segments.
   */
  @Test
  public void testDatasourceMetadata() throws Exception
  {
    supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null);
    addSomeEvents(100);

    Capture<KafkaIndexTask> captured = Capture.newInstance();
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.absent()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn(
        new KafkaDataSourceMetadata(
            new SeekableStreamStartSequenceNumbers<>(topic, ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of())
        )
    ).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true);
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    KafkaIndexTask task = captured.getValue();
    KafkaIndexTaskIOConfig taskConfig = task.getIOConfig();
    Assert.assertEquals("sequenceName-0", taskConfig.getBaseSequenceName());
    Assert.assertEquals(10L, taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue());
    Assert.assertEquals(20L, taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue());
    Assert.assertEquals(30L, taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue());
  }

  @Test
  public void testBadMetadataOffsets() throws Exception
  {
    supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null);
    addSomeEvents(1);

    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.absent()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    // For simplicity in testing the offset availability check, store negative offsets in the
    // metadata; since the stream's earliest offset is 0, these could never occur in real usage.
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn(
        new KafkaDataSourceMetadata(
            new SeekableStreamStartSequenceNumbers<>(topic, ImmutableMap.of(0, -10L, 1, -20L, 2, -30L), ImmutableSet.of())
        )
    ).anyTimes();
    replayAll();

    supervisor.start();
    supervisor.runInternal();

    Assert.assertEquals(
        "org.apache.druid.java.util.common.ISE",
        supervisor.getStateManager().getExceptionEvents().get(0).getExceptionClass()
    );
  }
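
  /**
   * Active tasks of a different type for the same datasource (here a RealtimeIndexTask) should be
   * ignored by the supervisor rather than killed.
   */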
  @Test
  public void testDontKillTasksWithMismatchedType() throws Exception
  {
    supervisor = getTestableSupervisor(2, 1, true, "PT1H", null, null);
    addSomeEvents(1);

    // non KafkaIndexTask (don't kill)
    Task id2 = new RealtimeIndexTask(
        "id2",
        null,
        new FireDepartment(dataSchema, new RealtimeIOConfig(null, null), null),
        null
    );
    List<Task> existingTasks = ImmutableList.of(id2);

    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes();
    EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(existingTasks).anyTimes();
    EasyMock.expect(taskClient.getStatusAsync(EasyMock.anyString()))
            .andReturn(Futures.immediateFuture(Status.NOT_STARTED)).anyTimes();
    EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.anyString()))
            .andReturn(Futures.immediateFuture(DateTimes.nowUtc())).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class));
    EasyMock.expect(taskQueue.add(EasyMock.anyObject(Task.class))).andReturn(true).anyTimes();
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();
  }
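
  /**
   * Tasks whose partition assignments don't match the supervisor's expected task groups (id3, id4,
   * id5) should be stopped, and any that fail to stop cleanly should be shut down via the queue.
   */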
  @Test
  public void testKillBadPartitionAssignment() throws Exception
  {
    supervisor = getTestableSupervisor(1, 2, true, "PT1H", null, null);
    final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig();
    addSomeEvents(1);

    Task id1 = createKafkaIndexTask(
        "id1",
        DATASOURCE,
        0,
        new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 0L, 2, 0L), ImmutableSet.of()),
        new SeekableStreamEndSequenceNumbers<>("topic", ImmutableMap.of(0, Long.MAX_VALUE, 2, Long.MAX_VALUE)),
        null,
        null,
        tuningConfig
    );
    Task id2 = createKafkaIndexTask(
        "id2",
        DATASOURCE,
        1,
        new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(1, 0L), ImmutableSet.of()),
        new SeekableStreamEndSequenceNumbers<>("topic", ImmutableMap.of(1, Long.MAX_VALUE)),
        null,
        null,
        tuningConfig
    );
    Task id3 = createKafkaIndexTask(
        "id3",
        DATASOURCE,
        0,
        new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 0L, 1, 0L, 2, 0L), ImmutableSet.of()),
        new SeekableStreamEndSequenceNumbers<>("topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE)),
        null,
        null,
        tuningConfig
    );
    Task id4 = createKafkaIndexTask(
        "id4",
        DATASOURCE,
        0,
        new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 0L, 1, 0L), ImmutableSet.of()),
        new SeekableStreamEndSequenceNumbers<>("topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE)),
        null,
        null,
        tuningConfig
    );
    Task id5 = createKafkaIndexTask(
        "id5",
        DATASOURCE,
        0,
        new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(1, 0L, 2, 0L), ImmutableSet.of()),
        new SeekableStreamEndSequenceNumbers<>("topic", ImmutableMap.of(1, Long.MAX_VALUE, 2, Long.MAX_VALUE)),
        null,
        null,
        tuningConfig
    );
    List<Task> existingTasks = ImmutableList.of(id1, id2, id3, id4, id5);

    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes();
    EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(existingTasks).anyTimes();
    EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes();
    EasyMock.expect(taskStorage.getStatus("id2")).andReturn(Optional.of(TaskStatus.running("id2"))).anyTimes();
    EasyMock.expect(taskStorage.getStatus("id3")).andReturn(Optional.of(TaskStatus.running("id3"))).anyTimes();
    EasyMock.expect(taskStorage.getStatus("id4")).andReturn(Optional.of(TaskStatus.running("id4"))).anyTimes();
    EasyMock.expect(taskStorage.getStatus("id5")).andReturn(Optional.of(TaskStatus.running("id5"))).anyTimes();
    EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(id1)).anyTimes();
    EasyMock.expect(taskStorage.getTask("id2")).andReturn(Optional.of(id2)).anyTimes();
    EasyMock.expect(taskStorage.getTask("id3")).andReturn(Optional.of(id3)).anyTimes();
    EasyMock.expect(taskStorage.getTask("id4")).andReturn(Optional.of(id4)).anyTimes();
    EasyMock.expect(taskStorage.getTask("id5")).andReturn(Optional.of(id5)).anyTimes();
    EasyMock.expect(taskClient.getStatusAsync(EasyMock.anyString()))
            .andReturn(Futures.immediateFuture(Status.NOT_STARTED)).anyTimes();
    EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.anyString()))
            .andReturn(Futures.immediateFuture(DateTimes.nowUtc())).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskClient.stopAsync("id3", false)).andReturn(Futures.immediateFuture(true));
    EasyMock.expect(taskClient.stopAsync("id4", false)).andReturn(Futures.immediateFuture(false));
    EasyMock.expect(taskClient.stopAsync("id5", false)).andReturn(Futures.immediateFuture(null));

    TreeMap<Integer, Map<Integer, Long>> checkpoints1 = new TreeMap<>();
    checkpoints1.put(0, ImmutableMap.of(0, 0L, 2, 0L));
    TreeMap<Integer, Map<Integer, Long>> checkpoints2 = new TreeMap<>();
    checkpoints2.put(0, ImmutableMap.of(1, 0L));
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id1"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints1)).times(1);
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id2"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints2)).times(1);

    taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class));
    taskQueue.shutdown("id4", "Task [%s] failed to stop in a timely manner, killing task", "id4");
    taskQueue.shutdown("id5", "Task [%s] failed to stop in a timely manner, killing task", "id5");
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();
  }
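
  /**
   * When one of the four running tasks (two replicas across two task groups) fails, a replacement
   * should be queued under a new task id but the same base sequence name.
   */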
  @Test
  public void testRequeueTaskWhenFailed() throws Exception
  {
    supervisor = getTestableSupervisor(2, 2, true, "PT1H", null, null);
    addSomeEvents(1);

    Capture<Task> captured = Capture.newInstance(CaptureType.ALL);
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes();
    EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(taskClient.getStatusAsync(EasyMock.anyString()))
            .andReturn(Futures.immediateFuture(Status.NOT_STARTED)).anyTimes();
    EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.anyString()))
            .andReturn(Futures.immediateFuture(DateTimes.nowUtc())).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(4);

    TreeMap<Integer, Map<Integer, Long>> checkpoints1 = new TreeMap<>();
    checkpoints1.put(0, ImmutableMap.of(0, 0L, 2, 0L));
    TreeMap<Integer, Map<Integer, Long>> checkpoints2 = new TreeMap<>();
    checkpoints2.put(0, ImmutableMap.of(1, 0L));
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-0"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints1)).anyTimes();
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-1"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints2)).anyTimes();

    taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class));
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    List<Task> tasks = captured.getValues();

    // test that running the main loop again checks the status of the tasks that were created and
    // does nothing if they are all still running
    EasyMock.reset(taskStorage);
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(tasks).anyTimes();
    for (Task task : tasks) {
      EasyMock.expect(taskStorage.getStatus(task.getId())).andReturn(Optional.of(TaskStatus.running(task.getId()))).anyTimes();
      EasyMock.expect(taskStorage.getTask(task.getId())).andReturn(Optional.of(task)).anyTimes();
    }
    EasyMock.replay(taskStorage);

    supervisor.runInternal();
    verifyAll();

    // test that a task failing causes a new task to be re-queued with the same parameters
    Capture<Task> aNewTaskCapture = Capture.newInstance();
    List<Task> imStillAlive = tasks.subList(0, 3);
    KafkaIndexTask iHaveFailed = (KafkaIndexTask) tasks.get(3);
    EasyMock.reset(taskStorage);
    EasyMock.reset(taskQueue);
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(imStillAlive).anyTimes();
    for (Task task : imStillAlive) {
      EasyMock.expect(taskStorage.getStatus(task.getId())).andReturn(Optional.of(TaskStatus.running(task.getId()))).anyTimes();
      EasyMock.expect(taskStorage.getTask(task.getId())).andReturn(Optional.of(task)).anyTimes();
    }
    EasyMock.expect(taskStorage.getStatus(iHaveFailed.getId()))
            .andReturn(Optional.of(TaskStatus.failure(iHaveFailed.getId(), "Dummy task status failure err message")));
    EasyMock.expect(taskStorage.getTask(iHaveFailed.getId())).andReturn(Optional.of(iHaveFailed)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(aNewTaskCapture))).andReturn(true);
    EasyMock.replay(taskStorage);
    EasyMock.replay(taskQueue);

    supervisor.runInternal();
    verifyAll();

    Assert.assertNotEquals(iHaveFailed.getId(), aNewTaskCapture.getValue().getId());
    Assert.assertEquals(
        iHaveFailed.getIOConfig().getBaseSequenceName(),
        ((KafkaIndexTask) aNewTaskCapture.getValue()).getIOConfig().getBaseSequenceName()
    );
  }
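
  /**
   * A failed task adopted from another supervisor should be replaced by a task that preserves the
   * adopted minimumMessageTime and maximumMessageTime.
   */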
true, "PT1H", null, null); addSomeEvents(1); DateTime now = DateTimes.nowUtc(); DateTime maxi = now.plusMinutes(60); Task id1 = createKafkaIndexTask( "id1", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 0L, 2, 0L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>("topic", ImmutableMap.of(0, Long.MAX_VALUE, 2, Long.MAX_VALUE)), now, maxi, supervisor.getTuningConfig() ); List<Task> existingTasks = ImmutableList.of(id1); Capture<Task> captured = Capture.newInstance(); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(existingTasks).anyTimes(); EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes(); EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(id1)).anyTimes(); EasyMock.expect(taskClient.getStatusAsync("id1")).andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStartTimeAsync("id1")).andReturn(Futures.immediateFuture(now)).anyTimes(); EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); TreeMap<Integer, Map<Integer, Long>> checkpoints = new TreeMap<>(); checkpoints.put(0, ImmutableMap.of(0, 0L, 2, 0L)); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id1"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(2); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); // check that replica tasks are created with the same minimumMessageTime as tasks inherited from another supervisor Assert.assertEquals(now, ((KafkaIndexTask) captured.getValue()).getIOConfig().getMinimumMessageTime().get()); // test that a task failing causes a new task to be re-queued with the same parameters String runningTaskId = captured.getValue().getId(); Capture<Task> aNewTaskCapture = Capture.newInstance(); KafkaIndexTask iHaveFailed = (KafkaIndexTask) existingTasks.get(0); EasyMock.reset(taskStorage); EasyMock.reset(taskQueue); EasyMock.reset(taskClient); // for the newly created replica task EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-0"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(2); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id1"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)) .andReturn(ImmutableList.of(captured.getValue())) .anyTimes(); EasyMock.expect(taskStorage.getStatus(iHaveFailed.getId())) .andReturn(Optional.of(TaskStatus.failure(iHaveFailed.getId(), "Dummy task status failure err message"))); EasyMock.expect(taskStorage.getStatus(runningTaskId)) .andReturn(Optional.of(TaskStatus.running(runningTaskId))) .anyTimes(); EasyMock.expect(taskStorage.getTask(iHaveFailed.getId())).andReturn(Optional.of(iHaveFailed)).anyTimes(); EasyMock.expect(taskStorage.getTask(runningTaskId)).andReturn(Optional.of(captured.getValue())).anyTimes(); 
EasyMock.expect(taskClient.getStatusAsync(runningTaskId)).andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStartTimeAsync(runningTaskId)).andReturn(Futures.immediateFuture(now)).anyTimes(); EasyMock.expect(taskQueue.add(EasyMock.capture(aNewTaskCapture))).andReturn(true); EasyMock.replay(taskStorage); EasyMock.replay(taskQueue); EasyMock.replay(taskClient); supervisor.runInternal(); verifyAll(); Assert.assertNotEquals(iHaveFailed.getId(), aNewTaskCapture.getValue().getId()); Assert.assertEquals( iHaveFailed.getIOConfig().getBaseSequenceName(), ((KafkaIndexTask) aNewTaskCapture.getValue()).getIOConfig().getBaseSequenceName() ); // check that failed tasks are recreated with the same minimumMessageTime as the task it replaced, even if that // task came from another supervisor Assert.assertEquals(now, ((KafkaIndexTask) aNewTaskCapture.getValue()).getIOConfig().getMinimumMessageTime().get()); Assert.assertEquals( maxi, ((KafkaIndexTask) aNewTaskCapture.getValue()).getIOConfig().getMaximumMessageTime().get() ); } @Test public void testQueueNextTasksOnSuccess() throws Exception { supervisor = getTestableSupervisor(2, 2, true, "PT1H", null, null); addSomeEvents(1); Capture<Task> captured = Capture.newInstance(CaptureType.ALL); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); EasyMock.expect(taskClient.getStatusAsync(EasyMock.anyString())) .andReturn(Futures.immediateFuture(Status.NOT_STARTED)) .anyTimes(); EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.anyString())) .andReturn(Futures.immediateFuture(DateTimes.nowUtc())) .anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(4); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); List<Task> tasks = captured.getValues(); EasyMock.reset(taskStorage); EasyMock.reset(taskClient); EasyMock.expect(taskClient.getStatusAsync(EasyMock.anyString())) .andReturn(Futures.immediateFuture(Status.NOT_STARTED)) .anyTimes(); EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.anyString())) .andReturn(Futures.immediateFuture(DateTimes.nowUtc())) .anyTimes(); TreeMap<Integer, Map<Integer, Long>> checkpoints1 = new TreeMap<>(); checkpoints1.put(0, ImmutableMap.of(0, 0L, 2, 0L)); TreeMap<Integer, Map<Integer, Long>> checkpoints2 = new TreeMap<>(); checkpoints2.put(0, ImmutableMap.of(1, 0L)); // there would be 4 tasks, 2 for each task group EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-0"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints1)) .times(2); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-1"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints2)) .times(2); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(tasks).anyTimes(); for (Task task : tasks) { EasyMock.expect(taskStorage.getStatus(task.getId())) 
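
  /**
   * When one task in a group succeeds, its replicas should be stopped and the next set of tasks
   * should be queued for the group.
   */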
  @Test
  public void testQueueNextTasksOnSuccess() throws Exception
  {
    supervisor = getTestableSupervisor(2, 2, true, "PT1H", null, null);
    addSomeEvents(1);

    Capture<Task> captured = Capture.newInstance(CaptureType.ALL);
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes();
    EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(taskClient.getStatusAsync(EasyMock.anyString()))
            .andReturn(Futures.immediateFuture(Status.NOT_STARTED)).anyTimes();
    EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.anyString()))
            .andReturn(Futures.immediateFuture(DateTimes.nowUtc())).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(4);
    taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class));
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    List<Task> tasks = captured.getValues();

    EasyMock.reset(taskStorage);
    EasyMock.reset(taskClient);
    EasyMock.expect(taskClient.getStatusAsync(EasyMock.anyString()))
            .andReturn(Futures.immediateFuture(Status.NOT_STARTED)).anyTimes();
    EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.anyString()))
            .andReturn(Futures.immediateFuture(DateTimes.nowUtc())).anyTimes();

    TreeMap<Integer, Map<Integer, Long>> checkpoints1 = new TreeMap<>();
    checkpoints1.put(0, ImmutableMap.of(0, 0L, 2, 0L));
    TreeMap<Integer, Map<Integer, Long>> checkpoints2 = new TreeMap<>();
    checkpoints2.put(0, ImmutableMap.of(1, 0L));
    // there would be 4 tasks, 2 for each task group
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-0"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints1)).times(2);
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-1"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints2)).times(2);

    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(tasks).anyTimes();
    for (Task task : tasks) {
      EasyMock.expect(taskStorage.getStatus(task.getId())).andReturn(Optional.of(TaskStatus.running(task.getId()))).anyTimes();
      EasyMock.expect(taskStorage.getTask(task.getId())).andReturn(Optional.of(task)).anyTimes();
    }
    EasyMock.replay(taskStorage);
    EasyMock.replay(taskClient);

    supervisor.runInternal();
    verifyAll();

    // test that a task succeeding causes a new task to be re-queued with the next offset range and
    // causes any replica tasks to be shutdown
    Capture<Task> newTasksCapture = Capture.newInstance(CaptureType.ALL);
    Capture<String> shutdownTaskIdCapture = Capture.newInstance();
    List<Task> imStillRunning = tasks.subList(1, 4);
    KafkaIndexTask iAmSuccess = (KafkaIndexTask) tasks.get(0);
    EasyMock.reset(taskStorage);
    EasyMock.reset(taskQueue);
    EasyMock.reset(taskClient);
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(imStillRunning).anyTimes();
    for (Task task : imStillRunning) {
      EasyMock.expect(taskStorage.getStatus(task.getId())).andReturn(Optional.of(TaskStatus.running(task.getId()))).anyTimes();
      EasyMock.expect(taskStorage.getTask(task.getId())).andReturn(Optional.of(task)).anyTimes();
    }
    EasyMock.expect(taskStorage.getStatus(iAmSuccess.getId())).andReturn(Optional.of(TaskStatus.success(iAmSuccess.getId())));
    EasyMock.expect(taskStorage.getTask(iAmSuccess.getId())).andReturn(Optional.of(iAmSuccess)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(newTasksCapture))).andReturn(true).times(2);
    EasyMock.expect(taskClient.stopAsync(EasyMock.capture(shutdownTaskIdCapture), EasyMock.eq(false)))
            .andReturn(Futures.immediateFuture(true));
    EasyMock.replay(taskStorage);
    EasyMock.replay(taskQueue);
    EasyMock.replay(taskClient);

    supervisor.runInternal();
    verifyAll();

    // make sure we killed the right task (sequenceName for replicas are the same)
    Assert.assertTrue(shutdownTaskIdCapture.getValue().contains(iAmSuccess.getIOConfig().getBaseSequenceName()));
  }
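
  /**
   * When a task group exceeds its duration, the supervisor should pause its tasks, set end offsets
   * to the highest offsets reported by the replicas (here {0: 10, 2: 35}) to begin publishing, and
   * queue successor tasks that start from those offsets.
   */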
  @Test
  public void testBeginPublishAndQueueNextTasks() throws Exception
  {
    final TaskLocation location = new TaskLocation("testHost", 1234, -1);

    supervisor = getTestableSupervisor(2, 2, true, "PT1M", null, null);
    final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig();
    addSomeEvents(100);

    Capture<Task> captured = Capture.newInstance(CaptureType.ALL);
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes();
    EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(4);
    taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class));
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    List<Task> tasks = captured.getValues();
    Collection workItems = new ArrayList<>();
    for (Task task : tasks) {
      workItems.add(new TestTaskRunnerWorkItem(task, null, location));
    }

    EasyMock.reset(taskStorage, taskRunner, taskClient, taskQueue);
    captured = Capture.newInstance(CaptureType.ALL);
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(tasks).anyTimes();
    for (Task task : tasks) {
      EasyMock.expect(taskStorage.getStatus(task.getId())).andReturn(Optional.of(TaskStatus.running(task.getId()))).anyTimes();
      EasyMock.expect(taskStorage.getTask(task.getId())).andReturn(Optional.of(task)).anyTimes();
    }
    EasyMock.expect(taskRunner.getRunningTasks()).andReturn(workItems).anyTimes();
    EasyMock.expect(taskClient.getStatusAsync(EasyMock.anyString()))
            .andReturn(Futures.immediateFuture(Status.READING)).anyTimes();
    EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.contains("sequenceName-0")))
            .andReturn(Futures.immediateFuture(DateTimes.nowUtc().minusMinutes(2)))
            .andReturn(Futures.immediateFuture(DateTimes.nowUtc()));
    EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.contains("sequenceName-1")))
            .andReturn(Futures.immediateFuture(DateTimes.nowUtc())).times(2);
    EasyMock.expect(taskClient.pauseAsync(EasyMock.contains("sequenceName-0")))
            .andReturn(Futures.immediateFuture(ImmutableMap.of(0, 10L, 2, 30L)))
            .andReturn(Futures.immediateFuture(ImmutableMap.of(0, 10L, 2, 35L)));
    EasyMock.expect(
        taskClient.setEndOffsetsAsync(
            EasyMock.contains("sequenceName-0"),
            EasyMock.eq(ImmutableMap.of(0, 10L, 2, 35L)),
            EasyMock.eq(true)
        )
    ).andReturn(Futures.immediateFuture(true)).times(2);
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(2);

    TreeMap<Integer, Map<Integer, Long>> checkpoints1 = new TreeMap<>();
    checkpoints1.put(0, ImmutableMap.of(0, 0L, 2, 0L));
    TreeMap<Integer, Map<Integer, Long>> checkpoints2 = new TreeMap<>();
    checkpoints2.put(0, ImmutableMap.of(1, 0L));
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-0"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints1)).times(2);
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-1"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints2)).times(2);

    EasyMock.replay(taskStorage, taskRunner, taskClient, taskQueue);

    supervisor.runInternal();
    verifyAll();

    for (Task task : captured.getValues()) {
      KafkaIndexTask kafkaIndexTask = (KafkaIndexTask) task;
      Assert.assertEquals(dataSchema, kafkaIndexTask.getDataSchema());
      Assert.assertEquals(tuningConfig.convertToTaskTuningConfig(), kafkaIndexTask.getTuningConfig());

      KafkaIndexTaskIOConfig taskConfig = kafkaIndexTask.getIOConfig();
      Assert.assertEquals("sequenceName-0", taskConfig.getBaseSequenceName());
      Assert.assertTrue("isUseTransaction", taskConfig.isUseTransaction());

      Assert.assertEquals(topic, taskConfig.getStartSequenceNumbers().getStream());
      Assert.assertEquals(10L, (long) taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0));
      Assert.assertEquals(35L, (long) taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2));
    }
  }
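
  /**
   * On startup the supervisor should adopt an already-publishing task, report it in the publishing
   * section of the status payload, and create a successor task starting from the publishing task's
   * end offsets.
   */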
  @Test
  public void testDiscoverExistingPublishingTask() throws Exception
  {
    final TaskLocation location = new TaskLocation("testHost", 1234, -1);

    supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null);
    final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig();
    addSomeEvents(1);

    Task task = createKafkaIndexTask(
        "id1",
        DATASOURCE,
        0,
        new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 0L, 1, 0L, 2, 0L), ImmutableSet.of()),
        new SeekableStreamEndSequenceNumbers<>("topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE)),
        null,
        null,
        supervisor.getTuningConfig()
    );

    Collection workItems = new ArrayList<>();
    workItems.add(new TestTaskRunnerWorkItem(task, null, location));

    Capture<KafkaIndexTask> captured = Capture.newInstance();
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes();
    EasyMock.expect(taskRunner.getRunningTasks()).andReturn(workItems).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of(task)).anyTimes();
    EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes();
    EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(task)).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskClient.getStatusAsync("id1")).andReturn(Futures.immediateFuture(Status.PUBLISHING));
    EasyMock.expect(taskClient.getCurrentOffsetsAsync("id1", false))
            .andReturn(Futures.immediateFuture(ImmutableMap.of(0, 10L, 1, 20L, 2, 30L)));
    EasyMock.expect(taskClient.getEndOffsets("id1")).andReturn(ImmutableMap.of(0, 10L, 1, 20L, 2, 30L));
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true);

    TreeMap<Integer, Map<Integer, Long>> checkpoints = new TreeMap<>();
    checkpoints.put(0, ImmutableMap.of(0, 0L, 1, 0L, 2, 0L));
    EasyMock.expect(taskClient.getCheckpoints(EasyMock.anyString(), EasyMock.anyBoolean())).andReturn(checkpoints).anyTimes();

    taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class));
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    supervisor.updateCurrentAndLatestOffsets();
    SupervisorReport<KafkaSupervisorReportPayload> report = supervisor.getStatus();
    verifyAll();

    Assert.assertEquals(DATASOURCE, report.getId());

    KafkaSupervisorReportPayload payload = report.getPayload();
    Assert.assertEquals(DATASOURCE, payload.getDataSource());
    Assert.assertEquals(3600L, payload.getDurationSeconds());
    Assert.assertEquals(NUM_PARTITIONS, payload.getPartitions());
    Assert.assertEquals(1, payload.getReplicas());
    Assert.assertEquals(topic, payload.getStream());
    Assert.assertEquals(0, payload.getActiveTasks().size());
    Assert.assertEquals(1, payload.getPublishingTasks().size());
    Assert.assertEquals(SupervisorStateManager.BasicState.RUNNING, payload.getDetailedState());
    Assert.assertEquals(0, payload.getRecentErrors().size());

    TaskReportData publishingReport = payload.getPublishingTasks().get(0);
    Assert.assertEquals("id1", publishingReport.getId());
    Assert.assertEquals(ImmutableMap.of(0, 0L, 1, 0L, 2, 0L), publishingReport.getStartingOffsets());
    Assert.assertEquals(ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), publishingReport.getCurrentOffsets());

    KafkaIndexTask capturedTask = captured.getValue();
    Assert.assertEquals(dataSchema, capturedTask.getDataSchema());
    Assert.assertEquals(tuningConfig.convertToTaskTuningConfig(), capturedTask.getTuningConfig());

    KafkaIndexTaskIOConfig capturedTaskConfig = capturedTask.getIOConfig();
    Assert.assertEquals(kafkaHost, capturedTaskConfig.getConsumerProperties().get("bootstrap.servers"));
    Assert.assertEquals("myCustomValue", capturedTaskConfig.getConsumerProperties().get("myCustomKey"));
    Assert.assertEquals("sequenceName-0", capturedTaskConfig.getBaseSequenceName());
    Assert.assertTrue("isUseTransaction", capturedTaskConfig.isUseTransaction());

    // check that the new task was created with starting offsets matching where the publishing task
    // finished
    Assert.assertEquals(topic, capturedTaskConfig.getStartSequenceNumbers().getStream());
    Assert.assertEquals(10L, capturedTaskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue());
    Assert.assertEquals(20L, capturedTaskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue());
    Assert.assertEquals(30L, capturedTaskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue());
    Assert.assertEquals(topic, capturedTaskConfig.getEndSequenceNumbers().getStream());
    Assert.assertEquals(Long.MAX_VALUE, capturedTaskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue());
    Assert.assertEquals(Long.MAX_VALUE, capturedTaskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue());
    Assert.assertEquals(Long.MAX_VALUE, capturedTaskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue());
  }
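
  /**
   * Variant of the previous test where the publishing task only covers partitions 0 and 2: the
   * successor should resume those partitions from the published offsets and partition 1 from 0.
   */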
  @Test
  public void testDiscoverExistingPublishingTaskWithDifferentPartitionAllocation() throws Exception
  {
    final TaskLocation location = new TaskLocation("testHost", 1234, -1);

    supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null);
    final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig();
    addSomeEvents(1);

    Task task = createKafkaIndexTask(
        "id1",
        DATASOURCE,
        0,
        new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 0L, 2, 0L), ImmutableSet.of()),
        new SeekableStreamEndSequenceNumbers<>("topic", ImmutableMap.of(0, Long.MAX_VALUE, 2, Long.MAX_VALUE)),
        null,
        null,
        supervisor.getTuningConfig()
    );

    Collection workItems = new ArrayList<>();
    workItems.add(new TestTaskRunnerWorkItem(task, null, location));

    Capture<KafkaIndexTask> captured = Capture.newInstance();
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes();
    EasyMock.expect(taskRunner.getRunningTasks()).andReturn(workItems).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of(task)).anyTimes();
    EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes();
    EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(task)).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskClient.getStatusAsync("id1")).andReturn(Futures.immediateFuture(Status.PUBLISHING));
    EasyMock.expect(taskClient.getCurrentOffsetsAsync("id1", false))
            .andReturn(Futures.immediateFuture(ImmutableMap.of(0, 10L, 2, 30L)));
    EasyMock.expect(taskClient.getEndOffsets("id1")).andReturn(ImmutableMap.of(0, 10L, 2, 30L));
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true);
    taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class));
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    supervisor.updateCurrentAndLatestOffsets();
    SupervisorReport<KafkaSupervisorReportPayload> report = supervisor.getStatus();
    verifyAll();

    Assert.assertEquals(DATASOURCE, report.getId());

    KafkaSupervisorReportPayload payload = report.getPayload();
    Assert.assertEquals(DATASOURCE, payload.getDataSource());
    Assert.assertEquals(3600L, payload.getDurationSeconds());
    Assert.assertEquals(NUM_PARTITIONS, payload.getPartitions());
    Assert.assertEquals(1, payload.getReplicas());
    Assert.assertEquals(topic, payload.getStream());
    Assert.assertEquals(0, payload.getActiveTasks().size());
    Assert.assertEquals(1, payload.getPublishingTasks().size());
    Assert.assertEquals(SupervisorStateManager.BasicState.RUNNING, payload.getDetailedState());
    Assert.assertEquals(0, payload.getRecentErrors().size());

    TaskReportData publishingReport = payload.getPublishingTasks().get(0);
    Assert.assertEquals("id1", publishingReport.getId());
    Assert.assertEquals(ImmutableMap.of(0, 0L, 2, 0L), publishingReport.getStartingOffsets());
    Assert.assertEquals(ImmutableMap.of(0, 10L, 2, 30L), publishingReport.getCurrentOffsets());

    KafkaIndexTask capturedTask = captured.getValue();
    Assert.assertEquals(dataSchema, capturedTask.getDataSchema());
    Assert.assertEquals(tuningConfig.convertToTaskTuningConfig(), capturedTask.getTuningConfig());

    KafkaIndexTaskIOConfig capturedTaskConfig = capturedTask.getIOConfig();
    Assert.assertEquals(kafkaHost, capturedTaskConfig.getConsumerProperties().get("bootstrap.servers"));
    Assert.assertEquals("myCustomValue", capturedTaskConfig.getConsumerProperties().get("myCustomKey"));
    Assert.assertEquals("sequenceName-0", capturedTaskConfig.getBaseSequenceName());
    Assert.assertTrue("isUseTransaction", capturedTaskConfig.isUseTransaction());

    // check that the new task was created with starting offsets matching where the publishing task
    // finished
    Assert.assertEquals(topic, capturedTaskConfig.getStartSequenceNumbers().getStream());
    Assert.assertEquals(10L, capturedTaskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue());
    Assert.assertEquals(0L, capturedTaskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue());
    Assert.assertEquals(30L, capturedTaskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue());
    Assert.assertEquals(topic, capturedTaskConfig.getEndSequenceNumbers().getStream());
    Assert.assertEquals(Long.MAX_VALUE, capturedTaskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue());
    Assert.assertEquals(Long.MAX_VALUE, capturedTaskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue());
    Assert.assertEquals(Long.MAX_VALUE, capturedTaskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue());
  }
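
  /**
   * With one publishing and one reading task discovered, the status payload should report both,
   * including per-partition lag for the reading task (latest offsets 7/7/7 minus current offsets
   * 4/5/6 gives lag 3/2/1, aggregate 6).
   */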
  @Test
  public void testDiscoverExistingPublishingAndReadingTask() throws Exception
  {
    final TaskLocation location1 = new TaskLocation("testHost", 1234, -1);
    final TaskLocation location2 = new TaskLocation("testHost2", 145, -1);
    final DateTime startTime = DateTimes.nowUtc();

    supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null);
    final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig();
    addSomeEvents(6);

    Task id1 = createKafkaIndexTask(
        "id1",
        DATASOURCE,
        0,
        new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 0L, 1, 0L, 2, 0L), ImmutableSet.of()),
        new SeekableStreamEndSequenceNumbers<>("topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE)),
        null,
        null,
        tuningConfig
    );
    Task id2 = createKafkaIndexTask(
        "id2",
        DATASOURCE,
        0,
        new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 1L, 1, 2L, 2, 3L), ImmutableSet.of()),
        new SeekableStreamEndSequenceNumbers<>("topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE)),
        null,
        null,
        tuningConfig
    );

    Collection workItems = new ArrayList<>();
    workItems.add(new TestTaskRunnerWorkItem(id1, null, location1));
    workItems.add(new TestTaskRunnerWorkItem(id2, null, location2));

    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes();
    EasyMock.expect(taskRunner.getRunningTasks()).andReturn(workItems).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of(id1, id2)).anyTimes();
    EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes();
    EasyMock.expect(taskStorage.getStatus("id2")).andReturn(Optional.of(TaskStatus.running("id2"))).anyTimes();
    EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(id1)).anyTimes();
    EasyMock.expect(taskStorage.getTask("id2")).andReturn(Optional.of(id2)).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskClient.getStatusAsync("id1")).andReturn(Futures.immediateFuture(Status.PUBLISHING));
    EasyMock.expect(taskClient.getStatusAsync("id2")).andReturn(Futures.immediateFuture(Status.READING));
    EasyMock.expect(taskClient.getStartTimeAsync("id2")).andReturn(Futures.immediateFuture(startTime));
    EasyMock.expect(taskClient.getCurrentOffsetsAsync("id1", false))
            .andReturn(Futures.immediateFuture(ImmutableMap.of(0, 1L, 1, 2L, 2, 3L)));
    EasyMock.expect(taskClient.getEndOffsets("id1")).andReturn(ImmutableMap.of(0, 1L, 1, 2L, 2, 3L));
    EasyMock.expect(taskClient.getCurrentOffsetsAsync("id2", false))
            .andReturn(Futures.immediateFuture(ImmutableMap.of(0, 4L, 1, 5L, 2, 6L)));

    taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class));

    // since id1 is publishing, getCheckpoints wouldn't be called for it
    TreeMap<Integer, Map<Integer, Long>> checkpoints = new TreeMap<>();
    checkpoints.put(0, ImmutableMap.of(0, 1L, 1, 2L, 2, 3L));
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id2"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints)).times(1);

    replayAll();

    supervisor.start();
    supervisor.runInternal();
    supervisor.updateCurrentAndLatestOffsets();
    SupervisorReport<KafkaSupervisorReportPayload> report = supervisor.getStatus();
    verifyAll();

    Assert.assertEquals(DATASOURCE, report.getId());

    KafkaSupervisorReportPayload payload = report.getPayload();
    Assert.assertEquals(DATASOURCE, payload.getDataSource());
    Assert.assertEquals(3600L, payload.getDurationSeconds());
    Assert.assertEquals(NUM_PARTITIONS, payload.getPartitions());
    Assert.assertEquals(1, payload.getReplicas());
    Assert.assertEquals(topic, payload.getStream());
    Assert.assertEquals(1, payload.getActiveTasks().size());
    Assert.assertEquals(1, payload.getPublishingTasks().size());
    Assert.assertEquals(SupervisorStateManager.BasicState.RUNNING, payload.getDetailedState());
    Assert.assertEquals(0, payload.getRecentErrors().size());

    TaskReportData activeReport = payload.getActiveTasks().get(0);
    TaskReportData publishingReport = payload.getPublishingTasks().get(0);

    Assert.assertEquals("id2", activeReport.getId());
    Assert.assertEquals(startTime, activeReport.getStartTime());
    Assert.assertEquals(ImmutableMap.of(0, 1L, 1, 2L, 2, 3L), activeReport.getStartingOffsets());
    Assert.assertEquals(ImmutableMap.of(0, 4L, 1, 5L, 2, 6L), activeReport.getCurrentOffsets());
    Assert.assertEquals(ImmutableMap.of(0, 3L, 1, 2L, 2, 1L), activeReport.getLag());

    Assert.assertEquals("id1", publishingReport.getId());
    Assert.assertEquals(ImmutableMap.of(0, 0L, 1, 0L, 2, 0L), publishingReport.getStartingOffsets());
    Assert.assertEquals(ImmutableMap.of(0, 1L, 1, 2L, 2, 3L), publishingReport.getCurrentOffsets());
    Assert.assertNull(publishingReport.getLag());

    Assert.assertEquals(ImmutableMap.of(0, 7L, 1, 7L, 2, 7L), payload.getLatestOffsets());
    Assert.assertEquals(ImmutableMap.of(0, 3L, 1, 2L, 2, 1L), payload.getMinimumLag());
    Assert.assertEquals(6L, (long) payload.getAggregateLag());
    Assert.assertTrue(payload.getOffsetsLastUpdated().plusMinutes(1).isAfterNow());
  }

  @Test
  public void testKillUnresponsiveTasksWhileGettingStartTime() throws Exception
  {
    supervisor = getTestableSupervisor(2, 2, true, "PT1H", null, null);
    addSomeEvents(1);

    Capture<Task> captured = Capture.newInstance(CaptureType.ALL);
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes();
    EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(4);
    taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class));
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    List<Task> tasks = captured.getValues();

    EasyMock.reset(taskStorage, taskClient, taskQueue);

    TreeMap<Integer, Map<Integer, Long>> checkpoints1 = new TreeMap<>();
    checkpoints1.put(0, ImmutableMap.of(0, 0L, 2, 0L));
    TreeMap<Integer, Map<Integer, Long>> checkpoints2 = new TreeMap<>();
    checkpoints2.put(0, ImmutableMap.of(1, 0L));
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-0"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints1)).times(2);
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-1"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints2)).times(2);

    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(tasks).anyTimes();
    for (Task task : tasks) {
      EasyMock.expect(taskStorage.getStatus(task.getId())).andReturn(Optional.of(TaskStatus.running(task.getId()))).anyTimes();
      EasyMock.expect(taskStorage.getTask(task.getId())).andReturn(Optional.of(task)).anyTimes();
      EasyMock.expect(taskClient.getStatusAsync(task.getId())).andReturn(Futures.immediateFuture(Status.NOT_STARTED));
      EasyMock.expect(taskClient.getStartTimeAsync(task.getId())).andReturn(Futures.immediateFailedFuture(new RuntimeException()));
      taskQueue.shutdown(task.getId(), "Task [%s] failed to return start time, killing task", task.getId());
    }
    EasyMock.replay(taskStorage, taskClient, taskQueue);

    supervisor.runInternal();
    verifyAll();
  }
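
  /**
   * Tasks that fail while being paused at the end of their duration should be shut down, and
   * replacements should restart from the last checkpointed offsets (0 for partitions 0 and 2).
   */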
  @Test
  public void testKillUnresponsiveTasksWhilePausing() throws Exception
  {
    final TaskLocation location = new TaskLocation("testHost", 1234, -1);

    supervisor = getTestableSupervisor(2, 2, true, "PT1M", null, null);
    addSomeEvents(100);

    Capture<Task> captured = Capture.newInstance(CaptureType.ALL);
    EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes();
    EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes();
    EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes();
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes();
    EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE))
            .andReturn(new KafkaDataSourceMetadata(null)).anyTimes();
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(4);
    taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class));
    replayAll();

    supervisor.start();
    supervisor.runInternal();
    verifyAll();

    List<Task> tasks = captured.getValues();
    Collection workItems = new ArrayList<>();
    for (Task task : tasks) {
      workItems.add(new TestTaskRunnerWorkItem(task, null, location));
    }

    EasyMock.reset(taskStorage, taskRunner, taskClient, taskQueue);

    TreeMap<Integer, Map<Integer, Long>> checkpoints1 = new TreeMap<>();
    checkpoints1.put(0, ImmutableMap.of(0, 0L, 2, 0L));
    TreeMap<Integer, Map<Integer, Long>> checkpoints2 = new TreeMap<>();
    checkpoints2.put(0, ImmutableMap.of(1, 0L));
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-0"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints1)).times(2);
    EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-1"), EasyMock.anyBoolean()))
            .andReturn(Futures.immediateFuture(checkpoints2)).times(2);

    captured = Capture.newInstance(CaptureType.ALL);
    EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(tasks).anyTimes();
    for (Task task : tasks) {
      EasyMock.expect(taskStorage.getStatus(task.getId())).andReturn(Optional.of(TaskStatus.running(task.getId()))).anyTimes();
      EasyMock.expect(taskStorage.getTask(task.getId())).andReturn(Optional.of(task)).anyTimes();
    }
    EasyMock.expect(taskRunner.getRunningTasks()).andReturn(workItems).anyTimes();
    EasyMock.expect(taskClient.getStatusAsync(EasyMock.anyString()))
            .andReturn(Futures.immediateFuture(Status.READING)).anyTimes();
    EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.contains("sequenceName-0")))
            .andReturn(Futures.immediateFuture(DateTimes.nowUtc().minusMinutes(2)))
            .andReturn(Futures.immediateFuture(DateTimes.nowUtc()));
    EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.contains("sequenceName-1")))
            .andReturn(Futures.immediateFuture(DateTimes.nowUtc())).times(2);
    EasyMock.expect(taskClient.pauseAsync(EasyMock.contains("sequenceName-0")))
            .andReturn(Futures.immediateFailedFuture(new RuntimeException())).times(2);
    taskQueue.shutdown(
        EasyMock.contains("sequenceName-0"),
        EasyMock.eq("An exception occured while waiting for task [%s] to pause: [%s]"),
        EasyMock.contains("sequenceName-0"),
        EasyMock.anyString()
    );
    EasyMock.expectLastCall().times(2);
    EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(2);

    EasyMock.replay(taskStorage, taskRunner, taskClient, taskQueue);

    supervisor.runInternal();
    verifyAll();

    for (Task task : captured.getValues()) {
      KafkaIndexTaskIOConfig taskConfig = ((KafkaIndexTask) task).getIOConfig();
      Assert.assertEquals(0L, (long) taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0));
      Assert.assertEquals(0L, (long) taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2));
    }
  }
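
  /**
   * Tasks that fail to acknowledge setEndOffsets should likewise be shut down and replaced by
   * tasks restarting from the checkpointed offsets.
   */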
EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(4); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); List<Task> tasks = captured.getValues(); Collection workItems = new ArrayList<>(); for (Task task : tasks) { workItems.add(new TestTaskRunnerWorkItem(task, null, location)); } EasyMock.reset(taskStorage, taskRunner, taskClient, taskQueue); TreeMap<Integer, Map<Integer, Long>> checkpoints1 = new TreeMap<>(); checkpoints1.put(0, ImmutableMap.of(0, 0L, 2, 0L)); TreeMap<Integer, Map<Integer, Long>> checkpoints2 = new TreeMap<>(); checkpoints2.put(0, ImmutableMap.of(1, 0L)); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-0"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints1)) .times(2); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("sequenceName-1"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints2)) .times(2); captured = Capture.newInstance(CaptureType.ALL); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(tasks).anyTimes(); for (Task task : tasks) { EasyMock.expect(taskStorage.getStatus(task.getId())) .andReturn(Optional.of(TaskStatus.running(task.getId()))) .anyTimes(); EasyMock.expect(taskStorage.getTask(task.getId())).andReturn(Optional.of(task)).anyTimes(); } EasyMock.expect(taskRunner.getRunningTasks()).andReturn(workItems).anyTimes(); EasyMock.expect(taskClient.getStatusAsync(EasyMock.anyString())) .andReturn(Futures.immediateFuture(Status.READING)) .anyTimes(); EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.contains("sequenceName-0"))) .andReturn(Futures.immediateFuture(DateTimes.nowUtc().minusMinutes(2))) .andReturn(Futures.immediateFuture(DateTimes.nowUtc())); EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.contains("sequenceName-1"))) .andReturn(Futures.immediateFuture(DateTimes.nowUtc())) .times(2); EasyMock.expect(taskClient.pauseAsync(EasyMock.contains("sequenceName-0"))) .andReturn(Futures.immediateFuture(ImmutableMap.of(0, 10L, 1, 20L, 2, 30L))) .andReturn(Futures.immediateFuture(ImmutableMap.of(0, 10L, 1, 15L, 2, 35L))); EasyMock.expect( taskClient.setEndOffsetsAsync( EasyMock.contains("sequenceName-0"), EasyMock.eq(ImmutableMap.of(0, 10L, 1, 20L, 2, 35L)), EasyMock.eq(true) ) ).andReturn(Futures.immediateFailedFuture(new RuntimeException())).times(2); taskQueue.shutdown( EasyMock.contains("sequenceName-0"), EasyMock.eq("Task [%s] failed to respond to [set end offsets] in a timely manner, killing task"), EasyMock.contains("sequenceName-0") ); EasyMock.expectLastCall().times(2); EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true).times(2); EasyMock.replay(taskStorage, taskRunner, taskClient, taskQueue); supervisor.runInternal(); verifyAll(); for (Task task : captured.getValues()) { KafkaIndexTaskIOConfig taskConfig = 
((KafkaIndexTask) task).getIOConfig(); Assert.assertEquals(0L, (long) taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0)); Assert.assertEquals(0L, (long) taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2)); } } @Test(expected = IllegalStateException.class) public void testStopNotStarted() { supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null); supervisor.stop(false); } @Test public void testStop() { EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); taskClient.close(); taskRunner.unregisterListener(StringUtils.format("KafkaSupervisor-%s", DATASOURCE)); replayAll(); supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null); supervisor.start(); supervisor.stop(false); verifyAll(); } @Test public void testStopGracefully() throws Exception { final TaskLocation location1 = new TaskLocation("testHost", 1234, -1); final TaskLocation location2 = new TaskLocation("testHost2", 145, -1); final DateTime startTime = DateTimes.nowUtc(); supervisor = getTestableSupervisor(2, 1, true, "PT1H", null, null); final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig(); addSomeEvents(1); Task id1 = createKafkaIndexTask( "id1", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 0L, 1, 0L, 2, 0L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); Task id2 = createKafkaIndexTask( "id2", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); Task id3 = createKafkaIndexTask( "id3", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); Collection workItems = new ArrayList<>(); workItems.add(new TestTaskRunnerWorkItem(id1, null, location1)); workItems.add(new TestTaskRunnerWorkItem(id2, null, location2)); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(workItems).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)) .andReturn(ImmutableList.of(id1, id2, id3)) .anyTimes(); EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes(); EasyMock.expect(taskStorage.getStatus("id2")).andReturn(Optional.of(TaskStatus.running("id2"))).anyTimes(); EasyMock.expect(taskStorage.getStatus("id3")).andReturn(Optional.of(TaskStatus.running("id3"))).anyTimes(); EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(id1)).anyTimes(); EasyMock.expect(taskStorage.getTask("id2")).andReturn(Optional.of(id2)).anyTimes(); EasyMock.expect(taskStorage.getTask("id3")).andReturn(Optional.of(id3)).anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); 
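// id3 never gets a work item in the task runner, so the graceful stop below should checkpoint-and-finish id2 but kill id3 outright.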
EasyMock.expect(taskClient.getStatusAsync("id1")).andReturn(Futures.immediateFuture(Status.PUBLISHING)); EasyMock.expect(taskClient.getStatusAsync("id2")).andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStatusAsync("id3")).andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStartTimeAsync("id2")).andReturn(Futures.immediateFuture(startTime)); EasyMock.expect(taskClient.getStartTimeAsync("id3")).andReturn(Futures.immediateFuture(startTime)); EasyMock.expect(taskClient.getEndOffsets("id1")).andReturn(ImmutableMap.of(0, 10L, 1, 20L, 2, 30L)); // getCheckpoints will not be called for id1 as it is in publishing state TreeMap<Integer, Map<Integer, Long>> checkpoints = new TreeMap<>(); checkpoints.put(0, ImmutableMap.of(0, 10L, 1, 20L, 2, 30L)); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id2"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id3"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); EasyMock.reset(taskRunner, taskClient, taskQueue); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(workItems).anyTimes(); EasyMock.expect(taskClient.pauseAsync("id2")) .andReturn(Futures.immediateFuture(ImmutableMap.of(0, 15L, 1, 25L, 2, 30L))); EasyMock.expect(taskClient.setEndOffsetsAsync("id2", ImmutableMap.of(0, 15L, 1, 25L, 2, 30L), true)) .andReturn(Futures.immediateFuture(true)); taskQueue.shutdown("id3", "Killing task for graceful shutdown"); EasyMock.expectLastCall().times(1); taskQueue.shutdown("id3", "Killing task [%s] which hasn't been assigned to a worker", "id3"); EasyMock.expectLastCall().times(1); EasyMock.replay(taskRunner, taskClient, taskQueue); supervisor.gracefulShutdownInternal(); verifyAll(); } @Test public void testResetNoTasks() { EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null); supervisor.start(); supervisor.runInternal(); verifyAll(); EasyMock.reset(indexerMetadataStorageCoordinator); EasyMock.expect(indexerMetadataStorageCoordinator.deleteDataSourceMetadata(DATASOURCE)).andReturn(true); EasyMock.replay(indexerMetadataStorageCoordinator); supervisor.resetInternal(null); verifyAll(); } @Test public void testResetDataSourceMetadata() throws Exception { supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); 
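// resetInternal with partial metadata should subtract the overlapping partitions (1 and 2) from the stored metadata, leaving only partition 0 behind.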
taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); Capture<String> captureDataSource = EasyMock.newCapture(); Capture<DataSourceMetadata> captureDataSourceMetadata = EasyMock.newCapture(); KafkaDataSourceMetadata kafkaDataSourceMetadata = new KafkaDataSourceMetadata( new SeekableStreamStartSequenceNumbers<>( topic, ImmutableMap.of(0, 1000L, 1, 1000L, 2, 1000L), ImmutableSet.of() ) ); KafkaDataSourceMetadata resetMetadata = new KafkaDataSourceMetadata( new SeekableStreamStartSequenceNumbers<>(topic, ImmutableMap.of(1, 1000L, 2, 1000L), ImmutableSet.of()) ); KafkaDataSourceMetadata expectedMetadata = new KafkaDataSourceMetadata( new SeekableStreamStartSequenceNumbers<>(topic, ImmutableMap.of(0, 1000L), ImmutableSet.of())); EasyMock.reset(indexerMetadataStorageCoordinator); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)) .andReturn(kafkaDataSourceMetadata); EasyMock.expect(indexerMetadataStorageCoordinator.resetDataSourceMetadata( EasyMock.capture(captureDataSource), EasyMock.capture(captureDataSourceMetadata) )).andReturn(true); EasyMock.replay(indexerMetadataStorageCoordinator); try { supervisor.resetInternal(resetMetadata); } catch (NullPointerException npe) { // Expected as there will be an attempt to reset partitionGroups offsets to NOT_SET // however there would be no entries in the map as we have not put any data in Kafka Assert.assertNull(npe.getCause()); } verifyAll(); Assert.assertEquals(DATASOURCE, captureDataSource.getValue()); Assert.assertEquals(expectedMetadata, captureDataSourceMetadata.getValue()); } @Test public void testResetNoDataSourceMetadata() { supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); KafkaDataSourceMetadata resetMetadata = new KafkaDataSourceMetadata( new SeekableStreamStartSequenceNumbers<>( topic, ImmutableMap.of(1, 1000L, 2, 1000L), ImmutableSet.of() ) ); EasyMock.reset(indexerMetadataStorageCoordinator); // no DataSourceMetadata in metadata store EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn(null); EasyMock.replay(indexerMetadataStorageCoordinator); supervisor.resetInternal(resetMetadata); verifyAll(); } @Test public void testGetOffsetFromStorageForPartitionWithResetOffsetAutomatically() throws Exception { addSomeEvents(2); supervisor = getTestableSupervisor(1, 1, true, true, "PT1H", null, null); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), 
EasyMock.anyObject(Executor.class)); EasyMock.reset(indexerMetadataStorageCoordinator); // unknown DataSourceMetadata in metadata store // for simplicity in testing the offset availability check, we use negative stored offsets in metadata here, // because the stream's earliest offset is 0, although that would not happen in real usage. EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)) .andReturn( new KafkaDataSourceMetadata( new SeekableStreamEndSequenceNumbers<>(topic, ImmutableMap.of(1, -100L, 2, 200L)) ) ).times(4); // getOffsetFromStorageForPartition() throws an exception when the offsets are automatically reset. // Since getOffsetFromStorageForPartition() is called per partition, all partitions can't be reset at the same time. // Instead, subsequent partitions will be reset in the following supervisor runs. EasyMock.expect( indexerMetadataStorageCoordinator.resetDataSourceMetadata( DATASOURCE, new KafkaDataSourceMetadata( // Only one partition is reset in a single supervisor run. new SeekableStreamEndSequenceNumbers<>(topic, ImmutableMap.of(2, 200L)) ) ) ).andReturn(true); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); } @Test public void testResetRunningTasks() throws Exception { final TaskLocation location1 = new TaskLocation("testHost", 1234, -1); final TaskLocation location2 = new TaskLocation("testHost2", 145, -1); final DateTime startTime = DateTimes.nowUtc(); supervisor = getTestableSupervisor(2, 1, true, "PT1H", null, null); final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig(); addSomeEvents(1); Task id1 = createKafkaIndexTask( "id1", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 0L, 1, 0L, 2, 0L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); Task id2 = createKafkaIndexTask( "id2", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); Task id3 = createKafkaIndexTask( "id3", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); Collection workItems = new ArrayList<>(); workItems.add(new TestTaskRunnerWorkItem(id1, null, location1)); workItems.add(new TestTaskRunnerWorkItem(id2, null, location2)); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(workItems).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)) .andReturn(ImmutableList.of(id1, id2, id3)) .anyTimes(); EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes(); EasyMock.expect(taskStorage.getStatus("id2")).andReturn(Optional.of(TaskStatus.running("id2"))).anyTimes(); EasyMock.expect(taskStorage.getStatus("id3")).andReturn(Optional.of(TaskStatus.running("id3"))).anyTimes(); EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(id1)).anyTimes(); 
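// After the initial discovery run, resetInternal(null) below must wipe the stored metadata and shut down the reading tasks id2 and id3; publishing id1 is left untouched.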
EasyMock.expect(taskStorage.getTask("id2")).andReturn(Optional.of(id2)).anyTimes(); EasyMock.expect(taskStorage.getTask("id3")).andReturn(Optional.of(id3)).anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); EasyMock.expect(taskClient.getStatusAsync("id1")).andReturn(Futures.immediateFuture(Status.PUBLISHING)); EasyMock.expect(taskClient.getStatusAsync("id2")).andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStatusAsync("id3")).andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStartTimeAsync("id2")).andReturn(Futures.immediateFuture(startTime)); EasyMock.expect(taskClient.getStartTimeAsync("id3")).andReturn(Futures.immediateFuture(startTime)); EasyMock.expect(taskClient.getEndOffsets("id1")).andReturn(ImmutableMap.of(0, 10L, 1, 20L, 2, 30L)); TreeMap<Integer, Map<Integer, Long>> checkpoints = new TreeMap<>(); checkpoints.put(0, ImmutableMap.of(0, 10L, 1, 20L, 2, 30L)); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id2"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id3"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); EasyMock.reset(taskQueue, indexerMetadataStorageCoordinator); EasyMock.expect(indexerMetadataStorageCoordinator.deleteDataSourceMetadata(DATASOURCE)).andReturn(true); taskQueue.shutdown("id2", "DataSourceMetadata is not found while reset"); taskQueue.shutdown("id3", "DataSourceMetadata is not found while reset"); EasyMock.replay(taskQueue, indexerMetadataStorageCoordinator); supervisor.resetInternal(null); verifyAll(); } @Test public void testNoDataIngestionTasks() { final DateTime startTime = DateTimes.nowUtc(); supervisor = getTestableSupervisor(2, 1, true, "PT1S", null, null); final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig(); supervisor.getStateManager().markRunFinished(); //not adding any events Task id1 = createKafkaIndexTask( "id1", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 0L, 1, 0L, 2, 0L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); Task id2 = createKafkaIndexTask( "id2", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); Task id3 = createKafkaIndexTask( "id3", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)) .andReturn(ImmutableList.of(id1, id2, id3)) .anyTimes(); 
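// All three tasks report READING; with no stored DataSourceMetadata to rewind them to, resetInternal(null) is expected to shut all of them down.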
EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes(); EasyMock.expect(taskStorage.getStatus("id2")).andReturn(Optional.of(TaskStatus.running("id2"))).anyTimes(); EasyMock.expect(taskStorage.getStatus("id3")).andReturn(Optional.of(TaskStatus.running("id3"))).anyTimes(); EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(id1)).anyTimes(); EasyMock.expect(taskStorage.getTask("id2")).andReturn(Optional.of(id2)).anyTimes(); EasyMock.expect(taskStorage.getTask("id3")).andReturn(Optional.of(id3)).anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); EasyMock.expect(taskClient.getStatusAsync("id1")).andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStatusAsync("id2")).andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStatusAsync("id3")).andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStartTimeAsync("id1")).andReturn(Futures.immediateFuture(startTime)); EasyMock.expect(taskClient.getStartTimeAsync("id2")).andReturn(Futures.immediateFuture(startTime)); EasyMock.expect(taskClient.getStartTimeAsync("id3")).andReturn(Futures.immediateFuture(startTime)); TreeMap<Integer, Map<Integer, Long>> checkpoints = new TreeMap<>(); checkpoints.put(0, ImmutableMap.of(0, 10L, 1, 20L, 2, 30L)); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id1"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id2"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id3"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); EasyMock.reset(taskQueue, indexerMetadataStorageCoordinator); EasyMock.expect(indexerMetadataStorageCoordinator.deleteDataSourceMetadata(DATASOURCE)).andReturn(true); taskQueue.shutdown("id1", "DataSourceMetadata is not found while reset"); taskQueue.shutdown("id2", "DataSourceMetadata is not found while reset"); taskQueue.shutdown("id3", "DataSourceMetadata is not found while reset"); EasyMock.replay(taskQueue, indexerMetadataStorageCoordinator); supervisor.resetInternal(null); verifyAll(); } @Test(timeout = 60_000L) public void testCheckpointForInactiveTaskGroup() throws InterruptedException { supervisor = getTestableSupervisor(2, 1, true, "PT1S", null, null); final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig(); supervisor.getStateManager().markRunFinished(); //not adding any events final KafkaIndexTask id1 = createKafkaIndexTask( "id1", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>(topic, ImmutableMap.of(0, 0L, 1, 0L, 2, 0L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( topic, ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); final Task id2 = createKafkaIndexTask( "id2", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>(topic, ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( topic, ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, 
Long.MAX_VALUE) ), null, null, tuningConfig ); final Task id3 = createKafkaIndexTask( "id3", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>(topic, ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( topic, ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); final TaskLocation location1 = new TaskLocation("testHost", 1234, -1); final TaskLocation location2 = new TaskLocation("testHost2", 145, -1); Collection workItems = new ArrayList<>(); workItems.add(new TestTaskRunnerWorkItem(id1, null, location1)); workItems.add(new TestTaskRunnerWorkItem(id2, null, location2)); workItems.add(new TestTaskRunnerWorkItem(id3, null, location2)); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(workItems).anyTimes(); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)) .andReturn(ImmutableList.of(id1, id2, id3)) .anyTimes(); EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes(); EasyMock.expect(taskStorage.getStatus("id2")).andReturn(Optional.of(TaskStatus.running("id2"))).anyTimes(); EasyMock.expect(taskStorage.getStatus("id3")).andReturn(Optional.of(TaskStatus.running("id3"))).anyTimes(); EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(id1)).anyTimes(); EasyMock.expect(taskStorage.getTask("id2")).andReturn(Optional.of(id2)).anyTimes(); EasyMock.expect(taskStorage.getTask("id3")).andReturn(Optional.of(id3)).anyTimes(); EasyMock.expect( indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn(new KafkaDataSourceMetadata(null) ).anyTimes(); EasyMock.expect(taskClient.getStatusAsync("id1")).andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStatusAsync("id2")).andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStatusAsync("id3")).andReturn(Futures.immediateFuture(Status.READING)); final DateTime startTime = DateTimes.nowUtc(); EasyMock.expect(taskClient.getStartTimeAsync("id1")).andReturn(Futures.immediateFuture(startTime)); EasyMock.expect(taskClient.getStartTimeAsync("id2")).andReturn(Futures.immediateFuture(startTime)); EasyMock.expect(taskClient.getStartTimeAsync("id3")).andReturn(Futures.immediateFuture(startTime)); final TreeMap<Integer, Map<Integer, Long>> checkpoints = new TreeMap<>(); checkpoints.put(0, ImmutableMap.of(0, 10L, 1, 20L, 2, 30L)); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id1"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id2"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id3"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); supervisor.runInternal(); supervisor.moveTaskGroupToPendingCompletion(0); supervisor.checkpoint( 0, new KafkaDataSourceMetadata( new SeekableStreamStartSequenceNumbers<>(topic, checkpoints.get(0), ImmutableSet.of()) ) ); while (supervisor.getNoticesQueueSize() > 0) { Thread.sleep(100); } 
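// Once the notice queue drains, the checkpoint notice for the pending-completion group has been processed; it must not have emitted any exception.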
verifyAll(); Assert.assertNull(serviceEmitter.getStackTrace(), serviceEmitter.getStackTrace()); Assert.assertNull(serviceEmitter.getExceptionMessage(), serviceEmitter.getExceptionMessage()); Assert.assertNull(serviceEmitter.getExceptionClass()); } @Test(timeout = 60_000L) public void testCheckpointForUnknownTaskGroup() throws InterruptedException { supervisor = getTestableSupervisor(2, 1, true, "PT1S", null, null); final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig(); //not adding any events final KafkaIndexTask id1 = createKafkaIndexTask( "id1", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>(topic, ImmutableMap.of(0, 0L, 1, 0L, 2, 0L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( topic, ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); final Task id2 = createKafkaIndexTask( "id2", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>(topic, ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( topic, ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); final Task id3 = createKafkaIndexTask( "id3", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>(topic, ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( topic, ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)) .andReturn(ImmutableList.of(id1, id2, id3)) .anyTimes(); EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes(); EasyMock.expect(taskStorage.getStatus("id2")).andReturn(Optional.of(TaskStatus.running("id2"))).anyTimes(); EasyMock.expect(taskStorage.getStatus("id3")).andReturn(Optional.of(TaskStatus.running("id3"))).anyTimes(); EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(id1)).anyTimes(); EasyMock.expect(taskStorage.getTask("id2")).andReturn(Optional.of(id2)).anyTimes(); EasyMock.expect(taskStorage.getTask("id3")).andReturn(Optional.of(id3)).anyTimes(); EasyMock.expect( indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn(new KafkaDataSourceMetadata(null) ).anyTimes(); replayAll(); supervisor.start(); supervisor.checkpoint( 0, new KafkaDataSourceMetadata( new SeekableStreamStartSequenceNumbers<>(topic, Collections.emptyMap(), ImmutableSet.of()) ) ); while (supervisor.getNoticesQueueSize() > 0) { Thread.sleep(100); } verifyAll(); while (serviceEmitter.getStackTrace() == null) { Thread.sleep(100); } Assert.assertTrue( serviceEmitter.getStackTrace().startsWith("org.apache.druid.java.util.common.ISE: Cannot find") ); Assert.assertEquals( "Cannot find taskGroup [0] among all activelyReadingTaskGroups [{}]", serviceEmitter.getExceptionMessage() ); Assert.assertEquals(ISE.class, serviceEmitter.getExceptionClass()); } @Test public void testSuspendedNoRunningTasks() throws Exception { supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null, true, kafkaHost); addSomeEvents(1); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); 
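// A suspended supervisor should complete a full run cycle without ever queueing a task; the failing taskQueue.add answer below enforces that.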
EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); // this asserts that taskQueue.add does not in fact get called because supervisor should be suspended EasyMock.expect(taskQueue.add(EasyMock.anyObject())).andAnswer((IAnswer) () -> { Assert.fail(); return null; }).anyTimes(); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); } @Test public void testSuspendedRunningTasks() throws Exception { // graceful shutdown is expected to be called on running tasks since state is suspended final TaskLocation location1 = new TaskLocation("testHost", 1234, -1); final TaskLocation location2 = new TaskLocation("testHost2", 145, -1); final DateTime startTime = DateTimes.nowUtc(); supervisor = getTestableSupervisor(2, 1, true, "PT1H", null, null, true, kafkaHost); final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig(); addSomeEvents(1); Task id1 = createKafkaIndexTask( "id1", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 0L, 1, 0L, 2, 0L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); Task id2 = createKafkaIndexTask( "id2", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); Task id3 = createKafkaIndexTask( "id3", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>("topic", ImmutableMap.of(0, 10L, 1, 20L, 2, 30L), ImmutableSet.of()), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 1, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, tuningConfig ); Collection workItems = new ArrayList<>(); workItems.add(new TestTaskRunnerWorkItem(id1, null, location1)); workItems.add(new TestTaskRunnerWorkItem(id2, null, location2)); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(workItems).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)) .andReturn(ImmutableList.of(id1, id2, id3)) .anyTimes(); EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes(); EasyMock.expect(taskStorage.getStatus("id2")).andReturn(Optional.of(TaskStatus.running("id2"))).anyTimes(); EasyMock.expect(taskStorage.getStatus("id3")).andReturn(Optional.of(TaskStatus.running("id3"))).anyTimes(); EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(id1)).anyTimes(); EasyMock.expect(taskStorage.getTask("id2")).andReturn(Optional.of(id2)).anyTimes(); EasyMock.expect(taskStorage.getTask("id3")).andReturn(Optional.of(id3)).anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); EasyMock.expect(taskClient.getStatusAsync("id1")) .andReturn(Futures.immediateFuture(Status.PUBLISHING)); 
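// id1 keeps publishing; suspending the supervisor should checkpoint-and-stop reading task id2 and kill unassigned id3, mirroring a graceful shutdown.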
EasyMock.expect(taskClient.getStatusAsync("id2")) .andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStatusAsync("id3")) .andReturn(Futures.immediateFuture(Status.READING)); EasyMock.expect(taskClient.getStartTimeAsync("id2")).andReturn(Futures.immediateFuture(startTime)); EasyMock.expect(taskClient.getStartTimeAsync("id3")).andReturn(Futures.immediateFuture(startTime)); EasyMock.expect(taskClient.getEndOffsets("id1")).andReturn(ImmutableMap.of(0, 10L, 1, 20L, 2, 30L)); // getCheckpoints will not be called for id1 as it is in publishing state TreeMap<Integer, Map<Integer, Long>> checkpoints = new TreeMap<>(); checkpoints.put(0, ImmutableMap.of(0, 10L, 1, 20L, 2, 30L)); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id2"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id3"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints)) .times(1); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); EasyMock.expect(taskClient.pauseAsync("id2")) .andReturn(Futures.immediateFuture(ImmutableMap.of(0, 15L, 1, 25L, 2, 30L))); EasyMock.expect(taskClient.setEndOffsetsAsync("id2", ImmutableMap.of(0, 15L, 1, 25L, 2, 30L), true)) .andReturn(Futures.immediateFuture(true)); taskQueue.shutdown("id3", "Killing task for graceful shutdown"); EasyMock.expectLastCall().times(1); taskQueue.shutdown("id3", "Killing task [%s] which hasn't been assigned to a worker", "id3"); EasyMock.expectLastCall().times(1); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); } @Test public void testResetSuspended() { EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); supervisor = getTestableSupervisor(1, 1, true, "PT1H", null, null, true, kafkaHost); supervisor.start(); supervisor.runInternal(); verifyAll(); EasyMock.reset(indexerMetadataStorageCoordinator); EasyMock.expect(indexerMetadataStorageCoordinator.deleteDataSourceMetadata(DATASOURCE)).andReturn(true); EasyMock.replay(indexerMetadataStorageCoordinator); supervisor.resetInternal(null); verifyAll(); } @Test public void testFailedInitializationAndRecovery() throws Exception { // Block the supervisor initialization with a bad hostname config, make sure this doesn't block the lifecycle supervisor = getTestableSupervisor( 1, 1, true, "PT1H", null, null, false, StringUtils.format("badhostname:%d", kafkaServer.getPort()) ); final KafkaSupervisorTuningConfig tuningConfig = supervisor.getTuningConfig(); addSomeEvents(1); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); replayAll(); supervisor.start(); 
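// start() must not throw: the lifecycle comes up even though initialization keeps failing against the bad hostname.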
Assert.assertTrue(supervisor.isLifecycleStarted()); Assert.assertFalse(supervisor.isStarted()); verifyAll(); while (supervisor.getInitRetryCounter() < 3) { Thread.sleep(1000); } // Portion below is the same test as testNoInitialState(), testing the supervisor after the initialization is fixed resetAll(); Capture<KafkaIndexTask> captured = Capture.newInstance(); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(ImmutableList.of()).anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); EasyMock.expect(taskQueue.add(EasyMock.capture(captured))).andReturn(true); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); replayAll(); // Fix the bad hostname during the initialization retries and finish the supervisor start. // This is equivalent to supervisor.start() in testNoInitialState(). // The test supervisor has a P1D period, so we need to manually trigger the initialization retry. supervisor.getIoConfig().getConsumerProperties().put("bootstrap.servers", kafkaHost); supervisor.tryInit(); Assert.assertTrue(supervisor.isLifecycleStarted()); Assert.assertTrue(supervisor.isStarted()); supervisor.runInternal(); verifyAll(); KafkaIndexTask task = captured.getValue(); Assert.assertEquals(dataSchema, task.getDataSchema()); Assert.assertEquals(tuningConfig.convertToTaskTuningConfig(), task.getTuningConfig()); KafkaIndexTaskIOConfig taskConfig = task.getIOConfig(); Assert.assertEquals(kafkaHost, taskConfig.getConsumerProperties().get("bootstrap.servers")); Assert.assertEquals("myCustomValue", taskConfig.getConsumerProperties().get("myCustomKey")); Assert.assertEquals("sequenceName-0", taskConfig.getBaseSequenceName()); Assert.assertTrue("isUseTransaction", taskConfig.isUseTransaction()); Assert.assertFalse("minimumMessageTime", taskConfig.getMinimumMessageTime().isPresent()); Assert.assertFalse("maximumMessageTime", taskConfig.getMaximumMessageTime().isPresent()); Assert.assertEquals(topic, taskConfig.getStartSequenceNumbers().getStream()); Assert.assertEquals( 0L, taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue() ); Assert.assertEquals( 0L, taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue() ); Assert.assertEquals( 0L, taskConfig.getStartSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue() ); Assert.assertEquals(topic, taskConfig.getEndSequenceNumbers().getStream()); Assert.assertEquals( Long.MAX_VALUE, taskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(0).longValue() ); Assert.assertEquals( Long.MAX_VALUE, taskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(1).longValue() ); Assert.assertEquals( Long.MAX_VALUE, taskConfig.getEndSequenceNumbers().getPartitionSequenceNumberMap().get(2).longValue() ); } @Test public void testGetCurrentTotalStats() { supervisor = getTestableSupervisor(1, 2, true, "PT1H", null, null, false, kafkaHost); supervisor.addTaskGroupToActivelyReadingTaskGroup( supervisor.getTaskGroupIdForPartition(0), ImmutableMap.of(0, 0L), Optional.absent(), Optional.absent(), ImmutableSet.of("task1"), ImmutableSet.of() ); supervisor.addTaskGroupToPendingCompletionTaskGroup( 
supervisor.getTaskGroupIdForPartition(1), ImmutableMap.of(0, 0L), Optional.absent(), Optional.absent(), ImmutableSet.of("task2"), ImmutableSet.of() ); EasyMock.expect(taskClient.getMovingAveragesAsync("task1")).andReturn(Futures.immediateFuture(ImmutableMap.of( "prop1", "val1" ))).times(1); EasyMock.expect(taskClient.getMovingAveragesAsync("task2")).andReturn(Futures.immediateFuture(ImmutableMap.of( "prop2", "val2" ))).times(1); replayAll(); Map<String, Map<String, Object>> stats = supervisor.getStats(); verifyAll(); Assert.assertEquals(2, stats.size()); Assert.assertEquals(ImmutableSet.of("0", "1"), stats.keySet()); Assert.assertEquals(ImmutableMap.of("task1", ImmutableMap.of("prop1", "val1")), stats.get("0")); Assert.assertEquals(ImmutableMap.of("task2", ImmutableMap.of("prop2", "val2")), stats.get("1")); } @Test public void testGetCurrentParseErrors() { supervisor = getTestableSupervisor(1, 2, true, "PT1H", null, null, false, kafkaHost); supervisor.addTaskGroupToActivelyReadingTaskGroup( supervisor.getTaskGroupIdForPartition(0), ImmutableMap.of(0, 0L), Optional.absent(), Optional.absent(), ImmutableSet.of("task1"), ImmutableSet.of() ); supervisor.addTaskGroupToPendingCompletionTaskGroup( supervisor.getTaskGroupIdForPartition(1), ImmutableMap.of(0, 0L), Optional.absent(), Optional.absent(), ImmutableSet.of("task2"), ImmutableSet.of() ); ParseExceptionReport exception1 = new ParseExceptionReport( "testInput1", "unparseable", ImmutableList.of("detail1", "detail2"), 1000L ); ParseExceptionReport exception2 = new ParseExceptionReport( "testInput2", "unparseable", ImmutableList.of("detail1", "detail2"), 2000L ); ParseExceptionReport exception3 = new ParseExceptionReport( "testInput3", "unparseable", ImmutableList.of("detail1", "detail2"), 3000L ); ParseExceptionReport exception4 = new ParseExceptionReport( "testInput4", "unparseable", ImmutableList.of("detail1", "detail2"), 4000L ); EasyMock.expect(taskClient.getParseErrorsAsync("task1")).andReturn(Futures.immediateFuture(ImmutableList.of( exception1, exception2 ))).times(1); EasyMock.expect(taskClient.getParseErrorsAsync("task2")).andReturn(Futures.immediateFuture(ImmutableList.of( exception3, exception4 ))).times(1); replayAll(); List<ParseExceptionReport> errors = supervisor.getParseErrors(); verifyAll(); Assert.assertEquals(ImmutableList.of(exception4, exception3, exception2, exception1), errors); } @Test public void testDoNotKillCompatibleTasks() throws Exception { // This supervisor always returns true for isTaskCurrent -> it should not kill its tasks int numReplicas = 2; supervisor = getTestableSupervisorCustomIsTaskCurrent( numReplicas, 1, true, "PT1H", new Period("P1D"), new Period("P1D"), false, true ); addSomeEvents(1); Task task = createKafkaIndexTask( "id1", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>( "topic", ImmutableMap.of(0, 0L, 2, 0L), ImmutableSet.of() ), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), null, null, supervisor.getTuningConfig() ); List<Task> existingTasks = ImmutableList.of(task); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(existingTasks).anyTimes(); 
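// isTaskCurrent is stubbed to return true, so the existing replica must be adopted and only one more task created to reach the configured replica count.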
EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes(); EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(task)).anyTimes(); EasyMock.expect(taskClient.getStatusAsync(EasyMock.anyString())) .andReturn(Futures.immediateFuture(Status.NOT_STARTED)) .anyTimes(); EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.anyString())) .andReturn(Futures.immediateFuture(DateTimes.nowUtc())) .anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); EasyMock.expect(taskQueue.add(EasyMock.anyObject(Task.class))).andReturn(true); TreeMap<Integer, Map<Integer, Long>> checkpoints1 = new TreeMap<>(); checkpoints1.put(0, ImmutableMap.of(0, 0L, 2, 0L)); EasyMock.expect(taskClient.getCheckpointsAsync(EasyMock.contains("id1"), EasyMock.anyBoolean())) .andReturn(Futures.immediateFuture(checkpoints1)) .times(numReplicas); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); } @Test public void testKillIncompatibleTasks() throws Exception { // This supervisor always returns false for isTaskCurrent -> it should kill its tasks int numReplicas = 2; supervisor = getTestableSupervisorCustomIsTaskCurrent( numReplicas, 1, true, "PT1H", new Period("P1D"), new Period("P1D"), false, false ); addSomeEvents(1); Task task = createKafkaIndexTask( "id1", DATASOURCE, 0, new SeekableStreamStartSequenceNumbers<>( "topic", ImmutableMap.of(0, 0L, 2, 0L), ImmutableSet.of() ), new SeekableStreamEndSequenceNumbers<>("topic", ImmutableMap.of(0, Long.MAX_VALUE, 2, Long.MAX_VALUE)), null, null, supervisor.getTuningConfig() ); List<Task> existingTasks = ImmutableList.of(task); EasyMock.expect(taskMaster.getTaskQueue()).andReturn(Optional.of(taskQueue)).anyTimes(); EasyMock.expect(taskMaster.getTaskRunner()).andReturn(Optional.of(taskRunner)).anyTimes(); EasyMock.expect(taskRunner.getRunningTasks()).andReturn(Collections.emptyList()).anyTimes(); EasyMock.expect(taskStorage.getActiveTasksByDatasource(DATASOURCE)).andReturn(existingTasks).anyTimes(); EasyMock.expect(taskStorage.getStatus("id1")).andReturn(Optional.of(TaskStatus.running("id1"))).anyTimes(); EasyMock.expect(taskStorage.getTask("id1")).andReturn(Optional.of(task)).anyTimes(); EasyMock.expect(taskClient.getStatusAsync(EasyMock.anyString())) .andReturn(Futures.immediateFuture(Status.NOT_STARTED)) .anyTimes(); EasyMock.expect(taskClient.getStartTimeAsync(EasyMock.anyString())) .andReturn(Futures.immediateFuture(DateTimes.nowUtc())) .anyTimes(); EasyMock.expect(indexerMetadataStorageCoordinator.retrieveDataSourceMetadata(DATASOURCE)).andReturn( new KafkaDataSourceMetadata( null ) ).anyTimes(); EasyMock.expect(taskClient.stopAsync("id1", false)).andReturn(Futures.immediateFuture(true)); taskRunner.registerListener(EasyMock.anyObject(TaskRunnerListener.class), EasyMock.anyObject(Executor.class)); EasyMock.expect(taskQueue.add(EasyMock.anyObject(Task.class))).andReturn(true).times(2); replayAll(); supervisor.start(); supervisor.runInternal(); verifyAll(); } @Test public void testIsTaskCurrent() { DateTime minMessageTime = DateTimes.nowUtc(); DateTime maxMessageTime = DateTimes.nowUtc().plus(10000); KafkaSupervisor supervisor = getSupervisor( 2, 1, true, "PT1H", new Period("P1D"), new Period("P1D"), false, kafkaHost, dataSchema, new KafkaSupervisorTuningConfig( null, 1000, null, null, 
50000, null, new Period("P1Y"), new File("/test"), null, null, null, false, null, false, null, numThreads, TEST_CHAT_THREADS, TEST_CHAT_RETRIES, TEST_HTTP_TIMEOUT, TEST_SHUTDOWN_TIMEOUT, null, null, null, null, null ) ); supervisor.addTaskGroupToActivelyReadingTaskGroup( 42, ImmutableMap.of(0, 0L, 2, 0L), Optional.of(minMessageTime), Optional.of(maxMessageTime), ImmutableSet.of("id1", "id2", "id3", "id4"), ImmutableSet.of() ); DataSchema modifiedDataSchema = getDataSchema("some other datasource"); KafkaSupervisorTuningConfig modifiedTuningConfig = new KafkaSupervisorTuningConfig( null, 42, // This is different null, null, 50000, null, new Period("P1Y"), new File("/test"), null, null, null, false, null, null, null, numThreads, TEST_CHAT_THREADS, TEST_CHAT_RETRIES, TEST_HTTP_TIMEOUT, TEST_SHUTDOWN_TIMEOUT, null, null, null, null, null ); KafkaIndexTask taskFromStorage = createKafkaIndexTask( "id1", 0, new SeekableStreamStartSequenceNumbers<>( "topic", ImmutableMap.of(0, 0L, 2, 0L), ImmutableSet.of() ), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), minMessageTime, maxMessageTime, dataSchema, supervisor.getTuningConfig() ); KafkaIndexTask taskFromStorageMismatchedDataSchema = createKafkaIndexTask( "id2", 0, new SeekableStreamStartSequenceNumbers<>( "topic", ImmutableMap.of(0, 0L, 2, 0L), ImmutableSet.of() ), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), minMessageTime, maxMessageTime, modifiedDataSchema, supervisor.getTuningConfig() ); KafkaIndexTask taskFromStorageMismatchedTuningConfig = createKafkaIndexTask( "id3", 0, new SeekableStreamStartSequenceNumbers<>( "topic", ImmutableMap.of(0, 0L, 2, 0L), ImmutableSet.of() ), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), minMessageTime, maxMessageTime, dataSchema, modifiedTuningConfig ); KafkaIndexTask taskFromStorageMismatchedPartitionsWithTaskGroup = createKafkaIndexTask( "id4", 0, new SeekableStreamStartSequenceNumbers<>( "topic", ImmutableMap.of(0, 0L, 2, 6L), ImmutableSet.of() ), new SeekableStreamEndSequenceNumbers<>( "topic", ImmutableMap.of(0, Long.MAX_VALUE, 2, Long.MAX_VALUE) ), minMessageTime, maxMessageTime, dataSchema, supervisor.getTuningConfig() ); EasyMock.expect(taskStorage.getTask("id1")) .andReturn(Optional.of(taskFromStorage)) .once(); EasyMock.expect(taskStorage.getTask("id2")) .andReturn(Optional.of(taskFromStorageMismatchedDataSchema)) .once(); EasyMock.expect(taskStorage.getTask("id3")) .andReturn(Optional.of(taskFromStorageMismatchedTuningConfig)) .once(); EasyMock.expect(taskStorage.getTask("id4")) .andReturn(Optional.of(taskFromStorageMismatchedPartitionsWithTaskGroup)) .once(); replayAll(); Assert.assertTrue(supervisor.isTaskCurrent(42, "id1")); Assert.assertFalse(supervisor.isTaskCurrent(42, "id2")); Assert.assertFalse(supervisor.isTaskCurrent(42, "id3")); Assert.assertFalse(supervisor.isTaskCurrent(42, "id4")); verifyAll(); } private void addSomeEvents(int numEventsPerPartition) throws Exception { // create topic manually try (Admin admin = kafkaServer.newAdminClient()) { admin.createTopics( Collections.singletonList(new NewTopic(topic, NUM_PARTITIONS, (short) 1)) ).all().get(); } try (final KafkaProducer<byte[], byte[]> kafkaProducer = kafkaServer.newProducer()) { kafkaProducer.initTransactions(); kafkaProducer.beginTransaction(); for (int i = 0; i < NUM_PARTITIONS; i++) { for (int j = 0; j < numEventsPerPartition; j++) { 
kafkaProducer.send( new ProducerRecord<>( topic, i, null, StringUtils.toUtf8(StringUtils.format("event-%d", j)) ) ).get(); } } kafkaProducer.commitTransaction(); } } private void addMoreEvents(int numEventsPerPartition, int numPartitions) throws Exception { try (Admin admin = kafkaServer.newAdminClient()) { admin.createPartitions(Collections.singletonMap(topic, NewPartitions.increaseTo(numPartitions))).all().get(); } try (final KafkaProducer<byte[], byte[]> kafkaProducer = kafkaServer.newProducer()) { kafkaProducer.initTransactions(); kafkaProducer.beginTransaction(); for (int i = NUM_PARTITIONS; i < numPartitions; i++) { for (int j = 0; j < numEventsPerPartition; j++) { kafkaProducer.send( new ProducerRecord<>( topic, i, null, StringUtils.toUtf8(StringUtils.format("event-%d", j)) ) ).get(); } } kafkaProducer.commitTransaction(); } } private TestableKafkaSupervisor getTestableSupervisor( int replicas, int taskCount, boolean useEarliestOffset, String duration, Period lateMessageRejectionPeriod, Period earlyMessageRejectionPeriod ) { return getTestableSupervisor( replicas, taskCount, useEarliestOffset, false, duration, lateMessageRejectionPeriod, earlyMessageRejectionPeriod ); } private TestableKafkaSupervisor getTestableSupervisor( int replicas, int taskCount, boolean useEarliestOffset, boolean resetOffsetAutomatically, String duration, Period lateMessageRejectionPeriod, Period earlyMessageRejectionPeriod ) { return getTestableSupervisor( replicas, taskCount, useEarliestOffset, resetOffsetAutomatically, duration, lateMessageRejectionPeriod, earlyMessageRejectionPeriod, false, kafkaHost ); } private TestableKafkaSupervisor getTestableSupervisor( int replicas, int taskCount, boolean useEarliestOffset, String duration, Period lateMessageRejectionPeriod, Period earlyMessageRejectionPeriod, boolean suspended, String kafkaHost ) { return getTestableSupervisor( replicas, taskCount, useEarliestOffset, false, duration, lateMessageRejectionPeriod, earlyMessageRejectionPeriod, suspended, kafkaHost ); } private TestableKafkaSupervisor getTestableSupervisor( int replicas, int taskCount, boolean useEarliestOffset, boolean resetOffsetAutomatically, String duration, Period lateMessageRejectionPeriod, Period earlyMessageRejectionPeriod, boolean suspended, String kafkaHost ) { final Map<String, Object> consumerProperties = KafkaConsumerConfigs.getConsumerProperties(); consumerProperties.put("myCustomKey", "myCustomValue"); consumerProperties.put("bootstrap.servers", kafkaHost); KafkaSupervisorIOConfig kafkaSupervisorIOConfig = new KafkaSupervisorIOConfig( topic, INPUT_FORMAT, replicas, taskCount, new Period(duration), consumerProperties, null, KafkaSupervisorIOConfig.DEFAULT_POLL_TIMEOUT_MILLIS, new Period("P1D"), new Period("PT30S"), useEarliestOffset, new Period("PT30M"), lateMessageRejectionPeriod, earlyMessageRejectionPeriod, null ); KafkaIndexTaskClientFactory taskClientFactory = new KafkaIndexTaskClientFactory( null, null ) { @Override public KafkaIndexTaskClient build( TaskInfoProvider taskInfoProvider, String dataSource, int numThreads, Duration httpTimeout, long numRetries ) { Assert.assertEquals(TEST_CHAT_THREADS, numThreads); Assert.assertEquals(TEST_HTTP_TIMEOUT.toStandardDuration(), httpTimeout); Assert.assertEquals(TEST_CHAT_RETRIES, numRetries); return taskClient; } }; final KafkaSupervisorTuningConfig tuningConfig = new KafkaSupervisorTuningConfig( null, 1000, null, null, 50000, null, new Period("P1Y"), new File("/test"), null, null, null, false, null, resetOffsetAutomatically, null, 
numThreads, TEST_CHAT_THREADS, TEST_CHAT_RETRIES, TEST_HTTP_TIMEOUT, TEST_SHUTDOWN_TIMEOUT, null, null, null, null, 10 ); return new TestableKafkaSupervisor( taskStorage, taskMaster, indexerMetadataStorageCoordinator, taskClientFactory, OBJECT_MAPPER, new KafkaSupervisorSpec( null, dataSchema, tuningConfig, kafkaSupervisorIOConfig, null, suspended, taskStorage, taskMaster, indexerMetadataStorageCoordinator, taskClientFactory, OBJECT_MAPPER, new NoopServiceEmitter(), new DruidMonitorSchedulerConfig(), rowIngestionMetersFactory, new SupervisorStateManagerConfig() ), rowIngestionMetersFactory ); } /** * Use when you want to mock the return value of SeekableStreamSupervisor#isTaskCurrent() */ private TestableKafkaSupervisor getTestableSupervisorCustomIsTaskCurrent( int replicas, int taskCount, boolean useEarliestOffset, String duration, Period lateMessageRejectionPeriod, Period earlyMessageRejectionPeriod, boolean suspended, boolean isTaskCurrentReturn ) { Map<String, Object> consumerProperties = new HashMap<>(); consumerProperties.put("myCustomKey", "myCustomValue"); consumerProperties.put("bootstrap.servers", kafkaHost); consumerProperties.put("isolation.level", "read_committed"); KafkaSupervisorIOConfig kafkaSupervisorIOConfig = new KafkaSupervisorIOConfig( topic, INPUT_FORMAT, replicas, taskCount, new Period(duration), consumerProperties, null, KafkaSupervisorIOConfig.DEFAULT_POLL_TIMEOUT_MILLIS, new Period("P1D"), new Period("PT30S"), useEarliestOffset, new Period("PT30M"), lateMessageRejectionPeriod, earlyMessageRejectionPeriod, null ); KafkaIndexTaskClientFactory taskClientFactory = new KafkaIndexTaskClientFactory( null, null ) { @Override public KafkaIndexTaskClient build( TaskInfoProvider taskInfoProvider, String dataSource, int numThreads, Duration httpTimeout, long numRetries ) { Assert.assertEquals(TEST_CHAT_THREADS, numThreads); Assert.assertEquals(TEST_HTTP_TIMEOUT.toStandardDuration(), httpTimeout); Assert.assertEquals(TEST_CHAT_RETRIES, numRetries); return taskClient; } }; final KafkaSupervisorTuningConfig tuningConfig = new KafkaSupervisorTuningConfig( null, 1000, null, null, 50000, null, new Period("P1Y"), new File("/test"), null, null, null, false, null, false, null, numThreads, TEST_CHAT_THREADS, TEST_CHAT_RETRIES, TEST_HTTP_TIMEOUT, TEST_SHUTDOWN_TIMEOUT, null, null, null, null, null ); return new TestableKafkaSupervisorWithCustomIsTaskCurrent( taskStorage, taskMaster, indexerMetadataStorageCoordinator, taskClientFactory, OBJECT_MAPPER, new KafkaSupervisorSpec( null, dataSchema, tuningConfig, kafkaSupervisorIOConfig, null, suspended, taskStorage, taskMaster, indexerMetadataStorageCoordinator, taskClientFactory, OBJECT_MAPPER, new NoopServiceEmitter(), new DruidMonitorSchedulerConfig(), rowIngestionMetersFactory, supervisorConfig ), rowIngestionMetersFactory, isTaskCurrentReturn ); } /** * Use when you don't want generateSequenceNumber overridden */ private KafkaSupervisor getSupervisor( int replicas, int taskCount, boolean useEarliestOffset, String duration, Period lateMessageRejectionPeriod, Period earlyMessageRejectionPeriod, boolean suspended, String kafkaHost, DataSchema dataSchema, KafkaSupervisorTuningConfig tuningConfig ) { Map<String, Object> consumerProperties = new HashMap<>(); consumerProperties.put("myCustomKey", "myCustomValue"); consumerProperties.put("bootstrap.servers", kafkaHost); consumerProperties.put("isolation.level", "read_committed"); KafkaSupervisorIOConfig kafkaSupervisorIOConfig = new KafkaSupervisorIOConfig( topic, INPUT_FORMAT, replicas, 
taskCount, new Period(duration), consumerProperties, null, KafkaSupervisorIOConfig.DEFAULT_POLL_TIMEOUT_MILLIS, new Period("P1D"), new Period("PT30S"), useEarliestOffset, new Period("PT30M"), lateMessageRejectionPeriod, earlyMessageRejectionPeriod, null ); KafkaIndexTaskClientFactory taskClientFactory = new KafkaIndexTaskClientFactory( null, null ) { @Override public KafkaIndexTaskClient build( TaskInfoProvider taskInfoProvider, String dataSource, int numThreads, Duration httpTimeout, long numRetries ) { Assert.assertEquals(TEST_CHAT_THREADS, numThreads); Assert.assertEquals(TEST_HTTP_TIMEOUT.toStandardDuration(), httpTimeout); Assert.assertEquals(TEST_CHAT_RETRIES, numRetries); return taskClient; } }; return new KafkaSupervisor( taskStorage, taskMaster, indexerMetadataStorageCoordinator, taskClientFactory, OBJECT_MAPPER, new KafkaSupervisorSpec( null, dataSchema, tuningConfig, kafkaSupervisorIOConfig, null, suspended, taskStorage, taskMaster, indexerMetadataStorageCoordinator, taskClientFactory, OBJECT_MAPPER, new NoopServiceEmitter(), new DruidMonitorSchedulerConfig(), rowIngestionMetersFactory, supervisorConfig ), rowIngestionMetersFactory ); } private static DataSchema getDataSchema(String dataSource) { List<DimensionSchema> dimensions = new ArrayList<>(); dimensions.add(StringDimensionSchema.create("dim1")); dimensions.add(StringDimensionSchema.create("dim2")); return new DataSchema( dataSource, new TimestampSpec("timestamp", "iso", null), new DimensionsSpec( dimensions, null, null ), new AggregatorFactory[]{new CountAggregatorFactory("rows")}, new UniformGranularitySpec( Granularities.HOUR, Granularities.NONE, ImmutableList.of() ), null ); } private KafkaIndexTask createKafkaIndexTask( String id, String dataSource, int taskGroupId, SeekableStreamStartSequenceNumbers<Integer, Long> startPartitions, SeekableStreamEndSequenceNumbers<Integer, Long> endPartitions, DateTime minimumMessageTime, DateTime maximumMessageTime, KafkaSupervisorTuningConfig tuningConfig ) { return createKafkaIndexTask( id, taskGroupId, startPartitions, endPartitions, minimumMessageTime, maximumMessageTime, getDataSchema(dataSource), tuningConfig ); } private KafkaIndexTask createKafkaIndexTask( String id, int taskGroupId, SeekableStreamStartSequenceNumbers<Integer, Long> startPartitions, SeekableStreamEndSequenceNumbers<Integer, Long> endPartitions, DateTime minimumMessageTime, DateTime maximumMessageTime, DataSchema schema, KafkaSupervisorTuningConfig tuningConfig ) { return createKafkaIndexTask( id, taskGroupId, startPartitions, endPartitions, minimumMessageTime, maximumMessageTime, schema, tuningConfig.convertToTaskTuningConfig() ); } private KafkaIndexTask createKafkaIndexTask( String id, int taskGroupId, SeekableStreamStartSequenceNumbers<Integer, Long> startPartitions, SeekableStreamEndSequenceNumbers<Integer, Long> endPartitions, DateTime minimumMessageTime, DateTime maximumMessageTime, DataSchema schema, KafkaIndexTaskTuningConfig tuningConfig ) { return new KafkaIndexTask( id, null, schema, tuningConfig, new KafkaIndexTaskIOConfig( taskGroupId, "sequenceName-" + taskGroupId, startPartitions, endPartitions, ImmutableMap.of("bootstrap.servers", kafkaHost), KafkaSupervisorIOConfig.DEFAULT_POLL_TIMEOUT_MILLIS, true, minimumMessageTime, maximumMessageTime, INPUT_FORMAT ), Collections.emptyMap(), OBJECT_MAPPER ); } private static class TestTaskRunnerWorkItem extends TaskRunnerWorkItem { private final String taskType; private final TaskLocation location; private final String dataSource; 
TestTaskRunnerWorkItem(Task task, ListenableFuture<TaskStatus> result, TaskLocation location) { super(task.getId(), result); this.taskType = task.getType(); this.location = location; this.dataSource = task.getDataSource(); } @Override public TaskLocation getLocation() { return location; } @Override public String getTaskType() { return taskType; } @Override public String getDataSource() { return dataSource; } } private static class TestableKafkaSupervisor extends KafkaSupervisor { private final Map<String, Object> consumerProperties; public TestableKafkaSupervisor( TaskStorage taskStorage, TaskMaster taskMaster, IndexerMetadataStorageCoordinator indexerMetadataStorageCoordinator, KafkaIndexTaskClientFactory taskClientFactory, ObjectMapper mapper, KafkaSupervisorSpec spec, RowIngestionMetersFactory rowIngestionMetersFactory ) { super( taskStorage, taskMaster, indexerMetadataStorageCoordinator, taskClientFactory, mapper, spec, rowIngestionMetersFactory ); this.consumerProperties = spec.getIoConfig().getConsumerProperties(); } @Override protected RecordSupplier<Integer, Long, KafkaRecordEntity> setupRecordSupplier() { final Map<String, Object> consumerConfigs = KafkaConsumerConfigs.getConsumerProperties(); consumerConfigs.put("metadata.max.age.ms", "1"); final Properties props = new Properties(); KafkaRecordSupplier.addConsumerPropertiesFromConfig(props, sortingMapper, consumerProperties); props.putAll(consumerConfigs); Deserializer keyDeserializerObject = new ByteArrayDeserializer(); Deserializer valueDeserializerObject = new ByteArrayDeserializer(); return new KafkaRecordSupplier( new KafkaConsumer<>(props, keyDeserializerObject, valueDeserializerObject) ); } @Override protected String generateSequenceName( Map<Integer, Long> startPartitions, Optional<DateTime> minimumMessageTime, Optional<DateTime> maximumMessageTime, DataSchema dataSchema, SeekableStreamIndexTaskTuningConfig tuningConfig ) { final int groupId = getTaskGroupIdForPartition(startPartitions.keySet().iterator().next()); return StringUtils.format("sequenceName-%d", groupId); } private SeekableStreamSupervisorStateManager getStateManager() { return stateManager; } } private static class TestableKafkaSupervisorWithCustomIsTaskCurrent extends TestableKafkaSupervisor { private final boolean isTaskCurrentReturn; public TestableKafkaSupervisorWithCustomIsTaskCurrent( TaskStorage taskStorage, TaskMaster taskMaster, IndexerMetadataStorageCoordinator indexerMetadataStorageCoordinator, KafkaIndexTaskClientFactory taskClientFactory, ObjectMapper mapper, KafkaSupervisorSpec spec, RowIngestionMetersFactory rowIngestionMetersFactory, boolean isTaskCurrentReturn ) { super( taskStorage, taskMaster, indexerMetadataStorageCoordinator, taskClientFactory, mapper, spec, rowIngestionMetersFactory ); this.isTaskCurrentReturn = isTaskCurrentReturn; } @Override public boolean isTaskCurrent(int taskGroupId, String taskId) { return isTaskCurrentReturn; } } }
package io.getquill.context.sql.util

object StringOps {
  implicit class StringOpsExt(str: String) {
    def collapseSpace: String =
      str.stripMargin.replaceAll("\\s+", " ").trim
  }
}
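// --- Illustrative addition (not part of the original file) ---
// collapseSpace normalizes a margin-stripped multi-line string into a single
// space-separated line, which is handy when comparing generated SQL in tests.
// A small usage sketch:
//
//   import io.getquill.context.sql.util.StringOps._
//
//   """select *
//     |from person""".collapseSpace  // == "select * from person"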
package com.ctrip.hermes.core.transport.endpoint;

import io.netty.channel.ChannelHandlerContext;
import io.netty.handler.timeout.IdleState;
import io.netty.handler.timeout.IdleStateEvent;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.ctrip.hermes.core.transport.command.processor.CommandProcessorManager;
import com.ctrip.hermes.core.transport.netty.AbstractNettyChannelInboundHandler;
import com.ctrip.hermes.core.transport.netty.NettyUtils;

/**
 * @author Leo Liang([email protected])
 *
 */
public class DefaultServerChannelInboundHandler extends AbstractNettyChannelInboundHandler {

    private static final Logger log = LoggerFactory.getLogger(DefaultServerChannelInboundHandler.class);

    private int m_maxIdleTime;

    public DefaultServerChannelInboundHandler(CommandProcessorManager cmdProcessorManager, int maxIdleTime) {
        super(cmdProcessorManager);
        m_maxIdleTime = maxIdleTime;
    }

    @Override
    public void channelActive(ChannelHandlerContext ctx) throws Exception {
        log.info("Client connected(addr={})", NettyUtils.parseChannelRemoteAddr(ctx.channel()));
        super.channelActive(ctx);
    }

    @Override
    public void channelInactive(ChannelHandlerContext ctx) throws Exception {
        log.info("Client disconnected(addr={})", NettyUtils.parseChannelRemoteAddr(ctx.channel()));
        super.channelInactive(ctx);
    }

    @Override
    public void userEventTriggered(ChannelHandlerContext ctx, Object evt) throws Exception {
        if (evt instanceof IdleStateEvent) {
            IdleStateEvent event = (IdleStateEvent) evt;
            if (event.state().equals(IdleState.ALL_IDLE)) {
                log.info("Client idle for {} seconds, will remove it automatically(client addr={})", m_maxIdleTime,
                      NettyUtils.parseChannelRemoteAddr(ctx.channel()));
                ctx.channel().close();
            }
        }
        super.userEventTriggered(ctx, evt);
    }

}
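// --- Illustrative addition (not part of the original file) ---
// This handler only reacts to IdleStateEvent; something upstream in the
// pipeline must emit those events. A minimal, hypothetical wiring sketch
// using stock Netty 4 APIs is shown below -- the real bootstrap code in
// hermes-core may differ:
//
//   int maxIdleTime = 60; // seconds; illustrative value
//   ServerBootstrap bootstrap = new ServerBootstrap();
//   bootstrap.childHandler(new ChannelInitializer<SocketChannel>() {
//      @Override
//      protected void initChannel(SocketChannel ch) {
//         // IdleStateHandler(readerIdle, writerIdle, allIdle): an allIdle
//         // timeout of maxIdleTime fires the IdleState.ALL_IDLE event that
//         // DefaultServerChannelInboundHandler handles above.
//         ch.pipeline().addLast(new IdleStateHandler(0, 0, maxIdleTime));
//         ch.pipeline().addLast(new DefaultServerChannelInboundHandler(cmdProcessorManager, maxIdleTime));
//      }
//   });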
package java.lang;

/**
 * The <code>Compiler</code> class is a placeholder for a JIT compiler
 * implementation, and does nothing unless there is such a compiler.
 *
 * <p>The system property <code>java.compiler</code> may contain the name
 * of a library to load with <code>System.loadLibrary</code> when the
 * virtual machine first starts. If so, and loading the library succeeds,
 * then a function by the name of <code>java_lang_Compiler_start()</code>
 * in that library is called.
 *
 * <p>Note that a VM might not have implemented any of this.
 *
 * @author Tom Tromey ([email protected])
 * @see System#getProperty(String)
 * @see System#getProperty(String, String)
 * @see System#loadLibrary(String)
 * @since JDK 1.0
 * @status updated to 1.4
 */
public final class Compiler
{
  /**
   * Don't allow new <code>Compiler</code>s to be made.
   */
  private Compiler()
  {
  }

  /**
   * Compile the class named by <code>oneClass</code>.
   *
   * @param oneClass the class to compile
   * @return <code>false</code> if no compiler is available or
   *         compilation failed, <code>true</code> if compilation succeeded
   * @throws NullPointerException if oneClass is null
   */
  public static boolean compileClass(Class oneClass)
  {
    return VMCompiler.compileClass(oneClass);
  }

  /**
   * Compile the classes whose name matches <code>classNames</code>.
   *
   * @param classNames the name of classes to compile
   * @return <code>false</code> if no compiler is available or
   *         compilation failed, <code>true</code> if compilation succeeded
   * @throws NullPointerException if classNames is null
   */
  public static boolean compileClasses(String classNames)
  {
    return VMCompiler.compileClasses(classNames);
  }

  /**
   * This method examines the argument and performs an operation
   * according to the compiler's documentation. No specific operation
   * is required.
   *
   * @param arg a compiler-specific argument
   * @return a compiler-specific value, including null
   * @throws NullPointerException if the compiler doesn't like a null arg
   */
  public static Object command(Object arg)
  {
    return VMCompiler.command(arg);
  }

  /**
   * Calling <code>Compiler.enable()</code> will cause the compiler
   * to resume operation if it was previously disabled, provided that a
   * compiler even exists.
   */
  public static void enable()
  {
    VMCompiler.enable();
  }

  /**
   * Calling <code>Compiler.disable()</code> will cause the compiler
   * to be suspended, provided that a compiler even exists.
   */
  public static void disable()
  {
    VMCompiler.disable();
  }
}
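// --- Illustrative addition (not part of the original file) ---
// Typical (if rarely useful) client code for this API. Because the class is
// a placeholder, every call is a no-op unless the VM actually ships a JIT
// hook:
//
//   boolean ok = Compiler.compileClass(String.class); // false when no compiler exists
//   Compiler.disable();  // suspend JIT activity, if any
//   Compiler.enable();   // resume it again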
<html>
<head>
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
  <link rel="stylesheet" href="stylesheets/bootstrap.min.css" type="text/css" media="screen" charset="utf-8">
  <link rel="stylesheet" href="stylesheets/bootstrap-theme.min.css" type="text/css" media="screen" charset="utf-8">
  <link rel="stylesheet" href="stylesheets/style.css" type="text/css" media="screen" charset="utf-8">
  <link rel="shortcut icon" type="image/png" href="i/favico.png">
</head>
<body>
  <header class="header">
    <span>Clappr</span>
    <ul>
      <li><a href="https://github.com/globocom/clappr/blob/master/README.md">docs</a></li>
    </ul>
  </header>
  <section class="container">
    <div class="main">
      <div id="output">
        <div id="player-wrapper" class="player"></div>
      </div>
    </div>
    <div class="sidebar">
      <div id="editor">var playerElement = document.getElementById("player-wrapper");
var player = new Clappr.Player({
  source: 'http://clappr.io/highline.mp4',
  baseUrl: 'http://cdn.clappr.io/latest',
  poster: 'http://clappr.io/poster.png',
  mute: true,
  height: 360,
  width: 640
});
player.attachTo(playerElement);
</div>
      <button class="run btn btn-primary">Run</button>
      <div id="console"></div>
    </div>
  </section>
  <footer class="footer"></footer>
  <script type="text/javascript" charset="utf-8" src="j/main.js"></script>
  <script type="text/javascript" charset="utf-8" src="j/editor/ace.js"></script>
  <script type="text/javascript" charset="utf-8" src="latest/clappr.min.js"></script>
  <script>
    var urlParams;
    (function() {
      window.onpopstate = function () {
        var match,
            pl = /\+/g,  // Regex for replacing addition symbol with a space
            search = /([^&=]+)=?([^&]*)/g,
            decode = function (s) { return decodeURIComponent(s.replace(pl, " ")); },
            query = window.location.search.substring(1);
        urlParams = {};
        while (match = search.exec(query))
          urlParams[decode(match[1])] = decode(match[2]);
      }
      window.onpopstate();
    })();

    var playerElement = document.getElementById("player-wrapper");
    var player = new Clappr.Player({
      source: 'http://clappr.io/highline.mp4',
      poster: 'http://clappr.io/poster.png',
      baseUrl: 'http://cdn.clappr.io/latest',
      mute: true,
      height: 360,
      width: 640
    });
    player.attachTo(playerElement);

    // editor
    window.onload = function() {
      var editor = ace.edit("editor");
      var session = editor.getSession();
      editor.setTheme("ace/theme/katzenmilch");
      session.setMode("ace/mode/javascript");
      session.setTabSize(2);
      session.setUseSoftTabs(true);
    }
  </script>
</body>
</html>
package com.example.gungde.imk_m3.api;

import com.example.gungde.imk_m3.BuildConfig;
import com.example.gungde.imk_m3.model.Article;

import java.util.List;

import retrofit2.Call;
import retrofit2.Retrofit;
import retrofit2.converter.gson.GsonConverterFactory;
import retrofit2.http.GET;
import retrofit2.http.Query;

/**
 * Created by gungdeaditya on 13/05/17.
 */
public interface ApiInterface {

    // Fetches posts from the REST API; "_embed" pulls in embedded resources.
    // The original annotation read @Query("&_embed"), but Retrofit adds the
    // query separator itself and would percent-encode the stray "&" into the
    // parameter name, so the "&" is dropped here.
    @GET("posts")
    Call<List<Article>> getArticle(
            @Query("_embed") String embed
    );

    Retrofit retrofit = new Retrofit.Builder()
            .baseUrl(BuildConfig.BASE_URL)
            .addConverterFactory(GsonConverterFactory.create())
            .build();
}
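// --- Illustrative addition (not part of the original file) ---
// A minimal caller sketch for the interface above, using only stock
// Retrofit 2 APIs (Callback, Call, Response); the surrounding class is an
// assumption:
//
//   ApiInterface api = ApiInterface.retrofit.create(ApiInterface.class);
//   api.getArticle("true").enqueue(new Callback<List<Article>>() {
//      @Override
//      public void onResponse(Call<List<Article>> call, Response<List<Article>> response) {
//         List<Article> articles = response.body(); // null on HTTP errors; check response.isSuccessful()
//      }
//
//      @Override
//      public void onFailure(Call<List<Article>> call, Throwable t) {
//         // network failure or deserialization error
//      }
//   });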
require 'test_helper'

module TalentWizard
  class DemoControllerTest < ActionController::TestCase
    # test "the truth" do
    #   assert true
    # end
  end
end
using System; using System.Collections.Generic; using FlintSharp.Behaviours; using FlintSharp.Activities; using FlintSharp.Counters; using FlintSharp.Easing; using FlintSharp.Emitters; using FlintSharp.EnergyEasing; using FlintSharp.Initializers; using FlintSharp.Particles; using FlintSharp.Zones; namespace FlintSharp.Behaviours { /// <summary> /// The MatchVelocity action applies an acceleration to the particle to match /// its velocity to that of its nearest neighbours. /// </summary> public class MatchVelocity : Behaviour { private double m_max; private double m_acc; private double m_maxSq; public MatchVelocity() : this(10.0, 1.0) { } /// <summary> /// The constructor creates a MatchVelocity action for use by /// an emitter. To add a MatchVelocity to all particles created by an emitter, use the /// emitter's addAction method. /// /// @see org.flintparticles.emitters.Emitter#addAction() /// </summary> /// <param name="maxDistance">The maximum distance, in pixels, over which this action operates. /// The particle will match its velocity other particles that are this close or closer to it.</param> /// <param name="acceleration">The acceleration force applied to adjust velocity to match that /// of the other particles.</param> public MatchVelocity(double maxDistance, double acceleration) { m_max = maxDistance; m_acc = acceleration; m_maxSq = maxDistance * maxDistance; } /// <summary> /// The maximum distance, in pixels, over which this action operates. /// The particle will match its velocity other particles that are this close or closer to it. /// </summary> public double MaxDistance { get { return m_max; } set { m_max = value; } } /// <summary> /// The acceleration force applied to adjust velocity to match that /// of the other particles. /// </summary> public double Acceleration { get { return m_acc; } set { m_acc = value; } } /// <summary> /// The getDefaultPriority method is used to order the execution of actions. /// It is called within the emitter's addAction method when the user doesn't /// manually set a priority. It need not be called directly by the user. /// /// @see org.flintparticles.common.actions.Action#getDefaultPriority() /// </summary> /// <returns><p>Returns a value of 10, so that the MutualGravity action executes before other actions.</p></returns> public override int GetDefaultPriority() { return 10; } /// <summary> /// The addedToEmitter method is called by the emitter when the Action is added to it /// It is called within the emitter's addAction method and need not /// be called by the user. /// </summary> /// <param name="emitter">The Emitter that the Action was added to.</param> public override void AddedToEmitter(Emitter emitter) { if (emitter == null) { return; } emitter.SpaceSort = true; } /// <summary> /// The update method is used by the emitter to apply the action /// to every particle. It is called within the emitter's update /// loop and need not be called by the user. 
/// </summary> /// <param name="emitter">The Emitter that created the particle.</param> /// <param name="particle">The particle to be updated.</param> /// <param name="elapsedTime">The duration of the frame - used for time based updates.</param> public override void Update(Emitter emitter, Particle particle, double elapsedTime) { if (emitter == null) { return; } if (particle == null) { return; } List<Particle> particles = emitter.Particles; Particle other; double distanceSq; int i; int len = particles.Count; double dx; double dy; double velX = 0; double velY = 0; int count = 0; double factor; for (i = particles.IndexOf(particle) - 1; i >= 0; i--) { other = particles[i]; if ((dx = particle.X - other.X) > m_max) break; dy = other.Y - particle.Y; if (dy > m_max || dy < -m_max) continue; distanceSq = dy * dy + dx * dx; if (distanceSq <= m_maxSq && distanceSq > 0) { velX += other.VelocityX; velY += other.VelocityY; count++; } } for (i = particles.IndexOf(particle) + 1; i < len; i++) { other = particles[i]; if ((dx = other.X - particle.X) > m_max) break; dy = other.Y - particle.Y; if (dy > m_max || dy < -m_max) continue; distanceSq = dy * dy + dx * dx; if (distanceSq <= m_maxSq && distanceSq > 0) { velX += other.VelocityX; velY += other.VelocityY; count++; } } if (count != 0) { velX = velX / count - particle.VelocityX; velY = velY / count - particle.VelocityY; if (velX != 0 || velY != 0) { factor = (double)(elapsedTime * m_acc / Math.Sqrt(velX * velX + velY * velY)); if (factor > 1) factor = 1; particle.VelocityX += factor * velX; particle.VelocityY += factor * velY; } } } } }
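// --- Illustrative addition (not part of the original file) ---
// The class comment above says the behaviour is registered on an emitter so
// it runs against every particle. A hypothetical registration sketch follows;
// the exact add-method name on this FlintSharp port is an assumption:
//
//   Emitter emitter = ...; // some concrete FlintSharp emitter
//   // Match velocity with neighbours within 10 px, applying acceleration 1.0
//   emitter.AddBehaviour(new MatchVelocity(10.0, 1.0));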
/**
 * Package: MAG - VistA Imaging
 * WARNING: Per VHA Directive 2004-038, this routine should not be modified.
 * Date Created: Jun 30, 2011
 * Site Name: Washington OI Field Office, Silver Spring, MD
 * Developer: VHAISWWERFEJ
 * Description:
 *
 * ;; +--------------------------------------------------------------------+
 * ;; Property of the US Government.
 * ;; No permission to copy or redistribute this software is given.
 * ;; Use of unreleased versions of this software requires the user
 * ;; to execute a written test agreement with the VistA Imaging
 * ;; Development Office of the Department of Veterans Affairs,
 * ;; telephone (301) 734-0100.
 * ;;
 * ;; The Food and Drug Administration classifies this software as
 * ;; a Class II medical device. As such, it may not be changed
 * ;; in any way. Modifications to this software may result in an
 * ;; adulterated medical device under 21CFR820, the use of which
 * ;; is considered to be a violation of US Federal Statutes.
 * ;; +--------------------------------------------------------------------+
 */
package gov.va.med;

import gov.va.med.imaging.BhieImageURN;
import gov.va.med.imaging.BhieStudyURN;
import gov.va.med.imaging.DocumentSetURN;
import gov.va.med.imaging.DocumentURN;
import gov.va.med.imaging.ImageURN;
import gov.va.med.imaging.StudyURN;

/**
 * Registers "core" URN classes for VISA implementations
 * @author VHAISWWERFEJ
 *
 */
public class CoreURNProvider extends URNProvider {

    public CoreURNProvider() {
        super();
    }

    @SuppressWarnings("unchecked")
    @Override
    protected Class<? extends URN>[] getUrnClasses() {
        return new Class[] {
            StudyURN.class,
            DocumentSetURN.class,
            ImageURN.class,
            DocumentURN.class,
            BhieImageURN.class,
            BhieStudyURN.class,
            PatientArtifactIdentifierImpl.class,
            GlobalArtifactIdentifierImpl.class,
            HealthSummaryURN.class
        };
    }
}
<?php namespace Smalot\PdfParser\Tests\Units\Element; use mageekguy\atoum; use Smalot\PdfParser\Document; use Smalot\PdfParser\Header; use Smalot\PdfParser\Page; /** * Class ElementArray * * @package Smalot\PdfParser\Tests\Units\Element */ class ElementArray extends atoum\test { public function testParse() { $document = new \Smalot\PdfParser\Document(array()); // Skipped. $offset = 0; $element = \Smalot\PdfParser\Element\ElementArray::parse('ABC', $document, $offset); $this->assert->boolean($element)->isEqualTo(false); $this->assert->integer($offset)->isEqualTo(0); $offset = 0; $element = \Smalot\PdfParser\Element\ElementArray::parse(' / [ 4 2 ] ', $document, $offset); $this->assert->boolean($element)->isEqualTo(false); $this->assert->integer($offset)->isEqualTo(0); $offset = 0; $element = \Smalot\PdfParser\Element\ElementArray::parse(' 0 [ 4 2 ] ', $document, $offset); $this->assert->boolean($element)->isEqualTo(false); $this->assert->integer($offset)->isEqualTo(0); $offset = 0; $element = \Smalot\PdfParser\Element\ElementArray::parse(" 0 \n [ 4 2 ] ", $document, $offset); $this->assert->boolean($element)->isEqualTo(false); $this->assert->integer($offset)->isEqualTo(0); // Valid. $offset = 0; $element = \Smalot\PdfParser\Element\ElementArray::parse(' [ 4 2 ] ', $document, $offset); $this->assert->boolean($element->contains(4))->isEqualTo(true); $this->assert->boolean($element->contains(2))->isEqualTo(true); $this->assert->boolean($element->contains(8))->isEqualTo(false); $this->assert->integer($offset)->isEqualTo(8); $offset = 0; $element = \Smalot\PdfParser\Element\ElementArray::parse(' [ 4 2 ]', $document, $offset); $this->assert->boolean($element->contains(4))->isEqualTo(true); $this->assert->boolean($element->contains(2))->isEqualTo(true); $this->assert->boolean($element->contains(8))->isEqualTo(false); $this->assert->integer($offset)->isEqualTo(8); $offset = 0; $element = \Smalot\PdfParser\Element\ElementArray::parse('[ 4 2 ]', $document, $offset); $this->assert->boolean($element->contains(4))->isEqualTo(true); $this->assert->boolean($element->contains(2))->isEqualTo(true); $this->assert->boolean($element->contains(8))->isEqualTo(false); $this->assert->integer($offset)->isEqualTo(7); $offset = 0; $element = \Smalot\PdfParser\Element\ElementArray::parse(" \n [ 4 2 ] ", $document, $offset); $this->assert->boolean($element->contains(4))->isEqualTo(true); $this->assert->boolean($element->contains(2))->isEqualTo(true); $this->assert->boolean($element->contains(8))->isEqualTo(false); $this->assert->integer($offset)->isEqualTo(10); } public function testGetContent() { $val_4 = new \Smalot\PdfParser\Element\ElementNumeric('4'); $val_2 = new \Smalot\PdfParser\Element\ElementNumeric('2'); $element = new \Smalot\PdfParser\Element\ElementArray(array($val_4, $val_2)); $content = $element->getContent(); $this->assert->array($content)->hasSize(2); } public function testContains() { $val_4 = new \Smalot\PdfParser\Element\ElementNumeric('4'); $val_2 = new \Smalot\PdfParser\Element\ElementNumeric('2'); $element = new \Smalot\PdfParser\Element\ElementArray(array($val_4, $val_2)); $this->assert->boolean($element->contains(2))->isEqualTo(true); $this->assert->boolean($element->contains(8))->isEqualTo(false); } public function testResolveXRef() { // Document with text. $filename = __DIR__ . 
'/../../../../../../samples/Document1_pdfcreator_nocompressed.pdf'; $parser = new \Smalot\PdfParser\Parser(); $document = $parser->parseFile($filename); $object = $document->getObjectById('3_0'); $kids = $object->get('Kids'); $this->assert->object($kids)->isInstanceOf('\Smalot\PdfParser\Element\ElementArray'); $this->assert->array($kids->getContent())->hasSize(1); $pages = $kids->getContent(); $this->assert->object(reset($pages))->isInstanceOf('\Smalot\PdfParser\Page'); } public function testGetDetails() { // // Document with text. // $filename = __DIR__ . '/../../../../../../samples/Document1_pdfcreator_nocompressed.pdf'; // $parser = new \Smalot\PdfParser\Parser(); // $document = $parser->parseFile($filename); // $object = $document->getObjectById('3_0'); // /** @var \Smalot\PdfParser\Element\ElementArray $kids */ // $kids = $object->get('Kids'); // $details = $kids->getDetails(); // // $this->assert->array($details)->hasSize(1); // $this->assert->string($details[0]['Type'])->isEqualTo('Page'); $document = new Document(); $content = '<</Type/Page/Types[8]/Sizes[1 2 3 4 5 <</Subtype/XObject>> [8 [9 <</FontSize 10>>]]]>>'; $details_reference = array( 'Type' => 'Page', 'Types' => array( 8, ), 'Sizes' => array( 1, 2, 3, 4, 5, array( 'Subtype' => 'XObject', ), array( 8, array( 9, array( 'FontSize' => 10, ), ), ), ), ); $header = Header::parse($content, $document); $details = $header->getDetails(); $this->assert->array($details)->hasSize(3); $this->assert->array($details)->isEqualTo($details_reference); } public function test__toString() { $val_4 = new \Smalot\PdfParser\Element\ElementNumeric('4'); $val_2 = new \Smalot\PdfParser\Element\ElementNumeric('2'); $element = new \Smalot\PdfParser\Element\ElementArray(array($val_4, $val_2)); $this->assert->castToString($element)->isEqualTo('4,2'); $document = new \Smalot\PdfParser\Document(array()); $element = \Smalot\PdfParser\Element\ElementArray::parse(' [ 4 2 ]', $document); $this->assert->castToString($element)->isEqualTo('4,2'); } }
package org.gradle.play.internal.run; import org.gradle.api.Action; import org.gradle.api.logging.Logging; import org.gradle.internal.UncheckedException; import org.gradle.internal.classpath.DefaultClassPath; import org.gradle.process.internal.WorkerProcessContext; import java.io.IOException; import java.io.Serializable; import java.net.MalformedURLException; import java.net.URL; import java.net.URLClassLoader; import java.net.URLConnection; import java.util.concurrent.CountDownLatch; public class PlayWorkerServer implements Action<WorkerProcessContext>, PlayRunWorkerServerProtocol, Serializable { private PlayRunSpec runSpec; private VersionedPlayRunAdapter runAdapter; private volatile CountDownLatch stop; public PlayWorkerServer(PlayRunSpec runSpec, VersionedPlayRunAdapter runAdapter) { this.runSpec = runSpec; this.runAdapter = runAdapter; } @Override public void execute(WorkerProcessContext context) { stop = new CountDownLatch(1); final PlayRunWorkerClientProtocol clientProtocol = context.getServerConnection().addOutgoing(PlayRunWorkerClientProtocol.class); context.getServerConnection().addIncoming(PlayRunWorkerServerProtocol.class, this); context.getServerConnection().connect(); final PlayAppLifecycleUpdate result = startServer(); try { clientProtocol.update(result); stop.await(); } catch (InterruptedException e) { throw UncheckedException.throwAsUncheckedException(e); } finally { clientProtocol.update(PlayAppLifecycleUpdate.stopped()); } } private PlayAppLifecycleUpdate startServer() { try { run(); return PlayAppLifecycleUpdate.running(); } catch (Exception e) { Logging.getLogger(this.getClass()).error("Failed to run Play", e); return PlayAppLifecycleUpdate.failed(e); } } private void run() { disableUrlConnectionCaching(); final Thread thread = Thread.currentThread(); final ClassLoader previousContextClassLoader = thread.getContextClassLoader(); final ClassLoader classLoader = new URLClassLoader(new DefaultClassPath(runSpec.getClasspath()).getAsURLArray(), null); thread.setContextClassLoader(classLoader); try { Object buildDocHandler = runAdapter.getBuildDocHandler(classLoader, runSpec.getClasspath()); Object buildLink = runAdapter.getBuildLink(classLoader, runSpec.getProjectPath(), runSpec.getApplicationJar(), runSpec.getChangingClasspath(), runSpec.getAssetsJar(), runSpec.getAssetsDirs()); runAdapter.runDevHttpServer(classLoader, classLoader, buildLink, buildDocHandler, runSpec.getHttpPort()); } catch (Exception e) { throw UncheckedException.throwAsUncheckedException(e); } finally { thread.setContextClassLoader(previousContextClassLoader); } } private void disableUrlConnectionCaching() { // fix problems in updating jar files by disabling default caching of URL connections. // URLConnection default caching should be disabled since it causes jar file locking issues and JVM crashes in updating jar files. // Changes to jar files won't be noticed in all cases when caching is enabled. // sun.net.www.protocol.jar.JarURLConnection leaves the JarFile instance open if URLConnection caching is enabled. 
try { URL url = new URL("jar:file://valid_jar_url_syntax.jar!/"); URLConnection urlConnection = url.openConnection(); urlConnection.setDefaultUseCaches(false); } catch (MalformedURLException e) { throw UncheckedException.throwAsUncheckedException(e); } catch (IOException e) { throw UncheckedException.throwAsUncheckedException(e); } } @Override public void stop() { stop.countDown(); } @Override public void reload() { runAdapter.reload(); } @Override public void buildError(Throwable throwable) { runAdapter.buildError(throwable); } }
require_dependency "molecular/application_controller"

module Molecular
  class CampaignsController < ApplicationController
    before_action :set_campaign, only: [:show, :edit, :update, :destroy]
    before_action :build_campaign, only: :create
    before_action :load_campaigns, only: :index
    before_action :validate_status, only: [:edit, :update]

    # GET /campaigns
    def index
      @campaigns = @campaigns.includes(:subscriptions).page(params[:page]).per(10)
    end

    # GET /campaigns/1
    def show
    end

    # GET /campaigns/new
    def new
      @campaign = Campaign.new
    end

    # GET /campaigns/1/edit
    def edit
    end

    # POST /campaigns
    def create
      if @campaign.save
        redirect_to @campaign, notice: I18n.t('flash.campaigns.created')
      else
        render :new
      end
    end

    # PATCH/PUT /campaigns/1
    def update
      submit and return if params[:submit]
      if @campaign.update(campaign_params)
        redirect_to @campaign, notice: I18n.t('flash.campaigns.updated')
      else
        render :edit
      end
    end

    # DELETE /campaigns/1
    def destroy
      @campaign.destroy
      redirect_to campaigns_url, notice: I18n.t('flash.campaigns.destroyed')
    end

    private

    def build_campaign
      @campaign = Campaign.new(campaign_params)
      @campaign.owner = molecular_owner
    end

    # Use callbacks to share common setup or constraints between actions.
    def set_campaign
      @campaign = Campaign.find(params[:id])
    end

    def load_campaigns
      @campaigns = Campaign.where(owner: molecular_owner)
    end

    def validate_status
      redirect_to @campaign, alert: I18n.t('flash.campaigns.already_sent') if @campaign.sent?
    end

    # Only allow a trusted parameter "white list" through.
    def campaign_params
      params.
        require(:campaign).
        permit(:subject, :body, :recipients_query, :from, :from_name)
    end

    def submit
      @campaign.enqueue
      redirect_to @campaign, notice: I18n.t('flash.campaigns.scheduled')
    end
  end
end
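# --- Illustrative addition (not part of the original file) ---
# Request-level sketch of the two update paths above. Route helpers assume
# the engine's standard `resources :campaigns` mapping:
#
#   # plain save: PATCH /campaigns/1 with campaign attributes
#   patch campaign_path(campaign), params: { campaign: { subject: "Hello" } }
#
#   # schedule for delivery: any update carrying a :submit param short-circuits
#   # into #submit, which enqueues the campaign instead of editing it
#   patch campaign_path(campaign), params: { submit: "Send" }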
#ifndef HALE_AI_RANDOM_H
#define HALE_AI_RANDOM_H

#include <stdint.h>

#include "state.h"

/*
 * This AI is random: every move is randomized from the set of valid options,
 * with no strategy whatsoever.
 */

// See player.h for an explanation of these functions
extern const PlayerActions_t randomActions;

uint8_t randomPlayTile(GameState_t* gs, uint8_t playerNum);
chain_t randomFormChain(GameState_t* gs, uint8_t playerNum);
chain_t randomMergerSurvivor(GameState_t* gs, uint8_t playerNum, uint8_t* options);
void randomMergerOrder(GameState_t* gs, uint8_t playerNum, chain_t survivor, uint8_t* options);
void randomBuyStock(GameState_t* gs, uint8_t playerNum, uint8_t* toBuy);
void randomMergerTrade(GameState_t* gs, uint8_t playerNum, chain_t survivor, chain_t defunct, uint8_t* tradeFor, uint8_t* sell);
uint8_t randomEndGame(GameState_t* gs, uint8_t playerNum);

#endif
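/*
 * --- Illustrative addition (not part of the original header) ---
 * Calling-convention sketch, grounded only in the declarations above; the
 * GameState_t contents and the meaning of the options arrays are defined in
 * state.h and player.h:
 *
 *   GameState_t gs;               // assumed to be initialized elsewhere
 *   uint8_t playerNum = 0;
 *   uint8_t tile = randomPlayTile(&gs, playerNum);  // pick a random legal tile
 *   chain_t c = randomFormChain(&gs, playerNum);    // pick a random chain to form
 */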
#ifndef MUPDF_FITZ_DOCUMENT_H #define MUPDF_FITZ_DOCUMENT_H #include "mupdf/fitz/system.h" #include "mupdf/fitz/context.h" #include "mupdf/fitz/math.h" #include "mupdf/fitz/device.h" #include "mupdf/fitz/transition.h" #include "mupdf/fitz/link.h" #include "mupdf/fitz/outline.h" /* Document interface */ typedef struct fz_document_s fz_document; typedef struct fz_page_s fz_page; typedef struct fz_annot_s fz_annot; // TODO: move out of this interface (it's pdf specific) typedef struct fz_write_options_s fz_write_options; struct fz_document_s { void (*close)(fz_document *); int (*needs_password)(fz_document *doc); int (*authenticate_password)(fz_document *doc, char *password); fz_outline *(*load_outline)(fz_document *doc); int (*count_pages)(fz_document *doc); fz_page *(*load_page)(fz_document *doc, int number); fz_link *(*load_links)(fz_document *doc, fz_page *page); fz_rect *(*bound_page)(fz_document *doc, fz_page *page, fz_rect *); void (*run_page_contents)(fz_document *doc, fz_page *page, fz_device *dev, const fz_matrix *transform, fz_cookie *cookie); void (*run_annot)(fz_document *doc, fz_page *page, fz_annot *annot, fz_device *dev, const fz_matrix *transform, fz_cookie *cookie); void (*free_page)(fz_document *doc, fz_page *page); int (*meta)(fz_document *doc, int key, void *ptr, int size); fz_transition *(*page_presentation)(fz_document *doc, fz_page *page, float *duration); fz_annot *(*first_annot)(fz_document *doc, fz_page *page); fz_annot *(*next_annot)(fz_document *doc, fz_annot *annot); fz_rect *(*bound_annot)(fz_document *doc, fz_annot *annot, fz_rect *rect); void (*write)(fz_document *doc, char *filename, fz_write_options *opts); }; /* fz_open_document: Open a PDF, XPS or CBZ document. Open a document file and read its basic structure so pages and objects can be located. MuPDF will try to repair broken documents (without actually changing the file contents). The returned fz_document is used when calling most other document related functions. Note that it wraps the context, so those functions implicitly can access the global state in context. filename: a path to a file as it would be given to open(2). */ fz_document *fz_open_document(fz_context *ctx, const char *filename); /* fz_open_document_with_stream: Open a PDF, XPS or CBZ document. Open a document using the specified stream object rather than opening a file on disk. magic: a string used to detect document type; either a file name or mime-type. */ fz_document *fz_open_document_with_stream(fz_context *ctx, const char *magic, fz_stream *stream); /* fz_close_document: Close and free an open document. The resource store in the context associated with fz_document is emptied, and any allocations for the document are freed. Does not throw exceptions. */ void fz_close_document(fz_document *doc); /* fz_needs_password: Check if a document is encrypted with a non-blank password. Does not throw exceptions. */ int fz_needs_password(fz_document *doc); /* fz_authenticate_password: Test if the given password can decrypt the document. password: The password string to be checked. Some document specifications do not specify any particular text encoding, so neither do we. Does not throw exceptions. */ int fz_authenticate_password(fz_document *doc, char *password); /* fz_load_outline: Load the hierarchical document outline. Should be freed by fz_free_outline. */ fz_outline *fz_load_outline(fz_document *doc); /* fz_count_pages: Return the number of pages in document May return 0 for documents with no pages. 
*/ int fz_count_pages(fz_document *doc); /* fz_load_page: Load a page. After fz_load_page is it possible to retrieve the size of the page using fz_bound_page, or to render the page using fz_run_page_*. Free the page by calling fz_free_page. number: page number, 0 is the first page of the document. */ fz_page *fz_load_page(fz_document *doc, int number); /* fz_load_links: Load the list of links for a page. Returns a linked list of all the links on the page, each with its clickable region and link destination. Each link is reference counted so drop and free the list of links by calling fz_drop_link on the pointer return from fz_load_links. page: Page obtained from fz_load_page. */ fz_link *fz_load_links(fz_document *doc, fz_page *page); /* fz_bound_page: Determine the size of a page at 72 dpi. Does not throw exceptions. */ fz_rect *fz_bound_page(fz_document *doc, fz_page *page, fz_rect *rect); /* fz_run_page: Run a page through a device. page: Page obtained from fz_load_page. dev: Device obtained from fz_new_*_device. transform: Transform to apply to page. May include for example scaling and rotation, see fz_scale, fz_rotate and fz_concat. Set to fz_identity if no transformation is desired. cookie: Communication mechanism between caller and library rendering the page. Intended for multi-threaded applications, while single-threaded applications set cookie to NULL. The caller may abort an ongoing rendering of a page. Cookie also communicates progress information back to the caller. The fields inside cookie are continually updated while the page is rendering. */ void fz_run_page(fz_document *doc, fz_page *page, fz_device *dev, const fz_matrix *transform, fz_cookie *cookie); /* fz_run_page_contents: Run a page through a device. Just the main page content, without the annotations, if any. page: Page obtained from fz_load_page. dev: Device obtained from fz_new_*_device. transform: Transform to apply to page. May include for example scaling and rotation, see fz_scale, fz_rotate and fz_concat. Set to fz_identity if no transformation is desired. cookie: Communication mechanism between caller and library rendering the page. Intended for multi-threaded applications, while single-threaded applications set cookie to NULL. The caller may abort an ongoing rendering of a page. Cookie also communicates progress information back to the caller. The fields inside cookie are continually updated while the page is rendering. */ void fz_run_page_contents(fz_document *doc, fz_page *page, fz_device *dev, const fz_matrix *transform, fz_cookie *cookie); /* fz_run_annot: Run an annotation through a device. page: Page obtained from fz_load_page. annot: an annotation. dev: Device obtained from fz_new_*_device. transform: Transform to apply to page. May include for example scaling and rotation, see fz_scale, fz_rotate and fz_concat. Set to fz_identity if no transformation is desired. cookie: Communication mechanism between caller and library rendering the page. Intended for multi-threaded applications, while single-threaded applications set cookie to NULL. The caller may abort an ongoing rendering of a page. Cookie also communicates progress information back to the caller. The fields inside cookie are continually updated while the page is rendering. */ void fz_run_annot(fz_document *doc, fz_page *page, fz_annot *annot, fz_device *dev, const fz_matrix *transform, fz_cookie *cookie); /* fz_free_page: Free a loaded page. Does not throw exceptions. 
*/ void fz_free_page(fz_document *doc, fz_page *page); /* fz_page_presentation: Get the presentation details for a given page. duration: NULL, or a pointer to a place to set the page duration in seconds. (Will be set to 0 if unspecified). Returns: a pointer to a transition structure, or NULL if there isn't one. Does not throw exceptions. */ fz_transition *fz_page_presentation(fz_document *doc, fz_page *page, float *duration); #endif
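/*
 * --- Illustrative addition (not part of the original header) ---
 * Minimal end-to-end sketch of the document API declared above, written
 * against the MuPDF 1.x C API; fz_new_context and fz_free_context come from
 * mupdf/fitz/context.h rather than this header:
 *
 *   fz_context *ctx = fz_new_context(NULL, NULL, FZ_STORE_UNLIMITED);
 *   fz_document *doc = fz_open_document(ctx, "input.pdf");
 *   int i, n = fz_count_pages(doc);
 *   for (i = 0; i < n; i++)
 *   {
 *       fz_page *page = fz_load_page(doc, i);
 *       fz_rect bounds;
 *       fz_bound_page(doc, page, &bounds);  // page size at 72 dpi
 *       // ... render with fz_run_page(doc, page, dev, &fz_identity, NULL) ...
 *       fz_free_page(doc, page);
 *   }
 *   fz_close_document(doc);
 *   fz_free_context(ctx);
 */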
module Trainmaster
  class UserMailer < ApplicationMailer

    def email_verification(user)
      @user = user
      mail(to: @user.username, subject: "[trainmaster] Email Confirmation")
    end

    def password_reset(user)
      @user = user
      mail(to: @user.username, subject: "[trainmaster] Password Reset")
    end

  end
end
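# --- Illustrative addition (not part of the original file) ---
# Standard ActionMailer invocation for the mailer above; note that this app
# stores the email address in `username`, so that is what lands in `to:`:
#
#   Trainmaster::UserMailer.email_verification(user).deliver_later
#   Trainmaster::UserMailer.password_reset(user).deliver_now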
SYNONYM

#### According to
The Catalogue of Life, 3rd January 2011

#### Published in
null

#### Original name
null

### Remarks
null
@interface MMBarricadeDispatchTests : XCTestCase @end @implementation MMBarricadeDispatchTests - (void)testCannotCreateDispatchWithNilResponseStore { MMBarricadeDispatch *dispatch = [[MMBarricadeDispatch alloc] initWithResponseStore:nil]; XCTAssertNil(dispatch); } #pragma mark - Basic vending of responses for requests - (void)testRequestWithNoDispatchResponsesReturnsNil { MMBarricadeDispatch *dispatch = [self dispatchWithInMemoryStore]; NSURLRequest *request = [self movieListRequest]; id<MMBarricadeResponse> response = [dispatch responseForRequest:request]; XCTAssertNil(response); } - (void)testRequestWithoutRegisteredResponseReturnsNil { MMBarricadeDispatch *dispatch = [self dispatchWithInMemoryStore]; [dispatch registerResponseSet:[self movieListResponseSet]]; NSURLRequest *request = [self nonRegisteredRequest]; id<MMBarricadeResponse> response = [dispatch responseForRequest:request]; XCTAssertNil(response); } - (void)testCurrentResponseDefaultsToDefaultResponse { MMBarricadeDispatch *dispatch = [self dispatchWithInMemoryStore]; [dispatch registerResponseSet:[self movieListResponseSet]]; NSURLRequest *request = [self movieListRequest]; id<MMBarricadeResponse> response = [dispatch responseForRequest:request]; XCTAssertEqualObjects(response.name, @"Success"); } #pragma mark Unregistration - (void)testCanUnregisterResponseSets { MMBarricadeDispatch *dispatch = [self dispatchWithInMemoryStore]; MMBarricadeResponseSet *movieSet = [self movieListResponseSet]; MMBarricadeResponseSet *loginSet = [self loginResponseSet]; [dispatch registerResponseSet:movieSet]; [dispatch registerResponseSet:loginSet]; NSURLRequest *movieRequest = [self movieListRequest]; id<MMBarricadeResponse> response = [dispatch responseForRequest:movieRequest]; XCTAssertNotNil(response); [dispatch unregisterResponseSet:movieSet]; response = [dispatch responseForRequest:movieRequest]; XCTAssertNil(response); response = [dispatch responseForRequest:[self loginRequest]]; XCTAssertNotNil(response); } #pragma mark Current Response - (void)testCurrentResponseCanBeUpdated { MMBarricadeDispatch *dispatch = [self dispatchWithInMemoryStore]; [dispatch registerResponseSet:[self movieListResponseSet]]; [dispatch selectResponseforRequest:@"Movie List" withResponseName:@"Service Unavailable"]; NSURLRequest *request = [self movieListRequest]; id<MMBarricadeResponse> response = [dispatch responseForRequest:request]; XCTAssertEqualObjects(response.name, @"Service Unavailable"); } - (void)testCannotSetCurrentResponseToAResponseNotInTheSet { MMBarricadeDispatch *dispatch = [self dispatchWithInMemoryStore]; [dispatch registerResponseSet:[self movieListResponseSet]]; [dispatch selectResponseforRequest:@"Movie List" withResponseName:@"Nonexistent Response"]; NSURLRequest *request = [self movieListRequest]; id<MMBarricadeResponse> response = [dispatch responseForRequest:request]; XCTAssertEqualObjects(response.name, @"Success"); } - (void)testAbilityToResetUpdatedResponses { MMBarricadeDispatch *dispatch = [self dispatchWithInMemoryStore]; [dispatch registerResponseSet:[self movieListResponseSet]]; [dispatch selectResponseforRequest:@"Movie List" withResponseName:@"Service Unavailable"]; NSURLRequest *request = [self movieListRequest]; id<MMBarricadeResponse> response = [dispatch responseForRequest:request]; XCTAssertEqualObjects(response.name, @"Service Unavailable"); [dispatch resetResponseSelections]; response = [dispatch responseForRequest:request]; XCTAssertEqualObjects(response.name, @"Success"); } #pragma mark - Private - 
(MMBarricadeDispatch *)dispatchWithInMemoryStore { id<MMBarricadeResponseStore> responseStore = [[MMBarricadeInMemoryResponseStore alloc] init]; MMBarricadeDispatch *dispatch = [[MMBarricadeDispatch alloc] initWithResponseStore:responseStore]; return dispatch; } - (MMBarricadeResponseSet *)movieListResponseSet { id successfulResponse = OCMProtocolMock(@protocol(MMBarricadeResponse)); OCMStub([successfulResponse name]).andReturn(@"Success"); id failureResponse = OCMProtocolMock(@protocol(MMBarricadeResponse)); OCMStub([failureResponse name]).andReturn(@"Service Unavailable"); MMBarricadeResponseSet *responseSet = [[MMBarricadeResponseSet alloc] initWithRequestName:@"Movie List" respondsToRequest:^BOOL(NSURLRequest *request, NSURLComponents *components) { return [components.path isEqualToString:@"/movies"]; }]; [responseSet addResponse:successfulResponse]; [responseSet addResponse:failureResponse]; return responseSet; } - (MMBarricadeResponseSet *)loginResponseSet { id successfulResponse = OCMProtocolMock(@protocol(MMBarricadeResponse)); OCMStub([successfulResponse name]).andReturn(@"Success"); id failureResponse = OCMProtocolMock(@protocol(MMBarricadeResponse)); OCMStub([failureResponse name]).andReturn(@"Service Unavailable"); MMBarricadeResponseSet *responseSet = [[MMBarricadeResponseSet alloc] initWithRequestName:@"Login" respondsToRequest:^BOOL(NSURLRequest *request, NSURLComponents *components) { return [components.path isEqualToString:@"/login"]; }]; [responseSet addResponse:successfulResponse]; [responseSet addResponse:failureResponse]; return responseSet; } - (NSURLRequest *)movieListRequest { NSURLRequest *request = [[NSURLRequest alloc] initWithURL:[NSURL URLWithString:@"http://example.com/movies"]]; return request; } - (NSURLRequest *)loginRequest { NSURLRequest *request = [[NSURLRequest alloc] initWithURL:[NSURL URLWithString:@"http://example.com/login"]]; return request; } - (NSURLRequest *)nonRegisteredRequest { NSURLRequest *request = [[NSURLRequest alloc] initWithURL:[NSURL URLWithString:@"http://example.com"]]; return request; } @end
{ "content_hash": "970a13d8858440b94055f54c9e3c1d55", "timestamp": "", "source": "github", "line_count": 163, "max_line_length": 185, "avg_line_length": 37.03680981595092, "alnum_prop": 0.7669372204737452, "repo_name": "ALHariPrasad/MMBarricade", "id": "33b933073cc5695436b53fc108baf09662498c5f", "size": "7376", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "Barricade/Core/MMBarricadeDispatchTests.m", "mode": "33188", "license": "mit", "language": [ { "name": "C", "bytes": "973" }, { "name": "C++", "bytes": "410" }, { "name": "Objective-C", "bytes": "404800" }, { "name": "Ruby", "bytes": "1080" }, { "name": "Shell", "bytes": "8894" } ], "symlink_target": "" }
.. _bigip_virtual_server: bigip_virtual_server - Manages F5 BIG-IP LTM virtual servers ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ .. versionadded:: 2.1 .. contents:: :local: :depth: 1 Synopsis -------- Manages F5 BIG-IP LTM virtual servers via iControl SOAP API Requirements (on host that executes module) ------------------------------------------- * bigsuds Options ------- .. raw:: html <table border=1 cellpadding=4> <tr> <th class="head">parameter</th> <th class="head">required</th> <th class="head">default</th> <th class="head">choices</th> <th class="head">comments</th> </tr> <tr> <td>all_profiles<br/><div style="font-size: small;"></div></td> <td>no</td> <td>None</td> <td><ul></ul></td> <td><div>List of all Profiles (HTTP, ClientSSL, ServerSSL, etc) that must be used by the virtual server</div></td></tr> <tr> <td>default_persistence_profile<br/><div style="font-size: small;"></div></td> <td>no</td> <td>None</td> <td><ul></ul></td> <td><div>Default Profile which manages the session persistence</div></td></tr> <tr> <td>description<br/><div style="font-size: small;"></div></td> <td>no</td> <td>None</td> <td><ul></ul></td> <td><div>Virtual server description.</div></td></tr> <tr> <td>destination<br/><div style="font-size: small;"></div></td> <td>yes</td> <td></td> <td><ul></ul></td> <td><div>Destination IP of the virtual server (only host is currently supported). Required when state=present and vs does not exist."</div></br> <div style="font-size: small;">aliases: address, ip<div></td></tr> <tr> <td>name<br/><div style="font-size: small;"></div></td> <td>yes</td> <td></td> <td><ul></ul></td> <td><div>Virtual server name.</div></br> <div style="font-size: small;">aliases: vs<div></td></tr> <tr> <td>partition<br/><div style="font-size: small;"></div></td> <td>no</td> <td>Common</td> <td><ul></ul></td> <td><div>Partition</div></td></tr> <tr> <td>password<br/><div style="font-size: small;"></div></td> <td>yes</td> <td></td> <td><ul></ul></td> <td><div>BIG-IP password</div></td></tr> <tr> <td>pool<br/><div style="font-size: small;"></div></td> <td>no</td> <td>None</td> <td><ul></ul></td> <td><div>Default pool for the virtual server</div></td></tr> <tr> <td>port<br/><div style="font-size: small;"></div></td> <td>no</td> <td>None</td> <td><ul></ul></td> <td><div>Port of the virtual server . Required when state=present and vs does not exist</div></td></tr> <tr> <td>server<br/><div style="font-size: small;"></div></td> <td>yes</td> <td></td> <td><ul></ul></td> <td><div>BIG-IP host</div></td></tr> <tr> <td>snat<br/><div style="font-size: small;"></div></td> <td>no</td> <td>None</td> <td><ul></ul></td> <td><div>Source network address policy</div></td></tr> <tr> <td>state<br/><div style="font-size: small;"></div></td> <td>no</td> <td>present</td> <td><ul><li>absent</li><li>disabled</li><li>enabled</li><li>present</li></ul></td> <td><div>Virtual Server state. If <code>absent</code>, delete the VS if <code>present</code>, or <code>enabled</code>, create the VS if needed and set state to enabled. If <code>disabled</code>, create the VS if needed and set state to disabled.</div></td></tr> <tr> <td>user<br/><div style="font-size: small;"></div></td> <td>yes</td> <td></td> <td><ul></ul></td> <td><div>BIG-IP username</div></td></tr> <tr> <td>validate_certs<br/><div style="font-size: small;"></div></td> <td>no</td> <td>yes</td> <td><ul><li>True</li><li>False</li></ul></td> <td><div>If <code>no</code>, SSL certificates will not be validated. 
This should only be used on personally controlled sites using self-signed certificates.</div></td></tr> </table> </br> Examples -------- :: - name: Add VS local_action: module: bigip_virtual_server server: lb.mydomain.net user: admin password: secret state: present partition: MyPartition name: myvirtualserver destination: "{{ ansible_default_ipv4['address'] }}" port: 443 pool: "{{ mypool }}" snat: Automap description: Test Virtual Server all_profiles: - http - clientssl - name: Modify Port of the Virtual Server local_action: module: bigip_virtual_server server: lb.mydomain.net user: admin password: secret state: present partition: MyPartition name: myvirtualserver port: 8080 - name: Delete pool local_action: module: bigip_virtual_server server: lb.mydomain.net user: admin password: secret state: absent partition: MyPartition name: myvirtualserver Return Values ------------- Common return values are documented here :doc:`common_return_values`, the following are the fields unique to this module: .. raw:: html <table border=1 cellpadding=4> <tr> <th class="head">name</th> <th class="head">description</th> <th class="head">returned</th> <th class="head">type</th> <th class="head">sample</th> </tr> <tr> <td> deleted </td> <td> Name of a virtual server that was deleted </td> <td align=center> virtual server was successfully deleted on state=absent </td> <td align=center> string </td> <td align=center> </td> </tr> </table> </br></br> Notes ----- .. note:: Requires BIG-IP software version >= 11 .. note:: F5 developed module 'bigsuds' required (see http://devcentral.f5.com) .. note:: Best run as a local_action in your playbook This is an Extras Module ------------------------ For more information on what this means please read :doc:`modules_extra` For help in developing on modules, should you be so inclined, please read :doc:`community`, :doc:`developing_test_pr` and :doc:`developing_modules`.
{ "content_hash": "e43e9efe2c6cfc3b8505ecca1d894101", "timestamp": "", "source": "github", "line_count": 218, "max_line_length": 269, "avg_line_length": 30.009174311926607, "alnum_prop": 0.5417303576887802, "repo_name": "buzzsurfr/f5-ansible", "id": "5dea98bdbc44543e121866ee273255c67e67f856", "size": "6542", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "docs/modules/bigip_virtual_server_module.rst", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Makefile", "bytes": "2693" }, { "name": "Python", "bytes": "518917" }, { "name": "Shell", "bytes": "1371" }, { "name": "Tcl", "bytes": "2122" } ], "symlink_target": "" }
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd"> <!-- NewPage --> <html lang="en"> <head> <title>Uses of Class org.apache.poi.xslf.model.geom.Guide (POI API Documentation)</title> <link rel="stylesheet" type="text/css" href="../../../../../../../stylesheet.css" title="Style"> </head> <body> <script type="text/javascript"><!-- if (location.href.indexOf('is-external=true') == -1) { parent.document.title="Uses of Class org.apache.poi.xslf.model.geom.Guide (POI API Documentation)"; } //--> </script> <noscript> <div>JavaScript is disabled on your browser.</div> </noscript> <!-- ========= START OF TOP NAVBAR ======= --> <div class="topNav"><a name="navbar_top"> <!-- --> </a><a href="#skip-navbar_top" title="Skip navigation links"></a><a name="navbar_top_firstrow"> <!-- --> </a> <ul class="navList" title="Navigation"> <li><a href="../../../../../../../overview-summary.html">Overview</a></li> <li><a href="../package-summary.html">Package</a></li> <li><a href="../../../../../../../org/apache/poi/xslf/model/geom/Guide.html" title="class in org.apache.poi.xslf.model.geom">Class</a></li> <li class="navBarCell1Rev">Use</li> <li><a href="../package-tree.html">Tree</a></li> <li><a href="../../../../../../../deprecated-list.html">Deprecated</a></li> <li><a href="../../../../../../../index-all.html">Index</a></li> <li><a href="../../../../../../../help-doc.html">Help</a></li> </ul> </div> <div class="subNav"> <ul class="navList"> <li>Prev</li> <li>Next</li> </ul> <ul class="navList"> <li><a href="../../../../../../../index.html?org/apache/poi/xslf/model/geom/class-use/Guide.html" target="_top">Frames</a></li> <li><a href="Guide.html" target="_top">No Frames</a></li> </ul> <ul class="navList" id="allclasses_navbar_top"> <li><a href="../../../../../../../allclasses-noframe.html">All Classes</a></li> </ul> <div> <script type="text/javascript"><!-- allClassesLink = document.getElementById("allclasses_navbar_top"); if(window==top) { allClassesLink.style.display = "block"; } else { allClassesLink.style.display = "none"; } //--> </script> </div> <a name="skip-navbar_top"> <!-- --> </a></div> <!-- ========= END OF TOP NAVBAR ========= --> <div class="header"> <h2 title="Uses of Class org.apache.poi.xslf.model.geom.Guide" class="title">Uses of Class<br>org.apache.poi.xslf.model.geom.Guide</h2> </div> <div class="classUseContainer"> <ul class="blockList"> <li class="blockList"> <table border="0" cellpadding="3" cellspacing="0" summary="Use table, listing packages, and an explanation"> <caption><span>Packages that use <a href="../../../../../../../org/apache/poi/xslf/model/geom/Guide.html" title="class in org.apache.poi.xslf.model.geom">Guide</a></span><span class="tabEnd">&nbsp;</span></caption> <tr> <th class="colFirst" scope="col">Package</th> <th class="colLast" scope="col">Description</th> </tr> <tbody> <tr class="altColor"> <td class="colFirst"><a href="#org.apache.poi.xslf.model.geom">org.apache.poi.xslf.model.geom</a></td> <td class="colLast">&nbsp;</td> </tr> </tbody> </table> </li> <li class="blockList"> <ul class="blockList"> <li class="blockList"><a name="org.apache.poi.xslf.model.geom"> <!-- --> </a> <h3>Uses of <a href="../../../../../../../org/apache/poi/xslf/model/geom/Guide.html" title="class in org.apache.poi.xslf.model.geom">Guide</a> in <a href="../../../../../../../org/apache/poi/xslf/model/geom/package-summary.html">org.apache.poi.xslf.model.geom</a></h3> <table border="0" cellpadding="3" cellspacing="0" summary="Use table, listing 
subclasses, and an explanation"> <caption><span>Subclasses of <a href="../../../../../../../org/apache/poi/xslf/model/geom/Guide.html" title="class in org.apache.poi.xslf.model.geom">Guide</a> in <a href="../../../../../../../org/apache/poi/xslf/model/geom/package-summary.html">org.apache.poi.xslf.model.geom</a></span><span class="tabEnd">&nbsp;</span></caption> <tr> <th class="colFirst" scope="col">Modifier and Type</th> <th class="colLast" scope="col">Class and Description</th> </tr> <tbody> <tr class="altColor"> <td class="colFirst"><code>class&nbsp;</code></td> <td class="colLast"><code><strong><a href="../../../../../../../org/apache/poi/xslf/model/geom/AdjustValue.html" title="class in org.apache.poi.xslf.model.geom">AdjustValue</a></strong></code> <div class="block">Represents a shape adjust values (see section 20.1.9.5 in the spec)</div> </td> </tr> </tbody> </table> <table border="0" cellpadding="3" cellspacing="0" summary="Use table, listing methods, and an explanation"> <caption><span>Methods in <a href="../../../../../../../org/apache/poi/xslf/model/geom/package-summary.html">org.apache.poi.xslf.model.geom</a> that return <a href="../../../../../../../org/apache/poi/xslf/model/geom/Guide.html" title="class in org.apache.poi.xslf.model.geom">Guide</a></span><span class="tabEnd">&nbsp;</span></caption> <tr> <th class="colFirst" scope="col">Modifier and Type</th> <th class="colLast" scope="col">Method and Description</th> </tr> <tbody> <tr class="altColor"> <td class="colFirst"><code><a href="../../../../../../../org/apache/poi/xslf/model/geom/Guide.html" title="class in org.apache.poi.xslf.model.geom">Guide</a></code></td> <td class="colLast"><span class="strong">IAdjustableShape.</span><code><strong><a href="../../../../../../../org/apache/poi/xslf/model/geom/IAdjustableShape.html#getAdjustValue(java.lang.String)">getAdjustValue</a></strong>(java.lang.String&nbsp;name)</code>&nbsp;</td> </tr> <tr class="rowColor"> <td class="colFirst"><code><a href="../../../../../../../org/apache/poi/xslf/model/geom/Guide.html" title="class in org.apache.poi.xslf.model.geom">Guide</a></code></td> <td class="colLast"><span class="strong">Context.</span><code><strong><a href="../../../../../../../org/apache/poi/xslf/model/geom/Context.html#getAdjustValue(java.lang.String)">getAdjustValue</a></strong>(java.lang.String&nbsp;name)</code>&nbsp;</td> </tr> </tbody> </table> </li> </ul> </li> </ul> </div> <!-- ======= START OF BOTTOM NAVBAR ====== --> <div class="bottomNav"><a name="navbar_bottom"> <!-- --> </a><a href="#skip-navbar_bottom" title="Skip navigation links"></a><a name="navbar_bottom_firstrow"> <!-- --> </a> <ul class="navList" title="Navigation"> <li><a href="../../../../../../../overview-summary.html">Overview</a></li> <li><a href="../package-summary.html">Package</a></li> <li><a href="../../../../../../../org/apache/poi/xslf/model/geom/Guide.html" title="class in org.apache.poi.xslf.model.geom">Class</a></li> <li class="navBarCell1Rev">Use</li> <li><a href="../package-tree.html">Tree</a></li> <li><a href="../../../../../../../deprecated-list.html">Deprecated</a></li> <li><a href="../../../../../../../index-all.html">Index</a></li> <li><a href="../../../../../../../help-doc.html">Help</a></li> </ul> </div> <div class="subNav"> <ul class="navList"> <li>Prev</li> <li>Next</li> </ul> <ul class="navList"> <li><a href="../../../../../../../index.html?org/apache/poi/xslf/model/geom/class-use/Guide.html" target="_top">Frames</a></li> <li><a href="Guide.html" target="_top">No Frames</a></li> 
</ul> <ul class="navList" id="allclasses_navbar_bottom"> <li><a href="../../../../../../../allclasses-noframe.html">All Classes</a></li> </ul> <div> <script type="text/javascript"><!-- allClassesLink = document.getElementById("allclasses_navbar_bottom"); if(window==top) { allClassesLink.style.display = "block"; } else { allClassesLink.style.display = "none"; } //--> </script> </div> <a name="skip-navbar_bottom"> <!-- --> </a></div> <!-- ======== END OF BOTTOM NAVBAR ======= --> <p class="legalCopy"><small> <i>Copyright 2015 The Apache Software Foundation or its licensors, as applicable.</i> </small></p> </body> </html>
{ "content_hash": "ad1873288ced980c9279e0bd69fc5f17", "timestamp": "", "source": "github", "line_count": 176, "max_line_length": 337, "avg_line_length": 45.46022727272727, "alnum_prop": 0.6100487439070116, "repo_name": "lunheur/JiraSorting", "id": "1bd5223415b8fabed5cb8aedc06895ae0f2d961c", "size": "8001", "binary": false, "copies": "2", "ref": "refs/heads/master", "path": "JiraSorting/lib/poi-3.12/docs/apidocs/org/apache/poi/xslf/model/geom/class-use/Guide.html", "mode": "33188", "license": "mit", "language": [ { "name": "CSS", "bytes": "26462" }, { "name": "HTML", "bytes": "78171707" }, { "name": "Java", "bytes": "9424" } ], "symlink_target": "" }
<div class="push"></div> <footer> <aside class="wrap"> <ol class="prev-posts"> <p class="list-title">Recent Posts</p> {% for post in site.posts | limit:3 %} <!-- for1 --> <li> <span class="recent-title"><a href="{{ site.url }}{{ post.url }}" title="{{ post.title }}">{{ post.title | strip_html | strip_newlines | truncate: 30 }} </a></span> <span class="date">{{post.date | date: "%b %d, %Y" }}</span> </li> {% endfor %} </ol> <div class="social"> <ul> {% if site.owner.email%} <li><a id="mail" href="mailto:{{ site.owner.email }}"><span class="foot-link">Contact Me</span></a></li> {% endif %} {% if site.owner.twitter %} <li><a id="twit" href="http://twitter.com/{{ site.owner.twitter }}" target="_blank"><span class="foot-link">@{{ site.owner.twitter }}</span></a></li> {% endif %} {% if site.owner.dribbble %} <li><a id="drib" href="http://dribbble.com/{{ site.owner.dribbble }}" target="_blank"><span class="foot-link">Dribbble</span></a></li> {% endif %} </ul> </div> </aside> <small>&copy; {{ site.time | date: '%Y' }} {{ site.owner.name }}. Powered by <a href="http://jekyllrb.com">Jekyll</a> using the <a href="http://jekyll.gtat.me/about">Balzac</a> theme.</small> </footer> <!-- If they're out, get some from the cellar --> <script>window.jQuery || document.write('<script src="{{ site.url }}/assets/js/vendor/jquery-1.9.1.min.js"><\/script>')</script> <script src="{{ site.url }}/assets/js/retina.min.js"></script> <script> (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){ (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o), m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m) })(window,document,'script','//www.google-analytics.com/analytics.js','ga'); ga('create', 'UA-28090801-2', 'auto'); ga('send', 'pageview'); </script> <!-- Custom JS --> <script src="{{ site.url }}/assets/js/scripts.js"></script> </body> </html>
{ "content_hash": "9374ddf470f6c4f08393b2b3e3d5a3b2", "timestamp": "", "source": "github", "line_count": 52, "max_line_length": 193, "avg_line_length": 40.46153846153846, "alnum_prop": 0.5670152091254753, "repo_name": "mrmagooey/mrmagooey.github.io", "id": "aa80a28f851a68d00eadb6c71cd17a952f87dcdb", "size": "2105", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "_includes/footer.html", "mode": "33261", "license": "mit", "language": [ { "name": "CSS", "bytes": "36609" }, { "name": "HTML", "bytes": "22413" }, { "name": "JavaScript", "bytes": "8965" }, { "name": "Ruby", "bytes": "92" } ], "symlink_target": "" }
from pygw.config import geowave_pkg
from ..statistic import FieldStatistic
from ..statistic_type import FieldStatisticType
from ...base import GeoWaveObject
from ...base.java_transformer import JavaTransformer


class NumericHistogramStatistic(FieldStatistic):
    """
    Dynamic histograms provide very high accuracy for CDF and quantile queries over a numeric attribute.
    """
    STATS_TYPE = FieldStatisticType(geowave_pkg.core.store.statistics.field.NumericHistogramStatistic.STATS_TYPE)

    def __init__(self, type_name=None, field_name=None, compression=100, java_ref=None):
        if java_ref is None:
            if type_name is None and field_name is None:
                java_ref = geowave_pkg.core.store.statistics.field.NumericHistogramStatistic()
            else:
                java_ref = geowave_pkg.core.store.statistics.field.NumericHistogramStatistic(
                    type_name, field_name, float(compression))
        super().__init__(java_ref, NumericHistogramTransformer())

    def set_compression(self, compression):
        self._java_ref.setCompression(float(compression))

    def get_compression(self):
        return self._java_ref.getCompression()


class NumericHistogramTransformer(JavaTransformer):
    def transform(self, j_object):
        return NumericHistogram(j_object)


class NumericHistogram(GeoWaveObject):
    def quantile(self, value):
        return self._java_ref.quantile(float(value))

    def cdf(self, value):
        return self._java_ref.cdf(float(value))

    def sum(self, value, inclusive=True):
        return self._java_ref.sum(float(value), inclusive)

    def get_min_value(self):
        return self._java_ref.getMinValue()

    def get_max_value(self):
        return self._java_ref.getMaxValue()

    def get_total_count(self):
        return self._java_ref.getTotalCount()
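A minimal usage sketch for the class above, assuming a configured pygw/GeoWave environment with a running JVM gateway and the module path pygw.statistics.field.numeric_histogram_statistic; the type and field names are hypothetical placeholders, and obtaining a populated NumericHistogram goes through a GeoWave data store, which is out of scope here.

# Sketch only: requires a configured pygw/GeoWave environment.
# "my_type" and "temperature" are hypothetical names.
from pygw.statistics.field.numeric_histogram_statistic import NumericHistogramStatistic

# Declare a histogram statistic over a numeric field; the compression
# parameter controls the accuracy/size trade-off of the underlying sketch.
stat = NumericHistogramStatistic(type_name="my_type",
                                 field_name="temperature",
                                 compression=200)
print(stat.get_compression())  # 200.0

# Compression can also be tuned after construction.
stat.set_compression(100)

# After the statistic is registered with a data store and data is ingested,
# the store yields a NumericHistogram whose quantile()/cdf()/sum() methods
# answer distribution queries, e.g. histogram.quantile(0.5) for the median.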
{ "content_hash": "6572198dd3e1630ea1257f582fd92200", "timestamp": "", "source": "github", "line_count": 52, "max_line_length": 113, "avg_line_length": 35.48076923076923, "alnum_prop": 0.6959349593495935, "repo_name": "ngageoint/geowave", "id": "b71c831ac9a37eddb243b039829fbb84d4f69f62", "size": "2364", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "python/src/main/python/pygw/statistics/field/numeric_histogram_statistic.py", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "C++", "bytes": "5073" }, { "name": "CMake", "bytes": "2032" }, { "name": "FreeMarker", "bytes": "2879" }, { "name": "Gnuplot", "bytes": "57750" }, { "name": "Groovy", "bytes": "1323" }, { "name": "HTML", "bytes": "1903" }, { "name": "Java", "bytes": "7103026" }, { "name": "Protocol Buffer", "bytes": "1525" }, { "name": "Puppet", "bytes": "4039" }, { "name": "Scala", "bytes": "26507" }, { "name": "Scheme", "bytes": "20491" }, { "name": "Shell", "bytes": "68381" } ], "symlink_target": "" }
/**
 * The Hamming distance between two integers is the number of positions at which the corresponding bits are different.
 *
 * Given two integers x and y, calculate the Hamming distance.
 *
 * Note:
 * 0 ≤ x, y < 2^31.
 *
 * Example:
 *
 * Input: x = 1, y = 4
 *
 * Output: 2
 *
 * Explanation:
 * 1   (0 0 0 1)
 * 4   (0 1 0 0)
 *        ↑   ↑
 *
 * The above arrows point to positions where the corresponding bits are different.
 */

/**
 * @param {number} x
 * @param {number} y
 * @return {number}
 */
var hammingDistance = function(x, y) {
  return bitCount(x ^ y);

  // Brian Kernighan's bit-counting trick: num &= (num - 1) clears the lowest
  // set bit, so the loop runs once per set bit.
  // http://www.cnblogs.com/graphics/archive/2010/06/21/1752421.html
  // https://en.wikipedia.org/wiki/Hamming_distance
  function bitCount(num) {
    var c = 0;
    for (; num; ++c) {
      num &= (num - 1);
    }
    return c;
  }
};
{ "content_hash": "9eee4fa2b2e5e6e7ba5ea5b4dfe56664", "timestamp": "", "source": "github", "line_count": 44, "max_line_length": 118, "avg_line_length": 20.022727272727273, "alnum_prop": 0.5448354143019296, "repo_name": "paper/LeetCode", "id": "e9f50de73cfa66dfbe9f9433367513389c55031f", "size": "887", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "Algorithms/461_Hamming_Distance.js", "mode": "33188", "license": "mit", "language": [ { "name": "JavaScript", "bytes": "356834" } ], "symlink_target": "" }
// Copyright 2013 The Chromium Authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. package org.chromium.chrome.browser.infobar; import org.chromium.base.CalledByNative; import org.chromium.chrome.browser.ResourceId; /** * Provides JNI methods for ConfirmInfoBars */ public class ConfirmInfoBarDelegate { private ConfirmInfoBarDelegate() { } @CalledByNative public static ConfirmInfoBarDelegate create() { return new ConfirmInfoBarDelegate(); } /** * Creates and begins the process for showing a ConfirmInfoBar. * @param nativeInfoBar Pointer to the C++ InfoBar corresponding to the Java InfoBar. * @param enumeratedIconId ID corresponding to the icon that will be shown for the InfoBar. * The ID must have been mapped using the ResourceMapper class before * passing it to this function. * @param message Message to display to the user indicating what the InfoBar is for. * @param buttonOk String to display on the OK button. * @param buttonCancel String to display on the Cancel button. */ @CalledByNative InfoBar showConfirmInfoBar(int nativeInfoBar, int enumeratedIconId, String message, String buttonOk, String buttonCancel) { int drawableId = ResourceId.mapToDrawableId(enumeratedIconId); // Apparently, yellow was the popular choice at the time these InfoBars were implemented // because they stuck out more (hence the BACKGROUND_TYPE_WARNING) default. ConfirmInfoBar infoBar = new ConfirmInfoBar(nativeInfoBar, null, InfoBar.BACKGROUND_TYPE_WARNING, drawableId, message, buttonOk, buttonCancel); return infoBar; } }
{ "content_hash": "1a8ef8b8b717e13f77cb5236bdfc562e", "timestamp": "", "source": "github", "line_count": 44, "max_line_length": 97, "avg_line_length": 41.29545454545455, "alnum_prop": 0.7066593285635663, "repo_name": "mogoweb/chromium-crosswalk", "id": "6ef6f4e29310d6004120970e499e6ca0dfab119b", "size": "1817", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "chrome/android/java/src/org/chromium/chrome/browser/infobar/ConfirmInfoBarDelegate.java", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "ASP", "bytes": "853" }, { "name": "AppleScript", "bytes": "6973" }, { "name": "Arduino", "bytes": "464" }, { "name": "Assembly", "bytes": "54831" }, { "name": "Awk", "bytes": "8660" }, { "name": "C", "bytes": "40940503" }, { "name": "C#", "bytes": "1132" }, { "name": "C++", "bytes": "182703853" }, { "name": "CSS", "bytes": "799795" }, { "name": "DOT", "bytes": "1873" }, { "name": "Java", "bytes": "4807735" }, { "name": "JavaScript", "bytes": "20714038" }, { "name": "Mercury", "bytes": "10299" }, { "name": "Objective-C", "bytes": "985558" }, { "name": "Objective-C++", "bytes": "6205987" }, { "name": "PHP", "bytes": "97817" }, { "name": "Perl", "bytes": "1213389" }, { "name": "Python", "bytes": "9735121" }, { "name": "Rebol", "bytes": "262" }, { "name": "Shell", "bytes": "1305641" }, { "name": "Tcl", "bytes": "277091" }, { "name": "TypeScript", "bytes": "1560024" }, { "name": "XSLT", "bytes": "13493" }, { "name": "nesC", "bytes": "14650" } ], "symlink_target": "" }
'use strict';

var articles = require('../controllers/articles');

// Article authorization helpers
var hasAuthorization = function(req, res, next) {
  if (!req.user.isAdmin && req.article.user.id !== req.user.id) {
    return res.send(401, 'User is not authorized');
  }
  next();
};

module.exports = function(Articles, app, auth) {

  app.route('/articles')
    .get(articles.all)
    .post(auth.requiresLogin, articles.create);

  app.route('/articles/:articleId')
    .get(articles.show)
    .put(auth.requiresLogin, hasAuthorization, articles.update)
    .put(auth.requiresLogin, hasAuthorization, articles.book)
    .delete(auth.requiresLogin, hasAuthorization, articles.destroy);

  // Finish with setting up the articleId param
  app.param('articleId', articles.article);
};
{ "content_hash": "7eefe90b8ede4fecd2b2e6b2623bdb9b", "timestamp": "", "source": "github", "line_count": 26, "max_line_length": 72, "avg_line_length": 31.615384615384617, "alnum_prop": 0.670316301703163, "repo_name": "vepere/NeloTW", "id": "9f6ea38228d0d9775ea427184ec9b55948bab456", "size": "822", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "packages/articles/server/routes/articles.js", "mode": "33188", "license": "mit", "language": [ { "name": "CSS", "bytes": "33704" }, { "name": "JavaScript", "bytes": "87269" } ], "symlink_target": "" }
#ifndef OC_CLOUD_RESOURCE_TYPES_H
#define OC_CLOUD_RESOURCE_TYPES_H

#ifdef __cplusplus
extern "C" {
#endif

#define OIC_RSRC_TYPE_SEC_CLOUDCONF "oic.r.coapcloudconf"
#define OIC_RSRC_CLOUDCONF_URI "/CoapCloudConfResURI"
#define OIC_JSON_CLOUDCONF_NAME "CoAPCloudConf"
#define OIC_RSRC_CLOUDCONF_TOKEN_REFRESH "/oic/sec/tokenrefresh"
#define OC_RSRVD_ACCESS_TOKENTYPE "tokentype"

/**
 * Perform cleanup for Cloud resources.
 *
 * @return ::OC_STACK_OK for Success, otherwise some error value.
 */
OCStackResult DeInitCloudResource();

/**
 * Initialize Cloud resource by loading data from persistent storage.
 *
 * @return ::OC_STACK_OK for Success, otherwise some error value.
 */
OCStackResult InitCloudResource();

/**
 * Create the '/oic/cloudconf' resource.
 */
OCStackResult CreateCloudResource();

/**
 * Update the persistent storage.
 */
bool UpdateCloudPersistentStorage();

/**
 * Sign out from cloud accounts.
 */
void CloudsSignOut();

/**
 * Sign out of/delete cloud accounts and release cloud entries.
 */
void ResetClouds();

/**
 * Delete cloud accounts and release cloud entries.
 */
void DeleteCloudAccount();

#ifdef __cplusplus
}
#endif

#endif // OC_CLOUD_RESOURCE_TYPES_H
{ "content_hash": "8a558b6ba342add494bfa0d8d001c76b", "timestamp": "", "source": "github", "line_count": 59, "max_line_length": 71, "avg_line_length": 21.440677966101696, "alnum_prop": 0.7027667984189724, "repo_name": "rzr/iotivity", "id": "5a444068d9bd13309bca2a1e4b75ad2d8bb6d48d", "size": "2032", "binary": false, "copies": "2", "ref": "refs/heads/master", "path": "resource/csdk/security/provisioning/include/cloud/cloudresource.h", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Batchfile", "bytes": "12408" }, { "name": "C", "bytes": "8157549" }, { "name": "C++", "bytes": "13386094" }, { "name": "CSS", "bytes": "1069" }, { "name": "Dockerfile", "bytes": "8372" }, { "name": "HTML", "bytes": "1714" }, { "name": "Java", "bytes": "6702288" }, { "name": "JavaScript", "bytes": "167465" }, { "name": "Less", "bytes": "39" }, { "name": "M4", "bytes": "2816" }, { "name": "Makefile", "bytes": "52136" }, { "name": "Python", "bytes": "1469227" }, { "name": "RAML", "bytes": "1535" }, { "name": "Roff", "bytes": "2808" }, { "name": "Shell", "bytes": "200101" } ], "symlink_target": "" }
/** * The `Matter.SAT` module contains methods for detecting collisions using the Separating Axis Theorem. * * @class SAT */ // TODO: true circles and curves var SAT = {}; module.exports = SAT; var Vertices = require('../geometry/Vertices'); var Vector = require('../geometry/Vector'); (function() { /** * Detect collision between two bodies using the Separating Axis Theorem. * @method collides * @param {body} bodyA * @param {body} bodyB * @param {collision} previousCollision * @return {collision} collision */ SAT.collides = function(bodyA, bodyB, previousCollision) { var overlapAB, overlapBA, minOverlap, collision, prevCol = previousCollision, canReusePrevCol = false; if (prevCol) { // estimate total motion var parentA = bodyA.parent, parentB = bodyB.parent, motion = parentA.speed * parentA.speed + parentA.angularSpeed * parentA.angularSpeed + parentB.speed * parentB.speed + parentB.angularSpeed * parentB.angularSpeed; // we may be able to (partially) reuse collision result // but only safe if collision was resting canReusePrevCol = prevCol && prevCol.collided && motion < 0.2; // reuse collision object collision = prevCol; } else { collision = { collided: false, bodyA: bodyA, bodyB: bodyB }; } if (prevCol && canReusePrevCol) { // if we can reuse the collision result // we only need to test the previously found axis var axisBodyA = collision.axisBody, axisBodyB = axisBodyA === bodyA ? bodyB : bodyA, axes = [axisBodyA.axes[prevCol.axisNumber]]; minOverlap = _overlapAxes(axisBodyA.vertices, axisBodyB.vertices, axes); collision.reused = true; if (minOverlap.overlap <= 0) { collision.collided = false; return collision; } } else { // if we can't reuse a result, perform a full SAT test overlapAB = _overlapAxes(bodyA.vertices, bodyB.vertices, bodyA.axes); if (overlapAB.overlap <= 0) { collision.collided = false; return collision; } overlapBA = _overlapAxes(bodyB.vertices, bodyA.vertices, bodyB.axes); if (overlapBA.overlap <= 0) { collision.collided = false; return collision; } if (overlapAB.overlap < overlapBA.overlap) { minOverlap = overlapAB; collision.axisBody = bodyA; } else { minOverlap = overlapBA; collision.axisBody = bodyB; } // important for reuse later collision.axisNumber = minOverlap.axisNumber; } collision.bodyA = bodyA.id < bodyB.id ? bodyA : bodyB; collision.bodyB = bodyA.id < bodyB.id ? 
bodyB : bodyA; collision.collided = true; collision.normal = minOverlap.axis; collision.depth = minOverlap.overlap; collision.parentA = collision.bodyA.parent; collision.parentB = collision.bodyB.parent; bodyA = collision.bodyA; bodyB = collision.bodyB; // ensure normal is facing away from bodyA if (Vector.dot(collision.normal, Vector.sub(bodyB.position, bodyA.position)) > 0) collision.normal = Vector.neg(collision.normal); collision.tangent = Vector.perp(collision.normal); collision.penetration = { x: collision.normal.x * collision.depth, y: collision.normal.y * collision.depth }; // find support points, there is always either exactly one or two var verticesB = _findSupports(bodyA, bodyB, collision.normal), supports = collision.supports || []; supports.length = 0; // find the supports from bodyB that are inside bodyA if (Vertices.contains(bodyA.vertices, verticesB[0])) supports.push(verticesB[0]); if (Vertices.contains(bodyA.vertices, verticesB[1])) supports.push(verticesB[1]); // find the supports from bodyA that are inside bodyB if (supports.length < 2) { var verticesA = _findSupports(bodyB, bodyA, Vector.neg(collision.normal)); if (Vertices.contains(bodyB.vertices, verticesA[0])) supports.push(verticesA[0]); if (supports.length < 2 && Vertices.contains(bodyB.vertices, verticesA[1])) supports.push(verticesA[1]); } // account for the edge case of overlapping but no vertex containment if (supports.length < 1) supports = [verticesB[0]]; collision.supports = supports; return collision; }; /** * Find the overlap between two sets of vertices. * @method _overlapAxes * @private * @param {} verticesA * @param {} verticesB * @param {} axes * @return result */ var _overlapAxes = function(verticesA, verticesB, axes) { var projectionA = Vector._temp[0], projectionB = Vector._temp[1], result = { overlap: Number.MAX_VALUE }, overlap, axis; for (var i = 0; i < axes.length; i++) { axis = axes[i]; _projectToAxis(projectionA, verticesA, axis); _projectToAxis(projectionB, verticesB, axis); overlap = Math.min(projectionA.max - projectionB.min, projectionB.max - projectionA.min); if (overlap <= 0) { result.overlap = overlap; return result; } if (overlap < result.overlap) { result.overlap = overlap; result.axis = axis; result.axisNumber = i; } } return result; }; /** * Projects vertices on an axis and returns an interval. * @method _projectToAxis * @private * @param {} projection * @param {} vertices * @param {} axis */ var _projectToAxis = function(projection, vertices, axis) { var min = Vector.dot(vertices[0], axis), max = min; for (var i = 1; i < vertices.length; i += 1) { var dot = Vector.dot(vertices[i], axis); if (dot > max) { max = dot; } else if (dot < min) { min = dot; } } projection.min = min; projection.max = max; }; /** * Finds supporting vertices given two bodies along a given direction using hill-climbing. 
* @method _findSupports * @private * @param {} bodyA * @param {} bodyB * @param {} normal * @return [vector] */ var _findSupports = function(bodyA, bodyB, normal) { var nearestDistance = Number.MAX_VALUE, vertexToBody = Vector._temp[0], vertices = bodyB.vertices, bodyAPosition = bodyA.position, distance, vertex, vertexA, vertexB; // find closest vertex on bodyB for (var i = 0; i < vertices.length; i++) { vertex = vertices[i]; vertexToBody.x = vertex.x - bodyAPosition.x; vertexToBody.y = vertex.y - bodyAPosition.y; distance = -Vector.dot(normal, vertexToBody); if (distance < nearestDistance) { nearestDistance = distance; vertexA = vertex; } } // find next closest vertex using the two connected to it var prevIndex = vertexA.index - 1 >= 0 ? vertexA.index - 1 : vertices.length - 1; vertex = vertices[prevIndex]; vertexToBody.x = vertex.x - bodyAPosition.x; vertexToBody.y = vertex.y - bodyAPosition.y; nearestDistance = -Vector.dot(normal, vertexToBody); vertexB = vertex; var nextIndex = (vertexA.index + 1) % vertices.length; vertex = vertices[nextIndex]; vertexToBody.x = vertex.x - bodyAPosition.x; vertexToBody.y = vertex.y - bodyAPosition.y; distance = -Vector.dot(normal, vertexToBody); if (distance < nearestDistance) { vertexB = vertex; } return [vertexA, vertexB]; }; })();
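The _projectToAxis and _overlapAxes helpers above are the heart of the algorithm; the following short Python sketch (an illustration of the same idea, not Matter.js code) shows the full projection-and-overlap test for two convex polygons, including the early exit when a separating axis is found.

# A self-contained Python sketch of the core SAT overlap test used above.
# Illustration only -- not Matter.js code.
import math

def edge_normals(vertices):
    """Unit-length perpendiculars of each edge: the SAT axes of a convex polygon."""
    axes = []
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        ex, ey = x2 - x1, y2 - y1
        length = math.hypot(ex, ey)
        axes.append((-ey / length, ex / length))
    return axes

def project(vertices, axis):
    """Project vertices onto an axis; return the (min, max) interval."""
    dots = [x * axis[0] + y * axis[1] for x, y in vertices]
    return min(dots), max(dots)

def sat_overlap(vertices_a, vertices_b):
    """Return the minimum interval overlap over all axes, or None if separated."""
    min_overlap = None
    for axis in edge_normals(vertices_a) + edge_normals(vertices_b):
        min_a, max_a = project(vertices_a, axis)
        min_b, max_b = project(vertices_b, axis)
        overlap = min(max_a - min_b, max_b - min_a)
        if overlap <= 0:
            return None  # separating axis found: the polygons do not collide
        if min_overlap is None or overlap < min_overlap:
            min_overlap = overlap
    return min_overlap

# Two unit squares, the second shifted 0.5 along x: minimum overlap is 0.5.
a = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
b = [(0.5, 0.0), (1.5, 0.0), (1.5, 1.0), (0.5, 1.0)]
print(sat_overlap(a, b))  # 0.5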
{ "content_hash": "d25a61ba4a5de90d717147abf2034a2e", "timestamp": "", "source": "github", "line_count": 265, "max_line_length": 102, "avg_line_length": 32.33584905660377, "alnum_prop": 0.5525732290815731, "repo_name": "ECLabs/UpBoard", "id": "54374ccf1df3a5d9f8cbe1805281fc0c770e183e", "size": "8569", "binary": false, "copies": "6", "ref": "refs/heads/master", "path": "bower_components/Matter/src/collision/SAT.js", "mode": "33188", "license": "mit", "language": [ { "name": "ApacheConf", "bytes": "24139" }, { "name": "CSS", "bytes": "16937" }, { "name": "HTML", "bytes": "28417" }, { "name": "JavaScript", "bytes": "200850" } ], "symlink_target": "" }
declare namespace Bokeh {
  interface ViewOptions<ModelType extends Model> {
    model: ModelType;
    id?: string;
    el?: HTMLElement | JQuery;
  }

  class BokehView<ModelType extends Model> {
    constructor(options?: ViewOptions<ModelType>);
    model: ModelType;
    id: string;
    el: HTMLElement;
    $el: JQuery;
    render(): this;
    remove(): this;
  }

  type View<ModelType extends Model> = BokehView<ModelType>;
}
{ "content_hash": "626dbe243415baa49e975628bdb5020f", "timestamp": "", "source": "github", "line_count": 21, "max_line_length": 60, "avg_line_length": 20.476190476190474, "alnum_prop": 0.6651162790697674, "repo_name": "azjps/bokeh", "id": "c25a5de7e8e30bec3dadaba3bfbed5e2b4400346", "size": "430", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "bokehjs/src/coffee/api/typings/views.d.ts", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "Batchfile", "bytes": "1710" }, { "name": "CSS", "bytes": "92582" }, { "name": "CoffeeScript", "bytes": "1051340" }, { "name": "HTML", "bytes": "46812" }, { "name": "JavaScript", "bytes": "34439" }, { "name": "Jupyter Notebook", "bytes": "3981" }, { "name": "Makefile", "bytes": "1164" }, { "name": "Python", "bytes": "2152481" }, { "name": "Shell", "bytes": "13140" }, { "name": "TypeScript", "bytes": "87868" } ], "symlink_target": "" }
var misc = {
  // Execute code in global scope by injecting a <script> element into the
  // document head, then removing it immediately after it runs.
  // `D` is the library's document alias, defined elsewhere in MicroQuery.
  globalEval: function(c) {
    var s = D.createElement('script');
    s.text = c;
    D.head.appendChild(s).parentNode.removeChild(s);
  }
};
{ "content_hash": "ba6481f81ffe2723c0c8c5f5b61ce55f", "timestamp": "", "source": "github", "line_count": 7, "max_line_length": 51, "avg_line_length": 20.571428571428573, "alnum_prop": 0.6597222222222222, "repo_name": "frinknet/MicroQuery", "id": "9ec507ab4540cbe66bbd1f5e59dc316a7d82a748", "size": "144", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "src/misc.js", "mode": "33188", "license": "mit", "language": [ { "name": "HTML", "bytes": "1984" }, { "name": "JavaScript", "bytes": "65918" }, { "name": "Makefile", "bytes": "519" }, { "name": "Shell", "bytes": "1848" } ], "symlink_target": "" }
package com.sun.corba.se.spi.activation.RepositoryPackage; /** * com/sun/corba/se/spi/activation/RepositoryPackage/StringSeqHolder.java . * Generated by the IDL-to-Java compiler (portable), version "3.2" * from c:/re/workspace/8-2-build-windows-amd64-cygwin/jdk8u112/7884/corba/src/share/classes/com/sun/corba/se/spi/activation/activation.idl * Thursday, September 22, 2016 9:33:08 PM PDT */ public final class StringSeqHolder implements org.omg.CORBA.portable.Streamable { public String value[] = null; public StringSeqHolder () { } public StringSeqHolder (String[] initialValue) { value = initialValue; } public void _read (org.omg.CORBA.portable.InputStream i) { value = StringSeqHelper.read (i); } public void _write (org.omg.CORBA.portable.OutputStream o) { StringSeqHelper.write (o, value); } public org.omg.CORBA.TypeCode _type () { return StringSeqHelper.type (); } }
{ "content_hash": "05fa1d705d6212ec95cd90803db97792", "timestamp": "", "source": "github", "line_count": 39, "max_line_length": 138, "avg_line_length": 23.923076923076923, "alnum_prop": 0.7181136120042872, "repo_name": "wapalxj/Java", "id": "ed49af3831653f2695e4984122576faadaf23e3b", "size": "933", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "javaworkplace/source/src/com/sun/corba/se/spi/activation/RepositoryPackage/StringSeqHolder.java", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "C", "bytes": "1" }, { "name": "C++", "bytes": "1" }, { "name": "HTML", "bytes": "200117" }, { "name": "Java", "bytes": "687071" }, { "name": "JavaScript", "bytes": "238567" }, { "name": "PHP", "bytes": "2789730" } ], "symlink_target": "" }
> Unless you want to scramble at the last minute, I mean, it's up to you. ![](https://johnsblotsandthoughts.files.wordpress.com/2012/09/neil1.png) --- ![](http://ecx.images-amazon.com/images/I/41wgksZup2L.jpg) ## [CareerCup | Cracking the Coding Interview](http://www.amazon.com/Cracking-Coding-Interview-Programming-Questions/dp/098478280X) > Now in the 5th edition, Cracking the Coding Interview gives you the interview preparation you need to get the top software developer jobs. This is a deeply technical book and focuses on the software engineering skills to ace your interview. The book is over 500 pages and includes 150 programming interview questions and answers, as well as other advice. --- ![CODE2040 Applicant Playbook](https://huacm.files.wordpress.com/2015/04/applicantplaybook.png) ## [CODE2040 | Applicant Playbook](http://playbook.code2040.org/) > This online resource offers advice, activities and materials to prepare you for an amazing internship in the tech industry. The Playbook will help you prioritize tasks based on where you are in the application process. --- ![](https://huacm.files.wordpress.com/2015/04/developer-questions-github.png) ## [Front-End Developer Job Interview Questions | GitHub](http://h5bp.github.io/Front-end-Developer-Interview-Questions/) > A list of questions you can use to help interview potential candidates for a front-end development position. --- ![](http://employers.glassdoor.com/app/uploads/2013/05/glassdoor_logo_500.png) ## [Glassdoor](http://www.glassdoor.com/index.htm) > Glassdoor helps you find a job and company you love. Reviews, salaries and benefits from employees. Interview questions from candidates. Millions of jobs. --- ![](https://huacm.files.wordpress.com/2015/03/internshipscom.png) ## [Internships.com | Student Resources](http://www.internships.com/student/resources) - Basics > What's the difference between an internship and any other job? - Preparation > It's time to get smart about your options. - Search > Tips and tricks on finding your next gig. - Interview > Learn how to prepare for the big day. - Workplace > What to do once you're in the door. - Paperwork > Resume samples, cover letter ideas, and more. --- ![](https://d2o2wpn1drmies.cloudfront.net/avatars/25c/011/b47/0aa/4f9/3b9/eb0/967/1ee/3c6/13/InternMatch-interns-logo.medium.png?1426090185) ## [Looksharp InternMatch | Resources](https://www.internmatch.com/guides/student) > Packed with a abundance of resources, InternMatch has your back when it comes to getting a job. **Resources** - [Mastering the Interview eBook and Resume/Cover Letter/Recommendation Templates](https://www.internmatch.com/documents/student) - [Internship Guides](https://www.internmatch.com/guides/student) - [Industry Guides](https://www.internmatch.com/guides/industry) - [Other fun perks](https://www.internmatch.com/perks) --- ![](https://huacm.files.wordpress.com/2015/03/cs-internship-recruiting-guide.png) ## [Medium | CS Internship Recruiting Guide](https://medium.com/@qrazhan/cs-internship-recruiting-guide-aebb68912808) > This guide is primarily meant for freshmen or sophomores, but should be useful to anyone who is new to the tech industry.
{ "content_hash": "e9d9fd78404fc60dc619662d7cc8c1bd", "timestamp": "", "source": "github", "line_count": 77, "max_line_length": 356, "avg_line_length": 41.714285714285715, "alnum_prop": 0.7671232876712328, "repo_name": "fvcproductions/GitBook", "id": "843c22b8f83cfa6c68a911a7473f50c7bf5337cb", "size": "3241", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "career/prepare.md", "mode": "33188", "license": "mit", "language": [], "symlink_target": "" }
package org.apache.sysml.test.integration.functions.misc; import org.junit.runner.RunWith; import org.junit.runners.Suite; /** Group together the tests in this package into a single suite so that the Maven build * won't run two of them at once. */ @RunWith(Suite.class) @Suite.SuiteClasses({ ConditionalValidateTest.class, DataTypeCastingTest.class, DataTypeChangeTest.class, FunctionInliningTest.class, FunctionNamespaceTest.class, IfTest.class, InvalidFunctionAssignmentTest.class, InvalidFunctionSignatureTest.class, IPAConstantFoldingScalarVariablePropagationTest.class, IPALiteralReplacementTest.class, IPANnzPropagationTest.class, IPAScalarRecursionTest.class, IPAScalarVariablePropagationTest.class, IPAUnknownRecursionTest.class, LongOverflowTest.class, NegativeLoopIncrementsTest.class, NrowNcolStringTest.class, NrowNcolUnknownCSVReadTest.class, OuterTableExpandTest.class, PrintExpressionTest.class, PrintMatrixTest.class, ReadAfterWriteTest.class, RewriteBinaryMV2OuterTest.class, RewriteCSETransposeScalarTest.class, RewriteCTableToRExpandTest.class, RewriteElementwiseMultChainOptimizationTest.class, RewriteEliminateAggregatesTest.class, RewriteFuseBinaryOpChainTest.class, RewriteFusedRandTest.class, RewriteLoopVectorization.class, RewriteMatrixMultChainOptTest.class, RewriteMergeBlocksTest.class, RewritePushdownSumBinaryMult.class, RewritePushdownSumOnBinaryTest.class, RewritePushdownUaggTest.class, RewriteSimplifyRowColSumMVMultTest.class, RewriteSlicedMatrixMultTest.class, ScalarAssignmentTest.class, ScalarFunctionTest.class, ScalarMatrixUnaryBinaryTermTest.class, ScalarToMatrixInLoopTest.class, SetWorkingDirTest.class, ToStringTest.class, ValueTypeAutoCastingTest.class, ValueTypeCastingTest.class, ValueTypeMatrixScalarBuiltinTest.class, }) /** This class is just a holder for the above JUnit annotations. */ public class ZPackageSuite { }
{ "content_hash": "3ecf26ed0500d9514f92352f66017cb5", "timestamp": "", "source": "github", "line_count": 64, "max_line_length": 88, "avg_line_length": 30.078125, "alnum_prop": 0.850909090909091, "repo_name": "asurve/arvind-sysml2", "id": "52f71f87624fee8b08ed163a3cc271f88d8c214d", "size": "2734", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "src/test_suites/java/org/apache/sysml/test/integration/functions/misc/ZPackageSuite.java", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "ANTLR", "bytes": "31285" }, { "name": "Batchfile", "bytes": "23989" }, { "name": "C", "bytes": "8676" }, { "name": "C++", "bytes": "30804" }, { "name": "CMake", "bytes": "10312" }, { "name": "Cuda", "bytes": "35333" }, { "name": "Java", "bytes": "13146562" }, { "name": "Jupyter Notebook", "bytes": "107236" }, { "name": "Makefile", "bytes": "2459" }, { "name": "Protocol Buffer", "bytes": "78944" }, { "name": "Python", "bytes": "317740" }, { "name": "R", "bytes": "711665" }, { "name": "Scala", "bytes": "211968" }, { "name": "Shell", "bytes": "156697" } ], "symlink_target": "" }
<html><head>
<meta http-equiv="content-type" content="text/html; charset=UTF-8"><title>IsScreenActive</title></head>
<body bgcolor="#EFF1F0" link="#3A3966" vlink="#000000" alink="#000000">
<font face="Verdana, sans-serif" size="2"><p align="center"><b><font size="4">IsScreenActive()</font></b></p>
<p><b>Syntax</b></p><blockquote>
Result = <font color="#3A3966"><b>IsScreenActive</b></font>()</blockquote>
<b>Description</b><br><blockquote>
Games and full-screen applications using PureBasic functions run under a multitasking environment. This means the user can switch back from full screen to the normal desktop. This change can be detected with this function, and appropriate actions should be taken, such as <a href="../mouse/releasemouse.html">ReleaseMouse()</a>, pausing the game, stopping the sounds, etc. This function must be called after <a href="flipbuffers.html">FlipBuffers()</a>, as the events are handled inside <a href="flipbuffers.html">FlipBuffers()</a>.
</blockquote><p><b>Parameters</b></p><blockquote>
None.
</blockquote><p><b>Return value</b></p><blockquote>
Nonzero if the screen is still active, zero otherwise.
</blockquote><p><b>See also</b></p><blockquote>
<a href="openscreen.html">OpenScreen()</a>, <a href="openwindowedscreen.html">OpenWindowedScreen()</a>, <a href="../mouse/releasemouse.html">ReleaseMouse()</a>, <a href="flipbuffers.html">FlipBuffers()</a>
</blockquote><p><b>Supported OS</b><blockquote>All</blockquote></p><center>&lt;- <a href=flipbuffers.html>FlipBuffers()</a> - <a href="index.html">Screen Index</a> - <a href="nextscreenmode.html">NextScreenMode()</a> -&gt;<br><br>
</body></html>
{ "content_hash": "d2821e17e106b020c8c78ac0bf168c71", "timestamp": "", "source": "github", "line_count": 40, "max_line_length": 225, "avg_line_length": 40.9, "alnum_prop": 0.7011002444987775, "repo_name": "PureBasicCN/PureBasicPreference", "id": "cde7499db101e2a569988b50fb71d0b3aff4125d", "size": "1678", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "target_dir/documentation/screen/isscreenactive.html", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "HTML", "bytes": "13068252" } ], "symlink_target": "" }
import os
import sys

os.system('clear')

port = raw_input('\nEnter a valid port: ')

if port == '':
    print '\nYou did not enter anything.\n\n'
    sys.exit(1)

try:
    val = int(port)
except ValueError:
    print '\nThat is not a number.\n\n'
    sys.exit(1)

# Valid TCP/UDP ports are 1-65535; range() excludes its stop value,
# so use 65536 to include port 65535.
if val not in range(1, 65536):
    print '\nThat is an invalid port.\n\n'
else:
    print '\nThat is a valid port.\n\n'
{ "content_hash": "c93ba2190b692d9b0bfcaae7bc2dc02a", "timestamp": "", "source": "github", "line_count": 22, "max_line_length": 46, "avg_line_length": 18.181818181818183, "alnum_prop": 0.61, "repo_name": "EricSB/discover", "id": "63f5259b947e837dc33663a7367e4131f9118c67", "size": "423", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "misc/python/ex1.py", "mode": "33261", "license": "mit", "language": [ { "name": "CSS", "bytes": "1459" }, { "name": "HTML", "bytes": "95458" }, { "name": "Python", "bytes": "66134" }, { "name": "Shell", "bytes": "258670" } ], "symlink_target": "" }
@import XCTest; #import "Kiwi.h" #import "NSString+DPAdditions.h" SPEC_BEGIN(DPFoundationAdditionsSpecs) describe(@"Foundation Additions", ^{ describe(@"NSString", ^{ __block NSString *testString; __block NSString *blankString; __block NSString *emptyString; beforeEach(^{ testString = @" foo bar "; blankString = @" "; emptyString = @""; }); it(@"should detect blank and empty strings", ^{ [[@([testString dp_isBlank]) should] beNo]; [[@([testString dp_isEmpty]) should] beNo]; [[@([blankString dp_isBlank]) should] beYes]; [[@([blankString dp_isEmpty]) should] beNo]; [[@([emptyString dp_isBlank]) should] beYes]; [[@([emptyString dp_isEmpty]) should] beYes]; }); it(@"should know the length if trimmed, without actually trimming", ^{ [[@([testString dp_lengthWhenTrimmed]) should] equal:@7]; [[testString should] equal:@" foo bar "]; }); it(@"should trim", ^{ [[[testString dp_trim] should] equal:@"foo bar"]; }); }); }); SPEC_END
{ "content_hash": "180f77e56064741f6668f04e636b6ed7", "timestamp": "", "source": "github", "line_count": 44, "max_line_length": 78, "avg_line_length": 27.181818181818183, "alnum_prop": 0.5301003344481605, "repo_name": "DuneParkSoftware/DPAdditions", "id": "5ca23a074bf255836aa78c9c8fade8ecb3ffdb2c", "size": "1363", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "DPAdditionsDemoTests/DPFoundationAdditionsSpecs.m", "mode": "33188", "license": "mit", "language": [ { "name": "Objective-C", "bytes": "49518" }, { "name": "Ruby", "bytes": "1535" } ], "symlink_target": "" }
package toktok import ( "math/rand" "strings" "sync" "time" "github.com/xrash/smetrics" ) // Bucket tracks all the generated tokens and lets you create new, unique tokens. type Bucket struct { length uint runes []rune tokens map[string]struct{} tries []uint64 sync.RWMutex } // NewBucket returns a new bucket, which will contain tokens of tokenLength. func NewBucket(tokenLength uint) (Bucket, error) { // Problematic chars: // - A and 4 // - B and 8 // - G and 6 // - I and 1 // - O and Q, 0 // - Q and O, 0 // - S and 5 // - U and V // - Z and 2, 7 return NewBucketWithRunes(tokenLength, "ABCDEFHJKLMNPRSTUWXY369") } // NewBucketWithRunes returns a new bucket and lets you define which runes will // be used for token generation. func NewBucketWithRunes(tokenLength uint, runes string) (Bucket, error) { runes = strings.ToUpper(runes) for _, r := range runes { if strings.Count(runes, string(r)) > 1 { return Bucket{}, ErrDupeRunes } } if tokenLength < 2 { return Bucket{}, ErrTokenLengthTooSmall } if len(runes) < 4 { return Bucket{}, ErrTooFewRunes } return Bucket{ length: tokenLength, runes: []rune(runes), tokens: make(map[string]struct{}), }, nil } // LoadTokens adds previously generated tokens to the Bucket. func (bucket *Bucket) LoadTokens(tokens []string) { bucket.Lock() defer bucket.Unlock() for _, v := range tokens { bucket.tokens[v] = struct{}{} } } // NewToken returns a new token with a minimal safety distance to all other // existing tokens. func (bucket *Bucket) NewToken(distance int) (string, error) { if distance < 1 { return "", ErrDistanceTooSmall } c, i, err := bucket.generate(distance) if err != nil { return "", err } bucket.Lock() defer bucket.Unlock() bucket.tokens[c] = struct{}{} bucket.tries = append(bucket.tries, uint64(i)) if len(bucket.tries) > 5000 { bucket.tries = bucket.tries[1:] } return c, nil } // Resolve tries to find the matching original token for a potentially corrupted // token. func (bucket *Bucket) Resolve(code string) (string, int) { distance := 65536 bucket.RLock() defer bucket.RUnlock() code = strings.ToUpper(code) // try to find a perfect match first _, ok := bucket.tokens[code] if ok { return code, 0 } var t string // find the closest match for token := range bucket.tokens { if hd := smetrics.WagnerFischer(code, token, 1, 1, 2); hd <= distance { if hd == distance { // duplicate distance, ignore the previous result as it's not unique enough t = "" } else { t = token distance = hd } } } return t, distance } // Count returns how many tokens are currently in this Bucket. func (bucket *Bucket) Count() uint64 { bucket.Lock() defer bucket.Unlock() return uint64(len(bucket.tokens)) } // EstimatedFillPercentage returns how full the Bucket approximately is. func (bucket *Bucket) EstimatedFillPercentage() float64 { bucket.Lock() defer bucket.Unlock() if len(bucket.tries) == 0 { return 0 } tries := uint64(0) for _, v := range bucket.tries { tries += v } return 100.0 - (100.0 / (float64(tries) / float64(len(bucket.tries)))) } // EstimatedTokenSpace returns the total estimated token space available in this // Bucket. 
func (bucket *Bucket) EstimatedTokenSpace() uint64 { return uint64(float64(bucket.Count()) * (100.0 / bucket.EstimatedFillPercentage())) } func (bucket *Bucket) generate(distance int) (string, int, error) { bucket.RLock() defer bucket.RUnlock() var c string i := 0 for { i++ if i == 100 { return "", i, ErrTokenSpaceExhausted } c = GenerateToken(bucket.length, bucket.runes) dupe := false for token := range bucket.tokens { if hd := smetrics.WagnerFischer(c, token, 1, 1, 2); hd <= distance { dupe = true break } } if !dupe { break } } return c, i, nil } // GenerateToken generates a new token of length n with the defined rune-set // letterRunes. func GenerateToken(n uint, letterRunes []rune) string { l := len(letterRunes) b := make([]rune, n) for i := range b { var lastrune rune if i > 0 { lastrune = b[i-1] } b[i] = lastrune for lastrune == b[i] { b[i] = letterRunes[rand.Intn(l)] //nolint:gosec } } return string(b) } func init() { rand.Seed(time.Now().UnixNano()) }
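The Resolve method above is essentially a nearest-neighbour search under a weighted edit distance; the following self-contained Python sketch (an illustration, not part of toktok) mirrors that logic, including the insert/delete cost of 1 and substitution cost of 2 from the WagnerFischer(code, token, 1, 1, 2) call, and the tie handling that rejects ambiguous matches.

# Python sketch of toktok's Resolve logic (illustration only).
# Weighted edit distance with insert=1, delete=1, substitute=2.

def wagner_fischer(a, b, icost=1, dcost=1, scost=2):
    """Weighted edit distance between strings a and b (dynamic programming)."""
    prev = list(range(0, (len(b) + 1) * icost, icost))
    for i, ca in enumerate(a, 1):
        cur = [i * dcost]
        for j, cb in enumerate(b, 1):
            sub = prev[j - 1] + (0 if ca == cb else scost)
            cur.append(min(prev[j] + dcost, cur[j - 1] + icost, sub))
        prev = cur
    return prev[-1]

def resolve(code, tokens):
    """Return (best_token, distance); best_token is None on an ambiguous tie."""
    code = code.upper()
    if code in tokens:
        return code, 0  # perfect match short-circuits the search
    best, best_dist = None, float("inf")
    for token in tokens:
        d = wagner_fischer(code, token)
        if d < best_dist:
            best, best_dist = token, d
        elif d == best_dist:
            best = None  # two tokens equally close: not unique enough
    return best, best_dist

tokens = {"ABCDEF", "HJKLMN"}
print(resolve("ABCDEE", tokens))  # ('ABCDEF', 2): one substitution away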
{ "content_hash": "dd7b2d25fbb2a5b399e0cd5613cb43b8", "timestamp": "", "source": "github", "line_count": 213, "max_line_length": 84, "avg_line_length": 20.248826291079812, "alnum_prop": 0.6631115233016461, "repo_name": "muesli/toktok", "id": "9758b699ff5e66d5df03a14a6c73b14f9881b543", "size": "4459", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "toktok.go", "mode": "33188", "license": "mit", "language": [ { "name": "Go", "bytes": "9610" } ], "symlink_target": "" }
package io.virtdata.libbasics.shared.from_double.to_double.to_double;

import io.virtdata.libbasics.shared.from_double.to_double.Clamp;
import org.assertj.core.data.Offset;
import org.junit.Test;

import static org.assertj.core.api.Assertions.assertThat;

public class ClampTest {

    @Test
    public void testDoubleUnaryClamp() {
        Clamp clamp = new Clamp(90.0d, 103.0d);
        assertThat(clamp.applyAsDouble(9.034D)).isCloseTo(90.0d, Offset.offset(0.0000001D));
        assertThat(clamp.applyAsDouble(90.34D)).isCloseTo(90.34d, Offset.offset(0.0000001D));
        assertThat(clamp.applyAsDouble(903.4D)).isCloseTo(103.0d, Offset.offset(0.0000001D));
    }

    @Test
    public void testIntUnaryClamp() {
        io.virtdata.libbasics.shared.unary_int.Clamp clamp = new io.virtdata.libbasics.shared.unary_int.Clamp(9, 13);
        assertThat(clamp.applyAsInt(8)).isEqualTo(9);
        assertThat(clamp.applyAsInt(9)).isEqualTo(9);
        assertThat(clamp.applyAsInt(10)).isEqualTo(10);
        assertThat(clamp.applyAsInt(100)).isEqualTo(13);
    }

    @Test
    public void testLongUnaryClamp() {
        io.virtdata.libbasics.shared.from_long.to_long.Clamp clamp = new io.virtdata.libbasics.shared.from_long.to_long.Clamp(9, 13);
        assertThat(clamp.applyAsLong(8L)).isEqualTo(9L);
        assertThat(clamp.applyAsLong(9L)).isEqualTo(9L);
        assertThat(clamp.applyAsLong(10L)).isEqualTo(10L);
        assertThat(clamp.applyAsLong(100L)).isEqualTo(13L);
    }
}
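The behaviour these tests pin down is plain range clamping; as a quick illustrative equivalent (not virtdata code):

# Illustrative Python equivalent of the Clamp operator under test.
def clamp(value, lo, hi):
    return max(lo, min(value, hi))

assert clamp(9.034, 90.0, 103.0) == 90.0    # below the range -> lower bound
assert clamp(90.34, 90.0, 103.0) == 90.34   # inside the range -> unchanged
assert clamp(903.4, 90.0, 103.0) == 103.0   # above the range -> upper bound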
{ "content_hash": "d512fcb5299ead3ffe215516461946bb", "timestamp": "", "source": "github", "line_count": 37, "max_line_length": 133, "avg_line_length": 40.189189189189186, "alnum_prop": 0.7074646940147948, "repo_name": "virtualdataset/metagen-java", "id": "7896a5ac92c7afa0f26eeae270ff5abe65a650dd", "size": "1487", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "virtdata-lib-basics/src/test/java/io/virtdata/libbasics/shared/from_double/to_double/to_double/ClampTest.java", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "ANTLR", "bytes": "4866" }, { "name": "Java", "bytes": "548350" }, { "name": "Shell", "bytes": "444" } ], "symlink_target": "" }
package com.amazonaws.services.codecommit.model;

import javax.annotation.Generated;

/**
 * <p>
 * A branch name is required but was not specified.
 * </p>
 */
@Generated("com.amazonaws:aws-java-sdk-code-generator")
public class BranchNameRequiredException extends com.amazonaws.services.codecommit.model.AWSCodeCommitException {
    private static final long serialVersionUID = 1L;

    /**
     * Constructs a new BranchNameRequiredException with the specified error message.
     *
     * @param message
     *        Describes the error encountered.
     */
    public BranchNameRequiredException(String message) {
        super(message);
    }

}
{ "content_hash": "2c0b066b4e16bc973b8ca1bf6e0629a2", "timestamp": "", "source": "github", "line_count": 25, "max_line_length": 113, "avg_line_length": 26.16, "alnum_prop": 0.7125382262996942, "repo_name": "dagnir/aws-sdk-java", "id": "3a363ea88fc5a8b0bae0f7fb802af3b651e92720", "size": "1234", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "aws-java-sdk-codecommit/src/main/java/com/amazonaws/services/codecommit/model/BranchNameRequiredException.java", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "FreeMarker", "bytes": "157317" }, { "name": "Gherkin", "bytes": "25556" }, { "name": "Java", "bytes": "165755153" }, { "name": "Scilab", "bytes": "3561" } ], "symlink_target": "" }
package httpserver

import (
	"context"
	"crypto/tls"
	"errors"
	"fmt"
	"log"
	"net"
	"net/http"
	"net/url"
	"os"
	"path"
	"path/filepath"
	"runtime"
	"strings"
	"sync"
	"time"

	"github.com/lucas-clemente/quic-go/h2quic"
	"github.com/mholt/caddy"
	"github.com/mholt/caddy/caddyhttp/staticfiles"
	"github.com/mholt/caddy/caddytls"
)

// Server is the HTTP server implementation.
type Server struct {
	Server      *http.Server
	quicServer  *h2quic.Server
	listener    net.Listener
	listenerMu  sync.Mutex
	sites       []*SiteConfig
	connTimeout time.Duration // max time to wait for a connection before force stop
	tlsGovChan  chan struct{} // close to stop the TLS maintenance goroutine
	vhosts      *vhostTrie
}

// ensure it satisfies the interface
var _ caddy.GracefulServer = new(Server)

var defaultALPN = []string{"h2", "http/1.1"}

// makeTLSConfig extracts TLS settings from each site config to
// build a tls.Config usable in Caddy HTTP servers. The returned
// config will be nil if TLS is disabled for these sites.
func makeTLSConfig(group []*SiteConfig) (*tls.Config, error) {
	var tlsConfigs []*caddytls.Config
	for i := range group {
		if HTTP2 && len(group[i].TLS.ALPN) == 0 {
			// if no application-level protocol was configured up to now,
			// default to HTTP/2, then HTTP/1.1 if necessary
			group[i].TLS.ALPN = defaultALPN
		}
		tlsConfigs = append(tlsConfigs, group[i].TLS)
	}
	return caddytls.MakeTLSConfig(tlsConfigs)
}

func getFallbacks(sites []*SiteConfig) []string {
	fallbacks := []string{}
	for _, sc := range sites {
		if sc.FallbackSite {
			fallbacks = append(fallbacks, sc.Addr.Host)
		}
	}
	return fallbacks
}

// NewServer creates a new Server instance that will listen on addr
// and will serve the sites configured in group.
func NewServer(addr string, group []*SiteConfig) (*Server, error) {
	s := &Server{
		Server:      makeHTTPServerWithTimeouts(addr, group),
		vhosts:      newVHostTrie(),
		sites:       group,
		connTimeout: GracefulTimeout,
	}
	s.vhosts.fallbackHosts = append(s.vhosts.fallbackHosts, getFallbacks(group)...)
	s.Server = makeHTTPServerWithHeaderLimit(s.Server, group)
	s.Server.Handler = s // this is weird, but whatever

	// extract TLS settings from each site config to build
	// a tls.Config, which will not be nil if TLS is enabled
	tlsConfig, err := makeTLSConfig(group)
	if err != nil {
		return nil, err
	}
	s.Server.TLSConfig = tlsConfig

	// if TLS is enabled, make sure we prepare the Server accordingly
	if s.Server.TLSConfig != nil {
		// enable QUIC if desired (requires HTTP/2)
		if HTTP2 && QUIC {
			s.quicServer = &h2quic.Server{Server: s.Server}
			s.Server.Handler = s.wrapWithSvcHeaders(s.Server.Handler)
		}

		// wrap the HTTP handler with a handler that does MITM detection
		tlsh := &tlsHandler{next: s.Server.Handler}
		s.Server.Handler = tlsh // this needs to be the "outer" handler when Serve() is called, for type assertion

		// when Serve() creates the TLS listener later, that listener should
		// be adding a reference to the ClientHello info in a map; this callback
		// will be sure to clear out that entry when the connection closes.
		s.Server.ConnState = func(c net.Conn, cs http.ConnState) {
			// when a connection closes or is hijacked, delete its entry
			// in the map, because we are done with it.
			if tlsh.listener != nil {
				if cs == http.StateHijacked || cs == http.StateClosed {
					tlsh.listener.helloInfosMu.Lock()
					delete(tlsh.listener.helloInfos, c.RemoteAddr().String())
					tlsh.listener.helloInfosMu.Unlock()
				}
			}
		}

		// As of Go 1.7, if the Server's TLSConfig is not nil, HTTP/2 is enabled only
		// if TLSConfig.NextProtos includes the string "h2"
		if HTTP2 && len(s.Server.TLSConfig.NextProtos) == 0 {
			// some experimenting shows that this NextProtos must have at least
			// one value that overlaps with the NextProtos of any other tls.Config
			// that is returned from GetConfigForClient; if there is no overlap,
			// the connection will fail (as of Go 1.8, Feb. 2017).
			s.Server.TLSConfig.NextProtos = defaultALPN
		}
	}

	// Compile custom middleware for every site (enables virtual hosting)
	for _, site := range group {
		stack := Handler(staticfiles.FileServer{Root: http.Dir(site.Root), Hide: site.HiddenFiles})
		for i := len(site.middleware) - 1; i >= 0; i-- {
			stack = site.middleware[i](stack)
		}
		site.middlewareChain = stack
		s.vhosts.Insert(site.Addr.VHost(), site)
	}

	return s, nil
}

// makeHTTPServerWithHeaderLimit applies the minimum header limit within
// a group to the given http.Server.
func makeHTTPServerWithHeaderLimit(s *http.Server, group []*SiteConfig) *http.Server {
	var min int64
	for _, cfg := range group {
		limit := cfg.Limits.MaxRequestHeaderSize
		if limit == 0 {
			continue
		}

		// not set yet
		if min == 0 {
			min = limit
		}

		// find a better one
		if limit < min {
			min = limit
		}
	}
	if min > 0 {
		s.MaxHeaderBytes = int(min)
	}
	return s
}

// makeHTTPServerWithTimeouts makes an http.Server from the group of
// configs in a way that configures timeouts (or, if not set, it uses
// the default timeouts) by combining the configuration of each
// SiteConfig in the group. (Timeouts are important for mitigating
// slowloris attacks.)
func makeHTTPServerWithTimeouts(addr string, group []*SiteConfig) *http.Server {
	// find the minimum duration configured for each timeout
	var min Timeouts
	for _, cfg := range group {
		if cfg.Timeouts.ReadTimeoutSet &&
			(!min.ReadTimeoutSet || cfg.Timeouts.ReadTimeout < min.ReadTimeout) {
			min.ReadTimeoutSet = true
			min.ReadTimeout = cfg.Timeouts.ReadTimeout
		}
		if cfg.Timeouts.ReadHeaderTimeoutSet &&
			(!min.ReadHeaderTimeoutSet || cfg.Timeouts.ReadHeaderTimeout < min.ReadHeaderTimeout) {
			min.ReadHeaderTimeoutSet = true
			min.ReadHeaderTimeout = cfg.Timeouts.ReadHeaderTimeout
		}
		if cfg.Timeouts.WriteTimeoutSet &&
			(!min.WriteTimeoutSet || cfg.Timeouts.WriteTimeout < min.WriteTimeout) {
			min.WriteTimeoutSet = true
			min.WriteTimeout = cfg.Timeouts.WriteTimeout
		}
		if cfg.Timeouts.IdleTimeoutSet &&
			(!min.IdleTimeoutSet || cfg.Timeouts.IdleTimeout < min.IdleTimeout) {
			min.IdleTimeoutSet = true
			min.IdleTimeout = cfg.Timeouts.IdleTimeout
		}
	}

	// for the values that were not set, use defaults
	if !min.ReadTimeoutSet {
		min.ReadTimeout = defaultTimeouts.ReadTimeout
	}
	if !min.ReadHeaderTimeoutSet {
		min.ReadHeaderTimeout = defaultTimeouts.ReadHeaderTimeout
	}
	if !min.WriteTimeoutSet {
		min.WriteTimeout = defaultTimeouts.WriteTimeout
	}
	if !min.IdleTimeoutSet {
		min.IdleTimeout = defaultTimeouts.IdleTimeout
	}

	// set the final values on the server and return it
	return &http.Server{
		Addr:              addr,
		ReadTimeout:       min.ReadTimeout,
		ReadHeaderTimeout: min.ReadHeaderTimeout,
		WriteTimeout:      min.WriteTimeout,
		IdleTimeout:       min.IdleTimeout,
	}
}

func (s *Server) wrapWithSvcHeaders(previousHandler http.Handler) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		s.quicServer.SetQuicHeaders(w.Header())
		previousHandler.ServeHTTP(w, r)
	}
}

// Listen creates an active listener for s that can be
// used to serve requests.
func (s *Server) Listen() (net.Listener, error) {
	if s.Server == nil {
		return nil, fmt.Errorf("Server field is nil")
	}

	ln, err := net.Listen("tcp", s.Server.Addr)
	if err != nil {
		var succeeded bool
		if runtime.GOOS == "windows" {
			// Windows has been known to keep sockets open even after closing the listeners.
			// Tests reveal this error case easily because they call Start() then Stop()
			// in succession. TODO: Better way to handle this? And why limit this to Windows?
			for i := 0; i < 20; i++ {
				time.Sleep(100 * time.Millisecond)
				ln, err = net.Listen("tcp", s.Server.Addr)
				if err == nil {
					succeeded = true
					break
				}
			}
		}
		if !succeeded {
			return nil, err
		}
	}

	if tcpLn, ok := ln.(*net.TCPListener); ok {
		ln = tcpKeepAliveListener{TCPListener: tcpLn}
	}

	cln := ln.(caddy.Listener)
	for _, site := range s.sites {
		for _, m := range site.listenerMiddleware {
			cln = m(cln)
		}
	}

	// Very important to return a concrete caddy.Listener
	// implementation for graceful restarts.
	return cln.(caddy.Listener), nil
}

// ListenPacket creates a UDP connection for QUIC if it is enabled.
func (s *Server) ListenPacket() (net.PacketConn, error) {
	if QUIC {
		udpAddr, err := net.ResolveUDPAddr("udp", s.Server.Addr)
		if err != nil {
			return nil, err
		}
		return net.ListenUDP("udp", udpAddr)
	}
	return nil, nil
}

// Serve serves requests on ln. It blocks until ln is closed.
func (s *Server) Serve(ln net.Listener) error {
	s.listenerMu.Lock()
	s.listener = ln
	s.listenerMu.Unlock()

	if s.Server.TLSConfig != nil {
		// Create TLS listener - note that we do not replace s.listener
		// with this TLS listener; tls.listener is unexported and does
		// not implement the File() method we need for graceful restarts
		// on POSIX systems.
		// TODO: Is this ^ still relevant anymore? Maybe we can now that it's a net.Listener...
		ln = newTLSListener(ln, s.Server.TLSConfig)
		if handler, ok := s.Server.Handler.(*tlsHandler); ok {
			handler.listener = ln.(*tlsHelloListener)
		}

		// Rotate TLS session ticket keys
		s.tlsGovChan = caddytls.RotateSessionTicketKeys(s.Server.TLSConfig)
	}

	err := s.Server.Serve(ln)
	if s.quicServer != nil {
		s.quicServer.Close()
	}
	return err
}

// ServePacket serves QUIC requests on pc until it is closed.
func (s *Server) ServePacket(pc net.PacketConn) error {
	if s.quicServer != nil {
		err := s.quicServer.Serve(pc.(*net.UDPConn))
		return fmt.Errorf("serving QUIC connections: %v", err)
	}
	return nil
}

// ServeHTTP is the entry point of all HTTP requests.
func (s *Server) ServeHTTP(w http.ResponseWriter, r *http.Request) {
	defer func() {
		// We absolutely need to be sure we stay alive up here,
		// even though, in theory, the errors middleware does this.
		if rec := recover(); rec != nil {
			log.Printf("[PANIC] %v", rec)
			DefaultErrorFunc(w, r, http.StatusInternalServerError)
		}
	}()

	// copy the original, unchanged URL into the context
	// so it can be referenced by middlewares
	urlCopy := *r.URL
	if r.URL.User != nil {
		userInfo := new(url.Userinfo)
		*userInfo = *r.URL.User
		urlCopy.User = userInfo
	}
	c := context.WithValue(r.Context(), OriginalURLCtxKey, urlCopy)
	r = r.WithContext(c)

	w.Header().Set("Server", caddy.AppName)

	status, _ := s.serveHTTP(w, r)

	// Fallback error response in case error handling wasn't chained in
	if status >= 400 {
		DefaultErrorFunc(w, r, status)
	}
}

func (s *Server) serveHTTP(w http.ResponseWriter, r *http.Request) (int, error) {
	// strip out the port because it's not used in virtual
	// hosting; the port is irrelevant because each listener
	// is on a different port.
	hostname, _, err := net.SplitHostPort(r.Host)
	if err != nil {
		hostname = r.Host
	}

	// look up the virtualhost; if no match, serve error
	vhost, pathPrefix := s.vhosts.Match(hostname + r.URL.Path)
	c := context.WithValue(r.Context(), caddy.CtxKey("path_prefix"), pathPrefix)
	r = r.WithContext(c)

	if vhost == nil {
		// check for ACME challenge even if vhost is nil;
		// could be a new host coming online soon
		if caddytls.HTTPChallengeHandler(w, r, "localhost", caddytls.DefaultHTTPAlternatePort) {
			return 0, nil
		}
		// otherwise, log the error and write a message to the client
		remoteHost, _, err := net.SplitHostPort(r.RemoteAddr)
		if err != nil {
			remoteHost = r.RemoteAddr
		}
		WriteSiteNotFound(w, r) // don't add headers outside of this function
		log.Printf("[INFO] %s - No such site at %s (Remote: %s, Referer: %s)",
			hostname, s.Server.Addr, remoteHost, r.Header.Get("Referer"))
		return 0, nil
	}

	// we still check for ACME challenge if the vhost exists,
	// because we must apply its HTTP challenge config settings
	if s.proxyHTTPChallenge(vhost, w, r) {
		return 0, nil
	}

	// trim the path portion of the site address from the beginning of
	// the URL path, so a request to example.com/foo/blog on the site
	// defined as example.com/foo appears as /blog instead of /foo/blog.
	if pathPrefix != "/" {
		r.URL.Path = strings.TrimPrefix(r.URL.Path, pathPrefix)
		if !strings.HasPrefix(r.URL.Path, "/") {
			r.URL.Path = "/" + r.URL.Path
		}
	}

	return vhost.middlewareChain.ServeHTTP(w, r)
}

// proxyHTTPChallenge solves the ACME HTTP challenge if r is the HTTP
// request for the challenge. If it is, and if the request has been
// fulfilled (response written), true is returned; false otherwise.
// If you don't have a vhost, just call the challenge handler directly.
func (s *Server) proxyHTTPChallenge(vhost *SiteConfig, w http.ResponseWriter, r *http.Request) bool {
	if vhost.Addr.Port != caddytls.HTTPChallengePort {
		return false
	}
	if vhost.TLS != nil && vhost.TLS.Manual {
		return false
	}
	altPort := caddytls.DefaultHTTPAlternatePort
	if vhost.TLS != nil && vhost.TLS.AltHTTPPort != "" {
		altPort = vhost.TLS.AltHTTPPort
	}
	return caddytls.HTTPChallengeHandler(w, r, vhost.ListenHost, altPort)
}

// Address returns the address s was assigned to listen on.
func (s *Server) Address() string {
	return s.Server.Addr
}

// Stop stops s gracefully (or forcefully after timeout) and
// closes its listener.
func (s *Server) Stop() error {
	ctx, cancel := context.WithTimeout(context.Background(), s.connTimeout)
	defer cancel()

	err := s.Server.Shutdown(ctx)
	if err != nil {
		return err
	}

	// signal any TLS governor goroutines to exit
	if s.tlsGovChan != nil {
		close(s.tlsGovChan)
	}

	return nil
}

// OnStartupComplete lists the sites served by this server
// and any relevant information, assuming caddy.Quiet == false.
func (s *Server) OnStartupComplete() {
	if caddy.Quiet {
		return
	}
	for _, site := range s.sites {
		output := site.Addr.String()
		if caddy.IsLoopback(s.Address()) && !caddy.IsLoopback(site.Addr.Host) {
			output += " (only accessible on this machine)"
		}
		fmt.Println(output)
		log.Println(output)
	}
}

// defaultTimeouts stores the default timeout values to use
// if left unset by user configuration. NOTE: Most default
// timeouts are disabled (see issues #1464 and #1733).
var defaultTimeouts = Timeouts{IdleTimeout: 5 * time.Minute}

// tcpKeepAliveListener sets TCP keep-alive timeouts on accepted
// connections. It's used by ListenAndServe and ListenAndServeTLS so
// dead TCP connections (e.g. closing laptop mid-download) eventually
// go away.
//
// Borrowed from the Go standard library.
type tcpKeepAliveListener struct {
	*net.TCPListener
}

// Accept accepts the connection with a keep-alive enabled.
func (ln tcpKeepAliveListener) Accept() (c net.Conn, err error) {
	tc, err := ln.AcceptTCP()
	if err != nil {
		return
	}
	tc.SetKeepAlive(true)
	tc.SetKeepAlivePeriod(3 * time.Minute)
	return tc, nil
}

// File implements caddy.Listener; it returns the underlying file of the listener.
func (ln tcpKeepAliveListener) File() (*os.File, error) {
	return ln.TCPListener.File()
}

// ErrMaxBytesExceeded is the error returned by MaxBytesReader
// when the request body exceeds the limit imposed.
var ErrMaxBytesExceeded = errors.New("http: request body too large")

// DefaultErrorFunc responds to an HTTP request with a simple description
// of the specified HTTP status code.
func DefaultErrorFunc(w http.ResponseWriter, r *http.Request, status int) {
	WriteTextResponse(w, status, fmt.Sprintf("%d %s\n", status, http.StatusText(status)))
}

const httpStatusMisdirectedRequest = 421 // RFC 7540, 9.1.2

// WriteSiteNotFound writes an appropriate error code to w, signaling that
// the requested host is not served by Caddy on a given port.
func WriteSiteNotFound(w http.ResponseWriter, r *http.Request) {
	status := http.StatusNotFound
	if r.ProtoMajor >= 2 {
		// TODO: use http.StatusMisdirectedRequest when it gets defined
		status = httpStatusMisdirectedRequest
	}
	WriteTextResponse(w, status, fmt.Sprintf("%d Site %s is not served on this interface\n", status, r.Host))
}

// WriteTextResponse writes body with code status to w. The body will
// be interpreted as plain text.
func WriteTextResponse(w http.ResponseWriter, status int, body string) {
	w.Header().Set("Content-Type", "text/plain; charset=utf-8")
	w.Header().Set("X-Content-Type-Options", "nosniff")
	w.WriteHeader(status)
	w.Write([]byte(body))
}

// SafePath joins siteRoot and reqPath and converts it to a path that can
// be used to access a path on the local disk. It ensures the path does
// not traverse outside of the site root.
//
// If opening a file, use http.Dir instead.
func SafePath(siteRoot, reqPath string) string {
	reqPath = filepath.ToSlash(reqPath)
	reqPath = strings.Replace(reqPath, "\x00", "", -1) // NOTE: Go 1.9 checks for null bytes in the syscall package
	if siteRoot == "" {
		siteRoot = "."
	}
	return filepath.Join(siteRoot, filepath.FromSlash(path.Clean("/"+reqPath)))
}

// OriginalURLCtxKey is the key for accessing the original, incoming URL on an HTTP request.
const OriginalURLCtxKey = caddy.CtxKey("original_url")
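// Illustrative SafePath behavior (a sketch of the expected results given the
// implementation above, kept as a comment so the file still compiles):
//
//	SafePath("/var/www", "../../etc/passwd") // -> "/var/www/etc/passwd"
//	SafePath("/var/www", "blog/../index")    // -> "/var/www/index"
//	SafePath("", "index.html")               // -> "index.html"
//
// Cleaning the request path against a rooted "/" before joining it with the
// site root is what neutralizes traversal attempts.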
{ "content_hash": "71e8c27863d2ac994fcfad6d64dde80e", "timestamp": "", "source": "github", "line_count": 540, "max_line_length": 112, "avg_line_length": 31.553703703703704, "alnum_prop": 0.7093139268736428, "repo_name": "tobya/caddy", "id": "eb854d72606d68e2834a9af203da6825cbd6795a", "size": "17702", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "caddyhttp/httpserver/server.go", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Go", "bytes": "807018" }, { "name": "HTML", "bytes": "660" }, { "name": "PHP", "bytes": "1948" }, { "name": "Shell", "bytes": "5930" }, { "name": "Smarty", "bytes": "193" } ], "symlink_target": "" }
<html lang="hy">
<head>
    <meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
    <meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate" />
    <meta http-equiv="Pragma" content="no-cache">
    <meta http-equiv="Expires" content="0">
    <style>
        @font-face {
            font-family: "NotoSansArmenian-ttf";
            src: url("https://cdn.jsdelivr.net/gh/googlefonts/noto-fonts@main/unhinted/ttf/NotoSansArmenian/NotoSansArmenian-Regular.ttf");
        }
        @font-face {
            font-family: "NotoSansArmenian-otf";
            src: url("https://cdn.jsdelivr.net/gh/googlefonts/noto-fonts@main/unhinted/otf/NotoSansArmenian/NotoSansArmenian-Regular.otf");
        }
        @font-face {
            font-family: AdobeBlank;
            src: url(AdobeBlank.ttf);
        }
    </style>
</head>
<body>
    <a href="https://github.com/googlefonts/noto-fonts/issues/1616">Issue 1616</a>
    <p>NotoSansArmenian-ttf: <span style="font-family:NotoSansArmenian-ttf, AdobeBlank; font-weight:400; background-color:#EDEBEA">ՈՒ̈ու̈</span></p>
    <p>NotoSansArmenian-otf: <span style="font-family:NotoSansArmenian-otf, AdobeBlank; font-weight:400; background-color:#EDEBEA">ՈՒ̈ու̈</span></p>
    <label style="font-size:10px">Test details:<br>Tested font directory: https://github.com/googlefonts/noto-fonts/tree/main/unhinted/ttf/NotoSansArmenian/NotoSansArmenian-Regular.ttf<br>https://github.com/googlefonts/noto-fonts/tree/main/unhinted/otf/NotoSansArmenian/NotoSansArmenian-Regular.otf</label><br>
    <label style="font-size:10px">Test created date: </label><label style="font-size:10px" id="date"></label>
    <script>
        var date_tag = new Date();
        document.getElementById("date").innerHTML = date_tag;
    </script>
</body>
</html>
{ "content_hash": "bcde14a7beba1f886dc5b8533199e0ae", "timestamp": "", "source": "github", "line_count": 26, "max_line_length": 412, "avg_line_length": 61.88461538461539, "alnum_prop": 0.7582349285270354, "repo_name": "googlefonts/noto-source", "id": "b9e019d954cb6713ccba451b4fa7c8e6d1de9bed", "size": "1621", "binary": false, "copies": "3", "ref": "refs/heads/main", "path": "test/Armenian/1616-NotoSansArmenian.html", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "HTML", "bytes": "21014413" }, { "name": "Python", "bytes": "16989" }, { "name": "Shell", "bytes": "34567" } ], "symlink_target": "" }
A standalone string template engine.

## Code Example

```cs
StringTemplate stringTemplate = new StringTemplate("hello <data>");
stringTemplate.Add("data", "world");

Console.WriteLine(stringTemplate.Render());
```

## License

This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details.
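For templates with more than one placeholder, the same `Add`/`Render` pattern applies. The snippet below is an illustrative sketch that assumes the engine behaves exactly as in the example above:

```cs
StringTemplate stringTemplate = new StringTemplate("<greeting>, <name>!");
stringTemplate.Add("greeting", "hello");
stringTemplate.Add("name", "world");

// Prints "hello, world!"
Console.WriteLine(stringTemplate.Render());
```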
{ "content_hash": "f6df1f39739be592ed51e314f9ef8ee0", "timestamp": "", "source": "github", "line_count": 14, "max_line_length": 98, "avg_line_length": 23.428571428571427, "alnum_prop": 0.7439024390243902, "repo_name": "AdrianToman1/StringTemplateEngine", "id": "46cdb5ed4089b23837c7dbd0a6f47b4dcf974ff2", "size": "353", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "README.md", "mode": "33188", "license": "mit", "language": [ { "name": "C#", "bytes": "36198" } ], "symlink_target": "" }
<?php
namespace app\modules\question\models;

use yii\base\Model;
use app\modules\tag\models\Tag;
use app\modules\tag\models\TagItem;

class QuestionForm extends Model
{
    public $subject;
    public $tags;
    public $content;
    public $author_id;

    public function rules()
    {
        return [
            [['subject', 'content', 'tags', 'author_id'], 'required'],
            [['tags'], 'checkTags']
        ];
    }

    public function checkTags($attribute, $params)
    {
        $tags = is_array($this->tags) ? $this->tags : explode(',', $this->tags);
        $maxLength = 5; // TODO: make the maximum number of tags configurable from the admin backend
        if (count($tags) > $maxLength) {
            $this->addError($attribute, 'You can select at most ' . $maxLength . ' tags');
        }
    }

    public function create()
    {
        $question = new Question();
        $question->setAttributes([
            'subject' => $this->subject,
            'content' => $this->content,
            'author_id' => $this->author_id,
        ]);
        if (($result = $question->save()) && ($result = $question->setActive())) {
            $tags = Tag::findAll([
                'name' => is_array($this->tags) ? $this->tags : explode(',', $this->tags)
            ]);
            foreach ($tags as $tag) {
                $tagItem = new TagItem();
                $tagItem->setAttributes([
                    'target_id' => $question->id,
                    'target_type' => $question::TYPE,
                ]);
                $tag->addItem($tagItem);
            }
        }
        return $result ? $question : false;
    }

    public function attributeLabels()
    {
        return [
            'subject' => 'Question',
            'content' => 'Content',
            'tags' => 'Tags'
        ];
    }
}
{ "content_hash": "e5f07fb148f28701e1181bca2f0278c0", "timestamp": "", "source": "github", "line_count": 64, "max_line_length": 89, "avg_line_length": 27.171875, "alnum_prop": 0.47556066705002875, "repo_name": "callmez/huajuan", "id": "ac303dc16cf5b041089ead1e5948e2c498b39718", "size": "1787", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "modules/question/models/QuestionForm.php", "mode": "33188", "license": "mit", "language": [ { "name": "ApacheConf", "bytes": "201" }, { "name": "Batchfile", "bytes": "1030" }, { "name": "CSS", "bytes": "6949" }, { "name": "JavaScript", "bytes": "6291" }, { "name": "PHP", "bytes": "188854" } ], "symlink_target": "" }
from __future__ import unicode_literals

from django.db import migrations, models


class Migration(migrations.Migration):

    initial = True

    dependencies = []

    operations = [
        migrations.CreateModel(
            name='SampleModel',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=128)),
                ('category', models.CharField(choices=[('foo', 'Foo'), ('bar', 'Bar')], max_length=128)),
            ],
        )
    ]
{ "content_hash": "0cb4a45780170aac88a59bdd33467a88", "timestamp": "", "source": "github", "line_count": 21, "max_line_length": 114, "avg_line_length": 27.80952380952381, "alnum_prop": 0.5684931506849316, "repo_name": "plecto/django-smart-lists", "id": "9929ea09b140112bd16345800a894ac3607239c1", "size": "657", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "testproject/migrations/0001_initial.py", "mode": "33188", "license": "mit", "language": [ { "name": "HTML", "bytes": "8351" }, { "name": "Makefile", "bytes": "1673" }, { "name": "Python", "bytes": "68012" } ], "symlink_target": "" }
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<!--NewPage-->
<HTML>
<HEAD>
<!-- Generated by javadoc (build 1.6.0_26) on Sat Jun 21 06:31:11 UTC 2014 -->
<META http-equiv="Content-Type" content="text/html; charset=UTF-8">
<TITLE>
Uses of Class org.apache.hadoop.net.ConnectTimeoutException (Apache Hadoop Main 2.4.1 API)
</TITLE>

<META NAME="date" CONTENT="2014-06-21">

<LINK REL ="stylesheet" TYPE="text/css" HREF="../../../../../stylesheet.css" TITLE="Style">

<SCRIPT type="text/javascript">
function windowTitle()
{
    if (location.href.indexOf('is-external=true') == -1) {
        parent.document.title="Uses of Class org.apache.hadoop.net.ConnectTimeoutException (Apache Hadoop Main 2.4.1 API)";
    }
}
</SCRIPT>
<NOSCRIPT>
</NOSCRIPT>

</HEAD>

<BODY BGCOLOR="white" onload="windowTitle();">
<HR>

<!-- ========= START OF TOP NAVBAR ======= -->
<A NAME="navbar_top"><!-- --></A>
<A HREF="#skip-navbar_top" title="Skip navigation links"></A>
<TABLE BORDER="0" WIDTH="100%" CELLPADDING="1" CELLSPACING="0" SUMMARY="">
<TR>
<TD COLSPAN=2 BGCOLOR="#EEEEFF" CLASS="NavBarCell1">
<A NAME="navbar_top_firstrow"><!-- --></A>
<TABLE BORDER="0" CELLPADDING="0" CELLSPACING="3" SUMMARY="">
<TR ALIGN="center" VALIGN="top">
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../../../overview-summary.html"><FONT CLASS="NavBarFont1"><B>Overview</B></FONT></A>&nbsp;</TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../package-summary.html"><FONT CLASS="NavBarFont1"><B>Package</B></FONT></A>&nbsp;</TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../../../org/apache/hadoop/net/ConnectTimeoutException.html" title="class in org.apache.hadoop.net"><FONT CLASS="NavBarFont1"><B>Class</B></FONT></A>&nbsp;</TD>
<TD BGCOLOR="#FFFFFF" CLASS="NavBarCell1Rev"> &nbsp;<FONT CLASS="NavBarFont1Rev"><B>Use</B></FONT>&nbsp;</TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../package-tree.html"><FONT CLASS="NavBarFont1"><B>Tree</B></FONT></A>&nbsp;</TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../../../deprecated-list.html"><FONT CLASS="NavBarFont1"><B>Deprecated</B></FONT></A>&nbsp;</TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../../../index-all.html"><FONT CLASS="NavBarFont1"><B>Index</B></FONT></A>&nbsp;</TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../../../help-doc.html"><FONT CLASS="NavBarFont1"><B>Help</B></FONT></A>&nbsp;</TD>
</TR>
</TABLE>
</TD>
<TD ALIGN="right" VALIGN="top" ROWSPAN=3><EM>
</EM>
</TD>
</TR>

<TR>
<TD BGCOLOR="white" CLASS="NavBarCell2"><FONT SIZE="-2">
&nbsp;PREV&nbsp;
&nbsp;NEXT</FONT></TD>
<TD BGCOLOR="white" CLASS="NavBarCell2"><FONT SIZE="-2">
<A HREF="../../../../../index.html?org/apache/hadoop/net//class-useConnectTimeoutException.html" target="_top"><B>FRAMES</B></A> &nbsp;
&nbsp;<A HREF="ConnectTimeoutException.html" target="_top"><B>NO FRAMES</B></A> &nbsp;
&nbsp;<SCRIPT type="text/javascript">
<!--
if(window==top) {
    document.writeln('<A HREF="../../../../../allclasses-noframe.html"><B>All Classes</B></A>');
}
//-->
</SCRIPT>
<NOSCRIPT>
<A HREF="../../../../../allclasses-noframe.html"><B>All Classes</B></A>
</NOSCRIPT>
</FONT></TD>
</TR>
</TABLE>
<A NAME="skip-navbar_top"></A>
<!-- ========= END OF TOP NAVBAR ========= -->

<HR>
<CENTER>
<H2>
<B>Uses of Class<br>org.apache.hadoop.net.ConnectTimeoutException</B></H2>
</CENTER>
No usage of org.apache.hadoop.net.ConnectTimeoutException
<P>
<HR>

<!-- ======= START OF BOTTOM NAVBAR ====== -->
<A NAME="navbar_bottom"><!-- --></A>
<A HREF="#skip-navbar_bottom" title="Skip navigation links"></A>
<TABLE BORDER="0" WIDTH="100%" CELLPADDING="1" CELLSPACING="0" SUMMARY="">
<TR>
<TD COLSPAN=2 BGCOLOR="#EEEEFF" CLASS="NavBarCell1">
<A NAME="navbar_bottom_firstrow"><!-- --></A>
<TABLE BORDER="0" CELLPADDING="0" CELLSPACING="3" SUMMARY="">
<TR ALIGN="center" VALIGN="top">
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../../../overview-summary.html"><FONT CLASS="NavBarFont1"><B>Overview</B></FONT></A>&nbsp;</TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../package-summary.html"><FONT CLASS="NavBarFont1"><B>Package</B></FONT></A>&nbsp;</TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../../../org/apache/hadoop/net/ConnectTimeoutException.html" title="class in org.apache.hadoop.net"><FONT CLASS="NavBarFont1"><B>Class</B></FONT></A>&nbsp;</TD>
<TD BGCOLOR="#FFFFFF" CLASS="NavBarCell1Rev"> &nbsp;<FONT CLASS="NavBarFont1Rev"><B>Use</B></FONT>&nbsp;</TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../package-tree.html"><FONT CLASS="NavBarFont1"><B>Tree</B></FONT></A>&nbsp;</TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../../../deprecated-list.html"><FONT CLASS="NavBarFont1"><B>Deprecated</B></FONT></A>&nbsp;</TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../../../index-all.html"><FONT CLASS="NavBarFont1"><B>Index</B></FONT></A>&nbsp;</TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../../../help-doc.html"><FONT CLASS="NavBarFont1"><B>Help</B></FONT></A>&nbsp;</TD>
</TR>
</TABLE>
</TD>
<TD ALIGN="right" VALIGN="top" ROWSPAN=3><EM>
</EM>
</TD>
</TR>

<TR>
<TD BGCOLOR="white" CLASS="NavBarCell2"><FONT SIZE="-2">
&nbsp;PREV&nbsp;
&nbsp;NEXT</FONT></TD>
<TD BGCOLOR="white" CLASS="NavBarCell2"><FONT SIZE="-2">
<A HREF="../../../../../index.html?org/apache/hadoop/net//class-useConnectTimeoutException.html" target="_top"><B>FRAMES</B></A> &nbsp;
&nbsp;<A HREF="ConnectTimeoutException.html" target="_top"><B>NO FRAMES</B></A> &nbsp;
&nbsp;<SCRIPT type="text/javascript">
<!--
if(window==top) {
    document.writeln('<A HREF="../../../../../allclasses-noframe.html"><B>All Classes</B></A>');
}
//-->
</SCRIPT>
<NOSCRIPT>
<A HREF="../../../../../allclasses-noframe.html"><B>All Classes</B></A>
</NOSCRIPT>
</FONT></TD>
</TR>
</TABLE>
<A NAME="skip-navbar_bottom"></A>
<!-- ======== END OF BOTTOM NAVBAR ======= -->

<HR>
Copyright &#169; 2014 <a href="http://www.apache.org">Apache Software Foundation</a>. All Rights Reserved.
</BODY>
</HTML>
{ "content_hash": "93799cb5666be2bedf5f53de216c3500", "timestamp": "", "source": "github", "line_count": 145, "max_line_length": 223, "avg_line_length": 42.827586206896555, "alnum_prop": 0.6231884057971014, "repo_name": "gsoundar/mambo-ec2-deploy", "id": "8be10365b66a97650e5af66425dc5f0aa3497dac", "size": "6210", "binary": false, "copies": "2", "ref": "refs/heads/master", "path": "packages/hadoop-2.4.1/share/doc/hadoop/api/org/apache/hadoop/net/class-use/ConnectTimeoutException.html", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Batchfile", "bytes": "23179" }, { "name": "CSS", "bytes": "39965" }, { "name": "HTML", "bytes": "263271260" }, { "name": "Java", "bytes": "103085" }, { "name": "JavaScript", "bytes": "1347" }, { "name": "Python", "bytes": "4101" }, { "name": "Ruby", "bytes": "262588" }, { "name": "Shell", "bytes": "118548" } ], "symlink_target": "" }
from swgpy.object import *


def create(kernel):
	result = Tangible()

	result.template = "object/tangible/wearables/shoes/shared_shoes_s09.iff"
	result.attribute_template_id = 11
	result.stfName("wearables_name", "shoes_s09")

	#### BEGIN MODIFICATIONS ####

	#### END MODIFICATIONS ####

	return result
{ "content_hash": "3dd2b68b8ce7d3afe902949350a7c588", "timestamp": "", "source": "github", "line_count": 13, "max_line_length": 73, "avg_line_length": 23.692307692307693, "alnum_prop": 0.7012987012987013, "repo_name": "obi-two/Rebelion", "id": "47971851202d435b9e65f0b1ccdc0bcab1205861", "size": "453", "binary": false, "copies": "2", "ref": "refs/heads/master", "path": "data/scripts/templates/object/tangible/wearables/shoes/shared_shoes_s09.py", "mode": "33188", "license": "mit", "language": [ { "name": "Batchfile", "bytes": "11818" }, { "name": "C", "bytes": "7699" }, { "name": "C++", "bytes": "2293610" }, { "name": "CMake", "bytes": "39727" }, { "name": "PLSQL", "bytes": "42065" }, { "name": "Python", "bytes": "7499185" }, { "name": "SQLPL", "bytes": "41864" } ], "symlink_target": "" }
import pyaf.Bench.TS_datasets as tsds
import tests.artificial.process_artificial_dataset as art


art.process_dataset(N=128, FREQ='D', seed=0, trendtype="LinearTrend",
                    cycle_length=30, transform="Difference", sigma=0.0,
                    exog_count=20, ar_order=0)
{ "content_hash": "9f00de86b85c684e48449ddbb03fff4a", "timestamp": "", "source": "github", "line_count": 7, "max_line_length": 168, "avg_line_length": 38.285714285714285, "alnum_prop": 0.7089552238805971, "repo_name": "antoinecarme/pyaf", "id": "3726dcad7db79b248710b83ec9a2d90db5359812", "size": "268", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "tests/artificial/transf_Difference/trend_LinearTrend/cycle_30/ar_/test_artificial_128_Difference_LinearTrend_30__20.py", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "Makefile", "bytes": "6773299" }, { "name": "Procfile", "bytes": "24" }, { "name": "Python", "bytes": "54209093" }, { "name": "R", "bytes": "807" }, { "name": "Shell", "bytes": "3619" } ], "symlink_target": "" }
title: "Using Runscope and CodeShip for API Testing"
shortTitle: "Runscope API Testing And Monitoring"
menus:
  general/integrations:
    title: Using Runscope
    weight: 21
tags:
  - apis
  - api testing
  - service testing
  - integration testing
  - functional testing
  - smoke testing
  - integrations
  - reporting
  - monitoring
  - notifications

categories:
  - Integrations

---

* include a table of contents
{:toc}

## About Runscope

[Runscope](https://runscope.com) is an API testing and monitoring tool that can be used with continuous integration and delivery services like [CodeShip](https://codeship.com) to test and validate your web applications. By using Runscope, you can ship more reliable code for your teams and your customers.

The [Runscope documentation](https://guide.blazemeter.com/hc/en-us/articles/360001835558) provides a great guide to getting started, and the instructions below have more information on integrating with CodeShip.

## CodeShip Pro

### Setting Your Access Token

To run your Runscope API tests on CodeShip, you will need to add your Runscope Access Token to the [encrypted environment variables]({{ site.baseurl }}{% link _pro/builds-and-configuration/environment-variables.md %}) that you include in your [codeship-services.yml file]({{ site.baseurl }}{% link _pro/builds-and-configuration/services.md %}).

See the [Runscope documentation](https://api.blazemeter.com/api-monitoring/#authentication-process) for more details.

### Installing Runscope Dependencies And Script

After setting your personal access token, you will need to build a service in your [codeship-services.yml file]({{ site.baseurl }}{% link _pro/builds-and-configuration/services.md %}) with the Runscope dependencies installed. Runscope distributes these dependencies through pip, so make sure your service also has Python and pip installed.

To install the dependencies you need, add the following to your service's Dockerfile:

```bash
RUN pip install -r https://raw.githubusercontent.com/Runscope/python-trigger-sample/master/requirements.txt
```

Once you have the dependencies installed, you can add another line to your service's Dockerfile to install the Runscope script itself:

```bash
RUN wget https://raw.githubusercontent.com/Runscope/python-trigger-sample/master/app.py
```

Note that your service must have wget available for the above command to execute.

### Triggering Runscope During A Build

To run the Runscope script and execute your tests during your build, call the Runscope script you installed via your [codeship-services.yml file]({{ site.baseurl }}{% link _pro/builds-and-configuration/services.md %}) in a step defined in your [codeship-steps.yml file]({{ site.baseurl }}{% link _pro/builds-and-configuration/steps.md %}):

```yaml
- name: Runscope
  service: app
  command: runscope.sh
```

Inside of the `runscope.sh` file, you can call the Runscope script with the following command:

```shell
python app.py https://api.runscope.com/radar/your_test_trigger_id/trigger?runscope_environment=your_runscope_environment_id
```

**Note** that you will need to change the URL to reflect the actual Trigger URL you have configured in Runscope.

## CodeShip Basic

### Setting Your Access Token

To run your Runscope API tests on CodeShip, you will need to add your Runscope Access Token to your CodeShip project's [environment variables]({{ site.baseurl }}{% link _basic/builds-and-configuration/set-environment-variables.md %}).

See the [Runscope documentation](https://api.blazemeter.com/api-monitoring/#authentication-process) for more details.

### Installing Runscope Dependencies And Script

After setting your personal access token, you will need to install the Runscope dependencies and the Runscope script in your project's [setup commands]({{ site.baseurl }}{% link _basic/quickstart/getting-started.md %}).

In your [setup commands]({{ site.baseurl }}{% link _basic/quickstart/getting-started.md %}), add the following:

```bash
pip install -r https://raw.githubusercontent.com/Runscope/python-trigger-sample/master/requirements.txt
```

and

```bash
wget https://raw.githubusercontent.com/Runscope/python-trigger-sample/master/app.py
```

### Triggering Runscope During A Build

To run the Runscope script and execute your tests during your build, call the script you installed in your [setup commands]({{ site.baseurl }}{% link _basic/quickstart/getting-started.md %}) from your project's [test or deployment commands]({{ site.baseurl }}{% link _basic/quickstart/getting-started.md %}).

Inside of your [test or deployment commands]({{ site.baseurl }}{% link _basic/quickstart/getting-started.md %}), simply add the following:

```bash
python app.py https://api.runscope.com/radar/your_test_trigger_id/trigger?runscope_environment=your_runscope_environment_id
```

**Note** that you will need to change the URL to reflect the actual Trigger URL you have configured in Runscope.
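If you want the build to fail when the triggered test run fails, you can wrap the trigger command in a small script. This is a sketch rather than an official snippet: the trigger and environment IDs are placeholders, and it assumes the `app.py` trigger script exits with a nonzero status when the test run does not pass.

```bash
#!/bin/sh
# runscope.sh - minimal wrapper around the Runscope trigger script.
# Replace the placeholder IDs below with your own trigger and environment IDs.
set -e

python app.py "https://api.runscope.com/radar/your_test_trigger_id/trigger?runscope_environment=your_runscope_environment_id"
```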
{ "content_hash": "f6f28a186f9761c8f3bab3d1ee002e60", "timestamp": "", "source": "github", "line_count": 114, "max_line_length": 357, "avg_line_length": 43.89473684210526, "alnum_prop": 0.7703836930455635, "repo_name": "codeship/documentation", "id": "0c5c3d10583d48a04e62ee7a5be8dead1d536fea", "size": "5008", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "_general/integrations/runscope.md", "mode": "33188", "license": "mit", "language": [ { "name": "CSS", "bytes": "82042" }, { "name": "CoffeeScript", "bytes": "2730" }, { "name": "Dockerfile", "bytes": "1624" }, { "name": "HTML", "bytes": "82336" }, { "name": "JavaScript", "bytes": "1583" }, { "name": "Ruby", "bytes": "11653" }, { "name": "Shell", "bytes": "9613" } ], "symlink_target": "" }