[](https://travis-ci.org/fatedier/frp)
[README](README.md) | [中文文档](README_zh.md)
## What is frp?
frp is a fast reverse proxy that helps you expose a local server behind a NAT or firewall to the internet. It currently supports tcp and udp, as well as http and https, where requests can be forwarded by domain name to backend web services.
## Table of Contents
<!-- vim-markdown-toc GFM -->
* [What can I do with frp?](#what-can-i-do-with-frp)
* [Status](#status)
* [Architecture](#architecture)
* [Example Usage](#example-usage)
* [Access your computer in LAN by SSH](#access-your-computer-in-lan-by-ssh)
* [Visit your web service in LAN by custom domains](#visit-your-web-service-in-lan-by-custom-domains)
* [Forward DNS query request](#forward-dns-query-request)
* [Forward unix domain socket](#forward-unix-domain-socket)
* [Expose a simple http file server](#expose-a-simple-http-file-server)
* [Expose your service in security](#expose-your-service-in-security)
* [P2P Mode](#p2p-mode)
* [Connect website through frpc's network](#connect-website-through-frpcs-network)
* [Features](#features)
* [Configuration File](#configuration-file)
* [Dashboard](#dashboard)
* [Authentication](#authentication)
* [Encryption and Compression](#encryption-and-compression)
* [Hot-Reload frpc configuration](#hot-reload-frpc-configuration)
* [Get proxy status from client](#get-proxy-status-from-client)
* [Privilege Mode](#privilege-mode)
* [Port White List](#port-white-list)
* [TCP Stream Multiplexing](#tcp-stream-multiplexing)
* [Support KCP Protocol](#support-kcp-protocol)
* [Connection Pool](#connection-pool)
* [Rewriting the Host Header](#rewriting-the-host-header)
* [Get Real IP](#get-real-ip)
* [Password protecting your web service](#password-protecting-your-web-service)
* [Custom subdomain names](#custom-subdomain-names)
* [URL routing](#url-routing)
* [Connect frps by HTTP PROXY](#connect-frps-by-http-proxy)
* [Plugin](#plugin)
* [Development Plan](#development-plan)
* [Contributing](#contributing)
* [Donation](#donation)
* [AliPay](#alipay)
* [Wechat Pay](#wechat-pay)
* [Paypal](#paypal)
<!-- vim-markdown-toc -->
## What can I do with frp?
* Expose any http or https service behind a NAT or firewall to the internet through a server with a public IP address (name-based virtual host support).
* Expose any tcp or udp service behind a NAT or firewall to the internet through a server with a public IP address.
## Status
frp is under development; you can try it with the latest release version. The master branch is for stable releases, while the dev branch is used for development.
**We may change any protocol and cannot promise backward compatibility. Please check the release notes when upgrading.**
## Architecture

## Example Usage
First, download the latest programs from the [Release](https://github.com/fatedier/frp/releases) page according to your os and arch.
Put **frps** and **frps.ini** to your server with public IP.
Put **frpc** and **frpc.ini** to your server in LAN.
### Access your computer in LAN by SSH
1. Modify frps.ini:
```ini
# frps.ini
[common]
bind_port = 7000
```
2. Start frps:
`./frps -c ./frps.ini`
3. Modify frpc.ini, `server_addr` is your frps's server IP:
```ini
# frpc.ini
[common]
server_addr = x.x.x.x
server_port = 7000
[ssh]
type = tcp
local_ip = 127.0.0.1
local_port = 22
remote_port = 6000
```
4. Start frpc:
`./frpc -c ./frpc.ini`
5. Connect to the server in the LAN over ssh, assuming the username is test:
`ssh -oPort=6000 [email protected]`
### Visit your web service in LAN by custom domains
Sometimes we want to expose a local web service behind a NAT network to others for testing with our own domain name, but unfortunately we cannot resolve a domain name to a local IP.
However, we can expose an http or https service using frp.
1. Modify frps.ini, configure http port 8080:
```ini
# frps.ini
[common]
bind_port = 7000
vhost_http_port = 8080
```
2. Start frps:
`./frps -c ./frps.ini`
3. Modify frpc.ini and set remote frps server's IP as x.x.x.x. The `local_port` is the port of your web service:
```ini
# frpc.ini
[common]
server_addr = x.x.x.x
server_port = 7000
[web]
type = http
local_port = 80
custom_domains = www.yourdomain.com
```
4. Start frpc:
`./frpc -c ./frpc.ini`
5. Resolve an A record for `www.yourdomain.com` to IP `x.x.x.x`, or a CNAME record to your origin domain.
6. Now visit your local web service at `http://www.yourdomain.com:8080`.
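An https service can be exposed the same way. The sketch below is a hedged example: it assumes frps's `vhost_https_port` option and the frpc proxy type `https` (the section name `web_https` is arbitrary); check the full configuration files for exact parameter names.

```ini
# frps.ini -- add an https vhost port alongside the http one
[common]
bind_port = 7000
vhost_https_port = 8443

# frpc.ini -- forward a local https service by domain name
[web_https]
type = https
local_port = 443
custom_domains = www.yourdomain.com
```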
### Forward DNS query request
1. Modify frps.ini:
```ini
# frps.ini
[common]
bind_port = 7000
```
2. Start frps:
`./frps -c ./frps.ini`
3. Modify frpc.ini: set the remote frps server's IP as x.x.x.x and forward dns query requests to Google's DNS server `8.8.8.8:53`:
```ini
# frpc.ini
[common]
server_addr = x.x.x.x
server_port = 7000
[dns]
type = udp
local_ip = 8.8.8.8
local_port = 53
remote_port = 6000
```
4. Start frpc:
`./frpc -c ./frpc.ini`
5. Send a dns query request with dig:
`dig @x.x.x.x -p 6000 www.google.com`
### Forward unix domain socket
Expose a unix domain socket (for example, the Docker daemon socket) over a tcp port.
Configure frps same as above.
1. Start frpc with configurations:
```ini
# frpc.ini
[common]
server_addr = x.x.x.x
server_port = 7000
[unix_domain_socket]
type = tcp
remote_port = 6000
plugin = unix_domain_socket
plugin_unix_path = /var/run/docker.sock
```
2. Get the docker version with a curl command:
`curl http://x.x.x.x:6000/version`
### Expose a simple http file server
A simple way to access files in the LAN.
Configure frps same as above.
1. Start frpc with configurations:
```ini
# frpc.ini
[common]
server_addr = x.x.x.x
server_port = 7000
[test_static_file]
type = tcp
remote_port = 6000
plugin = static_file
plugin_local_path = /tmp/file
plugin_strip_prefix = static
plugin_http_user = abc
plugin_http_passwd = abc
```
2. Visit `http://x.x.x.x:6000/static/` in your browser and enter the correct username and password to see the files in `/tmp/file`.
### Expose your service in security
For some services, if expose them to the public network directly will be a security risk.
**stcp(secret tcp)** help you create a proxy avoiding any one can access it.
Configure frps same as above.
1. Start frpc, forwarding the ssh port; note that `remote_port` is not needed here:
```ini
# frpc.ini
[common]
server_addr = x.x.x.x
server_port = 7000
[secret_ssh]
type = stcp
sk = abcdefg
local_ip = 127.0.0.1
local_port = 22
```
2. Start another frpc on the machine from which you want to connect to this ssh server:
```ini
# frpc.ini
[common]
server_addr = x.x.x.x
server_port = 7000
[secret_ssh_visitor]
type = stcp
role = visitor
server_name = secret_ssh
sk = abcdefg
bind_addr = 127.0.0.1
bind_port = 6000
```
3. Connect to the server in the LAN over ssh, assuming the username is test:
`ssh -oPort=6000 [email protected]`
### P2P Mode
**xtcp** is designed for transmitting large amounts of data directly between two clients.
It cannot yet penetrate all types of NAT devices, so try **stcp** if **xtcp** doesn't work.
1. Configure a udp port for xtcp:
```ini
# frps.ini
[common]
bind_udp_port = 7001
```
2. Start frpc, forwarding the ssh port; note that `remote_port` is not needed here:
```ini
# frpc.ini
[common]
server_addr = x.x.x.x
server_port = 7000
[p2p_ssh]
type = xtcp
sk = abcdefg
local_ip = 127.0.0.1
local_port = 22
```
3. Start another frpc on the machine from which you want to connect to this ssh server:
```ini
# frpc.ini
[common]
server_addr = x.x.x.x
server_port = 7000
[p2p_ssh_visitor]
type = xtcp
role = visitor
server_name = p2p_ssh
sk = abcdefg
bind_addr = 127.0.0.1
bind_port = 6000
```
4. Connect to the server in the LAN over ssh, assuming the username is test:
`ssh -oPort=6000 [email protected]`
### Connect website through frpc's network
Configure frps same as above.
1. Start frpc with configurations:
```ini
# frpc.ini
[common]
server_addr = x.x.x.x
server_port = 7000
[http_proxy]
type = tcp
remote_port = 6000
plugin = http_proxy # or socks5
```
2. Set `x.x.x.x:6000` as the http or socks5 proxy in your browser and visit websites through frpc's network.
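For a socks5 proxy instead of http, only the `plugin` value changes. A minimal sketch (the section name `socks5_proxy` and the port are just examples):

```ini
# frpc.ini
[socks5_proxy]
type = tcp
remote_port = 6005
plugin = socks5
```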
## Features
### Configuration File
You can find features not mentioned in this document in the full example configuration files.
[frps full configuration file](./conf/frps_full.ini)
[frpc full configuration file](./conf/frpc_full.ini)
### Dashboard
Check frp's status and the proxies' statistics on the dashboard.
Configure a port for dashboard to enable this feature:
```ini
[common]
dashboard_port = 7500
# dashboard's username and password are both optional; if not set, both default to admin.
dashboard_user = admin
dashboard_pwd = admin
```
Then visit `http://[server_addr]:7500` to see the dashboard; the default username and password are both `admin`.

### Authentication
Since v0.10.0, you only need to set the same `privilege_token` in frps.ini and frpc.ini.
Note that the clock difference between the frpc and frps machines must not exceed 15 minutes, because a timestamp is used for authentication.
However, this timeout can be changed by setting `authentication_timeout` in frps's configuration file. Its default value is 900, meaning 15 minutes. If it is set to 0, frps will not check the authentication timeout.
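For example, both sides might carry the same token (the value below is an arbitrary placeholder):

```ini
# frps.ini and frpc.ini, must be the same
[common]
privilege_token = 12345678
```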
### Encryption and Compression
Encryption and compression are disabled by default; you can decide whether the proxy uses them:
```ini
# frpc.ini
[ssh]
type = tcp
local_port = 22
remote_port = 6000
use_encryption = true
use_compression = true
```
### Hot-Reload frpc configuration
First, set an admin port in frpc's configuration file so that it provides an HTTP API for more features.
```ini
# frpc.ini
[common]
admin_addr = 127.0.0.1
admin_port = 7400
```
Then run `frpc reload -c ./frpc.ini` and wait about 10 seconds for frpc to create, update, or delete proxies.
**Note that, except for 'start', parameters in the [common] section are not reloaded for now.**
### Get proxy status from client
Use `frpc status -c ./frpc.ini` to get the status of all proxies. The admin port must be set in frpc's configuration file.
### Privilege Mode
Privilege mode is the default and only supported mode in frp since v0.10.0. All proxy configurations are set on the client.
#### Port White List
`privilege_allow_ports` in frps.ini is used for preventing abuse of ports:
```ini
# frps.ini
[common]
privilege_allow_ports = 2000-3000,3001,3003,4000-50000
```
`privilege_allow_ports` consists of specific ports or port ranges (marked with `-`), separated by `,`.
### TCP Stream Multiplexing
frp supports tcp stream multiplexing since v0.10.0, similar to HTTP/2 multiplexing: all user requests to the same frpc share a single tcp connection.
You can disable this feature by modifying frps.ini and frpc.ini:
```ini
# frps.ini and frpc.ini, must be same
[common]
tcp_mux = false
```
### Support KCP Protocol
frp supports the kcp protocol since v0.12.0.
KCP is a fast and reliable protocol that can reduce average latency by 30% to 40% and maximum delay by a factor of three, at the cost of 10% to 20% more bandwidth than TCP.
To use kcp in frp:
1. Enable kcp protocol in frps:
```ini
# frps.ini
[common]
bind_port = 7000
# kcp needs to bind a udp port, it can be same with 'bind_port'
kcp_bind_port = 7000
```
2. Configure the protocol used in frpc to connect frps:
```ini
# frpc.ini
[common]
server_addr = x.x.x.x
# specify the 'kcp_bind_port' in frps
server_port = 7000
protocol = kcp
```
### Connection Pool
By default, frps sends a message to frpc to create a new connection to the backend service when it receives a user request. If a proxy's connection pool is enabled, a specified number of connections will be pre-established.
This feature is well suited to a large number of short connections.
1. Configure the limit of pool count each proxy can use in frps.ini:
```ini
# frps.ini
[common]
max_pool_count = 5
```
2. Enable the connection pool and specify its size in frpc.ini:
```ini
# frpc.ini
[common]
pool_count = 1
```
### Rewriting the Host Header
When forwarding to a local port, frp does not modify the tunneled HTTP requests at all, they are copied to your server byte-for-byte as they are received. Some application servers use the Host header for determining which development site to display. For this reason, frp can rewrite your requests with a modified Host header. Use the `host_header_rewrite` switch to rewrite incoming HTTP requests.
```ini
# frpc.ini
[web]
type = http
local_port = 80
custom_domains = test.yourdomain.com
host_header_rewrite = dev.yourdomain.com
```
If `host_header_rewrite` is specified, the Host header will be rewritten to match the hostname portion of the forwarding address.
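Conceptually, the rewrite replaces only the Host header of each tunneled request and leaves everything else untouched. A minimal illustrative sketch, not frp's actual implementation:

```python
def rewrite_host(raw_request: str, new_host: str) -> str:
    """Replace the Host header in a raw HTTP/1.1 request (illustration only)."""
    out = []
    for line in raw_request.split("\r\n"):
        if line.lower().startswith("host:"):
            out.append("Host: " + new_host)  # only this header is changed
        else:
            out.append(line)  # all other bytes pass through unmodified
    return "\r\n".join(out)

req = "GET / HTTP/1.1\r\nHost: test.yourdomain.com\r\nAccept: */*\r\n\r\n"
print(rewrite_host(req, "dev.yourdomain.com"))
```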
### Get Real IP
These features are for http proxies only.
You can get a user's real IP from the http request headers `X-Forwarded-For` and `X-Real-IP`.
**Note that these two headers are currently only present in the first request of each user connection.**
### Password protecting your web service
Anyone who can guess your tunnel URL can access your local web server unless you protect it with a password.
This enforces HTTP Basic Auth on all requests, with the username and password you specify in frpc's configuration file.
It can only be enabled when the proxy type is http.
```ini
# frpc.ini
[web]
type = http
local_port = 80
custom_domains = test.yourdomain.com
http_user = abc
http_pwd = abc
```
Visit `http://test.yourdomain.com` and you will now be asked for a username and password.
### Custom subdomain names
It is convenient to use the `subdomain` option for http and https types when many people share one frps server.
```ini
# frps.ini
[common]
subdomain_host = frps.com
```
Resolve `*.frps.com` to the frps server's IP.
```ini
# frpc.ini
[web]
type = http
local_port = 80
subdomain = test
```
Now you can visit your web service at `test.frps.com`.
Note that if `subdomain_host` is not empty, `custom_domains` should not be a subdomain of `subdomain_host`.
### URL routing
frp supports forwarding http requests to different backend web services by url routing.
`locations` specifies the URL prefixes used for routing. frps first searches for the most specific prefix location given by literal strings, regardless of their listed order.
```ini
# frpc.ini
[web01]
type = http
local_port = 80
custom_domains = web.yourdomain.com
locations = /
[web02]
type = http
local_port = 81
custom_domains = web.yourdomain.com
locations = /news,/about
```
Http requests with the url prefix `/news` or `/about` will be forwarded to **web02**, and all others to **web01**.
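The "most specific prefix" rule can be sketched as longest-prefix matching. This is an illustration of the routing idea, not frp's actual code:

```python
def route(path, locations):
    """Return the proxy whose configured location is the longest prefix of path."""
    best_proxy, best_len = None, -1
    for proxy, locs in locations.items():
        for loc in locs:
            # a longer matching prefix is more specific, order does not matter
            if path.startswith(loc) and len(loc) > best_len:
                best_proxy, best_len = proxy, len(loc)
    return best_proxy

table = {"web01": ["/"], "web02": ["/news", "/about"]}
print(route("/news/today", table))  # web02: '/news' is more specific than '/'
print(route("/contact", table))     # web01: only '/' matches
```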
### Connect frps by HTTP PROXY
frpc can connect to frps through an HTTP proxy if you set the `HTTP_PROXY` OS environment variable or configure the `http_proxy` parameter in frpc.ini.
This only works when the protocol is tcp.
```ini
# frpc.ini
[common]
server_addr = x.x.x.x
server_port = 7000
http_proxy = http://user:[email protected]:8080
```
### Range ports mapping
A proxy whose name has the prefix `range:` maps a range of ports.
```ini
# frpc.ini
[range:test_tcp]
type = tcp
local_ip = 127.0.0.1
local_port = 6000-6006,6007
remote_port = 6000-6006,6007
```
frpc will generate 8 proxies named `test_tcp_0, test_tcp_1 ... test_tcp_7`.
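The expansion can be sketched as follows (an illustration of the naming scheme, not frp's actual code):

```python
def expand_range_ports(spec):
    """Expand a port spec like '6000-6006,6007' into a list of ports."""
    ports = []
    for part in spec.split(","):
        if "-" in part:
            lo, hi = part.split("-")
            ports.extend(range(int(lo), int(hi) + 1))  # inclusive range
        else:
            ports.append(int(part))
    return ports

ports = expand_range_ports("6000-6006,6007")
# one generated proxy per port, suffixed with its index
names = ["test_tcp_%d" % i for i in range(len(ports))]
print(names[0], names[-1], len(names))  # test_tcp_0 test_tcp_7 8
```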
### Plugin
By default, frpc only forwards requests to local tcp or udp ports.
Plugins provide richer features. There are built-in plugins such as `unix_domain_socket`, `http_proxy`, `socks5` and `static_file`; see [example usage](#example-usage).
Specify which plugin to use with the `plugin` parameter. A plugin's configuration parameters start with `plugin_`. `local_ip` and `local_port` are ignored when a plugin is used.
Using plugin **http_proxy**:
```ini
# frpc.ini
[http_proxy]
type = tcp
remote_port = 6000
plugin = http_proxy
plugin_http_user = abc
plugin_http_passwd = abc
```
`plugin_http_user` and `plugin_http_passwd` are configuration parameters used by the `http_proxy` plugin.
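These credentials are presented with standard HTTP proxy Basic authentication, so a client sends a `Proxy-Authorization` header built from `user:password`. A quick sketch of how that header value is formed:

```python
import base64

def proxy_auth_header(user: str, password: str) -> str:
    """Build the Proxy-Authorization header value for HTTP Basic auth."""
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    return "Basic " + creds

print(proxy_auth_header("abc", "abc"))  # Basic YWJjOmFiYw==
```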
## Development Plan
* Log http request information in frps.
* Direct reverse proxy, like haproxy.
* Load balancing across different services in frpc.
* kubernetes ingress support.
## Contributing
Interested in getting involved? We would love your help!
* Take a look at our [issues list](https://github.com/fatedier/frp/issues) and consider sending a Pull Request to the **dev branch**.
* If you want to add a new feature, please create an issue first to describe the feature and the implementation approach. Once the proposal is accepted, implement the feature and submit it as a pull request.
* Improvements to this document are welcome, even simple typo fixes.
* If you have some wonderful ideas, send an email to [email protected].
**Note: We prefer that you give your advice in [issues](https://github.com/fatedier/frp/issues), so that others with the same question can find it quickly and we don't need to answer repeatedly.**
## Donation
If frp helps you a lot, you can support us by:
frp QQ group: 606194980
### AliPay

### Wechat Pay

### Paypal
Donate money by [paypal](https://www.paypal.me/fatedier) to my account **[email protected]**.
<?php
namespace Blog\Form;
// Filename: /module/Blog/src/Blog/Form/ParentsFieldset.php
use Blog\Entity\ChildParent;
use Zend\Form\Fieldset;
use Zend\InputFilter\InputFilterProviderInterface;
use DoctrineORMModule\Stdlib\Hydrator\DoctrineEntity as DoctrineHydrator;
use Zend\ServiceManager\ServiceManager;
class ParentsFieldset extends Fieldset implements InputFilterProviderInterface
{
public function __construct(ServiceManager $serviceManager)
{
parent::__construct('childparent');
$entityManager = $serviceManager->get('Doctrine\ORM\EntityManager');
$this->setHydrator(new DoctrineHydrator($entityManager, 'Blog\Entity\ChildParent'))
->setObject(new ChildParent());
$this->add(array(
'type' => 'hidden',
'name' => 'id'
));
$this->add(array(
'type' => 'text',
'name' => 'name',
'options' => array(
'label' => 'Parent Name'
)
));
$this->add(array(
'type' => 'Zend\Form\Element\Collection',
'name' => 'children',
'options' => array(
'should_create_template' => true,
'object_manager' => $serviceManager,
'allow_add' => true,
'target_element' => new ChildrenFieldset($serviceManager),
),
'attributes' => array('class' => 'childrenSet')
));
}
/**
* @return array
*/
public function getInputFilterSpecification()
{
return array(
'name' => array(
'required' => true,
),
);
}
}
?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.google.api.grpc</groupId>
<artifactId>proto-google-cloud-video-intelligence-v1p2beta1</artifactId>
<version>0.96.0-SNAPSHOT</version><!-- {x-version-update:proto-google-cloud-video-intelligence-v1p2beta1:current} -->
<name>proto-google-cloud-video-intelligence-v1p2beta1</name>
<description>PROTO library for proto-google-cloud-video-intelligence-v1p2beta1</description>
<parent>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-video-intelligence-parent</artifactId>
<version>2.6.0-SNAPSHOT</version><!-- {x-version-update:google-cloud-video-intelligence:current} -->
</parent>
<dependencies>
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java</artifactId>
</dependency>
<dependency>
<groupId>com.google.api.grpc</groupId>
<artifactId>proto-google-common-protos</artifactId>
</dependency>
</dependencies>
</project>
namespace Google.Cloud.AppEngine.V1.Tests
{
/// <summary>Generated unit tests.</summary>
public sealed class GeneratedAuthorizedDomainsClientTest
{
}
}
angularFiles = {
'angularSrc': [
'src/minErr.js',
'src/Angular.js',
'src/loader.js',
'src/AngularPublic.js',
'src/jqLite.js',
'src/apis.js',
'src/auto/injector.js',
'src/ng/anchorScroll.js',
'src/ng/animate.js',
'src/ng/browser.js',
'src/ng/cacheFactory.js',
'src/ng/compile.js',
'src/ng/controller.js',
'src/ng/document.js',
'src/ng/exceptionHandler.js',
'src/ng/http.js',
'src/ng/httpBackend.js',
'src/ng/interpolate.js',
'src/ng/interval.js',
'src/ng/locale.js',
'src/ng/location.js',
'src/ng/log.js',
'src/ng/parse.js',
'src/ng/q.js',
'src/ng/rootScope.js',
'src/ng/sce.js',
'src/ng/sniffer.js',
'src/ng/timeout.js',
'src/ng/urlUtils.js',
'src/ng/window.js',
'src/ng/filter.js',
'src/ng/filter/filter.js',
'src/ng/filter/filters.js',
'src/ng/filter/limitTo.js',
'src/ng/filter/orderBy.js',
'src/ng/directive/directives.js',
'src/ng/directive/a.js',
'src/ng/directive/booleanAttrs.js',
'src/ng/directive/form.js',
'src/ng/directive/input.js',
'src/ng/directive/ngBind.js',
'src/ng/directive/ngClass.js',
'src/ng/directive/ngCloak.js',
'src/ng/directive/ngController.js',
'src/ng/directive/ngCsp.js',
'src/ng/directive/ngEventDirs.js',
'src/ng/directive/ngIf.js',
'src/ng/directive/ngInclude.js',
'src/ng/directive/ngInit.js',
'src/ng/directive/ngNonBindable.js',
'src/ng/directive/ngPluralize.js',
'src/ng/directive/ngRepeat.js',
'src/ng/directive/ngShowHide.js',
'src/ng/directive/ngStyle.js',
'src/ng/directive/ngSwitch.js',
'src/ng/directive/ngTransclude.js',
'src/ng/directive/script.js',
'src/ng/directive/select.js',
'src/ng/directive/style.js'
],
'angularLoader': [
'src/loader.js'
],
'angularModules': {
'ngAnimate': [
'src/ngAnimate/animate.js'
],
'ngCookies': [
'src/ngCookies/cookies.js'
],
'ngResource': [
'src/ngResource/resource.js'
],
'ngRoute': [
'src/ngRoute/route.js',
'src/ngRoute/routeParams.js',
'src/ngRoute/directive/ngView.js'
],
'ngSanitize': [
'src/ngSanitize/sanitize.js',
'src/ngSanitize/filter/linky.js'
],
'ngMock': [
'src/ngMock/angular-mocks.js'
],
'ngTouch': [
'src/ngTouch/touch.js',
'src/ngTouch/swipe.js',
'src/ngTouch/directive/ngClick.js',
'src/ngTouch/directive/ngSwipe.js'
],
},
'angularScenario': [
'src/ngScenario/Scenario.js',
'src/ngScenario/browserTrigger.js',
'src/ngScenario/Application.js',
'src/ngScenario/Describe.js',
'src/ngScenario/Future.js',
'src/ngScenario/ObjectModel.js',
'src/ngScenario/Runner.js',
'src/ngScenario/SpecRunner.js',
'src/ngScenario/dsl.js',
'src/ngScenario/matchers.js',
'src/ngScenario/output/Html.js',
'src/ngScenario/output/Json.js',
'src/ngScenario/output/Xml.js',
'src/ngScenario/output/Object.js'
],
'angularTest': [
'test/helpers/*.js',
'test/ngScenario/*.js',
'test/ngScenario/output/*.js',
'test/*.js',
'test/auto/*.js',
'test/ng/**/*.js',
'test/ngAnimate/*.js',
'test/ngCookies/*.js',
'test/ngResource/*.js',
'test/ngRoute/**/*.js',
'test/ngSanitize/**/*.js',
'test/ngMock/*.js',
'test/ngTouch/**/*.js'
],
'karma': [
'bower_components/jquery/jquery.js',
'test/jquery_remove.js',
'@angularSrc',
'src/publishExternalApis.js',
'@angularSrcModules',
'@angularScenario',
'@angularTest',
'example/personalLog/*.js',
'example/personalLog/test/*.js'
],
'karmaExclude': [
'test/jquery_alias.js',
'src/angular-bootstrap.js',
'src/ngScenario/angular-bootstrap.js'
],
'karmaScenario': [
'build/angular-scenario.js',
'build/docs/docs-scenario.js'
],
"karmaModules": [
'build/angular.js',
'@angularSrcModules',
'src/ngScenario/browserTrigger.js',
'test/helpers/*.js',
'test/ngMock/*.js',
'test/ngCookies/*.js',
'test/ngRoute/**/*.js',
'test/ngResource/*.js',
'test/ngSanitize/**/*.js',
'test/ngTouch/**/*.js'
],
'karmaJquery': [
'bower_components/jquery/jquery.js',
'test/jquery_alias.js',
'@angularSrc',
'src/publishExternalApis.js',
'@angularSrcModules',
'@angularScenario',
'@angularTest',
'example/personalLog/*.js',
'example/personalLog/test/*.js'
],
'karmaJqueryExclude': [
'src/angular-bootstrap.js',
'src/ngScenario/angular-bootstrap.js',
'test/jquery_remove.js'
]
};
angularFiles['angularSrcModules'] = [].concat(
angularFiles['angularModules']['ngAnimate'],
angularFiles['angularModules']['ngCookies'],
angularFiles['angularModules']['ngResource'],
angularFiles['angularModules']['ngRoute'],
angularFiles['angularModules']['ngSanitize'],
angularFiles['angularModules']['ngMock'],
angularFiles['angularModules']['ngTouch']
);
if (exports) {
exports.files = angularFiles;
exports.mergeFilesFor = function() {
var files = [];
Array.prototype.slice.call(arguments, 0).forEach(function(filegroup) {
angularFiles[filegroup].forEach(function(file) {
// replace @ref
var match = file.match(/^\@(.*)/);
if (match) {
files = files.concat(angularFiles[match[1]]);
} else {
files.push(file);
}
});
});
return files;
};
}
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="com.bedrock.padder.fragment.PresetStoreInstalledFragment">
<android.support.v4.widget.NestedScrollView
android:id="@+id/layout_installed_nested_scrollview"
android:layout_width="match_parent"
android:layout_height="match_parent"
app:layout_behavior="@string/appbar_scrolling_view_behavior">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="vertical">
<RelativeLayout
android:id="@+id/layout_installed_preset_store"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:focusable="true"
android:focusableInTouchMode="true">
<!-- focusableInTouchMode for disable auto-scroll to recyclerView -->
<android.support.v7.widget.RecyclerView
android:id="@+id/layout_installed_preset_store_recycler_view"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:paddingBottom="64dp"
android:scrollbars="none"
android:visibility="visible"/>
<RelativeLayout
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerHorizontal="true"
android:layout_marginTop="100dp">
<LinearLayout
android:id="@+id/layout_installed_preset_store_recycler_view_no_preset"
android:layout_width="wrap_content"
android:layout_height="match_parent"
android:gravity="center_horizontal"
android:orientation="vertical"
android:visibility="gone">
<ImageView
android:layout_width="64dp"
android:layout_height="64dp"
android:layout_marginBottom="8dp"
android:alpha="0.5"
app:srcCompat="@drawable/ic_find_in_page_black"/>
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/preset_store_loading_no_preset"
android:textColor="@color/dark_secondary"
android:textSize="18sp"/>
</LinearLayout>
</RelativeLayout>
</RelativeLayout>
<View
android:id="@+id/layout_installed_margin"
android:layout_width="match_parent"
android:layout_height="0dp"
android:background="@color/transparent"/>
</LinearLayout>
</android.support.v4.widget.NestedScrollView>
</FrameLayout>
namespace ash::secure_channel {
const char kTestRemoteDeviceId[] = "testRemoteDeviceId";
const char kTestLocalDeviceId[] = "testLocalDeviceId";
constexpr const ConnectionPriority kTestConnectionPriority =
ConnectionPriority::kLow;
class SecureChannelBleInitiatorOperationTest : public testing::Test {
public:
SecureChannelBleInitiatorOperationTest(
const SecureChannelBleInitiatorOperationTest&) = delete;
SecureChannelBleInitiatorOperationTest& operator=(
const SecureChannelBleInitiatorOperationTest&) = delete;
protected:
SecureChannelBleInitiatorOperationTest()
: device_id_pair_(kTestRemoteDeviceId, kTestLocalDeviceId) {}
~SecureChannelBleInitiatorOperationTest() override = default;
// testing::Test:
void SetUp() override {
fake_ble_connection_manager_ = std::make_unique<FakeBleConnectionManager>();
auto test_task_runner = base::MakeRefCounted<base::TestSimpleTaskRunner>();
operation_ = BleInitiatorOperation::Factory::Create(
fake_ble_connection_manager_.get(),
base::BindOnce(&SecureChannelBleInitiatorOperationTest::
OnSuccessfulConnectionAttempt,
base::Unretained(this)),
base::BindRepeating(
&SecureChannelBleInitiatorOperationTest::OnFailedConnectionAttempt,
base::Unretained(this)),
device_id_pair_, kTestConnectionPriority, test_task_runner);
test_task_runner->RunUntilIdle();
EXPECT_EQ(kTestConnectionPriority,
fake_ble_connection_manager_->GetPriorityForAttempt(
device_id_pair_, ConnectionRole::kInitiatorRole));
}
const DeviceIdPair& device_id_pair() { return device_id_pair_; }
void FailAttempt(BleInitiatorFailureType failure_type) {
fake_ble_connection_manager_->NotifyBleInitiatorFailure(device_id_pair_,
failure_type);
EXPECT_EQ(failure_type, failure_type_from_callback_);
}
void CompleteAttemptSuccessfully() {
auto fake_authenticated_channel =
std::make_unique<FakeAuthenticatedChannel>();
FakeAuthenticatedChannel* fake_authenticated_channel_raw =
fake_authenticated_channel.get();
fake_ble_connection_manager_->NotifyConnectionSuccess(
device_id_pair(), ConnectionRole::kInitiatorRole,
std::move(fake_authenticated_channel));
EXPECT_EQ(fake_authenticated_channel_raw, channel_from_callback_.get());
// The operation should no longer be present in BleConnectionManager.
EXPECT_FALSE(fake_ble_connection_manager()->DoesAttemptExist(
device_id_pair(), ConnectionRole::kInitiatorRole));
}
FakeBleConnectionManager* fake_ble_connection_manager() {
return fake_ble_connection_manager_.get();
}
ConnectToDeviceOperation<BleInitiatorFailureType>* operation() {
return operation_.get();
}
private:
void OnSuccessfulConnectionAttempt(
std::unique_ptr<AuthenticatedChannel> authenticated_channel) {
EXPECT_FALSE(channel_from_callback_);
channel_from_callback_ = std::move(authenticated_channel);
}
void OnFailedConnectionAttempt(BleInitiatorFailureType failure_type) {
failure_type_from_callback_ = failure_type;
}
const base::test::TaskEnvironment task_environment_;
std::unique_ptr<FakeBleConnectionManager> fake_ble_connection_manager_;
DeviceIdPair device_id_pair_;
std::unique_ptr<AuthenticatedChannel> channel_from_callback_;
absl::optional<BleInitiatorFailureType> failure_type_from_callback_;
std::unique_ptr<ConnectToDeviceOperation<BleInitiatorFailureType>> operation_;
};
TEST_F(SecureChannelBleInitiatorOperationTest, UpdateThenFail) {
operation()->UpdateConnectionPriority(ConnectionPriority::kMedium);
EXPECT_EQ(ConnectionPriority::kMedium,
fake_ble_connection_manager()->GetPriorityForAttempt(
device_id_pair(), ConnectionRole::kInitiatorRole));
static const BleInitiatorFailureType all_types[] = {
BleInitiatorFailureType::kAuthenticationError,
BleInitiatorFailureType::kGattConnectionError,
BleInitiatorFailureType::kInterruptedByHigherPriorityConnectionAttempt,
BleInitiatorFailureType::kTimeoutContactingRemoteDevice,
BleInitiatorFailureType::kCouldNotGenerateAdvertisement};
for (const auto& failure_type : all_types) {
FailAttempt(failure_type);
// After failure, the attempt should still be present in
// BleConnectionManager.
EXPECT_EQ(ConnectionPriority::kMedium,
fake_ble_connection_manager()->GetPriorityForAttempt(
device_id_pair(), ConnectionRole::kInitiatorRole));
}
operation()->Cancel();
EXPECT_FALSE(fake_ble_connection_manager()->DoesAttemptExist(
device_id_pair(), ConnectionRole::kInitiatorRole));
}
TEST_F(SecureChannelBleInitiatorOperationTest, UpdateThenSucceed) {
operation()->UpdateConnectionPriority(ConnectionPriority::kMedium);
EXPECT_EQ(ConnectionPriority::kMedium,
fake_ble_connection_manager()->GetPriorityForAttempt(
device_id_pair(), ConnectionRole::kInitiatorRole));
CompleteAttemptSuccessfully();
}
} // namespace ash::secure_channel
class AddShowPropertyToSpreeProductProperties < ActiveRecord::Migration[6.0]
def change
add_column :spree_product_properties, :show_property, :boolean, default: true unless column_exists?(:spree_product_properties, :show_property)
end
end
namespace cc {
class Layer;
class NinePatchLayer;
class SolidColorLayer;
}
namespace gfx {
class Size;
}
namespace ui {
class ResourceManager;
}
namespace android {
class ContentLayer;
class TabContentManager;
class ToolbarLayer;
// Sub-layer tree representation of a tab. A TabLayer is not tied to a
// specific tab. To specialize it, call CustomizeForId() and SetProperties().
class TabLayer : public Layer {
public:
static scoped_refptr<TabLayer> Create(bool incognito,
ui::ResourceManager* resource_manager,
TabContentManager* tab_content_manager);
TabLayer(const TabLayer&) = delete;
TabLayer& operator=(const TabLayer&) = delete;
// TODO(meiliang): This method needs another parameter, a resource that can be
// used to indicate the currently selected tab for the TabLayer.
void SetProperties(int id,
const std::vector<int>& ids,
bool can_use_live_layer,
int toolbar_resource_id,
int shadow_resource_id,
int contour_resource_id,
int border_resource_id,
int border_inner_shadow_resource_id,
int default_background_color,
float x,
float y,
float width,
float height,
float shadow_x,
float shadow_y,
float shadow_width,
float shadow_height,
float alpha,
float border_alpha,
float border_inner_shadow_alpha,
float contour_alpha,
float shadow_alpha,
float border_scale,
float saturation,
float brightness,
float static_to_view_blend,
float content_width,
float content_height,
float view_width,
bool show_toolbar,
int default_theme_color,
int toolbar_background_color,
bool anonymize_toolbar,
int toolbar_textbox_resource_id,
int toolbar_textbox_background_color,
float toolbar_alpha,
float toolbar_y_offset,
float content_offset,
float side_border_scale,
bool inset_border);
bool is_incognito() const { return incognito_; }
scoped_refptr<cc::Layer> layer() override;
static void ComputePaddingPositions(const gfx::Size& content_size,
const gfx::Size& desired_size,
gfx::Rect* side_padding_rect,
gfx::Rect* bottom_padding_rect);
protected:
TabLayer(bool incognito,
ui::ResourceManager* resource_manager,
TabContentManager* tab_content_manager);
~TabLayer() override;
private:
void SetContentProperties(int id,
const std::vector<int>& tab_ids,
bool can_use_live_layer,
float static_to_view_blend,
bool should_override_content_alpha,
float content_alpha_override,
float saturation,
bool should_clip,
const gfx::Rect& clip,
ui::NinePatchResource* inner_shadow_resource,
float inner_shadow_alpha);
const bool incognito_;
raw_ptr<ui::ResourceManager> resource_manager_;
raw_ptr<TabContentManager> tab_content_manager_;
// [layer]-+-[toolbar]
// +-[front border]
// +-[content]
// +-[padding]
// +-[contour_shadow]
// +-[shadow]
scoped_refptr<cc::Layer> layer_;
scoped_refptr<ToolbarLayer> toolbar_layer_;
scoped_refptr<ContentLayer> content_;
scoped_refptr<cc::SolidColorLayer> side_padding_;
scoped_refptr<cc::SolidColorLayer> bottom_padding_;
scoped_refptr<cc::NinePatchLayer> front_border_;
scoped_refptr<cc::NinePatchLayer> front_border_inner_shadow_;
scoped_refptr<cc::NinePatchLayer> contour_shadow_;
scoped_refptr<cc::NinePatchLayer> shadow_;
float brightness_;
};
} // namespace android
#endif // CHROME_BROWSER_ANDROID_COMPOSITOR_LAYER_TAB_LAYER_H_
<?xml version="1.0" encoding="utf-8"?>
<doctrine-mapping xmlns="http://doctrine-project.org/schemas/orm/doctrine-mapping" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://doctrine-project.org/schemas/orm/doctrine-mapping http://doctrine-project.org/schemas/orm/doctrine-mapping.xsd">
<entity name="ABC\Admin\CourseAdminBundle\Entity\Teaches" table="teaches">
<indexes>
<index name="course_id" columns="course_id"/>
<index name="session_id" columns="session_id"/>
<index name="IDX_C7F19643B70FF80C" columns="rp_id"/>
</indexes>
<id name="rp" association-key="true"/>
<id name="session" association-key="true"/>
<one-to-one field="rp" target-entity="Resourceperson">
<join-columns>
<join-column name="rp_id" referenced-column-name="rp_id"/>
</join-columns>
</one-to-one>
<one-to-one field="session" target-entity="Session">
<join-columns>
<join-column name="session_id" referenced-column-name="session_id"/>
</join-columns>
</one-to-one>
<many-to-one field="course" target-entity="Course">
<join-columns>
<join-column name="course_id" referenced-column-name="course_id"/>
</join-columns>
</many-to-one>
</entity>
</doctrine-mapping>
include(CMakeParseArguments)
if(NOT DEFINED Tensile_ROOT)
# Compute the installation prefix relative to this file.
get_filename_component(Tensile_PREFIX "${CMAKE_CURRENT_LIST_FILE}" PATH)
get_filename_component(Tensile_PREFIX "${Tensile_PREFIX}" PATH)
if (WIN32)
execute_process(COMMAND "${Tensile_PREFIX}/bin/TensileGetPath.exe" OUTPUT_VARIABLE Tensile_ROOT)
else()
execute_process(COMMAND "${Tensile_PREFIX}/bin/TensileGetPath" OUTPUT_VARIABLE Tensile_ROOT)
endif()
endif()
list(APPEND CMAKE_MODULE_PATH "${Tensile_ROOT}/Source/cmake/")
list(APPEND CMAKE_MODULE_PATH "${Tensile_ROOT}/Source/")
if("HIP" IN_LIST Tensile_FIND_COMPONENTS)
set(TENSILE_USE_HIP ON CACHE BOOL "Use HIP")
else()
set(TENSILE_USE_HIP OFF CACHE BOOL "Use HIP")
endif()
if("LLVM" IN_LIST Tensile_FIND_COMPONENTS)
set(TENSILE_USE_LLVM ON CACHE BOOL "Use LLVM")
else()
set(TENSILE_USE_LLVM OFF CACHE BOOL "Use LLVM")
endif()
if("Client" IN_LIST Tensile_FIND_COMPONENTS)
if(TENSILE_USE_HIP AND TENSILE_USE_LLVM)
set(TENSILE_BUILD_CLIENT ON CACHE BOOL "Build Client")
elseif(Tensile_FIND_REQUIRED_Client)
    message("Tensile client requires both HIP and LLVM.")
set(Tensile_FOUND false)
else()
set(TENSILE_BUILD_CLIENT OFF CACHE BOOL "Build Client")
endif()
else()
set(TENSILE_BUILD_CLIENT OFF CACHE BOOL "Build Client")
endif()
if("STATIC_ONLY" IN_LIST Tensile_FIND_COMPONENTS)
set(TENSILE_STATIC_ONLY ON CACHE BOOL "Disable exporting symbols from shared library.")
else()
set(TENSILE_STATIC_ONLY OFF CACHE BOOL "Disable exporting symbols from shared library.")
endif()
add_subdirectory("${Tensile_ROOT}/Source" "Tensile")
# Target is created for copying dependencies
function(TensileCreateCopyTarget
Target_NAME
Tensile_OBJECTS_TO_COPY
Dest_PATH
)
file(MAKE_DIRECTORY "${Dest_PATH}")
add_custom_target(
${Target_NAME} ALL
COMMENT "${Target_NAME}: Copying tensile objects to ${Dest_PATH}"
DEPENDS ${Tensile_OBJECTS_TO_COPY}
)
foreach(OBJECT_TO_COPY ${Tensile_OBJECTS_TO_COPY})
add_custom_command(
TARGET ${Target_NAME} PRE_BUILD
COMMAND_EXPAND_LISTS
COMMAND ${CMAKE_COMMAND} -E copy_if_different ${OBJECT_TO_COPY} ${Dest_PATH}
DEPENDS ${OBJECT_TO_COPY}
)
endforeach()
endfunction()
# Output target: ${Tensile_VAR_PREFIX}_LIBRARY_TARGET. Ensures that the libs get built in Tensile_OUTPUT_PATH/library.
# Output symbol: ${Tensile_VAR_PREFIX}_ALL_FILES. List of full paths of all expected library files in manifest.
function(TensileCreateLibraryFiles
Tensile_LOGIC_PATH
Tensile_OUTPUT_PATH
)
# Boolean options
set(options
MERGE_FILES
NO_MERGE_FILES
SHORT_FILE_NAMES
PRINT_DEBUG
GENERATE_PACKAGE
SEPARATE_ARCHITECTURES
LAZY_LIBRARY_LOADING
)
# Single value settings
set(oneValueArgs
CODE_OBJECT_VERSION
COMPILER
COMPILER_PATH
EMBED_KEY
EMBED_LIBRARY
LIBRARY_FORMAT
TENSILE_ROOT
VAR_PREFIX
CPU_THREADS
)
# Multi value settings
set(multiValueArgs
ARCHITECTURE
)
cmake_parse_arguments(Tensile "${options}" "${oneValueArgs}" "${multiValueArgs}" ${ARGN})
if(Tensile_UNPARSED_ARGUMENTS)
message(WARNING "Unrecognized arguments: ${Tensile_UNPARSED_ARGUMENTS}")
endif()
if(Tensile_KEYWORDS_MISSING_VALUES)
message(WARNING "Malformed arguments: ${Tensile_KEYWORDS_MISSING_VALUES}")
endif()
# Parse incoming options
if(Tensile_TENSILE_ROOT)
set(Script "${Tensile_TENSILE_ROOT}/bin/TensileCreateLibrary")
else()
set(Script "${Tensile_ROOT}/bin/TensileCreateLibrary")
endif()
message(STATUS "Tensile script: ${Script}")
# Older NO_MERGE_FILES flag overrides MERGE_FILES option.
if(Tensile_NO_MERGE_FILES)
set(Tensile_MERGE_FILES FALSE)
endif()
if(Tensile_MERGE_FILES)
set(Options ${Options} "--merge-files")
else()
set(Options ${Options} "--no-merge-files")
endif()
if(Tensile_SEPARATE_ARCHITECTURES)
set(Options ${Options} "--separate-architectures")
endif()
if(Tensile_LAZY_LIBRARY_LOADING)
set(Options ${Options} "--lazy-library-loading")
endif()
if(Tensile_GENERATE_PACKAGE)
set(Options ${Options} "--package-library")
endif()
if(Tensile_SHORT_FILE_NAMES)
set(Options ${Options} "--short-file-names")
else()
set(Options ${Options} "--no-short-file-names")
endif()
if(Tensile_PRINT_DEBUG)
set(Options ${Options} "--library-print-debug")
else()
set(Options ${Options} "--no-library-print-debug")
endif()
if(Tensile_EMBED_LIBRARY)
set(Options ${Options} "--embed-library=${Tensile_EMBED_LIBRARY}")
endif()
if(Tensile_EMBED_KEY)
set(Options ${Options} "--embed-library-key=${Tensile_EMBED_KEY}")
endif()
if(Tensile_CODE_OBJECT_VERSION)
set(Options ${Options} "--code-object-version=${Tensile_CODE_OBJECT_VERSION}")
endif()
if(Tensile_COMPILER)
set(Options ${Options} "--cxx-compiler=${Tensile_COMPILER}")
endif()
if(Tensile_COMPILER_PATH)
set(Options ${Options} "--cmake-cxx-compiler=${Tensile_COMPILER_PATH}")
endif()
if(Tensile_CPU_THREADS)
set(Options ${Options} "--jobs=${Tensile_CPU_THREADS}")
endif()
if(Tensile_LIBRARY_FORMAT)
set(Options ${Options} "--library-format=${Tensile_LIBRARY_FORMAT}")
if(Tensile_LIBRARY_FORMAT MATCHES "yaml")
target_compile_definitions( TensileHost PUBLIC -DTENSILE_YAML=1)
endif()
endif()
if(Tensile_ARCHITECTURE)
string (REPLACE ";" "_" archString "${Tensile_ARCHITECTURE}")
    # Use "_" as the separator to avoid CMake interpreting ";" as a list
    # separator; TensileCreateLibrary decodes either ";" or "_".
set(Options ${Options} "--architecture=${archString}")
endif()
if (WIN32)
set(CommandLine ${VIRTUALENV_BIN_DIR}/${VIRTUALENV_PYTHON_EXENAME} ${Script} ${Options} ${Tensile_LOGIC_PATH} ${Tensile_OUTPUT_PATH} HIP)
else()
set(CommandLine ${Script} ${Options} ${Tensile_LOGIC_PATH} ${Tensile_OUTPUT_PATH} HIP)
endif()
message(STATUS "Tensile_CREATE_COMMAND: ${CommandLine}")
if(Tensile_EMBED_LIBRARY)
set(Tensile_EMBED_LIBRARY_SOURCE "${Tensile_OUTPUT_PATH}/library/${Tensile_EMBED_LIBRARY}.cpp")
endif()
if($ENV{TENSILE_SKIP_LIBRARY})
message(STATUS "Skipping build of ${Tensile_OUTPUT_PATH}")
else()
if(NOT Tensile_VAR_PREFIX)
set(Tensile_VAR_PREFIX TENSILE)
endif()
set(Tensile_MANIFEST_FILE_PATH "${Tensile_OUTPUT_PATH}/library/TensileManifest.txt")
message(STATUS "Tensile_MANIFEST_FILE_PATH: ${Tensile_MANIFEST_FILE_PATH}")
if($ENV{ENABLE_ADDRESS_SANITIZER})
# Must populate LD_PRELOAD with ASAN runtime if ASAN is being used.
# Find the ASAN RT with compiler and update env for Tensile call.
execute_process(
COMMAND ${CMAKE_CXX_COMPILER} --print-file-name=libclang_rt.asan-x86_64.so
OUTPUT_VARIABLE ASAN_LIB_PATH
COMMAND_ECHO STDOUT)
string(STRIP ${ASAN_LIB_PATH} ASAN_LIB_PATH)
set(CommandLine env LD_PRELOAD=${ASAN_LIB_PATH} ${CommandLine})
endif()
# Create the manifest file of the output libraries.
set(Tensile_CREATE_MANIFEST_COMMAND ${CommandLine} "--generate-manifest-and-exit")
execute_process(
COMMAND ${Tensile_CREATE_MANIFEST_COMMAND}
RESULT_VARIABLE Tensile_CREATE_MANIFEST_RESULT
COMMAND_ECHO STDOUT)
if(Tensile_CREATE_MANIFEST_RESULT OR (NOT EXISTS ${Tensile_MANIFEST_FILE_PATH}))
message(FATAL_ERROR "Error creating Tensile library: ${Tensile_CREATE_MANIFEST_RESULT}")
endif()
# Defer the actual call of the TensileCreateLibraries to 'make' time as needed.
# Read the manifest file and declare the files as expected output.
file(STRINGS ${Tensile_MANIFEST_FILE_PATH} Tensile_MANIFEST_CONTENTS)
add_custom_command(
COMMENT "Generating Tensile Libraries"
OUTPUT ${Tensile_EMBED_LIBRARY_SOURCE};${Tensile_MANIFEST_CONTENTS}
COMMAND ${CommandLine}
)
set("${Tensile_VAR_PREFIX}_ALL_FILES" ${Tensile_MANIFEST_CONTENTS} PARENT_SCOPE)
# Create a chained library build target.
# We've declared the manifest contents as output of the custom
# command above which builds the tensile libs. Now create a
# target dependency on those files so that we force the custom
# command to be invoked at build time, not cmake time.
TensileCreateCopyTarget(
"${Tensile_VAR_PREFIX}_LIBRARY_TARGET"
"${Tensile_MANIFEST_CONTENTS}"
"${Tensile_OUTPUT_PATH}/library"
)
endif()
if(Tensile_EMBED_LIBRARY)
add_library(${Tensile_EMBED_LIBRARY} ${Tensile_EMBED_LIBRARY_SOURCE})
target_link_libraries(${Tensile_EMBED_LIBRARY} PUBLIC TensileHost)
endif()
endfunction()
license: >
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
title: FileSystem
---
# FileSystem

This object represents a file system.

## Properties

* **name**: the name of the file system. *(DOMString)*
* **root**: the root directory of the file system. *(DirectoryEntry)*

## Details

The `FileSystem` object provides information about the file system. The name of the file system is unique across the list of exposed file systems. The root property holds a [DirectoryEntry](../directoryentry/directoryentry.html) object that represents the file system's root directory.
## Supported Platforms

* Android
* BlackBerry WebWorks (OS 5.0 and higher)
* iOS
* Windows Phone 7 and 8
* Windows 8
## Quick Example

    function onSuccess(fileSystem) {
        console.log(fileSystem.name);
        console.log(fileSystem.root.name);
    }

    // request the persistent file system
    window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, onSuccess, null);
## Full Example

    <!DOCTYPE html>
    <html>
      <head>
        <title>File System Example</title>

        <script type="text/javascript" charset="utf-8" src="cordova.js"></script>
        <script type="text/javascript" charset="utf-8">

        // Wait for device API libraries to load
        //
        document.addEventListener("deviceready", onDeviceReady, false);

        // device APIs are available
        //
        function onDeviceReady() {
            window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, onFileSystemSuccess, fail);
        }

        function onFileSystemSuccess(fileSystem) {
            console.log(fileSystem.name);
            console.log(fileSystem.root.name);
        }

        function fail(evt) {
            console.log(evt.target.error.code);
        }

        </script>
      </head>
      <body>
        <h1>Example</h1>
        <p>File System</p>
      </body>
    </html>
from thrift.Thrift import TType, TMessageType, TFrozenDict, TException, TApplicationException
from thrift.protocol.TProtocol import TProtocolException
from thrift.TRecursive import fix_spec
import sys
import logging
from .ttypes import *
from thrift.Thrift import TProcessor
from thrift.transport import TTransport
all_structs = []
class Iface(object):
def execute(self, functionName, funcArgs):
"""
Parameters:
- functionName
- funcArgs
"""
pass
class Client(Iface):
def __init__(self, iprot, oprot=None):
self._iprot = self._oprot = iprot
if oprot is not None:
self._oprot = oprot
self._seqid = 0
def execute(self, functionName, funcArgs):
"""
Parameters:
- functionName
- funcArgs
"""
self.send_execute(functionName, funcArgs)
return self.recv_execute()
def send_execute(self, functionName, funcArgs):
self._oprot.writeMessageBegin('execute', TMessageType.CALL, self._seqid)
args = execute_args()
args.functionName = functionName
args.funcArgs = funcArgs
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_execute(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = execute_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
if result.aze is not None:
raise result.aze
raise TApplicationException(TApplicationException.MISSING_RESULT, "execute failed: unknown result")
class Processor(Iface, TProcessor):
def __init__(self, handler):
self._handler = handler
self._processMap = {}
self._processMap["execute"] = Processor.process_execute
def process(self, iprot, oprot):
(name, type, seqid) = iprot.readMessageBegin()
if name not in self._processMap:
iprot.skip(TType.STRUCT)
iprot.readMessageEnd()
x = TApplicationException(TApplicationException.UNKNOWN_METHOD, 'Unknown function %s' % (name))
oprot.writeMessageBegin(name, TMessageType.EXCEPTION, seqid)
x.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
return
else:
self._processMap[name](self, seqid, iprot, oprot)
return True
def process_execute(self, seqid, iprot, oprot):
args = execute_args()
args.read(iprot)
iprot.readMessageEnd()
result = execute_result()
try:
result.success = self._handler.execute(args.functionName, args.funcArgs)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except DRPCExecutionException as e:
msg_type = TMessageType.REPLY
result.e = e
except AuthorizationException as aze:
msg_type = TMessageType.REPLY
result.aze = aze
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("execute", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
# HELPER FUNCTIONS AND STRUCTURES
class execute_args(object):
"""
Attributes:
- functionName
- funcArgs
"""
def __init__(self, functionName=None, funcArgs=None,):
self.functionName = functionName
self.funcArgs = funcArgs
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.functionName = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.funcArgs = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('execute_args')
if self.functionName is not None:
oprot.writeFieldBegin('functionName', TType.STRING, 1)
oprot.writeString(self.functionName.encode('utf-8') if sys.version_info[0] == 2 else self.functionName)
oprot.writeFieldEnd()
if self.funcArgs is not None:
oprot.writeFieldBegin('funcArgs', TType.STRING, 2)
oprot.writeString(self.funcArgs.encode('utf-8') if sys.version_info[0] == 2 else self.funcArgs)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(execute_args)
execute_args.thrift_spec = (
None, # 0
(1, TType.STRING, 'functionName', 'UTF8', None, ), # 1
(2, TType.STRING, 'funcArgs', 'UTF8', None, ), # 2
)
class execute_result(object):
"""
Attributes:
- success
- e
- aze
"""
def __init__(self, success=None, e=None, aze=None,):
self.success = success
self.e = e
self.aze = aze
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRING:
self.success = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = DRPCExecutionException()
self.e.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.aze = AuthorizationException()
self.aze.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('execute_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRING, 0)
oprot.writeString(self.success.encode('utf-8') if sys.version_info[0] == 2 else self.success)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
if self.aze is not None:
oprot.writeFieldBegin('aze', TType.STRUCT, 2)
self.aze.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(execute_result)
execute_result.thrift_spec = (
(0, TType.STRING, 'success', 'UTF8', None, ), # 0
(1, TType.STRUCT, 'e', [DRPCExecutionException, None], None, ), # 1
(2, TType.STRUCT, 'aze', [AuthorizationException, None], None, ), # 2
)
fix_spec(all_structs)
del all_structs
using namespace std::literals;
BOOST_AUTO_TEST_SUITE(base32_tests)
BOOST_AUTO_TEST_CASE(base32_testvectors)
{
static const std::string vstrIn[] = {"","f","fo","foo","foob","fooba","foobar"};
static const std::string vstrOut[] = {"","my======","mzxq====","mzxw6===","mzxw6yq=","mzxw6ytb","mzxw6ytboi======"};
static const std::string vstrOutNoPadding[] = {"","my","mzxq","mzxw6","mzxw6yq","mzxw6ytb","mzxw6ytboi"};
for (unsigned int i=0; i<std::size(vstrIn); i++)
{
std::string strEnc = EncodeBase32(vstrIn[i]);
BOOST_CHECK_EQUAL(strEnc, vstrOut[i]);
strEnc = EncodeBase32(vstrIn[i], false);
BOOST_CHECK_EQUAL(strEnc, vstrOutNoPadding[i]);
std::string strDec = DecodeBase32(vstrOut[i]);
BOOST_CHECK_EQUAL(strDec, vstrIn[i]);
}
// Decoding strings with embedded NUL characters should fail
bool failure;
(void)DecodeBase32("invalid\0"s, &failure); // correct size, invalid due to \0
BOOST_CHECK(failure);
(void)DecodeBase32("AWSX3VPP"s, &failure); // valid
BOOST_CHECK(!failure);
(void)DecodeBase32("AWSX3VPP\0invalid"s, &failure); // correct size, invalid due to \0
BOOST_CHECK(failure);
(void)DecodeBase32("AWSX3VPPinvalid"s, &failure); // invalid size
BOOST_CHECK(failure);
}
BOOST_AUTO_TEST_SUITE_END()
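As an independent cross-check, the same RFC 4648 test vectors can be reproduced with Python's standard-library `base64` module. This is only a sketch; it assumes that `EncodeBase32`/`DecodeBase32` implement RFC 4648 base32 with the lowercase alphabet shown in the expected strings above.

```python
import base64

# RFC 4648 base32 vectors, lowercased to match the C++ expectations above.
vectors = {
    b"": "",
    b"f": "my======",
    b"fo": "mzxq====",
    b"foo": "mzxw6===",
    b"foob": "mzxw6yq=",
    b"fooba": "mzxw6ytb",
    b"foobar": "mzxw6ytboi======",
}

for raw, padded in vectors.items():
    encoded = base64.b32encode(raw).decode("ascii").lower()
    assert encoded == padded                          # analogue of EncodeBase32(v)
    assert encoded.rstrip("=") == padded.rstrip("=")  # analogue of EncodeBase32(v, false)
    assert base64.b32decode(encoded.upper()) == raw   # analogue of DecodeBase32
```

Note that `base64.b32decode` rejects malformed input by raising an exception, whereas the C++ API reports failure through its `bool*` out-parameter.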
<!DOCTYPE html>
<html lang="en" >
<head>
<meta charset="utf-8">
<title>text-align: start, dir=ltr</title>
<link rel='author' title='Richard Ishida' href='mailto:[email protected]'>
<link rel='help' href='https://drafts.csswg.org/css-text-3/#text-align-property'>
<link rel='match' href='reference/text-align-start-ref-004.html'>
<meta name="assert" content="text-align:start aligns inline-level content to the start edge of the line box – ie. left when direction is horizontal, ltr.">
<link rel="stylesheet" type="text/css" href="/fonts/ahem.css" />
<style type='text/css'>
.test { text-align: start; }
/* the CSS below is not part of the test */
.test, .ref { border: 1px solid orange; margin: 20px; width: 300px; color: orange; font: 24px/24px Ahem; }
.ref { position: relative; height: 24px; }
#rb1 { position: absolute; top: 0; left: 0; background-color: orange; width: 120px; height: 24px; }
</style>
</head>
<body>
<div id='instructions'>Test passes if the shading in both orange boxes is identical.</div>
<div dir="rtl">
<div class="test" dir="ltr">XXXXX</div>
<div class="ref"><div id="rb1"></div></div>
</div>
</body>
</html>
#ifndef UI_EVENTS_DEVICES_GAMEPAD_DEVICE_H_
#define UI_EVENTS_DEVICES_GAMEPAD_DEVICE_H_

#include <cstdint>
#include <vector>

#include "ui/events/devices/events_devices_export.h"
#include "ui/events/devices/input_device.h"

namespace ui {
// Represents a gamepad device state.
struct EVENTS_DEVICES_EXPORT GamepadDevice : public InputDevice {
// Represents an axis of a gamepad e.g. an analog thumb stick.
struct Axis {
// Gamepad axis index. Corresponds to |raw_code_| of GamepadEvent.
uint16_t code = 0;
// See input_absinfo for the definition of these variables.
int32_t min_value = 0;
int32_t max_value = 0;
int32_t flat = 0;
int32_t fuzz = 0;
int32_t resolution = 0;
};
GamepadDevice(const InputDevice& input_device,
std::vector<Axis>&& axes,
bool supports_rumble);
GamepadDevice(const GamepadDevice& other);
~GamepadDevice() override;
// Axes the gamepad has e.g. analog thumb sticks.
std::vector<Axis> axes;
// Whether the gamepad device supports rumble type force feedback.
bool supports_vibration_rumble = false;
};
} // namespace ui
#endif // UI_EVENTS_DEVICES_GAMEPAD_DEVICE_H_
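The `Axis` fields mirror Linux's `input_absinfo`; a common consumer-side step (not part of this header) is normalizing a raw reading into [-1, 1] while treating the `flat` region around the center as a dead zone. A minimal sketch under those assumptions, with a helper name of my own choosing:

```python
def normalize_axis(raw: int, min_value: int, max_value: int, flat: int) -> float:
    """Map a raw gamepad axis reading to [-1.0, 1.0], zeroing the flat dead zone."""
    center = (min_value + max_value) / 2.0
    offset = raw - center
    if abs(offset) <= flat:  # inside the dead zone: report neutral
        return 0.0
    half_range = (max_value - min_value) / 2.0
    return max(-1.0, min(1.0, offset / half_range))

# Example: a stick reporting 0..255 with a flat region of 15 around center.
assert normalize_axis(128, 0, 255, 15) == 0.0
assert normalize_axis(255, 0, 255, 15) == 1.0
```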
package javax.imageio.event;
import java.util.EventListener;
import javax.imageio.ImageReader;
public interface IIOReadWarningListener extends EventListener
{
/**
* Reports the occurrence of a non-fatal error in decoding.
* Decoding will continue after this method is called.
*
* @param source the <code>ImageReader</code> object calling this method
* @param warning the warning
*/
void warningOccurred(ImageReader source, String warning);
}
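The contract above — non-fatal warnings reported to a callback while decoding continues — can be mirrored in a small language-agnostic sketch (the names here are illustrative, not part of the javax.imageio API):

```python
class WarningCollector:
    """Collects non-fatal warnings, mimicking IIOReadWarningListener.warningOccurred."""
    def __init__(self):
        self.warnings = []

    def warning_occurred(self, source: str, warning: str) -> None:
        self.warnings.append((source, warning))

def decode_image(listener: WarningCollector) -> str:
    # A decoder reports the problem and keeps going, as the javadoc specifies.
    listener.warning_occurred("reader", "bad metadata block ignored")
    return "decoded"

collector = WarningCollector()
assert decode_image(collector) == "decoded"
assert collector.warnings == [("reader", "bad metadata block ignored")]
```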
<?php
namespace ZendTest\View\Model;
use PHPUnit\Framework\TestCase;
use Zend\View\Model\JsonModel;
use Zend\Json\Json;
use Zend\View\Variables;
class JsonModelTest extends TestCase
{
public function testAllowsEmptyConstructor()
{
$model = new JsonModel();
$this->assertInstanceOf(Variables::class, $model->getVariables());
$this->assertEquals([], $model->getOptions());
}
public function testCanSerializeVariablesToJson()
{
$array = ['foo' => 'bar'];
$model = new JsonModel($array);
$this->assertEquals($array, $model->getVariables());
$this->assertEquals(Json::encode($array), $model->serialize());
}
public function testCanSerializeWithJsonpCallback()
{
$array = ['foo' => 'bar'];
$model = new JsonModel($array);
$model->setJsonpCallback('callback');
$this->assertEquals('callback(' . Json::encode($array) . ');', $model->serialize());
}
public function testPrettyPrint()
{
$array = [
'simple' => 'simple test string',
'stringwithjsonchars' => '\"[1,2]',
'complex' => [
'foo' => 'bar',
'far' => 'boo'
]
];
$model = new JsonModel($array, ['prettyPrint' => true]);
$this->assertEquals(Json::encode($array, false, ['prettyPrint' => true]), $model->serialize());
}
}
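The JSONP serialization tested above is just the JSON payload wrapped in a function call followed by a semicolon; a sketch of the same wrapping (not Zend's implementation):

```python
import json

def serialize_jsonp(variables, callback=None):
    """Serialize to JSON, optionally wrapping in a JSONP callback like JsonModel."""
    payload = json.dumps(variables, separators=(",", ":"))
    return f"{callback}({payload});" if callback else payload

assert serialize_jsonp({"foo": "bar"}) == '{"foo":"bar"}'
assert serialize_jsonp({"foo": "bar"}, "callback") == 'callback({"foo":"bar"});'
```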
#ifndef __GXPRINTING__
#define __GXPRINTING__
#ifndef __CONDITIONALMACROS__
#include <ConditionalMacros.h>
#endif
#ifndef __COLLECTIONS__
#include <Collections.h>
#endif
#ifndef __DIALOGS__
#include <Dialogs.h>
#endif
#ifndef __MACERRORS__
#include <MacErrors.h>
#endif
#ifndef __FILES__
#include <Files.h>
#endif
#ifndef __GXFONTS__
#include <GXFonts.h>
#endif
#ifndef __GXMATH__
#include <GXMath.h>
#endif
#ifndef __GXTYPES__
#include <GXTypes.h>
#endif
#ifndef __LISTS__
#include <Lists.h>
#endif
#ifndef __MENUS__
#include <Menus.h>
#endif
#ifndef __GXMESSAGES__
#include <GXMessages.h>
#endif
#ifndef __PRINTING__
#include <Printing.h>
#endif
#ifndef __QUICKDRAW__
#include <Quickdraw.h>
#endif
#if PRAGMA_ONCE
#pragma once
#endif
#ifdef __cplusplus
extern "C" {
#endif
#if PRAGMA_IMPORT
#pragma import on
#endif
#if PRAGMA_STRUCT_ALIGN
#pragma options align=mac68k
#elif PRAGMA_STRUCT_PACKPUSH
#pragma pack(push, 2)
#elif PRAGMA_STRUCT_PACK
#pragma pack(2)
#endif
/********************************************************************
Start of old "GXPrintingManager.h/a/p" interface file.
*********************************************************************/
/* ------------------------------------------------------------------------------
       Printing Manager API Constants and Types
-------------------------------------------------------------------------------- */
typedef unsigned long gxOwnerSignature;
#if OLDROUTINENAMES
typedef unsigned long Signature;
#endif /* OLDROUTINENAMES */
/*
ABSTRACT DATA TYPES
*/
/*
typedef struct gxPrivatePrinterRecord *gxPrinter;
typedef struct gxPrivateJobRecord *gxJob;
typedef struct gxPrivateFormatRecord *gxFormat;
typedef struct gxPrivatePaperTypeRecord *gxPaperType;
typedef struct gxPrivatePrintFileRecord *gxPrintFile;
*/
typedef struct OpaquegxPrinter* gxPrinter;
typedef struct OpaquegxJob* gxJob;
typedef struct OpaquegxFormat* gxFormat;
typedef struct OpaquegxPaperType* gxPaperType;
typedef struct OpaquegxPrintFile* gxPrintFile;
/* Possible values for LoopStatus */
typedef Boolean gxLoopStatus;
enum {
gxStopLooping = false,
gxKeepLooping = true
};
typedef CALLBACK_API( gxLoopStatus , GXViewDeviceProcPtr )(gxViewDevice aViewDevice, void *refCon);
typedef STACK_UPP_TYPE(GXViewDeviceProcPtr) GXViewDeviceUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXViewDeviceUPP)
NewGXViewDeviceUPP (GXViewDeviceProcPtr userRoutine);
EXTERN_API(void)
DisposeGXViewDeviceUPP (GXViewDeviceUPP userUPP);
EXTERN_API(gxLoopStatus)
InvokeGXViewDeviceUPP (gxViewDevice aViewDevice,
void * refCon,
GXViewDeviceUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXViewDeviceProcInfo = 0x000003D0 }; /* pascal 1_byte Func(4_bytes, 4_bytes) */
#define NewGXViewDeviceUPP(userRoutine) (GXViewDeviceUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXViewDeviceProcInfo, GetCurrentArchitecture())
#define DisposeGXViewDeviceUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXViewDeviceUPP(aViewDevice, refCon, userUPP) (gxLoopStatus)CALL_TWO_PARAMETER_UPP((userUPP), uppGXViewDeviceProcInfo, (aViewDevice), (refCon))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXViewDeviceProc(userRoutine) NewGXViewDeviceUPP(userRoutine)
#define CallGXViewDeviceProc(userRoutine, aViewDevice, refCon) InvokeGXViewDeviceUPP(aViewDevice, refCon, userRoutine)
typedef CALLBACK_API( gxLoopStatus , GXFormatProcPtr )(gxFormat aFormat, void *refCon);
typedef STACK_UPP_TYPE(GXFormatProcPtr) GXFormatUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXFormatUPP)
NewGXFormatUPP (GXFormatProcPtr userRoutine);
EXTERN_API(void)
DisposeGXFormatUPP (GXFormatUPP userUPP);
EXTERN_API(gxLoopStatus)
InvokeGXFormatUPP (gxFormat aFormat,
void * refCon,
GXFormatUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXFormatProcInfo = 0x000003D0 }; /* pascal 1_byte Func(4_bytes, 4_bytes) */
#define NewGXFormatUPP(userRoutine) (GXFormatUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXFormatProcInfo, GetCurrentArchitecture())
#define DisposeGXFormatUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXFormatUPP(aFormat, refCon, userUPP) (gxLoopStatus)CALL_TWO_PARAMETER_UPP((userUPP), uppGXFormatProcInfo, (aFormat), (refCon))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXFormatProc(userRoutine) NewGXFormatUPP(userRoutine)
#define CallGXFormatProc(userRoutine, aFormat, refCon) InvokeGXFormatUPP(aFormat, refCon, userRoutine)
typedef CALLBACK_API( gxLoopStatus , GXPaperTypeProcPtr )(gxPaperType aPapertype, void *refCon);
typedef STACK_UPP_TYPE(GXPaperTypeProcPtr) GXPaperTypeUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXPaperTypeUPP)
NewGXPaperTypeUPP (GXPaperTypeProcPtr userRoutine);
EXTERN_API(void)
DisposeGXPaperTypeUPP (GXPaperTypeUPP userUPP);
EXTERN_API(gxLoopStatus)
InvokeGXPaperTypeUPP (gxPaperType aPapertype,
void * refCon,
GXPaperTypeUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXPaperTypeProcInfo = 0x000003D0 }; /* pascal 1_byte Func(4_bytes, 4_bytes) */
#define NewGXPaperTypeUPP(userRoutine) (GXPaperTypeUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXPaperTypeProcInfo, GetCurrentArchitecture())
#define DisposeGXPaperTypeUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXPaperTypeUPP(aPapertype, refCon, userUPP) (gxLoopStatus)CALL_TWO_PARAMETER_UPP((userUPP), uppGXPaperTypeProcInfo, (aPapertype), (refCon))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXPaperTypeProc(userRoutine) NewGXPaperTypeUPP(userRoutine)
#define CallGXPaperTypeProc(userRoutine, aPapertype, refCon) InvokeGXPaperTypeUPP(aPapertype, refCon, userRoutine)
typedef CALLBACK_API( OSErr , GXPrintingFlattenProcPtr )(long size, void *data, void *refCon);
typedef STACK_UPP_TYPE(GXPrintingFlattenProcPtr) GXPrintingFlattenUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXPrintingFlattenUPP)
NewGXPrintingFlattenUPP (GXPrintingFlattenProcPtr userRoutine);
EXTERN_API(void)
DisposeGXPrintingFlattenUPP (GXPrintingFlattenUPP userUPP);
EXTERN_API(OSErr)
InvokeGXPrintingFlattenUPP (long size,
void * data,
void * refCon,
GXPrintingFlattenUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXPrintingFlattenProcInfo = 0x00000FE0 }; /* pascal 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXPrintingFlattenUPP(userRoutine) (GXPrintingFlattenUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXPrintingFlattenProcInfo, GetCurrentArchitecture())
#define DisposeGXPrintingFlattenUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXPrintingFlattenUPP(size, data, refCon, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXPrintingFlattenProcInfo, (size), (data), (refCon))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXPrintingFlattenProc(userRoutine) NewGXPrintingFlattenUPP(userRoutine)
#define CallGXPrintingFlattenProc(userRoutine, size, data, refCon) InvokeGXPrintingFlattenUPP(size, data, refCon, userRoutine)
#if OLDROUTINENAMES
typedef GXViewDeviceProcPtr gxViewDeviceProc;
typedef GXFormatProcPtr gxFormatProc;
typedef GXPaperTypeProcPtr gxPaperTypeProc;
typedef GXPrintingFlattenProcPtr gxPrintingFlattenProc;
#endif /* OLDROUTINENAMES */
/*
The following constants are used to set collection item flags in printing
collections. The Printing Manager purges certain items whenever a driver
switch occurs. If the formatting driver changes, all items marked as
gxVolatileFormattingDriverCategory will be purged. If the output driver
changes, all items marked as gxVolatileOutputDriverCategory will be purged.
Note that to prevent items from being flattened when GXFlattenJob is called,
you should unset the collectionPersistenceBit (defined in Collections.h),
which is on by default.
*/
/* Structure stored in collection items' user attribute bits */
typedef short gxCollectionCategory;
enum {
gxNoCollectionCategory = 0x0000,
gxOutputDriverCategory = 0x0001,
gxFormattingDriverCategory = 0x0002,
gxDriverVolatileCategory = 0x0004,
gxVolatileOutputDriverCategory = gxOutputDriverCategory + gxDriverVolatileCategory,
gxVolatileFormattingDriverCategory = gxFormattingDriverCategory + gxDriverVolatileCategory
};
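The volatile categories above are composed by adding the base flags, which works only because the flag bits are disjoint (addition and bitwise OR then agree); spot-checking the arithmetic:

```python
GX_OUTPUT_DRIVER_CATEGORY = 0x0001
GX_FORMATTING_DRIVER_CATEGORY = 0x0002
GX_DRIVER_VOLATILE_CATEGORY = 0x0004

# Addition and bitwise OR agree because the flag bits don't overlap.
assert GX_OUTPUT_DRIVER_CATEGORY + GX_DRIVER_VOLATILE_CATEGORY == 0x0005
assert GX_OUTPUT_DRIVER_CATEGORY | GX_DRIVER_VOLATILE_CATEGORY == 0x0005
assert GX_FORMATTING_DRIVER_CATEGORY + GX_DRIVER_VOLATILE_CATEGORY == 0x0006
```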
/*
>>>>>> JOB COLLECTION ITEMS <<<<<<
*/
/* gxJobInfo COLLECTION ITEM */
enum {
gxJobTag = FOUR_CHAR_CODE('job ')
};
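FOUR_CHAR_CODE packs four ASCII bytes into a big-endian 32-bit integer; a quick way to compute the tag values used throughout this header:

```python
import struct

def four_char_code(tag: str) -> int:
    """Pack a 4-character ASCII tag into a 32-bit big-endian integer."""
    return struct.unpack(">I", tag.encode("ascii"))[0]

# gxJobTag = FOUR_CHAR_CODE('job ')
assert four_char_code("job ") == 0x6A6F6220
```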
struct gxJobInfo {
long numPages; /* Number of pages in the document */
long priority; /* Priority of this job plus "is it on hold?" */
unsigned long timeToPrint; /* When to print job, if scheduled */
long jobTimeout; /* Timeout value, in ticks */
long firstPageToPrint; /* Start printing from this page */
short jobAlert; /* How to alert user when printing */
Str31 appName; /* Which application printed the document */
Str31 documentName; /* The name of the document being printed */
Str31 userName; /* The owner name of the machine that printed the document */
};
typedef struct gxJobInfo gxJobInfo;
/* gxPDDDefaultSettingTag COLLECTION ITEM */
enum {
gxPDDDefaultSettingTag = FOUR_CHAR_CODE('pdds')
};
struct gxPDDDefaultSettingInfo {
Boolean useDefaultSetting; /* true if PDD default setting should be used */
SInt8 pad;
};
typedef struct gxPDDDefaultSettingInfo gxPDDDefaultSettingInfo;
/* priority field constants */
enum {
gxPrintJobHoldingBit = 0x00001000 /* This bit is set if the job is on hold. */
};
enum {
gxPrintJobUrgent = 0x00000001,
gxPrintJobAtTime = 0x00000002,
gxPrintJobASAP = 0x00000003,
gxPrintJobHolding = (gxPrintJobHoldingBit + gxPrintJobASAP),
gxPrintJobHoldingAtTime = (gxPrintJobHoldingBit + gxPrintJobAtTime),
gxPrintJobHoldingUrgent = (gxPrintJobHoldingBit + gxPrintJobUrgent)
};
/* jobAlert field constants */
enum {
gxNoPrintTimeAlert = 0, /* Don't alert user when we print */
gxAlertBefore = 1, /* Alert user before we print */
gxAlertAfter = 2, /* Alert user after we print */
gxAlertBothTimes = 3 /* Alert before and after we print */
};
/* jobTimeout field constants */
enum {
gxThirtySeconds = 1800, /* 30 seconds in ticks */
gxTwoMinutes = 7200 /* 2 minutes in ticks */
};
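Classic Mac OS ticks are 1/60 of a second, which is where the constants above come from; the conversion is simply:

```python
TICKS_PER_SECOND = 60  # classic Mac OS tick rate

def seconds_to_ticks(seconds: float) -> int:
    return int(seconds * TICKS_PER_SECOND)

assert seconds_to_ticks(30) == 1800   # gxThirtySeconds
assert seconds_to_ticks(120) == 7200  # gxTwoMinutes
```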
/* gxCollationTag COLLECTION ITEM */
enum {
gxCollationTag = FOUR_CHAR_CODE('sort')
};
struct gxCollationInfo {
Boolean collation; /* True if copies are to be collated */
char padByte;
};
typedef struct gxCollationInfo gxCollationInfo;
/* gxCopiesTag COLLECTION ITEM */
enum {
gxCopiesTag = FOUR_CHAR_CODE('copy')
};
struct gxCopiesInfo {
long copies; /* Number of copies of the document to print */
};
typedef struct gxCopiesInfo gxCopiesInfo;
/* gxPageRangeTag COLLECTION ITEM */
enum {
gxPageRangeTag = FOUR_CHAR_CODE('rang')
};
struct gxSimplePageRangeInfo {
char optionChosen; /* From options listed below */
Boolean printAll; /* True if user wants to print all pages */
long fromPage; /* For gxDefaultPageRange, current value */
long toPage; /* For gxDefaultPageRange, current value */
};
typedef struct gxSimplePageRangeInfo gxSimplePageRangeInfo;
struct gxPageRangeInfo {
gxSimplePageRangeInfo simpleRange; /* Info which will be returned for GetJobPageRange */
Str31 fromString; /* For gxCustomizePageRange, current value */
Str31 toString; /* For gxCustomizePageRange, current value */
long minFromPage; /* For gxDefaultPageRange, we parse with this, ignored if nil */
long maxToPage; /* For gxDefaultPageRange, we parse with this, ignored if nil */
char replaceString[1]; /* For gxReplacePageRange, string to display */
};
typedef struct gxPageRangeInfo gxPageRangeInfo;
/* optionChosen field constants for SimplePageRangeInfo */
enum {
gxDefaultPageRange = 0,
gxReplacePageRange = 1,
gxCustomizePageRange = 2
};
/* gxQualityTag COLLECTION ITEM */
enum {
gxQualityTag = FOUR_CHAR_CODE('qual')
};
struct gxQualityInfo {
Boolean disableQuality; /* True to disable standard quality controls */
char padByte;
short defaultQuality; /* The default quality value */
short currentQuality; /* The current quality value */
short qualityCount; /* The number of quality menu items in popup menu */
char qualityNames[1]; /* An array of packed pascal strings for popup menu titles */
};
typedef struct gxQualityInfo gxQualityInfo;
/* gxFileDestinationTag COLLECTION ITEM */
enum {
gxFileDestinationTag = FOUR_CHAR_CODE('dest')
};
struct gxFileDestinationInfo {
Boolean toFile; /* True if destination is a file */
char padByte;
};
typedef struct gxFileDestinationInfo gxFileDestinationInfo;
/* gxFileLocationTag COLLECTION ITEM */
enum {
gxFileLocationTag = FOUR_CHAR_CODE('floc')
};
struct gxFileLocationInfo {
FSSpec fileSpec; /* Location to put file, if destination is file */
};
typedef struct gxFileLocationInfo gxFileLocationInfo;
/* gxFileFormatTag COLLECTION ITEM */
enum {
gxFileFormatTag = FOUR_CHAR_CODE('ffmt')
};
struct gxFileFormatInfo {
Str31 fileFormatName; /* Name of file format (e.g. "PostScript") if destination is file */
};
typedef struct gxFileFormatInfo gxFileFormatInfo;
/* gxFileFontsTag COLLECTION ITEM */
enum {
gxFileFontsTag = FOUR_CHAR_CODE('incf')
};
struct gxFileFontsInfo {
char includeFonts; /* Which fonts to include, if destination is file */
char padByte;
};
typedef struct gxFileFontsInfo gxFileFontsInfo;
/* includeFonts field constants */
enum {
gxIncludeNoFonts = 1, /* Include no fonts */
gxIncludeAllFonts = 2, /* Include all fonts */
gxIncludeNonStandardFonts = 3 /* Include only fonts that aren't in the standard LW set */
};
/* gxPaperFeedTag COLLECTION ITEM */
enum {
gxPaperFeedTag = FOUR_CHAR_CODE('feed')
};
struct gxPaperFeedInfo {
Boolean autoFeed; /* True if automatic feed, false if manual */
char padByte;
};
typedef struct gxPaperFeedInfo gxPaperFeedInfo;
/* gxTrayFeedTag COLLECTION ITEM */
enum {
gxTrayFeedTag = FOUR_CHAR_CODE('tray')
};
typedef long gxTrayIndex;
struct gxTrayFeedInfo {
gxTrayIndex feedTrayIndex; /* Tray to feed paper from */
Boolean manualFeedThisPage; /* Signals manual feeding for the page */
char padByte;
};
typedef struct gxTrayFeedInfo gxTrayFeedInfo;
/* gxManualFeedTag COLLECTION ITEM */
enum {
gxManualFeedTag = FOUR_CHAR_CODE('manf')
};
struct gxManualFeedInfo {
long numPaperTypeNames; /* Number of paperTypes to manually feed */
Str31 paperTypeNames[1]; /* Array of names of paperTypes to manually feed */
};
typedef struct gxManualFeedInfo gxManualFeedInfo;
/* gxNormalMappingTag COLLECTION ITEM */
enum {
gxNormalMappingTag = FOUR_CHAR_CODE('nmap')
};
struct gxNormalMappingInfo {
Boolean normalPaperMapping; /* True if not overriding normal paper mapping */
char padByte;
};
typedef struct gxNormalMappingInfo gxNormalMappingInfo;
/* gxSpecialMappingTag COLLECTION ITEM */
enum {
gxSpecialMappingTag = FOUR_CHAR_CODE('smap')
};
struct gxSpecialMappingInfo {
char specialMapping; /* Enumerated redirect, scale or tile setting */
char padByte;
};
typedef struct gxSpecialMappingInfo gxSpecialMappingInfo;
/* specialMapping field constants */
enum {
gxRedirectPages = 1, /* Redirect pages to a papertype and clip if necessary */
gxScalePages = 2, /* Scale pages if necessary */
gxTilePages = 3 /* Tile pages if necessary */
};
/* gxTrayMappingTag COLLECTION ITEM */
enum {
gxTrayMappingTag = FOUR_CHAR_CODE('tmap')
};
struct gxTrayMappingInfo {
gxTrayIndex mapPaperToTray; /* Tray to map all paper to */
};
typedef struct gxTrayMappingInfo gxTrayMappingInfo;
/* gxPaperMappingTag COLLECTION ITEM */
/* This collection item contains a flattened paper type resource */
enum {
gxPaperMappingTag = FOUR_CHAR_CODE('pmap')
};
/* gxPrintPanelTag COLLECTION ITEM */
enum {
gxPrintPanelTag = FOUR_CHAR_CODE('ppan')
};
struct gxPrintPanelInfo {
Str31 startPanelName; /* Name of starting panel in Print dialog */
};
typedef struct gxPrintPanelInfo gxPrintPanelInfo;
/* gxFormatPanelTag COLLECTION ITEM */
enum {
gxFormatPanelTag = FOUR_CHAR_CODE('fpan')
};
struct gxFormatPanelInfo {
Str31 startPanelName; /* Name of starting panel in Format dialog */
};
typedef struct gxFormatPanelInfo gxFormatPanelInfo;
/* gxTranslatedDocumentTag COLLECTION ITEM */
enum {
gxTranslatedDocumentTag = FOUR_CHAR_CODE('trns')
};
struct gxTranslatedDocumentInfo {
long translatorInfo; /* Information from the translation process */
};
typedef struct gxTranslatedDocumentInfo gxTranslatedDocumentInfo;
/* gxCoverPageTag COLLECTION ITEM */
enum {
gxCoverPageTag = FOUR_CHAR_CODE('cvpg')
};
struct gxCoverPageInfo {
long coverPage; /* Use same enum values as for PrintRecord field in GXPrinterDrivers.h */
};
typedef struct gxCoverPageInfo gxCoverPageInfo;
/*
>>>>>> FORMAT COLLECTION ITEMS <<<<<<
*/
/* gxPaperTypeLockTag COLLECTION ITEM */
enum {
gxPaperTypeLockTag = FOUR_CHAR_CODE('ptlk')
};
struct gxPaperTypeLockInfo {
Boolean paperTypeLocked; /* True if format's paperType is locked */
char padByte;
};
typedef struct gxPaperTypeLockInfo gxPaperTypeLockInfo;
/* gxOrientationTag COLLECTION ITEM */
enum {
gxOrientationTag = FOUR_CHAR_CODE('layo')
};
struct gxOrientationInfo {
char orientation; /* An enumerated orientation value */
char padByte;
};
typedef struct gxOrientationInfo gxOrientationInfo;
/* orientation field constants */
enum {
gxPortraitLayout = 0, /* Portrait */
gxLandscapeLayout = 1, /* Landscape */
gxRotatedPortraitLayout = 2, /* Portrait, rotated 180. */
gxRotatedLandscapeLayout = 3 /* Landscape, rotated 180. */
};
/* gxScalingTag COLLECTION ITEM */
enum {
gxScalingTag = FOUR_CHAR_CODE('scal')
};
struct gxScalingInfo {
Fixed horizontalScaleFactor; /* Current horizontal scaling factor */
Fixed verticalScaleFactor; /* Current vertical scaling factor */
short minScaling; /* Minimum scaling allowed */
short maxScaling; /* Maximum scaling allowed */
};
typedef struct gxScalingInfo gxScalingInfo;
/* gxDirectModeTag COLLECTION ITEM */
enum {
gxDirectModeTag = FOUR_CHAR_CODE('dirm')
};
struct gxDirectModeInfo {
Boolean directModeOn; /* True if a direct mode is enabled */
char padByte;
};
typedef struct gxDirectModeInfo gxDirectModeInfo;
/* gxFormatHalftoneTag COLLECTION ITEM */
enum {
gxFormatHalftoneTag = FOUR_CHAR_CODE('half')
};
struct gxFormatHalftoneInfo {
long numHalftones; /* Number of halftone records */
gxHalftone halftones[1]; /* The halftone records */
};
typedef struct gxFormatHalftoneInfo gxFormatHalftoneInfo;
/* gxInvertPageTag COLLECTION ITEM */
enum {
gxInvertPageTag = FOUR_CHAR_CODE('invp')
};
struct gxInvertPageInfo {
char padByte;
Boolean invert; /* If true, invert page */
};
typedef struct gxInvertPageInfo gxInvertPageInfo;
/* gxFlipPageHorizontalTag COLLECTION ITEM */
enum {
gxFlipPageHorizontalTag = FOUR_CHAR_CODE('flph')
};
struct gxFlipPageHorizontalInfo {
char padByte;
Boolean flipHorizontal; /* If true, flip x coordinates on page */
};
typedef struct gxFlipPageHorizontalInfo gxFlipPageHorizontalInfo;
/* gxFlipPageVerticalTag COLLECTION ITEM */
enum {
gxFlipPageVerticalTag = FOUR_CHAR_CODE('flpv')
};
struct gxFlipPageVerticalInfo {
char padByte;
Boolean flipVertical; /* If true, flip y coordinates on page */
};
typedef struct gxFlipPageVerticalInfo gxFlipPageVerticalInfo;
/* gxPreciseBitmapsTag COLLECTION ITEM */
enum {
gxPreciseBitmapsTag = FOUR_CHAR_CODE('pbmp')
};
struct gxPreciseBitmapInfo {
Boolean preciseBitmaps; /* If true, scale page by 96% */
char padByte;
};
typedef struct gxPreciseBitmapInfo gxPreciseBitmapInfo;
/*
>>>>>> PAPERTYPE COLLECTION ITEMS <<<<<<
*/
/* gxBaseTag COLLECTION ITEM */
enum {
gxBaseTag = FOUR_CHAR_CODE('base')
};
struct gxBaseInfo {
long baseType; /* PaperType's base type */
};
typedef struct gxBaseInfo gxBaseInfo;
/* baseType field constants */
enum {
gxUnknownBase = 0, /* Base paper type from which this paper type is */
gxUSLetterBase = 1, /* derived. This is not a complete set. */
gxUSLegalBase = 2,
gxA4LetterBase = 3,
gxB5LetterBase = 4,
gxTabloidBase = 5
};
/* gxCreatorTag COLLECTION ITEM */
enum {
gxCreatorTag = FOUR_CHAR_CODE('crea')
};
struct gxCreatorInfo {
OSType creator; /* PaperType's creator */
};
typedef struct gxCreatorInfo gxCreatorInfo;
/* gxUnitsTag COLLECTION ITEM */
enum {
gxUnitsTag = FOUR_CHAR_CODE('unit')
};
struct gxUnitsInfo {
char units; /* PaperType's units (used by PaperType Editor). */
char padByte;
};
typedef struct gxUnitsInfo gxUnitsInfo;
/* units field constants */
enum {
gxPicas = 0, /* Pica measurement */
gxMMs = 1, /* Millimeter measurement */
gxInches = 2 /* Inches measurement */
};
/* gxFlagsTag COLLECTION ITEM */
enum {
gxFlagsTag = FOUR_CHAR_CODE('flag')
};
struct gxFlagsInfo {
long flags; /* PaperType's flags */
};
typedef struct gxFlagsInfo gxFlagsInfo;
/* flags field constants */
enum {
gxOldPaperTypeFlag = 0x00800000, /* Indicates a paper type for compatibility printing */
gxNewPaperTypeFlag = 0x00400000, /* Indicates a paper type for QuickDraw GX-aware printing */
gxOldAndNewFlag = 0x00C00000, /* Indicates a paper type that's both old and new */
gxDefaultPaperTypeFlag = 0x00100000 /* Indicates the default paper type in the group */
};
/* gxCommentTag COLLECTION ITEM */
enum {
gxCommentTag = FOUR_CHAR_CODE('cmnt')
};
struct gxCommentInfo {
Str255 comment; /* PaperType's comment */
};
typedef struct gxCommentInfo gxCommentInfo;
/*
>>>>>> PRINTER VIEWDEVICE TAGS <<<<<<
*/
/* gxPenTableTag COLLECTION ITEM */
enum {
gxPenTableTag = FOUR_CHAR_CODE('pent')
};
struct gxPenTableEntry {
Str31 penName; /* Name of the pen */
gxColor penColor; /* Color to use from the color set */
Fixed penThickness; /* Size of the pen */
short penUnits; /* Specifies units in which pen thickness is defined */
short penPosition; /* Pen position in the carousel, -1 (kPenNotLoaded) if not loaded */
};
typedef struct gxPenTableEntry gxPenTableEntry;
struct gxPenTable {
long numPens; /* Number of pen entries in the following array */
gxPenTableEntry pens[1]; /* Array of pen entries */
};
typedef struct gxPenTable gxPenTable;
typedef gxPenTable * gxPenTablePtr;
typedef gxPenTablePtr * gxPenTableHdl;
/* penUnits field constants */
enum {
gxDeviceUnits = 0,
gxMMUnits = 1,
gxInchesUnits = 2
};
/* penPosition field constants */
enum {
gxPenNotLoaded = -1
};
/*
>>>>>> DIALOG-RELATED CONSTANTS AND TYPES <<<<<<
*/
typedef long gxDialogResult;
enum {
gxCancelSelected = 0L,
gxOKSelected = 1L,
gxRevertSelected = 2L
};
struct gxEditMenuRecord {
short editMenuID;
short cutItem;
short copyItem;
short pasteItem;
short clearItem;
short undoItem;
};
typedef struct gxEditMenuRecord gxEditMenuRecord;
/*
>>>>>> JOB FORMAT MODE CONSTANTS AND TYPES <<<<<<
*/
typedef OSType gxJobFormatMode;
struct gxJobFormatModeTable {
long numModes; /* Number of job format modes to choose from */
gxJobFormatMode modes[1]; /* The job format modes */
};
typedef struct gxJobFormatModeTable gxJobFormatModeTable;
typedef gxJobFormatModeTable * gxJobFormatModeTablePtr;
typedef gxJobFormatModeTablePtr * gxJobFormatModeTableHdl;
enum {
gxGraphicsJobFormatMode = FOUR_CHAR_CODE('grph'),
gxTextJobFormatMode = FOUR_CHAR_CODE('text'),
gxPostScriptJobFormatMode = FOUR_CHAR_CODE('post')
};
typedef long gxQueryType;
enum {
gxGetJobFormatLineConstraintQuery = 0L,
gxGetJobFormatFontsQuery = 1L,
gxGetJobFormatFontCommonStylesQuery = 2L,
gxGetJobFormatFontConstraintQuery = 3L,
gxSetStyleJobFormatCommonStyleQuery = 4L
};
/* Structures used for Text mode field constants */
struct gxPositionConstraintTable {
gxPoint phase; /* Position phase */
gxPoint offset; /* Position offset */
long numSizes; /* Number of available font sizes */
Fixed sizes[1]; /* The available font sizes */
};
typedef struct gxPositionConstraintTable gxPositionConstraintTable;
typedef gxPositionConstraintTable * gxPositionConstraintTablePtr;
typedef gxPositionConstraintTablePtr * gxPositionConstraintTableHdl;
/* numSizes field constants */
enum {
gxConstraintRange = -1
};
struct gxStyleNameTable {
long numStyleNames; /* Number of style names */
Str255 styleNames[1]; /* The style names */
};
typedef struct gxStyleNameTable gxStyleNameTable;
typedef gxStyleNameTable * gxStyleNameTablePtr;
typedef gxStyleNameTablePtr * gxStyleNameTableHdl;
struct gxFontTable {
long numFonts; /* Number of font references */
gxFont fonts[1]; /* The font references */
};
typedef struct gxFontTable gxFontTable;
typedef gxFontTable * gxFontTablePtr;
typedef gxFontTablePtr * gxFontTableHdl;
/* ------------------------------------------------------------------------------
Printing Manager API Functions
-------------------------------------------------------------------------------- */
/*
Global Routines
*/
#if CALL_NOT_IN_CARBON
EXTERN_API( OSErr )
GXInitPrinting (void) FOURWORDINLINE(0x203C, 0x0000, 0x0000, 0xABFE);
EXTERN_API( OSErr )
GXExitPrinting (void) FOURWORDINLINE(0x203C, 0x0000, 0x0001, 0xABFE);
/*
Error-Handling Routines
*/
EXTERN_API( OSErr )
GXGetJobError (gxJob aJob) FOURWORDINLINE(0x203C, 0x0000, 0x000E, 0xABFE);
EXTERN_API( void )
GXSetJobError (gxJob aJob,
OSErr anErr) FOURWORDINLINE(0x203C, 0x0000, 0x000F, 0xABFE);
/*
Job Routines
*/
EXTERN_API( OSErr )
GXNewJob (gxJob * aJob) FOURWORDINLINE(0x203C, 0x0000, 0x0002, 0xABFE);
EXTERN_API( OSErr )
GXDisposeJob (gxJob aJob) FOURWORDINLINE(0x203C, 0x0000, 0x0003, 0xABFE);
EXTERN_API( void )
GXFlattenJob (gxJob aJob,
GXPrintingFlattenUPP flattenProc,
void * aVoid) FOURWORDINLINE(0x203C, 0x0000, 0x0004, 0xABFE);
EXTERN_API( gxJob )
GXUnflattenJob (gxJob aJob,
GXPrintingFlattenUPP flattenProc,
void * aVoid) FOURWORDINLINE(0x203C, 0x0000, 0x0005, 0xABFE);
EXTERN_API( Handle )
GXFlattenJobToHdl (gxJob aJob,
Handle aHdl) FOURWORDINLINE(0x203C, 0x0000, 0x0006, 0xABFE);
EXTERN_API( gxJob )
GXUnflattenJobFromHdl (gxJob aJob,
Handle aHdl) FOURWORDINLINE(0x203C, 0x0000, 0x0007, 0xABFE);
EXTERN_API( void )
GXInstallApplicationOverride (gxJob aJob,
short messageID,
void * override) FOURWORDINLINE(0x203C, 0x0000, 0x0008, 0xABFE);
EXTERN_API( Collection )
GXGetJobCollection (gxJob aJob) FOURWORDINLINE(0x203C, 0x0000, 0x001D, 0xABFE);
EXTERN_API( void *)
GXGetJobRefCon (gxJob aJob) FOURWORDINLINE(0x203C, 0x0000, 0x001E, 0xABFE);
EXTERN_API( void )
GXSetJobRefCon (gxJob aJob,
void * refCon) FOURWORDINLINE(0x203C, 0x0000, 0x001F, 0xABFE);
EXTERN_API( gxJob )
GXCopyJob (gxJob srcJob,
gxJob dstJob) FOURWORDINLINE(0x203C, 0x0000, 0x0020, 0xABFE);
EXTERN_API( void )
GXSelectJobFormattingPrinter (gxJob aJob,
Str31 printerName) FOURWORDINLINE(0x203C, 0x0000, 0x0021, 0xABFE);
EXTERN_API( void )
GXSelectJobOutputPrinter (gxJob aJob,
Str31 printerName) FOURWORDINLINE(0x203C, 0x0000, 0x0022, 0xABFE);
EXTERN_API( void )
GXForEachJobFormatDo (gxJob aJob,
GXFormatUPP formatProc,
void * refCon) FOURWORDINLINE(0x203C, 0x0000, 0x0023, 0xABFE);
EXTERN_API( long )
GXCountJobFormats (gxJob aJob) FOURWORDINLINE(0x203C, 0x0000, 0x0024, 0xABFE);
EXTERN_API( Boolean )
GXUpdateJob (gxJob aJob) FOURWORDINLINE(0x203C, 0x0000, 0x0025, 0xABFE);
EXTERN_API( void )
GXConvertPrintRecord (gxJob aJob,
THPrint hPrint) FOURWORDINLINE(0x203C, 0x0000, 0x0026, 0xABFE);
EXTERN_API( void )
GXIdleJob (gxJob aJob) FOURWORDINLINE(0x203C, 0x0000, 0x0057, 0xABFE);
/*
Job Format Modes Routines
*/
EXTERN_API( void )
GXSetAvailableJobFormatModes (gxJob aJob,
gxJobFormatModeTableHdl formatModeTable) FOURWORDINLINE(0x203C, 0x0000, 0x003B, 0xABFE);
EXTERN_API( gxJobFormatMode )
GXGetPreferredJobFormatMode (gxJob aJob,
Boolean * directOnly) FOURWORDINLINE(0x203C, 0x0000, 0x003C, 0xABFE);
EXTERN_API( gxJobFormatMode )
GXGetJobFormatMode (gxJob aJob) FOURWORDINLINE(0x203C, 0x0000, 0x003D, 0xABFE);
EXTERN_API( void )
GXSetJobFormatMode (gxJob aJob,
gxJobFormatMode formatMode) FOURWORDINLINE(0x203C, 0x0000, 0x003E, 0xABFE);
EXTERN_API( void )
GXJobFormatModeQuery (gxJob aJob,
gxQueryType aQueryType,
void * srcData,
void * dstData) FOURWORDINLINE(0x203C, 0x0000, 0x003F, 0xABFE);
/*
Format Routines
*/
EXTERN_API( gxFormat )
GXNewFormat (gxJob aJob) FOURWORDINLINE(0x203C, 0x0000, 0x0009, 0xABFE);
EXTERN_API( void )
GXDisposeFormat (gxFormat aFormat) FOURWORDINLINE(0x203C, 0x0000, 0x000A, 0xABFE);
EXTERN_API( gxFormat )
GXGetJobFormat (gxJob aJob,
long whichFormat) FOURWORDINLINE(0x203C, 0x0000, 0x0013, 0xABFE);
EXTERN_API( gxJob )
GXGetFormatJob (gxFormat aFormat) FOURWORDINLINE(0x203C, 0x0000, 0x0014, 0xABFE);
EXTERN_API( gxPaperType )
GXGetFormatPaperType (gxFormat aFormat) FOURWORDINLINE(0x203C, 0x0000, 0x0015, 0xABFE);
EXTERN_API( void )
GXGetFormatDimensions (gxFormat aFormat,
gxRectangle * pageSize,
gxRectangle * paperSize) FOURWORDINLINE(0x203C, 0x0000, 0x0016, 0xABFE);
EXTERN_API( Collection )
GXGetFormatCollection (gxFormat aFormat) FOURWORDINLINE(0x203C, 0x0000, 0x0033, 0xABFE);
EXTERN_API( void )
GXChangedFormat (gxFormat aFormat) FOURWORDINLINE(0x203C, 0x0000, 0x0034, 0xABFE);
EXTERN_API( gxFormat )
GXCopyFormat (gxFormat srcFormat,
gxFormat dstFormat) FOURWORDINLINE(0x203C, 0x0000, 0x0035, 0xABFE);
EXTERN_API( gxFormat )
GXCloneFormat (gxFormat aFormat) FOURWORDINLINE(0x203C, 0x0000, 0x0036, 0xABFE);
EXTERN_API( long )
GXCountFormatOwners (gxFormat aFormat) FOURWORDINLINE(0x203C, 0x0000, 0x0037, 0xABFE);
EXTERN_API( void )
GXGetFormatMapping (gxFormat aFormat,
gxMapping * fmtMapping) FOURWORDINLINE(0x203C, 0x0000, 0x0038, 0xABFE);
EXTERN_API( gxShape )
GXGetFormatForm (gxFormat aFormat,
gxShape * mask) FOURWORDINLINE(0x203C, 0x0000, 0x0039, 0xABFE);
EXTERN_API( void )
GXSetFormatForm (gxFormat aFormat,
gxShape form,
gxShape mask) FOURWORDINLINE(0x203C, 0x0000, 0x003A, 0xABFE);
/*
PaperType Routines
*/
EXTERN_API( gxPaperType )
GXNewPaperType (gxJob aJob,
Str31 name,
gxRectangle * pageSize,
gxRectangle * paperSize) FOURWORDINLINE(0x203C, 0x0000, 0x000B, 0xABFE);
EXTERN_API( void )
GXDisposePaperType (gxPaperType aPaperType) FOURWORDINLINE(0x203C, 0x0000, 0x000C, 0xABFE);
EXTERN_API( gxPaperType )
GXGetNewPaperType (gxJob aJob,
short resID) FOURWORDINLINE(0x203C, 0x0000, 0x000D, 0xABFE);
EXTERN_API( long )
GXCountJobPaperTypes (gxJob aJob,
Boolean forFormatDevice) FOURWORDINLINE(0x203C, 0x0000, 0x0042, 0xABFE);
EXTERN_API( gxPaperType )
GXGetJobPaperType (gxJob aJob,
long whichPaperType,
Boolean forFormatDevice,
gxPaperType aPaperType) FOURWORDINLINE(0x203C, 0x0000, 0x0043, 0xABFE);
EXTERN_API( void )
GXForEachJobPaperTypeDo (gxJob aJob,
GXPaperTypeUPP aProc,
void * refCon,
Boolean forFormattingPrinter) FOURWORDINLINE(0x203C, 0x0000, 0x0044, 0xABFE);
EXTERN_API( gxPaperType )
GXCopyPaperType (gxPaperType srcPaperType,
gxPaperType dstPaperType) FOURWORDINLINE(0x203C, 0x0000, 0x0045, 0xABFE);
EXTERN_API( void )
GXGetPaperTypeName (gxPaperType aPaperType,
Str31 papertypeName) FOURWORDINLINE(0x203C, 0x0000, 0x0046, 0xABFE);
EXTERN_API( void )
GXGetPaperTypeDimensions (gxPaperType aPaperType,
gxRectangle * pageSize,
gxRectangle * paperSize) FOURWORDINLINE(0x203C, 0x0000, 0x0047, 0xABFE);
EXTERN_API( gxJob )
GXGetPaperTypeJob (gxPaperType aPaperType) FOURWORDINLINE(0x203C, 0x0000, 0x0048, 0xABFE);
EXTERN_API( Collection )
GXGetPaperTypeCollection (gxPaperType aPaperType) FOURWORDINLINE(0x203C, 0x0000, 0x0049, 0xABFE);
/*
Printer Routines
*/
EXTERN_API( gxPrinter )
GXGetJobFormattingPrinter (gxJob aJob) FOURWORDINLINE(0x203C, 0x0000, 0x0027, 0xABFE);
EXTERN_API( gxPrinter )
GXGetJobOutputPrinter (gxJob aJob) FOURWORDINLINE(0x203C, 0x0000, 0x0028, 0xABFE);
EXTERN_API( gxPrinter )
GXGetJobPrinter (gxJob aJob) FOURWORDINLINE(0x203C, 0x0000, 0x0029, 0xABFE);
EXTERN_API( gxJob )
GXGetPrinterJob (gxPrinter aPrinter) FOURWORDINLINE(0x203C, 0x0000, 0x002A, 0xABFE);
EXTERN_API( void )
GXForEachPrinterViewDeviceDo (gxPrinter aPrinter,
GXViewDeviceUPP aProc,
void * refCon) FOURWORDINLINE(0x203C, 0x0000, 0x002B, 0xABFE);
EXTERN_API( long )
GXCountPrinterViewDevices (gxPrinter aPrinter) FOURWORDINLINE(0x203C, 0x0000, 0x002C, 0xABFE);
EXTERN_API( gxViewDevice )
GXGetPrinterViewDevice (gxPrinter aPrinter,
long whichViewDevice) FOURWORDINLINE(0x203C, 0x0000, 0x002D, 0xABFE);
EXTERN_API( void )
GXSelectPrinterViewDevice (gxPrinter aPrinter,
long whichViewDevice) FOURWORDINLINE(0x203C, 0x0000, 0x002E, 0xABFE);
EXTERN_API( void )
GXGetPrinterName (gxPrinter aPrinter,
Str31 printerName) FOURWORDINLINE(0x203C, 0x0000, 0x002F, 0xABFE);
EXTERN_API( OSType )
GXGetPrinterType (gxPrinter aPrinter) FOURWORDINLINE(0x203C, 0x0000, 0x0030, 0xABFE);
EXTERN_API( void )
GXGetPrinterDriverName (gxPrinter aPrinter,
Str31 driverName) FOURWORDINLINE(0x203C, 0x0000, 0x0031, 0xABFE);
EXTERN_API( OSType )
GXGetPrinterDriverType (gxPrinter aPrinter) FOURWORDINLINE(0x203C, 0x0000, 0x0032, 0xABFE);
/*
Dialog Routines
*/
EXTERN_API( gxDialogResult )
GXJobDefaultFormatDialog (gxJob aJob,
gxEditMenuRecord * anEditMenuRec) FOURWORDINLINE(0x203C, 0x0000, 0x0010, 0xABFE);
EXTERN_API( gxDialogResult )
GXJobPrintDialog (gxJob aJob,
gxEditMenuRecord * anEditMenuRec) FOURWORDINLINE(0x203C, 0x0000, 0x0011, 0xABFE);
EXTERN_API( gxDialogResult )
GXFormatDialog (gxFormat aFormat,
gxEditMenuRecord * anEditMenuRec,
StringPtr title) FOURWORDINLINE(0x203C, 0x0000, 0x0012, 0xABFE);
EXTERN_API( void )
GXEnableJobScalingPanel (gxJob aJob,
Boolean enabled) FOURWORDINLINE(0x203C, 0x0000, 0x0040, 0xABFE);
EXTERN_API( void )
GXGetJobPanelDimensions (gxJob aJob,
Rect * panelArea) FOURWORDINLINE(0x203C, 0x0000, 0x0041, 0xABFE);
/*
Spooling Routines
*/
EXTERN_API( void )
GXGetJobPageRange (gxJob theJob,
long * firstPage,
long * lastPage) FOURWORDINLINE(0x203C, 0x0000, 0x0017, 0xABFE);
EXTERN_API( void )
GXStartJob (gxJob theJob,
StringPtr docName,
long pageCount) FOURWORDINLINE(0x203C, 0x0000, 0x0018, 0xABFE);
EXTERN_API( void )
GXPrintPage (gxJob theJob,
long pageNumber,
gxFormat theFormat,
gxShape thePage) FOURWORDINLINE(0x203C, 0x0000, 0x0019, 0xABFE);
EXTERN_API( Boolean )
GXStartPage (gxJob theJob,
long pageNumber,
gxFormat theFormat,
long numViewPorts,
gxViewPort * viewPortList) FOURWORDINLINE(0x203C, 0x0000, 0x001A, 0xABFE);
EXTERN_API( void )
GXFinishPage (gxJob theJob) FOURWORDINLINE(0x203C, 0x0000, 0x001B, 0xABFE);
EXTERN_API( void )
GXFinishJob (gxJob theJob) FOURWORDINLINE(0x203C, 0x0000, 0x001C, 0xABFE);
/*
PrintFile Routines
*/
EXTERN_API( gxPrintFile )
GXOpenPrintFile (gxJob theJob,
FSSpecPtr anFSSpec,
char permission) FOURWORDINLINE(0x203C, 0x0000, 0x004A, 0xABFE);
EXTERN_API( void )
GXClosePrintFile (gxPrintFile aPrintFile) FOURWORDINLINE(0x203C, 0x0000, 0x004B, 0xABFE);
EXTERN_API( gxJob )
GXGetPrintFileJob (gxPrintFile aPrintFile) FOURWORDINLINE(0x203C, 0x0000, 0x004C, 0xABFE);
EXTERN_API( long )
GXCountPrintFilePages (gxPrintFile aPrintFile) FOURWORDINLINE(0x203C, 0x0000, 0x004D, 0xABFE);
EXTERN_API( void )
GXReadPrintFilePage (gxPrintFile aPrintFile,
long pageNumber,
long numViewPorts,
gxViewPort * viewPortList,
gxFormat * pgFormat,
gxShape * pgShape) FOURWORDINLINE(0x203C, 0x0000, 0x004E, 0xABFE);
EXTERN_API( void )
GXReplacePrintFilePage (gxPrintFile aPrintFile,
long pageNumber,
gxFormat aFormat,
gxShape aShape) FOURWORDINLINE(0x203C, 0x0000, 0x004F, 0xABFE);
EXTERN_API( void )
GXInsertPrintFilePage (gxPrintFile aPrintFile,
long atPageNumber,
gxFormat pgFormat,
gxShape pgShape) FOURWORDINLINE(0x203C, 0x0000, 0x0050, 0xABFE);
EXTERN_API( void )
GXDeletePrintFilePageRange (gxPrintFile aPrintFile,
long fromPageNumber,
long toPageNumber) FOURWORDINLINE(0x203C, 0x0000, 0x0051, 0xABFE);
EXTERN_API( void )
GXSavePrintFile (gxPrintFile aPrintFile,
FSSpec * anFSSpec) FOURWORDINLINE(0x203C, 0x0000, 0x0052, 0xABFE);
/*
ColorSync Routines
*/
EXTERN_API( long )
GXFindPrinterProfile (gxPrinter aPrinter,
void * searchData,
long index,
gxColorProfile * returnedProfile) FOURWORDINLINE(0x203C, 0x0000, 0x0053, 0xABFE);
EXTERN_API( long )
GXFindFormatProfile (gxFormat aFormat,
void * searchData,
long index,
gxColorProfile * returnedProfile) FOURWORDINLINE(0x203C, 0x0000, 0x0054, 0xABFE);
EXTERN_API( void )
GXSetPrinterProfile (gxPrinter aPrinter,
gxColorProfile oldProfile,
gxColorProfile newProfile) FOURWORDINLINE(0x203C, 0x0000, 0x0055, 0xABFE);
EXTERN_API( void )
GXSetFormatProfile (gxFormat aFormat,
gxColorProfile oldProfile,
gxColorProfile newProfile) FOURWORDINLINE(0x203C, 0x0000, 0x0056, 0xABFE);
/************************************************************************
Start of old "GXPrintingResEquates.h/a/p" interface file.
*************************************************************************/
/* ------------------------------------
Basic client types
------------------------------------ */
#endif /* CALL_NOT_IN_CARBON */
enum {
gxPrintingManagerType = FOUR_CHAR_CODE('pmgr'),
gxImagingSystemType = FOUR_CHAR_CODE('gxis'),
gxPrinterDriverType = FOUR_CHAR_CODE('pdvr'),
gxPrintingExtensionType = FOUR_CHAR_CODE('pext'),
gxUnknownPrinterType = FOUR_CHAR_CODE('none'),
gxAnyPrinterType = FOUR_CHAR_CODE('univ'),
gxQuickdrawPrinterType = FOUR_CHAR_CODE('qdrw'),
gxPortableDocPrinterType = FOUR_CHAR_CODE('gxpd'),
gxRasterPrinterType = FOUR_CHAR_CODE('rast'),
gxPostscriptPrinterType = FOUR_CHAR_CODE('post'),
gxVectorPrinterType = FOUR_CHAR_CODE('vect')
};
/* All pre-defined printing collection items have this ID */
enum {
gxPrintingTagID = -28672
};
/* ----------------------------------------------------------------------
Resource types and IDs used by both extension and driver writers
---------------------------------------------------------------------- */
/* Resources in a printer driver or extension must be based off of these IDs */
enum {
gxPrintingDriverBaseID = -27648,
gxPrintingExtensionBaseID = -27136
};
/* Override resources tell the system what messages a driver or extension
is overriding. A driver may have a series of these resources. */
/* Override resource type for 68k resource-based code:*/
enum {
gxOverrideType = FOUR_CHAR_CODE('over')
};
/* Override resource type for PowerPC datafork-based code:*/
enum {
gxNativeOverrideType = FOUR_CHAR_CODE('povr')
};
/* --------------------------------------------------------------
Message ID definitions by both extension and driver writers
--------------------------------------------------------------- */
/* Identifiers for universal message overrides. */
enum {
gxInitializeMsg = 0,
gxShutDownMsg = 1,
gxJobIdleMsg = 2,
gxJobStatusMsg = 3,
gxPrintingEventMsg = 4,
gxJobDefaultFormatDialogMsg = 5,
gxFormatDialogMsg = 6,
gxJobPrintDialogMsg = 7,
gxFilterPanelEventMsg = 8,
gxHandlePanelEventMsg = 9,
gxParsePageRangeMsg = 10,
gxDefaultJobMsg = 11,
gxDefaultFormatMsg = 12,
gxDefaultPaperTypeMsg = 13,
gxDefaultPrinterMsg = 14,
gxCreateSpoolFileMsg = 15,
gxSpoolPageMsg = 16,
gxSpoolDataMsg = 17,
gxSpoolResourceMsg = 18,
gxCompleteSpoolFileMsg = 19,
gxCountPagesMsg = 20,
gxDespoolPageMsg = 21,
gxDespoolDataMsg = 22,
gxDespoolResourceMsg = 23,
gxCloseSpoolFileMsg = 24,
gxStartJobMsg = 25,
gxFinishJobMsg = 26,
gxStartPageMsg = 27,
gxFinishPageMsg = 28,
gxPrintPageMsg = 29,
gxSetupImageDataMsg = 30,
gxImageJobMsg = 31,
gxImageDocumentMsg = 32,
gxImagePageMsg = 33,
gxRenderPageMsg = 34,
gxCreateImageFileMsg = 35,
gxOpenConnectionMsg = 36,
gxCloseConnectionMsg = 37,
gxStartSendPageMsg = 38,
gxFinishSendPageMsg = 39,
gxWriteDataMsg = 40,
gxBufferDataMsg = 41,
gxDumpBufferMsg = 42,
gxFreeBufferMsg = 43,
gxCheckStatusMsg = 44,
gxGetDeviceStatusMsg = 45,
gxFetchTaggedDataMsg = 46,
gxGetDTPMenuListMsg = 47,
gxDTPMenuSelectMsg = 48,
gxHandleAlertFilterMsg = 49,
gxJobFormatModeQueryMsg = 50,
gxWriteStatusToDTPWindowMsg = 51,
gxInitializeStatusAlertMsg = 52,
gxHandleAlertStatusMsg = 53,
gxHandleAlertEventMsg = 54,
gxCleanupStartJobMsg = 55,
gxCleanupStartPageMsg = 56,
gxCleanupOpenConnectionMsg = 57,
gxCleanupStartSendPageMsg = 58,
gxDefaultDesktopPrinterMsg = 59,
gxCaptureOutputDeviceMsg = 60,
gxOpenConnectionRetryMsg = 61,
gxExamineSpoolFileMsg = 62,
gxFinishSendPlaneMsg = 63,
gxDoesPaperFitMsg = 64,
gxChooserMessageMsg = 65,
gxFindPrinterProfileMsg = 66,
gxFindFormatProfileMsg = 67,
gxSetPrinterProfileMsg = 68,
gxSetFormatProfileMsg = 69,
gxHandleAltDestinationMsg = 70,
gxSetupPageImageDataMsg = 71
};
/* Identifiers for Quickdraw message overrides. */
enum {
gxPrOpenDocMsg = 0,
gxPrCloseDocMsg = 1,
gxPrOpenPageMsg = 2,
gxPrClosePageMsg = 3,
gxPrintDefaultMsg = 4,
gxPrStlDialogMsg = 5,
gxPrJobDialogMsg = 6,
gxPrStlInitMsg = 7,
gxPrJobInitMsg = 8,
gxPrDlgMainMsg = 9,
gxPrValidateMsg = 10,
gxPrJobMergeMsg = 11,
gxPrGeneralMsg = 12,
gxConvertPrintRecordToMsg = 13,
gxConvertPrintRecordFromMsg = 14,
gxPrintRecordToJobMsg = 15
};
/* Identifiers for raster imaging message overrides. */
enum {
gxRasterDataInMsg = 0,
gxRasterLineFeedMsg = 1,
gxRasterPackageBitmapMsg = 2
};
/* Identifiers for PostScript imaging message overrides. */
enum {
gxPostscriptQueryPrinterMsg = 0,
gxPostscriptInitializePrinterMsg = 1,
gxPostscriptResetPrinterMsg = 2,
gxPostscriptExitServerMsg = 3,
gxPostscriptGetStatusTextMsg = 4,
gxPostscriptGetPrinterTextMsg = 5,
gxPostscriptScanStatusTextMsg = 6,
gxPostscriptScanPrinterTextMsg = 7,
gxPostscriptGetDocumentProcSetListMsg = 8,
gxPostscriptDownloadProcSetListMsg = 9,
gxPostscriptGetPrinterGlyphsInformationMsg = 10,
gxPostscriptStreamFontMsg = 11,
gxPostscriptDoDocumentHeaderMsg = 12,
gxPostscriptDoDocumentSetUpMsg = 13,
gxPostscriptDoDocumentTrailerMsg = 14,
gxPostscriptDoPageSetUpMsg = 15,
gxPostscriptSelectPaperTypeMsg = 16,
gxPostscriptDoPageTrailerMsg = 17,
gxPostscriptEjectPageMsg = 18,
gxPostscriptProcessShapeMsg = 19,
gxPostScriptEjectPendingPageMsg = 20
};
/* Identifiers for Vector imaging message overrides. */
enum {
gxVectorPackageDataMsg = 0,
gxVectorLoadPensMsg = 1,
gxVectorVectorizeShapeMsg = 2
};
/* Dialog related resource types */
enum {
gxPrintingAlertType = FOUR_CHAR_CODE('plrt'),
gxStatusType = FOUR_CHAR_CODE('stat'),
gxExtendedDITLType = FOUR_CHAR_CODE('xdtl'),
gxPrintPanelType = FOUR_CHAR_CODE('ppnl'),
gxCollectionType = FOUR_CHAR_CODE('cltn')
};
/* Communication resource types */
/*
    The looker resource is used by the Chooser PACK to determine what kind
    of communications this driver supports (in order to generate/handle the
    pop-up menu for "Connect via:").
The looker resource is also used by PrinterShare to determine the AppleTalk NBP Type
for servers created for this driver.
*/
enum {
gxLookerType = FOUR_CHAR_CODE('look'),
gxLookerID = -4096
};
/* The communications method and private data used to connect to the printer */
enum {
gxDeviceCommunicationsType = FOUR_CHAR_CODE('comm')
};
/* -------------------------------------------------
Resource types and IDs used by extension writers
------------------------------------------------- */
enum {
gxExtensionUniversalOverrideID = gxPrintingExtensionBaseID
};
enum {
gxExtensionImagingOverrideSelectorID = gxPrintingExtensionBaseID
};
enum {
gxExtensionScopeType = FOUR_CHAR_CODE('scop'),
gxDriverScopeID = gxPrintingExtensionBaseID,
gxPrinterScopeID = gxPrintingExtensionBaseID + 1,
gxPrinterExceptionScopeID = gxPrintingExtensionBaseID + 2
};
enum {
gxExtensionLoadType = FOUR_CHAR_CODE('load'),
gxExtensionLoadID = gxPrintingExtensionBaseID
};
enum {
gxExtensionLoadFirst = 0x00000100,
gxExtensionLoadAnywhere = 0x7FFFFFFF,
gxExtensionLoadLast = (long)0xFFFFFF00
};
enum {
gxExtensionOptimizationType = FOUR_CHAR_CODE('eopt'),
gxExtensionOptimizationID = gxPrintingExtensionBaseID
};
/* -----------------------------------------------
Resource types and IDs used by driver writers
----------------------------------------------- */
enum {
gxDriverUniversalOverrideID = gxPrintingDriverBaseID,
gxDriverImagingOverrideID = gxPrintingDriverBaseID + 1,
gxDriverCompatibilityOverrideID = gxPrintingDriverBaseID + 2
};
enum {
gxDriverFileFormatType = FOUR_CHAR_CODE('pfil'),
gxDriverFileFormatID = gxPrintingDriverBaseID
};
enum {
gxDestinationAdditionType = FOUR_CHAR_CODE('dsta'),
gxDestinationAdditionID = gxPrintingDriverBaseID
};
/* IMAGING RESOURCES */
/* The imaging system resource specifies which imaging system a printer
driver wishes to use. */
enum {
gxImagingSystemSelectorType = FOUR_CHAR_CODE('isys'),
gxImagingSystemSelectorID = gxPrintingDriverBaseID
};
/* 'exft' resource ID -- exclude font list */
enum {
kExcludeFontListType = FOUR_CHAR_CODE('exft'),
kExcludeFontListID = gxPrintingDriverBaseID
};
/* Resource for type for color matching */
enum {
gxColorMatchingDataType = FOUR_CHAR_CODE('prof'),
gxColorMatchingDataID = gxPrintingDriverBaseID
};
/* Resource type and id for the tray count */
enum {
gxTrayCountDataType = FOUR_CHAR_CODE('tray'),
gxTrayCountDataID = gxPrintingDriverBaseID
};
/* Resource type for the tray names */
enum {
gxTrayNameDataType = FOUR_CHAR_CODE('tryn')
};
/* Resource type for manual feed preferences, stored in DTP. */
enum {
gxManualFeedAlertPrefsType = FOUR_CHAR_CODE('mfpr'),
gxManualFeedAlertPrefsID = gxPrintingDriverBaseID
};
/* Resource type for desktop printer output characteristics, stored in DTP. */
enum {
gxDriverOutputType = FOUR_CHAR_CODE('outp'),
gxDriverOutputTypeID = 1
};
/* IO Resources */
/* Resource type and ID for default IO and buffering resources */
enum {
gxUniversalIOPrefsType = FOUR_CHAR_CODE('iobm'),
gxUniversalIOPrefsID = gxPrintingDriverBaseID
};
/* Resource types and IDs for default implementation of CaptureOutputDevice.
The default implementation of CaptureOutputDevice only handles PAP devices */
enum {
gxCaptureType = FOUR_CHAR_CODE('cpts'),
gxCaptureStringID = gxPrintingDriverBaseID,
gxReleaseStringID = gxPrintingDriverBaseID + 1,
gxUncapturedAppleTalkType = gxPrintingDriverBaseID + 2,
gxCapturedAppleTalkType = gxPrintingDriverBaseID + 3
};
/* Resource type and ID for custom halftone matrix */
enum {
gxCustomMatrixType = FOUR_CHAR_CODE('dmat'),
gxCustomMatrixID = gxPrintingDriverBaseID
};
/* Resource type and ID for raster driver rendering preferences */
enum {
gxRasterPrefsType = FOUR_CHAR_CODE('rdip'),
gxRasterPrefsID = gxPrintingDriverBaseID
};
/* Resource type for specifiying a colorset */
enum {
gxColorSetResType = FOUR_CHAR_CODE('crst')
};
/* Resource type and ID for raster driver packaging preferences */
enum {
gxRasterPackType = FOUR_CHAR_CODE('rpck'),
gxRasterPackID = gxPrintingDriverBaseID
};
/* Resource type and ID for raster driver packaging options */
enum {
gxRasterNumNone = 0, /* Number isn't output at all */
gxRasterNumDirect = 1, /* Lowest minWidth bytes as data */
gxRasterNumToASCII = 2 /* minWidth ASCII characters */
};
enum {
gxRasterPackOptionsType = FOUR_CHAR_CODE('ropt'),
gxRasterPackOptionsID = gxPrintingDriverBaseID
};
/* Resource type for the PostScript imaging system procedure set control resource */
enum {
gxPostscriptProcSetControlType = FOUR_CHAR_CODE('prec')
};
/* Resource type for the PostScript imaging system printer font resource */
enum {
gxPostscriptPrinterFontType = FOUR_CHAR_CODE('pfnt')
};
/* Resource type and ID for the PostScript imaging system imaging preferences */
enum {
gxPostscriptPrefsType = FOUR_CHAR_CODE('pdip'),
gxPostscriptPrefsID = gxPrintingDriverBaseID
};
/* Resource type and ID for the PostScript imaging system default scanning code */
enum {
gxPostscriptScanningType = FOUR_CHAR_CODE('scan'),
gxPostscriptScanningID = gxPrintingDriverBaseID
};
/* Old Application Support Resources */
enum {
gxCustType = FOUR_CHAR_CODE('cust'),
gxCustID = -8192
};
enum {
gxReslType = FOUR_CHAR_CODE('resl'),
gxReslID = -8192
};
enum {
gxDiscreteResolution = 0
};
enum {
gxStlDialogResID = -8192
};
enum {
gxJobDialogResID = -8191
};
enum {
gxScaleTableType = FOUR_CHAR_CODE('stab'),
gxDITLControlType = FOUR_CHAR_CODE('dctl')
};
/* The default implementation of gxPrintDefault loads and
PrValidates a print record stored in the following driver resource. */
enum {
gxPrintRecordType = FOUR_CHAR_CODE('PREC'),
gxDefaultPrintRecordID = 0
};
/*
-----------------------------------------------
Resource types and IDs used in papertype files
-----------------------------------------------
*/
/* Resource type and ID for driver papertypes placed in individual files */
enum {
gxSignatureType = FOUR_CHAR_CODE('sig '),
gxPapertypeSignatureID = 0
};
/* Papertype creator types */
enum {
    gxDrvrPaperType = FOUR_CHAR_CODE('drpt'), /* Driver paper type creator */
gxSysPaperType = FOUR_CHAR_CODE('sypt'), /* System paper type creator */
gxUserPaperType = FOUR_CHAR_CODE('uspt'), /* User paper type creator */
/* Driver creator types == driver file's creator value */
gxPaperTypeType = FOUR_CHAR_CODE('ptyp')
};
/*********************************************************************
Start of old "GXPrintingMessages.h/a/p" interface file.
**********************************************************************/
/* ------------------------------------------------------------------------------
Constants and Types
-------------------------------------------------------------------------------- */
/*
ABSTRACT DATA TYPES
*/
typedef struct OpaquegxSpoolFile* gxSpoolFile;
/*
DIALOG PANEL CONSTANTS AND TYPES
*/
typedef long gxPanelEvent;
/* Dialog panel event equates */
enum {
gxPanelNoEvt = 0L,
gxPanelOpenEvt = 1L, /* Initialize and draw */
    gxPanelCloseEvt = 2L, /* Your panel is going away (panel switch, confirm or cancel) */
gxPanelHitEvt = 3L, /* There's a hit in your panel */
gxPanelActivateEvt = 4L, /* The dialog window has just been activated */
gxPanelDeactivateEvt = 5L, /* The dialog window is about to be deactivated */
gxPanelIconFocusEvt = 6L, /* The focus changes from the panel to the icon list */
gxPanelPanelFocusEvt = 7L, /* The focus changes from the icon list to the panel */
gxPanelFilterEvt = 8L, /* Every event is filtered */
gxPanelCancelEvt = 9L, /* The user has cancelled the dialog */
gxPanelConfirmEvt = 10L, /* The user has confirmed the dialog */
    gxPanelDialogEvt = 11L, /* Event to be handled by the dialog handler */
gxPanelOtherEvt = 12L, /* osEvts, etc. */
gxPanelUserWillConfirmEvt = 13L /* User has selected confirm, time to parse panel interdependencies */
};
/* Constants for panel responses to dialog handler calls */
typedef long gxPanelResult;
enum {
gxPanelNoResult = 0,
gxPanelCancelConfirmation = 1 /* Only valid from panelUserWillConfirmEvt - used to keep the dialog from going away */
};
/* Panel event info record for FilterPanelEvent and HandlePanelEvent messages */
struct gxPanelInfoRecord {
gxPanelEvent panelEvt; /* Why we were called */
short panelResId; /* 'ppnl' resource ID of current panel */
DialogPtr pDlg; /* Pointer to dialog */
EventRecord * theEvent; /* Pointer to event */
short itemHit; /* Actual item number as Dialog Mgr thinks */
short itemCount; /* Number of items before your items */
short evtAction; /* Once this event is processed, the action that will result */
/* (evtAction is only meaningful during filtering) */
short errorStringId; /* STR ID of string to put in error alert (0 means no string) */
gxFormat theFormat; /* The current format (only meaningful in a format dialog) */
void * refCon; /* refCon passed in PanelSetupRecord */
};
typedef struct gxPanelInfoRecord gxPanelInfoRecord;
/* Constants for the evtAction field in PanelInfoRecord */
enum {
gxOtherAction = 0, /* Current item will not change */
gxClosePanelAction = 1, /* Panel will be closed */
gxCancelDialogAction = 2, /* Dialog will be cancelled */
gxConfirmDialogAction = 3 /* Dialog will be confirmed */
};
/* Constants for the panelKind field in gxPanelSetupRecord */
typedef long gxPrintingPanelKind;
/* The gxPanelSetupInfo structure is passed to GXSetupDialogPanel */
struct gxPanelSetupRecord {
gxPrintingPanelKind panelKind;
short panelResId;
short resourceRefNum;
void * refCon;
};
typedef struct gxPanelSetupRecord gxPanelSetupRecord;
enum {
gxApplicationPanel = 0L,
gxExtensionPanel = 1L,
gxDriverPanel = 2L
};
/* Constants returned by gxParsePageRange message */
typedef long gxParsePageRangeResult;
enum {
gxRangeNotParsed = 0L, /* Default initial value */
gxRangeParsed = 1L, /* Range has been parsed */
gxRangeBadFromValue = 2L, /* From value is bad */
gxRangeBadToValue = 3L /* To value is bad */
};
/*
STATUS-RELATED CONSTANTS AND TYPES
*/
/* Structure for status messages */
struct gxStatusRecord {
    unsigned short statusType; /* One of the statusType constants listed below (gxNonFatalError, etc.) */
unsigned short statusId; /* Specific status (out of paper, etc.) */
unsigned short statusAlertId; /* Printing alert ID (if any) for status */
gxOwnerSignature statusOwner; /* Creator type of status owner */
short statResId; /* ID for 'stat' resource */
short statResIndex; /* Index into 'stat' resource for this status */
short dialogResult; /* ID of button string selected on dismissal of printing alert */
unsigned short bufferLen; /* Number of bytes in status buffer - total record size must be <= 512 */
char statusBuffer[1]; /* User response from alert */
};
typedef struct gxStatusRecord gxStatusRecord;
/* Constants for statusType field of gxStatusRecord */
enum {
gxNonFatalError = 1, /* An error occurred, but the job can continue */
    gxFatalError = 2, /* A fatal error occurred; halt the job */
gxPrinterReady = 3, /* Tells QDGX to leave alert mode */
gxUserAttention = 4, /* Signals initiation of a modal alert */
gxUserAlert = 5, /* Signals initiation of a moveable modal alert */
gxPageTransmission = 6, /* Signals page sent to printer, increments page count in strings to user */
gxOpenConnectionStatus = 7, /* Signals QDGX to begin animation on printer icon */
gxInformationalStatus = 8, /* Default status type, no side effects */
gxSpoolingPageStatus = 9, /* Signals page spooled, increments page count in spooling dialog */
gxEndStatus = 10, /* Signals end of spooling */
    gxPercentageStatus = 11 /* Reports to QDGX the percentage of the job that is complete */
};
/* Structure for gxWriteStatusToDTPWindow message */
struct gxDisplayRecord {
Boolean useText; /* Use text as opposed to a picture */
char padByte;
Handle hPicture; /* if !useText, the picture handle */
Str255 theText; /* if useText, the text */
};
typedef struct gxDisplayRecord gxDisplayRecord;
/*-----------------------------------------------*/
/* paper mapping-related constants and types... */
/*-----------------------------------------------*/
typedef long gxTrayMapping;
enum {
gxDefaultTrayMapping = 0L,
gxConfiguredTrayMapping = 1L
};
/* ------------------------------------------------------------------------------
API Functions callable only from within message overrides
-------------------------------------------------------------------------------- */
#define GXPRINTINGDISPATCH(segID, selector) {0x203C, 0x0001, 0, 0x223C, (segID & 0x0FFF), selector << 2, 0xABFE}
/*
Message Sending API Routines
*/
#if TARGET_CPU_68K
#if CALL_NOT_IN_CARBON
EXTERN_API_C( OSErr )
GXPrintingDispatch (long selector,
...) SIXWORDINLINE(0x221F, 0x203C, 0x0001, 0x0000, 0xABFE, 0x598F);
#endif /* CALL_NOT_IN_CARBON */
#endif /* TARGET_CPU_68K */
/*
How to use the GXPRINTINGDISPATCH macro...
If your driver or extension is large, you may want to segment it
across smaller boundaries than is permitted by the messaging system.
Without using the Printing Manager's segmentation manager directly,
the smallest segment you can create consists of the code to override
a single message. If you are overriding workhorse messages such as
RenderPage, you may want to divide up the work among many functions
distributed across several segments. Here's how...
The Printing Manager segment scheme involves the construction of a
single 32-bit dispatch selector, which contains all the information
necessary for the dispatcher to find a single routine. It contains the
segment's resource ID, and the offset within the segment which contains
the start of the routine. The GXPRINTINGDISPATCH macro will construct the
dispatch selector for you, as well as the code to do the dispatch.
Usually, it is convenient to start your segment with a long aligned jump table,
beginning after the 4 byte header required by the Printing Manager. The
macro assumes this is the case and takes a 1-based routine selector from
    which it constructs the offset.
For example, if your code is in resource 'pdvr' (print driver), ID=2
at offset=12 (third routine in segment), you would declare your
routine as follows:
OSErr MyRenderingRoutine (long param1, Ptr param2)
= GXPRINTINGDISPATCH(2, 3);
Remember, ALL segment dispatches must return OSErr. If your routine
does not generate errors, you must still declare it to return OSErr
and have the routine itself return noErr.
An alternative way to call across segments is to call the GXPrintingDispatch
function directly. You must construct the 32-bit selector yourself and pass
it as the first parameter. This is usually not preferable since you don't get
type-checking unless you declare a prototype as shown above, and your code
isn't as easy to read.
So given the above prototype, there are two ways to call the function:
anErr = MyRenderingRoutine(p1, p2); // Free type checking!
or:
#define kMyRenderRoutineSelector 0x0002000C
anErr = GXPrintingDispatch(kMyRenderRoutineSelector, p1, p2); // No type-checking!
Both have the same effect.
*/
#if CALL_NOT_IN_CARBON
EXTERN_API_C( gxJob )
GXGetJob (void) FOURWORDINLINE(0x203C, 0x0001, 0x0001, 0xABFE);
EXTERN_API_C( short )
GXGetMessageHandlerResFile (void) FOURWORDINLINE(0x203C, 0x0001, 0x0002, 0xABFE);
EXTERN_API_C( Boolean )
GXSpoolingAborted (void) FOURWORDINLINE(0x203C, 0x0001, 0x0003, 0xABFE);
EXTERN_API_C( OSErr )
GXJobIdle (void) FOURWORDINLINE(0x203C, 0x0001, 0x0004, 0xABFE);
EXTERN_API_C( OSErr )
GXReportStatus (long statusID,
unsigned long statusIndex) FOURWORDINLINE(0x203C, 0x0001, 0x0005, 0xABFE);
EXTERN_API_C( OSErr )
GXAlertTheUser (gxStatusRecord * statusRec) FOURWORDINLINE(0x203C, 0x0001, 0x0006, 0xABFE);
EXTERN_API_C( OSErr )
GXSetupDialogPanel (gxPanelSetupRecord * panelRec) FOURWORDINLINE(0x203C, 0x0001, 0x0007, 0xABFE);
EXTERN_API_C( OSErr )
GXCountTrays (gxTrayIndex * numTrays) FOURWORDINLINE(0x203C, 0x0001, 0x0008, 0xABFE);
EXTERN_API_C( OSErr )
GXGetTrayName (gxTrayIndex trayNumber,
Str31 trayName) FOURWORDINLINE(0x203C, 0x0001, 0x0009, 0xABFE);
EXTERN_API_C( OSErr )
GXSetTrayPaperType (gxTrayIndex whichTray,
gxPaperType aPapertype) FOURWORDINLINE(0x203C, 0x0001, 0x000A, 0xABFE);
EXTERN_API_C( OSErr )
GXGetTrayPaperType (gxTrayIndex whichTray,
gxPaperType aPapertype) FOURWORDINLINE(0x203C, 0x0001, 0x000B, 0xABFE);
EXTERN_API_C( OSErr )
GXGetTrayMapping (gxTrayMapping * trayMapping) FOURWORDINLINE(0x203C, 0x0001, 0x000C, 0xABFE);
EXTERN_API_C( void )
GXCleanupStartJob (void) FOURWORDINLINE(0x203C, 0x0001, 0x000D, 0xABFE);
EXTERN_API_C( void )
GXCleanupStartPage (void) FOURWORDINLINE(0x203C, 0x0001, 0x000E, 0xABFE);
EXTERN_API_C( void )
GXCleanupOpenConnection (void) FOURWORDINLINE(0x203C, 0x0001, 0x000F, 0xABFE);
EXTERN_API_C( void )
GXCleanupStartSendPage (void) FOURWORDINLINE(0x203C, 0x0001, 0x0010, 0xABFE);
/* ------------------------------------------------------------------------------
    Constants and types for Universal Printing Messages
------------------------------------------------------------------------------ */
/* Options for gxCreateSpoolFile message */
#endif /* CALL_NOT_IN_CARBON */
enum {
gxNoCreateOptions = 0x00000000, /* Just create the file */
gxInhibitAlias = 0x00000001, /* Do not create an alias in the PMD folder */
gxInhibitUniqueName = 0x00000002, /* Do not append to the filename to make it unique */
gxResolveBitmapAlias = 0x00000004 /* Resolve bitmap aliases and duplicate data in file */
};
/* Options for gxCloseSpoolFile message */
enum {
gxNoCloseOptions = 0x00000000, /* Just close the file */
gxDeleteOnClose = 0x00000001, /* Delete the file rather than closing it */
gxUpdateJobData = 0x00000002, /* Write current job information into file prior to closing */
gxMakeRemoteFile = 0x00000004 /* Mark job as a remote file */
};
/* Options for gxCreateImageFile message */
enum {
gxNoImageFile = 0x00000000, /* Don't create image file */
gxMakeImageFile = 0x00000001, /* Create an image file */
gxEachPlane = 0x00000002, /* Only save up planes before rewinding */
gxEachPage = 0x00000004, /* Save up entire pages before rewinding */
gxEntireFile = gxEachPlane + gxEachPage /* Save up the entire file before rewinding */
};
/* Options for gxBufferData message */
enum {
gxNoBufferOptions = 0x00000000,
gxMakeBufferHex = 0x00000001,
gxDontSplitBuffer = 0x00000002
};
/* Structure for gxDumpBuffer and gxFreeBuffer messages */
struct gxPrintingBuffer {
long size; /* Size of buffer in bytes */
long userData; /* Client assigned id for the buffer */
char data[1]; /* Array of size bytes */
};
typedef struct gxPrintingBuffer gxPrintingBuffer;
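/*
   Illustrative sketch (not part of the original interface): because the data
   field is declared as a one-element array, a gxPrintingBuffer that holds n
   data bytes is allocated with n - 1 extra bytes beyond the struct itself:

   gxPrintingBuffer *buf =
       (gxPrintingBuffer *) NewPtr(sizeof(gxPrintingBuffer) + n - 1);
   if (buf != nil) {
       buf->size     = n;      // number of bytes in data[]
       buf->userData = 0;      // client-assigned id for the buffer
   }

   NewPtr/nil are the classic Mac OS Memory Manager idiom; any allocator that
   returns a block of the right size would serve equally well.
*/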
/* Structure for gxRenderPage message */
struct gxPageInfoRecord {
long docPageNum; /* Number of page being printed */
long copyNum; /* Copy number being printed */
Boolean formatChanged; /* True if format changed from last page */
Boolean pageChanged; /* True if page contents changed from last page */
long internalUse; /* Private */
};
typedef struct gxPageInfoRecord gxPageInfoRecord;
/* ------------------------------------------------------------------------------
    Universal Printing Messages
------------------------------------------------------------------------------ */
typedef CALLBACK_API_C( OSErr , GXJobIdleProcPtr )(void );
typedef STACK_UPP_TYPE(GXJobIdleProcPtr) GXJobIdleUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXJobIdleUPP)
NewGXJobIdleUPP (GXJobIdleProcPtr userRoutine);
EXTERN_API(void)
DisposeGXJobIdleUPP (GXJobIdleUPP userUPP);
EXTERN_API(OSErr)
InvokeGXJobIdleUPP (GXJobIdleUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXJobIdleProcInfo = 0x00000021 }; /* 2_bytes Func() */
#define NewGXJobIdleUPP(userRoutine) (GXJobIdleUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXJobIdleProcInfo, GetCurrentArchitecture())
#define DisposeGXJobIdleUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXJobIdleUPP(userUPP) (OSErr)CALL_ZERO_PARAMETER_UPP((userUPP), uppGXJobIdleProcInfo)
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXJobIdleProc(userRoutine) NewGXJobIdleUPP(userRoutine)
#define CallGXJobIdleProc(userRoutine) InvokeGXJobIdleUPP(userRoutine)
#define Send_GXJobIdle() MacSendMessage(0x00000002)
#define Forward_GXJobIdle() ForwardThisMessage((void *) (0))
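/*
   Illustrative sketch (not part of the original interface): the typical
   lifetime of one of these UPPs, using GXJobIdle as the example:

   static OSErr MyIdleProc(void)                // must return OSErr
   {
       return noErr;                            // no errors to report
   }

   GXJobIdleUPP idleUPP = NewGXJobIdleUPP(MyIdleProc);
   OSErr err = InvokeGXJobIdleUPP(idleUPP);     // call through the UPP
   DisposeGXJobIdleUPP(idleUPP);                // free the routine descriptor

   The same New/Invoke/Dispose pattern applies to every UPP type declared
   below; only the parameter lists differ.
*/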
typedef CALLBACK_API_C( OSErr , GXJobStatusProcPtr )(gxStatusRecord *pStatus);
typedef STACK_UPP_TYPE(GXJobStatusProcPtr) GXJobStatusUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXJobStatusUPP)
NewGXJobStatusUPP (GXJobStatusProcPtr userRoutine);
EXTERN_API(void)
DisposeGXJobStatusUPP (GXJobStatusUPP userUPP);
EXTERN_API(OSErr)
InvokeGXJobStatusUPP (gxStatusRecord * pStatus,
GXJobStatusUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXJobStatusProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXJobStatusUPP(userRoutine) (GXJobStatusUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXJobStatusProcInfo, GetCurrentArchitecture())
#define DisposeGXJobStatusUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXJobStatusUPP(pStatus, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXJobStatusProcInfo, (pStatus))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXJobStatusProc(userRoutine) NewGXJobStatusUPP(userRoutine)
#define CallGXJobStatusProc(userRoutine, pStatus) InvokeGXJobStatusUPP(pStatus, userRoutine)
#define Send_GXJobStatus(pStatus) \
MacSendMessage(0x00000003, pStatus)
#define Forward_GXJobStatus(pStatus) \
ForwardThisMessage((void *) (pStatus))
typedef CALLBACK_API_C( OSErr , GXPrintingEventProcPtr )(EventRecord *evtRecord, Boolean filterEvent);
typedef STACK_UPP_TYPE(GXPrintingEventProcPtr) GXPrintingEventUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXPrintingEventUPP)
NewGXPrintingEventUPP (GXPrintingEventProcPtr userRoutine);
EXTERN_API(void)
DisposeGXPrintingEventUPP (GXPrintingEventUPP userUPP);
EXTERN_API(OSErr)
InvokeGXPrintingEventUPP (EventRecord * evtRecord,
Boolean filterEvent,
GXPrintingEventUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXPrintingEventProcInfo = 0x000001E1 }; /* 2_bytes Func(4_bytes, 1_byte) */
#define NewGXPrintingEventUPP(userRoutine) (GXPrintingEventUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXPrintingEventProcInfo, GetCurrentArchitecture())
#define DisposeGXPrintingEventUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXPrintingEventUPP(evtRecord, filterEvent, userUPP) (OSErr)CALL_TWO_PARAMETER_UPP((userUPP), uppGXPrintingEventProcInfo, (evtRecord), (filterEvent))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXPrintingEventProc(userRoutine) NewGXPrintingEventUPP(userRoutine)
#define CallGXPrintingEventProc(userRoutine, evtRecord, filterEvent) InvokeGXPrintingEventUPP(evtRecord, filterEvent, userRoutine)
#define Send_GXPrintingEvent(evtRecord, filterEvent) \
MacSendMessage(0x00000004, evtRecord, filterEvent)
#define Forward_GXPrintingEvent(evtRecord, filterEvent) \
ForwardThisMessage((void *) (evtRecord), (void *) (filterEvent))
typedef CALLBACK_API_C( OSErr , GXJobDefaultFormatDialogProcPtr )(gxDialogResult *dlgResult);
typedef STACK_UPP_TYPE(GXJobDefaultFormatDialogProcPtr) GXJobDefaultFormatDialogUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXJobDefaultFormatDialogUPP)
NewGXJobDefaultFormatDialogUPP (GXJobDefaultFormatDialogProcPtr userRoutine);
EXTERN_API(void)
DisposeGXJobDefaultFormatDialogUPP (GXJobDefaultFormatDialogUPP userUPP);
EXTERN_API(OSErr)
InvokeGXJobDefaultFormatDialogUPP (gxDialogResult * dlgResult,
GXJobDefaultFormatDialogUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXJobDefaultFormatDialogProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXJobDefaultFormatDialogUPP(userRoutine) (GXJobDefaultFormatDialogUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXJobDefaultFormatDialogProcInfo, GetCurrentArchitecture())
#define DisposeGXJobDefaultFormatDialogUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXJobDefaultFormatDialogUPP(dlgResult, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXJobDefaultFormatDialogProcInfo, (dlgResult))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXJobDefaultFormatDialogProc(userRoutine) NewGXJobDefaultFormatDialogUPP(userRoutine)
#define CallGXJobDefaultFormatDialogProc(userRoutine, dlgResult) InvokeGXJobDefaultFormatDialogUPP(dlgResult, userRoutine)
#define Send_GXJobDefaultFormatDialog(dlgResult) \
MacSendMessage(0x00000005, dlgResult)
#define Forward_GXJobDefaultFormatDialog(dlgResult) \
ForwardThisMessage((void *) (dlgResult))
typedef CALLBACK_API_C( OSErr , GXFormatDialogProcPtr )(gxFormat theFormat, StringPtr title, gxDialogResult *dlgResult);
typedef STACK_UPP_TYPE(GXFormatDialogProcPtr) GXFormatDialogUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXFormatDialogUPP)
NewGXFormatDialogUPP (GXFormatDialogProcPtr userRoutine);
EXTERN_API(void)
DisposeGXFormatDialogUPP (GXFormatDialogUPP userUPP);
EXTERN_API(OSErr)
InvokeGXFormatDialogUPP (gxFormat theFormat,
StringPtr title,
gxDialogResult * dlgResult,
GXFormatDialogUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXFormatDialogProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXFormatDialogUPP(userRoutine) (GXFormatDialogUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXFormatDialogProcInfo, GetCurrentArchitecture())
#define DisposeGXFormatDialogUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXFormatDialogUPP(theFormat, title, dlgResult, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXFormatDialogProcInfo, (theFormat), (title), (dlgResult))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXFormatDialogProc(userRoutine) NewGXFormatDialogUPP(userRoutine)
#define CallGXFormatDialogProc(userRoutine, theFormat, title, dlgResult) InvokeGXFormatDialogUPP(theFormat, title, dlgResult, userRoutine)
#define Send_GXFormatDialog(theFormat, title, dlgResult) \
MacSendMessage(0x00000006, theFormat, title, dlgResult)
#define Forward_GXFormatDialog(theFormat, title, dlgResult) \
ForwardThisMessage((void *) (theFormat),(void *) (title),(void *) (dlgResult))
typedef CALLBACK_API_C( OSErr , GXJobPrintDialogProcPtr )(gxDialogResult *dlgResult);
typedef STACK_UPP_TYPE(GXJobPrintDialogProcPtr) GXJobPrintDialogUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXJobPrintDialogUPP)
NewGXJobPrintDialogUPP (GXJobPrintDialogProcPtr userRoutine);
EXTERN_API(void)
DisposeGXJobPrintDialogUPP (GXJobPrintDialogUPP userUPP);
EXTERN_API(OSErr)
InvokeGXJobPrintDialogUPP (gxDialogResult * dlgResult,
GXJobPrintDialogUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXJobPrintDialogProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXJobPrintDialogUPP(userRoutine) (GXJobPrintDialogUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXJobPrintDialogProcInfo, GetCurrentArchitecture())
#define DisposeGXJobPrintDialogUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXJobPrintDialogUPP(dlgResult, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXJobPrintDialogProcInfo, (dlgResult))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXJobPrintDialogProc(userRoutine) NewGXJobPrintDialogUPP(userRoutine)
#define CallGXJobPrintDialogProc(userRoutine, dlgResult) InvokeGXJobPrintDialogUPP(dlgResult, userRoutine)
#define Send_GXJobPrintDialog(dlgResult) \
MacSendMessage(0x00000007, dlgResult)
#define Forward_GXJobPrintDialog(dlgResult) \
ForwardThisMessage((void *) (dlgResult))
typedef CALLBACK_API_C( OSErr , GXFilterPanelEventProcPtr )(gxPanelInfoRecord *pHitInfo, Boolean *returnImmed);
typedef STACK_UPP_TYPE(GXFilterPanelEventProcPtr) GXFilterPanelEventUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXFilterPanelEventUPP)
NewGXFilterPanelEventUPP (GXFilterPanelEventProcPtr userRoutine);
EXTERN_API(void)
DisposeGXFilterPanelEventUPP (GXFilterPanelEventUPP userUPP);
EXTERN_API(OSErr)
InvokeGXFilterPanelEventUPP (gxPanelInfoRecord * pHitInfo,
Boolean * returnImmed,
GXFilterPanelEventUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXFilterPanelEventProcInfo = 0x000003E1 }; /* 2_bytes Func(4_bytes, 4_bytes) */
#define NewGXFilterPanelEventUPP(userRoutine) (GXFilterPanelEventUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXFilterPanelEventProcInfo, GetCurrentArchitecture())
#define DisposeGXFilterPanelEventUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXFilterPanelEventUPP(pHitInfo, returnImmed, userUPP) (OSErr)CALL_TWO_PARAMETER_UPP((userUPP), uppGXFilterPanelEventProcInfo, (pHitInfo), (returnImmed))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXFilterPanelEventProc(userRoutine) NewGXFilterPanelEventUPP(userRoutine)
#define CallGXFilterPanelEventProc(userRoutine, pHitInfo, returnImmed) InvokeGXFilterPanelEventUPP(pHitInfo, returnImmed, userRoutine)
#define Send_GXFilterPanelEvent(pHitInfo, returnImmed) \
MacSendMessage(0x00000008, pHitInfo, returnImmed)
typedef CALLBACK_API_C( OSErr , GXHandlePanelEventProcPtr )(gxPanelInfoRecord *pHitInfo, gxPanelResult *panelResponse);
typedef STACK_UPP_TYPE(GXHandlePanelEventProcPtr) GXHandlePanelEventUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXHandlePanelEventUPP)
NewGXHandlePanelEventUPP (GXHandlePanelEventProcPtr userRoutine);
EXTERN_API(void)
DisposeGXHandlePanelEventUPP (GXHandlePanelEventUPP userUPP);
EXTERN_API(OSErr)
InvokeGXHandlePanelEventUPP (gxPanelInfoRecord * pHitInfo,
gxPanelResult * panelResponse,
GXHandlePanelEventUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXHandlePanelEventProcInfo = 0x000003E1 }; /* 2_bytes Func(4_bytes, 4_bytes) */
#define NewGXHandlePanelEventUPP(userRoutine) (GXHandlePanelEventUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXHandlePanelEventProcInfo, GetCurrentArchitecture())
#define DisposeGXHandlePanelEventUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXHandlePanelEventUPP(pHitInfo, panelResponse, userUPP) (OSErr)CALL_TWO_PARAMETER_UPP((userUPP), uppGXHandlePanelEventProcInfo, (pHitInfo), (panelResponse))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXHandlePanelEventProc(userRoutine) NewGXHandlePanelEventUPP(userRoutine)
#define CallGXHandlePanelEventProc(userRoutine, pHitInfo, panelResponse) InvokeGXHandlePanelEventUPP(pHitInfo, panelResponse, userRoutine)
#define Send_GXHandlePanelEvent(pHitInfo, panelResponse) \
MacSendMessage(0x00000009, pHitInfo, panelResponse)
typedef CALLBACK_API_C( OSErr , GXParsePageRangeProcPtr )(StringPtr fromString, StringPtr toString, gxParsePageRangeResult *result);
typedef STACK_UPP_TYPE(GXParsePageRangeProcPtr) GXParsePageRangeUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXParsePageRangeUPP)
NewGXParsePageRangeUPP (GXParsePageRangeProcPtr userRoutine);
EXTERN_API(void)
DisposeGXParsePageRangeUPP (GXParsePageRangeUPP userUPP);
EXTERN_API(OSErr)
InvokeGXParsePageRangeUPP (StringPtr fromString,
StringPtr toString,
gxParsePageRangeResult * result,
GXParsePageRangeUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXParsePageRangeProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXParsePageRangeUPP(userRoutine) (GXParsePageRangeUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXParsePageRangeProcInfo, GetCurrentArchitecture())
#define DisposeGXParsePageRangeUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXParsePageRangeUPP(fromString, toString, result, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXParsePageRangeProcInfo, (fromString), (toString), (result))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXParsePageRangeProc(userRoutine) NewGXParsePageRangeUPP(userRoutine)
#define CallGXParsePageRangeProc(userRoutine, fromString, toString, result) InvokeGXParsePageRangeUPP(fromString, toString, result, userRoutine)
#define Send_GXParsePageRange(fromString, toString, result) \
MacSendMessage(0x0000000A, fromString, toString, result)
#define Forward_GXParsePageRange(fromString, toString, result) \
ForwardThisMessage((void *) (fromString), (void *) (toString), (void *) (result))
typedef CALLBACK_API_C( OSErr , GXDefaultJobProcPtr )(void );
typedef STACK_UPP_TYPE(GXDefaultJobProcPtr) GXDefaultJobUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXDefaultJobUPP)
NewGXDefaultJobUPP (GXDefaultJobProcPtr userRoutine);
EXTERN_API(void)
DisposeGXDefaultJobUPP (GXDefaultJobUPP userUPP);
EXTERN_API(OSErr)
InvokeGXDefaultJobUPP (GXDefaultJobUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXDefaultJobProcInfo = 0x00000021 }; /* 2_bytes Func() */
#define NewGXDefaultJobUPP(userRoutine) (GXDefaultJobUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXDefaultJobProcInfo, GetCurrentArchitecture())
#define DisposeGXDefaultJobUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXDefaultJobUPP(userUPP) (OSErr)CALL_ZERO_PARAMETER_UPP((userUPP), uppGXDefaultJobProcInfo)
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXDefaultJobProc(userRoutine) NewGXDefaultJobUPP(userRoutine)
#define CallGXDefaultJobProc(userRoutine) InvokeGXDefaultJobUPP(userRoutine)
#define Send_GXDefaultJob() MacSendMessage(0x0000000B)
#define Forward_GXDefaultJob() ForwardThisMessage((void *) (0))
typedef CALLBACK_API_C( OSErr , GXDefaultFormatProcPtr )(gxFormat theFormat);
typedef STACK_UPP_TYPE(GXDefaultFormatProcPtr) GXDefaultFormatUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXDefaultFormatUPP)
NewGXDefaultFormatUPP (GXDefaultFormatProcPtr userRoutine);
EXTERN_API(void)
DisposeGXDefaultFormatUPP (GXDefaultFormatUPP userUPP);
EXTERN_API(OSErr)
InvokeGXDefaultFormatUPP (gxFormat theFormat,
GXDefaultFormatUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXDefaultFormatProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXDefaultFormatUPP(userRoutine) (GXDefaultFormatUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXDefaultFormatProcInfo, GetCurrentArchitecture())
#define DisposeGXDefaultFormatUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXDefaultFormatUPP(theFormat, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXDefaultFormatProcInfo, (theFormat))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXDefaultFormatProc(userRoutine) NewGXDefaultFormatUPP(userRoutine)
#define CallGXDefaultFormatProc(userRoutine, theFormat) InvokeGXDefaultFormatUPP(theFormat, userRoutine)
#define Send_GXDefaultFormat(theFormat) \
MacSendMessage(0x0000000C, theFormat)
#define Forward_GXDefaultFormat(theFormat) \
ForwardThisMessage((void *) (theFormat))
typedef CALLBACK_API_C( OSErr , GXDefaultPaperTypeProcPtr )(gxPaperType thePaperType);
typedef STACK_UPP_TYPE(GXDefaultPaperTypeProcPtr) GXDefaultPaperTypeUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXDefaultPaperTypeUPP)
NewGXDefaultPaperTypeUPP (GXDefaultPaperTypeProcPtr userRoutine);
EXTERN_API(void)
DisposeGXDefaultPaperTypeUPP (GXDefaultPaperTypeUPP userUPP);
EXTERN_API(OSErr)
InvokeGXDefaultPaperTypeUPP (gxPaperType thePaperType,
GXDefaultPaperTypeUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXDefaultPaperTypeProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXDefaultPaperTypeUPP(userRoutine) (GXDefaultPaperTypeUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXDefaultPaperTypeProcInfo, GetCurrentArchitecture())
#define DisposeGXDefaultPaperTypeUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXDefaultPaperTypeUPP(thePaperType, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXDefaultPaperTypeProcInfo, (thePaperType))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXDefaultPaperTypeProc(userRoutine) NewGXDefaultPaperTypeUPP(userRoutine)
#define CallGXDefaultPaperTypeProc(userRoutine, thePaperType) InvokeGXDefaultPaperTypeUPP(thePaperType, userRoutine)
#define Send_GXDefaultPaperType(thePaperType) \
MacSendMessage(0x0000000D, thePaperType)
#define Forward_GXDefaultPaperType(thePaperType) \
ForwardThisMessage((void *) thePaperType)
typedef CALLBACK_API_C( OSErr , GXDefaultPrinterProcPtr )(gxPrinter thePrinter);
typedef STACK_UPP_TYPE(GXDefaultPrinterProcPtr) GXDefaultPrinterUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXDefaultPrinterUPP)
NewGXDefaultPrinterUPP (GXDefaultPrinterProcPtr userRoutine);
EXTERN_API(void)
DisposeGXDefaultPrinterUPP (GXDefaultPrinterUPP userUPP);
EXTERN_API(OSErr)
InvokeGXDefaultPrinterUPP (gxPrinter thePrinter,
GXDefaultPrinterUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXDefaultPrinterProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXDefaultPrinterUPP(userRoutine) (GXDefaultPrinterUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXDefaultPrinterProcInfo, GetCurrentArchitecture())
#define DisposeGXDefaultPrinterUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXDefaultPrinterUPP(thePrinter, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXDefaultPrinterProcInfo, (thePrinter))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXDefaultPrinterProc(userRoutine) NewGXDefaultPrinterUPP(userRoutine)
#define CallGXDefaultPrinterProc(userRoutine, thePrinter) InvokeGXDefaultPrinterUPP(thePrinter, userRoutine)
#define Send_GXDefaultPrinter(thePrinter) \
MacSendMessage(0x0000000E, thePrinter)
#define Forward_GXDefaultPrinter(thePrinter) \
ForwardThisMessage((void *) thePrinter)
typedef CALLBACK_API_C( OSErr , GXCreateSpoolFileProcPtr )(FSSpecPtr pFileSpec, long createOptions, gxSpoolFile *theSpoolFile);
typedef STACK_UPP_TYPE(GXCreateSpoolFileProcPtr) GXCreateSpoolFileUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXCreateSpoolFileUPP)
NewGXCreateSpoolFileUPP (GXCreateSpoolFileProcPtr userRoutine);
EXTERN_API(void)
DisposeGXCreateSpoolFileUPP (GXCreateSpoolFileUPP userUPP);
EXTERN_API(OSErr)
InvokeGXCreateSpoolFileUPP (FSSpecPtr pFileSpec,
long createOptions,
gxSpoolFile * theSpoolFile,
GXCreateSpoolFileUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXCreateSpoolFileProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXCreateSpoolFileUPP(userRoutine) (GXCreateSpoolFileUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXCreateSpoolFileProcInfo, GetCurrentArchitecture())
#define DisposeGXCreateSpoolFileUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXCreateSpoolFileUPP(pFileSpec, createOptions, theSpoolFile, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXCreateSpoolFileProcInfo, (pFileSpec), (createOptions), (theSpoolFile))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXCreateSpoolFileProc(userRoutine) NewGXCreateSpoolFileUPP(userRoutine)
#define CallGXCreateSpoolFileProc(userRoutine, pFileSpec, createOptions, theSpoolFile) InvokeGXCreateSpoolFileUPP(pFileSpec, createOptions, theSpoolFile, userRoutine)
#define Send_GXCreateSpoolFile(pFileSpec, createOptions, theSpoolFile) \
MacSendMessage(0x0000000F, pFileSpec, createOptions, \
theSpoolFile)
#define Forward_GXCreateSpoolFile(pFileSpec, createOptions, theSpoolFile) \
ForwardThisMessage((void *) pFileSpec, (void *) (createOptions), (void *) theSpoolFile)
typedef CALLBACK_API_C( OSErr , GXSpoolPageProcPtr )(gxSpoolFile theSpoolFile, gxFormat theFormat, gxShape thePage);
typedef STACK_UPP_TYPE(GXSpoolPageProcPtr) GXSpoolPageUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXSpoolPageUPP)
NewGXSpoolPageUPP (GXSpoolPageProcPtr userRoutine);
EXTERN_API(void)
DisposeGXSpoolPageUPP (GXSpoolPageUPP userUPP);
EXTERN_API(OSErr)
InvokeGXSpoolPageUPP (gxSpoolFile theSpoolFile,
gxFormat theFormat,
gxShape thePage,
GXSpoolPageUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXSpoolPageProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXSpoolPageUPP(userRoutine) (GXSpoolPageUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXSpoolPageProcInfo, GetCurrentArchitecture())
#define DisposeGXSpoolPageUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXSpoolPageUPP(theSpoolFile, theFormat, thePage, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXSpoolPageProcInfo, (theSpoolFile), (theFormat), (thePage))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXSpoolPageProc(userRoutine) NewGXSpoolPageUPP(userRoutine)
#define CallGXSpoolPageProc(userRoutine, theSpoolFile, theFormat, thePage) InvokeGXSpoolPageUPP(theSpoolFile, theFormat, thePage, userRoutine)
#define Send_GXSpoolPage(theSpoolFile, theFormat, thePage) \
MacSendMessage(0x00000010, theSpoolFile, theFormat, thePage)
#define Forward_GXSpoolPage(theSpoolFile, theFormat, thePage) \
ForwardThisMessage((void *) theSpoolFile, (void *) theFormat, (void *) thePage)
typedef CALLBACK_API_C( OSErr , GXSpoolDataProcPtr )(gxSpoolFile theSpoolFile, Ptr data, long *length);
typedef STACK_UPP_TYPE(GXSpoolDataProcPtr) GXSpoolDataUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXSpoolDataUPP)
NewGXSpoolDataUPP (GXSpoolDataProcPtr userRoutine);
EXTERN_API(void)
DisposeGXSpoolDataUPP (GXSpoolDataUPP userUPP);
EXTERN_API(OSErr)
InvokeGXSpoolDataUPP (gxSpoolFile theSpoolFile,
Ptr data,
long * length,
GXSpoolDataUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXSpoolDataProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXSpoolDataUPP(userRoutine) (GXSpoolDataUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXSpoolDataProcInfo, GetCurrentArchitecture())
#define DisposeGXSpoolDataUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXSpoolDataUPP(theSpoolFile, data, length, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXSpoolDataProcInfo, (theSpoolFile), (data), (length))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXSpoolDataProc(userRoutine) NewGXSpoolDataUPP(userRoutine)
#define CallGXSpoolDataProc(userRoutine, theSpoolFile, data, length) InvokeGXSpoolDataUPP(theSpoolFile, data, length, userRoutine)
#define Send_GXSpoolData(theSpoolFile, data, length) \
MacSendMessage(0x00000011, theSpoolFile, data, length)
#define Forward_GXSpoolData(theSpoolFile, data, length) \
ForwardThisMessage((void *) theSpoolFile, (void *) data, (void *) length)
typedef CALLBACK_API_C( OSErr , GXSpoolResourceProcPtr )(gxSpoolFile theSpoolFile, Handle theResource, ResType theType, long id);
typedef STACK_UPP_TYPE(GXSpoolResourceProcPtr) GXSpoolResourceUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXSpoolResourceUPP)
NewGXSpoolResourceUPP (GXSpoolResourceProcPtr userRoutine);
EXTERN_API(void)
DisposeGXSpoolResourceUPP (GXSpoolResourceUPP userUPP);
EXTERN_API(OSErr)
InvokeGXSpoolResourceUPP (gxSpoolFile theSpoolFile,
Handle theResource,
ResType theType,
long id,
GXSpoolResourceUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXSpoolResourceProcInfo = 0x00003FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXSpoolResourceUPP(userRoutine) (GXSpoolResourceUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXSpoolResourceProcInfo, GetCurrentArchitecture())
#define DisposeGXSpoolResourceUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXSpoolResourceUPP(theSpoolFile, theResource, theType, id, userUPP) (OSErr)CALL_FOUR_PARAMETER_UPP((userUPP), uppGXSpoolResourceProcInfo, (theSpoolFile), (theResource), (theType), (id))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXSpoolResourceProc(userRoutine) NewGXSpoolResourceUPP(userRoutine)
#define CallGXSpoolResourceProc(userRoutine, theSpoolFile, theResource, theType, id) InvokeGXSpoolResourceUPP(theSpoolFile, theResource, theType, id, userRoutine)
#define Send_GXSpoolResource(theSpoolFile, theResource, theType, id) \
MacSendMessage(0x00000012, theSpoolFile, theResource, \
theType, id)
#define Forward_GXSpoolResource(theSpoolFile, theResource, theType, id) \
ForwardThisMessage((void *) theSpoolFile, (void *) theResource, \
(void *) theType, (void *) (id))
typedef CALLBACK_API_C( OSErr , GXCompleteSpoolFileProcPtr )(gxSpoolFile theSpoolFile);
typedef STACK_UPP_TYPE(GXCompleteSpoolFileProcPtr) GXCompleteSpoolFileUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXCompleteSpoolFileUPP)
NewGXCompleteSpoolFileUPP (GXCompleteSpoolFileProcPtr userRoutine);
EXTERN_API(void)
DisposeGXCompleteSpoolFileUPP (GXCompleteSpoolFileUPP userUPP);
EXTERN_API(OSErr)
InvokeGXCompleteSpoolFileUPP (gxSpoolFile theSpoolFile,
GXCompleteSpoolFileUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXCompleteSpoolFileProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXCompleteSpoolFileUPP(userRoutine) (GXCompleteSpoolFileUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXCompleteSpoolFileProcInfo, GetCurrentArchitecture())
#define DisposeGXCompleteSpoolFileUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXCompleteSpoolFileUPP(theSpoolFile, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXCompleteSpoolFileProcInfo, (theSpoolFile))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXCompleteSpoolFileProc(userRoutine) NewGXCompleteSpoolFileUPP(userRoutine)
#define CallGXCompleteSpoolFileProc(userRoutine, theSpoolFile) InvokeGXCompleteSpoolFileUPP(theSpoolFile, userRoutine)
#define Send_GXCompleteSpoolFile(theSpoolFile) \
MacSendMessage(0x00000013, theSpoolFile)
#define Forward_GXCompleteSpoolFile(theSpoolFile) \
ForwardThisMessage((void *) (theSpoolFile))
typedef CALLBACK_API_C( OSErr , GXCountPagesProcPtr )(gxSpoolFile theSpoolFile, long *numPages);
typedef STACK_UPP_TYPE(GXCountPagesProcPtr) GXCountPagesUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXCountPagesUPP)
NewGXCountPagesUPP (GXCountPagesProcPtr userRoutine);
EXTERN_API(void)
DisposeGXCountPagesUPP (GXCountPagesUPP userUPP);
EXTERN_API(OSErr)
InvokeGXCountPagesUPP (gxSpoolFile theSpoolFile,
long * numPages,
GXCountPagesUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXCountPagesProcInfo = 0x000003E1 }; /* 2_bytes Func(4_bytes, 4_bytes) */
#define NewGXCountPagesUPP(userRoutine) (GXCountPagesUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXCountPagesProcInfo, GetCurrentArchitecture())
#define DisposeGXCountPagesUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXCountPagesUPP(theSpoolFile, numPages, userUPP) (OSErr)CALL_TWO_PARAMETER_UPP((userUPP), uppGXCountPagesProcInfo, (theSpoolFile), (numPages))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXCountPagesProc(userRoutine) NewGXCountPagesUPP(userRoutine)
#define CallGXCountPagesProc(userRoutine, theSpoolFile, numPages) InvokeGXCountPagesUPP(theSpoolFile, numPages, userRoutine)
#define Send_GXCountPages(theSpoolFile, numPages) \
MacSendMessage(0x00000014, theSpoolFile, numPages)
#define Forward_GXCountPages(theSpoolFile, numPages) \
ForwardThisMessage((void *) (theSpoolFile), (void *) (numPages))
typedef CALLBACK_API_C( OSErr , GXDespoolPageProcPtr )(gxSpoolFile theSpoolFile, long numPages, gxFormat theFormat, gxShape *thePage, Boolean *formatChanged);
typedef STACK_UPP_TYPE(GXDespoolPageProcPtr) GXDespoolPageUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXDespoolPageUPP)
NewGXDespoolPageUPP (GXDespoolPageProcPtr userRoutine);
EXTERN_API(void)
DisposeGXDespoolPageUPP (GXDespoolPageUPP userUPP);
EXTERN_API(OSErr)
InvokeGXDespoolPageUPP (gxSpoolFile theSpoolFile,
long numPages,
gxFormat theFormat,
gxShape * thePage,
Boolean * formatChanged,
GXDespoolPageUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXDespoolPageProcInfo = 0x0000FFE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXDespoolPageUPP(userRoutine) (GXDespoolPageUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXDespoolPageProcInfo, GetCurrentArchitecture())
#define DisposeGXDespoolPageUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXDespoolPageUPP(theSpoolFile, numPages, theFormat, thePage, formatChanged, userUPP) (OSErr)CALL_FIVE_PARAMETER_UPP((userUPP), uppGXDespoolPageProcInfo, (theSpoolFile), (numPages), (theFormat), (thePage), (formatChanged))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXDespoolPageProc(userRoutine) NewGXDespoolPageUPP(userRoutine)
#define CallGXDespoolPageProc(userRoutine, theSpoolFile, numPages, theFormat, thePage, formatChanged) InvokeGXDespoolPageUPP(theSpoolFile, numPages, theFormat, thePage, formatChanged, userRoutine)
#define Send_GXDespoolPage(theSpoolFile, numPages, theFormat, thePage, formatChanged) \
MacSendMessage(0x00000015, theSpoolFile, numPages, \
theFormat, thePage, formatChanged)
#define Forward_GXDespoolPage(theSpoolFile, numPages, theFormat, thePage, formatChanged) \
ForwardThisMessage((void *) (theSpoolFile), (void *) (numPages), (void *) (theFormat), \
(void *) (thePage), (void *) (formatChanged))
typedef CALLBACK_API_C( OSErr , GXDespoolDataProcPtr )(gxSpoolFile theSpoolFile, Ptr data, long *length);
typedef STACK_UPP_TYPE(GXDespoolDataProcPtr) GXDespoolDataUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXDespoolDataUPP)
NewGXDespoolDataUPP (GXDespoolDataProcPtr userRoutine);
EXTERN_API(void)
DisposeGXDespoolDataUPP (GXDespoolDataUPP userUPP);
EXTERN_API(OSErr)
InvokeGXDespoolDataUPP (gxSpoolFile theSpoolFile,
Ptr data,
long * length,
GXDespoolDataUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXDespoolDataProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXDespoolDataUPP(userRoutine) (GXDespoolDataUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXDespoolDataProcInfo, GetCurrentArchitecture())
#define DisposeGXDespoolDataUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXDespoolDataUPP(theSpoolFile, data, length, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXDespoolDataProcInfo, (theSpoolFile), (data), (length))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXDespoolDataProc(userRoutine) NewGXDespoolDataUPP(userRoutine)
#define CallGXDespoolDataProc(userRoutine, theSpoolFile, data, length) InvokeGXDespoolDataUPP(theSpoolFile, data, length, userRoutine)
#define Send_GXDespoolData(theSpoolFile, data, length) \
MacSendMessage(0x00000016, theSpoolFile, data, length)
#define Forward_GXDespoolData(theSpoolFile, data, length) \
ForwardThisMessage((void *) (theSpoolFile), (void *) (data), (void *) (length))
typedef CALLBACK_API_C( OSErr , GXDespoolResourceProcPtr )(gxSpoolFile theSpoolFile, ResType theType, long id, Handle *theResource);
typedef STACK_UPP_TYPE(GXDespoolResourceProcPtr) GXDespoolResourceUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXDespoolResourceUPP)
NewGXDespoolResourceUPP (GXDespoolResourceProcPtr userRoutine);
EXTERN_API(void)
DisposeGXDespoolResourceUPP (GXDespoolResourceUPP userUPP);
EXTERN_API(OSErr)
InvokeGXDespoolResourceUPP (gxSpoolFile theSpoolFile,
ResType theType,
long id,
Handle * theResource,
GXDespoolResourceUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXDespoolResourceProcInfo = 0x00003FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXDespoolResourceUPP(userRoutine) (GXDespoolResourceUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXDespoolResourceProcInfo, GetCurrentArchitecture())
#define DisposeGXDespoolResourceUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXDespoolResourceUPP(theSpoolFile, theType, id, theResource, userUPP) (OSErr)CALL_FOUR_PARAMETER_UPP((userUPP), uppGXDespoolResourceProcInfo, (theSpoolFile), (theType), (id), (theResource))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXDespoolResourceProc(userRoutine) NewGXDespoolResourceUPP(userRoutine)
#define CallGXDespoolResourceProc(userRoutine, theSpoolFile, theType, id, theResource) InvokeGXDespoolResourceUPP(theSpoolFile, theType, id, theResource, userRoutine)
#define Send_GXDespoolResource(theSpoolFile, theType, id, theResource) \
MacSendMessage(0x00000017, theSpoolFile, theType, \
id, theResource)
#define Forward_GXDespoolResource(theSpoolFile, theType, id, theResource) \
ForwardThisMessage((void *) (theSpoolFile), (void *) (theType), (void *) (id), \
(void *) (theResource))
typedef CALLBACK_API_C( OSErr , GXCloseSpoolFileProcPtr )(gxSpoolFile theSpoolFile, long closeOptions);
typedef STACK_UPP_TYPE(GXCloseSpoolFileProcPtr) GXCloseSpoolFileUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXCloseSpoolFileUPP)
NewGXCloseSpoolFileUPP (GXCloseSpoolFileProcPtr userRoutine);
EXTERN_API(void)
DisposeGXCloseSpoolFileUPP (GXCloseSpoolFileUPP userUPP);
EXTERN_API(OSErr)
InvokeGXCloseSpoolFileUPP (gxSpoolFile theSpoolFile,
long closeOptions,
GXCloseSpoolFileUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXCloseSpoolFileProcInfo = 0x000003E1 }; /* 2_bytes Func(4_bytes, 4_bytes) */
#define NewGXCloseSpoolFileUPP(userRoutine) (GXCloseSpoolFileUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXCloseSpoolFileProcInfo, GetCurrentArchitecture())
#define DisposeGXCloseSpoolFileUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXCloseSpoolFileUPP(theSpoolFile, closeOptions, userUPP) (OSErr)CALL_TWO_PARAMETER_UPP((userUPP), uppGXCloseSpoolFileProcInfo, (theSpoolFile), (closeOptions))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXCloseSpoolFileProc(userRoutine) NewGXCloseSpoolFileUPP(userRoutine)
#define CallGXCloseSpoolFileProc(userRoutine, theSpoolFile, closeOptions) InvokeGXCloseSpoolFileUPP(theSpoolFile, closeOptions, userRoutine)
#define Send_GXCloseSpoolFile(theSpoolFile, closeOptions) \
MacSendMessage(0x00000018, theSpoolFile, closeOptions)
#define Forward_GXCloseSpoolFile(theSpoolFile, closeOptions) \
ForwardThisMessage((void *) (theSpoolFile), (void *) (closeOptions))
typedef CALLBACK_API_C( OSErr , GXStartJobProcPtr )(StringPtr docName, long pageCount);
typedef STACK_UPP_TYPE(GXStartJobProcPtr) GXStartJobUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXStartJobUPP)
NewGXStartJobUPP (GXStartJobProcPtr userRoutine);
EXTERN_API(void)
DisposeGXStartJobUPP (GXStartJobUPP userUPP);
EXTERN_API(OSErr)
InvokeGXStartJobUPP (StringPtr docName,
long pageCount,
GXStartJobUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXStartJobProcInfo = 0x000003E1 }; /* 2_bytes Func(4_bytes, 4_bytes) */
#define NewGXStartJobUPP(userRoutine) (GXStartJobUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXStartJobProcInfo, GetCurrentArchitecture())
#define DisposeGXStartJobUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXStartJobUPP(docName, pageCount, userUPP) (OSErr)CALL_TWO_PARAMETER_UPP((userUPP), uppGXStartJobProcInfo, (docName), (pageCount))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXStartJobProc(userRoutine) NewGXStartJobUPP(userRoutine)
#define CallGXStartJobProc(userRoutine, docName, pageCount) InvokeGXStartJobUPP(docName, pageCount, userRoutine)
#define Send_GXStartJob(docName, pageCount) \
MacSendMessage(0x00000019, docName, pageCount)
#define Forward_GXStartJob(docName, pageCount) \
ForwardThisMessage((void *) (docName), (void *) (pageCount))
typedef CALLBACK_API_C( OSErr , GXFinishJobProcPtr )(void );
typedef STACK_UPP_TYPE(GXFinishJobProcPtr) GXFinishJobUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXFinishJobUPP)
NewGXFinishJobUPP (GXFinishJobProcPtr userRoutine);
EXTERN_API(void)
DisposeGXFinishJobUPP (GXFinishJobUPP userUPP);
EXTERN_API(OSErr)
InvokeGXFinishJobUPP (GXFinishJobUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXFinishJobProcInfo = 0x00000021 }; /* 2_bytes Func() */
#define NewGXFinishJobUPP(userRoutine) (GXFinishJobUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXFinishJobProcInfo, GetCurrentArchitecture())
#define DisposeGXFinishJobUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXFinishJobUPP(userUPP) (OSErr)CALL_ZERO_PARAMETER_UPP((userUPP), uppGXFinishJobProcInfo)
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXFinishJobProc(userRoutine) NewGXFinishJobUPP(userRoutine)
#define CallGXFinishJobProc(userRoutine) InvokeGXFinishJobUPP(userRoutine)
#define Send_GXFinishJob() MacSendMessage(0x0000001A)
#define Forward_GXFinishJob() ForwardThisMessage((void *) (0))
typedef CALLBACK_API_C( OSErr , GXStartPageProcPtr )(gxFormat theFormat, long numViewPorts, gxViewPort *viewPortList);
typedef STACK_UPP_TYPE(GXStartPageProcPtr) GXStartPageUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXStartPageUPP)
NewGXStartPageUPP (GXStartPageProcPtr userRoutine);
EXTERN_API(void)
DisposeGXStartPageUPP (GXStartPageUPP userUPP);
EXTERN_API(OSErr)
InvokeGXStartPageUPP (gxFormat theFormat,
long numViewPorts,
gxViewPort * viewPortList,
GXStartPageUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXStartPageProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXStartPageUPP(userRoutine) (GXStartPageUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXStartPageProcInfo, GetCurrentArchitecture())
#define DisposeGXStartPageUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXStartPageUPP(theFormat, numViewPorts, viewPortList, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXStartPageProcInfo, (theFormat), (numViewPorts), (viewPortList))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXStartPageProc(userRoutine) NewGXStartPageUPP(userRoutine)
#define CallGXStartPageProc(userRoutine, theFormat, numViewPorts, viewPortList) InvokeGXStartPageUPP(theFormat, numViewPorts, viewPortList, userRoutine)
#define Send_GXStartPage(theFormat, numViewPorts, viewPortList) \
MacSendMessage(0x0000001B, theFormat, numViewPorts, viewPortList)
#define Forward_GXStartPage(theFormat, numViewPorts, viewPortList) \
ForwardThisMessage((void *) (theFormat), (void *) (numViewPorts), (void *) (viewPortList))
typedef CALLBACK_API_C( OSErr , GXFinishPageProcPtr )(void );
typedef STACK_UPP_TYPE(GXFinishPageProcPtr) GXFinishPageUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXFinishPageUPP)
NewGXFinishPageUPP (GXFinishPageProcPtr userRoutine);
EXTERN_API(void)
DisposeGXFinishPageUPP (GXFinishPageUPP userUPP);
EXTERN_API(OSErr)
InvokeGXFinishPageUPP (GXFinishPageUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXFinishPageProcInfo = 0x00000021 }; /* 2_bytes Func() */
#define NewGXFinishPageUPP(userRoutine) (GXFinishPageUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXFinishPageProcInfo, GetCurrentArchitecture())
#define DisposeGXFinishPageUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXFinishPageUPP(userUPP) (OSErr)CALL_ZERO_PARAMETER_UPP((userUPP), uppGXFinishPageProcInfo)
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXFinishPageProc(userRoutine) NewGXFinishPageUPP(userRoutine)
#define CallGXFinishPageProc(userRoutine) InvokeGXFinishPageUPP(userRoutine)
#define Send_GXFinishPage() MacSendMessage(0x0000001C)
#define Forward_GXFinishPage() ForwardThisMessage((void *) (0))
typedef CALLBACK_API_C( OSErr , GXPrintPageProcPtr )(gxFormat theFormat, gxShape thePage);
typedef STACK_UPP_TYPE(GXPrintPageProcPtr) GXPrintPageUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXPrintPageUPP)
NewGXPrintPageUPP (GXPrintPageProcPtr userRoutine);
EXTERN_API(void)
DisposeGXPrintPageUPP (GXPrintPageUPP userUPP);
EXTERN_API(OSErr)
InvokeGXPrintPageUPP (gxFormat theFormat,
gxShape thePage,
GXPrintPageUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXPrintPageProcInfo = 0x000003E1 }; /* 2_bytes Func(4_bytes, 4_bytes) */
#define NewGXPrintPageUPP(userRoutine) (GXPrintPageUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXPrintPageProcInfo, GetCurrentArchitecture())
#define DisposeGXPrintPageUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXPrintPageUPP(theFormat, thePage, userUPP) (OSErr)CALL_TWO_PARAMETER_UPP((userUPP), uppGXPrintPageProcInfo, (theFormat), (thePage))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXPrintPageProc(userRoutine) NewGXPrintPageUPP(userRoutine)
#define CallGXPrintPageProc(userRoutine, theFormat, thePage) InvokeGXPrintPageUPP(theFormat, thePage, userRoutine)
#define Send_GXPrintPage(theFormat, thePage) \
MacSendMessage(0x0000001D, theFormat, thePage)
#define Forward_GXPrintPage(theFormat, thePage) \
ForwardThisMessage((void *) (theFormat), (void *) (thePage))
typedef CALLBACK_API_C( OSErr , GXSetupImageDataProcPtr )(void *imageData);
typedef STACK_UPP_TYPE(GXSetupImageDataProcPtr) GXSetupImageDataUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXSetupImageDataUPP)
NewGXSetupImageDataUPP (GXSetupImageDataProcPtr userRoutine);
EXTERN_API(void)
DisposeGXSetupImageDataUPP (GXSetupImageDataUPP userUPP);
EXTERN_API(OSErr)
InvokeGXSetupImageDataUPP (void * imageData,
GXSetupImageDataUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXSetupImageDataProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXSetupImageDataUPP(userRoutine) (GXSetupImageDataUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXSetupImageDataProcInfo, GetCurrentArchitecture())
#define DisposeGXSetupImageDataUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXSetupImageDataUPP(imageData, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXSetupImageDataProcInfo, (imageData))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXSetupImageDataProc(userRoutine) NewGXSetupImageDataUPP(userRoutine)
#define CallGXSetupImageDataProc(userRoutine, imageData) InvokeGXSetupImageDataUPP(imageData, userRoutine)
#define Send_GXSetupImageData(imageData) \
MacSendMessage(0x0000001E, imageData)
#define Forward_GXSetupImageData(imageData) \
ForwardThisMessage((void *) (imageData))
typedef CALLBACK_API_C( OSErr , GXImageJobProcPtr )(gxSpoolFile theSpoolFile, long *closeOptions);
typedef STACK_UPP_TYPE(GXImageJobProcPtr) GXImageJobUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXImageJobUPP)
NewGXImageJobUPP (GXImageJobProcPtr userRoutine);
EXTERN_API(void)
DisposeGXImageJobUPP (GXImageJobUPP userUPP);
EXTERN_API(OSErr)
InvokeGXImageJobUPP (gxSpoolFile theSpoolFile,
long * closeOptions,
GXImageJobUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXImageJobProcInfo = 0x000003E1 }; /* 2_bytes Func(4_bytes, 4_bytes) */
#define NewGXImageJobUPP(userRoutine) (GXImageJobUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXImageJobProcInfo, GetCurrentArchitecture())
#define DisposeGXImageJobUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXImageJobUPP(theSpoolFile, closeOptions, userUPP) (OSErr)CALL_TWO_PARAMETER_UPP((userUPP), uppGXImageJobProcInfo, (theSpoolFile), (closeOptions))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXImageJobProc(userRoutine) NewGXImageJobUPP(userRoutine)
#define CallGXImageJobProc(userRoutine, theSpoolFile, closeOptions) InvokeGXImageJobUPP(theSpoolFile, closeOptions, userRoutine)
#define Send_GXImageJob(theSpoolFile, closeOptions) \
MacSendMessage(0x0000001F, theSpoolFile, closeOptions)
#define Forward_GXImageJob(theSpoolFile, closeOptions) \
ForwardThisMessage((void *) (theSpoolFile), (void *) (closeOptions))
typedef CALLBACK_API_C( OSErr , GXImageDocumentProcPtr )(gxSpoolFile theSpoolFile, void *imageData);
typedef STACK_UPP_TYPE(GXImageDocumentProcPtr) GXImageDocumentUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXImageDocumentUPP)
NewGXImageDocumentUPP (GXImageDocumentProcPtr userRoutine);
EXTERN_API(void)
DisposeGXImageDocumentUPP (GXImageDocumentUPP userUPP);
EXTERN_API(OSErr)
InvokeGXImageDocumentUPP (gxSpoolFile theSpoolFile,
void * imageData,
GXImageDocumentUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXImageDocumentProcInfo = 0x000003E1 }; /* 2_bytes Func(4_bytes, 4_bytes) */
#define NewGXImageDocumentUPP(userRoutine) (GXImageDocumentUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXImageDocumentProcInfo, GetCurrentArchitecture())
#define DisposeGXImageDocumentUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXImageDocumentUPP(theSpoolFile, imageData, userUPP) (OSErr)CALL_TWO_PARAMETER_UPP((userUPP), uppGXImageDocumentProcInfo, (theSpoolFile), (imageData))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXImageDocumentProc(userRoutine) NewGXImageDocumentUPP(userRoutine)
#define CallGXImageDocumentProc(userRoutine, theSpoolFile, imageData) InvokeGXImageDocumentUPP(theSpoolFile, imageData, userRoutine)
#define Send_GXImageDocument(theSpoolFile, imageData) \
MacSendMessage(0x00000020, theSpoolFile, imageData)
#define Forward_GXImageDocument(theSpoolFile, imageData) \
ForwardThisMessage((void *) (theSpoolFile), (void *) (imageData))
typedef CALLBACK_API_C( OSErr , GXImagePageProcPtr )(gxSpoolFile theSpoolFile, long pageNumber, gxFormat theFormat, void *imageData);
typedef STACK_UPP_TYPE(GXImagePageProcPtr) GXImagePageUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXImagePageUPP)
NewGXImagePageUPP (GXImagePageProcPtr userRoutine);
EXTERN_API(void)
DisposeGXImagePageUPP (GXImagePageUPP userUPP);
EXTERN_API(OSErr)
InvokeGXImagePageUPP (gxSpoolFile theSpoolFile,
long pageNumber,
gxFormat theFormat,
void * imageData,
GXImagePageUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXImagePageProcInfo = 0x00003FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXImagePageUPP(userRoutine) (GXImagePageUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXImagePageProcInfo, GetCurrentArchitecture())
#define DisposeGXImagePageUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXImagePageUPP(theSpoolFile, pageNumber, theFormat, imageData, userUPP) (OSErr)CALL_FOUR_PARAMETER_UPP((userUPP), uppGXImagePageProcInfo, (theSpoolFile), (pageNumber), (theFormat), (imageData))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXImagePageProc(userRoutine) NewGXImagePageUPP(userRoutine)
#define CallGXImagePageProc(userRoutine, theSpoolFile, pageNumber, theFormat, imageData) InvokeGXImagePageUPP(theSpoolFile, pageNumber, theFormat, imageData, userRoutine)
#define Send_GXImagePage(theSpoolFile, pageNumber, theFormat, imageData) \
MacSendMessage(0x00000021, theSpoolFile, pageNumber, theFormat, imageData)
#define Forward_GXImagePage(theSpoolFile, pageNumber, theFormat, imageData) \
ForwardThisMessage((void *) (theSpoolFile), (void *) (pageNumber), (void *) (theFormat), \
(void *) (imageData))
typedef CALLBACK_API_C( OSErr , GXRenderPageProcPtr )(gxFormat theFormat, gxShape thePage, gxPageInfoRecord *pageInfo, void *imageData);
typedef STACK_UPP_TYPE(GXRenderPageProcPtr) GXRenderPageUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXRenderPageUPP)
NewGXRenderPageUPP (GXRenderPageProcPtr userRoutine);
EXTERN_API(void)
DisposeGXRenderPageUPP (GXRenderPageUPP userUPP);
EXTERN_API(OSErr)
InvokeGXRenderPageUPP (gxFormat theFormat,
gxShape thePage,
gxPageInfoRecord * pageInfo,
void * imageData,
GXRenderPageUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXRenderPageProcInfo = 0x00003FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXRenderPageUPP(userRoutine) (GXRenderPageUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXRenderPageProcInfo, GetCurrentArchitecture())
#define DisposeGXRenderPageUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXRenderPageUPP(theFormat, thePage, pageInfo, imageData, userUPP) (OSErr)CALL_FOUR_PARAMETER_UPP((userUPP), uppGXRenderPageProcInfo, (theFormat), (thePage), (pageInfo), (imageData))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXRenderPageProc(userRoutine) NewGXRenderPageUPP(userRoutine)
#define CallGXRenderPageProc(userRoutine, theFormat, thePage, pageInfo, imageData) InvokeGXRenderPageUPP(theFormat, thePage, pageInfo, imageData, userRoutine)
#define Send_GXRenderPage(theFormat, thePage, pageInfo, imageData) \
MacSendMessage(0x00000022, theFormat, thePage, pageInfo, imageData)
#define Forward_GXRenderPage(theFormat, thePage, pageInfo, imageData) \
ForwardThisMessage((void *) (theFormat), (void *) (thePage), (void *) (pageInfo), (void *) (imageData))
typedef CALLBACK_API_C( OSErr , GXCreateImageFileProcPtr )(FSSpecPtr pFileSpec, long imageFileOptions, long *theImageFile);
typedef STACK_UPP_TYPE(GXCreateImageFileProcPtr) GXCreateImageFileUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXCreateImageFileUPP)
NewGXCreateImageFileUPP (GXCreateImageFileProcPtr userRoutine);
EXTERN_API(void)
DisposeGXCreateImageFileUPP (GXCreateImageFileUPP userUPP);
EXTERN_API(OSErr)
InvokeGXCreateImageFileUPP (FSSpecPtr pFileSpec,
long imageFileOptions,
long * theImageFile,
GXCreateImageFileUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXCreateImageFileProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXCreateImageFileUPP(userRoutine) (GXCreateImageFileUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXCreateImageFileProcInfo, GetCurrentArchitecture())
#define DisposeGXCreateImageFileUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXCreateImageFileUPP(pFileSpec, imageFileOptions, theImageFile, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXCreateImageFileProcInfo, (pFileSpec), (imageFileOptions), (theImageFile))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXCreateImageFileProc(userRoutine) NewGXCreateImageFileUPP(userRoutine)
#define CallGXCreateImageFileProc(userRoutine, pFileSpec, imageFileOptions, theImageFile) InvokeGXCreateImageFileUPP(pFileSpec, imageFileOptions, theImageFile, userRoutine)
#define Send_GXCreateImageFile(pFileSpec, imageFileOptions, theImageFile) \
MacSendMessage(0x00000023, pFileSpec, imageFileOptions, theImageFile)
#define Forward_GXCreateImageFile(pFileSpec, imageFileOptions, theImageFile) \
ForwardThisMessage((void *) (pFileSpec), (void *) (imageFileOptions), (void *) (theImageFile))
typedef CALLBACK_API_C( OSErr , GXOpenConnectionProcPtr )(void );
typedef STACK_UPP_TYPE(GXOpenConnectionProcPtr) GXOpenConnectionUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXOpenConnectionUPP)
NewGXOpenConnectionUPP (GXOpenConnectionProcPtr userRoutine);
EXTERN_API(void)
DisposeGXOpenConnectionUPP (GXOpenConnectionUPP userUPP);
EXTERN_API(OSErr)
InvokeGXOpenConnectionUPP (GXOpenConnectionUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXOpenConnectionProcInfo = 0x00000021 }; /* 2_bytes Func() */
#define NewGXOpenConnectionUPP(userRoutine) (GXOpenConnectionUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXOpenConnectionProcInfo, GetCurrentArchitecture())
#define DisposeGXOpenConnectionUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXOpenConnectionUPP(userUPP) (OSErr)CALL_ZERO_PARAMETER_UPP((userUPP), uppGXOpenConnectionProcInfo)
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXOpenConnectionProc(userRoutine) NewGXOpenConnectionUPP(userRoutine)
#define CallGXOpenConnectionProc(userRoutine) InvokeGXOpenConnectionUPP(userRoutine)
#define Send_GXOpenConnection() MacSendMessage(0x00000024)
#define Forward_GXOpenConnection() ForwardThisMessage((void *) (0))
typedef CALLBACK_API_C( OSErr , GXCloseConnectionProcPtr )(void );
typedef STACK_UPP_TYPE(GXCloseConnectionProcPtr) GXCloseConnectionUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXCloseConnectionUPP)
NewGXCloseConnectionUPP (GXCloseConnectionProcPtr userRoutine);
EXTERN_API(void)
DisposeGXCloseConnectionUPP (GXCloseConnectionUPP userUPP);
EXTERN_API(OSErr)
InvokeGXCloseConnectionUPP (GXCloseConnectionUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXCloseConnectionProcInfo = 0x00000021 }; /* 2_bytes Func() */
#define NewGXCloseConnectionUPP(userRoutine) (GXCloseConnectionUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXCloseConnectionProcInfo, GetCurrentArchitecture())
#define DisposeGXCloseConnectionUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXCloseConnectionUPP(userUPP) (OSErr)CALL_ZERO_PARAMETER_UPP((userUPP), uppGXCloseConnectionProcInfo)
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXCloseConnectionProc(userRoutine) NewGXCloseConnectionUPP(userRoutine)
#define CallGXCloseConnectionProc(userRoutine) InvokeGXCloseConnectionUPP(userRoutine)
#define Send_GXCloseConnection() MacSendMessage(0x00000025)
#define Forward_GXCloseConnection() ForwardThisMessage((void *) (0))
typedef CALLBACK_API_C( OSErr , GXStartSendPageProcPtr )(gxFormat theFormat);
typedef STACK_UPP_TYPE(GXStartSendPageProcPtr) GXStartSendPageUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXStartSendPageUPP)
NewGXStartSendPageUPP (GXStartSendPageProcPtr userRoutine);
EXTERN_API(void)
DisposeGXStartSendPageUPP (GXStartSendPageUPP userUPP);
EXTERN_API(OSErr)
InvokeGXStartSendPageUPP (gxFormat theFormat,
GXStartSendPageUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXStartSendPageProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXStartSendPageUPP(userRoutine) (GXStartSendPageUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXStartSendPageProcInfo, GetCurrentArchitecture())
#define DisposeGXStartSendPageUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXStartSendPageUPP(theFormat, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXStartSendPageProcInfo, (theFormat))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXStartSendPageProc(userRoutine) NewGXStartSendPageUPP(userRoutine)
#define CallGXStartSendPageProc(userRoutine, theFormat) InvokeGXStartSendPageUPP(theFormat, userRoutine)
#define Send_GXStartSendPage(theFormat) MacSendMessage(0x00000026, theFormat)
#define Forward_GXStartSendPage(theFormat) ForwardThisMessage((void *) (theFormat))
typedef CALLBACK_API_C( OSErr , GXFinishSendPageProcPtr )(void );
typedef STACK_UPP_TYPE(GXFinishSendPageProcPtr) GXFinishSendPageUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXFinishSendPageUPP)
NewGXFinishSendPageUPP (GXFinishSendPageProcPtr userRoutine);
EXTERN_API(void)
DisposeGXFinishSendPageUPP (GXFinishSendPageUPP userUPP);
EXTERN_API(OSErr)
InvokeGXFinishSendPageUPP (GXFinishSendPageUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXFinishSendPageProcInfo = 0x00000021 }; /* 2_bytes Func() */
#define NewGXFinishSendPageUPP(userRoutine) (GXFinishSendPageUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXFinishSendPageProcInfo, GetCurrentArchitecture())
#define DisposeGXFinishSendPageUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXFinishSendPageUPP(userUPP) (OSErr)CALL_ZERO_PARAMETER_UPP((userUPP), uppGXFinishSendPageProcInfo)
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXFinishSendPageProc(userRoutine) NewGXFinishSendPageUPP(userRoutine)
#define CallGXFinishSendPageProc(userRoutine) InvokeGXFinishSendPageUPP(userRoutine)
#define Send_GXFinishSendPage() MacSendMessage(0x00000027)
#define Forward_GXFinishSendPage() ForwardThisMessage((void *) (0))
typedef CALLBACK_API_C( OSErr , GXWriteDataProcPtr )(Ptr data, long length);
typedef STACK_UPP_TYPE(GXWriteDataProcPtr) GXWriteDataUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXWriteDataUPP)
NewGXWriteDataUPP (GXWriteDataProcPtr userRoutine);
EXTERN_API(void)
DisposeGXWriteDataUPP (GXWriteDataUPP userUPP);
EXTERN_API(OSErr)
InvokeGXWriteDataUPP (Ptr data,
long length,
GXWriteDataUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXWriteDataProcInfo = 0x000003E1 }; /* 2_bytes Func(4_bytes, 4_bytes) */
#define NewGXWriteDataUPP(userRoutine) (GXWriteDataUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXWriteDataProcInfo, GetCurrentArchitecture())
#define DisposeGXWriteDataUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXWriteDataUPP(data, length, userUPP) (OSErr)CALL_TWO_PARAMETER_UPP((userUPP), uppGXWriteDataProcInfo, (data), (length))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXWriteDataProc(userRoutine) NewGXWriteDataUPP(userRoutine)
#define CallGXWriteDataProc(userRoutine, data, length) InvokeGXWriteDataUPP(data, length, userRoutine)
#define Send_GXWriteData(data, length) MacSendMessage(0x00000028, data, length)
#define Forward_GXWriteData(data, length) ForwardThisMessage((void *) (data), (void *) (length))
typedef CALLBACK_API_C( OSErr , GXBufferDataProcPtr )(Ptr data, long length, long bufferOptions);
typedef STACK_UPP_TYPE(GXBufferDataProcPtr) GXBufferDataUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXBufferDataUPP)
NewGXBufferDataUPP (GXBufferDataProcPtr userRoutine);
EXTERN_API(void)
DisposeGXBufferDataUPP (GXBufferDataUPP userUPP);
EXTERN_API(OSErr)
InvokeGXBufferDataUPP (Ptr data,
long length,
long bufferOptions,
GXBufferDataUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXBufferDataProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXBufferDataUPP(userRoutine) (GXBufferDataUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXBufferDataProcInfo, GetCurrentArchitecture())
#define DisposeGXBufferDataUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXBufferDataUPP(data, length, bufferOptions, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXBufferDataProcInfo, (data), (length), (bufferOptions))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXBufferDataProc(userRoutine) NewGXBufferDataUPP(userRoutine)
#define CallGXBufferDataProc(userRoutine, data, length, bufferOptions) InvokeGXBufferDataUPP(data, length, bufferOptions, userRoutine)
#define Send_GXBufferData(data, length, bufferOptions) \
MacSendMessage(0x00000029, data, length, bufferOptions)
#define Forward_GXBufferData(data, length, bufferOptions) \
ForwardThisMessage((void *) (data), (void *) (length), (void *) (bufferOptions))
typedef CALLBACK_API_C( OSErr , GXDumpBufferProcPtr )(gxPrintingBuffer *theBuffer);
typedef STACK_UPP_TYPE(GXDumpBufferProcPtr) GXDumpBufferUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXDumpBufferUPP)
NewGXDumpBufferUPP (GXDumpBufferProcPtr userRoutine);
EXTERN_API(void)
DisposeGXDumpBufferUPP (GXDumpBufferUPP userUPP);
EXTERN_API(OSErr)
InvokeGXDumpBufferUPP (gxPrintingBuffer * theBuffer,
GXDumpBufferUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXDumpBufferProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXDumpBufferUPP(userRoutine) (GXDumpBufferUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXDumpBufferProcInfo, GetCurrentArchitecture())
#define DisposeGXDumpBufferUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXDumpBufferUPP(theBuffer, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXDumpBufferProcInfo, (theBuffer))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXDumpBufferProc(userRoutine) NewGXDumpBufferUPP(userRoutine)
#define CallGXDumpBufferProc(userRoutine, theBuffer) InvokeGXDumpBufferUPP(theBuffer, userRoutine)
#define Send_GXDumpBuffer(theBuffer) MacSendMessage(0x0000002A, theBuffer)
#define Forward_GXDumpBuffer(theBuffer) ForwardThisMessage((void *) (theBuffer))
typedef CALLBACK_API_C( OSErr , GXFreeBufferProcPtr )(gxPrintingBuffer *theBuffer);
typedef STACK_UPP_TYPE(GXFreeBufferProcPtr) GXFreeBufferUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXFreeBufferUPP)
NewGXFreeBufferUPP (GXFreeBufferProcPtr userRoutine);
EXTERN_API(void)
DisposeGXFreeBufferUPP (GXFreeBufferUPP userUPP);
EXTERN_API(OSErr)
InvokeGXFreeBufferUPP (gxPrintingBuffer * theBuffer,
GXFreeBufferUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXFreeBufferProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXFreeBufferUPP(userRoutine) (GXFreeBufferUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXFreeBufferProcInfo, GetCurrentArchitecture())
#define DisposeGXFreeBufferUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXFreeBufferUPP(theBuffer, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXFreeBufferProcInfo, (theBuffer))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXFreeBufferProc(userRoutine) NewGXFreeBufferUPP(userRoutine)
#define CallGXFreeBufferProc(userRoutine, theBuffer) InvokeGXFreeBufferUPP(theBuffer, userRoutine)
#define Send_GXFreeBuffer(theBuffer) MacSendMessage(0x0000002B, theBuffer)
#define Forward_GXFreeBuffer(theBuffer) ForwardThisMessage((void *) (theBuffer))
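/* Illustrative sketch (not part of the original interface): a printing-driver
   override for the gxDumpBuffer message would typically perform its own
   handling and then pass the message down the override chain with the
   Forward_GXDumpBuffer macro defined above, along the lines of

       static OSErr MyDumpBuffer(gxPrintingBuffer *theBuffer)
       {
           OSErr err = noErr;
           ...driver-specific handling of theBuffer goes here...
           if (err == noErr)
               err = Forward_GXDumpBuffer(theBuffer);
           return err;
       }

   Under OPAQUE_UPP_TYPES the routine is wrapped as a GXDumpBufferUPP via
   NewGXDumpBufferUPP(MyDumpBuffer) and released with DisposeGXDumpBufferUPP;
   the same New/Dispose/Invoke pattern applies to every message in this file. */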
typedef CALLBACK_API_C( OSErr , GXCheckStatusProcPtr )(Ptr data, long length, long statusType, gxOwnerSignature owner);
typedef STACK_UPP_TYPE(GXCheckStatusProcPtr) GXCheckStatusUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXCheckStatusUPP)
NewGXCheckStatusUPP (GXCheckStatusProcPtr userRoutine);
EXTERN_API(void)
DisposeGXCheckStatusUPP (GXCheckStatusUPP userUPP);
EXTERN_API(OSErr)
InvokeGXCheckStatusUPP (Ptr data,
long length,
long statusType,
gxOwnerSignature owner,
GXCheckStatusUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXCheckStatusProcInfo = 0x00003FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXCheckStatusUPP(userRoutine) (GXCheckStatusUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXCheckStatusProcInfo, GetCurrentArchitecture())
#define DisposeGXCheckStatusUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXCheckStatusUPP(data, length, statusType, owner, userUPP) (OSErr)CALL_FOUR_PARAMETER_UPP((userUPP), uppGXCheckStatusProcInfo, (data), (length), (statusType), (owner))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXCheckStatusProc(userRoutine) NewGXCheckStatusUPP(userRoutine)
#define CallGXCheckStatusProc(userRoutine, data, length, statusType, owner) InvokeGXCheckStatusUPP(data, length, statusType, owner, userRoutine)
#define Send_GXCheckStatus(data, length, statusType, owner) \
MacSendMessage(0x0000002C, data, length, statusType, owner)
#define Forward_GXCheckStatus(data, length, statusType, owner) \
ForwardThisMessage((void *) (data), (void *) (length), (void *) (statusType), (void *) (owner))
typedef CALLBACK_API_C( OSErr , GXGetDeviceStatusProcPtr )(Ptr cmdData, long cmdSize, Ptr responseData, long *responseSize, Str255 termination);
typedef STACK_UPP_TYPE(GXGetDeviceStatusProcPtr) GXGetDeviceStatusUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXGetDeviceStatusUPP)
NewGXGetDeviceStatusUPP (GXGetDeviceStatusProcPtr userRoutine);
EXTERN_API(void)
DisposeGXGetDeviceStatusUPP (GXGetDeviceStatusUPP userUPP);
EXTERN_API(OSErr)
InvokeGXGetDeviceStatusUPP (Ptr cmdData,
long cmdSize,
Ptr responseData,
long * responseSize,
Str255 termination,
GXGetDeviceStatusUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXGetDeviceStatusProcInfo = 0x0000FFE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXGetDeviceStatusUPP(userRoutine) (GXGetDeviceStatusUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXGetDeviceStatusProcInfo, GetCurrentArchitecture())
#define DisposeGXGetDeviceStatusUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXGetDeviceStatusUPP(cmdData, cmdSize, responseData, responseSize, termination, userUPP) (OSErr)CALL_FIVE_PARAMETER_UPP((userUPP), uppGXGetDeviceStatusProcInfo, (cmdData), (cmdSize), (responseData), (responseSize), (termination))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXGetDeviceStatusProc(userRoutine) NewGXGetDeviceStatusUPP(userRoutine)
#define CallGXGetDeviceStatusProc(userRoutine, cmdData, cmdSize, responseData, responseSize, termination) InvokeGXGetDeviceStatusUPP(cmdData, cmdSize, responseData, responseSize, termination, userRoutine)
#define Send_GXGetDeviceStatus(cmdData, cmdSize, responseData, responseSize, termination) \
MacSendMessage(0x0000002D, cmdData, cmdSize, responseData, responseSize, termination)
#define Forward_GXGetDeviceStatus(cmdData, cmdSize, responseData, responseSize, termination) \
ForwardThisMessage((void *) (cmdData), (void *) (cmdSize), (void *) (responseData), \
(void *) (responseSize), (void *) (termination))
typedef CALLBACK_API_C( OSErr , GXFetchTaggedDataProcPtr )(ResType theType, long id, Handle *dataHdl, gxOwnerSignature owner);
typedef STACK_UPP_TYPE(GXFetchTaggedDataProcPtr) GXFetchTaggedDataUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXFetchTaggedDataUPP)
NewGXFetchTaggedDataUPP (GXFetchTaggedDataProcPtr userRoutine);
EXTERN_API(void)
DisposeGXFetchTaggedDataUPP (GXFetchTaggedDataUPP userUPP);
EXTERN_API(OSErr)
InvokeGXFetchTaggedDataUPP (ResType theType,
long id,
Handle * dataHdl,
gxOwnerSignature owner,
GXFetchTaggedDataUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXFetchTaggedDataProcInfo = 0x00003FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXFetchTaggedDataUPP(userRoutine) (GXFetchTaggedDataUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXFetchTaggedDataProcInfo, GetCurrentArchitecture())
#define DisposeGXFetchTaggedDataUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXFetchTaggedDataUPP(theType, id, dataHdl, owner, userUPP) (OSErr)CALL_FOUR_PARAMETER_UPP((userUPP), uppGXFetchTaggedDataProcInfo, (theType), (id), (dataHdl), (owner))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXFetchTaggedDataProc(userRoutine) NewGXFetchTaggedDataUPP(userRoutine)
#define CallGXFetchTaggedDataProc(userRoutine, theType, id, dataHdl, owner) InvokeGXFetchTaggedDataUPP(theType, id, dataHdl, owner, userRoutine)
#define Send_GXFetchTaggedDriverData(tag, id, pHandle) Send_GXFetchTaggedData(tag, id, pHandle, 'drvr')
#define Forward_GXFetchTaggedDriverData(tag, id, pHandle) Forward_GXFetchTaggedData(tag, id, pHandle, 'drvr')
#define Send_GXFetchTaggedData(theType, id, dataHdl, owner) \
MacSendMessage(0x0000002E, theType, id, dataHdl, owner)
#define Forward_GXFetchTaggedData(theType, id, dataHdl, owner) \
ForwardThisMessage((void *) (theType), (void *) (id), (void *) (dataHdl), (void *) (owner))
typedef CALLBACK_API_C( OSErr , GXGetDTPMenuListProcPtr )(MenuHandle menuHdl);
typedef STACK_UPP_TYPE(GXGetDTPMenuListProcPtr) GXGetDTPMenuListUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXGetDTPMenuListUPP)
NewGXGetDTPMenuListUPP (GXGetDTPMenuListProcPtr userRoutine);
EXTERN_API(void)
DisposeGXGetDTPMenuListUPP (GXGetDTPMenuListUPP userUPP);
EXTERN_API(OSErr)
InvokeGXGetDTPMenuListUPP (MenuHandle menuHdl,
GXGetDTPMenuListUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXGetDTPMenuListProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXGetDTPMenuListUPP(userRoutine) (GXGetDTPMenuListUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXGetDTPMenuListProcInfo, GetCurrentArchitecture())
#define DisposeGXGetDTPMenuListUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXGetDTPMenuListUPP(menuHdl, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXGetDTPMenuListProcInfo, (menuHdl))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXGetDTPMenuListProc(userRoutine) NewGXGetDTPMenuListUPP(userRoutine)
#define CallGXGetDTPMenuListProc(userRoutine, menuHdl) InvokeGXGetDTPMenuListUPP(menuHdl, userRoutine)
#define Send_GXGetDTPMenuList(menuHdl) \
MacSendMessage(0x0000002F, menuHdl)
#define Forward_GXGetDTPMenuList(menuHdl) \
ForwardThisMessage((void *) (menuHdl))
typedef CALLBACK_API_C( OSErr , GXDTPMenuSelectProcPtr )(long id);
typedef STACK_UPP_TYPE(GXDTPMenuSelectProcPtr) GXDTPMenuSelectUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXDTPMenuSelectUPP)
NewGXDTPMenuSelectUPP (GXDTPMenuSelectProcPtr userRoutine);
EXTERN_API(void)
DisposeGXDTPMenuSelectUPP (GXDTPMenuSelectUPP userUPP);
EXTERN_API(OSErr)
InvokeGXDTPMenuSelectUPP (long id,
GXDTPMenuSelectUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXDTPMenuSelectProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXDTPMenuSelectUPP(userRoutine) (GXDTPMenuSelectUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXDTPMenuSelectProcInfo, GetCurrentArchitecture())
#define DisposeGXDTPMenuSelectUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXDTPMenuSelectUPP(id, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXDTPMenuSelectProcInfo, (id))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXDTPMenuSelectProc(userRoutine) NewGXDTPMenuSelectUPP(userRoutine)
#define CallGXDTPMenuSelectProc(userRoutine, id) InvokeGXDTPMenuSelectUPP(id, userRoutine)
#define Send_GXDTPMenuSelect(id) \
MacSendMessage(0x00000030, id)
#define Forward_GXDTPMenuSelect(id) \
ForwardThisMessage((void *) (id))
typedef CALLBACK_API_C( OSErr , GXHandleAlertFilterProcPtr )(gxJob theJob, gxStatusRecord *pStatusRec, DialogPtr pDialog, EventRecord *theEvent, short *itemHit, Boolean *returnImmed);
typedef STACK_UPP_TYPE(GXHandleAlertFilterProcPtr) GXHandleAlertFilterUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXHandleAlertFilterUPP)
NewGXHandleAlertFilterUPP (GXHandleAlertFilterProcPtr userRoutine);
EXTERN_API(void)
DisposeGXHandleAlertFilterUPP (GXHandleAlertFilterUPP userUPP);
EXTERN_API(OSErr)
InvokeGXHandleAlertFilterUPP (gxJob theJob,
gxStatusRecord * pStatusRec,
DialogPtr pDialog,
EventRecord * theEvent,
short * itemHit,
Boolean * returnImmed,
GXHandleAlertFilterUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXHandleAlertFilterProcInfo = 0x0003FFE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXHandleAlertFilterUPP(userRoutine) (GXHandleAlertFilterUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXHandleAlertFilterProcInfo, GetCurrentArchitecture())
#define DisposeGXHandleAlertFilterUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXHandleAlertFilterUPP(theJob, pStatusRec, pDialog, theEvent, itemHit, returnImmed, userUPP) (OSErr)CALL_SIX_PARAMETER_UPP((userUPP), uppGXHandleAlertFilterProcInfo, (theJob), (pStatusRec), (pDialog), (theEvent), (itemHit), (returnImmed))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXHandleAlertFilterProc(userRoutine) NewGXHandleAlertFilterUPP(userRoutine)
#define CallGXHandleAlertFilterProc(userRoutine, theJob, pStatusRec, pDialog, theEvent, itemHit, returnImmed) InvokeGXHandleAlertFilterUPP(theJob, pStatusRec, pDialog, theEvent, itemHit, returnImmed, userRoutine)
#define Send_GXHandleAlertFilter(theJob, pStatusRec, pDialog, theEvent, itemHit, returnImmed) \
MacSendMessage(0x00000031, theJob, pStatusRec, pDialog, theEvent, itemHit, returnImmed)
#define Forward_GXHandleAlertFilter(theJob, pStatusRec, pDialog, theEvent, itemHit, returnImmed) \
ForwardThisMessage((void *) (theJob), (void *) (pStatusRec), (void *) (pDialog), (void *) (theEvent), \
(void *) (itemHit), (void *) (returnImmed))
typedef CALLBACK_API_C( OSErr , GXJobFormatModeQueryProcPtr )(gxQueryType theQuery, void *srcData, void *dstData);
typedef STACK_UPP_TYPE(GXJobFormatModeQueryProcPtr) GXJobFormatModeQueryUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXJobFormatModeQueryUPP)
NewGXJobFormatModeQueryUPP (GXJobFormatModeQueryProcPtr userRoutine);
EXTERN_API(void)
DisposeGXJobFormatModeQueryUPP (GXJobFormatModeQueryUPP userUPP);
EXTERN_API(OSErr)
InvokeGXJobFormatModeQueryUPP (gxQueryType theQuery,
void * srcData,
void * dstData,
GXJobFormatModeQueryUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXJobFormatModeQueryProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXJobFormatModeQueryUPP(userRoutine) (GXJobFormatModeQueryUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXJobFormatModeQueryProcInfo, GetCurrentArchitecture())
#define DisposeGXJobFormatModeQueryUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXJobFormatModeQueryUPP(theQuery, srcData, dstData, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXJobFormatModeQueryProcInfo, (theQuery), (srcData), (dstData))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXJobFormatModeQueryProc(userRoutine) NewGXJobFormatModeQueryUPP(userRoutine)
#define CallGXJobFormatModeQueryProc(userRoutine, theQuery, srcData, dstData) InvokeGXJobFormatModeQueryUPP(theQuery, srcData, dstData, userRoutine)
#define Send_GXJobFormatModeQuery(theQuery, srcData, dstData) \
MacSendMessage(0x00000032, theQuery, srcData, dstData)
#define Forward_GXJobFormatModeQuery(theQuery, srcData, dstData) \
ForwardThisMessage((void *) (theQuery), (void *) (srcData), (void *) (dstData))
typedef CALLBACK_API_C( OSErr , GXWriteStatusToDTPWindowProcPtr )(gxStatusRecord *pStatusRec, gxDisplayRecord *pDisplay);
typedef STACK_UPP_TYPE(GXWriteStatusToDTPWindowProcPtr) GXWriteStatusToDTPWindowUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXWriteStatusToDTPWindowUPP)
NewGXWriteStatusToDTPWindowUPP (GXWriteStatusToDTPWindowProcPtr userRoutine);
EXTERN_API(void)
DisposeGXWriteStatusToDTPWindowUPP (GXWriteStatusToDTPWindowUPP userUPP);
EXTERN_API(OSErr)
InvokeGXWriteStatusToDTPWindowUPP (gxStatusRecord * pStatusRec,
gxDisplayRecord * pDisplay,
GXWriteStatusToDTPWindowUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXWriteStatusToDTPWindowProcInfo = 0x000003E1 }; /* 2_bytes Func(4_bytes, 4_bytes) */
#define NewGXWriteStatusToDTPWindowUPP(userRoutine) (GXWriteStatusToDTPWindowUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXWriteStatusToDTPWindowProcInfo, GetCurrentArchitecture())
#define DisposeGXWriteStatusToDTPWindowUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXWriteStatusToDTPWindowUPP(pStatusRec, pDisplay, userUPP) (OSErr)CALL_TWO_PARAMETER_UPP((userUPP), uppGXWriteStatusToDTPWindowProcInfo, (pStatusRec), (pDisplay))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXWriteStatusToDTPWindowProc(userRoutine) NewGXWriteStatusToDTPWindowUPP(userRoutine)
#define CallGXWriteStatusToDTPWindowProc(userRoutine, pStatusRec, pDisplay) InvokeGXWriteStatusToDTPWindowUPP(pStatusRec, pDisplay, userRoutine)
#define Send_GXWriteStatusToDTPWindow(pStatusRec, pDisplay) \
MacSendMessage(0x00000033, pStatusRec, pDisplay)
#define Forward_GXWriteStatusToDTPWindow(pStatusRec, pDisplay) \
ForwardThisMessage((void *) (pStatusRec), (void *) (pDisplay))
typedef CALLBACK_API_C( OSErr , GXInitializeStatusAlertProcPtr )(gxStatusRecord *pStatusRec, DialogPtr *pDialog);
typedef STACK_UPP_TYPE(GXInitializeStatusAlertProcPtr) GXInitializeStatusAlertUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXInitializeStatusAlertUPP)
NewGXInitializeStatusAlertUPP (GXInitializeStatusAlertProcPtr userRoutine);
EXTERN_API(void)
DisposeGXInitializeStatusAlertUPP (GXInitializeStatusAlertUPP userUPP);
EXTERN_API(OSErr)
InvokeGXInitializeStatusAlertUPP (gxStatusRecord * pStatusRec,
DialogPtr * pDialog,
GXInitializeStatusAlertUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXInitializeStatusAlertProcInfo = 0x000003E1 }; /* 2_bytes Func(4_bytes, 4_bytes) */
#define NewGXInitializeStatusAlertUPP(userRoutine) (GXInitializeStatusAlertUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXInitializeStatusAlertProcInfo, GetCurrentArchitecture())
#define DisposeGXInitializeStatusAlertUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXInitializeStatusAlertUPP(pStatusRec, pDialog, userUPP) (OSErr)CALL_TWO_PARAMETER_UPP((userUPP), uppGXInitializeStatusAlertProcInfo, (pStatusRec), (pDialog))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXInitializeStatusAlertProc(userRoutine) NewGXInitializeStatusAlertUPP(userRoutine)
#define CallGXInitializeStatusAlertProc(userRoutine, pStatusRec, pDialog) InvokeGXInitializeStatusAlertUPP(pStatusRec, pDialog, userRoutine)
#define Send_GXInitializeStatusAlert(pStatusRec, pDialog) \
MacSendMessage(0x00000034, pStatusRec, pDialog)
#define Forward_GXInitializeStatusAlert(pStatusRec, pDialog) \
ForwardThisMessage((void *) (pStatusRec), (void *) (pDialog))
typedef CALLBACK_API_C( OSErr , GXHandleAlertStatusProcPtr )(gxStatusRecord *pStatusRec);
typedef STACK_UPP_TYPE(GXHandleAlertStatusProcPtr) GXHandleAlertStatusUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXHandleAlertStatusUPP)
NewGXHandleAlertStatusUPP (GXHandleAlertStatusProcPtr userRoutine);
EXTERN_API(void)
DisposeGXHandleAlertStatusUPP (GXHandleAlertStatusUPP userUPP);
EXTERN_API(OSErr)
InvokeGXHandleAlertStatusUPP (gxStatusRecord * pStatusRec,
GXHandleAlertStatusUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXHandleAlertStatusProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXHandleAlertStatusUPP(userRoutine) (GXHandleAlertStatusUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXHandleAlertStatusProcInfo, GetCurrentArchitecture())
#define DisposeGXHandleAlertStatusUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXHandleAlertStatusUPP(pStatusRec, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXHandleAlertStatusProcInfo, (pStatusRec))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXHandleAlertStatusProc(userRoutine) NewGXHandleAlertStatusUPP(userRoutine)
#define CallGXHandleAlertStatusProc(userRoutine, pStatusRec) InvokeGXHandleAlertStatusUPP(pStatusRec, userRoutine)
#define Send_GXHandleAlertStatus(pStatusRec) \
MacSendMessage(0x00000035, pStatusRec)
#define Forward_GXHandleAlertStatus(pStatusRec) \
ForwardThisMessage((void *) (pStatusRec))
typedef CALLBACK_API_C( OSErr , GXHandleAlertEventProcPtr )(gxStatusRecord *pStatusRec, DialogPtr pDialog, EventRecord *theEvent, short *response);
typedef STACK_UPP_TYPE(GXHandleAlertEventProcPtr) GXHandleAlertEventUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXHandleAlertEventUPP)
NewGXHandleAlertEventUPP (GXHandleAlertEventProcPtr userRoutine);
EXTERN_API(void)
DisposeGXHandleAlertEventUPP (GXHandleAlertEventUPP userUPP);
EXTERN_API(OSErr)
InvokeGXHandleAlertEventUPP (gxStatusRecord * pStatusRec,
DialogPtr pDialog,
EventRecord * theEvent,
short * response,
GXHandleAlertEventUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXHandleAlertEventProcInfo = 0x00003FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXHandleAlertEventUPP(userRoutine) (GXHandleAlertEventUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXHandleAlertEventProcInfo, GetCurrentArchitecture())
#define DisposeGXHandleAlertEventUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXHandleAlertEventUPP(pStatusRec, pDialog, theEvent, response, userUPP) (OSErr)CALL_FOUR_PARAMETER_UPP((userUPP), uppGXHandleAlertEventProcInfo, (pStatusRec), (pDialog), (theEvent), (response))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXHandleAlertEventProc(userRoutine) NewGXHandleAlertEventUPP(userRoutine)
#define CallGXHandleAlertEventProc(userRoutine, pStatusRec, pDialog, theEvent, response) InvokeGXHandleAlertEventUPP(pStatusRec, pDialog, theEvent, response, userRoutine)
#define Send_GXHandleAlertEvent(pStatusRec, pDialog, theEvent, response) \
MacSendMessage(0x00000036, pStatusRec, pDialog, theEvent, response)
#define Forward_GXHandleAlertEvent(pStatusRec, pDialog, theEvent, response) \
ForwardThisMessage((void *) (pStatusRec), (void *) (pDialog), \
(void *) (theEvent), (void *) (response))
typedef CALLBACK_API_C( void , GXCleanupStartJobProcPtr )(void );
typedef STACK_UPP_TYPE(GXCleanupStartJobProcPtr) GXCleanupStartJobUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXCleanupStartJobUPP)
NewGXCleanupStartJobUPP (GXCleanupStartJobProcPtr userRoutine);
EXTERN_API(void)
DisposeGXCleanupStartJobUPP (GXCleanupStartJobUPP userUPP);
EXTERN_API(void)
InvokeGXCleanupStartJobUPP (GXCleanupStartJobUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXCleanupStartJobProcInfo = 0x00000001 }; /* no_return_value Func() */
#define NewGXCleanupStartJobUPP(userRoutine) (GXCleanupStartJobUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXCleanupStartJobProcInfo, GetCurrentArchitecture())
#define DisposeGXCleanupStartJobUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXCleanupStartJobUPP(userUPP) CALL_ZERO_PARAMETER_UPP((userUPP), uppGXCleanupStartJobProcInfo)
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXCleanupStartJobProc(userRoutine) NewGXCleanupStartJobUPP(userRoutine)
#define CallGXCleanupStartJobProc(userRoutine) InvokeGXCleanupStartJobUPP(userRoutine)
#define Send_GXCleanupStartJob() ((void) MacSendMessage(0x00000037))
#define Forward_GXCleanupStartJob() ((void) ForwardThisMessage((void *) (0)))
typedef CALLBACK_API_C( void , GXCleanupStartPageProcPtr )(void );
typedef STACK_UPP_TYPE(GXCleanupStartPageProcPtr) GXCleanupStartPageUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXCleanupStartPageUPP)
NewGXCleanupStartPageUPP (GXCleanupStartPageProcPtr userRoutine);
EXTERN_API(void)
DisposeGXCleanupStartPageUPP (GXCleanupStartPageUPP userUPP);
EXTERN_API(void)
InvokeGXCleanupStartPageUPP (GXCleanupStartPageUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXCleanupStartPageProcInfo = 0x00000001 }; /* no_return_value Func() */
#define NewGXCleanupStartPageUPP(userRoutine) (GXCleanupStartPageUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXCleanupStartPageProcInfo, GetCurrentArchitecture())
#define DisposeGXCleanupStartPageUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXCleanupStartPageUPP(userUPP) CALL_ZERO_PARAMETER_UPP((userUPP), uppGXCleanupStartPageProcInfo)
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXCleanupStartPageProc(userRoutine) NewGXCleanupStartPageUPP(userRoutine)
#define CallGXCleanupStartPageProc(userRoutine) InvokeGXCleanupStartPageUPP(userRoutine)
#define Send_GXCleanupStartPage() ((void) MacSendMessage(0x00000038))
#define Forward_GXCleanupStartPage() ((void) ForwardThisMessage((void *) (0)))
typedef CALLBACK_API_C( void , GXCleanupOpenConnectionProcPtr )(void );
typedef STACK_UPP_TYPE(GXCleanupOpenConnectionProcPtr) GXCleanupOpenConnectionUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXCleanupOpenConnectionUPP)
NewGXCleanupOpenConnectionUPP (GXCleanupOpenConnectionProcPtr userRoutine);
EXTERN_API(void)
DisposeGXCleanupOpenConnectionUPP (GXCleanupOpenConnectionUPP userUPP);
EXTERN_API(void)
InvokeGXCleanupOpenConnectionUPP (GXCleanupOpenConnectionUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXCleanupOpenConnectionProcInfo = 0x00000001 }; /* no_return_value Func() */
#define NewGXCleanupOpenConnectionUPP(userRoutine) (GXCleanupOpenConnectionUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXCleanupOpenConnectionProcInfo, GetCurrentArchitecture())
#define DisposeGXCleanupOpenConnectionUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXCleanupOpenConnectionUPP(userUPP) CALL_ZERO_PARAMETER_UPP((userUPP), uppGXCleanupOpenConnectionProcInfo)
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXCleanupOpenConnectionProc(userRoutine) NewGXCleanupOpenConnectionUPP(userRoutine)
#define CallGXCleanupOpenConnectionProc(userRoutine) InvokeGXCleanupOpenConnectionUPP(userRoutine)
#define Send_GXCleanupOpenConnection() ((void) MacSendMessage(0x00000039))
#define Forward_GXCleanupOpenConnection() ((void) ForwardThisMessage((void *) (0)))
typedef CALLBACK_API_C( void , GXCleanupStartSendPageProcPtr )(void );
typedef STACK_UPP_TYPE(GXCleanupStartSendPageProcPtr) GXCleanupStartSendPageUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXCleanupStartSendPageUPP)
NewGXCleanupStartSendPageUPP (GXCleanupStartSendPageProcPtr userRoutine);
EXTERN_API(void)
DisposeGXCleanupStartSendPageUPP (GXCleanupStartSendPageUPP userUPP);
EXTERN_API(void)
InvokeGXCleanupStartSendPageUPP (GXCleanupStartSendPageUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXCleanupStartSendPageProcInfo = 0x00000001 }; /* no_return_value Func() */
#define NewGXCleanupStartSendPageUPP(userRoutine) (GXCleanupStartSendPageUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXCleanupStartSendPageProcInfo, GetCurrentArchitecture())
#define DisposeGXCleanupStartSendPageUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXCleanupStartSendPageUPP(userUPP) CALL_ZERO_PARAMETER_UPP((userUPP), uppGXCleanupStartSendPageProcInfo)
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXCleanupStartSendPageProc(userRoutine) NewGXCleanupStartSendPageUPP(userRoutine)
#define CallGXCleanupStartSendPageProc(userRoutine) InvokeGXCleanupStartSendPageUPP(userRoutine)
#define Send_GXCleanupStartSendPage() ((void) MacSendMessage(0x0000003A))
#define Forward_GXCleanupStartSendPage() ((void) ForwardThisMessage((void *) (0)))
typedef CALLBACK_API_C( OSErr , GXDefaultDesktopPrinterProcPtr )(Str31 dtpName);
typedef STACK_UPP_TYPE(GXDefaultDesktopPrinterProcPtr) GXDefaultDesktopPrinterUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXDefaultDesktopPrinterUPP)
NewGXDefaultDesktopPrinterUPP (GXDefaultDesktopPrinterProcPtr userRoutine);
EXTERN_API(void)
DisposeGXDefaultDesktopPrinterUPP (GXDefaultDesktopPrinterUPP userUPP);
EXTERN_API(OSErr)
InvokeGXDefaultDesktopPrinterUPP (Str31 dtpName,
GXDefaultDesktopPrinterUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXDefaultDesktopPrinterProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXDefaultDesktopPrinterUPP(userRoutine) (GXDefaultDesktopPrinterUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXDefaultDesktopPrinterProcInfo, GetCurrentArchitecture())
#define DisposeGXDefaultDesktopPrinterUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXDefaultDesktopPrinterUPP(dtpName, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXDefaultDesktopPrinterProcInfo, (dtpName))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXDefaultDesktopPrinterProc(userRoutine) NewGXDefaultDesktopPrinterUPP(userRoutine)
#define CallGXDefaultDesktopPrinterProc(userRoutine, dtpName) InvokeGXDefaultDesktopPrinterUPP(dtpName, userRoutine)
#define Send_GXDefaultDesktopPrinter(dtpName) MacSendMessage(0x0000003B, dtpName)
#define Forward_GXDefaultDesktopPrinter(dtpName) ForwardThisMessage((void *) (dtpName))
typedef CALLBACK_API_C( OSErr , GXCaptureOutputDeviceProcPtr )(Boolean capture);
typedef STACK_UPP_TYPE(GXCaptureOutputDeviceProcPtr) GXCaptureOutputDeviceUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXCaptureOutputDeviceUPP)
NewGXCaptureOutputDeviceUPP (GXCaptureOutputDeviceProcPtr userRoutine);
EXTERN_API(void)
DisposeGXCaptureOutputDeviceUPP (GXCaptureOutputDeviceUPP userUPP);
EXTERN_API(OSErr)
InvokeGXCaptureOutputDeviceUPP (Boolean capture,
GXCaptureOutputDeviceUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXCaptureOutputDeviceProcInfo = 0x00000061 }; /* 2_bytes Func(1_byte) */
#define NewGXCaptureOutputDeviceUPP(userRoutine) (GXCaptureOutputDeviceUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXCaptureOutputDeviceProcInfo, GetCurrentArchitecture())
#define DisposeGXCaptureOutputDeviceUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXCaptureOutputDeviceUPP(capture, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXCaptureOutputDeviceProcInfo, (capture))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXCaptureOutputDeviceProc(userRoutine) NewGXCaptureOutputDeviceUPP(userRoutine)
#define CallGXCaptureOutputDeviceProc(userRoutine, capture) InvokeGXCaptureOutputDeviceUPP(capture, userRoutine)
#define Send_GXCaptureOutputDevice(capture) MacSendMessage(0x0000003C, capture)
#define Forward_GXCaptureOutputDevice(capture) ForwardThisMessage((void *) (capture))
typedef CALLBACK_API_C( OSErr , GXOpenConnectionRetryProcPtr )(ResType theType, void *aVoid, Boolean *retryopenPtr, OSErr anErr);
typedef STACK_UPP_TYPE(GXOpenConnectionRetryProcPtr) GXOpenConnectionRetryUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXOpenConnectionRetryUPP)
NewGXOpenConnectionRetryUPP (GXOpenConnectionRetryProcPtr userRoutine);
EXTERN_API(void)
DisposeGXOpenConnectionRetryUPP (GXOpenConnectionRetryUPP userUPP);
EXTERN_API(OSErr)
InvokeGXOpenConnectionRetryUPP (ResType theType,
void * aVoid,
Boolean * retryopenPtr,
OSErr anErr,
GXOpenConnectionRetryUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXOpenConnectionRetryProcInfo = 0x00002FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 2_bytes) */
#define NewGXOpenConnectionRetryUPP(userRoutine) (GXOpenConnectionRetryUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXOpenConnectionRetryProcInfo, GetCurrentArchitecture())
#define DisposeGXOpenConnectionRetryUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXOpenConnectionRetryUPP(theType, aVoid, retryopenPtr, anErr, userUPP) (OSErr)CALL_FOUR_PARAMETER_UPP((userUPP), uppGXOpenConnectionRetryProcInfo, (theType), (aVoid), (retryopenPtr), (anErr))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXOpenConnectionRetryProc(userRoutine) NewGXOpenConnectionRetryUPP(userRoutine)
#define CallGXOpenConnectionRetryProc(userRoutine, theType, aVoid, retryopenPtr, anErr) InvokeGXOpenConnectionRetryUPP(theType, aVoid, retryopenPtr, anErr, userRoutine)
#define Send_GXOpenConnectionRetry(theType, aVoid, retryopenPtr, anErr) \
MacSendMessage(0x0000003D, theType, aVoid, retryopenPtr, anErr)
#define Forward_GXOpenConnectionRetry(theType, aVoid, retryopenPtr, anErr) \
ForwardThisMessage((void *) (theType), (void *) (aVoid), (void *) (retryopenPtr), (void *) (anErr))
typedef CALLBACK_API_C( OSErr , GXExamineSpoolFileProcPtr )(gxSpoolFile theSpoolFile);
typedef STACK_UPP_TYPE(GXExamineSpoolFileProcPtr) GXExamineSpoolFileUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXExamineSpoolFileUPP)
NewGXExamineSpoolFileUPP (GXExamineSpoolFileProcPtr userRoutine);
EXTERN_API(void)
DisposeGXExamineSpoolFileUPP (GXExamineSpoolFileUPP userUPP);
EXTERN_API(OSErr)
InvokeGXExamineSpoolFileUPP (gxSpoolFile theSpoolFile,
GXExamineSpoolFileUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXExamineSpoolFileProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXExamineSpoolFileUPP(userRoutine) (GXExamineSpoolFileUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXExamineSpoolFileProcInfo, GetCurrentArchitecture())
#define DisposeGXExamineSpoolFileUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXExamineSpoolFileUPP(theSpoolFile, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXExamineSpoolFileProcInfo, (theSpoolFile))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXExamineSpoolFileProc(userRoutine) NewGXExamineSpoolFileUPP(userRoutine)
#define CallGXExamineSpoolFileProc(userRoutine, theSpoolFile) InvokeGXExamineSpoolFileUPP(theSpoolFile, userRoutine)
#define Send_GXExamineSpoolFile(theSpoolFile) MacSendMessage(0x0000003E, theSpoolFile)
#define Forward_GXExamineSpoolFile(theSpoolFile) ForwardThisMessage((void *) (theSpoolFile))
typedef CALLBACK_API_C( OSErr , GXFinishSendPlaneProcPtr )(void );
typedef STACK_UPP_TYPE(GXFinishSendPlaneProcPtr) GXFinishSendPlaneUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXFinishSendPlaneUPP)
NewGXFinishSendPlaneUPP (GXFinishSendPlaneProcPtr userRoutine);
EXTERN_API(void)
DisposeGXFinishSendPlaneUPP (GXFinishSendPlaneUPP userUPP);
EXTERN_API(OSErr)
InvokeGXFinishSendPlaneUPP (GXFinishSendPlaneUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXFinishSendPlaneProcInfo = 0x00000021 }; /* 2_bytes Func() */
#define NewGXFinishSendPlaneUPP(userRoutine) (GXFinishSendPlaneUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXFinishSendPlaneProcInfo, GetCurrentArchitecture())
#define DisposeGXFinishSendPlaneUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXFinishSendPlaneUPP(userUPP) (OSErr)CALL_ZERO_PARAMETER_UPP((userUPP), uppGXFinishSendPlaneProcInfo)
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXFinishSendPlaneProc(userRoutine) NewGXFinishSendPlaneUPP(userRoutine)
#define CallGXFinishSendPlaneProc(userRoutine) InvokeGXFinishSendPlaneUPP(userRoutine)
#define Send_GXFinishSendPlane() MacSendMessage(0x0000003F)
#define Forward_GXFinishSendPlane() ForwardThisMessage((void *) (0))
typedef CALLBACK_API_C( OSErr , GXDoesPaperFitProcPtr )(gxTrayIndex whichTray, gxPaperType paper, Boolean *fits);
typedef STACK_UPP_TYPE(GXDoesPaperFitProcPtr) GXDoesPaperFitUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXDoesPaperFitUPP)
NewGXDoesPaperFitUPP (GXDoesPaperFitProcPtr userRoutine);
EXTERN_API(void)
DisposeGXDoesPaperFitUPP (GXDoesPaperFitUPP userUPP);
EXTERN_API(OSErr)
InvokeGXDoesPaperFitUPP (gxTrayIndex whichTray,
gxPaperType paper,
Boolean * fits,
GXDoesPaperFitUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXDoesPaperFitProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXDoesPaperFitUPP(userRoutine) (GXDoesPaperFitUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXDoesPaperFitProcInfo, GetCurrentArchitecture())
#define DisposeGXDoesPaperFitUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXDoesPaperFitUPP(whichTray, paper, fits, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXDoesPaperFitProcInfo, (whichTray), (paper), (fits))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXDoesPaperFitProc(userRoutine) NewGXDoesPaperFitUPP(userRoutine)
#define CallGXDoesPaperFitProc(userRoutine, whichTray, paper, fits) InvokeGXDoesPaperFitUPP(whichTray, paper, fits, userRoutine)
#define Send_GXDoesPaperFit(whichTray, paper, fits) \
MacSendMessage(0x00000040, whichTray, paper, fits)
#define Forward_GXDoesPaperFit(whichTray, paper, fits) \
ForwardThisMessage((void *) (whichTray), (void *) (paper), (void *) (fits))
typedef CALLBACK_API_C( OSErr , GXChooserMessageProcPtr )(long message, long caller, StringPtr objName, StringPtr zoneName, ListHandle theList, long p2);
typedef STACK_UPP_TYPE(GXChooserMessageProcPtr) GXChooserMessageUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXChooserMessageUPP)
NewGXChooserMessageUPP (GXChooserMessageProcPtr userRoutine);
EXTERN_API(void)
DisposeGXChooserMessageUPP (GXChooserMessageUPP userUPP);
EXTERN_API(OSErr)
InvokeGXChooserMessageUPP (long message,
long caller,
StringPtr objName,
StringPtr zoneName,
ListHandle theList,
long p2,
GXChooserMessageUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXChooserMessageProcInfo = 0x0003FFE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXChooserMessageUPP(userRoutine) (GXChooserMessageUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXChooserMessageProcInfo, GetCurrentArchitecture())
#define DisposeGXChooserMessageUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXChooserMessageUPP(message, caller, objName, zoneName, theList, p2, userUPP) (OSErr)CALL_SIX_PARAMETER_UPP((userUPP), uppGXChooserMessageProcInfo, (message), (caller), (objName), (zoneName), (theList), (p2))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXChooserMessageProc(userRoutine) NewGXChooserMessageUPP(userRoutine)
#define CallGXChooserMessageProc(userRoutine, message, caller, objName, zoneName, theList, p2) InvokeGXChooserMessageUPP(message, caller, objName, zoneName, theList, p2, userRoutine)
#define Send_GXChooserMessage(message, caller, objName, zoneName, theList, p2) \
MacSendMessage(0x00000041, message, caller, objName, zoneName, theList, p2)
#define Forward_GXChooserMessage(message, caller, objName, zoneName, theList, p2) \
ForwardThisMessage((void *) (message), (void *) (caller), (void *) (objName), \
(void *) (zoneName), (void *) (theList), (void *) (p2))
typedef CALLBACK_API_C( OSErr , GXFindPrinterProfileProcPtr )(gxPrinter thePrinter, void *searchData, long index, gxColorProfile *returnedProfile, long *numProfiles);
typedef STACK_UPP_TYPE(GXFindPrinterProfileProcPtr) GXFindPrinterProfileUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXFindPrinterProfileUPP)
NewGXFindPrinterProfileUPP (GXFindPrinterProfileProcPtr userRoutine);
EXTERN_API(void)
DisposeGXFindPrinterProfileUPP (GXFindPrinterProfileUPP userUPP);
EXTERN_API(OSErr)
InvokeGXFindPrinterProfileUPP (gxPrinter thePrinter,
void * searchData,
long index,
gxColorProfile * returnedProfile,
long * numProfiles,
GXFindPrinterProfileUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXFindPrinterProfileProcInfo = 0x0000FFE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXFindPrinterProfileUPP(userRoutine) (GXFindPrinterProfileUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXFindPrinterProfileProcInfo, GetCurrentArchitecture())
#define DisposeGXFindPrinterProfileUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXFindPrinterProfileUPP(thePrinter, searchData, index, returnedProfile, numProfiles, userUPP) (OSErr)CALL_FIVE_PARAMETER_UPP((userUPP), uppGXFindPrinterProfileProcInfo, (thePrinter), (searchData), (index), (returnedProfile), (numProfiles))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXFindPrinterProfileProc(userRoutine) NewGXFindPrinterProfileUPP(userRoutine)
#define CallGXFindPrinterProfileProc(userRoutine, thePrinter, searchData, index, returnedProfile, numProfiles) InvokeGXFindPrinterProfileUPP(thePrinter, searchData, index, returnedProfile, numProfiles, userRoutine)
#define Send_GXFindPrinterProfile(thePrinter, searchData, index, returnedProfile, numProfiles) \
MacSendMessage(0x00000042, thePrinter, searchData, index, returnedProfile, numProfiles)
#define Forward_GXFindPrinterProfile(thePrinter, searchData, index, returnedProfile, numProfiles) \
ForwardThisMessage((void *) (thePrinter), (void *) (searchData), (void *) (index), (void *) (returnedProfile), (void *) (numProfiles))
typedef CALLBACK_API_C( OSErr , GXFindFormatProfileProcPtr )(gxFormat theFormat, void *searchData, long index, gxColorProfile *returnedProfile, long *numProfiles);
typedef STACK_UPP_TYPE(GXFindFormatProfileProcPtr) GXFindFormatProfileUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXFindFormatProfileUPP)
NewGXFindFormatProfileUPP (GXFindFormatProfileProcPtr userRoutine);
EXTERN_API(void)
DisposeGXFindFormatProfileUPP (GXFindFormatProfileUPP userUPP);
EXTERN_API(OSErr)
InvokeGXFindFormatProfileUPP (gxFormat theFormat,
void * searchData,
long index,
gxColorProfile * returnedProfile,
long * numProfiles,
GXFindFormatProfileUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXFindFormatProfileProcInfo = 0x0000FFE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes, 4_bytes, 4_bytes) */
#define NewGXFindFormatProfileUPP(userRoutine) (GXFindFormatProfileUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXFindFormatProfileProcInfo, GetCurrentArchitecture())
#define DisposeGXFindFormatProfileUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXFindFormatProfileUPP(theFormat, searchData, index, returnedProfile, numProfiles, userUPP) (OSErr)CALL_FIVE_PARAMETER_UPP((userUPP), uppGXFindFormatProfileProcInfo, (theFormat), (searchData), (index), (returnedProfile), (numProfiles))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXFindFormatProfileProc(userRoutine) NewGXFindFormatProfileUPP(userRoutine)
#define CallGXFindFormatProfileProc(userRoutine, theFormat, searchData, index, returnedProfile, numProfiles) InvokeGXFindFormatProfileUPP(theFormat, searchData, index, returnedProfile, numProfiles, userRoutine)
#define Send_GXFindFormatProfile(theFormat, searchData, index, returnedProfile, numProfiles) \
MacSendMessage(0x00000043, theFormat, searchData, index, returnedProfile, numProfiles)
#define Forward_GXFindFormatProfile(theFormat, searchData, index, returnedProfile, numProfiles) \
ForwardThisMessage((void *) (theFormat), (void *) (searchData), (void *) (index), (void *) (returnedProfile), \
(void *) (numProfiles))
typedef CALLBACK_API_C( OSErr , GXSetPrinterProfileProcPtr )(gxPrinter thePrinter, gxColorProfile oldProfile, gxColorProfile newProfile);
typedef STACK_UPP_TYPE(GXSetPrinterProfileProcPtr) GXSetPrinterProfileUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXSetPrinterProfileUPP)
NewGXSetPrinterProfileUPP (GXSetPrinterProfileProcPtr userRoutine);
EXTERN_API(void)
DisposeGXSetPrinterProfileUPP (GXSetPrinterProfileUPP userUPP);
EXTERN_API(OSErr)
InvokeGXSetPrinterProfileUPP (gxPrinter thePrinter,
gxColorProfile oldProfile,
gxColorProfile newProfile,
GXSetPrinterProfileUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXSetPrinterProfileProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXSetPrinterProfileUPP(userRoutine) (GXSetPrinterProfileUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXSetPrinterProfileProcInfo, GetCurrentArchitecture())
#define DisposeGXSetPrinterProfileUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXSetPrinterProfileUPP(thePrinter, oldProfile, newProfile, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXSetPrinterProfileProcInfo, (thePrinter), (oldProfile), (newProfile))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXSetPrinterProfileProc(userRoutine) NewGXSetPrinterProfileUPP(userRoutine)
#define CallGXSetPrinterProfileProc(userRoutine, thePrinter, oldProfile, newProfile) InvokeGXSetPrinterProfileUPP(thePrinter, oldProfile, newProfile, userRoutine)
#define Send_GXSetPrinterProfile(thePrinter, oldProfile, newProfile) \
MacSendMessage(0x00000044, thePrinter, oldProfile, newProfile)
#define Forward_GXSetPrinterProfile(thePrinter, oldProfile, newProfile) \
ForwardThisMessage((void *) (thePrinter), (void *) (oldProfile), (void *) (newProfile))
typedef CALLBACK_API_C( OSErr , GXSetFormatProfileProcPtr )(gxFormat theFormat, gxColorProfile oldProfile, gxColorProfile newProfile);
typedef STACK_UPP_TYPE(GXSetFormatProfileProcPtr) GXSetFormatProfileUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXSetFormatProfileUPP)
NewGXSetFormatProfileUPP (GXSetFormatProfileProcPtr userRoutine);
EXTERN_API(void)
DisposeGXSetFormatProfileUPP (GXSetFormatProfileUPP userUPP);
EXTERN_API(OSErr)
InvokeGXSetFormatProfileUPP (gxFormat theFormat,
gxColorProfile oldProfile,
gxColorProfile newProfile,
GXSetFormatProfileUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXSetFormatProfileProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXSetFormatProfileUPP(userRoutine) (GXSetFormatProfileUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXSetFormatProfileProcInfo, GetCurrentArchitecture())
#define DisposeGXSetFormatProfileUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXSetFormatProfileUPP(theFormat, oldProfile, newProfile, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXSetFormatProfileProcInfo, (theFormat), (oldProfile), (newProfile))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXSetFormatProfileProc(userRoutine) NewGXSetFormatProfileUPP(userRoutine)
#define CallGXSetFormatProfileProc(userRoutine, theFormat, oldProfile, newProfile) InvokeGXSetFormatProfileUPP(theFormat, oldProfile, newProfile, userRoutine)
#define Send_GXSetFormatProfile(theFormat, oldProfile, newProfile) \
MacSendMessage(0x00000045, theFormat, oldProfile, newProfile)
#define Forward_GXSetFormatProfile(theFormat, oldProfile, newProfile) \
ForwardThisMessage((void *) (theFormat), (void *) (oldProfile), (void *) (newProfile))
typedef CALLBACK_API_C( OSErr , GXHandleAltDestinationProcPtr )(Boolean *userCancels);
typedef STACK_UPP_TYPE(GXHandleAltDestinationProcPtr) GXHandleAltDestinationUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXHandleAltDestinationUPP)
NewGXHandleAltDestinationUPP (GXHandleAltDestinationProcPtr userRoutine);
EXTERN_API(void)
DisposeGXHandleAltDestinationUPP (GXHandleAltDestinationUPP userUPP);
EXTERN_API(OSErr)
InvokeGXHandleAltDestinationUPP (Boolean * userCancels,
GXHandleAltDestinationUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXHandleAltDestinationProcInfo = 0x000000E1 }; /* 2_bytes Func(4_bytes) */
#define NewGXHandleAltDestinationUPP(userRoutine) (GXHandleAltDestinationUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXHandleAltDestinationProcInfo, GetCurrentArchitecture())
#define DisposeGXHandleAltDestinationUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXHandleAltDestinationUPP(userCancels, userUPP) (OSErr)CALL_ONE_PARAMETER_UPP((userUPP), uppGXHandleAltDestinationProcInfo, (userCancels))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXHandleAltDestinationProc(userRoutine) NewGXHandleAltDestinationUPP(userRoutine)
#define CallGXHandleAltDestinationProc(userRoutine, userCancels) InvokeGXHandleAltDestinationUPP(userCancels, userRoutine)
#define Send_GXHandleAltDestination(userCancels) MacSendMessage(0x00000046, userCancels)
#define Forward_GXHandleAltDestination(userCancels) ForwardThisMessage((void *) (userCancels))
typedef CALLBACK_API_C( OSErr , GXSetupPageImageDataProcPtr )(gxFormat theFormat, gxShape thePage, void *imageData);
typedef STACK_UPP_TYPE(GXSetupPageImageDataProcPtr) GXSetupPageImageDataUPP;
#if OPAQUE_UPP_TYPES
#if CALL_NOT_IN_CARBON
EXTERN_API(GXSetupPageImageDataUPP)
NewGXSetupPageImageDataUPP (GXSetupPageImageDataProcPtr userRoutine);
EXTERN_API(void)
DisposeGXSetupPageImageDataUPP (GXSetupPageImageDataUPP userUPP);
EXTERN_API(OSErr)
InvokeGXSetupPageImageDataUPP (gxFormat theFormat,
gxShape thePage,
void * imageData,
GXSetupPageImageDataUPP userUPP);
#endif /* CALL_NOT_IN_CARBON */
#else
enum { uppGXSetupPageImageDataProcInfo = 0x00000FE1 }; /* 2_bytes Func(4_bytes, 4_bytes, 4_bytes) */
#define NewGXSetupPageImageDataUPP(userRoutine) (GXSetupPageImageDataUPP)NewRoutineDescriptor((ProcPtr)(userRoutine), uppGXSetupPageImageDataProcInfo, GetCurrentArchitecture())
#define DisposeGXSetupPageImageDataUPP(userUPP) DisposeRoutineDescriptor(userUPP)
#define InvokeGXSetupPageImageDataUPP(theFormat, thePage, imageData, userUPP) (OSErr)CALL_THREE_PARAMETER_UPP((userUPP), uppGXSetupPageImageDataProcInfo, (theFormat), (thePage), (imageData))
#endif
/* support for pre-Carbon UPP routines: NewXXXProc and CallXXXProc */
#define NewGXSetupPageImageDataProc(userRoutine) NewGXSetupPageImageDataUPP(userRoutine)
#define CallGXSetupPageImageDataProc(userRoutine, theFormat, thePage, imageData) InvokeGXSetupPageImageDataUPP(theFormat, thePage, imageData, userRoutine)
#define Send_GXSetupPageImageData(theFormat, thePage, imageData) \
MacSendMessage(0x00000047, theFormat, thePage, imageData)
#define Forward_GXSetupPageImageData(theFormat, thePage, imageData) \
ForwardThisMessage((void *) (theFormat), (void *) (thePage), (void *) (imageData))
/*******************************************************************
Start of old "GXPrintingErrors.h/a/p" interface file.
********************************************************************/
enum {
gxPrintingResultBase = -510 /*First QuickDraw GX printing error code.*/
};
/*RESULT CODES FOR QUICKDRAW GX PRINTING OPERATIONS*/
enum {
gxAioTimeout = (gxPrintingResultBase), /*-510 : Timeout condition occurred during operation*/
gxAioBadRqstState = (gxPrintingResultBase - 1), /*-511 : Async I/O request in invalid state for operation*/
gxAioBadConn = (gxPrintingResultBase - 2), /*-512 : Invalid Async I/O connection refnum*/
gxAioInvalidXfer = (gxPrintingResultBase - 3), /*-513 : Read data transfer structure contained bad values*/
gxAioNoRqstBlks = (gxPrintingResultBase - 4), /*-514 : No available request blocks to process request*/
gxAioNoDataXfer = (gxPrintingResultBase - 5), /*-515 : Data transfer structure pointer not specified*/
gxAioTooManyAutos = (gxPrintingResultBase - 6), /*-516 : Auto status request already active*/
gxAioNoAutoStat = (gxPrintingResultBase - 7), /*-517 : Connection not configured for auto status*/
gxAioBadRqstID = (gxPrintingResultBase - 8), /*-518 : Invalid I/O request identifier*/
gxAioCantKill = (gxPrintingResultBase - 9), /*-519 : Comm. protocol doesn't support I/O term*/
gxAioAlreadyExists = (gxPrintingResultBase - 10), /*-520 : Protocol spec. data already specified*/
gxAioCantFind = (gxPrintingResultBase - 11), /*-521 : Protocol spec. data does not exist*/
gxAioDeviceDisconn = (gxPrintingResultBase - 12), /*-522 : Machine disconnected from printer*/
gxAioNotImplemented = (gxPrintingResultBase - 13), /*-523 : Function not implemented*/
gxAioOpenPending = (gxPrintingResultBase - 14), /*-524 : Opening a connection for protocol, but another open pending*/
gxAioNoProtocolData = (gxPrintingResultBase - 15), /*-525 : No protocol specific data specified in request*/
gxAioRqstKilled = (gxPrintingResultBase - 16), /*-526 : I/O request was terminated*/
gxBadBaudRate = (gxPrintingResultBase - 17), /*-527 : Invalid baud rate specified*/
gxBadParity = (gxPrintingResultBase - 18), /*-528 : Invalid parity specified*/
gxBadStopBits = (gxPrintingResultBase - 19), /*-529 : Invalid stop bits specified*/
gxBadDataBits = (gxPrintingResultBase - 20), /*-530 : Invalid data bits specified*/
gxBadPrinterName = (gxPrintingResultBase - 21), /*-531 : Bad printer name specified*/
gxAioBadMsgType = (gxPrintingResultBase - 22), /*-532 : Bad masType field in transfer info structure*/
gxAioCantFindDevice = (gxPrintingResultBase - 23), /*-533 : Cannot locate target device*/
gxAioOutOfSeq = (gxPrintingResultBase - 24), /*-534 : Non-atomic SCSI requests submitted out of sequence*/
gxPrIOAbortErr = (gxPrintingResultBase - 25), /*-535 : I/O operation aborted*/
gxPrUserAbortErr = (gxPrintingResultBase - 26), /*-536 : User aborted*/
gxCantAddPanelsNowErr = (gxPrintingResultBase - 27), /*-537 : Can only add panels during driver switch or dialog setup*/
gxBadxdtlKeyErr = (gxPrintingResultBase - 28), /*-538 : Unknown key for xdtl - must be radiobutton, etc*/
gxXdtlItemOutOfRangeErr = (gxPrintingResultBase - 29), /*-539 : Referenced item does not belong to panel*/
gxNoActionButtonErr = (gxPrintingResultBase - 30), /*-540 : Action button is nil*/
gxTitlesTooLongErr = (gxPrintingResultBase - 31), /*-541 : Length of buttons exceeds alert maximum width*/
gxUnknownAlertVersionErr = (gxPrintingResultBase - 32), /*-542 : Bad version for printing alerts*/
gxGBBufferTooSmallErr = (gxPrintingResultBase - 33), /*-543 : Buffer too small.*/
gxInvalidPenTable = (gxPrintingResultBase - 34), /*-544 : Invalid vector driver pen table.*/
gxIncompletePrintFileErr = (gxPrintingResultBase - 35), /*-545 : Print file was not completely spooled*/
gxCrashedPrintFileErr = (gxPrintingResultBase - 36), /*-546 : Print file is corrupted*/
gxInvalidPrintFileVersion = (gxPrintingResultBase - 37), /*-547 : Print file is incompatible with current QuickDraw GX version*/
gxSegmentLoadFailedErr = (gxPrintingResultBase - 38), /*-548 : Segment loader error*/
gxExtensionNotFoundErr = (gxPrintingResultBase - 39), /*-549 : Requested printing extension could not be found*/
gxDriverVersionErr = (gxPrintingResultBase - 40), /*-550 : Driver too new for current version of QuickDraw GX*/
gxImagingSystemVersionErr = (gxPrintingResultBase - 41), /*-551 : Imaging system too new for current version of QuickDraw GX*/
gxFlattenVersionTooNew = (gxPrintingResultBase - 42), /*-552 : Flattened object format too new for current version of QDGX*/
gxPaperTypeNotFound = (gxPrintingResultBase - 43), /*-553 : Requested papertype could not be found*/
gxNoSuchPTGroup = (gxPrintingResultBase - 44), /*-554 : Requested papertype group could not be found*/
gxNotEnoughPrinterMemory = (gxPrintingResultBase - 45), /*-555 : Printer does not have enough memory for fonts in document*/
gxDuplicatePanelNameErr = (gxPrintingResultBase - 46), /*-556 : Attempt to add more than 10 panels with the same name*/
gxExtensionVersionErr = (gxPrintingResultBase - 47) /*-557 : Extension too new for current version of QuickDraw GX*/
};
#if PRAGMA_STRUCT_ALIGN
#pragma options align=reset
#elif PRAGMA_STRUCT_PACKPUSH
#pragma pack(pop)
#elif PRAGMA_STRUCT_PACK
#pragma pack()
#endif
#ifdef PRAGMA_IMPORT_OFF
#pragma import off
#elif PRAGMA_IMPORT
#pragma import reset
#endif
#ifdef __cplusplus
}
#endif
#endif /* __GXPRINTING__ */
| {
"content_hash": "ac6a1d3156619ef1c700660a3698d23b",
"timestamp": "",
"source": "github",
"line_count": 4442,
"max_line_length": 258,
"avg_line_length": 50.14678072940117,
"alnum_prop": 0.6276531748312024,
"repo_name": "HellicarAndLewis/ProjectDonk",
"id": "a440a8f5b358fe3d949420e3bcdb238b9c77e5d5",
"size": "223226",
"binary": false,
"copies": "9",
"ref": "refs/heads/master",
"path": "libs/quicktime/include/GXPrinting.h",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "C",
"bytes": "15166492"
},
{
"name": "C++",
"bytes": "12585409"
},
{
"name": "CSS",
"bytes": "51345"
},
{
"name": "Erlang",
"bytes": "412"
},
{
"name": "F#",
"bytes": "19748"
},
{
"name": "JavaScript",
"bytes": "62073"
},
{
"name": "Objective-C",
"bytes": "506316"
},
{
"name": "Objective-C++",
"bytes": "278258"
},
{
"name": "Perl",
"bytes": "7551"
},
{
"name": "Processing",
"bytes": "3021"
},
{
"name": "Python",
"bytes": "9555"
},
{
"name": "Shell",
"bytes": "35227"
}
],
"symlink_target": ""
} |
import React from 'react'
import { Message } from 'stardust'
const Fixed = () => {
return (
<Message info>
You can view examples of the <b>fixed</b> variation in the
<a href='http://semantic-ui.com/collections/menu.html#fixed' target='_blank'> official documentation</a>.
</Message>
)
}
export default Fixed
| {
"content_hash": "b2e711c633a70e551c2f0e46b28a285d",
"timestamp": "",
"source": "github",
"line_count": 13,
"max_line_length": 111,
"avg_line_length": 25.692307692307693,
"alnum_prop": 0.6586826347305389,
"repo_name": "jcarbo/stardust",
"id": "856d87b5047716a5c800a92ea9b329549f872b5f",
"size": "334",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "docs/app/Examples/collections/Menu/Variations/Fixed.js",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "JavaScript",
"bytes": "602056"
}
],
"symlink_target": ""
} |
<#
.EXAMPLE
This example shows how to ensure that the Business Data Connectivity Service
is not running on the local server.
#>
Configuration Example
{
param(
[Parameter(Mandatory = $true)]
[PSCredential]
$SetupAccount
)
Import-DscResource -ModuleName SharePointDsc
node localhost {
SPServiceInstance StopBCSServiceInstance
{
Name = "Business Data Connectivity Service"
Ensure = "Absent"
PsDscRunAsCredential = $SetupAccount
}
}
}
| {
"content_hash": "8d0cbae3202154a5b5bd295d12331958",
"timestamp": "",
"source": "github",
"line_count": 24,
"max_line_length": 80,
"avg_line_length": 26.791666666666668,
"alnum_prop": 0.5318818040435459,
"repo_name": "ykuijs/xSharePoint",
"id": "a11cb15585b890570df0fd820efa636394971eec",
"size": "643",
"binary": false,
"copies": "1",
"ref": "refs/heads/dev",
"path": "Modules/SharePointDsc/Examples/Resources/SPServiceInstance/2-StopService.ps1",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "C#",
"bytes": "13147308"
},
{
"name": "JavaScript",
"bytes": "789"
},
{
"name": "PowerShell",
"bytes": "3989944"
}
],
"symlink_target": ""
} |
package com.alibaba.nacos.core.distributed.distro;
import com.alibaba.nacos.consistency.DataOperation;
import com.alibaba.nacos.core.cluster.Member;
import com.alibaba.nacos.core.cluster.ServerMemberManager;
import com.alibaba.nacos.core.distributed.distro.component.DistroCallback;
import com.alibaba.nacos.core.distributed.distro.component.DistroComponentHolder;
import com.alibaba.nacos.core.distributed.distro.component.DistroDataProcessor;
import com.alibaba.nacos.core.distributed.distro.component.DistroDataStorage;
import com.alibaba.nacos.core.distributed.distro.component.DistroTransportAgent;
import com.alibaba.nacos.core.distributed.distro.entity.DistroData;
import com.alibaba.nacos.core.distributed.distro.entity.DistroKey;
import com.alibaba.nacos.core.distributed.distro.task.DistroTaskEngineHolder;
import com.alibaba.nacos.core.distributed.distro.task.delay.DistroDelayTask;
import com.alibaba.nacos.core.distributed.distro.task.load.DistroLoadDataTask;
import com.alibaba.nacos.core.distributed.distro.task.verify.DistroVerifyTimedTask;
import com.alibaba.nacos.core.utils.GlobalExecutor;
import com.alibaba.nacos.core.utils.Loggers;
import com.alibaba.nacos.sys.env.EnvUtil;
import org.springframework.stereotype.Component;
/**
* Distro protocol.
*
* @author xiweng.yy
*/
@Component
public class DistroProtocol {
private final ServerMemberManager memberManager;
private final DistroComponentHolder distroComponentHolder;
private final DistroTaskEngineHolder distroTaskEngineHolder;
private volatile boolean isInitialized = false;
public DistroProtocol(ServerMemberManager memberManager, DistroComponentHolder distroComponentHolder,
DistroTaskEngineHolder distroTaskEngineHolder) {
this.memberManager = memberManager;
this.distroComponentHolder = distroComponentHolder;
this.distroTaskEngineHolder = distroTaskEngineHolder;
startDistroTask();
}
private void startDistroTask() {
if (EnvUtil.getStandaloneMode()) {
isInitialized = true;
return;
}
startVerifyTask();
startLoadTask();
}
private void startLoadTask() {
DistroCallback loadCallback = new DistroCallback() {
@Override
public void onSuccess() {
isInitialized = true;
}
@Override
public void onFailed(Throwable throwable) {
isInitialized = false;
}
};
GlobalExecutor.submitLoadDataTask(
new DistroLoadDataTask(memberManager, distroComponentHolder, DistroConfig.getInstance(), loadCallback));
}
private void startVerifyTask() {
GlobalExecutor.schedulePartitionDataTimedSync(new DistroVerifyTimedTask(memberManager, distroComponentHolder,
distroTaskEngineHolder.getExecuteWorkersManager()),
DistroConfig.getInstance().getVerifyIntervalMillis());
}
public boolean isInitialized() {
return isInitialized;
}
/**
* Start to sync by configured delay.
*
* @param distroKey distro key of sync data
* @param action the action of data operation
*/
public void sync(DistroKey distroKey, DataOperation action) {
sync(distroKey, action, DistroConfig.getInstance().getSyncDelayMillis());
}
/**
     * Start to sync data to all remote servers.
*
* @param distroKey distro key of sync data
* @param action the action of data operation
* @param delay delay time for sync
*/
public void sync(DistroKey distroKey, DataOperation action, long delay) {
for (Member each : memberManager.allMembersWithoutSelf()) {
syncToTarget(distroKey, action, each.getAddress(), delay);
}
}
/**
* Start to sync to target server.
*
* @param distroKey distro key of sync data
* @param action the action of data operation
* @param targetServer target server
* @param delay delay time for sync
*/
public void syncToTarget(DistroKey distroKey, DataOperation action, String targetServer, long delay) {
DistroKey distroKeyWithTarget = new DistroKey(distroKey.getResourceKey(), distroKey.getResourceType(),
targetServer);
DistroDelayTask distroDelayTask = new DistroDelayTask(distroKeyWithTarget, action, delay);
distroTaskEngineHolder.getDelayTaskExecuteEngine().addTask(distroKeyWithTarget, distroDelayTask);
if (Loggers.DISTRO.isDebugEnabled()) {
Loggers.DISTRO.debug("[DISTRO-SCHEDULE] {} to {}", distroKey, targetServer);
}
}
/**
* Query data from specified server.
*
* @param distroKey data key
* @return data
*/
public DistroData queryFromRemote(DistroKey distroKey) {
if (null == distroKey.getTargetServer()) {
Loggers.DISTRO.warn("[DISTRO] Can't query data from empty server");
return null;
}
String resourceType = distroKey.getResourceType();
DistroTransportAgent transportAgent = distroComponentHolder.findTransportAgent(resourceType);
if (null == transportAgent) {
Loggers.DISTRO.warn("[DISTRO] Can't find transport agent for key {}", resourceType);
return null;
}
return transportAgent.getData(distroKey, distroKey.getTargetServer());
}
/**
* Receive synced distro data, find processor to process.
*
* @param distroData Received data
* @return true if handle receive data successfully, otherwise false
*/
public boolean onReceive(DistroData distroData) {
Loggers.DISTRO.info("[DISTRO] Receive distro data type: {}, key: {}", distroData.getType(),
distroData.getDistroKey());
String resourceType = distroData.getDistroKey().getResourceType();
DistroDataProcessor dataProcessor = distroComponentHolder.findDataProcessor(resourceType);
if (null == dataProcessor) {
            Loggers.DISTRO.warn("[DISTRO] Can't find data processor for received data {}", resourceType);
return false;
}
return dataProcessor.processData(distroData);
}
/**
* Receive verify data, find processor to process.
*
* @param distroData verify data
     * @param sourceAddress source server address; may be used to get data from the source server
* @return true if verify data successfully, otherwise false
*/
public boolean onVerify(DistroData distroData, String sourceAddress) {
if (Loggers.DISTRO.isDebugEnabled()) {
Loggers.DISTRO.debug("[DISTRO] Receive verify data type: {}, key: {}", distroData.getType(),
distroData.getDistroKey());
}
String resourceType = distroData.getDistroKey().getResourceType();
DistroDataProcessor dataProcessor = distroComponentHolder.findDataProcessor(resourceType);
if (null == dataProcessor) {
Loggers.DISTRO.warn("[DISTRO] Can't find verify data process for received data {}", resourceType);
return false;
}
return dataProcessor.processVerifyData(distroData, sourceAddress);
}
/**
* Query data of input distro key.
*
* @param distroKey key of data
* @return data
*/
public DistroData onQuery(DistroKey distroKey) {
String resourceType = distroKey.getResourceType();
DistroDataStorage distroDataStorage = distroComponentHolder.findDataStorage(resourceType);
if (null == distroDataStorage) {
Loggers.DISTRO.warn("[DISTRO] Can't find data storage for received key {}", resourceType);
return new DistroData(distroKey, new byte[0]);
}
return distroDataStorage.getDistroData(distroKey);
}
/**
* Query all datum snapshot.
*
* @param type datum type
* @return all datum snapshot
*/
public DistroData onSnapshot(String type) {
DistroDataStorage distroDataStorage = distroComponentHolder.findDataStorage(type);
if (null == distroDataStorage) {
Loggers.DISTRO.warn("[DISTRO] Can't find data storage for received key {}", type);
return new DistroData(new DistroKey("snapshot", type), new byte[0]);
}
return distroDataStorage.getDatumSnapshot();
}
}
<?php
/**
* This is the model class for table "containr_elementPageRef".
*
* The following are the available columns in table 'containr_elementPageRef':
* @property integer $id
* @property integer $elementId
* @property integer $pageId
* @property integer $posNum
*/
class ContainrElementPageRef extends CActiveRecord
{
/**
* Returns the static model of the specified AR class.
* @param string $className active record class name.
* @return ContainrElementPageRef the static model class
*/
public static function model($className=__CLASS__)
{
return parent::model($className);
}
/**
* @return string the associated database table name
*/
public function tableName()
{
return 'containr_elementPageRef';
}
/**
* @return array validation rules for model attributes.
*/
public function rules()
{
// NOTE: you should only define rules for those attributes that
// will receive user inputs.
return array(
array('elementId, pageId, posNum', 'numerical', 'integerOnly'=>true),
array('template,columnId','safe'),
// The following rule is used by search().
// Please remove those attributes that should not be searched.
array('id, elementId, pageId, posNum', 'safe', 'on'=>'search'),
);
}
/**
* @return array relational rules.
*/
public function relations()
{
// NOTE: you may need to adjust the relation name and the related
// class name for the relations automatically generated below.
return array(
'elementDef'=>array(self::BELONGS_TO, 'ContainrElementLib', 'elementId'),
);
}
/**
* @return array customized attribute labels (name=>label)
*/
public function attributeLabels()
{
return array(
'id' => 'ID',
'elementId' => 'Element',
'pageId' => 'Page',
'posNum' => 'Position',
);
}
public function scopes(){
return array(
'sorted'=>array(
'order'=>'posNum ASC'
),
'lastPosNum'=>array(
'limit'=>'1',
'order'=>'posNum DESC'
)
);
}
/**
* Retrieves a list of models based on the current search/filter conditions.
* @return CActiveDataProvider the data provider that can return the models based on the search/filter conditions.
*/
public function search()
{
// Warning: Please modify the following code to remove attributes that
// should not be searched.
$criteria=new CDbCriteria;
$criteria->compare('id',$this->id);
$criteria->compare('elementId',$this->elementId);
$criteria->compare('pageId',$this->pageId);
$criteria->compare('posNum',$this->posNum);
return new CActiveDataProvider($this, array(
'criteria'=>$criteria,
));
}
public static function getLastPosition($page, $column) {
// bind parameters instead of concatenating user input to avoid SQL injection
$model = ContainrElementPageRef::model()->lastPosNum()->find(
'pageId = :pageId AND columnId = :columnId',
array(':pageId' => (int) $page, ':columnId' => $column)
);
return $model ? $model->posNum : 0;
}
}
MIT License
Copyright (c) 2016 Renan Mendes Carvalho
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
#import "Thermomix_DocumentMaintenance_Portal_ViewController.h"

@implementation Thermomix_DocumentMaintenance_Portal_ViewController
@synthesize DocumentsTree;
@end
<?xml version="1.0" encoding="utf-8"?>
<android.support.v4.widget.SwipeRefreshLayout
xmlns:android="http://schemas.android.com/apk/res/android"
android:id="@+id/hot_info_swipe_container"
android:layout_width="match_parent"
android:layout_height="match_parent"
>
<android.support.v4.widget.NestedScrollView
xmlns:tools="http://schemas.android.com/tools"
android:id="@+id/home_hot_info_sv"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="@color/common_bg"
tools:context=".view.home.HotInfoFragment"
>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical"
>
<!-- banner -->
<FrameLayout
android:layout_width="match_parent"
android:layout_height="220dp"
>
<com.bigkoo.convenientbanner.ConvenientBanner
xmlns:app="http://schemas.android.com/apk/res-auto"
android:id="@+id/home_hot_info_banner"
android:layout_width="match_parent"
android:layout_height="match_parent"
app:canLoop="true"
/>
<TextView
android:id="@+id/home_hot_info_banner_title"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_marginBottom="20dp"
android:gravity="bottom"
android:paddingLeft="20dp"
android:paddingRight="20dp"
android:textColor="@color/hot_info_banner_title_color"
android:textSize="@dimen/hot_info_banner_title_size"
/>
</FrameLayout>
<!-- title -->
<LinearLayout
android:layout_width="match_parent"
android:layout_height="60dp"
android:background="@color/white"
android:gravity="center_vertical"
android:orientation="horizontal"
android:paddingLeft="8dp"
>
<Button
android:id="@+id/hot_info_recent"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_gravity="center"
android:layout_weight="1"
android:background="#00000000"
android:drawableLeft="@mipmap/hot_info_icon_recent"
android:gravity="center"
android:padding="8dp"
android:scaleType="fitCenter"
android:text="最新活动"
android:textColor="@color/hot_info_item_title_color"
android:textSize="@dimen/hot_info_activity_text_size"
/>
<Button
android:id="@+id/hot_info_history"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_gravity="center"
android:layout_weight="1"
android:background="#00000000"
android:drawableLeft="@mipmap/hot_info_icon_history"
android:gravity="center"
android:padding="8dp"
android:scaleType="fitCenter"
android:text="历史活动"
android:textColor="@color/hot_info_item_title_color"
android:textSize="@dimen/hot_info_activity_text_size"
/>
<Button
android:id="@+id/hot_info_activity"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_gravity="center"
android:layout_weight="1"
android:background="#00000000"
android:drawableLeft="@mipmap/hot_info_icon_activity"
android:gravity="center"
android:padding="8dp"
android:scaleType="fitCenter"
android:text="活动日历"
android:textColor="@color/hot_info_item_title_color"
android:textSize="@dimen/hot_info_activity_text_size"
/>
</LinearLayout>
<!-- 微直播 -->
<LinearLayout
android:layout_width="match_parent"
android:layout_height="230dp"
android:layout_marginTop="16dp"
android:background="@color/white"
android:orientation="vertical"
>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
>
<TextView
android:id="@+id/hot_info_weizhibo_title"
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_marginLeft="40dp"
android:layout_weight="1"
android:gravity="center_horizontal"
android:padding="16dp"
android:text="微直播"
android:textColor="@color/hot_info_item_title_color"
android:textSize="16sp"
/>
<ImageView
android:id="@+id/ivLiveMore"
android:layout_width="40dp"
android:layout_height="40dp"
android:layout_gravity="center_vertical"
android:padding="8dp"
android:src="@mipmap/hot_info_more_icon"
/>
</LinearLayout>
<android.support.v7.widget.RecyclerView
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:id="@+id/hot_info_live_list"
android:name="com.homesport.view.HotInfoFragment"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:paddingLeft="8dp"
android:paddingRight="8dp"
app:layoutManager="LinearLayoutManager"
tools:context=".view.home.HotInfoFragment"
tools:listitem="@layout/fragment_hot_info_live"
/>
</LinearLayout>
<!-- 精选动态 -->
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginTop="16dp"
android:background="@color/white"
android:orientation="vertical"
>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
>
<TextView
android:id="@+id/hot_info_dynamic_title"
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_marginLeft="40dp"
android:layout_weight="1"
android:gravity="center_horizontal"
android:padding="16dp"
android:text="精选动态"
android:textColor="@color/hot_info_item_title_color"
android:textSize="16sp"
/>
<ImageView
android:id="@+id/ivDynamicMore"
android:layout_width="40dp"
android:layout_height="40dp"
android:layout_gravity="center_vertical"
android:padding="8dp"
android:src="@mipmap/hot_info_more_icon"
/>
</LinearLayout>
<android.support.v7.widget.RecyclerView
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:id="@+id/hot_info_dynamic_list"
android:name="com.homesport.view.HotInfoFragment"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:paddingLeft="8dp"
android:paddingRight="8dp"
android:nestedScrollingEnabled="false"
app:layoutManager="LinearLayoutManager"
tools:context=".view.home.HotInfoFragment"
tools:listitem="@layout/fragment_hot_info_dynamic"
/>
</LinearLayout>
<!-- 热门圈子 -->
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginTop="16dp"
android:background="@color/white"
android:orientation="vertical"
>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
>
<TextView
android:id="@+id/hot_info_circle_title"
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_marginLeft="40dp"
android:layout_weight="1"
android:gravity="center_horizontal"
android:padding="16dp"
android:text="热门圈子讨论"
android:textColor="@color/hot_info_item_title_color"
android:textSize="16sp"
/>
<ImageView
android:id="@+id/ivCircleMore"
android:layout_width="40dp"
android:layout_height="40dp"
android:layout_gravity="center_vertical"
android:padding="8dp"
android:src="@mipmap/hot_info_more_icon"
/>
</LinearLayout>
<android.support.v7.widget.RecyclerView
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:id="@+id/hot_info_circle_list"
android:name="com.homesport.view.HotInfoFragment"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:paddingLeft="8dp"
android:paddingRight="8dp"
android:layout_marginBottom="16dp"
android:nestedScrollingEnabled="false"
app:layoutManager="LinearLayoutManager"
tools:context=".view.home.HotInfoFragment"
tools:listitem="@layout/fragment_hot_info_circle"
/>
</LinearLayout>
<!-- 推荐用户 -->
<LinearLayout
android:layout_width="match_parent"
android:layout_height="140dp"
android:layout_marginTop="16dp"
android:background="@color/white"
android:orientation="vertical"
>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
>
<TextView
android:id="@+id/hot_info_activity_user_title"
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_marginLeft="40dp"
android:layout_weight="1"
android:gravity="center_horizontal"
android:padding="16dp"
android:text="推荐用户"
android:textColor="@color/hot_info_item_title_color"
android:textSize="16sp"
/>
<ImageView
android:id="@+id/ivUserMore"
android:layout_width="40dp"
android:layout_height="40dp"
android:layout_gravity="center_vertical"
android:padding="8dp"
android:src="@mipmap/hot_info_more_icon"
/>
</LinearLayout>
<android.support.v7.widget.RecyclerView
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:id="@+id/hot_info_activity_user_list"
android:name="com.homesport.view.HotInfoFragment"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:paddingLeft="8dp"
android:paddingRight="8dp"
app:layoutManager="LinearLayoutManager"
tools:context=".view.home.HotInfoFragment"
tools:listitem="@layout/fragment_hot_info_activity_user"
/>
</LinearLayout>
</LinearLayout>
</android.support.v4.widget.NestedScrollView>
</android.support.v4.widget.SwipeRefreshLayout>
package com.xiaogua.better.json;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Iterator;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SequenceWriter;
public class JacksonJsonCode {
private static ObjectMapper mapper = new ObjectMapper();
public static JsonNode readAsTree(String filePath) throws Exception {
JsonNode jsonNode = mapper.readTree(Files.newInputStream(Paths.get(filePath)));
return jsonNode;
}
public static void writeJsonToFile(JsonNode rootNode, String destFile) throws Exception {
SequenceWriter writer = mapper.writerWithDefaultPrettyPrinter()
.writeValues(Files.newOutputStream(Paths.get(destFile)));
writer.write(rootNode);
writer.close();
}
public static void printJsonNodeFieldName(JsonNode jsonNode) throws Exception {
Iterator<String> fieldNames = jsonNode.fieldNames();
while (fieldNames.hasNext()) {
String fieldName = fieldNames.next();
System.out.println(fieldName);
}
}
}
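A minimal usage sketch for the helpers above. It parses an in-memory JSON string instead of a file, and assumes only that jackson-databind is on the classpath; the field names are made up for illustration:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class JacksonJsonCodeDemo {
    // Collects top-level field names, mirroring the fieldNames() iteration
    // used by the class above.
    static List<String> fieldNames(String json) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        JsonNode root = mapper.readTree(json);
        List<String> names = new ArrayList<>();
        Iterator<String> it = root.fieldNames();
        while (it.hasNext()) {
            names.add(it.next());
        }
        return names;
    }

    public static void main(String[] args) throws Exception {
        // Object fields are reported in insertion order
        System.out.println(fieldNames("{\"name\":\"nacos\",\"port\":8848}"));
    }
}
```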
module SlackRubyBot
module Commands
module Support
class Attrs
attr_accessor :command_name, :command_desc, :command_long_desc
attr_reader :klass, :commands
def initialize(klass)
@klass = klass
@commands = []
end
def title(title)
self.command_name = title
end
def desc(desc)
self.command_desc = desc
end
def long_desc(long_desc)
self.command_long_desc = long_desc
end
def command(title, &block)
@commands << self.class.new(klass).tap do |k|
k.title(title)
k.instance_eval(&block)
end
end
end
end
end
end
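A short sketch of how the Attrs DSL collects command metadata. The class is re-declared standalone here (without the SlackRubyBot namespace) so the example runs without the gem, and the command names are made up:

```ruby
class Attrs
  attr_accessor :command_name, :command_desc, :command_long_desc
  attr_reader :klass, :commands

  def initialize(klass)
    @klass = klass
    @commands = []
  end

  def title(title)
    self.command_name = title
  end

  def desc(desc)
    self.command_desc = desc
  end

  # Nested commands are collected by instance_eval-ing the block against a
  # fresh Attrs instance, exactly as in the original class.
  def command(title, &block)
    @commands << self.class.new(klass).tap do |k|
      k.title(title)
      k.instance_eval(&block)
    end
  end
end

attrs = Attrs.new(Object)
attrs.title('weather')
attrs.command('today') { desc 'Show the forecast for today' }
puts attrs.commands.first.command_name # => today
```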
// http://developer.yahoo.com/yui/articles/hosting/?button&connection&container&datatable&dragdrop&json&menu&utilities&MIN&nocombine&norollup&basepath&[{$shop-%3Ebasetpldir}]yui/build/
YAHOO.namespace( 'YAHOO.oxid' );
var $ = YAHOO.util.Dom.get,
$D = YAHOO.util.Dom,
$E = YAHOO.util.Event;
//--------------------------------------------------------------------------------
YAHOO.oxid.aoc = function( elContainer , aColumnDefs , sDataSource , oConfigs )
{
YAHOO.widget.DataTable.MSG_EMPTY = "";
/**
* Count of items per screen
*
* @var int
*/
this.viewSize = 25;
/**
* Count of views to cache
*
* @var int
*/
this.viewCount = 100;
/**
* Count of DB records to fetch from DB, includes cache
*
* @var int
*/
this.dataSize = this.viewSize * this.viewCount;
/**
* Maximum number of cached pages
*
* @var int
*/
this.maxCacheEntries = 20;
/**
* Response data array
*
* @var array
*/
this.viewResponse = {};
/**
* Request array
*
* @var array
*/
this.aRequests = {};
/**
* Actual object copy
*
* @var object
*/
var me = this;
/**
* Internal counter
*
* @var int
*/
this.evtCtr = 0;
/**
* Internal counter
*
* @var int
*/
this.inpCtr = 0;
/**
*
*/
this.focusCtr = 0;
/**
* Container name
*
* @var string
*/
this.elContainer = elContainer;
this._elColGroup = null;
this._iVisibleCount = 0;
this.aColsTohide = [];
/**
* Overridable method to add specific parameters to server request.
* Useful when creating specific additional functionalities like
* filters, category selectors or so.
*
* @param string sRequest initial request
*
* @return string
*/
this.modRequest = function( sRequest )
{
return sRequest;
};
/**
* Formats request URL which is executes by AJAX request. URL is
* formatted by adding:
* - start index ( startIndex );
* - qty. of results ( results );
* - sorting information ( dir, sort );
* - filter data array ( aFilter[column] );
* - displayable columns info array ( aCols[] )
* - adding user defined params ( this.modRequest() ).
*
* @param int iStartIndex
*
* @return string
*/
this.getRequest = function( iStartIndex )
{
me.iStartIndex = iStartIndex;
var sRequest = '&startIndex=' + iStartIndex + '&results=' + this.dataSize;
// attaching ordering
if(me.sortCol){
sRequest += '&dir=' + me.sortDir + '&sort=' + me.sortCol;
}
// attaching filter
if ( me._aFilters ) {
for ( var i=0; i < me._aFilters.length; i++ ) {
if ( me._aFilters[i].name && me._aFilters[i].value ) {
sRequest += '&aFilter[' + me._aFilters[i].name + ']=' + encodeURIComponent( me._aFilters[i].value );
}
}
}
// only visible columns
for ( var i=0, col; col = aColumnDefs[i]; i++ ) {
if ( col.visible ) {
sRequest += '&aCols[]='+ col.key;
}
}
sRequest = me.modRequest( sRequest );
return sRequest;
};
// if config information is not passed by user - creating empty object
if ( !oConfigs ){
oConfigs = {};
}
/**
* Initial request URL which is executed to fetch initial data for data table
*
* @var string
*/
oConfigs.initialRequest = this.getRequest( 0 ) + '&results=' + ( this.viewSize );
/**
* Scrollbar renderer, which is used for data paging
*
* @return null
*/
this.renderScrollBar = function()
{
// selecting height calculation object
if ( this._elTbody.rows.length ) {
var oReg = this._elTbody.rows;
} else {
var oReg = this._elMsgTbody.rows;
}
var aReg = $D.getRegion( oReg[0] );
this.rowHeight = (aReg.bottom - aReg.top - 1);
// setting scroll div layer
$D.setStyle( this._elScroll, 'height', ( this.rowHeight * this.viewSize )+ 'px' );
$D.setStyle( this._elScrollView, 'height', ( this.rowHeight * this.totalRecords ) + 'px' );
var aTReg = $D.getRegion( this.getTheadEl() );
var hHeight = ( aTReg.bottom - aTReg.top );
if ( this.elFilterHead ) {
var aTReg = $D.getRegion( this.elFilterHead );
hHeight += ( aTReg.bottom - aTReg.top );
}
$D.setStyle( this._elScroll, 'margin-top', ( hHeight + 1 ) + 'px' );
// subscribing event listener on scroll bar
$E.on( this._elScroll, 'scroll', this.scrollTo, this );
this.iScrollOffset = $D.getY( this._elScroll );
};
/**
* Resets scrollbar info after data reload
*
* @return null
*/
this.resetScrollBar = function()
{
if ( this._elTbody.rows.length ) {
var oReg = this._elTbody.rows;
} else {
var oReg = this._elMsgTbody.rows;
}
var aReg = $D.getRegion( oReg[0] );
this.rowHeight = ( aReg.bottom - aReg.top - 1);
$D.setStyle( this._elScrollView, 'height', (this.rowHeight*this.totalRecords)+ 'px' );
};
/**
* Scrollbar manager
*
* @param object e event
* @param object oScroll scrollbar object
*
* @return null
*/
this.scrollTo = function( e, oScroll )
{
me.evtCtr++;
var sCall = me.evtCtr;
setTimeout( function () { me.scrollDo( sCall ); }, 100 );
};
/**
* Calculates scroll direction + offset and loads new data to render
*
* @param int evtCtr counter to reduce server load
*
* @return null
*/
this.scrollDo = function( evtCtr )
{
if ( me.evtCtr == evtCtr )
{
// initial scroll
var iOffset = Math.round( ( $D.getY( me ._elScrollView ) - me.iScrollOffset ) / me.rowHeight );
iOffset = Math.min( iOffset, 0 );
var iStartRecordIndex = iOffset * - 1;
var dataSize = me.viewSize * me.viewCount;
var page1 = Math.floor( iStartRecordIndex / dataSize );
var page2 = Math.floor( ( iStartRecordIndex + me.viewSize ) / dataSize );
var sRequest = me.getRequest( page1 * dataSize );
if ( page1 != page2 )
var sKey = sRequest + "initDataView_ff";
else {
var sKey = sRequest + "initDataView_fl";
}
if ( sKey in me.aRequests ) {
return;
}
// Do we need second page ?
if ( page1 != page2 ) {
sKey = me.getRequest(page2 * dataSize) + "me.initDataView_l";
}
if ( sKey in me.aRequests ) {
return;
}
me.evtCtr = 0;
me.getPage( iOffset * - 1 );
}
};
/**
* Calculates page position information and send request to fetch data
* to render in data table
*
* @param int iStartRecordIndex
*
* @return null
*/
this.getPage = function( iStartRecordIndex )
{
var dataSize = me.viewSize * me.viewCount;
var page1 = Math.floor( iStartRecordIndex / dataSize );
var page2 = Math.floor( ( iStartRecordIndex + me.viewSize ) / dataSize );
me.viewResponse = {results:[]};
me.iStartRecordIndex = Math.min( iStartRecordIndex, Math.max( ( me.totalRecords - me.viewSize ), 0 ) );
var sRequest = me.getRequest( page1 * dataSize );
if ( page1 != page2 ) {
me.aRequests[sRequest + "initDataView_ff"] = 1;
me.oDataSource.sendRequest( sRequest, me.initDataView_ff, me );
} else {
me.aRequests[sRequest + "initDataView_fl"] = 1;
me.oDataSource.sendRequest( sRequest, me.initDataView_fl, me );
}
// Do we need second page ?
if ( page1 != page2 ) {
var sRequest = me.getRequest( page2 * dataSize );
me.aRequests[sRequest + "initDataView_l"] = 1;
me.oDataSource.sendRequest( sRequest, me.initDataView_l, me );
}
};
this.initDataView_fl = function( sRequest, oResponse )
{
me.initDataView( sRequest, oResponse, 1, 1, "initDataView_fl" );
};
this.initDataView_ff = function( sRequest, oResponse )
{
me.initDataView( sRequest, oResponse, 1, 2, "initDataView_ff" );
};
this.initDataView_l = function( sRequest, oResponse )
{
me.initDataView( sRequest, oResponse, 2, 2, "initDataView_l" );
};
/**
* Initiates data response data, which further is delivered and populated by
* datatable object
*
* @param string sRequest action request URL
* @param object oResponse response object
* @param int nr response page number
* @param int total total number of responses
* @param string request type
*
* @return null
*/
this.initDataView = function( sRequest, oResponse, nr, total, sRequestType )
{
var page = Math.floor( me.iStartRecordIndex / (me.viewSize * me.viewCount ) ) + ( nr - 1 );
var cacheSize = me.viewSize * me.viewCount;
var startIndex = me.iStartRecordIndex;
if ( ( startIndex - cacheSize ) > 0 ) {
startIndex = startIndex - Math.floor( me.iStartRecordIndex / cacheSize ) * cacheSize;
}
var iCnt = 0;
for ( var i = startIndex; i < startIndex + me.viewSize; i++ ) {
me.viewResponse.results[iCnt] = oResponse.results[i];
iCnt++;
}
if ( nr == total ) {
me.totalRecords = oResponse.totalRecords;
me.sortCol = oResponse.sortCol;
me.sortDir = oResponse.sortDir;
me.onDataReturnInitializeTable(sRequest, me.viewResponse);
}
var sKey = sRequest + sRequestType;
if ( sKey in me.aRequests ) {
delete me.aRequests[sKey];
}
};
/**
* Executes sorting call
*
* @param object oColumn sortable column
*
* @return null
*/
this.sortColumn = function( oColumn )
{
// Which direction
var sDir = 'asc';
// Already sorted?
if ( oColumn.key === me.sortCol ) {
sDir = ( me.sortDir === 'asc' ) ? 'desc' : 'asc';
}
me.sortCol = oColumn.key;
me.sortDir = sDir;
me.set( 'sortedBy', { key:oColumn.key, dir:(sDir === 'asc') ? YAHOO.widget.DataTable.CLASS_ASC : YAHOO.widget.DataTable.CLASS_DESC, column:oColumn } );
me.getPage( 0 );
};
/**
* Forms HTML container to store data table and scroll components
*
* @return null
*/
this.addScrollBar = function()
{
$(elContainer).innerHTML = "<table class=\"oxid-aoc\" border=\"0\" cellpadding=\"0\" cellspacing=\"0\"><tr><td id=\""+elContainer+"_bg\" class=\"oxid-aoc-table\" valign=\"top\"><div id=\""+elContainer+"_c\"><\/div><\/td><td valign=\"top\" height=\"100%\"><div dir=RTL class=\"oxid-aoc-scrollbar\" id=\""+elContainer+"_s\"><div id=\""+elContainer+"_v\"><\/div><\/div><\/td><\/tr><\/table>";
this._elScroll = $(elContainer + '_s' );
this._elScrollView = $(elContainer + '_v' );
};
/**
* Adds filters into data table component
*
* @return null
*/
this.addFilters = function()
{
this._aFilters = [];
var elTr = document.createElement( 'tr' );
// just adding class the same as headers
$D.addClass( elTr, 'yui-dt-first yui-dt-last' );
elTr.id = "yui-dt" + this._nIndex + "-hdrow1";
for ( var i=0, col; col = aColumnDefs[i]; i++ ) {
var elInput = document.createElement( 'input' );
elInput.setAttribute( "name", aColumnDefs[i].key );
elInput.setAttribute( "value", "" );
elInput.style.width = '95%';
this._aFilters[i] = elInput;
$E.on( elInput, 'keyup', this.waitForfilter, this );
$E.on( elInput, 'click', elInput.focus );
elInput.focused = false;
var elTd = document.createElement( 'th' );
elTd.id = "yui-dt" + this._nIndex + "-filter_" + i;
$D.addClass( elTd, 'yui-dt-resizeable yui-dt-sortable' );
var elDiv = document.createElement( 'div' );
elDiv.style.padding = '2px';
elDiv.style.overflow = 'hidden';
elDiv.appendChild( elInput );
elTd.appendChild( elDiv );
elTr.appendChild( elTd );
}
this.elFilterHead = document.createElement( 'thead' );
this.elFilterHead.appendChild( elTr );
// adding filters
var elThead = this.getTheadEl();
elThead.parentNode.insertBefore( this.elFilterHead, elThead.parentNode.firstChild );
};
/**
* Checks if user input is allowed. Returns false if not
*
* @param int nKeyCode key code
*
* @return bool
*/
this.isIgnoreKey = function( nKeyCode )
{
if ( ( nKeyCode == 9 ) || ( nKeyCode == 13 ) || // tab, enter
(nKeyCode == 16) || (nKeyCode == 17) || // shift, ctl
(nKeyCode >= 18 && nKeyCode <= 20) || // alt,pause/break,caps lock
(nKeyCode == 27) || // esc
(nKeyCode >= 33 && nKeyCode <= 35) || // page up,page down,end
(nKeyCode >= 36 && nKeyCode <= 40) || // home,left,up, right, down
(nKeyCode >= 44 && nKeyCode <= 45)) { // print screen,insert
return true;
}
return false;
};
/**
* Filter manager
*
* @param object e event
*
* @return null
*/
this.waitForfilter = function( e )
{
if ( me.isIgnoreKey( e.keyCode ) )
return false;
me.inpCtr++;
var sCall = me.inpCtr;
var sSecondParam = $E.getTarget( e, false );
setTimeout( function( ) { me.filterBy( sCall, sSecondParam ); }, 100 );
};
/**
* Performs filter call and loads new data to render
*
* @param int inpCtr counter to reduce server load
*
* @return null
*/
this.filterBy = function( inpCtr, oTarget )
{
if ( me.inpCtr == inpCtr ) {
me.inpCtr = 0;
me.getPage( 0 );
if ( me._aFilters && oTarget ) {
for ( var i=0; i < me._aFilters.length; i++ ) {
if ( me._aFilters[i] == oTarget ) {
me._aFilters[i].focused = true;
} else {
me._aFilters[i].focused = false;
}
}
}
}
};
this.onMouseDown = function( e )
{
if ( !(e.event.shiftKey || e.event.ctrlKey || ((navigator.userAgent.toLowerCase().indexOf("mac") != -1) && e.event.metaKey)) && !this.isSelected(e.target) ) {
this.onEventSelectRow(e);
}
};
/**
* Initializes and mounts drag&drop on dataTable component
*
* @return null
*/
this.addDD = function()
{
this.subscribe( 'rowClickEvent', this.onEventSelectRow );
this.subscribe( 'rowMousedownEvent', this.onMouseDown );
YAHOO.util.DDM.mode = YAHOO.util.DDM.INTERSECT;
me.dd = new YAHOO.util.DDProxy( this.elContainer+'_bg', 'aoc', { resizeFrame: false , centerFrame :true} );
me.dd.endDrag = this.endDrag;
me.dd.onDragEnter = this.onDragEnter;
me.dd.onDragOut = this.onDragOut;
me.dd.onDragDrop = this.onDragDrop;
me.dd.me = me;
me.dd.b4StartDrag = this.b4StartDrag;
};
/**
*
*/
this.onDragDrop = function( e, DDArray )
{
me.afterDrop( me, DDArray[0].me );
};
/**
* Overridable method to add specific parameters to server request.
* Useful when creating specific additional functionalities like
* filters, category selectors or so.
*
* @return null
*/
this.getDropAction = function()
{
};
/**
* Overridable callback function for user defined functionality after
* a failed d&d action
*
* @param object oResponse response object
*
* @return null
*/
this.onFailureCalback = function( oResponse )
{
};
/**
* Overridable callback function for user defined functionality after
* a successful d&d action
*
* @param object oResponse response object
*
* @return null
*/
this.onSuccessCalback = function( oResponse )
{
};
/**
* On success drop action refreshes data tables
*
* @param object oResponse response object
*
* @return null
*/
this.onSuccess = function( oResponse )
{
me.onSuccessCalback( oResponse );
var oSrc = oResponse.argument[0];
oSrc.getDataSource().flushCache();
oSrc.getPage( 0 );
var oTrg = oResponse.argument[1];
oTrg.getDataSource().flushCache();
oTrg.getPage( 0 );
};
/**
* On failure executes onFailureCalback function
*
* @param object oResponse response object
*
* @return null
*/
this.onFailure = function( oResponse )
{
me.onFailureCalback( oResponse );
};
/**
* Returns the drop action URL. Useful when there is a need to pass a different DD action
*
* @return string
*/
this.getDropUrl = function()
{
return me.sDataSource;
};
/**
* Adds and returns query parameters for drop action URL
*
* @return string
*/
this.getDropParams = function() {
var sNextAction = '';
var oBtn = $( me.elContainer + '_btn' );
var blChecked = $D.hasClass( oBtn, 'oxid-aoc-button-checked' );
if ( blChecked ) {
sNextAction += '&all=1';
} else {
var aSelRows = me.getSelectedRows();
for ( var i=0, aRow; aRow = aSelRows[i]; i++ ) {
var oRecord = me.getRecord( aRow );
if ( oRecord ) {
var oData = oRecord.getData();
// building action url
if ( me.aIdentFields ) {
for ( var c=0, sIdent; sIdent = me.aIdentFields[c]; c++ ) {
if ( oData[sIdent] ) {
var dataString = oData[sIdent];
// 2010-01-06: fixing #1580
// (encodeURIComponent replaces the deprecated escape() used originally,
// matching the encoding used elsewhere in this file)
dataString = encodeURIComponent(dataString);
sNextAction += '&'+sIdent + '[]=' + dataString;
}
}
}
}
}
}
return sNextAction;
};
/**
* Executes "after drop" action
*
* @param object oSource drop source
* @param object oTarget drop target
*
* @return null
*/
this.afterDrop = function( oSource, oTarget )
{
var callback = { success: me.onSuccess,
failure: me.onFailure,
argument: [ oSource, oTarget ]
};
YAHOO.util.Connect.asyncRequest( 'GET', me.getDropUrl() + '&' + me.getDropParams() + '&' + me.getDropAction(), callback, null );
};
/**
* Clears CSS classes from D&D after dropping DD proxy
*
* @return null
*/
this.endDrag = function()
{
if ( me.ddtarget ) {
$D.removeClass( me.ddtarget.me._elTbody, 'ddtarget' );
$D.removeClass( me.ddtarget.elContainer, 'ddtarget' );
$D.removeClass( $( me.ddtarget.me.elContainer + '_bg' ), 'ddtarget' );
}
};
/**
* Adds CSS class on drag component
*
* @param object e mouse event
* @param array DDArray array of DD objects
*
* @return null
*/
this.onDragEnter = function( e, DDArray )
{
var tar = YAHOO.util.DragDropMgr.getBestMatch( DDArray );
if ( me.ddtarget ) {
$D.removeClass( me.ddtarget.me._elTbody, 'ddtarget' );
$D.removeClass( $( me.ddtarget.me.elContainer + '_bg' ), 'ddtarget' );
}
$D.addClass( tar.me._elTbody, 'ddtarget' );
$D.addClass( $( tar.me.elContainer + '_bg' ), 'ddtarget' );
me.ddtarget = tar;
};
/**
* Removes CSS class from drag component
*
* @param object e mouse event
* @param array DDArray array of DD objects
*
* @return null
*/
this.onDragOut = function( e, DDArray )
{
var tar = YAHOO.util.DragDropMgr.getBestMatch( DDArray );
$D.removeClass( tar.me._elTbody, 'ddtarget' );
$D.removeClass( $(tar.me.elContainer + '_bg' ), 'ddtarget' );
me.ddtarget = null;
};
/**
* D&D: cancels drag process if no cells are selected. Otherwise shows frame
*
* @param int x position by x axis
* @param int y position by y axis
*
* @return bool
*/
this.b4StartDrag = function( x, y )
{
if ( !me.getSelectedRows().length ) {
me.dd.endDrag();
return false;
}
// show the drag frame
this.showFrame(x, y);
};
/**
* Initializes and adds menu object on column headers
*
* @return null
*/
this.addColumnSelector = function()
{
me.oContextMenu = new YAHOO.widget.ContextMenu( elContainer + '_m', { zindex: 1000, trigger: this._elThead.childNodes } );
me.oContextMenu.clearContent();
var aItems = [];
for ( var i=1,col; col = aColumnDefs[i]; i++ ) {
if ( col.label ) {
aItems[i] = { target:i, text:col.label, checked: col.visible };
}
}
me.oContextMenu.addItems( aItems );
me.oContextMenu.render( document.body );
me.oContextMenu.clickEvent.subscribe( me.onFieldMenuClick );
};
/**
 * Hides or shows the numbered column together with its header and filter cells
 *
 * @param int nr column number
 * @param bool blShow true to show the column, false to hide it
 * @param int startIdx start index (currently unused)
 *
 * @return null
 */
this.hideNumberedColumn = function( nr, blShow, startIdx )
{
var aCols = [ "yui-dt" + this._nIndex + "-th-_" + nr , "yui-dt" + this._nIndex + "-filter_" + nr ];
if ( blShow ) {
this._iVisibleCount++;
this.showColumn( this.getColumn( nr ) );
$D.removeClass( aCols, 'yui-dt-hidden' );
// somehow some elements keep their state
$D.setStyle( $D.getElementsByClassName( 'yui-dt-sortable'), 'display', '');
$D.setStyle( $D.getElementsByClassName( 'yui-dt-col-'+ nr ), 'width', 'auto' );
} else {
this._iVisibleCount--;
this.hideColumn( this.getColumn( nr ) );
$D.addClass( aCols, 'yui-dt-hidden' );
}
};
/**
 * Hides/displays a data table column after the user chooses one from the menu
*
* @param object e click event
* @param array args array with arguments
*
* @return bool
*/
this.onFieldMenuClick = function( e, args ) {
var item = args[1],
checked = !$D.hasClass(item._oAnchor,'yuimenuitemlabel-checked'),
nr = item.cfg.getProperty('target');
item.cfg.setProperty( 'checked', checked );
aColumnDefs[nr].visible = checked;
//Set class for headers / cells
var oColumn = me._oColumnSet.getColumn( aColumnDefs[nr].key );
oColumn.hidden = !checked;
me.hideNumberedColumn( nr, checked, me.getColumn(0)._sId );
me.getPage( 0 );
me.set( 'sortedBy', {} );
return false;
};
/**
* Processing response data to load information:
* - total records found;
* - sorting column;
* - sorting direction.
* Returns parsed response object (which is used by data source class further).
*
* @param object oRequest request object
* @param object oRawResponse response object
* @param object oParsedResponse parsed response object
*
* @return object
*/
this.doBeforeCallback = function( oRequest, oRawResponse, oParsedResponse )
{
oParsedResponse.totalRecords = me.totalRecords = oRawResponse.totalRecords;
oParsedResponse.sortCol = me.sortCol = oRawResponse.sort; // Which column is sorted
oParsedResponse.sortDir = me.sortDir = oRawResponse.dir; // Which sort direction
return oParsedResponse;
};
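// The raw JSON response consumed above is expected to look roughly like this
// (values are illustrative; only the key names are taken from the code):
//
//     { "totalRecords": 120, "sort": "oxtitle", "dir": "asc", "records": [ ... ] }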
/**
* Default cell formatter. Surrounds cell HTML with div's to make view nicer (adds
* hidden overflow property).
*
* @param object elCell table cell element
* @param object oRecord data record object
* @param object oColumn table column object
* @param object oData responce data object
*
* @return null
*/
this.defaultFormatter = function( elCell, oRecord, oColumn, oData )
{
if ( oData ) {
elCell.innerHTML = '<div>'+oData.toString()+'</div>';
} else { // avoiding empty broken cells
elCell.innerHTML = ' ';
}
};
/**
* Binds event listener on assign all button
*/
this.initAssignBtn = function()
{
// only if this action is allowed
var oBtn,sBtn = $( this.elContainer + '_btn' );
if ( sBtn != null ) {
oBtn = new YAHOO.widget.Button(sBtn);
oBtn.on("click", this.assignAll, this );
}
};
/**
 * Returns the URL which is executed on the assignment action.
 * Useful when you need to pass a different action URL than the default
*
* @return string
*/
this.getAssignUrl = function()
{
return me.sDataSource;
};
/**
* Overridable method to add user defined assign parameters
*
* @return string
*/
this.getAssignParams = function()
{
return '';
};
/**
* Returns assign action parameters which are included in action URL
*
* @return string
*/
this.getAssignAction = function()
{
var sRequest = me.getDropAction();
// attaching filter
if ( me._aFilters ) {
for ( var i=0; i < me._aFilters.length; i++ ) {
if ( me._aFilters[i].name && me._aFilters[i].value ) {
sRequest += '&aFilter['+me._aFilters[i].name+']='+ encodeURIComponent( me._aFilters[i].value );
}
}
}
sRequest = me.modRequest( sRequest );
return sRequest+'&all=1';
};
/**
 * Refreshes data tables after the assignment succeeded
*
* @param object oResponse response object
*
* @return null
*/
this.onAssignSuccess = function( oResponse )
{
me.onSuccessCalback( oResponse );
var oSrc = oResponse.argument[0];
oSrc.getDataSource().flushCache();
oSrc.getPage( 0 );
var oTrg = oResponse.argument[1];
oTrg.getDataSource().flushCache();
oTrg.getPage( 0 );
};
/**
* Overridable method which is called when some failure raises while
* assigning object
*
* @return null
*/
this.onAssignFailure = function()
{
};
/**
* Returns assignment target.
* NOTICE: override this on special cases
*
* @return null
*/
this.getAssignTarget = function()
{
// not nice hack ...
if ( me != YAHOO.oxid.container1 )
return YAHOO.oxid.container1;
else
return YAHOO.oxid.container2;
};
/**
* Returns object which was the source for assignment
*
* @return object
*/
this.getAssignSource = function()
{
return me;
};
/**
* Executes "assign all" request
*
* @return null
*/
this.assignAll = function()
{
var callback = { success: me.onAssignSuccess,
failure: me.onAssignFailure,
argument: [ me.getAssignSource(), me.getAssignTarget() ]
};
YAHOO.util.Connect.asyncRequest( 'GET', me.getAssignUrl() + '&' + me.getAssignParams() + '&' + me.getAssignAction(), callback, null );
};
this.setFocusOnFilter = function()
{
// saving information about field which has focus
if ( this._aFilters ) {
for ( var i=0; i < this._aFilters.length; i++ ) {
if ( this._aFilters[i].focused ) {
this._aFilters[i].focus();
}
}
}
};
this.waitForFocus = function( focusCtr )
{
var blContinue = false;
// saving information about field which has focus
if ( me._aFilters ) {
for ( var i=0; i < me._aFilters.length; i++ ) {
if ( me._aFilters[i].focused ) {
blContinue = true;
}
}
}
if ( !blContinue ) {
return;
}
if ( me.focusCtr == focusCtr ) {
me.setFocusOnFilter();
} else {
me.focusCtr++;
var sCall = me.focusCtr + 0;
setTimeout( function( ) { me.waitForFocus( sCall ); }, 100 );
}
};
/**
 * All data columns which come from the server serialized in JSON style
 *
 * @var array
*/
var aDataCols = [];
/**
* Columns which are displayed
*
 * @var array
*/
var aViewCols = [];
/**
 * Name of fields which are used as identifiers. They are not useful for
 * displaying, but are handy while performing special actions
*
* @var array
*/
this.aIdentFields = [];
/**
* Data source URL
*/
this.sDataSource = sDataSource;
// just counting
var iCtr = 0;
var iIdentCtr = 0;
var iHiddenColCtr = 0;
// collecting data/view/ident fields information arrays
for ( var i=0,col; col = aColumnDefs[i]; i++ ) {
if ( !col.ident ) {
var sFormatters;
if ( col.formatter ) {
sFormatters = col.formatter;
} else {
sFormatters = me.defaultFormatter;
}
var blSortable = true;
if ( col.sortable != null )
blSortable = col.sortable;
col.sortable = blSortable;
col.formatter = sFormatters;
col.resizeable = true;
aViewCols[iCtr] = col;
if (!col.visible) {
this.aColsTohide[iHiddenColCtr] = i;
iHiddenColCtr++;
}
} else {
this.aIdentFields[iIdentCtr] = col.key;
iIdentCtr++;
}
aDataCols[iCtr] = col.key;
iCtr++;
}
this._iVisibleCount = aViewCols.length;
// initiating data source
this.oDataSource = new YAHOO.util.DataSource( sDataSource, { maxCacheEntries: this.maxCacheEntries } );
this.oDataSource.responseType = YAHOO.util.DataSource.TYPE_JSON;
this.oDataSource.doBeforeCallback = this.doBeforeCallback;
this.oDataSource.responseSchema = {
resultsList: 'records',
fields: aDataCols
};
// adding scrollbar divs
this.addScrollBar( elContainer );
// constructing datatable component
YAHOO.oxid.aoc.superclass.constructor.call( this, elContainer + '_c', aViewCols, this.oDataSource, oConfigs );
// adding data filters
this.addFilters();
// initializing menu - column selector
this.addColumnSelector();
// initializing drag&drop
this.addDD();
// adding listener on "assign all" button
this.initAssignBtn();
for ( var i=0; i < this.aColsTohide.length; i++ ) {
this.hideNumberedColumn( this.aColsTohide[i], false, i );
}
// table must be 100% width ..
$D.addClass( this.getTableEl(), 'yui-dt-table' );
};
YAHOO.lang.extend( YAHOO.oxid.aoc, YAHOO.widget.DataTable );
/**
* Overriding default data set call to add additional functionality
* which renders scrollbar after data is fetched from server
*
* @param string sRequest request URL
 * @param object oResponse response object
*
* @return null
*/
YAHOO.oxid.aoc.prototype.onDataReturnSetRows = function( sRequest, oResponse, oPayload )
{
YAHOO.oxid.aoc.superclass.onDataReturnSetRows.call( this, sRequest, oResponse, oPayload );
if ( this.blInitScrollBar == null ) {
this.renderScrollBar();
this.blInitScrollBar = true;
} else {
this.resetScrollBar();
}
var oColumn = this.getColumn( this.sortCol );
if ( this.sortDir && this.sortCol && oColumn ) {
this.set( 'sortedBy', { key: this.sortCol, dir:(this.sortDir === 'asc') ? YAHOO.widget.DataTable.CLASS_ASC : YAHOO.widget.DataTable.CLASS_DESC, column:oColumn } );
}
var me = this;
setTimeout( function () { me.waitForFocus( 0 ); }, 100 );
this.hideTableMessage();
};
CREATE TABLE `affectation` (
`affectation` int(8) NOT NULL default '0',
`elu` varchar(10) NOT NULL default '',
`scrutin` varchar(10) NOT NULL default '',
`periode` varchar(10) NOT NULL default '',
`poste` varchar(20) NOT NULL default '',
`bureau` char(3) NOT NULL default '',
`note` longtext NOT NULL,
`decision` char(3) NOT NULL default '',
`candidat` varchar(40) NOT NULL default '',
PRIMARY KEY (`affectation`)
) ENGINE=MyISAM;
-- --------------------------------------------------------
--
-- Table structure for table `agent`
--
CREATE TABLE `agent` (
`agent` varchar(10) NOT NULL default '',
`nom` varchar(40) NOT NULL default '',
`prenom` varchar(40) NOT NULL default '',
`adresse` varchar(80) NOT NULL default '',
`cp` varchar(5) NOT NULL default '',
`ville` varchar(40) NOT NULL default '',
`telephone` varchar(14) NOT NULL default '',
`service` varchar(10) NOT NULL default '',
`poste` varchar(4) NOT NULL default '',
`grade` varchar(10) NOT NULL default '',
PRIMARY KEY (`agent`)
) ENGINE=MyISAM;
-- --------------------------------------------------------
--
-- Table structure for table `bureau`
--
CREATE TABLE `bureau` (
`bureau` char(3) NOT NULL default '',
`libelle` varchar(30) NOT NULL default '',
`canton` varchar(8) NOT NULL default '',
`adresse1` varchar(60) NOT NULL default '',
`adresse2` varchar(60) NOT NULL default '',
`cp` varchar(5) NOT NULL default '',
`ville` varchar(40) NOT NULL default '',
PRIMARY KEY (`bureau`),
UNIQUE KEY `bureau` (`bureau`)
) ENGINE=MyISAM;
-- --------------------------------------------------------
--
-- Table structure for table `candidat`
--
CREATE TABLE `candidat` (
`candidat` int(11) NOT NULL,
`nom` varchar(50) NOT NULL,
`scrutin` varchar(15) NOT NULL,
PRIMARY KEY (`candidat`)
) ENGINE=InnoDB;
-- --------------------------------------------------------
--
-- Table structure for table `candidature`
--
CREATE TABLE `candidature` (
`candidature` int(8) NOT NULL default '0',
`agent` varchar(10) NOT NULL default '',
`scrutin` varchar(10) NOT NULL default '',
`periode` varchar(10) NOT NULL default '',
`poste` varchar(20) NOT NULL default '',
`bureau` char(3) NOT NULL default '',
`recuperation` char(3) NOT NULL default '',
`note` longtext NOT NULL,
`decision` char(3) NOT NULL default '',
`debut` varchar(5) NOT NULL default '',
`fin` varchar(5) NOT NULL default '',
PRIMARY KEY (`candidature`)
) ENGINE=MyISAM;
-- --------------------------------------------------------
--
-- Table structure for table `canton`
--
CREATE TABLE `canton` (
`canton` varchar(8) NOT NULL default '',
`libelle` varchar(40) NOT NULL default '',
PRIMARY KEY (`canton`)
) ENGINE=MyISAM;
-- --------------------------------------------------------
--
-- Table structure for table `collectivite`
--
CREATE TABLE `collectivite` (
`ville` varchar(30) NOT NULL default '',
`logo` varchar(40) NOT NULL default '',
`maire` varchar(40) NOT NULL default '',
`id` int(2) NOT NULL default '0',
PRIMARY KEY (`id`)
) ENGINE=MyISAM;
--
-- Dumping data for table `collectivite`
--
INSERT INTO `collectivite` (`ville`, `logo`, `maire`, `id`) VALUES
('LibreVille / mysql', '../img/logo.png', 'openmairie', 0);
-- --------------------------------------------------------
--
-- Table structure for table `composition_bureau`
--
CREATE TABLE `composition_bureau` (
`scrutin` varchar(15) NOT NULL default '',
`bureau` char(3) NOT NULL default '',
`president` varchar(80) NOT NULL default '',
`president_suppleant` varchar(80) NOT NULL default '',
`secretaire` varchar(80) NOT NULL default ''
) ENGINE=MyISAM;
-- --------------------------------------------------------
--
-- Table structure for table `composition_bureau_agent`
--
CREATE TABLE `composition_bureau_agent` (
`scrutin` varchar(15) NOT NULL default '',
`bureau` char(3) NOT NULL default '',
`secretaire_matin` varchar(80) NOT NULL default '',
`secretaire_apm` varchar(80) NOT NULL default '',
`planton_matin` varchar(80) NOT NULL default '',
`planton_apm` varchar(80) NOT NULL default ''
) ENGINE=MyISAM;
--
-- Dumping data for table `composition_bureau_agent`
--
-- --------------------------------------------------------
--
-- Table structure for table `droit`
--
CREATE TABLE `droit` (
`droit` varchar(30) NOT NULL default '',
`profil` int(2) NOT NULL default '0',
PRIMARY KEY (`droit`)
) ENGINE=MyISAM;
--
-- Dumping data for table `droit`
--
INSERT INTO `droit` (`droit`, `profil`) VALUES
('utilisateur', 5),
('droit', 5),
('profil', 5),
('collectivite', 4),
('edition', 4),
('txtab', 4),
('agent', 3),
('elu', 3),
('scrutin', 3),
('service', 3),
('grade', 3),
('bureau', 4),
('canton', 4),
('periode', 4),
('poste', 4);
-- --------------------------------------------------------
--
-- Table structure for table `elu`
--
CREATE TABLE `elu` (
`elu` varchar(10) NOT NULL default '',
`nom` varchar(40) NOT NULL default '',
`prenom` varchar(40) NOT NULL default '',
`nomjf` varchar(40) NOT NULL default '',
`date_naissance` date NOT NULL default '0000-00-00',
`lieu_naissance` varchar(40) NOT NULL default '',
`adresse` varchar(80) NOT NULL default '',
`cp` varchar(5) NOT NULL default '',
`ville` varchar(40) NOT NULL default '',
PRIMARY KEY (`elu`)
) ENGINE=MyISAM;
-- --------------------------------------------------------
--
-- Table structure for table `grade`
--
CREATE TABLE `grade` (
`grade` varchar(10) NOT NULL default '',
`libelle` varchar(40) NOT NULL default '',
PRIMARY KEY (`grade`)
) ENGINE=MyISAM;
-- --------------------------------------------------------
--
-- Table structure for table `periode`
--
CREATE TABLE `periode` (
`periode` varchar(15) NOT NULL default '',
`libelle` varchar(40) NOT NULL default '',
`debut` varchar(5) NOT NULL default '00:00',
`fin` varchar(5) NOT NULL default '00:00',
PRIMARY KEY (`periode`)
) ENGINE=MyISAM;
--
-- Dumping data for table `periode`
--
INSERT INTO `periode` (`periode`, `libelle`, `debut`, `fin`) VALUES
('matin', '7 heures a 13 heures', '07:00', '13:00'),
('apres-midi', '13 heures a la fin', '13:00', '22:00'),
('journee', '7 heures a la fin', '07:00', '23:00');
-- --------------------------------------------------------
--
-- Table structure for table `poste`
--
CREATE TABLE `poste` (
`poste` varchar(20) NOT NULL default '',
`nature` varchar(15) NOT NULL default '',
`ordre` int(11) NOT NULL default '0',
PRIMARY KEY (`poste`)
) ENGINE=MyISAM;
--
-- Dumping data for table `poste`
--
INSERT INTO `poste` (`poste`, `nature`, `ordre`) VALUES
('SECRETAIRE', 'candidature', 0),
('PLANTON', 'candidature', 0),
('PRESIDENT', 'affectation', 1),
('DELEGUE TITULAIRE', 'affectation', 5),
('ASSESSEUR TITULAIRE', 'affectation', 3),
('ASSESSEUR SUPPLEANT', 'affectation', 4),
('AGENT CENTRALISATION', 'candidature', 0),
('DELEGUE SUPPLEANT', 'affectation', 6),
('PRESIDENT SUPPLEANT', 'affectation', 2);
-- --------------------------------------------------------
--
-- Table structure for table `profil`
--
CREATE TABLE `profil` (
`profil` int(2) NOT NULL default '0',
`libelle_profil` varchar(30) NOT NULL default '',
PRIMARY KEY (`profil`)
) ENGINE=MyISAM;
--
-- Dumping data for table `profil`
--
INSERT INTO `profil` (`profil`, `libelle_profil`) VALUES
(5, 'ADMINISTRATEUR'),
(4, 'SUPER UTILISATEUR'),
(3, 'UTILISATEUR'),
(2, 'UTILISATEUR LIMITE'),
(1, 'CONSULTATION');
-- --------------------------------------------------------
--
-- Table structure for table `scrutin`
--
CREATE TABLE `scrutin` (
`scrutin` varchar(15) NOT NULL default '',
`libelle` varchar(40) NOT NULL default '',
`tour` char(1) NOT NULL default '',
`canton` char(1) NOT NULL default '',
`date_scrutin` date NOT NULL default '0000-00-00',
`solde` char(3) NOT NULL default '',
`convocation_agent` varchar(100) NOT NULL default '',
`convocation_president` varchar(100) NOT NULL default '',
PRIMARY KEY (`scrutin`)
) ENGINE=MyISAM;
-- --------------------------------------------------------
--
-- Table structure for table `service`
--
CREATE TABLE `service` (
`service` varchar(10) NOT NULL default '',
`libelle` varchar(40) NOT NULL default '',
PRIMARY KEY (`service`)
) ENGINE=MyISAM;
-- --------------------------------------------------------
--
-- Table structure for table `utilisateur`
--
CREATE TABLE `utilisateur` (
`idUtilisateur` int(8) NOT NULL default '0',
`nom` varchar(30) NOT NULL default '',
`Login` varchar(30) NOT NULL default '',
`Pwd` varchar(100) NOT NULL default '',
`profil` int(2) NOT NULL default '0',
`email` varchar(40) NOT NULL default '',
PRIMARY KEY (`idUtilisateur`)
) ENGINE=MyISAM;
--
-- Dumping data for table `utilisateur`
--
INSERT INTO `utilisateur` (`idUtilisateur`, `nom`, `Login`, `Pwd`, `profil`, `email`) VALUES
(1, 'admin', 'admin', '21232f297a57a5a743894a0e4a801fc3', 5, '[email protected]'),
(2, 'demo', 'demo', 'fe01ce2a7fbac8fafaed7c982a04e229', 4, '[email protected]');
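--
-- Note: the `Pwd` values above are the plain MD5 hashes of the login names
-- ('admin' and 'demo'). A login check could therefore be sketched as below
-- (illustrative only; MD5 is not a safe password hash by today's standards):
--
-- SELECT idUtilisateur FROM utilisateur WHERE Login = 'admin' AND Pwd = MD5('admin');
--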
-- --------------------------------------------------------
--
-- Table structure for table `utilisateur_seq`
--
CREATE TABLE `utilisateur_seq` (
`id` int(10) unsigned NOT NULL default '0',
PRIMARY KEY (`id`)
) ENGINE=MyISAM;
--
-- Dumping data for table `utilisateur_seq`
--
INSERT INTO `utilisateur_seq` (`id`) VALUES
(2);
<?php
declare(strict_types=1);
namespace SerendipityHQ\Bundle\FeaturesBundle\Model;
use Doctrine\ORM\Mapping as ORM;
use Money\Currency;
use SerendipityHQ\Component\ValueObjects\Money\Money;
use SerendipityHQ\Component\ValueObjects\Money\MoneyInterface;
use function Safe\sprintf;
/**
* @ORM\MappedSuperclass
* @ORM\HasLifecycleCallbacks
*/
abstract class Invoice implements InvoiceInterface
{
private const SECTION_DEFAULT = '_default';
/** @ORM\Column(name="currency", type="currency") */
private Currency $currency;
/** @ORM\Column(name="issued_on", type="datetime") */
private \DateTimeInterface $issuedOn;
/**
* @var InvoiceSection[]
*
* @ORM\Column(name="`sections`", type="json")
*/
private array $sections = [];
/** @ORM\Column(name="gross_total", type="money") */
private MoneyInterface $grossTotal;
/** @ORM\Column(name="net_total", type="money") */
private MoneyInterface $netTotal;
/** @param Currency|string $currency */
public function __construct($currency)
{
if ( ! $currency instanceof Currency) {
$currency = new Currency($currency);
}
$this->currency = $currency;
// Set the issue date
if (false === isset($this->issuedOn)) {
// Create it with microseconds, so the issuedOn value can be used to create a unique invoice number (http://stackoverflow.com/a/28937386/1399706)
$this->issuedOn = \DateTime::createFromFormat('U.u', \microtime(true));
}
if (false === isset($this->grossTotal)) {
$this->grossTotal = new Money([MoneyInterface::BASE_AMOUNT => 0, MoneyInterface::CURRENCY => $this->getCurrency()]);
}
if (false === isset($this->netTotal)) {
$this->netTotal = new Money([MoneyInterface::BASE_AMOUNT => 0, MoneyInterface::CURRENCY => $this->getCurrency()]);
}
// Generate the Invoice number
$this->generateNumber();
}
public function getHeader(): ?InvoiceSectionHeader
{
return $this->sections[self::SECTION_DEFAULT]->getHeader();
}
public function hasHeader(): bool
{
return $this->sections[self::SECTION_DEFAULT]->hasHeader();
}
public function removeHeader(): InvoiceSection
{
return $this->sections[self::SECTION_DEFAULT]->removeHeader();
}
public function setHeader(InvoiceSectionHeader $header)
{
$this->sections[self::SECTION_DEFAULT]->setHeader($header);
}
public function addLine(InvoiceLine $line, string $id = null): InvoiceInterface
{
if (false === isset($this->sections[self::SECTION_DEFAULT])) {
$this->sections[self::SECTION_DEFAULT] = new InvoiceSection($this->getCurrency());
}
$this->sections[self::SECTION_DEFAULT]->addLine($line, $id);
$this->recalculateTotal();
return $this;
}
public function getLine($id): InvoiceLine
{
return $this->sections[self::SECTION_DEFAULT]->getLine($id);
}
public function getLines(): array
{
return $this->sections[self::SECTION_DEFAULT]->getLines();
}
public function hasLine($id): bool
{
return $this->sections[self::SECTION_DEFAULT]->hasLine($id);
}
public function removeLine($id)
{
$return = $this->sections[self::SECTION_DEFAULT]->removeLine($id);
$this->recalculateTotal();
return $return;
}
public function addSection(InvoiceSection $section, string $id = null): InvoiceInterface
{
if ($this->getCurrency()->getCode() !== $section->getCurrency()->getCode()) {
throw new \LogicException(sprintf('The Sections and the Invoice to which you add it MUST have the same currency code. Invoice has code "%s" while Section has code "%s".', $this->getCurrency()->getCode(), $section->getCurrency()->getCode()));
}
switch (\gettype($id)) {
case 'string':
case 'integer':
if ($this->hasSection($id)) {
throw new \LogicException(sprintf('The section "%s" already exists. You cannot add it again', $id));
}
$this->sections[$id] = $section;
break;
case 'NULL':
$this->sections[] = $section;
break;
default:
throw new \InvalidArgumentException(sprintf('Invalid $id type. Accepted types are "string, "integer" and "null". You passed "%s".', \gettype($id)));
}
// Set the new Total
$this->grossTotal = $this->getGrossTotal()->add($section->getGrossTotal());
$this->netTotal = $this->getNetTotal()->add($section->getNetTotal());
return $this;
}
public function getSection($id): ?InvoiceSection
{
if (self::SECTION_DEFAULT === $id && false === isset($this->sections[self::SECTION_DEFAULT])) {
$this->sections[self::SECTION_DEFAULT] = new InvoiceSection($this->getCurrency());
}
return $this->sections[$id] ?? null;
}
public function getSections(): array
{
return $this->sections;
}
public function hasSection($id): bool
{
if (false === \is_string($id) && false === \is_int($id)) {
throw new \InvalidArgumentException(sprintf('Only strings or integers are accepted as $id. "%s" passed.', \gettype($id)));
}
return isset($this->sections[$id]);
}
public function removeSection($id)
{
if ($this->hasSection($id)) {
$return = $this->sections[$id];
unset($this->sections[$id]);
$this->recalculateTotal();
return $return;
}
return false;
}
public function getCurrency(): Currency
{
return $this->currency;
}
/**
* @return \DateTime|\DateTimeImmutable
*/
public function getIssuedOn(): \DateTimeInterface
{
return $this->issuedOn;
}
public function getGrossTotal(): MoneyInterface
{
return $this->grossTotal;
}
public function getNetTotal(): MoneyInterface
{
return $this->netTotal;
}
public function jsonSerialize(): array
{
return $this->sections;
}
/**
* @ORM\PostLoad()
*/
public function jsonUnserialize()
{
foreach ($this->sections as $sectionId => $section) {
$hydratingSection = new InvoiceSection($this->getCurrency());
// Set the Header if it exists
if (isset($section['_header'])) {
$header = new InvoiceSectionHeader($section['_header']);
$hydratingSection->setHeader($header);
}
// Add lines
foreach ($section['_lines'] as $lineId => $line) {
$lineObject = new InvoiceLine();
$lineObject->hydrate($line);
$hydratingSection->addLine($lineObject, $lineId);
}
$this->sections[$sectionId] = $hydratingSection;
}
$this->recalculateTotal();
}
/**
* Recalculates the total of the invoice.
*/
private function recalculateTotal()
{
$this->grossTotal = new Money([MoneyInterface::BASE_AMOUNT => 0, MoneyInterface::CURRENCY => $this->getCurrency()]);
$this->netTotal = new Money([MoneyInterface::BASE_AMOUNT => 0, MoneyInterface::CURRENCY => $this->getCurrency()]);
/** @var InvoiceSection $section */
foreach ($this->getSections() as $section) {
$this->grossTotal = $this->grossTotal->add($section->getGrossTotal());
$this->netTotal = $this->netTotal->add($section->getNetTotal());
}
}
}
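// Minimal usage sketch (hypothetical: Invoice is abstract, so a concrete
// subclass is assumed, and InvoiceLine construction details may differ):
//
//     final class SimpleInvoice extends Invoice {}
//
//     $invoice = new SimpleInvoice('EUR');
//     $invoice->addLine($someInvoiceLine, 'setup-fee');
//     $gross = $invoice->getGrossTotal(); // total recalculated over all sections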
GH.ProofGenerator.associatorRight = function(prover) {
this.prover = prover;
this.expectedForm = GH.sExpression.fromString('(operator (operator A B) C)');
};
// A set of operators and the theorem for associating them.
// The first theorem is used when the expression is part of another expression or is unproven.
// The second theorem is used when the expression is proven.
GH.ProofGenerator.associatorRight.OPERATIONS = [
['<->', 'biass', 'biassi'],
['/\\', 'anass', 'anassi'],
['\\/', 'orass', 'orassi'],
[ '+' , 'addass', 'addass'],
[ '*' , 'mulass', 'mulass'],
['u.' , 'unass', 'unass'],
['i^i', 'inass', 'inass'],
];
GH.ProofGenerator.associatorRight.getActionName = function (sexp) {
var associateOperations = GH.ProofGenerator.associatorRight.OPERATIONS;
for (var i = 0; i < associateOperations.length; i++) {
if (sexp.operator == associateOperations[i][0]) {
if (!sexp.isProven) {
return associateOperations[i][1];
} else {
return associateOperations[i][2];
}
}
}
return null;
};
GH.ProofGenerator.associatorRight.prototype.action = function(sexp) {
var hyps = (!sexp.isProven) ? this.prover.getHyps(sexp, this.expectedForm) : [];
var name = GH.ProofGenerator.associatorRight.getActionName(sexp);
return new GH.action(name, hyps);
};
GH.ProofGenerator.associatorRight.prototype.isApplicable = function(sexp) {
if (this.prover.getHyps(sexp, this.expectedForm) == null) {
return false;
}
if (sexp.operator.toString() != sexp.left().operator.toString()) {
return false;
}
var action = this.action(sexp);
return this.prover.symbolDefined(action.name);
};
GH.ProofGenerator.associatorRight.prototype.inline = function(sexp) { return false; };
GH.ProofGenerator.associatorRight.prototype.canAddTheorem = function(sexp) { return false; };
// Apply the associative property on an s-expression. This does the same thing as
// reposition(result, 'A (B C)', '(A B) C');
GH.ProofGenerator.associatorLeft = function(prover) {
this.prover = prover;
this.expectedForm = GH.sExpression.fromString('(operator A (operator B C))');
};
GH.ProofGenerator.associatorLeft.OPERATIONS = [
['<->', 'biassl', 'biassli'],
['/\\', 'anassl', 'anassli'],
['\\/', 'orassl', 'orassli'],
[ '+' , 'addassl', 'addassl'],
[ '*' , 'mulassl', 'mulassl'],
['u.' , 'unassl', 'unassl'],
['i^i', 'inassl', 'inassl'],
];
GH.ProofGenerator.associatorLeft.prototype.action = function(sexp) {
var hyps = (!sexp.isProven) ? this.prover.getHyps(sexp, this.expectedForm) : [];
var associateOperations = GH.ProofGenerator.associatorLeft.OPERATIONS;
for (var i = 0; i < associateOperations.length; i++) {
if (sexp.operator == associateOperations[i][0]) {
if (!sexp.isProven) {
return new GH.action(associateOperations[i][1], hyps);
} else {
return new GH.action(associateOperations[i][2], hyps);
}
}
}
return null;
};
GH.ProofGenerator.associatorLeft.prototype.isApplicable = function(sexp) {
if (this.prover.getHyps(sexp, this.expectedForm) == null) {
return false;
}
if (sexp.operator.toString() != sexp.right().operator.toString()) {
return false;
}
	var action = this.action(sexp);
	return (action != null) && (action.name != null);
};
GH.ProofGenerator.associatorLeft.prototype.inline = function(sexp) {
var hyps = (!sexp.isProven) ? this.prover.getHyps(sexp, this.expectedForm) : [];
var name = GH.ProofGenerator.associatorRight.getActionName(sexp);
this.prover.print(hyps, name);
var result = this.prover.getLast();
this.prover.commute(result);
return true;
};
GH.ProofGenerator.associatorLeft.prototype.canAddTheorem = function(sexp) { return false; }; | {
"content_hash": "6bc43ad989b44ebde4e5a7ceeacd8147",
"timestamp": "",
"source": "github",
"line_count": 106,
"max_line_length": 95,
"avg_line_length": 33.95283018867924,
"alnum_prop": 0.6854681856071131,
"repo_name": "jkingdon/ghilbert",
"id": "c11f5b0137ba9b54fc0e6d79da44592790040ac9",
"size": "3727",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "js/prover/associator.js",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "CSS",
"bytes": "11821"
},
{
"name": "JavaScript",
"bytes": "285914"
},
{
"name": "Python",
"bytes": "203542"
}
],
"symlink_target": ""
} |
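The associator generators above reposition `A (B C)` into `(A B) C` and back for associative operators. That rewrite can be sketched independently of the prover, assuming s-expressions are modeled as nested Python tuples of the form `(operator, left, right)` (the `reassoc_left`/`reassoc_right` names are illustrative, not part of the ghilbert code):

```python
def reassoc_left(expr):
    """Rewrite (op, A, (op, B, C)) -> (op, (op, A, B), C)."""
    op, a, right = expr
    if not (isinstance(right, tuple) and right[0] == op):
        raise ValueError("right child does not use the same operator")
    _, b, c = right
    return (op, (op, a, b), c)

def reassoc_right(expr):
    """Rewrite (op, (op, A, B), C) -> (op, A, (op, B, C))."""
    op, left, c = expr
    if not (isinstance(left, tuple) and left[0] == op):
        raise ValueError("left child does not use the same operator")
    _, a, b = left
    return (op, a, (op, b, c))

expr = ('+', 'A', ('+', 'B', 'C'))
assert reassoc_left(expr) == ('+', ('+', 'A', 'B'), 'C')
assert reassoc_right(reassoc_left(expr)) == expr  # the two moves are inverses
```

As in `isApplicable` above, the rewrite only makes sense when the inner child uses the same operator as the root, which is why both helpers check it first.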
package kieker.tools.slastic.metamodel.usage;
import org.eclipse.emf.common.util.EList;
import org.eclipse.emf.ecore.EObject;
/**
* <!-- begin-user-doc -->
* A representation of the model object '<em><b>Frequency Distribution</b></em>'.
* <!-- end-user-doc -->
*
* <p>
* The following features are supported:
* <ul>
* <li>{@link kieker.tools.slastic.metamodel.usage.FrequencyDistribution#getValues <em>Values</em>}</li>
* <li>{@link kieker.tools.slastic.metamodel.usage.FrequencyDistribution#getFrequencies <em>Frequencies</em>}</li>
* </ul>
* </p>
*
* @see kieker.tools.slastic.metamodel.usage.UsagePackage#getFrequencyDistribution()
* @model
* @generated
*/
public interface FrequencyDistribution extends EObject {
/**
* Returns the value of the '<em><b>Values</b></em>' attribute list.
* The list contents are of type {@link java.lang.Long}.
* <!-- begin-user-doc -->
* <p>
* If the meaning of the '<em>Values</em>' attribute list isn't clear,
* there really should be more of a description here...
* </p>
* <!-- end-user-doc -->
* @return the value of the '<em>Values</em>' attribute list.
* @see kieker.tools.slastic.metamodel.usage.UsagePackage#getFrequencyDistribution_Values()
* @model
* @generated
*/
EList<Long> getValues();
/**
* Returns the value of the '<em><b>Frequencies</b></em>' attribute list.
* The list contents are of type {@link java.lang.Long}.
* <!-- begin-user-doc -->
* <p>
* If the meaning of the '<em>Frequencies</em>' attribute list isn't clear,
* there really should be more of a description here...
* </p>
* <!-- end-user-doc -->
* @return the value of the '<em>Frequencies</em>' attribute list.
* @see kieker.tools.slastic.metamodel.usage.UsagePackage#getFrequencyDistribution_Frequencies()
* @model unique="false" ordered="false"
* @generated
*/
EList<Long> getFrequencies();
} // FrequencyDistribution
| {
"content_hash": "e61a7e2c41ee36241ee2b51bdf1d25de",
"timestamp": "",
"source": "github",
"line_count": 58,
"max_line_length": 116,
"avg_line_length": 33.06896551724138,
"alnum_prop": 0.67570385818561,
"repo_name": "SLAsticSPE/slastic",
"id": "6e1c9b34f4adde62eace210ce9f61cabd101fe6e",
"size": "1967",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "src-gen/kieker/tools/slastic/metamodel/usage/FrequencyDistribution.java",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Assembly",
"bytes": "157755"
},
{
"name": "C",
"bytes": "734"
},
{
"name": "CSS",
"bytes": "19693"
},
{
"name": "HTML",
"bytes": "759550"
},
{
"name": "Java",
"bytes": "3242359"
},
{
"name": "Perl",
"bytes": "907"
},
{
"name": "R",
"bytes": "78417"
},
{
"name": "Shell",
"bytes": "42152"
}
],
"symlink_target": ""
} |
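The `FrequencyDistribution` interface above models a distribution as two parallel lists of longs, one of values and one of their frequencies. A minimal sketch of the same idea in Python (class and method names are illustrative, not part of the SLAstic metamodel):

```python
from collections import Counter

class FrequencyDistribution:
    """Parallel value/frequency lists, mirroring the EMF interface above."""
    def __init__(self, samples):
        counts = Counter(samples)
        self.values = list(counts.keys())
        self.frequencies = [counts[v] for v in self.values]

    def total(self):
        # Total number of observed samples.
        return sum(self.frequencies)

fd = FrequencyDistribution([1, 1, 2, 3, 3, 3])
assert fd.total() == 6
assert dict(zip(fd.values, fd.frequencies)) == {1: 2, 2: 1, 3: 3}
```

Note the `@model unique="false" ordered="false"` annotation on `getFrequencies()`: frequencies may repeat and carry no inherent order, so the pairing with `getValues()` is positional, exactly as in the `zip` above.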
using System;
namespace TSF.UmlToolingFramework.UML.Classes.Kernel {
public interface Parameter : MultiplicityElement, TypedElement
{
ParameterDirectionKind direction { get; set; }
String _default { get; set; }
ValueSpecification defaultValue { get; set; }
Operation operation { get; set; }
}
}
| {
"content_hash": "ba1b89e54fa7b05311e6453c16029fc6",
"timestamp": "",
"source": "github",
"line_count": 11,
"max_line_length": 64,
"avg_line_length": 28.363636363636363,
"alnum_prop": 0.7339743589743589,
"repo_name": "GeertBellekens/UML-Tooling-Framework",
"id": "5581797c1d3b8c805c9c19cc05e40eaf7b9e2a00",
"size": "312",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "UMLToolingFramework/Classes/Kernel/Parameter.cs",
"mode": "33188",
"license": "bsd-2-clause",
"language": [
{
"name": "C#",
"bytes": "188270"
},
{
"name": "HTML",
"bytes": "3372"
}
],
"symlink_target": ""
} |
class SkBitmap;
namespace gfx {
class Size;
// Interface for encoding and decoding PNG data. This is a wrapper around
// libpng, which has an inconvenient interface for callers. This is currently
// designed for use in tests only (where we control the files), so the handling
// isn't as robust as would be required for a browser (see Decode() for more).
// WebKit has its own more complicated PNG decoder which handles, among other
// things, partially downloaded data.
class PNGCodec {
public:
enum ColorFormat {
// 3 bytes per pixel (packed), in RGB order regardless of endianness.
// This is the native JPEG format.
FORMAT_RGB,
// 4 bytes per pixel, in RGBA order in memory regardless of endianness.
FORMAT_RGBA,
// 4 bytes per pixel, in BGRA order in memory regardless of endianness.
// This is the default Windows DIB order.
FORMAT_BGRA,
// 4 bytes per pixel, in pre-multiplied kARGB_8888_Config format. For use
// with directly writing to a skia bitmap.
FORMAT_SkBitmap
};
// Represents a comment in the tEXt ancillary chunk of the png.
struct Comment {
Comment(const std::string& k, const std::string& t);
~Comment();
std::string key;
std::string text;
};
// Calls PNGCodec::EncodeWithCompressionLevel with the default compression
// level.
static bool Encode(const unsigned char* input,
ColorFormat format,
const Size& size,
int row_byte_width,
bool discard_transparency,
const std::vector<Comment>& comments,
std::vector<unsigned char>* output);
// Encodes the given raw 'input' data, with each pixel being represented as
// given in 'format'. The encoded PNG data will be written into the supplied
// vector and true will be returned on success. On failure (false), the
// contents of the output buffer are undefined.
//
// When writing alpha values, the input colors are assumed to be post
// multiplied.
//
// size: dimensions of the image
// row_byte_width: the width in bytes of each row. This may be greater than
// w * bytes_per_pixel if there is extra padding at the end of each row
// (often, each row is padded to the next machine word).
// discard_transparency: when true, and when the input data format includes
// alpha values, these alpha values will be discarded and only RGB will be
// written to the resulting file. Otherwise, alpha values in the input
// will be preserved.
// comments: comments to be written in the png's metadata.
// compression_level: An integer between -1 and 9, corresponding to zlib's
// compression levels. -1 is the default.
static bool EncodeWithCompressionLevel(const unsigned char* input,
ColorFormat format,
const Size& size,
int row_byte_width,
bool discard_transparency,
const std::vector<Comment>& comments,
int compression_level,
std::vector<unsigned char>* output);
// Call PNGCodec::Encode on the supplied SkBitmap |input|, which is assumed
// to be BGRA, 32 bits per pixel. The params |discard_transparency| and
// |output| are passed directly to Encode; refer to Encode for more
// information. During the call, an SkAutoLockPixels lock is held on |input|.
static bool EncodeBGRASkBitmap(const SkBitmap& input,
bool discard_transparency,
std::vector<unsigned char>* output);
// Decodes the PNG data contained in input of length input_size. The
// decoded data will be placed in *output with the dimensions in *w and *h
// on success (returns true). This data will be written in the 'format'
// format. On failure, the values of these output variables are undefined.
//
// This function may not support all PNG types, and it hasn't been tested
// with a large number of images, so assume a new format may not work. It's
// really designed to be able to read in something written by Encode() above.
static bool Decode(const unsigned char* input, size_t input_size,
ColorFormat format, std::vector<unsigned char>* output,
int* w, int* h);
// Decodes the PNG data directly into the passed in SkBitmap. This is
// significantly faster than the vector<unsigned char> version of Decode()
// above when dealing with PNG files that are >500K, which a lot of theme
// images are. (There are a lot of themes that have a NTP image of about ~1
// megabyte, and those require a 7-10 megabyte side buffer.)
//
// Returns true if data is non-null and can be decoded as a png, false
// otherwise.
static bool Decode(const unsigned char* input, size_t input_size,
SkBitmap* bitmap);
// Create a SkBitmap from a decoded BGRA DIB. The caller owns the returned
// SkBitmap.
static SkBitmap* CreateSkBitmapFromBGRAFormat(
std::vector<unsigned char>& bgra, int width, int height);
private:
DISALLOW_COPY_AND_ASSIGN(PNGCodec);
};
} // namespace gfx
#endif // UI_GFX_CODEC_PNG_CODEC_H_
| {
"content_hash": "2788f5d2160c88d3b81544c56ffe706a",
"timestamp": "",
"source": "github",
"line_count": 121,
"max_line_length": 79,
"avg_line_length": 44.43801652892562,
"alnum_prop": 0.6496187465129254,
"repo_name": "Crystalnix/house-of-life-chromium",
"id": "4632ac14f522f4f4818d966ba2ea565c5b755372",
"size": "5696",
"binary": false,
"copies": "3",
"ref": "refs/heads/master",
"path": "ui/gfx/codec/png_codec.h",
"mode": "33188",
"license": "bsd-3-clause",
"language": [
{
"name": "Assembly",
"bytes": "3418"
},
{
"name": "C",
"bytes": "88445923"
},
{
"name": "C#",
"bytes": "73756"
},
{
"name": "C++",
"bytes": "77228136"
},
{
"name": "Emacs Lisp",
"bytes": "6648"
},
{
"name": "F#",
"bytes": "381"
},
{
"name": "Go",
"bytes": "3744"
},
{
"name": "Java",
"bytes": "11354"
},
{
"name": "JavaScript",
"bytes": "6191433"
},
{
"name": "Objective-C",
"bytes": "4023654"
},
{
"name": "PHP",
"bytes": "97796"
},
{
"name": "Perl",
"bytes": "92217"
},
{
"name": "Python",
"bytes": "5604932"
},
{
"name": "Ruby",
"bytes": "937"
},
{
"name": "Shell",
"bytes": "1234672"
},
{
"name": "Tcl",
"bytes": "200213"
}
],
"symlink_target": ""
} |
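The `Encode()` documentation above explains two details worth making concrete: `row_byte_width` may exceed `width * bytes_per_pixel` because rows are often padded to a machine-word boundary, and `discard_transparency` drops the alpha channel from 4-byte-per-pixel input. Both can be sketched in a few lines of Python (assumptions: 4-byte row alignment and RGBA byte order; this is an illustration of the layout, not Chromium code):

```python
def padded_row_width(width, bytes_per_pixel, align=4):
    """Round the raw row size up to the next multiple of `align`."""
    raw = width * bytes_per_pixel
    return (raw + align - 1) // align * align

def strip_alpha(row_rgba):
    """Drop every 4th byte (alpha) from an RGBA row, keeping RGB."""
    return bytes(b for i, b in enumerate(row_rgba) if i % 4 != 3)

assert padded_row_width(3, 3) == 12   # 9 raw bytes padded up to 12
assert padded_row_width(4, 4) == 16   # already word-aligned
assert strip_alpha(bytes([10, 20, 30, 255, 40, 50, 60, 128])) == \
    bytes([10, 20, 30, 40, 50, 60])
```

An encoder walking the input must therefore advance by `row_byte_width` between rows while only reading `width * bytes_per_pixel` bytes from each, which is why the header documents the two quantities separately.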
<?php
defined('BASEPATH') OR exit('No direct script access allowed');
/*
| -------------------------------------------------------------------------
| URI ROUTING
| -------------------------------------------------------------------------
| This file lets you re-map URI requests to specific controller functions.
|
| Typically there is a one-to-one relationship between a URL string
| and its corresponding controller class/method. The segments in a
| URL normally follow this pattern:
|
| example.com/class/method/id/
|
| In some instances, however, you may want to remap this relationship
| so that a different class/function is called than the one
| corresponding to the URL.
|
| Please see the user guide for complete details:
|
| http://codeigniter.com/user_guide/general/routing.html
|
| -------------------------------------------------------------------------
| RESERVED ROUTES
| -------------------------------------------------------------------------
|
| There are three reserved routes:
|
| $route['default_controller'] = 'welcome';
|
| This route indicates which controller class should be loaded if the
| URI contains no data. In the above example, the "welcome" class
| would be loaded.
|
| $route['404_override'] = 'errors/page_missing';
|
| This route will tell the Router which controller/method to use if those
| provided in the URL cannot be matched to a valid route.
|
| $route['translate_uri_dashes'] = FALSE;
|
| This is not exactly a route, but allows you to automatically route
| controller and method names that contain dashes. '-' isn't a valid
| class or method name character, so it requires translation.
| When you set this option to TRUE, it will replace ALL dashes in the
| controller and method URI segments.
|
| Examples: my-controller/index -> my_controller/index
| my-controller/my-method -> my_controller/my_method
*/
$route['default_controller'] = 'home';
$route['404_override'] = '';
$route['translate_uri_dashes'] = FALSE;
$route['products/:any'] = "products/index";
// -- Purchase --
$route['purchase/purchase-details/(:num)'] = "purchase/purchase_details/$1";
$route['purchase/return'] = "purchase/purchase_return";
$route['purchase/show-return'] = "purchase/purchase_return_show";
$route['purchase/return-history'] = "purchase/return_history";
/* End of file routes.php */
/* Location: ./application/config/routes.php */ | {
"content_hash": "cdc7682158c4cb6c1769238dda2f4443",
"timestamp": "",
"source": "github",
"line_count": 68,
"max_line_length": 78,
"avg_line_length": 35.205882352941174,
"alnum_prop": 0.6307435254803676,
"repo_name": "laayis/erp",
"id": "1b1c018bd96a7d707ca22fa186ee6a96fa37eb18",
"size": "4053",
"binary": false,
"copies": "2",
"ref": "refs/heads/master",
"path": "application/config/routes.php",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "ApacheConf",
"bytes": "420"
},
{
"name": "Batchfile",
"bytes": "299"
},
{
"name": "C",
"bytes": "470979"
},
{
"name": "C++",
"bytes": "6264"
},
{
"name": "CSS",
"bytes": "205966"
},
{
"name": "Groff",
"bytes": "60910"
},
{
"name": "HTML",
"bytes": "398647"
},
{
"name": "JavaScript",
"bytes": "298738"
},
{
"name": "Makefile",
"bytes": "16052"
},
{
"name": "PHP",
"bytes": "3905378"
},
{
"name": "Perl",
"bytes": "50950"
},
{
"name": "Shell",
"bytes": "27876"
}
],
"symlink_target": ""
} |
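The routing file above remaps URI patterns such as `purchase/purchase-details/(:num)` onto controller/method strings with `$1` back-references. The dispatch mechanism can be sketched as a regex table in Python (assumption: `(:num)` translates to a digits group and `(:any)` to a catch-all, which is roughly what CodeIgniter does internally; the route list here is abbreviated):

```python
import re

# pattern -> controller/method target; \1 plays the role of $1
ROUTES = [
    (r'^purchase/purchase-details/([0-9]+)$', r'purchase/purchase_details/\1'),
    (r'^purchase/return$',                    r'purchase/purchase_return'),
    (r'^products/(.+)$',                      r'products/index'),
]

def resolve(uri):
    for pattern, target in ROUTES:
        m = re.match(pattern, uri)
        if m:
            return m.expand(target)
    return uri  # fall through to the default controller mapping

assert resolve('purchase/purchase-details/42') == 'purchase/purchase_details/42'
assert resolve('purchase/return') == 'purchase/purchase_return'
assert resolve('products/widgets') == 'products/index'
```

As in the config above, order matters: the first pattern that matches wins, so more specific routes must precede catch-alls.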
<?php
namespace Nette\Database\Table;
use Nette;
use Nette\Database\ISupplementalDriver;
use Nette\Database\SqlLiteral;
use Nette\Database\IConventions;
use Nette\Database\Context;
use Nette\Database\IStructure;
/**
* Builds SQL query.
* SqlBuilder is based on great library NotORM http://www.notorm.com written by Jakub Vrana.
*/
class SqlBuilder extends Nette\Object
{
/** @var string */
protected $tableName;
/** @var IConventions */
protected $conventions;
/** @var string delimited table name */
protected $delimitedTable;
/** @var array of column to select */
protected $select = array();
/** @var array of where conditions */
protected $where = array();
/** @var array of where conditions for caching */
protected $conditions = array();
/** @var array of parameters passed to where conditions */
protected $parameters = array(
'select' => array(),
'where' => array(),
'group' => array(),
'having' => array(),
'order' => array(),
);
	/** @var array of columns to order by */
protected $order = array();
/** @var int number of rows to fetch */
protected $limit = NULL;
/** @var int first row to fetch */
protected $offset = NULL;
	/** @var string columns for grouping */
protected $group = '';
/** @var string grouping condition */
protected $having = '';
/** @var ISupplementalDriver */
private $driver;
/** @var IStructure */
private $structure;
/** @var array */
private $cacheTableList;
public function __construct($tableName, Context $context)
{
$this->tableName = $tableName;
$this->driver = $context->getConnection()->getSupplementalDriver();
$this->conventions = $context->getConventions();
$this->structure = $context->getStructure();
$this->delimitedTable = implode('.', array_map(array($this->driver, 'delimite'), explode('.', $tableName)));
}
/**
* @return string
*/
public function getTableName()
{
return $this->tableName;
}
public function buildInsertQuery()
{
return "INSERT INTO {$this->delimitedTable}";
}
public function buildUpdateQuery()
{
if ($this->limit !== NULL || $this->offset) {
throw new Nette\NotSupportedException('LIMIT clause is not supported in UPDATE query.');
}
return "UPDATE {$this->delimitedTable} SET ?set" . $this->tryDelimite($this->buildConditions());
}
public function buildDeleteQuery()
{
if ($this->limit !== NULL || $this->offset) {
throw new Nette\NotSupportedException('LIMIT clause is not supported in DELETE query.');
}
return "DELETE FROM {$this->delimitedTable}" . $this->tryDelimite($this->buildConditions());
}
/**
* Returns SQL query.
* @param string list of columns
* @return string
*/
public function buildSelectQuery($columns = NULL)
{
$queryCondition = $this->buildConditions();
$queryEnd = $this->buildQueryEnd();
$joins = array();
$this->parseJoins($joins, $queryCondition);
$this->parseJoins($joins, $queryEnd);
if ($this->select) {
$querySelect = $this->buildSelect($this->select);
$this->parseJoins($joins, $querySelect);
} elseif ($columns) {
$prefix = $joins ? "{$this->delimitedTable}." : '';
$cols = array();
foreach ($columns as $col) {
$cols[] = $prefix . $col;
}
$querySelect = $this->buildSelect($cols);
} elseif ($this->group && !$this->driver->isSupported(ISupplementalDriver::SUPPORT_SELECT_UNGROUPED_COLUMNS)) {
$querySelect = $this->buildSelect(array($this->group));
$this->parseJoins($joins, $querySelect);
} else {
$prefix = $joins ? "{$this->delimitedTable}." : '';
$querySelect = $this->buildSelect(array($prefix . '*'));
}
$queryJoins = $this->buildQueryJoins($joins);
$query = "{$querySelect} FROM {$this->delimitedTable}{$queryJoins}{$queryCondition}{$queryEnd}";
if ($this->limit !== NULL || $this->offset > 0) {
$this->driver->applyLimit($query, $this->limit, max(0, $this->offset));
}
return $this->tryDelimite($query);
}
public function getParameters()
{
return array_merge(
$this->parameters['select'],
$this->parameters['where'],
$this->parameters['group'],
$this->parameters['having'],
$this->parameters['order']
);
}
public function importConditions(SqlBuilder $builder)
{
$this->where = $builder->where;
$this->parameters['where'] = $builder->parameters['where'];
$this->conditions = $builder->conditions;
}
/********************* SQL selectors ****************d*g**/
public function addSelect($columns)
{
if (is_array($columns)) {
throw new Nette\InvalidArgumentException('Select column must be a string.');
}
$this->select[] = $columns;
$this->parameters['select'] = array_merge($this->parameters['select'], array_slice(func_get_args(), 1));
}
public function getSelect()
{
return $this->select;
}
public function addWhere($condition, $parameters = array())
{
if (is_array($condition) && is_array($parameters) && !empty($parameters)) {
return $this->addWhereComposition($condition, $parameters);
}
$args = func_get_args();
$hash = md5(json_encode($args));
if (isset($this->conditions[$hash])) {
return FALSE;
}
$this->conditions[$hash] = $condition;
$placeholderCount = substr_count($condition, '?');
if ($placeholderCount > 1 && count($args) === 2 && is_array($parameters)) {
$args = $parameters;
} else {
array_shift($args);
}
$condition = trim($condition);
if ($placeholderCount === 0 && count($args) === 1) {
$condition .= ' ?';
} elseif ($placeholderCount !== count($args)) {
throw new Nette\InvalidArgumentException('Argument count does not match placeholder count.');
}
$replace = NULL;
$placeholderNum = 0;
foreach ($args as $arg) {
preg_match('#(?:.*?\?.*?){' . $placeholderNum . '}(((?:&|\||^|~|\+|-|\*|/|%|\(|,|<|>|=|(?<=\W|^)(?:REGEXP|ALL|AND|ANY|BETWEEN|EXISTS|IN|[IR]?LIKE|OR|NOT|SOME|INTERVAL))\s*)?(?:\(\?\)|\?))#s', $condition, $match, PREG_OFFSET_CAPTURE);
$hasOperator = ($match[1][0] === '?' && $match[1][1] === 0) ? TRUE : !empty($match[2][0]);
if ($arg === NULL) {
$replace = 'IS NULL';
if ($hasOperator) {
if (trim($match[2][0]) === 'NOT') {
$replace = 'IS NOT NULL';
} else {
throw new Nette\InvalidArgumentException('Column operator does not accept NULL argument.');
}
}
} elseif (is_array($arg) || $arg instanceof Selection) {
if ($hasOperator) {
if (trim($match[2][0]) === 'NOT') {
$match[2][0] = rtrim($match[2][0]) . ' IN ';
} elseif (trim($match[2][0]) !== 'IN') {
throw new Nette\InvalidArgumentException('Column operator does not accept array argument.');
}
} else {
$match[2][0] = 'IN ';
}
if ($arg instanceof Selection) {
$clone = clone $arg;
if (!$clone->getSqlBuilder()->select) {
try {
$clone->select($clone->getPrimary());
} catch (\LogicException $e) {
throw new Nette\InvalidArgumentException('Selection argument must have defined a select column.', 0, $e);
}
}
if ($this->driver->isSupported(ISupplementalDriver::SUPPORT_SUBSELECT)) {
$arg = NULL;
$replace = $match[2][0] . '(' . $clone->getSql() . ')';
$this->parameters['where'] = array_merge($this->parameters['where'], $clone->getSqlBuilder()->getParameters());
} else {
$arg = array();
foreach ($clone as $row) {
$arg[] = array_values(iterator_to_array($row));
}
}
}
if ($arg !== NULL) {
if (!$arg) {
$hasBrackets = strpos($condition, '(') !== FALSE;
$hasOperators = preg_match('#AND|OR#', $condition);
$hasNot = strpos($condition, 'NOT') !== FALSE;
$hasPrefixNot = strpos($match[2][0], 'NOT') !== FALSE;
if (!$hasBrackets && ($hasOperators || ($hasNot && !$hasPrefixNot))) {
throw new Nette\InvalidArgumentException('Possible SQL query corruption. Add parentheses around operators.');
}
if ($hasPrefixNot) {
$replace = 'IS NULL OR TRUE';
} else {
$replace = 'IS NULL AND FALSE';
}
$arg = NULL;
} else {
$replace = $match[2][0] . '(?)';
$this->parameters['where'][] = $arg;
}
}
} elseif ($arg instanceof SqlLiteral) {
$this->parameters['where'][] = $arg;
} else {
if (!$hasOperator) {
$replace = '= ?';
}
$this->parameters['where'][] = $arg;
}
if ($replace) {
$condition = substr_replace($condition, $replace, $match[1][1], strlen($match[1][0]));
$replace = NULL;
}
if ($arg !== NULL) {
$placeholderNum++;
}
}
$this->where[] = $condition;
return TRUE;
}
public function getConditions()
{
return array_values($this->conditions);
}
public function addOrder($columns)
{
$this->order[] = $columns;
$this->parameters['order'] = array_merge($this->parameters['order'], array_slice(func_get_args(), 1));
}
public function setOrder(array $columns, array $parameters)
{
$this->order = $columns;
$this->parameters['order'] = $parameters;
}
public function getOrder()
{
return $this->order;
}
public function setLimit($limit, $offset)
{
$this->limit = $limit;
$this->offset = $offset;
}
public function getLimit()
{
return $this->limit;
}
public function getOffset()
{
return $this->offset;
}
public function setGroup($columns)
{
$this->group = $columns;
$this->parameters['group'] = array_slice(func_get_args(), 1);
}
public function getGroup()
{
return $this->group;
}
public function setHaving($having)
{
$this->having = $having;
$this->parameters['having'] = array_slice(func_get_args(), 1);
}
public function getHaving()
{
return $this->having;
}
/********************* SQL building ****************d*g**/
protected function buildSelect(array $columns)
{
return 'SELECT ' . implode(', ', $columns);
}
protected function parseJoins(& $joins, & $query)
{
$builder = $this;
$query = preg_replace_callback('~
(?(DEFINE)
(?P<word> [\w_]*[a-z][\w_]* )
(?P<del> [.:] )
(?P<node> (?&del)? (?&word) (\((?&word)\))? )
)
(?P<chain> (?!\.) (?&node)*) \. (?P<column> (?&word) | \* )
~xi', function ($match) use (& $joins, $builder) {
return $builder->parseJoinsCb($joins, $match);
}, $query);
}
public function parseJoinsCb(& $joins, $match)
{
$chain = $match['chain'];
if (!empty($chain[0]) && ($chain[0] !== '.' && $chain[0] !== ':')) {
$chain = '.' . $chain; // unified chain format
}
preg_match_all('~
(?(DEFINE)
(?P<word> [\w_]*[a-z][\w_]* )
)
(?P<del> [.:])?(?P<key> (?&word))(\((?P<throughColumn> (?&word))\))?
~xi', $chain, $keyMatches, PREG_SET_ORDER);
$parent = $this->tableName;
$parentAlias = preg_replace('#^(.*\.)?(.*)$#', '$2', $this->tableName);
// join schema keyMatch and table keyMatch to schema.table keyMatch
if ($this->driver->isSupported(ISupplementalDriver::SUPPORT_SCHEMA) && count($keyMatches) > 1) {
$tables = $this->getCachedTableList();
if (!isset($tables[$keyMatches[0]['key']]) && isset($tables[$keyMatches[0]['key'] . '.' . $keyMatches[1]['key']])) {
$keyMatch = array_shift($keyMatches);
$keyMatches[0]['key'] = $keyMatch['key'] . '.' . $keyMatches[0]['key'];
$keyMatches[0]['del'] = $keyMatch['del'];
}
}
// do not make a join when referencing to the current table column - inner conditions
// check it only when not making backjoin on itself - outer condition
if ($keyMatches[0]['del'] === '.') {
if ($parent === $keyMatches[0]['key']) {
return "{$parent}.{$match['column']}";
} elseif ($parentAlias === $keyMatches[0]['key']) {
return "{$parentAlias}.{$match['column']}";
}
}
foreach ($keyMatches as $keyMatch) {
if ($keyMatch['del'] === ':') {
if (isset($keyMatch['throughColumn'])) {
$table = $keyMatch['key'];
$belongsTo = $this->conventions->getBelongsToReference($table, $keyMatch['throughColumn']);
if (!$belongsTo) {
throw new Nette\InvalidArgumentException("No reference found for \${$parent}->{$keyMatch['key']}.");
}
list(, $primary) = $belongsTo;
} else {
$hasMany = $this->conventions->getHasManyReference($parent, $keyMatch['key']);
if (!$hasMany) {
throw new Nette\InvalidArgumentException("No reference found for \${$parent}->related({$keyMatch['key']}).");
}
list($table, $primary) = $hasMany;
}
$column = $this->conventions->getPrimary($parent);
} else {
$belongsTo = $this->conventions->getBelongsToReference($parent, $keyMatch['key']);
if (!$belongsTo) {
throw new Nette\InvalidArgumentException("No reference found for \${$parent}->{$keyMatch['key']}.");
}
list($table, $column) = $belongsTo;
$primary = $this->conventions->getPrimary($table);
}
$tableAlias = $keyMatch['key'] ?: preg_replace('#^(.*\.)?(.*)$#', '$2', $table);
// if we are joining itself (parent table), we must alias joining table
if ($parent === $table) {
$tableAlias = $parentAlias . '_ref';
}
$joins[$tableAlias . $column] = array($table, $tableAlias, $parentAlias, $column, $primary);
$parent = $table;
$parentAlias = $tableAlias;
}
return $tableAlias . ".{$match['column']}";
}
protected function buildQueryJoins(array $joins)
{
$return = '';
foreach ($joins as $join) {
list($joinTable, $joinAlias, $table, $tableColumn, $joinColumn) = $join;
$return .=
" LEFT JOIN {$joinTable}" . ($joinTable !== $joinAlias ? " {$joinAlias}" : '') .
" ON {$table}.{$tableColumn} = {$joinAlias}.{$joinColumn}";
}
return $return;
}
protected function buildConditions()
{
return $this->where ? ' WHERE (' . implode(') AND (', $this->where) . ')' : '';
}
protected function buildQueryEnd()
{
$return = '';
if ($this->group) {
$return .= ' GROUP BY '. $this->group;
}
if ($this->having) {
$return .= ' HAVING '. $this->having;
}
if ($this->order) {
$return .= ' ORDER BY ' . implode(', ', $this->order);
}
return $return;
}
protected function tryDelimite($s)
{
$driver = $this->driver;
return preg_replace_callback('#(?<=[^\w`"\[?]|^)[a-z_][a-z0-9_]*(?=[^\w`"(\]]|\z)#i', function ($m) use ($driver) {
return strtoupper($m[0]) === $m[0] ? $m[0] : $driver->delimite($m[0]);
}, $s);
}
protected function addWhereComposition(array $columns, array $parameters)
{
if ($this->driver->isSupported(ISupplementalDriver::SUPPORT_MULTI_COLUMN_AS_OR_COND)) {
$conditionFragment = '(' . implode(' = ? AND ', $columns) . ' = ?) OR ';
$condition = substr(str_repeat($conditionFragment, count($parameters)), 0, -4);
return $this->addWhere($condition, Nette\Utils\Arrays::flatten($parameters));
} else {
return $this->addWhere('(' . implode(', ', $columns) . ') IN', $parameters);
}
}
private function getCachedTableList()
{
if (!$this->cacheTableList) {
$this->cacheTableList = array_flip(array_map(function ($pair) {
return isset($pair['fullName']) ? $pair['fullName'] : $pair['name'];
}, $this->structure->getTables()));
}
return $this->cacheTableList;
}
}
| {
"content_hash": "1c393ced77abe81aea75bca6bd12ac37",
"timestamp": "",
"source": "github",
"line_count": 566,
"max_line_length": 236,
"avg_line_length": 26.609540636042404,
"alnum_prop": 0.5986986255892703,
"repo_name": "kivi8/ars-poetica",
"id": "dd7b691809d7c79cbb118b9c8b3f949f100081af",
"size": "15191",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "vendor/nette/database/src/Database/Table/SqlBuilder.php",
"mode": "33188",
"license": "bsd-3-clause",
"language": [
{
"name": "ApacheConf",
"bytes": "3480"
},
{
"name": "CSS",
"bytes": "594011"
},
{
"name": "HTML",
"bytes": "1010265"
},
{
"name": "JavaScript",
"bytes": "197037"
},
{
"name": "PHP",
"bytes": "8397436"
}
],
"symlink_target": ""
} |
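The heart of `addWhere()` above is placeholder expansion: a bare `?` is rewritten depending on the bound argument — `NULL` becomes `IS NULL`, an array becomes `IN (?)` (with an empty array collapsing to an always-false condition), and a scalar without an explicit operator gets `= ?`. A condensed sketch of that dispatch in Python (illustrative only; the real builder additionally handles `NOT`, subselects, and operator prefixes via the regex at the top of `addWhere`):

```python
def build_condition(column, arg):
    """Expand a bare column the way SqlBuilder expands '?'."""
    if arg is None:
        return f"{column} IS NULL", []
    if isinstance(arg, (list, tuple)):
        if not arg:  # IN () would be invalid SQL and can never match
            return f"{column} IS NULL AND FALSE", []
        marks = ", ".join("?" for _ in arg)
        return f"{column} IN ({marks})", list(arg)
    return f"{column} = ?", [arg]

assert build_condition("id", None) == ("id IS NULL", [])
assert build_condition("id", [1, 2, 3]) == ("id IN (?, ?, ?)", [1, 2, 3])
assert build_condition("name", "x") == ("name = ?", ["x"])
assert build_condition("id", []) == ("id IS NULL AND FALSE", [])
```

The `IS NULL AND FALSE` trick mirrors the PHP code's handling of an empty argument list: it keeps the generated SQL syntactically valid while guaranteeing no row matches.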
module Backend::ImportViaCsv
class BaseController < ::Backend::BaseController
expose(:active_top_nav) { :import_via_csv }
expose(:csv_import) { CsvImport.find(params[:object_id]) }
expose(:csv_imports_type) {
# specific type imports or all imports
params[:type] || "CsvImport"
}
end
end
| {
"content_hash": "e904fa47a11ead53c0dc76b6f6ba1331",
"timestamp": "",
"source": "github",
"line_count": 10,
"max_line_length": 62,
"avg_line_length": 31.9,
"alnum_prop": 0.6677115987460815,
"repo_name": "bitzesty/scheme-finder-api",
"id": "0672bc4d1f36190f98e3ed58704651519050fd60",
"size": "319",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "app/controllers/backend/import_via_csv/base_controller.rb",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "CSS",
"bytes": "50"
},
{
"name": "CoffeeScript",
"bytes": "4123"
},
{
"name": "HTML",
"bytes": "36046"
},
{
"name": "Ruby",
"bytes": "159296"
}
],
"symlink_target": ""
} |
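The `expose(:name) { ... }` calls in the controller above (likely from the decent_exposure gem, given the syntax) evaluate each block lazily on first access and memoize the result for the rest of the request. The closest Python analogue is a cached property (class and attribute names here are hypothetical):

```python
from functools import cached_property

class BaseController:
    def __init__(self, params):
        self.params = params

    @cached_property
    def csv_imports_type(self):
        # specific type imports or all imports, as in the Ruby controller
        return self.params.get('type') or 'CsvImport'

c = BaseController({})
assert c.csv_imports_type == 'CsvImport'          # default when no type given
c2 = BaseController({'type': 'ProductCsvImport'})
assert c2.csv_imports_type == 'ProductCsvImport'  # computed once, then cached
```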
__import__('pkg_resources').declare_namespace(__name__)
| {
"content_hash": "96b57b75f83745a044136090e8240b5e",
"timestamp": "",
"source": "github",
"line_count": 1,
"max_line_length": 55,
"avg_line_length": 56,
"alnum_prop": 0.6785714285714286,
"repo_name": "projectmallard/mallard-ducktype",
"id": "1e3a4b538ce599fc6293990aab46ba932e88d34b",
"size": "1168",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "mallard/ducktype/extensions/__init__.py",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "Python",
"bytes": "236590"
},
{
"name": "Shell",
"bytes": "1316"
}
],
"symlink_target": ""
} |
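The one-liner above is the legacy `pkg_resources` way of declaring a namespace package, allowing `mallard.ducktype.extensions` to be contributed to by multiple installed distributions. Since Python 3.3, PEP 420 implicit namespace packages achieve the same effect by simply omitting `__init__.py`; the stitching across path entries can be demonstrated at runtime (the `ns`/`alpha`/`beta` names below are made up for the demonstration):

```python
import importlib
import os
import sys
import tempfile

# Build two distribution roots that both contribute to namespace 'ns'
root = tempfile.mkdtemp()
for dist, mod in [('dist_a', 'alpha'), ('dist_b', 'beta')]:
    pkg = os.path.join(root, dist, 'ns')
    os.makedirs(pkg)                      # note: no __init__.py (PEP 420)
    with open(os.path.join(pkg, mod + '.py'), 'w') as f:
        f.write(f"NAME = {mod!r}\n")
    sys.path.insert(0, os.path.join(root, dist))

# Both submodules import even though 'ns' has no single on-disk package
alpha = importlib.import_module('ns.alpha')
beta = importlib.import_module('ns.beta')
assert alpha.NAME == 'alpha' and beta.NAME == 'beta'
```

Projects that must support very old setuptools tooling still use the `declare_namespace` form shown above; the two mechanisms should not be mixed within one namespace.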
package org.elasticsearch.search.highlight;
import com.carrotsearch.randomizedtesting.generators.RandomPicks;
import com.google.common.base.Joiner;
import com.google.common.collect.Iterables;
import org.elasticsearch.Version;
import org.elasticsearch.action.index.IndexRequestBuilder;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.settings.Settings.Builder;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.index.query.BoostableQueryBuilder;
import org.elasticsearch.index.query.IdsQueryBuilder;
import org.elasticsearch.index.query.MatchQueryBuilder;
import org.elasticsearch.index.query.MatchQueryBuilder.Operator;
import org.elasticsearch.index.query.MatchQueryBuilder.Type;
import org.elasticsearch.index.query.MultiMatchQueryBuilder;
import org.elasticsearch.index.query.QueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.rest.RestStatus;
import org.elasticsearch.search.SearchHit;
import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.elasticsearch.search.highlight.HighlightBuilder.Field;
import org.elasticsearch.test.ESIntegTestCase;
import org.hamcrest.Matcher;
import org.hamcrest.Matchers;
import org.junit.Test;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import static org.elasticsearch.client.Requests.searchRequest;
import static org.elasticsearch.common.settings.Settings.settingsBuilder;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.QueryBuilders.boolQuery;
import static org.elasticsearch.index.query.QueryBuilders.boostingQuery;
import static org.elasticsearch.index.query.QueryBuilders.commonTermsQuery;
import static org.elasticsearch.index.query.QueryBuilders.constantScoreQuery;
import static org.elasticsearch.index.query.QueryBuilders.filteredQuery;
import static org.elasticsearch.index.query.QueryBuilders.fuzzyQuery;
import static org.elasticsearch.index.query.QueryBuilders.matchPhrasePrefixQuery;
import static org.elasticsearch.index.query.QueryBuilders.matchPhraseQuery;
import static org.elasticsearch.index.query.QueryBuilders.matchQuery;
import static org.elasticsearch.index.query.QueryBuilders.missingQuery;
import static org.elasticsearch.index.query.QueryBuilders.multiMatchQuery;
import static org.elasticsearch.index.query.QueryBuilders.prefixQuery;
import static org.elasticsearch.index.query.QueryBuilders.queryStringQuery;
import static org.elasticsearch.index.query.QueryBuilders.rangeQuery;
import static org.elasticsearch.index.query.QueryBuilders.regexpQuery;
import static org.elasticsearch.index.query.QueryBuilders.termQuery;
import static org.elasticsearch.index.query.QueryBuilders.typeQuery;
import static org.elasticsearch.index.query.QueryBuilders.wildcardQuery;
import static org.elasticsearch.search.builder.SearchSourceBuilder.highlight;
import static org.elasticsearch.search.builder.SearchSourceBuilder.searchSource;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertFailures;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHighlight;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertHitCount;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNotHighlighted;
import static org.elasticsearch.test.hamcrest.RegexMatcher.matches;
import static org.hamcrest.Matchers.anyOf;
import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.hasKey;
import static org.hamcrest.Matchers.not;
import static org.hamcrest.Matchers.startsWith;
public class HighlighterSearchIT extends ESIntegTestCase {
@Test
public void testHighlightingWithWildcardName() throws IOException {
// test the kibana case with * as fieldname that will try to highlight all fields including meta fields
XContentBuilder mappings = jsonBuilder();
mappings.startObject();
mappings.startObject("type")
.startObject("properties")
.startObject("text")
.field("type", "string")
.field("analyzer", "keyword")
.field("index_options", "offsets")
.field("term_vector", "with_positions_offsets")
.endObject()
.endObject()
.endObject();
mappings.endObject();
assertAcked(prepareCreate("test")
.addMapping("type", mappings));
ensureYellow();
client().prepareIndex("test", "type", "1")
.setSource(jsonBuilder().startObject().field("text", "text").endObject())
.get();
refresh();
String highlighter = randomFrom(new String[]{"plain", "postings", "fvh"});
SearchResponse search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("text", "text"))).addHighlightedField(new Field("*").highlighterType(highlighter)).get();
assertHighlight(search, 0, "text", 0, equalTo("<em>text</em>"));
}
@Test
public void testPlainHighlighterWithLongUnanalyzedStringTerm() throws IOException {
XContentBuilder mappings = jsonBuilder();
mappings.startObject();
mappings.startObject("type")
.startObject("properties")
.startObject("long_text")
.field("type", "string")
.field("analyzer", "keyword")
.field("index_options", "offsets")
.field("term_vector", "with_positions_offsets")
.field("ignore_above", 1)
.endObject()
.startObject("text")
.field("type", "string")
.field("analyzer", "keyword")
.field("index_options", "offsets")
.field("term_vector", "with_positions_offsets")
.endObject()
.endObject()
.endObject();
mappings.endObject();
assertAcked(prepareCreate("test")
.addMapping("type", mappings));
ensureYellow();
// create a term that is larger than the allowed 32766 bytes, index it and then try to highlight on it
// the search request should still succeed
StringBuilder builder = new StringBuilder();
for (int i = 0; i < 32767; i++) {
builder.append('a');
}
client().prepareIndex("test", "type", "1")
.setSource(jsonBuilder().startObject().field("long_text", builder.toString()).field("text", "text").endObject())
.get();
refresh();
String highlighter = randomFrom(new String[]{"plain", "postings", "fvh"});
SearchResponse search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("text", "text"))).addHighlightedField(new Field("*").highlighterType(highlighter)).get();
assertHighlight(search, 0, "text", 0, equalTo("<em>text</em>"));
search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("text", "text"))).addHighlightedField(new Field("long_text").highlighterType(highlighter)).get();
assertNoFailures(search);
assertThat(search.getHits().getAt(0).getHighlightFields().size(), equalTo(0));
search = client().prepareSearch().setQuery(prefixQuery("text", "te")).addHighlightedField(new Field("long_text").highlighterType(highlighter)).get();
assertNoFailures(search);
assertThat(search.getHits().getAt(0).getHighlightFields().size(), equalTo(0));
}
@Test
public void testHighlightingWhenFieldsAreNotStoredThereIsNoSource() throws IOException {
XContentBuilder mappings = jsonBuilder();
mappings.startObject();
mappings.startObject("type")
.startObject("_source")
.field("enabled", false)
.endObject()
.startObject("properties")
.startObject("unstored_text")
.field("index_options", "offsets")
.field("term_vector", "with_positions_offsets")
.field("type", "string")
.field("store", "no")
.endObject()
.startObject("text")
.field("index_options", "offsets")
.field("term_vector", "with_positions_offsets")
.field("type", "string")
.field("store", "yes")
.endObject()
.endObject()
.endObject();
mappings.endObject();
assertAcked(prepareCreate("test")
.addMapping("type", mappings));
ensureYellow();
client().prepareIndex("test", "type", "1")
.setSource(jsonBuilder().startObject().field("unstored_text", "text").field("text", "text").endObject())
.get();
refresh();
String highlighter = randomFrom(new String[]{"plain", "postings", "fvh"});
SearchResponse search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("text", "text"))).addHighlightedField(new Field("*").highlighterType(highlighter)).get();
assertHighlight(search, 0, "text", 0, equalTo("<em>text</em>"));
search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("text", "text"))).addHighlightedField(new Field("unstored_text")).get();
assertNoFailures(search);
assertThat(search.getHits().getAt(0).getHighlightFields().size(), equalTo(0));
}
@Test
// see #3486
public void testHighTermFrequencyDoc() throws IOException {
assertAcked(prepareCreate("test")
.addMapping("test", "name", "type=string,term_vector=with_positions_offsets,store=" + (randomBoolean() ? "yes" : "no")));
ensureYellow();
StringBuilder builder = new StringBuilder();
for (int i = 0; i < 6000; i++) {
builder.append("abc").append(" ");
}
client().prepareIndex("test", "test", "1")
.setSource("name", builder.toString())
.get();
refresh();
SearchResponse search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name", "abc"))).addHighlightedField("name").get();
assertHighlight(search, 0, "name", 0, startsWith("<em>abc</em> <em>abc</em> <em>abc</em> <em>abc</em>"));
}
@Test
public void testNgramHighlightingWithBrokenPositions() throws IOException {
assertAcked(prepareCreate("test")
.addMapping("test", jsonBuilder()
.startObject()
.startObject("test")
.startObject("properties")
.startObject("name")
.startObject("fields")
.startObject("autocomplete")
.field("type", "string")
.field("analyzer", "autocomplete")
.field("search_analyzer", "search_autocomplete")
.field("term_vector", "with_positions_offsets")
.endObject()
.startObject("name")
.field("type", "string")
.endObject()
.endObject()
.field("type", "multi_field")
.endObject()
.endObject()
.endObject())
.setSettings(settingsBuilder()
.put(indexSettings())
.put("analysis.tokenizer.autocomplete.max_gram", 20)
.put("analysis.tokenizer.autocomplete.min_gram", 1)
.put("analysis.tokenizer.autocomplete.token_chars", "letter,digit")
.put("analysis.tokenizer.autocomplete.type", "nGram")
.put("analysis.filter.wordDelimiter.type", "word_delimiter")
.putArray("analysis.filter.wordDelimiter.type_table",
"& => ALPHANUM", "| => ALPHANUM", "! => ALPHANUM",
"? => ALPHANUM", ". => ALPHANUM", "- => ALPHANUM", "# => ALPHANUM", "% => ALPHANUM",
"+ => ALPHANUM", ", => ALPHANUM", "~ => ALPHANUM", ": => ALPHANUM", "/ => ALPHANUM",
"^ => ALPHANUM", "$ => ALPHANUM", "@ => ALPHANUM", ") => ALPHANUM", "( => ALPHANUM",
"] => ALPHANUM", "[ => ALPHANUM", "} => ALPHANUM", "{ => ALPHANUM")
.put("analysis.filter.wordDelimiter.type.split_on_numerics", false)
.put("analysis.filter.wordDelimiter.generate_word_parts", true)
.put("analysis.filter.wordDelimiter.generate_number_parts", false)
.put("analysis.filter.wordDelimiter.catenate_words", true)
.put("analysis.filter.wordDelimiter.catenate_numbers", true)
.put("analysis.filter.wordDelimiter.catenate_all", false)
.put("analysis.analyzer.autocomplete.tokenizer", "autocomplete")
.putArray("analysis.analyzer.autocomplete.filter", "lowercase", "wordDelimiter")
.put("analysis.analyzer.search_autocomplete.tokenizer", "whitespace")
.putArray("analysis.analyzer.search_autocomplete.filter", "lowercase", "wordDelimiter")));
ensureYellow();
client().prepareIndex("test", "test", "1")
.setSource("name", "ARCOTEL Hotels Deutschland").get();
refresh();
SearchResponse search = client().prepareSearch("test").setTypes("test").setQuery(matchQuery("name.autocomplete", "deut tel").operator(Operator.OR)).addHighlightedField("name.autocomplete").execute().actionGet();
assertHighlight(search, 0, "name.autocomplete", 0, equalTo("ARCO<em>TEL</em> Ho<em>tel</em>s <em>Deut</em>schland"));
}
@Test
public void testMultiPhraseCutoff() throws IOException {
/*
* MultiPhraseQuery can literally kill an entire node if there are too many terms in the
* query. We cut off and extract terms if there are more than 16 terms in the query
*/
assertAcked(prepareCreate("test")
.addMapping("test", "body", "type=string,analyzer=custom_analyzer,search_analyzer=custom_analyzer,term_vector=with_positions_offsets")
.setSettings(
settingsBuilder().put(indexSettings())
.put("analysis.filter.wordDelimiter.type", "word_delimiter")
.put("analysis.filter.wordDelimiter.type.split_on_numerics", false)
.put("analysis.filter.wordDelimiter.generate_word_parts", true)
.put("analysis.filter.wordDelimiter.generate_number_parts", true)
.put("analysis.filter.wordDelimiter.catenate_words", true)
.put("analysis.filter.wordDelimiter.catenate_numbers", true)
.put("analysis.filter.wordDelimiter.catenate_all", false)
.put("analysis.analyzer.custom_analyzer.tokenizer", "whitespace")
.putArray("analysis.analyzer.custom_analyzer.filter", "lowercase", "wordDelimiter"))
);
ensureGreen();
client().prepareIndex("test", "test", "1")
.setSource("body", "Test: http://www.facebook.com http://elasticsearch.org http://xing.com http://cnn.com http://quora.com http://twitter.com this is a test for highlighting feature Test: http://www.facebook.com http://elasticsearch.org http://xing.com http://cnn.com http://quora.com http://twitter.com this is a test for highlighting feature")
.get();
refresh();
SearchResponse search = client().prepareSearch().setQuery(matchQuery("body", "Test: http://www.facebook.com ").type(Type.PHRASE)).addHighlightedField("body").execute().actionGet();
assertHighlight(search, 0, "body", 0, startsWith("<em>Test: http://www.facebook.com</em>"));
search = client().prepareSearch().setQuery(matchQuery("body", "Test: http://www.facebook.com http://elasticsearch.org http://xing.com http://cnn.com http://quora.com http://twitter.com this is a test for highlighting feature Test: http://www.facebook.com http://elasticsearch.org http://xing.com http://cnn.com http://quora.com http://twitter.com this is a test for highlighting feature").type(Type.PHRASE)).addHighlightedField("body").execute().actionGet();
assertHighlight(search, 0, "body", 0, equalTo("<em>Test</em>: <em>http://www.facebook.com</em> <em>http://elasticsearch.org</em> <em>http://xing.com</em> <em>http://cnn.com</em> http://quora.com"));
}
@Test
public void testNgramHighlightingPreLucene42() throws IOException {
assertAcked(prepareCreate("test")
.addMapping("test",
"name", "type=string,analyzer=name_index_analyzer,search_analyzer=name_search_analyzer," + randomStoreField() + "term_vector=with_positions_offsets",
"name2", "type=string,analyzer=name2_index_analyzer,search_analyzer=name_search_analyzer," + randomStoreField() + "term_vector=with_positions_offsets")
.setSettings(settingsBuilder()
.put(indexSettings())
.put("analysis.filter.my_ngram.max_gram", 20)
.put("analysis.filter.my_ngram.version", "4.1")
.put("analysis.filter.my_ngram.min_gram", 1)
.put("analysis.filter.my_ngram.type", "ngram")
.put("analysis.tokenizer.my_ngramt.max_gram", 20)
.put("analysis.tokenizer.my_ngramt.version", "4.1")
.put("analysis.tokenizer.my_ngramt.min_gram", 1)
.put("analysis.tokenizer.my_ngramt.type", "ngram")
.put("analysis.analyzer.name_index_analyzer.tokenizer", "my_ngramt")
.put("analysis.analyzer.name2_index_analyzer.tokenizer", "whitespace")
.putArray("analysis.analyzer.name2_index_analyzer.filter", "lowercase", "my_ngram")
.put("analysis.analyzer.name_search_analyzer.tokenizer", "whitespace")
.put("analysis.analyzer.name_search_analyzer.filter", "lowercase")));
ensureYellow();
client().prepareIndex("test", "test", "1")
.setSource("name", "logicacmg ehemals avinci - the know how company",
"name2", "logicacmg ehemals avinci - the know how company").get();
client().prepareIndex("test", "test", "2")
.setSource("name", "avinci, unilog avinci, logicacmg, logica",
"name2", "avinci, unilog avinci, logicacmg, logica").get();
refresh();
SearchResponse search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name", "logica m"))).addHighlightedField("name").get();
assertHighlight(search, 0, "name", 0, anyOf(equalTo("<em>logica</em>c<em>m</em>g ehe<em>m</em>als avinci - the know how co<em>m</em>pany"),
equalTo("avinci, unilog avinci, <em>logica</em>c<em>m</em>g, <em>logica</em>")));
assertHighlight(search, 1, "name", 0, anyOf(equalTo("<em>logica</em>c<em>m</em>g ehe<em>m</em>als avinci - the know how co<em>m</em>pany"),
equalTo("avinci, unilog avinci, <em>logica</em>c<em>m</em>g, <em>logica</em>")));
search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name", "logica ma"))).addHighlightedField("name").get();
assertHighlight(search, 0, "name", 0, anyOf(equalTo("<em>logica</em>cmg ehe<em>ma</em>ls avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
assertHighlight(search, 1, "name", 0, anyOf(equalTo("<em>logica</em>cmg ehe<em>ma</em>ls avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name", "logica"))).addHighlightedField("name").get();
assertHighlight(search, 0, "name", 0, anyOf(equalTo("<em>logica</em>cmg ehemals avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
assertHighlight(search, 0, "name", 0, anyOf(equalTo("<em>logica</em>cmg ehemals avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name2", "logica m"))).addHighlightedField("name2").get();
assertHighlight(search, 0, "name2", 0, anyOf(equalTo("<em>logica</em>c<em>m</em>g ehe<em>m</em>als avinci - the know how co<em>m</em>pany"),
equalTo("avinci, unilog avinci, <em>logica</em>c<em>m</em>g, <em>logica</em>")));
assertHighlight(search, 1, "name2", 0, anyOf(equalTo("<em>logica</em>c<em>m</em>g ehe<em>m</em>als avinci - the know how co<em>m</em>pany"),
equalTo("avinci, unilog avinci, <em>logica</em>c<em>m</em>g, <em>logica</em>")));
search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name2", "logica ma"))).addHighlightedField("name2").get();
assertHighlight(search, 0, "name2", 0, anyOf(equalTo("<em>logica</em>cmg ehe<em>ma</em>ls avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
assertHighlight(search, 1, "name2", 0, anyOf(equalTo("<em>logica</em>cmg ehe<em>ma</em>ls avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
search = client().prepareSearch().setQuery(constantScoreQuery(matchQuery("name2", "logica"))).addHighlightedField("name2").get();
assertHighlight(search, 0, "name2", 0, anyOf(equalTo("<em>logica</em>cmg ehemals avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
assertHighlight(search, 1, "name2", 0, anyOf(equalTo("<em>logica</em>cmg ehemals avinci - the know how company"),
equalTo("avinci, unilog avinci, <em>logica</em>cmg, <em>logica</em>")));
}
@Test
public void testNgramHighlighting() throws IOException {
assertAcked(prepareCreate("test")
.addMapping("test",
"name", "type=string,analyzer=name_index_analyzer,search_analyzer=name_search_analyzer,term_vector=with_positions_offsets",
"name2", "type=string,analyzer=name2_index_analyzer,search_analyzer=name_search_analyzer,term_vector=with_positions_offsets")
.setSettings(settingsBuilder()
.put(indexSettings())
.put("analysis.filter.my_ngram.max_gram", 20)
.put("analysis.filter.my_ngram.min_gram", 1)
.put("analysis.filter.my_ngram.type", "ngram")
.put("analysis.tokenizer.my_ngramt.max_gram", 20)
.put("analysis.tokenizer.my_ngramt.min_gram", 1)
.put("analysis.tokenizer.my_ngramt.token_chars", "letter,digit")
.put("analysis.tokenizer.my_ngramt.type", "ngram")
.put("analysis.analyzer.name_index_analyzer.tokenizer", "my_ngramt")
.put("analysis.analyzer.name2_index_analyzer.tokenizer", "whitespace")
.put("analysis.analyzer.name2_index_analyzer.filter", "my_ngram")
.put("analysis.analyzer.name_search_analyzer.tokenizer", "whitespace")));
client().prepareIndex("test", "test", "1")
.setSource("name", "logicacmg ehemals avinci - the know how company",
"name2", "logicacmg ehemals avinci - the know how company").get();
refresh();
ensureGreen();
SearchResponse search = client().prepareSearch().setQuery(matchQuery("name", "logica m")).addHighlightedField("name").get();
assertHighlight(search, 0, "name", 0, equalTo("<em>logica</em>c<em>m</em>g ehe<em>m</em>als avinci - the know how co<em>m</em>pany"));
search = client().prepareSearch().setQuery(matchQuery("name", "logica ma")).addHighlightedField("name").get();
assertHighlight(search, 0, "name", 0, equalTo("<em>logica</em>cmg ehe<em>ma</em>ls avinci - the know how company"));
search = client().prepareSearch().setQuery(matchQuery("name", "logica")).addHighlightedField("name").get();
assertHighlight(search, 0, "name", 0, equalTo("<em>logica</em>cmg ehemals avinci - the know how company"));
search = client().prepareSearch().setQuery(matchQuery("name2", "logica m")).addHighlightedField("name2").get();
assertHighlight(search, 0, "name2", 0, equalTo("<em>logicacmg</em> <em>ehemals</em> avinci - the know how <em>company</em>"));
search = client().prepareSearch().setQuery(matchQuery("name2", "logica ma")).addHighlightedField("name2").get();
assertHighlight(search, 0, "name2", 0, equalTo("<em>logicacmg</em> <em>ehemals</em> avinci - the know how company"));
search = client().prepareSearch().setQuery(matchQuery("name2", "logica")).addHighlightedField("name2").get();
assertHighlight(search, 0, "name2", 0, equalTo("<em>logicacmg</em> ehemals avinci - the know how company"));
}
@Test
public void testEnsureNoNegativeOffsets() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1",
"no_long_term", "type=string,term_vector=with_positions_offsets",
"long_term", "type=string,term_vector=with_positions_offsets"));
ensureYellow();
client().prepareIndex("test", "type1", "1")
.setSource("no_long_term", "This is a test where foo is highlighed and should be highlighted",
"long_term", "This is a test thisisaverylongwordandmakessurethisfails where foo is highlighed and should be highlighted")
.get();
refresh();
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("long_term", "thisisaverylongwordandmakessurethisfails foo highlighed"))
.addHighlightedField("long_term", 18, 1)
.get();
assertHighlight(search, 0, "long_term", 0, 1, equalTo("<em>thisisaverylongwordandmakessurethisfails</em>"));
search = client().prepareSearch()
.setQuery(matchQuery("no_long_term", "test foo highlighed").type(Type.PHRASE).slop(3))
.addHighlightedField("no_long_term", 18, 1).setHighlighterPostTags("</b>").setHighlighterPreTags("<b>")
.get();
assertNotHighlighted(search, 0, "no_long_term");
search = client().prepareSearch()
.setQuery(matchQuery("no_long_term", "test foo highlighed").type(Type.PHRASE).slop(3))
.addHighlightedField("no_long_term", 30, 1).setHighlighterPostTags("</b>").setHighlighterPreTags("<b>")
.get();
assertHighlight(search, 0, "no_long_term", 0, 1, equalTo("a <b>test</b> where <b>foo</b> is <b>highlighed</b> and"));
}
@Test
public void testSourceLookupHighlightingUsingPlainHighlighter() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
// we don't store title and don't use term vector, now let's see if it works...
.startObject("title").field("type", "string").field("store", "no").field("term_vector", "no").endObject()
.startObject("attachments").startObject("properties").startObject("body").field("type", "string").field("store", "no").field("term_vector", "no").endObject().endObject().endObject()
.endObject().endObject().endObject()));
ensureYellow();
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[5];
for (int i = 0; i < indexRequestBuilders.length; i++) {
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i))
.setSource(XContentFactory.jsonBuilder().startObject()
.field("title", "This is a test on the highlighting bug present in elasticsearch")
.startArray("attachments").startObject().field("body", "attachment 1").endObject().startObject().field("body", "attachment 2").endObject().endArray()
.endObject());
}
indexRandom(true, indexRequestBuilders);
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "bug"))
.addHighlightedField("title", -1, 0)
.get();
for (int i = 0; i < indexRequestBuilders.length; i++) {
assertHighlight(search, i, "title", 0, equalTo("This is a test on the highlighting <em>bug</em> present in elasticsearch"));
}
search = client().prepareSearch()
.setQuery(matchQuery("attachments.body", "attachment"))
.addHighlightedField("attachments.body", -1, 0)
.get();
for (int i = 0; i < indexRequestBuilders.length; i++) {
assertHighlight(search, i, "attachments.body", 0, equalTo("<em>attachment</em> 1"));
assertHighlight(search, i, "attachments.body", 1, equalTo("<em>attachment</em> 2"));
}
}
@Test
public void testSourceLookupHighlightingUsingFastVectorHighlighter() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
// we don't store title, now let's see if it works...
.startObject("title").field("type", "string").field("store", "no").field("term_vector", "with_positions_offsets").endObject()
.startObject("attachments").startObject("properties").startObject("body").field("type", "string").field("store", "no").field("term_vector", "with_positions_offsets").endObject().endObject().endObject()
.endObject().endObject().endObject()));
ensureYellow();
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[5];
for (int i = 0; i < indexRequestBuilders.length; i++) {
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i))
.setSource(XContentFactory.jsonBuilder().startObject()
.field("title", "This is a test on the highlighting bug present in elasticsearch")
.startArray("attachments").startObject().field("body", "attachment 1").endObject().startObject().field("body", "attachment 2").endObject().endArray()
.endObject());
}
indexRandom(true, indexRequestBuilders);
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "bug"))
.addHighlightedField("title", -1, 0)
.get();
for (int i = 0; i < indexRequestBuilders.length; i++) {
assertHighlight(search, i, "title", 0, equalTo("This is a test on the highlighting <em>bug</em> present in elasticsearch"));
}
search = client().prepareSearch()
.setQuery(matchQuery("attachments.body", "attachment"))
.addHighlightedField("attachments.body", -1, 2)
.execute().get();
for (int i = 0; i < 5; i++) {
assertHighlight(search, i, "attachments.body", 0, equalTo("<em>attachment</em> 1"));
assertHighlight(search, i, "attachments.body", 1, equalTo("<em>attachment</em> 2"));
}
}
@Test
public void testSourceLookupHighlightingUsingPostingsHighlighter() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
// we don't store title, now let's see if it works...
.startObject("title").field("type", "string").field("store", "no").field("index_options", "offsets").endObject()
.startObject("attachments").startObject("properties").startObject("body").field("type", "string").field("store", "no").field("index_options", "offsets").endObject().endObject().endObject()
.endObject().endObject().endObject()));
ensureYellow();
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[5];
for (int i = 0; i < indexRequestBuilders.length; i++) {
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i))
.setSource(XContentFactory.jsonBuilder().startObject()
.array("title", "This is a test on the highlighting bug present in elasticsearch. Hopefully it works.",
"This is the second bug to perform highlighting on.")
.startArray("attachments").startObject().field("body", "attachment for this test").endObject().startObject().field("body", "attachment 2").endObject().endArray()
.endObject());
}
indexRandom(true, indexRequestBuilders);
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "bug"))
//asking for the whole field to be highlighted
.addHighlightedField("title", -1, 0).get();
for (int i = 0; i < indexRequestBuilders.length; i++) {
assertHighlight(search, i, "title", 0, equalTo("This is a test on the highlighting <em>bug</em> present in elasticsearch. Hopefully it works."));
assertHighlight(search, i, "title", 1, 2, equalTo("This is the second <em>bug</em> to perform highlighting on."));
}
search = client().prepareSearch()
.setQuery(matchQuery("title", "bug"))
//sentences will be generated out of each value
.addHighlightedField("title").get();
for (int i = 0; i < indexRequestBuilders.length; i++) {
assertHighlight(search, i, "title", 0, equalTo("This is a test on the highlighting <em>bug</em> present in elasticsearch."));
assertHighlight(search, i, "title", 1, 2, equalTo("This is the second <em>bug</em> to perform highlighting on."));
}
search = client().prepareSearch()
.setQuery(matchQuery("attachments.body", "attachment"))
.addHighlightedField("attachments.body", -1, 2)
.get();
for (int i = 0; i < indexRequestBuilders.length; i++) {
assertHighlight(search, i, "attachments.body", 0, equalTo("<em>attachment</em> for this test"));
assertHighlight(search, i, "attachments.body", 1, 2, equalTo("<em>attachment</em> 2"));
}
}
@Test
public void testHighlightIssue1994() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=no", "titleTV", "type=string,store=no,term_vector=with_positions_offsets"));
ensureYellow();
indexRandom(false, client().prepareIndex("test", "type1", "1")
.setSource("title", new String[]{"This is a test on the highlighting bug present in elasticsearch", "The bug is bugging us"},
"titleTV", new String[]{"This is a test on the highlighting bug present in elasticsearch", "The bug is bugging us"}));
indexRandom(true, client().prepareIndex("test", "type1", "2")
.setSource("titleTV", new String[]{"some text to highlight", "highlight other text"}));
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "bug"))
.addHighlightedField("title", -1, 2)
.addHighlightedField("titleTV", -1, 2).setHighlighterRequireFieldMatch(false)
.get();
assertHighlight(search, 0, "title", 0, equalTo("This is a test on the highlighting <em>bug</em> present in elasticsearch"));
assertHighlight(search, 0, "title", 1, 2, equalTo("The <em>bug</em> is bugging us"));
assertHighlight(search, 0, "titleTV", 0, equalTo("This is a test on the highlighting <em>bug</em> present in elasticsearch"));
assertHighlight(search, 0, "titleTV", 1, 2, equalTo("The <em>bug</em> is bugging us"));
search = client().prepareSearch()
.setQuery(matchQuery("titleTV", "highlight"))
.addHighlightedField("titleTV", -1, 2)
.get();
assertHighlight(search, 0, "titleTV", 0, equalTo("some text to <em>highlight</em>"));
assertHighlight(search, 0, "titleTV", 1, 2, equalTo("<em>highlight</em> other text"));
}
@Test
public void testGlobalHighlightingSettingsOverriddenAtFieldLevel() {
createIndex("test");
ensureGreen();
client().prepareIndex("test", "type1")
.setSource("field1", new String[]{"this is a test", "this is the second test"},
"field2", new String[]{"this is another test", "yet another test"}).get();
refresh();
logger.info("--> highlighting and searching on field1 and field2 produces different tags");
SearchSourceBuilder source = searchSource()
.query(termQuery("field1", "test"))
.highlight(highlight().order("score").preTags("<global>").postTags("</global>").fragmentSize(1).numOfFragments(1)
.field(new Field("field1").numOfFragments(2))
.field(new Field("field2").preTags("<field2>").postTags("</field2>").fragmentSize(50).requireFieldMatch(false)));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field1", 0, 2, equalTo(" <global>test</global>"));
assertHighlight(searchResponse, 0, "field1", 1, 2, equalTo(" <global>test</global>"));
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("this is another <field2>test</field2>"));
}
@Test // https://github.com/elasticsearch/elasticsearch/issues/5175
public void testHighlightingOnWildcardFields() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1",
"field-postings", "type=string,index_options=offsets",
"field-fvh", "type=string,term_vector=with_positions_offsets",
"field-plain", "type=string"));
ensureGreen();
client().prepareIndex("test", "type1")
.setSource("field-postings", "This is the first test sentence. Here is the second one.",
"field-fvh", "This is the test with term_vectors",
"field-plain", "This is the test for the plain highlighter").get();
refresh();
logger.info("--> highlighting and searching on field*");
SearchSourceBuilder source = searchSource()
//postings hl doesn't support require_field_match, its field needs to be queried directly
.query(termQuery("field-postings", "test"))
.highlight(highlight().field("field*").preTags("<xxx>").postTags("</xxx>").requireFieldMatch(false));
SearchResponse searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field-postings", 0, 1, equalTo("This is the first <xxx>test</xxx> sentence."));
assertHighlight(searchResponse, 0, "field-fvh", 0, 1, equalTo("This is the <xxx>test</xxx> with term_vectors"));
assertHighlight(searchResponse, 0, "field-plain", 0, 1, equalTo("This is the <xxx>test</xxx> for the plain highlighter"));
}
@Test
public void testForceSourceWithSourceDisabled() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1")
.startObject("_source").field("enabled", false).endObject()
.startObject("properties")
.startObject("field1").field("type", "string").field("store", "yes").field("index_options", "offsets")
.field("term_vector", "with_positions_offsets").endObject()
.endObject().endObject().endObject()));
ensureGreen();
client().prepareIndex("test", "type1")
.setSource("field1", "The quick brown fox jumps over the lazy dog", "field2", "second field content").get();
refresh();
// works using the stored field
SearchResponse searchResponse = client().prepareSearch("test")
.setQuery(termQuery("field1", "quick"))
.addHighlightedField(new Field("field1").preTags("<xxx>").postTags("</xxx>"))
.get();
assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("The <xxx>quick</xxx> brown fox jumps over the lazy dog"));
assertFailures(client().prepareSearch("test")
.setQuery(termQuery("field1", "quick"))
.addHighlightedField(new Field("field1").preTags("<xxx>").postTags("</xxx>").highlighterType("plain").forceSource(true)),
RestStatus.BAD_REQUEST,
containsString("source is forced for fields [field1] but type [type1] has disabled _source"));
assertFailures(client().prepareSearch("test")
.setQuery(termQuery("field1", "quick"))
.addHighlightedField(new Field("field1").preTags("<xxx>").postTags("</xxx>").highlighterType("fvh").forceSource(true)),
RestStatus.BAD_REQUEST,
containsString("source is forced for fields [field1] but type [type1] has disabled _source"));
assertFailures(client().prepareSearch("test")
.setQuery(termQuery("field1", "quick"))
.addHighlightedField(new Field("field1").preTags("<xxx>").postTags("</xxx>").highlighterType("postings").forceSource(true)),
RestStatus.BAD_REQUEST,
containsString("source is forced for fields [field1] but type [type1] has disabled _source"));
SearchSourceBuilder searchSource = SearchSourceBuilder.searchSource().query(termQuery("field1", "quick"))
.highlight(highlight().forceSource(true).field("field1"));
assertFailures(client().prepareSearch("test").setSource(searchSource.buildAsBytes()),
RestStatus.BAD_REQUEST,
containsString("source is forced for fields [field1] but type [type1] has disabled _source"));
searchSource = SearchSourceBuilder.searchSource().query(termQuery("field1", "quick"))
.highlight(highlight().forceSource(true).field("field*"));
assertFailures(client().prepareSearch("test").setSource(searchSource.buildAsBytes()),
RestStatus.BAD_REQUEST,
matches("source is forced for fields \\[field\\d, field\\d\\] but type \\[type1\\] has disabled _source"));
}
@Test
public void testPlainHighlighter() throws Exception {
createIndex("test");
ensureGreen();
client().prepareIndex("test", "type1")
.setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog").get();
refresh();
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource()
.query(termQuery("field1", "test"))
.highlight(highlight().field("field1").order("score").preTags("<xxx>").postTags("</xxx>"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("this is a <xxx>test</xxx>"));
logger.info("--> searching on _all, highlighting on field1");
source = searchSource()
.query(termQuery("_all", "test"))
.highlight(highlight().field("field1").order("score").preTags("<xxx>").postTags("</xxx>").requireFieldMatch(false));
searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("this is a <xxx>test</xxx>"));
logger.info("--> searching on _all, highlighting on field2");
source = searchSource()
.query(termQuery("_all", "quick"))
.highlight(highlight().field("field2").order("score").preTags("<xxx>").postTags("</xxx>").requireFieldMatch(false));
searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <xxx>quick</xxx> brown fox jumps over the lazy dog"));
logger.info("--> searching on _all with prefix query, highlighting on field2");
source = searchSource()
.query(prefixQuery("_all", "qui"))
.highlight(highlight().field("field2").order("score").preTags("<xxx>").postTags("</xxx>").requireFieldMatch(false));
searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <xxx>quick</xxx> brown fox jumps over the lazy dog"));
logger.info("--> searching on _all with constant score, highlighting on field2");
source = searchSource()
.query(constantScoreQuery(prefixQuery("_all", "qui")))
.highlight(highlight().field("field2").order("score").preTags("<xxx>").postTags("</xxx>").requireFieldMatch(false));
searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <xxx>quick</xxx> brown fox jumps over the lazy dog"));
logger.info("--> searching on _all with constant score inside a bool query, highlighting on field2");
source = searchSource()
.query(boolQuery().should(constantScoreQuery(prefixQuery("_all", "qui"))))
.highlight(highlight().field("field2").order("score").preTags("<xxx>").postTags("</xxx>").requireFieldMatch(false));
searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <xxx>quick</xxx> brown fox jumps over the lazy dog"));
}
@Test
public void testFastVectorHighlighter() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
ensureGreen();
indexRandom(true, client().prepareIndex("test", "type1")
.setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog"));
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource()
.query(termQuery("field1", "test"))
.highlight(highlight().field("field1", 100, 0).order("score").preTags("<xxx>").postTags("</xxx>"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("this is a <xxx>test</xxx>"));
logger.info("--> searching on _all, highlighting on field1");
source = searchSource()
.query(termQuery("_all", "test"))
.highlight(highlight().field("field1", 100, 0).order("score").preTags("<xxx>").postTags("</xxx>").requireFieldMatch(false));
searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("this is a <xxx>test</xxx>"));
logger.info("--> searching on _all, highlighting on field2");
source = searchSource()
.query(termQuery("_all", "quick"))
.highlight(highlight().field("field2", 100, 0).order("score").preTags("<xxx>").postTags("</xxx>").requireFieldMatch(false));
searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <xxx>quick</xxx> brown fox jumps over the lazy dog"));
logger.info("--> searching on _all with prefix query, highlighting on field2");
source = searchSource()
.query(prefixQuery("_all", "qui"))
.highlight(highlight().field("field2", 100, 0).order("score").preTags("<xxx>").postTags("</xxx>").requireFieldMatch(false));
searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <xxx>quick</xxx> brown fox jumps over the lazy dog"));
logger.info("--> searching with boundary characters");
source = searchSource()
.query(matchQuery("field2", "quick"))
.highlight(highlight().field("field2", 30, 1).boundaryChars(new char[] {' '}));
searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <em>quick</em> brown fox jumps over"));
logger.info("--> searching with boundary characters on field");
source = searchSource()
.query(matchQuery("field2", "quick"))
.highlight(highlight().field(new Field("field2").fragmentSize(30).numOfFragments(1).boundaryChars(new char[] {' '})));
searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <em>quick</em> brown fox jumps over"));
}
/**
* The FVH can spend a long time highlighting degenerate documents if phraseLimit is not set.
*/
@Test(timeout=120000)
public void testFVHManyMatches() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
ensureGreen();
// Index one megabyte of "t " over and over and over again
client().prepareIndex("test", "type1")
.setSource("field1", Joiner.on("").join(Iterables.limit(Iterables.cycle("t "), 1024*256))).get();
refresh();
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource()
.query(termQuery("field1", "t"))
.highlight(highlight().highlighterType("fvh").field("field1", 20, 1).order("score").preTags("<xxx>").postTags("</xxx>"));
SearchResponse searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field1", 0, 1, containsString("<xxx>t</xxx>"));
logger.info("--> done");
}
@Test
public void testMatchedFieldsFvhRequireFieldMatch() throws Exception {
checkMatchedFieldsCase(true);
}
@Test
public void testMatchedFieldsFvhNoRequireFieldMatch() throws Exception {
checkMatchedFieldsCase(false);
}
private void checkMatchedFieldsCase(boolean requireFieldMatch) throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties")
.startObject("foo")
.field("type", "multi_field")
.startObject("fields")
.startObject("foo")
.field("type", "string")
.field("termVector", "with_positions_offsets")
.field("store", "yes")
.field("analyzer", "english")
.endObject()
.startObject("plain")
.field("type", "string")
.field("termVector", "with_positions_offsets")
.field("analyzer", "standard")
.endObject()
.endObject()
.endObject()
.startObject("bar")
.field("type", "multi_field")
.startObject("fields")
.startObject("bar")
.field("type", "string")
.field("termVector", "with_positions_offsets")
.field("store", "yes")
.field("analyzer", "english")
.endObject()
.startObject("plain")
.field("type", "string")
.field("termVector", "with_positions_offsets")
.field("analyzer", "standard")
.endObject()
.endObject()
.endObject()
.endObject()));
ensureGreen();
index("test", "type1", "1",
"foo", "running with scissors");
index("test", "type1", "2",
"foo", "cat cat junk junk junk junk junk junk junk cats junk junk",
"bar", "cat cat junk junk junk junk junk junk junk cats junk junk");
index("test", "type1", "3",
"foo", "weird",
"bar", "result");
refresh();
Field fooField = new Field("foo").numOfFragments(1).order("score").fragmentSize(25)
.highlighterType("fvh").requireFieldMatch(requireFieldMatch);
Field barField = new Field("bar").numOfFragments(1).order("score").fragmentSize(25)
.highlighterType("fvh").requireFieldMatch(requireFieldMatch);
SearchRequestBuilder req = client().prepareSearch("test").addHighlightedField(fooField);
// First check highlighting without any matched fields set
SearchResponse resp = req.setQuery(queryStringQuery("running scissors").field("foo")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("<em>running</em> with <em>scissors</em>"));
// And that matching a subfield doesn't automatically highlight it
resp = req.setQuery(queryStringQuery("foo.plain:running scissors").field("foo")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("running with <em>scissors</em>"));
// Add the subfield to the list of matched fields but don't match it. Everything should still work
// like before we added it.
fooField.matchedFields("foo", "foo.plain");
resp = req.setQuery(queryStringQuery("running scissors").field("foo")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("<em>running</em> with <em>scissors</em>"));
// Now make half the matches come from the stored field and half from just a matched field.
resp = req.setQuery(queryStringQuery("foo.plain:running scissors").field("foo")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("<em>running</em> with <em>scissors</em>"));
// Now remove the stored field from the matched field list. That should work too.
fooField.matchedFields("foo.plain");
resp = req.setQuery(queryStringQuery("foo.plain:running scissors").field("foo")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("<em>running</em> with scissors"));
// Now make sure boosted fields don't blow up when matched fields is both the subfield and stored field.
fooField.matchedFields("foo", "foo.plain");
resp = req.setQuery(queryStringQuery("foo.plain:running^5 scissors").field("foo")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("<em>running</em> with <em>scissors</em>"));
// Now just all matches are against the matched field. This still returns highlighting.
resp = req.setQuery(queryStringQuery("foo.plain:running foo.plain:scissors").field("foo")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("<em>running</em> with <em>scissors</em>"));
// And the same when the match comes in via the queryString's field parameter, just in case
resp = req.setQuery(queryStringQuery("running scissors").field("foo.plain")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("<em>running</em> with <em>scissors</em>"));
// Finding the same string two ways is ok too
resp = req.setQuery(queryStringQuery("run foo.plain:running^5 scissors").field("foo")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("<em>running</em> with <em>scissors</em>"));
// But we use the best found score when sorting fragments
resp = req.setQuery(queryStringQuery("cats foo.plain:cats^5").field("foo")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("junk junk <em>cats</em> junk junk"));
// which can also be written by searching on the subfield
resp = req.setQuery(queryStringQuery("cats").field("foo").field("foo.plain^5")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("junk junk <em>cats</em> junk junk"));
// You can also have two fields, only one of which has matchedFields enabled
QueryBuilder twoFieldsQuery = queryStringQuery("cats").field("foo").field("foo.plain^5")
.field("bar").field("bar.plain^5");
resp = req.setQuery(twoFieldsQuery).addHighlightedField(barField).get();
assertHighlight(resp, 0, "foo", 0, equalTo("junk junk <em>cats</em> junk junk"));
assertHighlight(resp, 0, "bar", 0, equalTo("<em>cat</em> <em>cat</em> junk junk junk junk"));
// And you can enable matchedField highlighting on both
barField.matchedFields("bar", "bar.plain");
resp = req.get();
assertHighlight(resp, 0, "foo", 0, equalTo("junk junk <em>cats</em> junk junk"));
assertHighlight(resp, 0, "bar", 0, equalTo("junk junk <em>cats</em> junk junk"));
// Setting a matchedField that isn't searched/doesn't exist is simply ignored.
barField.matchedFields("bar", "candy");
resp = req.get();
assertHighlight(resp, 0, "foo", 0, equalTo("junk junk <em>cats</em> junk junk"));
assertHighlight(resp, 0, "bar", 0, equalTo("<em>cat</em> <em>cat</em> junk junk junk junk"));
// If the stored field doesn't have a value it doesn't matter what you match, you get nothing.
barField.matchedFields("bar", "foo.plain");
resp = req.setQuery(queryStringQuery("running scissors").field("foo.plain").field("bar")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("<em>running</em> with <em>scissors</em>"));
assertThat(resp.getHits().getAt(0).getHighlightFields(), not(hasKey("bar")));
// If the stored field is found but the matched field isn't then you don't get a result either.
fooField.matchedFields("bar.plain");
resp = req.setQuery(queryStringQuery("running scissors").field("foo").field("foo.plain").field("bar").field("bar.plain")).get();
assertThat(resp.getHits().getAt(0).getHighlightFields(), not(hasKey("foo")));
// But if you add the stored field to the list of matched fields then you'll get a result again
fooField.matchedFields("foo", "bar.plain");
resp = req.setQuery(queryStringQuery("running scissors").field("foo").field("foo.plain").field("bar").field("bar.plain")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("<em>running</em> with <em>scissors</em>"));
assertThat(resp.getHits().getAt(0).getHighlightFields(), not(hasKey("bar")));
// You _can_ highlight fields that aren't subfields of one another.
resp = req.setQuery(queryStringQuery("weird").field("foo").field("foo.plain").field("bar").field("bar.plain")).get();
assertHighlight(resp, 0, "foo", 0, equalTo("<em>weird</em>"));
assertHighlight(resp, 0, "bar", 0, equalTo("<em>resul</em>t"));
assertFailures(req.setQuery(queryStringQuery("result").field("foo").field("foo.plain").field("bar").field("bar.plain")),
RestStatus.INTERNAL_SERVER_ERROR, containsString("IndexOutOfBoundsException"));
}
@Test
public void testFastVectorHighlighterManyDocs() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
ensureGreen();
int COUNT = between(20, 100);
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[COUNT];
for (int i = 0; i < COUNT; i++) {
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i)).setSource("field1", "test " + i);
}
logger.info("--> indexing docs");
indexRandom(true, indexRequestBuilders);
logger.info("--> searching explicitly on field1 and highlighting on it");
SearchResponse searchResponse = client().prepareSearch()
.setSize(COUNT)
.setQuery(termQuery("field1", "test"))
.addHighlightedField("field1", 100, 0)
.get();
for (int i = 0; i < COUNT; i++) {
SearchHit hit = searchResponse.getHits().getHits()[i];
// LUCENE 3.1 UPGRADE: Caused adding the space at the end...
assertHighlight(searchResponse, i, "field1", 0, 1, equalTo("<em>test</em> " + hit.id()));
}
logger.info("--> searching explicitly _all and highlighting on _all");
searchResponse = client().prepareSearch()
.setSize(COUNT)
.setQuery(termQuery("_all", "test"))
.addHighlightedField("_all", 100, 0)
.get();
for (int i = 0; i < COUNT; i++) {
SearchHit hit = searchResponse.getHits().getHits()[i];
assertHighlight(searchResponse, i, "_all", 0, 1, equalTo("<em>test</em> " + hit.id() + " "));
}
}
public XContentBuilder type1TermVectorMapping() throws IOException {
return XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("_all").field("store", "yes").field("termVector", "with_positions_offsets").endObject()
.startObject("properties")
.startObject("field1").field("type", "string").field("termVector", "with_positions_offsets").endObject()
.startObject("field2").field("type", "string").field("termVector", "with_positions_offsets").endObject()
.endObject()
.endObject().endObject();
}
@Test
public void testSameContent() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=yes,term_vector=with_positions_offsets"));
ensureYellow();
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[5];
for (int i = 0; i < 5; i++) {
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i))
.setSource("title", "This is a test on the highlighting bug present in elasticsearch");
}
indexRandom(true, indexRequestBuilders);
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "bug"))
.addHighlightedField("title", -1, 0)
.get();
for (int i = 0; i < 5; i++) {
assertHighlight(search, i, "title", 0, 1, equalTo("This is a test on the highlighting <em>bug</em> present in elasticsearch"));
}
}
@Test
public void testFastVectorHighlighterOffsetParameter() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=yes,term_vector=with_positions_offsets").get());
ensureYellow();
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[5];
for (int i = 0; i < 5; i++) {
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i))
.setSource("title", "This is a test on the highlighting bug present in elasticsearch");
}
indexRandom(true, indexRequestBuilders);
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "bug"))
.addHighlightedField("title", 30, 1, 10)
.get();
for (int i = 0; i < 5; i++) {
// LUCENE 3.1 UPGRADE: Caused adding the space at the end...
assertHighlight(search, i, "title", 0, 1, equalTo("highlighting <em>bug</em> present in elasticsearch"));
}
}
@Test
public void testEscapeHtml() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=yes"));
ensureYellow();
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[5];
for (int i = 0; i < indexRequestBuilders.length; i++) {
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i))
.setSource("title", "This is a html escaping highlighting test for *&? elasticsearch");
}
indexRandom(true, indexRequestBuilders);
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "test"))
.setHighlighterEncoder("html")
.addHighlightedField("title", 50, 1, 10)
.get();
for (int i = 0; i < indexRequestBuilders.length; i++) {
assertHighlight(search, i, "title", 0, 1, equalTo("This is a html escaping highlighting <em>test</em> for *&amp;? elasticsearch"));
}
}
@Test
public void testEscapeHtml_vector() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=yes,term_vector=with_positions_offsets"));
ensureYellow();
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[5];
for (int i = 0; i < 5; i++) {
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i))
.setSource("title", "This is a html escaping highlighting test for *&? elasticsearch");
}
indexRandom(true, indexRequestBuilders);
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "test"))
.setHighlighterEncoder("html")
.addHighlightedField("title", 30, 1, 10)
.get();
for (int i = 0; i < 5; i++) {
assertHighlight(search, i, "title", 0, 1, equalTo("highlighting <em>test</em> for *&amp;? elasticsearch"));
}
}
@Test
public void testMultiMapperVectorWithStore() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
.startObject("title").field("type", "multi_field").startObject("fields")
.startObject("title").field("type", "string").field("store", "yes").field("term_vector", "with_positions_offsets").field("analyzer", "classic").endObject()
.startObject("key").field("type", "string").field("store", "yes").field("term_vector", "with_positions_offsets").field("analyzer", "whitespace").endObject()
.endObject().endObject()
.endObject().endObject().endObject()));
ensureGreen();
client().prepareIndex("test", "type1", "1").setSource("title", "this is a test").get();
refresh();
// simple match query on title, highlighted via the title field
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "this is a test"))
.setHighlighterEncoder("html")
.addHighlightedField("title", 50, 1)
.get();
assertHighlight(search, 0, "title", 0, 1, equalTo("this is a <em>test</em>"));
// search on title.key and highlight on title.key
search = client().prepareSearch()
.setQuery(matchQuery("title.key", "this is a test"))
.setHighlighterEncoder("html")
.addHighlightedField("title.key", 50, 1)
.get();
assertHighlight(search, 0, "title.key", 0, 1, equalTo("<em>this</em> <em>is</em> <em>a</em> <em>test</em>"));
}
@Test
public void testMultiMapperVectorFromSource() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
.startObject("title").field("type", "multi_field").startObject("fields")
.startObject("title").field("type", "string").field("store", "no").field("term_vector", "with_positions_offsets").field("analyzer", "classic").endObject()
.startObject("key").field("type", "string").field("store", "no").field("term_vector", "with_positions_offsets").field("analyzer", "whitespace").endObject()
.endObject().endObject()
.endObject().endObject().endObject()));
ensureGreen();
client().prepareIndex("test", "type1", "1").setSource("title", "this is a test").get();
refresh();
// simple match query on title, highlighted via the title field
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "this is a test"))
.setHighlighterEncoder("html")
.addHighlightedField("title", 50, 1)
.get();
assertHighlight(search, 0, "title", 0, 1, equalTo("this is a <em>test</em>"));
// search on title.key and highlight on title.key
search = client().prepareSearch()
.setQuery(matchQuery("title.key", "this is a test"))
.setHighlighterEncoder("html")
.addHighlightedField("title.key", 50, 1)
.get();
assertHighlight(search, 0, "title.key", 0, 1, equalTo("<em>this</em> <em>is</em> <em>a</em> <em>test</em>"));
}
@Test
public void testMultiMapperNoVectorWithStore() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
.startObject("title").field("type", "multi_field").startObject("fields")
.startObject("title").field("type", "string").field("store", "yes").field("term_vector", "no").field("analyzer", "classic").endObject()
.startObject("key").field("type", "string").field("store", "yes").field("term_vector", "no").field("analyzer", "whitespace").endObject()
.endObject().endObject()
.endObject().endObject().endObject()));
ensureGreen();
client().prepareIndex("test", "type1", "1").setSource("title", "this is a test").get();
refresh();
// simple match query on title, highlighted via the title field
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "this is a test"))
.setHighlighterEncoder("html")
.addHighlightedField("title", 50, 1)
.get();
assertHighlight(search, 0, "title", 0, 1, equalTo("this is a <em>test</em>"));
// search on title.key and highlight on title.key
search = client().prepareSearch()
.setQuery(matchQuery("title.key", "this is a test"))
.setHighlighterEncoder("html")
.addHighlightedField("title.key", 50, 1)
.get();
assertHighlight(search, 0, "title.key", 0, 1, equalTo("<em>this</em> <em>is</em> <em>a</em> <em>test</em>"));
}
@Test
public void testMultiMapperNoVectorFromSource() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
.startObject("title").field("type", "multi_field").startObject("fields")
.startObject("title").field("type", "string").field("store", "no").field("term_vector", "no").field("analyzer", "classic").endObject()
.startObject("key").field("type", "string").field("store", "no").field("term_vector", "no").field("analyzer", "whitespace").endObject()
.endObject().endObject()
.endObject().endObject().endObject()));
ensureGreen();
client().prepareIndex("test", "type1", "1").setSource("title", "this is a test").get();
refresh();
// simple match query on title, highlighted via the title field
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "this is a test"))
.setHighlighterEncoder("html")
.addHighlightedField("title", 50, 1)
.get();
assertHighlight(search, 0, "title", 0, 1, equalTo("this is a <em>test</em>"));
// search on title.key and highlight on title.key
search = client().prepareSearch()
.setQuery(matchQuery("title.key", "this is a test"))
.setHighlighterEncoder("html")
.addHighlightedField("title.key", 50, 1)
.get();
assertHighlight(search, 0, "title.key", 0, 1, equalTo("<em>this</em> <em>is</em> <em>a</em> <em>test</em>"));
}
@Test
public void testFastVectorHighlighterShouldFailIfNoTermVectors() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=yes,term_vector=no"));
ensureGreen();
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[5];
for (int i = 0; i < 5; i++) {
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i))
.setSource("title", "This is a test for the enabling fast vector highlighter");
}
indexRandom(true, indexRequestBuilders);
SearchResponse search = client().prepareSearch()
.setQuery(matchPhraseQuery("title", "this is a test"))
.addHighlightedField("title", 50, 1, 10)
.get();
assertNoFailures(search);
assertFailures(client().prepareSearch()
.setQuery(matchPhraseQuery("title", "this is a test"))
.addHighlightedField("title", 50, 1, 10)
.setHighlighterType("fast-vector-highlighter"),
RestStatus.BAD_REQUEST,
containsString("the field [title] should be indexed with term vector with position offsets to be used with fast vector highlighter"));
// should not fail if the highlighted field name contains a wildcard
assertNoFailures(client().prepareSearch()
.setQuery(matchPhraseQuery("title", "this is a test"))
.addHighlightedField("tit*", 50, 1, 10)
.setHighlighterType("fast-vector-highlighter").get());
}
@Test
public void testDisableFastVectorHighlighter() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string,store=yes,term_vector=with_positions_offsets,analyzer=classic"));
ensureGreen();
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[5];
for (int i = 0; i < indexRequestBuilders.length; i++) {
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i))
.setSource("title", "This is a test for the workaround for the fast vector highlighting SOLR-3724");
}
indexRandom(true, indexRequestBuilders);
SearchResponse search = client().prepareSearch()
.setQuery(matchPhraseQuery("title", "test for the workaround"))
.addHighlightedField("title", 50, 1, 10)
.get();
for (int i = 0; i < indexRequestBuilders.length; i++) {
// Because of SOLR-3724 nothing is highlighted when FVH is used
assertNotHighlighted(search, i, "title");
}
// Using plain highlighter instead of FVH
search = client().prepareSearch()
.setQuery(matchPhraseQuery("title", "test for the workaround"))
.addHighlightedField("title", 50, 1, 10)
.setHighlighterType("highlighter")
.get();
for (int i = 0; i < indexRequestBuilders.length; i++) {
assertHighlight(search, i, "title", 0, 1, equalTo("This is a <em>test</em> for the <em>workaround</em> for the fast vector highlighting SOLR-3724"));
}
// Using plain highlighter instead of FVH on the field level
search = client().prepareSearch()
.setQuery(matchPhraseQuery("title", "test for the workaround"))
.addHighlightedField(new HighlightBuilder.Field("title").highlighterType("highlighter"))
.setHighlighterType("highlighter")
.get();
for (int i = 0; i < indexRequestBuilders.length; i++) {
assertHighlight(search, i, "title", 0, 1, equalTo("This is a <em>test</em> for the <em>workaround</em> for the fast vector highlighting SOLR-3724"));
}
}
@Test
public void testFSHHighlightAllMvFragments() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", "tags", "type=string,term_vector=with_positions_offsets"));
ensureGreen();
client().prepareIndex("test", "type1", "1")
.setSource("tags", new String[]{
"this is a really long tag i would like to highlight",
"here is another one that is very long and has the tag token near the end"}).get();
refresh();
SearchResponse response = client().prepareSearch("test")
.setQuery(QueryBuilders.matchQuery("tags", "tag"))
.addHighlightedField("tags", -1, 0).get();
assertHighlight(response, 0, "tags", 0, equalTo("this is a really long <em>tag</em> i would like to highlight"));
assertHighlight(response, 0, "tags", 1, 2, equalTo("here is another one that is very long and has the <em>tag</em> token near the end"));
}
@Test
public void testBoostingQuery() {
createIndex("test");
ensureGreen();
client().prepareIndex("test", "type1")
.setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog").get();
refresh();
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource()
.query(boostingQuery().positive(termQuery("field2", "brown")).negative(termQuery("field2", "foobar")).negativeBoost(0.5f))
.highlight(highlight().field("field2").order("score").preTags("<x>").postTags("</x>"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The quick <x>brown</x> fox jumps over the lazy dog"));
}
@Test
@AwaitsFix(bugUrl="Broken now that BoostingQuery does not extend BooleanQuery anymore")
public void testBoostingQueryTermVector() throws IOException {
assertAcked(prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog")
.get();
refresh();
logger.info("--> highlighting and searching on field2");
SearchSourceBuilder source = searchSource()
.query(boostingQuery().positive(termQuery("field2", "brown")).negative(termQuery("field2", "foobar")).negativeBoost(0.5f))
.highlight(highlight().field("field2").order("score").preTags("<x>").postTags("</x>"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The quick <x>brown</x> fox jumps over the lazy dog"));
}
@Test
public void testCommonTermsQuery() {
createIndex("test");
ensureGreen();
client().prepareIndex("test", "type1")
.setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog")
.get();
refresh();
logger.info("--> highlighting and searching on field2");
SearchSourceBuilder source = searchSource()
.query(commonTermsQuery("field2", "quick brown").cutoffFrequency(100))
.highlight(highlight().field("field2").order("score").preTags("<x>").postTags("</x>"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <x>quick</x> <x>brown</x> fox jumps over the lazy dog"));
}
@Test
public void testCommonTermsTermVector() throws IOException {
assertAcked(prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog").get();
refresh();
logger.info("--> highlighting and searching on field2");
SearchSourceBuilder source = searchSource().query(commonTermsQuery("field2", "quick brown").cutoffFrequency(100))
.highlight(highlight().field("field2").order("score").preTags("<x>").postTags("</x>"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <x>quick</x> <x>brown</x> fox jumps over the lazy dog"));
}
@Test
public void testPhrasePrefix() throws IOException {
Builder builder = settingsBuilder()
.put(indexSettings())
.put("index.analysis.analyzer.synonym.tokenizer", "whitespace")
.putArray("index.analysis.analyzer.synonym.filter", "synonym", "lowercase")
.put("index.analysis.filter.synonym.type", "synonym")
.putArray("index.analysis.filter.synonym.synonyms", "quick => fast");
assertAcked(prepareCreate("test").setSettings(builder.build()).addMapping("type1", type1TermVectorMapping())
.addMapping("type2", "_all", "store=yes,termVector=with_positions_offsets",
"field4", "type=string,term_vector=with_positions_offsets,analyzer=synonym",
"field3", "type=string,analyzer=synonym"));
ensureGreen();
client().prepareIndex("test", "type1", "0")
.setSource("field0", "The quick brown fox jumps over the lazy dog", "field1", "The quick brown fox jumps over the lazy dog").get();
client().prepareIndex("test", "type1", "1")
.setSource("field1", "The quick browse button is a fancy thing, right bro?").get();
refresh();
logger.info("--> highlighting and searching on field0");
SearchSourceBuilder source = searchSource()
.query(matchPhrasePrefixQuery("field0", "quick bro"))
.highlight(highlight().field("field0").order("score").preTags("<x>").postTags("</x>"));
SearchResponse searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field0", 0, 1, equalTo("The <x>quick</x> <x>brown</x> fox jumps over the lazy dog"));
logger.info("--> highlighting and searching on field1");
source = searchSource()
.query(matchPhrasePrefixQuery("field1", "quick bro"))
.highlight(highlight().field("field1").order("score").preTags("<x>").postTags("</x>"));
searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field1", 0, 1, anyOf(equalTo("The <x>quick browse</x> button is a fancy thing, right bro?"), equalTo("The <x>quick brown</x> fox jumps over the lazy dog")));
assertHighlight(searchResponse, 1, "field1", 0, 1, anyOf(equalTo("The <x>quick browse</x> button is a fancy thing, right bro?"), equalTo("The <x>quick brown</x> fox jumps over the lazy dog")));
// with synonyms
client().prepareIndex("test", "type2", "0")
.setSource("field4", "The quick brown fox jumps over the lazy dog", "field3", "The quick brown fox jumps over the lazy dog").get();
client().prepareIndex("test", "type2", "1")
.setSource("field4", "The quick browse button is a fancy thing, right bro?").get();
client().prepareIndex("test", "type2", "2")
.setSource("field4", "a quick fast blue car").get();
refresh();
source = searchSource().postFilter(typeQuery("type2")).query(matchPhrasePrefixQuery("field3", "fast bro"))
.highlight(highlight().field("field3").order("score").preTags("<x>").postTags("</x>"));
searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field3", 0, 1, equalTo("The <x>quick</x> <x>brown</x> fox jumps over the lazy dog"));
logger.info("--> highlighting and searching on field4");
source = searchSource().postFilter(typeQuery("type2")).query(matchPhrasePrefixQuery("field4", "the fast bro"))
.highlight(highlight().field("field4").order("score").preTags("<x>").postTags("</x>"));
searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field4", 0, 1, anyOf(equalTo("<x>The quick browse</x> button is a fancy thing, right bro?"), equalTo("<x>The quick brown</x> fox jumps over the lazy dog")));
assertHighlight(searchResponse, 1, "field4", 0, 1, anyOf(equalTo("<x>The quick browse</x> button is a fancy thing, right bro?"), equalTo("<x>The quick brown</x> fox jumps over the lazy dog")));
logger.info("--> highlighting and searching on field4");
source = searchSource().postFilter(typeQuery("type2")).query(matchPhrasePrefixQuery("field4", "a fast quick blue ca"))
.highlight(highlight().field("field4").order("score").preTags("<x>").postTags("</x>"));
searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field4", 0, 1, equalTo("<x>a quick fast blue car</x>"));
}
@Test
public void testPlainHighlightDifferentFragmenter() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", "tags", "type=string"));
ensureGreen();
client().prepareIndex("test", "type1", "1")
.setSource(jsonBuilder().startObject().field("tags",
"this is a really long tag i would like to highlight",
"here is another one that is very long tag and has the tag token near the end").endObject()).get();
refresh();
SearchResponse response = client().prepareSearch("test")
.setQuery(QueryBuilders.matchQuery("tags", "long tag").type(MatchQueryBuilder.Type.PHRASE))
.addHighlightedField(new HighlightBuilder.Field("tags")
.fragmentSize(-1).numOfFragments(2).fragmenter("simple")).get();
assertHighlight(response, 0, "tags", 0, equalTo("this is a really <em>long</em> <em>tag</em> i would like to highlight"));
assertHighlight(response, 0, "tags", 1, 2, equalTo("here is another one that is very <em>long</em> <em>tag</em> and has the tag token near the end"));
response = client().prepareSearch("test")
.setQuery(QueryBuilders.matchQuery("tags", "long tag").type(MatchQueryBuilder.Type.PHRASE))
.addHighlightedField(new HighlightBuilder.Field("tags")
.fragmentSize(-1).numOfFragments(2).fragmenter("span")).get();
assertHighlight(response, 0, "tags", 0, equalTo("this is a really <em>long</em> <em>tag</em> i would like to highlight"));
assertHighlight(response, 0, "tags", 1, 2, equalTo("here is another one that is very <em>long</em> <em>tag</em> and has the tag token near the end"));
assertFailures(client().prepareSearch("test")
.setQuery(QueryBuilders.matchQuery("tags", "long tag").type(MatchQueryBuilder.Type.PHRASE))
.addHighlightedField(new HighlightBuilder.Field("tags")
.fragmentSize(-1).numOfFragments(2).fragmenter("invalid")),
RestStatus.BAD_REQUEST,
containsString("unknown fragmenter option [invalid] for the field [tags]"));
}
@Test
public void testPlainHighlighterMultipleFields() {
createIndex("test");
ensureGreen();
index("test", "type1", "1", "field1", "The <b>quick<b> brown fox", "field2", "The <b>slow<b> brown fox");
refresh();
SearchResponse response = client().prepareSearch("test")
.setQuery(QueryBuilders.matchQuery("field1", "fox"))
.addHighlightedField(new HighlightBuilder.Field("field1").preTags("<1>").postTags("</1>").requireFieldMatch(true))
.addHighlightedField(new HighlightBuilder.Field("field2").preTags("<2>").postTags("</2>").requireFieldMatch(false))
.get();
assertHighlight(response, 0, "field1", 0, 1, equalTo("The <b>quick<b> brown <1>fox</1>"));
assertHighlight(response, 0, "field2", 0, 1, equalTo("The <b>slow<b> brown <2>fox</2>"));
}
@Test
public void testFastVectorHighlighterMultipleFields() {
assertAcked(prepareCreate("test")
.addMapping("type1", "field1", "type=string,term_vector=with_positions_offsets", "field2", "type=string,term_vector=with_positions_offsets"));
ensureGreen();
index("test", "type1", "1", "field1", "The <b>quick<b> brown fox", "field2", "The <b>slow<b> brown fox");
refresh();
SearchResponse response = client().prepareSearch("test")
.setQuery(QueryBuilders.matchQuery("field1", "fox"))
.addHighlightedField(new HighlightBuilder.Field("field1").preTags("<1>").postTags("</1>").requireFieldMatch(true))
.addHighlightedField(new HighlightBuilder.Field("field2").preTags("<2>").postTags("</2>").requireFieldMatch(false))
.get();
assertHighlight(response, 0, "field1", 0, 1, equalTo("The <b>quick<b> brown <1>fox</1>"));
assertHighlight(response, 0, "field2", 0, 1, equalTo("The <b>slow<b> brown <2>fox</2>"));
}
@Test
public void testMissingStoredField() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", "highlight_field", "type=string,store=yes"));
ensureGreen();
client().prepareIndex("test", "type1", "1")
.setSource(jsonBuilder().startObject()
.field("field", "highlight")
.endObject()).get();
refresh();
// This query used to fail when the field to highlight was absent
SearchResponse response = client().prepareSearch("test")
.setQuery(QueryBuilders.matchQuery("field", "highlight").type(MatchQueryBuilder.Type.BOOLEAN))
.addHighlightedField(new HighlightBuilder.Field("highlight_field")
.fragmentSize(-1).numOfFragments(1).fragmenter("simple")).get();
assertThat(response.getHits().hits()[0].highlightFields().isEmpty(), equalTo(true));
}
@Test
// https://github.com/elasticsearch/elasticsearch/issues/3211
public void testNumericHighlighting() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("test", "text", "type=string,index=analyzed",
"byte", "type=byte", "short", "type=short", "int", "type=integer", "long", "type=long",
"float", "type=float", "double", "type=double"));
ensureGreen();
client().prepareIndex("test", "test", "1").setSource("text", "elasticsearch test",
"byte", 25, "short", 42, "int", 100, "long", -1, "float", 3.2f, "double", 42.42).get();
refresh();
SearchResponse response = client().prepareSearch("test")
.setQuery(QueryBuilders.matchQuery("text", "test").type(MatchQueryBuilder.Type.BOOLEAN))
.addHighlightedField("text")
.addHighlightedField("byte")
.addHighlightedField("short")
.addHighlightedField("int")
.addHighlightedField("long")
.addHighlightedField("float")
.addHighlightedField("double")
.get();
// Highlighting of numeric fields is not supported, but it should not raise errors
// (this behavior is consistent with version 0.20)
assertHitCount(response, 1L);
}
@Test
// https://github.com/elasticsearch/elasticsearch/issues/3200
public void testResetTwice() throws Exception {
assertAcked(prepareCreate("test")
.setSettings(settingsBuilder()
.put(indexSettings())
.put("analysis.analyzer.my_analyzer.type", "pattern")
.put("analysis.analyzer.my_analyzer.pattern", "\\s+")
.build())
.addMapping("type", "text", "type=string,analyzer=my_analyzer"));
ensureGreen();
client().prepareIndex("test", "type", "1")
.setSource("text", "elasticsearch test").get();
refresh();
SearchResponse response = client().prepareSearch("test")
.setQuery(QueryBuilders.matchQuery("text", "test").type(MatchQueryBuilder.Type.BOOLEAN))
.addHighlightedField("text").execute().actionGet();
// PatternAnalyzer will throw an exception if it is reset twice
assertHitCount(response, 1L);
}
@Test
public void testHighlightUsesHighlightQuery() throws IOException {
assertAcked(prepareCreate("test")
.addMapping("type1", "text", "type=string," + randomStoreField() + "term_vector=with_positions_offsets,index_options=offsets"));
ensureGreen();
index("test", "type1", "1", "text", "Testing the highlight query feature");
refresh();
HighlightBuilder.Field field = new HighlightBuilder.Field("text");
SearchRequestBuilder search = client().prepareSearch("test").setQuery(QueryBuilders.matchQuery("text", "testing"))
.addHighlightedField(field);
Matcher<String> searchQueryMatcher = equalTo("<em>Testing</em> the highlight query feature");
field.highlighterType("plain");
SearchResponse response = search.get();
assertHighlight(response, 0, "text", 0, searchQueryMatcher);
field.highlighterType("fvh");
response = search.get();
assertHighlight(response, 0, "text", 0, searchQueryMatcher);
field.highlighterType("postings");
response = search.get();
assertHighlight(response, 0, "text", 0, searchQueryMatcher);
Matcher<String> hlQueryMatcher = equalTo("Testing the highlight <em>query</em> feature");
field.highlightQuery(matchQuery("text", "query"));
field.highlighterType("fvh");
response = search.get();
assertHighlight(response, 0, "text", 0, hlQueryMatcher);
field.highlighterType("plain");
response = search.get();
assertHighlight(response, 0, "text", 0, hlQueryMatcher);
field.highlighterType("postings");
response = search.get();
assertHighlight(response, 0, "text", 0, hlQueryMatcher);
// Make sure the highlightQuery is taken into account when it is set on the highlight context instead of the field
search.setHighlighterQuery(matchQuery("text", "query"));
field.highlighterType("fvh").highlightQuery(null);
response = search.get();
assertHighlight(response, 0, "text", 0, hlQueryMatcher);
field.highlighterType("plain");
response = search.get();
assertHighlight(response, 0, "text", 0, hlQueryMatcher);
field.highlighterType("postings");
response = search.get();
assertHighlight(response, 0, "text", 0, hlQueryMatcher);
}
// Randomly store the field so both the stored-field and _source highlighting paths get exercised
private static String randomStoreField() {
if (randomBoolean()) {
return "store=yes,";
}
return "";
}
@Test
public void testHighlightNoMatchSize() throws IOException {
assertAcked(prepareCreate("test")
.addMapping("type1", "text", "type=string," + randomStoreField() + "term_vector=with_positions_offsets,index_options=offsets"));
ensureGreen();
String text = "I am pretty long so some of me should get cut off. Second sentence";
index("test", "type1", "1", "text", text);
refresh();
// When you don't set noMatchSize you don't get any results if there isn't anything to highlight.
HighlightBuilder.Field field = new HighlightBuilder.Field("text")
.fragmentSize(21)
.numOfFragments(1)
.highlighterType("plain");
SearchResponse response = client().prepareSearch("test").addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("fvh");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("postings");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
// When noMatchSize is set to 0 you also shouldn't get any
field.highlighterType("plain").noMatchSize(0);
response = client().prepareSearch("test").addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("fvh");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("postings");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
// When noMatchSize is between 0 and the size of the string
field.highlighterType("plain").noMatchSize(21);
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("I am pretty long so"));
// The FVH also works, but its fragment is longer than the plain highlighter's because of boundary_max_scan
field.highlighterType("fvh");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("I am pretty long so some"));
// Postings hl also works but the fragment is the whole first sentence (size ignored)
field.highlighterType("postings");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("I am pretty long so some of me should get cut off."));
// We can also ask for a fragment longer than the input string and get the whole string
field.highlighterType("plain").noMatchSize(text.length() * 2);
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo(text));
field.highlighterType("fvh");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo(text));
//no difference using postings hl as the noMatchSize is ignored (just needs to be greater than 0)
field.highlighterType("postings");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("I am pretty long so some of me should get cut off."));
// We can also ask for a fragment exactly the size of the input field and get the whole field
field.highlighterType("plain").noMatchSize(text.length());
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo(text));
field.highlighterType("fvh");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo(text));
//no difference using postings hl as the noMatchSize is ignored (just needs to be greater than 0)
field.highlighterType("postings");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("I am pretty long so some of me should get cut off."));
// You can set noMatchSize globally in the highlighter as well
field.highlighterType("plain").noMatchSize(null);
response = client().prepareSearch("test").setHighlighterNoMatchSize(21).addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("I am pretty long so"));
field.highlighterType("fvh");
response = client().prepareSearch("test").setHighlighterNoMatchSize(21).addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("I am pretty long so some"));
field.highlighterType("postings");
response = client().prepareSearch("test").setHighlighterNoMatchSize(21).addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("I am pretty long so some of me should get cut off."));
// We don't break if noMatchSize is less than zero though
field.highlighterType("plain").noMatchSize(randomIntBetween(Integer.MIN_VALUE, -1));
response = client().prepareSearch("test").addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("fvh");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("postings");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
}
@Test
public void testHighlightNoMatchSizeWithMultivaluedFields() throws IOException {
assertAcked(prepareCreate("test")
.addMapping("type1", "text", "type=string," + randomStoreField() + "term_vector=with_positions_offsets,index_options=offsets"));
ensureGreen();
String text1 = "I am pretty long so some of me should get cut off. We'll see how that goes.";
String text2 = "I am short";
index("test", "type1", "1", "text", new String[] {text1, text2});
refresh();
// The no match fragment should come from the first value of a multi-valued field
HighlightBuilder.Field field = new HighlightBuilder.Field("text")
.fragmentSize(21)
.numOfFragments(1)
.highlighterType("plain")
.noMatchSize(21);
SearchResponse response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("I am pretty long so"));
field.highlighterType("fvh");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("I am pretty long so some"));
// Postings hl also works but the fragment is the whole first sentence (size ignored)
field.highlighterType("postings");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("I am pretty long so some of me should get cut off."));
// And noMatchSize returns nothing when the first entry is an empty string!
index("test", "type1", "2", "text", new String[] {"", text2});
refresh();
IdsQueryBuilder idsQueryBuilder = QueryBuilders.idsQuery("type1").addIds("2");
field.highlighterType("plain");
response = client().prepareSearch("test")
.setQuery(idsQueryBuilder)
.addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("fvh");
response = client().prepareSearch("test")
.setQuery(idsQueryBuilder)
.addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("postings");
response = client().prepareSearch("test")
.setQuery(idsQueryBuilder)
.addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
// But if the field was actually empty then you should get no highlighting field
index("test", "type1", "3", "text", new String[] {});
refresh();
idsQueryBuilder = QueryBuilders.idsQuery("type1").addIds("3");
field.highlighterType("plain");
response = client().prepareSearch("test")
.setQuery(idsQueryBuilder)
.addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("fvh");
response = client().prepareSearch("test")
.setQuery(idsQueryBuilder)
.addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("postings");
response = client().prepareSearch("test")
.setQuery(idsQueryBuilder)
.addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
// Same for if the field doesn't even exist on the document
index("test", "type1", "4");
refresh();
idsQueryBuilder = QueryBuilders.idsQuery("type1").addIds("4");
field.highlighterType("plain");
response = client().prepareSearch("test")
.setQuery(idsQueryBuilder)
.addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("fvh");
response = client().prepareSearch("test")
.setQuery(idsQueryBuilder)
.addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("postings");
response = client().prepareSearch("test")
.setQuery(idsQueryBuilder)
.addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
// Again same if the field isn't mapped
field = new HighlightBuilder.Field("unmapped")
.highlighterType("plain")
.noMatchSize(21);
response = client().prepareSearch("test").addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("fvh");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
field.highlighterType("postings");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertNotHighlighted(response, 0, "text");
}
@Test
public void testHighlightNoMatchSizeNumberOfFragments() throws IOException {
assertAcked(prepareCreate("test")
.addMapping("type1", "text", "type=string," + randomStoreField() + "term_vector=with_positions_offsets,index_options=offsets"));
ensureGreen();
String text1 = "This is the first sentence. This is the second sentence." + HighlightUtils.PARAGRAPH_SEPARATOR;
String text2 = "This is the third sentence. This is the fourth sentence.";
String text3 = "This is the fifth sentence";
index("test", "type1", "1", "text", new String[] {text1, text2, text3});
refresh();
// The no match fragment should come from the first value of a multi-valued field
HighlightBuilder.Field field = new HighlightBuilder.Field("text")
.fragmentSize(1)
.numOfFragments(0)
.highlighterType("plain")
.noMatchSize(20);
SearchResponse response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("This is the first"));
field.highlighterType("fvh");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("This is the first sentence"));
// Postings hl also works but the fragment is the whole first sentence (size ignored)
field.highlighterType("postings");
response = client().prepareSearch("test").addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 1, equalTo("This is the first sentence."));
//if there's a match we only return the values with matches (whole value as number_of_fragments == 0)
MatchQueryBuilder queryBuilder = QueryBuilders.matchQuery("text", "third fifth");
field.highlighterType("plain");
response = client().prepareSearch("test").setQuery(queryBuilder).addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 2, equalTo("This is the <em>third</em> sentence. This is the fourth sentence."));
assertHighlight(response, 0, "text", 1, 2, equalTo("This is the <em>fifth</em> sentence"));
field.highlighterType("fvh");
response = client().prepareSearch("test").setQuery(queryBuilder).addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 2, equalTo("This is the <em>third</em> sentence. This is the fourth sentence."));
assertHighlight(response, 0, "text", 1, 2, equalTo("This is the <em>fifth</em> sentence"));
field.highlighterType("postings");
response = client().prepareSearch("test").setQuery(queryBuilder).addHighlightedField(field).get();
assertHighlight(response, 0, "text", 0, 2, equalTo("This is the <em>third</em> sentence. This is the fourth sentence."));
assertHighlight(response, 0, "text", 1, 2, equalTo("This is the <em>fifth</em> sentence"));
}
@Test
public void testPostingsHighlighter() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1")
.setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy quick dog").get();
refresh();
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource()
.query(termQuery("field1", "test"))
.highlight(highlight().field("field1").preTags("<xxx>").postTags("</xxx>"));
SearchResponse searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("this is a <xxx>test</xxx>"));
logger.info("--> searching on field1, highlighting on field1");
source = searchSource()
.query(termQuery("field1", "test"))
.highlight(highlight().field("field1").preTags("<xxx>").postTags("</xxx>"));
searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("this is a <xxx>test</xxx>"));
logger.info("--> searching on field2, highlighting on field2");
source = searchSource()
.query(termQuery("field2", "quick"))
.highlight(highlight().field("field2").order("score").preTags("<xxx>").postTags("</xxx>"));
searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <xxx>quick</xxx> brown fox jumps over the lazy <xxx>quick</xxx> dog"));
logger.info("--> searching on field2, highlighting on field2");
source = searchSource()
.query(matchPhraseQuery("field2", "quick brown"))
.highlight(highlight().field("field2").preTags("<xxx>").postTags("</xxx>"));
searchResponse = client().search(searchRequest("test").source(source)).actionGet();
//phrase query results in highlighting all different terms regardless of their positions
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <xxx>quick</xxx> <xxx>brown</xxx> fox jumps over the lazy <xxx>quick</xxx> dog"));
// let's fall back to the plain highlighter, which is what people would do to highlight query matches
logger.info("--> searching on field2, highlighting on field2, falling back to the plain highlighter");
source = searchSource()
.query(matchPhraseQuery("_all", "quick brown"))
.highlight(highlight().field("field2").preTags("<xxx>").postTags("</xxx>").highlighterType("highlighter").requireFieldMatch(false));
searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <xxx>quick</xxx> <xxx>brown</xxx> fox jumps over the lazy quick dog"));
}
@Test
public void testPostingsHighlighterMultipleFields() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()).get());
ensureGreen();
index("test", "type1", "1", "field1", "The <b>quick<b> brown fox. Second sentence.", "field2", "The <b>slow<b> brown fox. Second sentence.");
refresh();
SearchResponse response = client().prepareSearch("test")
.setQuery(QueryBuilders.matchQuery("field1", "fox"))
.addHighlightedField(new HighlightBuilder.Field("field1").preTags("<1>").postTags("</1>").requireFieldMatch(true))
.get();
assertHighlight(response, 0, "field1", 0, 1, equalTo("The <b>quick<b> brown <1>fox</1>."));
}
@Test
public void testPostingsHighlighterNumberOfFragments() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1", "1")
.setSource("field1", "The quick brown fox jumps over the lazy dog. The lazy red fox jumps over the quick dog. The quick brown dog jumps over the lazy fox.",
"field2", "The quick brown fox jumps over the lazy dog. The lazy red fox jumps over the quick dog. The quick brown dog jumps over the lazy fox.").get();
refresh();
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource()
.query(termQuery("field1", "fox"))
.highlight(highlight()
.field(new HighlightBuilder.Field("field1").numOfFragments(5).preTags("<field1>").postTags("</field1>")));
SearchResponse searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field1", 0, equalTo("The quick brown <field1>fox</field1> jumps over the lazy dog."));
assertHighlight(searchResponse, 0, "field1", 1, equalTo("The lazy red <field1>fox</field1> jumps over the quick dog."));
assertHighlight(searchResponse, 0, "field1", 2, 3, equalTo("The quick brown dog jumps over the lazy <field1>fox</field1>."));
client().prepareIndex("test", "type1", "2")
.setSource("field1", new String[]{"The quick brown fox jumps over the lazy dog. Second sentence not finished", "The lazy red fox jumps over the quick dog.", "The quick brown dog jumps over the lazy fox."}).get();
refresh();
source = searchSource()
.query(termQuery("field1", "fox"))
.highlight(highlight()
.field(new HighlightBuilder.Field("field1").numOfFragments(0).preTags("<field1>").postTags("</field1>")));
searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHitCount(searchResponse, 2L);
for (SearchHit searchHit : searchResponse.getHits()) {
if ("1".equals(searchHit.id())) {
assertHighlight(searchHit, "field1", 0, 1, equalTo("The quick brown <field1>fox</field1> jumps over the lazy dog. The lazy red <field1>fox</field1> jumps over the quick dog. The quick brown dog jumps over the lazy <field1>fox</field1>."));
} else if ("2".equals(searchHit.id())) {
assertHighlight(searchHit, "field1", 0, 3, equalTo("The quick brown <field1>fox</field1> jumps over the lazy dog. Second sentence not finished"));
assertHighlight(searchHit, "field1", 1, 3, equalTo("The lazy red <field1>fox</field1> jumps over the quick dog."));
assertHighlight(searchHit, "field1", 2, 3, equalTo("The quick brown dog jumps over the lazy <field1>fox</field1>."));
} else {
fail("Only hits with id 1 and 2 are returned");
}
}
}
@Test
public void testMultiMatchQueryHighlight() throws IOException {
String[] highlighterTypes = new String[] {"fvh", "plain", "postings"};
XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("_all").field("store", "yes").field("index_options", "offsets").endObject()
.startObject("properties")
.startObject("field1").field("type", "string").field("index_options", "offsets").field("term_vector", "with_positions_offsets").endObject()
.startObject("field2").field("type", "string").field("index_options", "offsets").field("term_vector", "with_positions_offsets").endObject()
.endObject()
.endObject().endObject();
assertAcked(prepareCreate("test").addMapping("type1", mapping));
ensureGreen();
client().prepareIndex("test", "type1")
.setSource("field1", "The quick brown fox jumps over",
"field2", "The quick brown fox jumps over").get();
refresh();
final int iters = scaledRandomIntBetween(20, 30);
for (int i = 0; i < iters; i++) {
String highlighterType = rarely() ? null : RandomPicks.randomFrom(getRandom(), highlighterTypes);
MultiMatchQueryBuilder.Type[] supportedQueryTypes;
if ("postings".equals(highlighterType)) {
//phrase_prefix is not supported by the postings highlighter: it rewrites against an empty reader, so the prefix will never match any term
supportedQueryTypes = new MultiMatchQueryBuilder.Type[]{MultiMatchQueryBuilder.Type.BEST_FIELDS, MultiMatchQueryBuilder.Type.CROSS_FIELDS, MultiMatchQueryBuilder.Type.MOST_FIELDS, MultiMatchQueryBuilder.Type.PHRASE};
} else {
supportedQueryTypes = MultiMatchQueryBuilder.Type.values();
}
MultiMatchQueryBuilder.Type matchQueryType = rarely() ? null : RandomPicks.randomFrom(getRandom(), supportedQueryTypes);
final MultiMatchQueryBuilder multiMatchQueryBuilder = multiMatchQuery("the quick brown fox", "field1", "field2").type(matchQueryType);
SearchSourceBuilder source = searchSource()
.query(multiMatchQueryBuilder)
.highlight(highlight().highlightQuery(randomBoolean() ? multiMatchQueryBuilder : null).highlighterType(highlighterType)
.field(new Field("field1").requireFieldMatch(true).preTags("<field1>").postTags("</field1>")));
logger.info("Running multi-match type: [" + matchQueryType + "] highlight with type: [" + highlighterType + "]");
SearchResponse searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHitCount(searchResponse, 1L);
assertHighlight(searchResponse, 0, "field1", 0, anyOf(equalTo("<field1>The quick brown fox</field1> jumps over"),
equalTo("<field1>The</field1> <field1>quick</field1> <field1>brown</field1> <field1>fox</field1> jumps over")));
}
}
@Test
public void testPostingsHighlighterOrderByScore() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1")
.setSource("field1", new String[]{"This sentence contains one match, not that short. This sentence contains two sentence matches. This one contains no matches.",
"This is the second value's first sentence. This one contains no matches. This sentence contains three sentence occurrences (sentence).",
"One sentence match here and scored lower since the text is quite long, not that appealing. This one contains no matches."}).get();
refresh();
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource()
.query(termQuery("field1", "sentence"))
.highlight(highlight().field("field1").order("score"));
SearchResponse searchResponse = client().search(searchRequest("test").source(source)).actionGet();
Map<String,HighlightField> highlightFieldMap = searchResponse.getHits().getAt(0).highlightFields();
assertThat(highlightFieldMap.size(), equalTo(1));
HighlightField field1 = highlightFieldMap.get("field1");
assertThat(field1.fragments().length, equalTo(5));
assertThat(field1.fragments()[0].string(), equalTo("This <em>sentence</em> contains three <em>sentence</em> occurrences (<em>sentence</em>)."));
assertThat(field1.fragments()[1].string(), equalTo("This <em>sentence</em> contains two <em>sentence</em> matches."));
assertThat(field1.fragments()[2].string(), equalTo("This is the second value's first <em>sentence</em>."));
assertThat(field1.fragments()[3].string(), equalTo("This <em>sentence</em> contains one match, not that short."));
assertThat(field1.fragments()[4].string(), equalTo("One <em>sentence</em> match here and scored lower since the text is quite long, not that appealing."));
}
@Test
public void testPostingsHighlighterEscapeHtml() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", "title", "type=string," + randomStoreField() + "index_options=offsets"));
ensureYellow();
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[5];
for (int i = 0; i < 5; i++) {
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i))
.setSource("title", "This is a html escaping highlighting test for *&? elasticsearch");
}
indexRandom(true, indexRequestBuilders);
SearchResponse searchResponse = client().prepareSearch()
.setQuery(matchQuery("title", "test"))
.setHighlighterEncoder("html")
.addHighlightedField("title").get();
for (int i = 0; i < indexRequestBuilders.length; i++) {
assertHighlight(searchResponse, i, "title", 0, 1, equalTo("This is a html escaping highlighting <em>test</em> for *&amp;? elasticsearch"));
}
}
@Test
public void testPostingsHighlighterMultiMapperWithStore() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1")
.startObject("properties")
.startObject("title").field("type", "multi_field").startObject("fields")
.startObject("title").field("type", "string").field("store", "yes").field("index_options", "offsets").field("analyzer", "classic").endObject()
.startObject("key").field("type", "string").field("store", "yes").field("index_options", "offsets").field("analyzer", "whitespace").endObject()
.endObject().endObject()
.endObject().endObject().endObject()));
ensureGreen();
client().prepareIndex("test", "type1", "1").setSource("title", "this is a test . Second sentence.").get();
refresh();
// simple search on title, analyzed with the classic analyzer, using a simple field query
SearchResponse searchResponse = client().prepareSearch()
//let's make sure we analyze the query and highlight the resulting terms
.setQuery(matchQuery("title", "This is a Test"))
.addHighlightedField("title").get();
assertHitCount(searchResponse, 1L);
SearchHit hit = searchResponse.getHits().getAt(0);
//stopwords are not highlighted since they are not indexed
assertHighlight(hit, "title", 0, 1, equalTo("this is a <em>test</em> ."));
// search on title.key and highlight on title
searchResponse = client().prepareSearch()
.setQuery(matchQuery("title.key", "this is a test"))
.addHighlightedField("title.key").get();
assertHitCount(searchResponse, 1L);
//stopwords are now highlighted since we used the whitespace analyzer here
assertHighlight(searchResponse, 0, "title.key", 0, 1, equalTo("<em>this</em> <em>is</em> <em>a</em> <em>test</em> ."));
}
@Test
public void testPostingsHighlighterMultiMapperFromSource() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
.startObject("title").field("type", "multi_field").startObject("fields")
.startObject("title").field("type", "string").field("store", "no").field("index_options", "offsets").field("analyzer", "classic").endObject()
.startObject("key").field("type", "string").field("store", "no").field("index_options", "offsets").field("analyzer", "whitespace").endObject()
.endObject().endObject()
.endObject().endObject().endObject()));
ensureGreen();
client().prepareIndex("test", "type1", "1").setSource("title", "this is a test").get();
refresh();
// simple search on title with a simple field query
SearchResponse searchResponse = client().prepareSearch()
.setQuery(matchQuery("title", "this is a test"))
.addHighlightedField("title")
.get();
assertHighlight(searchResponse, 0, "title", 0, 1, equalTo("this is a <em>test</em>"));
// search on title.key and highlight on title.key
searchResponse = client().prepareSearch()
.setQuery(matchQuery("title.key", "this is a test"))
.addHighlightedField("title.key").get();
assertHighlight(searchResponse, 0, "title.key", 0, 1, equalTo("<em>this</em> <em>is</em> <em>a</em> <em>test</em>"));
}
@Test
public void testPostingsHighlighterShouldFailIfNoOffsets() throws Exception {
assertAcked(prepareCreate("test")
.addMapping("type1", jsonBuilder().startObject().startObject("type1").startObject("properties")
.startObject("title").field("type", "string").field("store", "yes").field("index_options", "docs").endObject()
.endObject().endObject().endObject()));
ensureGreen();
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[5];
for (int i = 0; i < indexRequestBuilders.length; i++) {
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i))
.setSource("title", "This is a test for the postings highlighter");
}
indexRandom(true, indexRequestBuilders);
SearchResponse search = client().prepareSearch()
.setQuery(matchQuery("title", "this is a test"))
.addHighlightedField("title")
.get();
assertNoFailures(search);
assertFailures(client().prepareSearch()
.setQuery(matchQuery("title", "this is a test"))
.addHighlightedField("title")
.setHighlighterType("postings-highlighter"),
RestStatus.BAD_REQUEST,
containsString("the field [title] should be indexed with positions and offsets in the postings list to be used with postings highlighter"));
assertFailures(client().prepareSearch()
.setQuery(matchQuery("title", "this is a test"))
.addHighlightedField("title")
.setHighlighterType("postings"),
RestStatus.BAD_REQUEST,
containsString("the field [title] should be indexed with positions and offsets in the postings list to be used with postings highlighter"));
//should not fail if there is a wildcard
assertNoFailures(client().prepareSearch()
.setQuery(matchQuery("title", "this is a test"))
.addHighlightedField("tit*")
.setHighlighterType("postings").get());
}
@Test
public void testPostingsHighlighterBoostingQuery() throws IOException {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.")
.get();
refresh();
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource()
.query(boostingQuery().positive(termQuery("field2", "brown")).negative(termQuery("field2", "foobar")).negativeBoost(0.5f))
.highlight(highlight().field("field2").preTags("<x>").postTags("</x>"));
SearchResponse searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The quick <x>brown</x> fox jumps over the lazy dog!"));
}
@Test
public void testPostingsHighlighterCommonTermsQuery() throws IOException {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.").get();
refresh();
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource().query(commonTermsQuery("field2", "quick brown").cutoffFrequency(100))
.highlight(highlight().field("field2").preTags("<x>").postTags("</x>"));
SearchResponse searchResponse = client().search(searchRequest("test").source(source)).actionGet();
assertHitCount(searchResponse, 1L);
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <x>quick</x> <x>brown</x> fox jumps over the lazy dog!"));
}
private static XContentBuilder type1PostingsffsetsMapping() throws IOException {
return XContentFactory.jsonBuilder().startObject().startObject("type1")
.startObject("properties")
.startObject("field1").field("type", "string").field("index_options", "offsets").endObject()
.startObject("field2").field("type", "string").field("index_options", "offsets").endObject()
.endObject()
.endObject().endObject();
}
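// For reference, type1PostingsffsetsMapping() above builds the following
// mapping JSON (reconstructed here from the builder calls):
// {"type1": {"properties": {
//   "field1": {"type": "string", "index_options": "offsets"},
//   "field2": {"type": "string", "index_options": "offsets"}}}}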
@Test
public void testPostingsHighlighterPrefixQuery() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.").get();
refresh();
logger.info("--> highlighting and searching on field2");
SearchSourceBuilder source = searchSource().query(prefixQuery("field2", "qui"))
.highlight(highlight().field("field2"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <em>quick</em> brown fox jumps over the lazy dog!"));
}
@Test
public void testPostingsHighlighterFuzzyQuery() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.").get();
refresh();
logger.info("--> highlighting and searching on field2");
SearchSourceBuilder source = searchSource().query(fuzzyQuery("field2", "quck"))
.highlight(highlight().field("field2"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <em>quick</em> brown fox jumps over the lazy dog!"));
}
@Test
public void testPostingsHighlighterRegexpQuery() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.").get();
refresh();
logger.info("--> highlighting and searching on field2");
SearchSourceBuilder source = searchSource().query(regexpQuery("field2", "qu[a-l]+k"))
.highlight(highlight().field("field2"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <em>quick</em> brown fox jumps over the lazy dog!"));
}
@Test
public void testPostingsHighlighterWildcardQuery() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.").get();
refresh();
logger.info("--> highlighting and searching on field2");
SearchSourceBuilder source = searchSource().query(wildcardQuery("field2", "qui*"))
.highlight(highlight().field("field2"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <em>quick</em> brown fox jumps over the lazy dog!"));
source = searchSource().query(wildcardQuery("field2", "qu*k"))
.highlight(highlight().field("field2"));
searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHitCount(searchResponse, 1L);
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <em>quick</em> brown fox jumps over the lazy dog!"));
}
@Test
public void testPostingsHighlighterTermRangeQuery() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "aaab").get();
refresh();
logger.info("--> highlighting and searching on field2");
SearchSourceBuilder source = searchSource().query(rangeQuery("field2").gte("aaaa").lt("zzzz"))
.highlight(highlight().field("field2"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("<em>aaab</em>"));
}
@Test
public void testPostingsHighlighterQueryString() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "this is a test", "field2", "The quick brown fox jumps over the lazy dog! Second sentence.").get();
refresh();
logger.info("--> highlighting and searching on field2");
SearchSourceBuilder source = searchSource().query(queryStringQuery("qui*").defaultField("field2"))
.highlight(highlight().field("field2"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field2", 0, 1, equalTo("The <em>quick</em> brown fox jumps over the lazy dog!"));
}
@Test
public void testPostingsHighlighterRegexpQueryWithinConstantScoreQuery() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "The photography word will get highlighted").get();
refresh();
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource().query(constantScoreQuery(regexpQuery("field1", "pho[a-z]+")))
.highlight(highlight().field("field1"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("The <em>photography</em> word will get highlighted"));
}
@Test
public void testPostingsHighlighterMultiTermQueryMultipleLevels() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "The photography word will get highlighted").get();
refresh();
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource().query(boolQuery()
.should(constantScoreQuery(QueryBuilders.missingQuery("field1")))
.should(matchQuery("field1", "test"))
.should(filteredQuery(queryStringQuery("field1:photo*"), null)))
.highlight(highlight().field("field1"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("The <em>photography</em> word will get highlighted"));
}
@Test
public void testPostingsHighlighterPrefixQueryWithinBooleanQuery() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "The photography word will get highlighted").get();
refresh();
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource().query(boolQuery().must(prefixQuery("field1", "photo")).should(matchQuery("field1", "test").minimumShouldMatch("0")))
.highlight(highlight().field("field1"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("The <em>photography</em> word will get highlighted"));
}
@Test
public void testPostingsHighlighterQueryStringWithinFilteredQuery() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
client().prepareIndex("test", "type1").setSource("field1", "The photography word will get highlighted").get();
refresh();
logger.info("--> highlighting and searching on field1");
SearchSourceBuilder source = searchSource().query(filteredQuery(queryStringQuery("field1:photo*"), missingQuery("field_null")))
.highlight(highlight().field("field1"));
SearchResponse searchResponse = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(searchResponse, 0, "field1", 0, 1, equalTo("The <em>photography</em> word will get highlighted"));
}
@Test
public void testPostingsHighlighterManyDocs() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
ensureGreen();
int COUNT = between(20, 100);
Map<String, String> prefixes = new HashMap<>(COUNT);
IndexRequestBuilder[] indexRequestBuilders = new IndexRequestBuilder[COUNT];
for (int i = 0; i < COUNT; i++) {
//generating text with word to highlight in a different position
//(https://github.com/elasticsearch/elasticsearch/issues/4103)
String prefix = randomAsciiOfLengthBetween(5, 30);
prefixes.put(String.valueOf(i), prefix);
indexRequestBuilders[i] = client().prepareIndex("test", "type1", Integer.toString(i)).setSource("field1", "Sentence " + prefix
+ " test. Sentence two.");
}
logger.info("--> indexing docs");
indexRandom(true, indexRequestBuilders);
logger.info("--> searching explicitly on field1 and highlighting on it");
SearchRequestBuilder searchRequestBuilder = client().prepareSearch()
.setSize(COUNT)
.setQuery(termQuery("field1", "test"))
.addHighlightedField("field1");
SearchResponse searchResponse = searchRequestBuilder.get();
assertHitCount(searchResponse, (long)COUNT);
assertThat(searchResponse.getHits().hits().length, equalTo(COUNT));
for (SearchHit hit : searchResponse.getHits()) {
String prefix = prefixes.get(hit.id());
assertHighlight(hit, "field1", 0, 1, equalTo("Sentence " + prefix + " <em>test</em>."));
}
}
public void testDoesNotHighlightTypeName() throws Exception {
XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("typename").startObject("properties")
.startObject("foo").field("type", "string")
.field("index_options", "offsets")
.field("term_vector", "with_positions_offsets")
.endObject().endObject().endObject().endObject();
assertAcked(prepareCreate("test").addMapping("typename", mapping));
ensureGreen();
indexRandom(true, client().prepareIndex("test", "typename").setSource("foo", "test typename"));
for (String highlighter: new String[] {"plain", "fvh", "postings"}) {
SearchSourceBuilder source = searchSource().query(matchQuery("foo", "test"))
.highlight(highlight().field("foo").highlighterType(highlighter).requireFieldMatch(false));
SearchResponse response = client().prepareSearch("test").setSource(source.buildAsBytes()).get();
assertHighlight(response, 0, "foo", 0, 1, equalTo("<em>test</em> typename"));
}
}
public void testDoesNotHighlightAliasFilters() throws Exception {
XContentBuilder mapping = XContentFactory.jsonBuilder().startObject().startObject("typename").startObject("properties")
.startObject("foo").field("type", "string")
.field("index_options", "offsets")
.field("term_vector", "with_positions_offsets")
.endObject().endObject().endObject().endObject();
assertAcked(prepareCreate("test").addMapping("typename", mapping));
assertAcked(client().admin().indices().prepareAliases().addAlias("test", "filtered_alias", matchQuery("foo", "japanese")));
ensureGreen();
indexRandom(true, client().prepareIndex("test", "typename").setSource("foo", "test japanese"));
for (String highlighter: new String[] {"plain", "fvh", "postings"}) {
SearchSourceBuilder source = searchSource().query(matchQuery("foo", "test"))
.highlight(highlight().field("foo").highlighterType(highlighter).requireFieldMatch(false));
SearchResponse response = client().prepareSearch("filtered_alias").setTypes("typename").setSource(source.buildAsBytes()).get();
assertHighlight(response, 0, "foo", 0, 1, equalTo("<em>test</em> japanese"));
}
}
@AwaitsFix(bugUrl="Broken now that BoostingQuery does not extend BooleanQuery anymore")
public void testFastVectorHighlighterPhraseBoost() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1TermVectorMapping()));
phraseBoostTestCase("fvh");
}
@Test
public void testPostingsHighlighterPhraseBoost() throws Exception {
assertAcked(prepareCreate("test").addMapping("type1", type1PostingsffsetsMapping()));
phraseBoostTestCase("postings");
}
/**
* Test phrase boosting over normal term matches. Note that this will never pass with the plain highlighter
* because it doesn't support the concept of terms having a different weight based on position.
* @param highlighterType highlighter to test
*/
private void phraseBoostTestCase(String highlighterType) {
ensureGreen();
StringBuilder text = new StringBuilder();
text.append("words words junk junk junk junk junk junk junk junk highlight junk junk junk junk together junk\n");
for (int i = 0; i<10; i++) {
text.append("junk junk junk junk junk junk junk junk junk junk junk junk junk junk junk junk junk junk junk junk\n");
}
text.append("highlight words together\n");
for (int i = 0; i<10; i++) {
text.append("junk junk junk junk junk junk junk junk junk junk junk junk junk junk junk junk junk junk junk junk\n");
}
index("test", "type1", "1", "field1", text.toString());
refresh();
// Match queries
phraseBoostTestCaseForClauses(highlighterType, 100f,
matchQuery("field1", "highlight words together"),
matchPhraseQuery("field1", "highlight words together"));
// Query string with a single field
phraseBoostTestCaseForClauses(highlighterType, 100f,
queryStringQuery("highlight words together").field("field1"),
queryStringQuery("\"highlight words together\"").field("field1").autoGeneratePhraseQueries(true));
// Query string with a single field without dismax
phraseBoostTestCaseForClauses(highlighterType, 100f,
queryStringQuery("highlight words together").field("field1").useDisMax(false),
queryStringQuery("\"highlight words together\"").field("field1").useDisMax(false).autoGeneratePhraseQueries(true));
// Query string with more than one field
phraseBoostTestCaseForClauses(highlighterType, 100f,
queryStringQuery("highlight words together").field("field1").field("field2"),
queryStringQuery("\"highlight words together\"").field("field1").field("field2").autoGeneratePhraseQueries(true));
// Query string boosting the field
phraseBoostTestCaseForClauses(highlighterType, 1f,
queryStringQuery("highlight words together").field("field1"),
queryStringQuery("\"highlight words together\"").field("field1^100").autoGeneratePhraseQueries(true));
}
private <P extends QueryBuilder & BoostableQueryBuilder<?>> void
phraseBoostTestCaseForClauses(String highlighterType, float boost, QueryBuilder terms, P phrase) {
Matcher<String> highlightedMatcher = Matchers.either(containsString("<em>highlight words together</em>")).or(
containsString("<em>highlight</em> <em>words</em> <em>together</em>"));
SearchRequestBuilder search = client().prepareSearch("test").setHighlighterRequireFieldMatch(true)
.setHighlighterOrder("score").setHighlighterType(highlighterType)
.addHighlightedField("field1", 100, 1);
// Try with a bool query
phrase.boost(boost);
SearchResponse response = search.setQuery(boolQuery().must(terms).should(phrase)).get();
assertHighlight(response, 0, "field1", 0, 1, highlightedMatcher);
phrase.boost(1);
// Try with a boosting query
response = search.setQuery(boostingQuery().positive(phrase).negative(terms).boost(boost).negativeBoost(1)).get();
assertHighlight(response, 0, "field1", 0, 1, highlightedMatcher);
// Try with a boosting query using a negative boost
response = search.setQuery(boostingQuery().positive(phrase).negative(terms).boost(1).negativeBoost(1/boost)).get();
assertHighlight(response, 0, "field1", 0, 1, highlightedMatcher);
}
public void testGeoFieldHighlightingWithDifferentHighlighters() throws IOException {
// check that we do not get an exception for geo_point fields in case someone tries to highlight
// it accidentally with a wildcard
// see https://github.com/elastic/elasticsearch/issues/17537
XContentBuilder mappings = jsonBuilder();
mappings.startObject();
mappings.startObject("type")
.startObject("properties")
.startObject("geo_point")
.field("type", "geo_point")
.field("geohash", true)
.endObject()
.startObject("text")
.field("type", "string")
.field("term_vector", "with_positions_offsets_payloads")
.field("index_options", "offsets")
.endObject()
.endObject()
.endObject();
mappings.endObject();
assertAcked(prepareCreate("test")
.addMapping("type", mappings));
ensureYellow();
client().prepareIndex("test", "type", "1")
.setSource(jsonBuilder().startObject().field("text", "Arbitrary text field which will should not cause a failure").endObject())
.get();
refresh();
String highlighterType = randomFrom("plain", "fvh", "postings");
QueryBuilder query = QueryBuilders.boolQuery().should(QueryBuilders.geoBoundingBoxQuery("geo_point")
.bottomLeft(-64.92354174306496, -170.15625)
.topRight(61.10078883158897, 118.47656249999999))
.should(QueryBuilders.termQuery("text", "failure"));
SearchResponse search = client().prepareSearch().setSource(
new SearchSourceBuilder().query(query)
.highlight(new HighlightBuilder().field("*").highlighterType(highlighterType)).buildAsBytes()).get();
assertNoFailures(search);
assertThat(search.getHits().totalHits(), equalTo(1L));
assertThat(search.getHits().getAt(0).highlightFields().get("text").fragments().length, equalTo(1));
}
}
namespace rocksdb {
// Helper class that locks a mutex on construction and unlocks the mutex when
// the destructor of the MutexLock object is invoked.
//
// Typical usage:
//
// void MyClass::MyMethod() {
// MutexLock l(&mu_); // mu_ is an instance variable
// ... some complex code, possibly with multiple return paths ...
// }
class MutexLock {
public:
explicit MutexLock(port::Mutex *mu) : mu_(mu) {
this->mu_->Lock();
}
~MutexLock() { this->mu_->Unlock(); }
private:
port::Mutex *const mu_;
// No copying allowed
MutexLock(const MutexLock&);
void operator=(const MutexLock&);
};
//
// Acquire a ReadLock on the specified RWMutex.
// The Lock will be automatically released when the
// object goes out of scope.
//
class ReadLock {
public:
explicit ReadLock(port::RWMutex *mu) : mu_(mu) {
this->mu_->ReadLock();
}
~ReadLock() { this->mu_->ReadUnlock(); }
private:
port::RWMutex *const mu_;
// No copying allowed
ReadLock(const ReadLock&);
void operator=(const ReadLock&);
};
//
// Automatically unlock a locked mutex when the object is destroyed
//
class ReadUnlock {
public:
explicit ReadUnlock(port::RWMutex *mu) : mu_(mu) { mu->AssertHeld(); }
~ReadUnlock() { mu_->ReadUnlock(); }
private:
port::RWMutex *const mu_;
// No copying allowed
ReadUnlock(const ReadUnlock &) = delete;
ReadUnlock &operator=(const ReadUnlock &) = delete;
};
//
// Acquire a WriteLock on the specified RWMutex.
// The lock will be automatically released when the
// object goes out of scope.
//
class WriteLock {
public:
explicit WriteLock(port::RWMutex *mu) : mu_(mu) {
this->mu_->WriteLock();
}
~WriteLock() { this->mu_->WriteUnlock(); }
private:
port::RWMutex *const mu_;
// No copying allowed
WriteLock(const WriteLock&);
void operator=(const WriteLock&);
};
//
// SpinMutex has very low overhead for low-contention cases. Method names
// are chosen so you can use std::unique_lock or std::lock_guard with it.
//
class SpinMutex {
public:
SpinMutex() : locked_(false) {}
bool try_lock() {
auto currently_locked = locked_.load(std::memory_order_relaxed);
return !currently_locked &&
locked_.compare_exchange_weak(currently_locked, true,
std::memory_order_acquire,
std::memory_order_relaxed);
}
void lock() {
for (size_t tries = 0;; ++tries) {
if (try_lock()) {
// success
break;
}
port::AsmVolatilePause();
if (tries > 100) {
std::this_thread::yield();
}
}
}
void unlock() { locked_.store(false, std::memory_order_release); }
private:
std::atomic<bool> locked_;
};
} // namespace rocksdb
| {
"content_hash": "ccf390b33bcc87d5d015f759ee0e81bd",
"timestamp": "",
"source": "github",
"line_count": 115,
"max_line_length": 77,
"avg_line_length": 24.008695652173913,
"alnum_prop": 0.6236870699022093,
"repo_name": "hkernbach/arangodb",
"id": "18210d2c67a7b0e2bad546e6f3200a207d9225f3",
"size": "3526",
"binary": false,
"copies": "3",
"ref": "refs/heads/devel",
"path": "3rdParty/rocksdb/v5.6.X/util/mutexlock.h",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Ada",
"bytes": "89079"
},
{
"name": "Assembly",
"bytes": "391227"
},
{
"name": "Awk",
"bytes": "7502"
},
{
"name": "Batchfile",
"bytes": "62496"
},
{
"name": "C",
"bytes": "9184899"
},
{
"name": "C#",
"bytes": "96431"
},
{
"name": "C++",
"bytes": "278343201"
},
{
"name": "CMake",
"bytes": "664691"
},
{
"name": "CSS",
"bytes": "650173"
},
{
"name": "CWeb",
"bytes": "174166"
},
{
"name": "Cuda",
"bytes": "52444"
},
{
"name": "DIGITAL Command Language",
"bytes": "259402"
},
{
"name": "Emacs Lisp",
"bytes": "14637"
},
{
"name": "Fortran",
"bytes": "1856"
},
{
"name": "Groovy",
"bytes": "51836"
},
{
"name": "HTML",
"bytes": "2415724"
},
{
"name": "Java",
"bytes": "1048556"
},
{
"name": "JavaScript",
"bytes": "54219725"
},
{
"name": "LLVM",
"bytes": "24019"
},
{
"name": "Lex",
"bytes": "1231"
},
{
"name": "Lua",
"bytes": "17899"
},
{
"name": "M4",
"bytes": "658700"
},
{
"name": "Makefile",
"bytes": "522586"
},
{
"name": "Max",
"bytes": "36857"
},
{
"name": "Module Management System",
"bytes": "1545"
},
{
"name": "NSIS",
"bytes": "42998"
},
{
"name": "Objective-C",
"bytes": "98866"
},
{
"name": "Objective-C++",
"bytes": "2503"
},
{
"name": "PHP",
"bytes": "118092"
},
{
"name": "Pascal",
"bytes": "150599"
},
{
"name": "Perl",
"bytes": "906737"
},
{
"name": "Perl 6",
"bytes": "25883"
},
{
"name": "PowerShell",
"bytes": "20434"
},
{
"name": "Python",
"bytes": "4557865"
},
{
"name": "QMake",
"bytes": "16692"
},
{
"name": "R",
"bytes": "5123"
},
{
"name": "Rebol",
"bytes": "354"
},
{
"name": "Roff",
"bytes": "1089418"
},
{
"name": "Ruby",
"bytes": "1141022"
},
{
"name": "SAS",
"bytes": "1847"
},
{
"name": "Scheme",
"bytes": "10604"
},
{
"name": "Shell",
"bytes": "508528"
},
{
"name": "Swift",
"bytes": "116"
},
{
"name": "Tcl",
"bytes": "1172"
},
{
"name": "TeX",
"bytes": "32117"
},
{
"name": "Visual Basic",
"bytes": "11568"
},
{
"name": "XSLT",
"bytes": "567028"
},
{
"name": "Yacc",
"bytes": "53063"
}
],
"symlink_target": ""
} |
/* TEMPLATE GENERATED TESTCASE FILE
Filename: CWE78_OS_Command_Injection__wchar_t_file_execl_18.c
Label Definition File: CWE78_OS_Command_Injection.strings.label.xml
Template File: sources-sink-18.tmpl.c
*/
/*
* @description
* CWE: 78 OS Command Injection
* BadSource: file Read input from a file
* GoodSource: Fixed string
* Sink: execl
* BadSink : execute command with wexecl
* Flow Variant: 18 Control flow: goto statements
*
* */
#include "std_testcase.h"
#include <wchar.h>
#ifdef _WIN32
#define COMMAND_INT_PATH L"%WINDIR%\\system32\\cmd.exe"
#define COMMAND_INT L"cmd.exe"
#define COMMAND_ARG1 L"/c"
#define COMMAND_ARG2 L"dir "
#define COMMAND_ARG3 data
#else /* NOT _WIN32 */
#include <unistd.h>
#define COMMAND_INT_PATH L"/bin/sh"
#define COMMAND_INT L"sh"
#define COMMAND_ARG1 L"-c"
#define COMMAND_ARG2 L"ls "
#define COMMAND_ARG3 data
#endif
#ifdef _WIN32
#define FILENAME "C:\\temp\\file.txt"
#else
#define FILENAME "/tmp/file.txt"
#endif
#ifdef _WIN32
#include <process.h>
#define EXECL _wexecl
#else /* NOT _WIN32 */
#define EXECL execl
#endif
#ifndef OMITBAD
void CWE78_OS_Command_Injection__wchar_t_file_execl_18_bad()
{
wchar_t * data;
wchar_t dataBuffer[100] = COMMAND_ARG2;
data = dataBuffer;
goto source;
source:
{
/* Read input from a file */
size_t dataLen = wcslen(data);
FILE * pFile;
/* if there is room in data, attempt to read the input from a file */
if (100-dataLen > 1)
{
pFile = fopen(FILENAME, "r");
if (pFile != NULL)
{
/* POTENTIAL FLAW: Read data from a file */
if (fgetws(data+dataLen, (int)(100-dataLen), pFile) == NULL)
{
printLine("fgetws() failed");
/* Restore NUL terminator if fgetws fails */
data[dataLen] = L'\0';
}
fclose(pFile);
}
}
}
/* wexecl - specify the path where the command is located */
/* POTENTIAL FLAW: Execute command without validating input possibly leading to command injection */
EXECL(COMMAND_INT_PATH, COMMAND_INT_PATH, COMMAND_ARG1, COMMAND_ARG3, NULL);
}
#endif /* OMITBAD */
#ifndef OMITGOOD
/* goodG2B() - use goodsource and badsink by reversing the blocks on the goto statement */
static void goodG2B()
{
wchar_t * data;
wchar_t dataBuffer[100] = COMMAND_ARG2;
data = dataBuffer;
goto source;
source:
/* FIX: Append a fixed string to data (not user / external input) */
wcscat(data, L"*.*");
/* wexecl - specify the path where the command is located */
/* POTENTIAL FLAW: Execute command without validating input possibly leading to command injection */
EXECL(COMMAND_INT_PATH, COMMAND_INT_PATH, COMMAND_ARG1, COMMAND_ARG3, NULL);
}
void CWE78_OS_Command_Injection__wchar_t_file_execl_18_good()
{
goodG2B();
}
#endif /* OMITGOOD */
/* Below is the main(). It is only used when building this testcase on
* its own for testing or for building a binary to use in testing binary
* analysis tools. It is not used when compiling all the testcases as one
* application, which is how source code analysis tools are tested.
*/
#ifdef INCLUDEMAIN
int main(int argc, char * argv[])
{
/* seed randomness */
srand( (unsigned)time(NULL) );
#ifndef OMITGOOD
printLine("Calling good()...");
CWE78_OS_Command_Injection__wchar_t_file_execl_18_good();
printLine("Finished good()");
#endif /* OMITGOOD */
#ifndef OMITBAD
printLine("Calling bad()...");
CWE78_OS_Command_Injection__wchar_t_file_execl_18_bad();
printLine("Finished bad()");
#endif /* OMITBAD */
return 0;
}
#endif
| {
"content_hash": "73f8f700be0a1cb8b4dfd65ae942485b",
"timestamp": "",
"source": "github",
"line_count": 135,
"max_line_length": 104,
"avg_line_length": 28.54074074074074,
"alnum_prop": 0.6215935634570464,
"repo_name": "JianpingZeng/xcc",
"id": "590366dc08d708c5114d4c379a4956df26b7f6b8",
"size": "3853",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "xcc/test/juliet/testcases/CWE78_OS_Command_Injection/s07/CWE78_OS_Command_Injection__wchar_t_file_execl_18.c",
"mode": "33188",
"license": "bsd-3-clause",
"language": [],
"symlink_target": ""
} |
/*___Generated_by_IDEA___*/
package com.zhy.android.percent.support;
/* This stub is only used by the IDE. It is NOT the BuildConfig class actually packed into the APK */
public final class BuildConfig {
public final static boolean DEBUG = Boolean.parseBoolean(null);
} | {
"content_hash": "ffbd1797d01effafbb8192ea10cbd985",
"timestamp": "",
"source": "github",
"line_count": 8,
"max_line_length": 101,
"avg_line_length": 34.125,
"alnum_prop": 0.7472527472527473,
"repo_name": "hongyangAndroid/android-percent-support-extend",
"id": "1b0dfba4c12d3891c843cfc4fa7b293cebc21ba9",
"size": "273",
"binary": false,
"copies": "2",
"ref": "refs/heads/master",
"path": "percent-support-extends/src/main/gen/com/zhy/android/percent/support/BuildConfig.java",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Java",
"bytes": "60721"
}
],
"symlink_target": ""
} |
<script>
import Bootlog from "src/client/bootlog.js"
import d3 from "src/external/d3.v5.js"
(async() => {
var currentboot = []
var starttime = performance.now()
await Bootlog.current().db.logs.each(ea => {
if (ea.bootid == lively4currentbootid) {
currentboot.push(ea)
}
})
// console.log("loaded Bootlog in " + (performance.now() - starttime))
if (currentboot.length == 0) {
return "no log for current boot, please enable <b>Preference > keep bootlog</b>"
}
var chart = await lively.create("d3-barchart")
chart.style.width = "1200px"
var offset = currentboot[0].date
var color = d3.scaleOrdinal(d3.schemeCategory10);
var nodeMap = new Map();
var data = currentboot
.map(ea => {
return {
log: ea,
children: [],
label: ea.url.replace(/.*\//,""),
x0: ea.date - ea.time - offset,
x1: ea.date - offset,
}
})
data = _.sortBy(data, d => d.log.date)
data = data.map(d => {
var parentNode = nodeMap.get(d.log.url)
if (parentNode) {
parentNode.children.push(d)
d.parent = parentNode
return null
} else {
nodeMap.set(d.log.url, d)
return d
}
})
.filter(ea => ea)
chart.config({
height(d, defaultValue) {
if (d.log.mode.match(/resolveInstantiate(Dep)?End/)) {
return 0.3 * parseFloat(defaultValue)
}
return defaultValue
},
onclick(d, evt) {
if(evt.shiftKey) {
lively.openInspector(d)
} else {
lively.openBrowser(d.log.url, true)
}
},
color(d) {
return color(d.log.mode)
},
title(d) {
return d.log.mode + " \n" + d.log.url + "\n" + d.log.time.toFixed(2) + "ms"
}
})
chart.setData(data)
chart.updateViz()
async function analysisTable(mode) {
var table = await lively.create("lively-table")
var filtered = currentboot.filter(ea => ea.mode == mode)
var analysis = _.sortBy(filtered, ea => ea.time).reverse().slice(0, 5).map(ea => ({
name: ea.url.replace(lively4url, ""),
time: (ea.time / 1000).toFixed(3)+ "s"}));
analysis.push({name: "total", time: (filtered
.reduce((sum, ea) => sum + ea.time, 0) / 1000).toFixed(3)+ "s"});
table.setFromJSO(analysis)
return <div><h3>Details: {mode}</h3>{table}</div>
}
var transpileTable = await analysisTable("transpiled")
var evaluateTable = await analysisTable("evaluate")
return <div>{transpileTable}{evaluateTable}{chart}</div>
})()
</script> | {
"content_hash": "c16f897f670376e99acecf7942ab479c",
"timestamp": "",
"source": "github",
"line_count": 99,
"max_line_length": 87,
"avg_line_length": 26.1010101010101,
"alnum_prop": 0.5719814241486069,
"repo_name": "LivelyKernel/lively4-core",
"id": "7b16a1f9c9c31eb29fb9853b84f9a571faced153",
"size": "2595",
"binary": false,
"copies": "1",
"ref": "refs/heads/gh-pages",
"path": "demos/visualizations/bootlog.md",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "CSS",
"bytes": "144422"
},
{
"name": "HTML",
"bytes": "589346"
},
{
"name": "JavaScript",
"bytes": "4554338"
}
],
"symlink_target": ""
} |
'use strict';
module.exports = {
up: function (queryInterface, Sequelize) {
return queryInterface.addColumn('Comments',
'accepted',
{ type: Sequelize.BOOLEAN,
defaultValue: false
}
);
},
down: function (queryInterface, Sequelize) {
return queryInterface.removeColumn('Comments', 'accepted');
}
};
| {
"content_hash": "871d43c1329885cdafd98bf1743ae1e7",
"timestamp": "",
"source": "github",
"line_count": 16,
"max_line_length": 63,
"avg_line_length": 31.0625,
"alnum_prop": 0.4426559356136821,
"repo_name": "oscargverdejo/P9_Quiz_CORE",
"id": "9d14a04e393f34678b1f766c02837115ed20c524",
"size": "497",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "migrations/20160526170055-AddAcceptedToCommentsTable.js",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "CSS",
"bytes": "925"
},
{
"name": "HTML",
"bytes": "3731"
},
{
"name": "JavaScript",
"bytes": "7177"
}
],
"symlink_target": ""
} |
<?xml version="1.0" encoding="UTF-8" ?>
<class name="GradientTexture1D" inherits="Texture2D" version="4.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="../class.xsd">
<brief_description>
Gradient-filled texture.
</brief_description>
<description>
GradientTexture1D uses a [Gradient] to fill the texture data. The gradient will be filled from left to right using colors obtained from the gradient. This means the texture does not necessarily represent an exact copy of the gradient, but instead an interpolation of samples obtained from the gradient at fixed steps (see [member width]).
</description>
<tutorials>
</tutorials>
<members>
<member name="gradient" type="Gradient" setter="set_gradient" getter="get_gradient">
The [Gradient] that will be used to fill the texture.
</member>
<member name="use_hdr" type="bool" setter="set_use_hdr" getter="is_using_hdr" default="false">
If [code]true[/code], the generated texture will support high dynamic range ([constant Image.FORMAT_RGBAF] format). This allows for glow effects to work if [member Environment.glow_enabled] is [code]true[/code]. If [code]false[/code], the generated texture will use low dynamic range; overbright colors will be clamped ([constant Image.FORMAT_RGBA8] format).
</member>
<member name="width" type="int" setter="set_width" getter="get_width" default="256">
The number of color samples that will be obtained from the [Gradient].
</member>
</members>
</class>
| {
"content_hash": "dea0a97eb2a7caa577290f30017153fa",
"timestamp": "",
"source": "github",
"line_count": 22,
"max_line_length": 361,
"avg_line_length": 68.54545454545455,
"alnum_prop": 0.7460212201591512,
"repo_name": "Faless/godot",
"id": "a124753a9f8e8938b2259c4aac3559fd28840ed4",
"size": "1508",
"binary": false,
"copies": "2",
"ref": "refs/heads/master",
"path": "doc/classes/GradientTexture1D.xml",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "AIDL",
"bytes": "1633"
},
{
"name": "C",
"bytes": "820090"
},
{
"name": "C#",
"bytes": "954253"
},
{
"name": "C++",
"bytes": "36679303"
},
{
"name": "CMake",
"bytes": "538"
},
{
"name": "GAP",
"bytes": "62"
},
{
"name": "GDScript",
"bytes": "59146"
},
{
"name": "GLSL",
"bytes": "802565"
},
{
"name": "Java",
"bytes": "512268"
},
{
"name": "JavaScript",
"bytes": "184665"
},
{
"name": "Kotlin",
"bytes": "16172"
},
{
"name": "Makefile",
"bytes": "1421"
},
{
"name": "Objective-C",
"bytes": "20550"
},
{
"name": "Objective-C++",
"bytes": "315906"
},
{
"name": "PowerShell",
"bytes": "2713"
},
{
"name": "Python",
"bytes": "427946"
},
{
"name": "Shell",
"bytes": "27482"
}
],
"symlink_target": ""
} |
@implementation YFFootBtn
-(void)setMode:(YFMeBtnMode *)mode
{
_mode = mode;
self.titleLabel.font = [UIFont systemFontOfSize:14];
[self setTitleColor:[UIColor blackColor] forState:UIControlStateNormal];
self.titleLabel.textAlignment = NSTextAlignmentCenter;
[self setBackgroundImage:[UIImage imageNamed:@"mainCellBackground"] forState:UIControlStateNormal];
[self setTitle:mode.name forState:UIControlStateNormal];
[self sd_setImageWithURL:[NSURL URLWithString:mode.icon] forState:UIControlStateNormal];
}
-(void)layoutSubviews
{
[super layoutSubviews];
CGFloat h = self.height * 0.6;
CGFloat margin = self.height * 0.1;
self.imageView.frame = CGRectMake(self.width*0.5 - h*0.5,10, h, h);
self.titleLabel.frame = CGRectMake(0,h+ margin, self.width, self.height - h-margin);
}
@end
| {
"content_hash": "4017177a78bbb52f239d0335e5ae7eb0",
"timestamp": "",
"source": "github",
"line_count": 29,
"max_line_length": 103,
"avg_line_length": 29.862068965517242,
"alnum_prop": 0.7055427251732102,
"repo_name": "waterfa/baisi",
"id": "753fabfa1b269adb465621fa9863ee03c7c60fa5",
"size": "1078",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "百思_复习/百思_复习/classes/我-Me/view/YFFootBtn.m",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "C++",
"bytes": "54687"
},
{
"name": "Objective-C",
"bytes": "1178830"
},
{
"name": "Objective-C++",
"bytes": "124423"
},
{
"name": "Ruby",
"bytes": "235"
},
{
"name": "Shell",
"bytes": "8861"
}
],
"symlink_target": ""
} |
# -*- coding: utf-8 -*-
# Copyright 2013 Takeshi KOMIYA
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import re
import json
import yaml
from glob import glob
from shutil import copyfile, copytree
from testing.common.database import (
Database, SkipIfNotInstalledDecorator
)
try:
from urllib.request import urlopen
except ImportError:
from urllib import urlopen
__all__ = ['Elasticsearch', 'skipIfNotFound']
SEARCH_PATHS = ['/usr/share/elasticsearch']
class Elasticsearch(Database):
DEFAULT_SETTINGS = dict(auto_start=2,
base_dir=None,
elasticsearch_home=None,
pid=None,
port=None,
copy_data_from=None,
boot_timeout=20)
subdirectories = ['data', 'logs']
def initialize(self):
self.elasticsearch_home = self.settings.get('elasticsearch_home')
if self.elasticsearch_home is None:
self.elasticsearch_home = find_elasticsearch_home()
user_config = self.settings.get('elasticsearch_yaml')
elasticsearch_yaml_path = find_elasticsearch_yaml_path(self.elasticsearch_home)
with open(os.path.realpath(elasticsearch_yaml_path)) as fd:
self.elasticsearch_yaml = yaml.safe_load(fd.read()) or {}
self.elasticsearch_yaml['network.host'] = '127.0.0.1'
self.elasticsearch_yaml['http.port'] = self.settings['port']
self.elasticsearch_yaml['path.data'] = os.path.join(self.base_dir, 'data')
self.elasticsearch_yaml['path.logs'] = os.path.join(self.base_dir, 'logs')
self.elasticsearch_yaml['cluster.name'] = generate_cluster_name()
self.elasticsearch_yaml['discovery.zen.ping.multicast.enabled'] = False
if user_config:
for key, value in user_config.items():
self.elasticsearch_yaml[key] = value
def dsn(self, **kwargs):
return {'hosts': ['127.0.0.1:%d' % self.elasticsearch_yaml['http.port']]}
def get_data_directory(self):
return os.path.join(self.base_dir, 'data')
def initialize_database(self):
# copy data files
if self.settings['copy_data_from'] and self.elasticsearch_yaml['cluster.name']:
indexdir = os.listdir(self.settings['copy_data_from'])[0]
os.rename(os.path.join(self.base_dir, 'data', indexdir),
os.path.join(self.base_dir, 'data', self.elasticsearch_yaml['cluster.name']))
# conf directory
for filename in os.listdir(self.elasticsearch_home):
srcpath = os.path.join(self.elasticsearch_home, filename)
destpath = os.path.join(self.base_dir, filename)
if not os.path.exists(destpath):
if filename in ['lib', 'plugins']:
os.symlink(srcpath, destpath)
elif filename == 'conf':
destpath = os.path.join(self.base_dir, 'config')
copytree(srcpath, destpath)
elif os.path.isdir(srcpath):
copytree(srcpath, destpath)
else:
copyfile(srcpath, destpath)
elasticsearch_yaml_path = find_elasticsearch_yaml_path(self.elasticsearch_home)
if not elasticsearch_yaml_path.startswith(self.elasticsearch_home):
destpath = os.path.join(self.base_dir, 'config')
copytree(os.path.dirname(elasticsearch_yaml_path), destpath)
# rewrite elasticsearch.in.sh (for homebrew)
with open(os.path.join(self.base_dir, 'bin', 'elasticsearch.in.sh'), 'r+t') as fd:
body = re.sub('ES_HOME=.*', '', fd.read())
fd.seek(0)
fd.write(body)
def prestart(self):
super(Elasticsearch, self).prestart()
# assign port to elasticsearch
self.elasticsearch_yaml['http.port'] = self.settings['port']
# generate cassandra.yaml
with open(os.path.join(self.base_dir, 'config', 'elasticsearch.yml'), 'wt') as fd:
fd.write(yaml.dump(self.elasticsearch_yaml, default_flow_style=False))
def get_server_commandline(self):
return [os.path.join(self.base_dir, 'bin', 'elasticsearch')]
def is_server_available(self):
try:
url = 'http://127.0.0.1:%d/_cluster/health' % self.elasticsearch_yaml['http.port']
ret = json.loads(urlopen(url).read().decode('utf-8'))
if ret['status'] in ('green', 'yellow'):
return True
else:
return False
except Exception:
return False
class ElasticsearchSkipIfNotInstalledDecorator(SkipIfNotInstalledDecorator):
name = 'Elasticsearch'
def search_server(self):
find_elasticsearch_home() # raise exception if not found
skipIfNotFound = skipIfNotInstalled = ElasticsearchSkipIfNotInstalledDecorator()
def strip_version(dir):
m = re.search(r'(\d+)\.(\d+)\.(\d+)', dir)
if m is None:
return None
else:
return tuple([int(ver) for ver in m.groups()])
def find_elasticsearch_home():
elasticsearch_home = os.environ.get('ES_HOME')
if elasticsearch_home:
elasticsearch_home = os.path.abspath(elasticsearch_home)
if os.path.exists(os.path.join(elasticsearch_home, 'bin', 'elasticsearch')):
return elasticsearch_home
for path in SEARCH_PATHS:
if os.path.exists(os.path.join(path, 'bin', 'elasticsearch')):
return path
# search newest elasticsearch-x.x.x directory
globbed = (glob("/usr/local/*elasticsearch*") +
glob("*elasticsearch*") +
glob("/usr/local/Cellar/elasticsearch/*/libexec"))
elasticsearch_dirs = [os.path.abspath(dir) for dir in globbed if os.path.isdir(dir)]
if elasticsearch_dirs:
return sorted(elasticsearch_dirs, key=strip_version)[-1]
raise RuntimeError("could not find ES_HOME")
def find_elasticsearch_yaml_path(es_home):
for path in (os.path.join(es_home, 'conf', 'elasticsearch.yml'), # ubuntu
os.path.join(es_home, 'config', 'elasticsearch.yml'), # official package
'/etc/elasticsearch/elasticsearch.yml'): # travis
if os.path.exists(path):
return path
raise RuntimeError("could not find elasticsearch.yml")
def generate_cluster_name():
import string
import random
return ''.join([random.choice(string.ascii_letters) for i in range(6)])
| {
"content_hash": "94f3f931a270e562793eb4ad8d9e0ac3",
"timestamp": "",
"source": "github",
"line_count": 186,
"max_line_length": 99,
"avg_line_length": 37.93010752688172,
"alnum_prop": 0.6215450035435861,
"repo_name": "tk0miya/testing.elasticsearch",
"id": "f658fe3df72fd1048c6cad5c16eae1c59468ee1a",
"size": "7055",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "src/testing/elasticsearch.py",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Python",
"bytes": "16459"
}
],
"symlink_target": ""
} |
<?php
include 'config.php';
// RETRIEVE DATA AND UPDATE
define('CHARSET', 'UTF-8');
define('REPLACE_FLAGS', ENT_COMPAT | ENT_XHTML);
$errors = array();
// METHOD USING ROLLBACK
$db = new PDO("mysql:host=" . $dbhost . ";dbname=" . $dbname, $dbuser, $dbpswd, array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,PDO::ATTR_EMULATE_PREPARES => false ));
// Begin the transaction
$db->beginTransaction();
try {
// for each time slot
for($c = 1; $c<289 ; $c++) {
if(isset($_POST[$c])) {
// Update to active
$sql = "UPDATE orario SET attivo = 1 WHERE orario.idorario = ?;";
} else {
// Update to inactive
$sql = "UPDATE orario SET attivo = 0 WHERE orario.idorario = ?;";
}
$stmt = $db->prepare($sql);
$stmt->execute(array($c));
}
// Commit if no exceptions were thrown
$db->commit();
}
// If an exception was thrown
catch(Exception $e){
echo $e->getMessage();
// Roll back the transaction
$db->rollBack();
$errors['DB'] = 'Errore nel database';
}
$db = NULL;
// Send the result message and redirect
session_start();
if (!empty($errors)) {
$_SESSION['sqlerrori'] = implode(", ", $errors);
} else {
$_SESSION['sqlok'] = "OPERAZIONE RIUSCITA";
}
header('Location: ./opzioni.php'); | {
"content_hash": "1a99c40c4f7f6e6b94765d7c1b1ae037",
"timestamp": "",
"source": "github",
"line_count": 56,
"max_line_length": 169,
"avg_line_length": 23.410714285714285,
"alnum_prop": 0.5751334858886347,
"repo_name": "archistico/appuntamenti",
"id": "b343ec2993cf37a8f8f81ba6e1626fac55f0cc47",
"size": "1311",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "modificaorarioambulatorio.php",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "CSS",
"bytes": "1643"
},
{
"name": "PHP",
"bytes": "97538"
}
],
"symlink_target": ""
} |
+++
title = "Frequently asked questions"
weight = 30
+++
{{% toc %}}
## Is Outcome safe to use in extern APIs?
Outcome is specifically designed for use in the public interfaces of multi-million
line codebases. `result`'s layout is hard coded to:
```c++
struct trivially_copyable_result_layout {
union {
value_type value;
error_type error;
};
unsigned int flags;
};
```
... if both `value_type` and `error_type` are `TriviallyCopyable`, otherwise:
```c++
struct non_trivially_copyable_result_layout {
value_type value;
unsigned int flags;
error_type error;
};
```
This is C-compatible if `value_type` and `error_type` are C-compatible. {{% api "std::error_code" %}}
is *probably* C-compatible, but its layout is not standardised (though there is a
normative note in the standard about its layout). Hence Outcome cannot provide a
C macro API for standard Outcome, but we can for [Experimental Outcome]({{< relref "/experimental/c-api" >}}).
## Does Outcome implement over-alignment?
Outcome propagates any over-alignment of the types you supply to it as according
to the layout specified above. Therefore the ordinary alignment and padding rules
for your compiler are used.
## Does Outcome implement the no-fail, strong or basic exception guarantee?
([You can read about the meaning of these guarantees at cppreference.com](https://en.cppreference.com/w/cpp/language/exceptions#Exception_safety))
If for the following operations:
- Construction
- Assignment
- Swap
... the corresponding operation in **all** of `value_type`, `error_type` (and
`exception_type` for `outcome`) is `noexcept(true)`, then `result` and
`outcome`'s operation is `noexcept(true)`. This propagates the no-fail exception
guarantee of the underlying types. Otherwise the basic guarantee applies for all
but Swap, under the same rules as for the `struct` layout type given above e.g.
value would be constructed first, then the flags, then the error. If the error
throws, value and status bits would be as if the failure had not occurred, same
as for aborting the construction of any `struct` type.
It is recognised that these weak guarantees may be unsuitable for some people,
so Outcome implements `swap()` with much stronger guarantees, as one can locally refine,
without too much work, one's own custom classes from `result` and `outcome` implementing
stronger guarantees for construction and assignment using `swap()` as the primitive
building block.
The core ADL discovered implementation of strong guarantee swap is {{% api "strong_swap(bool &all_good, T &a, T &b)" %}}.
This can be overloaded by third party code with custom strong guarantee swap
implementations, same as for `std::swap()`. Because strong guarantee swap may fail
when trying to restore input state during handling of failure to swap, the
`all_good` boolean becomes false if restoration fails, at which point both
results/outcomes get marked as tainted via {{% api "has_lost_consistency()" %}}.
It is **up to you** to check this flag to see if known good state has been lost,
as Outcome never does so on your behalf. The simple solution to avoiding having
to deal with this situation is to always choose your value, error and exception
types to have non-throwing move constructors and move assignments. This causes
the strong swap implementation to no longer be used, as it is no longer required,
and standard swap is used instead.
## Does Outcome have a stable ABI and API?
The layout changed for all trivially copyable types between Outcome v2.1 and v2.2,
as union based storage was introduced. From v2.2 onwards, the layout is not
expected to change again.
If v2.2 proves to be unchanging for 24 months, Outcome's ABI and API will be
formally fixed as **the** v2 interface and written into stone forever. Thereafter
the [ABI compliance checker](https://lvc.github.io/abi-compliance-checker/)
will be run per-commit to ensure Outcome's ABI and API remains stable. This is
currently expected to occur in 2022.
Note that the stable ABI and API guarantee will only apply to standalone
Outcome, not to Boost.Outcome. Boost.Outcome has dependencies on other
parts of Boost which are not stable across releases.
Note also that the types you configure a `result` or `outcome` with also need
to be ABI stable if `result` or `outcome` is to be ABI stable.
## Can I use `result<T, EC>` across DLL/shared object boundaries?
A known problem with using Windows DLLs (and to a smaller extent POSIX shared libraries) is that global
objects may get duplicated: one instance in the executable and one in the DLL. This
behaviour is not incorrect according to the C++ Standard, as the Standard does not
recognize the existence of DLLs or shared libraries. Therefore, program designs that
depend on globals having unique addresses may become compromised when used in a program
using DLLs.
Nothing in Outcome depends on the addresses of globals, plus the guaranteed fixed data
layout (see answer above) means that different versions of Outcome can be used in
different DLLs, and it probably will work okay (it is still not advised that you do that
as that is an ODR violation).
However, one of the most likely candidate for `EC` -- `std::error_code` -- **does** depend
on the addresses of globals for correct functioning.
The standard library is required to implement globally unique addresses for the standard library
provided {{% api "std::error_category" %}} implementations e.g. `std::system_category()`.
User defined error code categories may **not** have unique global addresses, and thus
introduce misoperation.
`boost::system::error_code`, since version 1.69 does offer an *opt-in* guarantee
that it does not depend on the addresses of globals **if** the user defined error code
category *opts-in* to the 64-bit comparison mechanism. This can be seen in the specification of
`error_category::operator==` in
[Boost.System synopsis](https://www.boost.org/doc/libs/1_69_0/libs/system/doc/html/system.html#ref_synopsis).
Alternatively, the `status_code` in [Experimental Outcome]({{< relref "/experimental/differences" >}}),
due to its more modern design, does not suffer from any problems from being used in shared
libraries in any configuration.
## Why two types `result<>` and `outcome<>`, rather than just one?
`result` is the simple, success OR failure type.
`outcome` extends `result` with a third state to transport, conventionally (but not necessarily) some sort of "abort" or "exceptional" state which a function can return to indicate that not only did the operation fail, but it did so *catastrophically* i.e. please abort any attempt to retry the operation.
A perfect alternative to using `outcome` is to throw a C++ exception for the abort code path, and indeed most programs ought to do exactly that instead of using `outcome`. However there are a number of use cases where choosing `outcome` shines:
1. Where C++ exceptions or RTTI is not available, but the ability to fail catastrophically without terminating the program is important.
2. Where deterministic behaviour is required even in the catastrophic failure situation.
3. In unit test suites of code using Outcome it is extremely convenient to accumulate test failures into an `outcome` for later reporting. A similar convenience applies to RPC situations, where C++ exception throws need to be accumulated for reporting back to the initiating endpoint.
4. Where a function is "dual use deterministic" i.e. it can be used deterministically, in which case one switches control flow based on `.error()`, or it can be used non-deterministically by throwing an exception perhaps carrying a custom payload.
## How badly will including Outcome in my public interface affect compile times?
The quick answer is that it depends on how much convenience you want.
The convenience header `<result.hpp>` is dependent on `<system_error>` or Boost.System, which unfortunately includes `<string>` and thus
drags in quite a lot of other slow-to-parse stuff. If your public interface already includes `<string>`,
then the impact of additionally including Outcome will be low. If you do not include `<string>`,
unfortunately the impact may be relatively high, depending on the overall weight of your
public interface files.
If you've been extremely careful to avoid ever including most of the STL headers
into your interfaces in order to maximise build performance, then `<basic_result.hpp>`
can have as few dependencies as:
1. `<cstdint>`
2. `<initializer_list>`
3. `<iosfwd>`
4. `<new>`
5. `<type_traits>`
6. `<cstdio>`
7. `<cstdlib>`
8. `<cassert>`
These, apart from `<iosfwd>`, tend to be very low build time impact in most standard
library implementations. If you include only `<basic_result.hpp>`, and manually configure
`basic_result<>` by hand, compile time impact will be minimised.
(See the reference documentation for {{% api "basic_result<T, E, NoValuePolicy>" %}} for more detail.)
## Is Outcome suitable for fixed latency/predictable execution coding such as for high frequency trading or audio?
Great care has been taken to ensure that Outcome never unexpectedly executes anything
with unbounded execution times such as `malloc()`, `dynamic_cast<>()` or `throw`.
Outcome works perfectly with C++ exceptions and RTTI globally disabled.
Outcome's entire design premise is that its users are happy to exchange a small, predictable constant overhead
during successful code paths, in exchange for predictable failure code paths.
In contrast, table-based exception handling gives zero run time overhead for the
successful code path, and completely unpredictable (and very expensive) overhead
for failure code paths.
For code where predictability of execution, no matter the code path, is paramount,
writing all your code to use Outcome is not a bad place to start. Obviously enough,
do choose a non-throwing policy when configuring `outcome` or `result` such as
{{% api "all_narrow" %}} to guarantee that exceptions can never be thrown by Outcome
(or use the convenience typedef for `result`, {{% api "unchecked<T, E = varies>" %}} which uses `policy::all_narrow`).
## What kind of runtime performance impact will using Outcome in my code introduce?
It is very hard to say anything definitive about performance impacts in codebases one
has never seen. Each codebase is unique. However to come up with some form of measure,
we timed traversing ten stack frames via each of the main mechanisms, including the
"do nothing" (null) case.
A stack frame is defined to be something called by the compiler whilst
unwinding the stack between the point of return in the ultimate callee and the base
caller, so for example ten stack allocated objects might be destructed, or ten levels
of stack depth might be unwound. This is not a particularly realistic test, but it
should at least give one an idea of the performance impact of returning Outcome's
`result` or `outcome` over say returning a plain integer, or throwing an exception.
The following figures are for Outcome v2.1.0 with GCC 7.4, clang 8.0 and Visual
Studio 2017.9. Figures for newer Outcomes with newer compilers can be found at
https://github.com/ned14/outcome/tree/develop/benchmark.
### High end CPU: Intel Skylake x64
This is a high end CPU with very significant ability to cache, predict, parallelise
and execute out-of-order such that tight, repeated loops perform very well. It has
a large μop cache able to wholly contain the test loop, meaning that these results
are a **best case** performance.
{{% figure src="/faq/results_skylake_log.png" title="Log graph comparing GCC 7.4, clang 8.0 and Visual Studio 2017.9 on x64, for exceptions-globally-disabled, ordinary and link-time-optimised build configurations." %}}
As you can see, throwing and catching an exception is
expensive on table-based exception handling implementations such as these, anywhere
between 26,000 and 43,000 CPU cycles. And this is the *hot path* situation: this
benchmark is a loop around hot, cached code. If the tables are paged out onto storage,
you are talking about **millions** of CPU cycles.
Simple integer returns (i.e. do nothing null case)
are always going to be the fastest as they do the least work, and that costs 80 to 90
CPU cycles on this Intel Skylake CPU.
Note that returning a `result<int, std::error_code>` with a "success (error code)"
is no more than 5% added runtime overhead over returning a naked int on GCC and clang. On MSVC
it costs an extra 20% or so, mainly due to poor code optimisation in the VS2017.9 compiler. Note that "success
(experimental status code)" optimises much better, and has almost no overhead over a
naked int.
Returning a `result<int, std::error_code>` with a "failure (error code)"
is less than 5% runtime overhead over returning a success on GCC, clang and MSVC.
You might wonder what happens if type `E` has a non-trivial destructor, thus making the
`result<T, E>` have a non-trivial destructor? We tested `E = std::exception_ptr` and
found less than a 5% overhead to `E = std::error_code` for returning success. Returning a failure
was obviously much slower at anywhere between 300 and 1,100 CPU cycles, due to the
dynamic memory allocation and free of the exception ptr, plus at least two atomic operations per stack frame, but that is
still two orders of magnitude better than throwing and catching an exception.
We conclude that if failure is anything but extremely rare in your C++ codebase,
using Outcome instead of throwing and catching exceptions ought to be quicker overall:
- Experimental Outcome is statistically indistinguishable from the null case on this
high end CPU, for both returning success and failure, on all compilers.
- Standard Outcome is less than 5%
worse than the null case for returning successes on GCC and clang, and less than 10% worse than
the null case for returning failures on GCC and clang.
- Standard Outcome optimises
poorly on VS2017.9, indeed markedly worse than on previous point releases, so let's
hope that Microsoft fix that soon. It currently has a less than 20% overhead on the null case.
### Mid tier CPU: ARM Cortex A72
This is a four year old mid tier CPU used in many high end mobile phones and tablets
of its day, with good ability to cache, predict, parallelise
and execute out-of-order such that tight, repeated loops perform very well. It has
a μop cache able to wholly contain the test loop, meaning that these results
are a **best case** performance.
{{% figure src="/faq/results_arm_a72_log.png" title="Log graph comparing GCC 7.3 and clang 7.3 on ARM64, for exceptions-globally-disabled, ordinary and link-time-optimised build configurations." %}}
This ARM chip is a very consistent performer -- null case, success, or failure, all take
almost exactly the same CPU cycles. Choosing Outcome, in any configuration, makes no
difference to not using Outcome at all. Throwing and catching a C++ exception costs
about 90,000 CPU cycles, whereas the null case/Outcome costs about 130 - 140 CPU cycles.
There is very little to say about this CPU, other than Outcome is zero overhead on it. The same
applied to the ARM Cortex A15 incidentally, which I tested extensively when
deciding on the Outcome v2 design back after the first peer review. The v2 design
was chosen partially because of such consistent performance on ARM.
### Low end CPUs: Intel Silvermont x64 and ARM Cortex A53
These are low end CPUs with a mostly or wholly in-order execution core. They have a small
or no μop cache, meaning that the CPU must always decode the instruction stream.
These results represent an execution environment more typical of CPUs two decades
ago, back when table-based EH created a big performance win if you never threw
an exception.
{{% figure src="/faq/results_silvermont_log.png" title="Log graph comparing GCC 7.3 and clang 7.3 on x64, for exceptions-globally-disabled, ordinary and link-time-optimised build configurations." %}}
{{% figure src="/faq/results_arm_a53_log.png" title="Log graph comparing GCC 7.3 and clang 7.3 on ARM64, for exceptions-globally-disabled, ordinary and link-time-optimised build configurations." %}}
The first thing to mention is that clang generates very high performance code for
in-order cores, far better than GCC. It is said that this is due to a very large investment by
Apple in clang/LLVM for their devices sustained over many years. In any case, if you're
targeting in-order CPUs, don't use GCC if you can use clang instead!
For the null case, Silvermont and Cortex A53 are quite similar in terms of CPU clock cycles. Ditto
for throwing and catching a C++ exception (approx 150,000 CPU cycles). However the Cortex
A53 does far better with Outcome than Silvermont, a 15% versus 100% overhead for Standard
Outcome, and a 4% versus 20% overhead for Experimental Outcome.
Much of this large difference is in fact due to calling convention differences. x64 permits up to 8 bytes
to be returned from functions by CPU register. `result<int>` consumes 24 bytes, so on x64
the compiler writes the return value to the stack. However ARM64 permits up to 64 bytes
to be returned in registers, so `result<int>` is returned via CPU registers on ARM64.
On higher end CPUs, memory is read and written in cache lines (32 or 64 bytes), and
reads and writes are coalesced and batched together by the out-of-order execution core. On these
low end CPUs, memory is read and written sequentially per assembler instruction,
so only one load or one store to L1
cache can occur at a time. This makes writing the stack particularly slow on in-order
CPUs. Memory operations which "disappear" on higher end CPUs take considerable time
on low end CPUs. This particularly punishes Silvermont in a way which does not punish
the Cortex A53, because of having to write multiple values to the stack to create the
24 byte object to be returned.
The conclusion to take away from this is that if you are targeting a low end CPU,
table-based EH still delivers significant performance improvements for the success
code path. Unless determinism in failure is critically important, you should not
use Outcome on in-order execution CPUs.
## Why is implicit default construction disabled?
This was one of the more interesting points of discussion during the peer review of
Outcome v1. v1 had a formal empty state. This came with many advantages, but it
was not felt to be STL idiomatic as `std::optional<result<T>>` is what was meant, so
v2 has eliminated any legal possibility of being empty.
The `expected<T, E>` proposal of that time (May 2017) did permit default construction
if its `T` type allowed default construction. This was specifically done to make
`expected<T, E>` more useful in STL containers as one can say resize a vector without
having to supply an `expected<T, E>` instance to fill the new items with. However
there was some unease with that design choice, because it may cause programmers to
use some type `T` whose default constructed state is overloaded with additional meaning,
typically "to be filled" i.e. a de facto empty state via choosing a magic value.
For the v2 redesign, the various arguments during the v1 review were considered.
Unlike `expected<T, E>` which is intended to be a general purpose Either monad
vocabulary type, Outcome's types are meant primarily for returning success or failure
from functions. The API should therefore encourage the programmer to not overload
the successful type with additional meaning of "to be filled" e.g. `result<std::optional<T>>`.
The decision was therefore taken to disable *implicit* default construction, but
still permit *explicit* default construction by making the programmer spell out their
intention with extra typing.
To therefore explicitly default construct a `result<T>` or `outcome<T>`, use one
of these forms as is the most appropriate for the use case:
1. Construct with just `in_place_type<T>` e.g. `result<T>(in_place_type<T>)`.
2. Construct via `success()` e.g. `outcome<T>(success())`.
3. Construct from a `void` form e.g. `result<T>(result<void>(in_place_type<void>))`.
## How far away from the proposed `std::expected<T, E>` is Outcome's `checked<T, E>`?
Not far, in fact after the first Boost.Outcome peer review in May 2017, Expected moved
much closer to Outcome, and Outcome deliberately provides {{% api "checked<T, E = varies>" %}}
as a semantic equivalent.
Here are the remaining differences which represent the
divergence of consensus opinion between the Boost peer review and WG21 on the proper
design for this object:
1. `checked<T, E>` has no default constructor. Expected has a default constructor if
`T` has a default constructor.
2. `checked<T, E>` uses the same constructor design as `std::variant<...>`. Expected
uses the constructor design of `std::optional<T>`.
3. `checked<T, E>` cannot be modified after construction except by assignment.
Expected provides an `.emplace()` modifier.
4. `checked<T, E>` permits implicit construction from both `T` and `E` when
unambiguous. Expected permits implicit construction from `T` alone.
5. `checked<T, E>` does not permit `T` and `E` to be the same, and becomes annoying
to use if they are constructible into one another (implicit construction self-disables).
Expected permits `T` and `E` to be the same.
6. `checked<T, E>` throws `bad_result_access_with<E>` instead of Expected's
`bad_expected_access<E>`.
7. `checked<T, E>` models `std::variant<...>`. Expected models `std::optional<T>`. Thus:
- `checked<T, E>` does not provide `operator*()` nor `operator->`
- `checked<T, E>` `.error()` is wide (i.e. throws on no-value) like `.value()`.
Expected's `.error()` is narrow (UB on no-error). [`checked<T, E>` provides
`.assume_value()` and `.assume_error()` for narrow (UB causing) observers].
8. `checked<T, E>` uses `success<T>` and `failure<E>` type sugars for disambiguation.
Expected uses `unexpected<E>` only.
9. `checked<T, E>` does not implement the comparison operators which permit implicit conversion, as these are prone to unintended misoperation, e.g. `checked<T> == T` will fail to compile.
Instead write unambiguous code e.g. `checked<T> == success(T)` or `checked<T> == failure(T)`.
10. `checked<T, E>` defaults `E` to `std::error_code` or `boost::system::error_code`.
Expected does not default `E`.
In fact, the two are sufficiently close in design that a highly conforming `expected<T, E>`
can be implemented by wrapping up `checked<T, E>` with the differing functionality:
{{% snippet "expected_implementation.cpp" "expected_implementation" %}}
## Why doesn't Outcome duplicate `std::expected<T, E>`'s design?
There are a number of reasons:
1. Outcome is not aimed at the same audience as Expected. We target developers
and users who would be happy to use Boost. Expected targets the standard library user.
2. Outcome believes that the monadic use case isn't as important as Expected does.
Specifically, we think that 99% of use of Expected in the real world will be to
return failure from functions, and not as some sort of enhanced or "rich" Optional.
Outcome therefore models a subset of Variant, whereas Expected models an extended Optional.
3. Outcome believes that if you are thinking about using something like Outcome,
then for you writing failure code will be in the same proportion as writing success code,
and thus in Outcome writing for failure is exactly the same as writing for success.
Expected assumes that success will be more common than failure, and makes you type
more when writing for failure.
4. Outcome goes to considerable effort to help the end user type fewer characters
during use. This results in tighter, less verbose, more succinct code. The cost of this is a steeper
learning curve and more complex mental model than when programming with Expected.
5. Outcome has facilities to make easier interoperation between multiple third
party libraries each using incommensurate Outcome (or Expected) configurations. Expected does
not do any of this, but subsequent WG21 papers do propose various interoperation
mechanisms, [one of which](https://wg21.link/P0786) Outcome implements so code using Expected will seamlessly
interoperate with code using Outcome.
6. Outcome was designed with the benefit of hindsight after Optional and Expected,
where how those do implicit conversions have been found to be prone to writing
unintentionally buggy code. Outcome simultaneously permits more implicit conversions
for ease of use and convenience, where those are unambiguously safe, and prevents
other implicit conversions which the Boost peer review reported as dangerous.
## Is Outcome riddled with undefined behaviour for const, const-containing and reference-containing types?
The short answer is not any more in C++ 20 and after, thanks to changes made to
C++ 20 at the Belfast WG21 meeting in November 2019.
The longer answer is that before C++ 20, use of placement
new on types containing `const` member types where the resulting pointer was
thrown away is undefined behaviour. As of the resolution of a national body
comment, this is no longer the case, and now Outcome is free of this particular
UB for C++ 20 onwards.
This still affects C++ before 20, though no major compiler is affected. Still,
if you wish to avoid UB, don't use `const` types within Outcome types (or any
`optional<T>`, or `vector<T>` or any STL container type for that matter).
### More detail
Before the C++ 14 standard, placement new into storage which used to contain
a const type was straight out always undefined behaviour, period. Thus all use of
placement new within a `result<const_containing_type>`, or indeed an `optional<const_containing_type>`, is always
undefined behaviour before C++ 14. From `[basic.life]` for the C++ 11 standard:
> Creating a new object at the storage location that a const object with static,
> thread, or automatic storage duration occupies or, at the storage location
> that such a const object used to occupy before its lifetime ended results
> in undefined behavior.
This being excessively restrictive, from C++ 14 onwards, `[basic.life]` now states:
> If, after the lifetime of an object has ended and before the storage which
> the object occupied is reused or released, a new object is created at the
> storage location which the original object occupied, a pointer that
> pointed to the original object, a reference that referred to the original
> object, or the name of the original object will automatically refer to the
> new object and, once the lifetime of the new object has started, can be
> used to manipulate the new object, if:
>
> — the storage for the new object exactly overlays the storage location which
> the original object occupied, and
>
> — the new object is of the same type as the original object (ignoring the
> top-level cv-qualifiers), and
>
> — the type of the original object is not const-qualified, and, if a class type,
> does not contain any non-static data member whose type is const-qualified
> or a reference type, and
>
> — neither the original object nor the new object is a potentially-overlapping
> subobject
Leaving aside my personal objections to giving placement new of non-const
non-reference types magical pointer renaming powers, the upshot is that if
you want defined behaviour for placement new of types containing const types
or references, you must store the pointer returned by placement new, and use
that pointer for all further reference to the newly created object. This
obviously adds eight bytes of storage to a `result<const_containing_type>`, which is highly
undesirable given all the care and attention paid to keeping it small. The alternative
is to use {{% api "std::launder" %}}, which was added in C++ 17, to 'launder'
the storage into which we placement new before each and every use of that
storage. This forces the compiler to reload the object stored by placement
new on every occasion, and not assume it can be constant propagated, which
impacts codegen quality.
As mentioned above, this issue (in so far as it applies to types containing
user supplied `T` which might be `const`) has been resolved as of C++ 20 onwards,
and it is extremely unlikely that any C++ compiler will act on any UB here in
C++ 17 or 14 given how much of STL containers would break.
<?php
namespace Meta\OntologyCreateBundle\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
class CreateDataBaseController extends Controller
{
public function wizardAction(){
return $this->render('MetaOntologyBundle:CreateDataBase:index.html.twig', array());
}
}
class Chef
class Provider
class Breakpoint < Chef::Provider
def load_current_resource
end
def action_break
if defined?(Shef) && Shef.running?
@collection.iterator.pause
@new_resource.updated = true
@collection.iterator
end
end
end
end
end
<?php
/**
* sfCombineFilter
*
* @package sfCombinePlugin
* @subpackage filter
* @author Alexandre Mogère
*/
class sfCombineFilter extends sfFilter
{
/**
* Executes this filter.
*
* @param sfFilterChain $filterChain A sfFilterChain instance
*/
public function execute($filterChain)
{
// execute next filter
$filterChain->execute();
// execute this filter only once
$response = $this->context->getResponse();
// include javascripts and stylesheets
$content = $response->getContent();
if (false !== ($pos = strpos($content, '</head>')) // has a </head> tag
&& false !== strpos($response->getContentType(), 'html')) // is html content
{
$this->context->getConfiguration()->loadHelpers(array('Tag', 'Asset', 'Url', 'sfCombine'));
$html = '';
if (!sfConfig::get('symfony.asset.javascripts_included', false)
|| sfConfig::get('app_sfCombinePlugin_filter_include_unused_groups', false))
{
$html .= get_combined_javascripts(
null,
sfCombineManager::GROUP_INCLUDE,
true
);
}
if (!sfConfig::get('symfony.asset.stylesheets_included', false)
|| sfConfig::get('app_sfCombinePlugin_filter_include_unused_groups', false))
{
$html .= get_combined_stylesheets(
null,
sfCombineManager::GROUP_INCLUDE,
true
);
}
if ($html)
{
$response->setContent(substr($content, 0, $pos).$html.substr($content, $pos));
}
}
sfConfig::set('symfony.asset.javascripts_included', true);
sfConfig::set('symfony.asset.stylesheets_included', true);
}
}
package org.w3._2003._05.soap_envelope;
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlSchemaType;
import javax.xml.bind.annotation.XmlType;
import org.apache.commons.lang3.builder.ToStringBuilder;
import org.apache.cxf.xjc.runtime.JAXBToStringStyle;
/**
*
* Fault reporting structure
*
*
* <p>Java class for Fault complex type.
*
* <p>The following schema fragment specifies the expected content contained within this class.
*
* <pre>
* <complexType name="Fault">
* <complexContent>
* <restriction base="{http://www.w3.org/2001/XMLSchema}anyType">
* <sequence>
* <element name="Code" type="{http://www.w3.org/2003/05/soap-envelope}faultcode"/>
* <element name="Reason" type="{http://www.w3.org/2003/05/soap-envelope}faultreason"/>
* <element name="Node" type="{http://www.w3.org/2001/XMLSchema}anyURI" minOccurs="0"/>
* <element name="Role" type="{http://www.w3.org/2001/XMLSchema}anyURI" minOccurs="0"/>
* <element name="Detail" type="{http://www.w3.org/2003/05/soap-envelope}detail" minOccurs="0"/>
* </sequence>
* </restriction>
* </complexContent>
* </complexType>
* </pre>
*
*
*/
@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "Fault", propOrder = {
"code",
"reason",
"node",
"role",
"detail"
})
public class Fault {
@XmlElement(name = "Code", required = true)
protected Faultcode code;
@XmlElement(name = "Reason", required = true)
protected Faultreason reason;
@XmlElement(name = "Node")
@XmlSchemaType(name = "anyURI")
protected String node;
@XmlElement(name = "Role")
@XmlSchemaType(name = "anyURI")
protected String role;
@XmlElement(name = "Detail")
protected Detail detail;
/**
* Gets the value of the code property.
*
* @return
* possible object is
* {@link Faultcode }
*
*/
public Faultcode getCode() {
return code;
}
/**
* Sets the value of the code property.
*
* @param value
* allowed object is
* {@link Faultcode }
*
*/
public void setCode(Faultcode value) {
this.code = value;
}
/**
* Gets the value of the reason property.
*
* @return
* possible object is
* {@link Faultreason }
*
*/
public Faultreason getReason() {
return reason;
}
/**
* Sets the value of the reason property.
*
* @param value
* allowed object is
* {@link Faultreason }
*
*/
public void setReason(Faultreason value) {
this.reason = value;
}
/**
* Gets the value of the node property.
*
* @return
* possible object is
* {@link String }
*
*/
public String getNode() {
return node;
}
/**
* Sets the value of the node property.
*
* @param value
* allowed object is
* {@link String }
*
*/
public void setNode(String value) {
this.node = value;
}
/**
* Gets the value of the role property.
*
* @return
* possible object is
* {@link String }
*
*/
public String getRole() {
return role;
}
/**
* Sets the value of the role property.
*
* @param value
* allowed object is
* {@link String }
*
*/
public void setRole(String value) {
this.role = value;
}
/**
* Gets the value of the detail property.
*
* @return
* possible object is
* {@link Detail }
*
*/
public Detail getDetail() {
return detail;
}
/**
* Sets the value of the detail property.
*
* @param value
* allowed object is
* {@link Detail }
*
*/
public void setDetail(Detail value) {
this.detail = value;
}
/**
* Generates a String representation of the contents of this type.
* This is an extension method, produced by the 'ts' xjc plugin
*
*/
@Override
public String toString() {
return ToStringBuilder.reflectionToString(this, JAXBToStringStyle.DEFAULT_STYLE);
}
}
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical" android:layout_width="match_parent"
android:layout_height="match_parent">
<FrameLayout
android:id="@+id/container"
android:layout_width="match_parent"
android:layout_height="wrap_content" />
</LinearLayout>
package client
import (
"bytes"
"crypto/tls"
"encoding/binary"
"fmt"
"github.com/juju/errors"
. "github.com/siddontang/go-mysql/mysql"
"github.com/siddontang/go-mysql/packet"
)
const defaultAuthPluginName = AUTH_NATIVE_PASSWORD
// defines the supported auth plugins
var supportedAuthPlugins = []string{AUTH_NATIVE_PASSWORD, AUTH_SHA256_PASSWORD, AUTH_CACHING_SHA2_PASSWORD}
// helper function to determine what auth methods are allowed by this client
func authPluginAllowed(pluginName string) bool {
for _, p := range supportedAuthPlugins {
if pluginName == p {
return true
}
}
return false
}
// See: http://dev.mysql.com/doc/internals/en/connection-phase-packets.html#packet-Protocol::Handshake
func (c *Conn) readInitialHandshake() error {
data, err := c.ReadPacket()
if err != nil {
return errors.Trace(err)
}
if data[0] == ERR_HEADER {
return errors.New("read initial handshake error")
}
if data[0] < MinProtocolVersion {
return errors.Errorf("invalid protocol version %d, must >= 10", data[0])
}
// skip mysql version
// mysql version end with 0x00
pos := 1 + bytes.IndexByte(data[1:], 0x00) + 1
// connection id length is 4
c.connectionID = uint32(binary.LittleEndian.Uint32(data[pos : pos+4]))
pos += 4
c.salt = []byte{}
c.salt = append(c.salt, data[pos:pos+8]...)
// skip filler
pos += 8 + 1
// capability lower 2 bytes
c.capability = uint32(binary.LittleEndian.Uint16(data[pos : pos+2]))
// check protocol
if c.capability&CLIENT_PROTOCOL_41 == 0 {
return errors.New("the MySQL server does not support protocol 41 or above, which is required by the client")
}
if c.capability&CLIENT_SSL == 0 && c.tlsConfig != nil {
return errors.New("the MySQL Server does not support TLS required by the client")
}
pos += 2
if len(data) > pos {
// skip server charset
//c.charset = data[pos]
pos += 1
c.status = binary.LittleEndian.Uint16(data[pos : pos+2])
pos += 2
// capability flags (upper 2 bytes)
c.capability = uint32(binary.LittleEndian.Uint16(data[pos:pos+2]))<<16 | c.capability
pos += 2
// skip auth data len or [00]
// skip reserved (all [00])
pos += 10 + 1
// The documentation is ambiguous about the length.
// The official Python library uses the fixed length 12,
// and mysql-proxy also uses 12,
// which is not documented but seems to work.
c.salt = append(c.salt, data[pos:pos+12]...)
pos += 13
// auth plugin
if end := bytes.IndexByte(data[pos:], 0x00); end != -1 {
c.authPluginName = string(data[pos : pos+end])
} else {
c.authPluginName = string(data[pos:])
}
}
// if server gives no default auth plugin name, use a client default
if c.authPluginName == "" {
c.authPluginName = defaultAuthPluginName
}
return nil
}
// generate auth response data according to auth plugin
//
// NOTE: the returned boolean value indicates whether to add a \NUL to the end of data.
// it is quite tricky because the MySQL server expects different formats of responses in different auth situations.
// here the \NUL needs to be added when sending back the empty password or cleartext password in 'sha256_password'
// authentication.
func (c *Conn) genAuthResponse(authData []byte) ([]byte, bool, error) {
// password hashing
switch c.authPluginName {
case AUTH_NATIVE_PASSWORD:
return CalcPassword(authData[:20], []byte(c.password)), false, nil
case AUTH_CACHING_SHA2_PASSWORD:
return CalcCachingSha2Password(authData, c.password), false, nil
case AUTH_SHA256_PASSWORD:
if len(c.password) == 0 {
return nil, true, nil
}
if c.tlsConfig != nil || c.proto == "unix" {
// write cleartext auth packet
// see: https://dev.mysql.com/doc/refman/8.0/en/sha256-pluggable-authentication.html
return []byte(c.password), true, nil
} else {
// request public key from server
// see: https://dev.mysql.com/doc/internals/en/public-key-retrieval.html
return []byte{1}, false, nil
}
default:
// not reachable
return nil, false, fmt.Errorf("auth plugin '%s' is not supported", c.authPluginName)
}
}
// See: http://dev.mysql.com/doc/internals/en/connection-phase-packets.html#packet-Protocol::HandshakeResponse
func (c *Conn) writeAuthHandshake() error {
if !authPluginAllowed(c.authPluginName) {
return fmt.Errorf("unknown auth plugin name '%s'", c.authPluginName)
}
// Adjust client capability flags based on server support
capability := CLIENT_PROTOCOL_41 | CLIENT_SECURE_CONNECTION |
CLIENT_LONG_PASSWORD | CLIENT_TRANSACTIONS | CLIENT_PLUGIN_AUTH | c.capability&CLIENT_LONG_FLAG
// To enable TLS / SSL
if c.tlsConfig != nil {
capability |= CLIENT_SSL
}
auth, addNull, err := c.genAuthResponse(c.salt)
if err != nil {
return err
}
// encode length of the auth plugin data
// here we use the Length-Encoded-Integer(LEI) as the data length may not fit into one byte
// see: https://dev.mysql.com/doc/internals/en/integer.html#length-encoded-integer
var authRespLEIBuf [9]byte
authRespLEI := AppendLengthEncodedInteger(authRespLEIBuf[:0], uint64(len(auth)))
if len(authRespLEI) > 1 {
// if the length can not be written in 1 byte, it must be written as a
// length encoded integer
capability |= CLIENT_PLUGIN_AUTH_LENENC_CLIENT_DATA
}
//packet length
//capability 4
//max-packet size 4
//charset 1
//reserved all[0] 23
//username
//auth
//mysql_native_password + null-terminated
length := 4 + 4 + 1 + 23 + len(c.user) + 1 + len(authRespLEI) + len(auth) + 21 + 1
if addNull {
length++
}
// db name
if len(c.db) > 0 {
capability |= CLIENT_CONNECT_WITH_DB
length += len(c.db) + 1
}
data := make([]byte, length+4)
// capability [32 bit]
data[4] = byte(capability)
data[5] = byte(capability >> 8)
data[6] = byte(capability >> 16)
data[7] = byte(capability >> 24)
// MaxPacketSize [32 bit] (none)
data[8] = 0x00
data[9] = 0x00
data[10] = 0x00
data[11] = 0x00
// Charset [1 byte]
// use the default collation id 33 here, which is utf-8
data[12] = byte(DEFAULT_COLLATION_ID)
// SSL Connection Request Packet
// http://dev.mysql.com/doc/internals/en/connection-phase-packets.html#packet-Protocol::SSLRequest
if c.tlsConfig != nil {
// Send TLS / SSL request packet
if err := c.WritePacket(data[:(4+4+1+23)+4]); err != nil {
return err
}
// Switch to TLS
tlsConn := tls.Client(c.Conn.Conn, c.tlsConfig)
if err := tlsConn.Handshake(); err != nil {
return err
}
currentSequence := c.Sequence
c.Conn = packet.NewConn(tlsConn)
c.Sequence = currentSequence
}
// Filler [23 bytes] (all 0x00)
pos := 13
for ; pos < 13+23; pos++ {
data[pos] = 0
}
// User [null terminated string]
if len(c.user) > 0 {
pos += copy(data[pos:], c.user)
}
data[pos] = 0x00
pos++
// auth [length encoded integer]
pos += copy(data[pos:], authRespLEI)
pos += copy(data[pos:], auth)
if addNull {
data[pos] = 0x00
pos++
}
// db [null terminated string]
if len(c.db) > 0 {
pos += copy(data[pos:], c.db)
data[pos] = 0x00
pos++
}
// auth plugin name [null terminated string]
pos += copy(data[pos:], c.authPluginName)
data[pos] = 0x00
return c.WritePacket(data)
}
| {
"content_hash": "5c351fe09b625ac968e4683b2444bf14",
"timestamp": "",
"source": "github",
"line_count": 256,
"max_line_length": 120,
"avg_line_length": 27.70703125,
"alnum_prop": 0.6808120682362893,
"repo_name": "TeamFat/go-way",
"id": "5ba9c9f4e1024506180d165923b06db2a5c9cd00",
"size": "7093",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "vendor/github.com/siddontang/go-mysql/client/auth.go",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "Go",
"bytes": "190092"
},
{
"name": "HTML",
"bytes": "55"
},
{
"name": "PHP",
"bytes": "15286"
},
{
"name": "Python",
"bytes": "14802"
},
{
"name": "Shell",
"bytes": "2156"
}
],
"symlink_target": ""
} |
namespace Inflection.TypeNode.TypeNode
{
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
using Immutable.Extensions;
using Immutable.Monads;
using Immutable.TypeSystem;
public static class TypeNode
{
public static TypeNode<TNode, TNode> Create<TNode>(IInflector inflector)
{
var t = inflector.Inflect<TNode>();
return new TypeNode<TNode, TNode>(t, x => x, (x, y) => y);
}
}
public class TypeNode<TParent, TNode> : ITypeNode<TParent, TNode>
{
private readonly IImmutableType<TNode> nodeType;
private readonly Func<TParent, TNode> get;
private readonly IMaybe<Func<TParent, TNode, TParent>> set;
private readonly ITypePath<TParent, TNode> path;
private readonly Lazy<Dictionary<MemberInfo, ITypeNode>> children;
public TypeNode(IImmutableType<TNode> nodeType, Func<TParent, TNode> get, Func<TParent, TNode, TParent> set)
: this(nodeType, get, Maybe.Return(set))
{
}
public TypeNode(IImmutableType<TNode> nodeType, Func<TParent, TNode> get, IMaybe<Func<TParent, TNode, TParent>> set)
{
this.nodeType = nodeType;
this.get = get;
this.set = set;
this.path = TypePath.Create(get, set);
var visitor = new ChildBuilder<TNode>();
this.children = new Lazy<Dictionary<MemberInfo, ITypeNode>>(() => this.NodeType.GetProperties().ToDictionary(x => x.ClrMember, visitor.BuildChild));
}
IImmutableType ITypeNode.NodeType
{
get { return this.NodeType; }
}
public IImmutableType<TNode> NodeType
{
get { return this.nodeType; }
}
public ITypePath<TParent, TNode> Path
{
get { return this.path; }
}
public Func<TParent, TNode> Get
{
get { return this.get; }
}
public IMaybe<Func<TParent, TNode, TParent>> Set
{
get { return this.set; }
}
IMaybe<ITypeNode> ITypeNode.MaybeGetChild(MemberInfo memberInfo)
{
return this.MaybeGetChild(memberInfo);
}
IEnumerable<ITypeNode> ITypeNode.GetChildren()
{
return this.GetChildren();
}
IEnumerable<ITypeNode<T>> ITypeNode.GetChildren<T>()
{
return this.GetChildren<T>();
}
void ITypeNode.Accept(ITypeNodeVisitor visitor)
{
visitor.Visit(this);
}
public IMaybe<ITypeNode> MaybeGetChild(MemberInfo memberInfo)
{
return Maybe.Return(this.children.Value.GetValueOrDefault(memberInfo));
}
public IEnumerable<ITypeNode> GetChildren()
{
return this.children.Value.Values;
}
public IEnumerable<ITypeNode<TNode, T>> GetChildren<T>()
{
return this.GetChildren().OfType<ITypeNode<TNode, T>>();
}
public IMaybe<ITypeNode> MaybeGetNode(IEnumerable<MemberInfo> memberInfos)
{
var mn = Maybe.Return(this as ITypeNode);
foreach (var info in memberInfos)
{
var info1 = info;
mn = mn.Bind(n => n.MaybeGetChild(info1));
}
return mn;
}
public IEnumerable<ITypeNode> GetNodes()
{
var q = new Queue<ITypeNode>();
foreach (var c in this.GetChildren())
{
yield return c;
q.Enqueue(c);
}
while (q.Count > 0)
{
var d = q.Dequeue();
foreach (var c in d.GetChildren())
{
yield return c;
q.Enqueue(c);
}
}
}
public IEnumerable<ITypeNode<T>> GetNodes<T>()
{
return this.GetNodes().OfType<ITypeNode<T>>();
}
public IMaybe<ITypePath<TNode, T>> MaybeGetPath<T>(IEnumerable<MemberInfo> memberInfos)
{
var nodes = new List<ITypeNode>();
var mn = Maybe.Return(this as ITypeNode);
foreach (var info in memberInfos)
{
var info1 = info;
mn = mn.Bind(
n =>
{
nodes.Add(n);
return n.MaybeGetChild(info1);
});
if (!(mn is Just<ITypeNode>))
{
return new Nothing<ITypePath<TNode, T>>();
}
}
var pathBuilder = new PathBuilder<TNode, T>();
return pathBuilder.MaybeBuild(nodes);
}
public IEnumerable<ITypePath<TNode, T>> GetPaths<T>()
{
return GetPaths<T>(new PathStack<TNode>(), this.GetChildren());
}
public IEnumerable<IValuePath<TNode, T>> GetValuePaths<T>(TNode node)
{
return GetValuePaths<T>(new ValuePathStack<TNode>(node), this.GetChildren());
}
public IEnumerable<T> GetValues<T>(TNode node)
{
return GetValues<T>(new ValueStack<TNode>(node), this.GetChildren());
}
private static IEnumerable<ITypePath<TNode, T>> GetPaths<T>(PathStack<TNode> stack, IEnumerable<ITypeNode> children)
{
foreach (var child in children)
{
stack.Push(child);
foreach (var p in GetPaths<T>(stack, child.GetChildren()))
{
yield return p;
}
var d = stack.Pop() as ITypePath<TNode, T>;
if (d != null)
{
yield return d;
}
}
}
private static IEnumerable<IValuePath<TNode, T>> GetValuePaths<T>(ValuePathStack<TNode> stack, IEnumerable<ITypeNode> children)
{
foreach (var child in children)
{
stack.Push(child);
var top = stack.Peek();
if (top.IsNull)
{
stack.Pop();
continue;
}
var d = top as IValuePath<TNode, T>;
if (d != null)
{
yield return d;
}
foreach (var p in GetValuePaths<T>(stack, child.GetChildren()))
{
yield return p;
}
stack.Pop();
}
}
private static IEnumerable<T> GetValues<T>(ValueStack<TNode> stack, IEnumerable<ITypeNode> children)
{
foreach (var child in children)
{
stack.Push(child);
var top = stack.Peek();
if (top == null)
{
stack.Pop();
continue;
}
if (top is T)
{
yield return (T)top;
}
foreach (var p in GetValues<T>(stack, child.GetChildren()))
{
yield return p;
}
stack.Pop();
}
}
private static T MapIfCompatible<T, TA, TB>(T v, Func<TA, TB> f)
{
return (v is TA && typeof(T).IsAssignableFrom(typeof(TB)))
? (T)(object)f((TA)(object)v)
: v;
}
public TNode GMap<TA, TB>(TNode node, Func<TA, TB> f)
{
node = MapIfCompatible(node, f);
foreach (var p in this.GetPaths<TA>())
{
var set = p.Set.GetValueOrDefault();
if (set == null)
{
continue;
}
var prop = MapIfCompatible(p.Get(node), f);
node = set(node, prop);
}
return node;
}
}
} | {
"content_hash": "9b7f7f906bbf13db0d59e6f09bb0967c",
"timestamp": "",
"source": "github",
"line_count": 289,
"max_line_length": 160,
"avg_line_length": 28.14878892733564,
"alnum_prop": 0.4810079901659496,
"repo_name": "diab0l/Inflection",
"id": "8ecaa14fecd0ad278e1dc717d8fe6d7345278872",
"size": "8137",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "Inflection/Inflection.TypeNode/TypeNode/TypeNode.cs",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "C#",
"bytes": "127901"
}
],
"symlink_target": ""
} |
import React from "react";
import { shallow, mount } from "enzyme";
import WeatherAPI from "./WeatherAPI.js";
import Main from "../lib/components/Main.js";
import Greeting from "../lib/components/Greeting.js";
import Input from "../lib/components/Input.js";
describe("Input component", () => {
it("state.input should change when input is changed", () => {
let wrapper = shallow(<Input fetchWeather={() => {}} />);
let button = wrapper.find(".submit-button");
let input = wrapper.find(".location-input");
input.simulate("change", { target: { value: "Denver, CO" } });
expect(wrapper.state("userInput")).toEqual("Denver, CO");
button.simulate("click");
expect(wrapper.state("userInput")).toEqual("");
});
it("should fetch the weather on mount", () => {
let spy = jest.fn(); // () => {}
let wrapper = mount(<Input fetchWeather={spy} />);
expect(spy).toHaveBeenCalled();
// verifies the fetchWeather prop was invoked during componentDidMount
});
it("calls componentDidMount", () => {
let spy = jest.spyOn(Input.prototype, "componentDidMount");
let wrapper = mount(<Input fetchWeather={() => {}} />);
expect(spy).toHaveBeenCalled();
// mount triggers componentDidMount automatically
});
it("calls submitItem", () => {
let spy = jest.fn();
let wrapper = mount(<Input fetchWeather={spy} />);
let button = wrapper.find(".submit-button");
button.simulate("click");
expect(spy).toHaveBeenCalled();
});
it("calls fetchWeather with the entered location", () => {
let spy = jest.fn();
let wrapper = mount(<Input fetchWeather={spy} />);
let button = wrapper.find(".submit-button");
let input = wrapper.find(".location-input");
input.simulate("change", { target: { value: "Denver, CO" } });
button.simulate("click");
expect(spy).toHaveBeenCalledWith("Denver, CO");
});
});
| {
"content_hash": "ac2aa03913d3bebae932b6f043ad0844",
"timestamp": "",
"source": "github",
"line_count": 55,
"max_line_length": 101,
"avg_line_length": 32.54545454545455,
"alnum_prop": 0.6262569832402235,
"repo_name": "tlgreg86/weatherly",
"id": "b2617f50e94ac31e58c787c6d889dd94d7a93f92",
"size": "1790",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "test/greeting.test.js",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "CSS",
"bytes": "2567"
},
{
"name": "HTML",
"bytes": "600"
},
{
"name": "JavaScript",
"bytes": "140337"
}
],
"symlink_target": ""
} |
namespace blink {
class SimplifiedBackwardsTextIteratorTest : public EditingTestBase {
};
template <typename Strategy>
static String extractString(const Element& element)
{
const EphemeralRangeTemplate<Strategy> range = EphemeralRangeTemplate<Strategy>::rangeOfContents(element);
BackwardsTextBuffer buffer;
for (SimplifiedBackwardsTextIteratorAlgorithm<Strategy> it(range.startPosition(), range.endPosition()); !it.atEnd(); it.advance()) {
it.copyTextTo(&buffer);
}
return String(buffer.data(), buffer.size());
}
TEST_F(SimplifiedBackwardsTextIteratorTest, SubrangeWithReplacedElements)
{
static const char* bodyContent = "<a id=host><b id=one>one</b> not appeared <b id=two>two</b></a>";
const char* shadowContent = "three <content select=#two></content> <content select=#one></content> zero";
setBodyContent(bodyContent);
setShadowContent(shadowContent, "host");
Element* host = document().getElementById("host");
// We should not apply the DOM tree version to the containing shadow tree
// in general. This test records the current behavior, even though it
// isn't intuitive.
EXPECT_EQ("onetwo", extractString<EditingStrategy>(*host));
EXPECT_EQ("three two one zero", extractString<EditingInFlatTreeStrategy>(*host));
}
TEST_F(SimplifiedBackwardsTextIteratorTest, characterAt)
{
const char* bodyContent = "<a id=host><b id=one>one</b> not appeared <b id=two>two</b></a>";
const char* shadowContent = "three <content select=#two></content> <content select=#one></content> zero";
setBodyContent(bodyContent);
setShadowContent(shadowContent, "host");
Element* host = document().getElementById("host");
EphemeralRangeTemplate<EditingStrategy> range1(EphemeralRangeTemplate<EditingStrategy>::rangeOfContents(*host));
SimplifiedBackwardsTextIteratorAlgorithm<EditingStrategy> backIter1(range1.startPosition(), range1.endPosition());
const char* message1 = "|backIter1| should emit 'one' and 'two' in reverse order.";
EXPECT_EQ('o', backIter1.characterAt(0)) << message1;
EXPECT_EQ('w', backIter1.characterAt(1)) << message1;
EXPECT_EQ('t', backIter1.characterAt(2)) << message1;
backIter1.advance();
EXPECT_EQ('e', backIter1.characterAt(0)) << message1;
EXPECT_EQ('n', backIter1.characterAt(1)) << message1;
EXPECT_EQ('o', backIter1.characterAt(2)) << message1;
EphemeralRangeTemplate<EditingInFlatTreeStrategy> range2(EphemeralRangeTemplate<EditingInFlatTreeStrategy>::rangeOfContents(*host));
SimplifiedBackwardsTextIteratorAlgorithm<EditingInFlatTreeStrategy> backIter2(range2.startPosition(), range2.endPosition());
const char* message2 = "|backIter2| should emit 'three ', 'two', ' ', 'one' and ' zero' in reverse order.";
EXPECT_EQ('o', backIter2.characterAt(0)) << message2;
EXPECT_EQ('r', backIter2.characterAt(1)) << message2;
EXPECT_EQ('e', backIter2.characterAt(2)) << message2;
EXPECT_EQ('z', backIter2.characterAt(3)) << message2;
EXPECT_EQ(' ', backIter2.characterAt(4)) << message2;
backIter2.advance();
EXPECT_EQ('e', backIter2.characterAt(0)) << message2;
EXPECT_EQ('n', backIter2.characterAt(1)) << message2;
EXPECT_EQ('o', backIter2.characterAt(2)) << message2;
backIter2.advance();
EXPECT_EQ(' ', backIter2.characterAt(0)) << message2;
backIter2.advance();
EXPECT_EQ('o', backIter2.characterAt(0)) << message2;
EXPECT_EQ('w', backIter2.characterAt(1)) << message2;
EXPECT_EQ('t', backIter2.characterAt(2)) << message2;
backIter2.advance();
EXPECT_EQ(' ', backIter2.characterAt(0)) << message2;
EXPECT_EQ('e', backIter2.characterAt(1)) << message2;
EXPECT_EQ('e', backIter2.characterAt(2)) << message2;
EXPECT_EQ('r', backIter2.characterAt(3)) << message2;
EXPECT_EQ('h', backIter2.characterAt(4)) << message2;
EXPECT_EQ('t', backIter2.characterAt(5)) << message2;
}
TEST_F(SimplifiedBackwardsTextIteratorTest, copyTextTo)
{
const char* bodyContent = "<a id=host><b id=one>one</b> not appeared <b id=two>two</b></a>";
const char* shadowContent = "three <content select=#two></content> <content select=#one></content> zero";
setBodyContent(bodyContent);
setShadowContent(shadowContent, "host");
Element* host = document().getElementById("host");
const char* message = "|backIter%d| should have emitted '%s' in reverse order.";
EphemeralRangeTemplate<EditingStrategy> range1(EphemeralRangeTemplate<EditingStrategy>::rangeOfContents(*host));
SimplifiedBackwardsTextIteratorAlgorithm<EditingStrategy> backIter1(range1.startPosition(), range1.endPosition());
BackwardsTextBuffer output1;
backIter1.copyTextTo(&output1, 0, 2);
EXPECT_EQ("wo", String(output1.data(), output1.size())) << String::format(message, 1, "wo").utf8().data();
backIter1.copyTextTo(&output1, 2, 1);
EXPECT_EQ("two", String(output1.data(), output1.size())) << String::format(message, 1, "two").utf8().data();
backIter1.advance();
backIter1.copyTextTo(&output1, 0, 1);
EXPECT_EQ("etwo", String(output1.data(), output1.size())) << String::format(message, 1, "etwo").utf8().data();
backIter1.copyTextTo(&output1, 1, 2);
EXPECT_EQ("onetwo", String(output1.data(), output1.size())) << String::format(message, 1, "onetwo").utf8().data();
EphemeralRangeTemplate<EditingInFlatTreeStrategy> range2(EphemeralRangeTemplate<EditingInFlatTreeStrategy>::rangeOfContents(*host));
SimplifiedBackwardsTextIteratorAlgorithm<EditingInFlatTreeStrategy> backIter2(range2.startPosition(), range2.endPosition());
BackwardsTextBuffer output2;
backIter2.copyTextTo(&output2, 0, 2);
EXPECT_EQ("ro", String(output2.data(), output2.size())) << String::format(message, 2, "ro").utf8().data();
backIter2.copyTextTo(&output2, 2, 3);
EXPECT_EQ(" zero", String(output2.data(), output2.size())) << String::format(message, 2, " zero").utf8().data();
backIter2.advance();
backIter2.copyTextTo(&output2, 0, 1);
EXPECT_EQ("e zero", String(output2.data(), output2.size())) << String::format(message, 2, "e zero").utf8().data();
backIter2.copyTextTo(&output2, 1, 2);
EXPECT_EQ("one zero", String(output2.data(), output2.size())) << String::format(message, 2, "one zero").utf8().data();
backIter2.advance();
backIter2.copyTextTo(&output2, 0, 1);
EXPECT_EQ(" one zero", String(output2.data(), output2.size())) << String::format(message, 2, " one zero").utf8().data();
backIter2.advance();
backIter2.copyTextTo(&output2, 0, 2);
EXPECT_EQ("wo one zero", String(output2.data(), output2.size())) << String::format(message, 2, "wo one zero").utf8().data();
backIter2.copyTextTo(&output2, 2, 1);
EXPECT_EQ("two one zero", String(output2.data(), output2.size())) << String::format(message, 2, "two one zero").utf8().data();
backIter2.advance();
backIter2.copyTextTo(&output2, 0, 3);
EXPECT_EQ("ee two one zero", String(output2.data(), output2.size())) << String::format(message, 2, "ee two one zero").utf8().data();
backIter2.copyTextTo(&output2, 3, 3);
EXPECT_EQ("three two one zero", String(output2.data(), output2.size())) << String::format(message, 2, "three two one zero").utf8().data();
}
TEST_F(SimplifiedBackwardsTextIteratorTest, CopyWholeCodePoints)
{
const char* bodyContent = "𓀀𓀁𓀂 𓅀𓅁.";
setBodyContent(bodyContent);
const UChar expected[] = {0xD80C, 0xDC00, 0xD80C, 0xDC01, 0xD80C, 0xDC02, ' ', 0xD80C, 0xDD40, 0xD80C, 0xDD41, '.'};
EphemeralRange range(EphemeralRange::rangeOfContents(document()));
SimplifiedBackwardsTextIterator iter(range.startPosition(), range.endPosition());
BackwardsTextBuffer buffer;
EXPECT_EQ(1, iter.copyTextTo(&buffer, 0, 1)) << "Should emit 1 UChar for '.'.";
EXPECT_EQ(2, iter.copyTextTo(&buffer, 1, 1)) << "Should emit 2 UChars for 'U+13141'.";
EXPECT_EQ(2, iter.copyTextTo(&buffer, 3, 2)) << "Should emit 2 UChars for 'U+13140'.";
EXPECT_EQ(5, iter.copyTextTo(&buffer, 5, 4)) << "Should emit 5 UChars for 'U+13001U+13002 '.";
EXPECT_EQ(2, iter.copyTextTo(&buffer, 10, 2)) << "Should emit 2 UChars for 'U+13000'.";
for (int i = 0; i < 12; i++)
EXPECT_EQ(expected[i], buffer[i]);
}
} // namespace blink
| {
"content_hash": "fd565e003bdf78a885c0a1a049efa262",
"timestamp": "",
"source": "github",
"line_count": 149,
"max_line_length": 142,
"avg_line_length": 55.630872483221474,
"alnum_prop": 0.6921220895162263,
"repo_name": "danakj/chromium",
"id": "17fca10cf2cb9e68a6c3996fbc89b98f6b46b0cf",
"size": "8664",
"binary": false,
"copies": "5",
"ref": "refs/heads/master",
"path": "third_party/WebKit/Source/core/editing/iterators/SimplifiedBackwardsTextIteratorTest.cpp",
"mode": "33188",
"license": "bsd-3-clause",
"language": [],
"symlink_target": ""
} |
using System.Collections;
using System.Diagnostics.CodeAnalysis;
using System.Runtime.Serialization;
using IX.StandardExtensions;
using IX.StandardExtensions.Contracts;
using IX.StandardExtensions.Threading;
using IX.System.Collections.Generic;
using IX.System.IO;
using JetBrains.Annotations;
namespace IX.Guaranteed.Collections;
/// <summary>
/// A base class for persisted queues.
/// </summary>
/// <typeparam name="T">The type of object in the queue.</typeparam>
/// <seealso cref="StandardExtensions.ComponentModel.DisposableBase" />
/// <seealso cref="System.Collections.Generic.IQueue{T}" />
[PublicAPI]
[SuppressMessage(
"Design",
"CA1010:Generic interface should also be implemented",
Justification = "This is not necessary.")]
public abstract class PersistedQueueBase<T> : ReaderWriterSynchronizedBase,
IQueue<T>
{
#region Internal state
private readonly IDirectory directoryShim;
private readonly IFile fileShim;
private readonly IPath pathShim;
/// <summary>
/// The poisoned non-removable files list.
/// </summary>
private readonly List<string> poisonedUnremovableFiles;
private readonly DataContractSerializer serializer;
#endregion
#region Constructors and destructors
/// <summary>
/// Initializes a new instance of the <see cref="PersistedQueueBase{T}" /> class.
/// </summary>
/// <param name="persistenceFolderPath">
/// The persistence folder path.
/// </param>
/// <param name="fileShim">
/// The file shim.
/// </param>
/// <param name="directoryShim">
/// The directory shim.
/// </param>
/// <param name="pathShim">
/// The path shim.
/// </param>
/// <param name="serializer">
/// The serializer.
/// </param>
/// <param name="timeout">
/// The timeout.
/// </param>
/// <exception cref="ArgumentNullException">
/// <paramref name="persistenceFolderPath" />
/// or
/// <paramref name="fileShim" />
/// or
/// <paramref name="directoryShim" />
/// or
/// <paramref name="pathShim" />
/// or
/// <paramref name="serializer" />
/// is <see langword="null" /> (<see langword="Nothing" /> in Visual Basic).
/// </exception>
/// <exception cref="ArgumentInvalidPathException">
/// The folder at <paramref name="persistenceFolderPath" /> does not exist, or is not accessible.
/// </exception>
protected PersistedQueueBase(
string persistenceFolderPath,
IFile fileShim,
IDirectory directoryShim,
IPath pathShim,
DataContractSerializer serializer,
TimeSpan timeout)
: base(timeout)
{
// Dependency validation
Requires.NotNull(
out this.fileShim,
fileShim);
Requires.NotNull(
out this.pathShim,
pathShim);
Requires.NotNull(
out this.directoryShim,
directoryShim);
Requires.NotNull(
out this.serializer,
serializer);
// Parameter validation
_ = directoryShim.RequiresExists(persistenceFolderPath);
// Internal state
poisonedUnremovableFiles = new();
// Persistence folder paths
var dataFolderPath = pathShim.Combine(
persistenceFolderPath,
"Data");
DataFolderPath = dataFolderPath;
var poisonFolderPath = pathShim.Combine(
persistenceFolderPath,
"Poison");
PoisonFolderPath = poisonFolderPath;
// Initialize folder paths
if (!directoryShim.Exists(dataFolderPath))
{
directoryShim.CreateDirectory(dataFolderPath);
}
if (!directoryShim.Exists(poisonFolderPath))
{
directoryShim.CreateDirectory(poisonFolderPath);
}
}
#endregion
#region Properties and indexers
/// <summary>
/// Gets the number of elements contained in the <see cref="PersistedQueueBase{T}" />.
/// </summary>
/// <value>The count.</value>
public abstract int Count { get; }
/// <summary>
/// Gets a value indicating whether this queue is empty.
/// </summary>
/// <value>
/// <c>true</c> if this queue is empty; otherwise, <c>false</c>.
/// </value>
public bool IsEmpty => Count == 0;
/// <summary>
/// Gets a value indicating whether access to the <see cref="PersistedQueueBase{T}" /> is synchronized (thread safe).
/// </summary>
/// <value>The is synchronized.</value>
bool ICollection.IsSynchronized => true;
/// <summary>
/// Gets an object that can be used to synchronize access to the <see cref="PersistedQueueBase{T}" />.
/// </summary>
/// <value>The synchronize root.</value>
object ICollection.SyncRoot { get; } = new();
/// <summary>
/// Gets the data folder path.
/// </summary>
/// <value>The data folder path.</value>
protected string DataFolderPath { get; }
/// <summary>
/// Gets the folder shim.
/// </summary>
/// <value>The folder shim.</value>
protected IDirectory DirectoryShim => directoryShim;
/// <summary>
/// Gets the file shim.
/// </summary>
/// <value>The file shim.</value>
protected IFile FileShim => fileShim;
/// <summary>
/// Gets the path shim.
/// </summary>
/// <value>The path shim.</value>
protected IPath PathShim => pathShim;
/// <summary>
/// Gets the poison folder path.
/// </summary>
/// <value>The poison folder path.</value>
protected string PoisonFolderPath { get; }
/// <summary>
/// Gets the serializer.
/// </summary>
/// <value>The serializer.</value>
protected DataContractSerializer Serializer => serializer;
#endregion
#region Methods
#region Interface implementations
/// <summary>
/// Copies the elements of the <see cref="PersistedQueueBase{T}" /> to an <see cref="Array" />, starting at a
/// particular <see cref="Array" /> index.
/// </summary>
/// <param name="array">
/// The one-dimensional <see cref="Array" /> that is the destination of the elements copied
/// from <see cref="PersistedQueueBase{T}" />. The <see cref="Array" /> must have zero-based indexing.
/// </param>
/// <param name="index">The zero-based index in <paramref name="array" /> at which copying begins.</param>
public abstract void CopyTo(
Array array,
int index);
/// <summary>
/// Returns an enumerator that iterates through the queue.
/// </summary>
/// <returns>An enumerator that can be used to iterate through the queue.</returns>
public abstract IEnumerator<T> GetEnumerator();
/// <summary>
/// Clears the queue of all elements.
/// </summary>
public abstract void Clear();
/// <summary>
/// Verifies whether or not an item is contained in the queue.
/// </summary>
/// <param name="item">The item to verify.</param>
/// <returns><see langword="true" /> if the item is queued, <see langword="false" /> otherwise.</returns>
public abstract bool Contains(T item);
/// <summary>
/// De-queues an item and removes it from the queue.
/// </summary>
/// <returns>The item that has been de-queued.</returns>
public abstract T Dequeue();
/// <summary>
/// Queues an item, adding it to the queue.
/// </summary>
/// <param name="item">The item to enqueue.</param>
public abstract void Enqueue(T item);
/// <summary>
/// Queues a range of elements, adding them to the queue.
/// </summary>
/// <param name="items">The item range to push.</param>
public void EnqueueRange(T[] items)
{
_ = Requires.NotNull(
items);
foreach (T item in items)
{
Enqueue(item);
}
}
/// <summary>
/// Queues a range of elements, adding them to the queue.
/// </summary>
/// <param name="items">The item range to enqueue.</param>
/// <param name="startIndex">The start index.</param>
/// <param name="count">The number of items to enqueue.</param>
public void EnqueueRange(
T[] items,
int startIndex,
int count)
{
_ = Requires.NotNull(
items);
Requires.ValidArrayRange(
in startIndex,
in count,
items);
var itemsRange = new ReadOnlySpan<T>(
items,
startIndex,
count);
foreach (T item in itemsRange)
{
Enqueue(item);
}
}
/// <summary>
/// Peeks at the topmost element in the queue, without removing it.
/// </summary>
/// <returns>The item peeked at, if any.</returns>
public abstract T Peek();
/// <summary>
/// Copies all elements of the queue into a new array.
/// </summary>
/// <returns>The created array with all element of the queue.</returns>
public abstract T[] ToArray();
/// <summary>
/// Trims the excess free space from within the queue, reducing the capacity to the actual number of elements.
/// </summary>
public virtual void TrimExcess() { }
/// <summary>
/// Attempts to de-queue an item and to remove it from queue.
/// </summary>
/// <param name="item">The item that has been de-queued, default if unsuccessful.</param>
/// <returns>
/// <see langword="true" /> if an item is de-queued successfully, <see langword="false" /> otherwise, or if the
/// queue is empty.
/// </returns>
public abstract bool TryDequeue(out T item);
/// <summary>
/// Attempts to peek at the current queue and return the item that is next in line to be dequeued.
/// </summary>
/// <param name="item">The item, or default if unsuccessful.</param>
/// <returns>
/// <see langword="true" /> if an item is found, <see langword="false" /> otherwise, or if the queue is empty.
/// </returns>
public abstract bool TryPeek(out T item);
/// <summary>
/// Returns an enumerator that iterates through the queue.
/// </summary>
/// <returns>An <see cref="T:System.Collections.IEnumerator" /> object that can be used to iterate through the queue.</returns>
[ExcludeFromCodeCoverage]
[SuppressMessage(
"Performance",
"HAA0401:Possible allocation of reference type enumerator",
Justification = "We can't yet avoid this.")]
IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
#endregion
/// <summary>
/// Loads the topmost item from the folder, ensuring its deletion afterwards.
/// </summary>
/// <returns>An item, if one exists and can be loaded, a default value otherwise.</returns>
/// <exception cref="InvalidOperationException">There are no more valid items in the folder.</exception>
protected T LoadTopmostItem()
{
ThrowIfCurrentObjectDisposed();
using (AcquireWriteLock())
{
var files = GetPossibleDataFiles();
var i = 0;
string possibleFilePath;
T obj;
while (true)
{
if (i >= files.Length)
{
throw new InvalidOperationException();
}
possibleFilePath = files[i];
try
{
using Stream stream = FileShim.OpenRead(possibleFilePath);
obj = (T)(Serializer.ReadObject(stream) ?? throw new SerializationException());
break;
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
}
try
{
FileShim.Delete(possibleFilePath);
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
}
return obj;
}
}
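The scan loop above can be summarized in a minimal sketch (illustrative only, not the library's API; the `load` and `quarantine` callables stand in for `Serializer.ReadObject` and `HandleFileLoadProblem`): walk the candidate files in order, quarantine any that fail to load, and return the first one that deserializes.

```python
# Hedged sketch of LoadTopmostItem's scan: try each candidate file in
# order, quarantine unreadable ones, return the first loadable item.
def load_topmost(files, load, quarantine):
    for path in files:
        try:
            return load(path), path  # item plus the file it came from
        except (OSError, ValueError):
            quarantine(path)  # stands in for HandleFileLoadProblem
    raise RuntimeError("no more valid items in the folder")
```

In the C# original the winning file is additionally deleted afterwards, under a write lock, so the item is consumed rather than merely read.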
/// <summary>
/// Tries to load the topmost item and execute an action on it, deleting the topmost object data if the operation is
/// successful.
/// </summary>
/// <typeparam name="TState">The type of the state object to send to the action.</typeparam>
/// <param name="actionToInvoke">The action to invoke.</param>
/// <param name="state">The state object to pass to the invoked action.</param>
/// <returns><see langword="true" /> if de-queuing and executing is successful, <see langword="false" /> otherwise.</returns>
protected bool TryLoadTopmostItemWithAction<TState>(
Action<TState, T> actionToInvoke,
TState state)
{
_ = Requires.NotNull(
actionToInvoke);
ThrowIfCurrentObjectDisposed();
using var locker = AcquireReadWriteLock();
var files = GetPossibleDataFiles();
var i = 0;
T obj;
string possibleFilePath;
while (true)
{
if (i >= files.Length)
{
return false;
}
possibleFilePath = files[i];
try
{
using Stream stream = FileShim.OpenRead(possibleFilePath);
obj = (T)(Serializer.ReadObject(stream) ?? throw new SerializationException());
break;
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
}
try
{
actionToInvoke(
state,
obj);
}
catch (Exception)
{
return false;
}
_ = locker.Upgrade();
try
{
FileShim.Delete(possibleFilePath);
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
}
return true;
}
/// <summary>
/// Asynchronously tries to load the topmost item and execute an action on it, deleting the topmost object data if the
/// operation is successful.
/// </summary>
/// <typeparam name="TState">The type of the state object to send to the action.</typeparam>
/// <param name="actionToInvoke">The action to invoke.</param>
/// <param name="state">The state object to pass to the invoked action.</param>
/// <param name="cancellationToken">The cancellation token for this operation.</param>
/// <returns><see langword="true" /> if de-queuing and executing is successful, <see langword="false" /> otherwise.</returns>
[SuppressMessage(
"Performance",
"HAA0603:Delegate allocation from a method group",
Justification = "Acceptable - we're doing a lot of allocation in the Task method anyway.")]
protected async Task<bool> TryLoadTopmostItemWithActionAsync<TState>(
Func<TState, T, Task> actionToInvoke,
TState state,
CancellationToken cancellationToken = default)
{
// TODO BREAKING: In next breaking-changes version, switch this to a ValueTask-returning method
return await TryLoadTopmostItemWithActionAsync(
InvokeActionLocal,
state,
cancellationToken);
async ValueTask InvokeActionLocal(
TState stateInternal,
T obj) =>
await actionToInvoke(
stateInternal,
obj);
}
/// <summary>
/// Asynchronously tries to load the topmost item and execute an action on it, deleting the topmost object data if the
/// operation is successful.
/// </summary>
/// <typeparam name="TState">The type of the state object to send to the action.</typeparam>
/// <param name="actionToInvoke">The action to invoke.</param>
/// <param name="state">The state object to pass to the invoked action.</param>
/// <param name="cancellationToken">The cancellation token for this operation.</param>
/// <returns><see langword="true" /> if de-queuing and executing is successful, <see langword="false" /> otherwise.</returns>
protected async ValueTask<bool> TryLoadTopmostItemWithActionAsync<TState>(
Func<TState, T, ValueTask> actionToInvoke,
TState state,
CancellationToken cancellationToken = default)
{
Requires.NotNull(actionToInvoke);
this.RequiresNotDisposed();
using var locker = AcquireReadWriteLock();
var files = GetPossibleDataFiles();
var i = 0;
T obj;
string possibleFilePath;
while (true)
{
cancellationToken.ThrowIfCancellationRequested();
if (i >= files.Length)
{
return false;
}
possibleFilePath = files[i];
try
{
#if FRAMEWORK_ADVANCED
await using Stream stream = FileShim.OpenRead(possibleFilePath);
#else
using Stream stream = FileShim.OpenRead(possibleFilePath);
#endif
obj = (T)(Serializer.ReadObject(stream) ?? throw new SerializationException());
break;
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
}
try
{
await actionToInvoke(
state,
obj)
.ConfigureAwait(false);
}
catch (Exception)
{
return false;
}
_ = locker.Upgrade();
try
{
await FileShim.DeleteAsync(
possibleFilePath,
cancellationToken);
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
}
return true;
}
/// <summary>
/// Tries to load the topmost item and execute an action on it, deleting the topmost object data if the operation is
/// successful.
/// </summary>
/// <typeparam name="TState">The type of the state object to send to the action.</typeparam>
/// <param name="predicate">The predicate.</param>
/// <param name="actionToInvoke">The action to invoke.</param>
/// <param name="state">The state object to pass to the invoked action.</param>
/// <returns>The number of items that have been de-queued.</returns>
/// <remarks>
/// <para>
/// Warning! This method has the potential of overrunning its read/write lock timeouts. Please ensure that the
/// <paramref name="predicate" /> method
/// filters out items in a way that limits the amount of data passing through.
/// </para>
/// </remarks>
protected int TryLoadWhilePredicateWithAction<TState>(
Func<TState, T, bool> predicate,
Action<TState, IEnumerable<T>> actionToInvoke,
TState state)
{
_ = Requires.NotNull(
predicate);
_ = Requires.NotNull(
actionToInvoke);
this.RequiresNotDisposed();
using var locker = AcquireReadWriteLock();
var files = GetPossibleDataFiles();
var i = 0;
var accumulatedObjects = new List<T>();
var accumulatedPaths = new List<string>();
while (i < files.Length)
{
var possibleFilePath = files[i];
try
{
T obj;
using (Stream stream = FileShim.OpenRead(possibleFilePath))
{
obj = (T)(Serializer.ReadObject(stream) ?? throw new SerializationException());
}
if (!predicate(
state,
obj))
{
break;
}
accumulatedObjects.Add(obj);
accumulatedPaths.Add(possibleFilePath);
i++;
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
}
if (accumulatedObjects.Count <= 0)
{
return accumulatedPaths.Count;
}
try
{
actionToInvoke(
state,
accumulatedObjects);
}
catch (Exception)
{
return 0;
}
_ = locker.Upgrade();
foreach (var possibleFilePath in accumulatedPaths)
{
try
{
FileShim.Delete(possibleFilePath);
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
}
}
return accumulatedPaths.Count;
}
/// <summary>
/// Asynchronously tries to load the topmost item and execute an action on it, deleting the topmost object data if the
/// operation is successful.
/// </summary>
/// <typeparam name="TState">The type of the state object to send to the action.</typeparam>
/// <param name="predicate">The predicate.</param>
/// <param name="actionToInvoke">The action to invoke.</param>
/// <param name="state">The state object to pass to the invoked action.</param>
/// <param name="cancellationToken">The cancellation token for this operation.</param>
/// <returns>The number of items that have been de-queued.</returns>
/// <remarks>
/// <para>
/// Warning! This method has the potential of overrunning its read/write lock timeouts. Please ensure that the
/// <paramref name="predicate" /> method
/// filters out items in a way that limits the amount of data passing through.
/// </para>
/// </remarks>
[SuppressMessage(
"Performance",
"HAA0603:Delegate allocation from a method group",
Justification = "Acceptable - we're doing a lot of allocation in the Task method anyway.")]
protected async Task<int> TryLoadWhilePredicateWithActionAsync<TState>(
Func<TState, T, Task<bool>> predicate,
Action<TState, IEnumerable<T>> actionToInvoke,
TState state,
CancellationToken cancellationToken = default)
{
// TODO BREAKING: In next breaking-changes version, switch this to a ValueTask-returning method
return await TryLoadWhilePredicateWithActionAsync(
InvokePredicateLocal,
actionToInvoke,
state,
cancellationToken);
async ValueTask<bool> InvokePredicateLocal(
TState stateInternal,
T obj) =>
await predicate(
stateInternal,
obj);
}
/// <summary>
/// Asynchronously tries to load the topmost item and execute an action on it, deleting the topmost object data if the
/// operation is successful.
/// </summary>
/// <typeparam name="TState">The type of the state object to send to the action.</typeparam>
/// <param name="predicate">The predicate.</param>
/// <param name="actionToInvoke">The action to invoke.</param>
/// <param name="state">The state object to pass to the invoked action.</param>
/// <param name="cancellationToken">The cancellation token for this operation.</param>
/// <returns>The number of items that have been de-queued.</returns>
/// <remarks>
/// <para>
/// Warning! This method has the potential of overrunning its read/write lock timeouts. Please ensure that the
/// <paramref name="predicate" /> method
/// filters out items in a way that limits the amount of data passing through.
/// </para>
/// </remarks>
protected async ValueTask<int> TryLoadWhilePredicateWithActionAsync<TState>(
Func<TState, T, ValueTask<bool>> predicate,
Action<TState, IEnumerable<T>> actionToInvoke,
TState state,
CancellationToken cancellationToken = default)
{
Requires.NotNull(predicate);
Requires.NotNull(actionToInvoke);
this.RequiresNotDisposed();
using var locker = AcquireReadWriteLock();
var files = GetPossibleDataFiles();
var i = 0;
var accumulatedObjects = new List<T>();
var accumulatedPaths = new List<string>();
while (i < files.Length)
{
if (cancellationToken.IsCancellationRequested)
{
break;
}
var possibleFilePath = files[i];
try
{
T obj;
#if FRAMEWORK_ADVANCED
await using (Stream stream = FileShim.OpenRead(possibleFilePath))
#else
using (Stream stream = FileShim.OpenRead(possibleFilePath))
#endif
{
obj = (T)(Serializer.ReadObject(stream) ?? throw new SerializationException());
}
if (!await predicate(
state,
obj)
.ConfigureAwait(false))
{
break;
}
accumulatedObjects.Add(obj);
accumulatedPaths.Add(possibleFilePath);
i++;
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
}
if (accumulatedObjects.Count <= 0)
{
cancellationToken.ThrowIfCancellationRequested();
return accumulatedPaths.Count;
}
try
{
actionToInvoke(
state,
accumulatedObjects);
}
catch (Exception)
{
cancellationToken.ThrowIfCancellationRequested();
return 0;
}
_ = locker.Upgrade();
foreach (var possibleFilePath in accumulatedPaths)
{
try
{
await FileShim.DeleteAsync(
possibleFilePath,
cancellationToken);
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
}
}
cancellationToken.ThrowIfCancellationRequested();
return accumulatedPaths.Count;
}
/// <summary>
/// Asynchronously tries to load the topmost item and execute an action on it, deleting the topmost object data if the
/// operation is successful.
/// </summary>
/// <typeparam name="TState">The type of the state object to send to the action.</typeparam>
/// <param name="predicate">The predicate.</param>
/// <param name="actionToInvoke">The action to invoke.</param>
/// <param name="state">The state object to pass to the invoked action.</param>
/// <param name="cancellationToken">The cancellation token for this operation.</param>
/// <returns>The number of items that have been de-queued.</returns>
/// <remarks>
/// <para>
/// Warning! This method has the potential of overrunning its read/write lock timeouts. Please ensure that the
/// <paramref name="predicate" /> method
/// filters out items in a way that limits the amount of data passing through.
/// </para>
/// </remarks>
[SuppressMessage(
"Performance",
"HAA0603:Delegate allocation from a method group",
Justification = "Acceptable - we're doing a lot of allocation in the Task method anyway.")]
protected async Task<int> TryLoadWhilePredicateWithActionAsync<TState>(
Func<TState, T, bool> predicate,
Func<TState, IEnumerable<T>, Task> actionToInvoke,
TState state,
CancellationToken cancellationToken = default)
{
// TODO BREAKING: In next breaking-changes version, switch this to a ValueTask-returning method
return await TryLoadWhilePredicateWithActionAsync(
predicate,
InvokeActionLocal,
state,
cancellationToken);
async ValueTask InvokeActionLocal(
TState stateInternal,
IEnumerable<T> obj) =>
await actionToInvoke(
stateInternal,
obj);
}
/// <summary>
/// Asynchronously tries to load the topmost item and execute an action on it, deleting the topmost object data if the
/// operation is successful.
/// </summary>
/// <typeparam name="TState">The type of the state object to send to the action.</typeparam>
/// <param name="predicate">The predicate.</param>
/// <param name="actionToInvoke">The action to invoke.</param>
/// <param name="state">The state object to pass to the invoked action.</param>
/// <param name="cancellationToken">The cancellation token for this operation.</param>
/// <returns>The number of items that have been de-queued.</returns>
/// <remarks>
/// <para>
/// Warning! This method has the potential of overrunning its read/write lock timeouts. Please ensure that the
/// <paramref name="predicate" /> method
/// filters out items in a way that limits the amount of data passing through.
/// </para>
/// </remarks>
protected async ValueTask<int> TryLoadWhilePredicateWithActionAsync<TState>(
Func<TState, T, bool> predicate,
Func<TState, IEnumerable<T>, ValueTask> actionToInvoke,
TState state,
CancellationToken cancellationToken = default)
{
Requires.NotNull(predicate);
Requires.NotNull(actionToInvoke);
this.RequiresNotDisposed();
using var locker = AcquireReadWriteLock();
var files = GetPossibleDataFiles();
var i = 0;
var accumulatedObjects = new List<T>();
var accumulatedPaths = new List<string>();
while (i < files.Length)
{
if (cancellationToken.IsCancellationRequested)
{
break;
}
var possibleFilePath = files[i];
try
{
T obj;
#if FRAMEWORK_ADVANCED
await using (Stream stream = FileShim.OpenRead(possibleFilePath))
#else
using (Stream stream = FileShim.OpenRead(possibleFilePath))
#endif
{
obj = (T)(Serializer.ReadObject(stream) ?? throw new SerializationException());
}
if (!predicate(
state,
obj))
{
break;
}
accumulatedObjects.Add(obj);
accumulatedPaths.Add(possibleFilePath);
i++;
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
}
if (accumulatedObjects.Count <= 0)
{
cancellationToken.ThrowIfCancellationRequested();
return accumulatedPaths.Count;
}
try
{
await actionToInvoke(
state,
accumulatedObjects)
.ConfigureAwait(false);
}
catch (Exception)
{
cancellationToken.ThrowIfCancellationRequested();
return 0;
}
_ = locker.Upgrade();
foreach (var possibleFilePath in accumulatedPaths)
{
try
{
await FileShim.DeleteAsync(
possibleFilePath,
cancellationToken);
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
}
}
cancellationToken.ThrowIfCancellationRequested();
return accumulatedPaths.Count;
}
/// <summary>
/// Asynchronously tries to load the topmost item and execute an action on it, deleting the topmost object data if the
/// operation is successful.
/// </summary>
/// <typeparam name="TState">The type of the state object to send to the action.</typeparam>
/// <param name="predicate">The predicate.</param>
/// <param name="actionToInvoke">The action to invoke.</param>
/// <param name="state">The state object to pass to the invoked action.</param>
/// <param name="cancellationToken">The cancellation token for this operation.</param>
/// <returns>The number of items that have been de-queued.</returns>
/// <remarks>
/// <para>
/// Warning! This method has the potential of overrunning its read/write lock timeouts. Please ensure that the
/// <paramref name="predicate" /> method
/// filters out items in a way that limits the amount of data passing through.
/// </para>
/// </remarks>
[SuppressMessage(
"Performance",
"HAA0603:Delegate allocation from a method group",
Justification = "Acceptable - we're doing a lot of allocation in the Task method anyway.")]
protected async Task<int> TryLoadWhilePredicateWithActionAsync<TState>(
Func<TState, T, Task<bool>> predicate,
Func<TState, IEnumerable<T>, Task> actionToInvoke,
TState state,
CancellationToken cancellationToken = default)
{
// TODO BREAKING: In next breaking-changes version, switch this to a ValueTask-returning method
return await TryLoadWhilePredicateWithActionAsync(
InvokePredicateLocal,
InvokeActionLocal,
state,
cancellationToken);
async ValueTask<bool> InvokePredicateLocal(
TState stateInternal,
T obj) =>
await predicate(
stateInternal,
obj);
async ValueTask InvokeActionLocal(
TState stateInternal,
IEnumerable<T> obj) =>
await actionToInvoke(
stateInternal,
obj);
}
/// <summary>
/// Asynchronously tries to load the topmost item and execute an action on it, deleting the topmost object data if the
/// operation is successful.
/// </summary>
/// <typeparam name="TState">The type of the state object to send to the action.</typeparam>
/// <param name="predicate">The predicate.</param>
/// <param name="actionToInvoke">The action to invoke.</param>
/// <param name="state">The state object to pass to the invoked action.</param>
/// <param name="cancellationToken">The cancellation token for this operation.</param>
/// <returns>The number of items that have been de-queued.</returns>
/// <remarks>
/// <para>
/// Warning! This method has the potential of overrunning its read/write lock timeouts. Please ensure that the
/// <paramref name="predicate" /> method
/// filters out items in a way that limits the amount of data passing through.
/// </para>
/// </remarks>
protected async ValueTask<int> TryLoadWhilePredicateWithActionAsync<TState>(
Func<TState, T, ValueTask<bool>> predicate,
Func<TState, IEnumerable<T>, ValueTask> actionToInvoke,
TState state,
CancellationToken cancellationToken = default)
{
Requires.NotNull(predicate);
Requires.NotNull(actionToInvoke);
ThrowIfCurrentObjectDisposed();
using var locker = AcquireReadWriteLock();
var files = GetPossibleDataFiles();
var i = 0;
var accumulatedObjects = new List<T>();
var accumulatedPaths = new List<string>();
while (i < files.Length)
{
if (cancellationToken.IsCancellationRequested)
{
break;
}
var possibleFilePath = files[i];
try
{
T obj;
#if FRAMEWORK_ADVANCED
await using (Stream stream = FileShim.OpenRead(possibleFilePath))
#else
using (Stream stream = FileShim.OpenRead(possibleFilePath))
#endif
{
obj = (T)(Serializer.ReadObject(stream) ?? throw new SerializationException());
}
if (!await predicate(
state,
obj)
.ConfigureAwait(false))
{
break;
}
accumulatedObjects.Add(obj);
accumulatedPaths.Add(possibleFilePath);
i++;
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
}
if (accumulatedObjects.Count <= 0)
{
cancellationToken.ThrowIfCancellationRequested();
return accumulatedPaths.Count;
}
try
{
await actionToInvoke(
state,
accumulatedObjects)
.ConfigureAwait(false);
}
catch (Exception)
{
cancellationToken.ThrowIfCancellationRequested();
return 0;
}
_ = locker.Upgrade();
foreach (var possibleFilePath in accumulatedPaths)
{
try
{
await FileShim.DeleteAsync(
possibleFilePath,
cancellationToken);
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
}
}
cancellationToken.ThrowIfCancellationRequested();
return accumulatedPaths.Count;
}
/// <summary>
/// Peeks at the topmost item in the folder.
/// </summary>
/// <returns>The topmost item, if one exists and can be loaded.</returns>
/// <exception cref="InvalidOperationException">There are no more valid items in the folder.</exception>
protected T PeekTopmostItem() => !TryPeekTopmostItem(out T item) ? throw new InvalidOperationException() : item;
/// <summary>
/// Peeks at the topmost item in the folder.
/// </summary>
/// <param name="item">The item.</param>
/// <returns>
/// <see langword="true" /> if an item is found, <see langword="false" /> otherwise, or if the queue is empty.
/// </returns>
protected bool TryPeekTopmostItem(out T item)
{
ThrowIfCurrentObjectDisposed();
using (AcquireReadLock())
{
var files = GetPossibleDataFiles();
var i = 0;
while (true)
{
if (i >= files.Length)
{
item = default!;
return false;
}
var possibleFilePath = files[i];
try
{
using Stream stream = FileShim.OpenRead(possibleFilePath);
item = (T)(Serializer.ReadObject(stream) ?? throw new SerializationException());
return true;
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
i++;
}
}
}
}
/// <summary>
/// Loads the items from the folder.
/// </summary>
/// <returns>An enumeration of the valid items that could be loaded, paired with the paths of their data files.</returns>
/// <remarks>
/// <para>Warning! Not synchronized.</para>
/// <para>
/// This method is not synchronized between threads. Please ensure that you only use this method in a guaranteed
/// one-time-access manner (such as a constructor).
/// </para>
/// </remarks>
protected IEnumerable<Tuple<T, string>> LoadValidItemObjectHandles()
{
foreach (var possibleFilePath in GetPossibleDataFiles())
{
T obj;
try
{
using Stream stream = FileShim.OpenRead(possibleFilePath);
obj = (T)(Serializer.ReadObject(stream) ?? throw new SerializationException());
}
catch (IOException)
{
HandleFileLoadProblem(possibleFilePath);
continue;
}
catch (UnauthorizedAccessException)
{
HandleFileLoadProblem(possibleFilePath);
continue;
}
catch (SerializationException)
{
HandleFileLoadProblem(possibleFilePath);
continue;
}
yield return new(
obj,
possibleFilePath);
}
}
/// <summary>
/// Saves the new item to the disk.
/// </summary>
/// <param name="item">The item to save.</param>
/// <returns>The path of the newly-saved file.</returns>
/// <exception cref="InvalidOperationException">
/// We have reached the maximum number of items saved within the same 100-nanosecond timestamp tick.
/// This is theoretically not possible.
/// </exception>
[SuppressMessage(
"Performance",
"HAA0601:Value type to reference type conversion causing boxing allocation",
Justification = "This is unavoidable, considering how the method works.")]
protected string SaveNewItem(T item)
{
ThrowIfCurrentObjectDisposed();
using (AcquireWriteLock())
{
var i = 1;
string filePath;
DateTime now = DateTime.UtcNow;
do
{
filePath = PathShim.Combine(
DataFolderPath,
$"{now:yyyy.MM.dd.HH.mm.ss.fffffff}.{i}.dat");
i++;
if (i == int.MaxValue)
{
throw new InvalidOperationException();
}
}
while (FileShim.Exists(filePath));
using (Stream stream = FileShim.Create(filePath))
{
Serializer.WriteObject(
stream,
item!);
}
return filePath;
}
}
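The file-name scheme in `SaveNewItem` uses zero-padded, most-significant-first timestamp components, so names sort lexicographically in creation order (which the folder-enumeration-based "topmost" logic presumably relies on). A minimal sketch of that property, assuming the `.NET` format `yyyy.MM.dd.HH.mm.ss.fffffff` (seven fractional-second digits, i.e. 100 ns ticks) — note Python datetimes only carry microseconds, so the last digit is padded here:

```python
from datetime import datetime, timedelta

def data_file_name(now, counter):
    # Approximates the C# format "yyyy.MM.dd.HH.mm.ss.fffffff":
    # Python has microsecond precision, so multiply by 10 to get
    # a seven-digit 100-ns-tick field.
    ticks = now.microsecond * 10
    return f"{now:%Y.%m.%d.%H.%M.%S}.{ticks:07d}.{counter}.dat"

t0 = datetime(2024, 1, 2, 3, 4, 5, 123456)
names = [data_file_name(t0 + timedelta(seconds=s), 1) for s in range(3)]
# Zero-padded, most-significant-first components sort chronologically.
assert names == sorted(names)
```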
/// <summary>
/// Clears the data.
/// </summary>
protected void ClearData()
{
ThrowIfCurrentObjectDisposed();
using (AcquireWriteLock())
{
foreach (var possibleFilePath in DirectoryShim.EnumerateFiles(
DataFolderPath,
"*.dat")
.ToArray())
{
HandleImpossibleMoveToPoison(possibleFilePath);
}
}
FixUnmovableReferences();
}
/// <summary>
/// Gets the possible data files.
/// </summary>
/// <returns>An array of data file names.</returns>
protected string[] GetPossibleDataFiles() =>
DirectoryShim.EnumerateFiles(
DataFolderPath,
"*.dat")
.Except(poisonedUnremovableFiles)
.ToArray();
/// <summary>
/// Handles the file load problem.
/// </summary>
/// <param name="possibleFilePath">The possible file path.</param>
private void HandleFileLoadProblem(string possibleFilePath)
{
var newFilePath = PathShim.Combine(
PoisonFolderPath,
PathShim.GetFileName(possibleFilePath));
// The seemingly-redundant catch blocks below will be replaced at a later time with an opt-in logging solution
// and a more general try/finally approach
// If an item by the same name exists in the poison queue, delete it
try
{
if (FileShim.Exists(newFilePath))
{
FileShim.Delete(newFilePath);
}
}
catch (IOException)
{
HandleImpossibleMoveToPoison(possibleFilePath);
return;
}
catch (UnauthorizedAccessException)
{
HandleImpossibleMoveToPoison(possibleFilePath);
return;
}
try
{
// Move to poison queue
FileShim.Move(
possibleFilePath,
newFilePath);
}
catch (IOException)
{
HandleImpossibleMoveToPoison(possibleFilePath);
}
catch (UnauthorizedAccessException)
{
HandleImpossibleMoveToPoison(possibleFilePath);
}
}
/// <summary>
/// Handles the situation where it is impossible to move a file to poison.
/// </summary>
/// <param name="possibleFilePath">The possible file path.</param>
private void HandleImpossibleMoveToPoison(string possibleFilePath)
{
try
{
// Moving to the poison folder was not possible, so delete the offending item instead
FileShim.Delete(possibleFilePath);
}
catch (IOException)
{
poisonedUnremovableFiles.Add(possibleFilePath);
}
catch (UnauthorizedAccessException)
{
poisonedUnremovableFiles.Add(possibleFilePath);
}
}
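Together, `HandleFileLoadProblem` and `HandleImpossibleMoveToPoison` form a degradation cascade. A hedged sketch (illustrative only; the `unremovable` set stands in for `poisonedUnremovableFiles`): try to move the bad file into the poison folder after clearing any stale copy, fall back to deleting it outright, and as a last resort remember the path so later scans skip it.

```python
import os
import shutil

def handle_file_load_problem(path, poison_dir, unremovable):
    # 1) Try to move the bad file into the poison folder, replacing
    #    any same-named file already there.
    target = os.path.join(poison_dir, os.path.basename(path))
    try:
        if os.path.exists(target):
            os.remove(target)
        shutil.move(path, target)
    except OSError:
        # 2) The move failed: try to delete the offending file.
        try:
            os.remove(path)
        except OSError:
            # 3) Even deletion failed: remember it so future
            #    data-file scans exclude it.
            unremovable.add(path)
```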
/// <summary>
/// Fixes the unmovable references.
/// </summary>
[SuppressMessage(
"StyleCop.CSharp.LayoutRules",
"SA1501:Statement should not be on a single line",
Justification = "It's fine.")]
private void FixUnmovableReferences()
{
foreach (var file in poisonedUnremovableFiles.ToArray())
{
try
{
if (!FileShim.Exists(file))
{
_ = poisonedUnremovableFiles.Remove(file);
}
}
catch (IOException) { }
catch (UnauthorizedAccessException) { }
}
}
#endregion
}
namespace v8 {
namespace internal {
bool CpuFeatures::SupportsCrankshaft() { return IsSupported(FPU); }
// -----------------------------------------------------------------------------
// Operand and MemOperand.
Operand::Operand(int32_t immediate, RelocInfo::Mode rmode) {
rm_ = no_reg;
imm32_ = immediate;
rmode_ = rmode;
}
Operand::Operand(const ExternalReference& f) {
rm_ = no_reg;
imm32_ = reinterpret_cast<int32_t>(f.address());
rmode_ = RelocInfo::EXTERNAL_REFERENCE;
}
Operand::Operand(Smi* value) {
rm_ = no_reg;
imm32_ = reinterpret_cast<intptr_t>(value);
rmode_ = RelocInfo::NONE32;
}
Operand::Operand(Register rm) {
rm_ = rm;
}
bool Operand::is_reg() const {
return rm_.is_valid();
}
int Register::NumAllocatableRegisters() {
return kMaxNumAllocatableRegisters;
}
int DoubleRegister::NumRegisters() {
return FPURegister::kMaxNumRegisters;
}
int DoubleRegister::NumAllocatableRegisters() {
return FPURegister::kMaxNumAllocatableRegisters;
}
int DoubleRegister::NumAllocatableAliasedRegisters() {
return NumAllocatableRegisters();
}
int FPURegister::ToAllocationIndex(FPURegister reg) {
DCHECK(reg.code() % 2 == 0);
DCHECK(reg.code() / 2 < kMaxNumAllocatableRegisters);
DCHECK(reg.is_valid());
DCHECK(!reg.is(kDoubleRegZero));
DCHECK(!reg.is(kLithiumScratchDouble));
return (reg.code() / 2);
}
// -----------------------------------------------------------------------------
// RelocInfo.
void RelocInfo::apply(intptr_t delta, ICacheFlushMode icache_flush_mode) {
if (IsCodeTarget(rmode_)) {
uint32_t scope1 = (uint32_t) target_address() & ~kImm28Mask;
uint32_t scope2 = reinterpret_cast<uint32_t>(pc_) & ~kImm28Mask;
if (scope1 != scope2) {
Assembler::JumpLabelToJumpRegister(pc_);
}
}
if (IsInternalReference(rmode_) || IsInternalReferenceEncoded(rmode_)) {
// Absolute code pointer inside code object moves with the code object.
byte* p = reinterpret_cast<byte*>(pc_);
int count = Assembler::RelocateInternalReference(rmode_, p, delta);
CpuFeatures::FlushICache(p, count * sizeof(uint32_t));
}
}
Address RelocInfo::target_address() {
DCHECK(IsCodeTarget(rmode_) || IsRuntimeEntry(rmode_));
return Assembler::target_address_at(pc_, host_);
}
Address RelocInfo::target_address_address() {
DCHECK(IsCodeTarget(rmode_) ||
IsRuntimeEntry(rmode_) ||
rmode_ == EMBEDDED_OBJECT ||
rmode_ == EXTERNAL_REFERENCE);
// Read the address of the word containing the target_address in an
// instruction stream.
// The only architecture-independent user of this function is the serializer.
// The serializer uses it to find out how many raw bytes of instruction to
// output before the next target.
// For an instruction like LUI/ORI where the target bits are mixed into the
// instruction bits, the size of the target will be zero, indicating that the
// serializer should not step forward in memory after a target is resolved
// and written. In this case the target_address_address function should
// return the end of the instructions to be patched, allowing the
// deserializer to deserialize the instructions as raw bytes and put them in
// place, ready to be patched with the target. After jump optimization,
// that is the address of the instruction that follows J/JAL/JR/JALR
// instruction.
return reinterpret_cast<Address>(
pc_ + Assembler::kInstructionsFor32BitConstant * Assembler::kInstrSize);
}
Address RelocInfo::constant_pool_entry_address() {
UNREACHABLE();
return NULL;
}
int RelocInfo::target_address_size() {
return Assembler::kSpecialTargetSize;
}
void RelocInfo::set_target_address(Address target,
WriteBarrierMode write_barrier_mode,
ICacheFlushMode icache_flush_mode) {
DCHECK(IsCodeTarget(rmode_) || IsRuntimeEntry(rmode_));
Assembler::set_target_address_at(pc_, host_, target, icache_flush_mode);
if (write_barrier_mode == UPDATE_WRITE_BARRIER &&
host() != NULL && IsCodeTarget(rmode_)) {
Object* target_code = Code::GetCodeFromTargetAddress(target);
host()->GetHeap()->incremental_marking()->RecordWriteIntoCode(
host(), this, HeapObject::cast(target_code));
}
}
Address Assembler::target_address_from_return_address(Address pc) {
return pc - kCallTargetAddressOffset;
}
Address Assembler::break_address_from_return_address(Address pc) {
return pc - Assembler::kPatchDebugBreakSlotReturnOffset;
}
void Assembler::set_target_internal_reference_encoded_at(Address pc,
Address target) {
  // Encoded internal references are a lui/ori load of a 32-bit absolute address.
Instr instr_lui = Assembler::instr_at(pc + 0 * Assembler::kInstrSize);
Instr instr_ori = Assembler::instr_at(pc + 1 * Assembler::kInstrSize);
DCHECK(Assembler::IsLui(instr_lui));
DCHECK(Assembler::IsOri(instr_ori));
instr_lui &= ~kImm16Mask;
instr_ori &= ~kImm16Mask;
int32_t imm = reinterpret_cast<int32_t>(target);
DCHECK((imm & 3) == 0);
Assembler::instr_at_put(pc + 0 * Assembler::kInstrSize,
instr_lui | ((imm >> kLuiShift) & kImm16Mask));
Assembler::instr_at_put(pc + 1 * Assembler::kInstrSize,
instr_ori | (imm & kImm16Mask));
  // Currently used only by the deserializer; all code will be flushed after
  // deserialization completes, so there is no need to flush on each reference.
}
void Assembler::deserialization_set_target_internal_reference_at(
Address pc, Address target, RelocInfo::Mode mode) {
if (mode == RelocInfo::INTERNAL_REFERENCE_ENCODED) {
DCHECK(IsLui(instr_at(pc)));
set_target_internal_reference_encoded_at(pc, target);
} else {
DCHECK(mode == RelocInfo::INTERNAL_REFERENCE);
Memory::Address_at(pc) = target;
}
}
Object* RelocInfo::target_object() {
DCHECK(IsCodeTarget(rmode_) || rmode_ == EMBEDDED_OBJECT);
return reinterpret_cast<Object*>(Assembler::target_address_at(pc_, host_));
}
Handle<Object> RelocInfo::target_object_handle(Assembler* origin) {
DCHECK(IsCodeTarget(rmode_) || rmode_ == EMBEDDED_OBJECT);
return Handle<Object>(reinterpret_cast<Object**>(
Assembler::target_address_at(pc_, host_)));
}
void RelocInfo::set_target_object(Object* target,
WriteBarrierMode write_barrier_mode,
ICacheFlushMode icache_flush_mode) {
DCHECK(IsCodeTarget(rmode_) || rmode_ == EMBEDDED_OBJECT);
Assembler::set_target_address_at(pc_, host_,
reinterpret_cast<Address>(target),
icache_flush_mode);
if (write_barrier_mode == UPDATE_WRITE_BARRIER &&
host() != NULL &&
target->IsHeapObject()) {
host()->GetHeap()->incremental_marking()->RecordWrite(
host(), &Memory::Object_at(pc_), HeapObject::cast(target));
}
}
Address RelocInfo::target_external_reference() {
DCHECK(rmode_ == EXTERNAL_REFERENCE);
return Assembler::target_address_at(pc_, host_);
}
Address RelocInfo::target_internal_reference() {
if (rmode_ == INTERNAL_REFERENCE) {
return Memory::Address_at(pc_);
} else {
    // Encoded internal references are a lui/ori load of a 32-bit absolute address.
DCHECK(rmode_ == INTERNAL_REFERENCE_ENCODED);
Instr instr_lui = Assembler::instr_at(pc_ + 0 * Assembler::kInstrSize);
Instr instr_ori = Assembler::instr_at(pc_ + 1 * Assembler::kInstrSize);
DCHECK(Assembler::IsLui(instr_lui));
DCHECK(Assembler::IsOri(instr_ori));
int32_t imm = (instr_lui & static_cast<int32_t>(kImm16Mask)) << kLuiShift;
imm |= (instr_ori & static_cast<int32_t>(kImm16Mask));
return reinterpret_cast<Address>(imm);
}
}
Address RelocInfo::target_internal_reference_address() {
DCHECK(rmode_ == INTERNAL_REFERENCE || rmode_ == INTERNAL_REFERENCE_ENCODED);
return reinterpret_cast<Address>(pc_);
}
Address RelocInfo::target_runtime_entry(Assembler* origin) {
DCHECK(IsRuntimeEntry(rmode_));
return target_address();
}
void RelocInfo::set_target_runtime_entry(Address target,
WriteBarrierMode write_barrier_mode,
ICacheFlushMode icache_flush_mode) {
DCHECK(IsRuntimeEntry(rmode_));
if (target_address() != target)
set_target_address(target, write_barrier_mode, icache_flush_mode);
}
Handle<Cell> RelocInfo::target_cell_handle() {
DCHECK(rmode_ == RelocInfo::CELL);
Address address = Memory::Address_at(pc_);
return Handle<Cell>(reinterpret_cast<Cell**>(address));
}
Cell* RelocInfo::target_cell() {
DCHECK(rmode_ == RelocInfo::CELL);
return Cell::FromValueAddress(Memory::Address_at(pc_));
}
void RelocInfo::set_target_cell(Cell* cell,
WriteBarrierMode write_barrier_mode,
ICacheFlushMode icache_flush_mode) {
DCHECK(rmode_ == RelocInfo::CELL);
Address address = cell->address() + Cell::kValueOffset;
Memory::Address_at(pc_) = address;
if (write_barrier_mode == UPDATE_WRITE_BARRIER && host() != NULL) {
    // TODO(1550) We are passing NULL as a slot because a cell can never be on
    // an evacuation candidate.
host()->GetHeap()->incremental_marking()->RecordWrite(
host(), NULL, cell);
}
}
static const int kNoCodeAgeSequenceLength = 7 * Assembler::kInstrSize;
Handle<Object> RelocInfo::code_age_stub_handle(Assembler* origin) {
  UNREACHABLE();  // This should never be reached on MIPS.
return Handle<Object>();
}
Code* RelocInfo::code_age_stub() {
DCHECK(rmode_ == RelocInfo::CODE_AGE_SEQUENCE);
return Code::GetCodeFromTargetAddress(
Assembler::target_address_at(pc_ + Assembler::kInstrSize, host_));
}
void RelocInfo::set_code_age_stub(Code* stub,
ICacheFlushMode icache_flush_mode) {
DCHECK(rmode_ == RelocInfo::CODE_AGE_SEQUENCE);
Assembler::set_target_address_at(pc_ + Assembler::kInstrSize,
host_,
stub->instruction_start());
}
Address RelocInfo::call_address() {
DCHECK((IsJSReturn(rmode()) && IsPatchedReturnSequence()) ||
(IsDebugBreakSlot(rmode()) && IsPatchedDebugBreakSlotSequence()));
// The pc_ offset of 0 assumes mips patched return sequence per
// debug-mips.cc BreakLocation::SetDebugBreakAtReturn(), or
// debug break slot per BreakLocation::SetDebugBreakAtSlot().
return Assembler::target_address_at(pc_, host_);
}
void RelocInfo::set_call_address(Address target) {
DCHECK((IsJSReturn(rmode()) && IsPatchedReturnSequence()) ||
(IsDebugBreakSlot(rmode()) && IsPatchedDebugBreakSlotSequence()));
// The pc_ offset of 0 assumes mips patched return sequence per
// debug-mips.cc BreakLocation::SetDebugBreakAtReturn(), or
// debug break slot per BreakLocation::SetDebugBreakAtSlot().
Assembler::set_target_address_at(pc_, host_, target);
if (host() != NULL) {
Object* target_code = Code::GetCodeFromTargetAddress(target);
host()->GetHeap()->incremental_marking()->RecordWriteIntoCode(
host(), this, HeapObject::cast(target_code));
}
}
Object* RelocInfo::call_object() {
return *call_object_address();
}
Object** RelocInfo::call_object_address() {
DCHECK((IsJSReturn(rmode()) && IsPatchedReturnSequence()) ||
(IsDebugBreakSlot(rmode()) && IsPatchedDebugBreakSlotSequence()));
return reinterpret_cast<Object**>(pc_ + 2 * Assembler::kInstrSize);
}
void RelocInfo::set_call_object(Object* target) {
*call_object_address() = target;
}
void RelocInfo::WipeOut() {
DCHECK(IsEmbeddedObject(rmode_) || IsCodeTarget(rmode_) ||
IsRuntimeEntry(rmode_) || IsExternalReference(rmode_) ||
IsInternalReference(rmode_) || IsInternalReferenceEncoded(rmode_));
if (IsInternalReference(rmode_)) {
Memory::Address_at(pc_) = NULL;
} else if (IsInternalReferenceEncoded(rmode_)) {
Assembler::set_target_internal_reference_encoded_at(pc_, nullptr);
} else {
Assembler::set_target_address_at(pc_, host_, NULL);
}
}
bool RelocInfo::IsPatchedReturnSequence() {
Instr instr0 = Assembler::instr_at(pc_);
Instr instr1 = Assembler::instr_at(pc_ + 1 * Assembler::kInstrSize);
Instr instr2 = Assembler::instr_at(pc_ + 2 * Assembler::kInstrSize);
bool patched_return = ((instr0 & kOpcodeMask) == LUI &&
(instr1 & kOpcodeMask) == ORI &&
((instr2 & kOpcodeMask) == JAL ||
((instr2 & kOpcodeMask) == SPECIAL &&
(instr2 & kFunctionFieldMask) == JALR)));
return patched_return;
}
bool RelocInfo::IsPatchedDebugBreakSlotSequence() {
Instr current_instr = Assembler::instr_at(pc_);
return !Assembler::IsNop(current_instr, Assembler::DEBUG_BREAK_NOP);
}
void RelocInfo::Visit(Isolate* isolate, ObjectVisitor* visitor) {
RelocInfo::Mode mode = rmode();
if (mode == RelocInfo::EMBEDDED_OBJECT) {
visitor->VisitEmbeddedPointer(this);
} else if (RelocInfo::IsCodeTarget(mode)) {
visitor->VisitCodeTarget(this);
} else if (mode == RelocInfo::CELL) {
visitor->VisitCell(this);
} else if (mode == RelocInfo::EXTERNAL_REFERENCE) {
visitor->VisitExternalReference(this);
} else if (mode == RelocInfo::INTERNAL_REFERENCE ||
mode == RelocInfo::INTERNAL_REFERENCE_ENCODED) {
visitor->VisitInternalReference(this);
} else if (RelocInfo::IsCodeAgeSequence(mode)) {
visitor->VisitCodeAgeSequence(this);
} else if (((RelocInfo::IsJSReturn(mode) &&
IsPatchedReturnSequence()) ||
(RelocInfo::IsDebugBreakSlot(mode) &&
IsPatchedDebugBreakSlotSequence())) &&
isolate->debug()->has_break_points()) {
visitor->VisitDebugTarget(this);
} else if (RelocInfo::IsRuntimeEntry(mode)) {
visitor->VisitRuntimeEntry(this);
}
}
template<typename StaticVisitor>
void RelocInfo::Visit(Heap* heap) {
RelocInfo::Mode mode = rmode();
if (mode == RelocInfo::EMBEDDED_OBJECT) {
StaticVisitor::VisitEmbeddedPointer(heap, this);
} else if (RelocInfo::IsCodeTarget(mode)) {
StaticVisitor::VisitCodeTarget(heap, this);
} else if (mode == RelocInfo::CELL) {
StaticVisitor::VisitCell(heap, this);
} else if (mode == RelocInfo::EXTERNAL_REFERENCE) {
StaticVisitor::VisitExternalReference(this);
} else if (mode == RelocInfo::INTERNAL_REFERENCE ||
mode == RelocInfo::INTERNAL_REFERENCE_ENCODED) {
StaticVisitor::VisitInternalReference(this);
} else if (RelocInfo::IsCodeAgeSequence(mode)) {
StaticVisitor::VisitCodeAgeSequence(heap, this);
} else if (heap->isolate()->debug()->has_break_points() &&
((RelocInfo::IsJSReturn(mode) &&
IsPatchedReturnSequence()) ||
(RelocInfo::IsDebugBreakSlot(mode) &&
IsPatchedDebugBreakSlotSequence()))) {
StaticVisitor::VisitDebugTarget(heap, this);
} else if (RelocInfo::IsRuntimeEntry(mode)) {
StaticVisitor::VisitRuntimeEntry(this);
}
}
// -----------------------------------------------------------------------------
// Assembler.
void Assembler::CheckBuffer() {
if (buffer_space() <= kGap) {
GrowBuffer();
}
}
void Assembler::CheckTrampolinePoolQuick() {
if (pc_offset() >= next_buffer_check_) {
CheckTrampolinePool();
}
}
void Assembler::emit(Instr x) {
if (!is_buffer_growth_blocked()) {
CheckBuffer();
}
*reinterpret_cast<Instr*>(pc_) = x;
pc_ += kInstrSize;
CheckTrampolinePoolQuick();
}
} } // namespace v8::internal
#endif // V8_MIPS_ASSEMBLER_MIPS_INL_H_
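The internal-reference helpers above (`set_target_internal_reference_encoded_at` and `target_internal_reference`) split a 32-bit absolute address into the 16-bit immediates of a lui/ori pair and reassemble it on read. A minimal sketch of that round trip, with mask and shift constants mirroring `kImm16Mask` and `kLuiShift` (the function names here are illustrative, not part of V8):

```python
KIMM16_MASK = 0xFFFF
KLUI_SHIFT = 16

def encode_lui_ori(target):
    """Split a word-aligned 32-bit address into lui/ori immediates,
    as set_target_internal_reference_encoded_at does."""
    assert (target & 3) == 0  # mirrors DCHECK((imm & 3) == 0)
    hi = (target >> KLUI_SHIFT) & KIMM16_MASK  # goes into the lui immediate
    lo = target & KIMM16_MASK                  # goes into the ori immediate
    return hi, lo

def decode_lui_ori(hi, lo):
    """Reassemble the address, as target_internal_reference does."""
    return (hi << KLUI_SHIFT) | lo

addr = 0x12345678
hi, lo = encode_lui_ori(addr)
assert decode_lui_ori(hi, lo) == addr
```

The same hi/lo split is what `IsPatchedReturnSequence` relies on when it checks for a LUI followed by an ORI before the jump instruction.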
| {
"content_hash": "9f80115ada2a564e063c8ffac318fd21",
"timestamp": "",
"source": "github",
"line_count": 477,
"max_line_length": 80,
"avg_line_length": 32.85324947589098,
"alnum_prop": 0.6554782719673282,
"repo_name": "nekulin/arangodb",
"id": "7b6b3f8c76067b658aa67ff1f3edde6083ea6b45",
"size": "17582",
"binary": false,
"copies": "11",
"ref": "refs/heads/devel",
"path": "3rdParty/V8-4.3.61/src/mips/assembler-mips-inl.h",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Ada",
"bytes": "89080"
},
{
"name": "AppleScript",
"bytes": "1429"
},
{
"name": "Assembly",
"bytes": "142084"
},
{
"name": "Batchfile",
"bytes": "9073"
},
{
"name": "C",
"bytes": "1938354"
},
{
"name": "C#",
"bytes": "55625"
},
{
"name": "C++",
"bytes": "79307771"
},
{
"name": "CLIPS",
"bytes": "5291"
},
{
"name": "CMake",
"bytes": "109682"
},
{
"name": "CSS",
"bytes": "1683781"
},
{
"name": "CoffeeScript",
"bytes": "94"
},
{
"name": "DIGITAL Command Language",
"bytes": "27303"
},
{
"name": "Emacs Lisp",
"bytes": "15477"
},
{
"name": "Go",
"bytes": "1018005"
},
{
"name": "Groff",
"bytes": "263567"
},
{
"name": "HTML",
"bytes": "458914"
},
{
"name": "JavaScript",
"bytes": "57970034"
},
{
"name": "LLVM",
"bytes": "39361"
},
{
"name": "Lua",
"bytes": "16189"
},
{
"name": "Makefile",
"bytes": "177932"
},
{
"name": "Module Management System",
"bytes": "1545"
},
{
"name": "NSIS",
"bytes": "26909"
},
{
"name": "Objective-C",
"bytes": "4430"
},
{
"name": "Objective-C++",
"bytes": "1857"
},
{
"name": "Pascal",
"bytes": "145262"
},
{
"name": "Perl",
"bytes": "227308"
},
{
"name": "Protocol Buffer",
"bytes": "5837"
},
{
"name": "Python",
"bytes": "3563935"
},
{
"name": "Ruby",
"bytes": "1000569"
},
{
"name": "SAS",
"bytes": "1847"
},
{
"name": "Scheme",
"bytes": "19885"
},
{
"name": "Shell",
"bytes": "488744"
},
{
"name": "VimL",
"bytes": "4075"
},
{
"name": "Yacc",
"bytes": "36950"
}
],
"symlink_target": ""
} |
package org.apache.flink.runtime.resourcemanager.slotmanager;
import org.apache.flink.api.common.JobID;
import org.apache.flink.runtime.clusterframework.types.ResourceProfile;
import org.apache.flink.runtime.instance.InstanceID;
import org.apache.flink.runtime.slots.ResourceCounter;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
/** Contains the results of the {@link ResourceAllocationStrategy}. */
public class ResourceAllocationResult {
private final Set<JobID> unfulfillableJobs;
private final Map<JobID, Map<InstanceID, ResourceCounter>> allocationsOnRegisteredResources;
private final List<PendingTaskManager> pendingTaskManagersToAllocate;
private final Map<PendingTaskManagerId, Map<JobID, ResourceCounter>>
allocationsOnPendingResources;
private ResourceAllocationResult(
Set<JobID> unfulfillableJobs,
Map<JobID, Map<InstanceID, ResourceCounter>> allocationsOnRegisteredResources,
List<PendingTaskManager> pendingTaskManagersToAllocate,
Map<PendingTaskManagerId, Map<JobID, ResourceCounter>> allocationsOnPendingResources) {
this.unfulfillableJobs = unfulfillableJobs;
this.allocationsOnRegisteredResources = allocationsOnRegisteredResources;
this.pendingTaskManagersToAllocate = pendingTaskManagersToAllocate;
this.allocationsOnPendingResources = allocationsOnPendingResources;
}
public List<PendingTaskManager> getPendingTaskManagersToAllocate() {
return Collections.unmodifiableList(pendingTaskManagersToAllocate);
}
public Set<JobID> getUnfulfillableJobs() {
return Collections.unmodifiableSet(unfulfillableJobs);
}
public Map<JobID, Map<InstanceID, ResourceCounter>> getAllocationsOnRegisteredResources() {
return Collections.unmodifiableMap(allocationsOnRegisteredResources);
}
public Map<PendingTaskManagerId, Map<JobID, ResourceCounter>>
getAllocationsOnPendingResources() {
return Collections.unmodifiableMap(allocationsOnPendingResources);
}
public static Builder builder() {
return new Builder();
}
public static class Builder {
private final Set<JobID> unfulfillableJobs = new HashSet<>();
private final Map<JobID, Map<InstanceID, ResourceCounter>>
allocationsOnRegisteredResources = new HashMap<>();
private final List<PendingTaskManager> pendingTaskManagersToAllocate = new ArrayList<>();
private final Map<PendingTaskManagerId, Map<JobID, ResourceCounter>>
allocationsOnPendingResources = new HashMap<>();
public Builder addUnfulfillableJob(JobID jobId) {
this.unfulfillableJobs.add(jobId);
return this;
}
public Builder addPendingTaskManagerAllocate(PendingTaskManager pendingTaskManager) {
this.pendingTaskManagersToAllocate.add(pendingTaskManager);
return this;
}
public Builder addAllocationOnPendingResource(
JobID jobId,
PendingTaskManagerId pendingTaskManagerId,
ResourceProfile resourceProfile) {
this.allocationsOnPendingResources
.computeIfAbsent(pendingTaskManagerId, ignored -> new HashMap<>())
.computeIfAbsent(jobId, id -> new ResourceCounter())
.incrementCount(resourceProfile, 1);
return this;
}
public Builder addAllocationOnRegisteredResource(
JobID jobId, InstanceID instanceId, ResourceProfile resourceProfile) {
this.allocationsOnRegisteredResources
.computeIfAbsent(jobId, jobID -> new HashMap<>())
.computeIfAbsent(instanceId, id -> new ResourceCounter())
.incrementCount(resourceProfile, 1);
return this;
}
public ResourceAllocationResult build() {
return new ResourceAllocationResult(
unfulfillableJobs,
allocationsOnRegisteredResources,
pendingTaskManagersToAllocate,
allocationsOnPendingResources);
}
}
}
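The `Builder` above accumulates allocations with a nested `computeIfAbsent` chain followed by `ResourceCounter.incrementCount`. A language-neutral sketch of that pattern, using plain dicts in place of Flink's `JobID`/`InstanceID`/`ResourceCounter` types (the helper name is illustrative):

```python
def add_allocation(allocations, job_id, instance_id, profile, count=1):
    """Mirrors Builder.addAllocationOnRegisteredResource: create the
    nested maps on demand, then bump the counter for the profile."""
    per_job = allocations.setdefault(job_id, {})       # computeIfAbsent(jobId, ...)
    counter = per_job.setdefault(instance_id, {})      # computeIfAbsent(instanceId, ...)
    counter[profile] = counter.get(profile, 0) + count # incrementCount(profile, 1)

allocations = {}
add_allocation(allocations, "job-1", "tm-1", "cpu:1")
add_allocation(allocations, "job-1", "tm-1", "cpu:1")
add_allocation(allocations, "job-1", "tm-2", "cpu:2")
assert allocations["job-1"]["tm-1"]["cpu:1"] == 2
```

The immutable views returned by the getters correspond to wrapping the result in `Collections.unmodifiable*`, which the sketch omits.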
| {
"content_hash": "97adbaebfb46c0cc8f469de2eb0e503a",
"timestamp": "",
"source": "github",
"line_count": 104,
"max_line_length": 99,
"avg_line_length": 41.70192307692308,
"alnum_prop": 0.6995619091537929,
"repo_name": "kl0u/flink",
"id": "c7aee7a6c769a90de2fe28b5e1768c9204c630e3",
"size": "5142",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "flink-runtime/src/main/java/org/apache/flink/runtime/resourcemanager/slotmanager/ResourceAllocationResult.java",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Batchfile",
"bytes": "4722"
},
{
"name": "CSS",
"bytes": "58149"
},
{
"name": "Clojure",
"bytes": "93247"
},
{
"name": "Dockerfile",
"bytes": "12142"
},
{
"name": "FreeMarker",
"bytes": "28662"
},
{
"name": "HTML",
"bytes": "108850"
},
{
"name": "Java",
"bytes": "53696165"
},
{
"name": "JavaScript",
"bytes": "1829"
},
{
"name": "Makefile",
"bytes": "5134"
},
{
"name": "Python",
"bytes": "1132424"
},
{
"name": "Scala",
"bytes": "13885420"
},
{
"name": "Shell",
"bytes": "533455"
},
{
"name": "TSQL",
"bytes": "123113"
},
{
"name": "TypeScript",
"bytes": "249126"
}
],
"symlink_target": ""
} |
require 'morpheus/cli/cli_command'
class Morpheus::Cli::Clouds
include Morpheus::Cli::CliCommand
include Morpheus::Cli::InfrastructureHelper
include Morpheus::Cli::ProvisioningHelper
register_subcommands :list, :count, :get, :add, :update, :remove, :refresh, :security_groups, :apply_security_groups, :types => :list_cloud_types
register_subcommands :wiki, :update_wiki
#register_subcommands :firewall_disable, :firewall_enable
alias_subcommand :details, :get
set_default_subcommand :list
def initialize()
# @appliance_name, @appliance_url = Morpheus::Cli::Remote.active_appliance
end
def connect(opts)
@api_client = establish_remote_appliance_connection(opts)
@clouds_interface = @api_client.clouds
@groups_interface = @api_client.groups
@active_group_id = Morpheus::Cli::Groups.active_groups[@appliance_name]
end
def handle(args)
handle_subcommand(args)
end
def list(args)
options={}
params = {}
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage()
opts.on( '-g', '--group GROUP', "Group Name" ) do |group|
options[:group] = group
end
opts.on( '-t', '--type TYPE', "Cloud Type" ) do |val|
options[:zone_type] = val
end
build_standard_list_options(opts, options)
opts.footer = "List clouds."
end
optparse.parse!(args)
connect(options)
# verify_args!(args:args, optparse:optparse, count:0)
if args.count > 0
options[:phrase] = args.join(" ")
end
begin
if options[:zone_type]
cloud_type = cloud_type_for_name(options[:zone_type])
params[:type] = cloud_type['code']
end
if !options[:group].nil?
group = find_group_by_name(options[:group])
if !group.nil?
params['groupId'] = group['id']
end
end
params.merge!(parse_list_options(options))
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.list(params)
return 0
end
json_response = @clouds_interface.list(params)
render_response(json_response, options, 'zones') do
clouds = json_response['zones']
title = "Morpheus Clouds"
subtitles = []
if group
subtitles << "Group: #{group['name']}".strip
end
if cloud_type
subtitles << "Type: #{cloud_type['name']}".strip
end
subtitles += parse_list_subtitles(options)
print_h1 title, subtitles
if clouds.empty?
print cyan,"No clouds found.",reset,"\n"
else
print_clouds_table(clouds, options)
print_results_pagination(json_response)
end
print reset,"\n"
end
return 0, nil
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def count(args)
options = {}
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[options]")
build_common_options(opts, options, [:query, :remote, :dry_run])
opts.footer = "Get the number of clouds."
end
optparse.parse!(args)
connect(options)
begin
params = {}
params.merge!(parse_list_options(options))
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.list(params)
return
end
json_response = @clouds_interface.list(params)
# print number only
if json_response['meta'] && json_response['meta']['total']
print cyan, json_response['meta']['total'], reset, "\n"
else
print yellow, "unknown", reset, "\n"
end
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def get(args)
options = {}
params = {}
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[name]")
build_standard_list_options(opts, options)
opts.footer = "Get details about a cloud.\n" +
"[name] is required. This is the name or id of a cloud."
end
optparse.parse!(args)
verify_args!(args:args, optparse:optparse, min:1)
connect(options)
params.merge!(parse_query_options(options))
id_list = parse_id_list(args)
return run_command_for_each_arg(id_list) do |arg|
_get(arg, params, options)
end
end
def _get(id, params, options={})
cloud = nil
if id.to_s !~ /\A\d{1,}\Z/
cloud = find_cloud_by_name_or_id(id)
id = cloud['id']
end
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.get(id.to_i, params)
return
end
json_response = @clouds_interface.get(id, params)
render_response(json_response, options, 'zone') do
cloud = json_response['zone']
cloud_stats = cloud['stats']
# serverCounts moved to zone.stats.serverCounts
server_counts = nil
if cloud_stats
server_counts = cloud_stats['serverCounts']
else
server_counts = json_response['serverCounts'] # legacy
end
cloud_type = cloud_type_for_id(cloud['zoneTypeId'])
print_h1 "Cloud Details"
print cyan
description_cols = {
"ID" => 'id',
"Name" => 'name',
"Type" => lambda {|it| cloud_type ? cloud_type['name'] : '' },
"Code" => 'code',
"Location" => 'location',
"Region Code" => 'regionCode',
"Visibility" => lambda {|it| it['visibility'].to_s.capitalize },
"Groups" => lambda {|it| it['groups'].collect {|g| g.instance_of?(Hash) ? g['name'] : g.to_s }.join(', ') },
#"Owner" => lambda {|it| it['owner'].instance_of?(Hash) ? it['owner']['name'] : it['ownerId'] },
"Tenant" => lambda {|it| it['account'].instance_of?(Hash) ? it['account']['name'] : it['accountId'] },
"Enabled" => lambda {|it| format_boolean(it['enabled']) },
"Status" => lambda {|it| format_cloud_status(it) }
}
print_description_list(description_cols, cloud)
print_h2 "Cloud Servers"
print cyan
if server_counts
print "Container Hosts: #{server_counts['containerHost']}".center(20)
print "Hypervisors: #{server_counts['hypervisor']}".center(20)
print "Bare Metal: #{server_counts['baremetal']}".center(20)
print "Virtual Machines: #{server_counts['vm']}".center(20)
print "Unmanaged: #{server_counts['unmanaged']}".center(20)
print "\n"
end
print reset,"\n"
end
return 0, nil
end
def add(args)
options = {}
params = {}
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[name] --group GROUP --type TYPE")
opts.on( '-g', '--group GROUP', "Group Name" ) do |val|
params[:group] = val
end
opts.on( '-t', '--type TYPE', "Cloud Type" ) do |val|
params[:zone_type] = val
end
opts.on( '-d', '--description DESCRIPTION', "Description (optional)" ) do |desc|
params[:description] = desc
end
opts.on( '--certificate-provider CODE', String, "Certificate Provider. Default is 'internal'" ) do |val|
params[:certificate_provider] = val
end
build_common_options(opts, options, [:options, :payload, :json, :dry_run, :remote])
end
optparse.parse!(args)
# if args.count < 1
# puts optparse
# exit 1
# end
connect(options)
begin
payload = nil
if options[:payload]
payload = options[:payload]
payload.deep_merge!({'zone' => parse_passed_options(options)})
else
cloud_payload = {name: args[0], description: params[:description]}
cloud_payload.deep_merge!(parse_passed_options(options))
# use active group by default
params[:group] ||= @active_group_id
# Group
group_id = nil
group = params[:group] ? find_group_by_name_or_id_for_provisioning(params[:group]) : nil
if group
group_id = group["id"]
else
# print_red_alert "Group not found or specified!"
# exit 1
#groups_dropdown = @groups_interface.list({})['groups'].collect {|it| {'name' => it["name"], 'value' => it["id"]} }
group_prompt = Morpheus::Cli::OptionTypes.prompt([{'fieldName' => 'group', 'type' => 'select', 'fieldLabel' => 'Group', 'optionSource' => 'groups', 'required' => true, 'description' => 'Select Group.'}],options[:options],@api_client,{})
group_id = group_prompt['group']
end
cloud_payload['groupId'] = group_id
# todo: pass groups as an array instead
# Cloud Name
if args[0]
cloud_payload[:name] = args[0]
options[:options]['name'] = args[0] # to skip prompt
elsif !options[:no_prompt]
# name_prompt = Morpheus::Cli::OptionTypes.prompt([{'fieldName' => 'name', 'fieldLabel' => 'Name', 'type' => 'text', 'required' => true}], options[:options])
# cloud_payload[:name] = name_prompt['name']
end
# Cloud Type
cloud_type = nil
if params[:zone_type]
cloud_type = cloud_type_for_name(params[:zone_type])
elsif !options[:no_prompt]
# print_red_alert "Cloud Type not found or specified!"
# exit 1
cloud_types_dropdown = cloud_types_for_dropdown
cloud_type_prompt = Morpheus::Cli::OptionTypes.prompt([{'fieldName' => 'type', 'type' => 'select', 'fieldLabel' => 'Cloud Type', 'selectOptions' => cloud_types_dropdown, 'required' => true, 'description' => 'Select Cloud Type.'}],options[:options],@api_client,{})
cloud_type_code = cloud_type_prompt['type']
cloud_type = cloud_type_for_name(cloud_type_code) # this does work
end
if !cloud_type
print_red_alert "A cloud type is required."
exit 1
end
cloud_payload[:zoneType] = {code: cloud_type['code']}
cloud_payload['config'] ||= {}
if params[:certificate_provider]
cloud_payload['config']['certificateProvider'] = params[:certificate_provider]
else
cloud_payload['config']['certificateProvider'] = 'internal'
end
all_option_types = add_cloud_option_types(cloud_type)
params = Morpheus::Cli::OptionTypes.prompt(all_option_types, options[:options], @api_client, {zoneTypeId: cloud_type['id']})
# some optionTypes have fieldContext='zone', so move those to the root level of the zone payload
if params['zone'].is_a?(Hash)
cloud_payload.deep_merge!(params.delete('zone'))
end
cloud_payload.deep_merge!(params)
payload = {zone: cloud_payload}
end
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.create(payload)
return
end
json_response = @clouds_interface.create(payload)
cloud = json_response['zone']
if options[:json]
puts as_json(json_response, options)
else
get([cloud['id']] + (options[:remote] ? ["-r",options[:remote]] : []))
end
return 0
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def update(args)
options = {}
params = {}
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[name] [options]")
# opts.on( '-g', '--group GROUP', "Group Name" ) do |val|
# params[:group] = val
# end
# opts.on( '-t', '--type TYPE', "Cloud Type" ) do |val|
# params[:zone_type] = val
# end
# opts.on( '-d', '--description DESCRIPTION', "Description (optional)" ) do |desc|
# params[:description] = desc
# end
build_common_options(opts, options, [:options, :payload, :json, :dry_run, :remote])
end
optparse.parse!(args)
if args.count < 1
puts optparse
exit 1
end
connect(options)
begin
cloud = find_cloud_by_name_or_id(args[0])
return 1 if cloud.nil?
payload = nil
if options[:payload]
payload = options[:payload]
# support -O OPTION switch on top of --payload
if options[:options]
payload['zone'] ||= {}
payload['zone'].deep_merge!(options[:options].reject {|k,v| k.is_a?(Symbol) })
end
else
cloud_type = cloud_type_for_id(cloud['zoneTypeId'])
cloud_payload = {}
all_option_types = update_cloud_option_types(cloud_type)
#params = Morpheus::Cli::OptionTypes.prompt(all_option_types, options[:options], @api_client, {zoneTypeId: cloud_type['id']})
params = options[:options] || {}
if params.empty?
puts_error optparse.banner
puts_error format_available_options(all_option_types)
exit 1
end
# some optionTypes have fieldContext='zone', so move those to the root level of the zone payload
if params['zone'].is_a?(Hash)
cloud_payload.merge!(params.delete('zone'))
end
cloud_payload.merge!(params)
payload = {zone: cloud_payload}
end
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.update(cloud['id'], payload)
return
end
json_response = @clouds_interface.update(cloud['id'], payload)
if options[:json]
puts as_json(json_response, options)
else
get([cloud['id']] + (options[:remote] ? ["-r",options[:remote]] : []))
end
return 0
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def remove(args)
options = {}
query_params = {}
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[name]")
opts.on( '-f', '--force', "Force Remove" ) do
query_params[:force] = 'on'
end
build_common_options(opts, options, [:auto_confirm, :quiet, :json, :dry_run, :remote])
end
optparse.parse!(args)
if args.count < 1
puts optparse
return
end
connect(options)
begin
cloud = find_cloud_by_name_or_id(args[0])
unless options[:yes] || Morpheus::Cli::OptionTypes.confirm("Are you sure you want to delete the cloud #{cloud['name']}?")
exit
end
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.destroy(cloud['id'], query_params)
return 0
end
json_response = @clouds_interface.destroy(cloud['id'], query_params)
if options[:json]
print JSON.pretty_generate(json_response)
print "\n"
elsif !options[:quiet]
print_green_success "Removed cloud #{cloud['name']}"
#list([])
end
return 0
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def refresh(args)
options = {}
query_params = {}
params = {}
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[name] [options]")
opts.on( '-f', '--force', "Force refresh. Useful if the cloud is disabled." ) do
query_params[:force] = 'true'
end
build_common_options(opts, options, [:options, :payload, :json, :dry_run, :remote])
opts.footer = "Refresh a cloud." + "\n" +
"[cloud] is required. This is the name or id of a cloud."
end
optparse.parse!(args)
if args.count != 1
raise_command_error "wrong number of arguments, expected 1 and got (#{args.count}) #{args}\n#{optparse}"
end
connect(options)
begin
cloud = find_cloud_by_name_or_id(args[0])
return 1 if cloud.nil?
passed_options = options[:options] ? options[:options].reject {|k,v| k.is_a?(Symbol) } : {}
payload = nil
if options[:payload]
payload = options[:payload]
payload.deep_merge!(passed_options) unless passed_options.empty?
else
payload = {}
payload.deep_merge!(passed_options) unless passed_options.empty?
end
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.refresh(cloud['id'], query_params, payload)
return
end
json_response = @clouds_interface.refresh(cloud['id'], query_params, payload)
if options[:json]
puts as_json(json_response, options)
else
print_green_success "Refreshing cloud #{cloud['name']}..."
#get([cloud['id']] + (options[:remote] ? ["-r",options[:remote]] : []))
end
return 0
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
# not exposed yet, refresh should be all that's needed.
def sync(args)
options = {}
query_params = {}
params = {}
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[name] [options]")
      opts.on( '-f', '--force', "Force sync. Useful if the cloud is disabled." ) do
query_params[:force] = 'true'
end
build_common_options(opts, options, [:options, :payload, :json, :dry_run, :remote])
opts.footer = "Sync a cloud." + "\n" +
"[cloud] is required. This is the name or id of a cloud."
end
optparse.parse!(args)
if args.count != 1
raise_command_error "wrong number of arguments, expected 1 and got (#{args.count}) #{args}\n#{optparse}"
end
connect(options)
begin
cloud = find_cloud_by_name_or_id(args[0])
return 1 if cloud.nil?
passed_options = options[:options] ? options[:options].reject {|k,v| k.is_a?(Symbol) } : {}
payload = nil
if options[:payload]
payload = options[:payload]
payload.deep_merge!(passed_options) unless passed_options.empty?
else
payload = {}
payload.deep_merge!(passed_options) unless passed_options.empty?
end
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.sync(cloud['id'], query_params, payload)
return
end
json_response = @clouds_interface.sync(cloud['id'], query_params, payload)
if options[:json]
puts as_json(json_response, options)
else
print_green_success "Syncing cloud #{cloud['name']}..."
#get([cloud['id']] + (options[:remote] ? ["-r",options[:remote]] : []))
end
return 0
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def firewall_disable(args)
options = {}
clear_or_secgroups_specified = false
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[name]")
build_common_options(opts, options, [:json, :dry_run, :remote])
end
optparse.parse!(args)
if args.count < 1
puts optparse
return
end
connect(options)
begin
cloud = find_cloud_by_name_or_id(args[0])
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.firewall_disable(cloud['id'])
return
end
json_response = @clouds_interface.firewall_disable(cloud['id'])
if options[:json]
print JSON.pretty_generate(json_response)
print "\n"
return
end
security_groups([args[0]])
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def firewall_enable(args)
options = {}
clear_or_secgroups_specified = false
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[name]")
build_common_options(opts, options, [:json, :dry_run, :remote])
end
optparse.parse!(args)
if args.count < 1
puts optparse
return
end
connect(options)
begin
cloud = find_cloud_by_name_or_id(args[0])
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.firewall_enable(cloud['id'])
return
end
json_response = @clouds_interface.firewall_enable(cloud['id'])
if options[:json]
print JSON.pretty_generate(json_response)
print "\n"
return
end
security_groups([args[0]])
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def security_groups(args)
options = {}
clear_or_secgroups_specified = false
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[name]")
build_common_options(opts, options, [:json, :dry_run, :remote])
end
optparse.parse!(args)
if args.count < 1
puts optparse
return
end
connect(options)
begin
cloud = find_cloud_by_name_or_id(args[0])
zone_id = cloud['id']
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.security_groups(zone_id)
return
end
json_response = @clouds_interface.security_groups(zone_id)
if options[:json]
print JSON.pretty_generate(json_response)
print "\n"
return
end
securityGroups = json_response['securityGroups']
print_h1 "Morpheus Security Groups for Cloud: #{cloud['name']}"
print cyan
print_description_list({"Firewall Enabled" => lambda {|it| format_boolean it['firewallEnabled'] } }, json_response)
if securityGroups.empty?
print yellow,"\n","No security groups currently applied.",reset,"\n"
else
print "\n"
securityGroups.each do |securityGroup|
print cyan, "= #{securityGroup['id']} (#{securityGroup['name']}) - (#{securityGroup['description']})\n"
end
end
print reset,"\n"
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def apply_security_groups(args)
options = {}
clear_or_secgroups_specified = false
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[name] [-s] [--clear]")
opts.on( '-c', '--clear', "Clear all security groups" ) do
options[:securityGroupIds] = []
clear_or_secgroups_specified = true
end
opts.on( '-s', '--secgroups SECGROUPS', "Apply the specified comma separated security group ids" ) do |secgroups|
options[:securityGroupIds] = secgroups.split(",")
clear_or_secgroups_specified = true
end
build_common_options(opts, options, [:json, :dry_run, :remote])
end
optparse.parse!(args)
if !clear_or_secgroups_specified
puts optparse
exit
end
connect(options)
begin
cloud = find_cloud_by_name_or_id(args[0])
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.apply_security_groups(cloud['id'], options)
return
end
json_response = @clouds_interface.apply_security_groups(cloud['id'], options)
if options[:json]
print JSON.pretty_generate(json_response)
print "\n"
return
end
security_groups([args[0]])
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def list_cloud_types(args)
options={}
params = {}
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage()
build_common_options(opts, options, [:list, :query, :json, :yaml, :csv, :fields, :dry_run, :remote])
end
optparse.parse!(args)
connect(options)
begin
params.merge!(parse_list_options(options))
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.cloud_types(params)
return 0
end
json_response = @clouds_interface.cloud_types(params)
render_result = render_with_format(json_response, options, 'zoneTypes')
return 0 if render_result
#cloud_types = get_available_cloud_types()
cloud_types = json_response['zoneTypes']
title = "Morpheus Cloud Types"
subtitles = []
subtitles += parse_list_subtitles(options)
print_h1 title, subtitles
if cloud_types.empty?
print cyan,"No cloud types found.",reset,"\n"
else
print cyan
cloud_types = cloud_types.select {|it| it['enabled'] }
rows = cloud_types.collect do |cloud_type|
{id: cloud_type['id'], name: cloud_type['name'], code: cloud_type['code']}
end
#print "\n"
columns = [:id, :name, :code]
columns = options[:include_fields] if options[:include_fields]
print as_pretty_table(rows, columns, options)
print_results_pagination(json_response)
print reset,"\n"
end
return 0
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def wiki(args)
options = {}
params = {}
open_wiki_link = false
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[cloud]")
opts.on('--view', "View wiki page in web browser.") do
open_wiki_link = true
end
build_common_options(opts, options, [:json, :dry_run, :remote])
opts.footer = "View wiki page details for a cloud." + "\n" +
"[cloud] is required. This is the name or id of a cloud."
end
optparse.parse!(args)
if args.count != 1
puts_error "#{Morpheus::Terminal.angry_prompt}wrong number of arguments. Expected 1 and received #{args.count} #{args.inspect}\n#{optparse}"
return 1
end
connect(options)
begin
cloud = find_cloud_by_name_or_id(args[0])
return 1 if cloud.nil?
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.wiki(cloud["id"], params)
return
end
json_response = @clouds_interface.wiki(cloud["id"], params)
page = json_response['page']
render_result = render_with_format(json_response, options, 'page')
return 0 if render_result
if page
# my_terminal.exec("wiki get #{page['id']}")
print_h1 "Cloud Wiki Page: #{cloud['name']}"
# print_h1 "Wiki Page Details"
print cyan
print_description_list({
"Page ID" => 'id',
"Name" => 'name',
#"Category" => 'category',
#"Ref Type" => 'refType',
#"Ref ID" => 'refId',
#"Owner" => lambda {|it| it['account'] ? it['account']['name'] : '' },
"Created" => lambda {|it| format_local_dt(it['dateCreated']) },
"Created By" => lambda {|it| it['createdBy'] ? it['createdBy']['username'] : '' },
"Updated" => lambda {|it| format_local_dt(it['lastUpdated']) },
"Updated By" => lambda {|it| it['updatedBy'] ? it['updatedBy']['username'] : '' }
}, page)
print reset,"\n"
print_h2 "Page Content"
print cyan, page['content'], reset, "\n"
else
print "\n"
print cyan, "No wiki page found.", reset, "\n"
end
print reset,"\n"
if open_wiki_link
return view_wiki([args[0]])
end
return 0
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def view_wiki(args)
params = {}
options = {}
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[id]")
build_common_options(opts, options, [:dry_run, :remote])
opts.footer = "View cloud wiki page in a web browser" + "\n" +
"[cloud] is required. This is the name or id of a cloud."
end
optparse.parse!(args)
if args.count != 1
raise_command_error "wrong number of arguments, expected 1 and got (#{args.count}) #{args.join(' ')}\n#{optparse}"
end
connect(options)
begin
cloud = find_cloud_by_name_or_id(args[0])
return 1 if cloud.nil?
link = "#{@appliance_url}/login/oauth-redirect?access_token=#{@access_token}\\&redirectUri=/infrastructure/clouds/#{cloud['id']}#!wiki"
if options[:dry_run]
puts Morpheus::Util.open_url_command(link)
return 0
end
return Morpheus::Util.open_url(link)
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
def update_wiki(args)
options = {}
params = {}
optparse = Morpheus::Cli::OptionParser.new do |opts|
opts.banner = subcommand_usage("[cloud] [options]")
build_option_type_options(opts, options, update_wiki_page_option_types)
opts.on('--file FILE', "File containing the wiki content. This can be used instead of --content") do |filename|
full_filename = File.expand_path(filename)
if File.exist?(full_filename)
params['content'] = File.read(full_filename)
else
print_red_alert "File not found: #{full_filename}"
return 1
end
# use the filename as the name by default.
if !params['name']
params['name'] = File.basename(full_filename)
end
end
opts.on(nil, '--clear', "Clear current page content") do |val|
params['content'] = ""
end
build_common_options(opts, options, [:options, :payload, :json, :dry_run, :remote])
end
optparse.parse!(args)
if args.count != 1
puts_error "#{Morpheus::Terminal.angry_prompt}wrong number of arguments. Expected 1 and received #{args.count} #{args.inspect}\n#{optparse}"
return 1
end
connect(options)
begin
cloud = find_cloud_by_name_or_id(args[0])
return 1 if cloud.nil?
# construct payload
passed_options = options[:options] ? options[:options].reject {|k,v| k.is_a?(Symbol) } : {}
payload = nil
if options[:payload]
payload = options[:payload]
payload.deep_merge!({'page' => passed_options}) unless passed_options.empty?
else
payload = {
'page' => {
}
}
# allow arbitrary -O options
payload.deep_merge!({'page' => passed_options}) unless passed_options.empty?
# prompt for options
#params = Morpheus::Cli::OptionTypes.prompt(update_wiki_page_option_types, options[:options], @api_client, options[:params])
#params = passed_options
params.deep_merge!(passed_options)
if params.empty?
raise_command_error "Specify at least one option to update.\n#{optparse}"
end
payload.deep_merge!({'page' => params}) unless params.empty?
end
@clouds_interface.setopts(options)
if options[:dry_run]
print_dry_run @clouds_interface.dry.update_wiki(cloud["id"], payload)
return
end
json_response = @clouds_interface.update_wiki(cloud["id"], payload)
if options[:json]
puts as_json(json_response, options)
else
print_green_success "Updated wiki page for cloud #{cloud['name']}"
wiki([cloud['id']])
end
return 0
rescue RestClient::Exception => e
print_rest_exception(e, options)
exit 1
end
end
private
def print_clouds_table(clouds, opts={})
table_color = opts[:color] || cyan
rows = clouds.collect do |cloud|
cloud_type = cloud_type_for_id(cloud['zoneTypeId'])
{
id: cloud['id'],
name: cloud['name'],
type: cloud_type ? cloud_type['name'] : '',
location: cloud['location'],
"REGION CODE": cloud['regionCode'],
groups: (cloud['groups'] || []).collect {|it| it.instance_of?(Hash) ? it['name'] : it.to_s }.join(', '),
servers: cloud['serverCount'],
status: format_cloud_status(cloud)
}
end
columns = [
:id, :name, :type, :location, "REGION CODE", :groups, :servers, :status
]
print as_pretty_table(rows, columns, opts)
end
def add_cloud_option_types(cloud_type)
# note: Type is selected before this
tmp_option_types = [
#{'fieldName' => 'zoneType.code', 'fieldLabel' => 'Image Type', 'type' => 'select', 'selectOptions' => cloud_types_for_dropdown, 'required' => true, 'description' => 'Cloud Type.', 'displayOrder' => 0},
{'fieldName' => 'name', 'fieldLabel' => 'Name', 'type' => 'text', 'required' => true, 'displayOrder' => 1},
{'fieldName' => 'code', 'fieldLabel' => 'Code', 'type' => 'text', 'required' => false, 'displayOrder' => 2},
{'fieldName' => 'location', 'fieldLabel' => 'Location', 'type' => 'text', 'required' => false, 'displayOrder' => 3},
{'fieldName' => 'visibility', 'fieldLabel' => 'Visibility', 'type' => 'select', 'selectOptions' => [{'name' => 'Private', 'value' => 'private'},{'name' => 'Public', 'value' => 'public'}], 'required' => false, 'description' => 'Visibility', 'category' => 'permissions', 'defaultValue' => 'private', 'displayOrder' => 4},
]
# TODO: Account
# Details (zoneType.optionTypes)
if cloud_type && cloud_type['optionTypes']
# adjust displayOrder to put these at the end
#tmp_option_types = tmp_option_types + cloud_type['optionTypes']
cloud_type['optionTypes'].each do |opt|
tmp_option_types << opt.merge({'displayOrder' => opt['displayOrder'].to_i + 100})
end
end
# TODO:
# Advanced Options
## (a whole bunch needed here)
# Provisioning Options
## PROXY (dropdown)
## BYPASS PROXY FOR APPLIANCE URL (checkbox)
## USER DATA LINUX (code)
return tmp_option_types
end
def update_cloud_option_types(cloud_type)
add_cloud_option_types(cloud_type).collect {|it| it['required'] = false; it }
end
def cloud_types_for_dropdown
get_available_cloud_types().select {|it| it['enabled'] }.collect {|it| {'name' => it['name'], 'value' => it['code']} }
end
def format_cloud_status(cloud, return_color=cyan)
out = ""
status_string = cloud['status']
if cloud['enabled'] == false
out << "#{red}DISABLED#{return_color}"
elsif status_string.nil? || status_string.empty? || status_string == "unknown"
out << "#{white}UNKNOWN#{return_color}"
elsif status_string == 'ok'
out << "#{green}#{status_string.upcase}#{return_color}"
elsif status_string == 'syncing'
out << "#{yellow}#{status_string.upcase}#{return_color}"
else
out << "#{red}#{status_string ? status_string.upcase : 'N/A'}#{cloud['statusMessage'] ? "#{return_color} - #{cloud['statusMessage']}" : ''}#{return_color}"
end
out
end
def update_wiki_page_option_types
[
{'fieldName' => 'name', 'fieldLabel' => 'Name', 'type' => 'text', 'required' => false, 'displayOrder' => 1, 'description' => 'The name of the wiki page for this instance. Default is the instance name.'},
#{'fieldName' => 'category', 'fieldLabel' => 'Category', 'type' => 'text', 'required' => false, 'displayOrder' => 2},
{'fieldName' => 'content', 'fieldLabel' => 'Content', 'type' => 'textarea', 'required' => false, 'displayOrder' => 3, 'description' => 'The content (markdown) of the wiki page.'}
]
end
end
| {
"content_hash": "e8d5053b58a8e5e89ee55def9893af9d",
"timestamp": "",
"source": "github",
"line_count": 1010,
"max_line_length": 325,
"avg_line_length": 34.97425742574257,
"alnum_prop": 0.5950062280602423,
"repo_name": "bertramdev/morpheus-cli",
"id": "0cc9902cf76388269f0d44e8e75169b84dd27079",
"size": "35324",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "lib/morpheus/cli/commands/clouds.rb",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "Ruby",
"bytes": "77918"
}
],
"symlink_target": ""
} |
package com.cattong.weibo.api;
import java.util.List;
import com.cattong.commons.LibException;
import com.cattong.commons.Paging;
import com.cattong.entity.User;
/**
* BlockService: interface for blacklist-related operations.
*
* @version
* @author cattong.com
* @time 2010-9-27 11:12:43 AM
*/
public interface BlockService {
/**
* Adds the given user to the blacklist.
*
* @param userId
* the user's unique identifier
* @return the user object
* @throws LibException
*/
User createBlock(String userId) throws LibException;
/**
* Removes the given user from the blacklist.
*
* @param userId
* the user's unique identifier
* @return the user object
* @throws LibException
*/
User destroyBlock(String userId) throws LibException;
/**
* Lists blacklisted users (with full user details).
*
* @param paging
* paging control parameters
* @return the list of blacklisted users
* @throws LibException
*/
List<User> getBlockingUsers(Paging<User> paging) throws LibException;
}
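The three methods above form a simple add/remove/list contract. As an illustration only, here is a minimal in-memory sketch of that contract; the real interface returns `com.cattong.entity.User` objects, takes a `Paging` parameter, and may throw `LibException`, whereas this stand-in uses plain `String` user ids so the semantics can be shown without the library.

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Hypothetical in-memory model of the BlockService contract above.
class InMemoryBlockList {
    // Insertion-ordered set of blocked user ids.
    private final Set<String> blocked = new LinkedHashSet<>();

    /** Adds a user to the blacklist (mirrors createBlock). */
    public String createBlock(String userId) {
        blocked.add(userId);
        return userId;
    }

    /** Removes a user from the blacklist (mirrors destroyBlock). */
    public String destroyBlock(String userId) {
        blocked.remove(userId);
        return userId;
    }

    /** Lists blacklisted users; the real call would honor its Paging parameter. */
    public List<String> getBlockingUsers() {
        return new ArrayList<>(blocked);
    }

    public static void main(String[] args) {
        InMemoryBlockList svc = new InMemoryBlockList();
        svc.createBlock("alice");
        svc.createBlock("bob");
        svc.destroyBlock("alice");
        System.out.println(svc.getBlockingUsers()); // prints [bob]
    }
}
```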
| {
"content_hash": "c29d9114269e535f78d453908d5218ca",
"timestamp": "",
"source": "github",
"line_count": 49,
"max_line_length": 70,
"avg_line_length": 16.877551020408163,
"alnum_prop": 0.656590084643289,
"repo_name": "Terry-L/YiBo",
"id": "a59b0c9e661285f806fb6ca97e5b8e4a9a255e60",
"size": "981",
"binary": false,
"copies": "3",
"ref": "refs/heads/master",
"path": "YiBoLibrary/src/com/cattong/weibo/api/BlockService.java",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Java",
"bytes": "2675703"
}
],
"symlink_target": ""
} |
package redpoll.text;
import java.io.IOException;
import java.util.Iterator;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;
import redpoll.core.WritableSparseVector;
/**
* Reducer that builds the vector space model (VSM) of documents.
* @author Jeremy Chow([email protected])
*/
public class TfIdfReducer extends MapReduceBase implements Reducer<Text, TfIdfWritable, Text, TfIdfWritable>{
private int totalTerms;
public void reduce(Text key, Iterator<TfIdfWritable> values,
OutputCollector<Text, TfIdfWritable> output, Reporter reporter)
throws IOException {
WritableSparseVector tfIdfVector = new WritableSparseVector(totalTerms);
while (values.hasNext()) {
ElementWritable value = (ElementWritable) values.next().get();
tfIdfVector.setQuick(value.getIndex(), value.getValue());
}
// tf-idf style vector
output.collect(key, new TfIdfWritable(tfIdfVector));
}
@Override
public void configure(JobConf job) {
totalTerms = job.getInt("redpoll.text.terms.num", 1024);
}
}
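The reducer above only assembles per-term weights (computed upstream) into a sparse vector keyed by term index. As a reminder of what those weights usually are, here is a minimal non-Hadoop sketch of the classic tf-idf formula, tf(t, d) × log(N / df(t)); this is one common variant, and the exact weighting used upstream in the redpoll pipeline is an assumption here.

```java
// Standalone sketch of the standard tf-idf weighting (hypothetical, not
// redpoll's actual upstream code): tf(t, d) * log(N / df(t)).
class TfIdfWeight {
    /**
     * @param termCount occurrences of the term in the document
     * @param docLength total number of terms in the document
     * @param totalDocs number of documents in the corpus (N)
     * @param docFreq   number of documents containing the term (df)
     */
    public static double weight(int termCount, int docLength, int totalDocs, int docFreq) {
        double tf = (double) termCount / docLength;          // normalized term frequency
        double idf = Math.log((double) totalDocs / docFreq); // inverse document frequency
        return tf * idf;
    }

    public static void main(String[] args) {
        // A term appearing 3 times in a 100-term document; 10 of 1000 docs contain it.
        System.out.println(weight(3, 100, 1000, 10));
    }
}
```

A term that appears in every document gets idf = log(1) = 0, so it contributes nothing to the vector — which is exactly why such vectors stay sparse.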
| {
"content_hash": "51fc2a065011e36d0fe5536bb2f750d1",
"timestamp": "",
"source": "github",
"line_count": 43,
"max_line_length": 109,
"avg_line_length": 29.790697674418606,
"alnum_prop": 0.7197501951600312,
"repo_name": "coderplay/redpoll",
"id": "71c269719dd57296e326fa1735c2d4a5f9ef971a",
"size": "2104",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "core/src/java/redpoll/text/TfIdfReducer.java",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Java",
"bytes": "151115"
}
],
"symlink_target": ""
} |
#pragma once

#include <array>
#include <cstddef>
#include <iosfwd>
#include <vector>

// Note: the Exif class and the primary Traits template used below are
// declared in the project's own headers, which are outside this excerpt.

namespace possumwood {
namespace opencv {
class ExifSequence final {
public:
void add(const Exif& exif);
bool empty() const;
std::size_t size() const;
typedef std::vector<Exif>::const_iterator const_iterator;
const_iterator begin() const;
const_iterator end() const;
bool operator==(const ExifSequence& f) const;
bool operator!=(const ExifSequence& f) const;
private:
std::vector<Exif> m_sequence;
};
std::ostream& operator<<(std::ostream& out, const ExifSequence& f);
} // namespace opencv
template <>
struct Traits<opencv::ExifSequence> {
static constexpr std::array<float, 3> colour() {
return std::array<float, 3>{{1, 0, 1}};
}
};
} // namespace possumwood
| {
"content_hash": "ed4b8790360b1807840ad26fb5c60b9e",
"timestamp": "",
"source": "github",
"line_count": 33,
"max_line_length": 67,
"avg_line_length": 20.87878787878788,
"alnum_prop": 0.6981132075471698,
"repo_name": "martin-pr/possumwood",
"id": "d85441f1adc425c6c1eadf3bc6d991a26b4a28fc",
"size": "807",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "src/plugins/opencv/exif_sequence.h",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "C",
"bytes": "33"
},
{
"name": "C++",
"bytes": "1831270"
},
{
"name": "CMake",
"bytes": "19071"
},
{
"name": "Python",
"bytes": "9428"
},
{
"name": "Shell",
"bytes": "3431"
}
],
"symlink_target": ""
} |
maintainer "Nathan Lee"
maintainer_email "[email protected]"
license "Apache 2.0"
description "Installs/Configures Putty a free implementation of Telnet and SSH for Windows and Unix platforms, along with an xterm terminal emulator."
long_description IO.read(File.join(File.dirname(__FILE__), 'README.md'))
version "0.0.5"
supports "windows"
depends "windows", ">= 1.2.8"
| {
"content_hash": "4bc43682d549742179d423c134f8d0e5",
"timestamp": "",
"source": "github",
"line_count": 9,
"max_line_length": 155,
"avg_line_length": 47.111111111111114,
"alnum_prop": 0.6816037735849056,
"repo_name": "SergioVicenteRuiz/chef-repo",
"id": "6d9fc57f68201d218e99ec1ee879ef85b19f9808",
"size": "424",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "cookbooks/putty/metadata.rb",
"mode": "33261",
"license": "apache-2.0",
"language": [
{
"name": "Ruby",
"bytes": "282165"
}
],
"symlink_target": ""
} |
const setValidTick = (inputContainer) => {
const errorContainer = inputContainer.querySelector('.sprk-b-ErrorContainer');
if (errorContainer) {
errorContainer.innerHTML = '';
}
};
export { setValidTick as default };
| {
"content_hash": "a7f6939090d58baf1616ddf13abeecac",
"timestamp": "",
"source": "github",
"line_count": 9,
"max_line_length": 80,
"avg_line_length": 25.333333333333332,
"alnum_prop": 0.7105263157894737,
"repo_name": "sparkdesignsystem/spark-design-system",
"id": "d3f2609d04025320f9f97f93ceacb5d554db3e53",
"size": "228",
"binary": false,
"copies": "1",
"ref": "refs/heads/staging",
"path": "html/utilities/validation/setValidTick.js",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "HTML",
"bytes": "2975"
},
{
"name": "JavaScript",
"bytes": "1750014"
},
{
"name": "SCSS",
"bytes": "252142"
},
{
"name": "Swift",
"bytes": "67467"
},
{
"name": "TypeScript",
"bytes": "1283216"
}
],
"symlink_target": ""
} |
package eventgrid
// Copyright (c) Microsoft and contributors. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
//
// See the License for the specific language governing permissions and
// limitations under the License.
//
// Code generated by Microsoft (R) AutoRest Code Generator.
// Changes may cause incorrect behavior and will be lost if the code is regenerated.
import (
"context"
"github.com/Azure/go-autorest/autorest"
"github.com/Azure/go-autorest/autorest/azure"
"net/http"
)
// EventSubscriptionsClient is the azure EventGrid Management Client
type EventSubscriptionsClient struct {
BaseClient
}
// NewEventSubscriptionsClient creates an instance of the EventSubscriptionsClient client.
func NewEventSubscriptionsClient(subscriptionID string) EventSubscriptionsClient {
return NewEventSubscriptionsClientWithBaseURI(DefaultBaseURI, subscriptionID)
}
// NewEventSubscriptionsClientWithBaseURI creates an instance of the EventSubscriptionsClient client.
func NewEventSubscriptionsClientWithBaseURI(baseURI string, subscriptionID string) EventSubscriptionsClient {
return EventSubscriptionsClient{NewWithBaseURI(baseURI, subscriptionID)}
}
// Create asynchronously creates a new event subscription to the specified scope. Existing event subscriptions cannot
// be updated with this API and should instead use the Update event subscription API.
//
// scope is the scope of the resource to which the event subscription needs to be created. The scope can be a
// subscription, or a resource group, or a top level resource belonging to a resource provider namespace, or an
// EventGrid topic. For example, use '/subscriptions/{subscriptionId}/' for a subscription,
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}' for a resource group, and
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}'
// for a resource, and
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EventGrid/topics/{topicName}'
// for an EventGrid topic. eventSubscriptionName is name of the event subscription to be created. Event subscription
// names must be between 3 and 64 characters in length and use alphanumeric letters only. eventSubscriptionInfo is
// event subscription properties containing the destination and filter information
func (client EventSubscriptionsClient) Create(ctx context.Context, scope string, eventSubscriptionName string, eventSubscriptionInfo EventSubscription) (result EventSubscriptionsCreateFuture, err error) {
req, err := client.CreatePreparer(ctx, scope, eventSubscriptionName, eventSubscriptionInfo)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "Create", nil, "Failure preparing request")
return
}
result, err = client.CreateSender(req)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "Create", result.Response(), "Failure sending request")
return
}
return
}
// CreatePreparer prepares the Create request.
func (client EventSubscriptionsClient) CreatePreparer(ctx context.Context, scope string, eventSubscriptionName string, eventSubscriptionInfo EventSubscription) (*http.Request, error) {
pathParameters := map[string]interface{}{
"eventSubscriptionName": autorest.Encode("path", eventSubscriptionName),
"scope": scope,
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsJSON(),
autorest.AsPut(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/{scope}/providers/Microsoft.EventGrid/eventSubscriptions/{eventSubscriptionName}", pathParameters),
autorest.WithJSON(eventSubscriptionInfo),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// CreateSender sends the Create request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) CreateSender(req *http.Request) (future EventSubscriptionsCreateFuture, err error) {
sender := autorest.DecorateSender(client, autorest.DoRetryForStatusCodes(client.RetryAttempts, client.RetryDuration, autorest.StatusCodesForRetry...))
future.Future = azure.NewFuture(req)
future.req = req
_, err = future.Done(sender)
if err != nil {
return
}
err = autorest.Respond(future.Response(),
azure.WithErrorUnlessStatusCode(http.StatusOK, http.StatusCreated))
return
}
// CreateResponder handles the response to the Create request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) CreateResponder(resp *http.Response) (result EventSubscription, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK, http.StatusCreated),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
// Delete delete an existing event subscription
//
// scope is the scope of the event subscription. The scope can be a subscription, or a resource group, or a top level
// resource belonging to a resource provider namespace, or an EventGrid topic. For example, use
// '/subscriptions/{subscriptionId}/' for a subscription,
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}' for a resource group, and
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}'
// for a resource, and
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EventGrid/topics/{topicName}'
// for an EventGrid topic. eventSubscriptionName is name of the event subscription
func (client EventSubscriptionsClient) Delete(ctx context.Context, scope string, eventSubscriptionName string) (result EventSubscriptionsDeleteFuture, err error) {
req, err := client.DeletePreparer(ctx, scope, eventSubscriptionName)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "Delete", nil, "Failure preparing request")
return
}
result, err = client.DeleteSender(req)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "Delete", result.Response(), "Failure sending request")
return
}
return
}
// DeletePreparer prepares the Delete request.
func (client EventSubscriptionsClient) DeletePreparer(ctx context.Context, scope string, eventSubscriptionName string) (*http.Request, error) {
pathParameters := map[string]interface{}{
"eventSubscriptionName": autorest.Encode("path", eventSubscriptionName),
"scope": scope,
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsDelete(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/{scope}/providers/Microsoft.EventGrid/eventSubscriptions/{eventSubscriptionName}", pathParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// DeleteSender sends the Delete request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) DeleteSender(req *http.Request) (future EventSubscriptionsDeleteFuture, err error) {
sender := autorest.DecorateSender(client, autorest.DoRetryForStatusCodes(client.RetryAttempts, client.RetryDuration, autorest.StatusCodesForRetry...))
future.Future = azure.NewFuture(req)
future.req = req
_, err = future.Done(sender)
if err != nil {
return
}
err = autorest.Respond(future.Response(),
azure.WithErrorUnlessStatusCode(http.StatusOK, http.StatusAccepted, http.StatusNoContent))
return
}
// DeleteResponder handles the response to the Delete request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) DeleteResponder(resp *http.Response) (result autorest.Response, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK, http.StatusAccepted, http.StatusNoContent),
autorest.ByClosing())
result.Response = resp
return
}
// Get get properties of an event subscription
//
// scope is the scope of the event subscription. The scope can be a subscription, or a resource group, or a top level
// resource belonging to a resource provider namespace, or an EventGrid topic. For example, use
// '/subscriptions/{subscriptionId}/' for a subscription,
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}' for a resource group, and
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}'
// for a resource, and
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EventGrid/topics/{topicName}'
// for an EventGrid topic. eventSubscriptionName is name of the event subscription
func (client EventSubscriptionsClient) Get(ctx context.Context, scope string, eventSubscriptionName string) (result EventSubscription, err error) {
req, err := client.GetPreparer(ctx, scope, eventSubscriptionName)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "Get", nil, "Failure preparing request")
return
}
resp, err := client.GetSender(req)
if err != nil {
result.Response = autorest.Response{Response: resp}
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "Get", resp, "Failure sending request")
return
}
result, err = client.GetResponder(resp)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "Get", resp, "Failure responding to request")
}
return
}
// GetPreparer prepares the Get request.
func (client EventSubscriptionsClient) GetPreparer(ctx context.Context, scope string, eventSubscriptionName string) (*http.Request, error) {
pathParameters := map[string]interface{}{
"eventSubscriptionName": autorest.Encode("path", eventSubscriptionName),
"scope": scope,
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsGet(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/{scope}/providers/Microsoft.EventGrid/eventSubscriptions/{eventSubscriptionName}", pathParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// GetSender sends the Get request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) GetSender(req *http.Request) (*http.Response, error) {
return autorest.SendWithSender(client, req,
autorest.DoRetryForStatusCodes(client.RetryAttempts, client.RetryDuration, autorest.StatusCodesForRetry...))
}
// GetResponder handles the response to the Get request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) GetResponder(resp *http.Response) (result EventSubscription, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
// GetFullURL gets the full endpoint URL for an event subscription.
//
// scope is the scope of the event subscription. The scope can be a subscription, or a resource group, or a top level
// resource belonging to a resource provider namespace, or an EventGrid topic. For example, use
// '/subscriptions/{subscriptionId}/' for a subscription,
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}' for a resource group, and
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}'
// for a resource, and
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EventGrid/topics/{topicName}'
// for an EventGrid topic. eventSubscriptionName is the name of the event subscription.
func (client EventSubscriptionsClient) GetFullURL(ctx context.Context, scope string, eventSubscriptionName string) (result EventSubscriptionFullURL, err error) {
req, err := client.GetFullURLPreparer(ctx, scope, eventSubscriptionName)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "GetFullURL", nil, "Failure preparing request")
return
}
resp, err := client.GetFullURLSender(req)
if err != nil {
result.Response = autorest.Response{Response: resp}
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "GetFullURL", resp, "Failure sending request")
return
}
result, err = client.GetFullURLResponder(resp)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "GetFullURL", resp, "Failure responding to request")
}
return
}
// GetFullURLPreparer prepares the GetFullURL request.
func (client EventSubscriptionsClient) GetFullURLPreparer(ctx context.Context, scope string, eventSubscriptionName string) (*http.Request, error) {
pathParameters := map[string]interface{}{
"eventSubscriptionName": autorest.Encode("path", eventSubscriptionName),
"scope": scope,
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsPost(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/{scope}/providers/Microsoft.EventGrid/eventSubscriptions/{eventSubscriptionName}/getFullUrl", pathParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// GetFullURLSender sends the GetFullURL request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) GetFullURLSender(req *http.Request) (*http.Response, error) {
return autorest.SendWithSender(client, req,
autorest.DoRetryForStatusCodes(client.RetryAttempts, client.RetryDuration, autorest.StatusCodesForRetry...))
}
// GetFullURLResponder handles the response to the GetFullURL request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) GetFullURLResponder(resp *http.Response) (result EventSubscriptionFullURL, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
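// Note that GetFullURL is prepared as a POST rather than a GET, since the full
// endpoint URL may embed secret tokens. A hypothetical usage sketch (placeholders
// in angle brackets; client and scope are assumed to be set up as for Get above):
//
//    full, err := client.GetFullURL(context.Background(), scope, "<eventSubscriptionName>")
//    if err != nil {
//        // handle the error
//    }
//    _ = full // the result carries the destination endpoint URL, including any secrets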
// ListByResource lists all event subscriptions that have been created for a specific topic.
//
// resourceGroupName is the name of the resource group within the user's subscription. providerNamespace is the
// namespace of the provider of the topic. resourceTypeName is the name of the resource type. resourceName is the name
// of the resource.
func (client EventSubscriptionsClient) ListByResource(ctx context.Context, resourceGroupName string, providerNamespace string, resourceTypeName string, resourceName string) (result EventSubscriptionsListResult, err error) {
req, err := client.ListByResourcePreparer(ctx, resourceGroupName, providerNamespace, resourceTypeName, resourceName)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListByResource", nil, "Failure preparing request")
return
}
resp, err := client.ListByResourceSender(req)
if err != nil {
result.Response = autorest.Response{Response: resp}
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListByResource", resp, "Failure sending request")
return
}
result, err = client.ListByResourceResponder(resp)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListByResource", resp, "Failure responding to request")
}
return
}
// ListByResourcePreparer prepares the ListByResource request.
func (client EventSubscriptionsClient) ListByResourcePreparer(ctx context.Context, resourceGroupName string, providerNamespace string, resourceTypeName string, resourceName string) (*http.Request, error) {
pathParameters := map[string]interface{}{
"providerNamespace": autorest.Encode("path", providerNamespace),
"resourceGroupName": autorest.Encode("path", resourceGroupName),
"resourceName": autorest.Encode("path", resourceName),
"resourceTypeName": autorest.Encode("path", resourceTypeName),
"subscriptionId": autorest.Encode("path", client.SubscriptionID),
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsGet(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{providerNamespace}/{resourceTypeName}/{resourceName}/providers/Microsoft.EventGrid/eventSubscriptions", pathParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// ListByResourceSender sends the ListByResource request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) ListByResourceSender(req *http.Request) (*http.Response, error) {
return autorest.SendWithSender(client, req,
azure.DoRetryWithRegistration(client.Client))
}
// ListByResourceResponder handles the response to the ListByResource request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) ListByResourceResponder(resp *http.Response) (result EventSubscriptionsListResult, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
// ListGlobalByResourceGroup lists all global event subscriptions under a specific Azure subscription and resource group.
//
// resourceGroupName is the name of the resource group within the user's subscription.
func (client EventSubscriptionsClient) ListGlobalByResourceGroup(ctx context.Context, resourceGroupName string) (result EventSubscriptionsListResult, err error) {
req, err := client.ListGlobalByResourceGroupPreparer(ctx, resourceGroupName)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListGlobalByResourceGroup", nil, "Failure preparing request")
return
}
resp, err := client.ListGlobalByResourceGroupSender(req)
if err != nil {
result.Response = autorest.Response{Response: resp}
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListGlobalByResourceGroup", resp, "Failure sending request")
return
}
result, err = client.ListGlobalByResourceGroupResponder(resp)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListGlobalByResourceGroup", resp, "Failure responding to request")
}
return
}
// ListGlobalByResourceGroupPreparer prepares the ListGlobalByResourceGroup request.
func (client EventSubscriptionsClient) ListGlobalByResourceGroupPreparer(ctx context.Context, resourceGroupName string) (*http.Request, error) {
pathParameters := map[string]interface{}{
"resourceGroupName": autorest.Encode("path", resourceGroupName),
"subscriptionId": autorest.Encode("path", client.SubscriptionID),
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsGet(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EventGrid/eventSubscriptions", pathParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// ListGlobalByResourceGroupSender sends the ListGlobalByResourceGroup request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) ListGlobalByResourceGroupSender(req *http.Request) (*http.Response, error) {
return autorest.SendWithSender(client, req,
azure.DoRetryWithRegistration(client.Client))
}
// ListGlobalByResourceGroupResponder handles the response to the ListGlobalByResourceGroup request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) ListGlobalByResourceGroupResponder(resp *http.Response) (result EventSubscriptionsListResult, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
// ListGlobalByResourceGroupForTopicType lists all global event subscriptions under a resource group for a specific
// topic type.
//
// resourceGroupName is the name of the resource group within the user's subscription. topicTypeName is the name of
// the topic type.
func (client EventSubscriptionsClient) ListGlobalByResourceGroupForTopicType(ctx context.Context, resourceGroupName string, topicTypeName string) (result EventSubscriptionsListResult, err error) {
req, err := client.ListGlobalByResourceGroupForTopicTypePreparer(ctx, resourceGroupName, topicTypeName)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListGlobalByResourceGroupForTopicType", nil, "Failure preparing request")
return
}
resp, err := client.ListGlobalByResourceGroupForTopicTypeSender(req)
if err != nil {
result.Response = autorest.Response{Response: resp}
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListGlobalByResourceGroupForTopicType", resp, "Failure sending request")
return
}
result, err = client.ListGlobalByResourceGroupForTopicTypeResponder(resp)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListGlobalByResourceGroupForTopicType", resp, "Failure responding to request")
}
return
}
// ListGlobalByResourceGroupForTopicTypePreparer prepares the ListGlobalByResourceGroupForTopicType request.
func (client EventSubscriptionsClient) ListGlobalByResourceGroupForTopicTypePreparer(ctx context.Context, resourceGroupName string, topicTypeName string) (*http.Request, error) {
pathParameters := map[string]interface{}{
"resourceGroupName": autorest.Encode("path", resourceGroupName),
"subscriptionId": autorest.Encode("path", client.SubscriptionID),
"topicTypeName": autorest.Encode("path", topicTypeName),
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsGet(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EventGrid/topicTypes/{topicTypeName}/eventSubscriptions", pathParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// ListGlobalByResourceGroupForTopicTypeSender sends the ListGlobalByResourceGroupForTopicType request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) ListGlobalByResourceGroupForTopicTypeSender(req *http.Request) (*http.Response, error) {
return autorest.SendWithSender(client, req,
azure.DoRetryWithRegistration(client.Client))
}
// ListGlobalByResourceGroupForTopicTypeResponder handles the response to the ListGlobalByResourceGroupForTopicType request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) ListGlobalByResourceGroupForTopicTypeResponder(resp *http.Response) (result EventSubscriptionsListResult, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
// ListGlobalBySubscription lists all aggregated global event subscriptions under a specific Azure subscription.
func (client EventSubscriptionsClient) ListGlobalBySubscription(ctx context.Context) (result EventSubscriptionsListResult, err error) {
req, err := client.ListGlobalBySubscriptionPreparer(ctx)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListGlobalBySubscription", nil, "Failure preparing request")
return
}
resp, err := client.ListGlobalBySubscriptionSender(req)
if err != nil {
result.Response = autorest.Response{Response: resp}
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListGlobalBySubscription", resp, "Failure sending request")
return
}
result, err = client.ListGlobalBySubscriptionResponder(resp)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListGlobalBySubscription", resp, "Failure responding to request")
}
return
}
// ListGlobalBySubscriptionPreparer prepares the ListGlobalBySubscription request.
func (client EventSubscriptionsClient) ListGlobalBySubscriptionPreparer(ctx context.Context) (*http.Request, error) {
pathParameters := map[string]interface{}{
"subscriptionId": autorest.Encode("path", client.SubscriptionID),
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsGet(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/subscriptions/{subscriptionId}/providers/Microsoft.EventGrid/eventSubscriptions", pathParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// ListGlobalBySubscriptionSender sends the ListGlobalBySubscription request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) ListGlobalBySubscriptionSender(req *http.Request) (*http.Response, error) {
return autorest.SendWithSender(client, req,
azure.DoRetryWithRegistration(client.Client))
}
// ListGlobalBySubscriptionResponder handles the response to the ListGlobalBySubscription request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) ListGlobalBySubscriptionResponder(resp *http.Response) (result EventSubscriptionsListResult, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
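// The List* methods in this file all follow the same Preparer/Sender/Responder
// pattern. A hypothetical sketch of enumerating every global event subscription
// under the client's subscription (assuming the generated EventSubscriptionsListResult
// model exposes its entries via the Value field, as elsewhere in this package):
//
//    list, err := client.ListGlobalBySubscription(context.Background())
//    if err != nil {
//        // handle the error
//    }
//    if list.Value != nil {
//        for _, sub := range *list.Value {
//            _ = sub // each entry is an EventSubscription
//        }
//    }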
// ListGlobalBySubscriptionForTopicType lists all global event subscriptions under an Azure subscription for a topic
// type.
//
// topicTypeName is the name of the topic type.
func (client EventSubscriptionsClient) ListGlobalBySubscriptionForTopicType(ctx context.Context, topicTypeName string) (result EventSubscriptionsListResult, err error) {
req, err := client.ListGlobalBySubscriptionForTopicTypePreparer(ctx, topicTypeName)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListGlobalBySubscriptionForTopicType", nil, "Failure preparing request")
return
}
resp, err := client.ListGlobalBySubscriptionForTopicTypeSender(req)
if err != nil {
result.Response = autorest.Response{Response: resp}
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListGlobalBySubscriptionForTopicType", resp, "Failure sending request")
return
}
result, err = client.ListGlobalBySubscriptionForTopicTypeResponder(resp)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListGlobalBySubscriptionForTopicType", resp, "Failure responding to request")
}
return
}
// ListGlobalBySubscriptionForTopicTypePreparer prepares the ListGlobalBySubscriptionForTopicType request.
func (client EventSubscriptionsClient) ListGlobalBySubscriptionForTopicTypePreparer(ctx context.Context, topicTypeName string) (*http.Request, error) {
pathParameters := map[string]interface{}{
"subscriptionId": autorest.Encode("path", client.SubscriptionID),
"topicTypeName": autorest.Encode("path", topicTypeName),
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsGet(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/subscriptions/{subscriptionId}/providers/Microsoft.EventGrid/topicTypes/{topicTypeName}/eventSubscriptions", pathParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// ListGlobalBySubscriptionForTopicTypeSender sends the ListGlobalBySubscriptionForTopicType request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) ListGlobalBySubscriptionForTopicTypeSender(req *http.Request) (*http.Response, error) {
return autorest.SendWithSender(client, req,
azure.DoRetryWithRegistration(client.Client))
}
// ListGlobalBySubscriptionForTopicTypeResponder handles the response to the ListGlobalBySubscriptionForTopicType request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) ListGlobalBySubscriptionForTopicTypeResponder(resp *http.Response) (result EventSubscriptionsListResult, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
// ListRegionalByResourceGroup lists all event subscriptions from the given location under a specific Azure
// subscription and resource group.
//
// resourceGroupName is the name of the resource group within the user's subscription. location is the name of the
// location.
func (client EventSubscriptionsClient) ListRegionalByResourceGroup(ctx context.Context, resourceGroupName string, location string) (result EventSubscriptionsListResult, err error) {
req, err := client.ListRegionalByResourceGroupPreparer(ctx, resourceGroupName, location)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListRegionalByResourceGroup", nil, "Failure preparing request")
return
}
resp, err := client.ListRegionalByResourceGroupSender(req)
if err != nil {
result.Response = autorest.Response{Response: resp}
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListRegionalByResourceGroup", resp, "Failure sending request")
return
}
result, err = client.ListRegionalByResourceGroupResponder(resp)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListRegionalByResourceGroup", resp, "Failure responding to request")
}
return
}
// ListRegionalByResourceGroupPreparer prepares the ListRegionalByResourceGroup request.
func (client EventSubscriptionsClient) ListRegionalByResourceGroupPreparer(ctx context.Context, resourceGroupName string, location string) (*http.Request, error) {
pathParameters := map[string]interface{}{
"location": autorest.Encode("path", location),
"resourceGroupName": autorest.Encode("path", resourceGroupName),
"subscriptionId": autorest.Encode("path", client.SubscriptionID),
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsGet(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EventGrid/locations/{location}/eventSubscriptions", pathParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// ListRegionalByResourceGroupSender sends the ListRegionalByResourceGroup request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) ListRegionalByResourceGroupSender(req *http.Request) (*http.Response, error) {
return autorest.SendWithSender(client, req,
azure.DoRetryWithRegistration(client.Client))
}
// ListRegionalByResourceGroupResponder handles the response to the ListRegionalByResourceGroup request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) ListRegionalByResourceGroupResponder(resp *http.Response) (result EventSubscriptionsListResult, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
// ListRegionalByResourceGroupForTopicType lists all event subscriptions from the given location under a specific
// Azure subscription, resource group, and topic type.
//
// resourceGroupName is the name of the resource group within the user's subscription. location is the name of the
// location. topicTypeName is the name of the topic type.
func (client EventSubscriptionsClient) ListRegionalByResourceGroupForTopicType(ctx context.Context, resourceGroupName string, location string, topicTypeName string) (result EventSubscriptionsListResult, err error) {
req, err := client.ListRegionalByResourceGroupForTopicTypePreparer(ctx, resourceGroupName, location, topicTypeName)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListRegionalByResourceGroupForTopicType", nil, "Failure preparing request")
return
}
resp, err := client.ListRegionalByResourceGroupForTopicTypeSender(req)
if err != nil {
result.Response = autorest.Response{Response: resp}
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListRegionalByResourceGroupForTopicType", resp, "Failure sending request")
return
}
result, err = client.ListRegionalByResourceGroupForTopicTypeResponder(resp)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListRegionalByResourceGroupForTopicType", resp, "Failure responding to request")
}
return
}
// ListRegionalByResourceGroupForTopicTypePreparer prepares the ListRegionalByResourceGroupForTopicType request.
func (client EventSubscriptionsClient) ListRegionalByResourceGroupForTopicTypePreparer(ctx context.Context, resourceGroupName string, location string, topicTypeName string) (*http.Request, error) {
pathParameters := map[string]interface{}{
"location": autorest.Encode("path", location),
"resourceGroupName": autorest.Encode("path", resourceGroupName),
"subscriptionId": autorest.Encode("path", client.SubscriptionID),
"topicTypeName": autorest.Encode("path", topicTypeName),
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsGet(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EventGrid/locations/{location}/topicTypes/{topicTypeName}/eventSubscriptions", pathParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// ListRegionalByResourceGroupForTopicTypeSender sends the ListRegionalByResourceGroupForTopicType request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) ListRegionalByResourceGroupForTopicTypeSender(req *http.Request) (*http.Response, error) {
return autorest.SendWithSender(client, req,
azure.DoRetryWithRegistration(client.Client))
}
// ListRegionalByResourceGroupForTopicTypeResponder handles the response to the ListRegionalByResourceGroupForTopicType request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) ListRegionalByResourceGroupForTopicTypeResponder(resp *http.Response) (result EventSubscriptionsListResult, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
// ListRegionalBySubscription lists all event subscriptions from the given location under a specific Azure
// subscription.
//
// location is the name of the location.
func (client EventSubscriptionsClient) ListRegionalBySubscription(ctx context.Context, location string) (result EventSubscriptionsListResult, err error) {
req, err := client.ListRegionalBySubscriptionPreparer(ctx, location)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListRegionalBySubscription", nil, "Failure preparing request")
return
}
resp, err := client.ListRegionalBySubscriptionSender(req)
if err != nil {
result.Response = autorest.Response{Response: resp}
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListRegionalBySubscription", resp, "Failure sending request")
return
}
result, err = client.ListRegionalBySubscriptionResponder(resp)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListRegionalBySubscription", resp, "Failure responding to request")
}
return
}
// ListRegionalBySubscriptionPreparer prepares the ListRegionalBySubscription request.
func (client EventSubscriptionsClient) ListRegionalBySubscriptionPreparer(ctx context.Context, location string) (*http.Request, error) {
pathParameters := map[string]interface{}{
"location": autorest.Encode("path", location),
"subscriptionId": autorest.Encode("path", client.SubscriptionID),
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsGet(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/subscriptions/{subscriptionId}/providers/Microsoft.EventGrid/locations/{location}/eventSubscriptions", pathParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// ListRegionalBySubscriptionSender sends the ListRegionalBySubscription request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) ListRegionalBySubscriptionSender(req *http.Request) (*http.Response, error) {
return autorest.SendWithSender(client, req,
azure.DoRetryWithRegistration(client.Client))
}
// ListRegionalBySubscriptionResponder handles the response to the ListRegionalBySubscription request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) ListRegionalBySubscriptionResponder(resp *http.Response) (result EventSubscriptionsListResult, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
// ListRegionalBySubscriptionForTopicType lists all event subscriptions from the given location under a specific Azure
// subscription and topic type.
//
// location is the name of the location. topicTypeName is the name of the topic type.
func (client EventSubscriptionsClient) ListRegionalBySubscriptionForTopicType(ctx context.Context, location string, topicTypeName string) (result EventSubscriptionsListResult, err error) {
req, err := client.ListRegionalBySubscriptionForTopicTypePreparer(ctx, location, topicTypeName)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListRegionalBySubscriptionForTopicType", nil, "Failure preparing request")
return
}
resp, err := client.ListRegionalBySubscriptionForTopicTypeSender(req)
if err != nil {
result.Response = autorest.Response{Response: resp}
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListRegionalBySubscriptionForTopicType", resp, "Failure sending request")
return
}
result, err = client.ListRegionalBySubscriptionForTopicTypeResponder(resp)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "ListRegionalBySubscriptionForTopicType", resp, "Failure responding to request")
}
return
}
// ListRegionalBySubscriptionForTopicTypePreparer prepares the ListRegionalBySubscriptionForTopicType request.
func (client EventSubscriptionsClient) ListRegionalBySubscriptionForTopicTypePreparer(ctx context.Context, location string, topicTypeName string) (*http.Request, error) {
pathParameters := map[string]interface{}{
"location": autorest.Encode("path", location),
"subscriptionId": autorest.Encode("path", client.SubscriptionID),
"topicTypeName": autorest.Encode("path", topicTypeName),
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsGet(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/subscriptions/{subscriptionId}/providers/Microsoft.EventGrid/locations/{location}/topicTypes/{topicTypeName}/eventSubscriptions", pathParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// ListRegionalBySubscriptionForTopicTypeSender sends the ListRegionalBySubscriptionForTopicType request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) ListRegionalBySubscriptionForTopicTypeSender(req *http.Request) (*http.Response, error) {
return autorest.SendWithSender(client, req,
azure.DoRetryWithRegistration(client.Client))
}
// ListRegionalBySubscriptionForTopicTypeResponder handles the response to the ListRegionalBySubscriptionForTopicType request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) ListRegionalBySubscriptionForTopicTypeResponder(resp *http.Response) (result EventSubscriptionsListResult, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
// Update asynchronously updates an existing event subscription.
//
// scope is the scope of existing event subscription. The scope can be a subscription, or a resource group, or a top
// level resource belonging to a resource provider namespace, or an EventGrid topic. For example, use
// '/subscriptions/{subscriptionId}/' for a subscription,
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}' for a resource group, and
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}'
// for a resource, and
// '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.EventGrid/topics/{topicName}'
// for an EventGrid topic. eventSubscriptionName is name of the event subscription to be created
// eventSubscriptionUpdateParameters is updated event subscription information
func (client EventSubscriptionsClient) Update(ctx context.Context, scope string, eventSubscriptionName string, eventSubscriptionUpdateParameters EventSubscriptionUpdateParameters) (result EventSubscriptionsUpdateFuture, err error) {
req, err := client.UpdatePreparer(ctx, scope, eventSubscriptionName, eventSubscriptionUpdateParameters)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "Update", nil, "Failure preparing request")
return
}
result, err = client.UpdateSender(req)
if err != nil {
err = autorest.NewErrorWithError(err, "eventgrid.EventSubscriptionsClient", "Update", result.Response(), "Failure sending request")
return
}
return
}
// UpdatePreparer prepares the Update request.
func (client EventSubscriptionsClient) UpdatePreparer(ctx context.Context, scope string, eventSubscriptionName string, eventSubscriptionUpdateParameters EventSubscriptionUpdateParameters) (*http.Request, error) {
pathParameters := map[string]interface{}{
"eventSubscriptionName": autorest.Encode("path", eventSubscriptionName),
"scope": scope,
}
const APIVersion = "2017-09-15-preview"
queryParameters := map[string]interface{}{
"api-version": APIVersion,
}
preparer := autorest.CreatePreparer(
autorest.AsJSON(),
autorest.AsPatch(),
autorest.WithBaseURL(client.BaseURI),
autorest.WithPathParameters("/{scope}/providers/Microsoft.EventGrid/eventSubscriptions/{eventSubscriptionName}", pathParameters),
autorest.WithJSON(eventSubscriptionUpdateParameters),
autorest.WithQueryParameters(queryParameters))
return preparer.Prepare((&http.Request{}).WithContext(ctx))
}
// UpdateSender sends the Update request. The method will close the
// http.Response Body if it receives an error.
func (client EventSubscriptionsClient) UpdateSender(req *http.Request) (future EventSubscriptionsUpdateFuture, err error) {
sender := autorest.DecorateSender(client, autorest.DoRetryForStatusCodes(client.RetryAttempts, client.RetryDuration, autorest.StatusCodesForRetry...))
future.Future = azure.NewFuture(req)
future.req = req
_, err = future.Done(sender)
if err != nil {
return
}
err = autorest.Respond(future.Response(),
azure.WithErrorUnlessStatusCode(http.StatusOK, http.StatusCreated))
return
}
// UpdateResponder handles the response to the Update request. The method always
// closes the http.Response Body.
func (client EventSubscriptionsClient) UpdateResponder(resp *http.Response) (result EventSubscription, err error) {
err = autorest.Respond(
resp,
client.ByInspecting(),
azure.WithErrorUnlessStatusCode(http.StatusOK, http.StatusCreated),
autorest.ByUnmarshallingJSON(&result),
autorest.ByClosing())
result.Response = autorest.Response{Response: resp}
return
}
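The Preparer/Sender/Responder triads above all follow the same autorest pattern: expand `{placeholder}` path parameters into a URL, append the `api-version` query string, send with retry decoration, then unmarshal the JSON body. A minimal Python sketch of the path/query expansion step (a hypothetical helper, not the autorest API; note that `UpdatePreparer` deliberately leaves `scope` unencoded so it may itself contain slashes):

```python
def prepare_url(base, path_template, path_params, query_params):
    # Substitute {name} placeholders in the template, then append the query string.
    url = path_template
    for name, value in path_params.items():
        url = url.replace("{" + name + "}", value)
    query = "&".join(f"{k}={v}" for k, v in query_params.items())
    return base.rstrip("/") + url + ("?" + query if query else "")

url = prepare_url(
    "https://management.azure.com",
    "/{scope}/providers/Microsoft.EventGrid/eventSubscriptions/{eventSubscriptionName}",
    # scope is passed through unencoded, exactly as UpdatePreparer does.
    {"scope": "subscriptions/abc", "eventSubscriptionName": "sub1"},
    {"api-version": "2017-09-15-preview"},
)
# → https://management.azure.com/subscriptions/abc/providers/Microsoft.EventGrid/eventSubscriptions/sub1?api-version=2017-09-15-preview
```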
| {
"content_hash": "3e5c657db24566a6f342b052e2b653d9",
"timestamp": "",
"source": "github",
"line_count": 1011,
"max_line_length": 232,
"avg_line_length": 46.789317507418396,
"alnum_prop": 0.7955986808726535,
"repo_name": "pravisankar/origin",
"id": "40aa40d4296d4f2b12f77e756115d43de84f7c8d",
"size": "47304",
"binary": false,
"copies": "5",
"ref": "refs/heads/master",
"path": "vendor/github.com/Azure/azure-sdk-for-go/services/eventgrid/mgmt/2017-09-15-preview/eventgrid/eventsubscriptions.go",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Awk",
"bytes": "921"
},
{
"name": "DIGITAL Command Language",
"bytes": "117"
},
{
"name": "Go",
"bytes": "14764687"
},
{
"name": "Makefile",
"bytes": "8599"
},
{
"name": "Perl",
"bytes": "365"
},
{
"name": "Python",
"bytes": "16037"
},
{
"name": "Ruby",
"bytes": "484"
},
{
"name": "Shell",
"bytes": "1000988"
}
],
"symlink_target": ""
} |
package cellularAutomata.lattice;
import cellularAutomata.CurrentProperties;
import cellularAutomata.Cell;
import cellularAutomata.lattice.view.LatticeView;
import cellularAutomata.lattice.view.SquareLatticeView;
import cellularAutomata.rules.Rule;
import cellularAutomata.util.math.Modulus;
/**
* A two-dimensional diamond-shaped square lattice with wrap around boundary
* conditions and a von Neumann neighborhood of radius r. Radius 1 corresponds
* to a nearest-neighbor "square (4 neighbor)". The number of neighbors
* (excluding the central cell) is 2r(r+1). The wrap around conditions and
* neighbors are implemented in the getNeighbors() method.
* <p>
* If the cell is located at (i0, j0), then another cell at (i, j) is in the
* neighborhood if |i-i0| + |j-j0| <= r.
* <p>
* By convention, the neighbors start at the upper left and wrap around
* clockwise, spiraling inwards. Therefore, the top point of the von Neumann
* neighborhood is the 0th neighbor, and the right point is the 1st neighbor,
* and the bottom point is the 2nd neighbor, etc.
*
* @author David Bahr
*/
public class VonNeumannRadiusLattice extends SquareLattice
{
// Note that the following constant is public so that the PropertyReader
// class can set a default value for PropertyReader.LATTICE.
/**
* The display name for this lattice.
*/
public final static String DISPLAY_NAME = "square (von Neumann, radius r)";
/**
* The radius of the neighborhood (radius = 1 corresponds to the
* nearest-neighbor lattice). This value may be reset by the PropertyPanel.
*/
public static int radius = 2;
// Tells the lattice how to display its graphics.
private LatticeView view = null;
// The tooltip for this lattice
private static final String TOOLTIP = "<html><body> <b>One of the more common "
+ "geometries:</b> a "
+ "square lattice <br>"
+ "with a diamond-shaped neighborhood that extends a radius <br>"
+ "r from the central cell. A radius of 1 is the standard <br>"
+ "four-neighbor geometry -- see \""
+ FourNeighborSquareLattice.DISPLAY_NAME
+ "\". "
+ "<br><br>"
+ "In general, for a central cell at (i<sub>0</sub>, j<sub>0</sub>), the neighborhood <br>"
+ "is all cells such that |i-i<sub>0</sub>| + |j-j<sub>0</sub>| <= r. "
+ "<br><br>"
+ "For example, consider the cells shown below where _ is <br>"
+ "the central cell. <br>"
+ "<pre>"
+ " 12345 <br>"
+ " ghij6 <br>"
+ " fo_k7 <br>"
+ " enml8 <br>"
+ " dcba9 <br>"
+ "</pre>"
+ "The von Neumann neighborhood of radius 1 is given by <br>"
+ "the cells <br>"
+ "<pre>"
+ " i <br>"
+ " o_k <br>"
+ " m <br>"
+ "</pre>"
+ "The von Neumann neighborhood of radius 2 is given by <br>"
+ "the cells <br>"
+ "<pre>"
+ " 3 <br>"
+ " hij <br>"
+ " fo_k7 <br>"
+ " nml <br>"
+ " b <br>"
+ "</pre>"
+ "Note the diamond shapes. <br><br>"
+ "To see the neighborhood associated with a cell, use the <br>"
+ "\"Show Neighborhood\" analysis.</body><html>";
/**
* Default constructor required of all Lattices (for reflection). Typically
* not used to build the lattice, but instead used to gain access to methods
* such as getDisplayName() and getNumberOfNeighbors().
*/
public VonNeumannRadiusLattice()
{
super();
}
/**
* Create a two-dimensional cellular automaton with the same rule at all
* positions. The height and width of the automaton and the initial state of
* each cell is specified in a file.
*
* @param rule
* The rule applied to all cells.
* @param initialStateFilePath
* The path to the file that specifies the initial state of the
* cellular automaton. If null, uses default initial state.
* @param maxHistory
* The maximum number of generations (or time steps) that will be
* remembered by the Cells on the lattice.
*/
public VonNeumannRadiusLattice(String initialStateFilePath, Rule rule,
int maxHistory)
{
super(initialStateFilePath, rule, maxHistory);
view = new SquareLatticeView(this);
// make sure the radius in the properties is the same as the radius
// here.
radius = CurrentProperties.getInstance().getNeighborhoodRadius();
}
/**
* A brief string describing the lattice, appropriate for
* display in a drop-down list.
*
* @return A string no longer than 25 characters.
*/
public String getDisplayName()
{
return DISPLAY_NAME;
}
/**
* On a square lattice, finds the von Neumann neighbors to a cell at the
* specified index.
* <p>
* If the cell is located at (i0, j0), then another cell at (i, j) is in the
* neighborhood if |i-i0| + |j-j0| <= r.
* <p>
* By convention, the neighbors start at the upper left and wrap around
* clockwise.
*
* @param row
* The cell's vertical position in the array.
* @param col
* The cell's horizontal position in the array.
* @param boundaryType
* A constant indicating the type of boundary (wrap-around,
* reflection, etc). Acceptable constants are specified in the
* Lattice class.
*
* @return An array of neighboring cells.
*/
protected Cell[] getNeighboringCells(int row, int col, int boundaryType)
{
// get the total number of rows and columns from the super class
int numRows = getHeight();
int numCols = getWidth();
int numNeighbors = getNumberOfNeighbors();
// create an array of neighbors
Cell[] neighboringCells = new Cell[numNeighbors];
int neighborNumber = 0;
// For wrap around boundary conditions...
//
// this spirals in, starting on the outside at the upper left. Each
// decreasing value of r moves the spiral inward one shell.
// E.g., when r = 2,
// 12345
// g _ 6
// f x 7
// e _ 8
// dcba9
// And, when r = 1,
// 123
// 8x4
// 765
//
// HOWEVER, I don't include all cells. Just the ones that satisfy
// |i-i0| + |j-j0| <= r.
//
// Also note that I use % for positive numbers, but use Modulus.mod for
// negative numbers. This ensures that I get a more typical modulus
// behavior, rather than "remainder after division" behavior. This works
// better for wrap around boundary conditions. (I could have used
// Modulus.mod for positive numbers as well, but I assume that % is
// faster.)
for(int r = radius; r > 0; r--)
{
// the following "for" loops do one lap around the spiral at a
// radius of r. Each "for" loop is one side of the spiral
for(int j = 0; j <= 2 * r; j++)
{
int rowPos = row - r;
int colPos = col - r + j;
int vonNeumannDistance = Math.abs(rowPos - row)
+ Math.abs(colPos - col);
if(boundaryType == Lattice.REFLECTION_BOUNDARY)
{
// adjust for reflection
if(rowPos < 0)
{
rowPos = row + r;
}
if(colPos < 0 || colPos >= numCols)
{
colPos = col + r - j;
}
}
// now get the neighboring cell if it is in the von Neumann
// neighborhood
if(vonNeumannDistance <= radius)
{
neighboringCells[neighborNumber] = cells[Modulus.mod(
rowPos, numRows)][Modulus.mod(colPos, numCols)];
neighborNumber++;
}
}
for(int i = 1; i <= 2 * r; i++)
{
int rowPos = row - r + i;
int colPos = col + r;
int vonNeumannDistance = Math.abs(rowPos - row)
+ Math.abs(colPos - col);
// adjust for reflection
if(boundaryType == Lattice.REFLECTION_BOUNDARY)
{
if(colPos >= numCols)
{
colPos = col - r;
}
if(rowPos < 0 || rowPos >= numRows)
{
rowPos = row + r - i;
}
}
// now get the neighboring cell if it is in the von Neumann
// neighborhood
if(vonNeumannDistance <= radius)
{
neighboringCells[neighborNumber] = cells[Modulus.mod(
rowPos, numRows)][colPos % numCols];
neighborNumber++;
}
}
for(int j = 1; j <= 2 * r; j++)
{
int rowPos = row + r;
int colPos = col + r - j;
int vonNeumannDistance = Math.abs(rowPos - row)
+ Math.abs(colPos - col);
// adjust for reflection
if(boundaryType == Lattice.REFLECTION_BOUNDARY)
{
if(rowPos >= numRows)
{
rowPos = row - r;
}
if(colPos < 0 || colPos >= numCols)
{
colPos = col - r + j;
}
}
// now get the neighboring cell if it is in the von Neumann
// neighborhood
if(vonNeumannDistance <= radius)
{
neighboringCells[neighborNumber] = cells[rowPos % numRows][Modulus
.mod(colPos, numCols)];
neighborNumber++;
}
}
for(int i = 1; i <= 2 * r - 1; i++)
{
int rowPos = row + r - i;
int colPos = col - r;
int vonNeumannDistance = Math.abs(rowPos - row)
+ Math.abs(colPos - col);
// adjust for reflection
if(boundaryType == Lattice.REFLECTION_BOUNDARY)
{
if(colPos < 0)
{
colPos = col + r;
}
if(rowPos < 0 || rowPos >= numRows)
{
rowPos = row - r + i;
}
}
// now get the neighboring cell if it is in the von Neumann
// neighborhood
if(vonNeumannDistance <= radius)
{
neighboringCells[neighborNumber] = cells[Modulus.mod(
rowPos, numRows)][Modulus.mod(colPos, numCols)];
neighborNumber++;
}
}
}
return neighboringCells;
}
/**
* Returns the number of neighbors of a cell on the lattice (excluding the
* cell). If the cells have a varying number of neighbors, then this
* returns -1, and the user should use <code>getNeighbors(Cell cell)</code>
* to find the number of neighbors for any particular cell.
*
* @return The number of neighbors for each cell, or -1 if that number is
* variable.
*/
public int getNumberOfNeighbors()
{
return 2 * radius * (radius + 1);
}
/**
* A tooltip that describes the lattice.
*
* @return The tooltip.
*/
public String getToolTipDescription()
{
return TOOLTIP;
}
/**
* Gets the graphics that tells the lattice how to display itself.
*
* @return The graphics view that tells this Lattice how to display.
*/
public LatticeView getView()
{
return view;
}
/**
* Sets the graphics that tells the Lattice how to display itself. Unless a
* non-default view is desired, this does not need to be called by the user
* (because a default view is always used).
*
* @param view
* The graphics view that tells this Lattice how to display.
*/
public void setView(LatticeView view)
{
this.view = view;
}
}
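The spiral traversal in getNeighboringCells() can be condensed: a cell (i, j) belongs to the radius-r von Neumann neighborhood of (i0, j0) exactly when |i-i0| + |j-j0| <= r, and with wrap-around boundaries the neighbor count is always 2r(r+1). A short Python sketch of the wrap-around case (Python's % already behaves like Modulus.mod for negative operands, so no separate helper is needed; the iteration order differs from the Java spiral convention):

```python
def von_neumann_neighbors(row, col, r, num_rows, num_cols):
    # All cells within Manhattan distance 1..r of (row, col),
    # with wrap-around (toroidal) boundary conditions.
    neighbors = []
    for di in range(-r, r + 1):
        for dj in range(-r, r + 1):
            d = abs(di) + abs(dj)
            if 0 < d <= r:  # exclude the central cell itself
                neighbors.append(((row + di) % num_rows, (col + dj) % num_cols))
    return neighbors

# Radius 1 on a 10x10 lattice: the 4 nearest neighbors, wrapping past row 0.
print(len(von_neumann_neighbors(0, 0, 1, 10, 10)))  # → 4
print(len(von_neumann_neighbors(0, 0, 2, 10, 10)))  # → 12, i.e. 2r(r+1)
```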
| {
"content_hash": "20781c561762c14c814e0dc4a0dbd491",
"timestamp": "",
"source": "github",
"line_count": 369,
"max_line_length": 99,
"avg_line_length": 36.22764227642276,
"alnum_prop": 0.4967833632555356,
"repo_name": "KEOpenSource/CAExplorer",
"id": "bfd00449724a1f6134cdf3198c2eb5102caf9a2d",
"size": "14232",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "cellularAutomata/lattice/VonNeumannRadiusLattice.java",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "CSS",
"bytes": "1420"
},
{
"name": "Java",
"bytes": "5802939"
}
],
"symlink_target": ""
} |
using System;
using System.Collections.Generic;
using QuantConnect.Data;
using QuantConnect.Orders;
using QuantConnect.Interfaces;
namespace QuantConnect.Algorithm.CSharp
{
/// <summary>
/// Options Open Interest data regression test.
/// </summary>
/// <meta name="tag" content="options" />
/// <meta name="tag" content="regression test" />
public class OptionOpenInterestRegressionAlgorithm : QCAlgorithm, IRegressionAlgorithmDefinition
{
public override void Initialize()
{
// this test verifies that option open interest data is loaded correctly across two consecutive trading days
SetStartDate(2014, 06, 05);
SetEndDate(2014, 06, 06);
SetCash(1000000);
var option = AddOption("TWX");
option.SetFilter(-10, +10, TimeSpan.Zero, TimeSpan.FromDays(365 * 2));
// use the underlying equity as the benchmark
SetBenchmark("TWX");
}
/// <summary>
/// Event - v3.0 DATA EVENT HANDLER: (Pattern) Basic template for user to override for receiving all subscription data in a single event
/// </summary>
/// <param name="slice">The current slice of data keyed by symbol string</param>
public override void OnData(Slice slice)
{
if (!Portfolio.Invested)
{
foreach (var chain in slice.OptionChains)
{
foreach (var contract in chain.Value)
{
if (contract.Symbol.ID.StrikePrice == 72.5m &&
contract.Symbol.ID.OptionRight == OptionRight.Call &&
contract.Symbol.ID.Date == new DateTime(2016, 01, 15))
{
if (slice.Time.Date == new DateTime(2014, 06, 05) && contract.OpenInterest != 50)
{
throw new Exception("Regression test failed: current open interest was not correctly loaded and is not equal to 50");
}
if (slice.Time.Date == new DateTime(2014, 06, 06) && contract.OpenInterest != 70)
{
throw new Exception("Regression test failed: current open interest was not correctly loaded and is not equal to 70");
}
if (slice.Time.Date == new DateTime(2014, 06, 06))
{
MarketOrder(contract.Symbol, 1);
MarketOnCloseOrder(contract.Symbol, -1);
}
}
}
}
}
}
/// <summary>
/// Order fill event handler. On an order fill update the resulting information is passed to this method.
/// </summary>
/// <param name="orderEvent">Order event details containing details of the evemts</param>
/// <remarks>This method can be called asynchronously and so should only be used by seasoned C# experts. Ensure you use proper locks on thread-unsafe objects</remarks>
public override void OnOrderEvent(OrderEvent orderEvent)
{
Log(orderEvent.ToString());
}
/// <summary>
/// This is used by the regression test system to indicate if the open source Lean repository has the required data to run this algorithm.
/// </summary>
public bool CanRunLocally { get; } = true;
/// <summary>
/// This is used by the regression test system to indicate which languages this algorithm is written in.
/// </summary>
public Language[] Languages { get; } = { Language.CSharp, Language.Python };
/// <summary>
/// This is used by the regression test system to indicate what the expected statistics are from running the algorithm
/// </summary>
public Dictionary<string, string> ExpectedStatistics => new Dictionary<string, string>
{
{"Total Trades", "2"},
{"Average Win", "0%"},
{"Average Loss", "-0.01%"},
{"Compounding Annual Return", "-2.072%"},
{"Drawdown", "0.000%"},
{"Expectancy", "-1"},
{"Net Profit", "-0.010%"},
{"Sharpe Ratio", "-11.225"},
{"Loss Rate", "100%"},
{"Win Rate", "0%"},
{"Profit-Loss Ratio", "0"},
{"Alpha", "0"},
{"Beta", "-0.036"},
{"Annual Standard Deviation", "0.001"},
{"Annual Variance", "0"},
{"Information Ratio", "-11.225"},
{"Tracking Error", "0.033"},
{"Treynor Ratio", "0.355"},
{"Total Fees", "$2.00"}
};
}
}
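The OnData loop above picks a single contract out of the option chain by strike, right, and expiry before comparing its open interest against the expected values. A minimal Python sketch of that selection step (a hypothetical dict layout for illustration, not the Lean API):

```python
from datetime import date

def find_contract(chain, strike, right, expiry):
    # Return the first contract matching strike, right, and expiry, else None.
    for c in chain:
        if c["strike"] == strike and c["right"] == right and c["expiry"] == expiry:
            return c
    return None

chain = [
    {"strike": 70.0, "right": "Call", "expiry": date(2016, 1, 15), "open_interest": 10},
    {"strike": 72.5, "right": "Call", "expiry": date(2016, 1, 15), "open_interest": 50},
]
hit = find_contract(chain, 72.5, "Call", date(2016, 1, 15))
print(hit["open_interest"])  # → 50
```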
| {
"content_hash": "e42b1ba58b1c526975a8b1d9b3c311ea",
"timestamp": "",
"source": "github",
"line_count": 114,
"max_line_length": 175,
"avg_line_length": 43.05263157894737,
"alnum_prop": 0.5270986145069275,
"repo_name": "Jay-Jay-D/LeanSTP",
"id": "9e9c2791af4b250eb678e117e704ac3051528a20",
"size": "5615",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "Algorithm.CSharp/OptionOpenInterestRegressionAlgorithm.cs",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Batchfile",
"bytes": "2540"
},
{
"name": "C#",
"bytes": "17013438"
},
{
"name": "Dockerfile",
"bytes": "1229"
},
{
"name": "F#",
"bytes": "1723"
},
{
"name": "HTML",
"bytes": "2607907"
},
{
"name": "Java",
"bytes": "852"
},
{
"name": "Jupyter Notebook",
"bytes": "22467"
},
{
"name": "Python",
"bytes": "852757"
},
{
"name": "Shell",
"bytes": "2307"
},
{
"name": "Visual Basic",
"bytes": "2448"
}
],
"symlink_target": ""
} |
import {AmpDocSingle, installDocService} from '#service/ampdoc-impl';
import {Animation} from '../../src/animation';
import {FakeMutationObserver, FakeWindow} from '#testing/fake-dom';
import {FixedLayer} from '#service/fixed-layer';
import {Services} from '#service';
import {endsWith} from '#core/types/string';
import {installHiddenObserverForDoc} from '#service/hidden-observer-impl';
import {installPlatformService} from '#service/platform-impl';
import {installTimerService} from '#service/timer-impl';
import {installViewerServiceForDoc} from '#service/viewer-impl';
import {toggle} from '#core/dom/style';
import {user} from '../../src/log';
describes.sandboxed('FixedLayer', {}, (env) => {
let parentApi;
let documentApi;
let ampdoc;
let viewer;
let vsyncApi;
let vsyncTasks;
let docBody, docElem;
let element1;
let element2;
let element3;
let element4;
let element5;
let element6;
let allRules;
let timer;
beforeEach(() => {
parentApi = {};
documentApi = {};
allRules = {};
docBody = createElement('docBody');
docBody.id = 'doc-body-id';
docElem = createElement('docElem');
element1 = createElement('element1');
element2 = createElement('element2');
element3 = createElement('element3');
element4 = createElement('element4');
element5 = createElement('element5');
element6 = createElement('element6');
docBody.appendChild(element1);
docBody.appendChild(element2);
docBody.appendChild(element3);
docBody.appendChild(element4);
docBody.appendChild(element5);
docBody.appendChild(element6);
const invalidRule = createValidRule('#invalid', 'fixed', [
element1,
element3,
]);
Object.assign(documentApi, {
styleSheets: [
// Will be ignored due to being a link.
{
ownerNode: document.createElement('link'),
cssRules: [invalidRule],
},
// Will be ignored because it's disabled
{
ownerNode: createStyleNode(),
disabled: true,
cssRules: [invalidRule],
},
// Will be ignored because it's a boilerplate
{
ownerNode: createStyleNode('amp-boilerplate'),
cssRules: [invalidRule],
},
// Will be ignored because it's a runtime
{
ownerNode: createStyleNode('amp-runtime'),
cssRules: [invalidRule],
},
// Will be ignored because it's an extension
{
ownerNode: createStyleNode('amp-extension', 'amp-fit-text'),
cssRules: [invalidRule],
},
// Valid stylesheet with amp-custom
{
ownerNode: createStyleNode('amp-custom'),
cssRules: [
createValidRule('#amp-custom-rule1', 'fixed', [element1]),
createValidRule('#doc-body-id #amp-custom-rule1', 'fixed', [
element1,
]),
createValidRule('#amp-custom-rule2', 'fixed', [element1, element2]),
createUnrelatedRule('#amp-custom-rule3', [element3]),
createValidRule('#amp-custom-rule4', 'sticky', [
element2,
element4,
]),
createValidRule('#amp-custom-rule5', '-webkit-sticky', [element5]),
{
type: 4,
cssRules: [
createValidRule('#amp-custom-media-rule1', 'fixed', [element1]),
],
},
{
type: 12,
cssRules: [
createValidRule('#amp-custom-supports-rule1', 'fixed', [
element2,
]),
],
},
// Unknown rule.
{
type: 3,
},
],
},
// Valid stylesheet without amp-custom
{
ownerNode: createStyleNode(),
cssRules: [
createValidRule('#other-rule1', 'fixed', [element1]),
createValidRule('#other-rule2', 'fixed', [element2]),
createUnrelatedRule('#other-rule3', [element1, element3]),
],
},
],
querySelectorAll: (selector) => {
if (!allRules[selector]) {
return null;
}
return allRules[selector].elements;
},
contains: (elem) => {
return !!elem.parentElement;
},
defaultView: {
setTimeout: window.setTimeout,
clearTimeout: window.clearTimeout,
Promise: window.Promise,
MutationObserver: FakeMutationObserver,
getComputedStyle: (elem) => {
return elem.computedStyle;
},
navigator: window.navigator,
location: window.location,
cookie: '',
},
createElement: (name) => {
return createElement(name);
},
documentElement: docElem,
body: docBody,
parent: parentApi,
});
documentApi.defaultView.document = documentApi;
installDocService(documentApi.defaultView, /* isSingleDoc */ true);
ampdoc = Services.ampdocServiceFor(documentApi.defaultView).getSingleDoc();
installHiddenObserverForDoc(ampdoc);
installPlatformService(documentApi.defaultView);
installTimerService(documentApi.defaultView);
installViewerServiceForDoc(ampdoc);
timer = Services.timerFor(documentApi.defaultView);
viewer = Services.viewerForDoc(ampdoc);
viewer.isEmbedded = () => true;
vsyncTasks = [];
vsyncApi = {
runPromise: (task) => {
vsyncTasks.push(task);
return Promise.resolve();
},
mutate: (mutator) => {
vsyncTasks.push({mutate: mutator});
},
};
});
class FakeAttributes {
constructor() {
const attrs = [];
attrs.setNamedItem = function ({name, value}) {
for (let i = 0; i < this.length; i++) {
if (this[i].name === name) {
this[i].value = value;
return;
}
}
this.push(new FakeAttr(name, value));
};
return attrs;
}
}
class FakeAttr {
constructor(name, value) {
this.name = name;
this.value = value;
}
cloneNode() {
return new FakeAttr(this.name, this.value);
}
}
function createElement(id) {
const children = [];
const elem = {
id,
nodeType: 1,
ownerDocument: documentApi,
autoTop: '',
tagName: id,
toString: () => {
return id;
},
style: {
_top: '15px',
_bottom: '',
_position: '',
_opacity: '0.9',
_visibility: 'visible',
_transition: '',
get top() {
return this._top;
},
set top(v) {
elem.style.setProperty('top', v);
},
get bottom() {
return this._bottom;
},
set bottom(v) {
elem.style.setProperty('bottom', v);
},
get position() {
return this._position;
},
set position(v) {
elem.style.setProperty('position', v);
},
get opacity() {
return this._opacity;
},
set opacity(v) {
elem.style.setProperty('opacity', v);
},
get visibility() {
return this._visibility;
},
set visibility(v) {
elem.style.setProperty('visibility', v);
},
get transition() {
return this._transition;
},
set transition(v) {
elem.style.setProperty('transition', v);
},
setProperty(prop, value, priority) {
const privProp = '_' + prop;
// Override if important
if (priority === 'important') {
elem.style[privProp] = `${value} !${priority}`;
} else if (
elem.style[privProp] ||
!endsWith(elem.computedStyle[prop], '!important')
) {
if (
prop === 'transition' &&
!value &&
endsWith(elem.style[privProp] || '', '!important')
) {
// Emulate a stupid Safari bug.
// noop.
} else {
// If element style is already set, we can override
// Or, if computed style is not important priority
elem.style[privProp] = value;
}
}
},
},
computedStyle: {
opacity: '0.9',
visibility: 'visible',
_top: '',
get top() {
if (
elem.computedStyle.transition &&
elem.style.transition !== '' &&
elem.style.transition !== 'none !important'
) {
return this._oldTop;
}
if (elem.style.bottom) {
return elem.autoTop || this._top;
}
return this._top;
},
set top(val) {
this._oldTop = this._top;
this._top = val;
},
bottom: '',
zIndex: '',
transform: '',
position: '',
transition: '',
},
matches: () => true,
compareDocumentPosition: (other) => {
if (other.id > id) {
return Node.DOCUMENT_POSITION_FOLLOWING;
}
return Node.DOCUMENT_POSITION_PRECEDING;
},
contains(elem) {
let el = elem;
while (el) {
if (el === this) {
return true;
}
el = el.parentElement;
}
return false;
},
hasAttribute(name) {
for (let i = 0; i < this.attributes.length; i++) {
if (this.attributes[i].name === name) {
return true;
}
}
return false;
},
getAttribute(name) {
for (let i = 0; i < this.attributes.length; i++) {
if (this.attributes[i].name === name) {
return this.attributes[i].value;
}
}
return null;
},
setAttribute(name, value) {
if (name.includes('[')) {
throw new Error('invalid characters');
}
for (let i = 0; i < this.attributes.length; i++) {
if (this.attributes[i].name === name) {
this.attributes[i].value = value;
return;
}
}
this.attributes.push(new FakeAttr(name, value));
},
removeAttribute(name) {
for (let i = 0; i < this.attributes.length; i++) {
if (this.attributes[i].name === name) {
this.attributes.splice(i, 1);
return;
}
}
},
attributes: new FakeAttributes(),
appendChild: (child) => {
child.parentElement = elem;
children.push(child);
},
removeChild: (child) => {
const index = children.indexOf(child);
if (index != -1) {
children.splice(index, 1);
}
child.parentElement = null;
},
replaceChild: (newChild, oldChild) => {
oldChild.parentElement = null;
if (children.indexOf(oldChild) != -1) {
children.splice(children.indexOf(oldChild), 1);
}
newChild.parentElement = elem;
children.push(newChild);
},
cloneNode() {
return createElement(this.id);
},
get offsetTop() {
return parseFloat(elem.computedStyle.top);
},
set innerHTML(html) {
if (html === '') {
this.firstElementChild = null;
return;
}
// Updating the placeholder means we have to update the tests.
expect(html.trim()).to.equal(
'<i-amphtml-fpa style="display: none"></i-amphtml-fpa>'
);
this.firstElementChild = createElement('i-amphtml-fpa');
toggle(this.firstElementChild, false);
},
getRootNode() {
return documentApi;
},
dispatchEvent() {},
};
return elem;
}
function createStyleNode(attr, value) {
const node = document.createElement('style');
if (attr) {
node.setAttribute(attr, value || '');
}
return node;
}
function createValidRule(selector, position, elements) {
const rule = {
type: 1,
selectorText: selector,
style: {position},
elements,
};
if (allRules[selector]) {
throw new Error('dup selector');
}
allRules[selector] = rule;
return rule;
}
function createUnrelatedRule(selector, elements) {
const rule = {
type: 1,
selectorText: selector,
style: {},
elements,
};
if (allRules[selector]) {
throw new Error('dup selector');
}
allRules[selector] = rule;
return rule;
}
// TODO(jridgewell, #11827): Make this test work on Safari.
describe
.configure()
.skipSafari()
.run('no-transfer', () => {
let fixedLayer;
beforeEach(() => {
fixedLayer = new FixedLayer(
ampdoc,
vsyncApi,
/* borderTop */ 0,
/* paddingTop */ 11,
/* transfer */ false
);
fixedLayer.setup();
});
it('should initialize fixed layer to null', () => {
expect(fixedLayer.transferLayer_).to.be.null;
});
it('should discover all potentials', () => {
function expectFe(actual, expected) {
expect(actual.id).to.equal(expected.id, `${expected.id} wrong`);
expect(actual.element).to.equal(
expected.element,
`${expected.id}: wrong element`
);
expect(actual.position).to.equal(
expected.position,
`${expected.id}: wrong position`
);
expect(JSON.stringify(actual.selectors)).to.equal(
JSON.stringify(expected.selectors),
`${expected.id}: wrong selectors`
);
}
expect(fixedLayer.elements_).to.have.length(5);
expectFe(fixedLayer.elements_[0], {
id: 'F0',
element: element1,
position: 'fixed',
selectors: [
'#amp-custom-rule1',
'#doc-body-id #amp-custom-rule1',
'#amp-custom-rule2',
'#amp-custom-media-rule1',
'#other-rule1',
],
});
expectFe(fixedLayer.elements_[1], {
id: 'F1',
element: element2,
position: 'fixed',
selectors: [
'#amp-custom-rule2',
'#amp-custom-supports-rule1',
'#other-rule2',
],
});
expectFe(fixedLayer.elements_[2], {
id: 'F2',
element: element2,
position: 'sticky',
selectors: ['#amp-custom-rule4'],
});
expectFe(fixedLayer.elements_[3], {
id: 'F3',
element: element4,
position: 'sticky',
selectors: ['#amp-custom-rule4'],
});
expectFe(fixedLayer.elements_[4], {
id: 'F4',
element: element5,
position: 'sticky',
selectors: ['#amp-custom-rule5'],
});
expect(fixedLayer.isDeclaredFixed(element1)).to.be.true;
expect(fixedLayer.isDeclaredSticky(element1)).to.be.false;
expect(fixedLayer.isDeclaredFixed(element2)).to.be.true;
expect(fixedLayer.isDeclaredSticky(element2)).to.be.true;
expect(fixedLayer.isDeclaredFixed(element3)).to.be.false;
expect(fixedLayer.isDeclaredSticky(element3)).to.be.false;
expect(fixedLayer.isDeclaredSticky(element4)).to.be.true;
expect(fixedLayer.isDeclaredSticky(element5)).to.be.true;
});
it('should add and remove element directly', () => {
const updateStub = env.sandbox.stub(fixedLayer, 'update');
expect(fixedLayer.elements_).to.have.length(5);
// Add.
fixedLayer.addElement(element6);
expect(updateStub).to.be.calledOnce;
expect(fixedLayer.elements_).to.have.length(6);
const fe = fixedLayer.elements_[5];
expect(fe.id).to.equal('F5');
expect(fe.element).to.equal(element6);
expect(fe.selectors).to.deep.equal(['*']);
expect(fe.forceTransfer).to.be.undefined;
// Remove.
fixedLayer.removeElement(element6);
expect(fixedLayer.elements_).to.have.length(5);
// Add with forceTransfer=true.
fixedLayer.addElement(element6, true);
expect(updateStub).to.have.callCount(2);
expect(fixedLayer.elements_).to.have.length(6);
const fe1 = fixedLayer.elements_[5];
expect(fe1.id).to.equal('F6');
expect(fe1.element).to.equal(element6);
expect(fe1.selectors).to.deep.equal(['*']);
expect(fe1.forceTransfer).to.be.true;
// Remove.
fixedLayer.removeElement(element6);
expect(fixedLayer.elements_).to.have.length(5);
// Add with forceTransfer=false.
fixedLayer.addElement(element6, false);
expect(updateStub).to.have.callCount(3);
expect(fixedLayer.elements_).to.have.length(6);
const fe2 = fixedLayer.elements_[5];
expect(fe2.id).to.equal('F7');
expect(fe2.element).to.equal(element6);
expect(fe2.selectors).to.deep.equal(['*']);
expect(fe2.forceTransfer).to.be.false;
// Remove.
fixedLayer.removeElement(element6);
expect(fixedLayer.elements_).to.have.length(5);
});
it('should remove node when it disappears from the DOM', () => {
docBody.removeChild(element1);
expect(fixedLayer.elements_).to.have.length(5);
fixedLayer.update();
expect(fixedLayer.elements_).to.have.length(4);
});
it('should remove all candidates', () => {
// element2 is both fixed and sticky.
docBody.removeChild(element2);
expect(fixedLayer.elements_).to.have.length(5);
fixedLayer.update();
expect(fixedLayer.elements_).to.have.length(3);
});
it('should collect updates', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element5.computedStyle['position'] = 'sticky';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
// F0: element1
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].sticky).to.be.false;
expect(state['F0'].top).to.equal('');
expect(state['F0'].zIndex).to.equal('');
// F1: element2
expect(state['F1'].fixed).to.be.false;
expect(state['F1'].sticky).to.be.false;
// F2: element3
expect(state['F2'].fixed).to.be.false;
expect(state['F2'].sticky).to.be.false;
// F3: element4
expect(state['F3'].fixed).to.be.false;
expect(state['F3'].sticky).to.be.false;
expect(state['F3'].top).to.equal('');
expect(state['F3'].zIndex).to.equal('');
// F4: element5
expect(state['F4'].fixed).to.be.false;
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('');
expect(state['F4'].zIndex).to.equal('');
});
it('should support vendor-based sticky', () => {
element5.computedStyle['position'] = '-webkit-sticky';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F4'].sticky).to.be.true;
});
it('should disregard non-fixed position', () => {
element1.computedStyle['position'] = 'static';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element5.computedStyle['position'] = 'static';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.false;
expect(state['F1'].fixed).to.be.false;
expect(state['F4'].fixed).to.be.false;
});
it('should disregard invisible element, but for fixed only', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 0;
element1.offsetHeight = 0;
element5.computedStyle['position'] = 'sticky';
element5.offsetWidth = 0;
element5.offsetHeight = 0;
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.false;
expect(state['F1'].fixed).to.be.false;
expect(state['F4'].sticky).to.be.true;
});
it('should disregard display:none element', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element1.computedStyle['display'] = 'none';
element5.computedStyle['position'] = 'sticky';
element5.offsetWidth = 10;
element5.offsetHeight = 10;
element5.computedStyle['display'] = 'none';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.false;
expect(state['F1'].fixed).to.be.false;
expect(state['F4'].sticky).to.be.false;
});
it('should tolerate getComputedStyle = null', () => {
// See #3096 and https://bugzilla.mozilla.org/show_bug.cgi?id=548397
documentApi.defaultView.getComputedStyle = () => null;
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.false;
expect(state['F0'].sticky).to.be.false;
expect(state['F0'].transferrable).to.be.false;
expect(state['F0'].top).to.equal('');
expect(state['F0'].zIndex).to.equal('');
expect(state['F1'].fixed).to.be.false;
expect(state['F1'].sticky).to.be.false;
});
it('should collect for top != auto', () => {
element1.computedStyle['position'] = 'fixed';
element1.computedStyle['top'] = '11px';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['top'] = '11px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].sticky).to.be.false;
expect(state['F0'].top).to.equal('11px');
expect(state['F4'].fixed).to.be.false;
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('11px');
});
it('should collect for top = auto, but not update top', () => {
element1.computedStyle['position'] = 'fixed';
element1.computedStyle['top'] = 'auto';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['top'] = 'auto';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].sticky).to.be.false;
expect(state['F0'].top).to.equal('');
expect(state['F4'].fixed).to.be.false;
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('');
});
it('should work around top=0 for sticky', () => {
// See http://crbug.com/703816.
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['top'] = '0px';
element5.autoTop = '12px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('');
});
it('should work around top=0 for sticky when offset = 0', () => {
// See http://crbug.com/703816.
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['top'] = '0px';
element5.autoTop = '0px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('0px');
});
it('should NOT work around top=0 for sticky for non-implicit top', () => {
// See http://crbug.com/703816.
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['top'] = '0px';
element5.autoTop = '12px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('');
});
it('should collect for implicit top = auto, but not update top', () => {
element1.computedStyle['position'] = 'fixed';
element1.computedStyle['top'] = '0px';
element1.autoTop = '12px';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element5.computedStyle['position'] = 'sticky';
element5.autoTop = '12px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].top).to.equal('');
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('');
});
it('should override implicit top = auto to 0 when top equals padding', () => {
element1.computedStyle['position'] = 'fixed';
element1.computedStyle['top'] = '11px';
element1.autoTop = '0px';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['top'] = '11px';
element5.autoTop = '11px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].top).to.equal('0px');
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('11px');
});
it('should override implicit top = auto to 0 and padding + border', () => {
fixedLayer.borderTop_ = 1;
element1.computedStyle['position'] = 'fixed';
element1.computedStyle['top'] = '12px';
element1.autoTop = '0px';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['top'] = '12px';
element5.autoTop = '12px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].top).to.equal('0px');
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('12px');
});
it('should override implicit top = auto to 0 w/transient padding', () => {
element1.computedStyle['position'] = 'fixed';
element1.computedStyle['top'] = '11px';
element1.autoTop = '0px';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['top'] = '11px';
element5.autoTop = '11px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].top).to.equal('0px');
// Update to transient padding.
env.sandbox.stub(fixedLayer, 'update').callsFake(() => {});
fixedLayer.updatePaddingTop(22, /* transient */ true);
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].top).to.equal('0px');
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('11px');
expect(fixedLayer.paddingTop_).to.equal(22);
expect(fixedLayer.committedPaddingTop_).to.equal(11);
// Update to non-transient padding.
fixedLayer.updatePaddingTop(22, /* transient */ false);
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].top).to.equal(''); // Reset completely.
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('11px');
expect(fixedLayer.paddingTop_).to.equal(22);
expect(fixedLayer.committedPaddingTop_).to.equal(22);
});
it('should always collect and update top = 0', () => {
element1.computedStyle['position'] = 'fixed';
element1.computedStyle['top'] = '0px';
element1.autoTop = '0px';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['top'] = '0px';
element5.autoTop = '0px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].top).to.equal('0px');
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('0px');
});
it('should handle transitions', () => {
element1.computedStyle['position'] = 'fixed';
element1.computedStyle['transition'] = 'all .4s ease';
element1.computedStyle['top'] = '0px';
element1.autoTop = '0px';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['transition'] = 'all .4s ease';
element5.computedStyle['top'] = '0px';
element5.autoTop = '0px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].top).to.equal('0px');
expect(element1.style.transition).to.equal('none !important');
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].top).to.equal('0px');
expect(element5.style.transition).to.equal('none !important');
vsyncTasks[0].mutate({});
expect(element1.style.transition).to.equal('');
expect(element5.style.transition).to.equal('');
});
it('should mutate element to fixed without top', () => {
const fe = fixedLayer.elements_[0];
fixedLayer.mutateElement_(fe, 1, {
fixed: true,
sticky: false,
top: '',
});
expect(fe.fixedNow).to.be.true;
expect(fe.stickyNow).to.be.false;
expect(fe.element.style.top).to.equal('15px');
expect(fixedLayer.transferLayer_).to.be.null;
});
it('should mutate element to sticky without top', () => {
const fe = fixedLayer.elements_[4];
fixedLayer.mutateElement_(fe, 1, {
fixed: false,
sticky: true,
top: '',
});
expect(fe.fixedNow).to.be.false;
expect(fe.stickyNow).to.be.true;
expect(fe.element.style.top).to.equal('15px');
expect(fixedLayer.transferLayer_).to.be.null;
});
it('should mutate element to fixed with top', () => {
const fe = fixedLayer.elements_[0];
element1.style.top = '';
fixedLayer.mutateElement_(fe, 1, {
fixed: true,
top: '17px',
});
expect(fe.fixedNow).to.be.true;
expect(fe.element.style.top).to.equal('calc(17px + 11px)');
});
it('should add needed padding to sticky top if transferring', () => {
const fe = fixedLayer.elements_[4];
fixedLayer.transfer_ = true;
fe.element.style.top = '';
fixedLayer.mutateElement_(fe, 1, {
sticky: true,
top: '17px',
});
expect(fe.stickyNow).to.be.true;
expect(fe.element.style.top).to.equal('17px');
});
it('should not add unneeded padding to sticky top if transferring', () => {
const fe = fixedLayer.elements_[4];
fixedLayer.transfer_ = true;
fixedLayer.paddingTop_ = 0;
fe.element.style.top = '';
fixedLayer.mutateElement_(fe, 1, {
sticky: true,
top: '17px',
});
expect(fe.stickyNow).to.be.true;
expect(fe.element.style.top).to.equal('calc(17px - 11px)');
});
it('should mutate element to sticky with top', () => {
const fe = fixedLayer.elements_[4];
fixedLayer.mutateElement_(fe, 1, {
sticky: true,
top: '17px',
});
expect(fe.stickyNow).to.be.true;
expect(fe.element.style.top).to.equal('calc(17px + 11px)');
});
it('should reset top upon being removed from FixedLayer', () => {
expect(fixedLayer.elements_).to.have.length(5);
// Add.
fixedLayer.addElement(element6, '*');
expect(fixedLayer.elements_).to.have.length(6);
const fe = fixedLayer.elements_[5];
expect(fe.id).to.equal('F5');
expect(fe.element).to.equal(element6);
expect(fe.selectors).to.deep.equal(['*']);
fixedLayer.mutateElement_(fe, 1, {
fixed: true,
top: '17px',
});
expect(fe.fixedNow).to.be.true;
expect(fe.element.style.top).to.equal('calc(17px + 11px)');
// Remove.
fixedLayer.vsync_ = {
mutate(callback) {
callback();
},
};
fixedLayer.removeElement(element6);
expect(fixedLayer.elements_).to.have.length(5);
expect(element6.style.top).to.equal('');
});
it('should reset sticky top upon being removed from FixedLayer', () => {
expect(fixedLayer.elements_).to.have.length(5);
const fe = fixedLayer.elements_[4];
fixedLayer.mutateElement_(fe, 1, {
sticky: true,
top: '17px',
});
expect(fe.stickyNow).to.be.true;
expect(fe.element.style.top).to.equal('calc(17px + 11px)');
// Remove.
fixedLayer.vsync_ = {
mutate(callback) {
callback();
},
};
fixedLayer.removeElement(element5);
expect(fixedLayer.elements_).to.have.length(4);
expect(element5.style.top).to.equal('');
});
it('should transform fixed elements with anchored top', () => {
const fe = fixedLayer.elements_[0];
fixedLayer.mutateElement_(fe, 1, {
fixed: true,
top: '17px',
});
expect(fe.fixedNow).to.be.true;
expect(fe.element.style.top).to.equal('calc(17px + 11px)');
fixedLayer.transformMutate('translateY(-10px)');
expect(fe.element.style.transform).to.equal('translateY(-10px)');
expect(fe.element.style.transition).to.equal('none');
// Reset back.
fixedLayer.transformMutate(null);
expect(fe.element.style.transform).to.equal('');
expect(fe.element.style.transition).to.equal('');
});
it('should NOT transform sticky elements with anchored top', () => {
const fe = fixedLayer.elements_[4];
fixedLayer.mutateElement_(fe, 1, {
sticky: true,
top: '17px',
});
expect(fe.stickyNow).to.be.true;
expect(fe.element.style.top).to.equal('calc(17px + 11px)');
fixedLayer.transformMutate('translateY(-10px)');
expect(fe.element.style.transform).to.be.undefined;
expect(fe.element.style.transition).to.equal('');
// Reset back.
fixedLayer.transformMutate(null);
expect(fe.element.style.transform).to.be.undefined;
expect(fe.element.style.transition).to.equal('');
});
it('should compound transform with anchored top', () => {
const fe = fixedLayer.elements_[0];
fixedLayer.mutateElement_(fe, 1, {
fixed: true,
top: '17px',
transform: 'scale(2)',
});
fixedLayer.transformMutate('translateY(-10px)');
expect(fe.element.style.transform).to.equal(
'scale(2) translateY(-10px)'
);
});
it('should NOT transform fixed elements w/o anchored top', () => {
const fe = fixedLayer.elements_[0];
fe.element.style.transform = '';
fixedLayer.mutateElement_(fe, 1, {
fixed: true,
top: '',
});
expect(fe.fixedNow).to.be.true;
fixedLayer.transformMutate('translateY(-10px)');
expect(fe.element.style.transform).to.equal('');
});
it('should report a user error when inline styles may be overridden', () => {
// Set both attribute and property since element1 is a fake element.
element1.setAttribute('style', 'bottom: 10px');
element1.style.bottom = '10px';
const userError = env.sandbox.stub(user(), 'error');
fixedLayer.setup();
// Expect error regarding inline styles.
expect(userError).calledWithMatch(
'FixedLayer',
/not supported yet for fixed or sticky elements/
);
});
describe('hidden toggle', () => {
let mutationObserver;
beforeEach(() => {
fixedLayer.observeHiddenMutations();
mutationObserver = Services.hiddenObserverForDoc(
documentApi.documentElement
).mutationObserver_;
});
it('should trigger an update', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element1.computedStyle['display'] = 'none';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.false;
element1.computedStyle['display'] = '';
env.sandbox.stub(timer, 'delay').callsFake((callback) => {
callback();
});
return mutationObserver
.__mutate({
attributeName: 'hidden',
})
.then(() => {
expect(vsyncTasks).to.have.length(2);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].top).to.equal('');
expect(state['F0'].zIndex).to.equal('');
});
});
});
it('should ignore descendants of already-tracked elements', () => {
const updateStub = env.sandbox.stub(fixedLayer, 'update');
expect(fixedLayer.elements_).to.have.length(5);
element1.appendChild(element6);
// Add.
fixedLayer.addElement(element6);
expect(updateStub).not.to.have.been.called;
expect(fixedLayer.elements_).to.have.length(5);
});
it('should replace descendants of tracked elements', () => {
const updateStub = env.sandbox.stub(fixedLayer, 'update');
expect(fixedLayer.elements_).to.have.length(5);
element6.appendChild(element1);
// Add.
fixedLayer.addElement(element6);
expect(updateStub).to.be.calledOnce;
expect(fixedLayer.elements_).to.have.length(5);
const fe = fixedLayer.elements_[4];
expect(fe.id).to.equal('F5');
expect(fe.element).to.equal(element6);
expect(fe.selectors).to.deep.equal(['*']);
expect(fe.forceTransfer).to.be.undefined;
});
});
describe('with-transfer', () => {
let fixedLayer;
const borderTop = 0;
const paddingTop = 11;
const transfer = true;
beforeEach(() => {
installViewerServiceForDoc(ampdoc);
viewer = Services.viewerForDoc(ampdoc);
viewer.isEmbedded = () => true;
fixedLayer = new FixedLayer(
ampdoc,
vsyncApi,
borderTop,
paddingTop,
transfer
);
fixedLayer.setup();
});
it('should not instantiate transfer layer on setup', () => {
const win = new FakeWindow();
const ampdoc = new AmpDocSingle(win);
installPlatformService(win);
installTimerService(win);
installViewerServiceForDoc(ampdoc);
viewer = Services.viewerForDoc(ampdoc);
viewer.isEmbedded = () => true;
const fixedLayer = new FixedLayer(
ampdoc,
vsyncApi,
borderTop,
paddingTop,
transfer
);
fixedLayer.setup();
expect(fixedLayer.transferLayer_).to.be.null;
});
it('should collect and turn off transferrable', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element5.computedStyle['position'] = 'sticky';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].transferrable).to.be.false;
expect(state['F1'].fixed).to.be.false;
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].transferrable).to.be.false;
});
it('should collect and turn on transferrable with top = 0', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element1.computedStyle['top'] = '0px';
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['top'] = '0px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].transferrable).to.be.true;
expect(state['F0'].top).to.equal('0px');
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].transferrable).to.be.false;
expect(state['F4'].top).to.equal('0px');
});
it('should collect and turn on transferrable with top != 0', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element1.computedStyle['top'] = '2px';
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['top'] = '2px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].transferrable).to.be.true;
expect(state['F0'].top).to.equal('2px');
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].transferrable).to.be.false;
expect(state['F4'].top).to.equal('2px');
});
it('should collect and turn on transferrable with bottom = 0', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element1.computedStyle['top'] = '';
element1.computedStyle['bottom'] = '0px';
element5.computedStyle['position'] = 'sticky';
element5.computedStyle['top'] = '';
element5.computedStyle['bottom'] = '0px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].transferrable).to.be.true;
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].transferrable).to.be.false;
});
it('should not disregard invisible element if it has forceTransfer', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 0;
element1.offsetHeight = 0;
element5.computedStyle['position'] = 'sticky';
expect(vsyncTasks).to.have.length(1);
let state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.false;
expect(state['F0'].transferrable).to.be.false;
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].transferrable).to.be.false;
// Add.
state = {};
fixedLayer.setupElement_(element1, '*', 'fixed', true);
expect(vsyncTasks).to.have.length(1);
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].transferrable).to.be.true;
});
it('should disregard element if it has forceTransfer=false', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element1.computedStyle['top'] = '0px';
element5.computedStyle['position'] = 'sticky';
expect(vsyncTasks).to.have.length(1);
let state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].transferrable).to.be.true;
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].transferrable).to.be.false;
// Add.
state = {};
fixedLayer.setupElement_(element1, '*', 'fixed', false);
expect(vsyncTasks).to.have.length(1);
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].transferrable).to.be.false;
});
it('should collect and turn on transferrable with bottom != 0', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element1.style['top'] = '';
element1.computedStyle['top'] = '';
element1.computedStyle['bottom'] = '2px';
element5.computedStyle['position'] = 'sticky';
element5.style['top'] = '';
element5.computedStyle['top'] = '';
element5.computedStyle['bottom'] = '2px';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].transferrable).to.be.true;
expect(state['F4'].sticky).to.be.true;
expect(state['F4'].transferrable).to.be.false;
});
it('should collect z-index', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element1.computedStyle['zIndex'] = '101';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].zIndex).to.equal('101');
});
it('should transfer element', () => {
const fe = fixedLayer.elements_[0];
fixedLayer.mutateElement_(fe, 1, {
fixed: true,
transferrable: true,
zIndex: '11',
});
expect(fe.fixedNow).to.be.true;
expect(fe.placeholder).to.exist;
expect(fe.placeholder).to.have.attribute('hidden');
expect(fixedLayer.transferLayer_).to.exist;
const layer = fixedLayer.transferLayer_.layer_;
expect(layer.style['pointerEvents']).to.equal('none');
expect(fe.element.parentElement).to.equal(layer);
expect(fe.element.style['pointer-events']).to.equal('initial');
expect(fe.element.style['zIndex']).to.equal('calc(10001 + 11)');
});
it('should ignore transfer when non-transferrable', () => {
const fe = fixedLayer.elements_[0];
fixedLayer.mutateElement_(fe, 1, {
fixed: true,
transferrable: false,
});
expect(fe.fixedNow).to.be.true;
expect(fe.placeholder).to.not.exist;
expect(fixedLayer.transferLayer_).to.not.exist;
});
it('should return transferred element if it no longer matches', () => {
const fe = fixedLayer.elements_[0];
fe.element.matches = () => false;
fixedLayer.mutateElement_(fe, 1, {
fixed: true,
transferrable: true,
zIndex: '11',
});
expect(fe.fixedNow).to.be.true;
expect(fe.placeholder).to.exist;
expect(fixedLayer.transferLayer_).to.exist;
expect(fe.element.parentElement).to.not.equal(
fixedLayer.transferLayer_.layer_
);
expect(fe.placeholder.parentElement).to.be.null;
expect(fe.element.style.zIndex).to.equal('');
});
it('should remove transferred element if it no longer exists', () => {
const fe = fixedLayer.elements_[0];
// Add.
fixedLayer.mutateElement_(fe, 1, {
fixed: true,
transferrable: true,
zIndex: '11',
});
expect(fe.fixedNow).to.be.true;
expect(fe.placeholder).to.exist;
expect(fe.element.parentElement).to.equal(
fixedLayer.transferLayer_.layer_
);
expect(fixedLayer.transferLayer_).to.exist;
expect(fixedLayer.transferLayer_.layer_.id).to.equal('doc-body-id');
// Remove from DOM.
fe.element.parentElement.removeChild(fe.element);
fixedLayer.mutateElement_(fe, 1, {
fixed: false,
transferrable: false,
});
expect(fe.fixedNow).to.be.false;
expect(fe.placeholder.parentElement).to.not.exist;
});
it('should disregard transparent element', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element1.computedStyle['top'] = '0px';
element1.computedStyle['opacity'] = '0';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].transferrable).to.be.false;
});
it('should force transfer for visibility=hidden element', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element1.computedStyle['top'] = '0px';
element1.computedStyle['visibility'] = 'hidden';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].transferrable).to.be.true;
});
it('should report a user error when inline styles may be overridden', () => {
// Set both attribute and property since element1 is a fake element.
element1.setAttribute('style', 'bottom: 10px');
element1.style.bottom = '10px';
const userError = env.sandbox.stub(user(), 'error');
fixedLayer.setup();
// Expect error regarding inline styles.
expect(userError).calledWithMatch(
'FixedLayer',
/not supported yet for fixed or sticky elements/
);
});
describe('hidden toggle', () => {
let mutationObserver;
beforeEach(() => {
fixedLayer.observeHiddenMutations();
mutationObserver = Services.hiddenObserverForDoc(
documentApi.documentElement
).mutationObserver_;
});
it('should trigger an update', () => {
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element1.computedStyle['display'] = 'none';
expect(vsyncTasks).to.have.length(1);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.false;
element1.computedStyle['display'] = '';
env.sandbox.stub(timer, 'delay').callsFake((callback) => {
callback();
});
return mutationObserver
.__mutate({
attributeName: 'hidden',
})
.then(() => {
expect(vsyncTasks).to.have.length(2);
const state = {};
vsyncTasks[0].measure(state);
expect(state['F0'].fixed).to.be.true;
expect(state['F0'].top).to.equal('');
expect(state['F0'].zIndex).to.equal('');
});
});
});
it('should sync attributes between body and layer', () => {
const {body} = ampdoc.win.document;
// Necessary to create the fixed layer
element1.computedStyle['position'] = 'fixed';
element1.offsetWidth = 10;
element1.offsetHeight = 10;
element1.computedStyle['top'] = '0px';
fixedLayer.vsync_ = {
runPromise({measure, mutate}, state) {
measure(state);
mutate(state);
return Promise.resolve();
},
};
body.setAttribute('test', 'hello');
fixedLayer.update();
expect(fixedLayer.transferLayer_).to.exist;
const layer = fixedLayer.transferLayer_.layer_;
expect(layer.getAttribute('test')).to.equal('hello');
body.removeAttribute('test');
body.setAttribute('test1', 'hello1');
body.setAttribute('test2', 'hello2');
fixedLayer.update();
expect(layer.getAttribute('test')).to.equal(null);
expect(layer.getAttribute('test1')).to.equal('hello1');
expect(layer.getAttribute('test2')).to.equal('hello2');
body.removeAttribute('test1');
body.removeAttribute('test2');
fixedLayer.update();
expect(layer.getAttribute('test')).to.equal(null);
expect(layer.getAttribute('test1')).to.equal(null);
expect(layer.getAttribute('test2')).to.equal(null);
body.attributes.push(new FakeAttr('[class]', 'amp-bind'));
fixedLayer.update();
expect(layer.getAttribute('[class]')).to.equal('amp-bind');
// [i-amphtml-lightbox] is an exception and should be preserved.
layer.setAttribute('i-amphtml-lightbox', '');
fixedLayer.update();
expect(layer.hasAttribute('i-amphtml-lightbox')).to.be.true;
expect(body.hasAttribute('i-amphtml-lightbox')).to.be.false;
});
it('should sync invalid-named attributes to layer', () => {
// It is surprisingly hard to inject an invalid-named attribute.
// This is a sanity check using real Elements to ensure we can insert
// a cloned attribute into another element.
let div = document.createElement('div');
div.innerHTML = '<div [class]="amp-bind"></div>';
div = div.firstElementChild;
const attr = div.attributes[0];
expect(attr.name).to.equal('[class]');
expect(attr.value).to.equal('amp-bind');
const body = document.createElement('body');
body.attributes.setNamedItem(attr.cloneNode(false));
expect(div.getAttribute('[class]')).to.equal('amp-bind');
expect(body.getAttribute('[class]')).to.equal('amp-bind');
expect(body.attributes.getNamedItem('[class]')).to.not.equal(
div.attributes.getNamedItem('[class]')
);
});
});
});
describes.sandboxed('FixedLayer Setup Execution Bailouts', {}, () => {
let win;
let ampdoc;
let viewer;
let vsyncApi;
beforeEach(() => {
win = new FakeWindow();
window.__AMP_MODE = {
localDev: false,
development: false,
minified: false,
test: false,
version: '$internalRuntimeVersion$',
};
const vsyncTasks = [];
vsyncApi = {
runPromise: (task) => {
vsyncTasks.push(task);
return Promise.resolve();
},
mutate: (mutator) => {
vsyncTasks.push({mutate: mutator});
},
};
});
it('should not perform setup when served canonically', () => {
ampdoc = new AmpDocSingle(win);
installPlatformService(win);
installTimerService(win);
installViewerServiceForDoc(ampdoc);
viewer = Services.viewerForDoc(ampdoc);
viewer.isEmbedded = () => false;
const fixedLayer = new FixedLayer(
ampdoc,
vsyncApi,
/* borderTop */ 0,
/* paddingTop */ 11,
/* transfer */ false
);
const executed = fixedLayer.setup();
expect(executed).to.be.false;
});
it('should perform setup when served within a viewer', () => {
ampdoc = new AmpDocSingle(win);
installPlatformService(win);
installTimerService(win);
installViewerServiceForDoc(ampdoc);
viewer = Services.viewerForDoc(ampdoc);
viewer.isEmbedded = () => true;
const fixedLayer = new FixedLayer(
ampdoc,
vsyncApi,
/* borderTop */ 0,
/* paddingTop */ 11,
/* transfer */ false
);
const executed = fixedLayer.setup();
expect(executed).to.be.true;
});
});
describes.sandboxed(
'FixedLayer Setup Execution Bailouts with Local Development',
{},
() => {
let win;
let ampdoc;
let viewer;
let vsyncApi;
beforeEach(() => {
win = new FakeWindow();
window.__AMP_MODE = {
localDev: true,
development: false,
minified: false,
test: false,
version: '$internalRuntimeVersion$',
};
const vsyncTasks = [];
vsyncApi = {
runPromise: (task) => {
vsyncTasks.push(task);
return Promise.resolve();
},
mutate: (mutator) => {
vsyncTasks.push({mutate: mutator});
},
};
});
it('should perform setup when served canonically', () => {
ampdoc = new AmpDocSingle(win);
installPlatformService(win);
installTimerService(win);
installViewerServiceForDoc(ampdoc);
viewer = Services.viewerForDoc(ampdoc);
viewer.isEmbedded = () => false;
const fixedLayer = new FixedLayer(
ampdoc,
vsyncApi,
/* borderTop */ 0,
/* paddingTop */ 11,
/* transfer */ false
);
const executed = fixedLayer.setup();
expect(executed).to.be.true;
});
it('should perform setup when served within a viewer', () => {
ampdoc = new AmpDocSingle(win);
installPlatformService(win);
installTimerService(win);
installViewerServiceForDoc(ampdoc);
viewer = Services.viewerForDoc(ampdoc);
viewer.isEmbedded = () => true;
const fixedLayer = new FixedLayer(
ampdoc,
vsyncApi,
/* borderTop */ 0,
/* paddingTop */ 11,
/* transfer */ false
);
const executed = fixedLayer.setup();
expect(executed).to.be.true;
});
}
);
describes.sandboxed(
'FixedLayer bug due to calling browser native instead of AMP implementation',
{},
(env) => {
let win;
let ampdoc;
let viewer;
let vsyncApi;
beforeEach(() => {
win = new FakeWindow();
window.__AMP_MODE = {
localDev: true,
development: false,
minified: false,
test: false,
version: '$internalRuntimeVersion$',
};
const vsyncTasks = [];
vsyncApi = {
runPromise: async (task) => {
vsyncTasks.push(task);
},
mutate: (mutator) => {
vsyncTasks.push({mutate: mutator});
},
};
});
it('should call Animation.animate in animateFixedElements', () => {
const animateSpy = env.sandbox.spy(Animation, 'animate');
ampdoc = new AmpDocSingle(win);
installPlatformService(win);
installTimerService(win);
installViewerServiceForDoc(ampdoc);
viewer = Services.viewerForDoc(ampdoc);
viewer.isEmbedded = () => true;
const fixedLayer = new FixedLayer(
ampdoc,
vsyncApi,
/* borderTop */ 0,
/* paddingTop */ 11,
/* transfer */ false
);
fixedLayer.animateFixedElements(123, 456, 789, 'ease-in', false);
expect(animateSpy).to.be.calledOnce;
});
}
);
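The `vsyncApi` stubs in the tests above queue tasks into an array instead of scheduling them on real animation frames, so a test can run them synchronously. A minimal standalone sketch of that stubbing pattern (the `makeFakeVsync` name and `flush` helper are invented for illustration, not part of the AMP codebase):

```javascript
'use strict';

// A fake vsync service: instead of scheduling work on animation frames,
// it records the tasks so a test can flush them synchronously.
function makeFakeVsync() {
  const tasks = [];
  return {
    tasks,
    mutate(mutator) {
      tasks.push({mutate: mutator});
    },
    // Run every queued mutate callback in order, then clear the queue.
    flush() {
      for (const task of tasks) {
        if (task.mutate) task.mutate();
      }
      tasks.length = 0;
    },
  };
}

const vsync = makeFakeVsync();
let width = 0;
vsync.mutate(() => { width = 100; });
console.log(width); // 0 — nothing runs until the test flushes
vsync.flush();
console.log(width); // 100
```

Because the fake only records callbacks, assertions can be made both before and after the flush, which is exactly what makes this kind of stub useful for testing code that defers DOM mutations.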
| {
"content_hash": "4bd5f278019defe9cf3268a0e7ff96be",
"timestamp": "",
"source": "github",
"line_count": 1874,
"max_line_length": 81,
"avg_line_length": 31.533617929562432,
"alnum_prop": 0.5663519138998883,
"repo_name": "rsimha-amp/amphtml",
"id": "b11776b0d258f203785ab1d2f8c14c64323c6188",
"size": "59721",
"binary": false,
"copies": "1",
"ref": "refs/heads/2021-06-29-CrossBrowserTests",
"path": "test/unit/test-fixed-layer.js",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "CSS",
"bytes": "195514"
},
{
"name": "Go",
"bytes": "15254"
},
{
"name": "HTML",
"bytes": "1018438"
},
{
"name": "Java",
"bytes": "36670"
},
{
"name": "JavaScript",
"bytes": "9464137"
},
{
"name": "Python",
"bytes": "80044"
},
{
"name": "Ruby",
"bytes": "14445"
},
{
"name": "Shell",
"bytes": "12162"
},
{
"name": "Yacc",
"bytes": "22292"
}
],
"symlink_target": ""
} |
module PaperTrail
module Reifiers
# Reify a single `has_one` association of `model`.
# @api private
module HasOne
class << self
# @api private
def reify(assoc, model, options, transaction_id)
version = load_version_for_has_one(assoc, model, transaction_id, options[:version_at])
return unless version
if version.event == "create"
create_event(assoc, model, options)
else
noncreate_event(assoc, model, options, version)
end
end
private
# @api private
def create_event(assoc, model, options)
if options[:mark_for_destruction]
model.send(assoc.name).mark_for_destruction if model.send(assoc.name, true)
else
model.paper_trail.appear_as_new_record do
model.send "#{assoc.name}=", nil
end
end
end
# Given a has-one association `assoc` on `model`, return the version
# record from the point in time identified by `transaction_id` or `version_at`.
# @api private
def load_version_for_has_one(assoc, model, transaction_id, version_at)
version_table_name = model.class.paper_trail.version_class.table_name
model.class.paper_trail.version_class.joins(:version_associations).
where("version_associations.foreign_key_name = ?", assoc.foreign_key).
where("version_associations.foreign_key_id = ?", model.id).
where("#{version_table_name}.item_type = ?", assoc.class_name).
where("created_at >= ? OR transaction_id = ?", version_at, transaction_id).
order("#{version_table_name}.id ASC").
first
end
# @api private
def noncreate_event(assoc, model, options, version)
child = version.reify(
options.merge(
has_many: false,
has_one: false,
belongs_to: false,
has_and_belongs_to_many: false
)
)
model.paper_trail.appear_as_new_record do
without_persisting(child) do
model.send "#{assoc.name}=", child
end
end
end
# Temporarily suppress #save so we can reassociate with the reified
# master of a has_one relationship. Since ActiveRecord 5 the related
# object is saved when it is assigned to the association. ActiveRecord
# 5 also happens to be the first version that provides #suppress.
def without_persisting(record)
if record.class.respond_to? :suppress
record.class.suppress { yield }
else
yield
end
end
end
end
end
end
| {
"content_hash": "880fb7cd2dee445ac05c9069ca357373",
"timestamp": "",
"source": "github",
"line_count": 76,
"max_line_length": 96,
"avg_line_length": 36.421052631578945,
"alnum_prop": 0.5773121387283237,
"repo_name": "devonestes/paper_trail",
"id": "ad6611250dc83780a90862b109520ec1f7e02703",
"size": "2768",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "lib/paper_trail/reifiers/has_one.rb",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "Ruby",
"bytes": "324340"
}
],
"symlink_target": ""
} |
namespace FakeItEasy.Specs
{
using System;
using System.Threading.Tasks;
using FluentAssertions;
using Xbehave;
public class ValueTaskReturnValueSpecs
{
public interface IReturnValueTask
{
ValueTask<int> GetValueAsync();
ValueTask<int> GetValueAsync(int a);
ValueTask<int> GetValueAsync(int a, string b);
ValueTask<int> GetValueAsync(int a, string b, bool c);
}
[Scenario]
public void ValueTaskSpecificValue(IReturnValueTask fake, ValueTask<int> task)
{
"Given a fake"
.x(() => fake = A.Fake<IReturnValueTask>());
"And a ValueTask-returning method of the fake configured to return a specific value"
.x(() => A.CallTo(() => fake.GetValueAsync()).Returns(42));
"When the configured method is called"
.x(() => task = fake.GetValueAsync());
"Then it returns a successfully completed ValueTask"
.x(() => task.IsCompletedSuccessfully.Should().BeTrue());
"And the ValueTask's result is the configured value"
.x(() => task.Result.Should().Be(42));
}
[Scenario]
public void ValueTaskLazilyComputed(IReturnValueTask fake, Func<int> valueProducer, ValueTask<int> task)
{
"Given a fake"
.x(() => fake = A.Fake<IReturnValueTask>());
"And a value producer configured to return a specific value"
.x(() => valueProducer = A.Fake<Func<int>>(o => o.ConfigureFake(f => A.CallTo(() => f()).Returns(42))));
"And a ValueTask-returning method of the fake configured to return a value computed by the value producer"
.x(() => A.CallTo(() => fake.GetValueAsync()).ReturnsLazily(valueProducer));
"When the configured method is called"
.x(() => task = fake.GetValueAsync());
"Then it returns a successfully completed ValueTask"
.x(() => task.IsCompletedSuccessfully.Should().BeTrue());
"And the ValueTask's result is the configured value"
.x(() => task.Result.Should().Be(42));
"And the value producer has been called"
.x(() => A.CallTo(() => valueProducer()).MustHaveHappenedOnceExactly());
}
[Scenario]
public void ValueTaskFromSequence(
IReturnValueTask fake,
ValueTask<int> task1,
ValueTask<int> task2,
ValueTask<int> task3,
ValueTask<int> task4)
{
"Given a fake"
.x(() => fake = A.Fake<IReturnValueTask>());
"And a ValueTask-returning method of the fake configured to return values from a sequence of 3"
.x(() => A.CallTo(() => fake.GetValueAsync()).ReturnsNextFromSequence(1, 2, 3));
"When the configured method is called 4 times"
.x(() =>
{
task1 = fake.GetValueAsync();
task2 = fake.GetValueAsync();
task3 = fake.GetValueAsync();
task4 = fake.GetValueAsync();
});
"Then it returns 3 ValueTasks whose results are the values from the sequence"
.x(() =>
{
task1.Result.Should().Be(1);
task2.Result.Should().Be(2);
task3.Result.Should().Be(3);
});
"Then it returns a ValueTask whose result is a dummy"
.x(() => task4.Result.Should().Be(0));
}
[Scenario]
public void ValueTaskStronglyTypedLazilyComputed1Parameter(IReturnValueTask fake, Func<int, int> valueProducer, ValueTask<int> task)
{
"Given a fake"
.x(() => fake = A.Fake<IReturnValueTask>());
"And a strongly typed value producer that returns a value based on the argument's value"
.x(() => valueProducer = a => a + 1);
"And a ValueTask-returning method of the fake configured to return a value computed by the value producer"
.x(() => A.CallTo(() => fake.GetValueAsync(A<int>._)).ReturnsLazily(valueProducer));
"When the configured method is called"
.x(() => task = fake.GetValueAsync(41));
"Then it returns a successfully completed ValueTask"
.x(() => task.IsCompletedSuccessfully.Should().BeTrue());
"And the ValueTask's result is the result of the value producer"
.x(() => task.Result.Should().Be(42));
}
[Scenario]
public void ValueTaskStronglyTypedLazilyComputed2Parameters(IReturnValueTask fake, Func<int, string, int> valueProducer, ValueTask<int> task)
{
"Given a fake"
.x(() => fake = A.Fake<IReturnValueTask>());
"And a strongly typed value producer that returns a value based on the arguments' values"
.x(() => valueProducer = (a, b) => a + b.Length);
"And a ValueTask-returning method of the fake configured to return a value computed by the value producer"
.x(() => A.CallTo(() => fake.GetValueAsync(A<int>._, A<string>._)).ReturnsLazily(valueProducer));
"When the configured method is called"
.x(() => task = fake.GetValueAsync(37, "hello"));
"Then it returns a successfully completed ValueTask"
.x(() => task.IsCompletedSuccessfully.Should().BeTrue());
"And the ValueTask's result is the result of the value producer"
.x(() => task.Result.Should().Be(42));
}
[Scenario]
public void ValueTaskStronglyTypedLazilyComputed3Parameters(IReturnValueTask fake, Func<int, string, bool, int> valueProducer, ValueTask<int> task)
{
"Given a fake"
.x(() => fake = A.Fake<IReturnValueTask>());
"And a strongly typed value producer that returns a value based on the arguments' values"
.x(() => valueProducer = (a, b, c) => a + b.Length + (c ? 1 : 0));
"And a ValueTask-returning method of the fake configured to return a value computed by the value producer"
.x(() => A.CallTo(() => fake.GetValueAsync(A<int>._, A<string>._, A<bool>._)).ReturnsLazily(valueProducer));
"When the configured method is called"
.x(() => task = fake.GetValueAsync(36, "hello", true));
"Then it returns a successfully completed ValueTask"
.x(() => task.IsCompletedSuccessfully.Should().BeTrue());
"And the ValueTask's result is the result of the value producer"
.x(() => task.Result.Should().Be(42));
}
}
}
| {
"content_hash": "f3826518468b30a9e89c7e476946bdb1",
"timestamp": "",
"source": "github",
"line_count": 166,
"max_line_length": 155,
"avg_line_length": 42.524096385542165,
"alnum_prop": 0.5439864003399915,
"repo_name": "thomaslevesque/FakeItEasy",
"id": "522c0786f99a904e0c5207736d0169e59d2a25bd",
"size": "7059",
"binary": false,
"copies": "4",
"ref": "refs/heads/master",
"path": "tests/FakeItEasy.Specs/ValueTaskReturnValueSpecs.cs",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "Batchfile",
"bytes": "517"
},
{
"name": "C#",
"bytes": "1917197"
},
{
"name": "Shell",
"bytes": "109"
},
{
"name": "Visual Basic .NET",
"bytes": "7598"
}
],
"symlink_target": ""
} |
ACCEPTED
#### According to
Index Fungorum
#### Published in
null
#### Original name
Opegrapha onchospora Nyl.
### Remarks
null | {
"content_hash": "02a114988b324cb638ebd19f9adc3168",
"timestamp": "",
"source": "github",
"line_count": 13,
"max_line_length": 25,
"avg_line_length": 10,
"alnum_prop": 0.7076923076923077,
"repo_name": "mdoering/backbone",
"id": "30a6482f38f227a54fc7febb99dfe24ffb96a456",
"size": "179",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "life/Fungi/Ascomycota/Arthoniomycetes/Arthoniales/Roccellaceae/Opegrapha/Opegrapha onchospora/README.md",
"mode": "33188",
"license": "apache-2.0",
"language": [],
"symlink_target": ""
} |
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<title>Creating Web Apps with Angular JS | Renemari Padillo</title>
<meta name="description" content="Front-end Web Technologies Presentation for MSU Campus DevCon.">
<!-- Style Vendors -->
<link rel="stylesheet" href="styles/styles.css">
<link rel="stylesheet" href="bower_components/normalize.css/normalize.css">
</head>
<body>
<!-- INTRODUCTION -->
<div id="introduction" class="slide" data-title="Introduction">
<div class="slide-content">
<div class="my-info">
<h1>Creating Web Apps with Angular JS<br>and Firebase</h1>
<div class="my-name">Renemari Padillo</div>
<div class="my-details">
Software Engineer | Front-end Web Developer<br>
Bywave Ltd.<br>
DevCon Volunteer
</div>
</div>
<div class="title">
<p class="big-font">Ola!</p>
</div>
</div>
</div>
<!-- INTRO TO WEB DEV -->
<div id="web-dev" class="slide" data-title="Angular JS">
<div class="slide-content">
<h1 class="big-font">Web App</h1>
<p>A web app is much more than just a plain website.<br>Web apps use the web browser as a way of connecting to networked tools and systems and are stored on web servers, and use tools like databases, JavaScript (or Ajax or Silverlight), and PHP (or ASP.Net) to deliver experiences beyond the standard web page or web form.<br><em>- About.com</em></p>
</div>
</div>
<!-- ANGULAR JS -->
<div id="angular" class="slide" data-title="Angular JS">
<div class="slide-content">
<div class="logo"><img class="samples" src="images/angular-js.png" alt="Angular JS"></div>
<p class="med-font">A Superheroic Javascript MVW (Model-View-Whatever) Framework by Google</p>
</div>
</div>
<!-- FIREBASE -->
<div id="firebase" class="slide" data-title="Angular JS">
<div class="slide-content">
<div class="logo"><img class="samples" src="images/firebase.png"></div>
<p class="heading">A cloud service provider and back-end as a service company.<br>Their primary product is a realtime database which provides an API that allows developers to store and sync data across multiple clients.<br>Now also owned by Google</p>
</div>
</div>
<!-- GOAL -->
<div id="qa" class="slide" data-title="QA">
<div class="slide-content">
<p class="big-font">Goals for this talk</p>
<ol style="width: 30%; margin: auto;">
<li>Create a simple todo app</li>
<li>Organize our Angular Project Structure</li>
<li>Create a simple web chat application with Angular Fire</li>
</ol>
</div>
</div>
<!-- QA -->
<div id="qa" class="slide" data-title="QA">
<div class="slide-content">
<p class="big-font">Question(s) ? ?</p>
</div>
</div>
<!-- THANK YOU -->
<div id="thank-you" class="slide" data-title="thank-you">
<div class="slide-content">
<h1 class="heading">Thank You!</h1>
<div>
<img src="images/thumbs-up.jpg" width="250px">
</div>
<p>Slides: https://github.com/renesansz/presentations</p>
<p>Source Code: https://github.com/renesansz/simple-web-app-angular-firebase</p>
<a href="https://twitter.com/renesansz">@renesansz</a>
</div>
</div>
<!-- Script Vendors -->
<script src="bower_components/gsap/src/minified/TweenMax.min.js"></script>
<script src="scripts/app.js"></script>
</body>
</html> | {
"content_hash": "2aef9878ac7ff2dc869cbc90d77c49f8",
"timestamp": "",
"source": "github",
"line_count": 96,
"max_line_length": 362,
"avg_line_length": 40.677083333333336,
"alnum_prop": 0.5687580025608194,
"repo_name": "renesansz/presentations",
"id": "49991c66180c2dbfe9011b1cd51ef550d0d2debc",
"size": "3905",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "frontend-web-technologies/app/index.html",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "CSS",
"bytes": "3247"
},
{
"name": "HTML",
"bytes": "7321"
},
{
"name": "JavaScript",
"bytes": "17587"
},
{
"name": "SCSS",
"bytes": "4809"
}
],
"symlink_target": ""
} |
<html>
<head><title>Click noreferrer links</title>
<script>
function simulateClick(target) {
var evt = document.createEvent("MouseEvents");
evt.initMouseEvent("click", true, true, window,
0, 0, 0, 0, 0, false, false,
false, false, 0, null);
return target.dispatchEvent(evt);
}
function clickNoRefTargetBlankLink() {
return simulateClick(document.getElementById("noref_and_tblank_link"));
}
function clickTargetBlankLink() {
return simulateClick(document.getElementById("tblank_link"));
}
function clickNoRefLink() {
return simulateClick(document.getElementById("noref_link"));
}
</script>
</head>
<a href="https://REPLACE_WITH_HOST_AND_PORT/files/title2.html"
id="noref_and_tblank_link" rel="noreferrer" target="_blank">
rel=noreferrer and target=_blank</a><br>
<a href="https://REPLACE_WITH_HOST_AND_PORT/files/title2.html" id="tblank_link"
target="_blank">target=_blank</a><br>
<a href="https://REPLACE_WITH_HOST_AND_PORT/files/title2.html" id="noref_link"
rel="noreferrer">rel=noreferrer</a><br>
</html>
| {
"content_hash": "00f976e57817011b02f857b5f3079573",
"timestamp": "",
"source": "github",
"line_count": 38,
"max_line_length": 79,
"avg_line_length": 29.42105263157895,
"alnum_prop": 0.6708407871198568,
"repo_name": "aYukiSekiguchi/ACCESS-Chromium",
"id": "0e5e297bbb1042b151e865586b062d1816e35ddc",
"size": "1118",
"binary": false,
"copies": "4",
"ref": "refs/heads/master",
"path": "chrome/test/data/click-noreferrer-links.html",
"mode": "33188",
"license": "bsd-3-clause",
"language": [
{
"name": "Assembly",
"bytes": "1174606"
},
{
"name": "C",
"bytes": "65916105"
},
{
"name": "C++",
"bytes": "113472993"
},
{
"name": "F#",
"bytes": "381"
},
{
"name": "Go",
"bytes": "10440"
},
{
"name": "Java",
"bytes": "11354"
},
{
"name": "JavaScript",
"bytes": "8864255"
},
{
"name": "Objective-C",
"bytes": "8990130"
},
{
"name": "PHP",
"bytes": "97796"
},
{
"name": "Perl",
"bytes": "903036"
},
{
"name": "Python",
"bytes": "5269405"
},
{
"name": "R",
"bytes": "524"
},
{
"name": "Shell",
"bytes": "4123452"
},
{
"name": "Tcl",
"bytes": "277077"
}
],
"symlink_target": ""
} |
import PointLocator from '../../algorithm/PointLocator';
import Location from '../../geom/Location';
import TreeSet from '../../../../../java/util/TreeSet';
import extend from '../../../../../extend';
import GeometryCombiner from '../../geom/util/GeometryCombiner';
import CoordinateArrays from '../../geom/CoordinateArrays';
export default function PointGeometryUnion() {
this.pointGeom = null;
this.otherGeom = null;
this.geomFact = null;
let pointGeom = arguments[0], otherGeom = arguments[1];
this.pointGeom = pointGeom;
this.otherGeom = otherGeom;
this.geomFact = otherGeom.getFactory();
}
extend(PointGeometryUnion.prototype, {
union: function () {
var locater = new PointLocator();
var exteriorCoords = new TreeSet();
for (var i = 0; i < this.pointGeom.getNumGeometries(); i++) {
var point = this.pointGeom.getGeometryN(i);
var coord = point.getCoordinate();
var loc = locater.locate(coord, this.otherGeom);
if (loc === Location.EXTERIOR) exteriorCoords.add(coord);
}
if (exteriorCoords.size() === 0) return this.otherGeom;
var ptComp = null;
var coords = CoordinateArrays.toCoordinateArray(exteriorCoords);
if (coords.length === 1) {
ptComp = this.geomFact.createPoint(coords[0]);
} else {
ptComp = this.geomFact.createMultiPointFromCoords(coords);
}
return GeometryCombiner.combine(ptComp, this.otherGeom);
},
interfaces_: function () {
return [];
},
getClass: function () {
return PointGeometryUnion;
}
});
PointGeometryUnion.union = function (pointGeom, otherGeom) {
var unioner = new PointGeometryUnion(pointGeom, otherGeom);
return unioner.union();
};
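The union above keeps only those points located EXTERIOR to the other geometry, deduplicates them, and combines the survivors with that geometry. A simplified, self-contained sketch of the same idea — not using the jsts APIs above; the rectangle-based locator and the helper names are invented for illustration:

```javascript
'use strict';

// Hypothetical stand-in for PointLocator: a geometry is modeled as an
// axis-aligned rectangle, and a point is exterior when it falls strictly
// outside that rectangle.
function isExterior(point, rect) {
  return point.x < rect.minX || point.x > rect.maxX ||
         point.y < rect.minY || point.y > rect.maxY;
}

// Union a set of points with another geometry: points already covered by
// the geometry are dropped, and the remaining points (deduplicated by
// coordinate, like the TreeSet above) are kept alongside it.
function unionPointsWithGeometry(points, rect) {
  const exterior = new Map();
  for (const p of points) {
    if (isExterior(p, rect)) exterior.set(p.x + ',' + p.y, p);
  }
  // Nothing survives outside the geometry: the union is the geometry itself.
  if (exterior.size === 0) return {geometry: rect, points: []};
  return {geometry: rect, points: [...exterior.values()]};
}

const rect = {minX: 0, minY: 0, maxX: 10, maxY: 10};
const result = unionPointsWithGeometry(
  [{x: 5, y: 5}, {x: 20, y: 20}, {x: 20, y: 20}, {x: -1, y: 3}],
  rect
);
console.log(result.points.length); // 2 — the interior point and the duplicate are removed
```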
| {
"content_hash": "74fd04f91d46f6599c4a0ac97ea3fbc6",
"timestamp": "",
"source": "github",
"line_count": 47,
"max_line_length": 66,
"avg_line_length": 34.57446808510638,
"alnum_prop": 0.7058461538461539,
"repo_name": "futuristicblanket/reef-space",
"id": "42dd10356bf77e00573718151cf575fc1ee3d10d",
"size": "1625",
"binary": false,
"copies": "4",
"ref": "refs/heads/master",
"path": "node_modules/jsts/src/org/locationtech/jts/operation/union/PointGeometryUnion.js",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "CSS",
"bytes": "5869"
},
{
"name": "HTML",
"bytes": "168529"
},
{
"name": "JavaScript",
"bytes": "872961"
},
{
"name": "PowerShell",
"bytes": "471"
},
{
"name": "Python",
"bytes": "2489"
},
{
"name": "Ruby",
"bytes": "1030"
},
{
"name": "Shell",
"bytes": "2877"
}
],
"symlink_target": ""
} |
using System.Collections.Generic;
using System.Runtime.Serialization;
using Chiro.Gap.Domain;
namespace Chiro.Gap.ServiceContracts.DataContracts
{
/// <summary>
/// DataContract for summary information about enrolled members/leaders and their section(s)
/// </summary>
[DataContract]
public class LidAfdelingInfo
{
/// <summary>
/// Instantiates a LidAfdelingInfo object
/// </summary>
public LidAfdelingInfo()
{
AfdelingsJaarIDs = new List<int>();
}
/// <summary>
/// Full name of the member
/// </summary>
[DataMember]
public string VolledigeNaam { get; set; }
/// <summary>
/// IDs of the section years linked to the member
/// </summary>
[DataMember]
public IList<int> AfdelingsJaarIDs { get; set; }
/// <summary>
/// Member type (child/leader)
/// </summary>
[DataMember]
public LidType Type { get; set; }
}
}
| {
"content_hash": "122ca651199f88d3610f3ffee2e43beb",
"timestamp": "",
"source": "github",
"line_count": 41,
"max_line_length": 87,
"avg_line_length": 21.70731707317073,
"alnum_prop": 0.6584269662921348,
"repo_name": "Chirojeugd-Vlaanderen/gap",
"id": "d27298af07c27f45d19fa2a6a0a854a55e29425e",
"size": "1618",
"binary": false,
"copies": "1",
"ref": "refs/heads/dev",
"path": "Solution/Chiro.Gap.ServiceContracts/DataContracts/LidAfdelingInfo.cs",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "ASP",
"bytes": "412956"
},
{
"name": "C#",
"bytes": "2255924"
},
{
"name": "CSS",
"bytes": "28536"
},
{
"name": "JavaScript",
"bytes": "51309"
},
{
"name": "PLSQL",
"bytes": "1032"
},
{
"name": "PLpgSQL",
"bytes": "339"
},
{
"name": "SQLPL",
"bytes": "17329"
},
{
"name": "Shell",
"bytes": "96"
}
],
"symlink_target": ""
} |
'use strict';
var idCount = 0;
var _ = module.exports = {
has: has,
result: function result(value) {
for (var _len = arguments.length, args = Array(_len > 1 ? _len - 1 : 0), _key = 1; _key < _len; _key++) {
args[_key - 1] = arguments[_key];
}
return typeof value === 'function' ? value.apply(undefined, args) : value;
},
isShallowEqual: function isShallowEqual(a, b) {
if (a === b) return true;
if (a instanceof Date && b instanceof Date) return a.getTime() === b.getTime();
if (typeof a !== 'object' && typeof b !== 'object') return a === b;
if (typeof a !== typeof b) return false;
return shallowEqual(a, b);
},
transform: function transform(obj, cb, seed) {
_.each(obj, cb.bind(null, seed = seed || (Array.isArray(obj) ? [] : {})));
return seed;
},
each: function each(obj, cb, thisArg) {
if (Array.isArray(obj)) return obj.forEach(cb, thisArg);
for (var key in obj) if (has(obj, key)) cb.call(thisArg, obj[key], key, obj);
},
pick: function pick(obj, keys) {
keys = [].concat(keys);
return _.transform(obj, function (mapped, val, key) {
if (keys.indexOf(key) !== -1) mapped[key] = val;
}, {});
},
omit: function omit(obj, keys) {
keys = [].concat(keys);
return _.transform(obj, function (mapped, val, key) {
if (keys.indexOf(key) === -1) mapped[key] = val;
}, {});
},
find: function find(arr, cb, thisArg) {
var result;
if (Array.isArray(arr)) {
arr.every(function (val, idx) {
if (cb.call(thisArg, val, idx, arr)) return (result = val, false);
return true;
});
return result;
} else for (var key in arr) if (has(arr, key)) if (cb.call(thisArg, arr[key], key, arr)) return arr[key];
},
chunk: function chunk(array, chunkSize) {
var index = 0,
length = array ? array.length : 0,
result = [];
chunkSize = Math.max(+chunkSize || 1, 1);
while (index < length) result.push(array.slice(index, index += chunkSize));
return result;
},
splat: function splat(obj) {
return obj == null ? [] : [].concat(obj);
},
noop: function noop() {},
uniqueId: function uniqueId(prefix) {
return '' + ((prefix == null ? '' : prefix) + ++idCount);
}
};
function has(o, k) {
return o ? Object.prototype.hasOwnProperty.call(o, k) : false;
}
function eql(a, b) {
return a === b;
}
function shallowEqual(objA, objB) {
if (objA == null || objB == null) return false;
var keysA = Object.keys(objA),
keysB = Object.keys(objB);
if (keysA.length !== keysB.length) return false;
for (var i = 0; i < keysA.length; i++) if (!has(objB, keysA[i]) || !eql(objA[keysA[i]], objB[keysA[i]])) return false;
return true;
} | {
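Two of the helpers above — `chunk` and the `shallowEqual` used by `isShallowEqual` — are easy to misread at a glance. A standalone sketch of the behavior they implement (re-stated here for illustration rather than imported from the module):

```javascript
'use strict';

// chunk: split an array into fixed-size slices; the last slice may be shorter.
function chunk(array, chunkSize) {
  var index = 0;
  var length = array ? array.length : 0;
  var result = [];
  chunkSize = Math.max(+chunkSize || 1, 1);
  while (index < length) result.push(array.slice(index, index += chunkSize));
  return result;
}

// shallowEqual: two objects are equal when they have the same own keys and
// each key maps to a strictly equal (===) value — one level deep only.
function shallowEqual(objA, objB) {
  if (objA == null || objB == null) return false;
  var keysA = Object.keys(objA);
  var keysB = Object.keys(objB);
  if (keysA.length !== keysB.length) return false;
  for (var i = 0; i < keysA.length; i++) {
    if (!Object.prototype.hasOwnProperty.call(objB, keysA[i]) ||
        objA[keysA[i]] !== objB[keysA[i]]) return false;
  }
  return true;
}

console.log(JSON.stringify(chunk([1, 2, 3, 4, 5], 2))); // [[1,2],[3,4],[5]]
console.log(shallowEqual({a: 1}, {a: 1}));              // true
console.log(shallowEqual({a: {}}, {a: {}}));            // false — different object references
```

The last line is the usual gotcha: because comparison is strict equality per key, two objects holding structurally identical but distinct nested objects are *not* shallow-equal.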
"content_hash": "8fb0592dfd7a05039a80e4ae3c882f64",
"timestamp": "",
"source": "github",
"line_count": 108,
"max_line_length": 120,
"avg_line_length": 25.48148148148148,
"alnum_prop": 0.575218023255814,
"repo_name": "yomolify/cs-webserver",
"id": "9463fb7c5d4785b5ac40405c0e861c14529b6972",
"size": "2822",
"binary": false,
"copies": "4",
"ref": "refs/heads/master",
"path": "node_modules/react-widgets/lib/util/_.js",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "CSS",
"bytes": "580003"
},
{
"name": "JavaScript",
"bytes": "932904"
}
],
"symlink_target": ""
} |
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=0" />
<title>Findborgs earth-typography element</title>
<link rel="stylesheet" href="../dist/earth-typography.css">
<link rel="stylesheet" href="../dist/earth-colors.css">
</head>
<body class="bg-red-800 p4">
<div class="mx-auto bg-red-50" style="max-width:740px">
<header class="bg-red" style="height: 192px">
<div class="bg-red-700" style="height: 64px">menu area</div>
<h1 class="et-white-text p1">Test Header</h1>
</header>
<main class="p1" style="min-height: 300px;">
<article class="bg-white ml4 mr4 p1 mt3" style="box-shadow: 0 30px 10px -8px #777;">
Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod
tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam,
quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo
consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse
cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non
proident, sunt in culpa qui officia deserunt mollit anim id est laborum.
</article>
<aside class="bg-red-300 ml4 mr4 p1 mt4">Something about or related to the article</aside>
</main>
<footer class="bg-red-900 et-white-text p2">
How do you like me now
</footer>
</div>
</body>
</html> | {
"content_hash": "32d9ff00dd2febcece89bffac696c0e2",
"timestamp": "",
"source": "github",
"line_count": 36,
"max_line_length": 112,
"avg_line_length": 42.583333333333336,
"alnum_prop": 0.6666666666666666,
"repo_name": "ExoTypography-Elements/earth-colors",
"id": "e5ac07b00b45d9ef2f31c9afbda0669a30f193e2",
"size": "1533",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "examples/red.html",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "CSS",
"bytes": "225877"
},
{
"name": "HTML",
"bytes": "500"
},
{
"name": "JavaScript",
"bytes": "482208"
},
{
"name": "Vue",
"bytes": "151990"
}
],
"symlink_target": ""
} |
package io.gravitee.am.identityprovider.linkedin.authentication.model;
/**
 * LinkedIn user claims.
 *
 * @author David BRASSELY (david.brassely at graviteesource.com)
 * @author GraviteeSource Team
 */
public interface LinkedinUser {
String ID = "id";
String FIRSTNAME = "localizedFirstName";
String LASTNAME = "localizedLastName";
String MAIDENNAME = "localizedMaidenName";
String HEADLINE = "localizedHeadline";
String PROFILE_URL = "vanityName";
}
| {
"content_hash": "1127de02b2af3b79a2db6fb8386d4036",
"timestamp": "",
"source": "github",
"line_count": 20,
"max_line_length": 94,
"avg_line_length": 28.7,
"alnum_prop": 0.7195121951219512,
"repo_name": "gravitee-io/graviteeio-access-management",
"id": "91195a0c14aaf3d4d30e890a93c7e205acab6df1",
"size": "1204",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "gravitee-am-identityprovider/gravitee-am-identityprovider-linkedin/src/main/java/io/gravitee/am/identityprovider/linkedin/authentication/model/LinkedinUser.java",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Batchfile",
"bytes": "4027"
},
{
"name": "CSS",
"bytes": "68813"
},
{
"name": "Dockerfile",
"bytes": "4681"
},
{
"name": "HTML",
"bytes": "419979"
},
{
"name": "Java",
"bytes": "5954609"
},
{
"name": "JavaScript",
"bytes": "4135"
},
{
"name": "Makefile",
"bytes": "18654"
},
{
"name": "Shell",
"bytes": "13103"
},
{
"name": "TypeScript",
"bytes": "839905"
}
],
"symlink_target": ""
} |
using System;
using System.Collections.Generic;
using System.IO;
using System.Web;
using System.Web.Routing;
using Umbraco.Core;
using Umbraco.Core.Configuration;
using Umbraco.Core.Configuration.UmbracoSettings;
using Umbraco.Core.Events;
using Umbraco.Core.Models.PublishedContent;
using Umbraco.Web;
using Umbraco.Web.PublishedCache;
using Umbraco.Web.Routing;
using Umbraco.Web.Security;
namespace Umbraco.Web
{
/// <summary>
/// Class that encapsulates Umbraco information of a specific HTTP request
/// </summary>
public class UmbracoContext : DisposableObjectSlim, IDisposeOnRequestEnd
{
private readonly IGlobalSettings _globalSettings;
private readonly Lazy<IPublishedSnapshot> _publishedSnapshot;
private string _previewToken;
private bool? _previewing;
// initializes a new instance of the UmbracoContext class
// internal for unit tests
// otherwise it's used by EnsureContext above
// warn: does *not* manage setting any IUmbracoContextAccessor
internal UmbracoContext(HttpContextBase httpContext,
IPublishedSnapshotService publishedSnapshotService,
WebSecurity webSecurity,
IUmbracoSettingsSection umbracoSettings,
IEnumerable<IUrlProvider> urlProviders,
IEnumerable<IMediaUrlProvider> mediaUrlProviders,
IGlobalSettings globalSettings,
IVariationContextAccessor variationContextAccessor)
{
if (httpContext == null) throw new ArgumentNullException(nameof(httpContext));
if (publishedSnapshotService == null) throw new ArgumentNullException(nameof(publishedSnapshotService));
if (webSecurity == null) throw new ArgumentNullException(nameof(webSecurity));
if (umbracoSettings == null) throw new ArgumentNullException(nameof(umbracoSettings));
if (urlProviders == null) throw new ArgumentNullException(nameof(urlProviders));
if (mediaUrlProviders == null) throw new ArgumentNullException(nameof(mediaUrlProviders));
VariationContextAccessor = variationContextAccessor ?? throw new ArgumentNullException(nameof(variationContextAccessor));
_globalSettings = globalSettings ?? throw new ArgumentNullException(nameof(globalSettings));
// ensure that this instance is disposed when the request terminates, though we *also* ensure
// this happens in the Umbraco module since the UmbracoContext is added to the HttpContext items.
//
// also, it *can* be returned by the container with a PerRequest lifetime, meaning that the
// container *could* also try to dispose it.
//
// all in all, this context may be disposed more than once, but DisposableObject ensures that
// it is ok and it will be actually disposed only once.
httpContext.DisposeOnPipelineCompleted(this);
ObjectCreated = DateTime.Now;
UmbracoRequestId = Guid.NewGuid();
HttpContext = httpContext;
Security = webSecurity;
// beware - we cannot expect a current user here, so detecting preview mode must be a lazy thing
_publishedSnapshot = new Lazy<IPublishedSnapshot>(() => publishedSnapshotService.CreatePublishedSnapshot(PreviewToken));
// set the URLs...
// NOTE: The request will not be available during app startup so we can only set this to an absolute URL of localhost, this
// is a work around to being able to access the UmbracoContext during application startup and this will also ensure that people
// 'could' still generate URLs during startup BUT any domain driven URL generation will not work because it is NOT possible to get
// the current domain during application startup.
// see: http://issues.umbraco.org/issue/U4-1890
//
OriginalRequestUrl = GetRequestFromContext()?.Url ?? new Uri("http://localhost");
CleanedUmbracoUrl = UriUtility.UriToUmbraco(OriginalRequestUrl);
UrlProvider = new UrlProvider(this, umbracoSettings.WebRouting, urlProviders, mediaUrlProviders, variationContextAccessor);
}
/// <summary>
/// This is used internally for performance calculations; the ObjectCreated DateTime is set as soon as this
/// object is instantiated, which in the web site happens during the BeginRequest phase.
/// We can then determine complete rendering time from that.
/// </summary>
internal DateTime ObjectCreated { get; }
/// <summary>
/// This is used internally for debugging and also used to define anything required to distinguish this request from another.
/// </summary>
internal Guid UmbracoRequestId { get; }
/// <summary>
/// Gets the WebSecurity class
/// </summary>
public WebSecurity Security { get; }
/// <summary>
/// Gets the URI that is handled by ASP.NET after server-side rewriting has taken place.
/// </summary>
internal Uri OriginalRequestUrl { get; }
/// <summary>
/// Gets the cleaned up URL that is handled by Umbraco.
/// </summary>
/// <remarks>That is, lowercase, no trailing slash after path, no .aspx...</remarks>
internal Uri CleanedUmbracoUrl { get; }
/// <summary>
/// Gets the published snapshot.
/// </summary>
public IPublishedSnapshot PublishedSnapshot => _publishedSnapshot.Value;
/// <summary>
/// Gets the published content cache.
/// </summary>
[Obsolete("Use the Content property.")]
public IPublishedContentCache ContentCache => PublishedSnapshot.Content;
/// <summary>
/// Gets the published content cache.
/// </summary>
public IPublishedContentCache Content => PublishedSnapshot.Content;
/// <summary>
/// Gets the published media cache.
/// </summary>
[Obsolete("Use the Media property.")]
public IPublishedMediaCache MediaCache => PublishedSnapshot.Media;
/// <summary>
/// Gets the published media cache.
/// </summary>
public IPublishedMediaCache Media => PublishedSnapshot.Media;
/// <summary>
/// Gets the domains cache.
/// </summary>
public IDomainCache Domains => PublishedSnapshot.Domains;
/// <summary>
/// Gets a value indicating whether the current request is a front-end Umbraco request.
/// </summary>
public bool IsFrontEndUmbracoRequest => PublishedRequest != null;
/// <summary>
/// Gets the URL provider.
/// </summary>
public UrlProvider UrlProvider { get; }
/// <summary>
/// Gets/sets the PublishedRequest object
/// </summary>
public PublishedRequest PublishedRequest { get; set; }
/// <summary>
/// Exposes the HttpContext for the current request
/// </summary>
public HttpContextBase HttpContext { get; }
/// <summary>
/// Gets the variation context accessor.
/// </summary>
public IVariationContextAccessor VariationContextAccessor { get; }
/// <summary>
/// Gets a value indicating whether the request has debugging enabled
/// </summary>
/// <value><c>true</c> if this instance is debug; otherwise, <c>false</c>.</value>
public bool IsDebug
{
get
{
var request = GetRequestFromContext();
//NOTE: the request can be null during app startup!
return GlobalSettings.DebugMode
&& request != null
&& (string.IsNullOrEmpty(request["umbdebugshowtrace"]) == false
|| string.IsNullOrEmpty(request["umbdebug"]) == false
|| string.IsNullOrEmpty(request.Cookies["UMB-DEBUG"]?.Value) == false);
}
}
/// <summary>
/// Determines whether the current user is in preview mode and browsing the site (i.e. not in the admin UI).
/// </summary>
public bool InPreviewMode
{
get
{
if (_previewing.HasValue == false) DetectPreviewMode();
return _previewing ?? false;
}
private set => _previewing = value;
}
#region Urls
/// <summary>
/// Gets the URL of a content identified by its identifier.
/// </summary>
/// <param name="contentId">The content identifier.</param>
/// <param name="culture"></param>
/// <returns>The URL for the content.</returns>
public string Url(int contentId, string culture = null)
{
return UrlProvider.GetUrl(contentId, culture: culture);
}
/// <summary>
/// Gets the URL of a content identified by its identifier.
/// </summary>
/// <param name="contentId">The content identifier.</param>
/// <param name="culture"></param>
/// <returns>The URL for the content.</returns>
public string Url(Guid contentId, string culture = null)
{
return UrlProvider.GetUrl(contentId, culture: culture);
}
/// <summary>
/// Gets the URL of a content identified by its identifier, in a specified mode.
/// </summary>
/// <param name="contentId">The content identifier.</param>
/// <param name="mode">The mode.</param>
/// <param name="culture"></param>
/// <returns>The URL for the content.</returns>
public string Url(int contentId, UrlMode mode, string culture = null)
{
return UrlProvider.GetUrl(contentId, mode, culture);
}
/// <summary>
/// Gets the URL of a content identified by its identifier, in a specified mode.
/// </summary>
/// <param name="contentId">The content identifier.</param>
/// <param name="mode">The mode.</param>
/// <param name="culture"></param>
/// <returns>The URL for the content.</returns>
public string Url(Guid contentId, UrlMode mode, string culture = null)
{
return UrlProvider.GetUrl(contentId, mode, culture);
}
/// <summary>
/// Gets the absolute URL of a content identified by its identifier.
/// </summary>
/// <param name="contentId">The content identifier.</param>
/// <param name="culture"></param>
/// <returns>The absolute URL for the content.</returns>
[Obsolete("Use the Url() method with UrlMode.Absolute.")]
public string UrlAbsolute(int contentId, string culture = null)
{
return UrlProvider.GetUrl(contentId, UrlMode.Absolute, culture);
}
/// <summary>
/// Gets the absolute URL of a content identified by its identifier.
/// </summary>
/// <param name="contentId">The content identifier.</param>
/// <param name="culture"></param>
/// <returns>The absolute URL for the content.</returns>
[Obsolete("Use the Url() method with UrlMode.Absolute.")]
public string UrlAbsolute(Guid contentId, string culture = null)
{
return UrlProvider.GetUrl(contentId, UrlMode.Absolute, culture);
}
#endregion
private string PreviewToken
{
get
{
if (_previewing.HasValue == false) DetectPreviewMode();
return _previewToken;
}
}
private void DetectPreviewMode()
{
var request = GetRequestFromContext();
if (request?.Url != null
&& request.Url.IsBackOfficeRequest(HttpRuntime.AppDomainAppVirtualPath, _globalSettings) == false
&& Security.CurrentUser != null)
{
var previewToken = request.GetPreviewCookieValue(); // may be null or empty
_previewToken = previewToken.IsNullOrWhiteSpace() ? null : previewToken;
}
_previewing = _previewToken.IsNullOrWhiteSpace() == false;
}
// say we render a macro or RTE in a given 'preview' mode that might not be the 'current' one;
// then, due to the way it all works at the moment, the 'current' published snapshot needs to be in the proper
// default 'preview' mode - somehow we have to force it, and that could be recursive.
internal IDisposable ForcedPreview(bool preview)
{
InPreviewMode = preview;
return PublishedSnapshot.ForcedPreview(preview, orig => InPreviewMode = orig);
}
private HttpRequestBase GetRequestFromContext()
{
try
{
return HttpContext.Request;
}
catch (HttpException)
{
return null;
}
}
protected override void DisposeResources()
{
// DisposableObject ensures that this runs only once
Security.DisposeIfDisposable();
// help caches release resources
// (but don't create caches just to dispose them)
// context is not multi-threaded
if (_publishedSnapshot.IsValueCreated)
_publishedSnapshot.Value.Dispose();
}
}
}
| {
"content_hash": "e49f77c67719b324ca8c9f10940ede7b",
"timestamp": "",
"source": "github",
"line_count": 328,
"max_line_length": 142,
"avg_line_length": 41.47560975609756,
"alnum_prop": 0.6107027344898559,
"repo_name": "mattbrailsford/Umbraco-CMS",
"id": "ab8470c64f0899ec0b0495698865a366f01f3c9d",
"size": "13604",
"binary": false,
"copies": "5",
"ref": "refs/heads/v8/dev",
"path": "src/Umbraco.Web/UmbracoContext.cs",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "ASP.NET",
"bytes": "2163"
},
{
"name": "C#",
"bytes": "15591985"
},
{
"name": "CSS",
"bytes": "17972"
},
{
"name": "HTML",
"bytes": "1259078"
},
{
"name": "JavaScript",
"bytes": "4555945"
},
{
"name": "Less",
"bytes": "688504"
},
{
"name": "PowerShell",
"bytes": "29216"
},
{
"name": "TypeScript",
"bytes": "122285"
}
],
"symlink_target": ""
} |
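The UmbracoContext constructor above notes that the context can legitimately be disposed more than once (once via `DisposeOnPipelineCompleted` and possibly again by a per-request container), with `DisposableObject` guaranteeing that actual cleanup runs only once. A minimal, hypothetical sketch of such a dispose-once guard (illustrative only; `DisposableOnce`, `RequestContext`, and their members are invented names, not the Umbraco API):

```javascript
class DisposableOnce {
  constructor() {
    this._disposed = false;
  }

  // Safe to call any number of times; resources are released exactly once.
  dispose() {
    if (this._disposed) return;
    this._disposed = true;
    this.disposeResources();
  }

  // Subclasses override this to release their actual resources.
  disposeResources() {}
}

class RequestContext extends DisposableOnce {
  constructor() {
    super();
    this.cleanupCount = 0;
  }

  disposeResources() {
    this.cleanupCount += 1; // stands in for releasing caches, security, etc.
  }
}
```

Calling `dispose()` twice on a `RequestContext` leaves `cleanupCount` at 1, mirroring the "disposed more than once, but actually disposed only once" behavior the comment describes.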
package ee.pri.bcup.client.common.panel;
import java.awt.Graphics;
import java.awt.Image;
import ee.pri.bcup.client.common.context.AppletContext;
import ee.pri.bcup.common.Testing;
/**
* Base class for panels with background image and no layout manager.
*
* @author Raivo Laanemets
*/
public class BCupBackgroundedPanel extends BCupPanel {
private static final long serialVersionUID = 1L;
private Image background;
public BCupBackgroundedPanel(AppletContext appletContext, String background) {
super(appletContext);
setLayout(null);
if (background != null && Testing.useBackground()) {
System.out.println("APPLETCTX: " + appletContext);
this.background = appletContext.getBackground(background);
} else {
setBackground(Testing.backgroundColor());
}
}
@Override
protected void paintComponent(Graphics g) {
if (background != null) {
g.drawImage(background, 0, 0, null);
}
}
}
| {
"content_hash": "bc70a48c42cdf34e668112abea47cdd3",
"timestamp": "",
"source": "github",
"line_count": 38,
"max_line_length": 79,
"avg_line_length": 24.31578947368421,
"alnum_prop": 0.737012987012987,
"repo_name": "rla/bcup",
"id": "8b38c584cce5273511d464edf2e25e6fca1c1c85",
"size": "924",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "src/client/ee/pri/bcup/client/common/panel/BCupBackgroundedPanel.java",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "Java",
"bytes": "327905"
},
{
"name": "PHP",
"bytes": "2120"
},
{
"name": "Shell",
"bytes": "2233"
}
],
"symlink_target": ""
} |
|LF_STANDARD|
|?TREE=» <a class=tree href="/CMD_FILE_MANAGER">`LANG_FILEMANAGER`</a> » <a class=tree href="/CMD_FILE_MANAGER`path`?action=extract&page=1">`path`</a>|
|?HELP_SECTION=`USER_HELPER`/filemanager.html#extract|
|HTM_USER_TOP|
<!-- inicio bloque -->
<div class="bcp-bloque">
<div class="bcpb-padding">
<!-- inicio contenido bloque -->
<!-- titulo -->
<div class="bcpbp-titulo">
<h2>|LANG_EXTRACT_1| |LANG_EXTRACT_2|</h2>
</div>
<!-- fin titulo -->
<!-- inicio content -->
<div class="bcpbp-content">
<!-- table -->
<div class="box-da-table">
<table class=list cellpadding=0 cellspacing=0>
<form name=info action='/CMD_FILE_MANAGER' method='POST'>
<input type=hidden name=action value="extract">
<input type=hidden name=page value="2">
<input type=hidden name=path value="|path|">
<tr><td class=list>|LANG_DIRECTORY|: <input type=text name=directory value="|directory|"></td></tr>
<tr><td class=listtitle align=right><input type=submit value="|LANG_EXTRACT|"></td></tr>
</form>
</table>
</div> <!-- fin table -->
</div> <!-- fin content -->
<!-- fin contenido bloque -->
</div> <!-- fin padding -->
</div> <!-- fin bloque -->
<!-- inicio bloque -->
<div class="bcp-bloque">
<div class="bcpb-padding">
<!-- inicio contenido bloque -->
<!-- generales -->
<div class="bcpbp-generales">
<!-- texto -->
<div class="bcpbpt-texto">
<p>|LANG_CONTENTS| |path|</p>
</div><!-- fin texto -->
</div><!-- fin generales -->
<div class="bcpbp-resp">
<textarea cols=100 rows=20 wrap=off readonly>|OUTPUT|</textarea>
</div>
<!-- fin contenido bloque -->
</div> <!-- fin padding -->
</div> <!-- fin bloque -->
|HTM_USER_BOTTOM|
| {
"content_hash": "b2559b35ed3ed12cb3eb17669024a47b",
"timestamp": "",
"source": "github",
"line_count": 67,
"max_line_length": 163,
"avg_line_length": 28.238805970149254,
"alnum_prop": 0.5533826638477801,
"repo_name": "cake654326/einDa-skin",
"id": "48390a536a74238dbcf0384f560cc11107a20640",
"size": "1892",
"binary": false,
"copies": "7",
"ref": "refs/heads/master",
"path": "user/filemanager/extract.html",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "HTML",
"bytes": "686960"
},
{
"name": "JavaScript",
"bytes": "8079"
},
{
"name": "Shell",
"bytes": "1445"
}
],
"symlink_target": ""
} |
NTSTATUS nativeIde_Constructor(__in PPCI_COMMON_EXTENSION Ext,__in PPCI_INTERFACE PciIntrf,__in PVOID Data,__in USHORT Version,__in USHORT Size,__in PINTERFACE Interface);
//
// initializer
//
NTSTATUS nativeIde_Initializer(__in PPCI_ARBITER_INSTANCE Instance);
//
// reference
//
VOID nativeIde_Reference(__in PPCI_PDO_EXTENSION PdoExt);
//
// dereference
//
VOID nativeIde_Dereference(__in PPCI_PDO_EXTENSION PdoExt);
//
// interrupt control
//
VOID nativeIde_InterruptControl(__in PPCI_PDO_EXTENSION PdoExt,__in BOOLEAN Enable); | {
"content_hash": "bbc1568fcb0125963b98bb8195d66c32",
"timestamp": "",
"source": "github",
"line_count": 21,
"max_line_length": 171,
"avg_line_length": 25.476190476190474,
"alnum_prop": 0.7476635514018691,
"repo_name": "godspeed1989/WDUtils",
"id": "03996db55ff3237c67dff2057b373fa87c7e4873",
"size": "818",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "tiamoPCI/source/pci.ideintrf.h",
"mode": "33188",
"license": "bsd-2-clause",
"language": [
{
"name": "C",
"bytes": "822538"
},
{
"name": "C++",
"bytes": "683300"
},
{
"name": "Makefile",
"bytes": "6639"
},
{
"name": "Objective-C",
"bytes": "37899"
},
{
"name": "Shell",
"bytes": "81"
}
],
"symlink_target": ""
} |
//
// ___FILENAME___
// ___PROJECTNAME___
//
// Created by ___FULLUSERNAME___ on ___DATE___.
//___COPYRIGHT___
//
#import "___FILEBASENAME___.h"
@implementation ___FILEBASENAMEASIDENTIFIER___
- (void)awakeFromNib
{
[super awakeFromNib];
// Additional configuration
}
/*
// Only override drawRect: if you perform custom drawing.
// An empty implementation adversely affects performance during animation.
- (void)drawRect:(CGRect)rect {
// Drawing code
}
*/
@end
| {
"content_hash": "c40780ac1c0e664f9c1158c6a2f5a31e",
"timestamp": "",
"source": "github",
"line_count": 28,
"max_line_length": 74,
"avg_line_length": 17.214285714285715,
"alnum_prop": 0.6473029045643154,
"repo_name": "samdods/Stencil",
"id": "ef70d13ab50b096a284cc355ffe78c5c5b90e653",
"size": "482",
"binary": false,
"copies": "3",
"ref": "refs/heads/master",
"path": "StencilPlugin/Resources/Built-in Templates/Stencil/UIView.xctemplate/UIViewObjective-C/___FILEBASENAME___.m",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "Objective-C",
"bytes": "57516"
}
],
"symlink_target": ""
} |
export default {
setupFiles: ["./auto"],
testMatch: ["<rootDir>/test/jest.js"],
};
| {
"content_hash": "0bde2c0e907814e4b6e50cf3a52aba80",
"timestamp": "",
"source": "github",
"line_count": 4,
"max_line_length": 42,
"avg_line_length": 22.75,
"alnum_prop": 0.5824175824175825,
"repo_name": "dumbmatter/fakeIndexedDB",
"id": "7f1da0201f7cf2f9a5b9b941c39832a592930c13",
"size": "91",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "jest.config.js",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "HTML",
"bytes": "761271"
},
{
"name": "JavaScript",
"bytes": "3534850"
},
{
"name": "Shell",
"bytes": "59"
},
{
"name": "TypeScript",
"bytes": "177316"
}
],
"symlink_target": ""
} |
using Autofac;
using Microsoft.Bot.Builder.Dialogs;
using Microsoft.Bot.Builder.Dialogs.Internals;
using Microsoft.Bot.Connector;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Newtonsoft.Json;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
namespace Microsoft.Bot.Builder.Tests
{
class Script : DialogTestBase
{
public static async Task RecordScript(ILifetimeScope container,
bool proactive,
StreamWriter stream,
Func<string> extraInfo,
params string[] inputs)
{
var toBot = MakeTestMessage();
using (var scope = DialogModule.BeginLifetimeScope(container, toBot))
{
var task = scope.Resolve<IPostToBot>();
var queue = scope.Resolve<Queue<IMessageActivity>>();
Action drain = () =>
{
stream.WriteLine($"{queue.Count()}");
while (queue.Count > 0)
{
var toUser = queue.Dequeue();
if (!string.IsNullOrEmpty(toUser.Text))
{
stream.WriteLine($"ToUserText:{JsonConvert.SerializeObject(toUser.Text)}");
}
else
{
stream.WriteLine($"ToUserButtons:{JsonConvert.SerializeObject(toUser.Attachments)}");
}
}
};
string result = null;
var root = scope.Resolve<IDialog<object>>().Do(async (context, value) =>
result = JsonConvert.SerializeObject(await value));
if (proactive)
{
var loop = root.Loop();
var data = scope.Resolve<IBotData>();
await data.LoadAsync(CancellationToken.None);
var stack = scope.Resolve<IDialogTask>();
stack.Call(loop, null);
await stack.PollAsync(CancellationToken.None);
drain();
}
else
{
var builder = new ContainerBuilder();
builder
.RegisterInstance(root)
.AsSelf()
.As<IDialog<object>>();
builder.Update((IContainer)container);
}
foreach (var input in inputs)
{
stream.WriteLine($"FromUser:{JsonConvert.SerializeObject(input)}");
toBot.Text = input;
try
{
await task.PostAsync(toBot, CancellationToken.None);
drain();
if (extraInfo != null)
{
var extra = extraInfo();
stream.WriteLine(extra);
}
}
catch (Exception e)
{
stream.WriteLine($"Exception:{e.Message}");
}
}
if (result != null)
{
stream.WriteLine($"Result: {result}");
}
}
}
public static string ReadLine(StreamReader stream, out string label)
{
string line = stream.ReadLine();
label = null;
if (line != null)
{
int pos = line.IndexOf(':');
if (pos != -1)
{
label = line.Substring(0, pos);
line = line.Substring(pos + 1);
}
}
return line;
}
public static async Task VerifyScript(ILifetimeScope container, bool proactive, StreamReader stream, Action<string> extraCheck, string[] expected)
{
var toBot = DialogTestBase.MakeTestMessage();
using (var scope = DialogModule.BeginLifetimeScope(container, toBot))
{
var task = scope.Resolve<IPostToBot>();
var queue = scope.Resolve<Queue<IMessageActivity>>();
string input, label;
Action check = () =>
{
var count = int.Parse(stream.ReadLine());
Assert.AreEqual(count, queue.Count);
for (var i = 0; i < count; ++i)
{
var toUser = queue.Dequeue();
var expectedOut = ReadLine(stream, out label);
if (label == "ToUserText")
{
Assert.AreEqual(expectedOut, JsonConvert.SerializeObject(toUser.Text));
}
else
{
Assert.AreEqual(expectedOut, JsonConvert.SerializeObject(toUser.Attachments));
}
}
extraCheck?.Invoke(ReadLine(stream, out label));
};
string result = null;
var root = scope.Resolve<IDialog<object>>().Do(async (context, value) => result = JsonConvert.SerializeObject(await value));
if (proactive)
{
var loop = root.Loop();
var data = scope.Resolve<IBotData>();
await data.LoadAsync(CancellationToken.None);
var stack = scope.Resolve<IDialogTask>();
stack.Call(loop, null);
await stack.PollAsync(CancellationToken.None);
check();
}
else
{
var builder = new ContainerBuilder();
builder
.RegisterInstance(root)
.AsSelf()
.As<IDialog<object>>();
builder.Update((IContainer)container);
}
int current = 0;
while ((input = ReadLine(stream, out label)) != null)
{
if (input.StartsWith("\""))
{
input = input.Substring(1, input.Length - 2);
Assert.IsTrue(current < expected.Length && input == expected[current++]);
toBot.Text = input;
try
{
await task.PostAsync(toBot, CancellationToken.None);
check();
}
catch (Exception e)
{
Assert.AreEqual(ReadLine(stream, out label), e.Message);
}
}
else if (input.StartsWith("Result:"))
{
Assert.AreEqual(input.Substring(7), result);
}
}
}
}
public static async Task RecordDialogScript<T>(string filePath, IDialog<T> dialog, bool proactive, params string[] inputs)
{
using (var stream = new StreamWriter(filePath))
using (var container = Build(Options.ResolveDialogFromContainer | Options.Reflection))
{
var builder = new ContainerBuilder();
builder
.RegisterInstance(dialog)
.AsSelf()
.As<IDialog<object>>();
builder.Update(container);
await RecordScript(container, proactive, stream, null, inputs);
}
}
public static string NewScriptPathFor(string pathScriptOld)
{
var pathScriptNew = Path.Combine
(
Path.GetDirectoryName(pathScriptOld),
Path.GetFileNameWithoutExtension(pathScriptOld) + "-new" + Path.GetExtension(pathScriptOld)
);
return pathScriptNew;
}
public static async Task VerifyDialogScript<T>(string filePath, IDialog<T> dialog, bool proactive, params string[] inputs)
{
var newPath = NewScriptPathFor(filePath);
File.Delete(newPath);
try
{
using (var stream = new StreamReader(filePath))
using (var container = Build(Options.ResolveDialogFromContainer | Options.Reflection))
{
var builder = new ContainerBuilder();
builder
.RegisterInstance(dialog)
.AsSelf()
.As<IDialog<object>>();
builder.Update(container);
await VerifyScript(container, proactive, stream, null, inputs);
}
}
catch (Exception)
{
// There was an error, so record new script and pass on error
await RecordDialogScript(newPath, dialog, proactive, inputs);
throw;
}
}
}
}
| {
"content_hash": "5400aa8564f28c36a277f3f32d397715",
"timestamp": "",
"source": "github",
"line_count": 235,
"max_line_length": 154,
"avg_line_length": 40.91063829787234,
"alnum_prop": 0.43821510297482835,
"repo_name": "xiangyan99/BotBuilder",
"id": "22bf427ee175857e7839720aecb6594c12cc817b",
"size": "9616",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "CSharp/Tests/Microsoft.Bot.Builder.Tests/Script.cs",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "ASP",
"bytes": "1938"
},
{
"name": "Batchfile",
"bytes": "6976"
},
{
"name": "C#",
"bytes": "2276903"
},
{
"name": "CSS",
"bytes": "142330"
},
{
"name": "HTML",
"bytes": "78994"
},
{
"name": "JavaScript",
"bytes": "520043"
},
{
"name": "TypeScript",
"bytes": "426976"
}
],
"symlink_target": ""
} |
namespace FluentValidation.Resources {
using System;
using System.Reflection;
using Internal;
public class ResourceAccessor {
public Func<string> Accessor { get; set; }
public Type ResourceType { get; set; }
public string ResourceName { get; set; }
}
/// <summary>
/// Builds a delegate for retrieving a localised resource from a resource type and property name.
/// </summary>
public interface IResourceAccessorBuilder {
/// <summary>
/// Gets a function that can be used to retrieve a message from a resource type and resource name.
/// </summary>
ResourceAccessor GetResourceAccessor(Type resourceType, string resourceName);
}
/// <summary>
/// Builds a delegate for retrieving a localised resource from a resource type and property name.
/// </summary>
public class StaticResourceAccessorBuilder : IResourceAccessorBuilder {
/// <summary>
/// Builds a function used to retrieve the resource.
/// </summary>
public virtual ResourceAccessor GetResourceAccessor(Type resourceType, string resourceName) {
var property = GetResourceProperty(ref resourceType, ref resourceName);
if (property == null) {
throw new InvalidOperationException(string.Format("Could not find a property named '{0}' on type '{1}'.", resourceName, resourceType));
}
if (property.PropertyType != typeof(string)) {
throw new InvalidOperationException(string.Format("Property '{0}' on type '{1}' does not return a string", resourceName, resourceType));
}
var accessor = property.CreateGetter();
return new ResourceAccessor {
Accessor = accessor,
ResourceName = resourceName,
ResourceType = resourceType
};
}
/// <summary>
/// Gets the PropertyInfo for a resource.
/// ResourceType and ResourceName are ref parameters to allow derived types
/// to replace the type/name of the resource before the delegate is constructed.
/// </summary>
protected virtual PropertyInfo GetResourceProperty(ref Type resourceType, ref string resourceName) {
return resourceType.GetPublicStaticProperty(resourceName);
}
}
/// <summary>
/// Implementation of IResourceAccessorBuilder that can fall back to the default resource provider.
/// </summary>
public class FallbackAwareResourceAccessorBuilder : StaticResourceAccessorBuilder {
/// <summary>
/// Gets the PropertyInfo for a resource.
/// ResourceType and ResourceName are ref parameters to allow derived types
/// to replace the type/name of the resource before the delegate is constructed.
/// </summary>
protected override PropertyInfo GetResourceProperty(ref Type resourceType, ref string resourceName) {
// Rather than just using the specified resource type to find the resource accessor property
// we first look on the ResourceProviderType which gives our end user the ability
// to redirect error messages away from the default Messages class.
if (ValidatorOptions.ResourceProviderType != null) {
var property = ValidatorOptions.ResourceProviderType.GetPublicStaticProperty(resourceName);
if (property != null) {
// We found a matching property on the Resource Provider.
// In this case, as well as returning the PropertyInfo from this resource provider,
// we also replace the resource type with the resource provider (remember this is a ref parameter)
resourceType = ValidatorOptions.ResourceProviderType;
return property;
}
}
return base.GetResourceProperty(ref resourceType, ref resourceName);
}
}
} | {
"content_hash": "4f20b84b7d552b0f0d7096a57c02adaa",
"timestamp": "",
"source": "github",
"line_count": 91,
"max_line_length": 140,
"avg_line_length": 40.0989010989011,
"alnum_prop": 0.7119758838037819,
"repo_name": "ruisebastiao/FluentValidation",
"id": "d15be2618ef1fd962f9e49c739153a4b34ac7702",
"size": "3649",
"binary": false,
"copies": "2",
"ref": "refs/heads/master",
"path": "src/FluentValidation/Resources/IResourceAccessorBuilder.cs",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Batchfile",
"bytes": "1193"
},
{
"name": "C#",
"bytes": "836269"
},
{
"name": "PowerShell",
"bytes": "1462"
},
{
"name": "Smalltalk",
"bytes": "2894"
}
],
"symlink_target": ""
} |
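The FallbackAwareResourceAccessorBuilder above consults the user-configurable `ValidatorOptions.ResourceProviderType` first and only falls back to the default resource type when the provider does not define the named resource. The lookup order can be sketched generically as follows (a hypothetical model using plain objects; the real API resolves static string properties via reflection):

```javascript
function resolveResource(resourceName, defaultType, providerType) {
  // The override provider wins when it defines the resource; note that the
  // effective resource type is replaced too, as with the C# ref parameter.
  if (providerType && resourceName in providerType) {
    return { resourceType: providerType, accessor: () => providerType[resourceName] };
  }
  if (resourceName in defaultType) {
    return { resourceType: defaultType, accessor: () => defaultType[resourceName] };
  }
  throw new Error(`Could not find a property named '${resourceName}'.`);
}
```

With `defaults = { NotEmpty: 'must not be empty' }` and `provider = { NotEmpty: 'custom message' }`, the provider's message is returned; with no provider, the default message is used.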
<?xml version="1.0" encoding="ISO-8859-1"?>
<!--
Copyright 2016 Goldman Sachs.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
<layout pageTitle="Dulic Bega Mowere" pageBuilder="com.gs.fw.kaza.rukele.so2004.vo.tudewax.SO2004QuxepowinojEdiZizugexog.detegotAyiluwece" layoutId="vebidIdib" showSelection="false"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="../../../xsd/layout.xsd">
<section>
<row>
<table name="Fizumurel" label="Fizumurel" rowSelection="single" rowsPerPage="40000">
<drilldown>
<button name="drilldown_button" nextLayoutName="rukele/so2004/so2004LivoqITeyucOhufAhaba.xml"
pageBuilder="com.gs.fw.kaza.rukele.so2004.vo.tudewax.SO2004KotawIBobivOwurAceyaNapaqoq" refreshPage="true"/>
</drilldown>
</table>
</row>
</section>
<section>
<row>
<uiElement name="hepafiq" label="VebidIdib By:" renderAs="duallistbox" attribute="taxobifuNeyo"/>
</row>
</section>
</layout>
| {
"content_hash": "718df3e9a539b987256172edb6ad78cc",
"timestamp": "",
"source": "github",
"line_count": 34,
"max_line_length": 181,
"avg_line_length": 47.14705882352941,
"alnum_prop": 0.6768558951965066,
"repo_name": "motlin/gs-xsd2bean",
"id": "0f570fa3834279cbc5e89b181a3e49db1a2d4fec",
"size": "1603",
"binary": false,
"copies": "3",
"ref": "refs/heads/master",
"path": "src/test/testdata/glui/fr2004SchedADrillDownPanel.xml",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Batchfile",
"bytes": "4478"
},
{
"name": "Java",
"bytes": "478865"
},
{
"name": "Shell",
"bytes": "4940"
}
],
"symlink_target": ""
} |
'use strict';
/* jshint node: true */
/* jshint esnext: true */
/* global describe, beforeEach, afterAll, spyOn, it, expect, fail, jasmine */
const timeCalculator = require('../../domain/timeCalculator');
const _ = require('lodash');
const moment = require('moment');
describe('timeCalculator service', () => {
const now = moment();
const startTimes = [ now.unix() ];
const finishTime = _.cloneDeep(now).add(31, 'm').add(29, 's').unix();
it('returns relative time', () => {
let result = timeCalculator.relativeTime(startTimes, finishTime, 0);
expect(result[0]).toBe(0);
expect(result[1]).toBe(31);
expect(result[2]).toBe(29);
});
it('returns relative seconds', () => {
let result = timeCalculator.relativeSeconds(startTimes, finishTime, 0);
expect(result).toBe(1889);
});
it('returns a formated string', () => {
let result = timeCalculator.timeString('14587');
expect(result).toBe('04:03:07');
});
});
| {
"content_hash": "c37c95d5954fd004ca1fe5480ab1e666",
"timestamp": "",
"source": "github",
"line_count": 29,
"max_line_length": 77,
"avg_line_length": 33.89655172413793,
"alnum_prop": 0.6215666327568667,
"repo_name": "lplotni/pace",
"id": "689c1e2342fc1664b4fdf08ba549c7295c8b2025",
"size": "983",
"binary": false,
"copies": "2",
"ref": "refs/heads/master",
"path": "spec/domain/timeCalculatorSpec.js",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "CSS",
"bytes": "57893"
},
{
"name": "Dockerfile",
"bytes": "502"
},
{
"name": "HCL",
"bytes": "22088"
},
{
"name": "HTML",
"bytes": "32759"
},
{
"name": "JavaScript",
"bytes": "928886"
},
{
"name": "Makefile",
"bytes": "148"
},
{
"name": "Shell",
"bytes": "2945"
},
{
"name": "Smarty",
"bytes": "2315"
}
],
"symlink_target": ""
} |
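The assertions in the spec above fully determine the arithmetic: `relativeSeconds` is the difference between the finish timestamp and the indexed start timestamp, `relativeTime` splits that difference into [hours, minutes, seconds], and `timeString` zero-pads the same split. A hypothetical implementation consistent with those expectations (the real module lives in ../../domain/timeCalculator and may differ in detail):

```javascript
function relativeSeconds(startTimes, finishTime, index) {
  return finishTime - startTimes[index];
}

function relativeTime(startTimes, finishTime, index) {
  const total = relativeSeconds(startTimes, finishTime, index);
  // [hours, minutes, seconds]
  return [Math.floor(total / 3600), Math.floor((total % 3600) / 60), total % 60];
}

function timeString(seconds) {
  const s = parseInt(seconds, 10);
  const pad = (n) => String(n).padStart(2, '0');
  return [Math.floor(s / 3600), Math.floor((s % 3600) / 60), s % 60].map(pad).join(':');
}
```

With these definitions, `relativeTime([0], 1889, 0)` yields `[0, 31, 29]` and `timeString('14587')` yields `'04:03:07'`, matching the spec's expectations.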
<?php
namespace App\Http\Controllers;
use Illuminate\Http\Request;
use \App\Http\Controllers\FrontController;
class HomeController extends FrontController
{
/**
* Show the application dashboard.
*
* @return \Illuminate\Http\Response
*/
public function __construct()
{
parent::__construct();
}
public function index()
{
$data = [
'numeroClients' => \App\Nay\Model\EntitiesModel::whereIn('entity_category', [\App\Nay\Model\EntityCategory::Client, \App\Nay\Model\EntityCategory::ClientAndCompany ])->count(),
'numeroProducts' => \App\Nay\Model\ProductsModel::count(),
'numeroCategories' => \App\Nay\Model\CategoriesModel::count(),
'numeroBrands' => \App\Nay\Model\BrandsModel::count(),
'numeroSales' => \App\Nay\Model\SalesModel::count(),
'numeroRequests' => 0,//\App\Nay\Model\RequestsModel::count(),
'numeroUsers' => \App\User::count(),
'numeroProviders' => \App\Nay\Model\EntitiesModel::whereIn('entity_category', [ \App\Nay\Model\EntityCategory::Company, \App\Nay\Model\EntityCategory::ClientAndCompany ])->count(),
'numeroShippingCompany' => \App\Nay\Model\ShippingCompanyModel::count(),
'numeroFinancials' => \App\Nay\Model\FinancialsModel::count(),
];
return view('home')->with($data);
}
}
| {
"content_hash": "6e6c20cbf8f05b57bf0754e75aa19a18",
"timestamp": "",
"source": "github",
"line_count": 40,
"max_line_length": 207,
"avg_line_length": 38.95,
"alnum_prop": 0.5641848523748395,
"repo_name": "alphabraga/nay",
"id": "03639025c676eef5fbfc1aaf6e3eb4a484dbad83",
"size": "1558",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "app/Http/Controllers/HomeController.php",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "CSS",
"bytes": "20584"
},
{
"name": "HTML",
"bytes": "249899"
},
{
"name": "JavaScript",
"bytes": "25119"
},
{
"name": "PHP",
"bytes": "277551"
},
{
"name": "Vue",
"bytes": "563"
}
],
"symlink_target": ""
} |
'use strict';
let angular = require('angular');
module.exports = angular.module('spinnaker.serverGroup.configure.cf.configuration.service', [
require('../../../core/account/account.service.js'),
require('../../../core/securityGroup/securityGroup.read.service.js'),
require('../../../core/cache/cacheInitializer.js'),
require('../../image/image.reader.js'),
require('../../instance/cfInstanceTypeService.js'),
])
.factory('cfServerGroupConfigurationService', function(cfImageReader, accountService, securityGroupReader,
cfInstanceTypeService, cacheInitializer,
$q, _) {
function configureCommand(command) {
command.image = command.viewState.imageId;
return $q.all({
regionsKeyedByAccount: accountService.getRegionsKeyedByAccount('cf'),
securityGroups: securityGroupReader.getAllSecurityGroups(),
instanceTypes: cfInstanceTypeService.getAllTypesByRegion(),
images: cfImageReader.findImages({provider: 'cf'}),
}).then(function(backingData) {
var securityGroupReloader = $q.when(null);
backingData.accounts = _.keys(backingData.regionsKeyedByAccount);
backingData.filtered = {};
command.backingData = backingData;
configureImages(command);
if (command.securityGroups && command.securityGroups.length) {
// Verify all security groups are accounted for; otherwise, try refreshing security groups cache.
var securityGroupIds = _.pluck(getSecurityGroups(command), 'id');
if (_.intersection(command.securityGroups, securityGroupIds).length < command.securityGroups.length) {
securityGroupReloader = refreshSecurityGroups(command, true);
}
}
return $q.all([securityGroupReloader]).then(function() {
attachEventHandlers(command);
});
});
}
function configureInstanceTypes(command) {
var result = { dirty: {} };
if (command.region) {
var filtered = cfInstanceTypeService.getAvailableTypesForRegions(command.backingData.instanceTypes, [command.region]);
if (command.instanceType && filtered.indexOf(command.instanceType) === -1) {
command.instanceType = null;
result.dirty.instanceType = true;
}
command.backingData.filtered.instanceTypes = filtered;
} else {
command.backingData.filtered.instanceTypes = [];
}
return result;
}
function configureImages(command) {
var result = { dirty: {} };
if (command.viewState.disableImageSelection) {
return result;
}
if (command.credentials !== command.viewState.lastImageAccount) {
command.viewState.lastImageAccount = command.credentials;
var filtered = extractFilteredImageNames(command);
command.backingData.filtered.imageNames = filtered;
if (filtered.indexOf(command.image) === -1) {
command.image = null;
result.dirty.imageName = true;
}
}
return result;
}
function configureZones(command) {
command.backingData.filtered.zones =
command.backingData.regionsKeyedByAccount[command.credentials].regions[command.region];
}
function extractFilteredImageNames(command) {
return _(command.backingData.images)
.filter({account: command.credentials})
.pluck('imageName')
.flatten(true)
.unique()
.valueOf();
}
function getSecurityGroups(command) {
var newSecurityGroups = command.backingData.securityGroups[command.credentials] || { cf: {}};
newSecurityGroups = _.filter(newSecurityGroups.cf.global, function(securityGroup) {
return securityGroup.network === command.network;
});
return _(newSecurityGroups)
.sortBy('name')
.valueOf();
}
function configureSecurityGroupOptions(command) {
var results = { dirty: {} };
var currentOptions = command.backingData.filtered.securityGroups;
var newSecurityGroups = getSecurityGroups(command);
if (currentOptions && command.securityGroups) {
// not initializing - we are actually changing groups
var currentGroupNames = command.securityGroups.map(function(groupId) {
var match = _(currentOptions).find({id: groupId});
return match ? match.name : groupId;
});
var matchedGroups = command.securityGroups.map(function(groupId) {
var securityGroup = _(currentOptions).find({id: groupId}) ||
_(currentOptions).find({name: groupId});
return securityGroup ? securityGroup.name : null;
}).map(function(groupName) {
return _(newSecurityGroups).find({name: groupName});
}).filter(function(group) {
return group;
});
var matchedGroupNames = _.pluck(matchedGroups, 'name');
var removed = _.xor(currentGroupNames, matchedGroupNames);
command.securityGroups = _.pluck(matchedGroups, 'id');
if (removed.length) {
results.dirty.securityGroups = removed;
}
}
// Only include explicit security group options in the pulldown list.
command.backingData.filtered.securityGroups = _.filter(newSecurityGroups, function(securityGroup) {
return !_.isEmpty(securityGroup.targetTags);
});
// Identify implicit security groups so they can be optionally listed in a read-only state.
command.implicitSecurityGroups = _.filter(newSecurityGroups, function(securityGroup) {
return _.isEmpty(securityGroup.targetTags);
});
// Only include explicitly-selected security groups in the body of the command.
command.securityGroups = _.difference(command.securityGroups, _.pluck(command.implicitSecurityGroups, 'id'));
return results;
}
function refreshSecurityGroups(command, skipCommandReconfiguration) {
return cacheInitializer.refreshCache('securityGroups').then(function() {
return securityGroupReader.getAllSecurityGroups().then(function(securityGroups) {
command.backingData.securityGroups = securityGroups;
if (!skipCommandReconfiguration) {
configureSecurityGroupOptions(command);
}
});
});
}
function refreshInstanceTypes(command) {
return cacheInitializer.refreshCache('instanceTypes').then(function() {
return cfInstanceTypeService.getAllTypesByRegion().then(function(instanceTypes) {
command.backingData.instanceTypes = instanceTypes;
configureInstanceTypes(command);
});
});
}
function attachEventHandlers(command) {
command.regionChanged = function regionChanged() {
var result = { dirty: {} };
var filteredData = command.backingData.filtered;
if (command.region) {
angular.extend(result.dirty, configureInstanceTypes(command).dirty);
configureZones(command);
angular.extend(result.dirty, configureImages(command).dirty);
} else {
filteredData.zones = null;
}
command.viewState.dirty = command.viewState.dirty || {};
angular.extend(command.viewState.dirty, result.dirty);
return result;
};
command.credentialsChanged = function credentialsChanged() {
var result = { dirty: {} };
var backingData = command.backingData;
if (command.credentials) {
backingData.filtered.regions = Object.keys(backingData.regionsKeyedByAccount[command.credentials].regions);
if (backingData.filtered.regions.indexOf(command.region) === -1) {
command.region = null;
result.dirty.region = true;
} else {
angular.extend(result.dirty, command.regionChanged().dirty);
}
} else {
command.region = null;
}
command.viewState.dirty = command.viewState.dirty || {};
angular.extend(command.viewState.dirty, result.dirty);
return result;
};
command.networkChanged = function networkChanged() {
var result = { dirty: {} };
angular.extend(result.dirty, configureSecurityGroupOptions(command).dirty);
command.viewState.dirty = command.viewState.dirty || {};
angular.extend(command.viewState.dirty, result.dirty);
return result;
};
}
return {
configureCommand: configureCommand,
configureInstanceTypes: configureInstanceTypes,
configureImages: configureImages,
configureZones: configureZones,
refreshSecurityGroups: refreshSecurityGroups,
refreshInstanceTypes: refreshInstanceTypes,
};
}).name;
| {
"content_hash": "93842c32d6c5327bb816b37cb38fac2f",
"timestamp": "",
"source": "github",
"line_count": 230,
"max_line_length": 126,
"avg_line_length": 38.29565217391304,
"alnum_prop": 0.6491825613079019,
"repo_name": "zanthrash/deck-old",
"id": "a989498637ec36a85c59d0d0a8a04dd5b012c1cf",
"size": "8808",
"binary": false,
"copies": "3",
"ref": "refs/heads/master",
"path": "app/scripts/modules/cloudfoundry/serverGroup/configure/serverGroupConfiguration.service.js",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "ApacheConf",
"bytes": "24139"
},
{
"name": "CSS",
"bytes": "99587"
},
{
"name": "HTML",
"bytes": "787246"
},
{
"name": "JavaScript",
"bytes": "1836403"
}
],
"symlink_target": ""
} |
ACCEPTED
#### According to
Interim Register of Marine and Nonmarine Genera
#### Published in
null
#### Original name
null
### Remarks
null | {
"content_hash": "0467fe79583dd5d91bd1170f5b6ea5ee",
"timestamp": "",
"source": "github",
"line_count": 13,
"max_line_length": 47,
"avg_line_length": 10.923076923076923,
"alnum_prop": 0.7183098591549296,
"repo_name": "mdoering/backbone",
"id": "27db46f01511f79d4a4185508635f11f4fea88a8",
"size": "202",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "life/Protozoa/Acritarcha/Goniosphaeridium/Goniosphaeridium igneum/README.md",
"mode": "33188",
"license": "apache-2.0",
"language": [],
"symlink_target": ""
} |
<?php
namespace tperroin\BlogSioBundle;
use Symfony\Component\HttpKernel\Bundle\Bundle;
class tperroinBlogSioBundle extends Bundle
{
}
| {
"content_hash": "e21793fe666ecf382dbdbcb9f0437d73",
"timestamp": "",
"source": "github",
"line_count": 10,
"max_line_length": 47,
"avg_line_length": 14.3,
"alnum_prop": 0.7902097902097902,
"repo_name": "tperroin/blogsio",
"id": "56a514deb4b9d233232d7d095039ebef7360fc4d",
"size": "143",
"binary": false,
"copies": "1",
"ref": "refs/heads/master",
"path": "src/tperroin/BlogSioBundle/tperroinBlogSioBundle.php",
"mode": "33261",
"license": "mit",
"language": [
{
"name": "JavaScript",
"bytes": "398404"
},
{
"name": "PHP",
"bytes": "82781"
}
],
"symlink_target": ""
} |
package org.jboss.arquillian.test.spi.annotation;
import static java.lang.annotation.RetentionPolicy.RUNTIME;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;
import org.jboss.arquillian.core.api.annotation.Scope;
/**
* TestScoped
*
* @author <a href="mailto:[email protected]">Aslak Knutsen</a>
* @version $Revision: $
*/
@Scope
@Documented
@Retention(RUNTIME)
@Target(ElementType.FIELD)
public @interface TestScoped {
}
| {
"content_hash": "b0ac1f96c3d6e654f7abd51740688f65",
"timestamp": "",
"source": "github",
"line_count": 26,
"max_line_length": 62,
"avg_line_length": 20.76923076923077,
"alnum_prop": 0.7759259259259259,
"repo_name": "topicusonderwijs/arquillian-core",
"id": "8b860cb1f47a99240ff07f7b2874fb1f582eefd8",
"size": "1324",
"binary": false,
"copies": "3",
"ref": "refs/heads/master",
"path": "test/spi/src/main/java/org/jboss/arquillian/test/spi/annotation/TestScoped.java",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "Java",
"bytes": "1821775"
}
],
"symlink_target": ""
} |
// Licensed to the .NET Foundation under one or more agreements.
// The .NET Foundation licenses this file to you under the MIT license.
// See the LICENSE file in the project root for more information.
using System;
using System.IO;
using System.Xml;
using Cake.Core;
using Cake.Core.Annotations;
using Cake.Core.Diagnostics;
using Cake.Core.IO;
namespace Cake.Common.Xml
{
/// <summary>
/// Contains functionality related to XML XPath queries.
/// </summary>
[CakeAliasCategory("XML")]
public static class XmlPeekAliases
{
/// <summary>
/// Gets the value of a target node.
/// </summary>
/// <returns>The value found at the given XPath query.</returns>
/// <param name="context">The context.</param>
/// <param name="filePath">The target file.</param>
/// <param name="xpath">The xpath of the node to get.</param>
/// <example>
/// <code>
/// string autoFacVersion = XmlPeek("./src/Cake/packages.config", "/packages/package[@id='Autofac']/@version");
/// </code>
/// </example>
[CakeMethodAlias]
public static string XmlPeek(this ICakeContext context, FilePath filePath, string xpath)
{
return context.XmlPeek(filePath, xpath, new XmlPeekSettings());
}
/// <summary>
/// Get the value of a target node.
/// </summary>
/// <returns>The value found at the given XPath query.</returns>
/// <param name="context">The context.</param>
/// <param name="filePath">The target file.</param>
/// <param name="xpath">The xpath of the nodes to set.</param>
/// <param name="settings">Additional settings to tweak Xml Peek behavior.</param>
/// <example>
/// <code>
/// <para>XML document:</para>
/// <![CDATA[
/// <?xml version="1.0" encoding="UTF-8"?>
/// <pastery xmlns = "https://cakebuild.net/pastery" >
/// <cake price="1.62" />
/// </pastery>
/// ]]>
/// </code>
/// <para>XmlPeek usage:</para>
/// <code>
/// string version = XmlPeek("./pastry.xml", "/pastry:pastry/pastry:cake/@price",
/// new XmlPeekSettings {
/// Namespaces = new Dictionary<string, string> {{ "pastry", "https://cakebuild.net/pastry" }}
/// });
/// string unknown = XmlPeek("./pastry.xml", "/pastry:pastry/pastry:cake/@recipe",
/// new XmlPeekSettings {
/// Namespaces = new Dictionary<string, string> {{ "pastry", "https://cakebuild.net/pastry" }},
/// SuppressWarning = true
/// });
/// </code>
/// </example>
[CakeMethodAlias]
[System.Diagnostics.CodeAnalysis.SuppressMessage("Microsoft.Usage", "CA2202:Do not dispose objects multiple times")]
public static string XmlPeek(this ICakeContext context, FilePath filePath, string xpath, XmlPeekSettings settings)
{
if (context == null)
{
throw new ArgumentNullException(nameof(context));
}
if (filePath == null)
{
throw new ArgumentNullException(nameof(filePath));
}
if (settings == null)
{
throw new ArgumentNullException(nameof(settings));
}
var file = context.FileSystem.GetFile(filePath);
if (!file.Exists)
{
throw new FileNotFoundException("Source File not found.", file.Path.FullPath);
}
using (var fileStream = file.Open(FileMode.Open, FileAccess.Read, FileShare.None))
using (var xmlReader = XmlReader.Create(fileStream, GetXmlReaderSettings(settings)))
{
var xmlValue = XmlPeek(xmlReader, xpath, settings);
if (xmlValue == null && !settings.SuppressWarning)
{
context.Log.Warning("Warning: Failed to find node matching the XPath '{0}'", xpath);
}
return xmlValue;
}
}
/// <summary>
/// Gets the value of a target node.
/// </summary>
/// <returns>The value found at the given XPath query (or the first, if multiple eligible nodes are found).</returns>
/// <param name="source">The source xml to transform.</param>
/// <param name="xpath">The xpath of the nodes to set.</param>
/// <param name="settings">Additional settings to tweak Xml Peek behavior.</param>
private static string XmlPeek(XmlReader source, string xpath, XmlPeekSettings settings)
{
if (source == null)
{
throw new ArgumentNullException(nameof(source));
}
if (string.IsNullOrWhiteSpace(xpath))
{
throw new ArgumentNullException(nameof(xpath));
}
if (settings == null)
{
throw new ArgumentNullException(nameof(settings));
}
var document = new XmlDocument();
document.PreserveWhitespace = settings.PreserveWhitespace;
document.Load(source);
var navigator = document.CreateNavigator();
var namespaceManager = new XmlNamespaceManager(navigator.NameTable);
foreach (var xmlNamespace in settings.Namespaces)
{
namespaceManager.AddNamespace(xmlNamespace.Key /* Prefix */, xmlNamespace.Value /* URI */);
}
var node = navigator.SelectSingleNode(xpath, namespaceManager);
return node?.Value;
}
/// <summary>
/// Gets a XmlReaderSettings from a XmlPeekSettings
/// </summary>
/// <returns>The xml reader settings.</returns>
/// <param name="settings">Additional settings to tweak Xml Peek behavior.</param>
private static XmlReaderSettings GetXmlReaderSettings(XmlPeekSettings settings)
{
var xmlReaderSettings = new XmlReaderSettings();
xmlReaderSettings.DtdProcessing = (DtdProcessing)settings.DtdProcessing;
return xmlReaderSettings;
}
}
} | {
"content_hash": "538af4a624ad1f8248f439db719a5533",
"timestamp": "",
"source": "github",
"line_count": 159,
"max_line_length": 125,
"avg_line_length": 39.57861635220126,
"alnum_prop": 0.5687271571587478,
"repo_name": "ferventcoder/cake",
"id": "5185348be5535b7a35d75f331a081e59177dbdc8",
"size": "6295",
"binary": false,
"copies": "3",
"ref": "refs/heads/develop",
"path": "src/Cake.Common/Xml/XmlPeekAliases.cs",
"mode": "33188",
"license": "mit",
"language": [
{
"name": "C#",
"bytes": "6117677"
},
{
"name": "PowerShell",
"bytes": "6199"
},
{
"name": "Shell",
"bytes": "3195"
}
],
"symlink_target": ""
} |
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="null" lang="null">
<head>
<meta http-equiv="content-type" content="text/html; charset=ISO-8859-1" /><title>MockRule xref</title>
<link type="text/css" rel="stylesheet" href="../../../stylesheet.css" />
</head>
<body>
<div id="overview"><a href="../../../../apidocs/net/sourceforge/pmd/MockRule.html">View Javadoc</a></div><pre>
<a name="1" href="#1">1</a> <strong>package</strong> net.sourceforge.pmd;
<a name="2" href="#2">2</a>
<a name="3" href="#3">3</a> <em>/**</em>
<a name="4" href="#4">4</a> <em> * This is a Rule implementation which can be used in scenarios where an actual</em>
<a name="5" href="#5">5</a> <em> * functional Rule is not needed. For example, during unit testing, or as</em>
<a name="6" href="#6">6</a> <em> * an editable surrogate used by IDE plugins.</em>
<a name="7" href="#7">7</a> <em> */</em>
<a name="8" href="#8">8</a> <strong>public</strong> <strong>class</strong> <a href="../../../net/sourceforge/pmd/MockRule.html">MockRule</a> <strong>extends</strong> <a href="../../../net/sourceforge/pmd/AbstractRule.html">AbstractRule</a> {
<a name="9" href="#9">9</a>
<a name="10" href="#10">10</a> <strong>public</strong> <a href="../../../net/sourceforge/pmd/MockRule.html">MockRule</a>() {
<a name="11" href="#11">11</a> <strong>super</strong>();
<a name="12" href="#12">12</a> }
<a name="13" href="#13">13</a>
<a name="14" href="#14">14</a> <strong>public</strong> <a href="../../../net/sourceforge/pmd/MockRule.html">MockRule</a>(String name, String description, String message, String ruleSetName, <strong>int</strong> priority) {
<a name="15" href="#15">15</a> <strong>this</strong>(name, description, message, ruleSetName);
<a name="16" href="#16">16</a> setPriority(priority);
<a name="17" href="#17">17</a> }
<a name="18" href="#18">18</a>
<a name="19" href="#19">19</a> <strong>public</strong> <a href="../../../net/sourceforge/pmd/MockRule.html">MockRule</a>(String name, String description, String message, String ruleSetName) {
<a name="20" href="#20">20</a> <strong>super</strong>();
<a name="21" href="#21">21</a> setName(name);
<a name="22" href="#22">22</a> setDescription(description);
<a name="23" href="#23">23</a> setMessage(message);
<a name="24" href="#24">24</a> setRuleSetName(ruleSetName);
<a name="25" href="#25">25</a> }
<a name="26" href="#26">26</a> }
</pre>
<hr/><div id="footer">This page was automatically generated by <a href="http://maven.apache.org/">Maven</a></div></body>
</html>
| {
"content_hash": "b1a489c63ea9dd02fa3263cf5a76d9c4",
"timestamp": "",
"source": "github",
"line_count": 39,
"max_line_length": 243,
"avg_line_length": 68.7948717948718,
"alnum_prop": 0.6261647409616101,
"repo_name": "pscadiz/pmd-4.2.6-gds",
"id": "32652947623e68f9806d84b1bee12ed1b2f1f83c",
"size": "2683",
"binary": false,
"copies": "3",
"ref": "refs/heads/master",
"path": "docs/xref/net/sourceforge/pmd/MockRule.html",
"mode": "33188",
"license": "bsd-3-clause",
"language": [
{
"name": "Batchfile",
"bytes": "3336"
},
{
"name": "CSS",
"bytes": "1241"
},
{
"name": "HTML",
"bytes": "440"
},
{
"name": "Java",
"bytes": "2602537"
},
{
"name": "JavaScript",
"bytes": "7987"
},
{
"name": "Ruby",
"bytes": "845"
},
{
"name": "Shell",
"bytes": "19634"
},
{
"name": "XSLT",
"bytes": "60577"
}
],
"symlink_target": ""
} |
#ifndef PRINT_HPP // assumed guard name: the matching opening guard was truncated in this excerpt
#define PRINT_HPP

#include <Eigen/Dense> // provides Eigen::MatrixXd

using Eigen::MatrixXd;

namespace wit {
void print_heat(MatrixXd &Temp);
void print_material(MatrixXd &Material);
}

#endif
| {
"content_hash": "5aafd1e036273da9f14f488416aceb72",
"timestamp": "",
"source": "github",
"line_count": 8,
"max_line_length": 40,
"avg_line_length": 15.5,
"alnum_prop": 0.75,
"repo_name": "hdeplaen/heattopology",
"id": "d4eeaa80f72b9c183c8897edaafe47d5844b370c",
"size": "231",
"binary": false,
"copies": "3",
"ref": "refs/heads/master",
"path": "src/print.hpp",
"mode": "33188",
"license": "apache-2.0",
"language": [
{
"name": "C",
"bytes": "137619"
},
{
"name": "C++",
"bytes": "22141526"
},
{
"name": "CMake",
"bytes": "41241"
},
{
"name": "Cuda",
"bytes": "358665"
},
{
"name": "Makefile",
"bytes": "2484"
}
],
"symlink_target": ""
} |