pkg_data <- new.env(parent = emptyenv())
#' Create a new web application
#'
#' @details
#' The typical workflow of creating a web application is:
#'
#' 1. Create a `webfakes_app` object with `new_app()`.
#' 1. Add middleware and/or routes to it.
#' 1. Start it with the `webfakes_app$listen()` method, or start it in
#' another process with [new_app_process()].
#' 1. Make queries to the web app.
#' 1. Stop it via `CTRL+C` / `ESC`, or, if it is running in another
#' process, with the `$stop()` method of [new_app_process()].
#'
#' A web application can be
#' * restarted,
#' * saved to disk,
#' * copied to another process using the callr package, or a similar way,
#' * embedded into a package,
#' * extended by simply adding new routes and/or middleware.
#'
#' The webfakes API is very much influenced by the
#' [express.js](http://expressjs.com/) project.
#'
#' ## Create web app objects
#'
#' ```r
#' new_app()
#' ```
#'
#' `new_app()` returns a `webfakes_app` object that has the methods listed
#' on this page.
#'
#' An app is an environment with S3 class `webfakes_app`.
#'
#' ## The handler stack
#'
#' An app has a stack of handlers. Each handler can be a route or
#' middleware. The differences between the two are:
#' * A route is bound to one or more paths on the web server. Middleware
#' is not (currently) bound to paths, but runs for all paths.
#' * A route is usually (but not always) the end of the handler stack for
#' a request. I.e. a route takes care of sending out the response to
#' the request. Middleware typically performs some action on the request
#' or the response, and then the next handler in the stack is invoked.
#'
#' ## Routes
#'
#' The following methods define routes. Each method corresponds to the
#' HTTP verb with the same name, except for `app$all()`, which creates a
#' route for all HTTP methods.
#'
#' ```r
#' app$all(path, ...)
#' app$delete(path, ...)
#' app$get(path, ...)
#' app$head(path, ...)
#' app$patch(path, ...)
#' app$post(path, ...)
#' app$put(path, ...)
#' ... (see list below)
#' ```
#'
#' * `path` is a path specification, see 'Path specification' below.
#' * `...` is one or more handler functions. These will be placed in the
#' handler stack, and called if they match an incoming HTTP request.
#' See 'Handler functions' below.
#'
#' webfakes also has methods for the less frequently used HTTP verbs:
#' `CONNECT`, `MKCOL`, `OPTIONS`, `PROPFIND`, `REPORT`. (The method
#' names are always in lowercase.)
#'
#' If a request is not handled by any routes (or handler functions in
#' general), then webfakes will send a simple HTTP 404 response.
#'
#' ## Middleware
#'
#' `app$use()` adds a middleware to the handler stack. A middleware is
#' a handler function, see 'Handler functions' below. webfakes comes with
#' middleware to perform common tasks:
#'
#' * [mw_cookie_parser()] parses `Cookie` headers.
#' * [mw_etag()] adds an `ETag` header to the response.
#' * [mw_json()] parses JSON request bodies.
#' * [mw_log()] logs each request to the standard output, or another connection.
#' * [mw_multipart()] parses multipart request bodies.
#' * [mw_range_parser()] parses `Range` headers.
#' * [mw_raw()] parses raw request bodies.
#' * [mw_static()] serves static files from a directory.
#' * [mw_text()] parses plain text request bodies.
#' * [mw_urlencoded()] parses URL encoded request bodies.
#'
#' ```r
#' app$use(..., .first = FALSE)
#' ```
#'
#' * `...` is a set of (middleware) handler functions. They are added to
#' the handler stack, and called for every HTTP request. (Unless an HTTP
#' response is created before reaching this point in the handler stack.)
#' * `.first`: set it to `TRUE` if you want to add the handler function
#' to the bottom of the stack.
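#'
#' For example, to make sure a logging middleware runs before every other
#' handler, it can be pushed to the front of the stack (a minimal sketch
#' using [mw_log()]):
#'
#' ```r
#' app$use(mw_log(), .first = TRUE)
#' ```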
#'
#' ## Handler functions
#'
#' A handler function is a route or middleware. A handler function is
#' called by webfakes with the incoming HTTP request and the outgoing
#' HTTP response objects (being built) as arguments. The handler function
#' may query and modify the members of the request and/or the response
#' object. If it returns the string `"next"`, then it is _not_ a terminal
#' handler, and once it returns, webfakes will move on to call the next
#' handler in the stack.
#'
#' A typical route:
#'
#' ```r
#' app$get("/user/:id", function(req, res) {
#' id <- req$params$id
#' ...
#' res$
#' set_status(200L)$
#' set_header("X-Custom-Header", "foobar")$
#' send_json(response, auto_unbox = TRUE)
#' })
#' ```
#'
#' * The handler belongs to an API path, which is a wildcard path in
#' this case. It matches `/user/alice`, `/user/bob`, etc. The handler
#' will only be called for GET requests on matching API paths.
#' * The handler receives the request (`req`) and the response (`res`).
#' * It sets the HTTP status, additional headers, and sends the data.
#' (In this case the `webfakes_response$send_json()` method automatically
#' converts `response` to JSON and sets the `Content-Type` and
#' `Content-Length` headers.)
#' * This is a terminal handler, because it does _not_ return `"next"`.
#' Once this handler function returns, webfakes will send out the HTTP
#' response.
#'
#' A typical middleware:
#'
#' ```r
#' app$use(function(req, res) {
#' ...
#' "next"
#' })
#' ```
#'
#' * There is no HTTP method or API path here, so webfakes will call the
#' handler for every HTTP request.
#' * This is not a terminal handler, it does return `"next"`, so after it
#' returns webfakes will look for the next handler in the stack.
#'
#' ## Errors
#'
#' If a handler function throws an error, then the web server will return
#' an HTTP 500 `text/plain` response, with the error message as the
#' response body.
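#'
#' For example, with a route like this (a minimal sketch), a request to
#' `/boom` results in a 500 response whose body contains the error
#' message:
#'
#' ```r
#' app$get("/boom", function(req, res) {
#'   stop("something went wrong")
#' })
#' ```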
#'
#' ## Request and response objects
#'
#' See [webfakes_request] and [webfakes_response] for the methods of the
#' request and response objects.
#'
#' ## Path specification
#'
#' Routes are associated with one or more API paths. A path specification
#' can be
#'
#' * A "plain" (i.e. without parameters) string. (E.g. `"/list"`.)
#' * A parameterized string. (E.g. `"/user/:id"`.)
#' * A regular expression created via [new_regexp()] function.
#' * A list or character vector of the previous ones. (Regular expressions
#' must be in a list.)
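#'
#' Some examples (a sketch, with the handler bodies omitted):
#'
#' ```r
#' app$get("/list", function(req, res) { ... })
#' app$get("/user/:id", function(req, res) { ... })
#' app$get(new_regexp("^/pkg/(?<pkg>[-a-zA-Z0-9.]+)$"), function(req, res) { ... })
#' app$get(list("/about", new_regexp("^/help/")), function(req, res) { ... })
#' ```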
#'
#' ## Path parameters
#'
#' Paths that are specified as parameterized strings or regular expressions
#' can have parameters.
#'
#' For parameterized strings the keys may contain letters, numbers and
#' underscores. When webfakes matches an API path to a handler with a
#' parameterized string path, the parameters will be added to the
#' request, as `params`. I.e. in the handler function (and subsequent
#' handler functions, if the current one is not terminal), they are
#' available in the `req$params` list.
#'
#' For regular expressions, capture groups are also added as parameters.
#' It is best to use named capture groups, so that the parameters are in
#' a named list.
#'
#' If the path of the handler is a list of parameterized strings or
#' regular expressions, the parameters are set according to the first
#' matching one.
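#'
#' For example, with a regular expression path, a named capture group
#' becomes a request parameter (a minimal sketch):
#'
#' ```r
#' app$get(new_regexp("^/user/(?<id>[0-9]+)$"), function(req, res) {
#'   res$send(paste0("user id: ", req$params$id))
#' })
#' ```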
#'
#' ## Templates
#'
#' webfakes supports templates, using any template engine. It comes with
#' a template engine that uses the glue package, see [tmpl_glue()].
#'
#' `app$engine()` registers a template engine, for a certain file
#' extension. The `$render()` method of [webfakes_response]
#' can be called from the handler function to evaluate a template from a
#' file.
#'
#' ```r
#' app$engine(ext, engine)
#' ```
#'
#' * `ext`: the file extension for which the template engine is added.
#' It should not contain the dot. E.g. `"html"`, `"brew"`.
#' * `engine`: the template engine, a function that takes the file path
#' (`path`) of the template, and a list of local variables (`locals`)
#' that can be used in the template. It should return the result.
#'
#' An example template engine that uses glue might look like this:
#'
#' ```r
#' app$engine("txt", function(path, locals) {
#' txt <- readChar(path, nchars = file.size(path))
#' glue::glue_data(locals, txt)
#' })
#' ```
#'
#' (The built-in [tmpl_glue()] engine has more features.)
#'
#' This template engine can be used in a handler:
#'
#' ```r
#' app$get("/view", function(req, res) {
#' txt <- res$render("test")
#' res$
#' set_type("text/plain")$
#' send(txt)
#' })
#' ```
#'
#' The location of the templates can be set using the `views` configuration
#' parameter, see the `$set_config()` method below.
#'
#' In the template, the variables passed in as `locals`, and also the
#' response local variables (see `locals` in [webfakes_response]), are
#' available.
#'
#' ## Starting and stopping
#'
#' ```r
#' app$listen(port = NULL, opts = server_opts(), cleanup = TRUE)
#' ```
#'
#' * `port`: port to listen on. When `NULL`, the operating system will
#' automatically select a free port.
#'
#' * `opts`: options to the web server. See [server_opts()] for the
#' list of options and their default values.
#'
#' * `cleanup`: stop the server (with an error) if the standard input
#' of the process is closed. This is handy when the app runs in a
#' `callr::r_session` subprocess, because it stops the app (and the
#' subprocess) if the main process has terminated.
#'
#' This method does not return, and can be interrupted with `CTRL+C` / `ESC`
#' or a SIGINT signal. See [new_app_process()] for interrupting an app that
#' is running in another process.
#'
#' When `port` is `NULL`, the operating system chooses a port where the
#' app will listen. To be able to get the port number programmatically,
#' before the listen method blocks, it advertises the selected port in a
#' `webfakes_port` condition, so one can catch it:
#'
#' ```r
#' withCallingHandlers(
#'   app$listen(),
#'   "webfakes_port" = function(msg) print(msg$port)
#' )
#' ```
#'
#' webfakes by default binds only to the loopback interface at 127.0.0.1, so
#' the webfakes web app is never reachable from the network.
#'
#' ## Logging
#'
#' webfakes can write an access log that contains an entry for all incoming
#' requests, and also an error log for the errors that happen while
#' the server is running. This is the default behavior for local apps
#' (the ones started by `app$listen()`) and for remote apps (the ones
#' started via `new_app_process()`):
#'
#' * Local apps do not write an access log by default.
#' * Remote apps write an access log into the
#' `<tmpdir>/webfakes/<pid>/access.log` file, where `<tmpdir>` is the
#' session temporary directory of the _main process_, and `<pid>` is
#' the process id of the _subprocess_.
#' * Local apps write an error log to `<tmpdir>/webfakes/error.log`, where
#' `<tmpdir>` is the session temporary directory of the current process.
#' * Remote apps write an error log to the `<tmpdir>/webfakes/<pid>/error.log`
#' file, where `<tmpdir>` is the session temporary directory of the
#' _main process_ and `<pid>` is the process id of the _subprocess_.
#'
#' See [server_opts()] for changing the default logging behavior.
#'
#' ## Shared app data
#'
#' ```r
#' app$locals
#' ```
#'
#' It is often useful to share data between handlers and requests in an
#' app. `app$locals` is an environment that supports this. E.g. a
#' middleware that counts the number of requests can be implemented as:
#'
#' ```r
#' app$use(function(req, res) {
#' locals <- req$app$locals
#' if (is.null(locals$num)) locals$num <- 0L
#' locals$num <- locals$num + 1L
#' "next"
#' })
#' ```
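#'
#' A route can then report the counter (a sketch, assuming the counting
#' middleware above has been added):
#'
#' ```r
#' app$get("/count", function(req, res) {
#'   res$send(as.character(req$app$locals$num))
#' })
#' ```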
#'
#' [webfakes_response] objects also have a `locals` environment, that is
#' initially populated as a copy of `app$locals`.
#'
#' ## Configuration
#'
#' ```r
#' app$get_config(key)
#' app$set_config(key, value)
#' ```
#'
#' * `key`: configuration key.
#' * `value`: configuration value.
#'
#' Currently used configuration values:
#'
#' * `views`: path where webfakes searches for templates.
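#'
#' For example, to make the app look for templates in a `templates`
#' directory under the current working directory (a minimal sketch):
#'
#' ```r
#' app$set_config("views", file.path(getwd(), "templates"))
#' app$get_config("views")
#' ```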
#'
#' @return A new `webfakes_app`.
#' @aliases webfakes_app
#' @seealso [webfakes_request] for request objects, [webfakes_response] for
#' response objects.
#' @export
#' @examples
#' # see the example web apps in the `examples` directory of the package:
#' system.file(package = "webfakes", "examples")
#'
#' app <- new_app()
#' app$use(mw_log())
#'
#' app$get("/hello", function(req, res) {
#' res$send("Hello there!")
#' })
#'
#' app$get(new_regexp("^/hi(/.*)?$"), function(req, res) {
#' res$send("Hi indeed!")
#' })
#'
#' app$post("/hello", function(req, res) {
#' res$send("Got it, thanks!")
#' })
#'
#' app
#'
#' # Start the app with: app$listen()
#' # Or start it in another R session: new_app_process(app)
new_app <- function() {
self <- new_object(
"webfakes_app",
all = function(path, ...) {
self$.stack <- c(self$.stack, parse_handlers("all", path, ...))
invisible(self)
},
connect = function(path, ...) {
self$.stack <- c(self$.stack, parse_handlers("connect", path, ...))
invisible(self)
},
delete = function(path, ...) {
self$.stack <- c(self$.stack, parse_handlers("delete", path, ...))
invisible(self)
},
get = function(path, ...) {
self$.stack <- c(self$.stack, parse_handlers("get", path, ...))
invisible(self)
},
engine = function(ext, engine) {
rec <- list(ext = ext, engine = engine)
self$.engines <- c(self$.engines, list(rec))
invisible(self)
},
get_config = function(key) {
self$.config[[key]]
},
head = function(path, ...) {
self$.stack <- c(self$.stack, parse_handlers("head", path, ...))
invisible(self)
},
listen = function(port = NULL, opts = server_opts(), cleanup = TRUE) {
stopifnot(is.null(port) || is_port(port) || is_na_scalar(port))
if (is_na_scalar(port)) port <- NULL
opts$port <- port
self$.enable_keep_alive <- opts$enable_keep_alive
opts$access_log_file <- sub("%p", Sys.getpid(), opts$access_log_file)
opts$error_log_file <- sub("%p", Sys.getpid(), opts$error_log_file)
self$.opts <- opts
tryCatch(srv <- server_start(opts), error = function(err) {
err$message <- paste(sep = "\n", err$message, self$.get_error_log())
stop(err)
})
ports <- server_get_ports(srv)
self$.port <- ports$port[1]
message("Running webfakes web app on port ", self$.port)
if (!is.na(opts$access_log_file)) {
message("Access log file: ", opts$access_log_file)
}
if (!is.na(opts$error_log_file)) {
message("Error log file: ", opts$error_log_file)
}
msg <- structure(
list(
port = self$.port,
access_log = attr(srv, "options")$access_log_file,
error_log = attr(srv, "options")$error_log_file
),
class = c("webfakes_port", "callr_message", "condition")
)
message(msg)
on.exit(server_stop(srv), add = TRUE)
while (TRUE) {
req <- server_poll(srv, cleanup)
tryCatch(
self$.process_request(req),
error = function(err) {
cat(as.character(err), file = stderr())
response_send_error(req, as.character(err), 500L)
}
)
}
},
mkcol = function(path, ...) {
self$.stack <- c(self$.stack, parse_handlers("mkcol", path, ...))
invisible(self)
},
options = function(path, ...) {
self$.stack <- c(self$.stack, parse_handlers("options", path, ...))
invisible(self)
},
patch = function(path, ...) {
self$.stack <- c(self$.stack, parse_handlers("patch", path, ...))
invisible(self)
},
post = function(path, ...) {
self$.stack <- c(self$.stack, parse_handlers("post", path, ...))
invisible(self)
},
propfind = function(path, ...) {
self$.stack <- c(self$.stack, parse_handlers("propfind", path, ...))
invisible(self)
},
put = function(path, ...) {
self$.stack <- c(self$.stack, parse_handlers("put", path, ...))
invisible(self)
},
report = function(path, ...) {
self$.stack <- c(self$.stack, parse_handlers("report", path, ...))
invisible(self)
},
set_config = function(key, value) {
self$.config[[key]] <- value
invisible(self)
},
use = function(..., .first = FALSE) {
mw <- parse_handlers("use", "*", ...)
if (.first) {
self$.stack <- c(mw, self$.stack)
} else {
self$.stack <- c(self$.stack, mw)
}
invisible(self)
},
# Public data to be used in handlers
locals = new.env(parent = parent.frame()),
# Private data
.port = NULL,
.enable_keep_alive = NULL,
.opts = NULL,
# middleware stack
.stack = list(),
# view engines
.engines = list(),
# config
.config = as.environment(list(
views = file.path(getwd(), "views")
)),
# The request processing function
.process_request = function(req) {
req <- new_request(self, req)
res <- req$res
res$.delay <- NULL
tryCatch({
for (i in sseq(res$.stackptr, length(self$.stack))) {
handler <- self$.stack[[i]]
m <- path_match(req$method, req$path, handler)
if (!isFALSE(m)) {
res$.i <- i
if (is.list(m)) req$params <- m$params
out <- handler$handler(req, res)
if (!identical(out, "next")) break
}
}
if (!res$.sent && is.null(res$.delay)) {
if (!res$headers_sent) {
res$send_status(404)
} else if ((res$get_header("Transfer-Encoding") %||% "") == "chunked") {
res$send_chunk(raw(0))
res$headers_sent <- TRUE
res$send("")
} else {
res$send("")
}
}
}, webfakes_error = function(err) { })
},
.get_error_log = function() {
if (!is.na(self$.opts$error_log_file)) {
paste0("Error log:\n", read_char(self$.opts$error_log_file))
}
}
)
self
}
parse_handlers <- function(method, path, ...) {
handlers <- list(...)
ans <- list()
for (h in seq_along(handlers)) {
handler <- handlers[[h]]
if (is.function(handler)) {
rec <- list(
method = method,
path = path,
handler = handler,
name = names(handlers)[h]
)
ans <- c(ans, list(rec))
} else {
stop("Invalid webfakes handler")
}
}
ans
}
/scratch/gouwar.j/cran-all/cranData/webfakes/R/app.R
XX <- 255L
EQ <- 254L
INVALID <- XX
index_64 <- as.integer(c(
XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX,
XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX,
XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,62, XX,XX,XX,63,
52,53,54,55, 56,57,58,59, 60,61,XX,XX, XX,EQ,XX,XX,
XX, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9,10, 11,12,13,14,
15,16,17,18, 19,20,21,22, 23,24,25,XX, XX,XX,XX,XX,
XX,26,27,28, 29,30,31,32, 33,34,35,36, 37,38,39,40,
41,42,43,44, 45,46,47,48, 49,50,51,XX, XX,XX,XX,XX,
XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX,
XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX,
XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX,
XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX,
XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX,
XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX,
XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX,
XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX, XX,XX,XX,XX
))
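# Decode a base64 encoded character string (or raw vector) into text.
# Characters outside the base64 alphabet are skipped, and decoding stops
# at the first padding ('=') character.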
base64_decode <- function(x) {
if (is.character(x)) {
x <- charToRaw(x)
}
len <- length(x)
idx <- 1
c <- integer(4)
out <- raw()
while(idx <= len) {
i <- 1
while(i <= 4) {
uc <- index_64[[as.integer(x[[idx]]) + 1L]]
idx <- idx + 1
if (uc != INVALID) {
c[[i]] <- uc
i <- i + 1
}
if (idx > len) {
if (i <= 4) {
if (i <= 2) return(rawToChar(out))
if (i == 3) {
c[[3]] <- EQ
c[[4]] <- EQ
}
break
}
}
}
if (c[[1]] == EQ || c[[2]] == EQ) {
break
}
#print(sprintf("c1=%d,c2=%d,c3=%d,c4=%d\n", c[1],c[2],c[3],c[4]))
out[[length(out) + 1]] <- as.raw(bitwOr(bitwShiftL(c[[1]], 2L), bitwShiftR(bitwAnd(c[[2]], 0x30), 4L)))
if (c[[3]] == EQ) {
break
}
out[[length(out) + 1]] <- as.raw(bitwOr(bitwShiftL(bitwAnd(c[[2]], 0x0F), 4L), bitwShiftR(bitwAnd(c[[3]], 0x3C), 2L)))
if (c[[4]] == EQ) {
break
}
out[[length(out) + 1]] <- as.raw(bitwOr(bitwShiftL(bitwAnd(c[[3]], 0x03), 6L), c[[4]]))
}
rawToChar(out)
}
basis64 <- charToRaw(paste(c(LETTERS, letters, 0:9, "+", "/"),
collapse = ""))
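# Base64 encode a character string or raw vector, padding the output with
# '=' so that its length is a multiple of four.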
base64_encode <- function(x) {
if (is.character(x)) {
x <- charToRaw(x)
}
len <- length(x)
rlen <- floor((len + 2L) / 3L) * 4L
out <- raw(rlen)
ip <- op <- 1L
c <- integer(4)
while (len > 0L) {
c[[1]] <- as.integer(x[[ip]])
ip <- ip + 1L
if (len > 1L) {
c[[2]] <- as.integer(x[ip])
ip <- ip + 1L
} else {
c[[2]] <- 0L
}
out[op] <- basis64[1 + bitwShiftR(c[[1]], 2L)]
op <- op + 1L
out[op] <- basis64[1 + bitwOr(bitwShiftL(bitwAnd(c[[1]], 3L), 4L),
bitwShiftR(bitwAnd(c[[2]], 240L), 4L))]
op <- op + 1L
if (len > 2) {
c[[3]] <- as.integer(x[ip])
ip <- ip + 1L
out[op] <- basis64[1 + bitwOr(bitwShiftL(bitwAnd(c[[2]], 15L), 2L),
bitwShiftR(bitwAnd(c[[3]], 192L), 6L))]
op <- op + 1L
out[op] <- basis64[1 + bitwAnd(c[[3]], 63)]
op <- op + 1L
} else if (len == 2) {
out[op] <- basis64[1 + bitwShiftL(bitwAnd(c[[2]], 15L), 2L)]
op <- op + 1L
out[op] <- charToRaw("=")
op <- op + 1L
} else { ## len == 1
out[op] <- charToRaw("=")
op <- op + 1L
out[op] <- charToRaw("=")
op <- op + 1L
}
len <- len - 3L
}
rawToChar(out)
}
/scratch/gouwar.j/cran-all/cranData/webfakes/R/base64.R
call_with_cleanup <- function(ptr, ...) {
.Call(c_cleancall_call, pairlist(ptr, ...), parent.frame())
}
/scratch/gouwar.j/cran-all/cranData/webfakes/R/cleancall.R
# nocov start --- compat-defer --- 2020-06-16
# This drop-in file implements withr::defer(). Please find the most
# recent version in withr's repository.
defer <- function(expr, envir = parent.frame(), priority = c("first", "last")) { }
local({
defer <<- defer <- function(expr, envir = parent.frame(), priority = c("first", "last")) {
priority <- match.arg(priority)
if (identical(envir, .GlobalEnv) && is.null(get_handlers(envir))) {
message(
"Setting deferred event(s) on global environment.\n",
" * Execute (and clear) with `withr::deferred_run()`.\n",
" * Clear (without executing) with `withr::deferred_clear()`."
)
}
invisible(
add_handler(
envir,
handler = list(expr = substitute(expr), envir = parent.frame()),
front = priority == "first"
)
)
}
get_handlers <- function(envir) {
attr(envir, "handlers")
}
set_handlers <- function(envir, handlers) {
has_handlers <- "handlers" %in% names(attributes(envir))
attr(envir, "handlers") <- handlers
if (!has_handlers) {
call <- make_call(execute_handlers, envir)
# We have to use do.call here instead of eval because of the way on.exit
# determines its evaluation context
# (https://stat.ethz.ch/pipermail/r-devel/2013-November/067867.html)
do.call(base::on.exit, list(call, TRUE), envir = envir)
}
}
execute_handlers <- function(envir) {
handlers <- get_handlers(envir)
errors <- list()
for (handler in handlers) {
tryCatch(eval(handler$expr, handler$envir),
error = function(e) {
errors[[length(errors) + 1]] <<- e
}
)
}
for (error in errors) {
stop(error)
}
}
add_handler <- function(envir, handler, front) {
if (front) {
handlers <- c(list(handler), get_handlers(envir))
} else {
handlers <- c(get_handlers(envir), list(handler))
}
set_handlers(envir, handlers)
handler
}
make_call <- function(...) {
as.call(list(...))
}
}) # defer() namespace
# nocov end
/scratch/gouwar.j/cran-all/cranData/webfakes/R/compat-defer.R
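# Compute the CRC32 checksum of a character string or raw vector, using
# the bundled C implementation.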
crc32 <- function(x) {
if (is.character(x)) x <- charToRaw(x)
stopifnot(is.raw(x))
call_with_cleanup(c_webfakes_crc32, x)
}
/scratch/gouwar.j/cran-all/cranData/webfakes/R/digest.R
#' @title webfakes glossary
#' @name glossary
#' @section Webfakes glossary:
#'
#' ```{r child = "vignettes/glossary.Rmd"}
#' ```
NULL
/scratch/gouwar.j/cran-all/cranData/webfakes/R/docs.R
#' Web app that acts as a git http server
#'
#' It is useful for tests that need an HTTP git server.
#'
#' @param git_root Path to the root of the directory tree to be served.
#' @param git_cmd Command to call, by default it is `"git"`. It may also
#' be a full path to git.
#' @param git_timeout A `difftime` object, time limit for the git
#' command.
#' @param filter Whether to support the `filter` capability in the server.
#' @param cleanup Whether to clean up `git_root` when the app is
#' garbage collected.
#'
#' @export
#' @examplesIf FALSE
#' dir.create(tmp <- tempfile())
#' setwd(tmp)
#' system("git clone --bare https://github.com/cran/crayon")
#' system("git clone --bare https://github.com/cran/glue")
#' app <- git_app(tmp)
#' git <- new_app_process(app)
#' system(paste("git ls-remote", git$url("/crayon")))
git_app <- function(git_root,
git_cmd = "git",
git_timeout = as.difftime(1, units = "mins"),
filter = TRUE,
cleanup = TRUE) {
app <- webfakes::new_app()
app$locals$git_root <- git_root
app$locals$git_timeout <- as.double(git_timeout, units = "secs") * 1000
app$locals$git_config <- tempfile()
reg.finalizer(app, function(app0) unlink(app$locals$git_config), TRUE)
writeLines(
c(
"[uploadpack]",
paste0("\tallowFilter = ", if (isTRUE(filter)) "true" else "false")
),
app$locals$git_config
)
if (cleanup) {
reg.finalizer(
app,
function(app) unlink(app$locals$git_root, recursive = TRUE),
TRUE
)
}
cgi <- mw_cgi(git_cmd, "http-backend", timeout = git_timeout)
handler <- function(req, res) {
env <- c(
GIT_CONFIG_GLOBAL = req$app$locals$git_config,
GIT_HTTP_EXPORT_ALL = "true",
GIT_PROJECT_ROOT = req$app$locals$git_root,
GIT_PROTOCOL = req$get_header("Git-Protocol") %||% "",
HTTP_GIT_PROTOCOL = req$get_header("Git-Protocol") %||% ""
)
cgi(req, res, env = env)
}
re_all <- new_regexp("^(?<path>.*)$")
app$get(re_all, handler)
app$post(re_all, handler)
app
}
/scratch/gouwar.j/cran-all/cranData/webfakes/R/git-app.R
#' Generic web app for testing HTTP clients
#'
#' A web app similar to `https://httpbin.org`.
#' See [its specific docs](https://webfakes.r-lib.org/httpbin.html).
#' You can also see these docs locally, by starting the app:
#' ```r
#' httpbin <- new_app_process(httpbin_app())
#' browseURL(httpbin$url())
#' ```
#'
#' @param log Whether to log requests to the standard output.
#' @return A `webfakes_app`.
#'
#' @export
#' @examples
#' app <- httpbin_app()
#' proc <- new_app_process(app)
#' url <- proc$url("/get")
#' resp <- curl::curl_fetch_memory(url)
#' curl::parse_headers_list(resp$headers)
#' cat(rawToChar(resp$content))
#' proc$stop()
httpbin_app <- function(log = interactive()) {
encode_files <- function(files) {
for (i in seq_along(files)) {
files[[i]]$value <- paste0(
"data:application/octet-stream;base64,",
base64_encode(files[[i]]$value)
)
}
files
}
app <- new_app()
# Log requests by default
if (log) app$use("logger" = mw_log())
# Parse all kinds of bodies
app$use("json body parser" = mw_json())
app$use("text body parser" = mw_text(type = c("text/plain", "application/json")))
app$use("multipart body parser" = mw_multipart())
app$use("URL encoded body parser" = mw_urlencoded())
app$use("cookie parser" = mw_cookie_parser())
# Add etags by default
app$use("add etag" = mw_etag())
# Add date by default
app$use("add date" = function(req, res) {
res$set_header("Date", http_time_stamp())
"next"
})
make_common_response <- function(req, res) {
ret <- list(
args = as.list(req$query),
data = req$text,
files = encode_files(req$files),
form = req$form,
headers = req$headers,
json = req$json,
method = req$method,
path = req$path,
origin = req$remote_addr,
url = req$url
)
}
common_response <- function(req, res) {
ret <- make_common_response(req, res)
res$send_json(object = ret, auto_unbox = TRUE, pretty = TRUE)
}
# Main page
app$get("/", function(req, res) {
res$send_file(
root = system.file(package = "webfakes", "examples", "httpbin", "assets"),
"httpbin.html"
)
})
# HTTP methods =========================================================
common_get <- function(req, res) {
ret <- list(
args = as.list(req$query),
headers = req$headers,
origin = req$remote_addr,
path = req$path,
url = req$url
)
res$send_json(object = ret, auto_unbox = TRUE, pretty = TRUE)
}
app$get("/get", common_get)
app$delete("/delete", common_response)
app$patch("/patch", common_response)
app$post("/post", common_response)
app$put("/put", common_response)
app$get("/forms/post", function(req, res) {
res$send_file(
root = system.file(package = "webfakes", "examples", "httpbin", "assets"),
"forms-post.html"
)
})
# Auth =================================================================
basic_auth <- function(req, res, error_status = 401L) {
exp <- paste(
"Basic",
base64_encode(paste0(req$params$user, ":", req$params$passwd))
)
hdr <- req$get_header("Authorization") %||% ""
if (exp == hdr) {
res$send_json(list(
authenticated = jsonlite::unbox(TRUE),
user = jsonlite::unbox(req$params$user)
))
} else {
if (error_status == 401L) {
res$
set_header("WWW-Authenticate", "Basic realm=\"Fake Realm\"")$
send_status(error_status)
} else {
res$
send_status(error_status)
}
}
}
app$get("/basic-auth/:user/:passwd", function(req, res) {
basic_auth(req, res, error_status = 401L)
})
app$get("/hidden-basic-auth/:user/:passwd", function(req, res) {
basic_auth(req, res, error_status = 404L)
})
app$get("/bearer", function(req, res) {
auth <- req$get_header("Authorization") %||% ""
if (! grepl("^Bearer ", auth)) {
res$
set_header("WWW-Authenticate", "bearer")$
send_status(401L)
} else {
token <- sub("^Bearer ", "", auth)
res$
send_json(
list(authenticated = TRUE, token = token),
auto_unbox = TRUE, pretty = TRUE
)
}
})
hash <- function(str, algorithm) {
algo <- tolower(algorithm %||% "md5")
algo <- c("md5" = "md5", "sha-256" = "sha256", "sha-512" = "sha512")[algo]
if (is.na(algo)) {
stop("Unknown hash algorithm for digest auth: ", algorithm)
}
digest::digest(str, algo = algo, serialize = FALSE)
}
hash1 <- function(realm, username, password, algorithm) {
realm <- realm %||% ""
hash(paste(collapse = ":", c(
username,
realm,
password
)), algorithm)
}
hash2 <- function(credentials, req, algorithm) {
qop <- credentials[["qop"]] %||% "auth"
query <- if (nchar(req$query_string %||% "")) paste0("?", req$query_string)
req_uri <- paste0(req$path, query)
if (qop == "auth") {
hash(paste(collapse = ":", c(
toupper(req[["method"]]),
req_uri
)), algorithm)
} else {
hash(paste0(collapse = ":", c(
toupper(req[["method"]]),
req_uri,
hash(req$.body %||% "", algorithm)
)), algorithm)
}
}
digest_challenge_response <- function(req, qop, algorithm, stale = FALSE) {
nonce <- hash(
paste0(req$remote_addr, ":", unclass(Sys.time()), ":", random_id(10)),
algorithm
)
opaque <- hash(random_id(10), algorithm)
realm <- "webfakes.r-lib.org"
qop <- qop %||% "auth,auth-int"
wwwauth <- paste0(
"Digest ",
"realm=\"", realm, "\", ",
"qop=\"", qop, "\", ",
"nonce=\"", nonce, "\", ",
"opaque=\"", opaque, "\", ",
"algorithm=", algorithm, ", ",
"stale=", stale
)
wwwauth
}
next_stale_after_value <- function(x) {
x <- suppressWarnings(as.integer(x))
if (is.na(x)) "never" else as.character(x - 1L)
}
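# Check an RFC 7616 Digest Authorization header: recompute HA1 from the
# user name, realm and password, HA2 from the request method and URI,
# derive the expected response hash, and compare it to the response value
# sent by the client.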
check_digest_auth <- function(req, credentials, user, passwd) {
if (is.null(credentials)) return(FALSE)
algorithm <- credentials[["algorithm"]]
HA1_value <- hash1(
credentials[["realm"]],
credentials[["username"]],
passwd,
algorithm
)
HA2_value <- hash2(credentials, req, algorithm)
qop <- credentials[["qop"]] %||% "compat"
if (! qop %in% c("compat", "auth", "auth-int")) return(FALSE)
response_hash <- if (qop == "compat") {
hash(paste0(c(
HA1_value,
credentials[["nonce"]] %||% "",
HA2_value
), collapse = ":"), algorithm)
} else {
if (any(! c("nonce", "nc", "cnonce", "qop") %in% names(credentials))) {
return(FALSE)
}
hash(paste0(c(
HA1_value,
credentials[["nonce"]],
credentials[["nc"]],
credentials[["cnonce"]],
credentials[["qop"]],
HA2_value
), collapse = ":"), algorithm)
}
(credentials[["response"]] %||% "") == response_hash
}
digest_auth <- function(req, res, qop, user, passwd, algorithm, stale_after) {
require_cookie_handling <-
tolower(req$query$`require-cookie` %||% "") %in% c("1", "t", "true")
if (! algorithm %in% c("MD5", "SHA-256", "SHA-512")) {
algorithm <- "MD5"
}
if (! qop %in% c("auth", "auth-int")) {
qop <- NULL
}
authorization <- req$get_header("Authorization")
credentials <- if (!is.null(authorization)) {
parse_authorization_header(authorization)
}
if (is.null(authorization) ||
is.null(credentials) ||
tolower(credentials$scheme) != "digest" ||
(require_cookie_handling && is.null(req$get_header("Cookie")))
) {
wwwauth <- digest_challenge_response(req, qop, algorithm)
res$
set_status(401L)$
add_cookie("stale_after", stale_after)$
add_cookie("fake", "fake_value")$
set_header("WWW-Authenticate", wwwauth)$
send("")
return()
}
if (require_cookie_handling &&
(req$cookies[["fake"]] %||% "") != "fake_value") {
res$
add_cookie("fake", "fake_value")$
set_status(403L)$
send_json(
list(errors = "missing cookie set on challenge"),
pretty = TRUE
)
return()
}
current_nonce <- credentials$nonce
stale_after_value <- req$cookies$stale_after %||% ""
if (identical(current_nonce, req$cookies[["last_nonce"]] %||% "") ||
stale_after_value == "0") {
wwwauth <- digest_challenge_response(req, qop, algorithm, TRUE)
res$
set_status(401L)$
add_cookie("stale_after", stale_after)$
add_cookie("last_nonce", current_nonce)$
add_cookie("fake", "fake_value")$
set_header("WWW-Authenticate", wwwauth)$
send("")
return()
}
if (!check_digest_auth(req, credentials, user, passwd)) {
wwwauth <- digest_challenge_response(req, qop, algorithm, FALSE)
res$
set_status(401L)$
add_cookie("stale_after", stale_after)$
add_cookie("last_nonce", current_nonce)$
add_cookie("fake", "fake_value")$
set_header("WWW-Authenticate", wwwauth)$
send("")
return()
}
res$add_cookie("fake", "fake_value")
if (!is.null(stale_after_value)) {
res$add_cookie(
"stale_after",
next_stale_after_value(stale_after_value)
)
}
res$
send_json(
list(authentication = TRUE, user = user),
pretty = TRUE,
auto_unbox = TRUE
)
}
app$get("/digest-auth/:qop/:user/:passwd", function(req, res) {
qop <- req$params$qop
user <- req$params$user
passwd <- req$params$passwd
digest_auth(req, res, qop, user, passwd, "MD5", "never")
})
app$get("/digest-auth/:qop/:user/:passwd/:algorithm", function(req, res) {
qop <- req$params$qop
user <- req$params$user
passwd <- req$params$passwd
algorithm <- req$params$algorithm
digest_auth(req, res, qop, user, passwd, algorithm, "never")
})
app$get(
"/digest-auth/:qop/:user/:passwd/:algorithm/:stale_after",
function(req, res) {
qop <- req$params$qop
user <- req$params$user
passwd <- req$params$passwd
algorithm <- req$params$algorithm
stale_after <- req$params$stale_after
digest_auth(req, res, qop, user, passwd, algorithm, stale_after)
}
)
# Status codes =========================================================
app$all(
new_regexp("^/status/(?<status>[0-9][0-9][0-9])$"),
function(req, res) {
status <- req$params$status
res$set_status(status)
if (status == "418") {
res$send(paste(
sep = "\n",
"",
" -=[ teapot ]=-",
"",
" _...._",
" .' _ _ `.",
" | .\"` ^ `\". _,",
" \\_;`\"---\"`|//",
" | ;/",
" \\_ _/",
" `\"\"\"`",
""
))
} else {
res$send("")
}
}
)
# Request inspection ===================================================
app$get("/headers", function(req, res) {
ret <- list(headers = req$headers)
res$send_json(ret, auto_unbox = TRUE, pretty = TRUE)
})
app$get("/ip", function(req, res) {
ret <- list(origin = req$remote_addr)
res$send_json(ret, auto_unbox = TRUE, pretty = TRUE)
})
app$get("/user-agent", function(req, res) {
ret <- list("user-agent" = req$get_header("User-Agent"))
res$send_json(ret, auto_unbox = TRUE, pretty = TRUE)
})
# Response inspection ==================================================
app$get("/etag/:etag", function(req, res) {
etag <- req$params$etag
# The mw_etag() middleware is active, so we need to do this after that
res_etag <- NULL
res$on_response(function(req, res) {
if (!is.null(res_etag)) res$set_header("ETag", res_etag)
})
parse <- function(x) {
x <- strsplit(x, ",", fixed = TRUE)[[1]]
re_match(x, '\\s*(W/)?"?([^"]*)"?\\s*')$groups[,2]
}
if_none_match <- parse(req$get_header("If-None-Match") %||% "")
if_match <- parse(req$get_header("If-Match") %||% "")
if (length(if_none_match) > 0) {
if (etag %in% if_none_match || "*" %in% if_none_match) {
res$send_status(304)
res_etag <- "etag"
return()
}
} else if (length(if_match) > 0) {
if ((! etag %in% if_match) && (!"*" %in% if_match)) {
res$send_status(412)
return()
}
}
res_etag <- etag
common_get(req, res)
})
rsp_hdrs <- function(req, res) {
obj <- structure(list(), names = character())
for (i in seq_along(req$query)) {
key <- names(req$query)[i]
res$add_header(key, req$query[[i]])
obj[[key]] <- c(obj[[key]], req$query[[i]])
}
res$send_json(object = obj, auto_unbox = TRUE)
}
app$get("/response-headers", rsp_hdrs)
app$post("/response-headers", rsp_hdrs)
app$get("/cache", function(req, res) {
if (is.null(req$get_header("If-Modified-Since")) &&
is.null(req$get_header("If-None-Match"))) {
res$set_header("Last-Modified", http_time_stamp())
# etag is added by default
common_response(req, res)
} else {
res$send_status(304)
}
})
app$get("/cache/:value", function(req, res) {
value <- suppressWarnings(as.integer(req$params$value))
if (is.na(value)) {
"next"
} else {
res$set_header(
"Cache-Control",
sprintf("public, max-age=%d", value)
)
common_response(req, res)
}
})
# Response formats =====================================================
app$get("/deny", function(req, res) {
res$
set_type("text/plain")$
send_file(
root = system.file(package = "webfakes"),
file.path("examples", "httpbin", "data", "deny.txt")
)
})
app$get("/brotli", function(req, res) {
ret <- make_common_response(req, res)
ret$brotli <- TRUE
json <- jsonlite::toJSON(ret, auto_unbox = TRUE, pretty = TRUE)
data <- charToRaw(json)
datax <- brotli::brotli_compress(data)
res$
set_type("application/json")$
set_header("Content-Encoding", "brotli")$
send(datax)
})
app$get("/gzip", function(req, res) {
ret <- make_common_response(req, res)
ret$gzipped <- TRUE
json <- jsonlite::toJSON(ret, auto_unbox = TRUE, pretty = TRUE)
tmp <- tempfile()
on.exit(unlink(tmp), add = TRUE)
con <- file(tmp, open = "wb")
con2 <- gzcon(con)
writeBin(charToRaw(json), con2)
flush(con2)
close(con2)
gzipped <- readBin(tmp, "raw", file.info(tmp)$size)
res$
set_type("application/json")$
set_header("Content-Encoding", "gzip")$
send(gzipped)
})
app$get("/deflate", function(req, res) {
ret <- make_common_response(req, res)
ret$deflated <- TRUE
json <- jsonlite::toJSON(ret, auto_unbox = TRUE, pretty = TRUE)
data <- charToRaw(json)
datax <- zip::deflate(data)
res$
set_type("application/json")$
set_header("Content-Encoding", "deflate")$
send(datax$output)
})
app$get("/encoding/utf8", function(req, res) {
res$
set_type("text/html; charset=utf-8")$
send_file(
root = system.file(package = "webfakes"),
file.path("examples", "httpbin", "data", "utf8.html")
)
})
app$get("/html", function(req, res) {
res$send_file(
root = system.file(package = "webfakes"),
file.path("examples", "httpbin", "data", "example.html")
)
})
app$get("/json", function(req, res) {
res$send_file(
root = system.file(package = "webfakes"),
file.path("examples", "httpbin", "data", "example.json")
)
})
app$get("/robots.txt", function(req, res) {
res$send_file(
root = system.file(package = "webfakes"),
file.path("examples", "httpbin", "data", "robots.txt")
)
})
app$get("/xml", function(req, res) {
res$send_file(
root = system.file(package = "webfakes"),
file.path("examples", "httpbin", "data", "example.xml")
)
})
# Dynamic data =========================================================
app$get(list("/base64", new_regexp("/base64/(?<value>[\\+/=a-zA-Z0-9]*)")),
function(req, res) {
value <- req$params$value %||% ""
if (value == "") value <- "RXZlcnl0aGluZyBpcyBSc29tZQ=="
plain <- charToRaw(base64_decode(value))
res$
set_type("application/octet-stream")$
send(plain)
})
app$get("/bytes/:n", function(req, res) {
n <- suppressWarnings(as.integer(req$params$n))
if (is.na(n)) {
return("next")
} else {
n <- min(n, 10000)
bytes <- as.raw(as.integer(floor(stats::runif(n, min=0, max=256))))
res$
set_type("application/octet-stream")$
send(bytes)
}
})
app$all(new_regexp("/delay/(?<delay>[0-9\\.]+)$"), function(req, res) {
delay <- suppressWarnings(as.numeric(req$params$delay))
if (is.na(delay)) {
return("next")
} else if (is.null(res$locals$seen)) {
res$locals$seen <- TRUE
delay <- min(delay, 10)
res$delay(delay)
} else if (req$method == "head") {
res$send_status(200L)
} else {
common_response(req, res)
}
})
app$get("/drip", function(req, res) {
# First time?
if (is.null(res$locals$drip)) {
duration <- as.double(req$query$duration %||% 2)
numbytes <- as.integer(req$query$numbytes %||% 10)
code <- as.integer(req$query$code %||% 200L)
delay <- as.double(req$query$delay %||% 0)
# how much to wait between messages, at least 10ms
pause <- max(duration / numbytes, 0.01)
# how many messages
nummsg <- duration / pause + 1
# how big is a message, at least a byte
msgsize <- max(floor(numbytes / nummsg), 1)
res$locals$drip <- list(
tosend = numbytes,
msgsize = msgsize,
pause = pause
)
res$
set_header("Content-Length", numbytes)$
set_header("Content-Type", "application/octet-stream")$
set_status(code)
if (delay > 0) return(res$delay(delay))
}
len <- min(res$locals$drip$tosend, res$locals$drip$msgsize)
res$write(strrep("*", len))
res$locals$drip$tosend <- res$locals$drip$tosend - len
if (res$locals$drip$tosend == 0) {
res$send("")
} else {
res$delay(res$locals$drip$pause)
}
})
app$get(new_regexp("^/stream/(?<n>[0-9]+)$"), function(req, res) {
n <- suppressWarnings(as.integer(req$params$n))
n <- min(n, 100)
if (length(n) == 0 || is.na(n)) return("next")
msg <- make_common_response(req, res)[c("url", "args", "headers", "origin")]
res$set_type("application/json")
for (i in seq_len(n)) {
msg$id <- i - 1L
txt <- paste0(jsonlite::toJSON(msg, auto_unbox = TRUE), "\n")
res$send_chunk(charToRaw(txt))
}
})
app$get(new_regexp("^/stream-bytes/(?<n>[0-9]+)$"), function(req, res) {
n <- suppressWarnings(as.integer(req$params$n))
n <- min(n, 100 * 1024)
seed <- suppressWarnings(as.integer(req$query$seed %||% 42))
chunk_size <- suppressWarnings(as.integer(req$query$chunk_size %||% 10240))
if (length(n) == 0 || is.na(n) || length(seed) == 0 || is.na(seed) ||
length(chunk_size) == 0 || is.na(chunk_size)) return("next")
oldseed <- .GlobalEnv$.Random.seed
on.exit(set.seed(oldseed))
set.seed(seed)
bytes <- as.raw(as.integer(floor(stats::runif(n, min=0, max=256))))
nc <- ceiling(n / chunk_size)
for (i in seq_len(nc)) {
from <- (i-1)*chunk_size + 1
to <- min(length(bytes), i * chunk_size)
res$send_chunk(bytes[from:to])
}
})
re_range <- new_regexp("^/range/(?<numbytes>[0-9]+)$")
# This is not in httpbin, but it is handy to get the size of the
# response, and to see whether the server supports ranges
app$head(re_range, function(req, res) {
numbytes <- suppressWarnings(as.integer(req$params$numbytes))
if (length(numbytes) == 0 || is.na(numbytes)) {
return("next")
}
res$
set_header("ETag", paste0("range", numbytes))$
set_header("Accept-Ranges", "bytes")$
set_header("Content-Length", numbytes)$
send_status(200L)
})
app$get(re_range, function(req, res) {
if (is.null(res$locals$range)) {
numbytes <- suppressWarnings(as.integer(req$params$numbytes))
if (length(numbytes) == 0 || is.na(numbytes)) {
return("next")
}
if (numbytes < 0 || numbytes > 100 * 1024) {
res$
set_header("ETag", paste0("range", numbytes))$
set_header("Accept-Ranges", "bytes")$
set_status(404L)$
send("number of bytes must be in the range (0, 102400].")
return()
}
chunk_size <- max(1, as.integer(req$query$chunk_size %||% (10 * 1024)))
duration <- as.integer(req$query$duration %||% 0)
pause_per_byte <- duration / numbytes
if (duration == 0) {
chunk_size <- numbytes
}
ranges <- parse_range(req$get_header("Range"))
# just like httpbin, we do not support multiple ranges
if (NROW(ranges) != 1) {
ranges <- NULL
}
# This is not exactly the same as httpbin, but rather follows
# https://developer.mozilla.org/en-US/docs/Web/HTTP/Range_requests
# and also how web servers seem to behave.
#
# In particular, in these cases we return the full response:
# - no Range header,
# - invalid Range header syntax,
# - overlapping Range header ranges
#
# Otherwise, if a range is outside of the size of the response, we
# return a 416 response.
if (!is.null(ranges)) {
ranges[ranges == Inf] <- numbytes
if (any(ranges[,2] >= numbytes)) {
res$
set_header("ETag", paste0("range", numbytes))$
set_header("Accept-Ranges", "bytes")$
set_header("Content-Range", paste0("bytes */", numbytes))$
set_header("Content-Length", 0L)$
send_status(416L)
return()
}
}
# we need the response for sure
abc <- paste(letters, collapse = "")
bytes <- substr(strrep(abc, numbytes / nchar(abc) + 1), 1, numbytes)
res$locals$range <- list(
bytes = bytes,
chunk_size = chunk_size,
pause_per_byte = pause_per_byte
)
# First part, so send status and headers
if (is.null(ranges)) {
res$
set_header("ETag", paste0("range", numbytes))$
set_header("Accept-Ranges", "bytes")$
set_header("Content-Length", numbytes)$
set_status(200L)
} else if (nrow(ranges) == 1) {
# A single range
res$
set_header("ETag", paste0("range", numbytes))$
set_header("Accept-Ranges", "bytes")$
set_header(
"Content-Range",
sprintf("bytes=%d-%d/%d", ranges[1, 1], ranges[1, 2], numbytes)
)$
set_header("Content-Length", ranges[1, 2] - ranges[1, 1] + 1L)$
set_status(206L)
# This is all we need to send
res$locals$range$bytes <- substr(
bytes,
ranges[1, 1] + 1,
ranges[1, 2] + 1L
)
} else {
# This cannot happen now, we do not support multiple ranges
# Maybe later
}
}
# send a part
chunk_size <- res$locals$range$chunk_size
pause <- res$locals$range$pause_per_byte
tosend <- substr(res$locals$range$bytes, 1, chunk_size)
res$locals$range$bytes <- substr(
res$locals$range$bytes,
chunk_size + 1L,
nchar(res$locals$range$bytes)
)
res$write(tosend)
if (pause * nchar(tosend) > 0) {
res$delay(pause * nchar(tosend))
}
})
app$get("/uuid", function(req, res) {
ret <- list(uuid = uuid_random())
res$send_json(ret, auto_unbox = TRUE, pretty = TRUE)
})
app$get(new_regexp("^/links/(?<n>[0-9]+)(/(?<offset>[0-9]+))?$"),
function(req, res) {
n <- suppressWarnings(as.integer(req$params$n))
o <- suppressWarnings(as.integer(req$params$offset))
if (length(o) == 0 || is.na(o)) o <- 1
if (length(n) == 0 || is.na(n)) return("next")
n <- min(max(1, n), 200)
o <- min(max(1, o), n)
links <- sprintf("<a href = \"/links/%d/%d\">%d</a>", n, 1:n, 1:n)
links[o] <- o
html <- paste0(
"<html><head><title>Links</title></head><body>",
paste(links, collapse = " "),
"</body></html>"
)
res$
set_type("html")$
send(html)
})
# Cookies ==============================================================
app$get("/cookies", function(req, res) {
cks <- req$cookies
res$send_json(
object = list(cookies = cks),
auto_unbox = TRUE,
pretty = TRUE
)
})
app$get("/cookies/set/:name/:value", function(req, res) {
res$add_cookie(req$params$name, req$params$value)
res$redirect("/cookies", 302L)
})
app$get("/cookies/set", function(req, res) {
for (n in names(req$query)) {
res$add_cookie(n, req$query[[n]])
}
res$redirect("/cookies", 302L)
})
app$get("/cookies/delete", function(req, res) {
for (n in names(req$query)) {
res$clear_cookie(n)
}
res$redirect("/cookies", 302L)
})
# Images ===============================================================
app$get("/image", function(req, res) {
act <- req$get_header("Accept")
ok <- c(
"image/webp",
"image/svg+xml",
"image/jpeg",
"image/png",
"image/*"
)
msg <- list(
message = "Client did not request a supported media type.",
accept = ok
)
if (is.null(act) || ! act %in% ok) {
res$
set_status(406)$
set_type("application/json")$
send_json(msg)
} else {
fls <- c(
"image/webp" = "Rlogo.webp",
"image/svg+xml" = "Rlogo.svg",
"image/jpeg" = "Rlogo.jpeg",
"image/png" = "Rlogo.png",
"image/*" = "Rlogo.png"
)
res$send_file(
root = system.file(package = "webfakes"),
file.path("examples", "httpbin", "images", fls[act])
)
}
})
app$get(new_regexp("/image/(?<format>jpeg|png|svg|webp)"),
function(req, res) {
filename <- paste0("Rlogo.", req$params$format)
res$send_file(
root = system.file(package = "webfakes"),
file.path("examples", "httpbin", "images", filename)
)
})
# Redirects ============================================================
app$get("/absolute-redirect/:n", function(req, res) {
n <- suppressWarnings(as.integer(req$params$n))
if (is.na(n)) {
return("next")
} else {
if (n == 1) {
url <- sub("/absolute-redirect/[0-9]+$", "/get", req$url)
} else {
n <- min(n, 5)
url <- paste0(sub("/[0-9]+$", "/", req$url), n - 1)
}
res$redirect(url, 302L)
}
})
app$get(c("/redirect/:n", "/relative-redirect/:n"), function(req, res) {
n <- suppressWarnings(as.integer(req$params$n))
if (is.na(n)) {
return("next")
} else {
if (n == 1) {
url <- sub("/redirect/[0-9]+$", "/get", req$path)
url <- sub("/relative-redirect/[0-9]+$", "/get", url)
} else {
n <- min(n, 5)
url <- paste0(sub("/[0-9]+$", "/", req$path), n - 1)
}
res$redirect(url, 302L)
}
})
app$all("/redirect-to", function(req, res) {
res$redirect(req$query$url, req$query$status_code %||% 302)
})
# Anything =============================================================
app$all(new_regexp("^/anything"), common_response)
app
}
/scratch/gouwar.j/cran-all/cranData/webfakes/R/httpbin.R
#' App process that is cleaned up automatically
#'
#' You can start the process with an explicit `$start()` call.
#' Alternatively it starts up at the first `$url()` or `$get_port()`
#' call.
#'
#' @inheritParams new_app_process
#' @param ... Passed to [new_app_process()].
#' @param .local_envir The environment to attach the process cleanup to.
#' Typically a frame. When this frame finishes, the process is stopped.
#'
#' @export
#' @seealso [new_app_process()] for more details.
local_app_process <- function(app, ..., .local_envir = parent.frame()) {
proc <- new_app_process(app, ...)
withr::defer(proc$stop(), envir = .local_envir)
proc
}
/scratch/gouwar.j/cran-all/cranData/webfakes/R/local-app-process.R
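# Look up the MIME type for a file extension. Exact matches in mime_types
# take precedence; otherwise the extension is matched by suffix against
# mime_types_sfx. Returns NA_character_ if there is no match.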
mime_find <- function(ext) {
stopifnot(is_string(ext))
m <- mime_types[ext]
if (is.na(m)) {
ew <- str_is_suffix(ext, names(mime_types_sfx))
m <- mime_types_sfx[ew]
}
c(m, NA_character_)[1]
}
mime_types_sfx <- c(
`3gpp` = "audio/3gpp",
`jpm` = "video/jpm",
`mp3` = "audio/mp3",
`rtf` = "text/rtf",
`wav` = "audio/wave",
`x3db` = "model/x3d+binary",
`x3dv` = "model/x3d+vrml",
`xml` = "text/xml"
)
mime_types <- c(
`3g2` = "video/3gpp2",
`3gp` = "video/3gpp",
`3gpp` = "video/3gpp",
`3mf` = "model/3mf",
ac = "application/pkix-attr-cert",
adp = "audio/adpcm",
ai = "application/postscript",
apng = "image/apng",
appcache = "text/cache-manifest",
asc = "application/pgp-signature",
atom = "application/atom+xml",
atomcat = "application/atomcat+xml",
atomsvc = "application/atomsvc+xml",
au = "audio/basic",
aw = "application/applixware",
bdoc = "application/bdoc",
bin = "application/octet-stream",
bmp = "image/bmp",
bpk = "application/octet-stream",
buffer = "application/octet-stream",
ccxml = "application/ccxml+xml",
cdmia = "application/cdmi-capability",
cdmic = "application/cdmi-container",
cdmid = "application/cdmi-domain",
cdmio = "application/cdmi-object",
cdmiq = "application/cdmi-queue",
cer = "application/pkix-cert",
cgm = "image/cgm",
class = "application/java-vm",
coffee = "text/coffeescript",
conf = "text/plain",
cpt = "application/mac-compactpro",
crl = "application/pkix-crl",
css = "text/css",
csv = "text/csv",
cu = "application/cu-seeme",
davmount = "application/davmount+xml",
dbk = "application/docbook+xml",
deb = "application/octet-stream",
def = "text/plain",
deploy = "application/octet-stream",
`disposition-notification` = "message/disposition-notification",
dist = "application/octet-stream",
distz = "application/octet-stream",
dll = "application/octet-stream",
dmg = "application/octet-stream",
dms = "application/octet-stream",
doc = "application/msword",
dot = "application/msword",
drle = "image/dicom-rle",
dssc = "application/dssc+der",
dtd = "application/xml-dtd",
dump = "application/octet-stream",
ear = "application/java-archive",
ecma = "application/ecmascript",
elc = "application/octet-stream",
emf = "image/emf",
eml = "message/rfc822",
emma = "application/emma+xml",
eps = "application/postscript",
epub = "application/epub+zip",
es = "application/ecmascript",
exe = "application/octet-stream",
exi = "application/exi",
exr = "image/aces",
ez = "application/andrew-inset",
fits = "image/fits",
g3 = "image/g3fax",
gbr = "application/rpki-ghostbusters",
geojson = "application/geo+json",
gif = "image/gif",
glb = "model/gltf-binary",
gltf = "model/gltf+json",
gml = "application/gml+xml",
gpx = "application/gpx+xml",
gram = "application/srgs",
grxml = "application/srgs+xml",
gxf = "application/gxf",
gz = "application/gzip",
h261 = "video/h261",
h263 = "video/h263",
h264 = "video/h264",
heic = "image/heic",
heics = "image/heic-sequence",
heif = "image/heif",
heifs = "image/heif-sequence",
hjson = "application/hjson",
hlp = "application/winhlp",
hqx = "application/mac-binhex40",
htm = "text/html",
html = "text/html",
ics = "text/calendar",
ief = "image/ief",
ifb = "text/calendar",
iges = "model/iges",
igs = "model/iges",
img = "application/octet-stream",
`in` = "text/plain",
ini = "text/plain",
ink = "application/inkml+xml",
inkml = "application/inkml+xml",
ipfix = "application/ipfix",
iso = "application/octet-stream",
jade = "text/jade",
jar = "application/java-archive",
jls = "image/jls",
jp2 = "image/jp2",
jpe = "image/jpeg",
jpeg = "image/jpeg",
jpf = "image/jpx",
jpg = "image/jpeg",
jpg2 = "image/jp2",
jpgm = "video/jpm",
jpgv = "video/jpeg",
jpm = "image/jpm",
jpx = "image/jpx",
js = "application/javascript",
json = "application/json",
json5 = "application/json5",
jsonld = "application/ld+json",
jsonml = "application/jsonml+json",
jsx = "text/jsx",
jxr = "image/jxr",
kar = "audio/midi",
ktx = "image/ktx",
less = "text/less",
list = "text/plain",
litcoffee = "text/coffeescript",
log = "text/plain",
lostxml = "application/lost+xml",
lrf = "application/octet-stream",
m1v = "video/mpeg",
m21 = "application/mp21",
m2a = "audio/mpeg",
m2v = "video/mpeg",
m3a = "audio/mpeg",
m4a = "audio/mp4",
m4p = "application/mp4",
ma = "application/mathematica",
mads = "application/mads+xml",
man = "text/troff",
manifest = "text/cache-manifest",
map = "application/json",
mar = "application/octet-stream",
markdown = "text/markdown",
mathml = "application/mathml+xml",
mb = "application/mathematica",
mbox = "application/mbox",
md = "text/markdown",
mdx = "text/mdx",
me = "text/troff",
mesh = "model/mesh",
meta4 = "application/metalink4+xml",
metalink = "application/metalink+xml",
mets = "application/mets+xml",
mft = "application/rpki-manifest",
mid = "audio/midi",
midi = "audio/midi",
mime = "message/rfc822",
mj2 = "video/mj2",
mjp2 = "video/mj2",
mjs = "application/javascript",
mml = "text/mathml",
mods = "application/mods+xml",
mov = "video/quicktime",
mp2 = "audio/mpeg",
mp21 = "application/mp21",
mp2a = "audio/mpeg",
mp3 = "audio/mpeg",
mp4 = "video/mp4",
mp4a = "audio/mp4",
mp4s = "application/mp4",
mp4v = "video/mp4",
mpd = "application/dash+xml",
mpe = "video/mpeg",
mpeg = "video/mpeg",
mpg = "video/mpeg",
mpg4 = "video/mp4",
mpga = "audio/mpeg",
mrc = "application/marc",
mrcx = "application/marcxml+xml",
ms = "text/troff",
mscml = "application/mediaservercontrol+xml",
msh = "model/mesh",
msi = "application/octet-stream",
msm = "application/octet-stream",
msp = "application/octet-stream",
mxf = "application/mxf",
mxml = "application/xv+xml",
n3 = "text/n3",
nb = "application/mathematica",
nq = "application/n-quads",
nt = "application/n-triples",
oda = "application/oda",
oga = "audio/ogg",
ogg = "audio/ogg",
ogv = "video/ogg",
ogx = "application/ogg",
omdoc = "application/omdoc+xml",
onepkg = "application/onenote",
onetmp = "application/onenote",
onetoc = "application/onenote",
onetoc2 = "application/onenote",
opf = "application/oebps-package+xml",
otf = "font/otf",
owl = "application/rdf+xml",
oxps = "application/oxps",
p10 = "application/pkcs10",
p7c = "application/pkcs7-mime",
p7m = "application/pkcs7-mime",
p7s = "application/pkcs7-signature",
p8 = "application/pkcs8",
pdf = "application/pdf",
pfr = "application/font-tdpfr",
pgp = "application/pgp-encrypted",
pkg = "application/octet-stream",
pki = "application/pkixcmp",
pkipath = "application/pkix-pkipath",
pls = "application/pls+xml",
png = "image/png",
prf = "application/pics-rules",
ps = "application/postscript",
pskcxml = "application/pskc+xml",
qt = "video/quicktime",
raml = "application/raml+yaml",
rdf = "application/rdf+xml",
rif = "application/reginfo+xml",
rl = "application/resource-lists+xml",
rld = "application/resource-lists-diff+xml",
rmi = "audio/midi",
rnc = "application/relax-ng-compact-syntax",
rng = "application/xml",
roa = "application/rpki-roa",
roff = "text/troff",
rq = "application/sparql-query",
rs = "application/rls-services+xml",
rsd = "application/rsd+xml",
rss = "application/rss+xml",
rtf = "application/rtf",
rtx = "text/richtext",
s3m = "audio/s3m",
sbml = "application/sbml+xml",
scq = "application/scvp-cv-request",
scs = "application/scvp-cv-response",
sdp = "application/sdp",
ser = "application/java-serialized-object",
setpay = "application/set-payment-initiation",
setreg = "application/set-registration-initiation",
sgi = "image/sgi",
sgm = "text/sgml",
sgml = "text/sgml",
shex = "text/shex",
shf = "application/shf+xml",
shtml = "text/html",
sieve = "application/sieve",
sig = "application/pgp-signature",
sil = "audio/silk",
silo = "model/mesh",
siv = "application/sieve",
slim = "text/slim",
slm = "text/slim",
smi = "application/smil+xml",
smil = "application/smil+xml",
snd = "audio/basic",
so = "application/octet-stream",
spp = "application/scvp-vp-response",
spq = "application/scvp-vp-request",
spx = "audio/ogg",
sru = "application/sru+xml",
srx = "application/sparql-results+xml",
ssdl = "application/ssdl+xml",
ssml = "application/ssml+xml",
stk = "application/hyperstudio",
stl = "model/stl",
styl = "text/stylus",
stylus = "text/stylus",
svg = "image/svg+xml",
svgz = "image/svg+xml",
t = "text/troff",
t38 = "image/t38",
tei = "application/tei+xml",
teicorpus = "application/tei+xml",
text = "text/plain",
tfi = "application/thraud+xml",
tfx = "image/tiff-fx",
tif = "image/tiff",
tiff = "image/tiff",
tr = "text/troff",
ts = "video/mp2t",
tsd = "application/timestamped-data",
tsv = "text/tab-separated-values",
ttc = "font/collection",
ttf = "font/ttf",
ttl = "text/turtle",
txt = "text/plain",
u8dsn = "message/global-delivery-status",
u8hdr = "message/global-headers",
u8mdn = "message/global-disposition-notification",
u8msg = "message/global",
uri = "text/uri-list",
uris = "text/uri-list",
urls = "text/uri-list",
vcard = "text/vcard",
vrml = "model/vrml",
vtt = "text/vtt",
vxml = "application/voicexml+xml",
war = "application/java-archive",
wasm = "application/wasm",
wav = "audio/wav",
weba = "audio/webm",
webm = "video/webm",
webmanifest = "application/manifest+json",
webp = "image/webp",
wgt = "application/widget",
wmf = "image/wmf",
woff = "font/woff",
woff2 = "font/woff2",
wrl = "model/vrml",
wsdl = "application/wsdl+xml",
wspolicy = "application/wspolicy+xml",
x3d = "model/x3d+xml",
x3db = "model/x3d+fastinfoset",
x3dbz = "model/x3d+binary",
x3dv = "model/x3d-vrml",
x3dvz = "model/x3d+vrml",
x3dz = "model/x3d+xml",
xaml = "application/xaml+xml",
xdf = "application/xcap-diff+xml",
xdssc = "application/dssc+xml",
xenc = "application/xenc+xml",
xer = "application/patch-ops-error+xml",
xht = "application/xhtml+xml",
xhtml = "application/xhtml+xml",
xhvml = "application/xv+xml",
xm = "audio/xm",
xml = "application/xml",
xop = "application/xop+xml",
xpl = "application/xproc+xml",
xsd = "application/xml",
xsl = "application/xml",
xslt = "application/xslt+xml",
xspf = "application/xspf+xml",
xvm = "application/xv+xml",
xvml = "application/xv+xml",
yaml = "text/yaml",
yang = "application/yang",
yin = "application/yin+xml",
yml = "text/yaml",
zip = "application/zip"
)
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mime.R
|
parse_authorization_header <- function(x) {
if (length(x) == 0) return(NULL)
scheme <- tolower(sub("[ ].*$", "", x))
rest <- trimws(sub("^[^ ]+[ ]", "", x))
if (scheme == "basic") {
username <- password <- NULL
tryCatch({
ptxt <- strsplit(base64_decode(rest), ":", fixed = TRUE)[[1]]
if (length(ptxt) == 2) {
username <- ptxt[1]
password <- ptxt[2]
}
}, error = function(err) NULL)
if (!is.null(username) && !is.null(password)) {
return(list(
scheme = scheme,
username = username,
password = password
))
} else {
return(NULL)
}
}
# if it has a (non-trailing) =, then it is a dictionary, otherwise token
if (grepl("=", sub("=+$", "", rest))) {
return(c(list(scheme = scheme), parse_dict_header(rest)))
}
list(scheme = scheme, token = rest)
}
parse_dict_header <- function(x) {
result <- list()
for (item in parse_list_header(x)) {
if (!grepl("=", item)) {
result[item] <- list(NULL)
next
}
key <- sub("=.*$", "", item)
value <- sub("^[^=]+=", "", item)
# https://www.rfc-editor.org/rfc/rfc2231#section-4
# we always assume UTF-8
if (grepl("[*]$", key)) {
key <- sub("[*]$", "", key)
value <- sub("^.*'.*'", "", value)
value <- utils::URLdecode(value)
Encoding(value) <- "UTF-8"
}
if (grepl('^".*"$', value)) {
value <- substr(value, 2, nchar(value) - 1L)
}
result[[key]] <- value
}
result
}
parse_list_header <- function(x) {
s <- strsplit(x, "")[[1]]
res <- character()
part <- character()
escape <- quote <- FALSE
for (cur in s) {
if (escape) {
part[length(part) + 1L] <- cur
escape <- FALSE
next
}
if (quote) {
if (cur == "\\") {
escape <- TRUE
next
} else if (cur == '"') {
quote <- FALSE
}
part[length(part) + 1L] <- cur
next
}
if (cur == ",") {
res[length(res) + 1L] <- paste(part, collapse = "")
part <- character()
next
}
if (cur == '"') {
quote <- TRUE
}
part[length(part) + 1L] <- cur
}
if (length(part)) {
res[length(res) + 1L] <- paste(part, collapse = "")
}
trimws(res)
}
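# Illustrative example (an editorial sketch, not part of the upstream
# file): `parse_list_header()` splits a header value on commas while
# honouring quoted strings, and `parse_dict_header()` turns the resulting
# `key=value` items into a named list, e.g. for Digest-style
# Authorization headers.
parse_list_header('a="x,y", b=1')
# c('a="x,y"', "b=1")
parse_dict_header('realm="api", nonce="abc", qop=auth')
# list(realm = "api", nonce = "abc", qop = "auth")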
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mw-authorization.R
|
#' Middleware that calls a CGI script
#'
#' You can use it as an unconditional middleware in `app$use()`,
#' as a handler on `app$get()`, `app$post()`, etc., or you can call it
#' from a handler. See examples below.
#'
#' @param command External command to run.
#' @param args Arguments to pass to the external command.
#' @param timeout Timeout for the external command. If the command does
#'   not terminate in time, the web server kills it and returns a 500
#' response.
#' @return A function with signature
#' ```
#' function(req, res, env = character())
#' ```
#'
#' See [RFC 3875](https://www.ietf.org/rfc/rfc3875) for details on the CGI
#' protocol.
#'
#' The request body (if any) is passed to the external command as standard
#' input. `mw_cgi()` sets `CONTENT_LENGTH`, `CONTENT_TYPE`,
#' `GATEWAY_INTERFACE`, `PATH_INFO`, `QUERY_STRING`, `REMOTE_ADDR`,
#' `REMOTE_HOST`, `REMOTE_USER`, `REQUEST_METHOD`, `SERVER_NAME`,
#' `SERVER_PORT`, `SERVER_PROTOCOL`, `SERVER_SOFTWARE`.
#'
#' It does not currently set the `AUTH_TYPE`, `PATH_TRANSLATED`,
#' `REMOTE_IDENT`, `SCRIPT_NAME` environment variables.
#'
#' The standard output of the external command is used to set the
#' response status code, the response headers and the response body.
#' Example output from git's CGI:
#' ```
#' Status: 200 OK
#' Expires: Fri, 01 Jan 1980 00:00:00 GMT
#' Pragma: no-cache
#' Cache-Control: no-cache, max-age=0, must-revalidate
#' Content-Type: application/x-git-upload-pack-advertisement
#'
#' 000eversion 2
#' 0015agent=git/2.42.0
#' 0013ls-refs=unborn
#' 0020fetch=shallow wait-for-done
#' 0012server-option
#' 0017object-format=sha1
#' 0010object-info
#' 0000
#' ```
#'
#' @family middleware
#' @export
#' @examples
#' app <- new_app()
#' app$use(mw_cgi("echo", "Status: 200\n\nHello"))
#' app
#'
#' app2 <- new_app()
#' app2$get("/greet", mw_cgi("echo", "Status: 200\n\nHello"))
#' app2
#'
#' # Using `mw_cgi()` in a handler, you can pass extra environment variables
#' app3 <- new_app()
#' cgi <- mw_cgi("echo", "Status: 200\n\nHello")
#' app3$get("/greet", function(req, res) {
#' cgi(req, res, env = c("EXTRA_VAR" = "EXTRA_VALUE"))
#' })
#' app3
mw_cgi <- function(command, args = character(),
timeout = as.difftime(Inf, units = "secs")) {
command
args
timeout <- if (timeout == Inf) {
-1
} else {
as.double(timeout, units = "secs") * 1000
}
function(req, res, env = character()) {
all_env <- c("current", cgi_env(req), env)
inp <- tempfile()
out <- tempfile()
err <- tempfile()
on.exit(unlink(c(inp, out, err)), add = TRUE)
writeBin(req$.body %||% raw(), inp)
px <- processx::process$new(
command,
args,
env = all_env,
stdin = inp,
stdout = out,
stderr = err
)
px$wait(timeout)
output <- parse_cgi_output(px, out, err)
res$set_status(output$status)
for (idx in seq_along(output$headers)) {
res$set_header(names(output$headers)[idx], output$headers[[idx]])
}
res$send(output$body)
}
}
parse_cgi_output <- function(px, out, err) {
if (px$is_alive() || px$get_exit_status() != 0) {
px$kill()
err <- tryCatch(read_char(err), error = function(e) "???")
return(list(
status = 500L,
headers = c("content-type" = "text/plain"),
body = paste0("Internal git error: ", err)
))
}
out <- read_bin(out)
err <- read_char(err)
cgi_res <- split_cgi_output(out)
headers <- cgi_res$headers
names(headers) <- tolower(names(headers))
status <- parse_status(headers[["status"]] %||% "200")
headers <- headers[names(headers) != "status"]
list(status = status, headers = headers, body = cgi_res$body)
}
cgi_env <- function(req) {
url <- parse_url(req$url)
c(
CONTENT_LENGTH = length(req$.body),
CONTENT_TYPE = if (!is.null(req$get_header)) req$get_header("content-type") %||% "",
GATEWAY_INTERFACE = "CGI/1.1",
PATH_INFO = req$path,
QUERY_STRING = req$query_string,
REMOTE_ADDR = req$remote_addr,
REMOTE_HOST = req$remote_addr,
REMOTE_USER = "anonymous",
REQUEST_METHOD = toupper(req$method),
SERVER_NAME = url$host,
SERVER_PORT = url$port,
SERVER_PROTOCOL = paste0("http/", req$http_version),
SERVER_SOFTWARE = "https://github.com/r-lib/webfakes"
)
}
split_cgi_output <- function(x) {
nlnl <- grepRaw("\r?\n\r?\n", x)[1]
if (is.na(nlnl)) {
stop("Invalid response from git cgi, no headers?")
}
headers <- parse_headers(rawToChar(x[1:(nlnl - 1L)]))
body <- x[nlnl:length(x)]
ndrop <- 1L
while (body[ndrop] != 0x0a) ndrop <- ndrop + 1L
ndrop <- ndrop + 1L
while (body[ndrop] != 0x0a) ndrop <- ndrop + 1L
body <- utils::tail(body, -ndrop)
list(headers = headers, body = body)
}
parse_status <- function(x) {
status <- as.integer(strsplit(x, " ", fixed = TRUE)[[1]][1])
if (is.na(status)) {
stop("Invalid status from git cgi: ", x)
}
status
}
parse_headers <- function (txt) {
headers <- grep(":", parse_headers0(txt), fixed = TRUE, value = TRUE)
out <- lapply(headers, split_header)
names <- tolower(vapply(out, `[[`, character(1), 1))
values <- lapply(lapply(out, `[[`, 2), trimws)
names(values) <- names
values
}
parse_headers0 <- function (txt, multiple = FALSE) {
if (!length(txt))
return(NULL)
if (is.raw(txt)) {
txt <- rawToChar(txt)
}
stopifnot(is.character(txt))
if (length(txt) > 1) {
txt <- paste(txt, collapse = "\n")
}
sets <- strsplit(txt, "\\r\\n\\r\\n|\\n\\n|\\r\\r")[[1]]
headers <- strsplit(sets, "\\r\\n|\\n|\\r")
if (multiple) {
headers
}
else {
headers[[length(headers)]]
}
}
split_header <- function(x) {
pos <- grepRaw(":", x, fixed = TRUE)[1]
if (is.na(pos)) {
stop("Invalid response header from git cgi: ", x)
}
c(substr(x, 1, pos - 1L), substr(x, pos + 1L, nchar(x)))
}
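# Illustrative example (an editorial sketch, not part of the upstream
# file): how a raw CGI response is split into status, headers and body
# by the helpers above.
cgi_output <- charToRaw(paste0(
  "Status: 200 OK\r\n",
  "Content-Type: text/plain\r\n",
  "\r\n",
  "hello"
))
split_cgi_output(cgi_output)
# $headers is list(status = "200 OK", `content-type` = "text/plain"),
# $body is charToRaw("hello")
parse_status("200 OK")
# 200L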
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mw-cgi.R
|
#' Middleware to parse Cookies
#'
#' Adds the cookies as the `cookies` element of the request object.
#'
#' It ignores cookies in an invalid format. It ignores duplicate cookies:
#' if two cookies have the same name, only the first one is included.
#'
#' @return Handler function.
#'
#' @family middleware
#' @export
mw_cookie_parser <- function() {
function(req, res) {
ch <- req$get_header("Cookie") %||% ""
req$cookies <- parse_cookies(ch)
"next"
}
}
parse_cookies <- function(x) {
parts <- strsplit(x, ";", fixed = TRUE)[[1]]
dict <- structure(list(), names = character())
lapply(parts, function(ck) {
ck <- trimws(ck)
key <- sub("^([^=]+)=.*$", "\\1", ck)
if (key == ck) return()
if (!is.null(dict[[key]])) return()
value <- sub("^[^=]+=", "", ck)
dict[[key]] <<- value
})
dict
}
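# Illustrative example (editorial sketch): what `mw_cookie_parser()` puts
# into `req$cookies` for a typical `Cookie` header. For duplicate names
# only the first value is kept.
parse_cookies("sessionid=abc123; theme=dark; sessionid=zzz")
# list(sessionid = "abc123", theme = "dark")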
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mw-cookie-parser.R
|
#' Middleware that add an `ETag` header to the response
#'
#' If the response already has an `ETag` header, then it is kept.
#'
#' This middleware handles the `If-None-Match` headers, and it sets the
#' status code of the response to 304 if `If-None-Match` matches the
#' `ETag`. It also removes the response body in this case.
#'
#' @param algorithm Checksum algorithm to use. Only `"crc32"` is
#' implemented currently.
#'
#' @return Handler function.
#'
#' @family middleware
#' @export
#' @examples
#' app <- new_app()
#' app$use(mw_etag())
#' app
mw_etag <- function(algorithm = "crc32") {
if (algorithm != "crc32") {
stop("Only the 'crc32' algorithm is implemented in `mw_etag()`")
}
function(req, res) {
do <- function(req, res) {
etag <- res$get_header("ETag")
if (is.null(etag)) {
etag <- paste0("\"", crc32(res$.body), "\"")
res$set_header("ETag", etag)
}
req_etag <- req$get_header("If-None-Match")
if (!is.null(req_etag) && req_etag == etag) {
res$.body <- NULL
res$set_status(304)
}
}
res$on_response(do)
"next"
}
}
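# Rough sketch (editorial addition, assumes the curl package): a client
# that sends back the `ETag` it received in `If-None-Match` gets a 304
# response with an empty body. Shown as comments so that sourcing this
# file has no side effects.
# app <- new_app()
# app$use(mw_etag())
# app$get("/x", function(req, res) res$send("hello"))
# pr <- new_app_process(app)
# r1 <- curl::curl_fetch_memory(pr$url("/x"))
# etag <- curl::parse_headers_list(r1$headers)[["etag"]]
# h <- curl::new_handle()
# curl::handle_setheaders(h, "If-None-Match" = etag)
# r2 <- curl::curl_fetch_memory(pr$url("/x"), handle = h)
# r2$status_code   # 304
# pr$stop()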
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mw-etag.R
|
#' Middleware to parse a JSON body
#'
#' Adds the parsed object as the `json` element of the request object.
#'
#' @param type Content type to match before parsing. If it does not
#' match, then the request object is not modified.
#' @param simplifyVector Whether to simplify lists to vectors, passed to
#' [jsonlite::fromJSON()].
#' @param ... Arguments to pass to [jsonlite::fromJSON()], that performs
#' the JSON parsing.
#' @return Handler function.
#'
#' @family middleware
#' @export
#' @examples
#' app <- new_app()
#' app$use(mw_json())
#' app
mw_json <- function(type = "application/json",
simplifyVector = FALSE,
...) {
type; simplifyVector; list(...)
function(req, res) {
ct <- req$get_header("Content-Type") %||% ""
if (! ct %in% tolower(type)) return("next")
if (!is.null(req$.body)) {
req$json <- jsonlite::fromJSON(
rawToChar(req$.body),
simplifyVector = simplifyVector,
...
)
}
"next"
}
}
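# Small sketch (editorial addition): a route that reads a field from a
# JSON request body parsed by `mw_json()`. The route and field names are
# made up for illustration.
example_json_app <- new_app()
example_json_app$use(mw_json())
example_json_app$post("/echo", function(req, res) {
  res$send_json(list(got = req$json$msg), auto_unbox = TRUE)
})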
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mw-json.R
|
#' Log requests to the standard output or other connection
#'
#' A one line log entry for every request. The output looks like this:
#' ```
#' GET http://127.0.0.1:3000/image 200 3 ms - 4742
#' ```
#' and contains
#' * the HTTP method,
#' * the full request URL,
#' * the HTTP status code of the response,
#' * how long it took to process the response, in ms,
#' * and the size of the response body, in bytes.
#'
#' @param format Log format. Not implemented currently.
#' @param stream R connection to log to. `"stdout"` means the standard
#' output, `"stderr"` is the standard error. You can also supply a
#' connection object, but then you need to be sure that it will be
#' valid when the app is actually running.
#' @return Handler function.
#'
#' @family middleware
#' @export
#' @examples
#' app <- new_app()
#' app$use(mw_log())
#' app
mw_log <- function(format = "dev", stream = "stdout") {
format; stream
function(req, res) {
start <- Sys.time()
fmt <- function(req, res) {
if (identical(stream, "stdout")) stream <- stdout()
if (identical(stream, "stderr")) stream <- stderr()
len <- if (is.null(res$.body)) {
0L # nocov
} else if (is.raw(res$.body)) {
length(res$.body)
} else if (is_string(res$.body)) {
nchar(res$.body, type = "bytes")
} else {
"??" # nocov
}
t <- as.integer(round((Sys.time() - start) * 1000))
msg <- sprintf(
"%s %s %s %s ms - %s\n",
toupper(req$method), req$url, res$.status, t, len
)
cat0(msg, file = stream)
if (inherits(stream, "connection")) flush(stream)
}
res$on_response(fmt)
"next"
}
}
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mw-log.R
|
#' Parse a multipart HTTP request body
#'
#' Adds the parsed form fields in the `form` element of the request and
#' the parsed files to the `files` element.
#'
#' @param type Content type to match before parsing. If it does not
#' match, then the request object is not modified.
#' @return Handler function.
#'
#' @family middleware
#' @export
#' @examples
#' app <- new_app()
#' app$use(mw_multipart())
#' app
mw_multipart <- function(type = "multipart/form-data") {
type
function(req, res) {
ct <- req$get_header("Content-Type") %||% ""
if (!any(vapply(
paste0("^", type),
function(x) grepl(x, ct),
logical(1)))) return("next")
parts <- str_trim(strsplit(ct, ";", fixed = TRUE)[[1]])
bnd <- grep("boundary=", parts, value = TRUE)[1]
if (is.na(bnd)) return("next")
bnd <- sub("^boundary=", "", bnd)
tryCatch({
mp <- parse_multipart(req$.body, bnd)
req$form <- list()
req$files <- list()
for (p in mp) {
if (is.null(p$filename)) {
req$form[[p$name]] <- rawToChar(p$value)
} else {
req$files[[p$name]] <- list(filename = p$filename, value = p$value)
}
}
}, error = function(err) NULL)
"next"
}
}
parse_multipart <- function(body, boundary) {
boundary <- paste0("--", boundary)
boundary_length <- nchar(boundary)
# Find the locations of the boundary string
indexes <- grepRaw(boundary, body, fixed = TRUE, all = TRUE)
if (!length(indexes)) stop("Boundary was not found in the body.")
if (length(indexes) == 1) {
if (length(body) < (boundary_length + 5)) {
# Empty HTML5 FormData object
return(list())
} else {
# Something went wrong
stop("The 'boundary' was only found once in the ",
"multipart/form-data message. It should appear at ",
"least twice. The request-body might be truncated.")
}
}
parts <- list()
for (i in seq_along(utils::head(indexes, -1))) {
from <- indexes[i] + boundary_length
to <- indexes[i + 1] -1
parts[[i]] <- body[from:to]
}
out <- lapply(parts, multipart_sub)
names(out) <- vapply(
out,
function(x) as.character(x$name),
character(1)
)
out
}
multipart_sub <- function(bodydata) {
splitchar <- grepRaw("\\r\\n\\r\\n|\\n\\n|\\r\\r", bodydata)
if (!length(splitchar)) {
stop("Invalid multipart subpart:\n\n", rawToChar(bodydata))
}
headers <- bodydata[1:(splitchar-1)]
headers <- str_trim(rawToChar(headers))
headers <- gsub("\r\n", "\n", headers)
headers <- gsub("\r", "\n", headers)
headerlist <- unlist(lapply(strsplit(headers, "\n")[[1]], str_trim))
dispindex <- grep("^Content-Disposition:", headerlist)
if (!length(dispindex)) {
stop("Content-Disposition header not found:", headers)
}
dispheader <- headerlist[dispindex]
#get parameter name
m <- regexpr("; name=\\\"(.*?)\\\"", dispheader)
if (m < 0)
stop('failed to find the name="..." header')
namefield <- unquote(sub(
"; name=",
"",
regmatches(dispheader, m),
fixed = TRUE
))
#test for file upload
m <- regexpr("; filename=\\\"(.*?)\\\"", dispheader)
if (m < 0) {
filenamefield = NULL
} else {
filenamefield <- unquote(sub(
"; filename=",
"",
regmatches(dispheader, m),
fixed = TRUE
))
}
#filedata
splitval <- grepRaw("\\r\\n\\r\\n|\\n\\n|\\r\\r", bodydata, value=TRUE)
start <- splitchar + length(splitval)
if (identical(utils::tail(bodydata, 2), charToRaw("\r\n"))) {
end <- length(bodydata) - 2
} else {
end <- length(bodydata) - 1
}
#the actual fields
list (
name = namefield,
value = bodydata[start:end],
filename = filenamefield
)
}
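# Illustrative example (editorial sketch): parsing a tiny
# multipart/form-data body with the helpers above. Text fields end up in
# `form`, uploads in `files`, when used via `mw_multipart()`.
example_body <- charToRaw(paste0(
  "--XYZ\r\n",
  "Content-Disposition: form-data; name=\"field1\"\r\n",
  "\r\n",
  "value1\r\n",
  "--XYZ\r\n",
  "Content-Disposition: form-data; name=\"file1\"; filename=\"a.txt\"\r\n",
  "Content-Type: text/plain\r\n",
  "\r\n",
  "file contents\r\n",
  "--XYZ--\r\n"
))
example_parts <- parse_multipart(example_body, "XYZ")
rawToChar(example_parts$field1$value)   # "value1"
example_parts$file1$filename            # "a.txt"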
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mw-multipart.R
|
#' Middleware to parse a Range header
#'
#' Adds the requested ranges to the `ranges` element of the request
#' object. `request$ranges` is a two-column matrix; the first column (`from`)
#' is the start and the second (`to`) is the end of each requested interval.
#'
#' When the last `n` bytes of the file are requested, the matrix row is set
#' to `c(0, -n)`. When all bytes after a `p` position are requested, the
#' matrix row is set to `c(p, Inf)`.
#'
#' If the intervals overlap, then `ranges` is not set, i.e. the `Range`
#' header is ignored.
#'
#' If its syntax is invalid or the unit is not `bytes`, then the
#' `Range` header is ignored.
#'
#' @return Handler function.
#'
#' @family middleware
#' @export
mw_range_parser <- function() {
function(req, res) {
rh <- req$get_header("Range")
if (length(rh) == 0) return("next")
req$ranges <- parse_range(rh)
"next"
}
}
parse_range <- function(rh) {
rh <- trimws(rh)
if (length(rh) == 0 || !startsWith(rh, "bytes=")) return()
rh <- sub("^bytes=[ ]*", "", rh)
rngs <- trimws(strsplit(rh, ",", fixed = TRUE)[[1]])
res <- matrix(integer(1), nrow = length(rngs), ncol = 2)
for (i in seq_along(rngs)) {
rng <- strsplit(rngs[i], "-")[[1]]
if (length(rng) < 1 || length(rng) > 3) {
return()
} else if (length(rng) == 1) {
res[i, 1] <- parse_int(rng[1])
res[i, 2] <- Inf
if (is.na(res[i, 1]) || res[i, 1] < 0) return()
} else if (rng[1] == "") {
res[i, 1] <- 0
res[i, 2] <- -parse_int(rng[2])
if (is.na(res[i, 2]) || res[i, 2] > 0) return()
} else {
res[i, 1] <- parse_int(rng[1])
res[i, 2] <- parse_int(rng[2])
if (is.na(res[i, 1]) || is.na(res[i, 2]) ||
res[i, 1] < 0 || res[i, 2] < 0 || res[i, 1] > res[i, 2]) {
return()
}
}
}
res <- res[order(res[,1]), , drop = FALSE]
# check for overlapping intervals
if (intervals_overlap(res)) return()
res
}
parse_int <- function(x) {
suppressWarnings(as.integer(x))
}
intervals_overlap <- function(x) {
# assume that it is sorted on first column
# then every interval needs to finish before the next one starts
if (nrow(x) <= 1) return(FALSE)
any(x[,2][-nrow(x)] >= x[,1][-1])
}
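# Illustrative example (editorial sketch): the three forms of byte ranges
# that `parse_range()` understands.
parse_range("bytes=0-99, 200-299")
# two rows: c(0, 99) and c(200, 299)
parse_range("bytes=-500")
# one row: c(0, -500), i.e. the last 500 bytes
parse_range("bytes=9500-")
# one row: c(9500, Inf), i.e. everything from offset 9500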
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mw-range-parser.R
|
#' Middleware to read the raw body of a request
#'
#' Adds the raw body, as a raw object to the `raw` field of the request.
#'
#' @param type Content type to match. If it does not match, then the
#' request object is not modified.
#' @return Handler function.
#'
#' @family middleware
#' @export
#' @examples
#' app <- new_app()
#' app$use(mw_raw())
#' app
mw_raw <- function(type = "application/octet-stream") {
function(req, res) {
ct <- req$get_header("Content-Type") %||% ""
if (! ct %in% tolower(type)) return("next")
req$raw <- req$.body
"next"
}
}
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mw-raw.R
|
#' Middleware function to serve static files
#'
#' The content type of the response is set automatically from the
#' extension of the file. Note that this is a terminal middleware
#' handler function. If a file is served, then the rest of the handler
#' functions will not be called. If a file was not found, however,
#' the rest of the handlers are still called.
#'
#' @param root Root path of the served files. Everything under this
#'   directory is served automatically. Directory listings are not currently
#'   supported.
#' @param set_headers Callback function to call before a file is served.
#' @return Handler function.
#'
#' @family middleware
#' @export
#' @examples
#' root <- system.file(package = "webfakes", "examples", "static", "public")
#' app <- new_app()
#' app$use(mw_static(root = root))
#' app
mw_static <- function(root, set_headers = NULL) {
root; set_headers
function(req, res) {
path <- file.path(root, sub("^/", "", req$path))
if (!file.exists(path)) return("next")
if (file.info(path)$isdir) return("next")
ext <- tools::file_ext(basename(path))
ct <- mime_find(ext)
if (!is.na(ct)) {
res$set_header("Content-Type", ct)
}
if (!is.null(set_headers)) set_headers(req, res)
res$send(read_bin(path))
}
}
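# Sketch (editorial addition): the `Content-Type` is looked up from the
# file extension with the internal `mime_find()` helper defined in
# mime.R; unknown extensions return `NA` and no header is set.
# mime_find("png")    # "image/png"
# mime_find("nope")   # NA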
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mw-static.R
|
#' Middleware to parse a plain text body
#'
#' Adds the parsed object as the `text` element of the request object.
#'
#' @param default_charset Encoding to set on the text.
#' @param type Content type to match before parsing. If it does not
#' match, then the request object is not modified.
#' @return Handler function.
#'
#' @family middleware
#' @export
#' @examples
#' app <- new_app()
#' app$use(mw_text())
#' app
mw_text <- function(default_charset = "utf-8",
type = "text/plain") {
default_charset; type
function(req, res) {
ct <- req$get_header("Content-Type") %||% ""
if (! ct %in% tolower(type)) return("next")
req$text <- rawToChar(req$.body %||% raw())
Encoding(req$text) <- default_charset
"next"
}
}
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mw-text.R
|
#' Middleware to parse an url-encoded request body
#'
#' This is typically data from a form. The parsed data is added
#' as the `form` element of the request object.
#'
#' @param type Content type to match before parsing. If it does not
#' match, then the request object is not modified.
#' @return Handler function.
#'
#' @family middleware
#' @export
#' @examples
#' app <- new_app()
#' app$use(mw_urlencoded())
#' app
mw_urlencoded <- function(type = "application/x-www-form-urlencoded") {
function(req, res) {
ct <- req$get_header("Content-Type") %||% ""
if (! ct %in% tolower(type)) return("next")
if (!is.null(req$.body)) {
req$form <- parse_query(rawToChar(req$.body))
}
"next"
}
}
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/mw-urlencoded.R
|
#' Fake OAuth 2.0 resource and authorization app
#'
#' @includeRmd man/rmd-fragments/oauth2.Rmd description
#'
#' @details
#' The app has the following endpoints:
#' * `GET /register` is the endpoint that you can use to register
#' your third party app. It needs to receive the `name` of the
#' third party app, and its `redirect_uri` as query parameters,
#' otherwise returns an HTTP 400 error. On success it returns a
#' JSON dictionary with entries `name` (the name of the third party
#' app), `client_id`, `client_secret` and `redirect_uri`.
#' * `GET /authorize` is the endpoint where the user of the third
#' party app is sent. You can change the URL of this endpoint with
#' the `authorize_endpoint` argument. It needs to receive the `client_id`
#' of the third party app, and its correct `redirect_uri` as query
#' parameters. It may receive a `state` string as well, which can
#' be used by a client to identify the request. Otherwise it
#'   generates a random `state` string. On error it fails with an HTTP
#' 400 error. On success it returns a simple HTML login page.
#' * `POST /authorize/decision` is the endpoint where the HTML login
#' page generated at `/authorize` connects back to, either with a
#' positive or negative result. The form on the login page will send
#' the `state` string and the user's choice in the `action` variable.
#' If the user authorized the third party app, then they are
#' redirected to the `redirect_uri` of the app, with a temporary
#' `code` and the `state` string supplied as query parameters.
#' Otherwise a simple HTML page is returned.
#' * `POST /token` is the endpoint where the third party app requests
#'   a temporary access token. It is also used for refreshing an
#' access token with a refresh token. You can change the URL of this
#' endpoint with the `token_endpoint` argument.
#' To request a new token or refresh an existing one, the following
#' data must be included in either a JSON or an URL encoded request body:
#' - `grant_type`, this must be `authorization_code` for new tokens,
#' and `refresh_token` for refreshing.
#' - `code`, this must be the temporary code obtained from the
#' `/authorize/decision` redirection, for new tokens. It is not
#' needed when refreshing.
#' - `client_id` must be the client id of the third party app.
#' - `client_secret` must be the client secret of the third party
#' app.
#' - `redirect_uri` must be the correct redirection URI of the
#' third party app. It is not needed when refreshing tokens.
#' - `refresh_token` must be the refresh token obtained previously,
#' when refreshing a token. It is not needed for new tokens.
#' On success a JSON dictionary is returned with entries:
#' `access_token`, `expiry` and `refresh_token`. (The latter is
#' omitted if the `refresh` argument is `FALSE`).
#' * `GET /locals` returns a list of current apps, access tokens and
#' refresh tokens.
#' * `GET /data` is an endpoint that returns a simple JSON response,
#' and needs authorization.
#'
#' ## Notes
#'
#' * Using this app in your tests requires the glue package, so you
#' need to put it in `Suggests`.
#' * You can add custom endpoints to the app, as needed.
#' * If you need authorization in your custom endpoint, call
#' `app$is_authorized()` in your handler:
#' ```
#' if (!app$is_authorized(req, res)) return()
#' ```
#' `app$is_authorized()` returns an HTTP 401 response if the
#' client is not authorized, so you can simply return from your
#' handler.
#'
#' For more details see `vignette("oauth", package = "webfakes")`.
#'
#' @section `oauth2_resource_app()`:
#' App representing the API server (resource/authorization)
#' @param access_duration After how many seconds should access tokens
#' expire.
#' @param refresh_duration After how many seconds should refresh
#' tokens expire (ignored if `refresh` is `FALSE`).
#' @param refresh Should a refresh token be returned (logical).
#' @param seed Random seed used when creating tokens. If `NULL`,
#' we rely on R to provide a seed. The app uses its own RNG stream,
#' so it does not affect reproducibility of the tests.
#' @param authorize_endpoint The authorization endpoint of the resource
#' server. Change this from the default if the real app that you
#' are faking does not use `/authorize`.
#' @param token_endpoint The endpoint to request tokens. Change this if the
#' real app that you are faking does not use `/token`.
#' @return webfakes app
#'
#' @export
#' @family OAuth2.0 functions
oauth2_resource_app <- function(access_duration = 3600L,
refresh_duration = 7200L,
refresh = TRUE, seed = NULL,
authorize_endpoint = "/authorize",
token_endpoint = "/token") {
access_duration
refresh_duration
refresh
seed
authorize_endpoint
token_endpoint
app <- new_app()
app$locals$seed <- seed %||% get_seed()
# Parse body for /authorize/decision, /token
app$use(mw_urlencoded())
app$use(mw_json())
app$locals$tpapps <- data.frame(
name = character(),
client_id = character(),
client_secret = character(),
redirect_uri = character()
)
app$set_config("views", system.file("views", package = "webfakes"))
app$engine("html", tmpl_glue())
app$get("/register", function(req, res) {
if (is.null(req$query$name) || is.null(req$query$redirect_uri)) {
res$
set_status(400L)$
send("Cannot register without 'name' and 'redirect_uri'")
return()
}
rec <- list(
name = req$query$name,
client_id = paste0("id-", generate_token()),
client_secret = paste0("secret-", generate_token()),
redirect_uri = req$query$redirect_uri
)
app$locals$tpapps <- rbind(app$locals$tpapps, rec)
res$send_json(rec)
})
app$get(authorize_endpoint, function(req, res) {
# Missing or invalid client id
client_id <- req$query$client_id
if (is.null(client_id)) {
res$
set_status(400L)$
send("Invalid authorization request, no client id")
return()
} else if (! client_id %in% app$locals$tpapps$client_id) {
res$
set_status(400L)$
send("Invalid authorization request, unknown client id")
return()
}
tpapps <- app$locals$tpapps
tprec <- tpapps[match(client_id, tpapps$client_id), ]
# Bad redirect URL?
if (req$query$redirect_uri %||% "" != tprec$redirect_uri) {
res$
set_status(400L)$
send(paste0(
"Invalid authorization request, redirect URL mismatch: ",
req$query$redirect_uri %||% "", " vs ", tprec$redirect_uri
))
return()
}
state <- req$query$state %||% generate_token()
app$locals$states <- c(app$locals$states, set_name(client_id, state))
html <- res$render("authorize", list(state = state, app = tprec$name))
res$
set_type("text/html")$
send(html)
})
app$post(paste0(authorize_endpoint, "/decision"), function(req, res) {
state <- req$form$state
if (is.null(state) || ! state %in% names(app$locals$states)) {
res$
set_status(400L)$
send("Invalid decision, no state")
return()
}
client_id <- app$locals$states[state]
tpapps <- app$locals$tpapps
tprec <- tpapps[match(client_id, tpapps$client_id), ]
    app$locals$states <-
      app$locals$states[setdiff(names(app$locals$states), state)]
if (req$form$action %||% "" == "yes") {
code <- generate_token()
# TODO: make this app specific
app$locals$codes <- c(app$locals$codes, code)
red_uri <- paste0(tprec$redirect_uri, "?code=", code, "&state=", state)
res$redirect(red_uri)$send()
} else {
res$
send("Maybe next time.")
}
})
local_app_seed <- function(.local_envir = parent.frame()) {
old_seed <- get_seed()
set_seed(app$locals$seed)
defer({
app$locals$seed <- get_seed()
set_seed(old_seed)
}, envir = .local_envir)
}
new_access_token <- function(client_id, duration) {
local_app_seed()
token <- paste0("token-", generate_token())
new <- data.frame(
stringsAsFactors = FALSE,
client_id = client_id,
token = token,
expiry = Sys.time() + duration
)
app$locals$tokens <- rbind(app$locals$tokens, new)
token
}
new_refresh_token <- function(client_id, duration) {
local_app_seed()
token <- paste0("refresh-token-", generate_token())
new <- data.frame(
stringsAsFactors = FALSE,
client_id = client_id,
token = token,
expiry = Sys.time() + duration
)
app$locals$refresh_tokens <- rbind(app$locals$refresh_tokens, new)
token
}
expire_tokens <- function() {
if (!is.null(app$locals$tokens)) {
app$locals$tokens <- app$locals$tokens[
app$locals$tokens$expiry > Sys.time(),,
drop = FALSE
]
}
if (!is.null(app$locals$refresh_tokens)) {
app$locals$refresh_tokens <- app$locals$refresh_tokens[
app$locals$refresh_tokens$expiry > Sys.time(),,
drop = FALSE
]
}
}
is_valid_token <- function(token) {
expire_tokens()
token %in% app$locals$tokens$token
}
is_valid_refresh_token <- function(client_id, token) {
expire_tokens()
wh <- match(token, app$locals$refresh_tokens$token)
if (is.na(wh)) return(FALSE)
# client ID must match as well
    app$locals$refresh_tokens$client_id[wh] == client_id
}
# For refresh tokens
app$post(token_endpoint, function(req, res) {
    if ((req$form$grant_type %||% "") != "refresh_token") return("next")
client_id <- req$form$client_id
tpapps <- app$locals$tpapps
if (! client_id %in% tpapps$client_id) {
res$
set_status(400L)$
send("Invalid client id")
return()
}
tprec <- tpapps[match(client_id, tpapps$client_id), ]
if (req$form$client_secret %||% "" != tprec$client_secret) {
res$
set_status(400L)$
send("Invalid token request, client secret mismatch")
return()
}
if (!is_valid_refresh_token(client_id, req$form$refresh_token)) {
res$
set_status(400L)$
send_json(list(error = "invalid_request"), auto_unbox = TRUE)
return()
}
res$send_json(list(
access_token = new_access_token(client_id, access_duration),
expiry = access_duration,
refresh_token = new_refresh_token(client_id, refresh_duration)
), auto_unbox = TRUE)
})
# For regular tokens
app$post(token_endpoint, function(req, res) {
if (req$form$grant_type %||% "" != "authorization_code") {
res$
set_status(400L)$
send("Invalid grant type, must be 'authorization_code'")
return()
}
if (! req$form$code %in% app$locals$codes) {
res$
set_status(400L)$
send("Unknown authorization code")
return()
}
tpapps <- app$locals$tpapps
client_id <- req$form$client_id
if (! client_id %in% tpapps$client_id) {
res$
set_status(400L)$
send("Invalid client id")
return()
}
tprec <- tpapps[match(client_id, tpapps$client_id), ]
if (req$form$client_secret %||% "" != tprec$client_secret) {
res$
set_status(400L)$
send("Invalid token request, client secret mismatch")
return()
}
if (req$form$redirect_uri %||% "" != tprec$redirect_uri) {
res$
set_status(400L)$
send("Invalid token request, redirect URL mismatch")
return()
}
    app$locals$codes <- setdiff(app$locals$codes, req$form$code)
res$send_json(list(
access_token = new_access_token(client_id, access_duration),
expiry = access_duration,
refresh_token =
if (refresh) new_refresh_token(client_id, refresh_duration)
), auto_unbox = TRUE, pretty = TRUE)
})
app$get("/locals", function(req, res) {
res$
set_status(200L)$
send_json(list(
apps = app$locals$tpapps,
access = app$locals$tokens,
refresh = app$locals$refresh_tokens
), auto_unbox = TRUE)
})
app$is_authorized <- function(req, res) {
expire_tokens()
if (!("Authorization" %in% names(req$headers))) {
res$
set_status(401L)$
send("Missing bearer token")
return(FALSE)
}
token <- gsub("Bearer ", "", req$headers$Authorization[[1]])
if (!is_valid_token(token)) {
res$
set_status(401L)$
send("Invalid bearer token")
return(FALSE)
}
TRUE
}
app$get("/data", function(req, res) {
if (!app$is_authorized(req, res)) return()
res$send_json(list(data = "top secret!"))
})
app
}
#' App representing the third-party app
#'
#' @includeRmd man/rmd-fragments/oauth2.Rmd description
#'
#' @details
#' Endpoints:
#' * `POST /login/config` Use this endpoint to configure the client ID
#' and the client secret of the app, received from
#' [oauth2_resource_app()] (or another resource app). You need to
#' send in a JSON or URL encoded body:
#' - `auth_url`, the authorization URL of the resource app.
#' - `token_url`, the token URL of the resource app.
#' - `client_id`, the client ID, received from the resource app.
#' - `client_secret` the client secret, received from the resource
#' app.
#' * `GET /login` Use this endpoint to start the login process. It
#' will redirect to the resource app for authorization and after the
#' OAuth2.0 dance to `/login/redirect`.
#' * `GET /login/redirect`, `POST /login/redirect` This is the
#' redirect URI of the third party app. (Some HTTP clients redirect
#' a `POST` to a `GET`, others don't, so it has both.) This endpoint
#' is used by the resource app, and it received the `code` that can
#'   is used by the resource app, and it receives the `code` that can
#'   be exchanged for an access token and the `state` which was
#' access token, and then stores the token in its `app$locals`
#' local variables. It fails with HTTP code 500 if it cannot obtain
#' an access token. On success it returns a JSON dictionary with
#' `access_token`, `expiry` and `refresh_token` (optionally) by
#' default. This behavior can be changed by redefining the
#' `app$redirect_hook()` function.
#' * `GET /locals` returns the tokens that were obtained from the
#' resource app.
#' * `GET /data` is an endpoint that uses the obtained token(s) to
#' connect to the `/data` endpoint of the resource app. The `/data`
#' endpoint of the resource app needs authorization. It responds
#' with the response of the resource app. It tries to refresh the
#' access token of the app if needed.
#'
#' For more details see `vignette("oauth", package = "webfakes")`.
#'
#' @param name Name of the third-party app
#' @return webfakes app
#' @export
#' @family OAuth2.0 functions
oauth2_third_party_app <- function(name = "Third-Party app") {
app <- new_app()
app$use(mw_urlencoded())
app$use(mw_json())
app$locals$auth_url <- NA_character_
app$locals$token_url <- NA_character_
app$locals$client_id <- NA_character_
app$locals$client_secret <- NA_character_
app$post("/login/config", function(req, res) {
if (is.null(req$json$auth_url) || is.null(req$json$token_url) ||
is.null(req$json$client_id) || is.null(req$json$client_secret)) {
res$
set_status(400L)$
send("Need `client_id` and `client_secret` to config auth")
return()
}
if (!is.na(app$locals$auth_url)) {
res$
set_status(400L)$
send("Auth already configured")
return()
}
app$locals$auth_url <- req$json$auth_url
app$locals$token_url <- req$json$token_url
app$locals$client_id <- req$json$client_id
app$locals$client_secret <- req$json$client_secret
res$send_json(list(response = "Authorization configured"))
})
app$get("/login", function (req, res) {
state <- generate_token()
app$locals$state <- state
url <- paste0(
app$locals$auth_url,
"?client_id=", app$locals$client_id,
"&redirect_uri=", paste0(req$url, "/redirect"),
"&state=", state
)
res$redirect(url)$send()
})
# I could not convince curl to redirect a POST to a GET, so both are good
app$all("/login/redirect", function (req, res) {
code <- req$query$code
state <- req$query$state
if (is.null(code) || is.null(state)) {
res$
set_status(400L)$
send("Invalid request via auth server, no 'code' or 'state'.")
return()
}
if (state != app$locals$state) {
res$
set_status(400L)$
send("Unknown state in request via auth server")
return()
}
# Get a token
handle <- curl::new_handle()
data <- charToRaw(paste0(
"grant_type=authorization_code&",
"code=", code, "&",
"client_id=", app$locals$client_id, "&",
"client_secret=", app$locals$client_secret, "&",
"redirect_uri=", req$url
))
curl::handle_setheaders(
handle,
"content-type" = "application/x-www-form-urlencoded"
)
curl::handle_setopt(
handle,
customrequest = "POST",
postfieldsize = length(data),
postfields = data
)
resp <- curl::curl_fetch_memory(app$locals$token_url, handle = handle)
if (resp$status_code != 200L) {
res$
set_status(500L)$
send(paste0(
"Failed to acquire authorization token. ",
rawToChar(resp$content)
))
return()
}
tokens <- rawToChar(resp$content)
app$locals$tokens <- jsonlite::fromJSON(tokens)
app$redirect_hook(res, tokens)
})
app$redirect_hook <- function(res, tokens) {
res$
send_json(text = tokens)
}
app$get("/locals", function(req, res) {
res$
set_status(200L)$
send_json(app$locals$tokens, auto_unbox = TRUE)
})
get_data <- function() {
auth <- paste("Bearer", app$locals$tokens$access_token)
handle <- curl::new_handle()
curl::handle_setheaders(handle, Authorization = auth)
url <- modify_path(app$locals$token_url, "/data")
curl::curl_fetch_memory(url, handle = handle)
}
try_refresh <- function() {
refresh_token <- app$locals$tokens$refresh_token
if (is.null(refresh_token)) return(FALSE)
data <- charToRaw(paste0(
"refresh_token=", refresh_token, "&",
"grant_type=refresh_token"
))
handle <- curl::new_handle()
curl::handle_setheaders(
handle,
"content-type" = "application/x-www-form-urlencoded"
)
curl::handle_setopt(
handle,
customrequest = "POST",
postfieldsize = length(data),
postfields = data
)
resp <- curl::curl_fetch_memory(app$locals$token_url, handle = handle)
if (resp$status_code != 200L) return(FALSE)
tokens <- rawToChar(resp$content)
app$locals$tokens <- jsonlite::fromJSON(tokens)
TRUE
}
app$get("/data", function(req, res) {
resp <- get_data()
if (resp$status_code == 401) {
if (try_refresh()) resp <- get_data()
}
res$
set_status(resp$status_code)$
set_type(resp$type)$
send(resp$content)
})
  app
}
#' Helper function to log in to a third party OAuth2.0 app without a
#' browser
#'
#' It works with [oauth2_resource_app()], and any third party app,
#' including the fake [oauth2_third_party_app()].
#'
#' See `test-oauth.R` in webfakes for an example.
#'
#' @param login_url The login URL of the third party app.
#' @return A named list with
#' * `login_response` The curl HTTP response object for the login
#' page.
#' * `token_response` The curl HTTP response object for submitting
#' the login page.
#'
#' @family OAuth2.0 functions
#' @export
oauth2_login <- function(login_url) {
login_resp <- curl::curl_fetch_memory(login_url)
html <- rawToChar(login_resp$content)
xml <- xml2::read_html(html)
form <- xml2::xml_find_first(xml, "//form")
  input <- xml2::xml_find_first(form, ".//input")
actn <- xml2::xml_attr(form, "action")
stnm <- xml2::xml_attr(input, "name")
stvl <- xml2::xml_attr(input, "value")
data <- charToRaw(paste0(
stnm, "=", stvl, "&",
"action=yes"
))
handle2 <- curl::new_handle()
curl::handle_setheaders(
handle2,
"content-type" = "application/x-www-form-urlencoded"
)
curl::handle_setopt(
handle2,
customrequest = "POST",
postfieldsize = length(data),
postfields = data
)
psurl <- parse_url(login_resp$url)
actn_url <- paste0(psurl$protocol, "://", psurl$host, actn)
token_resp <- curl::curl_fetch_memory(actn_url, handle = handle2)
list(
login_response = login_resp,
token_response = token_resp
)
}
#' Helper function to use httr's OAuth2.0 functions
#' non-interactively, e.g. in test cases
#'
#' To perform an automatic acknowledgement and log in for a
#' local OAuth2.0 app, run by httr, wrap the expression that
#' obtains the OAuth2.0 token in `oauth2_httr_login()`.
#'
#' In interactive sessions, `oauth2_httr_login()` overrides the
#' `browser` option, and when httr opens a browser page, it
#' calls [oauth2_login()] in a subprocess.
#'
#' In non-interactive sessions, httr does not open a browser page,
#' only messages the user to do it manually. `oauth2_httr_login()`
#' listens for these messages, and calls [oauth2_login()] in a
#' subprocess.
#'
#' @param expr Expression that calls [httr::oauth2.0_token()],
#' either directly, or indirectly.
#' @return The return value of `expr`.
#'
#' @seealso See `vignette("oauth", package = "webfakes")` for a case
#' study that uses this function.
#'
#' @export
#' @family OAuth2.0 functions
oauth2_httr_login <- function(expr) {
proc <- NULL
if (interactive()) {
local_options(browser = function(url) {
proc <<- callr::r_bg(
oauth2_login,
list(url),
package = "webfakes"
)
})
expr
} else {
withCallingHandlers(
expr,
message = function(msg) {
if (grepl("^Please point your browser to the following url:",
msg$message)) {
invokeRestart("muffleMessage")
}
if (grepl("^http", msg$message)) {
proc <<- callr::r_bg(
oauth2_login,
list(trimws(msg$message)),
package = "webfakes"
)
invokeRestart("muffleMessage")
}
}
)
}
}
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/oauth.R
|
#' @useDynLib webfakes, .registration = TRUE, .fixes = "c_"
NULL
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/package.R
|
#' Create a new regular expression to use in webfakes routes
#'
#' Note that webfakes uses PERL regular expressions.
#'
#' @details
#' As R does not have a data type or class for regular expressions,
#' you can use `new_regexp()` to mark a string as a regular expression,
#' when adding routes.
#'
#' @param x String scalar containing a regular expression.
#' @return String with class `webfakes_regexp`.
#'
#' @aliases webfakes_regexp
#' @seealso The 'Path specification' and 'Path parameters' chapters
#' of the manual of [new_app()].
#' @export
#' @examples
#' new_regexp("^/api/match/(?<pattern>.*)$")
new_regexp <- function(x) structure(x, class = "webfakes_regexp")
path_match <- function(method, path, handler) {
if (handler$method == "use") return(TRUE)
if ((! handler$method %in% c("all", method)) &&
!(handler$method == "get" && method == "head")) return(FALSE)
pattern_match(path, handler$path)
}
pattern_match <- function(path, patterns) {
# Make sure patterns is a list
if (inherits(patterns, "webfakes_regexp")) {
patterns <- list(patterns)
} else if (is.character(patterns)) {
patterns <- as.list(patterns)
}
for (p in patterns) {
if (!inherits(p, "webfakes_regexp") && grepl(":", p)) {
p <- path_to_regexp(p)
}
if (inherits(p, "webfakes_regexp")) {
m <- re_match(path, p)
if (m$match) return(list(params = as.list(m$groups)))
} else {
if (path == p) return(TRUE)
}
}
FALSE
}
path_to_regexp <- function(path) {
tokens <- strsplit(path, "/")[[1]]
keys <- grep("^:", tokens)
reg <- tokens
reg[keys] <- paste0("(?<", substring(tokens[keys], 2), ">[-A-Za-z0-9_]+)")
new_regexp(paste0("^", paste(reg, collapse = "/"), "$"))
}
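# Illustrative example (editorial sketch): `:name` path parameters are
# compiled into named capture groups, and matches are returned in
# `params`.
path_to_regexp("/user/:id")
# <webfakes_regexp> "^/user/(?<id>[-A-Za-z0-9_]+)$"
pattern_match("/user/42", "/user/:id")
# list(params = list(id = "42"))
pattern_match("/user/42", "/admin/:id")
# FALSE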
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/path.R
|
format_named_list <- function(name, data) {
c(paste0(name, ":"),
if (length(data)) paste0(" ", names(data), ": ", data)
)
}
format_path <- function(patterns) {
# Make sure patterns is a list
if (inherits(patterns, "webfakes_regexp")) {
patterns <- list(patterns)
} else if (is.character(patterns)) {
patterns <- as.list(patterns)
}
paste(collapse = ", ", vapply(patterns, format, character(1)))
}
#' @export
format.webfakes_app <- function(x, ...) {
header <- "<webfakes_app>"
data <- vapply(x$.stack, FUN.VALUE = character(1), function(x) {
name <- if (nchar(x$name %||% "")) paste0(" # ", x$name)
paste0(" ", x$method, " ", format_path(x$path), name)
})
methods <- c(
" all(path, ...) # add route for *all* HTTP methods",
" delete(path, ...) # add route for DELETE",
" engine(ext, engine) # add template engine for file extension",
" head(path, ...) # add route for HEAD",
" listen(port) # start web app on port",
" patch(path, ...) # add route for PATCH",
" post(path, ...) # add route for POST",
" put(path, ...) # add route for PUT",
" use(...) # add middleware",
" locals # app-wide shared data"
)
help <- "# see ?webfakes_app for all methods"
c(header, "routes:", data, "fields and methods:", methods, help)
}
#' @export
print.webfakes_app <- function(x, ...) {
cat(format(x, ...), sep = "\n")
invisible(x)
}
#' @export
format.webfakes_request <- function(x, ...) {
header <- "<webfakes_request>"
data <- c(
"method:",
paste0(" ", x$method),
"url:",
paste0(" ", x$url),
"client:",
paste0(" ", x$remote_addr),
format_named_list("query", x$query),
format_named_list("headers", x$headers)
)
methods <- c(
" app # the webfakes_app the request belongs to",
" headers # HTTP request headers",
" hostname # server hostname, the Host header",
" method # HTTP method of request (lowercase)",
" path # server path",
" protocol # http or https",
" query_string # raw query string without '?'",
" query # named list of query parameters",
" remote_addr # IP address of the client",
" url # full URL of the request",
" get_header(field) # get a request header"
)
help <- " # see ?webfakes_request for details"
c(header, data, "fields and methods:", methods, help)
}
#' @export
print.webfakes_request <- function(x, ...) {
cat(format(x, ...), sep = "\n")
invisible(x)
}
#' @export
format.webfakes_response <- function(x, ...) {
  header <- "<webfakes_response>"
methods <- c(
" app # the webfakes_app the response belongs to",
" locals # response-wide shared data",
" get_header(field) # query response header",
" on_response(fun) # call handler function for complete response",
" redirect(path, status) # send redirect response",
" render(view, locals) # render template",
" send(body) # send text or raw data",
" send_file(path, root) # send a file (automatic Content-Type)",
" send_json(object, text, ...)",
" # send JSON data",
" send_status(status) # send HTTP status and empty body",
" set_header(field, value)
# set a response header",
" set_status(status) # set response status code",
" set_type(type) # set Content-Type"
)
help <- " # see ?webfakes_response for details"
c(header, "fields and methods:", methods, help)
}
#' @export
print.webfakes_response <- function(x, ...) {
cat(format(x, ...), sep = "\n")
invisible(x)
}
#' @export
format.webfakes_regexp <- function(x, ...) {
paste0("<webfakes_regexp> ", encodeString(x, quote = "\""))
}
#' @export
print.webfakes_regexp <- function(x, ...) {
cat(format(x, ...), sep = "\n")
invisible(x)
}
#' @export
format.webfakes_app_process <- function(x, ...) {
header <- "<webfakes_app_process>"
state <- x$get_state()
data <- c(
"state:",
paste0(" ", state),
"auto_start:",
paste0(" ", x$.auto_start),
"process id:",
paste0(" ", if (state == "not running") "none" else x$.process$get_pid()),
"http url:",
paste0(" ", if (state == "live") x$url() else "NA")
)
methods <- c(
" get_app() # get the app object",
" get_port() # query port of the app",
" get_state() # query web server process state",
" local_env(envvars) # set temporary environment variables",
" start() # start the app",
" url(path, query) # query url for an api path",
" stop() # stop web server process"
)
help <- "# see ?webfakes_app_process for details"
c(header, data, "fields and methods:", methods, help)
}
#' @export
print.webfakes_app_process <- function(x, ...) {
cat(format(x, ...), sep = "\n")
invisible(x)
}
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/print.R
|
re_match <- function(text, pattern, perl = TRUE, ...) {
stopifnot(is.character(pattern), length(pattern) == 1, !is.na(pattern))
text <- as.character(text)
match <- regexpr(pattern, text, perl = perl, ...)
start <- as.vector(match)
length <- attr(match, "match.length")
end <- start + length - 1L
matchstr <- substring(text, start, end)
matchstr[ start == -1 ] <- NA_character_
empty <- data.frame(stringsAsFactors = FALSE, .text = text)[, numeric()]
res <- list(match = !is.na(matchstr), groups = empty)
if (!is.null(attr(match, "capture.start"))) {
gstart <- attr(match, "capture.start")
glength <- attr(match, "capture.length")
gend <- gstart + glength - 1L
groupstr <- substring(text, gstart, gend)
groupstr[ gstart == -1 ] <- NA_character_
dim(groupstr) <- dim(gstart)
res$groups <- cbind(groupstr, res$groups, stringsAsFactors = FALSE)
names(res$groups) <- attr(match, "capture.names")
}
res
}
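# Illustrative example (editorial sketch): `re_match()` reports whether
# the PERL regexp matched, and returns the named capture groups in a
# data frame.
example_match <- re_match("user-42", "^(?<name>[a-z]+)-(?<id>[0-9]+)$")
example_match$match        # TRUE
example_match$groups$id    # "42"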
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/rematch2.R
|
#' A webfakes request object
#'
#' webfakes creates a `webfakes_request` object for every incoming HTTP
#' request. This object is passed to every matched route and middleware,
#' until the response is sent. It has reference semantics, so handlers
#' can modify it.
#'
#' Fields and methods:
#'
#' * `app`: The `webfakes_app` object itself.
#' * `headers`: Named list of HTTP request headers.
#' * `hostname`: The Host header, the server hostname and maybe port.
#' * `method`: HTTP method.
#' * `path`: Server path.
#' * `protocol`: `"http"` or `"https"`.
#' * `query_string`: The raw query string, without the starting `?`.
#' * `query`: Parsed query parameters in a named list.
#' * `remote_addr`: String, the domain name or IP address of the client.
#' webfakes runs on the localhost, so this is `127.0.0.1`.
#' * `url`: The full URL of the request.
#' * `get_header(field)`: Function to query a request header. Returns
#' `NULL` if the header is not present.
#'
#' Body parsing middleware adds additional fields to the request object.
#' See [mw_raw()], [mw_text()], [mw_json()], [mw_multipart()] and
#' [mw_urlencoded()].
#'
#' @seealso [webfakes_response] for the webfakes response object.
#' @name webfakes_request
#' @examples
#' # This is how you can see the request and response objects:
#' app <- new_app()
#' app$get("/", function(req, res) {
#' browser()
#' res$send("done")
#' })
#' app
#'
#' # Now start this app on a port:
#' # app$listen(3000)
#' # and connect to it from a web browser: http://127.0.0.1:3000
#' # You can also use another R session to connect:
#' # httr::GET("http://127.0.0.1:3000")
#' # or the command line curl tool:
#' # curl -v http://127.0.0.1:3000
#' # The app will stop while processing the request.
NULL
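# Small sketch (editorial addition): typical use of the request fields
# and `get_header()` inside a handler. The route and parameter names are
# made up for illustration.
example_req_app <- new_app()
example_req_app$get("/hello", function(req, res) {
  who <- req$query$name %||% "world"
  ua <- req$get_header("User-Agent") %||% "unknown client"
  res$send(paste0("hello ", who, ", you are using ", ua))
})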
new_request <- function(app, self) {
if (isTRUE(self$.has_methods)) return(self)
parsed_headers <- self$headers
self$.has_methods <- TRUE
self$app <- app
self$headers <- parsed_headers
self$hostname <- parsed_headers$host
self$method <- tolower(self$method)
self$protocol <- "http"
  self$query <- parse_query(self$query_string)
self$get_header <- function(field) {
h <- self$headers
names(h) <- tolower(names(h))
h[[tolower(field)]]
}
rm(parsed_headers)
self$res <- new_response(app, self)
class(self) <- c("webfakes_request", class(self))
self
}
parse_query <- function(query) {
query <- sub("^[?]", "", query)
query <- chartr("+", " ", query)
argstr <- strsplit(query, "&", fixed = TRUE)[[1]]
argstr <- strsplit(argstr, "=", fixed = TRUE)
keys <- vapply(argstr, function(x) utils::URLdecode(x[[1]]), character(1))
vals <- lapply(argstr, function(x) {
if (length(x) == 2) utils::URLdecode(x[[2]]) else ""
})
structure(vals, names = keys)
}
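# Illustrative example (editorial sketch): query strings are split on
# `&`, keys and values are URL-decoded, and `+` becomes a space.
parse_query("name=Jane+Doe&lang=en&flag")
# list(name = "Jane Doe", lang = "en", flag = "")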
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/request.R
|
#' A webfakes response object
#'
#' webfakes creates a `webfakes_response` object for every incoming HTTP
#' request. This object is passed to every matched route and middleware,
#' until the HTTP response is sent. It has reference semantics, so handlers
#' can modify it.
#'
#' Fields and methods:
#'
#' * `app`: The `webfakes_app` object itself.
#' * `req`: The request object.
#' * `headers_sent`: Whether the response headers were already sent out.
#' * `locals`: Local variables that are shared between the handler
#' functions. This is for the end user, and not for the middlewares.
#' * `delay(secs)`: delay the response for a number of seconds. If a
#' handler calls `delay()`, the same handler will be called again,
#' after the specified number of seconds have passed. Use the `locals`
#' environment to distinguish between the calls. If you are using
#' `delay()`, and want to serve requests in parallel, then you probably
#' need a multi-threaded server, see [server_opts()].
#' * `add_header(field, value)`: Add a response header. Note that
#' `add_header()` may create duplicate headers. You usually want
#' `set_header()`.
#' * `get_header(field)`: Query the currently set response headers. If
#'   `field` is not present it returns `NULL`.
#' * `on_response(fun)`: Run the `fun` handler function just before the
#' response is sent out. At this point the headers and the body are
#' already properly set.
#' * `redirect(path, status = 302)`: Send a redirect response. It sets
#' the `Location` header, and also sends a `text/plain` body.
#' * `render(view, locals = list())`: Render a template page. Searches
#' for the `view` template page, using all registered engine extensions,
#' and calls the first matching template engine. Returns the filled
#' template.
#' * `send(body)`: Send the specified body. `body` can be a raw vector,
#' or HTML or other text. For raw vectors it sets the content type to
#' `application/octet-stream`.
#' * `send_json(object = NULL, text = NULL, ...)`: Send a JSON response.
#' Either `object` or `text` must be given. `object` will be converted
#' to JSON using [jsonlite::toJSON()]. `...` are passed to
#' [jsonlite::toJSON()]. It sets the content type appropriately.
#' * `send_file(path, root = ".")`: Send a file. Set `root = "/"` for
#' absolute file names. It sets the content type automatically, based
#' on the extension of the file, if it is not set already.
#' * `send_status(status)`: Send the specified HTTP status code, without
#' a response body.
#' * `send_chunk(data)`: Send a chunk of a response in chunked encoding.
#' The first chunk will automatically send the HTTP response headers.
#'   Webfakes will automatically send a final zero-length chunk, unless
#' `$delay()` is called.
#' * `set_header(field, value)`: Set a response header. If the headers have
#' been sent out already, then it throws a warning, and does nothing.
#' * `set_status(status)`: Set the response status code. If the headers
#' have been sent out already, then it throws a warning, and does nothing.
#' * `set_type(type)`: Set the response content type. If it contains a `/`
#' character then it is set as is, otherwise it is assumed to be a file
#' extension, and the corresponding MIME type is set. If the headers have
#' been sent out already, then it throws a warning, and does nothing.
#' * `add_cookie(name, value, options)`: Adds a cookie to the response.
#' `options` is a named list, and may contain:
#' * `domain`: Domain name for the cookie, not set by default.
#' * `expires`: Expiry date in GMT. It must be a POSIXct object, and
#' will be formatted correctly.
#' * `http_only`: if TRUE, then it sets the 'HttpOnly' attribute, so
#'   JavaScript cannot access the cookie.
#' * `max_age`: Maximum age, in number of seconds.
#' * `path`: Path for the cookie, defaults to "/".
#' * `same_site`: The 'SameSite' cookie attribute. Possible values are
#' "strict", "lax" and "none".
#' * `secure`: if TRUE, then it sets the 'Secure' attribute.
#' * `clear_cookie(name, options = list())`: clears a cookie. Typically,
#' web browsers will only clear a cookie if the options also match.
#' * `write(data)`: writes (part of) the body of the response. It also
#' sends out the response headers, if they haven't been sent out before.
#'
#' Usually you need one of the `send()` methods to send out the HTTP
#' response in one go: first the headers, then the body.
#'
#' Alternatively, you can use `$write()` to send the response in parts.
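#'
#' A minimal sketch of handlers that use some of these methods. (The
#' paths, header names and payloads below are made up for illustration.)
#'
#' ```r
#' app <- new_app()
#'
#' # JSON response with an extra header and a cookie
#' app$get("/info", function(req, res) {
#'   res$
#'     set_status(200L)$
#'     set_header("X-App", "demo")$
#'     add_cookie("session", "12345", list(http_only = TRUE))$
#'     send_json(object = list(status = "ok"), auto_unbox = TRUE)
#' })
#'
#' # Response sent in chunks
#' app$get("/stream", function(req, res) {
#'   res$send_chunk("first part")
#'   res$send_chunk("second part")
#' })
#'
#' # Delayed response: the same handler runs again after one second,
#' # and `res$locals` distinguishes the first call from the second
#' app$get("/slow", function(req, res) {
#'   if (is.null(res$locals$seen)) {
#'     res$locals$seen <- TRUE
#'     res$delay(1)
#'   } else {
#'     res$send("finally done")
#'   }
#' })
#' ```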
#'
#' @seealso [webfakes_request] for the webfakes request object.
#' @name webfakes_response
#' @examples
#' # This is how you can see the request and response objects:
#' app <- new_app()
#' app$get("/", function(req, res) {
#' browser()
#' res$send("done")
#' })
#' app
#'
#' # Now start this app on a port:
#' # app$listen(3000)
#' # and connect to it from a web browser: http://127.0.0.1:3000
#' # You can also use another R session to connect:
#' # httr::GET("http://127.0.0.1:3000")
#' # or the command line curl tool:
#' # curl -v http://127.0.0.1:3000
#' # The app will stop while processing the request.
NULL
new_response <- function(app, req) {
self <- new_object(
"webfakes_response",
app = app,
req = req,
locals = as.environment(as.list(app$locals)),
headers_sent = FALSE,
delay = function(secs) {
self$.stackptr <- self$.i
self$.delay <- secs
response_delay(self$req, secs)
invisible(NULL)
},
get_header = function(field) {
# this is case insensitive
h <- self$.headers
names(h) <- tolower(names(h))
h[[tolower(field)]]
},
on_response = function(fun) {
self$.on_response <- c(self$.on_response, list(fun))
invisible(self)
},
redirect = function(path, status = 302) {
if (self$.check_sent()) return(invisible(self))
self$
set_status(status)$
set_header("Location", path)$
set_type("text/plain")$
send(paste0(
status, " ", http_statuses[as.character(status)],
". Redirecting to ", path
))
invisible(self)
},
render = function(view, locals = list()) {
locals <- as.environment(as.list(locals))
parent.env(locals) <- self$locals
root <- self$app$get_config("views")
for (eng in self$app$.engines) {
f <- file.path(root, paste0(view, ".", eng$ext))
if (file.exists(f)) return(eng$engine(f, locals))
}
stop("Cannot find template engine for view '", view, "'")
},
send = function(body) {
if (self$.check_sent()) return(invisible(self))
# We need to do these here, on_response middleware might depend on it
self$.body <- body
self$.set_defaults()
for (fn in self$.on_response) fn(self$req, self)
response_send(self$req)
self$headers_sent <- TRUE
self$.sent <- TRUE
invisible(self)
},
send_chunk = function(data) {
if (self$.check_sent()) return(invisible(self))
# The first chunk sends the headers automatically, but we make
# sure to set chunked encoding
if (! self$headers_sent) {
self$set_header("Transfer-Encoding", "chunked")
if (is.null(self$get_header("Content-Type"))) {
self$set_header("Content-Type", "application/octet-stream")
}
if (is.null(self$.status)) self$set_status(200L)
self$.set_defaults()
}
enc <- self$get_header("Transfer-Encoding")
if (!identical(enc, "chunked")) {
warning("Headers sent, cannot set chunked encoding now")
return(invisible(self))
}
if (is.character(data)) data <- charToRaw(paste(data, collapse = "\n"))
response_send_chunk(self$req, data)
self$headers_sent <- TRUE
invisible(self)
},
send_json = function(object = NULL, text = NULL, ...) {
if (!is.null(object) && !is.null(text)) {
stop("Specify only one of `object` and `text` in `send_json()`")
}
if (is.null(text)) {
text <- jsonlite::toJSON(object, ...)
}
self$
set_header("Content-Type", "application/json")$
send(text)
},
send_file = function(path, root = ".") {
# Set content type automatically
if (is.null(self$get_header("Content-Type"))) {
ext <- tools::file_ext(basename(path))
ct <- mime_find(ext)
if (!is.na(ct)) {
self$set_header("Content-Type", ct)
}
}
if (root == "/" && .Platform$OS.type == "windows" &&
grepl("^[a-zA-Z]:", path)) {
abs_path <- path
} else {
abs_path <- file.path(root, path)
}
self$send(read_bin(normalizePath(abs_path)))
},
send_status = function(status) {
self$
set_status(status)$
send("")
},
set_header = function(field, value) {
if (self$.check_sent()) return(invisible(self))
self$.headers[[field]] <- as.character(value)
invisible(self)
},
add_header = function(field, value) {
if (self$.check_sent()) return(invisible(self))
h <- structure(list(value), names = field)
self$.headers <- append(self$.headers, h)
invisible(self)
},
set_status = function(status) {
if (self$.check_sent()) return(invisible(self))
self$.status <- as.integer(status)
invisible(self)
},
set_type = function(type) {
if (self$.check_sent()) return(invisible(self))
if (grepl("/", type)) {
self$set_header("Content-Type", type)
} else {
ct <- mime_find(type)
if (!is.na(ct)) {
self$set_header("Content-Type", ct)
}
}
invisible(self)
},
add_cookie = function(name, value, options = list()) {
if (!is_string(name)) {
stop("Cookie name must be a string.")
}
if (grepl("[=;]", name)) {
stop("Cookie name cannot contain ';' and '=' characters.")
}
if (!is_string(value)) {
stop("Cookie value must be a string.")
}
if (grepl("[=;]", value)) {
stop("Cookie value cannot contain ';' and '=' characters.")
}
ck <- paste0(
name, "=", value,
"; ",
format_cookie_options(options)
)
self$add_header("Set-Cookie", ck)
invisible(self)
},
clear_cookie = function(name, options = list()) {
if (!is_string(name)) {
stop("Cookie name must be a string.")
}
if (grepl("[=;]", name)) {
stop("Cookie name cannot contain ';' and '=' characters.")
}
options$expires <- .POSIXct(0)
options$max_age <- 0L
ck <- paste0(name, "=; ", format_cookie_options(options))
self$add_header("Set-Cookie", ck)
invisible(self)
},
write = function(data) {
if (is.null(self$get_header("content-length"))) {
warning("response$write() without a Content-Length header")
}
if (is.null(self$.status)) self$set_status(200L)
if (is.character(data)) data <- charToRaw(paste(data, collapse = "\n"))
response_write(self$req, data)
self$headers_sent <- TRUE
invisible(self)
},
.check_sent = function() {
if (isTRUE(self$.sent)) {
warning("Response is sent already")
}
self$.sent
},
.set_defaults = function() {
if (is.null(self$.status)) {
if (is.null(self$.body)) {
# No status, no body, that's 404
self$.status <- 404L
self$.body <- "Not found"
} else {
# No status, but body, set status
self$.status <- 200L
}
}
# Set Content-Type if not set
if (is.null(self$get_header("Content-Type"))) {
if (is.raw(self$.body)) {
ct <- "application/octet-stream"
} else {
ct <- "text/plain"
}
self$set_header("Content-Type", ct)
}
# Set Content-Length if not set
if (is.null(self$get_header("Content-Length")) &&
(self$get_header("Transfer-Encoding") %||% "") != "chunked") {
if (is.raw(self$.body)) {
cl <- length(self$.body)
} else if (is.character(self$.body)) {
cl <- nchar(self$.body, type = "bytes")
} else if (is.null(self$.body)) {
cl <- 0L
}
self$set_header("Content-Length", cl)
}
# Make sure response to HEAD has empty body
if (self$req$method == "head") {
self$.body <- raw(0)
self$set_header("Content-Length", "0")
}
},
.body = NULL,
.status = NULL,
.headers = if (!app$.enable_keep_alive) list("Connection" = "close") else list(),
.on_response = NULL,
.sent = FALSE,
.stackptr = 1L
)
self
}
format_cookie_options <- function(options) {
options$path <- options$path %||% "/"
bad <- unique(setdiff(
names(options),
c("domain", "expires", "http_only", "max_age", "path", "same_site",
"secure")
))
if (length(bad)) {
stop(
"Unknown or unsupported cookie attribute(s): ",
paste0("\"", bad, "\"", collapse = ", "),
"."
)
}
parts <- c(
if (!is.null(options$domain)) {
paste0("Domain=", options$domain)
},
if (!is.null(options$expires)) {
if (!inherits(options$expires, "POSIXct")) {
stop("The 'expires' cookie attribute must be a POSIXct object")
}
paste0("Expires=", http_time_stamp(options$expires))
},
if (isTRUE(options$http_only)) {
"HttpOnly"
},
if (!is.null(options$max_age)) {
paste0("Max-Age=", options$max_age)
},
paste0("Path=", options$path),
if (!is.null(options$same_site)) {
if (!tolower(options$same_site) %in% c("strict", "lax", "none")) {
stop(
"Invalid value for 'SameSite' cookie attribute: ",
options$same_site,
", must be \"strict\", \"lax\" or \"none\"."
)
}
paste0("SameSite=", capitalize(options$same_site))
},
if (isTRUE(options$secure)) {
"Secure"
}
)
paste(parts, collapse = "; ")
}
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/response.R
|
server_start <- function(opts = server_opts()) {
ports <- paste0(opts$interfaces, ":", opts$port %||% "0", collapse = ",")
if (!is.na(opts$access_log_file)) {
mkdirp(dirname(opts$access_log_file))
file.create(opts$access_log_file)
}
if (!is.na(opts$error_log_file)) {
mkdirp(dirname(opts$error_log_file))
file.create(opts$error_log_file)
}
throttle <- paste0(
"*=",
if (opts$throttle == Inf) "0" else opts$throttle
)
options <- c(
"listening_ports" = ports,
"num_threads" = opts$num_threads,
"enable_keep_alive" = c("no", "yes")[[opts$enable_keep_alive + 1]],
"access_log_file" = opts$access_log_file %|NA|% "",
"error_log_file" = opts$error_log_file %|NA|% "",
"tcp_nodelay" = c("0", "1")[[opts$tcp_nodelay + 1]],
"throttle" = throttle,
# These are not configurable currently
"request_timeout_ms" = "100000",
"enable_auth_domain_check" = "no"
)
srv <- call_with_cleanup(c_server_start, options)
attr(srv, "options") <- opts
srv
}
#' Webfakes web server options
#'
#' @param remote Meta-option. If set to `TRUE`, webfakes uses slightly
#' different defaults, that are more appropriate for a background
#' server process.
#' @param port Port to start the web server on. Defaults to a randomly
#' chosen port.
#' @param num_threads Number of request handler threads to use. Typically
#' you don't need more than one thread, unless you run test cases in
#' parallel or you make concurrent HTTP requests.
#' @param interfaces The network interfaces to listen on. Being a test
#' web server, it defaults to the localhost. Only bind to a public
#' interface if you know what you are doing. webfakes was not designed
#' to serve public web pages.
#' @param enable_keep_alive Whether the server keeps connections alive.
#' @param access_log_file `TRUE`, `FALSE`, or a path. See 'Logging'
#' below.
#' @param error_log_file `TRUE`, `FALSE`, or a path. See 'Logging'
#' below.
#' @param tcp_nodelay if `TRUE` then packets will be sent as soon as
#' possible, instead of waiting for a full buffer or timeout to occur.
#' @param throttle Limit download speed for clients. If not `Inf`,
#' then it is the maximum number of bytes per second that is sent on
#' a connection.
#' @return List of options that can be passed to `webfakes_app$listen()`
#' (see [new_app()]), and [new_app_process()].
#'
#' @section Logging:
#'
#' * For `access_log_file`, `TRUE` means `<log-dir>/access.log`.
#' * For `error_log_file`, `TRUE` means `<log-dir>/error.log`.
#'
#' `<log-dir>` is set to the contents of the `WEBFAKES_LOG_DIR`
#' environment variable, if it is set. Otherwise it is set to
#' `<tmpdir>/webfakes` for local apps and `<tmpdir>/<pid>/webfakes` for
#' remote apps (started with `new_app_process()`).
#'
#' `<tmpdir>` is the session temporary directory of the _main process_.
#'
#' `<pid>` is the process id of the subprocess.
#'
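#' For example, a minimal sketch (log file locations chosen only for
#' illustration) that writes both logs to a fixed directory:
#'
#' ```r
#' opts <- server_opts(
#'   access_log_file = "/tmp/webfakes-logs/access.log",
#'   error_log_file = "/tmp/webfakes-logs/error.log"
#' )
#' ```
#'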
#' @export
#' @examples
#' # See the defaults
#' server_opts()
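#'
#' # A sketch of passing custom options when starting an app (not run);
#' # the `opts` argument name of `$listen()` is an assumption here.
#' # app <- new_app()
#' # app$listen(opts = server_opts(num_threads = 4, throttle = 100000))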
server_opts <- function(remote = FALSE, port = NULL, num_threads = 1,
interfaces = "127.0.0.1",
enable_keep_alive = FALSE,
access_log_file = remote,
error_log_file = TRUE,
tcp_nodelay = FALSE,
throttle = Inf) {
log_dir <- Sys.getenv("WEBFAKES_LOG_DIR", file.path(tempdir(), "webfakes"))
if (isTRUE(access_log_file)) {
if (remote) {
access_log_file <- file.path(log_dir, "%p", "access.log")
} else {
access_log_file <- file.path(log_dir, "access.log")
}
} else if (isFALSE(access_log_file)) {
access_log_file <- NA_character_
}
if (isTRUE(error_log_file)) {
if (remote) {
error_log_file <- file.path(log_dir, "%p", "error.log")
} else {
error_log_file <- file.path(log_dir, "error.log")
}
} else if (isFALSE(error_log_file)) {
error_log_file <- NA_character_
}
rm(log_dir)
as.list(environment())
}
server_get_ports <- function(srv) {
as.data.frame(call_with_cleanup(c_server_get_ports, srv))
}
server_stop <- function(srv) {
invisible(call_with_cleanup(c_server_stop, srv))
}
server_poll <- function(srv, cleanup = TRUE) {
while (TRUE) {
tryCatch(
return(call_with_cleanup(c_server_poll, srv, cleanup)),
error = function(err) {
cat(as.character(err), file = stderr())
stop(new_webfakes_error())
}
)
}
}
response_send <- function(req) {
tryCatch(
call_with_cleanup(c_response_send, req),
error = function(err) {
cat(as.character(err), file = stderr())
stop(new_webfakes_error())
}
)
invisible(NULL)
}
response_send_chunk <- function(req, data) {
tryCatch(
call_with_cleanup(c_response_send_chunk, req, data),
error = function(err) {
cat(as.character(err), file = stderr())
stop(new_webfakes_error())
}
)
invisible(NULL)
}
response_send_error <- function(req, msg, status) {
tryCatch(
call_with_cleanup(c_response_send_error, req, msg, status),
error = function(err) {
cat(as.character(err), file = stderr())
stop(new_webfakes_error())
}
)
invisible(NULL)
}
response_delay <- function(req, secs) {
tryCatch(
call_with_cleanup(c_response_delay, req, secs),
error = function(err) {
cat(as.character(err), file = stderr())
stop(new_webfakes_error())
}
)
invisible(NULL)
}
response_write <- function(req, data) {
tryCatch(
call_with_cleanup(c_response_write, req, data),
error = function(err) {
cat(as.character(err), file = stderr())
stop(new_webfakes_error())
}
)
invisible(NULL)
}
new_webfakes_error <- function(...) {
structure(
list(message = "error in webfakes connection", ...),
class = c("webfakes_error", "error", "condition")
)
}
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/server.R
|
http_statuses <- c(
"100" = "Continue",
"101" = "Switching Protocols",
"102" = "Processing (WebDAV; RFC 2518)",
"200" = "OK",
"201" = "Created",
"202" = "Accepted",
"203" = "Non-Authoritative Information",
"204" = "No Content",
"205" = "Reset Content",
"206" = "Partial Content",
"207" = "Multi-Status (WebDAV; RFC 4918)",
"208" = "Already Reported (WebDAV; RFC 5842)",
"226" = "IM Used (RFC 3229)",
"300" = "Multiple Choices",
"301" = "Moved Permanently",
"302" = "Found",
"303" = "See Other",
"304" = "Not Modified",
"305" = "Use Proxy",
"306" = "Switch Proxy",
"307" = "Temporary Redirect",
"308" = "Permanent Redirect (experimental Internet-Draft)",
"400" = "Bad Request",
"401" = "Unauthorized",
"402" = "Payment Required",
"403" = "Forbidden",
"404" = "Not Found",
"405" = "Method Not Allowed",
"406" = "Not Acceptable",
"407" = "Proxy Authentication Required",
"408" = "Request Timeout",
"409" = "Conflict",
"410" = "Gone",
"411" = "Length Required",
"412" = "Precondition Failed",
"413" = "Request Entity Too Large",
"414" = "Request-URI Too Long",
"415" = "Unsupported Media Type",
"416" = "Requested Range Not Satisfiable",
"417" = "Expectation Failed",
"418" = "I'm a teapot (RFC 2324)",
"420" = "Enhance Your Calm (Twitter)",
"422" = "Unprocessable Entity (WebDAV; RFC 4918)",
"423" = "Locked (WebDAV; RFC 4918)",
"424" = "Failed Dependency (WebDAV; RFC 4918)",
"424" = "Method Failure (WebDAV)",
"425" = "Unordered Collection (Internet draft)",
"426" = "Upgrade Required (RFC 2817)",
"428" = "Precondition Required (RFC 6585)",
"429" = "Too Many Requests (RFC 6585)",
"431" = "Request Header Fields Too Large (RFC 6585)",
"444" = "No Response (Nginx)",
"449" = "Retry With (Microsoft)",
"450" = "Blocked by Windows Parental Controls (Microsoft)",
"451" = "Unavailable For Legal Reasons (Internet draft)",
"499" = "Client Closed Request (Nginx)",
"500" = "Internal Server Error",
"501" = "Not Implemented",
"502" = "Bad Gateway",
"503" = "Service Unavailable",
"504" = "Gateway Timeout",
"505" = "HTTP Version Not Supported",
"506" = "Variant Also Negotiates (RFC 2295)",
"507" = "Insufficient Storage (WebDAV; RFC 4918)",
"508" = "Loop Detected (WebDAV; RFC 5842)",
"509" = "Bandwidth Limit Exceeded (Apache bw/limited extension)",
"510" = "Not Extended (RFC 2774)",
"511" = "Network Authentication Required (RFC 6585)",
"598" = "Network read timeout error (Unknown)",
"599" = "Network connect timeout error (Unknown)"
)
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/status.R
|
#' glue based template engine
#'
#' Use this template engine to create pages with glue templates.
#' See [glue::glue()] for the syntax.
#'
#' @param sep Separator used to separate elements.
#' @param open The opening delimiter. Doubling the full delimiter escapes
#' it.
#' @param close The closing delimiter. Doubling the full delimiter escapes
#' it.
#' @param na Value to replace NA values with. If `NULL` missing values are
#' propagated, that is an `NA` result will cause `NA` output.
#' Otherwise the value is replaced by the value of `na`.
#' @param transformer A function taking three parameters `code`, `envir`
#' and `data` used to transform the output of each block before during or
#' after evaluation.
#' @param trim Whether to trim the input template with [glue::trim()] or not.
#' @return Template function.
#'
#' @export
#' @examples
#' # See the 'hello' app at
#' hello_root <- system.file(package = "webfakes", "examples", "hello")
#' hello_root
#'
#' app <- new_app()
#' app$engine("txt", tmpl_glue())
#' app$use(mw_log())
#'
#'
#' app$get("/view", function(req, res) {
#' txt <- res$render("test")
#' res$
#' set_type("text/plain")$
#' send(txt)
#' })
#'
#' # Switch to the app's root: setwd(hello_root)
#' # Now start the app with: app$listen(3000L)
#' # Or start it in another process: new_app_process(app)
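#'
#' # A hedged sketch of a glue template: if `views/test.txt` contained the
#' # single line
#' # Hello {name}!
#' # then res$render("test", locals = list(name = "world")) would return
#' # "Hello world!" once this engine is registered for the "txt" extension.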
tmpl_glue <- function(sep = "", open = "{", close = "}", na = "NA",
transformer = NULL, trim = TRUE) {
sep; open; close; na; transformer; trim
function(path, locals) {
txt <- readChar(path, nchars = file.size(path))
glue::glue_data(
locals, txt, .sep = sep, .open = open, .close = close, .na = na,
.transformer = transformer %||% glue::identity_transformer,
.trim = trim
)
}
}
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/tmpl-glue.R
|
`%||%` <- function(l, r) if (is.null(l)) r else l
`%|NA|%` <- function(l, r) if (is_na_scalar(l)) r else l
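# Usage (for illustration): NULL %||% "x" returns "x"; NA %|NA|% "x" returns "x".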
new_object <- function(class_name, ...) {
structure(as.environment(list(...)), class = class_name)
}
is_string <- function(x) {
is.character(x) && length(x) == 1 && !is.na(x)
}
is_integerish <- function(x) {
is.integer(x) || (is.numeric(x) && all(round(x) == x))
}
is_count <- function(x) {
is_integerish(x) && length(x) == 1 && !is.na(x)
}
is_port <- function(x) {
is_count(x)
}
is_na_scalar <- function(x) {
is.atomic(x) && length(x) == 1 && is.na(x)
}
cat0 <- function(..., sep = "") {
cat(..., sep = sep, append = TRUE)
}
str_trim <- function(x) {
sub("\\s+$", "", sub("^\\s+", "", x))
}
unquote <- function(str) {
len <- nchar(str)
if (substr(str, 1, 1) == '"' && substr(str, len, len) == '"') {
substr(str, 2, len - 1)
} else {
str
}
}
isFALSE <- function(x) {
identical(x, FALSE)
}
str_is_suffix <- function(x, sfx) {
lx <- nchar(x)
lsfx <- nchar(sfx)
substring(x, lx - lsfx + 1, lx) == sfx
}
read_char <- function(path, encoding = "UTF-8") {
txt <- rawToChar(readBin(path, "raw", file.info(path)$size))
Encoding(txt) <- encoding
txt
}
read_bin <- function(path) {
readBin(path, "raw", file.info(path)$size)
}
try_silently <- function(expr) {
try(expr, silent = TRUE)
}
sseq <- function(from, to) {
if (from > to) integer() else seq(from, to)
}
is.named <- function(x) {
length(names(x)) == length(x) && all(names(x) != "")
}
set_envvar <- function(envs) {
if (length(envs) == 0) return()
stopifnot(is.named(envs))
old <- Sys.getenv(names(envs), names = TRUE, unset = NA)
set <- !is.na(envs)
both_set <- set & !is.na(old)
if (any(set)) do.call("Sys.setenv", as.list(envs[set]))
if (any(!set)) Sys.unsetenv(names(envs)[!set])
invisible(old)
}
mkdirp <- function(path, ...) {
dir.create(path, showWarnings = FALSE, recursive = TRUE, ...)
}
strrep <- function(x, no) {
paste(rep(x, no), collapse = "")
}
set_name <- function(x, nm) {
names(x) <- nm
x
}
random_id <- generate_token <- function(n = 30) {
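  # Returns a pseudo-random lowercase hex string, e.g. "3fa09bc1" for n = 8
  # (illustrative output only).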
paste0(sample(c(0:9, letters[1:6]), n, replace = TRUE), collapse = "")
}
parse_url <- function(url) {
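  # Splits a URL into protocol, username, password, host and path, e.g.
  # "https://user:pass@example.com/x" gives protocol "https", username "user",
  # password "pass", host "example.com", path "/x" (illustrative).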
re_url <- paste0(
"^(?<protocol>[a-zA-Z0-9]+)://",
"(?:(?<username>[^@/:]+)(?::(?<password>[^@/]+))?@)?",
"(?<host>[^/]+)",
"(?<path>.*)$" # don't worry about query params here...
)
re_match(url, re_url)$groups
}
modify_path <- function(url, path) {
purl <- parse_url(url)
has_usr <- nzchar(purl$username)
has_pwd <- nzchar(purl$password)
paste0(
purl$protocol,
"://",
purl$username,
if (has_pwd) ":",
purl$password,
if (has_usr) "@",
purl$host,
if (substr(path, 1, 1) != "/") "/",
path
)
}
has_seed <- function() {
exists(".Random.seed", globalenv(), mode = "integer", inherits = FALSE)
}
get_seed <- function() {
if (has_seed()) {
get(".Random.seed", globalenv(), mode = "integer", inherits = FALSE)
}
}
set_seed <- function(seed) {
if (is.null(seed)) {
rm(".Random.seed", envir = globalenv())
} else {
assign(".Random.seed", seed, globalenv())
}
}
local_options <- function(.new = list(), ...,
.local_envir = parent.frame()) {
.new <- utils::modifyList(as.list(.new), list(...))
old <- do.call(options, .new)
defer(do.call(options, old), .local_envir)
}
map_chr <- function(X, FUN, ...) {
vapply(X, FUN, FUN.VALUE = character(1), ...)
}
#' Format a time stamp for HTTP
#'
#' @param t Date-time value to format, defaults to the current date and
#' time. It must be a POSIXct object.
#' @return Character vector, formatted date-time.
#' @export
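#' @examples
#' # Format the current time, and a fixed time, as HTTP dates
#' http_time_stamp()
#' http_time_stamp(as.POSIXct("2020-01-01 12:00:00", tz = "UTC"))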
http_time_stamp <- function(t = Sys.time()) {
t <- as.POSIXlt(t, tz = "UTC")
strftime(t, "%a, %d %b %Y %H:%M:%S GMT")
}
capitalize <- function(x) {
substr(x, 1, 1) <- toupper(substr(x, 1, 1))
x
}
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/utils.R
|
uuid_random <- function() {
# These uuids are pseudo-random, they are not secure!
# xxxxxxxx-xxxx-Mxxx-Nxxx-xxxxxxxxxxxx (8-4-4-4-12)
# The 4 bits of digit M are the UUID version,
# and the 1 to 3 most significant bits of digit N code the UUID variant.
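# An example of the resulting shape (illustrative only):
# "1b9d6bcd-4bbd-4b2d-9b5d-ab8dfbbd4bed"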
digits <- floor(stats::runif(32, 0, 16))
digits[13] <- 0x4
digits[17] <- bitwOr(bitwAnd(digits[17], 0x3), 0x8)
x <- format(as.hexmode(digits))
paste(
collapse = "",
c(x[1:8], "-", x[9:12], "-", x[13:16], "-", x[17:20], "-", x[21:32])
)
}
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/uuid.R
|
#' @keywords internal
"_PACKAGE"
## usethis namespace: start
## usethis namespace: end
NULL
|
/scratch/gouwar.j/cran-all/cranData/webfakes/R/webfakes-package.R
|
library(webfakes)
app <- new_app()
app$engine("txt", tmpl_glue())
app$use(mw_log())
app$get("/hello", function(req, res) {
res$send("Hello there!")
})
app$get(new_regexp("^/hi(/.*)?$"), function(req, res) {
res$send("Hi indeed!")
})
app$post("/hello", function(req, res) {
res$send("Got it, thanks!")
})
app$get("/view", function(req, res) {
txt <- res$render("test")
res$
set_type("text/plain")$
send(txt)
})
app$listen(3000L)
|
/scratch/gouwar.j/cran-all/cranData/webfakes/inst/examples/hello/app.R
|
library(webfakes)
app <- httpbin_app()
app$listen(as.integer(Sys.getenv("PORT", NA_character_)))
|
/scratch/gouwar.j/cran-all/cranData/webfakes/inst/examples/httpbin/app.R
|
library(webfakes)
app <- new_app()
app$use(mw_log())
app$use(mw_etag())
app$get("/logo", function(req, res) {
res$send_file("Rlogo.png")
})
app$listen(3000L)
|
/scratch/gouwar.j/cran-all/cranData/webfakes/inst/examples/send-file/app.R
|
library(webfakes)
app <- new_app()
app$use(mw_log())
app$use(mw_static("public"))
app$listen(as.integer(Sys.getenv("PORT", NA_character_)))
|
/scratch/gouwar.j/cran-all/cranData/webfakes/inst/examples/static/app.R
|
The webfakes package comes with two fake apps that
allow you to imitate the OAuth2.0 flow in your test cases.
(See [Aaron Parecki's tutorial](https://aaronparecki.com/oauth-2-simplified/) for a good introduction to OAuth2.0.)
One app (`oauth2_resource_app()`) is the API server that acts both as the resource server and as the authorization server.
`oauth2_third_party_app()` plays the role of the third-party app.
They are useful when testing or demonstrating code handling OAuth2.0 authorization, token caching, etc. in a package.
The apps can be used in your tests directly,
or you could adapt one or both of them to better mimic a particular OAuth2.0 flow.
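
A minimal sketch (object names are illustrative only) of running both apps from a test:

```r
library(webfakes)

# Start the fake authorization/resource server and the fake third-party
# app in background processes.
resource <- new_app_process(oauth2_resource_app())
third_party <- new_app_process(oauth2_third_party_app())

resource$url()      # base URL of the fake API server
third_party$url()   # base URL of the fake third-party app

# Stop them when they are no longer needed.
resource$stop()
third_party$stop()
```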
|
/scratch/gouwar.j/cran-all/cranData/webfakes/man/rmd-fragments/oauth2.Rmd
|
#' @importFrom geojsonio geojson_json
#' @importFrom httpuv stopDaemonizedServer startDaemonizedServer
#' @importFrom jsonlite toJSON fromJSON unbox
#' @importFrom stats runif
#' @importFrom utils browseURL
content_types <- list(
js = "application/javascript",
html = "text/html",
css = "text/css",
png = "image/png",
jpg = "image/jpeg",
json = "application/json"
)
wg404 <- list(
status = 404L,
headers = list("Content-Type" = "text/plain"),
body = "Not found"
)
wg403 <- list(
status = 403L,
headers = list("Content-Type" = "text/plain"),
body = "Forbidden"
)
wg400 <- list(
status = 400L,
headers = list("Content-Type" = "text/plain"),
body = "Bad request"
)
wgstopbyenv <- function(wgenv){
httpuv::stopDaemonizedServer(wgenv[['server']])
}
wgsend <- function(wg,msg){
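  # In immediate mode send the command to the browser right away; otherwise
  # queue it so it can be replayed when the globe is next displayed.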
if(wg$env[['immediate']]){
wgsendsend(wg,msg)
} else {
wg$env[['msgs']] <- c(wg$env[['msgs']],msg)
}
}
wgsendsend <- function(wg,msg){
tryCatch({
wg$env[['ws']]$send(msg)
}, error=function(e){
print('Error: Could not send to client. Make sure you have a browser open!')
})
}
#' @name print.webglobe
#'
#' @title Display a webglobe
#'
#' @description
#' Displays a webglobe. If the webglobe is immediate, then a browser
#' window containing it should already be open; in this case, the
#' webglobe's address is returned. If the webglobe is not immediate
#' then a new browser window is opened and the cached pipeline is sent to it.
#'
#' @param x The webglobe
#' @param ... Ignored
#'
#' @return NA
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe()
#' wg
#' }
#'
#' @export
print.webglobe <- function(x, ...){
if(x$env[['immediate']])
print(paste0("Server should be running at 'http://localhost:",x$env[['port']],"'"))
else
utils::browseURL(paste0('http://localhost:',x$env[['port']]))
NA
}
#' @name is.webglobe
#'
#' @title Is it a webglobe?
#'
#' @description
#' Determine if an object is a webglobe
#'
#' @param x The object that might be a webglobe
#'
#' @return TRUE or FALSE
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe(immediate=TRUE)
#' is.webglobe(wg)
#' }
#'
#' @export
is.webglobe <- function(x) inherits(x, "webglobe")
#' @name +.webglobe
#'
#' @title Send command
#'
#' @description
#' Send a command to a webglobe
#'
#' @param wg Webglobe
#' @param x Command to send
#'
#' @return The same webglobe
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe(immediate=TRUE)
#' wg + wgclear()
#' }
#'
#' @export
`+.webglobe` <- function(wg,x){
if(x=="immediate")
wg$env[['immediate']] <- TRUE
else if (x=="unimmediate")
wg$env[['immediate']] <- FALSE
else
wgsend(wg,x)
wg
}
#' @name webglobe
#'
#' @title Make a new webglobe
#'
#' @description
#' Constructs a new webglobe and starts its server
#'
#' @param immediate
#' Whether the webglobe should immediately show the results of
#' graphics commands or additively cache them. `immediate` mode can
#' be used to experimentally build up a pipeline. Once established
#' this can be stored in a non-immediate webglobe for easy acces
#' later
#'
#' @return A webglobe object
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe(immediate=TRUE)
#' }
#'
#' @export
webglobe <- function(immediate=FALSE){
the_env <- new.env(parent=emptyenv())
app <- list(
call = function(req) {
if (!identical(req$REQUEST_METHOD, 'GET'))
return(NULL)
path <- req$PATH_INFO
if (is.null(path))
return(wg400)
if (path == '/')
path <- '/index.html'
path <- file.path(path.package('webglobe'),'client',path)
ctype <- content_types[[tools::file_ext(path)]]
if(is.null(ctype))
return(wg403)
if(!file.exists(path))
return(wg404)
fcontents <- readBin(path, 'raw', n=file.info(path)$size)
return(list(status=200L, headers=list('Content-Type'=ctype), body=fcontents))
},
onWSOpen = function(ws) {
the_env[['ws']]<-ws
ws$onMessage(function(binary, message) {
if(message!="sally_forth")
return(NULL)
for(m in the_env[['msgs']])
ws$send(m)
})
},
env = the_env
)
app$env[['msgs']] <- c()
app$env[['immediate']] <- immediate
startServer <- function(depth){
tryCatch({
app$env[['port']] <- floor(stats::runif(1,min=4000,max=8000))
app$env[['server']] <- httpuv::startDaemonizedServer("0.0.0.0", app$env[['port']], app)
TRUE
}, error = function(e) {
if(depth==100){
return(FALSE)
} else {
return(startServer(depth+1))
}
})
}
if(!startServer(0)){
stop('Could not start the server - probably no port was available.')
}
class(app) <- "webglobe"
reg.finalizer(app$env, wgstopbyenv, onexit = TRUE)
if(immediate)
utils::browseURL(paste0('http://localhost:',app$env[['port']]))
return(app)
}
#' @name wgport
#'
#' @title Get webglobe's port
#'
#' @description
#' Determine which port a webglobe is running on
#'
#' @param wg Webglobe whose port should be determined
#'
#' @return A number representing the webglobe's port
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe(immediate=TRUE)
#' wgport(wg)
#' }
#'
#' @export
wgport <- function(wg){
wg$env[['port']]
}
#' @name wgpoints
#'
#' @title Plot points
#'
#' @description
#' Plots latitude-longitude points
#'
#' @param lat One or more latitude values
#' @param lon One or more longitude values
#' @param label Label to put next to point
#' @param alt Altitude of the points, can be single value or vector
#' @param colour Colour name of the points, can be single value or vector
#' @param size Size of the points, can be single value or vector
#'
#' @return A webglobe command
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg <- webglobe(immediate=FALSE)
#' wg <- wg + wgpoints(c(45,20),c(-93,127),alt=3,colour=c("blue","red"))
#' wg <- wg + wgpoints(51.5074,-0.1278,label="London",alt=0,colour="blue")
#' wg
#' }
#'
#' @export
wgpoints <- function(lat,lon,label=NA,alt=0,colour="yellow",size=10){
if(length(lat)!=length(lon))
stop('Same number of latitude and longitude points are required!')
if(length(alt)!=1 && length(alt)!=length(lat))
stop('One altitude must be specified, or a number equal to that of latitude/longitude!')
if(length(alt)==1)
alt <- rep(alt, length(lat))
if(length(colour)==1)
colour <- rep(colour, length(lat))
if(length(size)==1)
size <- rep(size, length(lat))
if(length(label)==1)
label <- rep(label, length(lat))
toString(jsonlite::toJSON(list(
command = jsonlite::unbox("points"),
lat = lat,
lon = lon,
alt = alt,
colour = colour,
size = size,
label = label
)))
}
#' @name wgcamhome
#'
#' @title Camera: Send home
#'
#' @description
#' Send the camera to its home location
#'
#' @return A webglobe command
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe(immediate=TRUE)
#' wg+wgcamhome()
#' }
#'
#' @export
wgcamhome <- function(){
toString(jsonlite::toJSON(list(
command = jsonlite::unbox("cam_reset")
)))
}
#' @name wgclear
#'
#' @title Clear the scene
#'
#' @description
#' Clears everything from the map
#'
#' @return A webglobe command
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe(immediate=TRUE)
#' wg+wgclear()
#' }
#'
#' @export
wgclear <- function(){
toString(jsonlite::toJSON(list(
command = jsonlite::unbox("clear")
)))
}
#' @name wgcamcenter
#'
#' @title Camera: Center on a point
#'
#' @description
#' Centers the camera on a point
#'
#' @param lat Latitude of the center point, in degrees
#' @param lon Longitude of the center point, in degrees
#' @param alt Altitude of the center point, in kilometres
#'
#' @return A webglobe command
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe(immediate=TRUE)
#' wg+wgcamcenter(45,-93,5000)
#' }
#'
#' @export
wgcamcenter <- function(lat,lon,alt=NA){
if(!is.na(alt))
alt<-1000*alt
toString(jsonlite::toJSON(list(
command = jsonlite::unbox("cam_center"),
lat = jsonlite::unbox(lat),
lon = jsonlite::unbox(lon),
alt = jsonlite::unbox(alt)
)))
}
#' @name wgpolygondf
#'
#' @title Plot long-frame polygons
#'
#' @description
#' Plot polygons defined by long-style data frame
#'
#' @param df Data frame to plot
#' @param fill Fill colour name
#' @param alpha Alpha (transparency value)
#' @param extrude_height Height of the polygon above the surrounding landscape, in TODO
#' @param stroke Outline colour (TODO)
#' @param stroke_width Outline width (TODO)
#'
#' @return A webglobe command
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe(immediate=TRUE)
#' wg+wgpolygondf(ggplot2::map_data("usa"),fill="blue",extrude_height=1000)
#' }
#'
#' @export
wgpolygondf <- function(df,fill=NA,alpha=1,extrude_height=0,stroke="yellow",stroke_width=10){
toString(jsonlite::toJSON(list(
command = jsonlite::unbox("polygons"),
polys = jsonlite::fromJSON(geojsonio::geojson_json(df, group='group', geometry='polygon')),
fill = jsonlite::unbox(fill),
extrude_height = jsonlite::unbox(extrude_height),
alpha = jsonlite::unbox(alpha),
stroke = jsonlite::unbox(stroke),
stroke_width = jsonlite::unbox(stroke_width)
)))
}
#' @name wgtitle
#'
#' @title Title of webglobe browser window
#'
#' @description
#' Changes the tab/window title of the webglobe's browser view
#'
#' @param title The title to use
#'
#' @return A webglobe command
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe(immediate=TRUE)
#' wg+wgtitle("I am the new title!")
#' }
#'
#' @export
wgtitle <- function(title){
toString(jsonlite::toJSON(list(
command = jsonlite::unbox("title"),
title = jsonlite::unbox(title)
)))
}
#' @name wgbar
#'
#' @title Plot bars from the surface
#'
#' @description
#' Plots bars rising upwards from points on the Earth's surface
#'
#' @param lat Latitude of the bars' bases, in degrees
#' @param lon Latitude of the bars' bases, in degrees
#' @param alt Altitude of the bars' tops, may be one or many values
#' @param colour Colour of the bars, may be one or many values
#' @param width Width of bar bars, may be one or many values
#'
#' @return A webglobe command
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' data(quakes) #Load up some data
#' wg <- webglobe(immediate=FALSE) #Make a webglobe
#' wg <- wg + wgbar(quakes$lat, quakes$lon, alt=1.5e6*quakes$mag/10) #Plot quakes
#' wg <- wg + wgcamcenter(-33.35, 142.96, 8000) #Move camera
#' wg
#' }
#'
#' @export
wgbar <- function(lat,lon,alt=3000000,colour="blue",width=3){
if(length(lat)!=length(lon))
stop('Same number of latitude and longitude points are required!')
if(length(alt)!=1 && length(alt)!=length(lat))
stop('One altitude must be specified, or a number equal to that of latitude/longitude!')
if(length(colour)!=1 && length(colour)!=length(lat))
stop('One colour must be specified, or a number equal to that of latitude/longitude!')
if(length(width)!=1 && length(width)!=length(lat))
stop('One width must be specified, or a number equal to that of latitude/longitude!')
if(length(alt)==1)
alt <- rep(alt, length(lat))
if(length(colour)==1)
colour <- rep(colour, length(lat))
if(length(width)==1)
width <- rep(width, length(lat))
toString(jsonlite::toJSON(list(
command = jsonlite::unbox("bars"),
lat = lat,
lon = lon,
alt = alt,
colour = colour,
width = width
)))
}
#' @name wgimmediate
#'
#' @title Immediate mode: On
#'
#' @description
#' Turns on immediate mode
#'
#' @return A webglobe command
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe(immediate=FALSE)
#' wg + wgimmediate() #wg is now immediate
#' }
#'
#' @export
wgimmediate <- function(){
"immediate"
}
#' @name wgunimmediate
#'
#' @title Immediate mode: Off
#'
#' @description
#' Turns off immediate mode
#'
#' @return A webglobe command
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe(immediate=TRUE)
#' wg + wgunimmediate() #wg is now unimmediate
#' }
#'
#' @export
wgunimmediate <- function(){
"unimmediate"
}
#' @name wgimmediate_set
#'
#' @title Immediate mode: Set
#'
#' @description
#' Set immediate mode by value
#'
#' @param mode TRUE or FALSE: TRUE implies immediate mode on, FALSE implies off
#'
#' @return A webglobe command
#'
#' @examples
#' \dontrun{
#' library(webglobe)
#' wg<-webglobe(immediate=TRUE)
#' wg + wgimmediate_set(FALSE) #wg is now unimmediate
#' }
#'
#' @export
wgimmediate_set <- function(mode){
if(mode)
"immediate"
else
"unimmediate"
}
|
/scratch/gouwar.j/cran-all/cranData/webglobe/R/webglobe.R
|
---
title: "webglobe: Interactive 3D Maps"
author:
- Richard Barnes
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
%\VignetteIndexEntry{webglobe: Interactive 3D Maps}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
---
# webglobe: Interactive 3D Maps
You want to understand your data, but it's spatially distributed and you're
afraid that trying to make sense of it on something gross, like a Mercator
projection, is going to lead you to bad intuitions.

(Greenland is nowhere near that big in reality.)
webglobe can help you do this! It allows you to interactively visualize your
data on either a three-dimensional globe or a flat map.
# Example: Earthquakes

```r
library(webglobe)                        #Load the library
data(quakes)                             #Load up some data
wg <- webglobe(immediate=TRUE)           #Make a webglobe (should open a web browser)
Sys.sleep(10)                            #Wait for browser to start, or it won't work
wg + wgpoints(quakes$lat, quakes$lon, size=5*quakes$mag) #Plot quakes
wg + wgcamcenter(-24, 178.0650, 8000)    #Move camera
```

# Example: States

```r
library(webglobe)               #Load the library
m <- ggplot2::map_data("state") #Get data
m$extrude_height <- 1000000*runif(nrow(m),min=0,max=1)
wg <- webglobe(immediate=FALSE) #Make a webglobe (nothing is displayed yet)
wg <- wg + wgpolygondf(m,fill="blue",alpha=1,stroke=NA)
wg
```

# Example: dggridR
[dggridR](https://CRAN.R-project.org/package=dggridR) is a package for binning spatial data into equally-sized hexagonal or triangular cells. It makes spatial analysis and statistics easier by solving the problem of having to worry about whether your projection is appropriate for the region you are using: all cells have the same area. It also works well with webglobe!
```r
library(dggridR)
library(dplyr)
library(webglobe)
library(colorspace)
#Construct a global grid with cells approximately 1000 miles across
dggs <- dgconstruct(type="ISEA4T",spacing=1000, metric=FALSE, resround='down')
#Load included test data set
data(dgquakes)
#Get the corresponding grid cells for each earthquake epicenter (lat-long pair)
dgquakes$cell <- dgtransform(dggs,dgquakes$lat,dgquakes$lon)
#Get the number of earthquakes in each cell
quakecounts <- dgquakes %>% group_by(cell) %>% summarise(count=n())
#Get the grid cell boundaries for cells which had quakes
grid <- dgcellstogrid(dggs,quakecounts$cell,frame=TRUE,wrapcells=TRUE)
#Update the grid cells' properties to include the number of earthquakes
#in each cell
grid <- merge(grid,quakecounts,by.x="Name",by.y="cell")
#Make adjustments so the output is more visually interesting
grid$count <- log(grid$count)
cutoff <- quantile(grid$count,0.9)
grid <- grid %>% mutate(count=ifelse(count>cutoff,cutoff,count))
#Generate fill values based on quantiles of number quakes
grid$fill <- cut(grid$count, breaks=quantile(grid$count, seq(0,1,by=0.2)), labels=heat_hcl(5), include.lowest=TRUE)
#Construct a webglobe
wg <- webglobe(immediate=FALSE)
wg <- wg + wgpolygondf(grid, alpha=0.6)
wg
```

# Modes
webglobes have two modes: **immediate** and **not-immediate**. Immediate mode
displays a webglobe upon initialization and immediately prints all commands to
that globe. Not-immediate mode stores commands and displays them all at once,
allowing you to stage visualization without intermediate display. The difference
is illustrated below.
Display timing in immediate mode:

```r
library(webglobe)
data(quakes)                     #Get data
q <- quakes                      #Alias data
wgi <- webglobe(immediate=TRUE)  #Webglobe is displayed now
Sys.sleep(10)                    #Ensure webglobe runs before continuing
wgi + wgpoints( q$lat, q$lon)    #Data displays now!
wgi + wgpoints(-q$lat, -q$lon)   #Data displays now!
#Reloading the browser window clears everything
```
Display timing in not-immediate mode:

```r
library(webglobe)
data(quakes)                     #Get data
q <- quakes                      #Alias data
wgn <- webglobe(immediate=FALSE) #Webglobe is not displayed
Sys.sleep(0)                     #No need to wait
#Note that we have to store commands
wgn <- wgn + wgpoints( q$lat, q$lon)          #Nothing shown yet
wgn <- wgn + wgpoints(-q$lat, -q$lon)         #Nothing shown yet
wgn <- wgn + wgcamcenter(2.89,-175.962,21460) #Nothing shown yet
wgn                              #Show it all now!
#Reloading the browser window keeps everything
```
You can also switch between modes:
```r
library(webglobe)
data(quakes)                     #Get data
q <- quakes                      #Alias data
wgn <- webglobe(immediate=FALSE) #Webglobe is not displayed
Sys.sleep(0)                     #No need to wait
#Note that we have to store commands
wgn <- wgn + wgpoints( q$lat, q$lon)          #Nothing shown yet
wgn <- wgn + wgpoints(-q$lat, -q$lon)         #Nothing shown yet
wgn <- wgn + wgcamcenter(2.89,-175.962,21460) #Nothing shown yet
wgn + wgimmediate()              #Make it all immediate
wgn
wgn + wgpoints(q$lat, -q$lon)    #This is shown right away
#Reloading the browser window keeps everything up to `wgimmediate()`
```
# Roadmap
* Additional graphics primitives
* Submission to CRAN
# Credits
This R package was developed by Richard Barnes (https://rbarnes.org/).
It uses the Cesium WebGL virtual globe and map engine ([link](https://cesium.com/cesiumjs/)).
|
/scratch/gouwar.j/cran-all/cranData/webglobe/inst/doc/webglobe.Rmd
|
#' Add cluster control button to a web map
#'
#' Add a button to a [Leaflet](https://leafletjs.com/) map to toggle marker clusters on and off.
#'
#' @inheritParams add_home_button
#' @param cluster_id 'character' string.
#' Identification for the marker cluster layer.
#'
#' @inherit add_home_button return
#'
#' @author J.C. Fisher, U.S. Geological Survey, Idaho Water Science Center
#'
#' @seealso [`make_map`] function for creating a map widget.
#'
#' @export
#'
#' @examples
#' make_map(maps = "Topo") |>
#' leaflet::addMarkers(
#' lng = ~lng,
#' lat = ~lat,
#' label = ~name,
#' popup = ~name,
#' clusterOptions = leaflet::markerClusterOptions(
#' showCoverageOnHover = FALSE
#' ),
#' clusterId = "cluster",
#' group = "marker",
#' data = us_cities
#' ) |>
#' add_cluster_button(cluster_id = "cluster")
add_cluster_button <- function(map,
cluster_id,
position = "topleft") {
# check arguments
checkmate::assert_class(map, c("leaflet", "htmlwidget"))
checkmate::assert_string(cluster_id, min.chars = 1)
checkmate::assert_choice(position, c("topleft", "topright", "bottomleft", "bottomright"))
# Javascript derived from https://rstudio.github.io/leaflet/morefeatures.html and accessed on 2017-11-06.
# disable clusters
js <- sprintf(
"function(btn, map) {
var clusterManager = map.layerManager.getLayer('cluster', '%s');
clusterManager.disableClustering();
btn.state('disable-cluster');
}",
cluster_id
)
s0 <- leaflet::easyButtonState(
stateName = "enable-cluster",
icon = "fa-circle",
title = "Disable clustering",
onClick = htmlwidgets::JS(js)
)
# enable clusters
js <- sprintf(
"function(btn, map) {
var clusterManager = map.layerManager.getLayer('cluster', '%s');
clusterManager.enableClustering();
btn.state('enable-cluster');
}",
cluster_id
)
s1 <- leaflet::easyButtonState(
stateName = "disable-cluster",
icon = "fa-circle-o",
title = "Enable clustering",
onClick = htmlwidgets::JS(js)
)
# create button
button <- leaflet::easyButton(
position = position,
states = list(s0, s1)
)
# place button on map
leaflet::addEasyButton(map, button)
}
|
/scratch/gouwar.j/cran-all/cranData/webmap/R/add_cluster_button.R
|
#' Add full-screen button to a web map
#'
#' Add a button to a [Leaflet](https://leafletjs.com/) map that toggles full screen on and off.
#' Functionality provided by the [leaflet-fullscreen](https://github.com/Leaflet/Leaflet.fullscreen)
#' plugin for Leaflet.
#'
#' @inheritParams add_home_button
#' @param pseudo_fullscreen 'logical' flag.
#' Whether to fullscreen to page width and height.
#'
#' @inherit add_home_button return
#'
#' @author J.C. Fisher, U.S. Geological Survey, Idaho Water Science Center
#'
#' @seealso [`make_map`] function for creating a map widget.
#'
#' @export
#'
#' @examples
#' make_map(maps = "Topo") |>
#' add_fullscreen_button()
add_fullscreen_button <- function(map,
pseudo_fullscreen = FALSE,
position = "topleft") {
# check arguments
checkmate::assert_class(map, c("leaflet", "htmlwidget"))
checkmate::assert_choice(position, c("topleft", "topright", "bottomleft", "bottomright"))
checkmate::assert_flag(pseudo_fullscreen)
# attach html dependencies to map widget
map$dependencies <- c(
map$dependencies,
list(
htmltools::htmlDependency(
name = "leaflet-fullscreen",
version = "1.0.2",
src = "htmlwidgets/plugins/leaflet-fullscreen",
script = c("Leaflet.fullscreen.min.js"),
stylesheet = "leaflet.fullscreen.css",
package = "webmap"
)
)
)
# account for missing options
if (is.null(map$x$options)) {
map$x$options <- list()
}
# set control options
map$x$options["fullscreenControl"] <- list(list(
position = position,
pseudoFullscreen = pseudo_fullscreen
))
map
}
|
/scratch/gouwar.j/cran-all/cranData/webmap/R/add_fullscreen_button.R
|
#' Add home button to a web map
#'
#' Add a button to a [Leaflet](https://leafletjs.com/) map that zooms to the provided map extent.
#'
#' @param map '[leaflet]'.
#' Map widget object
#' @param position 'character' string.
#' Position of the button on the web map.
#' Possible values are "topleft", "topright", "bottomleft", and "bottomright".
#' @param extent 'bbox', or 'numeric' vector of length four, with `xmin`, `xmax`, `ymin` and `ymax` values.
#' Extent object representing a rectangular geographical area on the map.
#' The extent must be specified in the coordinate reference system (CRS) of the web map,
#' usually in latitude and longitude using WGS 84 (also known as [EPSG:4326](https://epsg.io/4326)).
#' By default, the extent will be automatically determined from
#' latitudes and longitudes of the map elements.
#'
#' @return A new HTML web `map` with added element, an object of class 'leaflet'.
#'
#' @author J.C. Fisher, U.S. Geological Survey, Idaho Water Science Center
#'
#' @seealso [`make_map`] function for creating a map widget.
#'
#' @export
#'
#' @examples
#' make_map(maps = "Topo") |>
#' add_home_button(
#' extent = c(-124.409, -114.131, 32.534, 42.009) # California
#' )
add_home_button <- function(map,
extent = NULL,
position = "topleft") {
# check arguments
checkmate::assert_class(map, c("leaflet", "htmlwidget"))
checkmate::assert_choice(position, c("topleft", "topright", "bottomleft", "bottomright"))
checkmate::assert_numeric(extent, len = 4, null.ok = TRUE)
# extract/create extent object
if (is.null(extent)) {
if (is.null(map$x$limits)) {
stop("Extent can not be determined from map elements", call. = FALSE)
} else {
extent <- c(map$x$limits$lng, map$x$limits$lat)
}
}
# create button
js <- sprintf(
"function(btn, map) {
map.fitBounds([[%f, %f],[%f, %f]]);
}",
extent[3], extent[1], extent[4], extent[2]
)
button <- leaflet::easyButton(
icon = "fa-home fa-lg",
title = "Zoom to initial map extent",
onClick = htmlwidgets::JS(js),
position = position
)
# place button on map
leaflet::addEasyButton(map, button)
}
|
/scratch/gouwar.j/cran-all/cranData/webmap/R/add_home_button.R
|
#' Add legend to a web map
#'
#' Add a legend to a [Leaflet](https://leafletjs.com/) map.
#'
#' @inheritParams add_home_button
#' @param labels 'character' vector.
#' Labels in the legend.
#' @param colors 'character' vector.
#' HTML colors corresponding to `labels`.
#' @param radius 'numeric' number.
#' Border radius of symbols in the legend, in pixels.
#' @param opacity 'numeric' number.
#' Opacity of symbols in the legend, from 0 to 1.
#' @param symbol 'character' string.
#' Symbol type in the legend, either "square" or "circle".
#' @param title 'character' string.
#' Legend title
#'
#' @inherit add_home_button return
#'
#' @author J.C. Fisher, U.S. Geological Survey, Idaho Water Science Center
#'
#' @seealso [`make_map`] function for creating a map widget.
#'
#' @export
#'
#' @examples
#' # define marker colors based on whether a city serves as a capital
#' colors <- c(
#' "Non-capital" = "green",
#' "Capital" = "red"
#' )
#' fill_colors <- colors[(us_cities$capital > 0) + 1L] |>
#' as.character()
#'
#' # print map with city circle markers and a map legend
#' make_map(maps = "Topo") |>
#' leaflet::addCircleMarkers(
#' lng = ~lng,
#' lat = ~lat,
#' radius = 6,
#' color = "white",
#' weight = 1,
#' opacity = 1,
#' fillColor = fill_colors,
#' fillOpacity = 1,
#' fill = TRUE,
#' data = us_cities
#' ) |>
#' add_legend(
#' labels = names(colors),
#' colors = colors,
#' radius = 5,
#' opacity = 1,
#' symbol = "circle"
#' )
add_legend <- function(map,
labels,
colors,
radius,
opacity = 0.5,
symbol = c("square", "circle"),
title = "EXPLANATION",
position = "topright") {
# check arguments
checkmate::assert_class(map, c("leaflet", "htmlwidget"))
checkmate::assert_character(labels, any.missing = FALSE, min.len = 1)
checkmate::assert_character(colors, any.missing = FALSE, len = length(labels))
checkmate::assert_numeric(radius, lower = 0, any.missing = FALSE, min.len = 1)
checkmate::assert_number(opacity, lower = 0, upper = 1, finite = TRUE)
symbol <- match.arg(symbol)
checkmate::assert_string(title, null.ok = TRUE)
checkmate::assert_choice(position, c("topleft", "topright", "bottomleft", "bottomright"))
sizes <- rep(radius, length.out = length(colors)) * 2
col <- sprintf(
switch(symbol,
"square" = "%s; width:%fpx; height:%fpx; margin-top:4px;",
"circle" = "%s; border-radius:50%%; width:%fpx; height:%fpx; margin-top:4px;"
), colors, sizes, sizes
)
lab <- sprintf(
"<div style='display:inline-block; height:%fpx; line-height:%fpx; margin-top:4px;'>%s</div>",
sizes, sizes, labels
)
if (is.character(title)) {
title <- sprintf("<div style='text-align:center;'>%s</div>", title)
}
leaflet::addLegend(map,
position = position,
colors = col,
labels = lab,
labFormat = as.character(),
opacity = opacity,
title = title
)
}
|
/scratch/gouwar.j/cran-all/cranData/webmap/R/add_legend.R
|
#' Add search button to a web map
#'
#' Add a button to a [Leaflet](https://leafletjs.com/) map to search markers/features location by property.
#' Functionality provided by the [leaflet-search](https://github.com/stefanocudini/leaflet-search)
#' plugin for Leaflet.
#'
#' @inheritParams add_home_button
#' @param group 'character' string.
#' Name of the group whose features will be searched.
#' @param open_popup 'logical' flag.
#' Whether to open the marker popup associated with the searched for marker.
#' @param property_name 'character' string.
#' Property name used to describe markers, such as, "label" and "popup".
#' @param text_placeholder 'character' string.
#' Message to show in search element.
#' @param zoom 'integer' count.
#' Zoom level for move to location after marker found in search.
#'
#' @inherit add_home_button return
#'
#' @author J.C. Fisher, U.S. Geological Survey, Idaho Water Science Center
#'
#' @seealso [`make_map`] function for creating a map widget.
#'
#' @export
#'
#' @examples
#' make_map(maps = "Topo") |>
#' leaflet::addMarkers(
#' lng = ~lng,
#' lat = ~lat,
#' label = ~name,
#' popup = ~name,
#' group = "marker",
#' data = us_cities
#' ) |>
#' add_search_button(
#' group = "marker",
#' zoom = 15,
#' text_placeholder = "Search city names..."
#' )
add_search_button <- function(map,
group,
property_name = "label",
zoom = NULL,
text_placeholder = "Search...",
open_popup = FALSE,
position = "topleft") {
# check arguments
checkmate::assert_class(map, c("leaflet", "htmlwidget"))
checkmate::assert_string(group, min.chars = 1)
checkmate::assert_string(property_name, min.chars = 1)
checkmate::assert_int(zoom, lower = 0, null.ok = TRUE)
checkmate::assert_string(text_placeholder, null.ok = TRUE)
checkmate::assert_flag(open_popup)
checkmate::assert_choice(position, c("topleft", "topright", "bottomleft", "bottomright"))
# check group is in map widget
grp <- lapply(lapply(map$x$calls, function(x) x[[2]]), function(x) x[5][[1]]) |>
unlist()
if (!(group %in% grp)) {
stop("Group with name '", group, "' missing from map widget.")
}
# attach html dependencies to map widget
map$dependencies <- c(
map$dependencies,
list(
htmltools::htmlDependency(
name = "leaflet-search",
version = "2.9.6",
src = "htmlwidgets/plugins/leaflet-search",
script = c("leaflet-search.min.js", "leaflet-search-binding.js"),
stylesheet = "leaflet-search.min.css",
package = "webmap"
)
)
)
# define arguments to be passed to the javascript method
marker <- list(
"icon" = FALSE,
"animate" = TRUE,
"circle" = list(
"radius" = 20,
"weight" = 3,
"opacity" = 0.7,
"color" = "#FF4040",
"stroke" = TRUE,
"fill" = FALSE
)
)
option <- list(
"propertyName" = property_name,
"zoom" = zoom,
"textPlaceholder" = text_placeholder,
"openPopup" = open_popup,
"position" = position,
"initial" = FALSE,
"hideMarkerOnCollapse" = TRUE,
"marker" = marker
)
# add leaflet-search element to map
leaflet::invokeMethod(map,
data = leaflet::getMapData(map),
method = "addSearchControl",
group,
leaflet::filterNULL(option)
)
}
|
/scratch/gouwar.j/cran-all/cranData/webmap/R/add_search_button.R
|
#' Create a web map using TNM services
#'
#' Create a [Leaflet](https://leafletjs.com/) map widget that includes base maps offered through
#' [The National Map](https://www.usgs.gov/programs/national-geospatial-program/national-map) (TNM)
#' cached [service endpoints](https://apps.nationalmap.gov/services).
#' Information about the content of these base maps can be found within the
#' [TNM Base Maps](https://apps.nationalmap.gov/help/3.0%20TNM%20Base%20Maps.htm) document.
#' TNM content is limited to the United States and territories.
#' The map widget can be rendered on HTML pages generated from R Markdown, Shiny, or other applications.
#'
#' @param maps 'character' vector.
#' Base maps to include in the web map. Choices include
#' "Topo", "Imagery", "Imagery Topo", "Hydrography", "Shaded Relief", "OSM", and "Blank".
#' See 'Details' section for a description of each base map.
#' By default, all base maps are included.
#' The one exception is the "Blank" map,
#' which is only accessible using a Web Map Service (WMS),
#' see `protocol` argument.
#' @param ...
#' Arguments to be passed to the [`leaflet`][leaflet::leaflet] function.
#' @param protocol 'character' string.
#' Standard protocol for serving pre-rendered georeferenced TNM map tiles.
#' Select "WMTS" for the Web Map Tile Service (the default) and "WMS" for the Web Map Service.
#' @param hydro 'logical' flag.
#' Whether to show or hide (the default) the "Hydrography" overlay base map.
#' @param collapse 'logical' flag.
#' Whether the layers control should be rendered as an icon that expands when hovered over.
#' Default is `FALSE`.
#'
#' @details Composite base maps include:
#' * "Topo" a tile base map that combines the most current TNM data,
#' and other public-domain data, into a multi-scale topographic reference map.
#' Data includes boundaries, geographic names, transportation,
#' contours, hydrography, land cover, shaded relief, and bathymetry.<br/>
#' \if{latex}{\cr} 
#' * "Imagery" is a tile base map of orthoimagery in TNM.
#' Orthoimagery data typically are high resolution aerial images that combine the
#' visual attributes of an aerial photograph with the spatial accuracy and reliability of a planimetric map.
#' USGS digital orthoimage resolution may vary from 6 inches to 1 meter.<br/>
#' \if{latex}{\cr} 
#' * "Imagery Topo" is a tile base map of orthoimagery in TNM as a backdrop,
#' with a limited selection of topographic data
#' (boundaries, names, transportation, contours, and hydrography).<br/>
#' \if{latex}{\cr} 
#' * "Hydrography" is a overlay of cartographic representation of the
#' [National Hydrography Dataset](https://www.usgs.gov/national-hydrography/national-hydrography-dataset) (NHD).
#' The NHD is a comprehensive set of digital geospatial data that encodes information about naturally occurring
#' and constructed bodies of surface water, paths through which water flows, related features such as
#' stream gages and dams, and additional hydrologic information.<br/>
#' \if{latex}{\cr} 
#' * "Shaded Relief" is a tile base map of terrain representation in the form of hillshades created from the
#' [3D Elevation Program](https://www.usgs.gov/3d-elevation-program) (3DEP). 3DEP maintains a seamless dataset
#' of best available raster elevation data, in the form of digital elevation models (DEMs) for the conterminous
#' United States, Alaska, Hawaii, and Territorial Islands of the United States.<br/>
#' \if{latex}{\cr} 
#' * "OSM" is the [OpenStreetMap](https://www.openstreetmap.org/about) tile base map.<br/>
#' \if{latex}{\cr} 
#' * "Blank" consists of ocean tints to give the outline of land cover as an empty base map.<br/>
#' \if{latex}{\cr} 
#'
#' @return An object of class 'leaflet', a hypertext markup language (HTML) map widget.
#' See example for instructions on how to add additional graphic layers
#' (such as points, lines, and polygons) to the map widget.
#' Graphic layers added to the web map must be in latitude and longitude using WGS 84
#' (also known as [EPSG:4326](https://epsg.io/4326)).
#'
#' @author J.C. Fisher, U.S. Geological Survey, Idaho Water Science Center
#'
#' @export
#'
#' @examples
#' # Create map widget
#' map <- make_map()
#'
#' # Print map widget
#' map
#'
#' # Print map with markers
#' pts <- rbind(
#' c(-112.049, 43.517),
#' c(-122.171, 37.456),
#' c( -77.367, 38.947),
#' c(-149.803, 61.187),
#' c( -80.248, 26.080)
#' )
#' leaflet::addMarkers(map,
#' lng = pts[, 1],
#' lat = pts[, 2]
#' )
#'
#' # Print map of satellite imagery with a rectangle in the vicinity of UCLA
#' make_map(
#' maps = "Imagery",
#' collapse = TRUE
#' ) |>
#' leaflet::addRectangles(
#' lng1 = -118.456,
#' lat1 = 34.078,
#' lng2 = -118.436,
#' lat2 = 34.062,
#' fillColor = "transparent"
#' )
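#'
#' # The "Blank" base map is only served over the WMS protocol; a minimal
#' # sketch (shown commented out because it requires network access):
#' # make_map(maps = c("Topo", "Blank"), protocol = "WMS")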
make_map <- function(maps,
...,
protocol = c("WMTS", "WMS"),
hydro = FALSE,
collapse = FALSE) {
# check arguments
protocol <- match.arg(protocol)
checkmate::assert_flag(hydro)
checkmate::assert_flag(collapse)
# set base maps
basemaps <- c(
"Topo" = "USGSTopo",
"Imagery" = "USGSImageryOnly",
"Imagery Topo" = "USGSImageryTopo",
"Hydrography" = "USGSHydroCached",
"Shaded Relief" = "USGSShadedReliefOnly",
"OSM" = "OpenStreetMap",
"Blank" = "USGSTNMBlank"
)
if (missing(maps)) {
maps <- names(basemaps)
if (protocol == "WMTS") {
maps <- maps[maps != "Blank"]
}
}
checkmate::assert_character(maps, any.missing = FALSE, min.len = 1)
maps <- match.arg(maps, choices = names(basemaps), several.ok = TRUE)
basemaps <- basemaps[maps]
# subset TNM base maps
tnm_basemaps <- basemaps[maps != "OSM"]
# set attribution
hyperlinks <- c(
"USGS" = format(
htmltools::tags$a(
"USGS",
href = "https://www.usgs.gov/laws/policies_notices.html",
title = "Policies and Notices",
target = "_blank"
)
),
"TNM" = format(
htmltools::tags$a(
"TNM",
href = "https://www.usgs.gov/programs/national-geospatial-program/national-map",
title = "The National Map",
target = "_blank"
)
),
"OSM" = paste(
"\U00A9",
htmltools::tags$a(
"OpenStreetMap",
href = "https://www.openstreetmap.org/copyright",
title = "Copyright and License",
target = "_blank"
)
)
)
if (length(tnm_basemaps) == 0) {
hyperlinks <- hyperlinks["OSM"]
} else if (!"OSM" %in% maps) {
hyperlinks <- hyperlinks[c("USGS", "TNM")]
}
attr <- paste(hyperlinks, collapse = " | ")
# initialize tile options
opt <- leaflet::tileOptions(minZoom = 3)
# initialize map widget
map <- leaflet::leaflet(...)
# add OpenStreetMap layer
if ("OSM" %in% maps) {
map <- leaflet::addTiles(map,
group = "OSM",
attribution = attr,
options = opt
)
}
# set domain
domain <- "https://basemap.nationalmap.gov"
# set maximum zoom
opt[["maxZoom"]] <- 16
# add base map using web map tile service
if (protocol == "WMTS") {
url <- sprintf("%s/arcgis/rest/services/%s/MapServer/tile/{z}/{y}/{x}", domain, tnm_basemaps)
for (i in seq_along(tnm_basemaps)) {
map <- leaflet::addTiles(map,
urlTemplate = url[i],
attribution = attr,
group = names(tnm_basemaps)[i],
options = opt
)
}
# add base map using web map service
} else if (protocol == "WMS") {
url <- sprintf("%s/arcgis/services/%s/MapServer/WmsServer?", domain, tnm_basemaps)
opt[["format"]] <- "image/jpeg"
opt[["version"]] <- "1.3.0"
for (i in seq_along(tnm_basemaps)) {
map <- leaflet::addWMSTiles(map,
baseUrl = url[i],
group = names(tnm_basemaps)[i],
options = opt,
attribution = attr,
layers = "0"
)
}
}
# add layer control
if (length(maps) > 1) {
# set groups
is <- maps == "Hydrography"
overlay_groups <- maps[is]
base_groups <- maps[!is]
# add widget
map <- leaflet::addLayersControl(map,
position = "topright",
baseGroups = base_groups,
overlayGroups = overlay_groups,
options = leaflet::layersControlOptions(collapsed = collapse)
)
# hide hydrography layer
if (!hydro) {
map <- leaflet::hideGroup(map, group = "Hydrography")
}
}
# add scale bar
map <- leaflet::addScaleBar(map, position = "bottomleft")
# return map widget
map
}
|
/scratch/gouwar.j/cran-all/cranData/webmap/R/make_map.R
|
#' US Major Cities
#'
#' @description This dataset contains the locations of cities within the United States
#' with populations of about 40,000 or greater, all state capitals, and the national capital.
#'
#' @format A data frame with columns:
#' \describe{
#' \item{`name`}{City name}
#' \item{`capital`}{Capital status code.
#' A value of 0 indicates that the city is not a capital,
#' a value of 1 indicates the national capital,
#' and a value of 2 indicates a state capital.}
#' \item{`lng`}{Longitude in decimal degrees.}
#' \item{`lat`}{Latitude in decimal degrees.}
#' }
#'
#' @source The census-designated place population as of January 30, 2024,
#' as enumerated by the 2020 United States census.
#'
#' @keywords datasets
#'
#' @examples
#' str(us_cities)
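#'
#' # A minimal sketch of placing the cities on a web map built with
#' # make_map() (shown commented out because it requires network access):
#' # make_map(maps = "Topo") |>
#' #   leaflet::addMarkers(lng = ~lng, lat = ~lat, label = ~name, data = us_cities)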
"us_cities"
|
/scratch/gouwar.j/cran-all/cranData/webmap/R/us_cities.R
|
# setup tinytest for checkmate functionality
library("tinytest")
library("checkmate")
using("checkmate")
# test access to TNM web map tile service
map <- make_map(protocol = "WMTS")
checkmate::expect_class(map, classes = c("leaflet", "htmlwidget"))
# test access to TNM web map services
map <- make_map(protocol = "WMS")
checkmate::expect_class(map, classes = c("leaflet", "htmlwidget"))
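# additional sketch: an unknown base-map name should error
# (assumption: match.arg() inside make_map() rejects names not in its choices)
expect_error(make_map(maps = "NotABaseMap"))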
|
/scratch/gouwar.j/cran-all/cranData/webmap/inst/tinytest/test_webmap.R
|
#' @title HttpLibAdapaterRegistry
#' @description http lib adapter registry
#' @export
#' @examples
#' x <- HttpLibAdapaterRegistry$new()
#' x$register(CrulAdapter$new())
#' x
#' x$adapters
#' x$adapters[[1]]$name
HttpLibAdapaterRegistry <- R6::R6Class(
'HttpLibAdapaterRegistry',
public = list(
#' @field adapters list
adapters = NULL,
#' @description print method for the `HttpLibAdapaterRegistry` class
#' @param x self
#' @param ... ignored
print = function(x, ...) {
cat("<HttpLibAdapaterRegistry> ", sep = "\n")
for (i in seq_along(self$adapters)) {
cat(sprintf(" %s: webmockr:::%s", self$adapters[[i]]$name,
class(self$adapters[[i]])[1]), sep = "\n")
}
},
#' @description Register an http library adapter
#' @param x an http lib adapter, e.g., [CrulAdapter]
#' @return nothing, registers the library adapter
register = function(x) {
# FIXME: when other adapters supported, change this inherits test
if (!inherits(x, c("CrulAdapter", "HttrAdapter"))) {
stop("'x' must be an adapter, such as CrulAdapter", call. = FALSE)
}
self$adapters <- c(self$adapters, x)
}
)
)
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/HttpLibAdapterRegistry.R
|
#' @title RequestPattern class
#' @description class handling all request matchers
#' @export
#' @seealso pattern classes for HTTP method [MethodPattern], headers
#' [HeadersPattern], body [BodyPattern], and URI/URL [UriPattern]
#' @examples \dontrun{
#' (x <- RequestPattern$new(method = "get", uri = "httpbin.org/get"))
#' x$body_pattern
#' x$headers_pattern
#' x$method_pattern
#' x$uri_pattern
#' x$to_s()
#'
#' # make a request signature
#' rs <- RequestSignature$new(method = "get", uri = "http://httpbin.org/get")
#'
#' # check if it matches
#' x$matches(rs)
#'
#' # regex uri
#' (x <- RequestPattern$new(method = "get", uri_regex = ".+ossref.org"))
#' x$uri_pattern
#' x$uri_pattern$to_s()
#' x$to_s()
#'
#' # uri with query parameters
#' (x <- RequestPattern$new(
#' method = "get", uri = "https://httpbin.org/get",
#' query = list(foo = "bar")
#' ))
#' x$to_s()
#' ## query params included in url, not separately
#' (x <- RequestPattern$new(
#' method = "get", uri = "https://httpbin.org/get?stuff=things"
#' ))
#' x$to_s()
#' x$query_params
#'
#' # just headers (via setting method=any & uri_regex=.+)
#' headers <- list(
#' 'User-Agent' = 'Apple',
#' 'Accept-Encoding' = 'gzip, deflate',
#' 'Accept' = 'application/json, text/xml, application/xml, */*')
#' x <- RequestPattern$new(
#' method = "any",
#' uri_regex = ".+",
#' headers = headers)
#' x$to_s()
#' rs <- RequestSignature$new(method = "any", uri = "http://foo.bar",
#' options = list(headers = headers))
#' rs
#' x$matches(rs)
#'
#' # body
#' x <- RequestPattern$new(method = "post", uri = "httpbin.org/post",
#' body = list(y = crul::upload(system.file("CITATION"))))
#' x$to_s()
#' rs <- RequestSignature$new(method = "post", uri = "http://httpbin.org/post",
#' options = list(
#' body = list(y = crul::upload(system.file("CITATION")))))
#' rs
#' x$matches(rs)
#' }
RequestPattern <- R6::R6Class(
'RequestPattern',
public = list(
#' @field method_pattern a [MethodPattern] object
method_pattern = NULL,
#' @field uri_pattern a [UriPattern] object
uri_pattern = NULL,
#' @field body_pattern a [BodyPattern] object, or `NULL` if no body given
body_pattern = NULL,
#' @field headers_pattern a [HeadersPattern] object, or `NULL` if no headers given
headers_pattern = NULL,
#' @description Create a new `RequestPattern` object
#' @param method the HTTP method (any, head, options, get, post, put,
#' patch, trace, or delete). "any" matches any HTTP method. required.
#' @param uri (character) request URI. required or uri_regex
#' @param uri_regex (character) request URI as regex. required or uri
#' @param query (list) query parameters, optional
#' @param body (list) body request, optional
#' @param headers (list) headers, optional
#' @return A new `RequestPattern` object
initialize = function(method, uri = NULL, uri_regex = NULL,
query = NULL, body = NULL, headers = NULL) {
if (is.null(uri) && is.null(uri_regex)) {
stop("one of uri or uri_regex is required", call. = FALSE)
}
self$method_pattern <- MethodPattern$new(pattern = method)
self$uri_pattern <- if (is.null(uri_regex)) {
UriPattern$new(pattern = uri)
} else {
UriPattern$new(regex_pattern = uri_regex)
}
self$uri_pattern$add_query_params(query)
self$body_pattern <- if (!is.null(body)) BodyPattern$new(pattern = body)
self$headers_pattern <- if (!is.null(headers))
HeadersPattern$new(pattern = headers)
# FIXME: all private methods used in the below line, see if needed or remove
# if (length(options)) private$assign_options(options)
},
#' @description does a request signature match the selected matchers?
#' @param request_signature a [RequestSignature] object
#' @return a boolean
matches = function(request_signature) {
assert(request_signature, "RequestSignature")
c_type <- if (!is.null(request_signature$headers)) request_signature$headers$`Content-Type` else NULL
if (!is.null(c_type)) c_type <- strsplit(c_type, ';')[[1]][1]
self$method_pattern$matches(request_signature$method) &&
self$uri_pattern$matches(request_signature$uri) &&
(is.null(self$body_pattern) || self$body_pattern$matches(request_signature$body, c_type %||% "")) &&
(is.null(self$headers_pattern) || self$headers_pattern$matches(request_signature$headers))
},
#' @description Print pattern for easy human consumption
#' @return a string
to_s = function() {
gsub("^\\s+|\\s+$", "", paste(
toupper(self$method_pattern$to_s()),
self$uri_pattern$to_s(),
if (!is.null(self$body_pattern)) paste0(" with body ", self$body_pattern$to_s()),
if (!is.null(self$headers_pattern)) paste0(" with headers ", self$headers_pattern$to_s())
))
}
),
private = list(
# assign_options = function(options) {
# #self$validate_keys(options, 'body', 'headers', 'query', 'basic_auth')
# set_basic_auth_as_headers(options)
# self$body_pattern <- if ('body' %in% names(options)) BodyPattern$new(options['body'])
# self$headers_pattern <- if ('headers' %in% names(options)) HeadersPattern$new(options['headers'])
# if ('query' %in% names(options)) self$uri_pattern$add_query_params(options['query'])
# },
# validate_keys = function(x, ...) {
# valid_keys <- unlist(list(...), recursive = FALSE)
# for (i in seq_along(x)) {
# if (!names(x)[i] %in% valid_keys) {
# stop(
# sprintf("Unknown key: %s. Valid keys are: %s",
# names(x)[i],
# paste0(valid_keys, collapse = ", "),
# call. = FALSE
# )
# )
# }
# }
# },
set_basic_auth_as_headers = function(options) {
if ('basic_auth' %in% names(options)) {
private$validate_basic_auth(options$basic_auth)
options$headers <- list()
options$headers$Authorization <-
private$make_basic_auth(options$basic_auth[1], options$basic_auth[2])
}
},
validate_basic_auth = function(x) {
if (!inherits(x, "list") || length(unique(unname(unlist(x)))) == 1) {
stop(
"'basic_auth' option should be a list of length 2: username and password",
call. = FALSE
)
}
},
make_basic_auth = function(x, y) {
jsonlite::base64_enc(paste0(x, ":", y))
}
)
)
#' @title MethodPattern
#' @description method matcher
#' @export
#' @keywords internal
#' @details Matches regardless of case, e.g., POST will match post
#' @examples
#' (x <- MethodPattern$new(pattern = "post"))
#' x$pattern
#' x$matches(method = "post")
#' x$matches(method = "POST")
#'
#' # all matches() calls should be TRUE
#' (x <- MethodPattern$new(pattern = "any"))
#' x$pattern
#' x$matches(method = "post")
#' x$matches(method = "GET")
#' x$matches(method = "HEAD")
MethodPattern <- R6::R6Class(
'MethodPattern',
public = list(
#' @field pattern (character) an http method
pattern = NULL,
#' @description Create a new `MethodPattern` object
#' @param pattern (character) a HTTP method, lowercase
#' @return A new `MethodPattern` object
initialize = function(pattern) {
self$pattern <- tolower(pattern)
},
#' @description test if the pattern matches a given http method
#' @param method (character) a HTTP method, lowercase
#' @return a boolean
matches = function(method) {
self$pattern == tolower(method) || self$pattern == "any"
},
#' @description Print pattern for easy human consumption
#' @return a string
to_s = function() self$pattern
)
)
#' @title HeadersPattern
#' @description headers matcher
#' @export
#' @keywords internal
#' @details
#' `webmockr` normalises headers and treats all forms of the same headers as equal;
#' i.e., the following two sets of headers are equal:
#' `list(Header1 = "value1", content_length = 123, X_CuStOm_hEAder = "foo")`
#' and
#' `list(header1 = "value1", "Content-Length" = 123, "x-cuSTOM-HeAder" = "foo")`
#' @examples
#' (x <- HeadersPattern$new(pattern = list(a = 5)))
#' x$pattern
#' x$matches(list(a = 5))
#'
#' # different cases
#' (x <- HeadersPattern$new(pattern = list(Header1 = "value1")))
#' x$pattern
#' x$matches(list(header1 = "value1"))
#' x$matches(list(header1 = "value2"))
#'
#' # different symbols
#' (x <- HeadersPattern$new(pattern = list(`Hello_World` = "yep")))
#' x$pattern
#' x$matches(list(`hello-world` = "yep"))
#' x$matches(list(`hello-worlds` = "yep"))
#'
#' headers <- list(
#' 'User-Agent' = 'Apple',
#' 'Accept-Encoding' = 'gzip, deflate',
#' 'Accept' = 'application/json, text/xml, application/xml, */*')
#' (x <- HeadersPattern$new(pattern = headers))
#' x$to_s()
#' x$pattern
#' x$matches(headers)
HeadersPattern <- R6::R6Class(
'HeadersPattern',
public = list(
#' @field pattern a list
pattern = NULL,
#' @description Create a new `HeadersPattern` object
#' @param pattern (list) a pattern, as a named list (must be named),
#' e.g., `list(a = 5, b = 6)`
#' @return A new `HeadersPattern` object
initialize = function(pattern) {
stopifnot(is.list(pattern))
pattern <- private$normalize_headers(pattern)
self$pattern <- pattern
},
#' @description Match a list of headers against that stored
#' @param headers (list) named list of headers, e.g., `list(a = 5, b = 6)`
#' @return a boolean
matches = function(headers) {
if (self$empty_headers(self$pattern)) {
self$empty_headers(headers)
} else {
if (self$empty_headers(headers)) return(FALSE)
headers <- private$normalize_headers(headers)
out <- c()
for (i in seq_along(self$pattern)) {
out[i] <- names(self$pattern)[i] %in% names(headers) &&
self$pattern[[i]] == headers[[names(self$pattern)[i]]]
}
all(out)
}
},
#' @description Are headers empty? tests if null or length==0
#' @param headers named list of headers
#' @return a boolean
empty_headers = function(headers) {
is.null(headers) || length(headers) == 0
},
#' @description Print pattern for easy human consumption
#' @return a string
to_s = function() hdl_lst2(self$pattern)
),
private = list(
normalize_headers = function(x) {
# normalize names
names(x) <- tolower(names(x))
# normalize symbols
## underscores to single dash
names(x) <- gsub("_", "-", names(x))
return(x)
}
)
)
#' @title BodyPattern
#' @description body matcher
#' @export
#' @keywords internal
#' @examples
#' # make a request signature
#' bb <- RequestSignature$new(
#' method = "get",
#' uri = "https:/httpbin.org/get",
#' options = list(
#' body = list(foo = "bar", a = 5)
#' )
#' )
#'
#' # make body pattern object
#' ## FALSE
#' z <- BodyPattern$new(pattern = list(foo = "bar"))
#' z$pattern
#' z$matches(bb$body)
#' ## TRUE
#' z <- BodyPattern$new(pattern = list(foo = "bar", a = 5))
#' z$pattern
#' z$matches(bb$body)
#'
#' # uploads in bodies
#' ## upload NOT in a list
#' bb <- RequestSignature$new(
#' method = "post", uri = "https:/httpbin.org/post",
#' options = list(body = crul::upload(system.file("CITATION"))))
#' bb$body
#' z <- BodyPattern$new(pattern =
#' crul::upload(system.file("CITATION")))
#' z$pattern
#' z$matches(bb$body)
#'
#' ## upload in a list
#' bb <- RequestSignature$new(
#' method = "post", uri = "https:/httpbin.org/post",
#' options = list(body = list(y = crul::upload(system.file("CITATION")))))
#' bb$body
#' z <- BodyPattern$new(pattern =
#' list(y = crul::upload(system.file("CITATION"))))
#' z$pattern
#' z$matches(bb$body)
BodyPattern <- R6::R6Class(
'BodyPattern',
public = list(
#' @field pattern a list
pattern = NULL,
#' @description Create a new `BodyPattern` object
#' @param pattern (list) a body object
#' @return A new `BodyPattern` object
initialize = function(pattern) {
if (inherits(pattern, "form_file"))
self$pattern <- unclass(pattern)
else
self$pattern <- pattern
},
#' @description Match a request body pattern against a pattern
#' @param body (list) the body
#' @param content_type (character) content type
#' @return a boolean
matches = function(body, content_type = "") {
if (inherits(self$pattern, "list")) {
if (length(self$pattern) == 0) return(TRUE)
private$matching_hashes(private$body_as_hash(body, content_type), self$pattern)
} else {
(private$empty_string(self$pattern) && private$empty_string(body)) || all(self$pattern == body)
}
},
#' @description Print pattern for easy human consumption
#' @return a string
to_s = function() self$pattern
),
private = list(
empty_string = function(string) {
is.null(string) || !nzchar(string)
},
matching_hashes = function(z, pattern) {
if (is.null(z)) return(FALSE)
if (!inherits(z, "list")) return(FALSE)
if (!all(sort(names(z)) %in% sort(names(pattern)))) return(FALSE)
for (i in seq_along(z)) {
expected <- pattern[[names(z)[i]]]
actual <- z[[i]]
if (inherits(actual, "list") && inherits(expected, "list")) {
if (!private$matching_hashes(actual, expected)) return(FALSE)
} else {
if (!identical(as.character(actual), as.character(expected))) return(FALSE)
}
}
return(TRUE)
},
body_as_hash = function(body, content_type) {
if (inherits(body, "form_file")) body <- unclass(body)
bctype <- BODY_FORMATS[[content_type]] %||% ""
if (bctype == 'json') {
jsonlite::fromJSON(body, FALSE)
} else if (bctype == 'xml') {
check_for_pkg("xml2")
xml2::read_xml(body)
} else {
query_mapper(body)
}
}
)
)
BODY_FORMATS <- list(
'text/xml' = 'xml',
'application/xml' = 'xml',
'application/json' = 'json',
'text/json' = 'json',
'application/javascript' = 'json',
'text/javascript' = 'json',
'text/html' = 'html',
'application/x-yaml' = 'yaml',
'text/yaml' = 'yaml',
'text/plain' = 'plain'
)
#' @title UriPattern
#' @description uri matcher
#' @export
#' @keywords internal
#' @examples
#' # trailing slash
#' (z <- UriPattern$new(pattern = "http://foobar.com"))
#' z$matches("http://foobar.com") # TRUE
#' z$matches("http://foobar.com/") # TRUE
#'
#' # without scheme
#' ## matches http by default: does not match https by default
#' (z <- UriPattern$new(pattern = "foobar.com"))
#' z$matches("http://foobar.com") # TRUE
#' z$matches("http://foobar.com/") # TRUE
#' z$matches("https://foobar.com") # FALSE
#' z$matches("https://foobar.com/") # FALSE
#' ## to match https, you'll have to give the complete url
#' (z <- UriPattern$new(pattern = "https://foobar.com"))
#' z$matches("https://foobar.com/") # TRUE
#' z$matches("http://foobar.com/") # FALSE
#'
#' # default ports
#' (z <- UriPattern$new(pattern = "http://foobar.com"))
#' z$matches("http://foobar.com:80") # TRUE
#' z$matches("http://foobar.com:80/") # TRUE
#' z$matches("http://foobar.com:443") # TRUE
#' z$matches("http://foobar.com:443/") # TRUE
#'
#' # user info - FIXME, not sure we support this yet
#' (z <- UriPattern$new(pattern = "http://foobar.com"))
#' z$matches("http://user:[email protected]")
#'
#' # regex
#' (z <- UriPattern$new(regex_pattern = ".+ample\\.."))
#' z$matches("http://sample.org") # TRUE
#' z$matches("http://example.com") # TRUE
#' z$matches("http://tramples.net") # FALSE
#'
#' # add query parameters
#' (z <- UriPattern$new(pattern = "http://foobar.com"))
#' z$add_query_params(list(pizza = "cheese", cheese = "cheddar"))
#' z
#' z$pattern
#' z$matches("http://foobar.com?pizza=cheese&cheese=cheddar") # TRUE
#' z$matches("http://foobar.com?pizza=cheese&cheese=swiss") # FALSE
#'
#' # query parameters in the uri
#' (z <- UriPattern$new(pattern = "https://httpbin.org/get?stuff=things"))
#' z$add_query_params() # have to run this method to gather query params
#' z$matches("https://httpbin.org/get?stuff=things") # TRUE
#' z$matches("https://httpbin.org/get?stuff2=things") # FALSE
#'
#' # regex add query parameters
#' (z <- UriPattern$new(regex_pattern = "https://foobar.com/.+/order"))
#' z$add_query_params(list(pizza = "cheese"))
#' z
#' z$pattern
#' z$matches("https://foobar.com/pizzas/order?pizza=cheese") # TRUE
#' z$matches("https://foobar.com/pizzas?pizza=cheese") # FALSE
#'
#' # query parameters in the regex uri
#' (z <- UriPattern$new(regex_pattern = "https://x.com/.+/order\\?fruit=apple"))
#' z$add_query_params() # have to run this method to gather query params
#' z$matches("https://x.com/a/order?fruit=apple") # TRUE
#' z$matches("https://x.com/a?fruit=apple") # FALSE
#'
#' # any pattern
#' (z <- UriPattern$new(regex_pattern = "stuff\\.com.+"))
#' z$regex
#' z$pattern
#' z$matches("http://stuff.com") # FALSE
#' z$matches("https://stuff.com/stff") # TRUE
#' z$matches("https://stuff.com/apple?bears=brown&bats=grey") # TRUE
UriPattern <- R6::R6Class(
'UriPattern',
public = list(
#' @field pattern (character) pattern holder
pattern = NULL,
#' @field regex a logical
regex = FALSE,
#' @field query_params a list, or `NULL` if empty
query_params = NULL,
#' @description Create a new `UriPattern` object
#' @param pattern (character) a uri, as a character string. if scheme
#' is missing, it is added (we assume http)
#' @param regex_pattern (character) a uri as a regex character string,
#' see [base::regex]. if scheme is missing, it is added (we assume
#' http)
#' @return A new `UriPattern` object
initialize = function(pattern = NULL, regex_pattern = NULL) {
stopifnot(xor(is.null(pattern), is.null(regex_pattern)))
if (!is.null(regex_pattern)) self$regex <- TRUE
pattern <- if (!is.null(pattern)) pattern else regex_pattern
if (self$regex) pattern <- add_scheme(pattern)
self$pattern <- normalize_uri(pattern, self$regex)
},
#' @description Match a uri against a pattern
#' @param uri (character) a uri
#' @return a boolean
matches = function(uri) {
uri <- normalize_uri(uri, self$regex)
if (self$regex)
grepl(self$pattern, uri)
else
self$pattern_matches(uri) && self$query_params_matches(uri)
},
#' @description Match a URI
#' @param uri (character) a uri
#' @return a boolean
pattern_matches = function(uri) {
if (!self$regex) return(uri == self$pattern) # not regex
grepl(drop_query_params(self$pattern), uri) # regex
},
#' @description Match query parameters of a URI
#' @param uri (character) a uri
#' @return a boolean
query_params_matches = function(uri) {
identical(self$query_params, self$extract_query(uri))
},
#' @description Extract query parameters as a named list
#' @param uri (character) a uri
#' @return named list, or `NULL` if no query parameters
extract_query = function(uri) {
params <- parse_a_url(uri)$parameter
if (all(is.na(params))) return(NULL)
params
},
#' @description Add query parameters to the URI
#' @param query_params (list|character) list or character
#' @return nothing returned, updates uri pattern
add_query_params = function(query_params) {
if (self$regex) return(NULL)
if (missing(query_params) || is.null(query_params)) {
self$query_params <- self$extract_query(self$pattern)
} else {
self$query_params <- query_params
if (
inherits(query_params, "list") ||
inherits(query_params, "character")
) {
pars <- paste0(unname(Map(function(x, y) paste(x, esc(y), sep = "="),
names(query_params), query_params)), collapse = "&")
self$pattern <- paste0(self$pattern, "?", pars)
}
}
},
#' @description Print pattern for easy human consumption
#' @return a string
to_s = function() self$pattern
)
)
add_scheme <- function(x) {
if (is.na(urltools::url_parse(x)$scheme)) {
paste0('https?://', x)
} else {
x
}
}
esc <- function(x) curl::curl_escape(x)
normalize_uri <- function(x, regex = FALSE) {
x <- prune_trailing_slash(x)
x <- prune_port(x)
if (!regex)
if (is.na(urltools::url_parse(x)$scheme))
x <- paste0('http://', x)
tmp <- urltools::url_parse(x)
if (is.na(tmp$path)) return(x)
if (!regex) tmp$path <- esc(tmp$path)
urltools::url_compose(tmp)
}
prune_trailing_slash <- function(x) sub("/$", "", x)
prune_port <- function(x) gsub("(:80)|(:443)", "", x)
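# A quick illustration of the URI normalization helpers (the expected values
# are assumptions, shown as comments rather than run):
# normalize_uri("foobar.com:443/path/") #=> "http://foobar.com/path"
# prune_trailing_slash("http://x.com/") #=> "http://x.com"
# prune_port("http://example.com:80/x") #=> "http://example.com/x"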
# matcher helpers --------------------------
## URI stuff
is_url <- function(x) {
grepl("https?://", x, ignore.case = TRUE) ||
grepl("localhost:[0-9]{4}", x, ignore.case = TRUE)
}
is_localhost <- function(x) {
grepl("localhost|127.0.0.1|0.0.0.0", x, ignore.case = TRUE)
}
parse_a_url <- function(url) {
tmp <- urltools::url_parse(url)
tmp <- as.list(tmp)
if (!is.na(tmp$parameter)) {
tmp$parameter <- unlist(
lapply(
strsplit(tmp$parameter, "&")[[1]], function(x) {
z <- strsplit(x, split = "=")[[1]]
as.list(stats::setNames(z[2], z[1]))
}),
recursive = FALSE
)
}
tmp$default_port <- 443
return(tmp)
}
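# Sketch of what parse_a_url() returns for the query-string component
# (the expected value is an assumption, shown as a comment rather than run):
# parse_a_url("https://x.com/get?a=1&b=2")$parameter
# #=> list(a = "1", b = "2")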
uri_fetch <- function(x) {
x <- as.character(x)
tmp <- x[vapply(x, FUN = is_url, FUN.VALUE = logical(1))]
if (length(tmp) == 0) NULL else tmp
}
uri_host <- function(x) parse_a_url(x)$domain
uri_path <- function(x) parse_a_url(x)$path
uri_port <- function(x) parse_a_url(x)$port
drop_query_params <- function(x) {
x <- urltools::url_parse(x)
x$parameter <- NA_character_
x <- urltools::url_compose(x)
# prune trailing slash
sub("\\/$", "", x)
}
## http method
get_method <- function(x) {
x <- as.character(x)
tmp <- grep(
"(get)$|(post)$|(put)$|(delete)$|(options)$|(patch)$|(head)$",
tolower(x), value = TRUE)
tmp <- sub("httr::", "", tmp)
if (length(tmp) == 0) NULL else tmp
}
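# Illustration of get_method() (expected values are assumptions, not run):
# get_method("httr::GET")                   #=> "get"
# get_method(c("HttpClient", "get", "foo")) #=> "get"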
## query and body stuff
get_query <- function(x) {
if ("query" %in% names(x)) {
x[["query"]]
} else {
NULL
}
}
get_body <- function(x) {
if ("body" %in% names(x)) {
x[["body"]]
} else {
NULL
}
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/RequestPattern.R
|
#' @title HashCounter
#' @description hash with counter, to store requests, and count each time
#' it is used
#' @export
#' @family request-registry
#' @examples
#' x <- HashCounter$new()
#' x$hash
#' z <- RequestSignature$new(method = "get", uri = "https:/httpbin.org/get")
#' x$put(z)
#' x$hash
#' x$get(z)
#' x$put(z)
#' x$get(z)
HashCounter <- R6::R6Class(
'HashCounter',
public = list(
#' @field hash (list) a list for internal use only, with elements
#' `key`, `sig`, and `count`
hash = list(),
#' @description Register a request by its key
#' @param req_sig an object of class `RequestSignature`
#' @return nothing returned; registers the request and increments the
#' internal counter
put = function(req_sig) {
assert(req_sig, "RequestSignature")
key <- req_sig$to_s()
self$hash[[key]] <- list(
key = key,
sig = req_sig,
count = (self$hash[[key]]$count %||% 0) + 1
)
},
#' @description Get a request by key
#' @param req_sig an object of class `RequestSignature`
#' @return (integer) the count of how many times the request has been made
get = function(req_sig) {
assert(req_sig, "RequestSignature")
self$hash[[req_sig$to_s()]]$count %||% 0
}
)
)
#' @title RequestRegistry
#' @description keeps track of HTTP requests
#' @export
#' @family request-registry
#' @seealso [stub_registry()] and [StubRegistry]
#' @examples
#' x <- RequestRegistry$new()
#' z1 <- RequestSignature$new("get", "http://scottchamberlain.info")
#' z2 <- RequestSignature$new("post", "https://httpbin.org/post")
#' x$register_request(request = z1)
#' x$register_request(request = z1)
#' x$register_request(request = z2)
#' # print method to list requests
#' x
#'
#' # more complex requests
#' w <- RequestSignature$new(
#' method = "get",
#' uri = "https:/httpbin.org/get",
#' options = list(headers = list(`User-Agent` = "foobar", stuff = "things"))
#' )
#' w$to_s()
#' x$register_request(request = w)
#' x
#'
#'
#' # hashes, and number of times each requested
#' x$request_signatures$hash
#'
#' # times_executed method
#' pat <- RequestPattern$new(
#' method = "get",
#' uri = "https:/httpbin.org/get",
#' headers = list(`User-Agent` = "foobar", stuff = "things")
#' )
#' pat$to_s()
#' x$times_executed(pat)
#' z <- RequestPattern$new(method = "get", uri = "http://scottchamberlain.info")
#' x$times_executed(z)
#' w <- RequestPattern$new(method = "post", uri = "https://httpbin.org/post")
#' x$times_executed(w)
#'
#' ## pattern with no matches - returns 0 (zero)
#' pat <- RequestPattern$new(
#' method = "get",
#' uri = "http://recology.info/"
#' )
#' pat$to_s()
#' x$times_executed(pat)
#'
#' # reset the request registry
#' x$reset()
RequestRegistry <- R6::R6Class(
'RequestRegistry',
public = list(
#' @field request_signatures a HashCounter object
request_signatures = HashCounter$new(),
#' @description print method for the `RequestRegistry` class
#' @param x self
#' @param ... ignored
print = function(x, ...) {
cat("<webmockr request registry> ", sep = "\n")
cat(" Registered Requests", sep = "\n")
for (i in seq_along(self$request_signatures$hash)) {
cat(
sprintf(
" %s was made %s times\n",
names(self$request_signatures$hash)[i],
self$request_signatures$hash[[i]]$count
),
sep = "\n"
)
}
invisible(self$request_signatures$hash)
},
#' @description Reset the registry to no registered requests
#' @return nothing returned; resets the registry to no requests
reset = function() {
self$request_signatures <- HashCounter$new()
},
#' @description Register a request
#' @param request a character string of the request, serialized from
#' a `RequestSignature$new(...)$to_s()`
#' @return nothing returned; registers the request
register_request = function(request) {
self$request_signatures$put(request)
},
#' @description How many times has a request been made
#' @param request_pattern an object of class `RequestPattern`
#' @return integer, the number of times the request has been made
#' @details if no match is found for the request pattern, 0 is returned
times_executed = function(request_pattern) {
bools <- c()
for (i in seq_along(self$request_signatures$hash)) {
bools[i] <- request_pattern$matches(self$request_signatures$hash[[i]]$sig)
}
if (all(!bools)) return(0)
self$request_signatures$hash[bools][[1]]$count
}
)
)
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/RequestRegistry.R
|
#' @title RequestSignature
#' @description General purpose request signature builder
#' @export
#' @examples
#' # make request signature
#' x <- RequestSignature$new(method = "get", uri = "https:/httpbin.org/get")
#' # method
#' x$method
#' # uri
#' x$uri
#' # request signature to string
#' x$to_s()
#'
#' # headers
#' w <- RequestSignature$new(
#' method = "get",
#' uri = "https:/httpbin.org/get",
#' options = list(headers = list(`User-Agent` = "foobar", stuff = "things"))
#' )
#' w
#' w$headers
#' w$to_s()
#'
#' # headers and body
#' bb <- RequestSignature$new(
#' method = "get",
#' uri = "https:/httpbin.org/get",
#' options = list(
#' headers = list(`User-Agent` = "foobar", stuff = "things"),
#' body = list(a = "tables")
#' )
#' )
#' bb
#' bb$headers
#' bb$body
#' bb$to_s()
#'
#' # with disk path
#' f <- tempfile()
#' bb <- RequestSignature$new(
#' method = "get",
#' uri = "https:/httpbin.org/get",
#' options = list(disk = f)
#' )
#' bb
#' bb$disk
#' bb$to_s()
RequestSignature <- R6::R6Class(
'RequestSignature',
public = list(
#' @field method (character) an http method
method = NULL,
#' @field uri (character) a uri
uri = NULL,
#' @field body (various) request body
body = NULL,
#' @field headers (list) named list of headers
headers = NULL,
#' @field proxies (list) proxies as a named list
proxies = NULL,
#' @field auth (list) authentication details, as a named list
auth = NULL,
#' @field url internal use
url = NULL,
#' @field disk (character) if writing to disk, the path
disk = NULL,
#' @field fields (various) request body details
fields = NULL,
#' @field output (various) request output details, disk, memory, etc
output = NULL,
#' @description Create a new `RequestSignature` object
#' @param method the HTTP method (any, head, options, get, post, put,
#' patch, trace, or delete). "any" matches any HTTP method. required.
#' @param uri (character) request URI. required.
#' @param options (list) options. optional. See Details.
#' @return A new `RequestSignature` object
initialize = function(method, uri, options = list()) {
verb <- match.arg(tolower(method), http_verbs)
self$method <- verb
self$uri <- uri
self$url$url <- uri
if (length(options)) private$assign_options(options)
},
#' @description print method for the `RequestSignature` class
#' @param x self
#' @param ... ignored
print = function() {
cat("<RequestSignature> ", sep = "\n")
cat(paste0(" method: ", toupper(self$method)), sep = "\n")
cat(paste0(" uri: ", self$uri), sep = "\n")
if (!is.null(self$body)) {
cat(" body: ", sep = "\n")
if (inherits(self$body, "form_file")) {
cat(paste0(" ",
sprintf("type=%s; path=%s", self$body$type, self$body$path)),
sep = "\n")
} else {
cat_foo(self$body)
}
}
if (!is.null(self$headers)) {
cat(" headers: ", sep = "\n")
cat_foo(self$headers)
}
if (!is.null(self$proxies)) {
cat(" proxies: ", sep = "\n")
cat_foo(self$proxies)
}
if (!is.null(self$auth)) {
cat(" auth: ", sep = "\n")
cat_foo(self$auth)
}
if (!is.null(self$disk)) {
cat(paste0(" disk: ", self$disk), sep = "\n")
}
if (!is.null(self$fields)) {
cat(" fields: ", sep = "\n")
cat_foo(self$fields)
}
},
#' @description Request signature to a string
#' @return a character string representation of the request signature
to_s = function() {
gsub("^\\s+|\\s+$", "", paste(
paste0(toupper(self$method), ": "),
self$uri,
if (!is.null(self$body) && length(self$body)) {
paste0(" with body ", to_string(self$body))
},
if (!is.null(self$headers) && length(self$headers)) {
paste0(
" with headers ",
sprintf("{%s}",
paste(names(self$headers),
unlist(unname(self$headers)), sep = ": ",
collapse = ", "))
)
}
))
}
),
private = list(
assign_options = function(options) {
op_vars <- c("body", "headers", "proxies", "auth",
"disk", "fields", "output")
for (i in seq_along(op_vars)) {
if (op_vars[i] %in% names(options)) {
if (!is.null(options[[ op_vars[i] ]]) && length(options)) {
self[[ op_vars[i] ]] <- options[[ op_vars[i] ]]
}
}
}
}
)
)
cat_foo <- function(x) {
cat(paste0(" ",
paste0(paste(names(x), x, sep = ": "),
collapse = "\n ")), sep = "\n")
}
to_string <- function(x) {
if (inherits(x, "list") && all(nchar(names(x)) > 0)) {
tmp <- paste0(paste(names(x), x, sep = ": "), collapse = ", ")
} else if (inherits(x, "list") && any(nchar(names(x)) == 0)) {
tmp <- paste0(paste(names(x), x, sep = ": "), collapse = ", ")
} else if (inherits(x, "form_file")) {
tmp <- sprintf("type=%s; path=%s", x$type, x$path)
} else {
tmp <- paste0(x, collapse = ", ")
}
return(sprintf("{%s}", tmp))
}
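# Illustration of to_string() as used in RequestSignature$to_s()
# (the expected value is an assumption, not run):
# to_string(list(a = 5, b = "x")) #=> "{a: 5, b: x}"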
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/RequestSignature.R
|
#' @title Response
#' @description custom webmockr http response class
#' @export
#' @examples \dontrun{
#' (x <- Response$new())
#'
#' x$set_url("https://httpbin.org/get")
#' x
#'
#' x$set_request_headers(list('Content-Type' = "application/json"))
#' x
#' x$request_headers
#'
#' x$set_response_headers(list('Host' = "httpbin.org"))
#' x
#' x$response_headers
#'
#' x$set_status(404)
#' x
#' x$get_status()
#'
#' x$set_body("hello world")
#' x
#' x$get_body()
#' # raw body
#' x$set_body(charToRaw("hello world"))
#' x
#' x$get_body()
#'
#' x$set_exception("exception")
#' x
#' x$get_exception()
#' }
Response <- R6::R6Class(
'Response',
public = list(
#' @field url (character) a url
url = NULL,
#' @field body (various) list, character, etc
body = NULL,
#' @field content (various) response content/body
content = NULL,
#' @field request_headers (list) a named list
request_headers = NULL,
#' @field response_headers (list) a named list
response_headers = NULL,
#' @field options (character) list
options = NULL,
#' @field status_code (integer) an http status code
status_code = 200,
#' @field exception (character) an exception message
exception = NULL,
#' @field should_timeout (logical) should the response timeout?
should_timeout = NULL,
#' @description Create a new `Response` object
#' @param options (list) a list of options
#' @return A new `Response` object
initialize = function(options = list()) {
if (inherits(options, "file") || inherits(options, "character")) {
self$options <- read_raw_response(options)
} else {
self$options <- options
}
},
#' @description print method for the `Response` class
#' @param x self
#' @param ... ignored
print = function(x, ...) {
cat("<webmockr response> ", sep = "\n")
cat(paste0(" url: ", self$url), sep = "\n")
cat(paste0(" status: ", self$status_code), sep = "\n")
cat(" headers: ", sep = "\n")
for (i in seq_along(self$request_headers)) {
cat(" request headers: ", sep = "\n")
cat(paste0(" ",
paste(names(self$request_headers)[i], self$request_headers[[i]],
sep = ": ")), sep = "\n")
}
for (i in seq_along(self$response_headers)) {
cat(" response headers: ", sep = "\n")
cat(paste0(" ",
paste(names(self$response_headers)[i], self$response_headers[[i]],
sep = ": ")), sep = "\n")
}
cat(paste0(" exception: ", self$exception), sep = "\n")
cat(paste0(" body length: ", length(self$body)), sep = "\n")
},
#' @description set the url for the response
#' @param url (character) a url
#' @return nothing returned; sets url
set_url = function(url) {
self$url <- url
},
#' @description get the url for the response
#' @return (character) a url
get_url = function() self$url,
#' @description set the request headers for the response
#' @param headers (list) named list
#' @param capitalize (logical) whether to capitalize first letters of
#' each header; default: `TRUE`
#' @return nothing returned; sets request headers on the response
set_request_headers = function(headers, capitalize = TRUE) {
self$request_headers <- private$normalize_headers(headers, capitalize)
},
#' @description get the request headers for the response
#' @return (list) request headers, a named list
get_request_headers = function() self$request_headers,
#' @description set the response headers for the response
#' @param headers (list) named list
#' @param capitalize (logical) whether to capitalize first letters of
#' each header; default: `TRUE`
#' @return nothing returned; sets response headers on the response
set_response_headers = function(headers, capitalize = TRUE) {
self$response_headers <- private$normalize_headers(headers, capitalize)
},
#' @description get the response headers for the response
#' @return (list) response headers, a named list
get_respone_headers = function() self$response_headers,
#' @description set the body of the response
#' @param body (various types)
#' @param disk (logical) whether its on disk; default: `FALSE`
#' @return nothing returned; sets body on the response
set_body = function(body, disk = FALSE) {
self$body <- body
self$content <- if (is.character(body)) {
stopifnot(length(body) <= 1)
if (disk) body else charToRaw(body)
} else if (is.raw(body)) {
body
} else {
raw(0)
}
},
#' @description get the body of the response
#' @return various
get_body = function() self$body %||% '',
#' @description set the http status of the response
#' @param status (integer) the http status
#' @return nothing returned; sets the http status of the response
set_status = function(status) {
self$status_code <- status
},
#' @description get the http status of the response
#' @return (integer) the http status
get_status = function() self$status_code %||% 200,
#' @description set an exception
#' @param exception (character) an exception string
#' @return nothing returned; sets an exception
set_exception = function(exception) {
self$exception <- exception
},
#' @description get the exception, if set
#' @return (character) an exception
get_exception = function() self$exception
),
private = list(
normalize_headers = function(x, capitalize = TRUE) normalize_headers(x, capitalize)
)
)
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/Response.R
|
#' @title StubRegistry
#' @description stub registry to keep track of [StubbedRequest] stubs
#' @export
#' @family stub-registry
#' @examples \dontrun{
#' # Make a stub
#' stub1 <- StubbedRequest$new(method = "get", uri = "api.crossref.org")
#' stub1$with(headers = list('User-Agent' = 'R'))
#' stub1$to_return(status = 200, body = "foobar", headers = list())
#' stub1
#'
#' # Make another stub
#' stub2 <- StubbedRequest$new(method = "get", uri = "api.crossref.org")
#' stub2
#'
#' # Put both stubs in the stub registry
#' reg <- StubRegistry$new()
#' reg$register_stub(stub = stub1)
#' reg$register_stub(stub = stub2)
#' reg
#' reg$request_stubs
#' }
StubRegistry <- R6::R6Class(
"StubRegistry",
public = list(
#' @field request_stubs (list) list of request stubs
request_stubs = list(),
#' @field global_stubs (list) list of global stubs
global_stubs = list(),
#' @description print method for the `StubRegistry` class
#' @param x self
#' @param ... ignored
print = function(x, ...) {
cat("<webmockr stub registry> ", sep = "\n")
cat(" Registered Stubs", sep = "\n")
for (i in seq_along(self$request_stubs)) {
cat(" ", self$request_stubs[[i]]$to_s(), "\n")
}
invisible(self$request_stubs)
},
#' @description Register a stub
#' @param stub an object of type [StubbedRequest]
#' @return nothing returned; registers the stub
register_stub = function(stub) {
self$request_stubs <- Filter(length, c(self$request_stubs, stub))
},
#' @description Find a stubbed request
#' @param req an object of class [RequestSignature]
#' @return an object of type [StubbedRequest], if matched
find_stubbed_request = function(req) {
stubs <- c(self$global_stubs, self$request_stubs)
stubs[self$request_stub_for(req)]
},
# response_for_request = function(request_signature) {
# stub <- self$request_stub_for(request_signature)
# evaluate_response_for_request(stub$response, request_signature) %||% NULL
# },
#' @description Find a stubbed request
#' @param request_signature an object of class [RequestSignature]
#' @param count (bool) whether to increment the counters of matching stubs. default: `TRUE`
#' @return a logical vector, one element per registered stub
request_stub_for = function(request_signature, count = TRUE) {
stubs <- c(self$global_stubs, self$request_stubs)
mtchs <- vapply(stubs, function(z) {
tmp <- RequestPattern$new(method = z$method, uri = z$uri,
uri_regex = z$uri_regex, query = z$query,
body = z$body, headers = z$request_headers)
tmp$matches(request_signature)
}, logical(1))
if (count) {
for (i in seq_along(stubs)) {
if (mtchs[i]) stubs[[i]]$counter$put(request_signature)
}
}
return(mtchs)
},
#' @description Remove a stubbed request by matching request signature
#' @param stub an object of type [StubbedRequest]
#' @return nothing returned; removes the stub from the registry
remove_request_stub = function(stub) {
xx <- vapply(self$request_stubs, function(x) x$to_s(), "")
if (stub$to_s() %in% xx) {
# drop the registered stub(s) whose string form matches the given stub
self$request_stubs <- self$request_stubs[-which(xx %in% stub$to_s())]
} else {
stop(
"Request stub \n\n ",
stub$to_s(),
"\n\n is not registered.",
call. = FALSE
)
}
},
#' @description Remove all request stubs
#' @return nothing returned; removes all request stubs
remove_all_request_stubs = function() {
for (stub in self$request_stubs) {
if (inherits(stub, "StubbedRequest")) stub$reset()
}
self$request_stubs <- list()
},
#' @description Check whether a request signature is registered
#' @param x an object of class [RequestSignature]
#' @return a boolean
is_registered = function(x) any(self$request_stub_for(x, count = FALSE))
)
)
json_validate <- function(x) {
res <- tryCatch(jsonlite::validate(x), error = function(e) e)
if (inherits(res, "error")) return(FALSE)
res
}
# make body info for print method
make_body <- function(x) {
if (is.null(x)) return("")
if (inherits(x, "mock_file")) x <- x$payload
if (inherits(x, "form_file")) x <- unclass(x)
clzzes <- vapply(x, function(z) inherits(z, "form_file"), logical(1))
if (any(clzzes)) for(i in seq_along(x)) x[[i]] <- unclass(x[[i]])
if (json_validate(x))
body <- x
else
body <- jsonlite::toJSON(x, auto_unbox = TRUE)
paste0(" with body ", body)
}
# make headers info for print method
make_headers <- function(x) {
if (is.null(x)) return("")
paste0(" with headers ", jsonlite::toJSON(x, auto_unbox = TRUE))
}
# make body info for print method
make_status <- function(x) {
if (is.null(x)) return("")
paste0(" with status ", as.character(x))
}
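# Illustration of the print helpers (expected values are assumptions, not run):
# make_status(200)          #=> " with status 200"
# make_headers(list(a = 5)) #=> " with headers {\"a\":5}"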
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/StubRegistry.R
|
#' @title StubCounter
#' @description hash with counter to store requests and count number
#' of requests made against the stub
#' @export
#' @examples
#' x <- StubCounter$new()
#' x
#' x$hash
#' x$count()
#' z <- RequestSignature$new(method = "get", uri = "https:/httpbin.org/get")
#' x$put(z)
#' x$count()
#' x$put(z)
#' x$count()
StubCounter <- R6::R6Class(
'StubCounter',
public = list(
#' @field hash (list) a list for internal use only, with elements
#' `key`, `sig`, and `count`
hash = list(),
#' @description Register a request by its key
#' @param x an object of class `RequestSignature`
#' @return nothing returned; registers the request & increments the internal counter
put = function(x) {
assert(x, "RequestSignature")
key <- x$to_s()
self$hash[[key]] <- list(key = key, sig = x)
private$total <- private$total + 1
},
#' @description Get the count of number of times any matching request has
#' been made against this stub
count = function() {
private$total
}
),
private = list(
total = 0
)
)
#' @title StubbedRequest
#' @description stubbed request class underlying [stub_request()]
#' @export
#' @seealso [stub_request()]
#' @examples \dontrun{
#' x <- StubbedRequest$new(method = "get", uri = "api.crossref.org")
#' x$method
#' x$uri
#' x$with(headers = list('User-Agent' = 'R', apple = "good"))
#' x$to_return(status = 200, body = "foobar", headers = list(a = 5))
#' x
#' x$to_s()
#'
#' # many to_return's
#' x <- StubbedRequest$new(method = "get", uri = "httpbin.org")
#' x$to_return(status = 200, body = "foobar", headers = list(a = 5))
#' x$to_return(status = 200, body = "bears", headers = list(b = 6))
#' x
#' x$to_s()
#'
#' # raw body
#' x <- StubbedRequest$new(method = "get", uri = "api.crossref.org")
#' x$to_return(status = 200, body = raw(0), headers = list(a = 5))
#' x$to_s()
#' x
#'
#' x <- StubbedRequest$new(method = "get", uri = "api.crossref.org")
#' x$to_return(status = 200, body = charToRaw("foo bar"),
#' headers = list(a = 5))
#' x$to_s()
#' x
#'
#' # basic auth
#' x <- StubbedRequest$new(method = "get", uri = "api.crossref.org")
#' x$with(basic_auth = c("foo", "bar"))
#' x$to_s()
#' x
#'
#' # file path
#' x <- StubbedRequest$new(method = "get", uri = "api.crossref.org")
#' f <- tempfile()
#' x$to_return(status = 200, body = file(f), headers = list(a = 5))
#' x
#' x$to_s()
#' unlink(f)
#'
#' # to_file(): file path and payload to go into the file
#' # payload written to file during mocked response creation
#' x <- StubbedRequest$new(method = "get", uri = "api.crossref.org")
#' f <- tempfile()
#' x$to_return(status = 200, body = mock_file(f, "{\"foo\": \"bar\"}"),
#' headers = list(a = 5))
#' x
#' x$to_s()
#' unlink(f)
#'
#' # uri_regex
#' (x <- StubbedRequest$new(method = "get", uri_regex = ".+ossref.org"))
#' x$method
#' x$uri_regex
#' x$to_s()
#'
#' # to timeout
#' (x <- StubbedRequest$new(method = "get", uri_regex = ".+ossref.org"))
#' x$to_s()
#' x$to_timeout()
#' x$to_s()
#' x
#'
#' # to raise
#' library(fauxpas)
#' (x <- StubbedRequest$new(method = "get", uri_regex = ".+ossref.org"))
#' x$to_s()
#' x$to_raise(HTTPBadGateway)
#' x$to_s()
#' x
#' }
StubbedRequest <- R6::R6Class(
"StubbedRequest",
public = list(
#' @field method (character) the HTTP method
method = NULL,
#' @field uri (character) the request URI
uri = NULL,
#' @field uri_regex (character) the request URI as a regular expression
uri_regex = NULL,
#' @field uri_parts (list) the parsed components of the URI
uri_parts = NULL,
#' @field host (character) the request host
host = NULL,
#' @field query (list) query parameters
query = NULL,
#' @field body (various) the request body
body = NULL,
#' @field basic_auth (character) basic authentication credentials
basic_auth = NULL,
#' @field request_headers (list) request headers
request_headers = NULL,
#' @field response_headers (list) response headers
response_headers = NULL,
#' @field responses_sequences (list) the sequence of responses to return
responses_sequences = NULL,
#' @field status_code (integer) the HTTP status code
status_code = NULL,
#' @field counter a StubCounter object
counter = NULL,
#' @description Create a new `StubbedRequest` object
#' @param method the HTTP method (any, head, get, post, put,
#' patch, or delete). "any" matches any HTTP method. required.
#' @param uri (character) request URI. either this or `uri_regex`
#' required. \pkg{webmockr} can match uri's without the "http" scheme,
#' but does not match if the scheme is "https". required, unless
#' `uri_regex` given. See [UriPattern] for more.
#' @param uri_regex (character) request URI as regex. either this or `uri`
#' required
#' @return A new `StubbedRequest` object
initialize = function(method, uri = NULL, uri_regex = NULL) {
if (!missing(method)) {
verb <- match.arg(tolower(method), http_verbs)
self$method <- verb
}
if (is.null(uri) && is.null(uri_regex)) {
stop("one of uri or uri_regex is required", call. = FALSE)
}
self$uri <- uri
self$uri_regex <- uri_regex
if (!is.null(uri)) self$uri_parts <- parseurl(self$uri)
self$counter <- StubCounter$new()
},
#' @description print method for the `StubbedRequest` class
#' @param x self
#' @param ... ignored
print = function(x, ...) {
cat("<webmockr stub> ", sep = "\n")
cat(paste0(" method: ", self$method), sep = "\n")
cat(paste0(" uri: ", self$uri %||% self$uri_regex), sep = "\n")
cat(" with: ", sep = "\n")
cat(paste0(" query: ", hdl_lst(self$query)), sep = "\n")
if (is.null(self$body))
cat(" body: ", sep = "\n")
else
cat(sprintf(" body (class: %s): %s", class(self$body)[1L],
hdl_lst(self$body)), sep = "\n")
cat(paste0(" request_headers: ",
hdl_lst(self$request_headers)),
sep = "\n")
cat(" to_return: ", sep = "\n")
rs <- self$responses_sequences
for (i in seq_along(rs)) {
cat(paste0(" - status: ", hdl_lst(rs[[i]]$status)),
sep = "\n")
cat(paste0(" body: ", hdl_lst(rs[[i]]$body)),
sep = "\n")
cat(paste0(" response_headers: ",
hdl_lst(rs[[i]]$headers)),
sep = "\n")
cat(paste0(" should_timeout: ", rs[[i]]$timeout), sep = "\n")
cat(paste0(" should_raise: ",
if (rs[[i]]$raise)
paste0(vapply(rs[[i]]$exceptions, "[[", "", "classname"),
collapse = ", ")
else "FALSE"
), sep = "\n")
}
},
#' @description Set expectations for what's given in HTTP request
#' @param query (list) request query params, as a named list. optional
#' @param body (list) request body, as a named list. optional
#' @param headers (list) request headers as a named list. optional.
#' @param basic_auth (character) basic authentication. optional.
#' @return nothing returned; sets only
with = function(query = NULL, body = NULL, headers = NULL, basic_auth = NULL) {
if (!is.null(query)) {
query <- lapply(query, as.character)
}
self$query <- query
self$body <- body
self$basic_auth <- basic_auth
if (!is.null(basic_auth)) {
headers <- c(prep_auth(paste0(basic_auth, collapse = ':')), headers)
}
self$request_headers <- headers
},
#' @description Set expectations for what's returned in the HTTP response
#' @param status (numeric) an HTTP status code
#' @param body (list) response body, one of: `character`, `json`,
#' `list`, `raw`, `numeric`, `NULL`, `FALSE`, or a file connection
#' (other connection types are not supported)
#' @param headers (list) named list, response headers. optional
#' @return nothing returned; only sets what is to be returned
to_return = function(status, body, headers) {
body <- if (inherits(body, "connection")) {
bod_sum <- summary(body)
close.connection(body)
if (bod_sum$class != "file")
stop("'to_return' only supports connections of type 'file'")
structure(bod_sum$description, type = "file")
} else {
body
}
self$response_headers <- headers # FIXME: kept for now; remove eventually
body_raw <- {
if (inherits(body, "mock_file")) {
body
} else if (inherits(body, "logical")) {
if (!body) {
raw()
} else {
webmockr_stub_registry$remove_request_stub(self)
stop(paste0("Unknown type of `body`: ",
"must be NULL, FALSE, character, raw or list; stub removed"),
call. = FALSE)
}
} else if (inherits(body, "raw")) {
body
} else if (is.null(body)) {
raw()
} else if (is.character(body) || inherits(body, "json")) {
if (!is.null(attr(body, "type"))) {
stopifnot(attr(body, "type") == "file")
body
} else {
charToRaw(body)
}
} else if (!is.list(body)) {
webmockr_stub_registry$remove_request_stub(self)
stop(paste0("Unknown type of `body`: ",
"must be numeric, NULL, FALSE, character, json, ",
"raw, list, or file connection; stub removed"),
call. = FALSE)
} else {
charToRaw(jsonlite::toJSON(body, auto_unbox = TRUE))
}
}
private$append_response(
private$response(
status = status,
body = body,
headers = headers,
body_raw = body_raw
)
)
},
#' @description Response should time out
#' @return nothing returned
to_timeout = function() {
private$append_response(private$response(timeout = TRUE))
},
#' @description Response should raise an exception `x`
#' @param x (character) an exception message
#' @return nothing returned
to_raise = function(x) {
private$append_response(
private$response(
raise = TRUE,
exceptions = if (inherits(x, "list")) x else list(x)
)
)
},
#' @description Response as a character string
#' @return (character) the response as a string
to_s = function() {
ret <- self$responses_sequences
gsub("^\\s+|\\s+$", "", sprintf(
" %s: %s %s %s %s",
toupper(self$method),
url_builder(self$uri %||% self$uri_regex, self$query),
make_body(self$body),
make_headers(self$request_headers),
if (length(ret) > 0) {
strgs <- c()
for (i in seq_along(ret)) {
bd <- make_body(ret[[i]]$body)
stt <- make_status(ret[[i]]$status)
hed <- make_headers(ret[[i]]$headers)
strgs[i] <- sprintf("%s %s %s",
if (nzchar(paste0(bd, stt, hed))) paste("| to_return: ", bd, stt, hed) else "",
if (ret[[i]]$timeout) "| should_timeout: TRUE" else "",
if (ret[[i]]$raise)
paste0("| to_raise: ",
paste0(vapply(ret[[i]]$exceptions, "[[", "", "classname"),
collapse = ", "))
else ""
)
}
paste0(strgs, collapse = " ")
} else {
""
}
))
},
#' @description Reset the counter for the stub
#' @return nothing returned; resets stub counter to no requests
reset = function() {
self$counter <- StubCounter$new()
}
),
private = list(
append_response = function(x) {
self$responses_sequences <- cc(c(self$responses_sequences, list(x)))
},
response = function(status = NULL, body = NULL, headers = NULL,
body_raw = NULL, timeout = FALSE, raise = FALSE, exceptions = list()
) {
list(
status = status,
body = body,
headers = headers,
body_raw = body_raw,
timeout = timeout,
raise = raise,
exceptions = exceptions
)
}
)
)
basic_auth_header <- function(x) {
assert(x, "character")
stopifnot(length(x) == 1)
encoded <- base64enc::base64encode(charToRaw(x))
return(paste0("Basic ", encoded))
}
prep_auth <- function(x) {
if (is.null(x)) return(NULL)
if (!is.null(x)) {
list(Authorization = basic_auth_header(x))
}
}
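# Illustrative sketch (not run): how the two unexported helpers above compose
# a basic auth header; the credentials below are hypothetical.
# basic_auth_header("user:passwd")
# #> "Basic dXNlcjpwYXNzd2Q="
# prep_auth("user:passwd")
# #> list(Authorization = "Basic dXNlcjpwYXNzd2Q=")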
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/StubbedRequest.R
|
#' Build a crul response
#' @export
#' @param req a request
#' @param resp a response
#' @return a crul response
build_crul_response <- function(req, resp) {
# prep headers
if (grepl("^ftp://", resp$url %||% "")) { # in case uri_regex only
headers <- list()
} else {
hds <- resp$headers
if (is.null(hds)) {
hds <- resp$response_headers
headers <- if (is.null(hds)) {
list()
} else {
stopifnot(is.list(hds))
stopifnot(is.character(hds[[1]]))
hds
}
} else {
hh <- rawToChar(hds %||% raw(0))
if (is.null(hh) || nchar(hh) == 0) {
headers <- list()
} else {
headers <- lapply(curl::parse_headers(hh, multiple = TRUE),
crul_headers_parse)
}
}
}
crul::HttpResponse$new(
method = req$method,
# if resp URL is empty, use URL from request
url = resp$url %||% req$url$url,
status_code = resp$status_code,
request_headers = c('User-Agent' = req$options$useragent, req$headers),
response_headers = {
if (all(hz_namez(headers))) headers else last(headers)
},
response_headers_all = headers,
modified = resp$modified %||% NA,
times = resp$times,
content = resp$content,
handle = req$url$handle,
request = req
)
}
#' Build a crul request
#' @export
#' @param x an unexecuted crul request object
#' @return a crul request
build_crul_request = function(x) {
headers <- x$headers %||% NULL
auth <- check_user_pwd(x$options$userpwd) %||% NULL
if (!is.null(auth)) {
auth_header <- prep_auth(auth)
headers <- c(headers, auth_header)
}
RequestSignature$new(
method = x$method,
uri = x$url$url,
options = list(
body = pluck_body(x),
headers = headers,
proxies = x$proxies %||% NULL,
auth = auth,
disk = x$disk %||% NULL
)
)
}
#' @rdname Adapter
#' @export
CrulAdapter <- R6::R6Class("CrulAdapter",
inherit = Adapter,
public = list(
#' @field client HTTP client package name
client = "crul",
#' @field name adapter name
name = "CrulAdapter"
),
private = list(
pluck_url = function(request) request$url$url,
mock = function(on) crul::mock(on),
build_request = build_crul_request,
build_response = build_crul_response,
fetch_request = function(request) {
private$build_response(request, webmockr_crul_fetch(request))
},
request_handler = function(request) vcr::RequestHandlerCrul$new(request),
update_vcr_disk_path = function(response) {
write_disk_path <- vcr::vcr_configuration()$write_disk_path
# if crul_resp$content is character, it must be a file path (I THINK?)
if (is.null(write_disk_path)) {
stop("if writing to disk, write_disk_path must be given; ",
"see ?vcr::vcr_configure")
}
response$content <- file.path(
write_disk_path,
basename(response$content)
)
response
}
)
)
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/adapter-crul.R
|
#' Build a httr response
#' @export
#' @param req a request
#' @param resp a response
#' @return a httr response
build_httr_response <- function(req, resp) {
try_url <- tryCatch(resp$url, error = function(e) e)
lst <- list(
url = try_url %|s|% req$url,
status_code = as.integer(resp$status_code),
headers = {
if (grepl("^ftp://", resp$url %||% "")) { # in case uri_regex only
list()
} else {
hds <- resp$headers
if (is.null(hds)) {
hds <- resp$response_headers
if (is.null(hds)) {
list()
} else {
stopifnot(is.list(hds))
stopifnot(is.character(hds[[1]]))
httr::insensitive(hds)
}
} else {
httr::insensitive(hds)
}
}
},
all_headers = list(),
cookies = httr_cookies_df(),
content = resp$content,
date = {
if (!is.null(resp$response_headers$date)) {
httr::parse_http_date(resp$response_headers$date)
} else {
Sys.time()
}
},
times = numeric(0),
request = req,
handle = NA
)
lst$all_headers <- list(list(
status = lst$status_code,
version = "",
headers = lst$headers
))
structure(lst, class = "response")
}
httr_cookies_df <- function() {
df <- data.frame(matrix(ncol = 7, nrow = 0))
x <- c("domain", "flag", "path", "secure", "expiration", "name", "value")
colnames(df) <- x
df
}
# x = "https://foobar.com"
# check_user_pwd(x)
check_user_pwd <- function(x) {
if (is.null(x)) return(x)
if (grepl("^https?://", x)) {
stop(sprintf("expecting string of pattern 'user:pwd', got '%s'", x))
}
return(x)
}
#' Build a httr request
#' @export
#' @param x an unexecuted httr request object
#' @return a httr request
build_httr_request = function(x) {
headers <- as.list(x$headers) %||% NULL
auth <- check_user_pwd(x$options$userpwd) %||% NULL
if (!is.null(auth)) {
auth_header <- prep_auth(auth)
headers <- c(headers, auth_header)
}
RequestSignature$new(
method = x$method,
uri = x$url,
options = list(
body = pluck_body(x),
headers = headers,
proxies = x$proxies %||% NULL,
auth = auth,
disk = x$disk %||% NULL,
fields = x$fields %||% NULL,
output = x$output %||% NULL
)
)
}
#' Turn on httr mocking
#' Sets a callback that routes httr requests through webmockr
#'
#' @export
#' @param on (logical) set to `TRUE` to turn on, and `FALSE`
#' to turn off. default: `TRUE`
#' @return Silently returns `TRUE` when enabled and `FALSE` when disabled.
httr_mock <- function(on = TRUE) {
check_for_pkg("httr")
webmockr_handle <- function(req) {
webmockr::HttrAdapter$new()$handle_request(req)
}
if (on) {
httr::set_callback("request", webmockr_handle)
} else {
httr::set_callback("request", NULL)
}
invisible(on)
}
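# Usage sketch (not run; assumes the httr package is installed):
# library(httr)
# httr_mock()       # route httr requests through webmockr
# # ... stubbed requests happen here ...
# httr_mock(FALSE)  # restore httr's normal behavior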
#' @rdname Adapter
#' @export
HttrAdapter <- R6::R6Class("HttrAdapter",
inherit = Adapter,
public = list(
#' @field client HTTP client package name
client = "httr",
#' @field name adapter name
name = "HttrAdapter"
),
private = list(
pluck_url = function(request) request$url,
mock = function(on) httr_mock(on),
build_request = build_httr_request,
build_response = build_httr_response,
request_handler = function(request) vcr::RequestHandlerHttr$new(request),
fetch_request = function(request) {
METHOD <- eval(parse(text = paste0("httr::", request$method)))
METHOD(
private$pluck_url(request),
body = pluck_body(request),
do.call(httr::config, request$options),
httr::add_headers(request$headers),
if (!is.null(request$output$path)) {
httr::write_disk(request$output$path, TRUE)
}
)
}
)
)
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/adapter-httr.R
|
#' @title Adapters for Modifying HTTP Requests
#' @description `Adapter` is the base parent class used to implement
#' \pkg{webmockr} support for different HTTP clients. It should not be used
#' directly. Instead, use one of the client-specific adapters that webmockr
#' currently provides:
#' * `CrulAdapter` for \pkg{crul}
#' * `HttrAdapter` for \pkg{httr}
#' @details Note that the documented fields and methods are the same across all
#' client-specific adapters.
#' @export
#' @examples \dontrun{
#' if (requireNamespace("httr", quietly = TRUE)) {
#' # library(httr)
#'
#' # normal httr request, works fine
#' # real <- GET("https://httpbin.org/get")
#' # real
#'
#' # with webmockr
#' # library(webmockr)
#' ## turn on httr mocking
#' # httr_mock()
#' ## now this request isn't allowed
#' # GET("https://httpbin.org/get")
#' ## stub the request
#' # stub_request('get', uri = 'https://httpbin.org/get') %>%
#' # wi_th(
#' # headers = list('Accept' = 'application/json, text/xml, application/xml, */*')
#' # ) %>%
#' # to_return(status = 418, body = "I'm a teapot!", headers = list(a = 5))
#' ## now the request succeeds and returns a mocked response
#' # (res <- GET("https://httpbin.org/get"))
#' # res$status_code
#' # rawToChar(res$content)
#'
#' # allow real requests while webmockr is loaded
#' # webmockr_allow_net_connect()
#' # webmockr_net_connect_allowed()
#' # GET("https://httpbin.org/get?animal=chicken")
#' # webmockr_disable_net_connect()
#' # webmockr_net_connect_allowed()
#' # GET("https://httpbin.org/get?animal=chicken")
#'
#' # httr_mock(FALSE)
#' }
#' }
Adapter <- R6::R6Class("Adapter",
public = list(
#' @field client HTTP client package name
client = NULL,
#' @field name adapter name
name = NULL,
#' @description Create a new Adapter object
initialize = function() {
if (is.null(self$client)) {
stop(
"Adapter parent class should not be called directly.\n",
"Use one of the following package-specific adapters instead:\n",
" - CrulAdapter$new()\n",
" - HttrAdapter$new()",
call. = FALSE
)
}
},
#' @description Enable the adapter
#' @param quiet (logical) suppress messages? default: `FALSE`
#' @return `TRUE`, invisibly
enable = function(quiet = FALSE) {
assert(quiet, "logical")
if (!quiet) message(sprintf("%s enabled!", self$name))
webmockr_lightswitch[[self$client]] <- TRUE
switch(self$client,
crul = crul::mock(on = TRUE),
httr = httr_mock(on = TRUE)
)
},
#' @description Disable the adapter
#' @param quiet (logical) suppress messages? default: `FALSE`
#' @return `FALSE`, invisibly
disable = function(quiet = FALSE) {
assert(quiet, "logical")
if (!quiet) message(sprintf("%s disabled!", self$name))
webmockr_lightswitch[[self$client]] <- FALSE
self$remove_stubs()
switch(self$client,
crul = crul::mock(on = FALSE),
httr = httr_mock(on = FALSE)
)
},
#' @description All logic for handling a request
#' @param req a request
#' @return various outcomes
handle_request = function(req) {
# put request in request registry
request_signature <- private$build_request(req)
webmockr_request_registry$register_request(
request = request_signature
# request = request_signature$to_s()
)
if (request_is_in_cache(request_signature)) {
# if real requests NOT allowed
# even if net connects allowed, we check if stubbed found first
ss <- webmockr_stub_registry$find_stubbed_request(request_signature)[[1]]
# if user wants to return a partial object
# get stub with response and return that
resp <- private$build_stub_response(ss)
# generate response
# VCR: recordable/ignored
if (vcr_cassette_inserted()) {
# req <- handle_separate_redirects(req)
# use RequestHandler - gets current cassette & record interaction
resp <- private$request_handler(req)$handle()
# if written to disk, see if we should modify file path
if (self$client == "crul" && is.character(resp$content)) {
resp <- private$update_vcr_disk_path(resp)
}
# no vcr
} else {
resp <- private$build_response(req, resp)
# add to_return() elements if given
resp <- private$add_response_sequences(ss, resp)
}
# request is not in cache but connections are allowed
} else if (webmockr_net_connect_allowed(uri = private$pluck_url(req))) {
# if real requests || localhost || certain exceptions ARE
# allowed && nothing found above
# if vcr loaded: record http interaction into vcr namespace
# VCR: recordable
if (vcr_loaded()) {
# req <- handle_separate_redirects(req)
# use RequestHandler instead? - which gets current cassette for us
resp <- private$request_handler(req)$handle()
# if written to disk, see if we should modify file path
if (self$client == "crul" && is.character(resp$content)) {
if (file.exists(resp$content)) {
resp <- private$update_vcr_disk_path(resp)
}
}
# stub request so next time we match it
req_url <- private$pluck_url(req)
urip <- crul::url_parse(req_url)
m <- vcr::vcr_configuration()$match_requests_on
if (all(m %in% c("method", "uri")) && length(m) == 2) {
stub_request(req$method, req_url)
} else if (all(m %in% c("method", "uri", "query")) && length(m) == 3) {
tmp <- stub_request(req$method, req_url)
wi_th(tmp, .list = list(query = urip$parameter))
} else if (all(m %in% c("method", "uri", "headers")) && length(m) == 3) {
tmp <- stub_request(req$method, req_url)
wi_th(tmp, .list = list(headers = req$headers))
} else if (all(m %in% c("method", "uri", "headers", "query")) && length(m) == 4) {
tmp <- stub_request(req$method, req_url)
wi_th(tmp, .list = list(query = urip$parameter, headers = req$headers))
}
# check if new request/response from redirects in vcr
# req <- redirects_request(req)
# resp <- redirects_response(resp)
} else {
private$mock(on = FALSE)
resp <- private$fetch_request(req)
private$mock(on = TRUE)
}
# request is not in cache and connections are not allowed
} else {
# throw vcr error: should happen when user not using
# use_cassette or insert_cassette
if (vcr_loaded()) {
private$request_handler(req)$handle()
}
# no stubs found and net connect not allowed - STOP
x <- "Real HTTP connections are disabled.\nUnregistered request:\n "
y <- "\n\nYou can stub this request with the following snippet:\n\n "
z <- "\n\nregistered request stubs:\n\n"
msgx <- paste(x, request_signature$to_s())
msgy <- ""
if (webmockr_conf_env$show_stubbing_instructions) {
msgy <- paste(y, private$make_stub_request_code(request_signature))
}
if (length(webmockr_stub_registry$request_stubs)) {
msgz <- paste(
z,
paste0(vapply(webmockr_stub_registry$request_stubs, function(z)
z$to_s(), ""), collapse = "\n ")
)
} else {
msgz <- ""
}
ending <- "\n============================================================"
stop(paste0(msgx, msgy, msgz, ending), call. = FALSE)
}
return(resp)
},
#' @description Remove all stubs
#' @return nothing returned; removes all request stubs
remove_stubs = function() {
webmockr_stub_registry$remove_all_request_stubs()
}
),
private = list(
make_stub_request_code = function(x) {
tmp <- sprintf(
"stub_request('%s', uri = '%s')",
x$method,
x$uri
)
if (!is.null(x$headers) || !is.null(x$body)) {
# set defaults to ""
hd_str <- bd_str <- ""
# headers has to be a named list, so easier to deal with
if (!is.null(x$headers)) {
hd <- x$headers
hd_str <- paste0(
paste(sprintf("'%s'", names(hd)),
sprintf("'%s'", unlist(unname(hd))), sep = " = "),
collapse = ", ")
}
# body can be lots of things, so need to handle various cases
if (!is.null(x$body)) {
bd <- x$body
bd_str <- hdl_lst2(bd)
}
if (all(nzchar(hd_str) && nzchar(bd_str))) {
with_str <- sprintf(" wi_th(\n headers = list(%s),\n body = list(%s)\n )",
hd_str, bd_str)
} else if (nzchar(hd_str) && !nzchar(bd_str)) {
with_str <- sprintf(" wi_th(\n headers = list(%s)\n )", hd_str)
} else if (!nzchar(hd_str) && nzchar(bd_str)) {
with_str <- sprintf(" wi_th(\n body = list(%s)\n )", bd_str)
}
tmp <- paste0(tmp, " %>%\n ", with_str)
}
return(tmp)
},
build_stub_response = function(stub) {
stopifnot(inherits(stub, "StubbedRequest"))
resp <- Response$new()
resp$set_url(stub$uri)
resp$set_body(stub$body)
resp$set_request_headers(stub$request_headers)
resp$set_response_headers(stub$response_headers)
resp$set_status(as.integer(stub$status_code %||% 200))
stub_num_get <- stub$counter$count()
if (stub_num_get > length(stub$responses_sequences)) {
stub_num_get <- length(stub$responses_sequences)
}
respx <- stub$responses_sequences[[stub_num_get]]
# if user set to_timeout or to_raise, do that
if (!is.null(respx)) {
if (respx$timeout || respx$raise) {
if (respx$timeout) {
x <- fauxpas::HTTPRequestTimeout$new()
resp$set_status(x$status_code)
x$do_verbose(resp)
}
if (respx$raise) {
x <- respx$exceptions[[1]]$new()
resp$set_status(x$status_code)
x$do_verbose(resp)
}
}
}
return(resp)
},
add_response_sequences = function(stub, response) {
# TODO: assert HttpResponse (is it ever a crul response?)
stopifnot(inherits(stub, "StubbedRequest"))
# FIXME: temporary fix, change to using request registry counter
# to decide which responses_sequence entry to use
# choose which response to return
stub_num_get <- stub$counter$count()
if (stub_num_get > length(stub$responses_sequences)) {
stub_num_get <- length(stub$responses_sequences)
}
respx <- stub$responses_sequences[[stub_num_get]]
# remove NULLs
toadd <- cc(respx)
if (is.null(toadd)) return(response)
# remove timeout, raise, exceptions fields
toadd <- toadd[!names(toadd) %in% c('timeout', 'raise', 'exceptions')]
for (i in seq_along(toadd)) {
if (names(toadd)[i] == "status") {
response$status_code <- as.integer(toadd[[i]])
}
if (names(toadd)[i] == "body") {
if (inherits(respx$body_raw, "mock_file")) {
cat(
respx$body_raw$payload,
file = respx$body_raw$path,
sep = "\n"
)
respx$body_raw <-
respx$body_raw$path
if (self$client == "httr") {
class(respx$body_raw) <- "path"
}
}
body_type <- attr(respx$body_raw, "type") %||% ""
if (self$client == "httr" && body_type == "file") {
attr(respx$body_raw, "type") <- NULL
class(respx$body_raw) <- "path"
}
response$content <- respx$body_raw
}
if (names(toadd)[i] == "headers") {
headers <- names_to_lower(as_character(toadd[[i]]))
if (self$client == "crul") {
response$response_headers <- headers
response$response_headers_all <- list(headers)
} else {
response$headers <- httr::insensitive(headers)
}
}
}
return(response)
}
)
)
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/adapter.R
|
#' This function is defunct.
#' @export
#' @rdname webmockr_enable-defunct
#' @keywords internal
webmockr_enable <- function(...) .Defunct("enable")
#' This function is defunct.
#' @export
#' @rdname webmockr_disable-defunct
#' @keywords internal
webmockr_disable <- function(...) .Defunct("disable")
#' This function is defunct.
#' @export
#' @rdname to_return_-defunct
#' @keywords internal
to_return_ <- function(...) .Defunct("to_return")
#' This function is defunct.
#' @export
#' @rdname wi_th_-defunct
#' @keywords internal
wi_th_ <- function(...) .Defunct("wi_th")
#' Defunct functions in \pkg{webmockr}
#'
#' - [webmockr_enable()]: Function removed, see [enable()]
#' - [webmockr_disable()]: Function removed, see [disable()]
#' - [to_return_]: Only [to_return()] is available now
#' - [wi_th_]: Only [wi_th()] is available now
#'
#' @name webmockr-defunct
NULL
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/defunct.R
|
webmockr_lightswitch <- new.env()
webmockr_lightswitch$httr <- FALSE
webmockr_lightswitch$crul <- FALSE
webmockr_adapters <- c('crul', 'httr')
#' Enable or disable webmockr
#'
#' @export
#' @param adapter (character) the adapter name, either 'crul' or 'httr'.
#' If none is given, we attempt to enable/disable both adapters
#' @param options list of options - ignored for now
#' @param quiet (logical) suppress messages? default: `FALSE`
#' @details `enable()` enables \pkg{webmockr} for all adapters.
#' `disable()` disables \pkg{webmockr} for all adapters. `enabled()`
#' answers whether \pkg{webmockr} is enabled for a given adapter
#' @return `enable()` and `disable()` invisibly return booleans for
#' each adapter, as a result of running enable or disable, respectively,
#' on each [HttpLibAdapaterRegistry] object. `enabled()` returns a
#' single boolean
enable <- function(adapter = NULL, options = list(), quiet = FALSE) {
adnms <- vapply(http_lib_adapter_registry$adapters, function(w) w$client, "")
if (!is.null(adapter)) {
if (!adapter %in% webmockr_adapters) {
stop("adapter must be one of 'crul' or 'httr'")
}
if (!requireNamespace(adapter, quietly = TRUE)) {
message(adapter, " not installed, skipping enable")
return(invisible(FALSE))
}
http_lib_adapter_registry$adapters[[grep(adapter, adnms)]]$enable(quiet)
} else {
invisible(vapply(http_lib_adapter_registry$adapters, function(z) {
pkgname <- z$client
# check if package installed first
if (!requireNamespace(pkgname, quietly = TRUE)) {
message(pkgname, " not installed, skipping enable")
FALSE
} else {
# if installed, enable
z$enable(quiet)
}
}, logical(1)))
}
}
#' @export
#' @rdname enable
enabled <- function(adapter = "crul") {
if (!adapter %in% webmockr_adapters) {
stop("'adapter' must be in the set ",
paste0(webmockr_adapters, collapse = ", "))
}
webmockr_lightswitch[[adapter]]
}
#' @export
#' @rdname enable
disable <- function(adapter = NULL, options = list(), quiet = FALSE) {
adnms <- vapply(http_lib_adapter_registry$adapters, function(w) w$client, "")
if (!is.null(adapter)) {
if (!adapter %in% webmockr_adapters) {
stop("adapter must be one of 'crul' or 'httr'")
}
if (!requireNamespace(adapter, quietly = TRUE)) {
message(adapter, " not installed, skipping disable")
return(invisible(FALSE))
}
http_lib_adapter_registry$adapters[[grep(adapter, adnms)]]$disable(quiet)
} else {
invisible(vapply(http_lib_adapter_registry$adapters, function(z) {
pkgname <- z$client
# check if package installed first
if (!requireNamespace(pkgname, quietly = TRUE)) {
message(pkgname, " not installed, skipping disable")
FALSE
} else {
# if installed, disable
z$disable(quiet)
}
}, logical(1)))
}
}
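# Usage sketch (not run):
# enable()          # enable mocking for all installed adapters
# enabled("crul")   # is the crul adapter enabled?
# disable("httr")   # disable only the httr adapter
# disable()         # disable all adapters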
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/flipswitch.R
|
if (base::getRversion() >= "2.15.1") {
utils::globalVariables(c("vcr_c"))
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/globals.R
|
# headers <- list(`Content-type` = 'application/json', Stuff = "things")
# normalize_headers(x = headers)
#
# headers <- list(`content-type` = 'application/json', stuff = "things")
# normalize_headers(x = headers, capitalize = FALSE)
#
# headers <- list(`content-type` = 'application/json', `x-frame-options` = c("SAMEORIGIN", "sameorigin"))
# normalize_headers(x = headers)
# normalize_headers(x = headers, FALSE)
normalize_headers <- function(x = NULL, capitalize = TRUE) {
if (is.null(x) || length(x) == 0) return(x)
res <- list()
for (i in seq_along(x)) {
name <- paste0(
vapply(strsplit(as.character(names(x)[i]), '_|-')[[1]], function(w) simple_cap(w, capitalize), ""),
collapse = "-"
)
value <- switch(
class(x[[i]]),
list = if (length(x[[i]]) == 1) x[[i]][[1]] else sort(vapply(x[[i]], function(z) as.character(z), "")),
if (length(x[[i]]) > 1) paste0(as.character(x[[i]]), collapse = ",") else as.character(x[[i]])
)
res[[i]] <- list(name, value)
}
unlist(lapply(res, function(z) stats::setNames(z[2], z[1])), FALSE)
}
simple_cap <- function(x, capitalize) {
if (capitalize) {
s <- strsplit(x, " ")[[1]]
paste(toupper(substring(s, 1, 1)), substring(s, 2),
sep = "", collapse = " ")
} else {
x
}
}
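# Sketch (not run): normalize_headers() title-cases dash/underscore-separated
# header names and collapses multi-value character headers into a single
# comma-separated string, e.g.
# normalize_headers(list(`content-type` = "application/json"))
# #> list(`Content-Type` = "application/json")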
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/headers.R
|
#' Mock file
#'
#' @export
#' @param path (character) a file path. required
#' @param payload (character) string to be written to the file given
#' at `path` parameter. required
#' @return a list with S3 class `mock_file`
#' @examples
#' mock_file(path = tempfile(), payload = "{\"foo\": \"bar\"}")
mock_file <- function(path, payload) {
assert(path, "character")
assert(payload, c("character", "json"))
structure(list(path = path, payload = payload), class = "mock_file")
}
#' @export
print.mock_file <- function(x, ...) {
cat("<mock file>", sep = "\n")
cat(paste0(" path: ", x$path), sep = "\n")
cat(paste0(" payload: ", substring(x$payload, 1, 80)), sep = "\n")
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/mock_file.R
|
#' Mocking writing to disk
#'
#' @name mocking-disk-writing
#' @examples \dontrun{
#' # enable mocking
#' enable()
#'
#' # Write to a file before mocked request
#'
#' # crul
#' library(crul)
#' ## make a temp file
#' f <- tempfile(fileext = ".json")
#' ## write something to the file
#' cat("{\"hello\":\"world\"}\n", file = f)
#' readLines(f)
#' ## make the stub
#' stub_request("get", "https://httpbin.org/get") %>%
#' to_return(body = file(f))
#' ## make a request
#' (out <- HttpClient$new("https://httpbin.org/get")$get(disk = f))
#' out$content
#' readLines(out$content)
#'
#' # httr
#' library(httr)
#' ## make a temp file
#' f <- tempfile(fileext = ".json")
#' ## write something to the file
#' cat("{\"hello\":\"world\"}\n", file = f)
#' readLines(f)
#' ## make the stub
#' stub_request("get", "https://httpbin.org/get") %>%
#' to_return(body = file(f),
#' headers = list('content-type' = "application/json"))
#' ## make a request
#' ## with httr, you must set overwrite=TRUE or you'll get an error
#' out <- GET("https://httpbin.org/get", write_disk(f, overwrite=TRUE))
#' out
#' out$content
#' content(out, "text", encoding = "UTF-8")
#'
#'
#' # Use mock_file to have webmockr handle file and contents
#'
#' # crul
#' library(crul)
#' f <- tempfile(fileext = ".json")
#' ## make the stub
#' stub_request("get", "https://httpbin.org/get") %>%
#' to_return(body = mock_file(f, "{\"hello\":\"mars\"}\n"))
#' ## make a request
#' (out <- crul::HttpClient$new("https://httpbin.org/get")$get(disk = f))
#' out$content
#' readLines(out$content)
#'
#' # httr
#' library(httr)
#' ## make a temp file
#' f <- tempfile(fileext = ".json")
#' ## make the stub
#' stub_request("get", "https://httpbin.org/get") %>%
#' to_return(
#' body = mock_file(path = f, payload = "{\"foo\": \"bar\"}"),
#' headers = list('content-type' = "application/json")
#' )
#' ## make a request
#' out <- GET("https://httpbin.org/get", write_disk(f))
#' out
#' ## view stubbed file content
#' out$content
#' readLines(out$content)
#' content(out, "text", encoding = "UTF-8")
#'
#' # disable mocking
#' disable()
#' }
NULL
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/mocking-disk-writing.R
|
http_lib_adapter_registry <- NULL # nocov start
webmockr_stub_registry <- NULL
webmockr_request_registry <- NULL
.onLoad <- function(libname, pkgname) {
# set defaults for webmockr
webmockr_configure()
# assign crul and httr adapters
# which doesn't require those packages loaded yet
x <- HttpLibAdapaterRegistry$new()
x$register(CrulAdapter$new())
x$register(HttrAdapter$new())
http_lib_adapter_registry <<- x
# initialize empty stub registry on package load
webmockr_stub_registry <<- StubRegistry$new()
# initialize empty request registry on package load
webmockr_request_registry <<- RequestRegistry$new()
} # nocov end
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/onload.R
|
#' Pipe operator
#'
#' @name %>%
#' @rdname pipe
#' @keywords internal
#' @export
#' @importFrom magrittr %>%
#' @usage lhs \%>\% rhs
NULL
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/pipe.R
|
#' Extract the body from an HTTP request
#'
#' Returns an appropriate representation of the data contained within a request
#' body based on its encoding.
#'
#' @export
#' @param x an unexecuted crul *or* httr request object
#' @return one of the following:
#' - `NULL` if the request is not associated with a body
#' - a character string describing the upload, if an upload is used outside of a list
#' - list containing the multipart-encoded body
#' - character vector with the JSON- or raw-encoded body, or upload form file
pluck_body <- function(x) {
assert_request(x)
if (is_body_empty(x)) return(NULL)
# multipart body
if (!is.null(x$fields)) {
form_file_comp <- vapply(x$fields, inherits, logical(1), "form_file")
if (any(form_file_comp)) {
return(x$fields[form_file_comp])
} else {
return(x$fields)
}
# json/raw-encoded body
} else if (!is.null(x$options$postfields) && is.raw(x$options$postfields)) {
return(rawToChar(x$options$postfields))
# upload not in a list
} else if (!is.null(x$options$postfieldsize_large)) {
return(paste0("upload, file size: ", x$options$postfieldsize_large))
# unknown, fail out
} else {
stop("couldn't fetch request body; file an issue at \n",
" https://github.com/ropensci/webmockr/issues/",
call. = FALSE)
}
}
assert_request <- function(x) {
request_slots <- c("url", "method", "options", "headers")
if (!is.list(x) || !all(request_slots %in% names(x))) {
stop(deparse(substitute(x)), " is not a valid request ", call. = FALSE)
}
}
is_body_empty <- function(x) {
is.null(x$fields) &&
(is.null(x$options$postfieldsize) || x$options$postfieldsize == 0L)
}
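# Illustrative sketch (not run): pluck_body() operates on the request object
# the adapters receive, a list with at least url, method, options, and
# headers slots (see assert_request() above). The hand-built request below is
# hypothetical, constructed only to exercise the raw/JSON-encoded body branch:
# req <- list(
#   url = "https://example.com", method = "post",
#   options = list(postfields = charToRaw('{"a":1}'), postfieldsize = 7L),
#   headers = list()
# )
# pluck_body(req)  # returns the JSON string '{"a":1}'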
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/pluck_body.R
|
# query mapper for BodyPattern
# attempt to convert input to an R object regardless of format
query_mapper <- function(x) {
if (is.null(x)) return(NULL)
x
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/query_mapper.R
|
#' Remove a request stub
#'
#' @export
#' @param stub a request stub, of class `StubbedRequest`
#' @return logical, `TRUE` if removed, `FALSE` if not removed
#' @family stub-registry
#' @examples
#' (x <- stub_request("get", "https://httpbin.org/get"))
#' stub_registry()
#' remove_request_stub(x)
#' stub_registry()
remove_request_stub <- function(stub) {
stopifnot(inherits(stub, "StubbedRequest"))
webmockr_stub_registry$remove_request_stub(stub = stub)
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/remove_request_stub.R
|
# Check if request is in cache
request_is_in_cache <- function(request_signature) {
webmockr_stub_registry$is_registered(request_signature)
}
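# Sketch (not run): the adapters call this with a RequestSignature built from
# the intercepted request; TRUE means at least one registered stub matches.
# The RequestSignature constructor's options argument is omitted here and
# assumed to default to an empty list.
# sig <- RequestSignature$new(method = "get", uri = "https://httpbin.org/get")
# stub_request("get", "https://httpbin.org/get")
# request_is_in_cache(sig)  # expected: TRUE once the matching stub exists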
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/request_is_in_cache.R
|
#' List or clear requests in the request registry
#'
#' @export
#' @return an object of class `RequestRegistry`, print method gives the
#' requests in the registry and the number of times each one has been
#' performed
#' @family request-registry
#' @details `request_registry()` lists the requests that have been made
#' that webmockr knows about; `request_registry_clear()` resets the
#' request registry (removes all recorded requests)
#' @examples
#' webmockr::enable()
#' stub_request("get", "https://httpbin.org/get") %>%
#' to_return(body = "success!", status = 200)
#'
#' # nothing in the request registry
#' request_registry()
#'
#' # make the request
#' z <- crul::HttpClient$new(url = "https://httpbin.org")$get("get")
#'
#' # check the request registry - the request was made 1 time
#' request_registry()
#'
#' # do the request again
#' z <- crul::HttpClient$new(url = "https://httpbin.org")$get("get")
#'
#' # check the request registry - now it's been made 2 times, yay!
#' request_registry()
#'
#' # clear the request registry
#' request_registry_clear()
#' webmockr::disable()
request_registry <- function() webmockr_request_registry
#' @export
#' @rdname request_registry
request_registry_clear <- function() webmockr_request_registry$reset()
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/request_registry.R
|
#' List stubs in the stub registry
#'
#' @export
#' @return an object of class `StubRegistry`, print method gives the
#' stubs in the registry
#' @family stub-registry
#' @examples
#' # make a stub
#' stub_request("get", "https://httpbin.org/get") %>%
#' to_return(body = "success!", status = 200)
#'
#' # check the stub registry, there should be one in there
#' stub_registry()
#'
#' # make another stub
#' stub_request("get", "https://httpbin.org/get") %>%
#' to_return(body = "woopsy", status = 404)
#'
#' # check the stub registry, now there are two there
#' stub_registry()
#'
#' # to clear the stub registry
#' stub_registry_clear()
stub_registry <- function() webmockr_stub_registry
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/stub_registry.R
|
#' @title stub_registry_clear
#' @description Clear all stubs in the stub registry
#' @export
#' @return an empty list invisibly
#' @family stub-registry
#' @examples
#' (x <- stub_request("get", "https://httpbin.org/get"))
#' stub_registry()
#' stub_registry_clear()
#' stub_registry()
stub_registry_clear <- function() {
invisible(webmockr_stub_registry$remove_all_request_stubs())
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/stub_registry_clear.R
|
#' Stub an http request
#'
#' @export
#' @param method (character) HTTP method, one of "get", "post", "put", "patch",
#' "head", "delete", "options" - or the special "any" (for any method)
#' @param uri (character) The request uri. Can be a full or partial uri.
#' \pkg{webmockr} can match URIs without the "http" scheme, but does
#' not match if the scheme is "https". required, unless `uri_regex` given.
#' See [UriPattern] for more. See the "uri vs. uri_regex" section
#' @param uri_regex (character) A URI represented as regex. required, if `uri`
#' not given. See examples and the "uri vs. uri_regex" section
#' @return an object of class `StubbedRequest`, with print method describing
#' the stub.
#' @details Internally, this calls [StubbedRequest] which handles the logic
#'
#' See [stub_registry()] for listing stubs, [stub_registry_clear()]
#' for removing all stubs and [remove_request_stub()] for removing specific
#' stubs
#'
#' If multiple stubs match the same request, we use the first stub. So if you
#' want to use a stub that was created after an earlier one that matches,
#' remove the earlier one(s).
#'
#' Note on `wi_th()`: If you pass `query`, values are coerced to character
#' class in the recorded stub. You can pass numeric, integer, etc., but
#' all will be coerced to character.
#'
#' See [wi_th()] for details on request body/query/headers and
#' [to_return()] for details on how response status/body/headers
#' are handled
#'
#' @note Trailing slashes are dropped from stub URIs before matching
#'
#' @section uri vs. uri_regex:
#' When you use `uri`, we compare the URIs without query params AND
#' also the query params themselves without the URIs.
#'
#' When you use `uri_regex` we don't compare URIs and query params;
#' we just use your regex string defined in `uri_regex` as the pattern
#' for a call to [grepl]
#'
#' @section Mocking writing to disk:
#' See [mocking-disk-writing]
#' @seealso [wi_th()], [to_return()], [to_timeout()], [to_raise()],
#' [mock_file()]
#' @examples \dontrun{
#' # basic stubbing
#' stub_request("get", "https://httpbin.org/get")
#' stub_request("post", "https://httpbin.org/post")
#'
#' # any method, use "any"
#' stub_request("any", "https://httpbin.org/get")
#'
#' # list stubs
#' stub_registry()
#'
#' # request headers
#' stub_request("get", "https://httpbin.org/get") %>%
#' wi_th(headers = list('User-Agent' = 'R'))
#'
#' # request body
#' stub_request("post", "https://httpbin.org/post") %>%
#' wi_th(body = list(foo = 'bar'))
#' stub_registry()
#' library(crul)
#' x <- crul::HttpClient$new(url = "https://httpbin.org")
#' crul::mock()
#' x$post('post', body = list(foo = 'bar'))
#'
#' # add expectation with to_return
#' stub_request("get", "https://httpbin.org/get") %>%
#' wi_th(
#' query = list(hello = "world"),
#' headers = list('User-Agent' = 'R')) %>%
#' to_return(status = 200, body = "stuff", headers = list(a = 5))
#'
#' # list stubs again
#' stub_registry()
#'
#' # regex
#' stub_request("get", uri_regex = ".+ample\\..")
#'
#' # set stub an expectation to timeout
#' stub_request("get", "https://httpbin.org/get") %>% to_timeout()
#' x <- crul::HttpClient$new(url = "https://httpbin.org")
#' res <- x$get('get')
#'
#' # raise exception
#' library(fauxpas)
#' stub_request("get", "https://httpbin.org/get") %>% to_raise(HTTPAccepted)
#' stub_request("get", "https://httpbin.org/get") %>% to_raise(HTTPAccepted, HTTPGone)
#'
#' x <- crul::HttpClient$new(url = "https://httpbin.org")
#' stub_request("get", "https://httpbin.org/get") %>% to_raise(HTTPBadGateway)
#' crul::mock()
#' x$get('get')
#'
#' # pass a list to .list
#' z <- stub_request("get", "https://httpbin.org/get")
#' wi_th(z, .list = list(query = list(foo = "bar")))
#'
#' # just body
#' stub_request("any", uri_regex = ".+") %>%
#' wi_th(body = list(foo = 'bar'))
#' ## with crul
#' library(crul)
#' x <- crul::HttpClient$new(url = "https://httpbin.org")
#' crul::mock()
#' x$post('post', body = list(foo = 'bar'))
#' x$put('put', body = list(foo = 'bar'))
#' ## with httr
#' library(httr)
#' httr_mock()
#' POST('https://example.com', body = list(foo = 'bar'))
#' PUT('https://google.com', body = list(foo = 'bar'))
#'
#'
#' # just headers
#' headers <- list(
#' 'Accept-Encoding' = 'gzip, deflate',
#' 'Accept' = 'application/json, text/xml, application/xml, */*')
#' stub_request("any", uri_regex = ".+") %>% wi_th(headers = headers)
#' library(crul)
#' x <- crul::HttpClient$new(url = "https://httpbin.org", headers = headers)
#' crul::mock()
#' x$post('post')
#' x$put('put', body = list(foo = 'bar'))
#' x$get('put', query = list(stuff = 3423234L))
#'
#' # many responses
#' ## the first response matches the first to_return call, and so on
#' stub_request("get", "https://httpbin.org/get") %>%
#' to_return(status = 200, body = "foobar", headers = list(a = 5)) %>%
#' to_return(status = 200, body = "bears", headers = list(b = 6))
#' con <- crul::HttpClient$new(url = "https://httpbin.org")
#' con$get("get")$parse("UTF-8")
#' con$get("get")$parse("UTF-8")
#'
#' ## OR, use times with to_return() to repeat the same response many times
#' library(fauxpas)
#' stub_request("get", "https://httpbin.org/get") %>%
#' to_return(status = 200, body = "apple-pie", times = 2) %>%
#' to_raise(HTTPUnauthorized)
#' con <- crul::HttpClient$new(url = "https://httpbin.org")
#' con$get("get")$parse("UTF-8")
#' con$get("get")$parse("UTF-8")
#' con$get("get")$parse("UTF-8")
#'
#' # clear all stubs
#' stub_registry()
#' stub_registry_clear()
#' }
stub_request <- function(method = "get", uri = NULL, uri_regex = NULL) {
if (is.null(uri) && is.null(uri_regex)) {
stop("one of uri or uri_regex is required", call. = FALSE)
}
tmp <- StubbedRequest$new(method = method, uri = uri, uri_regex = uri_regex)
webmockr_stub_registry$register_stub(tmp)
return(tmp)
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/stub_request.R
|
#' Set raise error condition
#'
#' @export
#' @param .data input. Anything that can be coerced to a `StubbedRequest`
#' class object
#' @param ... One or more HTTP exceptions from the \pkg{fauxpas} package. Run
#' `grep("HTTP*", getNamespaceExports("fauxpas"), value = TRUE)` for a list of
#' possible exceptions
#' @return an object of class `StubbedRequest`, with print method describing
#' the stub
#' @section Raise vs. Return:
#' `to_raise()` always raises a stop condition, while `to_return(status=xyz)` only
#' sets the status code on the returned HTTP response object. So if you want to
#' raise a stop condition then `to_raise()` is what you want. But if you
#' don't want to raise a stop condition use `to_return()`. Use cases for each
#' vary. For example, in a unit test you may have a test expecting a 503 error;
#' in this case `to_raise()` makes sense. In another case, if a unit test
#' expects to test some aspect of an HTTP response object that httr or crul
#' typically returns, then you'll want `to_return()`.
#'
#' @details The behavior in the future will be:
#'
#' When multiple exceptions are passed, the first is used on the first
#' mock, the second on the second mock, and so on. Subsequent mocks use the
#' last exception
#'
#' But for now, only the first exception is used until we get that fixed
#' @note see examples in [stub_request()]
to_raise <- function(.data, ...) {
assert(.data, "StubbedRequest")
tmp <- list(...)
if (!all(vapply(tmp, function(x) inherits(x, "R6ClassGenerator"),
logical(1)))) {
stop("all objects must be error classes from fauxpas")
}
if (!all(vapply(tmp, function(x) grepl("HTTP", x$classname), logical(1)))) {
stop("all objects must be error classes from fauxpas")
}
.data$to_raise(tmp)
return(.data)
}
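# Usage sketch (not run; requires the fauxpas package):
# stub_request("get", "https://httpbin.org/get") %>%
#   to_raise(fauxpas::HTTPServiceUnavailable)
# crul::HttpClient$new("https://httpbin.org")$get("get")
# # => raises a 503 Service Unavailable condition instead of returning a response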
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/to_raise.R
|
#' Expectation for what's returned from a stubbed request
#'
#' Set response status code, response body, and/or response headers
#'
#' @export
#' @param .data input. Anything that can be coerced to a `StubbedRequest` class
#' object
#' @param ... Comma separated list of named variables. accepts the following:
#' `status`, `body`, `headers`. See Details for more.
#' @param .list named list, has to be one of 'status', 'body',
#' and/or 'headers'. An alternative to passing in via `...`. Don't pass the
#' same thing to both, e.g. don't pass 'status' to `...`, and also 'status' to
#' this parameter
#' @param times (integer) number of times the given response should be
#' returned; default: 1. value must be greater than or equal to 1. Very large
#' values probably don't make sense, but there's no maximum value. See
#' Details.
#' @return an object of class `StubbedRequest`, with print method describing
#' the stub
#' @note see more examples in [stub_request()]
#' @details Values for status, body, and headers:
#'
#' - status: (numeric/integer) three digit status code
#' - body: various: `character`, `json`, `list`, `raw`, `numeric`,
#'   `NULL`, `FALSE`, a file connection (other connection types
#' not supported), or a `mock_file` function call (see [mock_file()])
#' - headers: (list) a named list, must be named
#'
#' response headers are returned with all lowercase names and the values
#' are all of type character. if numeric/integer values are given
#' (e.g., `to_return(headers = list(a = 10))`), we'll coerce any
#' numeric/integer values to character.
#'
#' @section multiple `to_return()`:
#' You can add more than one `to_return()` to a webmockr stub (including
#' [to_raise()], [to_timeout()]). Each one is a HTTP response returned.
#' That is, you'll match to an HTTP request based on `stub_request()` and
#' `wi_th()`; the first time the request is made, the first response
#' is returned; the second time the request is made, the second response
#' is returned; and so on.
#'
#' Be aware that webmockr has to track number of requests
#' (see [request_registry()]), and so if you use multiple `to_return()`
#' or the `times` parameter, you must clear the request registry
#' in order to go back to mocking responses from the start again.
#' [webmockr_reset()] clears the stub registry and the request registry,
#' after which you can use multiple responses again (after creating
#' your stub(s) again of course)
#'
#' @inheritSection to_raise Raise vs. Return
#'
#' @examples
#' # first, make a stub object
#' foo <- function() {
#' stub_request("post", "https://httpbin.org/post")
#' }
#'
#' # add status, body and/or headers
#' foo() %>% to_return(status = 200)
#' foo() %>% to_return(body = "stuff")
#' foo() %>% to_return(body = list(a = list(b = "world")))
#' foo() %>% to_return(headers = list(a = 5))
#' foo() %>%
#' to_return(status = 200, body = "stuff", headers = list(a = 5))
#'
#' # .list - pass in a named list instead
#' foo() %>% to_return(.list = list(body = list(foo = "bar")))
#'
#' # multiple responses using chained `to_return()`
#' foo() %>% to_return(body = "stuff") %>% to_return(body = "things")
#'
#' # many of the same response using the times parameter
#' foo() %>% to_return(body = "stuff", times = 3)
to_return <- function(.data, ..., .list = list(), times = 1) {
assert(.data, "StubbedRequest")
assert(.list, "list")
assert(times, c("integer", "numeric"))
assert_gte(times, 1)
z <- list(...)
if (length(z) == 0) z <- NULL
z <- c(z, .list)
if (
!any(c("status", "body", "headers") %in% names(z)) &&
length(z) != 0
) {
stop("'to_return' only accepts status, body, headers")
}
assert(z$status, c("numeric", "integer"))
assert(z$headers, "list")
if (!all(hz_namez(z$headers))) stop("'headers' must be a named list")
replicate(times,
.data$to_return(status = z$status, body = z$body, headers = z$headers))
return(.data)
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/to_return.R
|
#' Set timeout as an expected return on a match
#'
#' @export
#' @param .data input. Anything that can be coerced to a `StubbedRequest` class
#' object
#' @return an object of class `StubbedRequest`, with print method describing
#' the stub
#' @note see examples in [stub_request()]
to_timeout <- function(.data) {
assert(.data, "StubbedRequest")
.data$to_timeout()
return(.data)
}
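# Usage sketch (not run):
# stub_request("post", "https://httpbin.org/post") %>% to_timeout()
# crul::HttpClient$new("https://httpbin.org")$post("post")
# # => errors with a request timeout (fauxpas::HTTPRequestTimeout)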
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/to_timeout.R
|
#' webmockr configuration
#'
#' @export
#' @param allow_net_connect (logical) Default: `FALSE`
#' @param allow_localhost (logical) Default: `FALSE`
#' @param allow (character) one or more URI/URL to allow (and by extension
#' all others are not allowed)
#' @param show_stubbing_instructions (logical) Default: `TRUE`. If `FALSE`,
#' stubbing instructions are not shown
#' @param uri (character) a URI/URL as a character string - to determine
#' whether or not it is allowed
#'
#' @section webmockr_allow_net_connect:
#' If there are stubs found for a request, even if net connections are
#' allowed (by running `webmockr_allow_net_connect()`) the stubbed
#' response will be returned. If no stub is found, and net connections
#' are allowed, then a real HTTP request can be made.
#'
#' @examples \dontrun{
#' webmockr_configure()
#' webmockr_configure(
#' allow_localhost = TRUE
#' )
#' webmockr_configuration()
#' webmockr_configure_reset()
#'
#' webmockr_allow_net_connect()
#' webmockr_net_connect_allowed()
#'
#' # disable net connect for any URIs
#' webmockr_disable_net_connect()
#' ### gives NULL with no URI passed
#' webmockr_net_connect_allowed()
#' # disable net connect EXCEPT FOR given URIs
#' webmockr_disable_net_connect(allow = "google.com")
#' ### is a specific URI allowed?
#' webmockr_net_connect_allowed("google.com")
#' }
webmockr_configure <- function(
allow_net_connect = FALSE,
allow_localhost = FALSE,
allow = NULL,
show_stubbing_instructions = TRUE) {
opts <- list(
allow_net_connect = allow_net_connect,
allow_localhost = allow_localhost,
allow = allow,
show_stubbing_instructions = show_stubbing_instructions
)
for (i in seq_along(opts)) {
assign(names(opts)[i], opts[[i]], envir = webmockr_conf_env)
}
webmockr_configuration()
}
#' @export
#' @rdname webmockr_configure
webmockr_configure_reset <- function() webmockr_configure()
#' @export
#' @rdname webmockr_configure
webmockr_configuration <- function() {
structure(as.list(webmockr_conf_env), class = "webmockr_config")
}
#' @export
#' @rdname webmockr_configure
webmockr_allow_net_connect <- function() {
if (!webmockr_net_connect_allowed()) {
message("net connect allowed")
assign('allow_net_connect', TRUE, envir = webmockr_conf_env)
}
}
#' @export
#' @rdname webmockr_configure
webmockr_disable_net_connect <- function(allow = NULL) {
assert(allow, "character")
message("net connect disabled")
assign('allow_net_connect', FALSE, envir = webmockr_conf_env)
assign('allow', allow, envir = webmockr_conf_env)
}
#' @export
#' @rdname webmockr_configure
webmockr_net_connect_allowed <- function(uri = NULL) {
assert(uri, c("character", "list"))
if (is.null(uri)) return(webmockr_conf_env$allow_net_connect)
uri <- normalize_uri(uri)
webmockr_conf_env$allow_net_connect ||
(webmockr_conf_env$allow_localhost && is_localhost(uri) ||
`!!`(webmockr_conf_env$allow) &&
net_connect_explicit_allowed(webmockr_conf_env$allow, uri))
}
net_connect_explicit_allowed <- function(allowed, uri = NULL) {
if (is.null(allowed)) return(FALSE)
if (is.null(uri)) return(FALSE)
z <- parse_a_url(uri)
if (is.na(z$domain)) return(FALSE)
if (inherits(allowed, "list")) {
any(vapply(allowed, net_connect_explicit_allowed, logical(1), uri = uri))
} else if (inherits(allowed, "character")) {
if (length(allowed) == 1) {
allowed == uri ||
allowed == z$domain ||
allowed == sprintf("%s:%s", z$domain, z$port) ||
allowed == sprintf("%s://%s:%s", z$scheme, z$domain, z$port) ||
allowed == sprintf("%s://%s", z$scheme, z$domain) &&
z$port == z$default_port
} else {
any(vapply(allowed, net_connect_explicit_allowed, logical(1), uri = uri))
}
}
}
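# Sketch (not run): an allowed entry matches the full uri, the bare domain,
# domain:port, scheme://domain:port, or scheme://domain (when the port is
# the scheme's default). Unexported helper, shown only for illustration:
# net_connect_explicit_allowed("google.com", "https://google.com/search")
# #> TRUE (expected)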
#' @export
print.webmockr_config <- function(x, ...) {
cat("<webmockr configuration>", sep = "\n")
cat(paste0(" crul enabled?: ", webmockr_lightswitch$crul), sep = "\n")
cat(paste0(" httr enabled?: ", webmockr_lightswitch$httr), sep = "\n")
cat(paste0(" allow_net_connect?: ", x$allow_net_connect), sep = "\n")
cat(paste0(" allow_localhost?: ", x$allow_localhost), sep = "\n")
cat(paste0(" allow: ", x$allow %||% ""), sep = "\n")
cat(paste0(" show_stubbing_instructions: ", x$show_stubbing_instructions),
sep = "\n")
}
webmockr_conf_env <- new.env()
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/webmockr-opts.R
|
#' @title webmockr
#' @description Stubbing and setting expectations on HTTP requests
#'
#' @importFrom R6 R6Class
#' @importFrom fauxpas HTTPRequestTimeout
#' @importFrom crul mock
#' @importFrom base64enc base64encode
#' @name webmockr-package
#' @aliases webmockr
#' @docType package
#' @keywords package
#' @author Scott Chamberlain \email{myrmecocystus+r@@gmail.com}
#' @author Aaron Wolen
#'
#' @section Features:
#'
#' - Stubbing HTTP requests at the low level of the HTTP client library
#' - Setting and verifying expectations on HTTP requests
#' - Matching requests based on method, URI, headers and body
#' - Supports multiple HTTP libraries, including \pkg{crul} and
#' \pkg{httr}
#' - Integration with HTTP test caching library \pkg{vcr}
#'
#' @examples
#' library(webmockr)
#' stub_request("get", "https://httpbin.org/get")
#' stub_request("post", "https://httpbin.org/post")
#' stub_registry()
NULL
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/webmockr.R
|
#' @title webmockr_reset
#' @description Clear all stubs and the request counter
#' @export
#' @return nothing
#' @seealso [stub_registry_clear()] [request_registry_clear()]
#' @details this function runs [stub_registry_clear()] and
#' [request_registry_clear()] - so you can run those two yourself
#' to achieve the same thing
#' @examples
#' # webmockr_reset()
webmockr_reset <- function() {
stub_registry_clear()
request_registry_clear()
invisible(NULL)
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/webmockr_reset.R
|
#' Set additional parts of a stubbed request
#'
#' Set query params, request body, request headers and/or basic_auth
#'
#' @export
#' @param .data input. Anything that can be coerced to a `StubbedRequest` class
#' object
#' @param ... Comma separated list of named variables. accepts the following:
#' `query`, `body`, `headers`, `basic_auth`. See Details.
#' @param .list named list, has to be one of `query`, `body`,
#' `headers` and/or `basic_auth`. An alternative to passing in via `...`.
#' Don't pass the same thing to both, e.g. don't pass 'query' to `...`, and
#' also 'query' to this parameter
#' @details `with` is a function in the `base` package, so we went with
#' `wi_th`
#' @return an object of class `StubbedRequest`, with print method describing
#' the stub
#' @note see more examples in [stub_request()]
#' @details
#' Values for query, body, headers, and basic_auth:
#'
#' - query: (list) a named list. values are coerced to character
#' class in the recorded stub. You can pass numeric, integer, etc., but
#' all will be coerced to character.
#' - body: various, including character string, list, raw, numeric,
#' upload (`crul::upload` or `httr::upload_file`, they both create the
#' same object in the end)
#' - headers: (list) a named list
#' - basic_auth: (character) a length two vector, username and password.
#'   authentication type (basic/digest/ntlm/etc.) is ignored. That is,
#'   mocking authentication right now does not take into account the
#'   authentication type. We don't do any checking of the username/password
#' except to detect edge cases where for example, the username/password
#' were probably not set by the user on purpose (e.g., a URL is
#' picked up by an environment variable)
#'
#' Note that there is no regex matching on query, body, or headers. They
#' are tested for matches in the following ways:
#'
#' - query: compare stubs and requests with `identical()`. this compares
#' named lists, so both list names and values are compared
#' - body: varies depending on the body format (list vs. character, etc.)
#' - headers: compare stub and request values with `==`. list names are
#' compared with `%in%`. `basic_auth` is included in headers (with the name
#' Authorization)
#'
#' @examples
#' # first, make a stub object
#' req <- stub_request("post", "https://httpbin.org/post")
#'
#' # add body
#' # list
#' wi_th(req, body = list(foo = "bar"))
#' # string
#' wi_th(req, body = '{"foo": "bar"}')
#' # raw
#' wi_th(req, body = charToRaw('{"foo": "bar"}'))
#' # numeric
#' wi_th(req, body = 5)
#' # an upload
#' wi_th(req, body = crul::upload(system.file("CITATION")))
#' # wi_th(req, body = httr::upload_file(system.file("CITATION")))
#'
#' # add query - has to be a named list
#' wi_th(req, query = list(foo = "bar"))
#'
#' # add headers - has to be a named list
#' wi_th(req, headers = list(foo = "bar"))
#' wi_th(req, headers = list(`User-Agent` = "webmockr/v1", hello="world"))
#'
#' # .list - pass in a named list instead
#' wi_th(req, .list = list(body = list(foo = "bar")))
#'
#' # basic authentication
#' wi_th(req, basic_auth = c("user", "pass"))
#' wi_th(req, basic_auth = c("user", "pass"), headers = list(foo = "bar"))
wi_th <- function(.data, ..., .list = list()) {
assert(.data, "StubbedRequest")
assert(.list, "list")
z <- list(...)
if (length(z) == 0) z <- NULL
z <- c(z, .list)
if (
!any(c("query", "body", "headers", "basic_auth") %in% names(z)) &&
length(z) != 0
) {
stop("'wi_th' only accepts query, body, headers, basic_auth")
}
if (any(duplicated(names(z)))) stop("can not have duplicated names")
assert(z$query, "list")
if (!all(hz_namez(z$query))) stop("'query' must be a named list")
assert(z$headers, "list")
if (!all(hz_namez(z$headers))) stop("'headers' must be a named list")
assert(z$basic_auth, "character")
assert_eq(z$basic_auth, 2)
.data$with(
query = z$query,
body = z$body,
headers = z$headers,
basic_auth = z$basic_auth
)
return(.data)
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/wi_th.R
|
http_verbs <- c("any", "get", "post", "put", "patch", "head", "delete")
cc <- function(x) Filter(Negate(is.null), x)
is_nested <- function(x) {
stopifnot(is.list(x))
for (i in x) {
if (is.list(i)) return(TRUE)
}
return(FALSE)
}
col_l <- function(w) paste(names(w), unname(unlist(w)), sep = "=")
hdl_nested <- function(x) {
if (!is_nested(x)) col_l(x)
}
subs <- function(x, n) {
unname(vapply(x, function(w) {
w <- as.character(w)
if (nchar(w) > n) paste0(substring(w, 1, n), "...") else w
}, ""))
}
l2c <- function(w) paste(names(w), as.character(w), sep = " = ", collapse = "")
hdl_lst <- function(x) {
if (is.null(x) || length(x) == 0) return("")
if (is.raw(x)) return(paste0("raw bytes, length: ", length(x)))
if (inherits(x, "form_file"))
return(sprintf("crul::upload(\"%s\", type=\"%s\")", x$path, x$type))
if (inherits(x, "mock_file")) return(paste0("mock file, path: ", x$path))
if (inherits(x, "list")) {
if (is_nested(x)) {
# substring(l2c(x), 1, 80)
subs(l2c(x), 80)
} else {
txt <- paste(names(x), subs(unname(unlist(x)), 20), sep = "=",
collapse = ", ")
substring(txt, 1, 80)
}
} else {
x
}
}
hdl_lst2 <- function(x) {
if (is.null(x) || length(x) == 0) return("")
if (is.raw(x)) return(rawToChar(x))
if (inherits(x, "form_file"))
return(sprintf("crul::upload(\"%s\", \"%s\")", x$path, x$type))
if (inherits(x, "list")) {
if (any(vapply(x, function(z) inherits(z, "form_file"), logical(1))))
for (i in seq_along(x)) x[[i]] <- sprintf("crul::upload(\"%s\", \"%s\")", x[[i]]$path, x[[i]]$type)
out <- vector(mode = "character", length = length(x))
for (i in seq_along(x)) {
targ <- x[[i]]
out[[i]] <- paste(names(x)[i], switch(
class(targ)[1L],
character = if (grepl("upload", targ)) targ else sprintf('\"%s\"', targ),
list = sprintf("list(%s)", hdl_lst2(targ)),
targ
), sep = "=")
}
return(paste(out, collapse = ", "))
} else {
# FIXME: dumping ground, just spit out whatever and hope for the best
return(x)
}
}
parseurl <- function(x) {
tmp <- urltools::url_parse(x)
tmp <- as.list(tmp)
if (!is.na(tmp$parameter)) {
tmp$parameter <- sapply(strsplit(tmp$parameter, "&")[[1]], function(z) {
zz <- strsplit(z, split = "=")[[1]]
as.list(stats::setNames(zz[2], zz[1]))
}, USE.NAMES = FALSE)
}
tmp
}
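# Usage sketch (hypothetical URL): parseurl() wraps urltools::url_parse()
# and converts the query string into a named list, e.g.
# parseurl("https://httpbin.org/get?foo=bar&baz=3")$parameter
# #> roughly list(foo = "bar", baz = "3")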
url_builder <- function(uri, args = NULL) {
if (is.null(args)) return(uri)
paste0(uri, "?", paste(names(args), args, sep = "=", collapse = "&"))
}
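# Usage sketch (hypothetical URL):
# url_builder("https://httpbin.org/get", list(foo = "bar", baz = 3))
# #> "https://httpbin.org/get?foo=bar&baz=3"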
`%||%` <- function(x, y) {
if (
is.null(x) || length(x) == 0 || all(nchar(x) == 0) || all(is.na(x))
) y else x
}
# tryCatch version of above
`%|s|%` <- function(x, y) {
  z <- tryCatch(x, error = function(e) e)
if (inherits(z, "error")) return(y)
if (
is.null(z) || length(z) == 0 || all(nchar(z) == 0) || all(is.na(z))
) y else x
}
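# Usage sketch: both operators fall back to the right-hand side when the
# left-hand side is NULL, zero-length, all NA, or all empty strings:
# NULL %||% "default" #> "default"
# ""   %||% "default" #> "default"
# "x"  %||% "default" #> "x"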
`!!` <- function(x) if (is.null(x) || is.na(x)) FALSE else TRUE
assert <- function(x, y) {
if (!is.null(x)) {
if (!inherits(x, y)) {
stop(deparse(substitute(x)), " must be of class ",
paste0(y, collapse = ", "), call. = FALSE)
}
}
}
assert_gte <- function(x, y) {
if (!x >= y) {
stop(sprintf("%s must be greater than or equal to %s",
deparse(substitute(x)), y), call. = FALSE)
}
}
assert_eq <- function(x, y) {
if (!is.null(x)) {
if (!length(x) == y) {
stop(sprintf("length of %s must be equal to %s",
deparse(substitute(x)), y), call. = FALSE)
}
}
}
crul_head_parse <- function(z) {
if (grepl("HTTP\\/", z)) {
list(status = z)
} else {
ff <- regexec("^([^:]*):\\s*(.*)$", z)
xx <- regmatches(z, ff)[[1]]
as.list(stats::setNames(xx[[3]], tolower(xx[[2]])))
}
}
crul_headers_parse <- function(x) do.call("c", lapply(x, crul_head_parse))
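# Usage sketch (hypothetical header lines): status lines are kept whole and
# other lines are split on the first ":" with lower-cased names, e.g.
# crul_headers_parse(c("HTTP/1.1 200 OK", "Content-Type: application/json"))
# #> list(status = "HTTP/1.1 200 OK", `content-type` = "application/json")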
#' execute a curl request
#' @export
#' @keywords internal
#' @param x an object
#' @return a curl response
webmockr_crul_fetch <- function(x) {
if (is.null(x$disk) && is.null(x$stream)) {
curl::curl_fetch_memory(x$url$url, handle = x$url$handle)
}
else if (!is.null(x$disk)) {
curl::curl_fetch_disk(x$url$url, x$disk, handle = x$url$handle)
}
else {
curl::curl_fetch_stream(x$url$url, x$stream, handle = x$url$handle)
}
}
# modified from purrr:::has_names
along_rep <- function(x, y) rep(y, length.out = length(x))
hz_namez <- function(x) {
nms <- names(x)
if (is.null(nms)) {
along_rep(x, FALSE)
} else {
!(is.na(nms) | nms == "")
}
}
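# Usage sketch: returns one logical per element, TRUE when that element has
# a non-empty name:
# hz_namez(list(a = 1, 2)) #> TRUE FALSE
# hz_namez(list(1, 2))     #> FALSE FALSE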
# check for a package
check_for_pkg <- function(x) {
if (!requireNamespace(x, quietly = TRUE)) {
stop(sprintf("Please install '%s'", x), call. = FALSE)
} else {
invisible(TRUE)
}
}
# lower case names in a list, return that list
names_to_lower <- function(x) {
names(x) <- tolower(names(x))
return(x)
}
as_character <- function(x) {
stopifnot(is.list(x))
lapply(x, as.character)
}
last <- function(x) {
if (length(x) == 0) return(list())
x[[length(x)]]
}
vcr_loaded <- function() {
"package:vcr" %in% search()
}
# check whether a cassette is inserted without assuming vcr is installed
vcr_cassette_inserted <- function() {
if (vcr_loaded()) {
return(length(vcr::current_cassette()) > 0)
}
return(FALSE)
}
check_redirect_setting <- function() {
cs <- vcr::current_cassette()
stopifnot("record_separate_redirects must be logical" =
is.logical(cs$record_separate_redirects))
return(cs)
}
handle_separate_redirects <- function(req) {
cs <- check_redirect_setting()
if (cs$record_separate_redirects) {
req$options$followlocation <- 0L
if (is.list(req$url))
curl::handle_setopt(req$url$handle, followlocation = 0L)
}
return(req)
}
redirects_request <- function(x) {
cs <- check_redirect_setting()
if (cs$record_separate_redirects) return(cs$request_handler$request_original)
x
}
redirects_response <- function(x) {
cs <- check_redirect_setting()
if (cs$record_separate_redirects) return(last(cs$redirect_pool)[[1]])
x
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/R/zzz.R
|
#' httr library adapter
#'
#' @export
#' @family http_lib_adapters
#' @details This adapter modifies \pkg{httr} to allow mocking HTTP requests
#' when one is using \pkg{httr} in their code
HttrAdapter <- R6::R6Class(
'HttrAdapter',
public = list(
name = "httr_adapter",
enable = function() {
message("HttrAdapter enabled!")
webmockr_lightswitch$httr <- TRUE
},
disable = function() {
message("HttrAdapter disabled!")
webmockr_lightswitch$httr <- FALSE
},
build_request_signature = function(x) {
RequestSignature$new(
method = x$method,
uri = x$url,
options = list(
body = x$body %||% NULL,
headers = x$headers %||% NULL
)
)
},
handle_request = function() {
"fadfas"
}
)
)
# httr methods to override
## request_perform -> changes:
## - look in cache for matching request (given user specified matchers)
## - if it's a match, return the response (body, headers, etc.)
## - if no match, proceed with http request as normal
request_perform <- function(req, handle, refresh = TRUE) {
stopifnot(httr:::is.request(req), inherits(handle, "curl_handle"))
req <- httr:::request_prepare(req)
curl::handle_setopt(handle, .list = req$options)
if (!is.null(req$fields))
curl::handle_setform(handle, .list = req$fields)
curl::handle_setheaders(handle, .list = req$headers)
on.exit(curl::handle_reset(handle), add = TRUE)
# put request in cache
request_signature <- HttrAdapter$build_request_signature(req)
webmockr_request_registry$register_request(request_signature)
if (request_is_in_cache(req)) {
StubRegistry$find_stubbed_request(req)
} else {
resp <- httr:::request_fetch(req$output, req$url, handle)
      # If the response is a 401 and we have a refreshable auth token,
      # refresh it and then try again
needs_refresh <- refresh && resp$status_code == 401L &&
!is.null(req$auth_token) && req$auth_token$can_refresh()
if (needs_refresh) {
message("Auto-refreshing stale OAuth token.")
req$auth_token$refresh()
return(httr:::request_perform(req, handle, refresh = FALSE))
}
all_headers <- httr:::parse_headers(resp$headers)
headers <- httr:::last(all_headers)$headers
if (!is.null(headers$date)) {
date <- httr:::parse_http_date(headers$Date)
} else {
date <- Sys.time()
}
httr:::response(
url = resp$url,
status_code = resp$status_code,
headers = headers,
all_headers = all_headers,
cookies = curl::handle_cookies(handle),
content = resp$content,
date = date,
times = resp$times,
request = req,
handle = handle
)
}
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/inst/ignore/adapter-httr.R
|
wbenv <- new.env()
bucket <- new.env()
start_server <- function(x) {
app <- list(
call = function(req) {
wsUrl = paste(sep = '',
'"',
"ws://",
ifelse(is.null(req$HTTP_HOST), req$SERVER_NAME, req$HTTP_HOST),
'"')
tmp <- list(
status = 200L,
headers = list(
'Content-Type' = 'application/json'
),
body = sprintf('{
"http_method": "%s",
"url": "%s",
"port": "%s",
"query": "%s",
"user_agent": "%s"
}', req$REQUEST_METHOD, req$SERVER_NAME,
req$SERVER_PORT, req$QUERY_STRING, req$HTTP_USER_AGENT)
)
assign(basename(tempfile()), tmp, envir = bucket)
tmp
}
)
wbenv$server <- startDaemonizedServer("0.0.0.0", 9200, app)
#wbenv$server <- startDaemonizedServer("80", 9200, app)
message("server started")
}
stop_server <- function(x = NULL) {
stopDaemonizedServer(if (is.null(x)) wbenv$server else x)
}
bucket_list <- function(x) ls(envir = bucket)
bucket_unique <- function(x) {
hashes <- vapply(ls(envir = bucket), function(z) digest::digest(get(z, envir = bucket)), "")
if (any(duplicated(hashes))) {
torm <- names(hashes)[duplicated(hashes)]
invisible(lapply(torm, function(z) rm(list = z, envir = bucket)))
}
}
|
/scratch/gouwar.j/cran-all/cranData/webmockr/inst/ignore/sockets.R
|
#' Align templates and images
#'
#' Align images so that template points line up. Defaults to two-point alignment of the first two points in your template (usually the eyes) to their mean coordinate position across the stimuli.
#'
#' @details
#' Setting pt1 and pt2 to the same point aligns on 1 point, without resizing or rotating images. Setting them to different points aligns on 2 points, resizing and rotating faces. Setting `procrustes = TRUE` uses Procrustes analysis to resize and rotate images to be as close as possible to a mean shape.
#'
#' You can specify the x and y coordinates to align, and the width and height of the output images, or set them from a reference image. The reference image (`ref_img`) can be a stim, a 1-item stimlist, or the index or name of a stim in stimuli. It defaults to average of all stimuli if NULL.
#'
#' Visualise the template points with [draw_tem()] to determine which to align, using pt.shape = "index".
#'
#' @param stimuli list of stimuli
#' @param pt1 The first point to align (defaults to 0)
#' @param pt2 The second point to align (defaults to 1)
#' @param x1,y1,x2,y2 The coordinates to align the first and second point to
#' @param width,height The dimensions of the aligned images
#' @param ref_img The reference image to get coordinates and dimensions from if they are NULL
#' @param fill background color if cropping goes outside the original image, see [color_conv()]
#' @param procrustes logical; whether to use procrustes alignment
#'
#' @return list of stimuli with aligned images and/or templates
#' @export
#' @family manipulators
#'
#' @examples
#' # align eye points to specific x and y coordinates
#' # in a 300x300 pixel image
#' demo_unstandard(1:3) |>
#' align(pt1 = 0, pt2 = 1,
#' x1 = 100, x2 = 200, y1 = 100, y2 = 100,
#' width = 300, height = 300)
#'
#' \donttest{
#' orig <- demo_unstandard()
#'
#' # align to bottom-centre of nose (average position)
#' align(orig, pt1 = 55, pt2 = 55, fill = "dodgerblue")
#'
#' # align to pupils of second image
#' align(orig, ref_img = 2, fill = "dodgerblue")
#' }
#'
#' \dontrun{
#' # procrustes align to average position
#' # this requires XQuartz on mac and may not run on linux
#' align(orig, procrustes = TRUE, fill = "dodgerblue")
#' }
align <- function(stimuli, pt1 = 0, pt2 = 1,
x1 = NULL, y1 = NULL, x2 = NULL, y2 = NULL,
width = NULL, height = NULL, ref_img = NULL,
fill = wm_opts("fill"),
procrustes = FALSE) {
stimuli <- require_tems(stimuli)
# if values NULL, default to ref_img points
if (is.null(ref_img)) {
avg <- average_tem(stimuli)
ref_points <- avg[[1]]$points
width <- width %||% width(avg)[[1]]
height <- height %||% height(avg)[[1]]
} else if (is.list(ref_img)) {
ref_img <- require_tems(ref_img)
ref_points <- ref_img[[1]]$points
width <- width %||% ref_img[[1]]$width
height <- height %||% ref_img[[1]]$height
} else {
ref_points <- stimuli[[ref_img]]$points
width <- width %||% stimuli[[ref_img]]$width
height <- height %||% stimuli[[ref_img]]$height
}
x1 <- x1 %||% ref_points[1, pt1+1]
y1 <- y1 %||% ref_points[2, pt1+1]
x2 <- x2 %||% ref_points[1, pt2+1]
y2 <- y2 %||% ref_points[2, pt2+1]
if (pt1 == pt2) {
# align points are the same, so no resize needed,
# make sure that left and right align coordinates are the same
x2 <- x1
y2 <- y1
}
# procrustes align ----
if (isTRUE(procrustes)) {
# procrustes align coordinates
data <- tems_to_array(stimuli)
coords <- procrustes_coords(data, ref_img)
# calculate size multiplier
orig_avg <- apply(data, c(1, 2), mean)
pro_avg <- apply(coords, c(1, 2), mean)
oEyeWidth <- (orig_avg[pt1+1, ] - orig_avg[pt2+1, ])^2 |>
sum() |> sqrt()
pEyeWidth <- (pro_avg[pt1+1, ] - pro_avg[pt2+1, ])^2 |>
sum() |> sqrt()
mult <- oEyeWidth/pEyeWidth
# resize and convert to webmorph coordinates
pts <- dim(coords)[[1]]
for (i in 1:dim(coords)[[3]]) {
coords[, , i] <- coords[, , i] *
rep(c(mult, -mult), each = pts) +
rep(c(width/2, height/2), each = pts)
}
# get align points for each image
x1 <- coords[pt1+1, 1, ]
x2 <- coords[pt2+1, 1, ]
y1 <- coords[pt1+1, 2, ]
y2 <- coords[pt2+1, 2, ]
}
fill <- sapply(fill, color_conv)
suppressWarnings({
n <- length(stimuli)
x1 <- rep_len(x1, n)
y1 <- rep_len(y1, n)
x2 <- rep_len(x2, n)
y2 <- rep_len(y2, n)
fill <- rep_len(fill, n)
})
# calculate rotation and resize parameters for each image
rotate <- rep_len(0, n)
newsize <- rep_len(1, n)
for (i in seq_along(stimuli)) {
# calculate aligned rotation and inter-point width
rotate_aligned <- ifelse(x1[i] - x2[i] == 0, pi/2,
atan((y1[i] - y2[i])/(x1[i] - x2[i])))
aEyeWidth = sqrt(`^`(x1[i]-x2[i], 2) + `^`(y1[i]-y2[i], 2))
o_pts <- stimuli[[i]]$points
ox1 <- o_pts[1, pt1+1]
oy1 <- o_pts[2, pt1+1]
ox2 <- o_pts[1, pt2+1]
oy2 <- o_pts[2, pt2+1]
# calculate rotation
rotate_orig <- ifelse(ox1 - ox2 == 0, pi/2,
atan((oy1 - oy2)/(ox1 - ox2)))
rotate[i] <- -(rotate_orig - rotate_aligned) / (pi/180)
# calculate resize needed
oEyeWidth <- sqrt(`^`(ox1-ox2, 2) + `^`(oy1-oy2, 2))
newsize[i] <- ifelse(aEyeWidth == 0 || oEyeWidth == 0,
1, aEyeWidth / oEyeWidth)
}
newstimuli <- stimuli |>
rotate(degrees = rotate, fill = fill) |>
resize(newsize)
# recalculate eye position for cropping
x_off <- c()
y_off <- c()
for (i in seq_along(newstimuli)) {
n_pts <- newstimuli[[i]]$points
newx1 <- n_pts[1, pt1+1]
newy1 <- n_pts[2, pt1+1]
x_off[i] = newx1 - x1[i]
y_off[i] = newy1 - y1[i]
}
crop(newstimuli, width = width, height = height,
x_off = x_off/width(newstimuli), # in case some values < 1
y_off = y_off/height(newstimuli),
fill = fill)
}
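# Worked illustration of the per-image rotation/resize step in align()
# (hypothetical numbers, not taken from any bundled stimulus): if the target
# align points are (100, 100) and (200, 100), and an image's corresponding
# points sit at (90, 120) and (190, 140), then
# rotate_aligned <- atan(0 / 100)            # 0 rad (target points are level)
# rotate_orig   <- atan(-20 / -100)          # ~0.197 rad
# -(rotate_orig - rotate_aligned) / (pi/180) # ~ -11.3 degrees of rotation
# 100 / sqrt(100^2 + 20^2)                   # newsize ~0.98, a slight shrink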
#' Procrustes align templates
#'
#' @param data Template points
#' @param ref_img Reference image
#'
#' @return coordinates
#' @keywords internal
#'
procrustes_coords <- function(data, ref_img = NULL) {
suppressMessages(requireNamespace("geomorph"))
n <- dim(data)[3]
if (is.null(ref_img)) {
# calculate average and add as img 1
avg <- apply(data, c(1, 2), mean) |>
array(dim = dim(data) - c(0, 0, 1))
newdata <- array(c(avg, data), dim = dim(data) + c(0, 0, 1))
data_i <- 2:(n+1)
} else {
# move ref image to first position
ref <- data[, , ref_img, drop = FALSE]
noref <- data[, , -ref_img, drop = FALSE]
newdata <- array(c(ref, noref), dim = dim(data))
if (ref_img == 1) {
data_i <- 1:n
} else {
data_i <- c(2:ref_img, 1, (ref_img+1):n)[1:n]
}
}
gpa <- geomorph::gpagen(newdata, PrinAxes = FALSE,
print.progress = FALSE)
coords <- gpa$coords[, , data_i]
# if (rotate != "guess") {
# rotate <- as.numeric(rotate)
#
# if (rotate == 90) {
# coords <- geomorph::rotate.coords(gpa$coords, "rotateC")
# } else if (rotate == 180) {
# coords <- geomorph::rotate.coords(gpa$coords, "rotateC") |>
# geomorph::rotate.coords("rotateC")
# } else if (rotate == 270) {
# coords <- geomorph::rotate.coords(gpa$coords, "rotateCC")
# } else { # rotate == 0 or anything else
# coords <- gpa$coords
# }
# } else {
# ### otherwise guess best rotation ----
#
# # calculate average face
# orig_avg <- apply(data, c(1, 2), mean)
# pro_avg <- apply(gpa$coords, c(1, 2), mean)
#
# all_pts <- expand.grid(x = 1:nrow(orig_avg),
# y = 1:nrow(orig_avg)) |>
# dplyr::filter(x != y) |>
# dplyr::sample_n(min(100, nrow(.)))
#
# angles <- list(
# o = orig_avg,
# p0 = pro_avg,
# p1 = geomorph::rotate.coords(pro_avg, "rotateC"),
# p2 = geomorph::rotate.coords(pro_avg, "rotateC") |>
# geomorph::rotate.coords("rotateC"),
# p3 = geomorph::rotate.coords(pro_avg, "rotateCC")
# ) |>
# lapply(function(coords) {
# mapply(function(pt1, pt2) { angle_from_2pts(coords, pt1, pt2) },
# all_pts$x, all_pts$y)
# })
#
# dd <- data.frame(
# p0 = angles$p0 - angles$o,
# p1 = angles$p1 - angles$o,
# p2 = angles$p2 - angles$o,
# p3 = angles$p3 - angles$o
# ) |>
# # take care of values near +-2pi (better way?)
# dplyr::mutate_all(function(x) {
# x <- ifelse(x > 2*pi, x - (2*pi), x)
# x <- ifelse(x > 0, x - (2*pi), x)
# x <- ifelse(x < -2*pi, x + (2*pi), x)
# x <- ifelse(x > 0, x + (2*pi), x)
# x
# }) |>
# dplyr::summarise_all(mean)
#
# min_diff <- as.list(dd) |> sapply(abs) |> .subset(. == min(.)) |> names()
# #message("rotation: ", min_diff)
#
# if (min_diff == "p1") {
# coords <- geomorph::rotate.coords(gpa$coords, "rotateC")
# } else if (min_diff == "p2") {
# coords <- geomorph::rotate.coords(gpa$coords, "rotateC") |>
# geomorph::rotate.coords("rotateC")
# } else if (min_diff == "p3") {
# coords <- geomorph::rotate.coords(gpa$coords, "rotateCC")
# } else { # min_diff == "p0" or anything else
# coords <- gpa$coords
# }
# }
dimnames(coords) <- dimnames(data)
return(coords)
}
#' #' @export
#' #' @rdname align
#' align_1point <- function(stimuli, pt = 0,
#' x = NULL, y = NULL,
#' width = NULL, height = NULL, ref_img = NULL,
#' fill = wm_opts("fill")) {
#' align(stimuli, pt, pt, x, y, x, y,
#' width, height, ref_img, fill, FALSE)
#' }
#'
#' #' @export
#' #' @rdname align
#' align_2point <- function(stimuli, pt1 = 0, pt2 = 1,
#' x1 = NULL, y1 = NULL, x2 = NULL, y2 = NULL,
#' width = NULL, height = NULL, ref_img = NULL,
#' fill = wm_opts("fill")) {
#' align(stimuli, pt1, pt2, x1, y1, x2, y2,
#' width, height, ref_img, fill, FALSE)
#' }
#'
#' #' @export
#' #' @rdname align
#' align_procrustes <- function(stimuli,
#' width = NULL, height = NULL, ref_img = NULL,
#' fill = wm_opts("fill")) {
#' align(stimuli, 0, 1, NULL, NULL, NULL, NULL,
#' width, height, ref_img, fill, TRUE)
#' }
# #' Get angle from 2 points
# #'
# #' @param coords The coordinate array
# #' @param pt1 The first point
# #' @param pt2 The second point
# #'
# #' @return double of angle in radians
# #' @keywords internal
# #'
# angle_from_2pts <- function(coords, pt1 = 1, pt2 = 2) {
# x1 <- coords[[pt1,1]]
# x2 <- coords[[pt2,1]]
# y1 <- coords[[pt1,2]]
# y2 <- coords[[pt2,2]]
#
# atan2(y1 - y2, x1 - x2) %% (2*pi)
# }
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/align.R
|
#' Create an animated gif from a list of stimuli
#'
#' @param stimuli list of stimuli
#' @param fps frames per second
#' @param loop how many times to loop the animation (0 = infinite)
#' @param rev whether to loop back and forth (TRUE) or in one direction (FALSE)
#'
#' @return magick image
#' @export
#' @family stim
#'
#' @examples
#' \donttest{
#' # slideshow of images (1/second)
#' demo_stim() |> animate()
#'
#' # rotate a face
#' degrees <- seq(0, 350, 10)
#' demo_stim(1) |>
#' mask() |>
#' rep(length(degrees)) |>
#' rotate(degrees) |>
#' animate(fps = 10)
#' }
animate <- function(stimuli, fps = 1, loop = 0, rev = FALSE) {
if (length(stimuli) < 2) {
stop("You need at least two images in the list to make an animated gif")
}
img <- get_imgs(stimuli)
# if looping back and forth
if (isTRUE(rev)) img <- c(img, rev(img))
x <- magick::image_animate(
img,
fps = fps,
dispose = "previous",
loop = loop,
optimize = TRUE)
new_stim(x, "animation.gif") |>
new_stimlist()
}
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/animate.R
|
#' Convert stimuli to a ggplot
#'
#' Convert a stimulus or list of stimuli into a ggplot, which can be further used with ggplot functions.
#'
#' @param stimuli list of stimuli
#' @param ... Additional arguments to pass to [plot_stim()] if stimuli contains more than 1 image
#'
#' @return a ggplot object
#' @export
#' @family viz
#'
#' @examples
#' stimuli <- demo_stim()
#' gg <- as_ggplot(stimuli)
#'
#' # add to ggplot object; coordinates are pixels
#' # (images are 500x500 each, plus 10px padding)
#' gg +
#' ggplot2::geom_vline(xintercept = 0, color = "red") +
#' ggplot2::geom_vline(xintercept = 1030, color = "blue") +
#' ggplot2::geom_hline(yintercept = 0, color = "green") +
#' ggplot2::geom_hline(yintercept = 520, color = "purple") +
#' ggplot2::annotate("point", x = 515, y = 260, size = 10) +
#' ggplot2::labs(
#' title = "This is a ggplot!",
#' caption = "Made with webmorphR"
#' )
as_ggplot <- function(stimuli, ...) {
stimuli <- as_stimlist(stimuli)
if (length(stimuli) > 1) {
stimuli <- plot(stimuli, ...)
}
img <- stimuli[[1]]$img
magick::image_ggplot(img)
}
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/as_ggplot.R
|
#' Auto-Delineation
#'
#' Automatically delineate faces using Face++ (an external service). Since each delineation counts against a daily limit, you need to set up your own Face++ account (see details below).
#'
#' @details
#' To use Face++ auto-delineation, you need to get your own free API key from https://www.faceplusplus.com. After signing up for an account, go to https://console.faceplusplus.com/app/apikey/list and request a free API key. Add the key and secret to your .Renviron file as follows:
#'
#' FACEPLUSPLUS_KEY="1234567890abcdefghijk"
#'
#' FACEPLUSPLUS_SECRET="1234567890abcdefghijk"
#'
#' @param stimuli list of stimuli
#' @param model Which model (fpp106, fpp83)
#' @param replace logical; whether to replace original templates - if FALSE, only gets templates for images with no template
#' @param face which face to delineate in each image if there is more than 1
#'
#' @return list of stimuli with templates
#' @export
#' @family tem
#'
#' @examples
#' \dontrun{
#' # requires an API key in .Renviron
#' auto_fpp106 <- demo_stim() |>
#' auto_delin(model = "fpp106", replace = TRUE)
#' }
auto_delin <- function(stimuli,
model = c("fpp106", "fpp83"),
replace = FALSE,
face = 1) {
stimuli <- as_stimlist(stimuli)
model <- match.arg(model)
# find out which stimuli need tems ----
if (isTRUE(replace)) stimuli <- remove_tem(stimuli)
notems <- sapply(stimuli, `[[`, "points") |> sapply(is.null)
if (all(notems == FALSE)) {
warning("No images needed templates; set replace = TRUE to replace existing templates")
return(stimuli)
}
# check for required things ----
## face++ version ----
url <- "https://api-us.faceplusplus.com/facepp/v3/detect"
key <- Sys.getenv("FACEPLUSPLUS_KEY")
secret <- Sys.getenv("FACEPLUSPLUS_SECRET")
if (key == "" || secret == "") {
stop("You need to set FACEPLUSPLUS_KEY and FACEPLUSPLUS_SECRET in your .Renviron (see ?auto_delin)")
}
data <- list(
'api_key' = key,
'api_secret' = secret,
'return_landmark' = ifelse(model == "fpp106", 2, 1),
'image_file' = NULL
)
# save images to temp file ----
tempdir <- tempfile()
paths <- stimuli |>
remove_tem() |>
write_stim(tempdir, format = "jpg", overwrite = TRUE) |>
unlist()
face <- rep(face, length.out = length(stimuli))
if (wm_opts("verbose")) {
pb <- progress::progress_bar$new(
total = length(stimuli), clear = FALSE,
format = "Autodelineating [:bar] :current/:total :elapsedfull"
)
pb$tick(0)
Sys.sleep(0.5)
pb$tick(0)
}
# get line definitions
fpp <- tem_def(model)
## Face++ model ----
for (i in seq_along(stimuli)) {
imgname <- names(stimuli)[[i]]
data$image_file <- file.path(tempdir, paste0(imgname, ".jpg")) |>
httr::upload_file()
r <- httr::POST(url, body = data)
resp <- httr::content(r)
if (!is.null(resp$error_message)) {
e <- resp$error_message |>
paste(collapse = "\n")
e <- paste0(imgname, ": ", e)
warning(e, call. = FALSE)
} else {
# put in order from fpp
which_face <- face[i]
if (length(resp$faces) < which_face) {
warning(imgname, " did not have ", which_face, " faces", call. = FALSE)
which_face <- 1
}
alpha_order <- resp$faces[[which_face]]$landmark
order <- sapply(fpp$points$name, function(x) {
which(x == names(alpha_order))
})
landmarks <- alpha_order[order]
# set up points matrix
x <- sapply(landmarks, `[[`, "x")
y <- sapply(landmarks, `[[`, "y")
dnames <- list(c("x", "y"), names(landmarks))
stimuli[[i]]$points <- matrix(c(x, y), nrow = 2, byrow = TRUE, dimnames = dnames)
# set up lines from fpp
stimuli[[i]]$lines <- fpp$lines
stimuli[[i]]$closed <- fpp$closed
}
if (wm_opts("verbose")) pb$tick()
}
stimuli
}
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/auto_delin.R
|
#' Average templates
#'
#' This function averages only the templates. An average image is returned, but it is simply all the component images superimposed. To create a template-aware average, see [avg()].
#'
#' @param stimuli list of stimuli
#' @param name Name for the average
#'
#' @return list of stimuli consisting of just the average
#' @export
#' @family tem
#'
#' @examples
#' tem_only_avg <- demo_stim() |> average_tem()
#'
#' # view the average template
#' draw_tem(tem_only_avg, bg = "white")
#'
#' # view the superimposed image
#' tem_only_avg
average_tem <- function(stimuli, name = "average") {
stimuli <- require_tems(stimuli, TRUE)
# dim is coord (x/y), pt_i, tem_n
pt <- sapply(stimuli, `[[`, "points", simplify = "array")
avg <- apply(pt, c(1, 2), mean)
w <- width(stimuli) |> mean()
h <- height(stimuli) |> mean()
img <- crop(stimuli, w, h) |> get_imgs()
stim <- new_stim(
magick::image_average(img),
name,
points = avg,
lines = stimuli[[1]]$lines,
closed = stimuli[[1]]$closed
)
as_stimlist(stim)
}
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/average_tem.R
|
#' Average Images
#'
#' Create an average from a list of delineated stimuli.
#'
#' @details
#'
#' ### Normalisation options
#'
#' * none: the average template has each coordinate set to the mathematical average of that coordinate across the component templates
#' * twopoint: all images are first aligned on the 2 alignment points designated in `normpoint`, which are placed at their positions in the first image of the stimuli
#' * rigid: Procrustes-aligns all images to the position of the first image in the stimuli
#'
#' ### Texture
#'
#' This applies a representative texture to the average, resulting in composite images with more realistic texture instead of the very smooth, bland texture most other averaging programs create. See the papers below for methodological details.
#'
#' B. Tiddeman, M. Stirrat and D. Perrett (2005). Towards realism in facial prototyping: results of a wavelet MRF method. Theory and Practice of Computer Graphics.
#'
#' B. Tiddeman, D.M. Burt and D. Perrett (2001). Computer Graphics in Facial Perception Research. IEEE Computer Graphics and Applications, 21(5), 42-50.
#'
#' @param stimuli list of stimuli to average
#' @param texture logical; whether texture should be averaged
#' @param norm how to normalise; see Details
#' @param normpoint points for twopoint normalisation
#'
#' @return list of stimuli with the average image and template
#' @export
#' @family webmorph
#'
#' @examples
#' \donttest{
#' if (webmorph_up()) {
#' demo_stim() |> avg()
#' }
#' }
avg <- function(stimuli,
texture = TRUE,
norm = c("none", "twopoint", "rigid"),
normpoint = 0:1) {
stimuli <- require_tems(stimuli, TRUE)
if (length(stimuli) > 100) {
stop("We can't average more than 100 images at a time. You can create sub-averages with equal numbers of faces and average those together.")
}
if (!webmorph_up()) {
stop("Webmorph.org can't be reached. Check if you are connected to the internet.")
}
norm <- match.arg(norm)
format <- "jpg" #match.arg(format)
# save images locally
tdir <- tempfile()
files <- write_stim(stimuli, tdir, format = "jpg") |> unlist()
upload <- lapply(files, httr::upload_file)
names(upload) <- sprintf("upload[%d]", 0:(length(upload)-1))
settings <- list(
texture = ifelse(isTRUE(as.logical(texture)), "true", "false"),
norm = norm,
normPoint0 = normpoint[[1]],
normPoint1 = normpoint[[2]],
format = format
)
# send request to webmorph and handle zip file
ziptmp <- paste0(tdir, "/avg.zip")
httr::timeout(30 + 10*length(stimuli))
httr::set_config( httr::config( ssl_verifypeer = 0L ) )
url <- paste0(wm_opts("server"), "/scripts/webmorphR_avg")
r <- httr::POST(url, body = c(upload, settings) ,
httr::write_disk(ziptmp, TRUE))
utils::unzip(ziptmp, exdir = paste0(tdir, "/avg"))
#resp <- httr::content(r)
avg <- paste0(tdir, "/avg") |>
read_stim() |>
rename_stim("avg")
unlink(tdir, recursive = TRUE) # clean up temp directory
avg
}
#' Check if webmorph.org is available
#'
#' @export
#' @family webmorph
#' @examples
#' webmorph_up()
webmorph_up <- function() {
tryCatch({
paste0(wm_opts("server"), "/scripts/status") |>
httr::HEAD() |> httr::status_code() |> identical(200L)
}, error = function(e) {
return(FALSE)
})
}
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/avg.R
|
#' Make blank images
#'
#' @param n the number of images to return
#' @param width,height image dimensions
#' @param color background color, see [color_conv()]
#' @param names names of the images (appended with index if < n)
#'
#' @return list of stimuli
#' @export
#' @family stim
#'
#' @examples
#' stimuli <- blank(5, 100, 250, color = rainbow(5))
#'
#' label(stimuli, size = 20)
blank <- function(n = 1, width = 100, height = 100, color = "white", names = "img") {
# fix name length
if (length(names) < n && n > 1) {
idx <- as.character(n) |> nchar() |>
formatC(x = 1:n, digits = 0, flag = "0")
names <- rep_len(names, n)|> paste0("_", idx)
}
# fix color length
color <- sapply(color, color_conv) |> rep_len(n)
names(color) <- names
# make stim
stimuli <- lapply(color, function(color) {
new_stim(magick::image_blank(width, height, color))
})
class(stimuli) <- c("stimlist", "list")
stimuli
}
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/blank.R
|
#' Color to Lab Conversion
#'
#' R color to Lab colourspace conversion. Calculated with Observer = 2°, Illuminant = D65.
#'
#' @details
#' The formulas used to convert from RGB to XYZ and XYZ to Lab are from http://www.easyrgb.com/en/math.php and the reference values are from http://www.brucelindbloom.com/index.html?ColorCheckerCalcHelp.html
#'
#' @param col vector of hex or color names
#' @param ref_X,ref_Y,ref_Z Reference values for Observer= 2°, Illuminant= D65
#'
#' @return vector of L, a and b values
#' @export
#' @keywords internal
#' @family color
#'
#' @examples
#' col2lab("red")
#' col2lab("#FF0000")
#'
col2lab <- function(col, ref_X = 95.047, ref_Y = 100.000, ref_Z = 108.883) {
# Observer= 2°, Illuminant= D65
# see http://www.brucelindbloom.com/index.html?ColorCheckerCalcHelp.html for other values
# RGB to XYZ via http://www.easyrgb.com/index.php?X=MATH&H=02#text2
#change to 0-1
rgb <- grDevices::col2rgb(col)/255
# inverse sRGB companding
rgb2 <- apply(rgb, 1, function(v) {
100 * ifelse( v > 0.04045,
`^`( (( v + 0.055 ) / 1.055 ), 2.4),
v / 12.92
)
}) |> matrix(nrow = ncol(rgb)) # for 1-pixel images
# Observer. = 2°, Illuminant = D65
X = rgb2[, 1] * 0.4124 + rgb2[, 2] * 0.3576 + rgb2[, 3] * 0.1805
Y = rgb2[, 1] * 0.2126 + rgb2[, 2] * 0.7152 + rgb2[, 3] * 0.0722
Z = rgb2[, 1] * 0.0193 + rgb2[, 2] * 0.1192 + rgb2[, 3] * 0.9505
# XYZ to CieL*ab via http://www.easyrgb.com/index.php?X=MATH&H=07#text7
xyz <- list(
'x' = X / ref_X,
'y' = Y / ref_Y,
'z' = Z / ref_Z
) |> lapply(function(v) {
ifelse( v > 0.008856,
`^`(v, 1/3 ),
(7.787 * v) + (16 / 116))
})
c(
L = ( 116 * xyz$y ) - 16,
a = 500 * ( xyz$x - xyz$y ),
b = 200 * ( xyz$y - xyz$z )
)
}
#' Lab to RGB Conversion
#'
#' Lab colourspace to RGB conversion. Calculated with Observer = 2°, Illuminant = D65.
#'
#' @details
#' The formulas used to convert from Lab to XYZ and XYZ to RGB are from http://www.easyrgb.com/en/math.php and the reference values are from http://www.brucelindbloom.com/index.html?ColorCheckerCalcHelp.html
#'
#' @param lab vector of L, a and b values (as returned by [col2lab()])
#' @param ref_X,ref_Y,ref_Z Reference values for Observer= 2°, Illuminant= D65
#'
#' @return vector of red, green and blue values
#' @export
#' @keywords internal
#' @family color
#'
#' @examples
#' lab <- c(100, 0, 0)
#' lab2rgb(lab)
#'
#' lab <- col2lab("red")
#' lab2rgb(lab)
lab2rgb <- function(lab, ref_X = 95.047, ref_Y = 100.000, ref_Z = 108.883) {
# CIE-L*ab → XYZ via http://www.easyrgb.com/index.php?X=MATH&H=07#text7
# Reference-X, Y and Z refer to specific illuminants and observers.
# Common reference values are available below in this same page.
var_Y = ( lab[[1]] + 16 ) / 116
var_X = lab[[2]] / 500 + var_Y
var_Z = var_Y - lab[[3]] / 200
if ( var_Y^3 > 0.008856 ) {
var_Y = var_Y^3
} else {
var_Y = ( var_Y - 16 / 116 ) / 7.787
}
if ( var_X^3 > 0.008856 ) {
var_X = var_X^3
} else {
var_X = ( var_X - 16 / 116 ) / 7.787
}
if ( var_Z^3 > 0.008856 ) {
var_Z = var_Z^3
} else {
var_Z = ( var_Z - 16 / 116 ) / 7.787
}
X = var_X * ref_X
Y = var_Y * ref_Y
Z = var_Z * ref_Z
# X, Y and Z input refer to a D65/2° standard illuminant.
# sR, sG and sB (standard RGB) output range = 0 ÷ 255
var_X = X / 100
var_Y = Y / 100
var_Z = Z / 100
var_R = var_X * 3.2406 + var_Y * -1.5372 + var_Z * -0.4986
var_G = var_X * -0.9689 + var_Y * 1.8758 + var_Z * 0.0415
var_B = var_X * 0.0557 + var_Y * -0.2040 + var_Z * 1.0570
if ( var_R > 0.0031308 ) {
var_R = 1.055 * ( var_R ^ ( 1 / 2.4 ) ) - 0.055
} else{
var_R = 12.92 * var_R
}
if ( var_G > 0.0031308 ) {
var_G = 1.055 * ( var_G ^ ( 1 / 2.4 ) ) - 0.055
} else {
var_G = 12.92 * var_G
}
if ( var_B > 0.0031308 ) {
var_B = 1.055 * ( var_B ^ ( 1 / 2.4 ) ) - 0.055
} else {
var_B = 12.92 * var_B
}
rgb <- c(red = var_R,
green = var_G,
blue = var_B) * 255
round(rgb) |> pmax(0) |> pmin(255)
}
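# Round-trip sketch: col2lab() and lab2rgb() should be approximate inverses,
# up to small rounding differences from the sRGB companding:
# col2lab("dodgerblue") |> lab2rgb()
# grDevices::col2rgb("dodgerblue")[, 1] # reference: 30 144 255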
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/col2lab.R
|
#' Convert colors
#'
#' Convert from common color inputs to specified output type, adding alpha transparency for output formats that support it (hexa, rgba).
#'
#' @details
#' * color: one of the R colours listed in [grDevices::colors()], e.g., "red"
#' * hex: hexadecimal string, e.g., "#FF0000"
#' * hexa: hexadecimal string with alpha, e.g., "#FF0000FF"
#' * hex3: abbreviated hexadecimal string, e.g., "#F00"
#' * rgb: vector of red, green and blue values 0-255, e.g., c(255, 0, 0)
#' * rgba: vector of red, green, blue and alpha values 0-255, e.g., c(255, 0, 0, 255)
#' * lab: CIE-Lab color
#' * hsv: vector of hue, saturation and value values (0-1), e.g., c(h=0, s = 1, v = 1)
#'
#' @param color A color in one of the input formats (see Details)
#' @param alpha Alpha transparency (values <=1 converted to 0-255); ignored if color has alpha already
#' @param from,to Input and output color spaces, see `Details` below.
#'
#' @return color in \code{to} format
#' @export
#' @family color
#'
#' @examples
#' # different ways to input red
#' color_conv("red")
#' color_conv("#FF0000")
#' color_conv("#FF0000FF")
#' color_conv(c(255,0,0))
#' color_conv("rgb(255,0,0)") # you can use CSS-style text
#' color_conv(c(255,0,0,255))
#'
#' # Lab must have names or use text format to be guessed
#' color_conv(c(l = 53.2, a = 80.1, b = 67.2))
#' color_conv("lab(53.2,80.1,67.2)")
#'
#' # else, it will be guessed as rgb; fix by setting from explicitly
#' color_conv(c(53.2, 80.1, 67.2))
#' color_conv(c(53.2, 80.1, 67.2), from = "lab")
#'
#' # add 50% alpha transparency to dodgerblue
#' color_conv("dodgerblue", alpha = 0.5, to = "rgba")
#'
color_conv <- function(color, alpha = 1,
from = c("guess", "col", "hex", "hexa", "hex3", "rgb", "rgba", "lab"),
to = c("hexa", "hex", "rgba", "rgb", "lab", "hsv")) {
# handle NULL and NA
if (is.null(color) || any(is.na(color))) {
return(NULL)
} else if (any(color == "none")) {
return("none")
}
from <- match.arg(from)
to <- match.arg(to)
# guess colour type ----
if (from == "guess") {
from <- dplyr::case_when(
length(color) == 3 && !is.null(names(color)) &&
all(tolower(names(color)) %in% c("l", "a", "b")) ~ "lab",
length(color) == 3 && all(color <= 255) && all(color>=0) ~ "rgb",
length(color) == 4 ~ "rgba",
all(grepl("^rgb\\(.*\\)$", color)) ~ "rgb",
all(grepl("^rgba\\(.*\\)$", color)) ~ "rgba",
all(color %in% colors()) ~ "col",
all(grepl("^#[0-9A-Fa-f]{6}$", color)) ~ "hex",
all(grepl("^#[0-9A-Fa-f]{8}$", color)) ~ "hexa",
all(grepl("^#[0-9A-Fa-f]{3}$", color)) ~ "hex3",
all(grepl("^lab\\(.*\\)$", color, ignore.case = TRUE)) ~ "lab",
NULL
)
}
if (from %in% c("rgb", "rgba", "lab") && is.character(color)) {
from <- paste0(from, "_text")
}
if (is.null(from) || is.na(from)) {
stop("The color format could not be guessed.")
}
# convert alpha to integer between 0 and 255
alpha255 <- ifelse(alpha > 1, alpha, alpha * 255) |>
round() |> pmax(0) |> pmin(255) |>
.subset2(1) # remove if vectorised, currently only 1 alpha
# convert to rgba ----
rgba <- switch(
from,
rgba = color,
rgb = c(color, alpha255),
lab = lab2rgb(color) |> c(alpha = 255),
rgb_text = gsub("(rgb|\\(|\\)|\\s)", "", color) |>
strsplit(",") |> .subset2(1) |>
as.numeric() |> c(alpha255),
lab_text = gsub("(lab|\\(|\\)|\\s)", "", color, ignore.case = TRUE) |>
strsplit(",") |> .subset2(1) |>
as.numeric() |> lab2rgb() |> c(alpha255),
rgba_text = gsub("(rgba|\\(|\\)|\\s)", "", color) |>
strsplit(",") |> .subset2(1) |>
as.numeric(),
col = grDevices::col2rgb(color) |> c(alpha255),
hex = grDevices::col2rgb(color) |> c(alpha255),
hex3 = strsplit(color, "")[[1]] |>
rep(each = 2) |>
paste(collapse = "") |>
substr(2,8) |>
grDevices::col2rgb() |>
c(alpha255),
hexa = grDevices::col2rgb(color, alpha = TRUE)
)
# convert to output
if (to == "rgba") {
rgba[1:4]
} else if (to == "rgb") {
rgba[1:3]
} else if (to == "lab") {
hex <- rgba[1:3] |> format.hexmode(width = 2) |> paste(collapse = "")
col2lab(paste0("#", hex))
} else if (to == "hsv") {
(grDevices::rgb2hsv(rgba[1:3]) |> t())[1,]
} else if (to == "hex") {
hex <- rgba[1:3] |>
format.hexmode(width = 2) |>
paste(collapse = "")
paste0("#", hex)
} else if (to == "hexa") {
hex <- rgba[1:4] |>
format.hexmode(width = 2) |>
paste(collapse = "")
paste0("#", hex)
}
}
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/color_conv.R
|
#' Image Comparison
#'
#' This is just a convenient way to use [magick::image_compare()] with webmorph stimuli. It defaults to the "MSE" metric, which gives a linearly increasing score to images along a morph continuum.
#'
#'
#' Metric Types
#'
#' * Undefined: ?
#' * AE: Absolute Error
#' * Fuzz: ?
#' * MAE: Mean Absolute Error
#' * MEPP: Mean Error Per Pixel
#' * MSE: Mean Squared Error
#' * NCC: Normalized Cross Correlation
#' * PAE: Peak Absolute Error
#' * PHASH: Perceptual Hash
#' * PSNR: Peak Signal-to-Noise Ratio
#' * RMSE: Root Mean Squared Error
#'
#' How these metrics behave when comparing a morph continuum to its first image.
#'
#' Increases with morph distance:
#'
#' * very strong negative exponential decay at 0 fuzz; more linear with higher fuzz: AE
#' * strong negative exponential decay: PAE
#' * slight negative exponential decay: Fuzz, RMSE
#' * linear: MAE, MEPP, MSE
#' * no idea: PHASH
#'
#' Decreases with morph distance:
#'
#' * linear: NCC, Undefined
#' * slight exponential decay: PSNR
#'
#'
#' @param stimuli Stimuli to compare to the ref_stim
#' @param ref_stim A stim, 1-item stimlist, or the name or index of the comparison item in stim
#' @param metric string with a metric from magick::metric_types(): "Undefined", "AE", "Fuzz", "MAE", "MEPP", "MSE", "NCC", "PAE", "PHASH", "PSNR", "RMSE"
#' @param fuzz relative color distance (value between 0 and 100) to be considered similar in the filling algorithm (only useful for AE)
#' @param scale whether to scale the values so that the maximum value is 1 and the minimum is 0 (only useful when stimuli contains more than 1 image and includes ref_stim)
#'
#' @return Difference metric
#' @export
#' @family info
#'
#' @examples
#' stimuli <- demo_stim()
#' compare(stimuli, stimuli$m_multi)
#' compare(stimuli, stimuli$m_multi, "AE")
#' compare(stimuli, stimuli$m_multi, "AE", fuzz = 5)
compare <- function(stimuli, ref_stim, metric = "MSE", fuzz = 0, scale = FALSE) {
stimuli <- as_stimlist(stimuli)
img1 <- get_imgs(stimuli)
if (is.numeric(ref_stim) | is.character(ref_stim)) {
img2 <- stimuli[[ref_stim[[1]]]]$img
} else {
img2 <- as_stimlist(ref_stim)[[1]]$img
}
comp <- magick::image_compare(img1, img2, metric, fuzz)
diff <- attr(comp, "distortion")
names(diff) <- names(stimuli)
non_inf <- diff[!is.infinite(diff)]
if (scale && (max(non_inf)-min(non_inf)) > 0) {
scalediff <- (diff - min(diff)) / (max(non_inf)-min(non_inf))
# scalediff[is.infinite(scalediff)] <- 1.0
scalediff
} else {
diff
}
}
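# Usage sketch: with scale = TRUE the distortion values are rescaled so the
# smallest value becomes 0 and the largest becomes 1, which is only
# meaningful when comparing several stimuli at once:
# stimuli <- demo_stim()
# compare(stimuli, stimuli$f_multi, metric = "MSE", scale = TRUE)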
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/compare.R
|
#' Morph between two images
#'
#' Morph from one image to another in the specified steps.
#'
#' @param from_img image to start at
#' @param to_img image to end at
#' @param from starting percentage
#' @param to ending percentage
#' @param by step size
#' @param ... arguments to pass to [trans()]
#'
#' @return a list of stimuli containing each step of the continuum
#' @export
#' @family webmorph
#'
#' @examples
#' \donttest{
#' if (webmorph_up()) {
#' stimuli <- demo_stim()
#' cont <- continuum(stimuli$f_multi, stimuli$m_multi)
#'
#' # create an animated gif
#' animate(cont, fps = 10, rev = TRUE)
#' }
#' }
continuum <- function(from_img, to_img, from = 0, to = 1, by = 0.1, ...) {
steps <- seq(from, to, by)
trans(from_img, from_img, to_img, steps, steps, steps, ...)
}
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/continuum.R
|
#' Convert templates
#'
#' @param stimuli list of stimuli
#' @param from id of template definition of stimlist images
#' @param to id of template definition to convert to
#'
#' @return list of stimuli with converted templates
#' @export
#' @keywords internal
convert_tem <- function(stimuli,
from = c("guess", "frl", "fpp106", "fpp83", "dlib70", "dlib7"),
to = c("frl", "fpp106", "fpp83", "dlib70", "dlib7")) {
stimuli <- require_tems(stimuli)
from <- match.arg(from)
to <- match.arg(to)
if (from == "guess") {
pt_n <- lapply(stimuli, `[[`, "points") |> sapply(ncol)
from <- dplyr::case_when(
pt_n == 189 ~ "frl",
pt_n == 106 ~ "fpp106",
pt_n == 83 ~ "fpp83",
pt_n == 70 ~ "dlib70",
pt_n == 7 ~ "dlib7",
TRUE ~ "x"
)
if (any(from == "x")) {
notem <- paste(names(stimuli[from == "x"]), collapse = ", ")
stop("Some images don't have a recognised template: ", notem)
}
}
  from <- rep(from, length.out = length(stimuli))
to_tem <- tem_def(to)
pt <- array(c(to_tem$points$x, to_tem$points$y),
c(nrow(to_tem$points), 2),
dimnames = list(
to_tem$points$name,
c("x", "y")
)) |> t()
for (i in seq_along(stimuli)) {
if (to == from[i]) next # skip if the same
from_tem <- tem_def(from[i])
stimuli[[i]]$lines <- to_tem$lines
stimuli[[i]]$closed <- to_tem$closed
# 2-pt align default template to face
old_points <- stimuli[[i]]$points
stimuli[[i]]$points <- pt
pt1 <- from_tem$align_pts[[1]] + 1
pt2 <- from_tem$align_pts[[2]] + 1
new_stim <- align(stimuli[i],
pt1 = to_tem$align_pts[[1]],
pt2 = to_tem$align_pts[[2]],
x1 = old_points['x', pt1],
x2 = old_points['x', pt2],
y1 = old_points['y', pt1],
y2 = old_points['y', pt2])
stimuli[[i]]$points <- new_stim[[1]]$points
draw_tem(stimuli[[i]], pt.shape = "index", pt.size = 15)
# check for conversion
x_conversion <- paste0(from[i], ".x") |> tolower()
y_conversion <- paste0(from[i], ".y") |> tolower()
if (all(c(x_conversion, y_conversion) %in% names(to_tem$points))) {
from_x <- as.list(to_tem$points[[x_conversion]]) |>
lapply(`+`, 1)
to_x <- which(!sapply(from_x, is.null))
from_x <- from_x[to_x]
from_y <- as.list(to_tem$points[[y_conversion]]) |>
lapply(`+`, 1)
to_y <- which(!sapply(from_y, is.null))
from_y <- from_y[to_y]
# handle from_? with multiple points that need to be averaged
old_x <- sapply(from_x, function(x) {
mean(old_points['x', x])
})
old_y <- sapply(from_y, function(y) {
mean(old_points['y', y])
})
stimuli[[i]]$points['x', to_x] <- old_x
stimuli[[i]]$points['y', to_y] <- old_y
#draw_tem(stimuli[[i]], pt.shape = "index", pt.size = 15)
}
}
stimuli
}
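# Usage sketch (hypothetical pipeline, not run): the source layout is
# guessed from the number of template points, so freshly auto-delineated
# stimuli could be converted to the FRL layout with something like:
# auto_delin(stimuli, model = "fpp106", replace = TRUE) |>
#   convert_tem(to = "frl")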
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/convert_tem.R
|
#' Crop images and templates
#'
#' Remove or add margins to images and templates.
#'
#' @details
#' Dimensions and offsets can be set in pixels or proportions. For width and height, values less than 2 will be interpreted as proportions, otherwise pixels. For x_off and y_off, values between -1 and 1 are interpreted as proportions, otherwise pixels.
#'
#' Cropping is anchored at the image center (or calculated template centroid if there is no image) unless x_off or y_off are set.
#'
#' @param stimuli list of stimuli
#' @param width,height dimensions of cropped image in pixels or proportion (<2)
#' @param x_off,y_off offset in pixels or proportion (<1) (NULL centers cropped image)
#' @param fill background color if cropping goes outside the original image, see [color_conv()]
#'
#' @return list of stimuli with cropped tems and/or images
#' @export
#' @family manipulators
#'
#' @examples
#' stimuli <- demo_stim()
#'
#' # crop to 60% width and 80% height (centered)
#' crop(stimuli, width = .60, height = .80)
#'
#' # crop to upper right quadrant
#' crop(stimuli, .5, .5, x_off = .5, y_off = 0)
#'
#' \donttest{
#' # negative offset with fill
#' crop(stimuli, 260, 260,
#' x_off = -10, y_off = -10,
#' fill = c("red", "dodgerblue"))
#' }
crop <- function(stimuli,
width = 1.0, height = 1.0,
x_off = NULL, y_off = NULL,
fill = wm_opts("fill")) {
stimuli <- as_stimlist(stimuli)
fill <- sapply(fill, color_conv)
suppressWarnings({
l <- length(stimuli)
width <- rep(width, length.out = l)
height <- rep(height, length.out = l)
x_off <- rep(x_off, length.out = l)
y_off <- rep(y_off, length.out = l)
fill <- rep(fill, length.out = l)
})
## TODO: figure out the bug in this and move dim/offset calcs outside loop
# origw <- width(stimuli)
# w <- width %||% origw
# w <- ifelse(w < 2, w * origw, w)
#
# origh <- height(stimuli)
# h <- height %||% origh
# h <- ifelse(h < 2, h * origh, h)
#
# # handle null or missing offsets by centering
# x_off <- x_off %||% (origw - w)/2
# y_off <- y_off %||% (origh - h)/2
# x_off <- ifelse(is.na(x_off), (origw - w)/2, x_off)
# y_off <- ifelse(is.na(y_off), (origh - h)/2, y_off)
#
# # handle offsets < 1
# x_off <- ifelse(x_off < 1, x_off * origw, x_off)
# y_off <- ifelse(y_off < 1, y_off * origh, y_off)
for (i in seq_along(stimuli)) {
origw <- stimuli[[i]]$width
origh <- stimuli[[i]]$height
w <- width[i] %||% origw
h <- height[i] %||% origh
# handle percentages
if (w < 2) w <- w * origw
if (h < 2) h <- h * origh
# handle percentage offsets
if (!is.null(x_off[i]) && !is.na(x_off[i]) && abs(x_off[i]) < 1)
x_off[i] <- x_off[i] * origw
if (!is.null(y_off[i]) && !is.na(y_off[i]) && abs(y_off[i]) < 1)
y_off[i] <- y_off[i] * origh
# null offsets split the remainder between orig and new dimensions
if (is.null(x_off[i]) || is.na(x_off[i])) x_off[i] <- (origw - w)/2
if (is.null(y_off[i]) || is.na(y_off[i])) y_off[i] <- (origh - h)/2
# update width and height in case no image
# (gets updates from img below)
stimuli[[i]]$width <- w
stimuli[[i]]$height <- h
if (!is.null(stimuli[[i]]$img) &&
"magick-image" %in% class(stimuli[[i]]$img)) {
# crop doesn't handle negative offsets well
ga <- magick::geometry_area(
width = min(w, origw),
height = min(h, origh),
x_off = max(0, x_off[i]),
y_off = max(0, y_off[i])
)
newimg <- magick::image_crop(
image = stimuli[[i]]$img,
geometry = ga,
gravity = "NorthWest"
)
# make background image with fill
bg <- magick::image_blank(w, h, color = fill[i])
offset <- magick::geometry_point(
x = max(0, -x_off[i]),
y = max(0, -y_off[i])
)
stimuli[[i]]$img <- magick::image_composite(
image = bg,
composite_image = newimg,
operator = "Over",
offset = offset
)
# get dim from magick in case of rounding differences
info <- magick::image_info(stimuli[[i]]$img)
stimuli[[i]]$width <- info$width[[1]]
stimuli[[i]]$height <- info$height[[1]]
}
# crop template if present
pts <- stimuli[[i]]$points
if (!is.null(pts)) {
# benchmarked 50x faster than apply (still trivial)
stimuli[[i]]$points <- pts - c(x_off[i], y_off[i])
}
}
stimuli
}
#' Pad images
#'
#' Add padding to sides of stimuli. This is a convenience function to calculate offsets for [crop()].
#'
#' @details
#' The value for top is copied to bottom and right, and the value for right is copied to left, so setting only top produces a consistent border, while setting just top and right sets different borders for top-bottom and right-left. (This convention will be familiar if you use CSS.)
#'
#' Padding size values are interpreted as a proportion of width or height if less than 1.
#'
#' @param stimuli list of stimuli
#' @param top,right,bottom,left number of pixels or proportion (<1) to pad each side
#' @param ... additional arguments to pass to [crop()]
#'
#' @return list of stimuli
#' @export
#' @family manipulators
#'
#' @examples
#' stimuli <- demo_stim()
#'
#' # default 10-pixel padding
#' pad(stimuli, fill = "dodgerblue")
#'
#' \donttest{
#' # change pad width and set fill
#' pad(stimuli, 2, fill = "dodgerblue")
#'
#' # set top border to 10% height
#' # different colour for each image
#' pad(stimuli, 0.1, 1, 1, 1,
#' fill = c("hotpink", "dodgerblue"))
#' }
pad <- function(stimuli, top = 10, right = top, bottom = top, left = right, ...) {
stimuli <- as_stimlist(stimuli)
# convert if percents
top <- ifelse(abs(top) < 1, top * height(stimuli), top)
bottom <- ifelse(abs(bottom) < 1, bottom * height(stimuli), bottom)
left <- ifelse(abs(left) < 1, left * width(stimuli), left)
right <- ifelse(abs(right) < 1, right * width(stimuli), right)
stimuli |>
crop(width = width(stimuli) + left + right,
height = height(stimuli) + top + bottom,
x_off = - left, y_off = - top, ...
)
}
#' Crop to template boundaries and pad
#'
#' Calculate the maximum and minimum x and y coordinates across the stimuli (or for each stimulus) and crop all image to this plus padding.
#'
#' @param stimuli list of stimuli
#' @param top,right,bottom,left numeric; number of pixels or proportion (<1) to pad each side
#' @param each logical; Whether to calculate bounds for the full set (default) or each image separately
#' @param ... additional arguments to pass to [crop()]
#'
#' @return list of stimuli
#' @export
#' @family manipulators
#'
#' @examples
#' stimuli <- demo_stim()
#' ctem <- crop_tem(stimuli, each = TRUE)
#' draw_tem(ctem)
#'
#' \donttest{
#' # demo with different templates
#' stimuli <- demo_tems()
#'
#' # default 10 pixels around maximum template
#' crop_tem(stimuli)
#'
#' # crop specific to each image
#' crop_tem(stimuli, each = TRUE)
#' }
#'
crop_tem <- function(stimuli, top = 10, right = top, bottom = top, left = right, each = FALSE, ...) {
stimuli <- require_tems(stimuli)
b <- bounds(stimuli, each = each)
stimuli |>
crop(width = ceiling(b$max_x - b$min_x + left + right),
height = ceiling(b$max_y - b$min_y + top + bottom),
x_off = floor(b$min_x - left),
y_off = floor(b$min_y - top), ...
)
}
#' Get template bounds
#'
#' @param stimuli A stimlist
#' @param each Whether to calculate max and min for the full set (default) or each image separately
#'
#' @return A list of min and max x and y values
#' @export
#'
#' @examples
#' demo_stim() |> bounds() |> str()
#'
#' demo_stim() |> bounds(each = TRUE)
bounds <- function(stimuli, each = FALSE) {
stimuli <- require_tems(stimuli)
if (isTRUE(each)) {
# get separate bounds for each stimulus
b <- lapply(stimuli, bounds) |>
do.call(what = rbind)
return(b)
}
min_vals <- lapply(stimuli, `[[`, "points") |>
sapply(apply, 1, min) |>
apply(1, min)
max_vals <- lapply(stimuli, `[[`, "points") |>
sapply(apply, 1, max) |>
apply(1, max)
# only work if all tems are the same
# A <- tems_to_array(stimuli)
# x <- (A[, "X", ])
# y <- (A[, "Y", ]) * -1
data.frame(min_x = min_vals[["x"]],
max_x = max_vals[["x"]],
min_y = min_vals[["y"]],
max_y = max_vals[["y"]])
}
|
/scratch/gouwar.j/cran-all/cranData/webmorphR/R/crop.R
|