id | repo | title | body | labels | priority | severity |
---|---|---|---|---|---|---|
126,303,228 | nvm | Feature Request: npm management? | Nvm installs node and npm for a particular user, but the versions of node and npm are linked. It would be nice if there was a way to choose a different version of npm, in the same way you can change your node version. It seems like the easy and recommended way to update npm is via `npm install npm -g`, but this installs npm globally. If you don't have the need or ability to do a global install, it would be nice if nvm could install a newer/different version for you.
| feature requests | medium | Major |
126,561,682 | go | cmd/link: DW_AT_name should come from source code | gc is not generating DW_AT_name according to the spec.
DW_AT_name is "a string representing the name as it appears in the source program."
Mangled names should go in DW_AT_linkage_name.
I've seen names like these:
package.functionname for functions
package.(type).functionname for methods
package.functionname.func1 for closures
package.type for types
These should all be the linkage_name.
We should probably represent packages as DW_TAG_namespace. Functions and types should be children of the package. Methods should be children of the type. Anonymous functions should only have a linkage_name and not a DW_AT_name.
It would also be good to fix variables to represent what's in the source, e.g. no '&foo'.
| compiler/runtime | low | Minor |
126,565,814 | youtube-dl | youtube-dl should display ffmpeg errors | Whenever ffmpeg emits errors, youtube-dl should pass that feedback on to the user. As it is now, it only shows the last line: [ffmpeg] Correcting container in "A.m4a"
ERROR: file:A.temp.m4a: Invalid argument
But it would be better if it gave the whole error so that I would know that the error has to do with mp4 codec: Requested output format 'mp4' is not a suitable output format
file:A.temp.m4a: Invalid argument
| request,postprocessors | low | Critical |
126,568,108 | kubernetes | Implement alternative storage destination (for events) | Currently, events are stored in etcd; I think this is not a good approach, because there are always many more events than pods. If the cluster is big (`15000 pods, there maybe > 100K events`), etcd cannot handle it effectively, and then the e2e test (`density test`) may not work fine.
Why not send the events to a central manager (`like kafka`), or store them in a database?
| priority/backlog,sig/scalability,area/logging,sig/api-machinery,lifecycle/frozen | medium | Critical |
126,575,435 | TypeScript | Allow inline type predicates | From #5731
``` typescript
if (<foo is Element>(foo.nodeType === 1)) {
// Assume foo is Element here
}
```
A proposal from @sandersn: https://github.com/Microsoft/TypeScript/issues/5731#issuecomment-162586789
> ## Type Predicate Expressions
>
> A type predicate expression allows you to narrow a union type with a single expression. The syntax is
>
> > `<`_variable_ `is` _type_`>` _boolean-expression_
>
> When the expression is used in a conditional context, the true branch of the conditional narrows _variable_ to _type_, while the false branch narrows _variable_ by removing _type_.
>
> The type predicate expression is of type Boolean, with apparent type of type predicate. The right hand side must be an expression of type Boolean. The left hand side's variable must be a name bound in the current scope. The left hand side's type must be a name bound in the current scope.
> ### Open questions
> 1. How does variable capture work? Can a type predicate expression be returned from a function and used to narrow a variable that's no longer in scope? This could be defined to cause an error, but that restriction is neither obvious nor easy to use.
> 2. Why reuse the type assertion syntax? The type of the right expression is checked to be Boolean, unlike assertions, and the resulting type is still Boolean, also unlike assertions.
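For comparison, the closest thing available today is a named user-defined type guard function rather than an inline predicate; a minimal sketch (the helper name here is made up for illustration):
``` typescript
// A user-defined type guard: the `foo is Element` return type lets the
// compiler narrow the argument's type at the call site.
function isElement(foo: Node): foo is Element {
    return foo.nodeType === 1;
}

declare const foo: Node;
if (isElement(foo)) {
    // foo is narrowed to Element in this branch
    foo.getAttribute("id");
}
```
The proposal above would allow the same narrowing without having to extract the boolean check into a separate named function.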
| Suggestion,Needs Proposal | medium | Critical |
126,635,273 | youtube-dl | add support for prosiebenmaxx playlist files | hi,
could you please add support for playlists such as this one:
http://www.prosiebenmaxx.de/anime/playlist-animenacht
unfortunately the page doesn't allow getting the video URL out of the playlist (as YouTube does, for example), so for me there is no way to get the videos
thanks a lot in advance
| geo-restricted | low | Minor |
126,685,476 | rust | Struct and variant constructors are not generic over lifetimes like regular functions. | The following testcase produces the errors noted in comments and the last one is an ICE ([run on playpen](https://play.rust-lang.org/?gist=f5acaea83408a752d28d&version=nightly)):
``` rust
#![feature(fn_traits, unboxed_closures)]
fn test<F: for<'x> FnOnce<(&'x str,)>>(_: F) {}
struct Compose<F,G>(F,G);
impl<T,F,G> FnOnce<(T,)> for Compose<F,G>
where F: FnOnce<(T,)>, G: FnOnce<(F::Output,)> {
type Output = G::Output;
extern "rust-call" fn call_once(self, (x,): (T,)) -> G::Output {
(self.1)((self.0)(x))
}
}
struct Str<'a>(&'a str);
fn mk_str<'a>(s: &'a str) -> Str<'a> { Str(s) }
fn main() {
let _: for<'a> fn(&'a str) -> Str<'a> = mk_str;
// expected concrete lifetime, found bound lifetime parameter 'a
let _: for<'a> fn(&'a str) -> Str<'a> = Str;
test(|_: &str| {});
test(mk_str);
// expected concrete lifetime, found bound lifetime parameter 'x
test(Str);
test(Compose(|_: &str| {}, |_| {}));
test(Compose(mk_str, |_| {}));
// internal compiler error: cannot relate bound region:
// ReLateBound(DebruijnIndex { depth: 2 },
// BrNamed(DefId { krate: 0, node: DefIndex(6) => test::'x }, 'x(65)))
//<= ReSkolemized(0,
// BrNamed(DefId { krate: 0, node: DefIndex(6) => test::'x }, 'x(65)))
test(Compose(Str, |_| {}));
}
```
The same errors occur if an `enum Foo<'a> { Str(&'a str) }` is used.
Usage of `FnOnce<(T,)>` is to avoid providing an explicit type parameter for the `Output` associated type, as a workaround for #30867.
Originally found in @asajeffrey's [wasm](https://github.com/asajeffrey/wasm) repo (see my comment on asajeffrey/wasm#1).
cc @nikomatsakis @arielb1
| C-enhancement,A-lifetimes,T-compiler,F-unboxed_closures | low | Critical |
126,744,298 | TypeScript | Compiler "target" option capitalization in docs inconsistent with code and json schema | https://github.com/Microsoft/TypeScript/wiki/Compiler-Options
says `--target` can take the value `ES5` (uppercase).
Visual Studio Code warns about using the uppercase one in your `tsconfig.json`, presumably because it's following the schema.
The JSON schema for `tsconfig.json` also has it in lowercase, but the docs (https://github.com/Microsoft/TypeScript/wiki/tsconfig.json#details) refer you to the first page (which uses uppercase) for the values.
---
My guess at fixing the problem:
From the source it appears the value ought to be lowercase:
https://github.com/Microsoft/TypeScript/blob/master/src/compiler/commandLineParser.ts#L221
I think the fix is to use lowercase in the docs (the first page).
| Docs | low | Minor |
126,800,015 | kubernetes | Umbrella: Improvements to API documentation | The public API documentation on kubernetes.io is very challenging to understand compared with projects like Docker and OpenStack.
The first and most challenging issue is that all of the API operations are listed on one massive page; there isn't any grouping or index. I have to use the browser's find to search within the page and guess what I'm looking for.
The second issue is that the docs are automatically generated, so a lot of the descriptions are not helpful. For example, "proxy PUT requests to Pod": PUT /api/v1/proxy/namespaces/{namespace}/pods/{name} - I have no idea what this does or what it is supposed to do.
The third issue is that the 'required' markers in the models for each of the fields are not entirely correct. I was trying to create a pod. The docs (http://kubernetes.io/v1.1/docs/api-reference/v1/definitions.html#_v1_pod) say that a pod spec is required, but none of its fields are required. The API returns an error saying fields are missing. I had to use trial and error to discover which fields are required at a minimum (it turns out: metadata.name, containers[0].name and containers[0].image).
@caesarxuchao
@brendandburns
@bgrant0607
| priority/important-soon,kind/documentation,area/api,sig/api-machinery,sig/docs,lifecycle/frozen,wg/api-expression | medium | Critical |
126,800,062 | rust | Improve typed pretty printing | Currently you can use `--unpretty=hir,typed` to print out the HIR with type annotations in comments. This could be vastly improved so that it is actually useful for the user:
- use type ascription rather than comments
- map the types back from the HIR to the AST
- use the original program's formatting (or if that isn't possible, use rustfmt, rather than the pretty printer)
- output only a specified portion of the program, rather than the whole program
If we do all of those things, then this should probably be a tool external to the compiler. The first item could just fix `--unpretty=hir,typed`, the second could provide `--unpretty=typed`.
| A-pretty,C-enhancement,E-mentor | low | Major |
126,876,344 | go | x/tools/cmd/oracle: `describe` operation doesn't work with import "C" statements | Take this source for file `/devel/go-workspace/src/foo/foo.go`:
```
package main
/*
#include <time.h>
#include <stdlib.h>
*/
import "C"
import "fmt"
func Random() int {
return int(C.rand())
}
func Seed() {
C.srand(C.uint(C.time(nil)))
}
func main() {
Seed()
fmt.Print(Random())
}
```
Then trying to run oracle describe doesn't work:
`oracle -pos=/devel/go-workspace/src/foo/foo.go:#8,#8 describe ___`
This results in error:
`oracle: no buildable Go source files in /devel/go-workspace/src/foo`
| Tools | low | Critical |
126,985,126 | kubernetes | kubectl should suggest next kubectl commands | If I do `kubectl describe replicationcontroller/foo`, there is a good chance that the next thing I want to do is to see details about the pods created by that replication controller. Therefore, `kubectl` should suggest these next steps to me. I'm thinking an output format like this:
``` console
$ ./kubectl describe rc/foo
$ ./kubectl describe rc/temp
Name: temp
Namespace: default
Image(s): redis
Selector: run=temp
Labels: run=temp
Replicas: 1 current / 1 desired
Pods Status: 1 Running / 0 Waiting / 0 Succeeded / 0 Failed
No volumes.
No events.
...
Suggested Next Commands:
# List all pods created by this replication controller.
kubectl get pods -l run=temp
# Describe a running pod created by this replication controller.
kubectl get pods temp-d729d
# Get logs from first container of the last failed pod of this replication controller.
kubectl logs -p pods temp-5sj4k
```
Since we don't use URLs and HATEOAS in kubernetes, and since new users do not understand the conventions by which controllers add labels, it is hard for users to know what to do next. As we get more levels of controllers (Deployment/Rc/Pod, ScheduledJob/Workflow/Job/Pod) and as we get more types of objects created by the same controller (PetSet/{Secret,ConfigMap,RC}), it will become tedious for users to navigate the hierarchy even if they know the right commands.
Other example of this feature:
- Job:
- show Pods it created
- show all still running pods
- show last failed pod (if there was a failure)
- RC:
- show all pods created by the RC
- show all
- Service:
- show all pods matching the selector
- ScheduledJob:
- show Jobs or Workflows it created
- Workflow:
- show Jobs it created
- Pod:
- show logs of its containers
- show logs of previous invocation (if there has been a failure).
- Deployment:
- show its RCs (since there are typically only 2 or 3 RCs, maybe show one command for individually listing each RC?)
| priority/backlog,area/usability,area/kubectl,sig/cli,lifecycle/frozen | medium | Critical |
126,986,735 | rust | -C link-args and -C llvm-args can't pass arguments with spaces | The arguments passed in -C link-args and -C llvm-args are split by spaces, making it impossible to pass along paths with spaces in them.
| A-frontend,C-enhancement,T-compiler | low | Major |
127,011,713 | youtube-dl | add support for metrolyrics.com | ```
$ youtube-dl --proxy '' --verbose http://www.metrolyrics.com/news-story-watch-adele-absolutely-crush-her-carpool-karaoke-appearance.html
[debug] System config: []
[debug] User config: []
[debug] Command-line args: [u'--proxy', u'', u'--verbose', u'http://www.metrolyrics.com/news-story-watch-adele-absolutely-crush-her-carpool-karaoke-appearance.html']
[debug] Encodings: locale UTF-8, fs UTF-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2016.01.15
[debug] Python version 2.7.10 - CYGWIN_NT-6.1-WOW-2.2.1-0.289-5-3-i686-32bit
[debug] exe versions: none
[debug] Proxy map: {}
[generic] news-story-watch-adele-absolutely-crush-her-carpool-karaoke-appearance: Requesting header
WARNING: Falling back on generic information extractor.
[generic] news-story-watch-adele-absolutely-crush-her-carpool-karaoke-appearance: Downloading webpage
[generic] news-story-watch-adele-absolutely-crush-her-carpool-karaoke-appearance: Extracting information
ERROR: Unsupported URL: http://www.metrolyrics.com/news-story-watch-adele-absolutely-crush-her-carpool-karaoke-appearance.html
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/youtube_dl/extractor/generic.py", line 1289, in _real_extract
doc = compat_etree_fromstring(webpage.encode('utf-8'))
File "/usr/lib/python2.7/site-packages/youtube_dl/compat.py", line 248, in compat_etree_fromstring
doc = _XML(text, parser=etree.XMLParser(target=etree.TreeBuilder(element_factory=_element_factory)))
File "/usr/lib/python2.7/site-packages/youtube_dl/compat.py", line 237, in _XML
parser.feed(text)
File "/usr/lib/python2.7/xml/etree/ElementTree.py", line 1642, in feed
self._raiseerror(v)
File "/usr/lib/python2.7/xml/etree/ElementTree.py", line 1506, in _raiseerror
raise err
ParseError: not well-formed (invalid token): line 21, column 1218
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/youtube_dl/YoutubeDL.py", line 665, in extract_info
ie_result = ie.extract(url)
File "/usr/lib/python2.7/site-packages/youtube_dl/extractor/common.py", line 312, in extract
return self._real_extract(url)
File "/usr/lib/python2.7/site-packages/youtube_dl/extractor/generic.py", line 1908, in _real_extract
raise UnsupportedError(url)
UnsupportedError: Unsupported URL: http://www.metrolyrics.com/news-story-watch-adele-absolutely-crush-her-carpool-karaoke-appearance.html
```
| site-support-request | low | Critical |
127,015,502 | java-design-patterns | Application Controller pattern | **Description:**
The Application Controller design pattern centralizes the request handling by routing incoming requests to appropriate handlers. This pattern is particularly useful for applications with complex request processing logic, as it decouples the request handling from the request processing, promoting modularity and ease of maintenance. Key elements of this pattern include:
1. **Application Controller:** A central controller that handles incoming requests and delegates them to appropriate handlers.
2. **Handlers:** Specific components or services that perform the business logic associated with a request.
3. **Request:** Represents the client request to be processed.
4. **Response:** Represents the outcome of the request processing, which is sent back to the client.
5. **Command:** Encapsulates a request as an object, thereby decoupling the sender of a request from its receiver.
**References:**
- [Application Controller Design Pattern](https://martinfowler.com/eaaCatalog/applicationController.html)
- [Project Contribution Guidelines](https://github.com/iluwatar/java-design-patterns/wiki)
**Acceptance Criteria:**
1. Implement the Application Controller class that can route incoming requests to appropriate handlers.
2. Create at least two handler classes demonstrating different business logic processes.
3. Ensure comprehensive unit tests are written to validate the functionality of the Application Controller and handlers.
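To make the relationship between these elements concrete, here is a rough sketch of the structure (written in TypeScript purely for brevity; the project itself is Java, and all names below are made up):
``` typescript
// Hypothetical request/response shapes.
interface AppRequest { action: string; payload: string; }
interface AppResponse { status: "ok" | "error"; body: string; }

// A handler performs the business logic for one kind of request.
interface Handler {
    handle(request: AppRequest): AppResponse;
}

class CreateOrderHandler implements Handler {
    handle(request: AppRequest): AppResponse {
        return { status: "ok", body: `order created: ${request.payload}` };
    }
}

// The application controller routes incoming requests to the right handler,
// decoupling request handling from request processing.
class ApplicationController {
    private handlers = new Map<string, Handler>();
    register(action: string, handler: Handler): void {
        this.handlers.set(action, handler);
    }
    dispatch(request: AppRequest): AppResponse {
        const handler = this.handlers.get(request.action);
        return handler
            ? handler.handle(request)
            : { status: "error", body: `no handler for ${request.action}` };
    }
}

const controller = new ApplicationController();
controller.register("createOrder", new CreateOrderHandler());
controller.dispatch({ action: "createOrder", payload: "order-42" });
```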
| info: help wanted,epic: pattern,type: feature | low | Major |
127,015,633 | java-design-patterns | Association Table Mapping pattern | ### Description:
The Association Table Mapping design pattern is used to manage many-to-many relationships between objects by using an intermediary table (association table) in the database. This pattern is particularly useful when an application requires handling complex relationships where each object can have multiple associations with objects of another type.
**Main elements of the pattern:**
- **Entities:** These are the objects that have a many-to-many relationship.
- **Association Table:** A table that holds the foreign keys referencing the primary keys of the associated tables.
- **Mappings:** The process of defining how the objects are related through the association table.
- **Database Operations:** CRUD operations need to account for the intermediary table to properly maintain the relationships.
### References:
- [Association Table Mapping - Martin Fowler](https://martinfowler.com/eaaCatalog/associationTableMapping.html)
- [Project Contribution Guidelines](https://github.com/iluwatar/java-design-patterns/wiki)
### Acceptance Criteria:
1. Create an association table to manage many-to-many relationships between two entities.
2. Implement CRUD operations ensuring the association table is appropriately updated.
3. Provide unit tests demonstrating the correct implementation and functionality of the Association Table Mapping pattern. | info: help wanted,epic: pattern,type: feature | low | Major |
127,017,159 | java-design-patterns | Plugin pattern | ## Description
The Plugin design pattern allows a software application to support extension through third-party plugins, providing flexibility and scalability. This pattern is particularly useful for applications requiring dynamic and interchangeable components.
### Main Elements of the Plugin Design Pattern:
1. **Plugin Interface**: Defines the methods that plugins must implement.
2. **Concrete Plugins**: Implementations of the plugin interface, providing specific functionality.
3. **Plugin Manager**: Responsible for loading, initializing, and managing the lifecycle of plugins.
4. **Application Core**: The main application that uses the plugin manager to interact with plugins.
### References:
- [Plugin](http://martinfowler.com/eaaCatalog/plugin.html)
- [Contribution Guidelines](https://github.com/iluwatar/java-design-patterns/wiki)
### Acceptance Criteria:
1. A Plugin Interface is defined with necessary methods that plugins must implement.
2. At least two Concrete Plugin implementations are created, showcasing different functionalities.
3. A Plugin Manager class is implemented to handle loading, initializing, and managing the plugins.
4. Demonstration of the pattern in a sample application, showing dynamic loading and usage of plugins.
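As a rough illustration of how the four elements fit together (sketched in TypeScript only for brevity; the project itself is Java, and every name below is invented):
``` typescript
// Plugin interface: the contract every plugin must implement.
interface TextPlugin {
    name: string;
    execute(input: string): string;
}

// A concrete plugin providing one specific piece of functionality.
class UppercasePlugin implements TextPlugin {
    name = "uppercase";
    execute(input: string): string { return input.toUpperCase(); }
}

// Plugin manager: registers and looks up plugins by name.
class PluginManager {
    private plugins = new Map<string, TextPlugin>();
    register(plugin: TextPlugin): void { this.plugins.set(plugin.name, plugin); }
    run(name: string, input: string): string | undefined {
        return this.plugins.get(name)?.execute(input);
    }
}

// Application core: talks to the manager, not to concrete plugins.
const manager = new PluginManager();
manager.register(new UppercasePlugin());
manager.run("uppercase", "hello"); // "HELLO"
```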
| info: help wanted,epic: pattern,type: feature | medium | Major |
127,018,809 | nvm | Unable to install custom non-nvm managed version | Hi,
I'm currently using N|Solid for some projects and would like to be able to switch between this and other versions during development.
I've tried copying the N|Solid distribution into the versions directory, but it isn't picked up. Is there a way to do this? And if not, can this feature be added?
Cheers,
Nathan
| feature requests | low | Major |
127,049,530 | flutter | Need a way to set app icon from setTaskDescription on Android | The `Title` widget, and the `Activity` mojo service that backs it, don't support sending a bitmap to Android to set the app icon.
| c: new feature,platform-android,framework,engine,P3,team-android,triaged-android | low | Major |
127,079,778 | rust | Linking to rustc_llvm with --llvm-root | If you link to rustc_llvm with a rustc configured with --llvm-root, the path to the LLVM libraries is not passed to the linker, resulting in linking errors.
| A-LLVM,T-compiler,C-bug | low | Critical |
127,085,336 | You-Dont-Know-JS | "this & Object Prototypes" - ch6 Behavior Delegation - Minor grammatical error | Not 100% on this but the following sentence doesn't seem to flow well for me:
> These instances have **copies both** of the general Task defined behavior as well as the specific XYZ defined behavior.
While I think this might be better:
> These instances have **copies of both** the general Task defined behavior as well as the specific XYZ defined behavior.
By the way, thanks for making this series happen. I have been enjoying it and learning a lot thus far!
| for second edition | medium | Critical |
127,091,096 | rust | Fulfillment context should support DAGs better, integrate with caching better | The new fulfillment context introduced in #30533 could be more efficient in a number of ways in terms of pruning the work it has to do. First, it currently considers obligations to be formed in a tree, but really it ought to detect arbitrary DAGs and avoiding adding needless work. In particular, if a given obligation O is also found elsewhere in the tree -- but NOT as an ancestor! -- then we can drop it. We have some limited support for this where we drop duplicate _siblings_. This may be good enough, but we can do better.
It seems like we could integrate this with cycle detection. Currently, we wait until the overflow counter trips and then walk back up and look for a cycle. This is perfectly fine. But I could also imagine that we might have some kind of per-tree hashtable (the current `ObligationForest` doesn't support "per-tree" data, but it seems like an easy enough extension) that tracks obligations currently in the tree. If we find when adding a new obligation O that it is already present, then we could go look for a cycle and otherwise consider it a duplicate. This seems nice, but there is a wrinkle.
In particular, I am wary of getting things wrong around inference variables! We can always add things to the set in their current state, and if unifications occur then the obligation is just kind of out-of-date, but I want to be sure we don't accidentally fail to notice that something is our ancestor. I decided this was subtle enough to merit its own PR.
Another related issue is that it's not clear when is the best time to refresh inference variables. We might be able to find a perfect time, or maybe we should just make it very cheap to do and then do it all over the place.
cc @aturon @arielb1
| C-enhancement,A-trait-system,I-compiletime,T-compiler | low | Minor |
127,094,982 | opencv | Color conversion under CPU and GPU is not consistent | When trying to convert a color from Lab to RGB using cvtColor, there is an inconsistency between some of the results returned from the GPU (cuda) and the CPU.
For example:
Lab(1.1, 68.34847, 64.08946) --> RGB(0.3326095 , 0.0, 0.0) under CPU
Lab(1.1, 68.34847, 64.08946) --> RGB(0.332609445 , -0.1881216 , 0.234428525) under GPU
While the a & b values in Lab colorspace can be negative, in RGB space they can't be. All the values are single floats as required by the opencv documentation...
This was tested under opencv 3.0 and opencv 3.1
| bug,category: gpu/cuda (contrib) | low | Minor |
127,123,280 | go | os: on unix Process.Kill() can kill the wrong process | This is somewhat similar to the issue #9382. On Unix, Process.Kill() just calls Process.Signal(Kill). As https://golang.org/src/os/exec_unix.go#L39 indicates, the Signal function invokes syscall.Kill(p.Pid) outside any lock after checking if the process still runs. Thus, at the point when the signal is sent, the process could have finished and Process.Wait() from another thread returned, so the OS is free to reuse the pid for another unrelated process. If this happens, Process.Kill() kills that process.
Due to this race it is impossible to write a correct platform-independent code in Go that kills the process if it does not terminate after a pause. I.e. the code fragment from #9382 is not correct on Unix:
```
func spawnAndKill(exePath string, counter int) error {
cmd := exec.Command(exePath, fmt.Sprintf("%d", counter))
err := cmd.Start()
if err != nil {
return err
}
go func() {
time.Sleep(1000 * time.Millisecond)
cmd.Process.Kill()
}()
cmd.Wait()
return nil
}
```
| OS-OpenBSD,OS-Solaris | medium | Critical |
127,129,565 | javascript | Lint the readme | Don't know if there's already an issue to do this, but @btmills published https://github.com/eslint/eslint-plugin-markdown. So you should be able to lint the readme with the eslint config?
| enhancement,pull request wanted | low | Major |
127,159,034 | You-Dont-Know-JS | Async & Performance - ch2 - Timeoutify callback arguments | If I'm right, this callback is always called with `err == null`, and the real `err` argument will be the second parameter, as `data`:
``` javascript
fn.apply( this, [ null ].concat( [].slice.call( arguments ) ) );
```
| for second edition | medium | Major |
127,423,593 | go | cmd/pprof: svg profiles are broken for rpc services | go version devel +c7754c8 Tue Jan 19 06:20:36 2016 +0000 linux/amd64
The SVG profile for my program (screenshot omitted) makes it look as if reflect.Value.call directly allocates a lot. This does not make sense.
In the text profile I see records that are probably these reflect.Value.call allocations. But in the SVG they are trimmed at reflect.Value.call. RPC servers can allocate most of their memory during request serving, so this is pretty unfortunate.
```
944: 3866624 [15104: 61865984] @ 0x452e65 0x56df40 0x56d0b5 0x40b72d 0x40bf66 0x46d5ae 0x8a17ed 0x8a04b1 0x832e52 0x46fbc1
# 0x56df40 github.com/google/syzkaller/prog.calcDynamicPrio+0xb0 gopath/src/github.com/google/syzkaller/prog/prio.go:153
# 0x56d0b5 github.com/google/syzkaller/prog.CalculatePriorities+0x75 gopath/src/github.com/google/syzkaller/prog/prio.go:31
# 0x40b72d main.(*Manager).minimizeCorpus+0x9ed gopath/src/github.com/google/syzkaller/syz-manager/manager.go:341
# 0x40bf66 main.(*Manager).Connect+0x216 gopath/src/github.com/google/syzkaller/syz-manager/manager.go:363
# 0x46d5ae runtime.call64+0x3e go/src/runtime/asm_amd64.s:473
# 0x8a17ed reflect.Value.call+0x120d go/src/reflect/value.go:435
# 0x8a04b1 reflect.Value.Call+0xb1 go/src/reflect/value.go:303
# 0x832e52 net/rpc.(*service).call+0x1c2 go/src/net/rpc/server.go:383
1: 1761280 [1: 1761280] @ 0x453261 0x40c701 0x46d5ae 0x8a17ed 0x8a04b1 0x832e52 0x46fbc1
# 0x40c701 main.(*Manager).NewInput+0x611 gopath/src/github.com/google/syzkaller/syz-manager/manager.go:383
# 0x46d5ae runtime.call64+0x3e go/src/runtime/asm_amd64.s:473
# 0x8a17ed reflect.Value.call+0x120d go/src/reflect/value.go:435
# 0x8a04b1 reflect.Value.Call+0xb1 go/src/reflect/value.go:303
# 0x832e52 net/rpc.(*service).call+0x1c2 go/src/net/rpc/server.go:383
1: 1515520 [1: 1515520] @ 0x453261 0x40d8fb 0x40c6b9 0x46d5ae 0x8a17ed 0x8a04b1 0x832e52 0x46fbc1
# 0x40d8fb main.(*PersistentSet).add+0x49b gopath/src/github.com/google/syzkaller/syz-manager/persistent.go:100
# 0x40c6b9 main.(*Manager).NewInput+0x5c9 gopath/src/github.com/google/syzkaller/syz-manager/manager.go:385
# 0x46d5ae runtime.call64+0x3e go/src/runtime/asm_amd64.s:473
# 0x8a17ed reflect.Value.call+0x120d go/src/reflect/value.go:435
# 0x8a04b1 reflect.Value.Call+0xb1 go/src/reflect/value.go:303
# 0x832e52 net/rpc.(*service).call+0x1c2 go/src/net/rpc/server.go:383
```
| compiler/runtime | low | Critical |
127,436,976 | rust | Bad macro usage error message does not include correct error location | Consider this reduced test case:
``` rust
macro_rules! match_ignore_ascii_case {
(@inner $value:expr) => { () };
( $($rest:tt)* ) => { match_ignore_ascii_case!(@inner $($rest)*) };
}
fn main() {
// This is fine
match_ignore_ascii_case!(1);
// This causes an error as it doesnβt match the expected syntax
// but the error message does not include the location of the actual error.
match_ignore_ascii_case!(2 => 3);
}
```
The `@inner` indirection exists because the non-reduced macro is recursive:
- https://play.rust-lang.org/?gist=f8b1652f43cc720f89a3&version=nightly
- https://users.rust-lang.org/t/writing-a-macro-rules-macro-used-like-a-match-expression/4328
This fails to compile (as it should) but the error message does not include the real location of the error, which is line 12. It can be hard to track down in a large crate with many users of the macro.
``` rust
a.rs:3:52: 3:53 error: unexpected token: `@`
a.rs:3 ( $($rest:tt)* ) => { match_ignore_ascii_case!(@inner $($rest)*) };
```
The error message looks even worse when the macro is used (with incorrect syntax) from another crate
```
<cssparser macros>:12:1: 12:2 error: unexpected token: `@`
<cssparser macros>:12 @ inner $ value , ( $ ( $ rest ) * ) -> (
```
| C-enhancement,A-diagnostics,A-macros,T-compiler,D-papercut | low | Critical |
127,493,858 | go | cmd/compile: Need better type propagation | When this code is compiled with `-m`, several heap allocations can be seen because `hash.Hash64` is an interface type and it's not possible (in general) to know how the parameters to its methods are treated:
```
package foo
import (
"hash"
"hash/fnv"
)
func harsh(fieldVals []interface{}) uint64 {
if len(fieldVals) == 0 {
// Avoid allocating the hasher if there are no fieldVals
return 0
}
h := fnv.New64a() // .(*fnv.Sum64a)
for _, v := range fieldVals {
switch v := v.(type) {
case string:
h.Write([]byte(v))
case int, int32:
vi, ok := v.(int32)
if !ok {
vi = int32(v.(int))
}
b := [4]byte{}
for i := 0; i < 4; i++ {
b[i] = byte(vi & 0xFF)
vi >>= 8
}
h.Write(b[:])
case bool:
if v {
h.Write([]byte{1})
} else {
h.Write([]byte{0})
}
}
}
return h.Sum64()
}
```
If the package-private type actually allocated by `fnv.New64a` is exposed and cast to (see the commented out `.(*fnv.Sum64a)` ), it becomes possible to see which functions are actually called and either inline them or determine that they do not leak their parameters, and heap allocations are avoided. This type information is available and can be propagated forward in tandem with the declared type, and used to better escape-analyze interface method calls (that are really monomorphic). Right now inlining can reveal it, ideally the summary information can include more precise information about returned types if there is any to be had.
| Performance,compiler/runtime | low | Major |
127,556,428 | TypeScript | Assignability error reporting should have a more specific error for unions | ``` ts
interface A {
a: number
}
interface B {
b: number
}
interface C {
c: number
}
interface D {
d: number
}
declare var c: C;
let abc: ((A | B) & C) | D = c;
```
Actual error:
```
Type 'C' is not assignable to type '((A | B) & C) | D'.
Type 'C' is not assignable to type 'D'.
....
```
Expected error:
```
Type 'C' is not assignable to any branch of the union type '((A | B) & C) | D'.
For example, type 'C' is not assignable to type 'D'.
....
```
Error reporting on unions doesn't mention that the compiler tried to assign the source to all branches of the target. Instead it proceeds to elaborate just the last branch's error. It should make it clear that the error elaboration is just one of several checks that failed.
Otherwise it is easy to assume that the compiler _only_ tried the last branch and somehow skipped the "more promising" branch (which ultimately also fails, but at least contains the source type).
Based on one of the multiple issues reported by @aleksey-bykov in #6538.
| Bug,Help Wanted,Domain: Error Messages | low | Critical |
127,605,421 | go | cmd/compile: stack overflow accessing large return value | Take this toy example, which calculates prime numbers: http://play.golang.org/p/XrUCUvC7In
Building this with `go build -gcflags -m` shows the following output:
`moved to heap: arr`
`moved to heap: x`
However, running the program I get the following error:
runtime: goroutine stack exceeds 1000000000-byte limit
fatal error: stack overflow
runtime stack:
runtime.throw(0x470f80, 0xe)
c:/go/src/runtime/panic.go:527 +0x97
runtime.newstack()
c:/go/src/runtime/stack1.go:800 +0xb25
runtime.morestack()
c:/go/src/runtime/asm_amd64.s:330 +0x87
goroutine 1 [stack growth]:
main.main()
C:/Users/Daniel/Desktop/rensir.go:27 fp=0xc0c202be50 sp=0xc0c202be48
runtime.main()
c:/go/src/runtime/proc.go:111 +0x27e fp=0xc0c202bea0 sp=0xc0c202be50
runtime.goexit()
c:/go/src/runtime/asm_amd64.s:1721 +0x1 fp=0xc0c202bea8 sp=0xc0c202bea0
Is it supposed to get a stack overflow if it is a heap object?
Go version: go1.5.3 windows/amd64
OS: windows 7 64 bit
| NeedsFix,compiler/runtime | low | Critical |
127,623,857 | youtube-dl | how to configure youtube-dl when using a PAC on Windows 7 | hi,
I use a PAC file on Windows 7 to access the internet.
I tried the command "youtube-dl --proxy http://127.0.0.1:16823/adf.pac https://www.youtube.com/watch?v=ByVPaqN9594" to download some files, but it failed.
I'm not sure if it's correct to set the parameter this way. Could you help me?
Thanks.
| request | low | Critical |
127,640,581 | youtube-dl | Site Support Request: PopTV.com | Can you add support for this website?
Example URL: http://poptv.com/post/136762107828/impact-wrestling-premiere-full-episode
```
youtube-dl --verbose http://poptv.com/post/136762107828/impact-wrestling-premiere-full-episode
[debug] System config: []
[debug] User config: []
[debug] Command-line args: [u'--verbose', u'http://poptv.com/post/136762107828/i
mpact-wrestling-premiere-full-episode']
[debug] Encodings: locale cp1252, fs mbcs, out cp437, pref cp1252
[debug] youtube-dl version 2016.01.15
[debug] Python version 2.7.10 - Windows-7-6.1.7601-SP1
[debug] exe versions: ffmpeg N-77953-gcc83177, ffprobe N-77953-gcc83177, rtmpdum
p 2.4
[debug] Proxy map: {}
[generic] impact-wrestling-premiere-full-episode: Requesting header
WARNING: Falling back on generic information extractor.
[generic] impact-wrestling-premiere-full-episode: Downloading webpage
[generic] impact-wrestling-premiere-full-episode: Extracting information
[brightcove:legacy] AQ~~,AAAB-0j8O8k~,4Zm4raQfNIpvkHb496QWbqJbPs3IHFnC: Download
ing playlist information
ERROR: Empty playlist; please report this issue on https://yt-dl.org/bug . Make
sure you are using the latest version; type youtube-dl -U to update. Be sure t
o call youtube-dl with the --verbose flag and include its complete output.
Traceback (most recent call last):
File "youtube_dl\YoutubeDL.pyo", line 665, in extract_info
File "youtube_dl\extractor\common.pyo", line 312, in extract
File "youtube_dl\extractor\brightcove.pyo", line 245, in _real_extract
File "youtube_dl\extractor\brightcove.pyo", line 283, in _get_playlist_info
ExtractorError: Empty playlist; please report this issue on https://yt-dl.org/bu
g . Make sure you are using the latest version; type youtube-dl -U to update.
Be sure to call youtube-dl with the --verbose flag and include its complete outp
ut.
```
| site-support-request,geo-restricted | low | Critical |
127,665,597 | godot | Node `duplicate()` doesn't copy internal variables values | ***Bugsquad edit:** This bug has been confirmed several times already. No need to confirm it further.*
___
The duplicate method doesn't take into account variable values of the duplication root as well as internal nodes.
I have prepared a small test project that can be used to detect some issues related to duplication:
[DuplicateTests.zip](https://github.com/godotengine/godot/files/97344/DuplicateTests.zip)
It tests various scenarios and reports results in the output console in the following form:
```
(...)
Translation test: SUCCESS
Root export variable test: SUCCESS
Inner export variable test: SUCCESS
Root variable value test: FAIL
Inner variable val test: FAIL
Inner node structure test: SUCCESS
Summary result: FAIL
```
**17.08.2022 UPDATE**
I converted the test project from Godot 1(?) to Godot 4. I can confirm I still get the same result:
Godot 4 reproduction here: https://github.com/godotengine/godot/issues/3393#issuecomment-1218262767
**Also to clarify: the issue is about usage of the Object.duplicate() method and the fact that variables of the returned object (even simple types) are different than in the original one.**
| bug,topic:core,confirmed | medium | Critical |
127,738,484 | go | tour: infinite loop example, after switching page 'Kill' button cannot be accessed | Context: http://127.0.0.1:3999/flowcontrol/4
When running the example locally the process time constraint is removed, so the infinite loop will happily chug along and can be killed using the button on the page (screenshot omitted).
However, after switching pages the context is lost; the program is still running and must be killed manually (screenshot omitted).
| NeedsInvestigation | low | Minor |
127,761,469 | rust | Rustdoc search box fails accessibility guidelines | Background links: https://www.w3.org/TR/WCAG20-TECHS/H32.html for a11y, http://doc.rust-lang.org/std/ for rustdoc search. (Screenshots of the search box before typing anything and after typing a query are omitted here.)
Basically, we currently implement the searchbox by using JS to hook into the submission: https://github.com/rust-lang/rust/blob/master/src/librustdoc/html/static/main.js#L737-L741
So we're not providing a 'submit' button. But we're supposed to.
I'm not sure of the best way to resolve this, exactly. I _think_ the plan would be:
1. create an `<input type="submit">`
2. hide it with CSS
3. have it trigger that JS instead
Does that make sense? I am bad at front-end :(
| T-rustdoc,C-bug,A-a11y | low | Major |
127,810,122 | angular | Invalid component selectors should throw an error | If you use an unsupported component selector, such as `parent-thing > child-thing`, Angular will **silently** accept this and just capture `child-thing`.
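A minimal sketch of the situation being described (component and selector names are made up; import path per current Angular packages):
``` typescript
import { Component } from '@angular/core';

// According to the report, this compound selector is accepted without any
// error, but in practice only `child-thing` ends up being matched.
@Component({
  selector: 'parent-thing > child-thing',
  template: '<p>child</p>',
})
export class ChildThingComponent {}
```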
| hotlist: error messages,freq1: low,area: compiler,core: directive matching,type: confusing,P4 | low | Critical |
127,955,635 | go | runtime: improve performance of IndexByte on older processors | Pre-avx2 processors use a loop of sse operations to do IndexByte. They use an unaligned load to do so. There may be a significant performance win by aligning the loads. See the comments at the end of https://go-review.googlesource.com/#/c/18703/
| Performance,NeedsFix,compiler/runtime | low | Major |
128,057,019 | kubernetes | Secret and ConfigMap should limit # of keys | I think it's best to say "max 1000 keys, max key size is 256 characters. max total size of keys + data is 1 MB" or similar.
@pmorie promises to do this unless someone gets there first.
| priority/backlog,sig/api-machinery,lifecycle/frozen | medium | Major |
128,088,669 | go | cmd/asm: incorrect instruction encodings | I have constructed a fairly exhaustive test suite for the x86 assembler and identified some problems. The ones in this issue are long-time bugs that appear to have been present since the beginning of the Go project. We should fix them but given the history there is no need to rush the fixes into Go 1.6.
This may not be all of them: my tests don't account for some Go renamings of instructions.
I intend to fix these and check in the tests.
```
ADCB (BX), DL: have encoding "1013", want "1213"
0: 10 13 adc %dl,(%rbx)
0: 12 13 adc (%rbx),%dl
ANDPS (BX), X2: have encoding "660f5413", want "0f5413"
0: 66 0f 54 13 andpd (%rbx),%xmm2
0: 0f 54 13 andps (%rbx),%xmm2
CMOVLEQ (BX), DX: have encoding "0f4413", want "480f4e13"
0: 0f 44 13 cmove (%rbx),%edx
0: 48 0f 4e 13 cmovle (%rbx),%rdx
MOVQ 0x123456789abcdef1, AX: have encoding "488b0425f1debc9a", want "48a1f1debc9a78563412"
0: 48 8b 04 25 f1 de bc mov 0xffffffff9abcdef1,%rax
7: 9a
0: 48 a1 f1 de bc 9a 78 movabs 0x123456789abcdef1,%rax
7: 56 34 12
MOVQ AX, $0x123456789abcdef1: have encoding "48890425f1debc9a", want "48a3f1debc9a78563412"
0: 48 89 04 25 f1 de bc mov %rax,0xffffffff9abcdef1
7: 9a
0: 48 a3 f1 de bc 9a 78 movabs %rax,0x123456789abcdef1
7: 56 34 12
```
| NeedsFix,early-in-cycle | low | Critical |
128,130,550 | angular | Angular2 Routing: persisting route tabs and child routes | (I posted the problem on SO: http://stackoverflow.com/questions/34925782/angular2-routing-persisting-route-tabs-and-child-routes but decided to ask here, too, hopefully more specific this time)
So, basically the Router in Angular 2 destroys inactive components (my tabs!). The problem is I don't want this behaviour, because I have some components like charts and a data grid and want to switch between them fast; I don't want to recreate them.
I've found some workaround for one-level routing but it's totally not a solution.
`app.ts`:
``` ts
@Component({ /*...*/ })
@RouteConfig([
{path: '/**', redirectTo: ['Dashboard']},
{path: '/dashboard', name: 'Dashboard', component: EmptyRoute},
{path: '/products/...', name: 'Products', component: EmptyRoute},
{path: '/sales', name: 'Sales', component: EmptyRoute},
{path: '/reports', name: 'Reports', component: EmptyRoute},
])
export class App {
constructor(private router: Router) {
}
public isRouteActive(route) {
return this.router.isRouteActive(this.router.generate(route))
}
}
```
As you can see `component` is set to `EmptyRoute` (just an empty view `@Component` with no methods and no data) and I don't use `<router-outlet>` but instead I instantiate my route components manually:
`app.html`:
``` ts
<dashboard [hidden]="!isRouteActive(['/Dashboard'])"></dashboard>
<products-management [hidden]="!isRouteActive(['/Products'])"></products-management>
<sales [hidden]="!isRouteActive(['/Sales'])"></sales>
<reports [hidden]="!isRouteActive(['/Reports'])"></reports>
```
but of course it doesn't work for child routes. In my (existing - written with Angular 1.x) application I have `/products/product/21/pricing` or `/products/product/21` etc. (could get rid of `products/` part in the middle but whatever).
So, going back to `Router`. It instantiates and destroys. Ideally, I'd like to:
1. instantiate components only once - or better - decide when I want to do this (some Strategy?)
2. decide whether to **destroy or hide** components. When I hide a component I want to reuse it when route switches to the component.
**EDIT** on 14.11.2017: some people ask me about solution - yeah, I ditched Angular and use [Elm](http://elm-lang.org/) for private projects and changed my job. | feature,freq2: medium,area: router,feature: under consideration | high | Critical |
128,335,237 | go | x/net/http2: export more tunable knobs | There are several default window settings in `transport.go`:
``` go
// transportDefaultConnFlow is how many connection-level flow control
// tokens we give the server at start-up, past the default 64k.
transportDefaultConnFlow = 1 << 30
// transportDefaultStreamFlow is how many stream-level flow
// control tokens we announce to the peer, and how many bytes
// we buffer per stream.
transportDefaultStreamFlow = 4 << 20
// transportDefaultStreamMinRefresh is the minimum number of bytes we'll send
// a stream-level WINDOW_UPDATE for at a time.
transportDefaultStreamMinRefresh = 4 << 10
```
The `NewClientConn` method will use these without looking anywhere else. However, these default values are not suitable for all situations, such as high-latency, high-throughput networks.
Can you provide some setters to change these values, instead of having to rewrite the whole thing using `Framer` or fork and rewrite this repo?
| NeedsInvestigation | low | Minor |
128,373,143 | neovim | weird newline behaviour when terminal scrolls because of new output | <img width="1440" alt="screen shot 2016-01-23 at 11 54 41 pm" src="https://cloud.githubusercontent.com/assets/10180857/12534552/ae0cb06e-c22c-11e5-9f35-59e8dd842481.png">
Noticed this when testing #4085. Yes it is the same script. If the output of that script exceeds the terminal and it has to scroll, the background colour carries over to the rest of the line. It doesn't respect the newline.
| bug,terminal | low | Minor |
128,519,570 | opencv | Possible wrong tutorial (Arithmetic operation Tut for Python) | Studying opencv in python as someone new here, I came across the [Arithmetic operations tutorial](http://docs.opencv.org/master/d0/d86/tutorial_py_image_arithmetics.html) and the `cv2.threshold(...)` command. In the Bitwise operation section I ran the code multiple times and the result was not as shown in the pictures below.
I got something closer to the desired result by changing the threshold method as given there to:
`ret, mask = cv2.threshold(img2gray, 250, 255, cv2.THRESH_BINARY_INV)`
which results in the desired mask (results attached).
I can see that the tutorial tried to extract the mask by the complementary command (low threshold, `cv2.THRESH_BINARY`), but this results in an undesired output (**in my case at least, and unless I'm missing something**).




| bug,category: documentation,category: samples,affected: 3.4 | low | Minor |
128,864,309 | opencv | Support BORDER_TRANSPARENT for image filtering | Image filtering functions like cv::blur do not support BORDER_TRANSPARENT as a border handling option. An exception is thrown within cv::borderInterpolate if this value is used. BORDER_TRANSPARENT could have practical uses in the case of a cv::Mat obtained as a region of interest (operator parenthesis) of another cv::Mat: the edge pixels would be computed from pixels in the input image outside the region-of-interest cv::Mat, but inside the containing cv::Mat. The implementation would actually be very simple, by adding these 2 lines of code at the beginning of cv::borderInterpolate:
```
if( borderType == BORDER_TRANSPARENT )
return p;
```
(I have tested it with blur and gaussian filter and it seemingly works fine)
One drawback is that the user would have to check that their region-of-interest cv::Mat is far enough from the borders of the containing cv::Mat (the distance would have to be `(kernelSize/2)+1`).
(In the functions I tested, it does crash if I don't take these precautions.)
How difficult would it be to perform this check only once at the beginning of the filtering function, and throw an informative exception in that case?
| category: core | low | Critical |
128,884,411 | nvm | nvm problems caused by "npm version" of nvm | Perhaps the **Problems** section of the README.markdown could note there is an old, npm-installed (or homebrew-installed) version of nvm that does not work with modern systems. (In fact, its github page/repo has disappeared...) It could say something like:
> **Outdated nvm** There is an older, unmaintained version of nvm that conflicts with this project. If it's installed, you may see messages like `"local" not implemented yet` or `Error: Cannot find module './nvm-help'` when typing nvm commands. You should remove that version with `npm uninstall -g nvm`, then re-run the **Install Script** above.
PS It would be helpful if each of the items in the **Problems** section had the first phrase bolded, so that people can see that they're separate questions/issues.
| informational,pull request wanted | low | Critical |
128,913,096 | kubernetes | Service Account Key Rotation | A user asked how to rotate service account keys.
[discussion thread](https://groups.google.com/forum/#!msg/google-containers/qSo2Q5kGI6I/use7PIYuCAAJ)
Currently, the user restarts the controller manager with a new keypair and then restarts all pods.
The user pointed out problems with that procedure:
1. you cannot restart a pod instantly
2. until a pod is restarted, it is basically broken, assuming it requires access to the API (KubeDNS in particular is a problem)
3. many pods are restarted which actually don't have to be restarted as they don't access the API
For problem 2, I suggested the following improved procedure:
1. Cluster is started with keypair 1
- `kube-apiserver -service-account-key-file=pub-1`
- `kube-controller-mgr --service-account-private-key-file=priv-1`.
2. time passes. service account tokens are generated with first keypair
3. it is time to rotate. generate new key pair, `pub-2` and `priv-2`
4. run with both keypairs
- Accept both: `kube-apiserver -service-account-key-file=pub-1,pub-2`
- Generate new: `kube-controller-mgr --service-account-private-key-file=priv-2`
5. for each service account, delete all token secrets and references to them. Token controller will make new ones, and the new ones will use the new key.
6. restart pods at a leisurely pace.
7. stop accepting old tokens
- `kube-apiserver -service-account-key-file=pub-1`
- `kube-controller-mgr --service-account-private-key-file=priv-1`
What is missing is that `-service-account-key-file` only accepts a single public key. Code deeper in the token authenticator appears to handle a list of keys.
For problem 3:
Action items:
- [ ] make the flag take a list
- [ ] close #16779 so that it can be explicit which pods do not need (and so do not have) a service account token.
- [ ] document rotation procedure, mentioning how to find and to skip pods without a token.
- [ ] e2e test where keys are rotated
| area/security,kind/documentation,kind/cleanup,sig/auth,priority/important-longterm,lifecycle/frozen | medium | Critical |
128,975,553 | go | proposal: os/v2: File should be an interface | By analogy with #13473, os.File should also be an interface. That would permit a program to use files in the abstract sense, including implementations not provided by the os package.
| v2,Proposal | medium | Critical |
129,032,467 | neovim | terminal: tty RPC via APC ctrlseq | I've been using neovim's terminal feature extensively lately (notice all my bug reports :P) and I had a couple of really neat ideas that I think would be a must-have feature for anyone who uses ssh and neovim.
First, in the neovim terminal, if you are sshed onto a server and you type `nvim file`, instead of opening a new neovim instance on the server it would pipe the data back to your already-open neovim. This would make editing over SSH very seamless. You already have the terminal open and maybe you were doing something, and bam, if you type nvim you can edit now in your already open instance (access to clipboard etc).
I know about netrw's ability to edit files over the network, but it would be really nice if it was seamless. I have to type `e scp://server/pathtofile`, which is really annoying, but a simple `nvim file` would make it much easier.
The second is either syncing the working directories of the terminals with neovim, similar to `set autochdir`, or being able to again pipe straight into the open neovim instance from the terminal.
These are ideas I want to implement myself but I wanted some input first.
edit: nvm I'm never gonna have the time to implement all this. Sorry! | enhancement,terminal | medium | Critical |
129,181,134 | go | gccgo, doc: comments on doc/install/gccgo for gcc release 6 | I know that it is likely that some changes are planned for golang.org/doc/install/gccgo for the release of gcc 6. Here are a few comments/suggestions on this page.
With the changes to split stack support in gccgo for gcc release 6, it is no longer required to use --with-ld as a configure option (and it is probably better not to? maybe both methods should be described). In gcc 6, if the gold linker is found in the PATH when gcc is configured, then it will be used as the linker for gccgo, as required for split stack.
Also, split stack is now implemented for PowerPC64 big and little endian in addition to x86.
On the section describing how to build gccgo, another way to get gcc source is from the weekly snapshots or releases from the various mirrors for GNU GCC found here https://gcc.gnu.org/mirrors.html. This is much faster than using svn to get the source.
When describing the setting of the LD_LIBRARY_PATH, the paths used are not correct based on the systems I've looked at (ppc64le & x86). The shared libraries are found under ${prefix}/lib64 or ${prefix}/lib, so that is the path needed for LD_LIBRARY_PATH or when setting the runpath with the -W option.
The example showing -gccgoflags should have the options within quotes.
It would be nice to see something that mentions the use of the go tool built for gccgo and how it is preferred over using gccgo directly or the go tool from golang with the -compiler option. When using the go tool built for gccgo, then using 'go version' displays version information about gccgo. Also, if 'go build' is used to build an application with build tags then it matches against the tags for gccgo as expected. When using the go tool from golang with the -compiler option, the build tags are matched against the golang values and not the gccgo values.
| NeedsInvestigation | low | Minor |
129,234,512 | go | runtime/race: deflake tests | See https://go-review.googlesource.com/#/c/18968 for context.
Race tests run with GOMAXPROCS=1, which makes them more or less reliable. But the ultimate solution is to explicitly annotate tests with the required execution order by means of a special "invisible" synchronization primitive (that's what is done for C++ ThreadSanitizer tests). But that would require going over 350 tests.
| help wanted,NeedsFix,compiler/runtime | low | Major |
129,243,287 | kubernetes | kubectl port-forward container listener to local port | It would be great if `kubectl port-forward` offered the equivalent of `ssh -R`, such that it would open a _listening_ port on the container that forwards to a port on the local machine. I would be perfectly content if that port were only available to the container (i.e. just the 3-tuple `-R` flavor, without `bind_address`; no need to have other containers be able to connect to the port)
Since there is no `kubectl cp` command that I could find, this would make for a painless way to get content into a running container via http from a workstation
| priority/backlog,area/kubectl,sig/node,kind/feature,sig/cli | high | Critical |
129,351,797 | TypeScript | Suggestion for improving generator and async function type checking | This is a suggestion to improve both the consistency and the type-safety of return type checking for generator functions (GFs) and async functions (AFs).
NB: this issue refers to `tsc` behaviour for target >=ES6 with `typescript@next` as of 2016-01-28 (ie since commit https://github.com/Microsoft/TypeScript/commit/a6af98e10025492e0afda315ea37cfebe0b2bbfb)
## Current Behaviour
Consider the following examples of current behaviour for checking return type annotations:
``` typescript
// Current behaviour - generator functions
class MyIterator implements Iterator<any> {next}
function* gf1(): any {} // OK
function* gf2(): MyIterator {} // OK (but really is an error)
// Current behaviour - async functions
class MyPromise extends Promise<any> {}
async function af1(): any {} // ERROR (but is really not an error)
async function af2(): MyPromise {} // ERROR
```
## Problems with Current Behaviour
Firstly, type checking is not consistent across the two kinds of functions. In the examples the GF checking is too loose and the AF checking is too strict. The inconsistency is due to the different approach to checking the return type. The two approaches may be summarised like this:
- _Generator functions:_ accept any return type annotation that is assignable from `IterableIterator<T>`.
- _Async functions:_ reject all return type annotations other than references to the global `Promise<T>`. This is a recent change; the rationale for it can be followed from [here](https://github.com/Microsoft/TypeScript/issues/5911#issuecomment-175866624).
Secondly, the type checker only gets 2 out of 4 of the above checks right (`gf1` and `af2`). Explanation:
- GOOD: `gf1`'s return type annotation is not super helpful but is 100% consistent with the type system. No sense erroring here, so the implementation is good.
- BAD: `gf2`'s return type annotation passes type checking because it passes the assignability check. However `gf2` definitely does not return an instance of `MyIterator`. All generator functions return a [generator object](http://www.ecma-international.org/ecma-262/6.0/#sec-generator-objects), so at runtime `gf2() instanceof MyIterator` is `false`. A compile error would have been helpful.
- BAD: `af1`'s return type annotation is just like `gf1`: not super helpful but 100% consistent with the type system. The compiler errors here even though nothing is wrong (reason for the error is [here](https://github.com/Microsoft/TypeScript/issues/5911#issuecomment-175866624)).
- GOOD: `af2`'s return type annotation fails type checking because it's not `Promise<T>`. The return type definitely won't be an instance of any class other than `Promise`, so the implementation is good.
## Suggested Improvement
_Since GFs and AFs always return instances of known intrinsic types, we can rule out any type annotation that asserts they will return an instance of some other class._
Both generator and async functions could therefore be checked with the same two steps:
1. _Is Assignable:_ Ensure the return type of the GF/AF is assignable to the annotated return type. This is the basic check for all kinds of function return types. If not assignable, type checking fails. Otherwise, continue to step 2.
2. _Not a Class Type:_ Ensure the return type annotation is not a class type (except `Promise<T>` which is allowed for AFs). For example if the return type annotation is `Foo`, ensure it does not refer to `class Foo {...}` or another class-like value.
These rules have the following effects:
- GF and AF type checking are mutually consistent.
- This fixes `gf2` by ruling out class types like `MyIterator` in addition to checking assignability. GF type checking is made safer in general by catching a class of errors that currently slip through.
- This fixes `af1`, because it's no longer necessary to rule out _all_ annotations other than `Promise<T>`, but just those that are assignment-compatible class types like `MyPromise`. This approach will catch the breaking change from 1.7 as a compile error (as desired for reason [here](https://github.com/Microsoft/TypeScript/pull/6631#discussion_r51040975)), but allow harmless (and correct) things like `any` and `PromiseLike<T>`.
## Working Implementation
This is a small code change. I implemented this in a branch as best I could (but I may have made errors since I'm still getting my head around the codebase). The diff can be seen [here](https://github.com/yortus/TypeScript/compare/master...yortus:gen-async-func-return-types).
With this version of `tsc` the above code works as follows:
``` typescript
// Suggested behaviour - generator functions
class MyIterator implements Iterator<any> {next}
function* gf1(): any {} // OK
function* gf2(): MyIterator {} // ERROR: A generator cannot have a return type annotation of a class type.
// Suggested behaviour - async functions
class MyPromise extends Promise<any> {}
async function af1(): any {} // OK
async function af2(): MyPromise {} // ERROR: An async function cannot have a return type annotation of a class type other than Promise<T>.
```
| Suggestion,In Discussion | medium | Critical |
129,500,072 | TypeScript | Proposal: Merge enum and const enum features | Currently there are two enumerable types specified in TypeScript: `enum` and `const enum`.
Neither of them is bijective, i.e. neither provides the ability to cast arbitrarily and unambiguously between `string`, `number` and `enum`.
After discussing this on Gitter with @jsobell and @masaeedu I'd like to propose the following:
1. merge both enumerable types into one: using `const enum` on constant index expressions and `enum` on variable enum expressions.
2. always return a `number` value when a `string` index expression is used on an `enum`.
3. allow for both, `number` and `string` values, to be used as index argument types on the proposed merged `enum` type.
This would solve some major problems with the current design. Currently ...
1. `enum` values cannot be converted to their `number` equivalent, only to their `string` representation.
2. `const enum` values cannot be converted to their `string` representation, only to their `number` equivalent.
(This blocks `const enum` values from being used to serialize configuration settings into a commonly expected string value representation.)
3. `const enum` values can not be converted to `enum` values and vice versa.
The key to providing the missing functionality is type inference. At compile time, TSC is able to tell whether a string or a number value is provided in an `enum` index argument. It's also able to tell whether the index expression is a constant or a variable.
Given these prerequisites the compiler can easily decide ...
1. whether to use a `const enum` value or an `enum` in the generated code,
2. whether to return the numerical value or string representation of the enum in the generated code.
(Variable index expressions of type _any_ should be regarded as values of type `string`. This will result in maximum compatibility.)
#### So I'm proposing the following:
1. Using a `string` indexer expression on an `enum` shall return a `number` if the type of the L-value _isn't_ the enum itself: `enum[:string] => number`.
2. Using a `string` indexer expression on an `enum` shall return an `enum` if the type of the L-value _is_ the enum itself: `enum[:string] => E`.
3. Using a `number` indexer expression on an `enum` shall return a `string` if the type of the L-value _isn't_ the enum itself: `enum[:number] => string`.
4. Using a `number` indexer expression on an `enum` shall return an `enum` if the type of the L-value _is_ the enum itself: `enum[:number] => E`.
5. Using an `enum` indexer expression on the same `enum` type shall return a `number` if the type of the L-value _isn't_ the enum itself: `enum[:enum] => number`. (During transpilation, this operation is practically redundant and may be cancelled from the output.)
6. Using an `enum` indexer expression on the same `enum` type shall return an `enum` if the type of the L-value _is_ the enum itself: `enum[:enum] => E`. (During transpilation, this operation is practically redundant and may be cancelled from the output.)
7. Using a _constant_ `string` or `number` indexer expression on an `enum` shall emit a constant `number` value in JavaScript (i.e., this is the `const enum` equivalent).
8. Using a _variable_ `string` or `number` indexer expression on an `enum` shall emit an array indexer expression in JavaScript (i.e., this is the `enum` equivalent).
---
<br/>
#### So, given the above prerequisites, here are two examples, hopefully shedding some light upon my proposal:
<br/>
##### (A) Example illustrating type inference being used at compile time to decide whether to return a `number` or a `string` value from an enum:
The following TypeScript code:
``` ts
// TypeScript
enum E {a, b, c}
const iN :number = 0, iS1 :string = "b", iS2 :string = "2";
// assigning to primitive types
const s :string = E[iN];
const n1 :number = E[iS1]; // special treatment because index type is string
const n2 :number = E[iS2]; // special treatment because index type is string
// assigning to enum
const es :E = E[iN];
const en1 :E = E[iS1]; // special treatment because index type is string
const en2 :E = E[iS2]; // special treatment because index type is string
```
... should result in the following JavaScript compilation result:
``` js
// JavaScript
var E;
(function (E) {
E[E["a"] = 0] = "a";
E[E["b"] = 1] = "b";
E[E["c"] = 2] = "c";
E.toNumber = function (e)
{ return IsNaN(e) ? E[e] : E.isOwnProperty(e) ? +e : undefined; }
})(E || (E = {}));
var iN = 0, iS1 = "b", iS2 = "2";
var s = E[iN]; // === "a"
var n1 = E.toNumber(iS1); // === 1
var n2 = E.toNumber(iS2); // === 2
var es = E[iN] ? iN : undefined; // E[iN] returns a ?:string. --- result == 0
var en1 = E.toNumber(iS1); // == 1
var en2 = E.toNumber(iS2); // == 2
```
<br/>
##### (B) Example illustrating `const enum` and `enum` being merged into one single type:
The following TypeScript code:
``` ts
// TypeScript
enum E {a, b, c}
let e :E;
const n :number = 1;
const s :string ="c";
// constant assignments
e = E.a;
e = E[2];
e = E["a"];
// variable assignments
e = E[n];
e = E[E[n]];
e = E[s];
```
... should result in the following JavaScript compilation result:
``` js
// JavaScript
var E;
(function (E) {
E[E["a"] = 0] = "a";
E[E["b"] = 1] = "b";
E[E["c"] = 2] = "c";
E.toNumber = function (e)
{ return IsNaN(e) ? E[e] : E.isOwnProperty(e) ? +e : undefined; }
})(E || (E = {}));
var e;
var n = 1;
var s = "c";
e = 0;
e = 2;
e = 0;
e = E[n] ? n : undefined; // E[n] returns a ?:string. --- result == 1
e = E[n] ? n : undefined; // The outer indexing operation is redundant and may be cancelled. --- result == 1
e = E.toNumber(s); // == 2
```
<br/>
Some of the `E[n] ? n : undefined` constructs may be cancelled if runtime boundary checking isn't wanted/desired (new boolean compiler option?). So `e = E[n] ? n : undefined;` may then be transpiled to `e = n;`.
<br/>
In the discussion on Gitter I learned about #3507, #592. Yet I feel the above proposal will add value to the ongoing discussion on improving the `enum` types.
| Suggestion,Needs Proposal | medium | Critical |
129,500,360 | go | x/text/encoding/unicode: add examples | golang.org/x/text/encoding/unicode's documentation is a bit unclear to the newbie like myself that just wants to read files that Windows folks send them without having to become an expert in UTF-16. I think the docs make sense to someone that already understands transforms, unicode, etc, but not everyone should need to know such plumbing. All I wanted was something like ioutil.ReadFile() that automagically read MS-Windows UTF-16 files and gave me UTF-8.
An example like this would have helped:
```
func NewReader(rd io.Reader) io.Reader {
	// Make a transformer that decodes MS-Windows (16LE) UTF files:
	winutf := unicode.UTF16(unicode.LittleEndian, unicode.IgnoreBOM)
	// Make a transformer that is like winutf, but abides by BOM if found:
	decoder := winutf.NewDecoder()
	// Make a Reader that uses decoder:
	return transform.NewReader(rd, unicode.BOMOverride(decoder))
}

fd, _ := os.Open(filename)
r := NewReader(fd) // Read from "r" to get UTF-8.
utf8, _ := ioutil.ReadAll(r)
text := string(utf8)
```
It would be awesome if that example (or one like it) was added to the documentation. I'd be glad to submit a PR.
Furthermore, it would be useful to have a library similar to ioutil but automagically detects UTF-16 is found. I've made an example here: https://github.com/TomOnTime/utfutil
| Documentation,NeedsInvestigation | low | Minor |
129,702,946 | opencv | OpenCV projectPoints jacobian matrix is not correct | We compared the Jacobian matrix coming from projectPoints in OpenCV 2.4.11 against the one we have provided with matlab code (we have written the projection formula exactly based on the OpenCV source code). We found out that when the Tangential distortion coefficients (p1 and p2) are not zero the output of MATLAB code and the OpenCV projectPoints are not identical for the Jacobian of rotation and translation. It seems that the OpenCV source code is not correct for computing the Jacobian for rotation and translation.
we are using Ubuntu 14.04 64 bit with gcc version 4.8.4.
for your convenience you can find our MATLAB code in the following:
% convert rotation vector to rotation matrix (world_in_camera)
syms rx ry rz real;
r=[rx; ry; rz];
theta=norm(r);
r=r/theta;
R=cos(theta)*eye(3)+(1-cos(theta))*(r*r')+sin(theta)*[0,-r(3), r(2);r(3),0,-r(1);-r(2),r(1),0];
% camera Position world_in_camera
syms tx ty tz real;
T = [tx; ty; tz];
syms fx fy cx cy real ;
% 3D points in camera coordinate system
syms X Y Z real; % position of 3D Points in world coordinate system
P = R*[X; Y; Z] + [tx; ty; tz];
% distortion
syms k1 k2 p1 p2 k3
x_perim = P(1)/P(3);
y_perim = P(2)/P(3);
r2 = (x_perim * x_perim) + (y_perim * y_perim);
r4 = r2 * r2;
r6 = r4 * r2;
cdist = 1 + k1*r2 + k2*r4 + k3*r6;
a1 = 2*x_perim*y_perim;
a2 = r2 + 2*x_perim*x_perim;
a3 = r2 + 2*y_perim*y_perim;
x_zegond = x_perim * cdist + p1 * a1 + p2 * a2;
y_zegond = y_perim * cdist + p1 * a3 + p2 * a1;
% 3D point projection equation
xp(X, Y, Z) = x_zegond * fx + cx;
yp(X, Y, Z) = y_zegond * fy + cy;
syms X3 Y3 Z3 real;
x = xp(X3, Y3, Z3);
y = yp(X3, Y3, Z3);
imagePoints=[x;y];
% disp('imagePoints = ');
% pretty(imagePoints);
dp_dfx = diff(imagePoints, fx);
dp_dfy = diff(imagePoints, fy);
dp_dcx = diff(imagePoints, cx);
dp_dcy = diff(imagePoints, cy);
dp_dk1 = diff(imagePoints, k1);
dp_dk2 = diff(imagePoints, k2);
dp_dp1 = diff(imagePoints, p1);
dp_dp2 = diff(imagePoints, p2);
dp_dk3 = diff(imagePoints, k3);
dp_drx = diff(imagePoints, rx);
dp_dry = diff(imagePoints, ry);
dp_drz = diff(imagePoints, rz);
dp_dtx = diff(imagePoints, tx);
dp_dty = diff(imagePoints, ty);
dp_dtz = diff(imagePoints, tz);
jacobianMat = [dp_drx, dp_dry, dp_drz ,...
dp_dtx, dp_dty, dp_dtz ,...
dp_dfx, dp_dfy, dp_dcx, dp_dcy, ...
dp_dk1, dp_dk2, dp_dp1, dp_dp2, dp_dk3];
fprintf('generating c code for jacobian \n');
ccode(jacobianMat,'file','jacobian.cpp');
| bug,category: calib3d,affected: 2.4 | low | Minor |
129,845,753 | youtube-dl | youtube-dl skips post-processing for M4As | Hey there.
First of all, thanks for a great program, youtube-dl is the only program on the market I've come across that allows me to extract the audio from my huge YouTube playlists and convert them to M4A ready to be put on my phone.
The problem I'm currently having is that after extracting the audio, ffmpeg skips the post-processing, leaving me with the original-quality audio files rather than converting them to the 192k bitrate that I need. After posting here in StackOverflow: http://stackoverflow.com/questions/34930699/cant-use-youtube-dl-to-download-specified-bitrate/34930832?noredirect=1#comment57644155_34930832, a user there informed me that I'm probably experiencing another instance of a previous bug that skipped post-processing on M4A files.
Here's what I'm getting when I try to extract and transcode:
```
B:\Users\Hashim>youtube-dl --extract-audio --audio-format m4a --audio-quality 19
2k --playlist-items 48 https://www.youtube.com/playlist?list=PLR
3nWwHlZ9WBpi3uWsjSe6r1PiA8MTbnE
[youtube:playlist] PLR3nWwHlZ9WBpi3uWsjSe6r1PiA8MTbnE: Downloading webpage
[download] Downloading playlist: Main Playlist
[youtube:playlist] PLR3nWwHlZ9WBpi3uWsjSe6r1PiA8MTbnE: Downloading page #1
[youtube:playlist] PLR3nWwHlZ9WBpi3uWsjSe6r1PiA8MTbnE: Downloading page #2
[youtube:playlist] PLR3nWwHlZ9WBpi3uWsjSe6r1PiA8MTbnE: Downloading page #3
[youtube:playlist] PLR3nWwHlZ9WBpi3uWsjSe6r1PiA8MTbnE: Downloading page #4
[youtube:playlist] playlist Main Playlist: Downloading 1 videos
[download] Downloading video 1 of 1
[youtube] -o36bO1XKnw: Downloading webpage
[youtube] -o36bO1XKnw: Downloading video info webpage
[youtube] -o36bO1XKnw: Extracting video information
[youtube] -o36bO1XKnw: Downloading DASH manifest
[download] Destination: B:\Users\Hashim\Desktop\New Folder\Have Mercy - Cigarett
es And Old Perfume.m4a
[download] 100% of 6.24MiB in 00:13
[ffmpeg] Correcting container in "B:\Users\Hashim\Desktop\New Folder\Have Mercy -
Cigarettes And Old Perfume.m4a"
[ffmpeg] Post-process file B:\Users\Hashim\Desktop\New Folder\Have Mercy - Cigar
ettes And Old Perfume.m4a exists, skipping
[download] Finished downloading playlist: Main Playlist
```
Note the final line, "Post-process file x exists, skipping"
Thanks, would really appreciate a fix for this as soon as possible as it's preventing me from getting my music in the format and quality I need.
| bug,postprocessors | medium | Critical |
129,916,291 | opencv | Python Multiprocessing Problem | I'm using python 2.7 and opencv 2.4.11 and am having some trouble using the multiprocessing module for a very simple purpose. Here's a sample of my code with a consumer-producer framework employed.
```
import multiprocessing

import cv2
import cv2.cv as cv

class Consumer(multiprocessing.Process):
    def __init__(self, task_queue, result_queue):
        multiprocessing.Process.__init__(self)
        self.task_queue = task_queue
        self.result_queue = result_queue

    def run(self):
        # Creating window and starting video capture from camera
        cv2.namedWindow("preview")
        vc = cv2.VideoCapture(0)
        vc.set(cv.CV_CAP_PROP_FRAME_WIDTH, 640)
        vc.set(cv.CV_CAP_PROP_FRAME_HEIGHT, 480)
        if vc.isOpened():
            rval, frame = vc.read()
        else:
            rval = False
        while rval:
            cv2.imshow("preview", frame)
            rval, frame = vc.read()
            key = cv2.waitKey(20)
            if key == 27:
                break
        cv2.destroyWindow("preview")
        return

##################################
# MAIN PROGRAM
##################################

# SETTING UP MULTIPROCESSING STUFF
tasks = multiprocessing.JoinableQueue()
results = multiprocessing.Queue()
consumer = Consumer(tasks, results)
consumer.start()
# Rest of main program
```
However, the process hangs on the line
`cv2.namedWindow("preview")`
Is this a bug? How do I work around this?
| bug,category: python bindings,affected: 2.4 | low | Critical |
130,212,221 | neovim | Support GUI signs | `sign` support allowed the GUI to place icons as defined through the `sign icon=` command, but we have no equivalent for remote GUIs, and the old code got removed.
Ping equalsraf/neovim-qt#90.
| enhancement,gui,ui-extensibility | low | Minor |
130,388,425 | TypeScript | Readonly indexer and constructor usage: question | ``` typescript
class A {
    readonly x;
    readonly [x: string]: string;
    constructor() {
        this.x = 5; // that's okay
        this["a"] = "s"; // that yields an error
    }
}
```
Is that by design that readonly indexers for the same type in constructor yield errors?
| Suggestion,Awaiting More Feedback | low | Critical |
130,412,950 | TypeScript | Destructuring causes error for null/undefined properties | The following code:
``` ts
interface Foo {
    x: number;
}

interface Bar {
    foo?: Foo;
}

function test( { foo: { x } }: Bar ) {
    alert( x );
}

test( {} ); // TypeError: Cannot read property 'x' of undefined
```
transpiles to:
``` js
function test(_a) {
    var x = _a.foo.x;
    alert(x);
}
```
instead of the (arguably) preferable:
``` js
function test(_a) {
    var x;
    if (_a.foo) x = _a.foo.x;
    alert(x);
}
```
The transpiled code does not check for the presence of optional property `foo` in the `Bar` interface. This leads to an error at invocation.
I can see three possible solutions:
1. Always check for member existence when transpiling destructuring statements.
2. Check for member existence only when an element is optional in the interface or type declaration.
3. New syntax for destructuring that can indicate members should be checked.
E.g.
``` ts
function test( { foo?: { x } }: Bar ) {}
```
| Suggestion,In Discussion,Help Wanted | medium | Critical |
130,460,512 | go | runtime: GC should wake up idle Ps | Currently the GC doesn't always wake up idle Ps, and hence may not take full advantage of idle marking during the concurrent mark phase. This can happen during mark 2 because mark 1 completion preempts all workers; if the Ps running those workers have nothing else to do they will simply park, and there's no mechanism to wake them up after we allow mark workers to start again. It's possible this can happen during mark 1 as well, though it may be since we allow mark workers to run before starting the world for mark 1 that all of the Ps start running.
/cc @RLH
| Performance,NeedsFix,compiler/runtime | low | Major |
130,465,471 | go | syscall: synchronization between clone() and execve() | I would like to see a synchronization primitive to be added between the clone() and the execve() call in the function forkAndExecInChild here https://golang.org/src/syscall/exec_linux.go so that the parent can setup the child after the clone() but before the execve().
| compiler/runtime | low | Major |
130,515,751 | youtube-dl | --download-archive doesn't create directories when needed | So, I'm on Linux and this is the exact command I'm executing:
```
sudo youtube-dl https://www.youtube.com/playlist?list=PLjlvXrjuRczngya2R6iQpUx8UnSCG7DAo --extract-audio --audio-format mp3 --playlist-start 1 -o playlist_storage/PLjlvXrjuRczngya2R6iQpUx8UnSCG7DAo/%\(title\)s.%\(ext\)s --download-archive playlist_storage/PLjlvXrjuRczngya2R6iQpUx8UnSCG7DAo/PLjlvXrjuRczngya2R6iQpUx8UnSCG7DAo.txt
```
And I get this error:
```
Traceback (most recent call last):
File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
File "/usr/local/bin/youtube-dl/__main__.py", line 19, in <module>
File "/usr/local/bin/youtube-dl/youtube_dl/__init__.py", line 411, in main
File "/usr/local/bin/youtube-dl/youtube_dl/__init__.py", line 401, in _real_main
File "/usr/local/bin/youtube-dl/youtube_dl/YoutubeDL.py", line 1690, in download
File "/usr/local/bin/youtube-dl/youtube_dl/YoutubeDL.py", line 677, in extract_info
File "/usr/local/bin/youtube-dl/youtube_dl/YoutubeDL.py", line 830, in process_ie_result
File "/usr/local/bin/youtube-dl/youtube_dl/YoutubeDL.py", line 623, in _match_entry
File "/usr/local/bin/youtube-dl/youtube_dl/YoutubeDL.py", line 1768, in in_download_archive
File "/usr/local/bin/youtube-dl/youtube_dl/utils.py", line 1231, in __init__
IOError: [Errno 20] Not a directory: 'playlist_storage/PLjlvXrjuRczngya2R6iQpUx8UnSCG7DAo/PLjlvXrjuRczngya2R6iQpUx8UnSCG7DAo.txt'
```
What's causing this?
| bug | low | Critical |
130,516,689 | TypeScript | Enable JavaScript specific warning behavior | See #6658 for background.
Currently for JavaScript files we provide a very limited set of errors, mostly for syntax errors. There are a number of grammatical and type errors that would be useful to display also.
Brief notes from ad hoc discussion in the team room include:
- Move the code that flags TypeScript syntax in JavaScript files for better error handling
- Add a distinction between errors and warnings to avoid "valid" JavaScript containing errors
- Make the warnings user configurable as part of the project (i.e. may enable/disable certain warnings)
More detailed design to come...
| Suggestion,Committed,VS Code Tracked,Domain: JavaScript | low | Critical |
130,538,870 | nvm | node not found after install | ```
v21@v21:~$ nvm install 5.0
Downloading https://nodejs.org/dist/v5.0.0/node-v5.0.0-linux-x64.tar.xz...
######################################################################## 100.0%
WARNING: checksums are currently disabled for node.js v4.0 and later
/home/v21/.nvm/versions/node/v5.0.0/bin/npm: 2: exec: /home/v21/.nvm/versions/node/v5.0.0/bin/node: not found
nvm is not compatible with the npm config "prefix" option: currently set to ""
Run 'nvm use --delete-prefix v5.0.0' to unset it.
```
Seeing the same error on every version I try. nvm version is 0.30.2. I'm running Ubuntu 12.04.5 LTS (GNU/Linux 4.1.5-x86_64-linode61 x86_64) with bash. I have an install of 0.10 still working via nvm.
the node executable is visible with ls:
```
v21@v21:~$ ls -alh /home/v21/.nvm/versions/node/v5.5.0/bin/
total 25M
drwxrwxr-x 2 v21 v21 4.0K Jan 21 00:59 .
drwxrwxr-x 6 v21 v21 4.0K Feb 2 01:22 ..
-rwxrwxr-x 1 v21 v21 25M Jan 21 00:59 node
lrwxrwxrwx 1 v21 v21 38 Jan 21 00:59 npm -> ../lib/node_modules/npm/bin/npm-cli.js
```
| installing node,feature requests,pull request wanted | low | Critical |
130,768,594 | go | net/http/httptest: optional faster test server | Look into making httptest.Server have a way to use in-memory networking (i.e. net.Pipe) instead of localhost TCP. Might be faster/cheaper and not hit ephemeral port issues under load?
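A rough sketch of the idea, using `net.Pipe` behind a custom `net.Listener`; the `memListener` type and its `Dial` helper below are my own illustration, not an existing httptest API:
``` go
package main

import (
	"fmt"
	"io"
	"net"
	"net/http"
)

// memListener hands out net.Pipe connections, so the server never touches
// localhost TCP or ephemeral ports.
type memListener struct {
	conns chan net.Conn
	done  chan struct{}
}

func newMemListener() *memListener {
	return &memListener{conns: make(chan net.Conn), done: make(chan struct{})}
}

func (l *memListener) Accept() (net.Conn, error) {
	select {
	case c := <-l.conns:
		return c, nil
	case <-l.done:
		return nil, io.EOF
	}
}

func (l *memListener) Close() error   { close(l.done); return nil }
func (l *memListener) Addr() net.Addr { return &net.UnixAddr{Name: "in-memory", Net: "mem"} }

// Dial gives one pipe end to the server's Accept and returns the other end.
func (l *memListener) Dial(network, addr string) (net.Conn, error) {
	server, client := net.Pipe()
	select {
	case l.conns <- server:
		return client, nil
	case <-l.done:
		return nil, io.EOF
	}
}

func main() {
	l := newMemListener()
	srv := &http.Server{Handler: http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprint(w, "hello")
	})}
	go srv.Serve(l)

	client := &http.Client{Transport: &http.Transport{Dial: l.Dial}}
	resp, err := client.Get("http://in-memory/")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	b, _ := io.ReadAll(resp.Body)
	fmt.Println(string(b)) // hello
}
```
Whether this is actually faster than loopback TCP would need measuring, but it sidesteps ephemeral ports entirely.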
| help wanted,NeedsFix,FeatureRequest | medium | Major |
130,774,506 | youtube-dl | site request: atpworldtour | "Association of Tennis Professionals"
URL example:
http://www.atpworldtour.com/en/video/rotterdam-2015-final-highlights-wawrinka-berdych
Thanks
Ringo
| site-support-request | low | Minor |
130,845,875 | go | net/http: source of errors is unclear | **What version of Go are you using (go version)?**
1.5.3.
**What operating system and processor architecture are you using?**
Linux. AMD 64.
**What did you do?**
I'm using `httputil.ReverseProxy` to proxy to a host behind ELB, and I'm getting the errors "http: proxy error: net/http: request canceled" & "http: proxy error: net/http: transport closed before response was received" occasionally. The line that printed the error is [here](https://github.com/golang/go/blob/release-branch.go1.5/src/net/http/httputil/reverseproxy.go#L196). And the error came from [`http.Transport.RoundTrip`](https://github.com/golang/go/blob/release-branch.go1.5/src/net/http/httputil/reverseproxy.go#L194).
After more digging, [the errors](https://github.com/golang/go/blob/release-branch.go1.5/src/net/http/transport.go#L1093-L1094) could come from a couple places in `http.Transport.RoundTrip`. I was trying to find a way to print the callstacks of the error from `http.Transport.RoundTrip` to understand why it was returned but there's currently no way to do so.
Details can be found in [this golang-nuts thread](https://groups.google.com/forum/#!topic/golang-nuts/apVfiayC01I).
**What did you expect to see?**
I would expect I could instrument a method of either application code or Go's standard library and find out the callstacks at runtime.
**What did you see instead?**
No way to find out the callstack of a method from Go's stdlib. This is a feature request.
| Documentation,NeedsInvestigation | medium | Critical |
130,865,507 | opencv | Support for HEVC and VP9 in cudacodec::VideoReader | Hello,
The current version of cudacodec::VideoReader does not support HEVC/VP9 decode. NVIDIA cards with GM206 and above have a VP that supports both decode and encode of HEVC.
| feature,category: gpu/cuda (contrib) | low | Minor |
130,899,194 | javascript | Exception for space-before-keywords? | Writing a React/redux app, I am occasionally declaring anonymous classes inside function calls:
``` javascript
export default connect()(class extends Component {
// React component class definition
});
```
which gives this error: `Missing space before keyword "class". (space-before-keywords)`.
If I put a space there:
``` javascript
export default connect()( class extends Component {
// React component class definition
});
```
It fails another rule: `There should be no spaces inside this paren. (space-in-parens)`.
Was wondering if that was an unintended effect of the `space-before-keywords` or if everything is working as expected.
| needs eslint rule change/addition | low | Critical |
130,908,856 | opencv | BackgroundSubtractorMOG2 apply calls from python causes allocation error | Error message:
```
File "(myfile.py)", line 1500, in (some function)
bgsub.apply(frame)
error: /tmp/opencv-3.1.0/modules/python/src2/cv2.cpp:163: error: (-215) The data should normally be NULL! in function allocate
```
This appears to be a similar bug as [#5667](https://github.com/Itseez/opencv/issues/5667) for knnMatch, however the [fix](https://github.com/Itseez/opencv/pull/6009) for that bug doesn't apply here
The bug seems to be related to an allocate call. It seems to be avoided for python3 and some people just disable the warning [link](http://stackoverflow.com/questions/32931856/cv2-3-0-0-cv2-flannbasedmatcher-flann-knnmatch-is-throwing-cv2-error), however it would be better to identify the problem (eg for python 2) and prevent it occurring.
| bug,category: python bindings,affected: 3.4 | low | Critical |
130,946,003 | go | tour: figure out how to make pic.Show interact well with debug prints | Context: https://tour.golang.org/moretypes/15
Running the slices exercise doesn't display the bluescale image. Instead it displays the slices of slices returned by Pic and, as the last line, prints "IMAGE: <base64-value>".
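For reference, a typical solution plus a debug print reproduces this; the extra `fmt.Println` output ends up next to the `IMAGE:` line (assumed reproduction, using the real `golang.org/x/tour/pic` helper):
``` go
package main

import (
	"fmt"

	"golang.org/x/tour/pic"
)

func Pic(dx, dy int) [][]uint8 {
	p := make([][]uint8, dy)
	for y := range p {
		p[y] = make([]uint8, dx)
		for x := range p[y] {
			p[y][x] = uint8(x ^ y)
		}
	}
	fmt.Println(p) // debug print: this is what gets shown instead of the image
	return p
}

func main() {
	pic.Show(Pic)
}
```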
| help wanted | low | Critical |
131,041,599 | youtube-dl | Support request for http://knpuniversity.com | This website provides video tutorials in PHP programming.
There are some free tracks on the website which can be downloaded by the tool.
Example of video page: http://knpuniversity.com/screencast/symfony/twig-layouts
Example of playlist page: http://knpuniversity.com/screencast/symfony
| site-support-request | low | Minor |
131,054,331 | rust | rustdoc's search-index.js file is huge for large projects | See https://bugzilla.mozilla.org/show_bug.cgi?id=1245213#c3
We should fix SpiderMonkey / the profiler to be smarter, but 15.2 MB is a _lot_ of JS to load on every page load. Looks like this file will also create a ton of JS objects/strings.
Can we load this file only when we're using the search bar? Maybe we can split it up somehow and load only the relevant parts? Or come up with a more efficient format for it?
| T-rustdoc,C-enhancement,I-heavy,A-rustdoc-search | low | Critical |
131,129,072 | kubernetes | Support "Delete" and "Get" via UID or with UID as a precondition | Currently we use `<namespace, name>` pair as the identifier for a delete operation, e.g., https://github.com/kubernetes/kubernetes/blob/master/pkg/client/unversioned/pods.go#L72. However, the `<namespace, name>` pair is not unique across time, and it may lead to a race condition like #19403.
To make the Delete unambiguous, we should support
1. Delete via UID;
2. Delete via `<namespace, name>` and passing UID through DeleteOptions as a precondition.
The "Get" operation also suffers a similar problem. Users need to check the returned object' UID to make sure it is the object they intend to get. This is error prone. We should support "Get" via UID.
| area/api,sig/api-machinery,lifecycle/frozen | medium | Critical |
131,192,455 | kubernetes | Umbrella issue for kubectl config command overhaul | `kubectl config` strongly departs from the command and output conventions of other commands. It shouldn't.
Current commands:
https://github.com/kubernetes/kubernetes/blob/master/docs/user-guide/kubectl/kubectl_config.md
Conventions:
https://github.com/kubernetes/kubernetes/blob/master/docs/devel/kubectl-conventions.md
Related: #5592, #7804, #8593, #8817, #9298, #10516, #10626, #10693, #10735, #11233, #16085, #16935, #20374
@deads2k proposed English descriptions:
https://github.com/kubernetes/kubernetes/issues/8593#issuecomment-104272624
https://github.com/kubernetes/kubernetes/pull/20468#issuecomment-179812594
@thockin proposed a family of commands:
https://github.com/kubernetes/kubernetes/issues/8593#issuecomment-104503048
https://github.com/kubernetes/kubernetes/issues/10516#issuecomment-116965691
TODO: Create proposal
cc @jlowdermilk @deads2k @thockin @quinton-hoole @kubernetes/kubectl
| priority/backlog,area/kubectl,sig/cli,lifecycle/frozen | low | Major |
131,200,944 | You-Dont-Know-JS | "es6 & beyond": regex clarification | From Chapter 2 on sticky regex
> from the content of the source property, such as
Your code goes on to use toString, but not source. You might clarify in future editions.
| for second edition | medium | Minor |
131,414,560 | opencv | inpaint cannot handle borders | I wanted to smooth some depth map images by using the `inpaint()` functionality. However, I noticed that the regions with no depth information on the borders are not completely "washed out" by the function as I was expecting. An example can be seen in [this question](http://answers.opencv.org/question/86569/inpainting-normal-behavior-or-a-bug/) I posted in the official OpenCV forum. My question is whether this is expected behavior or there is an actual problem in the implementation of the function.
Thanks.
Best,
Theo
| category: photo,RFC | low | Critical |
131,449,790 | vscode | Git - Automatically insert line breaks in git commit messages | I'm trying to get into the habit of writing good commit messages. Not sure if this should be the default but it would be really useful if vscode automatically wrapped commit message lines at say 80 chars and inserted line breaks.
I don't mean changing the appearance of the text in the commit message box, just before it's actually committed. The line length could be in an option too.
| help wanted,feature-request,git | high | Critical |
131,492,057 | go | spec: formatting of tables relies on fixed-width fonts | On https://golang.org/ref/spec if you search for "array or slice" the table is not aligned as a table. I can't tell from this paragraph which is the 1st or the 2nd value in a range expression for different types.
(https://golang.org/doc/effective_go.html#for is clear on the range expression, though it's not the spec.)
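For what it's worth, here is the behavior I eventually pieced together for slices and maps (my own example, not taken from the spec):
``` go
package main

import "fmt"

func main() {
	xs := []string{"a", "b", "c"}
	// Array or slice: the 1st range value is the index, the 2nd is the element.
	for i, v := range xs {
		fmt.Println(i, v) // 0 a, 1 b, 2 c
	}

	m := map[string]int{"x": 1}
	// Map: the 1st range value is the key, the 2nd is the value.
	for k, v := range m {
		fmt.Println(k, v) // x 1
	}
}
```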
| NeedsInvestigation | low | Major |
131,687,981 | vscode | activationEvents.workspaceContains doesn't fire for directory | Is this expected? Also, a general way to detect all workspace-open events, like 'workspaceContains:*' or an onDidChangeWorkspace event would be useful.
| feature-request,api | low | Major |
131,721,647 | go | tour: Search in table of contents | Context: http://tour.golang.org/*
The tour is a good reference for quick examples, when you are not familiar with go yet. It would be useful to be able to search in the contents of the tour. At least in its table of contents.
| Documentation,help wanted,NeedsInvestigation,FeatureRequest | low | Minor |
131,752,491 | rust | Tracking issue for `?` operator and `try` blocks (RFC 243, `question_mark` & `try_blocks` features) | Tracking issue for rust-lang/rfcs#243 and rust-lang/rfcs#1859.
Implementation concerns:
- [x] `?` operator that is roughly equivalent to `try!` - #31954
- [x] `try { ... }` expression - https://github.com/rust-lang/rust/issues/39849
- [x] resolve `do catch { ... }` syntax question
- Resolved as `try { .. }`, - https://github.com/rust-lang/rust/issues/50412
- [x] resolve whether catch blocks should "wrap" result value (first addressed in https://github.com/rust-lang/rust/issues/41414, now being settled anew in https://github.com/rust-lang/rust/issues/70941)
- [ ] Add a test confirming that it's an `ExprWithBlock`, so works in a match arm without a comma
- [ ] Address issues with type inference (`try { expr? }?` currently requires an explicit type annotation somewhere).
- [x] settle design of the `Try` trait (https://github.com/rust-lang/rfcs/pull/1859)
- [x] implement new `Try` trait (in place of `Carrier`) and convert `?` to use it (https://github.com/rust-lang/rust/pull/42275)
- [x] add impls for `Option` and so forth, and a suitable family of tests (https://github.com/rust-lang/rust/pull/42526)
- [x] improve error messages as described in the RFC (https://github.com/rust-lang/rust/issues/35946)
- [x] reserve `try` in new edition
- [x] block `try{}catch` (or other following idents) to leave design space open for the future, and point people to how to do what they want with `match` instead | B-RFC-approved,T-lang,T-libs-api,B-unstable,B-RFC-implemented,C-tracking-issue,A-error-handling,F-try_blocks,Libs-Tracked,S-tracking-design-concerns | high | Critical |
131,837,716 | neovim | visual effect for operations | Replicated from [StackOverflow](http://stackoverflow.com/questions/35171726/is-there-visual-flash-effect-for-editing)
What we are looking for is to briefly flash the affected areas of vim editing in normal mode. For example, when editing
``` C
if (true) {
//line to be deleted
}
```
if we do dd on //line to be deleted, this affected area should be flashed before deleting. The same can be done using Vd. What we are looking for is the same effect as that of Vd when using dd. This should work for all editing operations like c, y, etc.
Maybe the effect can be controlled using a configuration option.
| enhancement,gsoc | low | Minor |
131,837,719 | youtube-dl | francetv unable to find ID | Trying to download from here:
http://television.telerama.fr/television/regardez-cargos-la-face-cachee-du-fret-ou-l-invisible-armada-de-la-mondialisation,137619.php
Which is using this iframe:
http://embed.francetv.fr/?ue=a69734825a04041ae3b51fd8b41f87f1&autoplay=1
And:
[francetv] a69734825a04041ae3b51fd8b41f87f1&autoplay=1: Downloading webpage
ERROR: Unable to extract video ID;
ID is located here: http://sivideo.webservices.francetelevisions.fr/tools/getInfosOeuvre/v2/?idDiffusion=NI_620048&catalogue=Catalogue_programmes&callback=_jsonp_loader_callback_request_0
...
```
_jsonp_loader_callback_request_0({"id":"NI_620048", ...
```
Thanks!
| geo-restricted | low | Critical |
131,839,076 | rust | The Vec should not needlessly overallocate capacity if it is guaranteed to fail. | Current Vec / RawVec allocation strategy is to at least double the capacity on
reallocation. This leads to unnecessary panic when the new overallocated
capacity exceeds std::isize::MAX. If allocating std::isize::MAX would be
sufficient it should do so instead.
For example, I would expect the following to work on a 32-bit platform (provided
that there is a sufficient amount of memory):
``` rust
fn main() {
let mut v = std::vec::from_elem(1 as u8, (std::isize::MAX / 2 + 1) as usize);
v.push(1);
}
```
| C-enhancement,A-collections,T-libs-api | low | Minor |
131,870,964 | vscode | Automatically Activate Markdown Preview | I would like to eliminate the step of pressing `ctrl+k v` to preview the md file I open. The preview should always be on.
Thank you
| feature-request,markdown | high | Critical |
131,890,153 | flutter | Make it real easy to open an issue with crash reports and logs | Imagine a `flutter report-bug` command. It could grab any crash report from iOS, any recent logs, the flutter version, the Dart version, and then send it off to GitHub as an issue.
We want to make it as simple as possible.
Example UI:
```
$ flutter report-bug
Sorry you ran into trouble, but we appreciate the bug report.
This command collects potentially relevant information and opens a public bug report for the Flutter team as a GitHub issue. Please review the information before you confirm.
Flutter version: ...
Dart version: ...
OS: mac
Crash file: path/to/file
Log: path/to/file or print out what we will send
OK to make this information public as a GitHub issue? [y/N]
```
| c: new feature,team,tool,a: triage improvements,P3,team-tool,triaged-tool | low | Critical |
131,970,355 | youtube-dl | site support request: www.tg4.ie | Sample URL: http://www.tg4.ie/ga/player/baile/?pid=4742502074001
`$ youtube-dl --version
2016.01.27`
```
$ youtube-dl -v http://www.tg4.ie/ga/player/baile/?pid=4742502074001
[debug] System config: []
[debug] User config: []
[debug] Command-line args: [u'-v', u'http://www.tg4.ie/ga/player/baile/?pid=4742502074001']
[debug] Encodings: locale UTF-8, fs UTF-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2016.01.27
[debug] Python version 2.7.6 - Linux-3.13.0-77-generic-x86_64-with-Ubuntu-14.04-trusty
[debug] exe versions: avconv 9.18-6, avprobe 9.18-6, ffmpeg 2.5.3-1kxstudio1, ffprobe 2.5.3-1kxstudio1, rtmpdump 2.4-n87-gita9f353c-ppa8
[debug] Proxy map: {}
[generic] ?pid=4742502074001: Requesting header
WARNING: Falling back on generic information extractor.
[generic] ?pid=4742502074001: Downloading webpage
[generic] ?pid=4742502074001: Extracting information
ERROR: Unsupported URL: http://www.tg4.ie/ga/player/baile/?pid=4742502074001
Traceback (most recent call last):
File "/usr/lib/python2.7/dist-packages/youtube_dl/extractor/generic.py", line 1289, in _real_extract
doc = compat_etree_fromstring(webpage.encode('utf-8'))
File "/usr/lib/python2.7/dist-packages/youtube_dl/compat.py", line 248, in compat_etree_fromstring
doc = _XML(text, parser=etree.XMLParser(target=etree.TreeBuilder(element_factory=_element_factory)))
File "/usr/lib/python2.7/dist-packages/youtube_dl/compat.py", line 237, in _XML
parser.feed(text)
File "/usr/lib/python2.7/xml/etree/ElementTree.py", line 1642, in feed
self._raiseerror(v)
File "/usr/lib/python2.7/xml/etree/ElementTree.py", line 1506, in _raiseerror
raise err
ParseError: syntax error: line 2, column 0
Traceback (most recent call last):
File "/usr/lib/python2.7/dist-packages/youtube_dl/YoutubeDL.py", line 666, in extract_info
ie_result = ie.extract(url)
File "/usr/lib/python2.7/dist-packages/youtube_dl/extractor/common.py", line 313, in extract
return self._real_extract(url)
File "/usr/lib/python2.7/dist-packages/youtube_dl/extractor/generic.py", line 1908, in _real_extract
raise UnsupportedError(url)
UnsupportedError: Unsupported URL: http://www.tg4.ie/ga/player/baile/?pid=4742502074001
```
| site-support-request | low | Critical |
132,005,297 | rust | Add rustc --emit=link-info for staticlib link-line output | When linking rust into C++ projects, I use `rustc --crate-type=staticlib` to generate a static library which I can link into the overall project. Since there's no standard for transitive dependency declaration in the C ABI for static libraries, rustc prints out a list of libraries which need to be linked along with the staticlib. (https://github.com/rust-lang/rust/issues/25820#issuecomment-106480235)
This is very helpful in development, but not ideal for automation. I propose adding an `--emit=link-info` option (`--emit=libs`? `--emit=ldflags`?) to write the required link line out to a file for use later, similar to `--emit=dep-info` for makefile dependency generation.
Order of evaluation is a little trickier than with dep-info since it has to work the first time, but this should simplify funky stderr hacks like
``` make
libfoo.a libfoo.a.out: foo/lib.rs
	rustc -g --crate-type staticlib --crate-name foo \
	  --emit dep-info,link=$@ $< \
	  2> $@.out || cat $@.out >&2
-include foo.d

prog: RUST_LIBS = $(shell awk '/^note: library: / {print "-l"$$3}' libfoo.a.out)
prog: prog.o libfoo.a
	$(CXX) $(CXXFLAGS) -o $@ $^ $(RUST_LIBS)
```
into
``` make
libfoo.a libfoo.a.link: foo/lib.rs
	rustc -g --crate-type staticlib --crate-name foo \
	  --emit dep-info,link=$@,link-info=$@.link $<
-include foo.d

prog: prog.o libfoo.a
	$(CXX) $(CXXFLAGS) -o $@ $^ $(shell cat libfoo.a.link)
```
| A-frontend,T-compiler,C-feature-request | low | Major |
132,023,607 | youtube-dl | Site support request: cops.com | Hello,
Could you add cops.com as a supported website? They seem to break full episodes into separate clips for showing advertisements in-between.
Example of a full episode: http://www.cops.com/cops2805/
Thanks.
| site-support-request | low | Minor |
132,083,670 | neovim | python remote plugin: warn when file doesn't export handlers | While developing a Python3 plugin for neovim, I used a directory structure like this:
```
.
├── requirements.txt
├── rplugin
│   ├── __init__.py
│   └── python3
│       ├── myplugin.py
│       └── __init__.py
└── tests
    ├── __init__.py
    └── myplugin_tests.py
```
When I attempted to use `vim-plug` to load the plugin from the filesystem and called `UpdateRemotePlugins`, I got the following error in the nvim python log:
```
2016-02-08 00:32:52,269 [ERROR @ host.py:_load:94] 2780 - /home/siddharthist/[...]/rplugin/python3/__init__.py exports no handlers
```
The expected behavior is that neovim would only warn or silently ignore python files that don't export handlers, because there are lots of valid use cases for having another non-plugin python file in the same directory as the plugin file.
I fixed it in my case by renaming `myplugin.py` to `__init__.py`.
What do you all think?
### Extra Info
I'm using Arch Linux,
```
$ nvim --version
NVIM 0.1.1 (compiled Jan 22 2016 15:40:18)
Build type: RelWithDebInfo
Compilation: /usr/bin/cc -march=x86-64 -mtune=generic -O2 -pipe -fstack-protector-strong -Wconversion -O2 -g -DDISABLE_LOG -Wall -Wextra -pedantic -Wno-unused-parameter -Wstrict-prototypes -std=gnu99 -Wvla -fstack-protector-strong -fdiagnostics-color=auto -DINCLUDE_GENERATED_DECLARATIONS -DHAVE_CONFIG_H -I/build/neovim/src/build/config -I/build/neovim/src/neovim-0.1.1/src -I/usr/include -I/usr/include -I/usr/include/luajit-2.0 -I/usr/include -I/usr/include -I/usr/include -I/usr/include -I/usr/include -I/build/neovim/src/build/src/nvim/auto -I/build/neovim/src/build/include
Compiled by builduser
Optional features included (+) or not (-): +acl +iconv +jemalloc
For differences from Vim, see :help vim-differences
system vimrc file: "$VIM/sysinit.vim"
fall-back for $VIM: "/usr/share/nvim"
```
| enhancement,provider,complexity:low | low | Critical |
132,134,473 | go | doc: add guidance on new vs. literal usage | One thing I observed with new gophers is that they have trouble judging when to use `new` and when to take the address of a struct literal.
Adding a bit of guidance in the wiki or effective_go.html would be quite useful.
Suggestion for guidance (a short example follows the list):
- prefer the literal
- if you don't initialize the struct with values, use new
- if you did use new, don't initialize values later, prefer switching the code to a literal instead.
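For example, a minimal sketch of what that guidance looks like in practice (my own illustration):
``` go
package main

import "fmt"

type point struct{ x, y int }

func main() {
	// Initializing fields: prefer the literal.
	p1 := &point{x: 1, y: 2}

	// No initial values needed: new is fine (same as &point{}).
	p2 := new(point)

	// Avoid new followed by field assignments; switch to a literal instead.
	p3 := new(point)
	p3.x = 1
	p3.y = 2 // prefer: p3 := &point{x: 1, y: 2}

	fmt.Println(p1, p2, p3)
}
```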
| Documentation,NeedsInvestigation | low | Major |
132,163,073 | electron | [Enhancement] Applescript API | Is it possible to have an Applescript API for Mac apps made with Electron?
| enhancement :sparkles: | medium | Critical |
132,264,512 | go | runtime: _cgoCheckPointer0 extreme overhead | 1. What version of Go are you using (go version)?
go1.6rc2
2. What operating system and processor architecture are you using?
linux/amd64
3. What did you do?
Call C code with cgo.
4. What did you expect to see?
Almost the same performance with GODEBUG=cgocheck=1 and GODEBUG=cgocheck=0.
5. What did you see instead?
22 times more running time with GODEBUG=cgocheck=1 than with GODEBUG=cgocheck=0.
The overhead is 22x in [practice](https://groups.google.com/forum/#!topic/golang-nuts/ccMkPG6Bi5k) and roughly 36000x in this contrived example:
$ GODEBUG=cgocheck=1 go test -bench=. -run=-
testing: warning: no tests to run
PASS
BenchmarkCGO-4 50 28792524 ns/op
ok github.com/tgulacsi/go/cgo22 1.855s
:gthomas@waterhouse: ~/src/github.com/tgulacsi/go/cgo22
$ GODEBUG=cgocheck=0 go test -bench=. -run=-
testing: warning: no tests to run
PASS
BenchmarkCGO-4 2000000 796 ns/op
ok github.com/tgulacsi/go/cgo22 2.988s
:gthomas@waterhouse: ~/src/github.com/tgulacsi/go/cgo22
The code is at https://github.com/tgulacsi/go/tree/master/cgo22
[cgo22.zip](https://github.com/golang/go/files/122255/cgo22.zip)
| compiler/runtime | low | Critical |
132,304,763 | vscode | Support variables when resolving values in settings | Hi,
I was just reading the [latest updates](https://code.visualstudio.com/updates) and it says one can install `typescript@next` globally and then set `typescript.tsdk` so VS Code can use the appropriate version/installation. In a team environment, I'd like to put that setting in our project, something like:
`.vscode/settings.json`:
``` json
{
"typescript.tsdk": "%APPDATA%/npm/node_modules/typescript/lib"
}
```
The problem is restarting VS Code results in an error:
```
The path c:\Projects\Derp\%APPDATA%\npm\node_modules\typescript\lib doesn't point to a valid tsserver install. TypeScript language features will be disabled.
```
Now the setting needs to be per-person because I highly doubt my teammates have `tsc` installed in `C:\Users\Olson\AppData\Roaming\npm\node_modules\typescript\lib` :wink:
Could we get environment variables evaluated on that and all other settings that involve paths?
I haven't tested other paths, but I see these in the Default Settings:
- `git.path`
- `markdown.styles`
- `json.schemas`
- `typescript.tsdk`
- `php.validate.executablePath`
| feature-request,config | high | Critical |
132,331,790 | rust | Guard pages are disabled on musl. | In #30629 I [disabled guard pages on musl](https://github.com/rust-lang/rust/pull/30629/files#diff-1a00e90827d8a240091012902881fdc4R184). I did this because one of the pthread calls (I've forgotten which) was segfaulting during runtime initialization on i686-unknown-linux-musl, with my local build of musl, and I just wanted it working.
It should at least be on for x86_64 where it's known to work on the bots.
| O-musl,C-bug,A-stack-probe | low | Minor |
132,338,105 | rust | Specifying linkage on externs silently removes indirection | When compiling the following code:
``` c
// externs-c.c
unsigned char myarr[10]={1,2,3,4,5,6,7,8,9,10};
unsigned char (*implicitvar)[10]=&myarr;
unsigned char (*explicitvar)[10]=&myarr;
```
``` rust
// externs-rust.rs
#![feature(linkage)]
extern {
static implicitvar: *const [u8;10];
// Should have no effect, external linkage is the default in an extern block
#[linkage="external"]
static explicitvar: *const [u8;10];
}
fn as_option(p: *const [u8;10]) -> Option<&'static [u8;10]> {
unsafe{std::mem::transmute(p)}
}
fn main() {
println!("implicitvar = {:?}",as_option(implicitvar));
println!("explicitvar = {:?}",as_option(explicitvar));
}
```
using
```
clang -c externs-c.c && rustc externs-rust.rs -C link-args=./externs-c.o
```
running `./externs` will output something like the following:
```
implicitvar = Some([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
explicitvar = Some([168, 4, 122, 85, 85, 85, 0, 0, 0, 0])
```
Wat.
Taking a look at the IR:
``` llvm
; externs-c.ll
@myarr = global [10 x i8] c"\01\02\03\04\05\06\07\08\09\0A", align 1
@implicitvar = global [10 x i8]* @myarr, align 8
@explicitvar = global [10 x i8]* @myarr, align 8
```
``` llvm
; externs-rust.ll
@implicitvar = external global [10 x i8]*
@explicitvar = external global [10 x i8]
@_rust_extern_with_linkage_explicitvar = internal global [10 x i8]* @explicitvar
```
So, Rust removes a layer of indirection defining `static explicitvar: [u8;10]` and adding a new variable `static _rust_extern_with_linkage_explicitvar: *const [u8;10]=&explicitvar`. All mentions of `explicitvar` in Rust source code get replaced with `_rust_extern_with_linkage_explicitvar`. This results in the C version and this new Rust version **not having the same type**! To get βcorrectβ behavior in the example above, you would need to define `static explicitvar: *const *const [u8;10]` instead.
This weird asymmetry between the types associated with symbols in Rust and in C is a source of great confusion and can easily lead to bugs. In the example above, we just read 2 bytes past some pointer by interpreting it as a 10-byte array.
This weird behavior was introduced in #12556 (see also #11978), the rationale being weak linkage and the fact that some pointers can't be null in the Rust typesystem. While true, I don't think that's sufficient rationale to add this layer of indirection. I think the layer of indirection should be removed completely. For weak linkage, a restriction can be added to allow only zeroable types.
| A-linkage,A-codegen,T-compiler,C-bug,requires-nightly | low | Critical |
132,418,818 | youtube-dl | Calling ffmpeg once | It has come to my attention that youtube-dl calls ffmpeg **3 times** if we enable "--write-sub --all-subs --embed-subs --embed-thumbnail --add-metadata" when we can just safely call ffmpeg once to do all stuffs(maybe except thumbnail as it requires atomicparsley).
This is a huge waste of system resources, much more noticeable in embedded environments with limited I/O throughput.
[ffmpeg] Merging formats into "iPhone 6s Chipgate-0bAeJ5fJ1M0.mp4"
[ffmpeg] Adding metadata to 'iPhone 6s Chipgate-0bAeJ5fJ1M0.mp4'
[ffmpeg] Embedding subtitles in 'iPhone 6s Chipgate-0bAeJ5fJ1M0.mp4'
This leaves clear room for improvement.
I personally cannot write Python code, so it'd be really nice if someone jumps in and improves this.
Thanks :)
```
youtube-dl -v -k -f 137+140 --newline --merge-output-format mp4 --write-sub --all-subs --embed-subs --embed-thumbnail --add-metadata https://www.youtube.com/watch?v=0bAeJ5fJ1M0
[debug] System config: []
[debug] User config: []
[debug] Command-line args: [u'-v', u'-k', u'-f', u'137+140', u'--newline', u'--merge-output-format', u'mp4', u'--write-sub', u'--all-subs', u'--embed-subs', u'--embed-thumbnail', u'--add-metadata', u'https://www.youtube.com/watch?v=0bAeJ5fJ1M0']
[debug] Encodings: locale UTF-8, fs UTF-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2016.02.09
[debug] Python version 2.7.10 - Linux-4.1.16-zen+-x86_64-with-Ubuntu-15.10-wily
[debug] exe versions: avconv 2.7.6-0ubuntu0.15.10.1, avprobe 2.7.6-0ubuntu0.15.10.1, ffmpeg 2.7.6-0ubuntu0.15.10.1, ffprobe 2.7.6-0ubuntu0.15.10.1
[debug] Proxy map: {'no': 'localhost,127.0.0.0/8,::1'}
[youtube] 0bAeJ5fJ1M0: Downloading webpage
[youtube] 0bAeJ5fJ1M0: Downloading video info webpage
[youtube] 0bAeJ5fJ1M0: Extracting video information
[youtube] 0bAeJ5fJ1M0: Downloading MPD manifest
[youtube] 0bAeJ5fJ1M0: Downloading MPD manifest
[info] Writing video subtitles to: iPhone 6s Chipgate-0bAeJ5fJ1M0.en.vtt
[youtube] 0bAeJ5fJ1M0: Downloading thumbnail ...
[youtube] 0bAeJ5fJ1M0: Writing thumbnail to: iPhone 6s Chipgate-0bAeJ5fJ1M0.jpg
[debug] Invoking downloader on u'https://r4---sn-3u-bh2d.googlevideo.com/videoplayback?id=d1b01e2797c9d4cd&itag=137&source=youtube&requiressl=yes&mm=31&mn=sn-3u-bh2d&ms=au&mv=m&pl=19&pcm2cms=yes&ratebypass=yes&mime=video/mp4&gir=yes&clen=33738804&lmt=1444725098607526&dur=147.247&sver=3&key=dg_yt0&mt=1455023178&fexp=9405265,9416126,9417058,9417580,9418199,9420452,9422596,9423661,9423662,9424134,9425730,9426681,9426765,9427037,9427105,9427708,9428294,9428561&upn=Y5isukI8LBo&signature=85BE9BFAA1291F418C27C0F5D8C41A8CE47EDAA4.688B26D0F3051AC0DC3162960A70997B12DBAACB&ip=121.140.87.200&ipbits=0&expire=1455044917&sparams=ip,ipbits,expire,id,itag,source,requiressl,mm,mn,ms,mv,pl,pcm2cms,ratebypass,mime,gir,clen,lmt,dur'
[download] Destination: iPhone 6s Chipgate-0bAeJ5fJ1M0.f137.mp4
[download] 0.0% of 32.18MiB at 390.46KiB/s ETA 01:24
[download] 0.0% of 32.18MiB at 1.10MiB/s ETA 00:29
[download] 0.0% of 32.18MiB at 2.50MiB/s ETA 00:12
[download] 0.0% of 32.18MiB at 5.24MiB/s ETA 00:06
[download] 0.1% of 32.18MiB at 9.69MiB/s ETA 00:03
[download] 0.2% of 32.18MiB at 12.04MiB/s ETA 00:02
[download] 0.4% of 32.18MiB at 17.08MiB/s ETA 00:01
[download] 0.8% of 32.18MiB at 15.81MiB/s ETA 00:02
[download] 1.6% of 32.18MiB at 14.81MiB/s ETA 00:02
[download] 3.1% of 32.18MiB at 13.60MiB/s ETA 00:02
[download] 6.2% of 32.18MiB at 13.26MiB/s ETA 00:02
[download] 12.4% of 32.18MiB at 13.29MiB/s ETA 00:02
[download] 24.9% of 32.18MiB at 13.01MiB/s ETA 00:01
[download] 37.3% of 32.18MiB at 13.17MiB/s ETA 00:01
[download] 49.7% of 32.18MiB at 13.26MiB/s ETA 00:01
[download] 62.2% of 32.18MiB at 12.97MiB/s ETA 00:00
[download] 74.6% of 32.18MiB at 13.02MiB/s ETA 00:00
[download] 87.0% of 32.18MiB at 13.09MiB/s ETA 00:00
[download] 99.5% of 32.18MiB at 13.13MiB/s ETA 00:00
[download] 100.0% of 32.18MiB at 13.15MiB/s ETA 00:00
[download] 100% of 32.18MiB in 00:02
[debug] Invoking downloader on u'https://r4---sn-3u-bh2d.googlevideo.com/videoplayback?id=d1b01e2797c9d4cd&itag=140&source=youtube&requiressl=yes&mm=31&mn=sn-3u-bh2d&ms=au&mv=m&pl=19&pcm2cms=yes&ratebypass=yes&mime=audio/mp4&gir=yes&clen=2340459&lmt=1444724918594639&dur=147.307&sver=3&key=dg_yt0&mt=1455023178&fexp=9405265,9416126,9417058,9417580,9418199,9420452,9422596,9423661,9423662,9424134,9425730,9426681,9426765,9427037,9427105,9427708,9428294,9428561&upn=Y5isukI8LBo&signature=42CC940A196469F2B6743EFD18458A5E18739EA5.674E78475B5304A57B5170BA5E4B2800EFA9F7B8&ip=121.140.87.200&ipbits=0&expire=1455044917&sparams=ip,ipbits,expire,id,itag,source,requiressl,mm,mn,ms,mv,pl,pcm2cms,ratebypass,mime,gir,clen,lmt,dur'
[download] Destination: iPhone 6s Chipgate-0bAeJ5fJ1M0.f140.m4a
[download] 0.0% of 2.23MiB at 663.13KiB/s ETA 00:03
[download] 0.1% of 2.23MiB at 1.85MiB/s ETA 00:01
[download] 0.3% of 2.23MiB at 4.16MiB/s ETA 00:00
[download] 0.7% of 2.23MiB at 8.59MiB/s ETA 00:00
[download] 1.4% of 2.23MiB at 16.98MiB/s ETA 00:00
[download] 2.8% of 2.23MiB at 16.91MiB/s ETA 00:00
[download] 5.6% of 2.23MiB at 24.80MiB/s ETA 00:00
[download] 11.2% of 2.23MiB at 20.37MiB/s ETA 00:00
[download] 22.4% of 2.23MiB at 16.27MiB/s ETA 00:00
[download] 44.8% of 2.23MiB at 14.77MiB/s ETA 00:00
[download] 89.6% of 2.23MiB at 13.08MiB/s ETA 00:00
[download] 100.0% of 2.23MiB at 13.25MiB/s ETA 00:00
[download] 100% of 2.23MiB in 00:00
**[ffmpeg] Merging formats into "iPhone 6s Chipgate-0bAeJ5fJ1M0.mp4"**
[debug] ffmpeg command line: avconv -y -i 'file:iPhone 6s Chipgate-0bAeJ5fJ1M0.f137.mp4' -i 'file:iPhone 6s Chipgate-0bAeJ5fJ1M0.f140.m4a' -c copy -map 0:v:0 -map 1:a:0 'file:iPhone 6s Chipgate-0bAeJ5fJ1M0.temp.mp4'
**[ffmpeg] Adding metadata to 'iPhone 6s Chipgate-0bAeJ5fJ1M0.mp4'**
[debug] ffmpeg command line: avconv -y -i 'file:iPhone 6s Chipgate-0bAeJ5fJ1M0.mp4' -c copy -metadata 'comment=Does iPhone 6s Chipgate slow your phone down? https://youtu.be/pXmIQJMDv68
Subscribe! http://www.youtube.com/austinevans
The iPhone 6s actually has two different processors, the Apple A9 Samsung edition and TSMC edition. With rumors of Chipgate making a huge difference to battery life and performance I wanted to put it to the test.
iPhone 6s Samsung model number: N71AP
iPhone 6s TSMC model number: N71MAP
iPhone 6s Plus Samsung model number: N66AP
iPhone 6s Plus TSMC model number: N66MAP
Lirum Device Info: https://itunes.apple.com/us/app/lirum-device-info-lite-system/id591660734?mt=8
Chipworks Apple A9 Samsung vs TSMC: http://www.chipworks.com/about-chipworks/overview/blog/a9-is-tsmc-16nm-finfet-and-samsung-fabbed
Thanks to Vsauce 3 for letting me use the awesome β video: https://www.youtube.com/watch?v=8UqYfaxOJOg
Twitter: http://twitter.com/austinnotduncan
Instagram: http://instagram.com/austinnotduncan
Facebook: https://www.facebook.com/austinnotduncan' -metadata 'description=Does iPhone 6s Chipgate slow your phone down? https://youtu.be/pXmIQJMDv68
Subscribe! http://www.youtube.com/austinevans
The iPhone 6s actually has two different processors, the Apple A9 Samsung edition and TSMC edition. With rumors of Chipgate making a huge difference to battery life and performance I wanted to put it to the test.
iPhone 6s Samsung model number: N71AP
iPhone 6s TSMC model number: N71MAP
iPhone 6s Plus Samsung model number: N66AP
iPhone 6s Plus TSMC model number: N66MAP
Lirum Device Info: https://itunes.apple.com/us/app/lirum-device-info-lite-system/id591660734?mt=8
Chipworks Apple A9 Samsung vs TSMC: http://www.chipworks.com/about-chipworks/overview/blog/a9-is-tsmc-16nm-finfet-and-samsung-fabbed
Thanks to Vsauce 3 for letting me use the awesome β video: https://www.youtube.com/watch?v=8UqYfaxOJOg
Twitter: http://twitter.com/austinnotduncan
Instagram: http://instagram.com/austinnotduncan
Facebook: https://www.facebook.com/austinnotduncan' -metadata 'artist=Austin Evans' -metadata 'title=iPhone 6s Chipgate?' -metadata date=20151007 -metadata 'purl=https://www.youtube.com/watch?v=0bAeJ5fJ1M0' 'file:iPhone 6s Chipgate-0bAeJ5fJ1M0.temp.mp4'
**[ffmpeg] Embedding subtitles in 'iPhone 6s Chipgate-0bAeJ5fJ1M0.mp4'**
[debug] ffmpeg command line: avconv -y -i 'file:iPhone 6s Chipgate-0bAeJ5fJ1M0.mp4' -i 'file:iPhone 6s Chipgate-0bAeJ5fJ1M0.en.vtt' -map 0 -c copy -map -0:s -c:s mov_text -map 1:0 -metadata:s:s:0 language=eng 'file:iPhone 6s Chipgate-0bAeJ5fJ1M0.temp.mp4'
[atomicparsley] Adding thumbnail to "iPhone 6s Chipgate-0bAeJ5fJ1M0.mp4"
[debug] AtomicParsley command line: AtomicParsley 'iPhone 6s Chipgate-0bAeJ5fJ1M0.mp4' --artwork 'iPhone 6s Chipgate-0bAeJ5fJ1M0.jpg' -o 'iPhone 6s Chipgate-0bAeJ5fJ1M0.temp.mp4'
```
| request | low | Critical |