id | repo | title | body | labels | priority | severity
---|---|---|---|---|---|---
629,734,184 |
bitcoin
|
support BIP39 mnemonic in descriptors
|
Supporting BIP39 mnemonics in descriptors would allow less painful imports and scans of third-party wallet seeds.
I think it could be used almost identically to the existing `xpriv` key.
Example:
`pkh(bip39(room,cross,cube,glance,infant,setup,renew,more,lion,glimpse,retreat,tone,chief,vanish,brisk,destroy,destroy,bounce,knee,analyst,autumn,meadow,divorce,since)/44'/0'/0'/0/0)`
(or slightly different to support the "passphrase")
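For context, the seed derivation such a `bip39(...)` fragment would imply is just PBKDF2 over the space-separated words plus the optional passphrase, per the BIP39 spec; a hedged sketch (not a proposed implementation for the descriptor engine):
```python
import hashlib

# Hedged illustration of BIP39 seed derivation per the spec, not part of the
# descriptor proposal itself: the mnemonic plus an optional passphrase is
# stretched with PBKDF2-HMAC-SHA512 into the 64-byte seed from which the
# master xprv would then be derived.
words = ("room cross cube glance infant setup renew more lion glimpse retreat "
         "tone chief vanish brisk destroy destroy bounce knee analyst autumn "
         "meadow divorce since")
passphrase = ""  # the optional "passphrase" mentioned above
seed = hashlib.pbkdf2_hmac("sha512", words.encode("utf-8"),
                           ("mnemonic" + passphrase).encode("utf-8"),
                           2048, dklen=64)
print(seed.hex())
```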
|
Feature
|
medium
|
Major
|
629,756,031 |
pytorch
|
BCEWithLogitsLoss() not equal to BCELoss() with sigmoid()
|
Calling `BCEWithLogitsLoss()` gives different results than applying `sigmoid()` followed by `BCELoss()`. Please see the example below.
import torch
from torch import nn, tensor
import torch.nn.functional as F
pred = tensor(
[[ 1.6094e+01],
[ 5.1216e-01],
[-9.5367e+00],
[ 2.4377e+01],
[ 3.5349e+01],
[ 2.5894e+01],
[ 3.1808e+01],
[ 1.8592e-01],
[-2.8377e+00],
[-9.0343e+00],
[ 1.5208e+01],
[ 4.8932e+00],
[-5.4581e+00],
[ 1.8190e+01],
[ 1.0950e+01],
[ 1.7572e+01],
[-2.2359e+01],
[ 2.0425e+01],
[ 1.9340e+02],
[ 8.5044e-01]])
targ = tensor([[0.],
[1.],
[0.],
[0.],
[0.],
[0.],
[0.],
[0.],
[0.],
[0.],
[0.],
[0.],
[0.],
[0.],
[1.],
[0.],
[0.],
[0.],
[0.],
[0.]])
print (F.binary_cross_entropy_with_logits(pred, targ), nn.BCEWithLogitsLoss()(pred, targ), nn.BCELoss()(torch.sigmoid(pred), targ))
# Error still occurs with float64
pred, targ = pred.double(), targ.double()
print (F.binary_cross_entropy_with_logits(pred, targ), nn.BCEWithLogitsLoss()(pred, targ), nn.BCELoss()(torch.sigmoid(pred), targ))
# Error still occurs with two samples
# NB: However the error goes away with two samples with float64
pred, targ = pred[:2], targ[:2]
print (F.binary_cross_entropy_with_logits(pred, targ), nn.BCEWithLogitsLoss()(pred, targ), nn.BCELoss()(torch.sigmoid(pred), targ))
# Try with these values instead and the error gets magnified even more
pred2 = tensor(
[[ 1.6094e+01],
[ 5.1216e-01],
[-9.5367e+00],
[ 2.4377e+01],
[ 3.5349e+01],
[ 2.5894e+12],
[ 3.1808e+01],
[ 1.8592e-01],
[-2.8377e+00],
[-9.0343e+00],
[ 1.5208e+01],
[ 4.8932e+00],
[-5.4581e+00],
[ 1.8190e+01],
[ 1.0950e+07],
[ 1.7572e+06],
[-2.2359e+01],
[ 2.0425e+01],
[ 1.9340e+02],
[ 8.5044e-01]])
PyTorch - 1.5.0+cu101
Python - 3.6.9
Results are the same with CPU and GPU
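The gap comes from the numerically stable formulation the fused loss can use; here is a hedged sketch of the two formulations side by side (not PyTorch's internal implementation):
```python
import torch

# A minimal sketch (not PyTorch's internal code) of why the fused loss is
# more stable: binary_cross_entropy_with_logits can use the form
#   max(x, 0) - x*z + log(1 + exp(-|x|))
# which never evaluates log(0), while sigmoid() followed by a manual BCE
# saturates to exactly 0 or 1 in float32 for large |x| and produces inf.
x = torch.tensor([16.094, 0.51216, -9.5367, 193.40])  # a few logits from the example above
z = torch.tensor([0.0, 1.0, 0.0, 0.0])                # matching targets

stable = x.clamp(min=0) - x * z + torch.log1p(torch.exp(-x.abs()))
naive = -(z * torch.log(torch.sigmoid(x)) + (1 - z) * torch.log(1 - torch.sigmoid(x)))

print(stable)  # finite for every element
print(naive)   # inf where sigmoid(x) rounds to exactly 1.0 (x = 193.4)
```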
|
module: loss,triaged
|
low
|
Critical
|
629,803,192 |
pytorch
|
error when specifying sparse=True in embedding
|
## 🐛 Bug
<!-- A clear and concise description of what the bug is. -->
Neural network training works without sparse=True, but does not work with sparse=True.
## To Reproduce
https://colab.research.google.com/drive/1m1xhqu9s_4YdZoDjkTTNu2P58O0Cwv7J?usp=sharing
Steps to reproduce the behavior:
1. just run all
It gives this error:
```
RuntimeError: set_indices_and_values_unsafe is not allowed on a Tensor created from .data or .detach().
If your intent is to change the metadata of a Tensor (such as sizes / strides / storage / storage_offset)
without autograd tracking the change, remove the .data / .detach() call and wrap the change in a `with torch.no_grad():` block.
For example, change:
x.data.set_(y)
to:
with torch.no_grad():
x.set_(y)
```
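For reference, the configuration involved is an `nn.Embedding` created with `sparse=True` and trained with a sparse-aware optimizer; a hedged minimal sketch (not the linked notebook):
```python
import torch
from torch import nn

# Hedged minimal sketch, not the notebook from the report: an embedding
# created with sparse=True produces sparse gradients for its weight; this is
# the sparse=True configuration the report says fails.
emb = nn.Embedding(1000, 16, sparse=True)
opt = torch.optim.SparseAdam(emb.parameters(), lr=0.01)

idx = torch.randint(0, 1000, (32,))
loss = emb(idx).sum()
loss.backward()   # emb.weight.grad is a sparse tensor here
opt.step()
```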
<!-- If you have a code sample, error messages, stack traces, please provide it here as well -->
## Expected behavior
<!-- A clear and concise description of what you expected to happen. -->
It should work with sparse=True.
cc @vincentqb
|
module: sparse,triaged
|
low
|
Critical
|
629,811,028 |
flutter
|
[Feature Request] [Navigator] Expose current route(s)
|
## Use case
I have a chat in my application with notifications.
When the app receives a notification from a specific user while it's open and that conversation is already showing, the notification should not be displayed.
To know when that happens, I need to get the current route and see which one it is.
## Current Solutions Status
I know `ModalRoute.of(context)` can get you some information on the route, but it will only return the route that is part of the context.
## Current Workaround
Currently the only workaround I have found is to create a minimal mimic of the navigator and maintain (another) history on my own.
That is, when I push a new route I add it to a list, and when it is popped I remove it.
While this works, that info is already kept by the `Navigator` - one just can't access it.
## Proposal
Add a property to `NavigatorState`:
- `List<Route> get history`: Returns a list of all routes that are "alive".
- `Route get currentRoute`: Returns the current route (the one for which `Route.isCurrent` is true).
|
c: new feature,framework,f: routes,c: proposal,P3,team-framework,triaged-framework
|
low
|
Major
|
629,854,466 |
rust
|
ffi-safe lint should be more aggressive in checking ffi stability of sub-types
|
According to [this answer](https://internals.rust-lang.org/t/question-about-c-abi-stability/12449/4?u=sharazam), this code:
```rust
pub struct A { a: usize, b: usize }
#[repr(C)]
pub struct B {
a: A
}
```
... should emit a warning that `struct A` is still `#[repr(Rust)]`, so the layout of `B` isn't completely defined; specifically, the order of the fields `a` and `b` is not specified under `#[repr(Rust)]`. I'm not sure whether the `ffi-safe` lint is active or not, but it would be very useful, so that I know that if a type is marked `#[repr(C)]`, all of its sub-types are also marked `#[repr(C)]`.
Another question: when compiling to a `cdylib`, why does cargo use `#[repr(Rust)]` for publicly visible structs of a crate? It would be nice if, when compiling to a `cdylib` with this code:
```rust
pub fn foo(b: B) -> A { b.a }
```
... there could be automatic warnings:
- `foo` is a public function, should be marked as `#[no_mangle]`
- `B` and `A` should be marked as `#[repr(C)]`
... because the only representation that you usually want when compiling to a `cdylib` is `#[repr(C)]` (or `#[repr(C, u8)]` for enums).
|
C-enhancement,A-lints,A-FFI
|
low
|
Minor
|
629,883,021 |
TypeScript
|
Add quick fix to convert type-only import into normal import
|
## Search Terms
<!-- List of keywords you searched for before creating this issue. Write them down here so that others can find this suggestion more easily -->
convert type import
## Suggestion
<!-- A summary of what you'd like to see added or changed -->
I'd like to have a quick fix to convert a type-only import into a normal import if the imported type is used in a value position.
## Use Cases
<!--
What do you want to use this for?
What shortcomings exist with current approaches?
-->
Now that #36400 is supported, TypeScript auto-creates type-only imports for us. However, if we later decide to use the imported type in a value position, the workflow described in #36400 is still awkward. By adding a quick fix, TypeScript could make this smoother for the developer.
## Examples
```ts
import type { Foo } from "./foo";
// ... 1000 lines below
const something = Foo.staticMethod();
// ~~~ Error: Foo cannot be used as a value
```
When placing the cursor on `Foo` here, TypeScript should offer converting the type-only import into a normal import.
## Checklist
My suggestion meets these guidelines:
* [x] This wouldn't be a breaking change in existing TypeScript/JavaScript code
* [x] This wouldn't change the runtime behavior of existing JavaScript code
* [x] This could be implemented without emitting different JS based on the types of the expressions
* [x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, etc.)
* [x] This feature would agree with the rest of [TypeScript's Design Goals](https://github.com/Microsoft/TypeScript/wiki/TypeScript-Design-Goals).
|
Suggestion,Awaiting More Feedback
|
low
|
Critical
|
629,887,686 |
flutter
|
Coloured Android navigation bar not working with Launch Screen
|
## Steps to Reproduce
I have a styles.xml file where I am trying to change the colour of the Android navigation bar (the system one with three buttons) on the launch screen. The code is:
```
<?xml version="1.0" encoding="utf-8"?>
<resources>
<!-- Theme applied to the Android Window while the process is starting -->
<style name="LaunchTheme" parent="@android:style/Theme.Black.NoTitleBar">
<!-- Show a splash screen on the activity. Automatically removed when
Flutter draws its first frame -->
<item name="android:windowBackground">@drawable/launch_background</item>
<item name="android:navigationBarColor">@android:color/black</item>
</style>
<!-- Theme applied to the Android Window as soon as the process has started.
This theme determines the color of the Android Window while your
Flutter UI initializes, as well as behind your Flutter UI while its
running.
This Theme is only used starting with V2 of Flutter's Android embedding. -->
<style name="NormalTheme" parent="@android:style/Theme.Black.NoTitleBar">
<item name="android:windowBackground">@drawable/launch_background</item>
<item name="android:navigationBarColor">@android:color/black</item>
</style>
</resources>
```
The line `<item name="android:navigationBarColor">@android:color/black</item>` works for the "NormalTheme", changing the bar to black. However, it seems to be ignored for the "LaunchTheme", and I can't work out why. The [documentation](https://developer.android.com/reference/android/view/Window#setNavigationBarColor(int)) says that the [FLAG_DRAWS_SYSTEM_BAR_BACKGROUNDS](https://developer.android.com/reference/android/view/WindowManager.LayoutParams#FLAG_DRAWS_SYSTEM_BAR_BACKGROUNDS) flag has to be set in order for this property to work. Could it be that this flag is not set for the Flutter launch?
**Expected results:**
I want to see a differently coloured Android navigation bar when launching the app
**Actual results:**
I saw a white Android navigation bar when launching the app, which transitioned to a black one once the secondary launch screen (the one shown while Flutter is initialising) appeared.
|
c: new feature,platform-android,engine,P3,team-android,triaged-android
|
low
|
Major
|
629,902,190 |
rust
|
Misleading compilation error message in desugaring of closure with mutable reference
|
I tried this code:
```rust
struct A(i32);
struct B {
arr: Vec<A>
}
impl B {
fn using_iterator(&mut self) -> Vec<&A> {
(0..2).map(|i| &self.arr[i]).collect()
}
}
```
I expected to see this happen:
I understand that, because of the mutable reference and how closures are desugared, the code is expected not to compile, but the error message seems misleading.
I expected to receive a compilation error message explaining the multiple mutable borrows.
(discussion at https://users.rust-lang.org/t/lifetimes-in-closures-with-captured-mutable-reference/43721)
Instead, this happened:
I received a misleading error message:
```
9 | (0..2).map(|i| &self.arr[i]).collect()
| ^^^ ---- `self` is borrowed here
| |
| may outlive borrowed value `self`
```
https://play.rust-lang.org/?version=nightly&mode=debug&edition=2018&gist=e538deb64a5702e8e13cb5dbf106b130
### Meta
Tried on both stable and nightly
`rustc --version --verbose`:
```
rustc 1.43.0 (4fb7144ed 2020-04-20)
binary: rustc
commit-hash: 4fb7144ed159f94491249e86d5bbd033b5d60550
commit-date: 2020-04-20
host: x86_64-apple-darwin
release: 1.43.0
LLVM version: 9.0
```
```
rustc 1.45.0-nightly (fe10f1a49 2020-06-02)
binary: rustc
commit-hash: fe10f1a49f5ca46e57261b95f46f519523f418fe
commit-date: 2020-06-02
host: x86_64-apple-darwin
release: 1.45.0-nightly
LLVM version: 10.0
```
Backtrace:
|
C-enhancement,A-diagnostics,A-closures,T-compiler,D-confusing
|
low
|
Critical
|
629,941,309 |
pytorch
|
PyTorch multiprocessing.spawn seems slow with list of tensors
|
## 🐛 Bug
When I use torch.multiprocessing.spawn in distributed GPU training (on a single machine), I observe much slower training times than starting the processes independently.
This is even more true when my Dataset contains a list of tensors. It seems that shared memory creates a file descriptor for each entry in the list, and all these file descriptors make the training very slow.
In particular, I need to increase the ulimit: `ulimit -n 9999`.
When training exactly the same data and model, but using a tensor of tensors vs a list of tensors, the list of tensors takes around 4 times as long (!) to train.
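A related knob, mentioned here only for context and not as part of this report, is the tensor sharing strategy, which controls exactly this per-tensor file handling:
```python
import torch.multiprocessing as mp

# Hedged aside, not from the report: PyTorch can share tensors between
# processes either through per-tensor file descriptors (the default on
# Linux) or through files on disk; the latter avoids exhausting the
# ulimit -n limit when a Dataset holds many small tensors.
mp.set_sharing_strategy("file_system")
print(mp.get_sharing_strategy())  # -> 'file_system'
```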
## To Reproduce
I created a repository with a minimal example (~100 lines of code including everything needed): https://github.com/mpaepper/pytorch_multiprocessing_list_of_tensors
When the `--use_spawn` flag is passed, it uses multiprocessing.spawn; otherwise you start each process manually with `--rank`:
```
python custom.py --use_spawn # Training time: 17 seconds
python custom.py --use_spawn --use_lists # Training time: 72 seconds (!)
# Instead of using spawn, start each process independently:
python custom.py --rank 0 # Training time: 14 seconds
python custom.py --rank 1
python custom.py --rank 0 --use_lists # Training time: 14 seconds
python custom.py --rank 1 --use_lists
```
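For context, the `--use_spawn` path corresponds to the standard `torch.multiprocessing.spawn` entry point; a hedged sketch of that shape (not the script in the linked repository):
```python
import torch.multiprocessing as mp

def worker(rank, world_size):
    # In the real script each worker would init its process group, build a
    # DataLoader over the shared Dataset and train; here it only reports in.
    print(f"worker {rank} of {world_size} started")

if __name__ == "__main__":
    # fn is called as fn(rank, *args) in each of the nprocs processes
    mp.spawn(worker, args=(2,), nprocs=2, join=True)
```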
Basically this is the main change in the Dataset:
```
if use_lists:
    self.list = []
    for i in range(self.length):
        self.list.append(torch.rand((3, 224, 224)))
else:
    self.list = torch.rand((self.length, 3, 224, 224))
```
## Expected behavior
I would expect to have `python custom.py --use_spawn` and `python custom.py --use_spawn --use_lists` run in the same amount of time, i.e. just having a list of tensors shouldn't completely slow down my training.
## Environment
Collecting environment information...
PyTorch version: 1.4.0
Is debug build: No
CUDA used to build PyTorch: 10.1
OS: Ubuntu 18.04.4 LTS
GCC version: (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0
CMake version: Could not collect
Python version: 3.7
Is CUDA available: Yes
CUDA runtime version: Could not collect
GPU models and configuration:
GPU 0: GeForce RTX 2080 Ti
GPU 1: GeForce RTX 2080 Ti
GPU 2: GeForce RTX 2080 Ti
GPU 3: GeForce RTX 2080 Ti
Nvidia driver version: 440.33.01
cuDNN version: Could not collect
Versions of relevant libraries:
[pip] numpy==1.18.1
[pip] numpydoc==0.9.1
[pip] pytorch-lightning==0.7.5
[pip] torch==1.4.0
[pip] torch-lr-finder==0.1.2
[pip] torchbearer==0.5.3
[pip] torchfile==0.1.0
[pip] torchvision==0.4.2
[conda] _pytorch_select 0.1 cpu_0
[conda] blas 1.0 mkl
[conda] cudatoolkit 10.2.89 hfd86e86_0
[conda] mkl 2019.4 243
[conda] mkl-service 2.3.0 py37he904b0f_0
[conda] mkl_fft 1.0.14 py37ha843d7b_0
[conda] mkl_random 1.1.0 py37hd6b4f25_0
[conda] numpy 1.18.1 pypi_0 pypi
[conda] numpy-base 1.17.2 py37hde5b4d6_0
[conda] numpydoc 0.9.1 py_0
[conda] pytorch 1.3.1 cpu_py37h62f834f_0
[conda] pytorch-lightning 0.7.5 pypi_0 pypi
[conda] torch 1.4.0 pypi_0 pypi
[conda] torch-lr-finder 0.1.2 pypi_0 pypi
[conda] torchbearer 0.5.3 pypi_0 pypi
[conda] torchfile 0.1.0 pypi_0 pypi
[conda] torchvision 0.4.2 cpu_py37h9ec355b_0
cc @VitalyFedyunin @ngimel
|
module: performance,module: multiprocessing,triaged
|
low
|
Critical
|
629,944,984 |
flutter
|
The 'Pods-Runner' target has transitive dependencies that include statically linked binaries: (/Users/tossdown/Documents/FLUTTER/Projects/shezan/ios/Flutter/Flutter.framework)
|
```
- Running pre install hooks
[!] The 'Pods-Runner' target has transitive dependencies that include statically linked binaries: (/Users/tossdown/Documents/FLUTTER/Projects/shezan/ios/Flutter/Flutter.framework)
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer/xcode/target_validator.rb:84:in `block (2 levels) in verify_no_static_framework_transitive_dependencies'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer/xcode/target_validator.rb:74:in `each_key'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer/xcode/target_validator.rb:74:in `block in verify_no_static_framework_transitive_dependencies'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer/xcode/target_validator.rb:73:in `each'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer/xcode/target_validator.rb:73:in `verify_no_static_framework_transitive_dependencies'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer/xcode/target_validator.rb:38:in `validate!'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer.rb:590:in `validate_targets'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer.rb:158:in `install!'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/command/install.rb:52:in `run'
/Library/Ruby/Gems/2.6.0/gems/claide-1.0.3/lib/claide/command.rb:334:in `run'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/command.rb:52:in `run'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/bin/pod:55:in `<top (required)>'
/usr/local/bin/pod:23:in `load'
/usr/local/bin/pod:23:in `<main>'
Error running pod install
Error launching application on Iphone X.
```
<details>
<summary>complete log</summary>
```
Launching lib/main.dart on Iphone X in debug mode...
Running pod install... 13.1s
CocoaPods' output:
Preparing
Analyzing dependencies
Inspecting targets to integrate
Using `ARCHS` setting to build architectures of target `Pods-Runner`: (``)
Fetching external sources
-> Fetching podspec for `Flutter` from `Flutter`
-> Fetching podspec for `barcode_scan` from `.symlinks/plugins/barcode_scan/ios`
-> Fetching podspec for `connectivity` from `.symlinks/plugins/connectivity/ios`
-> Fetching podspec for `connectivity_macos` from `.symlinks/plugins/connectivity_macos/ios`
-> Fetching podspec for `firebase_auth` from `.symlinks/plugins/firebase_auth/ios`
-> Fetching podspec for `firebase_auth_web` from `.symlinks/plugins/firebase_auth_web/ios`
-> Fetching podspec for `firebase_core` from `.symlinks/plugins/firebase_core/ios`
-> Fetching podspec for `firebase_core_web` from `.symlinks/plugins/firebase_core_web/ios`
-> Fetching podspec for `firebase_messaging` from `.symlinks/plugins/firebase_messaging/ios`
-> Fetching podspec for `flutter_facebook_login` from `.symlinks/plugins/flutter_facebook_login/ios`
-> Fetching podspec for `flutter_plugin_android_lifecycle` from `.symlinks/plugins/flutter_plugin_android_lifecycle/ios`
-> Fetching podspec for `flutter_ringtone_player` from `.symlinks/plugins/flutter_ringtone_player/ios`
-> Fetching podspec for `geolocator` from `.symlinks/plugins/geolocator/ios`
-> Fetching podspec for `google_api_availability` from `.symlinks/plugins/google_api_availability/ios`
-> Fetching podspec for `google_maps_flutter` from `.symlinks/plugins/google_maps_flutter/ios`
-> Fetching podspec for `google_sign_in` from `.symlinks/plugins/google_sign_in/ios`
-> Fetching podspec for `google_sign_in_web` from `.symlinks/plugins/google_sign_in_web/ios`
-> Fetching podspec for `location` from `.symlinks/plugins/location/ios`
-> Fetching podspec for `location_permissions` from `.symlinks/plugins/location_permissions/ios`
-> Fetching podspec for `open_appstore` from `.symlinks/plugins/open_appstore/ios`
-> Fetching podspec for `package_info` from `.symlinks/plugins/package_info/ios`
-> Fetching podspec for `path_provider` from `.symlinks/plugins/path_provider/ios`
-> Fetching podspec for `shared_preferences` from `.symlinks/plugins/shared_preferences/ios`
-> Fetching podspec for `shared_preferences_macos` from `.symlinks/plugins/shared_preferences_macos/ios`
-> Fetching podspec for `shared_preferences_web` from `.symlinks/plugins/shared_preferences_web/ios`
-> Fetching podspec for `sqflite` from `.symlinks/plugins/sqflite/ios`
-> Fetching podspec for `webview_flutter` from `.symlinks/plugins/webview_flutter/ios`
Resolving dependencies of `Podfile`
CDN: trunk Relative path: CocoaPods-version.yml exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_0_2_a.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/5.0.11/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_c_7_9.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/7/9/Reachability/3.2/Reachability.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_0_3_5.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.26.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.3.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.4.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.5.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.6.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.7.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.8.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.8.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.9.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.10.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.11.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.12.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.13.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.14.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.15.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.16.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.17.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.18.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.19.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.20.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.21.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.22.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.23.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.24.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.25.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.26.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.0.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.0.5/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.0.7/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.0.9/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.3/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.4/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.5/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.6/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.7/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.9/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.10/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.11/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.12/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.2.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.2.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.2.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.2.3/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.0.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.0.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.0.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.0.3/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.1.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.1.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.1.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.2.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.2.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.2.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.3.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.3.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.3.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.3.3/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.4.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.4.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.4.1.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.4.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.4.3/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.5.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.5.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.2.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.2.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.3.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.4.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.5.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.5.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.5.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.6.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.7.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.7.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.8.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.9.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.10.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.11.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.11.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.12.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.13.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.14.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.15.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.16.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.17.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.0.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.0.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.0.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.0.3/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.0.4/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.1.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.1.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.2.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.3.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.4.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.5.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.6.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.7.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.8.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.8.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.9.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.10.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.10.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.11.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.12.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.13.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.0.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.0.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.1.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.2.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.3.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.4.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.4.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.5.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.6.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.7.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.8.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.8.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.9.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.10.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.11.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.12.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.13.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.14.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.15.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.16.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.17.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.18.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.19.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.20.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.20.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.20.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.0.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.1.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.2.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.3.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.4.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.5.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.6.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.7.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.8.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.8.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.9.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.10.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.11.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.12.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.13.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.14.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.15.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.16.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.17.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.18.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.19.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.20.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.21.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.22.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.23.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.24.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.25.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.26.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.0.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.0.5/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.0.7/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.0.9/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.3/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.4/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.5/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.6/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.7/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.9/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.10/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.11/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.1.12/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.2.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.2.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.2.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/1.2.3/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.0.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.0.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.0.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.0.3/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.1.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.1.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.1.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.2.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.2.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.2.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.3.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.3.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.3.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.3.3/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.4.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.4.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.4.1.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.4.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.4.3/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.5.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/2.5.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.2.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.2.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.3.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.4.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.5.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.5.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.5.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.6.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.7.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.7.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.8.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.9.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.10.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.11.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.11.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.12.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.13.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.14.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.15.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.16.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/3.17.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.0.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.0.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.0.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.0.3/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.0.4/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.1.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.1.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.2.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.3.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.4.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.5.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.6.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.7.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.8.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.8.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.9.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.10.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.10.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.11.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.12.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/4.13.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.0.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.0.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.1.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.2.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.3.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.4.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.4.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.5.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.6.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.7.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.8.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.8.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.9.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.10.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.11.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.12.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.13.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.14.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.15.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.16.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.17.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.18.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.19.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.20.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.20.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/5.20.2/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.0.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.1.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.2.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.3.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.4.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.5.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.6.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.7.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.8.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.8.1/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.9.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.10.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.11.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.12.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.13.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.14.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.15.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.16.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.17.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.18.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.19.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.20.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.21.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.22.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.23.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.24.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.25.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.26.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_9_b_5.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/7.0.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_b_3_c.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/7.0.0/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_a_d_d.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.9.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_d_4_0.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/d/4/0/GoogleSignIn/5.0.2/GoogleSignIn.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_f_4_e.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/f/4/e/FMDB/2.7.5/FMDB.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/f/4/e/FMDB/2.7.5/FMDB.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/f/4/e/FMDB/2.7.2/FMDB.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/f/4/e/FMDB/2.7.5/FMDB.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/d/4/0/GoogleSignIn/5.0.2/GoogleSignIn.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/d/4/0/GoogleSignIn/5.0.1/GoogleSignIn.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/d/4/0/GoogleSignIn/5.0.0/GoogleSignIn.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_b_b_9.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.4.0/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_2_c_c.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/2/c/c/GTMAppAuth/1.0.0/GTMAppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_c_e_3.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.4.0/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.0/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.1/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.2/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.3/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.4/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.5/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.6/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.7/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.8/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.9/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.10/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.11/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.12/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.13/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.14/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.15/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.2.0/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.2.1/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.2.2/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.3.0/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.3.1/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.4.0/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/2/c/c/GTMAppAuth/1.0.0/GTMAppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.0.0/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.1.0/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.2.0/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.3.0/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.3.1/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.4.0/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.4.0/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.3.1/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.3.0/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.2.0/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.4.0/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/b/9/AppAuth/1.4.0/AppAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/7/9/Reachability/3.2/Reachability.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/7/9/Reachability/3.1.1/Reachability.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/7/9/Reachability/3.1.0/Reachability.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/7/9/Reachability/3.0.0/Reachability.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.15.1/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.15.0/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.14.0/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.13.1/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.13.0/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.12.0/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.11.1/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.11.0/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.10.1/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.10.0/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.9.0/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.8.0/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.7.0/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.6.0/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.5.0/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/3/c/FBSDKLoginKit/5.15.1/FBSDKLoginKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.15.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.15.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.14.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.13.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.13.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.12.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.11.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.11.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.10.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.10.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.9.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.8.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.7.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.6.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.5.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.15.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.15.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.15.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.15.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.14.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.13.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.13.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.12.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.11.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.11.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.10.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.10.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.9.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.8.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.7.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.6.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.5.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.4.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.4.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.3.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.2.3/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.2.2/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.2.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.2.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.1.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.1.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.0.2/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.0.1/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/9/b/5/FBSDKCoreKit/5.0.0/FBSDKCoreKit.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.4.0/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.3.1/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.3.0/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.2.2/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.2.1/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.2.0/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.15/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.14/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.13/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.12/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.11/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.10/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.9/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.8/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.7/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.6/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.5/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.4/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.3/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.2/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.1/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.1.0/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.4.0/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/c/e/3/GTMSessionFetcher/1.4.0/GTMSessionFetcher.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/3/5/Firebase/6.26.0/Firebase.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_6_3_6.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/6/3/6/FirebaseAuth/6.5.3/FirebaseAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_8_b_d.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.7.2/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/6/3/6/FirebaseAuth/6.5.3/FirebaseAuth.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_4_2_7.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/4/2/7/FirebaseAuthInterop/1.1.0/FirebaseAuthInterop.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_0_8_4.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.7.2/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_8_9_c.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/9/c/FirebaseCoreDiagnosticsInterop/1.2.0/FirebaseCoreDiagnosticsInterop.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_8_3_c.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/3/c/FirebaseCoreDiagnostics/1.3.0/FirebaseCoreDiagnostics.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.7.2/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.7.1/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.7.0/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.6.7/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.6.6/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.6.5/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.6.4/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.6.3/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.6.2/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.6.1/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.6.0/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/3/c/FirebaseCoreDiagnostics/1.3.0/FirebaseCoreDiagnostics.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_b_c_f.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/c/f/GoogleDataTransportCCTSupport/3.1.0/GoogleDataTransportCCTSupport.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_6_1_e.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/6/1/e/nanopb/1.30905.0/nanopb.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/9/c/FirebaseCoreDiagnosticsInterop/1.2.0/FirebaseCoreDiagnosticsInterop.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/b/c/f/GoogleDataTransportCCTSupport/3.1.0/GoogleDataTransportCCTSupport.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_0_6_a.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/6/a/GoogleDataTransport/6.2.1/GoogleDataTransport.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/6/1/e/nanopb/1.30905.0/nanopb.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/6/1/e/nanopb/1.30905.0/nanopb.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/6/1/e/nanopb/1.30905.0/nanopb.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/4/2/7/FirebaseAuthInterop/1.1.0/FirebaseAuthInterop.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/4/2/7/FirebaseAuthInterop/1.0.0/FirebaseAuthInterop.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/6/a/GoogleDataTransport/6.2.1/GoogleDataTransport.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/6/a/GoogleDataTransport/6.2.0/GoogleDataTransport.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/6/a/GoogleDataTransport/6.1.1/GoogleDataTransport.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/6/a/GoogleDataTransport/6.1.0/GoogleDataTransport.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_0_b_5.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/b/5/PromisesObjC/1.2.8/PromisesObjC.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/b/5/PromisesObjC/1.2.8/PromisesObjC.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/b/5/PromisesObjC/1.2.7/PromisesObjC.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/b/5/PromisesObjC/1.2.6/PromisesObjC.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/b/5/PromisesObjC/1.2.5/PromisesObjC.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/b/5/PromisesObjC/1.2.4/PromisesObjC.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/b/5/PromisesObjC/1.2.3/PromisesObjC.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/b/5/PromisesObjC/1.2.2/PromisesObjC.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/b/5/PromisesObjC/1.2.1/PromisesObjC.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/b/5/PromisesObjC/1.2/PromisesObjC.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.9.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.8.2/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.8.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.7.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.6.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.5.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.4.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.3.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.2.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.1.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.0.3/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.0.2/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.0.1/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.0.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/2.7.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/2.6.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/2.5.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/2.4.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/2.3.1/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/2.3.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/2.2.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/2.1.1/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/2.1.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/2.0.1/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/2.0.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.13.2/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.13.1/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.13.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.12.3/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.12.2/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.12.1/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.12.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.11.1/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.11.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.10.5/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.10.4/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.10.3/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.10.2/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.10.1/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.10.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/1.9.2/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.9.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/a/d/d/GoogleMaps/3.9.0/GoogleMaps.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/5.0.11/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/5.0.10/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/5.0.9/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/5.0.8/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/5.0.7/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/5.0.6/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/5.0.5/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/5.0.3/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/5.0.2/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/5.0.1/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/5.0.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/4.0.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/3.1.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/3.0.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/2.1.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/2.0.3/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/2.0.2/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/2.0.1/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.9.1/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.9.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.8.11/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.8.10/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.8.9/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.8.8/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.8.7/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.8.6/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.8.5/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.8.4/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.8.3/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.8.1/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.8.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.7.1/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.7.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.6.1/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.6.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.5.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.4.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.3.2/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.3.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.2.0/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/1.1.18/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/0.1.8/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/0.1.7/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/0.1.6/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/0.1.5/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/0.1.4/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/0.1.3/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/0.1.2/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/2/a/MTBBarcodeScanner/0.1.1/MTBBarcodeScanner.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_2_d_6.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/2/d/6/FirebaseMessaging/4.4.1/FirebaseMessaging.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/2/d/6/FirebaseMessaging/4.4.1/FirebaseMessaging.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_6_f_9.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/6/f/9/FirebaseAnalyticsInterop/1.5.0/FirebaseAnalyticsInterop.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_3_6_0.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/3/6/0/FirebaseInstanceID/4.3.4/FirebaseInstanceID.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_e_c_d.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/c/d/Protobuf/3.12.0/Protobuf.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/6/f/9/FirebaseAnalyticsInterop/1.5.0/FirebaseAnalyticsInterop.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/3/6/0/FirebaseInstanceID/4.3.4/FirebaseInstanceID.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/3/6/0/FirebaseInstanceID/4.3.3/FirebaseInstanceID.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/3/6/0/FirebaseInstanceID/4.3.2/FirebaseInstanceID.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/3/6/0/FirebaseInstanceID/4.3.1/FirebaseInstanceID.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/3/6/0/FirebaseInstanceID/4.3.0/FirebaseInstanceID.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_2_f_7.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/2/f/7/FirebaseInstallations/1.3.0/FirebaseInstallations.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/2/f/7/FirebaseInstallations/1.3.0/FirebaseInstallations.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/2/f/7/FirebaseInstallations/1.2.0/FirebaseInstallations.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/2/f/7/FirebaseInstallations/1.1.1/FirebaseInstallations.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/2/f/7/FirebaseInstallations/1.1.0/FirebaseInstallations.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/2/f/7/FirebaseInstallations/1.0.0/FirebaseInstallations.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/c/d/Protobuf/3.12.0/Protobuf.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/c/d/Protobuf/3.11.4/Protobuf.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/c/d/Protobuf/3.11.3/Protobuf.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/c/d/Protobuf/3.11.2/Protobuf.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/c/d/Protobuf/3.11.1/Protobuf.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/c/d/Protobuf/3.11.0/Protobuf.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/c/d/Protobuf/3.11.0-rc2/Protobuf.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/c/d/Protobuf/3.10.0/Protobuf.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/c/d/Protobuf/3.10.0-rc1/Protobuf.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/c/d/Protobuf/3.9.2/Protobuf.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_e_2_1.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/2/1/FirebaseAnalytics/6.6.0/FirebaseAnalytics.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/2/1/FirebaseAnalytics/6.6.0/FirebaseAnalytics.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: all_pods_versions_e_3_b.txt exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/3/b/GoogleAppMeasurement/6.6.0/GoogleAppMeasurement.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.0.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.1.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.3/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.4/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.5/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.3.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.3.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.3.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.4.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.0.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.1.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.3/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.4/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.5/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.3.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.3.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.3.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.4.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.0.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.1.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.3/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.4/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.5/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.3.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.3.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.3.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.4.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.0.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.1.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.3/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.4/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.2.5/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.3.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.3.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.3.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.4.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.1/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.5.2/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/0/8/4/GoogleUtilities/6.6.0/GoogleUtilities.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/2/f/7/FirebaseInstallations/1.3.0/FirebaseInstallations.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.7.2/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.7.1/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/8/b/d/FirebaseCore/6.7.0/FirebaseCore.podspec.json exists! Returning local because checking is only perfomed in repo update
CDN: trunk Relative path: Specs/e/3/b/GoogleAppMeasurement/6.6.0/GoogleAppMeasurement.podspec.json exists! Returning local because checking is only perfomed in repo update
Comparing resolved specification to the sandbox manifest
A AppAuth
A FBSDKCoreKit
A FBSDKLoginKit
A FMDB
A Firebase
A FirebaseAnalytics
A FirebaseAnalyticsInterop
A FirebaseAuth
A FirebaseAuthInterop
A FirebaseCore
A FirebaseCoreDiagnostics
A FirebaseCoreDiagnosticsInterop
A FirebaseInstallations
A FirebaseInstanceID
A FirebaseMessaging
A Flutter
A GTMAppAuth
A GTMSessionFetcher
A GoogleAppMeasurement
A GoogleDataTransport
A GoogleDataTransportCCTSupport
A GoogleMaps
A GoogleSignIn
A GoogleUtilities
A MTBBarcodeScanner
A PromisesObjC
A Protobuf
A Reachability
A barcode_scan
A connectivity
A connectivity_macos
A firebase_auth
A firebase_auth_web
A firebase_core
A firebase_core_web
A firebase_messaging
A flutter_facebook_login
A flutter_plugin_android_lifecycle
A flutter_ringtone_player
A geolocator
A google_api_availability
A google_maps_flutter
A google_sign_in
A google_sign_in_web
A location
A location_permissions
A nanopb
A open_appstore
A package_info
A path_provider
A shared_preferences
A shared_preferences_macos
A shared_preferences_web
A sqflite
A webview_flutter
Downloading dependencies
-> Installing AppAuth (1.4.0)
> Copying AppAuth from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/AppAuth/1.4.0-31bce` to `Pods/AppAuth`
-> Installing FBSDKCoreKit (5.15.1)
> Copying FBSDKCoreKit from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FBSDKCoreKit/5.15.1-1d5ac` to `Pods/FBSDKCoreKit`
-> Installing FBSDKLoginKit (5.15.1)
> Copying FBSDKLoginKit from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FBSDKLoginKit/5.15.1-f1ea8` to `Pods/FBSDKLoginKit`
-> Installing FMDB (2.7.5)
> Copying FMDB from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FMDB/2.7.5-2ce00` to `Pods/FMDB`
-> Installing Firebase (6.26.0)
> Copying Firebase from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/Firebase/6.26.0-7cf5f` to `Pods/Firebase`
-> Installing FirebaseAnalytics (6.6.0)
> Copying FirebaseAnalytics from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FirebaseAnalytics/6.6.0-96634` to `Pods/FirebaseAnalytics`
-> Installing FirebaseAnalyticsInterop (1.5.0)
> Copying FirebaseAnalyticsInterop from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FirebaseAnalyticsInterop/1.5.0-3f862` to `Pods/FirebaseAnalyticsInterop`
-> Installing FirebaseAuth (6.5.3)
> Copying FirebaseAuth from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FirebaseAuth/6.5.3-7047a` to `Pods/FirebaseAuth`
-> Installing FirebaseAuthInterop (1.1.0)
> Copying FirebaseAuthInterop from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FirebaseAuthInterop/1.1.0-a0f37` to `Pods/FirebaseAuthInterop`
-> Installing FirebaseCore (6.7.2)
> Copying FirebaseCore from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FirebaseCore/6.7.2-f42e5` to `Pods/FirebaseCore`
-> Installing FirebaseCoreDiagnostics (1.3.0)
> Copying FirebaseCoreDiagnostics from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FirebaseCoreDiagnostics/1.3.0-4a773` to `Pods/FirebaseCoreDiagnostics`
-> Installing FirebaseCoreDiagnosticsInterop (1.2.0)
> Copying FirebaseCoreDiagnosticsInterop from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FirebaseCoreDiagnosticsInterop/1.2.0-296e2` to `Pods/FirebaseCoreDiagnosticsInterop`
-> Installing FirebaseInstallations (1.3.0)
> Copying FirebaseInstallations from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FirebaseInstallations/1.3.0-6f5f6` to `Pods/FirebaseInstallations`
-> Installing FirebaseInstanceID (4.3.4)
> Copying FirebaseInstanceID from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FirebaseInstanceID/4.3.4-cef67` to `Pods/FirebaseInstanceID`
-> Installing FirebaseMessaging (4.4.1)
> Copying FirebaseMessaging from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/FirebaseMessaging/4.4.1-29543` to `Pods/FirebaseMessaging`
-> Installing Flutter (1.0.0)
-> Installing GTMAppAuth (1.0.0)
> Copying GTMAppAuth from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/GTMAppAuth/1.0.0-4deac` to `Pods/GTMAppAuth`
-> Installing GTMSessionFetcher (1.4.0)
> Copying GTMSessionFetcher from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/GTMSessionFetcher/1.4.0-6f5c8` to `Pods/GTMSessionFetcher`
-> Installing GoogleAppMeasurement (6.6.0)
> Copying GoogleAppMeasurement from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/GoogleAppMeasurement/6.6.0-67458` to `Pods/GoogleAppMeasurement`
-> Installing GoogleDataTransport (6.2.1)
> Copying GoogleDataTransport from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/GoogleDataTransport/6.2.1-9a8a1` to `Pods/GoogleDataTransport`
-> Installing GoogleDataTransportCCTSupport (3.1.0)
> Copying GoogleDataTransportCCTSupport from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/GoogleDataTransportCCTSupport/3.1.0-d70a5` to `Pods/GoogleDataTransportCCTSupport`
-> Installing GoogleMaps (3.9.0)
> Copying GoogleMaps from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/GoogleMaps/3.9.0-4b534` to `Pods/GoogleMaps`
-> Installing GoogleSignIn (5.0.2)
> Copying GoogleSignIn from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/GoogleSignIn/5.0.2-7137d` to `Pods/GoogleSignIn`
-> Installing GoogleUtilities (6.6.0)
> Copying GoogleUtilities from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/GoogleUtilities/6.6.0-39530` to `Pods/GoogleUtilities`
-> Installing MTBBarcodeScanner (5.0.11)
> Copying MTBBarcodeScanner from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/MTBBarcodeScanner/5.0.11-f453b` to `Pods/MTBBarcodeScanner`
-> Installing PromisesObjC (1.2.8)
> Copying PromisesObjC from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/PromisesObjC/1.2.8-c119f` to `Pods/PromisesObjC`
-> Installing Protobuf (3.12.0)
> Copying Protobuf from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/Protobuf/3.12.0-2793f` to `Pods/Protobuf`
-> Installing Reachability (3.2)
> Copying Reachability from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/Reachability/3.2-33e18` to `Pods/Reachability`
-> Installing barcode_scan (0.0.1)
-> Installing connectivity (0.0.1)
-> Installing connectivity_macos (0.0.1)
-> Installing firebase_auth (0.0.1)
-> Installing firebase_auth_web (0.1.0)
-> Installing firebase_core (0.0.1)
-> Installing firebase_core_web (0.1.0)
-> Installing firebase_messaging (0.0.1)
> Running prepare command
$ /bin/bash -c set -e echo // Generated file, do not edit > Classes/UserAgent.h echo "#define LIBRARY_VERSION @\"5.1.8\"" >> Classes/UserAgent.h echo "#define LIBRARY_NAME @\"flutter-fire-fcm\"" >> Classes/UserAgent.h
-> Installing flutter_facebook_login (0.0.1)
-> Installing flutter_plugin_android_lifecycle (0.0.1)
-> Installing flutter_ringtone_player (0.0.1)
-> Installing geolocator (5.3.1)
-> Installing google_api_availability (2.0.4)
-> Installing google_maps_flutter (0.0.1)
-> Installing google_sign_in (0.0.1)
-> Installing google_sign_in_web (0.8.1)
-> Installing location (0.0.1)
-> Installing location_permissions (2.0.5)
-> Installing nanopb (1.30905.0)
> Copying nanopb from `/Users/tossdown/Library/Caches/CocoaPods/Pods/Release/nanopb/1.30905.0-c43f4` to `Pods/nanopb`
-> Installing open_appstore (0.0.1)
-> Installing package_info (0.0.1)
-> Installing path_provider (0.0.1)
-> Installing shared_preferences (0.0.1)
-> Installing shared_preferences_macos (0.0.1)
-> Installing shared_preferences_web (0.0.1)
-> Installing sqflite (0.0.1)
-> Installing webview_flutter (0.0.1)
- Running pre install hooks
[!] The 'Pods-Runner' target has transitive dependencies that include statically linked binaries: (/Users/tossdown/Documents/FLUTTER/Projects/shezan/ios/Flutter/Flutter.framework)
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer/xcode/target_validator.rb:84:in `block (2 levels) in verify_no_static_framework_transitive_dependencies'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer/xcode/target_validator.rb:74:in `each_key'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer/xcode/target_validator.rb:74:in `block in verify_no_static_framework_transitive_dependencies'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer/xcode/target_validator.rb:73:in `each'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer/xcode/target_validator.rb:73:in `verify_no_static_framework_transitive_dependencies'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer/xcode/target_validator.rb:38:in `validate!'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer.rb:590:in `validate_targets'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/installer.rb:158:in `install!'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/command/install.rb:52:in `run'
/Library/Ruby/Gems/2.6.0/gems/claide-1.0.3/lib/claide/command.rb:334:in `run'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/lib/cocoapods/command.rb:52:in `run'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.8.4/bin/pod:55:in `<top (required)>'
/usr/local/bin/pod:23:in `load'
/usr/local/bin/pod:23:in `<main>'
Error running pod install
Error launching application on Iphone X.
```
</details>
|
c: crash,platform-ios,tool,t: xcode,P3,team-ios,triaged-ios
|
medium
|
Critical
|
629,963,943 |
pytorch
|
Torch hub: object has no attribute nms
|
## π Bug
Using docker `pytorch/pytorch`
```
root@90479d02671b:/workspace# python -c "import torch.hub; torch.hub.load('pytorch/vision', 'deeplabv3_resnet101', pretrained=True)"
Downloading: "https://github.com/pytorch/vision/archive/master.zip" to /root/.cache/torch/hub/master.zip
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/opt/conda/lib/python3.7/site-packages/torch/hub.py", line 365, in load
hub_module = import_module(MODULE_HUBCONF, repo_dir + '/' + MODULE_HUBCONF)
File "/opt/conda/lib/python3.7/site-packages/torch/hub.py", line 75, in import_module
spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 728, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/root/.cache/torch/hub/pytorch_vision_master/hubconf.py", line 5, in <module>
from torchvision.models.alexnet import alexnet
File "/root/.cache/torch/hub/pytorch_vision_master/torchvision/__init__.py", line 5, in <module>
from torchvision import models
File "/root/.cache/torch/hub/pytorch_vision_master/torchvision/models/__init__.py", line 12, in <module>
from . import detection
File "/root/.cache/torch/hub/pytorch_vision_master/torchvision/models/detection/__init__.py", line 1, in <module>
from .faster_rcnn import *
File "/root/.cache/torch/hub/pytorch_vision_master/torchvision/models/detection/faster_rcnn.py", line 7, in <module>
from torchvision.ops import misc as misc_nn_ops
File "/root/.cache/torch/hub/pytorch_vision_master/torchvision/ops/__init__.py", line 1, in <module>
from .boxes import nms, box_iou
File "/root/.cache/torch/hub/pytorch_vision_master/torchvision/ops/boxes.py", line 7, in <module>
@torch.jit.script
File "/opt/conda/lib/python3.7/site-packages/torch/jit/__init__.py", line 1290, in script
fn = torch._C._jit_script_compile(qualified_name, ast, _rcb, get_default_args(obj))
RuntimeError:
object has no attribute nms:
File "/root/.cache/torch/hub/pytorch_vision_master/torchvision/ops/boxes.py", line 41
by NMS, sorted in decreasing order of scores
"""
return torch.ops.torchvision.nms(boxes, scores, iou_threshold)
~~~~~~~~~~~~~~~~~~~~~~~~~ <--- HERE
```
## To Reproduce
Steps to reproduce the behavior:
1. `docker run -it -v --gpus all pytorch/pytorch`
2. `python -c "import torch.hub; torch.hub.load('pytorch/vision', 'deeplabv3_resnet101', pretrained=True)"`
<!-- If you have a code sample, error messages, stack traces, please provide it here as well -->
## Expected behavior
Model should download and load without fail.
## Environment
```
root@a15507f00c1b:/workspace# python collect_env.py
Collecting environment information...
PyTorch version: 1.5.0
Is debug build: No
CUDA used to build PyTorch: 10.1
OS: Ubuntu 18.04.4 LTS
GCC version: Could not collect
CMake version: Could not collect
Python version: 3.7
Is CUDA available: Yes
CUDA runtime version: Could not collect
GPU models and configuration: GPU 0: GeForce RTX 2080 Ti
Nvidia driver version: 440.82
cuDNN version: Could not collect
Versions of relevant libraries:
[pip] numpy==1.18.1
[pip] torch==1.5.0
[pip] torchvision==0.6.0a0+82fd1c8
[conda] blas 1.0 mkl
[conda] cudatoolkit 10.1.243 h6bb024c_0
[conda] mkl 2020.0 166
[conda] mkl-service 2.3.0 py37he904b0f_0
[conda] mkl_fft 1.0.15 py37ha843d7b_0
[conda] mkl_random 1.1.0 py37hd6b4f25_0
[conda] numpy 1.18.1 py37h4f9e942_0
[conda] numpy-base 1.18.1 py37hde5b4d6_1
[conda] pytorch 1.5.0 py3.7_cuda10.1.243_cudnn7.6.3_0 pytorch
[conda] torchvision 0.6.0 py37_cu101 pytorch
```
- PyTorch Version (e.g., 1.0): 1.5.0
- OS (e.g., Linux): NixOS 20.03 unstable -> run docker image pytorch/pytorch
- How you installed PyTorch (`conda`, `pip`, source): NA
- Build command you used (if compiling from source): NA
- Python version: 3.7.7
- CUDA/cuDNN version: 440.82
- GPU models and configuration: GeForce RTX 2080Ti
- Any other relevant information:
|
triaged,module: docker
|
low
|
Critical
|
629,975,605 |
node
|
repl: incomplete expression shouldn't be autocompleted.
|
<!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
-->
* **Version**: v14.4.0, maybe master
* **Platform**: macOS
* **Subsystem**: repl
### What steps will reproduce the bug?
In repl:
```js
> proc // then press `enter` key
```
<!--
Enter details about your bug, preferably a simple code snippet that can be
run using `node` directly without installing third-party dependencies.
-->
### How often does it reproduce? Is there a required condition?
Every time.
### What is the expected behavior?
Throw `Uncaught ReferenceError: proc is not defined`.
<!--
If possible please provide textual output instead of screenshots.
-->
### What do you see instead?
Print the entire `process` object.
<!--
If possible please provide textual output instead of screenshots.
-->
### Additional information
This bug was introduced in a recent release.
Maybe /cc @BridgeAR
<!--
Tell us anything else you think we should know.
-->
|
repl,discuss
|
medium
|
Critical
|
629,988,000 |
go
|
x/crypto/acme: confusing error when ACME CA does not implement pre-authorization flow
|
### What version of Go are you using (`go version`)?
<pre>
$ go version
go version go1.14.4 linux/amd64
</pre>
### Does this issue reproduce with the latest release?
yes (latest version in use)
### What operating system and processor architecture are you using (`go env`)?
<details><summary><code>go env</code> Output</summary><br><pre>
$ go env
GO111MODULE=""
GOARCH="amd64"
GOBIN=""
GOCACHE="/home/[username removed]/.cache/go-build"
GOENV="/home/[username removed]/.config/go/env"
GOEXE=""
GOFLAGS=""
GOHOSTARCH="amd64"
GOHOSTOS="linux"
GOINSECURE=""
GONOPROXY="github.com/[company name removed]/*,[company internal git hosting]/*"
GONOSUMDB="github.com/[company name removed]/*,[company internal git hosting]/*"
GOOS="linux"
GOPATH="/home/[username removed]/go"
GOPRIVATE="github.com/[company name removed]/*,[company internal git hosting]/*"
GOPROXY="https://proxy.golang.org,direct"
GOROOT="/opt/go"
GOSUMDB="sum.golang.org"
GOTMPDIR=""
GOTOOLDIR="/opt/go/pkg/tool/linux_amd64"
GCCGO="gccgo"
AR="ar"
CC="gcc"
CXX="g++"
CGO_ENABLED="1"
GOMOD="/home/[username removed]/dev/certman/webui/webui/go.mod"
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -m64 -pthread -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build840948656=/tmp/go-build -gno-record-gcc-switches"
</pre></details>
### What did you do?
Using `golang.org/x/crypto/acme` (`golang.org/x/crypto v0.0.0-20200510223506-06a226fb4e37` in `go.mod`), I register an account with Let's Encrypt's staging v2 API and then call [Client.Authorize()](https://pkg.go.dev/golang.org/x/crypto/acme?tab=doc#Client.Authorize). This fails because [Discover()](https://pkg.go.dev/golang.org/x/crypto/acme?tab=doc#Client.Discover) returns a [Directory](https://pkg.go.dev/golang.org/x/crypto/acme?tab=doc#Directory) with `AuthzURL` set to the empty string, indicating the pre-authorization flow is not supported by the CA.
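A minimal sketch of the failing sequence (error handling trimmed; the account key and domain are illustrative):
```go
package main

import (
	"context"
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"fmt"

	"golang.org/x/crypto/acme"
)

func main() {
	ctx := context.Background()
	key, _ := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)

	client := &acme.Client{
		Key:          key,
		DirectoryURL: "https://acme-staging-v02.api.letsencrypt.org/directory",
	}
	if _, err := client.Register(ctx, &acme.Account{}, acme.AcceptTOS); err != nil {
		fmt.Println("register:", err)
		return
	}

	// The v2 directory has no newAuthz endpoint, so Directory.AuthzURL is ""
	// and this fails with: Post "": unsupported protocol scheme ""
	_, err := client.Authorize(ctx, "example.com")
	fmt.Println("authorize:", err)
}
```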
### What did you expect to see?
Some form of `The CA does not implement pre-authorization flow` error message.
### What did you see instead?
An error is returned with the text `Post "": unsupported protocol scheme ""`
|
NeedsInvestigation
|
low
|
Critical
|
629,990,142 |
bitcoin
|
Improve deadlock detection
|
Background
---
If two or more threads acquire mutexes in different order, that could cause a deadlock. Currently we have two mechanisms for detecting that - our [DEBUG_LOCKORDER](https://github.com/bitcoin/bitcoin/blob/bdedfcf/src/sync.cpp#L134) and the [thread sanitizer](https://clang.llvm.org/docs/ThreadSanitizer.html).
Problem
---
Both methods may fail to detect some deadlocks:
deadlock type | detected by `DEBUG_LOCKORDER` | detected by TSan | detected by the proposed solution
---|---|---|---
A => B => C => A | :x: | :heavy_check_mark: | :heavy_check_mark:
test case `deadlock_unlock_not_last` | :x: | :heavy_check_mark: | :heavy_check_mark:
test case `deadlock_3` | :x: | :x: <sup>[1]</sup> | :heavy_check_mark:
A => B, restart the program, B => A <sup>[2]</sup> | :x: | :x: | :heavy_check_mark: <sup>[3]</sup>
<sup>[1]</sup> submitted as a bug report at https://github.com/google/sanitizers/issues/1258
<sup>[2]</sup> I guess this is how the bug which https://github.com/bitcoin/bitcoin/pull/19132 fixes sneaked in
<sup>[3]</sup> as long as just B => A is executed
<details>
<summary>test cases</summary>
```cpp
class Event
{
public:
void signal()
{
std::unique_lock<std::mutex> lk(m_mutex);
m_occurred = true;
m_cond.notify_all();
}
void wait()
{
std::unique_lock<std::mutex> lk(m_mutex);
m_cond.wait(lk, [&]() { return m_occurred; });
}
private:
bool m_occurred{false};
std::condition_variable m_cond;
std::mutex m_mutex;
};
static std::mutex printf_mutex;
void printf_sync(const char* format, ...)
{
va_list ap;
va_start(ap, format);
{
std::unique_lock<std::mutex> lk(printf_mutex);
vprintf(format, ap);
}
va_end(ap);
}
BOOST_AUTO_TEST_CASE(deadlock_3)
{
#if 0
// detected by DEBUG_LOCKORDER
// not detected by the thread sanitizer (when DEBUG_LOCKORDER is disabled)
constexpr size_t n_threads = 2;
#else
// deadlock is not detected by either one
constexpr size_t n_threads = 3;
#endif
// t0: lock m0
// t1: lock m1
// t2: lock m2
// t0: try to lock m1, waits for t1
// t1: try to lock m2, waits for t2
// t2: try to lock m0, waits for t0 => deadlock
std::array<Mutex, n_threads> mutexes;
std::array<std::thread, n_threads> threads;
std::array<Event, n_threads> locked_own;
std::array<Event, n_threads> try_to_lock_next;
auto thread = [&](size_t i) {
LOCK(mutexes[i]);
printf_sync("thread%zu locked mutex%zu\n", i, i);
locked_own[i].signal();
try_to_lock_next[i].wait();
const size_t next = (i + 1) % n_threads;
printf_sync("thread%zu trying to lock mutex%zu\n", i, next);
LOCK(mutexes[next]);
};
for (size_t i = 0; i < n_threads; ++i) {
threads[i] = std::thread{thread, i};
}
for (size_t i = 0; i < n_threads; ++i) {
locked_own[i].wait();
}
for (size_t i = 0; i < n_threads; ++i) {
try_to_lock_next[i].signal();
}
for (size_t i = 0; i < n_threads; ++i) {
threads[i].join();
}
}
BOOST_AUTO_TEST_CASE(deadlock_unlock_not_last)
{
// t0: lock m0
// t0: lock m1
// t0: unlock m0
// t1: lock m0
// t1: try to lock m1, waits for t0
// t0: try to lock m0, waits for t1 => deadlock
Mutex m0;
Mutex m1;
Event t1_locked_m0;
std::thread t0{[&]() {
ENTER_CRITICAL_SECTION(m0);
LOCK(m1);
LEAVE_CRITICAL_SECTION(m0);
t1_locked_m0.wait();
LOCK(m0);
}};
std::thread t1{[&]() {
LOCK(m0);
t1_locked_m0.signal();
LOCK(m1);
}};
t0.join();
t1.join();
}
```
</details>
Solution
---
Attach a predefined integer to each mutex that denotes its locking order in relation to other mutexes. Then, whenever a thread attempts to lock a mutex, it would check that the previously acquired mutex has a lower number.
This would require some thorough consideration when adding a new mutex: when is it going to be used, which other mutexes will be held at the time the new one is acquired, and which other mutexes may be acquired while holding the new one. That is a good practice and should be done anyway. Such a mechanism would enforce it.
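A minimal sketch of how such order-checked mutexes could look (illustrative only — `RankedMutex` and its rank bookkeeping are made up for the example and are not the existing `sync.h` API):
```cpp
#include <cassert>
#include <mutex>
#include <vector>

// Illustrative only: each mutex carries a fixed rank, and a thread-local stack
// of currently held ranks is checked on every lock.
class RankedMutex
{
public:
    explicit RankedMutex(int rank) : m_rank{rank} {}

    void lock()
    {
        // Ranks must be acquired in strictly increasing order.
        assert(HeldRanks().empty() || HeldRanks().back() < m_rank);
        m_mutex.lock();
        HeldRanks().push_back(m_rank);
    }

    void unlock()
    {
        // Simplification: assumes mutexes are released in reverse acquisition order.
        assert(!HeldRanks().empty() && HeldRanks().back() == m_rank);
        HeldRanks().pop_back();
        m_mutex.unlock();
    }

private:
    static std::vector<int>& HeldRanks()
    {
        thread_local std::vector<int> ranks;
        return ranks;
    }

    const int m_rank;
    std::mutex m_mutex;
};
```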
|
Feature
|
medium
|
Critical
|
629,992,885 |
PowerToys
|
FancyZones: Sound event for an acoustic confirmation of a snapped-in window
|
# Summary of the new feature/enhancement
Can we please have a Windows sound event which can be set with an individual sound file for an acoustic confirmation of a snapped-in window?
# Proposed technical implementation details (optional)
Add a new category in the following Sound dialog, for instance "PowerToys FancyZones" and add a new event "Window snap in complete", which will be triggered when the window is snapped in a new zone.

|
Idea-Enhancement,Help Wanted,FancyZones-Dragging&UI,Product-FancyZones
|
low
|
Minor
|
630,000,913 |
TypeScript
|
Add named type arguments
|
## Search Terms
generics, type parameters, named parameter, named type parameter, type argument, named type argument
## Suggestion
It should be possible to pass type arguments to a generic by name rather than positionally, eg.
```typescript
interface Foo<T = SomeDefaultType, U> { ... }
// Current syntax
const foo: Foo<SomeDefaultType, string> ...
// Proposed syntax
const foo: Foo<U = string> ...
// yields foo: Foo<SomeDefaultType, string>
```
This is loosely inspired by Python's named arguments:
```python
def foo(bar="I'm bar", *, baz):  # baz is keyword-only, so it may follow the defaulted parameter
    ...
foo(baz="I'm baz")
```
## Use Cases
Generics only accept positional type arguments. If you have a generic accepting many type arguments, most or all of which having default values such as:
```typescript
interface Handler<Type = string, TPayload = object, TOutput = void> {
type: Type
handle(payload: TPayload): TOutput
}
```
Let's say we have a class which implements the Handler interface but the defaults for `Type` and `TPayload` are fine for us and we only want to specify a type for `TOutput`; currently it is mandatory that we pass type arguments for `Type` and `TPayload`:
```typescript
class Foo implements Handler<string, object, Promise<number>>
```
If it was possible to pass type arguments by name we could use the considerably terser form:
```typescript
class Foo implements Handler<TOutput=Promise<number>>
```
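For comparison, the closest approximation available today is a hand-written, partially-applied alias for each combination of defaults one wants to keep (illustrative sketch):
```typescript
// Works today, but needs a dedicated alias for every such combination:
// only TOutput remains configurable here.
type OutputHandler<TOutput> = Handler<string, object, TOutput>;

class Foo implements OutputHandler<Promise<number>> {
  type = "foo";
  handle(payload: object): Promise<number> {
    return Promise.resolve(42);
  }
}
```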
## Examples
[Fastify](https://github.com/fastify/fastify) exposes types generic over many parameters with default values, such as
```typescript
interface FastifyRequest<
HttpRequest = http.IncomingMessage,
Query = DefaultQuery,
Params = DefaultParams,
Headers = DefaultHeaders,
Body = DefaultBody
> { ...
```
With the proposed syntax we could create specialized interfaces with much less overhead such as
```typescript
import * as fastify from 'fastify';
const app = fastify();
app.get('/users/:id', async (req: FastifyRequest<Params = {id: string }>, res: FastifyReply) => {
// req.params is strongly typed now
})
```
## Checklist
My suggestion meets these guidelines:
* [x] This wouldn't be a breaking change in existing TypeScript/JavaScript code
* [x] This wouldn't change the runtime behavior of existing JavaScript code
* [x] This could be implemented without emitting different JS based on the types of the expressions
* [x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, etc.)
* [x] This feature would agree with the rest of [TypeScript's Design Goals](https://github.com/Microsoft/TypeScript/wiki/TypeScript-Design-Goals).
|
Suggestion,Awaiting More Feedback
|
medium
|
Major
|
630,014,185 |
TypeScript
|
allow local module type declarations on separate files (similar to C .h files)
|
<!-- π¨ STOP π¨ π¦π§π’π£ π¨ πΊπ»πΆπ· π¨
Half of all issues filed here are duplicates, answered in the FAQ, or not appropriate for the bug tracker.
Please help us by doing the following steps before logging an issue:
* Search: https://github.com/Microsoft/TypeScript/search?type=Issues
* Read the FAQ, especially the "Common Feature Requests" section: https://github.com/Microsoft/TypeScript/wiki/FAQ
-->
## Search Terms
.d.ts definitions ambient module typing local
<!-- List of keywords you searched for before creating this issue. Write them down here so that others can find this suggestion more easily -->
## Suggestion
I would like to be able to compose TypeScript modules/components inside my application, i.e. `src/components/Foo.ts`, and to declare their typings under `src/components/Foo.d.ts`, where TypeScript would pick those types up automatically.
**Disclaimer**: It may happen that this is already possible, but I already researched everywhere and read the documentation as well as trying to declare ambient modules *(which won't work with relative paths)*, so I am trying this as a last resort in case it's indeed not currently possible.
<!-- A summary of what you'd like to see added or changed -->
## Use Cases
Sometimes, when creating components inside an application, some may have a quite extensive type set, which usually makes the file difficult to read and takes the focus away from the implementation.
Like so:
```
src/component/Foo.ts
##########################
# type #
# type #
# type #
# type #
# type #
# #
# implementation #
# implementation #
##########################
```
<!--
What do you want to use this for?
What shortcomings exist with current approaches?
-->
This would make it easier for the developer to organise their code similarly to C's .h files, separating the definition from the implementation.
## Examples
<!-- Show how this would be used and what the behavior would be -->
Project structure:
```
src/components
src/components/Foo.ts
src/components/Foo.d.ts
src/components/complex/index.ts
src/components/complex/index.d.ts
```
types are automatically inferred within the main module
### src/components/Foo.ts
```ts
const bar:Bar = {foo:1,bar:2};
export default ...
```
### src/components/Foo.d.ts
```ts
type Bar = {foo:number; bar:number; }
export type ExportedType = string[];
```
types can still be imported from a different component
### src/components/Bar.ts
```ts
import type {ExportedType} from './Foo';
import type {Bar} from './Foo'; // < Error as Bar is not exported
```
This would result in:
```
src/component/Foo.d.ts src/components/Foo.ts
########################## ##########################
# type # # implementation #
# type # # implementation #
# type # # #
# type # # #
# type # # #
########################## ##########################
```
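For contrast, the closest equivalent that works today is an ordinary sibling module (a hypothetical `Foo.types.ts`) whose types have to be imported explicitly instead of being ambient for `Foo.ts`:
```ts
// src/components/Foo.types.ts -- a regular module, not a .d.ts
export type Bar = { foo: number; bar: number };
export type ExportedType = string[];

// src/components/Foo.ts -- the types do not flow in automatically
import type { Bar } from './Foo.types';
const bar: Bar = { foo: 1, bar: 2 };
```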
## Checklist
My suggestion meets these guidelines:
* [ ] This wouldn't be a breaking change in existing TypeScript/JavaScript code
* [x] This wouldn't change the runtime behavior of existing JavaScript code
* [x] This could be implemented without emitting different JS based on the types of the expressions
* [ ] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, etc.)
* [x] This feature would agree with the rest of [TypeScript's Design Goals](https://github.com/Microsoft/TypeScript/wiki/TypeScript-Design-Goals).
|
Suggestion,Awaiting More Feedback
|
low
|
Critical
|
630,026,086 |
vscode
|
Add settings search synonym - password/authentication
|
https://github.com/microsoft/vscode-remote-release/issues/3109
|
bug,settings-editor,settings-search
|
low
|
Minor
|
630,052,741 |
vscode
|
Tasks: Pick folder for user tasks in multi-root-workspaces
|
<!-- β οΈβ οΈ Do Not Delete This! feature_request_template β οΈβ οΈ -->
<!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ -->
<!-- Please search existing issues to avoid creating duplicates. -->
<!-- Describe the feature you'd like. -->
User-Tasks in Multi-Root-Workspaces should ask in which workspace Folder they should be executed. Currently every User-Task will only be executed in the first workspaceFolder.
|
feature-request,tasks,workbench-multiroot
|
medium
|
Major
|
630,117,502 |
vscode
|
Allow for "random access" undos
|
<!-- β οΈβ οΈ Do Not Delete This! feature_request_template β οΈβ οΈ -->
<!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ -->
<!-- Please search existing issues to avoid creating duplicates. -->
Refs: https://github.com/microsoft/vscode/issues/99159
> The undo fails regardless of the change made in the other file. I wonder if VS Code could be smart enough to see that the range effected by the edit/undo wasn't changed, and be able to handle the rename (even line-level detection would probably work)
|
feature-request,undo-redo
|
low
|
Minor
|
630,128,958 |
rust
|
Macro expansion for expr
|
```rust
#![feature(raw_ref_op)]
macro_rules! offset {
($ty: tt, $field: expr) => {
unsafe { &raw const ((*(0 as *const $ty)).$field) } as usize
};
}
struct Emm {
a: i32,
b: u64,
}
fn main() {
let tmp = offset!(Emm, b);
let tmp = unsafe { &raw const ((*(0 as *const Emm)).b) } as usize;
}
```
Code below expanded into this result:
```rust
#![feature(prelude_import)]
#![feature(raw_ref_op)]
#[prelude_import]
use std::prelude::v1::*;
#[macro_use]
extern crate std;
struct Emm {
a: i32,
b: u64,
}
fn main() {
let tmp = unsafe { &raw const ((*(0 as *const Emm)), b) } as usize; // Weird
let tmp = unsafe { &raw const ((*(0 as *const Emm)).b) } as usize;
}
```
Notice the `, b` in the weird line. After changing the `expr` in `offset!` to `ident`, the problem is fixed. Is this a bug? I cannot find any documentation about it. If it's not, I guess we could document it in the Rust reference.
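For reference, a standalone version using `ident` (the macro body is unchanged; `offset_ident!` is just an illustrative name) expands as expected:
```rust
#![feature(raw_ref_op)]

// Same body as `offset!` above, but the field is captured as `ident` instead of
// `expr`; with this variant the expansion keeps `.b` as a field access.
macro_rules! offset_ident {
    ($ty: tt, $field: ident) => {
        unsafe { &raw const ((*(0 as *const $ty)).$field) } as usize
    };
}

struct Emm {
    a: i32,
    b: u64,
}

fn main() {
    let tmp = offset_ident!(Emm, b);
    let _ = tmp;
}
```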
|
A-macros,T-compiler,C-bug
|
low
|
Critical
|
630,135,605 |
create-react-app
|
Add option `"root": true` to `eslintConfig` in `package.json`
|
### Is your proposal related to a problem?
https://eslint.org/docs/user-guide/configuring#configuration-cascading-and-hierarchy
> By default, ESLint will look for configuration files in all parent folders up to the root directory.
### Describe the solution you'd like
```diff
{
...
"eslintConfig": {
+ "root": true,
"extends": "react-app"
}
...
}
```
|
issue: proposal,needs triage
|
low
|
Minor
|
630,173,362 |
flutter
|
flutter run shows log lines from other Flutter apps on Android
|
*@Reprevise commented on Jun 3, 2020, 5:02 PM UTC:*
I'm on Dart 2.8.3, and my OS is Windows. I don't know whether to post this in Flutter or here, but it has to do with logging. Anyway, I was debugging my Flutter application and then I clicked on a notification from one of my other apps. Apparently this app was also built with Flutter and I could see its debug logs.
Shouldn't logs be limited to my app, and not all of the other apps on my phone built with Flutter? Some of these logs could contain sensitive information.
*This issue was moved by [devoncarew](https://github.com/devoncarew) from [dart-lang/sdk#42175](https://github.com/dart-lang/sdk/issues/42175).*
|
c: new feature,platform-android,tool,a: debugging,has reproducible steps,P2,found in release: 3.0,found in release: 3.1,team-android,triaged-android
|
low
|
Critical
|
630,229,477 |
node
|
ERR_MODULE_NOT_FOUND recommends what I asked it to import
|
<!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
-->
* **Version**: v14.4.0
* **Platform**: Darwin L7 19.5.0 Darwin Kernel Version 19.5.0: Thu Apr 30 18:25:59 PDT 2020; root:xnu-6153.121.1~7/RELEASE_X86_64 x86_64
* **Subsystem**: esm modules
### What steps will reproduce the bug?
```sh
mkdir -p node_modules/@broken_imports
echo 'export const lala = 3' > node_modules/@broken_imports/lala.js
echo "import { lala } from '@broken_imports/lala.js'" > test_imports.js
node test_imports.js
```
OK, now we'll make it an ES module package. It will still report the same error.
```sh
echo '{"name": "test-pkg", "type": "module"}' > node_modules/@broken_imports/package.json
node test_imports.js
```
### How often does it reproduce? Is there a required condition?
Always happens for me. It outputs:
```
internal/modules/run_main.js:54
internalBinding('errors').triggerUncaughtException(
^
Error [ERR_MODULE_NOT_FOUND]: Cannot find package '@broken_imports/lala.js' imported from /Users/kenny/test/test_imports.js
Did you mean to import @broken_imports/lala.js?
at packageResolve (internal/modules/esm/resolve.js:620:9)
at moduleResolve (internal/modules/esm/resolve.js:659:14)
at Loader.defaultResolve [as _resolve] (internal/modules/esm/resolve.js:752:11)
at Loader.resolve (internal/modules/esm/loader.js:97:40)
at Loader.getModuleJob (internal/modules/esm/loader.js:242:28)
at ModuleWrap.<anonymous> (internal/modules/esm/module_job.js:50:40)
at link (internal/modules/esm/module_job.js:49:36)
```
### What is the expected behavior?
I expected it to import. The docs say: https://nodejs.org/api/esm.html#esm_no_node_path
```
No NODE_PATH#
NODE_PATH is not part of resolving import specifiers. Please use symlinks if this behavior is desired.
```
I tried with symlinks too, and couldn't get it to work.
|
esm
|
low
|
Critical
|
630,230,724 |
rust
|
Get rid of /rustc/$hash hack
|
Spawned off of https://github.com/rust-lang/rust/pull/72767/commits/a8e4236edc1e118ccb6c3f3a8d0139e4cd90b5b8
Right now, `/rustc/$hash` is used as a magic prefix to track paths into libstd source so that it can be remapped to local developer paths when they install the rust-src component.
But, as @eddyb pointed out on PR #72767, once we are willing to actually allocate another entry in the `FileName` enum (or its moral equivalent, splitting the payload of one of its variants into two via a separate `RealFileName` enum), then we should be able to do away with the `/rustc/$hash` prefix hack entirely within the rustc source code.
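A very rough sketch of the shape being described (names and layout are illustrative, not the actual `rustc_span` definitions):
```rust
use std::path::PathBuf;

// Sketch only: the real file name keeps both the developer-local path (when
// known) and the remapped/virtual name such as /rustc/$hash/..., so the
// prefix no longer has to be special-cased by string matching.
enum RealFileName {
    Named(PathBuf),
    Remapped {
        local_path: Option<PathBuf>,
        virtual_name: PathBuf,
    },
}

enum FileName {
    Real(RealFileName),
    // ... other variants (macro expansions, <anon>, etc.)
}
```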
|
C-cleanup,P-medium,T-compiler
|
low
|
Minor
|
630,253,789 |
pytorch
|
JIT test suite has dependencies across tests
|
Steps to reproduce:
1. Induce a bug by adding more type annotations. https://github.com/ezyang/pytorch/tree/poc/jit-bug is what I was working on when this happened
2. Run these tests:
```
(/home/ezyang/local/pytorch-tmp-env) [[email protected] ~/local/pytorch-tmp] python test/test_jit.py TestScript.test_circular_depend
ency
Couldn't download test skip set, leaving all tests enabled...
.
----------------------------------------------------------------------
Ran 1 test in 0.211s
OK
(/home/ezyang/local/pytorch-tmp-env) [[email protected] ~/local/pytorch-tmp] python test/test_jit.py TestScript.test_attribute_in_init TestScript.test_circular_dependency
Couldn't download test skip set, leaving all tests enabled...
.F
======================================================================
FAIL: test_circular_dependency (__main__.TestScript)
----------------------------------------------------------------------
Traceback (most recent call last):
File "test/test_jit.py", line 5303, in test_circular_dependency
self.getExportImportCopy(C())
File "/data/users/ezyang/pytorch-tmp/torch/jit/__init__.py", line 1581, in init_then_script
original_init(self, *args, **kwargs)
File "test/test_jit.py", line 5296, in __init__
self.foo = torch.nn.Sequential(B())
File "/data/users/ezyang/pytorch-tmp/torch/jit/__init__.py", line 1587, in init_then_script
self.__dict__["_actual_script_module"] = torch.jit._recursive.create_script_module(self, make_stubs)
File "/data/users/ezyang/pytorch-tmp/torch/jit/_recursive.py", line 305, in create_script_module
concrete_type = concrete_type_store.get_or_create_concrete_type(nn_module)
File "/data/users/ezyang/pytorch-tmp/torch/jit/_recursive.py", line 264, in get_or_create_concrete_type
concrete_type_builder = infer_concrete_type_builder(nn_module)
File "/data/users/ezyang/pytorch-tmp/torch/jit/_recursive.py", line 128, in infer_concrete_type_builder
assert attr_type.is_interface_type()
AssertionError
----------------------------------------------------------------------
Ran 2 tests in 0.152s
FAILED (failures=1)
(/home/ezyang/local/pytorch-tmp-env) [[email protected] ~/local/pytorch-tmp] python test/test_jit.py TestScript.test_attribute_in_init
Couldn't download test skip set, leaving all tests enabled...
.
----------------------------------------------------------------------
Ran 1 test in 0.047s
OK
```
Individually the tests pass but together they fail.
cc @suo
|
oncall: jit,triaged
|
low
|
Critical
|
630,267,700 |
rust
|
cargo doc output layout improvement for pub const
|
<!-- Thanks for filing a π feature request π! -->
**Describe the problem you are trying to solve**
<!-- A clear and concise description of the problem this feature request is trying to solve. -->
The current `cargo doc` output layout is difficult to read. For example, the HTTP status codes on this [doc page](https://docs.rs/http/0.2.1/http/status/struct.StatusCode.html). As an honest comparison, here is the corresponding doc on the [Python page](https://docs.python.org/3/library/http.html#http-status-codes). Note that Rust's page has the same info, but with a different layout. The problem is the layout.
**Describe the solution you'd like**
<!-- A clear and concise description of what you want to happen. -->
My request is: change the default layout of `cargo doc` to :
- A clear section for struct's `Consts`, separate from `Methods`.
- Remove duplicated info, for example: all `pub const` can be in one sub-section titled `pub const` and no need to repeat `pub const` for each value. Also, no need to repeat the struct name for each const value.
- Consider using a table format for `Consts`.
**Notes**
<!-- Any additional context or information you feel may be relevant to the issue. -->
|
T-rustdoc,C-feature-request
|
low
|
Minor
|
630,278,071 |
flutter
|
Flutter ellipsis is broken when using a custom font with fontWeight other than normal
|
This does not seem right. The issue is that I have downloaded "Oxygen-Regular.ttf" from Google Fonts and added it to my Flutter app (like all the other fonts I have in the app). With the following text widget:
```
child: Text(
"some text",
overflow: TextOverflow.ellipsis,
)
```
using this specific font, the ellipsis is broken (it displays a square), whereas other fonts are fine. I have looked into the problem and it seems that this specific font does not contain the ellipsis character (U+2026). My question is: in these cases, when a font cannot display the ellipsis, can it be replaced by three dots? Or is there any other solution, even if nothing better than using three dots instead of the ellipsis?
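As a possible workaround (just a sketch; `'Roboto'` is only an example of a fallback family that I assume contains U+2026), a fallback font can be supplied so that glyphs missing from Oxygen may be taken from another family:
```
child: Text(
  "some text",
  overflow: TextOverflow.ellipsis,
  style: TextStyle(
    fontFamily: 'Oxygen',
    fontWeight: FontWeight.bold,
    // Glyphs missing from Oxygen (such as U+2026) may fall back to this family.
    fontFamilyFallback: ['Roboto'],
  ),
)
```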
Thanks in advance. There is no `flutter doctor` output attached because my question is not directly about a problem **in** Flutter; I just need some kind of instruction or advice.
|
framework,engine,a: typography,has reproducible steps,P2,found in release: 2.0,found in release: 2.3,team-engine,triaged-engine
|
low
|
Critical
|
630,287,434 |
terminal
|
The Terminal needs to support `"cursorTextColor": null`
|
Follow up from #6337
Spec'd in #6151
Currently, the terminal always draws the cursor on top of text. #6337 is going to change the Terminal to always draw the cursor _underneath_ the text. Before we release a build with #6337 in it, we should add a setting to allow users to return to the current behavior.
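For reference, a sketch of what that could look like in a profile (the exact shape is up to the spec in #6151; per the title, `null` would presumably opt back into today's cursor-on-top rendering):
```json
{
    "profiles": {
        "defaults": {
            "cursorColor": "#FFFFFF",
            "cursorTextColor": null
        }
    }
}
```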
|
Area-Rendering,Area-Settings,Product-Terminal,Issue-Task
|
low
|
Minor
|
630,299,337 |
terminal
|
Split-Pane with 'backgroundImage' set, not showing all background images
|
# Environment
```
Microsoft Windows [Version 10.0.18363.836]
Windows Terminal version: 1.0.1401.0
```
# Steps to reproduce
1. Set up a profile to have a background image
2. Close all terminals
3. Open the terminal via `wt split-pane`, or `wt -p "Command Prompt" ; split-pane -p "Windows PowerShell"`
# Expected behavior
Terminal to open with both background images shown
# Actual behavior
Only a single pane shows the background image.
`wt split-pane;split-pane` also works correctly, and shows the expected behavior.
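For reference, a minimal sketch of the kind of profile configuration used in step 1 (profile names and image paths are placeholders):
```json
{
    "profiles": {
        "list": [
            {
                "name": "Command Prompt",
                "commandline": "cmd.exe",
                "backgroundImage": "C:\\Users\\me\\Pictures\\bg1.png"
            },
            {
                "name": "Windows PowerShell",
                "commandline": "powershell.exe",
                "backgroundImage": "C:\\Users\\me\\Pictures\\bg2.png"
            }
        ]
    }
}
```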
|
Help Wanted,Needs-Repro,Issue-Bug,Area-TerminalControl,Product-Terminal,Priority-3
|
medium
|
Major
|
630,331,908 |
go
|
runtime: "fatal error: all goroutines are asleep - deadlock!" with GC assist wait
|
[2020-06-02T21:19:07-ee776b4/freebsd-386-11_2](https://build.golang.org/log/a52682319e4acceb9aa1804f128b3f941c6eb007):
```
fatal error: all goroutines are asleep - deadlock!
goroutine 1 [semacquire]:
sync.runtime_Semacquire(0x39123438)
/tmp/workdir/go/src/runtime/sema.go:56 +0x30
sync.(*WaitGroup).Wait(0x39123430)
/tmp/workdir/go/src/sync/waitgroup.go:130 +0x76
cmd/compile/internal/gc.compileFunctions()
/tmp/workdir/go/src/cmd/compile/internal/gc/pgen.go:392 +0x190
cmd/compile/internal/gc.Main(0x87a6314)
/tmp/workdir/go/src/cmd/compile/internal/gc/main.go:757 +0x30d6
main.main()
/tmp/workdir/go/src/cmd/compile/main.go:52 +0x8d
goroutine 22 [GC assist wait]:
cmd/compile/internal/ssa.cse(0x39168540)
/tmp/workdir/go/src/cmd/compile/internal/ssa/cse.go:52 +0x301
cmd/compile/internal/ssa.Compile(0x39168540)
/tmp/workdir/go/src/cmd/compile/internal/ssa/compile.go:93 +0x873
cmd/compile/internal/gc.buildssa(0x3933c5b0, 0x3, 0x0)
/tmp/workdir/go/src/cmd/compile/internal/gc/ssa.go:460 +0xa5d
cmd/compile/internal/gc.compileSSA(0x3933c5b0, 0x3)
/tmp/workdir/go/src/cmd/compile/internal/gc/pgen.go:317 +0x4c
cmd/compile/internal/gc.compileFunctions.func2(0x3943a7c0, 0x39123430, 0x3)
/tmp/workdir/go/src/cmd/compile/internal/gc/pgen.go:382 +0x35
created by cmd/compile/internal/gc.compileFunctions
/tmp/workdir/go/src/cmd/compile/internal/gc/pgen.go:380 +0xf7
```
This is a failure on `freebsd-386-11_2`. I don't see any recent similar failures.
cc @mknyszek @aclements
|
GarbageCollector,NeedsInvestigation,compiler/runtime
|
low
|
Critical
|
630,363,367 |
flutter
|
Issue with Gradle Build during Flutter run command
|
I have created a new Flutter project in Android Studio.
While running "flutter run" in the project directory, I am getting the error below.
**D:\SandBox\Flutter_workspace\flutter_app>flutter run
The system cannot find the path specified.
The system cannot find the path specified.
Launching lib\main.dart on Redmi 4 in debug mode...
The system cannot find the path specified.
The system cannot find the path specified.
The system cannot find the path specified.
The system cannot find the path specified.
FAILURE: Build failed with an exception.
* Where:
Script 'D:\Flutter-SDK\flutter\packages\flutter_tools\gradle\flutter.gradle' line: 882
* What went wrong:
Execution failed for task ':app:compileFlutterBuildDebug'.
> Process 'command 'D:\Flutter-SDK\flutter\bin\flutter.bat'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 6s
Running Gradle task 'assembleDebug'...
Running Gradle task 'assembleDebug'... Done 7.1s
Exception: Gradle task assembleDebug failed with exit code 1**
## Steps to Reproduce
<!-- Please tell us exactly how to reproduce the problem you are running into. -->
1. Create a new Flutter project in Android Studio.
2. Run "**flutter run**" in project directory or click on "**Run**" button
## Logs
Output of **flutter run --verbose** :
```
D:\SandBox\Flutter_workspace\flutter_app>flutter run --verbose
The system cannot find the path specified.
The system cannot find the path specified.
[ +18 ms] executing: [D:\Flutter-SDK\flutter/] git -c log.showSignature=false log -n 1 --pretty=format:%H
[ +69 ms] Exit code 0 from: git -c log.showSignature=false log -n 1 --pretty=format:%H
[ +3 ms] 5f21edf8b66e31a39133177319414395cc5b5f48
[ +1 ms] executing: [D:\Flutter-SDK\flutter/] git tag --contains HEAD
[ +244 ms] Exit code 0 from: git tag --contains HEAD
[ +3 ms] 1.17.2
[ +16 ms] executing: [D:\Flutter-SDK\flutter/] git rev-parse --abbrev-ref --symbolic @{u}
[ +40 ms] Exit code 0 from: git rev-parse --abbrev-ref --symbolic @{u}
[ +3 ms] origin/stable
[ +7 ms] executing: [D:\Flutter-SDK\flutter/] git ls-remote --get-url origin
[ +47 ms] Exit code 0 from: git ls-remote --get-url origin
[ +2 ms] https://github.com/flutter/flutter.git
[ +111 ms] executing: [D:\Flutter-SDK\flutter/] git rev-parse --abbrev-ref HEAD
[ +42 ms] Exit code 0 from: git rev-parse --abbrev-ref HEAD
[ +2 ms] stable
[ +44 ms] Artifact Instance of 'AndroidMavenArtifacts' is not required, skipping update.
[ +3 ms] Artifact Instance of 'AndroidGenSnapshotArtifacts' is not required, skipping update.
[ +20 ms] Artifact Instance of 'AndroidInternalBuildArtifacts' is not required, skipping update.
[ +6 ms] Artifact Instance of 'IOSEngineArtifacts' is not required, skipping update.
[ +11 ms] Artifact Instance of 'FlutterWebSdk' is not required, skipping update.
[ +6 ms] Artifact Instance of 'WindowsEngineArtifacts' is not required, skipping update.
[ +4 ms] Artifact Instance of 'MacOSEngineArtifacts' is not required, skipping update.
[ +8 ms] Artifact Instance of 'LinuxEngineArtifacts' is not required, skipping update.
[ +3 ms] Artifact Instance of 'LinuxFuchsiaSDKArtifacts' is not required, skipping update.
[ +12 ms] Artifact Instance of 'MacOSFuchsiaSDKArtifacts' is not required, skipping update.
[ +1 ms] Artifact Instance of 'FlutterRunnerSDKArtifacts' is not required, skipping update.
[ +7 ms] Artifact Instance of 'FlutterRunnerDebugSymbols' is not required, skipping update.
[ +33 ms] executing: C:\Users\adity\AppData\Local\Android\Sdk\platform-tools\adb.exe devices -l
[ +51 ms] List of devices attached
ca2825ef7d14 device product:santoni model:Redmi_4 device:santoni transport_id:1
[ +10 ms] C:\Users\adity\AppData\Local\Android\Sdk\platform-tools\adb.exe -s ca2825ef7d14 shell getprop
[ +147 ms] Artifact Instance of 'AndroidMavenArtifacts' is not required, skipping update.
[ +10 ms] Artifact Instance of 'AndroidInternalBuildArtifacts' is not required, skipping update.
[ +6 ms] Artifact Instance of 'IOSEngineArtifacts' is not required, skipping update.
[ +1 ms] Artifact Instance of 'FlutterWebSdk' is not required, skipping update.
[ +12 ms] Artifact Instance of 'WindowsEngineArtifacts' is not required, skipping update.
[ +1 ms] Artifact Instance of 'MacOSEngineArtifacts' is not required, skipping update.
[ +8 ms] Artifact Instance of 'LinuxEngineArtifacts' is not required, skipping update.
[ +10 ms] Artifact Instance of 'LinuxFuchsiaSDKArtifacts' is not required, skipping update.
[ +1 ms] Artifact Instance of 'MacOSFuchsiaSDKArtifacts' is not required, skipping update.
[ +11 ms] Artifact Instance of 'FlutterRunnerSDKArtifacts' is not required, skipping update.
[ +1 ms] Artifact Instance of 'FlutterRunnerDebugSymbols' is not required, skipping update.
[ +211 ms] Generating D:\SandBox\Flutter_workspace\flutter_app\android\app\src\main\java\io\flutter\plugins\GeneratedPluginRegistrant.java
[ +28 ms] ro.hardware = qcom
[ +54 ms] Launching lib\main.dart on Redmi 4 in debug mode...
[ +12 ms] D:\Flutter-SDK\flutter\bin\cache\dart-sdk\bin\dart.exe D:\Flutter-SDK\flutter\bin\cache\artifacts\engine\windows-x64\frontend_server.dart.snapshot --sdk-root
D:\Flutter-SDK\flutter\bin\cache\artifacts\engine\common\flutter_patched_sdk/ --incremental --target=flutter --debugger-module-names -Ddart.developer.causal_async_stacks=true --output-dill
C:\Users\adity\AppData\Local\Temp\flutter_tool.382d803a-a5e4-11ea-9cca-cd7aa83d4682\app.dill --packages D:\SandBox\Flutter_workspace\flutter_app\.packages -Ddart.vm.profile=false -Ddart.vm.product=false
--bytecode-options=source-positions,local-var-info,debugger-stops,instance-field-initializers,keep-unreachable-code,avoid-closure-call-instructions --enable-asserts --track-widget-creation --filesystem-scheme
org-dartlang-root --initialize-from-dill build\cache.dill
[ +23 ms] executing: C:\Users\adity\AppData\Local\Android\Sdk\platform-tools\adb.exe -s ca2825ef7d14 shell -x logcat -v time -t 1
[ +285 ms] Exit code 0 from: C:\Users\adity\AppData\Local\Android\Sdk\platform-tools\adb.exe -s ca2825ef7d14 shell -x logcat -v time -t 1
[ +6 ms] --------- beginning of main
06-04 03:20:21.964 D/ThermalEngine( 685): sensor_wait: case_therm Wait start. 1000ms
[ +33 ms] <- compile package:flutterapp/main.dart
[ +25 ms] executing: C:\Users\adity\AppData\Local\Android\Sdk\platform-tools\adb.exe version
[ +66 ms] Android Debug Bridge version 1.0.41
Version 30.0.1-6435776
Installed as C:\Users\adity\AppData\Local\Android\Sdk\platform-tools\adb.exe
[ +7 ms] executing: C:\Users\adity\AppData\Local\Android\Sdk\platform-tools\adb.exe start-server
[ +42 ms] Building APK
[ +28 ms] Running Gradle task 'assembleDebug'...
[ +7 ms] gradle.properties already sets `android.enableR8`
[ +10 ms] Using gradle from D:\SandBox\Flutter_workspace\flutter_app\android\gradlew.bat.
[ +7 ms] D:\SandBox\Flutter_workspace\flutter_app\android\gradlew.bat mode: 33279 rwxrwxrwx.
[ +18 ms] executing: C:\Program Files\Android\Android Studio2\jre\bin\java -version
[ +142 ms] Exit code 0 from: C:\Program Files\Android\Android Studio2\jre\bin\java -version
[ +3 ms] openjdk version "1.8.0_242-release"
OpenJDK Runtime Environment (build 1.8.0_242-release-1644-b01)
OpenJDK 64-Bit Server VM (build 25.242-b01, mixed mode)
[ +16 ms] executing: [D:\SandBox\Flutter_workspace\flutter_app\android/] D:\SandBox\Flutter_workspace\flutter_app\android\gradlew.bat -Pverbose=true -Ptarget-platform=android-arm64
-Ptarget=D:\SandBox\Flutter_workspace\flutter_app\lib\main.dart -Ptrack-widget-creation=true -Pfilesystem-scheme=org-dartlang-root assembleDebug
[ +51 ms] The system cannot find the path specified.
[+1647 ms] > Configure project :app
[ +4 ms] WARNING: The following project options are deprecated and have been removed:
[ +6 ms] android.enableAapt2
[ +6 ms] This property has no effect, AAPT2 is now always used.
[+2179 ms] > Task :app:compileFlutterBuildDebug
[ +4 ms] [ +20 ms] executing: [D:\Flutter-SDK\flutter/] git -c log.showSignature=false log -n 1 --pretty=format:%H
[ +4 ms] [ +62 ms] Exit code 0 from: git -c log.showSignature=false log -n 1 --pretty=format:%H
[ +1 ms] [ ] 5f21edf8b66e31a39133177319414395cc5b5f48
[ +1 ms] [ ] executing: [D:\Flutter-SDK\flutter/] git tag --contains HEAD
[ +9 ms] [ +254 ms] Exit code 0 from: git tag --contains HEAD
[ +1 ms] [ +1 ms] 1.17.2
[ +3 ms] [ +10 ms] executing: [D:\Flutter-SDK\flutter/] git rev-parse --abbrev-ref --symbolic @{u}
[ +5 ms] [ +35 ms] Exit code 0 from: git rev-parse --abbrev-ref --symbolic @{u}
[ +5 ms] [ ] origin/stable
[ +1 ms] [ ] executing: [D:\Flutter-SDK\flutter/] git ls-remote --get-url origin
[ +5 ms] [ +32 ms] Exit code 0 from: git ls-remote --get-url origin
[ +3 ms] [ ] https://github.com/flutter/flutter.git
[ +7 ms] [ +98 ms] executing: [D:\Flutter-SDK\flutter/] git rev-parse --abbrev-ref HEAD
[ +3 ms] [ +48 ms] Exit code 0 from: git rev-parse --abbrev-ref HEAD
[ +5 ms] [ ] stable
[ +5 ms] [ +30 ms] Artifact Instance of 'AndroidMavenArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'AndroidGenSnapshotArtifacts' is not required, skipping update.
[ +5 ms] The system cannot find the path specified.
[ +10 ms] The system cannot find the path specified.
[ +3 ms] The system cannot find the path specified.
[ +2 ms] [ ] Artifact Instance of 'AndroidInternalBuildArtifacts' is not required, skipping update.
[ +5 ms] [ ] Artifact Instance of 'IOSEngineArtifacts' is not required, skipping update.
[ +5 ms] [ ] Artifact Instance of 'FlutterWebSdk' is not required, skipping update.
[ +5 ms] [ +5 ms] Artifact Instance of 'WindowsEngineArtifacts' is not required, skipping update.
[ +4 ms] [ ] Artifact Instance of 'MacOSEngineArtifacts' is not required, skipping update.
[ +7 ms] [ ] Artifact Instance of 'LinuxEngineArtifacts' is not required, skipping update.
[ +4 ms] [ ] Artifact Instance of 'LinuxFuchsiaSDKArtifacts' is not required, skipping update.
[ +5 ms] [ ] Artifact Instance of 'MacOSFuchsiaSDKArtifacts' is not required, skipping update.
[ +5 ms] [ ] Artifact Instance of 'FlutterRunnerSDKArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'FlutterRunnerDebugSymbols' is not required, skipping update.
[ +5 ms] [ +18 ms] Artifact Instance of 'MaterialFonts' is not required, skipping update.
[ +8 ms] [ ] Artifact Instance of 'GradleWrapper' is not required, skipping update.
[ +9 ms] [ ] Artifact Instance of 'AndroidMavenArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'AndroidGenSnapshotArtifacts' is not required, skipping update.
[ +10 ms] [ ] Artifact Instance of 'AndroidInternalBuildArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'IOSEngineArtifacts' is not required, skipping update.
[ +5 ms] [ ] Artifact Instance of 'FlutterWebSdk' is not required, skipping update.
[ +5 ms] [ ] Artifact Instance of 'FlutterSdk' is not required, skipping update.
[ +5 ms] [ ] Artifact Instance of 'WindowsEngineArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'MacOSEngineArtifacts' is not required, skipping update.
[ +10 ms] [ ] Artifact Instance of 'LinuxEngineArtifacts' is not required, skipping update.
[ +4 ms] [ ] Artifact Instance of 'LinuxFuchsiaSDKArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'MacOSFuchsiaSDKArtifacts' is not required, skipping update.
[ +5 ms] [ ] Artifact Instance of 'FlutterRunnerSDKArtifacts' is not required, skipping update.
[ +5 ms] [ ] Artifact Instance of 'FlutterRunnerDebugSymbols' is not required, skipping update.
[ +4 ms] [ ] Artifact Instance of 'IosUsbArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'IosUsbArtifacts' is not required, skipping update.
[ +10 ms] [ ] Artifact Instance of 'IosUsbArtifacts' is not required, skipping update.
[ +4 ms] [ ] Artifact Instance of 'IosUsbArtifacts' is not required, skipping update.
[ +4 ms] [ ] Artifact Instance of 'IosUsbArtifacts' is not required, skipping update.
[ +2 ms] [ ] Artifact Instance of 'FontSubsetArtifacts' is not required, skipping update.
[ +4 ms] [ +87 ms] Initializing file store
[ +5 ms] [ +15 ms] Done initializing file store
[ +842 ms] [+1719 ms] Skipping target: kernel_snapshot
[ +199 ms] [ +200 ms] debug_android_application: Starting due to {InvalidatedReason.outputMissing}
[ +198 ms] [ +207 ms] debug_android_application: Complete
[ +4 ms] [ +24 ms] Persisting file store
[ +5 ms] [ +19 ms] Done persisting file store
[ +90 ms] [ +4 ms] build succeeded.
[ +3 ms] [ +33 ms] "flutter assemble" took 2,421ms.
[ +172 ms] > Task :app:compileFlutterBuildDebug FAILED
[ +4 ms] 1 actionable task: 1 executed
[ +5 ms] FAILURE: Build failed with an exception.
[ +3 ms] * Where:
[ +1 ms] Script 'D:\Flutter-SDK\flutter\packages\flutter_tools\gradle\flutter.gradle' line: 882
[ +4 ms] * What went wrong:
[ +6 ms] Execution failed for task ':app:compileFlutterBuildDebug'.
[ +3 ms] > Process 'command 'D:\Flutter-SDK\flutter\bin\flutter.bat'' finished with non-zero exit value 1
[ +2 ms] * Try:
[ +1 ms] Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
[ +3 ms] * Get more help at https://help.gradle.org
[ +2 ms] BUILD FAILED in 5s
[ +334 ms] Running Gradle task 'assembleDebug'... (completed in 6.2s)
[ +520 ms] Exception: Gradle task assembleDebug failed with exit code 1
[ +2 ms] "flutter run" took 8,093ms.
#0 throwToolExit (package:flutter_tools/src/base/common.dart:14:3)
#1 RunCommand.runCommand (package:flutter_tools/src/commands/run.dart:569:7)
<asynchronous suspension>
#2 FlutterCommand.verifyThenRunCommand (package:flutter_tools/src/runner/flutter_command.dart:723:18)
#3 _rootRunUnary (dart:async/zone.dart:1192:38)
#4 _CustomZone.runUnary (dart:async/zone.dart:1085:19)
#5 _FutureListener.handleValue (dart:async/future_impl.dart:141:18)
#6 Future._propagateToListeners.handleValueCallback (dart:async/future_impl.dart:682:45)
#7 Future._propagateToListeners (dart:async/future_impl.dart:711:32)
#8 Future._completeWithValue (dart:async/future_impl.dart:526:5)
#9 _AsyncAwaitCompleter.complete (dart:async-patch/async_patch.dart:36:15)
#10 _completeOnAsyncReturn (dart:async-patch/async_patch.dart:298:13)
#11 RunCommand.usageValues (package:flutter_tools/src/commands/run.dart)
#12 _rootRunUnary (dart:async/zone.dart:1192:38)
#13 _CustomZone.runUnary (dart:async/zone.dart:1085:19)
#14 _FutureListener.handleValue (dart:async/future_impl.dart:141:18)
#15 Future._propagateToListeners.handleValueCallback (dart:async/future_impl.dart:682:45)
#16 Future._propagateToListeners (dart:async/future_impl.dart:711:32)
#17 Future._completeWithValue (dart:async/future_impl.dart:526:5)
#18 _AsyncAwaitCompleter.complete (dart:async-patch/async_patch.dart:36:15)
#19 _completeOnAsyncReturn (dart:async-patch/async_patch.dart:298:13)
#20 AndroidDevice.isLocalEmulator (package:flutter_tools/src/android/android_device.dart)
#21 _rootRunUnary (dart:async/zone.dart:1192:38)
#22 _CustomZone.runUnary (dart:async/zone.dart:1085:19)
#23 _FutureListener.handleValue (dart:async/future_impl.dart:141:18)
#24 Future._propagateToListeners.handleValueCallback (dart:async/future_impl.dart:682:45)
#25 Future._propagateToListeners (dart:async/future_impl.dart:711:32)
#26 Future._completeWithValue (dart:async/future_impl.dart:526:5)
#27 Future._asyncComplete.<anonymous closure> (dart:async/future_impl.dart:556:7)
#28 _rootRun (dart:async/zone.dart:1184:13)
#29 _CustomZone.run (dart:async/zone.dart:1077:19)
#30 _CustomZone.runGuarded (dart:async/zone.dart:979:7)
#31 _CustomZone.bindCallbackGuarded.<anonymous closure> (dart:async/zone.dart:1019:23)
#32 _microtaskLoop (dart:async/schedule_microtask.dart:43:21)
#33 _startMicrotaskLoop (dart:async/schedule_microtask.dart:52:5)
#34 _runPendingImmediateCallback (dart:isolate-patch/isolate_patch.dart:118:13)
#35 _RawReceivePortImpl._handleMessage (dart:isolate-patch/isolate_patch.dart:169:5)
```
Output for **flutter doctor -v**
```
D:\SandBox\Flutter_workspace\flutter_app>flutter doctor -v
The system cannot find the path specified.
The system cannot find the path specified.
[β] Flutter (Channel stable, v1.17.2, on Windows, locale en-IN)
β’ Flutter version 1.17.2 at D:\Flutter-SDK\flutter
β’ Framework revision 5f21edf8b6 (6 days ago), 2020-05-28 12:44:12 -0700
β’ Engine revision b851c71829
β’ Dart version 2.8.3
[β] Android toolchain - develop for Android devices (Android SDK version 28.0.3)
β’ Android SDK at C:\Users\adity\AppData\Local\Android\Sdk
β’ Platform android-28, build-tools 28.0.3
β’ ANDROID_SDK_ROOT = D:\Android_SDK
β’ Java binary at: C:\Program Files\Android\Android Studio2\jre\bin\java
β’ Java version OpenJDK Runtime Environment (build 1.8.0_242-release-1644-b01)
β’ All Android licenses accepted.
[β] Android Studio (version 4.0)
β’ Android Studio at C:\Program Files\Android\Android Studio2
β’ Flutter plugin version 46.0.2
β’ Dart plugin version 193.7361
β’ Java version OpenJDK Runtime Environment (build 1.8.0_242-release-1644-b01)
[β] VS Code (version 1.45.1)
β’ VS Code at C:\Users\adity\AppData\Local\Programs\Microsoft VS Code
β’ Flutter extension version 3.11.0
[β] Connected device (1 available)
β’ Redmi 4 β’ ca2825ef7d14 β’ android-arm64 β’ Android 7.1.2 (API 25)
β’ No issues found!
```
Output of **gradlew build --stacktrace** inside PROJECTDIR\android
```
D:\SandBox\Flutter_workspace\flutter_app\android>gradlew build --stacktrace
> Configure project :app
WARNING: The following project options are deprecated and have been removed:
android.enableAapt2
This property has no effect, AAPT2 is now always used.
> Task :app:compileFlutterBuildDebug
The system cannot find the path specified.
The system cannot find the path specified.
The system cannot find the path specified.
> Task :app:compileFlutterBuildDebug FAILED
FAILURE: Build failed with an exception.
* Where:
Script 'D:\Flutter-SDK\flutter\packages\flutter_tools\gradle\flutter.gradle' line: 882
* What went wrong:
Execution failed for task ':app:compileFlutterBuildDebug'.
> Process 'command 'D:\Flutter-SDK\flutter\bin\flutter.bat'' finished with non-zero exit value 1
* Try:
Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':app:compileFlutterBuildDebug'.
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$3.accept(ExecuteActionsTaskExecuter.java:166)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$3.accept(ExecuteActionsTaskExecuter.java:163)
at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:191)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:156)
at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:62)
at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:108)
at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
at org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:94)
at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:46)
at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:95)
at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:77)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:55)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:52)
at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
Caused by: org.gradle.process.internal.ExecException: Process 'command 'D:\Flutter-SDK\flutter\bin\flutter.bat'' finished with non-zero exit value 1
at org.gradle.process.internal.DefaultExecHandle$ExecResultImpl.assertNormalExitValue(DefaultExecHandle.java:409)
at org.gradle.process.internal.DefaultExecAction.execute(DefaultExecAction.java:38)
at org.gradle.process.internal.DefaultExecActionFactory.exec(DefaultExecActionFactory.java:145)
at org.gradle.api.internal.project.DefaultProject.exec(DefaultProject.java:1117)
at org.gradle.api.internal.project.DefaultProject.exec(DefaultProject.java:1112)
at org.gradle.api.Project$exec$5.call(Unknown Source)
at BaseFlutterTask.buildBundle(D:\Flutter-SDK\flutter\packages\flutter_tools\gradle\flutter.gradle:882)
at BaseFlutterTask$buildBundle.callCurrent(Unknown Source)
at FlutterTask.build(D:\Flutter-SDK\flutter\packages\flutter_tools\gradle\flutter.gradle:990)
at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:103)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:49)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:42)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:28)
at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:717)
at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:684)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$5.run(ExecuteActionsTaskExecuter.java:476)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:461)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:444)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$200(ExecuteActionsTaskExecuter.java:93)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:237)
at org.gradle.internal.execution.steps.ExecuteStep.lambda$execute$1(ExecuteStep.java:33)
at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:33)
at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:26)
at org.gradle.internal.execution.steps.CleanupOutputsStep.execute(CleanupOutputsStep.java:58)
at org.gradle.internal.execution.steps.CleanupOutputsStep.execute(CleanupOutputsStep.java:35)
at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:48)
at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:33)
at org.gradle.internal.execution.steps.CancelExecutionStep.execute(CancelExecutionStep.java:39)
at org.gradle.internal.execution.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:73)
at org.gradle.internal.execution.steps.TimeoutStep.execute(TimeoutStep.java:54)
at org.gradle.internal.execution.steps.CatchExceptionStep.execute(CatchExceptionStep.java:35)
at org.gradle.internal.execution.steps.CreateOutputsStep.execute(CreateOutputsStep.java:51)
at org.gradle.internal.execution.steps.SnapshotOutputsStep.execute(SnapshotOutputsStep.java:45)
at org.gradle.internal.execution.steps.SnapshotOutputsStep.execute(SnapshotOutputsStep.java:31)
at org.gradle.internal.execution.steps.CacheStep.executeWithoutCache(CacheStep.java:208)
at org.gradle.internal.execution.steps.CacheStep.execute(CacheStep.java:70)
at org.gradle.internal.execution.steps.CacheStep.execute(CacheStep.java:45)
at org.gradle.internal.execution.steps.BroadcastChangingOutputsStep.execute(BroadcastChangingOutputsStep.java:49)
at org.gradle.internal.execution.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:43)
at org.gradle.internal.execution.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:32)
at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:38)
at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:24)
at org.gradle.internal.execution.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:96)
at org.gradle.internal.execution.steps.SkipUpToDateStep.lambda$execute$0(SkipUpToDateStep.java:89)
at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:54)
at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:38)
at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:76)
at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:37)
at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:36)
at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:26)
at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:90)
at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:48)
at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:69)
at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:47)
at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:33)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:140)
... 34 more
* Get more help at https://help.gradle.org
BUILD FAILED in 5s
1 actionable task: 1 executed
```
Kindly help with this, as this is my first time with Flutter, Gradle, and Android Studio.
|
c: crash,tool,t: gradle,P2,team-tool,triaged-tool
|
low
|
Critical
|
630,410,184 |
svelte
|
delay in transitions only applies to last change
|
https://svelte.dev/repl/577f73063a75469292b63d384f72ea9c?version=3.23.0

( reload to replay gif )
|
bug,temp-stale
|
low
|
Major
|
630,415,282 |
pytorch
|
Misannotation of layer_norm parameters causes internal assert failure
|
Steps to reproduce:
1. Check out https://github.com/ezyang/pytorch/pull/new/poc/layer-norm-misannot
2. Run `pytest test/quantization/test_quantize_script.py -k test_layer_norm`
Expected result: no internal assert failure
Actual result:
```
model = RecursiveScriptModule(original_name=LayerNorm), inplace = False, debug = False, is_dynamic = False
def _convert_script(model, inplace=False, debug=False, is_dynamic=False):
assert not inplace, "The inplace support is still in development"
_check_is_script_module(model)
model.eval()
model = wrap_cpp_module(torch._C._jit_pass_insert_quant_dequant(model._c, 'forward', inplace, is_dynamic))
if not debug:
> model = wrap_cpp_module(torch._C._jit_pass_quant_finalize(model._c, is_dynamic))
E RuntimeError: 0 INTERNAL ASSERT FAILED at "../torch/csrc/jit/ir/alias_analysis.cpp":465, please report a bug to PyTorch. We don't have an op for quantized::layer_norm but it isn't a special case. Argument types: Tensor, int[], Tensor?, Tensor?, float, float, int,
```
The failure is because of a bug in the module code:
```
diff --git a/torch/nn/modules/normalization.py b/torch/nn/modules/normalization.py
index b49650e946..09f710769b 100644
--- a/torch/nn/modules/normalization.py
+++ b/torch/nn/modules/normalization.py
@@ -142,8 +142,8 @@ class LayerNorm(Module):
normalized_shape: _shape_t
eps: float
elementwise_affine: bool
- weight: Optional[Tensor]
- bias: Optional[Tensor]
+ weight: Tensor
+ bias: Tensor
def __init__(self, normalized_shape: _shape_t, eps: float = 1e-5, elementwise_affine: bool = True):
super(LayerNorm, self).__init__()
```
But it should not be possible to trigger an internal assert just by misannotating parameters in Python.
cc @jerryzh168 @jianyuh @raghuramank100 @jamesr66a @vkuzo @dzhulgakov
|
oncall: quantization,low priority,triaged
|
low
|
Critical
|
630,443,449 |
pytorch
|
pytest suppresses stderr from Python startup by default
|
Steps to reproduce:
1. Make some error in a static initializer
2. Run pytest on a test suite
Expected result: You get the stderr printed by the Python binary (with the actual static initializer error) before it exits
Actual result: Pytest suppresses the stderr and... gives you the C backtrace for Python?
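As a workaround sketch (not a fix), disabling pytest's output capturing appears to let the startup stderr through:
```
# -s is shorthand for --capture=no; <test file> is a placeholder
pytest -s <test file>
```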
This should get filed upstream
cc @mruberry
|
module: logging,module: tests,triaged
|
low
|
Critical
|
630,487,573 |
flutter
|
[web] Doesn't work on android 4.4 or 5.x default browser
|
## Steps to Reproduce
Sorry, there is no need to explain this in a complicated way.
1. Use Android Studio to create a sample flutter app -- a Counter app.
2. Compile it using the command: flutter build web.
3. Configure your Nginx to serve it on local port 80, and test it with your Chrome browser (a minimal config sketch follows these steps).
4. Run an Android (4.4 or 5.x) emulator, open the default browser, and enter 10.0.2.2. You will see a blank page once the browser's loading progress finishes.
5. In fact, I have so far tested several browsers, and none works properly; each shows only a blank page.
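A minimal sketch of the Nginx configuration from step 3 (the root path is a placeholder for wherever `flutter build web` put its output):
```
server {
    listen 80;
    # placeholder: point this at the build/web output of "flutter build web"
    root /var/www/flutter_app/build/web;
    index index.html;
}
```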
Why do I only test Android 4.4 or 5.x? I want my web application to run properly at least on Android 4.4 or 5.x. If this demonstration app is not able to run on Android 4.4 or 5.x, my other project will be officially discarded.
What can I do?
I really need help.
|
platform-android,framework,engine,platform-web,a: build,e: OS-version specific,has reproducible steps,P3,team-engine,triaged-engine,found in release: 3.16,found in release: 3.18
|
low
|
Major
|
630,494,854 |
go
|
x/tools/gopls: rename in GOPATH mode does not rename test variant
|
I was running through the steps in https://github.com/golang/vscode-go/blob/master/docs/Smoke-Test.md, with the example repository in `GOPATH` mode. Renaming `stringutil.Reverse` resulted in an error in the `reverse_test.go` file, as the version in the test variant was not renamed.
|
gopls,Tools
|
low
|
Critical
|
630,500,892 |
flutter
|
webview response to gesture from the widget above it
|
webview_flutter for iOS: when loading the map URL, a GestureDetector that the Stack overlays on top of the WebView lets the gesture penetrate through to the map page underneath. This does not happen when loading other https pages.
```
return Scaffold(
appBar: AppBar(
title: Text(widget.title),
),
body: Stack(
children: <Widget>[
WebView(
javascriptMode: JavascriptMode.unrestricted,
javascriptChannels: <JavascriptChannel>[].toSet(),
onWebViewCreated: (WebViewController controller) {
///url: https://www.google.com
controller.loadUrl('https://gaode.com/');
},
onPageFinished: (String value) {},
),
Positioned(
left: 100.0,
top: 100.0,
child: GestureDetector(
onTap: () {
print('click red...');
},
child: Container(
width: 100.0,
height: 100.0,
color: Colors.red,
),
),
)
],
),
);
```
|
platform-ios,framework,f: gestures,p: webview,package,has reproducible steps,P2,found in release: 2.2,found in release: 2.5,team-ios,triaged-ios
|
low
|
Critical
|
630,602,524 |
flutter
|
RouteAware should have didReplace callback
|
## Use case
The `RouteAware` class has `didPopNext`, `didPush`, `didPop`, and `didPushNext` callbacks,
but if a route is replaced, it will not be notified.
## Proposal
I think we should add a `didReplace` callback to the `RouteAware` and `RouteObserver` classes.
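A rough illustration of what the addition could look like (the signature below merely mirrors `NavigatorObserver.didReplace` and is an assumption, not an agreed API):
```
abstract class RouteAware {
  // existing callbacks: didPopNext(), didPush(), didPop(), didPushNext()

  /// Proposed (sketch only): called when a route relevant to this subscriber
  /// has been replaced, e.g. via Navigator.pushReplacement or Navigator.replace.
  void didReplace({Route<dynamic> newRoute, Route<dynamic> oldRoute}) {}
}
```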
|
framework,f: routes,c: proposal,P3,team-framework,triaged-framework
|
low
|
Minor
|
630,615,245 |
TypeScript
|
Inconsistent assignment analysis for async arrow IIFE's regarding class property declarations (sugar vs constructor assigning)
|
<!--
Please try to reproduce the issue with the latest published version. It may have already been fixed.
For npm: `typescript@next`
This is also the 'Nightly' version in the playground: http://www.typescriptlang.org/play/?ts=Nightly
-->
**TypeScript Version:** 4.0.0-dev.202000519
<!-- Search terms you tried before logging this (so others can find this issue more easily) -->
**Search Terms:** async IIFE Property is used before being assigned class declaration
**Code**
This works fine:
```ts
class MyClass {
public foo = 0;
constructor() {
(async () => this.foo + 5)();
}
}
```
This, on the other hand, errors:
```ts
class MyClass {
public foo: number;
constructor() {
this.foo = 0;
(async () => this.foo + 5)(); // errors?
}
}
```
These two code snippets should be treated the same because the first snippet is sugar for the second.
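For what it's worth, reading the property into a local before the async IIFE avoids the error today (a workaround sketch, not a fix for the inconsistency):
```ts
class MyClass {
    public foo: number;
    constructor() {
        this.foo = 0;
        // Snapshot the property into a local; the IIFE then closes over the
        // local and no "used before being assigned" error is reported.
        const foo = this.foo;
        (async () => foo + 5)();
    }
}
```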
**Expected behavior:** `(async () => this.foo + 5)()` should not have the error "Property 'foo' is used before being assigned."
**Actual behavior:** It errors
**Playground Link:** [Here](https://www.typescriptlang.org/play/#code/MYGwhgzhAECyCeBhcVoG8CwAoAkABwFcAjEAS2GgDMB7a6AXmgAYBubbHYagOwgBcATgWB9qAgBQBKdBxzjI8bhSkMAfND4ALUhAB0NOgGpoAVklS2uAL7YbWbKEgwEyJwCYZuQiXJVaALmhuAgBbIgBTAUsOLl5BYVEJaUxcHC0dfVoGZkscOQUlaBV6dXS9A2hjMwtZWXFi0u1yrKrzSRZoAHpO6AB3MQBrGEpSbnCOOysgA)<!-- A link to a TypeScript Playground "Share" link which demonstrates this behavior -->
**Related Issues:** <!-- Did you find other bugs that looked similar? -->
|
Bug
|
low
|
Critical
|
630,639,918 |
angular
|
ZoneJS: legacy Object.defineProperty patch breaks configurable descriptor attribute
|
# bug report
### Affected Package
zone.js
### Is this a regression?
Yes, but super back in the past! It broke with v0.6.24.
### Description
ZoneJS has this super old legacy `Object.defineProperty` patch that basically partially re-implements
the `Object.defineProperty` functionality in order to resolve some custom elements issue. I'm confident that these issues are no longer surfacing and patching `Object.defineProperty` is not the right solution these days.
The issue is that this overwritten `Object.defineProperty` function behaves differently from the native implementation: by default, defined properties are not configurable, but with the patched version in ZoneJS, properties are _always_ configurable. This means that:
1. The default `configurable` property descriptor attribute is not respected.
2. Defined properties are by default always `configurable: true`. This doesn't match the native implementation.
Due to these inconsistencies, ZoneJS could potentially hide app issues that would usually cause errors in production. E.g. when using differential loading, legacy patches might be used in tests, but not always in production. So, calling `defineProperty` multiple times in your app might work with the legacy patches in tests and legacy browsers, but otherwise cause an `Uncaught TypeError: Cannot redefine property: X` error in evergreen browsers.
We saw this issue in the Angular Material test harnesses: https://github.com/angular/components/issues/19440
Here is the commit that introduced this logic back in the old repository:
https://github.com/angular/zone.js/commit/383b47905d0a55c0fd87983fa6effd0278cf70f1. And here is the commit that caused the incorrect `configurable` default, and the flag to be not respected at all: https://github.com/angular/zone.js/commit/7b7258b9bf0b4ffd944b2ae015bfb5cb045e3c05.
## Minimal Reproduction
```ts
const x = {}
Object.defineProperty(x, 'defaultPrevented', { get: () => true, configurable: false });
console.error(Object.getOwnPropertyDescriptor(x, 'defaultPrevented'));
```
This prints `configurable: false` for the `defaultPrevented` property without ZoneJS. If the ZoneJS legacy patches are loaded, this is not the case though and `configurable` is always `true`.
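To make the hidden failure concrete, here is a sketch of the redefinition case described above. Without the legacy patch the second call throws; with the patch loaded it silently succeeds:
```ts
const x = {};
Object.defineProperty(x, 'defaultPrevented', { get: () => true, configurable: false });

// Native behavior: throws "TypeError: Cannot redefine property: defaultPrevented".
// With the ZoneJS legacy patch loaded, this second definition silently succeeds.
Object.defineProperty(x, 'defaultPrevented', { get: () => false });
```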
## Exception or Error
`Object.defineProperty` behaves differently depending on whether the ZoneJS legacy patches are loaded or not. ZoneJS could **hide app failures** in tests, as the patched `Object.defineProperty` never sets `configurable: false`, while in reality defining a non-configurable property again causes a `Cannot redefine property: X` runtime error in browsers.
## Your Environment
**Angular Version:**
Angular v10.0.0-rc.2.
zonejs: 0.10.
**Anything else relevant?**
We should audit whether we still need this legacy patch. I'm confident we don't. I'm hoping we can remove it, as it's prone to inconsistencies with the native `Object.defineProperty` behavior.
|
hotlist: components team,freq1: low,workaround2: non-obvious,area: zones,state: confirmed,P3
|
low
|
Critical
|
630,659,511 |
godot
|
Imported texture with HDR enabled are darker on Android
|
<!-- Please search existing issues for potential duplicates before filing yours:
https://github.com/godotengine/godot/issues?q=is%3Aissue
-->
**Godot version:**
3.2.1
<!-- Specify commit hash if using non-official build. -->
**OS/device including version:**
Huawei P30 GLES3
<!-- Specify GPU model, drivers, and the backend (GLES2, GLES3, Vulkan) if graphics-related. -->
**Issue description:**
<!-- What happened, and what was expected. -->
Imported images are darker on Android when importing with HDR enabled. If importing with force RGBE, the color is correct but the alpha channel is lost.
**Steps to reproduce:**
Import an image with alpha channel and enable HDR
**Minimal reproduction project:**
<!-- A small Godot project which reproduces the issue. Drag and drop a zip archive to upload it. -->
|
bug,platform:android,topic:rendering,topic:3d
|
low
|
Minor
|
630,667,420 |
go
|
runtime: OOM when running on a system with sparse available memory
|
<!--
Please answer these questions before submitting your issue. Thanks!
For questions please use one of our forums: https://github.com/golang/go/wiki/Questions
-->
### What version of Go are you using (`go version`)?
<pre>
$ go version
go version go1.11 linux/amd64
</pre>
### Does this issue reproduce with the latest release?
YES
### What operating system and processor architecture are you using (`go env`)?
<details><summary><code>go env</code> Output</summary><br><pre>
$ go env
GOARCH="amd64"
GOBIN=""
GOCACHE="/root/.cache/go-build"
GOEXE=""
GOFLAGS=""
GOHOSTARCH="amd64"
GOHOSTOS="linux"
GOOS="linux"
GOPATH="/data/golang"
GOPROXY=""
GORACE=""
GOROOT="/usr/local/go1.11"
GOTMPDIR=""
GOTOOLDIR="/usr/local/go1.11/pkg/tool/linux_amd64"
GCCGO="gccgo"
CC="gcc"
CXX="g++"
CGO_ENABLED="1"
GOMOD=""
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -m64 -pthread -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build473055106=/tmp/go-build -gno-record-gcc-switches"
</pre></details>
### What did you do?
<!--
If possible, provide a recipe for reproducing the error.
A complete runnable program is good.
A link on play.golang.org is best.
-->
When I use go1.11 to compile elastic Metricbeat at tag [v7.0.0](https://github.com/elastic/beats/tree/v7.0.0), the resulting binary runs out of memory when the system's available memory is sparse.
The system memory usage is like this:
<pre>
free -m
total used free shared buff/cache available
Mem: 1868 1686 14 97 167 43
Swap: 0 0 0
</pre>
### What did you expect to see?
The system's available memory is sparse, but Metricbeat's memory usage is less than 10 MB once it has started, so it should be able to start.
### What did you see instead?
It cannot start; instead, it OOMs.
The OOM runtime stack logs are:
<pre>
fatal error: runtime: out of memory
runtime stack:
runtime.throw(0x283d1da, 0x16)
/usr/local/go1.11/src/runtime/panic.go:608 +0x72 fp=0x7ffc2108ab90 sp=0x7ffc2108ab60 pc=0x11123e2
runtime.sysMap(0xc000000000, 0x4000000, 0x3e26558)
/usr/local/go1.11/src/runtime/mem_linux.go:156 +0xc7 fp=0x7ffc2108abd0 sp=0x7ffc2108ab90 pc=0x10fd687
runtime.(*mheap).sysAlloc(0x3e0c560, 0x4000000, 0x0, 0x0)
/usr/local/go1.11/src/runtime/malloc.go:619 +0x1c7 fp=0x7ffc2108ac58 sp=0x7ffc2108abd0 pc=0x10f14d7
runtime.(*mheap).grow(0x3e0c560, 0x1, 0x0)
/usr/local/go1.11/src/runtime/mheap.go:920 +0x42 fp=0x7ffc2108acb0 sp=0x7ffc2108ac58 pc=0x11099c2
runtime.(*mheap).allocSpanLocked(0x3e0c560, 0x1, 0x3e26568, 0x0)
/usr/local/go1.11/src/runtime/mheap.go:848 +0x337 fp=0x7ffc2108acf0 sp=0x7ffc2108acb0 pc=0x1109847
runtime.(*mheap).alloc_m(0x3e0c560, 0x1, 0x2a, 0x0)
/usr/local/go1.11/src/runtime/mheap.go:692 +0x119 fp=0x7ffc2108ad30 sp=0x7ffc2108acf0 pc=0x1109059
runtime.(*mheap).alloc.func1()
/usr/local/go1.11/src/runtime/mheap.go:759 +0x4c fp=0x7ffc2108ad68 sp=0x7ffc2108ad30 pc=0x113ef1c
runtime.(*mheap).alloc(0x3e0c560, 0x1, 0x101002a, 0x7ffc2108add0)
/usr/local/go1.11/src/runtime/mheap.go:758 +0x8a fp=0x7ffc2108adb8 sp=0x7ffc2108ad68 pc=0x11092fa
runtime.(*mcentral).grow(0x3e0e318, 0x0)
/usr/local/go1.11/src/runtime/mcentral.go:232 +0x94 fp=0x7ffc2108ae00 sp=0x7ffc2108adb8 pc=0x10fd084
runtime.(*mcentral).cacheSpan(0x3e0e318, 0x0)
/usr/local/go1.11/src/runtime/mcentral.go:106 +0x2f8 fp=0x7ffc2108ae48 sp=0x7ffc2108ae00 pc=0x10fcbd8
runtime.(*mcache).refill(0x7f237641b000, 0x220000002a)
/usr/local/go1.11/src/runtime/mcache.go:122 +0x95 fp=0x7ffc2108ae78 sp=0x7ffc2108ae48 pc=0x10fc795
runtime.(*mcache).nextFree.func1()
/usr/local/go1.11/src/runtime/malloc.go:749 +0x32 fp=0x7ffc2108ae98 sp=0x7ffc2108ae78 pc=0x113e332
runtime.(*mcache).nextFree(0x7f237641b000, 0x3e2652a, 0x4000, 0x7f237641b000, 0x7ffc2108af58)
/usr/local/go1.11/src/runtime/malloc.go:748 +0xb6 fp=0x7ffc2108aef0 sp=0x7ffc2108ae98 pc=0x10f1b86
runtime.mallocgc(0x180, 0x2807a20, 0x7ffc2108af01, 0x7f237641f000)
/usr/local/go1.11/src/runtime/malloc.go:903 +0x793 fp=0x7ffc2108af90 sp=0x7ffc2108aef0 pc=0x10f24d3
runtime.newobject(0x2807a20, 0x3e265c0)
/usr/local/go1.11/src/runtime/malloc.go:1032 +0x38 fp=0x7ffc2108afc0 sp=0x7ffc2108af90 pc=0x10f28b8
runtime.malg(0x7f2300008000, 0x7f237641b000)
/usr/local/go1.11/src/runtime/proc.go:3285 +0x31 fp=0x7ffc2108b000 sp=0x7ffc2108afc0 pc=0x111c661
runtime.mpreinit(0x3e06840)
/usr/local/go1.11/src/runtime/os_linux.go:311 +0x29 fp=0x7ffc2108b020 sp=0x7ffc2108b000 pc=0x1110819
runtime.mcommoninit(0x3e06840)
/usr/local/go1.11/src/runtime/proc.go:624 +0xc1 fp=0x7ffc2108b058 sp=0x7ffc2108b020 pc=0x1115f71
runtime.schedinit()
/usr/local/go1.11/src/runtime/proc.go:546 +0x89 fp=0x7ffc2108b0c0 sp=0x7ffc2108b058 pc=0x1115c39
runtime.rt0_go(0x7ffc2108b1c8, 0x1, 0x7ffc2108b1c8, 0x0, 0x7f2375b95c05, 0x2000000000, 0x7ffc2108b1c8, 0x100000000, 0x1140e50, 0x0, ...)
/usr/local/go1.11/src/runtime/asm_amd64.s:195 +0x11a fp=0x7ffc2108b0c8 sp=0x7ffc2108b0c0 pc=0x1140f7a
</pre>
**BUT**
When I use go1.10.8 to build the same Metricbeat code, it starts without an OOM, and about 40 MB of system memory remains available.
Are there any breaking changes starting from Go 1.11?
|
NeedsInvestigation
|
medium
|
Critical
|
630,713,486 |
vscode
|
`getTokenInformationAtPosition` API causes too much traffic and blocks the renderer process
|
* Install `GitHub Pull Request Nightly BuildPreview`
* Open the following file:
```
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
// #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555 #91555
function hello() {
//
}
function hello2() {
}
function hello3() {
}
```
Try to work in the file and observe how the window is not really responsive.
Add a console log statement here -- https://github.com/microsoft/vscode/blob/353e8f4fbadc048125593c8e34723bc667f373ce/src/vs/workbench/api/browser/mainThreadLanguages.ts#L47
Observe how after each keystroke there are thousands of requests going from the extension host to the renderer process.

|
bug,api,tokenization,under-discussion
|
medium
|
Major
|
630,769,505 |
TypeScript
|
Pipe function + `React.memo` returns component with `any` props
|
<!-- 🚨 STOP 🚨 𝗦𝗧𝗢𝗣 🚨 𝑺𝑻𝑶𝑷 🚨
Half of all issues filed here are duplicates, answered in the FAQ, or not appropriate for the bug tracker. Even if you think you've found a *bug*, please read the FAQ first, especially the Common "Bugs" That Aren't Bugs section!
Please help us by doing the following steps before logging an issue:
* Search: https://github.com/Microsoft/TypeScript/search?type=Issues
* Read the FAQ: https://github.com/Microsoft/TypeScript/wiki/FAQ
Please fill in the *entire* template below.
-->
<!--
Please try to reproduce the issue with the latest published version. It may have already been fixed.
For npm: `typescript@next`
This is also the 'Nightly' version in the playground: http://www.typescriptlang.org/play/?ts=Nightly
-->
**TypeScript Version:** 3.9.2
<!-- Search terms you tried before logging this (so others can find this issue more easily) -->
**Search Terms:** generic pipe pipeWith react memo HOC props any
**Code**
```ts
import * as React from 'react';
declare function pipeWith<A, B>(a: A, ab: (a: A) => B): B;
type Props = { foo: number };
declare const MyComponent: React.FunctionComponent<Props>;
// ✅ correct props type
// React.NamedExoticComponent<Props>
const r1 = React.memo(MyComponent);
// ❌ `any` props type
// React.MemoExoticComponent<React.ComponentType<any>>
const r2 = pipeWith(MyComponent, React.memo);
// Workaround
// ✅ correct props type
// React.NamedExoticComponent<Props>
const r3 = pipeWith(MyComponent, (C) => React.memo(C));
```
Using latest version of `@types/react` (at the time of writing: 16.9.35).
**Expected behavior:**
See above.
**Actual behavior:**
See above.
**Playground Link:** <!-- A link to a TypeScript Playground "Share" link which demonstrates this behavior -->
https://stackblitz.com/edit/react-ts-4ih6jn
**Related Issues:** <!-- Did you find other bugs that looked similar? -->
https://github.com/microsoft/TypeScript/issues/25637, although that was closed as a duplicate of an issue which has since been closed (https://github.com/microsoft/TypeScript/issues/10957), so I decided to post a new issue.
|
Needs Investigation,Rescheduled
|
low
|
Critical
|
630,778,012 |
godot
|
Vulkan: Camera effects depth write fails in transparent materials
|
**Godot version:**
v4.0.dev.custom_build.030a26206
**OS/device including version:**
Ubuntu with GNOME - Ryzen 7, GPU GTX 1650, Linux driver 440.64
**Issue description:**
Default materials appear not to write to the depth buffer for transparent textures, so camera blur effects look wrong, while fully opaque materials write fine.

**Steps to reproduce:**
Add a transparent material with alpha pre-pass.
Add a physical skybox.
Add a far blur camera effect.
|
bug,topic:rendering
|
low
|
Minor
|
630,848,672 |
flutter
|
[Plugin] Google Maps position problem on iOS
|
Reopening #43010
---
Trying to configure Google Maps widget where I have pin drop on longPress.
Everything works if the map is already zoomed in on the location. But when the map has no zoom at all, it causes map position issues.
For instance, when I place the marker near the top, the map jumps down and a grey area shows at the top. As said, this only happens when not zoomed and only on iOS (Android works fine).
My example: https://github.com/frasza/flutter_google_maps_example
Flutter v1.17.2
google_maps_flutter: ^0.5.28+1

|
platform-ios,a: quality,p: maps,package,has reproducible steps,P2,found in release: 2.3,team-ios,triaged-ios
|
low
|
Minor
|
630,853,662 |
excalidraw
|
TypeError "a is undefined" in production version
|
TypeError "a is undefined" in production version
### Scene content
```
{"excalidraw":[{"id":"RSrswM0Nsru4O3DIycwr4","type":"rectangle","x":1197.3636363636374,"y":350.5454545454543,"width":23.090909090909012,"height":247.72727272727272,"angle":0,"strokeColor":"#000000","backgroundColor":"#bf9ea8","fillStyle":"solid","strokeWidth":1,"strokeStyle":"solid","roughness":0,"opacity":100,"seed":1014866942,"version":101,"versionNonce":436515497,"isDeleted":false,"groupIds":[]},{"id":"zC3nJLfOyHgw1l_bhG0DD","type":"rectangle","x":57.1935483870966,"y":-195.90322580645147,"width":2496.8709677419356,"height":1407.3548387096776,"angle":0,"strokeColor":"#000000","backgroundColor":"transparent","fillStyle":"hachure","strokeWidth":1,"strokeStyle":"solid","roughness":1,"opacity":100,"seed":476861886,"version":247,"versionNonce":738837694,"isDeleted":false,"groupIds":[]},{"id":"5q5J-gB-nKn73XqjhlxV-","type":"diamond","x":1266.907235621526,"y":411.0037105751381,"width":78.18181818181829,"height":110.9090909090909,"angle":0,"strokeColor":"#000000","backgroundColor":"#000000","fillStyle":"cross-hatch","strokeWidth":1,"strokeStyle":"solid","roughness":0,"opacity":100,"seed":573471998,"version":965,"versionNonce":281806818,"isDeleted":false,"groupIds":[]},{"id":"AV2cGDgArQMwZrR-f-cIs","type":"rectangle","x":1102.1956260634397,"y":515.6963813915872,"width":70.56256713211614,"height":20.3477443609022,"angle":0,"strokeColor":"#000000","backgroundColor":"#bf9ea8","fillStyle":"solid","strokeWidth":1,"strokeStyle":"solid","roughness":0,"opacity":100,"seed":11039906,"version":617,"versionNonce":354599559,"isDeleted":false,"groupIds":[]},{"id":"FAj2Aa6cBVATt_MSfLFy1","type":"rectangle","x":1057,"y":457.0624320261031,"width":101.74962330254682,"height":13.93756797389688,"angle":0,"strokeColor":"#000000","backgroundColor":"#40c057","fillStyle":"solid","strokeWidth":1,"strokeStyle":"solid","roughness":0,"opacity":100,"seed":713358178,"version":606,"versionNonce":692495542,"isDeleted":false,"groupIds":[]},{"id":"-EtgadXsLWD-hsA0AWfrY","type":"rectangle","x":883,"y":388,"width":73,"height":73,"angle":0,"strokeColor":"#000000","backgroundColor":"#228be6","fillStyle":"solid","strokeWidth":4,"roughness":0,"opacity":100,"seed":1427189302,"version":266,"versionNonce":2015479146,"isDeleted":false},{"id":"1h53bY-FFiMz7opoq-BoM","type":"rectangle","x":882.5,"y":245.5,"width":73,"height":73,"angle":0,"strokeColor":"#000000","backgroundColor":"#fa5252","fillStyle":"solid","strokeWidth":4,"roughness":0,"opacity":100,"seed":1865249386,"version":296,"versionNonce":1387058422,"isDeleted":false},{"id":"DpaElZBjD75KMbfyJbSxE","type":"ellipse","x":1048,"y":297,"width":21,"height":59,"angle":0,"strokeColor":"#000000","backgroundColor":"#000000","fillStyle":"solid","strokeWidth":4,"roughness":0,"opacity":100,"seed":1578247606,"version":57,"versionNonce":1546668470,"isDeleted":false},{"id":"MChygxa7Bw8QhdYy41HF1","type":"rectangle","x":1010.1139698441657,"y":517.7291997947076,"width":70.56256713211614,"height":20.3477443609022,"angle":0,"strokeColor":"#000000","backgroundColor":"#ced4da","fillStyle":"solid","strokeWidth":1,"strokeStyle":"solid","roughness":0,"opacity":100,"seed":2106603495,"version":640,"versionNonce":215291881,"isDeleted":false,"groupIds":[]}],"excalidraw-state":{"elementType":"selection","elementLocked":false,"exportBackground":true,"shouldAddWatermark":false,"currentItemStrokeColor":"#000000","currentItemBackgroundColor":"#ced4da","currentItemFillStyle":"hachure","currentItemStrokeWidth":1,"currentItemStrokeStyle":"solid","currentItemRoughness":1,"currentItemOpacity":100,"currentItemFontSize":20,
"currentItemFontFamily":1,"currentItemTextAlign":"left","viewBackgroundColor":"#ffffff","scrollX":-177,"scrollY":-1,"cursorX":0,"cursorY":0,"cursorButton":"up","scrolledOutside":false,"name":"Untitled-2020-06-04-1558","username":"Johannes","zoom":1.42,"openMenu":null,"lastPointerDownWith":"mouse","selectedElementIds":{"MChygxa7Bw8QhdYy41HF1":true,"A8Ec64ZeUodakw0TaOpy_":true,"4I2mp_tnHdI7HTxWTlTD-":true},"previousSelectedElementIds":{"MChygxa7Bw8QhdYy41HF1":true,"A8Ec64ZeUodakw0TaOpy_":true},"shouldCacheIgnoreZoom":false,"zenModeEnabled":false,"editingGroupId":null,"selectedGroupIds":{}},"excalidraw-collab":{"username":"Johannes"},"i18nextLng":"en"}
```
### Sentry Error ID
1ee2e5a6bb60493b9186a2adda80cf15
|
bug
|
low
|
Critical
|
630,911,536 |
pytorch
|
Valgrind leak checking flags losses in libtorch
|
Valgrind is my go-to for wrangling possible memory leaks. It is a beautiful piece of software, but is unfortunately (and necessarily) imperfect. I just ran a libtorch-based application through a relatively brief optimization of a CNN model, and it generated a fair number of loss records. Fortunately, all of them appear to be of the "possibly lost" variety (as opposed to "definitely lost"); I was using all of the leak-check heuristics available to valgrind. Many of these records reflect pytorch-based allocations embedded in pthread-related activities, which may just suggest that threads are not being thoroughly cleaned up on exit.
However, there are a number of records which, at least going by the traceback, don't reflect an allocation embedded in thread creation. For brevity, I won't quote any of them here, because I just want to ask a larger question: Is libtorch (v. 1.5, specifically) being subjected to any kind of careful memory-leak vetting, whether by valgrind or some other checker? I know valgrind is not necessarily the ultimate authority; something as simple as a -fsanitize=address compilation could do as well or better.
cc @yf225 @glaringlee
|
module: cpp,triaged
|
low
|
Major
|
630,948,469 |
create-react-app
|
Support aliased imports
|
As previously seen on #9034, I would like to use a separate, TypeScript-based "commons" library in two other TypeScript projects. The problem is that aliased imports are disabled for no documented reason. A workaround is possible with `react-app-rewired` through `react-app-rewire-alias` (since version 0.1.6 also fully compatible with my problem), but this should be natively supported in CRA.
The problem this would bring is that `override`s in `eslint-loader` do not support absolute/external paths, so either a version that does support this should be used, or they should be ignored (possibly accompanied by a warning that they are not linted).
|
issue: proposal,needs triage
|
low
|
Major
|
630,956,180 |
pytorch
|
test_nn_module_tests should run less tests
|
This test appears to run every single module test in a loop. This makes it harder to diagnose failures. Each module test should get a separate test function so you can ask for a specific test to be run.
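A rough sketch of the requested structure, using a hypothetical registry and placeholder checks rather than PyTorch's actual test harness, so a single module's test can be selected with `-k`:
```python
import unittest

# Hypothetical registry standing in for the real list of module test configs.
MODULE_TESTS = {
    "Linear": lambda: None,   # placeholder callables standing in for real checks
    "Conv2d": lambda: None,
}

class TestNNModules(unittest.TestCase):
    pass

# Generate one test method per module entry so a single failing module can be
# run in isolation, e.g. `python -m unittest -k test_Conv2d`.
for name, check in MODULE_TESTS.items():
    def _make_test(check=check):
        def test(self):
            check()
        return test
    setattr(TestNNModules, f"test_{name}", _make_test())

if __name__ == "__main__":
    unittest.main()
```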
cc @albanD @mruberry
|
module: nn,module: tests,triaged
|
low
|
Critical
|
630,956,956 |
terminal
|
Same input with different results in terminal and cmd.exe
|
<!--
🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨
I ACKNOWLEDGE THE FOLLOWING BEFORE PROCEEDING:
1. If I delete this entire template and go my own path, the core team may close my issue without further explanation or engagement.
2. If I list multiple bugs/concerns in this one issue, the core team may close my issue without further explanation or engagement.
3. If I write an issue that has many duplicates, the core team may close my issue without further explanation or engagement (and without necessarily spending time to find the exact duplicate ID number).
4. If I leave the title incomplete when filing the issue, the core team may close my issue without further explanation or engagement.
5. If I file something completely blank in the body, the core team may close my issue without further explanation or engagement.
All good? Then proceed!
-->
<!--
This bug tracker is monitored by Windows Terminal development team and other technical folks.
**Important: When reporting BSODs or security issues, DO NOT attach memory dumps, logs, or traces to Github issues**.
Instead, send dumps/traces to [email protected], referencing this GitHub issue.
If this is an application crash, please also provide a Feedback Hub submission link so we can find your diagnostic data on the backend. Use the category "Apps > Windows Terminal (Preview)" and choose "Share My Feedback" after submission to get the link.
Please use this form and describe your issue, concisely but precisely, with as much detail as possible.
-->
# Environment
```none
Windows build number: Microsoft Windows Pro [Version 10.0.18363.836] x64
Windows Terminal version (if applicable): 1.0.1401.0
Any other software?
stripe cli
```
# Steps to reproduce
Download Stripe CLI (https://github.com/stripe/stripe-cli/releases/tag/v1.4.1)
after "_stripe login_" (not sure if needed, i was already logged in) using the command
"_stripe listen --forward-toΒ https://localhost:44391/api/StripeWebHooks_" works on cmd.exe, but not in terminal
<!-- A description of how to trigger this bug. -->
# Expected behavior
should work as in cmd.exe
# Actual behavior
<!-- What's actually happening? -->
output "unknown flag: --forward-toΒ https://localhost:44391/api/StripeWebHooks"
Screenshot with outputs:

|
Help Wanted,Issue-Bug,Area-TerminalControl,Product-Terminal,Priority-2
|
low
|
Critical
|
630,961,566 |
godot
|
Memory leak, on selecting multiple animation keys 10Gb+ ram used
|
**Godot version:**
3.2.1 stable
**OS/device including version:**
Linux
**Issue description:**
Selecting multiple animation keys as shown in the screenshot makes Godot freeze because of RAM usage.

**Steps to reproduce, minimal reproduction project:**
Download my example zip [test.res.zip](https://github.com/godotengine/godot/files/4731287/test.res.zip)
and open test.res in the Godot editor, then select animation keys as shown in the screenshot.
|
bug,topic:editor,crash
|
low
|
Minor
|
630,969,445 |
godot
|
Memory leak when creating Nodes or Object derived objects.
|
<!-- Please search existing issues for potential duplicates before filing yours:
https://github.com/godotengine/godot/issues?q=is%3Aissue
-->
**Godot version:**
<!-- Specify commit hash if using non-official build. -->
Godot_v3.2.2-beta2_win64
**OS/device including version:**
<!-- Specify GPU model, drivers, and the backend (GLES2, GLES3, Vulkan) if graphics-related. -->
Windows 7 x64 SP1
**Issue description:**
<!-- What happened, and what was expected. -->
Memory consumption didn't return to its previous value after creating and deleting the same number of objects.
No leak in 3.2.1 stable.
**Steps to reproduce:**
- Launch the minimal project
- check 'static' memory in Debugger > Monitors (or memory consumption reported by OS)
- press the button
- check memory consumption again - it will be higher
**Minimal reproduction project:**
<!-- A small Godot project which reproduces the issue. Drag and drop a zip archive to upload it. -->
https://github.com/hilfazer/Projects/tree/master/Godot/EngineIssues/CreatingObjectsLeaksMemory
|
bug,topic:core
|
low
|
Critical
|
630,972,295 |
rust
|
thread 'rustc' panicked at 'called `Option::unwrap()` on a `None` value', src/librustc_typeck/check/intrinsic.rs:288:42
|
This happened while trying to build libstd (outside of bootstrap) for hexagon-unknown-linux-musl. Not sure if `xbuild` is the appropriate convention but I saw it used in libc crate's CI and assumed it might be worth a try w/libstd.
```
cd src/libstd
RUST_BACKTRACE=1 cargo +nightly xbuild -vv --no-default-features --target hexagon-unknown-linux-musl
...
thread 'rustc' panicked at 'called `Option::unwrap()` on a `None` value', src/librustc_typeck/check/intrinsic.rs:288:42
stack backtrace:
0: backtrace::backtrace::libunwind::trace
at /cargo/registry/src/github.com-1ecc6299db9ec823/backtrace-0.3.46/src/backtrace/libunwind.rs:86
1: backtrace::backtrace::trace_unsynchronized
at /cargo/registry/src/github.com-1ecc6299db9ec823/backtrace-0.3.46/src/backtrace/mod.rs:66
2: std::sys_common::backtrace::_print_fmt
at src/libstd/sys_common/backtrace.rs:78
3: <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt
at src/libstd/sys_common/backtrace.rs:59
4: core::fmt::write
at src/libcore/fmt/mod.rs:1076
5: std::io::Write::write_fmt
at src/libstd/io/mod.rs:1537
6: std::sys_common::backtrace::_print
at src/libstd/sys_common/backtrace.rs:62
7: std::sys_common::backtrace::print
at src/libstd/sys_common/backtrace.rs:49
8: std::panicking::default_hook::{{closure}}
at src/libstd/panicking.rs:198
9: std::panicking::default_hook
at src/libstd/panicking.rs:218
10: rustc_driver::report_ice
11: std::panicking::rust_panic_with_hook
at src/libstd/panicking.rs:490
12: rust_begin_unwind
at src/libstd/panicking.rs:388
13: core::panicking::panic_fmt
at src/libcore/panicking.rs:101
14: core::panicking::panic
at src/libcore/panicking.rs:56
15: rustc_typeck::check::intrinsic::check_intrinsic_type
16: rustc_typeck::check::check_item_type
17: rustc_middle::hir::map::Map::visit_item_likes_in_module
18: rustc_typeck::check::check_mod_item_types
19: rustc_middle::ty::query::<impl rustc_query_system::query::config::QueryAccessors<rustc_middle::ty::context::TyCtxt> for rustc_middle::ty::query::queries::check_mod_item_types>::compute
20: rustc_middle::dep_graph::<impl rustc_query_system::dep_graph::DepKind for rustc_middle::dep_graph::dep_node::DepKind>::with_deps
21: rustc_query_system::dep_graph::graph::DepGraph<K>::with_task_impl
22: rustc_data_structures::stack::ensure_sufficient_stack
23: rustc_query_system::query::plumbing::get_query_impl
24: rustc_query_system::query::plumbing::ensure_query_impl
25: rustc_session::utils::<impl rustc_session::session::Session>::time
26: rustc_typeck::check_crate
27: rustc_interface::passes::analysis
28: rustc_middle::ty::query::<impl rustc_query_system::query::config::QueryAccessors<rustc_middle::ty::context::TyCtxt> for rustc_middle::ty::query::queries::analysis>::compute
29: rustc_middle::dep_graph::<impl rustc_query_system::dep_graph::DepKind for rustc_middle::dep_graph::dep_node::DepKind>::with_deps
30: rustc_query_system::dep_graph::graph::DepGraph<K>::with_task_impl
31: rustc_data_structures::stack::ensure_sufficient_stack
32: rustc_query_system::query::plumbing::get_query_impl
33: rustc_middle::ty::context::tls::enter_global
34: rustc_interface::interface::run_compiler_in_existing_thread_pool
35: rustc_ast::attr::with_globals
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
error: internal compiler error: unexpected panic
note: the compiler unexpectedly panicked. this is a bug.
note: we would appreciate a bug report: https://github.com/rust-lang/rust/blob/master/CONTRIBUTING.md#bug-reports
note: rustc 1.45.0-nightly (56daaf669 2020-06-03) running on x86_64-unknown-linux-gnu
note: compiler flags: -C embed-bitcode=no -C debug-assertions=off -C overflow-checks=on -C incremental --crate-type lib
note: some of the compiler flags provided by cargo are hidden
query stack during panic:
#0 [check_mod_item_types] checking item types in module `intrinsics`
#1 [analysis] running analysis passes on this crate
end of query stack
```
|
I-ICE,T-compiler,C-bug,requires-nightly,S-needs-repro
|
low
|
Critical
|
630,976,188 |
ant-design
|
Issue in horizontal scroll with keyboard arrow key inside the tabs
|
- [ ] I have searched the [issues](https://github.com/ant-design/ant-design/issues) of this repository and believe that this is not a duplicate.
### Reproduction link
[](https://codesandbox.io/s/antd-issue-tabpane-2smge)
### Steps to reproduce
Open the link.
Click inside the table/footer of the table.
Try to scroll using the keyboard arrow keys.
It does not scroll.
### What is expected?
The content should scroll using the keyboard arrow keys inside the tab pane.
### What is actually happening?
the scroll is not working.
| Environment | Info |
|---|---|
| antd | 4.3.1 |
| React | 16.12.0 |
| System | Ubuntu/window |
| Browser | chrome/mozilla |
---
I was using antd version 3 before and it worked fine, but after updating to the new version it no longer works.
Navigating or scrolling from the keyboard is an easy way to move around, so it would be good to have this back.
<!-- generated by ant-design-issue-helper. DO NOT REMOVE -->
|
Inactive,β¨οΈ Accessibility
|
low
|
Minor
|
630,982,742 |
tensorflow
|
tf.ragged.constant does not detect dense dimensions
|
**System information**
- Have I written custom code (as opposed to using a stock example script provided in TensorFlow): No
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Windows 10 x64
- Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: NA
- TensorFlow installed from (source or binary): Binary
- TensorFlow version (use command below): 2.2.0
- Python version: 3.7.6
- Bazel version (if compiling from source): NA
- GCC/Compiler version (if compiling from source): NA
- CUDA/cuDNN version: NA
- GPU model and memory: NA
**Describe the current behavior**
Note: I am reporting this as a bug but I am not sure if it may actually be a feature request, as I am not entirely sure if the described behaviour is fully expected or not.
[`tf.ragged.constant`](https://www.tensorflow.org/api_docs/python/tf/ragged/constant) does not properly detect which dimensions should be ragged from the given Python list. By default, only the outermost dimension is considered as dense, even if other coherent dimensions exist in the data. One can use `ragged_rank` and/or `inner_shape` to mark a number of innermost dimensions as dense, but it does not seem to be possible to do the opposite, that is, marking some outermost dimensions after the first one as dense. And, in general, it does not detect nor allow to make a ragged tensor with an arbitrary combination of ragged and dense dimensions (even though it is possible to build such ragged tensors in other ways).
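For reference, a minimal sketch (added for illustration, not part of the original report) of one of those "other ways": building a ragged tensor with a dense middle dimension by grouping flat inner rows with `tf.RaggedTensor.from_uniform_row_length`:
```python
import tensorflow as tf

# Inner ragged rows, shape (6, None)
inner = tf.ragged.constant([[1], [2, 3], [4], [5, 6], [], [7]])

# Group them into a uniform (dense) dimension of length 3, giving shape (2, 3, None)
rt = tf.RaggedTensor.from_uniform_row_length(inner, uniform_row_length=3)
print(rt.shape)  # (2, 3, None)
```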
**Describe the expected behavior**
I would expect that all coherent dimensions of a Python nested list are detected as dense dimensions:
```python
import tensorflow as tf
print(tf.ragged.constant([[[1], [2, 3], [4]], [[5, 6], [], [7]]]).shape)
# (2, 3, None)
```
As a feature addition, having the possibility to specify which arbitrary dimensions are ragged or not would also be nice, although it would have to be with a different API. Maybe I could have for example:
```python
import tensorflow as tf
tf.ragged.constant([[[1], [2, 3], [4]], [[5, 6], [], [7]]], shape=[2, -1, None])
```
With `-1` meaning "detect automatically from data" and `None` meaning `ragged dimension`.
**Standalone code to reproduce the issue**
```python
import tensorflow as tf
# Dense inner dimensions are not detected
print(tf.ragged.constant([[[1], [2, 3], [4]], [[5, 6], [], [7]]]).shape)
# (2, None, None)
# The outermost dimension is always dense
print(tf.ragged.constant([[1], [2, 3], [4]]).shape)
# (3, None)
# But simply adding a couple of brackets makes the dimension ragged
print(tf.ragged.constant([[[1], [2, 3], [4]]]).shape)
# (1, None, None)
```
**Other info / logs**
NA
|
stat:awaiting tensorflower,type:feature,comp:ops,TF 2.9
|
low
|
Critical
|
631,019,187 |
go
|
regexp: doc: lead with the precise syntax rather than referring to Perl, Python, etc.
|
<!--
Please answer these questions before submitting your issue. Thanks!
For questions please use one of our forums: https://github.com/golang/go/wiki/Questions
-->
### What version of Go are you using (`go version`)?
<pre>
1.13.5
</pre>
### Does this issue reproduce with the latest release?
Yes
### What operating system and processor architecture are you using (`go env`)?
<details><summary><code>go env</code> darwin/amd64</summary><br><pre>
$ go env
</pre></details>
### What did you do?
- Regexp is not catching all of the whitespace values that it should
- In particular `\s` is not matching [hair spaces](https://www.compart.com/en/unicode/U+200A) (whereas perl regex does)
Here's a [go playground link](https://play.golang.org/p/Q7qOkm_QsMv)
### What did you expect to see?
It should match the white space.
### What did you see instead?
It did not recognize the hair space as valid white space.
If this is indeed the intended behavior, the docs should be more clear that golang's regex does not conform with perl's and will not match all of the standard regex special characters.
If the docs already say this loudly and clearly and I just missed that section, apologies for being an idiot.
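For contrast, a quick check in Python's `re` module (a sketch added for illustration only; Go's RE2 syntax defines `\s` as just `[\t\n\f\r ]`) shows the Unicode-wide behaviour expected here:
```python
import re

# U+200A HAIR SPACE is matched by \s in Python 3's Unicode-aware re module.
print(re.findall(r'\s', '\u200a'))  # ['\u200a']
```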
|
Documentation,help wanted,NeedsFix
|
low
|
Major
|
631,045,940 |
angular
|
Elements lazy load make unable to use providers in modules
|
<!--
Oh hi there! 😄
To expedite issue processing please search open and closed issues before submitting a new one.
Existing issues often contain information about workarounds, resolution, or progress updates.
-->
# 🐞 bug report
### Affected Package
<!-- Can you pin-point one or more @angular/* packages as the source of the bug? -->
The issue is caused by package @angular/elements.
### Is this a regression?
<!-- Did this behavior use to work in the previous version? -->
I'm not sure.
### Description
When you export a Custom Element that contains a lazy-loaded module, you aren't able to create a service without `providedIn: 'root'`.
If you create a service with a bare `@Injectable()` and then register the provider in a module under the lazy-loaded module, it breaks the application and throws a "no provider" error.
[The service code](https://github.com/richardlnnr/angular-lazy-load-injectable/blob/master/src/app/feature/content/content.service.ts#L3)
[The providers in the module under a Lazy load route](https://github.com/richardlnnr/angular-lazy-load-injectable/blob/master/src/app/feature/content/content.module.ts#L13)
## 🔬 Minimal Reproduction
<!--
Please create and share minimal reproduction of the issue starting with this template: https://stackblitz.com/fork/angular-ivy
-->
https://richardlnnr.github.io/angular-lazy-load-injectable/
https://github.com/richardlnnr/angular-lazy-load-injectable
<!--
If StackBlitz is not suitable for reproduction of your issue, please create a minimal GitHub repository with the reproduction of the issue.
A good way to make a minimal reproduction is to create a new app via `ng new repro-app` and add the minimum possible code to show the problem.
Share the link to the repo below along with step-by-step instructions to reproduce the problem, as well as expected and actual behavior.
Issues that don't have enough info and can't be reproduced will be closed.
You can read more about issue submission guidelines here: https://github.com/angular/angular/blob/master/CONTRIBUTING.md#-submitting-an-issue
-->
## 🔥 Exception or Error
<pre><code>
core.js:6228 ERROR NullInjectorError: R3InjectorError(AppModule)[ContentService -> ContentService -> ContentService]:
NullInjectorError: No provider for ContentService!
at NullInjector.get (http://localhost:4200/vendor.js:8310:27)
at R3Injector.get (http://localhost:4200/vendor.js:22304:33)
at R3Injector.get (http://localhost:4200/vendor.js:22304:33)
at R3Injector.get (http://localhost:4200/vendor.js:22304:33)
at NgModuleRef$1.get (http://localhost:4200/vendor.js:39605:33)
at Object.get (http://localhost:4200/vendor.js:37339:35)
at getOrCreateInjectable (http://localhost:4200/vendor.js:12112:39)
at Module.ɵɵdirectiveInject (http://localhost:4200/vendor.js:26119:12)
at NodeInjectorFactory.ContentComponent_Factory [as factory] (http://localhost:4200/feature-feature-module.js:26:162)
at getNodeInjectable (http://localhost:4200/vendor.js:12257:44)
</code></pre>
<pre><code>
ERROR Error: Uncaught (in promise): NullInjectorError: R3InjectorError(AppModule)[ContentService -> ContentService -> ContentService]:
NullInjectorError: No provider for ContentService!
NullInjectorError: R3InjectorError(AppModule)[ContentService -> ContentService -> ContentService]:
NullInjectorError: No provider for ContentService!
at NullInjector.get (core.js:1085)
at R3Injector.get (core.js:16955)
at R3Injector.get (core.js:16955)
at R3Injector.get (core.js:16955)
at NgModuleRef$1.get (core.js:36329)
at Object.get (core.js:33972)
at getOrCreateInjectable (core.js:5848)
at Module.ɵɵdirectiveInject (core.js:21103)
at NodeInjectorFactory.ContentComponent_Factory [as factory] (content.component.ts:9)
at getNodeInjectable (core.js:5993)
at resolvePromise (zone-evergreen.js:798)
at resolvePromise (zone-evergreen.js:750)
at zone-evergreen.js:860
at ZoneDelegate.invokeTask (zone-evergreen.js:399)
at Object.onInvokeTask (core.js:41632)
at ZoneDelegate.invokeTask (zone-evergreen.js:398)
at Zone.runTask (zone-evergreen.js:167)
at drainMicroTaskQueue (zone-evergreen.js:569)
</code></pre>
## 🌍 Your Environment
**Angular Version:**
<pre><code>
Angular CLI: 9.1.7
Node: 10.18.0
OS: linux x64
Angular: 9.1.9
... animations, common, compiler, compiler-cli, core, elements
... forms, platform-browser, platform-browser-dynamic, router
Ivy Workspace: Yes
Package Version
-----------------------------------------------------------
@angular-devkit/architect 0.901.7
@angular-devkit/build-angular 0.901.7
@angular-devkit/build-optimizer 0.901.7
@angular-devkit/build-webpack 0.901.7
@angular-devkit/core 9.1.7
@angular-devkit/schematics 9.1.7
@angular/cli 9.1.7
@ngtools/webpack 9.1.7
@schematics/angular 9.1.7
@schematics/update 0.901.7
rxjs 6.5.5
typescript 3.8.3
webpack 4.42.0
</code></pre>
**Anything else relevant?**
<!-- βοΈIs this a browser specific issue? If so, please specify the browser and version. -->
<!-- βοΈDo any of these matter: operating system, IDE, package manager, HTTP server, ...? If so, please mention it below. -->
|
type: bug/fix,workaround2: non-obvious,area: elements,state: confirmed,P3
|
medium
|
Critical
|
631,054,894 |
godot
|
Node self-reference leaks on quit if it is typed
|
Godot 3.1
Godot 3.2.2 beta3
If you do this in a script, Godot will leak on quit:
```gdscript
extends Node
var _tilemap := self
```
```
ERROR: ~List: Condition ' _first != __null ' is true.
At: ./core/self_list.h:111
ERROR: ~List: Condition ' _first != __null ' is true.
At: ./core/self_list.h:111
WARNING: cleanup: ObjectDB Instances still exist!
At: core/object.cpp:2092
```
It won't leak if the variable is not typed.
I suspect this happens because the type system tries to reference the script itself, which creates an embarrassing cyclic reference.
Curiously, that doesn't happen if you do it on a local variable.
|
bug,topic:gdscript
|
low
|
Critical
|
631,072,138 |
go
|
crypto/tls: cleanup handshake state
|
We should refactor where and when the hs and Conn state is accessed and modified during the handshake. For example checkForResumption should probably be side-effect free.
|
NeedsFix
|
low
|
Major
|
631,106,517 |
godot
|
Less than comparisons between arrays and sorting arrays of arrays is unpredictable
|
**Godot version:**
3.2.1.stable
**OS/device including version:**
Windows10
**Issue description:**
I just tried to sort an array of pairs of numbers `[[2.061036, 1], [2.599602, 100], [-0.833468, 1]]` and the answer wasn't right.
It appears that Array objects are not compared by the value of their contents, even when they are arrays of one element, as shown in the example below. It is not obvious what is being compared. Object pointer addresses?
Sorting arrays of tuples is a technique I use a lot in C++ to avoid the hassle of having to define a comparison function pointer required by `sort_custom()`. Python used to have a similar `cmp(a, b)` function for its sort feature, but it was [taken out](https://portingguide.readthedocs.io/en/latest/comparisons.html#the-cmp-argument) of Python3 in favour of a simpler `key(a)` function.
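For comparison, a short Python sketch (added for illustration) of the element-wise list comparison that makes this technique work there without a custom comparator:
```python
pairs = [[2.061036, 1], [2.599602, 100], [-0.833468, 1]]

# Python compares lists element by element, so a plain sort orders by the first value.
print(sorted(pairs))
# [[-0.833468, 1], [2.061036, 1], [2.599602, 100]]

print([0.8] < [2.5])  # True: the contents are compared, not object identity
```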
The [class_array.html documentation](https://docs.godotengine.org/en/3.2/classes/class_array.html#class-array-method-sort-custom) gives an example use of `sort_custom()` which would have worked with a normal `sort()` in Python. If this is a message about how to do it, it's probably too subtle. There is a warning about sorting strings containing numbers, but no warning that sorting arrays of arrays is going to be a problem.
If comparing Arrays doesn't work in the way expected, then it would be better if it gave an Invalid Operation Type error (as it does when you compare to Dictionaries) than to allow it to be written.
**Steps to reproduce:**
```Python
print("0.8 < 2.5: ", 0.8 < 2.5) # ---> True
print("[0.8] < [2.5]: ", [0.8] < [2.5]) # ---> False
print("[9.8] < [2.5]: ", [9.8] < [2.5]) # ---> True
```
**Minimal reproduction project:**
I've put the above code into the run() function of an editorscript .gd file and used Control-Shift-X to run it:
```
tool
extends EditorScript
func _run():
...
```
|
discussion,topic:core,confirmed,documentation
|
low
|
Critical
|
631,114,322 |
godot
|
A tilemap created in 3.0 will break if you use `Fix Invalid Tiles`
|
Godot 3.2.1
I created a lot of levels with tilemaps in Godot 3.0.6, and later migrated the project to 3.1, then 3.2. Levels still worked fine, until I noticed a few occurrences of "invalid cell" errors in some levels.
So I tried using `Fix Invalid Tiles` on these levels, but it completely messed it up with a checkerboard pattern:

After failing to create a minimal repro with only the tilemap in the scene, I came to the conclusion that `Fix Invalid Tiles` will break *if you did not re-save the scene before, using Godot 3.2*. So you have to edit something and save on every level to "upgrade" them.
However this is very unintuitive. This should be fixed, and if it can't be, this should be indicated somehow.
I tried to create another repro from scratch but could not get the checkerboard pattern to appear; I only get it in my project until I re-save the scenes...
Here is the diff detail from my project after I re-save:

|
bug,topic:editor
|
low
|
Critical
|
631,122,289 |
TypeScript
|
bug of multi re-export with interface
|
<!-- 🚨 STOP 🚨 𝗦𝗧𝗢𝗣 🚨 𝑺𝑻𝑶𝑷 🚨
Half of all issues filed here are duplicates, answered in the FAQ, or not appropriate for the bug tracker. Even if you think you've found a *bug*, please read the FAQ first, especially the Common "Bugs" That Aren't Bugs section!
Please help us by doing the following steps before logging an issue:
* Search: https://github.com/Microsoft/TypeScript/search?type=Issues
* Read the FAQ: https://github.com/Microsoft/TypeScript/wiki/FAQ
Please fill in the *entire* template below.
-->
<!--
Please try to reproduce the issue with the latest published version. It may have already been fixed.
For npm: `typescript@next`
This is also the 'Nightly' version in the playground: http://www.typescriptlang.org/play/?ts=Nightly
-->
**TypeScript Version:** 3.7.x-dev.201xxxxx
<!-- Search terms you tried before logging this (so others can find this issue more easily) -->
**Search Terms:**
**Code**
- (first) https://github.com/bluelovers/ws-regexp/blob/master/packages/%40lazy-cjk/jp-table-convert/lib/types.ts
```ts
export interface IOptions
{
/**
* Characters to ignore, or any Object that supports indexOf
*/
skip?,
/**
* safe mode
*/
safe?: boolean,
}
```
- https://github.com/bluelovers/ws-regexp/blob/master/packages/%40lazy-cjk/jp-table-convert/index.ts
- https://github.com/bluelovers/ws-regexp/blob/master/packages/cjk-conv/lib/jp/core.ts
- (last) https://github.com/bluelovers/ws-regexp/blob/master/packages/cjk-conv/lib/jp/index.ts
```ts
export { zh2jp, zht2jp, zhs2jp, zhs2zht, zht2zhs, cjk2zhs, jp2zhs, jp2zht, cjk2zht, cjk2jp, IOptions } from './core';
```
**Expected behavior:**
```js
Object.defineProperty(exports, "__esModule", { value: true });
var core_1 = require("./core");
Object.defineProperty(exports, "zh2jp", { enumerable: true, get: function () { return core_1.zh2jp; } });
Object.defineProperty(exports, "zht2jp", { enumerable: true, get: function () { return core_1.zht2jp; } });
Object.defineProperty(exports, "zhs2jp", { enumerable: true, get: function () { return core_1.zhs2jp; } });
Object.defineProperty(exports, "zhs2zht", { enumerable: true, get: function () { return core_1.zhs2zht; } });
Object.defineProperty(exports, "zht2zhs", { enumerable: true, get: function () { return core_1.zht2zhs; } });
Object.defineProperty(exports, "cjk2zhs", { enumerable: true, get: function () { return core_1.cjk2zhs; } });
Object.defineProperty(exports, "jp2zhs", { enumerable: true, get: function () { return core_1.jp2zhs; } });
Object.defineProperty(exports, "jp2zht", { enumerable: true, get: function () { return core_1.jp2zht; } });
Object.defineProperty(exports, "cjk2zht", { enumerable: true, get: function () { return core_1.cjk2zht; } });
Object.defineProperty(exports, "cjk2jp", { enumerable: true, get: function () { return core_1.cjk2jp; } });
```
**Actual behavior:**
The `(last)` file emits wrong JS code in Version 4.0.0-dev.20200601.
This bug happens only in very rare cases.
```patch
***************
*** 11,18 ****
--- 11,19 ----
Object.defineProperty(exports, "zht2zhs", { enumerable: true, get: function () { return core_1.zht2zhs; } });
Object.defineProperty(exports, "cjk2zhs", { enumerable: true, get: function () { return core_1.cjk2zhs; } });
Object.defineProperty(exports, "jp2zhs", { enumerable: true, get: function () { return core_1.jp2zhs; } });
Object.defineProperty(exports, "jp2zht", { enumerable: true, get: function () { return core_1.jp2zht; } });
Object.defineProperty(exports, "cjk2zht", { enumerable: true, get: function () { return core_1.cjk2zht; } });
Object.defineProperty(exports, "cjk2jp", { enumerable: true, get: function () { return core_1.cjk2jp; } });
+ Object.defineProperty(exports, "IOptions", { enumerable: true, get: function () { return core_1.IOptions; } });
exports.default = exports;
//# sourceMappingURL=index.js.map
```
**Playground Link:** <!-- A link to a TypeScript Playground "Share" link which demonstrates this behavior -->
**Related Issues:** <!-- Did you find other bugs that looked similar? -->
|
Needs Investigation
|
low
|
Critical
|
631,148,678 |
rust
|
Clarify the behavior of std::task::Waker and Future::Poll
|
I have implemented a few sync mechanisms, including WaitGroup and MPMC/MPSC, in async code. Although my code currently seems to work fine, I am confused about the documented behavior and feel that there is undefined behavior, especially when a future is implemented lock-free and is waited on / woken up from multiple threads concurrently.
In https://doc.rust-lang.org/std/future/trait.Future.html there is the following note:
> Note that on multiple calls to poll, only the Waker from the Context passed to the most recent call should be scheduled to receive a wakeup.
Before any introspection into the low-level implementation, I assumed the following actions should be done when implementing std::future::Future:
- when poll() is called, first test the readiness condition
- register the waker, either by storing it inside some object or by passing it to an event driver residing in another thread
- check the readiness condition again
- if poll() returns Ready before the waker actually fired, clean up the waker registration resources
- on Drop of the Future object (e.g. when using tokio::time::timeout with the future), clean up the waker registration resources
Later I found that, when trying to be lockless, future::select! makes the future's poll() be called multiple times, and registering the waker with the event driver every time harms performance. I am also not sure whether the waker produced by the same future is the same one across multiple poll() calls. And when a future is expected to seldom be woken up (e.g. a close-channel receiver polled in combination with other busy channel receivers), all the wakers produced by the same future will take up considerable memory if my CustomWaker is not cleaned up (cancelled) ASAP.
I then discovered that in all my usage cases the following assumptions seem to be valid, which would greatly help to simplify and speed up my code in a parallel environment, with controllable memory usage:
- A waker is valid from the time of ctx.waker().clone() until it is woken up from outside
- Across multiple poll() calls, if the waker from a previous call has not been woken up, I can expect the future to be woken for later poll() calls when waker.wake() is called on that previous waker
- If a call to poll() decides its condition is not met, a concurrent wake() still guarantees the future will be woken in the future
I did a little investigation into the tokio-0.2 runtime code:
https://github.com/tokio-rs/tokio/blob/master/tokio/src/runtime/task/raw.rs
https://github.com/tokio-rs/tokio/blob/master/tokio/src/runtime/task/stack.rs
https://github.com/tokio-rs/tokio/blob/master/tokio/src/runtime/task/waker.rs
std::task::Waker just wraps std::task::RawWaker. RawWaker.data points to a "header" entry allocated on the heap. When ctx.waker().clone() is called, the ref_count of the "header" is incremented, which means all wakers returned by ctx.waker().clone() point to the same waker. Until all the wakers are dropped or woken, the "header" exists.
The logic seems to support my assumptions. A modification to my code based on these assumptions runs better than before.
I think the documentation or the RFC needs to be clearer, for better user understanding and for more predictable behavior across different runtime implementations.
|
C-enhancement,T-libs-api,A-async-await,AsyncAwait-Triaged
|
low
|
Major
|
631,157,881 |
pytorch
|
Segfault = docker + tensorboard + pytorch
|
## π Bug
I have this program:
```python
from torch.utils.tensorboard import SummaryWriter
writer = SummaryWriter()
writer.add_scalar('test', 1)
writer.close()
```
Whenever I try to run it on a cell of a jupyter-lab instance in a docker container with tensorboard, **if it is the first time it crashes the kernel, otherwise it runs forever**. Basically it's unusable.
My suspicion was a segfault, since no log appears anywhere while it blocks. Here's the `sudo dmesg` output that I get every time I try to run a cell using the writer:

## To Reproduce
Steps:
0. pull the docker image to reproduce the bug `docker pull fpelosin/tboardbug`
1. Watch me reproducing the bug instead of listing the passages: [asciinema](https://asciinema.org/a/CxtaVfej4QHkZxgJXiljMbyXn)
## Expected behavior
Whenever I run the cell it should just run.
## Environment
_Run inside the container:_
```
Collecting environment information...
PyTorch version: 1.5.0
Is debug build: No
CUDA used to build PyTorch: 10.1
OS: Ubuntu 18.04.3 LTS
GCC version: (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0
CMake version: version 3.17.20200320-g6bea0b2
Python version: 3.6
Is CUDA available: Yes
CUDA runtime version: 10.1.243
GPU models and configuration: GPU 0: GeForce RTX 2080 Ti
Nvidia driver version: 440.64
cuDNN version: /usr/lib/x86_64-linux-gnu/libcudnn.so.7.6.5
Versions of relevant libraries:
[pip3] numpy==1.18.2
[pip3] torch==1.5.0
[pip3] torchvision==0.6.0
[conda] Could not collect
```
## Additional context
_Run inside the container:_
```
cython --version
Cython version 0.29.15
tensorflow.__version__
2.2.0
tensorboard --version
2.2.2
torch.__version__
1.5.0
jupyter-lab --version
2.1.4
python --version
Python 3.6.9
uname -a
Linux e13c9d2cf9a9 5.4.0-33-generic #37-Ubuntu SMP Thu May 21 12:53:59 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
```
_Run outside the container:_
```
docker --version
Docker version 19.03.11, build 42e35e61f3
```
with nvidia-docker (not sure how to check the version). The Docker image is based on [ufoym/deepo](https://github.com/ufoym/deepo) and is very simple:
```Dockerfile
FROM ufoym/deepo
RUN pip install jupyterlab
RUN pip install torch==1.5.0
RUN pip install torchvision==0.6.0
RUN pip install tensorflow==2.2.0
```
This is the tensorboard diagnose:
[tboard.log](https://github.com/tensorflow/tensorboard/files/4731264/tboard.log)
[They](https://github.com/tensorflow/tensorboard/issues/3704) pointed me here.
|
triaged,module: tensorboard
|
low
|
Critical
|
631,165,042 |
pytorch
|
[JIT] Print out mutation in IR Dumps
|
## π Feature
When we print out the IR of a function like:
```
@torch.jit.script
def foo(x):
res = []
for i in range(len(x)):
res.append(x[i])
return res
```
We should make visible the mutation that is occurring. Below, the first argument of `aten::append` has a `(!)` after it to indicate it is being mutated.
```
graph(%x.1 : Tensor):
%12 : int = prim::Constant[value=0]()
%6 : bool = prim::Constant[value=1]()
%res.1 : Tensor[] = prim::ListConstruct()
%3 : int = aten::len(%x.1)
= prim::Loop(%3, %6)
block0(%i.1 : int):
%13 : Tensor = aten::select(%x.1, %12, %i.1)
%14 : None = aten::append(%res.1 (!), %13)
-> (%6)
return (%res.1)
```
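For context, a minimal sketch of how such an IR dump can be produced today (the `(!)` mutation marker shown above is the proposed addition, not current output):
```python
import torch

@torch.jit.script
def foo(x):
    res = []
    for i in range(len(x)):
        res.append(x[i])
    return res

# Prints the textual IR; currently nothing marks that aten::append
# mutates its first argument.
print(foo.graph)
```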
TorchScript developers often come from IRs that are completely functional. TorchScript IR might look very similar to TF / Caffe2 / ONNX, but there are additional semantics that are not being represented. I've also had a number of instances where someone asked me why a node wasn't getting optimized, and the answer was because an argument was marked as being mutated. If that had been made visible they probably would have figured it out.
There are a number of other properties of nodes which are not made explicit when we print IR, such as `hasSideEffects`, `isNondeterministic()`, aliasing, etc. I don't think we should go too far into the weeds with adding other properties, but mutation would be helpful to add.
cc @suo @jamesr66a
|
triage review,oncall: jit
|
low
|
Minor
|
631,187,483 |
flutter
|
More graceful handling of INSTALL_FAILED_VERIFICATION_FAILURE
|
I am required by device policy to have apps installed via USB verified. It turns out that if you have airplane mode enabled, this fails during `flutter run` as follows:
```
Error: ADB exited with exit code 1
Performing Streamed Install
adb: failed to install /Users/ianh/dev/ui-exp-dg/microbenchmarks/gc/flutter/build/app/outputs/flutter-apk/app.apk: Failure [INSTALL_FAILED_VERIFICATION_FAILURE]
Error running application on Pixel 3 XL.
```
We should catch that case and report it more gracefully, and suggest turning off airplane mode if appropriate.
|
tool,a: quality,P3,team-tool,triaged-tool
|
low
|
Critical
|
631,190,408 |
flutter
|
Allow devicelab A/B test to compare 2 local engines
|
The current A/B test only compares the `--local-engine=xxx` build against the default engine.
The default engine and the local engine might be compiled with different settings (e.g., the default engine probably has `--lto` while the local engine has `--no-lto`), so the comparison isn't completely controlled if we just want to test a local engine change.
A fairer comparison would be local-to-local testing with the exact same compilation settings, just with different engine code (say, two git branches).
|
a: tests,c: new feature,engine,c: performance,P3,team-engine,triaged-engine
|
low
|
Major
|
631,211,385 |
go
|
runtime: "unknown pc" exception for amd64 programs on Windows 7 with EMET and Export Address Table Access Filtering enabled
|
<!-- Please answer these questions before submitting your issue. Thanks! -->
### What version of Go are you using (`go version`)?
<pre>
$ go version
go version go1.14.4 windows/amd64
</pre>
### Does this issue reproduce with the latest release?
Yes.
### What operating system and processor architecture are you using (`go env`)?
<details><summary><code>go env</code> Output</summary><br><pre>
$ go env
set GO111MODULE=
set GOARCH=amd64
set GOBIN=
set GOCACHE=C:\Users\user\AppData\Local\go-build
set GOENV=C:\Users\user\AppData\Roaming\go\env
set GOEXE=.exe
set GOFLAGS=
set GOHOSTARCH=amd64
set GOHOSTOS=windows
set GOINSECURE=
set GONOPROXY=
set GONOSUMDB=
set GOOS=windows
set GOPATH=C:\Users\user\go
set GOPRIVATE=
set GOPROXY=https://proxy.golang.org,direct
set GOROOT=c:\go
set GOSUMDB=sum.golang.org
set GOTMPDIR=
set GOTOOLDIR=c:\go\pkg\tool\windows_amd64
set GCCGO=gccgo
set AR=ar
set CC=gcc
set CXX=g++
set CGO_ENABLED=1
set GOMOD=
set CGO_CFLAGS=-g -O2
set CGO_CPPFLAGS=
set CGO_CXXFLAGS=-g -O2
set CGO_FFLAGS=-g -O2
set CGO_LDFLAGS=-g -O2
set PKG_CONFIG=pkg-config
set GOGCCFLAGS=-m64 -mthreads -fno-caret-diagnostics -Qunused-arguments -fmessage-length=0 -fdebug-prefix-map=C:\Users\user\AppData\Local\Temp\go-build817141214=/tmp/go-build -gno-record-gcc-switches
GOROOT/bin/go version: go version go1.14.4 windows/amd64
GOROOT/bin/go tool compile -V: compile version go1.14.4
</pre></details>
### What did you do?
<!--
If possible, provide a recipe for reproducing the error.
A complete runnable program is good.
A link on play.golang.org is best.
-->
1. Download and install [EMET 5.5](https://www.microsoft.com/en-us/download/details.aspx?id=50766) on Windows 7 64-bits.
2. Compile a simple 64-bit Go program (GOARCH=amd64). A "Hello, world" like the default shown in https://play.golang.org/ is enough.
3. Enable EMET for the compiled program (Open EMET GUI -> Apps -> Add application). The default profile enables all mitigations except EAF+ and ASR. The following screenshot illustrates this (_main.exe_ in this example):

4. Click OK.
5. Run the program from the Command Prompt.
### What did you expect to see?
Hello, playground
### What did you see instead?
```
Exception 0x80000001 0x0 0x7fefd7da020 0x7fefd7aa677
PC=0x7fefd7aa677
runtime: unknown pc 0x7fefd7aa677
stack: frame={sp:0x22fc00, fp:0x0} stack=[0x0,0x22ff30)
000000000022fb00: 0000000000000002 0000000000000000
000000000022fb10: 0000000077ca7ff0 0000000000000000
000000000022fb20: 0000000000000034 000007fef4fe7fda
000000000022fb30: 0000000000000002 0000000077bd141a
000000000022fb40: 00000000002617d0 000007fef4fdffdc
000000000022fb50: 0000000000291630 000000000022fb80
000000000022fb60: 0000000000000000 000007fef4fe20c5
000000000022fb70: 0000000000000000 0000000000000000
000000000022fb80: 00000000002746c0 0000000000000004
000000000022fb90: 0000000041d70000 0000774313c43aa0
000000000022fba0: 0000000000290810 0000000000267ed0
000000000022fbb0: 0000000000000002 00000000002680c4
000000000022fbc0: 00000000002683b0 00000000002617d0
000000000022fbd0: 0000000000000000 0000000000000000
000000000022fbe0: 00000000002617a0 0000000000000000
000000000022fbf0: 0000000077b83128 0000000000000000
000000000022fc00: <0000000000000020 0000000000260000
000000000022fc10: 0000000000000001 0000000000000018
000000000022fc20: 0000000000260298 0000000077ba7974
000000000022fc30: 0000000000000000 0000000000000018
000000000022fc40: 0000006800380021 00000000002682c0
000000000022fc50: 000000000022fd18 000007fef4fe7f54
000000000022fc60: 0000000000000000 0000000000579380
000000000022fc70: 0000000000000000 00000000779629b1
000000000022fc80: 000007fef5026470 0000000000000002
000000000022fc90: 00000000379619c4 07fe18aa80000000
000000000022fca0: 0000000000000000 0000000000000202
000000000022fcb0: 0000000000000103 000007fffffde000
000000000022fcc0: 0000000000000001 0000000000000008
000000000022fcd0: 0000000000000370 0000000000000df4
000000000022fce0: 000007fffffdc000 000000000022fea8
000000000022fcf0: 0000000000579380 0000000077bd141a
runtime: unknown pc 0x7fefd7aa677
stack: frame={sp:0x22fc00, fp:0x0} stack=[0x0,0x22ff30)
000000000022fb00: 0000000000000002 0000000000000000
000000000022fb10: 0000000077ca7ff0 0000000000000000
000000000022fb20: 0000000000000034 000007fef4fe7fda
000000000022fb30: 0000000000000002 0000000077bd141a
000000000022fb40: 00000000002617d0 000007fef4fdffdc
000000000022fb50: 0000000000291630 000000000022fb80
000000000022fb60: 0000000000000000 000007fef4fe20c5
000000000022fb70: 0000000000000000 0000000000000000
000000000022fb80: 00000000002746c0 0000000000000004
000000000022fb90: 0000000041d70000 0000774313c43aa0
000000000022fba0: 0000000000290810 0000000000267ed0
000000000022fbb0: 0000000000000002 00000000002680c4
000000000022fbc0: 00000000002683b0 00000000002617d0
000000000022fbd0: 0000000000000000 0000000000000000
000000000022fbe0: 00000000002617a0 0000000000000000
000000000022fbf0: 0000000077b83128 0000000000000000
000000000022fc00: <0000000000000020 0000000000260000
000000000022fc10: 0000000000000001 0000000000000018
000000000022fc20: 0000000000260298 0000000077ba7974
000000000022fc30: 0000000000000000 0000000000000018
000000000022fc40: 0000006800380021 00000000002682c0
000000000022fc50: 000000000022fd18 000007fef4fe7f54
000000000022fc60: 0000000000000000 0000000000579380
000000000022fc70: 0000000000000000 00000000779629b1
000000000022fc80: 000007fef5026470 0000000000000002
000000000022fc90: 00000000379619c4 07fe18aa80000000
000000000022fca0: 0000000000000000 0000000000000202
000000000022fcb0: 0000000000000103 000007fffffde000
000000000022fcc0: 0000000000000001 0000000000000008
000000000022fcd0: 0000000000000370 0000000000000df4
000000000022fce0: 000007fffffdc000 000000000022fea8
000000000022fcf0: 0000000000579380 0000000077bd141a
rax 0x0
rbx 0x22fce0
rcx 0x0
rdi 0x22fe80
rsi 0x22fe88
rbp 0x22fe28
rsp 0x22fc00
r8 0x40
r9 0x0
r10 0x0
r11 0x246
r12 0x0
r13 0x0
r14 0x0
r15 0x0
rip 0x7fefd7aa677
rflags 0x10346
cs 0x33
fs 0x53
gs 0x2b
```
### Comments
The problem only occurs if Export Address Table Access Filtering (EAF) is enabled (it's enabled by default on EMET). You can find more information about it in [EMET 5.5 User Guide](https://download.microsoft.com/download/7/7/1/771312BF-53F1-4FE1-B894-8EE93ABE567E/EMET%205%205%20User's%20Guide.pdf).
I'm aware EMET is a discontinued product but there are legacy systems still using it, so I thought it'd be good to let you guys know about this problem.
|
ExpertNeeded,OS-Windows,NeedsInvestigation,compiler/runtime
|
low
|
Critical
|
631,219,875 |
flutter
|
Use an always true BotDetector in fuchsia_asset_builder.dart
|
The fuchsia_asset_builder tool runs during the Fuchsia build, and should never *not* be considered a bot. Analytics is already disabled for the entrypoint, but bot detection runs beforehand. Running the real bot detector is likely resulting in stray files in the Fuchsia source tree. This can be worked around by injecting a bot detector that always returns true here:
https://github.com/flutter/flutter/blob/3ec6978c9c20d7372cd0d416042fd32a457c184f/packages/flutter_tools/bin/fuchsia_asset_builder.dart#L33
Example hardwired bot detector:
https://github.com/flutter/flutter/blob/e216eec7b483f3030547df48ec35b52ad05df199/packages/flutter_tools/test/src/mocks.dart#L702
/cc @iskakaushik @jonahwilliams
|
team,customer: fuchsia,tool,a: quality,P2,team-tool,triaged-tool
|
low
|
Minor
|
631,230,213 |
PowerToys
|
[PowerRename] Allow for padding prefix/suffix
|
# Summary of the new feature/enhancement
Allow for padding a filename with a character(s) to a certain number of total characters.
I often need to rename a large number of image files which are named as UPC/PLU codes. The existing file names are variable length and I need to left pad the file names with 0's to 13 digits. For example I may have the following file names:
1234.bmp
123.bmp
1234567890.bmp
The result after the batch rename would need to be:
0000000001234.bmp
0000000000123.bmp
0001234567890.bmp
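For illustration, a rough standalone sketch of the requested zero-padding (the folder path and `.bmp` filter are assumptions taken from the example, not part of PowerRename today):
```python
# Hypothetical sketch: left-pad the numeric part of each .bmp file name
# with zeros to 13 digits, matching the example above.
import os

folder = "."   # assumed folder containing the files
width = 13     # target number of digits

for name in os.listdir(folder):
    stem, ext = os.path.splitext(name)
    if ext.lower() == ".bmp" and stem.isdigit():
        padded = stem.zfill(width) + ext
        if padded != name:
            os.rename(os.path.join(folder, name), os.path.join(folder, padded))
```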
<!--
A clear and concise description of what the problem is that the new feature would solve.
Describe why and how a user would use this new functionality (if applicable).
-->
|
Idea-Enhancement,Help Wanted,Product-PowerRename
|
low
|
Minor
|
631,252,093 |
react-native
|
TextInput controlled selection broken on both iOS and Android.
|
**This issue is a continuation of the discussion:**
https://github.com/facebook/react-native/commit/dff490d140010913d3209a2f3e987914b9c4eee4#commitcomment-39332764
**The link to the sample project that demonstrates the issues:**
https://github.com/Ginger-Labs/Input-bug
## Description
Controlled selection seems to be broken on both iOS and Android. To demonstrate the issues, I created a sample project (see the link above).
## React Native version:
System:
OS: macOS 10.15.4
CPU: (8) x64 Intel(R) Core(TM) i7-7820HQ CPU @ 2.90GHz
Memory: 1.55 GB / 16.00 GB
Shell: 5.7.1 - /bin/zsh
Binaries:
Node: 10.14.2 - /usr/local/bin/node
Yarn: 1.13.0 - /usr/local/bin/yarn
npm: 6.14.5 - ~/.npm-global/bin/npm
Watchman: 4.9.0 - /usr/local/bin/watchman
SDKs:
iOS SDK:
Platforms: iOS 13.5, DriverKit 19.0, macOS 10.15, tvOS 13.4, watchOS 6.2
Android SDK:
API Levels: 23, 24, 25, 26, 27, 28, 29
Build Tools: 26.0.3, 28.0.3, 29.0.0, 30.0.0
System Images: android-28 | Google APIs Intel x86 Atom, android-28 | Google Play Intel x86 Atom, android-29 | Google APIs Intel x86 Atom, android-29 | Google Play Intel x86 Atom_64
Android NDK: 21.1.6352462
IDEs:
Android Studio: 3.6 AI-192.7142.36.36.6392135
Xcode: 11.5/11E608c - /usr/bin/xcodebuild
npmPackages:
react: ~16.9.0 => 16.9.0
react-native: ~0.62.2 => 0.62.2
npmGlobalPackages:
create-react-native-app: 3.4.0
react-native-app-id: 0.0.5
react-native-cli: 2.0.1
## Steps To Reproduce
The reproduction steps are in the sample project's ReadMe file.
For simplicity purposes, I will post them here as well:
SIM - iPhone 11 (13.4.1):
1) Click on the "Click Me" button, set the cursor in the middle, add some text, and click on the button again: notice the text is set to the needed value but the selection is not at 10.
2) Input text: "Hello world", move cursor in between words, click on "@":
Expected: "Hello @Mihailworld" with the cursor at 13
Actual: "Hello @Mihailworld" with the cursor at the end of the whole string.
SIM - nexus 6P API28
1) Add text, click on enter (new line).
Expected: The text stays and a new line is created with "-" in front.
Actual: The first line becomes empty and the second line "-".
2) Press enter twice and you will get a:
`Exception in native call java.lang.IndexOutOfBoundsException: setSpan (6 ... 6) ends beyond length 3`
3) Press "@" twice and observe the same bug above.
## Expected Results
I expect the TextInput to work as intended (unless I am missing something conceptual).
## Snack, code example, screenshot, or link to a repository:
https://github.com/Ginger-Labs/Input-bug
|
Platform: iOS,Issue: Author Provided Repro,Component: TextInput,Platform: Android
|
medium
|
Critical
|
631,304,315 |
vscode
|
Slowdown in native file dialogs for wayland
|
Issue Type: <b>Performance Issue</b>
I/O operations (e.g. opening a file, deleting files, opening links) using the VSCode interface are noticeably laggy. Initially I thought this might be to do with my filesystem itself but doing the same on other software (e.g. LibreOffice) does not produce any noticeable lag.
In fact, clicking the "Preview on GitHub" button when I tried to report this issue causes the same lag!
Steps to reproduce:
* Open VSCode.
* Attempt to open a file / folder OR delete file in VSCode explorer tab.
VS Code version: Code 1.45.1 (5763d909d5f12fe19f215cbfdd29a91c0fa9208a, 2020-05-14T08:27:22.494Z)
OS version: Linux x64 5.4.0-33-generic snap
<details>
<summary>System Info</summary>
|Item|Value|
|---|---|
|CPUs|Intel(R) Core(TM) i7-8550U CPU @ 1.80GHz (8 x 2984)|
|GPU Status|2d_canvas: unavailable_software<br>flash_3d: disabled_software<br>flash_stage3d: disabled_software<br>flash_stage3d_baseline: disabled_software<br>gpu_compositing: disabled_software<br>multiple_raster_threads: enabled_on<br>oop_rasterization: disabled_off<br>protected_video_decode: disabled_off<br>rasterization: disabled_software<br>skia_renderer: disabled_off_ok<br>video_decode: disabled_software<br>viz_display_compositor: enabled_on<br>viz_hit_test_surface_layer: disabled_off_ok<br>webgl: unavailable_software<br>webgl2: unavailable_software|
|Load (avg)|1, 1, 1|
|Memory (System)|15.11GB (5.31GB free)|
|Process Argv|--force-user-env --no-sandbox --unity-launch --no-sandbox|
|Screen Reader|no|
|VM|0%|
|DESKTOP_SESSION|ubuntu-wayland|
|XDG_CURRENT_DESKTOP|Unity|
|XDG_SESSION_DESKTOP|ubuntu-wayland|
|XDG_SESSION_TYPE|wayland|
</details><details>
<summary>Process Info</summary>
```
CPU % Mem MB PID Process
0 139 7417 code main
0 31 7419 zygote
0 93 7456 gpu-process
0 46 7458 utility
0 294 7468 window (App.js - part9 (Workspace) - Visual Studio Code)
0 108 7736 watcherService
0 62 7751 searchService
0 0 7915 /bin/bash
0 46 12024 node /usr/share/yarn/bin/yarn.js run dev
0 0 12047 electron_node index.js
0 31 12048 /home/linuxbrew/.linuxbrew/Cellar/node/14.3.0_1/bin/node /home/zegheim/Documents/projects/fullstackopen/tutorial/backend/node_modules/.bin/nodemon --inspect index.js
0 0 12068 sh -c node --inspect index.js
0 46 12069 /home/linuxbrew/.linuxbrew/Cellar/node/14.3.0_1/bin/node --inspect index.js
0 0 11678 /bin/bash
0 46 12790 node /usr/share/yarn/bin/yarn.js start
0 0 12811 /bin/sh -c react-scripts start
0 31 12812 /home/linuxbrew/.linuxbrew/Cellar/node/14.3.0_1/bin/node /home/zegheim/Documents/projects/fullstackopen/tutorial/node_modules/.bin/react-scripts start
0 155 12819 /home/linuxbrew/.linuxbrew/Cellar/node/14.3.0_1/bin/node /home/zegheim/Documents/projects/fullstackopen/tutorial/node_modules/react-scripts/scripts/start.js
0 0 11739 /bin/bash
0 124 12622 extensionHost
0 77 12659 electron_node tsserver.js
0 371 12660 electron_node tsserver.js
0 124 12712 electron_node typingsInstaller.js typesMap.js
0 62 12677 /snap/code/33/usr/share/code/code /snap/code/33/usr/share/code/resources/app/extensions/json-language-features/server/dist/jsonServerMain --node-ipc --clientProcessId=12622
0 93 12680 electron_node eslintServer.js
0 108 7754 shared-process
0 0 13804 /bin/sh -c /bin/ps -ax -o pid=,ppid=,pcpu=,pmem=,command=
0 0 13805 /bin/ps -ax -o pid=,ppid=,pcpu=,pmem=,command=
0 108 13774 window (Issue Reporter)
```
</details>
<details>
<summary>Workspace Info</summary>
```
| Window (App.js - part9 (Workspace) - Visual Studio Code)
| Folder (tutorial): 65 files
| File types: js(16) json(6) png(4) rest(4) map(4) txt(3) eslintignore(2)
| gitignore(2) lock(2) ico(2)
| Conf files: package.json(2)
| Folder (phonebook): 65 files
| File types: js(15) json(6) rest(6) png(4) map(4) txt(3) gitignore(2)
| lock(2) ico(2) html(2)
| Conf files: package.json(2);
```
</details>
<details><summary>Extensions (11)</summary>
Extension|Author (truncated)|Version
---|---|---
vscode-markdownlint|Dav|0.36.0
vscode-eslint|dba|2.1.5
unsaved|esa|0.2.5
prettier-vscode|esb|5.0.0
vscode-firefox-debug|fir|2.8.0
rest-client|hum|0.23.2
vscode-edit-csv|jan|0.2.9
vscode-language-babel|mgm|0.0.27
python|ms-|2020.5.80290
debugger-for-chrome|msj|4.12.8
autodocstring|njp|0.5.3
</details>
<!-- generated by issue reporter -->
|
bug,freeze-slow-crash-leak,linux,snap
|
medium
|
Critical
|
631,343,514 |
rust
|
Use `fclass.{s|d|q}` instruction for float point classification in RISC-V targets
|
Recently I came across the topic of floating point variable classification. I found the function [`f32::classify`](https://doc.rust-lang.org/std/primitive.f32.html#method.classify) useful. Regardless of instruction set architecture, this function is currently implemented like this in the standard library:
```rust
#[stable(feature = "rust1", since = "1.0.0")]
pub fn classify(self) -> FpCategory {
const EXP_MASK: u32 = 0x7f800000;
const MAN_MASK: u32 = 0x007fffff;
let bits = self.to_bits();
match (bits & MAN_MASK, bits & EXP_MASK) {
(0, 0) => FpCategory::Zero,
(_, 0) => FpCategory::Subnormal,
(0, EXP_MASK) => FpCategory::Infinite,
(_, EXP_MASK) => FpCategory::Nan,
_ => FpCategory::Normal,
}
}
```
However, this standard library function compiles to a very long sequence of instructions. On RISC-V RV64GC, it compiles into:
<details>
<summary>very long assembly code</summary>
```asm
example::classify_std:
fmv.w.x ft0, a0
fsw ft0, -20(s0)
lwu a0, -20(s0)
sd a0, -48(s0)
j .LBB0_1
.LBB0_1:
lui a0, 2048
addiw a0, a0, -1
ld a1, -48(s0)
and a0, a0, a1
lui a2, 522240
and a2, a2, a1
sw a0, -32(s0)
sw a2, -28(s0)
mv a2, zero
bne a0, a2, .LBB0_3
j .LBB0_2
.LBB0_2:
lw a0, -28(s0)
mv a1, zero
beq a0, a1, .LBB0_7
j .LBB0_3
.LBB0_3:
lwu a0, -28(s0)
mv a1, zero
sd a0, -56(s0)
beq a0, a1, .LBB0_8
j .LBB0_4
.LBB0_4:
lui a0, 522240
ld a1, -56(s0)
bne a1, a0, .LBB0_6
j .LBB0_5
.LBB0_5:
lw a0, -32(s0)
mv a1, zero
beq a0, a1, .LBB0_9
j .LBB0_10
.LBB0_6:
addi a0, zero, 4
sb a0, -33(s0)
j .LBB0_11
.LBB0_7:
addi a0, zero, 2
sb a0, -33(s0)
j .LBB0_11
.LBB0_8:
addi a0, zero, 3
sb a0, -33(s0)
j .LBB0_11
.LBB0_9:
addi a0, zero, 1
sb a0, -33(s0)
j .LBB0_11
.LBB0_10:
mv a0, zero
sb a0, -33(s0)
j .LBB0_11
.LBB0_11:
lb a0, -33(s0)
ret
```
</details>
To solve this problem, RISC-V provides the `fclass.{s|d|q}` instructions. According to Section 11.9 of the RISC-V spec, the instruction `fclass.s rd, rs1` examines `rs1` as a 32-bit floating point number and stores its classification into `rd`. This way, we can use register `rd` to decide which enum value of the Rust standard library we should return.
I'd like to explain this procedure in Rust code. The new way looks like this:
```rust
pub fn classify_riscv_rvf(input: f32) -> FpCategory {
let ans: usize;
// step 1: map this f32 value into RISC-V defined integer type number
// this procedure could be built in into compiler
unsafe { llvm_asm!(
"fclass.s a0, fa0"
:"={a0}"(ans)
:"{fa0}"(input)
:
:"intel"
) };
// step 2: convert from return flags to FpCategory enum value
if ans & 0b10000001 != 0 {
return FpCategory::Infinite;
}
if ans & 0b01000010 != 0 {
return FpCategory::Normal;
}
if ans & 0b00100100 != 0 {
return FpCategory::Subnormal;
}
if ans & 0b00011000 != 0 {
return FpCategory::Zero;
}
FpCategory::Nan
}
```
It compiles into the following assembly code which is shorter and could be executed faster:
```asm
example::classify_riscv_rvf:
fclass.s a0, fa0
andi a2, a0, 129
addi a1, zero, 1
beqz a2, .LBB0_2
.LBB0_1:
add a0, zero, a1
ret
.LBB0_2:
andi a2, a0, 66
addi a1, zero, 4
bnez a2, .LBB0_1
andi a2, a0, 36
addi a1, zero, 3
bnez a2, .LBB0_1
andi a0, a0, 24
snez a0, a0
slli a1, a0, 1
add a0, zero, a1
ret
```
For `f64` types we could use the `fclass.d` instruction instead of `fclass.s` (which is for `f32`); if in the future we had a chance to introduce an `f128` primitive type, there is also an `fclass.q` instruction. Using these instructions improves the speed of this function on RISC-V platforms, and the enhancement is especially significant for embedded devices. I suggest changing the implementation of this function in the standard library. We may implement it in either of the following ways:
1. Implement `fclassf32` and `fclassf64` intrinsic functions in `core::intrinsics`, and call them in `f32::classify` or `f64::classify`. These functions can be implemented with the special instruction, with a fallback on other platforms;
2. Use inline assembly directly in the standard library and add a `#[cfg(..)]` so it is compiled only for RISC-V targets with floating-point extension `F` or `D` respectively, with a fallback on other platforms.
|
I-slow,C-enhancement,O-riscv,T-libs,A-floating-point
|
low
|
Major
|
631,353,892 |
godot
|
Godot tries to load addon before it imports all its resources.
|
**Godot version:**
3.2.1
**OS/device including version:**
Ubuntu 20.04, Windows 10
**Issue description:**
So I am making an addon with a lot of resources, called Rakugo.
And I publish a template project that uses it.
When people try to open it for the first time, it crashes Godot with errors similar to this:
```
ERROR: Failed loading resource: res://.import/sound-on.png-c7069ca607a583f763fc164615f5a3ee.stex.
At: core/io/resource_loader.cpp:278
ERROR: Failed loading resource: res://gui/OptionsBox/icons/sound-on.png.
At: core/io/resource_loader.cpp:278
ERROR: _load_data: Condition "!f" is true. Returned: ERR_CANT_OPEN
At: scene/resources/texture.cpp:502
ERROR: Failed loading resource: res://.import/sound-on.png-c7069ca607a583f763fc164615f5a3ee.stex.
At: core/io/resource_loader.cpp:278
ERROR: Failed loading resource: res://gui/OptionsBox/icons/sound-on.png.
At: core/io/resource_loader.cpp:278
WARNING: _parse_ext_resource: Couldn't load external resource: res://gui/OptionsBox/icons/sound-on.png
At: scene/resources/resource_format_text.cpp:175
ERROR: _load_data: Condition "!f" is true. Returned: ERR_CANT_OPEN
At: scene/resources/texture.cpp:502
ERROR: Failed loading resource: res://.import/rakugo_var_h_slider.svg-078a6aa2c49fda8df315869911b0f882.stex.
At: core/io/resource_loader.cpp:278
ERROR: Failed loading resource: res://addons/Rakugo/icons/rakugo_var_h_slider.svg.
At: core/io/resource_loader.cpp:278
WARNING: _parse_ext_resource: Couldn't load external resource: res://addons/Rakugo/icons/rakugo_var_h_slider.svg
At: scene/resources/resource_format_text.cpp:175
ERROR: _load_data: Condition "!f" is true. Returned: ERR_CANT_OPEN
At: scene/resources/texture.cpp:502
```
But when they try to open it once again, everything works fine.
**Steps to reproduce:**
1. Download attached zip project
1. Try to open in Godot
1. If it crashes try to reopen the project
**Minimal reproduction project:**
1. The empty project just with addon [Rakugo-bug-test.zip](https://github.com/godotengine/godot/files/4734348/Rakugo-bug-test.zip)
2. [The Rakugo Template](https://github.com/rakugoteam/Rakugo/releases/download/v2.1.01/Rakugo-2.1.01.7z)
|
bug,topic:editor,topic:plugin
|
low
|
Critical
|
631,367,589 |
go
|
net/http/httputil: ReverseProxy is not allowing to change the status message for the response
|
<!--
Please answer these questions before submitting your issue. Thanks!
For questions please use one of our forums: https://github.com/golang/go/wiki/Questions
-->
### What version of Go are you using (`go version`)?
<pre>
$ go1.13.3 windows/amd64
</pre>
### Does this issue reproduce with the latest release?
Have not upgraded to latest version yet
### What operating system and processor architecture are you using (`go env`)?
set GO111MODULE=
set GOARCH=amd64
set GOBIN=C:\Users\xyz\Documents\GitHub\Go\workspace\bin
set GOCACHE=C:\Users\xyz\AppData\Local\go-build
set GOENV=C:\Users\xyz\AppData\Roaming\go\env
set GOEXE=.exe
set GOFLAGS=
set GOHOSTARCH=amd64
set GOHOSTOS=windows
set GONOPROXY=
set GONOSUMDB=
set GOOS=windows
set GOPATH=C:\Users\xyz\Documents\GitHub\Go\workspace
set GOPRIVATE=
set GOPROXY=https://proxy.golang.org,direct
set GOROOT=C:\Go
set GOSUMDB=sum.golang.org
set GOTMPDIR=
set GOTOOLDIR=C:\Go\pkg\tool\windows_amd64
set GCCGO=gccgo
set AR=ar
set CC=gcc
set CXX=g++
set CGO_ENABLED=1
set GOMOD=C:\Users\xyz\Documents\GitHub\integration-gateway\go.mod
set CGO_CFLAGS=-g -O2
set CGO_CPPFLAGS=
set CGO_CXXFLAGS=-g -O2
set CGO_FFLAGS=-g -O2
set CGO_LDFLAGS=-g -O2
set PKG_CONFIG=pkg-config
set GOGCCFLAGS=-m64 -mthreads -fmessage-length=0 -fdebug-prefix-map=C:\Users\xyz\AppData\Local\Temp\go-build906736114=/tmp/go-build -gno-record-gcc-switches
<details><summary><code>go env</code> Output</summary><br><pre>
$ go env
</pre></details>
### What did you do?
Overrode the ModifyResponse method in ReverseProxy and set the status of the Response to a custom value. In addition to that, I changed the status code of the response as well:
resp.StatusCode = 201
resp.Status = "Custom object created"
<!--
If possible, provide a recipe for reproducing the error.
A complete runnable program is good.
A link on play.golang.org is best.
-->
### What did you expect to see?
Expected to see both the status code and the status message change in the final response.
### What did you see instead?
Only the status code was changed, but the status message was never changed.
|
NeedsInvestigation,FeatureRequest
|
low
|
Critical
|
631,417,937 |
react
|
Bug: Form reset lost checkbox onChange event
|
Hi, I use a checkbox in uncontrolled mode with an onChange handler; after a form reset, the onChange handler is lost.
```js
<input type="checkbox" onChange={onChange} />
```
but adding the listener via ref.addEventListener('change', onChange) works fine:
```js
const checkboxRef = useRef<HTMLInputElement>();
useEffect(() => {
  if (checkboxRef.current) {
    checkboxRef.current.addEventListener('change', onChange);
  }
}, []);
<input type="checkbox" ref={checkboxRef} onChange={onChange} />
```
React version: 16.13 and old
## Steps To Reproduce
1. checkbox => checked
2. form reset
3. checked => checked
Link to code example:
[without React it works fine](https://codepen.io/imagine10255/pen/ExPaLOJ?editors=1111)
[after a reset, the target's onChange is lost](https://codesandbox.io/s/affectionate-brook-96dw9?file=/src/App.js)
## The current behavior
1. checkbox => checked (target onChange)
2. form reset
3. checked => checked (lose target onChange)
## The expected behavior
1. checkbox => checked (target onChange)
2. form reset
3. checked => checked (target onChange)
|
Type: Bug,Component: DOM
|
low
|
Critical
|
631,434,528 |
youtube-dl
|
yt-dl.org is sending an expired intermediate cert, causing validation by youtube-dl.exe (python 3.4.4) to fail
|
<!--
######################################################################
WARNING!
IGNORING THE FOLLOWING TEMPLATE WILL RESULT IN ISSUE CLOSED AS INCOMPLETE
######################################################################
-->
## Checklist
<!--
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of youtube-dl:
- First of, make sure you are using the latest version of youtube-dl. Run `youtube-dl --version` and ensure your version is 2020.05.29. If it's not, see https://yt-dl.org/update on how to update. Issues with outdated version will be REJECTED.
- Make sure that all provided video/audio/playlist URLs (if any) are alive and playable in a browser.
- Make sure that all URLs and arguments with special characters are properly quoted or escaped as explained in http://yt-dl.org/escape.
- Search the bugtracker for similar issues: http://yt-dl.org/search-issues. DO NOT post duplicates.
- Read bugs section in FAQ: http://yt-dl.org/reporting
- Finally, put x into all relevant boxes (like this [x])
-->
- [ ] I'm reporting a broken site support issue
- [x] I've verified that I'm running youtube-dl version **2020.05.29**
- [ ] I've checked that all provided URLs are alive and playable in a browser
- [ ] I've checked that all URLs and arguments with special characters are properly quoted or escaped
- [x] I've searched the bugtracker for similar bug reports including closed ones
- [x] I've read bugs section in FAQ
## Verbose log
<!--
Provide the complete verbose output of youtube-dl that clearly demonstrates the problem.
Add the `-v` flag to your command line you run youtube-dl with (`youtube-dl -v <your command line>`), copy the WHOLE output and insert it below. It should look similar to this:
[debug] System config: []
[debug] User config: []
[debug] Command-line args: [u'-v', u'http://www.youtube.com/watch?v=BaW_jenozKcj']
[debug] Encodings: locale cp1251, fs mbcs, out cp866, pref cp1251
[debug] youtube-dl version 2020.05.29
[debug] Python version 2.7.11 - Windows-2003Server-5.2.3790-SP2
[debug] exe versions: ffmpeg N-75573-g1d0487f, ffprobe N-75573-g1d0487f, rtmpdump 2.4
[debug] Proxy map: {}
<more lines>
-->
```
[debug] System config: []
[debug] User config: []
[debug] Custom config: []
[debug] Command-line args: ['-U', '--verbose']
[debug] Encodings: locale cp1252, fs mbcs, out cp437, pref cp1252
[debug] youtube-dl version 2020.05.29
[debug] Python version 3.4.4 (CPython) - Windows-7-6.1.7601-SP1
[debug] exe versions: ffmpeg 4.2.1, ffprobe 4.2.1
[debug] Proxy map: {}
Traceback (most recent call last):
File "C:\Python\Python34\lib\urllib\request.py", line 1183, in do_open
File "C:\Python\Python34\lib\http\client.py", line 1137, in request
File "C:\Python\Python34\lib\http\client.py", line 1182, in _send_request
File "C:\Python\Python34\lib\http\client.py", line 1133, in endheaders
File "C:\Python\Python34\lib\http\client.py", line 963, in _send_output
File "C:\Python\Python34\lib\http\client.py", line 898, in send
File "C:\Python\Python34\lib\http\client.py", line 1287, in connect
File "C:\Python\Python34\lib\ssl.py", line 362, in wrap_socket
File "C:\Python\Python34\lib\ssl.py", line 580, in __init__
File "C:\Python\Python34\lib\ssl.py", line 807, in do_handshake
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:600)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\dst\AppData\Roaming\Build archive\youtube-dl\ytdl-org\tmpxtvjzx45\build\youtube_dl\update.py", line 46, in update_self
File "C:\Python\Python34\lib\urllib\request.py", line 464, in open
File "C:\Python\Python34\lib\urllib\request.py", line 482, in _open
File "C:\Python\Python34\lib\urllib\request.py", line 442, in _call_chain
File "C:\Users\dst\AppData\Roaming\Build archive\youtube-dl\ytdl-org\tmpxtvjzx45\build\youtube_dl\utils.py", line 2736, in https_open
File "C:\Python\Python34\lib\urllib\request.py", line 1185, in do_open
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:600)>
ERROR: can't find the current version. Please try again later.
```
## Description
<!--
Provide an explanation of your issue in an arbitrary form. Please make sure the description is worded well enough to be understood, see https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient. Provide any additional information, suggested solution and as much context and examples as possible.
If work on your issue requires account credentials please provide them or explain how one can obtain them.
-->
yt-dl.org is sending the expired intermediate cert (the last one in the below openssl output, "COMODO RSA Certification Authority"), which is what's causing validation to fail in python. Browsers are evidently smart enough to ignore it, and build an alternate chain, but the python client is not.
```txt
echo q | openssl s_client -connect yt-dl.org:443 -CApath /usr/lib/ssl/certs -showcerts
CONNECTED(00000003)
depth=2 C = GB, ST = Greater Manchester, L = Salford, O = COMODO CA Limited, CN = COMODO RSA Certification Authority
verify return:1
depth=1 C = GB, ST = Greater Manchester, L = Salford, O = COMODO CA Limited, CN = COMODO RSA Domain Validation Secure Server CA
verify return:1
depth=0 OU = Domain Control Validated, OU = PositiveSSL, CN = yt-dl.org
verify return:1
---
Certificate chain
0 s:OU = Domain Control Validated, OU = PositiveSSL, CN = yt-dl.org
i:C = GB, ST = Greater Manchester, L = Salford, O = COMODO CA Limited, CN = COMODO RSA Domain Validation Secure Server CA
-----BEGIN CERTIFICATE-----
MIIGPzCCBSegAwIBAgIQMZS8dhHXcUB+Xw/VCtIQ5DANBgkqhkiG9w0BAQsFADCB
kDELMAkGA1UEBhMCR0IxGzAZBgNVBAgTEkdyZWF0ZXIgTWFuY2hlc3RlcjEQMA4G
A1UEBxMHU2FsZm9yZDEaMBgGA1UEChMRQ09NT0RPIENBIExpbWl0ZWQxNjA0BgNV
BAMTLUNPTU9ETyBSU0EgRG9tYWluIFZhbGlkYXRpb24gU2VjdXJlIFNlcnZlciBD
QTAeFw0xODAyMDEwMDAwMDBaFw0yMTAyMTgyMzU5NTlaME0xITAfBgNVBAsTGERv
bWFpbiBDb250cm9sIFZhbGlkYXRlZDEUMBIGA1UECxMLUG9zaXRpdmVTU0wxEjAQ
BgNVBAMTCXl0LWRsLm9yZzCCAiIwDQYJKoZIhvcNAQEBBQADggIPADCCAgoCggIB
AJqSiXCTmmdFE2EStjBrXAd62YD5lFWdS55i1fJQjUzmk0SYTFFlM63pLl46J/Xv
KXKlQhz94+l4RpG+Y4/lTJpKQCvATiJ+YB3nd40Py08VbUhLRqafcXCY72JC5aoG
6CHOxzYpMITImNVkVCnDSbFoEEzu6cHHtK8YLDCbxbAt9Jxb6fd43tCJRxiJyC72
MBTtURIWL5dVzWFppHBUuHcmFyCjDXlWcnNWPK9M0h1KEAlv3my/vfvYCHWKOaVZ
0a9qpHZojhzIEhuGdg00QxBB9jd7CVwsQOYDOgfn2wEZCP1bHYlITFmWeEIxeBTd
cAiVkjsDYQScY/ll+EeB+ONzBJ2vjsSmMKY0YdYU5qKRZcZvzMVTY+TaKTQN9Bzv
MKauvZ1cCLEz/NeewTkGILXzHpkXl1rZwQN0oDEZJAG3rUn0L0t9FccA5zG/1uxV
cvYp//1DWulbC7/8KZQKrJN4DDxeXgFj7gaYptrxdxY8N9jWTBWORflky9r5YT8J
gAqLMPS8GpBOCUOsaMpAu92GZsxKM46Qy0MDv3xXYxL7GQ4L/9O9neENv3mOVAS9
4c5P8w5x5vTQ+mpeS44Upij0eYbU+Axp6/tZGA2QMwBv9HPpX+1ez59kAC/MBLQs
0t6gu/pc0k4d8KmvY3nc0eS7yLitOOzoub2dzMM38SutAgMBAAGjggHVMIIB0TAf
BgNVHSMEGDAWgBSQr2o6lFoL2JDqElZz30O0Oija5zAdBgNVHQ4EFgQUq7j9Gob+
eMDqG4+pwvoyOAI1eP4wDgYDVR0PAQH/BAQDAgWgMAwGA1UdEwEB/wQCMAAwHQYD
VR0lBBYwFAYIKwYBBQUHAwEGCCsGAQUFBwMCME8GA1UdIARIMEYwOgYLKwYBBAGy
MQECAgcwKzApBggrBgEFBQcCARYdaHR0cHM6Ly9zZWN1cmUuY29tb2RvLmNvbS9D
UFMwCAYGZ4EMAQIBMFQGA1UdHwRNMEswSaBHoEWGQ2h0dHA6Ly9jcmwuY29tb2Rv
Y2EuY29tL0NPTU9ET1JTQURvbWFpblZhbGlkYXRpb25TZWN1cmVTZXJ2ZXJDQS5j
cmwwgYUGCCsGAQUFBwEBBHkwdzBPBggrBgEFBQcwAoZDaHR0cDovL2NydC5jb21v
ZG9jYS5jb20vQ09NT0RPUlNBRG9tYWluVmFsaWRhdGlvblNlY3VyZVNlcnZlckNB
LmNydDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuY29tb2RvY2EuY29tMCMGA1Ud
EQQcMBqCCXl0LWRsLm9yZ4INd3d3Lnl0LWRsLm9yZzANBgkqhkiG9w0BAQsFAAOC
AQEAUMCK4vJAeMvNOGEQp1fcI4PvQ1sGLaPbawVkEwCOA3IW5hfQG7xObxI9qywv
7xweXLlF9YB01j/ovb7Oxg11iLsdJ4ESo6pkqiUsAsLRQPSp/3MuRMUgzLqH1Hh0
BX3yBAUGnVxZV/jORpUjlKDQLrpIdWTIfuYhQzH2uuBN2iqjzQBb7biahX3naNjl
Qu1MGeJhjO4aKQMXVLbs0MJ7QGGqr+t1VyjjOluy6nftsd6EJdZ44hnOfOArL9Hm
dLDQ1FExCPwvZyvVTHaCk4PcbuYF+DecvaRkeDejEYxMza2ULbK6AiaLRem6G/4M
TdC1MRnuYlnaCkcYVVyYu1ejGw==
-----END CERTIFICATE-----
1 s:C = GB, ST = Greater Manchester, L = Salford, O = COMODO CA Limited, CN = COMODO RSA Domain Validation Secure Server CA
i:C = GB, ST = Greater Manchester, L = Salford, O = COMODO CA Limited, CN = COMODO RSA Certification Authority
-----BEGIN CERTIFICATE-----
MIIGCDCCA/CgAwIBAgIQKy5u6tl1NmwUim7bo3yMBzANBgkqhkiG9w0BAQwFADCB
hTELMAkGA1UEBhMCR0IxGzAZBgNVBAgTEkdyZWF0ZXIgTWFuY2hlc3RlcjEQMA4G
A1UEBxMHU2FsZm9yZDEaMBgGA1UEChMRQ09NT0RPIENBIExpbWl0ZWQxKzApBgNV
BAMTIkNPTU9ETyBSU0EgQ2VydGlmaWNhdGlvbiBBdXRob3JpdHkwHhcNMTQwMjEy
MDAwMDAwWhcNMjkwMjExMjM1OTU5WjCBkDELMAkGA1UEBhMCR0IxGzAZBgNVBAgT
EkdyZWF0ZXIgTWFuY2hlc3RlcjEQMA4GA1UEBxMHU2FsZm9yZDEaMBgGA1UEChMR
Q09NT0RPIENBIExpbWl0ZWQxNjA0BgNVBAMTLUNPTU9ETyBSU0EgRG9tYWluIFZh
bGlkYXRpb24gU2VjdXJlIFNlcnZlciBDQTCCASIwDQYJKoZIhvcNAQEBBQADggEP
ADCCAQoCggEBAI7CAhnhoFmk6zg1jSz9AdDTScBkxwtiBUUWOqigwAwCfx3M28Sh
bXcDow+G+eMGnD4LgYqbSRutA776S9uMIO3Vzl5ljj4Nr0zCsLdFXlIvNN5IJGS0
Qa4Al/e+Z96e0HqnU4A7fK31llVvl0cKfIWLIpeNs4TgllfQcBhglo/uLQeTnaG6
ytHNe+nEKpooIZFNb5JPJaXyejXdJtxGpdCsWTWM/06RQ1A/WZMebFEh7lgUq/51
UHg+TLAchhP6a5i84DuUHoVS3AOTJBhuyydRReZw3iVDpA3hSqXttn7IzW3uLh0n
c13cRTCAquOyQQuvvUSH2rnlG51/ruWFgqUCAwEAAaOCAWUwggFhMB8GA1UdIwQY
MBaAFLuvfgI9+qbxPISOre44mOzZMjLUMB0GA1UdDgQWBBSQr2o6lFoL2JDqElZz
30O0Oija5zAOBgNVHQ8BAf8EBAMCAYYwEgYDVR0TAQH/BAgwBgEB/wIBADAdBgNV
HSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwGwYDVR0gBBQwEjAGBgRVHSAAMAgG
BmeBDAECATBMBgNVHR8ERTBDMEGgP6A9hjtodHRwOi8vY3JsLmNvbW9kb2NhLmNv
bS9DT01PRE9SU0FDZXJ0aWZpY2F0aW9uQXV0aG9yaXR5LmNybDBxBggrBgEFBQcB
AQRlMGMwOwYIKwYBBQUHMAKGL2h0dHA6Ly9jcnQuY29tb2RvY2EuY29tL0NPTU9E
T1JTQUFkZFRydXN0Q0EuY3J0MCQGCCsGAQUFBzABhhhodHRwOi8vb2NzcC5jb21v
ZG9jYS5jb20wDQYJKoZIhvcNAQEMBQADggIBAE4rdk+SHGI2ibp3wScF9BzWRJ2p
mj6q1WZmAT7qSeaiNbz69t2Vjpk1mA42GHWx3d1Qcnyu3HeIzg/3kCDKo2cuH1Z/
e+FE6kKVxF0NAVBGFfKBiVlsit2M8RKhjTpCipj4SzR7JzsItG8kO3KdY3RYPBps
P0/HEZrIqPW1N+8QRcZs2eBelSaz662jue5/DJpmNXMyYE7l3YphLG5SEXdoltMY
dVEVABt0iN3hxzgEQyjpFv3ZBdRdRydg1vs4O2xyopT4Qhrf7W8GjEXCBgCq5Ojc
2bXhc3js9iPc0d1sjhqPpepUfJa3w/5Vjo1JXvxku88+vZbrac2/4EjxYoIQ5QxG
V/Iz2tDIY+3GH5QFlkoakdH368+PUq4NCNk+qKBR6cGHdNXJ93SrLlP7u3r7l+L4
HyaPs9Kg4DdbKDsx5Q5XLVq4rXmsXiBmGqW5prU5wfWYQ//u+aen/e7KJD2AFsQX
j4rBYKEMrltDR5FL1ZoXX/nUh8HCjLfn4g8wGTeGrODcQgPmlKidrv0PJFGUzpII
0fxQ8ANAe4hZ7Q7drNJ3gjTcBpUC2JD5Leo31Rpg0Gcg19hCC0Wvgmje3WYkN5Ap
lBlGGSW4gNfL1IYoakRwJiNiqZ+Gb7+6kHDSVneFeO/qJakXzlByjAA6quPbYzSf
+AZxAeKCINT+b72x
-----END CERTIFICATE-----
2 s:C = GB, ST = Greater Manchester, L = Salford, O = COMODO CA Limited, CN = COMODO RSA Certification Authority
i:C = SE, O = AddTrust AB, OU = AddTrust External TTP Network, CN = AddTrust External CA Root
-----BEGIN CERTIFICATE-----
MIIFdDCCBFygAwIBAgIQJ2buVutJ846r13Ci/ITeIjANBgkqhkiG9w0BAQwFADBv
MQswCQYDVQQGEwJTRTEUMBIGA1UEChMLQWRkVHJ1c3QgQUIxJjAkBgNVBAsTHUFk
ZFRydXN0IEV4dGVybmFsIFRUUCBOZXR3b3JrMSIwIAYDVQQDExlBZGRUcnVzdCBF
eHRlcm5hbCBDQSBSb290MB4XDTAwMDUzMDEwNDgzOFoXDTIwMDUzMDEwNDgzOFow
gYUxCzAJBgNVBAYTAkdCMRswGQYDVQQIExJHcmVhdGVyIE1hbmNoZXN0ZXIxEDAO
BgNVBAcTB1NhbGZvcmQxGjAYBgNVBAoTEUNPTU9ETyBDQSBMaW1pdGVkMSswKQYD
VQQDEyJDT01PRE8gUlNBIENlcnRpZmljYXRpb24gQXV0aG9yaXR5MIICIjANBgkq
hkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAkehUktIKVrGsDSTdxc9EZ3SZKzejfSNw
AHG8U9/E+ioSj0t/EFa9n3Byt2F/yUsPF6c947AEYe7/EZfH9IY+Cvo+XPmT5jR6
2RRr55yzhaCCenavcZDX7P0N+pxs+t+wgvQUfvm+xKYvT3+Zf7X8Z0NyvQwA1onr
ayzT7Y+YHBSrfuXjbvzYqOSSJNpDa2K4Vf3qwbxstovzDo2a5JtsaZn4eEgwRdWt
4Q08RWD8MpZRJ7xnw8outmvqRsfHIKCxH2XeSAi6pE6p8oNGN4Tr6MyBSENnTnIq
m1y9TBsoilwie7SrmNnu4FGDwwlGTm0+mfqVF9p8M1dBPI1R7Qu2XK8sYxrfV8g/
vOldxJuvRZnio1oktLqpVj3Pb6r/SVi+8Kj/9Lit6Tf7urj0Czr56ENCHonYhMsT
8dm74YlguIwoVqwUHZwK53Hrzw7dPamWoUi9PPevtQ0iTMARgexWO/bTouJbt7IE
IlKVgJNp6I5MZfGRAy1wdALqi2cVKWlSArvX31BqVUa/oKMoYX9w0MOiqiwhqkfO
KJwGRXa/ghgntNWutMtQ5mv0TIZxMOmm3xaG4Nj/QN370EKIf6MzOi5cHkERgWPO
GHFrK+ymircxXDpqR+DDeVnWIBqv8mqYqnK8V0rSS527EPywTEHl7R09XiidnMy/
s1Hap0flhFMCAwEAAaOB9DCB8TAfBgNVHSMEGDAWgBStvZh6NLQm9/rEJlTvA73g
JMtUGjAdBgNVHQ4EFgQUu69+Aj36pvE8hI6t7jiY7NkyMtQwDgYDVR0PAQH/BAQD
AgGGMA8GA1UdEwEB/wQFMAMBAf8wEQYDVR0gBAowCDAGBgRVHSAAMEQGA1UdHwQ9
MDswOaA3oDWGM2h0dHA6Ly9jcmwudXNlcnRydXN0LmNvbS9BZGRUcnVzdEV4dGVy
bmFsQ0FSb290LmNybDA1BggrBgEFBQcBAQQpMCcwJQYIKwYBBQUHMAGGGWh0dHA6
Ly9vY3NwLnVzZXJ0cnVzdC5jb20wDQYJKoZIhvcNAQEMBQADggEBAGS/g/FfmoXQ
zbihKVcN6Fr30ek+8nYEbvFScLsePP9NDXRqzIGCJdPDoCpdTPW6i6FtxFQJdcfj
Jw5dhHk3QBN39bSsHNA7qxcS1u80GH4r6XnTq1dFDK8o+tDb5VCViLvfhVdpfZLY
Uspzgb8c8+a4bmYRBbMelC1/kZWSWfFMzqORcUx8Rww7Cxn2obFshj5cqsQugsv5
B5a6SE2Q8pTIqXOi6wZ7I53eovNNVZ96YUWYGGjHXkBrI/V5eu+MtWuLt29G9Hvx
PUsE2JOAWVrgQSQdso8VYFhH2+9uRv0V9dlfmrPb2LjkQLPNlzmuhbsdjrzch5vR
pu/xO28QOG8=
-----END CERTIFICATE-----
```
the solution is described here:
https://www.agwa.name/blog/post/fixing_the_addtrust_root_expiration
you have to stop sending the expired intermediate cert because your own python client can't handle it
> Fortunately, modern clients with well-written certificate validators (this includes all mainstream web browsers) won't have a problem with the expiration. Since they trust the USERTrust RSA Certification Authority root, they will build a chain to that root and ignore the fact that the server sent an expired intermediate certificate.
>Other clients, notably anything using OpenSSL 1.0.x or GnuTLS, will have a problem. Even if these clients trust the USERTrust RSA Certification Authority root, and could build a chain to it if they wanted, they'll end up building a chain to AddTrust External CA Root instead, causing the certificate validation to fail with an expired certificate error.
>Fixing this problem as a server operator
Basically, you need to remove the intermediate certificate issued by AddTrust External CA Root from your certificate chain.
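A minimal way to observe the failure from Python directly (assuming an environment whose OpenSSL build, like the bundled Python 3.4 one above, chains to the expired AddTrust root):
```python
import ssl
import urllib.request
import urllib.error

# On affected OpenSSL/Python builds this raises CERTIFICATE_VERIFY_FAILED,
# matching the youtube-dl traceback above; fixed chains validate fine.
try:
    urllib.request.urlopen("https://yt-dl.org/", context=ssl.create_default_context())
    print("certificate chain validated")
except urllib.error.URLError as err:
    print("validation failed:", err.reason)
```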
|
cant-reproduce
|
low
|
Critical
|
631,443,538 |
flutter
|
Reference material spec for text scaling when available in AppBar
|
Follow up from https://github.com/flutter/flutter/issues/58093
When the Material spec for accessibility text scaling is available, add a reference to it in the docs to educate and explain the text scale factor cap of `1.34` for titles.
|
framework,f: material design,a: typography,team-design,triaged-design
|
low
|
Minor
|
631,475,005 |
pytorch
|
[JIT] OrderedDict doesn't support custom objects
|
There seems to be an inconsistency with how `OrderedDict` is handled by TorchScript compared to `dict`.
Minimal example illustrating that `OrderedDict` doesn't seem to support custom objects, while `dict` does.
```python
import torch; from typing import Dict; from collections import OrderedDict
class C:
def __init__(self, x):
self.x = x
def f(x):
# this fails with the below error
d : Dict[str, C] = OrderedDict()
# this works
# d : Dict[str, C] = {}
d['a'] = C(x)
d['b'] = C(2 * x)
return d
ff = torch.jit.script(f)
```
with the error message
```python
Traceback (most recent call last):
File "tst.py", line 17, in <module>
ff = torch.jit.script(f)
File "/Users/fmassa/anaconda3/lib/python3.6/site-packages/torch/jit/__init__.py", line 1369, in script
fn = torch._C._jit_script_compile(qualified_name, ast, _rcb, get_default_args(obj))
RuntimeError:
Variable 'd' is annotated with type Dict[str, __torch__.C] but is being assigned to a value of type Dict[str, Tensor]:
File "tst.py", line 9
def f(x):
d : Dict[str, C] = OrderedDict()
~ <--- HERE
d['a'] = C(x)
d['b'] = C(2 * x)
```
cc @suo
|
triage review,oncall: jit
|
low
|
Critical
|
631,480,699 |
pytorch
|
Dataloader._shutdown_workers hangs
|
## 🐛 Bug
Dataloader._shutdown_workers hangs unexpectedly. The program has to be killed with Ctrl-C.
```
File "iterative_clustering.py", line 80, in calculate_features
for batch in tqdm(dataloader, desc=f"Features (pass {i})"):
File "/data1/mschroeder/miniconda3/envs/20-ssdc/lib/python3.7/site-packages/tqdm/std.py", line 1119, in __iter__
for obj in iterable:
File "/data1/mschroeder/miniconda3/envs/20-ssdc/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 571, in __next__
self._shutdown_workers()
File "/data1/mschroeder/miniconda3/envs/20-ssdc/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 659, in _shutdown_workers
w.join()
File "/data1/mschroeder/miniconda3/envs/20-ssdc/lib/python3.7/multiprocessing/process.py", line 140, in join
res = self._popen.wait(timeout)
File "/data1/mschroeder/miniconda3/envs/20-ssdc/lib/python3.7/multiprocessing/popen_fork.py", line 48, in wait
return self.poll(os.WNOHANG if timeout == 0.0 else 0)
File "/data1/mschroeder/miniconda3/envs/20-ssdc/lib/python3.7/multiprocessing/popen_fork.py", line 28, in poll
pid, sts = os.waitpid(self.pid, flag)
KeyboardInterrupt
```
## To Reproduce
```python
features = np.zeros((n_samples, n_features))
model.eval()
with torch.no_grad():
for i in range(n_passes):
offset = 0
for batch in tqdm(dataloader, desc=f"Features (pass {i})"):
batch_images = torch.rot90(batch["image"].cuda(), i, (-2, -1))
assert (
batch_images.size() == batch["image"].size()
), f"{batch_images.size()} vs. {batch['image'].size()}"
batch_features = model.forward(batch_images)
batch_features = torch.flatten(batch_features, 1).cpu().numpy()
batch_size = batch_features.shape[0]
features[offset : offset + batch_size] += batch_features
offset += batch_size
```
The error happens in the fourth (last) iteration of `for i in range(n_passes)`. `num_workers` is 8.
## Expected behavior
I would expect the loop to terminate without problems.
## Environment
PyTorch version: 1.1.0
Is debug build: No
CUDA used to build PyTorch: 9.0.176
OS: Linux Mint 18.1 Serena
GCC version: (Ubuntu 5.4.0-6ubuntu1~16.04.12) 5.4.0 20160609
CMake version: version 3.5.1
Python version: 3.7
Is CUDA available: Yes
CUDA runtime version: 8.0.61
GPU models and configuration: GPU 0: GeForce GTX TITAN X
Nvidia driver version: 384.98
cuDNN version: Could not collect
Versions of relevant libraries:
[pip] numpy==1.18.1
[pip] pytorch-lightning==0.7.2.dev0
[pip] torch==1.1.0
[pip] torchvision==0.3.0
[conda] blas 1.0 mkl
[conda] cudatoolkit 9.0 h13b8566_0
[conda] mkl 2020.1 217
[conda] mkl-service 2.3.0 py37he904b0f_0
[conda] mkl_fft 1.0.15 py37ha843d7b_0
[conda] mkl_random 1.1.1 py37h0573a6f_0
[conda] numpy 1.18.1 py37h4f9e942_0
[conda] numpy-base 1.18.1 py37hde5b4d6_1
[conda] pytorch 1.1.0 py3.7_cuda9.0.176_cudnn7.5.1_0 pytorch
[conda] pytorch-lightning 0.7.2.dev0 pypi_0 pypi
[conda] torchvision 0.3.0 py37_cu9.0.176_1 pytorch
## Additional info
When examining the code, I see no reason why this wouldn't work. A work-around may be passing a timeout to `w.join()` at the expense of leaving zombie processes around.
cc @SsnL
|
module: multiprocessing,module: dataloader,triaged
|
low
|
Critical
|
631,523,357 |
react-native
|
TextInput with textAlign on iOS does not add an ellipsis, instead wraps
|
```
import React, { PureComponent } from 'react';
import {Text, TextInput, View} from 'react-native';
const App = () => {
return (
<View style={{flex: 1, justifyContent: 'center'}}>
<View>
<Text>Text Input without alignment</Text>
<TextInput style={{padding: 10, margin: 20, borderColor: 'grey', borderWidth: 1}}/>
<Text>Text Input with right alignment</Text>
<TextInput textAlign={'right'} style={{padding: 10, margin: 20, borderColor: 'grey', borderWidth: 1}}/>
</View>
<TextInput/>
</View>);
};
export default App;
```
Write some long text into the first field with spaces.

Write similar text into the second field.

Unfocus second input
Expected: both fields will have ellipsis at the end
Actual: second input with textAlign doesn't have ellipsis.
The Android app works in a different way:
It never shows an ellipsis, but if you type a long string it always shows the end of it and never the beginning. I guess it should move to the beginning when unfocused.
react: 16.11.0 => 16.11.0
react-native: 0.62.1 => 0.62.1
|
Platform: iOS,Component: TextInput,Needs: Environment Info,Needs: Attention
|
medium
|
Critical
|
631,531,169 |
terminal
|
Option to create desktop shortcuts to profiles
|
<!--
π¨π¨π¨π¨π¨π¨π¨π¨π¨π¨
I ACKNOWLEDGE THE FOLLOWING BEFORE PROCEEDING:
1. If I delete this entire template and go my own path, the core team may close my issue without further explanation or engagement.
2. If I list multiple bugs/concerns in this one issue, the core team may close my issue without further explanation or engagement.
3. If I write an issue that has many duplicates, the core team may close my issue without further explanation or engagement (and without necessarily spending time to find the exact duplicate ID number).
4. If I leave the title incomplete when filing the issue, the core team may close my issue without further explanation or engagement.
5. If I file something completely blank in the body, the core team may close my issue without further explanation or engagement.
All good? Then proceed!
-->
# Description of the new feature/enhancement
As a user, I'd like to be able to easily create desktop shortcuts that open a specific profile. This could be done e.g. via right clicking a profile in the profile dropdown and selecting an action "Create desktop shortcut".
For context, I use a Powershell Core profile to do software updates and a WSL shell to do some dev work and I currently cannot find a way to quickly run one of those; I always start the default Powershell profile and then manually open WSL and close Powershell which seems cumbersome.
<!--
A clear and concise description of what the problem is that the new feature would solve.
Describe why and how a user would use this new functionality (if applicable).
-->
<!--
A clear and concise description of what you want to happen.
-->
|
Issue-Feature,Help Wanted,Area-UserInterface,Product-Terminal
|
medium
|
Critical
|
631,538,428 |
flutter
|
Google Maps Flutter Navigation Toolbar
|
I can see that we have the ability to enable/disable the navigation toolbar (mapToolbarEnabled) on the map but even when enabled, the toolbar only appears when the red pin is clicked on.
Can we have an option to force the navigation toolbar to always be showing when enabled, as it may not be obvious to the end user that they need to click on the red pin to navigate to Maps directly from there for directions?
|
c: new feature,p: maps,package,c: proposal,team-ecosystem,P3,triaged-ecosystem
|
low
|
Minor
|
631,582,306 |
node
|
Async Hooks and Streams
|
Continuing a little bit from https://github.com/nodejs/node/issues/33723.
Do we need closer integration between async hooks and streams? In particular, since `Stream.destroy` can be invoked from basically anywhere, the `'close'` event can be emitted in an async scope that is unexpected for the user (not sure yet about the correct terminology in the async hooks context).
What currently seems to be the way to approach this is to monkey patch `destroy` after creating a stream, e.g.
```js
const stream = new Duplex(...)
stream.destroy = asyncResource.runInAsyncScope.bind(asyncResource, stream.destroy, stream)
```
Maybe it would make sense to be able to provide an `asyncId` or `asyncTriggerId` (not sure of the difference yet) as a constructor argument?
|
stream,async_hooks
|
low
|
Minor
|
631,650,749 |
flutter
|
Overscroll behavior should default to off for touch on ChromeOS
|
Go to gallery.flutter.dev on a ChromeOS machine.
If you scroll using the trackpad (2 finger scroll) no overscroll behavior is present.
However if you scroll using the touchscreen, you get the Android-style overscroll glow.
Everywhere else I can find in ChromeOS (web pages in Chrome, the application launcher, the settings dialog) does not exhibit the overscroll glow behavior.
I expect if I tried an Android app on Chrome OS it would show overscroll however?
This is a tiny issue, just putting it on file.
|
framework,f: scrolling,platform-chromebook,platform-web,P3,team-web,triaged-web
|
low
|
Minor
|
631,653,887 |
go
|
runtime: read-only accessor for gcpercent
|
I would like to programmatically access the current value of `GOGC` in my program (for logging it, with various runtime stats).
For `GOMAXPROCS`, I can call `runtime.GOMAXPROCS(0)` to get the current value without changing it. However, for `GOGC`, I see no way of reading the value without writing. I could do a two-step call like
```
percent := debug.SetGCPercent(100)
debug.SetGCPercent(percent)
```
but I am worried about the disruption that this may cause in a program that is under load and that has a non-standard `GOGC` value in the first place.
The documentation for SetGCPercent says that a negative value will disable GC, however there is no documented effect for calling it with 0. Maybe this could be defined to just return the current value, like in GOMAXPROCS? This would just be a matter of making `runtime.setGCPercent` return early in this case.
|
NeedsInvestigation,FeatureRequest,compiler/runtime
|
low
|
Critical
|
631,659,453 |
go
|
net/http/cookiejar: strips additional information
|
https://github.com/golang/go/blob/50bd1c4d4eb4fac8ddeb5f063c099daccfb71b26/src/net/http/cookiejar/jar.go#L219
Is there a reason to leave out everything apart from the name and value here?
I'd like to know how much time a cookie has left to live.
|
NeedsInvestigation
|
low
|
Major
|
631,715,033 |
svelte
|
Error in bidi transitions when returning a function
|
https://svelte.dev/repl/283d750225684048b26cf6880dfcea2d?version=3.23.0
`Cannot read property 'r' of undefined`
https://github.com/sveltejs/svelte/blob/dba6e5efadb8aaf86ca25022cd3279bb4ea42434/src/runtime/internal/transitions.ts#L337-L342
`wait.then` resolves after `check_outros` meaning that `outros` will be the reassigned to its parent group, which is `undefined` in this case
|
awaiting submitter,stale-bot,temp-stale
|
low
|
Critical
|
631,748,597 |
terminal
|
Notification after a long running command finishes
|
<!--
π¨π¨π¨π¨π¨π¨π¨π¨π¨π¨
I ACKNOWLEDGE THE FOLLOWING BEFORE PROCEEDING:
1. If I delete this entire template and go my own path, the core team may close my issue without further explanation or engagement.
2. If I list multiple bugs/concerns in this one issue, the core team may close my issue without further explanation or engagement.
3. If I write an issue that has many duplicates, the core team may close my issue without further explanation or engagement (and without necessarily spending time to find the exact duplicate ID number).
4. If I leave the title incomplete when filing the issue, the core team may close my issue without further explanation or engagement.
5. If I file something completely blank in the body, the core team may close my issue without further explanation or engagement.
All good? Then proceed!
-->
# Description of the new feature/enhancement
<!--
A clear and concise description of what the problem is that the new feature would solve.
Describe why and how a user would use this new functionality (if applicable).
-->
It would be awesome to add an option for a notification bubble or sound after a long-running command. What counts as a long-running command could be configurable, but a sane default could be 10 seconds.
The scenario for this would be to switch users from a pull to a push model. Without notifications, a user has to constantly "pull" their terminal to see if their command is done. Switching to push removes the cognitive overhead of having to constantly pull; the user is instead notified only once the command is done running.
# Current workarounds
* Run a script that creates a notification after the command is done: `long_command; notification.py`
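For instance, the `notification.py` mentioned above could be as small as the following hypothetical sketch (standard library only; the message text and MessageBox usage are placeholders, and the pop-up call is Windows-specific):
```python
# Hypothetical notification.py: ring the terminal bell and pop a message
# box so a long-running command announces that it has finished.
import ctypes
import sys

message = sys.argv[1] if len(sys.argv) > 1 else "Command finished"
print("\a", end="", flush=True)                        # terminal bell
ctypes.windll.user32.MessageBoxW(None, message, "Windows Terminal", 0)
```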
## Extra asks
Beyond allowing for notifications, it would also be helpful if users were provided a hook. Here are some scenarios the hook can cover:
* Hitting an API to report the run length of the command (Useful for engineering systems)
* Changing the color of your smart LEDs (Think of all the things Makers could do!)
* Context-aware notifications (if the script name contains 'foo' then do this, otherwise do that)
# Other Terminal support
* iTerm has this as a feature: https://www.iterm2.com/documentation-shell-integration.html#:~:text=Alert%20on%20next%20mark,in%20another%20window%20or%20tab.
# Proposed technical implementation details (optional)
<!--
A clear and concise description of what you want to happen.
-->
|
Issue-Feature,Area-Extensibility,Product-Terminal
|
medium
|
Critical
|
631,857,976 |
TypeScript
|
[javascript] Find all references for module.exports = { foo: foo }
|
*TS Template added by @mjbvz*
**TypeScript Version**: 4.0.0-dev.20200605
**Search Terms**
- JavaScript
- find all references (f12)
---
Code
// main.ts
```
import { foo } from './lib.js';
foo();
```
// lib.js
```
function foo() {
console.log("hello world")
}
module.exports = { foo: foo }
```
Run find all references on **foo()** in **main.js**
**Expected behavior:**
foo references in both files are returned.
**Actual behavior:**
Only references in main are returned
Also, I created **jsconfig.json** in the project root but it still does not work.
```
{
"compilerOptions": {
"target": "ES6"
},
"exclude": [
"node_modules",
"**/node_modules/*"
]
}
```
And many other examples:
```
{
"compilerOptions": {
"target": "esnext",
"baseUrl": "./",
"jsx": "react",
},
"include": [
"./src/js/**/*"
],
}
```
I use React and JavaScript in the actual project.
|
Bug
|
low
|
Minor
|
631,890,967 |
flutter
|
MediaQueryData.viewInsets might not be the right thing to use on iOS keyboard
|
The default behavior for iOS keyboards might be closer to media query padding rather than a view inset.
In an iOS app, such as iMessage, open the keyboard and scroll. The content will scroll behind the translucent keyboard. This doesn't happen by default in Flutter since our Scaffold and CupertinoScaffold default to resizeToAvoidBottomInset=true. This makes it unlikely for any content to be behind the keyboard, which loses the "frosted glass" look and feel.
|
platform-ios,framework,f: material design,a: fidelity,f: cupertino,team-design,triaged-design
|
low
|
Minor
|
631,970,635 |
TypeScript
|
Add auto-completion results to CSSStyleDeclaration.{setProperty,removeProperty}
|
## Search Terms
setProperty
## Suggestion
[`CSSStyleDeclaration.setProperty(propertyName, value, priority);`](https://developer.mozilla.org/en-US/docs/Web/API/CSSStyleDeclaration/setProperty) is currently typed as `setProperty(property: string, value: string | null, priority?: string | undefined): void`.
There is a specific set of defined properties in CSS: https://www.w3.org/TR/CSS/#properties It would be great if this specific list of properties could be added to the definition, such that auto-completion would show the allowed values.
(The same is true for https://developer.mozilla.org/en-US/docs/Web/API/CSSStyleDeclaration/removeProperty so this feature request applies to that method as well)
For backwards compatibility reasons (and to be spec-compliant), a `string` overload would remain. That's because the following is legal, albeit a noop:
```js
element.style.setProperty('non-existent', 'something');
```
## Use Cases
Improved editor experience for style APIs.
## Examples
With my cursor at `|`, ask for the auto-completions. The expected behavior would be to show the list of accepted properties, similar to how `document.createElement` works.
```
element.style.setProperty('|
```
## Checklist
My suggestion meets these guidelines:
* [x] This wouldn't be a breaking change in existing TypeScript/JavaScript code
* [x] This wouldn't change the runtime behavior of existing JavaScript code
* [x] This could be implemented without emitting different JS based on the types of the expressions
* [x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, etc.)
* [x] This feature would agree with the rest of [TypeScript's Design Goals](https://github.com/Microsoft/TypeScript/wiki/TypeScript-Design-Goals).
|
Suggestion,Awaiting More Feedback
|
low
|
Minor
|
631,979,433 |
pytorch
|
friend constexpr in templated struct loses constexpr-ness in nvcc
|
This fails to compile in nvcc:
```
template <typename T>
struct A {
  constexpr A(T x) : x_(x) {}
  constexpr bool equals(const A& other) { return x_ == other.x_; }
  friend constexpr bool func_equals(const A& self, const A& other) { return self.x_ == other.x_; }
  T x_;
};

constexpr bool ok(int a, int b) {
  return A<int>(a).equals(b);
}

constexpr bool bad(int a, int b) {
  return func_equals(A<int>(a), A<int>(b));
}
```
This results in the error:
```
test.cu: In function ‘constexpr bool bad(int, int)’:
test.cu:14:19: error: call to non-constexpr function ‘bool func_equals(const A<int>&, const A<int>&)’
   return func_equals(A<int>(a), A<int>(b));
          ~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~
```
If you run with `--verbose --keep` and look at the cudafe output, you can see that the `constexpr` is dropped:
```
# 1 "test.cu"
template< class T>·
# 2
struct A {·
# 3
constexpr A(T x) : x_(x) { }·
# 4
constexpr bool equals(const A &other) { return (x_) == (other.x_); }·
# 5
friend inline bool func_equals(const A &self, const A &other) { return (self.x_) == (other.x_); }·
# 6
T x_;·
```
I've filed this as nvbug https://developer.nvidia.com/nvidia_bug/3010385
In the meantime, we MUST NOT define constexpr friend functions. It is usually simple enough to work around this some other way (define the function as a method or as a free function). Examples of constexpr friend functions affected by this are the `c10::string_view` `operator==` overloads.
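A minimal sketch of the free-function workaround (illustrative only; this is not the actual `c10::string_view` change):
```cpp
// Sketch: replace the constexpr friend with a free function template,
// which nvcc keeps constexpr.
template <typename T>
struct A {
  constexpr A(T x) : x_(x) {}
  constexpr bool equals(const A& other) const { return x_ == other.x_; }
  T x_;
};

template <typename T>
constexpr bool func_equals(const A<T>& self, const A<T>& other) {
  return self.equals(other);
}

constexpr bool now_ok(int a, int b) {
  return func_equals(A<int>(a), A<int>(b));  // constexpr-evaluable again
}
```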
|
triaged
|
low
|
Critical
|
632,009,140 |
flutter
|
in_app_purchase - cancel subscription and restore cancelled subscription
|
Would it be possible to add the following functionality to the in_app_purchase plugin?
- Cancel Subscription
- Restore Cancelled Subscription (When cancelled but not yet expired)
At the moment the user has to do those actions via the App Store / Google Play.
|
c: new feature,p: in_app_purchase,package,team-ecosystem,P3,triaged-ecosystem
|
low
|
Major
|
632,048,785 |
PowerToys
|
[Run][Calculator Plugin] Calculator Expansion
|
# Summary of the new feature/enhancement
Expand the calculator by implementing bit shifting, modulus operations, and bitwise operators, as in macOS's Spotlight, and by allowing implicit multiplication when a number is adjacent to a parenthesized number.
# Proposed technical implementation details (optional)
Examples:
- 5 << 4 * 2 = **1280**
- (24 >> 2) - 1 = **5**
- 5 % 3 = **2**
- 15 & 3 = **3**
- 26 | 5 = **31**
- 2(40) = **80**
|
Idea-Enhancement,Product-PowerToys Run,Run-Plugin
|
low
|
Major
|
632,137,329 |
PowerToys
|
[PowerRename] Add an option to reverse the order of files
|
I need to be able to reverse filenames for automatic import into a program. My files are named 3a, 3b, 3c, etc. Since I don't know how many I am going to have to create, I need PowerRename to delete the letters (which can be done with the regex `[a-zA-Z]`) and then enumerate the items in reverse order, so that 3c comes out as "3 (1)", 3b comes out as "3 (2)", and so on. This would be really helpful and would make this use case much faster for me and for anyone else who needs to do something similar.
|
Idea-Enhancement,Product-PowerRename
|
low
|
Major
|
632,137,554 |
kubernetes
|
Unable to bind another pvc to a volume after original pvc is deleted
|
**What happened**:
When I have a volume and a pvc, and then delete the pvc, the volume is "Released", but the ClaimRef never gets removed from the volume, and it seems like this prevents me from ever binding another claim to this volume.
**What you expected to happen**:
After I delete a PVC that is bound to a volume, I expect the ClaimRef to no longer contain the deleted PVC's information, and I should be able to bind a different PVC to the volume.
**How to reproduce it (as minimally and precisely as possible)**:
First, create a volume and pvc bound to it
```
$ kubectl apply -f - << EOF
kind: StorageClass
apiVersion: storage.k8s.io/v1
metadata:
  name: sc-test
provisioner: kubernetes.io/no-provisioner
volumeBindingMode: Immediate
---
kind: PersistentVolume
apiVersion: v1
metadata:
  name: pv-test
spec:
  storageClassName: sc-test
  capacity:
    storage: 2Gi
  accessModes:
    - ReadWriteOnce
  persistentVolumeReclaimPolicy: Retain
  local:
    path: /tmp/pv-test
  nodeAffinity:
    required:
      nodeSelectorTerms:
        - matchExpressions:
            - key: kubernetes.io/hostname
              operator: In
              values:
                - k8s-worker-1
---
kind: PersistentVolumeClaim
apiVersion: v1
metadata:
  name: pvc-test
spec:
  storageClassName: sc-test
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 2Gi
EOF
```
Then delete the pvc
```
$ kubectl delete pvc pvc-test
```
Check the volume (it looks fine, status is Released)...
```
$ kubectl get pv pv-test
NAME      CAPACITY   ACCESS MODES   RECLAIM POLICY   STATUS     CLAIM              STORAGECLASS   REASON   AGE
pv-test   2Gi        RWO            Retain           Released   default/pvc-test   sc-test                 9s
```
Check the ClaimRef field, and it still has the information from pvc-test, which was deleted...
```
$ kubectl get pv pv-test -o=json | jq .spec.claimRef
{
  "apiVersion": "v1",
  "kind": "PersistentVolumeClaim",
  "name": "pvc-test",
  "namespace": "default",
  "resourceVersion": "631479",
  "uid": "3c536cd4-656d-454d-b69a-8343e42f5d4b"
}
```
I waited around a while, thinking maybe the garbage collector or some other mechanism might clean this up, but it doesn't seem to.
Then, if I try to bind a different claim to the volume, it gets stuck in Pending status...
```
kubectl apply -f - << EOF
kind: PersistentVolumeClaim
apiVersion: v1
metadata:
  name: pvc-test2
spec:
  storageClassName: sc-test
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 2Gi
EOF
```
Check after a while...
```
$ kubectl get pvc
NAME        STATUS    VOLUME   CAPACITY   ACCESS MODES   STORAGECLASS   AGE
pvc-test2   Pending                                      sc-test        5m30s
```
Check the ClaimRef field again, and it is still referencing the original claim, pvc-test, which was deleted...
```
$ kubectl get pv pv-test -o=json | jq .spec.claimRef
{
  "apiVersion": "v1",
  "kind": "PersistentVolumeClaim",
  "name": "pvc-test",
  "namespace": "default",
  "resourceVersion": "631479",
  "uid": "3c536cd4-656d-454d-b69a-8343e42f5d4b"
}
```
**Anything else we need to know?**:
I found these similar issues from 4 years ago:
* https://github.com/kubernetes/kubernetes/issues/20753
* https://github.com/kubernetes/kubernetes/issues/27164
But I don't think it was ever really fixed.
The workaround suggested in one of those issues was to patch the pv like this:
```
kubectl patch pv pv-test -p '{"spec":{"claimRef": null}}'
```
...but that doesn't seem like an ideal solution long-term.
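A related approach (a sketch only, not something I have verified here) is to pre-bind the replacement claim to the volume explicitly via `volumeName`; note that this still requires the stale claimRef to be cleared first:
```
kind: PersistentVolumeClaim
apiVersion: v1
metadata:
  name: pvc-test2
spec:
  storageClassName: sc-test
  volumeName: pv-test   # explicit pre-binding; only helps once the stale claimRef is gone
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 2Gi
```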
I also suspect/wonder if this problem *might* be responsible for some of the storage e2e test flakes that have been occurring, where pods never become ready. I ran into this while trying to troubleshoot e2e flakes, but it is hard to know for sure if this is the cause. The symptoms are similar though (timeout).
**Environment**:
- Kubernetes version (use `kubectl version`):
```
Client Version: version.Info{Major:"1", Minor:"18", GitVersion:"v1.18.0", GitCommit:"9e991415386e4cf155a24b1da15becaa390438d8", GitTreeState:"clean", BuildDate:"2020-03-25T14:58:59Z", GoVersion:"go1.13.8", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"19+", GitVersion:"v1.19.0-beta.1-55-gb8b4186a14045a.dev-1591390939", GitCommit:"b8b4186a14045ab66b150b5a92276d02b8a73a3e", GitTreeState:"clean", BuildDate:"2020-06-05T21:02:50Z", GoVersion:"go1.13.9", Compiler:"gc", Platform:"linux/amd64"}
```
- Cloud provider or hardware configuration: On-prem self-hosted cluster, but I also tested using kind
- OS (e.g: `cat /etc/os-release`):
```
NAME="Ubuntu"
VERSION="20.04 LTS (Focal Fossa)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 20.04 LTS"
VERSION_ID="20.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=focal
UBUNTU_CODENAME=focal
```
- Kernel (e.g. `uname -a`):
```
Linux k8s-master 5.4.0-33-generic #37-Ubuntu SMP Thu May 21 12:53:59 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
```
- Install tools:
- Network plugin and version (if this is a network-related bug):
- Others:
|
kind/bug,sig/storage,lifecycle/frozen
|
medium
|
Critical
|
632,153,440 |
godot
|
get_cell_autotile_coord gives the same result whether the tile is at position (0, 0) in the tileset or the cell doesn't have autotiling
|
**Godot version:**
3.2.1.stable.official
**OS/device including version:**
Windows10
**Issue description:**
The GDScript function get_cell_autotile_coord() returns a zero vector when the cell doesn't have autotiling. See documentation: https://docs.godotengine.org/en/stable/classes/class_tilemap.html#class-tilemap-method-get-cell-autotile-coord
What the function returns is a Vector2 containing the coordinates of the tile in the tileset. But the first tile in the tileset is at position (0, 0), so the result is exactly the same as if the cell doesn't have autotiling.
So the function can return misleading information.
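In the meantime a caller has to disambiguate manually. A sketch of a workaround, assuming the 3.2 TileMap/TileSet API (the helper name is made up for illustration):
```gdscript
# Sketch of a workaround; returns null when no meaningful subtile coordinate exists.
func get_autotile_coord_or_null(tilemap, x, y):
    var id = tilemap.get_cell(x, y)
    if id == TileMap.INVALID_CELL:
        return null  # empty cell
    if tilemap.tile_set.tile_get_tile_mode(id) == TileSet.SINGLE_TILE:
        return null  # single tile, so (0, 0) carries no autotile information
    return tilemap.get_cell_autotile_coord(x, y)
```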
**Steps to reproduce:**
Execute the minimal project linked. You'll see that the cell at coordinates (0, 0) gets the same result from get_cell_autotile_coord as the cell at (4, 0), even though that one is an empty cell, so there is obviously no autotiling there.
**Minimal reproduction project:**
[Issue_TileMap.zip](https://github.com/godotengine/godot/files/4739241/Issue_TileMap.zip)
I took the liberty of checking tile_map.cpp, in case I can add a small contribution to the project.
Maybe the constant INVALID_CELL could be returned in both coordinates of the Vector2 to handle this case:

|
enhancement,topic:core
|
low
|
Major
|