Usage on Windows

#7
by tombr - opened

Might be an idea to add instructions for downloading on Windows to the front page; this worked for me:

python -m venv .env
.\.env\Scripts\Activate.ps1
pip install datasets
python.exe -m pip install --upgrade pip
Invoke-WebRequest -Uri "https://data.together.xyz/redpajama-data-1T/v1.0.0/urls.txt" -OutFile "urls.txt"
Get-Content urls.txt | Foreach-Object { Invoke-WebRequest -Uri $_ -OutFile (Split-Path -Path $_ -Leaf) }

Not sure if it works, but it's downloading something.

cheers
tom

Together org

Thanks! Does it download the files into different folders? (see #9 - there are duplicate file names across CC)

No it doesn't, my bad; I didn't think about that, it was very late.


I should also probably set up a parallel download; this will take a long time. But hey, I've got the time.


I'll use this when I restart, or in a similar situation, on the assumption that the file names stay in the same order:
Get-Content urls.txt | ForEach-Object {
    $originalFilename = Split-Path -Path $_ -Leaf
    $filename = $originalFilename
    $counter = 1

    # If a file with this name already exists, append _1, _2, ... before the extension
    # until we find a name that is free.
    while (Test-Path $filename) {
        $fileExtension = [System.IO.Path]::GetExtension($originalFilename)
        $fileNameWithoutExtension = [System.IO.Path]::GetFileNameWithoutExtension($originalFilename)
        $filename = $fileNameWithoutExtension + "_" + $counter + $fileExtension
        $counter++
    }

    Invoke-WebRequest -Uri $_ -OutFile $filename
}
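The same collision-handling logic, sketched in Python for anyone driving the download from a script rather than PowerShell. The `dedupe_name` helper is my own illustration, not part of any RedPajama tooling; it takes an `exists` predicate (e.g. `os.path.exists`) so the logic can be checked without touching disk.

```python
import os

def dedupe_name(original, exists):
    """Return a non-colliding filename by appending _1, _2, ... before the
    extension, mirroring the PowerShell while-loop above. `exists` is a
    predicate such as os.path.exists."""
    name = original
    counter = 1
    while exists(name):
        stem, ext = os.path.splitext(original)
        name = f"{stem}_{counter}{ext}"
        counter += 1
    return name
```

For example, if `part.jsonl` and `part_1.jsonl` are already taken, `dedupe_name("part.jsonl", taken.__contains__)` yields `part_2.jsonl`. Like the PowerShell version, this only works if the URLs are processed in the same order every run.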

I'll look into hash file comparison later.
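A minimal sketch of what that hash comparison could look like, assuming SHA-256 is an acceptable checksum (the `sha256_of` helper name is my own). Streaming in chunks keeps memory flat even for multi-GB shards.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 in 1 MiB chunks so large dumps
    never need to fit in memory; returns the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Two downloads with the same digest are byte-identical, so this can both detect true duplicates and verify a re-download after a restart.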

Cheers

tom

tombr changed discussion status to closed
tombr changed discussion status to open
Together org

Thanks! I don't have a Windows machine to test on, so it's very helpful here :)

If it helps, this is a download script for *nix-based machines:

wget 'https://data.together.xyz/redpajama-data-1T/v1.0.0/urls.txt'
while read -r line; do
    dload_loc=${line#https://data.together.xyz/redpajama-data-1T/v1.0.0/}
    mkdir -p "$(dirname "$dload_loc")"
    wget "$line" -O "$dload_loc"
done < urls.txt
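For reference, a rough Python equivalent of the same idea: strip the common URL prefix so each file lands in its subset's directory, which is what prevents the duplicate-basename problem. The `local_path` and `download_all` helpers and the example path are illustrative only, and `str.removeprefix` needs Python 3.9+.

```python
import os
import urllib.request

BASE = "https://data.together.xyz/redpajama-data-1T/v1.0.0/"

def local_path(url):
    """Map a URL from urls.txt to a relative path that keeps the directory
    layout, so files from different subsets can no longer collide."""
    return url.strip().removeprefix(BASE)

def download_all(urls_file="urls.txt"):
    """Download every URL listed in urls_file, recreating the remote layout."""
    with open(urls_file) as f:
        for line in f:
            dest = local_path(line)
            os.makedirs(os.path.dirname(dest) or ".", exist_ok=True)
            urllib.request.urlretrieve(line.strip(), dest)
```

For example, a (hypothetical) entry `BASE + "subset/shard_0000.jsonl"` would be saved to `subset/shard_0000.jsonl` locally.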

The download works fine; the problem is that it takes a looooong time. I worked out it would take about a week, if not longer, since my connection is slow. I'm working on another solution at the moment.

I'll try out your download script on the first couple of files in each set and let you know.

wget 'https://data.together.xyz/redpajama-data-1T/v1.0.0/urls.txt'
cat urls.txt | xargs -n 1 -P 8 wget --no-cache --no-cookies --timeout=5

For a little more parallel action... but be kind to the server(s). This will put all the files in the same directory you run it from (should work on most *nix systems).
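A rough Python equivalent of that parallel download, using a thread pool in place of `xargs -P`. The `fetch`/`fetch_all` helpers are hypothetical; as with the one-liner, everything lands in the current directory, so basename collisions across subsets still apply.

```python
from concurrent.futures import ThreadPoolExecutor
import urllib.request

def fetch(url):
    """Download one URL into the current directory, named by its basename
    (same flat layout as the xargs one-liner)."""
    url = url.strip()
    name = url.rsplit("/", 1)[-1]
    urllib.request.urlretrieve(url, name)
    return name

def fetch_all(urls, fetch_one=fetch, workers=8):
    # 8 workers mirrors xargs -P 8; keep it modest to be kind to the servers.
    # fetch_one is injectable so the fan-out can be tested without a network.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetch_one, urls))
```

`pool.map` preserves input order, so the returned list of filenames lines up with `urls.txt`.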

Thanks Emma. The problem is that I live in the bush near Darwin, at the top of Australia; my internet is as slow as a wet weekend, especially in the wet season. I just don't have university or researcher bandwidth. I'll give it a try when I get the Ubuntu server up and running again; for the moment it's the Windows box that needs to do the work.
