360

I know I can download and install the aforementioned library (wget for Windows), but my question is this:

In Windows PowerShell, is there a native alternative to wget?

I need wget simply to retrieve a file from a given URL with HTTP GET. For instance:

wget http://www.google.com/
jsalonen

12 Answers

334

Here's a simple PS 3.0 and later one-liner that works and doesn't involve much PS barf:

wget http://blog.stackexchange.com/ -OutFile out.html

Note that:

  • wget is an alias for Invoke-WebRequest
  • Invoke-WebRequest returns an HtmlWebResponseObject, which contains a lot of useful HTML parsing properties such as Links, Images, Forms, InputFields, etc., but in this case we're just saving the raw content to a file
  • The file contents are stored in memory before writing to disk, making this approach unsuitable for downloading large files
  • On Windows Server Core installations, you'll need to write this as

    wget http://blog.stackexchange.com/ -UseBasicParsing -OutFile out.html
    
  • Prior to Sep 20 2014, I suggested

    (wget http://blog.stackexchange.com/).Content >out.html
    

    as an answer. However, this doesn't work in all cases, as the > operator (which behaves like Out-File) converts the input to Unicode; an encoding-safe variant is sketched just after this list.
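If you do want the redirection style for text content, one option (just a sketch, not from the original answer) is to pipe through Out-File with an explicit encoding instead of relying on >:

(Invoke-WebRequest http://blog.stackexchange.com/).Content | Out-File -Encoding utf8 out.html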

If you are using Windows 7, you will need to install version 4 or newer of the Windows Management Framework.

You may find that setting $ProgressPreference = "SilentlyContinue" before calling Invoke-WebRequest significantly improves download speed for large files; this variable controls whether the progress UI is rendered.
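For example, a minimal sketch that suppresses the progress bar just for one download and then restores the previous preference:

$oldProgressPreference = $ProgressPreference   # remember the current setting
$ProgressPreference = "SilentlyContinue"       # don't render the progress UI
Invoke-WebRequest http://blog.stackexchange.com/ -OutFile out.html
$ProgressPreference = $oldProgressPreference   # restore it afterwards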

Warren Rumak
198

If you just need to retrieve a file, you can use the DownloadFile method of the WebClient object:

$client = New-Object System.Net.WebClient
$client.DownloadFile($url, $path)

Where $url is a string containing the file's URL, and $path is a string containing the local path the file will be saved to.

Note that $path must include the file name; it can't just be a directory.
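For example, a minimal sketch using the URL from the question and a hypothetical destination path:

$client = New-Object System.Net.WebClient
# both arguments are strings; the second must be a full path including the file name
$client.DownloadFile("http://www.google.com/", "C:\Temp\google.html")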

108

There is Invoke-WebRequest in the upcoming PowerShell version 3:

Invoke-WebRequest http://www.google.com/ -OutFile c:\google.html
user4514
19

It's a bit messy, but there is this blog post, which gives you instructions for downloading files.

Alternatively (and this is one I'd recommend) you can use BITS:

Import-Module BitsTransfer
Start-BitsTransfer -source "http://urlToDownload"

It will show progress and will download the file to the current directory.
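If you want the file somewhere other than the current directory, Start-BitsTransfer also takes a destination; a minimal sketch (the URL and path are placeholders):

Import-Module BitsTransfer
# -Destination may be a folder or a full file path
Start-BitsTransfer -Source "http://urlToDownload" -Destination "C:\Temp\downloaded.file"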

9

If your Windows is new enough (version 1809 or later), a "real" curl is available. curl has the command-line option -O (capital letter O; the lowercase letter does not do the same thing!). The -O option, or equivalently --remote-name, tells curl to save the file under the same name as the file-name part of the URL.

You need to invoke it as curl.exe to distinguish it from the alias curl for Invoke-WebRequest. Incidentally, the same command works in cmd.exe without changes.

Using the same example as in another answer here:

curl.exe -O http://demo.mediacore.tv/files/31266.mp4
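For comparison, the lowercase -o option lets you choose the output file name yourself (same download, just renamed):

curl.exe -o video.mp4 http://demo.mediacore.tv/files/31266.mp4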


Dweia
8

PowerShell V4 One-liner:

(iwr http://blog.stackexchange.com/).Content >index.html

or

(iwr http://demo.mediacore.tv/files/31266.mp4).Content >video.mp4

This is basically Warren's (awesome) V3 one-liner (thanks for this!), with just a tiny change to make it work in V4 PowerShell.

Warren's one-liner, which uses wget rather than iwr, should still work in V3 (at least I assume so; I haven't tested it). But when trying to execute it in V4 PowerShell (as I did), you'll see PowerShell fail to resolve wget as a valid cmdlet or program.

For those interested, that is, as I picked up from Bob's comment on the accepted answer (thanks!), because as of PowerShell V4, iwr is the default alias for Invoke-WebRequest, rather than wget or curl. Thus, wget cannot be resolved (and curl does not work here either).

4

Here is a PowerShell function that resolves short URLs before downloading the file

function Get-FileFromUri {
    param(
        [parameter(Mandatory=$true, Position=0, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
        [string]
        [Alias('Uri')]
        $Url,
        [parameter(Mandatory=$false, Position=1)]
        [string]
        [Alias('Folder')]
        $FolderPath
    )
    process {
        try {
            # resolve short URLs by issuing a HEAD request and reading the final URI
            $req = [System.Net.HttpWebRequest]::Create($Url)
            $req.Method = "HEAD"
            $response = $req.GetResponse()
            $fUri = $response.ResponseUri
            $filename = [System.IO.Path]::GetFileName($fUri.LocalPath)
            $response.Close()
            # download the file to the current directory unless a folder was given
            $destination = (Get-Item -Path ".\" -Verbose).FullName
            if ($FolderPath) { $destination = $FolderPath }
            if ($destination.EndsWith('\')) {
                $destination += $filename
            } else {
                $destination += '\' + $filename
            }
            $webclient = New-Object System.Net.WebClient
            $webclient.DownloadFile($fUri.AbsoluteUri, $destination)
            Write-Host -ForegroundColor DarkGreen "downloaded '$($fUri.AbsoluteUri)' to '$($destination)'"
        } catch {
            Write-Host -ForegroundColor DarkRed $_.Exception.Message
        }
    }
}

Use it like this to download the file to the current folder:

Get-FileFromUri http://example.com/url/of/example/file  

Or to download the file to a specified folder:

Get-FileFromUri http://example.com/url/of/example/file  C:\example-folder  
user25986
2

The following function will download a file from a URL.

function Get-URLContent ($url, $path) {
  if (!$path) {
      $path = Join-Path $pwd.Path ([URI]$url).Segments[-1]
  }
  $wc = New-Object Net.WebClient
  $wc.UseDefaultCredentials = $true
  $wc.Proxy.Credentials = $wc.Credentials
  $wc.DownloadFile($url, $path)
}

Some comments:

  1. The last 4 lines are only needed if you are behind an authenticating proxy. For simple use, (New-Object Net.WebClient).DownloadFile($url, $path) works fine.
  2. The path must be absolute, as the download is not done in your current directory, so relative paths will result in the download getting lost somewhere.
  3. The if (!$path) {...} section handles the simple case where you just want to download the file to the current directory using the name given in the URL (example calls are shown below).
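For example (placeholder URL; note that an explicit destination must be an absolute path, per point 2):

# file name taken from the URL, saved to the current directory
Get-URLContent "http://blog.stackexchange.com/files/report.pdf"

# or with an explicit absolute destination
Get-URLContent "http://blog.stackexchange.com/files/report.pdf" "C:\Temp\report.pdf"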
Paul Moore
1

Use the Windows 10 Bash shell, which includes wget once the Windows feature is set up.

How to install Ubuntu bash shell on Windows:

YouTube: Running Bash on Ubuntu on Windows!

Windows Subsystem for Linux Documentation

TamusJRoyce
1

PowerShell's Invoke-RestMethod may have fewer dependencies than the other methods, which can matter if you have a minimal (or older) Windows Server installation.

See error reported at Running Invoke-WebRequest as System account:

Invoke-WebRequest : The response content cannot be parsed because the Internet Explorer engine is not available, or Internet Explorer's first-launch configuration is not complete. Specify the UseBasicParsing parameter and try again.


This can be an alternative to applying the -UseBasicParsing option that is, in some cases, required with wget or Invoke-WebRequest.

However, the displayed response may be in a different format, based on data parsing:

PowerShell formats the response based on the data type. For an RSS or ATOM feed, PowerShell returns the Item or Entry XML nodes. For JavaScript Object Notation (JSON) or XML, PowerShell converts, or deserializes, the content into [PSCustomObject] objects.
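To simply save a file with it, Invoke-RestMethod accepts -OutFile just like Invoke-WebRequest does; a minimal sketch:

# writes the raw response body to disk instead of returning it
Invoke-RestMethod -Uri "http://blog.stackexchange.com/" -OutFile out.html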

0

The -OutFile parameter of Invoke-WebRequest expects a string, so if your filename starts with a number and is not enclosed in quotes, no output file is created.

e.g. Invoke-WebRequest -Uri "http://www.google.com/" -OutFile "2.pdf"

This does not affect filenames starting with a letter.

Zimba
-1

This should work around the "Internet Explorer engine is not available" error mentioned above. Note the -UseBasicParsing parameter.

Invoke-WebRequest http://localhost -UseBasicParsing
Joe Healy