Is there a Windows command to copy or download a file from an HTTP URL to the file system? I have tried copy, xcopy, and robocopy, but they do not seem to support HTTP URLs.
Posted on 2009-03-03 19:14:54
You can use a PowerShell script to accomplish this. For example:
Get-Web http://www.msn.com/ -toFile www.msn.com.html
function Get-Web($url,
    [switch]$self,
    $credential,
    $toFile,
    [switch]$bytes)
{
    #.Synopsis
    #  Downloads a file from the web
    #.Description
    #  Uses System.Net.WebClient (not the browser) to download data
    #  from the web.
    #.Parameter self
    #  Uses the default credentials when downloading that page (for downloading intranet pages)
    #.Parameter credential
    #  The credentials to use to download the web data
    #.Parameter url
    #  The page to download (e.g. www.msn.com)
    #.Parameter toFile
    #  The file to save the web data to
    #.Parameter bytes
    #  Download the data as bytes
    #.Example
    #  # Downloads www.live.com and outputs it as a string
    #  Get-Web http://www.live.com/
    #.Example
    #  # Downloads www.msn.com and saves it to a file
    #  Get-Web http://www.msn.com/ -toFile www.msn.com.html
    $webClient = New-Object Net.WebClient
    if ($credential) {
        # WebClient's property is Credentials (plural)
        $webClient.Credentials = $credential
    }
    if ($self) {
        $webClient.UseDefaultCredentials = $true
    }
    if ($toFile) {
        # If the target path is relative (no drive qualifier),
        # anchor it to the current directory
        if (-not "$toFile".Contains(":")) {
            $toFile = Join-Path $pwd $toFile
        }
        $webClient.DownloadFile($url, $toFile)
    } else {
        if ($bytes) {
            # Return the raw bytes (useful for binary content)
            $webClient.DownloadData($url)
        } else {
            # Return the content as a string
            $webClient.DownloadString($url)
        }
    }
}

Source: http://blogs.msdn.com/mediaandmicrocode/archive/2008/12/01/microcode-powershell-scripting-tricks-scripting-the-web-part-1-get-web.aspx
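For reference, a short usage sketch built only from the parameters the function declares (the script path and URLs are illustrative assumptions):

# Load the function into the current session (path is hypothetical)
. .\Get-Web.ps1

# Save a page to a file in the current directory
Get-Web http://www.msn.com/ -toFile msn.html

# Capture the page content as a string instead of writing a file
$html = Get-Web http://www.msn.com/

# Use the current user's credentials, e.g. for an intranet page
Get-Web http://intranet.example.com/ -self -toFile intranet.html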
Posted on 2009-03-03 19:07:11
I am not aware of any built-in Windows command that can do this, but I always download GNU wget for Windows for these and similar purposes.
Posted on 2009-03-03 19:26:23
cURL comes to mind.
curl -o homepage.html http://www.apptranslator.com/

This command downloads the page and stores it in the file homepage.html. Thousands of options are available.
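One caveat worth noting: by default curl does not follow HTTP redirects, so a redirected page yields the redirect response rather than the final content. Adding -L (a standard curl option) follows redirects; the URL is the same illustrative one as above:

curl -L -o homepage.html http://www.apptranslator.com/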
Source: https://stackoverflow.com/questions/607625