curl and wget

curl - transfer a URL

Everyone has used the curl command at some point; it can do a great deal, for example:

  • Test whether a web page is reachable (supports passwords, passing cookies, etc.)
  • Download files
  • Upload files
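A few illustrative invocations of each capability above (a sketch; example.com, the paths, and the credentials are placeholders, not real endpoints):

```shell
# Reachability check: discard the body, print only the HTTP status code
curl -s -o /dev/null -w '%{http_code}\n' https://example.com/

# Download: -o chooses the local file name, -O keeps the remote name
curl -o page.html https://example.com/
curl -O https://example.com/index.html

# Upload: -F sends a multipart form field, -T uploads a raw file (HTTP PUT / FTP)
curl -F 'file=@report.pdf' https://example.com/upload
curl -T backup.tar.gz ftp://user:passwd@ftp.example.com/backups/

# Credentials and cookies
curl -u admin:secret https://example.com/private
curl -b 'sessionid=abc123' https://example.com/profile
```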

In Chrome's developer tools, you can generate the corresponding curl command for any captured request directly.

Tools like Postman can import a curl command and convert it into the tool's built-in request format.

man curl
curl --help
https://curl.haxx.se/ 
https://curl.haxx.se/docs/manpage.html   online man page
https://curl.haxx.se/docs/manual.html    curl tutorial


curl http://www.netscape.com/
curl ftp://ftp.funet.fi/README
curl http://www.weirdserver.com:8000/

Get a web page and store in a local file with a specific name:
    curl -o thatpage.html http://www.netscape.com/

Get a web page and store in a local file, make the local file get the name of the remote document (if no file name part is specified in the URL, this will fail):
    curl -O http://www.netscape.com/index.html

curl http://name:passwd@machine.domain/full/path/to/file
curl -u name:passwd http://machine.domain/full/path/to/file

Verbose / Debug
    curl -v ftp://ftp.upload.com/
    curl --trace trace.txt www.haxx.se

wget - The non-interactive network downloader.

wget is a command-line tool for downloading files. It is indispensable for Linux users: we frequently need to download software or restore a backup from a remote server to a local one.

wget supports the HTTP, HTTPS, and FTP protocols and can work through an HTTP proxy. "Non-interactive" means that wget can keep running in the background after the user logs out: you can log in to a system, start a wget download, log out, and wget will keep working until the task completes.
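A sketch of that background behaviour (the URL and file names are placeholders):

```shell
# -b detaches wget immediately; progress is appended to ./wget-log
wget -b https://example.com/big.iso

# Equivalent with nohup: the download survives logging out of the shell
nohup wget https://example.com/big.iso > download.log 2>&1 &
```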

wget can follow the links on HTML pages and download them in turn, building a local copy of the remote server that fully reproduces the original site's directory structure. This is commonly called "recursive downloading".
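The usual flag combination for such a recursive download (flags as documented in the wget manual; the URL is a placeholder):

```shell
# -r recurse, -np never ascend to the parent directory, -k rewrite links
# for local browsing, -p also fetch page requisites (CSS, images),
# -l 3 limit the recursion depth
wget -r -np -k -p -l 3 https://example.com/docs/

# --mirror is shorthand for -r -N -l inf --no-remove-listing
wget --mirror -k -p https://example.com/docs/
```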

wget is very robust and copes well with narrow bandwidth and unstable networks. If a download fails because of a network problem, wget keeps retrying until the whole file has been fetched. If the server interrupts the transfer, wget reconnects and resumes from where it stopped. This is very useful for downloading large files from servers that limit connection time.
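The retry and resume behaviour can be tuned explicitly; a sketch with placeholder URLs:

```shell
# -c resume a partial file, -t 0 retry forever, -T 30 per-attempt timeout,
# --waitretry=5 back off up to 5 seconds between retries
wget -c -t 0 -T 30 --waitretry=5 https://example.com/big.iso

# curl analogue: -C - resumes from the size of the existing local file
curl -C - -o big.iso https://example.com/big.iso
```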

Getting help

wget --help
man wget 
Online manual: https://www.gnu.org/software/wget/manual/wget.html

Detailed guide to the Linux wget command (in Chinese):
https://www.cnblogs.com/ftl1012/p/9265699.html

Hands-on examples

wget baidu.com/zip.tar.gz

POST a login form with -d and inspect the full exchange with -v:

localhost:~ ndps$ curl -d "user=admin&pwd=admin" -v 'http://as4k:18081/login'
*   Trying as4k...
* TCP_NODELAY set
* Connected to as4k (as4k) port 18081 (#0)
> POST /login HTTP/1.1
> Host: as4k:18081
> User-Agent: curl/7.64.1
> Accept: */*
> Content-Length: 20
> Content-Type: application/x-www-form-urlencoded
> 
* upload completely sent off: 20 out of 20 bytes
< HTTP/1.1 302 Temporary Redirect
< Date: Mon, 23 Mar 2020 07:54:23 GMT
< Content-length: 0
< Set-cookie: jsessionid=YjIyZWFjYmU1N2Q2ZGNiNDFhM2I5MTE5ZjkwMDY4MDE%3D
< Location: /main.html
< 
* Connection #0 to host as4k left intact
* Closing connection 0
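The transcript shows the server answering 302 with a Set-cookie header. Instead of scraping the verbose output, curl can manage the cookie itself with a cookie jar; a sketch against the same host as above:

```shell
# -c writes cookies received from the server to a jar file,
# -b sends them back on subsequent requests
curl -s -c /tmp/jar.txt -d "user=admin&pwd=admin" "http://as4k:18081/login"
curl -s -b /tmp/jar.txt "http://as4k:18081/main.html"
```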


$ cat jiuqiao-check.sh
#!/bin/bash
JIUQIAO_WEB="http://as4k:18081"
# Log in and capture the verbose output (headers included) for cookie extraction
curl -s -d "user=admin&pwd=admin" -v "${JIUQIAO_WEB}/login" &> /tmp/jiuqiao-check.txt
# Pull the session cookie out of the response headers; strip the trailing \r
JIUQIAO_COOKIE=$(grep "Set-cookie" /tmp/jiuqiao-check.txt | sed 's#< Set-cookie: ##' | tr -d '\r')
# Query /sysinfo; if any status field is false, trigger a restart via /fzsOption
if curl -s "${JIUQIAO_WEB}/sysinfo" -H 'Connection: keep-alive' -H 'Accept: application/json, text/javascript, */*; q=0.01' -H 'X-Requested-With: XMLHttpRequest' --compressed --insecure -H "Cookie: ${JIUQIAO_COOKIE}" | grep -q false; then
    echo "at least one of load/recv/alloc is false, so we should restart"
    curl -s "${JIUQIAO_WEB}/fzsOption?flag=0" -H 'Connection: keep-alive' -H 'Accept: application/json, text/javascript, */*; q=0.01' -H 'X-Requested-With: XMLHttpRequest' --compressed --insecure -H "Cookie: ${JIUQIAO_COOKIE}"
fi