I'm trying to use cURL in a script and get it to not show the progress bar.
I've tried the -s, --silent, -S, and --quiet options, but none of them work.
Here's a typical command I've tried:
curl -s http://google.com > temp.html
I only get the progress bar when redirecting the output to a file: curl -s http://google.com doesn't show a progress bar, but curl -s http://google.com > temp.html does.
curl -s http://google.com > temp.html
works for curl version 7.19.5 on Ubuntu 9.10 (no progress bar). But if for some reason that does not work on your platform, you could always redirect stderr to /dev/null:
curl http://google.com 2>/dev/null > temp.html
In curl version 7.22.0 on Ubuntu and 7.24.0 on OSX the solution to not show progress but to show errors is to use both -s (--silent) and -S (--show-error) like so:
curl -sS http://google.com > temp.html
For me, this works for redirected output (> /some/file), piped output (| less), and output sent directly to the terminal.
Update: Since curl 7.67.0 there is a new option --no-progress-meter which does precisely this and nothing else, see clonejo's answer for more details.
I found that with curl 7.18.2 the download progress bar is not hidden with:
curl -s http://google.com > temp.html
but it is with:
curl -sS http://google.com > temp.html
Since curl 7.67.0 (2019-11-06) there is --no-progress-meter, which does exactly this, and nothing else. From the man page:
--no-progress-meter
Option to switch off the progress meter output without muting or
otherwise affecting warning and informational messages like -s,
--silent does.
Note that this is the negated option name documented. You can
thus use --progress-meter to enable the progress meter again.
See also -v, --verbose and -s, --silent. Added in 7.67.0.
It's available in Ubuntu ≥20.04 and Debian ≥11 (Bullseye).
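So, applied to the command from the original question, a minimal usage example would be:
curl --no-progress-meter http://google.com > temp.html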
For a bit of history on curl's verbosity options, you can read Daniel Stenberg's blog post.
Not sure why it's doing that. Try -s with the -o option to set the output file instead of >.
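For example, writing to the same file as in the question:
curl -s -o temp.html http://google.com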
This could help:
curl 'http://example.com' > /dev/null
On macOS 10.13.6 (High Sierra), the -sS option works. It is especially useful inside Perl, in a command like curl -sS --get {someURL}, which frankly is a whole lot simpler than any of the LWP or HTTP wrappers, for just getting a website or web page's contents.
Related
I got this link to get the build time trend, along with other data, in Jenkins:
https://jenkins:8080/view/<view-name>/job/<job-name>/<buildnumber>/api/json
This works well in a web browser, but it does not seem to work with curl; it gives no result when I run it as a curl command.
This is what I tried
curl -u user:api_token -s -k "https://jenkins:8080/view/<view-name>/job/<job-name>/<buildnumber>/api/json"
This syntax worked with other APIs.
Not sure what is wrong here.
curl -u userid:api_token -s -k "https://jenkins:8080/view/<view-name>/job/<job-name>/<buildnumber>/api/json" | jq.'causes[]|{result}'
jq.causes[]|{result}: command not found
You need a space between jq and its arguments (and probably not a period).
... | jq 'causes[]|{result}'
        ^
        space here
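Putting it together, the fixed call would look like this (note the filter itself likely also needs a leading dot, i.e. .causes, for jq to treat causes as a key lookup; the exact path depends on the shape of the Jenkins JSON):
curl -u userid:api_token -s -k "https://jenkins:8080/view/<view-name>/job/<job-name>/<buildnumber>/api/json" | jq '.causes[] | {result}'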
Based on the discussion here: download-a-working-local-copy-of-a-webpage,
I am using the following command to download a webpage with its images:
wget --default-page -q -p -k http://mattvh.github.io/solar-theme-jekyll/index.html -O ./page_source/ex.html
For some urls this works, for others (as the one mentioned) this fails with error:
Cannot specify both -k or --convert-file-only and -O if multiple URLs are given, or in combination
with -p or -r. See the manual for details.
How to fix this?
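A sketch of one possible workaround, based purely on the error text: since -k and -p cannot be combined with -O, let wget write into a directory with -P instead of forcing a single output file (--default-page is dropped here; as written it was also missing its required =name argument):
wget -q -p -k -P ./page_source http://mattvh.github.io/solar-theme-jekyll/index.html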
I am trying to do a curl to basically clone a package.json in my local directory, by using:
curl -U "<email>":"<pass>" -L "https://github.com/flowrepo/blob/master/package.json"
I managed to get past the redirect by using the -L flag, but I cannot get a valid JSON doc, as the command above returns the entire GitHub page. Any thoughts?
Access the "raw" variant by changing the URL to:
https://github.com/flowrepo/raw/master/package.json
[Solved]
Thanks Hans Z. for your answer. This is the final call that works:
curl -U "<email>":"<pass>" -L "https://github.com/flowrepo/raw/master/package.json?token<retrieved_token> -o output.txt
According to https://docs.docker.com/engine/reference/api/docker_remote_api_v1.24/#/list-tasks, a filter can only be used to get running containers with a particular service name. For some reason, I am getting a full list of all tasks regardless of their names or desired states. I can't find any proper examples of using curl with JSON requests against the Docker API.
I'm using the following command:
A)
curl -X GET -H "Content-Type: application/json" -d '{"filters":[{ "service":"demo", "desired-state":"running" }]}' https://HOSTNAME:2376/tasks --cert ~/.docker/cert.pem --key ~/.docker/key.pem --cacert ~/.docker/ca.pem
Returns everything
B)
Trying to get something working based on Docker Remote API Filter Exited:
curl https://HOSTNAME:2376/containers/json?all=1&filters={%22status%22:[%22exited%22]} --cert ~/.docker/cert.pem --key ~/.docker/key.pem --cacert ~/.docker/ca.pem
This one returns "curl: (60) Peer's Certificate issuer is not recognized.", so I guess that curl request is malformed.
I have asked on the Docker forums and they helped a little. I'm amazed that there is no proper documentation anywhere on the internet on how to use the Docker API with curl; or is it so obvious that I'm just not understanding something?
I should preface this with the fact that I have never seen curl erroneously report a certificate error when in fact there was some sort of other issue in play, but I will trust your assertion that this is in fact not a certificate problem.
I thought at first that your argument to filters was incorrect, because
according to the API reference, the filters parameter is...
a JSON encoded value of the filters (a map[string][]string) to process on the containers list.
I wasn't exactly sure how to interpret map[string][]string, so I set up a logging proxy between my Docker client and server and ran docker ps -f status=exited, which produced the following request:
GET /v1.24/containers/json?filters=%7B%22status%22%3A%7B%22exited%22%3Atrue%7D%7D HTTP/1.1\r
If we decode the argument to filters, we see that it is:
{"status":{"exited":true}}
Whereas you are passing:
{"status":["exited"]}
So that's different, obviously, and I was assuming that was the source of the problem...but when trying to verify that, I ran into a curious problem. I can't even run your curl command line as written, because curl tries to perform some globbing behavior due to the braces:
$ curl http://localhost:2376/containers/json'?filters={%22status%22:[%22exited%22]}'
curl: (3) [globbing] nested brace in column 67
If I correctly quote your arguments to filter:
$ python3 -c 'import urllib.parse; print(urllib.parse.quote("""{"status":["exited"]}"""))'
%7B%22status%22%3A%5B%22exited%22%5D%7D
It seems to work just fine:
$ curl http://localhost:2376/containers/json'?filters=%7B%22status%22%3A%5B%22exited%22%5D%7D'
[{"Id":...
I can get the same behavior if I use your original expression and pass -g (aka --globoff) to disable the brace expansion:
$ curl -g http://localhost:2376/containers/json'?filters={%22status%22:[%22exited%22]}'
[{"Id":...
One thing I would like to emphasize is the utility of sticking a proxy between the docker client and server. If you ever find yourself asking, "how do I use this API?", an excellent answer is to see exactly what the Docker client is doing in the same situation.
You can create a logging proxy using socat. Here is an example.
docker run -v /var/run/docker.sock:/var/run/docker.sock -p 127.0.0.1:1234:1234 bobrik/socat -v TCP-LISTEN:1234,fork UNIX-CONNECT:/var/run/docker.sock
Then run a command like so in another window.
docker -H localhost:1234 run --rm -p 2222:2222 hello-world
This example uses Docker on Ubuntu.
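With the proxy in place, you can also point curl at it directly over plain HTTP (no TLS flags needed, since socat is just forwarding to the local Unix socket); here with the same exited filter as above:
curl -g 'http://localhost:1234/containers/json?filters={%22status%22:[%22exited%22]}'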
A Docker REST proxy can be as simple as this:
https://github.com/laoshanxi/app-mesh/blob/main/src/sdk/docker/docker-rest.go
Then you can curl like this:
curl -g http://127.0.0.1:6058/containers/json'?filters={%22name%22:[%22jenkins%22]}'
I want to download the entire page with all its objects, including images, JS, CSS, etc. However, the URL may contain parameters, e.g.:
www.youtube.com/results?search_query=star+wars
I have tried the options suggested in similar questions:
wget -p -k "www.youtube.com/results?search_query=star+wars"
wget -p -k --post-data "search_query=star+wars" "https://www.youtube.com/results"
But neither of them works. Can anybody help me with that? Many thanks!
Found the answer:
wget -E -H -k -K -p -e robots=off $URL
NOTE: the URL must be wrapped in single quotes ('...').
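Applied to the URL from the question, that becomes:
wget -E -H -k -K -p -e robots=off 'https://www.youtube.com/results?search_query=star+wars'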
REF: https://superuser.com/questions/55040/save-a-single-web-page-with-background-images-with-wget