Unable to download website content with wget

I am failing to download the following URL using wget in bash:
wget --wait 1 -x -H -mk https://bittrex.com/api/v1.1/public/getorderbook?market=usdt-btc&type=sell
I have found some similar issues in other questions, but their solution was to use -mk, which makes no difference here. The prompt just freezes after the command and nothing happens. If I open the same URL in a browser, it loads normally. I would be grateful for any help.

As already pointed out in a comment, you need to quote your URL.
The unquoted & in the URL puts wget --wait 1 -x -H -mk https://bittrex.com/api/v1.1/public/getorderbook?market=usdt-btc into the background and causes everything after the & to be interpreted as a new command, which can be a real risk depending on the URL!
If your URL contains a $, use single quotes (') so the string is passed literally without variable expansion; otherwise double quotes (") are fine.
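For example, the command from the question with the URL quoted, which should keep the shell from splitting it at the &:
wget --wait 1 -x -H -mk 'https://bittrex.com/api/v1.1/public/getorderbook?market=usdt-btc&type=sell'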

Related

How to translate my cURL command into Chrome command?

I want to fire a POST request from the command line to post my image to an image-searching site. At first, I tried cURL and got this command, which works:
curl -i -X POST -F file=@search.png http://saucenao.com/search.php
It posts a file as a form to the searching site and returns an HTML page full of JavaScript, which makes the result hard to read in a terminal. It's also hard to preview online images in a terminal.
Then I remembered that I can open Chrome with arguments on the command line, which I thought might solve my problem. After some digging, I found the Chrome switches, but it seems those are just startup flags (I'm not sure if this is right, but I didn't find a way to fire a POST request the way cURL does).
So, can I use Chrome in command line to start it with a POST request just like my cURL command above?
There are a couple of things you could do.
1. You could write a script in JavaScript that sends the POST request and displays the results inside the <body> element or the like;
2. You could keep the cURL command and use -o (or --output) to save the resulting HTML to a file (but drop the -i switch, to avoid having the headers in the file), then open that file in Chrome or whichever browser you prefer. You can combine the two commands as a one-liner in any operating system. If you use Ubuntu, for example:
$ curl -o search.html -X POST -F file=@search.png http://saucenao.com/search.php && google-chrome search.html && rm search.html
According to this answer, you could use bcat to avoid the temporary file. Install it with apt-get install ruby-bcat and then just run
$ curl -X POST -F file=@search.png http://saucenao.com/search.php | bcat
I think the easier option is #2, but use whichever you prefer.

curl command unable to pass bash parameter

So I am new to curl and am trying to write a bash script that I can run to download a file. I start off by authenticating, then make a POST to request a download. I am given a Foo_ID, which I parse with bash and store in a parameter. I then try to GET that Foo's data via a download URL. The issue I am having is that whenever I pass in the parameter I parsed from the POST response, I get nothing. Here is an example of what I am doing.
#!/bin/bash
curl -b cookies -c cookies -X POST @login_info -d "https://foo.bar.com/auth"
curl -b cookies -c cookies -X POST @Foo_info -d "https://foo.bar.com/foos" > ./tmp/stuff.t
myFooID=`cat ./tmp/stuff.t | grep -Po '"foo_id:.*?",' | cut -d '"' -f 4`
curl -b cookies -c cookies "http://foo.bar.com/foo-download?id=${myFooID}" > ./myFoos/Foo1.data
I have echoed myFooID to make sure it is correct, and I have also echoed "https://foo.bar.com/foo-download?id=${myFooID}", which shows exactly the URL I need. Could anyone help me with this? As I said, I am new to curl and a little rusty with bash.
So I have solved the issue. The problem was that after the POST that creates the Foo, I didn't give the server enough time to create it before trying to download it. I added a sleep command between the last two curl commands and now it works perfectly. I would like to thank Dennis Williamson for helping me clean up my code, which led me to understanding my issue. I have created a shrine for him on my desk.
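For reference, a minimal sketch of the fix (the 5-second delay is an arbitrary illustration; tune it to however long your server needs):
sleep 5   # give the server time to finish creating the Foo
curl -b cookies -c cookies "http://foo.bar.com/foo-download?id=${myFooID}" > ./myFoos/Foo1.data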

download html page for offline use

I want to make an HTML page available for offline viewing by downloading the HTML and all image/CSS resources it uses, but not the other pages it links to.
I was looking at httrack and wget but could not find the right set of arguments (I need the command line).
Any ideas?
If you want to download with the newest version of wget, get it with the Cygwin installer
and use this:
wget -m -w 2 -p -E -k -P {target-dir} http://{website}
to mirror {website} to {target-dir} (wget 1.11.4 leaves out the images).
Leave out -w 2 to speed things up.
For one page, the following wget command line parameters should be enough. Keep in mind that it might not download everything, including background images referenced from CSS files etc.
wget -p <webpage>
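If you also want links rewritten so the saved page works offline, adding -k (--convert-links) and -E (--adjust-extension) may help; a sketch with a placeholder URL:
wget -p -k -E http://example.com/page.html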
Also try wget --help for a list of all command line parameters.

Wget recognizes some part of my URL address as a syntax error

I am quite new to wget and I have done my research on Google, but I found no clue.
I need to save a single HTML file of a webpage:
wget yahoo.com -O test.html
and it works, but when I try to be more specific:
wget http://search.yahoo.com/404handler?src=search&p=food+delicious -O test.html
here comes the problem: wget treats &p=food+delicious as a separate command, and the shell complains: 'p' is not recognized as an internal or external command
How can I solve this problem? I really appreciate your suggestions.
The & has a special meaning in the shell. Escape it with \ or put the url in quotes to avoid this problem.
wget http://search.yahoo.com/404handler?src=search\&p=food+delicious -O test.html
or
wget "http://search.yahoo.com/404handler?src=search&p=food+delicious" -O test.html
In many Unix shells, putting an & after a command causes it to be executed in the background.
Wrap your URL in single quotes to avoid this issue.
i.e.
wget 'http://search.yahoo.com/404handler?src=search&p=food+delicious' -O test.html
If you are using a Jupyter notebook, also check that you have installed the wget package:
pip install wget
before fetching from the URL.

Curl giving "Invalid UTF-8 JSON" error from CouchDb although JSON is fine? Any ideas?

This is slightly different from the question titled "Error about 'invalid JSON' with couchDB view but the json's fine": I am not trying to upload a file, only to enter a simple document.
The example I am trying to use is actually from O'Reilly's book "CouchDB: The Definitive Guide" and I am pretty sure that I have tried it before and got it to work. Here's the command:
curl -X PUT http://username:password@127.0.0.1:5984/albums/6e1295ed6c29495e54cc05947f18c8af -d '{"title":"There is Nothing Left to Lose","artist":"Foo Fighters"}'
The database albums exists and the username and password are correct. I have checked the JSON with JSONLint and it is valid, so I am at a loss ... presumably there is an issue with the CouchDB server itself, but it appears to be running correctly ... any ideas? This is driving me nuts!
Thanks
Thanks guys. Turns out it's a problem with quote escaping. Here's the answer I got from David on the CouchDB user mailing list:
This is a Windows thing regarding quoting - a real PITA. Unfortunately the cmd.exe shell on Windows doesn't parse this correctly. The rules for when to escape with "" or ^" or \" are a bit vague, but this works:
C:\tmp>curl -X PUT http://username:password@127.0.0.1:5984/albums/6e1295ed6c29495e54cc05947f18c8af -d "{\"title\":\"There is Nothing Left to Lose\",\"artist\":\"Foo Fighters\"}"
{"ok":true,"id":"6e1295ed6c29495e54cc05947f18c8af","rev":"1-4b39c2971c9ad54cb37e08fa02fec636"}
C:\tmp>
"basically you need to \"escape\" all \"quotes\" within your JSON"
This fixed my problem.
I also met the same problem. After searching for a while, I found out about Git Bash in another question. Git Bash includes the curl command and comes with the Git installer.
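Since Git Bash is a Unix-style shell, the single-quoted command from the question should then work there unchanged, e.g.:
curl -X PUT http://username:password@127.0.0.1:5984/albums/6e1295ed6c29495e54cc05947f18c8af -d '{"title":"There is Nothing Left to Lose","artist":"Foo Fighters"}'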
You have to use the escape character \ and also do not forget to wrap everything after -d in double quotes "".
E.g.: curl -X PUT http://127.0.0.1:5984/my_database/"001" -d "{\"Name\":\"Suresh\",\"age\":\"32\",\"Designation\":\"Associates Manager\"}"