Pass flags to the Sphinx runner? - read-the-docs

So I've got the following project, OpenFHE-development, and when I run the build process there are lots of warnings. However, most of these warnings are fine to ignore (we vet them before pushing to the main branch).
Specifically, is there a way to take
pth/python -m sphinx -T -E -b readthedocssinglehtmllocalmedia -d _build/doctrees -D language=en . _build/localmedia
and convert it to
pth/python -m sphinx -T -E -b readthedocssinglehtmllocalmedia -d _build/doctrees -D language=en . _build/localmedia 2> errors.txt
(redirect stderr to a file instead of having it display in the terminal)?

This does not seem to be possible at the moment. See the GitHub discussion.
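For a local build, one possible workaround (a sketch, not something Read the Docs will run for you) is Sphinx's own -w flag, which writes warnings and errors to a given file; warnings.txt is an assumed filename:

pth/python -m sphinx -T -E -b readthedocssinglehtmllocalmedia -d _build/doctrees -D language=en -w warnings.txt . _build/localmedia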

Related

qsub PBS doesn't show error and output log, even when forcing the path

When I run code using qsub and a PBS script, the log and error files do not appear.
I have also tried adding explicit paths for the error and log files, but without success:
#PBS -N example_job
#PBS -j oe                       # NOTE: merges stderr into stdout, so a separate -e file is ignored
#PBS -q shortp
#PBS -V
##PBS -v BATCH_NUM_PROC_TOT=16
#PBS -l nodes=1:ppn=4
#PBS -e $HOME/error.txt          # NOTE: most PBS/Torque versions do not expand $HOME in directives
#PBS -o $HOME/output.txt
Do you know how I can solve this problem?
Thanks a lot
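Two things in that script are worth checking: #PBS -j oe merges stderr into stdout, which makes the -e directive moot, and on most PBS/Torque versions shell variables such as $HOME are not expanded inside #PBS directives, so a literal absolute path is needed. A sketch of a corrected header (/home/myuser is an assumed path; substitute your actual home directory):

#!/bin/bash
#PBS -N example_job
#PBS -q shortp
#PBS -V
#PBS -l nodes=1:ppn=4
# Drop -j oe so stdout and stderr stay separate, and use literal paths:
#PBS -o /home/myuser/output.txt
#PBS -e /home/myuser/error.txt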

How to deserialize a Riak backup into JSON?

I have just dumped a Riak DB (a backup), but the backup file is binary.
Is there a library that deserializes it into a human-readable format (JSON or whatever)?
I haven't found anything on Google, nor on Stack Overflow.
Found a solution for my current problem:
Connect to the environment and then run the following commands:
wget https://s3-us-west-2.amazonaws.com/ps-tools/riak-data-migrator-0.2.9-bin.tar.gz
tar -xvzf riak-data-migrator-0.2.9-bin.tar.gz
cd riak-data-migrator-0.2.9
java -jar riak-data-migrator-0.2.9.jar -d -r /var/riak_export -a -h 127.0.0.1 -p 8087 -H 8098
(source: https://github.com/basho-labs/riak-data-migrator)
EDIT
Another way to export a Riak DB is riak-bucket-exporter: https://www.npmjs.com/package/riak-bucket-exporter
#!/bin/bash
# List all buckets via Riak's HTTP API and export each one.
for bucket in $(curl -s http://localhost:8098/riak?buckets=true | sed -e 's/[{}:"]//gi' -e 's/buckets\[//' -e 's/\]//' -e 's/,/ /g')
do
    echo "Exporting bucket $bucket"
    rm -f "$bucket.json"
    riak-bucket-exporter -H localhost -p 8098 "$bucket"
done
echo "Export done"
As all the suggestions listed so far appear to be broken in one way or another (at least for me, with riak-kv 2.x), I ultimately resorted to a homegrown bash script that leverages riak-kv's HTTP API, with no prerequisites other than curl and jq, to accomplish an export of sorts.
It can be found in this gist: https://gist.github.com/cueedee/0b26ec746c4ef578cd98e93c93d2b6e8. Hoping that someone will find it useful.
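For reference, a minimal sketch of the same curl-plus-jq idea (the gist above is more complete; this assumes riak-kv's HTTP API on localhost:8098, JSON values, and keys that need no URL escaping):

#!/usr/bin/env bash
# Enumerate buckets, then keys, then dump each value as one JSON line per key.
# NOTE: listing keys this way is expensive and not recommended on a busy production node.
RIAK=http://localhost:8098
curl -s "$RIAK/buckets?buckets=true" | jq -r '.buckets[]' | while read -r bucket; do
  curl -s "$RIAK/buckets/$bucket/keys?keys=true" | jq -r '.keys[]' | while read -r key; do
    # Wrap each value with its bucket/key so the output is self-describing.
    curl -s "$RIAK/buckets/$bucket/keys/$key" \
      | jq -c --arg b "$bucket" --arg k "$key" '{bucket: $b, key: $k, value: .}'
  done > "$bucket.json"
done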

Download all links to .zip files on an entire website with wget

So basically I want to download all the zip files on a given website using wget, and I'm having a hard time. I'm new to this, so please bear with me. The website DOES NOT have a page that lists all the zip files. Is there a way I can have wget go through the entire site like a web crawler and download all the zip files? I've tried commands like:
1) wget -r -np -l 1 -A zip http://site/path/
2) wget -A zip -m -p -E -k -K -np http://site/path/
3) wget --no-clobber --convert-links --random-wait -r -p -E -e robots=off -U mozilla http://site/path/
Supposedly these search through the entire site, but I haven't been getting those results. Help or pointing me in the right direction would be very much appreciated!
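One thing worth noting: -l 1 in the first attempt limits recursion to one link deep, which would explain why the crawl never reaches zip files linked further in. A sketch that removes the depth limit (http://site/path/ is the placeholder from the question; use robots=off and --wait responsibly):

# Crawl the whole subtree, keep only .zip files, save them flat in the current directory.
wget -r -l inf -np -nd -A zip -e robots=off --wait=1 http://site/path/

wget still fetches the HTML pages in order to follow their links, but with -A zip it deletes anything that does not match the accept list after parsing.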

How to make gsutil rsync skip symlinks and return error code 0?

I have noticed that when gsutil rsync runs, it returns a non-zero exit code if it encounters a symlink which it cannot resolve:
$ gsutil -m rsync -r -C /my_folder/ gs://my_bucket/
CommandException: Error opening file "file:////my_folder/my_symlink": .
CommandException: 1 files/objects could not be copied/removed.
Is there any way I can exclude such symlinks during the sync and make gsutil return error code 0?
I do not know the names of the symlinks.
As stated in the gsutil rsync documentation, the -e option is used to ignore symbolic links.
Your command would look like:
gsutil -m rsync -r -C -e /my_folder/ gs://my_bucket/
I hope this is what you are looking for.
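A quick way to confirm the behavior (a usage sketch; /my_folder/ and gs://my_bucket/ are the paths from the question):

gsutil -m rsync -r -C -e /my_folder/ gs://my_bucket/
echo $?   # should print 0 once unresolvable symlinks are excluded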

Tshark - Export packet info from pcap to csv

I am trying to programmatically capture a stream of packets by using Tshark. The simplified terminal command I am using is:
tshark -i 2 -w output.pcap
This is pretty straightforward, but I then need to get a .csv file in order to easily analyze the information captured.
By opening the .pcap file in Wireshark and exporting it as .csv, what I get is a file structured as follows:
"No.","Time","Source","Destination","Protocol","Length","Info"
but, again, I need to do this in an automated way. So I tried using the command:
tshark -r output.pcap -T fields -e frame.number -e ip.src -e ip.dst -e frame.len -e frame.time -e frame.time_relative -E header=y -E separator=, > output.csv
but I cannot find anywhere the name of the "Info" field that I get when manually exporting the .csv.
Any ideas? Thanks!
Yes, you can if you use the latest Development Release.
See Wireshark Bug 2892.
Download the Development Release Version 1.9.0.
Use the following command:
$ tshark -i 2 -T fields -e frame.time -e col.Info
Output
Feb 28, 2013 20:58:24.604635000 Who has 10.10.128.203? Tell 10.10.128.1
Feb 28, 2013 20:58:24.678963000 Who has 10.10.128.163? Tell 10.10.128.1
Note: in -e col.Info, use a capital I.
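On current tshark releases the column fields are exposed as _ws.col.* instead. A sketch that reproduces Wireshark's CSV export columns (field names assume a recent tshark; some versions spell the last one _ws.col.info, lowercase):

# Read the capture and emit quoted, comma-separated fields matching Wireshark's CSV export.
tshark -r output.pcap -T fields -e frame.number -e frame.time_relative -e ip.src -e ip.dst -e _ws.col.Protocol -e frame.len -e _ws.col.Info -E header=y -E separator=, -E quote=d > output.csv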
How about directly exporting the packets to a csv file:
sudo tshark > fileName.csv