jq: error: round/0 is not defined at <top-level>

The round function in jq doesn't work:
$ jq '10.01 | round'
jq: error: round/0 is not defined at <top-level>, line 1:
10.01 | round
jq: 1 compile error
$ jq --help
jq - commandline JSON processor [version 1.5-1-a5b5cbe]
What do I need to do?

It seems round is unavailable in your build; it was added as a builtin in jq 1.6. Either upgrade jq or implement round yourself using floor:
def round: . + 0.5 | floor;
Usage example:
$ jq -n 'def round: . + 0.5 | floor; 10.01 | round'
10
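One caveat worth noting (my observation, not part of the original answer): `. + 0.5 | floor` always rounds halves upward, toward positive infinity, so for negative halves it differs from C-style rounding (half away from zero), which jq 1.6's builtin round follows:

```shell
# caveat demo: the floor-based workaround on a negative half
# -2.5 + 0.5 = -2.0, and floor(-2.0) = -2,
# whereas C-style rounding (half away from zero) would give -3
jq -n 'def round: . + 0.5 | floor; -2.5 | round'
# prints -2
```

For most positive-valued data the two agree, so the workaround is usually fine.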

We can use the pow builtin together with `. + 0.5 | floor` to create our own round function that takes the value to round as input and the number of decimal places as an argument.
def round_whole:
  # Basic round function; returns the closest whole number
  # Usage:
  #   2.6 | round_whole   # -> 3
  . + 0.5 | floor
;
def round(num_dec):
  # Round function; takes num_dec (decimal places) as argument
  # Usage:
  #   2.2362 | round(2)   # -> 2.24
  num_dec as $num_dec |
  # First scale the number up by 10^num_dec
  # i.e. 2.2362 becomes 223.62
  . * pow(10; $num_dec) |
  # Then use the round_whole function
  # 223.62 becomes 224
  round_whole |
  # Then scale back down by 10^num_dec
  # 224 becomes 2.24 as expected
  . / pow(10; $num_dec)
;
jq --null-input --raw-output \
'
  def round_whole:
    # Basic round function; returns the closest whole number
    # Usage:
    #   2.6 | round_whole   # -> 3
    . + 0.5 | floor
  ;
  def round(num_dec):
    # Round function; takes num_dec (decimal places) as argument
    # Usage:
    #   2.2362 | round(2)   # -> 2.24
    num_dec as $num_dec |
    # First scale the number up by 10^num_dec
    # i.e. 2.2362 becomes 223.62
    . * pow(10; $num_dec) |
    # Then use the round_whole function
    # 223.62 becomes 224
    round_whole |
    # Then scale back down by 10^num_dec
    # 224 becomes 2.24 as expected
    . / pow(10; $num_dec)
  ;
  [
    2.2362,
    2.4642,
    10.23423
  ]
  | map(round(2))
'
Yields
[
  2.24,
  2.46,
  10.23
]


Bash script with jq won't get date difference from strings, and runs quite slowly on i7 16 GB RAM

I need to find the difference between TradeCloseTime and TradeOpenTime in dd:hh:mm format for the Exposure column in the following script.
Also, the script runs super slowly (~4 minutes for 800 rows of JSON, on a Core i7 machine with 16 GB RAM).
#!/bin/bash
echo "TradeNo, TradeOpenType, TradeCloseType, TradeOpenSource, TradeCloseSource, TradeOpenTime, TradeCloseTime, PNL, Exposure" > tradelist.csv
tradecount=$(jq -r '.performance.numberOfTrades|tonumber' D.json)
for ((i=0; i<$tradecount; i++))
do
  tradeNo=$(jq -r '.trades['$i']|[.tradeNo][]|tonumber' D.json)
  entrySide=$(jq -r '.trades['$i'].orders[0]|[.side][]' D.json)
  exitSide=$(jq -r '.trades['$i'].orders[1]|[.side][]' D.json)
  entrySource=$(jq -r '.trades['$i'].orders[0]|[.source][]' D.json)
  exitSource=$(jq -r '.trades['$i'].orders[1]|[.source][]' D.json)
  tradeEntryTime=$(jq -r '.trades['$i'].orders[0]|[.placedTime][]' D.json | tr -d 'Z' | tr -s 'T' ' ')
  tradeExitTime=$(jq -r '.trades['$i'].orders[1]|[.placedTime][]' D.json | tr -d 'Z' | tr -s 'T' ' ')
  profitPercentage=$(jq -r '(.trades['$i']|[.profitPercentage][0]|tonumber)*(100)' D.json)
  echo $tradeNo","$entrySide","$exitSide","$entrySource","$exitSource","$tradeEntryTime","$tradeExitTime","$profitPercentage | tr -d '"' >> tradelist.csv
done
The JSON file looks like this (truncated):
{"market":{"exchange":"BINANCE_FUTURES","coinPair":"BTC_USDT"},"strategy":{"name":"","type":"BACKTEST","candleSize":15,"lookbackDays":6,"leverageLong":1.00000000,"leverageShort":1.00000000,"strategyName":"ABC","strategyVersion":35,"runNo":"002","source":"Personal"},"strategyParameters":[{"name":"DurationInput","value":"87.0"}],"openPositionStrategy":{"actionTime":"CANDLE_CLOSE","maxPerSignal":1.00000000},"closePositionStrategy":{"actionTime":"CANDLE_CLOSE","minProfit":"NaN","stopLossValue":0.07000000,"stopLossTrailing":true,"takeProfit":0.01290000,"takeProfitDeviation":"NaN"},"performance":{"startTime":"2019-01-01T00:00:00Z","endTime":"2021-11-24T00:00:00Z","startAllocation":1000.00000000,"endAllocation":3478.58904150,"absoluteProfit":2478.58904150,"profitPerc":2.47858904,"buyHoldRatio":0.62426630,"buyHoldReturn":4.57228387,"numberOfTrades":744,"profitableTrades":0.67833109,"maxDrawdown":-0.20924885,"avgMonthlyProfit":0.05242718,"profitableMonths":0.70370370,"avgWinMonth":0.09889897,"avgLoseMonth":-0.05275563,"startPrice":null,"endPrice":57623.08000000},"trades":[{"tradeNo":0,"profit":-5.48836165,"profitPercentage":-0.00549085,"accumulatedBalance":994.51163835,"compoundProfitPerc":-0.00548836,"orders":[{"side":"Long","placedTime":"2019-09-16T21:15:00Z","placedAmount":0.09700000,"filledTime":"2019-09-16T21:15:00Z","filledAmount":0.09700000,"filledPrice":10300.49000000,"commissionPaid":0.39965901,"source":"SIGNAL"},{"side":"CloseLong","placedTime":"2019-09-17T19:15:00Z","placedAmount":0.09700000,"filledTime":"2019-09-17T19:15:00Z","filledAmount":0.09700000,"filledPrice":10252.13000000,"commissionPaid":0.39778264,"source":"SIGNAL"}]},{"tradeNo":1,"profit":-3.52735800,"profitPercentage":-0.00356403,"accumulatedBalance":990.98428035,"compoundProfitPerc":-0.00901572,"orders":[{"side":"Long","placedTime":"2019-09-19T06:00:00Z","placedAmount":0.10000000,"filledTime":"2019-09-19T06:00:00Z","filledAmount":0.10000000,"filledPrice":9893.16000000,"commissionPaid":0.39572640,
"source":"SIGNAL"},{"side":"CloseLong","placedTime":"2019-09-19T06:15:00Z","placedAmount":0.10000000,"filledTime":"2019-09-19T06:15:00Z","filledAmount":0.10000000,"filledPrice":9865.79000000,"commissionPaid":0.39463160,"source":"SIGNAL"}]},{"tradeNo":2,"profit":-5.04965308,"profitPercentage":-0.00511770,"accumulatedBalance":985.93462727,"compoundProfitPerc":-0.01406537,"orders":[{"side":"Long","placedTime":"2019-09-25T10:15:00Z","placedAmount":0.11700000,"filledTime":"2019-09-25T10:15:00Z","filledAmount":0.11700000,"filledPrice":8430.00000000,"commissionPaid":0.39452400,"source":"SIGNAL"},{"side":"CloseLong","placedTime":"2019-09-25T10:30:00Z","placedAmount":0.11700000,"filledTime":"2019-09-25T10:30:00Z","filledAmount":0.11700000,"filledPrice":8393.57000000,"commissionPaid":0.39281908,"source":"SIGNAL"}]}
You can do it all (extraction, conversion and formatting) with one jq call:
#!/bin/sh
echo 'TradeNo,TradeOpenType,TradeCloseType,TradeOpenSource,TradeCloseSource,TradeOpenTime,TradeCloseTime,PNL,Exposure' > tradelist.csv
query='
  .trades[]
  | [
      .tradeNo,
      .orders[0].side,
      .orders[1].side,
      .orders[0].source,
      .orders[1].source,
      (.orders[0].placedTime | fromdate | strftime("%Y-%m-%d %H:%M:%S")),
      (.orders[1].placedTime | fromdate | strftime("%Y-%m-%d %H:%M:%S")),
      .profitPercentage * 100,
      (
        (.orders[1].placedTime | fromdate) - (.orders[0].placedTime | fromdate)
        | (. / 86400 | floor | tostring) + (. % 86400 | strftime(":%H:%M"))
      )
    ]
  | @csv
'
jq -r "$query" < D.json >> tradelist.csv
example of JSON (cleaned of all irrelevant keys):
{
  "trades": [
    {
      "tradeNo": 0,
      "profitPercentage": -0.00549085,
      "orders": [
        {
          "side": "Long",
          "placedTime": "2018-12-16T21:34:46Z",
          "source": "SIGNAL"
        },
        {
          "side": "CloseLong",
          "placedTime": "2019-09-17T19:15:00Z",
          "source": "SIGNAL"
        }
      ]
    }
  ]
}
output:
TradeNo,TradeOpenType,TradeCloseType,TradeOpenSource,TradeCloseSource,TradeOpenTime,TradeCloseTime,PNL,Exposure
0,"Long","CloseLong","SIGNAL","SIGNAL","2018-12-16 21:34:46","2019-09-17 20:15:00",-0.549085,"274:22:40"
If you want to get rid of the double quotes that jq adds when generating CSV (they are completely valid, but you need a real CSV parser to read them), you can replace @csv with @tsv and post-process the output with tr '\t' ',', like this:
query='
  ...
  | @tsv
'
jq -r "$query" < D.json | tr '\t' ',' > tradelist.csv
and you'll get:
TradeNo,TradeOpenType,TradeCloseType,TradeOpenSource,TradeCloseSource,TradeOpenTime,TradeCloseTime,PNL,Exposure
0,Long,CloseLong,SIGNAL,SIGNAL,2018-12-16 21:34:46,2019-09-17 20:15:00,-0.549085,274:22:40
note: This method of getting rid of the " in the CSV is only accurate when the input data contains none of the characters \n, \t, \r, \, "," or ".
Regarding the main question (computing time differences), you're in luck: jq provides the built-in function fromdateiso8601 for converting ISO times to "the number of seconds since the Unix epoch (1970-01-01T00:00:00Z)".
With your JSON sample,
.trades[]
| [ .orders[1].placedTime, .orders[0].placedTime]
| map(fromdateiso8601)
| .[0] - .[1]
produces the three differences:
79200
900
900
And here's a function for converting seconds to "hh:mm:ss" format:
def hhmmss:
  def l: tostring | if length < 2 then "0\(.)" else . end;
  (. % 60) as $ss
  | ((. / 60) | floor) as $mm
  | (($mm / 60) | floor) as $hh
  | ($mm % 60) as $mm
  | [$hh, $mm, $ss] | map(l) | join(":");
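For instance, feeding it the first difference computed above (79200 seconds, i.e. 22 hours); the def is repeated so the snippet is self-contained:

```shell
jq -nr '
  def hhmmss:
    def l: tostring | if length < 2 then "0\(.)" else . end;
    (. % 60) as $ss
    | ((. / 60) | floor) as $mm
    | (($mm / 60) | floor) as $hh
    | ($mm % 60) as $mm
    | [$hh, $mm, $ss] | map(l) | join(":");
  79200 | hhmmss'
# prints 22:00:00
```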
I prefer using an intermediate structure of the "entry" and "exit" JSON. This helps with debugging the jq commands. Formatted for readability over performance:
#!/usr/bin/env bash
echo "TradeNo,TradeOpenType,TradeCloseType,TradeOpenSource,TradeCloseSource,TradeOpenTime,TradeCloseTime,PNL,Exposure" > tradelist.csv
jq -r '
  .trades[]
  | {tradeNo,
     profitPercentage,
     entry: .orders[0],
     exit: .orders[1],
     entryTS: .orders[0].placedTime|fromdate,
     exitTS: .orders[1].placedTime|fromdate}
  | [.tradeNo,
     .entry.side,
     .exit.side,
     .entry.source,
     .exit.source,
     (.entry.placedTime|strptime("%Y-%m-%dT%H:%M:%SZ")|strftime("%Y-%m-%d %H:%M:%S")),
     (.exit.placedTime|strptime("%Y-%m-%dT%H:%M:%SZ")|strftime("%Y-%m-%d %H:%M:%S")),
     (.profitPercentage*100),
     (.exitTS-.entryTS|todate|strptime("%Y-%m-%dT%H:%M:%SZ")|strftime("%d:%H:%M"))]
  | @csv
' D.json | tr -d '"' >> tradelist.csv
WARNING: This formatting assumes Exposure is LESS THAN 1 MONTH. Good luck with that!

Filter results using bash

To be clearer, look at the text file below.
https://brianbrandt.dk/web/var/www/public_html/.htpasswd
https://brianbrandt.dk/web/var/www/public_html/wp-config.php
https://briannajackson1.wordpress.org/high-entropy-misc.txt
https://briannajackson1.wordpress.org/Homestead.yaml
https://brickellmiami.centric.hyatt.com/dev
https://brickellmiami.centric.hyatt.com/django.log
https://brickellmiami.centric.hyatt.com/.dockercfg
https://brickellmiami.centric.hyatt.com/docker-compose.yml
https://brickellmiami.centric.hyatt.com/.docker/config.json
https://brickellmiami.centric.hyatt.com/Dockerfile
https://brideonashoestring.wordpress.org/web/var/www/public_html/config.php
https://brideonashoestring.wordpress.org/web/var/www/public_html/wp-config.php
https://brideonashoestring.wordpress.org/wp-config.php
https://brideonashoestring.wordpress.org/.wp-config.php.swp
https://brideonashoestring.wordpress.org/_wpeprivate/config.json
https://brideonashoestring.wordpress.org/yarn-debug.log
https://brideonashoestring.wordpress.org/yarn-error.log
https://brideonashoestring.wordpress.org/yarn.lock
https://brideonashoestring.wordpress.org/.yarnrc
https://bridgehome.adobe.com/etc/shadow
https://bridgehome.adobe.com/phpinfo.php
https://bridgetonema.wordpress.org/manifest.json
https://bridgetonema.wordpress.org/manifest.yml
https://bridge.twilio.com/.wp-config.php.swp
https://bridge.twilio.com/wp-content/themes/.git/config
https://bridge.twilio.com/_wpeprivate/config.json
https://bridge.twilio.com/yarn-debug.log
https://bridge.twilio.com/yarn-error.log
https://bridge.twilio.com/yarn.lock
https://bridge.twilio.com/.yarnrc
https://brightside.mtn.co.za/config.lua
https://brightside.mtn.co.za/config.php
https://brightside.mtn.co.za/config.php.txt
https://brightside.mtn.co.za/config.rb
https://brightside.mtn.co.za/config.ru
https://brightside.mtn.co.za/_config.yml
https://brightside.mtn.co.za/console
https://brightside.mtn.co.za/.credentials
https://brightside.mtn.co.za/CVS/Entries
https://brightside.mtn.co.za/CVS/Root
https://brightside.mtn.co.za/dasbhoard/
https://brightside.mtn.co.za/data
https://brightside.mtn.co.za/data.txt
https://brightside.mtn.co.za/db/dbeaver-data-sources.xml
https://brightside.mtn.co.za/db/dump.sql
https://brightside.mtn.co.za/db/.pgpass
https://brightside.mtn.co.za/db/robomongo.json
https://brightside.mtn.co.za/README.txt
https://brightside.mtn.co.za/RELEASE_NOTES.txt
https://brightside.mtn.co.za/.remote-sync.json
https://brightside.mtn.co.za/Resources.zip.manifest
https://brightside.mtn.co.za/.rspec
https://br.infinite.sx/db/dump.sql
https://br.infinite.sx/graphiql
The domain brightside.mtn.co.za and some other domains are repeated more than 10 times. I want to drop brightside.mtn.co.za and every other domain that appears more than 10 times, and then output the remaining results. The output should look like:
https://br.infinite.sx/db/dump.sql
https://br.infinite.sx/graphiql
https://bridgetonema.wordpress.org/manifest.json
https://bridgetonema.wordpress.org/manifest.yml
[The following is a response to the original question, which was premised on JSON input.]
Since you need to count the items in a group, it would appear that you will find group_by( sub("/[^/]*$";"") ) useful.
For example, if you wanted to omit large groups entirely, as one interpretation of the stated requirements would seem to imply, you could use the following filter:
[.results[] | select(.status==301) | .url]
| group_by( sub("/[^/]*$";"") )
| map(select(length < 10) )
| .[][]
If the text input is in input.txt, then one solution using jq at the bash command line would be:
< input.txt jq -nRr '[inputs]
  | group_by( sub("/[^/]*$";"") )
  | map(select(length < 10))
  | .[][]'
(If you want the output as JSON strings, omit the -r option.)
A more efficient solution
The above solution uses the built-in filter group_by/1, which sorts its input and is thus somewhat inefficient. For a very large number of input lines, a more efficient solution would be:
< input.txt jq -nRr '
  def GROUPS_BY(stream; f):
    reduce stream as $x ({}; .[$x|f] += [$x] ) | .[] ;
  GROUPS_BY(inputs; sub("/[^/]*$";""))
  | select(length < 10)
  | .[]'
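To see the grouping threshold in action on a toy input (hypothetical URLs, with a cutoff of 2 instead of 10 so the effect shows up on just three lines):

```shell
# a.example/dir has two entries and is dropped; b.example/dir has one and survives
printf '%s\n' \
  'https://a.example/dir/one' \
  'https://a.example/dir/two' \
  'https://b.example/dir/one' |
jq -nRr '
  def GROUPS_BY(stream; f):
    reduce stream as $x ({}; .[$x|f] += [$x] ) | .[] ;
  GROUPS_BY(inputs; sub("/[^/]*$";""))
  | select(length < 2)
  | .[]'
# prints https://b.example/dir/one
```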

Spread number equally across elements of array (and add remainder to beginning of ring)

Let's say I have some JSON array, we'll call it A:
["foo", "bar", "baz"]
And I have some number X, let's say 5 in this case.
I want to produce the following object in jq:
{
  "foo": 2,
  "bar": 2,
  "baz": 1
}
This is the number 5 divided up equally across the elements of the array, with the remainder distributed to the elements at the beginning of the ring. You could think of it this way: the value for element N should be ceil(X / length(A)) if index(N) < (X % length(A)), otherwise it should be floor(X / length(A)).
Assuming A is my file input to jq, and I have X defined as a variable, how can I express this in jq?
I have tried 'length as $len | .[] | if index(.) < (5 % $len) then (5 / $len) + 1 else 5 / $len end' as a starting point, but I get 2 for each element.
You can use the transpose function to help build this. It's simpler with a ceil function, which we have to define ourselves. The mapping you are looking for from index to allocation is ceil(($count - $i)/$n), where $count is the amount you are distributing, $i is the index in the original list, and $n is the length of the list.
Comments show how each piece works on your sample input of ["foo", "bar", "baz"].
def ceil(v): -(-v | floor);
def objectify(n): {key: .[0], value: ceil(($count - .[1])/n)};
# ["foo", 0] | objectify(3) -> {"key": "foo", "value": 2}
length as $n |      # n == 3
[., keys] |         # [["foo", "bar", "baz"], [0,1,2]]
[transpose[] |      # [["foo", 0], ["bar", 1], ["baz", 2]]
  objectify($n)
] |
from_entries        # {"foo": 2, "bar": 2, "baz": 1}
Without the comments...
def ceil(v): -(-v | floor);
def objectify(n): {key: .[0], value: ceil(($count - .[1])/n)};
length as $n | [., keys] | [transpose[] | objectify($n)] | from_entries
An example of its use, assuming you saved it to file named distribute.jq:
jq --argjson count 5 -f distribute.jq tmp.json
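If you'd rather not save a file, the same program can also be run inline; a quick sketch using the sample array from the question:

```shell
echo '["foo","bar","baz"]' |
jq -c --argjson count 5 '
  def ceil(v): -(-v | floor);
  def objectify(n): {key: .[0], value: ceil(($count - .[1])/n)};
  length as $n | [., keys] | [transpose[] | objectify($n)] | from_entries'
# prints {"foo":2,"bar":2,"baz":1}
```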
I found a solution by saving the original input as a variable so that I can continue to reference it while operating on its values.
. as $arr
| length as $len
| [
    .[]
    | . as $i
    | {
        ($i): (
          if ($arr | index($i)) < ($x % $len) then
            ($x / $len) + 1
          else
            $x / $len
          end
          | floor
        )
      }
  ]
| add
The above worked for me, passing --argjson x $X and feeding the array as my input.
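For reference, here is that approach as a single runnable command (with the object key written as ($i):, which is the syntax jq expects, and X passed as --argjson x 5):

```shell
echo '["foo","bar","baz"]' |
jq -c --argjson x 5 '
  . as $arr
  | length as $len
  | [ .[] | . as $i
      | { ($i): (if ($arr | index($i)) < ($x % $len)
                 then ($x / $len) + 1
                 else $x / $len
                 end | floor) } ]
  | add'
# prints {"foo":2,"bar":2,"baz":1}
```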

Get count based on value bash

I have data in this format in a file:
{"field1":249449,"field2":116895,"field3":1,"field4":"apple","field5":42,"field6":"2019-07-01T00:00:10","metadata":"","frontend":""}
{"field1":249448,"field2":116895,"field3":1,"field4":"apple","field5":42,"field6":"2019-07-01T00:00:10","metadata":"","frontend":""}
{"field1":249447,"field2":116895,"field3":1,"field4":"apple","field5":42,"field6":"2019-07-01T00:00:10","metadata":"","frontend":""}
{"field1":249443,"field2":116895,"field3":1,"field4":"apple","field5":42,"field6":"2019-07-01T00:00:10","metadata":"","frontend":""}
{"field1":249449,"field2":116895,"field3":1,"field4":"apple","field5":42,"field6":"2019-07-01T00:00:10","metadata":"","frontend":""}
Here, each entry represents a row. I want to have a count of the rows with respect to the value in field one, like:
249449 : 2
249448 : 1
249447 : 1
249443 : 1
How can I get that?
With awk:
$ awk -F'[,:]' -v OFS=' : ' '{a[$2]++} END{for(k in a) print k, a[k]}' file
You can use the jq command line tool to interpret JSON data. uniq -c counts the number of occurrences.
% jq .field1 < $INPUTFILE | sort | uniq -c
1 249443
1 249447
1 249448
2 249449
(tested with jq 1.5-1-a5b5cbe on linux xubuntu 18.04 with zsh)
Here's an efficient jq-only solution:
reduce inputs.field1 as $x ({}; .[$x|tostring] += 1)
| to_entries[]
| "\(.key) : \(.value)"
Invocation: jq -nrf program.jq input.json
(Note in particular the -n option.)
Of course if an object-representation of the counts is satisfactory, then
one could simply write:
jq -n 'reduce inputs.field1 as $x ({}; .[$x|tostring] += 1)' input.json
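A quick sanity check of the string-formatting variant above, feeding a few stripped-down sample rows (only field1 matters for the count):

```shell
printf '%s\n' \
  '{"field1":249449}' \
  '{"field1":249448}' \
  '{"field1":249449}' |
jq -nr '
  reduce inputs.field1 as $x ({}; .[$x|tostring] += 1)
  | to_entries[]
  | "\(.key) : \(.value)"'
# prints:
# 249449 : 2
# 249448 : 1
```

jq preserves object key insertion order, so the counts come out in first-seen order, matching the desired output.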
Using datamash and some shell utils: change the non-data delimiters to squeezed tabs, count field 3 (it'd be field 2, but there's a leading tab), reverse with tac, then pretty-print per the OP's spec:
tr -s '{":,}' '\t' < file | datamash -sg 3 count 3 | tac | xargs printf '%s : %s\n'
Output:
249449 : 2
249448 : 1
249447 : 1
249443 : 1

how do I convert fractional decimal numbers to fractional binary numbers using dc

So dc is a great tool for converting between bases - handy for those bit-twiddling coding jobs. E.g. to convert 1078 into binary I can do this:
bash> echo "2o1078p" | dc
10000110110
However I can't get it to print fractions between 0 and 1 correctly.
Trying to convert 0.3 into binary:
bash> echo "2o10k 0.3p" | dc
.0100
But 0.0100(bin) = 0.25 not 0.3.
However, if I construct the value manually, I get the right answer:
bash> echo "2o10k 3 10 / p" | dc
.0100110011001100110011001100110011
Well, it looks like it's giving me more than the 10 significant figures I asked for, but that's OK.
Am I doing something wrong? Or am I trying to make dc do something that it's not able to do?
bash> dc --version
dc (GNU bc 1.06) 1.3
...
Strange. My first thought was that maybe precision only applies to calculations, not conversions. But then it only works for division, not addition, subtraction, or multiplication:
echo "2o10k 0.3 1 / p" | dc
.0100110011001100110011001100110011
echo "2o10k 0.3 0 + p" | dc
.0100
echo "2o10k 0.3 0 - p" | dc
.0100
echo "2o10k 0.3 1 * p" | dc
.0100
As for precision, the man page says "The precision is always measured in decimal digits, regardless of the current input or output radix." That explains why the output (when you get it) is 33 significant bits.
It seems that dc is getting the number of significant figures from the input.
Now 1/log10(2) = 3.32, so each decimal significant digit is worth about 3.3 binary digits.
Looking at the output of dc for varying input SF lengths shows:
`dc -e "2o10k 0.3 p"` => .0100
`dc -e "2o10k 0.30 p"` => .0100110
`dc -e "2o10k 0.300 p"` => .0100110011
`dc -e "2o10k 0.3000 p"` => .01001100110011
A table of these values and the expected output length, ceil(SFinput / log10(2)), is as follows:
input : output : expected output
1 : 4 : 4
2 : 7 : 7
3 : 10 : 10
4 : 14 : 14
And dc is behaving exactly as expected.
So the solution is to either use the right number of significant figures in the input, or the division form dc -e "2o10k 3 10 / p"
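If dc is available, the two workarounds can be compared directly; per the table above, ten significant figures of input should yield ceil(10 / log10(2)) = 34 binary digits, the same count the division form produces at precision 10:

```shell
# direct entry with 10 significant figures
a=$(echo "2o10k 0.3000000000 p" | dc)
# division form at precision 10
b=$(echo "2o10k 3 10 / p" | dc)
echo "$a"
echo "$b"
# the two outputs should match
```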