Search and compare values of a .CSV file

My goal is to search a CSV's first column twice, then execute an action (dependent on a value in the third column of the same record). I began in VBScript using InStr():
Set objFS = CreateObject("Scripting.FileSystemObject")
roster = "C:\bin\roster.csv"
Set objFile = objFS.OpenTextFile(roster)
Do Until objFile.AtEndOfStream
    strLine = objFile.ReadLine
    intLength = Len(strLine)
    intZeros = 5 - intLength
    If InStr(strLine, strIP) > 0 Then    ' strIP assumed to be set elsewhere
        strInfo = Split(strLine, ",")
        siteNumA = strInfo(0)
        siteNumB = String(5 - Len(siteNumA), "0") & siteNumA    ' zero-pad the site number to 5 digits
        siteIP = strInfo(1)
        siteDist = strInfo(2)
        siteReg = strInfo(3)
    End If
Loop
objFile.Close
From there it could compare the siteDist value against the same field from a second search. However, I would prefer to do this in AutoIt. Is there a way to achieve it in AutoIt (or a command that accomplishes my plan)?
A simple CSV file I am using for testing:
Site,District,Region
1,1,1
2,1,1
3,1,2
4,2,2
5,2,1
The idea is to search for two separate Site entries and then confirm that their District values match: run at Site 1, the script should evaluate as true for Sites 1, 2, or 3 (all District 1), and false for Sites 4 and 5 (District 2).

Use this:
; #FUNCTION# ====================================================================================================================
; Name...........: _ParseCSV
; Description ...: Reads a CSV-file
; Syntax.........: _ParseCSV($sFile, $sDelimiters=',', $sQuote='"', $iFormat=0)
; Parameters ....: $sFile - File to read or string to parse
; $sDelimiters - [optional] Field separators of the CSV; multiple are allowed (default: ,;)
; $sQuote - [optional] Character to quote strings (default: ")
; $iFormat - [optional] Encoding of the file (default: 0):
; |-1 - No file, plain data given
; |0 or 1 - automatic (ASCII)
; |2 - Unicode UTF16 Little Endian reading
; |3 - Unicode UTF16 Big Endian reading
; |4 or 5 - Unicode UTF8 reading
; Return values .: Success - 2D-Array with CSV data (0-based)
; Failure - 0, sets #error to:
; |1 - could not open file
; |2 - error on parsing data
; |3 - wrong format chosen
; Author ........: ProgAndy
; Modified.......:
; Remarks .......:
; Related .......: _WriteCSV
; Link ..........:
; Example .......:
; ===============================================================================================================================
Func _ParseCSV($sFile, $sDelimiters=',;', $sQuote='"', $iFormat=0)
    Local Static $aEncoding[6] = [0, 0, 32, 64, 128, 256]
    If $iFormat < -1 Or $iFormat > 5 Then
        Return SetError(3,0,0)
    ElseIf $iFormat > -1 Then
        Local $hFile = FileOpen($sFile, $aEncoding[$iFormat]), $sLine, $aTemp, $aCSV[1], $iReserved, $iCount
        If @error Then Return SetError(1,@error,0)
        $sFile = FileRead($hFile)
        FileClose($hFile)
    EndIf
    If $sDelimiters = "" Or IsKeyword($sDelimiters) Then $sDelimiters = ',;'
    If $sQuote = "" Or IsKeyword($sQuote) Then $sQuote = '"'
    $sQuote = StringLeft($sQuote, 1)
    Local $srDelimiters = StringRegExpReplace($sDelimiters, '[\\\^\-\[\]]', '\\\0')
    Local $srQuote = StringRegExpReplace($sQuote, '[\\\^\-\[\]]', '\\\0')
    Local $sPattern = StringReplace(StringReplace('(?m)(?:^|[,])\h*(["](?:[^"]|["]{2})*["]|[^,\r\n]*)(\v+)?',',', $srDelimiters, 0, 1),'"', $srQuote, 0, 1)
    Local $aREgex = StringRegExp($sFile, $sPattern, 3)
    If @error Then Return SetError(2,@error,0)
    $sFile = '' ; save memory
    Local $iBound = UBound($aREgex), $iIndex=0, $iSubBound = 1, $iSub = 0
    Local $aResult[$iBound][$iSubBound]
    For $i = 0 To $iBound-1
        Select
            Case StringLen($aREgex[$i])<3 And StringInStr(@CRLF, $aREgex[$i])
                $iIndex += 1
                $iSub = 0
                ContinueLoop
            Case StringLeft(StringStripWS($aREgex[$i], 1),1)=$sQuote
                $aREgex[$i] = StringStripWS($aREgex[$i], 3)
                $aResult[$iIndex][$iSub] = StringReplace(StringMid($aREgex[$i], 2, StringLen($aREgex[$i])-2), $sQuote&$sQuote, $sQuote, 0, 1)
            Case Else
                $aResult[$iIndex][$iSub] = $aREgex[$i]
        EndSelect
        $aREgex[$i]=0 ; save memory
        $iSub += 1
        If $iSub = $iSubBound Then
            $iSubBound += 1
            ReDim $aResult[$iBound][$iSubBound]
        EndIf
    Next
    If $iIndex = 0 Then $iIndex=1
    ReDim $aResult[$iIndex][$iSubBound]
    Return $aResult
EndFunc
; #FUNCTION# ====================================================================================================================
; Name...........: _WriteCSV
; Description ...: Writes a CSV-file
; Syntax.........: _WriteCSV($sFile, Const ByRef $aData, $sDelimiter, $sQuote, $iFormat=0)
; Parameters ....: $sFile - Destination file
; $aData - [Const ByRef] 0-based 2D-Array with data
; $sDelimiter - [optional] Fieldseparator (default: ,)
; $sQuote - [optional] Quote character (default: ")
; $iFormat - [optional] character encoding of file (default: 0)
; |0 or 1 - ASCII writing
; |2 - Unicode UTF16 Little Endian writing (with BOM)
; |3 - Unicode UTF16 Big Endian writing (with BOM)
; |4 - Unicode UTF8 writing (with BOM)
; |5 - Unicode UTF8 writing (without BOM)
; Return values .: Success - True
; Failure - 0, sets #error to:
; |1 - No valid 2D-Array
; |2 - Could not open file
; Author ........: ProgAndy
; Modified.......:
; Remarks .......:
; Related .......: _ParseCSV
; Link ..........:
; Example .......:
; ===============================================================================================================================
Func _WriteCSV($sFile, Const ByRef $aData, $sDelimiter=',', $sQuote='"', $iFormat=0)
    Local Static $aEncoding[6] = [2, 2, 34, 66, 130, 258]
    If $sDelimiter = "" Or IsKeyword($sDelimiter) Then $sDelimiter = ','
    If $sQuote = "" Or IsKeyword($sQuote) Then $sQuote = '"'
    Local $iBound = UBound($aData, 1), $iSubBound = UBound($aData, 2)
    If Not $iSubBound Then Return SetError(1,0,0) ; no valid 2D array
    Local $hFile = FileOpen($sFile, $aEncoding[$iFormat])
    If @error Then Return SetError(2,@error,0)
    For $i = 0 To $iBound-1
        For $j = 0 To $iSubBound-1
            FileWrite($hFile, $sQuote & StringReplace($aData[$i][$j], $sQuote, $sQuote&$sQuote, 0, 1) & $sQuote)
            If $j < $iSubBound-1 Then FileWrite($hFile, $sDelimiter)
        Next
        FileWrite($hFile, @CRLF)
    Next
    FileClose($hFile)
    Return True
EndFunc
; === EXAMPLE ===================================================
;~ #include <Array.au3>
;~ $aResult = _ParseCSV(@ScriptDir & '\test.csv', "\", '$', 4)
;~ _ArrayDisplay($aResult)
;~ _WriteCSV(@ScriptDir & '\written.csv', $aResult, ',', '"', 5)
; ===============================================================
or this:
#region ;************ Includes ************
#include <Array.au3>
#endregion ;************ Includes ************
; _csvTo2DArray
Local $re = _csvTo2DArray("c:\Repository.csv", ';')
_ArrayDisplay($re)
Func _csvTo2DArray($file, $delim = ',')
    Local $content = FileRead($file)
    Local $rows_A = StringSplit(StringStripCR($content), @LF, 2)
    StringReplace($rows_A[0], $delim, $delim) ; count delimiters in the first row (@extended holds the number of replacements)
    Local $countColumns = @extended
    Local $columns_A = 0
    Local $2D_A[UBound($rows_A)][$countColumns + 1]
    For $z = 0 To UBound($rows_A) - 1
        $columns_A = StringSplit($rows_A[$z], $delim, 2)
        For $y = 0 To UBound($columns_A) - 1
            $2D_A[$z][$y] = $columns_A[$y]
        Next
    Next
    Return $2D_A
EndFunc   ;==>_csvTo2DArray
Then compare the cells with the normal String* functions, or use the _Array* functions on the resulting array.

… is there another way to achieve my end goal in AutoIt, or is there a direct equivalent to the command set outlined above that achieves my original plan?
Example using _ArraySearch() (replace ConsoleWrite() with whatever action is required):
#include <FileConstants.au3>
#include <File.au3>
#include <Array.au3>
Global Enum $CSV_COL_SITE, _
$CSV_COL_DISTRICT, _
$CSV_COL_REGION
Global Const $g_sFileCSV = 'C:\bin\roster.csv'
Global Const $g_sFileDelim = ','
Global Const $g_iColSearch = $CSV_COL_DISTRICT
Global Const $g_sValSearch = '1'
Global Const $g_sMessage = 'Matched row #%i : %s\n'
Global $g_iRow = 0
Global $g_aCSV
_FileReadToArray($g_sFileCSV, $g_aCSV, $FRTA_NOCOUNT, $g_sFileDelim)
While True
    $g_iRow = _ArraySearch($g_aCSV, $g_sValSearch, ($g_iRow ? $g_iRow + 1 : $g_iRow), 0, 0, 0, 1, $g_iColSearch, False)
    If @error Then ExitLoop
    ConsoleWrite(StringFormat($g_sMessage, $g_iRow, _ArrayToString($g_aCSV, $g_sFileDelim, $g_iRow, $g_iRow, '')))
WEnd
Console output (row numbers are offset by 1 because of the header record):
Matched row #2 : 1,1,1
Matched row #3 : 2,1,1
Matched row #4 : 3,1,2


Extract NetCDF Variable and create New NetCDF

I need some help with manipulating NetCDF files. In total I have 10 files, one for each of 10 years. Each year has the same set of variables, some of which also contain daily values. Here is one example of the structure:
(base) thess2ice#local:rhone_smb_modelling $ ncdump -h Rhone_AWS_1990.nc
netcdf Rhone_AWS_1990 {
dimensions:
x = 402 ;
y = 852 ;
time = 1460 ;
variables:
double x(x) ;
x:_FillValue = NaN ;
x:standard_name = "x" ;
x:long_name = "longitude" ;
x:units = "degrees_east" ;
double y(y) ;
y:_FillValue = NaN ;
y:standard_name = "y" ;
y:long_name = "latitude" ;
y:units = "degrees_north" ;
float HGT(y, x) ;
HGT:_FillValue = -9999.f ;
HGT:units = "m" ;
HGT:long_name = "Elevation" ;
float ASPECT(y, x) ;
ASPECT:_FillValue = -9999.f ;
ASPECT:units = "degrees" ;
ASPECT:long_name = "Aspect of slope" ;
float SLOPE(y, x) ;
SLOPE:_FillValue = -9999.f ;
SLOPE:units = "degrees" ;
SLOPE:long_name = "Terrain slope" ;
float MASK(y, x) ;
MASK:_FillValue = -9999.f ;
MASK:units = "boolean" ;
MASK:long_name = "Glacier mask" ;
int64 time(time) ;
time:units = "hours since 1990-01-01 00:00:00" ;
time:calendar = "proleptic_gregorian" ;
double T2(time, y, x) ;
T2:_FillValue = NaN ;
T2:units = "K" ;
T2:long_name = "Temperature at 2 m" ;
double RRR(time, y, x) ;
RRR:_FillValue = NaN ;
RRR:units = "mm" ;
RRR:long_name = "Total precipitation (liquid+solid)" ;
double ACC(y, x) ;
ACC:_FillValue = -9999. ;
ACC:units = "mm yr^-1" ;
ACC:long_name = "Accumulation from RRR_solid" ;
double MELT_I(y, x) ;
MELT_I:_FillValue = -9999. ;
MELT_I:units = "mm yr^-1" ;
MELT_I:long_name = "Melt from PDD" ;
double MELT_S(y, x) ;
MELT_S:_FillValue = -9999. ;
MELT_S:units = "mm yr^-1" ;
MELT_S:long_name = "Melt from PDD" ;
double SMB(y, x) ;
SMB:_FillValue = -9999. ;
SMB:units = "mm yr^-1" ;
SMB:long_name = "SMB from PDD" ;
I need to manipulate the data as input for a model. The variable I need to extract from each of the 10 NetCDF files is SMB, which holds only a yearly value for each grid cell. So I'd like to build a NetCDF of the form:
(year, y, x) for the SMB variable
I already know the ncks command for extracting only the SMB variable, but I cannot manage to apply it to multiple files at once (say, all .nc files in the current directory) and then combine them into one NetCDF file spanning all 10 years.
Can anybody help me with that?
Would be great!
Theresa
The NCO ncecat command, documented here, does exactly what you seem to want:
ncecat -u time -v SMB Rhone_AWS_199*.nc out.nc

Lua nested JSON: remove a single occurrence, or a list of occurrences if multiple

What I am trying to do here: for a given json_body (JSON decoded into a table using cjson), I want to remove an element identified by a configurable path conf.remove.json. I feel I am pretty close, but it is still not working. Is there a better way? Is there a safe way to find the table's depth and then reach out, so that conf.remove.json = I.want.to.remove.this produces the behavior json_table[I][want][to][remove][this] = nil without throwing some kind of NPE?
local configRemovePath = {}
local configRemoveDepth = 0
local recursiveCounter = 1

local function splitString(inputstr)
    sep = "%." -- Split on .
    configRemovePath = {}
    configRemoveDepth = 0
    for str in string.gmatch(inputstr, "([^"..sep.."]+)") do
        configRemovePath[configRemoveDepth + 1] = str
        configRemoveDepth = configRemoveDepth + 1
    end
end

local function recursiveSearchAndNullify(jsonTable)
    for key, value in pairs(jsonTable) do -- unordered search
        -- First iteration
        -- Sample JSON below, where conf.remove.json = data.id and nothing happened.
        -- {
        --   "data": {
        --     "d": 2,
        --     "id": 1
        --   }
        -- }
        -- value = {"d": 2, "id": 1}, key = "data", configRemovePath[recursiveCounter] = "data",
        -- configRemovePath = ['data','id'], configRemoveDepth = 2
        if (type(value) == "table" and value == configRemovePath[recursiveCounter] and recursiveCounter < configRemoveDepth) then
            -- The type is table, the current table is one we need to dive into,
            -- and we have not exceeded the configured remove depth level
            recursiveCounter = recursiveCounter + 1
            jsonTable = recursiveSearchAndNullify(value)
        else
            if (key == configRemovePath[recursiveCounter] and recursiveCounter == configRemoveDepth) then
                -- We are at the depth to remove and the key matches, so we delete.
                for key in pairs(jsonTable) do -- Remove all occurrences of said element
                    jsonTable[key] = nil
                end
            end
        end
    end
    return jsonTable
end

for _, name in iter(conf.remove.json) do
    splitString(name)
    if (configRemoveDepth == 0) then
        for name in pairs(json_body) do
            json_body[name] = nil
        end
    else
        recursiveCounter = 1 -- Reset to 1 for each call
        json_body = recursiveSearchAndNullify(json_body)
    end
end
Thanks to anyone who assists; this is my first day with Lua, so I am pretty new to it.
This is the official answer; I found a better way with the help of Christian Sciberras!
local json_body_test_one = { data = { id = {"a", "b"}, d = "2" } } -- decoded json w cjson
local json_body_test_two = { data = { { id = "a", d = "1" }, { id = "b", d = "2" } } }
local config_json_remove = "data.id"

local function dump(o) -- Method to print test tables for debugging
    if type(o) == 'table' then
        local s = '{ '
        for k, v in pairs(o) do
            if type(k) ~= 'number' then k = '"'..k..'"' end
            s = s .. '['..k..'] = ' .. dump(v) .. ','
        end
        return s .. '} '
    else
        return tostring(o)
    end
end

local function splitstring(inputstr, sep)
    if sep == nil then
        sep = "%." -- Dot notation default
    end
    local t = {}; i = 1
    for str in string.gmatch(inputstr, "([^"..sep.."]+)") do
        t[i] = str
        i = i + 1
    end
    return t
end

local function setjsonprop(json_object, path, newvalue)
    local configarray = splitstring(path)
    while (#configarray > 1) do
        json_object = json_object[table.remove(configarray, 1)]
        if (type(json_object) == "table" and #json_object > 0) then
            local recursepath = table.concat(configarray, ".")
            for _, item in pairs(json_object) do
                setjsonprop(item, recursepath, newvalue)
            end
            return
        end
    end
    json_object[table.remove(configarray, 1)] = newvalue
end

setjsonprop(json_body_test_one, config_json_remove, nil)
print(dump(json_body_test_one))
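For completeness, the same call also handles the second test table above, where data is an array of objects; that is the case the #json_object > 0 branch covers:
setjsonprop(json_body_test_two, config_json_remove, nil)
print(dump(json_body_test_two)) -- both array entries keep "d" but no longer contain "id"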

Sort lua table based on nested json value

We have a key-value pair in Redis, consisting of a key whose value is a JSON object with various information:
"node:service:i-01fe0d69c343734" :
"{\"port\":\"32781\",
\"version\":\"3.0.2\",
\"host-instance-id\":\"i-01fe0d69c2243b366\",
\"last-checkin\":\"1492702508\",
\"addr\":\"10.0.0.0\",
\"host-instance-type\":\"m3.large\"}"
Is it possible to sort the table based on the last-checkin time of the value?
Here is my solution to your problem, using the quicksort algorithm, after first making a small correction to your input (as I understood it):
-----------------------------------------------------
local json = require("json")
function quicksort(t, sortname, start, endi)
    start, endi = start or 1, endi or #t
    sortname = sortname or 1
    if (endi - start < 1) then return t end
    local pivot = start
    for i = start + 1, endi do
        if t[i][sortname] <= t[pivot][sortname] then
            local temp = t[pivot + 1]
            t[pivot + 1] = t[pivot]
            if (i == pivot + 1) then
                t[pivot] = temp
            else
                t[pivot] = t[i]
                t[i] = temp
            end
            pivot = pivot + 1
        end
    end
    t = quicksort(t, sortname, start, pivot - 1)
    return quicksort(t, sortname, pivot + 1, endi)
end
---------------------------------------------------------
-- I manually added the delimiter ","
-- and the "node:service..." names must be different
str = [[
{
"node:service:i-01fe0d69c343731" :
"{\"port\":\"32781\",
\"version\":\"3.0.2\",
\"host-instance-id\":\"i-01fe0d69c2243b366\",
\"last-checkin\":\"1492702506\",
\"addr\":\"10.0.0.0\",
\"host-instance-type\":\"m3.large\"}"
,
"node:service:i-01fe0d69c343732" :
"{\"port\":\"32781\",
\"version\":\"3.0.2\",
\"host-instance-id\":\"i-01fe0d69c2243b366\",
\"last-checkin\":\"1492702508\",
\"addr\":\"10.0.0.0\",
\"host-instance-type\":\"m3.large\"}"
,
"node:service:i-01fe0d69c343733" :
"{\"port\":\"32781\",
\"version\":\"3.0.2\",
\"host-instance-id\":\"i-01fe0d69c2243b366\",
\"last-checkin\":\"1492702507\",
\"addr\":\"10.0.0.0\",
\"host-instance-type\":\"m3.large\"}"
,
"node:service:i-01fe0d69c343734" :
"{\"port\":\"32781\",
\"version\":\"3.0.2\",
\"host-instance-id\":\"i-01fe0d69c2243b366\",
\"last-checkin\":\"1492702501\",
\"addr\":\"10.0.0.0\",
\"host-instance-type\":\"m3.large\"}"
}
]]
-- remove unnecessary \
str = str:gsub('"{','{'):gsub('}"','}'):gsub('\\"','"')
local t_res= json.decode(str)
-- prepare table before sorting
local t_indexed = {}
for k, v in pairs(t_res) do
    v["node-service"] = k
    t_indexed[#t_indexed + 1] = v
end
-- the quicksort algorithm is implemented only for indexed tables
local t_sort = quicksort(t_indexed, "last-checkin")
for k, v in pairs(t_sort) do
    print(k, v["node-service"], v["port"], v["version"], v["host-instance-id"], v["last-checkin"], v["addr"], v["host-instance-type"])
end
Console output:
1 node:service:i-01fe0d69c343734 32781 3.0.2 i-01fe0d69c2243b366 1492702501 10.0.0.0 m3.large
2 node:service:i-01fe0d69c343731 32781 3.0.2 i-01fe0d69c2243b366 1492702506 10.0.0.0 m3.large
3 node:service:i-01fe0d69c343733 32781 3.0.2 i-01fe0d69c2243b366 1492702507 10.0.0.0 m3.large
4 node:service:i-01fe0d69c343732 32781 3.0.2 i-01fe0d69c2243b366 1492702508 10.0.0.0 m3.large
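For comparison, Lua's built-in table.sort can produce the same ordering without a hand-written quicksort. A minimal sketch, reusing the t_indexed table prepared above (last-checkin is stored as a string, so compare it numerically):
table.sort(t_indexed, function(a, b)
    return tonumber(a["last-checkin"]) < tonumber(b["last-checkin"])
end)
for i, v in ipairs(t_indexed) do
    print(i, v["node-service"], v["last-checkin"])
end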

Convert JSON String to Lua Table?

I need to convert a JSON string to a Lua table data structure. I am using the following code:
local json = require "json"
local t = {
    ["name1"] = "value1",
    ["name2"] = { 1, false, true, 23.54, "a \021 string" },
    name3 = json.null
}
local encode = json.encode (t)
print (encode) --> {"name1":"value1","name3":null,"name2":[1,false,true,23.54,"a \u0015 string"]}
local decode = json.decode( encode )
But when I run the script, I get the following errors,
no field package.preload['json']
no file '/usr/local/share/lua/5.2/json.lua'
no file '/usr/local/share/lua/5.2/json/init.lua'
no file '/usr/local/lib/lua/5.2/json.lua'
no file '/usr/local/lib/lua/5.2/json/init.lua'
no file './json.lua'
no file '/usr/local/lib/lua/5.2/json.so'
no file '/usr/local/lib/lua/5.2/loadall.so'
no file './json.so'
So how can I convert my JSON string to a Lua table?
Maybe lua-cjson is your friend.
Install it, e.g. through LuaRocks:
$ sudo luarocks install lua-cjson
Then, in Lua:
local json = require('cjson')
local tab = json.decode(json_string)
json_string = json.encode(tab)
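For instance, a quick round trip with a made-up JSON string (a sketch; json here is the cjson module required above):
local t = json.decode('{"name1":"value1","name2":[1,false,true,23.54]}')
print(t.name1, t.name2[4]) --> value1   23.54
print(json.encode(t))      -- re-encoded JSON text (key order may vary)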
https://gist.github.com/tylerneylon/59f4bcf316be525b30ab
I found a pure Lua script to parse JSON data (just this one file):
local json = {}
-- Internal functions.
local function kind_of(obj)
if type(obj) ~= 'table' then return type(obj) end
local i = 1
for _ in pairs(obj) do
if obj[i] ~= nil then i = i + 1 else return 'table' end
end
if i == 1 then return 'table' else return 'array' end
end
local function escape_str(s)
local in_char = {'\\', '"', '/', '\b', '\f', '\n', '\r', '\t'}
local out_char = {'\\', '"', '/', 'b', 'f', 'n', 'r', 't'}
for i, c in ipairs(in_char) do
s = s:gsub(c, '\\' .. out_char[i])
end
return s
end
-- Returns pos, did_find; there are two cases:
-- 1. Delimiter found: pos = pos after leading space + delim; did_find = true.
-- 2. Delimiter not found: pos = pos after leading space; did_find = false.
-- This throws an error if err_if_missing is true and the delim is not found.
local function skip_delim(str, pos, delim, err_if_missing)
pos = pos + #str:match('^%s*', pos)
if str:sub(pos, pos) ~= delim then
if err_if_missing then
error('Expected ' .. delim .. ' near position ' .. pos)
end
return pos, false
end
return pos + 1, true
end
-- Expects the given pos to be the first character after the opening quote.
-- Returns val, pos; the returned pos is after the closing quote character.
local function parse_str_val(str, pos, val)
val = val or ''
local early_end_error = 'End of input found while parsing string.'
if pos > #str then error(early_end_error) end
local c = str:sub(pos, pos)
if c == '"' then return val, pos + 1 end
if c ~= '\\' then return parse_str_val(str, pos + 1, val .. c) end
-- We must have a \ character.
local esc_map = {b = '\b', f = '\f', n = '\n', r = '\r', t = '\t'}
local nextc = str:sub(pos + 1, pos + 1)
if not nextc then error(early_end_error) end
return parse_str_val(str, pos + 2, val .. (esc_map[nextc] or nextc))
end
-- Returns val, pos; the returned pos is after the number's final character.
local function parse_num_val(str, pos)
local num_str = str:match('^-?%d+%.?%d*[eE]?[+-]?%d*', pos)
local val = tonumber(num_str)
if not val then error('Error parsing number at position ' .. pos .. '.') end
return val, pos + #num_str
end
-- Public values and functions.
function json.stringify(obj, as_key)
local s = {} -- We'll build the string as an array of strings to be concatenated.
local kind = kind_of(obj) -- This is 'array' if it's an array or type(obj) otherwise.
if kind == 'array' then
if as_key then error('Can\'t encode array as key.') end
s[#s + 1] = '['
for i, val in ipairs(obj) do
if i > 1 then s[#s + 1] = ', ' end
s[#s + 1] = json.stringify(val)
end
s[#s + 1] = ']'
elseif kind == 'table' then
if as_key then error('Can\'t encode table as key.') end
s[#s + 1] = '{'
for k, v in pairs(obj) do
if #s > 1 then s[#s + 1] = ', ' end
s[#s + 1] = json.stringify(k, true)
s[#s + 1] = ':'
s[#s + 1] = json.stringify(v)
end
s[#s + 1] = '}'
elseif kind == 'string' then
return '"' .. escape_str(obj) .. '"'
elseif kind == 'number' then
if as_key then return '"' .. tostring(obj) .. '"' end
return tostring(obj)
elseif kind == 'boolean' then
return tostring(obj)
elseif kind == 'nil' then
return 'null'
else
error('Unjsonifiable type: ' .. kind .. '.')
end
return table.concat(s)
end
json.null = {} -- This is a one-off table to represent the null value.
function json.parse(str, pos, end_delim)
pos = pos or 1
if pos > #str then error('Reached unexpected end of input.') end
local pos = pos + #str:match('^%s*', pos) -- Skip whitespace.
local first = str:sub(pos, pos)
if first == '{' then -- Parse an object.
local obj, key, delim_found = {}, true, true
pos = pos + 1
while true do
key, pos = json.parse(str, pos, '}')
if key == nil then return obj, pos end
if not delim_found then error('Comma missing between object items.') end
pos = skip_delim(str, pos, ':', true) -- true -> error if missing.
obj[key], pos = json.parse(str, pos)
pos, delim_found = skip_delim(str, pos, ',')
end
elseif first == '[' then -- Parse an array.
local arr, val, delim_found = {}, true, true
pos = pos + 1
while true do
val, pos = json.parse(str, pos, ']')
if val == nil then return arr, pos end
if not delim_found then error('Comma missing between array items.') end
arr[#arr + 1] = val
pos, delim_found = skip_delim(str, pos, ',')
end
elseif first == '"' then -- Parse a string.
return parse_str_val(str, pos + 1)
elseif first == '-' or first:match('%d') then -- Parse a number.
return parse_num_val(str, pos)
elseif first == end_delim then -- End of an object or array.
return nil, pos + 1
else -- Parse true, false, or null.
local literals = {['true'] = true, ['false'] = false, ['null'] = json.null}
for lit_str, lit_val in pairs(literals) do
local lit_end = pos + #lit_str - 1
if str:sub(pos, lit_end) == lit_str then return lit_val, lit_end + 1 end
end
local pos_info_str = 'position ' .. pos .. ': ' .. str:sub(pos, pos + 10)
error('Invalid json syntax starting at ' .. pos_info_str)
end
end
return json
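A minimal usage sketch, assuming the module above is saved as json.lua somewhere on package.path:
local json = require 'json'
local t = json.parse('{"name1":"value1","name2":[1,false,true,23.54]}')
print(t.name1, t.name2[4])   --> value1   23.54
print(json.stringify(t))     -- re-encodes the table (key order may vary)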
You can use json-lua, a pure Lua implementation of JSON. First install json-lua using LuaRocks: luarocks install json-lua. Then use this code:
local json = require "json"
local t = {
    ["name1"] = "value1",
    ["name2"] = { 1, false, true, 23.54, "a \021 string" },
    name3 = json.null
}
local encode = json:encode (t)
print (encode) --> {"name1":"value1","name3":null,"name2":[1,false,true,23.54,"a \u0015 string"]}
local decode = json:decode( encode )
Tested and verified on Windows 7 64-bit with Lua 5.1. lua-cjson is fine, but it is not a pure Lua rock, so its installation will not be as easy for you.

Lua: print integer as a binary

How can I represent an integer as binary, so that I can print 7 as 111?
You write a function to do this.
num = 7

function toBits(num)
    -- returns a table of bits, least significant first.
    local t = {} -- will contain the bits
    while num > 0 do
        rest = math.fmod(num, 2)
        t[#t+1] = rest
        num = (num - rest) / 2
    end
    return t
end

bits = toBits(num)
print(table.concat(bits))
In Lua 5.2 you already have bitwise functions that can help you (the bit32 library).
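Since toBits() above returns the bits least significant first, here is a small sketch to print them in the conventional most-significant-first order:
local b = toBits(13)
for i = #b, 1, -1 do io.write(b[i]) end
print() --> 1101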
Here is the most-significant-first version, with optional leading 0 padding to a specified number of bits:
function toBits(num, bits)
    -- returns a table of bits, most significant first.
    bits = bits or math.max(1, select(2, math.frexp(num)))
    local t = {} -- will contain the bits
    for b = bits, 1, -1 do
        t[b] = math.fmod(num, 2)
        num = math.floor((num - t[b]) / 2)
    end
    return t
end
There's a faster way to do this that takes advantage of string.format, which converts numbers to base 8. It's trivial to then convert base 8 to binary.
-- create lookup table for octal to binary
oct2bin = {
    ['0'] = '000',
    ['1'] = '001',
    ['2'] = '010',
    ['3'] = '011',
    ['4'] = '100',
    ['5'] = '101',
    ['6'] = '110',
    ['7'] = '111'
}
function getOct2bin(a) return oct2bin[a] end

function convertBin(n)
    local s = string.format('%o', n)
    s = s:gsub('.', getOct2bin)
    return s
end
If you want to keep the results all the same size, then do
s = string.format('%.22o', n)
which gets you 66 bits. That's two extra leading bits, since octal works in groups of 3 bits and 64 isn't divisible by 3. If you want 33 bits, change the 22 to 11.
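A few example calls for the basic convertBin() above (each octal digit expands to three bits, so up to two leading zeros can appear):
print(convertBin(7))  --> 111
print(convertBin(45)) --> 101101
print(convertBin(1))  --> 001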
If you have the BitOp library, which is available by default in LuaJIT, then you can do this:
function convertBin(n)
    local t = {}
    for i = 1, 32 do
        n = bit.rol(n, 1)
        table.insert(t, bit.band(n, 1))
    end
    return table.concat(t)
end
But note this only does the first 32 bits! If your number is larger than 2^32, the result won't be correct.
function bits(num)
    local t = {}
    while num > 0 do
        rest = num % 2
        table.insert(t, 1, rest)
        num = (num - rest) / 2
    end
    return table.concat(t)
end
Posted since nobody else used table.insert, even though it's useful here: inserting at position 1 builds the result most significant bit first.
Here is a function inspired by the accepted answer, with corrected syntax, which returns a table of bits filled in from right to left (most significant bit first).
num = 255
bits = 8

function toBits(num, bits)
    -- returns a table of bits
    local t = {} -- will contain the bits
    for b = bits, 1, -1 do
        rest = math.fmod(num, 2)
        t[b] = rest
        num = (num - rest) / 2
    end
    if num == 0 then
        return t
    else
        return {'Not enough bits to represent this number'}
    end
end

bits = toBits(num, bits)
print(table.concat(bits))
>>11111111
function reverse(t)
    local nt = {} -- new table
    local size = #t + 1
    for k, v in ipairs(t) do
        nt[size - k] = v
    end
    return nt
end

function tobits(num)
    local t = {}
    while num > 0 do
        rest = num % 2
        t[#t+1] = rest
        num = (num - rest) / 2
    end
    t = reverse(t)
    return table.concat(t)
end
print(tobits(7))
# 111
print(tobits(33))
# 100001
print(tobits(20))
# 10100
local function tobinary( number )
    local str = ""
    if number == 0 then
        return 0
    elseif number < 0 then
        number = - number
        str = "-"
    end
    local power = 0
    while true do
        if 2^power > number then break end
        power = power + 1
    end
    local dot = true
    while true do
        power = power - 1
        if dot and power < 0 then
            str = str .. "."
            dot = false
        end
        if 2^power <= number then
            number = number - 2^power
            str = str .. "1"
        else
            str = str .. "0"
        end
        if number == 0 and power < 1 then break end
    end
    return str
end
It may seem more verbose, but it is actually faster than the other functions that rely on the math library. It works with any number: positive, negative, or fractional.
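A few example calls for the function above:
print(tobinary(7))    --> 111
print(tobinary(-6))   --> -110
print(tobinary(5.25)) --> 101.01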
local function tobits(num, str) -- tail call
    str = str or "B"
    if num == 0 then return str end
    return tobits(
        num >> 1, -- right shift
        ((num & 1) == 1 and "1" or "0") .. str)
end
This function uses a lookup table to print a binary number extracted from a hex representation, essentially all through string manipulation. Tested in Lua 5.1.
local bin_lookup = {
    ["0"] = "0000",
    ["1"] = "0001",
    ["2"] = "0010",
    ["3"] = "0011",
    ["4"] = "0100",
    ["5"] = "0101",
    ["6"] = "0110",
    ["7"] = "0111",
    ["8"] = "1000",
    ["9"] = "1001",
    ["A"] = "1010",
    ["B"] = "1011",
    ["C"] = "1100",
    ["D"] = "1101",
    ["E"] = "1110",
    ["F"] = "1111"
}

local print_binary = function(value)
    local hs = string.format("%.2X", value) -- convert number to HEX
    local ln, str = hs:len(), ""            -- get length of string
    for i = 1, ln do                        -- loop through each hex character
        local index = hs:sub(i, i)          -- each character in order
        str = str .. bin_lookup[index]      -- look up in the table
        str = str .. " "                    -- add a space
    end
    return str
end
print(print_binary(45))
#0010 1101
print(print_binary(65000))
#1111 1101 1110 1000
This may not work in a Lua build that has no bit32 library:
function toBinary(number, bits)
    local bin = {}
    bits = bits - 1
    while bits >= 0 do
        -- bit32.extract(1, 0) returns 1 and bit32.extract(1, 1) returns 0,
        -- so extract in reverse order to make the result read like normal binary
        table.insert(bin, bit32.extract(number, bits))
        bits = bits - 1
    end
    return bin
end

-- Expected result: 00000011
print(table.concat(toBinary(3, 8)))
This needs at least Lua 5.2 (because the code needs the bit32 library).
Like Dave's version, but with the empty bits filled in:
local function toBits(num, bits)
    -- returns a table of bits, least significant first.
    local t = {} -- will contain the bits
    bits = bits or 8
    while num > 0 do
        rest = math.fmod(num, 2)
        t[#t+1] = rest
        num = math.floor((num - rest) / 2)
    end
    for i = #t + 1, bits do -- fill empty bits with 0
        t[i] = 0
    end
    return t
end

for i = 0, 255 do
    local bits = toBits(i)
    print(table.concat(bits, ' '))
end
Result:
0 0 0 0 0 0 0 0
1 0 0 0 0 0 0 0
0 1 0 0 0 0 0 0
1 1 0 0 0 0 0 0
0 0 1 0 0 0 0 0
1 0 1 0 0 0 0 0
...
0 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1