Token Comma Expected - JSON Power Query

I am unable to run the query because it says there is a token comma expected and highlights the "donordrive" in "donordrive-password".
let
Source = Json.Document(Web.Contents("https://api.donordrive.com/cmndancemarathon/export/commit.JSON", [Headers=[#"donordrive-email" ="email#gmail.com”, #"donordrive-password" ="password2020"])),
result = Source[result],
#"Converted to Table" = Table.FromList(result, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"donationentereddate", "eventfiscalyear", "donationamount", "donationisregistrationfee", "participantid", "participantfirstname", "participantlastname", "teamname", "donorfirstname", "donorlastname"}, {"Column1.donationentereddate", "Column1.eventfiscalyear", "Column1.donationamount", "Column1.donationisregistrationfee", "Column1.participantid", "Column1.participantfirstname", "Column1.participantlastname", "Column1.teamname", "Column1.donorfirstname", "Column1.donorlastname"})
in
#"Expanded Column1"

WordPress escaping/quoting ONLY when inserting into the database

Here's the problem:
I have built a custom plugin... everything works except that when I update a record, everything gets escaped and magic-quoted.
I have run stripslashes_deep on $_POST and the rest; however, thanks to a mistake (the update SETs the id, which I know I have to fix), I can see the query that is ACTUALLY going in.
At any rate, all the magic quotes are in... how can I remove them?
Here's the var_dump of the data array JUST BEFORE the query is executed.
array(14) {
["id"]=> string(4) "'10'"
["number"]=> string(4) "'44'"
["title"]=> string(16) "'pippoasdasddad'"
["description"]=> string(7) "'pippo'"
["type"]=> string(11) "'Book Club'"
["platform"]=> string(7) "'pippo'"
["airdate"]=> string(12) "'2023-02-16'"
["duration"]=> string(7) "'17:38'"
["shownotes"]=> string(7) "'pippo'"
["authors"]=> string(20) "'Andy, Diego, Wiedo'"
["image_small"]=> string(7) "'pippo'"
["image_big"]=> string(7) "'pippo'"
["stream_link"]=> string(7) "'pippo'"
["published"]=> bool(false) }
And here is the error that shows what is ACTUALLY going in.
WordPress database error: [Duplicate entry '0' for key 'PRIMARY']
UPDATE `wp_ngof_episodes` SET `id` = '\'10\'', `number` = '\'44\'', `title` = '\'pippoasdasddad\'', `description` = '\'pippo\'', `type` = '\'Book Club\'', `platform` = '\'pippo\'', `airdate` = '\'2023-02-16\'', `duration` = '\'17:38\'', `shownotes` = '\'pippo\'', `authors` = '\'Andy, Diego, Wiedo\'', `image_small` = '\'pippo\'', `image_big` = '\'pippo\'', `stream_link` = '\'pippo\'', `published` = '' WHERE `id` = '10'
I tried removing the slashes with stripslashes_deep:
$POST = array_map('stripslashes_deep', $_POST);
and then used that $POST variable... no luck
Any ideas?
Well I am answering my own question as I realised I was the one actually adding the quotes... what a plonker, took me hours to find out.
Here it is:
Original
if (isset($_POST[$field['name']])) {
    $data[$field['name']] = "'" . stripslashes_deep($_POST[$field['name']]) . "'";
} else {
    $data[$field['name']] = false;
}
To
if (isset($_POST[$field['name']])) {
    $data[$field['name']] = stripslashes_deep($_POST[$field['name']]);
} else {
    $data[$field['name']] = false;
}

Power Query: Function to get Duplicates info for given Column Names

I need a function in Power Query that adds extra columns with information about duplicated data (not just keep/remove duplicates).
Example:
For the given table I want to get the following info for the duplicate-defining column set {"Date", "Product", "Color"}:
Minimal RowId - basically, the RowId of the 1st occurrence of the data
Nr. of Duplicate - a duplicate counter within the MinRowId group
NB! For non-duplicates it should return null values.
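For illustration (the screenshot from the original post is not included here, so this is a hypothetical sample built only from the column names used in the question and answers, with a 0-based duplicate counter as in the function below), the desired output for a small input would look like:
#table(
    {"RowId", "Date", "Product", "Color", "Amount", "MinRowId", "nDupl"},
    {
        {1, #date(2023, 1, 1), "Apple", "Red", 10, 1, 0},      // 1st occurrence of a duplicated set
        {2, #date(2023, 1, 1), "Apple", "Red", 12, 1, 1},      // its duplicate
        {3, #date(2023, 1, 1), "Pear", "Green", 7, null, null} // not duplicated, so nulls
    }
)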
Try grouping and then expanding in Power Query:
let
    Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
    #"Grouped Rows" = Table.Group(Source, {"Product", "Color"}, {
        {"data", each Table.AddIndexColumn(_, "nDupl", 0, 1, Int64.Type), type table},
        {"MinRowID", each List.Min(_[RowId]), type number}
    }),
    #"Expanded data" = Table.ExpandTableColumn(#"Grouped Rows", "data", {"RowId", "Date", "amount", "nDupl"}, {"RowId", "Date", "amount", "nDupl"})
in
    #"Expanded data"
Please try the following function:
Function call Example:
tfnAddDuplicatesInfo2(Source,{"Product","Color","Date"},"DuplInfo" ,"RowId")
Function Arguments:
srcTable as table, // input Table
inGroupBy as list, // List of ColumnNames to search duplicates
outDuplInfo as text, // Output ColumnName for Information about Duplicates - Duplicate number and Minimal RowId (if inRowId provided) within a group
optional inRowId as nullable text // RowId ColumnName - required for outMinRowId calculation for inGroupBy columns
Function body:
let
func = (
srcTable as table, // input Table
inGroupBy as list, // List of ColumnNames to search duplicates
outDuplInfo as text, // Output ColumnName for Information about Duplicates - Duplicate number and Minimal RowId (if inRowId provided) within a group
optional inRowId as nullable text // RowId ColumnName - required for outMinRowId calculation for inGroupBy columns
) =>
let
Source = srcTable,
// // To test as script
// inGroupBy = {"Product", "Color","Date"},
// outDuplInfo = "DuplInfo",
// inRowId = "RowId", // null, "RowId",
//> == Variables ===================================================
Columns2Expand = List.Combine({List.Difference(Table.ColumnNames(Source),inGroupBy),{"__outDuplCounter__"}}),
srcType = Value.Type(Source),
srcTypeRow=
Type.ForRecord(
Record.Combine(
{
Type.RecordFields(Type.TableRow(srcType)),
Type.RecordFields(type [__outDuplCounter__= Int64.Type])
}
),
false
),
RowIdType = if inRowId<>null then Type.TableColumn(srcType,inRowId) else Any.Type, // Stores Column Typename
//< == Variables ===================================================
#"Grouped Rows" = Table.Group(
Source,
inGroupBy,
{
{"__tmpCount__" , each Table.RowCount(_), Int64.Type},
{"__MinGroupRowId__", each if inRowId<> null then List.Min( Record.Field(_,inRowId) ) else null, RowIdType},
{"__AllRows__" , each Table.AddIndexColumn(_, "__outDuplCounter__", 0, 1, Int64.Type), type table srcTypeRow}
}
),
#"Expanded __AllRows__" = Table.ExpandTableColumn(#"Grouped Rows", "__AllRows__", Columns2Expand),
nulls4MinRowId = Table.ReplaceValue(#"Expanded __AllRows__",each [__tmpCount__]<=1, null,
(currentValue, isConditionTrue, replacementValue) => if isConditionTrue then null else currentValue, // Replace.Value function
if inRowId<>null then {"__MinGroupRowId__","__outDuplCounter__"} else {"__outDuplCounter__"}
),
Add_outDuplInfo =
if inRowId<> null then
Table.AddColumn(nulls4MinRowId, outDuplInfo,
each
if [__outDuplCounter__]=null
then null
else [MinRowId=[__MinGroupRowId__], nDupl = [__outDuplCounter__]] ,
type nullable [MinRowId = RowIdType, nDupl = Int64.Type]
)
else
Table.AddColumn(nulls4MinRowId, outDuplInfo, each [__outDuplCounter__], Int64.Type),
Result_tfnAddDuplMinRowId = Table.SelectColumns(Add_outDuplInfo, List.Combine({Table.ColumnNames(Source),{outDuplInfo}}))
in
Result_tfnAddDuplMinRowId,
documentation = [
Documentation.Name = " tfnAddDuplicatesInfo2 ",
Documentation.Description = " Adds two info columns for Duplicates - 1st occurrence RowId and given group Occurrence Number",
Documentation.LongDescription = " Adds two info columns for Duplicates - 1st occurrence RowId and given group Occurrence Number",
Documentation.Category = " Running Total ",
Documentation.Source = " ",
Documentation.Version = " 1.0 ",
Documentation.Author = " Denis Sipchenko ",
Documentation.Examples = {
[
Description = "tfnAddDuplicatesInfo2 arguments: ",
Code = "
srcTable as table, // input Table
inGroupBy as list, // List of ColumnNames to search duplicates
outDuplInfo as text, // Output ColumnName for Information about Duplicates - Duplicate number and Minimal RowId (if inRowId provided) within a group
optional inRowId as nullable text // RowId ColumnName - required for outMinRowId calculation for inGroupBy columns",
Result =""
],
[
Description = "tfnAddDuplicatesInfo2 function call example ",
Code = "
let Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText(""hZTBasMwEET/xWdDdteSbP9CT4U2h2JyCK1oQ0xS3IT8frUpWsmSqpxs4ccw2pn1NDXYtA3CBsYNAZE7PNn96cc93+w8n2/uZWwBml07NfwVTIS+nN+PK1SDZzuW1RG7PX3Y5Wb3y4r3uHKHDgrSz9fle7buRQ2e1e5EpuA4sORZw+x/NgIvtnu2jbGP42G5rMS73sMDw0MdlhuODKua68Ai8KT7CH49fH5dVqOOaI6QoO5DCX1PkeraKDTnSKquLdNDjhGLvgMtsE6NZHUKrEnrVBPuU8/F0El6jRykox+UlSR45DCJamEGmODhhpERGNOa5BeNaErrna0NSU3ovpJjXVpqQip1LcGLbZSVJJ1OMLsjBtcm/Y8Ux43BCwcKxa0s0UPqPC84/hV89ws="", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [RowId = Int64.Type, Date = date, Product = _t, Color = _t, Amount = Currency.Type])
in
tfnAddDuplicatesInfo2(Source,{""Product"",""Color"",""Date""},""DuplInfo"" ,""RowId"")
",
Result = "Adds to Source table ""DuplInfo"" column with records:
""MinRowId"" - Minimal RowId within within given group,
""nDupl"" - given group Occurence Number
"
],
[
Description = "tfnAddDuplicatesInfo2 function short call example ",
Code = "
let Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText(""hZTBasMwEET/xWdDdteSbP9CT4U2h2JyCK1oQ0xS3IT8frUpWsmSqpxs4ccw2pn1NDXYtA3CBsYNAZE7PNn96cc93+w8n2/uZWwBml07NfwVTIS+nN+PK1SDZzuW1RG7PX3Y5Wb3y4r3uHKHDgrSz9fle7buRQ2e1e5EpuA4sORZw+x/NgIvtnu2jbGP42G5rMS73sMDw0MdlhuODKua68Ai8KT7CH49fH5dVqOOaI6QoO5DCX1PkeraKDTnSKquLdNDjhGLvgMtsE6NZHUKrEnrVBPuU8/F0El6jRykox+UlSR45DCJamEGmODhhpERGNOa5BeNaErrna0NSU3ovpJjXVpqQip1LcGLbZSVJJ1OMLsjBtcm/Y8Ux43BCwcKxa0s0UPqPC84/hV89ws="", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [RowId = Int64.Type, Date = date, Product = _t, Color = _t, Amount = Currency.Type])
in
tfnAddDuplicatesInfo2(Source,{""Product"",""Color"",""Date""},""nDupl"")
",
Result = "Adds to Source table one column:
""nDupl"" - given group Occurence Number
"
]
}
]
in
Value.ReplaceType(func, Value.ReplaceMetadata(Value.Type(func), documentation))
P.S. The idea of grouping and then expanding an index column is borrowed from horseyride's post.
P.P.S. Initially I took Running Total by Category by Rick de Groot as a starting point and then reworked it.

Power Query (M) Get info using a function with an API

As a newbie, I have a question about Power Query (M).
I am looking for a way to extract some info from an API result.
For starters I am doing this:
I have created a query to get the title from a task.
This works fine:
let
Source = Web.Contents(#fxGetSource() & "/tasks/IEABCDQ7KQPO5DQ4",
[Headers=[#"Authorization"=#fxGetHeader()]]),
convertToJson = Json.Document(Source),
data = convertToJson[data],
ConvertedToTable = Table.FromList(data, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
ExpandedColumn1 = Table.ExpandRecordColumn(ConvertedToTable, "Column1", {"title"}),
TheTitle = Table.TransformColumnTypes(ExpandedColumn1,{{"title", type text}})
in
TheTitle
I would like the task id to sit in a variable, so I created a function:
(aTask as text) as text =>
let
Source = Web.Contents(#fxGetSource() & "/tasks/" & aTask,
[Headers=[#"Authorization"=#fxGetHeader()]]),
convertToJson = Json.Document(Source),
data = convertToJson[data],
ConvertedToTable = Table.FromList(data, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
ExpandedColumn1 = Table.ExpandRecordColumn(ConvertedToTable, "Column1", {"title"}),
TheTitle = Table.TransformColumnTypes(ExpandedColumn1,{{"title", type text}})
in
TheTitle
When I invoke this function and use the task id from above, I get:
Expression Error: We cannot convert a value of type Table to type Text.
change
(aTask as text) as text =>
to
(aTask as text) as table =>
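For completeness, here is a sketch of the corrected function built from the question's own code (fxGetSource and fxGetHeader are the asker's helper queries, written here as plain query references):
(aTask as text) as table =>
let
    // helper queries are assumed to return the base URL and the Authorization header value
    Source = Web.Contents(
        fxGetSource() & "/tasks/" & aTask,
        [Headers = [#"Authorization" = fxGetHeader()]]
    ),
    convertToJson = Json.Document(Source),
    data = convertToJson[data],
    ConvertedToTable = Table.FromList(data, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    ExpandedColumn1 = Table.ExpandRecordColumn(ConvertedToTable, "Column1", {"title"}),
    TheTitle = Table.TransformColumnTypes(ExpandedColumn1, {{"title", type text}})
in
    TheTitle
If a text result is really wanted, the signature could stay as text and the function could instead return the first title, e.g. a final step FirstTitle = TheTitle{0}[title].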

Expanding all columns simultaneously in Power Query

Need help expanding all columns in a spreadsheet simultaneously using Power Query. I have transposed the spreadsheet (the before/after screenshots are not included here).
Each table is a long column of values (9,000+ rows). I would like each column to be a separate ID. Expanding columns manually would be a tedious job, and our team is adding data from new study participants (IDs) regularly, so I need help creating code that can expand all columns simultaneously without having to indicate the column names (IDs) in the code. Thank you for your help!
This is the code I'm currently using:
let
    Source = Folder.Files("folder address goes here"),
    #"Filtered Rows" = Table.SelectRows(Source, each ([Extension] = ".xlsx")),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows", {"Content", "Extension", "Date accessed", "Date modified", "Date created", "Attributes"}),
    #"Added Custom" = Table.AddColumn(#"Removed Columns", "filepath", each [Folder Path] & [Name]),
    #"Removed Columns1" = Table.RemoveColumns(#"Added Custom", {"Folder Path"}),
    #"Added Custom1" = Table.AddColumn(#"Removed Columns1", "Custom", each Excel.Workbook(File.Contents([filepath]))),
    #"Expanded Custom" = Table.ExpandTableColumn(#"Added Custom1", "Custom", {"Name"}, {"Name.1"}),
    #"Filtered Rows1" = Table.SelectRows(#"Expanded Custom", each ([Name.1] = "Heart Period Time Series")),
    #"Added Custom2" = Table.AddColumn(#"Filtered Rows1", "Custom", each fnImportExcel3([filepath], [Name.1])),
    #"Removed Columns2" = Table.RemoveColumns(#"Added Custom2", {"filepath", "Name.1"}),
    #"Replaced Value" = Table.ReplaceValue(#"Removed Columns2", ".xlsx", "", Replacer.ReplaceText, {"Name"}),
    #"Transposed Table" = Table.Transpose(#"Replaced Value"),
    #"Promoted Headers" = Table.PromoteHeaders(#"Transposed Table")
in
    #"Promoted Headers"
You may use the following technique:
let
    // three sample tables of different lengths, each a single column of values
    t1 = #table({"1"}, List.Zip({{"a".."f"}})),
    t2 = #table({"2"}, List.Zip({{"d".."g"}})),
    t3 = #table({"3"}, List.Zip({{"a".."e"}})),
    input = #table({"Name", "Custom"}, {{"B1", t1}, {"B2", t2}, {"B3", t3}}),
    // turn each nested table into a plain list
    toList = Table.TransformColumns(input, {"Custom", Table.ToList}),
    // build one output column per Name; List.Zip pads shorter lists with null
    output = #table(toList[Name], List.Zip(toList[Custom]))
in
    output
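Applied to the query in the question, the final Transpose / Promote Headers steps could be replaced with the same zip idea; a sketch of the query's tail only (not standalone), assuming each nested table in [Custom] holds a single column of values as described:
    ...
    #"Replaced Value" = Table.ReplaceValue(#"Removed Columns2", ".xlsx", "", Replacer.ReplaceText, {"Name"}),
    // each nested table becomes a plain list of values
    toList = Table.TransformColumns(#"Replaced Value", {"Custom", Table.ToList}),
    // one output column per file/participant ID; shorter series are padded with null
    output = #table(toList[Name], List.Zip(toList[Custom]))
in
    output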

CodeIgniter Model Getting Error: Column count doesn't match value count at row 2

I am parsing currency rates from an rss.xml feed, and that all works great. I am now trying to insert that data into a database called rates with a table called tblRates, but I keep getting this error and do not know why. Here is the function in the model I am using to batch insert into the database.
function addIQDRates($Data) {
    if ($this->db->insert_batch('tblRates', $Data, 'Currency')) {
        return $this->db->affected_rows();
    } else {
        return FALSE;
    }
}
Also here is the foreach statement I am using in my controller to sort the data from the xml file and to insert it into the database.
$Data = array();
$Count = 0;
foreach ($xml->channel->item as $currencyInfo) {
    $Data[$Count]['Currency'] = trim(str_replace("/USD", "", $currencyInfo->title)); // UNIQUE
    $Data[$Count]['PubDate'] = date('Y-m-d H:i:s', strtotime(trim($currencyInfo->pubDate)));
    $Data['CXRate'] = trim(preg_replace("/[^0-9,.]/", "", str_replace("1 United States Dollar = ", "", $currencyInfo->description)));
    $Data[$Count]['DateCreated'] = date('Y-m-d H:i:s');
    $Count++;
}
$TotalRows = $this->mycron_model->addIQDRates($Data);
$TotalRows = $this->mycron_model->addIQDRates($Data);
Also here is my Create Table statement
CREATE TABLE IF NOT EXISTS `tblRates` (
`RateID` int(11) NOT NULL AUTO_INCREMENT,
`Currency` varchar(50) NOT NULL,
`PubDate` datetime NOT NULL,
`CXRate` int(11) NOT NULL,
`DateCreated` datetime NOT NULL,
PRIMARY KEY (`RateID`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1 AUTO_INCREMENT=1 ;
All help is greatly appreciated.
I am not sure, but you might have written $Data['CXRate'] instead of $Data[$Count]['CXRate'].
So the loop should look like below:
foreach ($xml->channel->item as $currencyInfo) {
    $Data[$Count]['Currency'] = trim(str_replace("/USD", "", $currencyInfo->title)); // UNIQUE
    $Data[$Count]['PubDate'] = date('Y-m-d H:i:s', strtotime(trim($currencyInfo->pubDate)));
    $Data[$Count]['CXRate'] = trim(preg_replace("/[^0-9,.]/", "", str_replace("1 United States Dollar = ", "", $currencyInfo->description)));
    $Data[$Count]['DateCreated'] = date('Y-m-d H:i:s');
    $Count++;
}