Is it possible to generate a JSON file from a Power BI report? And if it is possible, how do I do it?
I tried to write a function like this in Power Query (saved as a separate query named GetJson):

(InputData) =>
let
    JsonOutput = Json.FromValue(InputData),
    OutputText = Text.FromBinary(JsonOutput)
in
    OutputText

and then called it from this query:

let
    Source = Excel.CurrentWorkbook(){[Name="Sales"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(
        Source,
        {{"Month", type text}, {"Sales", Int64.Type}}),
    Output = GetJson(#"Changed Type")
in
    Output
But it doesn't work with Cyrillic letters. Or maybe there is another way to get JSON?
Open your Power BI report.
Save it as dash_name.pbit (instead of the usual dash_name.pbix).
Rename the report to a .zip file (important!).
Open the .zip file.
Inside the archive, open the DataModelSchema file.
You've just got JSON.
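Alternatively, the Cyrillic problem in the function above is often an encoding mismatch. Both Json.FromValue and Text.FromBinary accept an optional encoding argument, so a sketch worth trying (untested against your data) makes UTF-8 explicit on both sides:

(InputData) =>
let
    // make the encoding explicit so non-ASCII text (e.g. Cyrillic)
    // survives the value -> binary -> text round trip
    JsonOutput = Json.FromValue(InputData, TextEncoding.Utf8),
    OutputText = Text.FromBinary(JsonOutput, TextEncoding.Utf8)
in
    OutputText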
I have the task of consuming some data from a web service via REST. To consume the data from Power BI, I used a blank query as the source and then added the following query in the Advanced Editor:
let
    url = "http://*********",
    headers = [#"Content-Type" = "application/json"],
    postData = Text.ToBinary("{""token"":""*************""}"),
    response = Web.Contents(
        url,
        [
            Headers = headers,
            Content = postData
        ]
    ),
    jsonResponse = Json.Document(response)
in
    jsonResponse
As a response I get a JSON document that looks like this:
{'result':'1',
 'message':'Successful Operation',
 'data':[
    {'idActivity':'1001',
     'organization':'ABC-001',
     'date':'6/10/2022 2:34:04 PM',
     'lat':'57.3497300',
     'lng':'-90.3929000',
     'status':'0',
     'company':'382',
     'tag':'0'},
    {'idActivity':'1002',
     'organization':'DEF-002',
     'date':'6/10/2022 2:21:15 PM',
     'lat':'83.6718200',
     'lng':'-23.3464000',
     'status':'0',
     'company':'932',
     'tag':'0'}]}
I would like to know if there is a way to convert this JSON into tables so the information can be shown in Power BI visuals.
Click the Parse JSON button; that should parse it automatically. You might need to replace your single quotes with double quotes (") first. Your sample JSON worked for me with Parse JSON once I did that.
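If you would rather do the conversion in M than through the UI, here is a minimal sketch that picks up from the jsonResponse step in the question (the dataList and activities step names are just illustrative):

let
    // ... same url / headers / postData / response steps as in the question ...
    jsonResponse = Json.Document(response),
    // the interesting rows sit in the "data" field as a list of records
    dataList = jsonResponse[data],
    // a list of records converts straight into a table
    activities = Table.FromRecords(dataList)
in
    activities

From there the usual column typing (dates, numbers) can be applied and the table used in visuals.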
I want to convert lines like the following from a file into JSON, which I then want to save into a mongoose schema.
>HWI-ST700660_96:2:1101:1455:2154#5#0/1
GAA…..GAATG
Should be:
{">HWI-ST700660_96:2:1101:1455:2154#5#0/1": "GAA…..GAATG"}
I have tried several options (one sample below), but with no success. Any suggestions?
const parser = require("csv-parse/lib/sync"); // import the synchronous parser
const fs = require("fs"); // file reader
const path = require("path"); // for joining paths

const sourceData = fs.readFileSync(path.join(__dirname, "Reads.txt"), "utf8"); // read the locally stored file
console.log(sourceData); // print out for checking

const documents = parser(sourceData); // parse; this works in other, column-like cases I have tested
console.log(documents); // print the result
This code gives me output like the following:
[ [ '>HWI-ST700660_96:2:1101:1455:2154#5#0/1' ],
[ 'GAATGGAATGAAATGGATAGGAATGGAATGGAATGGAATGGATTGGAATGGATTAGAATGGATTGGAATGGAATGAAATTAATTTGATTGGAATGGAATG' ],...
Similar question: fasta file reading python
Because you are using the parser's default configuration, it simply outputs arrays of arrays.
If you want to receive objects, you need to give the parser some options (columns) first. Take a look at the docs.
When using the sync parsing mode (as you are), you can provide options like this:

const documents = parser(sourceData, { columns: true });

columns: true will infer the column names from the first line of the input CSV.
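That said, columns: true yields tabular records; for the exact {header: sequence} object asked for, the FASTA-like structure (a ">" header line followed by sequence lines) may be easier to handle with a plain pairing pass. An untested sketch, reusing the file-reading setup from the question:

const fs = require("fs");
const path = require("path");

const sourceData = fs.readFileSync(path.join(__dirname, "Reads.txt"), "utf8");

// pair each ">" header line with the sequence line(s) that follow it
const records = {};
let currentHeader = null;
for (const line of sourceData.split(/\r?\n/)) {
  if (line.startsWith(">")) {
    currentHeader = line; // a new record starts
    records[currentHeader] = "";
  } else if (currentHeader) {
    records[currentHeader] += line.trim(); // append sequence data
  }
}

console.log(JSON.stringify(records));
// => {">HWI-ST700660_96:2:1101:1455:2154#5#0/1": "GAATGGAATG..."}

The resulting object can then be saved through a mongoose model as usual.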
I have a continuous stream of integers coming from an Arduino Uno I have set up. The input goes into PsychoPy (v1.85.2), and I would like this stream of numbers to be continuously saved to a .csv file with timestamps, for data-logging purposes.
I have confirmed that I'm receiving input from the Arduino using print port.readline(). I'm not sure why, but the actual integer stream simply isn't being written to the .csv file; only the timestamps are.
This is my code in PsychoPy:
import serial
import time
import csv

port = serial.Serial("COM3", 9600)
# by default, the Arduino resets on connection,
# give it some time to wake up.
time.sleep(1)

csvfile = "C:\Users\xxxx\Desktop\csvfile.csv"

while True:
    res = port.readline()
    with open(csvfile, 'a') as output:
        writer = csv.writer(output)
        now = time.strftime('%d-%m-%Y %H:%M:%S')
        writer.writerow([now, res])
I'm not sure if this is an issue with the serial reading from the Arduino, the fact that I'm running it through PsychoPy, or (most likely) some error in my code. Assistance is appreciated!
The issue was that port.readline() was returning a string with a \n (newline) at the end. To fix this, I used .strip() to remove the whitespace before and after the string, and finally converted the string to a float value.
while True:
    res = port.readline()
    resa = float(res.strip())
    with open(csvfile, 'a') as output:
        writer = csv.writer(output)
        now = time.strftime('%d-%m-%Y %H:%M:%S')
        writer.writerow([resa, now])
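As a side note on the design: reopening the file on every iteration works, but opening it once outside the loop and flushing each row is lighter. An untested sketch along those lines (the path placeholder is taken from the question; PsychoPy 1.85 runs Python 2, hence the binary append mode):

import csv
import time

import serial

port = serial.Serial("COM3", 9600)
time.sleep(1)  # the Arduino resets on connection; give it time to wake up

# raw string so the backslashes are taken literally
csvfile = r"C:\Users\xxxx\Desktop\csvfile.csv"

# 'ab' because the csv module wants binary mode on Python 2;
# on Python 3 use open(csvfile, 'a', newline='') instead
with open(csvfile, 'ab') as output:
    writer = csv.writer(output)
    while True:
        resa = float(port.readline().strip())
        now = time.strftime('%d-%m-%Y %H:%M:%S')
        writer.writerow([resa, now])
        output.flush()  # push each row to disk immediately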
I am importing a .CSV file from an Angular app into MVC, and I am able to get the file like this:
Int32 strLen, strRead;
System.IO.Stream stream = Request.InputStream;
strLen = Convert.ToInt32(stream.Length);
byte[] strArr = new byte[strLen];
strRead = stream.Read(strArr, 0, strLen);
Here the imported file ends up as a byte[] because I am reading it using
System.IO.Stream stream = Request.InputStream
Then I convert it into a string like this:
string a = System.Text.Encoding.UTF8.GetString(strArr);
and then try to split the content and retrieve the data, but that becomes very complex, and I wonder if there is an alternate way. Even for a simple .CSV file, the splitting-and-retrieval logic I end up applying to the converted string gets very messy.
Is there any efficient way to convert the imported .CSV file to JSON?
Save the stream as a text file in the TEMP folder.
Use any parser for working with the CSV file (example: FileHelpers).
Use any JSON helper to convert it to the output format (example: Newtonsoft). A rough sketch of this route is below.
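For illustration only, here is a minimal, hypothetical helper that hand-splits a simple header-based CSV (assuming no quoted commas) and serializes it with Newtonsoft's JsonConvert; for real-world files prefer a dedicated parser such as FileHelpers:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using Newtonsoft.Json;

public static class CsvToJson
{
    // reads the whole stream, treats line 1 as headers,
    // and maps every following line to a header->value dictionary
    public static string Convert(Stream input)
    {
        using (var reader = new StreamReader(input))
        {
            var lines = reader.ReadToEnd()
                .Split(new[] { "\r\n", "\n" }, StringSplitOptions.RemoveEmptyEntries);
            var headers = lines[0].Split(',');
            var rows = lines.Skip(1)
                .Select(line => line.Split(','))
                .Select(fields => headers
                    .Zip(fields, (h, f) => new { h, f })
                    .ToDictionary(x => x.h, x => x.f));
            return JsonConvert.SerializeObject(rows, Formatting.Indented);
        }
    }
}

Called as string json = CsvToJson.Convert(Request.InputStream); it returns a JSON array of objects, one per CSV row.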
You can use Cinchoo ETL, an open-source library, to convert CSV to JSON easily.
using (var parser = new ChoCSVReader("IgnoreLineFile1.csv")
    .WithField("PolicyNumber", 1)
    .WithField("VinNumber", 2)
    .Configure(c => c.IgnoreEmptyLine = true)
    .Configure(c => c.ColumnCountStrict = true)
    )
{
    using (var writer = new ChoJSONWriter("ignoreLineFile1.json")
        .WithField("PolicyNumber", fieldName: "Policy Number")
        .WithField("VinNumber", fieldName: "Vin Number")
        )
        writer.Write(parser.Skip(1));
}
In the above, you can pass a stream to the reader and the writer as well, to fit your requirement.
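For example, a sketch of that stream-based variation (I am assuming TextReader/TextWriter overloads here; check the Cinchoo ETL documentation for the exact signatures, and inputStream/outputStream stand for whatever streams you already have):

using (var parser = new ChoCSVReader(new StreamReader(inputStream))
    .WithField("PolicyNumber", 1)
    .WithField("VinNumber", 2)
    )
{
    using (var writer = new ChoJSONWriter(new StreamWriter(outputStream))
        .WithField("PolicyNumber", fieldName: "Policy Number")
        .WithField("VinNumber", fieldName: "Vin Number")
        )
        writer.Write(parser.Skip(1));
}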
Hope this helps.
Disclaimer: I'm the author of this library.
I have an SSIS package where I am trying to load data from an Excel file source into an OLE DB destination. I also want to load the file name (or rather, a part of the original file name) into the target table.
Can anyone provide some tips for this?
Screenshot of the SSIS package
If you don't want to parameterize, then use a Script Task: capture the connection string and extract the Excel file name from it with a substring.
Here are the steps:
First, create a variable "FileName" of type String.
Next, use a Script Task and add the variable created above to its ReadWriteVariables.
Use this script in your Script Task:
public void Main()
{
    // grab the Excel connection manager and its connection string
    ConnectionManager con = Dts.Connections["EX_SRC_CON"];
    string conStr = con.ConnectionString;

    // the file name sits between the last backslash and ".xlsx"
    int firstCharacter = conStr.LastIndexOf("\\");
    int lastCharacter = conStr.IndexOf(".xlsx");
    string fileName = conStr.Substring(firstCharacter + 1, lastCharacter - (firstCharacter + 1));

    // hand the extracted name back to the package variable
    Dts.Variables["User::FileName"].Value = fileName;
    Dts.TaskResult = (int)ScriptResults.Success;
}
Note that "EX_SRC_CON" is your Excel Connection Manager, from this only, we will extract the Excel File Name. From the script, you can see that we are writing the Excel File into string "FileName"
Next, connect the Script task to your Data flow task (DFT). In the DFT, in the Excel Source, select the Data access mode : "SQL Command" and use the following script
SELECT *, ?
FROM [PolicyList$]
PolicyList$ is your sheet name, and you need to map the variable "FileName" to the ? under the Parameters tab.
Finally, connect the columns from the Excel Source to your OLE DB Destination.