ArcGIS Developers - export data from a FeatureServer into a CSV file

Good day to all,
I am new to Survey123 and ArcGIS Online, and after many attempts I am asking the community for help.
I am trying to export data from the ArcGIS REST API in CSV format from the command line. Since I have hundreds of Feature Layers to export, I do not want to download them one by one through the "Export Data" button.
I have already downloaded the data in JSON format through the REST API, but converting the nested JSON into CSV is tricky, so I would prefer to download the CSV file from ArcGIS directly from the command line.
For information, the feature layer data I want to export comes from the results of a survey made with Survey123.
I therefore have a table of data accessible at the following link: https://developers.arcgis.com/layers/6f8228f7bfa24a61a029aa9751da55d8/data/0
There is a Python tutorial for exporting CSV from a layer, but in that tutorial the layer is a dataset, whereas I have a Feature Layer, not a dataset:
https://developers.arcgis.com/labs/python/download-data/
The documentation for querying a feature layer is available at the following links, but a query can only return HTML or JSON objects:
https://developers.arcgis.com/labs/rest/query-a-feature-layer/
https://developers.arcgis.com/rest/services-reference/query-feature-service-layer-.htm
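For what it's worth, here is the kind of JSON-to-CSV flattening I am trying to avoid having to maintain (a rough Python sketch; attributes only, no geometry, and no paging over the service's maxRecordCount):

```python
import csv
import json
import urllib.request

def features_to_csv(features, path):
    """Flatten the 'attributes' dict of each feature into one CSV row."""
    if not features:
        return
    fieldnames = list(features[0]["attributes"].keys())
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        for feat in features:
            writer.writerow(feat["attributes"])

def query_layer_to_csv(layer_url, path, token=None):
    """Query every record of a feature layer (f=json) and write it as CSV."""
    params = "query?where=1%3D1&outFields=*&f=json"
    if token:
        params += "&token=" + token
    with urllib.request.urlopen(layer_url + "/" + params) as resp:
        data = json.load(resp)
    features_to_csv(data.get("features", []), path)
```

This only handles flat attribute tables; nested fields or geometry would need extra handling, which is exactly the part that gets tricky.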
The documentation for exporting data is available at the following link:
https://developers.arcgis.com/rest/users-groups-and-items/export-item.htm
However, I cannot find the request that returns the CSV data from this table. I know how to do it through the web interface, but I want to do it with requests.
I have already tried the following requests with Postman, but unfortunately none of them worked:
https://mycompany.maps.arcgis.com/home/rest/content/users/chargeetudeti/export?token=UsSMUWq.......&itemId=6f8228f7bfa24a61a029aa9751da55d8&exportFormat=CSV
https://mycompany.maps.arcgis.com/sharing/rest/content/users/chargeetudeti/export?token=UsSMUWq.......&itemId=6f8228f7bfa24a61a029aa9751da55d8&exportFormat=CSV
https://www.arcgis.com/sharing/rest/content/users/chargeetudeti/export?token=UsSMUWq.......&itemId=6f8228f7bfa24a61a029aa9751da55d8&exportFormat=CSV&exportParameters={"layers" : [ { "id" : 0 } ] }
https://services6.arcgis.com/xNifwnpI45tLCPU7/ArcGIS/rest/services/service_70df166a7c9649878cc260d81ee85e4e/FeatureServer/export?token=UsSMUWq.......&exportFormat=CSV&exportParameters={"layers" : [ { "id" : 0 } ] }
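One thing I am not sure about is whether the export operation expects a POST with form-encoded parameters rather than a GET with query-string parameters. A sketch of what I am trying (the org URL and credentials are placeholders from the attempts above):

```python
import json
import urllib.parse
import urllib.request

def build_export_request(org_url, username, item_id, token):
    """Build the URL and form body for the export-item operation."""
    url = org_url + "/sharing/rest/content/users/" + username + "/export"
    body = urllib.parse.urlencode({
        "itemId": item_id,
        "exportFormat": "CSV",
        "exportParameters": json.dumps({"layers": [{"id": 0}]}),
        "token": token,
        "f": "json",
    }).encode()
    return url, body

def export_item_to_csv(org_url, username, item_id, token):
    """Submit the export job; the JSON response describes the exported item."""
    url, body = build_export_request(org_url, username, item_id, token)
    # passing data= makes urlopen issue a POST instead of a GET
    with urllib.request.urlopen(url, data=body) as resp:
        return json.load(resp)
```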
Do you have any clues?
I have made the data public so you can try to export it to CSV from this feature layer: https://services6.arcgis.com/xNifwnpI45tLCPU7/arcgis/rest/services/service_70df166a7c9649878cc260d81ee85e4e/FeatureServer
I would be very grateful for any help :)
If you think the title of this post or its tags are not appropriate, please let me know.
Adrien

Related

Is there a template for uploading information to IBM Maximo using JSON files?

Is there any way to upload files to IBM Maximo using JSON files with an external system?
Let me explain: I have created my object structure and I want to upload information. Today I do it with a CSV file with the following structure:
<EXTSYSNAME>;<IFACENAME>;;<LANGUAGE> <ATRIBUTE1>;<ATRIBUTE2>;<ATRIBUTE3> <VALUE1>;<VALUE2>;<VALUE3>
and everything works fine.
Now my question is: could I do this using JSON instead of CSV, and if so, what would the structure be?
Thanks in advance
I found this template, but it only works for domains:
{
  "domainid": "DOMAIN1",
  "domaintype": "ALN",
  "maxtype": "ALN",
  "description": "DESCRIPTION FOR DOMAIN 1",
  "length": 30,
  "alndomain": [
    {
      "value": "ALNVALUE1",
      "description": "ALN value 1",
      "orgid": "",
      "siteid": ""
    }
  ],
  "_action": "CREATE"
}

Ingest JSON-stat from a REST API to CSV in Azure Data Factory

I want to make a request to a REST API of the Norwegian statistics bureau.
According to their API, I need to make a POST request (that is unusual, but OK) to get the tables that I need. In the body of the POST request I can specify which kind of table I want.
What I get back is a JSON-stat file. I need some help with how to handle it in Azure Data Factory. I read in the documentation that ADF supports only JSON when using a REST API. Does that mean it also supports JSON-stat? If so, how can I handle it in the activity (source/sink)?
Thanks in advance,
Nunotrt
Here is a sample process for ingesting a JSON-stat file from a REST API:
1. Create a new REST dataset and create linked services to the REST API.
2. Provide the REST dataset as the source for the copy activity, using the POST request method and providing the request body: { "query": [ { "code": "Region", "selection": { "filter": "agg:KommSummer", "values": [ "K-3006" ] } } ], "response": { "format": "json-stat2" } }
3. Create a linked service for the sink dataset, and set a file path on the sink dataset so the file is written to the target location.
The result comes out as expected.
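Expanded for readability, the request body used above is:

```json
{
  "query": [
    {
      "code": "Region",
      "selection": {
        "filter": "agg:KommSummer",
        "values": ["K-3006"]
      }
    }
  ],
  "response": { "format": "json-stat2" }
}
```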
If you want to convert a JSON-stat file to CSV or Parquet, an alternative is to run Python code as part of your Data Factory job.
The Python library pyjstat makes it easy to convert JSON-stat to a DataFrame, and from a DataFrame to any other file format.
Here is an example with JSON-stat 1.2:
from pyjstat import pyjstat as pyj
# URL of a JSON-stat dataset from Statistics Norway (SSB)
file_ssb = 'https://data.ssb.no/api/v0/dataset/1102.json'
# Read the JSON-stat data into memory
dataset = pyj.Dataset.read(file_ssb)
# Convert the JSON-stat dataset to a pandas DataFrame
df = dataset.write('dataframe')
print(df)
# Save as CSV
df.to_csv('../Data/test_jsonstat.csv')

Importing Well-Structured JSON Data into Elasticsearch via CloudWatch

Is there known science for getting JSON data logged via CloudWatch imported into an Elasticsearch instance as well-structured JSON?
That is, I'm logging JSON data during the execution of an AWS Lambda function.
This data is available via Amazon's CloudWatch service.
I've been able to import this data into an Elasticsearch instance using Functionbeat, but the data comes in as an unstructured message:
"_source" : {
  "@timestamp" : "xxx",
  "owner" : "xxx",
  "message_type" : "DATA_MESSAGE",
  "cloud" : {
    "provider" : "aws"
  },
  "message" : "xxx xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx INFO { foo: true, duration_us: 19418, bar: 'BAZ', duration_ms: 19 }"
}
What I'm trying to do is get a document indexed into Elasticsearch that has a foo field, a duration_us field, a bar field, etc., instead of one that has a plain-text message field.
It seems like there are a few different ways to do this, but I'm wondering if there's a well-trodden path for this sort of thing using Elastic's default tooling, or if I'm doomed to one more one-off hack.
Functionbeat is a good starting point and will allow you to keep it as "serverless" as possible.
To process the JSON, you can use the decode_json_fields processor.
The problem is that your message isn't really JSON though. Possible solutions I could think of:
A dissect processor that extracts the JSON message to pass it on to the decode_json_fields — both in the Functionbeat. I'm wondering if trim_chars couldn't be abused for that — trim any possible characters except for curly braces.
If that is not enough, you could do all the processing in Elasticsearch's Ingest pipeline where you probably stitch this together with a Grok processor and then the JSON processor.
Only log a JSON message if you can to make your life simpler; potentially move the log level into the JSON structure.
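A rough sketch of the dissect-plus-decode idea in the Functionbeat configuration (the tokenizer pattern and field names are assumptions based on the example log line above, not tested):

```yaml
processors:
  # split the plain-text prefix from the JSON-ish payload in "message"
  - dissect:
      tokenizer: "%{log_prefix} INFO %{payload}"
      field: "message"
      target_prefix: ""
  # parse the extracted payload into top-level fields (foo, duration_us, ...)
  - decode_json_fields:
      fields: ["payload"]
      target: ""
```

Note that decode_json_fields expects strict JSON; the example payload above has unquoted keys, so logging JSON.stringify output from the Lambda (the last suggestion) is what would make this work cleanly.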

How to connect a JSON file with After Effects for rendering dynamic videos

I want to learn how to connect a JSON file with After Effects to render dynamic videos.
E.g. I have a form on some web page:
this form includes one input where people enter their name.
I then create a JSON file from this form, as an array of objects:
data = [
  { name: 'John' },
  { name: 'Mike' }
]
And from these JSON objects I want to create, for each name, a video of a few seconds that shows just the name from the JSON, and render it as an MP4 video.
How do I do that?
Which steps should I follow?
If it is a web form, I think I'll need to connect the JSON file dynamically too, right?
So After Effects will read this JSON file from some URL?
There are many ways to go about doing this, but a single answer on Stack Overflow probably won't give you everything you need.
Network communication can be done using the CEP framework provided by Adobe, which can then execute ExtendScript code that actually does the manipulation of the layers inside the AEP project file. You can use Node modules to perform the network communication, and then write ExtendScript code that passes in the JSON data.
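As a minimal ExtendScript sketch of that approach (the comp index, text layer name, and output path are assumptions; this runs inside After Effects, not in Node, and recent AE versions render MP4 via Adobe Media Encoder rather than the built-in render queue):

```javascript
// ExtendScript: run inside After Effects (File > Scripts > Run Script File)
var data = [{ name: "John" }, { name: "Mike" }]; // normally parsed from your JSON file

var comp = app.project.item(1); // assumption: the template comp is the first project item
for (var i = 0; i < data.length; i++) {
  // assumption: the comp contains a text layer named "nameText"
  comp.layer("nameText").property("Source Text").setValue(data[i].name);

  // queue and render one movie per name, before the text changes again
  var rqItem = app.project.renderQueue.items.add(comp);
  rqItem.outputModule(1).file = new File("~/renders/" + data[i].name + ".mov");
  app.project.renderQueue.render();
  rqItem.remove();
}
```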
While not free, you might want to explore Dataclay's Templater extension to help you accomplish what you want. It not only does what you are asking out of the box, but it has some rules-based AI to reconfigure layers both temporally and spatially. You can point Templater to a URL that responds with an array of JSON objects and have it process that data. In addition, it has event hooks that allow you to execute any script in the shell or with the ExtendScript engine during its versioning process.
Hope this helps!

Issue with attachments in a UI5 application with a Notes database as the backend

I am developing an application in UI5. The model is a Notes database, and I am fetching the Notes document via JSON. I want to display the attachments in an UploadCollection in UI5. The JSON returns the attachments, but I am unable to make sense of it. For example, I attached one PDF document, but the JSON looks as if I had attached 3 different files. Also, the file data is in Base64 format.
I want to be able to download and upload the attachments.
Here are the attachment field details in the JSON (there is only one file, "Domino Access Service.pdf", in the field and nothing else):
"Attach_ProductDetails": {
  "type": "multipart",
  "content": [
    {
      "contentType": "multipart\/mixed; Boundary=\"0__=4EBB0B01DFD9A4D28f9e8a93df938690918c4EBB0B01DFD9A4D2\""
    },
    {
      "contentType": "multipart\/alternative; Boundary=\"1__=4EBB0B01DFD9A4D28f9e8a93df938690918c4EBB0B01DFD9A4D2\"",
      "boundary": "--0__=4EBB0B01DFD9A4D28f9e8a93df938690918c4EBB0B01DFD9A4D2"
    },
    {
      "contentType": "text\/plain; charset=US-ASCII",
      "data": " (See attached file: 1. Domino Access Service.pdf)",
      "boundary": "--1__=4EBB0B01DFD9A4D28f9e8a93df938690918c4EBB0B01DFD9A4D2"
    },
    {
      "contentType": "text\/html; charset=US-ASCII",
      "contentDisposition": "inline",
      "data": "<html><body><i>(See attached file: 1. Domino Access Service.pdf)<\/i><\/body><\/html>\r\n",
      "boundary": "--1__=4EBB0B01DFD9A4D28f9e8a93df938690918c4EBB0B01DFD9A4D2"
    },
    {
      "contentType": "application\/pdf; name=\"1. Domino Access Service.pdf\"",
      "contentID": "<1__=4EBB0B01DFD9A4D28f9e8a93df93869091#local>",
      "contentDisposition": "attachment; filename=\"1. Domino Access Service.pdf\"",
      "contentTransferEncoding": "base64",
      "data": "<Base64 data>",
      "boundary": "--0__=4EBB0B01DFD9A4D28f9e8a93df938690918c4EBB0B01DFD9A4D2"
    }
  ]
}
It would be a great help if anyone has a solution for this.
It is not giving you three files. It is showing you two alternative renderings of the rich text field called Attach_ProductDetails which contains the icon representing the attached file - which it thinks you might want. There could also be other data in that rich text field. The API has no idea what part of it you want, so it gives you everything - and in case you're not prepared to deal with text/html it also gives you a text/plain rendering, too.
It is also giving you the file attachment data, tagged with the "application/pdf" content-type. You need to decode the base64 data and store it so you can display it (or whatever else your application wants to do with it).
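A minimal sketch of that last step (the field structure follows the JSON above; the surrounding request handling and the fallback filename are assumptions):

```python
import base64

def extract_attachments(field):
    """Pull base64-encoded attachment parts out of a Domino multipart field.

    Returns a list of (filename, bytes) for every part whose
    contentTransferEncoding is base64, skipping the text/plain and
    text/html renderings of the rich text field.
    """
    results = []
    for part in field.get("content", []):
        if part.get("contentTransferEncoding") != "base64":
            continue
        # the filename is embedded in the contentDisposition header
        disposition = part.get("contentDisposition", "")
        name = "attachment.bin"
        if 'filename="' in disposition:
            name = disposition.split('filename="')[1].split('"')[0]
        results.append((name, base64.b64decode(part["data"])))
    return results
```

Given the JSON above, something like `extract_attachments(doc["Attach_ProductDetails"])` would yield the decoded PDF bytes, which you can then store or serve to the UploadCollection.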