10k curl POST requests to Splunk - JSON

I have a workflow that generates around 10k JSON files. I'm using a bash script that uses curl to POST each JSON to Splunk as an event.
Each POST request takes time; is there an efficient way to post these 10k JSONs to the Splunk endpoint?
I originally had all 10k values in one single JSON file, but due to a limitation of Splunk (Splunk can only handle a JSON of around 1.2 million characters), it was suggested to split it and make each value in that JSON a separate JSON so Splunk can handle each as a separate event. So I ended up generating 10k JSON files.
Any ideas?

It seems you've answered your own question - namely, the JSON blob you're trying to POST is too big for the HEC to handle.
Split it into smaller chunks instead of trying to batch it all at once.
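For what it's worth, the HEC also accepts several events in a single request, so instead of 10k separate curl calls you can stack a few hundred event objects per POST and stay under whatever size limit your instance enforces. A rough Python sketch of that batching (the URL, token, file glob, and the ~1 MB cap are assumptions to adjust for your setup):

    # Batch many small HEC events into a handful of POSTs instead of 10k curls.
    # URL, token, file glob, and the size cap are assumptions -- adjust for your setup.
    import glob
    import json
    import requests

    HEC_URL = "https://splunk.example.com:8088/services/collector/event"  # placeholder
    HEC_TOKEN = "00000000-0000-0000-0000-000000000000"                    # placeholder
    MAX_BATCH_BYTES = 900_000  # stay safely below the request size limit you observed

    def post_batch(batch):
        """Send one newline-separated stack of HEC event objects."""
        resp = requests.post(
            HEC_URL,
            headers={"Authorization": f"Splunk {HEC_TOKEN}"},
            data="\n".join(batch).encode("utf-8"),
            verify=False,  # only if your HEC uses Splunk's self-signed certificate
        )
        resp.raise_for_status()

    batch, size = [], 0
    for path in glob.glob("events/*.json"):
        with open(path, encoding="utf-8") as f:
            event = json.dumps({"event": json.load(f)})
        if batch and size + len(event) > MAX_BATCH_BYTES:
            post_batch(batch)
            batch, size = [], 0
        batch.append(event)
        size += len(event) + 1
    if batch:
        post_batch(batch)

The same idea works in plain bash: concatenate N event objects into a temporary file and curl that, so you pay the HTTP and TLS overhead a few dozen times instead of 10,000.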

Related

C#: sending a JSON body that includes a file

I have a C# project that
gets a request to do work
performs the work
returns to the API a JSON object containing the results of the request
This is working fine. As the work is done, a log file gets populated with various information (date being processed, errors encountered, etc.). I want to include a request type that would tell my program to return the log file to the API. Because I want this to be "just another request", I am trying to figure out how to include the file as part of the JSON body being returned. Is this possible? Is there a way to include a file (the actual file - not just the contents of the file) in the JSON body?
I know how to return the file as its own POST - is there a way to return JSON AND a file in the same POST, or am I looking at two separate POST requests: one for the JSON body and a separate POST that sends the file?
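One common way to carry an actual file inside a JSON body is to base64-encode its bytes into a string field; a minimal sketch (in Python purely for brevity rather than C#, with invented field names):

    # Hypothetical sketch: embed a log file inside the JSON result payload
    # by base64-encoding its bytes. Field names are invented for illustration.
    import base64
    import json

    def build_response(results, log_path):
        with open(log_path, "rb") as f:
            log_b64 = base64.b64encode(f.read()).decode("ascii")
        return json.dumps({
            "results": results,              # the normal work results
            "log_file": {
                "name": log_path,
                "content_base64": log_b64,   # the receiver must base64-decode this
            },
        })

The trade-off is the base64 size overhead discussed in the multipart question further down; a multipart response with a JSON part plus a file part avoids that, at the cost of no longer being "just another JSON request".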

JMeter: response data from multiple requests used in later requests

I have a test plan that has many POST calls like
/api/v1/budgets
Now each of those calls has a response which returns a UUID from the database; I extract that using a JSON path extractor and save it to a variable.
After I'm done with all the POST calls, I need to do the same number of calls but with DELETE, using the UUIDs I got from the responses.
Is there an efficient way to extract those UUIDs? For now I had to add a JSON path extractor manually to each call.
And after that, is there a way to save them and loop over those saved variables, sending the next one each time?
Also, I'm going to use multiple users for each thread, so I don't know if JMeter will be able to handle that or if I need to manage the threads and the users per thread myself as well.
JMeter provides the ForEach Controller, which can iterate over variables having a numeric postfix like:
uuid_1
uuid_2
uuid_3
etc.
So you can store the UUIDs that way, for example using the __counter() function, and use a single HTTP Request under the ForEach Controller in order to delete them.
I would also recommend getting familiar with the Here's What to Do to Combine Multiple JMeter Variables article to learn how to work with compound variables in JMeter scripts.

JSON with too many characters error

I am building an application in which I am dumping the data from one table into a JSON object. When I try to use this JSON object via Angular I don't see the data. So I took the data from the JSON object and ran it through a parser, and was told that there are too many characters in my JSON object. What is this limit?
Part 2: if this is the case, any ideas on how I could break up the data? Thanks.
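JSON itself does not define a character limit; whatever ceiling was hit belongs to the specific parser or transport involved. If the payload really is too large, the usual fix is to page or chunk the data; a hedged sketch with a made-up limit:

    # Illustrative only: split a large list of row objects into smaller JSON
    # chunks so no single payload trips a parser or transport size limit.
    import json

    def chunk_rows(rows, max_chars=1_000_000):
        """Yield JSON strings, each holding as many rows as fit under max_chars."""
        batch, size = [], 2  # 2 accounts for the surrounding "[]"
        for row in rows:
            encoded = json.dumps(row)
            if batch and size + len(encoded) + 1 > max_chars:
                yield json.dumps(batch)
                batch, size = [], 2
            batch.append(row)
            size += len(encoded) + 1
        if batch:
            yield json.dumps(batch)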

Is using multipart/form-data any better than JSON + Base64?

I have a server and I need to upload files along with some fields from the client to the server. I have currently been using standard multipart/form-data.
I have found however that using multipart/form-data is not ideal. Objects on my server may have other objects nested within them, and thus are represented as a JSON object with other JSON objects embedded within.
I would like for the client to start making POST/PUT requests using a JSON representation exactly like it would expect in a GET request to the server, in a REST-ful manner. This way I don't have to flatten the fields which might be nested a couple layers within the JSON object in order to use multipart/form-data.
Problem is, JSON doesn't represent binary data. Multipart/form-data doesn't seem to have a way to represent fields nested within the values of other fields. But it does have much better handling of file-uploads.
I am at a loss for how to design this. Should I just have the client upload JSON with the fields encoded in base64, and take the 25% hit? Or should I have the JSON object being represented as some sort of "json" variable in a Multipart/form-data request, and have the binary files to be uploaded as another variable?
Should I just have the client upload JSON with the fields encoded in base64, and take the 25% hit?
The hit will be 33%, since 4/3 = 1.33.
Or should I have the JSON object being represented as some sort of "json" variable in a multipart/form-data request, and have the binary files to be uploaded as another variable?
This should work.
You might also consider this approach: send all files using multipart first, then get back some identifiers for the files in the response. Put those identifiers in your JSON and send it any way you like. This approach can be beneficial if you have many scenarios in which you send files: you always send them to the server with the same kind of request, get their identifiers, and after that do with them whatever you like.
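As a rough illustration of the second option (a "json" part alongside the binary parts), here is what such a request looks like with Python's requests library; the URL and field names are placeholders:

    # Sketch of a multipart/form-data request carrying a nested JSON document
    # in one part and a binary file in another. Names and URL are placeholders.
    import json
    import requests

    payload = {
        "title": "my upload",
        "metadata": {"owner": {"id": 42, "role": "admin"}},  # nesting stays intact
    }

    with open("photo.jpg", "rb") as f:
        resp = requests.post(
            "https://api.example.com/items",
            files={
                # (filename, content, content-type); filename=None marks a plain field
                "json": (None, json.dumps(payload), "application/json"),
                "photo": ("photo.jpg", f, "image/jpeg"),
            },
        )
    resp.raise_for_status()

The server parses the json part as a normal document, so the nesting stays intact, and the file parts are handled as ordinary uploads with no base64 overhead.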

MongoDB - Update collection hourly without interrupting normal query

I'm implementing a web service that needs to query a JSON file (size: ~100 MB; format: [{},{},...,{}]) about 70-80 times per second, and the JSON file will be updated every hour. "Query a JSON file" means checking whether there's a JSON object in the file that has an attribute with a certain value.
Currently I think I will implement the service in Node.js and import (mongoimport) the JSON file into a collection in MongoDB. When a request comes in, it will query the MongoDB collection instead of reading and looking up in the file directly. In the Node.js server there should also be a timer service which checks every hour whether the JSON file has been updated, and if it has, repopulates the collection with the data from the new file.
The JSON file is retrieved by sending a request to an external API. The API has two methods: methodA lets me download the entire JSON file; methodB is just an HTTP HEAD call that simply tells whether the file has been updated. I cannot get incrementally updated data from the API.
My problem is the hourly update. With the service running, requests come in constantly. When the timer detects an update to the JSON file, it will download it, and when the download finishes it will try to re-import the file into the collection, which I think will take at least a few minutes. Is there a way to do this without interrupting queries to the collection?
Above is my first idea for approaching this. Is there anything wrong with the process? Looking up in the file directly just seems too expensive, especially with requests coming in about 100 times per second.
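One pattern commonly used for the "repopulate without blocking readers" step (not something stated above, so treat it as an assumption) is to import the new dump into a staging collection and then rename it over the live collection, which swaps the data in as one quick step. A pymongo sketch with made-up names:

    # Hypothetical sketch: refresh the data hourly without leaving readers with an
    # empty collection, by loading into a staging collection and then renaming it
    # over the live one. Database/collection names and the file path are placeholders.
    import json
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    db = client["webservice"]

    def repopulate(json_path):
        staging = db["records_staging"]
        staging.drop()                                  # start from a clean slate
        with open(json_path, encoding="utf-8") as f:
            docs = json.load(f)                         # the hourly ~100 MB dump
        staging.insert_many(docs)                       # chunk this for very large files
        staging.create_index("someAttribute")           # mirror the live collection's index
        # Readers keep hitting "records" until this rename swaps the data in.
        staging.rename("records", dropTarget=True)

Queries that arrive during the refresh keep hitting the old data until the rename completes, so the collection never sits empty for the minutes a drop-and-reimport would take.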