jmeter.functions.FileToString not finding file location - json

In JMeter I am passing multiple JSON payloads as the request body. The variable JSON_FILE is defined in a CSV Data Set Config:
${__FileToString(${__eval(${JSON_FILE})}.json,,)}
CSV Data
designO1015643320
.
.
designO1077673985
designO1088516727
Running the load test from the JMeter GUI works fine, but running it as a Maven project gives a FileNotFoundException, even though the .csv and .json files are in the same folder as the .jmx file.
Error from the .jmx.log:
WARN - jmeter.functions.FileToString: Could not read file: designO1015643320.json File 'designO1015643320.json' does not exist java.io.FileNotFoundException: File 'designO1015643320.json' does not exist
Response in .jtl:
httpSample t="4" lt="0" ts="1508530091457" s="false" lb="CreateDesign_PUT" rc="Non HTTP response code: org.apache.jorphan.util.JMeterStopThreadException" rm="Non HTTP response message: End of sequence" tn="Design_APIs 1-1" dt="text" by="1822" ng="1" na="1"/>

The JMeter GUI's default relative path is the bin folder:
Relative paths are resolved relative to the current working directory (which defaults to the bin/ directory).
The JMeter Maven plugin, however, searches for files in a different default path: the src/test/jmeter directory. See the guide:
When running the project, the JMeter Maven plugin searches for tests to run in the src/test/jmeter directory.
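For reference, a minimal sketch of the relevant plugin configuration (assuming the com.lazerycode.jmeter:jmeter-maven-plugin is the one in use; the testFilesDirectory override is only needed if your .jmx, .csv and .json files live somewhere other than src/test/jmeter):
<plugin>
  <groupId>com.lazerycode.jmeter</groupId>
  <artifactId>jmeter-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>jmeter-tests</id>
      <goals>
        <goal>jmeter</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <!-- default test/data file location; only override if your files live elsewhere -->
    <testFilesDirectory>${project.basedir}/src/test/jmeter</testFilesDirectory>
  </configuration>
</plugin>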
You can also find this base directory dynamically at runtime.

Groovy is the new black, so I would recommend replacing your __FileToString() function with the __groovy() function. The Groovy equivalent, which resolves the file path dynamically relative to the Maven plugin's current working directory, would be something like:
${__groovy(new File(org.apache.jmeter.services.FileServer.getFileServer().getBaseDir() + System.getProperty('file.separator') + vars.get('JSON_FILE') + '.json').text,)}
See the JavaDoc for the FileServer class for more details.

Related

Couldn't load file in neo4j

I used the following commands in Neo4j, but the system always responds with the following error message:
"Couldn't load the external resource at: file:/import/Tokyo_subway_system.csv ()"
Here is my script:
load csv with headers from "file:///Tokyo_subway_system.csv" as csvLine
create (s:Station {id: toInteger(csvLine.id), station_No: csvLine.station_No, station_Name: csvLine.station_Name, station_English: csvLine.station_English, line_Name: csvLine.line_Name ,line_English: csvLine.line_English, latitude: csvLine.latitude, longitade: csvLine.longitade})
Find the $NEO4J_HOME/import/ folder on your server or local machine, then copy Tokyo_subway_system.csv into that directory. If you have multiple versions of Neo4j installed, make sure you are working with the right Neo4j home directory.
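For example, something like this (a minimal sketch assuming shell access to the server and that $NEO4J_HOME points at the installation actually being used):
# copy the CSV into Neo4j's import directory, then re-run the LOAD CSV statement above
cp Tokyo_subway_system.csv "$NEO4J_HOME/import/"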

electron-builder generate latest.json instead of latest.yml

electron-builder generates "latest.yml", a blockmap, and an exe for Windows, but in our production environment YAML is not accepted, so "latest.yml" needs to become "latest.json". What configuration is required to change "latest.yml" to "latest.json"?
electron-builder#^22.9.1
We tried it; there are no configuration options to switch to JSON, so we converted from YAML to JSON in the Jenkins build instead. electron-builder uses the js-yaml Node module to parse the YAML response, and js-yaml accepts both JSON and YAML, so if you send JSON instead of YAML the present version of electron-updater accepts it and works fine.
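For reference, a minimal sketch of the kind of conversion step we run in the Jenkins build (assuming Node.js with js-yaml available, and that electron-builder wrote the manifest to dist/latest.yml; adjust the paths to your output directory):
// convert-latest.js: run as a Jenkins build step after electron-builder
const fs = require('fs');
const yaml = require('js-yaml');

// parse the YAML update manifest written by electron-builder
const manifest = yaml.load(fs.readFileSync('dist/latest.yml', 'utf8'));

// write the same data back out as latest.json next to it
fs.writeFileSync('dist/latest.json', JSON.stringify(manifest, null, 2));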

Using Newtonsoft.Json in a TFS release definition from powershell

Folks,
We are using TFS 2017 on-premises and are working toward provisioning/deploying to an Azure website and database. Our Azure database provisioning/updating task works well, but until we can automate the website portion, we have an interim step where we copy the web package from the build into a staging share.
This step runs on a Deployment Group under the TFS vstsagent as follows:
1) Create the temp folder
2) Expand the zip file into the temp folder
3) Retrieve the Out folder under the expanded zip file
4) Move the Out folder to the root of the temp folder
5) Merge the appsettings.json file with the master appsettings.json file for the environment
6) Re-zip the web package
7) Copy the web package to the staging destination share under PRJ\build#
Step 5) is performed using the merge feature of Newtonsoft.Json.
# Load the Newtonsoft.Json dll
$asmNewtonsoftJson = [Reflection.Assembly]::LoadFile([io.path]::combine($FolderToZip, "Newtonsoft.Json.dll"))
# Read the appsettings.json file and store as a JObject
$appSettingsPath = [io.path]::combine($folderToZip, "appsettings.json")
$appSettingsJson = (Get-Content $appSettingsPath | Out-String)
$appSettings = [Newtonsoft.Json.Linq.JObject]::Parse($appSettingsJson)
# Read the master appsettings.json file from the env and store as a JObject
$appSettingsMasterPath = [io.path]::combine($AppSettingsMasterFolderPath, $Environment + ".appsettings.json")
$appSettingsMasterJson = (Get-Content $appSettingsMasterPath | Out-String)
$appSettingsMaster = [Newtonsoft.Json.Linq.JObject]::Parse($appSettingsMasterJson)
# Merge the master appsettings.json file into the appsettings.json file
$jms = New-Object Newtonsoft.Json.Linq.JsonMergeSettings
$jms.MergeArrayHandling = [Newtonsoft.Json.Linq.MergeArrayHandling]::Merge
$appSettings.Merge($appSettingsMaster, $jms)
# Write out the updated appsettings.json file
[io.File]::WriteAllText($appSettingsPath, $appSettings.ToString())
Note: $folderToZip contains the expanded contents of the web package. This also contains the Newtonsoft.Json dll (and presumably its dependencies).
The above code works fine in the Windows PowerShell ISE, but fails when run from the TFS release definition task.
It fails at step 5) with the following error:
2018-10-11T19:29:39.1790546Z ##[error]Unable to find type [Newtonsoft.Json.Linq.JObject].
At C:\vstsagent\A1\_work\r7\a\<path>\drop\Build\mymergescript.ps1:84 char:20
$appSettings = [Newtonsoft.Json.Linq.JObject]::Parse($appSettings)
CategoryInfo : InvalidOperation: (Newtonsoft.Json.Linq.JObject:TypeName) [], RuntimeException
FullyQualifiedErrorId : TypeNotFound
It appears that Newtonsoft.Json.dll isn't being loaded properly when the script is run by the TFS vstsagent.
Any help here is appreciated.
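Not a confirmed fix, but one thing worth checking first (a diagnostic sketch only; variable names match the script above): whether the DLL path actually resolves under the agent's working directory and whether the copied file carries a "blocked" zone marker. Loading by an explicit, verified path makes any failure obvious:
# Diagnostic sketch: verify the DLL path and load it explicitly
$dllPath = [io.path]::Combine($folderToZip, "Newtonsoft.Json.dll")
Write-Host "PowerShell $($PSVersionTable.PSVersion), 64-bit process: $([Environment]::Is64BitProcess)"
if (-not (Test-Path $dllPath)) { throw "Newtonsoft.Json.dll not found at $dllPath" }
Unblock-File -Path $dllPath   # clear any zone marker left by the copy
Add-Type -Path $dllPath       # fails with a clear error if the assembly cannot be loaded
[Newtonsoft.Json.Linq.JObject]::Parse('{}') | Out-Null   # confirm the type now resolves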

Failed loading positionFile: error while using the TAILDIR source in Flume

I am working with Flume to append data from a local directory to HDFS using the TAILDIR source.
My use case is a delta load: when a new line arrives in a source file in the local directory, it should be appended to HDFS.
This is my Flume conf file:
#configure the agent
agent.sources=r1
agent.channels=k1
agent.sinks=c1
agent.sources.r1.type=TAILDIR
agent.sources.r1.positionFile = /home/flume/Documents/taildir_position.json
agent.sources.r1.filegroups=f1
agent.sources.r1.filegroups.f1=/home/flume/Documents/spooldir/
agent.sources.r1.batchSize = 20
agent.sources.r1.writePosInterval=2000
agent.sources.r1.maxBackoffSleep=5000
agent.sources.r1.fileHeader = true
agent.sources.r1.channels=k1
agent.channels.k1.type=memory
agent.channels.k1.capacity=10000
agent.channels.k1.transactionCapacity=1000
agent.sinks.c1.type=hdfs
agent.sinks.c1.channel=k1
agent.sinks.c1.hdfs.path=hdfs://localhost:8020/flume_sink
agent.sinks.c1.hdfs.batchSize = 1000
agent.sinks.c1.hdfs.rollSize = 268435456
agent.sinks.c1.hdfs.writeFormat=Text
While running the Flume command flume-ng agent -n agent -c conf -f /home/swechchha/Documents/flumereal.conf, I get an error about failing to load the position JSON file.
Make sure that the flume user has access to that JSON file (taildir_position.json) and that the file is correctly formatted.
The flume.conf in the question has a problem.
TAILDIR source: watches the specified files and tails them in near real time once new lines appended to each file are detected. If new lines are being written, this source will retry reading them while waiting for the write to complete.
When the filegroups property points to a directory that can contain multiple files, it should be written as a directory path followed by a filename pattern, for example:
agent.sources.r1.filegroups.f1=/home/flume/Documents/spooldir/.*txt.*
Then run the agent with the corrected flume.conf and check the result; it should work fine.

Neo4j load csv error : Couldn't load the external resource

I am using Neo4j 3.0.1 and loading a CSV file:
LOAD CSV WITH HEADERS FROM 'file:///D:/dummy.csv' as line
CREATE (:myData {line})
But it throws an error:
Couldn't load the external resource at: file:/D:/dummy.csv
Note: I've already tried configuring neo4j.conf as described here.
Please suggest any other alternative besides placing the CSV file into the import folder.
Try setting dbms.directories.import to D: in neo4j.conf:
dbms.directories.import=D:
and then run:
LOAD CSV WITH HEADERS FROM 'file:///dummy.csv' as line
CREATE (:myData {line})
EDIT:
As shown in the comments, the problem was solved by changing the owner of the directory containing the CSV file, as described in this answer:
sudo chown neo4j:adm <csv file location>