Invalid Stream header Error in WEKA - CSV Loading
Unable to load simple CSV file in weka
In the Experimenter -> Datasets -> Use relative paths (check this box).
From the error message, it looks like you clicked on Open... in the GenericObjectEditor window instead of OK to accept the configuration of the CSVLoader class to load your CSV dataset.
The Open... and Save... buttons are for loading/saving object templates, not for loading/saving the CSV file (you are in the process of loading a CSV file, so a Save button makes no sense for this operation).
With this template functionality, you can maintain a library of commonly used setups for any Weka object (loader, saver, filter, classifier, etc.) and simply load it back into the GenericObjectEditor when you need it.
An API call that downloads a CSV file gives a good response in the View Results Tree listener. I saved the downloaded file using a Save Responses to a file listener with a prefix of name.csv.
But when checking JMeter's bin folder, the same CSV appears to be 0 bytes, whereas in the View Results Tree listener I can see the response.
What should I do to get the same data into the CSV file too?
Most probably you're looking at the wrong file. If you want the result to be saved as CTCbreakup.csv, you need to adjust the Save Responses to a file listener along these lines: set the Filename prefix to CTCbreakup.csv (use a full path if you don't want it written to JMeter's bin folder) and tick the "Don't add number to prefix" and "Don't add content type suffix" boxes.
The default configuration will give you something like CTCBreakup.csv1.octet-stream instead.
More information: JMeter Performance Testing: Upload and Download Scenarios
I have created one .testcaferc.json file that contains all the configuration information like browser name, specs, timeouts, etc. I want to fetch the configuration data from a file so that I only have to change the information in one place.
I want to store all this information in a single file. I tried js, json and an array, but I cannot import files in any of those formats into my .testcaferc.json; when I press Alt+F8 I see the error "Expected a JSON object, array or literal".
Is there any way I can import json, array or js data into .testcaferc.json?
Thanks in advance!!
The JSON format doesn't support any import directives, and the TestCafe configuration file (.testcaferc.json) is a plain JSON file, so it doesn't support such functionality.
To achieve your goal, you can transform the existing .testcaferc.json file before the test run: load data from various sources and add/replace values for the appropriate data fields.
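For example, a small Node script run before testcafe can merge shared values into the config file. This is only a sketch: shared.json and the fields read from it are assumptions for illustration, while browsers, src, selectorTimeout and assertionTimeout are regular .testcaferc.json options.
// generate-config.js - hypothetical pre-run step: node generate-config.js && testcafe
const fs = require('fs');
// shared.json is an assumed file holding the values you want to maintain in one place
const shared = JSON.parse(fs.readFileSync('shared.json', 'utf8'));
// map the shared values onto regular .testcaferc.json options
const config = {
  browsers: shared.browsers,                  // e.g. ["chrome:headless"]
  src: shared.specs,                          // e.g. ["tests/**/*.js"]
  selectorTimeout: shared.timeouts.selector,  // milliseconds
  assertionTimeout: shared.timeouts.assertion
};
// write the real TestCafe config file that the test run will pick up
fs.writeFileSync('.testcaferc.json', JSON.stringify(config, null, 2));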
Also, there is a suggestion in the TestCafe GitHub repository, which will make your scenario easier to implement. Track it to be notified about its progress.
I want to handle a requirement in Polymer web components where the user can upload a CSV file from the UI, and the CSV should be parsed to JSON and sent to the server. I searched and found vaadin-upload and looked over its API, but I am not sure how to receive the CSV file, convert it to JSON and send it to the server. Can anyone show a jsfiddle of vaadin-upload, or any other web component, that handles this scenario?
First of all, I am wondering why you would not simply do the conversion on the server side.
In that case, you would indeed be able to use vaadin-upload directly.
Here is a snippet that would upload all files to the example.com server, and only allow CSV files.
<vaadin-upload target="https://example.com/upload" method="POST" accept="text/csv">
</vaadin-upload>
There are plenty of resources on how to convert CSV files to JSON, from small client-side snippets to Node libraries.
If you really wanted to do the conversion client side, then I would suggest creating an element that embeds a vaadin-upload and converts the files array to JSON before manually calling the uploadFiles method.
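A rough sketch of that client-side conversion (assumptions: a simple comma-only CSV with a header row, an https://example.com/upload endpoint that accepts JSON, and the JSON is POSTed with fetch rather than through the component's own upload):
// Convert CSV text to an array of objects, one per row, keyed by the header row.
// Naive parsing: no quoted fields or embedded commas.
function csvToJson(text) {
  const [headerLine, ...lines] = text.trim().split('\n');
  const headers = headerLine.split(',');
  return lines.map(line => {
    const values = line.split(',');
    return headers.reduce((row, header, i) => {
      row[header] = values[i];
      return row;
    }, {});
  });
}
// Read a File picked by the upload element, convert it and POST the JSON.
function sendCsvAsJson(file) {
  const reader = new FileReader();
  reader.onload = () => fetch('https://example.com/upload', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(csvToJson(reader.result))
  });
  reader.readAsText(file);
}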
For a course on Excel I was trying to load a CSV in Neo4j (first time using this application) when I was blocked at the first step of replicating an example shown in said course: loading.
The command used in the example was this:
LOAD CSV WITH HEADERS FROM "file:/path/to/file/file.csv"
as row
CREATE (m:movie {name:row.movie})
But it gave syntax errors. I found out I could get past them by using double backslashes and a "file://" prefix:
LOAD CSV WITH HEADERS FROM "file://C:\\path\\to\\file\\file.csv"
as row
CREATE (m:movie {name:row.movie})
Neo4j accepts this syntax, processes it for a few moments, and then returns YET ANOTHER error:
Neo.TransientError.Statement.ExternalResourceFailure
I tried the same commands (the original and my own) in the online Neo4j console, but no luck. I can reach the file using that path without problem; it really is there. The CSV file consists of just 5 strings of regular letters, that's all. No fancy formatting or characters.
What's going on?
Not that mysterious: Neo4j's LOAD CSV clause looks for the specified CSV file in the import directory configured for that database server, as specified in its configuration file (i.e. dbms.directories.import=import in your neo4j.conf file).
You should create the import directory in...
"C:\Users\[User Name]\Documents\Neo4j\default.graphdb\"
If you place your CSV file in there, you can specify any sub-directory or just the "file.csv" you want to import with LOAD CSV, as below.
LOAD CSV WITH HEADERS FROM "file:///file.csv"
AS row
RETURN row
LIMIT 5
Try using:
"file:///C:/path/to/file/file.csv"
Since your file is on your local computer, the third / following the file scheme is not preceded by a host name or address -- but it still needs to be there. Also, file URI path separators should be forward slashes (even on Windows machines).
See the File URI scheme Wikipedia page if you need more information.
I want to create .csv files with the Report Generation Toolkit in Labview.
They must actually be .csv files which can be opened with Notepad or something similar.
Creating a .csv is not that hard, it's just a matter of adding the extension to the file name that's going to be created.
If I create a .csv file this way it opens nicely in Excel just the way it should, but if I open it in Notepad it shows all kinds of characters and doesn't even come close to the data I wrote to the file.
I create the files with the Labview code below:
Link to image (can't post the image yet because I've got too few points)
I know .csv files can be created with the Write to Spreadsheet VI but I would like to use the Report Generation Toolkit because it's pretty easy to add columns and rows to the file and that is something I really need.
You can use the Robust CSV package from the lavag.org forum to read and write 2D arrays to CSV files.
http://lavag.org/files/file/239-robust-csv/
Calling a file "csv" does not make it a CSV file. I never used the toolkit to generate an Excel file, but I'm assuming it creates an XLS or XLSX file, regardless of what extension you give it, which is why you're seeing gibberish (probably XLS, since it's been around for a while and I believe XLSX is XML, not binary).
I'm not sure what your problem is with the write spreadsheet VI. It has an append input, so I assume you can use that to at least add rows directly to a file, although I can't say I ever tried it. I would prefer handling all the data in memory explicitly, where you can easily use the array functions to add rows or columns to the array and then overwrite the entire file.