JavaFX 2: Loading an external CSV into a TableView

I'm pretty new to Java and I'm searching the Internet for a simple way to load an external CSV file into a JavaFX TableView.
I was able to parse the CSV into an array, but I don't know how to handle it from there. Then I played with the DataFX library, but again I wasn't able to pass the parsed CSV into my table.
I think I don't really understand ObservableLists here, which I believe are kind of necessary? Do you know a good tutorial, or could you explain what the next steps would be after parsing the file?
Thanks.
Edit: this is what I did:
import javafx.application.Application;
import javafx.scene.SceneBuilder;
import javafx.scene.control.TableColumn;
import javafx.scene.control.TableView;
import javafx.stage.Stage;
import org.javafxdata.datasources.reader.FileSource;
import org.javafxdata.datasources.provider.CSVDataSource;

public class CSVTableSample extends Application {
    @SuppressWarnings("unchecked")
    @Override
    public void start(Stage stage) throws Exception {
        stage.setTitle("Test App");
        // Just loading the file...
        FileSource fs = new FileSource("test.csv");
        // Now creating my data source
        CSVDataSource dataSource = new CSVDataSource(
                fs, "order-id", "order-item-id");
        @SuppressWarnings("rawtypes")
        TableView table1 = new TableView();
        TableColumn<?, ?> orderCol = dataSource.getNamedColumn("order-id");
        TableColumn<?, ?> itemCol = dataSource.getNamedColumn("order-item-id");
        table1.getColumns().addAll(orderCol, itemCol);
        table1.setItems(dataSource);
        stage.setScene(SceneBuilder.create().root(table1).build());
        stage.show();
    }

    public static void main(String[] args) {
        Application.launch(args);
    }
}
Eclipse says, for table1.setItems(dataSource):
The method setItems(ObservableList) in the type TableView is not applicable for the arguments (CSVDataSource)

There is a sample solution here for a tab-delimited file.
A CSV file could be handled similarly.
The sample works by declaring the type of the TableView as TableView<ObservableList<StringProperty>>, so that each row in the TableView is an ObservableList of string properties, where each property represents a field in the CSV file. The TableView's items list is a list of such lists. The cellValueFactory set on each column extracts the correct cell value for that column from the ObservableList<StringProperty> backing that cell's row.
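For illustration, a minimal sketch of that approach (this is not the linked sample verbatim; headers and csvLines are placeholder names for whatever your own CSV parsing produces):

import javafx.beans.property.SimpleStringProperty;
import javafx.beans.property.StringProperty;
import javafx.collections.FXCollections;
import javafx.collections.ObservableList;
import javafx.scene.control.TableColumn;
import javafx.scene.control.TableView;
import java.util.List;

public class CsvTableSketch {

    // headers and csvLines are assumed to come from your own CSV parsing code.
    static TableView<ObservableList<StringProperty>> buildTable(String[] headers, List<String> csvLines) {
        TableView<ObservableList<StringProperty>> table = new TableView<>();

        // One column per CSV field; column i pulls the i-th property out of the row list.
        for (int i = 0; i < headers.length; i++) {
            final int colIndex = i;
            TableColumn<ObservableList<StringProperty>, String> col = new TableColumn<>(headers[i]);
            col.setCellValueFactory(cellData -> cellData.getValue().get(colIndex));
            table.getColumns().add(col);
        }

        // Each parsed CSV line becomes one row: an ObservableList of StringProperty.
        for (String line : csvLines) {
            ObservableList<StringProperty> row = FXCollections.observableArrayList();
            for (String field : line.split(",")) {
                row.add(new SimpleStringProperty(field));
            }
            table.getItems().add(row);
        }
        return table;
    }
}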

The method setItems(ObservableList) in the type TableView is not
applicable for the arguments (CSVDataSource)
Change your line
table1.setItems(dataSource);
to
table1.setItems(dataSource.getData());
Example code using DataFX:
DataSourceReader dsr1 = new FileSource("your csv file path");
// Array of the column names you want to display, e.g. the ones from the question above
String[] columnsArray = {"order-id", "order-item-id"};
CSVDataSource ds1 = new CSVDataSource(dsr1, columnsArray);
TableView tableView = new TableView();
tableView.setItems(ds1.getData());
tableView.getColumns().addAll(ds1.getColumns());
If you want to do it in the standard JavaFX way: Look Here

Related

Weka - issue with line X ... converting CSV to ARFF

I am currently trying to convert a CSV file of information to an ARFF file in Weka.
The issue pops up that there is a problem with line 3384, but there is nothing that I can see that is wrong with that line.
Image of excel file here
Can someone please help?
Thanks.
This problem often pops up when there are illegal characters in the file to be converted, so double-check for such characters. You can also use the code below to do the conversion from CSV to ARFF in Java.
import weka.core.Instances;
import weka.core.converters.ArffSaver;
import weka.core.converters.CSVLoader;

import java.io.File;

public class CsvtoArff {
    public static void main(String[] args) throws Exception {
        String args0 = "/Users/Kehinde/Documents/trainingtest.csv";
        String args1 = "/Users/Kehinde/Documents/theoutput.arff";

        // This is used to load the CSV
        CSVLoader myloader = new CSVLoader();
        myloader.setSource(new File(args0));
        Instances mydata = myloader.getDataSet();
        System.out.println(mydata);

        // This is used to save the ARFF
        ArffSaver mysaver = new ArffSaver();
        mysaver.setInstances(mydata);
        mysaver.setFile(new File(args1));
        mysaver.setDestination(new File(args1));
        mysaver.writeBatch();
    }
}

USACO Code Submission Problem - Output File Missing

I'm practicing some past USACO problems, but whenever I submit my code for grading I receive the error:
Your output file (FILENAME.out):
[File missing!]
I tested every problem using this simple code, but I still receive the same error:
import java.util.*;
import java.io.*;

public class Test
{
    public static void main(String[] args) throws IOException
    {
        PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter(FILENAME)));
        out.println("Hello world.");
        out.close();
        System.exit(0);
    }
}
Why would this code not create an output file?
The USACO grading system already has the output file created in the same directory as your Java solution, so all you need to do is write to it.
In your line
PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter(FILENAME)));
you should change this to
PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter("FILENAME.out")));
since that is the name of the file: the problem's file name with a .out extension, passed as a string. This does not create a new file, it just writes to the existing one on the USACO grading system.
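For reference, a complete submission shell usually reads the problem's .in file and writes its .out file. A minimal sketch, with "test" standing in for whatever file name the problem statement actually requires:

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class Test {
    public static void main(String[] args) throws IOException {
        // "test" is a placeholder; use the exact name given in the problem statement.
        BufferedReader in = new BufferedReader(new FileReader("test.in"));
        PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter("test.out")));

        String line = in.readLine();  // read the input the grader provides
        out.println(line);            // write your answer to the .out file

        in.close();
        out.close();
    }
}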

Xamarin Forms - Saving object to access it later

I am using Xamarin Forms with the Newtonsoft.JSON and Xam.Plugin.Settings plugins to save an IDevice object as JSON so I can use it later.
First page:
private async void SelectBluetoothDevice(object sender, SelectedItemChangedEventArgs e)
{
    string device = JsonConvert.SerializeObject((IDevice) e.SelectedItem);
    AppSettings.AddOrUpdateValue("device", device);
    await Navigation.PopAsync();
}
Here I simply serialize the selected item to a JSON string and save it. That part works as it should.
But the problem I'm facing comes whenever I try to deserialize the saved string:
string device = AppSettings.GetValueOrDefault("device", "");
if (!device.Equals(""))
{
    Debug.WriteLine(device);
    // This line produces the error
    IDevice dev = JsonConvert.DeserializeObject<IDevice>(device);
    settingsDeviceName.Text = dev.Name;
}
Newtonsoft.Json.JsonSerializationException: Could not create an
instance of type Plugin.BLE.Abstractions.Contracts.IDevice. Type is an
interface or abstract class and cannot be instantiated. Path
'BluetoothDevice', line 1, position 19.
So I understand that IDevice is an interface and my JSON string can't be deserialized into anything but a concrete object. Any good ideas how to work around this? Thanks!
This line JsonConvert.DeserializeObject<IDevice>(device); basically does this:
1. Read the JSON in device
2. Try to create the type of object you want to deserialize to, in this case IDevice
3. Return the new object with all the properties filled
The problem is with step 2. You can't create an instance of an interface, so you need to deserialize to a concrete type that implements IDevice.

How to convert json to associative array in JSP

I'm trying to make a dynamic JSP page based on JSON.
For example, if my JSON looks like this:
{
'page1':'true',
'page2':'true',
'page3':'false'
}
In PHP, I could get my associative array easily with one line:
$data = json_decode($json_str);
Then I could access whatever I wanted wherever I needed it, i.e.
if ($data['page1'] == 'true')
    echo #page link#;
But in JSP it doesn't go so easily, because there isn't as much documentation as for PHP. I found Gson, but I'm still not sure how to use it to achieve this.
Please give me an example of how I can turn the JSON into an associative array and then access it in the JSP.
A Java Map is an associative array. You can ask Gson to deserialize your JSON input with this:
Map<String,String> map = new Gson().fromJson(inputJson, new TypeToken<Map<String,String>>() {}.getType());
Example code:
package net.sargue.gson;

import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;

import java.lang.reflect.Type;
import java.util.Map;

public class SO36859440 {
    public static void main(String[] args) {
        String inputJson = "{\n" +
                " 'page1':'true',\n" +
                " 'page2':'true',\n" +
                " 'page3':'false'\n" +
                "}";
        Type type = new TypeToken<Map<String, String>>() {}.getType();
        Map<String, String> map = new Gson().fromJson(inputJson, type);

        System.out.println(map.get("page2"));
        System.out.println(map.get("page3"));
    }
}
The output is:
true
false
Then, your question is about converting it in JSP. Well, JSP best practice advises you to move this kind of processing to a Servlet that acts as a controller and passes attributes to the JSP in order to build the view. You can put code directly in the JSP using <% ... %>, but I highly discourage you from doing so. That, however, is an entirely different question.
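To sketch that controller idea (the class name, attribute name, and JSP path below are only illustrative), the Servlet builds the map and hands it to the JSP, which then reads it with EL:

import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.lang.reflect.Type;
import java.util.Map;

public class PagesServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String inputJson = "{'page1':'true','page2':'true','page3':'false'}";
        Type type = new TypeToken<Map<String, String>>() {}.getType();
        Map<String, String> pages = new Gson().fromJson(inputJson, type);

        // Expose the map to the view; in the JSP you can then test it with EL/JSTL,
        // e.g. <c:if test="${pages['page1'] == 'true'}"> ...page link... </c:if>
        req.setAttribute("pages", pages);
        req.getRequestDispatcher("/WEB-INF/pages.jsp").forward(req, resp);
    }
}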

How to read a CSV file from Hdfs?

I have my data in a CSV file and I want to read that CSV file, which is in HDFS.
Can anyone help me with the code?
I'm new to Hadoop. Thanks in advance.
The classes required for this are FileSystem, FSDataInputStream and Path. The client should be something like this:
public static void main(String[] args) throws IOException {
    Configuration conf = new Configuration();
    conf.addResource(new Path("/hadoop/projects/hadoop-1.0.4/conf/core-site.xml"));
    conf.addResource(new Path("/hadoop/projects/hadoop-1.0.4/conf/hdfs-site.xml"));

    FileSystem fs = FileSystem.get(conf);
    FSDataInputStream inputStream = fs.open(new Path("/path/to/input/file"));
    System.out.println(inputStream.readChar());
}
FSDataInputStream has several read methods. Choose the one which suits your needs.
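For a CSV you will usually want to read line by line. One way, sketched under the same setup as above (the path is a placeholder), is to wrap the stream in a BufferedReader:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class HdfsCsvReader {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // The path is a placeholder; point it at your CSV in HDFS.
        try (FSDataInputStream in = fs.open(new Path("/path/to/input/file.csv"));
             BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] fields = line.split(",");  // naive split; use a CSV library for quoted fields
                System.out.println(fields[0]);      // process the fields here
            }
        }
    }
}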
If it is MapReduce, it's even easier:
public static class YourMapper extends
        Mapper<LongWritable, Text, Your_Wish, Your_Wish> {

    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // The framework does the reading for you...
        String line = value.toString(); // line contains one line of your csv file
        // do your processing here
        // ...
        context.write(Your_Wish, Your_Wish);
    }
}
If you want to use MapReduce, you can use TextInputFormat to read line by line and parse each line in the mapper's map function.
The other option is to develop (or find an existing) CSV input format for reading data from the file.
There is one old tutorial here http://hadoop.apache.org/docs/r0.18.3/mapred_tutorial.html but the logic is the same in new versions.
If you are using a single process for reading data from the file, it is the same as reading a file from any other file system. There is a nice example here https://sites.google.com/site/hadoopandhive/home/hadoop-how-to-read-a-file-from-hdfs
HTH