I am working with a CSV file that holds a very large dataset. While reading the file, I extract the 4th ';'-separated numeric value (BALANCE) from each row in a while loop, and build a List<Double> after some mathematical calculation (here, incrementing). Now I want to store this list of Double values back in its original position (the 4th place), but in reverse order (from end to beginning). Example:
public static void main(String[] args) throws IOException {
    String filename = "abc.csv";
    List<Double> list = new ArrayList<Double>();
    File file = new File(filename);
    Scanner inputStream = new Scanner(file);
    inputStream.next();
    while (inputStream.hasNext()) {
        String data = inputStream.next();
        String[] values = data.split(";");
        double BALANCE = Double.parseDouble(values[1]);
        BALANCE = BALANCE + 1;
        ListIterator li = list.listIterator(list.size());
        while (li.hasPrevious()) {
            values[1] = String.valueOf(li.previous());
        }
        inputStream.close();
    }
}
You can use Collections.reverse. Example: Collections.reverse(list);
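A minimal self-contained sketch of the idea (the sample values here are hypothetical, standing in for the incremented balances parsed from the CSV):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class ReverseDemo {
    public static void main(String[] args) {
        // hypothetical BALANCE values parsed from the CSV, after incrementing
        List<Double> balances = new ArrayList<>(Arrays.asList(1.5, 2.5, 3.5));
        // reverse the list in place, so the last parsed value comes first
        Collections.reverse(balances);
        System.out.println(balances); // [3.5, 2.5, 1.5]
    }
}
```

Collections.reverse mutates the list in place in linear time, so no ListIterator loop is needed.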
I have a very simple .csv file with IDs and serial numbers.
The actual application will not know how many rows there are, but there will always be two columns:
1,16600687
2,16600939
3,16604031
4,16607302
I have everything else set up, but I am only loading the data into a 1D array, and the comma remains in the data.
The result I get is: string value for 3rd position is 3,16604031.
How do I separate this so it is a 2D array, where value [2,0] is 3 and value [2,1] is 16604031?
private void button1_Click(object sender, EventArgs e)
{
    string stFileNamenPath = "(put location of file here)";
    DialogResult result = openFileDialog1.ShowDialog();
    StreamReader sr = new StreamReader(stFileNamenPath);
    string[] sortArray = null;
    while (!sr.EndOfStream)
    {
        string strResult = sr.ReadToEnd();
        sortArray = strResult.Split(new string[] { Environment.NewLine }, StringSplitOptions.None);
    }
    string stTest = (string)sortArray.GetValue(2);
    MessageBox.Show("string value for 3rd position is " + stTest);
}
The answer that comes to my mind is a LINQ Where statement followed by a Select statement: the former filters, the latter remaps the data you have. Your code would look something like this:
string stFileNamenPath = "(put location of file here)";
//DialogResult result = openFileDialog1.ShowDialog();
StreamReader sr = new StreamReader(stFileNamenPath);
string[] sortArray = null;
while (!sr.EndOfStream)
{
    string strResult = sr.ReadToEnd();
    sortArray = strResult.Split(new string[] { Environment.NewLine }, StringSplitOptions.None);
}
char separator = ',';
var mappedArray = sortArray
    .Where(x => !string.IsNullOrWhiteSpace(x)) // IsNullOrWhiteSpace also covers null and empty
    .Select(x => x.Split(separator))
    .ToArray();
var stTest = mappedArray[2][1];
MessageBox.Show("string value for 3rd position is " + stTest);
On clicking the button, I call WriteJsonForLevel(). I have placed three GameObjects with the tag "RedCoin", and I want to write the position of each GameObject to a JSON file. I can get the positions, but the file keeps getting overwritten: I can only see the last GameObject's position (i.e. the one from the final iteration of the loop).
public List<GameObject> levelObjects;
public string level;
public Vector3 pos;

// Start is called before the first frame update
void Start()
{
    levelObjects = new List<GameObject>();
}

// Update is called once per frame
void Update()
{
}

public void WritejsonForAll()
{
    WriteJsonForLevel();
}

public void WriteJsonForLevel()
{
    /* FileStream fs = new FileStream(Application.dataPath + "/sample.json", FileMode.Create);
    StreamWriter writer = new StreamWriter(fs); */
    GameObject[] coinObjRed = GameObject.FindGameObjectsWithTag("RedCoin");
    putAllObjectInList(coinObjRed);
}

public void putAllObjectInList(GameObject[] p)
{
    string path = Application.dataPath + "/text.json";
    foreach (GameObject q in p)
    {
        levelObjects.Add(q);
    }
    for (int i = 0; i < levelObjects.Count; i++)
    {
        GameObject lvlObj = levelObjects[i];
        Vector3 pos = lvlObj.transform.position;
        string posOutput = JsonUtility.ToJson(pos);
        File.WriteAllText(path, posOutput);
        Debug.Log("position:" + posOutput);
    }
}
}
You are using WriteAllText, which overwrites the file every time it is called. Since it is called inside the loop, only the last object ends up in the file; every earlier write is overwritten. I would consider making a serializable data class, assigning the data to it, converting it to a JSON string, and then saving that.
// stores individual locations for saving
[System.Serializable]
public class IndividualLocation
{
    public IndividualLocation(Vector3 pos)
    {
        xPos = pos.x;
        yPos = pos.y;
        zPos = pos.z;
    }

    public float xPos;
    public float yPos;
    public float zPos;
}

// stores all game locations for saving
[System.Serializable]
public class AllGameLocations
{
    public List<IndividualLocation> Locations = new List<IndividualLocation>();
}

public void PutAllObjectInList(in GameObject[] p)
{
    string path = Application.dataPath + "/text.json";
    // create a new object to write to
    AllGameLocations data = new AllGameLocations();
    // iterate the objects, adding each to our structure
    foreach (GameObject obj in p)
    {
        data.Locations.Add(new IndividualLocation(obj.transform.position));
    }
    // now that the data is filled, write it out to the file in one call
    File.WriteAllText(path, JsonUtility.ToJson(data));
}
If you need a snippet on how to load the data properly I can add one.
Edit: Here is a load snippet
public void LoadJSONObject()
{
    string path = Application.dataPath + "/text.json";
    // if the file path or name does not exist
    if (!Directory.Exists(Path.GetDirectoryName(path)))
    {
        Debug.LogWarning("File or path does not exist! " + path);
        return;
    }
    // load in the save data as a byte array
    byte[] jsonDataAsBytes = null;
    try
    {
        jsonDataAsBytes = File.ReadAllBytes(path);
        Debug.Log("<color=green>Loaded all data from: </color>" + path);
    }
    catch (Exception e)
    {
        Debug.LogWarning("Failed to load data from: " + path);
        Debug.LogWarning("Error: " + e.Message);
        return;
    }
    if (jsonDataAsBytes == null)
        return;
    // convert the byte array to a json string
    string jsonData = Encoding.ASCII.GetString(jsonDataAsBytes);
    // convert to the specified object type
    AllGameLocations returnedData = JsonUtility.FromJson<AllGameLocations>(jsonData);
    // use returnedData as a normal object now
    float firstObjectX = returnedData.Locations[0].xPos;
}
Let me know if the load works; I just typed it up untested. I added some error handling as well, to ensure the data exists and the load completes properly.
I am trying to compare columns of two CSV files using a MapReduce program.
The input CSV data files (input to the map program) contain automatically generated data with around 100 columns and thousands of rows, in the format below.
Note: the CSV columns are separated by ";".
Input File1 Data
Column1;Column2;Column3;Column4;-----------
Sigma48_12mar09.9010.9010.3;K.TAFQEALDAAGDKLVVVDFSATWC[160.14]GPC[160.14]K.M;P08263.3;1.062
Sigma48_12mar09.9063.9063.3;K.KDPEGLFLQDNIVAEFSVDETGQMSATAK.G;P08263.3;1.062
Input File2 Data
Column1;Column2;Column3;Column4;-----------
Sigma48_12mar09.9188.9188.2;R.YKLSLEFPSGYPYNAPTVK.F;P08263.3;1.062
Sigma48_12mar09.9314.9314.2;R.YKLSLEFPSGYPYNAPTVK.FP08263.3;1.062
Sigma48_12mar09.9010.9010.3;K.TAFQEALDAAGDKLVVVDFSATWC[160.14]GPC[160.14]K.M;P08263.3;1.062
My requirement:
Read all the rows in Input File1, take column1, read all the rows in Input File2, and compare column1 of the first file with column1 of the second file.
When a match is found, compare all the other columns of that particular row in the two files, write the matched data to HDFS, and return the % matched between the two input files.
My code is as follows:
/* First Mapper */
public static class InputMapper1 extends Mapper<LongWritable, Text, Text, Text> {
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        String[] words = line.split(";");
        String name = words[1];
        String other = words[2];
        context.write(new Text(name), new Text(line));
    }
}
/* Second Mapper */
public static class InputMapper2 extends Mapper<LongWritable, Text, Text, Text> {
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        String[] words = line.split(";");
        String name = words[1];
        String other = words[2];
        System.out.println(key);
        context.write(new Text(name), new Text(line));
    }
}
/* Reducer for both of the mappers */
/* incomplete -- the two csv files have to be compared here */
public static class CounterReducer extends Reducer
{
    String line = null;

    public void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException
    {
        for (Text value : values)
        {
            line = value.toString();
        }
        context.write(key, new Text(line));
    }
}
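The reducer above only keeps the last line seen per key. One common pattern for the comparison step (a sketch under my own naming, not the poster's code) is a reduce-side join: each mapper tags its output with its source file, and the reducer compares the rows grouped under the same key column by column. The helper below sketches just that column-comparison / match-fraction part in plain Java, independent of Hadoop:

```java
// Sketch of the column-comparison step a reducer could run once rows from
// both files are grouped under the same key (class and method names are
// illustrative, not from the question).
public class ColumnCompare {
    // Returns the fraction of columns that match between two ';'-separated rows.
    static double matchedFraction(String rowA, String rowB) {
        String[] a = rowA.split(";");
        String[] b = rowB.split(";");
        int cols = Math.min(a.length, b.length);
        int matches = 0;
        for (int i = 0; i < cols; i++) {
            if (a[i].equals(b[i])) {
                matches++;
            }
        }
        return cols == 0 ? 0.0 : (double) matches / cols;
    }

    public static void main(String[] args) {
        // two hypothetical rows sharing the same column1 key
        String file1Row = "Sigma48_12mar09.9010.9010.3;K.TAFQ;P08263.3;1.062";
        String file2Row = "Sigma48_12mar09.9010.9010.3;K.TAFQ;P08263.3;1.062";
        System.out.println(matchedFraction(file1Row, file2Row)); // 1.0
    }
}
```

In the real job, the reducer would separate the grouped values by their source-file tag before calling a comparison like this, and a counter could accumulate the overall match percentage.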
I was having a hard time loading the training set from an MS Access database into the main form that does the face recognition. I saved the training sets, with their names and IDs, into the database as binary data in an OLE Object field. The method I use to convert an image for storage is:
private static byte[] ConvertImageToBytes(Image InputImage)
{
    using (Bitmap BmpImage = new Bitmap(InputImage))
    {
        using (MemoryStream MyStream = new MemoryStream())
        {
            BmpImage.Save(MyStream, System.Drawing.Imaging.ImageFormat.Jpeg);
            byte[] ImageAsBytes = MyStream.ToArray();
            return ImageAsBytes;
        }
    }
}
The method that I use to store the converted byte data in the database is the following:
private void StoreData(byte[] ImageAsBytes, String NameStudent, String IDStudent)
{
    if (DBConnection.State.Equals(ConnectionState.Closed))
        DBConnection.Open();
    try
    {
        //MessageBox.Show("Saving image at index : " + rowPosition);
        using (OleDbCommand insert = new OleDbCommand(
            String.Format("Insert INTO TrainingSet(rowPosition,StudentName,StudentID,StudentFace) values ('{0}','{1}','{2}',@StudentFace)",
                rowPosition, NameStudent, IDStudent),
            DBConnection))
        {
            OleDbParameter imageParameter = insert.Parameters.AddWithValue("@StudentFace", SqlDbType.Binary);
            imageParameter.Value = ImageAsBytes;
            imageParameter.Size = ImageAsBytes.Length;
            int rowsAffected = insert.ExecuteNonQuery();
            MessageBox.Show(String.Format("Data stored successfully in {0} Row", rowsAffected));
        }
        rowPosition++;
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
    finally
    {
        RefreshDBConnection();
    }
}
The method that I use to read this binary data back is as follows:
private Image ReadImageFromDB()
{
    Image FetchedImg;
    if (rowNumber >= 0)
    {
        byte[] FetchedImgBytes = (byte[])LocalDataTable.Rows[rowNumber]["StudentFace"];
        MemoryStream stream = new MemoryStream(FetchedImgBytes);
        FetchedImg = Image.FromStream(stream);
        return FetchedImg;
    }
    else
    {
        MessageBox.Show("There are no images in the database yet. Please reconnect or add some pictures.");
        return null;
    }
}
I have successfully saved the training sets/images as binary data in the database. The problem is when I load these training sets for recognition.
// Declaring the variables: trainingImages holds the training sets loaded
// from the database; NameLables and IDLables hold the name and ID text
// read from the database.
List<Image<Gray, byte>> trainingImages = new List<Image<Gray, byte>>();
List<string> NameLables = new List<string>();
List<string> IDLables = new List<string>();
int ContTrain, NumNameLabels, NumIDLabels, t;

// The training sets from the database are loaded into the face recognizer
// as follows
public FaceRecognizer()
{
    InitializeComponent();
    try
    {
        // Load previously trained images and the labels for each image from the database
        RefreshDBConnection();
        String[] NameLabels = (String[])LocalDataTable.Rows[rowNumber]["StudentName"];
        NumNameLabels = Convert.ToInt16(NameLabels[0]);
        String[] IDLabels = (String[])LocalDataTable.Rows[rowNumber]["StudentID"];
        NumIDLabels = Convert.ToInt16(IDLabels[0]);
        if (NumNameLabels == NumIDLabels)
        {
            ContTrain = NumNameLabels;
            string LoadFaces;
            // Converting the master image to a bitmap
            Image imageFromDB;
            Bitmap imageChangedToBitmap;
            // Normalizing it to grayscale
            Image<Gray, Byte> normalizedMasterImage;
            for (int tf = 1; tf < NumNameLabels + 1; tf++)
            {
                imageFromDB = ReadImageFromDB();
                // The image loaded from the database is converted into a Bitmap,
                // then into Image<Gray, byte> for input to EigenObjectRecognizer(,,,)
                imageChangedToBitmap = new Bitmap(imageFromDB);
                normalizedMasterImage = new Image<Gray, Byte>(imageChangedToBitmap);
                LoadFaces = String.Format("face{0}.bmp", tf);
                trainingImages.Add(normalizedMasterImage);
                //trainingImages.Add(new Image<Gray, byte>());
                NameLables.Add(NameLabels[tf]);
                IDLables.Add(IDLabels[tf]);
                rowNumber = rowNumber + 1;
            }
        }
        else
            MessageBox.Show("There's a conflict between Name labels and id labels");
    }
    catch (Exception e)
    {
        MessageBox.Show("Nothing in the database, please add at least a face. Train the database",
            "Trained faces load", MessageBoxButtons.OK, MessageBoxIcon.Exclamation);
    }
}
I am only getting the message in the catch block when the form loads, even if there are faces saved in the database. I have used EigenObjectRecognizer and I will post that code if necessary.
In the face-loading part: you did not save the images under names like face1, face2, face3, etc., so you cannot load them using:
LoadFaces = String.Format("face{0}.bmp", tf);
I have the following part of a CSV file with 7 columns (see the first line), and I want to put the dates (1st column) as the keys in a TreeMap and the Adj Close values (7th column) as the mapped values:
Date,Open,High,Low,Close,Volume,Adj Close
7/1/2011,132.09,134.1,131.78,133.92,202370700,133.92
6/30/2011,131.14,132.18,130.71,131.97,223496600,131.97
6/29/2011,130.2,130.93,129.63,130.72,244295500,130.72
6/28/2011,128.45,129.63,128.27,129.61,165556300,129.61
In an earlier part of the assignment, I only had to put the Open values (2nd column) as the mapped values (with the dates as keys) in a TreeMap. I used Scanner for this; my code is below:
TreeMap<String, String> loadPriceData(String fileName) throws Exception
{
    TreeMap<String, String> prices = new TreeMap<String, String>(); // create prices map
    Scanner fileScanner = new Scanner(new File(fileName));
    fileScanner.useDelimiter("[,\n]+"); // use comma as delimiter
    while (fileScanner.hasNext()) // condition detects comma
    {
        prices.put(fileScanner.nextLine(), fileScanner.nextLine());
    }
    return prices;
}
But this only seems good for 2-column CSV data. If I need the mapped values from the 7th column, what's an efficient way to go about it? Thanks in advance.
Your code doesn't work: the delimiter pattern is not correct. If you look at the contents of your map, you will see that instead of date-price mappings, you have only one strange mapping.
Instead of using a Scanner, a simpler way is to read the file line by line, split each line on commas, and put the fields you need into the map.
For example:
public TreeMap<String, String> loadPriceData(String fileName) throws IOException {
    TreeMap<String, String> prices = new TreeMap<String, String>(); // create prices map
    BufferedReader in = null;
    try {
        in = new BufferedReader(new FileReader(fileName));
        in.readLine(); // skip the header line
        String line;
        // read each line in the csv file
        while ((line = in.readLine()) != null) {
            // split line on comma
            String[] fields = line.split(",");
            // map the date (1st field) to the Adj Close value (7th field)
            prices.put(fields[0], fields[6]);
        }
        return prices;
    } finally {
        if (in != null) {
            try {
                in.close();
            } catch (IOException e) {
                // ignore
            }
        }
    }
}
If you are using Java 7, you can make use of the try-with-resources statement:
public TreeMap<String, String> loadPriceData(String fileName) throws IOException {
    TreeMap<String, String> prices = new TreeMap<>(); // create prices map
    try (BufferedReader in = Files.newBufferedReader(Paths.get(fileName),
            Charset.forName("UTF-8"))) {
        in.readLine(); // skip the header line
        String line;
        // read each line in the csv file
        while ((line = in.readLine()) != null) {
            // split line on comma
            String[] fields = line.split(",");
            // map the date (1st field) to the Adj Close value (7th field)
            prices.put(fields[0], fields[6]);
        }
        return prices;
    }
}
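To adapt this to the question's 7th column, the mapped value is simply `fields[6]`, and the header line should be skipped so "Date" does not become a key. A minimal self-contained sketch using sample rows from the question:

```java
import java.util.Arrays;
import java.util.List;
import java.util.TreeMap;

public class AdjCloseDemo {
    public static void main(String[] args) {
        // sample rows from the question: header line plus one data line
        List<String> lines = Arrays.asList(
            "Date,Open,High,Low,Close,Volume,Adj Close",
            "7/1/2011,132.09,134.1,131.78,133.92,202370700,133.92");
        TreeMap<String, String> prices = new TreeMap<>();
        for (String line : lines.subList(1, lines.size())) { // skip the header
            String[] fields = line.split(",");
            prices.put(fields[0], fields[6]); // date -> Adj Close (7th column)
        }
        System.out.println(prices.get("7/1/2011")); // 133.92
    }
}
```

The same indexing works inside the file-reading loop; splitting on "," gives a 7-element array per row, so any column can be selected by index.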