I am trying to convert ADXL335 accelerometer data from an Arduino into a CSV file. The Arduino code works perfectly when I watch it in the Serial Monitor. The Processing code prints output to the console but does not write anything into the CSV file, and I'm not sure why. When I uncomment the print in the second if statement, values are stored in the CSV file, but only the first 15 readings, which then repeat until the sketch is stopped. When we nest the if statements, we again get nothing in the CSV file. I think the problem is in the first if statement, but I am not sure how to continue troubleshooting. I'm wondering how I can get a continuous stream of the accelerometer readings into the file. Thanks in advance.
Here is my Arduino code:
void setup() {
pinMode(14,INPUT);//define mode of pin
pinMode(15,INPUT);
pinMode(16,INPUT);
Serial.begin(9600);//serial communication begin
delay(10);
}
void loop()
{
//Serial.print(",");
Serial.print("X=");
Serial.print(analogRead(14));
Serial.print(",");
Serial.print("Y=");
Serial.print(analogRead(15));
Serial.print(",");
Serial.print("Z=");
Serial.print(analogRead(16));
Serial.println(",");
delay(100);
}
Here is my Processing code:
void setup() {
output = createWriter( "data.csv" );
mySerial = new Serial( this, Serial.list()[1], 9600 );
}
void draw() {
if (mySerial.available() > 0 ) {
value = mySerial.readString();
System.out.print(value);
output.println(value);
}
if ( value != null ) {
//output.println(value);
output.println();
}
}
void keyPressed() {
output.flush(); // Writes the remaining data to the file
output.close(); // Finishes the file
exit(); // Stops the program
}
First off, you might want to change your Arduino code to output valid CSV lines.
I'd suggest losing the X=/Y=/Z= labels for now, or appending a proper header from Processing instead.
Try this for now:
void loop()
{
Serial.print(analogRead(14));
Serial.print(",");
Serial.print(analogRead(15));
Serial.print(",");
Serial.print(analogRead(16));
Serial.println(",");
delay(100);
}
This should output something like ####,####,####, which is a valid CSV line.
On the Processing side, I would also advise buffering until the newline ('\n') character first, which you can easily do with bufferUntil() and serialEvent():
import processing.serial.*;
// serial port
Serial mySerial;
// single line containing acc. values as a CSV row string
String values;
// the output
PrintWriter output;
void setup() {
output = createWriter( "data.csv" );
try{
mySerial = new Serial( this, Serial.list()[1], 9600 );
mySerial.bufferUntil('\n');
}catch(Exception e){
println("Error opening serial port: double check USB cable, if Serial Monitor is open, etc.");
e.printStackTrace();
}
}
void draw() {
background(0);
if(values != null){
text("most recent values:\n" + values,10,15);
}
}
// gets called when new data is in, and since we're buffering until \n it's one csv line at a time
void serialEvent(Serial p) {
values = p.readString();
if(values != null && values.length() > 0){
println(values);
// since values.trim() isn't called, the \n is still in the string, so print() suffices in terms of adding a new line
output.print(values);
}else{
println("received invalid serial data");
}
}
void keyPressed() {
output.flush(); // Writes the remaining data to the file
output.close(); // Finishes the file
exit(); // Stops the program
}
Also notice the error checking on the serial connection and on the serial data read
(it's a good habit to check for things that could go wrong).
Optionally you can add a header to your CSV file in setup():
output = createWriter( "data.csv" );
output.println("X,Y,Z,");
In terms of writing a CSV file there are many ways to do it, and Processing has a Table class which allows you to read/parse and write CSV data. At the moment your PrintWriter approach is pretty straightforward: use that.
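For reference, if you do want to try the Table route later, a rough sketch (untested, assuming a Processing version that provides Table/saveTable and the one-reading-per-line serial format above) could look like this:
import processing.serial.*;
Serial mySerial;
Table table;
void setup() {
  // build an in-memory table with a header; saveTable() writes header + rows in one go
  table = new Table();
  table.addColumn("X");
  table.addColumn("Y");
  table.addColumn("Z");
  mySerial = new Serial(this, Serial.list()[1], 9600);
  mySerial.bufferUntil('\n');
}
void draw() {
  // empty draw() keeps the sketch running so serialEvent()/keyPressed() keep firing
}
void serialEvent(Serial p) {
  String line = p.readString();
  if (line == null) return;
  String[] parts = split(line.trim(), ','); // "512,509,618," -> {"512", "509", "618", ""}
  if (parts.length >= 3) {
    TableRow row = table.addRow();
    row.setInt("X", int(parts[0]));
    row.setInt("Y", int(parts[1]));
    row.setInt("Z", int(parts[2]));
  }
}
void keyPressed() {
  saveTable(table, "data.csv"); // writes data.csv into the sketch folder
  exit();
}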
Related
I am new to Apache Flink and trying to learn data streams. I am reading student data, which has 3 columns (Name, Subject and Marks), from a CSV file. I have applied a filter on marks and am only selecting records where marks > 40.
I am trying to write this data to a CSV file, but the program runs successfully and the CSV file remains empty. No data gets written to the CSV file.
I have tried different syntaxes for writing the CSV file, but none of them worked for me. I am running this locally through Eclipse. Writing to a text file works fine.
DataStream<String> text = env.readFile(format, params.get("input"),
FileProcessingMode.PROCESS_CONTINUOUSLY,100);
DataStream<String> filtered = text.filter(new FilterFunction<String>(){
public boolean filter(String value) {
String[] tokens = value.split(",");
return Integer.parseInt(tokens[2]) >= 40;
}
});
filtered.writeAsText("testFilter",WriteMode.OVERWRITE);
DataStream<Tuple2<String, Integer>> tokenized = filtered
.map(new MapFunction<String, Tuple2<String, Integer>>(){
public Tuple2<String, Integer> map(String value) throws Exception {
return new Tuple2("Test", Integer.valueOf(1));
}
});
tokenized.print();
tokenized.writeAsCsv("file:///home/Test/Desktop/output.csv",
WriteMode.OVERWRITE, "/n", ",");
try {
env.execute();
} catch (Exception e1) {
e1.printStackTrace();
}
}
}
Below is my input CSV format:
Name1,Subj1,30
Name1,Subj2,40
Name1,Subj3,40
Name1,Subj4,40
tokenized.print() prints all the correct records.
I did a little experimenting, and found that this job works just fine:
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.core.fs.FileSystem;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
public class WriteCSV {
public static void main(String[] args) throws Exception {
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.setParallelism(1);
env.fromElements(new Tuple2<>("abc", 1), new Tuple2<>("def", 2))
.writeAsCsv("file:///tmp/test.csv", FileSystem.WriteMode.OVERWRITE, "\n", ",");
env.execute();
}
}
If I don't set the parallelism to 1, then the results are different. In that case, test.csv is a directory containing four files, each written by one of the four parallel subtasks.
I'm not sure what's wrong in your case, but maybe you can work backwards from this example (assuming it works for you).
You should remove tokenized.print(); before tokenized.writeAsCsv();.
The print(); call will consume the data.
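Putting both answers together (and setting env.setParallelism(1) right after creating the environment, as in the example above), the end of the original job would look roughly like this; note that the print() sink is gone and the row delimiter is "\n" rather than "/n":
// writeAsCsv() is now the only sink consuming the tokenized stream
tokenized.writeAsCsv("file:///home/Test/Desktop/output.csv",
        WriteMode.OVERWRITE, "\n", ",");
try {
    env.execute();
} catch (Exception e1) {
    e1.printStackTrace();
}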
For a university assignment, we have just been introduced to the try/catch exception handling mechanism and have some assignment work associated with it.
Our task is to take a file that consists of numbers, add them together, and display the total.
However, anything BUT an INTEGER should be ignored wherever it appears in the file, and processing should continue. For example:
5
6
g
10
Should add 5 + 6 + 10 and ignore g, but for some reason I can't work out how to ignore anything but an int. Also, I need it so that if the first line of the file is g, it is ignored and the program continues downwards, adding all the integers.
I am therefore having a hard time getting the program to display the contents of the file.
public static void main(String[] args) throws FileNotFoundException {
int numbers = 0;
int total = 0;
Scanner fileScan = null;
try{
File integers = new File("integers.txt");
fileScan = new Scanner(integers);
// Loops through file and adds up
while(fileScan.hasNextLine()){
numbers = fileScan.nextInt();
System.out.println(numbers);
total+=numbers;
}
//fileScan.next();
System.out.print("Normal Total: " +total);
}
catch(FileNotFoundException ex){
System.err.println("Missing file try again!");
}
catch(InputMismatchException ex){
System.err.print("result through input exception\n");
//System.out.println(total);
//total+=numbers;
}
catch(NoSuchElementException ex){
System.err.println("Cannot find error");
}
finally{
System.out.println("\nTotal through finally: "+total);
fileScan.close();
}
}
}
And the file output is:
result through input exception
6
8
Total through finally: 14
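The post ends there, but for completeness: one common way to ignore anything that is not an int is to check hasNextInt() inside the loop and consume the offending token with next(). A minimal sketch of that loop (not part of the original post) would be:
// replaces the original while loop so nextInt() never throws on a token like "g"
while (fileScan.hasNext()) {
    if (fileScan.hasNextInt()) {
        int n = fileScan.nextInt();   // an integer: add it to the total
        System.out.println(n);
        total += n;
    } else {
        fileScan.next();              // not an integer: consume it and move on
    }
}
System.out.println("Normal Total: " + total);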
Couchbase newbie question:
I am trying to insert 1 million records into Couchbase, but I see that only about 0.5 million records get inserted (the admin console shows 517365 as the Item Count).
Also, from the admin GUI, I can only see 1000 records (10 pages of 100 records each).
I am wondering where the rest of the records are vanishing!
1) Can somebody help me with this?
2) Which log file should I be looking at to find insertion failure errors?
I suspect Couchbase has an internal queue and, once it gets full, further requests are dropped. If so, how do I configure the queue size?
PS: I tried looking into the logs at C:\Program Files\Couchbase\Server\var\lib\couchbase\logs, but couldn't figure anything out.
public class Test {
public static void main(String[] args) {
ArrayList<URI> nodes = new ArrayList<URI>();
String cbUrl = "http://127.0.0.1:8091/pools";
String dbName = "default";
CouchbaseClient client = null;
try {
nodes.add(URI.create(cbUrl));
client = new CouchbaseClient(nodes, dbName, "");
insertRecords(client);
System.out.println("Test Over");
} catch (Exception e) {
e.printStackTrace();
} finally {
// client.shutdown();
}
}
public static void insertRecords(CouchbaseClient client) throws Exception {
int num = 1000000;
for (int n = 1; n <= num; n++) {
System.out.println("Adding: " + n);
client.set(n + "", 0, n + "");
}
}
}
The set operation in the Couchbase Java SDK is asynchronous. This means once the call returns there is no guarantee that you've even sent the operation to Couchbase since it may not have even been written to the network buffer yet. In order to make sure the operation has completed you need to call the get() function on the object (which is a Future) returned by the set() API.
In other words replace this line:
client.set(n + "", 0, n + "");
with this one:
client.set(n + "", 0, n + "").get();
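Applied to the insertRecords() method from the question, that change would look roughly like this (a sketch; blocking on each future makes the loop slower, but every set is confirmed, and failures surface as exceptions instead of being dropped silently):
public static void insertRecords(CouchbaseClient client) throws Exception {
    int num = 1000000;
    for (int n = 1; n <= num; n++) {
        // get() blocks until the server has acknowledged this operation
        client.set(n + "", 0, n + "").get();
    }
}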
To expand on @mikewied's answer: to check that all 1,000,000 set operations have completed without having to call .get() explicitly on each one (and hence converting the calls from async to sync), you need to add a listener to each set which tracks how many of your operations have completed.
There's a nice example of how to do this in the blog post announcing Couchbase Java SDK 1.2:
final CountDownLatch latch = new CountDownLatch(100);
for (int i = 0; i < 100; i++) {
OperationFuture<Boolean> future = client.set("key-" + i, "value");
future.addListener(new OperationCompletionListener() {
@Override
public void onComplete(OperationFuture<?> future) throws Exception {
latch.countDown();
}
});
}
latch.await();
You create a CountDownLatch, initialised to how many documents you are set()ing, then register a listener which is called on completion of each set (but note the sets are still asynchronous). At the end you then call await() on the latch to ensure that all set operations have completed before continuing.
This approach is described in more detail in the Understanding and Using Asynchronous Operations section of the Couchbase Java SDK Developer guide, along with a more compact syntax if you're using Java 8.
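Applied to the insertRecords() loop from the question, the latch-based variant might look something like this (a sketch combining the question's code with the listener example above; it needs java.util.concurrent.CountDownLatch plus the SDK's OperationFuture and OperationCompletionListener imports):
public static void insertRecords(final CouchbaseClient client) throws Exception {
    int num = 1000000;
    final CountDownLatch latch = new CountDownLatch(num);
    for (int n = 1; n <= num; n++) {
        OperationFuture<Boolean> future = client.set(n + "", 0, n + "");
        future.addListener(new OperationCompletionListener() {
            @Override
            public void onComplete(OperationFuture<?> f) throws Exception {
                latch.countDown(); // one more set acknowledged
            }
        });
    }
    latch.await(); // block until all sets have completed
}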
I would like to redirect rows that could not be loaded into a table to an error file.
I see that the red exception path has 3 input columns coming in; the "Flat File Source Error Output Column" contains the original data from the file.
The problem is that when I open the file, there is an extra carriage return/line feed (CRLF) after every row. I'd like to be able to manually fix the errors and reprocess them without having to delete all of the added CRLF characters, so I added a Script Component to shave off the characters being added.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
string buffer = GetString(Row.FlatFileSourceErrorOutputColumn.GetBlobData(0, (int)(Row.FlatFileSourceErrorOutputColumn.Length)));
System.Windows.Forms.MessageBox.Show(buffer);
byte[] ThisBytes = GetBytes("Test");
Row.FlatFileSourceErrorOutputColumn.ResetBlobData();
Row.FlatFileSourceErrorOutputColumn.AddBlobData(ThisBytes);
}
static string GetString(byte[] bytes)
{
char[] chars = new char[bytes.Length / sizeof(char)];
System.Buffer.BlockCopy(bytes, 0, chars, 0, bytes.Length);
return new string(chars);
}
static byte[] GetBytes(string str)
{
byte[] bytes = new byte[str.Length * sizeof(char)];
System.Buffer.BlockCopy(str.ToCharArray(), 0, bytes, 0, bytes.Length);
return bytes;
}
But my debug message box shows non-displayable characters that appear as blocks.
When I try to force a literal "TEST" into the output file, as a test to see whether I can control what goes into the file, I get NULL (ASCII 0) characters after every letter.
Why is SSIS adding a CRLF when I simply redirect the output column to the file without using a Script Component to modify the data? How can I get rid of the CRLF? Why am I unable to read the byte array in the data column and display it as a string? Why does the "TEST" literal end up with NULLs between every letter? Are my byte-array conversion functions incorrect?
Got it.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
string buffer = GetString(Row.FlatFileSourceErrorOutputColumn.GetBlobData(0, (int)(Row.FlatFileSourceErrorOutputColumn.Length - 2)));
System.Windows.Forms.MessageBox.Show(buffer);
byte[] ThisBytes = GetBytes(buffer);
Row.FlatFileSourceErrorOutputColumn.ResetBlobData();
Row.FlatFileSourceErrorOutputColumn.AddBlobData(ThisBytes);
}
static string GetString(byte[] bytes)
{
System.Text.Encoding enc = System.Text.Encoding.ASCII;
return enc.GetString(bytes);
}
static byte[] GetBytes(string str)
{
System.Text.Encoding enc = System.Text.Encoding.ASCII;
return enc.GetBytes(str);
}
I would still like to know why SSIS is adding the CRLF!
There is a web service deployed on Tomcat 6 and exposed via Apache CXF 2.3.3. I generated source stubs using wsdl2java to be able to call this service.
Things seemed fine until I sent a big request (~1 MB). This request wasn't processed and failed with an exception:
Interceptor for {http://localhost/}ResourceAllocationServiceSoapService has thrown
exception, unwinding now org.apache.cxf.binding.soap.SoapFault:
Error reading XMLStreamReader.
...
com.ctc.wstx.exc.WstxEOFException: Unexpected EOF in prolog
at [row,col {unknown-source}]: [1,0]
Is there some kind of max request length here? I'm totally stuck on it.
Vladimir's suggestion worked. The code below should help others understand where to put the 1000000.
public void handleMessage(SoapMessage message) throws Fault {
// Get message content for dirty editing...
InputStream inputStream = message.getContent(InputStream.class);
if (inputStream != null)
{
String processedSoapEnv = "";
// Cache InputStream so it can be read independently
CachedOutputStream cachedInputStream = new CachedOutputStream(1000000);
try {
IOUtils.copy(inputStream,cachedInputStream);
inputStream.close();
cachedInputStream.close();
InputStream tmpInputStream = cachedInputStream.getInputStream();
try{
String inputBuffer = "";
int data;
while((data = tmpInputStream.read()) != -1){
byte x = (byte)data;
inputBuffer += (char)x;
}
/**
* At this point you can choose to reformat the SOAP
* envelope or simply view it just make sure you put
* an InputStream back when you done (see below)
* otherwise CXF will complain.
*/
processedSoapEnv = fixSoapEnvelope(inputBuffer);
}
catch(IOException e){
}
}
catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
// Re-set the SOAP InputStream with the new envelope
message.setContent(InputStream.class,new ByteArrayInputStream( processedSoapEnv.getBytes()));
/**
* If you just want to read the InputStream and not
* modify it then you just need to put it back where
* it was using the CXF cached inputstream
*
* message.setContent(InputStream.class,cachedInputStream.getInputStream());
*/
}
}
I figured out what was wrong. It was actually a bug inside the interceptor's code:
CachedOutputStream requestStream = new CachedOutputStream()
When I replaced this with
CachedOutputStream requestStream = new CachedOutputStream(1000000);
things started working fine.
So the request was simply being truncated while the streams were copied.
I ran into the same issue of getting "com.ctc.wstx.exc.WstxEOFException: Unexpected EOF in prolog" when using the CachedOutputStream class.
Looking at the sources of the CachedOutputStream class, the threshold is used to switch between storing the stream's data in memory and storing it in a file.
Assuming the stream holds more data than the threshold, the data gets stored in a file, and thus the following code is going to break:
IOUtils.copy(inputStream,cachedInputStream);
inputStream.close();
cachedInputStream.close(); // closes the stream; the file on disk gets deleted
InputStream tmpInputStream = cachedInputStream.getInputStream(); // the returned tmpInputStream is a brand new, *empty* one
// ... reading tmpInputStream here will produce WstxEOFException
Increasing the 'threshold' does help, because then all of the stream data is kept in memory; in that scenario, calling cachedInputStream.close() does not really close the underlying stream implementation, so one can still read from it later on.
Here is a 'fixed' version of the above code (at least it worked without an exception for me):
IOUtils.copy(inputStream,cachedInputStream);
inputStream.close();
InputStream tmpInputStream = cachedInputStream.getInputStream();
cachedInputStream.close();
// reading from tmpInputStream here works fine
The temporary file gets deleted when close() is called on tmpInputStream and there are no other references to it; see the source code of CachedOutputStream.maybeDeleteTempFile().