Apache Camel CSV with Header

I have written a simple test app that reads records from a DB and puts the result in a CSV file. So far it works fine, but the column names, i.e. the headers, are not written to the CSV file. According to the docs they should be. I have also tried it with and without streaming and split, but the situation is the same.
In the Camel unit tests the headers are set explicitly (line 182 of the linked file): https://github.com/apache/camel/blob/master/components/camel-csv/src/test/java/org/apache/camel/dataformat/csv/CsvDataFormatTest.java
How can this very simple problem be solved without having to iterate over the headers? I also experimented with different settings, but the result is always the same: the delimiters I set are honored, for example, but the headers are not. Thanks in advance for any responses.
I used Camel 2.16.1 like this:
final CsvDataFormat csvDataFormat = new CsvDataFormat();
csvDataFormat.setHeaderDisabled(false);
[...]
from("direct:TEST").routeId("TEST")
    .setBody(constant("SELECT * FROM MYTABLE"))
    .to("jdbc:myDataSource?readSize=100") // max 100 records
    // .split(simple("${body}")) // split the list
    // .streaming() // avoid keeping all messages in memory
    .marshal(csvDataFormat)
    .to("file:extract?fileName=TEST.csv");
[...]
EDIT 1
I have also tried to add the headers from the exchange's in-message. They are available there under the name "CamelJdbcColumnNames" as a HashSet. I added them to the csvDataFormat like this:
final CsvDataFormat csvDataFormat = new CsvDataFormat();
csvDataFormat.setHeaderDisabled(false);
[...]
from("direct:TEST").routeId("TEST")
    .setBody(constant("SELECT * FROM MYTABLE"))
    .to("jdbc:myDataSource?readSize=100") // max 100 records
    .process(new Processor() {
        public void process(Exchange exchange) throws Exception {
            Set<String> headerNames = (HashSet<String>) exchange.getIn().getHeader("CamelJdbcColumnNames");
            System.out.println("#### Process headernames = " + new ArrayList<String>(headerNames).toString());
            csvDataFormat.setHeader(new ArrayList<String>(headerNames));
        }
    })
    .marshal(csvDataFormat) //.tracing()
    .to("file:extract?fileName=TEST.csv");
The println() prints the column names, but the generated CSV file does not contain them.
EDIT 2
I added the header names to the body as proposed in comment 1, like this:
.process(new Processor() {
    public void process(Exchange exchange) throws Exception {
        Set<String> headerNames = (HashSet<String>) exchange.getIn().getHeader("CamelJdbcColumnNames");
        Map<String, String> nameMap = new LinkedHashMap<String, String>();
        for (String name : headerNames) {
            nameMap.put(name, name);
        }
        List<Map> listWithHeaders = new ArrayList<Map>();
        listWithHeaders.add(nameMap);
        List<Map> records = exchange.getIn().getBody(List.class);
        listWithHeaders.addAll(records);
        exchange.getIn().setBody(listWithHeaders, List.class);
        System.out.println("#### Process headernames = " + new ArrayList<String>(headerNames).toString());
        csvDataFormat.setHeader(new ArrayList<String>(headerNames));
    }
})
The proposal solved the problem, and thank you for that, but it means that CsvDataFormat is not really usable. The exchange body after the JDBC query contains an ArrayList of HashMaps, each holding one record of the table. The key of the HashMap is the column name and the value is the column value. So setting the config value for header output in CsvDataFormat should be more than enough to get the headers generated. Do you know a simpler solution, or did I miss something in the configuration?

You take the data from a database with JDBC, so you need to add the headers to the message body yourself so that they form the first row. The result set from JDBC is just the data, not including headers.

I have done it by overriding BindyCsvDataFormat and BindyCsvFactory:
public class BindySplittedCsvDataFormat extends BindyCsvDataFormat {

    private boolean marshallingFirstLot = false;

    public BindySplittedCsvDataFormat() {
        super();
    }

    public BindySplittedCsvDataFormat(Class<?> type) {
        super(type);
    }

    @Override
    public void marshal(Exchange exchange, Object body, OutputStream outputStream) throws Exception {
        marshallingFirstLot = Integer.valueOf(0).equals(exchange.getProperty("CamelSplitIndex"));
        super.marshal(exchange, body, outputStream);
    }

    @Override
    protected BindyAbstractFactory createModelFactory(FormatFactory formatFactory) throws Exception {
        BindySplittedCsvFactory bindyCsvFactory = new BindySplittedCsvFactory(getClassType(), this);
        bindyCsvFactory.setFormatFactory(formatFactory);
        return bindyCsvFactory;
    }

    protected boolean isMarshallingFirstLot() {
        return marshallingFirstLot;
    }
}

public class BindySplittedCsvFactory extends BindyCsvFactory {

    private BindySplittedCsvDataFormat bindySplittedCsvDataFormat;

    public BindySplittedCsvFactory(Class<?> type, BindySplittedCsvDataFormat bindySplittedCsvDataFormat) throws Exception {
        super(type);
        this.bindySplittedCsvDataFormat = bindySplittedCsvDataFormat;
    }

    @Override
    public boolean getGenerateHeaderColumnNames() {
        return super.getGenerateHeaderColumnNames() && bindySplittedCsvDataFormat.isMarshallingFirstLot();
    }
}
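If it helps, here is a hedged usage sketch for the classes above; the record class name and endpoints are illustrative, not part of the original answer:
// Sketch: only the first split chunk gets the header row, because the factory
// consults isMarshallingFirstLot() when deciding whether to generate headers.
BindySplittedCsvDataFormat bindy = new BindySplittedCsvDataFormat(MyCsvRecord.class); // MyCsvRecord is hypothetical

from("direct:TEST")
    .split(body()).streaming()
    .marshal(bindy)
    .to("file:extract?fileName=TEST.csv&fileExist=Append");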

My solution with Spring XML (though I'd prefer a built-in option for also extracting the header on top):
<multicast stopOnException="true">
    <pipeline>
        <log message="saving table ${headers.tablename} header to ${headers.CamelFileName}..."/>
        <setBody>
            <groovy>request.headers.get('CamelJdbcColumnNames').join(";") + "\n"</groovy>
        </setBody>
        <to uri="file:output"/>
    </pipeline>
    <pipeline>
        <log message="saving table ${headers.tablename} rows to ${headers.CamelFileName}..."/>
        <marshal>
            <csv delimiter=";" headerDisabled="false" useMaps="true"/>
        </marshal>
        <to uri="file:output?fileExist=Append"/>
    </pipeline>
</multicast>
http://www.redaelli.org/matteo-blog/2019/05/24/exporting-database-tables-to-csv-files-with-apache-camel/

Related

Camel - CSV Headers setting not working

I have CSV files without headers. Since I'm using 'useMaps', I want to specify the headers dynamically. If I set the headers statically and then use them in the route, it works fine, as in Approach 1 below:
@Component
public class BulkActionRoutes extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        CsvDataFormat csv = new CsvDataFormat(",");
        csv.setUseMaps(true);
        ArrayList<String> list = new ArrayList<String>();
        list.add("DeviceName");
        list.add("Brand");
        list.add("status");
        list.add("type");
        list.add("features_c");
        list.add("battery_c");
        list.add("colors");
        csv.setHeader(list);

        from("direct:bulkImport")
            .convertBodyTo(String.class)
            .unmarshal(csv)
            .split(body()).streaming()
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    GenericObjectModel model = null;
                    HashMap<String, String> csvRecord = (HashMap<String, String>) exchange.getIn().getBody();
                }
            });
    }
}
However, if the list is passed via Camel headers as below (Approach 2), it does not work:
@Component
public class BulkActionRoutes extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        CsvDataFormat csv = new CsvDataFormat(",");
        csv.setUseMaps(true);

        from("direct:bulkImport")
            .convertBodyTo(String.class)
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    ArrayList<String> fileHeaders = (ArrayList<String>) exchange.getIn().getHeader(Constants.FILE_HEADER_LIST);
                    if (fileHeaders != null && fileHeaders.size() > 0) {
                        csv.setHeader(fileHeaders);
                    }
                }
            })
            .unmarshal(csv)
            .split(body()).streaming()
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    GenericObjectModel model = null;
                    HashMap<String, String> csvRecord = (HashMap<String, String>) exchange.getIn().getBody();
                }
            });
    }
}
What could be missing in Approach 2?
The big difference between Approach 1 and Approach 2 is the scope.
In Approach 1 you fully configure the CSV data format. This is all done when the Camel Context is created, since the data format is shared within the Camel Context. When messages are processed, the same config applies to all of them.
In Approach 2 you just configure the basics globally. The header configuration happens within the route and therefore can change for every single message. Every message would overwrite the header configuration of the context-global data format instance.
Without being sure about this, I suspect that it is not possible to change a context-global DataFormat inside the routes.
What would you expect (just for example) when messages are processed in parallel? They would overwrite each other's header configuration.
As an alternative, you could use a POJO where you do the dynamic marshal/unmarshal from Java code, as sketched below.
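For instance, one possible sketch of that alternative, parsing per message with Apache Commons CSV (the library camel-csv builds on; imports from org.apache.commons.csv are assumed, and Constants.FILE_HEADER_LIST is the question's own key):
.process(new Processor() {
    @Override
    public void process(Exchange exchange) throws Exception {
        // Build a per-message CSV format instead of mutating the shared data format.
        List<String> fileHeaders = exchange.getIn().getHeader(Constants.FILE_HEADER_LIST, List.class);
        CSVFormat format = (fileHeaders != null && !fileHeaders.isEmpty())
                ? CSVFormat.DEFAULT.withHeader(fileHeaders.toArray(new String[0]))
                : CSVFormat.DEFAULT;
        try (CSVParser parser = CSVParser.parse(exchange.getIn().getBody(String.class), format)) {
            List<Map<String, String>> rows = new ArrayList<>();
            for (CSVRecord record : parser) {
                rows.add(record.toMap()); // header -> value, similar to useMaps
            }
            exchange.getIn().setBody(rows);
        }
    }
})
.split(body()).streaming()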

Display Message to User instead of empty JSON on HTML when records are empty in the database

I have an application with an HTML page that takes user input through a textbox. It is a REST Spring Framework application, divided into Controller, Entity, Service, Repository, View, and the main application class.
I take an input value and search the MongoDB database. If the value is present, I return the entity object from the Service to the Controller. The Controller returns the corresponding entity view object, PersonView in this case, and I get JSON data.
The above scenario works well as long as there are records in the database. If the record is not present, it returns empty JSON. My Controller returns a PersonView object, and I do not wish to change the signature to return a String, since in that case the object's address is rendered on my HTML page.
Considering this, how should I handle the case when there are no records in the database and I wish to display a message on this same HTML page saying there are no records available?
I tried throwing an exception, but in that case too, how do I display a message on my HTML page, considering that my Controller returns a JSON object and I do not wish to change its signature?
The Controller class is as below:
public PersonView searchPerson(@PathVariable String pname) {
    List<Person> pList = personService.searchPerson(pname);
    PersonView personView = new PersonView();
    personView.setPersonView(pList);
    return personView;
}
EDIT:
Here is the method from the PersonView class that I call in the Controller:
public void setPersonView(List<Person> personView) {
    this.personView = personView;
}
Here is the service implementation class:
public List<Person> searchPerson(String name) throws Exception {
    List<Person> personlist = new ArrayList<Person>();
    personlist = personRepository.findByName(name);
    if (personlist.isEmpty())
        throw new Exception("Records not found in the database");
    return personlist;
}
Create a custom Exception class:
public class EntityNotFoundException extends RuntimeException {
    public EntityNotFoundException(String message) {
        super(message);
    }
}
Now, in your controller code:
public List<Person> searchPerson(String name) {
    List<Person> personlist = new ArrayList<Person>();
    personlist = personRepository.findByName(name);
    if (personlist.isEmpty()) {
        throw new EntityNotFoundException("Records not found in the database");
    }
    return personlist;
}
After that you can try something like this in your controller class:
private static final MappingJacksonJsonView JSON_VIEW = new MappingJacksonJsonView();

@ExceptionHandler(EntityNotFoundException.class)
public ModelAndView handleNotFoundException(Exception ex) {
    return new ModelAndView(JSON_VIEW, "error", new ErrorMessage("No Record in Db"));
}
Your ErrorMessage class can be a simple POJO:
public class ErrorMessage {
    private String message;

    ErrorMessage(String message) {
        this.message = message;
    }

    public String getMessage() {
        return message;
    }
}
Although already answered, I will add some points here.
Please note that at some point you will need to send headers and a response body (with different objects). So consider using ResponseEntity, which wraps your List. Here is the sample code:
public ResponseEntity<List<Person>> searchPerson(String name) {
    List<Person> personlist = new ArrayList<Person>();
    personlist = personRepository.findByName(name);
    if (personlist.isEmpty()) {
        return new ResponseEntity(new EntityNotFoundException("Records not found in the database"), HttpStatus.BAD_REQUEST);
    }
    return new ResponseEntity(personlist, HttpStatus.OK);
}
ResponseEntity provides flexibility to a greater extent. Read the documentation here:
https://docs.spring.io/spring/docs/current/javadoc-api/org/springframework/http/ResponseEntity.html

Camel bindy marshal to file creates multiple header rows

I have the following camel route:
from(inputDirectory)
    .unmarshal(jaxb)
    .process(jaxb2CSVDataProcessor)
    .split(body()) // because there is a list of CSVRecords
    .marshal(bindyCsvDataFormat)
    .to(outputDirectory); // appending to existing file using "?autoCreate=true&fileExist=Append"
For my CSV model class I am using annotations:
@CsvRecord(separator = ",", generateHeaderColumns = true)
...
and for the properties:
@DataField(pos = 0)
...
My problem is that the headers are appended every time a new CSV record is appended.
Is there a non-dirty way to control this? Am I missing anything here?
I made a workaround which works quite nicely: creating the header by querying the column names of the @DataField annotation. This happens once, the first time the file is written. I wrote down the whole solution here:
How to generate a Flat file with header and footer using Camel Bindy
I ended up adding a processor that checks whether the CSV file exists, just before the "to" clause. In it I manipulate the byte array and remove the headers, roughly as sketched below.
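A hedged sketch of that processor (assumptions: the route marshals with Bindy first, and the target file path is known; the path below is illustrative):
.marshal(bindyCsvDataFormat)
.process(new Processor() {
    @Override
    public void process(Exchange exchange) throws Exception {
        File target = new File("output/out.csv"); // hypothetical: resolve the same file the route appends to
        if (target.exists()) {
            // The file already has a header line, so strip the first line of this chunk.
            String text = exchange.getIn().getBody(String.class);
            int firstNewline = text.indexOf('\n');
            if (firstNewline >= 0) {
                exchange.getIn().setBody(text.substring(firstNewline + 1));
            }
        }
    }
})
.to(outputDirectory);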
Hope this helps someone else. I needed to do something similar, where after my first split message I wanted to suppress the header output. Here is a complete class (FieldUtils is part of the Apache Commons Lang library):
package com.routes;

import java.io.OutputStream;
import org.apache.camel.Exchange;
import org.apache.camel.dataformat.bindy.BindyAbstractFactory;
import org.apache.camel.dataformat.bindy.BindyCsvFactory;
import org.apache.camel.dataformat.bindy.BindyFactory;
import org.apache.camel.dataformat.bindy.FormatFactory;
import org.apache.camel.dataformat.bindy.csv.BindyCsvDataFormat;
import org.apache.commons.lang3.reflect.FieldUtils;

public class StreamingBindyCsvDataFormat extends BindyCsvDataFormat {

    public StreamingBindyCsvDataFormat(Class<?> type) {
        super(type);
    }

    @Override
    public void marshal(Exchange exchange, Object body, OutputStream outputStream) throws Exception {
        final StreamingBindyModelFactory factory = (StreamingBindyModelFactory) super.getFactory();
        final int splitIndex = exchange.getProperty(Exchange.SPLIT_INDEX, -1, int.class);
        final boolean splitComplete = exchange.getProperty(Exchange.SPLIT_COMPLETE, false, boolean.class);
        super.marshal(exchange, body, outputStream);
        if (splitIndex == 0) {
            factory.setGenerateHeaderColumnNames(false); // turn off header generation after the first exchange
        } else if (splitComplete) {
            factory.setGenerateHeaderColumnNames(true); // turn header generation back on when the split completes
        }
    }

    @Override
    protected BindyAbstractFactory createModelFactory(FormatFactory formatFactory) throws Exception {
        BindyCsvFactory bindyCsvFactory = new StreamingBindyModelFactory(getClassType());
        bindyCsvFactory.setFormatFactory(formatFactory);
        return bindyCsvFactory;
    }

    public class StreamingBindyModelFactory extends BindyCsvFactory implements BindyFactory {

        public StreamingBindyModelFactory(Class<?> type) throws Exception {
            super(type);
        }

        public void setGenerateHeaderColumnNames(boolean generateHeaderColumnNames) throws IllegalAccessException {
            FieldUtils.writeField(this, "generateHeaderColumnNames", generateHeaderColumnNames, true);
        }
    }
}
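A possible way to wire it into the question's route (a sketch; the model class name is illustrative):
// StreamingBindyCsvDataFormat replaces the plain BindyCsvDataFormat, so the
// header row is emitted only for the first split exchange.
BindyCsvDataFormat bindyCsvDataFormat = new StreamingBindyCsvDataFormat(MyCsvModel.class); // MyCsvModel is hypothetical

from(inputDirectory)
    .unmarshal(jaxb)
    .process(jaxb2CSVDataProcessor)
    .split(body())
    .marshal(bindyCsvDataFormat)
    .to(outputDirectory); // still with ?autoCreate=true&fileExist=Append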

Hibernate export to csv

I want to export a query result to an Excel or CSV file.
I am using Hibernate and Struts.
Is there a query like 'into outfile' which can directly export the result to a specified location?
In the MySQL database the 'into outfile' query works fine, but from Hibernate it does not.
I tried using native SQL, but it gives the error 'couldn't execute bulk manipulation query', and I cannot solve that.
I am using a MySQL database.
If you are writing a web app and using Spring, you can do it by writing the data to an output stream.
Write a simple class to construct your response:
public class CsvResponse {
    private final String filename;
    private final List<YourPojo> records;

    public CsvResponse(List<YourPojo> records, String filename) {
        this.records = records;
        this.filename = filename;
    }

    public String getFilename() {
        return filename;
    }

    public List<YourPojo> getRecords() {
        return records;
    }
}
Now write a message converter that writes them to an output stream:
public class CsvMessageConverter extends AbstractHttpMessageConverter<CsvResponse> {
    public static final MediaType MEDIA_TYPE = new MediaType("text", "csv", Charset.forName("UTF-8"));

    public CsvMessageConverter() {
        super(MEDIA_TYPE);
    }

    protected boolean supports(Class<?> clazz) {
        return CsvResponse.class.equals(clazz);
    }

    protected void writeInternal(CsvResponse response, HttpOutputMessage output) throws Exception {
        output.getHeaders().setContentType(MEDIA_TYPE);
        output.getHeaders().set("Content-Disposition", "attachment; filename=\"" + response.getFilename() + "\"");
        OutputStream out = output.getBody();
        CsvWriter writer = new CsvWriter(new OutputStreamWriter(out), '\u0009');
        List<YourPojo> allRecords = response.getRecords();
        for (int i = 0; i < allRecords.size(); i++) { // start at 0 so the first record is not skipped
            YourPojo aReq = allRecords.get(i);
            writer.write(aReq.toString());
        }
        writer.close();
    }
}
Add this message converter to your app context config file:
<mvc:annotation-driven>
    <mvc:message-converters register-defaults="true">
        <bean class="com.yourpackage.CsvMessageConverter"/>
    </mvc:message-converters>
</mvc:annotation-driven>
Finally, the controller will look like this:
@RequestMapping(value = "/csvData", method = RequestMethod.GET, produces = "text/csv")
@ResponseBody
public CsvResponse getFullData(HttpSession session) throws IOException {
    // get data
    List<YourPojo> allRecords = yourService.getData();
    return new CsvResponse(allRecords, "yourData.csv");
}
I've found a similar way using JAX-RS here.
The bottom line is that you'll have to use a REST mechanism to get the data into the output stream if you want to do it the proper way; but if your only goal is to get the data into a file, you can just fetch your data into a list and simply write it to a file, as sketched below.
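For completeness, a minimal sketch of that write-it-to-a-file option (assumptions: YourPojo.toString() renders one CSV line, the file name and header row are illustrative, and this runs inside a method that declares IOException; imports from java.io):
List<YourPojo> allRecords = yourService.getData();
try (PrintWriter out = new PrintWriter(new FileWriter("yourData.csv"))) {
    out.println("col1,col2,col3"); // hypothetical header row
    for (YourPojo record : allRecords) {
        out.println(record.toString()); // assumes toString() emits comma-separated values
    }
}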

Weird query issue on hibernate

I have hit a weird problem with updating and displaying data in Hibernate. Can anyone help me, please?
I am using Hibernate and Spring with MySQL.
The problem I am facing is this: all changes are applied to the database, but if I load the updated item on a web page, it randomly returns either the old or the new data.
I am sure it is not a browser cache problem. I tried printing the returned data in the getPost method of the DAO class; it sometimes prints the wrong value.
Say I change a post's content multiple times; all changes are stored in the database. But if I continuously refresh the page to display the changed data, it randomly shows any of the previous versions.
I have tried different ways to load the data in the getPost method, but still face the same problem:
tried session.clear() and session.flush()
disabled the second-level cache:
<prop key="hibernate.cache.use_second_level_cache">false</prop>
<prop key="hibernate.cache.use_query_cache">false</prop>
<prop key="hibernate.cache.provider_class">org.hibernate.cache.EhCacheProvider</prop>
<prop key="hibernate.cache.use_structured_entries">false</prop>
tried different ways to load the data: session.load, session.get, HQL query, Criteria; all have the same issue.
In the getPost method of PostDAO I tried loading the data with native SQL first, to compare it with the result of the Hibernate query: both queries return the old data.
Code:
public class Post implements Cloneable, Serializable {
    private String postID;
    private String content;
}
PostSelectController (controller):
public class PostSelectController extends AbstractController {
    ....
    protected ModelAndView handleRequestInternal(HttpServletRequest request, HttpServletResponse response) throws Exception {
        String id = request.getParameter("id");
        Course course = null;
        Vendor vendor = null;
        Post post = null;
        ModelAndView modelAndView = new ModelAndView();
        modelAndView.setViewName(getSuccessView());
        post = postService.getPost(id);
        modelAndView.addObject("post", post);
        return modelAndView;
    }
}
postService:
@Transactional(propagation = Propagation.SUPPORTS, isolation = Isolation.READ_COMMITTED, readOnly = true)
public class PostService {

    @Transactional(propagation = Propagation.REQUIRED, readOnly = false)
    public boolean updatePost(Post post) {
        System.out.println("service side::::::::::::::::::::::" + (post.getBestAnswer() != null));
        if (post.getBestAnswer() != null) System.out.println(">>>>>>>>" + post.getBestAnswer().getPostID());
        System.out.println("service side::::::::::::::::::::::" + (post.getBestAnswer() != null));
        return this.postDAO.updatePost(post);
    }

    public Post getPost(String postID) {
        return this.postDAO.getPost(postID);
    }
}
postDAO:
public class PostDAO {

    private SessionFactory sessionFactory;
    ...

    public boolean updatePost(Post post) {
        boolean proceed = true;
        try {
            Session session = sessionFactory.getCurrentSession();
            session.merge(post); // tried session.update, same problem
            session.flush();     // it does not help
        } catch (Exception ex) {
            logger.error(post.getPostID() + " refused :: " + ex.getMessage());
            proceed = false;
        }
        return proceed;
    }

    public Post getPost(String postID) {
        Session session = sessionFactory.getCurrentSession();
        try {
            // load the data with native SQL to compare it with the result of the Hibernate query
            PreparedStatement st = session.connection()
                    .prepareStatement("select content from post where postid = ?");
            st.setString(1, postID);
            ResultSet rs = st.executeQuery();
            while (rs.next()) {
                System.out.println("database::::::::::::::::::" + rs.getString("content"));
                break;
            }
        } catch (Exception ex) {
        }
        Criteria crit = session.createCriteria(Post.class);
        NaturalIdentifier natId = Restrictions.naturalId();
        natId.set("postID", postID);
        crit.add(natId);
        crit.setCacheable(false);
        List<Post> posts = crit.list();
        Post post = null;
        if (posts != null && !posts.isEmpty()) post = posts.get(0);
        if (post != null) System.out.println("hibernate::::::::::::::::::" + post.getContent());
        return post;
    }
}
I had the same trouble and found the answer quickly. As Riccardo said, the problem was that the session was not being cleanly closed, so it was randomly recycled. I handled this in the constructor of the class.
Example (using HibernateUtil here):
public yourHelper() {
    this.session = HibernateUtil.getSessionFactory().getCurrentSession();
    if (session.isOpen()) {
        session.close();
        session = HibernateUtil.getSessionFactory().openSession();
    }
}
The code of HibernateUtil:
public class HibernateUtil {
    private static final SessionFactory sessionFactory;

    static {
        try {
            // Create the SessionFactory from the standard (hibernate.cfg.xml) config file.
            sessionFactory = new AnnotationConfiguration().configure().buildSessionFactory();
            System.out.println("SRPU_INFO: Initial SessionFactory creation success.");
        } catch (Throwable ex) {
            // Log the exception.
            System.out.println("SRPU_INFO: Initial SessionFactory creation failed." + ex);
            throw new ExceptionInInitializerError(ex);
        }
    }

    public static SessionFactory getSessionFactory() {
        return sessionFactory;
    }
}
Thanks for reading.
It looks like you retrieve a list and display only its first entry. I am guessing the list is populated with more than one item, in a random order each time, since there is no order-by criterion; thus the first element of the list can differ between executions.
Are you expecting a unique result? If so, it would be better to use Criteria.uniqueResult(), for example:
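// A sketch using the same natural-id restriction as the question's DAO.
Post post = (Post) session.createCriteria(Post.class)
        .add(Restrictions.naturalId().set("postID", postID))
        .setCacheable(false)
        .uniqueResult(); // returns null if no row matches, throws if more than one does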
It may depend on the way you obtain the session: if you are using the typical HibernateUtil with a ThreadLocal session, you may not be closing the session correctly after you finish working with it. In that case the session is almost randomly recycled by completely unrelated units of work, which will then see the cached value.
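A minimal sketch of the unit-of-work shape this implies (assuming the HibernateUtil shown in the earlier answer): open a session per unit of work and always close it in a finally block.
Session session = HibernateUtil.getSessionFactory().openSession();
try {
    // Fresh session per unit of work: no stale first-level cache to serve old data.
    return (Post) session.get(Post.class, postID);
} finally {
    session.close(); // always close, so the session is never recycled by unrelated work
}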