I have two choices for outputting the info in the database (MySQL) in JSON format:
directly connect to the database, fetch the data, and output JSON
connect to a REST service to get the data and output JSON.
Which is better and why?
directly connect to the database, fetch the data, and output JSON
If you are connecting to the database (it doesn't matter whether it's MySQL or something else) directly through a binary-based protocol, it should be faster than a REST-based protocol.
connect to a REST service to get the data and output JSON.
REST-based protocols, on the other hand, are generally simpler, more straightforward, and easier to implement from the client's point of view than binary ones.
Which is better and why?
It depends on whether you need speed or simplicity of use. With a binary connection you would additionally have to convert the fetched data to JSON yourself. A REST service can usually give you exactly what you need, already in the desired JSON format. However, if speed is crucial for you, then I would say the binary protocol is the better choice.
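To make the extra step concrete, here is a minimal sketch of the direct-connection route in Java, assuming the MySQL Connector/J driver and Jackson are on the classpath; the connection URL, credentials, table and column names are made up for illustration:

import com.fasterxml.jackson.databind.ObjectMapper;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DirectDbToJson {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details.
        String url = "jdbc:mysql://localhost:3306/mydb";
        try (Connection con = DriverManager.getConnection(url, "user", "password");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, name FROM products")) {
            List<Map<String, Object>> rows = new ArrayList<>();
            while (rs.next()) {
                Map<String, Object> row = new LinkedHashMap<>();
                row.put("id", rs.getLong("id"));
                row.put("name", rs.getString("name"));
                rows.add(row);
            }
            // With the direct connection, converting the rows to JSON is your job.
            String json = new ObjectMapper().writeValueAsString(rows);
            System.out.println(json);
        }
    }
}

With a REST service, the serialization to JSON typically happens on the service side, so this last step disappears from your client code.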
Related
I have the task of implementing an API with Spring Boot and a relational database to save the data from a client (mobile app) and synchronize it.
So far no problem. I have some endpoints to post and get the stored data.
Now I have the task of providing an endpoint that returns the complete data in a GET request and another that saves the complete data from the client via a POST request.
My problem is:
How do I store the complete data in one POST request (JSON)?
The database has multiple entities with many-to-many relationships, and if I just POST them I have some problems with the relations between the entities.
My approach to GET the complete data was to just create a new entity with every entity in it. Is this the best solution?
And is it even a good solution to POST the complete data instead of using the other endpoints to get the entities one by one? Or is there another approach to store and restore the complete data between server and client? I think that posting the complete data makes less sense.
is it even a good solution to POST the complete data instead of using the other endpoints to get the entities one by one
In some scenarios you may want to force an update or synchronize the client data with the server, for example WhatsApp's "Back up now" option.
How do I store the complete data in one POST request (JSON)
You can make one POST endpoint that extracts the data sent from the client and internally uses all your repositories or DAOs for each property.
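A minimal sketch of that idea with Spring Data JPA, assuming repositories already exist for each entity; the entity, DTO and repository names here are hypothetical:

import java.util.List;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

// Wrapper DTO carrying the complete client payload (hypothetical entity types).
class FullSyncRequest {
    public List<Customer> customers;
    public List<Order> orders;
}

@RestController
public class SyncController {

    private final CustomerRepository customerRepository;
    private final OrderRepository orderRepository;

    public SyncController(CustomerRepository customerRepository,
                          OrderRepository orderRepository) {
        this.customerRepository = customerRepository;
        this.orderRepository = orderRepository;
    }

    // One POST endpoint unpacks the payload and delegates to each repository.
    // Persisting related entities in dependency order helps keep the
    // many-to-many links consistent.
    @PostMapping("/sync")
    @Transactional
    public void saveAll(@RequestBody FullSyncRequest payload) {
        customerRepository.saveAll(payload.customers);
        orderRepository.saveAll(payload.orders);
    }
}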
My approach to GET the complete data was to just create a new entity with every entity in it. Is this the best solution?
You can do either: aggregate everything as you mentioned, or handle it manually in the endpoint, roughly as in the sketch below.
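For example (again with hypothetical names, and assuming the repositories extend JpaRepository so findAll() returns a List), a single GET endpoint can aggregate all repositories into one response object:

import java.util.List;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Read-side DTO that bundles every entity type into one JSON document.
class FullSyncResponse {
    public List<Customer> customers;
    public List<Order> orders;
}

@RestController
public class ExportController {

    private final CustomerRepository customerRepository;
    private final OrderRepository orderRepository;

    public ExportController(CustomerRepository customerRepository,
                            OrderRepository orderRepository) {
        this.customerRepository = customerRepository;
        this.orderRepository = orderRepository;
    }

    // One GET request returns the complete data set as a single JSON document.
    @GetMapping("/sync")
    public FullSyncResponse loadAll() {
        FullSyncResponse response = new FullSyncResponse();
        response.customers = customerRepository.findAll();
        response.orders = orderRepository.findAll();
        return response;
    }
}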
Also check this one, which uses Apache Camel to aggregate multiple endpoints.
Let's say you have a business application producing and storing enriched product master data in its own environment. Once the enrichment is completed, you want to make that data available in a Couchbase database.
In order to get that data from the business application's environment into Couchbase, let's assume I want to use Kafka to broadcast the changes and NiFi to distribute them to the final destination (Couchbase).
But Couchbase takes JSON-format documents. Can I use Kafka or NiFi to convert the pulled data into JSON format? I know I could, for instance, put a solution such as Attunity between the business application and Kafka to replicate the data in real time. But let us assume that there is no budget to implement Attunity, so one would temporarily use a REST API on the business application side and pull that data (based on the changes made) with Kafka. Can I convert the data into JSON with Kafka, or with NiFi?
EDIT:
The reason I want to know whether NiFi can do this is that our landscape is a bit more complex than I described. Between Couchbase and the business application, you have:
[Business App] - [ X ] - [Kafka] - [NiFi] - [DC/OS with KONG API Layer] - [CouchBase Cluster].
And I want to know whether I should implement a new data replication solution at the spot of the X, or whether I should just make use of the business app's REST API, pull data from it with Kafka, and convert the data to JSON in NiFi.
There is a Couchbase sink connector for Kafka Connect, which will enable you to do exactly what you want. It's a simple, configuration-file-based approach.
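For reference, a Kafka Connect sink is driven by a small configuration file along these lines. Treat this as an illustrative sketch only: the standard Connect keys (name, connector.class, tasks.max, topics) are fixed, but the Couchbase connection property names vary between connector versions, so replace them with the exact keys from the connector documentation.

name=couchbase-sink
connector.class=com.couchbase.connect.kafka.CouchbaseSinkConnector
tasks.max=2
topics=product-master-data
# Couchbase connection settings: the property names below are placeholders
# and depend on the connector version; check the connector docs.
couchbase.seed.nodes=couchbase-host
couchbase.bucket=products
couchbase.username=connect_user
couchbase.password=secret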
I created my project in Spring 4 MVC + Hibernate with MongoDB. Now I have to convert it to Hibernate with MySQL. My problem is that I have many collections in MongoDB, in BSON and JSON format. How can I convert those collections into MySQL table format? Is that possible?
MongoDB is a non-relational database, while MySQL is relational. The key difference is that the non-relational database contains documents (JSON objects) which can have a hierarchical structure, whereas the relational database expects the objects to be normalised and broken down into tables. It is therefore not possible to simply convert the BSON data from MongoDB into something which MySQL will understand. You will need to write some code that reads the data from MongoDB and then writes it into MySQL.
The documents in your MongoDB collections represent serialised forms of some classes (POJOs, domain objects, etc.) in your project. Presumably, you read this data from MongoDB, deserialise it into its class form and use it in your project, e.g. display it to end users, use it in calculations, generate reports from it, etc.
Now you'd prefer to host that data in MySQL, so you'd like to know how to migrate the data from MongoDB to MySQL, but since the persistent formats are radically different you are wondering how to do that.
Here are two options:
Use your application code to read the data from MongoDB, deserialise it into your classes and then write that data into MySQL using JDBC or an ORM mapping layer etc.
Use mongoexport to export the data from MongoDB (in JSON format) and then write some kind of adapter which is capable of mapping this data into the desired format for your MySQL data model.
The non-functionals (especially for the read and write aspects) will differ between these approaches, but fundamentally both approaches are quite similar; they both (1) read from MongoDB, (2) map the document data to the relational model, and (3) write the mapped data into MySQL. The trickiest aspect of this flow is no. 2, and since only you understand your data and your relational model, there is no tool which can magically do this for you. How would a third-party tool be sufficiently aware of your document model and your relational model to perform this transformation for you?
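A rough sketch of option 1 in Java, assuming the MongoDB sync driver and plain JDBC; the database, collection, table and field names are invented, and the flattening inside the loop is where your own document-to-relational mapping would live:

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class MongoToMySqlMigration {
    public static void main(String[] args) throws Exception {
        try (MongoClient mongo = MongoClients.create("mongodb://localhost:27017");
             Connection mysql = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/mydb", "user", "password")) {

            MongoCollection<Document> customers =
                    mongo.getDatabase("appdb").getCollection("customers");

            String sql = "INSERT INTO customer (id, name, city) VALUES (?, ?, ?)";
            try (PreparedStatement insert = mysql.prepareStatement(sql)) {
                // (1) read each document, (2) flatten it onto the relational
                // model, (3) write it out to MySQL.
                for (Document doc : customers.find()) {
                    insert.setString(1, doc.getObjectId("_id").toHexString());
                    insert.setString(2, doc.getString("name"));
                    // Nested documents must be flattened or split into child tables.
                    Document address = doc.get("address", Document.class);
                    insert.setString(3, address == null ? null : address.getString("city"));
                    insert.addBatch();
                }
                insert.executeBatch();
            }
        }
    }
}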
You could investigate a MongoDB JDBC driver, or use something like Apache Drill to facilitate JDBC queries against your MongoDB. Since these could return a java.sql.ResultSet, you would be dealing with a result format which is better suited for writing to MySQL, but it's likely that this still wouldn't match your target relational model, and hence you'd still need some form of transformation code.
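For illustration, a Drill-based JDBC query might look like the sketch below, assuming the Drill JDBC driver is on the classpath, Drill runs in embedded mode, and the MongoDB storage plugin is enabled; the plugin, database and collection names are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DrillMongoQuery {
    public static void main(String[] args) throws Exception {
        // Placeholder URL for a Drillbit in embedded mode; adjust for your cluster.
        try (Connection con = DriverManager.getConnection("jdbc:drill:zk=local");
             Statement st = con.createStatement();
             // "mongo" is the storage plugin name, "appdb.customers" the database/collection.
             ResultSet rs = st.executeQuery("SELECT name, city FROM mongo.appdb.customers")) {
            while (rs.next()) {
                // The ResultSet is already tabular, but it may still need reshaping
                // before it fits your MySQL schema.
                System.out.println(rs.getString("name") + " / " + rs.getString("city"));
            }
        }
    }
}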
I am trying to fetch large tabular data from a remote URL which sends data in binary format. To read this data I am using a C program that runs (built in) on the server. The C program fetches data in binary format from various sources, converts it into readable form, and sends it to the frontend.
Then I have two options:
I convert the data into CSV format, which is lighter, or
I convert the data into JSON format, which is a little heavier but easier for the frontend web application to interpret.
I want to do operations like sorting and grouping of the data in the frontend. Can you suggest which is the better option to use in this scenario?
Updates:
1 -> The frontend will just receive the data and may do sorting or grouping.
If you must send all the data to the frontend, you could use a JSON array to wrap each row. This minimises the amount of extra data you are adding, e.g.
{
  "data": {
    "columns": ["A", "B", "C"],
    "rows": [ [1, 2, 3], [4, 5, 6], [7, 8, 9] ]
  }
}
However, I would try to avoid sending all the data to the frontend application if possible. It should be possible to show pages of data by fetching the required data on demand using Ajax calls. The server can do the heavy work of sorting, grouping etc. Consider storing the data in a database.
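If the data does end up in a database, a paged endpoint keeps each response small. A minimal Spring Data sketch of that idea (the entity and repository names are hypothetical, and the repository is assumed to extend a paging-aware repository interface):

import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Sort;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class TableRowController {

    private final TableRowRepository tableRowRepository;

    public TableRowController(TableRowRepository tableRowRepository) {
        this.tableRowRepository = tableRowRepository;
    }

    // The frontend asks for one sorted page at a time via Ajax; the database
    // does the sorting, so the full data set never travels to the browser.
    @GetMapping("/rows")
    public Page<TableRow> page(@RequestParam(defaultValue = "0") int page,
                               @RequestParam(defaultValue = "50") int size,
                               @RequestParam(defaultValue = "name") String sortBy) {
        return tableRowRepository.findAll(PageRequest.of(page, size, Sort.by(sortBy)));
    }
}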
Now I am creating an Ajax app and I want to know which is better for connecting server data with the client.
Last time I used JSON, but some time ago I heard that serialized data is faster.
Who knows which is better?
In general, a serialized binary representation will be smaller than a JSON representation of the same object (which in turn would be smaller than an XML representation).
In this regard, network transfers would be faster.
You're probably comparing serialize (as it works in PHP) with JSON on your client (a browser). There is a serialize function in jQuery (similar to how PHP's works), but no native unserialize. I suggest you work with JSON for communication between your server and your client (Ajax).