I'm working on a RESTful API using Node and Express (with a MySQL database).
I have a table of products, and a page on my front end which is supposed to display those products. The problem is that there are hundreds (sometimes thousands) of them, so I can't just send all of them back in one response.
What is the best way (in terms of performance and speed) to transfer all those rows back to the client?
I thought of streaming the data so that the client can start displaying rows while the stream is still in progress. But I have no idea how to do that, or whether that's the best approach.
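One common alternative to sending everything at once is keyset pagination: the client asks for one page at a time and passes along the last id it has seen, so each page renders as it arrives. A minimal sketch, assuming a `products` table with `id`, `name`, and `price` columns; the route, `pool`, and the 200-row cap are illustrative, not prescribed:

```javascript
// Keyset ("seek") pagination helper: returns the SQL and parameters for the
// next page of products after a given id. Table/column names are illustrative.
function buildPageQuery(afterId, limit) {
  const safeLimit = Math.min(Math.max(parseInt(limit, 10) || 50, 1), 200); // clamp page size
  const id = Number.isInteger(afterId) && afterId > 0 ? afterId : 0;
  return {
    sql: 'SELECT id, name, price FROM products WHERE id > ? ORDER BY id LIMIT ?',
    params: [id, safeLimit],
  };
}

// In an Express handler it might be used like this (assumed names: `app`, `pool`):
//
// app.get('/api/products', async (req, res) => {
//   const { sql, params } = buildPageQuery(Number(req.query.after), req.query.limit);
//   const [rows] = await pool.query(sql, params); // e.g. a mysql2/promise pool
//   res.json({
//     items: rows,
//     nextCursor: rows.length ? rows[rows.length - 1].id : null, // client sends this back as ?after=
//   });
// });
```

The client keeps requesting `/api/products?after=<nextCursor>` until `nextCursor` is null, rendering each page as it comes in. That gives a streaming-like experience without ever holding thousands of rows in a single response.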
Related
I am trying to build a simple project online. I have a MySQL database where I will store different information, such as fake orders made by fake clients. The application will have a frontend made with JavaScript and HTML/CSS, while the backend will be a Node/Express API that handles all the requests and talks to the database.
I wanted to know whether there is a way to have my page refresh each time a new order is made, and see, for example, a new row appear in a hypothetical table in my HTML with minimum latency, without making a request from the client every x seconds. Polling like that could be quite expensive in terms of bandwidth and also pretty inefficient.
My idea was that each time a client connects to the site, it gets subscribed to a sort of list on the server, which broadcasts a trigger to update the frontend whenever the UPDATE function is triggered in the backend. In other words, every time an update happens on the backend, the server sends a trigger to the clients it knows are currently connected, and the frontend then asks for the update directly.
This solution seems really complicated to handle and may not be that performant. I was wondering whether there are features of the frontend, the backend, or the database, or any framework, that would let me do this.
I would like everything to be as real-time as possible, using the least bandwidth possible, because I would like to stay within the free tier of some online service and don't want to exhaust the bandwidth allowance.
If you have suggestions for frameworks, features, or protocols, you are welcome. Thanks a lot in advance.
You can use WebSockets. When a user creates an order, and after it has been successfully saved to the database, the backend pushes (publishes) the data to every client subscribed to a specific channel. The logic is not complicated at all; it is called the pub/sub pattern, and it's worth reading up on.
Also see https://socket.io/, a library used on both the backend and the frontend to work with WebSockets.
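A minimal sketch of that pub/sub flow with socket.io. The room name (`orders`), event name (`order:created`), and the DOM wiring are all assumptions for illustration:

```javascript
// Server side (sketch): after an order is saved, broadcast it to the "orders" room.
//
//   const io = require('socket.io')(httpServer);
//   io.on('connection', (socket) => socket.join('orders')); // subscribe every visitor
//   // ...inside the POST /orders handler, after a successful INSERT:
//   io.to('orders').emit('order:created', order);
//
// Client side (sketch):
//
//   const socket = io('https://example.com');
//   socket.on('order:created', (order) => {
//     tableBody.insertAdjacentHTML('beforeend', renderOrderRow(order));
//   });

// Small testable piece: turn an order object into an HTML table-row string.
function renderOrderRow(order) {
  return `<tr><td>${order.id}</td><td>${order.client}</td><td>${order.total}</td></tr>`;
}
```

Because the WebSocket connection is persistent, traffic flows only when an order is actually created, which keeps bandwidth use low compared to polling every x seconds.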
I have a scenario wherein the frontend app makes a call to the backend DB (App -> API Gateway -> Spring Boot -> DB) with a JSON request. The backend returns a very large dataset (>50,000 rows) in the response, sizing ~10 MB.
My frontend app is highly responsive and mission critical, and we are seeing performance issues: the frontend app is not responding or timing out. What would be the best design to resolve this issue, considering that:
The DB query can't be optimized any further.
The Spring Boot code has caching built in.
No data can be left out, due to its intrinsic nature.
No multiple calls can be made, as all the data is needed in the first call itself.
Can any cache be built between the frontend and backend?
Thanks.
Sounds like this is a generated report from a search. If the rows need to be associated with each other, I'd assign the search an id and store the results on the server, then pull the data for that id as needed on the frontend.

You should never have to send 50,000 rows to the client in one go. Paginate the data and pull it as needed if you have to. If you don't want to paginate, how much data can the user actually see on a single screen? You can pull more data from the server based on where they scroll on the page. You should only need to return the row count to the frontend, plus maybe 100 rows of data; that is enough to show a scrollbar with the right height. When the user scrolls to a certain position within the data, you pull the corresponding offset from the server for that particular search id.

Even if you could return all 50,000+ rows in one go, it doesn't sound very friendly to the end user's device to have to hold that much in memory for a functional page.
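The scroll-position idea above can be sketched as a small helper that maps the first visible row to a page-aligned window of rows to request. The page size of 100 matches the figure mentioned above; the endpoint shape is an assumption:

```javascript
// Map the first visible row index to a page-aligned window of rows to fetch.
function pageWindow(firstVisibleRow, pageSize = 100) {
  const row = Math.max(0, Math.floor(firstVisibleRow) || 0); // guard NaN / negatives
  const offset = Math.floor(row / pageSize) * pageSize;      // snap to a page boundary
  return { offset, limit: pageSize };
}

// On scroll, the client would then fetch something like (assumed endpoint):
//   GET /api/searches/<searchId>/rows?offset=<offset>&limit=<limit>
// and the server reads that slice of the stored result set for this search id.
```

Snapping to page boundaries keeps the set of possible requests small, so both the browser and any intermediate cache can reuse previously fetched windows.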
This is a sign of a flawed frontend that should be redone.
10 MB is huge and can be inconsiderate to your users, especially if there's a high probability of mobile use.
If possible, it would be best to collect this data on the backend, probably persist it to disk, and then provide only the necessary data to the frontend as it's needed. As the view needs more data, you would make further calls to the backend.
If this isn't possible, you could ship this data with the client-side bundle. If the data doesn't update too frequently, you can even cache it on the frontend, which would at least prevent the user from needing to fetch it repeatedly.
I'm building a mobile app where a pull-down gesture on the UI initiates an update of existing data/posts (it also retrieves new posts if there are any, but that's not the point here). The server is stateless, meaning there are no sessions.
If posts have been updated in the database, how do I let the frontend know which posts need to be updated? The only way I could think of is to send the server a list of ids of all the retrieved posts, and have it check whether any of them have been modified since they were fetched.
This, however, seems quite inefficient, as a user might have accumulated hundreds of posts in some extreme cases, while most likely only a few, or none, of the posts need to be updated. Issuing hundreds of DB requests could be a huge overhead.
There are at least two ways of doing this:
Long Polling
The client requests new information from the server.
The server keeps the connection open until there is new data to send.
Once the server has data, it sends it to the client and the connection is closed.
The client then sends a new request for information.
This is a continuous process.
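The long-polling loop described above can be sketched like this. `fetchUpdates` stands in for a real network call (e.g. `fetch('/poll')`) and is injected so the loop stays easy to test; all names and the retry delay are assumptions:

```javascript
// Long-polling client loop: ask, wait (the server holds the request open),
// handle the answer, then immediately ask again.
async function longPoll(fetchUpdates, onData, { maxRounds = Infinity, retryMs = 1000 } = {}) {
  let rounds = 0;
  while (rounds < maxRounds) {
    try {
      const data = await fetchUpdates(); // resolves when the server finally has data
      if (data !== null) onData(data);   // null = request timed out with nothing new
    } catch (err) {
      // Network error: pause briefly so we don't hammer the server, then retry.
      await new Promise((resolve) => setTimeout(resolve, retryMs));
    }
    rounds += 1;
  }
}
```

In a browser, `fetchUpdates` would be something like `() => fetch('/poll').then((r) => r.json())`, with the server deliberately delaying its response until new data exists (or until its own timeout fires, returning null).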
WebSockets
Create a WebSocket connection and keep it open.
The server pushes any updates as and when they come.
Problems with both approaches
They may take a significant amount of time to implement in a production-ready way.
Both of them require the server to be aware of any change in the database, which can be tricky as well.
I am building an app that receives a bunch of static, read-only data. The user does not change the data or send any data to the server. The app just gets the data and presents it to the user in various views.
For example, a parts list with part numbers and prices. This data is currently stored in MongoDB.
I have a few options for getting the data to the client. I could just use Meteor's publication system and have the client subscribe to the data it needs.
Or I could map all the data the client needs into one JSON file, save it to Amazon S3, and have the client make a simple GET request to grab it.
If we wanted this app to scale to many, many users, would the Meteor publication approach hold up? Or would either method perform about the same? Using the Meteor publication system would be the easiest, but I am worried that going down this route would lead to performance issues if many clients request the data. If the performance of publishing and a GET request is about the same, I'd just stick with publication, as it's the easiest.
In this case, Meteor will provide better performance. If your data flow is mostly server-to-client, then clients do not have to worry about polling the server, and the server does not have to handle repeated requests.
Also, Meteor requires very few resources to send data to the client, because the connection is persistent. An app like CodeFights, which is built on Meteor, constantly has thousands of connections to and from it, and its performance is great.
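A Meteor publication for read-only data like this is only a few lines. A sketch, assuming a `Parts` collection with `partNumber` and `price` fields (names are illustrative):

```javascript
// Server (sketch): publish only the fields the client needs.
//
//   Meteor.publish('parts', function () {
//     return Parts.find({}, { fields: { partNumber: 1, price: 1 } });
//   });
//
// Client (sketch): subscribe once; the data then stays synced over the
// persistent DDP connection with no polling.
//
//   Meteor.subscribe('parts');

// Testable piece: the projection the publication exposes, as a plain function.
function publicPartFields(part) {
  return { partNumber: part.partNumber, price: part.price };
}
```

Limiting the published fields keeps the payload small, which matters most when many clients subscribe to the same collection.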
As a side note, if you are ready to serve your static data as a JSON file from a separate server (AWS S3), then you evidently do not expect that data to be very big, since it can be handled as a single file and loaded entirely into the client's memory.
In that case, you might even want to reconsider the need to perform any separate request at all (whether HTTP or Meteor pub/sub).
For instance, you could simply embed the data in your app, or serve it through SSR / the Fast Render package.
Then, if you are really concerned about scalability, you might even reconsider the need for Meteor, since you do not seem to need any client-server interactivity (no real need for pub/sub, no reactivity…). After your prototype is ready, you could rework it as a separate, static SPA, so that you do not even need to serve it through Node / Meteor.
Suppose you have a mobile app that needs to ask the server for 20 other Users near your current location. The URL to get this data might look something like this (not escaped):
https://example.com/api/users?lat=40.240239&long=-111.657920&count=20
The server could then respond in one of two ways:
Return all User objects directly, as JSON, in one large array.
Return an array of UUIDs corresponding to the Users who match the request. The client would then have two choices:
a) Send a request for all User objects in one big batch:
https://example.com/api/users?ids=[1,2,3,4...]
b) Send requests for each User independently:
https://example.com/api/users?id=1
https://example.com/api/users?id=2 ...
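The batch request in option (a) can be built with a small helper. The endpoint shape mirrors the example URLs above; the 100-id cap is an illustrative assumption, since real APIs usually limit batch size:

```javascript
// Build the batch-fetch URL for a list of user ids (option a above).
function batchUsersUrl(ids, base = 'https://example.com/api/users') {
  if (!Array.isArray(ids) || ids.length === 0) {
    throw new Error('ids must be a non-empty array');
  }
  const batch = ids.slice(0, 100); // cap the batch; fetch the rest in a follow-up call
  return `${base}?ids=[${batch.join(',')}]`;
}
```

A client-side cache would then only request the ids it does not already hold, which is the main practical advantage of option 2 over returning full objects every time.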
Currently, my application implements Option #1. In the name of responsiveness, I've eliminated every possible network round-trip by returning as much data as possible in as few requests as possible. However, I'm starting to see problems with this choice due to the fact that my client and server logic are very tightly coupled. It's difficult to maintain, versioning is a nightmare, and caching on the client side is much more difficult than say, Option #2b.
Based on your experience, which option do you recommend (or a different method entirely)? What would you consider the "industry standard" way of serving data to a mobile app?