I've got two questions regarding Orion subscriptions.
If we register an entity with a provider application URL in Orion and create a subscription for it (e.g. one sending updates every 15 minutes), what happens if there is no data in Orion's local DB? Will Orion query the data provider to fetch data from the specified URL and then return it as a subscription update, or will it return nothing?
This is somewhat related to the first one. Is there an option to specify a "max duration" for an attribute value in Orion's local DB (e.g. if an attribute is not updated within 1 hour, delete its value)?
We have the following example in mind: a subscription is set up for an entity to send an update every 15 minutes to our server, while updates from sensors to Orion should arrive every 5 minutes. Now, if something is wrong with a sensor and it stops sending updates, we will keep getting the last value stored in Orion's DB forever, unless there is a "max duration" option for that attribute that deletes the value if it has not changed within the specified period.
When a subscription update is triggered and there is no value for that attribute in Orion's local DB, it should query the provider application for data (Q1).
Regarding 1, I guess you refer to ONTIMEINTERVAL subscriptions. At the present moment (Orion 0.23.0; it may change in the future), the notifications sent for that kind of subscription are populated from Orion's entities database, without querying Context Providers.
Regarding 2, there is no such option (duration applies to registrations and subscriptions, but not to entity attributes). However, it would be easy to implement on the client side: you can have an attribute named X_last_update that stores the last update time of the X attribute, and regularly check for attributes whose last update is too old and should be deleted.
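A minimal sketch of that client-side sweeper, assuming NGSI v1 (the API current in Orion 0.23.0), a check elsewhere that has already decided the attribute is stale, and made-up broker address, entity, and attribute names:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Once X_last_update is found to be too old (that check is elided here),
// remove the stale X attribute with an NGSI v1 updateContext request
// using the DELETE action.
public class StaleAttributeSweeper {

    private static final HttpClient HTTP = HttpClient.newHttpClient();
    private static final String ORION = "http://localhost:1026"; // assumed broker address

    static void deleteAttribute(String type, String id, String attr) throws Exception {
        String payload =
            "{ \"contextElements\": [ { \"type\": \"" + type + "\", \"isPattern\": \"false\", "
          + "\"id\": \"" + id + "\", \"attributes\": [ { \"name\": \"" + attr + "\" } ] } ], "
          + "\"updateAction\": \"DELETE\" }";
        HttpRequest req = HttpRequest.newBuilder(URI.create(ORION + "/v1/updateContext"))
                .header("Content-Type", "application/json")
                .header("Accept", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();
        HttpResponse<String> res = HTTP.send(req, HttpResponse.BodyHandlers.ofString());
        System.out.println("Orion answered " + res.statusCode());
    }

    public static void main(String[] args) throws Exception {
        deleteAttribute("Room", "Room1", "X"); // hypothetical entity and attribute
    }
}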
I would be very thankful if anyone could help me resolve this issue.
Detail: I have a MySQL table with 110,000+ records, each stored with a unique application number column.
There is a Detail API that takes only one application number at a time as a URI parameter and returns the details of that application; for now it takes approximately 1 minute to respond.
I need to keep those records (multiple columns) updated in the database using cron job scheduling, updating them progressively.
Flow: get an application number from the database -> call the Detail API -> update that record in the database.
Issue: there is a large number of records in the database, so we cannot call the API for all application numbers at once.
I am using Laravel 7 with Guzzle 7.2 as the HTTP client for the API calls.
Any suggestions are welcome!
Update (tested): I am thinking of doing something like the following, and I agree with the comment by #bhucho to run the cron every 15 minutes.
We create one more column in the table, last_updated_id, with a default of 1,
and we write a query to fetch application numbers, taking 100 or 500 records per slab using Laravel's take method (see the Database Queries docs):
$latestIdInColumn = myTableClassRef::max('last_updated_id');
$applications = myTableClassRef::select('application_number')
    ->where('id', '>', $latestIdInColumn)
    ->take(100)
    ->get()
    ->toArray();
Here we call the Detail API and update the record for each application number; when the last application's update is done, we store that row's id in last_updated_id.
When the cron runs again, we have that last_updated_id, so we apply the same filter in the next query: where id > $latestIdInColumn. Now we get the next 100/500 records whose id is greater than $latestIdInColumn.
if ($applications) {
    foreach ($applications as $application) {
        // call the Detail API and update this application's record
    }
    // after the loop, store the last processed id in last_updated_id
} else {
    // no record found: reset last_updated_id to 1
    myTableClassRef::query()->update(['last_updated_id' => 1]);
}
The function will then fetch from id 1 again.
This is not tested yet; I am just planning to do it this way, and I am happy to get feedback on it.
Note: the API no longer takes 1 minute to respond; I resolved that issue.
Update 2: It is working well.
I need to remove some data from the MySQL DB when its expiry time arrives.
For example, expire a cookie value on the server side when the user's session time is up. It should be an automatic action, and it needs to happen even when the customer is not online; when they come back online, they need to log in again. It also needs to do some work like reminder actions: a user can set a time for a reminder notification, and at that time the server needs to fetch some data from the MySQL DB and send it to the customer through a push notification automatically.
You need to do the following:
Add a column to your table that stores the expiry date/time, defaulting to the current system time plus the expiry interval. You want the time to be generated by the DB and not by your app code, so it is exact.
Index this column with BTREE, not HASH; you need this index, and a hash index cannot serve the range comparison the delete uses.
Write a stored procedure that deletes rows whose expiry time is earlier than the current system time.
Create a MySQL scheduled event that runs every minute and calls your stored procedure (a sketch follows below).
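A minimal sketch of the last two steps, assuming a hypothetical sessions table with an expires_at column, run here through JDBC; the same two statements can be pasted into a MySQL shell, and the event scheduler must be enabled (e.g. SET GLOBAL event_scheduler = ON):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// One-time setup: a purge procedure plus a scheduled event that calls it
// every minute.
public class ExpirySetup {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/app", "user", "secret");  // assumed credentials
             Statement st = con.createStatement()) {

            // Single-statement body, so no BEGIN...END or DELIMITER juggling.
            st.execute("CREATE PROCEDURE purge_expired() "
                     + "DELETE FROM sessions WHERE expires_at < NOW()");

            // Runs purge_expired() once a minute.
            st.execute("CREATE EVENT purge_expired_event "
                     + "ON SCHEDULE EVERY 1 MINUTE "
                     + "DO CALL purge_expired()");
        }
    }
}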
I have a large amount of data in a MySQL database. I want to poll data from the database and push it to an ActiveMQ queue with Camel. The connection between the database and the queue is lost every 15 minutes, and some of the messages are lost during the interruption. I need to know which messages were lost so I can poll them again from the database, yet no message may be sent more than once, and this must be done without any changes to the database schema (I cannot add a Boolean status field to my database).
Any suggestion is welcome.
Essentially, you need some unique identifier in the data you pull from the source database. Maybe it is whatever has already been defined as the primary key, maybe the table has some timestamp field, or maybe some combination of fields is unique.
Once you identify that, reject any key that is already in the target when you put the data in. You could use Camel's idempotency features, but if you are able to check for the key in the target database, you probably won't need anything else.
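A minimal sketch of Camel's idempotency feature (Camel 3 syntax), assuming the primary key column is id and using a file-backed repository so already-sent ids survive restarts and connection drops; the table, queue, and file names are made up:

import java.io.File;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.support.processor.idempotent.FileIdempotentRepository;

// Sends each polled row to ActiveMQ at most once: the idempotent consumer
// skips any row whose key has already been recorded in the file store.
// Assumes a DataSource has been configured for the sql component.
public class DedupRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("sql:SELECT * FROM orders?delay=5000")           // hypothetical table; re-poll every 5 s
            .idempotentConsumer(
                simple("${body[id]}"),                        // each row arrives as a Map; key case may vary by driver
                FileIdempotentRepository.fileIdempotentRepository(
                    new File("data/sent-ids.dat")))
            .to("activemq:queue:orders");                     // hypothetical queue
    }
}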
If you have to make the decision about what to send, but do not have access to your remote database from App #1, you'll need to keep a record on the other side of the firewall.
You would need to do this even if the connection did not break every 15 minutes, because you could have failures for other reasons.
If you can have an idempotency store for App #1, another approach is to transfer data from the local database into some other local table and read from that: you poll this other table and delete each row once its send has succeeded.
Example:
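A minimal sketch of that outbox-style route (Camel 3, camel-sql), with hypothetical table and queue names; the onConsume query deletes a row only after its exchange has been processed:

import org.apache.camel.builder.RouteBuilder;

// Outbox pattern: the application stages outgoing rows in a local outbox
// table; onConsume removes each row only after the message was handed to
// the broker, so a broken connection leaves unsent rows in place for the
// next poll, and a sent row can never be sent twice.
public class OutboxRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("sql:SELECT * FROM outbox"
                + "?onConsume=DELETE FROM outbox WHERE id = :#id"
                + "&delay=5000")
            .to("activemq:queue:orders");
    }
}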
It looks like you're using MySQL. If both databases are on MySQL, you could look into MySQL data replication rather than rolling your own transfer with Camel.
I have a MySQL database with a REST API for my main application, hosted on Azure. I am setting up a hardware sensor with an additional database that captures data multiple times a second. When a value changes by a specific threshold relative to the current value, or after a specific time interval, I want to make an API call to update the main database.
e.g. the threshold is 10%: the last value was 10 and this value is 12, a 20% change, so this sets a trigger to call the API and add to the main database.
Can a trigger be added to the second database to make an HTTP request? Is there a benefit to using another RDBMS instead of MySQL in this case? Do PubNub/Firebase make sense in this situation?
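For what it's worth, stock MySQL triggers cannot make HTTP requests on their own (that needs an external UDF or a polling process), so the threshold check typically lives in the ingestion code. A minimal sketch of the intended behavior, with made-up names and limits:

// Application-side threshold check: forward a reading to the main API only
// when it differs from the last forwarded value by at least 10%, or when a
// maximum interval has elapsed since the last forward.
public class ThresholdForwarder {
    private static final double THRESHOLD = 0.10;        // 10% change
    private static final long MAX_INTERVAL_MS = 60_000;  // hypothetical time interval

    private double lastSent = Double.NaN;
    private long lastSentAt;

    boolean shouldForward(double value, long nowMs) {
        if (Double.isNaN(lastSent)) return true;         // nothing sent yet
        boolean overThreshold =
                Math.abs(value - lastSent) / Math.abs(lastSent) >= THRESHOLD;
        boolean intervalElapsed = nowMs - lastSentAt >= MAX_INTERVAL_MS;
        return overThreshold || intervalElapsed;         // 10 -> 12 is 20%, so it fires
    }

    void markForwarded(double value, long nowMs) {       // call after the API call succeeds
        lastSent = value;
        lastSentAt = nowMs;
    }
}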
I am currently working on a J2EE web application. The application features a way for users to reset their passwords if they forget them.
I have a database table with 3 columns: username, key, and timestamp.
When the user requests a password change, I add an entry to that table with their username and a random key (making sure that there are no duplicate keys in the table, and that a user can only appear once in the table). I also add the current time. I then send them an e-mail with a link to the application that contains their key, something like:
mysite.com/app/reset?key=abcxyz123
The servlet that handles this request looks up the key from the URL in the reset table to determine which user the key belongs to. If the key doesn't match an entry, I show an error page; if it does, I show the password reset screen. Once the user changes their password, I manually delete the entry from the reset table.
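As an aside, a minimal way to generate such a random, URL-safe key (a sketch; the class name and key size are made up, and any cryptographically strong source works):

import java.security.SecureRandom;
import java.util.Base64;

// Generates a cryptographically random, URL-safe reset key.
public class ResetKeys {
    private static final SecureRandom RANDOM = new SecureRandom();

    static String newKey() {
        byte[] bytes = new byte[24];   // 192 bits of entropy
        RANDOM.nextBytes(bytes);
        return Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
    }
}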
I am trying to implement the equivalent of a time-to-live for the password reset links, so that entries don't loiter in the table unnecessarily. I thought of two options, the first of which I have implemented:
1) Create an EJB timer that fires every minute and deletes entries in the reset table whose timestamp is older than 30 minutes. This is a manual process in that I am using Hibernate as my JPA implementation, so I retrieve all the entries from the table, examine their timestamps, and delete the old ones.
2) Create a database job that deletes rows over a certain age?
My question is: does anyone see any drawbacks to the first approach? And second, is the second option even possible with MySQL? I figure that if I can use the second approach, I can get rid of the timer and let the database handle the time-to-live aspect of the password reset links, which may be more efficient.
I haven't been doing J2EE development for very long, but based on the knowledge I have, these seemed like two logical approaches. I welcome any input.
3) Create a script that connects to the DB, executes the delete, and disconnects. You can then schedule this script via the operating system, e.g. crontab.
Regarding option 1: a drawback of that solution is that it uses application server resources for work that can be done entirely in the database and that neither depends on nor uses any application logic.
The benefit is that the whole app is self-contained and you don't need any additional installation/setup on the database side, as you would with options 2 and 3.
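For option 1, a minimal sketch of such a timer, assuming a hypothetical JPA entity named PasswordReset with a timestamp field; a single bulk JPQL delete avoids retrieving every row just to inspect its timestamp:

import java.util.Date;
import javax.ejb.Schedule;
import javax.ejb.Singleton;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.TemporalType;

// Fires every minute and removes reset entries older than 30 minutes in one
// bulk JPQL delete, instead of loading and checking each row in memory.
@Singleton
public class ResetEntryReaper {

    @PersistenceContext
    private EntityManager em;

    @Schedule(hour = "*", minute = "*", persistent = false)
    public void purgeExpired() {
        Date cutoff = new Date(System.currentTimeMillis() - 30 * 60 * 1000L);
        em.createQuery("DELETE FROM PasswordReset r WHERE r.timestamp < :cutoff")
          .setParameter("cutoff", cutoff, TemporalType.TIMESTAMP)
          .executeUpdate();
    }
}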