Schedule function to pull CSV file to database

Sorry, I have not done any coding in Web API, so I don't have a sample to share here.
I just want to know how we can schedule a task on the server for a Web API to read a CSV file every 15 or 30 minutes.
From the client we can successfully push the CSV to the FTP server where my application is hosted. But how do I retrieve the data from the CSV at a fixed interval?

WebAPI is the wrong tool for the job.
You can certainly push (or specifically, upload) CSV data to a WebAPI method, and that method can do whatever you need to do with that data. But WebAPI doesn't run on a timer; it's not going to periodically look for data to process. You have to send it that data.
A web application is a request/response system. In order to do something, you have to send it a request.
For an application that periodically runs on a timer, consider a Windows Service or even simply a Console Application which is scheduled to run with some job scheduler (Windows Task Scheduler, cron, etc.). That would run periodically without the need for input or a user interface at all.
Now, if the code you need to use is in the WebAPI application already, that's ok. You can refactor that code into a separate Class Library in the same solution as the WebAPI application. Then add the new application (Windows Service or Console Application) to that solution and reference that same Class Library. Both applications can share the same supporting code. Indeed, the application layer should be as thin as possible over a set of common business logic that can be shared by multiple applications.
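If it helps, the scheduled-job part is only a few lines of code. The question is about .NET, but the pattern is language-agnostic; below is a rough Java sketch of such a console job, where the file path, connection string and table/column names are all placeholders you would replace with your own. You would then register it with Windows Task Scheduler or a cron entry such as */15 * * * * to run it every 15 minutes.

    import java.nio.file.*;
    import java.sql.*;
    import java.util.stream.Stream;

    // Minimal sketch of a console job that imports a CSV into a database.
    // The path, JDBC URL, credentials and table/column names are placeholders.
    public class CsvImportJob {
        public static void main(String[] args) throws Exception {
            Path csv = Paths.get("/ftp/incoming/data.csv");   // hypothetical FTP drop location
            if (!Files.exists(csv)) {
                return;                                        // nothing to import this run
            }

            try (Connection con = DriverManager.getConnection(
                         "jdbc:mysql://localhost/appdb", "user", "password");
                 PreparedStatement insert = con.prepareStatement(
                         "INSERT INTO imported_rows (col1, col2) VALUES (?, ?)");
                 Stream<String> lines = Files.lines(csv)) {

                lines.skip(1)                                  // skip the header row
                     .map(line -> line.split(","))
                     .forEach(fields -> {
                         try {
                             insert.setString(1, fields[0]);
                             insert.setString(2, fields[1]);
                             insert.executeUpdate();
                         } catch (SQLException e) {
                             throw new RuntimeException(e);
                         }
                     });
            }

            // Move the file aside so the next scheduled run doesn't import it again.
            Files.move(csv, csv.resolveSibling("data.processed.csv"),
                       StandardCopyOption.REPLACE_EXISTING);
        }
    }

The same structure works for a C# console application or a Windows Service; the only essential parts are "read the file, write to the database, move the file out of the way", with the scheduler supplying the timer.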

Related

Firebase Automatic Sync with Local PC

I'm working on a project that takes data from a Weintek HMI, puts it on a web server and then sends it to an application I created in Android Studio.
I've found Firebase, which can help me with this task.
In EasyBuilder, which works with my HMI, I can create a MySQL database that can store the data.
The problem is how to automatically update the Firebase database from the MySQL database at a set interval, so the data can be accessed from the Android app.
If there is no solution with MySQL, can someone suggest another method to extract the data and use some web server to sync it with the Android app?
I don't know your specific need, in terms of data volume or application, but as a workaround, maybe this can help you:
I usually use MQTT, which many Weintek HMIs support, to send telemetry data, and then use Node-RED to process and route the data to a database, email, SMS, Telegram, CSV, TXT and so on, depending on the need; in your case that could be Firebase (I have never used it).
It works great for me, as I don't have to worry about HMI limitations.
The downsides are data reliability, in terms of confirming that when the HMI sends, the server actually listens and writes (there are certainly ways to deal with this), and the fact that you need to have a server with Node-RED running.
If you have never done so: in Weintek HMIs you can easily send the MQTT payload cyclically using macros.
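Just to make the listener role concrete (the part Node-RED plays here): any MQTT client that subscribes to the HMI's topic and hands each payload to a sink of your choice will do. Here is a rough Java sketch using the Eclipse Paho client; the broker address and topic name are placeholders, and the actual MySQL/Firebase write is left as a comment.

    import org.eclipse.paho.client.mqttv3.*;
    import java.nio.charset.StandardCharsets;

    // Rough sketch of an MQTT subscriber that receives the HMI's cyclic payloads.
    // Broker URL and topic are placeholders; replace the println with whatever
    // sink you need (MySQL insert, Firebase write, CSV append, ...).
    public class HmiTelemetryListener {
        public static void main(String[] args) throws MqttException {
            MqttClient client = new MqttClient("tcp://broker.local:1883",
                                               MqttClient.generateClientId());

            client.setCallback(new MqttCallback() {
                @Override
                public void connectionLost(Throwable cause) {
                    System.err.println("Connection lost: " + cause);
                }

                @Override
                public void messageArrived(String topic, MqttMessage message) {
                    String payload = new String(message.getPayload(), StandardCharsets.UTF_8);
                    // Parse the payload and forward it to your database / Firebase here.
                    System.out.println(topic + " -> " + payload);
                }

                @Override
                public void deliveryComplete(IMqttDeliveryToken token) {
                    // Not used when only subscribing.
                }
            });

            client.connect();
            client.subscribe("hmi/telemetry");   // hypothetical topic the HMI macro publishes to
        }
    }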

Google Cloud Platform - I need a webhook to get JSON data. How to approach it?

I am fairly new to the google-cloud-platform world and I am struggling with my first steps there.
What I want to know is how to make a webhook app that will run 24/7 and "catch" data sent from another third-party service (later I will try to do something with this data - manipulate it and push it into a DB, but that's another question).
I have set up a Linux-based instance on GCP, but what's next?
I am familiar with PHP, but I want to do it this time in Python (I'm learning it nowadays).
Which service in GCP should I use, and how do I set up the server to catch all the data the third-party service is sending?
This sounds like a perfect fit for Google App Engine. As long as the 3rd-party service makes HTTP requests, App Engine is a great fit. You can write your application in Python, PHP, Java, or just about anything else, then GAE takes care of the rest. No need to manage Linux, instances, firewall rules, or anything else.
If your load is minimal, you may even fit into the free tier and pay nothing to run your app.
Check out the GAE Python docs at https://cloud.google.com/appengine/docs/python/.
If you want to run your webhook continuously, you can run it as a cron job. Here is a guide on how to run Python scripts as cron jobs on Google App Engine: Scheduling Tasks With Cron for Python
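Whichever language you pick, the webhook itself is just an HTTP handler that accepts POSTs from the third-party service. As a rough sketch, shown here as a plain Java servlet since App Engine supports Java as well (the Python/Flask version has the same shape); the /webhook path and the persistence step are assumptions for the example.

    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import java.io.IOException;
    import java.util.stream.Collectors;

    // Minimal webhook sketch: accept a POST with a JSON body and acknowledge it.
    @WebServlet("/webhook")
    public class WebhookServlet extends HttpServlet {
        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            // Read the raw JSON body the third-party service sent.
            String json = req.getReader().lines().collect(Collectors.joining("\n"));

            // TODO: validate and store the payload (e.g. push it into Cloud SQL or Datastore).
            System.out.println("Received payload: " + json);

            resp.setStatus(HttpServletResponse.SC_OK);
            resp.getWriter().write("ok");
        }
    }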

Architecture tips for multi-platform software / API

I'm creating a multi-platform app, mostly for a web interface, mobile and a Windows application. The app will manage user task lists and sync them to the server, but also store them locally to process data faster.
My idea of the architecture so far is:
Keeping most of the processing on the client side, eventually syncing with the server.
Developing an API to provide and receive the data that will be saved on the server (basically just a JSON wrapper web service).
The data flow:
user authenticates -> requests updated JSON objects from the server -> populates client-side objects -> works with client-side objects -> sends a JSON object back to the server -> server updates the data.
Is this a good approach? I've never done this, can you guys give me some tips?
I think you are on the right track. The idea is to decouple the front-end from the back-end. The backend should expose a set of CRUD (Create, Read, Update, Delete) functions as RESTful JSON web services. All your different flavours of UI (mobile, web, Windows) can consume the same API.
For the web front-end, I would recommend taking a look at AngularJS together with Bootstrap.
Regarding the backend, you could implement it as a simple Java web application with Jersey/JAX-RS or alternatively, you could check Node.js + Express.
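To make the JAX-RS suggestion concrete, a single resource class per aggregate is usually enough. The sketch below is illustrative only (the names, fields and the in-memory map are assumptions, not a full design); every client - web, mobile, Windows - would consume these same JSON endpoints.

    import javax.ws.rs.*;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.core.Response;
    import java.util.*;
    import java.util.concurrent.ConcurrentHashMap;

    // Illustrative JAX-RS resource exposing CRUD operations on task lists as JSON.
    // The in-memory map stands in for whatever persistence layer you choose.
    @Path("/tasks")
    @Produces(MediaType.APPLICATION_JSON)
    @Consumes(MediaType.APPLICATION_JSON)
    public class TaskResource {
        private static final Map<String, Task> TASKS = new ConcurrentHashMap<>();

        @GET
        public Collection<Task> list() {
            return TASKS.values();
        }

        @POST
        public Response create(Task task) {
            task.id = UUID.randomUUID().toString();
            TASKS.put(task.id, task);
            return Response.status(Response.Status.CREATED).entity(task).build();
        }

        @PUT
        @Path("/{id}")
        public Task update(@PathParam("id") String id, Task task) {
            task.id = id;
            TASKS.put(id, task);
            return task;
        }

        @DELETE
        @Path("/{id}")
        public Response delete(@PathParam("id") String id) {
            TASKS.remove(id);
            return Response.noContent().build();
        }

        // Simple DTO that the JSON provider serializes to and from JSON.
        public static class Task {
            public String id;
            public String title;
            public boolean done;
        }
    }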

How to call a Java action when the database changes

Hi, I am working with Spring and Hibernate. I have a situation where I want to call a method when a change is made in the database - like notifications: whenever a new notification comes in, my page should automatically show the number of notifications. I have done this using a timer, but that is not good because it calls repeatedly and the load on the server increases unusually. So please tell me, is there any way to listen to the database and call the method only when a new entry is made or any change is made to the database?
You have two options:
Trigger from the database to a Java program using sys_exec():
see https://github.com/mysqludf/lib_mysqludf_sys
Use a Hibernate entity listener (a sketch follows below). This only works if Hibernate has exclusive access to the database.
see http://docs.jboss.org/hibernate/entitymanager/3.5/reference/en/html/listeners.html
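For the second option, a rough sketch of a JPA/Hibernate entity listener looks like this (the entity and listener names are made up for the example); remember it only fires for changes that go through Hibernate:

    import javax.persistence.Entity;
    import javax.persistence.EntityListeners;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.persistence.PostPersist;

    // Listener class: Hibernate/JPA calls this callback after a Notification is inserted,
    // so you can react to new rows without polling the database.
    public class NotificationListener {
        @PostPersist
        public void onNewNotification(Object entity) {
            // e.g. increment a counter or push an event to connected clients.
            System.out.println("New notification stored: " + entity);
        }
    }

    @Entity
    @EntityListeners(NotificationListener.class)
    class Notification {
        @Id
        @GeneratedValue
        Long id;

        String message;
    }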
If I've understood your question correctly, you have a web interface which should show a notification, if there's a new DB entry?
Then you've first got to choose one of Jose Luis Martin's suggestions, in order to have the notification on the server side. And then you have to forward this notification to the client. For this there are a few possibilities:
(What you already did): Use polling (sending a request from client to server every x seconds, asking for new entries): http://en.wikipedia.org/wiki/Polling_(computer_science)
Let the server push the data to the client. This is the more "modern" solution: http://en.wikipedia.org/wiki/Push_technology
I'd suggest using the second approach in combination with some framework like Atmosphere: https://github.com/Atmosphere/atmosphere
This framework supports several different ways of communication, with fallbacks etc.
EDIT:
If you really just want the information inside a server method, and the information doesn't have to be 100% precise, you could also use a timer on the server to count the new items every 30 seconds and cache the result for the client requests.
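That timer-plus-cache fallback is cheap to implement; roughly like this (class name and the count query are placeholders):

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicLong;

    // Sketch of the "count every 30 seconds and cache it" idea: one server-side timer
    // queries the database, and client requests only ever read the cached value.
    public class NotificationCountCache {
        private final AtomicLong cachedCount = new AtomicLong();
        private final ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();

        public void start() {
            scheduler.scheduleAtFixedRate(
                    () -> cachedCount.set(countNewNotifications()),
                    0, 30, TimeUnit.SECONDS);
        }

        public long getCachedCount() {      // called from the request handlers
            return cachedCount.get();
        }

        private long countNewNotifications() {
            // Placeholder: run a "select count(*) ..." via JPA or JDBC here.
            return 0L;
        }
    }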

NetSuite Migrations

Has anyone had much experience with data migration into and out of NetSuite? I have to export DB2 tables into MySQL, manipulate the data, and then export it in a CSV file. Then I take a CSV file of accounts and manipulate the data again so the accounts from our old system match up with the new one. Has anyone tried to do this in MySQL?
A couple of options:
Invest in a data transformation tool that connects to NetSuite and DB2 or MySQL. Look at Dell Boomi, IBM Cast Iron, etc. These tools allow you to connect to both systems, define the data to be extracted, perform data transformation functions and mappings and do all the inserts/updates or whatever you need to do.
For MySQL to NetSuite, php scripts can be written to access MySQL and NetSuite. On the NetSuite side, you can either do SOAP web services, or you can write custom REST APIs within NetSuite. SOAP is probably a bit slower than REST, but with REST, you have to write the API yourself (server side JavaScript - it's not hard, but there's a learning curve).
Hope this helps.
I'm an IBM i programmer; try CPYTOIMPF to create a pretty generic CSV file. It'll go to a stream file - if you have NetServer running you can map a network drive to the IFS directory, or you can use FTP to get the CSV file from the IFS to another machine in your network.
Try Adeptia's Netsuite integration tool to perform ETL. You can also try Pentaho ETL for this (As far as I know Celigo's Netsuite connector is built upon Pentaho). Also Jitterbit does have an extension for Netsuite.
We primarily have 2 options to pump data into NS:
i) SuiteTalk ---> lets us do SOAP-based transformations. There are two versions of SuiteTalk: synchronous and asynchronous.
Typical tools like Boomi/Mule/Jitterbit use synchronous SuiteTalk to pump data into NS. They also have decent editors to help you do the mapping.
ii) RESTlets ---> typical REST-based architectures from NS can also be used, but you may have to write external brokers to communicate with them.
Depending on your needs you can use either. In most cases you will be using SuiteTalk to bring data into NetSuite.
Hope this helps ...
We just got done doing this. We used an iPaaS platform called Jitterbit (similar to Dell Boomi). It can connect to MySQL and to NetSuite, and you can do transformations in the tool. I have been really impressed with the platform overall so far.
There are different approaches; I like the following for processing a batch job:
To import data into NetSuite:
Export a CSV from the old system and place it in a NetSuite File Cabinet folder (use a RESTlet or web services for this).
Run a scheduled script to load the files in the folder and update the records.
Don't forget to handle errors. Ways to handle errors: send an email, create a custom record, log to a file or write to a record.
Once the file has been processed, move the file to another folder or delete it.
To export data out of NetSuite:
Gather data and export to a CSV (You can use a saved search or similar)
Place CSV in File Cabinet folder.
From the external server, call web services or a RESTlet to grab new CSV files in the folder (see the sketch after this list).
Process file.
Handle errors.
Call web services or a RESTlet to move or delete the CSV file.
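The "call a RESTlet from the external server" step above is just an authenticated HTTPS request. A very rough Java sketch follows; the RESTlet URL and the Authorization header are placeholders, and in practice NetSuite expects token-based authentication, which is omitted here.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Path;

    // Sketch of an external server pulling a CSV that a saved search / scheduled script
    // has already placed behind a RESTlet. URL and auth header are placeholders.
    public class RestletCsvDownload {
        public static void main(String[] args) throws Exception {
            String restletUrl = "https://your-netsuite-restlet-url";          // placeholder
            String authHeader = "token-based-auth-header-goes-here";         // placeholder

            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create(restletUrl))
                    .header("Authorization", authHeader)
                    .GET()
                    .build();

            // Save the response body (the CSV contents) to a local file for processing.
            HttpResponse<Path> response = client.send(
                    request, HttpResponse.BodyHandlers.ofFile(Path.of("export.csv")));

            System.out.println("Downloaded " + response.body()
                    + " (HTTP " + response.statusCode() + ")");
        }
    }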
You can also use Pentaho Data Integration; it's free and the learning curve is not that difficult. I took this course and was able to play around with the tool within a couple of hours.