I am writing a program to pull some JSON data off of a third-party website. What I would like to happen is: run my method to pull the data, then cycle through it to see if there is a newer date than the last time the system pulled it. If there is, I want to send out a local notification that something was "found". I expect new data to be there every 5 minutes, but I would like users to be able to schedule the polling interval anywhere from 5 minutes to once a day. There is more I would like to check on, but this is where I want to start.
I don't know where to begin; any help will be appreciated.
I am building this project in Swift on Xcode 6.4.
You should look into the following:
For URL fetching (for your JSON): an NSURLSession tutorial.
Look at the ETag and Last-Modified headers to check whether the JSON file has been modified on the server.
NSTimer for your timing requirements.
For the local notifications: use UILocalNotification (NSNotification only broadcasts within your own process; it won't notify the user).
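The ETag / Last-Modified trick is language-agnostic, so here is a sketch of a conditional GET in Ruby (matching the other code in this thread) against a hypothetical endpoint — in Swift you would set the same If-None-Match header on an NSMutableURLRequest:

```ruby
require 'net/http'
require 'uri'

# Build a conditional GET: if we kept an ETag from the last poll,
# send it back so the server can answer 304 Not Modified.
def build_conditional_request(uri, etag)
  req = Net::HTTP::Get.new(uri)
  req['If-None-Match'] = etag if etag
  req
end

# Poll once; returns [new_etag, body], or [etag, nil] when unchanged.
# (This one performs a real HTTP request, so it needs network access.)
def poll(uri, etag)
  req = build_conditional_request(uri, etag)
  res = Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == "https") do |http|
    http.request(req)
  end
  return [etag, nil] if res.is_a?(Net::HTTPNotModified)  # 304: nothing new
  [res['ETag'], res.body]
end
```

Persist the returned ETag between polls (a file is enough) and you only download the JSON when it has actually changed on the server.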
I am using Workbox 3.6.1 now and I am really enjoying it.
But I have a question about how Workbox caches POST data.
I need to use data received from one request in a later POST to the server.
For example,
Step 1: The user enters his ID in the input field to validate his identity, which lets the website render the rest of the content (e.g. another input field).
Step 2: The user enters some other information and sends the data along with his ID to the server.
To my basic knowledge so far, I understand that Workbox can handle steps 1 and 2 separately (that is the only way I have managed to implement it). But I am wondering whether they can be queued together, so that when the website comes back online, step 1 runs and then step 2 uses the data from step 1 and runs as well. While online this can be done with promises or callbacks, but I want to know whether the requests can be chained like that in the offline queue too.
Maybe there is a way I am not aware of, but I would like to know the official Workbox method for implementing this case, or any other workarounds.
Thanks for the help.
I'm working on a simple Ruby script with a CLI that will let me browse certain statistics inside the terminal.
I'm using API from the following website: https://worldcup.sfg.io/matches
require 'httparty'

url = "https://worldcup.sfg.io/matches"
response = HTTParty.get(url)  # response.body holds the raw JSON text
I have two goals in mind. First is to somehow save the JSON response (I'm not using a database) so I can avoid unnecessary requests. Second is to check whether new data is available and, if it is, to overwrite the previously saved response.
What's the best way to go about this?
... with cli ...
Since a CLI script exits between runs, caching in memory is likely not available to you. In this case you can save the response to a file on disk.
Second is to check if the new data is available, and if it is, to override the previously saved response.
The thing is, how can you check whether new data is available without making a request for the data? You can't (given the information you provided). So you can simply keep fetching the data every 5 minutes or so and updating your local file.
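A minimal sketch of that approach, building on the code in the question (the cache path and refresh interval are arbitrary choices; stdlib Net::HTTP stands in for the fetch so the sketch has no gem dependency — with HTTParty it would just be `HTTParty.get(url).body`):

```ruby
require 'net/http'
require 'json'
require 'uri'

CACHE_FILE = "matches_cache.json"  # arbitrary cache path
MAX_AGE    = 5 * 60                # refresh after 5 minutes (arbitrary)

def fetch_matches
  # Equivalent to HTTParty.get(url).body
  Net::HTTP.get(URI("https://worldcup.sfg.io/matches"))
end

def matches
  # Serve from the cache file while it is fresh...
  if File.exist?(CACHE_FILE) && (Time.now - File.mtime(CACHE_FILE)) < MAX_AGE
    return JSON.parse(File.read(CACHE_FILE))
  end
  # ...otherwise fetch new data and overwrite the saved response.
  body = fetch_matches
  File.write(CACHE_FILE, body)
  JSON.parse(body)
end
```

The file's mtime doubles as the "last fetched" timestamp, so there is no extra bookkeeping to persist.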
I know this has been asked similarly in two other threads, but even with both of those I still have not been able to get a simple step count.
I've been going through the documentation and sending requests through the OAuth 2.0 Playground, but I can't for the life of me get any meaningful numbers in a response; I fear I'm overlooking something or looking in the wrong place.
What I've tried:
1) Got all data sources at this request URL:
https://www.googleapis.com/fitness/v1/users/{userId}/dataSources
2) Gone through two specific SO threads: One, Two
From suggestions there, I sent this request:
https://www.googleapis.com/fitness/v1/users/me/dataSources/derived:com.google.step_count.delta:com.google.android.gms:estimated_steps/datasets/{maxtime}-{mintime}
with values for maxtime/mintime that corresponded from April last year to today and the response I got was this:
{
  "minStartTimeNs": {mintime},
  "maxEndTimeNs": {maxtime},
  "dataSourceId": "derived:com.google.step_count.delta:com.google.android.gms:estimated_steps"
}
where mintime and maxtime were the values in the request. I'm continuing to read through the docs in the hope that I'm missing something, but no luck currently. Any thoughts?
I have been stuck on this request too. You get this response because there is no data within that range of time. Make sure that mintime and maxtime are in nanoseconds and try again. For example, today is: 1442404933000000000
Good luck!
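The nanosecond point is easy to get wrong, so here is a short illustration (Ruby, for consistency with the other code in this thread; the Fit REST API expects dataset IDs of the form startNs-endNs, i.e. epoch nanoseconds):

```ruby
# Google Fit dataset IDs are "<startNs>-<endNs>": epoch nanoseconds.
# Integer arithmetic avoids float rounding at this magnitude.
def to_nanos(time)
  time.to_i * 1_000_000_000 + time.nsec
end

mintime = to_nanos(Time.utc(2015, 4, 1))  # example start of range
maxtime = to_nanos(Time.now)              # ...through right now

puts "#{mintime}-#{maxtime}"  # e.g. "1427846400000000000-<now in ns>"
```

If you accidentally pass epoch seconds or milliseconds instead, the range collapses to a sliver of time near the epoch and the API correctly reports no points, which matches the empty response in the question.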
Use Google Takeout to export your historic Google Fit data, then use a time interval for which you actually have fitness data. You can only get data that has been synced, so sync your Google Fit data frequently.
I am working with an Arduino project that uses Google App Engine to post data it collects from various sensors. To give you an idea, here is a link to the project: http://www.iowa-aquaponics.com
I am finding that occasionally the Arduino posts, or Google App Engine receives, data that isn't valid.
This would be a good entry:
http://www.mysite.appspot.com/adacs/arduino?&Temp=80.6&Humidity=82.2&AmbientLDR=16&WaterTemp=79.3&pHValue=5.03&doValue=2.5
Occasionally a character will drop and I will either get a missing decimal point, or it will drop an &, and what should be a numeric value like 80.6 turns into a string like 80.6umidity.
Since Google App Engine sees this as a string, it goes into the datastore without a problem. When I query this data into a JSON table, it fails because it is expecting a number and gets a string.
I was thinking of writing a CRON job that would run each time a new data set was submitted and it would validate the data and delete the record if any of the elements were not valid. I am collecting data every 10 minutes or so, so I am fine with occasionally dropping some records. I would rather have this over not being able to see any data because one element of the JSON table is not valid.
I am curious about other ideas for handling this situation. If there is a best practice for this, please let me know. Thanks everyone.
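One lighter-weight alternative to a cleanup cron job is to validate at write time and reject the record before it ever reaches the datastore. The handler itself would be App Engine code, but the check is simple; here is a sketch of the validation logic in Ruby (to match the other snippets in this thread), with the field names taken from the URL in the question:

```ruby
# The numeric query parameters the Arduino is expected to send.
REQUIRED_FIELDS = %w[Temp Humidity AmbientLDR WaterTemp pHValue doValue]

# True only when every expected field is present and parses as a number,
# so mangled values like "80.6umidity" never reach the datastore.
def valid_reading?(params)
  REQUIRED_FIELDS.all? do |field|
    value = params[field]
    begin
      !value.nil? && !!Float(value)  # Float() raises on non-numeric input
    rescue ArgumentError
      false
    end
  end
end
```

Rejecting a corrupt reading at ingest costs one sample out of a 10-minute cycle, the same loss as the proposed delete-after-the-fact cron job, but the datastore stays clean and the JSON table never breaks.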
There is a web service that allows me to go to a URL, with my API key, and request a page of data. The data is returned as JSON. The JSON is well-formed; I ran it through JSONLint and confirmed it's OK.
What I would like to do is retrieve the JSON data from within MS Access (2003 or 2007), if possible, and build a table from that data (first time through), then append/update the table on subsequent calls to that URL. I would settle for a "pre-step" where I retrieve this information via some other means. Since I have an API key in the URL, I do not want to do this server-side. I would like to keep it all within Access and run it on my PC at home (it's for personal use anyway).
If I have to use another step before the database load, then JavaScript? But I don't know that very well. I don't even really know what JSON is, other than what I have read on Wikipedia. The URL looks similar to:
http://www.SomeWebService.com/MyAPIKey?p=1&s=50
where: p = page number
s = records per page
AccessDB is a JavaScript library for MS Access; a quick page search says it plays nicely with JSON and you can do input/output with it. boo-ya.
http://www.accessdb.org/
EDIT:
dead url; wayback machine ftw:
http://web.archive.org/web/20131007143335/http://www.accessdb.org/
also sourceforge
http://sourceforge.net/projects/accessdb/