Google Fit Running data

I am using the Google API Client for Fitness to retrieve user fitness data via the PHP client library.
My use case: to get a user's total run time for the day, and also the distance.
I am using the following code (sample code with the key steps):
$service = new Google_Service_Fitness($google_client);
$dataSets = $service->users_dataset;
..
$aggBy->setDataTypeName("com.google.activity.segment");
..
$aggregates = $dataSets->aggregate('me', $aggReq);
I am using com.google.activity.segment as the data type name, and I get a response for the run type with this data:
Activity type: 8
Run time: value in msec
Number of segments: 2
Now I want to get the total distance travelled during the running sessions.
From the above response I can make out that the user had 2 separate running activities, so how can I get the combined distance for that particular activity type?
Also, how can I retrieve the details of those 2 running segments?
As said, I am using the PHP client API.
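One way to get the combined running distance (a sketch only, assuming the v1 google-api-php-client class names; verify them against your client version) is a second aggregate request for com.google.distance.delta, bucketed by activity type so that all running segments land in a single bucket:
$aggBy = new Google_Service_Fitness_AggregateBy();
$aggBy->setDataTypeName("com.google.distance.delta");
$aggReq = new Google_Service_Fitness_AggregateRequest();
$aggReq->setAggregateBy(array($aggBy));
$aggReq->setBucketByActivityType(new Google_Service_Fitness_BucketByActivity());
$aggReq->setStartTimeMillis($startOfDayMillis); // your day boundaries, epoch millis
$aggReq->setEndTimeMillis($endOfDayMillis);
$aggregates = $service->users_dataset->aggregate('me', $aggReq);
$runningDistance = 0;
foreach ($aggregates->getBucket() as $bucket) {
    if ($bucket->getActivity() == 8) { // 8 = running
        foreach ($bucket->getDataset() as $dataset) {
            foreach ($dataset->getPoint() as $point) {
                // com.google.distance.delta values are floats, in meters
                $runningDistance += $point->getValue()[0]->getFpVal();
            }
        }
    }
}
For the per-segment details, using setBucketByActivitySegment() instead of setBucketByActivityType() should return one bucket per individual running segment rather than one combined bucket, per the REST API's bucketByActivitySegment option.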

JMeter: How to read a particular row from a CSV file based on a column value and pass that value to a sampler?

I am new to JMeter and doing a POC for a load test on a web application.
What I am trying to do:
I have a total of 4 user logins (surgeons). Each login is associated with 'n' patients.
I've created 2 CSV files:
1. one with the user login and password for each surgeon
2. another containing the PatientName, PatientId, and the surgeon associated with that patient, like below:
PatientName,PatientId,loginName
Pa1,PID1,user1
Pa2,PID2,user1
Pa3,PID3,user1
Pa4,PID4,user1
Pa5,PID5,user2
Pa6,PID6,user2
Pa7,PID7,user3
Pa8,PID8,user4
My scenario:
Log in as a user.
Navigate to each patient dashboard as per their associations.
Log out of the application.
My test plan:
Thread Group (4 users, ramp-up time of 1 sec, 1 loop)
-CSV1 (with username, password)
-Login page, then navigate to the main page
-Runtime Controller (to sustain the load for a set amount of time)
--While Loop (to loop through the patient dashboards of the logged-in surgeon/user)
---CSV2 (the data as shown above)
----Navigate to dashboard
----Navigate to main
-Log out of the application
What I want to achieve:
I want to use a single thread group and run it concurrently for all 4 users. Once a user logs in, that user should only pick up the patient rows from the CSV that are associated with that login.
For example:
When Thread 1 is running with the User1 login, it should only be able to loop through Pa1, Pa2, Pa3, and Pa4.
When Thread 2 is running with the User2 login, it should only read the Pa5 and Pa6 data.
Likewise, each user login should only pick the patients associated with it, as shown above.
Is there any way I can achieve this with the single CSV2 file, so that I don't have to create n thread groups for n logins, each with its own CSV file containing the data specific to that login?
According to the JMeter Test Elements Execution Order:
0. Configuration elements
1. Pre-Processors
2. Timers
3. Sampler
4. Post-Processors (unless SampleResult is null)
5. Assertions (unless SampleResult is null)
6. Listeners (unless SampleResult is null)
Being a Configuration Element, the CSV Data Set Config is initialized once, before anything else, therefore you won't be able to use the current variable from the 1st CSV Data Set Config in the 2nd CSV Data Set Config.
The solution is to use the __CSVRead() function instead; JMeter Functions are evaluated in the place where they appear in the Test Plan, so you can use any hardcoded value, JMeter Variable, or another function there (see the example after the plan below).
More information: How to Pick Different CSV Files at JMeter Runtime
1. CSV Data Set Config for surgeon credentials (loginNameSurgeon & Password)
2. Login request (takes the first surgeon's credentials from the CSV)
3. While Controller: ${__jexl3("${loginNameSurgeon}" != "${loginName}")}
a. CSV Data Set Config for patient data w.r.t. surgeons (PatientName,PatientId,loginName)
b. If Controller: ${__jexl3("${loginName}" != "<EOF>")} // to check whether any loginName rows are left
c. Dashboard request
d. Debug Sampler // just to validate that the variables are in place
4. Logout request
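For reference, the __CSVRead() calls for reading the patient file could look like the following; the file name patients.csv is an assumption, columns are addressed by zero-based index, and the special 'next' alias advances the file to the following row:
${__CSVRead(patients.csv,0)} // PatientName of the current row
${__CSVRead(patients.csv,1)} // PatientId of the current row
${__CSVRead(patients.csv,2)} // loginName of the current row
${__CSVRead(patients.csv,next)} // advance to the next row for the next iteration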

Make a weather app using the OpenWeatherMap API to display the LAST x days' weather (3-hourly basis)

I want to make an app displaying the LAST x days' temperature on a 3-hourly basis, using the OpenWeatherMap API only. I went through the available APIs. They either give the predicted forecast for the next 5 days, or the historical data expecting start and end as parameters, which I then tried to use (https://openweathermap.org/history).
So, just to check whether I have understood the params correctly or not, I tried fetching (through the address bar):
http://history.openweathermap.org/data/2.5/history/city?id=1275339&type=hour&appid=df78dcfa9580e72b15fdf62d406d34ec&start=1369728000&end=1369789200
But I got an HTTP 401 error:
{"cod":401, "message": "Invalid API key. Please see http://openweathermap.org/faq#error401 for more info."}
I fail to understand how the parameters should be entered to get the desired results (the last 5 days' temperature on a 3-hourly basis), and also how to set "cnt" (if needed) so that it gives me 3-hourly data.
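For what it's worth, a minimal PHP sketch of that same request; the endpoint and parameters are taken from the question, and note that the History API generally requires a paid subscription, so a key that works for the free endpoints can still get this 401 here:
<?php
// Build the history request from the question's parameters.
$params = http_build_query([
    'id'    => 1275339,    // city id
    'type'  => 'hour',
    'start' => 1369728000, // Unix timestamps, UTC
    'end'   => 1369789200,
    'appid' => 'YOUR_API_KEY',
]);
$json = file_get_contents("http://history.openweathermap.org/data/2.5/history/city?$params");
$data = json_decode($json, true);
foreach ($data['list'] as $entry) {
    // main.temp is in Kelvin by default; keeping every third hourly entry
    // gives a 3-hourly series without relying on cnt.
    echo date('Y-m-d H:i', $entry['dt']), ': ', round($entry['main']['temp'] - 273.15, 1), " °C\n";
}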

Analyze data volume of API calls with Invantive SQL

The SQL engine hides away all the nifty details of what API calls are being made. However, some cloud solutions price per API call.
For instance:
select *
from transactionlines
retrieves all Exact Online transaction lines of the current company, but:
select *
from transactionlines
where financialyear = 2016
filters it effectively on the REST API of Exact Online to just that year, reducing the data volume. And:
select *
from gltransactionlines
where year_attr = 2016
retrieves all data, since the where-clause is not forwarded to this XML API of Exact.
Of course I can attach Fiddler or Wireshark and try to analyze the data volume, but is there an easier way to analyze the data volume of API calls with Invantive SQL?
First of all, all calls handled by Invantive SQL are logged in the Invantive Cloud, together with:
the time
the data volume in both directions
the duration
to enable consistent API use monitoring across all supported cloud platforms. The actual data is not logged; it travels directly.
You can query the same numbers from within your session. For instance:
select * from exactonlinerest..projects where code like 'A%'
retrieves all projects with a code starting with 'A'. Then:
select * from sessionios#datadictionary
shows you the API calls made.
You can also put a query like the following at the end of your session, before logging off:
select main_url
,      sum(bytes_received) bytes_received
,      sum(duration_ms) duration_ms
from   ( select regexp_replace(url, '\?.*', '') main_url
         ,      bytes_received
         ,      duration_ms
         from   sessionios#datadictionary
       )
group
by     main_url
with, as a result, the total bytes received and the total duration per distinct URL.

RESTful API scenario

I am asking about a RESTful service scenario in a particular case. Assume that this is a file delivery service. Users submit an order, and after a period of time (1-10 min) a PDF file is ready for them to download. So the basics I came up with:
user submits an order using the GET method to the web service (edit: OR POST)
web service returns an orderId via JSON or XML
some background and human process takes place (1-10 mins)
user checks the status of the order by passing the orderId to the web service
if the order is ready, then a statusCode and a pdfLink are returned to the user
else only the statusCode is returned (i.e. still processing, failed, etc.)
Now, the question about this scenario is: how often should the user (another website) try to fetch the status of one specific order?
Or do we need to establish two-way web services? Like:
server A submits the order to B
B informs A that the order is ready to get
A requests the pdfLink from B
A transfers the PDF file from server B to A
When server A submits an order to B, it could also specify a URL on which it expects a call once the order is ready. This way service B does not need to know the specifics of service A; it just calls the URL specified by service A.
The response service B gives to service A could also contain a URL from which to download the order.
This prevents polling from server A to server B, which significantly reduces the load on service B.
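A minimal PHP sketch of that callback pattern; every endpoint path and field name here (callback_url, orderId, statusCode, pdfLink) is a hypothetical choice for illustration, not a prescribed contract:
<?php
// --- Server A: submit the order and tell B where to call back. ---
$ctx = stream_context_create(['http' => [
    'method'  => 'POST',
    'header'  => "Content-Type: application/json\r\n",
    'content' => json_encode([
        'document'     => 'report-42',                         // hypothetical payload
        'callback_url' => 'https://a.example.com/order-ready', // where B notifies A
    ]),
]]);
$response = json_decode(file_get_contents('https://b.example.com/orders', false, $ctx), true);
$orderId  = $response['orderId']; // keep this to correlate the callback later

// --- Server A: callback endpoint that B invokes when the PDF is ready. ---
// B POSTs e.g. {"orderId":"...","statusCode":"ready","pdfLink":"https://..."}
$notification = json_decode(file_get_contents('php://input'), true);
if ($notification['statusCode'] === 'ready') {
    // Fetch the PDF directly from the link B provided.
    copy($notification['pdfLink'], '/var/orders/' . basename($notification['orderId']) . '.pdf');
}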

Google JSON API search results limit, not 100 search queries per day

The Google Custom Search API requires the use of an API key, which I got from the Google APIs console. The API provides 100 search queries per day for free. I want more, so I signed up for billing in the console, and that succeeded. I can now set the requests/day (default 1000 requests/day), but the total number of results is still 100; I show 10 per page, so I can get 10 pages.
Billing solves the queries per day, but not the total results. The documentation does not explain this clearly. What should I do to solve the results problem? Does the XML API have the same problem? Must I replace the JSON API with the XML API?
Another year has passed... the limits are the same =((
Google forces us to use multiple accounts to get complete search results.
Use a query per day/week/month.
Or you can sort the huge result set by publish date, and when the 100-item limit is reached, execute a new request that excludes the items already seen, by applying a "lowest (or highest) possible value" condition on the publish date; a sketch of this follows below.
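A rough PHP sketch of that date-window workaround; YOUR_KEY and YOUR_CX are placeholders, and the sort=date:r:<start>:<end> range syntax comes from the Custom Search structured-data sorting documentation, so verify that your engine supports it:
<?php
$key = 'YOUR_KEY';
$cx  = 'YOUR_CX';
$q   = urlencode('some query');
// First pass: newest items, paging through the 100-result ceiling (start <= 91).
for ($start = 1; $start <= 91; $start += 10) {
    $url = "https://www.googleapis.com/customsearch/v1"
         . "?key=$key&cx=$cx&q=$q&sort=date&start=$start";
    $page = json_decode(file_get_contents($url), true);
    foreach ($page['items'] ?? [] as $item) {
        echo $item['link'], "\n"; // remember the oldest publish date seen
    }
}
// Next pass: same query, restricted to dates at or below the oldest date
// seen so far, to reach items beyond the first 100 results.
$url = "https://www.googleapis.com/customsearch/v1"
     . "?key=$key&cx=$cx&q=$q&sort=date:r:19700101:20151231&start=1";
// ...repeat the paging loop with this narrowed window.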
By the way, using a web browser you can set 100 results per page, so the total count of result items is 1000 per query. Web crawling would be helpful =))))
Be happy!