Is there any way to return more than 10 results per page using the Amazon product API? - amazon-product-api

I see that it has a limit of 10 results per request for the ItemSearch API. Is there any way I can return more, or perhaps use the batch API to return more?
I'm using the node.js apac library.

Each page in the Amazon API response can include at most 10 items: http://docs.aws.amazon.com/AWSECommerceService/latest/DG/ItemSearch.html#Description

Use "perPage": 10000 in your payload; 10,000 records will be returned in one shot.
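The documented route past the 10-item cap is the ItemPage parameter, which accepts values 1 through 10, so up to 100 results per search. A minimal sketch of that page loop, assuming a hypothetical `search_page(keywords, page)` wrapper around the actual API call (the demo backend below stands in for the real service):

```python
def fetch_all_items(search_page, keywords, max_pages=10):
    """Collect items across pages 1..max_pages (ItemSearch caps ItemPage at 10)."""
    items = []
    for page in range(1, max_pages + 1):
        batch = search_page(keywords, page)  # one ItemSearch call with ItemPage=page
        if not batch:
            break  # no more results
        items.extend(batch)
    return items

# Demo with a fake backend: 25 matching items, served 10 per page.
def fake_search_page(keywords, page):
    data = [f"item-{i}" for i in range(25)]
    start = (page - 1) * 10
    return data[start:start + 10]

print(len(fetch_all_items(fake_search_page, "guitars")))  # 25
```

With the apac library the `search_page` assumption would be one `client.execute` call per page, with ItemPage set from the loop variable.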

Related

How do I return a specific number of items from an API in react?

The API that I'm pulling data from has 5000 records; I'm trying to pull only 20 for test purposes. I tried
https://jsonplaceholder.typicode.com/photos?=20 but I'm still getting all 5000. Any idea how I can achieve this?
You can try hitting the URL with the params below:
http://jsonplaceholder.typicode.com/photos?_start=0&_limit=10
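jsonplaceholder treats `_start`/`_limit` as plain offset/limit slicing over the collection, so `_limit=20` gives the 20 records the question asks for. A local sketch of those semantics (no network; the `photos` list stands in for the real 5000-record resource):

```python
def apply_query(records, _start=0, _limit=None):
    """Mimic jsonplaceholder's _start/_limit paging: skip _start rows, return _limit rows."""
    end = None if _limit is None else _start + _limit
    return records[_start:end]

photos = [{"id": i} for i in range(1, 5001)]  # stand-in for /photos
page = apply_query(photos, _start=0, _limit=20)
print(len(page))  # 20
```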

SoapUI Groovy script - multiple calls storing values in variable?

I can't figure out whether this is even possible to do in SoapUI; I couldn't find much about Groovy scripting here.
I'm using the Flickr API, specifically the method:
flickr.photos.search
As parameters I pass my user_id to get all my photos in response.
Let's say I can display only 500 photos per page, so there are 51 pages in total.
Is it possible to write some groovy script to:
loop through the response, get each photoId and store it in some list or array
call the same request, just with "page": 2 this time
repeat step 1 and 2 until photoId is extracted from all 51 pages
Is this possible, will I have the array/list accessible for a different request call?
Or do I need to write the value to some txt file and then when I want to use the values, use the file as data source?
Thank you so much!
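The three steps above can be sketched as a single loop (shown in Python rather than Groovy, with a hypothetical `call_search(page)` standing in for re-running the SoapUI request; the same shape translates directly to a Groovy script step):

```python
def collect_photo_ids(call_search, total_pages):
    """Call the search once per page and accumulate every photo id in one list."""
    ids = []
    for page in range(1, total_pages + 1):
        response = call_search(page)  # re-run the same request with "page": page
        ids.extend(p["id"] for p in response["photos"]["photo"])
    return ids

# Demo: 3 fake pages of 2 photos each, shaped like a flickr.photos.search response.
def fake_search(page):
    return {"photos": {"photo": [{"id": f"{page}-{i}"} for i in range(2)]}}

print(collect_photo_ids(fake_search, 3))  # ['1-0', '1-1', '2-0', '2-1', '3-0', '3-1']
```

In SoapUI the resulting list can be stashed in a test-case or project property for later steps, which avoids the intermediate text file.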

SSIS 2010 ZappySys more than 300 rows issue

I am using SSIS 2010 with the ZappySys test extension, connecting a test JSON Source (REST API or file).
The issue I have is that the default number of rows to scan is 300. I have tried to override this, but it still returned only 300 rows. I would like to use this extension more; is there a way of getting more than 300 rows of data? Does anyone know how to bypass this?
There is a post on the ZappySys blog, but it does not state how to get more than 300 rows:
https://zappysys.com/blog/how-to-read-data-from-servicenow-rest-api-ssis/
Disclaimer: I work for ZappySys.
The Scan option only affects the metadata guess in ZappySys. Based on your description, it looks like your issue is not the metadata guess but fetching more rows with pagination. The process of looping through ServiceNow data is described here.
To use pagination in ZappySys, make sure you configure the JSON Source as per the article below.
Step-By-Step: Configure ServiceNow API Pagination
Screenshot - Pagination Settings
Here is an example of pagination in ServiceNow. Let's say you're fetching all rows from MyTable1. In that case you have to keep calling the API as below until the last page is reached; you detect the last page by trapping a WebException with status code 404.
/api/now/table/MyTable1?sysparm_limit=10&sysparm_offset=0
/api/now/table/MyTable1?sysparm_limit=10&sysparm_offset=10
/api/now/table/MyTable1?sysparm_limit=10&sysparm_offset=20
/api/now/table/MyTable1?sysparm_limit=10&sysparm_offset=30
/api/now/table/MyTable1?sysparm_limit=10&sysparm_offset=NNN ... when the last page is reached you get a 404 error, so stop looping.
ZappySys pagination settings take care of all of this for you automatically.

Leveraging a Wikipedia bot to get results over 500

I am using MediaWiki APIs to fetch category members
http://en.wikipedia.org/w/api.php?action=query&list=categorymembers&cmtitle=Category:Physics&cmsort=timestamp&cmdir=desc&cmlimit=1000&format=json
However, I get this warning:
"cmlimit may not be over 500 (set to 1000) for users"
Is there any way I can leverage an existing bot to get all results?
If not, how can I get all results without using a bot?
You will need to make more than one request, using the continue parameter to iterate over the result set.
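The continuation loop works by merging the `continue` block from each response back into the next request's parameters until the server stops sending one. A sketch under that assumption, with a hypothetical `call_api(params)` in place of the real HTTP request (the fake API below serves 1200 titles in 500-title chunks, mirroring the user limit):

```python
def get_all_members(call_api, base_params):
    """Repeat the query, feeding back the 'continue' block until it disappears."""
    members, params = [], dict(base_params)
    while True:
        resp = call_api(params)  # one action=query&list=categorymembers request
        members.extend(resp["query"]["categorymembers"])
        if "continue" not in resp:
            return members  # no continue block: all results fetched
        params = {**base_params, **resp["continue"]}  # e.g. adds cmcontinue=...

# Demo with a fake API serving 1200 category members, 500 per request.
titles = [f"Page {i}" for i in range(1200)]
def fake_api(params):
    start = int(params.get("cmcontinue", 0))
    chunk = titles[start:start + 500]
    resp = {"query": {"categorymembers": [{"title": t} for t in chunk]}}
    if start + 500 < len(titles):
        resp["continue"] = {"cmcontinue": str(start + 500)}
    return resp

print(len(get_all_members(fake_api, {"cmtitle": "Category:Physics"})))  # 1200
```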

Get shorter response from Google Geocode Api

This is my Get request for api:
http://maps.googleapis.com/maps/api/geocode/json?latlng=50.126886,14.421954&language=cs&sensor=false
And I get a really long response, which is not ideal for a mobile device. Which parameters should I add to get just the first result's "address_components" and "formatted_address" without the other fields (they are useless for me)? Thanks.
I don't think you can limit the response (https://developers.google.com/maps/documentation/geocoding/).
What you can do, if you really want to limit it, is proxy the response through your own server and strip out the parts you don't need.
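The proxy-side trim could look like the sketch below, which keeps only the two fields the question asks for; the `full` payload is a cut-down, hypothetical geocode response, not the real API output:

```python
def trim_geocode(response):
    """Keep only the first result's address_components and formatted_address."""
    first = response.get("results", [{}])[0]
    return {
        "address_components": first.get("address_components", []),
        "formatted_address": first.get("formatted_address", ""),
    }

# Demo with a cut-down stand-in for a geocode payload.
full = {
    "results": [{
        "address_components": [{"long_name": "Prague", "types": ["locality"]}],
        "formatted_address": "Prague, Czechia",
        "geometry": {"location": {"lat": 50.126886, "lng": 14.421954}},
        "place_id": "abc",
    }],
    "status": "OK",
}
print(trim_geocode(full)["formatted_address"])  # Prague, Czechia
```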
Are you sure that your problem is the size of the result? More often than not, most of the time spent downloading is spent initializing requests, which basically means you should favor fewer requests over smaller requests.
Good resources:
HTTP Requests vs File Size?
http://developer.yahoo.com/blogs/ydn/high-performance-sites-rule-1-fewer-http-requests-7163.html
Should I aim for fewer HTTP requests or more cacheable CSS files?