When exporting a large Google Doc as a PDF, I am running into a timeout. I am currently using this REST API. Is there a way to export larger documents? Maybe there is a way to change the timeout duration, or to upgrade to a paid tier in the Google Cloud console.
Use exportLinks under Files: get instead, since Files: export has a 10 MB limit.
Sample:
curl \
'https://www.googleapis.com/drive/v3/files/[FILE_ID]?fields=exportLinks&key=[YOUR_API_KEY]' \
--header 'Authorization: Bearer [YOUR_ACCESS_TOKEN]' \
--header 'Accept: application/json' \
--compressed
It will return a response containing all of the export links available for the file (these links do not have the 10 MB limit).
Output:
{
  "exportLinks": {
    "application/rtf": "https://docs.google.com/feeds/download/documents/export/Export?id=[FILE_ID]&exportFormat=rtf&resourcekey=[RESOURCE_KEY]",
    "application/vnd.oasis.opendocument.text": "https://docs.google.com/feeds/download/documents/export/Export?id=[FILE_ID]&exportFormat=odt&resourcekey=[RESOURCE_KEY]",
    "text/html": "https://docs.google.com/feeds/download/documents/export/Export?id=[FILE_ID]&exportFormat=html&resourcekey=[RESOURCE_KEY]",
    "application/pdf": "https://docs.google.com/feeds/download/documents/export/Export?id=[FILE_ID]&exportFormat=pdf&resourcekey=[RESOURCE_KEY]",
    "application/epub+zip": "https://docs.google.com/feeds/download/documents/export/Export?id=[FILE_ID]&exportFormat=epub&resourcekey=[RESOURCE_KEY]",
    "application/zip": "https://docs.google.com/feeds/download/documents/export/Export?id=[FILE_ID]&exportFormat=zip&resourcekey=[RESOURCE_KEY]",
    "application/vnd.openxmlformats-officedocument.wordprocessingml.document": "https://docs.google.com/feeds/download/documents/export/Export?id=[FILE_ID]&exportFormat=docx&resourcekey=[RESOURCE_KEY]",
    "text/plain": "https://docs.google.com/feeds/download/documents/export/Export?id=[FILE_ID]&exportFormat=txt&resourcekey=[RESOURCE_KEY]"
  }
}
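Once you have the link for the MIME type you need, you can download it directly with the same access token. A minimal sketch, assuming the same [FILE_ID], [RESOURCE_KEY], and [YOUR_ACCESS_TOKEN] placeholders as above, and a hypothetical output file name (the -L flag follows any redirects the export endpoint may issue):
curl -L \
'https://docs.google.com/feeds/download/documents/export/Export?id=[FILE_ID]&exportFormat=pdf&resourcekey=[RESOURCE_KEY]' \
--header 'Authorization: Bearer [YOUR_ACCESS_TOKEN]' \
--output large-document.pdf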
Reference:
https://stackoverflow.com/a/59168288/14606045
I was trying the Vimeo live streaming API, following this guide: https://developer.vimeo.com/api/live/events.
However, at step 2 (https://developer.vimeo.com/api/live/events#managing-a-live-event-step-2) I am getting this:
{
  "error": "Unable to upload a video. Please contact the app's creator.",
  "link": null,
  "developer_message": "Invalid upload approach provided. The only valid approach for versions greater than 3.0 are `streaming`, `pull`, ' .\n '`post`, and `tus`.",
  "error_code": 2230
}
My curl request looks like this:
curl --location --request POST 'https://api.vimeo.com/me/videos' \
--header 'Authorization: bearer {token}' \
--header 'Content-Type: application/json' \
--header 'Accept: application/vnd.vimeo.*+json' \
--data-raw '{ "upload": { "approach": "live" } }'
I have a Premium account and the token is correct. I tried the other values for upload.approach suggested in the error message, but nothing works for live streaming. Any help is appreciated.
You're a Vimeo Premium member, but an Enterprise account is required to access the Live API:
To qualify, you must be a Vimeo Enterprise customer or a Vimeo development partner.
See the first bullet point here: https://developer.vimeo.com/api/live/events#before-you-begin
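For reference, once an account does have Live API access, live events are created against the /me/live_events endpoint rather than by POSTing to /me/videos with an upload approach of "live". A rough sketch, assuming the endpoint and a simple title field from the linked guide (not verified against an Enterprise account):
curl --location --request POST 'https://api.vimeo.com/me/live_events' \
--header 'Authorization: bearer {token}' \
--header 'Content-Type: application/json' \
--header 'Accept: application/vnd.vimeo.*+json' \
--data-raw '{ "title": "My live event" }'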
@Ashish Yes, a Premium membership is enough only if you are handling the live stream through the Vimeo website.
To access live events through the API, you must have an Enterprise membership.
We can't use it without the Enterprise plan either; we are suffering from the same issue.
They also mention this under pricing (see the image), and I was told that they did this intentionally.
I am automating the retrieval of open ticket details, so I am executing a web service call against the FreshService ticketing tool.
Below is my web service call using curl and GET:
curl -X GET -H 'Content-Type: application/json' -v -i 'https://support.XXXXXXX.com/helpdesk/tickets/9725.json?-u=ASDDDECDFF%3AX%20'
When I execute it, I only get the response below in the body.
{
  "require_login": true
}
But the JSON output is visible when I open that URL in a browser, so I executed the VBScript below to send an HTTP request and read the JSON, but the same "require_login":true response comes back.
' Send a plain GET request and show the raw response body
Dim o
Set o = CreateObject("MSXML2.XMLHTTP")
o.Open "GET", "https://support.XXXXXXX.com/helpdesk/tickets/9723.json", False
o.Send
MsgBox o.responseText
So my expectation is to get the JSON in the body of the web service response, or to fetch the JSON via VBScript and store it locally.
I would appreciate any other lightweight tool or easier approach. More details about the Freshservice API: https://api.freshservice.com/#introduction
After I encoded my authentication key with Base64 and added it as an Authorization header, it is working now.
Updated web service call:
curl -X GET -H 'Content-Type: application/json' -H 'Authorization: Basic XXXXXYYYYYZZZ' -v -i 'https://support.XXXXX.com/helpdesk/tickets.json'
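For completeness, the Basic value is just the Base64 encoding of "api_key:X". A small sketch, assuming a placeholder key MY_API_KEY; curl's -u flag can also do the encoding for you:
# Build the header value manually...
echo -n 'MY_API_KEY:X' | base64
# ...or let curl encode it by passing the credentials with -u
curl -u 'MY_API_KEY:X' -H 'Content-Type: application/json' -v -i 'https://support.XXXXX.com/helpdesk/tickets.json'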
I tried uploading the file using Postman and the curl command on Windows and Linux, as described at https://forge.autodesk.com/en/docs/data/v2/tutorials/upload-file/, but I am getting a gateway timeout error.
I followed the steps in that tutorial and I am able to create the bucket, but when I try uploading a file to the created bucket it gives a 504 Gateway Timeout error.
Can you give me a solution to resolve this?
A screenshot of the error is attached.
Please find the request below:
curl -v 'https://developer.api.autodesk.com/oss/v2/buckets/testbucket/objects/test.3ds' -X 'PUT' -H 'Authorization: Bearer TOKEN' -H 'Content-Type: application/octet-stream' -H 'Content-Length: 308331' -T 'test.3ds'
It looks like your model file is too big to upload in a single request; please use the resumable upload API instead. See the related thread: Upload large files (2GB) to Autodesk Forge Data Management API.
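In case it helps, the resumable flow PUTs each chunk to the .../resumable endpoint with a Content-Range and a Session-Id header. A rough sketch for the first chunk only, assuming 5 MB chunks, a hypothetical 10485760-byte total size, and placeholder chunk file names and session id:
# Split the file into 5 MB chunks (creates chunk_aa, chunk_ab, ...)
split -b 5242880 test.3ds chunk_
# Upload the first chunk; repeat with adjusted ranges for the remaining chunks
curl -v 'https://developer.api.autodesk.com/oss/v2/buckets/testbucket/objects/test.3ds/resumable' \
  -X 'PUT' \
  -H 'Authorization: Bearer TOKEN' \
  -H 'Content-Type: application/octet-stream' \
  -H 'Content-Range: bytes 0-5242879/10485760' \
  -H 'Session-Id: my-upload-session-1' \
  -T 'chunk_aa'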
I am having trouble requesting the current status of a resumable upload. Based on the Google documentation, the following request should return a Range header with the range Google has received of my upload so far, but I keep getting the following response:
Failed to parse Content-Range header
Here is my curl request:
curl -H "Content-Range: bytes */1443452365" -H "Content-Length: 0" locationUrl -X PUT
I have also tried "bytes */*" and "*/*" for the Content-Range header, but no luck.
Any ideas?
First, check that the request format is correct, like the sample given below:
curl -H "Accept: application/json" -H "Content-type: application/json" -X POST -d '{"id":100}' http://localhost/api/postJsonReader.do
Also, as discussed for other command-line tools, when sending raw HTTP data, be aware that POST and PUT operations require computing the value of the Content-Length header. You can use the UNIX tool wc to compute this value: place the entire HTTP body into a text file such as template_entry.xml and run wc -c template_entry.xml. Accidentally using an incorrect value for the Content-Length header is often difficult to debug.
Lastly, you can request the status between chunks, not just when the upload is interrupted. If the upload request is interrupted, follow the procedure outlined in "Resume an interrupted upload".
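To tie this back to the question, here is a sketch of the status check and the resume step, assuming the session URI is stored in $LOCATION_URL, the same 1443452365-byte total as above, and a hypothetical file remaining_bytes.bin holding the unsent bytes. A 308 response with a Range header such as bytes=0-999999 means the server has the first 1000000 bytes, so the upload resumes at offset 1000000:
# Ask for the current status of the resumable session
curl -i -X PUT "$LOCATION_URL" \
  -H "Content-Length: 0" \
  -H "Content-Range: bytes */1443452365"
# Expected: HTTP/1.1 308 Resume Incomplete
#           Range: bytes=0-999999        (example value)

# Resume the upload from the next unreceived byte (offset 1000000 here)
curl -X PUT "$LOCATION_URL" \
  -H "Content-Range: bytes 1000000-1443452364/1443452365" \
  --data-binary @remaining_bytes.bin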
I have a simple JSON file stored on an FTP server, and when I call the URL http://myurl/example.json, I receive a file with a text/plain content type.
How can I solve this problem?
Thanks for any help :)
Whether the file was stored on the server via FTP or some other method is irrelevant. What matters is that you are requesting the file over HTTP, so you need to work on your HTTP request and response headers.
One thing you may try first is setting the headers of your request. Here's a curl example I found that shows this (application/json is the interesting part).
curl -i -u application_name:application_password --data '{"value": "my_password"}' http://localhost:8095/crowd/rest/usermanagement/1/authentication?username=my_username --header 'Content-Type: application/json' --header 'Accept: application/json'
What also matters are the response headers that the server assigns. If you have the access and rights, make sure those are set correctly, too. A good starting point is to review the current response headers.
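For example, you can check what the server currently sends with a HEAD request; a small sketch, assuming the same example URL (the Apache directive in the comment is just one possible server-side fix):
curl -sI http://myurl/example.json | grep -i '^content-type'
# If this prints "Content-Type: text/plain", the fix belongs on the server,
# e.g. an Apache "AddType application/json .json" directive or the
# equivalent MIME-type mapping in your web server's configuration.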