I'm trying to add some functionality to an internal Access database that will automatically create tasks on Asana.
My VBA experience is somewhat limited but by examining various code samples online and tinkering I've been able to POST tasks with all the necessary data and GET info back.
But I'm now struggling with being able to upload file attachments to a task. I'm not sure how to go about it.
Leaving the content type as application/x-www-form-urlencoded (which works for the normal POST statements when creating tasks) and just pointing the send command at a file location doesn't work; it results in a "file is not an object" error. I'm guessing this is because all that's contained in the send command is a file=path pair.
Do I need to encode the file at all, and if so, how?
I'm hoping someone can point me in the right direction.
Thanks.
You can check out how curl does it, but I believe it needs to be multipart/form-data. I would strongly recommend using a library rather than doing the encoding manually, since there are often subtle gotchas.
Basically, it works a lot like a standard form upload from a web browser.
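For illustration (since you're working in VBA, treat this as a sketch of the request shape rather than a drop-in solution), here is roughly what the multipart upload looks like in Python with the requests library, assuming Asana's attachments endpoint at /tasks/<task-id>/attachments and a form part named 'file'; the token, task ID, and file path are placeholders:

import requests

ASANA_TOKEN = "your-personal-access-token"   # placeholder
TASK_ID = "1234567890"                       # placeholder task id

url = "https://app.asana.com/api/1.0/tasks/%s/attachments" % TASK_ID
headers = {"Authorization": "Bearer %s" % ASANA_TOKEN}

# The 'files' argument makes requests build the multipart/form-data body
# (boundary, per-part Content-Disposition and Content-Type) for you - the same
# encoding a browser uses when it submits a form with a file input.
with open(r"C:\docs\report.pdf", "rb") as f:
    resp = requests.post(url, headers=headers, files={"file": ("report.pdf", f)})

print(resp.status_code, resp.text)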
Hope that helps!
Related
With the latest ForgeARKit-update-6-2018.1, I was trying to load my model in Unity with the sample Unity scene 'loadAtStartup'. I can successfully load the sample models from 'Sandbox', but I couldn't load my own model, which was uploaded through the script 'test-2legged'.
The error message shows a 504; it seems the request is not reaching the service:
AsyncRequestCompleted The remote server returned an error: (504) Gateway Time-out.
UnityEngine.Debug:Log(Object)
Autodesk.Forge.ARKit.RequestQueueMgr:AsyncRequestCompleted(Object, AsyncCompletedEventArgs) (at Assets/Forge/CodeBase/RequestQueue.cs:322)
UnityEngine.UnitySynchronizationContext:ExecuteTasks()
Model URN:
dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6bWFvbGlua3ppOHM3cnlvZWx4bjVndnR4bjcyZWc2N2l0dGp0a2MvMmZsb29yX0FyYy5pZmM=
[Update 23/4/2019]
I found that I can successfully load the same model with ForgeARKit-update-3-2017.1.2f1. I compared the Forge code in Unity, and I think it has something to do with the service URL. Version 6 fetches models from 'https://developer-api-beta.autodesk.io' while version 3 fetches from 'https://developer-api.autodesk.io'. Meanwhile, the shell script 'test-2legged' uploads to the latter one ('https://developer-api.autodesk.io'). That is why it couldn't find the resource. The question here is: how can I upload a model to the 'beta' ARKit? I tried modifying the URL in the script 'test-2legged' but it doesn't work. The screenshot below is the output of the script 'test-2legged' when fetching from the 'beta' ARKit. It seems the model is uploaded successfully, but some parsing post-work failed. I guess the response format has also changed in the beta version. Is there a beta version of the 'test-2legged' scripts (and other Scene Preparation scripts)?
Please comment. Thanks.
This is correct. My apologies for this; I know we did not document the server changes very well.
Update 6 assumes you are using the new server, which is under beta right now. The scripts and update 3 use the legacy server. Note that the two servers are not necessarily compatible and store the data in different places, so make sure to always use the same server in Unity as the one you used to prepare the scene. When we switch everyone to the new server, we will transfer the data from the legacy server to the new server's cloud storage.
The Update3 package will still be able to read scenes from the new server, as we make sure the old Unity code stays compatible.
Note as well that you need to use SafeBase64-encoded strings everywhere. I saw in your description that you are using standard Base64 encoding (not the safe variant). The new server will be stricter about parameters and format, so I encourage you to test your scripts/code against the beta server.
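For example, in Python the difference between standard and URL-safe Base64 looks like this (the URN below is made up, and whether the trailing '=' padding must also be stripped is an assumption to verify against the server):

import base64

# Made-up URN for illustration only
urn = b"urn:adsk.objects:os.object:my-bucket/my-model.ifc"

standard = base64.b64encode(urn).decode()        # may contain '+', '/' and '=' padding
safe = base64.urlsafe_b64encode(urn).decode()    # uses '-' and '_' instead of '+' and '/'
safe_no_pad = safe.rstrip("=")                   # padding stripped; confirm whether the server expects this

print(standard)
print(safe_no_pad)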
Last, I am working on a new Unity code update and documentation, which will be released next week. Make sure to use that version - it adds support for 3-legged auth, automatic 2/3-legged token refresh, and more. If you have scenes failing, please contact me directly and share your models and URNs. I'll either test them in my development environment or look into our log files for the reason they fail. My email address is my first name at autodesk.com
Thank you Cyrille for your help!! I am replying to you here as it's easier to insert images.
I replaced the function 'xbase64encode()' with 'xbase64safeencode()', and now it works! However, it seems that for some models the server still responds with an error, and in that case the model cannot be loaded in Unity (as in the image below). I checked the script and I think all the encoding is using SafeBase64. Any clue about that? Or is it caused by my model?
BTW, the loading performance is greatly improved compared to the legacy version!! It looks almost the same as the web client. Huge thanks for that!
Good to know that there is going to be an update next week. Yes, I will test it and get back to you later.
What are the few things that I'll have to include in my code to point me in the right direction?
For example, this website:
Open your browser's debugger on the Network tab and observe what requests are made when the site loads dynamic content (when you click). You'll see it's getting all the data using some API, for example: https://www.bestfightodds.com/api?f=ggd&b=3&m=16001&p=2
You can download all the data by changing the parameters in this URL.
Usually that's enough, but here it's trickier, as the data returned by the server is encoded in some way and not easily readable. You'd have to debug the site's JavaScript to find the function that decodes this data before you can parse it.
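For example, a minimal Python sketch that replays that request (the parameters are copied from the observed URL; the response will still be in the site's encoded format, so this only gets you the raw payload):

import requests

url = "https://www.bestfightodds.com/api"
params = {"f": "ggd", "b": 3, "m": 16001, "p": 2}   # parameters taken from the URL seen in the Network tab
headers = {"User-Agent": "Mozilla/5.0"}             # some sites reject requests without a browser-like UA

resp = requests.get(url, params=params, headers=headers)
print(resp.status_code)
print(resp.text[:500])   # still encoded; you'd have to replicate the site's JS decoder to parse it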
I've been wondering how to fetch the PlayStation server status. They display it on this page:
https://status.playstation.com/en-us/
But PlayStation is known to use APIs instead of PHP database fetches. After looking around in the source code of the site, I found that they have a separate file called /data.json.
https://status.playstation.com/en-us/data.json
The content of this file is the same as the index file (for some reason). They use stuff like {{endDateTitle}} and {{message}}, but I can't find where it's defined, whether it's pulled from a separate file or just pulled from a database using PHP.
How can I "reverse" this site and see if there's an API I can use to display the status on my site?
Maybe I did not get the question right, but it seems pretty straightforward.
If you're using Firefox, open Developer Tools, go to the Network tab, and reload the page.
You can clearly see the requested URL:
https://status.playstation.com/data/statuses/region/SCEA.json
It seems that an empty list as a status means "No problems" (since there are no problems at the moment, I cannot verify this assumption). That's all.
The {{ }} braces are used by various HTML templating languages, like Angular, so you'd have to go through the JS code to understand where they get updated.
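As a rough Python sketch (the key name below is a guess based on the empty-list observation; inspect the actual payload in the Network tab to confirm the structure):

import requests

url = "https://status.playstation.com/data/statuses/region/SCEA.json"
data = requests.get(url).json()

statuses = data.get("status", [])   # assumed key and shape - check the real response
if not statuses:
    print("All services appear to be up")
else:
    for entry in statuses:
        print(entry)                # print whatever outage details the payload contains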
I am at this website -
http://www.zoominfo.com/s/#!search/company/1.64.eyJjb21wYW55TmFtZSI6xIB2YWx1xIw6ImEiLCJpc1VzZWTEjXRyxJN9fQ%3D%3D
If you look at the company name - Agilent Technologies Inc. -
it's neither in the page source nor in any JSON format.
But it does show up in the DOM in the Chrome Developer Tools.
I have looked at and analysed almost every request it sends, but still couldn't find where this data is saved.
By 'where the data is saved' I mean: where can I scrape that data from?
Ideally using python-requests and BeautifulSoup.
I do see an XMLHttpRequest being made; I'm not sure what that means, or whether that is the clue to my answer.
I am still learning python, and it would be a very useful information if someone helps me with this.
Thanks in advance.
After the HTML is loaded, JavaScript requests the data through an XMLHttpRequest, and the result is injected into the page as soon as the response is received on your client. That's why you see the DOM element there when using the element inspector.
You didn't mention what goal you want to achieve or what tool you are using. Please be specific in your question. If you have no idea about this kind of pattern, google AngularJS and look at some examples.
do see an XMLHTTPREQUEST made, not sure what that means, or if that is the clue to my answer.
It means that JavaScript embedded in the page is sending an extra HTTP request to the web server. It is likely that the "Agilent Technologies Inc." text is being returned in the server's response to that request, and the JavaScript in the page is then injecting the text into the DOM in the appropriate place.
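Since the question mentions python-requests, the usual pattern is a sketch like this: replay the XMLHttpRequest you found in DevTools rather than parsing the rendered page. The URL and headers below are placeholders, not the site's real endpoint - copy them from the actual request you observed:

import requests

xhr_url = "https://www.zoominfo.com/path/seen/in/devtools"   # hypothetical placeholder endpoint
headers = {
    "User-Agent": "Mozilla/5.0",
    "X-Requested-With": "XMLHttpRequest",   # many sites expect the same headers the browser sent
}

resp = requests.get(xhr_url, headers=headers)
# If the response is JSON, parse it directly; BeautifulSoup is only needed if it returns HTML.
if "json" in resp.headers.get("Content-Type", ""):
    print(resp.json())
else:
    print(resp.text[:500])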
Where is the Data stored on Website
That is a completely different question ...
(You have already noted that the data (e.g. the company name) gets injected into the page displayed by your browser.)
On the server side, the data could be stored in the web server (or its back-end systems) in a variety of ways. Or it might not be stored at all. There is no way of knowing ... without looking at the server-side code and configurations.
I have a drupal site that is being used strictly as a CMS that produces JSON feeds using services and services_views, which are consumed by a separate site. What I would like to do (and I have a working proof of concept of this) is allow for a "live preview" on the real site, by intercepting the node form preview / submit, encoding the node as JSON, and loading a special page on the live site that consumes that JSON and displays the page accordingly.
The problem with this JSON-ized node is that it's different from the JSON being produced by my view (using services_views). My end goal is to produce JSON that is identical for both previewed and non-previewed objects, without having to maintain separate output methods (I could easily hand-customize the JSON, but then when my view for the public API changes I'd have to make the same changes to the preview JSON; I'm trying to avoid this).
I'm looking for feedback on this approach. Is what I'm attempting even possible? The ideas I've been able to come up with so far are:
being able to (conditionally) drive my view with data from a non-database source
sneakily inserting data into the view object during one of the stages of execution? Kludgy but I'm not above that :)
saving a "clone" node (or revision?) of the node being previewed and let the view use that to display the preview JSON?
Maybe this is the wrong approach altogether and there's something better? (Trying to intercept and format the services output in my module... maybe avoid services_views altogether?)
If anyone can offer some advice, insight or opinions on how to best proceed here, I'd be really grateful.
In a custom module, you could set up a page that grabs the JSON output from the view page.
$JSON = file_get_contents($url);
that way the preview stays bound to the view, even if the view changes.
First, I think what you are trying to achieve is not an easy task, so before anything else, good luck.
I think you could intercept the node submission data, then create a node programmatically, then render that node, and then export the rendered node to JSON. Immediately after you get the JSON, delete the node, because the programmatically created node is only for preview.
This approach could be more CPU-demanding, but keep in mind that previewing content exactly as it will look is difficult.
The RSS feeds that your site reads could be filtered with some parameter to exclude programmatically created nodes (preview nodes), even though these nodes will only be available for a very short time.