Posting image to Facebook album with AS3 API

I'm having trouble with posting an image from my canvas application to the user's albums. According to the Facebook docs:
In order to publish a photo to a user’s album, you must have the publish_stream permission. With that granted, you can upload a photo by issuing an HTTP POST request with the photo content and an optional description to one of these two Graph API connections:
https://graph.facebook.com/USER_ID/photos - The photo will be published to an album created for your app. We automatically create an album for your app if it does not already exist. All photos uploaded this way will then be added to this same album.
https://graph.facebook.com/ALBUM_ID/photos - The photo will be published to a specific, existing photo album, represented by the ALBUM_ID.
So, going by point one, if I upload an image like this...
Facebook.api("me/photos",imagePostCallback,{message:"",image:myImageBitmap,fileName:''},URLRequestMethod.POST);
...then I can expect it to place my image in an album named for my app, which it will create if necessary?
Not so.
What actually happens when the album doesn't exist is that the uploaded image is pushed into any other handy albums that exist, which are usually for (and created by) other applications. This is a bit of a pain.
So far I've tried the following:
Disabling sandbox mode. I had thought that the app might be unable to create new albums because it was in sandbox mode; however, disabling sandbox mode made no difference, and I can create albums directly even with it enabled.
Checking for the existence of my album and creating it if necessary. I can check for my album and create it if it does not exist, but I cannot then upload an image because the POST call to Facebook.api to upload the image will fail if it is not called as a direct result of a user interaction.
And so now I'm a bit stumped. Obviously I can't risk my app posting images to a competitor's album, but at the moment the only alternative I can see involves effectively making the user submit their image twice whenever an album has to be created. Any ideas?

I'm guessing you need the access_token in your params :) When posting something to a user's Facebook you always need it (it's not always necessary when just reading information). The way to get the access token is shown below:
public function post():void
{
    var _params:Object = new Object();
    _params.access_token = Facebook.getSession().accessToken;
    _params.message = "";
    _params.image = myImageBitmap;
    _params.fileName = "";

    Facebook.api("me/photos", imagePostCallback, _params, URLRequestMethod.POST);
}
Also make sure that you request the right permissions (such as publish_stream) when your app asks the user for them.
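For example, with the AS3 SDK the extended permission is usually requested at login time; a rough, untested sketch (depending on the SDK version the option may be named perms or scope):

// Ask for publish_stream when logging the user in (option name may differ by SDK version).
Facebook.login(onLogin, {perms: "publish_stream"});

function onLogin(result:Object, fail:Object):void
{
    // result is non-null once the user has logged in and granted the permission.
}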
EDIT
OK, so I missed your edit a bit there ;) It should be possible to create your own album. Take a look at this PHP code for the Graph API; it should be straightforward to port to AS3.
http://developers.facebook.com/blog/post/498/
EDIT2
OK, I've done some more digging (it seemed interesting to know). This should actually work when using the Graph API.
Facebook.api('/me/albums', albumCreateCallback, {name: 'name of the album', message: 'description of the album'}, URLRequestMethod.POST);
When you then make another API call inside albumCreateCallback to upload your image, it should end up in that album (according to what I've found).
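Putting the two calls together, a rough AS3 sketch (untested, and assuming the usual (result, fail) callback signature of the AS3 SDK; the album name is a placeholder):

private function createAlbumAndUpload():void
{
    // Step 1: create (or reuse) an album for the app.
    Facebook.api("/me/albums", albumCreateCallback,
        {name: "My App Album", message: "Photos posted by My App"},
        URLRequestMethod.POST);
}

private function albumCreateCallback(result:Object, fail:Object):void
{
    if (result && result.id)
    {
        // Step 2: post the photo straight into the new album by its id.
        Facebook.api("/" + result.id + "/photos", imagePostCallback,
            {message: "", image: myImageBitmap, fileName: "photo.png"},
            URLRequestMethod.POST);
    }
}

Whether the second POST survives the "must be triggered by user interaction" restriction mentioned in the question is something you'd have to verify.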

Related

Added request description | AssertionError: expected [] to have a length above 0 but got 0?

How can I solve this issue in Postman? I tried the documentation but couldn't find any useful resource. Please help.
This error occurs in the Student Expert training, when verifying whether you have completed all of the tasks that were requested. Specifically, it relates to this element of the Get specific player request:
Before you continue with scripting, add a description to this
request—the description will appear within the collection
documentation, which you would use if you were e.g. publishing an API
for public use. In the Postman app, at the top of this request tab, to
the left of the request name, expand and click to edit. Add a
short description of the request (you can use markdown) and click
Save. If you're using the web version, use the little documentation icon to the right of the request.
The test is looking at the Get specific player endpoint and checking whether a description has been set: expected [description] to have a length above 0 but got 0. If you add a description, save, and regenerate your public collection share link, the test will pass.
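For reference, the failing assertion is just a standard Chai length check; the training's verification presumably does something along these lines (a hypothetical sketch, not the actual script, with requestDescription standing in for however the script resolves the description):

pm.test("'Get specific player' request has a description", function () {
    // requestDescription is a placeholder for the value the verification
    // script extracts from the collection documentation.
    pm.expect(requestDescription).to.have.length.above(0);
});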

How can I display images on a web page via SugarCRM 7 API

I am trying to dynamically retrieve images from SugarCRM to display on a website. When I am logged in, the images display fine. When I am logged out, I am denied access to the images, so the wider public cannot see them.
How can I make sure an authenticated request is sent from the web page without exposing my username and password? Or, alternatively, how else can I display the images on the page?
I had to do something like this earlier in the year. I can't provide all of the code, but my idea was to create a separate Entry Point that did not require authorization. From that Entry Point file I essentially spoofed authentication and called the normal download.php Entry Point. It went something like this (keep in mind this code was invoked by hitting index.php?module=MyModule&entryPoint=MyEntryPoint):
// Forward the request to the standard "download" entry point
unset($_REQUEST);
$_REQUEST['entryPoint'] = 'download';
$_REQUEST['id'] = $focus->$field;          // id stored in the bean's image field
$_REQUEST['type'] = 'SugarFieldImage';
$_REQUEST['isTempFile'] = '1';
$_SESSION['authenticated_user_id'] = '1';  // spoof an authenticated session
require_once('download.php');
One caveat I found there was that I needed to check first for an existing session before setting $_SESSION['authenticated_user_id'], otherwise an actual Sugar user who used the website would go back to Sugar and find that his/her session had been escalated to an Admin account(!). So, I added a check before setting it that way, and code to re-set it back to the original value. Something like this:
if (!empty($_SESSION['authenticated_user_id'])) {
    $old_session_id = $_SESSION['authenticated_user_id'];
}
$_SESSION['authenticated_user_id'] = '1';
require_once('download.php');
if (isset($old_session_id)) {
    $_SESSION['authenticated_user_id'] = $old_session_id;
}
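For completeness, in Sugar the custom entry point itself is typically registered through the Extension framework with authentication disabled; a minimal sketch (the entry-point name and file paths are illustrative) might look like this:

<?php
// custom/Extension/application/Ext/EntryPointRegistry/MyEntryPoint.php
// 'auth' => false allows the entry point to be hit without a logged-in session.
$entry_point_registry['MyEntryPoint'] = array(
    'file' => 'custom/modules/MyModule/MyEntryPoint.php',
    'auth' => false,
);

After adding it, run a Quick Repair and Rebuild so the registry extension is picked up.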

Permanent links to thumbnails in Google Drive API

I'm using Google Drive API (PHP) to upload some photos to my Drive. When a file is uploaded, a Google_DriveFile object is returned in the response to confirm the successful transfer. It includes a field called thumbnailLink, accessible through the getThumbnailLink getter. Its content may look like this:
https://lh4.googleusercontent.com/dqVdU195R4_0ZtWxsJlhW1Fr2K30xa2hH3V1KV4UrTBl9QkhOSR0ZqN9HoB-TjEQv8SIJw=s220
Until today, I was sure that the link doesn't change by itself over time. However, when I tried to display a thumbnail of a photo I have on my Drive, using a cached address I keep in my local database, I got a 403 error - you can see it under the mentioned link. I asked the API for the current link to the thumbnail and it's now completely different.
It happened to me only once but for multiple files, i.e. all the files I had on my Drive suddenly got new thumbnail links.
Is there a way to quickly retrieve a thumbnail of a document (preferably, a photo) by some constant value or to be sure that it won't change? The perfect solution would be to access the thumbnail under a link that includes the document's id instead of some hash that may change.
Try this:
https://drive.google.com/thumbnail?authuser=0&sz=w320&id=[fileid]
Where:
sz is the size: use w for width (e.g. w320) or h for height (e.g. h220).
fileid is the file's ID. You can find it via the "Share" option when you right-click the file in the Google Drive UI.
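So a cached page could, for example, reference a thumbnail directly like this (FILE_ID is a placeholder for the actual file id):

<img src="https://drive.google.com/thumbnail?sz=w320&id=FILE_ID" alt="Drive thumbnail">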
I have gone through the API documentation, which says:
Important: Thumbnails are invalidated each time the content of the file changes. When supplying thumbnails, it is important to upload new thumbnails each time the content is modified.
According to this, a new thumbnail is only generated when the contents of the file are modified. In your case it is a really weird situation: the contents have not changed, but the thumbnails have. The documentation offers no batch process for this, but there is another way around it, i.e. a web hook.
According to the documentation there is a web hook available, i.e. the Files: watch process, through which you can track the changes made to a file. Every time the contents change, the hook fires and you can refresh the cached thumbnail link.
The following HTTP request sets up a watch on a file:
POST https://www.googleapis.com/drive/v2/files/fileId/watch
Here fileId is the ID of the file to watch (the ID you received when the file was uploaded).
In the request body, supply data with the following structure:
id (string): a UUID or similar unique string that identifies this channel.
token (string, optional): an arbitrary string delivered to the target address with each notification delivered over this channel.
expiration (long, optional): date and time of notification channel expiration, expressed as a Unix timestamp, in milliseconds.
type (string): the type of delivery mechanism used for this channel. The only option is web_hook.
address (string): the address where notifications are delivered for this channel.
If the contents change, a new thumbnail is generated and the hook will notify your address; you can then fetch the new thumbnail link.
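For example, a watch request body might look like this (the channel id, token, address, and expiration are placeholders):

POST https://www.googleapis.com/drive/v2/files/FILE_ID/watch

{
  "id": "01234567-89ab-cdef-0123-456789abcdef",
  "type": "web_hook",
  "address": "https://example.com/drive-thumbnail-hook",
  "token": "thumbnail-cache",
  "expiration": 1426325213000
}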
Here is another solution. Let's say we store only the Google Drive ID of the images or PDFs (Google generates thumbnails for many file types).
We can send a request to Drive to get a valid thumbnail link, since thumbnails appear to expire even when the file has not changed.
In this case each thumbnail lives inside an Angular component. If you use something else, you can build an array of links and iterate through it to produce valid thumbnail links.
Here is the code:
const thumb = () => {
  if (this.item.DriveId) {
    this.getThumb(this.item.DriveId, this.authToken)
      .then(response => {
        console.log(`response from service ${response}`);
        // Set thumbnail width to 300px or any other width if needed
        this.item.externalThumbnailId = response.slice(0, -3) + 300;
      })
      // Here we can handle the case where the API limit of 10 requests per second is exceeded
      .catch(e => {
        if (e.data.error.message == 'User Rate Limit Exceeded') {
          console.log('Failed to load thumb, trying one more time');
          setTimeout(thumb, 1000);
        } else {
          console.log(e);
        }
      });
  }
};

// Call this function on component load.
thumb();
Another option would be to write a backend script that refreshes the thumbnail links stored in your database records.
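A minimal sketch of that idea with the PHP client from the question (assuming $service is an authorized Google_DriveService, $db is a PDO connection, and the table and column names are placeholders):

<?php
// Refresh the cached thumbnail link for every stored Drive file id.
foreach ($db->query('SELECT id, drive_file_id FROM photos') as $row) {
    $file = $service->files->get($row['drive_file_id']); // returns a Google_DriveFile
    $stmt = $db->prepare('UPDATE photos SET thumbnail_link = ? WHERE id = ?');
    $stmt->execute(array($file->getThumbnailLink(), $row['id']));
}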

Oracle APEX - HTML Links Breaks Session and Requires New Login

Ok so here is what is happening:
I have a client that I am building an application for. My client has a flowchart that they would like posted on the front page of their application. Check. My client then wants this flowchart to be set up as an image map so that a user could click one of the boxes in this flowchart and be taken to a report in another part of the application. Check.
All of that is elementary and, in a technical sense, works. The issue, and it is one I have encountered before with APEX, is that every time a user clicks one of these links it takes them to the login screen. It seems that linking directly to a page's URL breaks the session and requires you to log in again, even if you are linking from one page in the application to another in the same application.
I have played with all of the authentication settings in the hope of fixing this and tried to determine what exactly is breaking the session, but with no luck.
Has anyone else had this problem and could share their method for fixing it? I really can't have users logging in every time they click a link, and I also cannot simply remove the authentication on the pages. Thanks in advance.
You should pass on the session ID in your links. If you don't, then APEX will see each click as a new session. You can tell from the URL: take note of the session ID in the URL when you are on your image map, then follow one of the links and take another look at the session ID part of the URL. If they are different, then you are starting a new session each time.
/apex/f?p=190:90:1674713700462259:::::
190 -> application id
90 -> page id
1674713700462259 -> Session id
To pass on the session, it depends where you construct your links.
In PLSQL, you can find it through :SESSION or :APP_SESSION
For example, in a plsql dynamic region: htp.p('the session id is '||:SESSION);
In javascript code you can use $v("pInstance") to retrieve the value dynamically, or use &APP_SESSION. which will have the value substituted at runtime.
Small example:
function printsome(){
    var d = $("<div></div>");
    d.text('&APP_SESSION. = ' + $v("pInstance"));
    $("body").append(d);
}
So you probably just need to alter the construction of your link somewhat to include the session!
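For an image map, that could look something like this (a sketch only: the page number and coordinates are placeholders, #WORKSPACE_IMAGES# assumes the flowchart is stored as a workspace image, and &APP_ID./&APP_SESSION. are substituted when APEX renders the region):

<img src="#WORKSPACE_IMAGES#flowchart.png" usemap="#flowchart" alt="Process flowchart">
<map name="flowchart">
  <!-- f?p=APP:PAGE:SESSION:... keeps the current session when the user clicks through -->
  <area shape="rect" coords="10,10,120,60" alt="Extension Detail report"
        href="f?p=&APP_ID.:90:&APP_SESSION.:::::">
</map>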
I assumed the bind variables would do the job, but they didn't help.
The best way is to pass the current session ID to an item and then use the item value in the link:
f?p=&APP_ID.:32:&P31_SESSION.:::P32_CUSTOMER_ID:#CUSTOMER_ID#

Retrieving information from a web page

My application is meant to speed up the retrieval of phone call information from our telephone system.
The best way to get this information is to create a new search on the telephone system's web interface and export the results to an Excel spreadsheet which my application then imports into a DataSet.
To get the export, from the login screen, the process goes as follows:
Log in
Navigate to Reports Page
Click "Extension Detail" link
Select "Extensions" CheckBox
Select the extensions (typically all the ones currently being used) from the listbox
Specify date range
Click on Export button
It's not a big job to do it manually every day, but, for reliability, it would be great if I can make my application do this automatically the first time it starts every day.
Since more than 1 person in the company is going to use this application, having a Windows Service do it would be even better.
I don't know if it'll help, but the system is Datatex Topaz Next Generation telephone management system: http://www.datatex.co.za/downloads/index.html#TNG
Can anyone give me a basic idea how to do this?
Also, can anyone post links (in comments if need be) to pages where I can learn more about how to do this?
I have done something similar to fetch info from a website. I cannot give you an exact answer, but the idea is to send the login info to the page with form values. If the site relies on cookies, you can use this cookie-aware WebClient:
using System;
using System.Net;

public class CookieAwareWebClient : WebClient
{
    private CookieContainer cookieContainer = new CookieContainer();

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        if (request is HttpWebRequest)
        {
            (request as HttpWebRequest).CookieContainer = cookieContainer;
        }
        return request;
    }
}
You should be aware that some sites rely on a session id being passed so the first thing I did was to fetch the session id from the page:
var client = new CookieAwareWebClient();
client.Encoding = Encoding.UTF8;
var indexHtml = client.DownloadString(*index page url*);
string sessionID = fetchSessionID(indexHtml);
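fetchSessionID is not shown above; a hypothetical implementation could simply pull the value out of the HTML with a regular expression, for example (the field name "sessionid" must match what the page actually uses):

using System.Text.RegularExpressions;

// Hypothetical helper: extracts a hidden "sessionid" input value from the page HTML.
static string fetchSessionID(string html)
{
    Match m = Regex.Match(html, "name=\"sessionid\"\\s+value=\"([^\"]+)\"");
    return m.Success ? m.Groups[1].Value : string.Empty;
}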
Then I had to log in, which you can do by posting form values to the login page. You can see the specific form elements with "view source", but you have to know a little HTML to do so.
var values = new NameValueCollection();
values.Add("sessionid", sessionID); //Fetched session id
values.Add("brugerid", args[0]); //Username in my case
values.Add("adgangskode", args[1]); //Password in my case
values.Add("login", "Login"); //The login button
//Logging in
client.UploadValues(*url to login*, values); //If all goes perfect, I'm logged in now
And then I could download the page I needed. In your case you may use DownloadFile(...) if the file always has the same URL (something like Export.aspx?From=2010-10-10&To=2010-11-11), or UploadValues(...) where you specify the values as before but save the result.
string html = client.DownloadString(*url*);
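In your case the final step might look something like this (the URL and field names are placeholders; use Fiddler to see what the real export form posts):

// Hypothetical export request: the URL and field names must match the
// telephone system's actual export form.
var exportValues = new NameValueCollection();
exportValues.Add("From", "2010-10-10");
exportValues.Add("To", "2010-11-11");

byte[] excelBytes = client.UploadValues("http://phone-system/Reports/Export.aspx", exportValues);
File.WriteAllBytes("extension_detail.xls", excelBytes);  // requires System.IO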
It seems you have a lot more steps than I did, but the principle is the same. To see what values you send to the site when logging in etc., you can use a program such as Fiddler (Windows), which can capture the HTTP activity. Essentially you just do exactly the same thing as the browser, but watch out for things like the session ID, which is temporary.
The best approach is really to use some native way to fetch the data, but if you don't have access to the code, database, etc., you have to do it the ugly way. You might also need an HTML parser to extract the data (though in this case you don't, because you export to a file). And last but not least, keep in mind that pages can change, so there is plenty of potential for the login or the parsing to fail.
Please ask if you are uncertain about what is going on.
ADDITION
The CookieAwareWebClient is not my code:
http://code.google.com/p/gardens/source/browse/Montrics/Physical.MyPyramid/CookieAwareWebClient.cs?r=26
Using CookieContainer with WebClient class
I also found some relevant threads:
What's a good tool to screen-scrape with Javascript support?
http://forums.asp.net/t/1475637.aspx
With an HTTP client, you need to do the following:
Log in, using cookies or HTTP authentication
Request a page
Submit form data
This means that you need some class or component in your program that can handle HTTP, cookies, authentication and forms. With this, you make the same requests a user's browser would.
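As a rough sketch of that idea using the framework's HttpClient (a more modern alternative to the WebClient shown above; the URLs and form field names are placeholders):

using System;
using System.Collections.Generic;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class PhoneSystemClient
{
    static async Task Main()
    {
        var handler = new HttpClientHandler { CookieContainer = new CookieContainer() };
        using var client = new HttpClient(handler);

        // 1. Log in: post the login form; the cookie container keeps the session.
        var login = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["username"] = "user",      // placeholder field names
            ["password"] = "secret"
        });
        await client.PostAsync("http://phone-system/login", login);

        // 2. Request pages or submit further form data with the same client.
        string reportsPage = await client.GetStringAsync("http://phone-system/reports");
        Console.WriteLine(reportsPage.Length);
    }
}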