Chatfuel not clearing user input cache - facebook-chatbot

I'm building a flight search chatbot using Chatfuel, FlowXO and Google Sheets.
When I ran the chat for the first time, it worked perfectly, but when I ran it again it only displayed the search results containing the input from the first run, not from the second/third/fourth runs. However, if I wait for some time, 15-20 minutes, the bot works as it should again. I think the problem may be that Chatfuel is not clearing its cache or not processing the JSON answers fast enough.
Has anybody encountered this issue and/or know how to solve it?
Dan.

I solved the issue. Instead of displaying the JSON results in the block, I added another step: the JSON results are now stored as user attributes, and the attribute is displayed in the block.
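For anyone hitting the same thing, here is a rough sketch of what that extra step's response can look like if the backend is a Node/Express endpoint; the route and the attribute name search_results are just placeholders, and the shape assumes Chatfuel's JSON API format for set_attributes and messages:

// Hypothetical Express endpoint; the response stores the search result in a
// user attribute instead of rendering it directly as message text.
const express = require('express');
const app = express();

app.get('/flight-search', (req, res) => {
  res.json({
    // save the result as a user attribute...
    set_attributes: { search_results: 'TLV -> LHR, 3 flights found' },
    // ...and let the block display it later via {{search_results}}
    messages: [{ text: 'Your search is ready.' }]
  });
});

app.listen(3000);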

Related

Undetectable memory leak in vuejs application

I've been trying to resolve this problem for a while now; I even took a shot at rewriting the entire program, but without success. The application runs on VueJS 2.3.3 and is supposed to run on Chromium on a Raspberry Pi (irrelevant information, for now).
We're working with several components that are included in a single file; later on, this file is compiled using either gulp or npm run dev. When the VueJS instance initializes, a request is sent using Vue Resource's $http option. This receives a JSON response of around 30 MB, which is saved in the data array, as seen here:
this.$http.get('<url>' + this.token)
  .then((response) => {
    this.properties = response.properties;
  });
This data will later be used for further actions. Another thing worth mentioning is that the data is refreshed every once in a while, which is where I think the problem occurs: if I'm not refreshing the data every 5 minutes (it can be longer too, it really depends on how I'm testing), the program runs fine. It's just that I want to refresh the data every once in a while to retrieve new information. The way I'm setting the timeout is as follows:
this.dataTimeout = setTimeout(this.refreshData, 300000);
Each (so-called) property has at least 6 base64 images saved in its JSON, which are later presented to the user. Besides that, there is a name, an address, and some other tiny bits of data. It doesn't sound all that wrong, but I get the feeling that each response makes the memory grow so much that even a desktop has trouble running it.
Every 10 seconds a new property is presented on the user's screen, including the images, street, location, etc. I'm not sure if there is a memory leak in my code or if I'm forgetting something. A few questions pop up in my head:
Do I need to reset the response I'm getting from the server back to null or undefined?
Could there be a leak in one of the plugins I'm using (Just VueResources)?
For how long can a VueJS instance stay alive, is there any recommended time to reload the entire program?
What things should I take into consideration in order to achieve this at all?
I've taken out the data renewal and put a demo project online; this can be seen right here. The main problem I'm having is that the browser just runs out of memory and shows us the amazing "Aw, Snap!" page from Chrome. I tried taking snapshots of the memory usage, but it all seems fine; it just explodes randomly after a while.
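For context, the refresh flow looks roughly like this; apart from $http, properties, dataTimeout and the 5-minute interval shown above, the structure here is an assumption, not the exact code:

// Rough sketch of the refresh loop (component layout and the clearTimeout
// guard are assumptions; the $http call mirrors the snippet above).
new Vue({
  el: '#app',
  data: { token: '', properties: [], dataTimeout: null },
  created() {
    this.refreshData();
  },
  methods: {
    refreshData() {
      clearTimeout(this.dataTimeout); // keep a single pending timer
      this.$http.get('<url>' + this.token).then((response) => {
        this.properties = response.properties; // as in the snippet above
        this.dataTimeout = setTimeout(this.refreshData, 300000);
      });
    }
  }
});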
Well, I don't know what your app really does, but is your 30 MB of data really useful/essential? In JSON, moreover?
Maybe you don't need all this data, and you could just adapt the data to your needs. For example, keep your JSON store data, and retrieve your base64 images another way.
I don't understand why you store images in memory. Images are only useful for display purposes, in my opinion.
So I think 30 MB is really huge. But maybe I'm wrong?
By the way, I've tested with Firefox Nightly and there is no problem here; it doesn't seem to crash. Maybe I don't hit the refresh call?
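For example, here is a rough sketch of what "retrieving the images another way" could look like: keep only URLs in the store and let the browser load the files itself (the endpoint and field names are assumptions):

// Sketch: fetch lightweight metadata only; the browser loads each image
// by URL when it is actually displayed (endpoint and fields are placeholders).
this.$http.get('/api/properties?token=' + this.token)
  .then((response) => {
    this.properties = response.body.map((p) => ({
      name: p.name,
      address: p.address,
      imageUrls: p.imageUrls // e.g. ['/img/123-1.jpg', ...], not base64 blobs
    }));
  });

An <img> bound to each URL then lets Chromium fetch and cache the files on demand, instead of the Vue instance holding 30 MB of base64 strings in memory.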

Make an item display only for certain period of time on the webpage

Hi guys, I am building a task distribution management system for my team, in which I want to add the following functionality:
When I create a task, I will have an option to choose "how long this task is valid to be taken". For example, when creating the task I put "2 hours" in the
<input id="valid-for">
, then this task will only be displayed on the dashboard for 2 hours from the time it was created, and after those 2 hours -> "display: none".
I've searched the web for a way to achieve this feature but didn't get a satisfying answer, probably because I don't know the right terminology to google. I tried to use AJAX and a TIMESTAMP-type column in MySQL but didn't know how to proceed. Could anybody tell me how to achieve this with MySQL, jQuery, or any other technique that could do the job? No code necessary, I just need some explanation.
Thanks guys!
Without knowing any more details, here is how I would consider writing the code:
In the database, have a start time and a use-by time.
In your browser page, you can run a script periodically, say every minute (this is called polling). In this case, you can use Ajax to call back to the server for updates.
At the server end, check for new tasks as well as expired tasks. Then send the results back to the Ajax caller.
Back at the browser, update the dashboard accordingly.
I would be inclined to remove the task on the browser rather than simply hide it.
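A minimal sketch of the browser side with jQuery (the /tasks/active endpoint and the response shape are made up; on the server, the query would filter with something like WHERE valid_until > NOW() in MySQL):

// Poll the server once a minute and rebuild the dashboard list.
function refreshTasks() {
  $.getJSON('/tasks/active', function (tasks) {
    var $list = $('#dashboard').empty();
    tasks.forEach(function (task) {
      // only tasks whose use-by time has not passed are returned
      $list.append($('<li>').text(task.title));
    });
  });
}

refreshTasks();
setInterval(refreshTasks, 60 * 1000); // poll every minute

Because the server decides what is still valid, an expired task disappears for every viewer at the same time, rather than relying on each browser's clock.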

Django - Dojo/Dgrid - how to manage LARGE data sets

6.30.15 - How can I make this question better and more helpful to others? Feedback would be helpful. Thanks!
I am developing a web application that will handle/manage a very large data set. Currently, any kind of heavy load causes the browser to lock up, whether I'm in the Django REST Framework API or in the Dojo/dgrid. This is kind of a dual question.
I've researched and can't find a clear way to do this on either side.
How do I limit how much data the database sends at one time to the Django REST Framework and/or the Dojo dgrid? The dgrid pulls the data from the Django REST API, and DRF pulls data directly from the MySQL database.
If I can control how much data is sent at one time, then hopefully it won't lock up the browser as much. ANY suggestions, advice, help, examples would be very helpful. Thanks in advance!
UPDATED 6.22.15 -
Alright, I FINALLY got the pagination to work and it displays the limit/offset in the headers. :) YAY!!!! I can also see the data in the response headers. HOWEVER... the grid won't populate and I keep getting this odd error:
TypeError: transform(...) is null
return transform(value, key).toString();
instrum...tion.js (line 20)
I've gotten this error before, but I've never been able to find a solution to it. After researching, there's not much I can find on HOW to fix or really even what it is. Any help with this would be greatly appreciated!! I'm SO CLOSE to getting this thing to work correctly after WEEKS and WEEKS of beating my head against a wall. Please help! :) Thanks in advance!!!
2nd Update - This was an answer from a previous post, but I'm still not sure how to fix it. When I addressed another issue it went away for a while, but I still have no idea how to correct it.
Problem 3: "transform(...) is null return transform(value, key).toString();"
This sounds largely tangential to the original issue, but the most common cause is a widget template that is referencing a property via ${...} that doesn't actually exist in the widget.
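To illustrate what that quoted answer is describing, the error text comes from dojo/string.substitute, which template substitution relies on; a sketch of the failure mode (the template and data here are made up) is that a ${...} placeholder resolving to null (or to a property that doesn't exist, which fails the same way with "undefined" in the message) makes transform(value, key).toString() throw:

// Sketch only: the ${subtitle} placeholder resolves to null, so the
// default transform returns null and .toString() throws
// "TypeError: transform(...) is null" inside dojo/string.substitute.
require(['dojo/string'], function (string) {
  var data = { title: 'Hello', subtitle: null };
  string.substitute('<h1>${title}</h1><p>${subtitle}</p>', data);
});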
I don't know how to answer this on the layer between DRF and the database, but as discussed in other SO questions like this one, DRF allows you to limit the amount of data sent with requests via page or offset/limit parameters.
Based on the phrasing of your question, it sounds like the client side is actually requesting too much data. I'll outline how the flow should work, so hopefully you can spot what you've missed:
A dgrid instance is set up with a collection referencing a dstore/Rest instance
The dstore/Rest instance is created with appropriate properties set. In this case, based on the DRF Documentation:
useRangeHeaders: false (this is already the default)
rangeStartParam: 'offset'
rangeCountParam: 'limit'
As a result, when the grid renders, you should see requests sent to your server e.g. endpoint?offset=0&limit=25 - if you don't see those two parameters, that would be why you're getting too much data
The server will need to query the database with the respective OFFSET and LIMIT
The server must provide a response with the expected number of items (except if it reaches the end of the data set first, which should be reflected by the total property in the response, presuming the customization in the previous SO answer I linked is used)
Ultimately, if the service is working as expected, the grid should only be requesting a handful of items at a time, and should only be firing one or two requests at any given time.
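Putting the list above together, the client-side wiring would look roughly like this (a sketch; the endpoint URL, the target node id and the columns are placeholders):

// dstore/Rest configured to send offset/limit query parameters, feeding an
// OnDemandGrid that requests a page of items at a time.
require(['dstore/Rest', 'dgrid/OnDemandGrid'], function (Rest, OnDemandGrid) {
  var store = new Rest({
    target: '/api/items/',
    useRangeHeaders: false,    // default: send the range as query parameters
    rangeStartParam: 'offset', // matches DRF's limit/offset pagination
    rangeCountParam: 'limit'
  });

  var grid = new OnDemandGrid({
    collection: store,
    columns: { id: 'ID', name: 'Name' } // placeholder columns
  }, 'grid');
  grid.startup();
});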
Would add as a comment, but not enough reputation at the moment ....
Your question is pretty general, but one strategy would be to allow the user to select the number of items they wish to view at a time and then allow the user to page through the data with 'next x items' and 'prev x items' buttons. Your data object query would then use the current position +/- 'x' as the index range to reduce the number of data objects sent to the browser. This is the basic flow for Ebay, Amazon, Google, or any site with thousands of items to display. The 'next' and 'prev' actions could be wired as POST requests.
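A bare-bones sketch of that approach (the element ids, page size and endpoint are placeholders, and rendering is left as a stub):

var pageSize = 25;
var offset = 0;

function renderItems(items) {
  // placeholder: replace with real dashboard/grid rendering
  console.log(items);
}

function loadPage() {
  fetch('/api/items/?limit=' + pageSize + '&offset=' + offset)
    .then(function (r) { return r.json(); })
    .then(renderItems);
}

document.getElementById('next').onclick = function () {
  offset += pageSize;
  loadPage();
};
document.getElementById('prev').onclick = function () {
  offset = Math.max(0, offset - pageSize);
  loadPage();
};

loadPage();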

Twitters json search results returning empty

I am having trouble accessing tweets via Twitter's JSON search:
https://search.twitter.com/search.json?callback=?&rpp=5&q=from:secretdreameruk
The strange thing is that it was working a few weeks ago, but it suddenly stopped, and it produces no errors.
Viewing the above link in a browser displays the JSON response, but the results section is empty ("results":[]), even though I have tweeted recently.
At first I thought usage might be limited per day, but I have had this problem for about a week.
Does anyone know why this has happened?
Thank you.
Mike
I'm not sure why that is happening, though by Twitter's own admission:
Search is focused in relevance and not completeness. This means that some Tweets and users may be missing from search results. If you want to match for completeness you should consider using the Streaming API instead.
I would use the following API format instead; it's much more customisable:
https://api.twitter.com/1/statuses/user_timeline.json?screen_name=secretdreameruk&count=5&callback=?
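For example, with jQuery's JSONP support (this targets the v1 endpoint from the URL above; newer versions of the API require OAuth, so treat this as a sketch):

// Fetch the last 5 tweets for the account and log their text.
$.getJSON('https://api.twitter.com/1/statuses/user_timeline.json?screen_name=secretdreameruk&count=5&callback=?',
  function (tweets) {
    tweets.forEach(function (t) {
      console.log(t.text); // each item is a tweet object; `text` is the body
    });
  });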

Facebook.login callback fires multiple times

I'm using the AS3 SDK and running into a weird issue. I do something like the following:
Facebook.init(MY_APP_ID, someHandler);
This works OK. At some later point, as a result of a user clicking a button in a SWF, I do:
Facebook.login(someOtherHandler, {perms:"offline_access,publish_stream"});
Following the user's login/approval in the FB popup window, the "someOtherHandler" method is called, but the problem is that it gets called many times - upwards of 150. In Firefox this is annoying but still works, and I'm then able to make other calls to the API (look up their account info, post to their wall, etc.). In IE, however, this typically leads to a stack overflow. And honestly, even in Firefox this isn't something I want to live with.
If I build a barebones "hello world" type example to try to reproduce the problem, I can't; there the callback fires only the expected single time. I'm trying to isolate what could be causing the callback to be called again and again, but the system I'm integrating this into is quite large, so it's taking a very long time. That's also why I'm not posting the real code, which I know will make it harder for anyone else to help out, but I'm hoping someone's been down this general road before. Thx.
(EDIT - I have verified that Facebook.login is in fact only being called a single time)