UI Testing (CasperJS) with a known good state of data (MySQL database)

I'm using CasperJS for automated UI tests. I've done the basic UI testing and validation with some random data, as a kind of proof of concept. I've set up this automation with a bash script which starts the web server, loads MySQL data from an SQL file, runs the CasperJS test cases, stops the web server, and checks the log files.
Now I want to start testing against a known good state of data stored in MySQL, so that I can test the list data and the form data, down to the individual fields, against a known database state. How should I know the state of the data in the database at a given moment?
1) Should I use a pre-populated JSON dump file which holds the state and details of all the data?
2) Should I use the web service API? (The web service APIs are already used to show/save/delete data from the web page.)
Let's take an example. I have 5 users in the Users table. When I open the home page it shows the 5 users with some summary details. When I click on any record in the list, it shows a form with detailed information about that user. The web page asks the web application for the details of a user, by user_id, in order to populate the form. Now I want to check that all the data in that form is populated correctly. So at this step, what would be the preferred way: should I read the expected content from the JSON dump file, or should I use the web service API (as the web page does)?
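For illustration, a minimal CasperJS sketch of the fixture-file approach; the fixture path, selectors, and field name are hypothetical:

```javascript
// Hypothetical sketch: assert the detail form against the known fixture.
var fs = require('fs');
var users = JSON.parse(fs.read('fixtures/users.json'));  // known good state

casper.test.begin('User detail form shows fixture data', 1, function (test) {
    casper.start('http://localhost:8080/', function () {
        this.click('.user-list tr[data-user-id="1"]');   // open the first user's form
    });
    casper.then(function () {
        test.assertField('email', users[0].email);       // field name is assumed
    });
    casper.run(function () {
        test.done();
    });
});
```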
While searching for this online, I also found the MySQL HTTP plugin. Should I consider this as well, and how safe is it to use? (I know from the docs that this plugin is not for production; it is for testing purposes only. :) )

For the main question: in cases like this I would change the database connection string to point at your testing database (a clone).
In your case, use your bash script to change the connection string (a file copy?) automatically before you run the tests, and change it back when they complete.
Your testing database is a direct clone of your dev/live database, but with ONLY the test data you want. The downside is that you need to keep its schema in sync with dev/live.
Another point to take into consideration is whether your tests change state (POST requests). If so, your test data might end up out of sync. One way to get around this is to drop the foreign keys, truncate the data, and load in a dump file, as sketched below.
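A minimal sketch of that reset step, assuming the mysql command-line client; the database, user, table, and fixture names are placeholders:

```bash
# Hypothetical reset before each test run: wipe the test tables and
# reload the known fixture.
mysql -u test_user -p"$TEST_DB_PASS" test_db <<'SQL'
SET FOREIGN_KEY_CHECKS = 0;
TRUNCATE TABLE users;
SET FOREIGN_KEY_CHECKS = 1;
SQL
mysql -u test_user -p"$TEST_DB_PASS" test_db < fixtures/known_state.sql
```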
HTH

Related

How to render a graph when MySQL data is updated in Vue?

I organized it like this:
Back-end: Node.js - inserts data into MySQL periodically.
DB: MySQL
Front-end: Vue.js - draws and re-renders a graph with the MySQL data when the data is updated.
In this situation, do I have to poll the MySQL data every second to check whether it is the latest, and then render the graph again?
Or is there a way to trigger a specific method that re-renders the graph automatically when the MySQL data is updated?
I would prefer the second way, but I have no idea whether it is possible.
Can someone give me some advice, or an example?
What you could do is use socket.io for this. The moment you update your database in Node.js, you also send a notification (or the data itself) over a socket, which tells your Vue.js application that it needs to load new data, or hands it the data directly.
I used the following npm packages for my setup with Vue.js and Node.js:
Vue: socket.io-client
Node: socket.io
I followed these two sites when I faced a similar problem, to get the hang of sockets:
https://socket.io/get-started/chat/#Introduction
https://medium.com/@jaouad_45834/basic-chat-web-app-using-express-js-vue-js-socket-io-429588e841f0
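A minimal sketch of that flow, assuming the mysql and socket.io npm packages; the table, event, and port names are hypothetical:

```javascript
// server.js -- hypothetical sketch: after each periodic insert, notify clients.
const mysql = require('mysql');
const io = require('socket.io')(3000);

const db = mysql.createConnection({ host: 'localhost', user: 'app', database: 'metrics' });
db.connect();

setInterval(() => {
    const row = { value: Math.random() };            // stand-in for the real data
    db.query('INSERT INTO readings SET ?', row, (err) => {
        if (err) return console.error(err);
        io.emit('data-updated', row);                // push to all connected clients
    });
}, 5000);
```

On the Vue side, the component subscribes once and re-renders reactively whenever the event arrives:

```javascript
// Inside the Vue component -- hypothetical sketch using socket.io-client.
import io from 'socket.io-client';

export default {
    data() {
        return { readings: [] };
    },
    created() {
        this.socket = io('http://localhost:3000');
        this.socket.on('data-updated', (row) => {
            this.readings.push(row);                 // reactive; re-draw the graph here
        });
    },
};
```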

How to manage temporary data in DotNetNuke?

I am a beginner in DNN. I am creating a module which provides a Login, a Dashboard, and an Add/Update form. I have data in JSON format. I want to store it temporarily while the user is on the website. The data should be destroyed as soon as the user closes the website.
Currently I have created a folder in the Solution Explorer of my project in Visual Studio with 3 .json files: login_info.json, basic_info.json and auth_info.json. I write JSON data to them whenever a user logs in, and I blank them when the user logs out.
The above method is working fine for now, but I am afraid it may not work once I publish this module.
I may also run into a situation where I need to store an image somewhere, and I don't know how I will manage that.
Can anybody please guide me?
Is this a proper way to store data temporarily in DNN?
Is there any other, better way?
Edit, after a reply suggested using the database:
Is there any table in DotNetNuke that works like a User Meta table?
You use the ConnectionString that is used by DNN and access the database as you would normally.
DotNetNuke.Common.Utilities.Config.GetConnectionString()
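For example, a minimal sketch of using that connection string directly (the COUNT query is just a placeholder; DNN runs on SQL Server):

```csharp
using System.Data.SqlClient;

// Hypothetical sketch: query the DNN database with the connection string
// DNN itself uses.
public static class DnnDbExample
{
    public static int CountUsers()
    {
        string connStr = DotNetNuke.Common.Utilities.Config.GetConnectionString();
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Users", conn))
        {
            conn.Open();
            return (int)cmd.ExecuteScalar();  // placeholder query
        }
    }
}
```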
Or you can use the Data Access Layer that the DNN Framework supplies. For that, take a look at the Christoc module templates; they contain everything you need to communicate with the DB.

Google Realtime API - when to persist changes to database?

Scenario:
I have multiple browser clients whose internet connections vary from very fast to super slow. Because of that, they might not all see the same state of the document.
I'm using a Google shortcut file, since the document is actually stored in a database.
Saving the document to the database is triggered from the client side.
Question: how do I know which client has the most up-to-date document that should be saved to the database?
You are right that you can't rely on any particular client being the most up to date at a particular time. There is no easy way to determine that, since it can change at any given instant. (Although you can make sure that a particular client has no unsaved changes by looking at the document save state.)
Rather than trying to do this based on client state, you can use the export capability that is part of the Drive API, which will give you a valid snapshot of the data along with a revision number, so you can track which version you have.
Note that this is a brand-new feature, so it's not yet well documented. The response is a JSON object with the appId, the revision number, and a data field containing a JSON version of the document. It looks something like this, for a document that has a collaborative list "list" and a collaborative string "text" in the root:
{"appId":"788242802491","revision":17,
"data":{"id":"root","type":"Map",
"value":{
"list":{"id":"gde9s8z5khjarls7o","type":"List","value":[]},
"text":{"id":"gdef98qdhiq679af","type":"EditableString","value":"This is a test 2."}}}}

Django database watchdog: save signal outside Django

I have the following problem:
I am using the Django framework.
One part of the system (non-Django) writes to the database, the same database that Django is using.
I want a signal when an object is saved. It's a Django model object, but it is not saved via Django; it is written directly to the MySQL database.
Is there a way Django can watch save actions in its database when the saving is not done by Django?
The neatest way would be to create an API and let the save action run through it; the save signal could then be the Django default. (But this depends on work by external parties... so it is not the preferred route, although for future development it surely is.)
Another option is to implement Celery and create a task that frequently checks whether one of the saved objects has had no follow-up... (also quite a puzzle, I guess, to get up and running).
But there might be an easier way... unknown to me?
I've seen Django watchdog solutions for file systems... not for databases (probably because Django has this built in... when everything is properly done through Django).
To complicate things: I test and develop locally with SQLite... but I can put the save signal in my tests without needing this to work locally... as long as it works in MySQL, I am happy.
You can try this solution:
1) Create a new table 'django_watch' with one column 'object_id' (add other columns like 'created_datetime' etc. according to your standards).
2) Let's say your main table is 'object'. Add a MySQL trigger for the INSERT event on this table, and inside the trigger insert the object_id into the 'django_watch' table (see the sketch after this list).
3) Now you can have a cronjob inspecting the new table 'django_watch' (for updates to Django objects) and performing the necessary actions. You can run this cronjob continuously with a delay of, say, 1 minute (up to you).
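A minimal sketch of steps 1 and 2, using the hypothetical table names above (and assuming 'object' has an 'id' column):

```sql
-- Hypothetical sketch: log every INSERT on `object` into `django_watch`.
CREATE TABLE django_watch (
    object_id INT NOT NULL,
    created_datetime DATETIME DEFAULT CURRENT_TIMESTAMP
);

DELIMITER $$
CREATE TRIGGER object_after_insert
AFTER INSERT ON object
FOR EACH ROW
BEGIN
    INSERT INTO django_watch (object_id) VALUES (NEW.id);
END$$
DELIMITER ;
```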
In the end, I wrote an API that can be called by the third-party module. I delivered code for the third party to log on to Django from their C code and call the GET endpoint of this API (using Django REST framework). The API just saves the object (whose id is given in the URL), and from there on it's default Django. The only thing the third party had to do was build my code into their module to call the API as well...
Maybe not the best solution, but the best to implement for my problem...
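A minimal sketch of such an endpoint with Django REST framework; the model and names are hypothetical:

```python
# views.py -- hypothetical sketch: re-save an object by id so that Django's
# standard post_save signal fires.
from rest_framework.views import APIView
from rest_framework.response import Response

from myapp.models import MyModel  # assumed model


class TouchObject(APIView):
    def get(self, request, pk):
        obj = MyModel.objects.get(pk=pk)
        obj.save()  # goes through Django, so post_save is sent
        return Response({"saved": obj.pk})
```

Wired up with a URL pattern such as path('touch/<int:pk>/', TouchObject.as_view()), the external system only needs to issue one GET per written object.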

Drupal node / data import issue

I have a client who needs data imported into Drupal from a large spreadsheet. They already had the modules set up, and I have the data all in the right tables... or so it seems. I ran my own custom scripts to manipulate what was in the spreadsheets and exported/imported it into the Drupal database.
However, when I go to view that content type in the backend, it shows there are multiple pages of data but displays "No content available." on every page. Here are the tables I imported into:
// for the business listing node type
field_data_field_bd_address_city
field_data_field_bd_address_street
field_data_field_bd_address_zip
field_data_field_bd_business_type
field_data_field_bd_contact_email
field_data_field_bd_contact_name
field_data_field_bd_description
field_data_field_bd_image
field_data_field_bd_listing_type
field_data_field_bd_phone
field_data_field_bd_tags
field_data_field_bd_website
// drupal default tables
node
node_comment_statistics
taxonomy_index
taxonomy_term_data
taxonomy_term_hierarchy
taxonomy_vocabulary
Am I missing any tables that I need to import data into to make the connections?
I had this problem before and it took me a while to solve. This was before anyone had mentioned the Feeds module to me, so I thought this was my only option.
If you're going to insert straight into your database, you need to enter the data into the revision tables as well. So you would have:
// for the business listing node type
field_data_field_bd_address_city
field_data_field_bd_address_street
field_data_field_bd_address_zip
field_data_field_bd_business_type
field_data_field_bd_contact_email
field_data_field_bd_contact_name
field_data_field_bd_description
field_data_field_bd_image
field_data_field_bd_listing_type
field_data_field_bd_phone
field_data_field_bd_tags
field_data_field_bd_website
And also:
// for the business listing node type
field_revision_field_bd_address_city
field_revision_field_bd_address_street
field_revision_field_bd_address_zip
field_revision_field_bd_business_type
field_revision_field_bd_contact_email
field_revision_field_bd_contact_name
field_revision_field_bd_description
field_revision_field_bd_image
field_revision_field_bd_listing_type
field_revision_field_bd_phone
field_revision_field_bd_tags
field_revision_field_bd_website
The same goes for the node table. This took me a while to work out, and it worked for me. Typically, someone then mentioned the Feeds module, which would have saved me time, but I thought I'd share what worked for me.
Instead of manually importing the data directly into the database and trying to figure out how to satisfy all the relational dependencies, I would suggest using the Feeds module.
If you wish to continue with the manual process, perhaps this ER diagram of the Drupal database will help (keep in mind it is a bit dated and was likely based on an earlier version of Drupal 7).
I figured it out. I had to add data to the node_revision table, with status set to 1 for all the nodes, and it worked fine after that!
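For reference, a hypothetical sketch of that fix for a single imported node (column list per the Drupal 7 node_revision schema; nid 123 and the title are placeholders):

```sql
-- Hypothetical sketch: create a published revision for an imported node
-- and point the node at it (Drupal 7 schema assumed).
INSERT INTO node_revision (nid, uid, title, log, timestamp, status, comment, promote, sticky)
VALUES (123, 1, 'Imported business listing', 'imported from spreadsheet', UNIX_TIMESTAMP(), 1, 0, 0, 0);

-- node_revision.vid is auto-incremented, so link the node to the new revision.
UPDATE node SET vid = LAST_INSERT_ID() WHERE nid = 123;
```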