I have a client who needs data imported into Drupal from a large spreadsheet. They already had the modules set up, and I have the data in all the right tables... or so it seems. I ran my own custom scripts to manipulate what was in the spreadsheets and exported/imported it into the Drupal databases.
However, when I go to view that content type in the backend, it shows there are multiple pages of data but displays "No content available." on every page. Here are the tables I imported into:
// for the business listing node type
field_data_field_bd_address_city
field_data_field_bd_address_street
field_data_field_bd_address_zip
field_data_field_bd_business_type
field_data_field_bd_contact_email
field_data_field_bd_contact_name
field_data_field_bd_description
field_data_field_bd_image
field_data_field_bd_listing_type
field_data_field_bd_phone
field_data_field_bd_tags
field_data_field_bd_website
// drupal default tables
node
node_comment_statistics
taxonomy_index
taxonomy_term_data
taxonomy_term_hierarchy
taxonomy_vocabulary
Am I missing any tables that I need to import data into to make connections?
I had this problem before and it took me a while to solve it. This was before anyone had mentioned the Feeds module to me, so I thought this was my only option.
If you're going to insert straight into your database, you need to enter the data into the revision tables as well. So you would have:
// for the business listing node type
field_data_field_bd_address_city
field_data_field_bd_address_street
field_data_field_bd_address_zip
field_data_field_bd_business_type
field_data_field_bd_contact_email
field_data_field_bd_contact_name
field_data_field_bd_description
field_data_field_bd_image
field_data_field_bd_listing_type
field_data_field_bd_phone
field_data_field_bd_tags
field_data_field_bd_website
And also:
// for the business listing node type
field_revision_field_bd_address_city
field_revision_field_bd_address_street
field_revision_field_bd_address_zip
field_revision_field_bd_business_type
field_revision_field_bd_contact_email
field_revision_field_bd_contact_name
field_revision_field_bd_description
field_revision_field_bd_image
field_revision_field_bd_listing_type
field_revision_field_bd_phone
field_revision_field_bd_tags
field_revision_field_bd_website
The same goes for the node table. This took me a while to work out, but it solved the problem for me. Typically, someone then mentioned the Feeds module, which would have saved me time, but I thought I'd share what worked.
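For example, here is a minimal MySQL sketch of that mirroring step, assuming the field_data_* and field_revision_* tables were created by the same field definition (in Drupal 7 they share an identical column layout); repeat it for each field table:
-- Mirror an imported data table into its revision twin (identical columns in Drupal 7).
INSERT INTO field_revision_field_bd_phone
SELECT * FROM field_data_field_bd_phone;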
Instead of manually importing the data directly into the database and trying to figure out how to satisfy all the relational dependencies to make it work, I would suggest using the Feeds module.
If you wish to continue with the manual process, perhaps this ER diagram of the Drupal database will help (keep in mind it is a bit dated and was likely based on earlier versions of Drupal 7).
I figured it out. I had to add rows to the node_revision table, with status set to 1 for all the nodes, and it worked out just fine after that!
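For anyone else stuck here, a rough MySQL sketch of that step, using the stock Drupal 7 node_revision columns ('business_listing' is a hypothetical machine name for the node type):
-- Create one published revision row per imported node, reusing the vid already stored in the node table.
INSERT INTO node_revision (nid, vid, uid, title, log, timestamp, status, comment, promote, sticky)
SELECT nid, vid, uid, title, '', created, 1, comment, promote, sticky
FROM node
WHERE type = 'business_listing';  -- hypothetical machine name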
I'm using CasperJS for automated UI tests. I've done the basic UI testing and validation with some random data, as a kind of proof of concept. I've set up this automation using a bash script which starts the web server, loads MySQL data from an SQL file, runs the CasperJS test cases, stops the web server, and checks the log files.
Now, I want to start the testing from a known-good state of the data stored in MySQL, so that I can test the list data and form data, with detailed field information, against a known database state. How should I determine the state of the data in the database at a given moment?
1) Should I use a pre-populated JSON dump file which holds the state and details of all the data?
2) Should I use the web service API? (The web service APIs are already being used to show/save/delete data from the web page.)
Let's take an example. I have 5 users in the Users table. When I open the home page, it shows the 5 users with some rough details. When I click on any record in the list of users, it shows a form with detailed information about that user; the web page asks the web application for the details of a user by user_id in order to populate the form. Now I want to check that all the data in that form is populated correctly. So, at that step, what would be the preferred way: should I read the expected content from the JSON dump file, or should I use the web service API (as the web page does)?
Searching for this problem online, I also found the MySQL HTTP plugin. Should I consider this as well? And how safe is it to use? (I know from the docs that this plugin is not for production; it is for testing purposes only. :) )
For the main question: in cases like this, I would change the database connection string to point at your testing database (a clone).
In your case, use your bash script to change the connection string (a file copy?) automatically before you run the tests, and change it back when they complete.
Your testing database is a direct clone of your dev/live database, but with ONLY the test data you want. The downside is that you need to keep the schema in sync with DEV/LIVE.
Another point to take into consideration is whether your tests change state (e.g. POSTs). If so, your testing data might end up out of sync. One way to get around this is to drop (or disable) foreign keys, truncate the data, and load in a dump file.
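As a minimal MySQL sketch of that reset (the table names and dump file are hypothetical):
-- Reset the test database to a known state before each test run.
SET FOREIGN_KEY_CHECKS = 0;  -- sidestep constraint errors while truncating
TRUNCATE TABLE users;
TRUNCATE TABLE user_details;
SET FOREIGN_KEY_CHECKS = 1;
-- then reload the known-good data from the shell:
--   mysql test_db < known_state_dump.sql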
HTH
Hi, I'm looking for help with the following issue:
In TFS, on our SSRS report server, whenever I run any of the out-of-the-box Sprint Burndown reports, the report seems to run successfully, but I get a TF293000 error in the bottom right-hand corner.
Through some research I found that the issue was due to the field definitions in that particular collection not matching those in the other collections that we have in TFS. Simple...
In order to determine which field definition in the collection was the issue, I used the witadmin listfields command against all of my collections:
witadmin listfields /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy
This led me to find that the Synchronizes Identity Name Changes definition in the collection mentioned in the TF293000 error was set to true, while it is false in all of my other collections. Issue found! Should be easy from here... wrong.
The following command should solve my problem:
witadmin changefield /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy /syncnamechanges:false
*of course, with the proper collection URL subbed in for the word Collection
However when run and after I confirm that I want to make the change I get the following error:
TF401327: The operation is not supported. The feature is obsolete.
I looked the error up and it took me to this page: TFS Known Issue, which tells me it's a known issue that was resolved in Update 1... we have Update 3.
I then attempted to simply edit the WIT .xml file and update the attribute for that WIT in that collection to false, but when I import the change to the server it tells me it has imported successfully; however, when I export it, I see that the file has not changed.
I have also tried copying the .xml file from the same WIT in another collection and uploading that to the offending collection, but that does not work either. I've never had an issue with uploading a WIT before, as we've made several changes to our TFS workflow. I'm pretty stuck at this point and just wondering if anyone else has experienced this issue before. Thanks!
According to the error info, it seems there is a conflict in the TFS data warehouse. This happens because two fields in different collections have different attributes, and there is only one data warehouse. To avoid schema conflicts when you export and process data to the data warehouse databases, you must assign the same values to these attributes across all collections:
Field type (the value for this field cannot be changed for an existing field).
Reporting type.
Reporting name.
What you have done is the correct operation: change/update the attribute for the field in one project collection to match the assignments made in the other project collections.
You could try to narrow down the issue: does it only happen on that specific field in that team project collection? Are all other work item fields working correctly? Also give it a try in other collections, e.g. change syncnamechanges=true, then set it back to syncnamechanges=false, to see whether any issue occurs.
Run the command line on the TFS server machine instead of your development machine. Clear the TFS cache. And if the field is not used for reporting in those project collections, you could also try marking it as non-reportable. For more details, please refer to the links below:
Resolve data warehouse schema conflicts
Change a reportable attribute for a work item field
I am a beginner in DNN. I am creating a module which provides a Login, a Dashboard, and an Add/Update form. I have data in JSON format, and I want to store it temporarily while the user is using the website; the data should be destroyed as soon as the user closes the website.
Currently, I have created a folder in the Solution Explorer of my Visual Basic project with 3 .json files: login_info.json, basic_info.json and auth_info.json. I write the JSON data whenever a user logs in, and I blank the files when the user logs out.
The above method is working fine for now, but I am afraid it may not work once I publish this module.
I may also face a situation where I need to store an image somewhere, and I don't know how I will manage that.
Can anybody please guide me?
Is this a proper way to store data temporarily in DNN?
Is there any other, better way?
After getting a reply suggesting the database:
Is there any table in DotNetNuke that works like a User Meta table?
You can use the connection string that DNN itself uses and access the database as you normally would:
DotNetNuke.Common.Utilities.Config.GetConnectionString()
Or you can use the Data Access Layer that the DNN Framework supplies. For that, take the Christoc templates; in there is everything you need to communicate with the DB.
For some reason, I can't seem to create a Bonfire module using the "existing table" option.
Earlier, it wasn't even displaying the list of fields from my database table when I selected the option to use an existing table vs. creating a new one.
But I figured out that it was a permissions thing, so as a test I did the following:
chmod -R 777 /var/www/myapp
Now it is querying the database and displaying all the correct fields from the table, but when I click on the Build button, it just keeps redisplaying the same form.
What I've done so far:
I created a test table in my database with just 2 fields and tried to create a module using that table... but I get the same results.
I've ensured that all my tables are prefixed with "bf_". If they weren't, the system wouldn't be able to find and list all the correct fields... I think.
I've tested creating a new module using a new table. That seems to work just fine. Bonfire creates a new table in my database without any issues and also creates the correct folder structure for the module.
I've tried to ensure that all fields have a proper name and validation rules specified.
In most cases, I just accepted the defaults and tried to click on Build.
I changed the logging settings to log everything, but after trying to create a module and going back to the logs, there's nothing listed.
If you have any suggestions, I'd appreciate it.
EDIT 1
Figured out how the profiler works: I didn't realize that you had to click on the flame icon in the bottom left corner of the screen.
Found the issue: there's a bug in CodeIgniter; I found a bug report on their GitHub site.
The post that I found was: https://github.com/ci-bonfire/Bonfire/issues/733
I have the following problem:
I am using the Django framework.
One part of the system (non-Django) writes to the database, the same database that Django is using.
I want to get a signal when an object is saved. It's a Django model object, but it is not saved via Django; it is written directly to the MySQL database.
Is there a way Django can watch for save actions in its database when the saving is not done by Django?
The neatest way would be to create an API and let the save actions run through it; the save signal could then be Django's default. (But this depends on work by external parties... so it is not the preferred route... though for future development it surely is.)
Another option is to implement Celery and create a task that frequently checks whether one of the saved objects has had no follow-up... (also quite a bit of puzzling, I guess, to get that up and running)
But there might be an easier way... unknown to me?
I saw Django watchdog solutions for file systems... but not for databases (probably because Django has this built in... when saving is properly done through Django).
To make it more complex: I test and develop locally with SQLite... but I can put the save signal in my tests without needing to get this working locally... as long as it works in MySQL, I am happy.
You can try this solution:
Create a new table 'django_watch' with one column 'object_id' (add other columns like 'created_datetime' etc. according to your standards).
Let's say your main table is 'object'. Add a MySQL trigger for the INSERT event on this table.
Inside the trigger, add an extra insert query that writes the object_id into the 'django_watch' table.
Now you can have a cronjob that inspects the new table 'django_watch' (for updates to Django objects) and performs the necessary actions. You can run this cronjob continuously with, say, a 1-minute delay (up to you). A sketch of the whole setup is below.
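A rough MySQL sketch of that setup (the table and column names here are hypothetical; adjust them to your schema):
-- Audit table that the cronjob polls.
CREATE TABLE django_watch (
    id INT AUTO_INCREMENT PRIMARY KEY,
    object_id INT NOT NULL,
    created_datetime TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
-- Fires on every insert into the main table, no matter which application wrote the row.
DELIMITER //
CREATE TRIGGER object_after_insert
AFTER INSERT ON `object`
FOR EACH ROW
BEGIN
    INSERT INTO django_watch (object_id) VALUES (NEW.id);
END//
DELIMITER ;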
In the end, I wrote an API that can be called by the third-party module. I delivered code to the third party to log on to Django from their C code and call the GET endpoint of this API (using Django REST Framework). The API just saves the object (the id is given in the URL), and from there on it's default Django. The only thing the third party had to do was build in my code to call the API as well....
Maybe not the best solution, but the best to implement for my problem....