Drupal 7 - Adding Nodes Through phpMyAdmin Is Not Working Correctly

I have received a Microsoft Access database file and was tasked with converting the contents into something MySQL could use for a Drupal 7 website database. I managed to upload them into the "node" table successfully, with the correct content type classification, unique primary keys and node IDs, etc. Or so I thought.
When I checked the Drupal site, the list of content type X showed all of the new entries. However, when I clicked one, instead of opening the new page as I expected, I received a "page not found" message. I tried looking for the new content manually via "Find Content", but none of it showed up. I also checked entity reference lists that reference content type X, but the entries weren't showing up on those lists, either.
I checked which fields were required for content type X and found that "location category" and "address" were required. So as a test, I added one entry to each of those tables (both the field_data and field_revision versions of each required field), representing the first of the many records I tried to transfer over. Still nothing. I have no idea what I could be doing wrong. Can anyone offer some insight?

Adding content to Drupal through the database is absolutely the wrong way to go about creating content. I suggest you try any of the following methods:
Create the nodes programmatically using Drupal's API functions (see the sketch below):
http://fooninja.net/2011/04/13/guide-to-programmatic-node-creation-in-drupal-7/
Upload data through a CSV file using the Feeds module:
http://drupal.org/project/feeds/
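For illustration, a minimal sketch of the programmatic route, run inside a bootstrapped Drupal (e.g. via drush php-script), assuming a hypothetical content type machine name location with a hypothetical required text field field_address (substitute your own names):

$node = new stdClass();
$node->type = 'location';             // machine name of content type X (hypothetical)
$node->language = LANGUAGE_NONE;
node_object_prepare($node);           // fills in defaults: status, uid, created, etc.
$node->title = 'Imported location';
// Field values are keyed by language and delta.
$node->field_address[LANGUAGE_NONE][0]['value'] = '123 Example St';
node_save($node);                     // writes node, node_revision and all field tables

node_save() takes care of the node_revision row and every field_data_*/field_revision_* table for you, which is exactly the bookkeeping that breaks when rows are inserted by hand.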

TF293000: The data warehouse has detected data conflicts for the following work item fields

Hi, I'm looking for help with the following issue:
In TFS, on our SSRS report server, whenever I run any of the out-of-the-box Sprint Burndown reports, the report seems to run successfully, but I get the following error in the bottom right-hand corner:
Through some research I found that the issue was due to the field definitions in that particular Collection not matching the other collections that we have in TFS. Simple...
In order to determine which field definition in the collection was the issue I used the witadmin command listfields for all of my collections:
witadmin listfields /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy
This led me to find that the Synchronizes Identity Name Changes definition in the collection mentioned in the TF293000 error was set to a value of true, while it is false in all of my other collections. Issue Found! Should be easy from here...wrong.
The following command should solve my problem:
witadmin changefield /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy /syncnamechanges:false
*of course with the proper collection URL subbed in for the word Collection
However when run and after I confirm that I want to make the change I get the following error:
TF401327: The operation is not supported. The feature is obsolete.
I looked the error up and it took me to this page (TFS Known Issue), which tells me it's a known issue that was resolved in Update 1... but we have Update 3.
I then attempted to simply edit the WIT .xml file and update the attribute for that WIT in that collection to false, but when I import the change to the server, it tells me it has imported successfully; however, when I export it again, I see that the file has not changed.
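(For reference, exporting and re-importing a WIT definition uses the standard witadmin commands, along the lines of:
witadmin exportwitd /collection:Collection /p:ProjectName /n:WorkItemType /f:definition.xml
witadmin importwitd /collection:Collection /p:ProjectName /f:definition.xml
again with the proper collection URL, project, and work item type names substituted in.)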
I have also tried copying the .xml file from the same WIT in another collection and uploading that to the offending collection, but that will not work either. I've never had an issue with uploading a WIT before, as we've made several changes to our TFS workflow in the past. I'm pretty stuck at this point and just wondering if anyone else has experienced this issue before. Thanks!
According to the error info, there seems to be a conflict in the TFS data warehouse. This happens because two fields in different collections have different attributes, and there is only one data warehouse. To avoid schema conflicts when you export and process data to the data warehouse databases, you must assign the same values to these attributes across all collections:
Field type (the value of this attribute cannot be changed for an existing field).
Reporting type.
Reporting name.
What you have done is the correct operation: change/update the attribute for the field in one project collection to match the assignments made in the other project collections.
You could try to narrow down the issue: does it only happen on that specific field in that team project collection, and are all the other work item fields working correctly? Also give it a try with other collections, such as changing syncnamechanges=true and then setting it back to syncnamechanges=false, to see if the same issue occurs.
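For example, using the same changefield syntax as above (with your collection URL substituted for OtherCollection):
witadmin changefield /collection:OtherCollection /n:Microsoft.VSTS.Common.ReviewedBy /syncnamechanges:true
witadmin changefield /collection:OtherCollection /n:Microsoft.VSTS.Common.ReviewedBy /syncnamechanges:false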
Run the command line on the TFS server machine instead of your development machine, and clear the TFS cache. If the field is not used for reporting in those project collections, you could also try marking it as non-reportable. For more details, please refer to the links below:
Resolve data warehouse schema conflicts
Change a reportable attribute for a work item field

Include category custom fields in the REST API

I am working with Advanced Custom Fields, where I add an image selector for categories in WP, but I can't see the data in my JSON response for categories.
I have tried several plugins to do the same, but that hasn't worked either.
I am using the ACF to REST API plugin to include ACF fields in the response, which works fine on custom post types, where it creates an array field called "acf".
This field is not created for my categories, though. Am I missing a function to use it in taxonomies?
Examples:
domain.com/wp-json/wp/v2/recipies (a custom post type)
returns everything including acf.
domain.com/wp-json/wp/v2/categories (a taxonomy)
returns all of the categories, but nothing about acf
domain.com/wp-json/wp/v2/categories/37 (a single category)
returns the category but nothing about acf.
domain.com/wp-json/wp/v2/categories/37?_embed[0] (way of getting all embedded stuff)
Does not show acf
Hope that you can put me on the right path.
To anybody who is interested and runs into this problem:
I talked to the developer of ACF to REST API, and there is a bug in version 2 where it doesn't save taxonomies correctly. It is fixed in version 3, but you can't update to it yet through WordPress, since it is still in beta.
Re-download the plugin and go into your wp-config.php file. There you need to define that you are going to use version 3; paste this line into wp-config.php:
define('ACF_TO_REST_API_REQUEST_VERSION', 3);
The endpoints have been rewritten, so you also need to read up on that if you are using the ACF endpoints for updating, etc.
To read more about the bug, go to the plugin's GitHub page.
To read about the endpoints, go here.
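As a stopgap while version 3 is in beta, you can also expose term fields yourself with WordPress's register_rest_field() and ACF's get_fields(), without going through the plugin's endpoint handling. A minimal sketch (the 'acf' key is just a naming convention here):

add_action('rest_api_init', function () {
    register_rest_field('category', 'acf', array(
        'get_callback' => function ($term) {
            // ACF stores taxonomy term fields under "{taxonomy}_{term_id}".
            return get_fields('category_' . $term['id']);
        },
        'schema' => null,
    ));
});

With this in place, /wp-json/wp/v2/categories/37 should include an acf key alongside the standard term data.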
I hope that this will help somebody in the future.

No Google BigQuery table created after importing data through webclient

I'm currently familiarizing myself with Google BigQuery by working through the examples at https://cloud.google.com/bigquery/web-ui-quickstart. Running a query over the public datasets works fine.
I run into problems when uploading custom data into a new table through the web UI. I create a new dataset and table, and upload the CSV file provided with the example case. As in the example, I enter the schema and submit the file. The upload window then stays on top and turns grey as if it's working, but nothing seems to happen afterwards. When I dismiss the upload window after a long wait, the table appears to have been created in the tree on the left. However, when clicking on the table, an error is shown:
"Unable to find table: ndwtest-984:csvtest.csvdata"
This seems like a trivial action, but I cannot seem to get it to work. I've tried various different files, uploaded the file to Google Cloud Storage first, and played around with the advanced options over the last two days, but I keep getting the same error.
Help would be much appreciated.
Some steps to help you:
billing must be enabled
you need to choose to upload one single TXT file from the example, e.g. yob2013.txt, and not the zip file
make sure the schema is entered as text: name:string,gender:string,count:integer
on the last screen of the wizard you don't need to change the default CSV option parameters (for demo purposes works as it is)
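If the web UI keeps behaving this way, the same load can be attempted from the bq command-line tool as a cross-check; a sketch using the dataset and table names from the error message above:
bq load --source_format=CSV csvtest.csvdata yob2013.txt name:string,gender:string,count:integer
If the load job fails, bq should print the actual error rather than hanging silently.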
I just tried the example, and it does work for me. In case you still get errors, you can check the Job History menu in the web UI; a direct link is below (note that you need to put your own ID into the link):
https://bigquery.cloud.google.com/jobs/YOUR_ANONYMOUS_PROJECT_ID_HERE?pli=1

Can't create module in Bonfire using existing tables

For some reason, I can't seem to create a Bonfire module using the "existing" table option.
Earlier, it wasn't even displaying the list of fields from my database table when I selected the option to use existing table vs. creating a new one.
But I figured out that it was a permissions thing, so as a test I did the following:
chmod -R 777 /var/www/myapp
Now, it is querying the database and displaying all the correct fields from the table but when I click on the build button, it just keeps redisplaying the same form.
What I've done so far:
I created a test table in my database with just two fields. I tried to create a module using that table... but I get the same results.
I've ensured that all my tables are prefixed with "bf_". If they weren't, the system wouldn't be able to find and list all the correct fields... I think.
I've tested creating a new module using a new table. That seems to work just fine. Bonfire creates a new table in my database without any issues and also creates the correct folder structure for the module.
I've tried to ensure that all fields have a proper name and validation rules specified.
In most cases, I just accepted defaults and tried to click on build.
I changed the logging settings to log everything, but after trying to create a module and going back to the logs, there's nothing listed.
If you have any suggestions, I'd appreciate it.
EDIT 1
Figured out how the profiler works - I didn't realize that you had to click on the flame icon in the bottom left corner of the screen.
Found the issue: there's a bug in CodeIgniter. I found a bug report on their GitHub site.
The post that I found was: https://github.com/ci-bonfire/Bonfire/issues/733

Drupal node / data import issue

I have a client who needs to have data imported into Drupal from a large spreadsheet. They already had the modules set up, and I have the data all in the right tables... or so it seems. I ran my own custom scripts to manipulate what was in the spreadsheets and exported/imported it into the Drupal database.
However, when I go to view that type of content in the backend, it shows there are multiple pages of data but displays "No content available." on every page. Here are the tables I imported into:
// for the business listing node type
field_data_field_bd_address_city
field_data_field_bd_address_street
field_data_field_bd_address_zip
field_data_field_bd_business_type
field_data_field_bd_contact_email
field_data_field_bd_contact_name
field_data_field_bd_description
field_data_field_bd_image
field_data_field_bd_listing_type
field_data_field_bd_phone
field_data_field_bd_tags
field_data_field_bd_website
// drupal default tables
node
node_comment_statistics
taxonomy_index
taxonomy_term_data
taxonomy_term_hierarchy
taxonomy_vocabulary
Am I missing any tables that I need to import data into to make connections?
I had this problem before, and it took me a while to solve it. This was before anyone had mentioned the Feeds module to me, so I thought this was my only option.
If you're going to upload straight into your database, you need to enter the data into the revision tables as well. So you would have:
// for the business listing node type
field_data_field_bd_address_city
field_data_field_bd_address_street
field_data_field_bd_address_zip
field_data_field_bd_business_type
field_data_field_bd_contact_email
field_data_field_bd_contact_name
field_data_field_bd_description
field_data_field_bd_image
field_data_field_bd_listing_type
field_data_field_bd_phone
field_data_field_bd_tags
field_data_field_bd_website
And also:
// for the business listing node type
field_revision_field_bd_address_city
field_revision_field_bd_address_street
field_revision_field_bd_address_zip
field_revision_field_bd_business_type
field_revision_field_bd_contact_email
field_revision_field_bd_contact_name
field_revision_field_bd_description
field_revision_field_bd_image
field_revision_field_bd_listing_type
field_revision_field_bd_phone
field_revision_field_bd_tags
field_revision_field_bd_website
The same goes for the node table. This took me a while to work out, but it worked for me. Typically, someone then mentioned the Feeds module, which would have saved me time, but I thought I'd share what worked for me.
Instead of manually importing the data directly into the database and trying to figure out how to satisfy all the relational dependencies, I would suggest using the Feeds module.
If you wish to continue with the manual process, perhaps this ER diagram of the Drupal database will help (keep in mind it is a bit dated and was likely based on earlier versions of Drupal 7).
I figured it out. I had to add data to the node_revision table with a status of 1 for all the nodes, and it worked out just fine after that!
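For anyone doing the same by hand, the crucial pairing looks roughly like this (hypothetical IDs, machine name, and values; columns omitted here fall back to their schema defaults):

INSERT INTO node (nid, vid, type, language, title, uid, status, created, changed)
VALUES (123, 456, 'business_listing', 'und', 'Example business', 1, 1, UNIX_TIMESTAMP(), UNIX_TIMESTAMP());

INSERT INTO node_revision (nid, vid, uid, title, log, status, timestamp)
VALUES (123, 456, 1, 'Example business', '', 1, UNIX_TIMESTAMP());

Every node row's vid must point at a node_revision row with status = 1, and every field_data_* row needs a matching field_revision_* row with the same entity_id and revision_id.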