drupal 'node' table regenerate - mysql

I have a Drupal 7 installation on a shared hosting server, and the hosting provider removed some entries from the node table when the database size crossed the 1 GB limit.
I lost a lot of entries, but some data still exists in other tables with entity_id reference numbers.
I can query my data using the references from other tables, but the Drupal interface cannot show the data because of these missing entries.
So how can I safely regenerate the entries in the node table and its related tables?
Where can I find information about how fields are used across Drupal's tables?
If possible, a step-by-step guide would be very helpful.

Jahangir is on the right track. The revision info will get you 90% of the way there.
My example makes the assumption that only the node table has been affected by these spurious deletions.
Start by getting any revisions that don't have a related node. Joining back against the highest vid per nid keeps the selected columns consistent with the newest revision (and avoids the ONLY_FULL_GROUP_BY trap):
$result = db_query('SELECT nr.nid, nr.vid, nr.uid, nr.title, nr.timestamp, nr.status, nr.comment, nr.promote, nr.sticky
  FROM {node_revision} nr
  INNER JOIN (SELECT nid, MAX(vid) AS vid FROM {node_revision} GROUP BY nid) latest ON latest.nid = nr.nid AND latest.vid = nr.vid
  WHERE nr.nid NOT IN (SELECT nid FROM {node})');
You could do this next part using a node object, but given that we just need the node entries, I would do it directly in the database:
foreach ($result as $record) {
  // The created date is best approximated by the timestamp of the earliest revision.
  $created = db_query('SELECT MIN(timestamp) FROM {node_revision} WHERE nid = :nid',
    array(':nid' => $record->nid))->fetchField();
  $new_node = db_insert('node')
    ->fields(array(
      'nid' => $record->nid,
      'vid' => $record->vid,
      'type' => 'YOU CANNOT GET THIS',
      'language' => LANGUAGE_NONE, // not recoverable from {node_revision} either
      'title' => $record->title,
      'uid' => $record->uid,
      'status' => $record->status,
      'created' => $created,
      'changed' => $record->timestamp, // the node column is "changed", not "updated"
      // you get the idea...
    ))
    ->execute();
}
I wasn't able to find a way to retrieve the content type using the revisions, so that is the weakness of the solution. Assuming all of your field data is still in place, this should get your nodes back.
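One partial workaround, assuming the orphaned nodes still have data in at least one field table: in Drupal 7, every field_data_* table carries a bundle column, which for nodes holds the content type. A sketch using the body field (swap in whatever field your content types actually use):
// Hypothetical type recovery: read the bundle from a field data table.
// Assumes the node has a saved body value; any other field_data_* table
// attached to your content types works the same way.
$type = db_query('SELECT bundle FROM {field_data_body} WHERE entity_type = :etype AND entity_id = :nid',
  array(':etype' => 'node', ':nid' => $record->nid))->fetchField();
More generally, the field_config and field_config_instance tables describe each field and which bundles it is attached to, which also answers the question about where field usage lives.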
*disclaimer: I wasn't able to fake up an environment that resembled the problem presented here so this code is based on my experience and a little Googling. This hasn't been tested, but I hope it is still valuable. Downvotes humbly accepted.

Related

Join two INSERT INTOs

I'm not sure how to articulate this correctly, so here goes.
I'm designing a small webapp, and in my database I have tables for users, commits, and userCommits. It's pretty minimal: users has an auto-number ID column, and commits has a similar ID column. userCommits is what links them together.
What I've done is INSERT INTO commits (...) VALUES (...), but the problem is: how do I create the link in userCommits so that userCommits.commitID equals the ID of the commit I just inserted? I feel like I shouldn't just query the table for the latest row and use its ID. Is there a way to join the IDs somehow?
I know that I can run a query like this to list all of a user's commits via their email:
SELECT
commits.id,
commits.cName,
users.email
FROM
commits
INNER JOIN userCommits ON commits.id = userCommits.commitID
INNER JOIN users ON userCommits.userID = users.id
WHERE users.email = "someonecool@hotmail.com"
But what I'm now trying to do is to insert into the database when the user creates a commit.
You perform the INSERT and then select the inserted ID on the same connection for one table, then do the same for the other table.
Once your app knows both IDs, it can insert into the userCommits table.
See Get the new record primary key ID from mysql insert query?
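Here is a minimal sketch of that pattern with mysqli (credentials and values are placeholders; error handling omitted):
// Insert a commit, capture its auto-increment ID, then link it to a user.
// insert_id / LAST_INSERT_ID() is tracked per connection, so this is safe
// even with other clients inserting concurrently.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$name = 'Initial commit';
$location = 'main';
$stmt = $db->prepare('INSERT INTO commits (cName, cLocation) VALUES (?, ?)');
$stmt->bind_param('ss', $name, $location);
$stmt->execute();

$commitId = $db->insert_id; // ID generated by the INSERT above

$userId = 42; // the logged-in user's ID
$link = $db->prepare('INSERT INTO userCommits (userID, commitID) VALUES (?, ?)');
$link->bind_param('ii', $userId, $commitId);
$link->execute();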
So, I didn't mention I was using CodeIgniter. My bad. However, the solutions provided would work. I just want to post what I actually used, for other people who may encounter this.
As I learned, with MySQL each client gets a session of its own. This means that when you call $this->db->insert_id(), or SELECT LAST_INSERT_ID();, it returns YOUR last inserted ID, not the server's last inserted ID.
$commitData = array(
  'cName'      => $this->input->post('commitName'),
  'cLocation'  => $this->input->post('location'),
  'cStartDate' => $this->input->post('startDate'),
  'cEndDate'   => $this->input->post('endDate'),
);
$this->db->insert('commits', $commitData);

$userData = array(
  'commitID' => $this->db->insert_id(),
  'userID'   => $user->id,
);
$this->db->insert('userCommits', $userData);
There is obviously going to be security built around this but this is just the basic nuts and bolts.

Minimum and maximum of a field in cakephp and mysql

I am trying to build a search function for a CakePHP and MySQL site. Selecting different parameters, like the price or the length of the product, triggers an AJAX call which returns the number of matching results. I want to extend the returned results with the minimum and maximum values for the lengths and prices. I tried doing this: http://bin.cakephp.org/view/1004813660 . Using the first 4 finds is too time-consuming. The last one works locally, but remotely I get the error
1140 - Mixing of GROUP columns (MIN(),MAX(),COUNT(),...) with no GROUP columns is illegal if there is no GROUP BY clause
because ONLY_FULL_GROUP_BY is switched on.
Is it possible to use the last option with some improvements, or can I switch off ONLY_FULL_GROUP_BY?
If I understood you correctly, you want to get, in a single query:
MIN(Yacht.price) as min_price
MAX(Yacht.price) as max_price
MIN(Yacht.long) as min_length
MAX(Yacht.long) as max_length
right?
For this, you do not need any GROUP BY clause: MIN and MAX are already aggregate functions, and nothing prevents you from using several aggregate functions in a single query.
Have you tried simply doing this?
$stats = $this->Yacht->find('first', array(
  'conditions' => $conditions,
  'fields' => array(
    'MIN(Yacht.price) AS min_price',
    'MAX(Yacht.price) AS max_price',
    'MIN(Yacht.long) AS min_length',
    'MAX(Yacht.long) AS max_length',
  ),
));
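If you want to sanity-check the SQL, the equivalent raw query can be run through Model::query(); the yachts table name follows Cake's conventions, and the WHERE clause below is a stand-in for whatever $conditions holds:
// Equivalent raw SQL: one row containing the four aggregates.
// No GROUP BY is needed because every selected column is an aggregate.
$stats = $this->Yacht->query(
  'SELECT MIN(Yacht.price) AS min_price, MAX(Yacht.price) AS max_price,
          MIN(Yacht.long) AS min_length, MAX(Yacht.long) AS max_length
   FROM yachts AS Yacht
   WHERE Yacht.price > 0' // stand-in for your real conditions
);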
By the way, according to the documentation, there is quite a lot of redundancy in your original code. find('first', array(...)) by itself ensures you get only one result; hence, there is no need to specify 'limit' => 1 in the query, nor an 'order' clause, as only one row comes back anyway :)
Hope it helps.
The way to set server modes can be found here... If you read the top of the document it will tell you how to set the server mode defaults:
http://dev.mysql.com/doc/refman/5.1/en/server-sql-mode.html
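If you only want to test the behaviour, you can also clear the mode for the current connection instead of changing the server default; a sketch (run it right before the find):
// Per-connection only: inspect the current modes, then clear
// ONLY_FULL_GROUP_BY by resetting sql_mode (re-list any modes you
// still need rather than using the empty string).
$this->Yacht->query('SELECT @@sql_mode');
$this->Yacht->query("SET SESSION sql_mode = ''");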
However, I'm not sure that is necessary to get to your solution. I think your query is running for a long time because you need a different GROUP BY in your code and fewer queries. You should be able to use a logical GROUP BY that takes advantage of your primary key (index):
'group' => 'Yacht.id'
So you have one query returning everything:
$this->Yacht->find('first', array(
  'conditions' => $conditions,
  'fields' => array('MAX(Yacht.price) as max_price', 'MIN(Yacht.price) as min_price', ...),
  'group' => 'Yacht.id',
  'order' => '...'));
I ended up solving the problem by changing the way I was searching. Instead of putting conditions on associated models, which would lead to joins, I did the searching explicitly with WHERE clauses on the main table. I had things like
$conditions = array('Brand.name LIKE' => '%bla%');
which I replaced with
$conditions = array('Yacht.brand_name LIKE' => '%bla%');
I had to restructure the database a bit, but the tradeoff between speed and database normalization is one I can live with.

CakePHP parsing query results alternative

I have a query that's returning a LOT of results, and my code is running out of memory trying to parse them... how can I run a query in CakePHP and just get normal results?
By parsing it I mean....
SELECT table1.*, table2.* FROM table1 INNER JOIN table2 ON table1.id = table2.table1_id
With the above query it'll return....
array(
    0 => array(
        'table1' => array(
            'field1' => value,
            'field2' => value,
        ),
        'table2' => array(
            'field1' => value,
            'field2' => value,
        ),
    ),
)
Parsing those results into nested arrays is where it runs out of memory... how do I avoid this?
I couldn't hate CakePHP any more than I do right now :-\ If the documentation were decent that would be one thing, but it's not, and its functionality is annoying.
You could do:
$list = $this->AnyModel->query("SELECT * FROM big_table");
but I don't think that will solve your problem, because if you have, for example, 10 million rows, PHP won't be able to manage an array of 10 million values...
But you might want to read these two links about changing the execution time and the memory limit; you could also change them in your php.ini.
Good luck!
EDITED
Hmm, thanks to your question I've learned something :P First of all, we all agree that you're receiving that error because Cake executes the query and tries to store the results in one array, but PHP can't handle an array that big, so it runs out of memory and crashes. I have never used the classic mysql_query() (I prefer PDO), but after reading the docs, it seems that mysql_query stores the results inside a resource; therefore it's not loading the results into memory, and that allows you to loop over them (like looping through a big file). So now I see the difference... and your question is actually this one:
Can I stop CakePHP fetching all rows for a query?
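For reference, a sketch of that row-at-a-time pattern with the old mysql extension (connection details are placeholders):
// mysql_unbuffered_query() does not buffer the whole result set in PHP
// memory, so each row can be processed and discarded as you go.
$link = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $link);
$result = mysql_unbuffered_query(
  'SELECT table1.*, table2.* FROM table1
   INNER JOIN table2 ON table1.id = table2.table1_id',
  $link
);
while ($row = mysql_fetch_assoc($result)) {
  // process $row, then let it go out of scope
}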
=) I understand your frustration with Cake; sometimes I also get frustrated with it (could you believe there's no simple way to execute a query with a HAVING clause?? u_U)
Cheers!
I'd suggest you utilize the Containable behavior on your model. This is the easiest way to control the amount of data that's returned. I'm confident that this is precisely what you need to implement.
CakePHP :: Containable :: Core Behaviors
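A minimal sketch of what that looks like (the Table1/Table2 model names are stand-ins for the question's tables):
// In the model: attach the Containable behavior.
class Table1 extends AppModel {
  public $actsAs = array('Containable');
}

// In the controller: fetch only the associations and fields you need,
// instead of everything Cake would otherwise join in.
$rows = $this->Table1->find('all', array(
  'fields' => array('Table1.field1', 'Table1.field2'),
  'contain' => array(
    'Table2' => array('fields' => array('field1', 'field2')),
  ),
));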
You should limit the rows returned by your query (say, 500 rows) and allow the user to fetch more rows when needed (the next 500 at a time). You could do that nicely with the pagination component and a little AJAX.
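If you process the data server-side rather than displaying it, the same idea works as a batched find; a sketch assuming a 500-row page size and a Table1 model:
// Work through the table in chunks of 500 so only one page of results
// is held in memory at a time.
$page = 1;
do {
  $rows = $this->Table1->find('all', array(
    'limit' => 500,
    'page' => $page,
  ));
  foreach ($rows as $row) {
    // process one record
  }
  $page++;
} while (!empty($rows));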

Rails find_by_sql issue

I have the following call in my RoR script:
Foo.where("event = 'Login'").group('user_id').order('user_id ASC').count()
This gives me a list of all users and how many times they have logged in, in the form of:
{<userid1> => <count>, <userid2> => <count>, ...}
This is great and very close to what I want, but I've been unable to convince it to sort by the count of logins instead, which is what I'd really like it to do. There is also a column that has some info about the login session in the form of a character-delimited string. I'd like to get at certain parts of that information.
To achieve this I've tried using find_by_sql and when I make the following call:
Foo.find_by_sql("SELECT userid, COUNT(*) AS number, SUBSTRING_INDEX(stuff, ',', 1) AS info FROM <table> WHERE event = 'Login' GROUP BY userid")
What I get is a list of Foo entries that contain the userids but not the count or the info. When I run this in MySQL Workbench it works like a charm. Is there something else I need to do to get this to work? Also, would there be a way to do this using just Foo.select or Foo.where? Thanks.
Update: I have also tried this format, as demonstrated here:
Foo.find(:all, :select => 'count(*) count, userid', :group =>'userid')
But this too merely responds with the userids and does not spit out the count.
Update 2: Looking at the output a bit more, I can see that when I do the find_by_sql call, everything is being found in the correct way and even being sorted. It just isn't actually selecting the COUNT(*) or the SUBSTRING_INDEX.
Update 3: I also tried out this SO tip, but when I tell it:
Foo.find(:all, :select => 'userid, count(*) as cnt', :group => 'userid')
It doesn't print or find anything related to cnt. I'm totally baffled here, because I've now seen more than one example that does it this way, and I've yet to get it to succeed.
Actually, your problem is not an SQL problem. To generate the correct SQL you would just need this:
Foo.where("event = 'Login'").group('user_id').order('count_all').count()
Take a look in your log and you'll find that this generates the following SQL:
SELECT COUNT(*) AS count_all, user_id AS user_id FROM `foos` GROUP BY user_id ORDER BY count_all
...and if you run that in your SQL console you'll get what you want.
The problem is that Rails doesn't return them in this order; Rails always returns these special group/count results in the order of the GROUP BY field. So if you want them in a different order, you'll need to do it in Ruby after getting the hash back.
The code below returns an array of Foo objects; checking any element inside will give you both userid and cnt:
foos = Foo.find(:all, :select => 'userid, count(*) as cnt', :group => 'userid')
Is this what you're looking for?
foos.first.userid # will show userid
foos.first.cnt # will show count

Why doesn't a new table column show up in the query results in CakePHP?

I have added a new column to my table Attributes, which now has the columns (id, form_id (foreign key), type, label, size, sequence_no, instr), where instr is the new column I added.
My application is in CakePHP and MySQL.
I have used the following code to insert into the Attributes table, but the instr field alone is not inserted.
function saveFieldname($data) // from untitledfieldname
{
  $this->data['Attribute']['form_id'] = $this->find('all', array(
    'fields' => array('Form.id'),
    'order' => 'Form.id DESC',
  ));
  $this->data['Attribute']['form_id'] = $this->data['Attribute']['form_id'][0]['Form']['id'];
  $this->data['Attribute']['label'] = 'Label';
  $this->data['Attribute']['size'] = '50';
  $this->data['Attribute']['instr'] = 'Fill';
  $this->data['Attribute']['type'] = $data['Attribute']['type'];
  $this->data['Attribute']['sequence_no'] = $data['Attribute']['sequence_no'];
  $this->Attribute->save($this->data);
}
Please advise.
The information about the structure of your table is probably cached. Remove the contents of "app/tmp/cache/models" and try again.
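If you prefer doing it from code, CakePHP also ships a clearCache() helper; a sketch, assuming the 1.x API:
// Delete the cached model schema files under app/tmp/cache/models so
// Cake re-reads the table structure on the next request.
clearCache(null, 'models');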
Note that in development the debug level in app/config/core.php is usually set to > 1, which means you should never run into this issue in development, because Cake will not cache. In production, however, debug is set to 0 in core.php, which causes Cake to start caching.
To add to that: I had removed the cache files in app/tmp/cache/models as dhofstet specified on my production CakePHP app, and the find queries were still not grabbing my new column.
--
As well as clearing the model cache files, I set the debug level to 2 on my production site, did a page refresh, then set it back to 0, and that got it working again. I know this is an ugly approach, but it fixed it for me when no other method was working.