Problem with loading large MySQL Table into JSON

We have a large MySQL table with roughly 50,000 records (rows). Loading all of the records, that is, getting the data displayed in the browser in a JavaScript grid, takes close to 20 minutes, which is a big problem for us. We have even tried returning all of the data as JSON following this approach: How to convert result table to JSON array in MySQL, but we still face a prolonged delay in loading.
The query is very straightforward, as shown below. Can anyone suggest a way to overcome this problem? Thanks in advance!
<?php
include 'includes/xxx.php';

$returnArray = array();
$totalSymptoms = 0;
$matchedSymptomIds = array();
$cutOff = 10;
$runningGlobalSymptomId = "";

$symptomResult = mysqli_query($db, "SELECT * FROM comparison_small WHERE (matched_percentage IS NULL OR matched_percentage >= ".$cutOff.")");
if(mysqli_num_rows($symptomResult) > 0){
    while($symRow = mysqli_fetch_array($symptomResult)){
        $dataArray = array();
        $totalSymptoms++;
        if($symRow['current_symptom'] == '1'){
            // Global symptom
            $runningGlobalSymptomId = $symRow['symptom_id'];
            $dataArray['symptom_id'] = $symRow['symptom_id'];
            $dataArray['row_id'] = "row".$symRow['symptom_id'];
            $dataArray['symptom'] = $symRow['symptom'];
            $dataArray['match'] = "";
        }else{
            // Local symptom
            array_push($matchedSymptomIds, $symRow['symptom_id']);
            $dataArray['symptom_id'] = $symRow['symptom_id'];
            $dataArray['row_id'] = "row".$runningGlobalSymptomId."_".$symRow['symptom_id'];
            $dataArray['symptom'] = $symRow['symptom'];
            $dataArray['match'] = $symRow['matched_percentage'];
        }
        $returnArray[] = $dataArray;
    }
}
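A minimal sketch of one way to cut the load time, assuming the grid can request data page by page rather than all 50,000 rows at once: only a slice of the table is queried and returned per request. The page and pageSize parameter names are made up for this example; the include, table and column names are taken from the question.
<?php
// Sketch only: serve the grid one page at a time instead of all rows at once.
include 'includes/xxx.php';

$cutOff   = 10;
$page     = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
$pageSize = isset($_GET['pageSize']) ? min(500, (int)$_GET['pageSize']) : 100;
$offset   = ($page - 1) * $pageSize;

// Prepared statement with LIMIT/OFFSET so only one page travels to the browser.
$stmt = mysqli_prepare(
    $db,
    "SELECT symptom_id, symptom, current_symptom, matched_percentage
     FROM comparison_small
     WHERE matched_percentage IS NULL OR matched_percentage >= ?
     ORDER BY symptom_id
     LIMIT ? OFFSET ?"
);
mysqli_stmt_bind_param($stmt, 'iii', $cutOff, $pageSize, $offset);
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt);

$returnArray = array();
while ($row = mysqli_fetch_assoc($result)) {
    $returnArray[] = $row;
}

header('Content-Type: application/json');
echo json_encode($returnArray);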


Can't wrap my head around MySQL statement

I have two tables:
cache and main
In cache there are a lot of columns; in main a few less. A UNION is not going to work because of the unequal number of columns.
cache
client - file - target - many other columns
main
client - file - target - few other columns
From cache I would like all columns for the rows where main.target LIKE '%string%', cache.client = main.client and cache.file = main.file.
For these particular records, target, client and file are always the same in main and cache.
I just can't get my head around this, but then again MySQL never was my strongest point.
Thank you very much in advance!
In the end, combining the two SELECT statements with a UNION made things very complicated, for the simple reason that there were countless other queries, some without a UNION, that all had to be processed by the same end routine presenting the results. As this was only a one-time query and time wasn't really an issue, in the end I just ran a SELECT on each of the two tables and combined the results by checking whether a certain field was present. If it was not, the remaining values had to be fetched from the cache table; if it was, they had to be fetched from the main table.
I actually wonder whether this solution is faster, slower or just as fast.
if (!isset($row['current']))
{
    $field = $row['field'];
    $sqlcache = "SELECT * FROM " . $dbtable . " WHERE (client = '$sqlclient' AND file = '$sqlfile' AND field = '$field')";
    $resultcache = $conn->query($sqlcache);
    if (!$resultcache)
    {
        die($conn->error);
    }
    $rowcache = $resultcache->fetch_assoc();
    $currenttarget = $rowcache['current'];
    $context = $rowcache['context'];
    $dirtysource = $rowcache['dirtysource'];
    $stringid = $rowcache['stringid'];
    $limit = $rowcache['maxlength'];
    $locked = $rowcache['locked'];
    $filei = $rowcache['filei'];
}
else
{
    $currenttarget = $row['current'];
    $context = $row['context'];
    $dirtysource = $row['dirtysource'];
    $stringid = $row['stringid'];
    $limit = $row['maxlength'];
    $locked = $row['locked'];
    $filei = $row['filei'];
}
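For the record, the join described in the question (every cache row whose matching main row, same client and file, has a target containing the string) can be written without a UNION. A sketch, using only the column names given above:
-- All cache columns for rows whose matching main row has the wanted target.
SELECT c.*
FROM cache AS c
JOIN main AS m
    ON  m.client = c.client
    AND m.file   = c.file
WHERE m.target LIKE '%string%';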

How to use node streams to insert big data into mysql?

I'm trying to use a Node stream to insert 10 million records into MySQL. Is there a way to do this with streams? I'm not finding very useful or 'friendly' answers or documentation about this anywhere. So far I'm able to insert 45K records, but I get errors when trying any record set bigger than that.
Also, what is the callback in the code below supposed to do here? I'm not sure where I got this code from, and I'm not actually passing a callback, so maybe that's the problem. Any ideas what the callback should actually be? Should it take a chunk and pass one chunk at a time? How could I rework this to get it to work consistently? I don't think this code is actually splitting the data up into chunks at all. How do I split it up into manageable chunks?
Depending on the number of records I try this with, I get different errors. The errors I am getting are:
For 50K - 80K sometimes I get this error:
Error: connect ETIMEDOUT
at Connection._handleConnectTimeout
I get this error for 100K records or above:
Error: ER_NET_PACKET_TOO_LARGE: Got a packet bigger than 'max_allowed_packet' bytes
at Query.Sequence._packetToError
This error for around 55K records:
Error: write EPIPE
at WriteWrap.afterWrite [as oncomplete] (net.js:788:14)
It's kind of wild to get three different errors depending on the number of records I'm trying to insert.
Here's the code (It's working fine for 45000 records, but not for anything bigger):
var db = require('./index.js');
var faker = require('faker');
var stream = require('stream');

var songs = [];
var size = 10000000;

var songList = function(){
    for (var i = 0; i < size; i++) {
        var song = [i, faker.random.words(1,2), faker.name.findName(), faker.internet.url(1,50), faker.random.words(1,2), faker.random.words(1,20)];
        songs.push(song);
    }
    console.log('songs', songs);
    return songs;
}

var songSql = "INSERT INTO songs (song_id, song_name, artist, song_url, song_album, song_playlist) VALUES ?";
var songValues = songList();

var songSeed = function() {
    console.log('x: ', x);
    var query = db.connection.query(songSql, [songValues]).stream({highWaterMark: 5});
    var testStream = new stream.Transform({highWaterMark: 5, objectMode: true});
    testStream._write = function(chunk, encoding, callback) {
        setTimeout(function() {
            console.log('my chunk: ', chunk);
            callback();
        }, 1000);
    }

    // Pipe the query stream into the testStream
    query.pipe(testStream);

    // Monitor data events on the side to see when we pause
    query.on("result", function(d, i) {
        console.log("Data Sent");
    });
}

songSeed();
On the MySQL server increase max_allowed_packet to 1G. There's no real downside to this.
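Splitting the rows into smaller batches also avoids the max_allowed_packet error, since no single INSERT statement has to carry all of the rows. A rough sketch, assuming db.connection is the node-mysql connection used in the question; the batch size and the stand-in row generator are arbitrary placeholders, not the original code.
// Sketch only: run one INSERT per fixed-size batch instead of one giant statement.
var db = require('./index.js');

var BATCH_SIZE = 5000;
var songSql = 'INSERT INTO songs (song_id, song_name, artist, song_url, song_album, song_playlist) VALUES ?';

// Stand-in for the faker-based songList() in the question.
function makeRows(count) {
    var rows = [];
    for (var i = 0; i < count; i++) {
        rows.push([i, 'song ' + i, 'artist ' + i, 'http://example.com/' + i, 'album ' + i, 'playlist ' + i]);
    }
    return rows;
}

function seedInBatches(allRows, done) {
    var offset = 0;
    (function next(err) {
        if (err) return done(err);
        if (offset >= allRows.length) return done(null);
        var batch = allRows.slice(offset, offset + BATCH_SIZE);
        offset += BATCH_SIZE;
        // The query callback has the (err, results) signature, so passing
        // next starts the following batch when this one finishes.
        db.connection.query(songSql, [batch], next);
    })();
}

seedInBatches(makeRows(100000), function (err) {
    if (err) throw err;
    console.log('all rows inserted');
    db.connection.end();
});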

SQL SELECT query excluding an array of id's for infinite scroll [duplicate]

I have followed the help in this topic: Using infinite scroll w/ a MySQL Database
And I have gotten close to getting this working properly. I have a page that is displayed in blocks using jQuery Masonry, and the blocks are populated with data from a MySQL database. When I scroll to the end of the page I successfully get the loading.gif image, but immediately after it the page says "No more posts to show.", which is what it should say only if that were true. I am only loading 5 posts initially out of about 10-15, so the rest of the posts should load when I reach the bottom of the page, yet I get the message that is supposed to appear when there really aren't any more posts.
Here is my javascript:
var loading = false;
$(window).scroll(function(){
    if($(window).scrollTop() == $(document).height() - $(window).height()) {
        var h = $('.blockContainer').height();
        var st = $(window).scrollTop();
        var trigger = h - 250;
        if((st >= 0.2*h) && (!loading) && (h > 500)){
            loading = true;
            $('div#ajaxLoader').html('<img src="images/loading.gif" name="HireStarts Loading" title="HireStarts Loading" />');
            $('div#ajaxLoader').show();
            $.ajax({
                url: "blocks.php?lastid=" + $(".masonryBlock:last").attr("id"),
                success: function(html){
                    if(html){
                        $(".blockContainer").append(html);
                        $('div#ajaxLoader').hide();
                    }else{
                        $('div#ajaxLoader').html('<center><b>No more posts to show.</b></center>');
                    }
                }
            });
        }
    }
});
Here is the PHP on the page the blocks are actually on. This page initially outputs 5 items from the database. The JavaScript grabs the id of the last post and sends it via AJAX to the blocks.php script, which then uses that id to grab the rest of the items from the database.
$allPosts = $link->query("/*qc=on*/SELECT * FROM all_posts ORDER BY post_id DESC LIMIT 5");
while($allRows = mysqli_fetch_assoc($allPosts)) {
    $postID = $link->real_escape_string(intval($allRows['post_id']));
    $isBlog = $link->real_escape_string(intval($allRows['blog']));
    $isJob = $link->real_escape_string(intval($allRows['job']));
    $isVid = $link->real_escape_string(intval($allRows['video']));
    $itemID = $link->real_escape_string(intval($allRows['item_id']));
    if($isBlog === '1') {
        $query = "SELECT * FROM blogs WHERE blog_id = '".$itemID."' ORDER BY blog_id DESC";
        $result = $link->query($query);
        while($blogRow = mysqli_fetch_assoc($result)) {
            $blogID = $link->real_escape_string($blogRow['blog_id']);
            $blogTitle = $link->real_escape_string(html_entity_decode($blogRow['blog_title']));
            $blogDate = $blogRow['pub_date'];
            $blogPhoto = $link->real_escape_string($blogRow['image']);
            $blogAuthor = $link->real_escape_string($blogRow['author']);
            $blogContent = $link->real_escape_string($blogRow['content']);
            // clean up the text
            $blogTitle = stripslashes($blogTitle);
            $blogContent = html_entity_decode(stripslashes(truncate($blogContent, 150)));
            echo "<div class='masonryBlock' id='".$postID."'>";
            echo "<a href='post.php?id=".$blogID."'>";
            echo "<div class='imgholder'><img src='uploads/blogs/photos/".$blogPhoto."'></div>";
            echo "<strong>".$blogTitle."</strong>";
            echo "<p>".$blogContent."</p>";
            echo "</a>";
            echo "</div>";
        }
    }
}
Here is the php from the blocks.php script that the AJAX calls:
// if there is a query in the URL
if(isset($_GET['lastid'])) {
    // get the starting ID from the URL
    $startID = $link->real_escape_string(intval($_GET['lastid']));
    // make the query, querying 25 rows per run
    $result = $link->query("SELECT * FROM all_posts ORDER BY post_id DESC LIMIT '".$startID."', 25");
    $html = '';
    // put the table rows into variables
    while($allRows = mysqli_fetch_assoc($result)) {
        $postID = $link->real_escape_string(intval($allRows['post_id']));
        $isBlog = $link->real_escape_string(intval($allRows['blog']));
        $isJob = $link->real_escape_string(intval($allRows['job']));
        $isVid = $link->real_escape_string(intval($allRows['video']));
        $itemID = $link->real_escape_string(intval($allRows['item_id']));
        // if the entry is a blog
        if($isBlog === '1') {
            $query = "SELECT * FROM blogs WHERE blog_id = '".$itemID."' ORDER BY blog_id DESC";
            // caution: this overwrites $result from the outer query, so the outer while ends after the first post
            $result = $link->query($query);
            while($blogRow = mysqli_fetch_assoc($result)) {
                $blogID = $link->real_escape_string($blogRow['blog_id']);
                $blogTitle = $link->real_escape_string(html_entity_decode($blogRow['blog_title']));
                $blogDate = $blogRow['pub_date'];
                $blogPhoto = $link->real_escape_string($blogRow['image']);
                $blogAuthor = $link->real_escape_string($blogRow['author']);
                $blogContent = $link->real_escape_string($blogRow['content']);
                $blogTitle = stripslashes($blogTitle);
                $blogContent = html_entity_decode(stripslashes(truncate($blogContent, 150)));
                $html .= "<div class='masonryBlock' id='".$postID."'>
                    <a href='post.php?id=".$blogID."'>
                    <div class='imgholder'><img src='uploads/blogs/photos/".$blogPhoto."'></div>
                    <strong>".$blogTitle."</strong>
                    <p>".$blogContent."</p>
                    </a></div>";
            }
        }
    }
    echo $html;
}
I have tried using the jQuery infinite-scroll plugin, but it seemed much more difficult to do it that way. I don't know what the issue is here. I have added alerts and done some testing, and the JavaScript runs all the way through, so the problem must be in blocks.php, right?
EDIT: I have made a temporary fix for this issue by changing the SQL query to SELECT * FROM all_posts WHERE post_id < '".$startID."' ORDER BY post_id DESC LIMIT 15
The blocks are now loading via AJAX, however they are only loading one block at a time. The AJAX sends a request for every single block and they fade in one after another; is it possible to make them all fade in at once with jQuery Masonry?
I saw your code in another answer, and I would recommend using MySQL's LIMIT with a computed offset instead of keying off the last id. Example:
SELECT * FROM all_posts ORDER BY post_id DESC LIMIT ".(((int)$page)*5).",5
This just takes a page number from the AJAX request and computes the offset automatically. It's one consistent query and works independently of the last results on the page. Send something like page=1 or page=2 in your jQuery code. This can be done a couple of different ways.
First, count the number of elements already built on the page and divide by the number per page; that gives you the current page number.
Second, you can use jQuery and bind the current page number to the body:
$('body').data('page', 1)
Increment it by one on each page load.
Doing it this way is really the better approach, because it uses one query for all of the operations and doesn't need much information about the data already on the page.
The only thing to note is that this logic requires the first page request to be 0, not 1: 1*5 evaluates to 5 and skips the first 5 rows, while 0*5 evaluates to 0 and skips nothing.
Let me know if you have any questions!
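To make that concrete, here is a rough sketch of the client side of the page-counter idea. It assumes blocks.php is changed to read a page parameter instead of lastid (that parameter name is an assumption); the selectors are the ones from the question, and this is a sketch rather than a drop-in replacement.
// Sketch only: track the current page on the body and request the next page.
$('body').data('page', 0);  // the first request has to be page 0, as noted above

function loadNextPage() {
    var page = $('body').data('page');
    $.ajax({
        url: 'blocks.php?page=' + page,  // assumes blocks.php computes LIMIT page*5,5
        success: function(html) {
            if (html) {
                $('.blockContainer').append(html);
                $('body').data('page', page + 1);  // advance only when something came back
            } else {
                $('div#ajaxLoader').html('<center><b>No more posts to show.</b></center>');
            }
        }
    });
}
// Call loadNextPage() from the existing scroll handler in place of the current $.ajax block.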
Have you tried doing any debugging?
If you are not already using it, I would recommend getting the Firebug plugin.
Does the AJAX call return empty? If it does, try echoing the SQL and verify that it is the correct statement and that all the variables contain the expected information. A lot of things could fail, considering there's a lot of communication happening between the client, the server and the database.
In response to your comment, you are adding the html in this piece of code:
if(html){
$(".blockContainer").append(html);
$('div#ajaxLoader').hide();
}
I would do a console.log(html) and console.log($(".blockContainer").length) before the if statement.

How to save data from a table from another database in Laravel?

I need to save data from an order table in one database to another database through Laravel. I created a function in my controller like this:
public function getMarketplace()
{
    $orderoc = OrderOC::orderBy('oc_order.date_added', 'desc')
        ->join('oc_order_product', 'oc_order_product.order_id', 'oc_order.order_id')
        ->join('oc_order_history', 'oc_order_history.order_id', 'oc_order.order_id')
        ->where('oc_order_history.order_status_id', '=', '17')
        ->get();

    foreach($orderoc as $oc){
        $ordererp = new Order;
        $ordererp->erp_createdid = $oc->created_id;
        $ordererp->erp_marketplaceid = 1;
        $ordererp->erp_site = rand(1,100000000);
        $ordererp->erp_payment_method = $oc->payment_method;
        $ordererp->erp_orderdate = $oc->date_added;
        $ordererp->erp_orderaprove = $oc->date_added;
        $ordererp->erp_billingid = 1;
        $ordererp->erp_shippingid = 1;
        $ordererp->erp_marketplace = 'Comércio Urbano';
        $ordererp->erp_orderquantity = $oc->quantity;
        $ordererp->erp_erro = '';
        $ordererp->erp_product_ok = 1;
        $ordererp->erp_compraId = null;
        $ordererp->save();

        if(strlen($oc->created_id) == 0){
            $oc->created_id = rand(1,10000000);
            $oc->save();
        }

        $orderprod = new OrderProduct;
        $orderprod->erp_productid = $oc->product_id;
        $orderprod->erp_createdid = $oc->created_id;
        $orderprod->erp_model = $oc->model;
        $orderprod->erp_quantity = $oc->quantity;
    }
}
One table is from my ERP and the other receives OpenCart purchases, but every time I run this, the same product appears more than once in my order table.
(You can see it through the purchase date, since created_id is created in the controller function.)
Does anyone know why the data is duplicated when inserted inside a foreach? This is not the first time this has happened; if you can suggest a more robust way of doing the job, I'm grateful. Any suggestions? Thank you in advance!
One possibility is to put a unique constraint (or unique validation) on the table that receives the data.
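For what it's worth, a likely cause of the duplicates is the query itself: joining oc_order_product and oc_order_history produces one row per matching product and history entry, so the same OpenCart order arrives several times in the foreach and gets saved once per row. A minimal sketch of guarding against that inside the loop; the $seen array is my addition, and order_id comes from the oc_order table already selected by the question's query.
// Sketch only, not the original code.
$seen = [];

foreach ($orderoc as $oc) {
    // Skip rows belonging to an OpenCart order already saved in this run.
    if (isset($seen[$oc->order_id])) {
        continue;
    }
    $seen[$oc->order_id] = true;

    $ordererp = new Order;
    $ordererp->erp_payment_method = $oc->payment_method;
    $ordererp->erp_orderdate = $oc->date_added;
    // ... the remaining fields exactly as in the question ...
    $ordererp->save();
}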

How to get last inserted id with insert method in laravel

In my Laravel project I am inserting multiple records at a time with the ModelName::insert method, and now I want to get the last inserted id. I read somewhere that when you insert multiple records with a single insert call and ask for the last inserted id, you get the first id of the inserted batch. My first question is how to get that id with the following code. If I can get the first id of the batch, I can build the other ids myself with an incrementing variable.
Code to insert multiple records:
if(!empty($req->contract_name) && count($req->contract_name) > 0)
{
    for($i = 0; $i < count($req->contract_name); $i++)
    {
        $contract_arr[$i]['client_id'] = $this->id;
        $contract_arr[$i]['contract_name'] = $req->contract_name[$i];
        $contract_arr[$i]['contract_code'] = $req->contract_code[$i];
        $contract_arr[$i]['contract_type'] = $req->contract_type[$i];
        $contract_arr[$i]['contract_ext_period'] = $req->contract_ext_period[$i];
        $contract_arr[$i]['contract_email'] = $req->contract_email[$i];
        $contract_arr[$i]['created_at'] = \Carbon\Carbon::now();
        $contract_arr[$i]['updated_at'] = \Carbon\Carbon::now();
        $contract_arr[$i]['created_by'] = Auth::user()->id;
        $contract_arr[$i]['updated_by'] = Auth::user()->id;
        if($req->startdate[$i] != ''){
            $contract_arr[$i]['startdate'] = date('Y-m-d', strtotime($req->startdate[$i]));
        }
        if($req->enddate[$i] != ''){
            $contract_arr[$i]['enddate'] = date('Y-m-d', strtotime($req->enddate[$i]));
        }
    }
    if(!empty($contract_arr)){
        Contract::insert($contract_arr);
    }
}
Eloquent's insert() returns a plain boolean, so you can't chain anything onto it. You can grab the id right after the insert from the underlying PDO connection:
Contract::insert($contract_arr);
$lastId = DB::getPdo()->lastInsertId();
If I see it right, you're using a Model. A direct insert only returns a success boolean. Try this instead:
Contract::create($contract_arr)->getKey()
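For the specific case in the question (get the first id of the batch and derive the rest by hand), here is a sketch assuming MySQL and an auto-increment primary key on the contracts table; the consecutive-id part only holds under the "consecutive" auto-increment lock mode, as noted in the comments.
// Sketch only, assuming MySQL with an auto-increment primary key.
use Illuminate\Support\Facades\DB;

// For a multi-row INSERT, MySQL's LAST_INSERT_ID() returns the id generated
// for the FIRST row of the batch.
Contract::insert($contract_arr);
$firstId = (int) DB::getPdo()->lastInsertId();

// With innodb_autoinc_lock_mode = 1 ("consecutive") the batch receives
// consecutive ids, so the remaining ids follow in the same order as
// $contract_arr. With other lock modes this is not guaranteed.
$ids = range($firstId, $firstId + count($contract_arr) - 1);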