Rendering a very large ResultSet to the screen in JSP - mysql

I am a beginner programmer, so please be patient! ;-)
I need to display a very large ResultSet (over 15,000 rows) from a MySQL query on the screen in the form of an HTML table. I am using JSP pages to try to accomplish this.
The query itself takes about 0.25s on my MySQL server (command-line, via terminal), but the page takes over 8 minutes to render in the browser, even with minimal HTML.
If I write the results to a file using something like new BufferedWriter(new OutputStreamWriter(new FileOutputStream(filename),"ISO-8859-1")), it is also very fast (1.2s), but it's not what the client wants.
So I guess my question is: is there some way I can use the OutputStreamWriter to print the results to the screen instead of writing them to a file? I am assuming that out.print is what is causing the huge delays.
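For illustration, this is roughly the kind of thing I have in mind (an untested sketch as a plain servlet rather than a JSP, since I can't mix the implicit out with the raw output stream there; the table, column and connection details are just placeholders):
// Untested sketch: stream the rows straight to the response, wrapping the
// servlet output stream the same way the file version wraps a FileOutputStream.
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class BigTableServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.setContentType("text/html; charset=ISO-8859-1");
        // Same writer chain as the file version, but aimed at the response.
        BufferedWriter out = new BufferedWriter(
                new OutputStreamWriter(resp.getOutputStream(), "ISO-8859-1"), 64 * 1024);
        try (Connection con = DriverManager.getConnection(
                     "jdbc:mysql://localhost/mydb", "user", "password"); // placeholders
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT col1, col2 FROM my_table")) {
            out.write("<html><body><table>");
            while (rs.next()) {
                // Build each row as one string instead of many small out.print calls.
                out.write("<tr><td>" + rs.getString(1)
                        + "</td><td>" + rs.getString(2) + "</td></tr>");
            }
            out.write("</table></body></html>");
        } catch (Exception e) {
            throw new IOException(e);
        } finally {
            out.flush();
        }
    }
}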
Thanks a lot!

Related

Result set must be in groupings - MYSQL

I am making a report using Java code and it is working fine (image on the right).
Now I want to convert it to BIRT.
Since BIRT is query-focused, I have to change my Java code into a plain query. The problem is that my Java code uses a for-each loop, which MySQL doesn't have.
I found a similar query here, but it did not work for me.
As of now, using a plain query, this is my result:
(screenshot: BIRT result)
What am I missing here? Here's my query:
SELECT item.No_, item.Description, item_ledger_entry.Item_No_,
       item_ledger_entry.Description, item_ledger_entry.Posting_Date,
       item_ledger_entry.External_Document_No_, item_ledger_entry.Document_No_,
       item_ledger_entry.Location_Code, item_ledger_entry.Quantity,
       item_ledger_entry.Entry_Type
FROM pbsdev3.item, pbsdev3.item_ledger_entry
WHERE item.No_ = item_ledger_entry.Item_No_
  AND item.Description = item_ledger_entry.Description
GROUP BY item.No_;
I am a very fresh coder so I don't have much knowledge yet.
The result I want looks like this (it must be like this):
This is the item table and this is the item_ledger_entry table:
Help much appreciated!
Actually, your query is working fine. Make sure you are using a Table instead of a Grid; a Table handles multiple rows of data better than a Grid.
And by the way, you don't need that grouping.

sfPropelPager reduce queries

I'm working on a Symfony project and using sfPropelPager to show a paged list of elements.
The problem is that with a large amount of data to list (i.e. thousands of records), it makes a query to the database for each page it shows! That means about 100 extra queries in my case, and that is unacceptable.
Here is some of my code: the function that returns the pager object:
$pager = new sfPropelPager('MyTable',sfConfig::get('sfPropelPagerLines'));
$c = new Criteria();
$c->add('my_table_field',$value);
$c->addDescendingOrderByColumn('date');
$pager->setCriteria($c);
$pager->init();
return $pager;
So, please, if you know a way to get all the results with only one query, it would be a great solution to my problem. Otherwise I must implement that list with an AJAX call for every page the user wants to see.
Thank you very much for your time.
I'm not sure I understand your problem, but in any case, avoid using Criteria. Try to build your queries with the ModelCriteria API: http://www.propelorm.org/reference/model-criteria.html.
For each paginated page, a query to the database will be made; this is the standard behavior of every pager I know of. If the issue is related objects (assuming you want to display information from relations), you may want to create a query that joins those objects before paginating; that way you'll get one query per page for all the data you display.
Read this doc for instance: http://www.propelorm.org/documentation/03-basic-crud.html#query_termination_methods
In the end I didn't find a solution for the problem, so I had to implement the list via an AJAX call to a function that returns the requested page. That way, when the page loads, no query for this list slows down the user experience.
Thank you anyway for helping me :)

ajax speed of generating select

I had a question here earlier: Mysql select speed.
I figured out that the MySQL SELECT itself is fast, but my problem is that the AJAX call generates a huge select with option elements for cities, for example for all of Brazil.
Is there a way to generate that select faster? If it's too big, the browser lags while it waits to load the full content of the select. I want it to be smoother.
Can anyone help me, please? :(
Try to cache your query result. That means: run the query on a regular basis, save the result (for example in another table or in an XML file), then send the precached data to the browser on request.
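The question doesn't say what the server side is written in; purely as an illustration of the caching idea, a sketch in Java could look something like this (all class names, data and timing details below are made up):
// Illustrative sketch only: rebuild the city list on a schedule and serve the
// cached copy to every request, instead of querying the database each time.
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

public class CityListCache {
    // The precomputed response body (e.g. the option markup or a compact string).
    private final AtomicReference<String> cached = new AtomicReference<>("");
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    public void start() {
        // Refresh every 10 minutes; the AJAX endpoint never waits on the query.
        scheduler.scheduleAtFixedRate(this::rebuild, 0, 10, TimeUnit.MINUTES);
    }

    public String get() {
        return cached.get(); // what the AJAX endpoint returns
    }

    private void rebuild() {
        // Run the expensive SELECT here and format the result once.
        cached.set("<option value='1'>Rio de Janeiro</option>"); // placeholder data
    }
}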
You may want to preload the whole lists into your document and hide them with the display: none; property. Then you do not need to pull the contents of your select box via AJAX. Instead, just show the list you need when the user selects a specific entry and hide the others.
Is your problem on the server or on the client?
Fiddler or Firebug can help you find out how much time is spent processing on the server and how much transferring to the client.
I think you need to change your server side to return small strings, like this:
1,Jerusalem|2,Tel Aviv|3,Ariel
On the client side, you need to split by "|" and then by ",", and use a JS array to build the HTML:
//not tested
var a = string.split("|"), s = [];
for (var i = 0; i < a.length; i++) {
  var b = a[i].split(",");
  s.push("<option value='" + b[0] + "'>" + b[1] + "</option>");
}
$("#div").html(s.join(" "));
JSON is easier to use, but it will make the response larger and slower.
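On the server side (the question doesn't name a backend language, so this JDBC sketch in Java is only an illustration, and the table and connection details are made up), building that compact string could look like:
// Illustration only: build the compact "id,name|id,name" string from a query.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CityStringBuilder {
    public static String buildCityString() throws Exception {
        StringBuilder sb = new StringBuilder();
        try (Connection con = DriverManager.getConnection(
                     "jdbc:mysql://localhost/mydb", "user", "password"); // placeholders
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, name FROM cities")) {
            while (rs.next()) {
                if (sb.length() > 0) {
                    sb.append('|');
                }
                sb.append(rs.getInt("id")).append(',').append(rs.getString("name"));
            }
        }
        return sb.toString(); // e.g. "1,Jerusalem|2,Tel Aviv|3,Ariel"
    }
}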

Classic ASP/MySQL - Parameter Issue

I have been trying to insert a huge text-editor string into my database. The application I'm developing allows my client to create and edit their website's terms and conditions from the admin part of their website. So, as you can imagine, these are incredibly long. I have got to 18,000+ characters in length and have now received an error when trying to add another chunk of text.
The error I am receiving is this:
ADODB.Command error '800a0d5d'
Application uses a value of the wrong type for the current operation
Which points to this part of my application, specifically the Set newParameter line:
Const adVarChar = 200
Const adParamInput = 1
Set newParameter = cmdConn.CreateParameter("#policyBody", adVarChar, adParamInput, Len(policyBody), policyBody)
cmdConn.Parameters.Append newParameter
Now, this policy I am creating, which is currently 18,000+ characters in length, is only half complete, if that. It could jump to 50-60,000! I tried using the adLongVarChar = 201 ADO type, but this still didn't fix it.
Am I doing the right thing for such a large entry? If I am, how can I fix this issue? Or, if I'm doing the wrong thing, what is the right one?
Try to avoid putting documents in your database if you can. Sometimes it's a reasonable compromise for things like serialised objects and markup snippets.
If you don't need to query the document with SQL, the only benefit is having everything in one place: back up your DB and you back up your documents as well, and you can use your DB connectivity exclusively.
That said, nothing is free: carting all that data around in your database costs you.
If you can, have a documents table instead, with a user-facing name for the file, an internal name for your documents directory (so the file name is unique in the file system), and a path, in case there could be more than one directory.
Then just upload and download the selected document as a file when you get or set the related database entity.
You'll need to deal with deployment issues: the documents directory has to exist, and the account you are running the MySQL daemon as must be able to see it. But most of the time, the issues you have keeping documents separate from the DB are much easier to deal with than the head-scratchers you are running into now.
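The question uses Classic ASP, but just to sketch the "file on disk plus a metadata row" pattern described above, here is an illustrative snippet in Java; every class, path and column name in it is made up:
// Illustrative only: save the policy text as a file and record its metadata.
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.UUID;

public class PolicyStore {
    // The documents directory must exist and be readable by the application account.
    private final Path documentsDir = Paths.get("/var/app/documents");

    public void savePolicy(Connection con, String userFileName, String policyBody) throws Exception {
        // Internal name is unique in the file system, independent of the user-facing name.
        String internalName = UUID.randomUUID() + ".html";
        Files.writeString(documentsDir.resolve(internalName), policyBody);

        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO documents (user_name, internal_name, path) VALUES (?, ?, ?)")) {
            ps.setString(1, userFileName);
            ps.setString(2, internalName);
            ps.setString(3, documentsDir.toString());
            ps.executeUpdate();
        }
    }
}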

How IQueryables are dealt with in ASP.NET MVC Views?

I have some tables in a MySQL database that represent records from a sensor. One of the features of the system I'm developing is to display these records from the database to the web user, so I used the ADO.NET Entity Data Model to create an ORM, used LINQ to SQL to get the data from the database, and stored it in a ViewModel I designed, so I can display it using the MVCContrib Grid Helper:
public IQueryable<TrendSignalRecord> GetTrends()
{
    var dataContext = new SmgerEntities();
    var trendSignalRecords = from e in dataContext.TrendSignalRecords
                             select e;
    return trendSignalRecords;
}

public IQueryable<TrendRecordViewModel> GetTrendsProjected()
{
    var projectedTrendRecords = from t in GetTrends()
                                select new TrendRecordViewModel
                                {
                                    TrendID = t.ID,
                                    TrendName = t.TrendSignalSetting.Name,
                                    GeneratingUnitID = t.TrendSignalSetting.TrendSetting.GeneratingUnit_ID,
                                    //{...}
                                    Unit = t.TrendSignalSetting.Unit
                                };
    return projectedTrendRecords;
}
I call the GetTrendsProjected method and then use LINQ to SQL to select only the records I want. It works fine in my development scenario, but when I test it in a real scenario, where the number of records is much greater (around a million), it stops working.
I put in some debug messages to test it, and everything works fine until it reaches the return View() statement, where it simply stops and throws a MySQLException: Timeout expired. That left me wondering whether the data I send to the page is retrieved by the page itself (i.e. it only queries the database for the displayed items when the page needs them, or something like that).
All of my other pages use the same set of tools (MVCContrib Grid Helper, ADO.NET, LINQ to SQL, MySQL), and everything else works fine.
You absolutely should paginate your data set before executing the query if you have millions of records. This can be done using the .Skip and .Take extension methods, and they should be called before any query is actually run against your database.
Trying to fetch millions of records from a database without pagination would very likely cause a timeout at best.
Well, assuming the information in this blog is correct, the .AsPagination method requires you to sort your data by a particular column. It's possible that doing an OrderBy on a table with millions of records is simply a time-consuming operation and times out.