I'm using http://datatables.net/extensions/tabletools/ on my localhost (WAMP server). It works fine there, but when I put the same code on my online server, it doesn't work.
I am using the latest versions of DataTables.
tableTools: {
"sSwfPath": "https://datatables.net/release-datatables/extensions/TableTools/swf/copy_csv_xls_pdf.swf",
"sRowSelect": "os",
"sRowSelector": 'td:first-child',
// "aButtons": [ "copy", "csv", "xls","pdf","print","select_all", "select_none" ]
"aButtons": [
"copy",
"print", {
"sExtends": "collection",
"sButtonText": "Save", // button name
// "aButtons": [ "csv", "xls", "pdf" ]
"aButtons": [
"csv",
"xls", {
"sExtends": "pdf",
"sPdfOrientation": "landscape",
"sPdfMessage": "List of product."
},
"print"
]
}
]
}
At first nothing happened when I clicked the Copy, PDF, CSV, or XLS buttons. I thought my SWF path was wrong, so I replaced it with the online link above. Now the buttons respond to clicks, and the Copy button shows its message, but when I paste into Notepad the clipboard is blank. PDF, CSV, and XLS are still not working; only Print works perfectly. Please let me know what the issue is, because on my localhost everything works fine; it only breaks on my online server.
I am pretty sure that datatables.net is actively blocking direct use of the .swf. Allan Jardine has commented on hotlinking the .swf files several times:
datatables.net is not a CDN server and should not be used as such. It
is not designed to be, and I might add throttling for hotlinking in
future as a huge amount of bandwidth is being used and causing
unnecessary load. You'll get much better performance from using a
proper CDN or even a locally hosted file.
However, with the introduction of 1.10.x a real CDN has finally been established, including all the TableTools resources -> http://cdn.datatables.net/tabletools/2.2.2/
So replace the sSwfPath with:
http://cdn.datatables.net/tabletools/2.2.2/swf/copy_csv_xls_pdf.swf
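In your initialisation only the sSwfPath needs to change; the rest of your tableTools options can stay as they are (buttons shortened here):
tableTools: {
    // load the Flash export component from the DataTables CDN instead of hotlinking datatables.net
    "sSwfPath": "http://cdn.datatables.net/tabletools/2.2.2/swf/copy_csv_xls_pdf.swf",
    "sRowSelect": "os",
    "sRowSelector": 'td:first-child',
    "aButtons": [ "copy", "print", "csv", "xls", "pdf" ]
}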
I meant you should post it as a completely new question, since it is in fact a new question! :) Anyway, the problem is that you need to render the data when you produce a PDF. Otherwise you just get the $(element).text() output, including the select and its options. Like this:
"aButtons": [
"copy",
"csv",
"xls",
{ "sExtends": "pdf",
"fnCellRender": function ( sValue, iColumn, nTr, iDataIndex ) {
//extract the value of the select
if ( iColumn === 7 ) {
var val=$(sValue).find('select').val();
return (val!=='') ? val : 'not set';
}
//create a dummy text for the HTML-link
if ( iColumn === 8 ) {
return 'click';
}
return sValue;
}
},
"print",
"select_all",
"select_none"
]
See your code here (as close as I can get) -> http://jsfiddle.net/3F8ZJ/.
However, you still have a problem caused by your mRender rendering: the column positions get messed up because it breaks the internal <table> structure. Why are you inserting extra <td>..</td> elements? Returning just the cell content should be enough (roughly like the sketch below), but I don't have time to look into the rest at the moment.
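Something like this (an untested sketch based on the column with the select):
"mRender": function (data, type, full) {
    // DataTables creates the surrounding <td> itself, so return only the cell content
    return '<select id="dynamic_select_' + full[0] + '" name="dynamic_select_' + full[0] + '">' +
               '<option value="">Select</option>' +
               '<option value="test.php?id=' + full[0] + '">10</option>' +
           '</select>';
}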
Since I want to show the code in a proper format, I am posting this as a new answer.
@davidkonrad: I found out that when I remove columnDefs from the code below, my PDF shows the proper records. By the way, my CSV, Excel, and Print output show the proper records even with columnDefs; only the PDF does not work with columnDefs.
I also realised that the word "Select" that shows up comes from exactly that, since I use a dropdown in columnDefs.
dt = $('#example').DataTable( {
"dom": '<"clear">T<"clear"><"clear">lfrtip',
"pagingType": "full_numbers",
"scrollY": "440px",
"scrollX": "100%",
"scrollCollapse": true,
"bProcessing": true,
"bServerSide": true,
"sAjaxSource": "includes/db/server_processing.php",
"deferRender": true,
"aaSorting":[[0, "desc"]],
"aoColumns": [
{ className: "center", },
{ className: "center", },
{ className: "center", },
{ className: "center", },
{ className: "center", },
{ className: "center", },
{ className: "center", },
],
"columnDefs": [
{
"aTargets":[7],
"fnCreatedCell": function(nTd, sData, oData, iRow, iCol)
{
$(nTd).css('text-align', 'center');
},
"mData": null,
"mRender": function( data, type, full) {
return '<td><select id="dynamic_select_'+full[0]+'" name="dynamic_select_'+full[0]+'">\n\
<option id="0" value="">Select</option>\n\
<option id="1_'+full[0]+'" value="test.php?id='+full[0]+'">10</option>\n\
<option id="2_'+full[0]+'" value="test2.php?id='+full[0]+'">12</option>\n\
<option id="3_'+full[0]+'" value="test3.php?id='+full[0]+'">13</option>\n\
</select></td>';
//return '<button>Click!</button>';
}
},
{
"aTargets":[8],
"fnCreatedCell": function(nTd, sData, oData, iRow, iCol)
{
$(nTd).css('text-align', 'center');
},
"mData": null,
"mRender": function(data, type, full){
//return '<button>Click!</button>';
return '<div id="container">Click</div>';
}
}
]
} );
I am having a strange issue with an Angular 7.1.1 and Electron 4.1.4 project.
Data Flow:
Angular Component "Report Builder" collects report configuration options from a FormGroup and FormControl validated form and sends data to docx-templater.service
User Button triggers createReport() function
When submitting options for a complete report, the createReport() function calls dataService's fnGetCompleteControlList() which returns properly configured JSON asynchronously.
with a .then() function after the async data retrieval, the createReport() function combines the output directory which is part of the configuration form and sends both to the docx-templater.service's createCompleteDocument() function. Once the promise is returned it updates the UI.
Angular Service "docx-templater"'s createCompleteDocument function passes the data and folder values to the ipcRenderer.send for the electron "writeCompleteDocument" channel and returns a promise.
In my main.ts, I have an ipcMain.on for the "writeCompleteDocument" channel that passes the data to a write-docx function for processing that data into a word document.
Problem:
When the data gets to my write-docx function, it is missing a sub-array of objects that is essential to the export process.
I have verified that the data is perfect in Electron's Chrome Developer Tools console at the moment just before it is sent to docx-templater.service, and again just before that service sends it to ipcRenderer (meaning my data service and Report Builder functions are working as designed). When I check the data in main.ts by saving it off to a JSON file, the controls sub-array is missing from the second object of the JSON only; it shows up in the first object as expected.
I will note that what comes out of the ipcMain function is properly formed JSON, so it really has just excluded the "controls" sub-array and is not truncating due to memory or buffer limits or anything like that.
report-builder.component.ts
createReport() {
if (this.reportBuilderFG.get('allControls').value) {
this.db.fnGetCompleteControlList()
.then((groups: Group[]) => {
this.word.createCompleteDocument(groups, this.reportBuilderFG.get('folder').value + '\\filename.docx')
.then(() => {
this.openSnackBar(this.reportBuilderFG.get('folder').value + '\\filename.docx created successfully');
});
});
} else {
// Do other stuff
}
} // end of createReport()
docx-templater.service.ts
createCompleteDocument(data, folder: string): Promise<boolean> {
return new Promise(resolve => {
console.log(data); // <=== Data is perfect here.
ipcRenderer.send('writeCompleteDocument', {data: data, folder: folder});
resolve();
});
}
main.ts
import { ipcMain } from 'electron';
import * as fs from 'fs';
import { writeCompleteDocument } from './node_scripts/write-docx';

ipcMain.on('writeCompleteDocument', (event, arg) => {
  // dump the received payload to disk for inspection
  fs.writeFileSync('IPCdata.json', JSON.stringify(arg.data, null, 2)); // <==== Part of the data is missing here.
  writeCompleteDocument(arg.data, arg.folder);
});
Good Data Example (some keys and objects excluded for brevity)
[
  {
    "name": "General Security",
    "order": 1,
    "subgroups": [
      {
        "_id": "GOV",
        "name": "Governance",
        "order": 1,
        "controls": [
          {
            "group": "GS",
            "subgroup": "GOV",
            "active": true,
            "printOrder": 1,
            "name": "This is my GS control name",
            "requirements": [
              {
                "id": "SA01",
                "active": true,
                "order": 1,
                "type": "SA",
                "applicability": [
                  "ABC",
                  "DEF",
                  "GHI"
                ]
              },
              { ... 3 more }
            ],
            "_id": "GSRA-03",
            "_rev": "1-0cbdefc93e56683bc98bae3a122f9783"
          },
          { ... 3 more }
        ]
      }
    ],
    "_id": "GS",
    "_rev": "1-b94d1651589eefd5ef0a52360dac6f9d"
  },
  {
    "order": 2,
    "name": "IT Security",
    "subgroups": [
      {
        "_id": "PLCY",
        "order": 1,
        "name": "Policies",
        "controls": [ <==== This entire sub array is missing when exporting from IPC Main
          {
            "group": "IT",
            "subgroup": "PLCY",
            "active": true,
            "printOrder": 1,
            "name": "This is my IT control name",
            "requirements": [
              {
                "id": "SA01",
                "active": true,
                "order": 1,
                "type": "SA",
                "applicability": [
                  "ABC",
                  "DEF",
                  "GHI"
                ]
              }
            ],
            "_id": "GSRA-03",
            "_rev": "1-0cbdefc93e56683bc98bae3a122f9783"
          }
        ]
      }
    ],
    "_id": "IT",
    "_rev": "2-e6ff53456e85b45d9bafd791652a945c"
  }
]
I would have expected ipcRenderer to pass the JSON to the ipcMain.on function exactly as it is, but somehow it is trimming part of the data. I have even tried stringifying the data before sending it through the renderer and then parsing it on the other side, but that did nothing.
Could this be an async thing? I am at a loss as to where to go next to debug and find what idiot mistake I made in the process.
Also, I realize that the above data flow seems overly complex for what I am doing, and that I could probably do it more simply, but it makes sense (kinda) for the way the whole application is structured, so I am going to go with it if I can squash this bug.
Looks like your createCompleteDocument() function is set up incorrectly. A quick search showed me that ipcRenderer.send() is asynchronous, but you are resolving your promise (almost) synchronously.
You have the following, which is (probably) incorrect (actually it's definitely incorrect, because you've typed the return as Promise<boolean> when it is Promise<void>):
createCompleteDocument(data, folder: string): Promise<boolean> {
return new Promise(resolve => {
ipcRenderer.send('writeCompleteDocument', {data: data, folder: folder});
resolve();
});
}
ipcRenderer#send() is async, but you are calling resolve() immediately afterwards without waiting for the main process to respond. This probably explains why adding the setTimeout() fixes the problem for you. Looking at the ipcRenderer docs, the following probably does what you want:
createCompleteDocument(data, folder: string): Promise<Event> {
return new Promise(resolve => {
ipcRenderer.once('writeCompleteDocument', resolve);
ipcRenderer.send('writeCompleteDocument', {data: data, folder: folder});
});
}
Looks like the callback is passed an Event object.
Another option would be to simply replace ipcRenderer#send() with ipcRenderer#sendSync() in your original code, but as pointed out in that method's documentation:
Sending a synchronous message will block the whole renderer process, unless you know what you are doing you should never use it.
Making use of ipcRenderer#send() and ipcRenderer#once() is almost definitely the way to go.
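Note that ipcRenderer.once('writeCompleteDocument', resolve) will only ever fire if the main process sends something back on the same channel, so your ipcMain handler would need a reply along these lines (a sketch; the 'done' payload is arbitrary):
ipcMain.on('writeCompleteDocument', (event, arg) => {
  writeCompleteDocument(arg.data, arg.folder);
  // reply on the same channel so the renderer's Promise resolves
  event.sender.send('writeCompleteDocument', 'done');
});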
Separately, you can clean up the code by switching to async/await. For example:
async createReport(): Promise<void> {
if (this.reportBuilderFG.get('allControls').value) {
const groups: Group[] = await this.db.fnGetCompleteControlList();
await this.word.createCompleteDocument(
groups,
this.reportBuilderFG.get('folder').value + '\\filename.docx'
);
// Unclear if this function is actually async
await this.openSnackBar(
this.reportBuilderFG.get('folder').value +
'\\filename.docx created successfully'
);
} else {
// Do other stuff
}
}
I was able to solve this by adding a 1000 ms timeout after my fnGetCompleteControlList() data pull in report-builder.component.ts. It seems like I have a lot more work to do learning async functions. :-(
report-builder.component.ts
createReport() {
if (this.reportBuilderFG.get('allControls').value) {
this.db.fnGetCompleteControlList()
.then((groups: Group[]) => {
setTimeout(() => {
this.word.createCompleteDocument(groups, this.reportBuilderFG.get('folder').value + '\\filename.docx')
.then(() => {
this.openSnackBar(this.reportBuilderFG.get('folder').value + '\\filename.docx created successfully');
});
}, 1000);
});
} else {
// Do other stuff
}
} // end of createReport()
I have a strange problem with Datatables.
On localhost (port 8080) I'm trying to populate a DataTable through Ajax and JSON. The call (and the HTML page of the DataTable itself) is classic ASP on IIS.
After a lot of retries I managed to get DataTables to work. As usual, the F12 developer tools were always open to help me with messages and errors.
When I closed F12, the table stopped populating at all; it showed only the "Loading…" message.
I've tried clearing the cache and preventing caching, but nothing helped. When I reopen F12, it works like a charm.
Here is the client-side code:
var oTable = $('#table_list').dataTable({
"bJQueryUI": true,
"sDom": 'l<"H"Rf>t<"F"ip>',
"ajax": {
"url": " list_new.asp",
"type": "POST",
"data": function (d) {
d.col1 = ID;
d.col2 = name;
d.col3 = date;
}
},
"columns": [
{ "data": "ID" },
{ "data": "FullName" },
{ "data": "Locations" },
{ "data": "date" },
{ "data": "status" }
]
});
Any thoughts?
Since this is my first post on Stack Overflow, if I forgot something, please let me know.
When I remove console.log() from the JavaScript code, the table populates successfully.
Thanks to thirtydot the problem was solved.
ref : https://stackoverflow.com/a/6713651/405015
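For anyone else hitting this: the linked answer explains that some browsers (notably older IE) only define window.console while the developer tools are open, so any console.log() call throws and stops the script once F12 is closed. Removing the calls fixed it for me; a defensive guard along these lines (just a sketch) would avoid the error as well:
// Sketch: make sure window.console exists even when the developer tools are closed
if (!window.console) {
    window.console = {
        log: function () {},
        warn: function () {},
        error: function () {}
    };
}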
My DataTable is loaded almost entirely from the server side and I would like to keep as much processing as possible on the server to reduce the workload on the client.
By formatting I mean changing color, size, font, font-weight, adding icons and HTML tags... for example, when a row or a few cells need to be highlighted or displayed in a specific way.
I'm thinking about using the render parameter, but I didn't find a way to set it in JSON:
render: $.fn.dataTable.render.number( ',', '.', 0, '$' )
An example would also be the following snippet, but it does not provide enough separation between the client and server:
{
  "doc": "<strong>546546545</strong>",
  "nothing": 0.0,
  "order": "<div class=\"shipped\">98745</div>"
}
While I was able to find resources on how to fetch data from the server, I can't find information on how to pass formatting data as part of the JSON.
Is there a way to pass formatting options in my current DataTable setup while keeping it optimal? What alterations need to be done?
http://jsfiddle.net/ebRXw/1004/
JSON:
{
"columns": [
{
"data": "doc",
"title": "Doc."
},
{
"data": "order",
"title": "Order no."
},
{
"data": "nothing",
"title": "Nothing"
}
],
"data": [
{
"doc": "564251422",
"nothing": 0.0,
"order": "56421"
},
{
"doc": "546546545",
"nothing": 0.0,
"order": "98745"
}
]
}
JS:
this.table = $('#example').DataTable({
data: json.data,
columns: json.columns,
select: true,
responsive: true,
rowReorder: true,
colReorder: true,
scrollY: 680,
deferRender: true,
scroller: true
});
HTML:
<table id="example" cellspacing="0" width="100%" />
So I'm trying to load data received from a web service into a Sencha Touch 2 store.
The data is nested JSON; however, it is built to include multiple data arrays.
I am working with Sencha Touch 2.3.1, roughly equivalent to Ext JS 4.2. I don't have that much experience with Sencha yet, but I'm getting there. I decided to go with MVC, so I'd like the answers to stay as close to that as possible :).
This is the example JSON I am using:
[
{
"DataCollection": {
"DataArrayOne": [
{
"Name": "John Smith",
"Age": "19"
},
{
"Name": "Bart Smith",
"Age": "16"
}
],
"DataArrayTwo": [
{
"Date": "20110601",
"Product": "Apple",
"Descr": "",
"Remark": ""
},
{
"Date": "20110601",
"Product": "Orange",
"Descr": "",
"Remark": ""
},
{
"Date": "20110601",
"Product": "Pear",
"Descr": "",
"Remark": ""
}
],
"DataArrayThree": [
{
"SomeTotalCost": "400,50",
"IntrestPercentage": "3"
}
]
}
}
]
I get this JSON through a single call. I don't want to cause any unnecessary traffic, so I hope to be able to use the data as it is somehow.
I want to be able to use each DataArray on its own.
The data gets sent to the store through its proxy:
Ext.define("MyApp.store.myDataObjects", {
extend: "Ext.data.Store",
config: {
model: "MyApp.model.myDataObject",
proxy: {
reader: {
type: "json",
rootProperty: "DataCollection"
},
type: "ajax",
api: {
read: "https://localhost/Service.svc/json"
},
limitParam: false,
startParam: false,
pageParam: false,
extraParams: {
id: "",
token: "",
filter: ""
},
writer: {
encodeRequest: true,
type: "json"
}
}
}
});
I am a bit stuck with the model here. I tried using mappings which would look like this:
config: {
fields: [ {
name: "IntrestPercentage",
mapping: "Calculation.IntrestPercentage",
type: "string"
}
]}
I tried associations as well but to no avail.
According to the Google Chrome console, it doesn't create any objects containing data; I get only one object with all values null.
My end goal is to be able to show each data array in a separate table: a table for DataArrayOne, a table for DataArrayTwo, and so on. The data itself isn't linked; these are only details that have to be shown on a view.
John Smith isn't related to the apples, as in he didn't buy them. The apples are just there as an item to be shown.
The possible solutions I've seen, but not understood because they are outdated, are:
- Child stores: you have a master store that receives the data, and then you split the data into other stores according to the rootProperty. I have no idea how to do this, however, and I'm not sure it will work at all.
- Associations, in case I was doing them wrong. I don't think they are needed, because the data isn't linked to each other, although it is all part of "DataCollection".
Could someone please post an example of how to deal with this (unusual?) kind of nested JSON, or any other solution that makes it possible to use the three data arrays at will?
Thanks in advance.
The best would be to load the complete data with a separate Ext.Ajax.request and then use store.loadData in the success callback. For example:
var data = Ext.decode(response.responseText);
store1.loadData(data[0].DataCollection.DataArrayOne);
store2.loadData(data[0].DataCollection.DataArrayTwo);
store3.loadData(data[0].DataCollection.DataArrayThree);
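Wrapped in the actual request it would look roughly like this (store1, store2 and store3 stand for the three stores you would define for the three tables, and the URL is the read URL from your proxy config):
Ext.Ajax.request({
    url: 'https://localhost/Service.svc/json',
    success: function (response) {
        var data = Ext.decode(response.responseText);
        // each store only loads the array it is interested in
        store1.loadData(data[0].DataCollection.DataArrayOne);
        store2.loadData(data[0].DataCollection.DataArrayTwo);
        store3.loadData(data[0].DataCollection.DataArrayThree);
    }
});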
I've already made several attempts to get typeahead.js 0.10 to work, and I can only make it work with a local dataset.
When using the prefetch or remote option, even when following the examples page, it doesn't work. Either I am formatting the JSON file with the wrong syntax and/or messing up the Bloodhound options.
Honestly, what does "datumTokenizer: function(d) { return Bloodhound.tokenizers.whitespace(d.value); }" actually do? What does the whitespace part mean?
Well, I'll leave my current example here in the hope that somebody can help me understand how to use this.
If possible, can you please add an example of Bloodhound using the filter option, along with the JSON file being used?
JSON file
[
{
"year": "1961",
"value": "Test Side Story",
"tokens": [
"West",
"Side",
"Story"
]
},
{
"year": "1962",
"value": "Tawrence of Arabia",
"tokens": [
"Lawrence",
"of",
"Arabia"
]
},
{
"year": "1963",
"value": "Tom Jones",
"tokens": [
"Tom",
"Jones"
]
},
{
"year": "2012",
"value": "Argo",
"tokens": [
"Argo"
]
}
]
Typeahead 0.10 script
<script>
var films = new Bloodhound({
datumTokenizer: function(d) { return Bloodhound.tokenizers.whitespace(d.value); },
queryTokenizer: Bloodhound.tokenizers.whitespace,
prefetch: 'http://localhost/dh/js/films.json'
});
films.initialize();
$('#cenas0').typeahead(null, {
displayKey: 'value',
source: films.ttAdapter(),
templates: {
suggestion: Handlebars.compile(
'<p><strong>{{value}}</strong> – {{year}}</p>'
)
}
});
</script>
HTML code (with some script declarations)
<script type="text/javascript" src="http://localhost/dh/js/jquery-1.9.1.js"></script>
<script type="text/javascript" src="http://localhost/dh/js/typeahead.bundle.js"></script>
<input id="cenas0" class="typeahead" placeholder="cenas0"></input>
I am pretty sure that, weirdly enough, it has to do with the HTML divs and class declaration. If I use the following code, which simply adds a div wrapper around your input, then it seems to work fine (using jQuery 1.9.1 and the latest typeahead bundle).
HTML
<div class="films">
<input class="typeahead" name="film" type="text" autocomplete="off" value="">
</div>
JS code (the only part I left out was Handlebars)
var films = new Bloodhound({
datumTokenizer: function(d) { return Bloodhound.tokenizers.whitespace(d.value); },
queryTokenizer: Bloodhound.tokenizers.whitespace,
limit: 10,
prefetch: "js/films.json"
});
films.initialize();
$('.films .typeahead').typeahead(null, {
displayKey: 'value',
source: films.ttAdapter()
});
Here is a jsfiddle using the same data as a local store: http://jsfiddle.net/qLk8c/
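As for your datumTokenizer question: as far as I understand it, Bloodhound.tokenizers.whitespace simply splits a string on whitespace, and the datumTokenizer decides which tokens of each datum the query tokens are matched against. Roughly:
Bloodhound.tokenizers.whitespace('West Side Story');   // -> ["West", "Side", "Story"]

// so this datumTokenizer matches queries against the individual words of d.value
datumTokenizer: function (d) {
    return Bloodhound.tokenizers.whitespace(d.value);
}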