Create SSRS subscriptions from web page - sql-server-2008

I have Reporting Services running on SQL 2008 R2 and have a handful of reports that I created. I'm able to go into Report Server and set up a subscription and have any of the reports emailed to an email address. So all of that is configured correctly.
What I want to do is have a web page in my application that shows a list of available reports. The user can choose one, choose a schedule frequency, enter an email address, and click a 'save' button. When clicking save it should create the subscription in SSRS. I may need to pass in a couple report parameters depending on the report.
How can I do this in C#?

You can dynamically generate a one-time subscription in SSRS for the report. You'll have to use the RS web service, as mentioned by Diego.
Your code would look something like this:
static void generateSubscription()
{
    if (SubscriptionRequests.Count < 1) return;

    NetworkCredential credentials = new NetworkCredential("user", "pass");
    reports.ReportingService2005 rs = new reports.ReportingService2005();
    rs.Credentials = credentials;

    DateTime topDatetime = DateTime.Now;
    topDatetime = topDatetime.AddMinutes(2);

    foreach (SubscriptionRequest x in SubscriptionRequests)
    {
        reports.ExtensionSettings extensionSettings = new reports.ExtensionSettings();
        List<reports.ParameterValue> extParameters = new List<reports.ParameterValue>();
        List<reports.ParameterValue> parameters = new List<reports.ParameterValue>();

        string description = "Email: ";
        string eventType = "TimedSubscription";
        extensionSettings.Extension = "Report Server Email";

        string scheduleXml = "<ScheduleDefinition><StartDateTime>";
        scheduleXml += topDatetime.ToShortDateString() + " " + topDatetime.ToShortTimeString();
        scheduleXml += "</StartDateTime></ScheduleDefinition>";

        parameters.Add(new reports.ParameterValue() { Name = "abc", Value = x.id });

        extParameters.Add(new reports.ParameterValue() { Name = "RenderFormat", Value = x.renderFormat });
        extParameters.Add(new reports.ParameterValue() { Name = "TO", Value = x.email });
        extParameters.Add(new reports.ParameterValue() { Name = "ReplyTo", Value = x.replyTo });
        extParameters.Add(new reports.ParameterValue() { Name = "IncludeReport", Value = "True" });
        extParameters.Add(new reports.ParameterValue() { Name = "Subject", Value = "subject - " + " (" + x.id.ToString() + ")" });
        extParameters.Add(new reports.ParameterValue() { Name = "Comment", Value = x.body });
        extensionSettings.ParameterValues = extParameters.ToArray();

        description += topDatetime.ToShortDateString() + " " + topDatetime.ToShortTimeString();
        description += " (" + x.a + " - " + x.b + " - " + x.c + ")";

        string _reportName = "/report";

        rs.CreateSubscription(_reportName, extensionSettings, description, eventType, scheduleXml, parameters.ToArray());

        topDatetime = topDatetime.AddSeconds(30);
    }
}

The easiest way is to give the user access to Report Manager under the predefined "Browser" role. That is exactly what this role is for: viewing folders and reports and subscribing to reports.
If that's not possible, you can build your own management tool. To do that you need to call the SSRS web service methods over SOAP using the ReportService2005 endpoint.
Examples here
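To drive the "list of available reports" page mentioned in the question, the same proxy can enumerate the report catalog. A minimal sketch, assuming the generated proxy namespace reports from the answer above and credentials that have at least Browser rights:
// Enumerate reports the user can see, e.g. to populate a dropdown on the web page.
// Assumes a web reference to ReportService2005.asmx generating the "reports" proxy namespace.
static List<string> ListAvailableReports()
{
    reports.ReportingService2005 rs = new reports.ReportingService2005();
    rs.Credentials = new NetworkCredential("user", "pass");

    List<string> reportPaths = new List<string>();
    reports.CatalogItem[] items = rs.ListChildren("/", true); // recurse from the root folder

    foreach (reports.CatalogItem item in items)
    {
        if (item.Type == reports.ItemTypeEnum.Report)
            reportPaths.Add(item.Path); // e.g. "/Sales/MonthlyReport"
    }
    return reportPaths;
}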

SSRS CreateSubscription method - render format

I am trying to set up a website that allows users to add themselves as subscribers to reports.
The problem is that when I try to make the subscription render the report as "Excel", it sets it to "XML with report data" instead.
The piece of code I use to try to change the render format is below:
extParameters.Add(new ParameterValue() { Name = "RenderFormat", Value = "Excel" });
extParameters.Add(new ParameterValue() { Name = "TO", Value = strEmail });
extParameters.Add(new ParameterValue() { Name = "IncludeReport", Value = "True" });
extParameters.Add(new ParameterValue() { Name = "Subject", Value = "subject - " + " (" + strReportPath + ")" });
Thank you
Solved it now.
I had to set the RenderFormat value to "EXCEL" or "EXCELOPENXML".
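For reference, the corrected line looks like this. Which of the two values applies depends on the SSRS version: "EXCEL" is the older .xls renderer, "EXCELOPENXML" the .xlsx renderer on newer releases.
// The value must match the rendering extension name configured on the report server,
// not the friendly display name "Excel".
extParameters.Add(new ParameterValue() { Name = "RenderFormat", Value = "EXCEL" }); // or "EXCELOPENXML" on newer SSRS versions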

limited odbc connections in sql server 2008 import wizard

Issue: the Pervasive ODBC driver (called "Pervasive ODBC Engine Interface") is visible in the ODBC administrator (odbcad32.exe). However, the same ODBC driver is not visible in the SQL Server 2008 Import Wizard, although I can see it in the SQL Server 2000 Import Wizard.
I am using 32-bit Windows 7 with SQL Server 2008, SQL Server 2000 and Pervasive SQL v11. Any solution will be very helpful. Many thanks!
I could never figure out how to make the 'Import/Export' wizard work in Sql Server Management Studio. I even tried to modify the 'ProviderResources.xml' file as I saw in another response.
I was attempting to migrate Sage Timberline Office data which uses a proprietary 'Timberline Data' ODBC driver. That driver is missing the 'ORDINAL_POSITION' column when you call the 'GetSchema' function in .NET. So 'Import/Export' in Sql Server Management Studio fails.
I ended up having to write my own app to copy the data over to SQL server. The only downside is it doesn't know about primary keys, indexes, or other constraints. Nonetheless, I get the data in MSSQL so I am happy.
I am sure this code will be useful to others, so here you go.
Program.cs
using System;
using System.Data.Odbc;
using System.Data.SqlClient;
using System.Data;
using System.Collections.Generic;
using System.Diagnostics;

namespace TimberlineOdbcSync
{
    class Program
    {
        static string currentTableName;

        const string sourceOdbcDriver = "{Timberline Data}";
        const string sourceOdbcDsn = "timberline data source";
        const string sourceOdbcUid = "user1";
        const string sourceOdbcPwd = "user1a";

        const string destSqlServer = "SERVER5";
        const string destSqlDatabase = "TSData";
        const string destSqlUsername = "";
        const string destSqlPassword = "";
        const string destSqlOwner = "dbo";

        public static void Main(string[] args)
        {
            DateTime allStartDate = DateTime.Now;
            DateTime allEndDate;
            DateTime tableStartDate = DateTime.Now;
            DateTime tableEndDate;
            TimeSpan diff;
            string errMsg;
            int pCount; //pervasive record count
            int sCount; //sql server record count

            string sourceOdbcConnString =
                "Dsn=" + sourceOdbcDsn + ";" +
                "Driver=" + sourceOdbcDriver + ";" +
                (!string.IsNullOrEmpty(sourceOdbcUid) ? "uid=" + sourceOdbcUid + ";" : "") +
                (!string.IsNullOrEmpty(sourceOdbcUid) ? "pwd=" + sourceOdbcPwd + ";" : "");

            string destSqlConnString =
                "Server=" + destSqlServer + ";" +
                "Database=" + destSqlDatabase + ";" +
                (!string.IsNullOrEmpty(destSqlUsername) && !string.IsNullOrEmpty(destSqlPassword) ?
                    "User Id=" + destSqlUsername + ";" +
                    "Password=" + destSqlPassword + ";"
                    :
                    "Trusted_Connection=true;");

            try{
                using(OdbcConnection pConn = new OdbcConnection(sourceOdbcConnString)){
                    pConn.Open();

                    List<string> tables = new List<string>();

                    //get a list of all tables
                    using(DataTable tableschema = pConn.GetSchema("TABLES"))
                        foreach(DataRow row in tableschema.Rows)
                            tables.Add(row["TABLE_NAME"].ToString());

                    foreach(string tableName in tables){
                        //set the current table name
                        currentTableName = tableName;
                        try{
                            //get the schema info for the table (from pervasive)
                            DataTable dtSchema = pConn.GetSchema("Columns", new string[]{null, null, tableName});

                            //if we could not get the schema
                            if(dtSchema == null || dtSchema.Rows.Count <= 0){
                                pConn.Close();
                                errMsg = "Error: Could not get column information for table " + tableName;
                                Trace.WriteLine(errMsg);
                                WriteErrorEvent(errMsg);
                                return;
                            }

                            //emit the table name
                            Trace.Write("[" + tableName + "]");

                            //get the number of records in this table
                            pCount = TableCount(tableName, pConn);

                            //emit the number of records in this table
                            Trace.Write(" = P:" + pCount);

                            //create a data reader to read the pervasive data
                            string sql = "select * from \"" + tableName + "\"";
                            OdbcCommand cmd = new OdbcCommand(sql, pConn);
                            OdbcDataReader dr = cmd.ExecuteReader();

                            //create a connection to SQL Server
                            using (SqlConnection sConn = new SqlConnection(destSqlConnString)){
                                //open the connection
                                sConn.Open();

                                //if the table already exists
                                if(TableExists(tableName, sConn)){
                                    //get the record count for this table
                                    sCount = TableCount(tableName, sConn);
                                } else {
                                    //set the record count to zero
                                    sCount = 0;
                                }

                                //output the record count
                                Trace.Write(", S: " + sCount);

                                //if the record counts match
                                if( pCount == sCount ){
                                    //output an indicator that we are skipping this table
                                    Trace.WriteLine(" -- Skipping");
                                    //skip this table and go to the next
                                    continue;
                                }

                                //output a blank line
                                Trace.WriteLine("");

                                //create the table in SQL Server using the schema info from Pervasive
                                CreateTableInDatabase(dtSchema, destSqlOwner, tableName, sConn);

                                // Copies all rows to the database from the data reader.
                                using (SqlBulkCopy bc = new SqlBulkCopy(sConn))
                                {
                                    // Destination table with owner -
                                    // this example does not check the owner names! It uses dbo exclusively.
                                    bc.DestinationTableName = "[" + destSqlOwner + "].[" + tableName + "]";
                                    bc.BatchSize = 3000;
                                    bc.BulkCopyTimeout = 12000; //seconds

                                    // User notification with the SqlRowsCopied event
                                    bc.NotifyAfter = 1000;
                                    bc.SqlRowsCopied += new SqlRowsCopiedEventHandler(OnSqlRowsCopied);

                                    //output the date and time so we know when we started
                                    tableStartDate = DateTime.Now;
                                    Trace.WriteLine("Copying " + pCount + " records to " + destSqlServer + " - " + tableStartDate.ToString("g"));

                                    // Starts the bulk copy.
                                    bc.WriteToServer(dr);

                                    tableEndDate = DateTime.Now;
                                    diff = tableEndDate - tableStartDate;
                                    Trace.WriteLine(String.Format(
                                        "Completed {4} at {0}\r\nDuration: {1}:{2}:{3}",
                                        tableEndDate.ToString("g"),
                                        diff.Hours.ToString(), diff.Minutes.ToString(), diff.Seconds.ToString(),
                                        tableName));

                                    // Closes the SqlBulkCopy instance
                                    bc.Close();
                                }

                                dr.Close();
                            }
                        }catch(Exception ex){
                            errMsg = "Error: " + ex.Message + Environment.NewLine +
                                     "Stack: " + ex.StackTrace + Environment.NewLine;
                            Trace.WriteLine(errMsg);
                            WriteErrorEvent(errMsg);

                            if( !ReadBool("Do you want to continue? [y/n]") ){
                                break;
                            }
                        }//end try
                    }//end for
                }//end using

                allEndDate = DateTime.Now;
                diff = allEndDate - allStartDate;
                Trace.WriteLine(
                    "Bulk copy operation complete" + Environment.NewLine +
                    "Started: " + allStartDate.ToString("g") + Environment.NewLine +
                    "Current: " + allEndDate.ToString("g") + Environment.NewLine +
                    String.Format("Duration: {0}:{1}:{2}",
                        diff.Hours.ToString(),
                        diff.Minutes.ToString(),
                        diff.Seconds.ToString()));
            }catch(Exception ex){
                errMsg =
                    "Error: " + ex.Message + Environment.NewLine +
                    "Stack: " + ex.StackTrace;
                Trace.WriteLine(errMsg);
                WriteErrorEvent(errMsg);
            }//end try

            Console.Write("Press any key to continue . . . ");
            Console.ReadKey(true);
        }

        static bool TableExists(string tableName, SqlConnection sqlConn){
            int retVal = 0;
            try{
                using(SqlCommand command = sqlConn.CreateCommand()){
                    command.CommandText = "IF OBJECT_ID('dbo." + tableName + "', 'U') IS NOT NULL SELECT 1 as res ELSE SELECT 0 as res";
                    retVal = Convert.ToInt32(command.ExecuteScalar());
                }
            }catch(Exception ex){
                string errMsg =
                    "Error: Could not determine if table " + tableName + " exists." + Environment.NewLine +
                    "Reason: " + ex.Message + Environment.NewLine +
                    "Stack: " + ex.StackTrace;
                Trace.WriteLine(errMsg);
                WriteErrorEvent(errMsg);
                retVal = 0;
            }//end try
            return (retVal == 1);
        }

        static int TableCount(string tableName, IDbConnection anyConn){
            int retVal = 0;
            try{
                using(IDbCommand command = anyConn.CreateCommand()){
                    command.CommandText = "SELECT count(*) FROM \"" + tableName + "\"";
                    retVal = Convert.ToInt32(command.ExecuteScalar());
                }
            }catch(Exception ex){
                string errMsg =
                    "Error: Could not get table count for " + tableName + "." + Environment.NewLine +
                    "Reason: " + ex.Message + Environment.NewLine +
                    "Stack: " + ex.StackTrace;
                Trace.WriteLine(errMsg);
                WriteErrorEvent(errMsg);
                retVal = 0;
            }//end try
            return (retVal);
        }

        static bool ReadBool(String question) {
            while (true) {
                Console.WriteLine(question);
                String r = (Console.ReadLine() ?? "").ToLower();
                if (r == "y" || r == "yes" || r == "1")
                    return true;
                if (r == "n" || r == "no" || r == "0")
                    return false;
                Console.WriteLine("Please Select a Valid Option!!");
            }//end while
        }

        static void OnSqlRowsCopied(object sender, SqlRowsCopiedEventArgs e) {
            Trace.WriteLine(String.Format("-- [{1}] Copied {0} rows.", e.RowsCopied, currentTableName));
        }

        //convert a possibly-DBNull value to a string
        private static string s(object o){
            return (Convert.IsDBNull(o) ? "" : Convert.ToString(o));
        }

        //map one Pervasive column-schema row to a SQL Server column definition
        private static string _drToColSql(DataRow dr){
            string colName = s(dr["COLUMN_NAME"]);
            string ret = "[" + colName + "] ";
            string typeName = ((string)s(dr["TYPE_NAME"])).ToLower();
            switch(typeName){
                case "char":
                    ret += "CHAR(" + s(dr["LENGTH"]) + ")";
                    break;
                case "byte":
                    ret += "CHAR(" + s(dr["PRECISION"]) + ")";
                    break;
                case "text":
                    ret += "VARCHAR(" + s(dr["PRECISION"]) + ")";
                    break;
                case "date":
                    ret += "DATE";
                    break;
                case "time":
                    ret += "TIME(7)";
                    break;
                case "double":
                    ret += "DECIMAL(16,2)"; // + c(dr["PRECISION"]) + "," + c(dr["LENGTH"]) + ")";
                    break;
                case "usmallint":
                case "smallint":
                    ret += "SMALLINT";
                    break;
                case "utinyint":
                case "tinyint":
                    ret += "TINYINT";
                    break;
                case "identity":
                case "integer":
                    ret += "BIGINT";
                    break;
                case "smallidentity":
                case "short":
                    ret += "INT";
                    break;
                case "longvarchar":
                case "memo":
                    ret += "TEXT";
                    break;
                case "checkbox":
                    ret += "BIT";
                    break;
                case "real":
                    ret += "REAL";
                    break;
                default:
                    //this was an unexpected column, figure out what happened
                    Trace.WriteLine("ERROR - Column '" + colName + "' Details: ");
                    Trace.WriteLine("\tCOLUMN_NAME: " + s(dr["COLUMN_NAME"]));
                    Trace.WriteLine("\tTYPE_NAME: " + s(dr["TYPE_NAME"]));
                    Trace.WriteLine("\tDATA_TYPE: " + s(dr["DATA_TYPE"]));
                    Trace.WriteLine("\tLENGTH: " + s(dr["LENGTH"]));
                    Trace.WriteLine("\tPRECISION: " + s(dr["PRECISION"]));
                    Trace.WriteLine("\tSCALE: " + s(dr["SCALE"]));
                    Trace.WriteLine("\tNULLABLE: " + s(dr["NULLABLE"]));
                    throw new Exception("Unexpected data type: " + typeName);
            }

            if(s(dr["NULLABLE"]) == "1"){
                ret += " NULL";
            }
            return ret;
        }

        private static bool CreateTableInDatabase(DataTable dtSchemaTable, string tableOwner, string tableName, SqlConnection sqlConn) {
            // Generates the create table command.
            string ctStr = "CREATE TABLE [" + tableOwner + "].[" + tableName + "](\r\n";
            for (int i = 0; i < dtSchemaTable.Rows.Count; i++)
            {
                ctStr += _drToColSql(dtSchemaTable.Rows[i]);
                if (i < dtSchemaTable.Rows.Count - 1) //no comma after the last column
                    ctStr += ",";
                ctStr += "\r\n";
            }
            ctStr += ")";

            // Emit SQL statement
            Trace.WriteLine("-".PadLeft(30, '-'));
            Trace.WriteLine(ctStr + Environment.NewLine);

            // Runs the SQL command to make the destination table.
            using(SqlCommand command = sqlConn.CreateCommand()){
                command.CommandText = "IF OBJECT_ID('dbo." + tableName + "', 'U') IS NOT NULL DROP TABLE dbo." + tableName;
                command.ExecuteNonQuery();
                command.CommandText = ctStr;
                command.ExecuteNonQuery();
            }
            return true;
        }

        private static bool WriteErrorEvent(string errMsg){
            const string sSource = "PervasiveOdbcSync";
            const string sLog = "Application";
            try{
                if (!EventLog.SourceExists(sSource))
                    EventLog.CreateEventSource(sSource, sLog);
                EventLog.WriteEntry(sSource, errMsg, EventLogEntryType.Error, 128);
                return true;
            }catch(Exception ex){
                Trace.WriteLine("Unable to write error to event log. Reason: " + ex.Message);
                return false;
            }
        }
    }
}
You'll want to add a System.Diagnostics.ConsoleTraceListener to your app.config file so you can see everything that is being output. If you also add a System.Diagnostics.TextWriterTraceListener, the app will write everything to a log file as well.
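A minimal app.config sketch along those lines (the listener names and the log file path are just placeholders):
<configuration>
  <system.diagnostics>
    <trace autoflush="true">
      <listeners>
        <!-- Echo Trace output to the console window -->
        <add name="console" type="System.Diagnostics.ConsoleTraceListener" />
        <!-- Also write Trace output to a log file (path is a placeholder) -->
        <add name="logFile" type="System.Diagnostics.TextWriterTraceListener"
             initializeData="TimberlineOdbcSync.log" />
      </listeners>
    </trace>
  </system.diagnostics>
</configuration>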
On my PSQL v11 box, which also has SQL Server 2008 R2 installed, I don't see a "Pervasive ODBC Engine Interface" listed in the "Data Source" dialog of the SQL Server Import and Export Wizard. I do see the "Pervasive PSQL OLEDB Provider" and "Pervasive Provider, release v4.0" (and 3.5 and 3.2). The Pervasive Provider is an ADO.NET provider. I also see a ".Net Framework Data Provider for ODBC", and if I enter the name of a Pervasive DSN (like DEMODATA), it works.

Bulk Insert into SQL Server 2008

for (int i = 0; i < myClass.Length; i++)
{
    string upSql = "UPDATE CumulativeTable SET EngPosFT = @EngPosFT, EngFTAv = @EngFTAv WHERE RegNumber = @RegNumber AND Session = @Session AND Form = @Form AND Class = @Class";
    SqlCommand cmdB = new SqlCommand(upSql, connection);
    cmdB.CommandTimeout = 980000;
    cmdB.Parameters.AddWithValue("@EngPosFT", Convert.ToInt32(Pos.GetValue(i)));
    cmdB.Parameters.AddWithValue("@RegNumber", myClass.GetValue(i));
    cmdB.Parameters.AddWithValue("@EngFTAv", Math.Round((engtot / arrayCount), 2));
    cmdB.Parameters.AddWithValue("@Session", drpSess.SelectedValue);
    cmdB.Parameters.AddWithValue("@Form", drpForm.SelectedValue);
    cmdB.Parameters.AddWithValue("@Class", drpClass.SelectedValue);
    int idd = Convert.ToInt32(cmdB.ExecuteScalar());
}
Assuming myClass.Length is 60, this executes 60 UPDATE statements. How can I limit it to one UPDATE statement? A code example based on the above code would be appreciated. Thanks.
Tried using this:
StringBuilder command = new StringBuilder();
SqlCommand cmdB = null;
for (int i = 0; i < myClass.Length; i++)
{
    command.Append("UPDATE CumulativeTable SET EngPosFT = " + Convert.ToInt32(Pos.GetValue(i)) + ", EngFTAv = " + Math.Round((engtot / arrayCount), 2) +
        " WHERE RegNumber = " + myClass.GetValue(i) + " AND Session = " + drpSess.SelectedValue + " AND Form = " + drpForm.SelectedValue + " AND Class = " + drpClass.SelectedValue + ";");
    //or command.AppendFormat("UPDATE CumulativeTable SET EngPosFT = {0}, EngFTAv = {1} WHERE RegNumber = {2} AND Session = {3} AND Form = {4} AND Class = {5};", Convert.ToInt32(Pos.GetValue(i)), Math.Round((engtot / arrayCount), 2), myClass.GetValue(i), drpSess.SelectedValue, drpForm.SelectedValue, drpClass.SelectedValue);
} //a "maximum length is 128" error is encountered with this approach
Look at the BULK INSERT T-SQL command. I don't have a lot of personal experience with that command, but I do see an immediate opportunity to improve this code using the same SQL: create the command and parameters outside of the loop, and only make the necessary changes inside the loop:
string upSql = "UPDATE CumulativeTable SET EngPosFT = @EngPosFT, EngFTAv = @EngFTAv WHERE RegNumber = @RegNumber AND Session = @Session AND Form = @Form AND Class = @Class";
SqlCommand cmdB = new SqlCommand(upSql, connection);
cmdB.CommandTimeout = 980000;

//I had to guess at the sql types you used here.
//Adjust this to match your actual column data types
cmdB.Parameters.Add("@EngPosFT", SqlDbType.Int);
cmdB.Parameters.Add("@RegNumber", SqlDbType.Int);

//It's really better to use explicit types here, too.
//I'll just update the first parameter as an example of how it looks:
cmdB.Parameters.Add("@EngFTAv", SqlDbType.Decimal).Value = Math.Round((engtot / arrayCount), 2);
cmdB.Parameters.AddWithValue("@Session", drpSess.SelectedValue);
cmdB.Parameters.AddWithValue("@Form", drpForm.SelectedValue);
cmdB.Parameters.AddWithValue("@Class", drpClass.SelectedValue);

for (int i = 0; i < myClass.Length; i++)
{
    cmdB.Parameters[0].Value = Convert.ToInt32(Pos.GetValue(i));
    cmdB.Parameters[1].Value = myClass.GetValue(i);
    int idd = Convert.ToInt32(cmdB.ExecuteScalar());
}
It would be better in this case to create a stored procedure that accepts a Table Valued Parameter. On the .NET side of things, you create a DataTable object containing a row for each set of values you want to use.
On the SQL Server side of things, you can treat the parameter as another table in a query. So inside the stored proc, you'd have:
UPDATE a
SET
    EngPosFT = b.EngPosFT,
    EngFTAv = b.EngFTAv
FROM
    CumulativeTable a
    INNER JOIN @MyParm b
        ON  a.RegNumber = b.RegNumber
        AND a.Session = b.Session
        AND a.Form = b.Form
        AND a.Class = b.Class
Where @MyParm is your table-valued parameter.
This will then be processed as a single round-trip to the server.
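On the .NET side, a minimal sketch of building and passing the table-valued parameter might look like the following. The table type name dbo.CumulativeRowType, the stored procedure name dbo.UpdateCumulative, and the column types are all assumptions for illustration; the type has to be created on the server first with CREATE TYPE ... AS TABLE, and the proc must declare the parameter as @MyParm dbo.CumulativeRowType READONLY.
// Build a DataTable whose columns match the (hypothetical) table type dbo.CumulativeRowType.
DataTable tvp = new DataTable();
tvp.Columns.Add("RegNumber", typeof(string));
tvp.Columns.Add("Session", typeof(string));
tvp.Columns.Add("Form", typeof(string));
tvp.Columns.Add("Class", typeof(string));
tvp.Columns.Add("EngPosFT", typeof(int));
tvp.Columns.Add("EngFTAv", typeof(decimal));

// One row per record instead of one UPDATE per record.
for (int i = 0; i < myClass.Length; i++)
{
    tvp.Rows.Add(myClass.GetValue(i), drpSess.SelectedValue, drpForm.SelectedValue,
                 drpClass.SelectedValue, Convert.ToInt32(Pos.GetValue(i)),
                 Math.Round((engtot / arrayCount), 2));
}

// Single round trip: pass the whole DataTable to the stored procedure.
using (SqlCommand cmd = new SqlCommand("dbo.UpdateCumulative", connection))
{
    cmd.CommandType = CommandType.StoredProcedure;
    SqlParameter p = cmd.Parameters.Add("@MyParm", SqlDbType.Structured);
    p.TypeName = "dbo.CumulativeRowType"; // hypothetical table type name
    p.Value = tvp;
    cmd.ExecuteNonQuery();
}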
In such scenarios it is always best to write a Stored Procedure and call that stored proc in the for loop, passing the necessary arguments at each call.
using System;
using System.Data;
using System.Data.SqlClient;

namespace DataTableExample
{
    class Program
    {
        static void Main(string[] args)
        {
            DataTable prodSalesData = new DataTable("ProductSalesData");

            // Create Column 1: SaleDate
            DataColumn dateColumn = new DataColumn();
            dateColumn.DataType = Type.GetType("System.DateTime");
            dateColumn.ColumnName = "SaleDate";

            // Create Column 2: ProductName
            DataColumn productNameColumn = new DataColumn();
            productNameColumn.ColumnName = "ProductName";

            // Create Column 3: TotalSales
            DataColumn totalSalesColumn = new DataColumn();
            totalSalesColumn.DataType = Type.GetType("System.Int32");
            totalSalesColumn.ColumnName = "TotalSales";

            // Add the columns to the ProductSalesData DataTable
            prodSalesData.Columns.Add(dateColumn);
            prodSalesData.Columns.Add(productNameColumn);
            prodSalesData.Columns.Add(totalSalesColumn);

            // Let's populate the datatable with our stats.
            // You can add as many rows as you want here!

            // Create a new row
            DataRow dailyProductSalesRow = prodSalesData.NewRow();
            dailyProductSalesRow["SaleDate"] = DateTime.Now.Date;
            dailyProductSalesRow["ProductName"] = "Nike";
            dailyProductSalesRow["TotalSales"] = 10;

            // Add the row to the ProductSalesData DataTable
            prodSalesData.Rows.Add(dailyProductSalesRow);

            // Copy the DataTable to SQL Server using SqlBulkCopy
            using (SqlConnection dbConnection = new SqlConnection("Data Source=ProductHost;Initial Catalog=dbProduct;Integrated Security=SSPI;Connection Timeout=60;Min Pool Size=2;Max Pool Size=20;"))
            {
                dbConnection.Open();
                using (SqlBulkCopy s = new SqlBulkCopy(dbConnection))
                {
                    s.DestinationTableName = prodSalesData.TableName;

                    foreach (var column in prodSalesData.Columns)
                        s.ColumnMappings.Add(column.ToString(), column.ToString());

                    s.WriteToServer(prodSalesData);
                }
            }
        }
    }
}

MS Exchange Web-Services: How to get items with 'Flag' set?

Does anyone know how to get all the items that are flagged inside the Inbox using Microsoft Exchange Web-Services?
Apparently they are neither inside Tasks folder (even though they appear there in Outlook), nor do they have IsReminderSet set to true.
The following attempts return either only appointments or only true tasks, but not flagged messages:
var msgsView = new ItemView(100);
var msgsFilter = new SearchFilter.IsEqualTo(ItemSchema.IsReminderSet, true);
var flagged = exSvc.FindItems(WellKnownFolderName.Inbox, msgsFilter, msgsView);
or
var taskView = new ItemView(100);
var tasks = exSvc.FindItems(WellKnownFolderName.Tasks, taskView);
Neither works.
I know this question is old, but I just found this sample code which looks like it might do the trick (I haven't tested it myself yet).
Source: http://independentsoft.de/exchangewebservices/tutorial/findmessageswithflag.html
IsEqualTo restriction1 = new IsEqualTo(MessagePropertyPath.FlagStatus, "1"); //FlagStatus.Complete
IsEqualTo restriction2 = new IsEqualTo(MessagePropertyPath.FlagStatus, "2"); //FlagStatus.Marked
Or restriction3 = new Or(restriction1, restriction2);

FindItemResponse response = service.FindItem(StandardFolder.Inbox,
    MessagePropertyPath.AllPropertyPaths, restriction3);

for (int i = 0; i < response.Items.Count; i++)
{
    if (response.Items[i] is Message)
    {
        Message message = (Message)response.Items[i];
        Console.WriteLine("Subject = " + message.Subject);
        Console.WriteLine("FlagStatus = " + message.FlagStatus);
        Console.WriteLine("FlagIcon = " + message.FlagIcon);
        Console.WriteLine("FlagCompleteTime = " + message.FlagCompleteTime);
        Console.WriteLine("FlagRequest = " + message.FlagRequest);
        Console.WriteLine("-----------------------------------------------");
    }
}
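The snippet above uses the Independentsoft EWS client. If you are on Microsoft's EWS Managed API (which the question's code appears to use, given exSvc.FindItems), a roughly equivalent, untested sketch filters on the MAPI flag-status extended property (property tag 0x1090, where 1 = complete and 2 = flagged):
// Sketch: find flagged messages in the Inbox with the EWS Managed API (exSvc is the service object from the question).
ExtendedPropertyDefinition flagStatus =
    new ExtendedPropertyDefinition(0x1090, MapiPropertyType.Integer);

ItemView view = new ItemView(100);
view.PropertySet = new PropertySet(BasePropertySet.FirstClassProperties, flagStatus);

SearchFilter flaggedFilter = new SearchFilter.SearchFilterCollection(
    LogicalOperator.Or,
    new SearchFilter.IsEqualTo(flagStatus, 1),   // complete
    new SearchFilter.IsEqualTo(flagStatus, 2));  // flagged for follow-up

FindItemsResults<Item> results = exSvc.FindItems(WellKnownFolderName.Inbox, flaggedFilter, view);

foreach (Item item in results)
{
    Console.WriteLine(item.Subject);
}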

Update jqGrid table with the results of Fusion Tables query in a Google Maps v3 page

I am looking to understand how to update a jqGrid table from Fusion Tables (FT).
At the moment I can search or scroll on a Google Map; an event listener then compiles an FT query from the spatial bounds of the viewport/map to get a new set of results.
I want to use the new FT query string (or the Google code that retrieves the data - query.send(getData);) to update the jqGrid table with the new values.
Before I started using jqGrid, I tried (and succeeded) with the Google Visualization API, and some of that code is below. Could anyone suggest how to move from table.draw to loading/reloading a jqGrid table? Thanks a lot in advance.
function tilesLoaded() {
    google.maps.event.clearListeners(map, 'tilesloaded');
    google.maps.event.addListener(map, 'zoom_changed', getSpatialQuery);
    google.maps.event.addListener(map, 'dragend', getSpatialQuery);
    getSpatialQuery();
}

function getSpatialQuery() {
    sw = map.getBounds().getSouthWest();
    ne = map.getBounds().getNorthEast();
    var spatialQuery = "ST_INTERSECTS(latitude, RECTANGLE(LATLNG(" + sw.lat() + "," + sw.lng() + "), LATLNG(" + ne.lat() + "," + ne.lng() + ")))";
    changeDataTable(spatialQuery);
}

function changeDataTable(spatialQuery) {
    var whereClause = "";
    if (spatialQuery) {
        whereClause = " WHERE " + spatialQuery;
    }
    var queryText = encodeURIComponent("SELECT 'latitude', 'longitude', 'name' FROM xxxxxxxx" + whereClause + " LIMIT 50");
    var query = new google.visualization.Query('http://www.google.com/fusiontables/gvizdata?tq=' + queryText);
    query.send(getData);
}

function getData(response) {
    var table = new google.visualization.Table(document.getElementById('visualization'));
    table.draw(response.getDataTable(), {showRowNumber: true});
}
Oh, and I used Oleg's code from "jqGrid returns blank cells" as a basis, just to see if I could get a simple multi-select table to pull data from my FT - that worked fine with the simple mod of
url: 'http://www.google.com/fusiontables/api/query?sql=' +
In case this helps someone, I've taken some of the code I came up with and pasted it below:
// You can get the map bounds and then pass them via a function (below is hacked together from several functions)
sw = map.getBounds().getSouthWest();
ne = map.getBounds().getNorthEast();
var whereClause = " WHERE ST_INTERSECTS(latitude, RECTANGLE(LATLNG(" + sw.lat() + "," + sw.lng() + "), LATLNG(" + ne.lat() + "," + ne.lng() + ")))";

//construct the URL to get the JSON
var queryUrlHead = 'http://www.google.com/fusiontables/api/query?sql=';
var queryUrlTail = '&jsonCallback=?';
var queryOrderBy = ' ORDER BY \'name\' ASC';
var queryMain = "SELECT * FROM " + tableid + whereClause + queryOrderBy + " LIMIT 100";
var queryurl = encodeURI(queryUrlHead + queryMain + queryUrlTail);

//use the constructed URL to update the jqGrid table - this is the part that I didn't know in my above question
$("#gridTable").setGridParam({url: queryurl});
$("#gridTable").jqGrid('setGridParam', {datatype: 'jsonp'}).trigger('reloadGrid');