Web API returning null JSON objects C#

I have a web API returning 117k JSON objects.
Edit: The API calls MySQL to fetch 117k rows of data, puts them into an IEnumerable and sends them back as JSON.
All I see is
[{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},{},... the entire page...
I wanted to ask what is happening here and how you would handle a large JSON transfer. I would prefer to get it all in one go to avoid querying back and forth (delay time).
The function call is this:
public IEnumerable<Score> Get(int id)
{
    string mConnectionString = System.Configuration.ConfigurationManager.AppSettings["mysqlConnectionString"];
    MySqlConnection mConn;
    MySqlDataReader mReader;
    List<Score> returnedRows = new List<Score>();

    if (String.IsNullOrEmpty(mConnectionString))
    {
        return returnedRows;
    }

    try
    {
        // prepare the dump query
        MySqlCommand dumpCmd;
        string query = "SELECT * FROM score where id = " + id + ";";
        using (mConn = new MySqlConnection(mConnectionString))
        {
            using (dumpCmd = new MySqlCommand())
            {
                dumpCmd.Connection = mConn;
                dumpCmd.CommandText = query;
                mConn.Open();
                mReader = dumpCmd.ExecuteReader();
                if (mReader.HasRows)
                {
                    while (mReader.Read())
                    {
                        string[] rowCols = new string[mReader.FieldCount]; // there are 20+ columns, at least the primary keys are not null
                        for (int i = 0; i < rowCols.Length; ++i)
                        {
                            rowCols[i] = mReader.GetString(i);
                        }
                        returnedRows.Add(new Score(rowCols));
                    }
                    mConn.Close();
                    return returnedRows;
                }
                else
                {
                    // should return a 404 cause nothing found
                    mConn.Close();
                }
            }
        }
    }
    catch (Exception e)
    {
        return returnedRows;
    }
    return returnedRows;
}

Either mReader.GetString(i) is returning null or you have no data in the columns.
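A quick way to narrow that down is to guard the read with IsDBNull (so a NULL column can't throw or silently vanish) and to make sure Score exposes its values through public members, since the default Web API JSON serializer only writes public properties and fields, and a class whose data lives in private fields or only passes through the constructor serializes as {}. The Score class isn't shown, so the shape below is just an assumed sketch:

// Null-safe version of the read loop
for (int i = 0; i < rowCols.Length; ++i)
{
    rowCols[i] = mReader.IsDBNull(i) ? string.Empty : mReader.GetString(i);
}

// Hypothetical Score shape - the property names are made up; the point is the
// public get/set properties, which are what the serializer actually emits.
public class Score
{
    public string PlayerId { get; set; }
    public string Value { get; set; }

    public Score(string[] cols)
    {
        PlayerId = cols[0];
        Value = cols[1];
    }
}

If the public properties are there and the array still comes back as [{},{},...], logging rowCols for the first few rows will show whether the data is really empty at the reader level.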

Related

parsing issue with JSON data from SQL 2017 to MongoDB

I am working on a C# utility to migrate data from SQL Server 2017 to MongoDB. Below are the steps I am following:
1) Getting data from SQL server in JSON format (FOR JSON AUTO)
2) Parsing into BSON document
3) Then trying to insert into MongoDB
But I am getting an error while reading the JSON data from SQL.
My JSON data is a combination of root attributes as well as nested objects, so it is dynamic data that I want to push as-is to MongoDB.
string jsonData = string.Empty;
StringBuilder jsonResult;
foreach (var userId in userIdList)
{
    using (SqlConnection con = new SqlConnection("Data Source=;Initial Catalog=;Integrated Security=True"))
    {
        using (SqlCommand cmd = new SqlCommand("Usp_GetUserdata", con))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@userId", SqlDbType.Int).Value = userId;
            con.Open();
            var reader = cmd.ExecuteReader();
            jsonResult = new StringBuilder();
            //cmd.ExecuteNonQuery();
            if (!reader.HasRows)
            {
                jsonResult.Append("[]");
            }
            else
            {
                while (reader.Read())
                {
                    jsonResult.Append(reader.GetValue(0));
                    jsonData = reader.GetValue(0).ToString();
                    File.WriteAllText(@"c:\a.txt", jsonResult.ToString());
                    File.WriteAllText(@"c:\a.txt", jsonData);
                    // Trim returns a new string, so the result has to be assigned back
                    jsonData = jsonData.TrimEnd(']').TrimStart('[');
                    //Create client connection to our MongoDB database
                    var client = new MongoClient(MongoDBConnectionString);
                    //Create a session object that is used when leveraging transactions
                    var session = client.StartSession();
                    //Create the collection object that represents the "EmpData" collection
                    var employeeCollection = session.Client.GetDatabase("mongodev").GetCollection<BsonDocument>("EmpData");
                    //Begin transaction
                    session.StartTransaction();
                    try
                    {
                        dynamic resultJson = JsonConvert.DeserializeObject(jsonData);
                        var document = BsonSerializer.Deserialize<BsonDocument>(resultJson);
                        //MongoDB.Bson.BsonDocument document
                        //    = MongoDB.Bson.Serialization.BsonSerializer.Deserialize<BsonDocument>(jsonResult);
                        employeeCollection.InsertOneAsync(document);
                        //BsonArray pipeline =
                        //    MongoDB.Bson.Serialization.BsonSerializer.Deserialize<BsonArray>(jsonData);
                        //var documents = pipeline.Select(val => val.AsBsonDocument);
                        //employeeCollection.InsertManyAsync(documents);
                        session.CommitTransaction();
                    }
                    catch (Exception e)
                    {
                        Console.WriteLine(e);
                        session.AbortTransaction();
                        throw;
                    }
                }
            }
        }
    }
}
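One thing worth checking for step 1: when FOR JSON output is read back through ExecuteReader, SQL Server splits long results across several rows (roughly 2 KB per row), so each GetValue(0) is only a fragment of the document and has to be concatenated before anything tries to parse it; deserializing fragment by fragment inside the while loop will fail on anything non-trivial. A minimal sketch of that pattern, reusing the procedure, database and collection names from the question (the connection string and the single-object assumption are placeholders):

// requires System.Data, System.Data.SqlClient, System.Text, MongoDB.Bson, MongoDB.Driver
var json = new StringBuilder();
using (var con = new SqlConnection(sqlConnectionString))   // placeholder connection string
using (var cmd = new SqlCommand("Usp_GetUserdata", con) { CommandType = CommandType.StoredProcedure })
{
    cmd.Parameters.Add("@userId", SqlDbType.Int).Value = userId;
    con.Open();
    using (var reader = cmd.ExecuteReader())
    {
        // FOR JSON streams the document as multiple rows; stitch them back together first
        while (reader.Read())
        {
            json.Append(reader.GetString(0));
        }
    }
}

if (json.Length > 0)
{
    // FOR JSON AUTO wraps the result in [...]; assuming the proc returns one object
    // per user (as the Trim calls in the question suggest), strip the brackets and
    // parse the complete document before inserting it
    var document = BsonDocument.Parse(json.ToString().TrimStart('[').TrimEnd(']'));
    var collection = new MongoClient(MongoDBConnectionString)
        .GetDatabase("mongodev")
        .GetCollection<BsonDocument>("EmpData");
    collection.InsertOne(document);
}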

How to optimize the hibernate lucene fetching records time

I am using Hibernate Search (Lucene) with a boolean query to fetch records from the database. At present my table has 250 records, which should be nothing for Hibernate Search to fetch, but it is taking 1.30 to 2.30 minutes.
My flow is: from my controller I get some search keywords and pass them to the service and then to the DAO. In the service layer I start the transaction. Finally the DAO returns a List of records. After getting the records I map that list onto a List of XYZTableVo objects.
I don't know where exactly the time is going, whether in Lucene or in building the VO objects.
Following is my snippet
Session session = getSession();
FullTextSession fullTextSession = Search.getFullTextSession(session);
SearchFactory searchFactory = fullTextSession.getSearchFactory();
fullTextSession
.createIndexer(XYZTable.class)
.typesToIndexInParallel(20)
.batchSizeToLoadObjects(25)
.cacheMode(CacheMode.NORMAL)
.threadsToLoadObjects(5)
.startAndWait();
searchFactory.optimize(CvlizerJobseeker.class);
MultiFieldQueryParser parser = new MultiFieldQueryParser(new String[] { "Skills.skill" },
new StandardAnalyzer());
parser.setDefaultOperator(Operator.OR);
org.apache.lucene.search.Query luceneQuery = null;
QueryBuilder qb = fullTextSession.getSearchFactory().buildQueryBuilder().forEntity(XYZTable.class)
.get();
BooleanQuery boolQuery = new BooleanQuery();
if (locationList != null) {
if (locationList.get(2) != null) {
boolQuery.add(qb.keyword().onField("XYZTablePersonalInfo.XYZTableAddress.postalCode")
.matching(locationList.get(2)).createQuery(), BooleanClause.Occur.MUST);
}
else if (locationList.get(1) != null) {
boolQuery.add(qb.keyword().onField("XYZTablePersonalInfo.XYZTableAddress.city")
.matching(locationList.get(1)).createQuery(), BooleanClause.Occur.MUST);
}
}
if (StringUtils.isEmpty(query) != true && StringUtils.isBlank(query) != true) {
try {
luceneQuery = parser.parse(query.toUpperCase());
} catch (ParseException e) {
luceneQuery = parser.parse(parser.escape(query.toUpperCase()));
}
boolQuery.add(luceneQuery, BooleanClause.Occur.MUST);
}
boolQuery.add(qb.keyword().onField("isValid").matching(false).createQuery(), BooleanClause.Occur.MUST);
FullTextQuery createFullTextQuery = fullTextSession.createFullTextQuery(boolQuery, XYZTable.class);
createFullTextQuery.list();

TableContinuationToken not getting Deserialised from JSON correctly

I am having trouble trying to retrieve large datasets from Azure Table Storage. After several attempts at getting it all in one go I have given up and am now using the TableContinuationToken, which is not getting deserialized correctly. The object is getting created, but all the Next... values (i.e. NextRowKey, NextPartitionKey, etc.) are NULL, even though in the stringresponse that gets created you can see the values it should be populated with...
The class I am passing contains a list of objects and the token
public class FlorDataset
{
public List<FlorData> Flors { get; set; }
public TableContinuationToken Token { get; set; }
}
The controller code is not exactly rocket science either....
[HttpGet, Route("api/list/{token}")]
public IHttpActionResult FindAll(string token)
{
try
{
TableContinuationToken actualToken = token == "None"
? null
: new TableContinuationToken()
{
NextPartitionKey = NextPartition,
NextRowKey = token,
NextTableName = NextTableName
};
var x = Run(actualToken);
Flors = x.Flors;
actualToken = x.Token;
NextTableName = actualToken.NextTableName;
NextPartition = actualToken.NextPartitionKey;
return Flors != null
? (IHttpActionResult)new IsoncOkResult<FlorDataset>(x, this)
: NotFound();
}
catch (Exception ex)
{
Trace.TraceError(ex.ToString());
return NotFound();
}
}
private FlorDataset Run(TableContinuationToken token)
{
return _repo.GetAllByYear("2016", token) as FlorDataset;
}
The calling code, which calls my fairly standard Web API 2 Controller is:
do
{
try
{
HttpResponseMessage response = null;
if (string.IsNullOrEmpty(token.NextRowKey))
{
response = await client.GetAsync("api/list/None");
}
else
{
response = await client.GetAsync($"api/list/{token.NextRowKey}");
}
if (response.IsSuccessStatusCode)
{
var stringresponse = await response.Content.ReadAsStringAsync();
var ds = JsonConvert.DeserializeObject<FlorDataset>(stringresponse);
token = ds.Token;
Flors.AddRange(ds.Flors);
}
else
{
token = null;
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
token = null;
}
} while (token != null);
Okay, this is not the greatest solution, but it's the only thing that works so far, in case anyone else is trying the same and stumbling across my question....
In the calling code you do a horrible bit of string replacement before you do the deserialisation.... I actually feel dirty just posting this, so if anyone comes up with a better answer, please feel free to share.....
if (response.IsSuccessStatusCode)
{
var stringresponse = await response.Content.ReadAsStringAsync();
stringresponse = stringresponse.Replace(">k__BackingField", "");
stringresponse = stringresponse.Replace("<", "");
var ds = JsonConvert.DeserializeObject<FlorDataset>(stringresponse);
token = ds.Token;
Flors.AddRange(ds.Flors);
}
Not nice, not pretty, but does work!!!! :-D Going to wash my fingers with bleach now!!!
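If anyone else lands here: the >k__BackingField names mean the Web API side serialized the token's compiler-generated backing fields rather than its properties. Instead of patching the string on the client, one option is to stop returning the raw TableContinuationToken and expose a small DTO with just the three continuation values, rebuilding the token before the next call. A sketch of that idea (ContinuationDto is a made-up name):

// DTO carried inside FlorDataset instead of the raw TableContinuationToken
public class ContinuationDto
{
    public string NextPartitionKey { get; set; }
    public string NextRowKey { get; set; }
    public string NextTableName { get; set; }
}

// Server side: copy the token into the DTO before returning the dataset
static ContinuationDto ToDto(TableContinuationToken token)
{
    if (token == null) return null;
    return new ContinuationDto
    {
        NextPartitionKey = token.NextPartitionKey,
        NextRowKey = token.NextRowKey,
        NextTableName = token.NextTableName
    };
}

// Client side: rebuild the real token from the DTO for the next request
static TableContinuationToken FromDto(ContinuationDto dto)
{
    if (dto == null) return null;
    return new TableContinuationToken
    {
        NextPartitionKey = dto.NextPartitionKey,
        NextRowKey = dto.NextRowKey,
        NextTableName = dto.NextTableName
    };
}

FlorDataset would then hold a ContinuationDto instead of the token, it round-trips through Json.NET without any backing-field names, and the calling loop only changes by converting the DTO back with FromDto.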

Selenium and html agility pack drops html

I'm using the HTML Agility Pack and Selenium to crawl a site, find particular tables, and then parse those tables. Everything works fine individually, but when I run the app, it sometimes drops huge chunks of HTML from within the table. When I track down the page on the site with the data, the HTML is there. For whatever reason, it isn't there when the crawler is running.
Here's the code. The rows[r].InnerHtml is NOT the HTML from the page. Does anyone have any thoughts on what might be happening here?
public IMyInterface CreateObjectFromHtmlRow(HtmlNode rowNode)
{
try
{
var columns = rowNode.SelectNodes("td");
MyClass obj = new MyClass()
{
OnlineId = columns[0].InnerText.Trim(),
FirstName = columns[1].InnerText.Trim(),
MiddleInitial = columns[2].InnerText.Trim(),
LastName = columns[3].InnerText.Trim(),
Residence = columns[4].InnerText.Trim(),
};
return obj;
}
catch (Exception exc)
{
_logger.LogFormat("Error trying to parse row: {0}", exc.Message);
return null;
}
}
IMyInterface obj = null;
obj = _repository.CreateObjectFromHtmlRow(rows[r]);
if (obj == null)
{
_logger.LogFormat("Unable to create object from this data: {0}", rows[r].InnerHtml);
}
else
{
// Do something useful
}
Thanks for your help.
WW
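Not an authoritative answer, but the symptom (the HTML is on the live page yet missing while the crawler runs) often means the page source was captured before the table finished rendering, e.g. rows filled in by JavaScript. One hedged check is to make Selenium wait for the rows to appear before handing the HTML to the Agility Pack; the selectors below are placeholders for the real table:

// driver is the existing IWebDriver; needs OpenQA.Selenium.Support.UI and HtmlAgilityPack
var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(15));
wait.Until(d => d.FindElements(By.CssSelector("table.results tr")).Count > 1); // placeholder selector

// Only snapshot and parse the DOM once the rows are actually there
var doc = new HtmlDocument();
doc.LoadHtml(driver.PageSource);
var rows = doc.DocumentNode.SelectNodes("//table[contains(@class,'results')]//tr");

If InnerHtml is still truncated after waiting, logging driver.PageSource at the moment of the failure would show whether the gap is in what Selenium sees or in what the Agility Pack parses.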

Multiple PushNotification Subscriptions some work properly and some don't

I tried posting this on the Exchange Development forum and didn't get any replies, so I will try here.
I have a Windows service that fires every fifteen minutes to see if there are any subscriptions that need to be created or updated. I am using the Managed API v1.1 against Exchange 2007 SP1. I have a table that stores all the users who want their mailbox monitored, so that when a notification comes in to the "Listening Service" I am able to look up the user, access the message and log it into the application we are building. In the table I have the following columns that store the subscription information:
SubscriptionId - VARCHAR(MAX)
Watermark - VARCHAR(MAX)
LastStatusUpdate - DATETIME
My service calls a function that queries the data it needs (based on which task it is doing). If the user doesn't already have a subscription, the service will go and create one. I am using impersonation to access the mailboxes. Here is my "ActivateSubscription" method, which is fired when a user needs the subscription either created or updated.
private void ActivateSubscription(User user)
{
if (user.ADGUID.HasValue)
{
PrincipalContext ctx = new PrincipalContext(ContextType.Domain, Settings.ActiveDirectoryServerName, Settings.ActiveDirectoryRootContainer);
using (UserPrincipal up = UserPrincipal.FindByIdentity(ctx, IdentityType.Guid, user.ADGUID.Value.ToString()))
{
ewService.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SID, up.Sid.Value);
}
}
else
{
ewService.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, user.EmailAddress);
}
PushSubscription pushSubscription = ewService.SubscribeToPushNotifications(
new FolderId[] { WellKnownFolderName.Inbox, WellKnownFolderName.SentItems },
Settings.ListenerService, 30, user.Watermark,
EventType.NewMail, EventType.Created);
user.Watermark = pushSubscription.Watermark;
user.SubscriptionID = pushSubscription.Id;
user.SubscriptionStatusDateTime = DateTime.Now.ToLocalTime();
_users.Update(user);
}
We have also run the following cmdlet to give the account we use to access EWS the ability to impersonate on the Exchange Server.
Get-ExchangeServer | where {$_.IsClientAccessServer -eq $TRUE} | ForEach-Object {Add-ADPermission -Identity $_.distinguishedname -User (Get-User -Identity mailmonitor | select-object).identity -extendedRight ms-Exch-EPI-Impersonation}
The "ActivateSubscription" code above works as expected. Or so I thought. When I was testing it I had it monitoring my mailbox and it worked great. The only problem I had to work around was that the subscription was firing twice when the item was a new mail in the inbox, I got a notification for the NewMail event and Created event. I implemented a work around that checks to make sure the message hasn't already been logged on my Listening service. It all worked great.
Today, we started testing two mailboxes being monitor at the same time. The two mailboxes were mine and another developers mailbox. We found the strangest behavior. My subscription worked as expected. But his didn't, the incoming part of his subscription work properly but any email he sent out the listening service never was sent a notification. Looking at the mailbox properties on Exchange I don't see any difference between his mailbox and mine. We even compared options/settings in Outlook. I can see no reasons why it works on my mailbox and not on his.
Is there something that I am missing when creating the subscription. I didn't think there was since my subscription works as expected.
My listening service code works perfectly well. I have placed the code below incase someone wants to see it to make sure it is not the issue.
Thanks in advance, Terry
Listening Service Code:
/// <summary>
/// Summary description for PushNotificationClient
/// </summary>
[WebService(Namespace = "http://tempuri.org/")]
[WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
[System.ComponentModel.ToolboxItem(false)]
// To allow this Web Service to be called from script, using ASP.NET AJAX, uncomment the following line.
// [System.Web.Script.Services.ScriptService]
public class PushNotificationClient : System.Web.Services.WebService, INotificationServiceBinding
{
ExchangeService ewService = new ExchangeService(ExchangeVersion.Exchange2007_SP1);
public PushNotificationClient()
{
//todo: init the service.
SetupExchangeWebService();
}
private void SetupExchangeWebService()
{
ewService.Credentials = Settings.ServiceCreds;
try
{
ewService.AutodiscoverUrl(Settings.AutoDiscoverThisEmailAddress);
}
catch (AutodiscoverRemoteException e)
{
//log auto discovery failed
ewService.Url = Settings.ExchangeService;
}
}
public SendNotificationResultType SendNotification(SendNotificationResponseType SendNotification1)
{
using (var _users = new ExchangeUser(Settings.SqlConnectionString))
{
var result = new SendNotificationResultType();
var responseMessages = SendNotification1.ResponseMessages.Items;
foreach (var responseMessage in responseMessages)
{
if (responseMessage.ResponseCode != ResponseCodeType.NoError)
{
//log error and unsubscribe.
result.SubscriptionStatus = SubscriptionStatusType.Unsubscribe;
return result;
}
var sendNoficationResponse = responseMessage as SendNotificationResponseMessageType;
if (sendNoficationResponse == null)
{
result.SubscriptionStatus = SubscriptionStatusType.Unsubscribe;
return result;
}
var notificationType = sendNoficationResponse.Notification;
var subscriptionId = notificationType.SubscriptionId;
var previousWatermark = notificationType.PreviousWatermark;
User user = _users.GetById(subscriptionId);
if (user != null)
{
if (user.MonitorEmailYN == true)
{
BaseNotificationEventType[] baseNotifications = notificationType.Items;
for (int i = 0; i < notificationType.Items.Length; i++)
{
if (baseNotifications[i] is BaseObjectChangedEventType)
{
var bocet = baseNotifications[i] as BaseObjectChangedEventType;
AccessCreateDeleteNewMailEvent(bocet, ref user);
}
}
_PreviousItemId = null;
}
else
{
user.SubscriptionID = String.Empty;
user.SubscriptionStatusDateTime = null;
user.Watermark = String.Empty;
_users.Update(user);
result.SubscriptionStatus = SubscriptionStatusType.Unsubscribe;
return result;
}
user.SubscriptionStatusDateTime = DateTime.Now.ToLocalTime();
_users.Update(user);
}
else
{
result.SubscriptionStatus = SubscriptionStatusType.Unsubscribe;
return result;
}
}
result.SubscriptionStatus = SubscriptionStatusType.OK;
return result;
}
}
private string _PreviousItemId;
private void AccessCreateDeleteNewMailEvent(BaseObjectChangedEventType bocet, ref User user)
{
var watermark = bocet.Watermark;
var timestamp = bocet.TimeStamp.ToLocalTime();
var parentFolderId = bocet.ParentFolderId;
if (bocet.Item is ItemIdType)
{
var itemId = bocet.Item as ItemIdType;
if (itemId != null)
{
if (string.IsNullOrEmpty(_PreviousItemId) || (!string.IsNullOrEmpty(_PreviousItemId) && _PreviousItemId != itemId.Id))
{
ProcessItem(itemId, ref user);
_PreviousItemId = itemId.Id;
}
}
}
user.SubscriptionStatusDateTime = timestamp;
user.Watermark = watermark;
using (var _users = new ExchangeUser(Settings.SqlConnectionString))
{
_users.Update(user);
}
}
private void ProcessItem(ItemIdType itemId, ref User user)
{
try
{
ewService.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, user.EmailAddress);
EmailMessage email = EmailMessage.Bind(ewService, itemId.Id);
using (var _entity = new SalesAssistantEntityDataContext(Settings.SqlConnectionString))
{
var direction = EmailDirection.Incoming;
if (email.From.Address == user.EmailAddress)
{
direction = EmailDirection.Outgoing;
}
int? bodyType = (int)email.Body.BodyType;
var _HtmlToRtf = new HtmlToRtf();
var message = _HtmlToRtf.ConvertHtmlToText(email.Body.Text);
bool? IsIncoming = Convert.ToBoolean((int)direction);
if (IsIncoming.HasValue && IsIncoming.Value == false)
{
foreach (var emailTo in email.ToRecipients)
{
_entity.InsertMailMessage(email.From.Address, emailTo.Address, email.Subject, message, bodyType, IsIncoming);
}
}
else
{
if (email.ReceivedBy != null)
{
_entity.InsertMailMessage(email.From.Address, email.ReceivedBy.Address, email.Subject, message, bodyType, IsIncoming);
}
else
{
var emailToFind = user.EmailAddress;
if (email.ToRecipients.Any(x => x.Address == emailToFind))
{
_entity.InsertMailMessage(email.From.Address, emailToFind, email.Subject, message, bodyType, IsIncoming);
}
}
}
}
}
catch(Exception e)
{
//Log exception
using (var errorHandler = new ErrorHandler(Settings.SqlConnectionString))
{
errorHandler.LogException(e, user.UserID, user.SubscriptionID, user.Watermark, user.SubscriptionStatusDateTime);
}
throw e;
}
}
}
I have two answers for you.
First, you will have to create one instance of ExchangeService per user. As I understand your code, you just create one instance and switch the impersonation, which is not supported. I developed a Windows service which is pretty similar to yours; mine synchronises mail between our CRM and Exchange. So at startup I create an instance per user and cache it for as long as the application runs.
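A rough sketch of that per-user cache, using only the Managed API types already shown above (not the answerer's actual code, just one way to do it):

// One ExchangeService per mailbox, created once and reused, instead of switching
// ImpersonatedUserId on a single shared instance (add locking if used from multiple threads).
private readonly Dictionary<string, ExchangeService> _serviceCache =
    new Dictionary<string, ExchangeService>(StringComparer.OrdinalIgnoreCase);

private ExchangeService GetServiceFor(string smtpAddress)
{
    ExchangeService service;
    if (!_serviceCache.TryGetValue(smtpAddress, out service))
    {
        service = new ExchangeService(ExchangeVersion.Exchange2007_SP1);
        service.Credentials = Settings.ServiceCreds;   // same service account as in SetupExchangeWebService
        service.Url = Settings.ExchangeService;        // or resolve it with AutodiscoverUrl(...)
        service.ImpersonatedUserId =
            new ImpersonatedUserId(ConnectingIdType.SmtpAddress, smtpAddress);
        _serviceCache.Add(smtpAddress, service);
    }
    return service;
}

ProcessItem (and anything else that touches a mailbox) would then call GetServiceFor(user.EmailAddress) instead of reassigning ewService.ImpersonatedUserId on the shared instance.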
Now about cached mode. The difference between using cached mode and not is just a timing gap: in cached mode Outlook synchronizes from time to time, while non-cached it happens immediately. When you use cached mode and want the events to reach your Exchange server immediately, you can press the "Send/Receive" button in Outlook to force the sync.
Hope that helps you...