Concurrency issue in Flink stream job - MySQL

I have a Flink streaming job which does user fingerprinting based on click-stream event data. A code snippet is attached below:
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
// using processing-time semantics for windowing (not event time)
env.setStreamTimeCharacteristic(TimeCharacteristic.ProcessingTime);
DataStream<EventData> input = ConfluentKafkaSource.
createKafkaSourceFromApplicationProperties(env);
final OutputTag<EventData> emailPresentTag = new OutputTag<>("email-present") {
};
final OutputTag<EventData> dispatchIdPresentTag = new OutputTag<>("dispatch-id-present") {
};
final OutputTag<EventData> residueTag = new OutputTag<>("residue") {
};
SingleOutputStreamOperator<EventData> splitStream = input
.process(new ProcessFunction<EventData, EventData>() {
@Override
public void processElement(
EventData data,
Context ctx,
Collector<EventData> out) {
if (data.email != null && !data.email.isEmpty()) {
// emit data to side output for emailPresentTag
ctx.output(emailPresentTag, data);
} else if (data.url != null && data.url.contains("utm_source=starling")) {
// emit data to side output for dispatchIdPresentTag
ctx.output(dispatchIdPresentTag, data);
} else {
// emit data to side output for ip/campaign attributing
ctx.output(residueTag, data);
}
}
});
DataStream<EventData> emailPresentStream = splitStream.getSideOutput(emailPresentTag);
DataStream<EventData> dispatchIdPresentStream = splitStream.getSideOutput(dispatchIdPresentTag);
DataStream<EventData> residueStream = splitStream.getSideOutput(residueTag);
// process the 3 split streams separately based on their corresponding logic
DataStream<EventData> enrichedEmailPresentStream = emailPresentStream.
keyBy(e -> e.lbUserId == null ? e.eventId : e.lbUserId).
window(TumblingProcessingTimeWindows.of(Time.seconds(30))).
process(new AttributeWithEmailPresent());
DataStream<EventData> enrichedDispatchIdPresentStream = dispatchIdPresentStream.
keyBy(e -> e.lbUserId == null ? e.eventId : e.lbUserId).
window(TumblingProcessingTimeWindows.of(Time.seconds(30))).
process(new AttributeWithDispatchPresent());
DataStream<EventData> enrichedResidueStream = residueStream.
keyBy(e -> e.lbUserId == null ? e.eventId : e.lbUserId).
window(TumblingProcessingTimeWindows.of(Time.seconds(30))).
process(new AttributeWithIP());
DataStream<EventData> dataStream = enrichedEmailPresentStream.union(enrichedDispatchIdPresentStream, enrichedResidueStream);
final OutputTag<EventData> attributedTag = new OutputTag<>("attributed") {
};
final OutputTag<EventData> unattributedTag = new OutputTag<>("unattributedTag") {
};
SingleOutputStreamOperator<EventData> splitEnrichedStream = dataStream
.process(new ProcessFunction<EventData, EventData>() {
@Override
public void processElement(
EventData data,
Context ctx,
Collector<EventData> out) {
if (data.attributedEmail != null && !data.attributedEmail.isEmpty()) {
// emit data to side output for emailPresentTag
ctx.output(attributedTag, data);
} else {
// emit data to side output for ip/campaign attributing
ctx.output(unattributedTag, data);
}
}
});
//splitting attributed and unattributed stream
DataStream<EventData> attributedStream = splitEnrichedStream.getSideOutput(attributedTag);
DataStream<EventData> unattributedStream = splitEnrichedStream.getSideOutput(unattributedTag);
// attributing backlog unattributed events using attributed stream and flushing resultant attributed
// stream to kafka enriched_clickstream_event topic.
attributedStream = attributedStream.windowAll(TumblingProcessingTimeWindows.of(Time.seconds(30))).
process(new AttributeBackLogEvents()).forceNonParallel();
attributedStream.
addSink(ConfluentKafkaSink.createKafkaSinkFromApplicationProperties()).
name("Enriched Event kafka topic sink");
//handling unattributed events. Flushing them to mysql
Properties dbProperties = ConfigReader.getConfig().get(REPORTINGDB_PREFIX);
ObjectMapper objectMapper = new ObjectMapper();
unattributedStream.addSink(JdbcSink.sink(
"INSERT IGNORE INTO events_store.unattributed_event (event_id, lb_user_id, ip, event) values (?,?,?,?)",
(ps, t) -> {
ps.setString(1, t.eventId);
ps.setString(2, t.lbUserId);
ps.setString(3, t.ip);
try {
ps.setString(4, objectMapper.writeValueAsString(t));
} catch (JsonProcessingException e) {
logger.error("[UserFingerPrintJob] "+ e.getMessage());
}
},
JdbcExecutionOptions.builder()
.withBatchIntervalMs(Long.parseLong(dbProperties.getProperty(REPORTINGDB_FLUSH_INTERVAL)))
.withMaxRetries(Integer.parseInt(dbProperties.getProperty(REPORTINGDB_FLUSH_MAX_RETRIES)))
.build(),
new JdbcConnectionOptions.JdbcConnectionOptionsBuilder()
.withUrl(dbProperties.getProperty(REPORTINGDB_URL_PROPERTY_NAME))
.withDriverName(dbProperties.getProperty(REPORTINGDB_DRIVER_PROPERTY_NAME))
.withUsername(dbProperties.getProperty(REPORTINGDB_USER_PROPERTY_NAME))
.withPassword(dbProperties.getProperty(REPORTINGDB_PASSWORD_PROPERTY_NAME))
.build())).name("Unattributed event ReportingDB sink");
env.execute("UserFingerPrintJob");
Steps involved:
1. Split the stream into 3 streams based on 3 criteria, attribute each with an email, and then collect the union of these 3 streams.
2. Events which remain unattributed in the above step are sunk to MySQL as backlog unattributed events.
3. Events which are attributed are passed on to the AttributeBackLogEvents ProcessFunction. I'm assuming the issue is here.
In the AttributeBackLogEvents function, I'm fetching all events from MySQL which have a cookie-id (lb_user_id) or ip present in the input attributed events. Those events are then attributed and percolated down to the Kafka sink along with the input attributed events. For some of these unattributed events, I'm seeing duplicate attributed events with a timestamp difference of 30 seconds (which is the processing-time window). What I think is happening is that while one task of the AttributeBackLogEvents function is still processing, a separate task is fetching the same events from MySQL, and both tasks process them simultaneously. Basically, I want to enforce a record-level lock in MySQL/code so that the same event doesn't get picked up twice. One way may be to use SELECT ... FOR UPDATE, but given the size of the data this could lead to deadlocks (or would this approach be useful?). I tried the forceNonParallel() method too, but it isn't helpful.
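One way to get that record-level lock without the deadlock risk of a plain SELECT ... FOR UPDATE is to claim the backlog rows inside a single transaction with SELECT ... FOR UPDATE SKIP LOCKED (MySQL 8.0+): rows already locked by one task are silently skipped by a concurrent reader instead of blocking. Below is a minimal sketch of what the fetch step inside AttributeBackLogEvents could look like under that assumption; the table and column names come from the INSERT statement above, while the method name and connection handling are invented for illustration:
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

// Hypothetical claim step: fetch and delete backlog rows in one transaction so a
// later window firing cannot pick up the same events (MySQL 8.0+ SKIP LOCKED).
private List<String> claimBacklogEvents(Connection conn, String lbUserId, String ip) throws SQLException {
    List<String> events = new ArrayList<>();
    conn.setAutoCommit(false);
    try (PreparedStatement select = conn.prepareStatement(
             "SELECT event_id, event FROM events_store.unattributed_event " +
             "WHERE lb_user_id = ? OR ip = ? FOR UPDATE SKIP LOCKED");
         PreparedStatement delete = conn.prepareStatement(
             "DELETE FROM events_store.unattributed_event WHERE event_id = ?")) {
        select.setString(1, lbUserId);
        select.setString(2, ip);
        try (ResultSet rs = select.executeQuery()) {
            while (rs.next()) {
                events.add(rs.getString("event"));
                delete.setString(1, rs.getString("event_id"));
                delete.addBatch();
            }
        }
        delete.executeBatch(); // remove claimed rows before the locks are released
        conn.commit();
    } catch (SQLException e) {
        conn.rollback();
        throw e;
    }
    return events;
}
The delete (or an "attributed" status update) has to happen in the same transaction as the select; otherwise the locks are released on commit and the next 30-second firing can re-read the same rows. If the job runs against MySQL 5.7, SKIP LOCKED is unavailable, and an UPDATE ... SET claimed-token style claim followed by a select on the token is a common substitute.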

Related

Dart boolean stays the same after update

I have a login page where the user credentials are checked against a status response from an API. I've written a function that returns a future boolean from the check, but my problem is that if the user puts in the wrong info the first time, all the times they try to log in afterwards the function still comes back as false.
I've printed the user input to the console and it shows that the old info was updated, but the check still comes back as false.
Future boolean function:
bool loginCheck;
Future<bool>check() async{
try{
await fetchResponse().then((response){
if(response.status == "ok"){
return loginCheck = true;
}else {
return loginCheck = false;
}
});
}
catch (e){
print(e);
}
return loginCheck;
}
API response function:
Future <SubsonicResponse>fetchResponse() async{
try{
userClear();
loginUser();
var authresponse = await http.get(authURL);
if (authresponse.statusCode == 200){
var jsondata = jsonDecode(authresponse.body);
var data = apicallFromJson(jsondata);
var response = data.subsonicResponse;
return response;
} else{
}
}
catch (e){
print(e);
}
}
other functions:
void loginUser() {
serveraddress = _serveraddressController.text;
password = _passwordController.text;
username = _usernameController.text;
print(username);
print(password);
print(serveraddress);
}
void loginclear(){
_serveraddressController.clear();
_passwordController.clear();
_usernameController.clear();
}
void userClear(){
loginCheck = null;
serveraddress = null;
password = null;
username = null;
}
As you can see above, I've tried clearing the user input vars before the request; the input updates to the newest user input but the check still comes back false.
Login button:
onPressed: () {
check().then((loginCheck){
print(loginCheck);
if(loginCheck == true){
loginclear();
return Get.toNamed('/home');
} else {
return showAlertDialog(context);
}
});
},
If the user puts the right info in the first time it works no problem.
You need to update the state of your variables using some sort of state management, e.g. use setState() (or streams or whatever, based on your use case) to update your variable.
Simply calling userClear() will not work.
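A minimal sketch of one way to restructure this, assuming the login page is a StatefulWidget: let check() complete with the fresh result instead of caching it in the loginCheck field, and push the value into widget state with setState(). fetchResponse(), loginclear(), Get.toNamed() and showAlertDialog() are the names from the question:
// check() now resolves with the fresh result; no shared loginCheck field involved
Future<bool> check() async {
  try {
    final response = await fetchResponse();
    return response != null && response.status == "ok";
  } catch (e) {
    print(e);
    return false;
  }
}

// Login button: await the result, then update widget state
onPressed: () async {
  final ok = await check();
  setState(() {
    loginCheck = ok; // keeps the UI in sync with the latest attempt
  });
  if (ok) {
    loginclear();
    Get.toNamed('/home');
  } else {
    showAlertDialog(context);
  }
},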

How to optimize the Hibernate Lucene record-fetching time

I am doing a Hibernate Lucene boolean search to fetch records from the database. At present there are 250 records in my table, which should be trivial for Hibernate Lucene to fetch. But what's happening is that it takes 1:30 to 2:30 minutes.
My flow is: the controller receives some search keywords and passes them to the service and then to the DAO. The service layer starts the transaction, and finally the DAO returns a List of records. After getting the records, I set that list onto a List of XYZTableVo objects.
I don't know where exactly the time is going, whether in Lucene or in the object preparation of the VOs.
Following is my snippet:
Session session = getSession();
FullTextSession fullTextSession = Search.getFullTextSession(session);
SearchFactory searchFactory = fullTextSession.getSearchFactory();
fullTextSession
.createIndexer(XYZTable.class)
.typesToIndexInParallel(20)
.batchSizeToLoadObjects(25)
.cacheMode(CacheMode.NORMAL)
.threadsToLoadObjects(5)
.startAndWait();
searchFactory.optimize(CvlizerJobseeker.class);
MultiFieldQueryParser parser = new MultiFieldQueryParser(new String[] { "Skills.skill" },
new StandardAnalyzer());
parser.setDefaultOperator(Operator.OR);
org.apache.lucene.search.Query luceneQuery = null;
QueryBuilder qb = fullTextSession.getSearchFactory().buildQueryBuilder().forEntity(XYZTable.class)
.get();
BooleanQuery boolQuery = new BooleanQuery();
if (locationList != null) {
if (locationList.get(2) != null) {
boolQuery.add(qb.keyword().onField("XYZTablePersonalInfo.XYZTableAddress.postalCode")
.matching(locationList.get(2)).createQuery(), BooleanClause.Occur.MUST);
}
else if (locationList.get(1) != null) {
boolQuery.add(qb.keyword().onField("XYZTablePersonalInfo.XYZTableAddress.city")
.matching(locationList.get(1)).createQuery(), BooleanClause.Occur.MUST);
}
}
if (StringUtils.isEmpty(query) != true && StringUtils.isBlank(query) != true) {
try {
luceneQuery = parser.parse(query.toUpperCase());
} catch (ParseException e) {
luceneQuery = parser.parse(parser.escape(query.toUpperCase()));
}
boolQuery.add(luceneQuery, BooleanClause.Occur.MUST);
}
boolQuery.add(qb.keyword().onField("isValid").matching(false).createQuery(), BooleanClause.Occur.MUST);
FullTextQuery createFullTextQuery = fullTextSession.createFullTextQuery(boolQuery, XYZTable.class);
createFullTextQuery.list();
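Worth isolating first: in the snippet above, createIndexer(...).startAndWait() rebuilds the whole Lucene index synchronously and optimize() rewrites index segments, and both run on every search, so the minutes may be going there rather than into the query or the VO preparation. A quick way to check is to time the phases separately. A minimal sketch, assuming it runs in the same method as the snippet (toVoList() is a hypothetical stand-in for whatever VO-mapping code the DAO actually uses):
long t0 = System.currentTimeMillis();
fullTextSession.createIndexer(XYZTable.class).startAndWait(); // full reindex
long t1 = System.currentTimeMillis();
List<?> results = createFullTextQuery.list();                 // Lucene query + entity load
long t2 = System.currentTimeMillis();
List<XYZTableVo> vos = toVoList(results);                     // hypothetical VO mapping
long t3 = System.currentTimeMillis();
System.out.println("index=" + (t1 - t0) + "ms, query=" + (t2 - t1) + "ms, vo=" + (t3 - t2) + "ms");
If the indexing phase dominates, run the mass indexer once at application startup (or after bulk loads) rather than per request; Hibernate Search keeps the index in sync automatically as entities change, so the request path only needs the query itself.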

Nancy OnError will not accept a Response object?

The Nancy documentation seems to say that Pipelines.OnError should return null - as opposed to BeforeRequest, which allows both null and a Response object.
All the examples like this one and many code samples here on StackOverflow show a Response being returned in the OnError, just like in the BeforeRequest.
When I attempt to return an HttpStatusCode for Pipelines.OnError, everything works OK!
But when I attempt to return a Response, I get a compiler error:
Operator '+=' cannot be applied to operands of type 'Nancy.ErrorPipeline' and 'lambda expression'
I'm emulating almost exactly the code in the Nancy example, except that mine uses a TinyIoCContainer while the example uses a StructureMap container and a StructureMap-derived bootstrapper.
Here's my code:
const string errKey = "My proj error";
const string creationProblem = "Message creation (HTTP-POST)";
const string retrievalProblem = "Message retrieval (HTTP-GET)";
public void Initialize(IPipelines pipelines)
{
string jsonContentType = "application/json";
byte[] jsonFailedCreate = toJsonByteArray(creationProblem);
byte[] jsonFailedRetrieve = toJsonByteArray(retrievalProblem);
Response responseFailedCreate = new Response
{
StatusCode = HttpStatusCode.NotModified,
ContentType = jsonContentType,
Contents = (stream) =>
stream.Write(jsonFailedCreate, 0, jsonFailedCreate.Length)
};
Response responseFailedRetrieve = new Response
{
StatusCode = HttpStatusCode.NotFound,
ContentType = jsonContentType,
Contents = (stream) =>
stream.Write(jsonFailedRetrieve, 0, jsonFailedRetrieve.Length)
};
// POST - error in Create call
pipelines.OnError += (context, exception) =>
{
// POST - error during Create call
if (context.Request.Method == "POST")
return responsefailedCreate;
// GET - error during Retrieve call
else if (context.Request.Method == "GET")
return responseFailedRetrieve;
// All other cases - not supported
else
return HttpStatusCode.InternalServerError;
};
}
private byte[] toJsonByteArray(string plainString)
{
string jsonString = new JObject { { errKey, plainString } }.ToString();
byte[] result = Encoding.UTF8.GetBytes(jsonString);
return result;
}
I had the same problem and found a nice approach to it: http://paulstovell.com/blog/consistent-error-handling-with-nancy.
You should override RequestStartup on the bootstrapper; here is my test code:
protected override void RequestStartup(TinyIoCContainer container, IPipelines pipelines, NancyContext context)
{
pipelines.OnError.AddItemToEndOfPipeline((ctx, ex) =>
{
DefaultJsonSerializer serializer = new DefaultJsonSerializer();
Response error = new JsonResponse(ex.Message,serializer);
error.StatusCode = HttpStatusCode.InternalServerError;
return error;
});
base.RequestStartup(container, pipelines, context);
}
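One more observation, offered as an assumption from reading the question rather than anything Nancy-specific: "Operator '+=' cannot be applied" is how the C# compiler reports a lambda whose body fails to compile, and the question's lambda references responsefailedCreate (lowercase f) where the field is named responseFailedCreate. Since Nancy converts HttpStatusCode to Response implicitly, the original += form should compile once the identifier is fixed:
pipelines.OnError += (context, exception) =>
{
    if (context.Request.Method == "POST")
        return responseFailedCreate;   // identifier casing fixed
    else if (context.Request.Method == "GET")
        return responseFailedRetrieve;
    else
        return HttpStatusCode.InternalServerError;
};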

APEX, Unit Test, Callout No Response with Static Resource

Bit stuck on another one I'm afraid: I am trying to write a unit test for a bulk Apex class.
The class has a callout to the Google API, so I have created a static resource which I am feeding in via a mock, so I can complete testing of the processing of the JSON that is returned. However, for some reason the response is always empty.
Now the very odd thing is that if I use exactly the same callout/JSON code, and the same mock code, on a previous @future call, then it does return fine.
Here is the class:
global class mileage_bulk implements Database.Batchable<sObject>,
Database.AllowsCallouts
{
global Database.QueryLocator start(Database.BatchableContext BC)
{
String query = 'SELECT Id,Name,Amount,R2_Job_Ref__c,R2_Shipping_Post_Code__c,Shipping_Postcode_2__c FROM Opportunity WHERE R2_Shipping_Post_Code__c != null';
return Database.getQueryLocator(query);
//system.debug('Executing'+query);
}
global void execute(Database.BatchableContext BC, List<Opportunity> scope)
{
system.debug(scope);
for(Opportunity a : scope)
{
String startPostcode = null;
startPostcode = EncodingUtil.urlEncode('HP27DU', 'UTF-8');
String endPostcode = null;
String endPostcodeEncoded = null;
if (a.R2_Shipping_Post_Code__c != null){
endPostcode = a.R2_Shipping_Post_Code__c;
Pattern nonWordChar = Pattern.compile('[^\\w]');
endPostcode = nonWordChar.matcher(endPostcode).replaceAll('');
endPostcodeEncoded = EncodingUtil.urlEncode(endPostcode, 'UTF-8');
}
Double totalDistanceMeter = null;
Integer totalDistanceMile = null;
String responseBody = null;
Boolean firstRecord = false;
String ukPrefix = 'UKH';
if (a.R2_Job_Ref__c != null){
if ((a.R2_Job_Ref__c).toLowerCase().contains(ukPrefix.toLowerCase())){
system.debug('Is Hemel Job');
startPostcode = EncodingUtil.urlEncode('HP27DU', 'UTF-8');
} else {
system.debug('Is Bromsgrove Job');
startPostcode = EncodingUtil.urlEncode('B604AD', 'UTF-8');
}
}
// build callout
Http h = new Http();
HttpRequest req = new HttpRequest();
req.setEndpoint('http://maps.googleapis.com/maps/api/directions/json?origin='+startPostcode+'&destination='+endPostcodeEncoded+'&units=imperial&sensor=false');
req.setMethod('GET');
req.setTimeout(60000);
system.debug('request follows');
system.debug(req);
try{
// callout
HttpResponse res = h.send(req);
// parse coordinates from response
JSONParser parser = JSON.createParser(res.getBody());
responseBody = res.getBody();
system.debug(responseBody);
while (parser.nextToken() != null) {
if ((parser.getCurrentToken() == JSONToken.FIELD_NAME) &&
(parser.getText() == 'distance')){
parser.nextToken(); // object start
while (parser.nextToken() != JSONToken.END_OBJECT){
String txt = parser.getText();
parser.nextToken();
//system.debug(parser.nextToken());
//system.debug(txt);
if (firstRecord == false){
//if (txt == 'text'){
//totalDistanceMile = parser.getText();
system.debug(parser.getText());
//}
if (txt == 'value'){
totalDistanceMeter = parser.getDoubleValue();
double inches = totalDistanceMeter*39.3701;
totalDistanceMile = (integer)inches/63360;
system.debug(parser.getText());
firstRecord = true;
}
}
}
}
}
} catch (Exception e) {
}
//system.debug(accountId);
system.debug(a);
system.debug(endPostcodeEncoded);
system.debug(totalDistanceMeter);
system.debug(totalDistanceMile);
// update coordinates if we get back
if (totalDistanceMile != null){
system.debug('Entering Function to Update Object');
a.DistanceM__c = totalDistanceMile;
a.Shipping_Postcode_2__c = a.R2_Shipping_Post_Code__c;
//update a;
}
}
update scope;
}
global void finish(Database.BatchableContext BC)
{
}
}
and here is the test class;
@isTest
private class mileage_bulk_tests{
static testMethod void myUnitTest() {
Opportunity opp1 = new Opportunity(name = 'Google Test Opportunity',R2_Job_Ref__c = 'UKH12345',R2_Shipping_Post_Code__c = 'AL35QW',StageName = 'qualified',CloseDate = Date.today());
insert opp1;
Opportunity opp2 = new Opportunity(name = 'Google Test Opportunity 2',StageName = 'qualified',CloseDate = Date.today());
insert opp2;
Opportunity opp3 = new Opportunity(name = 'Google Test Opportunity 3',R2_Job_Ref__c = 'UKB56789',R2_Shipping_Post_Code__c = 'AL35QW',StageName = 'qualified',CloseDate = Date.today());
insert opp3;
StaticResourceCalloutMock mock = new StaticResourceCalloutMock();
mock.setStaticResource('googleMapsJSON');
mock.setStatusCode(200); // Or other appropriate HTTP status code
mock.setHeader('Content-Type', 'application/json'); // Or other appropriate MIME type like application/xml
//Set the mock callout mode
Test.setMock(HttpCalloutMock.class, mock);
system.debug(opp1);
system.debug(opp1.id);
//Call the method that performs the callout
Test.startTest();
mileage_bulk b = new mileage_bulk();
database.executeBatch((b), 10);
Test.stopTest();
}
}
Help greatly appreciated!
Thanks
Gareth
1. Not certain what 'googleMapsJSON' looks like; perhaps you could post it for us.
2. Assuming your mock resource is well formatted, make sure the file extension is ".json" and it was saved with UTF-8 encoding.
3. If #2 does not work, you should try saving your resource as .txt - I've run into this before where it needed a plain-text resource but expected application/json content type.
4. Be certain that the resource name string you are providing has the same casing as the name of the resource. It is case-sensitive.
5. Are you developing in a namespaced package environment? Try adding the namespace to the resource name if so.
Otherwise, your code looks pretty good at first glance.
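One quick way to rule out the resource itself is to query it inside the test and dump its body before wiring up the mock; a small sketch, assuming the resource name from the test above:
// Sanity-check that the static resource resolves and contains the expected JSON
StaticResource sr = [SELECT Body FROM StaticResource WHERE Name = 'googleMapsJSON' LIMIT 1];
System.debug(sr.Body.toString()); // should print the mocked Google Directions payload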

Multiple PushNotification Subscriptions some work properly and some don't

I tried posting this on the Exchange Development forum and didn't get any replies, so I will try here. Link to forum
I have a Windows service that fires every fifteen minutes to see if there are any subscriptions that need to be created or updated. I am using the Managed API v1.1 against Exchange 2007 SP1. I have a table that stores all the users that want their mailbox monitored, so that when a notification comes in to the "Listening Service" I am able to look up the user and access the message to log it into the application we are building. In the table I have the following columns that store the subscription information:
SubscriptionId - VARCHAR(MAX)
Watermark - VARCHAR(MAX)
LastStatusUpdate - DATETIME
My service calls a function that queries the data needed (based on which operation it is performing). If the user doesn't have a subscription already, the service will go and create one. I am using impersonation to access the mailboxes. Here is my "ActivateSubscription" method that is fired when a user needs the subscription either created or updated.
private void ActivateSubscription(User user)
{
if (user.ADGUID.HasValue)
{
PrincipalContext ctx = new PrincipalContext(ContextType.Domain, Settings.ActiveDirectoryServerName, Settings.ActiveDirectoryRootContainer);
using (UserPrincipal up = UserPrincipal.FindByIdentity(ctx, IdentityType.Guid, user.ADGUID.Value.ToString()))
{
ewService.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SID, up.Sid.Value);
}
}
else
{
ewService.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, user.EmailAddress);
}
PushSubscription pushSubscription = ewService.SubscribeToPushNotifications(
new FolderId[] { WellKnownFolderName.Inbox, WellKnownFolderName.SentItems },
Settings.ListenerService, 30, user.Watermark,
EventType.NewMail, EventType.Created);
user.Watermark = pushSubscription.Watermark;
user.SubscriptionID = pushSubscription.Id;
user.SubscriptionStatusDateTime = DateTime.Now.ToLocalTime();
_users.Update(user);
}
We have also run the following cmdlet to give the user we are accessing EWS with the ability to impersonate on the Exchange server.
Get-ExchangeServer | where {$_.IsClientAccessServer -eq $TRUE} | ForEach-Object {Add-ADPermission -Identity $_.distinguishedname -User (Get-User -Identity mailmonitor | select-object).identity -extendedRight ms-Exch-EPI-Impersonation}
The "ActivateSubscription" code above works as expected. Or so I thought. When I was testing it I had it monitoring my mailbox and it worked great. The only problem I had to work around was that the subscription was firing twice when the item was a new mail in the inbox, I got a notification for the NewMail event and Created event. I implemented a work around that checks to make sure the message hasn't already been logged on my Listening service. It all worked great.
Today, we started testing two mailboxes being monitor at the same time. The two mailboxes were mine and another developers mailbox. We found the strangest behavior. My subscription worked as expected. But his didn't, the incoming part of his subscription work properly but any email he sent out the listening service never was sent a notification. Looking at the mailbox properties on Exchange I don't see any difference between his mailbox and mine. We even compared options/settings in Outlook. I can see no reasons why it works on my mailbox and not on his.
Is there something that I am missing when creating the subscription. I didn't think there was since my subscription works as expected.
My listening service code works perfectly well. I have placed the code below incase someone wants to see it to make sure it is not the issue.
Thanks in advance, Terry
Listening Service Code:
/// <summary>
/// Summary description for PushNotificationClient
/// </summary>
[WebService(Namespace = "http://tempuri.org/")]
[WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
[System.ComponentModel.ToolboxItem(false)]
// To allow this Web Service to be called from script, using ASP.NET AJAX, uncomment the following line.
// [System.Web.Script.Services.ScriptService]
public class PushNotificationClient : System.Web.Services.WebService, INotificationServiceBinding
{
ExchangeService ewService = new ExchangeService(ExchangeVersion.Exchange2007_SP1);
public PushNotificationClient()
{
//todo: init the service.
SetupExchangeWebService();
}
private void SetupExchangeWebService()
{
ewService.Credentials = Settings.ServiceCreds;
try
{
ewService.AutodiscoverUrl(Settings.AutoDiscoverThisEmailAddress);
}
catch (AutodiscoverRemoteException e)
{
//log auto discovery failed
ewService.Url = Settings.ExchangeService;
}
}
public SendNotificationResultType SendNotification(SendNotificationResponseType SendNotification1)
{
using (var _users = new ExchangeUser(Settings.SqlConnectionString))
{
var result = new SendNotificationResultType();
var responseMessages = SendNotification1.ResponseMessages.Items;
foreach (var responseMessage in responseMessages)
{
if (responseMessage.ResponseCode != ResponseCodeType.NoError)
{
//log error and unsubscribe.
result.SubscriptionStatus = SubscriptionStatusType.Unsubscribe;
return result;
}
var sendNoficationResponse = responseMessage as SendNotificationResponseMessageType;
if (sendNoficationResponse == null)
{
result.SubscriptionStatus = SubscriptionStatusType.Unsubscribe;
return result;
}
var notificationType = sendNoficationResponse.Notification;
var subscriptionId = notificationType.SubscriptionId;
var previousWatermark = notificationType.PreviousWatermark;
User user = _users.GetById(subscriptionId);
if (user != null)
{
if (user.MonitorEmailYN == true)
{
BaseNotificationEventType[] baseNotifications = notificationType.Items;
for (int i = 0; i < notificationType.Items.Length; i++)
{
if (baseNotifications[i] is BaseObjectChangedEventType)
{
var bocet = baseNotifications[i] as BaseObjectChangedEventType;
AccessCreateDeleteNewMailEvent(bocet, ref user);
}
}
_PreviousItemId = null;
}
else
{
user.SubscriptionID = String.Empty;
user.SubscriptionStatusDateTime = null;
user.Watermark = String.Empty;
_users.Update(user);
result.SubscriptionStatus = SubscriptionStatusType.Unsubscribe;
return result;
}
user.SubscriptionStatusDateTime = DateTime.Now.ToLocalTime();
_users.Update(user);
}
else
{
result.SubscriptionStatus = SubscriptionStatusType.Unsubscribe;
return result;
}
}
result.SubscriptionStatus = SubscriptionStatusType.OK;
return result;
}
}
private string _PreviousItemId;
private void AccessCreateDeleteNewMailEvent(BaseObjectChangedEventType bocet, ref User user)
{
var watermark = bocet.Watermark;
var timestamp = bocet.TimeStamp.ToLocalTime();
var parentFolderId = bocet.ParentFolderId;
if (bocet.Item is ItemIdType)
{
var itemId = bocet.Item as ItemIdType;
if (itemId != null)
{
if (string.IsNullOrEmpty(_PreviousItemId) || (!string.IsNullOrEmpty(_PreviousItemId) && _PreviousItemId != itemId.Id))
{
ProcessItem(itemId, ref user);
_PreviousItemId = itemId.Id;
}
}
}
user.SubscriptionStatusDateTime = timestamp;
user.Watermark = watermark;
using (var _users = new ExchangeUser(Settings.SqlConnectionString))
{
_users.Update(user);
}
}
private void ProcessItem(ItemIdType itemId, ref User user)
{
try
{
ewService.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, user.EmailAddress);
EmailMessage email = EmailMessage.Bind(ewService, itemId.Id);
using (var _entity = new SalesAssistantEntityDataContext(Settings.SqlConnectionString))
{
var direction = EmailDirection.Incoming;
if (email.From.Address == user.EmailAddress)
{
direction = EmailDirection.Outgoing;
}
int? bodyType = (int)email.Body.BodyType;
var _HtmlToRtf = new HtmlToRtf();
var message = _HtmlToRtf.ConvertHtmlToText(email.Body.Text);
bool? IsIncoming = Convert.ToBoolean((int)direction);
if (IsIncoming.HasValue && IsIncoming.Value == false)
{
foreach (var emailTo in email.ToRecipients)
{
_entity.InsertMailMessage(email.From.Address, emailTo.Address, email.Subject, message, bodyType, IsIncoming);
}
}
else
{
if (email.ReceivedBy != null)
{
_entity.InsertMailMessage(email.From.Address, email.ReceivedBy.Address, email.Subject, message, bodyType, IsIncoming);
}
else
{
var emailToFind = user.EmailAddress;
if (email.ToRecipients.Any(x => x.Address == emailToFind))
{
_entity.InsertMailMessage(email.From.Address, emailToFind, email.Subject, message, bodyType, IsIncoming);
}
}
}
}
}
catch(Exception e)
{
//Log exception
using (var errorHandler = new ErrorHandler(Settings.SqlConnectionString))
{
errorHandler.LogException(e, user.UserID, user.SubscriptionID, user.Watermark, user.SubscriptionStatusDateTime);
}
throw e;
}
}
}
I have two answers for you.
First, you will have to create one instance of ExchangeService per user. As I understand your code, you just create one instance and switch the impersonation, which is not supported. I developed a Windows service which is pretty similar to yours; mine synchronises mail between our CRM and Exchange. So at startup I create an instance per user and cache it for as long as the application runs. A sketch of that cache is below.
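A minimal sketch of such a per-user cache; GetServiceFor is an invented helper name, and the Settings values are the ones from the question's SetupExchangeWebService:
// One ExchangeService per mailbox, created once and reused, instead of
// switching ImpersonatedUserId on a single shared instance
private readonly Dictionary<string, ExchangeService> _services =
    new Dictionary<string, ExchangeService>();

private ExchangeService GetServiceFor(string smtpAddress)
{
    ExchangeService service;
    if (!_services.TryGetValue(smtpAddress, out service))
    {
        service = new ExchangeService(ExchangeVersion.Exchange2007_SP1);
        service.Credentials = Settings.ServiceCreds;
        service.Url = Settings.ExchangeService;
        service.ImpersonatedUserId =
            new ImpersonatedUserId(ConnectingIdType.SmtpAddress, smtpAddress);
        _services[smtpAddress] = service;
    }
    return service;
}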
Now about cached mode. The difference between using cached mode and not is just a timing gap: in cached mode Outlook synchronizes from time to time, while non-cached it is immediate. When you use cached mode and want the events to reach your Exchange server immediately, you can press the "Send and Receive" button in Outlook to force the sync.
Hope that helps you...