I want to create a new file for every log event; appending a number to the filename each time it rolls would be sufficient.
Is this possible in Logback?
I'm currently working on a project where I need the Data Factory pipeline to copy based on the last run date.
The process breakdown:
Data is ingested into a storage account
The ingested data is in the directory format topic/yyyy/mm/dd, i.e., multiple files arrive in a single directory, so the files are partitioned by year, month, and day.
The process currently filters on the last high-water-mark (HWM) date, which is updated each time the pipeline runs (it triggers daily at 4 AM). Once the copy succeeds, a Set Variable activity increases the HWM by one day. However, files are not brought over on weekends (this is the problem).
The HWM date will not increase if no files are brought over, and the pipeline will keep looping over the same date.
How do I get the pipeline to advance, or look for the next file in that directory, given that I use the HWM as the directory path to the file, and update the HWM value only when the copy completes? Current update logic:
Current lookup of the HWM and directory path used to copy files:
Instead of adding 1 to the last high-water-mark value, we can try updating the watermark to the current UTC date. That way, even on days when the pipeline copies nothing, the watermark still moves forward and data is copied to the correct destination folder. I have tried to reproduce this in my environment, and below is the approach.
A watermark table is created initially with the watermark value '1970-01-01'.
This table is referenced in the Lookup activity.
A Copy data activity is added, and in the source, the query is given as:
select * from tab1 where lastmodified > '@{activity('Lookup1').output.firstRow.watermark_value}'
In the sink, Blob storage is used. In order to get the folder structure year/month/day,
@concat(formatDateTime(utcnow(),'yyyy'),'/',formatDateTime(utcnow(),'MM'),'/',formatDateTime(utcnow(),'dd'))
is given in the folder path (note that in formatDateTime, uppercase 'MM' is the month; lowercase 'mm' would be minutes).
The file is copied to the path shown below.
Once the file is copied, the watermark value is updated with the current UTC date:
update watermark_table
set watermark_value = '@{formatDateTime(utcnow(),'yyyy-MM-dd')}'
where tab_name = 'tab1'
When the pipeline is triggered the next day, data is copied from the watermark value onward, and once the copy completes, the current UTC date is written back as the watermark value.
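In plain terms, the logic above can be sketched as follows. This is an illustrative Python sketch of the pipeline's behaviour, not ADF code; `run_pipeline`, its arguments, and the string comparison of ISO dates are assumptions for the example.

```python
from datetime import datetime, timezone

def run_pipeline(watermark_value, files):
    """Copy files modified after the stored watermark, then advance the watermark.

    `files` is a list of (path, last_modified) pairs with ISO 'yyyy-MM-dd'
    date strings, which compare correctly as plain strings.
    """
    to_copy = [path for path, last_modified in files
               if last_modified > watermark_value]
    # ... copy `to_copy` into year/month/day folders in the sink ...

    # Advance the watermark to the current UTC date even if nothing was
    # copied, so weekend gaps do not leave the pipeline stuck on one date.
    new_watermark = datetime.now(timezone.utc).strftime('%Y-%m-%d')
    return to_copy, new_watermark
```

The key point mirrored here is that the new watermark comes from the clock, not from `old watermark + 1 day`, so it keeps moving on days with no files.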
Reading the post a couple of times, what I understand is:
You already have watermark logic.
On weekends, when there are NO files in the folder, the current logic does NOT increment the watermark, and so you are facing issues.
If I understand the ask correctly, please use the dayOfWeek() function: add an If Condition and let the current logic execute only when the day of the week is Monday (2) through Friday (6).
https://learn.microsoft.com/en-us/azure/data-factory/data-flow-expressions-usage#dayofweek
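As a minimal sketch of that weekday guard (in Python for illustration; note that Python's `isoweekday()` numbers Monday as 1 through Sunday as 7, whereas the ADF `dayOfWeek()` function in the link numbers Sunday as 1 through Saturday as 7):

```python
from datetime import date

def should_run(today: date) -> bool:
    """Return True only on weekdays, mirroring the If Condition suggested above."""
    # isoweekday(): Monday == 1 ... Sunday == 7
    return today.isoweekday() <= 5
```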
How do I query in RTC 7.0.2, for a formal project, to get each day's effort (time tracking), i.e. the time spent each day on a task?
That might have changed since 2015, but this thread mentioned:
Time tracking is not a field/an attribute. Time tracking entries are added to a work item using a reference/link. I know that others have done this in the past.
The article that deals with this topic is "The Work Item Time Tracking API" from Ralph.
The following code updates or creates time tracking entries but, in the process, first accesses and reads the existing time tracking entries, which is what is of interest here.
public void updateOrCreateTimeSheetEntry(WorkItemWorkingCopy workingCopy,
        ITimeCode timeCode, Timestamp startDateTimeStamp,
        Duration workDuration, Identifier workType,
        IProgressMonitor monitor) throws TeamRepositoryException {
    // Set the active work item from the working copy
    setWorkItem(workingCopy.getWorkItem());
    // Find a matching time sheet entry if it exists
    ITimeSheetEntry timeSheet = findTimeSheetEntry(timeCode,
            startDateTimeStamp, monitor);
    if (timeSheet == null) {
        // There is no time sheet entry yet:
        // create a new one and create the link
        timeSheet = createTimeSheet();
        workingCopy.getReferences()
                .add(WorkItemEndPoints.WORK_TIME,
                        IReferenceFactory.INSTANCE
                                .createReferenceToItem(timeSheet));
        // Add the entry to the map that holds the data
        addEntry(timeSheet, monitor);
    } else {
        // There is a time sheet entry; we need to update it.
        // Get the working copy of the time sheet entry
        timeSheet = (ITimeSheetEntry) timeSheet.getWorkingCopy();
        // Remove this entry's time spent from the current total
        setTotalDuration(new Duration(getTotalDuration().longValue()
                - timeSheet.getTimeSpent().longValue()));
    }
    // Set the new data
    timeSheet.setStartDate(startDateTimeStamp);
    timeSheet.setTimeCodeId(timeCode.getTimeCodeId());
    // TODO: If I leave this out it fails....
    timeSheet.setTimeCode(timeCode.getTimeCodeLabel());
    timeSheet.setTimeSpent(workDuration);
    timeSheet.setWorkType(workType);
    // Add the new time back to the total
    setTotalDuration(getTotalDuration().add(workDuration));
    // Update the value.
    // Note: it is important to set the duration value of the work item,
    // otherwise the work item is not marked as dirty and in need of an
    // update in the repository, and the save process will not save the
    // time sheet
    getWorkItem().setDuration(getTotalDuration().longValue());
    workingCopy.getDependentItems().add(timeSheet);
}
You can create a query to fetch work items for a given date, use a stored query, and list your work items that way.
I want to generate logs with a timestamp in place of the popups I have mentioned.
Please suggest some steps for generating logs with a timestamp.
Code:
click("1450851018693.png")
wait(2)
click(Pattern("1450851091319.png").targetOffset(1, 2))
click(Pattern("1450851555941.png").targetOffset(-201, 1))
type("111")
click(Pattern("1450851201892.png").targetOffset(-13, 2))
type("121")
wait(1)
if exists("1450851253342.png"):
    popup("start button is enabled")
    click("1450851253342.png")
In the above code, instead of popups, I want the messages to be logged to a file with a timestamp.
Please help.
You can use Python's logging module by importing it and getting a logger instance:
import logging
FORMAT = '%(asctime)-15s %(message)s'
logging.basicConfig(format=FORMAT)
logger = logging.getLogger('')
Then use it in your code (by default, level INFO is not printed to the console, but WARNING is):
logger.warning('My message')
You should get a log entry like:
2016-03-07 13:10:43,151 My message
See the Python logging documentation for a description and a basic tutorial.
I am using a sample listener from http://relevantcodes.com/testng-listener-using-extentreports/.
Does someone know how to get the time a test started and ended from the listener?
I checked https://github.com/cbeust/testng/blob/master/src/main/java/org/testng/reporters/SuiteHTMLReporter.java#L424, but it's not clear how the time info is pulled. The code does
iim.getDate()
but I don't understand how that provides the diff, because it just subtracts the diff from itself.
Karthik
I have updated the post; you can use the new code to retrieve the started and ended times:
test.getTest().startedTime = new Date(result.getStartMillis());
test.getTest().endedTime = new Date(result.getEndMillis());
In TestNG, you can retrieve the starting and ending times in milliseconds, which can be converted to Date.
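For anyone computing the elapsed time from those millisecond values outside Java, the same conversion can be sketched in Python (`convert_times` is an illustrative helper, not a TestNG API; TestNG's millisecond values are milliseconds since the Unix epoch):

```python
from datetime import datetime, timezone

def convert_times(start_millis: int, end_millis: int):
    """Convert epoch milliseconds to datetimes and a duration in seconds."""
    started = datetime.fromtimestamp(start_millis / 1000, tz=timezone.utc)
    ended = datetime.fromtimestamp(end_millis / 1000, tz=timezone.utc)
    elapsed_seconds = (end_millis - start_millis) / 1000
    return started, ended, elapsed_seconds
```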
I receive input files daily in a folder called INPUTFILES. These files have a filename along with a datetime.
My package is scheduled to run every day. If I receive 2 files for the day, I need to fetch those 2 files and load them into the table.
For example, I had these files in my folder:
test20120508_122334.csv
test20120608_122455.csv
test20120608_014455.csv
Now I need to process the files test20120608_122455.csv and test20120608_014455.csv for the same day.
I solved the issue. I created a variable that checks whether a file exists for that particular day.
If a file exists for the day, the variable is set to 1.
A Foreach Loop Container is used, and this file-exists variable is placed inside the container.
For Loop properties:
EvalExpression: @fileexists == 1
If no file exists for that particular day, the loop fails.
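The same-day file selection the question describes can also be sketched outside SSIS; here is an illustrative Python version (the `test` prefix and the `yyyymmdd_HHMMSS` filename format are assumed from the sample names above):

```python
from datetime import date

def files_for_day(filenames, day: date):
    """Pick files whose embedded date (testYYYYMMDD_HHMMSS.csv) matches `day`."""
    stamp = day.strftime('%Y%m%d')
    return [name for name in filenames if name.startswith('test' + stamp)]
```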