I am generating a report every day through SSRS. I am trying to get the current date in the file name whenever the job runs and the file is created.
Like this: ID_report_03-31-2014
I have tried echo %date% %time% and #ExecutionTime, but neither works.
The main option with a file share subscription is to add #timestamp to the file name.
From File Share Delivery in Reporting Services:
An alternative approach to creating unique files for every delivery is to include a timestamp in the file name. To do this, add the #timestamp variable to the file name (for example, CompanySales#timestamp). With this approach, the file name is unique by definition, so it will never be overwritten.
I don't know if this is exactly what you are after, since it will include a time component, but as far as a standard SSRS subscription goes your options are limited.
Edit after comment
You don't have a lot of control over the format here.
On the MS forums, one of their support engineers suggests using Data-Driven Subscriptions to get more control:
Can we edit #timestamp variable in SSRS:
In this case, we can define the filename with the timestamp in the database and then use Data-Driven Subscriptions to deliver the report.
That seems a poor option to me, but it is the only suggestion out there that I can see.
I have solved that problem using a Data-Driven Subscription.
I found the solution here.
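For reference, in a data-driven subscription the query itself can return the file name with the date already formatted, and that column is then mapped to the file name delivery setting. A minimal T-SQL sketch (the report prefix and date style are only an illustration):

SELECT 'ID_report_' + CONVERT(char(10), GETDATE(), 110) AS [FileName]
-- style 110 formats the current date as mm-dd-yyyy, giving e.g. ID_report_03-31-2014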
Hi, I'm looking for help with the following issue:
In TFS, on our SSRS report server, whenever I run any of the out-of-the-box Sprint Burndown reports, the report seems to run successfully, but I get the following error in the bottom right-hand corner:
Through some research I found that the issue was due to the field definitions in that particular Collection not matching the other collections that we have in TFS. Simple...
In order to determine which field definition in the collection was the issue, I used the witadmin command listfields for all of my collections:
witadmin listfields /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy
This led me to find that the Synchronizes Identity Name Changes definition in the collection mentioned in the TF293000 error was set to a value of true, while it is false in all of my other collections. Issue Found! Should be easy from here...wrong.
The following command should solve my problem:
witadmin changefield /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy /syncnamechanges:false
*Of course, with the proper collection URL substituted for the word Collection.
However when run and after I confirm that I want to make the change I get the following error:
TF401327: The operation is not supported. The feature is obsolete.
I looked the error up and it took me to this page, TFS Known Issue, which tells me it's a known issue that was resolved in Update 1... but we have Update 3.
I then attempted to simply edit the WIT .xml file and update the attribute for that WIT on that collection to false, but when I import the change to the server it tells me it has imported successfully; however, when I export it, I see that the file has not changed.
I have also tried copying the .xml file from the same WIT in another collection and uploading that to the offending collection, and that will not work. I've never had an issue with uploading a WIT, as we've made several changes to our TFS workflow before. I'm pretty stuck at this point and just wondering if anyone else has experienced this issue before. Thanks!
According to the error info, it seems there is a conflict in the TFS data warehouse. This happens because two fields in different collections have different attributes, and there is only one data warehouse. To avoid schema conflicts when you export and process data to the data warehouse databases, you must assign the same values to these attributes across all collections:
Field type (the value for this field cannot be changed for an existing field).
Reporting type.
Reporting name.
What you have done is the correct operation: change/update the attribute for the field in one project collection to match the assignments made in the other project collections.
You could try to narrow down the issue: does it only happen on that specific field in that team project collection? Are all other work item fields working correctly? Also give it a try on another collection, for example set syncnamechanges=true and then set it back to syncnamechanges=false, to see if any issue occurs.
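For that toggle test, the same witadmin syntax shown above applies (again substituting the real collection URL for the word Collection):

witadmin changefield /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy /syncnamechanges:true
witadmin changefield /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy /syncnamechanges:false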
Run the command line on the TFS server machine instead of your development machine, and clear the TFS cache. If the field is not used for reporting in those project collections, you could also try to mark it as non-reportable. For more details, please refer to the links below:
Resolve data warehouse schema conflicts
Change a reportable attribute for a work item field
Good evening,
I am currently developing a way to import machine-created data from a CSV file into a database.
The question I have is: is there a way to react to a change in a CSV file with Lua?
The file gets a line in this format:
17162H,"801234500001",9/23/2016 12:33:30 PM,"INV"
Every time a scanner finishes a scan, a line is added under the old lines, but there is no direct connection to the database to trigger the script.
It doesn't matter whether the change is detected via a different file size, folder size (of the folder that contains the file), or a change in the file information (like the date of last access), but I can't keep the file open and read it permanently for performance reasons.
Also, this is the first time I've asked here, so sorry for my clunky wording; I'll try to improve over time.
Take a look at linotify; it has Lua bindings for inotify and looks like it should do the trick, using the "modify" event to trigger your script.
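A rough, untested sketch of that approach with linotify ("/path/to/scans.csv" is a placeholder for the machine-generated file):

local inotify = require("inotify")

local handle = inotify.init()
-- watch the CSV for modifications (IN_MODIFY fires whenever data is appended)
local wd = handle:addwatch("/path/to/scans.csv", inotify.IN_MODIFY)

-- blocks and yields one event per change to the file
for ev in handle:events() do
    -- react here: e.g. seek to the last known offset, read only the newly
    -- appended lines and insert them into the database
end

handle:rmwatch(wd)  -- only reached if you break out of the loop
handle:close()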
I use a LibUV-based variant in my spylog application.
Usage:
file_monitor(path_to_file, {eol = '\r?\n'}, function(line)
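  -- 'line' presumably receives each new line appended to the monitored file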
...
end)
If you need to run this on Windows, you can use the winapi library, which supports file watchers. Here is an example of how it's used in one of my projects; you'll need to call winapi.sleep() to allow time for the check to trigger.
I have an SSRS 2008 project with some reports, and recently we updated the version. Now I open the project with SQL Server Data Tools 2015 and all is fine: I can deploy, edit, and update without problems.
The problem comes when I want to create a new report: when I add a DataSet to the report, the preview tab says:
An error occurred during local report processing
The definition of the report 'Report Name' is invalid
Nothing more happens, and I can't preview the report anymore.
Does anyone know if this is an issue with upgrading/opening an SSRS 2008 solution in SSDT 2015?
After some research... I couldn't find anything.
So I tried to create the report in Reporting Services 2008, and when trying to preview the report after adding a DataSet, it showed me the same error:
An error occurred during local report processing
The definition of the report 'Report Name' is invalid
But now it has an additional line:
The shared dataset definition is not valid. Details: the required attribute 'name' is missing.
After opening the first search result in Google, it looks like the problem was that my DataSet was a Shared DataSet. JoannaK found the same problem and also found a workaround:
Found a workaround for now: create the data set as embedded, then convert it to a shared data set. It looks like the Name property is set when you start with embedded. The report runs, and data sets can be uploaded to the report server.
This solves my problem. Hope it helps someone in the future.
Source: JoannaK from SQL Server Data Tools Preview update for April 2016
SSDT is generating a broken dataset definition. To fix it:
Open up the shared dataset's .rsd file in a text editor.
In the XML therein, find the <DataSet> opening tag.
Add the attribute Name (case-sensitive) to that tag, e.g: <DataSet Name="SomeDataSet">
You should probably set the Name to be the same as the filename, but it doesn't seem to make any difference as far as I can see.
Thanks to the existing answer; without that I never would have got it to work at all.
VS2015 / SSRS2012
I experienced a similar error. It mostly occurs because of a change in the shared dataset.
Check the query fields in 'Shared Database Properties'; if no fields are there, simply add the fields you have included in your query.
Check the database connection.
Check the stored procedure and its result, if you have used one.
You can delete the old dataset and re-create it with the same name.
As a last option, you can create a new dataset (another one) and bind the new one to the report (take care to map all fields again once you choose to create a new dataset).
I want to know if there is a way to check whether a file has been edited. I looked for methods that could do this for me in the Google Apps Script library, but I have found nothing about it. I don't know if I searched wrong.
Basically, I need to take a file, take some measurable data about it (something like its size), and store it in a variable. Then I need to take that measurable data again, store it in another variable, and compare the two to see if there was a change. I need a boolean return.
Anyone?
You could do the following (pseudo with links to documentation):
1. Get the file you want to check using the DocList Class.
2. Get the ID of that file using File.getID().
3. Get the last edit timestamp using File.getLastUpdated().
4. Store this value in a Spreadsheet, or maybe Script or User Properties.
5. When you want to check whether the file was updated, retrieve it again with File.getFileById().
6. Repeat step 3.
7. Compare the two last-edited timestamps with an operator like !=, or do more complex comparisons on the Dates if you want.
8. Depending on the result of step 7, return true or false.
Google's documentation is great, for all their services. You just need to read it a bit to understand what kind of power you have through scripting. Hopefully my pseudo-method helps in solving your problem!
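A minimal Apps Script sketch of those steps, assuming the newer DriveApp service in place of the deprecated DocList class and Script Properties for storage; the file ID and property key are placeholders:

// Returns true if the file has been edited since the last time this ran.
function hasFileChanged() {
  var props = PropertiesService.getScriptProperties();
  var file = DriveApp.getFileById('YOUR_FILE_ID');        // placeholder ID
  var previous = props.getProperty('lastUpdated');        // value saved on an earlier run (null the first time)
  var current = String(file.getLastUpdated().getTime());  // step 3: last-edit timestamp
  props.setProperty('lastUpdated', current);              // step 4: store it for the next check
  return previous !== null && previous !== current;       // steps 6-8: compare and return a boolean
}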
Look at the file update date: https://developers.google.com/apps-script/reference/drive/file#getLastUpdated(), and for storing data, look up the storing-data section in the Apps Script help pages.
You could also use the GAT General Audit Tool http://goo.gl/hzZ2yf... which reports when files were edited, viewed, and much more.
Using Business Intelligence Development Studio, I am creating a report for SSRS that requires the user to add a few notes before it is printed. The notes do not need to be sent back to the SQL Server that the report is generated from; they just need to be included when the report is printed or exported. I have some other solutions, including:
Exporting to Word for editing, so the user can manually publish to PDF and send it
Including parameters for the note fields, which involves pulling the report, then adding in the notes, and lastly re-pulling the report to include the data
But I really don't want to add the extra steps to the user's process unless necessary. Has anyone tried this before? I've been tinkering and searching and have had no luck.
Thanks in advance.
Input to an SSRS report comes from data sources and the parameters. Some server settings are applied, but all the per-report stuff is from either of those two places.
Based on the OP comment, I would add a text parameter that allows blank values. You can set a default value of ="" so that the report will run on first access. Then any text the user adds can be inserted into the report simply by referring to the parameter's value (=Parameters!MyParam1.Value).
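For example, a textbox expression along these lines would display the notes only when the user has entered something (reusing the MyParam1 name from above as an illustration):

=IIF(Len(Parameters!MyParam1.Value) = 0, "", "Notes: " & Parameters!MyParam1.Value)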
The user is pulling the report first to analyze the data, so they would pull it once, then add the notes, and then pull the report again with the parameter values added.