How to update lookup value in Workflow on CRM dynamics 4.0 - dynamics-crm-4

I want to update a lookup value in a workflow; each lookup record contains one unique field.
Can I update the lookup field on an entity record using that unique value of the lookup within a workflow?
Please suggest a way to do this without resorting to plug-in development or web application development.
Can you suggest how to implement this using a workflow in CRM Dynamics 4.0?

Yes, you can set a lookup field via the workflow designer to be a particular record.

Related

TFS 2013 SSRS Test Points

I am working with TFS 2013 and generating reports with SSRS. I have been tasked with creating a detailed report that shows all steps and acceptance criteria for each test case and lists the available parameters. I have searched the web repeatedly and have not found any examples that show the test steps.
Thank you in advance.
In most instances the Test Steps fields are not set to reportable by default so they are not populated in normal reporting databases. You'll need to set them to reportable or create a new data source targeting your TFS transaction databases instead of the Warehouse/Cube databases.
It is not recommended that you target the transaction databases with reports, since doing so can affect TFS performance while reports run, especially for a field like Test Steps, which might need to pull a lot of data at once.
Use the witadmin command to change the reportability of the Test Steps field. The detail option will only put the information in your Warehouse, while the dimension option will put it into both the Warehouse and the Cube. Where you want it depends on how you write your reports. Once set, the reportability type can only be changed in limited ways, so keep this in mind when deciding how you want it set.
witadmin changefield /collection:http://yourcollectionURL:8080/tfs/YourCollectionName /n:<nameOfFieldToChange> /reportingtype:<dimension|detail>
See the reportable attributes section for details.

SSRS 2012 custom code - Local update of Shared Dataset values - Is this scenario possible?

Assuming a fairly conventional SSRS 2012 report (in Visual Studio 2012) with a main report, a set of sub-reports, a shared dataset that is populated at the start of the report, and a shared datasource.
Is there any simple way within a sub-report's custom code (this is VBA, right?) to access the shared dataset, either to read or update records locally? (No updates back to the database itself.) I'm seeing hints out there that this is possible but no clear examples yet.
And if the above is possible, assuming that a call in the sub-report changed a record in the shared dataset, could that record change be displayed in the main report body?
Yes and No.
I think the overall concept would work but a few points won't.
I don't think you'd be able to use the report dataset from custom code (which is VB.NET, by the way, not VBA). The code won't have access to the report's data source directly, so you'd probably need to use ADO.NET to reach the database from your code.
The only way to see the updated data would be to refresh the report, either manually or automatically on a timer.
I don't see how the subreport is going to figure out what to update the value to. You might have some idea that I'm not seeing right now.
The easiest way I can see this working would be to use parameters that default to NULL: select the row to update with one parameter and the new value with another, then add an UPDATE to your main query that only runs when both parameters are populated.
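The NULL-defaulting parameter approach above can be sketched as a dataset query. The table, column, and parameter names here are hypothetical placeholders; adapt them to your own schema:

```sql
-- @UpdateKey and @UpdateValue are report parameters defaulting to NULL.
-- Hypothetical table/column names; substitute your own.
IF @UpdateKey IS NOT NULL AND @UpdateValue IS NOT NULL
    UPDATE dbo.MyTable
    SET SomeColumn = @UpdateValue
    WHERE KeyColumn = @UpdateKey;

-- The normal SELECT that feeds the report runs either way.
SELECT KeyColumn, SomeColumn
FROM dbo.MyTable;
```

When both parameters are left at their NULL defaults, the UPDATE is skipped and the report renders normally.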

SSRS generate one pdf per record automatically

I've built a report which displays data from a db. This works fine, rendering the report correctly.
My query returns ~40,000 records. I would like to automatically generate a pdf file for each record, named using one of the fields returned by the query.
How do I automate this? I have Report Builder 3, and Visual Studio.
Thanks.
I'd write a report that takes a parameter and generates the report for one record. Deploy to your Reporting Services server.
Then I'd write a quick program that loops through your data, passes each unique record value to the report as a parameter, and saves the result as a PDF with a unique name. It's quite easy to run reports programmatically, and Microsoft has sample code to get you started.
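One lightweight way to script that loop is SSRS URL access, which renders a report by requesting a URL of the form `http://server/ReportServer?/Path/Report&rs:Format=PDF`. The sketch below builds such URLs; the server name, report path, and `RecordId` parameter are hypothetical placeholders, and the commented-out download loop assumes the third-party `requests` package plus credentials that can reach your report server:

```python
def render_url(server, report_path, record_id, fmt="PDF"):
    """Build an SSRS URL-access request that renders one record as PDF."""
    return (f"http://{server}/ReportServer?{report_path}"
            f"&rs:Command=Render&rs:Format={fmt}&RecordId={record_id}")

# Loop over the key values returned by your ~40,000-record query and
# save each rendered report under a unique name:
# import requests
# for record_id, name in rows:
#     resp = requests.get(render_url("myserver", "/Sales/RecordReport", record_id),
#                         auth=("DOMAIN\\user", "password"))
#     with open(f"{name}.pdf", "wb") as f:
#         f.write(resp.content)
```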
If you have the Enterprise SKU of SQL Server, look at data-driven subscriptions. Another option is to create a report with one record per page and then split the resulting PDF into individual PDFs using one of the free tools available.

reporting services - determining report GUID from old report name

I'm trying to use a combination of the SQL Server 2008 Reporting Services ReportServer database and the report server web service to determine the GUID for a particular report. What I want is to see a list of the previous names of a report that has been renamed or moved on the report server. I know a report's history snapshots stays with it when you rename it because the report has a unique GUID that doesn't change when you rename it. However, I can't seem to find a place in the database where previous names of a report are associated with the report's GUID. I can't find any instance of an old report name in the database, so I don't know if this is even stored. When I look at all the values of Catalog.Path and Catalog.Name in the database, the old values are not included from before the report rename. Is it possible, given a value like MyAwesomeReport, to associate that with a GUID like 7af3fe6d-b4ea-4cd8-8430-280392cba428, so that I can determine that this report has actually been renamed to MySuperAwesomeReport?
I think the easiest way to get a history for these is to build your own ETL script that looks up the current names of the reports, keyed by GUID, and stores them in another structure. Having looked through the ReportServer database, I don't believe the name attribute of a report is held historically. I would base my implementation on the sample code for reporting utilization stats, which is available in the examples on CodePlex. I'll see if I can find the link to that example; it's how I learned the most about how the ReportServer database is used and structured.
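A minimal sketch of that ETL idea, assuming the standard ReportServer schema (the `Catalog` table stores `ItemID`, `Path`, `Name`, and `Type`, where `Type = 2` denotes a report); how you persist and schedule the snapshots is up to you:

```python
# Query to snapshot the current report names, keyed by GUID.
CATALOG_QUERY = """
SELECT ItemID, Path, Name
FROM dbo.Catalog
WHERE Type = 2  -- 2 = report
"""

def detect_renames(previous, current):
    """Compare two {guid: name} snapshots; return (guid, old_name, new_name) tuples."""
    return [(guid, old, current[guid])
            for guid, old in previous.items()
            if guid in current and current[guid] != old]
```

Run the query on a schedule, persist the `{guid: name}` snapshot, and diff it against the previous one to accumulate a rename history over time.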

When do I need to update the Report Model

Our database changes constantly (!); new columns are often added.
Is Reporting Services the right tool to choose for reporting in this case?
Case1: Developers add a new column to a table used in a report. Will the old reports created with a report model based on the old table still work?
Case2: Developers add a new column, and end users want to be able to report on it. If we update the report model, will the old reports based on the old report model still work? Or do we have to create a new report model every time the end user wants to report on a newly created column?
Regards
Lars
Reporting Services has strategies for handling change management. Adding a new column to a table in the underlying data source does not affect existing reports.
If you want to include a newly added table column in your report model, you should update (not recreate from scratch) your report model. Updating the report model automatically inserts your new column into the model and does not break your old reports. On the other hand, updating the report model does not update or delete existing items if you change them (table/column names, column data types, etc.) in the underlying source; you have to change those manually in the report model and in the affected reports.
So, in your case, you won't be having any problem with reporting services.
Here I'm including the change management section of the Reporting Services/Report Model documentation, and I strongly suggest you read it.
Change Management
Models and the reports based on them have many internal and external dependencies. Therefore, you need to consider the impact of changes introduced into the dependency chain. Report models based on relational data sources use GUID attributes to identify each entity, attribute, and role. As mentioned, the report model-generation process sets the GUIDs, which are re-created at each generation. For that reason, and to preserve edits on the report model, generating a new report model each time a change occurs is not an option. You must work with the existing model and update it, either manually or by using the update options described below.
The Semantic Query Engine manages missing attributes when they are not critical to report processing. This functionality is in place to keep reports running when security attributes preclude users from seeing some attributes in the report that may be allowed to other users. Thus, if a user is not allowed to access a field such as the employee home telephone number, the Employee Listing report will run for that user but will not show the excluded information. This functionality works to your advantage when models are edited to delete a non-critical attribute. The report will still run after you have removed an attribute, although the report might show a blank field. However, query or report processing can be broken by other changes to the model. Remember that you should not overwrite a model generated from a relational data source when any reports depend on it.
Schema Changes
If the underlying schema changes and report model entities or attributes are affected, you might have to update the report model accordingly. To do so in BIDS, use the Autogenerate command on the Reporting Model menu. You can also select Autogenerate from the model item's context menu. By using the context menu, you can select which item on the model you want to update without having to update the entire model.
The autogeneration process will show informational, warning, and alert messages. These messages will show all items in the model that are out-of-sync with the underlying DSV, even though those items are not specifically included in the item selected for autogeneration. This functionality helps detect potential errors that may lead to unpredictable errors when running reports based on the model.
Automatic update affects newly added items only. The autogeneration process will add any new entity, attribute, or role found in the DSV, but will not delete or change any entity, attribute, or role. Therefore, you need to manually manage updated or deleted items. The messages shown at the end of the generation process will highlight any entity, attribute, or role that needs to be updated in the resulting out-of-sync model. You will have to update the model manually or revert the DSV changes to maintain model-to-schema coherence.
Data Source Changes
You can develop and test your model in a development environment and then deploy the model in a production environment easily by changing the connection string in the data source file that the DSV uses. The two data source schemas must be identical. Note that the DSV contains statistics based on the actual database data. As mentioned in the section "Statistics in Report Model Generation," the value of those statistics will drive some algorithm decisions during model generation. Therefore, if the development database data is significantly different from the production database data, the model might not be optimized for the data that will eventually be used.
Hope this helps.