SSRS - listing out CUSTOM data sources?

I am trying to change the server for several dozen reports, all developed by different people, that use a mix of:
Shared Data Sources (I have a query for this)
Custom Data Sources in RDLs (not perfect, but I have a PowerShell script that will download all the reports, and then I can search them with Notepad++)
Custom Data Sources that are overridden on the SSRS Server.
Does anybody have a way to find the details of the Custom Data Sources? Shared is easy; I need the details of the CUSTOM ones. I tried going through the data sources table, but it's not standard varbinary.
And hey, bonus fake internet points if you have a way to update the custom DS as well; I already have code to change the shared ones.
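For what it's worth, the connection details of custom (embedded) data sources live in the RDL XML itself, which SSRS stores in the Catalog table's Content column; it has to be converted to XML (with a UTF-8 BOM stripped first) before it will parse, which is probably the "not standard varbinary" issue. A rough, untested sketch against the ReportServer database (the RDL namespace below is the 2008 one and varies by SSRS version):

-- embedded data sources live in the RDL XML in Catalog.Content;
-- the image-typed column must be converted, and a UTF-8 BOM stripped,
-- before it parses as XML
WITH ItemContent AS (
    SELECT ItemID, [Path], Name,
           CONVERT(varbinary(max), Content) AS ContentBin
    FROM ReportServer.dbo.[Catalog]
    WHERE [Type] = 2                      -- 2 = report
),
ItemContentNoBOM AS (
    SELECT [Path], Name,
           CASE WHEN SUBSTRING(ContentBin, 1, 3) = 0xEFBBBF
                THEN SUBSTRING(ContentBin, 4, DATALENGTH(ContentBin) - 3)
                ELSE ContentBin
           END AS ContentBin
    FROM ItemContent
)
SELECT [Path], Name,
       -- NULL for reports that reference a shared data source instead
       CONVERT(xml, ContentBin).value(
           'declare default element namespace "http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition";
            (/Report/DataSources/DataSource/ConnectionProperties/ConnectString)[1]',
           'nvarchar(max)') AS ConnectString
FROM ItemContentNoBOM;

Note this only covers data sources embedded in the RDL. Overrides made on the server live encrypted in the DataSource table, so for those (and for the bonus points: updating them) the ReportService2010 web service's GetItemDataSources and SetItemDataSources methods are the usual route.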

Related

How to use a Power BI data model across different folders of similar data

I have a system that produces CSV data on a regular event-driven basis (say, daily). Each event triggers the creation of a new folder and a fixed set of CSV files, each representing different types of data. For instance:
PlansDB.csv - data for plans of action
StepsDB.csv - descriptions of steps used by different plans
GroupsDB.csv - data on groups that can handle plans
RoomsDB.csv - data on places where a group can work on a plan
ResultsDB.csv - the records of results from steps of a plan
These have fields that identify the relationships between the different files, and I have no problems creating a data model for the CSVs in any given folder.
But how do I switch between folders? Once I have a working data model and some reports built on it, I'd like to view those reports against specific folders of data. How does that work? Can I easily switch to yesterday's folder, or last week's, etc. with minimal effort (preferably by just pointing to the folder)?
The CSV files keep the same names across folders, and those names indicate the type of data they store. Can Power BI make use of that?
And can I run reports over multiple folders while maintaining this data model? I know of the Folder merge capability, but my attempts at using it just merge all files as if they were the same type, whereas I need each type merged separately.
You need to change the data source. From "Edit Queries", select "Data source settings", then click the "Change Source..." button and select the new folder. Power BI Desktop will then prompt you to apply the changes and will reload the data from the new folder.

Dynamically changing Report's Shared Data Source at Runtime

I'm looking to use SSRS for multi-tenant reporting and I'd like the ability to have runtime-chosen Shared Data Sources for my reports. What do I mean by this? Well, I could be flexible, but I think these are the two most likely possibilities (though I'm also open to others):
The Shared Data Source is dictated by the client's authentication. In my case, the "client" is a .NET application and not the user, so if this is a viable path then I'd like to somehow have the MainDB (that's what I'm calling it) Shared Data Source selected by the Service Account that the client logs in as.
Pass the name of the Shared Data Source as a parameter and let that dictate which one to use. Given that all of my clients are "trusted players", I am comfortable with this approach. While each client will have its own representative Service Account, it's just for good measure and should not be important. So instead of just calling the data source MainDB, we could instead have Client1DB and Client2DB, etc. It's okay if a new data source means a new deployment but I need this to scale easily enough as well to ~50 different data sources over time.
Why? Because we have multiple/duplicate copies of our production application for multiple customers, but we don't want to duplicate everything, just the web apps and databases. We're fine with some common "back-end" things. And for SSRS, because of how expensive licenses are (and how rarely reports are run by our users), we really want to have just a single back-end for all of our customers (I actually have a second one on standby for manual disaster recovery situations - we don't need to be too fancy here, as reports are the least important DR concern we have).
I have seen this question which points to this post but I was really hoping there was a better way than this. Because of all of those additional steps/efforts/limitations/etc, I'd rather just use PowerShell to script duplicate deployments of the reports with tweaked hardcoded data sources instead of standardizing on the steps in that post. That solution feels WAY too hacky to me and doesn't seem to scale very well at all.
I've done this a bunch of terrible ways (usually hardcoded in a dynamic script), and then I discovered it's actually quite simple.
Instead of using a Shared Connection, use an Embedded Connection and build your connection string from parameters (or any string-manipulation expression).
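For illustration, an embedded data source's connection string can be an expression over report parameters; the parameter names here are just placeholders:

="Data Source=" & Parameters!ServerName.Value & ";Initial Catalog=" & Parameters!DatabaseName.Value

One commonly cited caveat: expression-based connection strings can't be used while designing datasets, so reports are usually built against a fixed connection first and switched to the expression before deployment.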

Inserting label values from a SugarCRM document (sp_ve.js) into a table in MySQL

I am using software powered by SugarCRM. The reports it produces are not very good, so I decided to use the Pentaho BI Suite. I have already built all the business analytics pieces (cubes, reports, Saiku reports, etc.), but I am having an issue: in the SugarCRM database, the label values of some dropdown lists (for example) are not stored. Instead they are written in a .js file in the Sugar directory called sp_ve.js, like this:
SUGAR.language.SetLanguage('app_string', [JSON OBJECT WITH THE LABELS VALUES]);
SUGAR.language.SetLanguage('app_list_strings', [JSON OBJECT WITH LABEL VALUES]);
Now, is there any way to insert those objects into tables so I can use them in my OLAP solution in Pentaho? If so, how can I make this process run automatically every time my users add new labels?
First of all, all the dropdowns, text labels, etc. are actually stored in <lang_key>.lang.php files under the <sugar_root>/include/language/ and <sugar_root>/custom/include/language/ directories. Only then do those labels become accessible to the JavaScript used in the SugarCRM UI. See this page for more information.
In a similar situation, preparing a SugarCRM database for analysis with Tableau Desktop Professional, I decided on the following approach.
First, all the needed dictionaries/mappings in the files mentioned above were extracted into separate CSV files; you can do this with the text editor of your choice using find/replace. Those CSV files were then imported into the SugarCRM database as new tables, one table for each list. After that I used a database view to join the "data" table with the "label map" table, and used that view as the source for my analytical needs. It's not an elegant or dynamic solution, and it needs a little hand work, but it solves the issue.
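As a rough sketch of that approach, with all table and column names invented for the example (one mapping table per dropdown list, loaded from its extracted CSV):

-- mapping table for one dropdown list, populated from the extracted CSV
CREATE TABLE status_labels (
    status_key   VARCHAR(100) PRIMARY KEY,
    status_label VARCHAR(255) NOT NULL
);

-- view joining the raw SugarCRM data to its human-readable labels
CREATE VIEW cases_labeled AS
SELECT c.*, l.status_label
FROM cases AS c
LEFT JOIN status_labels AS l ON l.status_key = c.status;

Re-importing the CSVs (for instance with LOAD DATA INFILE in a small scheduled script) is one way to pick up labels your users add later.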

moving database from file maker pro7 to Mysql

So I have this FileMaker Pro 7 database. As my senior project, I am supposed to migrate the database to a MySQL database and then give it a PHP-based interface in 3NF...
The company allows us $200 tops to spend on the project, but if I pay for something, it has to work. However, I am having trouble finding a way of migrating the database. Any suggestions?
I have found "FmPro Migrator" (http://www.fmpromigrator.com); would the trial version be enough for us? In the worst case, we will start from the beginning and throw away the whole database the company has.
I can also download FileMaker Pro 12 and use the trial version for a month for free. Would I be able to convert the DB using FMP 12?
I am totally lost... open to any free suggestions...
(This is a non-profit company I'm doing the project for.)
If I had to do it, I'd look at the design of the FileMaker DB and create something similar in MySQL. Then I would export the FileMaker data to text and import it, as sketched below. The details depend on foreign key values and such.
The PHP interface would be done separately.
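A minimal sketch of that export/import route, assuming a contacts table whose name and columns are invented for the example:

-- recreate the FileMaker table's shape in MySQL
CREATE TABLE contacts (
    id         INT PRIMARY KEY,
    first_name VARCHAR(100),
    last_name  VARCHAR(100),
    company_id INT  -- FileMaker relationships become plain FK columns
);

-- load the text export from FileMaker (CSV with a header row)
LOAD DATA LOCAL INFILE 'contacts.csv'
INTO TABLE contacts
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

-- enforce the relationship once the data is clean
ALTER TABLE contacts
    ADD CONSTRAINT fk_contacts_company
    FOREIGN KEY (company_id) REFERENCES companies (id);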
MySQL Data Conversion:
Yes, if your database is small enough, the demo version of FmPro Migrator will convert the database and also build you a PHP web application - at no cost.
Here are the limitations of the demo version:
5 fields
5 scripts
5 layouts
PHP Web Application:
Most people don't realize it, but there is a wealth of FileMaker metadata available in XML format for performing these types of conversions. This XML info is available either through copying the layout via the clipboard or by reading it from the Database Design Report XML file. I have found the clipboard data to be the most reliable source of this info.
FmPro Migrator is able to parse in the XML and convert it into the PHP web application.
Each object on a layout is represented in XML, along with style and position info. This info can be used to create form files representing the same look as the original layout. In fact, it can be difficult to see the difference between the web application and the original database if you get all of the object properties implemented. This can be helpful for situations in which companies don't want to have to retrain their employees. They want the web application to look and work the same as the original desktop application.
I have done a few of these conversions recently into the CakePHP framework. Here are a few techniques I used:
Auto-Enter Calculation Fields - Stored calculation fields are calculated and stored when the model saves a record to the database.
Unstored Calculation Fields - Unstored Calculation fields are calculated in real-time within the form controller - but only for fields actually displayed on the form. This prevents unnecessarily calculating these values if they aren't being used on a form, improving performance.
Global Fields - A global field in FileMaker is used like a global variable in programming languages. It is important to know that each FileMaker user gets their own private copy of global field data. There is no equivalent feature in MySQL or other SQL database servers, but this functionality can easily be simulated using SESSION variables, so each web user still gets their own private SESSION data, matching the functionality originally present in the FileMaker database. I structure these globals in the model data array as if they were retrieved from the model, meaning that converted scripts and fields on forms can reference them easily. Just before the record gets written into the database, the results are saved into SESSION variables for persistence.
Global Variables in Scripts - Global variables within FileMaker scripts match up very well with the use of PHP SESSION variables, if you want to implement the same functionality.
Vector Graphic Objects - FileMaker layouts frequently include rectangle, oval, and line objects. These objects can be replaced with the RaphaëlJS library, providing high-quality, resolution-independent graphics.
Value Lists - Custom and field-based value lists are implemented in a centralized location within the AppController.php file, so changing the definition of a value list within the AppController updates the menu automatically throughout the whole application.

Notes database to MySQL (with CF?) / or how to get the NSF-datastructure

I have a commercial ColdFusion application running on a MySQL database. A possible new client has approached me; they have been working in a Lotus Notes environment (and their own database) for many years now. Of course they want to migrate their data to my application before making the move.
I'm trying to get a grip on the data, structure, and interdependencies in their current database application. Are there any tools to see the database structure (like in an RDBMS) of an NSF file, or is there any way to dump the structure using ColdFusion, etc.? I don't have any hands-on experience with Lotus Notes (I do in the meanwhile have a local Lotus client and their database).
I need a good starting point to be able to determine whether or not I can find a way to migrate the data.
Any ideas??
thanks
Bart
To get at the data in Notes, a good option is to use the NotesSQL ODBC driver.
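Once the driver is configured, Notes forms and views can be queried more or less like relational tables; a small illustrative query (the form and column names here are invented):

-- via NotesSQL, a Notes form or view is exposed as a table over ODBC
SELECT First_Name, Last_Name, Company
FROM Contacts
WHERE Company = 'Acme';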
A quick overview of the Notes data structure is this: Notes is a document-centric database, with non-relational data contained within each document. Notes Databases (NSFs) contain any number of Notes Documents, which in turn contain any number of items that hold data. Each Notes Document can have a different set of items, and thus different data in it. While that sounds like a horrible mess, usually the documents have similar data based on the form used to create the documents.
This all leads to why there is no simple way to get data out of Lotus Notes. There are a few other options, which may or may not be useful depending on how much data you have to migrate.
I personally like using XML to extract data from Lotus Notes. You can do so by creating XML views within a Notes database. IBM has a tutorial that looks helpful.
Using Java or LotusScript, you can write code to extract data from the documents to any format you wish (CSV, XML, TXT, etc)
If it's not a lot of data, you may find getting the data into an Excel format is the simplest intermediary step. Long ago I wrote an add-in tool for exporting data from Lotus Notes to Excel, which may help you. Or you can use the "Edit > Copy Selected To Table" feature in the Lotus Notes client to copy what is visible in a Notes View to the clipboard, and then paste that into Excel. In that scenario, you'd want to edit the views so they show all the data you need.
I hope this helps!