Lotus Domino NotesSQL ODBC & SQL 2008 query - sql-server-2008

I'm trying to get the same information from a couple of different sources but have hit a wall in trying to use NotesSQL and SQL 2008. What I am trying to do is retrieve info from a couple of different views on Domino servers. One view is a default view; the other is a created one.
One method I have used is PowerShell: I get the database, select the view, get the first document, and then iterate through the rest of the view, grabbing the fields that I need. The view I have selected is the People view.
I was trying to replicate this same thing using SQL 2008, using the NotesSQL driver, setting up an ODBC connection, and then creating a linked server to that Notes database. I am using the following query to select from the People view:
select * from openquery(MyNotesServer,'Select * from People')
However, what is returned from this view isn't what I am able to see when I use PowerShell and iterate through the documents returned in that view. PowerShell shows 100+ columns in it, while SQL only returns 5 columns. Additionally, they're named "_12", "_17", etc. Some fields (which may be custom, I don't know) have a meaningful name. Of the fields shown, I can select them by name ("_12", etc.) but cannot select anything else. The number of rows (SQL) is the same as the number of documents in the view (PowerShell $view.AllEntries.Count).
Querying the view that was created (3 fields):
select * from openquery(MyNotesServer,'Select * from MyCreatedView')
returns all the fields in that view, and they are named as they are in the view.
When querying the People view in T-SQL, how do you get the names of the columns that I know are there, as discovered in my PowerShell script? They don't appear to be named the same thing, so how do you retrieve more than the 5 columns returned when you select * from the view? I have read through the Notes documentation and examples, but haven't been able to figure out what is mapped where.
The reasoning behind this is that I want to use SQL and a notes.id file instead of running a script. Also, I want to make use of an already existing global view instead of views that may be accessible only to their author.

You can use select * from Person. Person is a form name, not a view name. Notes and Domino are not relational. The NSF file is a document database. The views in it are pre-built indexes that already have an implicit select. I.e., the "People" view selects all documents created with the "Person" form.
The above query bypasses the use of a view and will give you all the fields for all documents created with the Person form.
Actually, come to think of it, the better query would be select * from Person where type='Person'. That's because the People view in Domino uses type="Person" instead of form="Person" in its selection formula. It is theoretically possible to have a document created with the Person form but with the Type field set to a different value. This variation will ensure that you always get the same list that you see in the People view.
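On the SQL Server side, that pass-through query would be wrapped in OPENQUERY against the linked server from the question, with the inner quotes doubled, something along these lines:
select * from openquery(MyNotesServer, 'select * from Person where type=''Person''')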
But: In either case, it will be inefficient. The NotesSQL driver will have to do a full database search instead of simply reading the index of an existing view. It's going to search using the Notes formula SELECT Form="Person" & Type="Person". I really cannot recommend this, unless you are querying against a small Domino Directory database.
The best practice is to create a view containing all the fields that you really need, and do your query against that view.

First, in the Lotus Notes documentation you'll find that views have "columns", not fields.
Second, each column has an option called "Programmatic Name", in which you can put an "alias". By default, Lotus Notes assigns values such as "$12", "$17", etc. The catch is that NotesSQL replaces "$" with "_", which is why you see "_12", "_17", etc.
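If you want to see exactly which column names the driver exposes through the linked server, a standard SQL Server metadata call should help (the linked server and view names are taken from the question; I have not verified how much metadata NotesSQL actually reports):
EXEC sp_columns_ex @table_server = 'MyNotesServer', @table_name = 'People'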
How do you get the original names? As far as I remember (I don't have Lotus Notes at hand to verify), you can't. But you can create another view whose columns hold the data you need, and give them appropriate names. The easiest way is to copy/paste the view, delete the columns you don't need, and rename the ones you want to keep.
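Once such a view exists, the linked-server query can use those names directly. A minimal sketch, assuming a view called PeopleExport whose columns were given the programmatic names LastName, FirstName and MailAddress (all hypothetical):
select LastName, FirstName, MailAddress
from openquery(MyNotesServer, 'select * from PeopleExport')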

Related

Alternative approach to parameters in Invantive Control to control query outcome

I would like to use parameters in Invantive Control.
For example, I would like to retrieve only the hours from Exact Online Project Management that fall within the given date parameters.
There are three often used approaches:
Use parameters in the model editor.
Use Excel values.
Use data from the databases involved.
Model Parameters
To use model parameters, you first define them in the model editor:
Select a unique code.
Optionally provide a default value (always a string; use to_... or casting to convert it in the queries).
Define list-of-values providing a quick pick.
The unique code is in general all uppercase and in the format P_..., but as long as it is a legal identifier anything should work.
To use them in one or more block queries or triggers:
Use the building blocks button.
Or type $P{CODE} in the SQL or trigger for Excel.
The values can be entered using the parameters button (the green funnel in the Control ribbon).
Please note that parameters are always bound as parameters, and not textually substituted into the query, so you can NOT say: select * from $P{TABLE_NAME}.
Please note that parameters can depend on each other, so in the query for one parameter you can use another parameter, such as first choosing the country of a project and then showing a list of projects in that country. But be wise: avoid recursion and other overly complex scenarios; the user will not easily understand them.
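For example, a block query restricted to a date range might look like this (the table name, the column name and the parameter codes P_DATE_FROM and P_DATE_TO are placeholders; as noted, default values arrive as strings, so a to_... conversion or cast may still be needed):
select *
from projecthours
where entrydate between $P{P_DATE_FROM} and $P{P_DATE_TO}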
Excel values
To use Excel values, you can define them as follows:
Enter a value somewhere in Excel.
Optionally define a named range to make it easier to change the location.
You can of course attach lists and other validations to the cell as you normally would in Excel, for instance as a pop-up; cell locking also works fine.
To use the actual value in a query or trigger of Invantive Control you can use the building blocks in the query editor, or use something like select * from table where code = $X{projectcode} or select * from table where code = $X{B2&C2&D2}. The last one shows that you can also use other types of Excel expressions.
Note that Excel parameters are also bound as parameters to the query, but that they are also typed, so the following query can be different depending on the general format of the Excel cell:
select *
from table
where code like $X{CELL}
When the cell contains text, the database (or Exact Online in this case) will receive:
select *
from table
where code like :ex0
Here ex0 is a text value such as '8%'. But when the cell is formatted as a percentage, the contents might still display in Excel as '8%', while the actual query will be identical to the outcome of:
select *
from table
where code like 0.08
This caused me some headaches, but typing is in general a useful feature, especially with dates.
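For example, with a date-formatted cell exposed through a named range StartDate (a made-up name), the bound value arrives as a date rather than as text:
select *
from table
where entrydate >= $X{StartDate}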
Parameter using database data
Option 3 is practically not feasible with Exact Online, since there are few possibilities to create your own tables and/or fields.
On other platforms such as Oracle you might want to enter new rows in Invantive Control in Excel and then upload them on sync to provide parameters. This is especially handy for complex risk models.
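As a rough sketch of that idea on Oracle (all names below are made up): keep a small parameter table that Excel fills on sync, and let the report queries read from it.
create table report_parameters
( code  varchar2(30) primary key
, value varchar2(100)
)
select t.*
from risk_positions t
where t.region = (select p.value from report_parameters p where p.code = 'REGION')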

Invalid Field Name Access Web App

I have an Access Web App and I am currently trying to make an OnClick macro for a field on one of my views so that when it is clicked, it will pop up another view to a specific record. Currently, my Where clause reads Where: [DistrictID]=[Districts].[DistrictID]
When I go into the App and click on the field with the macro, I get a pop up stating "Invalid field name 'Districts.DistrictID'."
The one and only site I could find that mentioned this error is: http://blogs.technet.com/b/the_microsoft_access_support_team_blog/archive/2014/08/04/access-app-invalid-field-name-lt-tablename-gt-lt-fieldname-gt-error-when-using-where-clause.aspx
This site says that it is due to case-sensitivity of the Where clause. My case matches my table/field exactly (I program in other languages, so I always match case just out of habit). I have quadruple-checked my spelling anyway, and even went so far as to copy-paste the field name into my Where clause. Still I get the error.
I have another view that does something similar with a different table/view that works perfectly (Where: [ContactID]=[Contacts].[ContactID]).
Does anyone have any idea why my Where clause is not working, or what I could be doing wrong?
Extra info if needed:
I would include photos, but the information in my database is sensitive, so I will do my best to describe the information in question:
I have 10 tables in the database. "Districts" is the one I am trying to work with. The Districts table has quite a few fields, including DistrictID, DistrictName, EmailService, SpecialComment, and more. These four, however, are the fields being queried for the datasheet view which contains the macro.
The OnClick macro is triggered for the SpecialComment field - when clicked, I want my District List view to pop up to the same district whose SpecialComment was clicked on (the Special Comment can run long sometimes, so if it is cut off by the size limit of the datasheet, I want people to be able to read the rest of the info without having to switch to another view and then find the district in the list).
The District List view and the queried view both have the DistrictID field in the view, although it is hidden. I have tried unhiding the field in both views and it did not solve the problem - I had other Where clauses that used the DistrictID field before this that worked fine, so I doubted it would change anything anyway. Those Where clauses from before were replaced by other functions, so I don't have them to refer back to, to see why that one worked and this one does not.
If the view you are trying to open using the OpenPopup macro action has a saved query as its record source, then you must use the query name like this:
[FieldNameHere]=[NameOfQueryHere].[FieldNameInThatQuery]
If you are using a table name, then substitute the actual table name in the appropriate spot.
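For example, if the popup view's record source is a saved query named qryDistricts (a hypothetical name), the Where clause from the question would become [DistrictID]=[qryDistricts].[DistrictID].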
One thing to note which sometimes trips people up is that this technique won't work if the view you are trying to open uses an Embedded Query as its record source. The reason for this is you have no way of knowing what the embedded query name is that Access creates behind the scenes for the Access 2013 web app view. (It's actually a GUID name behind the scenes.)
I even had to add a special troubleshooting note on this in my book (page 584), since I knew people would get tripped up by it:
TROUBLESHOOTING
Why do I get an error trying to use a Where clause with an OpenPopup or ChangeView action when the view is based on an embedded query?
Access Services requires the Where clause to include the table or query name on which the view is based. When you define an embedded query as the record source for a view, Access Services creates a hidden system query that is not visible in the Navigation pane. Therefore, you cannot use a Where clause with the OpenPopup action or ChangeView action to open a view based on an embedded query. To work around this limitation, you can base your view on a saved query object. Note that Access Services creates a hidden system query as the record source also for Summary views. This means that you cannot use the Where clause argument to open a Summary view to a specific record or set of records. However, the workaround mentioned above won't work for Summary views.

MS Access data lineage documentation

I am looking for a scripted/automated way (presumably VBA?) to take an Access query and generate some kind of savable, searchable, publish-able documentation on the data lineage. So if there were a bunch of layered/nested queries, or even passthrough queries, along the way I want a way to trace the final fields in the specified query back until I get back to the original source tables/fields.
Everything I've found seems to do database documentation focused on how the table relationships are configured. I'm looking for a way to get the documentation of the user-created portion, down to the field. I'm very open-minded on what format the output is in. I'm convinced this must be possible, but haven't had any luck yet.
I'm also open to recommendations for a third-party application if it could do this.
Thanks in advance!
Access does have a built-in "dependency" feature. The result is a VERY nice tree view of those dependencies, and you can even launch objects from that tree view to "navigate" the application, so to speak.
The option is found under database tools and is appropriately called Object Dependencies.
While you may not want to use Name AutoCorrect itself, this feature does force on the Track name AutoCorrect info option. If this is a large application, a significant delay will occur on the first run. After that, the results can be viewed instantly. As noted, not only do you get a hierarchical tree view, but objects in the tree view can be clicked to launch the object in question.
And the above will work for a query that is based on a query, etc., all the way down to the base table.
https://www.dropbox.com/sh/f73rs3h9u9q2xk5/AAArloN_Cmf_WbPZ4W75I6KVa?dl=0
This is a set of queries I wrote to provide the kind of documentation you're looking for. It seems a bit kludgy, but it works for me. It's not as simple as the other response, but it provides output that can be incorporated into other documentation.
Note - the documentation is out of date with respect to Union queries. The query I use to analyze Union queries seems to pick up only the first two inputs to the Union, so I changed it to a Make Table query, and I manually edit the resulting table to add the missing relationships.
To use the queries:
Copy the table and all the queries into your database
Run the "Mapping Unions Make Table" query
Manually edit the Unions table if necessary
When you run any of the 3 main output queries, you are prompted for the Top object you want to analyze. Enter the name of a query or table to find all the dependencies for that object. The three main outputs are:
Mapping Summary - lists all of the objects that go into the top object and all of the objects that go into them, to a depth of about 10 (depth is controlled in the "Mapping all parents" query)
Mapping summary without duplicates
Mapping summary duplicates
I especially like the 2nd output - this is in a format that can be saved in Excel and input to Visio's Org Chart Wizard to get a simple graphical representation of the relationships. Then the 3rd output query can be used to manually add in the queries that go into more than one other query, which Visio's wizard cannot handle.

MS Access with linked tables to SQLServer using FIND button

I'm using MS Access 2007 as a front end and have all linked tables in SQLServer 2008 R2 backend.
In a form in Access I am trying to execute the FIND button, either in the ribbon or via a button created on the form, with the express purpose of looking for records with a specific value in a particular field.
When I complete the entry in the FIND window, I click on Find Next. In some cases, the record(s) is found immediately. In others, it can go for hours only to report that it can't find anything (when I know it should).
The table I am looking in has approximately 99,000 records in it. It doesn't seem to matter whether or not the field is indexed.
Is there something I'm doing wrong, or is Access unable to handle this? Also, is the answer to create a stored procedure that handles multiple search requests and passes the info back to Access?
The find methods are known to be slow with ODBC data sources. Here is what the Access 2007 Recordset.FindFirst Method help topic says:
When working with Microsoft Access database engine-connected ODBC databases and large dynaset-type Recordset objects, you might discover that using the Find methods or using the Sort or Filter property is slow. To improve performance, use SQL queries with customized ORDER BY or WHERE clauses, parameter queries, or QueryDef objects that retrieve specific indexed records.
Furthermore, binding an Access form to a record source of 99K records is a performance challenge. Use a query as the form's record source, and design the query to return only one or a few rows.
Give the user an option to choose a different set of rows, and modify the form's record source property to reflect the user's choice.
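As a minimal sketch (the table, form and control names are placeholders), the record source could be a query like the following, so that Access can usually push the criterion to SQL Server and fetch only the matching rows:
SELECT *
FROM dbo_MyLargeTable
WHERE SearchField = [Forms]![frmSearch]![txtSearch];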
This depends on the type of search you need and on the data type of the column (field) to be searched. For example, if I have a text data type in an indexed column and I search using Start of field or Whole field, it will be quite fast, however, if I search for Any part of field, it may well fall over. In other words, if Access can use an index for the search, it will work, even on quite a large table, otherwise, you may be best with a stored procedure, though I doubt that will be fast without an index, either.
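In SQL terms (table and column names made up), the two kinds of search boil down to roughly this on the server:
select * from Customers where LastName like 'Smi%'   -- "Start of field": an index on LastName can be used
select * from Customers where LastName like '%mit%'  -- "Any part of field": forces a scan of every row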

How do I re-use reports on different datasets?

What is the best way to re-use reports on different tables / datasets?
I have a number of reports built in BIRT, which get their data from a flat (un-normalized) MySQL table, whose data in turn has been imported from an Excel sheet.
In BIRT, I've constructed my query like this, such that I can change the field names and re-use the report:
SELECT * FROM
(SELECT `index` as "Index", name as "Name", param1 as "First Parameter" FROM mytable) t
However, when I switch to a new client's data, I need to change the query to point at the new data source, and this doesn't seem sustainable or anywhere near a good practice.
So... what is a good practice?
Is this a reporting issue, or a database-design issue?
Do I create a standard view that the report connects to?
If I have a standard view, do I create a different view with the same structure for each data table, or keep replacing the view with a reference to the correct data table each time I run the report?
What's annoying is that the Excel sheets keep changing - new columns are added, and different clients name their data differently. Even if I can standardize this, I'd store different client data in different tables... so would I need to create a different report for each client, or pass in the table name to the report?
There are two ways and the path you choose is really dictated by how much flexibility you have architecturally.
First, you are on the right track by renaming your selected columns to a common name, since that name is what is used to bind the data to the controls on the report. Have you considered a stored procedure to access the data? This removes the query from the report and allows you to set up the stored proc on any database to return the necessary columns.
If you cannot off-load to a stored proc, you can always rely on altering the query text at run time. Because BIRT reports are not compiled (they are XML), you can change the query based on parameters and have it executed for each run of the design. Look at the onCreate event of the Data Set: you can access this.queryText and do any dynamic string substitution you need via JavaScript. Hidden parameters are a good way to help alter/tune the query.
If you build the Data Set correctly, changing the underlying data could be as easy as changing the Data Source and then re-associating the Data Set with the new Data Source (in the edit Data Set window). I have done this MANY times and it works well.
If you are going down this route, I would add the Data Source(s), Data Set(s) and any controls they provide data to into a report library. With the library you can use the controls in many reports and maintain them in one spot. If you update the library, all the reports using the library get updated as well.
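As a rough sketch of the stored-procedure idea above, on MySQL (the procedure name is made up; the columns and aliases are the ones from your query): each client database exposes the same procedure returning the same aliases, and the BIRT Data Set simply calls it, for example with {call report_data()}, so nothing in the report itself needs to change per client.
CREATE PROCEDURE report_data()
BEGIN
  SELECT `index` AS `Index`, name AS `Name`, param1 AS `First Parameter`
  FROM mytable;
END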
Alternatively, if you want to really commit to a fully re-usable strategy that allows you to build a library of reusable components, you could check out the free Reusable Component Library at BIRT Exchange. In my opinion this strategy would give you the re-use you are looking for, but at the expense of maintainability. It is abstraction to the point of obfuscation. It requires totally generic names for columns and controls, which makes debugging very difficult. While it would not be my first choice (the option above would be), others have used it successfully, so I thought I would include it here since it directly speaks to your question.