Access 2010 development environment - ms-access

Newbie here - looking for assistance in moving an Access 2010 front end with a SQL Server back end from an isolated test environment to development in a live environment with multiple developers.
Currently reading "Automatically Deploy a New Access Client" http://www.databasejournal.com/features/msaccess/article.php/3286111/Automatically-Deploy-a-New-Access-Client.htm
I need assistance with how to switch from the test database to the live database during publishing.

The general approach is that on code start-up, you check whether the currently linked back end is the correct one. This suggests that you need some .ini or text file that points to the production server.
Or, if you are on the same network, then you can simply use the Linked Table Manager, point the front end to the production SQL server, and re-link. You then compile to an accDE and simply distribute that pre-linked database to each workstation.
So you only really need re-link code on start-up if you can’t be on site or on the same network to pre-link.
If you link to SQL server using a FILE dsn, then Access automatically converts the links to DSN-less, and thus no re-linking code is required, nor is any “DSN” setup on each workstation required.
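As a quick sanity check after linking (a minimal sketch - "tblCustomers" is just a placeholder for one of your linked tables), you can print a linked table's saved connection string; a DSN-less link typically starts with "ODBC;DRIVER=" rather than "ODBC;DSN=":

' Immediate-window check - the table name below is an example only.
Public Sub ShowLinkConnect()
    Debug.Print CurrentDb.TableDefs("tblCustomers").Connect
End Sub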
So you can manually switch and re-link your front-end application before you distribute it, which simply means changing which database the FE points to prior to distribution.
As noted, if you can't be on the same network for this "switch", then you have to adopt some code to test the current link and ensure that it points to the correct server (and this means an external file naming the correct SQL server needs to exist).
If you need to switch during deployment, then you need some code to test whether a re-link is required, and if so, some re-link code that runs one time to point the links at the production server. It is certainly much less work to pre-link, so that no switch or change of connection (or of which SQL server you point to) is required during deployment.
If you need some re-link code, then this code can re-link for you:
http://www.accessmvp.com/DJSteele/DSNLessLinks.html
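If you just want a minimal starting point, a rough sketch along these lines (the connection string in the comment is an assumption - substitute your own driver, server and database names) walks the linked tables and re-points them:

' Minimal re-link sketch - pass in the DSN-less connection string you want to use,
' e.g. "ODBC;DRIVER={SQL Server};SERVER=MyProdServer;DATABASE=MyDb;Trusted_Connection=Yes"
Public Sub ReLinkTables(strCon As String)
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Set db = CurrentDb()
    For Each tdf In db.TableDefs
        If Left(tdf.Connect, 5) = "ODBC;" Then   ' only touch ODBC-linked tables
            tdf.Connect = strCon
            tdf.RefreshLink
        End If
    Next tdf
End Sub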

Related

I have a production SSIS project and I need to change the destination server. Can I just edit the project or do I need to repackage it?

I have a production SSIS project in SQL Server 2016 that creates and exports a flat file to another server. The destination server has reached end-of-life and I need to change the destination path to the new server so we can decommission the old one. Can I just edit the package or project in Visual Studio, or do I need to recompile (redeploy? republish?)? I have never edited an existing project before, only created new ones; however, that was a few years ago and I am a little rusty.
Alternatively, I could copy the existing job, edit the copy, then run them in parallel first. Then I can disable the old project/package once I am confident the new one works. I'm not having much luck figuring out how to do this either.
Any help would be appreciated. Thanks!
If I'm understanding your question correctly, it sounds like your goal is simply to change the destination location for your report. You would need to update the connection manager for your flat file connection inside your SSIS package.
You would edit the flat file connection manager's connection string (by opening the connection manager properties and changing the value yourself, or by changing the expressions if it is parameterized). Once you have verified it now points at the right server's location for the report you are generating, you would rebuild the entire package and redeploy it, which essentially overwrites the .ispac and .dtsx files on the deployment server with your updates.
Standard caveat of "it depends" but the most common case is that you can solve this through Configuration.
Right-click on the Package (or Project), depending on how things are set up. In the "Connection Managers" tab, find the connection manager that corresponds to the flat file output (a strong naming standard helps). Here I have selected SO_61794511.dtsx and the Name is Flat File Conn..., which then populates the properties on the right-hand side.
Of interest here is ConnectionString. I am going to directly edit this to change from C:\ssisdata\input\so_61794511.txt to my new path D:\path\here\something\so_newthing.txt
Click OK twice, and the next time the package runs it will use the configured value.
That's the easiest approach. You could accomplish a similar thing by editing the job that runs the package to set the value at every execution, but configuring it here does it at a global scale.
Where this can go off the rails is if there's an expression applied to the ConnectionString property, e.g. the output file has a dynamic date in the file name. This is why I advocate for exposing Package or Project level parameters for a "base file path" concept. That allows me to change the path from C: (local development) to D: (server deployment), or even to a UNC path \\server\share, by setting a configuration instead of hard coding a path into the packages themselves.

MS Access: handle ODBC fault when opening database

I have an MS Access frontend which is using a SQL Server database as the backend. To connect, I use an ODBC connection and have created the needed entries in "ODBC Data Sources (32 Bit)".
When I give the database to others, they will need to create this data source. I have a batch file for this so they can just run it.
If they do not run it, they will get a fault like "ODBC connection to XY failed". How could I change this error, or at least show a second message box afterwards telling them "run the batch file XY to connect"?
When I give the database to others, they will need to create this data source.
That is your first bad mistake and assumption. You MOST certainly do NOT want to deploy your application that way.
The way you deploy?
You take your accDB file and link the tables to SQL server. And you link using what are called DSN-less connections. Such connections do NOT require ANY data source to be set up on each workstation.
So, ok, now you have linked the tables to SQL server (the production one - you probably were developing locally on your developer PC, using a local copy of SQL Server Express edition).
So now you link the tables to THEIR server, and then you compile the accDB down to the compiled executable version of Access - an accDE.
You are now free to deploy this "application" to any and all workstations for that company - and they do not have to re-link tables, do not have to set up a data source, and in fact they don't have to do anything but simply run/launch the application.
How to make and get a DSN-less connection?
Well, the simplest way is to ALWAYS, but ALWAYS ALWAYS, create the linked tables using a FILE dsn. In fact, when you launch the ODBC connection manager from Access, the default is to use a FILE dsn. Never use a "system" or "user" dsn.
If you link the tables using that FILE dsn, then Access converts them to DSN-less connections. At that point, you can even delete the DSN you created - Access 100% ignores the DSN, and you don't need it anymore.
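If you prefer to create such a link in code rather than through the FILE dsn route, a rough sketch like this (the server, database and table names are placeholders, not your actual names) produces the same DSN-less result:

' Creates one DSN-less linked table - all names below are examples only.
Public Sub LinkOneTable()
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Set db = CurrentDb()
    Set tdf = db.CreateTableDef("tblCustomers")
    tdf.Connect = "ODBC;DRIVER={SQL Server};SERVER=MyServer;DATABASE=MyDb;Trusted_Connection=Yes"
    tdf.SourceTableName = "dbo.tblCustomers"
    db.TableDefs.Append tdf
    db.TableDefs.Refresh
End Sub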
Next up:
If you have been using the nc 17 or nc 11-18 drivers, then yes, each workstation MUST have that same driver installed. Or, you can use the older "legacy" SQL driver. But be careful - the legacy driver does not support datetime2 column data types.
So, MAKE double, triple, quadruple sure that you are not using data types and columns that require the newer drivers.
Right now, for some of the larger sites, we STILL use the older "legacy" driver, since that driver is installed at the OS level and has shipped with every copy of Windows going back to Windows XP - in fact Windows 98 SE started shipping that driver. So you can 100% assume that the legacy SQL driver is, and will be, installed on each workstation.
And by using + adopting DSN-less connections, no setup of the connection on each workstation is required. As long as those workstations are on the same network as the SQL server YOU linked the tables to, they are good to go.
Now, on some sites, we actually can't even be on site, and we can't even pre-link to their SQL server. So, what we do is, on application startup, we check the current link of a linked table, and if it does not match the connection named in a little external text file we ALSO include when setting up the workstation, then we use VBA code to re-link the tables on startup. But once again, re-linking tables in VBA is easy, and again does NOT require ANY KIND of "dsn" or ODBC setup on each workstation.
And in fact, another way we used to do this is to have a table in the front end with one record, and that record holds the connection string. So, right before deployment, we just edit that one-record table in the front end to have the correct connection to their SQL server. And once again, on startup, we check a linked table to see if the connection strings match; if they don't, then we run the VBA re-link code. Once again, zero configuration, and zero need to set up anything at all on each workstation.
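A rough sketch of that start-up check (the names tblConnect, ConStr and tblCustomers are made up for illustration, and ReLinkTables stands in for whatever re-link routine you use):

' Start-up check sketch - all table/field names below are examples only.
Public Sub CheckLinksOnStartup()
    Dim strWant As String
    Dim strHave As String
    strWant = Nz(DLookup("ConStr", "tblConnect"), "")        ' connection we should be using
    strHave = CurrentDb.TableDefs("tblCustomers").Connect    ' connection we currently have
    If strWant <> "" And strWant <> strHave Then
        ReLinkTables strWant   ' your re-link routine (loop TableDefs + RefreshLink)
    End If
End Sub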
So, as a general rule, every dog, frog and insect that deployed Access applications? We set up some re-link code and check that link on startup. In fact most developers have done this even without SQL server - even when using an Access back end, re-link code is included to resolve this issue.
But, be it linking to an Access back end or a SQL server back end? Some kind of link check and system is assumed to have been cooked up by you, and this code will run on startup to check whether the linked tables are pointing to the correct location.
And be the back end Oracle, SQL server or whatever? You can create what are called DSN-less linked tables. As noted, you can use VBA to do this, or in fact you can use the Linked Table Manager - as LONG AS you use a FILE dsn when you link, Access converts that to DSN-less for you, and you will be good to go.
So, in effect, you don't have to test/check for an ODBC fail, since you checked for the correct connection string on startup.
However, there is a way to trap and check for an ODBC failure, and it involves using a DIFFERENT way to connect. We all know that once an ODBC error against a linked table triggers, you are duck soup - you have to exit Access, and there is NO KNOWN way around this issue. The only option is to test whether you can connect WITHOUT using a linked table, since, as noted, once an ODBC connect error triggers it is game over.
The way you do this "alternate" test is like this:
Function TestLogin(strCon As String) As Boolean

    On Error GoTo TestError

    Dim dbs As DAO.Database
    Dim qdf As DAO.QueryDef

    Set dbs = CurrentDb()
    Set qdf = dbs.CreateQueryDef("")

    ' Pass-through query - no linked table is touched here.
    qdf.Connect = strCon
    qdf.ReturnsRecords = False

    ' Any VALID SQL statement that runs on the server will work below.
    ' (If you test against built-in system tables instead, the user
    '  must have enough rights to query them.)
    qdf.SQL = "SELECT 1"

    qdf.Execute

    TestLogin = True
    Exit Function

TestError:
    TestLogin = False
    Exit Function

End Function
So, the above does not hit or use a linked table. And a HUGE bonus of the above? Once you execute that valid logon, any and all linked tables will work - AND WILL WORK without even having included the user/password in the connection string for the linked table. This is a huge bonus in terms of security, since now you don't have to include the user/password in the linked tables, which would of course expose the user/password in plain text to any user who cared to look and find the SQL user/password used.
In fact, what this means is that you can link your tables but NOT include the user/password when you link them!!! A HUGE security hole is plugged and fixed when you do this.
So, once a valid logon has occurred (such as above), any and all linked tables will now work, and work without even having included the user/passwords in those linked-table connection strings.
As noted, the other big bonus is that you can use the above code to test for a valid connection and avoid that dreaded "odbc" error, since, as noted, if an ODBC connection error is EVER triggered at any point in time, you MUST exit the application - there is no other way out.
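So on start-up you might wire it up something like this (a sketch only - the connection string is an example, and the message box text is just one possibility, e.g. telling the user to run your batch file):

' Start-up sketch - the connection string below is an example only.
Public Sub CheckServerOnStartup()
    Dim strCon As String
    strCon = "ODBC;DRIVER={SQL Server};SERVER=MyServer;DATABASE=MyDb;Trusted_Connection=Yes"
    If TestLogin(strCon) = False Then
        MsgBox "Could not connect to the server." & vbCrLf & _
               "Please run the supplied batch file XY and then try again.", vbExclamation
        Application.Quit
    Else
        ' Connection is good - linked tables will now work; open your startup form here.
    End If
End Sub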
However, it should be noted that if you are ever going to use a Wi-Fi connection, or say a cloud-based SQL server running on Azure?
In that case, with Wi-Fi or a cloud-based edition of SQL server, such connections over the internet are of course prone to minor and frequent disconnects.
ODBC was developed long before the internet, and long before people would do things like connect Access to some cloud-based SQL server over an internet connection. But if this turns out to be your use case and deployment case?
Then you have to bite the bullet and ENSURE that you adopt the nc 11-18 drivers (I would go with nc17). These newer drivers are "internet" aware, and they are able to gracefully handle minor disconnects and in fact automatically recover and re-connect.
So, if you are ever going to use Wi-Fi, or connect to a cloud-based server? Then yes, you have to link the tables using, say, the newer nc17 driver, and you MUST THEN ensure that the same driver version you linked the tables with is installed on each workstation. You still don't have to set up any dsn connection and all that jazz - but you do have to ensure that the driver you used is ALSO installed on those workstations.
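For reference, when you link with the newer driver, the resulting DSN-less connect string looks roughly like this (a sketch only - the server and database names are placeholders you substitute):

' Example only - adjust the driver name, server and database to your setup.
Public Const gconSqlCon As String = _
    "ODBC;DRIVER={ODBC Driver 17 for SQL Server};SERVER=MyServer;DATABASE=MyDb;Trusted_Connection=Yes"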
As noted, for larger deployments we thus use the standard "legacy" SQL driver, as it would be too painful to go to all the workstations and install a newer driver.
However, we had one location that for months was experiencing ODBC connection failures. We had them replace a router, and even a network card on a server - but the problem still remained.
We suspect that some workstations had aggressive power management turned on; newer Windows 10-11 will often put the network card to sleep, and thus when using Access we were seeing ODBC errors. So, for that company, we had them install nc17 and linked Access to SQL server using that driver, and the problem went away (because those newer drivers have a built-in re-connect ability - a relatively new feature of ODBC, and one that legacy systems and drivers don't have).

SSRS Create development environment from Live server

I've inherited a live SSRS server and have been asked to amend a lot of reports that are on there.
Is there a quick way I can "export" all of the reports/data sources to a local instance so I can develop against it using BIDS?
e.g. Can I copy the ReportServer database from Production?
What else would I need to do?
I'd like to be able to have a Development copy of everything, with DataSources pointing to copies of the production databases but with the same names. Therefore I could re-write the report and re-define any SP's required locally, and then just deploy the new RDL to the server along with the ALTER SP scripts.
Is that possible or even sensible!?
Personally, with the volume you mentioned in the comments (30 RDL's and 3 databases) I wouldn't recommend some automated cloning of the entire Reporting setup from production to local. Instead, I'd suggest the following:
Reports
Go to the web front-end for your reportserver (typically http://yourserver/reports). Find each report, open it, and on the Properties tab click the Edit button. This button does not do what you might expect (edit the report inside the browser), but instead offers you a download of the RDL file. Save all the RDL files in one folder on disk.
With 30 reports, manually downloading them may take you maybe an hour, max. This will probably beat most automated approaches. And since you should only need to do this step once...
Databases
It's not entirely clear from the question, but if you only have production databases and no DTAP setup yet, now may be a good time to start with that. You could host clones of the 3 production databases on a test server, or possibly on your dev environment. Note that the schema is important here (it should be the same as production); the data doesn't have to be entirely up to date.
Alternatively you can skip this bit and develop your reports against the production databases, assuming you can create connections from your dev machine to the production databases. Up to you.
Visual Studio / BIDS
This bit has a few parts to it:
Create a new reports project and solution in Visual Studio.
Add the existing RDL files you've downloaded earlier.
Depending on how the reports were set up, you may need to add shared data sources in your project, to get your reports up and running.
After all this, you should be able to preview your reports from Visual Studio (either with data coming from the "cloned" databases, or directly from production).
At this point you should also be able to safely make changes and preview/test them before deploying them.
Be sure to add the solution, reports, etc. to your version control system of choice.
Deployment
Once you've made changes you want to deploy to the reportserver, you have two basic options:
Deploy them using BIDS (see also the deployment properties MSDN page)
Go back to the web front-end, find the report, open the Properties tab again, click the Update button. This allows you to re-upload the RDL file with the changes you've made.
From now on you can just rinse and repeat on making updates and deploying the reports. No need for cloning/exporting the entire SSRS instance to keep things in sync.

Embed MS Access Form To Website Online

I have a website online with just HTML, and I am not willing to use any other programming language apart from Javascript. All I need to do is connect my Microsoft Access database on my computer to a form hosted online, so that when information is submitted online it is updated in MS Access the next time I open the file up. Is this possible, and how can it be done?
Turns out, you can do this with zero code if you use Office 365 and publish an Access web form.
Any information entered into the Access web form will automatically appear in your local database. The synchronizing of data from the web site and the pulling down of the records to a local copy happens automatically and without the need to write any code. In fact the sync starts automatically when you launch the client application (it runs in disconnected mode). And any records you enter in the client application will also sync up and appear on the web site.
So, you can use Access and write zero code, and this two-way sync feature is built in.
You need Access 2010, and either SharePoint 2010 (Enterprise), or you can use Office 365 and the $6 per month P1 plan, which also supports Access web publishing.
However, I suspect issues of user logons and security may well be a greater issue here, and thus Office 365 might not be correct from a user logon point of view. You can invite up to 50 users to that site for the basic $6 per month, but all users of the site will require a logon (which they get by being invited to the site).
There are two videos of mine here showing this setup in action here:
http://www.youtube.com/playlist?list=PL27E956A1537FE1C5&feature=plcp
I think what you are trying to do is very impractical. You'll need to use server-side Javascript to insert your data into a database, preferably SQL Server, and then you'll have to write some kind of code to sync the SQL Server database to your Access database.
Alternatively, you could set up your Access database so it connects to the same instance of SQL Server as your website, using ODBC linked tables or ADO. I cannot really recommend this, especially if the data you have in your Access database is anything you wouldn't want to be public. Also, using MS Access to access a database across the WAN/Internet is really not recommended, although it can certainly be done as long as you aren't working with large amounts of data, large quantities of records, etc.
I am not willing to use any other programming language apart from Javascript.
And why aren't you willing to use something else? I don't think you're going to get anywhere if you don't open your mind to using the right tools for the right job.
Here's something that might help you get connected to SQL from Javascript:
How to connect to SQL Server database from JavaScript in the browser?

How to access SQL Server Publishing Wizard 1.4

I've had a big problem replicating a simple SQL Server 2008 R2 Express database for use on a development server. I thought I had it sorted, but it turns out that each table has lost its 'Identity' value somewhere along the line, and it's not possible to add those back in now. This is pretty much useless. So I'm back at square 1: having to get a copy of a MSSQL database plus data from one web server to another web server.
I've read that SQL Server Publishing Wizard does this, and maintains crucial things like identity settings etc. Trouble is, I'm working with SQL Server 2008 R2 Express and I can't actually seem to find a way to access that program anywhere - even though when I go to 'control panel > remove programs' it's in there. When I try to find it on my system (e.g. via start > find programs / files) it's nowhere.
Does anyone know how to access this program, and will it do what I need?
Thanks!
Sure thing, thanks Michael. So the solution was to connect to the database through VWD 2010 Express, which has the options required to do this. There are actually some really great third party tools which do database migrations from one system to another detailed here: http://erikej.blogspot.co.uk/2009/04/sql-compact-3rd-party-tools.html. The ones on this page are geared specifically at SQLCE migrations, but several of the tools also support other full SQL versions too.