I'm trying to create a folder for my SSIS packages like I've done on several servers up to now.
But instead of the folder being created, I receive the following error: SSIS folder 'XXX' already exists in the specified parent folder. (Microsoft SQL Server Native Client 10.0)
I've tried using several different names but have received the same error each time.
This is a new instance of SSIS that I set up today, so I'm very sure no other folders exist.
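For what it's worth, one way to double-check that claim is to query the table that backs the msdb package store. A minimal sketch, assuming SQL Server 2008, where the folder metadata lives in dbo.sysssispackagefolders (older versions use dbo.sysdtspackagefolders90 instead):

-- List every SSIS folder registered in the msdb package store.
SELECT folderid, parentfolderid, foldername
FROM msdb.dbo.sysssispackagefolders;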
Found the culprit!
At first I tried the solution suggested by @billinkc in the comments; it worked (I managed to create a folder), but the folder wasn't usable, as I couldn't upload packages to it.
I have db_owner permission on the MSDB database and so I can see and access the SSIS MSDB folder.
What I really needed was membership in the db_ssisadmin role. Not to worry, as db_owner permissions allow me to grant myself db_ssisadmin membership!
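For anyone else in the same spot, the grant itself is a one-liner. A minimal sketch, assuming your msdb database user is named YourUserName (a placeholder):

USE msdb;
-- Add your own msdb database user to the db_ssisadmin role.
EXEC sp_addrolemember N'db_ssisadmin', N'YourUserName';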
After this was done, all the problems (the one in the original Q and more) disappeared.
I have to import a 180 MB table on a shared host (which means trouble, I know) and I'm getting the max_questions error. The limit is 75,000 and they are not willing to raise it even for an hour. The number of queries is about 2,000,000.
I have been using BigDump to upload the gz file, but the limit is killing me.
Is there a way to split the table and upload it bit by bit or merge it somehow? I have been searching online, but solutions are mostly for people who can access the database settings.
I am assuming that you have phpMyAdmin installed in your website directory; otherwise, download the latest phpMyAdmin version from the official website and unzip it in your website directory.
If that's the case, then perform the steps below.
Access your website directory via SFTP or FTP.
If you cannot access your website directory via SFTP or FTP, then you can set up cron jobs for fetching files.
Find the config.inc.php file located in the phpmyadmin directory. In my case it is located here:
\public_html\phpmyadmin3.2.0.1\config.inc.php
Find the line with $cfg['UploadDir'] on it and update it to:
$cfg['UploadDir'] = 'upload';
Create a directory called 'upload' within the phpmyadmin directory.
\public_html\phpmyadmin3.2.0.1\upload
Then place the large sql file that you are trying to import into the new upload directory.
Now when you go to the database import page within the phpMyAdmin console, you will notice a drop-down that wasn't there before; it contains all of the SQL files in the upload directory that you have just created.
You can now select your file and begin the import.
Reference: http://daipratt.co.uk/importing-large-files-into-mysql-with-phpmyadmin/comment-page-4/
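As an aside, you can usually confirm the hourly cap yourself, even on a shared host, because MySQL reports per-user resource limits alongside the grants. A sketch of what to look for:

SHOW GRANTS FOR CURRENT_USER;
-- Look for a clause such as: ... WITH MAX_QUERIES_PER_HOUR 75000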
I'd like to install AdventureWorks2008 (I just installed SQL Server 2008 R2 Express).
Each time I download the recommended version from CodePlex, all I get is an AdventureWorks2008.mdf file. Not only can I not attach the file from SQL Server Management Studio, but I also cannot copy/paste the file directly into the database.
I've read in several places that I need to use AdventureWorks2008.msi, but I cannot find where to download it.
I just cannot figure out how to install AdventureWorks2008.
Thanks for helping
There isn't an .msi file for AdventureWorks, even though you'll find it mentioned in outdated documentation and books. You aren't alone in finding this confusing; it seems the web site, files, and steps Microsoft provides for installing these databases change every time I need to install them.
You need to create the database and attach the .mdf file, which is the "data file" referred to in the instructions. (.mdf = primary data file, .ldf = log file, .ndf = secondary data file)
In order to attach the file, you need to make sure you carefully follow the steps here: http://social.technet.microsoft.com/wiki/contents/articles/3735.sql-server-samples-readme-en-us.aspx#Readme_for_Adventure_Works_Sample_Databases
Instructions for 2008R2:
To install AdventureWorks2008R2 OLTP database
Download the AdventureWorks2008R2 Data File.
From File Download, click Save and browse to a location on your local server.
From SQL Server Management Studio, execute the following code:
Case-insensitive Database
CREATE DATABASE AdventureWorks2008R2
ON (FILENAME = '{drive}:\{file path}\AdventureWorks2008R2_Data.mdf')
FOR ATTACH_REBUILD_LOG;
As an alternative to step 3, you can attach the database using the SQL
Server Management Studio user interface. For more detailed
information, see Attach a Database (SQL Server Management Studio).
Note: You must remove the log file from the list of files to attach.
This will cause the operation to rebuild the log.
Headache-saving tip from Aaron Bertrand:
You should place the mdf file in your normal data folder - SQL Server
will already have the proper permissions. You can get this path using
SELECT TOP (1) physical_name FROM master.sys.database_files;
You can directly paste that file into your database directory. For more information, you can refer to http://tryingmicrosoft.com/error-while-attaching-a-database-to-sql-server-2008-r2/.
I have written a module that is refusing point-blank to create the tables within my mysql4-install-1.0.0.php file... but only on the live server.
The funny thing is that on my local machine (which is a mirror of the live server, i.e. identical file structure etc.) the install runs correctly and the table is created.
So based on the fact that the files are the same can I assume that it is a server configuration and or permissions problem? I have looked everywhere and I can find no problems in any of the log files (PHP, MySQL, Apache, Magento).
I can create tables ok in test scripts (using core_read/write).
Has anyone seen this before?
Thanks
** EDIT ** One main difference between the 2 environments is that on the live server the MySQL is remote (not localhost). The dev server is localhost. Could that cause issues?
Is the module which your install script is a part of installed on the live server? (XML file in app/etc/modules/, Module List Module for debugging.)
Is there already a record in the core_resource table for your module? If so, remove it to set your script to re-run.
Is your file named correctly? The _modifyResourceDb method in app/code/core/Mage/Core/Model/Resource/Setup.php is where this file is included/run from. Read more here
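On the second point, if you'd rather check and reset the core_resource record in SQL directly, something like this should do it ('yourmodule_setup' is a placeholder for whatever your module's setup resource is named in config.xml):

-- See whether Magento thinks the install script already ran.
SELECT * FROM core_resource WHERE code = 'yourmodule_setup';
-- If a row exists, delete it so the install script runs again.
DELETE FROM core_resource WHERE code = 'yourmodule_setup';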
Probably a permissions issue - a MySQL account used by public-facing code should have as few permissions as possible that still let it get the job done, which generally does NOT allow for creating/altering/dropping tables.
Take whatever username you're connecting to mysql with, and do:
SELECT User, Host
FROM mysql.user
WHERE User='your username here';
This will show you the user@host combos available for that particular username; then you can get the actual permissions with
SHOW GRANTS FOR 'username'@'host';
Do this for the two accounts on the live and development servers, which will show you what permissions are missing from the live system.
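If the comparison shows missing privileges, granting them from an admin account is straightforward. A sketch, with the database, user, and host names as placeholders; you may want to grant these only for the duration of the install and revoke them afterwards:

-- Let the application account run the DDL that the install script needs.
GRANT CREATE, ALTER, DROP, INDEX ON yourdb.* TO 'appuser'@'apphost';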
In the Admin->System->Advanced section is your module present and enabled?
Did you actually unpack your module to the right place, e.g. app/code/local/yourcompany/yourmodule?
Do you have app/etc/modules/yourmodule.xml - I believe that this could be the overlooked file giving rise to your problem.
The cache could be the culprit: if you manually deleted the core_resource row for your module in order to make the setup SQL run again, you have to also flush the cache.
Probably a difference between the dev and production servers is cache settings; that would explain why you only see this in production.
For me, the issue appeared using Windows for development. Linux systems are case-sensitive. In my config.xml the setup section was named camelCase while the folder was named all-lowercase. Making them the same made the script run.
An SSIS package loops through input files. For each file, a flat-file parse adds records to a DB table, then the file is renamed/moved for archiving. After all files are processed, the package calls a sproc to delete all year-old records.
The package runs OK from Visual Studio. Put in the SSIS package store and run from there, no problem.
Created a SQL Agent job to run the package. The job does something for about five minutes, announces it was successful, but there are no new records in the DB and no renaming of input files.
The package uses a dedicated login for SQL Server privileges. The job runs as HOSTNAME-SVC, which has read/write privileges on the input directory and the archive directory.
Have you set up logging for the package? You could add a script task to the For-Each Loop Container that runs a Dts.Events.FireInformation command during each loop. This could help you track the file name it finds, the number of loops it does, how long each loop takes, etc. You could also add a logging step at the end so that you know it is at least exiting the For-Each Loop container successfully.
If you find that the package is running successfully but not looping through any files at all, then you may want to test using a simpler package that reads one file only and loads it into a staging table. If that works, then go to the next step of looping over all the files in the directory and only importing the one file over and over again. If that works, then go to the next step of changing the file connection to match the file that it finds in the For-Each Loop Container file enumerator task.
If the package isn't looping over any files and you can't get it to see even the one file you tested loading from the job, then try creating a proxy account with your credentials and running the job as the proxy account. If that works, then you probably have a permissions issue with your service account.
If the package doesn't import anything even with the proxy account, then you may want to log into the server as the service account and try to run the SSIS package in BIDS. If that works, then you may want to deploy it to the server and run the package from the server (which will really use your machine, but at least it uses the ssis definition from the server). If this works, then try running the package from the agent.
I'm not sure I fully understand. The package has already been thoroughly tested under several Windows accounts, and it does find all the files and rename all the files.
Under the Agent, it does absolutely nothing visible, but takes five minutes to do it. NO permissions errors or any other errors. I didn't mention that an earlier attempt DID get permissions errors because we had failed to give the service account access to the input and output directories.
I cannot log in as the service account to try that because I do not have a password for it. But sa is the job owner, so it should be able to switch to the service account, and the access errors we got ten days ago show that it can. The package itself has not changed in those ten days. We just deleted the job in order to do a complete "dress rehearsal" of the deployment procedure.
So what has changed, I presume, is some detail in the deployment procedure, which unfortunately was not in source control at the time it succeeded.
It seems to be something different about the permissions. We made the problem go away by allowing "everyone" to read the directory on the production server. For some unknown reason, we did not have to do that on the test server.
When the job tried to fetch the file list, instead of getting an error (which would be logged) it got an empty list. Why looping through an empty list took five minutes is still a mystery, as is the lack of permissions. But at least what happened has been identified.
I had a similar problem. Was able to figure out what was happening by setting the logging option of the SQL Server Agent Job.
Edit the step in the job that runs the package and go to the logging tab. I picked "SSIS log provider for SQL Server" and, in the configuration string, chose (using the drop-down) the OLE DB connection that was in the package; it happens to connect to the SQL Server in question.
I was then able to view more details in the history of that job, and confirmed that it was not finding files. By changing permissions on the directory to match the SQL Server Agent account, the package finally executed properly.
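Once that log provider points at a SQL Server connection, the entries land in a table in that database (dbo.sysssislog on SQL Server 2008, dbo.sysdtslog90 on 2005), so you can also read them directly; a sketch assuming the 2008 table name:

-- Most recent SSIS log entries, newest first.
SELECT TOP (50) event, source, starttime, message
FROM dbo.sysssislog
ORDER BY starttime DESC;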
Hope this helps.
You may want to turn logging off after you resolve your issue, depending on how often your package will run and how much information logging provides in your case.
Regards,
Bertin
I am very new to SSIS, looking at a package already created by someone else and deployed on SQL server. There is a File System Task that moves files to a network share. I need to change the path of the destination folder. The destination folder requires a domain login.
I can change the folder path in the global variable. Do I have to redeploy the package after making this change? Can the change be made directly on the SQL server?
How do I change the user name and password for this network share? Where is this information saved? I don't see it in any of the variables.
There is a SQL Job on the server with the same name, how do I check if this Job is related to the SSIS package?
You can change the folder path in the variable. You will need to redeploy after doing this unless the variable is stored in a configuration file. If that is the case, you can just change the configuration file, and won't need to redeploy.
The user name and password will probably be stored in the Flat File connection. Look at the bottom of the package in the section labelled Connection Manager.
If there is a job of the same name, it was most likely created for the SSIS package of the same name. If you open the job for editing in SSMS, you can look at the job steps and confirm that the dtexec command references the SSIS package.
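If you'd rather check from T-SQL than click through SSMS, you can search the job steps in msdb for the package name (YourPackageName is a placeholder):

-- Find Agent job steps whose command line references the package.
SELECT j.name AS job_name, s.step_name, s.command
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS s ON s.job_id = j.job_id
WHERE s.command LIKE '%YourPackageName%';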