Importing a SQL script using sqlcmd throws "insufficient system memory" error - sqlcmd

I am trying to import a 50 MB SQL file using sqlcmd, but it throws the following error:
There is insufficient system memory in resource pool 'default' to run this query.
I am running SQL Server 2019 Developer on an EC2 instance with 4 GB of memory. SQL Server is set to use the default maximum server memory (2147483647 MB). I also tried starting the SQL Server service in single-user mode with "NET START MSSQLSERVER /f /mSQLCMD", but still no luck. Can anyone suggest what could be wrong?

The problem was fixed by adding a GO statement after every thousand rows, as suggested by Remus Rusanu here: https://stackoverflow.com/a/1995728.
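For illustration, here is a minimal sketch of what the batched script looks like (the table and column names are made up); the GO separator makes sqlcmd send each chunk to the server as its own batch instead of one enormous statement:
INSERT INTO dbo.MyTable (Id, Val) VALUES (1, 'a');
INSERT INTO dbo.MyTable (Id, Val) VALUES (2, 'b');
-- ... roughly a thousand more INSERTs ...
GO
INSERT INTO dbo.MyTable (Id, Val) VALUES (1001, 'c');
-- ... next batch of roughly a thousand INSERTs ...
GO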

Related

Import Data into Cloud SQL Using Cloud SQL Client - Connection Ended

I am trying to run a SQL script from MySQL Workbench and have it import data into Cloud SQL using the Cloud SQL Client proxy. This SQL script is about 10 GB. I have tried this a few times; it takes too long, and Workbench says the certification of the connection will expire soon. Soon after that, the connection ends, and I see an error in Workbench saying it lost the connection to the MySQL server.
Are there any suggestions on what I should do, or do differently?
I do not want to use Cloud SQL buckets to import the data, and hence am going the route of the Cloud SQL Client.
Your connection ended error can be due to many different reasons. One is that your script is too large and is causing the connection to time out. I would suggest splitting your script and importing the parts separately.
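As a rough sketch of that split-and-import idea (the file names below are hypothetical), each part can be loaded in its own, shorter session through the mysql client connected via the Cloud SQL proxy:
-- Hypothetical chunks of the original 10 GB dump, imported one at a time
SOURCE /imports/dump_part_01.sql;
SOURCE /imports/dump_part_02.sql;
SOURCE /imports/dump_part_03.sql;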
It is possible that the “certification of the connection will expire soon” error can be resolved by rotating your server certificates. You can find how to do that here: https://cloud.google.com/sql/docs/mysql/manage-ssl-instance#manage-server-certs

SQL Server 32-bit Easysoft IB6 ODBC SSIS package failing with validation errors 0x80004005 and 0xC0014009 when run by a SQL Server Agent job

I have an SSIS package on SQL Server 2012.
It uses an Easysoft ODBC connection to access an Optima attendance controller. In my project,
Run64BitRuntime is set to false. My SQL Agent job is also set for the 32-bit runtime.
I use an SSIS proxy account, and it is an admin account.
I have another package running on the same server using a different ODBC provider, timebersoft, and it runs without issue.
If you need more details, please let me know what you need to know.
The following errors only occur when the package is executed by a SQL Server Agent job.
Data Flow Task: Error: ODBC source failed validation and returned error code 0x80004005.
attnd: Error: There was an error trying to establish an Open Database Connectivity (ODBC) connection with the database server.
Data Flow Task: Error: The AcquireConnection method call to the connection manager Attnd failed with error code 0xC0014009.
I have attempted to create a hello-world version, and every time the only thing that causes it to crash is when I try to connect to the Easysoft ODBC source. The driver is installed (Name {Easysoft IB6 ODBC}; Version {1.00.01.70}; Company {Easysoft Limited}; File {IB6ODBC.DLL}; Date {6/5/2002}), and the user is the same administrator for both the successful and unsuccessful attempts.
Update, July 21:
1. Tried the installation; that wasn't it.
2. Tried an Operating system (CmdExec) job step to run the 32-bit dtexec.exe; it failed again. (Double-checked that the command I used does work from the command line.)
Here are a couple of things you must check (going from front end to back end). See what you are missing.
Run64BitRuntime is set to false in the project setting.
The same configuration setting (having Run64BitRuntime set to false) should be used for deployment.
If you are using an external configuration file, and that has this property, see that it is set to false.
SQL Server > Database Engine > SQL Server Agent > Jobs > your job's properties > select the step where you run the SSIS package and click Edit > in the 'Execution Options' tab, check 'Use 32 bit runtime' (a T-SQL sketch of the catalog-deployment equivalent follows this answer)
Hope this helps!
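If the package happens to be deployed to the SSISDB catalog, a rough T-SQL sketch of forcing the 32-bit runtime looks like the following (the folder, project and package names are placeholders, not taken from the question):
DECLARE @execution_id BIGINT;
EXEC SSISDB.catalog.create_execution
    @folder_name = N'MyFolder',         -- placeholder folder
    @project_name = N'MyProject',       -- placeholder project
    @package_name = N'MyPackage.dtsx',  -- placeholder package
    @reference_id = NULL,
    @use32bitruntime = 1,               -- run with the 32-bit SSIS runtime
    @execution_id = @execution_id OUTPUT;
EXEC SSISDB.catalog.start_execution @execution_id;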
"You can use our Interbase ODBC driver from SQL Server Agent or any ODBC enabled application. You will however need to install a 64-bit Interbase client that is compatable with your Interbase server before you install the Easysoft ODBC driver. This can be obtained from Embarcadero"
The issue appears to be with the software. In order to use Easysoft on a 64-bit system, I need the 64-bit driver properly installed.
Thank you @billinkc for pointing out the installation issue.

Error code 0x80004005 when executing package as a scheduled job

The error message is:
POSTGRES dm_genders_d failed validation and returned error code
0x80004005.
I've seen several references to this almost assuredly being a permissions issue, which sounds right to me, but I have been completely unable to identify the relevant permissions.
The Postgres connection is using ODBC. The package is moving data from PostgreSQL to SQL Server. Currently both 32-bit and 64-bit drivers exist, but I haven't seen how to choose between them.
I have made all of the recommended changes to 32 bit for the job.
We are using Windows Authentication.
I've set up a proxy to execute the job as my user.
None of this has alleviated this error.
UPDATE: Yes, the 32-bit data source has been defined, and it is being used.
I had this error, and I could solve it by adding the ODBC connection on the "System DSN" tab instead of the "User DSN" tab.
Start > ODBC Data Sources
I also ran the package with the 32-bit runtime.
For this: right-click on your job in SQL Server Agent > Properties > Steps > Edit.
In the window that appears (Job Step Properties), you can set the 32-bit runtime.
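If you want to double-check what SQL Agent will actually run, the SSIS job step's stored command can be inspected from msdb (standard system tables, nothing specific to this package); as far as I know, checking 'Use 32 bit runtime' adds an /X86 switch to that command:
SELECT j.name AS job_name, s.step_name, s.command
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS s ON s.job_id = j.job_id
WHERE s.subsystem = N'SSIS';  -- an /X86 switch in command indicates the 32-bit runtime option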

Memory Leak using msxml3.dll

Currently on Windows Server 2008 R2 Standard with 32 GB of memory.
Once the server hits memory usage of around 50% (18 GB, 13 GB of which is for SQL Server (2008)), some strange things start happening with the XMLHTTP requests. I have tried using Microsoft.XMLHTTP and MSXML2.ServerXMLHTTP, but I get the same result.
I am getting two different errors, both associated with the msxml3.dll file:
Error #: -2146697211
The system cannot locate the resource specified.
and
Error #: -2147024888
Not enough storage is available to process this command.
After I restart the server, everything seems to work fine, for now at least; we'll see when the memory usage starts increasing again.
I have searched a while for a solution and found that nothing seems to work except restarting the server. I haven't tried just restarting IIS, but I am wondering why this is happening all of a sudden.
If you mean to reserve only 13 GB of memory for SQL Server, you need to tell it so. Right now you've told SQL Server to take over the server if it wants.
EXEC sp_configure 'show advanced options', 1;  -- expose advanced options such as max server memory
RECONFIGURE WITH OVERRIDE;
GO
EXEC sp_configure 'max server memory (MB)', 13000;  -- cap SQL Server's memory at roughly 13 GB
RECONFIGURE WITH OVERRIDE;
GO
Now restart SQL Server. This won't guarantee that SQL Server won't use more than 13 GB, since this setting only controls certain aspects of its memory usage, but it will prevent SQL Server from taking over the box.
This is the setting I assumed you had already set, based on what you said:
18GB, 13GB of it is for SQL Server (2008)
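To confirm the running value afterwards, a quick check against sys.configurations (a standard catalog view) should show 13000 in value_in_use:
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = N'max server memory (MB)';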

MySQL server has gone away when executing large script files

Please help me get rid of these errors.
When I run a large number of MySQL scripts at once, I get errors like these:
ERROR 2006: MySQL server has gone away
Error 1153: Got a packet bigger than 'max_allowed_packet' bytes
How do I get rid of these errors? I'm using Navicat for MySQL and a WAMP server.
When running the scripts I chose 'continue on error', so the scripts keep running, with the error messages and values in the message log. Will this work, and can I re-run the failed statements later?
Try using these commands from the MySQL terminal:
set global max_allowed_packet=1000000000;  -- allow packets up to roughly 1 GB, well above the default
set global net_buffer_length=1000000;      -- start each connection buffer at roughly 1 MB
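Note that SET GLOBAL only lasts until the MySQL server restarts, so this is a workaround rather than a permanent change. To confirm the new values took effect, you can run:
SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet';
SHOW GLOBAL VARIABLES LIKE 'net_buffer_length';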
The answer from ayush solves it, but if you un-check the option "Run multiple queries in each execution" in the dialog box when executing the SQL file, it might do the trick as well (though the execution will take longer).