I want to export a JSON result from SQL Server to a .json file.
Example:
SELECT * FROM SYS.all_columns FOR JSON AUTO
I know one method to do this, using the BCP command-line utility.
You are invited to share other ways to export a JSON file from SQL Server.
(An example of converting a table to JSON can be seen here: convert table to Json)
Too much for a comment, but only half an answer...
You know BCP, so nothing to tell you here...
Important to know: SQL Server is very limited in its access to the file system. It does not run as the user executing the script; it runs in the context of the machine where SQL Server itself is installed. So a file destination like c:\temp\SomeFile.json will not end up in that directory on your machine but on the server, and a destination somewhere on a shared drive might fail with access violations.
It might be more flexible to use an external tool (PowerShell or any programming language of your choice) to connect to the database, fetch the result, and store it.
If you need to trigger this from within SQL Server, you can execute such an external program using xp_cmdshell (just as you do with BCP).
I'd suggest creating a VIEW, a UDF, or an SP to return whatever you need in a single call (with some parameters). This lets you control the content from within SQL Server while performing the retrieval and storage externally.
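For example, a minimal sketch of the SP variant (the procedure name is made up; FOR JSON requires SQL Server 2016 or later):

CREATE PROCEDURE dbo.GetAllColumnsJson
AS
BEGIN
    SET NOCOUNT ON;
    -- returns the whole result set as one JSON document in a single column
    SELECT * FROM sys.all_columns FOR JSON AUTO;
END

An external caller (BCP, sqlcmd, PowerShell, ...) then only needs to run EXEC dbo.GetAllColumnsJson and can store the single returned column wherever it likes.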
Hope this helps...
First, enable xp_cmdshell. This server configuration option enables system administrators to control whether the xp_cmdshell extended stored procedure can be executed:
EXEC sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
EXEC sp_configure 'xp_cmdshell', 1;
GO
RECONFIGURE;
GO
Then execute your query with BCP:
EXEC sys.xp_cmdshell 'bcp "SELECT * FROM sys.all_columns FOR JSON AUTO;" queryout C:\Data\JsonTest.json -t, -c -S . -d master -T'
I am trying to import a .csv file with 2 columns (and just 3 rows, just for testing) using the query:
IMPORT FROM "C:\db2\dtest.csv" OF DEL INSERT INTO TEST_DATA.DTEST (CAR, NICKNAME)
I am getting this error:
SQL0104N An unexpected token "IMPORT FROM "C:\db2\dtest.csv"
OF DEL" was found following "BEGIN-OF-STATEMENT". Expected tokens may
include: "". SQLSTATE=42601
If I'm just being stupid here please do tell me :)
IMPORT is not SQL; it is a command. That means you must submit IMPORT from the Db2 command window (CLP), from a script, or from a stored procedure. You cannot submit it directly via SQL unless you use the ADMIN_CMD stored procedure; see the documentation for details and examples. Use ADMIN_CMD if you normally interact with databases via a GUI tool.
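For example, a hedged sketch (note that with ADMIN_CMD the file path is read on the database server, not on your client machine):

CALL SYSPROC.ADMIN_CMD('IMPORT FROM "C:\db2\dtest.csv" OF DEL INSERT INTO TEST_DATA.DTEST (CAR, NICKNAME)');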
Using the command line is best only if you will regularly use batch commands or write scripts in a scripting language, and you are competent at scripting and using command lines. This method has prerequisites, especially if the database is on a different hostname or a different Db2 instance than the one you are working with (i.e. the database is remote). Remote databases need to be catalogued via the db2 catalog tcpip node .... and db2 catalog database ... commands.
Additionally, you must first connect to the database via db2 connect to .... You have to do this regardless of whether the database is local or remote; see the docs for details. For local databases, just use db2 connect to dbname, where dbname is your database name.
Using the ADMIN_CMD stored procedure is often easier for new users who are not familiar with command-line tools.
To use the Db2 command window (CLP), you can either use it interactively or via your operating-system shell. On Windows, use db2cwadmin.bat to open such a window (for batch or command usage), or run db2.exe to enter interactive mode; you can then run your IMPORT command interactively.
This happened to me when I was importing data from a CSV file into a SQL database.
In my case it was caused by an incorrectly formatted CSV file.
When extracting data from a CSV file, look at the preview of the data, which will give you an idea of where the data is wrong.
It may be caused by stray characters such as commas or quotes inside the values.
I want to add logging to some stored procedures in our database to monitor how they work. I am new to SQL Server 2008. The logs should be created on the production server.
I tried the approach from this link:
http://www.codeproject.com/Articles/18469/Creating-Log-file-for-Stored-Procedure
but I get the error message:
The EXECUTE permission was denied on the object 'xp_cmdshell', database 'mssqlsystemresource', schema 'sys'.
Any help would be appreciated.
First of all, are you sure you want to log data to a text file? It may be better to store the log in a separate table, as sketched below.
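A minimal sketch of the table approach (all names are made up):

CREATE TABLE dbo.ProcLog
(
    LogId    INT IDENTITY(1,1) PRIMARY KEY,
    LogTime  DATETIME NOT NULL DEFAULT GETDATE(),
    ProcName SYSNAME  NOT NULL,
    Message  NVARCHAR(4000) NULL
);

-- inside any stored procedure you want to monitor:
INSERT INTO dbo.ProcLog (ProcName, Message)
VALUES (OBJECT_NAME(@@PROCID), N'step 1 completed');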
If you want to work with a text file:
Look at the description of xp_cmdshell:
The Windows process spawned by xp_cmdshell has the same security rights as the SQL Server service account.
Check the security rights for this account.
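On SQL Server 2008 R2 and later you can check the service account from T-SQL (on SQL Server 2008 itself, use SQL Server Configuration Manager instead):

SELECT servicename, service_account
FROM sys.dm_server_services;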
xp_cmdshell can be enabled and disabled by using the Policy-Based Management or by executing sp_configure.
Check that you have it enabled:
EXEC sp_configure 'show advanced options', 1
GO
RECONFIGURE
GO
EXEC sp_configure 'xp_cmdshell', 1
GO
RECONFIGURE
GO
When it is called by a user that is not a member of the sysadmin fixed server role, xp_cmdshell connects to Windows by using the account name and password stored in the credential named ##xp_cmdshell_proxy_account##. If this proxy credential does not exist, xp_cmdshell will fail.
You need to create a proxy account:
EXEC sp_xp_cmdshell_proxy_account 'MyDomain\SQLServerProxy', 'usdcu&34&23';
Add execute permission on this SP for the user that needs it:
USE master;
GRANT EXECUTE ON xp_cmdshell TO [user_name];
Here is more detailed information.
Granting execute permission on the object via the master database should do it:
USE master;
GRANT EXECUTE ON xp_cmdshell TO [user];
Using xp_cmdshell for logging is bad for both security and performance. Please delete that codeproject link from your browser and forget you ever saw it. Seriously, it is badness.
If you want to log calls to procs, either:
Set up a table for this (as demas also suggested; a table sketch appears in the answer above). You can have a DATETIME field defaulting to GETDATE() or GETUTCDATE(), a field for the proc name, a field for parameters, whatever you need.
or
Use SQLCLR to create a stored procedure that does a simple File.Write of the info. You can use impersonation (something xp_cmdshell can't do) so that the security context is that of the person running the proc rather than the Log On account of the SQL Server process. This approach is far more efficient and contained than xp_cmdshell, even without impersonation.
or
Do a combination of the log table + SQLCLR [or something else]: log to the table for immediate writing, then set up a SQL Agent job to archive entries over X days old to a file using SQLCLR or some other means. This way the table doesn't grow too big with info that is probably older than you need anyway for researching problems that are currently happening. A sketch of the archival step follows this list.
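For the third option, a hedged sketch of the cleanup a SQL Agent job step might run (it assumes the dbo.ProcLog table sketched in the answer above, plus a dbo.ProcLogArchive table with the same columns):

-- move entries older than 30 days into the archive table in one statement
DELETE FROM dbo.ProcLog
OUTPUT DELETED.LogId, DELETED.LogTime, DELETED.ProcName, DELETED.Message
  INTO dbo.ProcLogArchive (LogId, LogTime, ProcName, Message)
WHERE LogTime < DATEADD(DAY, -30, GETDATE());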
I have an SSIS package I can import into Integration Services on my server and run with no problems. All it does is copy files from a directory on the network to the server it is running on.
When I execute the SQL Agent Job it says the job ran successfully but no files are copied. I verify there are files in the source location and the destination path exists. I am also using absolute paths (no mapped drives).
Why doesn't it copy any files when I run it as a SQL Agent Job?
FYI - the source directory is actually on a UNIX box and to map a drive to that location you have to enter a user/password combination.
I have a feeling that the SQL Agent Job runs as NT SERVICE\SQLSERVERAGENT, which is not the user that has permission to the UNIX box. Is there a way to run the SQL job as a specific user?
Thanks in advance.
You need to create a Credential, a SQL Agent Proxy, and then assign the proxy account to the SQL Agent job step. Proxy accounts are specific to each subsystem (e.g. PowerShell, CmdExec, SSIS, etc.).
-- creating credential
USE [master]
GO
CREATE CREDENTIAL [Superuser] WITH IDENTITY = N'DOMAIN\account', SECRET = N'mypassword'
GO
-- creating proxy for CmdExec subsystem, adding principal
USE [msdb]
GO
EXEC msdb.dbo.sp_add_proxy @proxy_name=N'My custom proxy', @credential_name=N'Superuser',
@enabled=1
GO
EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name=N'My custom proxy', @subsystem_id=3
GO
EXEC msdb.dbo.sp_grant_login_to_proxy @proxy_name=N'My custom proxy', @fixed_server_role=N'sysadmin'
GO
-- assigning job step to run as a given proxy user
USE [msdb]
GO
EXEC msdb.dbo.sp_update_jobstep @job_id=N'0df2dac2-4754-46cd-b0bf-05ef65e1f87e', @step_id=1, @subsystem=N'CmdExec',
@proxy_name=N'My custom proxy'
GO
I want to export data from a table in SQL Server 2008 to an Excel file on Windows 7 with T-SQL.
Searching the internet, many suggest the following:
Insert into openrowset ('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database=c:\MyExcel.xls;','SELECT * FROM [Sheet1$]')
select * FROM mytab
I tried it too, but it is not working.
I also tried this:
sqlcmd -S myServer -d myDB -E -Q "select * from Tab" -o "MyData.csv" -h-1 -s","
It runs okay with no error, but no file is created. I am also not sure whether this can be run from T-SQL.
Is there a better solution for this case?
Use the export wizard to create an SSIS package: select [First row is column name], an Excel destination, and your source table or query.
In the last step, save the SSIS package as a file.
Use sp_configure to enable xp_cmdshell.
Use a T-SQL script to exec ('dtexec /file: ') with the path of the saved package.
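A hedged sketch of that last step; the package path is hypothetical:

-- run the saved SSIS package from T-SQL; adjust the path to where you saved it
EXEC master.sys.xp_cmdshell 'dtexec /FILE "C:\SSIS\ExportToExcel.dtsx"';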
Bulk Copy Program = bcp
MSDN: http://msdn.microsoft.com/en-us/library/ms162802.aspx
example usage:
bcp AdventureWorks2012.Sales.Currency out "Currency Types.dat" -T -c
This is a command-line utility, so you will need the command-shell functionality if you have to do this from SQL Server. I am curious whether the sqlcmd approach you were using was potentially using bcp under the hood.
Be aware that creating files directly from SQL can run into permission issues for certain destinations, since the SQL Server service account may not have access. Above all, remember that the machine you run the T-SQL from is not the server: a local path is resolved on the server running the script, NOT on the machine of the user connecting over TCP/IP.
I think the default location for file creation is the 'data' directory of the SQL Server installation on the server, something like:
C:\Program Files\Microsoft SQL Server\(version current is: 110)\Data
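On SQL Server 2012 and later you can ask the instance for its default data path directly (a quick sanity check; not available on older versions):

SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath;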
Is it possible to log CREATE / ALTER statements issued on a MySQL server through phpMyAdmin? I heard that it could be done with a trigger, but I can't seem to find suitable code anywhere. I would like to log these statements to a table, preferably with the timestamp of when they were issued. Can someone provide me with a sample trigger that would enable me to accomplish this?
I would like to log these statements so I can easily synchronize the changes with another MySQL server.
There is a patch for phpMyAdmin which provides configurable logging with only some simple code modifications.
We did this at my work, and then I tweaked it further to log into folders by day, log IP addresses, and a couple of other things; it works great.
Thanks @Unreason for the link; I couldn't recall where I found it.
Here is a script that will do what you want with mysql-proxy (check the official docs for how to install the proxy).
To actually log the queries you can use something as simple as
function string.starts(String, Start)
    return string.sub(String, 1, string.len(Start)) == Start
end

function read_query(packet)
    if string.byte(packet) == proxy.COM_QUERY then
        local query = string.lower(string.sub(packet, 2))
        if string.starts(query, "alter") or string.starts(query, "create") then
            -- give your logfile a name; an absolute path worked for me
            local log_file = '/var/log/mysql-proxy-ddl.log'
            local fh = io.open(log_file, "a+")
            fh:write(string.format("%s %6d -- %s \n",
                os.date('%Y-%m-%d %H:%M:%S'),
                proxy.connection.server["thread_id"],
                query))
            fh:flush()
        end
    end
end
The script was adapted from here; search for 'simple logging'.
This does not care about results: even if the query returns an error, it is logged (there is a 'more customized logging' example there, which is a better candidate for production logging).
Also, you might take another approach if it is applicable for you: define different users in your database and give DDL rights only to one certain user. Then you can log everything for that user without worrying about the details (the proxy recognizes a whole list of server commands, of which this script inspects only Query). A sketch follows.
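A minimal sketch of that approach (user name, password and database name are all made up):

-- dedicated account that is the only one allowed to run DDL
CREATE USER 'ddl_user'@'localhost' IDENTIFIED BY 'secret';
GRANT CREATE, ALTER ON mydb.* TO 'ddl_user'@'localhost';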
Installing the proxy is straightforward. When you test it, you can run it with
mysql-proxy --proxy-lua-script=/path/to/script.lua
It runs on port 4040 by default so test it with
mysql -u user -p -h 127.0.0.1 -P 4040
(make sure you don't bypass the proxy; for example on my distro mysql -u user -p -h localhost -P 4040 completely ignored the port and connected over socket, which left me puzzled for a few minutes)
The answer to your question will be one of the log types listed in MySQL Server logs.
If you just want to get the CREATE/ALTER statements, I would go with the general query log, but you will have to parse the file manually (or write the log to a table, as sketched below). Be aware of the security issues this approach raises.
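If your MySQL version is 5.1 or later, you can avoid parsing the file by writing the general log to a table instead (a hedged sketch):

SET GLOBAL log_output = 'TABLE';  -- write to mysql.general_log instead of a file
SET GLOBAL general_log = 'ON';

-- later, pull out just the DDL:
SELECT event_time, argument
FROM mysql.general_log
WHERE argument LIKE 'CREATE%' OR argument LIKE 'ALTER%';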
In your scenario, replication seems to be an overkill.
Triggers are not a valid option, since they are only supported for INSERT, UPDATE and DELETE, and not for ALTER/CREATE.
Edit 1:
The query log would be the best choice, but as you mentioned, on busy servers the log would impose a considerable efficiency penalty. The only additional alternative I know of is MySQL Proxy.
I think your best bet would be to use stored procedures and functions to make changes to your DB; that way you can log the changes manually, as in the sketch below.
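A hedged sketch of that idea, with all names made up: route every schema change through one procedure that records the statement before executing it.

CREATE TABLE ddl_log
(
    logged_at DATETIME NOT NULL,
    ddl_text  TEXT     NOT NULL
);

DELIMITER //
CREATE PROCEDURE run_ddl(IN stmt TEXT)
BEGIN
    -- remember the statement, then run it via dynamic SQL
    INSERT INTO ddl_log (logged_at, ddl_text) VALUES (NOW(), stmt);
    SET @ddl = stmt;
    PREPARE p FROM @ddl;
    EXECUTE p;
    DEALLOCATE PREPARE p;
END//
DELIMITER ;

-- usage:
CALL run_ddl('ALTER TABLE some_table ADD COLUMN new_col INT');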