pyodbc/FreeTDS/unixODBC on Debian Linux: issues with TDS Version - sql-server-2008

I'm having a bit of trouble successfully using pyodbc on Debian Lenny (5.0.7). Specifically, I appear to be having trouble fetching NVARCHAR values (not a SQL Server expert, so go easy on me :) ).
Most traditional queries work OK. For instance, a count of rows in table1 yields
>>> cursor.execute("SELECT count(id) from table1")
<pyodbc.Cursor object at 0xb7b9b170>
>>> cursor.fetchall()
[(27, )]
As does a full dump of ids
>>> cursor.execute("SELECT id FROM table1")
<pyodbc.Cursor object at 0xb7b9b170>
>>> cursor.fetchall()
[(0.0, ), (3.0, ), (4.0, ), (5.0, ), (6.0, ), (7.0, ), (8.0, ), (11.0, ), (12.0, ), (18.0, ), (19.0, ), (20.0, ), (21.0, ), (22.0, ), (23.0, ), (24.0, ), (25.0, ), (26.0, ), (27.0, ), (28.0, ), (29.0, ), (32.0, ), (33.0, ), (34.0, ), (35.0, ), (36.0, ), (37.0, )]
But a dump of names (again, of type NVARCHAR) does not
>>> cursor.execute("SELECT name FROM table1")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
pyodbc.ProgrammingError: ('42000', '[42000] [FreeTDS][SQL Server]Unicode data in a Unicode-only collation or ntext data cannot be sent to clients using DB-Library (such as ISQL) or ODBC version 3.7 or earlier. (4004) (SQLExecDirectW)')
... the critical error being
pyodbc.ProgrammingError: ('42000', '[42000] [FreeTDS][SQL Server]Unicode data in a Unicode-only collation or ntext data cannot be sent to clients using DB-Library (such as ISQL) or ODBC version 3.7 or earlier. (4004) (SQLExecDirectW)')
This is consistent across tables.
I've tried a variety of different versions of each, but now I'm running unixODBC 2.2.11 (from lenny repos), FreeTDS 0.91 (built from source, with ./configure --enable-msdblib --with-tdsver=8.0), and pyodbc 3.0.3 (built from source).
With a similar combination (unixODBC 2.3.0, FreeTDS 0.91, pyodbc 3.0.3), the same code works on Mac OS X 10.7.2.
I've searched high and low, investigating the solutions presented here and here and recompiling different versions of unixODBC and FreeTDS, but still no dice. Relevant configuration files provided below:
user@host:~$ cat /usr/local/etc/freetds.conf
#$Id: freetds.conf,v 1.12 2007/12/25 06:02:36 jklowden Exp $
#
# This file is installed by FreeTDS if no file by the same
# name is found in the installation directory.
#
# For information about the layout of this file and its settings,
# see the freetds.conf manpage "man freetds.conf".
# Global settings are overridden by those in a database
# server specific section
[global]
# TDS protocol version
tds version = 8.0
client charset = UTF-8
# Whether to write a TDSDUMP file for diagnostic purposes
# (setting this to /tmp is insecure on a multi-user system)
; dump file = /tmp/freetds.log
; debug flags = 0xffff
# Command and connection timeouts
; timeout = 10
; connect timeout = 10
# If you get out-of-memory errors, it may mean that your client
# is trying to allocate a huge buffer for a TEXT field.
# Try setting 'text size' to a more reasonable limit
text size = 64512
# A typical Sybase server
[egServer50]
host = symachine.domain.com
port = 5000
tds version = 5.0
# A typical Microsoft server
[egServer70]
host = ntmachine.domain.com
port = 1433
tds version = 8.0
[foo]
host = foo.bar.com
port = 1433
tds version = 8.0
user@host:~$ cat /etc/odbc.ini
[foo]
Description = Foo
Driver = foobar
Trace = No
Database = db
Server = foo.bar.com
Port = 1433
TDS_Version = 8.0
user@host:~$ cat /etc/odbcinst.ini
[foobar]
Description = Description
Driver = /usr/lib/odbc/libtdsodbc.so
Setup = /usr/lib/odbc/libtdsS.so
CPTimeout =
CPReuse =
Any advice or direction would be very much appreciated!

I encountered the same error on Ubuntu. I "solved" it with a workaround.
All you need to do is set the environment variable TDSVER.
import os
os.environ['TDSVER'] = '8.0'
As I said, it is not a real "solution", but it works.
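For completeness, FreeTDS reads TDSVER at connect time, so the variable must be set before the connection is opened. A minimal sketch (the helper name and the DSN/credentials below are placeholders, not part of the original answer):

```python
import os

def freetds_env(tds_version='8.0'):
    """Set the TDSVER override for FreeTDS.

    FreeTDS consults TDSVER when a connection is opened, so this
    must run before pyodbc.connect() is called.
    """
    os.environ['TDSVER'] = tds_version

freetds_env('8.0')
# conn = pyodbc.connect('DSN=foo;UID=me;PWD=pwd;DATABASE=db')  # placeholder DSN
```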

Try adding
TDS_Version=8.0;ClientCharset=UTF-8
in your connection string.
For example,
DRIVER=FreeTDS;SERVER=myserver;DATABASE=mydatebase;UID=me;PWD=pwd;TDS_Version=8.0;ClientCharset=UTF-8
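If you'd rather build the string programmatically, something like this keeps the TDS_Version and ClientCharset parts in one place (the helper and all server/credential values are illustrative placeholders):

```python
def freetds_conn_string(server, database, uid, pwd,
                        tds_version='8.0', charset='UTF-8'):
    """Assemble a DSN-less FreeTDS connection string for pyodbc.

    TDS_Version and ClientCharset are included explicitly so the
    driver does not fall back to an older protocol version.
    """
    parts = {
        'DRIVER': 'FreeTDS',
        'SERVER': server,
        'DATABASE': database,
        'UID': uid,
        'PWD': pwd,
        'TDS_Version': tds_version,
        'ClientCharset': charset,
    }
    return ';'.join('%s=%s' % kv for kv in parts.items())

cs = freetds_conn_string('myserver', 'mydatebase', 'me', 'pwd')
# conn = pyodbc.connect(cs)  # uncomment with real credentials
```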

Can't you just sidestep the issue and either CONVERT or CAST name to something it can handle?
cursor.execute("SELECT CAST(name AS TEXT) FROM table")

Related

Error: nanodbc/nanodbc.cpp:1021: 00000: [

I have been trying to connect to a SQL database that I am trying to create. I recently downloaded MySQL, the Workbench, the Connector/ODBC, and the ODBC Manager, but I can't find a solution for the connection error.
Do I need to download anything else? I can't find a solution on the internet or YouTube for Mac.
packages_required = c("quantmod", "RSQLite", "data.table", "lubridate", "pbapply", "DBI")
install.packages(packages_required)
library("quantmod")
library("RSQLite")
library("data.table")
library("lubridate")
library("pbapply")
library("odbc")
PASS <- new.env()
assign("pwd","My Password",envir=PASS)
library("DBI")
con <- dbConnect(odbc(), Driver = "/usr/local/mysql-connector-odbc-8.0.28-macos11-x86-64bit/lib/libmyodbc8w.so",
Server = "localhost", Database = "data", UID = "root", PWD = PASS$pwd,
Port = 3306)
-----------------------------------------------------------------------------------------
> con <- dbConnect(odbc(), Driver = "/usr/local/mysql-connector-odbc-8.0.28-macos11-x86-64bit/lib/libmyodbc8w.so",
+ Server = "localhost", Database = "data", UID = "root", PWD = PASS$pwd,
+ Port = 3306)
Error: nanodbc/nanodbc.cpp:1021: 00000: [
>
Thank you
Presuming you're on Windows, try creating an ODBC connection using the most recent driver. The ODBC data sources tool should already be installed, you just need to open it and create a new one.
Press the windows key (or click the search spyglass) and type in "ODBC." The "ODBC Data Sources (64-bit)" tool should come up.
How to Create an ODBC Connection in Windows
Open the "ODBC Data Sources (64-bit)" application
Click "Add"
Choose "MySQL ODBC 8.0 Unicode Driver" (or whatever the newest version you have is). If you don't have it, you can download it here:
https://dev.mysql.com/downloads/connector/odbc/
Enter the following information:
Data source name (the example code below uses "my_odbc_connection"), TCP/IP Server, Port, User and Password
Click "Details" to expand the box.
In the "Connection" tab, you may need to check the "Enable Cleartext Authentication" box. This could depend on your system configuration.
Click "Test" to test the connection. If everything went right, you should get a "Connection Successful" message. If you aren't able to get a successful connection, make sure that you have access and that your connection information doesn't have any typos.
After making a successful connection, perform these 2 additional steps (the
drop-downs won't populate until you connect successfully):
Click the "Database" drop-down to choose the default database that you'll be writing data to. If you will be writing to more than one database, you may need to create a separate connection for each, specifying the default database differently for each one.
Click the "Character Set" drop down and choose utf8.
You should now be able to use the "DBI" and "odbc" packages to read, write, etc. any data directly from R. Specific settings listed above may or may not apply depending on your situation.
See the example code below.
Further reading: https://www.r-bloggers.com/setting-up-an-odbc-connection-with-ms-sql-server-on-windows/
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#~~ Load or install packages
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Load or install librarian
if(require(librarian) == FALSE){
  install.packages("librarian")
  if(require(librarian) == FALSE){ stop("Unable to install and load librarian") }
}
# Load multiple packages using the librarian package
librarian::shelf(tidyverse, readxl, DBI, lubridate, odbc, quiet = TRUE)
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#~~ Read
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Connect to a pre-defined ODBC connection named "my_odbc_connection"
conn <- DBI::dbConnect(odbc::odbc(), "my_odbc_connection")
# Create a query
query <- "
SELECT *
FROM YOUR_SCHEMA.YOUR_TABLE;
"
# Run the query
df_data <- DBI::dbGetQuery(conn,query)
# Close the open connection
try(DBI::dbDisconnect(conn), silent = TRUE)
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#~~ Write
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Define the connection and database you'll be writing to
conn <- DBI::dbConnect(odbc::odbc(), "my_odbc_connection", db ="YOUR_DEFAULT_DB")
# Define variable types for your data frame. As a general rule, it's a good idea to define your data types rather than let the package guess.
field_types <- c("INTEGER","VARCHAR(20)","DATE","DATETIME","VARCHAR(20)","VARCHAR(50)","VARCHAR(20)")
names(field_types) <- names(df_data)
# Record start time
start_time <- Sys.time()
# Example write statement
DBI::dbWriteTable(conn, "YOUR_TABLE_NAME", df_data, overwrite = TRUE, field.types = field_types, row.names = FALSE)
# Print time difference
print("Writing complete.")
print(Sys.time() - start_time)
# Close the open connection
try(DBI::dbDisconnect(conn), silent = TRUE)

Error Code: 23 Out of resources when opening file

When I execute a query in MySQL, I get this error:
Error Code: 23
Out of resources when opening file '.\test\sample_table#P#p364.MYD' (Errcode: 24 - Too many open files)
MySQL version details:
VERSION 5.6.21
version_comment MySQL Community SERVER (GPL)
version_compile_machine x86_64
version_compile_os Win64
How to solve this problem?
The MySQL error Out of resources when opening file... (Errcode: 24) indicates that the number of files that mysqld is permitted to open has been exceeded.
This limit is controlled by the variable open_files_limit. You can read this in phpMyAdmin (or the MySQL command line utility) with the statement:
SHOW VARIABLES LIKE 'open%'
To set this variable to a higher number, edit the /etc/my.cnf file and add the lines:
[mysqld]
open_files_limit = 5000
This answer explains the error code 24 (which is at the end of your error message).
If you happen (like me) to be doing some hacky server maintenance by running a single insert query for 35k rows of data, upping the open_files_limit is not the answer. Try breaking up your statement into multiple bite-sized pieces instead.
Here's some python code for how I solved my problem, to illustrate:
headers = ['field_1', 'field_2', 'field_3']
data = [('some stuff', 12, 89.3), ...] # 35k rows worth
insert_template = "\ninsert into my_schema.my_table ({}) values {};"
value_template = "('{}', {}, {})"
start = 0
chunk_size = 1000
total = len(data)
sql = ''
while start < total:
    end = start + chunk_size
    values = ", \n".join([
        value_template.format(*row)
        for row in data[start:end]
    ])
    sql += insert_template.format(", ".join(headers), values)
    start = end
Note that I do not recommend running statements like this as a rule; mine was a quick, dirty job, neglecting proper cleansing and connection management.
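A cleaner variant of the same idea is to use a parameterized executemany per chunk; parameter binding also avoids the quoting problems of string-formatted SQL. This is a sketch only: the cursor, schema, and table names are hypothetical, and the %s placeholder style assumes a MySQLdb-compatible driver.

```python
def chunks(rows, size=1000):
    """Yield successive slices of at most `size` rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

def chunked_insert(cursor, rows, size=1000):
    """Insert rows in chunks so no single statement opens too many resources."""
    sql = ("insert into my_schema.my_table (field_1, field_2, field_3) "
           "values (%s, %s, %s)")
    for chunk in chunks(rows, size):
        cursor.executemany(sql, chunk)  # one round trip per chunk
```

The chunking helper is plain list slicing, so it is easy to verify in isolation before pointing it at a live connection.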

How to make shiny app talk to cloud relational database (in MySQL)?

This might sound quite easy to many experts, but after spending hours I have not come up with the right solution yet; I might have overlooked something that is easy to configure.
My question is how to make this shiny app talk to a cloud relational database, for instance the Google Cloud SQL service, after deploying onto shinyapps.io.
I have successfully launched this shiny app locally on my Windows 7 64-bit machine, because I have specified a User DSN named google_sql with the correct driver (MySQL ODBC 5.3 ANSI Driver), IP, password, etc., so in the odbcConnect code line I can simply provide the DSN, username and password to open a connection. However, when I deploy it to shinyapps.io, it fails, contrary to my expectation. My guess is that my DSN google_sql is not recognized by shinyapps.io, so in order to make it work, what should I do? Should I change some code, or configure something on shinyapps.io?
PS: It's not about how to install RMySQL; someone has posted an answer to a similar question here (unless they think RMySQL can do something which RODBC cannot):
connecting shiny app to mysql database on server
server.R
library(shiny)
# library(RODBC)
library(RMySQL)
# ch <- odbcConnect(dsn = "google_sql", uid = "abc", pwd = "def")
ch <- dbConnect(MySQL(), user = 'abc', password = 'def',
                host = 'cloud_rdb_ip_address', dbname = 'my_db')
shinyServer(function(input, output) {
  statement <- reactive({
    if(input$attribute == 'All'){
      sprintf("SELECT * FROM test_db WHERE country = '%s' AND item = '%s' AND year = '%s' AND data_source = '%s'",
              input$country, input$item, input$year, input$data_source)
    }else{
      sprintf("SELECT * FROM test_db WHERE country = '%s' AND item = '%s' AND attribute = '%s' AND year = '%s' AND data_source = '%s'",
              input$country, input$item, input$attribute, input$year, input$data_source)
    }
  })
  output$result <- renderTable(dbFetch(dbSendQuery(ch, statement = statement()), n = 1000))
})
ui.R
library(shiny)
shinyUI(fluidPage(
  # Application title
  headerPanel("Sales Database User Interface"),
  fluidRow(
    column(4,
      selectInput('country', 'Country', c('United States', 'European Union', 'China'), selected = NULL),
      selectInput('item', 'Item', c('Shoes', 'Hat', 'Pants', 'T-Shirt'), selected = NULL),
      selectInput('attribute', 'Attribute', c('All', 'Sales', 'Procurement'), selected = NULL)
    ),
    column(4,
      selectInput('year', 'Calendar Year', c('2014/2015', '2015/2016'), selected = NULL),
      selectInput('data_source', 'Data Source', c('Automation', 'Manual'), selected = NULL)
    )
  ),
  submitButton(text = "Submit", icon = NULL),
  # Show the query result
  mainPanel(
    tableOutput("result")
  )
))
I think it is worth posting my shiny showLogs() error log so experts can enlighten me, please:
2015-05-04T06:32:16.143534+00:00 shinyapps[40315]: R version: 3.1.2
2015-05-04T06:32:16.392183+00:00 shinyapps[40315]:
2015-05-04T06:32:16.143596+00:00 shinyapps[40315]: shiny version: 0.11.1
2015-05-04T06:32:16.392185+00:00 shinyapps[40315]: Listening on http://0.0.0.0:51336
2015-05-04T06:32:16.143598+00:00 shinyapps[40315]: rmarkdown version: NA
2015-05-04T06:32:16.143607+00:00 shinyapps[40315]: knitr version: NA
2015-05-04T06:32:16.143608+00:00 shinyapps[40315]: jsonlite version: NA
2015-05-04T06:32:16.143616+00:00 shinyapps[40315]: RJSONIO version: 1.3.0
2015-05-04T06:32:16.143660+00:00 shinyapps[40315]: htmltools version: 0.2.6
2015-05-04T06:32:16.386758+00:00 shinyapps[40315]: Using RJSONIO for JSON processing
2015-05-04T06:32:16.386763+00:00 shinyapps[40315]: Starting R with process ID: '27'
2015-05-04T06:32:19.572072+00:00 shinyapps[40315]: Loading required package: DBI
2015-05-04T06:32:19.831544+00:00 shinyapps[40315]: Error in .local(drv, ...) :
2015-05-04T06:32:19.831547+00:00 shinyapps[40315]: Failed to connect to database: Error: Lost connection to MySQL server at 'reading initial communication packet', system error: 0
2015-05-04T06:32:19.831549+00:00 shinyapps[40315]:
PS: I think I need to white-list the shinyapps.io IP addresses in my Google Cloud console to enable deployment on shinyapps.io.
The list given in pidig89's answer is the right IP list, but rather than trusting some random list found on a SO answer you can find the most up-to-date list on their support site: https://support.rstudio.com/hc/en-us/articles/217592507-How-do-I-give-my-application-on-shinyapps-io-access-to-my-remote-database-
(They "formally" announced this as the recommended way of IP filtering on their mailing list on a post in July 28, 2016)
I actually managed to come up with the answer to this question myself, but wanted to share it since it might be relevant to others as well.
Here are the IP addresses you need to whitelist.
54.204.29.251
54.204.34.9
54.204.36.75
54.204.37.78
If you think it's a shinyapps issue with whitelisting your IP (I'm not saying that's the actual issue, but in case it is) then I would post to the shinyapps google group, as the shinyapps developers monitor it and answer frequently.
https://groups.google.com/forum/#!forum/shinyapps-users

How to pass secure_auth to MySQL login via SQLalchemy

I'm working on the front end of a webapp, and my co-developer is using Pyramid and SQLAlchemy. We've just moved from SQLite to MySQL. I installed MySQL 5.6.15 (via Homebrew) on my OS X machine to get the Python MySQLdb install to work (via pip in a virtualenv).
Because in MySQL >= 5.6.5 secure_auth is now ON by default I can only connect to the remote database (pre 5.6.5) with the --skip-secure-auth flag, which works fine in a terminal.
However, in the Python Pyramid code, it only seems possible to add this flag as an argument to create_engine(), but I can't find create_engine() in my co-dev's code, only the connection string below in an initialisation config file. He's not available, this isn't my area of expertise, and we launch next week :(
sqlalchemy.url = mysql+mysqldb://gooddeeds:deeds808letme1now@146.227.24.38/gooddeeds_development?charset=utf8
I've tried appending various "secure auth" strings to the above with no success. Am I looking in the wrong place? Has MySQLdb set secure_auth to ON because I'm running MySQL 5.6.15? If so, how can I change that?
If you are forced to use the old passwords (bah!) when using MySQL 5.6, and using MySQLdb with SQLAlchemy, you'll have to add the --skip-secure-auth to an option file and use URL:
from sqlalchemy import create_engine
from sqlalchemy.engine.url import URL

dialect_options = {
    'read_default_file': '/path/to/your/mysql.cnf',
}
engine = create_engine(URL(
    'mysql',
    username='..', password='..',
    host='..', database='..',
    query=dialect_options
))
The mysql.cnf would contain:
[client]
skip-secure-auth
For Pyramid, you can do the following. Add a line in your configuration ini-file that holds the connection arguments:
sqlalchemy.url = mysql://scott:tiger@localhost/test
sqlalchemy.connect_args = { 'read_default_file': '/path/to/foo' }
Now you need to change a bit the way the settings are read and used. In the file that launches your Pyramid app, do the following:
def main(global_config, **settings):
    try:
        settings['sqlalchemy.connect_args'] = eval(settings['sqlalchemy.connect_args'])
    except KeyError:
        settings['sqlalchemy.connect_args'] = {}
    engine = engine_from_config(settings, 'sqlalchemy.')
    # rest of code..
The trick is to evaluate the string in the ini file which contains a dictionary with the extra options for the SQLAlchemy dialect.
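As an aside, Python's ast.literal_eval does the same job as eval here without executing arbitrary code; a sketch of just the parsing step, assuming the same ini key name as in the answer above:

```python
import ast

def parse_connect_args(settings):
    """Parse the sqlalchemy.connect_args ini value into a dict.

    literal_eval only accepts Python literals, so a malformed or
    malicious ini value raises an error instead of running code.
    """
    try:
        return ast.literal_eval(settings['sqlalchemy.connect_args'])
    except KeyError:
        return {}

settings = {'sqlalchemy.connect_args': "{ 'read_default_file': '/path/to/foo' }"}
args = parse_connect_args(settings)
```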

How do I configure pyodbc to correctly accept strings from SQL Server using freeTDS and unixODBC?

I cannot get a valid string from an MS SQL server into Python. I believe there is an encoding mismatch somewhere, probably between the ODBC layer and Python, because I am able to get readable results in tsql and isql.
What character encoding does pyodbc expect? What do I need to change in the chain to get this to work?
Specific Example
Here is a simplified python script as an example:
#!/usr/bin/env python
import pyodbc

dsn = 'yourdb'
user = 'import'
password = 'get0lddata'
database = 'YourDb'

def get_cursor():
    con_string = 'DSN=%s;UID=%s;PWD=%s;DATABASE=%s;' % (dsn, user, password, database)
    conn = pyodbc.connect(con_string)
    return conn.cursor()

if __name__ == '__main__':
    c = get_cursor()
    c.execute("select id, name from recipe where id = 4140567")
    row = c.fetchone()
    if row:
        print row
The output of this script is:
(Decimal('4140567'), u'\U0072006f\U006e0061\U00650067')
Alternatively, if the last line of the script is changed to:
print "{0}, '{1}'".format(row.id, row.name)
Then the result is:
Traceback (most recent call last):
File "/home/mdenson/projects/test.py", line 20, in <module>
print "{0}, '{1}'".format(row.id, row.name)
UnicodeEncodeError: 'ascii' codec can't encode characters in position 0-2: ordinal not in range(128)
A transcript using tsql to execute the same query:
root@luke:~# tsql -S cmw -U import -P get0lddata
locale is "C"
locale charset is "ANSI_X3.4-1968"
using default charset "UTF-8"
1> select id, name from recipe where id = 4140567
2> go
id name
4140567 orange2
(1 row affected)
and also in isql:
root@luke:~# isql -v yourdb import get0lddata
SQL> select id, name from recipe where id = 4140567
+----------------------+--------------------------+
| id | name |
+----------------------+--------------------------+
| 4140567 | orange2 |
+----------------------+--------------------------+
SQLRowCount returns 1
1 rows fetched
So I have worked at this for the morning and looked high and low and haven't figured out what is amiss.
Details
Here are version details:
Client is Ubuntu 12.04
freetds v0.91
unixodbc 2.2.14
python 2.7.3
pyodbc 2.1.7-1 (from ubuntu package) & 3.0.7-beta06 (compiled from source)
Server is XP with SQL Server Express 2008 R2
Here are the contents of a few configuration files on the client.
/etc/freetds/freetds.conf
[global]
tds version = 8.0
text size = 64512
[cmw]
host = 192.168.90.104
port = 1433
tds version = 8.0
client charset = UTF-8
/etc/odbcinst.ini
[FreeTDS]
Description = TDS driver (Sybase/MS SQL)
Driver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so
Setup = /usr/lib/x86_64-linux-gnu/odbc/libtdsS.so
CPTimeout =
CPReuse =
FileUsage = 1
/etc/odbc.ini
[yourdb]
Driver = FreeTDS
Description = ODBC connection via FreeTDS
Trace = No
Servername = cmw
Database = YourDB
Charset = UTF-8
So after continued work I am now getting unicode characters into python. Unfortunately the solution I've stumbled upon is about as satisfying as kissing your cousin.
I solved the problem by installing the python3 and python3-dev packages and then rebuilding pyodbc with python3.
Now that I've done this my scripts now work even though I am still running them with python 2.7.
So I don't know what was fixed by doing this, but it now works and I can move on to the project I started with.
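For what it's worth, the garbled value in the question looks like pairs of UTF-16 code units packed into single UCS-4 code points ('o' and 'r' fused into U+0072006F, and so on), which points at a wide-character size mismatch between pyodbc and the driver. A small diagnostic sketch (the unmangle helper is hypothetical; it works on the raw code-point values, since Python itself cannot hold code points above 0x10FFFF in a string):

```python
def unmangle(codepoints):
    """Split each mangled code point back into its two packed UTF-16 units.

    Each value holds one character in its low 16 bits and the next
    character in its high 16 bits (little-endian byte order).
    """
    out = []
    for cp in codepoints:
        out.append(chr(cp & 0xFFFF))   # low half: first character
        if cp >> 16:
            out.append(chr(cp >> 16))  # high half: next character
    return ''.join(out)

# The three code points from the question's output
print(unmangle([0x0072006F, 0x006E0061, 0x00650067]))  # -> 'orange'
```

Unpacking the question's value this way yields "orange" (the trailing '2' of "orange2" was apparently lost in the mangling), which is consistent with a character-width mismatch rather than data corruption.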
Any chance you're having a problem with a BOM (Byte Order Marker)? If so, maybe this snippet of code will help:
import codecs

if s.startswith(codecs.BOM_UTF8):
    # The byte string s begins with the BOM: do something.
    # For example, decode the string as UTF-8.
    pass

if u[0] == unicode(codecs.BOM_UTF8, "utf8"):
    # The unicode string begins with the BOM: do something.
    # For example, remove the character.
    pass

# Strip the BOM from the beginning of the Unicode string, if it exists
u = u.lstrip(unicode(codecs.BOM_UTF8, "utf8"))
I found that snippet on this page.
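In Python 3 terms, where the bytes/str split is explicit, the same BOM check might look like this (a sketch with a hypothetical helper, not from the linked page):

```python
import codecs

def strip_bom(raw):
    """Decode UTF-8 bytes, dropping a leading BOM if present."""
    if raw.startswith(codecs.BOM_UTF8):
        raw = raw[len(codecs.BOM_UTF8):]
    return raw.decode('utf-8')

text = strip_bom(b'\xef\xbb\xbforange2')
```

Equivalently, decoding with the 'utf-8-sig' codec handles the BOM in one step.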
If you upgrade pyodbc to version 3, the problem will be solved.