pgAdmin 4 import/export table does not work - CSV

I have made sure that the columns match the 'Columns to export' field in the Columns tab, that the NOT NULL columns have data, and I have tried both CSV and TXT, but all I get is a message saying:
import/export job created
Nothing else: no errors, no warnings, no completion.
Windows 7 OS
Version: 1.4
Copyright: 2013 - 2017, The pgAdmin Development Team
Python Version: 2.7.12 (v2.7.12:d33e0cf91556, Jun 27 2016, 15:19:22) [MSC v.1500 32 bit (Intel)]
Flask Version: 0.11.1
Application Mode: Desktop
Until then, I'll try via psql.

Just a guess, but are you trying this as a superuser? I'm having the same issue, so I tried writing it as a COPY statement, but I get this response:
ERROR: must be superuser to COPY to or from a file
SQL state: 42501
Hint: Anyone can COPY to stdout or from stdin. psql's \copy command also works for anyone.
I'm not sure, but maybe the functionality in pgAdmin just constructs a COPY statement, then isn't properly relaying the error message back to you when it fails.
Sounds like psql is the right way to go, though.
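For anyone going the psql route, here is a minimal \copy sketch; the table name and file path are placeholders. Since \copy reads and writes the file on the client side, it does not require superuser rights:
-- export a table to CSV (client-side, so no superuser needed)
\copy my_table TO 'C:/temp/my_table.csv' WITH (FORMAT csv, HEADER)
-- import it back in
\copy my_table FROM 'C:/temp/my_table.csv' WITH (FORMAT csv, HEADER)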

Related

SSIS ODBC Simba - Error when accessing the table list on ODBC Source / Destination

I'm using Simba ODBC to create a connection with Google BigQuery and using SSIS (Visual Studio 2019) to read and write information on BigQuery. The connection works fine, and when I use the ODBC Source with the query option I'm able to get data from BigQuery and use it inside SSIS. But when I use the list of tables, I get an error as below:
Exception of HRESULT: 0xC0014020
Error in Data Flow Task[ODBC Source [100]]: SQLSTATE: 42000, Message: [Simba][BigQuery] (70) Invalid query: Invalid dataset ID ""TEST"". Dataset IDs must be alphanumeric (plus underscores and dashes) and must be at most 1024 characters long.
I believe that this happens because the table names in the list appear between double quotes (") instead of backticks (`).
[Screenshot of the table list]
The same happens when I use the ODBC Destination. Is there a way to change the format in which the table list appears?
Note: in Visual Studio 2015 this table list comes with backticks (`) and I can connect to BigQuery just fine.
I can see that the tool is sending "TEST" as the dataset; however, depending on whether Visual Studio is using StandardSQL or LegacySQL, the dataset should be specified as:
# LegacySQL
FROM [myproject:TEST.TABLE_TEST]
# StandardSQL
FROM `myproject.TEST.TABLE_TEST`
I was wondering if Visual Studio accepts a custom query or can be parameterized to remove the quotes. If this doesn't help, could you please share the query that causes the error? I understand that there is a query option (I'm not familiar with Visual Studio), and it is not clear to me at what exact moment the tool returns the error; a screenshot without sensitive information would be appreciated.
UPDATE:
You can review the following checkpoints to verify that the Simba driver is correctly set up and is not the cause of the reported error:
Installation. Check that you are using the latest version of the driver. The latest version usually contains improvements to the driver.
ODBC Configuration. For example, at Step 13 of the linked guide you will be able to see a drop-down list with the available datasets and select one as the default. If you don't have issues at this step, then the issue could be in the tool that uses the ODBC connection.
Language Dialect. Here you can change between StandardSQL and LegacySQL as needed; for example, you can force your tool to use LegacySQL and use the [ and ] characters explained above.
Connection String. If your tool allows using a connection string, you might want to use it and explicitly indicate the default dataset (among other driver options); a rough sketch follows below.
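For the connection-string route, something like the line below; the key names and values are assumptions from the Simba BigQuery ODBC documentation as I remember it, so verify them against your driver version:
# Sketch only - key names/values are assumptions; check the Simba
# BigQuery ODBC driver docs for your version.
# Catalog = default project; SQLDialect = 1 for StandardSQL, 0 for LegacySQL
Driver={Simba ODBC Driver for Google BigQuery};Catalog=myproject;SQLDialect=1;OAuthMechanism=0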

How to fix conversion errors after importing an SSIS project

I'm importing a perfectly working SSIS project from TFS.
I actually have a problem with all the packages that contain a data flow with a date import.
I get dozens of this error:
Validation error. DFT Get Date ODBC Source CodeDate2 [63]: The OLE DB provider used by the OLE DB adapter cannot convert between types "DT_BYTES" and "DT_DBDATE" for "Date".
and when I click on the ODBC source editor, I get the following message:
The metadata of the following output columns does not match the metadata of the external columns with which the output columns are associated:
Output "ODBC Source Output": "Date"
Do you want to replace the metadata of the output columns with the metadata of the external columns?
The fact is that it works everywhere but on my computer.
Is there an OLE DB provider component I'm lacking, or something like that?
Downgrading will work, but if that's not possible for you, then rewriting your queries may also solve your problem.
In my case I had a Postgres query returning columns of type date. I just converted them all to timestamptz using ::timestamptz. At that point the columns changed from DT_BYTES to DT_DBTIMESTAMP, which was just fine for my purposes.
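For illustration, a sketch of that kind of cast; the table and column names here are made up:
-- Cast a Postgres date column to timestamptz so the ODBC layer exposes it
-- as DT_DBTIMESTAMP instead of DT_BYTES (table/column names are placeholders).
SELECT
    order_id,
    order_date::timestamptz AS order_date
FROM orders;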
It might be related to the version of Visual Studio or SSDT.
Try installing SSDT 15.8.0 (see SSDT previous releases) and run the package in it.
I once saw similar posts on MSDN after the release of Visual Studio 15.9.2:
Import from Teradata using ODBC gives VS_NEEDSNEWMETADATA error
ODBC Progress datatype problems after updating to VS 2017 15.9
Same here; I forced the type by casting it in the SELECT and it works:
SELECT
[...]
cast(release_date as datetime) as release_date,
[...]
FROM cm_wo

Insert JSON into Table using MySQL Workbench 6.3 or SQL Server 2012

I've got some JSON bank statement code that I need to insert into a table, but I'm having trouble with how I would do this on SQL Server 2012, as it seems possible on 2016+ but not on 2012.
Either there or in MySQL would be OK; for reference, please see the code below. I think I would need two tables for the first and second parts of the JSON file.
Can anyone point me in the right direction on how to handle files like this? The first line with name, status, location, etc. is included just once, but the rest is repeated multiple times throughout the statements.
Please see below for an imitation test sample/reference of the JSON file/contents:
{ "status": "success","Name": "Theresa May","Location": "87654321","AccountCategory": "Business","AccountType": "Current","TicketNo": "12345-12","AvailableBal": "12775.0400","BookBa": "123475.0400","TotalCredit": "1234567","TotalDebit": "0","Usage": "5","Period": "May 11 2014 to Jul 11 2014","Currency": "GBP","Applicants": "Angel","Signatories": [{"Name": "Not Available","BVB":"Not Available"}],"Details": [{"PTransactionDate":"24-Jul-14","PValueDate":"24-Jul-13","PNarration":"Cash Deposit","PCredit":"0.0000","PDebit":"40003.0000","PBalance":"40003.0000"},{"PTransactionDate":"24-Jul-14","PValueDate":"23-Jul-14","PTest":"Cash Deposit","PCredit":"0.0000","PDebit":"40003.0000","PBalance":"40003.0000"},{"PTransactionDate":"25-Jul-14","PValueDate":"22-Jul-14","PTest":"Cash Deposit","PCredit":"0.0000","PDebit":"40003.0000","PBalance":"40003.0000"},{"PTransactionDate":"25-Jul-14","PValueDate":"21-Jul-14","PTest":"Cash Deposit","PCredit":"0.0000","PDebit":"40003.0000","PBalance":"40003.0000"},{"PTransactionDate":"25-Jul-14","PValueDate":"20-Jul-14","PTest":"Cash Deposit","PCredit":"0.0000","PDebit":"40003.0000","PBalance":"40003.0000"}]}

OpenERP Pentaho report error: Failed at query

I am getting an error while I try to generate a report from OpenERP.
File "/home/target/projects/orthene/pentaho_reports/wizard/report_prompt.py", line 207, in _setup_parameters
report_parameters = proxy.report.getParameterInfo(proxy_argument)
File "/usr/lib/python2.7/xmlrpclib.py", line 1224, in __call__
return self.__send(self.__name, args)
File "/usr/lib/python2.7/xmlrpclib.py", line 1578, in __request
verbose=self.__verbose
File "/usr/lib/python2.7/xmlrpclib.py", line 1264, in request
return self.single_request(host, handler, request_body, verbose)
File "/usr/lib/python2.7/xmlrpclib.py", line 1297, in single_request
return self.parse_response(response)
File "/usr/lib/python2.7/xmlrpclib.py", line 1473, in parse_response
return u.close()
File "/usr/lib/python2.7/xmlrpclib.py", line 793, in close
raise Fault(**self._stack[0])
Fault: <Fault 0: 'Failed to invoke method getParameterInfo in class com.willowit.reporting.PentahoRenderer: Failed at query: SELECT name FROM res_partner WHERE customer=True'>
Here is what my Pentaho report design looks like:
The Pentaho Report Designer version is 5.0.1-Stable.
It is a JDBC connection to my local Postgres database, which connects successfully in Report Designer.
select_partner is a SQL query that fetches all the partner names to show in the parameter:
SELECT name FROM res_partner WHERE customer=True
PartnerQuery is a SQL query that fetches the name, city, phone, and email of customers:
SELECT name,city,phone,email FROM res_partner WHERE customer=True AND name=${partner} -- ${partner} is a defined parameter
The partner parameter provides a drop-down selection of partners when we generate a report; it points to partner_query (which lists all the partners).
The report works fine in Pentaho Report Designer and gives output like this:
But it is not working in OpenERP, and I get the error described at the very top.
Keyur, I recommend that you post it on the WillowIT GitHub page for this Pentaho module.
Here is what I found https://github.com/WillowIT/Pentaho-reports-for-OpenERP/issues/125.
Is this query returning NULL?
There was a wrong value for pentaho.postgres.password in OpenERP under Settings -> Technical -> Parameters -> System Parameters.
The issue was solved after applying the correct password.
The error
Fault 0: 'Failed to invoke method getParameterInfo in class com.willowit.reporting.PentahoRenderer:
is a pretty generic error. It is the top level of the error log from the Pentaho reporting engine when almost anything goes wrong, and is what is returned to Odoo (OpenERP). As such, on its own, it is not helpful for working out which of the numerous possible problems it can be.
Generally, the Tomcat log file (catalina.out) is far more helpful in working out what is wrong in the reporting setup.
There are a number of issues tagged with Failed to invoke method getParameterInfo on github.
Can I encourage people to look through the GitHub issues and post there when looking for solutions.

XPages JDBC connected to MS Access DB, issue showing data in ViewPanel

I am trying to connect XPagesJDBC.nsf to a simple MS Access database.
It connects successfully, and I am able to perform @JdbcInsert, Update, and Delete, and also to execute SQL query operations.
But I am facing an issue when trying to show data in a ViewPanel control with the JDBC Query data source type.
It gives the error below:
Error while reading the relational data
Invalid Fetch Size
and a long stack trace; I am copying part of it below.
I think this error is specific to MS Access, because I could find successful implementations with other RDBMSs, and it also works fine with Derby at my end.
com.ibm.xsp.util.FacesUtil.renderComponent(FacesUtil.java:840)
com.ibm.xsp.util.FacesUtil.renderComponent(FacesUtil.java:851)
com.ibm.xsp.util.FacesUtil.renderComponent(FacesUtil.java:851)
com.ibm.xsp.util.FacesUtil.renderComponent(FacesUtil.java:851)
com.ibm.xsp.util.FacesUtil.renderComponent(FacesUtil.java:851)
com.ibm.xsp.component.UIViewRootEx._renderView(UIViewRootEx.java:1317)
com.ibm.xsp.component.UIViewRootEx.renderView(UIViewRootEx.java:1255)
com.ibm.xsp.application.ViewHandlerExImpl.doRender(ViewHandlerExImpl.java:641)
com.ibm.xsp.application.ViewHandlerExImpl._renderView(ViewHandlerExImpl.java:320)
com.ibm.xsp.application.ViewHandlerExImpl.renderView(ViewHandlerExImpl.java:335)
com.sun.faces.lifecycle.RenderResponsePhase.execute(RenderResponsePhase.java:103)
com.sun.faces.lifecycle.LifecycleImpl.phase(LifecycleImpl.java:210)
com.sun.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:120)
com.ibm.xsp.controller.FacesControllerImpl.render(FacesControllerImpl.java:264)
com.ibm.xsp.webapp.FacesServlet.serviceView(FacesServlet.java:248)
com.ibm.xsp.webapp.FacesServletEx.serviceView(FacesServletEx.java:200)
com.ibm.xsp.webapp.FacesServlet.service(FacesServlet.java:160)
com.ibm.xsp.webapp.FacesServletEx.service(FacesServletEx.java:137)
com.ibm.xsp.webapp.DesignerFacesServlet.service(DesignerFacesServlet.java:103)
com.ibm.designer.runtime.domino.adapter.ComponentModule.invokeServlet(ComponentModule.java:576)
com.ibm.domino.xsp.module.nsf.NSFComponentModule.invokeServlet(NSFComponentModule.java:1267)
com.ibm.designer.runtime.domino.adapter.ComponentModule$AdapterInvoker.invokeServlet(ComponentModule.java:847)
com.ibm.designer.runtime.domino.adapter.ComponentModule$ServletInvoker.doService(ComponentModule.java:796)
com.ibm.designer.runtime.domino.adapter.ComponentModule.doService(ComponentModule.java:565)
com.ibm.domino.xsp.module.nsf.NSFComponentModule.doService(NSFComponentModule.java:1251)
com.ibm.domino.xsp.module.nsf.NSFService.doServiceInternal(NSFService.java:598)
com.ibm.domino.xsp.module.nsf.NSFService.doService(NSFService.java:421)
com.ibm.designer.runtime.domino.adapter.LCDEnvironment.doService(LCDEnvironment.java:341)
com.ibm.designer.runtime.domino.adapter.LCDEnvironment.service(LCDEnvironment.java:297)
com.ibm.domino.xsp.bridge.http.engine.XspCmdManager.service(XspCmdManager.java:272)
java.sql.SQLException: Invalid Fetch Size
sun.jdbc.odbc.JdbcOdbcStatement.setFetchSize(JdbcOdbcStatement.java:838)
sun.jdbc.odbc.JdbcOdbcResultSet.setFetchSize(JdbcOdbcResultSet.java:2987)
com.ibm.xsp.extlib.jdbc.model.JdbcDataBlockAccessor.loadBlock(JdbcDataBlockAccessor.java:389)
com.ibm.xsp.extlib.model.DataBlockAccessor.findBlockByIndex(DataBlockAccessor.java:344)
com.ibm.xsp.extlib.model.DataBlockAccessor.prefetchData(DataBlockAccessor.java:292)
com.ibm.xsp.extlib.model.DataAccessorModel.getRowCount(DataAccessorModel.java:93)
com.ibm.xsp.component.UIPager$PagerState.<init>(UIPager.java:76)
com.ibm.xsp.component.UIPager$PagerState.<init>(UIPager.java:71)
com.ibm.xsp.component.UIPager.createPagerState(UIPager.java:540)
com.ibm.xsp.renderkit.html_extended.XPagerRenderer.encodeChildren(XPagerRenderer.java:67)
com.ibm.xsp.renderkit.ReadOnlyAdapterRenderer.encodeChildren(ReadOnlyAdapterRenderer.java:162)
javax.faces.component.UIComponentBase.encodeChildren(UIComponentBase.java:979)
com.ibm.xsp.util.FacesUtil.renderComponent(FacesUtil.java:842)
com.ibm.xsp.renderkit.html_extended.ViewPanelRenderer.processPagerCell(ViewPanelRenderer.java:656)
com.ibm.xsp.renderkit.html_extended.ViewPanelRenderer.encodeBegin(ViewPanelRenderer.java:354)
com.ibm.xsp.renderkit.ReadOnlyAdapterRenderer.encodeBegin(ReadOnlyAdapterRenderer.java:146)
javax.faces.component.UIComponentBase.encodeBegin(UIComponentBase.java:956)
javax.faces.component.UIData.encodeBegin(UIData.java:788)
com.ibm.xsp.component.UIDataEx.encodeBegin(UIDataEx.java:413)
com.ibm.xsp.component.UIViewPanel.encodeBegin(UIViewPanel.java:288)
com.ibm.xsp.util.FacesUtil.renderComponent(FacesUtil.java:840)
com.ibm.xsp.util.FacesUtil.renderComponent(FacesUtil.java:851)
com.ibm.xsp.util.FacesUtil.renderComponent(FacesUtil.java:851)
The myconnection.jdbc file; ext_db is a DSN:
<jdbc>
    <driver>sun.jdbc.odbc.JdbcOdbcDriver</driver>
    <url>jdbc:odbc:ext_db</url>
    <user>xxx</user>
    <password>xxx</password>
</jdbc>
The tongue-in-cheek answer: get a database :-)
The problem seems to be well known and plagues others too, to quite some extent. The batch-size parameter (first link) might work. You could also revisit your requirements. Would DECS do the trick?