I'm creating an API for POIs and use the POINT type to store the coordinates.
As my company uses CakePHP, I have to write a migration script with Phinx.
But I don't have any idea how to correctly create a column with the POINT type.
Sure, I could just issue a handwritten "ALTER TABLE ..." query, but maybe there is a better way?
Versions:
Cake: 3.4.7
Phinx: 0.6.5
MySQL: 5.7.18
Phinx does not provide an adapter for POINT yet.
You should create your query manually.
See also Unable to seed data with POINT datatype #999
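A handwritten query inside the migration could look like this (just a sketch; the table and column names are placeholders):
public function up()
{
    // fall back to raw SQL since the adapter has no POINT mapping
    $this->execute('ALTER TABLE pois ADD coordinates POINT');
}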
Just use "point" as you would use any other datatype as the second parameter of addColumn().
It's just not documented yet.
Credit for this solution goes to @ndm;
I just think it's worth posting this as an answer instead of as a comment.
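For example (a minimal sketch, assuming a pois table and a coordinates column):
$this->table('pois')
     ->addColumn('coordinates', 'point')
     ->create();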
Looks like Phinx supports point types for quite some time now (the docs are not up to date)... try to use \Phinx\Db\Adapter\AdapterInterface::PHINX_TYPE_POINT as the type
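With the constant, the same call would read like this (assuming the constant is available in the installed Phinx version):
use Phinx\Db\Adapter\AdapterInterface;

$this->table('pois')
     ->addColumn('coordinates', AdapterInterface::PHINX_TYPE_POINT)
     ->create();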
I'm trying to build a Data Integration job that uses pass-through to extract data from a view in a MySQL database.
We've been using pass-through a lot in the project, mostly extracting data from Redshift,
however with MySQL I was not able to make it work properly.
It keeps complaining that a table is missing, even though when pass-through is off the view is found and the data is extracted...
I tried every trick I know, from enabling case-sensitive DBMS object names to manually removing single/double quotes from the statement, just in case MySQL confuses it with something else...
No luck.
ODBC driver is [MySQL][ODBC 5.3(a) Driver][mysqld-5.5.53].
Ran on a Windows environment.
Any idea how to solve this?
Thank you in advance.
EDIT
So, first of all, one correction (even though it's not that important - I extract from a view, not a table).
This is the code generated by the SAS Create Table transformation with pass-through enabled. I only put an asterisk in place of the full list of columns:
proc sql;
connect to ODBC
(
READBUFF=10000 DATASRC="cmp.web_api" AUTHDOMAIN="MYSQL_CMP_Auth"
);
create table work."W7ZZZKOC"n as
select
*
from connection to ODBC
(
select
V_BI_ACCOUNT.ACCOUNT_NAME,
V_BI_ACCOUNT.ACQUISITION_SOURCE__C,
V_BI_ACCOUNT.ZUORA__ACTIVE__C,
V_BI_ACCOUNT.ADDRESS_LINE_1__C,
V_BI_ACCOUNT.ADDRESS_LINE_2__C,
V_BI_ACCOUNT.ADDRESS_LINE_3__C,
V_BI_ACCOUNT.AGREEMENT_DATE,
V_BI_ACCOUNT.AGREEMENT_LEGAL_CLAUSE_1__C,
V_BI_ACCOUNT.AGREEMENT_LEGAL_CLAUSE_2__C,
V_BI_ACCOUNT.PERSONBIRTHDATE,
V_BI_ACCOUNT.BLOCKED_REASON__C,
V_BI_ACCOUNT.BRAND__C,
V_BI_ACCOUNT.CPN__C,
V_BI_ACCOUNT.ACCCREATEDBYID,
V_BI_ACCOUNT.ACCCREATEDDATE,
V_BI_ACCOUNT.CURRENCY_PREFERENCE__C,
V_BI_ACCOUNT.CUSTOMER_FULL_NAME__PC,
V_BI_ACCOUNT.ACCOUNTID,
V_BI_ACCOUNT.ZUORA__CUSTOMERPRIORITY__C,
V_BI_ACCOUNT.DELIVERY_SALUTATION__C,
V_BI_ACCOUNT.DISPLAY_NAME,
V_BI_ACCOUNT.PERSONEMAIL,
V_BI_ACCOUNT.EMAILKEY__C,
V_BI_ACCOUNT.FACEBOOKKEY,
V_BI_ACCOUNT.FIRSTNAME,
V_BI_ACCOUNT.GENDER__C,
V_BI_ACCOUNT.PHONE,
V_BI_ACCOUNT.ACCLASTACTIVITYDATE,
V_BI_ACCOUNT.ACCLASTMODIFIEDDATE,
V_BI_ACCOUNT.LASTNAME,
V_BI_ACCOUNT.OTHER_EMAIL__C,
V_BI_ACCOUNT.PI_TYPE__C,
V_BI_ACCOUNT.ACCPARENTID,
V_BI_ACCOUNT.POSTCODE__C,
V_BI_ACCOUNT.PRIMARY_ACCOUNT_OF_THIS_CUSTOMER,
V_BI_ACCOUNT.ACCPRIMARY__C,
V_BI_ACCOUNT.ACCREASON_FOR_STATUS__C,
V_BI_ACCOUNT.ZUORA__SLA__C,
V_BI_ACCOUNT.ZUORA__SLASERIALNUMBER__C,
V_BI_ACCOUNT.SALUTATION,
V_BI_ACCOUNT.ACCSYSTEMMODSTAMP,
V_BI_ACCOUNT.PERSONTITLE,
V_BI_ACCOUNT.ZUORA__UPSELLOPPORTUNITY__C,
V_BI_ACCOUNT.X_CODE__C,
V_BI_ACCOUNT.ZUORA__ACCOUNT_ID__C,
V_BI_ACCOUNT.ZUORA__PAYMENTMETHODID__C,
V_BI_ACCOUNT.CITY,
V_BI_ACCOUNT.ORIGINAL_CREATED_DATE,
V_BI_ACCOUNT.SOURCE_SYSTEM_ID,
V_BI_ACCOUNT.STATUS,
V_BI_ACCOUNT.ZUORA__CONTACT_ID,
V_BI_ACCOUNT.ACCISDELETED,
V_BI_ACCOUNT.BILLING_ACCOUNT_NAME,
V_BI_ACCOUNT.ACZCREATEDDATE,
V_BI_ACCOUNT.ACZSYSTEMMODSTAMP,
V_BI_ACCOUNT.ACZLASTACTIVITYDATE,
V_BI_ACCOUNT.ZUORA__ACCOUNT__C,
V_BI_ACCOUNT.ZUORA__ACCOUNTNUMBER__C,
V_BI_ACCOUNT.ZUORA__AUTOPAY__C,
V_BI_ACCOUNT.ZUORA__BALANCE__C,
V_BI_ACCOUNT.ZUORA__CREDITCARDEXPIRATION__C,
V_BI_ACCOUNT.ZUORA__CURRENCY__C,
V_BI_ACCOUNT.ZUORA__MRR__C,
V_BI_ACCOUNT.ZUORA__PAYMENTTERM__C,
V_BI_ACCOUNT.ZUORA__PURCHASEORDERNUMBER__C,
V_BI_ACCOUNT.ZUORA__LASTINVOICEDATE__C,
V_BI_ACCOUNT.COUNTRY_NAME,
V_BI_ACCOUNT.COUNTRY_CODE,
V_BI_ACCOUNT.FAVOURITE_FOOTBALL_CLUB,
V_BI_ACCOUNT.COUNTY
from
web_api.V_BI_ACCOUNT as V_BI_ACCOUNT
);
%rcSet(&sqlrc);
disconnect from ODBC;
quit;
And again, when I extract the data without pass-through, it works successfully.
I found out the problem was that a column name exceeds 32 characters.
As SAS only supports column names of up to 32 characters,
the query fails to find PRIMARY_ACCOUNT_OF_THIS_CUSTOMER because the original column name is PRIMARY_ACCOUNT_OF_THIS_CUSTOMER__C.
EDIT
One more thing I found out: MySQL doesn't like the schema name being specified, nor aliases.
Therefore, the FROM clause should only specify the table name, i.e. 'from v_bi_account' rather than 'from web_api.v_bi_account',
and should not use aliases, i.e. 'from v_bi_account' rather than 'from v_bi_account as v_bi_account'.
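For reference, a shortened sketch of how the pass-through query looks after those changes (only a few of the columns are shown, and the alias for the long column name is my own choice):
proc sql;
connect to ODBC
(
READBUFF=10000 DATASRC="cmp.web_api" AUTHDOMAIN="MYSQL_CMP_Auth"
);
create table work."W7ZZZKOC"n as
select
*
from connection to ODBC
(
select
ACCOUNT_NAME,
/* alias the long column so SAS sees a name of 32 characters or less */
PRIMARY_ACCOUNT_OF_THIS_CUSTOMER__C as PRIMARY_ACCT_OF_THIS_CUSTOMER,
COUNTY
from v_bi_account /* no schema prefix, no alias */
);
%rcSet(&sqlrc);
disconnect from ODBC;
quit;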
Thank you guys so much for your help.
So I have a bunch of users in a column that get refreshed as:
Bill#test.comXYZ
Tom#test.comXYZ
John#test.comXYZ
We refresh the database each week and I need to update these appropriate emails to:
Bill#domain.com
Tom#domain.com
John#domain.com
I figured I can use CONCAT to do the latter, but I am stuck on the former issue. Is there a way to split the values (e.g. split Bill#test.comXYZ into Bill and #test.comXYZ, and then remove the #TEXT part)?
Anyways, any help will be much appreciated.
You can use the MySQL REPLACE function, e.g.
UPDATE mytable
SET myfield = REPLACE(myfield, '#test.comXYZ', '#domain.com');
http://dev.mysql.com/doc/refman/5.0/en/string-functions.html#function_replace
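If you want to check the result first, you could preview it with a SELECT before running the UPDATE (same assumed table and column names as above):
SELECT myfield,
       REPLACE(myfield, '#test.comXYZ', '#domain.com') AS new_value
FROM mytable;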
I'm trying to create a text file that contains the value of a custom field I added in Redmine. I tried to get it from an SQL query in the create method of project_controller.rb (at line 80 in Redmine 1.2.0) as follows:
sql = Mysql.new('localhost','root','pass','bitnami_redmine')
rq = sql.query("SELECT value
FROM custom_values
INNER JOIN projects
ON custom_values.customized_id=projects.id
WHERE custom_values.custom_field_id=7
AND projects.name='#{@project.name}'")
rq.each_hash { |h|
File.open('pleasework.txt', 'w') { |myfile|
myfile.write(h['value'])
}
}
sql.close
This works fine if I test it in a separate file (with an existing project name instead of @project.name), so it may be a syntax issue, but I can't find what it is. I'd also be glad to hear about any other solution to get that value.
Thanks !
(there's a very similar post here but none of the solutions actually worked)
First, you could use Project.connection.query instead of your own Mysql instance. Second, I would try to log the SQL with RAILS_DEFAULT_LOGGER.info "SELECT ..." and check whether it's OK... And third, I would use the identifier instead of the name.
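A rough sketch of what that could look like inside the controller (assuming @project is set, keeping the custom field id hard-coded at 7, and using select_all rather than query so the values are easy to read back):
rows = Project.connection.select_all(
  "SELECT value
   FROM custom_values
   INNER JOIN projects ON custom_values.customized_id = projects.id
   WHERE custom_values.custom_field_id = 7
   AND projects.identifier = '#{@project.identifier}'"
)
RAILS_DEFAULT_LOGGER.info "custom field rows: #{rows.inspect}"
File.open('pleasework.txt', 'w') { |f| f.write(rows.first['value']) } unless rows.empty?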
I ended up simply using params["project"]["custom_field_values"]["x"], where x is the custom field's id. I still don't know why the SQL query didn't work, but well, this is much simpler and faster.
I have two columns in MySQL:
"part_no"
"pdf_link"
I need the "pdf_link" column to automatically grab/duplicate the "part_no" value and add a .pdf extension on the end.
For example: If part_no = 00-12345-998, then pdf_link = 00-12345-998.pdf
I need this to happen every time I insert.
I appreciate the help.
Erik
You can achieve this effect by using triggers, I think.
http://dev.mysql.com/doc/refman/5.0/en/trigger-syntax.html
CREATE TRIGGER ins_pdf BEFORE INSERT ON MY_TABLE
FOR EACH ROW
SET NEW.pdf_link = CONCAT(NEW.part_no, '.pdf');
Why store this extra computed information in the database? You can do this in the query when you pull it out, or, if needed, you could make a view that does it only as-needed.
Example pseudo query (my brain hurts right now, so this is only an example):
select concat(`part_no`, '.pdf') as `pdf_link` from `parts`;
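And if you want the view variant, something like this would do it (table and column names as assumed above):
CREATE VIEW parts_with_pdf AS
SELECT part_no,
       CONCAT(part_no, '.pdf') AS pdf_link
FROM parts;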
If you really need this, you could use a trigger to duplicate the data and add the extra string.