Undefined symbol error when testing Netezza UDX

I'm trying to implement a Netezza UDX that uses the R language through Rserve (so I use the Rconnection C++ client library). So far I've written only the bare minimum required for the UDX to run.
It looks as follows:
#define MAIN

// Netezza UDX libraries
#include "udxinc.h"
#include "udxhelpers.h"

// C++ libraries
#include <cstdlib>
#include <cstring>

// external Rserve and R headers
#include "Rconnection.h"
#include "sisocks.h"

using namespace nz::udx::dthelpers;
using namespace nz::udx_ver2;

// extends the UDF base class
class My_UDX : public Udf {
public:
    Rconnection * rc;

    My_UDX(UdxInit *pInit) : Udf(pInit) {
        initsocks();
        // IP address in place of ...
        rc = new Rconnection("...", 6311);
    };

    ~My_UDX() {
        delete rc;
    }

    static Udf* instantiate(UdxInit *pInit);
    virtual ReturnValue evaluate();
};

Udf* My_UDX::instantiate(UdxInit *pInit) {
    return new My_UDX(pInit);
}

// called once for each row of data during execution
ReturnValue My_UDX::evaluate() {
    int32 retval = 15;
    NZ_UDX_RETURN_INT32(retval);
}
The problem arises when I try to test the UDX with a SQL query. My query looks like this:
SELECT * FROM table_name WHERE My_UDX(table_field) = 1;
And this is the error message I get:
Testing...
nzsql -d database_name -u my_username -pw my_password -f sql/test.sql
nzsql:sql/test.sql:1: ERROR: /dev/shm/spuplans/63_1_3.o: undefined symbol: _ZN11RconnectionC1EPKci : loading executable code
make: *** [update] Error 2
If I remove the Rconnection object, the UDX works just fine.
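For reference, demangling the symbol from the error message with c++filt shows that it is the Rconnection constructor itself:
$ c++filt _ZN11RconnectionC1EPKci
Rconnection::Rconnection(char const*, int)
So it looks like the Rconnection code is not making it into the object file that gets loaded on the SPU.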
I've checked various manuals on Netezza UDXs, Rserve, and Rconnection, but found no explanation of this error or of what could be done to resolve it.
I then assumed there might be something wrong with the compilation process, i.e. with the inclusion of the needed libraries and flags. I'm not very familiar with makefiles, so this is quite possible. My makefile looks like this:
# list of files
function_name = My_UDX
object_files = $(function_name).o_x86
spu_files = $(function_name).o_spu10

# database connection
db_call = nzsql -d database_name -u my_username -pw my_password -f

# path to directories
lib_dir = /home/nz/libs
r_connection_dir = /home/nz/Rconnection

all: update

update: compile
	$(db_call) sql/unload.sql
	cp $(object_files) $(lib_dir)
	cp $(spu_files) $(lib_dir)
	rm -f *.o_spu10 *.o_x86
	$(db_call) sql/create.sql
	$(db_call) sql/test.sql

compile:
	nzudxcompile $(function_name).cpp --args -I$(r_connection_dir)
unload.sql just drops the previously loaded function and create.sql creates it again. The problem arises only when test.sql, with the SQL query shown above, is executed.
I've been trying to resolve this issue for quite a long time, so any help would be appreciated!

As Alex mentioned, Rconnection had to be loaded as a shared library or compiled along with the UDF. I solved it by compiling Rconnection together with the UDF:
# for host environment
nzudxcompile --host Rconnection.cpp -o temp_rconnection.o_x86
nzudxcompile --host myudf.cpp -o temp_myudf.o_x86
nzudxcompile --host --objs temp_rconnection.o_x86 --objs temp_myudf.o_x86 -o myudf.o_x86
# for spu environment
nzudxcompile Rconnection.cpp --spu -o temp_rconnection.o_spu10
nzudxcompile myudf.cpp --spu -o temp_myudf.o_spu10
nzudxcompile --spu --objs temp_rconnection.o_spu10 --objs temp_myudf.o_spu10 -o myudf.o_spu10
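If you want to keep the makefile-driven workflow, the compile target from the question could be reworked along the same lines. This is an untested sketch; it reuses the variables from the makefile above (recipe lines must be indented with tabs), and exactly how -o, --objs, and --args combine may need checking against your nzudxcompile version:
compile:
	# host (x86) objects
	nzudxcompile --host Rconnection.cpp -o temp_rconnection.o_x86 --args -I$(r_connection_dir)
	nzudxcompile --host $(function_name).cpp -o temp_myudf.o_x86 --args -I$(r_connection_dir)
	nzudxcompile --host --objs temp_rconnection.o_x86 --objs temp_myudf.o_x86 -o $(object_files)
	# SPU objects
	nzudxcompile --spu Rconnection.cpp -o temp_rconnection.o_spu10 --args -I$(r_connection_dir)
	nzudxcompile --spu $(function_name).cpp -o temp_myudf.o_spu10 --args -I$(r_connection_dir)
	nzudxcompile --spu --objs temp_rconnection.o_spu10 --objs temp_myudf.o_spu10 -o $(spu_files)
	rm -f temp_*.o_x86 temp_*.o_spu10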

Related

How to tell OCaml compiler the path of recently installed module

I already have OCaml installed on my Mac, having run these commands:
$ brew install opam
$ opam init --bare -a -y
$ opam switch create cs3110-2022fa ocaml-base-compiler.4.14.0
Running any OCaml code that only uses standard library modules works fine. But then I want to do a few things that are not covered by it, e.g. reading CSV files.
$ opam install csv
Let's try to compile this code:
open Printf
open Csv
let embedded_csv = "\
\"Banner clickins\"
\"Clickin\",\"Number\",\"Percentage\",
\"brand.adwords\",\"4,878\",\"14.4\"
\"vacation.advert2.adwords\",\"4,454\",\"13.1\"
\"affiliates.generic.tc1\",\"1,608\",\"4.7\"
\"brand.overture\",\"1,576\",\"4.6\"
\"vacation.cheap.adwords\",\"1,515\",\"4.5\"
\"affiliates.generic.vacation.biggestchoice\",\"1,072\",\"3.2\"
\"breaks.no-destination.adwords\",\"1,015\",\"3.0\"
\"fly.no-destination.flightshome.adwords\",\"833\",\"2.5\"
\"exchange.adwords\",\"728\",\"2.1\"
\"holidays.cyprus.cheap\",\"574\",\"1.7\"
\"travel.adwords\",\"416\",\"1.2\"
\"affiliates.vacation.generic.onlinediscount.200\",\"406\",\"1.2\"
\"promo.home.topX.ACE.189\",\"373\",\"1.1\"
\"homepage.hp_tx1b_20050126\",\"369\",\"1.1\"
\"travel.agents.adwords\",\"358\",\"1.1\"
\"promo.home.topX.SSH.366\",\"310\",\"0.9\""
let csvs =
  List.map (fun name -> name, Csv.load name)
    [ "examples/example1.csv"; "examples/example2.csv" ]

let () =
  let ecsv = Csv.input_all (Csv.of_string embedded_csv) in
  printf "---Embedded CSV---------------------------------\n";
  Csv.print_readable ecsv;
  List.iter (
    fun (name, csv) ->
      printf "---%s----------------------------------------\n" name;
      Csv.print_readable csv
  ) csvs;
  printf "Compare (Embedded CSV) example1.csv = %i\n"
    (Csv.compare ecsv (snd (List.hd csvs)))

let () =
  (* Save it to a file *)
  let ecsv = Csv.input_all (Csv.of_string embedded_csv) in
  let fname = Filename.concat (Filename.get_temp_dir_name ()) "example.csv" in
  Csv.save fname ecsv;
  printf "Saved CSV to file %S.\n" fname
The result is:
$ ocamlopt csvdemo.ml -o csvdemo
File "csvdemo.ml", line 2, characters 5-8:
2 | open Csv
^^^
Error: Unbound module Csv
How do I tell the OCaml compiler where to find the Csv module?
This seems like a wonderful place to use Dune, but going a little old-school, you can use ocamlfind to locate the package.
% cat test2.ml
open Csv
let () = print_endline "hello"
% ocamlopt -I `ocamlfind query csv` -o test2 csv.cmxa test2.ml
% ./test2
hello
%
Or alternatively:
ocamlfind ocamlopt -package csv -o test2 csv.cmxa test2.ml
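And if you do go the Dune route, a minimal dune file for this example might look like the following (assuming the source file is named csvdemo.ml and sits next to the dune file):
(executable
 (name csvdemo)
 (libraries csv))
Then dune build ./csvdemo.exe (or dune exec ./csvdemo.exe) takes care of locating the csv package for you.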

How to bind migrations with executable

I have a Go project that uses goose for MySQL migrations. I would like to bundle the migrations into the package executable so that the executable can be deployed and used independently on any system, similar to JAR files in Java projects.
Is there an equivalent in Go to accomplish that?
Here is how to get a single file which can both migrate the database and run as your application.
Install goose:
go get -u github.com/pressly/goose/cmd/goose
Make the app. I base it on the example main.go and add a run option. Suppose your project is located at github.com/user/project:
package main

import (
	"database/sql"
	"flag"
	"log"
	"os"

	"github.com/pressly/goose"

	// Init DB drivers. -- here I recommend removing the unnecessary ones - but it's up to you
	_ "github.com/go-sql-driver/mysql"
	_ "github.com/lib/pq"
	_ "github.com/mattn/go-sqlite3"
	_ "github.com/ziutek/mymysql/godrv"

	// here our migrations will live -- use your path
	_ "github.com/user/project/migrations"
)

var (
	flags = flag.NewFlagSet("goose", flag.ExitOnError)
	dir   = flags.String("dir", ".", "directory with migration files")
)

func main() {
	flags.Usage = usage
	flags.Parse(os.Args[1:])
	args := flags.Args()

	if len(args) > 1 && args[0] == "run" {
		log.Printf("PROGRAM RUN\n")
		// ..... your application logic goes here
		os.Exit(0)
	}

	if len(args) > 1 && args[0] == "create" {
		if err := goose.Run("create", nil, *dir, args[1:]...); err != nil {
			log.Fatalf("goose run: %v", err)
		}
		return
	}

	if len(args) < 3 {
		flags.Usage()
		return
	}

	if args[0] == "-h" || args[0] == "--help" {
		flags.Usage()
		return
	}

	driver, dbstring, command := args[0], args[1], args[2]

	switch driver {
	case "postgres", "mysql", "sqlite3", "redshift":
		if err := goose.SetDialect(driver); err != nil {
			log.Fatal(err)
		}
	default:
		log.Fatalf("%q driver not supported\n", driver)
	}

	switch dbstring {
	case "":
		log.Fatalf("-dbstring=%q not supported\n", dbstring)
	default:
	}

	if driver == "redshift" {
		driver = "postgres"
	}

	db, err := sql.Open(driver, dbstring)
	if err != nil {
		log.Fatalf("-dbstring=%q: %v\n", dbstring, err)
	}

	arguments := []string{}
	if len(args) > 3 {
		arguments = append(arguments, args[3:]...)
	}

	if err := goose.Run(command, db, *dir, arguments...); err != nil {
		log.Fatalf("goose run: %v", err)
	}
}

func usage() {
	log.Print(usagePrefix)
	flags.PrintDefaults()
	log.Print(usageCommands)
}

var (
	usagePrefix = `Usage: goose [OPTIONS] DRIVER DBSTRING COMMAND

Drivers:
    postgres
    mysql
    sqlite3
    redshift

Examples:
    goose sqlite3 ./foo.db status
    goose sqlite3 ./foo.db create init sql
    goose sqlite3 ./foo.db create add_some_column sql
    goose sqlite3 ./foo.db create fetch_user_data go
    goose sqlite3 ./foo.db up

    goose postgres "user=postgres dbname=postgres sslmode=disable" status
    goose mysql "user:password@/dbname?parseTime=true" status
    goose redshift "postgres://user:password@qwerty.us-east-1.redshift.amazonaws.com:5439/db" status

Options:
`

	usageCommands = `
Commands:
    up                   Migrate the DB to the most recent version available
    up-to VERSION        Migrate the DB to a specific VERSION
    down                 Roll back the version by 1
    down-to VERSION      Roll back to a specific VERSION
    redo                 Re-run the latest migration
    status               Dump the migration status for the current DB
    version              Print the current version of the database
    create NAME [sql|go] Creates new migration file with next version
`
)
Create a folder for the migrations:
mkdir migrations && cd migrations
Create the first migration. We will use Go-style migrations:
goose mysql "user:password@/dbname?parseTime=true" create init go
You'll get a file 00001_init.go with Go code. The migrations are baked into it as SQL commands; just edit them as you need.
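For illustration, such a Go migration file looks roughly like the sketch below (the table and columns are made up, and the generated file will use its own function names):
package migrations

import (
	"database/sql"

	"github.com/pressly/goose"
)

func init() {
	goose.AddMigration(upInit, downInit)
}

// upInit is run when the migration is applied.
func upInit(tx *sql.Tx) error {
	_, err := tx.Exec("CREATE TABLE users (id INT AUTO_INCREMENT PRIMARY KEY, name TEXT)")
	return err
}

// downInit is run when the migration is rolled back.
func downInit(tx *sql.Tx) error {
	_, err := tx.Exec("DROP TABLE users")
	return err
}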
Then go to the main folder and build the application:
cd ..
go build -v -o myapp *.go
You'll get a file myapp with all the migrations baked into it. To check, move it somewhere else, for example to the /tmp folder, and run it from there:
./myapp mysql "user:password@/dbname?parseTime=true" status
Run your app:
./myapp run
Result
You have a single file which can be used both as a migration tool and as the working application itself. All the migrations are built in. In the source code they are stored in the migrations subpackage, so they are easy to edit.
If you use Docker you can simply put the folder into the image.
If not, projects like https://github.com/rakyll/statik or https://github.com/jteeuwen/go-bindata can help.
If you're already using Goose, one option would be to write the migrations in Go instead of SQL. Based on the Go migrations example in the Goose repo, when you build the goose binary there, it bundles all the *.go migrations into the binary.
This was the output after I built the example and removed all files except the binary itself. The Go-based migration was embedded:
2017/10/31 11:22:31 Applied At Migration
2017/10/31 11:22:31 =======================================
2017/10/31 11:22:31 Mon Jun 19 21:56:00 2017 -- 00002_rename_root.go
If you're looking to use SQL-based migrations but don't want to take on additional dependencies, you could embed your SQL migrations into *.go files as string constants, then, starting from the go-migrations example, add an init phase to main.go that writes them out to the current directory before proceeding, roughly as sketched below.
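A rough sketch of that write-out step, meant to be dropped into main.go and called at the top of main (the constant's contents and the file name are purely illustrative):
package main

import "io/ioutil"

// An SQL migration embedded as a string constant.
const migration00001 = `-- +goose Up
CREATE TABLE users (id INT PRIMARY KEY);
-- +goose Down
DROP TABLE users;`

// writeMigrations dumps the embedded SQL migrations into the
// current directory so goose can find them at run time.
func writeMigrations() error {
	return ioutil.WriteFile("00001_init.sql", []byte(migration00001), 0644)
}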

AWS.Client raised PROGRAM_ERROR : aws-client.adb:543 finalize/adjust raised exception

I am trying to write a simple Ada (with AWS) program to post data to a server. The following curl command works and returns a valid JSON response after a successful login:
curl -XPOST -d '{"type":"m.login.password", "user":"xxx", "password": "xxxxxxxxxx"}' "https://matrix.org/_matrix/client/r0/login"
My Ada program:
with Ada.Exceptions; use Ada.Exceptions;
with Ada.Text_IO;    use Ada.Text_IO;

with AWS.Client;
with AWS.Communication.Client;
with AWS.MIME;
with AWS.Net;
with AWS.Response;

use AWS;

procedure Communicate is
   Result : Response.Data;
   Data   : String := "{""type"":""m.login.password"", ""user"":""xxx"", ""password"": ""xxxxxxxxxx""}";
begin
   Result := Client.Post
     (URL          => "https://matrix.org/_matrix/client/r0/login",
      Data         => Data,
      Content_Type => AWS.MIME.Application_JSON);
   Put_Line (Response.Message_Body (Result));
end Communicate;
An exception was raised. I can't figure out what is wrong with this code.
$ ./Communicate
raised PROGRAM_ERROR : aws-client.adb:543 finalize/adjust raised exception
To test the code, you can create an account at http://matrix.org and replace the login credentials.
Thanks.
Adrian
After a few minor changes (mostly because I don't like compiler warnings), and an adaptation to the Debian/Jessie version of AWS, I got it to work.
Here's the adapted version:
with Ada.Text_IO; use Ada.Text_IO;

with AWS.Client;
-- with AWS.MIME;
with AWS.Response;

use AWS;

procedure Communicate is
   Result : Response.Data;
   Data   : constant String :=
     "{""type"":""m.login.password"", ""user"":""xxx"", " &
     """password"": ""xxxxxxxxxx""}";
begin
   Result := Client.Post
     (URL          => "https://matrix.org/_matrix/client/r0/login",
      Data         => Data,
      Content_Type => "application/json");
   -- Content_Type => AWS.MIME.Application_JSON);
   Put_Line (Response.Message_Body (Result));
end Communicate;
Here is my project file:
with "aws";

project Communicate is
   for Main use ("communicate");

   package Builder is
      for Default_Switches ("Ada")
        use ("-m");
   end Builder;

   package Compiler is
      for Default_Switches ("Ada")
        use ("-fstack-check",                -- Generate stack checking code (part of Ada)
             "-gnata",                       -- Enable assertions (part of Ada)
             "-gnato13",                     -- Overflow checking (part of Ada)
             "-gnatf",                       -- Full, verbose error messages
             "-gnatwa",                      -- All optional warnings
             "-gnatVa",                      -- All validity checks
             "-gnaty3abcdefhiklmnoOprstux",  -- Style checks
             "-gnatwe",                      -- Treat warnings as errors
             "-gnat2012",                    -- Use Ada 2012
             "-Wall",                        -- All GCC warnings
             "-O2");                         -- Optimise (level 2/3)
   end Compiler;
end Communicate;
I built the program with:
% gprbuild -P communicate
gnatgcc -c -fstack-check -gnata -gnato13 -gnatf -gnatwa -gnatVa -gnaty3abcdefhiklmnoOprstux -gnatwe -gnat2012 -Wall -O2 communicate.adb
gprbind communicate.bexch
gnatbind communicate.ali
gnatgcc -c b__communicate.adb
gnatgcc communicate.o -L/usr/lib/x86_64-linux-gnu -lgnutls -lz -llber -lldap -lpthread -o communicate
%
And then tested with:
% ./communicate
{"errcode":"M_FORBIDDEN","error":"Invalid password"}
%
It looks like the problem is located in your AWS version/installation.
Problem resolved by building AWS with GnuTLS from MacPorts. Apple has deprecated OpenSSL since OS X Lion in favour of CommonCrypto, so modern macOS does not ship with OpenSSL development files. The solution is to download and install OpenSSL or GnuTLS from MacPorts or Homebrew.
Another problem is that Apple introduced SIP (System Integrity Protection) in El Capitan. With SIP enabled, even a user with administrator rights cannot change the contents of /usr/include, /usr/lib, etc.
MacPorts installs to /opt/local, so I made references to /opt/local/include and /opt/local/lib so that AWS can build against either OpenSSL or GnuTLS.
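If it helps, the /opt/local library path can also be handed to the linker straight from a GNAT project file; a minimal sketch (the switch value is just an assumption for a standard MacPorts layout):
package Linker is
   for Default_Switches ("Ada")
     use ("-L/opt/local/lib");   --  assumed MacPorts library location
end Linker;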

nix-shell --command `stack build` leads to libpq-fe.h: No such file or directory

I am trying to compile my small project (a Yesod application with LambdaCms) on NixOS. However, after using cabal2nix (more precisely, cabal2nix project-karma.cabal --sha256=0 --shell > shell.nix), it seems I am still missing a dependency related to PostgreSQL.
My shell.nix file looks like this:
{ nixpkgs ? import <nixpkgs> {}, compiler ? "default" }:
let
inherit (nixpkgs) pkgs;
f = { mkDerivation, aeson, base, bytestring, classy-prelude
, classy-prelude-conduit, classy-prelude-yesod, conduit, containers
, data-default, directory, fast-logger, file-embed, filepath
, hjsmin, hspec, http-conduit, lambdacms-core, monad-control
, monad-logger, persistent, persistent-postgresql
, persistent-template, random, resourcet, safe, shakespeare, stdenv
, template-haskell, text, time, transformers, unordered-containers
, uuid, vector, wai, wai-extra, wai-logger, warp, yaml, yesod
, yesod-auth, yesod-core, yesod-form, yesod-static, yesod-test
}:
mkDerivation {
pname = "karma";
version = "0.0.0";
sha256 = "0";
isLibrary = true;
isExecutable = true;
libraryHaskellDepends = [
aeson base bytestring classy-prelude classy-prelude-conduit
classy-prelude-yesod conduit containers data-default directory
fast-logger file-embed filepath hjsmin http-conduit lambdacms- core
monad-control monad-logger persistent persistent-postgresql
persistent-template random safe shakespeare template-haskell text
time unordered-containers uuid vector wai wai-extra wai-logger warp
yaml yesod yesod-auth yesod-core yesod-form yesod-static
nixpkgs.zlib
nixpkgs.postgresql
nixpkgs.libpqxx
];
libraryPkgconfigDepends = [ persistent-postgresql];
executableHaskellDepends = [ base ];
testHaskellDepends = [
base classy-prelude classy-prelude-yesod hspec monad-logger
persistent persistent-postgresql resourcet shakespeare transformers
yesod yesod-core yesod-test
];
license = stdenv.lib.licenses.bsd3;
};
haskellPackages = if compiler == "default"
then pkgs.haskellPackages
else pkgs.haskell.packages.${compiler};
drv = haskellPackages.callPackage f {};
in
if pkgs.lib.inNixShell then drv.env else drv
The output is as follows:
markus#nixos ~/git/haskell/karma/karma (git)-[master] % nix-shell --command `stack build`
postgresql-libpq-0.9.1.1: configure
ReadArgs-1.2.2: download
postgresql-libpq-0.9.1.1: build
ReadArgs-1.2.2: configure
ReadArgs-1.2.2: build
ReadArgs-1.2.2: install
-- While building package postgresql-libpq-0.9.1.1 using:
/run/user/1000/stack31042/postgresql-libpq-0.9.1.1/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/setup --builddir=.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/ build --ghc-options " -ddump-hi -ddump-to-file"
Process exited with code: ExitFailure 1
Logs have been written to: /home/markus/git/haskell/karma/karma/.stack-work/logs/postgresql-libpq-0.9.1.1.log
[1 of 1] Compiling Main ( /run/user/1000/stack31042/postgresql-libpq-0.9.1.1/Setup.hs, /run/user/1000/stack31042/postgresql-libpq-0.9.1.1/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/Main.o )
Linking /run/user/1000/stack31042/postgresql-libpq-0.9.1.1/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/setup ...
Configuring postgresql-libpq-0.9.1.1...
Building postgresql-libpq-0.9.1.1...
Preprocessing library postgresql-libpq-0.9.1.1...
LibPQ.hsc:213:22: fatal error: libpq-fe.h: No such file or directory
compilation terminated.
compiling .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/PostgreSQL/LibPQ_hsc_make.c failed (exit code 1)
command was: /nix/store/9fbfiij3ajnd3fs1zyc2qy0ispbszrr7-gcc-wrapper-4.9.3/bin/gcc -c .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/PostgreSQL/LibPQ_hsc_make.c -o .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/PostgreSQL/LibPQ_hsc_make.o -fno-stack-protector -D__GLASGOW_HASKELL__=710 -Dlinux_BUILD_OS=1 -Dx86_64_BUILD_ARCH=1 -Dlinux_HOST_OS=1 -Dx86_64_HOST_ARCH=1 -I/run/current-system/sw/include -Icbits -I.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/autogen -include .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/autogen/cabal_macros.h -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/bytes_6elQVSg5cWdFrvRnfxTUrH/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/base_GDytRqRVSUX7zckgKqJjgw/include -I/nix/store/6ykqcjxr74l642kv9gf1ib8v9yjsgxr9-gmp-5.1.3/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/integ_2aU3IZNMF9a7mQ0OzsZ0dS/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/include/
I assume not much is missing, so a pointer would be nice.
What is also weird is that "nix-shell" works, but following that up with "stack exec yesod devel" tells me:
Resolving dependencies...
Configuring karma-0.0.0...
cabal: At least the following dependencies are missing:
classy-prelude >=0.10.2,
classy-prelude-conduit >=0.10.2,
classy-prelude-yesod >=0.10.2,
hjsmin ==0.1.*,
http-conduit ==2.1.*,
lambdacms-core >=0.3.0.2 && <0.4,
monad-logger ==0.3.*,
persistent >=2.0 && <2.3,
persistent-postgresql >=2.1.1 && <2.3,
persistent-template >=2.0 && <2.3,
uuid >=1.3,
wai-extra ==3.0.*,
warp >=3.0 && <3.2,
yesod >=1.4.1 && <1.5,
yesod-auth >=1.4.0 && <1.5,
yesod-core >=1.4.6 && <1.5,
yesod-form >=1.4.0 && <1.5,
yesod-static >=1.4.0.3 && <1.6
When using mysql instead, I am getting
pcre-light-0.4.0.4: configure
mysql-0.1.1.8: configure
mysql-0.1.1.8: build
Progress: 2/59
-- While building package mysql-0.1.1.8 using:
/run/user/1000/stack12820/mysql-0.1.1.8/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/setup --builddir=.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/ build --ghc-options " -ddump-hi -ddump-to-file"
Process exited with code: ExitFailure 1
Logs have been written to: /home/markus/git/haskell/karma/karma/.stack-work/logs/mysql-0.1.1.8.log
[1 of 1] Compiling Main ( /run/user/1000/stack12820/mysql-0.1.1.8/Setup.lhs, /run/user/1000/stack12820/mysql-0.1.1.8/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/Main.o )
Linking /run/user/1000/stack12820/mysql-0.1.1.8/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/setup ...
Configuring mysql-0.1.1.8...
Building mysql-0.1.1.8...
Preprocessing library mysql-0.1.1.8...
In file included from C.hsc:68:0:
include/mysql_signals.h:9:19: fatal error: mysql.h: No such file or directory
#include "mysql.h"
^
compilation terminated.
compiling .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/MySQL/Base/C_hsc_make.c failed (exit code 1)
command was: /nix/store/9fbfiij3ajnd3fs1zyc2qy0ispbszrr7-gcc-wrapper-4.9.3/bin/gcc -c .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/MySQL/Base/C_hsc_make.c -o .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/MySQL/Base/C_hsc_make.o -fno-stack-protector -D__GLASGOW_HASKELL__=710 -Dlinux_BUILD_OS=1 -Dx86_64_BUILD_ARCH=1 -Dlinux_HOST_OS=1 -Dx86_64_HOST_ARCH=1 -I/nix/store/7ppa4k2drrvjk94rb60c1df9nvw0z696-mariadb-10.0.22-lib/include -I/nix/store/7ppa4k2drrvjk94rb60c1df9nvw0z696-mariadb-10.0.22-lib/include/.. -Iinclude -I.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/autogen -include .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/autogen/cabal_macros.h -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/bytes_6elQVSg5cWdFrvRnfxTUrH/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/base_GDytRqRVSUX7zckgKqJjgw/include -I/nix/store/6ykqcjxr74l642kv9gf1ib8v9yjsgxr9-gmp-5.1.3/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/integ_2aU3IZNMF9a7mQ0OzsZ0dS/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/include/
-- While building package pcre-light-0.4.0.4 using:
/home/markus/.stack/setup-exe-cache/setup-Simple-Cabal-1.22.4.0-x86_64-linux-ghc-7.10.2 --builddir=.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/ configure --with-ghc=/run/current-system/sw/bin/ghc --user --package-db=clear --package-db=global --package-db=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/pkgdb/ --libdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/lib --bindir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/bin --datadir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/share --libexecdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/libexec --sysconfdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/etc --docdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/doc/pcre-light-0.4.0.4 --htmldir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/doc/pcre-light-0.4.0.4 --haddockdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/doc/pcre-light-0.4.0.4 --dependency=base=base-4.8.1.0-4f7206fd964c629946bb89db72c80011 --dependency=bytestring=bytestring-0.10.6.0-18c05887c1aaac7adb3350f6a4c6c8ed
Process exited with code: ExitFailure 1
Logs have been written to: /home/markus/git/haskell/karma/karma/.stack-work/logs/pcre-light-0.4.0.4.log
Configuring pcre-light-0.4.0.4...
setup-Simple-Cabal-1.22.4.0-x86_64-linux-ghc-7.10.2: The program 'pkg-config'
version >=0.9.0 is required but it could not be found.
After adding pkgconfig to my global configuration, the build gets a little further, so it seems that shell.nix is being partly ignored.
(Sources for what I tried so far:
https://groups.google.com/forum/#!topic/haskell-stack/_ZBh01VP_fo)
Update: It seems like I overlooked this section of the manual
http://nixos.org/nixpkgs/manual/#using-stack-together-with-nix
However, the first idea that came to mind,
stack --extra-lib-dirs=/nix/store/c6qy7n5wdwl164lnzha7vpc3av9yhnga-postgresql-libpq-0.9.1.1/lib build
did not work; most likely I need to use --extra-include-dirs or try one of the variations. It seems weird that stack is still trying to build postgresql-libpq in the very same version, though.
Update 2: Currently trying out stack --extra-lib-dirs=/nix/store/1xf77x47d0m23nbda0azvkvj8w8y77c7-postgresql-9.4.5/lib --extra-include-dirs=/nix/store/1xf77x47d0m23nbda0azvkvj8w8y77c7-postgresql-9.4.5/include build, which looks promising. It does not look like the Nix way, but still.
Update 3: Still getting
<command line>: can't load .so/.DLL for: /home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/lib/x86_64-linux-ghc-7.10.2/postgresql-libpq-0.9.1.1-ABGs5p1J8FbEwi6uvHaiV6/libHSpostgresql-libpq-0.9.1.1-ABGs5p1J8FbEwi6uvHaiV6-ghc7.10.2.so
(libpq.so.5: cannot open shared object file: No such file or directory)
stack build  186.99s user 2.93s system 109% cpu 2:52.76 total
which is strange since libpq.so.5 is contained in /nix/store/1xf77x47d0m23nbda0azvkvj8w8y77c7-postgresql-9.4.5/lib.
An additional
$LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:/nix/store/1xf77x47d0m23nbda0azvkvj8w8y77c7-postgresql-9.4.5/lib
does not help either.
Update 4:
By the way, yesod devel does the same as stack exec yesod devel. My libraries are downloaded to /nix/store but they are not recognized.
Maybe I need to make "build-nix" work, and yesod devel simply does not work here?
Just for completeness, here is stack.yaml
resolver: nightly-2015-11-17
#run stack setup otherwise!!
# Local packages, usually specified by relative directory name
packages:
- '.'
# Packages to be pulled from upstream that are not in the resolver (e.g., acme-missiles-0.3)
extra-deps: [lambdacms-core-0.3.0.2 , friendly-time-0.4, lists-0.4.2, list-extras-0.4.1.4 ]
# Override default flag values for local packages and extra-deps
flags:
  karma:
    library-only: false
    dev: false
# Extra package databases containing global packages
extra-package-dbs: []
Next weekend, I will check out
https://pr06lefs.wordpress.com/2014/09/27/compiling-a-yesod-project-on-nixos/
and other search results.
Funny, because I've just had a similar problem myself; I solved it by adding these two lines to stack.yaml:
extra-include-dirs: [/nix/store/jrdvjvf0w9nclw7b4k0pdfkljw78ijgk-postgresql-9.4.5/include/]
extra-lib-dirs: [/nix/store/jrdvjvf0w9nclw7b4k0pdfkljw78ijgk-postgresql-9.4.5/lib/]
You may want to check first which postgresql path from /nix/store you should use for include/ and lib/:
nix-build --no-out-link "<nixpkgs>" -A postgresql
And BTW, why do you use nix-shell if you are going to use stack and you have project-karma.cabal available? Have you considered migrating your project with stack init?
It looks like stack is trying to build haskellPackages.postgresql-libpq outside of the Nix framework.
You probably don't want that to happen. Maybe try adding postgresql-libpq to libraryHaskellDepends, as sketched below?
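Concretely, that means adding postgresql-libpq both to the argument list of f and to libraryHaskellDepends in the shell.nix above; a rough sketch of only the relevant lines:
f = { mkDerivation, postgresql-libpq, ... }:   # plus all the existing arguments
    mkDerivation {
      # ...
      libraryHaskellDepends = [
        # existing dependencies, plus:
        postgresql-libpq
      ];
      # ...
    };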

How to trace MySQL queries using mysql-proxy?

I just downloaded mysql-proxy and created this Lua script (found in the MySQL docs):
function read_query(packet)
  if string.byte(packet) == proxy.COM_QUERY then
    print("QUERY: " .. string.sub(packet, 2))
  end
end
This is the command-line I'm using:
mysql-proxy -P localhost:1234 -b localhost:3306 --proxy-lua-script=profile.lua --plugins=proxy
When I run a simple query (like "select * from table1"), this error is reported: "failed: .\lua-scope.c:241: stat(C:...\profile.lua) failed: No error (0)"
Note: if I run mysql-proxy without the Lua script, no error occurs.
Do I need to install something to get mysql-proxy and query tracing working?
My environment is Windows 7 Professional x64.
Sorry for the bad English.
The error you're getting is caused by --proxy-lua-script pointing to a file that mysql-proxy can't find. Either you've typed the name wrong, you've typed the path wrong, or you are expecting it in your CWD and it's not there. Or actually, looking at the entire error a little more closely, it seems possible that mysql-proxy itself sees the file in the CWD just fine, but one of the underlying modules doesn't like it (possibly because mysql-proxy changes the CWD somehow?).
Try saving profile.lua to the root of your C: drive and trying different versions of the option like so:
--proxy-lua-script=c:\profile.lua
--proxy-lua-script=\profile.lua
--proxy-lua-script=/profile.lua
One of those will probably work.
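For example, with the script saved at the root of the C: drive, the full invocation would look something like this (same options as in the question, only the script path changed):
mysql-proxy -P localhost:1234 -b localhost:3306 --plugins=proxy --proxy-lua-script=C:\profile.lua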
A simple query-logging Lua script:
require("mysql.tokenizer")

local fh = io.open("/var/log/mysql/proxy.query.log", "a+")
fh:setvbuf('line', 4096)

local the_query = ""
local seqno = 0

function read_query(packet)
  if string.byte(packet) == proxy.COM_QUERY then
    seqno = seqno + 1
    the_query = (string.gsub(string.gsub(string.sub(packet, 2), "%s%s*", ' '), "^%s*(.-)%s*$", "%1"))
    fh:write(string.format("%s %09d %09d : %s (%s) -- %s\n",
      os.date('%Y-%m-%d %H:%M:%S'),
      proxy.connection.server.thread_id,
      seqno,
      proxy.connection.client.username,
      proxy.connection.client.default_db,
      the_query))
    fh:flush()
    return proxy.PROXY_SEND_QUERY
  else
    query = ""
  end
end