Namespace or module not defined error FS0039 [duplicate]

RaceConditionTest.fs:

namespace FConsole

module RaceConditionTest =
    let test x =
        ...

Program.fs:

open System
open FConsole

[<EntryPoint>]
let main argv =
    RaceConditionTest.test 1000
    0 // return an integer exit code
Then I run my console app (on Linux):
$ dotnet run
error FS0039: The namespace or module 'FConsole' is not defined.
There is only one test function in RaceConditionTest.fs.
Is the order of files the problem? If so, how do I indicate the order of *.fs files?

As @boran suggested in the comments, the compilation order is defined in FConsoleProject.fsproj; F# compiles files in the order they are listed, so a file must come before any file that references it. I just added my file before Program.fs:
<ItemGroup>
  <Compile Include="RaceConditionTest.fs" />
  <Compile Include="Program.fs" />
</ItemGroup>

Related

Forge error if using SendStringToExecute (Command ^C^C)

Actually, the error is not raised if I just run (Command ^C^C) from a Lisp script.
The case is this: my app is a .NET app, and I call SendStringToExecute to run some Lisp code.
To make sure the Lisp routine ends, I put this at the end:
doc.SendStringToExecute("(Command ^C^C)", True, False, True)
The result of Forge Design Automation is: failedinstruction
Though I can easily find another way to work around this, it cost me more than a day to figure out that it was (Command ^C^C) causing the failedinstruction result, while everything else was working fine.
I hope this bug gets fixed, and that anything similar won't crop up somewhere else.
I isolated the case like this:
Make a .NET bundle, or just reuse any existing one in debug mode.
Add the following Lisp-defined function (or it can be a custom command, whatever):
<LispFunction("l+SendStringToExecute")>
Public Shared Function lsp_SendStringToExcute(args As ResultBuffer) As Object
    Dim script$ = Nothing
    For Each arg As TypedValue In args.AsArray
        script = arg.Value
        Exit For
    Next
    script = script.Trim()
    If script <> "" Then
        Dim doc As Document =
            AcadApplication.DocumentManager.MdiActiveDocument
        doc.SendStringToExecute(script + vbCr, True, False, True)
    End If
    Return New TypedValue(LispDataType.T_atom)
End Function
Upload the bundle to Forge, create a dump activity, and run just the custom Lisp:
(l+SendStringToExecute "(Command ^C^C)")
The resulting log looks like this:
...
[02/01/2021 17:23:26] Command: (l+SendStringToExecute "(Command ^C^C)")
[02/01/2021 17:23:26] T
[02/01/2021 17:23:26] Command: (Command ^C^C)
[02/01/2021 17:23:26] *Cancel*
[02/01/2021 17:23:26] Command: nil
[02/01/2021 17:23:27] End AutoCAD Core Engine standard output dump.
[02/01/2021 17:23:27] Error: AutoCAD Core Console failed to finish the script - an unexpected input is encountered.
[02/01/2021 17:23:27] End script phase.
[02/01/2021 17:23:27] Error: An unexpected error happened during phase CoreEngineExecution of job.
[02/01/2021 17:23:27] Job finished with result FailedExecution
[02/01/2021 17:23:27] Job Status:
{
"status": "failedInstructions", ...
Thanks for reporting; I'm not sure whether we allow the (Command) expression in accoreconsole.
The suggestion is to use the following approach instead:
[CommandMethod("CANCELOUTSTANDING")]
public void TESTCANCEL()
{
    var doc = Application.DocumentManager.MdiActiveDocument;
    string cmd = string.Format("{0}", new string((char)03, 2));
    doc.SendStringToExecute(cmd, true, false, true);
}
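The trick in the suggested workaround is that it never sends the literal text (Command ^C^C): new string((char)03, 2) builds a string of two raw ETX control characters (ASCII 3, i.e. Ctrl+C), which the AutoCAD command line treats as two cancels. A quick Python illustration of that string construction:

```python
# Equivalent of C#'s `new string((char)03, 2)`:
# a 2-character string of raw ETX (Ctrl+C, ASCII 3) characters
cmd = chr(3) * 2
assert cmd == "\x03\x03"
assert len(cmd) == 2 and all(ord(ch) == 3 for ch in cmd)
```

Sending the control characters directly means the cancel never passes through the Lisp evaluator as a (Command) expression.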

Prevent double compilation of c files in cython

I am writing a wrapper over a C library, and this lib has one file with almost all the functions, say all_funcs.c. This file in turn requires compilation of lots of other C files.
I have created all_funcs.pyx, where I wrapped all the functions, but I also want to create a submodule that has access to the functions from all_funcs.c. What works for now is adding all the C files to both Extensions in setup.py; however, each C file then compiles twice: first for all_funcs.pyx and again for the submodule extension.
Are there any ways to provide common source files to each Extension?
Example of current setup.py:
ext_helpers = Extension(name=SRC_DIR + '.wrapper.utils.helpers',
                        sources=[SRC_DIR + '/wrapper/utils/helpers.pyx'] + source_files_paths,
                        include_dirs=[SRC_DIR + '/include/'])

ext_all_funcs = Extension(name=SRC_DIR + '.wrapper.all_funcs',
                          sources=[SRC_DIR + '/wrapper/all_funcs.pyx'] + source_files_paths,
                          include_dirs=[SRC_DIR + '/include/'])

EXTENSIONS = [
    ext_helpers,
    ext_all_funcs,
]

if __name__ == "__main__":
    setup(
        packages=PACKAGES,
        zip_safe=False,
        name='some_name',
        ext_modules=cythonize(EXTENSIONS, language_level=3)
    )
source_files_paths is the list of common C source files.
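To make the duplication concrete, here is a small sketch (the file names are placeholders, not the real project) that counts how often each source file is handed to the compiler in the layout above: every shared .c file is listed in both Extensions, so it gets compiled once per extension.

```python
from collections import Counter

# placeholder names standing in for the real shared C sources
source_files_paths = ["src/impl/all_funcs.c", "src/impl/util.c"]

# mirrors the setup.py above: both Extensions list the shared sources
ext_sources = [
    ["src/wrapper/utils/helpers.pyx"] + source_files_paths,
    ["src/wrapper/all_funcs.pyx"] + source_files_paths,
]

counts = Counter(src for sources in ext_sources for src in sources)
assert all(counts[src] == 2 for src in source_files_paths)  # each shared .c compiled twice
```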
Note: this answer only explains how to avoid multiple compilation of C/C++ files using the libraries argument of the setup function. It doesn't, however, explain how to avoid possible problems due to ODR violations; for that, see this SO post.
Adding the libraries argument to setup will trigger build_clib prior to the building of ext_modules (when running the setup.py build or setup.py install commands); the resulting static library will also be automatically passed to the linker when the extensions are linked.
For your setup.py, this means:
from setuptools import setup, find_packages, Extension
...

# common C files compiled to a static library:
mylib = ('mylib', {'sources': source_files_paths})  # possible further settings

# no common C files (taken care of in mylib):
ext_helpers = Extension(name=SRC_DIR + '.wrapper.utils.helpers',
                        sources=[SRC_DIR + '/wrapper/utils/helpers.pyx'],
                        include_dirs=[SRC_DIR + '/include/'])

# no common C files (taken care of in mylib):
ext_all_funcs = Extension(name=SRC_DIR + '.wrapper.all_funcs',
                          sources=[SRC_DIR + '/wrapper/all_funcs.pyx'],
                          include_dirs=[SRC_DIR + '/include/'])

EXTENSIONS = [
    ext_helpers,
    ext_all_funcs,
]

if __name__ == "__main__":
    setup(
        packages=find_packages(where=SRC_DIR),
        zip_safe=False,
        name='some_name',
        ext_modules=cythonize(EXTENSIONS, language_level=3),
        # will be built as static libraries and automatically passed to the linker:
        libraries=[mylib]
    )
To build the extensions in place, one should invoke:
python setup.py build_clib build_ext --inplace
as build_ext alone is not enough: the static libraries need to be built before they can be used in the extensions.

BIML Dataflow fails to read Configuration file

Summary
Configuration file lookup works for Execute SQL Task, but fails for Dataflow tasks.
Problem
I have 2 databases:
Source (On-premise SQL Server database)
Destination (Azure SQL Database)
I have 2 packages that I want to create from BIML code.
1) Create Staging (Works fine)
Creates tables in destination database using for each loop and metadata from source database
2) Load Staging (Does not work)
Loads created tables in destination database using for each loop and Dataflow tasks (Source to Destination)
Both of these packages need to use a Package Configuration file that I have created, which stores the Username and Password of the Destination database (Azure database, using SQL Server Authentication).
Using this configuration file works fine for Package 1), but when I try to create the SSIS package using BIML code for Package 2) I get the following error:
Could not execute Query on Connection Dest: SELECT * FROM stg.SalesTaxRate. Login failed for user ''.
I have tried taking the Biml code for Package 1) and adding in a Dataflow task, and that raises the same error. It seems that with an Execute SQL Task the package can find and use the configuration file no problem, but with a Dataflow task it won't find it.
Script for Package 1):
<#@ import namespace="System.Data" #>
<#@ import namespace="System.Data.SqlClient" #>
<#@ template language="C#" tier="2" #>
<#
    string _source_con_string = @"Data Source=YRK-L-101098;Persist Security Info=true;Integrated Security=SSPI;Initial Catalog=AdventureWorks2016";
    string _dest_con_string = @"Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False";
    string _table_name_sql = "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE TABLE'";
    DataTable _table_names = new DataTable();
    SqlDataAdapter _table_name_da = new SqlDataAdapter(_table_name_sql, _source_con_string);
    _table_name_da.Fill(_table_names);
#>
<#+
public string RowConversion(DataRow Row)
{
    string _ret = "[" + Row["COLUMN_NAME"] + "] " + Row["DATA_TYPE"];
    switch (Row["DATA_TYPE"].ToString().ToUpper())
    {
        case "NVARCHAR":
        case "VARCHAR":
        case "NCHAR":
        case "CHAR":
        case "BINARY":
        case "VARBINARY":
            if (Row["CHARACTER_MAXIMUM_LENGTH"].ToString() == "-1")
                _ret += "(max)";
            else
                _ret += "(" + Row["CHARACTER_MAXIMUM_LENGTH"] + ")";
            break;
        case "NUMERIC":
            _ret += "(" + Row["NUMERIC_PRECISION"] + "," + Row["NUMERIC_SCALE"] + ")";
            break;
        case "FLOAT":
            _ret += "(" + Row["NUMERIC_PRECISION"] + ")";
            break;
    }
    return _ret;
}
#>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <OleDbConnection Name="Dest" ConnectionString="Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False" />
    </Connections>
    <Packages>
        <Package Name="005_Create_Staging_Configuration" ConstraintMode="Linear">
            <PackageConfigurations>
                <PackageConfiguration Name="Configuration">
                    <ExternalFileInput ExternalFilePath="C:\VSRepo\BIML\Configurations\AzureConfigEdit.dtsConfig">
                    </ExternalFileInput>
                </PackageConfiguration>
            </PackageConfigurations>
            <Tasks>
                <Container Name="Create Staging Tables" ConstraintMode="Linear">
                    <Tasks>
                        <# foreach(DataRow _table in _table_names.Rows) { #>
                        <ExecuteSQL Name="SQL-S_<#= _table["TABLE_NAME"] #>" ConnectionName="Dest">
                            <DirectInput>
                                IF OBJECT_ID('stg.<#= _table["TABLE_NAME"] #>','U') IS NOT NULL
                                    DROP TABLE stg.<#= _table["TABLE_NAME"] #>;
                                CREATE TABLE stg.<#= _table["TABLE_NAME"] #>
                                (
                                <#
                                    string _col_name_sql = "select COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, NUMERIC_PRECISION, NUMERIC_SCALE from INFORMATION_SCHEMA.COLUMNS where TABLE_SCHEMA='" + _table["TABLE_SCHEMA"] + "' and TABLE_NAME='" + _table["TABLE_NAME"] + "' order by ORDINAL_POSITION ";
                                    DataTable _col_names = new DataTable();
                                    SqlDataAdapter _col_names_da = new SqlDataAdapter(_col_name_sql, _source_con_string);
                                    _col_names_da.Fill(_col_names);
                                    for (int _i = 0; _i < _col_names.Rows.Count; _i++)
                                    {
                                        DataRow _r = _col_names.Rows[_i];
                                        if (_i == 0)
                                            WriteLine(RowConversion(_r));
                                        else
                                            WriteLine(", " + RowConversion(_r));
                                    }
                                #>
                                , append_dt datetime
                                )
                            </DirectInput>
                        </ExecuteSQL>
                        <# } #>
                    </Tasks>
                </Container>
            </Tasks>
        </Package>
    </Packages>
</Biml>
Script for Package 2)
<#@ import namespace="System.Data" #>
<#@ import namespace="System.Data.SqlClient" #>
<#@ template language="C#" tier="2" #>
<#
    string _source_con_string = @"Data Source=YRK-L-101098;Persist Security Info=true;Integrated Security=SSPI;Initial Catalog=AdventureWorks2016";
    string _dest_con_string = @"Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False";
    string _table_name_sql = "select TABLE_SCHEMA, table_name from INFORMATION_SCHEMA.TABLES where TABLE_TYPE='BASE TABLE'";
    DataTable _table_names = new DataTable();
    SqlDataAdapter _table_name_da = new SqlDataAdapter(_table_name_sql, _source_con_string);
    _table_name_da.Fill(_table_names);
#>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <OleDbConnection Name="Source" ConnectionString="Data Source=YRK-L-101098;Provider=SQLNCLI11.1;Persist Security Info=true;Integrated Security=SSPI;Initial Catalog=AdventureWorks2016" />
        <OleDbConnection Name="Dest" ConnectionString="Data Source=mpl.database.windows.net;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False" />
    </Connections>
    <Packages>
        <Package Name="006_Load_Staging_Configuration" ConstraintMode="Linear">
            <PackageConfigurations>
                <PackageConfiguration Name="Configuration">
                    <ExternalFileInput ExternalFilePath="C:\VSRepo\BIML\Configurations\AzureConfigDF.dtsConfig"></ExternalFileInput>
                </PackageConfiguration>
            </PackageConfigurations>
            <Tasks>
                <Container Name="Load Staging Tables" ConstraintMode="Linear">
                    <Tasks>
                        <# foreach(DataRow _table in _table_names.Rows) { #>
                        <Dataflow Name="DFT-S_<#= _table["TABLE_NAME"] #>">
                            <Transformations>
                                <OleDbSource Name="SRC-<#= _table["TABLE_SCHEMA"] #>_<#= _table["TABLE_NAME"] #>" ConnectionName="Source">
                                    <DirectInput>
                                        SELECT *
                                        FROM <#= _table["TABLE_SCHEMA"] #>.<#= _table["TABLE_NAME"] #>
                                    </DirectInput>
                                </OleDbSource>
                                <OleDbDestination Name="DST-<#= _table["TABLE_SCHEMA"] #>_<#= _table["TABLE_NAME"] #>" ConnectionName="Dest">
                                    <ExternalTableOutput Table="stg.<#= _table["TABLE_NAME"] #>"/>
                                </OleDbDestination>
                            </Transformations>
                        </Dataflow>
                        <# } #>
                    </Tasks>
                </Container>
            </Tasks>
        </Package>
    </Packages>
</Biml>
Configuration file:
<?xml version="1.0"?>
<DTSConfiguration>
    <Configuration ConfiguredType="Property" Path="\Package.Connections[Source].Properties[ConnectionString]" ValueType="String">
        <ConfiguredValue>"Data Source=YRK-L-101098;Provider=SQLNCLI11.1;Persist Security Info=true;Integrated Security=SSPI;Initial Catalog=AdventureWorks2016"</ConfiguredValue>
    </Configuration>
    <Configuration ConfiguredType="Property" Path="\Package.Connections[Dest].Properties[ConnectionString]" ValueType="String">
        <ConfiguredValue>Data Source=mpl.database.windows.net;User ID=*****;Initial Catalog=mpldb;Provider=SQLNCLI11.1;Persist Security Info=True;Auto Translate=False</ConfiguredValue>
    </Configuration>
    <Configuration ConfiguredType="Property" Path="\Package.Connections[Dest].Properties[Password]" ValueType="String">
        <ConfiguredValue>******</ConfiguredValue>
    </Configuration>
    <Configuration ConfiguredType="Property" Path="\Package.Connections[Dest].Properties[UserName]" ValueType="String">
        <ConfiguredValue>******</ConfiguredValue>
    </Configuration>
</DTSConfiguration>
Note: the values for 'UserId' and 'Password' are filled with the correct values in the actual script.
Summary
The issue is that you are trying to use package configurations (an SSIS run-time feature) when developing the packages by generating them using Biml (a build-time feature).
What is happening?
Think of it this way. If you were to manually create the SSIS package, during development, you would have to connect to the source and destination databases by specifying user names and passwords. Without connecting to the databases, SSIS will not be able to get the metadata needed. Once the package has been developed and all the metadata has been mapped, you can use package configurations. When you open or execute a package with package configurations, all hardcoded values will be replaced by configuration values. This value replacement is the SSIS run-time feature.
Now, compare that to using Biml instead of manually creating the SSIS package: When generating the packages, you are expecting the Biml engine to get the user names and passwords from the package configuration files. Since this is an SSIS run-time feature, Biml is unable to get that data, and will use the connection strings specified in the BimlScript. Since these connection strings don't specify the user names and passwords, Biml will not be able to connect and you get the error Login failed for user ''. (This is similar to creating a connection manager in SSIS without providing a user name and password and getting an error when clicking "Test Connection".)
But it works for the Execute SQL task?
It may look like it, but it doesn't. The Execute SQL task basically just gets ignored. The SQL code in Execute SQL tasks is not checked or validated by SSIS or the Biml engine until the package is executed. You can type anything in there, and SSIS will be happy until you try to execute invalid code, at which point it will give you an error. Because this code is not validated by SSIS during development, it is not validated by Biml during package generation. The package gets generated successfully, and then when you open it, the package configurations will be applied, and you will not see any errors.
An OLE DB Destination, however, is validated by both SSIS and the Biml engine during development. They both need to connect to the database to get the metadata. This is why you get an error on this file only.
Solution
Package Configurations are an SSIS run-time feature only. You cannot use them to pass connection strings, user names, or passwords to the Biml engine. You can either hardcode the connection strings in your BimlScript or store them in an external metadata repository, but you will need to provide the user names and passwords to the Biml engine during package generation.

LuaLaTeX with LuaSQL on Archlinux

I am working on a LuaLaTeX file which takes data from my database using LuaSQL. These are the 003-v1.tex and 003-v1.lua files that I came up with:
003-v1.tex file:
\documentclass{article}
\usepackage{luacode}
% We put the Lua code in a separate file for the sake of syntax highlighting
\directlua{dofile('003-v1.lua')}
\newcommand{\stranke}{\luadirect{stranke()}}
\begin{document}
\begin{tabular}{ll}
\hline
id stranke & ime \\
\hline
\stranke
\hline
\end{tabular}
\end{document}
003-v1.lua file:
function stranke ()
    package.cpath = package.cpath .. ";/usr/lib/i386-linux-gnu/lua/5.1/?.so"
    luasql = require "luasql.mysql"
    env = assert (luasql.mysql())
    con = assert (env:connect("linux_krozki","root","mypassword"))
    cur = assert (con:execute("SELECT * FROM stranke"))
    vnos = cur:fetch ({}, "a")
    while vnos do
        print(
            string.format([[%s & %s \\]], vnos.id_stranke, vnos.ime)
        )
        vnos = cur:fetch (vnos, "a")
    end
end
These files ought to work, but when I try to compile with lualatex 003-v1.tex I get an error:
This is LuaTeX, Version 1.0.4 (TeX Live 2017/Arch Linux)
restricted system commands enabled.
(./003-v1.tex
LaTeX2e <2017-04-15>
(using write cache: /home/ziga/.texlive/texmf-var/luatex-cache/generic)(using r
ead cache: /var/lib/texmf/luatex-cache/generic /home/ziga/.texlive/texmf-var/lu
atex-cache/generic)
luaotfload | main : initialization completed in 0.144 seconds
Babel <3.12> and hyphenation patterns for 1 language(s) loaded.
(/usr/share/texmf-dist/tex/latex/base/article.cls
Document Class: article 2014/09/29 v1.4h Standard LaTeX document class
(/usr/share/texmf-dist/tex/latex/base/size10.clo(load luc: /home/ziga/.texlive/
texmf-var/luatex-cache/generic/fonts/otl/lmroman10-regular.luc)))
(/usr/share/texmf-dist/tex/lualatex/luacode/luacode.sty
(/usr/share/texmf-dist/tex/generic/oberdiek/ifluatex.sty)
(/usr/share/texmf-dist/tex/luatex/luatexbase/luatexbase.sty
(/usr/share/texmf-dist/tex/luatex/ctablestack/ctablestack.sty))) (./003-v1.aux)
003-v1.lua:8: module 'luasql.mysql' not found:
no field package.preload['luasql.mysql']
[kpse lua searcher] file not found: 'luasql.mysql'
[kpse C searcher] file not found: 'luasql.mysql'
no file '/usr/local/lib/lua/5.2/luasql.so'
no file '/usr/local/lib/lua/5.2/loadall.so'
no file './luasql.so'
no file '/usr/lib/i386-linux-gnu/lua/5.1/luasql.so'
stack traceback:
[C]: in function 'require'
003-v1.lua:8: in function 'stranke'
[\directlua]:1: in main chunk.
\luadirect ... { \luacode#maybe#printdbg {#1} #1 }
l.14 \stranke
According to this topic, this error arises because LuaLaTeX can't load the module luasql.mysql, while standalone Lua can. How do I know this? If I comment out the first line (function stranke ()) and the last line (end) of 003-v1.lua and then run lua 003-v1.lua, I get output which is completely fine:
1 & Žiga \\
2 & Ranja \\
3 & Romana \\
So my question is: how do I make sure that the module luasql.mysql loads when LuaLaTeX is called? I am on Arch Linux and am using TeX Live. I have heard that people recompile TeX Live with support for LuaSQL, but I can't find a step-by-step guide... That would be awesome! It would be even better if someone has already compiled it.
Here is the info about my TeX Live version:
[ziga#laptop ~]$ pacman -Qs tex | grep live
local/texlive-bibtexextra 2017.44915-1 (texlive-most)
local/texlive-bin 2017.44590-2
local/texlive-core 2017.44918-1 (texlive-most)
local/texlive-fontsextra 2017.44818-1 (texlive-most)
local/texlive-formatsextra 2017.44177-2 (texlive-most)
local/texlive-games 2017.44131-1 (texlive-most)
local/texlive-humanities 2017.44833-1 (texlive-most)
local/texlive-langchinese 2017.44333-1 (texlive-lang)
local/texlive-langcyrillic 2017.44895-1 (texlive-lang)
local/texlive-langextra 2017.44908-1 (texlive-lang)
local/texlive-langgreek 2017.44917-1 (texlive-lang)
local/texlive-langjapanese 2017.44914-1 (texlive-lang)
local/texlive-langkorean 2017.44467-1 (texlive-lang)
local/texlive-latexextra 2017.44907-1 (texlive-most)
local/texlive-music 2017.44885-1 (texlive-most)
local/texlive-pictures 2017.44899-1 (texlive-most)
local/texlive-pstricks 2017.44742-1 (texlive-most)
local/texlive-publishers 2017.44916-1 (texlive-most)
local/texlive-science 2017.44906-1 (texlive-most)
We found an answer on the Arch Linux forums after this thread was posted. It looks like there are some internal problems with the Lua language: package.cpath wasn't able to recognize the question mark, so ?.so had to be changed into mysql.so. Can anyone explain why this happened?
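One hedged explanation, based on how stock Lua resolves C modules (LuaTeX layers its own kpse searchers on top, so this may not be the whole story): each ? in a package.cpath template is replaced by the module name with dots turned into directory separators, so the ?.so template points at a luasql/ subdirectory, while the hardcoded mysql.so entry points at one flat file. A rough Python model of that substitution:

```python
def lua_c_search_candidates(cpath, modname):
    """Mimic stock Lua's C-loader path substitution: dots in the module
    name become '/', then each '?' in a cpath template is replaced."""
    name = modname.replace(".", "/")
    return [template.replace("?", name) for template in cpath.split(";")]

candidates = lua_c_search_candidates(
    "/usr/lib/i386-linux-gnu/lua/5.1/?.so", "luasql.mysql"
)
assert candidates == ["/usr/lib/i386-linux-gnu/lua/5.1/luasql/mysql.so"]
```

Note that the error log above only shows attempts for luasql.so (the root name), never luasql/mysql.so, which hints that LuaTeX's replacement searchers skipped this first-pass substitution rather than package.cpath itself being broken.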

nix-shell --command `stack build` leads to libpq-fe.h: No such file or directory

I am trying to compile my small project (a Yesod application with LambdaCms) on NixOS. However, after using cabal2nix (more precisely, cabal2nix project-karma.cabal --sha256=0 --shell > shell.nix), it seems I am still missing a dependency with respect to PostgreSQL.
My shell.nix file looks like this:
{ nixpkgs ? import <nixpkgs> {}, compiler ? "default" }:
let
  inherit (nixpkgs) pkgs;
  f = { mkDerivation, aeson, base, bytestring, classy-prelude
      , classy-prelude-conduit, classy-prelude-yesod, conduit, containers
      , data-default, directory, fast-logger, file-embed, filepath
      , hjsmin, hspec, http-conduit, lambdacms-core, monad-control
      , monad-logger, persistent, persistent-postgresql
      , persistent-template, random, resourcet, safe, shakespeare, stdenv
      , template-haskell, text, time, transformers, unordered-containers
      , uuid, vector, wai, wai-extra, wai-logger, warp, yaml, yesod
      , yesod-auth, yesod-core, yesod-form, yesod-static, yesod-test
      }:
      mkDerivation {
        pname = "karma";
        version = "0.0.0";
        sha256 = "0";
        isLibrary = true;
        isExecutable = true;
        libraryHaskellDepends = [
          aeson base bytestring classy-prelude classy-prelude-conduit
          classy-prelude-yesod conduit containers data-default directory
          fast-logger file-embed filepath hjsmin http-conduit lambdacms-core
          monad-control monad-logger persistent persistent-postgresql
          persistent-template random safe shakespeare template-haskell text
          time unordered-containers uuid vector wai wai-extra wai-logger warp
          yaml yesod yesod-auth yesod-core yesod-form yesod-static
          nixpkgs.zlib
          nixpkgs.postgresql
          nixpkgs.libpqxx
        ];
        libraryPkgconfigDepends = [ persistent-postgresql ];
        executableHaskellDepends = [ base ];
        testHaskellDepends = [
          base classy-prelude classy-prelude-yesod hspec monad-logger
          persistent persistent-postgresql resourcet shakespeare transformers
          yesod yesod-core yesod-test
        ];
        license = stdenv.lib.licenses.bsd3;
      };
  haskellPackages = if compiler == "default"
                       then pkgs.haskellPackages
                       else pkgs.haskell.packages.${compiler};
  drv = haskellPackages.callPackage f {};
in
  if pkgs.lib.inNixShell then drv.env else drv
The output is as follows:
markus@nixos ~/git/haskell/karma/karma (git)-[master] % nix-shell --command `stack build`
postgresql-libpq-0.9.1.1: configure
ReadArgs-1.2.2: download
postgresql-libpq-0.9.1.1: build
ReadArgs-1.2.2: configure
ReadArgs-1.2.2: build
ReadArgs-1.2.2: install
-- While building package postgresql-libpq-0.9.1.1 using:
/run/user/1000/stack31042/postgresql-libpq-0.9.1.1/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/setup --builddir=.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/ build --ghc-options " -ddump-hi -ddump-to-file"
Process exited with code: ExitFailure 1
Logs have been written to: /home/markus/git/haskell/karma/karma/.stack-work/logs/postgresql-libpq-0.9.1.1.log
[1 of 1] Compiling Main ( /run/user/1000/stack31042/postgresql-libpq-0.9.1.1/Setup.hs, /run/user/1000/stack31042/postgresql-libpq-0.9.1.1/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/Main.o )
Linking /run/user/1000/stack31042/postgresql-libpq-0.9.1.1/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/setup ...
Configuring postgresql-libpq-0.9.1.1...
Building postgresql-libpq-0.9.1.1...
Preprocessing library postgresql-libpq-0.9.1.1...
LibPQ.hsc:213:22: fatal error: libpq-fe.h: No such file or directory
compilation terminated.
compiling .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/PostgreSQL/LibPQ_hsc_make.c failed (exit code 1)
command was: /nix/store/9fbfiij3ajnd3fs1zyc2qy0ispbszrr7-gcc-wrapper-4.9.3/bin/gcc -c .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/PostgreSQL/LibPQ_hsc_make.c -o .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/PostgreSQL/LibPQ_hsc_make.o -fno-stack-protector -D__GLASGOW_HASKELL__=710 -Dlinux_BUILD_OS=1 -Dx86_64_BUILD_ARCH=1 -Dlinux_HOST_OS=1 -Dx86_64_HOST_ARCH=1 -I/run/current-system/sw/include -Icbits -I.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/autogen -include .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/autogen/cabal_macros.h -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/bytes_6elQVSg5cWdFrvRnfxTUrH/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/base_GDytRqRVSUX7zckgKqJjgw/include -I/nix/store/6ykqcjxr74l642kv9gf1ib8v9yjsgxr9-gmp-5.1.3/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/integ_2aU3IZNMF9a7mQ0OzsZ0dS/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/include/
I assume not much is missing, so a pointer would be nice.
What is also weird is that nix-shell works, but following that up with stack exec yesod devel tells me:
Resolving dependencies...
Configuring karma-0.0.0...
cabal: At least the following dependencies are missing:
classy-prelude >=0.10.2,
classy-prelude-conduit >=0.10.2,
classy-prelude-yesod >=0.10.2,
hjsmin ==0.1.*,
http-conduit ==2.1.*,
lambdacms-core >=0.3.0.2 && <0.4,
monad-logger ==0.3.*,
persistent >=2.0 && <2.3,
persistent-postgresql >=2.1.1 && <2.3,
persistent-template >=2.0 && <2.3,
uuid >=1.3,
wai-extra ==3.0.*,
warp >=3.0 && <3.2,
yesod >=1.4.1 && <1.5,
yesod-auth >=1.4.0 && <1.5,
yesod-core >=1.4.6 && <1.5,
yesod-form >=1.4.0 && <1.5,
yesod-static >=1.4.0.3 && <1.6
When using MySQL instead, I am getting:
pcre-light-0.4.0.4: configure
mysql-0.1.1.8: configure
mysql-0.1.1.8: build
Progress: 2/59
-- While building package mysql-0.1.1.8 using:
/run/user/1000/stack12820/mysql-0.1.1.8/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/setup --builddir=.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/ build --ghc-options " -ddump-hi -ddump-to-file"
Process exited with code: ExitFailure 1
Logs have been written to: /home/markus/git/haskell/karma/karma/.stack-work/logs/mysql-0.1.1.8.log
[1 of 1] Compiling Main ( /run/user/1000/stack12820/mysql-0.1.1.8/Setup.lhs, /run/user/1000/stack12820/mysql-0.1.1.8/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/Main.o )
Linking /run/user/1000/stack12820/mysql-0.1.1.8/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/setup ...
Configuring mysql-0.1.1.8...
Building mysql-0.1.1.8...
Preprocessing library mysql-0.1.1.8...
In file included from C.hsc:68:0:
include/mysql_signals.h:9:19: fatal error: mysql.h: No such file or directory
#include "mysql.h"
^
compilation terminated.
compiling .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/MySQL/Base/C_hsc_make.c failed (exit code 1)
command was: /nix/store/9fbfiij3ajnd3fs1zyc2qy0ispbszrr7-gcc-wrapper-4.9.3/bin/gcc -c .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/MySQL/Base/C_hsc_make.c -o .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/MySQL/Base/C_hsc_make.o -fno-stack-protector -D__GLASGOW_HASKELL__=710 -Dlinux_BUILD_OS=1 -Dx86_64_BUILD_ARCH=1 -Dlinux_HOST_OS=1 -Dx86_64_HOST_ARCH=1 -I/nix/store/7ppa4k2drrvjk94rb60c1df9nvw0z696-mariadb-10.0.22-lib/include -I/nix/store/7ppa4k2drrvjk94rb60c1df9nvw0z696-mariadb-10.0.22-lib/include/.. -Iinclude -I.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/autogen -include .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/autogen/cabal_macros.h -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/bytes_6elQVSg5cWdFrvRnfxTUrH/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/base_GDytRqRVSUX7zckgKqJjgw/include -I/nix/store/6ykqcjxr74l642kv9gf1ib8v9yjsgxr9-gmp-5.1.3/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/integ_2aU3IZNMF9a7mQ0OzsZ0dS/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/include/
-- While building package pcre-light-0.4.0.4 using:
/home/markus/.stack/setup-exe-cache/setup-Simple-Cabal-1.22.4.0-x86_64-linux-ghc-7.10.2 --builddir=.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/ configure --with-ghc=/run/current-system/sw/bin/ghc --user --package-db=clear --package-db=global --package-db=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/pkgdb/ --libdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/lib --bindir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/bin --datadir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/share --libexecdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/libexec --sysconfdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/etc --docdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/doc/pcre-light-0.4.0.4 --htmldir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/doc/pcre-light-0.4.0.4 --haddockdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/doc/pcre-light-0.4.0.4 --dependency=base=base-4.8.1.0-4f7206fd964c629946bb89db72c80011 --dependency=bytestring=bytestring-0.10.6.0-18c05887c1aaac7adb3350f6a4c6c8ed
Process exited with code: ExitFailure 1
Logs have been written to: /home/markus/git/haskell/karma/karma/.stack-work/logs/pcre-light-0.4.0.4.log
Configuring pcre-light-0.4.0.4...
setup-Simple-Cabal-1.22.4.0-x86_64-linux-ghc-7.10.2: The program 'pkg-config'
version >=0.9.0 is required but it could not be found.
After adding pkgconfig to my global configuration, the build seems to get a little further, so it seems that shell.nix is being partially ignored.
(Sources for what I tried so far:
https://groups.google.com/forum/#!topic/haskell-stack/_ZBh01VP_fo)
Update: It seems like I overlooked this section of the manual
http://nixos.org/nixpkgs/manual/#using-stack-together-with-nix
However, the first idea that came to mind,
stack --extra-lib-dirs=/nix/store/c6qy7n5wdwl164lnzha7vpc3av9yhnga-postgresql-libpq-0.9.1.1/lib build
did not work yet; most likely I need to use --extra-include-dirs or try one of the variations. It seems weird that stack is still trying to build postgresql-libpq in the very same version, though.
Update2: Currently trying out "stack --extra-lib-dirs=/nix/store/1xf77x47d0m23nbda0azvkvj8w8y77c7-postgresql-9.4.5/lib --extra-include-dirs=/nix/store/1xf77x47d0m23nbda0azvkvj8w8y77c7-postgresql-9.4.5/include build" which looks promising. Does not look like the nix-way, but still.
Update3: Still getting
<command line>: can't load .so/.DLL for: /home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/lib/x86_64-linux-ghc-7.10.2/postgresql-libpq-0.9.1.1-ABGs5p1J8FbEwi6uvHaiV6/libHSpostgresql-libpq-0.9.1.1-ABGs5p1J8FbEwi6uvHaiV6-ghc7.10.2.so
(libpq.so.5: cannot open shared object file: No such file or directory)
stack build 186.99s user 2.93s system 109% cpu 2:52.76 total
which is strange since libpq.so.5 is contained in /nix/store/1xf77x47d0m23nbda0azvkvj8w8y77c7-postgresql-9.4.5/lib.
An additional
export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:/nix/store/1xf77x47d0m23nbda0azvkvj8w8y77c7-postgresql-9.4.5/lib
does not help either.
Update4:
By the way, yesod devel does the same as stack exec yesod devel. My libraries are downloaded to /nix/store, but they are not recognized.
Maybe I need to make "build-nix" work, and yesod devel just does not work here?
Just for completeness, here is stack.yaml
resolver: nightly-2015-11-17
# run stack setup otherwise!!

# Local packages, usually specified by relative directory name
packages:
- '.'

# Packages to be pulled from upstream that are not in the resolver (e.g., acme-missiles-0.3)
extra-deps: [lambdacms-core-0.3.0.2, friendly-time-0.4, lists-0.4.2, list-extras-0.4.1.4]

# Override default flag values for local packages and extra-deps
flags:
  karma:
    library-only: false
    dev: false

# Extra package databases containing global packages
extra-package-dbs: []
Next weekend, I will check out
https://pr06lefs.wordpress.com/2014/09/27/compiling-a-yesod-project-on-nixos/
and other search results.
Funny, because I've just had a similar problem myself - solved it by adding these two lines to stack.yaml:
extra-include-dirs: [/nix/store/jrdvjvf0w9nclw7b4k0pdfkljw78ijgk-postgresql-9.4.5/include/]
extra-lib-dirs: [/nix/store/jrdvjvf0w9nclw7b4k0pdfkljw78ijgk-postgresql-9.4.5/lib/]
You may want to check first which PostgreSQL path from /nix/store you should use for include/ and lib/:
nix-build --no-out-link "<nixpkgs>" -A postgresql
And BTW, why do you use nix-shell if you are going to use stack and you have project-karma.cabal available? Have you considered migrating your project with stack init?
Looks like stack is trying to build haskellPackages.postgresql-libpq outside of the nix framework.
You probably don't want that to happen. Maybe try to add postgresql-libpq to libraryHaskellDepends?