java.lang.NoSuchMethodError: No such DSL method 'readJSON' - json
def data = readJSON text: """{"rel": {"configVersion": "1.0", "manifest": "${manifestURL}"}}"""
writeJSON file: 'C:\\Users\\Public\\json\\config.json', json: data
I am using the readJSON step in my Jenkins pipeline and getting a NoSuchMethodError. I am using Jenkins 2.85.
Any idea how to resolve this issue?
java.lang.NoSuchMethodError: No such DSL method 'readJSON' found among steps
[archive, bat, build, catchError, checkout, deleteDir, dir,
dockerFingerprintFrom, dockerFingerprintRun, echo, emailext,
emailextrecipients, envVarsForTool, error, fileExists, getContext, git,
input, isUnix, library, libraryResource, load, mail, milestone, node,
parallel, powershell, properties, pwd, readFile, readTrusted, resolveScm,
retry, script, sh, sleep, stage, stash, step, svn, timeout, timestamps, tm,
tool, unarchive, unstash, validateDeclarativePipeline, waitUntil,
withContext, withCredentials, withDockerContainer, withDockerRegistry,
withDockerServer, withEnv, wrap, writeFile, ws] or symbols [all, allOf,
always, ant, antFromApache, antOutcome, antTarget, any, anyOf, apiToken,
architecture, archiveArtifacts, artifactManager, authorizationMatrix,
batchFile, booleanParam, branch, buildButton, buildDiscarder,
caseInsensitive, caseSensitive, certificate, changelog, changeset, choice,
choiceParam, cleanWs, clock, cloud, command, credentials, cron, crumb,
defaultView, demand, disableConcurrentBuilds, docker, dockerCert,
dockerfile, downloadSettings, downstream, dumb, envVars, environment,
expression, file, fileParam, filePath, fingerprint, frameOptions, freeStyle,
freeStyleJob, fromScm, fromSource, git, github, githubPush, gradle,
headRegexFilter, headWildcardFilter, hyperlink, hyperlinkToModels,
inheriting, inheritingGlobal, installSource, jdk, jdkInstaller, jgit,
jgitapache, jnlp, jobName, junit, label, lastDuration, lastFailure,
lastGrantedAuthorities, lastStable, lastSuccess, legacy, legacySCM, list,
local, location, logRotator, loggedInUsersCanDoAnything, masterBuild, maven,
maven3Mojos, mavenErrors, mavenMojos, mavenWarnings, modernSCM, myView,
node, nodeProperties, nonInheriting,
The readJSON step is provided by the Pipeline Utility Steps plugin:

def props = readJSON text: '{ "key": "value" }'

You cannot use this step without that plugin installed. For more info, check the plugin's Steps documentation.
Install the Pipeline Utility Steps plugin.
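With the plugin installed, a minimal scripted-pipeline sketch (the JSON content and file name here are illustrative, not from the original question):

```groovy
node {
    // readJSON and writeJSON come from the Pipeline Utility Steps plugin
    def data = readJSON text: '{"rel": {"configVersion": "1.0"}}'
    echo "configVersion = ${data.rel.configVersion}"
    writeJSON file: 'config.json', json: data
}
```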
And in my case it was just a stupid typo: readJson instead of readJSON.
How to define config file variables?
I have a configuration file with: {path, "/mnt/test/"}. {name, "Joe"}. The path and the name can be changed by a user. As I understand it, the variables can be stored in a module by using file:consult/1 inside -define(VARIABLE, <parsing of the config file>). Is there a better way to read a config file when the module starts, without writing a parsing function inside -define? (As far as I know, Erlang developers advise against putting complicated functions in -define.)
If you need to store config only when you start the application, you may use an application config file, which is referenced from rebar.config:

{profiles, [
    {local, [
        {relx, [
            {dev_mode, false},
            {include_erts, true},
            {include_src, false},
            {vm_args, "config/local/vm.args"},
            {sys_config, "config/local/yourapplication.config"}
        ]}
    ]}
]}.

More info about this here: rebar3 configuration. The next step is to create yourapplication.config and store it in your application folder, e.g. /app/config/local/yourapplication.config. This configuration should have a structure like this example:

[
    {yourapplicationname, [
        {path, "/mnt/test/"},
        {name, "Joe"}
    ]}
].

So when your application is started, you can get the config data with:

{ok, "/mnt/test/"} = application:get_env(yourapplicationname, path),
{ok, "Joe"} = application:get_env(yourapplicationname, name),

and now you may -define these variables like:

-define(VARIABLE,
    case application:get_env(yourapplicationname, path) of
        {ok, Data} -> Data;
        _ -> undefined
    end).
AWS.Client raised PROGRAM_ERROR : aws-client.adb:543 finalize/adjust raised exception
I am trying to write a simple Ada (with AWS) program to post data to a server. The curl command works and returns a valid JSON response after a successful login:

curl -XPOST -d '{"type":"m.login.password", "user":"xxx", "password": "xxxxxxxxxx"}' "https://matrix.org/_matrix/client/r0/login"

My Ada program:

with Ada.Exceptions; use Ada.Exceptions;
with Ada.Text_IO;    use Ada.Text_IO;
with AWS.Client;
with AWS.Communication.Client;
with AWS.MIME;
with AWS.Net;
with AWS.Response;
use AWS;

procedure Communicate is
   Result : Response.Data;
   Data   : String :=
     "{""type"":""m.login.password"", ""user"":""xxx"", ""password"": ""xxxxxxxxxx""}";
begin
   Result := Client.Post
     (URL          => "https://matrix.org/_matrix/client/r0/login",
      Data         => Data,
      Content_Type => AWS.MIME.Application_JSON);
   Put_Line (Response.Message_Body (Result));
end Communicate;

An exception is raised, and I can't figure out what is wrong with this code:

$ ./Communicate
raised PROGRAM_ERROR : aws-client.adb:543 finalize/adjust raised exception

To test the code, you can create an account at http://matrix.org and replace the login credentials. Thanks. Adrian
After a few minor changes (mostly because I don't like compiler warnings) and an adaptation to the Debian/Jessie version of AWS, I got it to work. Here's the adapted version:

with Ada.Text_IO; use Ada.Text_IO;
with AWS.Client;
-- with AWS.MIME;
with AWS.Response;
use AWS;

procedure Communicate is
   Result : Response.Data;
   Data   : constant String :=
     "{""type"":""m.login.password"", ""user"":""xxx"", " &
     """password"": ""xxxxxxxxxx""}";
begin
   Result := Client.Post
     (URL          => "https://matrix.org/_matrix/client/r0/login",
      Data         => Data,
      Content_Type => "application/json");
      -- Content_Type => AWS.MIME.Application_JSON);
   Put_Line (Response.Message_Body (Result));
end Communicate;

Here is my project file:

with "aws";

project Communicate is
   for Main use ("communicate");

   package Builder is
      for Default_Switches ("Ada") use ("-m");
   end Builder;

   package Compiler is
      for Default_Switches ("Ada") use
        ("-fstack-check",  -- Generate stack checking code (part of Ada)
         "-gnata",         -- Enable assertions (part of Ada)
         "-gnato13",       -- Overflow checking (part of Ada)
         "-gnatf",         -- Full, verbose error messages
         "-gnatwa",        -- All optional warnings
         "-gnatVa",        -- All validity checks
         "-gnaty3abcdefhiklmnoOprstux",  -- Style checks
         "-gnatwe",        -- Treat warnings as errors
         "-gnat2012",      -- Use Ada 2012
         "-Wall",          -- All GCC warnings
         "-O2");           -- Optimise (level 2/3)
   end Compiler;
end Communicate;

I built the program with:

% gprbuild -P communicate
gnatgcc -c -fstack-check -gnata -gnato13 -gnatf -gnatwa -gnatVa -gnaty3abcdefhiklmnoOprstux -gnatwe -gnat2012 -Wall -O2 communicate.adb
gprbind communicate.bexch
gnatbind communicate.ali
gnatgcc -c b__communicate.adb
gnatgcc communicate.o -L/usr/lib/x86_64-linux-gnu -lgnutls -lz -llber -lldap -lpthread -o communicate
%

And then tested with:

% ./communicate
{"errcode":"M_FORBIDDEN","error":"Invalid password"}
%

It looks like the problem is located in your AWS version/installation.
Problem resolved by building AWS with GnuTLS from MacPorts. Apple has deprecated OpenSSL since OS X Lion in favour of CommonCrypto, so modern macOS does not ship with OpenSSL. The solution is to install OpenSSL or GnuTLS from MacPorts or Homebrew. Another complication is SIP (System Integrity Protection), introduced in El Capitan: with SIP enabled, even a user with administrator rights cannot change the contents of /usr/include, /usr/lib, etc. MacPorts installs to /opt/local, so I pointed AWS at /opt/local/include and /opt/local/lib so that it can build against either OpenSSL or GnuTLS.
using nginx' lua to validate GitHub webhooks and delete cron-lock-file
What I have:

- a GNU/Linux host
- nginx is up and running
- a cron job scheduled to run immediately after a specific file has been removed (similar to run-crons)
- GitHub sends a webhook when someone pushes to a repository

What I want: I want to run either Lua or anything comparable to parse GitHub's request, validate it, and then delete a file (if the request was valid, of course). Preferably all of this should happen without the hassle of maintaining an additional PHP installation (there is currently none) or the need to use fcgiwrap or similar.

Template: on the nginx side I have something equivalent to

location /deploy {
    # execute lua (or equivalent) here
}
To read the JSON body of a GitHub webhook you need the JSON4Lua library, and to validate the HMAC signature, luacrypto.

Preconfigure

Install the required modules:

$ sudo luarocks install JSON4Lua
$ sudo luarocks install luacrypto

In nginx, define a location for deploy:

location /deploy {
    client_body_buffer_size 3M;
    client_max_body_size 3M;
    content_by_lua_file /path/to/handler.lua;
}

client_max_body_size and client_body_buffer_size should be equal to prevent the error "request body in temp file not supported": https://github.com/openresty/lua-nginx-module/issues/521

Process the webhook

Get the request payload data and check that it is present:

ngx.req.read_body()
local data = ngx.req.get_body_data()
if not data then
    ngx.log(ngx.ERR, "failed to get request body")
    return ngx.exit(ngx.HTTP_BAD_REQUEST)
end

Verify the GitHub signature with luacrypto:

local function verify_signature (hub_sign, data)
    local sign = 'sha1=' .. crypto.hmac.digest('sha1', data, secret)
    -- this is a simple comparison, but it's better to use a constant time comparison
    return hub_sign == sign
end

-- validate GH signature
if not verify_signature(headers['X-Hub-Signature'], data) then
    ngx.log(ngx.ERR, "wrong webhook signature")
    return ngx.exit(ngx.HTTP_FORBIDDEN)
end

Parse the data as JSON and check that it is the master branch before deploying:

data = json.decode(data)
-- on master branch
if data['ref'] ~= branch then
    ngx.say("Skip branch ", data['ref'])
    return ngx.exit(ngx.HTTP_OK)
end

If everything is correct, call the deploy function:

local function deploy ()
    -- run command for deploy
    local handle = io.popen("cd /path/to/repo && sudo -u username git pull")
    local result = handle:read("*a")
    handle:close()
    ngx.say(result)
    return ngx.exit(ngx.HTTP_OK)
end

Example constant-time string compare:

local function const_eq (a, b)
    -- check whether the strings are equal, in constant time
    getmetatable('').__index = function (str, i)
        return string.sub(str, i, i)
    end
    local diff = string.len(a) == string.len(b)
    for i = 1, math.min(string.len(a), string.len(b)) do
        diff = (a[i] == b[i]) and diff
    end
    return diff
end

A complete example of how I use it is in a GitHub gist: https://gist.github.com/Samael500/5dbdf6d55838f841a08eb7847ad1c926
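To sanity-check the signature logic outside nginx, you can reproduce GitHub's X-Hub-Signature header with openssl; the payload and secret below are made-up placeholders:

```shell
# Compute the HMAC-SHA1 signature GitHub would send for a given payload.
# 'mysecret' and the payload are illustrative values, not real credentials.
payload='{"ref":"refs/heads/master"}'
secret='mysecret'
sig=$(printf '%s' "$payload" | openssl dgst -sha1 -hmac "$secret" | awk '{print $NF}')
echo "X-Hub-Signature: sha1=$sig"
```

Comparing this value against the header your handler receives confirms both sides hash the exact same bytes (no trailing newline, no re-serialised JSON).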
This solution does not implement verification of GitHub's hooks and assumes you have the Lua extension and the cjson module installed:

location = /location {
    default_type 'text/plain';
    content_by_lua_block {
        local cjson = require "cjson.safe"
        ngx.req.read_body()
        local data = ngx.req.get_body_data()
        if data then
            local obj = cjson.decode(data)
            -- checksum checking should go here
            if (obj and obj.repository and obj.repository.full_name) == "user/reponame" then
                local file = io.open("<your file>", "w")
                if file then
                    file:close()
                    ngx.say("success")
                else
                    ngx.exit(ngx.HTTP_INTERNAL_SERVER_ERROR)
                end
            else
                ngx.exit(ngx.HTTP_UNAUTHORIZED)
            end
        else
            ngx.exit(ngx.HTTP_NOT_ALLOWED)
        end
    }
}
nix-shell --command `stack build` leads to libpq-fe.h: No such file or directory
I am trying to compile my small project (a Yesod application with lambdacms) on NixOS. However, after using cabal2nix (more precisely, cabal2nix project-karma.cabal --sha256=0 --shell > shell.nix), I am still missing a dependency wrt. postgresql, it seems. My shell.nix file looks like this:

{ nixpkgs ? import <nixpkgs> {}, compiler ? "default" }:
let
  inherit (nixpkgs) pkgs;
  f = { mkDerivation, aeson, base, bytestring, classy-prelude
      , classy-prelude-conduit, classy-prelude-yesod, conduit, containers
      , data-default, directory, fast-logger, file-embed, filepath
      , hjsmin, hspec, http-conduit, lambdacms-core, monad-control
      , monad-logger, persistent, persistent-postgresql
      , persistent-template, random, resourcet, safe, shakespeare, stdenv
      , template-haskell, text, time, transformers, unordered-containers
      , uuid, vector, wai, wai-extra, wai-logger, warp, yaml, yesod
      , yesod-auth, yesod-core, yesod-form, yesod-static, yesod-test
      }:
      mkDerivation {
        pname = "karma";
        version = "0.0.0";
        sha256 = "0";
        isLibrary = true;
        isExecutable = true;
        libraryHaskellDepends = [
          aeson base bytestring classy-prelude classy-prelude-conduit
          classy-prelude-yesod conduit containers data-default directory
          fast-logger file-embed filepath hjsmin http-conduit lambdacms-core
          monad-control monad-logger persistent persistent-postgresql
          persistent-template random safe shakespeare template-haskell text
          time unordered-containers uuid vector wai wai-extra wai-logger warp
          yaml yesod yesod-auth yesod-core yesod-form yesod-static
          nixpkgs.zlib nixpkgs.postgresql nixpkgs.libpqxx
        ];
        libraryPkgconfigDepends = [ persistent-postgresql ];
        executableHaskellDepends = [ base ];
        testHaskellDepends = [
          base classy-prelude classy-prelude-yesod hspec monad-logger
          persistent persistent-postgresql resourcet shakespeare transformers
          yesod yesod-core yesod-test
        ];
        license = stdenv.lib.licenses.bsd3;
      };
  haskellPackages = if compiler == "default"
                      then pkgs.haskellPackages
                      else pkgs.haskell.packages.${compiler};
  drv = haskellPackages.callPackage f {};
in
  if pkgs.lib.inNixShell then drv.env else drv

The output is as follows:

markus#nixos ~/git/haskell/karma/karma (git)-[master] % nix-shell --command `stack build`
postgresql-libpq-0.9.1.1: configure
ReadArgs-1.2.2: download
postgresql-libpq-0.9.1.1: build
ReadArgs-1.2.2: configure
ReadArgs-1.2.2: build
ReadArgs-1.2.2: install

--  While building package postgresql-libpq-0.9.1.1 using:
    /run/user/1000/stack31042/postgresql-libpq-0.9.1.1/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/setup --builddir=.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/ build --ghc-options " -ddump-hi -ddump-to-file"
    Process exited with code: ExitFailure 1
    Logs have been written to: /home/markus/git/haskell/karma/karma/.stack-work/logs/postgresql-libpq-0.9.1.1.log

    [1 of 1] Compiling Main ( /run/user/1000/stack31042/postgresql-libpq-0.9.1.1/Setup.hs, /run/user/1000/stack31042/postgresql-libpq-0.9.1.1/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/Main.o )
    Linking /run/user/1000/stack31042/postgresql-libpq-0.9.1.1/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/setup ...
    Configuring postgresql-libpq-0.9.1.1...
    Building postgresql-libpq-0.9.1.1...
    Preprocessing library postgresql-libpq-0.9.1.1...

    LibPQ.hsc:213:22: fatal error: libpq-fe.h: No such file or directory
    compilation terminated.
    compiling .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/PostgreSQL/LibPQ_hsc_make.c failed (exit code 1)
    command was: /nix/store/9fbfiij3ajnd3fs1zyc2qy0ispbszrr7-gcc-wrapper-4.9.3/bin/gcc -c .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/PostgreSQL/LibPQ_hsc_make.c -o .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/PostgreSQL/LibPQ_hsc_make.o -fno-stack-protector -D__GLASGOW_HASKELL__=710 -Dlinux_BUILD_OS=1 -Dx86_64_BUILD_ARCH=1 -Dlinux_HOST_OS=1 -Dx86_64_HOST_ARCH=1 -I/run/current-system/sw/include -Icbits -I.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/autogen -include .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/autogen/cabal_macros.h -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/bytes_6elQVSg5cWdFrvRnfxTUrH/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/base_GDytRqRVSUX7zckgKqJjgw/include -I/nix/store/6ykqcjxr74l642kv9gf1ib8v9yjsgxr9-gmp-5.1.3/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/integ_2aU3IZNMF9a7mQ0OzsZ0dS/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/include/

I assume not much is missing, so a pointer would be nice. What is also weird is that "nix-shell" works, but following that up with "stack exec yesod devel" tells me:

Resolving dependencies...
Configuring karma-0.0.0...
cabal: At least the following dependencies are missing: classy-prelude >=0.10.2, classy-prelude-conduit >=0.10.2, classy-prelude-yesod >=0.10.2, hjsmin ==0.1.*, http-conduit ==2.1.*, lambdacms-core >=0.3.0.2 && <0.4, monad-logger ==0.3.*, persistent >=2.0 && <2.3, persistent-postgresql >=2.1.1 && <2.3, persistent-template >=2.0 && <2.3, uuid >=1.3, wai-extra ==3.0.*, warp >=3.0 && <3.2, yesod >=1.4.1 && <1.5, yesod-auth >=1.4.0 && <1.5, yesod-core >=1.4.6 && <1.5, yesod-form >=1.4.0 && <1.5, yesod-static >=1.4.0.3 && <1.6 When using mysql instead, I am getting pcre-light-0.4.0.4: configure mysql-0.1.1.8: configure mysql-0.1.1.8: build Progress: 2/59 -- While building package mysql-0.1.1.8 using: /run/user/1000/stack12820/mysql-0.1.1.8/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/setup --builddir=.stack-work/dist/x86_64- linux/Cabal-1.22.4.0/ build --ghc-options " -ddump-hi -ddump-to-file" Process exited with code: ExitFailure 1 Logs have been written to: /home/markus/git/haskell/karma/karma/.stack-work/logs/mysql-0.1.1.8.log [1 of 1] Compiling Main ( /run/user/1000/stack12820/mysql-0.1.1.8/Setup.lhs, /run/user/1000/stack12820/mysql-0.1.1.8/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/Main.o ) Linking /run/user/1000/stack12820/mysql-0.1.1.8/.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/setup/setup ... Configuring mysql-0.1.1.8... Building mysql-0.1.1.8... Preprocessing library mysql-0.1.1.8... In file included from C.hsc:68:0: include/mysql_signals.h:9:19: fatal error: mysql.h: No such file or directory #include "mysql.h" ^ compilation terminated. 
compiling .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/MySQL/Base/C_hsc_make.c failed (exit code 1) command was: /nix/store/9fbfiij3ajnd3fs1zyc2qy0ispbszrr7-gcc-wrapper-4.9.3/bin/gcc -c .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/MySQL/Base/C_hsc_make.c -o .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/Database/MySQL/Base/C_hsc_make.o -fno-stack-protector -D__GLASGOW_HASKELL__=710 -Dlinux_BUILD_OS=1 -Dx86_64_BUILD_ARCH=1 -Dlinux_HOST_OS=1 -Dx86_64_HOST_ARCH=1 -I/nix/store/7ppa4k2drrvjk94rb60c1df9nvw0z696-mariadb-10.0.22-lib/include -I/nix/store/7ppa4k2drrvjk94rb60c1df9nvw0z696-mariadb-10.0.22-lib/include/.. -Iinclude -I.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/autogen -include .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/autogen/cabal_macros.h -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/bytes_6elQVSg5cWdFrvRnfxTUrH/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/base_GDytRqRVSUX7zckgKqJjgw/include -I/nix/store/6ykqcjxr74l642kv9gf1ib8v9yjsgxr9-gmp-5.1.3/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/integ_2aU3IZNMF9a7mQ0OzsZ0dS/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/include -I/nix/store/xphvly2zcd6jsc2xklz1zmmz4y0dh3ny-ghc-7.10.2/lib/ghc-7.10.2/include/ -- While building package pcre-light-0.4.0.4 using: /home/markus/.stack/setup-exe-cache/setup-Simple-Cabal-1.22.4.0-x86_64-linux-ghc-7.10.2 --builddir=.stack-work/dist/x86_64-linux/Cabal-1.22.4.0/ configure --with-ghc=/run/current-system/sw/bin/ghc --user --package-db=clear --package-db=global --package-db=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/pkgdb/ --libdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/lib --bindir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/bin --datadir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/share 
--libexecdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/libexec --sysconfdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/etc --docdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/doc/pcre-light-0.4.0.4 --htmldir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/doc/pcre-light-0.4.0.4 --haddockdir=/home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/doc/pcre-light-0.4.0.4 --dependency=base=base-4.8.1.0-4f7206fd964c629946bb89db72c80011 --dependency=bytestring=bytestring-0.10.6.0-18c05887c1aaac7adb3350f6a4c6c8ed Process exited with code: ExitFailure 1 Logs have been written to: /home/markus/git/haskell/karma/karma/.stack-work/logs/pcre-light-0.4.0.4.log Configuring pcre-light-0.4.0.4... setup-Simple-Cabal-1.22.4.0-x86_64-linux-ghc-7.10.2: The program 'pkg-config' version >=0.9.0 is required but it could not be found. After adding pkgconfig to my global configuration, the build seems to get a little further ahead, so it seems that shell.nix is ignored somewhat. (Sources for what I tried so far: https://groups.google.com/forum/#!topic/haskell-stack/_ZBh01VP_fo) Update: It seems like I overlooked this section of the manual http://nixos.org/nixpkgs/manual/#using-stack-together-with-nix However, the first idea that came to mind (stack --extra-lib-dirs=/nix/store/c6qy7n5wdwl164lnzha7vpc3av9yhnga-postgresql-libpq-0.9.1.1/lib build) did not work yet, most likely I need to use --extra-include-dirs or try one of the variations. It seems weird that stack is still trying to build postgresql-libpq in the very same version, though. Update2: Currently trying out "stack --extra-lib-dirs=/nix/store/1xf77x47d0m23nbda0azvkvj8w8y77c7-postgresql-9.4.5/lib --extra-include-dirs=/nix/store/1xf77x47d0m23nbda0azvkvj8w8y77c7-postgresql-9.4.5/include build" which looks promising. Does not look like the nix-way, but still. 
Update3: Still getting

<command line>: can't load .so/.DLL for: /home/markus/.stack/snapshots/x86_64-linux/nightly-2015-11-17/7.10.2/lib/x86_64-linux-ghc-7.10.2/postgresql-libpq-0.9.1.1-ABGs5p1J8FbEwi6uvHaiV6/libHSpostgresql-libpq-0.9.1.1-ABGs5p1J8FbEwi6uvHaiV6-ghc7.10.2.so (libpq.so.5: cannot open shared object file: No such file or directory)
stack build  186.99s user 2.93s system 109% cpu 2:52.76 total

which is strange, since libpq.so.5 is contained in /nix/store/1xf77x47d0m23nbda0azvkvj8w8y77c7-postgresql-9.4.5/lib. An additional LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:/nix/store/1xf77x47d0m23nbda0azvkvj8w8y77c7-postgresql-9.4.5/lib does not help either.

Update4: By the way, yesod devel does the same as stack exec yesod devel. My libraries are downloaded to /nix/store but they are not recognized. Maybe I need to make "build-nix" work, and yesod devel does not work here? Just for completeness, here is stack.yaml:

resolver: nightly-2015-11-17  # run stack setup otherwise!!

# Local packages, usually specified by relative directory name
packages:
- '.'

# Packages to be pulled from upstream that are not in the resolver (e.g., acme-missiles-0.3)
extra-deps: [ lambdacms-core-0.3.0.2, friendly-time-0.4, lists-0.4.2, list-extras-0.4.1.4 ]

# Override default flag values for local packages and extra-deps
flags:
  karma:
    library-only: false
    dev: false

# Extra package databases containing global packages
extra-package-dbs: []

Next weekend, I will check out https://pr06lefs.wordpress.com/2014/09/27/compiling-a-yesod-project-on-nixos/ and other search results.
Funny, because I've just had a similar problem myself; I solved it by adding these two lines to stack.yaml:

extra-include-dirs: [/nix/store/jrdvjvf0w9nclw7b4k0pdfkljw78ijgk-postgresql-9.4.5/include/]
extra-lib-dirs: [/nix/store/jrdvjvf0w9nclw7b4k0pdfkljw78ijgk-postgresql-9.4.5/lib/]

You may want to check first which postgresql path from /nix/store you should use for include/ and lib/:

nix-build --no-out-link "<nixpkgs>" -A postgresql

And by the way, why do you use nix-shell if you are going to use stack and you have project-karma.cabal available? Have you considered migrating your project with stack init?
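As an alternative to pinning /nix/store paths (which break whenever the store hash changes), stack also has built-in nix integration; a sketch of the stack.yaml fragment, assuming a stack version that supports the `nix:` section:

```yaml
# stack.yaml fragment (sketch): let stack create the nix shell itself, so
# C dependencies such as libpq are resolved without hard-coded store paths.
nix:
  enable: true
  packages: [postgresql, zlib]
```

With this, `stack build` run directly (no manual nix-shell) provisions the listed system packages and passes their include/lib paths to the compiler.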
Looks like stack is trying to build haskellPackages.postgresql-libpq outside of the nix framework. You probably don't want that to happen. Maybe try to add postgresql-libpq to libraryHaskellDepends?
I'm trying to use 'ffprobe' with Java or Groovy
As per my understanding, ffprobe provides file-related data in JSON format. I have installed ffprobe on my Ubuntu machine, but I don't know how to access the ffprobe JSON response using Java/Grails.

Expected response format:

{
    "format": {
        "filename": "/Users/karthick/Documents/videos/TestVideos/sample.ts",
        "nb_streams": 2,
        "nb_programs": 1,
        "format_name": "mpegts",
        "format_long_name": "MPEG-TS (MPEG-2 Transport Stream)",
        "start_time": "1.430800",
        "duration": "170.097489",
        "size": "80425836",
        "bit_rate": "3782576",
        "probe_score": 100
    }
}

This is my Groovy code:

def process = "ffprobe -v quiet -print_format json -show_format -show_streams HelloWorld.mpeg ".execute()
println "Found ${process.text}"
render process as JSON

I am able to get the process object, but I am not able to get the JSON response. Should I convert the process object to a JSON object?

OUTPUT:

Found java.lang.UNIXProcess#75566697
org.codehaus.groovy.grails.web.converters.exceptions.ConverterException: Error converting Bean with class java.lang.UNIXProcess
Grails has nothing to do with this. Groovy can execute arbitrary shell commands in a very simplistic way: "mkdir foo".execute() Or for more advanced features, you might look into using ProcessBuilder. At the end of the day, you need to execute ffprobe and then capture the output stream of JSON to use in your app.
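The capture pattern mentioned above can be sketched in plain Java with ProcessBuilder; `echo` is used here as a stand-in so the sketch runs without ffprobe installed:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ProbeRunner {
    // Runs a command and returns its stdout (and stderr) as a String.
    static String run(String... cmd) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.redirectErrorStream(true);  // merge stderr into stdout
        Process p = pb.start();
        StringBuilder sb = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        p.waitFor();
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // Stand-in command; with ffprobe installed you would call:
        // run("ffprobe", "-v", "quiet", "-print_format", "json",
        //     "-show_format", "-show_streams", "HelloWorld.mpeg")
        System.out.print(run("echo", "{\"format\":{}}"));
    }
}
```

The returned String is the raw JSON text, which can then be parsed (e.g. with Groovy's JsonSlurper or any JSON library) instead of rendering the Process object itself.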
Groovy provides a simple way to execute command-line processes: write the command line as a string and call the execute() method. The execute() method returns a java.lang.Process instance:

println "ffprobe <options>".execute().text

[Source]