How to undefine, in a .bbappend, a shell function defined in a .bb file

An .inc file, included by an image .bb file, defines a shell function for a BitBake task.
Let's concentrate here only on this shell function; the assigned BitBake task is out of scope.
I wonder how to undefine this shell function in a .bbappend file (in another layer).
unset -f <shell-function-name>
does not work:
ERROR: ParseError at .......-image.bbappend:89: unparsed line: 'unset -f do_thisandthat'
Does BitBake need to be told explicitly that unset -f <shell-function-name> is shell script?
I consulted the BitBake manual and the Yocto Project Reference Manual for this question, with zero findings.

There is no direct API for it; however, you can do something like:
python () {
    d.delVar("shell function name")
}
which will delete the shell function since functions are simply variables.
Just deleting a function may well cause other problems but that does answer your specific question.
This is an 'anonymous python' fragment and will be executed by bitbake at the end of parsing a recipe (or bbappend to a recipe).
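For example, here is a minimal sketch using the function name from the error message above (substitute your actual function name); the .bbappend would contain:
python () {
    # Functions are stored as variables in the datastore, so deleting the
    # variable removes the shell function definition.
    d.delVar("do_thisandthat")
}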

Related

Remove debugger keyword during compilation in google closure

UPDATE:
The JS version of closure-compiler is no longer supported or maintained.
https://github.com/google/closure-compiler-npm/blob/master/packages/google-closure-compiler-js/readme.md
I'm trying to find out whether there is a way to remove the "debugger" keyword during the compilation process; I'm using the JavaScript version of google-closure-compiler with gulp.
Looking through the documentation, it is clear we can set a flag to stop compilation or show error messages by doing the following:
https://github.com/google/closure-compiler/wiki/Flags-and-Options
--jscomp_off
Translating this to gulp, it is:
const googleClosureOptions = {
    ...
    jscomp_error: "checkDebuggerStatement"
}
However, this only stops the compilation by throwing an error, or shows a warning:
zyxcdafg.js:1444: ERROR - [JSC_DEBUGGER_STATEMENT_PRESENT] Using the debugger statement can halt your application if the user has a JavaScript debugger running.
debugger;
^^^^^^^^^
But what I am trying to achieve is to remove the debugger keyword itself. Is this possible to achieve using Google Closure? I cannot find any flags or options relating to this.
No, I don't think so. I'd suggest you use something else to do it, like sed:
find dist -name "*.js" -exec sed -i 's/\sdebugger;//' {} +
Something like that will find files in your dist folder that end with .js and then exec-ute sed to replace all instances of debugger; with nothing.
You could add that to a script that calls your Closure Compiler build.
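For instance, here is a rough sketch of wiring that into an npm script; the gulp build task name and the dist folder are placeholders for whatever your Closure Compiler build actually uses:
{
  "scripts": {
    "build": "gulp build && find dist -name \"*.js\" -exec sed -i 's/\\sdebugger;//' {} +"
  }
}
Note that find and sed assume a Unix-like shell; on Windows you would run this through something like Git Bash or WSL.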
The compiler doesn't have a command-line API for defining custom code-removal passes, but its architecture does allow registering custom passes, and a pass that removes a debugger statement should be trivial:
@Override
public void visit(NodeTraversal t, Node n, Node parent) {
  if (n.isDebugger()) {
    compiler.reportChangeToEnclosingScope(n);
    n.detach();
  }
}
The general structure would follow:
https://github.com/google/closure-compiler/blob/master/src/com/google/javascript/jscomp/CheckDebuggerStatement.java

How to pass options to UglifyJS through html-minifier on Windows command line?

HTMLMinifier (html-minifier) (3.5.14) for Node.js (v8.11.1), installed with npm install html-minifier -g, can be run via command line (Windows CMD), e.g. html-minifier --help produces the usage info (excerpts):
Usage: html-minifier [options] [files...]
Options:
-V, --version output the version number
...
--minify-js [value] Minify Javascript in script elements and on* attributes (uses uglify-js)
...
-c --config-file <file> Use config file
--input-dir <dir> Specify an input directory
--output-dir <dir> Specify an output directory
--file-ext <text> Specify an extension to be read, ex: html
-h, --help output usage information
The option --minify-js [value] relies on UglifyJS to "compress" the JavaScript embedded inside the HTML file(s) passed to html-minifier. UglifyJS can remove console.log() function calls (Can uglify-js remove the console.log statements?) from the JavaScript, by enabling the drop_console option (also see pure_funcs).
But --minify-js drop_console=true does not have an effect, nor does something like "uglify:{options:{compress:{drop_console:true}}}" or "compress:{pure_funcs:['console.log']}".
How can such an option be set, ideally via the html-minifier command line (alternatively by config-file, though it just sets "minifyJS": true)?
I was very close.
I started digging through the code (installed in %appdata%\npm\node_modules\html-minifier) to see what happens with the options provided, i.e. adding debug output with console.log(xyz); (using an actual debugger probably would be a better idea).
So, here's my "trace":
option: https://github.com/kangax/html-minifier/blob/gh-pages/cli.js#L118
option handling: https://github.com/kangax/html-minifier/blob/gh-pages/cli.js#L144
argument parsing using commander
createOptions() https://github.com/kangax/html-minifier/blob/gh-pages/cli.js#L197
options then contains e.g. minifyJS: 'compress:{pure_funcs:[\'console.log\']}',
passed on to minify() https://github.com/kangax/html-minifier/blob/gh-pages/src/htmlminifier.js#L806 which immediately runs
processOptions() https://github.com/kangax/html-minifier/blob/gh-pages/src/htmlminifier.js#L616
where finally in line https://github.com/kangax/html-minifier/blob/gh-pages/src/htmlminifier.js#L667 options.minifyJS is handled, before it's run as var result = UglifyJS.minify(code, minifyJS); in https://github.com/kangax/html-minifier/blob/gh-pages/src/htmlminifier.js#L680.
But there our option string compress:{pure_funcs:['console.log']} gets cleaned because it's not yet an object, resulting in {}.
Or, in a different trial with a different string you may encounter the error Could not parse JSON value '{compress:{pure_funcs:'console.log']}}'
At least it gets that far! But why doesn't it work?
First, it's a good time to revisit the JSON spec: https://www.json.org/index.html
Second, see if the string could be parsed as valid JSON, e.g. with the JSON.parse() demo at https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse
Third, figure out how to get that string through the CMD as argument (escaping the double quotes).
Finally, make sure the data structure to configure UglifyJS is correct. That's quite easy, since it's documented: https://github.com/mishoo/UglifyJS2#minify-options-structure
And behold, simply escaping the double quotes with a backslash works for me:
html-minifier ... --minify-js {\"compress\":{\"pure_funcs\":[\"console.log\"]}} ...
and it properly shows up in the options as
...
{ compress:
    { pure_funcs: [ 'console.log' ],
      ...
For example, curl can read its config from a file (proxies, etc.).
Many programs do so: git, maven, gradle... No matter how and where you call them, they look for the config you or the system provides: first in the current directory, then in the user's home, and then in the system /etc/...
If no such batteries are included with these node packages, they can only be used on separate HTML and JS files.
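If you prefer the config-file route mentioned in the question, the same options object should (as far as I can tell) also work there, since minifyJS accepts an object as well as a boolean. A sketch of such a file, here called htmlmin.conf (the name and the directories below are just examples):
{
  "minifyJS": {
    "compress": {
      "pure_funcs": ["console.log"]
    }
  }
}
which would be used as html-minifier -c htmlmin.conf --input-dir src --output-dir dist --file-ext html, avoiding the quote-escaping problem on the command line entirely.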

Jenkins: import external package from Jenkinsfile using declarative syntax

I had some Groovy code which contains import groovy.json.JsonSlurper.
I have spent a day testing and I don't know how to load external libraries using declarative syntax.
This is my code:
pipeline {
    agent any
    import groovy.json.JsonSlurper
    stages {
        stage("test") {
            steps {
            }
        }
    }
}
I have read the Jenkins documentation, and I have tried the following, but without success:
#Grab('groovy.json.JsonSlurper')
import groovy.json.JsonSlurper
Neither import nor #Grab is recognized. Any idea?
Thanks!
What @Daniel Majano says about the import syntax is true, but I found that the #Grab syntax behaves differently between a Pipeline script maintained directly in Jenkins and a Pipeline script from SCM.
When I placed a Grab command in the Pipeline script of a test pipeline job, I found that it didn't make any difference whether the Grab command was there or commented out.
However, when used from a Pipeline script from SCM, it would throw the following exception:
java.lang.RuntimeException: No suitable ClassLoader found for grab
I removed it from the SCM script and everything worked out in the end.
Additional Background
I'm not sure why the grab was choking in the SCM version, but there are definitely some working parts to the Groovy editor, because if you define a partial Grab command it will give you a validation error pointing to the broken line, e.g. The missing attribute "module" is required in #Grab annotations.
Therefore the script validator is aware of the Grab annotation and knows that it has both a group and a module attribute. I'm using the so-called shorthand notation in this example.
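For completeness, here is a rough sketch of the import placement that works with declarative syntax: the import goes at the very top of the Jenkinsfile, outside the pipeline {} block, and the code that uses it sits inside a script {} step (depending on your script-security settings, the JsonSlurper calls may still need approval; the JSON string below is only a placeholder):
import groovy.json.JsonSlurper

pipeline {
    agent any
    stages {
        stage("test") {
            steps {
                script {
                    // Arbitrary Groovy belongs in a script {} step in declarative pipelines.
                    def status = new JsonSlurper().parseText('{"status": "ok"}').status
                    echo "status = ${status}"
                }
            }
        }
    }
}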

Installing an OCaml Hump library on a shared (mutualized) server

I am trying to use the OCaml csv library. I downloaded csv-1.2.3 and, after installing findlib, followed the installation instructions:
Uncompress the source archive and go to the root of the package,
Run 'ocaml setup.ml -configure',
Run 'ocaml setup.ml -build',
Run 'ocaml setup.ml -install'
Now I have the files META, csv.a, csv.cma, csv.cmi, csv.cmx, csv.cmxa, and csv.mli in the ~/opt/lib/ocaml/site-lib/csv directory. The shell command ocamlfind list -describe gives csv  A pure OCaml library to read and write CSV files. (version: 1.2.3), which I believe means that csv is installed properly.
BUT when I add
let data = Csv.load "foo.csv" in
in my compute.ml module and try to compile it within the larger program package, I get the compilation error:
File "_none_", line 1, characters 0-1:
Error: No implementations provided for the following modules:
Csv referenced from compute.cmx
and if I simply type
let data = load "foo.csv" in
I get:
File "compute.ml", line 74, characters 13-17:
Error: Unbound value load
I get the same type of errors when I use Csv.load or load directly in the OCaml toplevel. Would somebody have an idea of what is wrong with my code or library installation?
My guess is that you're using ocamlfind for compilation (ocamlfind ocamlc -package csv ...), because you have a linking error, not a type-checking one (which would be the case if you had not specified at all where csv is). The solution, in that case, may be to make sure the csv library is actually linked into the final executable; with ocamlfind that is what the -linkpkg option on the final compilation line does. Otherwise, please try to use ocamlfind, and yes, tell us what your compilation command is.
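For example, a minimal sketch of such a compilation line, assuming compute.ml is the only module and you want a native executable:
ocamlfind ocamlopt -package csv -linkpkg compute.ml -o compute
Here -package csv makes csv.cmi visible for type-checking, and -linkpkg makes ocamlfind add csv.cmxa to the final link step, which is exactly what the "No implementations provided" error is complaining about.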
Using ocamlfind from the toplevel is also very easy. Watch this toplevel interaction:
% ocaml
Objective Caml version 3.12.1
# #use "topfind";;
- : unit = ()
Findlib has been successfully loaded. Additional directives:
#require "package";; to load a package
#list;; to list the available packages
#camlp4o;; to load camlp4 (standard syntax)
#camlp4r;; to load camlp4 (revised syntax)
#predicates "p,q,...";; to set these predicates
Topfind.reset();; to force that packages will be reloaded
#thread;; to enable threads
- : unit = ()
# #require "csv";;
/usr/lib/ocaml/csv: added to search path
/usr/lib/ocaml/csv/csv.cma: loaded
# Csv.load;;
- : ?separator:char -> ?excel_tricks:bool -> string -> Csv.t = <fun>
To be explicit, what I typed in the toplevel was:
#use "topfind";;
#require "csv";;
Csv.load;; (* or anything else that uses Csv *)

How do I write my hgrc so that Mercurial detects my hooks?

I've written two functions in a file commit_hooks.py that I want to run before any commit is made persistent, but I can't figure out how to write my hgrc so that Mercurial detects them.
The function headers are:
def precommit_bad_merge(ui, repo, parent1=None, parent2=None, **kwargs):
...
def precommit_bad_branching(ui, repo, **kwargs):
...
I've tried using this "guide", but the documentation is too "man pagey" for me. The following is an attempt which doesn't work:
[hooks]
precommit = ..\..\mno2\commit_hooks.py
Update!
Rewriting the hook line to:
precommit = D:\environments\next\mno2\commit_hooks.py
makes Mercurial detect the precommit hook, but it always exits with status 1 for some reason.
Set up your [hooks] section like this:
[hooks]
precommit.foo = python:D:\environments\next\mno2\commit_hooks.py:precommit_bad_merge
precommit.bar = python:D:\environments\next\mno2\commit_hooks.py:precommit_bad_branching
The syntax for the precommit line that you used is for external hooks, so it was treating your python file as a self-contained script (which I'm assuming it's not since you're using the function signatures for in-process hooks).
You may need to have the python executable in your path (I do).
For more information, see the definitive guide's section on in-process hooks; there's some useful information hidden in the comments.
The "man pagey" documentation has a section on python hook syntax:
The syntax for Python hooks is as follows:
hookname = python:modulename.submodule.callable
hookname = python:/path/to/python/module.py:callable
Python hooks are run within the Mercurial process. Each hook is called with at least three keyword arguments: a ui object (keyword ui), a repository object (keyword repo), and a hooktype keyword that tells what kind of hook is used. Arguments listed as environment variables above are passed as keyword arguments, with no HG_ prefix, and names in lower case.
If a Python hook returns a "true" value or raises an exception, this is treated as a failure.
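That last rule is the one to keep in mind once the hooks run in-process via the python: prefix shown above: the return value of each function decides whether the commit proceeds. A minimal sketch of what one of the hooks could look like (the merge check itself is only a placeholder):
def precommit_bad_merge(ui, repo, parent1=None, parent2=None, **kwargs):
    # A falsy return value lets the commit proceed; a truthy value
    # (or an uncaught exception) makes Mercurial abort it.
    if parent1 and parent2:
        ui.warn("refusing to commit a merge here\n")
        return True   # failure: block the commit
    return False      # success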