Netezza - making UDF functions available to multiple DBs

The Lua Developer's Guide (part of Netezza Analytics 3.0) has instructions for "compiling" .nzl functions using the nzlua command. Unfortunately, the function seems to be available only in the database defined in the NZ_DATABASE environment variable at the time the nzlua command is executed - see the command output immediately below.
Question: How does one make the compiled function available to all databases on the appliance, i.e. without altering the NZ_DATABASE environment variable and re-running the nzlua command for each database (including user sandbox DBs)?
[nz@nzh1p01 examples]$ /nz/extensions/nz/nzlua/bin/nzlua isdate.nzl
Compiling: isdate.nzl
####################################################################
UdxName = isdate
UdxType = UDF
Arguments = VARCHAR(40),VARCHAR(40)
Result = BOOL
Dependencies = INZA.INZA.LIBNZLUA_3_0_0
NZUDXCOMPILE OPTIONS: (--nullcall --unfenced --mem 2m)
CREATE FUNCTION

The function should be available in all databases; however, you will need to call it using its fully qualified name: database..function.
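For example, if the function was compiled while NZ_DATABASE pointed at a database named MYFUNCDB (a hypothetical name), a session connected to any other database could call it like this (argument values are illustrative):
SELECT MYFUNCDB..isdate(datecol, 'YYYY-MM-DD') FROM mytable;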

Add the database that you registered the function into to the "search_path" setting in the /nz/data/postgresql.conf file. The function can then be referenced from anywhere :-)
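For example (illustrative; MYFUNCDB is the hypothetical database from above), the line in /nz/data/postgresql.conf might read:
search_path = 'MYFUNCDB'
after which isdate(...) should resolve without the database prefix.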

Is it possible to have a module function that modifies parameter values of the caller script?

Motivation
Reduce the maintenance of an Azure DevOps task that invokes a PowerShell script with a lot of parameters ("a lot" could be 5).
The idea relies on the fact that Azure DevOps generates environment variables to reflect the build variables. So, I devised the following scheme:
Prefix all non-secret Azure DevOps variables with MyBuild.
The task's PowerShell script would call a function that checks the script parameters against the MyBuild_ environment variables and automatically assigns the value of the MyBuild_xyz environment variable to the script parameter xyz if the latter has no value.
This way the task command line would only contain secret parameters (which are not reflected in the environment). Often there are no secret parameters, and so the command line remains empty. We find this scheme reduces the maintenance of tasks driven by a PowerShell script.
Example
param(
    $DBUser,
    [ValidateNotNullOrEmpty()]$DBPassword,
    $DBServer,
    $Configuration,
    $Solutions,
    $ClientDB = $env:Build_DefinitionName,
    $RawBuildVersion = $env:Build_BuildNumber,
    $BuildDefinition = $env:Build_DefinitionName,
    $Changeset = $env:Build_SourceVersion,
    $OutDir = $env:Build_BinariesDirectory,
    $TempDir,
    [Switch]$EnforceNoMetadataStoreChanges
)
$ErrorActionPreference = "Stop"
. $PSScriptRoot\AutomationBootstrap.ps1
$AutomationScripts = GetToolPackage DevOpsAutomation
. "$AutomationScripts\vNext\DefaultParameterValueBinding.ps1" $PSCommandPath -Required 'ClientDB' -Props @{
    OutDir = @{ DefaultValue = [io.path]::GetFullPath("$PSScriptRoot\..\..\bin") }
    TempDir = @{ DefaultValue = 'D:\_gctemp' }
    DBUser = @{ DefaultValue = 'SomeUser' }
}
The described parameter-binding logic is implemented in the script DefaultParameterValueBinding.ps1, which is published in a NuGet package. The code installs the package and thus gets access to the script.
In the example above, some parameters default to predefined Azure DevOps variables, like $RawBuildVersion = $env:Build_BuildNumber. Some are left uninitialized, like $DBServer, which means it will default to $env:MyBuild_DBServer.
We can get away without the special function to do the binding, but then the script author would have to write something like this:
$DBServer = $env:MyBuild_DBServer,
$Configuration = $env:MyBuild_Configuration,
$Solutions = $env:MyBuild_Solutions,
I wanted to avoid this, because of the possibility of an accidental name mismatch.
The Problem
The approach does not work when I package the logic of DefaultParameterValueBinding.ps1 into a module function. This is because of the module scope isolation - I just cannot modify the parameters of the caller script.
Is it still possible to do? Is it possible to achieve my goal in a more elegant way? Remember, I want to reduce the cost associated with maintaining the task command line in Azure DevOps.
Right now I am inclined to retreat back to this scheme:
$xyz = $(Resolve-ParameterValue 'xyz' x y z ...)
Where Resolve-ParameterValue would first check $env:MyBuild_xyz and if not found select the first not null value out of x,y,z,...
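A minimal sketch of what such a Resolve-ParameterValue could look like (the function name comes from above; the body is my assumption):
function Resolve-ParameterValue {
    param(
        [string]$Name,    # parameter name, e.g. 'DBServer'
        [Parameter(ValueFromRemainingArguments = $true)]
        [object[]]$Fallbacks    # candidate values x, y, z, ... in priority order
    )
    # First preference: the MyBuild_-prefixed environment variable
    $fromEnv = [Environment]::GetEnvironmentVariable("MyBuild_$Name")
    if ($fromEnv) { return $fromEnv }
    # Otherwise the first non-null fallback
    foreach ($candidate in $Fallbacks) {
        if ($null -ne $candidate) { return $candidate }
    }
}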
But if the Resolve-ParameterValue method comes from a module, then the script must assume the module has already been installed, because it has no way to install it before the parameters are evaluated. Or has it?
EDIT 1
Notice the command line used to invoke the DefaultParameterValueBinding.ps1 script does not contain the caller script parameters! It does include $PSCommandPath, which is used to obtain the PSBoundParameters collection.
Yes, but it will require modifications to the calling script and the function: pass the parameters by reference. Adam B. has a nice piece on passing parameters by reference here:
https://mcpmag.com/articles/2015/06/04/reference-variables-in-powershell.aspx
Net-net, the following is an example:
$age = 12
function birthday {
    param([ref]$age)
    $age.Value += 1
}
birthday -age ([ref]$age)
Write-Output $age   # prints 13
I've got an age of 12. I pass it into a function as a parameter. The function increments the value of $age by 1. You can do the same thing with a function in a module. You get my drift.
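For instance, a minimal sketch of the same trick with the function moved into a module (the file name is my invention):
# BirthdayTools.psm1 (hypothetical module)
function birthday {
    param([ref]$age)
    $age.Value += 1
}
Export-ModuleMember -Function birthday

# Caller script:
Import-Module "$PSScriptRoot\BirthdayTools.psm1"
$age = 12
birthday -age ([ref]$age)
Write-Output $age   # prints 13; module scope isolation is bypassed because we passed a reference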

Scope of variable in golang

I am trying to run one function in Go but I am getting some errors. Can anyone please help?
func exportVal(name string, value string) {
    var ops = isWindows()
    if ops == true {
        set name=value
        fmt.Printf("name=")
    } else {
        export name=value
        fmt.Printf("name=")
    }
}
But I am getting the errors below.
D:\Go>go run octa.go
# command-line-arguments
.\octa.go:42:10: syntax error: unexpected name at end of statement
.\octa.go:46:13: syntax error: unexpected name at end of statement
set and export are not Go keywords. Maybe you're thinking of shell? It looks like you're trying to set and export environment variables.
You cannot export environment variables from one process to another. You can only change your own process's environment. Child processes inherit the parent's environment, but you can't go the other way. A shell script can do it only because that "program" is really a set of commands to the shell itself, and even then only when it is run with source something.sh; sh something.sh, in contrast, runs in a new shell process.
If you want to "export" data from a non-shell program, you'll have to print out the data in some format, JSON is a good choice, and have that be read by the other process.
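For instance, a minimal Go sketch of handing values to another process as JSON (the names are illustrative):
package main

import (
    "encoding/json"
    "os"
)

func main() {
    // Write the values to stdout; the consuming process parses the JSON.
    vals := map[string]string{"name": "value"}
    json.NewEncoder(os.Stdout).Encode(vals) // prints {"name":"value"}
}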
Go doesn't support the export keyword.
Are you trying to create a variable with global scope?
var foo string

func SetFoo(to string) {
    foo = to
}
Or are you trying to set an environment variable? In which case, use os.Setenv(key, value).
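For example, a sketch of what the function in the question was probably aiming for (os.Setenv works the same on Windows and Unix, so no OS check is needed; note it only affects the current process and its children):
package main

import (
    "fmt"
    "os"
)

func exportVal(name string, value string) {
    if err := os.Setenv(name, value); err != nil {
        fmt.Println("Setenv failed:", err)
        return
    }
    fmt.Printf("%s=%s\n", name, os.Getenv(name))
}

func main() {
    exportVal("NAME", "value")
}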

How to check for functions dependency in powershell scripts. Avoid running the same function multiple times

In my PowerShell script, one function's output is another function's input. For example, Function CreateReport($x) cannot run unless Function ParseXml($x) has run. What if a user directly runs the 2nd function before running the 1st? How can I check whether the 1st function has already run, i.e. first run the 1st function (generate the txt file) and then run the 2nd? And if the 1st function has already run, how do I avoid re-running it?
For Eg: Suppose I have a TestFunc.ps1 file having 2 functions as below
$X = "C:\XmlPath\file1.xml"
Function ParseXml($X)
{
#Read xml and output contents in a txt file
}
#This function should execute only after the function ParseXml($X); if ParseXml() has already run and generated the output, it should not be allowed to re-run here
Function CreateReport($T)
{
#from the txtfile Create csv
}
According to this and your other question How to alias a parameterized function as a flag in powershell script?, you are trying to implement a so-called build script. Instead of reinventing the wheel (implementing task dependencies, ensuring tasks run once, etc.), take a look at existing tools like psake or Invoke-Build. They are designed for PowerShell and do exactly what you want (run specified task sets, maintain task dependencies, run tasks once, etc.). These tools require a little learning, of course, but in the long run they are worth it.
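As an illustration, the dependency from the question might look like this in Invoke-Build's task DSL (a sketch, not tested against your scripts):
# build.ps1
task ParseXml {
    # read xml and output contents to a txt file
}

# CreateReport runs ParseXml first; each task runs at most once per build
task CreateReport ParseXml, {
    # from the txt file create the csv
}
Invoking Invoke-Build CreateReport then runs ParseXml followed by CreateReport.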
If the ParseXml function outputs a file, you can, in the CreateReport function, test for the existence of this file with the Test-Path cmdlet: if it exists, continue with the CreateReport function; otherwise call ParseXml before continuing.
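A minimal sketch, assuming ParseXml writes C:\XmlPath\file1.txt (the output path is my assumption):
Function CreateReport($T)
{
    $txt = "C:\XmlPath\file1.txt"    # file that ParseXml is expected to produce
    if (-not (Test-Path $txt)) {
        ParseXml $X                  # generate the txt file first
    }
    # ...create the csv from $txt...
}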
Use a flag. Set the flag in the ParseXml function and check it in the CreateReport function. If the flag isn't set, print an error and exit; otherwise run the reporting code. Remember to clear the flag when the process is complete.
You can use a flag variable. For more persistent flags, consider using flag files or setting the flag in a database.
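A sketch of the flag-variable variant (script-scoped; the names are illustrative):
$script:ParseXmlDone = $false

Function ParseXml($X)
{
    # ...read xml and write the txt file...
    $script:ParseXmlDone = $true
}

Function CreateReport($T)
{
    if (-not $script:ParseXmlDone) {
        Write-Error "Run ParseXml first."
        return
    }
    # ...create the csv...
    $script:ParseXmlDone = $false    # clear the flag when the process is complete
}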

How do I define custom function to be called from IPython's prompts?

I had an old ipy_user_conf.py in which I included a simple function into the user namespace like this:
import IPython.ipapi
ip = IPython.ipapi.get()
def myfunc():
...
ip.user_ns['myfunc'] = myfunc
Then, I could use myfunc in the prompt.
However, I updated to IPython 0.12.1 and now ipy_user_conf.py does not work. I haven't seen how to translate such a custom function for prompts to the new configuration model.
Which is the way to do this?
Best regards,
Manuel.
After reading a bit of the documentation (and peeking at the source code for leads) I found the solution to this problem.
You should now simply move all your custom functions to a module inside your .ipython directory. Since what I was doing was a simple function that returns the git branch and status for the current directory, I created a file called gitprompt.py and then included the filename in the exec_files configuration option:
c.InteractiveShellApp.exec_files = [b'gitprompt.py']
All definitions in such files are placed into the user namespace. So now I can use it inside my prompt:
# Input prompt. '\#' will be transformed to the prompt number
c.PromptManager.in_template = br'{color.Green}\# {color.LightBlue}~\u{color.Green}:\w{color.LightBlue} {git_branch_and_st} \$\n>>> '
# Continuation prompt.
c.PromptManager.in2_template = br'... '
Notice that in order for the function to behave as such (i.e. be called each time the prompt is printed) you need to use the IPython.core.prompts.LazyEvaluation class. You may use it as a decorator for your function. The gitprompt.py file has been placed in the public domain as a gist: https://gist.github.com/2719419
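For reference, a rough sketch of what such a gitprompt.py could contain (it assumes git is on the PATH; see the gist above for the real version, including the lazy-evaluation decorator):
# gitprompt.py -- lives in the .ipython directory; exec_files puts its
# definitions into the user namespace.
import subprocess

def git_branch_and_st():
    """Return the current git branch, or '' when not inside a repository.
    (Status reporting omitted for brevity.)"""
    try:
        out = subprocess.check_output(
            ['git', 'rev-parse', '--abbrev-ref', 'HEAD'],
            stderr=subprocess.STDOUT)
        return out.strip().decode('utf-8', 'replace')
    except Exception:
        return ''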

Accessing the Body of a Function with Lua

I'm going back to basics here, but in Lua you can define a table like so:
myTable = {}
myTable[1] = 12
Printing the table reference itself brings back a pointer to it. To access its elements you need to specify an index (i.e. exactly like you would with an array):
print(myTable)    --prints pointer
print(myTable[1]) --prints 12
Now functions are a different story. You can define and print a function like so:
myFunc = function() local x = 14 end --Defined function
print(myFunc) --Printed pointer to function
Is there a way to access the body of a defined function? I am trying to put together a small code visualizer and would like to 'seed' a given function with special functions/variables that allow the visualizer to 'hook' itself into the code. I would need to be able to redefine the function either from a variable or a string.
There is no way to get access to the body source code of a given function in plain Lua. The source code is thrown away after compilation to byte-code.
Note, BTW, that a function may be defined at run-time with a loadstring-like facility.
Partial solutions are possible, depending on what you actually want to achieve.
You may get the source code position from the debug library, if the debug library is enabled and debug symbols are not stripped from the bytecode. After that you may load the actual source file and extract the code from there.
You may decorate the functions you're interested in manually with the required metadata. Note that functions in Lua are valid table keys, so you may create a function-to-metadata table. You would want to make this table weak-keyed, so it would not prevent functions from being collected by the GC.
If you would need a solution for analyzing Lua code, take a look at Metalua.
Check out Lua Introspective Facilities in the debugging library.
The main introspective function in the debug library is the debug.getinfo function. Its first parameter may be a function or a stack level. When you call debug.getinfo(foo) for some function foo, you get a table with some data about that function. The table may have the following fields:
The field you would want is func I think.
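For example (plain Lua, with myFunc from the question; availability of some fields depends on how the chunk was loaded):
local info = debug.getinfo(myFunc)
print(info.func)            -- the function value itself
print(info.source)          -- "@filename" or the string the chunk was loaded from
print(info.linedefined)     -- line where the definition starts
print(info.lastlinedefined) -- line where it ends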
Using the debug library is your only bet. Using that, you can get either the string (if the function is defined in a chunk that was loaded with loadstring) or the name of the file in which the function was defined, together with the line numbers at which the function definition starts and ends. See the documentation.
Here at my current job we have patched Lua so that it even gives you the column numbers for the start and end of the function, so you can get the function source using that. The patch is not very difficult to reproduce, but I don't think I'll be allowed to post it here :-(
You could accomplish this by creating an environment for each function (see setfenv) and using global (versus local) variables. Variables created in the function would then appear in the environment table after the function is executed.
env = {}
myFunc = function() x = 14 end
setfenv(myFunc, env)
myFunc()
print(myFunc) -- prints pointer
print(env.x) -- prints 14
Alternatively, you could make use of the Debug Library:
> myFunc = function() local x = 14 ; debug.debug() end
> myFunc()
lua_debug> _, x = debug.getlocal(3, 1)
lua_debug> print(x) -- prints 14
It would probably be more useful to you to retrieve the local variables with a hook function instead of explicitly entering debug mode (i.e. adding the debug.debug() call).
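A sketch of that approach (Lua 5.1; with a return hook, the locals are still in scope when the hook fires):
debug.sethook(function()
    local i = 1
    while true do
        -- level 2 inside a hook is the function that triggered it
        local name, value = debug.getlocal(2, i)
        if not name then break end
        print(name, value)
        i = i + 1
    end
end, "r")          -- "r" = run the hook on every function return

myFunc = function() local x = 14 end
myFunc()           -- hook prints: x 14
debug.sethook()    -- remove the hook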
There is also a Debug Interface in the Lua C API.