Scope of variable in golang - function

I am trying to run a function in Go but I am getting some errors. Can anyone please help?
func exportVal(name string, value string) {
    var ops = isWindows()
    if ops == true {
        set name=value
        fmt.Printf("name=")
    } else {
        export name=value
        fmt.Printf("name=")
    }
}
But I am getting the error below.
D:\Go>go run octa.go
# command-line-arguments
.\octa.go:42:10: syntax error: unexpected name at end of statement
.\octa.go:46:13: syntax error: unexpected name at end of statement

set and export are not Go keywords. Maybe you're thinking of shell? It looks like you're trying to set and export environment variables.
You cannot export environment variables from one process to another. You can only change your own process's environment. Child processes will inherit the parent's environment, but you can't go the other way. You can only do it in a shell program because that "program" is really a set of commands to the shell itself, and only then when using source something.sh. sh something.sh, in contrast, is run in a new shell process.
If you want to "export" data from a non-shell program, you'll have to print out the data in some format, JSON is a good choice, and have that be read by the other process.
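For instance, here is a minimal sketch of that approach; the Settings type and its fields are made up for illustration. The program writes JSON to stdout, and whatever process needs the data reads and parses it from there.
package main

import (
    "encoding/json"
    "fmt"
    "os"
)

// Settings is a hypothetical payload; use whatever fields you need to hand over.
type Settings struct {
    Name  string `json:"name"`
    Value string `json:"value"`
}

func main() {
    s := Settings{Name: "DB_HOST", Value: "localhost"}
    // Write the data as JSON to stdout so another process can read it.
    if err := json.NewEncoder(os.Stdout).Encode(s); err != nil {
        fmt.Fprintln(os.Stderr, "encode failed:", err)
        os.Exit(1)
    }
}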

Go doesn't support the export keyword.
Are you trying to create a variable with global scope?
var foo string

func SetFoo(to string) {
    foo = to
}
Or are you trying to set an environment variable? In which case, use os.Setenv(key, value).
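For completeness, a minimal sketch of the os.Setenv route; note it only affects the current process and any child processes started after the call.
package main

import (
    "fmt"
    "os"
)

func main() {
    // Set an environment variable for this process (and future children).
    if err := os.Setenv("NAME", "value"); err != nil {
        panic(err)
    }
    fmt.Println(os.Getenv("NAME")) // prints: value
}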


Is it possible to have a module function that modifies parameter values of the caller script?

Motivation
Reduce the maintenance of an Azure DevOps task that invokes a Powershell script with a lot of parameters ("a lot" could be 5).
The idea relies on the fact that Azure DevOps generates environment variables to reflect the build variables. So, I devised the following scheme:
Prefix all non-secret Azure DevOps variables with MyBuild.
The task PowerShell script would call a function to check the script parameters against the MyBuild_ environment variables and would automatically assign the value of the MyBuild_xyz environment variable to the script parameter xyz if the latter has no value.
This way the task command line would only contain secret parameters (which are not reflected in the environment). Often, there are no secret parameters and so the command line remains empty. We find this scheme to reduce the maintenance of the tasks driven by a PowerShell script.
Example
param(
    $DBUser,
    [ValidateNotNullOrEmpty()]$DBPassword,
    $DBServer,
    $Configuration,
    $Solutions,
    $ClientDB = $env:Build_DefinitionName,
    $RawBuildVersion = $env:Build_BuildNumber,
    $BuildDefinition = $env:Build_DefinitionName,
    $Changeset = $env:Build_SourceVersion,
    $OutDir = $env:Build_BinariesDirectory,
    $TempDir,
    [Switch]$EnforceNoMetadataStoreChanges
)
$ErrorActionPreference = "Stop"
. $PSScriptRoot\AutomationBootstrap.ps1
$AutomationScripts = GetToolPackage DevOpsAutomation
. "$AutomationScripts\vNext\DefaultParameterValueBinding.ps1" $PSCommandPath -Required 'ClientDB' -Props @{
    OutDir  = @{ DefaultValue = [io.path]::GetFullPath("$PSScriptRoot\..\..\bin") }
    TempDir = @{ DefaultValue = 'D:\_gctemp' }
    DBUser  = @{ DefaultValue = 'SomeUser' }
}
The described parameter-binding logic is implemented in the script DefaultParameterValueBinding.ps1, which is published in a NuGet package. The bootstrap code installs the package and thus gets access to the script.
In the example above, some parameters default to predefined Azure DevOps variables, like $RawBuildVersion = $env:Build_BuildNumber. Some are left uninitialized, like $DBServer, which means it would default to $env:MyBuild_DBServer.
We can get away without the special function to do the binding, but then the script author would have to write something like this:
$DBServer = $env:MyBuild_DBServer,
$Configuration = $env:MyBuild_Configuration,
$Solutions = $env:MyBuild_Solutions,
I wanted to avoid this, because of the possibility of an accidental name mismatch.
The Problem
The approach does not work when I package the logic of DefaultParameterValueBinding.ps1 into a module function. This is because of the module scope isolation - I just cannot modify the parameters of the caller script.
Is it still possible to do? Is it possible to achieve my goal in a more elegant way? Remember, I want to reduce the cost associated with maintaining the task command line in Azure DevOps.
Right now I am inclined to retreat back to this scheme:
$xyz = $(Resolve-ParameterValue 'xyz' x y z ...)
Where Resolve-ParameterValue would first check $env:MyBuild_xyz and if not found select the first not null value out of x,y,z,...
But if the Resolve-ParameterValue method comes from a module, then the script must assume the module has already been installed, because it has no way to install it before the parameters are evaluated. Or has it?
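For reference, a minimal sketch of what such a Resolve-ParameterValue function might look like; the body below is an assumption for illustration, not the packaged implementation, and the fallback value in the usage comment is made up.
function Resolve-ParameterValue {
    param(
        [Parameter(Mandatory)][string]$Name,
        [object[]]$Fallbacks
    )
    # First look at the reflected build variable, e.g. $env:MyBuild_DBServer.
    $fromEnv = [Environment]::GetEnvironmentVariable("MyBuild_$Name")
    if ($fromEnv) { return $fromEnv }

    # Otherwise take the first non-null fallback value.
    foreach ($candidate in $Fallbacks) {
        if ($null -ne $candidate) { return $candidate }
    }
    return $null
}

# Usage in the script body:
# $DBServer = Resolve-ParameterValue 'DBServer' @($DBServer, 'sql01.example.local')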
EDIT 1
Notice the command line used to invoke the DefaultParameterValueBinding.ps1 script does not contain the caller script parameters! It does include $PSCommandPath, which is used to obtain the PSBoundParameters collection.
Yes, but it will require modifications to the calling script and the function: pass the parameters by reference. Adam B. has a nice piece on passing parameters by reference here:
https://mcpmag.com/articles/2015/06/04/reference-variables-in-powershell.aspx
Net-net, the following is an example:
$age = 12
function birthday {
    param([ref]$age)
    $age.value += 1
}
birthday -age ([ref]$age)
Write-Output $age
I've got an age of 12. I pass it into a function as a parameter. The function increments the value of $age by 1. You can do the same thing with a function in a module. You get my drift.
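To spell out the module variant, here is a minimal sketch; the module and function names are illustrative, not an existing package. The exported function receives a [ref] wrapper around the caller's parameter and assigns through .Value, which works across the module scope boundary.
# MyBuildDefaults.psm1 (illustrative module)
function Set-DefaultFromEnvironment {
    param(
        [Parameter(Mandatory)][string]$Name,
        [Parameter(Mandatory)][ref]$Parameter
    )
    # Fill the parameter only if the caller left it empty.
    if (-not $Parameter.Value) {
        $Parameter.Value = [Environment]::GetEnvironmentVariable("MyBuild_$Name")
    }
}
Export-ModuleMember -Function Set-DefaultFromEnvironment

# In the calling script:
#   Import-Module "$PSScriptRoot\MyBuildDefaults.psm1"
#   Set-DefaultFromEnvironment -Name 'DBServer' -Parameter ([ref]$DBServer)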

Error accessing a module function

I have a problem with basic module usage in Lua. I have one file "helloworld.lua" and a second file "main.lua". I would like to call a function from the first file inside the second file. But I am getting an error:
attempt to call field 'printText' (a nil value)
My actual code is below. Can someone tell me where the problem is?
helloworld.lua
local module = {}

function module.printText()
    print("Hello world")
end

return module
main.lua
hello = require("helloworld")
hello.printText()
As mentioned in the comments, this is the right way to do it. The error can still occur if a conflicting helloworld module is found elsewhere on the package path, or if you have a long-running Lua state and are modifying the files without starting a new one.
require only loads a module with a given name once; the result is cached in package.loaded["helloworld"]. You can set that entry to nil so that require will load the file again:
package.loaded["helloworld"] = nil
hello = require("helloworld") -- will load it for sure

Do powershell parameters need to be at the front of the script?

I'm trying to have a script with both executable code and a function, like the following:
function CopyFiles {
    Param( ... )
    ...
}
# Parameter for the script
param ( ... )
# Executable code
However, I run into the following error: "The assignment expression is not valid. The input to an assignment operator must be an object that is able to accept assignments, such as a variable or a property"
When I list my function at the end of the file, it says that the function name is undefined. How do I call a powershell function from executable code within the same script?
The correct order is:
1. Script parameters
# Parameter for the script
param([string]$foo)
2. Function definitions
function CopyFiles {
    Param([string]$bar)
    ...
}
3. Script code
# Executable code
CopyFiles $foo $bar
Why would you want it any other way?
Parameters go first always. I had a similar issue at one point in time with providing parameter input to a script. Your script should go:
param ( . . . )
# functions
# script body
For some reason, the PowerShell parsing engine doesn't appreciate the param keyword not being on the first line of a script, not counting comment lines. You can also do this:
param (
# params must be at the top of the file
)
You can also check whether your parameters have been declared, or whether they hold the input you expect, using Get-Variable. One other thing: if you want to cast data to a certain type, such as System.Boolean, do it AFTER the param block and BEFORE the functions. If you type-cast something to System.Boolean in the parameter declaration, you'll get errors whenever people running your script don't submit the input argument as a Boolean value; it is much easier to convert the value afterwards with the .NET System.Convert static methods and then check what it evaluates to.
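For example, a small sketch of converting after the param block; the $Force parameter name and the messages are made up for illustration.
# Parameter for the script
param(
    [string]$Force   # accept plain text such as "true" or "false"
)

# Function definitions would go here.

# Convert after the param block instead of declaring [System.Boolean]$Force.
$forceEnabled = if ($Force) { [System.Convert]::ToBoolean($Force) } else { $false }

# Script code
if ($forceEnabled) {
    Write-Output "Force mode is on"
}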

Netezza - making UDF functions available to multiple DBs

The Lua Developer's Guide (part of Netezza Analytics 3.0) has instructions for "compiling" .nzl functions using the "nzlua" command. Unfortunately, the function seems to be available only in the DB that is defined in the environment variable NZ_DATABASE when the "nzlua" command is executed - see the command output immediately below.
Question: How does one make the compiled function available to all databases on the appliance? That is, without altering the NZ_DATABASE env variable and redoing the nzlua command for each DB (including user sandbox DBs).
[nz#nzh1p01 examples]$ /nz/extensions/nz/nzlua/bin/nzl nzlua isdate.nzl
Compiling: isdate.nzl
####################################################################
UdxName = isdate
UdxType = UDF
Arguments = VARCHAR(40),VARCHAR(40)
Result = BOOL
Dependencies = INZA.INZA.LIBNZLUA_3_0_0
NZUDXCOMPILE OPTIONS: (--nullcall --unfenced --mem 2m)
CREATE FUNCTION
The function should be available in all the databases; however, you will need to call it using the full path: database..function.
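For example (the UDFDB database and the orders table below are illustrative), from a session connected to a different database:
-- ISDATE was created while NZ_DATABASE pointed at UDFDB.
-- The double dot skips the schema part of the fully qualified name.
SELECT UDFDB..ISDATE(order_date_text, 'YYYY-MM-DD')
  FROM orders
 LIMIT 10;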
Add the database that you registered the function into to the "search_path" environment variable in the /nz/data/postgresql.conf file. The function can then be referenced from anywhere :-)

Accessing the Body of a Function with Lua

I'm going back to the basics here but in Lua, you can define a table like so:
myTable = {}
myTable[1] = 12
Printing the table reference itself brings back a pointer to it. To access its elements you need to specify an index (i.e. exactly like you would with an array):
print(myTable) --prints pointer
print(myTable[1]) --prints 12
Now functions are a different story. You can define and print a function like so:
myFunc = function() local x = 14 end --Defined function
print(myFunc) --Printed pointer to function
Is there a way to access the body of a defined function? I am trying to put together a small code visualizer and would like to 'seed' a given function with special functions/variables to allow the visualizer to 'hook' itself into the code. I would need to be able to redefine the function either from a variable or a string.
There is no way to get access to the body source code of a given function in plain Lua. The source code is thrown away after compilation to byte-code. Note, by the way, that a function may also be defined at run-time with a loadstring-like facility.
Partial solutions are possible, depending on what you actually want to achieve.
You may get the source code position from the debug library, provided the debug library is enabled and debug symbols are not stripped from the bytecode. After that you may load the actual source file and extract the code from there.
You may decorate functions you're interested in manually with required metadata. Note that functions in Lua are valid table keys, so you may create a function-to-metadata table. You would want to make this table weak-keyed, so it would not prevent functions from being collected by GC.
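A minimal sketch of that decoration approach with a weak-keyed metadata table; the metadata fields here are just examples.
-- Keys are weak, so entries disappear once a function is garbage-collected.
local metadata = setmetatable({}, { __mode = "k" })

local function register(fn, info)
    metadata[fn] = info
    return fn
end

local myFunc = register(function() local x = 14 end, {
    source = "function() local x = 14 end",
    description = "sets a local x to 14",
})

print(metadata[myFunc].description) -- prints: sets a local x to 14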
If you would need a solution for analyzing Lua code, take a look at Metalua.
Check out Lua Introspective Facilities in the debugging library.
The main introspective function in the debug library is the debug.getinfo function. Its first parameter may be a function or a stack level. When you call debug.getinfo(foo) for some function foo, you get a table with some data about that function. The table may have the following fields:
The field you would want is func I think.
Using the debug library is your only bet. Using that, you can get either the string (if the function is defined in a chunk that was loaded with 'loadstring') or the name of the file in which the function was defined, together with the line numbers at which the function definition starts and ends. See the documentation.
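For example, assuming the debug library is available:
local function myFunc()
    local x = 14
    return x
end

local info = debug.getinfo(myFunc, "S")
print(info.source)          -- "@file.lua" for files, or the chunk string for loadstring
print(info.linedefined)     -- line where the function definition starts
print(info.lastlinedefined) -- line where the function definition ends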
Here at my current job we have patched Lua so that it even gives you the column numbers for the start and end of the function, so you can get the function source using that. The patch is not very difficult to reproduce, but I don't think I'll be allowed to post it here :-(
You could accomplish this by creating an environment for each function (see setfenv) and using global (versus local) variables. Variables created in the function would then appear in the environment table after the function is executed.
env = {}
myFunc = function() x = 14 end
setfenv(myFunc, env)
myFunc()
print(myFunc) -- prints pointer
print(env.x) -- prints 14
Alternatively, you could make use of the Debug Library:
> myFunc = function() local x = 14 ; debug.debug() end
> myFunc()
> lua_debug> _, x = debug.getlocal(3, 1)
> lua_debug> print(x) -- prints 14
It would probably be more useful to you to retrieve the local variables with a hook function instead of explicitly entering debug mode (i.e. adding the debug.debug() call).
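Here is a minimal sketch of that hook-based approach (Lua 5.1 syntax); myFunc is the same toy function as above.
-- Print myFunc's locals from a return hook instead of calling debug.debug().
myFunc = function() local x = 14 end

local function hook(event)
    -- Level 2 is the function that is about to return.
    if debug.getinfo(2, "f").func ~= myFunc then return end
    local i = 1
    while true do
        local name, value = debug.getlocal(2, i)
        if name == nil then break end
        print(name, value) -- prints: x   14
        i = i + 1
    end
end

debug.sethook(hook, "r") -- run the hook on every function return
myFunc()
debug.sethook()          -- remove the hook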
There is also a Debug Interface in the Lua C API.