Please help me figure out how to save two scripts in a single folder, since I am facing an issue while importing Script1 inside Script2. Below are the two scripts.
Script1 : Variable.sikuli
PID = r'C:\Program Files (x86)\Microsoft Office\Office14\outlook.exe'
When I save the script (Variable.sikuli), it creates by default a folder "Variable.sikuli" containing "Variable.py" and "Variable.html".
Script2 : openMO.sikuli
def openMO():
    openApp(PID)  # PID will be taken from Variable.sikuli

openMO()
When I save the script (openMO.sikuli), it creates by default a folder "openMO.sikuli" containing "openMO.py" and "openMO.html".
Now my questions are:
How to save the two scripts in a single folder?
How to import Variable.sikuli in openMO.sikuli?
I don't think you can (or should) place multiple Sikuli scripts into one folder to make them visible to each other. Generally, the directories/folders containing the .sikuli's you want to import have to be on sys.path. Sikuli automatically finds other Sikuli scripts in the same directory when they are imported. The imported script must contain the following statement as its first line:
from sikuli import *
This is necessary for the Python environment to know the Sikuli classes, methods, functions and global names.
Below is an example:
# on Windows
myScriptPath = "c:\\someDirectory\\myLibrary"
# on Mac/Linux
myScriptPath = "/someDirectory/myLibrary"
# all systems
if myScriptPath not in sys.path:
    sys.path.append(myScriptPath)
# supposing there is a myLib.sikuli
import myLib
# supposing myLib.sikuli contains a function "def myFunction():"
myLib.myFunction() # makes the call
More info is available in the Sikuli documentation.
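Applied to the two scripts in the question, the whole thing could look like the sketch below. The folder C:\SikuliScripts is an assumption; use whatever directory actually contains both .sikuli folders.

# Variable.sikuli -- needs the sikuli import as its first line
from sikuli import *

PID = r'C:\Program Files (x86)\Microsoft Office\Office14\outlook.exe'

# openMO.sikuli
from sikuli import *
import sys

# assumption: both Variable.sikuli and openMO.sikuli are saved here
myScriptPath = "C:\\SikuliScripts"
if myScriptPath not in sys.path:
    sys.path.append(myScriptPath)

import Variable  # resolves to Variable.sikuli on sys.path

def openMO():
    openApp(Variable.PID)  # PID is taken from Variable.sikuli

openMO()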
Maybe a bit of a late answer, but if you would like to import things like a path from another file, you could also try making use of a global variable.
Putting multiple scripts inside one .sikuli folder is not advised.
If your scripts/programs get bigger, it could become really messy.
With a global variable you make a variable that can be used throughout the whole script.
When you import a file in Python, its top-level code runs right away, and the variables are set.
If you define a global variable in script A and then let script B import script A, script B also knows what the global variables of script A look like.
To set or use a global variable inside a definition, you first have to declare it with: "global variableName"
I have some example code below that might make things clearer.
File: BrowserPath.sikuli
# Define a global variable
PathFireFox = ''

class Fire():
    def __init__(self):
        global PathFireFox
        PathFireFox = r"C:\Program Files (x86)\Mozilla Firefox\firefox.exe"

# Run Fire()
Fire()
File: BrowserMain.sikuli
# Import the other script.
from BrowserPath import *

class Main():
    def __init__(self):
        global PathFireFox
        App.open(PathFireFox)

# Run the Main class
Main()
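As a design note, the class wrapper is not strictly required here: a plain module-level assignment has the same effect, because importing the file runs its top-level code anyway. A minimal sketch of that variant (same file names as above):

# BrowserPath.sikuli -- simpler variant without the class
PathFireFox = r"C:\Program Files (x86)\Mozilla Firefox\firefox.exe"

# BrowserMain.sikuli
from BrowserPath import *
App.open(PathFireFox)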
I'm using SparkKubernetesOperator, which has a template field called application_file. Normally, when this field is given a file name, Airflow reads that file and templates the Jinja variables in it (just like the script field in the BashOperator).
So this works, and the file contents are shown in the Rendered Template tab with the Jinja variables replaced with the correct values.
start_streaming = SparkKubernetesOperator(
    task_id='start_streaming',
    namespace='spark',
    application_file='user_profiles_streaming_dev.yaml',
    ...
    dag=dag,
)
I want to use different files in the application_file field for different environments.
So I used a Jinja template in the field. But when I change application_file to user_profiles_streaming_{{ var.value.env }}.yaml, the rendered output is just the file name user_profiles_streaming_dev.yaml, not the file contents.
I know that recursive Jinja variable replacement is not possible in Airflow, but I was wondering if there is any workaround for having different template files.
What I have tried:
I tried using a different operator and doing an XCom push to read the file contents and send them to the SparkKubernetesOperator. While this was good for reading different files based on the environment, it did not solve the issue of getting the Jinja variables replaced.
I also tried making a custom operator which inherits from SparkKubernetesOperator and has a template field application_file_name, thinking that Jinja replacement would take place twice, but this didn't work either (a sketch of the attempt follows).
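For reference, a minimal sketch of what that subclassing attempt might have looked like. The class name is made up, and the import path is an assumption based on the cncf.kubernetes provider package; as said above, this approach did not work, because each templated field is rendered only once.

# Hypothetical sketch of the failed subclassing attempt (illustration only).
from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import SparkKubernetesOperator

class EnvAwareSparkOperator(SparkKubernetesOperator):
    # Add the file name itself as a templated field, hoping the rendered
    # name would then be templated a second time as file contents.
    template_fields = tuple(SparkKubernetesOperator.template_fields) + ("application_file_name",)

    def __init__(self, application_file_name, **kwargs):
        super().__init__(application_file=application_file_name, **kwargs)
        self.application_file_name = application_file_name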
I made an env file which had the environment details (dev/prod). Then I added this code to the start of my DAG file:
ENV = None
with open('/home/airflow/env', 'r') as env_file:
    value = env_file.read()
    if value is None or value == "":
        raise Exception("ENV FILE NOT PRESENT")
    ENV = value
and then accessed the environment in the code like this:
submit_job = SparkKubernetesOperator(
    task_id='submit_job',
    namespace="spark",
    application_file=f"adhoc_{ENV}.yaml",
    do_xcom_push=True,
    dag=dag,
)
This way I could have separate dev and prod files.
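Note that the env file is read at DAG-parse time (the top-level code of a DAG file runs every time the scheduler parses it), so the file name is resolved before any templating happens.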
Trying to include a function in a PowerShell script. I get the message that the function does not exist.
I have the function just below where I am creating the parameters. I assume I'm missing something. I have a number of folders that I want to back up and want to call the function for each folder.
CODE STARTS HERE (code above and param creation left off for brevity).
$ImagesSaveTo = "s3://crisma-backup/HERTRICH/Milford/$dow/images"
#
# Call Backup
BackupCrismaByShop -$LogFile $CrismaSource $CrismaSaveTo $ImagesSource
# Begin Backup Function
# ------------------------------------------------------------------------
function BackupCrismaByShop {
    param(
        [string]$LogFile,
        [string]$CrismaSource,
        [string]$CrismaSaveTo,
        [string]$ImagesSource
    )
    # Backup script....
}
PowerShell is an interpreted language: files are read top to bottom and interpreted as they go.
So, if a function is called before it has been defined, the PowerShell interpreter does not know what you are talking about.
Reordering your code should do the trick:
# DEFINE FUNCTION
function BackupCrismaByShop {
    param(
        [string]$LogFile,
        [string]$CrismaSource,
        [string]$CrismaSaveTo,
        [string]$ImagesSource
    )
    # Backup script....
}
# YOUR VARIABLES AND OTHER STUFF
$ImagesSaveTo = "s3://crisma-backup/HERTRICH/Milford/$dow/images"
# CALLING THE FUNCTION (note: pass the parameters by name; "-$LogFile" would expand the variable instead of naming the parameter)
BackupCrismaByShop -LogFile $LogFile -CrismaSource $CrismaSource -CrismaSaveTo $CrismaSaveTo -ImagesSource $ImagesSource
I imagine you are using PowerShell ISE to write your code. Let me suggest trying Visual Studio Code instead. It provides recommendations and warnings as you code, such as variables you never use, or functions called before they are defined.
Thanks.
I know the basic `include "filename.v" directive. But I am trying to include a module that is in another folder, and that module further includes other modules present in its own folder. When I try to compile the top-level module, I get an error:
C:\Users\Dell\Desktop\MIPS>iverilog mips.v
./IF/stage_if.v:2: Include file instruction_memory_if.v not found
No top level modules, and no -s option.
Here, I am trying to make a MIPS processor, which is contained in the file "mips.v". The first statement of this file is `include "IF/stage_if.v". In the IF folder there are numerous files that I have included in stage_if.v, one of which is "instruction_memory_if.v". Below is the directory-level diagram:
-IF
    instruction_memory_if.v
    stage_if.v
+ID
+EX
+MEM
+WB
mips.v
You need to tell iverilog where to look using the -I flag.
In top.v:
`include "foo.v"

program top;
    initial begin
        foo();
    end
endprogram
In foo/foo.v:
task foo;
    $display("This was printed in the foo module");
endtask
Which can be run using the commands:
iverilog -g2012 top.v -I foo/
vvp a.out
>>> This was printed in the foo module
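Applied to the directory layout in the question, that would be something like iverilog -I IF mips.v run from the MIPS folder, adding further -I flags for the other stage folders (ID, EX, MEM, WB) once they have their own `include files.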
I run an application that generates and updates a number of files in a specific folder. While the application runs, I observe the contents of the folder through the Sublime sidebar. Because I am interested in seeing the current size of each file while the application runs, I have an open terminal (Mac) where I use the following command to get the live state of the folder:
watch -d ls -al -h folderName
I was wondering if I can obtain this information directly from Sublime.
So my question is: is it possible to have the size of each file next to the file names in the Sublime sidebar? And if yes, how?
Since the sidebar is not in the official API, I don't think this is possible, or at least it is not easy.
However, getting the information into Sublime Text is easy. You can achieve this by using a view: just execute the ls command and write the result into the view.
I wrote a small (ST3) plugin for this purpose:
import subprocess

import sublime
import sublime_plugin

# change to whatever command you want to execute
commands = ["ls", "-a", "-s", "-1", "-h"]
# the update interval
TIMEOUT = 2000  # ms


def watch_folder(view, watch_command):
    """Create a closure to watch a folder and update the view content."""
    window = view.window()

    def watch():
        # stop if the view is no longer open
        open_views = [v.id() for v in window.views()]
        if view.id() not in open_views:
            print("closed")
            return
        # execute the command and read the output
        output = subprocess.check_output(watch_command).decode()
        # replace the view content with the output
        view.set_read_only(False)
        view.run_command("select_all")
        view.run_command("insert", {"characters": output})
        view.set_read_only(True)
        # call this function again after the interval
        sublime.set_timeout(watch, TIMEOUT)
    return watch


class WatchFolderCommand(sublime_plugin.WindowCommand):
    def run(self):
        folders = self.window.folders()
        if not folders:
            sublime.error_message("You don't have any folders open")
            return
        folder = folders[0]  # get the first folder
        watch_command = commands + [folder]

        # create a view and set the desired properties
        view = self.window.new_file()
        view.set_name("Watch files")
        view.set_scratch(True)
        view.set_read_only(True)
        view.settings().set("auto_indent", False)

        # create and call the watch closure
        watch_folder(view, watch_command)()
Just open the User folder (or any other sub-folder of Packages), create a Python file (e.g. watch_folder.py), and paste in the source code.
You can bind it to a key by adding the following to your keymap:
{
    "keys": ["ctrl+alt+shift+w"],
    "command": "watch_folder",
},
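You can also trigger it without a keybinding by running window.run_command("watch_folder") in the Sublime console (View > Show Console).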
I've put my functions in a separate file and I call the file with:
$workingdir = Split-Path $MyInvocation.MyCommand.Path -Parent
. "$workingdir\serverscan-functions.ps1"
But if I then call a function like
my-function
what will the variable scope (from within "my-function") be?
Should I still use $script:variable to make a variable exist outside the function, or has the function been dot-sourced as well?
Hope I don't confuse anyone with my question... I've tried to make it as understandable as possible, but I'm still learning all the basic concepts, so I find it hard to explain.
When you dot-source code, it behaves as if that code were still in the original script. The scopes will be the same as if it were all in one file.
C:\functions.ps1 code:
$myVariable = "Test"
function Test-DotSource {
    $script:thisIsAvailableInFunctions = "foo"
    $thisIsAvailableOnlyInThisFunction = "bar"
}
main.ps1 code
$script:thisIsAvailableInFunctions = ""
. C:\functions.ps1
# Call the function to set values.
Test-DotSource
$script:thisIsAvailableInFunctions -eq "foo"
# Outputs True because of the script: scope modifier
$thisIsAvailableOnlyInThisFunction -eq "bar"
# Outputs False because it's undefined in this scope.
$myVariable -eq "Test"
# Outputs true because it's in the same scope due to dot sourcing.
In order to achieve what you want, you'll probably need to create a module. In the module, export the functions using Export-ModuleMember, and as long as you don't explicitly export any variables as module members, you should be fine.
Once you've created the module, import it using the Import-Module cmdlet.
My 2 cents:
Usually (after a past Andy Arismendi answer! God bless you, man!) I save all my scripts in the $pwd folder (added to the system PATH environment variable). Then I can call them from the console without dot-sourcing, and no script variables poison the console after a script finishes its job.
If you cannot turn your functions into simple scripts (sometimes it happens), I agree with Trevor's answer: create a module and import it in $profile.