short commands are not working in customized wish shell - tcl

I have a customized version of the wish 8.6 shell with our own environment loaded.
The issue is that in the native wish shell, short commands work,
e.g. packa r xxx for package require, or stri e $str1 $str2 for string comparison.
But when I run the same thing in my customized shell, it says
invalid command name "packa"
However, abbreviation does work for a command's subcommands: package re works for requiring the package.
What could be the possible cause of wish being unable to resolve the command name?
I know it's a bit difficult to answer for a customized shell, but if someone could share probable causes based on logic, that would be a great help.

It sounds like you're not setting the global tcl_interactive to 1. That enables expansion of abbreviated command names as well as calling external programs without an explicit exec and a few other things (all of which is done in the unknown command handler procedure, or things it calls; if you want to customise things instead of working like tclsh does, look there).
Handling of unique prefixes of subcommand names is entirely separate.
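For illustration, here is a minimal sketch, assuming your customized shell runs a startup script and still uses the stock unknown handler from Tcl's init.tcl:
# Mark the interpreter as interactive so the standard [unknown] handler
# will expand abbreviated command names (and fall back to running
# external programs without an explicit exec).
set ::tcl_interactive 1
# After this, typing an abbreviation such as
#   packa r xxx
# at the prompt is handed to [unknown], which finds the unique command
# starting with "packa" and re-runs it as [package require xxx].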

Related

Using write access in Open command in TCL

How can I use write ('w') and read ('r') access while using a command pipeline with the open command in Tcl?
When I do something like:
set f1 [open "| ls -l" w]
it returns a file descriptor to write to, say file1.
Now I am confused about how I can put this file descriptor to use.
PS: My example might be wrong, and in that case it'd be ideal if the answer included a programming example so that it's clearer.
Thanks
In general, the key things you can do with a channel are write to it (using puts), read from it (using gets and read), and close it. Obviously, you can only write to it if it is writable, and only read from it if it is readable.
When you write to a channel that is implemented as a pipeline, you send data to the program on the other end of the pipe; that's usually consuming it as its standard input. Not all programs do that; ls is one of the ones that completely ignores its standard input.
But the other thing you can do, as I said above, is close the channel. When you close a pipeline, Tcl waits for all the subprocesses to terminate (if they haven't already) and collects their standard error output, which becomes an error message from close if there is anything. (The errors are just like those you can get from calling exec; the underlying machinery is shared.)
There's no real point in running ls in a pure writable pipeline, at least not unless you redirect its output. Its whole purpose is to produce output (the sorted list of files, together with extra details with the -l option). If you want to get the output, you'll need a readable channel (readable from the perspective of Tcl): open "| ls -l" r. Then you'll be able to use gets $f1 to read a line from the subprocess.
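A minimal sketch of that readable pipeline, assuming a Unix-like system with ls on the PATH:
# Read the subprocess's output line by line.
set f1 [open "| ls -l" r]
while {[gets $f1 line] >= 0} {
    puts "got: $line"
}
close $f1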
But since ls is entirely non-interactive and almost always has a very quick running time (unless your directories are huge or you pass the options to enable recursion), you might as well just use exec. This does not apply to other programs. Not necessarily anyway; you need to understand what's going on.
If you want to experiment with pipelines, try using sort -u as the subprocess. That takes input and produces output, and exhibits all sorts of annoying behaviour along the way! Understanding how to work with it will teach you a lot about how program automation can be tricky despite it really being very simple.
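For example, a small sketch of such an experiment (assumes Tcl 8.6 for the half-close, and sort on the PATH):
# Open a read/write pipeline to sort -u.
set f1 [open "| sort -u" r+]
puts $f1 banana
puts $f1 apple
puts $f1 banana
# sort produces nothing until its input ends, so half-close the write
# side first; closing also flushes the buffered lines.
close $f1 write
while {[gets $f1 line] >= 0} {
    puts "unique: $line"
}
close $f1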

How do I find where a function is declared in Tcl?

I think this is more of a Tcl configuration question rather than a Tcl coding question...
I inherited a whole series of Tcl scripts that are used within a simulation tool that my company built in-house. In my scripts, I'm finding numerous instances where there are function calls to functions that don't seem to be declared anywhere. How can I trace the path to these phantom functions?
For example, rather than use source, someone built a custom include function that they named INCLUDE. Tclsh obviously balks when I try to run the scripts there, but within my simulation software, they run fine.
I've tried grepping through the entire simulation software for INCLUDE, but I'm not having any luck. Are there any other obvious locations outside the simulation software where a Tcl function might be defined?
The possibilities:
Within your software (you have checked for this).
Within some other package included by the software. Check whether the environment variable TCLLIBPATH is set, and also whether the simulation software sets TCLLIBPATH itself. This is a list of directories to search for Tcl packages, and you will need to search the packages that are located outside of the main source tree. Another possibility is that the locations are specified in a pkgIndex.tcl file; check any pkgIndex.tcl files and look for locations outside the main source tree.
Within an unknown command handler. This could be in your software or within some other package. You should be able to find some code that processes the INCLUDE statement.
Within a binary package. These are shared libraries that are loaded by Tcl. If this is the case, there should be some C code used to build the shared library that can be searched.
Since you say there are numerous instances of unknown functions, my first guess is that you have not found all the directories that packages are loaded from. But an unknown command handler is also a possibility.
Edit: One more possibility I forgot. Check whether your software sets the auto_path variable, and check any directories added to auto_path for other packages.
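As a hedged sketch of those checks, run at the simulation tool's Tcl prompt (INCLUDE is the command name from the question):
# Where does package loading look?
if {[info exists ::env(TCLLIBPATH)]} { puts "TCLLIBPATH: $::env(TCLLIBPATH)" }
puts "auto_path: $::auto_path"
# Is INCLUDE a script-level proc, some other kind of command, or only
# resolved through the unknown handler?
puts "procs named INCLUDE:    [info procs INCLUDE]"
puts "commands named INCLUDE: [info commands INCLUDE]"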
This isn't a great answer for you, but I suspect it is the best you're going to get...
The procedure could be defined in a great many places. Your best bet for finding it is to use a tool like findstr (on Windows) or grep -R (on POSIX platforms) to search across all the relevant source files. But that still might not help! It might not be a procedure but instead a general command, which could be implemented in C and not as a procedure, or it could be defined in a packaged application archive (which are usually awkward to look inside). There are other types of script-implemented command too, which could make things awkward. Generally, searching and investigating is your best bet, but it might not work.
Tcl doesn't really differentiate strongly between different types of command except in some introspection operations. If you're lucky, you could find that info body tells you the definition of the procedure (and info args and info default tell you about the arguments) but that won't help with other command types at all. Tcl 8.7 will include a command (info cmdtype) that would help a lot with narrowing down what to do next, but that's no use to you now and it definitely doesn't exist in older versions.
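If it does turn out to be a procedure, a small sketch of that introspection, using a made-up proc greet as a stand-in for the real command:
proc greet {name {greeting Hello}} { return "$greeting, $name" }
puts [info args greet]                ;# name greeting
if {[info default greet greeting def]} {
    puts "default for greeting: $def" ;# Hello
}
puts [info body greet]                ;# the procedure's script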

Is there an alternative to the load command to import binary Tcl package

I am using a commercial tool interfaced with a homebrew tclsh (Synopsys EDA).
In their version, they removed the load command, so I cannot use third-party libraries (the Graphviz library in my case).
I wonder if there is another way to import binary files (.so files).
The only command in standard Tcl that brings in a dynamic library is load. (OK, package require can do too, but that's because it can call load inside.) Without that command, you only have options like statically linking your own code in and creating the commands in the Tcl_AppInit function, but that's really unlikely to work if you're already using someone else's code that's already done that sort of thing.
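For context, a minimal sketch of how package require normally reaches load, via a pkgIndex.tcl entry; the library and package names here are made up:
# pkgIndex.tcl: $dir is supplied by Tcl when it sources this file.
# With the load command removed, this route is unavailable as well.
package ifneeded tclgraphviz 1.0 [list load [file join $dir libtclgraphviz.so] Tclgraphviz]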
The easiest approach might be to run a normal tclsh as a subprocess, via exec tclsh script.tcl (run and wait for termination) or open |tclsh r+ (open a pipeline). That's assuming they've not turned off those capabilities as well; you might be running in a safe interpreter where all those things are systematically disabled. I don't know of any way to break out of a standard safe interpreter (the mechanism for locking them down errs on the side of caution), so if that's the case, you'll just have to save the data you want to a file somewhere (by any mechanism that works; safe interpreters also can't touch the filesystem at all by default, though that is often profiled back in in protected ways) and use a completely separate program to work with it.
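A minimal sketch of both variants, assuming tclsh is on the PATH and graph.tcl is a hypothetical helper script that loads the Graphviz bindings and prints its result:
# One-shot: run the script in a stock tclsh and capture its output.
set result [exec tclsh graph.tcl]
puts $result
# Or keep a tclsh open as a pipeline for back-and-forth use.
set pipe [open |tclsh r+]
fconfigure $pipe -buffering line
puts $pipe {puts [expr {6 * 7}]; flush stdout}
puts "tclsh said: [gets $pipe]"
close $pipe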

Is it a good practice to execute terminal commands from a Perl or PHP script? [duplicate]

This question already has answers here:
Executing system commands safely while coding in Perl
(3 answers)
Closed 6 years ago.
Like executing a command using backticks or exec() or system().
Yes, this is bad practice in almost all cases! This is especially true if you have a choice.
Executing external commands is:
Very hard to do correctly (but very easy to get sort-of-working). You can't just escape the shell args and call it a day; you have to account for potentially multiple levels of escaping for potentially different languages.
Slow. A fork+exec to e.g. rm is easily a thousand times slower than the corresponding syscall.
A rigid, error-prone and inexpressive integration point. You typically have to convert data to flat lists of strings and back. You can't use the language's features like exception handling, nested data structures or callbacks.
Due to this, the following are BAD reasons to call external commands:
Not knowing how to do X in your language, but knowing a shell command for it. A typical example is cp -R foo bar.
Not knowing how something works, but knowing a shell oneliner that does it. A typical example is foo *.mp4 > >(tee file).
Not wanting to learn a new API for e.g. json or http, and instead using shell tools like jq or curl.
However, if you are calling a program that does non-trivial things, that doesn't have a native library or bindings, AND that you know how to invoke with execve semantics (NOT system nor perl exec semantics that invoke shells), this is a valuable tool.
Examples of good uses of executing external commands that follow all the above are invoking make to build a project from an installer, or running java -jar ... to start a Minecraft server.
It's a great* practice. Perl and PHP are great for lots of things, but they're not great at everything, and there's a use case for using external programs and other tools in your project. But one of the things that Perl is definitely great at is gluing together input and output formats and letting you mash together several different tools into a single project, letting each part of the project do what it does best.
* by which I mean, often a great practice. Things like @files=qx(ls $dir) and @txt=qx(cat $textfile) make all right-thinking Perl programmers cringe.
Some operating system commands might have built-in functionality not available in the language (Perl's mkdir() lacks *ix mkdir's -p). Then again, something might be easier to do using language constructs instead of parsing output (readdir() vs ls).
And it is important to remember that something written in Perl might be more portable to non-Unix systems than calling OS-specific external programs.
Why not?
But, be careful and escape all strings inserted in the command:
In PHP: escapeshellarg()
In Perl: Perl equivalent of PHP's escapeshellarg

Scripting language shells?

Are there any software packages or projects that provide scripting language shells? I know there's csh for C programmers, although not in the sense that it's primarily for programming; it's for navigation and system administration. I was wondering if there is something with those priorities inverted, i.e. the user logs into a shell that's primarily for programming and secondarily for navigation (something like irb in Ruby, but with navigation capabilities)?
I think you're misinformed if you think csh (tcsh) is for C programmers. It's just a shell like bash or ash or dash or ksh or zsh.
The R language provides a reasonably functional internal environment, complete with the ability to save/restore the "workspace" (your variables).
Python has a built-in interpreter, as does Maxima, and some Lisp/Scheme versions, plus you already mentioned irb.
You could also view vim or emacs as the type of programmer-centric shell you're talking about; both can be hooked up to run navigation commands and sysadmin-type stuff without forcing you to leave the editor.
I think the real answer to your question is "powerful shells provide their own scripting language".
Tcl's interpreter, tclsh, is really designed to be a shell. In fact, unlike Ruby, where the interactive and non-interactive shells are separate, tclsh works just like traditional shells such as bash: run without a script it enters interactive mode, but given a script it enters batch mode.
But it does suck in that it doesn't have readline built in, so no up-arrow history or tab completion etc. You can always run it using rlwrap:
rlwrap tclsh
which should give you readline capabilities.
However, I wasn't satisfied (partly because my system at the time didn't have rlwrap and partly because there were a few more features I wanted). So I wrote my own implementation of history and tab completion etc. Check out my original Pure-tcl readline or the improved Pure-tcl readline2.
It really does act like an interactive shell complete with auto-executing external executables if a tcl command is unknown. And you can even execute interactive programs like vi, emacs or lynx from it. Because it automatically falls back to executing external commands, you can mix tcl and shell like:
# Relies on the unknown-command fallback: ps aux | grep apache runs as an
# external pipeline and its output is captured by the brackets.
foreach x [split [ps aux | grep apache] \n] {
    puts [lindex $x 1]
}
This is great because tcl's syntax is much saner compared to bash and sh (ever tried to get out of '"\"\\\\"\ quoting hell in bash?). I personally like tcl but tcl is kind of a love-it-or-hate-it language. People who get it really love it and people who don't really hate it.
But even if you don't quite like tcl syntax I'd suggest you give it a try for this specific application because unlike other languages tcl really is designed to be used more as a command language than a programming language. Read I can’t believe I’m praising Tcl for some of the reasons why.
System navigation (and administrative tasks) are a really different application than programming, and it's hard to find a single shell that does both well. However, I'm guessing that what you're really asking for is a shell that:
1. Lets you easily load the contents of a file and manipulate those contents in-process and with more dexterity than you get using bash and standard Unix utilities.
2. Gives you the convenience of accessing some of the normal commands for moving files around and navigating the file system.
The good news is that the standard scripting languages (e.g. Ruby, Perl) were meant to do #1 really well, and it's not hard to write or find a library to do #2 in any of these languages.
Because Ruby is what I'm familiar with, I'm going to give you a more concrete example of how you might accomplish this using Ruby.
To do this in Ruby, you would use irb (the Ruby REPL), and the FileUtils module which is part of Ruby's standard library.
To do this, start irb, then run
require 'fileutils'
include FileUtils
(you can put this in .irbrc if you'd like, but I'm not sure I'd recommend that.)
This gives you access to a number of the normal file manipulation commands through easy Ruby syntax. You can run any other Ruby commands yourself as well. To run other commands on your system, you're going to have to call them with system.
FileUtils doesn't include an ls command, because it wasn't really meant to be used interactively, so you'll need to write your own. I don't know a way to get good job control at all (that's not to say you couldn't write something though).
The only thing I warn you is that this workflow will be very different from other UNIX users', so you might want to think about whether being such a nonconformist is worth it, or whether you'd rather build experience that meshes well with other UNIX admins' working styles. It's probably better to get used to the core UNIX utils and the Bourne shell scripting language. (You could learn C-Shell if you want, but there is a well-known FAQ explaining the disadvantages of the C-shell for programming.)
You may want to take a look at IPython. It is an interactive Python shell (with filesystem navigation alongside other nice features) and it also provides a system shell profile to optimize its behavior for system shell usage.
CSH has nothing to do with C programming. It serves the same functions as the Bourne shell about equally well, but uses different syntax.
If you want C interpreter, I suggest using cint, which is part of CERN's ROOT system. But keep in mind that it's not useful in the least for system administration and navigation.
I'm sure that with a little bit of work you could further extend Devel::REPL (Perl) to provide access to GNU coreutils.
Bash has lots of programming features that aren't ordinarily acknowledged, for example arrays and string manipulation options when expanding a shell variable. Some shells, like zsh or ksh, have greatly improved programming features compared to the most common shells (namely bash or tcsh).