Does Tcl have a standard way of doing NSS lookups (getpwnam, setpwent, ...)?
Tcl doesn't expose those as APIs (it doesn't really use them internally either), but the TclX extension package supports exactly what you want, I believe. For example:
package require TclX
set uid [id convert user $tcl_platform(user)]
puts "Your userid is $uid and you are a member of these groups: [id groups]"
If you're using ActiveTcl, you've definitely got the TclX package available (either already installed or available from the teapot repository).
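The same id command family handles the reverse mappings and group lookups too. A small sketch, assuming TclX's documented id convert subcommands:
package require TclX
# Numeric uid -> user name, and group name -> numeric gid.
set name [id convert userid 0]       ;# "root" on most Unix systems
set gid  [id convert group daemon]   ;# gid of the "daemon" group
puts "uid 0 is $name; group daemon has gid $gid"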
Related
I'm trying to cross compile GNU grep for Windows from Fedora, using their mingw64 cross compilers. The process is really easy, with one exception. By default, it appears that mingw64 doesn't expand wildcards on the command line, so that grep FOO * gives "Invalid argument: *" rather than searching all files in the current directory.
After a bit of research, I found that there is an external symbol, _dowildcard in the mingw64 CRT, that will trigger wildcard expansion if set to -1. But I've found no useful documentation on how to set this (maybe because it's considered obvious ;-)).
I could modify the source code to set the variable, but I'd much prefer to not have to modify the source if at all possible. (I want to set up an automated build, and applying code patches just adds complexity that I'd like to avoid). Is there any way to set _dowildcard from the configure or make command line? I seem to remember older versions of mingw having a setargv.obj file that could be linked into your project to enable wildcard expansion - is there anything similar for mingw64?
Answer from @ssbssa above:
There is a CRT_glob.o file supplied with the mingw packages, at /usr/x86_64-w64-mingw32/sys-root/mingw/lib/CRT_glob.o (or the corresponding location for 32-bit), that you can link into your executable to activate command-line globbing.
You have to specify the file by full pathname for the linker to find it.
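For an autoconf-based build like grep's, one way to get the object onto every link line without patching the source is to pass it at configure time. A sketch; whether grep's build honors LDFLAGS for this purpose is an assumption worth checking:
./configure --host=x86_64-w64-mingw32 \
    LDFLAGS="/usr/x86_64-w64-mingw32/sys-root/mingw/lib/CRT_glob.o"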
I would like a way to limit which packages can be loaded into Tcl.
I can't just remove them from the path, as it also contains packages that should be loaded.
Is there a way to handle this without separating the packages into different directories?
Also, is there a way to add a trace to every package load, so I can see after the fact which packages have been loaded?
Thanks
Tcl loads packages in two stages, which are described in a reasonable amount of detail in the documentation for the package command:
If you ask for a package it does not already know about, it calls the handler registered with package unknown to search the package path (whose pieces come from various origins) for package index files (pkgIndex.tcl), which provide descriptors of the packages that can be loaded. These descriptors are scripts that call package ifneeded to register a script to run to load a particular version of a particular package. If Tcl still doesn't know about the package after reading the index files, you get an error message.
Most pkgIndex.tcl scripts are incredibly simple, though they tend to be generated dynamically during the package installation process. A couple of minor points, though: they are not evaluated in the global scope, but inside a scope that defines a local dir variable giving the name of the directory containing the pkgIndex.tcl script, and they are always assumed to be UTF-8 encoded on all platforms (I recommend only using ASCII characters in them anyway, which tends to be easy precisely because the script is entirely under the programmer's control).
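For reference, a typical generated descriptor is a one-liner along these lines (a generic sketch; foo and foo.tcl are placeholder names):
# $dir is the local variable mentioned above: the directory
# that holds this pkgIndex.tcl file.
package ifneeded foo 1.0 [list source [file join $dir foo.tcl]]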
If you ask for a package it does know about, because the package is either already loaded or there is a descriptor present for it, it checks the version numbers for a match and if the package isn't actually loaded, it evaluates the script provided through package ifneeded in the global context to actually load the package. That will usually source Tcl scripts and/or load DLLs. Some packages do more complicated things than that, but it's common to not put too much complexity in the descriptor scripts themselves: complicated stuff goes into the code they load in, which can be as complex as the package needs it to be.
Now, there is another package subcommand that is useful to you, package forget, which makes Tcl forget the package descriptor and whether the package is loaded. It does not forget the implementation of the package if that is actually loaded, but if you forget the descriptor before stage two, you deny the Tcl interpreter the ability to load the package. If you've got a whitelist of allowed package names and versions, all you need to do is intercept the package unknown handler, delegate to the old implementation, and then strip out anything you don't want before returning, as the code below does. Fortunately, the unknown-package-handler system is designed to be intercepted like this.
# These packages are built into Tcl and are not version-checked
set builtins {zlib TclOO tcl::tommath Tcl}
# These are the external packages we want to allow; THIS IS A WHITELIST!
set whitelist {
    Thread {2.8.1}
    yaml {0.3.1}
    http {2.8.11}
}

proc filterPackages {wrappedCall args} {
    # Do the delegation to the original handler first
    uplevel 1 [list {*}$wrappedCall {*}$args]
    # Now filter out anything unexpected
    global builtins whitelist
    foreach p [package names] {
        # Tolerate the true builtins without checking versions
        if {$p in $builtins} continue
        set scripts {}
        foreach v [package versions $p] {
            if {[dict exists $whitelist $p] && $v in [dict get $whitelist $p]} {
                # This is a version we want to keep
                lappend scripts [list \
                    package ifneeded $p $v [package ifneeded $p $v]]
                # Everything else is going to be forgotten in a few moments
            }
        }
        # Forget all this package's descriptors...
        package forget $p
        # ...but put back the ones we want.
        foreach s $scripts { eval $s }
    }
}

# Now install our filter
package unknown [list filterPackages [package unknown]]
As long as you install this filter early in your script, before any package require calls have been made (except for the small list of true builtins), you'll have a system that can only package require the packages you want to allow.
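As for the tracing part of the question: a command trace on package itself will log every load as it happens. A minimal sketch using Tcl's trace command:
# Log every [package require] to stderr as it runs.
trace add execution package enter {apply {{call op} {
    if {[lindex $call 1] eq "require"} {
        puts stderr "package require: [lrange $call 2 end]"
    }
}}}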
I have many package require calls inside my Tcl code, as below:
package require tlautils 1.0
package require profilemanager
package require tla::log 1.2
package require res_mgr 1.1
package require resource 1.2
package require sysexits
package require utils 1.5
package require testutil 1.0
package require profilemgr
package require leakChecker
package require testutil
What is an alternative to using so many package require calls? They are taking noticeable time (seconds, or at least milliseconds), and I am searching for a faster approach.
The package require lines don't really take much longer than the load and source calls that they delegate to (all the package system does is save you from hard-coding paths to everything, while taking care of versions and so on). However, when you package require a package whose name is not already known, Tcl has to search for the pkgIndex.tcl files that describe how to load the packages. It does this by calling the code that you can look up (or replace if necessary) using package unknown, and that search is really quite slow. Depending on the contents of the TCLLIBPATH environment variable, it can be extremely slow.
But we can “compile” that slow search, so that future runs can source a single file and load these specific packages on this machine quickly.
To do that, we need the above package requires and a bit of extra wrapping code:
package require tlautils 1.0
package require profilemanager
package require tla::log 1.2
package require res_mgr 1.1
package require resource 1.2
package require sysexits
package require utils 1.5
package require testutil 1.0
package require profilemgr
package require leakChecker
package require testutil
set f [open "pkgdefs.tcl" w]
foreach pkg [package names] {
    # Skip the packages built directly into Tcl 8.6
    if {$pkg in {zlib TclOO tcl::tommath Tcl}} continue
    # Get the actual loaded version; known but absent would be an error
    if {[catch {package present $pkg} version]} continue
    # Get the actual script that we'd run to load the package.
    set script [package ifneeded $pkg $version]
    # Add to the file
    puts $f [list package ifneeded $pkg $version $script]
}
close $f
Once you've run that, you'll have a script in pkgdefs.tcl that you can source. If in future runs you source it before doing any of those package require calls that you listed, those package require calls will be fast. (This also includes any packages that are dependencies of the ones you list.) However, if you ever install new packages to use in the list, or update the versions of the packages, or move them around, you'll need to rebuild the list: it makes your code quite a lot more inflexible, which is why we don't do it by default.
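In other words, the start of the fast path looks like this (a sketch; pkgdefs.tcl is the file written above):
source pkgdefs.tcl            ;# registers all the package ifneeded scripts
package require tlautils 1.0  ;# now resolves immediately, without a path search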
So for a particular CGI Perl script I have included JSON like this to handle some .json files:
use lib "./modules/JSON/lib";
use JSON;
This works fine. The web directory holds the required files in the modules folder.
However, the JSON module is very slow. I read that JSON::XS can be much, much faster, but I can't seem to simply use it like so:
use lib "./modules/JSON-XS";
use JSON::XS;
There is no lib folder among the JSON-XS files. I've tried various combinations of use (i.e., using both folders, etc.), but it didn't work.
And no I cannot simply install the module for this particular project.
Any help is appreciated.
And no I cannot simply install the module for this particular project.
You can't use a module without installing it. You've just been getting away with doing a half-assed job of it. That won't work for JSON::XS, though. The reason it's fast is because it's written in C, so you'll need to compile the C code. The easiest way by far to do this is to use the provided installer instead of reinventing the wheel.
(You do know you can install a module into any directory, and that this does not require special permissions, right?)
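For example, with ExtUtils::MakeMaker you can install into any directory you own. A sketch; the ~/perl5 location is just an example:
perl Makefile.PL INSTALL_BASE=$HOME/perl5
make && make test && make install
# then, in the CGI script:
#   use lib "/home/you/perl5/lib/perl5";
#   use JSON::XS;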
Perl distributions are usually usable in an uninstalled state. All you need to do is call perl Makefile.PL && make (or, for a Module::Build-based distribution, perl Build.PL && ./Build). This will do all necessary compilation (if it's an XS module) and copy the library files into the blib subdirectory. In your script, instead of use lib you would write use blib:
use blib "/path/to/JSON-XS";
Note that if a module has dependencies, you have to resolve them yourself and add a use blib statement for each one. JSON::XS does not have many dependencies, but this gets really inconvenient for other modules. In that case you should probably seek another solution, e.g. using CPAN.pm together with local::lib.
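Concretely, the uninstalled-build workflow described above might look like this (a sketch; the path is a placeholder):
cd /path/to/JSON-XS
perl Makefile.PL    # or: perl Build.PL
make                # compiles the XS code and populates blib/
make test           # optional, but a good sanity check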
Okay this finally worked for me:
I did this process for all the dependencies (in order from fewest dependencies to most):
export PERL5LIB=~/path/to/modules/perl5
perl Makefile.PL PREFIX=$PERL5LIB LIB=$PERL5LIB
make
make test
make install
This installed all the modules into a directory I called perl5. It also means that when you try to install other modules locally, the dependency issue does not appear, thanks to the PREFIX/LIB additions.
Then all I did was add this to my perl CGI script:
use lib "./modules/perl5";
use JSON::XS;
PS: JSON::XS is so much faster!
:D
I often have the following scenario: in order to reproduce a bug for reporting, I create a small sample project, sometimes a Maven multi-module project. So there may be a hierarchy of directories, and it will usually contain a few small text files. Standard procedure would of course be to create a zip file and send that. But on some mailing lists attachments are not allowed, so I am looking for a way to automatically create an installation script that I can post to such mailing lists.
Basically I would be happy with a Unix-only flavor that emits mkdir statements to create directories and >> statements to write the file contents. (Actually, apart from the path delimiters, the Windows and Unix versions could probably be identical.)
Does such a tool exist somewhere? If not, I'll probably write one in java, but I'm happy to accept solutions in all kinds of languages.
(The tool could run under Windows or Unix, but the target platform for the generated scripts should be either Unix or configurable.)
I think you're looking for shar, which creates a shell archive: a shell script that, when run, reproduces a given directory hierarchy. It is available on most systems; you can use GNU sharutils if you don't already have it.
Normal usage for packing up a directory tree would be something like:
shar `find somedirectory -print` > archive.sh
If you're using GNU sharutils and want to create "vanilla" archives that use only the most portable shell builtins plus mkdir and sed, invoke it as shar -V. You can strip more baggage from the generated scripts with -xQ: -x removes checks for existing files, and -Q removes verbose output from the archive.
shar -VxQ `find somedir -print` > archive.sh
If you really want something even simpler, here's a dirt-simple version of shar as a shell script. It takes filenames on standard input instead of as arguments, for simplicity and to be a little more robust.
#!/bin/sh
# Read filenames (one per line) from standard input and emit a
# script that recreates the directories and files.
while IFS= read -r filename
do
    if test -d "$filename"
    then
        echo "mkdir -p '$filename'"
    else
        # Quote the heredoc delimiter so the unpacking shell does not
        # expand $variables or `backticks` inside the file contents.
        echo "sed 's/^X//' <<'EOF' > '$filename'"
        sed 's/^/X/' < "$filename"
        echo 'EOF'
    fi
done
Invoke as:
find somedir -print | simpleshar > archive.sh
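For a tiny tree (one directory containing one small file, with hypothetical names), the generated archive.sh from the sketch above comes out as:
mkdir -p 'somedir'
sed 's/^X//' <<'EOF' > 'somedir/hello.txt'
Xhello world
EOF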
You still need to invoke sed, as you need some way of ensuring that no lines in the here document begin with the delimiter, which would close the document and cause later lines to be interpreted as part of the script. I can't think of any really good way to solve the quoting problem using only shell builtins, so you will have to rely on sed (which is standard on any Unix-like system, and has been practically forever).
If your problem is filters that hate non-text files: in times long forgotten, we used uuencode to get past 8-bit-eating relays. Perhaps that is still a way to get past attachment-eating mailboxes these days?
So why not zip and uuencode? (Or base64, which is its younger cousin.)
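For instance (a sketch; project.zip is a placeholder name):
zip -r project.zip somedir
uuencode project.zip project.zip > project.uu   # paste project.uu into the mail
# or: base64 project.zip > project.b64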