How to copy a row in a .csv file into a column in another .csv file in Tcl?

I wish to copy a specific row in a .csv file to a specific column in another .csv file using Tcl.
What I've tried is to copy the row I wanted into a new .csv file and then paste that row manually into my destination .csv file. But I wish to automate all this and copy a row in one .csv directly into a column in an existing .csv file.
Here is what I tried:
package require csv
set fp [open "filenameSource.csv" r]
set secondColumnData {}
while {[gets $fp line] >= 0} {
    if {[llength $line] > 0} {
        lappend secondColumnData [lindex [split $line ","] 1]
    }
}
close $fp
puts $secondColumnData
set filename "Destination.csv"
set fileId [open $filename "w"]
puts -nonewline $fileId $secondColumnData
close $fileId
Is there a way to point at row x in the source file and copy it to a specific destination in the Destination file?
I am new to Tcl. Please provide an example.
Thanks,
IEK

One thing you'll need to learn as a newcomer to Tcl is that there's a lot of useful code in Tcllib, a suite of packages written by the Tcl community. In this case, the csv and struct::matrix packages make this task trivial (as I understand it), which is great because CSV files have some tricky aspects that aren't obvious.
package require csv
package require struct::matrix
# Read the source data
set srcMatrix [struct::matrix]
set f [open "filenameSource.csv" r]
csv::read2matrix $f $srcMatrix
close $f
# Read the destination data so we can UPDATE it
set dstMatrix [struct::matrix]
set f [open "Destination.csv" r+]
csv::read2matrix $f $dstMatrix
# Leaving the file open; we're going to rewrite it…
# Do the copying operation; I assume you know which row and column to copy from/to
$dstMatrix set column 2 [$srcMatrix get row 34]
# Write back
chan seek $f 0
csv::writematrix $f $dstMatrix
chan truncate $f; # Make sure there's no junk left if the file shortened
close $f

Related

Change UTF-8 coding of file into ISO 8859-15 in tcl

I have written a code in Tcl which starts by getting the file My_Text_File.txt into it:
set myfile [open My_Text_File.txt]
set file_data [read $myfile]
The file My_Text_File.txt is encoded in UTF-8. But this file must be encoded in ISO 8859-15 (also referred to as Latin-9). Is there a way to extend a Tcl code in a way that it changes a UTF-8 encoded text file to an ISO 8859-15 encoded one?
I would like to emphasize that the change from UTF-8 to ISO 8859-15 must be done inside the Tcl code.
Thanks in advance!
You have to read your original file, converting from UTF-8 to Tcl's native Unicode encoding, then write the contents to a temporary file using ISO 8859-15 encoding, and finally replace the original with the temporary file. Tcl has a few commands to make this easy:
#!/usr/bin/env tclsh
# See `encoding names` for the list of character encodings supported
# by your version of tcl
proc convert_file {file to_encoding {from_encoding ""}} {
    set infile [open $file]
    # Assume original file is in the default system encoding if no
    # explicit from encoding is given.
    if {$from_encoding ne ""} {
        chan configure $infile -encoding $from_encoding
    }
    # Create a temporary file to write the re-encoded text to
    set outfile [file tempfile temp_name]
    chan configure $outfile -encoding $to_encoding
    # Efficiently read everything from one channel and write to another.
    chan copy $infile $outfile
    chan close $infile
    chan close $outfile
    # Replace the original file with the re-encoded temporary copy
    file copy -force $temp_name $file
    file delete -force $temp_name
}
convert_file My_Text_File.txt iso8859-15 utf-8
If the file isn't too big, it's easy to just read everything into memory and write it out again with the new encoding.
# Very simple conversion utility
proc convertFile {filename fromEncoding toEncoding} {
    # Read a file in a given encoding
    set f [open $filename]
    chan configure $f -encoding $fromEncoding
    set contents [chan read $f]
    chan close $f
    # Write a file in a given encoding
    set f [open $filename w]
    chan configure $f -encoding $toEncoding
    chan puts -nonewline $f $contents
    chan close $f
}
# Apply to the particular case we care about
convertFile My_Text_File.txt utf-8 iso8859-15
For a large file, you need to stream the data from one file to another (and then you can rename the target file afterwards).
Beware! Converting UTF-8 to ISO 8859-15 can lose information if there are characters in the source text that are not present in the target encoding.
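A streaming version for large files might look like this (a sketch on my part, using a chunked read loop instead of the chan copy shown in the other answer; it writes to a temporary file and then replaces the original):

```tcl
proc convertFileStreaming {filename fromEncoding toEncoding} {
    set in [open $filename r]
    chan configure $in -encoding $fromEncoding
    # file tempfile returns an open channel and puts the name in tempName
    set out [file tempfile tempName]
    chan configure $out -encoding $toEncoding
    # Copy in fixed-size chunks so memory use stays bounded
    while {![chan eof $in]} {
        chan puts -nonewline $out [chan read $in 65536]
    }
    chan close $in
    chan close $out
    # Replace the original with the re-encoded copy
    file copy -force $tempName $filename
    file delete -force $tempName
}
```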

collecting set of files using tcl script

I am looking to write a Tcl script which reads each line of a file, say abc.txt; each line of abc.txt is the location of a file that needs to be picked up, except for the lines that are commented out.
For example abc.txt has
./pvr.vhd
./pvr1.vhd
// ./pvr2.vhd
So I need to read each line of abc.txt, pick up the file from the location it mentions, and store it in a separate file, except for the lines which start with "//".
Any hint or script will be deeply appreciated.
The usual way of doing this is to put a filter at the start of the loop that processes each line that causes the commented lines to be skipped. You can use string match to do the actual detecting of whether a line is to be filtered.
set f [open "abc.txt"]
set lines [split [read $f] "\n"]
close $f
foreach line $lines {
    if {[string match "//*" $line]} {
        continue
    }
    # ... do your existing processing here ...
}
This also works just as well when used with a streaming loop (while {[gets $f line] >= 0} {…}).
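Putting the filter together with the file-collecting part of the question, a sketch (the collected destination directory and the assumption that the listed paths are valid relative paths are both mine):

```tcl
set f [open "abc.txt"]
set lines [split [read $f] "\n"]
close $f

# Hypothetical destination directory; adjust to taste
file mkdir collected

foreach line $lines {
    set line [string trim $line]
    # Skip blank lines and commented-out entries
    if {$line eq "" || [string match "//*" $line]} {
        continue
    }
    # file copy into a directory keeps the original file name
    file copy -force $line collected
}
```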

in tcl, how to edit string in the open file?

Let's say that I have opened a file using:
set in [open "test.txt" r]
I intend to revise a string on a certain line, like:
style="fill:#ff00ff;fill-opacity:1"
and this line number is: 20469
And I want to change the value ff00ff to another value like ff0000.
What are the proper ways to do this? Thanks in advance!
You need to open the file in read-write mode; the r+ mode is probably suitable.
In most cases with files up to a reasonable number of megabytes long, you can read the whole file into a string, process that with a command like regsub to perform the change in memory, and then write the whole thing back after seeking to the start of the file. Since you're not changing the size of the file, this will work well. (Shortening the file requires explicit truncation.)
set f [open "test.txt" r+]
set data [read $f]
regsub {(style="fill:#)ff00ff(;fill-opacity:1)"} $data {\1ff0000\2} data
seek $f 0
puts -nonewline $f $data
# If you need it, add this here by uncommenting:
#chan truncate $f
close $f
There are other ways to do the replacement; the choice depends on the details of what you're doing.
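For instance, if the text to replace is a fixed literal, string map is a simpler alternative to regsub (same read/seek/write pattern as above). Note that this sketch replaces every occurrence of ff00ff in the file, not just the one on line 20469, so it only fits if the value is unique enough:

```tcl
set f [open "test.txt" r+]
set data [read $f]
# Plain literal substitution; no regexp metacharacters to worry about
set data [string map {ff00ff ff0000} $data]
seek $f 0
puts -nonewline $f $data
# Replacement is the same length here, but truncating is harmless
chan truncate $f
close $f
```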

Writing multiple lines to a file in TCL

I'm looking to modify gpsfeed+ to add in a section which writes the NAV string out to a text file while the simulator is running. The tool is written in tcl and I'm at a loss as to what I need to do. What I have so far is:
if {$prefs(udp) & $::udpOn} {
    # opens file to write strings to
    set fp [open "input_NAV.txt" w+]
    # one sentence per udp packet
    foreach line [split $::out \n] {
        puts $fp $line
    }
    close $fp
}
Right now, if UDP broadcast is switched on, I want to take each NAV string broadcast over UDP and write it to a file. But the code above only writes one of the strings and then overwrites it. I've been trying to add in a \n switch, but I've not had any joy.
I was using the wrong mode for opening the file:
w+ Open the file for reading and writing. Truncate it if it exists. If it does not exist, create a new file.
I should have been using either of the following:
a Open the file for writing only. If the file does not exist, create a new empty file. Set the file pointer to the end of the file prior to each write.
a+ Open the file for reading and writing. If the file does not exist, create a new empty file. Set the initial access position to the end of the file.
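With that change, the snippet becomes (a sketch; a+ would work equally well here):

```tcl
if {$prefs(udp) & $::udpOn} {
    # Append so NAV strings from earlier packets are kept
    set fp [open "input_NAV.txt" a]
    # one sentence per udp packet
    foreach line [split $::out \n] {
        puts $fp $line
    }
    close $fp
}
```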
This would be a comment, but it needs code formatting.
This code:
foreach line [split $::out \n] {
    puts $fp $line
}
is equivalent to:
puts $fp $::out

Read The Last Line Of An Active Logfile With Eggdrop & .tcl

Hello, I was wondering if it's possible to read the last line of a real-time logfile with Eggdrop and a .tcl script. I'm able to read the first part of the logfile, but that's it; it doesn't read any more of it.
Is it possible to put an upper bound on the length of a line of the logfile? If so, it's pretty easy to get the last line:
# A nice fat upper bound!
set upperBoundLength 1024
# Open the log file
set f [open $logfile r]
# Go to some distance from the end; catch because don't care about errors here
catch {seek $f -$upperBoundLength end}
# Read to end, stripping trailing newline
set data [read -nonewline $f]
# Hygiene: close the logfile
close $f
# Get the last line
set lastline [lindex [split $data "\n"] end]
Note that it's not really necessary to do the seek; it just saves you from having to read the vast majority of the file which you presumably don't want.
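For following the logfile as it keeps growing, one common technique is to remember the read offset and poll with the event loop. This sketch uses only plain Tcl (no Eggdrop-specific commands, which I haven't assumed), and it assumes the log writer emits whole lines ending in a newline:

```tcl
set logfile "mylog.log"
set offset 0

proc pollLog {} {
    global logfile offset
    set f [open $logfile r]
    seek $f $offset
    set data [read $f]
    set offset [tell $f]
    close $f
    # The trailing element of the split is empty when data ends
    # with a newline, so drop it
    foreach line [lrange [split $data "\n"] 0 end-1] {
        puts "new line: $line"
    }
    # Poll again in a second
    after 1000 pollLog
}

pollLog
vwait forever
```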