Closed 5 years ago. This question needs to be more focused and is not accepting answers.
I have a bash script on Linux that is supposed to import data from CSV files into a MySQL database.
I need to make this script as generic as possible, so I want to control it with config files that specify which data goes into which table, etc., so that I don't have to change the script in the future when I need to modify the tables, for example.
How can I get started, and how should I do it? I am completely new to this topic.
When you are completely new, first try to get it running without the generic approach. Perhaps you can already put the dynamic values (the ones you want to move to a config file later) into variables.
You will have to rewrite the code a few times while you are learning scripting and noticing where you need common functions or config files.
Once you have a running prototype, copy the solution to a new directory and continue from there. One way is to put all dynamic values in a config file:
field1="value1"
field2="value2 with spaces, that is why I use quotes"
field3="3"
field4=""
In the main script you can read the config file with source yourfile.cfg.
Perhaps you already know which set of variables you want when you start your script.
Then you can pass the config name as a parameter:
#!/bin/bash
if [ $# -eq 0 ]; then
    echo "No parameter given, using default"
    source default.cfg
else
    source "$1.cfg"
fi
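Putting the two pieces together, here is a minimal sketch of a config-driven CSV import. The config file name, database, table and column names are all hypothetical, and the actual mysql invocation is only shown as a comment; the script just builds and prints the statement:

```shell
#!/bin/bash
# Hypothetical config file; in real use this would already exist.
cat > import.cfg <<'EOF'
csv_file="books.csv"
db_name="library"
db_table="books"
csv_columns="title,author,year"
EOF

# Pull the table-specific settings into the script.
source ./import.cfg

# Build the import statement from the config values.
sql="LOAD DATA LOCAL INFILE '$csv_file'
INTO TABLE $db_table
FIELDS TERMINATED BY ','
IGNORE 1 LINES
($csv_columns);"

# Real use: mysql --local-infile=1 "$db_name" -e "$sql"
echo "$sql"
```

Changing which CSV goes into which table then only requires editing (or swapping) the config file, not the script.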
Closed 2 years ago. This question needs details or clarity and is not accepting answers.
I am trying to make a web page that gets information about books using HTML, and to place that information into a database so I can use it. Any idea how to take the information from the Open Library website and store it in a database?
Here is the link to the API if needed:
https://openlibrary.org/developers/api
Thanks in advance.
If PostgreSQL and Python are a viable option, LibrariesHacked has a ready-made solution on GitHub for importing and searching Open Library data.
GitHub: LibrariesHacked / openlibrary-search
Using a PostgreSQL database, it should be possible to import the data directly into tables and then run complex searches with SQL.
Unfortunately, the downloads provided are a bit messy. The Open Library file always errors, as the number of columns seems to vary, and cleaning it up is difficult: the text file for editions alone is 25 GB.
That means another Python script to clean up the data. The file openlibrary-data-process.py simply reads in the CSV (Python is a little more forgiving about dodgy data) and writes it out again, keeping only the rows that have 5 columns.
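That cleanup step can also be sketched in shell. Assuming the dump rows are tab-separated (the file names below are made up for the example), awk can keep only the rows with exactly 5 columns:

```shell
#!/bin/bash
# Hypothetical sample: one good 5-column row, one broken row.
printf 'a\tb\tc\td\te\nbad\tline\n' > ol_dump_sample.txt

# Keep only rows with exactly 5 tab-separated fields,
# mirroring what openlibrary-data-process.py is described as doing.
awk -F'\t' 'NF == 5' ol_dump_sample.txt > ol_dump_clean.txt
```

On a 25 GB file this streams line by line, so it needs no more memory than the Python version.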
Closed 4 years ago. This question needs to be more focused and is not accepting answers.
I need to transform an xlsx file to CSV on AIX. The server doesn't have any command for that, and I am not allowed to install anything on it. No Python or Perl libraries for reading spreadsheets are installed on the server either.
Is there still any workaround for doing this?
P.S.: it has to work with what is already on the server.
xlsx is an open XML format; the specifications can be found online.
Otherwise, Perl libraries can be found on CPAN; their sources may help you pick out the relevant parts.
To start, maybe unzip the .xlsx: it will yield a set of XML files, and you can have a look at whether the data can be retrieved from them.
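As a quick illustration of the idea, here is what stripping the XML tags from a worksheet row with sed looks like. The sample row is made up, and note that real sheets usually keep cell text in xl/sharedStrings.xml, so this alone is a starting point rather than a full converter:

```shell
#!/bin/bash
# A made-up fragment of the XML found inside xl/worksheets/sheet1.xml
# (one inline-string cell and one numeric cell).
xml='<row><c t="inlineStr"><is><t>Name</t></is></c><c><v>42</v></c></row>'

# Replace every tag with a space, then squeeze runs of spaces,
# leaving just the cell contents.
cells=$(printf '%s' "$xml" | sed -e 's/<[^>]*>/ /g' -e 's/  */ /g')
echo "$cells"

# Real use against an actual workbook would start with something like:
#   unzip -p report.xlsx xl/worksheets/sheet1.xml | sed ...
```

From there, turning the extracted values into comma-separated lines is ordinary sed/awk work.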
There is no way to do it, or at least no easy one (like a simple script or command).
Maybe by working on those XML files (see Nahuel Fouilleul's answer), but it would take too much time.
Judging by the reception of the question alone, it looks like people don't want to touch this issue even with a stick.
Closed 8 years ago. This question needs to be more focused and is not accepting answers.
I want to automate the following scenario:
Log in to a router.
Issue a reload.
While the router is reloading, press the ESC key continuously.
Once ESC is pressed, a boot> prompt appears; execute some commands at that prompt.
I can do all of the above manually, using Teraterm and a console. But how do I do this inside a Tcl script?
Please help me do this.
Thanks,
Balu P.
You first need to find an interactive console program that you can issue the commands from. You may find that plink.exe from the PuTTY suite is suitable. The system telnet.exe is not suitable, as it has a special flag set on it that prevents automation, and TeraTerm isn't suitable because it is a GUI program as far as I can see (GUI program automation is very difficult to say the least, and you probably don't want to do it if there's any easier way).
Once you can do the task interactively, plug it inside an expect script. It's fairly straightforward.
Start with spawn, making the program run and passing in the arguments required to connect to the other system.
Interact with the remote system.
Listen for a suitable prompt with expect. (Remember, this might include prompts to send a username and password!)
Tell it what to do with send. Remember that \r is how you send a Return.
When you're done, close.
Of course, things get more complex when you're also dealing with conditionals, error conditions and timeouts; expect can handle those, but the code is a bit more complex. Ask another question once you've got a concrete problem we can help you with.
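A minimal expect sketch of those steps might look like the following. The host, credentials, prompts and the ESC-sending loop are all assumptions that will need adjusting for your router; this is an outline of the shape, not a working script:

```tcl
#!/usr/bin/expect -f
# Hypothetical host and credentials -- replace with your own.
spawn plink -telnet 192.0.2.1

expect "Username:" { send "admin\r" }
expect "Password:" { send "secret\r" }

# Wait for the router prompt, then trigger the reload.
expect ">" { send "reload\r" }

# Keep sending ESC (\x1b) until the boot> prompt appears.
set timeout 1
expect {
    "boot>" { send "help\r" }
    timeout { send "\x1b"; exp_continue }
}

# Run whatever boot-prompt commands you need, then finish.
expect "boot>" { send "exit\r" }
close
```

The timeout/exp_continue pattern is what replaces "press the key continuously": every second without the boot> prompt, another ESC is sent.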
I have used a console server to automate this. It's working fine.
Closed 10 years ago. This question is off-topic and is not accepting answers.
I have a MySQL DB that should be versioned with SVN. I don't want the full DB, only the structure and selected tables. I searched the net and found some information, but nothing seems to really work reliably. Any experience or hints?
Thanks :)
Check out this script, which automates the process and allows selecting specific databases and excluding tables [disclaimer: I am the author] - http://mysql-svn-backup.redant.com.au/
Use mysqldump to export the data you want into a file and put that file into SVN. Using cron, you can automate this to run at specific times.
The question is: what exactly do you want to version, and why?
I propose you version an SQL file that you can import to recreate your database. Any tool can be used to create this SQL file (the basic one being mysqldump), and you can then commit it to your SVN repository. You will be able to track new tables by comparing revisions of the SQL file.
You can automate this process by adding a cron job that dumps and commits the file every 2 hours.
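Here is a dry-run sketch of such a job. The database, table and path names are hypothetical, and the echo lines just print the commands a real script would execute (a real version would run the dumps into the SVN working copy and then commit):

```shell
#!/bin/bash
# Hypothetical names -- adjust for your setup.
DB=mydb
TABLES="users orders"
WC=/tmp/svn-wc

# Structure of the whole DB; full data only for the selected tables.
schema_cmd="mysqldump --no-data $DB"
data_cmd="mysqldump $DB $TABLES"

# Dry run: show what would be executed.
echo "$schema_cmd > $WC/schema.sql"
echo "$data_cmd > $WC/tables.sql"
echo "svn commit -m 'automated DB snapshot' $WC"

# Cron entry to run the real script every 2 hours:
# 0 */2 * * * /usr/local/bin/db-snapshot.sh
```

Because mysqldump writes deterministic SQL text, diffs between revisions show exactly which tables and rows changed.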