I want to populate a MySQL database with basic 'holiday resort' content, e.g. name of the resort, description, country. What methods can I use to populate it?
First, find the Web sites of one or more holiday companies who offer the destinations you're interested in. You're going to scrape these.
You don't say what language you're using for the implementation, but here is how you might do it in Perl:
Write a scraper using LWP::UserAgent and HTML::TreeBuilder to explore the site and extract the destination information.
Use DBI with the DBD::mysql driver to insert the data into your database.
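Since the original poster never said which language they're using, here's the same scrape-and-insert approach sketched in PHP instead; the URL, HTML structure, and table layout are all invented for the example:

```php
<?php
// Fetch a (hypothetical) destinations page and pull out each resort.
$html = file_get_contents('http://example.com/destinations');

$doc = new DOMDocument();
@$doc->loadHTML($html);          // suppress warnings from messy real-world HTML
$xpath = new DOMXPath($doc);

$pdo = new PDO('mysql:host=localhost;dbname=holidays', 'dbuser', 'dbpass');
$stmt = $pdo->prepare(
    'INSERT INTO resorts (name, description, country) VALUES (?, ?, ?)'
);

// Assume each resort sits in a <div class="resort"> block.
foreach ($xpath->query('//div[@class="resort"]') as $node) {
    $name    = $xpath->evaluate('string(.//h2)', $node);
    $desc    = $xpath->evaluate('string(.//p[@class="description"])', $node);
    $country = $xpath->evaluate('string(.//span[@class="country"])', $node);
    $stmt->execute(array(trim($name), trim($desc), trim($country)));
}
```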
Where is the content? You can use the LOAD DATA INFILE syntax (sketched below), import from a CSV file, or write a script in Java or C++ or C# to parse the file(s) holding the data and populate the database via INSERT statements. You could hire an intern and make him/her type it all up. If you don't have the data in a file, you could write a web spider to go and get crap from Google and stuff it into the database.
But I can't help you until you tell me where the data is.
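For the first option, LOAD DATA INFILE, a minimal sketch from PHP (the table name, CSV layout, and credentials are invented):

```php
<?php
// Bulk-load a CSV of resorts into MySQL. LOCAL INFILE must be
// enabled on both the client and the server.
$pdo = new PDO(
    'mysql:host=localhost;dbname=holidays;charset=utf8',
    'dbuser',
    'dbpass',
    array(PDO::MYSQL_ATTR_LOCAL_INFILE => true)
);

$pdo->exec("
    LOAD DATA LOCAL INFILE '/path/to/resorts.csv'
    INTO TABLE resorts
    FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (name, description, country)
");
```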
OK, I know I'm late, but I created an application to (create and/or) auto-populate tables. Below is a short demonstration, but check it out here if you want.
I've got a Raspberry Pi with Raspbian, and all I've done is install apache2 and create a small web site. Now I want to create a database.
Is this possible without using MySQL or other database software? I want to use JS or a text-based database.
I want to be able to save the contact details in a text format.
Can someone point me in the right direction; a simple example would be appreciated. All my online research points to MySQL etc.
All I want is a simple example: enter a name and submit it, and that name gets logged, so if the name is entered again the site will say 'welcome back'. Once I know this mechanism I can add all the other fields. The reason I want this format is so I can see the list that I'm creating.
I just can't get to grips with MySQL. I've spent months trying to understand it, but it's just not going in, so I want to simplify the database to minimal workings so I can complete my site. I know JS isn't so secure, but this is a demo, so security isn't important at this point. Any help appreciated.
It would be possible to use JSON for your data storage. It would be a key-value store: on each page view you load the entire file into memory and parse it, and from then on you can loop over it to search for data or fetch a value by its key. This requires no extra software, just PHP with Apache.
How to:
Build an array, use json_encode() to create the JSON, and save it using file_put_contents(). Remember to save the whole array and not just the newly added element.
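A minimal sketch of that idea, wired to the 'welcome back' example from the question (the file name and layout are my own assumptions):

```php
<?php
// names.json acts as a tiny key-value store: name => first-seen date.
$file = __DIR__ . '/names.json';

// Load the whole store, or start fresh if the file doesn't exist yet.
$names = file_exists($file)
    ? json_decode(file_get_contents($file), true)
    : array();

$name = isset($_POST['name']) ? trim($_POST['name']) : '';
if ($name !== '') {
    if (isset($names[$name])) {
        echo 'Welcome back, ' . htmlspecialchars($name) . '!';
    } else {
        $names[$name] = date('c'); // remember when we first saw this name
        // Save the WHOLE array back, not just the new element.
        file_put_contents($file, json_encode($names), LOCK_EX);
        echo 'Hello, ' . htmlspecialchars($name) . '!';
    }
}
```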
This is not a relational database, but it might do the trick if you build an intelligent system with cookies to store an ID that is associated with a user.
Alternatively, you could use serialize() instead of JSON.
If you don't mind using a different way of storing data, you can use Google App Engine, MongoLab, or other cloud-based databases.
My PHP script pulls about 1000 names from the MySQL DB on a certain page. These names are used for a JavaScript autocomplete script.
I think there's a better method to do this. I would like to update the names with a cron job once a day (PHP) and store the names locally in a text file. Where else can I store them? It's not sensitive info.
It should be readable and writable by PHP.
Since you only need the data updated once a day, have a cron script generate a static JSON file in some fixed location. Then read this with Ajax on the client and make sure the client caches it.
Or potentially, generate the file whenever the database is updated (if this is applicable, I don't know your application)
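A sketch of such a cron script (the table, column, and file names are assumptions):

```php
<?php
// cron_names.php - run once a day, e.g. from crontab:
//   0 3 * * * php /var/www/cron_names.php
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8', 'dbuser', 'dbpass');

$names = $pdo->query('SELECT name FROM people ORDER BY name')
             ->fetchAll(PDO::FETCH_COLUMN);

// Write to a temp file first, then rename, so readers never
// see a half-written file.
$target = '/var/www/html/names.json';
file_put_contents($target . '.tmp', json_encode($names));
rename($target . '.tmp', $target);
```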
You could try Memcache. But that could be like using a sledge-hammer to crack a nut.
Edit: What about storing the data as a simple file and letting users (JavaScript) download it? Clients would not query the server for every keystroke because they could search for matching values themselves. The format could be JSON because it is simple and JavaScript-native.
It's unlikely reading from a text file will be much faster than a database query; MySQL already does a lot of caching that should make your query speedy.
If you need to make this query often and performance is a problem, you could consider using a caching module for PHP.
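For example, with the APCu extension (one of several options; the cache key and TTL here are arbitrary choices):

```php
<?php
// Serve the name list from the APCu cache, falling back to MySQL
// on a cache miss.
$names = apcu_fetch('autocomplete_names', $hit);
if (!$hit) {
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
    $names = $pdo->query('SELECT name FROM people')
                 ->fetchAll(PDO::FETCH_COLUMN);
    apcu_store('autocomplete_names', $names, 86400); // cache for a day
}
echo json_encode($names);
```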
I'm currently in the planning phase of a rather large project that I'll develop in the Zend Framework. One of the problems I'm facing is that the customers will want to translate not only the content but also the interface. I'm currently using gettext and Poedit to manage my language files, but this is not an option for the customer as they, for one, won't have FTP access to the site.
Hence, I'm thinking of a MySQL back end with an interface in the front end for the customer to manage his own translations of the interface. There is, however, still no MySQL adapter for Zend_Translate.
So, does anybody know of an adapter script for Zend_Translate so it can work with a MySQL table? Or any arguments against using MySQL, and possible other solutions for this problem?
You could solve this problem in different ways:
Extend Zend_Translate_Adapter to create your own. New adapters are only responsible for getting the translations out of their source; that is, you would only need to fetch the translations from the database. Look at the other adapters to see how they are implemented.
Fetch the data from the database and pass it to Zend_Translate_Adapter_Array (see the sketch after this list).
Use Zend_Translate_Adapter_Csv or Ini. As the translations would be read far more often than written, this solution would cut down the number of queries to the database. When the client adds a new language or changes an existing one, simply write it to a file, not the database.
If you decide to go with the database adapter, maybe you could somehow "tag" the translations, so that on the home page you fetch only the translations for the home page, on the contact page only those for the contact page...
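A minimal sketch of the second option, loading translations from MySQL into the array adapter (the table and column names are assumptions):

```php
<?php
// Pull key => translation pairs for one locale out of MySQL and
// hand them to Zend_Translate's array adapter.
$db = Zend_Db::factory('Pdo_Mysql', array(
    'host'     => 'localhost',
    'dbname'   => 'app',
    'username' => 'dbuser',
    'password' => 'dbpass',
));

// fetchPairs() returns key => value, which is exactly what the
// array adapter expects.
$data = $db->fetchPairs(
    'SELECT msgid, msgstr FROM translations WHERE locale = ?',
    array('nl')
);

$translate = new Zend_Translate('array', $data, 'nl');
echo $translate->_('welcome'); // 'welcome' is a hypothetical message key
```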
HTH!
Default Zend adapters handle caching well, so I'd stick to them unless you really need a database.
Instead of storing the translation data in the database, you may directly operate on the translation files (e.g. po templates). This would be the best choice if you just needed to add (append to file) new translation strings.
You may use Zend_Translate's option to log untranslated messages (to a file or any log adapter, including a database),
and then handle the logs, or even create a listener that translates the saved strings.
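For reference, a sketch of that logging option (the paths are placeholders):

```php
<?php
// Log every message ID that has no translation to a file, using
// Zend_Translate's built-in logUntranslated option.
$writer = new Zend_Log_Writer_Stream('/var/log/app/untranslated.log');
$log    = new Zend_Log($writer);

$translate = new Zend_Translate('gettext', '/path/to/languages', 'en');
$translate->setOptions(array(
    'log'             => $log,
    'logUntranslated' => true,
));
```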
Here's how: http://cloetensbrecht.be/zend_translate_mysql.html
I have a commercial ColdFusion application running on a MySQL database. A possible new client has approached me; they have been working in a Lotus Notes environment (and their own database) for many years now. Of course, they want to migrate their data to my application before making the move.
I'm trying to get a grip on how to get a thorough feel for the data, structure, and interdependencies in their current database application. Are there any tools to see the database structure (like in an RDBMS) of an NSF file, or is there any way to dump the structure using ColdFusion etc.? I don't have any hands-on experience with Lotus Notes (I do, in the meanwhile, have a local Lotus client and their database).
I need a good starting point to be able to determine whether or not I can find a way to migrate the data.
Any ideas??
thanks
Bart
To get at the data in Notes, a good option is NotesSQL, IBM's ODBC driver for Notes/Domino databases.
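Once the driver is configured as an ODBC data source, you can query Notes views from just about anything that speaks ODBC; here's a sketch in PHP (the DSN, credentials, and view name are assumptions):

```php
<?php
// NotesSQL exposes Notes views and forms as tables, so a plain
// SQL SELECT works once the DSN is set up.
$conn = odbc_connect('NotesSQL_DSN', 'notesuser', 'password');

$result = odbc_exec($conn, 'SELECT * FROM Customers_View');
while ($row = odbc_fetch_array($result)) {
    print_r($row); // inspect which items each document actually holds
}
odbc_close($conn);
```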
A quick overview of the Notes data structure is this: Notes is a document-centric database, with non-relational data contained within each document. Notes Databases (NSFs) contain any number of Notes Documents, which in turn contain any number of items that hold data. Each Notes Document can have a different set of items, and thus different data in it. While that sounds like a horrible mess, usually the documents have similar data based on the form used to create the documents.
This all leads to why there is no simple way to get data out of Lotus Notes. There are a few other options, which may or may not be useful depending on how much data you have to migrate.
I personally like using XML to extract data from Lotus Notes. You can do so by creating XML views within a Notes database. IBM has a tutorial that looks helpful.
Using Java or LotusScript, you can write code to extract data from the documents to any format you wish (CSV, XML, TXT, etc.).
If it's not a lot of data, you may find getting the data into an Excel format is the simplest intermediary step. Long ago I wrote an add-in tool for exporting data from Lotus Notes to Excel, which may help you. Or you can use the "Edit > Copy Selected To Table" feature in the Lotus Notes client to copy what is visible in a Notes View to the clipboard, and then paste that into Excel. In that scenario, you'd want to edit the views so they show all the data you need.
I hope this helps!
I am building my first database driven website with Drupal and I have a few questions.
I am currently populating a Google Docs spreadsheet with all of the data I want to eventually be able to query from the website (after it's imported). Is this the best way to start?
If this is not the best way to start what would you recommend?
My plan is to populate the spreadsheet, then import it as a CSV into the MySQL DB via CCK nodes.
I've seen two ways to do this.
http://drupal.org/node/133705 (importing data into CCK nodes)
http://drupal.org/node/237574 (Inserting data using spreadsheet/csv instead of SQL insert statements)
Basically, my question is: what is the best way to gather and then import data into Drupal?
Thanks in advance for any help, suggestions.
There's a comparison of the available modules at http://groups.drupal.org/node/21338
In the past when I've done this, I simply wrote code to do it on cron runs (see http://drupal.org/project/phorum for an example framework that you could strip down and build back up to do what you need).
If I were to do this now I would probably use the http://drupal.org/project/migrate module where the philosophy is "get it into MySQL, View the data, Import via GUI."
There is a very good module for this: Node Import. It allows you to take your Google Docs spreadsheet and import it as a .csv file.
It's really easy to use: the module allows you to map your .csv columns to the node fields you want them to go to, so you don't have to worry about setting your columns in a particular order. Also, if there is an error on some records, it will spit out a .csv with the failed records and what caused the error, but it will import all good records.
I have imported up to 3000 nodes with this method.