I have the following network structure: there is a person, his relatives, and the relatives of his relatives. I need to run queries against this network, such as fetching everyone older than 30 or showing the relationship between Jason and Anna. Every person in the network has a list of their relatives as a property.
All of the network information is stored in a non-graph database. Would I get better performance if I generated the whole network into a JSON file once and then queried that JSON file on the front end, instead of sending each query to the database and fetching the result? Thanks in advance.
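For concreteness, such a network serialized to JSON might look roughly like this (names, ages, and field names are purely illustrative):

```json
[
  { "name": "Jason", "age": 34, "relatives": ["Anna", "Tom"] },
  { "name": "Anna",  "age": 28, "relatives": ["Jason"] },
  { "name": "Tom",   "age": 61, "relatives": ["Jason"] }
]
```

"Older than 30" is then a linear filter over this array, and "relationship between Jason and Anna" a path search over the relatives lists — both easy to do client-side, but the entire file has to be shipped to and held in the browser, and it goes stale as soon as the database changes.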
This is my requirement:
I need to populate all countries of the world in a dropdown (combobox).
There would be another dropdown just below that shall be populated with values of cities of the selected country.
All the countries and cities shall have their respective ISO codes when shown in the dropdowns.
I am looking for either a freely available MySQL database (I have not been able to find one; all are either paid or inconsistent) or, preferably, a Java API that can return countries and cities so that I do not have to store the entire database on my end.
Technologies used: MySQL, JSF.
Any help would be highly appreciated. Thank you.
Maybe http://opengeocode.org/download.php will help; see, e.g., the "Cities of the World" dataset. It's a CSV file you could import into MySQL. You can see the data structure by clicking the "Metadata" button. Good luck!
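As a rough sketch of the import (the file path, table name, columns, and delimiters below are assumptions — check the dataset's Metadata page for the real layout):

```sql
CREATE TABLE world_cities (
  country_code CHAR(2),
  city_name    VARCHAR(100)
);

LOAD DATA INFILE '/path/to/cities.csv'
INTO TABLE world_cities
FIELDS TERMINATED BY ','
IGNORE 1 LINES
(country_code, city_name);
```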
I need to store the first parent of any Location entity in a MySQL database, so that in the end I'll have a complete hierarchy. For example, I need to know that Berlin is part of Germany and store Germany as the first parent of Berlin in the table. How can I query OSM for this information?
You can't query OSM directly for this information. OSM does contain it, mainly through boundary relations and admin_level tags, but the exact hierarchy between the different elements has to be calculated first.
Geocoders for OSM can be used to obtain this information. The most popular one at the moment is Nominatim. You can install your own Nominatim instance by importing either the whole planet or a country/area extract, and then obtain the hierarchy from the database Nominatim creates.
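For example, a reverse-geocoding request to a Nominatim instance (the /reverse endpoint with format=jsonv2 and a lat/lon) returns an address object that already encodes the hierarchy — roughly like the following (the exact keys vary by location):

```json
{
  "address": {
    "city": "Berlin",
    "state": "Berlin",
    "country": "Germany",
    "country_code": "de"
  }
}
```

The first parent of a location can then be read off the containing entries of that address object.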
I have a question regarding the correct design for Google Maps Engine & Google Maps.
Requirement:
Have a list of stores with their individual attributes (location, hours, etc.).
Each store can be mapped to multiple store types (outlet, retail, business, etc.).
I understand that #1 can be achieved by importing my data into Google Maps Engine as a table. #2 is a different set of data with the store's PKID as a foreign key and another column specifying the store type. #2 does not have a PKID of its own; i.e., one store FK can map to multiple store types and will therefore be represented by multiple rows in #2.
Example:
Table 1:
Store ID, Hours
1,5-8
2,5-5
3,5-5
Table 2:
Store ID, Type
1, Outlet
1, Retail
2, Business
3, Retail
Can I query the two tables in one go using Google Maps API?
OR
Do I have to first get the list of stores from table 1 and then query table 2 for each store?
What design considerations should I keep in mind?
Thank you!
Answering my own question for the benefit of others:
From an application perspective, the number of external calls should be kept as low as possible. There is a fine line between an ideal system architecture and the performance/cost of that architecture.
In the above example, it would take two calls to get the job done. In addition, the second table is not used for anything else, so I have decided to implement the table structure below:
Table 1:
Store ID, Hours, Outlet, Retail, Business
1,5-8,true,true,false
2,5-5,false,false,true
3,5-5,false,true,false
Now, I do realize that the data is not normalized; however, I have reduced the number of external queries.
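In SQL terms (the table and column names here are illustrative), the denormalized structure looks like this, and a single query now answers questions that previously needed a join:

```sql
CREATE TABLE stores (
  store_id INT PRIMARY KEY,
  hours    VARCHAR(16),
  outlet   BOOLEAN,
  retail   BOOLEAN,
  business BOOLEAN
);

-- One call fetches all retail stores together with their hours:
SELECT store_id, hours FROM stores WHERE retail = TRUE;
```

The trade-off is that adding a new store type now means an ALTER TABLE rather than simply inserting rows into a type table.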
I have a CSV file that contains email + street address pairs. I have a WordPress installation whose users have the same email addresses, and I'd like to import the street addresses for the respective users into the user_meta table.
How can I do this?
One crude way that comes to mind is to use a text editor's search and replace to turn the CSV into a huge list of INSERT queries.
Is there a better way to do this?
You can use
LOAD DATA INFILE 'your_file.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ',';
Reference: LOAD DATA - MySQL manual
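LOAD DATA by itself can't match rows to existing users, so one way to finish the job is to load the CSV into a staging table and then join on email. The table and column names below assume a default WordPress schema (wp_users, wp_usermeta), and the meta_key is only an example:

```sql
CREATE TABLE address_import (
  email  VARCHAR(100),
  street VARCHAR(255)
);

LOAD DATA INFILE '/path/to/addresses.csv'
INTO TABLE address_import
FIELDS TERMINATED BY ','
(email, street);

-- Match imported rows to existing users by email address
INSERT INTO wp_usermeta (user_id, meta_key, meta_value)
SELECT u.ID, 'street_address', a.street
FROM wp_users u
JOIN address_import a ON a.email = u.user_email;
```

The staging table can be dropped once the import has been verified.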