Tilecache failing to generate tiles using Mapnik

I downloaded the Australian OSM extract and imported it into a database called gis using osm2pgsql.
I have changed generate_tiles.py to only generate tiles for Australia:
bbox = (-180.0,-90.0, 180.0,90.0)
render_tiles(bbox, mapfile, tile_dir, 0, 5, "World")
minZoom = 10
maxZoom = 16
bbox = (101.1,-6.9,165.5,-45.9)
render_tiles(bbox, mapfile, tile_dir, minZoom, maxZoom)
When I attempt to generate tiles with:
export MAPNIK_MAP_FILE="osm.xml" && export MAPNIK_TILE_DIR="/tmp/tilecache/" && ./z0generate_tiles.py
Lots of directories are created in /tmp/tilecache containing PNG tiles. The tiles show state boundaries and country names, and there do appear to be highways.
But.. when I navigate to the address:
http://localhost/osm/tilecache-2.11/index.html
I only see countries and states, but no labels and no streets. I figure it is probably a permissions issue with accessing the PostGIS data, so I have gone into psql and issued:
GRANT ALL PRIVILEGES ON DATABASE gis TO PUBLIC
In /etc/tilecache.cfg I have:
[cache]
type=Disk
base=/tmp/tilecache
[osm]
type=Mapnik
mapfile=/home/(my user_name)/bin/mapnik/my_osm.xml
spherical_mercator=true
tms_type=google
metatile=yes
[basic]
type=WMS
url=http://labs.metacarta.com/wms/vmap0
extension=png
It would seem that Mapnik is not able to communicate with PostGIS, even though I have already granted privileges on the database as shown above.
I generated the my_osm.xml file with the following:
./generate_xml.py osm.xml my_osm.xml --dbname gis --user (uname) --password (pword) --accept-none
It generated without any errors.
That's about as far as I can take it. New files are being created when the map is accessed via the web; they just don't have any road information.
Any ideas?

One comment:
generate_tiles.py and TileCache are different applications and don't know about each other, so your TileCache config will only be read by the TileCache application. But if TileCache is used with 'tms_type=google', as you have done, the cache schemes used by the two programs should match.
A couple of things to check for your missing roads:
Sometimes problems with old GEOS libraries can lead to missing data in the osm2pgsql import, so make sure there are a lot of rows in the planet_osm_line table (a quick way to check this, using the same credentials, is sketched below):
select count(*) from planet_osm_line;
Also, make sure you are running the latest Mapnik version, at least 0.7.0, ideally 0.7.1.
Try rendering a few maps with nik2img.py and make sure Mapnik does not output any warnings that might be causing this - a common issue is missing proj4 EPSG definitions for EPSG:900913.
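Here is the quick check mentioned above: a small psycopg2 script that confirms both that the osm2pgsql import produced line data and that the database is reachable with the same credentials you passed to generate_xml.py. This is a minimal sketch; the connection parameters are placeholders and psycopg2 is assumed to be installed.
# Minimal sketch: verify the osm2pgsql import and the credentials Mapnik will use.
# The connection parameters are placeholders - use the same --dbname/--user/--password
# values you passed to generate_xml.py.
import psycopg2

conn = psycopg2.connect(dbname="gis", user="your_user", password="your_password", host="localhost")
cur = conn.cursor()

# A very low count (or a permission error here) would explain the missing roads;
# note that GRANT ... ON DATABASE does not grant SELECT on the individual tables.
cur.execute("SELECT count(*) FROM planet_osm_line;")
print("planet_osm_line rows:", cur.fetchone()[0])

cur.close()
conn.close()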

Related

Couchbase Sync-Gateway Multiple Clients

I'm currently playing around with the Couchbase Sync Gateway and have built a demo app.
What is the intended behavior if a user logs in with the same username on a different device (which has an empty database), or if they delete the local database?
I'm expecting that all the data from the server should get synced back to the clients.
Is this correct?
My problem is that if I delete the database or log in from a different device, nothing gets synced.
OK, I figured it out, and it's exactly how I thought it would be.
If I log in from a different device, I get all the data synced automatically.
My problem was the missing sync function. I thought it would use a default and route all documents to the public channel automatically.
I'm now using the following simple sync function:
"sync": `function (doc, oldDoc) {
channel('!');
access('demo#example.com', '*');
}`
This will simply route all documents to the public channel and grant my demo-user access to it.
I think this shouldn't be used in production but it's a good starting point for playing around.
Now everything is working fine.
Edit: I've now found the missing info:
https://docs.couchbase.com/sync-gateway/current/configuration-properties.html#databases-this_db-sync
If you don't supply a sync function, Sync Gateway uses the following default sync function
...
The channels property is an array of strings that contains the names of the channels to which the document belongs. If you do not include a channels property in a document, the document does not appear in any channels.
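To illustrate the quoted behaviour: with the default sync function, a document is only visible to clients if it carries a channels property. Below is a rough sketch of writing such a document through the Sync Gateway public REST API; the port, database name, document ID, and channel name are all assumed example values.
# Sketch only: give the document a "channels" property so the default sync
# function has something to route. URL, database name, doc ID and channel
# name are assumed example values.
import requests

doc = {
    "type": "note",
    "text": "hello from the sync demo",
    "channels": ["public-notes"],
}

resp = requests.put("http://localhost:4984/demo/note-001", json=doc)
print(resp.status_code, resp.text)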

Accessing ArcGIS Pro geoprocessing history programmatically

I am writing an ArcGIS Pro Add-In and would like to view items in the geoprocessing history programmatically. The goal is to get the list of parameters and tools used, to better understand and recreate a workflow later, perhaps in another project where we would not have direct access to the history within ArcGIS Pro.
After a lot of searching through documentation, online posts, and debugging breakpoints in my code, I've found that some of this data does exist privately within the HistoryProjectItem class, but since this is a private class member, within a sealed class it seems that there would be nothing I can do to access this data. The other place I've seen this data is less than ideal, with the user having an option to write the geoprocessing history to an XML log file that lives within /AppData/Roaming/ESRI/ArcGISPro/ArcToolbox/History. Our team has been told that this file may be a problem because certain recursive operations may cause the file to balloon out of control, and after reading online, it seems that most people want this setting disabled to avoid large log files taking up space on their machine. Overall the log file doesn't seem like a great option as we fear it could slow down a user by having the program write large log files while they are working.
I was wondering if this data is stored somewhere I have missed that could be accessed programmatically from the add-in. It seems to me that the data within Project.Items is always stored regardless of user settings, but appears to be inaccessible this way due to class member visibility. I'm not familiar enough with geodatabases and ArcGIS file formats to know whether a project will always have a .gdb from which we could perhaps read the history.
Any insight into how to read the geoprocessing history in a way that is minimally intrusive to the user would be ideal. Is this data available elsewhere?
This was the closest/best solution I have found so far that avoids writing to the history logs, which most people disable because of file-size bloat and warnings that one operation may run other operations recursively, causing the file to balloon massively.
https://community.esri.com/t5/arcgis-pro-sdk-questions/can-you-access-geoprocessing-history-programmatically-using-the/m-p/1007833#M5842
It involves reading the .aprx file (which is written to on save) by unzipping it, parsing the XML, and filtering the contents to only GPHistoryOperations. From there I was able to read all the parameters, environment options, status, and duration of each operation, which is what I was hoping to get.
// Required namespaces (assumes the SharpZipLib and Newtonsoft.Json NuGet packages):
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Xml;
using ArcGIS.Core.CIM;
using ICSharpCode.SharpZipLib.Zip;
using Newtonsoft.Json;

public static void ListHistory()
{
    // this can be run in a console app (or within a Pro add-in)
    CIMGISProject project = GetProject(@"D:\tests\topologies\topotest1.aprx");
    foreach (CIMProjectItem hist in project.ProjectItems
        .Where(itm => itm.ItemType == "GPHistory"))
    {
        Debug.Print("+++++++++++++++++++++++++++");
        Debug.Print($"{hist.Name}");
        XmlDocument doc = new XmlDocument();
        doc.LoadXml(hist.PropertiesXML);
        // it sure would be nice if the Pro SDK had things like the MdProcess class in ArcObjects
        // https://desktop.arcgis.com/en/arcobjects/latest/net/webframe.htm#MdProcess.htm
        var json = JsonConvert.SerializeXmlNode(doc, Newtonsoft.Json.Formatting.Indented);
        Debug.Print(json);
    }
}

static CIMGISProject GetProject(string aprxPath)
{
    // aprx files are actually zip files
    // https://www.nuget.org/packages/SharpZipLib
    using (var zipFile = new ZipFile(aprxPath))
    {
        var entry = zipFile.GetEntry("GISProject.xml");
        using (var stream = zipFile.GetInputStream(entry))
        {
            using (StreamReader reader = new StreamReader(stream))
            {
                var xml = reader.ReadToEnd();
                // deserialize the xml from the aprx file to hydrate a CIMGISProject
                return ArcGIS.Core.CIM.CIMGISProject.FromXml(xml);
            }
        }
    }
}
Code provided by Kirk Kuykendall

How to use the Google api-client python library for Google Logging

I've been using the Google apiclient library in python for various Google Cloud APIs - mostly for Google Compute - with great success.
I want to start using the library to create and control the Google Logging mechanism offered by the Google Cloud Platform.
However, this is a beta version, and I can't find any real documentation or example on how to use the logging API.
All I was able to find are high-level descriptions such as:
https://developers.google.com/apis-explorer/#p/logging/v1beta3/
Can anyone provide a simple example of how to use apiclient for logging purposes?
for example creating a new log entry...
Thanks for the help
Shahar
I found this page:
https://developers.google.com/api-client-library/python/guide/logging
which states that you can do the following to set the log level:
import logging
logger = logging.getLogger()
logger.setLevel(logging.INFO)
However, it doesn't seem to have any impact on the output, which is always INFO for me.
I also tried setting httplib2 to debuglevel 4:
import httplib2
httplib2.debuglevel = 4
Yet I don't see any HTTP headers in the log :/
I know this question is old, but it is getting some attention, so I guess it might be worth answering it in case someone else comes here.
Stackdriver Logging Client Libraries for Google Cloud Platform are not in beta anymore, as they hit General Availability some time ago. The link I shared contains the most relevant documentation for installing and using them.
After running the command pip install --upgrade google-cloud-logging, you will be able to authenticate with your GCP account, and use the Client Libraries.
Using them is as easy as importing the library with a command such as from google.cloud import logging, then instantiating a new client (using the defaults, or passing the Project ID and Credentials explicitly) and finally working with logs as you want.
You may also want to visit the official library documentation, where you will find all the details of how to use the library, which methods and classes are available, and how to do most of the things, with lots of self-explanatory examples, and even comparisons between the different alternatives on how to interact with Stackdriver Logging.
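As for the original question of creating a new log entry, a minimal sketch could look like the following; the log name and message are arbitrary examples, and authentication is assumed to already be configured (e.g. via application default credentials):
# Minimal sketch: write a text entry to a named log. The log name and message
# are arbitrary examples; application default credentials are assumed.
from google.cloud import logging

client = logging.Client()
logger = client.logger("my-example-log")
logger.log_text("Hello from the Cloud Logging client library", severity="INFO")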
As a small example, let me also share a snippet of how to retrieve the five most recent log entries with a severity of "warning" or higher:
# Import the Google Cloud Python client library
from google.cloud import logging
from google.cloud.logging import DESCENDING

# Instantiate a client
logging_client = logging.Client(project=<PROJECT_ID>)

# Set the filter to apply to the logs; this one retrieves GAE logs from the
# default service with a severity of "warning" or higher
FILTER = 'resource.type:gae_app and resource.labels.module_id:default and severity>=WARNING'

# List the entries in DESCENDING order, applying the FILTER, and stop after five
for i, entry in enumerate(logging_client.list_entries(order_by=DESCENDING, filter_=FILTER)):  # API call
    print('{} - Severity: {}'.format(entry.timestamp, entry.severity))
    if i >= 4:
        break
Bear in mind that this is just a simple example, and that many things can be achieved using the Logging Client Library, so you should refer to the official documentation pages that I shared in order to get a more deep understanding of how everything works.
However it doesn't seem to have any impact on the output which is always INFO for me.
add a logging handler, e.g.:
import logging

logger = logging.getLogger()  # same root logger as in the earlier snippet
formatter = logging.Formatter('%(asctime)s %(process)d %(levelname)s: %(message)s')
consoleHandler = logging.StreamHandler()
consoleHandler.setLevel(logging.DEBUG)
consoleHandler.setFormatter(formatter)
logger.addHandler(consoleHandler)

How can I generate a file like this for Bing Heat Map data?

I am working on a fairly simple Heat Map application where the longitude and latitude of the points will be stored in a SQL Server database. I have been looking at an example that uses an array of objects as follows (eliminated a lot of data for brevity):
/* Sample data to demonstrate Bing Maps Heatmap */
/* http://alastair.wordpress.com */
var CrimeData = [
new Microsoft.Maps.Location(52.67280, 0.94392),
new Microsoft.Maps.Location(52.62423, 1.29493),
new Microsoft.Maps.Location(52.62187, 1.29080),
new Microsoft.Maps.Location(52.58962, 1.72228),
new Microsoft.Maps.Location(52.69915, 0.24332),
new Microsoft.Maps.Location(52.51161, 0.99350),
new Microsoft.Maps.Location(52.59573, 1.17067),
new Microsoft.Maps.Location(52.94351, 0.49153),
new Microsoft.Maps.Location(52.64585, 1.73145),
new Microsoft.Maps.Location(52.75424, 1.30079),
new Microsoft.Maps.Location(52.63566, 1.27176),
new Microsoft.Maps.Location(52.63882, 1.23121)
];
What I want to do is present the user with a list of some sort that displays all the data sets that exist in the database (they each have a name associated with them) and then allow the user to check all or only a select few. I will then need to generate an array like the above to create the heat map. Any ideas on a good approach to this?
What you are trying to achieve is more related to web development than to Bing Maps specifically.
To summarize, you have multiple ways to do this, but it really depends on what you are able to do and what you need in the interface.
What process/technology?
First, you need to determine what process you want to follow to display the data, as that will determine the technology you use. The questions you need to ask yourself are:
Do you want to be able to change the data sets dynamically without refreshing the whole page?
If yes, it means that you will have to use asynchronous data loading through a dedicated web service in order to avoid loading all the information at the initial load of the page.
Do you have lots of data to load?
If so, that also argues for asynchronous loading, so you avoid loading all the data up front.
If not, loading every element into a few arrays up front might be the simplest solution.
Implementation
So, if you now want to create a web service to load the data asynchronously, you can take a look at the following websites:
http://www.asp.net/get-started
http://www.stefanprodan.com/2011/04/async-operations-with-jquery-ajax-and-asp-net-mvc/
There are other interesting websites that you will be able to find. If needed, add a comment and I'm sure the community will help you.
If you want to generate the data directly in the script, it can be as simple as composing the JavaScript directly in your dynamically created HTML page (in your ASP.NET markup code or whatever technology you're using). A rough sketch of the asynchronous web-service approach is shown below.
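For the asynchronous option, the server side could look like the following sketch. It is written in Python with Flask and pyodbc purely for illustration (the articles linked above use ASP.NET), and the table and column names (Points, DataSetName, Latitude, Longitude) as well as the connection string are assumptions. The page would request this JSON for whichever data sets the user ticked, and turn each row into a new Microsoft.Maps.Location(latitude, longitude) before building the heat map.
# Sketch only: return the selected data sets as JSON so the page can build the
# Microsoft.Maps.Location array client-side. Table/column names and the
# connection string are hypothetical placeholders.
import pyodbc
from flask import Flask, jsonify, request

app = Flask(__name__)
CONN_STR = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=.;DATABASE=HeatMap;Trusted_Connection=yes"

@app.route("/heatmap-data")
def heatmap_data():
    # e.g. /heatmap-data?sets=Crime2013&sets=Crime2014 for the data sets the user checked
    selected = request.args.getlist("sets")
    if not selected:
        return jsonify([])
    placeholders = ",".join("?" * len(selected))
    sql = ("SELECT Latitude, Longitude FROM Points "
           "WHERE DataSetName IN ({})".format(placeholders))
    with pyodbc.connect(CONN_STR) as conn:
        rows = conn.cursor().execute(sql, selected).fetchall()
    return jsonify([{"latitude": r.Latitude, "longitude": r.Longitude} for r in rows])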

How to add EPSG 900913 to geodjango spatialite database?

I'm trying to include a Google Maps widget in my admin-interface using this snippet on a Linux system (presently running locally on a Bitnami django stack in VMWare Player).
The map renders, but point features (any features really) in my database are not showing up on the map, and when trying to register points through the map interface, I get an error that:
An error occurred when transforming the geometry to the SRID of the geometry form field.
I realized from the geodjango docs that the Google spatial reference system is not included when initializing the spatialite/sqlite database, and the solution should be to issue the following commands to add the SRS:
$ python manage.py shell
>>> from django.contrib.gis.utils import add_srs_entry
>>> add_srs_entry(900913)
However, when I do this from my project directory, I get:
ERROR 6: EPSG PCS/GCS code 900913 not found in EPSG support files. Is this a valid
EPSG coordinate system?
I have confirmed that GDAL, GEOS and PROJ4 are installed, and I have added the environment variables GDAL_DATA and PROJ_LIB to my .profile. I have checked the /usr/local/share/gdal/gcs.csv file, which does not appear to have an entry for 900913 (I have googled for other versions of gcs.csv, but none seem to contain 900913). I assume this is causing the error. However, the cubewerx_extra.wkt file in the same directory does have a WKT entry for 900913.
My question is: How do I make add_srs_entry find the right SRS representation in order to add it to my database? Or is there a work-around, e.g. somehow converting the WKT representation and inserting it manually in gcs.csv?
I appreciate any help!
EDIT:
I found a way to manually insert EPSG 900913 into the spatialite database. The solution is inspired by the SQL statement found at http://trac.osgeo.org/openlayers/wiki/SphericalMercator and issued to the database backend using raw SQL (as described in the docs at https://docs.djangoproject.com/en/dev/topics/db/sql/#executing-custom-sql-directly):
from django.db import connection, transaction
cursor = connection.cursor()
sql = "INSERT into spatial_ref_sys (srid, auth_name, auth_srid, ref_sys_name, proj4text) values (900913 ,'EPSG',900913,'Google Maps Global Mercator','+proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0 +k=1.0 +units=m +nadgrids=#null +no_defs');"
cursor.execute(sql)
transaction.commit_unless_managed()
I've confirmed that the entry is now in the spatial_ref_sys table. But I am still getting the same error when trying to add points in the admin-interface. Points can be added to the map, but when trying to save the feature, I get the error:
An error occurred when transforming the geometry to the SRID of the geometry form field.
Is the above sql statement correct? Is it sufficient, or does the add_srs_entry do other things as well?
Finally it could be a coding problem in my application, I will work on a minimal test-example and post it...
OK, I found the answer to the main question, as also indicated under the edit-post.
For future reference, here is a method for adding the Google spherical projection to a spatialite database (which must already be spatially enabled):
1) Create a text file with the following content:
BEGIN;
INSERT into spatial_ref_sys (srid, auth_name, auth_srid, ref_sys_name, proj4text) values (900913,'EPSG',900913,'Google Maps Global Mercator','+proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0 +k=1.0 +units=m +nadgrids=@null +no_defs');
COMMIT;
2) Save the file with a name like init_EPSG900913.sql in the directory holding your spatialite database.
3) Issue the following command to execute the SQL statement on the database:
spatialite some_database.sqlite < init_EPSG900913.sql
Alternative method - From inside django-script or in "python manage.py shell":
from django.db import connection, transaction
cursor = connection.cursor()
sql = "INSERT into spatial_ref_sys (srid, auth_name, auth_srid, ref_sys_name, proj4text) values (900913 ,'EPSG',900913,'Google Maps Global Mercator','+proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0 +k=1.0 +units=m +nadgrids=#null +no_defs');"
cursor.execute(sql)
transaction.commit_unless_managed()
With either of these two methods, your database will have the Google Maps reference system registered.
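A possible third alternative, untested and based on the assumption that add_srs_entry accepts a GDAL SpatialReference object (not just an EPSG code), would be to build the spatial reference from the proj4 string yourself, so the missing EPSG support files never come into play:
# Untested sketch: construct the spatial reference from the proj4 string and
# let add_srs_entry write it to spatial_ref_sys directly.
from django.contrib.gis.gdal import SpatialReference
from django.contrib.gis.utils import add_srs_entry

merc = SpatialReference(
    '+proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0 '
    '+k=1.0 +units=m +nadgrids=@null +no_defs'
)
add_srs_entry(merc, auth_name='EPSG', auth_srid=900913,
              ref_sys_name='Google Maps Global Mercator')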
It turns out that the missing EPSG definition was only part of the problem.
The other part was related to the fact that the app is running on a bitnami ubuntu django stack.
When following the installation guide in the geodjango docs on the bitnami ubuntu django stack, all the extra python packages and spatial libraries are installed in the system folders under /usr/local/..something.., not in the self-contained bitnami environment.
For future reference, make sure to issue the following statements before installing additional python packages:
$ sudo su
$ /opt/bitnami/use_djangostack
Then packages will be installed in the bitnami environment.
Also, when configuring builds of the different spatial libraries with the ./configure command, extra options must be added to place the shared files in the bitnami environment.
I have typically used something like:
$ ./configure --prefix=/opt/bitnami/common
Additional arguments might have to be passed as described in the geodjango docs - but the paths specified in these arguments must be changed to point to the proper subdirectories of /opt/bitnami/...
