geopy geocode does not map address, but Google Maps in the browser does

I am trying to map a list of ~320 Texas addresses to (lat, lng).
I started with geopy (the simple example) and it worked for some addresses, but it failed on a set of them.
So I integrated a fallback with the googlemaps geocode API... but it failed too. The code is below; see address_to_geoPt.
Yet, when I submit the failed addresses via the browser, Google Maps finds them. Any tips on how to get more reliable hits? Which Google API should I use (see address_to_geoPt_googlemaps())?
import numpy as np
import pandas as pd
import googlemaps
from geopy.geocoders import Nominatim

class GeoMap(dbXLS):
    def __init__(self, **kwargs):
        super(GeoMap, self).__init__(**kwargs)  # was umcGeoMap; must match the class name
        # Geo locators
        self.gl = Nominatim(user_agent="church-mapper")  # newer geopy versions require a user_agent
        self.gmaps = googlemaps.Client(key='mykeyISworking')
        shName = self.xl.sheet_names[0] if 'sheet' not in kwargs else kwargs['sheet']
        self.df = self.xl.parse(shName)

    def address_to_geoPt(self, addr):
        l = self.gl.geocode(addr)  # was self.geoLocation(addr), which is not defined anywhere
        if l:
            return (l.latitude, l.longitude)
        return (np.nan, np.nan)

    def address_to_geoPt_googlemaps(self, addr):
        # Geocoding an address
        geocode = self.gmaps.geocode(addr)
        if not geocode:  # was "if l == None"; l does not exist in this scope
            return (np.nan, np.nan)
        locDict = geocode[0]['geometry']['location']
        return (locDict['lat'], locDict['lng'])

    def address(self, church):
        return (church.Address1 + " "
                + church.City + " "
                + church.State + " "
                + church.ZipCode + " "
                + church.Country)

    def church_to_geoPt(self, church):
        a = (church.Address1 + " "
             + church.City + " "
             + church.State)
        if pd.isnull(church.geoPt):
            (lat, lng) = self.address_to_geoPt(a)
        else:
            (lat, lng) = church.geoPt
        if not pd.isnull(lat):
            print("DEBUG to_geoPt 1", lat, lng, a)
            return (lat, lng)
        (lat, lng) = self.address_to_geoPt_googlemaps(a)
        print("DEBUG to_geoPt 2", lat, lng, a)
        return (lat, lng)
The following shows a set of addresses that are not mapped by geocoders.
4 3000 Bee Creek Rd Spicewood TX 78669-5109 USA
6 P O BOX 197 BERTRAM TX 78605-0197 USA
10 2833 Petuma Dr Kempher TX 78639 USA

@geocodezip provided the answer... and the code worked the next day.
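Independent of the server-side hiccup that apparently resolved itself, pre-cleaning the address strings often raises geocoder hit rates; ZIP+4 suffixes and "P O BOX" spellings like the failing rows above are common stumbling blocks. A minimal sketch (the normalization rules here are illustrative assumptions, not the accepted fix):

```python
import re

def normalize_address(addr):
    """Pre-clean a raw address string before handing it to a geocoder."""
    # Unify "P O BOX" / "P.O. BOX" spellings (assumed rule, adjust to taste)
    addr = re.sub(r'\bP\.?\s*O\.?\s+BOX\b', 'PO Box', addr, flags=re.IGNORECASE)
    # Drop the ZIP+4 suffix; some geocoders match plain 5-digit ZIPs more reliably
    addr = re.sub(r'\b(\d{5})-\d{4}\b', r'\1', addr)
    # Collapse runs of whitespace
    return re.sub(r'\s+', ' ', addr).strip()

print(normalize_address('P O BOX 197 BERTRAM TX 78605-0197 USA'))
# PO Box 197 BERTRAM TX 78605 USA
```

A normalized string can then be fed to address_to_geoPt first, with the Google Maps fallback tried only on a miss, so the paid API is hit as rarely as possible.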

Related

How to get location from Twitter user profile using Beautiful Soup4?

So, I am trying to get the location text from the profile of a given Twitter account:
import requests as req
from bs4 import BeautifulSoup

handles = ['IndieWire', 'AFP', 'UN']
location_list = []
for x in handles:
    url = "https://twitter.com/" + x
    try:
        html = req.get(url)
    except Exception as e:
        print(f"Failed to fetch page for url {url} due to: {e}")
        continue
    soup = BeautifulSoup(html.text, 'html.parser')
    try:
        label = soup.find('span', {'class': "ProfileHeaderCard-locationText"})
        label_formatted = label.string.lstrip()
        label_formatted = label_formatted.rstrip()
        if label_formatted != "":
            location_list.append(label_formatted)
            print(x + ' : ' + label_formatted)
        else:
            location_list.append(label_formatted)
            print(x + ' : ' + 'Not found')
    except AttributeError:
        try:
            label2 = soup.findAll('span', {"class": "ProfileHeaderCard-locationText"})[0].get_text()
            label2 = str(label2)
            label2_formatted = label2.lstrip()
            label2_formatted = label2_formatted.rstrip()
            location_list.append(label2_formatted)  # was label_formatted, which is undefined in this branch
            print(x + ' : ' + label2_formatted)
        except Exception:
            print(x + ' : ' + 'Not found')
    except Exception:
        print(x + ' : ' + 'Not found')
This code worked when I used it a few months ago. I changed it a little after checking the Twitter page source, but I still can't get the locations. I hope you can help.
Use the mobile version of Twitter to get the location.
For example:
import requests
from bs4 import BeautifulSoup

handles = ['IndieWire', 'AFP', 'UN']
ref = 'https://twitter.com/{h}'
headers = {'Referer': ''}
url = 'https://mobile.twitter.com/i/nojs_router?path=/{h}'
for h in handles:
    headers['Referer'] = ref.format(h=h)
    soup = BeautifulSoup(requests.post(url.format(h=h), headers=headers).content, 'html.parser')
    loc = soup.select_one('.location')
    if loc:
        print(h, loc.text)
    else:
        print(h, 'Not Found')
Prints:
IndieWire New York, NY
AFP France
UN New York, NY

invalid value encountered in true_divide in rk45

I'm trying to implement RK45 for a two-body problem with the Earth and Sun, but I keep getting a division by zero that I don't understand. The division seems to occur in the norm computed in the accelerations function, but I don't see how that can be or how to fix it. Here is the code:
from scipy import optimize
from numpy import linalg as LA
import matplotlib.pyplot as plt
from scipy.optimize import fsolve
import numpy as np

AU = 1.5e11
a = AU
e = 0.5
mss = 2E30
ms = 2E30
me = 5.98E24
mv = 4.867E24
yr = 3.15e7
h = 100
mu1 = ms*me/(ms+me)
mu2 = ms*me/(ms+me)
G = 6.67E11
step = 24
vi = np.sqrt(G*ms*(2/(a*(1-e))-1/a))
#sun=sphere(pos=vec(0,0,0),radius=0.1*AU,color=color.yellow)
#earth=sphere(pos=vec(1*AU,0,0),radius=0.1*AU)
sunpos = np.array([-903482.12391302, -6896293.6960525, 0.])
earthpos = np.array([a*(1-e), 0, 0])
earthv = np.array([0, vi, 0])
sunv = np.array([0, 0, 0])

def accelerations(t, earthposs, sunposs):
    norme = sum((earthposs-sunposs)**2)**0.5
    gravit = G*(earthposs-sunposs)/norme**3
    sunaa = me*gravit
    earthaa = -ms*gravit
    return earthaa, sunaa

def ode45(f, t, y, h):
    """Calculate the next step of an initial value problem (IVP) of an ODE with a RHS described
    by the RHS function, with an order 4 approx. and an order 5 approx.
    Parameters:
        t: float. Current time.
        y: float. Current step (position).
        h: float. Step-length.
    Returns:
        w: float. Order 4 approx.
        q: float. Order 5 approx.
    """
    s1 = f(t, y[0], y[1])
    s2 = f(t + h/4.0, y[0] + h*s1[0]/4.0, y[1] + h*s1[1]/4.0)
    s3 = f(t + 3.0*h/8.0, y[0] + 3.0*h*s1[0]/32.0 + 9.0*h*s2[0]/32.0, y[1] + 3.0*h*s1[1]/32.0 + 9.0*h*s2[1]/32.0)
    s4 = f(t + 12.0*h/13.0, y[0] + 1932.0*h*s1[0]/2197.0 - 7200.0*h*s2[0]/2197.0 + 7296.0*h*s3[0]/2197.0, y[1] + 1932.0*h*s1[1]/2197.0 - 7200.0*h*s2[1]/2197.0 + 7296.0*h*s3[1]/2197.0)
    s5 = f(t + h, y[0] + 439.0*h*s1[0]/216.0 - 8.0*h*s2[0] + 3680.0*h*s3[0]/513.0 - 845.0*h*s4[0]/4104.0, y[1] + 439.0*h*s1[1]/216.0 - 8.0*h*s2[1] + 3680.0*h*s3[1]/513.0 - 845.0*h*s4[1]/4104.0)
    s6 = f(t + h/2.0, y[0] - 8.0*h*s1[0]/27.0 + 2*h*s2[0] - 3544.0*h*s3[0]/2565 + 1859.0*h*s4[0]/4104.0 - 11.0*h*s5[0]/40.0, y[1] - 8.0*h*s1[1]/27.0 + 2*h*s2[1] - 3544.0*h*s3[1]/2565 + 1859.0*h*s4[1]/4104.0 - 11.0*h*s5[1]/40.0)
    w1 = y[0] + h*(25.0*s1[0]/216.0 + 1408.0*s3[0]/2565.0 + 2197.0*s4[0]/4104.0 - s5[0]/5.0)
    w2 = y[1] + h*(25.0*s1[1]/216.0 + 1408.0*s3[1]/2565.0 + 2197.0*s4[1]/4104.0 - s5[1]/5.0)
    q1 = y[0] + h*(16.0*s1[0]/135.0 + 6656.0*s3[0]/12825.0 + 28561.0*s4[0]/56430.0 - 9.0*s5[0]/50.0 + 2.0*s6[0]/55.0)
    q2 = y[1] + h*(16.0*s1[1]/135.0 + 6656.0*s3[1]/12825.0 + 28561.0*s4[1]/56430.0 - 9.0*s5[1]/50.0 + 2.0*s6[1]/55.0)
    return w1, w2, q1, q2

t = 0
T = 10**5
xarray = []
yarray = []
while t < T:
    ode45(accelerations, t, [earthpos, sunpos], h)
    earthpos = ode45(accelerations, t, [earthpos, sunpos], h)[1]
    sunpos = ode45(accelerations, t, [earthpos, sunpos], h)[3]
    xarray.append(ode45(accelerations, t, [earthpos, sunpos], h)[0][0])
    yarray.append(ode45(accelerations, t, [earthpos, sunpos], h)[0][1])
    print(ode45(accelerations, t, [earthpos, sunpos], h)[0][0], ode45(accelerations, t, [earthpos, sunpos], h)[0][1])
    t = t+h
plt.plot(xarray, yarray)
plt.savefig('orbit.png')
plt.show()
After the second iteration the code comes back with only nan values for the earthpos.
Numerical integration methods usually integrate first-order systems, y' = f(t, y). You want to integrate a second-order ODE system, y'' = f(t, y), which you first need to turn into a first-order system.
Why do you not use numpy arrays as vectors throughout?
Why do you perform the same computation with the same arguments multiple times, instead of catching all the return values once and then distributing them to the lists?
You could also use scipy.integrate.solve_ivp with the "RK45" method instead of programming it yourself.
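To make those suggestions concrete, here is a minimal sketch (not the asker's code: it reuses the question's constants, but with the SI gravitational constant 6.674e-11 instead of the question's 6.67E11, whose exponent sign is wrong). Positions and velocities of both bodies are stacked into one 12-component state vector, so the second-order system becomes first order and can be handed straight to scipy.integrate.solve_ivp:

```python
import numpy as np
from scipy.integrate import solve_ivp

G = 6.674e-11            # SI value; the question's 6.67E11 is off by a factor of 10**22
ms, me = 2e30, 5.98e24   # Sun and Earth masses from the question
a, e = 1.5e11, 0.5

def rhs(t, y):
    # State layout: y = [r_earth (3), v_earth (3), r_sun (3), v_sun (3)]
    r_e, v_e, r_s, v_s = y[0:3], y[3:6], y[6:9], y[9:12]
    d = r_e - r_s
    grav = G * d / np.linalg.norm(d)**3
    # First-order form: position' = velocity, velocity' = acceleration
    return np.concatenate([v_e, -ms * grav, v_s, me * grav])

vi = np.sqrt(G * ms * (2.0/(a*(1 - e)) - 1.0/a))  # vis-viva speed at perihelion
y0 = np.concatenate([[a*(1 - e), 0, 0], [0, vi, 0], np.zeros(3), np.zeros(3)])
sol = solve_ivp(rhs, (0, 1e5), y0, method='RK45', rtol=1e-9)
print(sol.success, sol.y[0:2, -1])   # final Earth x, y
```

Because the state vector now carries the velocities, there is no need for the separate earthv/sunv bookkeeping, and the solver evaluates rhs only as often as the method requires.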

Device dependency in ZABBIX 4.2

Suppose the following scenario in Zabbix 4.2: we have a core switch, two distribution switches, and 20 access switches, where the distribution switches are connected to the core and 10 access switches are connected to each distribution switch. I am monitoring all of them with SNMP v2c and the official Cisco switch template. The problem is that I cannot easily define device dependency in Zabbix. By "easily" I mean that if a distribution switch goes down, I want an alarm for that device only, not for all the access switches connected to it. I could define it by hand: change the triggers on each device and make them dependent on the corresponding trigger of its distribution switch. However, this is too time-consuming. What should I do? Any help is appreciated.
You are right, there isn't an easy way to set this kind of dependency.
I had to manage the same situation a while ago, and I wrote a Python dependency setter which uses a "dependent hostgroup <--> master host" logic.
You can modify it to fit your needs (see masterTargetTriggerDescription and slaveTargetTriggerDescription for the dependency targets). It works, but it contains little error checking: use at your own risk!
import csv
import re
import json
from zabbix.api import ZabbixAPI

# Zabbix Server endpoint
zabbixServer = 'https://yourzabbix/zabbix/'
zabbixUser = 'admin'
zabbixPass = 'zabbix'
zapi = ZabbixAPI(url=zabbixServer, user=zabbixUser, password=zabbixPass)

# Hostgroup variables - to reference IDs while building API parameters
hostGroupNames = []  # list = array
hostGroupId = {}     # dict = associative array

# CSV file for dep settings - see the format:
"""
Hostgroup;Master
ACCESS_1;DistSwitch1
ACCESS_2;DistSwitch1
ACCESS_5;DistSwitch2
ACCESS_6;DistSwitch2
DIST;CoreSwitch1
"""
fileName = 'dependancy.csv'
masterTargetTriggerDescription = '{HOST.NAME} is unavailable by ICMP'
slaveTargetTriggerDescription = '{HOST.NAME} is unavailable by ICMP|Zabbix agent on {HOST.NAME} is unreachable'

# Read CSV file
hostFile = open(fileName)
hostReader = csv.reader(hostFile, delimiter=';', quotechar='|')
hostData = list(hostReader)

# CSV parsing
for line in hostData:
    hostgroupName = line[0]
    masterName = line[1]
    slaveIds = []
    masterId = zapi.get_id('host', item=masterName, with_id=False, hostid=None)
    hostGroupId = zapi.get_id('hostgroup', item=hostgroupName, with_id=False, hostid=None)
    masterTriggerObj = zapi.trigger.get(hostids=masterId, filter=({'description': masterTargetTriggerDescription}))
    print "Group: " + hostgroupName + " - ID: " + str(hostGroupId)
    print "Master host: " + masterName + " - ID: " + str(masterId)
    print "Master trigger: " + masterTriggerObj[0]['description'] + " - ID: " + str(masterTriggerObj[0]['triggerid'])
    # cycle through slave hosts
    hostGroupObj = zapi.hostgroup.get(groupids=hostGroupId, selectHosts='extend')
    for host in hostGroupObj[0]['hosts']:
        # exclude master
        if host['hostid'] != str(masterId):
            print " - Host Name: " + host['name'] + " - ID: " + host['hostid'] + " - MASTER: " + str(masterId)
            # cycle through all of the slave's triggers
            slaveTargetTriggerObj = zapi.trigger.get(hostids=host['hostid'])
            #print json.dumps(slaveTargetTriggerObj)
            for slaveTargetTrigger in slaveTargetTriggerObj:
                # search for dependency targets
                if re.search(slaveTargetTriggerDescription, slaveTargetTrigger['description'], re.IGNORECASE):
                    print " - Trigger: " + slaveTargetTrigger['description'] + " - ID: " + slaveTargetTrigger['triggerid']
                    # Clear existing deps from the trigger, then create the new dep
                    clear = zapi.trigger.deletedependencies(triggerid=slaveTargetTrigger['triggerid'].encode())
                    result = zapi.trigger.adddependencies(triggerid=slaveTargetTrigger['triggerid'].encode(), dependsOnTriggerid=masterTriggerObj[0]['triggerid'])
    print "----------------------------------------"
    print ""
I updated the code contributed by Simone Zabberoni and rewrote it to work with Python 3, PyZabbix, and YAML.
#!/usr/bin/python3
import re
import yaml
# https://pypi.org/project/py-zabbix/
from pyzabbix import ZabbixAPI

# Zabbix Server endpoint
zabbix_server = 'https://zabbix.example.com/zabbix/'
zabbix_user = 'zbxuser'
zabbix_pass = 'zbxpassword'

# Create ZabbixAPI class instance
zapi = ZabbixAPI(zabbix_server)

# Enable HTTP auth
zapi.session.auth = (zabbix_user, zabbix_pass)

# Login (in case of HTTP auth, only the username is needed; the password, if passed, will be ignored)
zapi.login(zabbix_user, zabbix_pass)

# Hostgroup variables - to reference IDs while building API parameters
hostGroupNames = []  # list = array
hostGroupId = {}     # dict = associative array

# YAML file for dep settings - see the format:
"""
pvebar16 CTs:
  master: pvebar16.example.com
  masterTargetTriggerDescription: 'is unavailable by ICMP'
  slaveTargetTriggerDescription: 'is unavailable by ICMP|Zabbix agent is unreachable for 5 minutes'
"""
fileName = 'dependancy.yml'
with open(fileName) as f:
    hostData = yaml.safe_load(f)  # safe_load: yaml.load without a Loader is deprecated

for groupyml in hostData.keys():
    masterTargetTriggerDescription = hostData[groupyml]['masterTargetTriggerDescription']
    slaveTargetTriggerDescription = hostData[groupyml]['slaveTargetTriggerDescription']
    masterName = hostData[groupyml]['master']
    hostgroupName = groupyml
    slaveIds = []
    masterId = zapi.host.get(filter={'host': masterName}, output=['hostid'])[0]['hostid']
    hostGroupId = zapi.hostgroup.get(filter={'name': hostgroupName}, output=['groupid'])[0]['groupid']
    masterTriggerObj = zapi.trigger.get(host=masterName, filter={'description': masterTargetTriggerDescription}, output=['triggerid', 'description'])
    print("Group: " + hostgroupName + " - ID: " + str(hostGroupId))
    print("Master host: " + masterName + " - ID: " + str(masterId))
    print("Master trigger: " + masterTriggerObj[0]['description'] + " - ID: " + str(masterTriggerObj[0]['triggerid']))
    # cycle through slave hosts
    hostGroupObj = zapi.hostgroup.get(groupids=hostGroupId, selectHosts='extend')
    for host in hostGroupObj[0]['hosts']:
        # exclude master
        if host['hostid'] != str(masterId):
            print(" - Host Name: " + host['name'] + " - ID: " + host['hostid'] + " - MASTER: " + str(masterId))
            # cycle through all of the slave's triggers
            slaveTargetTriggerObj = zapi.trigger.get(hostids=host['hostid'])
            for slaveTargetTrigger in slaveTargetTriggerObj:
                # search for dependency targets
                if re.search(slaveTargetTriggerDescription, slaveTargetTrigger['description'], re.IGNORECASE):
                    print(" - Trigger: " + slaveTargetTrigger['description'] + " - ID: " + slaveTargetTrigger['triggerid'])
                    # Clear existing deps from the trigger, then create the new dep
                    clear = zapi.trigger.deletedependencies(triggerid=slaveTargetTrigger['triggerid'])
                    result = zapi.trigger.adddependencies(triggerid=slaveTargetTrigger['triggerid'], dependsOnTriggerid=masterTriggerObj[0]['triggerid'])
    print("----------------------------------------")
    print("")

Saving a leaflet map as an html file

I've created a leaflet map of corn yield in Kansas using USDA NASS data. The problem I'm running into is exporting the map to an HTML file with the command:
htmlwidgets::saveWidget(my_interactive_map, "kansas_corn2.html")
but I get this error:
Error in system.file(config, package = package) : 'package' must be of length 1
However, I can produce an html file by using Export > Save as Web Page.. from the Viewer pane.
How can I achieve the same export result using the command line?
My code for making the map is:
my_interactive_map <- tm_shape(STATE) +
  tm_polygons("Value", textNA = "Not Reported",
              title = unit_desc, palette = c('#8290af', '#512888', '#190019'),
              auto.palette.mapping = FALSE, n = 6, style = "quantile", contrast = 0.9, colorNA = "#C0C0C0",
              border.col = "#E8E8E8", showNA = FALSE, legend.is.portrait = FALSE,
              legend.hist = FALSE, popup.vars = c("County: " = "COUNTY_NAME", "Value: " = "Value")) +
  tm_credits("U.S. Department of Agriculture, National Agriculture Statistics Service") +
  tm_format_World(title = paste(year_filt, prodn_practice_desc, commodity_desc, statisticcat_desc, "by",
                                agg_level_desc, "for", state, sep = " "))
my_interactive_map
You appear to be using the tmap package. For that you could use the function detailed here:
library(tmap)
save_tmap(my_interactive_map, "kansas_corn2.html")
(In tmap 2.0 and later this function was renamed tmap_save().)

Parse Geocodio::Address

I'm using Ruby and the geocodio gem to do some reverse geocoding. The reverse geocoder returns an object of type Geocodio::Address, which per their website is JSON. I'm trying to use Ruby's JSON.parse to convert it to a hash that I can then map as needed.
JSON.parse raises this error:
C:/Ruby23-x64/lib/ruby/2.3.0/json/common.rb:156:in `parse': 784: unexpected token at '"Saipan, MP 96950"' (J
        from C:/Ruby23-x64/lib/ruby/2.3.0/json/common.rb:156:in `parse'
Here's my whole script:
require 'geocodio'
require 'csv'   # lowercase; 'CSV' only works on case-insensitive filesystems
require 'json'

filename = './for_fips.csv'
#fipslist = []
geocodio = Geocodio::Client.new('not real...d5a1557e2175d8ce265')
i = 1
CSV.foreach(filename) do |row|
  lat = row[1]
  long = row[2]
  coord = lat + "," + long
  #puts coord
  add = geocodio.reverse_geocode([lat + "," + long], fields: %w[cd stateleg school timezone]).best
  add_parsed = JSON.parse(add)
  pp add_parsed
  #puts add.each { |k,v| "#{k}=#{v}"}.join('~~')
  i += 1
  break if i > 2
  #fipslist << fcc.district_fips
end