GeoServer Image Pyramid with JPEG Compression

I have an Image Pyramid (created using gdal_retile) that I want to import into GeoServer (2.8.2, using the Image Pyramid plugin) and publish as WMS.
I have been able to do this successfully using GeoTIFF images; however, as we are storing many terabytes of imagery, we would like to use JPEG-compressed TIFFs.
If I run gdal_retile with the compression arguments, e.g.:
python %OSGEO4W_ROOT%\bin\gdal_retile.py -v -r bilinear -levels %LEVELS% ^
  -tileIndex TileIndex.shp -ps %PIXELS% %PIXELS% ^
  -co "TILED=YES" -co "COMPRESS=JPEG" ^
  -targetDir %INPUT%\PROCESSED\ --optfile %INPUT%\tilelist.txt
Then GeoServer errors when trying to publish the layer, but it doesn't seem to provide a complete error log (I've pasted what it supplied below).
There's some supporting info below - hopefully this is enough - thanks in advance!
I have also tried this on version 2.7.6, with the same results.
My folder structure after creating the Image Pyramid data store is as follows (NOTE: output/1 is listed only as an example; all level folders have the same structure):
../output
Level Folders (0,1,2,3,4,5)
output.prj
output.properties
../output/1
Contains images (JPEG Compressed TIFF)
Tile Index (1.shp,1.shx,1.prj,1.dbf)
Properties File (1.properties)
Fix File (1.fix) - not sure what this is.
I have run gdalinfo on one of the output TIF files:
Driver: GTiff/GeoTIFF
Files: C:\geoserver\data\pyramids\constantia_comp\PROCESSED\output\1\constantia2ransparent_mosaic_group1_4_3.tif
Size is 2048, 1757
Coordinate System is:
PROJCS["WGS 84 / UTM zone 34S",
GEOGCS["WGS 84",
DATUM["WGS_1984",
SPHEROID["WGS 84",6378137,298.257223563,
AUTHORITY["EPSG","7030"]],
AUTHORITY["EPSG","6326"]],
PRIMEM["Greenwich",0,
AUTHORITY["EPSG","8901"]],
UNIT["degree",0.0174532925199433,
AUTHORITY["EPSG","9122"]],
AUTHORITY["EPSG","4326"]],
PROJECTION["Transverse_Mercator"],
PARAMETER["latitude_of_origin",0],
PARAMETER["central_meridian",21],
PARAMETER["scale_factor",0.9996],
PARAMETER["false_easting",500000],
PARAMETER["false_northing",10000000],
UNIT["metre",1,
AUTHORITY["EPSG","9001"]],
AXIS["Easting",EAST],
AXIS["Northing",NORTH],
AUTHORITY["EPSG","32734"]]
Origin = (262297.873039999980000,6229349.579940000500000)
Pixel Size = (0.224620000000000,-0.224620000000000)
Metadata:
AREA_OR_POINT=Area
Image Structure Metadata:
COMPRESSION=JPEG
INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left ( 262297.873, 6229349.580) ( 18d25'29.43"E, 34d 2'58.65"S)
Lower Left ( 262297.873, 6228954.923) ( 18d25'29.04"E, 34d 3'11.45"S)
Upper Right ( 262757.895, 6229349.580) ( 18d25'47.35"E, 34d 2'59.02"S)
Lower Right ( 262757.895, 6228954.923) ( 18d25'46.96"E, 34d 3'11.83"S)
Center ( 262527.884, 6229152.251) ( 18d25'38.19"E, 34d 3' 5.24"S)
Band 1 Block=256x256 Type=Byte, ColorInterp=Red
Mask Flags: PER_DATASET ALPHA
Band 2 Block=256x256 Type=Byte, ColorInterp=Green
Mask Flags: PER_DATASET ALPHA
Band 3 Block=256x256 Type=Byte, ColorInterp=Blue
Mask Flags: PER_DATASET ALPHA
Band 4 Block=256x256 Type=Byte, ColorInterp=Alpha
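To rule out a mix of compressed and uncompressed tiles, the image structure metadata of every tile can be checked programmatically. This is only a quick diagnostic sketch using the GDAL Python bindings (the pyramid root path is the one from my setup; the check itself is not part of the original workflow):
from osgeo import gdal
from pathlib import Path

# Hypothetical diagnostic: report any tile whose IMAGE_STRUCTURE metadata
# does not say COMPRESSION=JPEG.
root = Path(r"C:\geoserver\data\pyramids\constantia_comp\PROCESSED\output")

for tif in root.glob("*/*.tif"):
    ds = gdal.Open(str(tif))
    meta = ds.GetMetadata("IMAGE_STRUCTURE")
    if meta.get("COMPRESSION") != "JPEG":
        print(f"{tif}: {meta.get('COMPRESSION', 'uncompressed')}")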
Error Log:
java.lang.RuntimeException: Error occurred while building the resources for the configuration page
at org.geoserver.web.data.layer.NewLayerPage.buildLayerInfo(NewLayerPage.java:356)
at org.geoserver.web.data.layer.NewLayerPage$8.onClick(NewLayerPage.java:281)
at org.geoserver.web.wicket.SimpleAjaxLink$1.onClick(SimpleAjaxLink.java:46)
at org.apache.wicket.ajax.markup.html.AjaxLink$1.onEvent(AjaxLink.java:68)
at org.apache.wicket.ajax.AjaxEventBehavior.respond(AjaxEventBehavior.java:177)
at org.apache.wicket.ajax.AbstractDefaultAjaxBehavior.onRequest(AbstractDefaultAjaxBehavior.java:300)
at org.apache.wicket.request.target.component.listener.BehaviorRequestTarget.processEvents(BehaviorRequestTarget.java:119)
at org.apache.wicket.request.AbstractRequestCycleProcessor.processEvents(AbstractRequestCycleProcessor.java:92)
at org.apache.wicket.RequestCycle.processEventsAndRespond(RequestCycle.java:1250)
at org.apache.wicket.RequestCycle.step(RequestCycle.java:1329)
at org.apache.wicket.RequestCycle.steps(RequestCycle.java:1436)
at org.apache.wicket.RequestCycle.request(RequestCycle.java:545)
at org.apache.wicket.protocol.http.WicketFilter.doGet(WicketFilter.java:484)
at org.apache.wicket.protocol.http.WicketServlet.doGet(WicketServlet.java:138)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at org.springframework.web.servlet.mvc.ServletWrappingController.handleRequestInternal(ServletWrappingController.java:159)
at org.springframework.web.servlet.mvc.AbstractController.handleRequest(AbstractController.java:153)
at org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter.handle(SimpleControllerHandlerAdapter.java:48)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:778)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:487)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1093)
at org.geoserver.filters.ThreadLocalsCleanupFilter.doFilter(ThreadLocalsCleanupFilter.java:28)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
at org.geoserver.filters.SpringDelegatingFilter$Chain.doFilter(SpringDelegatingFilter.java:75)
at org.geoserver.wms.animate.AnimatorFilter.doFilter(AnimatorFilter.java:71)
at org.geoserver.filters.SpringDelegatingFilter$Chain.doFilter(SpringDelegatingFilter.java:71)
at org.geoserver.filters.SpringDelegatingFilter.doFilter(SpringDelegatingFilter.java:46)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
at org.geoserver.platform.AdvancedDispatchFilter.doFilter(AdvancedDispatchFilter.java:50)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:311)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:69)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:116)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:73)
at org.geoserver.security.filter.GeoServerCompositeFilter.doFilter(GeoServerCompositeFilter.java:92)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:323)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:69)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:73)
at org.geoserver.security.filter.GeoServerCompositeFilter.doFilter(GeoServerCompositeFilter.java:92)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:323)
at org.geoserver.security.filter.GeoServerAnonymousAuthenticationFilter.doFilter(GeoServerAnonymousAuthenticationFilter.java:54)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:323)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:69)
at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:182)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:73)
at org.geoserver.security.filter.GeoServerCompositeFilter.doFilter(GeoServerCompositeFilter.java:92)
at org.geoserver.security.filter.GeoServerUserNamePasswordAuthenticationFilter.doFilter(GeoServerUserNamePasswordAuthenticationFilter.java:116)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:323)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:69)
at org.springframework.security.web.authentication.rememberme.RememberMeAuthenticationFilter.doFilter(RememberMeAuthenticationFilter.java:146)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:73)
at org.geoserver.security.filter.GeoServerCompositeFilter.doFilter(GeoServerCompositeFilter.java:92)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:323)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:69)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at org.geoserver.security.filter.GeoServerSecurityContextPersistenceFilter$1.doFilter(GeoServerSecurityContextPersistenceFilter.java:53)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:73)
at org.geoserver.security.filter.GeoServerCompositeFilter.doFilter(GeoServerCompositeFilter.java:92)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:323)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:173)
at org.geoserver.security.GeoServerSecurityFilterChainProxy.doFilter(GeoServerSecurityFilterChainProxy.java:135)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
at org.geoserver.filters.LoggingFilter.doFilter(LoggingFilter.java:83)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
at org.geoserver.filters.GZIPFilter.doFilter(GZIPFilter.java:42)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
at org.geoserver.filters.SessionDebugFilter.doFilter(SessionDebugFilter.java:48)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
at org.geoserver.filters.FlushSafeFilter.doFilter(FlushSafeFilter.java:44)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
at org.vfny.geoserver.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:109)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:360)
at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181)
at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:726)
at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:405)
at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:206)
at org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
at org.mortbay.jetty.Server.handle(Server.java:324)
at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:505)
at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:828)
at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:514)
at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:211)
at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:380)
at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:395)
at org.mortbay.thread.BoundedThreadPool$PoolThread.run(BoundedThreadPool.java:450)
Caused by: org.geotools.data.DataSourceException: Unable to load properties from:C:\geoserver\data\pyramids\constantia_comp\PROCESSED\test\0\datastore.properties
at org.geotools.gce.imagemosaic.ImageMosaicReader.<init>(ImageMosaicReader.java:470)
at org.geotools.gce.imagepyramid.ImagePyramidReader.getImageMosaicReaderForLevel(ImagePyramidReader.java:655)
at org.geotools.gce.imagepyramid.ImagePyramidReader.loadRequestedTiles(ImagePyramidReader.java:484)
at org.geotools.gce.imagepyramid.ImagePyramidReader.loadTiles(ImagePyramidReader.java:431)
at org.geotools.gce.imagepyramid.ImagePyramidReader.read(ImagePyramidReader.java:372)
at org.geotools.coverage.grid.io.AbstractGridCoverage2DReader.read(AbstractGridCoverage2DReader.java:249)
at org.geoserver.catalog.CoverageDimensionCustomizerReader.read(CoverageDimensionCustomizerReader.java:227)
at org.geoserver.catalog.SingleGridCoverage2DReader.read(SingleGridCoverage2DReader.java:141)
at org.geoserver.catalog.CatalogBuilder.buildCoverage(CatalogBuilder.java:974)
at org.geoserver.catalog.CatalogBuilder.buildCoverage(CatalogBuilder.java:855)
at org.geoserver.web.data.layer.NewLayerPage.buildLayerInfo(NewLayerPage.java:346)
... 97 more
Caused by: org.geotools.data.DataSourceException: Unable to load properties from:C:\geoserver\data\pyramids\constantia_comp\PROCESSED\test\0\datastore.properties
at org.geotools.gce.imagemosaic.ImageMosaicReader.initReaderFromURL(ImageMosaicReader.java:643)
at org.geotools.gce.imagemosaic.ImageMosaicReader.<init>(ImageMosaicReader.java:468)
... 107 more
Caused by: java.io.IOException: Unable to load properties from:C:\geoserver\data\pyramids\constantia_comp\PROCESSED\test\0\datastore.properties
at org.geotools.gce.imagemosaic.CatalogManager.createGranuleCatalogProperties(CatalogManager.java:180)
at org.geotools.gce.imagemosaic.ImageMosaicReader.initReaderFromURL(ImageMosaicReader.java:576)
... 108 more

Related

Error when trying to process .json files to pandas dataframe

I'm trying to merge various .json files so I can later run sentiment analysis on them.
I have already tried other approaches, but they always end up with an error.
I checked whether the .json files are correctly formatted and can't find any issues there. I have also attached an example of a .json file.
The error message is attached below my code.
import glob
import json

# list all files containing News from Guardian API
files = list(glob.iglob('/Users/xxx/tempdata/articles_data/*.json'))

news_data = []
for file in files:
    news_file = open(file, "r", encoding='utf-8')
    # Read in news and store in list: news_data
    for line in news_file:
        news = json.loads(line)
        news_data.append(news)
    news_file.close()
Updated Error Output
AttributeError Traceback (most recent call last)
<ipython-input-86-3019ee85b15b> in <module>
12 # Read in news and store in list: news_data
13 for line in news_file:
---> 14 news = json.load(line)
15 news_data.append(news)
16
~/opt/anaconda3/lib/python3.8/json/__init__.py in load(fp, cls, object_hook, parse_float, parse_int, parse_constant, object_pairs_hook, **kw)
291 kwarg; otherwise ``JSONDecoder`` is used.
292 """
--> 293 return loads(fp.read(),
294 cls=cls, object_hook=object_hook,
295 parse_float=parse_float, parse_int=parse_int,
AttributeError: 'str' object has no attribute 'read'
2019-11-01.json
{
"id":"business/2019/nov/01/google-snaps-up-fitbit-for-21bn",
"type":"article",
"sectionId":"business",
"sectionName":"Business",
"webPublicationDate":"2019-11-01T14:26:19Z",
"webTitle":"Google snaps up Fitbit for $2.1bn",
"webUrl":"https://www.theguardian.com/business/2019/nov/01/google-snaps-up-fitbit-for-21bn",
"apiUrl":"https://content.guardianapis.com/business/2019/nov/01/google-snaps-up-fitbit-for-21bn",
"fields":{
"headline":"Google snaps up Fitbit for $2.1bn",
"standfirst":"<p>Takeover allows web giant to take on Apple in fast-growing smartwatch and wearables business</p>",
"trailText":"Takeover allows web giant to take on Apple in fast-growing smartwatch and wearables business",
"byline":"Kalyeena Makortoff",
"main":"<figure class=\"element element-image\" data-media-id=\"fc8abb0f70105fcab3aee86dea6c89e211337660\"> <img src=\"https://media.guim.co.uk/fc8abb0f70105fcab3aee86dea6c89e211337660/0_158_3571_2143/1000.jpg\" alt=\"The wireless activity tracker Zip by Fitbit Inc\" width=\"1000\" height=\"600\" class=\"gu-image\" /> <figcaption> <span class=\"element-image__caption\">The wireless activity tracker Zip by Fitbit Inc. Google has confirmed it will buy Fitbit for $2.1bn.</span> <span class=\"element-image__credit\">Photograph: Franck Robichon/EPA</span> </figcaption> </figure>",
"body":"<p>Google has snapped up the Fitbit... ",
"newspaperPageNumber":"38",
"wordcount":"679",
"firstPublicationDate":"2019-11-01T14:25:58Z",
"isInappropriateForSponsorship":"false",
"isPremoderated":"false",
"lastModified":"2019-11-01T18:56:38Z",
"newspaperEditionDate":"2019-11-02T00:00:00Z",
"productionOffice":"UK",
"publication":"The Guardian",
"shortUrl":"https://gu.com/p/cjeze",
"shouldHideAdverts":"false",
"showInRelatedContent":"true",
"thumbnail":"https://media.guim.co.uk/fc8abb0f70105fcab3aee86dea6c89e211337660/0_158_3571_2143/500.jpg",
"legallySensitive":"false",
"lang":"en",
"isLive":"true",
"bodyText":"Google has snapped up the Fitbit activity tracker business in a $2.1bn (\u00a31.6bn) deal that will enable the search giant to go toe-to-toe with Apple in the fast-growing smartwatch and wearables business..." ,
"charCount":"4149",
"shouldHideReaderRevenue":"false",
"showAffiliateLinks":"false",
"bylineHtml":"Kalyeena Makortoff"
},
"isHosted":false,
"pillarId":"pillar/news",
"pillarName":"News"
},
If you are reading JSON from a file, you should use json.load instead of json.loads; json.loads is for reading JSON from a string. See the json.load documentation.
For example:
import json

with open('ts.json', 'r') as f:
    content = json.load(f)
print(content)
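Since the end goal is a pandas DataFrame for sentiment analysis, a possible follow-up is sketched below. It assumes each file holds a single JSON object like the sample above (the directory path is the one from the question, and the column names are taken from the sample):
import glob
import json

import pandas as pd

news_data = []
for path in glob.iglob('/Users/xxx/tempdata/articles_data/*.json'):
    with open(path, 'r', encoding='utf-8') as f:
        news_data.append(json.load(f))  # one JSON object per file

# flatten the nested "fields" dict into columns such as fields.bodyText
df = pd.json_normalize(news_data)
print(df[['webTitle', 'webPublicationDate']].head())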

Exception: jetson.utils -- failed to create glDisplay device

I am working on a Jetson Nano and using the jetson library provided by Hello AI World.
I have tested the basic algorithms that come with the library and all of them work fine, but I made a few changes to run it on a video and got an error.
Here is my code:
import jetson.inference
import jetson.utils
import cv2
#import argparse
import sys
import numpy as np

width = 720
height = 480
vs = cv2.VideoCapture('b.m4v')  # video input file
net = jetson.inference.detectNet("ssd-mobilenet-v2", threshold=0.5)  # loading the model
#camera = jetson.utils.gstCamera(1280, 720, "/dev/video0")  # using V4L2
display = jetson.utils.glDisplay()  # initiating a display window

while display.IsOpen():
    _, frame = vs.read()  # reading a frame
    img = cv2.cvtColor(frame, cv2.COLOR_BGR2BGRA)  # converting it to BGRA format for jetson.utils
    img = jetson.utils.cudaFromNumpy(img)  # converting the image from a numpy array to CUDA format
    #img, width, height = camera.CaptureRGBA()
    detections = net.Detect(img, width, height)  # running detection on each frame and saving the results
    display.RenderOnce(img, width, height)  # display the output frame with detections
    display.SetTitle("Object Detection | Network {:.0f} FPS".format(net.GetNetworkFPS()))  # title for the output feed
This is the error:
[OpenGL] glDisplay -- X screen 0 resolution: 1280x1024
[OpenGL] failed to create X11 Window.
jetson.utils -- PyDisplay_Dealloc()
Traceback (most recent call last):
File "pool1.py", line 16, in <module>
display = jetson.utils.glDisplay() #initialting a display window
Exception: jetson.utils -- failed to create glDisplay device
PyTensorNet_Dealloc()
It's not technically an answer, but I think the problem is here.
Here is the log of my faulty code:
[TRT] binding to input 0 Input binding index: 0
[TRT] binding to input 0 Input dims (b=1 c=3 h=300 w=300) size=1080000
[TRT] binding to output 0 NMS binding index: 1
[TRT] binding to output 0 NMS dims (b=1 c=1 h=100 w=7) size=2800
[TRT] binding to output 1 NMS_1 binding index: 2
[TRT] binding to output 1 NMS_1 dims (b=1 c=1 h=1 w=1) size=4
device GPU, /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff initialized.
W = 7 H = 100 C = 1
detectNet -- maximum bounding boxes: 100
detectNet -- loaded 91 class info entries
detectNet -- number of object classes: 91
jetson.utils -- PyDisplay_New()
jetson.utils -- PyDisplay_Init()
[OpenGL] glDisplay -- X screen 0 resolution: 1280x1024
[OpenGL] failed to create X11 Window.
jetson.utils -- PyDisplay_Dealloc()
Traceback (most recent call last):
File "pool1.py", line 18, in <module>
display = jetson.utils.glDisplay() #initialting a display window
Exception: jetson.utils -- failed to create glDisplay device
PyTensorNet_Dealloc()
and here is the log of the built-in algo working:
[TRT] binding to output 0 NMS dims (b=1 c=1 h=100 w=7) size=2800
[TRT] binding to output 1 NMS_1 binding index: 2
[TRT] binding to output 1 NMS_1 dims (b=1 c=1 h=1 w=1) size=4
device GPU, /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff initialized.
W = 7 H = 100 C = 1
detectNet -- maximum bounding boxes: 100
detectNet -- loaded 91 class info entries
detectNet -- number of object classes: 91
jetson.utils -- PyCamera_New()
jetson.utils -- PyCamera_Init()
[gstreamer] initialized gstreamer, version 1.14.5.0
[gstreamer] gstCamera attempting to initialize with GST_SOURCE_NVARGUS, camera /dev/video0
[gstreamer] gstCamera pipeline string:
v4l2src device=/dev/video0 ! video/x-raw, width=(int)1280, height=(int)720, format=YUY2 ! videoconvert ! video/x-raw, format=RGB ! videoconvert !appsink name=mysink
[gstreamer] gstCamera successfully initialized with GST_SOURCE_V4L2, camera /dev/video0
jetson.utils -- PyDisplay_New()
jetson.utils -- PyDisplay_Init()
[OpenGL] glDisplay -- X screen 0 resolution: 1280x1024
[OpenGL] glDisplay -- display device initialized
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> videoconvert1
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> videoconvert0
You can clearly see the problem.
My main query is: is GStreamer even supported by OpenCV 4.1.1 on the Nano?
Try commenting out import cv2 (#import cv2), or put it at the top, before import jetson.utils.
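As for whether GStreamer is compiled into OpenCV 4.1.1 on the Nano, that can be checked directly from Python. This is just a small probe that prints the relevant lines of cv2.getBuildInformation(); look for "GStreamer: YES":
import cv2

# Print only the lines of the OpenCV build report that mention GStreamer
info = cv2.getBuildInformation()
for line in info.splitlines():
    if "GStreamer" in line:
        print(line.strip())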

How to specify gunicorn log max size

I'm running gunicorn as:
gunicorn --bind=0.0.0.0:5000 --log-file gunicorn.log myapp:app
It seems like gunicorn.log keeps growing. Is there a way to specify a max size for the log file, so that if it reaches the max size, it'll just overwrite it?
Thanks!!
TL;DR
I believe there might be a "Python only" solution using the rotating file handler provided in Python's standard logging library (tested on at least 3.10).
To test
I created a pet project for you to fiddle with:
Create the following python file
test_logs.py
import logging
import logging.config
import time

logging.config.fileConfig(fname='log.conf', disable_existing_loggers=False)

while True:
    time.sleep(0.5)
    logging.debug('This is a debug message')
    logging.info('This is an info message')
    logging.warning('This is a warning message')
    logging.error('This is an error message')
    logging.critical('This is a critical message')
Create the following config file
log.conf
[loggers]
keys=root
[handlers]
keys=rotatingHandler
[formatters]
keys=sampleFormatter
[logger_root]
level=DEBUG
handlers=rotatingHandler
[handler_rotatingHandler]
class=logging.handlers.RotatingFileHandler
level=DEBUG
formatter=sampleFormatter
args=('./logs/logs.log', 'a', 1200, 10, 'utf-8')
[formatter_sampleFormatter]
format=%(asctime)s - %(name)s - %(levelname)s - %(message)s
Create the ./logs directory
Run python test_logs.py
To Understand
As you may have noticed already, the setting that allows this behaviour is logging.handlers.RotatingFileHandler with the provided arguments args=('./logs/logs.log', 'a', 1200, 10, 'utf-8').
RotatingFileHandler is a stream handler writing to a file. It takes two parameters of interest:
maxBytes, set arbitrarily to 1200
backupCount, set arbitrarily to 10
The behaviour is that upon reaching 1200 bytes in size, the file is closed, renamed to ./logs/logs.log.<a number up to 10>, and a new file is opened.
BUT if either maxBytes or backupCount is 0, no rotation is done!
In Gunicorn
As per the documentation, you can feed Gunicorn a config file.
This could look like:
gunicorn --bind=0.0.0.0:5000 --log-config log.conf myapp:app
You will need to tweak it to your existing setup.
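Alternatively, newer Gunicorn versions accept a logconfig_dict setting in a Python config file; the sketch below is an untested adaptation of the same rotating setup (file names and sizes are placeholders):
# gunicorn.conf.py -- hypothetical adaptation of the log.conf above
bind = "0.0.0.0:5000"

logconfig_dict = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "simple": {"format": "%(asctime)s - %(name)s - %(levelname)s - %(message)s"},
    },
    "handlers": {
        "rotating": {
            "class": "logging.handlers.RotatingFileHandler",
            "formatter": "simple",
            "filename": "./logs/gunicorn.log",
            "maxBytes": 10 * 1024 * 1024,
            "backupCount": 5,
        },
    },
    "root": {"level": "INFO", "handlers": ["rotating"]},
}

# run with: gunicorn -c gunicorn.conf.py myapp:app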
On Ubuntu/Linux, I suggest using logrotate to manage your logs; do it like this: https://stackoverflow.com/a/55643449/6705684
Since Python 3.3, with RotatingFileHandler, here is my solution (macOS/Windows/Linux/...):
import os
import logging
from logging.handlers import RotatingFileHandler

fmt_str = '[%(asctime)s]%(module)s - %(funcName)s - %(message)s'
fmt = logging.Formatter(fmt_str)

def rotating_logger(name, fmt=fmt,
                    level=logging.INFO,
                    logfile='.log',
                    maxBytes=10 * 1024 * 1024,
                    backupCount=5,
                    **kwargs):
    logger = logging.getLogger(name)
    hdl = RotatingFileHandler(logfile, maxBytes=maxBytes, backupCount=backupCount)
    hdl.setLevel(level)
    hdl.setFormatter(fmt)
    logger.addHandler(hdl)
    return logger
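Usage would then look roughly like this (the logger name and path are placeholders; note that the logger itself also needs a level, since the handler level alone does not lower the default WARNING threshold):
# hypothetical usage of the helper above
logger = rotating_logger('myapp', logfile='./logs/myapp.log')
logger.setLevel(logging.INFO)
logger.info('application started')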
For more, refer to:
https://docs.python.org/3/library/logging.handlers.html#rotatingfilehandler

GetMap request returns white image

I just got a problem trying to display my map with a GetMap request using MapServer, but it returns a white image. I did search but didn't find an answer.
My map file :
MAP
IMAGETYPE PNG
EXTENT -21 20 1 36
SIZE 700 400
IMAGECOLOR 255 255 255
PROJECTION
"init=epsg:4326"
END
OUTPUTFORMAT
NAME png
MIMETYPE image/png
DRIVER GD/PNG
EXTENSION png
IMAGEMODE PC256
TRANSPARENT FALSE
END
WEB
METADATA
"wms_title" "Dans Layers and Stuff"
"wms_onlineresource" "http://localhost:81/cgi-bin/mapserv.exe?"
"wms_enable_request" "*"
"wms_srs" "EPSG:4326"
"wms_feature_info_mime_type" "text/html"
"wms_format" "image/png"
END
END
LAYER
NAME map1
TYPE polygon
STATUS default
CONNECTIONTYPE postgis
CONNECTION "dbname=postgres host=localhost port=5432 user=postgres"
DATA "geom from comgeo"
PROJECTION
"init=epsg:4326"
END
METADATA
"wms_title" "map1"
END
PROCESSING "SCALE=AUTO"
CLASS
STYLE
COLOR 232 232 232
OUTLINECOLOR 32 32 32
END
END
END
END
And this is the link I used for my request:
http://localhost:81/cgi-bin/mapserv.exe?map=/wamp64/www/wordpress/map1.map&version=1.3.0&request=GetMap&CRS=EPSG:4326&bbox=-21,20,1,36&width=760&height=360&layers=map1&styles=&FORMAT=image/png&TRANSPARENT=TRUE
The BBox values are correct. Thank you
You appear to be missing the SERVICE=WMS parameter in your URL.
I solved the problem when I replaced EPSG:4326 with CRS:84; here is the URL:
http://localhost:81/cgi-bin/mapserv.exe?map=/wamp64/www/wordpress/map1.map&request=GetMap&SERVICE=WMS&version=1.3.0&CRS=CRS:84&bbox=-21,20,1,36&width=700&height=400&layers=map1&styles=&FORMAT=image/png&TRANSPARENT=TRUE
WMS 1.1.1 and WMS 1.3.0 have different request parameters for the coordinate system: SRS=EPSG:4326 for 1.1.1 and CRS=CRS:84 for 1.3.0.
See the MapServer WMS documentation.
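For quick testing outside the browser, the corrected GetMap request can also be fetched from a short script. This is just a sketch using the requests library against the same local mapserv endpoint and parameters as above:
import requests

params = {
    "map": "/wamp64/www/wordpress/map1.map",
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "CRS": "CRS:84",
    "BBOX": "-21,20,1,36",
    "WIDTH": 700,
    "HEIGHT": 400,
    "LAYERS": "map1",
    "STYLES": "",
    "FORMAT": "image/png",
    "TRANSPARENT": "TRUE",
}

r = requests.get("http://localhost:81/cgi-bin/mapserv.exe", params=params)
r.raise_for_status()
# A ServiceException comes back as XML, a successful render as image/png
print(r.headers.get("Content-Type"))
with open("getmap.png", "wb") as f:
    f.write(r.content)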

Error: JsonStorage in Pig local mode

I am running my Pig script in local mode in Eclipse.
When I try to store the output using JsonStorage, I get the following exception:
Exception in thread "main" java.lang.RuntimeException: Cannot instantiate:org.apache.pig.builtin.JsonStorage
at org.apache.pig.impl.PigContext.instantiateFuncFromSpec(PigContext.java:473)
at org.apache.pig.impl.logicalLayer.parser.QueryParser.NonEvalFuncSpec(QueryParser.java:4976)
at org.apache.pig.impl.logicalLayer.parser.QueryParser.StoreClause(QueryParser.java:3473)
at org.apache.pig.impl.logicalLayer.parser.QueryParser.BaseExpr(QueryParser.java:1351)
at org.apache.pig.impl.logicalLayer.parser.QueryParser.Expr(QueryParser.java:893)
at org.apache.pig.impl.logicalLayer.parser.QueryParser.Parse(QueryParser.java:706)
at org.apache.pig.impl.logicalLayer.LogicalPlanBuilder.parse(LogicalPlanBuilder.java:63)
at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1017)
at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:967)
at org.apache.pig.PigServer.registerQuery(PigServer.java:383)
at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:716)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:324)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:168)
at org.apache.pig.PigServer.registerScript(PigServer.java:407)
at com.paypal.debugpig.DebugPig.main(DebugPig.java:13)
Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 1070: Could not resolve org.apache.pig.builtin.JsonStorage using imports: [, org.apache.pig.builtin., org.apache.pig.impl.builtin.]
at org.apache.pig.impl.PigContext.resolveClassName(PigContext.java:458)
at org.apache.pig.impl.PigContext.instantiateFuncFromSpec(PigContext.java:470)
... 14 more
PigScript :
REGISTER C:/path/to/jar/pig.jar;
REGISTER C:/path/to/jar/UpperUDf/UpperUDf_fat.jar;
A = LOAD 'C:/path/to/data/file/student.txt' using PigStorage('\t') AS (name: chararray, age: int, gpa: float);
B = FOREACH A GENERATE myudfs.UPPER(name) ,age, gpa ;
Store B into 'output_student_Json' using org.apache.pig.builtin.JsonStorage();
When I dump or store the output in a text file it works, but the issue occurs when I try to store it in JSON format.
Any pointers appreciated.
Thank you
I have verified it, and it works for me if I use the line of code below for storing output in JSON file format:
store B into 'json_output' using JsonStorage();