Using HTTP API to build Grafana dashboard - json

I have been using InfluxDB as a datasource to build Grafana dashboards, and now I am trying to use an HTTP API link that returns the JSON below. The link is, for example, https://example:8080/cluster. Alternatively, if possible, I would like to use Telegraf to extract the data and send the metrics to InfluxDB.
Based on the JSON shown, I want the dashboard to display, for example, TEJD with status UP and TECJ with status DOWN.
{ "status": "UP", "TEJD": {"status":"UP", "count":"0", "minDateTime":"", "description":"Workbasket Jdmin"}, "TECJ":
{"status":"DOWN", "count":"0", "minDateTime":"",
"description":"Workbasket CreateJppWait"}, "TEDE": {"status":"UP",
"count":"0", "minDateTime":"", "description":"Workbasket default"},
"TEEW": {"status":"UP", "count":"0", "minDateTime":"",
"description":"Workbasket eFormWriteFailure"}, "TEFB":
{"status":"UP", "count":"0", "minDateTime":"",
"description":"Workbasket BackgroundProcessing"}, "TEIC":
{"status":"UP", "count":"0", "minDateTime":"",
"description":"Workbasket IncompleteConnections"}, "TELB":
{"status":"UP", "count":"18", "minDateTime":"20/12/2022 10:37",
"description":"Workbasket LRBackgroundProcess"}, "JETE":
{"status":"UP", "count":"0", "minDateTime":"",
"description":"Jssignment errors for Jssign-WorkBasket"}, "JEWL":
{"status":"DOWN", "count":"0", "minDateTime":"",
"description":"Jssignment errors for Jssign-Worklist"}, "FETE":
{"status":"UP", "count":"0", "minDateTime":"", "description":"Flow
errors for Jssign-WorkBasket"}, "FEWL": {"status":"UP", "count":"0",
"minDateTime":"", "description":"Flow errors for Jssign-Worklist"},
"BQBP": {"status":"UP", "count":"0", "minDateTime":"",
"description":"Broken queue System-Queue-BackgroundProcess"}, "BQDE":
{"status":"UP", "count":"0", "minDateTime":"", "description":"Broken
queue System-Queue-DefaultEntry"}, "CQCJ": {"status":"UP",
"count":"0", "minDateTime":"", "description":"Custom query Create Jpp
Requests"}, "JS02": {"status":"UP", "description":"Job Scheduler
CaseDocumentDeletion (Jny one associated node:BackgroundProcessing)"},
"JS21": {"status":"UP", "description":"Job Scheduler
PurgeOldSIDExchangeRecords (Jll associated
nodes:BackgroundProcessing)"}, "JS01": {"status":"UP",
"description":"Job Scheduler UpdateReferenceData (Jny one associated
node:BackgroundProcessing)"}, "package": { "name": "Pega Cluster",
"elapsedTime": "819.0", "lastUpdated": "20/12/2022 10:52" } }
UPDATE: I found out that I can use the Telegraf HTTP plugin; however, I am still not able to parse the data correctly. I have used this plugin configuration in Telegraf:
[[inputs.http]]
  urls = ["http://example:8080//Cluster"]
  tagexclude = ["url", "host"]
  # Overwrite measurement name from default `http` to `cluster`
  ### name_override = "httppegacluster"
  data_format = "json_v2"

  [[inputs.http.json_v2]]
    [[inputs.http.json_v2.object]]
      path = "test1"
    [[inputs.http.json_v2.object]]
      path = "test2"
    [[inputs.http.json_v2.object]]
      path = "test3"
    [[inputs.http.json_v2.object]]
      path = "test4"
    [[inputs.http.json_v2.object]]
      path = "test5"
It produces the output below, but I want the output to include test1, test2, test3, and so on, because I need to display those keys in the dashboard.
telegraf -config http.conf -test
> http count="0",description="Workbasket Admin",minDateTime="",status="UP" 1671286903000000000
> http count="0",description="Workbasket eFormWriteFailure",minDateTime="",status="UP" 1671286903000000000
> http count="0",description="Workbasket CreateAppWait",minDateTime="",status="UP" 1671286903000000000
> http count="0",description="Workbasket default#RoS",minDateTime="",status="UP" 1671286903000000000
> http count="0",description="Workbasket BackgroundProcessing",minDateTime="",status="UP" 1671286903000000000
I am following the example from this page: https://www.influxdata.com/blog/how-parse-json-telegraf-influxdb-cloud/ but it seems the JSON format there is different from mine.
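One direction that might work with this payload (a sketch only, not verified against the live endpoint; the option names come from the Telegraf json_v2 parser): instead of one json_v2.object per key, parse the whole root object at once and keep key prepending enabled, so the workbasket code (TEJD, TECJ, ...) becomes part of each field name.
[[inputs.http]]
  urls = ["http://example:8080/Cluster"]
  tagexclude = ["url", "host"]
  name_override = "httppegacluster"
  data_format = "json_v2"

  [[inputs.http.json_v2]]
    [[inputs.http.json_v2.object]]
      # GJSON path "@this" selects the entire root object.
      path = "@this"
      # Leave key prepending on (the default) so fields come out as
      # TEJD_status, TEJD_count, TECJ_status, ... rather than a bare "status".
      disable_prepend_keys = false
      # The "package" block has a different shape, so it may help to skip it.
      excluded_keys = ["package"]
If the fields then arrive as TEJD_status, TECJ_status, and so on, each workbasket can be mapped to its own stat panel or value mapping in Grafana.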


Design Automation fails with: failedInstructions

I am trying to run a Revit plugin on Design Automation. The workitem fails with a failedInstructions error. In the logs I can find "Error details: The system cannot find the file specified." Does that refer to the Revit file, or to the AppBundle? The input file is from BIM 360; the workitem contains a link and the header to download the Revit file, and it seems like there are no issues there. I have also debugged the plugin locally using DesignAutomationHandler and that seems to work fine. Not sure what's missing.
[05/20/2022 20:39:12] Job information:
"CommandLine":[
"\"$(engine.path)/revitcoreconsole.exe /i $(args[rvtFile].path) /al $(appbundles[Revit2ProtoExporter].path)\""
]
"Settings":{
"dasreportfaileduploadoptional": {
"value": "true",
"isEnvironmentVariable": true
}
}
"Id":"bc24e48e8f7548fe9262db12a7556055"
"ActivityId":"WDQIfAV8PqNa9XSKPmv6MDu3xAtLGfXI.Revit2ProtoExporterActivity+OV"
"Engine.Id":"Autodesk.Revit_2022!84"
"Apps": [
"App.Id":"WDQIfAV8PqNa9XSKPmv6MDu3xAtLGfXI.Revit2ProtoExporter!72"
]
"BoundArguments":{
"rvtFile": {
"localName": "rac_basic_sample_project.rvt",
"url": "https://developer.api.autodesk.com/Masked:kRW0jj8ZUtU7QNO0DXCukH2NDJ4=",
"headers": {
"authorization": "Masked:IlrS8QJ3oW3Igi6Oe34QAwULA1Q="
}
},
"params": {
"localName": "params.json",
"url": "data:application/json,{'ViewName': {3D}}"
},
"result": {
"localName": "result.avr",
"url": "https://staging-appliedvrabs-pa.sandbox.googleapis.com/Masked:kQfE75mPkbGJwltY96k4UwHYNNY=",
"headers": {
"authorization": "Masked:GbZcVdSF7CkXGH0+4kYm/FaeCao="
},
"verb": "put"
},
"onProgress": {
"ondemand": true,
"url": "https://wlnr5sjl3a.execute-api.us-east-1.amazonaws.com/Masked:UK/Z3b5X3xUWxXiH6C9r9i9UlRU=",
"headers": {
"Content-Type": "application/json",
"x-das-authorize": "awssigv4(us-east-1)",
"x-ads-token-data": "{\"access_token\":{\"client_id\":\"WDQIfAV8PqNa9XSKPmv6MDu3xAtLGfXI\"},\"scope\":\"code:all data:write data:read bucket:create bucket:delete\",\"expires_in\":3599,\"client_id\":\"WDQIfAV8PqNa9XSKPmv6MDu3xAtLGfXI\"}",
"x-ads-gateway-secret": "Masked:F6VCvje5cIP0zOGCxgARjmSopQI="
},
"verb": "put"
}
}
"Quotas":{
"limitProcessingTimeSec": 10800,
"limitTotalUncompressedAppsSizeInMB": 5000
}
[05/20/2022 20:39:13] Starting work item bc24e48e8f7548fe9262db12a7556055
[05/20/2022 20:39:13] Start download phase.
[05/20/2022 20:39:13] Start downloading input: verb - 'GET', url - 'https://developer.api.autodesk.com/oss/v2/buckets/wip.dm.prod/objects/47f2a6e6-4349-4a9e-b066-14019b2d95ff.rvt'
[05/20/2022 20:39:13] Embedded resource [{'ViewName': {3D}}] is saved as file: T:\Aces\Jobs\bc24e48e8f7548fe9262db12a7556055\params.json.
[05/20/2022 20:39:14] End downloading file. Source=https://developer.api.autodesk.com/oss/v2/buckets/wip.dm.prod/objects/47f2a6e6-4349-4a9e-b066-14019b2d95ff.rvt,LocalFile=T:\Aces\Jobs\bc24e48e8f7548fe9262db12a7556055\rac_basic_sample_project.rvt,BytesDownloaded=18739200,Duration=1107ms
[05/20/2022 20:39:14] End download phase successfully.
[05/20/2022 20:39:16] Start preparing script and command line parameters.
[05/20/2022 20:39:16] Command line: []
[05/20/2022 20:39:16] Identified standalone application at T:\Aces\AcesRoot\22.0\coreEngine\Exe/revitcoreconsole.exe /i T:\Aces\Jobs\bc24e48e8f7548fe9262db12a7556055\rac_basic_sample_project.rvt /al T:\Aces\Applications\53f4ce6d27cb3f9b8d6727a1bc6e1b36.WDQIfAV8PqNa9XSKPmv6MDu3xAtLGfXI.Revit2ProtoExporter[72].package.
[05/20/2022 20:39:16] End preparing script and command line parameters.
[05/20/2022 20:39:16] Start script phase.
[05/20/2022 20:39:16] Start application 53f4ce6d27cb3f9b8d6727a1bc6e1b36.WDQIfAV8PqNa9XSKPmv6MDu3xAtLGfXI.Revit2ProtoExporter[72].package standard output dump.
[05/20/2022 20:39:16] command line:'"T:\Aces\AcesRoot\22.0\coreEngine\Exe/revitcoreconsole.exe /i T:\Aces\Jobs\bc24e48e8f7548fe9262db12a7556055\rac_basic_sample_project.rvt /al T:\Aces\Applications\53f4ce6d27cb3f9b8d6727a1bc6e1b36.WDQIfAV8PqNa9XSKPmv6MDu3xAtLGfXI.Revit2ProtoExporter[72].package" /isolate HKEY_CURRENT_USER\SOFTWARE\AppDataLow\Software\Autodesk\CoreUser\WorkItem_bc24e48e8f7548fe9262db12a7556055 T:\Aces\Jobs\bc24e48e8f7548fe9262db12a7556055\userdata'
[05/20/2022 20:39:16] CreateProcess fails [ErrorCode:2]
[05/20/2022 20:39:16] Error details: The system cannot find the file specified.
[05/20/2022 20:39:17] End application 53f4ce6d27cb3f9b8d6727a1bc6e1b36.WDQIfAV8PqNa9XSKPmv6MDu3xAtLGfXI.Revit2ProtoExporter[72].package standard output dump.
[05/20/2022 20:39:17] Error: Application 53f4ce6d27cb3f9b8d6727a1bc6e1b36.WDQIfAV8PqNa9XSKPmv6MDu3xAtLGfXI.Revit2ProtoExporter[72].package exits with code 2 which indicates an error.
[05/20/2022 20:39:17] End script phase.
[05/20/2022 20:39:17] Error: An unexpected error happened during phase CoreEngineExecution of job.
[05/20/2022 20:39:17] Job finished with result FailedExecution
[05/20/2022 20:39:17] Job Status:
{
"status": "failedInstructions",
"reportUrl": "https://dasprod-store.s3.amazonaws.com/workItem/WDQIfAV8PqNa9XSKPmv6MDu3xAtLGfXI/bc24e48e8f7548fe9262db12a7556055/report.txt?X-Amz-Expires=48600&X-Amz-Security-Token=IQoJb3JpZ2luX2VjEDUaCXVzLWVhc3QtMSJHMEUCIQD6%2F06ilWih8Y%2FlExEFSRdV29%2BkhzC9dnUiSRgrowI%2F3AIgWW0dFM7mesUZ9r%2Fay0jRdYM5WS5WLaKE70dUdIJ6pdIqmwIIHRADGgwyMjA0NzMxNTIzMTAiDBvJL%2Faj9DnxZRlQoir4AawLQ9ux77C8%2FCXfmKxVn52VplNguYCnlxSoCU6fZcEcXIJzLByNiYn%2Fh4cNZ7Pi%2B5yD%2FBQz9E%2FK7YxddYqRWn6%2FTJlT79dVYoCvigOP4sYClYT7Khqxpnlzdq%2FFKSPrhwVCeBJNBLLdgFXJywVasJqb95nXz%2FEb4zl1459EB7f1L%2BmMpiCi%2BpQgtXmybqC6YIoZAzRlQFGqOsdi%2FrpucSRTFz85uzXKe95ycI3u6AkjB9qIeFus%2FoGuSHCWmIYCLuFxv6N4paqXsf%2FGL4k0eaxlT8vvqc4BOfVSoKkSDP3asDWUTPBctRHllVn4JVYESTux805TB4i8MNDqn5QGOpoBFkJEETYYcw40DufOBaHwnASy1zn6lcSt3K3yDjzYZCJFrrO6sT0YT3Q0poH1W4Pn74YAHb0MFay8WspSghCNcUc5JrL1j1aqDchne8nCLu2tRJQXwIQD6ddJ7AXMnNVfrd12y%2FZlj1gjJxjKXsEK6syt2xg5LDmmMe%2F1iu40qxVq%2Bv3%2BjmsR05Fz52T3oIyuWYNjqojdudPwFA%3D%3D&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=ASIATGVJZKM3ITCDHE4L/20220520/us-east-1/s3/aws4_request&X-Amz-Date=20220520T203912Z&X-Amz-SignedHeaders=host&X-Amz-Signature=9b56fe0c0ac8fd968acdbb72ba073ab1041f3946e929d23d88ad885149bdbf66",
"stats": {
"timeQueued": "2022-05-20T20:39:12.695924Z",
"timeDownloadStarted": "2022-05-20T20:39:12.957438Z",
"timeInstructionsStarted": "2022-05-20T20:39:16.3503525Z",
"timeInstructionsEnded": "2022-05-20T20:39:17.5424815Z",
"bytesDownloaded": 18739218
},
"id": "bc24e48e8f7548fe9262db12a7556055"
}
Here is the activity definition:
url = '{0}/activities/{1}/versions'.format(self._das_api_root,
                                           self.get_activity_name())
headers = self._get_request_headers()
data = {}
data['commandLine'] = [
    '"$(engine.path)/revitcoreconsole.exe '
    '/i $(args[rvtFile].path) '
    '/al $(appbundles[' + self.get_app_bundle_name() + '].path)"'
]
data['parameters'] = {}
data['parameters']['rvtFile'] = {}
data['parameters']['rvtFile']['zip'] = False
data['parameters']['rvtFile']['ondemand'] = False
data['parameters']['rvtFile']['verb'] = 'get'
data['parameters']['rvtFile']['description'] = 'Input'
data['parameters']['rvtFile']['required'] = True
data['parameters']['params'] = {}
data['parameters']['params']['localName'] = 'params.json'
data['parameters']['params']['verb'] = 'get'
data['parameters']['params']['required'] = True
data['parameters']['result'] = {}
data['parameters']['result']['zip'] = False
data['parameters']['result']['ondemand'] = False
data['parameters']['result']['verb'] = 'put'
data['parameters']['result']['description'] = 'Result'
data['parameters']['result']['required'] = True
data['parameters']['result']['localName'] = 'result.avr'
data['engine'] = 'Autodesk.Revit+2022'
data['appbundles'] = [self.get_app_bundle()]
data['description'] = 'Export Geometry'
# Attempt to update the Activity.
response = requests.post(url, json=data, headers=headers)
And here is the workitem definition:
url = '{0}/workitems'.format(self._das_api_root)
headers = self._get_request_headers()
data = {}
data['activityId'] = self._activity_id
data['arguments'] = {}
data['arguments']['rvtFile'] = {}
data['arguments']['rvtFile']['url'] = download_url
data['arguments']['rvtFile']['localName'] = revit_file_name
data['arguments']['rvtFile']['headers'] = download_header
data['arguments']['params'] = {}
data['arguments']['params']['url'] = "data:application/json,{{'ViewName': {0}}}".format(view_name)
data['arguments']['result'] = {}
data['arguments']['result']['verb'] = 'put'
data['arguments']['result']['url'] = upload_url
data['arguments']['result']['headers'] = upload_header
response = self._http_client.request(url, 'POST', json.dumps(data), headers = headers)
Taking a closer look at the report above, the problem is in the command line. Because the whole command was wrapped in a single pair of quotes, the entire string
"T:\Aces\AcesRoot\22.0\coreEngine\Exe/revitcoreconsole.exe /i T:\Aces\Jobs\bc24e48e8f7548fe9262db12a7556055\rac_basic_sample_project.rvt /al T:\Aces\Applications\53f4ce6d27cb3f9b8d6727a1bc6e1b36.WDQIfAV8PqNa9XSKPmv6MDu3xAtLGfXI.Revit2ProtoExporter[72].package"
is treated as one executable name. That cannot be identified as an executable, hence the complaint "Error details: The system cannot find the file specified."
The correct way to define the command line is, for example:
"commandLine": [ "$(engine.path)\\\\revitcoreconsole.exe /i \"$(args[rvtFile].path)\" /al \"$(appbundles[RVTIOExportTestPackage2019].path)\"" ]
Note the quotes around $(args[rvtFile].path) and $(appbundles[RVTIOExportTestPackage2019].path).
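Applied to the activity definition in the question, the fix would look roughly like this (a sketch: only the quoting of the command line changes, everything else stays as in the original Python code):
data['commandLine'] = [
    # Do not wrap the entire command line in one pair of quotes; quote the
    # individual path arguments instead. The four backslashes in this Python
    # literal are two real backslashes, which serialize to the "\\\\" shown
    # in the JSON example above.
    '$(engine.path)\\\\revitcoreconsole.exe '
    '/i "$(args[rvtFile].path)" '
    '/al "$(appbundles[' + self.get_app_bundle_name() + '].path)"'
]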

Disabling the Consul HTTP endpoints

We have enabled ACLs and TLS for the Consul cluster in our environment, and we have disabled the UI as well. But I can still access URLs such as http://<consul_agent>:8500/v1/coordinate/datacenters. How can I disable URLs like this?
I tested by adding the following to consulConfig.json:
"ports":{
"http": -1
}
This did not solve the problem.
Apart from the suggestion to use "http_config": { "block_endpoints": ... }, I am trying to use an ACL policy to see if that can solve it.
I enabled ACLs first.
I created a policy using the command: consul acl policy create -name "urlblock" -description "Url Block Policy" -rules @service_block.hcl -token <tokenvalue>
Contents of service_block.hcl: service_prefix "/v1/status/leader" { policy = "deny" }
I created an agent token for it using the command: consul acl token create -description "Block Policy Token" -policy-name "urlblock" -token <tokenvalue>
I copied the agent token from the output of the above command and pasted it into the consul_config.json file in the acl -> tokens section as "tokens": { "agent": "<agenttokenvalue>" }
I restarted the consul agents (and did the same on the consul clients).
Still I am able to access the endpoint /v1/status/leader. Any ideas as to what is wrong with this approach?
That configuration should properly disable the HTTP server; I was able to validate it works using the following config with Consul 1.9.5. (As an aside, ACL service_prefix rules apply to services registered in the catalog, not to HTTP API URL paths, which is why the urlblock policy does not block /v1/status/leader.)
Disabling Consul's HTTP server
Create config.json in the agent's configuration directory which completely disables the HTTP API port.
config.json
{
"ports": {
"http": -1
}
}
Start the Consul agent
$ consul agent -dev -config-file=config.json
==> Starting Consul agent...
Version: '1.9.5'
Node ID: 'ed7f0050-8191-999c-a53f-9ac48fd03f7e'
Node name: 'b1000.local'
Datacenter: 'dc1' (Segment: '<all>')
Server: true (Bootstrap: false)
Client Addr: [127.0.0.1] (HTTP: -1, HTTPS: -1, gRPC: 8502, DNS: 8600)
Cluster Addr: 127.0.0.1 (LAN: 8301, WAN: 8302)
Encrypt: Gossip: false, TLS-Outgoing: false, TLS-Incoming: false, Auto-Encrypt-TLS: false
==> Log data will now stream in as it occurs:
...
Note the HTTP port is set to "-1" on the Client Addr line. The port is now inaccessible.
Test connectivity to HTTP API
$ curl localhost:8500
curl: (7) Failed to connect to localhost port 8500: Connection refused
Blocking access to specific API endpoints
Alternatively you can block access to specific API endpoints, without completely disabling the HTTP API, by using the http_config.block_endpoints configuration option.
For example:
Create a config named block-endpoints.json
{
"http_config": {
"block_endpoints": [
"/v1/catalog/datacenters",
"/v1/coordinate/datacenters",
"/v1/status/leader",
"/v1/status/peers"
]
}
}
Start Consul with this config
consul agent -dev -config-file=block-endpoints.json
==> Starting Consul agent...
Version: '1.9.5'
Node ID: '8ff15668-8624-47b5-6e83-7a8bfd715a56'
Node name: 'b1000.local'
Datacenter: 'dc1' (Segment: '<all>')
Server: true (Bootstrap: false)
Client Addr: [127.0.0.1] (HTTP: 8500, HTTPS: -1, gRPC: 8502, DNS: 8600)
Cluster Addr: 127.0.0.1 (LAN: 8301, WAN: 8302)
Encrypt: Gossip: false, TLS-Outgoing: false, TLS-Incoming: false, Auto-Encrypt-TLS: false
==> Log data will now stream in as it occurs:
...
In this example, the HTTP API is enabled and listening on port 8500.
Test connectivity to HTTP API
If you issue a request to one of the blocked endpoints, the following error will be returned.
$ curl localhost:8500/v1/status/peers
Endpoint is blocked by agent configuration
However, access to other endpoints is still permitted.
$ curl localhost:8500/v1/agent/members
[
{
"Name": "b1000.local",
"Addr": "127.0.0.1",
"Port": 8301,
"Tags": {
"acls": "0",
"build": "1.9.5:3c1c2267",
"dc": "dc1",
"ft_fs": "1",
"ft_si": "1",
"id": "6d157a1b-c893-3903-9037-2e2bd0e6f973",
"port": "8300",
"raft_vsn": "3",
"role": "consul",
"segment": "",
"vsn": "2",
"vsn_max": "3",
"vsn_min": "2",
"wan_join_port": "8302"
},
"Status": 1,
"ProtocolMin": 1,
"ProtocolMax": 5,
"ProtocolCur": 2,
"DelegateMin": 2,
"DelegateMax": 5,
"DelegateCur": 4
}
]

JUnit XML to JSON format in Groovy with XmlSlurper

I am trying to write a bridge function to convert XML data to JSON format. Below is the data I have.
The sample XML file is:
<testsuites> <testsuite tests="4" failures="4" errors="0" name="AT">
<testcase name="#1 notificate › v1 › announcement › announcement.feature/#TEST CASE: Notification: Send an announcement: Send an announcement using the minimum requirements"/>
<testcase name="#2 notifiivate › v1 › announcement › announcement.feature/#TEST CASE: Notification: Send an ant"/>
<testcase name="#1 No tests found in features/tests/auth/auth.POST.js">
<failure/>
</testcase>
<testcase name="#2 versioninfo › versioninfo › versioninfo.feature/#TEST CASE: CDP ADMIN: Get version info: Get the version of the CDP service">
<failure>
name: AssertionError
message: Rejected promise returned by test
values:
</failure>
</testcase>
<testcase name="#3 projects › edit_entitlement › edit_entitlement.feature/#TEST CASE: CDP ADMIN: Edit Entitlement: Attempt to edit an entitlement_id to be a negative number">
<failure>
---
name: AssertionError
message: Rejected promise returned by test
values:
...
</failure>
</testcase>
</testsuite>
</testsuites>
I am trying to write a function in Groovy to produce the JSON format below:
{
testsuites{
"testsuite": {
"tests": "4",
"failures": "4",
"errors": "0",
"name": "AT-cdpServer.Default",
"testcase": [
{
"name": "#1 notificate › v1 › announcement › Send an announcement: Send an announcement using the minimum requirements"
},
{
"name": "#2 notifiivate › v1 › announcement › announcement.feature/#TEST CASE: Notification: Send an ant"
},
{
"name": "#1 No tests found in features/tests/auth/auth.POST.js",
"failure": []
},
{
"name": "#2 versioninfo › versioninfo › versioninfo.feature/#TEST CASE: CDP ADMIN: Get version info: Get the version of the CDP service",
"failure": "---\n name: AssertionError\n message: Rejected promise returned by test\n values: {\"Rejected promise returned by test. Reason:\":\"Error {\\n message: 'no schema with key or ref \\\"/versioninfo.get.200\\\"',\\n}\"}\n at: Ajv.validate (node_modules/ajv/lib/ajv.js:95:19)\n ..."
},
{
"name": "#3 projects › edit_entitlement › edit_entitlement.feature/#TEST CASE: CDP ADMIN: Edit Entitlement: Attempt to edit an entitlement_id to be a negative number",
"failure": "---\n name: AssertionError\n message: Rejected promise returned by test\n values: {\"Rejected promise returned by test. Reason:\":\"TypeError {\\n message: 'Only absolute URLs are supported',\\n}\"}\n ..."
},
]
}
}}
}
I would appreciate any pointers in the right direction, thank you.
So far I have this; it reads all the data, but the structure is off:
def toJsonBuilder(xml){
    def xmlToJson = build(new XmlSlurper().parseText(xml))
    new groovy.json.JsonBuilder(xmlToJson)
}

def build(node){
    if (node instanceof String){
        return // ignore strings...
    }
    def map = [(node.name()): node.collect]
    if (!node.attributes().isEmpty()) {
        map.put(node.name(), node.attributes().collectEntries{it})
    }
    if (!node.children().isEmpty() && !(node.children().getAt(0) instanceof String)) {
        map.put(node.children().name, node.children().collect{build(it)}.findAll{it != null})
    } else if (node.text() != ''){
        map.put(node.name(), node.text())
    }
    map
}
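For reference, one possible direction (a sketch, not a drop-in answer; it only approximates the target shape above): recurse over the GPathResult, merge each element's attributes into a map, and collect repeated child elements such as testcase into a list. The XmlSlurper import is groovy.xml on Groovy 3+ (groovy.util on older versions).
import groovy.json.JsonBuilder
import groovy.xml.XmlSlurper // groovy.util.XmlSlurper on Groovy 2.x

// Recursively turn an element into a map: attributes first, then children.
// Repeated child names (e.g. testcase) are collected into a list; an empty
// element such as <failure/> becomes an empty list, and a text-only element
// becomes its trimmed text.
def build(node) {
    def map = [:]
    map.putAll(node.attributes())
    node.children().each { child ->
        def value = child.children().size() ? build(child)
                                            : (child.text()?.trim() ?: [])
        def name = child.name()
        if (map.containsKey(name)) {
            if (!(map[name] instanceof List)) {
                map[name] = [map[name]]
            }
            map[name] << value
        } else {
            map[name] = value
        }
    }
    return map
}

def toJson(String xml) {
    def root = new XmlSlurper().parseText(xml)
    new JsonBuilder([(root.name()): build(root)]).toPrettyString()
}
Calling toJson(xml) on the sample above should produce testcase as an array and failure as either an empty list or the failure text, which is close to the desired structure.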

Why does my HTML content not update without restarting the JavaScript server?

I am learning Node.js and I am using simple code for tests. My problem is that I must restart the server for every change I make to my code, but I see videos where other developers make changes to an HTML file and they do not have to restart the server.
What am I doing incorrectly?
Thank you!
package.json
{
  "name": "xxxxxxx",
  "version": "1.0.0",
  "description": "xxxxxx",
  "main": "app.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "xxxxxxxx",
  "license": "ISC",
  "dependencies": {
    "express": "^4.16.3",
    "jade": "^1.11.0",
    "pug": "^2.0.3"
  }
}
app.js
const express = require('express');
var app = express();

app.set('view engine', 'pug');

let personas = [
    {
        id: 1,
        nombre: 'wwww'
    },
    {
        id: 2,
        nombre: 'qqqq'
    },
    {
        id: 3,
        nombre: 'ssss'
    }
];

app.get('/', function(req, res) {
    res.render('index', {hola: "Hola wqw", titulo: 'pug', mensaje: 'sdasd!!', personas: personas});
    //res.send('Hola mundo');
});

app.listen(8080);
index.pug
html
    head
        title= titulo
        link(href='https://stackpath.bootstrapcdn.com/bootstrap/4.1.3/css/bootstrap.min.css', rel='stylesheet')
    body
        div.container
            h1= mensaje
            table.table.table-striped
                for persona in personas
                    tr
                        td= persona.id
                        td
                            a(href='/persona/' + persona.id) #{persona.nombre}
            - // This runs on the server
            - var arreglo = [1,2,3,4,5];
            - for (var i = 0; i < arreglo.length; i++)
                p= arreglo[i]
            p!= "<h1>Se ejecuta como HTML</h1>"
            p Hola #{mensaje}
            p.
                Hola haz click en #[a(href="https://www.google.com") este link]
In order to get your server to update automatically, you have to install the right package, for example the npm package nodemon.
npm install -g nodemon (or the yarn equivalent) will install the package globally so your server can restart automatically. Instead of the command "node app" you would then use the command "nodemon app".
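If you prefer not to install it globally, a common alternative (a sketch; the "dev" script name and the version range shown are just examples) is to add nodemon as a dev dependency and run it through an npm script:
{
  "scripts": {
    "start": "node app.js",
    "dev": "nodemon app.js"
  },
  "devDependencies": {
    "nodemon": "^2.0.0"
  }
}
After npm install, npm run dev starts the server and restarts it whenever a watched file changes.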
Hope this answers your question!

Google Stackdriver Error Reporting not picking up errors

Logs with severity "ERROR" are not identified by the Error Reporting tool. Application logs are being directed to Google Stackdriver Logging using the fluentd agent, and some of them come from third-party Java components.
{
insertId: "14sf3lvg3ccncgh"
jsonPayload: {
class: "o.a.w.MarkupContainer"
message: "Unable to find component with id 'search2' in [Form [Component id = form]]
Expected: 'form:search2'.
Found with similar names: 'form:search'
at org.apache.wicket.markup.MarkupStream.throwMarkupException(MarkupStream.java:526) ~[wicket-core-6.22.0.jar:6.22.0]
at org.apache.wicket.MarkupContainer.renderNext(MarkupContainer.java:1438) ~[wicket-core-6.22.0.jar:6.22.0]
at org.apache.wicket.MarkupContainer.renderAll(MarkupContainer.java:1557) ~[wicket-core-6.22.0.jar:6.22.0]
at org.apache.wicket.MarkupContainer.renderComponentTagBody(MarkupContainer.java:1532) ~[wicket-core-6.22.0.jar:6.22.0]
at org.apache.wicket.MarkupContainer.onComponentTagBody(MarkupContainer.java:1487) ~[wicket-core-6.22.0.jar:6.22.0]"
milsec: "576"
reportLocation: {…}
serviceContext: {…}
tag: "test.gui"
thread: "[ajp-apr-8009-exec-5]"
}
labels: {…}
logName: "projects/myservice/logs/test.gui"
receiveTimestamp: "2017-08-29T15:20:16.847782870Z"
resource: {…}
severity: "ERROR"
timestamp: "2017-08-29T15:20:11Z"
}
The following configuration allows my application logs to be forwarded correctly to Google's Stackdriver Logging, and all entries are correctly identified.
<source>
#type tail
path /var/log/test/test_gui/test_gui.log
pos_file /var/lib/google-fluentd/pos/test_gui-multiline.pos
read_from_head true
tag test.gui
format multiline
time_format %Y-%m-%d %H:%M:%S
format_firstline /\d{4}-\d{1,2}-\d{1,2} \d{1,2}:\d{1,2}:\d{1,2},\d{1,3}\s(?<severity>\S*)/
format1 /^(?<time>\d{4}-\d{1,2}-\d{1,2} \d{1,2}:\d{1,2}:\d{1,2}),(?<milsec>\d{1,3})\s(?<severity>\S*)\s(?<class>\S*)\s(?<thread>\[\S*\])\s(?<message>.*)/
</source>
However, for severity ERROR, Error Reporting never noticed these entries.
The output was identified as textPayload, so I used the following filter, which ensured the output was jsonPayload:
<filter test.gui>
#type record_transformer
<record>
serviceContext {"service": "test.gui", "version": "1"}
reportLocation {"filePath": "test_gui.log", "lineNumber": "unknown", "functionName": "unknown"}
tag ${tag}
</record>
</filter>
Still, the error jsonPayload is being ignored.
If I replace the message using the filter, then suddenly Error Reporting works:
<filter test.gui>
#type record_transformer
<record>
serviceContext {"service": "test.gui", "version": "1"}
reportLocation {"filePath": "test_gui.log", "lineNumber": "unknown",
"functionName": "unknown"}
message "java.lang.TestError: msg
at com.example.TestClass.test (TestClass.java:51)
at com.example.AnotherClass (AnotherClass.java:25)"
tag ${tag}
</record>
</filter>
How can I force Error Reporting to pick up these error entries? My next step would be to implement some form of alerting.
The third-party components did not produce a correct Java stack trace, so I needed reportLocation, but it has to be nested inside context.
I changed the following line:
reportLocation {"filePath": "test_gui.log", "lineNumber": "unknown", "functionName": "unknown"}
to
context { "reportLocation" : {"filePath": "test_gui.log", "lineNumber": 1, "functionName": "unknown"} }
which ensured that the logs are now picked up by Stackdriver Error Reporting.
This is the final version of my filter:
<filter test.gui>
#type record_transformer
<record>
serviceContext {"service": "test.gui", "version": "1"}
context { "reportLocation" : {"filePath": "test_gui.log", "lineNumber": 1, "functionName": "unknown"} }
tag ${tag}
</record>
</filter>