Audit4j 2.4.1 config fails - audit4j

I'm trying Audit4j (2.4.1) and following the docs, but my audit4j.conf.yml file seems to be wrong.
I use:
!Configuration
handlers:
- !org.audit4j.core.handler.ConsoleAuditHandler {}
layout: !org.audit4j.core.layout.SimpleLayout
  dateFormat: dd-MM-yyyy HH:mm:ss
metaData: !org.audit4j.core.DummyMetaData {}
This yields the following error:
org.audit4j.core.exception.InitializationException: initialization failed.!!
....
Caused by: org.audit4j.core.exception.ConfigurationException: Configuration Exception
....
Caused by: com.esotericsoftware.yamlbeans.YamlException: Error parsing YAML.
....
Caused by: com.esotericsoftware.yamlbeans.parser.Parser$ParserException: Line 7, column 13: Expected a 'block end' but found: block mapping start
Does anyone have any idea what's wrong here?
Regards,
Gerard

The correct format is:
layout: !org.audit4j.core.layout.SimpleLayout {
    dateFormat: "dd/MM/yyyy HH:mm:ss"
    }
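Putting the pieces together, the whole file from the question would then look something like the sketch below (handlers and metaData are carried over unchanged and the dd-MM-yyyy pattern from the question is kept; only the layout block changes):
# audit4j.conf.yml - sketch based on the answer above
!Configuration
handlers:
- !org.audit4j.core.handler.ConsoleAuditHandler {}
layout: !org.audit4j.core.layout.SimpleLayout {
  dateFormat: "dd-MM-yyyy HH:mm:ss"
}
metaData: !org.audit4j.core.DummyMetaData {}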

Is anyone else experiencing a Syncfusion Pivot Grid remote data error?

Typescript Version: 3.2.4
Angular version: ^7.2.15
Syncfusion Version: 17.1.50
Hi all. I am having an issue with Syncfusion's Pivot Grid. The code below is my ngOnInit method, which is taken straight from their demo. This was working as of 17.1.41 but has now stopped working. The issue also appears in their StackBlitz example.
ngOnInit() {
  let data: DataManager;
  data = new DataManager({
    url: this._controllerUrl + "/billing/extract",
    adaptor: new WebApiAdaptor(),
    crossDomain: true,
    headers: [
      {
        "Content-Type": "application/json",
        Accept: "application/json",
        Authorization: "Bearer " + this.token
      }
    ]
  });
  this.dataSource = {
    enableSorting: true,
    columns: [{ name: "Year" }, { name: "Month" }],
    values: [{ name: "Cost", caption: "Cost (GBP)" }],
    data: data,
    rows: [{ name: "ServiceId" }],
    formatSettings: [],
    expandAll: false,
    filters: []
  };
  this.button = new Button({ isPrimary: true });
  this.button.appendTo("#export");
  this.button.element.onclick = (): void => {
    this.pivotGridObj.excelExport();
  };
}
Here is the full error in Firefox:
ERROR Error: "Uncaught (in promise): TypeError: this.parent.dataSource.values is undefined
./node_modules/@syncfusion/ej2-pivotview/src/common/grouping-bar/axis-field-renderer.js/AxisFields.prototype.createPivotButtons@http://localhost:4200/views-billing-billing-module.js:197167:17
./node_modules/@syncfusion/ej2-pivotview/src/common/grouping-bar/axis-field-renderer.js/AxisFields.prototype.render@http://localhost:4200/views-billing-billing-module.js:197143:14
./node_modules/@syncfusion/ej2-pivotview/src/common/grouping-bar/grouping-bar.js/GroupingBar.prototype.appendToElement@http://localhost:4200/views-billing-billing-module.js:197347:49
./node_modules/@syncfusion/ej2-base/src/observer.js/Observer.prototype.notify@http://localhost:4200/views-billing-billing-module.js:10781:25
./node_modules/@syncfusion/ej2-base/src/component.js/Component.prototype.notify@http://localhost:4200/views-billing-billing-module.js:5505:32
./node_modules/@syncfusion/ej2-pivotview/src/pivotview/base/pivotview.js/PivotView.prototype.renderPivotGrid@http://localhost:4200/views-billing-billing-module.js:204990:18
./node_modules/@syncfusion/ej2-base/src/observer.js/Observer.prototype.notify@http://localhost:4200/views-billing-billing-module.js:10781:25
./node_modules/@syncfusion/ej2-base/src/component.js/Component.prototype.notify@http://localhost:4200/views-billing-billing-module.js:5505:32
./node_modules/@syncfusion/ej2-pivotview/src/pivotview/base/pivotview.js/PivotView.prototype.initEngine@http://localhost:4200/views-billing-billing-module.js:206000:14
./node_modules/@syncfusion/ej2-pivotview/src/pivotview/base/pivotview.js/PivotView.prototype.executeQuery@http://localhost:4200/views-billing-billing-module.js:206047:14
./node_modules/zone.js/dist/zone.js/</ZoneDelegate.prototype.invoke@http://localhost:4200/polyfills.js:7688:26
onInvoke@http://localhost:4200/vendor.js:82616:33
./node_modules/zone.js/dist/zone.js/</ZoneDelegate.prototype.invoke@http://localhost:4200/polyfills.js:7687:52
./node_modules/zone.js/dist/zone.js/</Zone.prototype.run@http://localhost:4200/polyfills.js:7447:43
scheduleResolveOrReject/<@http://localhost:4200/polyfills.js:8186:34
./node_modules/zone.js/dist/zone.js/</ZoneDelegate.prototype.invokeTask@http://localhost:4200/polyfills.js:7720:31
onInvokeTask@http://localhost:4200/vendor.js:82607:33
./node_modules/zone.js/dist/zone.js/</ZoneDelegate.prototype.invokeTask@http://localhost:4200/polyfills.js:7719:60
./node_modules/zone.js/dist/zone.js/</Zone.prototype.runTask@http://localhost:4200/polyfills.js:7492:47
drainMicroTaskQueue@http://localhost:4200/polyfills.js:7898:35
./node_modules/zone.js/dist/zone.js/</ZoneTask.invokeTask@http://localhost:4200/polyfills.js:7799:21
invokeTask@http://localhost:4200/polyfills.js:9041:14
globalZoneAwareCallback@http://localhost:4200/polyfills.js:9067:17
Here is the full error in Chrome/Edge
core.js:15724 ERROR Error: Uncaught (in promise): TypeError: Cannot read property 'length' of undefined
TypeError: Cannot read property 'length' of undefined
at AxisFields.push../node_modules/@syncfusion/ej2-pivotview/src/common/grouping-bar/axis-field-renderer.js.AxisFields.createPivotButtons (axis-field-renderer.js:45)
at AxisFields.push../node_modules/@syncfusion/ej2-pivotview/src/common/grouping-bar/axis-field-renderer.js.AxisFields.render (axis-field-renderer.js:21)
at GroupingBar.push../node_modules/@syncfusion/ej2-pivotview/src/common/grouping-bar/grouping-bar.js.GroupingBar.appendToElement (grouping-bar.js:142)
at Observer.push../node_modules/@syncfusion/ej2-base/src/observer.js.Observer.notify (observer.js:89)
at PivotViewComponent.push../node_modules/@syncfusion/ej2-base/src/component.js.Component.notify (component.js:188)
at PivotViewComponent.push../node_modules/@syncfusion/ej2-pivotview/src/pivotview/base/pivotview.js.PivotView.renderPivotGrid (pivotview.js:881)
at Observer.push../node_modules/@syncfusion/ej2-base/src/observer.js.Observer.notify (observer.js:89)
at PivotViewComponent.push../node_modules/@syncfusion/ej2-base/src/component.js.Component.notify (component.js:188)
at PivotViewComponent.push../node_modules/@syncfusion/ej2-pivotview/src/pivotview/base/pivotview.js.PivotView.initEngine (pivotview.js:1891)
at PivotViewComponent.push../node_modules/@syncfusion/ej2-pivotview/src/pivotview/base/pivotview.js.PivotView.executeQuery (pivotview.js:1938)
at resolvePromise (zone.js:831)
at zone.js:896
at ZoneDelegate.push../node_modules/zone.js/dist/zone.js.ZoneDelegate.invokeTask (zone.js:423)
at Object.onInvokeTask (core.js:17290)
at ZoneDelegate.push../node_modules/zone.js/dist/zone.js.ZoneDelegate.invokeTask (zone.js:422)
at Zone.push../node_modules/zone.js/dist/zone.js.Zone.runTask (zone.js:195)
at drainMicroTaskQueue (zone.js:601)
at ZoneTask.push../node_modules/zone.js/dist/zone.js.ZoneTask.invokeTask [as invoke] (zone.js:502)
at invokeTask (zone.js:1744)
at XMLHttpRequest.globalZoneAwareCallback (zone.js:1770)
The weird thing about this is that if I use a subset of the data that is coming back from the server and set it locally, it all works.
Here is an image of the request that I am making, with the response and the data. I'm not sure if this is something that I can fix, as this may be a bug with the DataManager.
I have tried reverting to the previous version that I used, but the same problem is still there.
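For reference, the local workaround I mention looks roughly like this (a sketch only; the sample rows are made up and merely mirror the fields used in dataSource, a real subset would be copied from the /billing/extract response):
// Sketch of the local-data workaround: bind a plain array instead of the DataManager.
// The rows below are invented for illustration.
const localRows = [
  { Year: 2019, Month: "May", ServiceId: "svc-1", Cost: 12.5 },
  { Year: 2019, Month: "June", ServiceId: "svc-2", Cost: 30.0 }
];
this.dataSource = {
  enableSorting: true,
  columns: [{ name: "Year" }, { name: "Month" }],
  values: [{ name: "Cost", caption: "Cost (GBP)" }],
  data: localRows, // plain array rather than the DataManager instance
  rows: [{ name: "ServiceId" }],
  formatSettings: [],
  expandAll: false,
  filters: []
};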
We are glad to announce that our Essential Studio 2019 Volume 2 beta release, version 17.2.0.28, has been rolled out with a fix for the mentioned issue and is available for download at the following link.
https://www.syncfusion.com/forums/145548/essential-studio-2019-volume-2-beta-release-v17-2-0-28-is-available-for-download
Note: This fix will also be included in our main release, which is expected to be available by mid-July 2019.
We thank you for your support and appreciate your patience in waiting for this release. Please get in touch with us if you require any further assistance.
Regards,
Dinesh Babu Yadav

Log in JSON format to cloudfoundry logstash

We are using the Swisscom Application Cloud (based on Cloud Foundry) and the provided Kibana/Logstash/Elasticsearch service. Now we'd like to log in JSON format from our applications to Logstash.
That's why we integrated the Logstash formatter into our WildFly Swarm apps, and since then they log in JSON format like:
{"#version":1,"#timestamp":"2018-07-24T18:28:51.291+0200","sequence":15299,"loggerClassName":"org.jboss.as.server.logging.ServerLogger_$logger","loggerName":"org.jboss.as.server.deployment","level":"INFO","threadName":"MSC service thread 1-2","message":"WFLYSRV0027: Starting deployment of \"hospush.war\" (runtime-name: \"hospush.war\")","threadId":31,"mdc":{},"ndc":""}
I also added a filter.conf to the logstash app on swisscom appcloud with the following content:
filter {
  json {
    source => "message"
  }
}
When I check the logs of logstash now I can see that it throws an error and no logs are transferred to Kibana.
2018-07-24 20:33:15 [APP/PROC/WEB/0] OUT [2018-07-24T18:33:15,202][WARN ][logstash.filters.json ] Error parsing json {:source=>"message", :raw=>"<14>1 2018-07-24T18:33:15.018187+00:00 HosPush.demo-test.demo-test 3694b57f-bc05-459a-880a-17c174fc6d7c [APP/PROC/WEB/0] - - {\"#version\":1,\"#timestamp\":\"2018-07-24T20:33:15.017+0200\",\"sequence\":3796,\"loggerClassName\":\"org.slf4j.impl.Slf4jLogger\",\"loggerName\":\"com.hospush.business.escalation.EscalationService\",\"level\":\"INFO\",\"threadName\":\"EJB default - 3\",\"message\":\"Found 0 patientNeeds with no open notification.\",\"threadId\":154,\"mdc\":{},\"ndc\":\"\"}\n", :exception=>#<LogStash::Json::ParserError: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
2018-07-24 20:33:15 [APP/PROC/WEB/0] OUT at [Source: (byte[])"<14>1 2018-07-24T18:33:15.018187+00:00 HosPush.demo-test.demo-test 3694b57f-bc05-459a-880a-17c174fc6d7c [APP/PROC/WEB/0] - - {"#version":1,"#timestamp":"2018-07-24T20:33:15.017+0200","sequence":3796,"loggerClassName":"org.slf4j.impl.Slf4jLogger","loggerName":"com.hospush.business.escalation.EscalationService","level":"INFO","threadName":"EJB default - 3","message":"Found 0 patientNeeds with no open notification.","threadId":154,"mdc":{},"ndc":""}
2018-07-24 20:33:30 [APP/PROC/WEB/0] OUT [2018-07-24T18:33:30,117][WARN ][logstash.filters.json ] Error parsing json {:source=>"message", :raw=>"<14>1 2018-07-24T18:33:30.002777+00:00 HosPush.demo-test.demo-test 3694b57f-bc05-459a-880a-17c174fc6d7c [APP/PROC/WEB/0] - - {\"#version\":1,\"#timestamp\":\"2018-07-24T20:33:30.001+0200\",\"sequence\":3797,\"loggerClassName\":\"org.slf4j.impl.Slf4jLogger\",\"loggerName\":\"com.hospush.business.escalation.OrphanEscalationScheduler\",\"level\":\"INFO\",\"threadName\":\"EJB default - 4\",\"message\":\"orphan escalation scheduler called...\",\"threadId\":155,\"mdc\":{},\"ndc\":\"\"}\n", :exception=>#<LogStash::Json::ParserError: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
2018-07-24 20:33:30 [APP/PROC/WEB/0] OUT at [Source: (byte[])"<14>1 2018-07-24T18:33:30.002777+00:00 HosPush.demo-test.demo-test 3694b57f-bc05-459a-880a-17c174fc6d7c [APP/PROC/WEB/0] - - {"#version":1,"#timestamp":"2018-07-24T20:33:30.001+0200","sequence":3797,"loggerClassName":"org.slf4j.impl.Slf4jLogger","loggerName":"com.hospush.business.escalation.OrphanEscalationScheduler","level":"INFO","threadName":"EJB default - 4","message":"orphan escalation scheduler called...","threadId":155,"mdc":{},"ndc":""}
My guess is that because of source => "message", Logstash parses the message property as JSON, which fails. What it should do instead is parse the whole "root object" as JSON rather than only the message property.
Could that be the cause, and if yes, how do I need to adjust the filter.conf to make it work?
Thanks a lot for your help, guys.
Probably not the whole answer, but I guess the filter here is incorrect.
It says that there is a JSON structure in the message field that should be parsed as JSON. As far as I can see, the message field is not plain JSON (the :raw output above shows a syslog header in front of the JSON payload), so this will not work.
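One way to make it work is to strip that syslog envelope first and point the json filter at the remaining payload. A sketch, assuming the envelope is the RFC 5424 header visible in the raw lines and using Logstash's stock syslog patterns (SYSLOG5424LINE and the syslog5424_msg field it produces come from logstash-patterns-core, not from the Swisscom setup):
filter {
  # Split off the RFC 5424 syslog header that the platform adds around each line;
  # SYSLOG5424LINE captures the remainder of the line into syslog5424_msg.
  grok {
    match => { "message" => "%{SYSLOG5424LINE}" }
  }
  # Parse the JSON payload produced by the Logstash formatter in the app.
  json {
    source => "syslog5424_msg"
  }
}
Whether the provided Logstash app already does part of this syslog parsing is worth checking against its shipped configuration; if it does, pointing the json filter at whatever field holds the payload may be enough.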

Mule JSON validation Schema component avoid error logs

I'm using the JSON Validation Schema component and I have noticed that it logs all errors to the console.
I would like to avoid having this error message displayed in the console.
Even though I have chosen a special exception strategy, a catch exception strategy for the JSON validation exception with custom logic and no loggers at all in it, I still see the following error message:
org.mule.api.MessagingException: Json content is not compliant with schema
com.github.fge.jsonschema.core.report.ListProcessingReport: failure
--- BEGIN MESSAGES ---
error: string "blah" is too long (length: 4, maximum allowed: 3)
level: "error"
schema: {"loadingURI":"file:/...}
instance: {"pointer":"/blah_blah_code"}
domain: "validation"
keyword: "maxLength"
value: "blah"
found: 4
maxLength: 3
--- END MESSAGES ---
How can I make Mule omit this error message? I don't want these errors to be logged to the console.
You can set the logException attribute of the catch-exception-strategy element to false, forcing Mule not to log errors to the console:
<catch-exception-strategy logException="false">
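For context, a minimal sketch of where that attribute sits (the flow name, schema location and replacement payload are illustrative and not taken from the question; namespace declarations are omitted):
<flow name="validateFlow">
    <!-- sketch only: validator and exception handler shown together for context -->
    <json:validate-schema schemaLocation="schema.json"/>
    <catch-exception-strategy logException="false">
        <!-- custom handling, no loggers -->
        <set-payload value="Invalid request"/>
    </catch-exception-strategy>
</flow>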
Alternatively, switch off the logger below in log4j2.xml:
<AsyncLogger name="org.mule.module.apikit.validation.RestJsonSchemaValidator" level="OFF"/>

Dropwizard RequestLog

I have a strange exception while integration-testing my Dropwizard 0.9.2 service. The exception and configuration are below.
I don't understand why requestLog is unknown. The DW documentation says that the configuration part below should work. The requestLog setting can be found in io.dropwizard.server.AbstractServerFactory.class, and io.dropwizard.server.DefaultServerFactory.class extends it, so it should be possible to use requestLog. What's wrong here?
Does anyone already know this problem?
Configuration Part
server:
  requestLog:
    timeZone: UTC
    appenders:
      - type: console
        threshold: DEBUG
      - type: file
        currentLogFilename: ./log/access.log
        threshold: ALL
        archive: true
        archivedLogFilenamePattern: ./log/access.%d.log.gz
        archivedFileCount: 14
  maxThreads: 1024
  minThreads: 8
  maxQueuedRequests: 1024
  applicationConnectors:
    - type: http
      port: 80
  adminConnectors:
    - type: http
      port: 12345
Exception
java.lang.RuntimeException: io.dropwizard.configuration.ConfigurationParsingException: myService.yml has an error:
* Unrecognized field at: server.requestLog
Did you mean?:
- adminConnectors
- adminContextPath
- adminMaxThreads
- adminMinThreads
- applicationConnectors
[1 more]
at com.google.common.base.Throwables.propagate(Throwables.java:160)
at io.dropwizard.testing.DropwizardTestSupport.startIfRequired(DropwizardTestSupport.java:214)
at io.dropwizard.testing.DropwizardTestSupport.before(DropwizardTestSupport.java:115)
at io.dropwizard.testing.junit.DropwizardAppRule.before(DropwizardAppRule.java:87)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:46)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
Caused by: io.dropwizard.configuration.ConfigurationParsingException: myservice.yml has an error:
* Unrecognized field at: server.requestLog
Did you mean?:
- adminConnectors
- adminContextPath
- adminMaxThreads
- adminMinThreads
- applicationConnectors
[1 more]
at io.dropwizard.configuration.ConfigurationParsingException$Builder.build(ConfigurationParsingException.java:271)
at io.dropwizard.configuration.ConfigurationFactory.build(ConfigurationFactory.java:163)
at io.dropwizard.configuration.ConfigurationFactory.build(ConfigurationFactory.java:95)
at io.dropwizard.cli.ConfiguredCommand.parseConfiguration(ConfiguredCommand.java:115)
at io.dropwizard.cli.ConfiguredCommand.run(ConfiguredCommand.java:64)
at io.dropwizard.testing.DropwizardTestSupport.startIfRequired(DropwizardTestSupport.java:212)
... 11 more
Caused by: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "requestLog" (class io.dropwizard.server.DefaultServerFactory), not marked as ignorable (6 known properties: "adminMaxThreads", "adminConnectors", "applicationConnectors", "applicationContextPath", "adminMinThreads", "adminContextPath"])
at [Source: N/A; line: -1, column: -1] (through reference chain: .....
We had the same issue. It was a problem of using an older version of Jackson. We upgraded from v2.6.3 to v2.6.5 and the error was gone.
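If the project is Maven-based, one way to force the newer Jackson version is a dependencyManagement entry along these lines (a sketch; 2.6.5 is the version mentioned above, and the build tool is an assumption):
<dependencyManagement>
    <dependencies>
        <!-- pin jackson-databind so transitive dependencies cannot downgrade it -->
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.6.5</version>
        </dependency>
    </dependencies>
</dependencyManagement>
Depending on the setup, the other Jackson modules (core, annotations, datatypes) may need to be aligned to the same version.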
Try to put timeZone: UTC under each appender like this:
server:
  requestLog:
    appenders:
      - type: console
        threshold: DEBUG
        timeZone: UTC
      - type: file
        currentLogFilename: ./log/access.log
        threshold: ALL
        archive: true
        archivedLogFilenamePattern: ./log/access.%d.log.gz
        archivedFileCount: 14
        timeZone: UTC
  maxThreads: 1024
  minThreads: 8
  maxQueuedRequests: 1024
  applicationConnectors:
    - type: http
      port: 80
  adminConnectors:
    - type: http
      port: 12345
The same error arose for me with more recent versions of jackson, when I added a dependency that pulled in com.fasterxml.jackson.core:jackson-databind:jar:2.7.2
Excluding jackson-databind from that dependency in pom.xml resolved it.
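A sketch of what that exclusion looks like in pom.xml (the groupId, artifactId and version of the offending dependency are placeholders for whatever pulls in the conflicting jackson-databind):
<dependency>
    <!-- placeholder coordinates: replace with the dependency that drags in jackson-databind 2.7.2 -->
    <groupId>com.example</groupId>
    <artifactId>some-library</artifactId>
    <version>1.0.0</version>
    <exclusions>
        <exclusion>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
        </exclusion>
    </exclusions>
</dependency>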

Solr returns 400 (Bad Request) while indexing a JSON file from localhost

I am new to Solr. I have set up Solr on my local PC and I am facing a problem with indexing a JSON file. I have one JSON file on my local PC which I want to index in Solr, but it shows the error I have mentioned below.
Error
D:\Solr\Example\exampledocs>java -Durl=http://localhost:8983/solr/update -Dtype=application/json -jar post.jar timeline.json
SimplePostTool version 1.5
Posting files to base url http://localhost:8983/solr/update using content-type application/json..
POSTing file timeline.json
SimplePostTool: WARNING: Solr returned an error #400 (Bad Request) for url: http://localhost:8983/solr/update
SimplePostTool: WARNING: Response: {"responseHeader":{"status":400,"QTime":0},"error":{"msg":"Unknown command: Name [8]","code":400}}
SimplePostTool: WARNING: IOException while reading response: java.io.IOException: Server returned HTTP response code: 400 for URL: http://localhost:8983/solr/update
1 files indexed.
COMMITting Solr index changes to http://localhost:8983/solr/update..
Time spent: 0:00:00.167
Please help me: how can I solve this? Thanks in advance.
The log shows:
ERROR - 2014-07-30 18:38:52.330; org.apache.solr.common.SolrException; org.apache.solr.common.SolrException: Unexpected character '{' (code 123) in prolog; expected '<'
at [row,col {unknown-source}]: [1,1]
at org.apache.solr.handler.loader.XMLLoader.load(XMLLoader.java:176)
at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:92)
at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:74)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1962)
at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:777)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:418)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:207)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:455)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1075)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:384)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1009)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:154)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
at org.eclipse.jetty.server.Server.handle(Server.java:368)
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:489)
at org.eclipse.jetty.server.BlockingHttpConnection.handleRequest(BlockingHttpConnection.java:53)
at org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:953)
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1014)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:861)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)
at org.eclipse.jetty.server.BlockingHttpConnection.handle(BlockingHttpConnection.java:72)
at org.eclipse.jetty.server.bio.SocketConnector$ConnectorEndPoint.run(SocketConnector.java:264)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Unknown Source)
Caused by: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected character '{' (code 123) in prolog; expected '<'
at [row,col {unknown-source}]: [1,1]
at com.ctc.wstx.sr.StreamScanner.throwUnexpectedChar(StreamScanner.java:648)
at com.ctc.wstx.sr.BasicStreamReader.nextFromProlog(BasicStreamReader.java:2047)
at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1069)
at org.apache.solr.handler.loader.XMLLoader.processUpdate(XMLLoader.java:213)
at org.apache.solr.handler.loader.XMLLoader.load(XMLLoader.java:174)
... 32 more
JSON file
{ "Name" : "Matches and Schedule", "timestamp" : { "$date" : 1400825267792 }, "_id" : { "$oid" : "537ee50494" } }
{ "Name" : "meet Modi", "timestamp" : { "$date" : 1401449841192 }, "_id" : { "$oid" : "53886d3a2c" } }
I can't comment yet. Please post a snippet of your JSON file so we can comment.
Also, in these cases the Solr log files are your best friend; they will tell you exactly what Solr doesn't like about the data you are posting.
Also, you don't have to specify the host if you are sending to localhost; it is the default.
EDIT: Your JSON doesn't look correct. What do you expect Solr to do with stuff like:
"timestamp" : { "$date" : 1400825267792 }
First off, if you need timestamp to be a date/time field in Solr, it needs to be in UTC ISO 8601 format (e.g. 2014-05-23T06:07:47Z). Second, Solr doesn't support nested elements.
Finally, if you are posting multiple documents via JSON, the format needs to look like this:
[ {"id":"doc1","field2":"val2"} , {"id":"doc2","field2":"val3"} ]
Note that all documents are enclosed in square brackets and separated by a comma.
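Applied to the two documents in the question, a corrected payload could look like the sketch below: the nested $date values are converted to ISO 8601 instants, and the Mongo-style _id/$oid is flattened into a plain id field (the field names are only suggestions and must exist in your schema):
[
  { "id": "537ee50494", "Name": "Matches and Schedule", "timestamp": "2014-05-23T06:07:47Z" },
  { "id": "53886d3a2c", "Name": "meet Modi", "timestamp": "2014-05-30T11:37:21Z" }
]
A file like that can then be posted with the same SimplePostTool command shown in the question.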