Converting JSON annotation to COCO format

I have annotated my data using VoTT, and its default export format is JSON. I want to load my data into a Detectron2 model, but it seems the required format is COCO.
Can anyone tell me how I can convert my data from VoTT JSON to COCO format?

My classmates and I have created a Python package called PyLabel to help with this kind of task and other labelling tasks. You can see an example in this notebook: https://github.com/pylabel-project/samples/blob/main/coco2voc.ipynb.
You might be able to use the package's importer tool to import your data and convert it to COCO.
You can find the code for the package here: https://github.com/pylabel-project/.
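If you would rather not add a dependency, the conversion can also be written by hand. The sketch below is only an illustration: it assumes a VoTT-style export in which each asset carries its image size and a list of regions with a bounding box and tags, and the file names and category list are placeholders you would replace with your own.

import json

# Minimal sketch: convert a VoTT-style JSON export to COCO format.
# Assumed (check against your actual export) asset structure:
#   {"asset": {"name": "img1.jpg", "size": {"width": W, "height": H}},
#    "regions": [{"tags": ["cat"],
#                 "boundingBox": {"left": x, "top": y, "width": w, "height": h}}]}

def vott_to_coco(vott_assets, category_names):
    categories = [{"id": i + 1, "name": name} for i, name in enumerate(category_names)]
    name_to_id = {c["name"]: c["id"] for c in categories}

    images, annotations = [], []
    ann_id = 1
    for img_id, asset in enumerate(vott_assets, start=1):
        info = asset["asset"]
        images.append({
            "id": img_id,
            "file_name": info["name"],
            "width": info["size"]["width"],
            "height": info["size"]["height"],
        })
        for region in asset.get("regions", []):
            box = region["boundingBox"]
            annotations.append({
                "id": ann_id,
                "image_id": img_id,
                "category_id": name_to_id[region["tags"][0]],
                "bbox": [box["left"], box["top"], box["width"], box["height"]],  # COCO: [x, y, w, h]
                "area": box["width"] * box["height"],
                "iscrowd": 0,
            })
            ann_id += 1

    return {"images": images, "annotations": annotations, "categories": categories}

if __name__ == "__main__":
    # "vott_export.json" and the category names are placeholders.
    with open("vott_export.json") as f:
        vott_data = json.load(f)
    coco = vott_to_coco(list(vott_data["assets"].values()), ["cat", "dog"])
    with open("coco_annotations.json", "w") as f:
        json.dump(coco, f, indent=2)

The resulting coco_annotations.json can then be registered with Detectron2's register_coco_instances, provided the boxes and image sizes match your images.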

Related

Is there a way to generate JSON data from a JSON schema with multi-source JSON data?

I am currently working on a project to implement data fusion / data integration. I can already find tools to derive JSON schemas from JSON data, but how can I integrate attributes from different JSON documents into the target JSON schema? My final goal is to implement JSON data fusion through configuration on a web page. Is there any solution or open-source implementation I could use as a reference? Thank you so much.
Please allow me to explain more clearly: I can define a target JSON schema, and its attributes should be filled from the attributes of other JSON data (which may come from multiple sources).
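To illustrate what the question is describing (this is only a sketch, not an existing tool): a target schema's attributes can be filled from several source JSON documents through a small mapping configuration. All names, paths, and sample documents below are invented for the example.

import json

# Hypothetical mapping configuration: target attribute -> (source name, path within that source).
MAPPING = {
    "customer_name": ("crm", ["profile", "fullName"]),
    "email":         ("crm", ["contact", "email"]),
    "last_order_id": ("orders", ["latest", "id"]),
}

def get_path(doc, path):
    # Walk a list of keys into a nested dict; return None if any key is missing.
    for key in path:
        if not isinstance(doc, dict) or key not in doc:
            return None
        doc = doc[key]
    return doc

def fuse(sources, mapping):
    # sources: {"crm": {...}, "orders": {...}} -- JSON documents from multiple origins.
    return {target: get_path(sources.get(source, {}), path)
            for target, (source, path) in mapping.items()}

crm = json.loads('{"profile": {"fullName": "Ada"}, "contact": {"email": "ada@example.com"}}')
orders = json.loads('{"latest": {"id": 42}}')
print(fuse({"crm": crm, "orders": orders}, MAPPING))
# {'customer_name': 'Ada', 'email': 'ada@example.com', 'last_order_id': 42}

In a web-based configuration UI, the MAPPING structure would be the piece the user edits and stores, for example as JSON itself.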

Are There Any Ways to Keep CreateML Training Model Up to Date?

I'm a CoreML and CreateML newbie and have been playing around with CreateML for the past few days, and I started to wonder whether there is any way to keep the training model up to date without creating a new model from an updated CSV file every time.
Is there a way to get a JSON file from online and train on it?
I've tried let dataTable = try MLDataTable(contentsOf: URL(string: "the API in JSON format I am using"))
and it gave me the error: "MLDataTable can only load from a file URL."
The basic gist: is there a way to train a model without having to download the updated version of the CSV or JSON file every day?
Thanks :D

How to write the output from DB to a JSON file using Spring batch?

I am new to Spring Batch, and there is a requirement for me to read data from a DB and write it out in JSON format. What's the best way to do this? Are there any APIs for it, do we need to write a custom writer, or do I need to use JSON libraries such as Gson or Jackson? Please guide me.
To read data from a relational database, you can use one of the database readers. You can find an example in the spring-batch-samples repository.
To write JSON data, Spring Batch 4.1.0.RC1 provides the JsonFileItemWriter that allows you to write JSON data to a file. It collaborates with a JsonObjectMarshaller to marshal an object to JSON format. Spring Batch provides support for both Gson and Jackson libraries (you need to have the one you want to use in the classpath). You can find more details here.
Hope this helps.
You do not need the Gson or Jackson libraries if your DB supports JSON.
Example: in SQL Server there is an option to get data out of the DB as a JSON string instead of a result set.
Reference - https://learn.microsoft.com/en-us/sql/relational-databases/json/format-query-results-as-json-with-for-json-sql-server?view=sql-server-2017
https://learn.microsoft.com/en-us/sql/relational-databases/json/format-nested-json-output-with-path-mode-sql-server?view=sql-server-2017
Example - SELECT (SELECT * FROM tableName FOR JSON PATH) AS jsonString;
This already gives you the output as a JSON string, which you can write to a file.

RavenDB - HTTP request to return data in format other than CSV or JSON

I'm running RavenDB v3.0. According to the RavenDB documentation, you are able to access an HTTP link to export a list of documents in CSV format. I've followed the instructions and can generate the export by connecting to an address similar to their example:
http://my-server/databases/db-name/streams/query/DocumentsForExtract?resultsTransformer=TransformForExtract&format=excel
The above URL returns the extract in CSV format. If I remove the format parameter from the request, or change it to anything else, it returns JSON. I want to know whether any other formats are available. I'd like to get it in XML if possible, but I can't seem to find any documentation about this, which is why I'm asking here on SO.
Thanks in advance.
No, that endpoint supports only CSV and JSON.
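Not part of the answer above, but if XML is required, one workaround is to fetch the JSON stream and convert it on the client side. The sketch below reuses the URL pattern from the question (server, database, index, and transformer names are placeholders) and relies on the third-party requests and xmltodict packages.

import requests
import xmltodict  # third-party: pip install requests xmltodict

# Placeholder URL following the pattern from the question; omitting the
# format parameter makes the endpoint return JSON.
url = ("http://my-server/databases/db-name/streams/query/DocumentsForExtract"
       "?resultsTransformer=TransformForExtract")

response = requests.get(url)
response.raise_for_status()
payload = response.json()

# xmltodict needs a single root element; the exact JSON shape depends on RavenDB,
# so wrap whatever comes back.
root = {"Export": payload if isinstance(payload, dict) else {"Item": payload}}
with open("extract.xml", "w") as f:
    f.write(xmltodict.unparse(root, pretty=True))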

Importing a json file into Cassandra

Hi, is it possible to import an arbitrary JSON file into Cassandra?
The JSON file was not exported from sstable2json; it comes from a different website and needs to be imported into Cassandra. Could anyone please advise whether this is possible?
JSON support won't be introduced until Cassandra 3.0 (see CASSANDRA-7970), and even then you still need to define a schema for your JSON data to map to. You do have some other options:
Use maps, which roughly map to JSON. Maps can be indexed as of Cassandra 2.1 (CASSANDRA-4511). There is also a good Stack Exchange post about this.
You mention 'any random json file'. You could just have a string column that contains the raw JSON, but then you lose any queryability of that data.
Come up with some kind of schema for your JSON data, map it to a CQL table, and write some code that parses the JSON and writes it to that table (see the sketch after this list). This doesn't sound like an option for you, since you want to be able to import any random JSON file.
If you are only looking to do JSON document storage, you might want to look at document-oriented solutions instead of a column-oriented solution like Cassandra.
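For the third option above, here is a minimal sketch using the Python cassandra-driver. The keyspace, table, JSON fields, and file name are all invented for the example and assume the table has already been created.

import json
from cassandra.cluster import Cluster  # third-party: pip install cassandra-driver

# Sketch only: assumes a keyspace "demo" with a table created beforehand, e.g.
#   CREATE TABLE demo.users (id text PRIMARY KEY, name text, email text);
# and a JSON file containing a list of objects with matching fields.

cluster = Cluster(["127.0.0.1"])
session = cluster.connect("demo")

insert = session.prepare("INSERT INTO users (id, name, email) VALUES (?, ?, ?)")

with open("data.json") as f:
    records = json.load(f)

for record in records:
    # Map the JSON attributes onto the CQL columns defined above.
    session.execute(insert, (record["id"], record["name"], record["email"]))

cluster.shutdown()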