Getting schema with key or id already exists error while using Angular6-json-schema-form library - angular6

I am using the Angular6-json-schema-form library and included
import { Bootstrap4FrameworkModule } from 'angular6-json-schema-form'
in the app.module.ts file. I also created a JSON schema object in the component file and used the following in app.component.html:
<json-schema-form loadExternalAssets="true" [schema]="yourschema" framework="bootstrap-4"></json-schema-form>
But when I run npm start and open localhost, I get this error in the console:
ERROR Error: schema with key or id "http://json-schema.org/draft-06/schema" already exists
Can anyone please help me resolve this issue?

It seems there are two JSON schemas in your project that both carry the id "http://json-schema.org/draft-06/schema". There are two likely causes:
There is actually another JSON schema file in your project with the exact same id.
There is only one schema with this id, but the framework has trouble reading the $id off the schema. Our team also ran into difficulties with $id handling in this library. Try removing the $id property and its value, then re-run your app.
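If the second cause applies, the workaround is simply to pass a schema without the $id keyword. A minimal example of such a schema (the property names here are placeholders, not from the question):

```json
{
  "type": "object",
  "properties": {
    "firstName": { "type": "string" },
    "age": { "type": "number" }
  }
}
```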

Related

bigquery error: "Could not parse '41.66666667' as INT64"

I am attempting to create a table using a .tsv file in BigQuery, but keep getting the following error:
"Failed to create table: Error while reading data, error message: Could not parse '41.66666667' as INT64 for field Team_Percentage (position 8) starting at location 14419658 with message 'Unable to parse'"
I am not sure what to do as I am completely new to this.
Here is a file with the first 100 lines of the full data:
https://wetransfer.com/downloads/25c18d56eb863bafcfdb5956a46449c920220502031838/f5ed2f
Here are the steps I am currently taking to create the table:
https://i.gyazo.com/07815cec446b5c0869d7c9323a7fdee4.mp4
Appreciate any help I can get!
As confirmed with the OP (#dan), the error encountered is caused by selecting Auto detect when creating a table from a .tsv file as the source.
The fix for this is to create the schema manually and define the data type for each column properly. For more reference on using schemas in BQ, see this document.
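The underlying mismatch is plain number parsing: autodetect presumably sampled rows where Team_Percentage looked integral and locked the column to INT64, after which any fractional value fails. The same failure in miniature (plain Python standing in for BigQuery's parser):

```python
value = "41.66666667"

# Parsed as a FLOAT-style value it is fine...
print(float(value))  # 41.66666667

# ...but parsed as an INT64-style value it fails, which is the error BigQuery reports.
try:
    int(value)
except ValueError as e:
    print("could not parse:", e)
```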

Can store and retrieve Object Django BinaryField in SQLite but not MySQL

I have some implementation that is best served by pickling a pandas dataframe and storing it in a DB.
This works fine if the database is SQLite, but it fails with a load error when it is MySQL.
I have found other people with similar issues on Stack Overflow and Google, but it seems that everybody's solution is to store the dataframe via SQL instead.
As a last resort I would go down that route, but it would be a shame to do that for this use case.
Has anybody got a solution to get the same behaviour from MySQL as from SQLite here?
I simply dump the dataframe with
pickledframe = pickle.dumps(frame)
and store pickledframe as a BinaryField
pickledframe = models.BinaryField(null=True)
I load it in with
unpickled = pickle.loads(pickledframe)
With SQLite it works fine; with MySQL I get
Exception Type: UnpicklingError
Exception Value: invalid load key, ','.
upon trying to load it.
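For completeness, the round trip works in isolation. A self-contained sketch (a plain dict stands in for the pandas DataFrame, and bytes() normalizes the buffer type, since some DB drivers hand back a memoryview or bytearray rather than bytes — one thing worth ruling out before blaming MySQL itself):

```python
import pickle

# Stand-in for the pandas DataFrame in the question.
frame = {"col1": [1, 2, 3], "col2": [4.0, 5.0, 6.0]}

# Dump exactly as in the question.
pickledframe = pickle.dumps(frame)

# Simulate a driver returning a buffer object instead of bytes,
# then coerce to bytes before unpickling.
stored = memoryview(pickledframe)
unpickled = pickle.loads(bytes(stored))

print(unpickled == frame)  # True
```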
Thanks

How to create table without schema in BigQuery by API?

Simply speaking, I would like to create a table with a given name by providing only the data.
I have some JUnit tests with sample data (JSON files)
I have to provide a schema for each of the above files to create a table for it
I suppose that I shouldn't need to provide the above schemas.
Why? Because in the BigQuery console I can create a table from a query (even one as simple as: select 1, 'test'), or I can upload a JSON file to create a table with schema autodetection => probably I could also do this programmatically.
I saw https://chartio.com/resources/tutorials/how-to-create-a-table-from-a-query-in-google-bigquery/#using-the-api and know that I could parse the JSONs with data into queries and use the Jobs.insert API to run them, but that is over-engineered and has other disadvantages, e.g. boilerplate code.
After some research I found a possibly simpler way of creating a table on the fly, but it doesn't work for me; code below:
Insert insert = bigquery.jobs().insert(projectId,
        new Job().setConfiguration(
            new JobConfiguration().setLoad(
                new JobConfigurationLoad()
                    .setSourceFormat("NEWLINE_DELIMITED_JSON")
                    .setDestinationTable(
                        new TableReference()
                            .setProjectId(projectId)
                            .setDatasetId(dataSetId)
                            .setTableId(tableId)
                    )
                    .setCreateDisposition("CREATE_IF_NEEDED")
                    .setWriteDisposition(writeDisposition)
                    .setSourceUris(Collections.singletonList(sourceUri))
                    .setAutodetect(true)
            )
        ));
Job myInsertJob = insert.execute();
The JSON file used as source data, pointed to by sourceUri, looks like:
[
  {
    "stringField1": "value1",
    "numberField2": "123456789"
  }
]
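One thing worth double-checking on the source side: NEWLINE_DELIMITED_JSON means one JSON object per line, not a top-level array like the file above. A small sketch of the conversion (file handling omitted; the sample record is taken from the question):

```python
import json

# The array-style content from the question...
records = [
    {"stringField1": "value1", "numberField2": "123456789"},
]

# ...rewritten as newline-delimited JSON: one object per line, no enclosing array.
ndjson = "\n".join(json.dumps(r) for r in records)
print(ndjson)  # {"stringField1": "value1", "numberField2": "123456789"}
```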
Even though I used setCreateDisposition("CREATE_IF_NEEDED"), I still receive the error: "Not found: Table ..."
Is there any other method in the API, or a better approach than the above, to avoid providing a schema?
The code in your question is perfectly fine, and it does create the table if it doesn't exist. However, it fails when you use a partition id in place of the table id, i.e. when the destination table id is "table$20170323", which is what you used in your job. In order to write to a partition, you have to create the table first.

Creating classes in pimcore

I'm new to the whole Pimcore thing. I am trying to play around and create classes. The issue is that I am not able to create more than one class: the class is stored in the database without a name, so when I try to create another class, Pimcore also tries to store it with no name, which triggers an SQL error about a duplicate entry. Any ideas what the reason behind this could be?
I installed Pimcore on an nginx server. I am creating classes via Settings -> Objects -> Classes and then "Add Class". Creating the first class seemed fine: I entered a name and it was added successfully, but the name field in the corresponding database row is an empty string ''. So when I try to add another class and Pimcore attempts to store it in the "classes" table, it returns a duplicate-entry error, since both rows are nameless; i.e. the name I entered is never saved. The following error, found via the browser developer tools, may be helpful:
[WARN] Unable to parse the JSON returned by the server
minified_javascript_core_f5757da….js?_dc=3708:5684 Error: You're trying to decode an invalid JSON String:
Fatal error: Call to a member function hasChilds() on null in /var/www/html/pimproject/pimcore/modules/admin/controllers/DocumentController.php on line 59
at new Ext.Error (http://192.10.0.0/pimcore/static6/js/lib/ext/ext-all.js?_dc=3708:22:27054)
at Function.Ext.apply.raise (http://192.10.0.10/pimcore/static6/js/lib/ext/ext-all.js?_dc=3708:22:27447)
at Object.Ext.raise (http://192.10.0.10/pimcore/static6/js/lib/ext/ext-all.js?_dc=3708:22:27594)
at Object.Ext.JSON.me.decode (http://192.10.0.10/pimcore/static6/js/lib/ext/ext-all.js?_dc=3708:22:385102)
at Ext.define.onProxyLoad (http://192.10.0.10/website/var/tmp/minified_javascript_core_f5757da9fa29d5bf13e6aa5058eff9f7.js?_dc=3708:5641:28)
at Ext.cmd.derive.triggerCallbacks (http://192.10.0.10/pimcore/static6/js/lib/ext/ext-all.js?_dc=3708:22:594533)
at Ext.cmd.derive.setCompleted (http://192.10.0.10/pimcore/static6/js/lib/ext/ext-all.js?_dc=3708:22:594231)
at Ext.cmd.derive.setException (http://192.10.0.10/pimcore/static6/js/lib/ext/ext-all.js?_dc=3708:22:594444)
at Ext.cmd.derive.process (http://192.10.0.10/pimcore/static6/js/lib/ext/ext-all.js?_dc=3708:22:593638)
at Ext.cmd.derive.processResponse (http://192.10.0.10/pimcore/static6/js/lib/ext/ext-all.js?_dc=3708:22:648303)
Just reinstall Pimcore.
It could be a Composer or submodules error.
For a first installation I strongly recommend running the Demo project (https://github.com/pimcore/demo) rather than the Skeleton, especially if you are using Docker. Later, once you get a feel for Pimcore, feel free to install the Skeleton or any other project.
Pimcore has been working stably for years; if you had problems before, nowadays it is stable.

Store accessor issue : Can't read old already stored json object serialized ( hash ) values in mysql database in Rails 4

I am upgrading my application from Rails 3.2.8 to Rails 4.2.4. I have an 'extras' attribute on an 'editorials' table which is serialized:
store :extras, accessors: [:attr1, :attr2, :attr3], coder: JSON
# The way it is stored in Rails 3 is
---
:attr1: value
:attr2: value
:attr3: value
# The way it is stored in Rails 4 is
{"attr1":"value", "attr2":"value", "attr3":"value"}
The problem is that when I fetch old records, created while the app was on Rails 3, it throws this error:
JSON::ParserError: 795: unexpected token at '---
But when I create new records, it works normally. I have no clue yet how to get it working in Rails 4.
I finally found the solution. The store accessor implementation (mentioned in the question) changed in ActiveRecord 4.2.4. Earlier (ActiveRecord 3.2.8), the data was stored in the database in YAML format, and that worked with "coder: JSON"; this is no longer the case in ActiveRecord 4.2.4.
Here is the code link 4.2.4
https://github.com/rails/rails/blob/master/activerecord/lib/active_record/store.rb#L85
Here is the code link 3.2
https://github.com/rails/rails/blob/3-2-stable/activerecord/lib/active_record/store.rb#L35
Now in 4.2.4, whether the data stored in the serialized attribute is YAML or JSON, the coder that works for me is YAML.
Hence my code started working after I changed the coder from JSON to YAML.
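The reason the YAML coder can read both old and new rows is that JSON is, for practical purposes, a subset of YAML, while a JSON parser rejects the YAML document marker outright; the latter is exactly the JSON::ParserError above. A minimal illustration of the failure mode (Python's json module standing in for Ruby's):

```python
import json

# An "extras" payload as Rails 3 stored it (YAML) and as Rails 4 stores it (JSON).
rails3_payload = "---\n:attr1: value\n:attr2: value\n"
rails4_payload = '{"attr1": "value", "attr2": "value"}'

# A JSON coder handles the new rows...
print(json.loads(rails4_payload))  # {'attr1': 'value', 'attr2': 'value'}

# ...but raises on the old YAML rows: the "---" document marker is not valid JSON.
try:
    json.loads(rails3_payload)
except json.JSONDecodeError as e:
    print("parse error:", e)
```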
Any questions or doubts about this answer are welcome.