I have an ASP.NET Core app with SQL Server 2016, which is accessed using EF Core.
One of the fields in the table we're working on contains JSON data (as NVARCHAR), which is accessed using SQL Server's JSON functions.
The JSON looks similar to this:
{"log" : {
"date" : "2018-01-02 12:04:33",
"IP" : "233.24.122.97",
"Severity" : "INFO"
}
}
We would like to implement OData on top of our controllers, which will let us query the tables dynamically. To make the functionality complete, we would also like to be able to query the fields inside the JSON.
We know EF Core does not support JSON querying at the moment, but we can work around that and manipulate the LINQ query manually.
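For context, the manual workaround we have in mind looks roughly like this (only a sketch: the JsonData column and the LogRecords set name are placeholders for our actual names, and on newer EF Core versions FromSql is called FromSqlRaw):
// Push the JSON filtering down to SQL Server with a raw query
var infoLogs = _context.LogRecords
    .FromSql("SELECT * FROM LogRecords WHERE JSON_VALUE(JsonData, '$.log.Severity') = {0}", "INFO")
    .ToList();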
Our problem lies with the OData query.
We would like to be able to use a query like this:
http://server/api/logs?$filter=log/Severity eq 'INFO'
Our problem is that this query causes an exception at the OData / ASP.NET level, stating:
ODataException: could not find a property named log/Severity On type LogRecord
(LogRecord is the entity type containing the JSON Data in addition to other, regular, fields).
How can we construct an OData query that uses fields which are not directly related to the entity but to a JSON field inside it?
Related
I'm using Azure Data Factory to transform data, and I have a derived column to process images.
iif(isNull(column1_images),
    iif(isNull(column2_images),
        iif(isNull(images), 'N/A', toString(images)),
        concat(toString(column2_images), ' ', '(', column2_type, ')')),
    concat(toString(column1_images), ' ', '(', column1_type, ')'))
When I click on the refresh button I can see the expected result, but when I pass this column to the sink I get this error:
Conversion from StringType to ArrayType(StructType(StructField(url,StringType,true)),false) not defined
Can you tell me what the problem is, please?
You are getting the error because no schema is defined for the sink. You need to use a derived column expression to build the matching schema and convert the data. Check Building schemas using the expression builder.
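For example, if the sink expects an array of objects with a url field (as the error message suggests), a derived column expression along these lines would produce the matching shape. This is a rough sketch using the data flow syntax for structures and arrays; adjust the column names and the inner iif logic to your case:
array(@(url = iif(isNull(column1_images), 'N/A', toString(column1_images))))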
You can also check this similar thread for reference.
I have a location column that uses the MySQL Point data type. I'm using Apollo Server and Prisma, and when I run "npx prisma db pull" it generates the column as an unsupported type, because Point is not currently supported in Prisma (see the generated schema).
So I thought, "OK, I'll use String and handle inserting this data type myself," and changed the schema accordingly, but surprise: it didn't work. I've tried to find any approach to handling the MySQL Point data type in Prisma but found no info whatsoever. I'd really appreciate any ideas.
You cannot convert it to String and use it that way, as the Point type isn't supported yet. You need to leave it as Unsupported, and you can only add data via raw queries.
For now, only adding data is supported; you cannot query the column through the regular Prisma Client API. You can, however, query it with Prisma Client via raw queries, for example SELECT id, ST_AsText(geom) AS geom FROM training_data, where geom has the database type geometry and is mapped as Unsupported("geometry").
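A minimal sketch of what that looks like with Prisma Client's raw query API (the training_data table, geom column, and coordinates below are just examples, not from the original schema):
import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

async function main() {
  // Insert a point via a raw query, since the column is mapped as Unsupported("geometry")
  await prisma.$executeRaw`INSERT INTO training_data (geom) VALUES (ST_GeomFromText(${'POINT(40.75 -73.98)'}))`;

  // Read it back as text
  const rows = await prisma.$queryRaw`SELECT id, ST_AsText(geom) AS geom FROM training_data`;
  console.log(rows);
}

main().finally(() => prisma.$disconnect());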
I would like to complete the following code using Entity Framework in .NET Core to query a JSON object from my API, where my API is linked to MS SQL Server 2016. I don't want to pull the entire JSON into my backend and query it there; I want SQL Server to handle the querying using the MS JSON functions and return only the queried information.
Something along the lines of:
var queryList = _context.myTable.Select(r => JSON_QUERY(r.myJsonColumn, "$.info")).ToList();
and
var valueList = _context.myTable.Select(r => JSON_VALUE(r.myJsonColumn, "$.info")).ToList();
I would like to do this without using a string inside the select statement and without creating my own function.
Overall, I want a library that lets me use JSON_VALUE, JSON_QUERY, ISJSON and JSON_MODIFY with Entity Framework and LINQ.
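For reference, a rough sketch of the kind of mapping such a library would do under the hood, using EF Core's HasDbFunction API to surface JSON_VALUE to LINQ (this assumes EF Core 5+; the entity, set, and column names below are hypothetical and follow the snippets above):
using System;
using System.Linq;
using Microsoft.EntityFrameworkCore;

public static class JsonFunctions
{
    // Never executed in .NET; EF Core translates calls to the SQL Server JSON_VALUE function.
    public static string JsonValue(string column, string path)
        => throw new NotSupportedException();
}

public class MyTableRow
{
    public int Id { get; set; }
    public string myJsonColumn { get; set; }
}

public class AppDbContext : DbContext
{
    public DbSet<MyTableRow> myTable { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Map the CLR method to the built-in SQL Server JSON_VALUE function
        modelBuilder
            .HasDbFunction(typeof(JsonFunctions).GetMethod(nameof(JsonFunctions.JsonValue)))
            .HasName("JSON_VALUE")
            .IsBuiltIn();
    }
}

// Usage, roughly matching the snippets above:
// var valueList = _context.myTable
//     .Select(r => JsonFunctions.JsonValue(r.myJsonColumn, "$.info"))
//     .ToList();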
I am upgrading my application from Rails 3.2.8 to Rails 4.2.4. I have an 'extras' attribute on a table 'editorials' which is serialized:
store :extras, accessors: [:attr1, :attr2, :attr3], coder: JSON
# The way it is stored in Rails 3 is
---
:attr1: value
:attr2: value
:attr3: value
# The way it is stored in Rails 4 is
{"attr1":"value", "attr2":"value", "attr3":"value"}
The problem is that when I fetch old records created while my app was on Rails 3, it throws this error:
JSON::ParserError: 795: unexpected token at '---
But when I create new records, it works normally. I have no clue yet how to get it working in Rails 4.
I finally found the solution. The store accessor implementation (mentioned in the question) changed in ActiveRecord 4.2.4. Earlier (ActiveRecord 3.2.8), the data was stored in the database in YAML format, and it still worked with coder: JSON, which is not the case in ActiveRecord 4.2.4.
Here is the code link for 4.2.4:
https://github.com/rails/rails/blob/master/activerecord/lib/active_record/store.rb#L85
Here is the code link for 3.2:
https://github.com/rails/rails/blob/3-2-stable/activerecord/lib/active_record/store.rb#L35
Now in 4.2.4, whether the data stored in the serialized attribute is in YAML or JSON, the coder that works for me is YAML.
Hence my code started working after I changed the coder from JSON to YAML.
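In other words, the store declaration from the question becomes:
store :extras, accessors: [:attr1, :attr2, :attr3], coder: YAML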
Any questions or doubts about this answer are welcome.
Hi there, I have some SQL tables and I want to convert them into a "Drupal node format", but I don't know how to do it. Does someone know at least which tables I have to write to in order to have a full node with all the keys, etc.?
I will give an example:
I have these objects:
Anime
field animeID
field animeName
Producer
field producerID
field producerName
AnimeProducers
field animeID
field producerID
I have used the CCK module and created in my Drupal site a new content type Anime and a new data type Producer that exists within an Anime object.
How can I insert all the data from my simple MySQL DB into Drupal?
Sorry for the long post; I wanted to give you the chance to understand my problem.
Thanks in advance for taking the time to read my post.
You can use either the Feeds module to import flat CSV files, or there is a module called Migrate that seems promising (albeit pretty intense). Both work on Drupal 6 or 7.
I think you can export CSV from your SQL database and then use
http://drupal.org/project/node_import
to import this CSV data to nodes. I don't know if there is another non-programmatic way.
The main tables for node property data are node and node_revision; have a look at the columns in those and it should be fairly obvious what needs to go where.
As far as fields go, their storage is predictable, so you would be able to automate an import (although I don't envy you having to write that!). If your field is called 'field_anime', its data will live in two tables, field_data_field_anime and field_revision_field_anime, which are keyed by the entity ID (in this case the node ID), entity type (in this case 'node' itself) and bundle (in this case the machine name of your node type). You should keep both tables up to date to ensure the revision system functions correctly.
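For illustration only, a direct insert for a single-value text field would look roughly like this (the nid/vid and bundle name are hypothetical, and you would mirror the same row into field_revision_field_anime):
INSERT INTO field_data_field_anime
  (entity_type, bundle, deleted, entity_id, revision_id, language, delta, field_anime_value)
VALUES
  ('node', 'anime', 0, 123, 123, 'und', 0, 'My anime value');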
The simplest way to do it though is with PHP and the node API functions:
/* This is for a single node, obviously you'd want to loop through your custom SQL data here */
$node = new stdClass;
$node->type = 'my_type';
$node->title = 'Title';
node_object_prepare($node);
// Fields
$node->field_anime[LANGUAGE_NONE] = array(0 => array('value' => $value_for_field));
$node->field_producer[LANGUAGE_NONE] = array(0 => array('value' => $value_for_field));
// And so on...
// Finally save the node
node_save($node);
If you use this method, Drupal will handle a lot of the messy stuff for you (for example, updating the taxonomy_index table automatically when adding a taxonomy term field to a node).