Reading a JSON column in Spark [duplicate]

This question already has answers here:
How to query JSON data column using Spark DataFrames?
(5 answers)
Closed 6 years ago.
I have a table in Cassandra with a column of type 'text'.
The value it holds is JSON.
So in each record this column has a value like:
{"a":"1", "b":"5", "c":"3", "d":"12"}
Similarly, the next record has a value like:
{"a":"12", "b":"52", "c":"13", "d":"3"}
In other words, this column holds a JSON value in every record.
My requirement is to retrieve the values of "b" and "d" from each record using Spark/Spark SQL.

After you read in the Cassandra table, you can apply a user-defined function (UDF) to the text column; inside that UDF you parse the string into a JSON object and return the fields you require.
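The UDF body boils down to ordinary JSON parsing. A minimal sketch of that logic in plain Python (the PySpark registration via `udf(...)`, or the alternative of Spark's built-in `get_json_object`, is assumed and not shown here):

```python
import json

def extract_b_d(json_text):
    """Parse the JSON stored in the text column and pull out "b" and "d".
    In Spark, this function would be registered as a UDF and applied
    to the column read from Cassandra."""
    record = json.loads(json_text)
    return record.get("b"), record.get("d")

# Each row's text column holds a JSON document like the ones in the question.
row = '{"a":"1", "b":"5", "c":"3", "d":"12"}'
print(extract_b_d(row))  # ('5', '12')
```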

Related

Using Lists / Arrays in MySQL when migrating from Postgres? [duplicate]

This question already has answers here:
Syntax array for mysql [closed]
(1 answer)
Is there any array data type in MySQL like in PostgreSQL?
(4 answers)
Closed 5 months ago.
I am moving from Postgres to MySQL (using Prisma as my connector).
This schema used to work:
model Page {
  slug      String @id
  someValue String
  ...
  tags      String[]
}
However, MySQL doesn't seem to like String[]:
Unable to get DMMF from Prisma Client: Error: Schema parsing error:
Field "tags" in model "Page" can't be a list. The current connector
does not support lists of primitive types.
What am I supposed to do in mysql instead?
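Since the MySQL connector does not support scalar lists, the usual workarounds are either a `Json` column or an explicit relation table. A sketch of the `Json` variant (field names carried over from the question; treat this as a starting point, not a drop-in fix):

```prisma
model Page {
  slug      String @id
  someValue String
  // No scalar-list support on MySQL, so store the tags as a JSON array,
  // e.g. ["tag1", "tag2"]. Filtering happens via Prisma's JSON operators.
  tags      Json
}
```

The relation-table alternative (a separate `Tag` model with a many-to-many relation to `Page`) is the better choice if you need to query or index individual tags.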

Store JSON in multiple columns and concatenate them to query them

I have a table with an extended column 32,768 bytes in size, and the database uses that space whether we store 1 byte or all 32,768.
I store JSON in this column.
I need to reduce the space this column takes.
Can I store the JSON across multiple columns and then concatenate the columns to work with the complete JSON?
For example
column has data:
'{"REC_TYPE_IND":"1","ID":"999999","2nd ID":"1111","location":"0003","BEGIN_DT":"20000101","END_DT":"20991231"}'
I want to split it out like
column1:
'{"REC_TYPE_IND":"1","ID":"999999","2nd '
column2:
'ID":"1111","location":"0003","BEGIN_DT":"20000101","END_DT":"20991231"}'
How do I then use built-in functions like json_value(column1 || column2,'location') to get a value?
The error I get when trying the above is:
ORA-00932: inconsistent datatypes: expected - got CHAR
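Two things are worth checking here, both hedged guesses rather than a confirmed fix: Oracle's JSON path expressions must start with `$` (so `'location'` should be `'$.location'`), and ORA-00932 usually means one operand of the concatenation is not the character type the function expects, in which case an explicit cast may help. A sketch under those assumptions (table and column names from the question):

```sql
-- Assumed fix, untested: use a '$.'-prefixed path, and cast the first
-- half to CLOB so the concatenated result is one character expression.
SELECT json_value(TO_CLOB(column1) || column2, '$.location') AS location
FROM   my_table;
```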

How can I transpose a field-value table into a normal table in ABAP? [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Want to improve this question? Add details and clarify the problem by editing this post.
Closed 2 years ago.
Improve this question
I need to create a file containing a horizontal table.
I have this table:

FIELDNAME   FIELDVALUE
ZFIELD1     A
ZFIELD2     B
ZFIELD3     C
ZFIELD4     D

The file should be:

ZFIELD1   ZFIELD2   ZFIELD3   ZFIELD4
A         B         C         D

How can I write this code?
A short example of how to create a CSV file in your scenario:

CONSTANTS:
  lc_filename TYPE string VALUE `C:\Myfile.csv`.

DATA:
  l_fieldnames  TYPE string,
  l_fieldvalues TYPE string,
  l_csv_file    TYPE STANDARD TABLE OF string.

FIELD-SYMBOLS:
  <l_workarea> LIKE LINE OF gt_prepare_file.

* Create CSV lines
LOOP AT gt_prepare_file ASSIGNING <l_workarea>.
  PERFORM csv_encode USING <l_workarea>-fieldname  CHANGING l_fieldnames.
  PERFORM csv_encode USING <l_workarea>-fieldvalue CHANGING l_fieldvalues.
ENDLOOP.

* Create CSV file
INSERT l_fieldnames  INTO TABLE l_csv_file.
INSERT l_fieldvalues INTO TABLE l_csv_file.

* Download CSV file
CALL METHOD cl_gui_frontend_services=>gui_download
  EXPORTING
    filename = lc_filename
  CHANGING
    data_tab = l_csv_file.

* Add a value to a line of the CSV file
FORM csv_encode USING value TYPE string CHANGING target TYPE string.
  CONSTANTS:
    lc_separator         TYPE string VALUE `,`,
    lc_delimiter         TYPE string VALUE `"`,
    lc_escaped_delimiter TYPE string VALUE `""`.
  DATA:
    l_encoded_value TYPE string.

* Use delimiter if value contains separator
  l_encoded_value = value.
  IF value CS lc_separator.
    REPLACE ALL OCCURRENCES OF lc_delimiter IN l_encoded_value WITH lc_escaped_delimiter.
    CONCATENATE lc_delimiter l_encoded_value lc_delimiter INTO l_encoded_value.
  ENDIF.

* Add value to line
  IF target IS INITIAL.
    target = l_encoded_value.
  ELSE.
    CONCATENATE target lc_separator l_encoded_value INTO target.
  ENDIF.
ENDFORM.
Please adjust the constants and add some error handling to the code. If you want to use the CSV file in Microsoft Excel, be aware that the separator character is a country-specific setting.
A simple way:
1. Create a flat table (a standard table of string).
2. Retrieve the fields of your table from the DD03L table.
3. Concatenate the field names into a string and add it to your flat table.
4. Loop through the data of the table; for each field name (from the DD03L structure), assign the value (ASSIGN COMPONENT ... OF STRUCTURE), concatenate the values, and once the loop over the field names is finished, add the line to your flat table.
You can then easily transform your flat table into a horizontal table.
You can create a dynamic table in ABAP and fill the field catalog manually. Then you can show this table in ALV. Check the blog post Dynamic internal table illustrated with an example of creating the transpose of an internal table.

mysql 5.7 json data type search array of hashes

Suppose I have a JSON data type column named data, and this column contains an array of hashes, for example: [{"x":1, "y":2}, {"x":3, "y":4}, ...]
Can I search this JSON array for a specific value of some key? That is, select rows that contain a hash whose x key equals 5. Is this possible?
Well, I think I found a solution for it:
SELECT * FROM table WHERE JSON_CONTAINS(JSON_EXTRACT(data, '$[*].x'), 'search_val') = 1;

Retrieving data from MySQL with case sensitivity [duplicate]

This question already has answers here:
How can I make SQL case sensitive string comparison on MySQL?
(12 answers)
Closed 8 years ago.
I have a Login table in a MySQL database. In the table there is a column named cname, and one of its values is 'Raghu'. My question is: when I write the query
SELECT * FROM Login WHERE cname='raghu';
it retrieves the record that contains 'Raghu'. I want it to match by case. How can I retrieve values from the table case-sensitively?
Use the BINARY operator (manual section 10.1.7.7, The BINARY Operator):
The BINARY operator casts the string following it to a binary string. This is an easy way to force a comparison to be done byte by byte rather than character by character.
SELECT * FROM Login WHERE BINARY cname='raghu';
SELECT * FROM Login WHERE STRCMP(cname,'Raghu')=0;
You can try the LOWER function, on either the column name, LOWER(cname), or the value, LOWER('raghu'):
SELECT * FROM Login WHERE LOWER(`cname`) = 'raghu';