MySQL query isn't working, especially the Scan() method - mysql

I am trying to print the struct variable, but it's returning the zero value where I want an actual database row value.
Code:
package main

import (
	"database/sql"
	"fmt"
	"log"
	"time"

	_ "github.com/go-sql-driver/mysql"
)

type Users struct {
	Id         int
	Username   string
	Password   string
	Email      string
	First_name string
	Last_name  string
	Created_at time.Time
	Super_user bool
}

func main() {
	db, err := sql.Open("mysql", "root:Megamind#1@(127.0.0.1:3306)/note?parseTime=true")
	if err != nil {
		log.Fatalln("Couldn't connect to the database")
	}
	var user Users
	row := db.QueryRow("select password from users where username=$1", "someone")
	row.Scan(&user.Id, &user.Username, &user.Password, &user.Email, &user.First_name, &user.Last_name, &user.Created_at, &user.Super_user)
	fmt.Println(user)
}
Database:
mysql> select * from users;
+----+---------------+--------------------------------------------------------------+---------------------------+------------+-----------+---------------------+------------+
| id | username | password | email | first_name | last_name | created_at | super_user |
+----+---------------+--------------------------------------------------------------+---------------------------+------------+-----------+---------------------+------------+
| 3 | someone | $2a$10$a0g.eIGEHoVnD/s55YCePeL5BxCPYDF58nP2gb.TmYKwCuV5E7gP. | abc@gmail.com | NULL | NULL | 2020-06-22 01:24:43 | NULL |
| 4 | oneanother | $2a$10$VMo4iuvruCA/yQlkfMI2QOMWo2H2jIiyyoYprKQtQMT4U7UWb78CS | one@one.com | NULL | NULL | 2020-06-22 01:26:48 | NULL |
| 5 | alpha | $2a$10$oD0YKBkTvJQVPF4rilEVYemjRtwNF3ATGlVLUOVGZzR5lNx5fRl3. | alpha@gmail.com | NULL | NULL | 2020-06-22 01:45:34 | NULL |
+----+---------------+--------------------------------------------------------------+---------------------------+------------+-----------+---------------------+------------+
3 rows in set (0.02 sec)
Output:
(base) [dave@192 test]$ go run main.go
{0 0001-01-01 00:00:00 +0000 UTC false}
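No answer is included in this dump, so here is a sketch of the likely fix, under stated assumptions: the DSN, table, and column names are taken from the question itself, and this has not been run against the asker's database. Three things go wrong in the original code: go-sql-driver/mysql uses `?` placeholders (not PostgreSQL-style `$1`), Scan needs exactly as many destinations as selected columns, and the error returned by Scan is discarded, which is what silently leaves the struct at its zero value.

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql"
)

func main() {
	// DSN taken from the question; note the `@` before the address.
	db, err := sql.Open("mysql", "root:Megamind#1@(127.0.0.1:3306)/note?parseTime=true")
	if err != nil {
		log.Fatalln(err)
	}
	defer db.Close()

	var username, password string
	// MySQL's placeholder is `?`; select only the columns you scan into.
	row := db.QueryRow("select username, password from users where username = ?", "someone")
	if err := row.Scan(&username, &password); err != nil {
		// sql.ErrNoRows here would explain a zero-valued result.
		log.Fatalln(err)
	}
	fmt.Println(username, password)
}
```

Nullable columns such as first_name and last_name (NULL in the sample rows) would additionally need sql.NullString destinations.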

Related

Extract key-pair values from JSON objects in MySQL

From a MySQL JSON data field, I'm extracting data from an array like so:
SELECT
data ->> '$.fields[*]' as fields
FROM some_database...
which returns:
[{
"id": 111056,
"hint": null,
"slug": "email",
"label": "E-mail",
"value": null,
"field_value": "test#example.com",
"placeholder": null
}, {
"id": 111057,
"hint": null,
"slug": "name",
"label": "Imię",
"value": null,
"field_value": "Aneta",
"placeholder": null
}]
I can also extract a single column:
SELECT
data ->> '$.fields[*].field_value' as fields
FROM some_database...
and that returns the following result:
[test@example.com, Aneta]
But how can I extract field_value alongside label as key-value pairs?
Preferred output would be a single multi-row string containing pairs:
label: field_value
label: field_value
...
Using the example shown above, it would give me the following output:
E-mail: test@example.com
Imię: Aneta
A one-liner is preferred, as I have multiple such arrays to extract from various fields.
Here's an example of extracting the key names as rows:
select j.keyname from some_database
cross join json_table(
json_keys(data->'$[0]'),
'$[*]' columns (
keyname varchar(20) path '$'
)
) as j;
Output:
+-------------+
| keyname |
+-------------+
| id |
| hint |
| slug |
| label |
| value |
| field_value |
| placeholder |
+-------------+
Now you can join that to the values:
select n.n, j.keyname,
json_unquote(json_extract(f.data, concat('$[', n.n, ']."', j.keyname, '"'))) as value
from some_database as d
cross join json_table(
json_keys(d.data->'$[0]'),
'$[*]' columns (
keyname varchar(20) path '$'
)
) as j
cross join n
join some_database as f on n.n < json_length(f.data);
Output:
+---+-------------+------------------+
| n | keyname | value |
+---+-------------+------------------+
| 0 | id | 111056 |
| 0 | hint | null |
| 0 | slug | email |
| 0 | label | E-mail |
| 0 | value | null |
| 0 | field_value | test#example.com |
| 0 | placeholder | null |
| 1 | id | 111057 |
| 1 | hint | null |
| 1 | slug | name |
| 1 | label | Imię |
| 1 | value | null |
| 1 | field_value | Aneta |
| 1 | placeholder | null |
+---+-------------+------------------+
I'm using a utility table n which is just filled with integers.
create table n (n int primary key);
insert into n values (0),(1),(2),(3)...;
If this seems like a lot of complex work, then maybe the lesson is that storing data in JSON is not easy when you want SQL expressions to work on the discrete fields within JSON documents.
You can use JSON_VALUE:
select JSON_VALUE(json_value_col, '$.selected_key') as selected_value from user_details;
You can also use JSON_EXTRACT:
select JSON_EXTRACT(json_value_col, '$.selected_key') as selected_value from user_details;
For more details, refer to:
https://dev.mysql.com/doc/refman/8.0/en/json-search-functions.html
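For the asked-for "label: field_value" pairs specifically, here is a JSON_TABLE-based sketch (assuming MySQL 8.0+ and the same `some_database` table as in the question; the alias `jt` is mine):

```sql
-- Sketch: JSON_TABLE turns each element of $.fields into a row with
-- `label` and `field_value` as paired columns, and CONCAT formats them
-- as "label: field_value" lines.
SELECT CONCAT(jt.label, ': ', jt.field_value) AS pair
FROM some_database
CROSS JOIN JSON_TABLE(
    data -> '$.fields',
    '$[*]' COLUMNS (
        label       VARCHAR(255) PATH '$.label',
        field_value VARCHAR(255) PATH '$.field_value'
    )
) AS jt;
```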

How to page a query in JPA with nativeQuery and Spring PagingAndSortingRepository

A few days ago a colleague asked this question: Order and group by with one column
It was resolved with a simple and clean query, very cool. The solution seemed worthwhile, except that we needed to implement it in JPA.
As JPA does not accept subqueries in JOIN, we had to run it as a nativeQuery, but in doing so we have problems with paging, since JPA does not combine this with native queries.
https://docs.spring.io/spring-data/jpa/docs/1.8.0.M1/reference/html/
Native queries: The @Query annotation allows to execute native queries
by setting the nativeQuery flag to true. Note that we currently don't
support execution of pagination or dynamic sorting for native queries
as we'd have to manipulate the actual query declared and we cannot do
this reliably for native SQL.
We have no idea how to continue with this.
What we have to do: we have a series of records with phone number, user and date, and each user has been able to call N times. We need to obtain all the records grouped by phone number and sorted (DESC) by the most recent date of each group of numbers.
e.g.:
With this data:
+--------------+---------------------+-------------+---------------+
| phone_number | registered | name | first_surname |
+--------------+---------------------+-------------+---------------+
| 222005001 | 2019-05-10 10:01:01 | Alvaro | Garcia |
| 222004001 | 2019-05-13 16:14:21 | David | Garcia |
| 111003001 | 2019-05-13 16:14:43 | Roberto | Martin |
| 111001000 | 2019-05-13 16:14:50 | Juan Manuel | Martin |
| 111001000 | 2019-05-13 16:14:50 | Maria | Alonso |
| 111001000 | 2019-05-13 16:14:50 | Roberto | Martin |
| 333006001 | 2019-05-13 16:14:55 | Benito | Lopera |
| 123456789 | 2019-05-13 16:15:00 | NULL | NULL |
| 987654321 | 2019-05-13 16:15:08 | NULL | NULL |
| 123456789 | 2019-05-13 16:15:13 | NULL | NULL |
| 666999666 | 2019-05-13 16:15:18 | NULL | NULL |
| 454545458 | 2019-05-13 16:15:27 | NULL | NULL |
| 333006001 | 2019-05-13 16:23:36 | Benito | Lopera |
| 987654321 | 2019-05-13 16:23:46 | NULL | NULL |
| 666999666 | 2019-05-13 16:23:50 | NULL | NULL |
| 454545458 | 2019-05-13 16:23:55 | NULL | NULL |
| 666999666 | 2019-05-13 16:24:03 | NULL | NULL |
| 222004001 | 2019-05-13 16:24:10 | David | Garcia |
+--------------+---------------------+-------------+---------------+
Sort them like this:
+--------------+---------------------+-------------+---------------+
| phone_number | registered | name | first_surname |
+--------------+---------------------+-------------+---------------+
| 222004001 | 2019-05-13 16:24:10 | David | Garcia |
| 222004001 | 2019-05-13 16:14:21 | David | Garcia |
| 666999666 | 2019-05-13 16:24:03 | NULL | NULL |
| 666999666 | 2019-05-13 16:23:50 | NULL | NULL |
| 666999666 | 2019-05-13 16:15:18 | NULL | NULL |
| 454545458 | 2019-05-13 16:23:55 | NULL | NULL |
| 454545458 | 2019-05-13 16:15:27 | NULL | NULL |
| 987654321 | 2019-05-13 16:23:46 | NULL | NULL |
| 987654321 | 2019-05-13 16:15:08 | NULL | NULL |
| 333006001 | 2019-05-13 16:23:36 | Benito | Lopera |
| 333006001 | 2019-05-13 16:14:55 | Benito | Lopera |
| 123456789 | 2019-05-13 16:15:13 | NULL | NULL |
| 123456789 | 2019-05-13 16:15:00 | NULL | NULL |
| 111001000 | 2019-05-13 16:14:50 | Maria | Alonso |
| 111001000 | 2019-05-13 16:14:50 | Roberto | Martin |
| 111001000 | 2019-05-13 16:14:50 | Juan Manuel | Martin |
| 111003001 | 2019-05-13 16:14:43 | Roberto | Martin |
| 222005001 | 2019-05-10 10:01:01 | Alvaro | Garcia |
+--------------+---------------------+-------------+---------------+
It can be done with this query:
SELECT c.phone_number, c.registered, cl.name, cl.first_surname
FROM callers cl
INNER JOIN callers_phones cp ON cl.caller_id = cp.caller_id
RIGHT OUTER JOIN calls c ON c.phone_number = cp.phone_number
JOIN (
SELECT phone_number, MAX(registered) AS registered
FROM calls
GROUP BY phone_number) aux_c ON aux_c.phone_number = c.phone_number
WHERE c.answered = FALSE
AND (null is null or null is null or c.registered between null and null)
AND (null is null or c.phone_number = null)
AND (null is null or cl.caller_id = null)
ORDER BY aux_c.registered DESC, c.registered DESC
These are the tables:
CREATE TABLE callers
(
caller_id int NOT NULL UNIQUE AUTO_INCREMENT,
name varchar(50) NOT NULL,
first_surname varchar(50) NOT NULL,
CONSTRAINT callers_pkey PRIMARY KEY (caller_id)
);
CREATE TABLE callers_phones
(
phone_id int NOT NULL UNIQUE AUTO_INCREMENT,
caller_id int NOT NULL,
phone_number int NOT NULL,
CONSTRAINT callers_phones_pkey PRIMARY KEY (phone_id)
);
ALTER TABLE callers_phones
ADD CONSTRAINT callers_phones_fkey_callers FOREIGN KEY (caller_id)
REFERENCES callers (caller_id);
CREATE TABLE calls
(
call_id int NOT NULL UNIQUE AUTO_INCREMENT,
phone_number int NOT NULL,
answered boolean NOT NULL DEFAULT false,
registered datetime NOT NULL,
CONSTRAINT calls_pkey PRIMARY KEY (call_id)
);
The problem is that we have to implement it in JPA with paging, but subqueries do not work in the JOIN clause, and paging does not work with nativeQuery.
This is what we have done:
@Entity:
import java.util.Date;
import javax.persistence.Entity;
import javax.persistence.EntityResult;
import javax.persistence.FieldResult;
import javax.persistence.Id;
import javax.persistence.NamedNativeQuery;
import javax.persistence.SqlResultSetMapping;
@SqlResultSetMapping (name = "MissedCallResult",
    entities = {
        @EntityResult (entityClass = MissedCallEntity.class,
            fields = {
                @FieldResult (name = "callId", column = "id"),
                @FieldResult (name = "phoneNumber", column = "pH"),
                @FieldResult (name = "registered", column = "reg"),
                @FieldResult (name = "callerName", column = "cN"),
                @FieldResult (name = "callerFirstSurname", column = "cFS")
            })
    })
@NamedNativeQuery (name = "findMissedCalls",
    query = "select c.call_id as id, c.phone_number as pH, c.registered as reg, cl.name as cN, cl.first_surname as cFS "
        + "from callers cl "
        + " inner join callers_phones cp on cl.caller_id = cp.caller_id "
        + " right outer join calls c on c.phone_number = cp.phone_number "
        + " join (select c2.phone_number, MAX(c2.registered) as registered "
        + "       from calls c2 "
        + "       group by c2.phone_number) aux_c on aux_c.phone_number = c.phone_number "
        + "where c.answered = false "
        + " and (:startDate is null or :endDate is null or c.registered between :startDate and :endDate) "
        + " and (:callerId is null or cl.caller_id = :callerId) "
        + " and (:phoneNumber is null or c.phone_number = :phoneNumber) "
        + "order by aux_c.registered desc, c.registered desc",
    resultSetMapping = "MissedCallResult")
@Entity
public class MissedCallEntity
{
    @Id
    private Integer callId;
    private Integer phoneNumber;
    private Date registered;
    private String callerName;
    private String callerFirstSurname;
    private String callerSecondSurname;
    ...
}
@Repository:
import java.util.Date;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.PagingAndSortingRepository;
import org.springframework.data.repository.query.Param;
import org.springframework.data.rest.core.annotation.RepositoryRestResource;
import es.panel.domain.MissedCallEntity;
@RepositoryRestResource (path = "missedCalls", collectionResourceRel = "missedCalls")
public interface MissedCallRepository extends PagingAndSortingRepository<MissedCallEntity, Integer>
{
    @Query (nativeQuery = true, name = "findMissedCalls")
    Page<MissedCallEntity> findMissedCalls(@Param ("startDate") Date startDate,
                                           @Param ("endDate") Date endDate,
                                           @Param ("callerId") Integer callerId,
                                           @Param ("phoneNumber") Integer phoneNumber,
                                           Pageable page);
}
In @Service:
public Page<MissedCallEntity> getMissedCalls(Date startDate,
Date endDate,
Integer callerId,
Integer phoneNumber,
int actualPage,
int limit)
{
Page<MissedCallEntity> calls = mcRepository.findMissedCalls(
startDate, endDate, callerId, phoneNumber, PageRequest.of(1, 5));
return calls;
}
Thanks in advance!
A very simple solution is to create a database view based on your query for the calculated values, i.e. counts, max values, etc.
You can map this to the relevant entity using the JPA @SecondaryTable annotation, which lets you map an entity to more than one table (or view).
With this in place you can sort and filter using standard JPA/Spring Data functionality, just as for any other field, and you can pretty much remove all the code you have written.
I would elaborate further; however, it is not very clear what you are trying to achieve: you are asking about your attempted solution rather than the problem itself. Nor is MissedCall really an entity; the entities in your system are users, calls, phones, etc.
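The view approach described above can be sketched as follows. This is only a sketch: the view name `latest_calls` and the column `latest_registered` are hypothetical names, not from the original post.

```sql
-- Sketch: precompute the most recent call per phone number in a view,
-- so the value becomes an ordinary sortable column for JPA.
-- (`latest_calls` and `latest_registered` are hypothetical names.)
CREATE VIEW latest_calls AS
SELECT phone_number, MAX(registered) AS latest_registered
FROM calls
GROUP BY phone_number;
```

An entity field mapped to this view (for example via @SecondaryTable, joining on phone_number) can then be used in a plain Pageable/Sort request, with no native query involved.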

Hive from JSON Error

I can't make this JSON into a Hive table somehow; it either becomes all null data or can't be selected at all. I just need all the same fields as in my DDL, and if a field is structured inside, I want to leave it as a string instead of trying to parse it.
The only near-success was with hive-hcatalog-core-1.1.0-cdh5.10.0.jar. Since some data are blank, I'm able to query with LIMIT, but when I remove the limit it returns this kind of error: org.apache.hadoop.hive.serde2.SerDeException: java.io.IOException: Field name expected
My table creation :
ADD JAR hive-hcatalog-core-1.1.0-cdh5.10.0.jar;
CREATE EXTERNAL TABLE tabless (
  `dt` STRING, `hGeoLocation` STRING, `loginId` STRING, `hSearchFunnel` STRING,
  `timeseries` STRING, `locale` STRING, `fetcherResult` STRING, `searchType` STRING,
  `isBackDate` STRING, `hId` STRING, `hFrequency` STRING, `currency` STRING,
  `userType` STRING, `isSNA` STRING, `isBinding` STRING, `nodeId` STRING,
  `_id` STRING, `adjustedResult` STRING, `chosenProviderSell` STRING, `ChosenInventoryBeforeAdjusted` STRING,
  `PricingRules` STRING, `cInDate` STRING, `cOutDate` STRING, `machineId` STRING,
  `interface` STRING, `pricingSpec` STRING, `elapsedTime` STRING, `ChosenInventoryAfterAdjusted` STRING,
  `chosenProviderBase` STRING, `fFrequency` STRING, `kafkaPT` STRING, `kafkaST` STRING,
  `cookieId` STRING, `sessionId` STRING, `pricingSpecAbPriceAdjustment` STRING, `searchId` STRING,
  `prevSearchId` STRING, `competitorRequest` STRING, `CPricingRule` STRING, `CStatisticChosenMethod` STRING,
  `ChosenCId` STRING, `ChosenCPricingRule` STRING, `chosenCPriceType` STRING, `CPriceDiff` STRING,
  `competitorResponse` STRING, `searchRateType` STRING
) COMMENT 'somecomment'
ROW FORMAT SERDE
'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION 'someremotelocation';
Please use an online JSON parser if needed.
My JSON looks like this, in massive quantity:
{"ChosenCId":null,"ChosenCPricingRule":null,"ChosenInventoryAfterAdjusted":[{"hRoomId":1086174,"BASEFARE":22150,"SELLFARE":25000},{"hRoomId":103270,"BASEFARE":249,"SELLFARE":2800},{"hRoomId":103272,"BASEFARE":2470,"SELLFARE":200},{"hRoomId":100273,"BASEFARE":3050,"SELLFARE":3500},{"hRoomId":10376,"BASEFARE":3050,"SELLFARE":3500},{"hRoomId":10375,"BASEFARE":3050,"SELLFARE":3500},{"hRoomId":10374,"BASEFARE":367,"SELLFARE":4250},{"hRoomId":1069,"BASEFARE":430,"SELLFARE":500},{"hRoomId":108634,"BASEFARE":44700,"SELLFARE":5000},{"hRoomId":10270,"BASEFARE":400,"SELLFARE":570},{"hRoomId":102,"BASEFARE":400,"SELLFARE":5700},{"hRoomId":1026,"BASEFARE":610,"SELLFARE":70},{"hRoomId":1033,"BASEFARE":610,"SELLFARE":70},{"hRoomId":1075,"BASEFARE":60,"SELLFARE":0},{"hRoomId":1074,"BASEFARE":730,"SELLFARE":80},{"hRoomId":1039,"BASEFARE":870,"SELLFARE":10},{"hRoomId":1269,"BASEFARE":800,"SELLFARE":10000},{"hRoomId":10271,"BASEFARE":9500,"SELLFARE":1100},{"hRoomId":1039,"BASEFARE":17000,"SELLFARE":2000},{"hRoomId":1271,"BASEFARE":1900,"SELLFARE":200}],"ChosenInventoryBeforeAdjusted":[{"hRoomId":1084,"BASEFARE":220,"SELLFARE":2000},{"hRoomId":10320,"BASEFARE":250,"SELLFARE":280},{"hRoomId":10372,"BASEFARE":240,"SELLFARE":200},{"hRoomId":103273,"BASEFARE":3850,"SELLFARE":300},{"hRoomId":1076,"BASEFARE":350,"SELLFARE":300},{"hRoomId":10275,"BASEFARE":380,"SELLFARE":350},{"hRoomId":1074,"BASEFARE":360,"SELLFARE":420},{"hRoomId":1069,"BASEFARE":430,"SELLFARE":500},{"hRoomId":1084,"BASEFARE":440,"SELLFARE":50},{"hRoomId":10370,"BASEFARE":490,"SELLFARE":500},{"hRoomId":1032,"BASEFARE":400,"SELLFARE":500},{"hRoomId":1036,"BASEFARE":610,"SELLFARE":710},{"hRoomId":1073,"BASEFARE":610,"SELLFARE":710},{"hRoomId":1035,"BASEFARE":61,"SELLFARE":710},{"hRoomId":1034,"BASEFARE":730,"SELLFARE":80},{"hRoomId":1029,"BASEFARE":800,"SELLFARE":100},{"hRoomId":10269,"BASEFARE":800,"SELLFARE":100},{"hRoomId":101,"BASEFARE":9500,"SELLFARE":100},{"hRoomId":109,"BASEFARE":1700,"SELLFARE":200},{"hRoomId":1071,"BASEFARE":1900,"SELLFARE":20}],"CPriceDiff":0.0,"CPricingRule":{},"CStatisticChosenMethod":"none","CookieID":"1547597","FTA":[{"hRoomId":1074,"BASEFARE":220,"SELLFARE":20},{"hRoomId":10370,"BASEFARE":2450,"SELLFARE":200},{"hRoomId":1072,"BASEFARE":240,"SELLFARE":28},{"hRoomId":1033,"BASEFARE":37,"SELLFARE":35},{"hRoomId":1036,"BASEFARE":300,"SELLFARE":350},{"hRoomId":105,"BASEFARE":30,"SELLFARE":350},{"hRoomId":1074,"BASEFARE":30,"SELLFARE":420},{"hRoomId":109,"BASEFARE":430,"SELLFARE":00},{"hRoomId":10874,"BASEFARE":440,"SELLFARE":500},{"hRoomId":10370,"BASEFARE":4900,"SELLFARE":570},{"hRoomId":103,"BASEFARE":490,"SELLFARE":5700},{"hRoomId":10376,"BASEFARE":6100,"SELLFARE":70},{"hRoomId":10273,"BASEFARE":600,"SELLFARE":700},{"hRoomId":175,"BASEFARE":60,"SELLFARE":70},{"hRoomId":104,"BASEFARE":730,"SELLFARE":80},{"hRoomId":1069,"BASEFARE":80,"SELLFARE":100},{"hRoomId":109,"BASEFARE":80,"SELLFARE":10},{"hRoomId":171,"BASEFARE":950,"SELLFARE":110},{"hRoomId":10,"BASEFARE":170,"SELLFARE":20},{"hRoomId":101,"BASEFARE":100,"SELLFARE":200}],"PricingRules":{"t_l":22000900002,"hbeds":2200000002,"t_p":2200000002,"t_m":22000000002,"e_private":22000900002,"t":22000000002,"e":222,"hbeds_ratebinding":220000002,"t_budgetrooms":22000},"SessionID":"d586280d34","_id":154766,"adjustedResult":{"CheapestBase":{"t":{"BASEFARE":22,"SELLFARE":25},"e":{"BASEFARE":26,"SELLFARE":28}},"CheapestSell":{"t":{"BASEFARE":22,"SELLFARE":25},"e":{"BASEFARE":26,"SELLFARE":28}}},"cInDate":"01-01-2012","cOutDate":"12-12-2017","chosenProviderBase":"t","chosenProviderSell":"t","currency":"SGD","dt":147591430,"elapsedTime":5,"fetcherResult":{"CheapestBase":{"t":{"BASEFARE":20,"SELLFARE":25},"e":{"BASEFARE":20,"SELLFARE":28}},"CheapestSell":{"t":{"BASEFARE":22,"SELLFARE":25},"e":{"BASEFARE":20,"SELLFARE":20}}},"fFrequency":["NONE"],"hFrequency":[],"hGeoLocation":"SINGAPORE","hId":200344,"interface":["MOBILE_APPS_ANDROID"],"isBackDate":false,"isBinding":false,"isSNA":false,"locale":"id_ID","loginId":"","machineId":"416","nodeId":"hivv2","pricingSpec":{"isB":false,"searchDate":14700,"hTransactionFrequencyStatus":"","userLocale":"id_ID","isBackDate":false,"currency":"VND","hTransactionFrequency":0,"roomCount":"1","hUserType":"NON_LOGGED_IN_USER","lengthOfStay":"1","fTransactionRecency":0,"userGeoCountry":"Australia","abPriceAdjustment":"treatmentGroup","searchTime":530,"bookingWindowInDays":1,"roomNight":"1","hSearchFunnel":"LOWER_FUNNEL","cInDate":147000,"cOutDate":1476032400000,"searchDay":"6","fTransactionFrequency":0,"fTransactionFrequencyStatus":"NONE","cInDay":"7","hGrouping":"1,93,41,122","hIds":"2000000369344","hTransactionRecency":0,"clientType":"MOBILE"},"searchType":"hRoomSearch","timeSeries":1475919104430,"timeseries":1475919104430,"userType":["NON"],"kafkaPT":1475919104430,"kafkaST":1475919656986}
Do you guys know why, or what the solution is?
create external table tabless (json_doc string)
row format delimited
tblproperties ('serialization.last.column.takes.rest'='true')
;
select json_tuple
(
json_doc
,'dt','hGeoLocation','loginId','hSearchFunnel'
,'timeseries','locale','fetcherResult','searchType'
,'isBackDate','hId','hFrequency','currency'
,'userType','isSNA','isBinding','nodeId'
,'_id','adjustedResult','chosenProviderSell','ChosenInventoryBeforeAdjusted'
,'PricingRules','cInDate','cOutDate','machineId'
,'interface','pricingSpec','elapsedTime','ChosenInventoryAfterAdjusted'
,'chosenProviderBase','fFrequency','kafkaPT','kafkaST'
,'cookieId','sessionId','pricingSpecAbPriceAdjustment','searchId'
,'prevSearchId','competitorRequest','CPricingRule','CStatisticChosenMethod'
,'ChosenCId','ChosenCPricingRule','chosenCPriceType','CPriceDiff'
,'competitorResponse','searchRateType'
) as (
`dt`,`hGeoLocation`,`loginId`,`hSearchFunnel`
,`timeseries`,`locale`,`fetcherResult`,`searchType`
,`isBackDate`,`hId`,`hFrequency`,`currency`
,`userType`,`isSNA`,`isBinding`,`nodeId`
,`_id`,`adjustedResult`,`chosenProviderSell`,`ChosenInventoryBeforeAdjusted`
,`PricingRules`,`cInDate`,`cOutDate`,`machineId`
,`interface`,`pricingSpec`,`elapsedTime`,`ChosenInventoryAfterAdjusted`
,`chosenProviderBase`,`fFrequency`,`kafkaPT`,`kafkaST`
,`cookieId`,`sessionId`,`pricingSpecAbPriceAdjustment`,`searchId`
,`prevSearchId`,`competitorRequest`,`CPricingRule`,`CStatisticChosenMethod`
,`ChosenCId`,`ChosenCPricingRule`,`chosenCPriceType`,`CPriceDiff`
,`competitorResponse`,`searchRateType`
)
from tabless
;
The query returns one row. The original console table is far too wide to render, so the row is shown transposed here, one column per line:

dt: 147591430
hgeolocation: SINGAPORE
loginid: (empty string)
hsearchfunnel: (null)
timeseries: 1475919104430
locale: id_ID
fetcherresult: {"CheapestBase":{"t":{"BASEFARE":20,"SELLFARE":25},"e":{"BASEFARE":20,"SELLFARE":28}},"CheapestSell":{"t":{"BASEFARE":22,"SELLFARE":25},"e":{"BASEFARE":20,"SELLFARE":20}}}
searchtype: hRoomSearch
isbackdate: false
hid: 200344
hfrequency: []
currency: SGD
usertype: ["NON"]
issna: false
isbinding: false
nodeid: hivv2
_id: 154766
adjustedresult: {"CheapestBase":{"t":{"BASEFARE":22,"SELLFARE":25},"e":{"BASEFARE":26,"SELLFARE":28}},"CheapestSell":{"t":{"BASEFARE":22,"SELLFARE":25},"e":{"BASEFARE":26,"SELLFARE":28}}}
chosenprovidersell: t
choseninventorybeforeadjusted: [{"hRoomId":1084,"BASEFARE":220,"SELLFARE":2000},{"hRoomId":10320,"BASEFARE":250,"SELLFARE":280},{"hRoomId":10372,"BASEFARE":240,"SELLFARE":200},{"hRoomId":103273,"BASEFARE":3850,"SELLFARE":300},{"hRoomId":1076,"BASEFARE":350,"SELLFARE":300},{"hRoomId":10275,"BASEFARE":380,"SELLFARE":350},{"hRoomId":1074,"BASEFARE":360,"SELLFARE":420},{"hRoomId":1069,"BASEFARE":430,"SELLFARE":500},{"hRoomId":1084,"BASEFARE":440,"SELLFARE":50},{"hRoomId":10370,"BASEFARE":490,"SELLFARE":500},{"hRoomId":1032,"BASEFARE":400,"SELLFARE":500},{"hRoomId":1036,"BASEFARE":610,"SELLFARE":710},{"hRoomId":1073,"BASEFARE":610,"SELLFARE":710},{"hRoomId":1035,"BASEFARE":61,"SELLFARE":710},{"hRoomId":1034,"BASEFARE":730,"SELLFARE":80},{"hRoomId":1029,"BASEFARE":800,"SELLFARE":100},{"hRoomId":10269,"BASEFARE":800,"SELLFARE":100},{"hRoomId":101,"BASEFARE":9500,"SELLFARE":100},{"hRoomId":109,"BASEFARE":1700,"SELLFARE":200},{"hRoomId":1071,"BASEFARE":1900,"SELLFARE":20}]
pricingrules: {"t_l":22000900002,"hbeds":2200000002,"t_p":2200000002,"t_m":22000000002,"e_private":22000900002,"t":22000000002,"e":222,"hbeds_ratebinding":220000002,"t_budgetrooms":22000}
cindate: 01-01-2012
coutdate: 12-12-2017
machineid: 416
interface: ["MOBILE_APPS_ANDROID"]
pricingspec: {"isB":false,"searchDate":14700,"hTransactionFrequencyStatus":"","userLocale":"id_ID","isBackDate":false,"currency":"VND","hTransactionFrequency":10,"roomCount":"1","hUserType":"NON_LOGGED_IN_USER","lengthOfStay":"1","fTransactionRecency":10,"userGeoCountry":"Australia","abPriceAdjustment":"treatmentGroup","searchTime":530,"bookingWindowInDays":1,"roomNight":"1","hSearchFunnel":"LOWER_FUNNEL","cInDate":147000,"cOutDate":1476032400000,"searchDay":"6","fTransactionFrequency":10,"fTransactionFrequencyStatus":"NONE","cInDay":"7","hGrouping":"1,93,41,122","hIds":"2000000369344","hTransactionRecency":10,"clientType":"MOBILE"}
elapsedtime: 5
choseninventoryafteradjusted: [{"hRoomId":1086174,"BASEFARE":22150,"SELLFARE":25000},{"hRoomId":103270,"BASEFARE":249,"SELLFARE":2800},{"hRoomId":103272,"BASEFARE":2470,"SELLFARE":200},{"hRoomId":100273,"BASEFARE":3050,"SELLFARE":3500},{"hRoomId":10376,"BASEFARE":3050,"SELLFARE":3500},{"hRoomId":10375,"BASEFARE":3050,"SELLFARE":3500},{"hRoomId":10374,"BASEFARE":367,"SELLFARE":4250},{"hRoomId":1069,"BASEFARE":430,"SELLFARE":500},{"hRoomId":108634,"BASEFARE":44700,"SELLFARE":5000},{"hRoomId":10270,"BASEFARE":400,"SELLFARE":570},{"hRoomId":102,"BASEFARE":400,"SELLFARE":5700},{"hRoomId":1026,"BASEFARE":610,"SELLFARE":70},{"hRoomId":1033,"BASEFARE":610,"SELLFARE":70},{"hRoomId":1075,"BASEFARE":60,"SELLFARE":10},{"hRoomId":1074,"BASEFARE":730,"SELLFARE":80},{"hRoomId":1039,"BASEFARE":870,"SELLFARE":10},{"hRoomId":1269,"BASEFARE":800,"SELLFARE":10000},{"hRoomId":10271,"BASEFARE":9500,"SELLFARE":1100},{"hRoomId":1039,"BASEFARE":17000,"SELLFARE":2000},{"hRoomId":1271,"BASEFARE":1900,"SELLFARE":200}]
chosenproviderbase: t
ffrequency: ["NONE"]
kafkapt: 1475919104430
kafkast: 1475919656986
cookieid: (null)
sessionid: (null)
pricingspecabpriceadjustment: (null)
searchid: (null)
prevsearchid: (null)
competitorrequest: (null)
cpricingrule: {}
cstatisticchosenmethod: none
chosencid: (null)
chosencpricingrule: (null)
chosencpricetype: (null)
cpricediff: 10.0
competitorresponse: (null)
searchratetype: (null)
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+--------------------+------------+---------------+---------------+----------+-----------+------------------------------+----------+--------------+-------------------+--------------+------------------------+-----------+--------------------+------------------+------------+--------------------+----------------+
There are 2 issues here.

1. DDL issue

java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException:
java.io.IOException: Field name expected

ChosenInventoryBeforeAdjusted and ChosenInventoryAfterAdjusted cannot be defined as strings. They should be defined as the complex type they actually are:

array<struct<hRoomId:int,BASEFARE:int,SELLFARE:int>>

2. Data issue

java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException:
org.codehaus.jackson.JsonParseException: Invalid numeric value:
Leading zeroes not allowed

JSON numeric values cannot have leading zeroes, e.g. this SELLFARE value:
"hRoomId": 1075,
"BASEFARE": 60,
"SELLFARE": 000

go mysql returning null values

I am currently working on a Golang Google App Engine project and have run into a small problem. I have a database "party" with a table "parties". The problem is that when the following code is executed, an empty JSON array is printed: it actually has the proper length, but it contains only empty Parties. (And I do have entries in my database.)
Go code (not all of it):
func getParties(w http.ResponseWriter, r *http.Request) {
    rows := getRowsFromSql("select * from parties;")
    parties := scanForParties(rows)
    json, _ := json.Marshal(parties)
    fmt.Fprint(w, string(json))
}

func scanForParties(rows *sql.Rows) []Party {
    var parties []Party
    for rows.Next() {
        var id int
        var name, author, datetime, datetime_to, host, location, description, longtitude, latitude, primary_image_id string
        rows.Scan(&id, &name, &author, &datetime, &datetime_to, &host, &location, &description, &longtitude, &latitude, &primary_image_id)
        party := Party{
            Id:           id,
            Name:         name,
            Author:       author,
            Datetime:     datetime,
            Datetime_to:  datetime_to,
            Host:         host,
            Location:     location,
            Description:  description,
            Longtitude:   longtitude,
            Latitude:     latitude,
            PrimaryImgId: primary_image_id,
        }
        parties = append(parties, party)
    }
    return parties
}

func getRowsFromSql(query string) *sql.Rows {
    con, err := sql.Open("mysql", dbConnectString)
    if err != nil {
        panic(err)
    }
    defer con.Close()
    rows, err2 := con.Query(query)
    if err != nil {
        panic(err2)
    }
    return rows
}

type Party struct {
    Id           int
    Name         string
    Author       string
    Datetime     string
    Datetime_to  string
    Host         string
    Location     string
    Description  string
    Longtitude   string
    Latitude     string
    PrimaryImgId string
}
type Party struct {
Id int
Name string
Author string
Datetime string
Datetime_to string
Host string
Location string
Description string
Longtitude string
Latitude string
PrimaryImgId string
}
And my parties table:
mysql> describe parties;
+------------------+----------------+------+-----+-------------------+-----------------------------+
| Field | Type | Null | Key | Default | Extra |
+------------------+----------------+------+-----+-------------------+-----------------------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| name | varchar(64) | NO | | | |
| author | varchar(64) | YES | | NULL | |
| datetime | datetime | YES | | NULL | |
| last_edited | timestamp | NO | | CURRENT_TIMESTAMP | on update CURRENT_TIMESTAMP |
| datetime_to | datetime | YES | | NULL | |
| host | text | YES | | NULL | |
| location | text | YES | | NULL | |
| description | text | YES | | NULL | |
| longitude | decimal(23,20) | YES | | NULL | |
| latitude | decimal(23,20) | YES | | NULL | |
| primary_image_id | varchar(256) | YES | | NULL | |
+------------------+----------------+------+-----+-------------------+-----------------------------+
However, this old version of code works just fine:
func getParties(w http.ResponseWriter, r *http.Request) {
    con, dbErr := sql.Open("mysql", dbConnectString)
    defer con.Close()
    if dbErr == nil {
        rows, _ := con.Query("select id, name, author, datetime from parties where datetime >= NOW();")
        var parties []Party
        var id int
        var name string
        var author string
        var datetime string
        for rows.Next() {
            rows.Scan(&id, &name, &author, &datetime)
            party := Party{}
            party.Id = id
            party.Name = name
            party.Author = author
            party.Datetime = datetime
            parties = append(parties, party)
        }
        if len(parties) > 0 {
            json, _ := json.Marshal(parties)
            fmt.Fprint(w, string(json))
        } else {
            fmt.Fprint(w, "{}")
        }
    } else {
        fmt.Fprint(w, "{\"Error\"}")
    }
}
Any idea why this happens?
Thanks in advance :)
This is a guess, but I'm thinking that it's because you're closing the connection to the database here:
defer con.Close()
This will close the connection to the database when getRowsFromSql returns, so by the time you start calling rows.Next() in scanForParties the DB connection is gone. Once the DB connection is closed, any collection of rows will no longer be available.
Something is probably returning an error because of this, but since you're not checking any errors anywhere you won't know. In Go it is idiomatic to check for errors whenever a function can return one (and other languages too, just more so in Go because of the lack of exceptions).
Okay, so all the others were right about the errors: rows.Scan() returns an error, and when I finally checked it, it said that insufficient scan variables were provided. Simple fix: add the missing one (select * returns 12 columns, including last_edited, but I was only scanning 11).
Thank you guys :)

Ruby & MySQL: How to handle missing elements while parsing XML file

Currently I am trying to parse a large XML file. Here is what my XML file looks like:
<post>
<row Id="22" PostTypeId="2" ParentId="9" CreationDate="2008-08-01T12:07:19.500" Score="7" Body="<p>The best way that I know of because of leap years and everything is:</p>
<pre><code>DateTime birthDate = new DateTime(2000,3,1);<br>int age = (int)Math.Floor((DateTime.Now - birthDate).TotalDays / 365.25D);<br></code></pre>
<p>Hope this helps.</p>" OwnerUserId="17" LastEditorUserId="17" LastEditorDisplayName="Nick" LastEditDate="2008-08-01T15:26:37.087" LastActivityDate="2008-08-01T15:26:37.087" CommentCount="1" CommunityOwnedDate="2011-08-16T19:40:43.080" />
<row Id="29" PostTypeId="2" ParentId="13" CreationDate="2008-08-01T12:19:17.417" Score="18" Body="<p>There are no HTTP headers that will report the clients timezone so far although it has been suggested to include it in the HTTP specification.</p>
<p>If it was me, I would probably try to fetch the timezone using clientside JavaScript and then submit it to the server using Ajax or something.</p>" OwnerUserId="19" LastActivityDate="2008-08-01T12:19:17.417" CommentCount="0" />
</post>
The difference between these two records in this XML file is that the second one doesn't have a LastEditDate attribute. I believe as a result of that I get the following error:
/ruby/1.9.2/ubuntuamd1/lib/ruby/1.9.1/date/format.rb:1031:in `dup': can't dup NilClass (TypeError)
from /soft/ruby/1.9.2/ubuntuamd1/lib/ruby/1.9.1/date/format.rb:1031:in `_parse'
from /soft/ruby/1.9.2/ubuntuamd1/lib/ruby/1.9.1/date.rb:1732:in `parse'
from load.rb:105:in `on_start_element'
from load.rb:165:in `parse'
Here is the code segment that is being referred to:
if element == 'row'
  @post_st.execute(attributes['Id'], attributes['PostTypeId'], attributes['AcceptedAnswerId'], attributes['ParentId'], attributes['Score'], attributes['ViewCount'],
    attributes['Body'], attributes['OwnerUserId'] == nil ? -1 : attributes['OwnerUserId'], attributes['LastEditorUserId'], attributes['LastEditorDisplayName'],
    DateTime.parse(attributes['LastEditDate']).to_time.strftime("%F %T"), DateTime.parse(attributes['LastActivityDate']).to_time.strftime("%F %T"), attributes['Title'] == nil ? '' : attributes['Title'],
    attributes['AnswerCount'] == nil ? 0 : attributes['AnswerCount'], attributes['CommentCount'] == nil ? 0 : attributes['CommentCount'],
    attributes['FavoriteCount'] == nil ? 0 : attributes['FavoriteCount'], DateTime.parse(attributes['CreationDate']).to_time.strftime("%F %T"))
  post_id = attributes['Id']
Furthermore, I think this is the line where I look up LastEditDate:
DateTime.parse(attributes['LastEditDate']).to_time.strftime("%F %T"), DateTime.parse(attributes['LastActivityDate']).to_time.strftime("%F %T"), attributes['Title'] == nil ? '' : attributes['Title']
I guess I get the above-mentioned error because the attribute doesn't exist. How do I handle this scenario: if an attribute doesn't exist, set it to a default value? This matters because while parsing these records I insert them into a MySQL database, which has the following table structure:
+--------------------------+--------------+------+-----+---------------------+-----------------------------+
| Field | Type | Null | Key | Default | Extra |
+--------------------------+--------------+------+-----+---------------------+-----------------------------+
| id | int(11) | NO | PRI | NULL | |
| post_type_id | int(11) | NO | | NULL | |
| accepted_answer_id | int(11) | YES | | NULL | |
| parent_id | int(11) | YES | MUL | NULL | |
| score | int(11) | YES | | NULL | |
| view_count | int(11) | YES | | NULL | |
| body_text | text | YES | | NULL | |
| owner_id | int(11) | NO | | NULL | |
| last_editor_user_id | int(11) | YES | | NULL | |
| last_editor_display_name | varchar(40) | YES | | NULL | |
| last_edit_date | timestamp | NO | | CURRENT_TIMESTAMP | on update CURRENT_TIMESTAMP |
| last_activity_date | timestamp | NO | | 0000-00-00 00:00:00 | |
| title | varchar(256) | NO | | NULL | |
| answer_count | int(11) | NO | | NULL | |
| comment_count | int(11) | NO | | NULL | |
| favorite_count | int(11) | NO | | NULL | |
| created | timestamp | NO | | 0000-00-00 00:00:00 | |
+--------------------------+--------------+------+-----+---------------------+-----------------------------+
I have set up last_edit_date as a NOT NULL column.
Based on the answer provided I made the change, but the error remains the same:
def convert_to_mysql_time(date='1973-01-01T01:01:01.000')
  DateTime.parse(date).to_time.strftime("%F %T")
end

def on_start_element(element, attributes)
  if element == 'row'
    @post_st.execute(attributes['Id'], attributes['PostTypeId'], attributes['AcceptedAnswerId'], attributes['ParentId'], attributes['Score'], attributes['ViewCount'],
      attributes['Body'], attributes['OwnerUserId'] == nil ? -1 : attributes['OwnerUserId'], attributes['LastEditorUserId'], attributes['LastEditorDisplayName'],
      convert_to_mysql_time(attributes['LastEditDate']), DateTime.parse(attributes['LastActivityDate']).to_time.strftime("%F %T"), attributes['Title'] == nil ? '' : attributes['Title'],
      attributes['AnswerCount'] == nil ? 0 : attributes['AnswerCount'], attributes['CommentCount'] == nil ? 0 : attributes['CommentCount'],
      attributes['FavoriteCount'] == nil ? 0 : attributes['FavoriteCount'], DateTime.parse(attributes['CreationDate']).to_time.strftime("%F %T"))
    post_id = attributes['Id']
Here is the error:
/ruby/1.9.2/ubuntuamd1/lib/ruby/1.9.1/date/format.rb:1031:in `dup': can't dup NilClass (TypeError)
from /soft/ruby/1.9.2/ubuntuamd1/lib/ruby/1.9.1/date/format.rb:1031:in `_parse'
from /soft/ruby/1.9.2/ubuntuamd1/lib/ruby/1.9.1/date.rb:1732:in `parse'
from load.rb:102:in `convert_to_mysql_time'
from load.rb:109:in `on_start_element'
from load.rb:169:in `parse'
from load.rb:169:in `<main>'
I would write a method that converts string dates to MySQL dates and supplies a default value when nil is passed to it, e.g.:
def convert_to_my_sql_date(date)
  # Use the default when an empty string is supplied; the rescue makes
  # arguments that do not respond to empty? (such as nil) fall back to
  # the default date as well.
  date = '1973-01-01T01:01:01.000' if (date.empty? rescue true)
  DateTime.parse(date).to_time.strftime("%F %T")
end
So when the date is nil it uses the default. You can then call it in your method like this:
convert_to_my_sql_date(attributes['LastEditDate'])
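As a side note on why the asker's retry still crashed: a Ruby default parameter value only applies when the argument is omitted entirely; an explicit nil overrides it, so convert_to_mysql_time(attributes['LastEditDate']) passed nil straight into DateTime.parse. A small sketch of this behavior and an explicitly nil-safe variant (method names here are illustrative):

```ruby
require 'date'

# Mirrors the asker's signature: the default is used only when the
# argument is omitted, NOT when nil is passed explicitly.
def with_default(date = '1973-01-01T01:01:01.000')
  date
end

# Nil-safe variant: fall back whenever the value is nil or empty.
def convert_to_mysql_time(date)
  date = '1973-01-01T01:01:01.000' if date.nil? || date.empty?
  DateTime.parse(date).to_time.strftime("%F %T")
end

puts with_default.inspect       # => "1973-01-01T01:01:01.000"
puts with_default(nil).inspect  # => nil -- the explicit nil wins
puts convert_to_mysql_time(nil) # falls back to the default safely
puts convert_to_mysql_time('2008-08-01T15:26:37.087')
```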