How can I parse this .Net JSON date with Go?
The value comes back unassigned.
It appears to parse up to the date field.
package main

import (
    "encoding/json"
    "fmt"
    "time"
)

type MyStruct struct {
    FirstField string
    SomeTime   time.Time
    LastField  string
}

type MyStructSlice struct {
    MyStructs []MyStruct
}

func main() {
    var s MyStructSlice
    str := `{"MyStructs":[{"FirstField":"123", "SomeTime":"\/Date(1432187580000-0500)\/", "LastField":"456"}]}`
    json.Unmarshal([]byte(str), &s)
    fmt.Println(s)
}
Go Playground
I am going to provide a few suggestions. You will have to write the code yourself though ;)
First of all, is it possible to change the .NET application that produced this JSON so it generates something more parseable? If you make it output the datetime in RFC 3339 format (something like 1990-12-31T15:59:12-08:00), Go will automatically convert it to a time.Time instance thanks to http://golang.org/pkg/time/#Time.UnmarshalJSON
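For instance, a tiny illustration of that automatic conversion (the struct and value here are made up just for the example):

type Event struct {
    When time.Time
}

var e Event
// time.Time's UnmarshalJSON expects an RFC 3339 string, so this just works:
err := json.Unmarshal([]byte(`{"When":"1990-12-31T15:59:12-08:00"}`), &e)
// err is nil and e.When holds the parsed time.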
If you cannot change the client then you will have to parse this date yourself:
extract the time part (1432187580000) from the string. This looks like the number of milliseconds (ms) since the UNIX epoch. You can convert it to a time.Time instance using time.Unix(sec, nsec).
(Optional.) The time created in the last step already accurately represents a point in time. However, if you want to attach the original timezone to it (e.g. to print it), you will need to:
parse the offset part (-0500) from the string,
create a time.FixedZone,
and call http://golang.org/pkg/time/#Time.In on the time.Time created in the first step.
Example: http://play.golang.org/p/Pkahyg2vZa
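Along the lines of the linked example, a minimal sketch of those steps might look like this (the helper name and error handling are mine, not part of the original answer):

func parseDotNetTime(s string) (time.Time, error) {
    // s looks like "/Date(1432187580000-0500)/"
    start := strings.Index(s, "(")
    end := strings.Index(s, ")")
    if start < 0 || end < 0 {
        return time.Time{}, errors.New("invalid format")
    }
    inner := s[start+1 : end] // "1432187580000-0500"

    // Split the milliseconds from the optional +hhmm/-hhmm offset.
    sign := strings.LastIndexAny(inner, "+-")
    msPart, offPart := inner, ""
    if sign > 0 {
        msPart, offPart = inner[:sign], inner[sign:]
    }

    ms, err := strconv.ParseInt(msPart, 10, 64)
    if err != nil {
        return time.Time{}, err
    }
    t := time.Unix(0, ms*int64(time.Millisecond))

    // Optionally apply the original offset so it prints the same way.
    if offPart != "" {
        off, err := strconv.ParseInt(offPart, 10, 64)
        if err != nil {
            return time.Time{}, err
        }
        secs := int(off/100)*3600 + int(off%100)*60
        t = t.In(time.FixedZone("", secs))
    }
    return t, nil
}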
First, your model is wrong: it doesn't model the data structure in your JSON. It should be:
type Data struct {
    Data []MyStruct `json:"data"`
}

type MyStruct struct {
    SomeTime string
}
It works with this; try it on the Go Playground.
The problem is that we still have the time as a string.
Now if you want SomeTime to be of type time.Time, you need to parse it yourself, which you can do by implementing json.Unmarshaler:
type Data struct {
    Data []MyStruct `json:"data"`
}

type MyStruct struct {
    SomeTime time.Time
}

func (m *MyStruct) UnmarshalJSON(data []byte) error {
    // First unmarshal it into a string:
    ms := struct{ SomeTime string }{}
    if err := json.Unmarshal(data, &ms); err != nil {
        return err
    }
    s := ms.SomeTime

    // s is of the format "/Date(1432187580000-0500)/"
    // Extract the millis and the timezone offset:
    i1 := strings.Index(s, "(")
    i3 := strings.Index(s, ")")
    i2 := strings.Index(s, "-")
    if i2 < 0 {
        i2 = strings.Index(s, "+")
    }
    if i1 < 0 || i2 < 0 || i3 < 0 {
        return errors.New("invalid format")
    }

    millis, err := strconv.ParseInt(s[i1+1:i2], 10, 64)
    if err != nil {
        return err
    }
    m.SomeTime = time.Unix(0, millis*1000000)

    // Apply the timezone. The offset is of the form -0500 / +0530
    // (hours and minutes), so convert it to seconds accordingly:
    offset, err := strconv.ParseInt(s[i2:i3], 10, 64)
    if err != nil {
        return err
    }
    zone := time.FixedZone("something", int(offset/100)*3600+int(offset%100)*60)
    m.SomeTime = m.SomeTime.In(zone)
    return nil
}
Try it on the Go Playground.
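For completeness, a small usage sketch (assuming the data key used in the model above):

func main() {
    var d Data
    str := `{"data":[{"SomeTime":"\/Date(1432187580000-0500)\/"}]}`
    if err := json.Unmarshal([]byte(str), &d); err != nil {
        fmt.Println(err)
        return
    }
    fmt.Println(d.Data[0].SomeTime)
}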
The value comes back unassigned for many reasons. First of all, your MyStruct does not look right. Your JSON has a data key as the parent key, which contains an array of objects.
But your struct, for some reason, does not even resemble this. It has to have the date as a field, at least. So make your struct look like the JSON you are receiving.
Another thing is that time.Time will not be able to parse a string like Date(1432187580000-0500) into a normal date.
P.S. Now that you have updated your struct to the normal form, you have to find a way to parse your strange string into a date. If you have control over your .NET application, I would rather recommend changing the JSON to a normal timestamp, or something that can easily be parsed.
If not, then you have to change SomeTime time.Time to SomeTime string, parse that string into a normal timestamp, and then parse the timestamp into a date.
Related
I have a JSON object that contains an implementation of an interface within it. I'm attempting to take that JSON and unmarshal it into a struct whilst creating the implementation of the interface.
I've managed to get it to implement the interface with a custom JSON unmarshal function; however, I'm struggling to piece together how to then unmarshal the rest of the fields.
I've created an example in the Go playground
https://play.golang.org/p/ztF7H7etdjM
My JSON being passed into my application is
{
    "address":"1FYuJ4MsVmpzPoFJ6svJMJfygn91Eubid9",
    "nonce":13,
    "network_id":"qadre.demo.balance",
    "challenge":"f2b19e71876c087e681fc092ea3a34d5680bbfe772e40883563e1d5513bb593f",
    "type":"verifying_key",
    "verifying_key":{
        "verifying_key":"3b6a27bcceb6a42d62a3a8d02a6f0d73653215771de243a63ac048a18b59da29",
        "fqdn":"huski.service.key"
    },
    "signature":"a3bf8ee202a508d5a5632f50b140b70b7095d8836493dc7ac4159f6f3350280078b3a58b2162a240bc8c7485894554976a9c7b5d279d3f5bf49fec950f024e02",
    "fqdn":"huski.service.SingleKeyProof"
}
I've attempted to do a json.Unmarshal and pass in a new struct for the remaining fields; however, it seems to put me in an infinite loop, and my application hangs and then crashes.
The best solution I've come up with so far is to unmarshal the JSON into a map[string]interface{} and handle each field separately, but this feels very clunky.
var m map[string]interface{}
if err := json.Unmarshal(data, &m); err != nil {
    return err
}
ad, ok := m["address"]
if ok {
    s.Address = ad.(string)
}
fqdn, ok := m["fqdn"]
if ok {
    s.FQDN = fqdn.(string)
}
n, ok := m["nonce"]
if ok {
    s.Nonce = int64(n.(float64))
}
c, ok := m["challenge"]
if ok {
    s.Challenge = []byte(c.(string))
}
network, ok := m["network_id"]
if ok {
    s.NetworkID = network.(string)
}
sig, ok := m["signature"]
if ok {
    s.Signature = []byte(sig.(string))
}
The reason your code gets into an infinite loop when you try to unmarshal the rest of the fields is, I presume, that the implementation of UnmarshalJSON, after it's done unmarshaling the verifying key, calls json.Unmarshal with the receiver, which in turn calls the UnmarshalJSON method on the receiver, and so they invoke each other ad infinitum.
What you can do is to create a temporary type using the existing type as its definition, this will "keep the structure" but "drop the methods", then unmarshal the rest of the fields into an instance of the new type, and, after unmarshal is done, convert the instance to the original type and assign that to the receiver.
While this fixes the infinite loop, it also re-introduces the original problem of json.Unmarshal not being able to unmarshal into a non-empty interface type. To fix that you can embed the new type in another temporary struct that has a field with the same json tag as the problematic field which will cause it to be "overshadowed" while json.Unmarshal is doing its work.
type SingleKey struct {
    FQDN         string    `json:"fqdn"`
    Address      string    `json:"address"`
    Nonce        int64     `json:"nonce"`
    Challenge    []byte    `json:"challenge"`
    NetworkID    string    `json:"network_id"`
    Type         string    `json:"type"`
    VerifyingKey PublicKey `json:"verifying_key"`
    Signature    []byte    `json:"signature"`
}

func (s *SingleKey) UnmarshalJSON(data []byte) error {
    type _SingleKey SingleKey
    var temp struct {
        RawKey json.RawMessage `json:"verifying_key"`
        _SingleKey
    }
    if err := json.Unmarshal(data, &temp); err != nil {
        return err
    }
    *s = SingleKey(temp._SingleKey)

    switch s.Type {
    case "verifying_key":
        s.VerifyingKey = &PublicKeyImpl{}
        // other cases ...
    }
    return json.Unmarshal([]byte(temp.RawKey), s.VerifyingKey)
}
https://play.golang.org/p/L3gdQZF47uN
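For reference, a small usage sketch; this assumes, as in the linked playground, that PublicKey is an interface and PublicKeyImpl is a struct implementing it with verifying_key and fqdn JSON fields (the sample values below are my own abbreviations):

data := []byte(`{
    "type": "verifying_key",
    "fqdn": "huski.service.SingleKeyProof",
    "verifying_key": {
        "verifying_key": "3b6a27bc...",
        "fqdn": "huski.service.key"
    }
}`)

var k SingleKey
if err := json.Unmarshal(data, &k); err != nil {
    log.Fatal(err)
}
fmt.Printf("%+v\n", k.VerifyingKey) // the concrete *PublicKeyImpl, filled in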
Looking at what you've done in your custom unmarshalling function, you seem to be passing in a map with the names of fields as keys, and the reflect.Type you want to unmarshal each value into. That, to me, suggests that the keys might be different for different payloads, but that each key has a distinct type associated with it. You can handle data like this perfectly well with a simple wrapper type:
type WrappedSingleKey struct {
    FQDN         string          `json:"fqdn"`
    Address      string          `json:"address"`
    Nonce        int64           `json:"nonce"`
    Challenge    []byte          `json:"challenge"`
    NetworkID    string          `json:"network_id"`
    Type         string          `json:"type"`
    VerifyingKey json.RawMessage `json:"verifying_key"`
    OtherKey     json.RawMessage `json:"other_key"`
    Signature    []byte          `json:"signature"`
}

type SingleKey struct {
    FQDN         string     `json:"fqdn"`
    Address      string     `json:"address"`
    Nonce        int64      `json:"nonce"`
    Challenge    []byte     `json:"challenge"`
    NetworkID    string     `json:"network_id"`
    Type         string     `json:"type"`
    VerifyingKey *PublicKey `json:"verifying_key,omitempty"`
    OtherType    *OtherKey  `json:"other_key,omitempty"`
    Signature    []byte     `json:"signature"`
}
So I've changed the type of your VerifyingKey field to a json.RawMessage. That basically tells json.Unmarshal to leave that field as raw JSON input. For every custom/optional field, add a corresponding RawMessage field.
In the unwrapped type, I've changed VerifyingKey to a pointer and added the omitempty bit to the tag. That's just to accommodate multiple types, and not have to worry about custom marshalling to avoid empty fields, like the included OtherType field I have. To get what you need, then:
func (s *SingleKey) UnmarshalJSON(data []byte) error {
    w := WrappedSingleKey{} // create wrapped instance
    if err := json.Unmarshal(data, &w); err != nil {
        return err
    }

    switch w.Type {
    case "verifying_key":
        var pk PublicKey
        if err := json.Unmarshal([]byte(w.VerifyingKey), &pk); err != nil {
            return err
        }
        s.VerifyingKey = &pk // assign
    case "other_key":
        var ok OtherKey
        if err := json.Unmarshal([]byte(w.OtherKey), &ok); err != nil {
            return err
        }
        s.OtherType = &ok
    }

    // copy over the fields that didn't require anything special
    s.FQDN = w.FQDN
    s.Address = w.Address
    s.Nonce = w.Nonce
    s.Challenge = w.Challenge
    s.NetworkID = w.NetworkID
    s.Type = w.Type
    s.Signature = w.Signature

    return nil
}
This is a fairly simple approach, does away with the reflection and tons of functions, and is quite commonly used. It lends itself quite well to code generation, too. The individual assignment of the fields is a bit tedious, though. You might think you can solve that by embedding the SingleKey type into the wrapper, but be careful: that will recursively call your custom unmarshaller function.
You could, for example, make all the fields in the wrapped type pointers, and have them point to the fields on your actual type. That does away with the manual copying of fields... It's up to you, really.
Note
I didn't test this code, just wrote it as I went along. It's something I've used in the past, and I believe what I wrote here should work, but no guarantees (as in: you might need to debug it a bit)
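To illustrate that last pointer idea, here is a hypothetical sketch of mine (not from the answer above); the wrapper's plain fields point back into the target struct, so json.Unmarshal fills SingleKey's fields directly and only the raw-message field still needs special handling:

func (s *SingleKey) UnmarshalJSON(data []byte) error {
    w := struct {
        FQDN         *string         `json:"fqdn"`
        Address      *string         `json:"address"`
        Type         *string         `json:"type"`
        VerifyingKey json.RawMessage `json:"verifying_key"`
        // ... remaining fields, same idea
    }{
        FQDN:    &s.FQDN,
        Address: &s.Address,
        Type:    &s.Type,
    }
    if err := json.Unmarshal(data, &w); err != nil {
        return err
    }
    switch s.Type {
    case "verifying_key":
        var pk PublicKey
        if err := json.Unmarshal(w.VerifyingKey, &pk); err != nil {
            return err
        }
        s.VerifyingKey = &pk
    }
    return nil
}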
Background
I am learning Go and I'm trying to do some JSON unmarshaling of a datetime.
I have some JSON produced by a program I wrote in C; I am outputting what I thought was a valid ISO 8601 / RFC 3339 timezone offset. I'm using strftime with the following format string:
%Y-%m-%dT%H:%M:%S.%f%z
(Note that %f is not supported by strftime natively, I have a wrapper that replaces it with the nanoseconds).
This will then produce the following result:
2016-08-08T21:35:14.052975+0200
Unmarshaling this in Go however will not work:
https://play.golang.org/p/vzOXbzAwdW
package main

import (
    "fmt"
    "time"
)

func main() {
    t, err := time.Parse(time.RFC3339Nano, "2016-08-08T21:35:14.052975+0200")
    if err != nil {
        panic(err)
    }
    fmt.Println(t)
}
Output:
panic: parsing time "2016-08-08T21:35:14.052975+0200" as "2006-01-02T15:04:05.999999999Z07:00": cannot parse "+0200" as "Z07:00"
(Working example: https://play.golang.org/p/5xcM0aHsSw)
This is because RFC 3339 expects the timezone offset in the form +02:00 (with a colon), but strftime outputs it as +0200.
So I need to fix this in my C program to output the correct format.
%z The +hhmm or -hhmm numeric timezone (that is, the hour and
minute offset from UTC). (SU)
Question
However, now I have a bunch of JSON files with this incorrect format:
2016-08-08T21:35:14.052975+0200
instead of the correct (with the : in the timezone offset):
2016-08-08T21:35:14.052975+02:00
but I still want to be able to unmarshal it correctly in my Go program. Preferably, two JSON files differing only in this detail should parse in exactly the same way.
Regarding marshaling back to JSON, the correct format should be used.
This is how I have defined it in my struct:
Time time.Time `json:"time"`
So the question is, what is the "Go" way of doing this?
Also, in my code example I am using RFC3339Nano. How would I specify that in the struct tag as well? As I have it now, with just json:"time", will that ignore the nanoseconds?
You can define your own time field type that supports both formats:
type MyTime struct {
    time.Time
}

func (m *MyTime) UnmarshalJSON(b []byte) (err error) {
    s := string(b)

    // Get rid of the quotes "" around the value.
    // A second option would be to include them
    // in the date format string instead, like so below:
    // time.Parse(`"`+time.RFC3339Nano+`"`, s)
    s = s[1 : len(s)-1]

    t, err := time.Parse(time.RFC3339Nano, s)
    if err != nil {
        t, err = time.Parse("2006-01-02T15:04:05.999999999Z0700", s)
    }
    m.Time = t
    return
}

type Test struct {
    Time MyTime `json:"time"`
}
Try on Go Playground
In the example above we take the predefined format time.RFC3339Nano, which is defined like this:
RFC3339Nano = "2006-01-02T15:04:05.999999999Z07:00"
and remove the :
"2006-01-02T15:04:05.999999999Z0700"
This time format used by time.Parse is described here:
https://golang.org/pkg/time/#pkg-constants
Also see the documentation for time.Parse
https://golang.org/pkg/time/#Parse
P.S. The year 2006 in Go's time layout strings is not a release year (Go was first released in 2009). The reference time Mon Jan 2 15:04:05 MST 2006 was chosen because its components, written in US order, read 01/02 03:04:05PM '06 -0700, i.e. 1 2 3 4 5 6 7, which makes layouts easy to remember.
You can try https://play.golang.org/p/IsUpuTKENg
package main

import (
    "fmt"
    "time"
)

func main() {
    t, err := time.Parse("2006-01-02T15:04:05.999999999Z0700", "2016-08-08T21:35:14.052975-0200")
    if err != nil {
        panic(err)
    }
    fmt.Println(t)
}
I'm trying to make a simple tool which parses JSON-formatted lines in a file and performs an INSERT operation into a database.
I have a struct which looks like this:
type DataBlob struct {
    ....
    Datetime time.Time `json:"datetime, string"`
    ....
}
And parsing code which looks like this:
scanner := bufio.NewScanner(file)

// Loop through all lines in the file
for scanner.Scan() {
    var t DataBlob

    // Decode the line, parse the JSON
    dec := json.NewDecoder(strings.NewReader(scanner.Text()))
    if err := dec.Decode(&t); err != nil {
        panic(err)
    }

    // Perform the database operation
    executionString := "INSERT INTO observations (datetime) VALUES ($1)"
    _, err := db.Exec(executionString, t.Datetime)
    if err != nil {
        panic(err)
    }
}
My JSON file has lines, each containing a datetime value that looks like this:
{ "datetime": 1465793854 }
When the datetime is formatted as a Unix timestamp, the unmarshaler complains:
panic: parsing time "1465793854" as ""2006-01-02T15:04:05Z07:00"": cannot parse "1465793854" as """
In the script that generates the JSON (also written in Go), I tried simply printing the string representation of the time.Time value, producing the following:
{ "datetime": "2016-06-13 00:23:34 -0400 EDT" }
To which the unmarshaler complains when I go to parse it:
panic: parsing time ""2016-06-13 00:23:34 -0400 EDT"" as ""2006-01-02T15:04:05Z07:00"": cannot parse " 00:23:34 -0400 EDT"" as "T"
If I instead treat this timestamp (which looks pretty standard) as a string and avoid the unmarshaling problem, Postgres complains when I try to perform the insertion:
panic: pq: invalid input syntax for type timestamp: "2016-06-13 00:23:34 -0400 EDT"
This is frustrating on a number of levels, but mainly because if I serialize a time.Time value, I would think it should still be understood on the other side of the process.
How can I go about parsing this timestamp to perform the database insertion? Apologies for the lengthy question and thanks for your help!
JSON unmarshalling of time.Time expects the date string to be in RFC 3339 format.
So in your Go program that generates the JSON, instead of simply printing the time.Time value, use Format to print it in RFC 3339 format:
t.Format(time.RFC3339)
if I serialize a time.Time type, I would think it should still be understood at the other side of the process
If you use the Marshaler interface when serializing, it will indeed output the date in RFC 3339 format, so the other side of the process will understand it. You can do that as well:
d := DataBlob{Datetime: t}
enc := json.NewEncoder(fileWriter)
enc.Encode(d)
For reference, if you need custom unmarshalling with time types you need to create your own type with the UnmarshalJSON method. Here's an example:
type Timestamp struct {
    time.Time
}

// UnmarshalJSON decodes an int64 timestamp into a time.Time object
func (p *Timestamp) UnmarshalJSON(bytes []byte) error {
    // 1. Decode the bytes into an int64
    var raw int64
    err := json.Unmarshal(bytes, &raw)
    if err != nil {
        fmt.Printf("error decoding timestamp: %s\n", err)
        return err
    }

    // 2. Parse the unix timestamp
    p.Time = time.Unix(raw, 0)
    return nil
}
Then use the type in your struct:
type DataBlob struct {
    ....
    Datetime Timestamp `json:"datetime"`
    ....
}
I have a JSON document and I'm using a client which decodes the document into an interface (instead of a struct), as below:
var jsonR interface{}
err = json.Unmarshal(res, &jsonR)
How can I access the interface's fields? I've read the Go docs and blog, but my head still can't get it. Their examples seem to show only that you can decode the JSON into an interface, but they don't explain how its fields can be used.
I've tried to use a range loop, but it seems the story ends when I reach a map[string]interface{}. The fields that I need seem to be in the interface.
for k, v := range jsonR {
    if k == "topfield" {
        fmt.Printf("k is %v, v is %v", k, v)
    }
}
The value inside the interface depends on the json structure you're parsing. If you have a json dictionary, the dynamic type of jsonR will be: map[string]interface{}.
Here's an example.
package main

import (
    "encoding/json"
    "fmt"
    "log"
)

func main() {
    a := []byte(`{"topfield": 123}`)
    var v interface{}
    if err := json.Unmarshal(a, &v); err != nil {
        log.Fatalf("unmarshal failed: %s", err)
    }
    fmt.Printf("value is %v", v.(map[string]interface{})["topfield"])
}
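If the field you need is itself nested (which is usually where "the story ends" at a map[string]interface{}), you keep type-asserting level by level. A hypothetical sketch with made-up data:

raw := []byte(`{"topfield": {"inner": "hello"}}`)
var v interface{}
if err := json.Unmarshal(raw, &v); err != nil {
    log.Fatal(err)
}

obj := v.(map[string]interface{})                 // the whole document is a JSON object
inner := obj["topfield"].(map[string]interface{}) // topfield is itself an object
fmt.Println(inner["inner"].(string))              // "hello"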
Parsing JSON like this can be very difficult. The default type of a parsed object is map[string]interface{}. The problem arises when you have another complex data structure within the main JSON (like another list or object). The best way to go about decoding JSON is to define a struct to hold the data. Not only will the values be of the correct type, but you can extract the specific data you actually care about.
Your struct can look like this:
type Top struct {
    Topfield int `json:"topfield"`
}
which can be decoded like this:
a := []byte(`{"topfield": 123}`)
var data Top
json.Unmarshal(a, &data) //parse the json into data
Now you can use regular struct operations to access your data:
value := data.Topfield
JSON which contains more complex data can also be easily decoded. Perhaps you have a list in your data somewhere; you can use a struct like the following to extract it:
type Data struct {
    States         []string  `json:"states"`
    PopulationData []Country `json:"popdata"`
}

type Country struct {
    Id            int     `json:"id"`
    LastCensusPop int     `json:"lcensuspopulation"`
    Gdp           float64 `json:"gdp"`
}
Such a structure can not only parse lists but also parse objects within fields.
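For instance, a short decoding sketch against made-up sample data matching those structs:

sample := []byte(`{
    "states": ["CA", "TX"],
    "popdata": [
        {"id": 1, "lcensuspopulation": 39500000, "gdp": 3.6},
        {"id": 2, "lcensuspopulation": 29100000, "gdp": 2.4}
    ]
}`)

var d Data
if err := json.Unmarshal(sample, &d); err != nil {
    log.Fatal(err)
}
fmt.Println(d.States[0], d.PopulationData[1].Gdp) // CA 2.4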
I have a JSON string that I want to unmarshal.
This is working:
jsonString := []byte(`{"my_int": 3, "my_string": null}`)
var data map[string]interface{}
err := json.Unmarshal(jsonString, &data)
if err != nil {
    fmt.Println(err)
}
//avroJson := make(map[string]interface{})
for k, v := range data {
    fmt.Printf("%v, %T\n", k, v)
}
My issue is: the value of my_int, which is 3, is returned as a float64.
My question is: how do I parse a JSON string with the "minimum type", so that 3 comes back as an int32 and not the maximum type, float64?
Assumption: my JSON is huge and only contains primitive types, and I want a value that really is a float64 to keep showing up as float64.
Clarification:
A "minimum type" means that if 3 can be considered both an int32 and a float64, the "minimum type" will be int32, which is the exact type you'll get when running this:
reflect.TypeOf(3).String()
Since you are unmarshaling into a map of interface{}, the following section of the golang json.Unmarshal documentation pertains:
To unmarshal JSON into an interface value, Unmarshal stores one of these in the interface value:
...
float64, for JSON numbers
string, for JSON strings
...
As such, to unmarshal your sample data into your desired types you should define a struct type which contains the desired field/type mappings, for example:
type MyType struct {
    MyInt    int     `json:"my_int"`
    MyString *string `json:"my_string"`
}

foo := MyType{}
jsonstr := `{"my_int": 3, "my_string": null}`
err := json.Unmarshal([]byte(jsonstr), &foo)
if err != nil {
    panic(err)
}
// foo => main.MyType{MyInt:3, MyString:(*string)(nil)}
Since you cannot describe your data in a struct, your options are to:
1. Use a json.Decoder to convert the values to your desired types as they are parsed.
2. Parse the document into a generic interface and post-process the value types.
Option #1 is the most flexible and can likely be implemented to be more performant than the other option since parsing and transformation could be performed in a single pass of the data.
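One common way to approach option #1 is json.Decoder's UseNumber, which keeps numbers as json.Number strings so you can decide the narrowest type yourself; a sketch (the int32 threshold logic is just an illustration):

dec := json.NewDecoder(strings.NewReader(`{"my_int": 3, "my_float": 1.23}`))
dec.UseNumber() // numbers arrive as json.Number instead of float64

var data map[string]interface{}
if err := dec.Decode(&data); err != nil {
    panic(err)
}

for k, v := range data {
    if n, ok := v.(json.Number); ok {
        // Prefer an integer type when the value parses as one.
        if i, err := n.Int64(); err == nil {
            if i >= math.MinInt32 && i <= math.MaxInt32 {
                data[k] = int32(i)
            } else {
                data[k] = i
            }
            continue
        }
        f, _ := n.Float64()
        data[k] = f
    }
}
// data["my_int"] is now int32(3); data["my_float"] stays float64.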
Option #2 might be simpler but will require two passes over the data. Here is an example of what the post-processing step might look like:
func TransformValueTypes(o map[string]interface{}) {
    for k, v := range o {
        // Convert nil values to *string type.
        if v == interface{}(nil) {
            o[k] = (*string)(nil)
        }
        // Convert numbers to int32 if possible
        if x, isnumber := v.(float64); isnumber {
            if math.Floor(x) == x {
                if x >= math.MinInt32 && x <= math.MaxInt32 {
                    o[k] = int32(x)
                }
                // Possibly check for other integer sizes here?
            }
            // Possibly check if float32 is possible here?
        }
        // Check for maps and slices here...
    }
}
So if you call TransformValueTypes(data) then your types will look like:
// my_int -> 3 (int32)
// my_string -> <nil> (*string)
// my_string2 -> "foo" (string)
// my_float -> 1.23 (float64)
Of course, your transform function could also apply type transformation logic based on the key name.
Importantly, note that if your document might have additional structure not mentioned in your question (such as nested objects or arrays) then your transform function will need to account for them by more value type checking, recursive calls, and iteration.
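As a hypothetical sketch of that recursive extension (my own illustration, reusing the same int32 rule as above):

func TransformAll(v interface{}) interface{} {
    switch x := v.(type) {
    case map[string]interface{}:
        for k, val := range x {
            x[k] = TransformAll(val)
        }
        return x
    case []interface{}:
        for i, val := range x {
            x[i] = TransformAll(val)
        }
        return x
    case nil:
        return (*string)(nil)
    case float64:
        if math.Floor(x) == x && x >= math.MinInt32 && x <= math.MaxInt32 {
            return int32(x)
        }
        return x
    default:
        return v
    }
}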