How to model a generic reference to a generic type in Rust - csv

I am trying to code a file import module that takes CSV or XLSX files. I am stuck at the CSV import stage, trying to model the property that will hold the reader.
The problem arises because I want to be able to create a csv::Reader using either the ::from_path function (the way it will generally be used) or the ::from_reader function (mainly for tests).
The types returned by the two functions are different: Reader<File> and Reader<R> respectively.
The structures I have:
trait ReaderSeeker: Read + Seek {}
struct CsvReader {
    header_map: Vec<(String, usize)>,
    line_no: usize,
    r: Box<csv::Reader<dyn ReaderSeeker>>,
    sr: StringRecord,
}
Read and Seek are the traits that I use on the reader.
I get:
error[E0277]: the size for values of type (dyn ReaderSeeker + 'static) cannot be known at compilation time
I also tried:
struct CsvReader<R: Read + Seek> {
    header_map: Vec<(String, usize)>,
    line_no: usize,
    r: Box<csv::Reader<R>>,
    sr: StringRecord,
}
But that causes other issues when instantiating the struct, as I then need to pass the type, which I do not want to do: it is completely irrelevant to the caller.
How can I model this situation?
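One way to model this is to fix the type parameter to a boxed trait object, so both construction paths produce the same concrete type. Below is a self-contained sketch: GenericReader is a stand-in for csv::Reader<R> (so the example compiles without the crate), and the blanket impl wiring ReaderSeeker to every Read + Seek type is an assumption about how you would hook it up.

```rust
use std::fs::File;
use std::io::{Cursor, Read, Seek};

// Blanket impl: any Read + Seek type automatically is a ReaderSeeker.
trait ReaderSeeker: Read + Seek {}
impl<T: Read + Seek> ReaderSeeker for T {}

// Stand-in for csv::Reader<R>, to keep the sketch self-contained.
struct GenericReader<R: Read> {
    inner: R,
}

// The inner type is always GenericReader<Box<dyn ReaderSeeker>>,
// so callers never see a type parameter.
struct CsvReader {
    r: GenericReader<Box<dyn ReaderSeeker>>,
}

impl CsvReader {
    // Production path: open a file and box it behind the trait object.
    fn from_path(path: &str) -> std::io::Result<Self> {
        let f = File::open(path)?;
        Ok(Self { r: GenericReader { inner: Box::new(f) } })
    }

    // Test path: accept any in-memory reader (e.g. a Cursor) the same way.
    fn from_reader<R: Read + Seek + 'static>(rdr: R) -> Self {
        Self { r: GenericReader { inner: Box::new(rdr) } }
    }
}
```

With the real crate, the equivalent would be constructing csv::Reader::from_reader(Box::new(file) as Box<dyn ReaderSeeker>) in both paths, so the field is csv::Reader<Box<dyn ReaderSeeker>>. The cost is one virtual dispatch per read call, which is usually negligible next to I/O.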

Related

Handling different ways to represent null in serde

I'm writing a client library around this REST server: OctoPrint (a server to manage a 3D printer), to be precise.
Here's one of the types I'm working with:
#[derive(Serialize, Deserialize, Debug)]
pub struct JobInfo {
    /// The file that is the target of the current print job
    pub file: FileInfo,
    /// The estimated print time for the file, in seconds
    #[serde(rename = "estimatedPrintTime")]
    pub estimated_print_time: Option<f64>,
    /// The print time of the last print of the file, in seconds
    #[serde(rename = "lastPrintTime")]
    pub last_print_time: Option<f64>,
    /// Information regarding the estimated filament usage of the print job
    pub filament: Option<Filament>,
}
Pretty straightforward. Using the multiplicity property defined in the specification of the API, I determined which properties should be considered optional, hence why some of these props are wrapped in Option.
Unfortunately, the documentation lies a little about the way multiplicity works here. Here's an example of what a response looks like when the printer is in an offline state. For the sake of brevity, I will omit most of the body of this JSON message and keep just enough to get the point across:
{
    "job": {
        "file": null,
        "filepos": null,
        "printTime": null,
        ... etc
    },
    ...
    "state": "Offline"
}
Here's the type that I'm expecting for this response:
#[derive(Serialize, Deserialize, Debug)]
pub struct JobInformationResponse {
    /// Information regarding the target of the current print job
    job: JobInfo,
    /// Information regarding the progress of the current print job
    progress: ProgressInfo,
    /// A textual representation of the current state of the job
    /// or connection. e.g. "Operational", "Printing", "Pausing",
    /// "Paused", "Cancelling", "Error", "Offline", "Offline after error",
    /// "Opening serial connection" ... - please note that this list is not exhaustive!
    state: String,
    /// Any error message for the job or connection. Only set if there has been an error
    error: Option<String>,
}
Now I could just wrap all of these types in Options, but the previous example JSON wouldn't parse: since job is an object, it's not going to deserialize as None despite the fact that each of its keys is null. I was wondering if there is some sort of attribute in serde that could handle this weird kind of serialization issue. I'd like to avoid wrapping every single property in an Option just to handle the edge case where the printer is offline.
Edit: I guess what I'm trying to say is that I would expect that if all props on a struct in the JSON representation were null, the object itself would deserialize as None.
If you're willing to redesign a little bit, you might be able to do something like this:
#[derive(Deserialize)]
#[serde(tag = "state")]
enum JobInformationResponse {
    Offline {},
    // If a field only appears on one type of response, use a struct variant
    Error { error: String },
    // If multiple response types share fields, use a newtype variant and a substruct
    Printing(JobInformationResponseOnline),
    Paused(JobInformationResponseOnline),
    // ...
}
struct JobInformationResponseOnline {
    job: JobInfo,
    progress: ProgressInfo,
}
This works in the Offline case because by default, serde ignores properties that don't fit into any field of the struct/enum variant. So it won't check whether all entries of job are null.
If you have fields that appear in every message, you can further wrap JobInformationResponse (you should probably rename it):
struct JobInformationResponseAll {
    field_appears_in_all_responses: FooBar,
    #[serde(flatten)]
    state: JobInformationResponse, // Field name doesn't matter to serde
}
But I'm not sure whether that works for you, since I certainly haven't seen enough of the spec or any real example messages.
To answer your question directly: No, there is no attribute in serde which would allow an all-null map to be de/serialized as None. You'd need two versions of the struct, one without options (to be used in your rust code) and one with (to be used in a custom deserialization function where you first deserialize to the with-options struct and then convert). Might not be worth the trouble.
And a side note: You might be happy to find #[serde(rename_all = "camelCase")] exists.

Type '{}' is missing the following properties from type 'Datos[]': when instantiating list from JSON in Angular

I'm having problems importing JSON data into my Angular application.
What I'm doing:
Import the json file into a variable:
import _datos from '../../../assets/data.json';
As specified in the answers of this question (How to import JSON File into a TypeScript file?), I'm using the typing.d.ts file in order to be able to read the information.
Declaring an interface in order to improve the readability of the code:
export interface Datos {
  property1: string;
  property2: string;
  // etc.
}
Declaring and instantiating the array with the json file imported:
datos: Datos[] = _datos;
What is expected
The variable "datos" should have the content of the JSON file as an array.
What actually happens
Type '{}' is missing the following properties from type 'Datos[]': length, pop, push, concat, and 26 more.ts(2740).
Further information
I've done it before with a smaller JSON file (this one has about 26 properties per object and over 100k elements) and it worked fine. However, this one is failing and I don't really know why.
Okay it was a very simple mistake:
One of the 26 properties in the JSON file was not written properly. In order for this to work, it is needed that the properties in the JSON file and the properties in the interface match.
The JSON file is something like this:
[
    {
        "organo_de_contratacion": "Consorcio de Transportes de la Bahía de Cádiz",
        "estado_de_la_licitacion:": "Resuelta",
        ...
A stray : had slipped in before the end of the property name, so it mismatched the interface property.
To avoid the error:
Type '{}' is missing the following properties from type ...
Ensure that the object being assigned to _datos is an array of objects of type Datos.
This error occurs when assigning a plain object { .. } to a variable typed as an array of objects [ {}, .. ].
Your variable that holds the imported array needs to be declared as follows:
_datos: Datos[] = [];
Then import your JSON file data into _datos.
I would suggest debugging the value of _datos before it is assigned to the datos variable.
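Since the compiler cannot see key typos inside an imported JSON file, a small runtime type guard can surface this kind of mismatch while debugging. A sketch, reusing the property names from the example above (and checking only those two keys):

```typescript
interface Datos {
  organo_de_contratacion: string;
  estado_de_la_licitacion: string;
}

// Runtime guard: checks that every element carries the keys the
// interface expects, catching JSON key typos the compiler cannot see.
function isDatosArray(value: unknown): value is Datos[] {
  return (
    Array.isArray(value) &&
    value.every(
      (el) =>
        typeof el === "object" &&
        el !== null &&
        typeof (el as Record<string, unknown>).organo_de_contratacion === "string" &&
        typeof (el as Record<string, unknown>).estado_de_la_licitacion === "string"
    )
  );
}

// The misspelled key ("estado_de_la_licitacion:" with a stray colon) fails the guard:
const _datos: unknown = [
  { organo_de_contratacion: "Consorcio", "estado_de_la_licitacion:": "Resuelta" },
];
console.log(isDatosArray(_datos)); // false
```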

How to build an abstract json unmarshaller in go

I have multiple APIs that follow a similar high-level response structure. The answer always comes back in this form:
{"data": {"feed":[{...}]}, "success": true}
However, the structure in Feed varies, depending on the concrete API.
I would now like to build an abstract function to process the various APIs. I have the following objects:
type SourceDTO struct { // top level object
    Success bool `json:"success"`
    Data    Feed `json:"data"`
}
type Feed struct {
    FeedData []<???> `json:"Feed"`
}
(The real object is more complex, but this shows the idea)
What would be a good way in Go to parse this for the different APIs, but with some common code holding the logic based on the high-level data (e.g. success)?
EDIT:
I am extending this to explain more the extent of my question about the "pattern" I am looking for.
I want to create this package that parses the group of APIs. The DTO objects then have to be transformed into some other objects. These 'final' objects are defined in a different package (the entity package) and then have to be persisted.
I am now wondering how to bring all this together: the 'final' entity objects, the transformation functions from DTO to entity, and the parsing of the different APIs with their common and differing result components.
Where do the transformation functions belong to (package wise)?
EDIT2: Specified FeedData to a slice after digging into the problem (see comments)
You can embed your SourceDTO struct into another struct, like this:
type SourceDTO struct { // top level object
    Success bool `json:"success"`
}
type FeedResponse struct {
    FeedData YourCustomFeedStruct `json:"feed"`
    // Embedded struct
    SourceDTO
}
Now you can access the Success bool from the FeedResponse struct. Also any methods defined on the SourceDTO struct can be accessed from the FeedResponse.
Thanks to #mkopriva for the input for this solution.
In order to have some abstraction in your json unmarshalling it is possible to use interface{} for many use cases.
package main

import (
    "encoding/json"
    "fmt"
)

type UniversalDTO struct {
    Success bool        `json:"success"`
    Data    interface{} `json:"data"`
}

type ConcreteData struct {
    Source string `json:"source"`
    Site   string `json:"site"`
}

func main() {
    jsondata := []byte(`{"success":true,"data":[{"source":"foo","site":"bar"}]}`)
    data := make([]ConcreteData, 0, 10)
    dtoToSend := UniversalDTO{Data: &data}
    describe(dtoToSend)
    describe(dtoToSend.Data)
    json.Unmarshal(jsondata, &dtoToSend)
    describe(dtoToSend)
    describe(dtoToSend.Data)
}

func describe(i interface{}) {
    fmt.Printf("(%v, %T)\n", i, i)
}
Test here: https://play.golang.org/p/SSSp_zptMVN
json.Unmarshal needs a target object to decode into, so we always start with one. By assigning a pointer to a concrete struct to the interface{} field before unmarshalling, the decoder fills in that concrete type instead of a generic map. An important point is that a Go interface value can also hold a slice, so a JSON array can be unmarshalled this way too; note that the slice is assigned to the interface field as a pointer (here *[]ConcreteData) so the decoder can grow it in place.

json ignore tag ("-") not working on embedded sub structure

I have been reading a lot of related questions but could not find anything that actually fit my problem. I am trying to unmarshall a complex object.
type DC struct {
    //other fields
    ReplenishmentData map[string]ProductReplenishment `bson:"-"`
    //other fields
}
type ProductReplenishment struct {
    //Other fields
    SafetyStockInDay int `json:"SafetyStockInDay" bson:"SafetyStockInDay"`
    AlreadyOrderedQuantityForReplenishment *map[float64]*UnitQuantity `json:"-" bson:"-"`
    //Other fields
}
Let's say I decode the following JSON:
{
    "ReplenishmentData": {
        "000822-099": {
            "SafetyStockInDay": 7
        },
        "001030-001": {
            "SafetyStockInDay": 7
        }
    }
}
Into a structure instance hierarchy in which AlreadyOrderedQuantityForReplenishment is not empty; after decoding, this field will be set to an empty map, overriding the initial value.
Why is the decoder not ignoring the field altogether, as specified in the docs? Am I missing something?
Thanks a lot for any help,
Adding screenshot of inspector before (first) / after (second) if that can help
Your problem is not related to embedded structs - the same issue would occur with a regular struct.
Encoders will skip encoding struct fields marked with the tag qualifier "-".
Decoders, when initializing a struct, will use the zero value for any field that is not set via the decoding process. So your map will be initialized to a nil (empty) map.
If you want to preserve the values you'd need to write your own (JSON or BSON) unmarshaler (doable, but not trivial). Or it may be simpler to just restore any zeroed values after the decoding process.

parsing json with different schema with Play json

I've got to parse a list of JSON messages. I am using Play JSON.
All messages have similar structure, and at high level may be represented as
case class JMessage(
  event: String,
  messageType: String,
  data: JsValue // variable data structure
)
data may hold entries of different types (Double, String, Int), so I can't go with a Map.
Currently there are at least three various types of the data. The structure of data may be identified by messageType.
So far I've created three case classes each representing the structure of data. As well as implicit Reads for them. And the 4th one that is a case class for result with some Option-al fields. So basically I need to map various json messages to some output format.
The approach I'm currently using is:
messages.map(Json.parse(_).as[JMessage]).map { elem =>
  if (elem.messageType == "event") {
    Some(parseMessageOfTypeEvent(elem.data))
  } else if (..) {
    Some(..)
  } else {
    None
  }
}.filter(_.nonEmpty)
The parseMessageOfType%type% functions are basically (v: type) => JsValue.
So after all I have 4 case classes and 3 functions for parsing. It works, but it is ugly.
Is there a more beautiful Scala way to it?