Rust Diesel duplicate column name when using paginate - mysql

I have a query that loads an Expense, with a child Currency.
let res: Vec<JoinedExpense> = expenses
    .inner_join(currencies::table.on(currencies::id.eq(expenses::currency_id)))
    .select((
        expenses::id,
        ...,
        (
            currencies::id,
            currencies::shortcode,
            currencies::name,
        ),
        expenses::currency_id,
        ...,
    ))
    .load::<JoinedExpense>(conn)?;
The structs look like this:
#[derive(Debug, Queryable, Serialize, Deserialize)]
pub struct JoinedExpense {
    pub id: i64,
    ...,
    pub currency: Currency,
    pub currency_id: i64,
    ...
}

#[derive(Debug, Queryable, Serialize, Deserialize)]
pub struct Currency {
    pub id: i64,
    pub shortcode: String,
    pub name: String,
    ...
}
This works fine and generates the following (simplified) query:
SELECT
    `expenses`.`id`,
    ...,
    `currencies`.`id`,
    `currencies`.`shortcode`,
    `currencies`.`name`,
    `expenses`.`currency_id`,
    ...
FROM (`expenses` INNER JOIN `currencies`
    ON (`currencies`.`id` = `expenses`.`currency_id`))
Now I tried paginating this result with the following code:
#[derive(Debug, Serialize, Deserialize)]
pub struct PagedResponse<T> {
    data: Vec<T>,
    pages: i64,
}

pub trait Paginate: Sized {
    fn paginate(self, page: Option<i64>) -> Paginated<Self>;
}

impl<T> Paginate for T {
    fn paginate(self, page: Option<i64>) -> Paginated<Self> {
        let page = page.unwrap_or(1);
        Paginated {
            query: self,
            per_page: DEFAULT_PER_PAGE,
            page,
            offset: (page - 1) * DEFAULT_PER_PAGE,
        }
    }
}

const DEFAULT_PER_PAGE: i64 = 10;

#[derive(Debug, Clone, Copy, QueryId)]
pub struct Paginated<T> {
    query: T,
    page: i64,
    per_page: i64,
    offset: i64,
}

impl<T> Paginated<T> {
    pub fn per_page(self, per_page: i64) -> Self {
        Paginated {
            per_page,
            offset: (self.page - 1) * per_page,
            ..self
        }
    }

    pub fn load_and_count_pages<'a, U>(
        self,
        conn: &mut mysql::ConnectionPool,
    ) -> Result<PagedResponse<U>, CustomError>
    where
        Self: LoadQuery<'a, mysql::ConnectionPool, (U, i64)>,
    {
        let per_page = self.per_page;
        let results = self.load::<(U, i64)>(conn)?;
        let total = results.get(0).map(|x| x.1).unwrap_or(0);
        let records = results.into_iter().map(|x| x.0).collect();
        let total_pages = (total as f64 / per_page as f64).ceil() as i64;
        let paged: PagedResponse<U> = PagedResponse {
            data: records,
            pages: total_pages,
        };
        Ok(paged)
    }
}

impl<T: Query> Query for Paginated<T> {
    type SqlType = (T::SqlType, BigInt);
}

impl<T> RunQueryDsl<mysql::ConnectionPool> for Paginated<T> {}

impl<T> QueryFragment<Mysql> for Paginated<T>
where
    T: QueryFragment<Mysql>,
{
    fn walk_ast<'b>(&'b self, mut out: AstPass<'_, 'b, Mysql>) -> QueryResult<()> {
        out.push_sql("SELECT *, COUNT(*) OVER () FROM (");
        self.query.walk_ast(out.reborrow())?;
        out.push_sql(") t LIMIT ");
        out.push_bind_param::<BigInt, _>(&self.per_page)?;
        out.push_sql(" OFFSET ");
        out.push_bind_param::<BigInt, _>(&self.offset)?;
        Ok(())
    }
}
I get the result like this:
let res: PagedResponse<JoinedExpense> = expenses
    .inner_join(currencies::table.on(currencies::id.eq(expenses::currency_id)))
    .select((
        expenses::id,
        ...,
        (
            currencies::id,
            currencies::shortcode,
            currencies::name,
        ),
        expenses::currency_id,
        ...,
    ))
    .paginate(Some(1))
    .load_and_count_pages::<JoinedExpense>(conn)?;
This code works if I don't join the currency to the expense, i.e. if I load plain Expenses. But if I try to use pagination together with the JoinedExpense, I get an error:
DatabaseError(
Unknown,
"Duplicate column name 'id'",
),
The resulting query from the paginated case is:
SELECT *,
    COUNT(*) OVER ()
FROM (
    SELECT
        `expenses`.`id`,
        ...,
        `currencies`.`id`,
        `currencies`.`shortcode`,
        `currencies`.`name`,
        `expenses`.`currency_id`,
        ...
    FROM (`expenses` INNER JOIN `currencies`
        ON (`currencies`.`id` = `expenses`.`currency_id`))
) t
LIMIT ? OFFSET ?
I think I can avoid the issue by changing the Currency type to not contain an id field, but I want to understand the underlying issue.
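For context: MySQL rejects duplicate column names inside a derived table, and the SELECT * FROM ( ... ) t wrapper turns the joined query into exactly such a derived table. A minimal illustration of the same error, independent of Diesel:

SELECT * FROM (SELECT 1 AS id, 2 AS id) t;
-- ERROR 1060 (42S21): Duplicate column name 'id'

The unpaginated query is allowed to return both expenses.id and currencies.id side by side; only a derived table requires unique column names.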
Update 1:
If I implement the pagination manually for this query, I don't run into the same issue.
The following code produces the output I want without problems:
let test = expenses::dsl::expenses
    .filter(expenses::tenant_id.eq(session.tenant_id))
    .inner_join(
        currencies::table.on(currencies::id.eq(expenses::currency_id)),
    )
    .inner_join(
        expense_types::table.on(expense_types::id.eq(expenses::type_id)),
    )
    .select((
        JoinedExpense::COLUMNS,
        diesel::dsl::sql::<diesel::sql_types::BigInt>("count(*) over ()"),
    ))
    .limit(10)
    .offset(0)
    .load::<(JoinedExpense, i64)>(conn);
Now it remains to figure out how this query differs from the one generated by the pagination code.
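For comparison, the manual version should generate roughly the following shape (a sketch inferred from the code above): the count is just one more column in an explicit select list, so no SELECT * runs over a derived table and the two id columns never have to share a namespace:

SELECT
    `expenses`.`id`,
    ...,
    `currencies`.`id`,
    ...,
    count(*) over ()
FROM (`expenses` INNER JOIN `currencies`
    ON (`currencies`.`id` = `expenses`.`currency_id`))
LIMIT 10 OFFSET 0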

Related

Rust diesel with r2d2 load expected struct `Mysql`, found struct `Sqlite`?

I'm writing a database module for my Rust application using Diesel.
Here's the compile error:
error[E0271]: type mismatch resolving `<MysqlConnection as Connection>::Backend == Sqlite`
--> src/quant/common/persistence/database.rs:155:18
|
155 | .load::<NetWorthModel>(&self.connection())
| ^^^^ expected struct `Mysql`, found struct `Sqlite`
|
= note: required because of the requirements on the impl of `LoadQuery<MysqlConnection, NetWorthModel>` for `diesel::query_builder::SelectStatement<table, diesel::query_builder::select_clause::DefaultSelectClause, diesel::query_builder::distinct_clause::NoDistinctClause, diesel::query_builder::where_clause::WhereClause<And<And<diesel::expression::operators::Eq<columns::fund_code, diesel::expression::bound::Bound<diesel::sql_types::Text, &str>>, GtEq<columns::date, diesel::expression::bound::Bound<diesel::sql_types::Date, std::string::String>>>, Lt<columns::date, diesel::expression::bound::Bound<diesel::sql_types::Date, std::string::String>>>>, diesel::query_builder::order_clause::OrderClause<Asc<columns::date>>, diesel::query_builder::limit_clause::LimitClause<diesel::expression::bound::Bound<diesel::sql_types::BigInt, i64>>, diesel::query_builder::offset_clause::OffsetClause<diesel::expression::bound::Bound<diesel::sql_types::BigInt, i64>>>`
Here's the model.rs:
use super::schema::tb_net_worth;
use diesel::{Identifiable, Insertable, Queryable};
#[derive(Queryable, Insertable, Identifiable)]
#[table_name = "tb_net_worth"]
#[primary_key(fund_code)]
pub struct NetWorthModel {
    pub fund_code: String,
    pub date: String,
    pub create_time: i64,
    pub update_time: i64,
    pub payload: String,
}
Here's my source code database.rs:
use crate::quant::common::persistence::model::NetWorthModel;
use crate::quant::common::yadatetime::Date;
use diesel::mysql::MysqlConnection;
use diesel::prelude::*;
use diesel::r2d2::{ConnectionManager, Pool};
use std::collections::HashMap;
use std::sync::Mutex;
pub struct MysqlDatabase {
    pool: Pool<ConnectionManager<MysqlConnection>>,
}

impl MysqlDatabase {
    fn new(user: &str, passwd: &str, host: &str, port: i32, db: &str) -> MysqlDatabase {
        let url = format!("mysql://{}:{}@{}:{}/{}", user, passwd, host, port, db);
        let pool_manager = ConnectionManager::<MysqlConnection>::new(url.as_str());
        let pool = Pool::builder()
            .max_size(16)
            .build(pool_manager)
            .unwrap();
        MysqlDatabase { pool: pool.clone() }
    }

    fn connection(&self) -> MysqlConnection {
        *self.pool.get().unwrap()
    }

    fn paged_query(
        &self,
        fund_code: &str,
        order_by: &str,
        start_date: Date,
        end_date: Date,
        page_index: i32,
        page_size: i32,
    ) -> Vec<NetWorthModel> {
        use super::schema::tb_net_worth::dsl;
        let query = dsl::tb_net_worth
            .filter(dsl::fund_code.eq(fund_code))
            .filter(dsl::date.ge(start_date.to_string()))
            .filter(dsl::date.lt(end_date.to_string()));
        if order_by == "ASC" {
            query
                .order(dsl::date.asc())
                .limit(page_size as i64)
                .offset((page_index * page_size) as i64)
                .load::<NetWorthModel>(&self.connection())
                .unwrap()
        } else {
            query
                .order(dsl::date.desc())
                .limit(page_size as i64)
                .offset((page_index * page_size) as i64)
                .load::<NetWorthModel>(&self.connection())
                .unwrap()
        }
    }
}
Would you please help with that?

How can I fix a SQL type error with Diesel and Juniper for MySQL in a GraphQL mutation?

I ran into the error below when trying to create a mutation with GraphQL and MySQL via Diesel.
Currently the types involved are just Diesel's types, but I want to make this work with GraphQL's types.
I implemented the Customer struct for GraphQL as shown below.
Is that not enough? Do you have any idea how to fix this?
Thanks
Error log
error[E0277]: the trait bound `graphql::Customer: diesel::Queryable<(diesel::sql_types::Unsigned<diesel::sql_types::BigInt>, diesel::sql_types::Text, diesel::sql_types::Text, diesel::sql_types::Timestamp, diesel::sql_types::Timestamp), _>` is not satisfied
--> src/graphql.rs:60:14
|
60 | .first::<crate::graphql::Customer>(&executor.context().db_con)
| ^^^^^ the trait `diesel::Queryable<(diesel::sql_types::Unsigned<diesel::sql_types::BigInt>, diesel::sql_types::Text, diesel::sql_types::Text, diesel::sql_types::Timestamp, diesel::sql_types::Timestamp), _>` is not implemented for `graphql::Customer`
|
= note: required because of the requirements on the impl of `diesel::query_dsl::LoadQuery<_, graphql::Customer>` for `diesel::query_builder::SelectStatement<schema::customers::table, diesel::query_builder::select_clause::DefaultSelectClause, diesel::query_builder::distinct_clause::NoDistinctClause, diesel::query_builder::where_clause::NoWhereClause, diesel::query_builder::order_clause::NoOrderClause, diesel::query_builder::limit_clause::LimitClause<diesel::expression::bound::Bound<diesel::sql_types::BigInt, i64>>>`
src/graphql.rs
use std::convert::From;
use std::sync::Arc;

use chrono::NaiveDateTime;
use actix_web::{web, Error, HttpResponse};
use futures01::future::Future;
use juniper::http::playground::playground_source;
use juniper::{http::GraphQLRequest, Executor, FieldResult, FieldError, ID};
use juniper_from_schema::graphql_schema_from_file;
use diesel::prelude::*;
use itertools::Itertools;

use crate::schema::customers;
use crate::{DbCon, DbPool};

graphql_schema_from_file!("src/schema.graphql");

pub struct Context {
    db_con: DbCon,
}

impl juniper::Context for Context {}

pub struct Query;
pub struct Mutation;

impl QueryFields for Query {
    fn field_customers(
        &self,
        executor: &Executor<'_, Context>,
        _trail: &QueryTrail<'_, Customer, Walked>,
    ) -> FieldResult<Vec<Customer>> {
        // type FieldResult<T> = Result<T, String>;
        customers::table
            .load::<crate::models::Customer>(&executor.context().db_con)
            .and_then(|customers| Ok(customers.into_iter().map_into().collect()))
            .map_err(Into::into)
    }
}

impl MutationFields for Mutation {
    fn field_create_customer(
        &self,
        executor: &Executor<'_, Context>,
        _trail: &QueryTrail<'_, Customer, Walked>,
        name: String,
        email: String,
    ) -> FieldResult<Customer> {
        // type FieldResult<T> = Result<T, String>;
        let new_customer = crate::models::NewCustomer { name, email };
        diesel::insert_into(customers::table)
            .values(&new_customer)
            .execute(&executor.context().db_con);
        customers::table
            .first::<crate::graphql::Customer>(&executor.context().db_con)
            .map_err(Into::into)
    }
}

pub struct Customer {
    id: u64,
    name: String,
    email: String,
    created_at: NaiveDateTime,
    updated_at: NaiveDateTime,
}

impl CustomerFields for Customer {
    fn field_id(&self, _: &Executor<'_, Context>) -> FieldResult<juniper::ID> {
        Ok(juniper::ID::new(self.id.to_string()))
    }

    fn field_name(&self, _: &Executor<'_, Context>) -> FieldResult<&String> {
        Ok(&self.name)
    }

    fn field_email(&self, _: &Executor<'_, Context>) -> FieldResult<&String> {
        Ok(&self.email)
    }
}

impl From<crate::models::Customer> for Customer {
    fn from(customer: crate::models::Customer) -> Self {
        Self {
            id: customer.id,
            name: customer.name,
            email: customer.email,
            created_at: customer.created_at,
            updated_at: customer.updated_at,
        }
    }
}

fn playground() -> HttpResponse {
    let html = playground_source("");
    HttpResponse::Ok()
        .content_type("text/html; charset=utf-8")
        .body(html)
}

fn graphql(
    schema: web::Data<Arc<Schema>>,
    data: web::Json<GraphQLRequest>,
    db_pool: web::Data<DbPool>,
) -> impl Future<Item = HttpResponse, Error = Error> {
    let ctx = Context {
        db_con: db_pool.get().unwrap(),
    };
    web::block(move || {
        let res = data.execute(&schema, &ctx);
        Ok::<_, serde_json::error::Error>(serde_json::to_string(&res)?)
    })
    .map_err(Error::from)
    .and_then(|customer| {
        Ok(HttpResponse::Ok()
            .content_type("application/json")
            .body(customer))
    })
}

pub fn register(config: &mut web::ServiceConfig) {
    let schema = std::sync::Arc::new(Schema::new(Query, Mutation));
    config
        .data(schema)
        .route("/", web::post().to_async(graphql))
        .route("/", web::get().to(playground));
}
src/models.rs
use super::schema::customers;
use chrono::NaiveDateTime;

#[derive(Queryable, Identifiable, AsChangeset, Clone, PartialEq, Debug)]
pub struct Customer {
    pub id: u64,
    pub name: String,
    pub email: String,
    pub created_at: NaiveDateTime,
    pub updated_at: NaiveDateTime,
}

#[derive(Queryable, Insertable, AsChangeset)]
#[table_name = "customers"]
pub struct NewCustomer {
    pub name: String,
    pub email: String,
}
Dependencies
[dependencies]
diesel = { version = "1.4.5", features = ["mysql", "r2d2", "chrono"] }
dotenv = "~0.15"
serde = "~1.0"
serde_derive = "~1.0"
serde_json = "~1.0"
chrono = "~0.4"
rand = "0.7.3"
actix-web = "1.0.9"
actix-cors = "0.1.0"
juniper = "0.14.1"
juniper-from-schema = "0.5.1"
juniper-eager-loading = "0.5.0"
r2d2_mysql = "*"
r2d2-diesel = "0.16.0"
mysql = "*"
r2d2 = "*"
futures01 = "0.1.29"
itertools = "0.8.2"
src/schema.graphql
schema {
  query: Query
  mutation: Mutation
}

type Query {
  customers: [Customer!]! #juniper(ownership: "owned")
}

type Mutation {
  createCustomer(
    name: String!,
    email: String!,
  ): Customer! #juniper(ownership: "owned")
}

type Customer {
  id: ID! #juniper(ownership: "owned")
  name: String!
  email: String!
}
src/schema.rs
table! {
    customers (id) {
        id -> Unsigned<Bigint>,
        name -> Varchar,
        email -> Varchar,
        created_at -> Timestamp,
        updated_at -> Timestamp,
    }
}
As the error message mentions, you need to implement Queryable for your struct Customer in src/graphql.rs (that struct is just below where the error message points to). The easiest way to do that is to add #[derive(Queryable)] to this struct.
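A minimal sketch of that fix (keeping in mind that Queryable maps columns to fields by position, so the field order must continue to match the customers table; NaiveDateTime is already imported in that file):

#[derive(Queryable)]
pub struct Customer {
    id: u64,
    name: String,
    email: String,
    created_at: NaiveDateTime,
    updated_at: NaiveDateTime,
}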

How to serialise structs into io::Write in a loop

I need to do a simple read/process/write in Rust like this:
#[derive(serde::Deserialize)]
struct Incoming {
    first: String,
    last: String,
}

#[derive(serde::Serialize)]
struct Outgoing {
    name: String,
}

// Keep the read/write traits as generic as possible
fn stream_things<R: std::io::Read, W: std::io::Write>(reader: R, writer: W) {
    let incoming: Vec<Incoming> = serde_json::from_reader(reader).unwrap();
    for a in incoming {
        let b = Outgoing {
            name: format!("{} {}", a.first, a.last),
        };
        serde_json::to_writer(writer, &b).unwrap();
    }
}

fn main() {
    stream_things(std::io::stdin(), std::io::stdout());
}
This does not compile because:
error[E0382]: use of moved value: `writer`
--> src/main.rs:20:31
|
13 | fn stream_things<R: std::io::Read, W: std::io::Write>(reader: R, writer: W) {
| -- ------ move occurs because `writer` has type `W`, which does not implement the `Copy` trait
| |
| help: consider further restricting this bound: `W: Copy +`
...
20 | serde_json::to_writer(writer, &b).unwrap();
| ^^^^^^ value moved here, in previous iteration of loop
What is the correct way to write to std::io::Write in a loop?
Also how to correctly do it with serde's to_writer?
See the playground.
Given that W is io::Write, &mut W is also io::Write:
impl<'_, W: Write + ?Sized> Write for &'_ mut W
so the following compiles:
fn stream_things<R: std::io::Read, W: std::io::Write>(reader: R, mut writer: W) {
    let incoming: Vec<Incoming> = serde_json::from_reader(reader).unwrap();
    for a in incoming {
        let b = Outgoing {
            name: format!("{} {}", a.first, a.last),
        };
        serde_json::to_writer(&mut writer, &b).unwrap();
    }
}
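Since the writer is taken by value, a caller that wants to keep using it afterwards can pass a mutable reference instead, relying on the same &mut W: Write impl (a small usage sketch based on the code above):

fn main() {
    let mut out = std::io::stdout();
    stream_things(std::io::stdin(), &mut out);
    // `out` is still usable here, because only the &mut borrow was moved into the call.
}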

Retrieve datetime from mySQL database using Diesel

I can't retrieve a datetime from a populated MySQL database using Rocket and Diesel.
Here is my model:
extern crate chrono;

use diesel::prelude::*;
use diesel::mysql::MysqlConnection;
use schema::chrisms;
use diesel::sql_types::Datetime;
use self::chrono::{DateTime, Duration, NaiveDate, NaiveDateTime, NaiveTime, TimeZone, Utc};

#[derive(Serialize, Deserialize, Queryable)]
pub struct Chrisms {
    pub entity_ekklesia_location_id: i32,
    pub serie_number: Option<String>,
    pub seat_number: Option<String>,
    pub date: Datetime,
    pub year: i32,
    pub deleted: bool,
    pub entity_chrism_location_id: Option<i32>,
    pub entity_chrism_location_description: Option<String>,
    pub entity_rel_mec_id: Option<i32>,
    pub entity_rel_mec_description: Option<String>,
    pub created_by_user_id: Option<i32>,
    pub updated_by_user_id: Option<i32>,
    pub deleted_by_user_id: Option<i32>,
    pub created_at: Datetime,
    pub updated_at: Datetime,
    pub id: i32,
}

impl Chrisms {
    pub fn read(connection: &MysqlConnection) -> Vec<Chrisms> {
        chrisms::table.load::<Chrisms>(connection).unwrap()
    }
}
My schema:
table! {
    chrisms (id) {
        entity_ekklesia_location_id -> Integer,
        serie_number -> Nullable<Varchar>,
        seat_number -> Nullable<Varchar>,
        date -> Datetime,
        year -> Integer,
        deleted -> Bool,
        entity_chrism_location_id -> Nullable<Integer>,
        entity_chrism_location_description -> Nullable<Varchar>,
        entity_rel_mec_id -> Nullable<Integer>,
        entity_rel_mec_description -> Nullable<Varchar>,
        created_by_user_id -> Nullable<Integer>,
        updated_by_user_id -> Nullable<Integer>,
        deleted_by_user_id -> Nullable<Integer>,
        created_at -> Datetime,
        updated_at -> Datetime,
        id -> Integer,
    }
}
This produces the errors:
1. the trait `_IMPL_SERIALIZE_FOR_TemplateContext::_serde::Serialize` is not implemented for `diesel::sql_types::Datetime`
   - required by `_IMPL_SERIALIZE_FOR_TemplateContext::_serde::ser::SerializeStruct::serialize_field`
2. the trait `_IMPL_SERIALIZE_FOR_TemplateContext::_serde::Deserialize<'_>` is not implemented for `diesel::sql_types::Datetime`
   - required by `_IMPL_SERIALIZE_FOR_TemplateContext::_serde::de::SeqAccess::next_element`
   - required by `_IMPL_SERIALIZE_FOR_TemplateContext::_serde::de::MapAccess::next_value`
3. the trait `diesel::Queryable<diesel::sql_types::Datetime, diesel::mysql::Mysql>` is not implemented for `diesel::sql_types::Datetime`
   - required because of the requirements on the impl of `diesel::query_dsl::LoadQuery<_, models::chrisms::Chrisms>` for `schema::chrisms::table`
How do I fix this? I tested a bunch of use statements like diesel::mysql_types, rocket::config and so on, but the issue doesn't seem to be there.
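The root cause: diesel::sql_types::Datetime is a SQL marker type that only describes the column in the schema; the fields of a Queryable (and Serialize/Deserialize) struct must be Rust value types. With Diesel's chrono feature, a Datetime column maps to chrono::NaiveDateTime (and chrono's serde feature covers the Serialize/Deserialize errors). A minimal sketch of the adjusted model, keeping the schema unchanged:

use chrono::NaiveDateTime;

#[derive(Serialize, Deserialize, Queryable)]
pub struct Chrisms {
    pub entity_ekklesia_location_id: i32,
    pub serie_number: Option<String>,
    pub seat_number: Option<String>,
    pub date: NaiveDateTime, // was: diesel::sql_types::Datetime
    // ... the remaining fields stay as they are ...
    pub created_at: NaiveDateTime,
    pub updated_at: NaiveDateTime,
    pub id: i32,
}

The answer below demonstrates the same pattern end to end (with SQLite rather than MySQL).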
diesel Create/Read/Update/Delete example with datetime
Cargo.toml:
[dependencies]
diesel = { version = "1.4", features = ["sqlite", "chrono"] }
chrono = "0.4"
schema of users:
CREATE TABLE users (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    email TEXT NOT NULL UNIQUE,
    created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);
mod schema {
    table! {
        users (id) {
            id -> Integer,
            email -> Text,
            created_at -> Timestamp,
        }
    }
}

mod models {
    use super::schema::users;

    #[derive(Queryable, Debug)]
    pub struct User {
        pub id: i32,
        pub email: String,
        /// The diesel crate must have the chrono feature enabled.
        /// Timestamp without timezone. (Is the memory layout of sqlite's
        /// Timestamp the same as libc::timeval?)
        pub created_at: chrono::NaiveDateTime,
    }

    #[derive(Insertable)]
    #[table_name = "users"]
    pub struct UserInsert {
        pub email: String,
    }
}
#[macro_use]
extern crate diesel;

use diesel::{
    result::Error as DieselError, sql_types::BigInt, sqlite::SqliteConnection, Connection,
    ExpressionMethods, QueryDsl, RunQueryDsl,
};
use models::{User, UserInsert};
use schema::users::dsl::{created_at, id, users};

fn create_user(conn: &SqliteConnection, new_user_form: UserInsert) -> Result<User, DieselError> {
    // Use sqlite's last_insert_rowid() / mysql's last_insert_id() to get the
    // current connection's last insert id. Using .order(id.desc()).last()
    // would get the wrong id when multiple connections insert at the same time.
    no_arg_sql_function!(last_insert_rowid, BigInt);
    diesel::insert_into(users)
        .values(&new_user_form)
        .execute(conn)?;
    let new_user_id: i64 = diesel::select(last_insert_rowid).first(conn)?;
    let last_insert_user: User = users.filter(id.eq(new_user_id as i32)).first(conn)?;
    Ok(last_insert_user)
}

fn read_users(conn: &SqliteConnection) -> Result<Vec<User>, DieselError> {
    Ok(users.load::<User>(conn)?)
}

fn update_user_created_at(conn: &SqliteConnection, user_id: i32) -> Result<(), DieselError> {
    diesel::update(users.filter(id.eq(user_id)))
        .set(created_at.eq(chrono::Utc::now().naive_utc()))
        .execute(conn)?;
    Ok(())
}

fn delete_user_by_user_id(conn: &SqliteConnection, user_id: i32) -> Result<(), DieselError> {
    diesel::delete(users.filter(id.eq(user_id))).execute(conn)?;
    Ok(())
}
/// diesel CRUD (Create, Read, Update, Delete) example with datetime
fn main() -> Result<(), DieselError> {
    // TODO: use an r2d2 db_pool to enhance diesel performance
    let conn = SqliteConnection::establish("file:db.sqlite").unwrap();
    // Clear all data before the test.
    diesel::delete(users).execute(&conn)?;
    let test_user_email = format!(
        "test+{}@example.com",
        std::time::SystemTime::now()
            .duration_since(std::time::UNIX_EPOCH)
            .unwrap()
            .as_secs()
    );

    // CRUD - Create
    println!("\nCRUD - Create");
    let last_insert_user = create_user(
        &conn,
        UserInsert {
            email: test_user_email,
        },
    )?;
    dbg!(&last_insert_user);

    // CRUD - Read
    println!("\nCRUD - Read");
    dbg!(read_users(&conn)?);
    assert_eq!(read_users(&conn)?[0].id, last_insert_user.id);

    // CRUD - Update
    println!("\nCRUD - Update");
    update_user_created_at(&conn, last_insert_user.id)?;
    dbg!(read_users(&conn)?);
    assert_ne!(read_users(&conn)?[0].created_at, last_insert_user.created_at);

    // CRUD - Delete
    println!("\nCRUD - Delete");
    delete_user_by_user_id(&conn, last_insert_user.id)?;
    dbg!(read_users(&conn)?);
    assert!(read_users(&conn)?.is_empty());
    Ok(())
}
Output Example:
CRUD - Create
[src/main.rs:85] &last_insert_user = User {
    id: 1,
    email: "test+1606720099@example.com",
    created_at: 2020-11-30T07:08:19,
}

CRUD - Read
[src/main.rs:88] read_users(&conn)? = [
    User {
        id: 1,
        email: "test+1606720099@example.com",
        created_at: 2020-11-30T07:08:19,
    },
]

CRUD - Update
[src/main.rs:93] read_users(&conn)? = [
    User {
        id: 1,
        email: "test+1606720099@example.com",
        created_at: 2020-11-30T07:08:19.386513,
    },
]

CRUD - Delete
[src/main.rs:98] read_users(&conn)? = []

rustc-serialize custom enum decoding

I have a JSON structure where one of the fields of a struct could be either an object, or that object's ID in the database. Let's say the document looks like this with both possible formats of the struct:
[
    {
        "name": "pebbles",
        "car": 1
    },
    {
        "name": "pebbles",
        "car": {
            "id": 1,
            "color": "green"
        }
    }
]
I'm trying to figure out the best way to implement a custom decoder for this. So far, I've tried a few different ways, and I'm currently stuck here:
extern crate rustc_serialize;

use rustc_serialize::{Decodable, Decoder, json};

#[derive(RustcDecodable, Debug)]
struct Car {
    id: u64,
    color: String,
}

#[derive(Debug)]
enum OCar {
    Id(u64),
    Car(Car),
}

#[derive(Debug)]
struct Person {
    name: String,
    car: OCar,
}

impl Decodable for Person {
    fn decode<D: Decoder>(d: &mut D) -> Result<Person, D::Error> {
        d.read_struct("root", 2, |d| {
            let mut car: OCar;
            // What magic must be done here to get the right OCar?
            /* I tried something akin to this:
            let car = try!(d.read_struct_field("car", 0, |r| {
                let r1 = Car::decode(r);
                let r2 = u64::decode(r);
                // Compare both r1 and r2, but the return code for Err() was tricky
            }));
            */
            /* And this got me furthest */
            match d.read_struct_field("car", 0, u64::decode) {
                Ok(x) => {
                    car = OCar::Id(x);
                }
                Err(_) => {
                    car = OCar::Car(try!(d.read_struct_field("car", 0, Car::decode)));
                }
            }
            Ok(Person {
                name: try!(d.read_struct_field("name", 0, Decodable::decode)),
                car: car,
            })
        })
    }
}

fn main() {
    // Vector of both forms
    let input = "[{\"name\":\"pebbles\",\"car\":1},{\"name\":\"pebbles\",\"car\":{\"id\":1,\"color\":\"green\"}}]";
    let output: Vec<Person> = json::decode(&input).unwrap();
    println!("Debug: {:?}", output);
}
The above panics with an EOF error, which is a sentinel value rustc-serialize uses in a few of its error enums. The full line is:
thread '<main>' panicked at 'called `Result::unwrap()` on an `Err` value: EOF', src/libcore/result.rs:785
What's the right way to do this?
rustc-serialize, or at least its JSON decoder, doesn't support that use case. If you look at the implementation of read_struct_field (or any other method), you can see why: it uses a stack, but when it encounters an error, it doesn't restore the stack to its original state. When you then try to decode the same thing differently, the decoder operates on an inconsistent stack, eventually leading to an unexpected EOF value.
I would recommend you look into Serde instead. Deserializing in Serde is different: instead of telling the decoder what type you're expecting, and having no clear way to recover if a value is of the wrong type, Serde calls into a visitor that can handle any of the types that Serde supports in the way it wants. This means that Serde will call different methods on the visitor depending on the actual type of the value it parsed. For example, we can handle integers to return an OCar::Id and objects to return an OCar::Car.
Here's a full example:
#![feature(custom_derive, plugin)]
#![plugin(serde_macros)]

extern crate serde;
extern crate serde_json;

use serde::de::{Deserialize, Deserializer, Error, MapVisitor, Visitor};
use serde::de::value::MapVisitorDeserializer;

#[derive(Deserialize, Debug)]
struct Car {
    id: u64,
    color: String,
}

#[derive(Debug)]
enum OCar {
    Id(u64),
    Car(Car),
}

struct OCarVisitor;

#[derive(Deserialize, Debug)]
struct Person {
    name: String,
    car: OCar,
}

impl Deserialize for OCar {
    fn deserialize<D>(deserializer: &mut D) -> Result<Self, D::Error> where D: Deserializer {
        deserializer.deserialize(OCarVisitor)
    }
}

impl Visitor for OCarVisitor {
    type Value = OCar;

    fn visit_u64<E>(&mut self, v: u64) -> Result<Self::Value, E> where E: Error {
        Ok(OCar::Id(v))
    }

    fn visit_map<V>(&mut self, visitor: V) -> Result<Self::Value, V::Error> where V: MapVisitor {
        Ok(OCar::Car(try!(Car::deserialize(&mut MapVisitorDeserializer::new(visitor)))))
    }
}

fn main() {
    // Vector of both forms
    let input = "[{\"name\":\"pebbles\",\"car\":1},{\"name\":\"pebbles\",\"car\":{\"id\":1,\"color\":\"green\"}}]";
    let output: Vec<Person> = serde_json::from_str(input).unwrap();
    println!("Debug: {:?}", output);
}
Output:
Debug: [Person { name: "pebbles", car: Id(1) }, Person { name: "pebbles", car: Car(Car { id: 1, color: "green" }) }]
Cargo.toml:
[dependencies]
serde = "0.7"
serde_json = "0.7"
serde_macros = "0.7"