How to update IPP printer attributes using ipptool.exe? - cups

ipptool (CUPS) allows getting IPP printer attributes from an IPP printer (get-printer-attributes.test). Is it possible to update the printer's IPP attributes using ipptool.exe as well?
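For reference, IPP does define a Set-Printer-Attributes operation (RFC 3380), and an ipptool test file can issue it when the printer supports that operation. The sketch below is illustrative only: the attribute name and value are examples, and many printers do not implement the operation at all.

# set-printer-attributes.test -- illustrative sketch; printers that do not
# implement Set-Printer-Attributes return server-error-operation-not-supported
{
    NAME "Set printer-location via Set-Printer-Attributes"
    OPERATION Set-Printer-Attributes

    GROUP operation-attributes-tag
    ATTR charset attributes-charset utf-8
    ATTR language attributes-natural-language en
    ATTR uri printer-uri $uri

    GROUP printer-attributes-tag
    ATTR text printer-location "Room 101"

    STATUS successful-ok
}

Run it with something like ipptool -tv ipp://printer.example.com/ipp/print set-printer-attributes.test (the URI is a placeholder). Whether the update actually takes effect depends entirely on the printer's support for the operation.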

Related

JSON Schema validation using draft V7

I am trying to write a schema for my JSON file; one of the fields should be either true, false, or null.
Is there a type I can use for this? I am using the definition below, but it doesn't seem to work with my schema evaluator:
"my_column": {
    "type": { "enum": ["true", "false", "null"] }
}
Can someone give me a hint?
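For what it's worth, a minimal draft-07 sketch, assuming the field should hold the literal values true, false, or null rather than the strings "true"/"false"/"null" (the property name my_column is taken from the question):

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "my_column": { "enum": [true, false, null] }
  }
}

An equivalent alternative is "type": ["boolean", "null"], which accepts exactly the same three values.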

How to force browsers to send input type=number value as is?

When I enter e.g. 47,5 in an input type="number" field, my browser (Firefox) automatically converts it to 47.5 when sending the form to the server. (My client culture uses a decimal comma.) I would like it to send the value as is (with a decimal comma) because it would be more convenient to deal with just a user-specified culture rather than a mixture of user and 'default' culture. How can I do this?
In the latest versions of Firefox and Chrome, if you have specified the type attribute as number, the field will indicate an error when a value such as 47,5 or 47.5 is entered.
We cannot control the client environment.
There are cases where an apostrophe is used as the separator as well.
As lumio suggested, you can use type="text" and parse the value on the server side.
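A minimal sketch of that approach, assuming the submitted string is parsed on the server; inputmode="decimal" keeps the numeric keypad on mobile, and the pattern here is only illustrative, not a full locale-aware validator:

<input type="text" name="amount" inputmode="decimal"
       pattern="[0-9]+([.,'][0-9]+)?" title="Enter a number, e.g. 47,5">

The raw text arrives at the server unchanged, so the application can normalise the separator according to the user-specified culture.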

Storing deflated HTML in MySQL

I need to store HTML data in a MySQL database. I read about this and found that the best method is to use NVARCHAR or VARCHAR. Furthermore I'd like to compress the input to make it less space consuming. I use PHP's gzdeflate() function for deflating the HTML input, but in this case what MySQL data type should I use?
EDIT: Since I need to store quite big HTML sources, I decided to go with the TEXT data type, but the question remains: is MySQL's TEXT field compatible with a deflated HTML string?
Use a BLOB type instead of TEXT, since gzdeflate() produces binary data rather than text.
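A minimal sketch, assuming a hypothetical pages table; MEDIUMBLOB (up to about 16 MB) is one common choice for large deflated HTML:

CREATE TABLE pages (
  id INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
  -- raw bytes from gzdeflate(); BLOB columns apply no character-set conversion
  html_deflated MEDIUMBLOB NOT NULL
);

TEXT columns, by contrast, are subject to the column character set, so deflated (binary) output can be corrupted or rejected there.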

Storing json, jsonb, hstore, xml, enum, ipaddr, etc fails with "column "x" is of type json but expression is of type character varying"

When using PostgreSQL to store data in a field of a string-like validated type, like xml, json, jsonb, ltree, etc, the INSERT or UPDATE fails with an error like:
column "the_col" is of type json but expression is of type character varying
... or
column "the_col" is of type json but expression is of type text
Why? What can I do about it?
I'm using JDBC (PgJDBC).
This happens via Hibernate, JPA, and all sorts of other abstraction layers.
The "standard" advice from the PostgreSQL team is to use a CAST in the SQL. This is not useful for people using query generators or ORMs, especially if those systems don't have explicit support for database types like json, so they're mapped via String in the application.
Some ORMs permit the implementation of custom type handlers, but I don't really want to write a custom handler for each data type for each ORM, e.g. json on Hibernate, json on EclipseLink, json on OpenJPA, xml on Hibernate, ... etc. There's no JPA2 SPI for writing a generic custom type handler. I'm looking for a general solution.
Why it happens
The problem is that PostgreSQL is overly strict about casts between text and non-text data types. It will not allow an implicit cast (one without a CAST or :: in the SQL) from a text type like text or varchar (character varying) to a text-like non-text type like json, xml, etc.
The PgJDBC driver specifies the data type as varchar when you call setString to assign a parameter. If the database type of the column, function argument, etc., is not actually varchar or text but some other type, you get a type error. The same is true of quite a lot of other drivers and ORMs.
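A minimal way to reproduce the error outside any driver, assuming a hypothetical docs table with a json column named body:

CREATE TABLE docs (id serial PRIMARY KEY, body json);

-- Declaring the parameter as text mimics a driver announcing a string
-- parameter as text/varchar:
PREPARE ins(text) AS INSERT INTO docs (body) VALUES ($1);
-- ERROR:  column "body" is of type json but expression is of type text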
PgJDBC: stringtype=unspecified
The best option when using PgJDBC is generally to pass the connection parameter stringtype=unspecified. This overrides the default behaviour of passing setString values as varchar and instead leaves it up to the database to "guess" their data type. In almost all cases this does exactly what you want, passing the string to the input validator for the type you want to store.
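A minimal sketch, assuming the hypothetical docs table from above; the parameter can be appended to the JDBC URL (or supplied via a Properties object):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class StringTypeUnspecifiedDemo {
    public static void main(String[] args) throws Exception {
        // stringtype=unspecified sends setString() parameters without a declared
        // type, so the server coerces them via the target column's input function.
        String url = "jdbc:postgresql://localhost:5432/mydb?stringtype=unspecified";
        try (Connection conn = DriverManager.getConnection(url, "app_user", "secret");
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO docs (body) VALUES (?)")) {
            ps.setString(1, "{\"name\":\"test\"}");  // stored as json, not varchar
            ps.executeUpdate();
        }
    }
}

Note that the setting applies to every string parameter on the connection, which is usually harmless but worth knowing.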
All: CREATE CAST ... WITH FUNCTION ...
You can instead use CREATE CAST to define a data-type-specific cast that permits this on a type-by-type basis, but this can have side effects elsewhere. If you do this, do not use WITHOUT FUNCTION casts; they bypass type validation and will result in errors. You must use the input/validation function for the data type. Using CREATE CAST is suitable for users of other database drivers that don't have any way to stop the driver from specifying the type for string/text parameters.
e.g.
CREATE OR REPLACE FUNCTION json_intext(text) RETURNS json AS $$
SELECT json_in($1::cstring);
$$ LANGUAGE SQL IMMUTABLE;
CREATE CAST (text AS json)
WITH FUNCTION json_intext(text) AS IMPLICIT;
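With that implicit cast installed, a text-typed value (which is how many drivers send string parameters) now coerces cleanly; a small illustration against the hypothetical docs table:

-- Fails without the cast, succeeds once the implicit text -> json cast exists:
INSERT INTO docs (body) VALUES ('{"name":"test"}'::text);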
All: Custom type handler
If your ORM permits, you can implement a custom type handler for the data type and that specific ORM. This is mostly useful when you're using a native Java type that maps well to the PostgreSQL type, rather than using String, though it can also work if your ORM lets you specify type handlers using annotations etc.
Methods for implementing custom type handlers are driver-, language- and ORM-specific. Here's an example for Java and Hibernate for json.
PgJDBC: type handler using PGobject
If you're mapping the value to a native Java type, you can extend PGobject to provide a PgJDBC type mapping for your type. You will probably also need to implement an ORM-specific type handler to use your PGobject, since most ORMs will just call toString on types they don't recognise. This is the preferred way to map complex types between Java and PostgreSQL, but also the most complex.
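A minimal sketch of that approach using PGobject directly rather than a subclass; the docs table and JSON payload are illustrative:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import org.postgresql.util.PGobject;

class PgJsonInsert {
    // conn comes from wherever the application manages connections
    static void insertJson(Connection conn, String json) throws SQLException {
        PGobject value = new PGobject();
        value.setType("json");   // the PostgreSQL type name to send
        value.setValue(json);    // raw text, validated by json's input function
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO docs (body) VALUES (?)")) {
            ps.setObject(1, value);
            ps.executeUpdate();
        }
    }
}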
PgJDBC: Type handler using setObject(int, Object)
If you're using String to hold the value in Java, rather than a more specific type, you can invoke the JDBC method setObject(int, Object) to store the string with no particular data type specified. The JDBC driver will send the string representation, and the database will infer the type from the destination column type or function argument type.
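A hedged sketch of that, again against the hypothetical docs table; passing java.sql.Types.OTHER is one explicit way to tell PgJDBC not to declare the parameter as varchar:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Types;

class PgJsonInsertAsString {
    static void insertJson(Connection conn, String json) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO docs (body) VALUES (?)")) {
            // No SQL type is declared for the parameter, so the server infers
            // json from the destination column and validates the text itself.
            ps.setObject(1, json, Types.OTHER);
            ps.executeUpdate();
        }
    }
}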
See also
Questions:
Mapping postgreSQL JSON column to Hibernate value type
Are JPA (EclipseLink) custom types possible?
External:
http://www.postgresql.org/message-id/54096082.1090009#2ndquadrant.com
https://github.com/pgjdbc/pgjdbc/issues/265
http://www.pateldenish.com/2013/05/inserting-json-data-into-postgres-using-jdbc-driver.html

RestKit JSON iOS - putObject - send type info

I'm using RestKit on iOS and trying to use the putObject method.
I'm able to use it and send data in this format:
{"name":"Wet shirt night at Marquee","id":1,"idIcon":1,"note":78,"description":"connard","url":0}
However, my web service is expecting something like:
{"event":{"name":"Wet shirt night at Marquee","id":1,"idIcon":1,"note":78,"description":"connard","url":0}}
See the difference: the type name "event" wraps the object.
Do you have any idea how to set up RestKit to send the object type name along with the object data?
Thanks!
Solution: in RKObjectMappable.m, make RKObjectMappableGetPropertiesByElement return a dictionary whose object is your mapped dictionary and whose key is your type name.