Reactive way to read and parse file from resources using WebFlux?

I wonder what is the correct way to read, parse and serve a file from resources.
Currently, I do something like this:
fun getFile(request: ServerRequest): Mono<ServerResponse> {
    val parsedJson =
        objectMapper.readValue(readFile("fileName.json"), JsonModel::class.java)
    // modify parsed json
    return ok().contentType(APPLICATION_JSON).bodyValue(parsedJson)
}
private fun readFile(fileName: String) =
    DefaultResourceLoader()
        .getResource(fileName)
        .inputStream.bufferedReader().use { it.readText() }
I've noticed the JsonObjectDecoder class in Netty, but I don't know if it can be applied to my use case.
What is the reactive way to read/parse a resource file, then?

After expanding on @vins' answer, I came to the following solution:
Jackson2JsonDecoder()
    .decodeToMono(
        DataBufferUtils.read(
            DefaultResourceLoader()
                .getResource("$fileName.json"),
            DefaultDataBufferFactory(),
            4096
        ),
        ResolvableType.forClass(JsonModel::class.java), null, null
    )
    .map { it as JsonModel }

You can take a look at Flux.using here for file read.
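For illustration, a minimal sketch of that approach might look like the following (the resource name and line-by-line reading are assumptions for the example, not part of the original answer):
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.UncheckedIOException;
import org.springframework.core.io.ClassPathResource;
import reactor.core.publisher.Flux;

// Open the classpath resource lazily, stream it line by line, and close it
// when the subscriber completes or cancels.
private Flux<String> readLines(String fileName) {
    return Flux.using(
            () -> new BufferedReader(new InputStreamReader(
                    new ClassPathResource(fileName).getInputStream())),
            reader -> Flux.fromStream(reader.lines()),
            reader -> {
                try {
                    reader.close();
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
}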
As you are using the Spring framework, you can also take a look at DataBufferUtils.
DataBufferUtils uses an AsynchronousFileChannel to read the file, and it internally uses Flux.using to release the file once it has been fully read or the subscriber cancels.
#Value("classpath:somefile.json")
private Resource resource;
#GetMapping("/resource")
public Flux<DataBuffer> serve(){
return DataBufferUtils.read(
this.resource,
new DefaultDataBufferFactory(),
4096
);
}

Related

ASP.NET Core Read in html file from www root and replace parameters

I am trying to create a printable page that says
Hi, [Member]
We have blah blah....[MemberLocation]
Thanks. Please contact [MemberPhone]
Replacing the [Member] tags with data from the model.
I figured the best way is to save an HTML file in wwwroot.
So I saved the above in member.html on wwwroot/staticfiles/member.html
I am having a hard time reading this file into my code.
I was hoping to do something like
memberString = System.IO.File.ReadAllText("staticfiles/member.html").ToString();
Then do something like
var memberName = model.Name;
memberString = memberString.Replace("[Member]", memberName);
Obviously this isn't working. I can't read the file that way.
I have tried creating a service that reads the file
public class ReturnHTML
{
    private readonly IHostEnvironment _env;

    public ReturnHTML(IHostEnvironment env)
    {
        _env = env;
    }

    public string ReturnHTMLPage()
    {
        var path = Path.Combine(_env.ContentRootPath, "StaticFiles/member.html");
        var htmlString = System.IO.File.ReadAllText(path);
        return htmlString.ToString();
    }
}
I added this as a singleton in my Startup.cs,
but I can't figure out how to inject this and get the string.
So my question is: what is the best way to create a printable page that I can supply values to?
There are third-party tools out there that create PDFs and docs you can template, but I am not interested in third-party tools. Is there any native way of doing this?

Unmarshalling with Jackson "The Json input stream must start with an array of Json objects"

I'm getting an error when unmarshalling files that only contain a single JSON object: "IllegalStateException: The Json input stream must start with an array of Json objects"
I can't find any workaround and I don't understand why it has to be so.
@Bean
public ItemReader<JsonHar> reader(@Value("file:${json.resources.path}/*.json") Resource[] resources) {
    log.info("Processing JSON resources: {}", Arrays.toString(resources));
    JsonItemReader<JsonHar> delegate = new JsonItemReaderBuilder<JsonHar>()
            .jsonObjectReader(new JacksonJsonObjectReader<>(JsonHar.class))
            .resource(resources[0]) //FIXME had to force this, but fails anyway because the file is "{...}" and not "[...]"
            .name("jsonItemReader")
            .build();
    MultiResourceItemReader<JsonHar> reader = new MultiResourceItemReader<>();
    reader.setDelegate(delegate);
    reader.setResources(resources);
    return reader;
}
I need a way to unmarshal single-object files; what's the point in forcing arrays (which I won't have in my use case)?
I don't understand why it has to be so.
The JsonItemReader is designed to read an array of objects because batch processing is usually about handling data sources with a lot of items, not a single item.
I can't find any workaround
JsonObjectReader is what you are looking for: you can implement it to read a single JSON object and use it with the JsonItemReader (either at construction time or using the setter). This is not a workaround but a strategy interface designed for specific use cases like yours.
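For illustration, a minimal sketch of such an implementation could look like this (the class name and details are assumptions for the example, not code from the original answer):
import java.io.InputStream;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.batch.item.json.JsonObjectReader;
import org.springframework.core.io.Resource;

// Treats the whole resource as a single JSON object and returns it exactly once.
public class SingleJsonObjectReader<T> implements JsonObjectReader<T> {

    private final ObjectMapper objectMapper = new ObjectMapper();
    private final Class<T> itemType;
    private Resource resource;
    private boolean consumed;

    public SingleJsonObjectReader(Class<T> itemType) {
        this.itemType = itemType;
    }

    @Override
    public void open(Resource resource) {
        this.resource = resource;
        this.consumed = false;
    }

    @Override
    public T read() throws Exception {
        if (consumed) {
            return null; // null tells the JsonItemReader there are no more items
        }
        consumed = true;
        try (InputStream in = resource.getInputStream()) {
            return objectMapper.readValue(in, itemType);
        }
    }
}
It would then be plugged in with .jsonObjectReader(new SingleJsonObjectReader<>(JsonHar.class)) in the JsonItemReaderBuilder from the question.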
Definitely not ideal, @thomas-escolan. As @mahmoud-ben-hassine pointed out, the ideal would be to code a custom reader.
In case new SO users stumble on this question, I leave here a code example of how to do it.
Though this may not be ideal, this is how I handled the situation:
@Bean
public ItemReader<JsonHar> reader(@Value("file:${json.resources.path}/*.json") Resource[] resources) {
    log.info("Processing JSON resources: {}", Arrays.toString(resources));
    JsonItemReader<JsonHar> delegate = new JsonItemReaderBuilder<JsonHar>()
            .jsonObjectReader(new JacksonJsonObjectReader<>(JsonHar.class))
            .resource(resources[0]) //DEBUG had to force this because of NPE...
            .name("jsonItemReader")
            .build();
    MultiResourceItemReader<JsonHar> reader = new MultiResourceItemReader<>();
    reader.setDelegate(delegate);
    reader.setResources(Arrays.stream(resources)
            .map(WrappedResource::new) // forcing the bride to look good enough
            .toArray(Resource[]::new));
    return reader;
}
@RequiredArgsConstructor
static class WrappedResource implements Resource {

    @Delegate(excludes = InputStreamSource.class)
    private final Resource resource;

    @Override
    public InputStream getInputStream() throws IOException {
        log.info("Wrapping resource: {}", resource.getFilename());
        InputStream in = resource.getInputStream();
        BufferedReader reader = new BufferedReader(new InputStreamReader(in, UTF_8));
        String wrap = reader.lines().collect(Collectors.joining())
                .replaceAll("[^\\x00-\\xFF]", ""); // strips characters outside the 0x00-0xFF range
        return new ByteArrayInputStream(("[" + wrap + "]").getBytes(UTF_8));
    }
}

SpringBatch - how to set up via java config the JsonLineMapper for reading a simple json file

How do I change from "setLineTokenizer(new DelimitedLineTokenizer()...)" to "JsonLineMapper" in the first code snippet below? Basically, it is working with CSV, but I want to change it to read a simple JSON file. I found some threads here asking about complex JSON, but this is not my case. At first I thought I should use a very different approach from the CSV way, but after I read SBiAch05sample.pdf (see the link and snippet at the bottom), I understood that FlatFileItemReader can be used to read JSON format.
From an almost similar question, I can guess that I am not going in the wrong direction. I am trying to find the simplest but elegant and recommended way of fixing this snippet. The wrapper below, unless I am really obligated to work this way, seems to go further than needed. Additionally, the wrapper seems to me more Java 6 style than my attempt, which takes advantage of anonymous methods from Java 7 (as far as I can judge from my studies). Any advice is highly appreciated.
//My Code
@Bean
@StepScope
public FlatFileItemReader<Message> reader() {
    log.info("ItemReader >>");
    FlatFileItemReader<Message> reader = new FlatFileItemReader<Message>();
    reader.setResource(new ClassPathResource("test_json.js"));
    reader.setLineMapper(new DefaultLineMapper<Message>() {
        {
            setLineTokenizer(new DelimitedLineTokenizer() {
                {
                    setNames(new String[] { "field1", "field2"...
//Sample using a wrapper
http://www.manning.com/templier/SBiAch05sample.pdf
import org.springframework.batch.item.file.LineMapper;
import org.springframework.batch.item.file.mapping.JsonLineMapper;
import com.manning.sbia.ch05.Product;

public class WrappedJsonLineMapper implements LineMapper<Product> {

    private JsonLineMapper delegate;

    public Product mapLine(String line, int lineNumber) throws Exception {
        Map<String, Object> productAsMap = delegate.mapLine(line, lineNumber);
        Product product = new Product();
        product.setId((String) productAsMap.get("id"));
        product.setName((String) productAsMap.get("name"));
        product.setDescription((String) productAsMap.get("description"));
        product.setPrice(new Float((Double) productAsMap.get("price")));
        return product;
    }

    public void setDelegate(JsonLineMapper delegate) {
        this.delegate = delegate;
    }
}
Really you have two options for parsing JSON within a Spring Batch job:
Don't create a LineMapper, create a LineTokenizer. Spring Batch's DefaultLineMapper breaks the handling of a record into two phases: parsing the record and mapping the result to an object. The fact that the incoming data is JSON rather than CSV only impacts the parsing piece (which is handled by the LineTokenizer). That being said, you'd have to write your own LineTokenizer to parse the JSON into a FieldSet.
Use the provided JsonLineMapper. Spring Batch provides a LineMapper implementation that uses Jackson to deserialize JSON objects into Java objects (see the sketch below).
In either case, you can't plug a LineMapper in where a LineTokenizer is expected, as they accomplish two different things.
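As a minimal sketch of the second option, adapted to the reader() bean from the question (assuming the file contains one JSON object per line; JsonLineMapper yields a Map<String, Object> per line, so mapping to Message would still be a separate step):
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.JsonLineMapper;
import org.springframework.core.io.ClassPathResource;

@Bean
@StepScope
public FlatFileItemReader<Map<String, Object>> reader() {
    FlatFileItemReader<Map<String, Object>> reader = new FlatFileItemReader<>();
    reader.setResource(new ClassPathResource("test_json.js"));
    // JsonLineMapper parses each line into a Map<String, Object> using Jackson
    reader.setLineMapper(new JsonLineMapper());
    return reader;
}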

How to split messages file into multiple files in play 2.0 framework

I have a huge messages file which I need to split into multiple files for different languages.
For example :
I created one folder for the English locale, i.e. en, and another for the French locale, fr, inside the conf folder.
en contains messages1_en.properties and messages2_en.properties
fr contains messages1_fr.properties and messages2_fr.properties
How do I access these properties files inside my view?
Thanks
The only way to do that without introducing your own alternative implementation (and using it instead of the built-in Messages) is to use hacked locales, so you would do fr_type1, fr_type2 or something like that to select the right alternative.
This is probably a bad idea, since it's always risky to use an API in a different way from how it was intended to be used: there is a high risk of unexpected behaviour, and it might be brittle since there is no guarantee that you will be able to use made-up locales in future versions, etc.
If you look at the Messages implementation you could probably get some ideas of how to implement your own without much fuss.
Good luck!
That's an old question, but I had a similar issue and didn't find a solution anywhere.
This example uses a configuration key to load messages from a file with a custom name, but you can easily modify it to load the messages file from a subdirectory and/or from multiple messages files.
Override play.api.i18n.DefaultMessagesApiProvider
@Singleton
class CustomMessagesApiProvider @Inject() (
    environment: Environment,
    config: Configuration,
    langs: Langs,
    httpConfiguration: HttpConfiguration)
  extends DefaultMessagesApiProvider(environment, config, langs, httpConfiguration) {

  def filename =
    config.get[String]("play.i18n.filename")

  override protected def loadAllMessages: Map[String, Map[String, String]] = {
    langs.availables.map(_.code).map { lang =>
      (lang, loadMessages(filename + "." + lang))
    }.toMap
      .+("default" -> loadMessages(filename))
      .+("default.play" -> loadMessages(filename + ".default"))
  }
}
Add Guice binding in Module.java
@Override
public void configure() {
    bind(DefaultMessagesApiProvider.class).to(CustomMessagesApiProvider.class);
}
It's my first Scala class, so maybe it can be improved. But it works.
To load multiple files (it compiles, but I haven't tested it):
override protected def loadAllMessages: Map[String, Map[String, String]] = {
  langs.availables.map(_.code).map { lang =>
    (lang, loadMessageFiles("." + lang))
  }.toMap
    .+("default" -> loadMessageFiles(""))
    .+("default.play" -> loadMessageFiles(".default"))
}

private def loadMessageFiles(suffix: String) = {
  loadMessages("messages-1" + suffix) ++ loadMessages("messages-2" + suffix)
}

How to export data from LinqPAD as JSON?

I want to create a JSON file for use as part of a simple web prototyping exercise. LINQPad is perfect for accessing the data from my DB in just the shape I need; however, I cannot get it out as JSON very easily.
I don't really care what the schema is, because I can adapt my JavaScript to work with whatever is returned.
Is this possible?
A more fluent solution is to add the following methods to the "My Extensions" File in Linqpad:
public static String DumpJson<T>(this T obj)
{
    return
        obj
            .ToJson()
            .Dump();
}

public static String ToJson<T>(this T obj)
{
    return
        new System.Web.Script.Serialization.JavaScriptSerializer()
            .Serialize(obj);
}
Then you can use them like this in any query you like:
Enumerable.Range(1, 10)
    .Select(i =>
        new
        {
            Index = i,
            IndexTimesTen = i * 10,
        })
    .DumpJson();
I added "ToJson" separately so it can be used in with "Expessions".
This is not directly supported, and I have opened a feature request here. Vote for it if you would also find this useful.
A workaround for now is to do the following:
Set the language to C# Statement(s)
Add an assembly reference (press F4) to System.Web.Extensions.dll
In the same dialog, add a namespace import to System.Web.Script.Serialization
Use code like the following to dump out your query as JSON
new JavaScriptSerializer().Serialize(query).Dump();
There's a solution with Json.NET, since it does indented formatting and renders JSON dates properly. Add Json.NET from NuGet, reference Newtonsoft.Json.dll in your “My Extensions” query, and add the following code:
public static object DumpJson(this object value, string description = null)
{
    return GetJson(value).Dump(description);
}

private static object GetJson(object value)
{
    object dump = value;
    var strValue = value as string;
    if (strValue != null)
    {
        var obj = JsonConvert.DeserializeObject(strValue);
        dump = JsonConvert.SerializeObject(obj, Newtonsoft.Json.Formatting.Indented);
    }
    else
    {
        dump = JsonConvert.SerializeObject(value, Newtonsoft.Json.Formatting.Indented);
    }
    return dump;
}
Use .DumpJson() like .Dump() to render the result. It's possible to add more .DumpJson() overloads with different signatures if necessary.
As of version 4.47, LINQPad has the ability to export JSON built in. Combined with the new lprun.exe utility, it can also satisfy your needs.
http://www.linqpad.net/lprun.aspx