I've built a CRUD REST API with Spring Boot, Spring Data JPA and MySQL, and everything works.
I'm testing with Postman as well, and it all seems fine: the API connects to my MySQL database and creates, retrieves, updates and deletes records. But whenever I stop the application and start it again, all the data I created is gone.
I need a way to persist the API's data across restarts.
My application.yaml
spring:
  datasource:
    url: jdbc:mysql://localhost:3306/cloud_vendor?useSSL=false
    username: root
    password: root
  mvc:
    pathmatch:
      matching-strategy: ANT_PATH_MATCHER
  # The SQL dialect makes Hibernate generate better SQL for the chosen database
  jpa:
    properties:
      hibernate:
        dialect: org.hibernate.dialect.MySQLDialect
    # JPA settings
    hibernate:
      ddl-auto: create
My CloudVendorController.java
package com.thinkcon.demo.controller;
import com.thinkcon.demo.model.CloudVendor;
import com.thinkcon.demo.service.CloudVendorService;
import java.util.List;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
@RestController
@RequestMapping("/cloudvendor")
public class CloudVendorController {

    CloudVendorService cloudVendorService;

    public CloudVendorController(CloudVendorService cloudVendorService) {
        this.cloudVendorService = cloudVendorService;
    }

    @GetMapping("{vendorId}")
    public CloudVendor getCloudVendorDetails(@PathVariable("vendorId") String vendorId) {
        return cloudVendorService.getCloudVendor(vendorId);
    }

    @GetMapping
    public List<CloudVendor> getAllCloudVendorDetails() {
        return cloudVendorService.getAllCloudVendors();
    }

    @PostMapping
    public String createCloudVendorDetails(@RequestBody CloudVendor cloudVendor) {
        cloudVendorService.createCloudVendor(cloudVendor);
        return "CloudVendor Created Successfully";
    }

    @PutMapping
    public String updateCloudVendorDetails(@RequestBody CloudVendor cloudVendor) {
        cloudVendorService.updateCloudVendor(cloudVendor);
        return "CloudVendor updated Successfully";
    }

    @DeleteMapping("{vendorId}")
    public String deleteCloudVendorDetails(@PathVariable("vendorId") String vendorId) {
        cloudVendorService.deleteCloudVendor(vendorId);
        return "CloudVendor deleted Successfully";
    }
}
And my model\CloudVendor.java
package com.thinkcon.demo.model;
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import jakarta.persistence.Table;
@Entity
@Table(name = "cloud_vendor_info")
public class CloudVendor {

    @Id
    private String vendorId;
    private String vendorName;
    private String vendorAddress;
    private String vendorPhoneNumber;
public CloudVendor(String vendorId, String vendorName, String vendorAddress, String vendorPhoneNumber) {
this.vendorId = vendorId;
this.vendorName = vendorName;
this.vendorAddress = vendorAddress;
this.vendorPhoneNumber = vendorPhoneNumber;
}
public CloudVendor() {
}
public String getVendorId() {
return vendorId;
}
public String getVendorName() {
return vendorName;
}
public String getVendorAddress() {
return vendorAddress;
}
public String getVendorPhoneNumber() {
return vendorPhoneNumber;
}
public void setVendorId(String vendorId) {
this.vendorId = vendorId;
}
public void setVendorName(String vendorName) {
this.vendorName = vendorName;
}
public void setVendorAddress(String vendorAddress) {
this.vendorAddress = vendorAddress;
}
public void setVendorPhoneNumber(String vendorPhoneNumber) {
this.vendorPhoneNumber = vendorPhoneNumber;
}
}
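The service layer isn't shown in the post; for context, here is a minimal hypothetical sketch of the interface the controller is coded against. Only the method names and signatures are implied by the controller calls above; everything else is assumption.

package com.thinkcon.demo.service;

import com.thinkcon.demo.model.CloudVendor;
import java.util.List;

// Hypothetical sketch; inferred from the controller, not taken from the original post.
public interface CloudVendorService {
    void createCloudVendor(CloudVendor cloudVendor);
    void updateCloudVendor(CloudVendor cloudVendor);
    void deleteCloudVendor(String vendorId);
    CloudVendor getCloudVendor(String vendorId);
    List<CloudVendor> getAllCloudVendors();
}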
Again: I need a solution; I can't have my data wiped out every time I stop the application.
This happens because of your ddl-auto setting. There are five possible values:

create -> drops existing tables, then creates new tables.
update -> creates tables if needed, compares with the existing schema, and updates it if there are changes; it never deletes existing tables or columns.
create-drop -> similar to create, but it also drops the tables after shutdown.
validate -> validates that the tables and columns exist, otherwise it throws an exception.
none -> turns off DDL generation.

So you will need:

jpa.hibernate.ddl_auto: update
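In the nested layout of your application.yaml, that maps to the following (Spring Boot's relaxed binding also accepts ddl_auto, but ddl-auto is the canonical key):

spring:
  jpa:
    hibernate:
      ddl-auto: update

With update, Hibernate alters the schema as needed on startup instead of dropping and recreating it, so your data survives restarts.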
Use case: I want to implement only a Protobuf (binary protocol) implementation, but I need a way to add configuration so the same implementation is exposed as a REST/JSON API as well, without code duplication.
I have proto endpoints exposed. I also want consumers to be able to post the JSON equivalent of those proto objects and receive the JSON equivalent of the results, with type information (a POJO?). The type information helps with OpenAPI/Swagger documentation too.
What is the most elegant and simple way to achieve this without code duplication?
Any example GitHub code that achieves it would be helpful.
Note: this is for WebFlux and Netty, not Tomcat.
ProtobufJsonFormatHttpMessageConverter works for Tomcat but does not work for Netty. A working code example would be great.
I was messing around with this and ended up with the following. Nothing else worked for me.
Using proto3, with a protobuf definition like this:
syntax = "proto3";
option java_package = "com.company";
option java_multiple_files = true;
message CreateThingRequest {
...
message CreateThingResponse {
....
I can scan for the generated protobuf classes by setting app.protoPath in my application.properties:
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.google.common.reflect.ClassPath;
import com.google.protobuf.Message;
import com.google.protobuf.util.JsonFormat;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.codec.ServerCodecConfigurer;
import org.springframework.http.codec.json.Jackson2JsonDecoder;
import org.springframework.http.codec.json.Jackson2JsonEncoder;
import org.springframework.http.converter.json.Jackson2ObjectMapperBuilder;
import org.springframework.web.reactive.config.WebFluxConfigurer;

@Configuration
public class WebConfig implements WebFluxConfigurer {

    @Value("${app.protoPath:com.}")
    private String protoPath;

    @Override
    public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
        // Serialize any protobuf Message to JSON via JsonFormat.
        configurer.defaultCodecs().jackson2JsonEncoder(
            new Jackson2JsonEncoder(Jackson2ObjectMapperBuilder.json().serializerByType(
                Message.class, new JsonSerializer<Message>() {
                    @Override
                    public void serialize(Message value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
                        String str = JsonFormat.printer().omittingInsignificantWhitespace().print(value);
                        gen.writeRawValue(str);
                    }
                }
            ).build())
        );

        // Scan the classpath for generated Message classes and register a JSON deserializer for each.
        final ClassLoader loader = Thread.currentThread().getContextClassLoader();
        Map<Class<?>, JsonDeserializer<?>> deserializers = new HashMap<>();
        try {
            for (final ClassPath.ClassInfo info : ClassPath.from(loader).getTopLevelClasses()) {
                if (info.getName().startsWith(protoPath)) {
                    final Class<?> clazz = info.load();
                    if (!Message.class.isAssignableFrom(clazz)) {
                        continue;
                    }
                    @SuppressWarnings("unchecked") final Class<Message> proto = (Class<Message>) clazz;
                    final JsonDeserializer<Message> deserializer = new CustomJsonDeserializer() {
                        @Override
                        public Class<Message> getDeserializeClass() {
                            return proto;
                        }
                    };
                    deserializers.put(proto, deserializer);
                }
            }
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        configurer.defaultCodecs().jackson2JsonDecoder(new Jackson2JsonDecoder(Jackson2ObjectMapperBuilder.json().deserializersByType(deserializers).build()));
    }

    private abstract static class CustomJsonDeserializer extends JsonDeserializer<Message> {

        abstract Class<? extends Message> getDeserializeClass();

        @Override
        public Message deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException {
            Message.Builder builder;
            try {
                // Generated Message classes expose a static newBuilder() factory.
                builder = (Message.Builder) getDeserializeClass()
                        .getDeclaredMethod("newBuilder")
                        .invoke(null);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
            JsonFormat.parser().merge(jp.getCodec().readTree(jp).toString(), builder);
            return builder.build();
        }
    }
}
Then I just use the proto object types in the method signatures:

@PostMapping(
    path = "/things",
    consumes = {MediaType.APPLICATION_JSON_VALUE, "application/x-protobuf"},
    produces = {MediaType.APPLICATION_JSON_VALUE, "application/x-protobuf"})
Mono<CreateThingResponse> createThing(@RequestBody CreateThingRequest request);
With https://github.com/innogames/springfox-protobuf you can get the responses to show in Swagger, but the requests still aren't showing for me.
You'll have to excuse the messy Java; I'm a little rusty.
I needed to support JSON as well, and the following code helped:
// Imports assumed for this snippet; ProtobufModule is presumably the one from
// HubSpot's jackson-datatype-protobuf library.
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.hubspot.jackson.datatype.protobuf.ProtobufModule;
import org.springframework.context.annotation.Bean;
import org.springframework.http.codec.ServerCodecConfigurer;
import org.springframework.http.codec.json.Jackson2JsonDecoder;
import org.springframework.http.codec.json.Jackson2JsonEncoder;
import org.springframework.web.reactive.config.WebFluxConfigurer;

@Bean
public WebFluxConfigurer webFluxConfigurer() {
    return new WebFluxConfigurer() {
        @Override
        public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
            ObjectMapper mapper = new ObjectMapper()
                    .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
                    .configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false)
                    .registerModule(new ProtobufModule());
            configurer.customCodecs().register(new Jackson2JsonEncoder(mapper));
            configurer.customCodecs().register(new Jackson2JsonDecoder(mapper));
        }
    };
}
Try adding a ProtobufEncoder in your WebFlux config:

@Configuration
@EnableWebFlux
public class MyConfig implements WebFluxConfigurer {

    @Override
    public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
        configurer.customCodecs().register(new ProtobufEncoder());
    }
}
Then in your request mapping return the proto object:
@GetMapping(produces = "application/x-protobuf")
public MyProtoObject lookup() {
    // Generated protobuf classes are built via newBuilder(), not a public constructor.
    return MyProtoObject.newBuilder().build();
}
Furthermore, if you want to serialize the proto object into JSON and return a String, have a look at the com.googlecode.protobuf-java-format:protobuf-java-format library and its JsonFormat::printToString capability (https://code.google.com/archive/p/protobuf-java-format/):

@GetMapping
public String lookup() {
    return new JsonFormat().printToString(MyProtoObj.newBuilder().build());
}
Since version 4.1, Spring provides org.springframework.http.converter.protobuf.ProtobufHttpMessageConverter for reading and writing protos as JSON.
However, if you are using Spring 5.x and Protobuf 3.x, there is org.springframework.http.converter.protobuf.ProtobufJsonFormatHttpMessageConverter for more explicit JSON conversion.
This documentation should help:
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/http/converter/protobuf/ProtobufHttpMessageConverter.html
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/http/converter/protobuf/ProtobufJsonFormatHttpMessageConverter.html
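For a Spring MVC (Tomcat) stack, registering that converter is typically just a bean declaration. A minimal sketch, assuming Spring 5.x and Protobuf 3.x on the classpath (as the question notes, this route does not help on WebFlux/Netty):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.converter.protobuf.ProtobufJsonFormatHttpMessageConverter;

@Configuration
public class ProtoConverterConfig {

    // Spring Boot picks up HttpMessageConverter beans automatically for MVC endpoints,
    // so declaring the converter is usually enough.
    @Bean
    public ProtobufJsonFormatHttpMessageConverter protobufJsonFormatHttpMessageConverter() {
        return new ProtobufJsonFormatHttpMessageConverter();
    }
}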
I'm trying to upload a file in HTML and then send it to my database via Restangular.
My frontend is Angular with TypeScript, but the upload itself is a plain form.
<form enctype="multipart/form-data">
<fieldset class="form-group" ng-repeat="field in $ctrl.metadata.fields">
<label ng-if="field.inputType !== 'hidden'" for="{{field.propertyKey}}"><strong>{{field.name}}</strong></label>
<input ng-if="field.inputType !== 'select' && field.inputType !== 'file'" class="form-control" type="{{field.inputType}}" name="{{field.propertyKey}}" id="{{field.propertyKey}}" ng-model="$ctrl.data[field.propertyKey]"/>
<input ng-if="field.inputType === 'file'" class="form-control" ngf-select type="{{field.inputType}}" name="{{field.propertyKey}}" id="{{field.propertyKey}}" ng-model="$ctrl.data[field.propertyKey]"/>
<sp-dropdown ng-if="field.inputType === 'select'" value="$ctrl.data[field.propertyKey]" api-domain="field.linkedObjectApiDomain" linked-object-name="field.linkedObjectName"></sp-dropdown>
</fieldset>
<button class="btn btn-primary" ng-click="$ctrl.save({item: $ctrl.data})">Save</button>
<button ng-if="$ctrl.metadata.buttons.hasOpen" class="btn btn-primary" ng-click="$ctrl.open()">Open</button>
</form>
I did the data binding of the file with ng-file-upload.
Upon saving, we enter this TypeScript save method:
public save(item: any): any {
console.log("item to save is ", item);
console.log("rapport is ", item["rapport"]);
if (item.id === undefined) {
this.restService.save(this.metadata.apiDomain, item).then((addedItem: any) => {
toastr.success(`${addedItem.naam} successfully created.`, `Overzicht Dossiers Created`);
});
} else {
this.restService.update(this.metadata.apiDomain, item).then((updatedItem: any) => {
toastr.success(`${updatedItem.naam} successfully updated.`, `Overzicht Dossiers Updated`);
});
}
}
The second log, for the file, prints the following object:
lastModified:1463402787393
lastModifiedDate:Mon May 16 2016 14:46:27 GMT+0200 (Romance (zomertijd))
name:"Rapport.pdf"
size:83605
type:"application/pdf"
upload:Promise
webkitRelativePath:""
__proto__:File
On the server side I'm using a Spring project which I didn't set up myself, but the important files are the entity class which should store this data:
Dossier
package be.ugent.lca.data.entities;
import be.ugent.sherpa.entity.BaseEntity;
import java.sql.Date;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.JoinColumn;
import javax.persistence.Lob;
import javax.persistence.ManyToOne;
import javax.persistence.OneToOne;
/**
 * @author Sam
 */
@Entity
//@JsonDeserialize(using = DossierDeserializer.class)
//@JsonSerialize(using = DossierSerializer.class)
public class Dossier extends BaseEntity {

    private String externDossierNr;
    private String internDossierNr;
    private Date datum;
    private Boolean doc;
    private Date refKlantDatum;
    private String refKlantVerwijzing;
    private String verantw;

    @OneToOne(fetch = FetchType.LAZY, mappedBy = "dossier")
    private Offerte offerte;

    private String status;

    @ManyToOne(fetch = FetchType.EAGER)
    @JoinColumn(name = "persoon")
    private Persoon persoon;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "OrganisatieFirma")
    private OrganisatieFirma organisatieFirma;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "OrganisatieIntern")
    private OrganisatieIntern organisatieIntern;

    @Lob
    @Column(length = 100000)
    private byte[] rapport;
public Offerte getOfferte() {
return offerte;
}
public void setOfferte(Offerte offerte) {
this.offerte = offerte;
}
public byte[] getRapport() {
return rapport;
}
public void setRapport(byte[] rapport) {
this.rapport = rapport;
}
public OrganisatieFirma getOrganisatieFirma() {
return organisatieFirma;
}
public String getExternDossierNr() {
return externDossierNr;
}
public void setExternDossierNr(String externDossierNr) {
this.externDossierNr = externDossierNr;
}
public String getInternDossierNr() {
return internDossierNr;
}
public void setInternDossierNr(String internDossierNr) {
this.internDossierNr = internDossierNr;
}
public void setOrganisatieFirma(OrganisatieFirma organisatieFirma) {
this.organisatieFirma = organisatieFirma;
}
public OrganisatieIntern getOrganisatieIntern() {
return organisatieIntern;
}
public void setOrganisatieIntern(OrganisatieIntern organisatieIntern) {
this.organisatieIntern = organisatieIntern;
}
public Persoon getPersoon() {
return persoon;
}
public void setPersoon(Persoon persoon) {
this.persoon = persoon;
}
public String getStatus() {
return status;
}
public void setStatus(String status) {
this.status = status;
}
public Date getDatum() {
return datum;
}
public void setDatum(Date datum) {
this.datum = datum;
}
public Date getRefKlantDatum() {
return refKlantDatum;
}
public void setRefKlantDatum(Date refKlantDatum) {
this.refKlantDatum = refKlantDatum;
}
public String getRefKlantVerwijzing() {
return refKlantVerwijzing;
}
public void setRefKlantVerwijzing(String refKlantVerwijzing) {
this.refKlantVerwijzing = refKlantVerwijzing;
}
public String getVerantw() {
return verantw;
}
public void setVerantw(String verantw) {
this.verantw = verantw;
}
public Boolean getDoc() {
return doc;
}
public void setDoc(Boolean doc) {
this.doc = doc;
}
}
and my repository for this class
package be.ugent.lca.data.repository;
import be.ugent.lca.data.entities.Dossier;
import be.ugent.lca.data.query.DossierQuery;
import be.ugent.sherpa.repository.RestRepository;
import org.springframework.data.rest.core.annotation.RepositoryRestResource;
/**
 * @author Sam
 */
@RepositoryRestResource(collectionResourceRel = "dossiers", path = "dossiers")
public interface DossierRepository extends RestRepository<Dossier, DossierQuery<?>> {
}
When trying to save a file to my database, the server gives this exception:
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of byte[] out of START_OBJECT token
This led me to believe that I have to write my own deserializer for Dossier, thus:
package be.ugent.lca.data.entities.deserializers;
import be.ugent.lca.data.entities.Dossier;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.ObjectCodec;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonNode;
import java.io.IOException;
public class DossierDeserializer extends JsonDeserializer<Dossier> {

    @Override
    public Dossier deserialize(JsonParser jsonParser,
            DeserializationContext deserializationContext) throws IOException {
ObjectCodec oc = jsonParser.getCodec();
JsonNode root = oc.readTree(jsonParser);
Dossier dossier = new Dossier();
dossier.setExternDossierNr(root.get("externDossierNr").asText());
dossier.setInternDossierNr(root.get("internDossierNr").asText());
return dossier;
}
}
But my problem is that I don't know exactly how to deserialize the file JSON, since writing out root.get("rapport") gives back an empty string. (That makes sense in hindsight: a browser File object carries only metadata when serialized to JSON; the file's bytes are never included.)
Any help would be much appreciated.
I've worked out the file upload.
First of all, I split the file upload from the rest of my data, so I won't have to rewrite the automatic deserialization for everything that already works.
this.restService.save(this.metadata.apiDomain, item).then((addedItem: any) => {
toastr.success(`${addedItem.naam} successfully created.`, `Overzicht Dossiers Created`);
console.log("created item ", addedItem);
var fd = new FormData();
fd.append("rapport", item["rapport"]);
this.restService.one('dossiers/' + addedItem.id + '/rapport').withHttpConfig({transformRequest: angular.identity}).customPOST(fd, '', undefined, {'Content-Type': undefined}).then(
(addedDossier: any) => {
console.log("posted dossier ", addedDossier);
}
);
});
In the callback of my normal save, I do the custom POST to dossiers/{id}/rapport. For this I need a custom controller.
@BasePathAwareController
@RequestMapping("/dossiers/{id}")
@ExposesResourceFor(Dossier.class)
public class DossierController {
The BasePathAwareController makes sure that all automatically generated paths that you don't override keep existing.
@Autowired
private DossierRepository dossierRepository;
With this I inject my repository to connect to my database.
@RequestMapping(path = "/rapport", method = RequestMethod.POST) //, headers = "content-type=multipart/form-data")
public @ResponseBody String postRapport(@PathVariable("id") Long id, @RequestParam("rapport") MultipartFile file) {
    String name = "rapport";
    System.out.println("Entered custom file upload with id " + id);
    if (!file.isEmpty()) {
        try {
            byte[] bytes = file.getBytes();
            Dossier dossier = dossierRepository.findOne(id);
            dossier.setRapport(bytes);
            dossierRepository.save(dossier);
            return "You successfully uploaded " + name + " into " + name + "-uploaded !";
        } catch (Exception e) {
            return "You failed to upload " + name + " => " + e.getMessage();
        }
    } else {
        return "You failed to upload " + name + " because the file was empty.";
    }
}
Like this I'm able to successfully upload my file.
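For completeness, a download endpoint for the stored bytes could live in the same controller. A minimal hypothetical sketch, reusing the findOne call from above and assuming the uploads are PDFs as in the example:

// Hypothetical companion endpoint; not part of the original answer.
@RequestMapping(path = "/rapport", method = RequestMethod.GET, produces = "application/pdf")
public @ResponseBody byte[] getRapport(@PathVariable("id") Long id) {
    return dossierRepository.findOne(id).getRapport();
}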
I have to start a Jenkins parameterized build programmatically using the Jersey REST API, and the values for the parameters must be provided as a JSON object. Any hint or example is welcome.
So, it seems like you have not tried it yourself. I can give you a fast five-minute solution that should be reworked to be cleaner and less ugly, but it works :)
import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.api.client.WebResource;

public class JenkinsJob {

    public static void main(String[] args) {
        runParamJob("http://jenkins.host/", "SOME_JOB", "{\"object\":\"test\"}");
    }

    public static String runParamJob(String url, String jobName, String paramsJSON) {
        String USERNAME = "user";
        String PASSWORD = "pass";
        Client client = Client.create();
        client.addFilter(new com.sun.jersey.api.client.filter.HTTPBasicAuthFilter(USERNAME, PASSWORD));
        WebResource webResource = client.resource(url + jobName + "/buildWithParameters?PARAMETER=" + paramsJSON);
        // The JSON is passed in the query string above; Jersey's get() takes no request body.
        ClientResponse response = webResource.type("application/json").get(ClientResponse.class);
        String jsonResponse = response.getEntity(String.class);
        client.destroy();
        System.out.println("Server response:" + jsonResponse);
        return jsonResponse;
    }
}
In order to use the REST API for a parameterized build, you should use POST and not GET, according to the Jenkins wiki. Here is an example. Make sure that the JSON you send is as instructed in the documentation.
Take this JSON for example:
{"parameter": [{"name":"id", "value":"123"}, {"name":"verbosity", "value":"high"}]}
You have two parameters, with a name and a value for each. If I use what was written in the former answer by @stanjer, the JSON should look like this:
{"parameter": [{"name":"object", "value":"test"}]}
In addition there is a good discussion about it here
I would not recommend using USER:PASSWORD; use a token instead, which can be configured in the Jenkins job UI. Here is a class that implements the builder pattern for triggering with or without parameters.
import java.util.Map;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.MultivaluedMap;
import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.api.client.WebResource;
import com.sun.jersey.core.util.MultivaluedMapImpl;
public class JenkinsTrigger {
private String host;
private String jenkinsToken;
private String jobParams;
private MultivaluedMap<String,String> queryParams = new MultivaluedMapImpl();
private Client client = Client.create();
private WebResource webResource;
private JenkinsTrigger(JenkinsTriggerBuilder jenkinsTriggerBuilder){
this.host = jenkinsTriggerBuilder.host;
this.jenkinsToken = jenkinsTriggerBuilder.jenkinsToken;
this.jobParams = getJobParams(jenkinsTriggerBuilder.jobParams);
webResource = client.resource(this.host);
queryParams.add("token", jenkinsToken);
}
public void trigger(){
ClientResponse response = webResource.path(this.host).queryParams(queryParams)
.type(MediaType.APPLICATION_FORM_URLENCODED_TYPE)
.header("Content-type", "application/x-www-form-urlencoded")
.post(ClientResponse.class, jobParams);
if (response.getStatus() != 201) {
throw new RuntimeException("Failed : HTTP error code : "
+ response.toString());
} else {
System.out.println("Job Trigger: " + host);
}
}
private String getJobParams(Map<String,String> jobParams){
StringBuilder sb = new StringBuilder();
sb.append("json={\"parameter\":[");
jobParams.keySet().forEach(param -> {
sb.append("{\"name\":\""+param+"\",");
sb.append("\"value\":\""+ jobParams.get(param) + "\"},");
});
sb.setLength(sb.length() - 1);
sb.append("]}");
System.out.println("Job Parameters:" + sb.toString());
return sb.toString();
}
public static class JenkinsTriggerBuilder {
private String host;
private String jenkinsToken;
private Map<String,String> jobParams = null;
public JenkinsTriggerBuilder(String host, String jenkinsToken){
this.host = host;
this.jenkinsToken = jenkinsToken;
}
public JenkinsTriggerBuilder jobParams(Map<String,String> jobParams){
this.jobParams = jobParams;
return this;
}
public JenkinsTrigger build(){
return new JenkinsTrigger(this);
}
}
}
And here is a usage sample:
Map<String, String> params = new HashMap<>();
params.put("ENV", "DEV103");
JenkinsTrigger trigger = new JenkinsTriggerBuilder("https://JENKINS_HOST/job/JOB_NAME/buildWithParameters","JOB_TOKEN").jobParams(params).build();
trigger.trigger();
Best of luck.
I created a RESTful service which generates JSON with the Gson API, but I need the table name in front of the JSON structure, and I can't get that output. Let me show the code.
package webService;
import java.util.ArrayList;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import com.google.gson.Gson;
import model.AccessManager;
import dto.Usuarios;
#Path("/UsuariosService")
public class UsuariosService
{
#GET
#Path("/usuarios")
#Produces("application/json")
public String usuarios()
{
String usuarios = null;
ArrayList<Usuarios> usuariosList = new ArrayList<Usuarios>();
try
{
usuariosList = new AccessManager().getUsuarios();
Gson gson = new Gson();
usuarios = gson.toJson(usuariosList);
} catch (Exception e)
{
e.printStackTrace();
}
return usuarios;
}
}
The return I need is:
{
"usuarios" : [
{"usr_id":1,"usr_login":"teste#gmail.com","usr_pwd":"123456"},
{"usr_id":2,"usr_login":"teste#teste.com.br","usr_pwd":"123456"}
]
}
But the return I get is:
[
{"usr_id":1,"usr_login":"teste#gmail.com","usr_pwd":"123456"},
{"usr_id":2,"usr_login":"teste#teste.com.br","usr_pwd":"123456"}
]
i.e. without the table's name, but that name is needed in my SAPUI5 application.
What stops you from just adding that as a String?
"{\"usuarios\" :" + gson.toJson(usuariosList) + "}"
Although you need to escape the quote characters.
Otherwise, you have to introduce a different return object:
class ReturnObject {
List<Usuarios> usuarios;
public ReturnObject(List<Usuarios> usuarios) {
this.usuarios = usuarios;
}
}
And use that
gson.toJson(new ReturnObject(usuariosList));
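A third option, if you'd rather not declare a wrapper class, is Gson's JsonObject tree API. A small sketch using the same usuariosList as above:

import com.google.gson.Gson;
import com.google.gson.JsonObject;

// Wrap the serialized list under a "usuarios" key without a dedicated class.
Gson gson = new Gson();
JsonObject wrapper = new JsonObject();
wrapper.add("usuarios", gson.toJsonTree(usuariosList));
String usuarios = wrapper.toString();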
Can anyone please tell me if there is any way in Apache Spark to store a JavaRDD in a MySQL database? I am taking input from two CSV files, and after doing join operations on their contents I need to save the output (the output JavaRDD) in a MySQL database. I am already able to save the output successfully to HDFS, but I cannot find any information related to an Apache Spark-to-MySQL connection. Below I am posting my Spark SQL code; it might serve as a reference to those who are looking for a Spark SQL example.
package attempt1;
import java.io.Serializable;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.api.java.JavaSQLContext;
import org.apache.spark.sql.api.java.JavaSchemaRDD;
import org.apache.spark.sql.api.java.Row;
public class Spark_Mysql {

    @SuppressWarnings("serial")
    public static class CompleteSample implements Serializable {
private String ASSETNUM;
private String ASSETTAG;
private String CALNUM;
public String getASSETNUM() {
return ASSETNUM;
}
public void setASSETNUM(String aSSETNUM) {
ASSETNUM = aSSETNUM;
}
public String getASSETTAG() {
return ASSETTAG;
}
public void setASSETTAG(String aSSETTAG) {
ASSETTAG = aSSETTAG;
}
public String getCALNUM() {
return CALNUM;
}
public void setCALNUM(String cALNUM) {
CALNUM = cALNUM;
}
}
#SuppressWarnings("serial")
public static class ExtendedSample implements Serializable {
private String ASSETNUM;
private String CHANGEBY;
private String CHANGEDATE;
public String getASSETNUM() {
return ASSETNUM;
}
public void setASSETNUM(String aSSETNUM) {
ASSETNUM = aSSETNUM;
}
public String getCHANGEBY() {
return CHANGEBY;
}
public void setCHANGEBY(String cHANGEBY) {
CHANGEBY = cHANGEBY;
}
public String getCHANGEDATE() {
return CHANGEDATE;
}
public void setCHANGEDATE(String cHANGEDATE) {
CHANGEDATE = cHANGEDATE;
}
}
#SuppressWarnings("serial")
public static void main(String[] args) throws Exception {
JavaSparkContext ctx = new JavaSparkContext("local[2]", "JavaSparkSQL");
JavaSQLContext sqlCtx = new JavaSQLContext(ctx);
JavaRDD<CompleteSample> cs = ctx.textFile("C:/Users/cyg_server/Documents/bigDataExample/AssetsImportCompleteSample.csv").map(
new Function<String, CompleteSample>() {
public CompleteSample call(String line) throws Exception {
String[] parts = line.split(",");
CompleteSample cs = new CompleteSample();
cs.setASSETNUM(parts[0]);
cs.setASSETTAG(parts[1]);
cs.setCALNUM(parts[2]);
return cs;
}
});
JavaRDD<ExtendedSample> es = ctx.textFile("C:/Users/cyg_server/Documents/bigDataExample/AssetsImportExtendedSample.csv").map(
new Function<String, ExtendedSample>() {
public ExtendedSample call(String line) throws Exception {
String[] parts = line.split(",");
ExtendedSample es = new ExtendedSample();
es.setASSETNUM(parts[0]);
es.setCHANGEBY(parts[1]);
es.setCHANGEDATE(parts[2]);
return es;
}
});
JavaSchemaRDD complete = sqlCtx.applySchema(cs, CompleteSample.class);
complete.registerAsTable("cs");
JavaSchemaRDD extended = sqlCtx.applySchema(es, ExtendedSample.class);
extended.registerAsTable("es");
JavaSchemaRDD fs= sqlCtx.sql("SELECT cs.ASSETTAG, cs.CALNUM, es.CHANGEBY, es.CHANGEDATE FROM cs INNER JOIN es ON cs.ASSETNUM=es.ASSETNUM;");
JavaRDD<String> result = fs.map(new Function<Row, String>() {
public String call(Row row) {
return row.getString(0);
}
});
result.saveAsTextFile("hdfs://path/to/hdfs/dir-name"); //instead of hdfs I need to save it on mysql database, but I am not able to find any Spark-MYSQL connection
}
}
Here at the end I am saving the result successfully to HDFS. But now I want to save it into a MySQL database. Kindly help me out. Thanks.
There are two approaches you can use for writing your results back to the database. One is to use something like DBOutputFormat and configure that; the other is to use foreachPartition on the RDD you want to save and pass in a function which creates a connection to MySQL and writes the result back (a sketch of that second approach follows the DBOutputFormat example below).
Here is an example using DBOutputFormat.
Create a class that represents your table row:
public class TableRow implements DBWritable {

    public String column1;
    public String column2;

    @Override
    public void write(PreparedStatement statement) throws SQLException {
        statement.setString(1, column1);
        statement.setString(2, column2);
    }

    @Override
    public void readFields(ResultSet resultSet) throws SQLException {
        throw new RuntimeException("readFields not implemented");
    }
}
Then configure your job and write a mapToPair function. The value doesn't appear to be used. If anyone knows, please post a comment.
String tableName = "YourTableName";
String[] fields = new String[] { "column1", "column2" };
JobConf job = new JobConf();
DBConfiguration.configureDB(job, "com.mysql.jdbc.Driver", "jdbc:mysql://localhost/DatabaseNameHere", "username", "password");
DBOutputFormat.setOutput(job, tableName, fields);
// map your rdd into a table row
JavaPairRDD<TableRow, Object> rows = rdd.mapToPair(...);
rows.saveAsHadoopDataset(job);
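And a minimal sketch of the second approach, foreachPartition with plain JDBC. This is illustrative only: it assumes the MySQL driver is on the classpath, and rowRdd is a hypothetical JavaRDD<TableRow> holding the rows to write.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.Iterator;
import org.apache.spark.api.java.function.VoidFunction;

rowRdd.foreachPartition(new VoidFunction<Iterator<TableRow>>() {
    public void call(Iterator<TableRow> partition) throws Exception {
        // Open one connection per partition rather than per record.
        Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost/DatabaseNameHere", "username", "password");
        PreparedStatement stmt = conn.prepareStatement(
                "INSERT INTO YourTableName (column1, column2) VALUES (?, ?)");
        while (partition.hasNext()) {
            TableRow row = partition.next();
            stmt.setString(1, row.column1);
            stmt.setString(2, row.column2);
            stmt.addBatch();
        }
        stmt.executeBatch();
        stmt.close();
        conn.close();
    }
});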