I am currently working with a database that has read and write tables.
There are always two tables with the same schema, distinguished by a numeric suffix, e.g. table1 and table2.
Now, there is another source from which I get the current number. I have to use this number to select from the corresponding table with the matching suffix.
Right now, for every table I have a @MappedSuperclass containing the schema and two implementation classes specifying the table name via @Table(name = "..1") and @Table(name = "..2").
This solution works, but I have since discovered a lot of drawbacks and fear there will be many more. Is there another, better way to solve this?
Unfortunately, I could not find out what this kind of database mechanism is called, so I could not find any other sources on the internet.
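For reference, a minimal sketch of my current setup (the class and column names here are made up for illustration):
import javax.persistence.*;

@MappedSuperclass
public abstract class MeasurementBase
{
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    protected Long id;

    // the shared schema lives here
    protected String payload;
}

@Entity
@Table(name = "measurement1")   // one subclass per suffix
public class Measurement1 extends MeasurementBase {}

@Entity
@Table(name = "measurement2")
public class Measurement2 extends MeasurementBase {}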
Thank you in advance!
The most obvious solution:
if ( num == 1 )
{
    Table1 table1 = createTable1();
    table1.set...;
    entityManager.persist( table1 );
}
else
{
    Table2 table2 = createTable2();
    table2.set...;
    entityManager.persist( table2 );
}
Or, constructing the entity by class name via reflection (with Lombok annotations):
@MappedSuperclass
@Data
public class CommonBase
{
    // @Id and the shared schema columns live here, as described in the question
}

@Entity
@Data
public class Table1 extends CommonBase
{}

@Entity
@Data
public class Table2 extends CommonBase
{}
@Stateless
@LocalBean
public class CommonBaseBean
{
    @Inject
    private CommonBaseBUS commonBaseBUS;

    protected void clientCode() throws ClassNotFoundException
    {
        Table1 t1 = (Table1) commonBaseBUS.createEntityByIndex( 1 );
        t1.set...();
        commonBaseBUS.persistEntity( t1 );

        Table2 t2 = (Table2) commonBaseBUS.createEntityByIndex( 2 );
        t2.set...();
        commonBaseBUS.persistEntity( t2 );
    }
}
@Dependent
class CommonBaseBUS
{
    @Inject
    private CommonBaseDAL commonBaseDAL;

    @Setter
    private String entityBaseName = "qualified.path.Table";

    public CommonBase createEntityByIndex( int index_ ) throws ClassNotFoundException
    {
        String entityName = entityBaseName + index_;
        return createEntityByName( entityName );
    }

    public void persistEntity( CommonBase cb_ )
    {
        commonBaseDAL.persistEntity( cb_ );
    }

    protected CommonBase createEntityByName( String entityName_ ) throws ClassNotFoundException
    {
        Class<?> c = Class.forName( entityName_ );
        try
        {
            return (CommonBase) c.getDeclaredConstructor().newInstance();
        }
        catch ( ReflectiveOperationException ex )
        {
            throw new ClassNotFoundException( entityName_, ex );
        }
    }
}
@Dependent
class CommonBaseDAL
{
    @PersistenceContext
    private EntityManager em;

    public void persistEntity( CommonBase cb_ )
    {
        em.persist( cb_ );
    }
}
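For the read side, a similar lookup could be added to CommonBaseDAL. This is only a sketch, assuming the caller resolves the entity class by index the same way createEntityByName does (it also needs java.util.List / java.util.ArrayList imports):
public List<CommonBase> findAllByClass( Class<? extends CommonBase> entityClass_ )
{
    // JPQL refers to the entity name, so Table1/Table2 transparently select from the suffixed tables
    return new ArrayList<CommonBase>( em.createQuery(
            "SELECT e FROM " + entityClass_.getSimpleName() + " e", entityClass_ ).getResultList() );
}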
I am new to Spring Boot. I am trying to use the save() functionality via the JPA library for the first time, testing with Postman. My database is a legacy MySQL database. Generically speaking, this table contains data on baseball players who have been drafted into a fantasy baseball league. The primary key of my table is 'play_id', and I also track the player's 'mlb_id' (Major League Baseball's unique identifier) in the same table.
Here is my code:
Table setup in MySQL:
CREATE TABLE `mlb_rosters` (
`play_id` int(10) NOT NULL,
`mlb_id` int(10) NOT NULL,
`name_first` varbinary(255) NOT NULL,
`name_last` varbinary(255) NOT NULL,
`bats` varchar(1) NOT NULL,
`throws` varchar(1) NOT NULL,
`birthday` date NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
ALTER TABLE `mlb_rosters`
ADD PRIMARY KEY (`play_id`),
ADD UNIQUE KEY `mlb_id` (`mlb_id`),
ADD UNIQUE KEY `mlb_id_2` (`mlb_id`);
ALTER TABLE `mlb_rosters`
MODIFY `play_id` int(10) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=6730;
I also ran insert statements for approximately 1,500 players, so this is not a blank table.
My entity in Spring Boot:
package com.example.demo.entities;
import javax.persistence.*;
@Entity
@Table(name="mlb_rosters")
public class IbcMlbPlayer {
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
@Column(name="play_id", columnDefinition = "int(10)")
private Integer playId;
@Column(name="mlb_id")
private Integer mlbId;
@Column(name="name_first", columnDefinition = "varbinary(255)")
private String nameFirst;
@Column(name="name_last", columnDefinition = "varbinary(255)")
private String nameLast;
@Column(name="bats")
private String bats;
@Column(name="throws")
private String thrws;
@Column(name="birthday")
private String birthday;
public IbcMlbPlayer(){
}
public Integer getPlayId() {
return playId;
}
public void setPlayId(Integer playId) {
this.playId = playId;
}
public Integer getMlbId() {
return mlbId;
}
public void setMlbId(Integer mlbId) {
this.mlbId = mlbId;
}
public String getNameFirst() {
return nameFirst;
}
public void setNameFirst(String nameFirst) {
this.nameFirst = nameFirst;
}
public String getNameLast() {
return nameLast;
}
public void setNameLast(String nameLast) {
this.nameLast = nameLast;
}
public String getBats() {
return bats;
}
public void setBats(String bats) {
this.bats = bats;
}
public String getThrws() {
return thrws;
}
public void setThrws(String thrws) {
this.thrws = thrws;
}
public String getBirthday() {
return birthday;
}
public void setBirthday(String birthday) {
this.birthday = birthday;
}
}
The relevant path of my controller:
@PostMapping(value = "/saveIbcMlbPlayer")
public IbcMlbPlayer saveIbcMlbPlayer(@RequestBody IbcMlbPlayer ibcMlbPlayer){
return ibcMlbPlayerDao.save(ibcMlbPlayer);
}
My Dao:
package com.example.demo.dao;
import com.example.demo.entities.IbcMlbPlayer;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;
@Repository
public interface IbcMlbPlayerDao extends JpaRepository<IbcMlbPlayer, Integer> {
}
When I attempt to do a Post request to the save path and pass in the JSON object of the player who I'm attempting to create, I get the following error:
Duplicate entry '25' for key 'PRIMARY'
In this case, I've tried this 25 times, so Postman/Spring Boot keep incrementing the 'play_id' field by 1 (this number goes up in the error message by one each time I test).
I understand the error: for whatever reason, Spring Boot isn't taking the max value of the 'play_id' column, incrementing it by one, and then doing the insert. I would have expected 'play_id' to be 6730, which I believe is the table's max play_id plus one. Does anyone know how to fix this? Any help would be really appreciated!
AUTO shouldn't be used as the GenerationType here; use IDENTITY instead. With MySQL, AUTO typically makes Hibernate fall back to its own (table- or sequence-based) generator, which knows nothing about the rows already in the table, whereas IDENTITY delegates key generation to the column's AUTO_INCREMENT:
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name="play_id", columnDefinition = "int(10)")
private Integer playId;
I'm using a MySQL DB.
My entity for the table is Account, with the following fields:
id (long), balance (double), created_on (Date), currency (enum).
When I do a PUT request to update the account, I pass JSON in the request body.
I want to update, for example, only the balance, while the other columns keep their values.
In that case (I'm not passing the currency) the balance is updated, but the currency ends up NULL. Is that because it's an enum?
I've tried the @DynamicUpdate annotation, but it doesn't make any difference.
@RestController
public class AccountController {
@PutMapping("/accounts/{id}")
public void updateAccount(@PathVariable long id, @RequestBody AccountDto accountDto) {
accountService.updateAccount(id, accountDto);
}
}
I'm using AccountDto (which I pass in the request body) and I'm calling the accountService
public void updateAccount(long id, AccountDto accountDto) {
    Account account = accountRepository.getOne(id);
    account.fromDto(accountDto);
    this.accountRepository.save(account);
}
which calls the AccountRepository. The fromDto method on the Account entity looks like this:
public void fromDto(AccountDto accountDto) {
this.balance = accountDto.getBalance();
this.currency = accountDto.getCurrency();
}
Here is the AccountDto class:
public class AccountDto {
private long id;
@NotNull @PositiveOrZero
private double balance;
@NotNull @Enumerated(EnumType.STRING)
private Currency currency;
}
You need to perform a select query on the Account entity and then update only the desired fields.
(E.g., below I'm making my own assumptions about the underlying methods used for accessing the DB.)
public void updateAccount(AccountModel jsonBody) {
    // findById returns an Optional in Spring Data 2.x
    Account entity = accountRepository.findById(jsonBody.getAccountId())
            .orElseThrow(() -> new IllegalArgumentException("Account not found"));
    entity.setBalance(jsonBody.getBalance());
    accountRepository.save(entity);
}
If you get null as the currency in the JSON, you shouldn't overwrite it, so fromDto should look like this:
public void fromDto(AccountDto accountDto) {
this.balance = accountDto.getBalance();
if (accountDto.getCurrency() != null) {
this.currency = accountDto.getCurrency();
}
}
I have three tables: one containing Cards, one containing CardDecks, and a third one implementing a many-to-many relation between the former two and additionally containing a symbol for every relation entry.
My task is to get three columns from the card table and the symbol from the relation table and save them in a data object specifically designed for handling those inputs, the condition being that all entries match the given deckId. Or in (hopefully correct) SQL:
#Query("SELECT R.symbol, C.title, C.type, C.source " +
"FROM card_table C JOIN cards_to_card_deck R ON C.id = R.card_id"+
"WHERE R.card_deck_id = :cardDeckId")
LiveData<List<CardWithSymbol>> getCardsWithSymbolInCardDeckById(long cardDeckId);
But the generated Room implementation class contains:
@Override
public LiveData<List<CardWithSymbol>> getCardsWithSymbolInCardDeckById(long cardDeckId) {
  final String _sql = "SELECT R.symbol, C.title, C.type, C.source FROM cards_to_card_deck R INNER JOIN card_table C ON R.card_id = C.id WHERE R.card_deck_id = ?";
  final RoomSQLiteQuery _statement = RoomSQLiteQuery.acquire(_sql, 1);
  int _argIndex = 1;
  _statement.bindLong(_argIndex, cardDeckId);
  return new ComputableLiveData<List<CardWithSymbol>>() {
    private Observer _observer;

    @Override
    protected List<CardWithSymbol> compute() {
      if (_observer == null) {
        _observer = new Observer("cards_to_card_deck", "card_table") {
          @Override
          public void onInvalidated(@NonNull Set<String> tables) {
            invalidate();
          }
        };
        __db.getInvalidationTracker().addWeakObserver(_observer);
      }
      final Cursor _cursor = __db.query(_statement);
      try {
        final int _cursorIndexOfSymbol = _cursor.getColumnIndexOrThrow("symbol");
        final List<CardWithSymbol> _result = new ArrayList<CardWithSymbol>(_cursor.getCount());
        while (_cursor.moveToNext()) {
          final CardWithSymbol _item;
          final int _tmpSymbol;
          _tmpSymbol = _cursor.getInt(_cursorIndexOfSymbol);
          _item = new CardWithSymbol(_tmpSymbol, null, null, null);
          _result.add(_item);
        }
        return _result;
      } finally {
        _cursor.close();
      }
    }

    @Override
    protected void finalize() {
      _statement.release();
    }
  }.getLiveData();
}
Where
_item = new CardWithSymbol(_tmpSymbol,null,null,null);
should return my fully initialized object.
The CardWithSymbol class is declared as follows:
public class CardWithSymbol {
public int symbol;
public String cardName;
public String cardType;
public String cardSource;
public CardWithSymbol(int symbol, String cardName, String cardType, String cardSource){
this.symbol = symbol;
this.cardName = cardName;
this.cardType = cardType;
this.cardSource = cardSource;
}
}
And the types of the columns returned by the query are:
int symbol, String title, String type, String source
I already went through some debugging and the rest of the application works just fine. I can even read the symbol from the objects returned by the query, but as mentioned above, for some reason Room ignores the other three parameters and just defaults them to null in the query implementation.
So after some trial and error and reading through the DAO documentation once again, I found my error:
When creating a class for handling subsets of columns in Room, it is important to tell Room which variable corresponds to which column via the @ColumnInfo(name = "name of the column goes here") annotation.
So changing my CardWithSymbol class as follows solved the issue for me:
import android.arch.persistence.room.ColumnInfo;
public class CardWithSymbol {
#ColumnInfo(name = "symbol")
public int symbol;
#ColumnInfo(name = "title")
public String cardName;
#ColumnInfo(name = "type")
public String cardType;
#ColumnInfo(name = "source")
public String cardSource;
public CardWithSymbol(int symbol, String cardName, String cardType, String cardSource){
this.symbol = symbol;
this.cardName = cardName;
this.cardType = cardType;
this.cardSource = cardSource;
}
}
I have issues persisting two simple classes with DataNucleus 3.1.3 on MySQL: DataNucleus seems to create invalid foreign keys, ending up in a "foreign key constraint fails" exception from the database.
Here are my classes:
// datastore identity since I don't care about identity here
@PersistenceCapable(identityType = IdentityType.DATASTORE)
class A {
@Persistent
int x;
@Persistent
int y;
}
// identity type: application here to enable id lookups
@PersistenceCapable
class B {
@PrimaryKey
@Persistent(valueStrategy = IdGeneratorStrategy.NATIVE)
long id;
@Persistent
double longitude;
@Persistent
double latitude;
// simple 1:1 unidirectional
@Persistent
A a;
}
SchemaTool created the tables (InnoDB), which look good, but an insert fails. Here are the logs:
12:54:11,369 DEBUG [DataNucleus.Datastore.Native] - INSERT INTO `A` (`X`,`Y`) VALUES (<1>,<1>)
12:54:11,387 DEBUG [DataNucleus.Datastore.Persist] - Execution Time = 18 ms (number of rows = 1)
12:54:11,398 DEBUG [DataNucleus.Datastore.Persist] - Object "foo.A#624af1e" was inserted in the datastore and was given strategy value of "3"
12:54:11,403 DEBUG [DataNucleus.Datastore] - Closing PreparedStatement "org.datanucleus.store.rdbms.ParamLoggingPreparedStatement#6f5ba238"
12:54:11,404 DEBUG [DataNucleus.Datastore.Native] - INSERT INTO `B` (`LONGITUDE`,`LATITUDE`,`A_A_ID_OID`) VALUES (<0.5099776394799052>,<0.6191090630996077>,<51>)
12:54:11,419 WARN [DataNucleus.Datastore.Persist] ... Cannot add or update a child row: a foreign key constraint fails (`xperimental`.`B`, CONSTRAINT `B_FK1` FOREIGN KEY (`A_A_ID_OID`) REFERENCES `A` (`A_ID`))
Looking at log lines (3) and (5), it's very suspicious that the insert into table A returned a PK of "3", but DataNucleus instead uses a value of "51" as the FK to A when inserting data into table B, which causes the violation.
Where is the issue? Thanks
UPDATE: the full sources
Class A
package jdotest.a;
import javax.jdo.annotations.IdentityType;
import javax.jdo.annotations.PersistenceCapable;
import javax.jdo.annotations.Persistent;
@PersistenceCapable(identityType = IdentityType.DATASTORE)
public class A {
@Persistent
private int x;
@Persistent
private int y;
public int getX() {
return x;
}
public int getY() {
return y;
}
}
Class B
package jdotest.b;
import javax.jdo.annotations.IdGeneratorStrategy;
import javax.jdo.annotations.PersistenceCapable;
import javax.jdo.annotations.Persistent;
import javax.jdo.annotations.PrimaryKey;
import jdotest.a.A;
@PersistenceCapable
public class B {
@PrimaryKey
@Persistent(valueStrategy = IdGeneratorStrategy.NATIVE)
long id;
@Persistent
double longitude;
@Persistent
double latitude;
// simple 1:1 unidirectional
@Persistent
A a;
public long getId() {
return id;
}
public double getLongitude() {
return longitude;
}
public double getLatitude() {
return latitude;
}
public void setA(A a) {
this.a = a;
}
public A getA() {
return a;
}
}
Dao
package dao;
import javax.jdo.JDOHelper;
import javax.jdo.PersistenceManager;
import javax.jdo.PersistenceManagerFactory;
import javax.jdo.Transaction;
import jdotest.b.B;
public class BDao {
public void write(B b) {
PersistenceManagerFactory pmf = JDOHelper.getPersistenceManagerFactory("cloud-sql");
PersistenceManager pm = pmf.getPersistenceManager();
Transaction tx = pm.currentTransaction();
try {
tx.begin();
pm.makePersistent(b);
tx.commit();
} finally {
if (tx.isActive())
tx.rollback();
pm.close();
}
}
}
Execution
package exec;
import jdotest.a.A;
import jdotest.b.B;
import dao.BDao;
public class Ex{
public void persist(){
A a = new A();
B b = new B();
b.setA(a);
new BDao().write(b); //<-- exception
}
}
The exception:
java.sql.SQLException: Cannot add or update a child row: a foreign key constraint fails (xperimental.b, CONSTRAINT B_FK1 FOREIGN KEY (A_A_ID_OID) REFERENCES a (A_ID))
We have a fairly complex data model and are using Hibernate and Spring Data JPA on top of MySQL. We have a base class that all domain objects extend to minimize boilerplate code. I would like to be able to add soft delete functionality across all of our domain objects using only this class. However, @SQLDelete requires the table name in the clause:
@SQLDelete(sql="UPDATE (table_name) SET deleted = '1' WHERE id = ?")
@Where(clause="deleted <> '1'")
Does anybody know of a way to generalize the SQLDelete statement and allow the extending domain objects to populate their own table names?
If you use Hibernate and @SQLDelete, there's no easy solution to your question. But you can consider another approach to soft delete, using Spring Data's expression language:
@Override
@Query("select e from #{#entityName} e where e.deleteFlag=false")
public List<T> findAll();
//recycle bin
@Query("select e from #{#entityName} e where e.deleteFlag=true")
public List<T> recycleBin();
@Query("update #{#entityName} e set e.deleteFlag=true where e.id=?1")
@Modifying
public void softDelete(String id);
//#{#entityName} will be substituted by the concrete entity name automatically.
Rewrite your base repository like this; all sub-repository interfaces will then have soft delete ability. A fuller sketch of the wiring follows below.
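For illustration only, here is a minimal sketch of how this could be wired up end to end. The interface and entity names (SoftDeleteRepository, Customer) and the String id type are assumptions made to match softDelete(String id), and it assumes each entity has a boolean deleteFlag field:
import java.util.List;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Modifying;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.NoRepositoryBean;

// Base interface collecting the soft-delete queries; @NoRepositoryBean tells Spring Data
// not to create an implementation for this interface itself.
@NoRepositoryBean
public interface SoftDeleteRepository<T> extends JpaRepository<T, String> {

    @Override
    @Query("select e from #{#entityName} e where e.deleteFlag = false")
    List<T> findAll();

    // recycle bin: everything that has been soft deleted
    @Query("select e from #{#entityName} e where e.deleteFlag = true")
    List<T> recycleBin();

    @Modifying
    @Query("update #{#entityName} e set e.deleteFlag = true where e.id = ?1")
    void softDelete(String id);
}

// Every concrete repository that extends the base interface inherits the behaviour:
public interface CustomerRepository extends SoftDeleteRepository<Customer> {
}
Usage from a service (the softDelete update query needs a transaction, e.g. @Transactional on the service method):
customerRepository.softDelete(customerId);  // sets deleteFlag instead of deleting the row
customerRepository.findAll();               // now returns only entities with deleteFlag = false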
Another approach, which could be more flexible.
On the entity level, create:
@MappedSuperclass
public class SoftDeletableEntity {
public static final String SOFT_DELETED_CLAUSE = "IS_DELETED IS FALSE";
@Column(name = "is_deleted")
private boolean isDeleted;
...
}
Update each entity that should be soft deletable:
@Entity
@Where(clause = SoftDeletableEntity.SOFT_DELETED_CLAUSE)
@Table(name = "table_name")
public class YourEntity extends SoftDeletableEntity {...}
Create a custom repository interface which extends Spring's JpaRepository and add default methods for soft delete. It should serve as the base repository for your repositories, e.g.:
@NoRepositoryBean
public interface YourBaseRepository<T, ID> extends JpaRepository<T, ID> {
default void softDelete(T entity) {
Assert.notNull(entity, "The entity must not be null!");
Assert.isInstanceOf(SoftDeletableEntity.class, entity, "The entity must be soft deletable!");
((SoftDeletableEntity)entity).setIsDeleted(true);
save(entity);
}
default void softDeleteById(ID id) {
Assert.notNull(id, "The given id must not be null!");
this.softDelete(findById(id).orElseThrow(() -> new EmptyResultDataAccessException(
String.format("No %s entity with id %s exists!", "", id), 1)));
}
}
NOTE: If your application doesn't need hard deletes at all, you could also add the following to the base repository:
String HARD_DELETE_NOT_SUPPORTED = "Hard delete is not supported.";
@Override
default void deleteById(ID id) {
throw new UnsupportedOperationException(HARD_DELETE_NOT_SUPPORTED);
}
@Override
default void delete(T entity) {
throw new UnsupportedOperationException(HARD_DELETE_NOT_SUPPORTED);
}
@Override
default void deleteAll(Iterable<? extends T> entities) {
throw new UnsupportedOperationException(HARD_DELETE_NOT_SUPPORTED);
}
@Override
default void deleteAll() {
throw new UnsupportedOperationException(HARD_DELETE_NOT_SUPPORTED);
}
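For illustration, a concrete repository and a call site could then look like this; the Order entity and repository names are assumptions, not from the original answer:
// "Order" stands for any entity that extends SoftDeletableEntity
public interface OrderRepository extends YourBaseRepository<Order, Long> {
}
Usage from a service:
orderRepository.softDeleteById(orderId);  // sets is_deleted = true and saves the row
orderRepository.findAll();                // the @Where clause on the entity hides soft-deleted rows
orderRepository.deleteById(orderId);      // now throws UnsupportedOperationException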
Hope it could be useful.