Scala-Slick: only part of sequence of actions executed - mysql

I have a piece of code that adds/updates a Product and also associates one or more tags with it. Tags are actually added to a TagGroup, which is then associated with the Product.
The issue I am facing is that only part of addOrUpdateProductWithTags() executes: the Product is updated or created, but the Tags are not added. If I comment out the last query (see comment) then everything works. I have turned on "" to confirm this.
lazy val pRetId = prods returning prods.map(_.id)
def addTags(keywords: Seq[String]) = {
  for {
    k <- keywords
  } yield {
    tags.filter(_.keyword === k).take(1).result.headOption.flatMap {
      case Some(tag) => {
        Logger.debug("Using existing tag: " + k)
        DBIO.successful(tag.id)
      }
      case None => {
        Logger.debug("Adding new tag: " + k)
        tags.returning(tags.map(_.id)) += Tag(k, Some("DUMMY"))
      }
    }
  }
}
def addOrUpdateProductWithTags(prod: Product, tagSet: Seq[String]): Future[Option[Long]] = {
  // handle add or update product
  val prodObject = prod.id match {
    case 0L => pRetId += prod
    case _  => prods.withFilter(_.id === prod.id).update(prod)
  }
  val action = for {
    pid  <- prodObject
    tids <- DBIO.sequence(addTags(tagSet))
  } yield (tids, pid)
  val finalAction = action.flatMap {
    case (tids, pid) => {
      val prodId = if (prod.id > 0L) prod.id else pid.asInstanceOf[Number].longValue
      val delAction = tagGroups.filter(_.prodId === prodId).delete
      val tgAction = for {
        tid <- tids
      } yield {
        tagGroups += TagGroup("Ignored-XX", prodId, tid)
      }
      delAction.flatMap { x => DBIO.sequence(tgAction) }
      // IF LINE BELOW IS COMMENTED THEN TagGroup is created else even delete above doesn't happen
      prods.filter(_.id === prodId).map(_.id).result.headOption
    }
  }
  db.run(finalAction.transactionally)
}
This is the snippet in the controller from which this method is called. My suspicion is that the caller doesn't wait long enough, but I'm not sure...
val prod = Prod(...)
val tagSet = generateTags(prod.tags)
val add = prodsService.addOrUpdateProductWithTags(prod, tagSet)
add.map { value =>
  Redirect(controllers.www.routes.Dashboard.dashboard)
}.recover {
  case th =>
    InternalServerError("bad things happen in life: " + th)
}
Any clue what's wrong with the query?
Stack: Scala 2.11.7, Play 2.5.4, play-slick 2.0.0 (Slick 3.1)

Finally figured out a solution:
In place of the following 2 lines:
delAction.flatMap { x => DBIO.sequence(tgAction) }
prods.filter(_.id === prodId).map(_.id).result.headOption
I combined the actions with the andThen (>>) operator, as follows:
delAction >> DBIO.sequence(tgAction) >> prods.filter(_.id === prodId).map(_.id).result.headOption
Now the entire sequence gets executed. I still don't know what's wrong with the original solution, but this works.
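An equivalent way to compose the same chain, if a for comprehension reads better than >>, would be something along these lines (just a sketch; it reuses the delAction, tgAction and prodId values already defined above, and `composed` is a new name introduced here):
// Sketch: same actions as above, composed explicitly so that no step is dropped.
val composed = for {
  _   <- delAction                 // delete the old TagGroup rows
  _   <- DBIO.sequence(tgAction)   // insert the new ones
  pid <- prods.filter(_.id === prodId).map(_.id).result.headOption
} yield pid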

Related

Composables in Pager With Tabs Composed Too Quickly

I am trying to implement the Accompanist pager with tabs to achieve something like Instagram's profile page displaying followers, following and subscriptions - basically a 3-tab menu with a pager. This is the code I am using.
@Composable
fun UsersPager(
  myDBViewModel: MyDBViewModel
) {
  val tabData = listOf(
    "FOLLOWING" to Icons.Filled.PermIdentity,
    "ALLUSERS" to Icons.Filled.PersonOutline,
    "FOLLOWERS" to Icons.Filled.PersonOutline
  )
  val pagerState = rememberPagerState(0)
  val tabIndex = pagerState.currentPage
  val coroutineScope = rememberCoroutineScope()
  Column {
    TabRow(
      selectedTabIndex = tabIndex,
      indicator = { tabPositions ->
        TabRowDefaults.Indicator(
          Modifier.pagerTabIndicatorOffset(pagerState, tabPositions)
        )
      }
    ) {
      tabData.forEachIndexed { index, pair ->
        Tab(
          selected = tabIndex == index,
          onClick = {
            coroutineScope.launch {
              Log.d("MP18", "click on Tab num: $index")
              pagerState.animateScrollToPage(index)
            }
          },
          text = {
            Text(text = pair.first)
          },
          icon = {
            Icon(imageVector = pair.second, contentDescription = null)
          })
      }
    }
    HorizontalPager(
      state = pagerState,
      itemSpacing = 1.dp,
      modifier = Modifier
        .weight(1f),
      count = tabData.size
    ) { index ->
      Column(
        modifier = Modifier.fillMaxHeight(),
        verticalArrangement = Arrangement.Center,
        horizontalAlignment = Alignment.CenterHorizontally
      ) {
        when (index) {
          1 -> ShowMyFollowees(myDBViewModel = myDBViewModel)
          2 -> ShowMyUsers(myDBViewModel = myDBViewModel)
          3 -> ShowMyFollowers(myDBViewModel = myDBViewModel)
        }
      }
    }
  }
}
Then 3 composables follow this pattern to fetch data from API and display them:
@Composable
fun ShowMyUsers(
  myDBViewModel: MyDBViewModel,
) {
  val pageLoadedTimes by myDBViewModel.pageLoadedTimes.observeAsState(initial = null)
  val myUsersList by myDBViewModel.myUsersList.observeAsState(initial = emptyList())
  val loading by myDBViewModel.loading.observeAsState(initial = myDBViewModel.loading.value)
  if (myUsersList.isNullOrEmpty() && pageLoadedTimes == 0 && !loading!!) {
    LaunchedEffect(key1 = Unit, block = {
      Log.d("MP18", "launchedEffect in ScreenMyAccount.ShowMyUsers")
      myDBViewModel.getFirstPageUsers()
    })
  }
  ListMyUsers(myUsers = myUsersList, myDBViewModel = myDBViewModel)
}
@Composable
fun ListMyUsers(
  myUsers: List<MyUser>,
  myDBViewModel: MyDBViewModel
) {
  val pageLoadedTimes by myDBViewModel.pageLoadedTimes.observeAsState(initial = myDBViewModel.pageLoadedTimes.value)
  val loading by myDBViewModel.loading.observeAsState(initial = myDBViewModel.loading.value)
  Log.d(
    "MP18",
    "comp ShowMyUsers and pageLoadedTimes is: $pageLoadedTimes and loading is: $loading"
  )
  Column(
    modifier = Modifier
      .fillMaxSize()
      .background(color = Color.Red)
  ) {
    LazyColumn(
      modifier = Modifier.fillMaxSize(),
      contentPadding = PaddingValues(16.dp)
    ) {
      itemsIndexed(
        items = myUsers
      ) { index, user ->
        myDBViewModel.onChangeProductScrollPosition(index)
        val numRec = pageLoadedTimes?.times(PAGE_SIZE)
        Log.d(
          "MP188",
          "in composable, page: $pageLoadedTimes, index: $index, loading: $loading, numRec: $numRec"
        )
        // we should query and display the next page if this is true:
        if ((index + 1) >= (pageLoadedTimes?.times(PAGE_SIZE)!!) && !loading!!) {
          myDBViewModel.getNextPageUsers()
        }
        ShowSingleUser(
          index = index,
          pageLoadedTimes = pageLoadedTimes!!,
          user = user,
          myDBViewModel = myDBViewModel
        )
      }
    }
  }
}
In these composables there's an API call (through the ViewModel) which gets data from the backend in order to populate some vars in the ViewModel. The problem I have is that when the first tab is clicked, the neighbouring composable also gets composed, and thus I'm making 2 API calls and "preparing" the second tab's data even if the user might never click on that tab. This is not what I want. I'd like to fetch the data for tab 2 and later tab 3 only when there's a click on them. I hope I am clear about what's bothering me.
This is the expected behavior of the pager, since the Accompanist pager is implemented on top of LazyRow. Basically, the pager loads the next page before you scroll to it because that is how LazyLayout works. If you want to avoid that, you can do something like this, which I use in my code as well:
// Anywhere in your composable
SideEffect {
  if (currentShownItemIndex == pagerState.currentPage) {
    // Make api call...
  }
}
This should ensure that you are making your API call if and only if you are on the correct index.
Edit: You can use LaunchedEffect if you want; I used SideEffect as it is easier to write, does not rely on any key, and I didn't need a coroutine scope here :d
Finally, this does not prevent the composition of the page at index + 1, but it does prevent the unnecessary API call made by the pager.
I found the solution for this. I added another variable in the ViewModel:
private val _pageInPager = MutableLiveData(0)
val pageInPager: LiveData<Int> = _pageInPager
fun setPageInPager(pageNum: Int) {
  Log.d("MP188", "setPageInPager to: $pageNum")
  _pageInPager.value = pageNum
}
Then in the composable, if the user clicks on a tab:
onClick = {
  coroutineScope.launch {
    Log.d("MP18", "click on Tab num: $index")
    pagerState.animateScrollToPage(index)
    myDBViewModel.setPageInPager(index)
  }
},
or when the user swipes the pager (slider):
myDBViewModel.setPageInPager(pagerState.currentPage)
I have the exact page in the variable myDBViewModel.pageInPager, so I can add a check before making the API call in the LaunchedEffect:
if (myUsersList.isNullOrEmpty() && pageLoadedTimes == 0 && !loading!! && pageInPager == 1) {
  LaunchedEffect(key1 = Unit, block = {
    Log.d("MP18", "launchedEffect in ScreenMyAccount.ShowMyUsers")
    myDBViewModel.getFirstPageUsers()
  })
}
I think this works ok now. Thank you @Subfly.

Scala: merging two JSON files using AmazonS3Client getObject Futures

I'm trying to merge two JSON files from an S3 bucket. The first file pulls fine, but not the second one.
val eventLogJsonFuture = Future(new AmazonS3Client(credentials))
  .map(_.getObject(logBucket, logDirectory + "/" + id + "/event_log.json"))
  .map(_.getObjectContent)
  .map(Source.fromInputStream(_))
  .map(_.mkString)
  .map(Json.parse) map { archiveEvents =>
    Json.toJson(Json.obj("success" -> true, "data" -> archiveEvents))
  } recover {
    case NonFatal(error) =>
      Json.obj("success" -> false, "errorCode" -> "archive_does_not_exist", "message" -> error.getMessage)
  }
val infoJsonFuture = Future(new AmazonS3Client(credentials))
  .map(_.getObject(logBucket, logDirectory + "/" + id + "/info.json"))
  .map(_.getObjectContent)
  .map(Source.fromInputStream(_))
  .map(_.mkString)
  .map(Json.parse) map { archiveInfo =>
    Json.toJson(Json.obj("success" -> true, "data" -> archiveInfo))
  } recover {
    case NonFatal(error) =>
      Json.obj("success" -> false, "errorCode" -> "archive_does_not_exist", "message" -> error.getMessage)
  }
val combinedJson = for {
  eventLogJson <- eventLogJsonFuture
  infoJson <- infoJsonFuture
} yield {
  Json.obj("info" -> infoJson, "events" -> eventLogJson)
}
This is what the result JSON looks like ...
Is there another (better?) way of writing this?
Do you really need to wait for these JSON parts from different sources?
I can recommend a solution with case class DTOs.
Simple example:
val firstJson = Future {
  //case class json1(...)
}
val secondJson = Future {
  //case class json2(...)
  ...
}
val finalJson = for {
  f <- firstJson
  s <- secondJson
} yield (f, s)
finalJson onComplete {
  case Success(jsons) => {
    //merge json here
    jsons._1 + jsons._2 ...
  }
  case Failure(error) => //handle the failure here
}
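For completeness, here is a rough sketch (not tested) of how the duplicated fetch logic from the question could be factored into a helper and the two futures combined. The bucket, path values and credentials are the ones from the question; `s3` and `fetchJson` are names introduced here, and an implicit ExecutionContext is assumed to be in scope as in the original code:
import scala.concurrent.{ExecutionContext, Future}
import scala.io.Source
import scala.util.control.NonFatal
import play.api.libs.json.{JsValue, Json}

// Assumption: s3 is an AmazonS3Client built from `credentials`, as in the question.
def fetchJson(key: String)(implicit ec: ExecutionContext): Future[JsValue] =
  Future(s3.getObject(logBucket, key).getObjectContent)
    .map(in => Json.parse(Source.fromInputStream(in).mkString))
    .map(data => Json.obj("success" -> true, "data" -> data))
    .recover { case NonFatal(e) =>
      Json.obj("success" -> false, "errorCode" -> "archive_does_not_exist", "message" -> e.getMessage)
    }

val combinedJson: Future[JsValue] =
  fetchJson(logDirectory + "/" + id + "/event_log.json")
    .zip(fetchJson(logDirectory + "/" + id + "/info.json"))
    .map { case (events, info) => Json.obj("info" -> info, "events" -> events) }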

Play scala - confusing about the result type of Action.async

I'm a little bit confused about the expected result of Action.async. Here is the use case: from the frontend I receive JSON to validate (a Foo), I send this data to another web service, and I extract and validate the received JSON (a Bar case class), which I want to validate too. The problem is that when I return a result, I get the following error:
type mismatch;
found : Object
required: scala.concurrent.Future[play.api.mvc.Result]
Here is my code:
case class Foo(id : String)
case class Bar(id : String)
def create() = {
  Action.async(parse.json) { request =>
    val sessionTokenOpt : Option[String] = request.headers.get("sessionToken")
    val sessionToken : String = "Bearer " + (sessionTokenOpt match {
      case None => throw new NoSessionTokenFound
      case Some(session) => session
    })
    val user = ""
    val structureId : Option[String] = request.headers.get("structureId")
    if (sessionToken.isEmpty) {
      Future.successful(BadRequest("no token"))
    } else {
      val url = config.getString("createURL").getOrElse("")
      request.body.validate[Foo].map { f =>
        Logger.debug("sessionToken = " + sessionToken)
        Logger.debug(f.toString)
        val data = Json.toJson(f)
        val holder = WS.url(url)
        val complexHolder =
          holder.withHeaders(("Content-type","application/json"),("Authorization",(sessionToken)))
        Logger.debug("url = " + url)
        Logger.debug(complexHolder.headers.toString)
        Logger.debug((Json.prettyPrint(data)))
        val futureResponse = complexHolder.put(data)
        futureResponse.map { response =>
          if(response.status == 200) {
            response.json.validate[Bar].map { b =>
              Future.successful(Ok(Json.toJson(b)))
            }.recoverTotal { e : JsError =>
              Future.successful(BadRequest("The JSON in the body is not valid."))
            }
          } else {
            Logger.debug("status from apex " + response.status)
            Future.successful(BadRequest("alo"))
          }
        }
        Await.result(futureResponse,5.seconds)
      }.recoverTotal { e : JsError =>
        Future.successful(BadRequest("The JSON in the body is not valid."))
      }
    }
  }
}
What is wrong in my function?
Firstly, this is doing nothing:
futureResponse.map { response =>
  if(response.status == 200) {
    response.json.validate[Bar].map { b =>
      Future.successful(Ok(Json.toJson(b)))
    }.recoverTotal { e : JsError =>
      Future.successful(BadRequest("The JSON in the body is not valid."))
    }
  } else {
    Logger.debug("status from apex " + response.status)
    Future.successful(BadRequest("alo"))
  }
}
Because you're not capturing or assigning the result of it to anything. It's equivalent to doing this:
val foo = "foo"
foo + " bar"
println(foo)
The foo + " bar" statement there is pointless; it achieves nothing.
Now to debug type inference problems, what you need to do is assign results to things, and annotate with the types you're expecting. So, assign the result of the map to something first:
val newFuture = futureResponse.map {
...
}
Now, what is the type of newFuture? The answer is actually Future[Future[Result]], because you're using map and then returning a future from inside it. If you want to return a future inside your map function, then you have to use flatMap instead; this flattens the Future[Future[Result]] to a Future[Result]. But actually in your case you don't need that: you can use map and just get rid of all those Future.successful calls, because you're not actually doing anything in that map function that needs to return a future.
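As a standalone illustration of that point (a toy sketch, not part of the original code):
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

val f: Future[Int] = Future.successful(1)
// map with a Future-returning function nests the futures:
val nested: Future[Future[Int]] = f.map(n => Future.successful(n + 1))
// flatMap flattens that nesting; map with a plain function needs no flattening at all:
val flattened: Future[Int] = f.flatMap(n => Future.successful(n + 1))
val plain: Future[Int] = f.map(n => n + 1)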
And then get rid of that await as others have said - using await means blocking, which negates the point of using futures in the first place.
Anyway, this should compile:
def create() = {
  Action.async(parse.json) { request =>
    val sessionTokenOpt : Option[String] = request.headers.get("sessionToken")
    val sessionToken : String = "Bearer " + (sessionTokenOpt match {
      case None => throw new NoSessionTokenFound
      case Some(session) => session
    })
    val user = ""
    val structureId : Option[String] = request.headers.get("structureId")
    if (sessionToken.isEmpty) {
      Future.successful(BadRequest("no token"))
    } else {
      val url = config.getString("createURL").getOrElse("")
      request.body.validate[Foo].map { f =>
        Logger.debug("sessionToken = " + sessionToken)
        Logger.debug(f.toString)
        val data = Json.toJson(f)
        val holder = WS.url(url)
        val complexHolder =
          holder.withHeaders(("Content-type","application/json"),("Authorization",(sessionToken)))
        Logger.debug("url = " + url)
        Logger.debug(complexHolder.headers.toString)
        Logger.debug((Json.prettyPrint(data)))
        val futureResponse = complexHolder.put(data)
        futureResponse.map { response =>
          if(response.status == 200) {
            response.json.validate[Bar].map { b =>
              Ok(Json.toJson(b))
            }.recoverTotal { e : JsError =>
              BadRequest("The JSON in the body is not valid.")
            }
          } else {
            Logger.debug("status from apex " + response.status)
            BadRequest("alo")
          }
        }
      }.recoverTotal { e : JsError =>
        Future.successful(BadRequest("The JSON in the body is not valid."))
      }
    }
  }
}
Do not Await.result(futureResponse, 5 seconds). Just return the futureResponse as is. The Action.async can deal with it (in fact, it wants to deal with it, it requires you to return a Future).
Note that in your various other codepaths (else, recoverTotal) you are already doing that.
If you use Action.async you don't need to await the result. So try returning the future as is, without Await.result.
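In other words, the minimal shape of an async action is just a block that yields a Future[Result]; Play completes the HTTP response once that future completes. A trivial sketch (assuming this sits inside a Play controller with the usual play.api.mvc members in scope):
import scala.concurrent.Future

// inside a controller
def ping = Action.async {
  Future.successful(Ok("pong"))  // no Await anywhere; Play handles the Future itself
}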

Comma separated list with Enumerator

I've just started working with Scala in my new project (Scala 2.10.3, Play 2.2.1, ReactiveMongo 0.10.0), and encountered a pretty standard use case, which is to stream all the users in MongoDB to an external client. After going through the Enumerator and Enumeratee APIs I did not find a solid solution for that, so I solved it in the following way:
val users = collection.find(Json.obj()).cursor[User].enumerate(Integer.MAX_VALUE, false)
var first: Boolean = true
val indexedUsers = users.map(u => {
  if (first) {
    first = false
    Json.stringify(Json.toJson(u))
  } else {
    "," + Json.stringify(Json.toJson(u))
  }
})
Which, from my point of view, is a little bit tricky - mainly because I needed to add the JSON start-array and end-array markers and comma separators to the element list, and I was not able to provide it as a pure JSON stream, so I converted it to a String stream.
What is a standard solution for that, using ReactiveMongo in Play?
I wrote a helper function which does what you want to achieve:
def intersperse[E](e: E, enum: Enumerator[E]): Enumerator[E] = new Enumerator[E] {
  val element = Input.El(e)
  override def apply[A](i1: Iteratee[E, A]): Future[Iteratee[E, A]] = {
    var iter = i1
    val loop: Iteratee[E, Unit] = {
      lazy val contStep = Cont(step)
      def step(in: Input[E]): Iteratee[E, Unit] = in match {
        case Input.Empty ⇒ contStep
        case Input.EOF ⇒ Done((), Input.Empty)
        case e @ Input.El(_) ⇒
          iter = Iteratee.flatten(iter.feed(element).flatMap(_.feed(e)))
          contStep
      }
      lazy val contFirst = Cont(firstStep)
      def firstStep(in: Input[E]): Iteratee[E, Unit] = in match {
        case Input.EOF ⇒ Done((), Input.Empty)
        case Input.Empty ⇒
          iter = Iteratee.flatten(iter.feed(in))
          contFirst
        case Input.El(x) ⇒
          iter = Iteratee.flatten(iter.feed(in))
          contStep
      }
      contFirst
    }
    enum(loop).map { _ ⇒ iter }
  }
}
Usage:
val prefix = Enumerator("[")
val suffix = Enumerator("]")
val asStrings = Enumeratee.map[User] { u => Json.stringify(Json.toJson(u)) }
val result = prefix >>> intersperse(",", users &> asStrings) >>> suffix
Ok.chunked(result)

restart iterator on exceptions in Scala

I have an iterator (actually a Source.getLines) that's reading an infinite stream of data from a URL. Occasionally the iterator throws a java.io.IOException when there is a connection problem. In such situations, I need to re-connect and re-start the iterator. I want this to be seamless so that the iterator just looks like a normal iterator to the consumer, but underneath is restarting itself as necessary.
For example, I'd like to see the following behavior:
scala> val iter = restartingIterator(() => new Iterator[Int] {
  var i = -1
  def hasNext = {
    if (this.i < 3) {
      true
    } else {
      throw new IOException
    }
  }
  def next = {
    this.i += 1
    i
  }
})
res0: ...
scala> iter.take(6).toList
res1: List[Int] = List(0, 1, 2, 3, 0, 1)
I have a partial solution to this problem, but it will fail on some corner cases (e.g. an IOException on the first item after a restart) and it's pretty ugly:
def restartingIterator[T](getIter: () => Iterator[T]) = new Iterator[T] {
  var iter = getIter()
  def hasNext = {
    try {
      iter.hasNext
    } catch {
      case e: IOException => {
        this.iter = getIter()
        iter.hasNext
      }
    }
  }
  def next = {
    try {
      iter.next
    } catch {
      case e: IOException => {
        this.iter = getIter()
        iter.next
      }
    }
  }
}
I keep feeling like there's a better solution to this, maybe some combination of Iterator.continually and util.control.Exception or something like that, but I couldn't figure one out. Any ideas?
This is fairly close to your version and using scala.util.control.Exception:
def restartingIterator[T](getIter: () => Iterator[T]) = new Iterator[T] {
  import util.control.Exception.allCatch
  private[this] var i = getIter()
  private[this] def replace() = i = getIter()
  def hasNext: Boolean = allCatch.opt(i.hasNext).getOrElse{replace(); hasNext}
  def next(): T = allCatch.opt(i.next).getOrElse{replace(); next}
}
For some reason this is not tail recursive, but that can be fixed by using a slightly more verbose version:
def restartingIterator2[T](getIter: () => Iterator[T]) = new Iterator[T] {
  import util.control.Exception.allCatch
  private[this] var i = getIter()
  private[this] def replace() = i = getIter()
  @annotation.tailrec def hasNext: Boolean = {
    val v = allCatch.opt(i.hasNext)
    if (v.isDefined) v.get else {replace(); hasNext}
  }
  @annotation.tailrec def next(): T = {
    val v = allCatch.opt(i.next)
    if (v.isDefined) v.get else {replace(); next}
  }
}
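For example, plugging in the throwing iterator from the question (a quick sketch; assumes java.io.IOException is imported):
import java.io.IOException

val iter = restartingIterator2(() => new Iterator[Int] {
  var i = -1
  def hasNext = if (i < 3) true else throw new IOException
  def next() = { i += 1; i }
})
iter.take(6).toList  // List(0, 1, 2, 3, 0, 1)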
Edit: There is a solution with util.control.Exception and Iterator.continually:
def restartingIterator[T](getIter: () => Iterator[T]) = {
  import util.control.Exception.allCatch
  var iter = getIter()
  def f: T = allCatch.opt(iter.next).getOrElse{iter = getIter(); f}
  Iterator.continually { f }
}
There is a better solution, the Iteratee:
http://apocalisp.wordpress.com/2010/10/17/scalaz-tutorial-enumeration-based-io-with-iteratees/
Here, for example, is an enumerator that restarts on encountering an exception.
def enumReader[A](r: => BufferedReader, it: IterV[String, A]): IO[IterV[String, A]] = {
  val tmpReader = r
  def loop: IterV[String, A] => IO[IterV[String, A]] = {
    case i @ Done(_, _) => IO { i }
    case Cont(k) => for {
      s <- IO { try { val x = tmpReader.readLine; IO(x) }
                catch { case e => enumReader(r, it) }}.join
      a <- if (s == null) k(EOF) else loop(k(El(s)))
    } yield a
  }
  loop(it)
}
The inner loop advances the Iteratee, but the outer function still holds on to the original. Since Iteratee is a persistent data structure, to restart you just have to call the function again.
I'm passing the Reader by name here so that r is essentially a function that gives you a fresh (restarted) reader. In practice you will want to bracket this more effectively (close the existing reader on exception).
Here's an answer that doesn't work, but feels like it should:
def restartingIterator[T](getIter: () => Iterator[T]): Iterator[T] = {
  new Traversable[T] {
    def foreach[U](f: T => U): Unit = {
      try {
        for (item <- getIter()) {
          f(item)
        }
      } catch {
        case e: IOException => this.foreach(f)
      }
    }
  }.toIterator
}
I think this very clearly describes the control flow, which is great.
This code will throw a StackOverflowError in Scala 2.8.0 because of a bug in Traversable.toStream, but even after the fix for that bug, this code still won't work for my use case because toIterator calls toStream, which means that it will store all items in memory.
I'd love to be able to define an Iterator by just writing a foreach method, but there doesn't seem to be any easy way to do that.