I have a Gatling test which should do the following:
create the user once
retrieve the user's data according to a specific load model (the actual load test)
delete the user when done
Question: how do I emulate this with Gatling? If I chain the calls like:
val scn = scenario("Test scenario").exec(createUser).exec(retrieveUser).exec(deleteUser)
setUp(scn).protocols(httpConf)
then creating and deleting the user become part of the test.
You can use the before and after hooks to create and delete the user.
class RetrieveUserSimulation extends Simulation {
before {
// create user
}
setUp(scn).protocols(httpConf)
after {
// delete user
}
}
You'll have to issue the create and delete HTTP requests yourself: before and after take => Unit thunks, not scenarios.
In the before hook we can call a method containing code like the following:
import org.apache.http.client.methods.HttpPut
import org.apache.http.impl.client.HttpClientBuilder

val httpClient = HttpClientBuilder.create.build
val httpResponse = httpClient.execute(new HttpPut(urlString))
println("StatusCode - " + httpResponse.getStatusLine.getStatusCode)
httpClient.close()
We can use HttpGet the same way. The example uses the Apache HttpClient library (org.apache.http.impl.client.HttpClientBuilder). A fuller sketch combining this with the hooks follows below.
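For completeness, here is a rough sketch of how the pieces can fit together, assuming Gatling 3's DSL; the base URL, endpoint paths, and injection profile are made-up placeholders:

import io.gatling.core.Predef._
import io.gatling.http.Predef._
import org.apache.http.client.methods.{HttpDelete, HttpPut}
import org.apache.http.impl.client.HttpClientBuilder

class RetrieveUserSimulation extends Simulation {

  val httpConf = http.baseUrl("http://example.com") // hypothetical base URL

  // only the retrieval is part of the measured load test
  val scn = scenario("Retrieve user")
    .exec(http("retrieve-user").get("/users/1")) // hypothetical endpoint

  before {
    // create the user once, outside the measured scenario
    val client = HttpClientBuilder.create.build
    try client.execute(new HttpPut("http://example.com/users/1")).close()
    finally client.close()
  }

  after {
    // clean up the user once the run is finished
    val client = HttpClientBuilder.create.build
    try client.execute(new HttpDelete("http://example.com/users/1")).close()
    finally client.close()
  }

  setUp(scn.inject(atOnceUsers(10))).protocols(httpConf)
}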
Consider the following snippet; I have used WSClient here for some API calls via DI.
@Singleton
class SampleService #Inject()(ws: WSClient) {
def get(id:Long): JsValue ={
val trailingURL = s"/$id".toString
val wsRequest = ws.url(baseURL+trailingURL).addQueryStringParameters("access_token" -> authToken).get()
val wsResponse = Await.result(wsRequest, Duration.Inf)
Json.toJson(wsResponse.body)
}
}
And I need to write a unit test for the get method. I'm doing the following:
val mockedWS = mock[WSClient]
val sampleService = new SampleService(mockedWS)
"get method" should {
  "return a valid result with valid id" in {
    val result = sampleService.get(66405)
    println(result)
    assert(result.toString == `the result i'll get`)
  }
}
But the mocking fails and I get a NullPointerException on the following line:
val wsRequest = ws.url(baseURL+trailingURL).addQueryStringParameters("access_token" -> authToken).get()
Also, when I use Json.toJson(wsResponse.body) I get an extra \ with each parameter in the whole response.
Can anyone help me solve these two problems? Thanks.
There is play-mockws, which exists solely because mocking a WSClient manually is really tedious.
// simulation of a GET request to http://dns/url
val ws = MockWS {
case (GET, "http://dns/url") => Action { Ok("http response") }
}
await(ws.url("http://dns/url").get()).body == "http response"
Further explanation:
Mocking a class/trait simply creates an instance of that type out of thin air. You cannot do much with that instance by default: calling any method on it will simply return null. If your code under test calls methods on this object, you must stub those methods with answers (i.e. return a prepared value).
For WSClient, this means you must stub the url method, since it is called by any code doing HTTP requests. But that method returns a WSRequest, so you must mock that as well, and every call on the new mock needs to be stubbed too, or it ends in an NPE again. This gets complicated very quickly, and you soon stop understanding your own test code. That's why play-mockws was created: it makes it easy to reason about calls to HTTP services in your Play application.
BTW, you may also combine play-mockws with the SIRD - String Interpolation Router DSL, which makes it even easier to extract values out of the routes or query parameters if you need to:
val ws = MockWS {
case GET(p"/$id") if id == "66405" =>
Action {
Results.Ok("...")
}
}
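As a rough sketch of how this applies to the question, the MockWS instance can be passed to the service directly, since MockWS implements WSClient. The route and JSON payload below are made up, and depending on your play-mockws/Play versions you may need to mix MockWSHelpers into the test for the implicit Materializer:

// hypothetical route and payload matching SampleService.get(66405)
val ws = MockWS {
  case (GET, url) if url.endsWith("/66405") =>
    Action { Ok("""{"id": 66405, "name": "some user"}""") }
}

val service = new SampleService(ws) // no Mockito stubbing needed
val result = service.get(66405)     // hits the fake route instead of the real API

As for the extra backslashes: wsResponse.body is already a String, and Json.toJson on a String wraps it in a JsString, escaping the quotes inside. Json.parse(wsResponse.body) is probably what was intended.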
I need to create a benchmark report on whether, in the grand scheme of things, minifying + GZIP-ing dynamic HTML responses (generated through GSPs) on every request is actually better than GZIP without minifying. Minifying adds overhead, since the generated dynamic HTML string must be parsed and compressed by a Java library, but yields a smaller response size; skipping it gives a faster response time but a slightly larger response. I have a feeling this "improvement" may be insignificant, but I need the benchmark report to back that up to the team.
To do that, I modify controller actions like so:
// import ...MinifyPlugin
class HomeController {
def get() {
Map model = [:]
String htmlBody = groovyPageRenderer.render(view: "/get", model: model)
// This adds a few milliseconds and reduce few characters.
htmlBody = MinifyPlugin.minifyHtmlString(htmlBody)
render htmlBody
}
}
But the Grails project has almost a hundred actions, and doing this in every existing action is impractical and unmaintainable, especially since, after the benchmarking, we may decide not to minify the HTML response after all. So I was thinking of doing this inside an Interceptor instead:
void afterView() {
if(response.getContentType().contains("text/html")) {
// This throws IllegalStateException: getWriter() has already been called for this response
OutputStream servletOutputStream = response.getOutputStream()
String htmlBody = new String(servletOutputStream.toByteArray())
htmlBody = MinifyingPlugin.minifyHtmlString(htmlBody)
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream()
byteArrayOutputStream.write(htmlBody.getBytes())
response.setCharacterEncoding("UTF-8")
response.setContentType("text/html")
response.outputStream << byteArrayOutputStream
}
}
But it seems that modifying the response body is impossible once it enters the afterView interceptor...? Is there any other way to do this using Grails 3 Interceptors, or should I update every controller action we have manually and perform the modification there instead?
This is what I like to use Interceptors for.
The after() part of the interceptor can act on the model after it is returned from the controller (whereas before() acts on the request before it reaches the controller).
This allows you to manipulate all data for a set of endpoints (or one specific endpoint) before it is returned to the client.
If you want to render a view, you do that in the interceptor rather than in the controller; the controller merely returns data.
I am trying to find the best way to update a value in another service after data is created, essentially how a trigger would work in a database context. For instance, I have these objects:
foo= {
_id,
barID,
otherData
}
bar= {
_id,
newestFooID,
otherData
}
When a new foo object is created I want to update bar to link to the new object. Is the only way to do this through a third service, or is there some way to use an after hook (the id field for foo is not available on create)?
I am trying to avoid using events, avoid rewriting the generated service data, and keep it database-independent (right now it uses Mongoose, but that can/will change in the future).
The id for foo should be available in an after hook for create. Then you can just patch the bar service accordingly:
app.service('foo').after({
create(hook) {
const barId = hook.result.barId;
const newestFooId = hook.result._id;
// Update the barId with the newestFooId
return app.service('bar')
.patch(barId, { newestFooId })
.then(() => hook);
}
});
I have figured out the problem! The issue is that hook.result is a Mongoose object, not a plain JavaScript object, and it didn't have the requisite field in it. While working on another problem I came across the feathers-mongoose hook toObject(), which, when called, turns the result into a plain JavaScript object with all the data I need in it.
I'm not quite ready to change all my user/auth tables from the MySQL user/roles/profile provider format, but I am moving off of MVC to ServiceStack.
Is there a pre-built IUserAuthRespository and/or CredentialsAuthProvider somewhere that can be used, or do I need to build one to provide this mapping?
If I need to build one, I assume implementing it at the IUserAuthRepository level is the cleanest? Is there a minimum set of methods required to implement basic login/logout (and administrative "switch user" impersonation) functionality?
I tried implementing a custom CredentialsAuthProvider, which seems to work, but I'm unable to get local posts for impersonation to use the proper provider. Looking for a solution to that, I realized that maybe it's better to implement the repository instead.
EDIT:
My current registration of the custom auth provider is:
Plugins.Add(new AuthFeature(() => new AuthUserSession(), new IAuthProvider[]
{
container.Resolve<MySqlCredentialsAuthProvider>() //HTML Form post of UserName/Password credentials
}));
And calling code for the local post to the AuthenticateService is:
[RequiredRole(SystemRoles.Administrator)]
public object Any(ImpersonateUser request)
{
using (var service = base.ResolveService<AuthenticateService>()) //In Process
{
//lets us login without a password if we call it internally
var result = service.Post(new Authenticate
{
provider = AuthenticateService.CredentialsProvider,
UserName = request.Username,
//Password = "should-not-matter-since-we-are-posting-locally"
});
return result;
}
}
Integrating with existing User Auth tables
If you want to use your existing User/Auth tables, the easiest solution is to ignore the UserAuth repositories and implement a Custom CredentialsAuthProvider that looks at your existing database tables to return whether their Authentication attempt was successful.
Implement OnAuthenticated() to populate the rest of your typed IAuthSession from your database, e.g:
public class CustomCredentialsAuthProvider : CredentialsAuthProvider
{
public override bool TryAuthenticate(IServiceBase authService,
string userName, string password)
{
//Add here your custom auth logic (database calls etc)
//Return true if credentials are valid, otherwise false
}
public override IHttpResult OnAuthenticated(IServiceBase authService,
IAuthSession session, IAuthTokens tokens,
Dictionary<string, string> authInfo)
{
//Fill IAuthSession with data you want to retrieve in the app eg:
session.FirstName = "some_firstname_from_db";
//...
//Call base method to Save Session and fire Auth/Session callbacks:
return base.OnAuthenticated(authService, session, tokens, authInfo);
//Alternatively avoid built-in behavior and explicitly save session with
//authService.SaveSession(session, SessionExpiry);
//return null;
}
}
Importing existing User Auth tables
If you want to import them into OrmLite User Auth tables, you would configure your AppHost to use the OrmLiteAuthRepository:
//Register to use MySql Dialect Provider
container.Register<IDbConnectionFactory>(
new OrmLiteConnectionFactory(dbConnString, MySqlDialect.Provider));
Plugins.Add(new AuthFeature(
() => new CustomUserSession(), //Use your own typed Custom UserSession type
new IAuthProvider[] {
//HTML Form post of UserName/Password credentials
new CredentialsAuthProvider()
}));
//Tell ServiceStack you want to persist User Info in the registered MySql DB above
container.Register<IUserAuthRepository>(c =>
new OrmLiteAuthRepository(c.Resolve<IDbConnectionFactory>()));
//Resolve instance of configured IUserAuthRepository
var authRepo = container.Resolve<IUserAuthRepository>();
//Create any missing UserAuth RDBMS tables
authRepo.InitSchema();
Then, to import your data, you can use the MySQL DB connection above to select from your existing tables and use the IUserAuthRepository to create the new Users.
// Open DB Connection to RDBMS
using (var db = container.Resolve<IDbConnectionFactory>().Open())
{
//Example of fetching old Users out of a custom table (use your table instead)
var oldUsers = db.Select<OldUserInfo>();
// Clear existing UserAuth tables if you want to replay this import
//db.DeleteAll<UserAuthDetails>();
//db.DeleteAll<UserAuth>();
//Go through and create new User Accounts using Old User Info
foreach (var oldUser in oldUsers)
{
//Create New User Info from Old Info
var newUser = new UserAuth {
UserName = oldUser.UserName,
Email = oldUser.Email,
//...
};
//Create New User Account with oldUser Password
authRepo.CreateUserAuth(newUser, oldUser.Password);
}
}
After this you'll have new User Accounts from your old User Info which you can sign in with.
I want to send a Slick table as part of an Akka actor message, so that the remote actor at the other end can connect to the database and perform CRUD operations on the MySQL database. I am unable to get my head around the Slick types, and the compiler/Eclipse complains. How can I get this done? Is it a good idea to pass Slick queries as part of actor messages?
object RemoteActorMessages {
case class Create(table: Table[A])
case class RunQuery(query: Query[_, _, _])
case class Result(code: Int, message: String)
}
class DBActor extends Actor {
def receive = {
case Create(table) => createTable(table)
case RunQuery(query) => runQuery(query)
case ... //so on
}
}
def createTable(table: Table[M]): Future[A] = Future {
db.withSession(implicit session => tableQuery[table].ddl.create)
}
def runQuery(query: Query[_, _, _]): Future[A] = Future {
db.withSession { implicit session => {
query.run
}
}
}
Warning: the code might have some type errors; discretion is appreciated from the viewers.
I am confused about how to send results back to the sender of the messages. For example, query.list.run gives back a list of model objects, so how should I frame my Result message?
I think this is a case of "when you have a hammer, everything becomes a nail". I believe this is not the right use case for actors. One reason (not the only one) is that DB operations are slow and would block the actor threads for a long time.
Arguably you want a service that manages the table operations, using Futures and a custom ExecutionContext to isolate the impact (this is how it's done in Play, for example). Something like:
object DBService {
def createTable() : Future[Boolean] = ???
...
}
Actors should only receive commands like CreateTable that then call the corresponding method in the service.
Incidentally, this would simplify your use case as the service could know more about the table and other Slick specifics, whilst the actor would be oblivious to them.
Not the only way, but arguably simpler.
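A minimal sketch of that split, assuming a hypothetical DBService that owns all the Slick specifics and using Akka's pipe pattern to send a Result back without blocking the actor:

import java.util.concurrent.Executors
import akka.actor.Actor
import akka.pattern.pipe
import scala.concurrent.{ExecutionContext, Future}

case object CreateUsersTable
case class Result(code: Int, message: String)

// hypothetical service owning the Slick specifics; the actor never sees Table or Query types
object DBService {
  // a dedicated execution context keeps blocking DB calls off the actor dispatcher
  implicit val dbEc: ExecutionContext =
    ExecutionContext.fromExecutor(Executors.newFixedThreadPool(4))

  def createUsersTable(): Future[Boolean] = Future {
    // the actual Slick DDL call (e.g. db.run(users.schema.create)) would live here
    true
  }
}

class DBActor extends Actor {
  import context.dispatcher

  def receive = {
    case CreateUsersTable =>
      // map the service's Future into a Result and pipe it back to the asker
      DBService.createUsersTable()
        .map(ok => Result(if (ok) 0 else 1, "users table created"))
        .pipeTo(sender())
  }
}

The sender then receives a Result message asynchronously, which also answers how to frame replies for query results: map the Future's value into your own message type before piping it back.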