Laravel - MySQL to JSON

At the moment I have a list of employees in a JSON file:
{
"id": 1,
"departments": "1",
"name": "Bill Smith",
"profilePic": "/img/people/Office/bill-smith.jpg",
"title": "Office Manager"
},
I now want to store these in the database but still return them as JSON.
How do I set this up with my routes? It will be a very basic filter, by department id.
I presume I would use a GET request:
Route::get('people/{department}', function () {
});
How do I return the JSON?

If you have an Employee model, it would look like this:
Route::get('people/{department}', function ($departmentId) {
    return Employee::where('department_id', $departmentId)->get();
});
Laravel automatically converts Eloquent models and collections to JSON when they are returned from a route or controller, so this already sends a JSON response to the client.

As you said, you have the list of employees in a JSON file, so first create a database seeder to load the data from that file into the database.
Let's say you made a table departments in the database and stored your data in the 5 columns (id, departments, name, profilePic, title). Then make a model with the following command:
php artisan make:model Department
For more details about models, see https://laravel.com/docs/5.4/eloquent#eloquent-model-conventions
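For example, a minimal sketch of such a seeder, assuming the JSON file sits at database/data/employees.json and the table/columns named above (the file path and the EmployeeSeeder class name are illustrative):

<?php

use Illuminate\Database\Seeder;
use Illuminate\Support\Facades\DB;

class EmployeeSeeder extends Seeder
{
    /**
     * Read the JSON file and insert each record into the table.
     */
    public function run()
    {
        // Illustrative path; adjust to wherever your JSON file lives.
        $json = file_get_contents(database_path('data/employees.json'));
        $employees = json_decode($json, true);

        foreach ($employees as $employee) {
            DB::table('departments')->insert([
                'id'          => $employee['id'],
                'departments' => $employee['departments'],
                'name'        => $employee['name'],
                'profilePic'  => $employee['profilePic'],
                'title'       => $employee['title'],
            ]);
        }
    }
}

You would then run it with php artisan db:seed --class=EmployeeSeeder.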
Then detect the request through a route and return the data as JSON:
Route::get('people/{department}', function ($departmentId) {
    $data = Department::where('departments', $departmentId)->get();
    return response()->json($data);
});

The very basic model query would be:
Route::get('people/{department}', function ($department) {
    return Employee::where('departments', $department)->get();
});
assuming Employee is your model.
Laravel will then automatically convert the query result to JSON.
FYI: You can also send this request to a controller method and return the result the same way.
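A rough sketch of that controller variant, assuming an App\Employee model and an illustrative PeopleController (class and method names are placeholders):

routes/web.php:

Route::get('people/{department}', 'PeopleController@byDepartment');

app/Http/Controllers/PeopleController.php:

<?php

namespace App\Http\Controllers;

use App\Employee;

class PeopleController extends Controller
{
    // Laravel serializes the returned Eloquent collection to JSON automatically.
    public function byDepartment($department)
    {
        return Employee::where('departments', $department)->get();
    }
}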

Related

NodeJS: How to send an entity from MySQL to RabbitMQ

I'm trying to send an "entity" obtained from MySQL to RabbitMQ.
I'm able to make the connection to the database and return data. Example:
dbConnection.query("SELECT * FROM customer WHERE Id = ?", customerId, (err, rows, fields) => {
    ...
    res.status(200).json(rows)
    ...
});
After this I am able to see the "JSON result" in Postman, so I want to send this "JSON result" as a string to RabbitMQ.
I can send a fake data object to RabbitMQ with no problem:
const fakeData = {
    name: "Elon Musk",
    company: "SpaceX",
};
channel.sendToQueue("message-queue", Buffer.from(JSON.stringify(fakeData)));
So, how must I convert the "rows" object returned from MySQL to send it to the queue?
Thank you in advance!
The solution to my problem is as follows:
rows.forEach(function (row) {
    channel.sendToQueue("message-queue", Buffer.from(JSON.stringify(row)));
});

NodeJS gives MongoDB data with missing values and an unexpected $db field

I have a simple NodeJS app running an HTTP server that collects data from a MongoDB instance and presents the result as JSON:
db.collection(collectionName).findOne({ '_id': id }, function (err, result) {
    if (err) {
        reportError(err, res);
        return;
    } else {
        outPut(result, res);
    }
});
In the outPut function I'm calling JSON.stringify() on the 'result' variable and writing it in the response.
However, much of the data is missing, and an empty $db object is included from somewhere. Here is a subset of the data:
"Kommun": 1292,
"Lansdel": 28,
"Delyta": [
    {
        "$id": "2",
        "$db": ""
    },
    {
        "$ref": "691"
    },
    {
        "$ref": "247"
    }
Looking at the record using Studio 3T it seems that all the data I expect has been saved.
Why am I not getting all my data in the JSON object? Where is the $db coming from? What is it?
My guess is that you are using DBRefs. In order to include the referenced data from different collections, you must query those yourself. I cannot show you a code example without some more info on the data schema.

Using existing fields as the ObjectId in MongoDB to create a REST API with Node and Express?

My JSON file is:
[
{'name': name,
'birthday': birthday,
'telephone': telephone,
'details': details},
....
]
My application will accept name, birthday and telephone values, and these three values combined will be unique for each entity. I am hoping to retrieve an entity based on these three values, so that http://localhost:8000/?name=mary&birthday=19880902&telephone=2234324567 returns the JSON object for Mary.
I am looking at the examples available on using MongoDB and Node.js; most of them suggest creating a completely new _id field using new BSON.ObjectID(id) and doing something like app.get('/users/:id', users.findById) to find by id. Is it possible not to create a separate id field?
Try using a callback with the (req, res) parameters. See here:
app.get('/', function(req, res) {
    var cursor = Users.find({
        "name": req.query.name,
        "birthday": req.query.birthday,
        "telephone": req.query.telephone
    });
    cursor.each(function(err, doc) {
        if (err) console.log("Oh no!");
        if (doc != null) {
            // Do stuff
            res.send('GET request to user page?');
        }
    });
});

What's the best way to map objects into an Ember model from a REST Web API?

The topic of this post: my solution is too slow for a large query return.
I have a Web API serving REST results like the following from a call to localhost:9090/api/invetories?id=1:
[
{
"inventory_id": "1",
"film_id": "1",
"store_id": "1",
"last_update": "2/15/2006 5:09:17 AM"
},
{
"inventory_id": "2",
"film_id": "1",
"store_id": "1",
"last_update": "2/15/2006 5:09:17 AM"
}
]
Since my Web API did not provide a root key for my JSON response, I made a RESTSerializer like the following:
export default DS.RESTSerializer.extend({
    extract: function(store, primaryType, payload, id, requestType) {
        var typeName = primaryType.typeKey;
        var data = {};
        data[typeName] = payload; // creating root
        payload = data;
        return this._super(store, primaryType, payload, id, requestType);
    }
});
When this gets run, I get the following error message: Assertion Failed: You must include an 'id' for inventory in an object passed to 'push'
As you can see, these objects do not have an id attribute, so I found that the default behaviour of the Ember RESTSerializer forces me to write my own.
Okay, so here's where I'm not sure my solution is right. inventory_id from my return is unique, therefore I chose to use that as the id; okay, I'm thinking to myself, I'll just add it manually. The function looks like this now:
export default DS.RESTSerializer.extend({
    extract: function(store, primaryType, payload, id, requestType) {
        var typeName = primaryType.typeKey;
        for (var i = 0; i < payload.length; i++) {
            payload[i].id = payload[i].inventory_id;
        }
        var data = {};
        data[typeName] = payload; // creating root
        payload = data;
        return this._super(store, primaryType, payload, id, requestType);
    }
});
By just manually duplicating an attribute, I feel like I'm cheating my way past this error message. In addition, I sometimes return a large payload array (over 150k rows), and an O(n) loop just doesn't seem the right price to pay for a simple mapping.
Is there some other way to set up either my Web API or my serializer so I avoid the for loop that assigns the id Ember so desperately wants?
I think this should fix your problem:
export default DS.RESTSerializer.extend({
    primaryKey: 'inventory_id'
});
With this setting, Ember Data will map inventory_id to its id attribute.

Save embedded JSON via REST API

I'm using Symfony2 with FOSRestBundle on the server side and EmberJS as the client. I'm receiving the data from the server like this:
{
    customers: [
        {
            id: 3,
            name: "Joue",
            currency: {
                id: 5,
                iso_code: "BDT"
            }
        }
    ]
}
I populate a select with a 2nd server call, where I get all the currencies.
At the moment I'm sending the data back (PUT - update) like this:
{
    customers: [
        {
            id: 3,
            name: "Joue",
            currency: 2
        }
    ]
}
and in the controller I look up the currency by the given id and save it:
$currency = $this->getDoctrine()
    ->getRepository('ApiBundle:Currency')
    ->findOneBy(array('id' => $req['customer']['currency']));
$partner->setCurrency($currency);
My question: is there a way to save the request if I send it back with embedded JSON? E.g.:
{
    customers: [
        {
            id: 3,
            name: "Joue",
            currency: {
                id: 2,
                iso_code: "XCT"
            }
        }
    ]
}
Or is it fine to look it up and handle it in the controller?
My opinion:
I prefer querying the database to get the right currency (to make sure nobody tries to inject false data, for instance a currency with iso_code "IAJSAIJD"). This way I'm sure everything is fine on $em::flush().
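For illustration, a sketch of that lookup with an explicit guard, building on the controller code from the question (the 404 response is just one possible reaction):

$currency = $this->getDoctrine()
    ->getRepository('ApiBundle:Currency')
    ->findOneBy(array('id' => $req['customer']['currency']));

if (!$currency) {
    // Reject unknown ids instead of letting bad data reach flush().
    throw $this->createNotFoundException('Unknown currency id.');
}

$partner->setCurrency($currency);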
On the other hand, you could use the @ParamConverter from FOSRestBundle and send the full JSON. In your Customer entity you can specify how to deserialize your currency association (see the JMSSerializer annotations, if you are using it):
class Customer
{
    // ...

    /**
     * @ORM\OneTo..
     * @JMS\Type("My\Entity\Currency")
     */
    private $currency;
}
But this solution assumes that you allow your code to manage entities' ids, because you need to tell Doctrine that your currency already exists and this is the one with the id X. If you have a cascade persist on $currency, for instance, and you forget to $em::merge() before $em::flush(), you will end up with a new currency in your database.
That being said, I might be wrong, I'm no expert.