I created my models in a file and can successfully add and alter objects, which commit to the database when run in a function beneath the model declarations. When I import my Session and model classes into a separate script, I can run a query successfully, but when I modify the returned object and commit, the database does not get updated. The same code works fine when run beneath the model definitions, and I'm totally confused why. Am I misunderstanding how I should work with and update my database?
haha, I must have been looking at it too closely for too long. In the snippet of code that wasn't working I was calling session.commit, not session.commit().
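For anyone who hits the same thing, here is a minimal sketch of the difference (the User model and the SQLite URL are placeholders, not the code from the question):

# Minimal sketch of the bug; the model and connection URL are
# placeholders, not the original code.
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    name = Column(String(50))

engine = create_engine('sqlite:///example.db')
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

session = Session()
session.add(User(name='original'))
session.commit()

user = session.query(User).filter_by(name='original').first()
user.name = 'updated'
session.commit    # bug: this only references the method; nothing runs
session.commit()  # fix: this actually flushes the UPDATE and commits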
I'm able to create a new service, which creates a table in the MySQL DB just fine, but the FeathersJS model file only creates one text field in the model by default.
When I modify the fields, add more, etc., the changes are not reflected in the database.
Is there a migration script I must run? Can someone show me how to run a command that will apply the changes I make to the model, so I can keep the two in sync?
Right now I have to modify the table manually to match the model.
In your src/sequelize.(js|ts), there should be a call like sequelize.sync(). This updates the structure of the database according to the model definitions every time the app gets started. So in theory, restarting the app should be enough.
If that doesn't work, try .sync({ alter: true }) or even .sync({ force: true }). (This might be fine during initial development, but is not recommended for production use as it can result in data loss. See also the docs about .sync() options.)
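For illustration, a minimal sketch of that part of src/sequelize.js (the connection string is a placeholder):

// Minimal sketch; the connection string is a placeholder.
const Sequelize = require('sequelize');
const sequelize = new Sequelize('mysql://user:pass@localhost:3306/mydb');

// Default: creates missing tables, but never alters existing ones.
// sequelize.sync();

// Issues ALTER TABLE statements so existing tables match the models.
sequelize.sync({ alter: true });

// Drops and re-creates all tables -- existing data is lost.
// sequelize.sync({ force: true });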
You already mentioned Migrations, which would be a better alternative for sure. With Umzug they can be automatically executed during startup as well.
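A rough sketch of running migrations at startup with Umzug (this assumes the v3-style API and a migrations/ directory; the glob and the sequelize instance from the sketch above are assumptions):

const { Umzug, SequelizeStorage } = require('umzug');

const umzug = new Umzug({
  migrations: { glob: 'migrations/*.js' },   // assumed location
  context: sequelize.getQueryInterface(),
  storage: new SequelizeStorage({ sequelize }),
  logger: console,
});

// Apply all pending migrations before the app starts serving.
umzug.up().then(() => console.log('migrations applied'));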
Good evening,
I am currently developing a way to import machine-created data from a CSV sheet into a database.
The question I have is: is there a way to react to a change in a CSV file with Lua?
The file gets a line in this format:
17162H,"801234500001",9/23/2016 12:33:30 PM,"INV"
Every time a scanner finishes a scan, a new line is appended below the old ones, but there is no direct connection to the database that could trigger the script.
It doesn't matter whether the change is detected via a different file size, the size of the folder containing the file, or a change in the file metadata (like the date of last access), but I can't keep the file open and re-read it constantly for performance reasons.
Also, this is the first time I've asked here, so sorry for my clunky phrasing; I'll try to improve over time.
Take a look at linotify; it has Lua bindings for inotify and looks like it should do the trick, using the "modify" event to trigger your script.
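For example, a minimal sketch with linotify (the CSV path is a placeholder); it tails the file by remembering the last read offset and reading whatever was appended on each modify event:

local inotify = require 'inotify'

local path = '/path/to/scans.csv'       -- placeholder
local handle = inotify.init()
handle:addwatch(path, inotify.IN_MODIFY)

-- Start at the current end of the file so only new lines are seen.
local f0 = io.open(path, 'r')
local offset = f0:seek('end')
f0:close()

for _ in handle:events() do             -- blocks until the file changes
    local f = io.open(path, 'r')
    f:seek('set', offset)
    for line in f:lines() do
        print('new line: ' .. line)     -- e.g. hand off to the database here
    end
    offset = f:seek()
    f:close()
end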
I use a LibUV-based variant in my spylog application.
Usage:
file_monitor(path_to_file, {eol = '\r?\n'}, function(line)
...
end)
If you need to run this on Windows, you can use the winapi library, which supports file watchers. Here is an example of how it's used in one of my projects; you'll need to call winapi.sleep() to allow time for the check to trigger.
For some reason, I can't seem to create a Bonfire module using the "existing" table option.
Earlier, it wasn't even displaying the list of fields from my database table when I selected the option to use an existing table vs. creating a new one.
But I figured out that it was a permissions thing, and so as a test I did the following:
chmod -R 777 /var/www/myapp
Now, it is querying the database and displaying all the correct fields from the table, but when I click on the build button, it just keeps redisplaying the same form.
What I've done so far:
- I created a test table in my database with just 2 fields. I tried to create a module using that table... but I get the same results.
- I've ensured that all my tables are prefixed with "bf_". If they weren't, the system wouldn't be able to find and list all the correct fields... I think.
- I've tested creating a new module using a new table. That seems to work just fine. Bonfire creates a new table in my database without any issues and also creates the correct folder structure for the module.
- I've tried to ensure that all fields have proper names and validation rules specified. In most cases, I just accepted the defaults and tried to click on build.
- I've changed the logging settings to log everything, but after trying to create a module and going back to the logs, there's nothing listed.
If you have any suggestions, I'd appreciate it.
EDIT 1
Figured out how the profiler works - I didn't realize that you had to click on the flame icon in the bottom-left corner of the screen.
Found the issue: there's a bug in CodeIgniter; I found a bug report on their GitHub site.
The post that I found was: https://github.com/ci-bonfire/Bonfire/issues/733
I have the following problem:
I am using the Django framework.
One part of the system (non-Django) writes to the same database that Django is using.
I want to get a signal when an object is saved. It's a Django model object, but it is not saved via Django; it's written directly into the MySQL database.
Is there a way Django can watch save actions in its database when the saving isn't done by Django?
The neatest way would be to create an API and let the save action run through it; the save signal could then be the Django default. (But this depends on work by external parties... so it's not the preferred route, though for future development it certainly is.)
Another option is to implement Celery and create a task that periodically checks whether any of the saved objects has had no follow-up... (also quite a bit of puzzling, I guess, to get this up and running).
But there might be an easier way... unknown to me?
I saw Django watchdog solutions for file systems... but not for databases (probably because Django has this built in... when everything is done properly through Django).
To complicate things: I test and develop locally with SQLite... but I can put the save signal in my tests without needing to get this working locally... as long as it works in MySQL, I am happy.
You can try this solution:
1. Create a new table 'django_watch' with one column 'object_id' (add other columns like 'created_datetime' etc. according to your standards).
2. Let's say your main table is 'object'. Add a MySQL trigger for the INSERT event on this table (a sketch follows below).
3. Inside the trigger, add an extra insert query that inserts the object_id into the 'django_watch' table.
4. Now you can have a cronjob that inspects the new table 'django_watch' (for updates to Django objects) and performs the necessary actions. You can run this cronjob continuously with a delay of about 1 minute (up to you).
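A minimal sketch of such a trigger in MySQL, assuming the main table is called object with an integer primary key id (both placeholder names from the steps above):

CREATE TRIGGER object_after_insert
AFTER INSERT ON object
FOR EACH ROW
    INSERT INTO django_watch (object_id, created_datetime)
    VALUES (NEW.id, NOW());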
In the end, I wrote an API that can be called by the third-party module. I delivered C code to the third party that logs on to Django and calls the GET endpoint of this API (built with Django REST Framework). The API just saves the object (whose id is given in the URL), and from there on it's default Django. The only thing the third party had to do was build my code for calling the API into their module as well....
Maybe not the best solution, but the easiest to implement for my problem....
When I run the example on my system with the mvn tomcat:run command, the data in my MySQL tables gets deleted.
So I have to re-insert the data every time I restart the application.
Assuming you followed the tutorial from the linked page exactly, the reason the MySQL data is being removed each time the application starts can be found in the hibernate.cfg.xml file.
Specifically, it's these lines:
<!-- This will drop our existing database and re-create a new one.
Existing data will be deleted! -->
<property name="hbm2ddl.auto">create</property>
I don't remember all the other values off the top of my head, but I believe validate will only check that your schema is correct without changing anything; update may also work.
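If keeping the data is the goal, changing the property to update (verify the value list against the Hibernate docs for your version) should apply schema changes without dropping tables:

<!-- Updates the schema in place; existing data is kept. -->
<property name="hbm2ddl.auto">update</property>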