Update made in MySQL database does not reflect on the webpage - mysql

I updated the contents of my website in its corresponding MySQL database through phpMyAdmin. The changes show up in the database, but they are not reflected on the webpage.
What could be the possible reasons?

You need to commit the transaction to save the changes.
More info:
https://dev.mysql.com/doc/refman/5.7/en/commit.html
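The effect is easy to demonstrate in miniature. Here is a minimal sketch using Python's sqlite3 module as a stand-in (an illustration only; the same principle applies to MySQL whenever autocommit is disabled): an update made on one connection stays invisible to other connections until the writer commits.

```python
import sqlite3
import os
import tempfile

# File-backed database so two separate connections can share it.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path)
writer.execute("CREATE TABLE page (id INTEGER PRIMARY KEY, body TEXT)")
writer.execute("INSERT INTO page VALUES (1, 'old content')")
writer.commit()

# Update WITHOUT committing.
writer.execute("UPDATE page SET body = 'new content' WHERE id = 1")

reader = sqlite3.connect(path)
before = reader.execute("SELECT body FROM page WHERE id = 1").fetchone()[0]
print(before)   # still 'old content' -- the update is not committed yet

writer.commit()  # make the change durable and visible to other connections
after = reader.execute("SELECT body FROM page WHERE id = 1").fetchone()[0]
print(after)    # now 'new content'
```

This is exactly the symptom described: the writing session sees its own change, but the webpage's connection keeps reading the old data until COMMIT runs.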

Check the database connectivity on your configuration page first. A stale or misconfigured connection can also prevent the changes from showing up.

Related

Relationships disappeared - MS Access 2016

I have a split Access database that's been in use for almost two years. The database resides on a computer that I access remotely via Remote Utilities: I transfer the db to my local PC, work on it, then transfer it back to the remote machine. We use EaseUS Todo Backup to create an image file of the database every 30 minutes. I am currently in the process of doing some refactoring and have run into the following issue:
All of the relationships in the database have somehow disappeared. Here is what is strange about this:
About a week prior to discovering this issue I had taken a copy of the database and did not have this issue.
The relationships are gone whether I open it on my local machine or the remote machine.
Upon finding this, the first thing I did was try to restore a backup to see if the relationships were there -- they were not.
This is what I can't figure out -- I had copied the file and everything was OK, then a week later no relationships were found in either the current copy or any of the backups from before the issue appeared.
I have tried the following to resolve this:
Updating Access on both machines.
Hiding all tables then adding back and showing 'all relationships' in relationships tab.
Looking for relationships in the database documenter.
Restoring old backups as mentioned.
I'm sure this could be a result of corruption -- but how could this corruption extend to the .pbd image files generated by EaseUS, that were created before the issue occurred?
Click the All Relationships button.
Repeatedly click the Hide Table button; each click hides one table at a time. Continue until the Hide Table button grays out, which means you've hidden every last table.
Click the All Relationships button again. All of the tables will reappear in rows of four, with their connecting lines intact and every table now visible.

How to know who is changing database data in mysql

Some changes were made in my database, which is on a server. How can I find out who changed the data and what changes they made?
I'm not sure whether it is possible in MySQL, but in Oracle you can create system triggers to check who has logged in, which values have been changed, and so on.
I recommend you browse the MySQL triggers documentation and see whether it covers what you need.
https://dev.mysql.com/doc/refman/8.0/en/triggers.html
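Triggers can indeed write an audit trail of data changes. Here is a small sketch of the pattern, using SQLite from Python so it is self-contained (the table and trigger names are made up for illustration; in MySQL you would use CREATE TRIGGER similarly, and could additionally record CURRENT_USER() to capture who made the change):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER);
CREATE TABLE audit_log (
    changed_at  TEXT DEFAULT CURRENT_TIMESTAMP,
    account_id  INTEGER,
    old_balance INTEGER,
    new_balance INTEGER
);

-- Fire on every UPDATE and record the old and new values.
CREATE TRIGGER audit_accounts AFTER UPDATE ON accounts
BEGIN
    INSERT INTO audit_log (account_id, old_balance, new_balance)
    VALUES (OLD.id, OLD.balance, NEW.balance);
END;
""")

conn.execute("INSERT INTO accounts VALUES (1, 100)")
conn.execute("UPDATE accounts SET balance = 250 WHERE id = 1")

row = conn.execute(
    "SELECT account_id, old_balance, new_balance FROM audit_log"
).fetchone()
print(row)  # (1, 100, 250)
```

Every subsequent UPDATE leaves a row in audit_log, so you get a timestamped before/after record without touching application code.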

getting record of manually made changes in phpmyadmin

My fellow admins have made some changes manually in phpMyAdmin. I want to get a record of the changes they have made. Is it possible to get that?
From what I can remember of phpMyAdmin, when you install it you get a 'config.sample.inc.php' file, which you copy to create the actual 'config.inc.php' containing all the configuration. You could use a file-difference tool to compare the two files and see what differs between the original and the current state. This is not an exact method, but it will offer some insight into changes made. The only other way I know offhand is if the config files are under version control, in which case you can compare previous versions.
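A small sketch of the file-comparison idea using Python's difflib (the file contents below are made-up stand-ins; in practice you would read config.sample.inc.php and config.inc.php from disk):

```python
import difflib

# Stand-in contents for the two config files being compared.
sample = [
    "$cfg['Servers'][$i]['host'] = 'localhost';",
    "$cfg['Servers'][$i]['auth_type'] = 'cookie';",
]
current = [
    "$cfg['Servers'][$i]['host'] = 'db.example.com';",
    "$cfg['Servers'][$i]['auth_type'] = 'cookie';",
]

# unified_diff marks removed lines with '-' and added lines with '+'.
diff = list(difflib.unified_diff(sample, current,
                                 fromfile="config.sample.inc.php",
                                 tofile="config.inc.php",
                                 lineterm=""))
for line in diff:
    print(line)
```

Only the changed host line shows up in the output; unchanged lines are listed as context. Note that this only reveals configuration changes, not changes your fellow admins made to the data itself.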
Hope this helps :)

How to handle database changes made by automatic update script while using Liquibase?

I'm developing a web application that also uses WordPress as part of it. I want to use Liquibase to track my database changes.
How should I handle database changes made by WordPress's automatic update script?
Can I just ignore them and put only my own changes in the Liquibase changelog file?
You could do a diffChangelog of the schemas after each WordPress upgrade so that Liquibase could keep track of the changes. You can just ignore them though - Liquibase doesn't really care about unknown schema objects. The only issue would be if your changes and the WordPress changes conflicted.
You can and should just ignore them.
Liquibase just does one thing. It keeps track of the fact that:
a certain command (say, createTable)...
...that looked a certain way at time 0 (the name of the table, its columns, etc.)...
...was definitively executed at time 0 (it stores this record in DATABASECHANGELOG).
That's it. It is not a structure enforcer or a database state reconstitution engine. It is quite possible—and permitted, and often expected—that the database will be changed by other tools and Liquibase will have no idea what went on.
So just keep your commands in your changelogs, don't worry about preexisting database structure, use preconditions to control whether or not your changesets run, and ignore everything else that might be going on in the database that happened due to other tools.
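As an illustration of that last point, a changeset can be guarded by a precondition so it is marked as run (rather than failing) if another tool has already created the object. This is only a sketch with made-up table and author names; check the Liquibase documentation for the exact attributes your version supports:

```xml
<changeSet id="1" author="me">
    <preConditions onFail="MARK_RAN">
        <not>
            <tableExists tableName="my_app_settings"/>
        </not>
    </preConditions>
    <createTable tableName="my_app_settings">
        <column name="id" type="int" autoIncrement="true">
            <constraints primaryKey="true"/>
        </column>
        <column name="value" type="varchar(255)"/>
    </createTable>
</changeSet>
```

With onFail="MARK_RAN", Liquibase skips the changeset and records it as executed when the table already exists, so your changelog coexists peacefully with whatever WordPress does on its own.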

Refresh destination schema metadata in Integration Services

I have been working on a huge ETL project with 150+ tables and during the design I had to make a major change on destination column names and data types for a couple of tables.
My problem is that I can't get SSIS to see the new schema for the tables I changed. So I would like to know how I can get SSIS to refresh this schema. I find it kind of ridiculous that there is no way to tell SSIS to update the metadata from the database schema, especially for a database migration.
Recreating the project from scratch is out of the question because I have already spent hours on it. Manually changing the 400+ columns I changed is not an option either.
What about using the Advanced Editor and pressing the Refresh button on the left side below?
Following my previous auto-answer, I finally found what was preventing the metadata from being refreshed.
When I originally modified my database, I actually executed another script that did a DROP on the table and then a CREATE TABLE to recreate the table from scratch. Because of that, SSIS was never able to detect the changes, and I had to do everything described in my other answer.
Later today I had to make some minor modification and this time I opted for an ALTER TABLE. Magically, this time SSIS detected all the changes even notifying me to refresh columns from the advanced editor, which worked fine.
So basically all these issues were caused by my limited knowledge of database administration best practices.
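The difference between the two approaches is easy to see in miniature. This sketch uses SQLite from Python purely for illustration (on SQL Server, where SSIS reads its metadata, the same distinction holds: DROP/CREATE produces a brand-new object, while ALTER TABLE modifies the existing one in place):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dest (id INTEGER, name TEXT)")
conn.execute("INSERT INTO dest VALUES (1, 'row kept across ALTER')")

# ALTER modifies the existing table in place: the data (and, on a real
# server, the object's identity) survive, so tools can detect the change.
conn.execute("ALTER TABLE dest ADD COLUMN amount REAL")
rows_after_alter = conn.execute("SELECT * FROM dest").fetchall()
print(rows_after_alter)  # [(1, 'row kept across ALTER', None)]

# DROP + CREATE replaces the object wholesale: the old table, its rows,
# and its identity are gone, which is why downstream tools get confused.
conn.execute("DROP TABLE dest")
conn.execute("CREATE TABLE dest (id INTEGER, name TEXT, amount REAL)")
rows_after_recreate = conn.execute("SELECT * FROM dest").fetchall()
print(rows_after_recreate)  # []
```

Preferring ALTER TABLE for incremental schema changes keeps the object continuous, which is what let SSIS notice the modification and offer to refresh the columns.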
I found a way to fix it but that was a bit tricky.
Even though I had completely removed any references to the table from my packages, I was always getting the old metadata.
I still don't have a clear explanation, but here is what I did to fix it:
Removed any reference to concerned source and destination tables
Deleted obj and bin folder from the project folder
Saved, closed and then reopened the project
Created new data flow from scratch and updated metadata was finally there
I don't know where that information was cached, but I suspect that the obj folder keeps a cached copy of your packages, or that Visual Studio keeps metadata in memory that is freed when you close it. Anyway, following these steps should fix it.