I'm using MySQL with phpMyAdmin on Windows 8.1, together with XAMPP.
I created a database with several tables, and modified some fields.
I noticed that the next day the database fields go back to what they were the previous day, even though I had been using the database all day.
For example, I add some fields/columns to a table, then add data to those fields, then use the database for a day running queries. The next day the new columns I added to the tables are gone, the tables are back to the way they were the previous day, and all the data I added is gone along with the new columns. This is the second time it has happened, two days in a row.
Does anybody know what could be happening?
Go through this website; you may find the solution to your problem:
https://www.sitepoint.com/mysql-mistakes-php-developers/
I have a MySQL database running and I've got Tableau connected to it. The issue I am having is that the table is too long - it contains years of transactions. And I only care about the transactions in the most recent 60 days. I have added a filter on the date so I can get the subset I need. However, it is super slow every time I open the workbook since it will query on the whole table and then apply the filter. So my question is:
How can I make Tableau only load the most recent 60 days of data to start with? Thanks!
I see a few possible solutions:
Create a view or a materialised view in MySQL to cater for the last 60 days only (see the sketch after this list)
Change the sheet filter to a context filter. The (normal) sheet filter won't affect the data query, but a context filter will.
If it's still too slow, create an extract. You can schedule Tableau Server to update the extract every day.
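For the first option, here is a minimal sketch of such a view, assuming the base table is called transactions with a transaction_date column (both names are illustrative, not from your post):

    -- A view limited to the most recent 60 days of data
    CREATE VIEW recent_transactions AS
    SELECT *
    FROM transactions
    WHERE transaction_date >= CURDATE() - INTERVAL 60 DAY;

Pointing the Tableau data source at recent_transactions instead of the base table means the 60-day cut-off is applied in MySQL before any rows reach Tableau.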
Good Morning,
I'm fairly new to Access VBA and I've been trying to find a solution to a problem:
I've created a form from which users upload an Excel file to a database. A file-open prompt appears, the user selects the file, a temp table gets created and the data gets pulled into this table. From there a set of macros populates the required fields and pushes the complete set to a permanent table, and then the temp table gets deleted. Now I would like to take it a step further and try to count how many times a value has been uploaded to the table...
Let's say that a value appears in the table twice already; if a user then tries to upload the same value for the third time, it should be uploaded to a different table. Bear in mind that the file which users upload may contain values that are being uploaded for the first, second, third, etc. time.
Do you have any suggestions or solutions to my problem? Is it even possible? If yes, how can I make Access distinguish which records are being uploaded for the first, second, third, etc. time and follow the appropriate paths?
I've been scouring the internet for several days now, but no one seems to have this issue.
Thank you in advance for replies.
I'm not sure I follow. You are essentially trying to prevent inserting duplicate data into a production table and, if a duplicate is encountered, add the record to a different table?
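If that is the goal, one rough sketch is to do the routing with two append queries once the data is in the temp table. All table and column names here (tblTemp, tblPerm, tblOverflow, RefValue) are made up for illustration, not taken from your database:

    INSERT INTO tblOverflow (RefValue)
    SELECT t.RefValue
    FROM tblTemp AS t
    WHERE (SELECT COUNT(*) FROM tblPerm AS p WHERE p.RefValue = t.RefValue) >= 2;

and everything else gets appended to the permanent table as usual:

    INSERT INTO tblPerm (RefValue)
    SELECT t.RefValue
    FROM tblTemp AS t
    WHERE (SELECT COUNT(*) FROM tblPerm AS p WHERE p.RefValue = t.RefValue) < 2;

Note this only counts what is already in tblPerm; if the uploaded file itself repeats a value, you would need an extra pass over tblTemp to handle that.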
I haven't done much in Access for years, but I have picked up a bit of support work to help out.
Database is an Access 2003 one, running on 2010 in compatibility mode (Behaviour was the same on 2003 though) with the data in a SQL Server 2005 backend.
We have a single form (i.e. not master/subform) that is based upon a query joining two tables - it's a simple Organisation-to-Address relationship. It is theoretically many-to-one, but in practice one-to-one. I wasn't even sure if/how this would work, but it does in general.
If you create a new record (the form has a button, but it's the same if you use the built-in new record button), it happily generates a new address ID from the autonumber on the tblAddress table and populates the org_addr_code column in the Organisation table.
However, if you then try and create a second new record, it throws an error:
The Microsoft Access database engine cannot find a record in the table tblAddress with key matching field(s) 'ORG_ADDR_CODE'
If I hack records directly into the datasheet view of the underlying query, then it lets me add as many records as I like.
If I exit the form after inserting the first record, and go back in, I can add another record just fine. It's only if you try and do it more than once in the same form 'session'. I have tried every variation on refresh/requery I can think of, but no joy.
Anyone got any ideas? I'd rather not have to rewrite the whole form - if it came to that they will just have to stick to adding one record at a time.
Carl
I'm working on an app that is partly an employee time clock. It's not too complex but I want to make sure I head in the right direction the first time. I currently have this table structure:
id - int
employee_id - int (fk)
timestamp - mysql timestamp
event_code - int (1 for clock in, 0 for clock out)
I've got everything working where, if their last event was a "clock in", they only see the "clock out" button and vice versa.
My problem is that we will need to run a report that shows how many hours an employee has worked in a month and also total hours during the current fiscal year (Since June 1 of the current year).
Seems like I could store the clock-in and clock-out in the same record, and maybe even calculate the minutes worked between the two events and store that in a column called "worked". Then I would just need the sum of that column for an employee to know their total time.
Should I keep the structure I have, move to all on one row per pair of clock in and out events, or is there a better way that I'm totally missing?
I know human error is also a big issue for time clocks since people often forget to clock in or out and I'm not sure which structure can handle that easier.
Is MySQL Timestamp a good option or should I use UNIX Timestamp?
Thanks for any advice/direction.
Rich
I would go with two tables:
One table should be a simple log of which events occurred, like your existing design.
The second table contains the calculated working hours. It has columns for the clock-in and clock-out times, and perhaps also a third column with the time difference between them precalculated.
The point is that the calculation of how many hours an employee has worked is complicated, as you mention. Employees may complain that they worked longer hours than your program reports. In this case you want to have access to the original log of all events with no information loss so that you can see and debug exactly what happened. But this raw format is slow and difficult to work with in SQL so for reporting purposes you also want the second table so that you can quickly generate reports with weekly, monthly or yearly sums.
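As a rough sketch of that two-table layout plus the monthly report, with illustrative table and column names rather than anything from your post:

    -- Raw punch log, essentially your existing design
    CREATE TABLE time_events (
        id          INT AUTO_INCREMENT PRIMARY KEY,
        employee_id INT NOT NULL,
        event_time  TIMESTAMP NOT NULL,
        event_code  TINYINT NOT NULL            -- 1 = clock in, 0 = clock out
    );

    -- One row per paired clock-in/clock-out, with the difference precalculated
    CREATE TABLE work_periods (
        id             INT AUTO_INCREMENT PRIMARY KEY,
        employee_id    INT NOT NULL,
        clock_in       DATETIME NOT NULL,
        clock_out      DATETIME NULL,
        minutes_worked INT NULL                 -- TIMESTAMPDIFF(MINUTE, clock_in, clock_out)
    );

    -- Hours per employee for one month (the dates are placeholders)
    SELECT employee_id, SUM(minutes_worked) / 60 AS hours_worked
    FROM work_periods
    WHERE clock_in >= '2015-06-01' AND clock_in < '2015-07-01'
    GROUP BY employee_id;

The fiscal-year total is the same query with the date range changed to start on 1 June of the current year.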
Is MySQL Timestamp a good option or should I use UNIX Timestamp?
Timestamp is good because there are lots of MySQL functions that work well with it. You might also want to consider using datetime, which is very similar to timestamp.
A view that has been running for years selects specific columns from several tables and joins them.
Recently, I added a column to one of the tables, and the view no longer worked properly.
One of the columns in the query result contained data that was from another column in the table.
I rebuilt the view from a script - no changes to the script - and the problem went away.
The view does not look at the new column.
What is going on?
Your query is using SELECT *. You simply need to recompile it.
When the view is compiled, it stores the offsets of the fields in the record rather than the names of the fields. If the underlying table changes . . . well, the offsets no longer point to the correct positions.
I, unfortunately, learned this once upon a time after about 5 hours of trying to figure out why a number was getting a numeric error -- starting at 6:00 a.m. So this problem can even affect different types of data.
I do the recompile by scripting out the view as an alter, using SQL Server Management Studio, and then running the code (without any changes). However, you could change the code by putting in the explicit list of columns, and not having this problem in the future.
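For example, an explicit-column version might look like this (the view, table, and column names here are made up for illustration):

    -- Listing the columns explicitly means adding a column to a base table
    -- cannot silently shift which data appears in the view's output
    ALTER VIEW dbo.vwCustomerOrders
    AS
    SELECT c.CustomerID,
           c.CustomerName,
           o.OrderDate,
           o.Amount
    FROM dbo.Customers AS c
    JOIN dbo.Orders AS o
        ON o.CustomerID = c.CustomerID;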
You can also set up a script that recompiles all your views every evening, to prevent this problem in the future. We now have such a script as well.
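One way to do that nightly refresh is to loop over sys.views and call sp_refreshview for each one; a sketch:

    -- Refresh each view's metadata so it picks up changes to the underlying tables
    DECLARE @view nvarchar(600);
    DECLARE view_cursor CURSOR FOR
        SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + N'.' + QUOTENAME(name)
        FROM sys.views;
    OPEN view_cursor;
    FETCH NEXT FROM view_cursor INTO @view;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC sp_refreshview @view;
        FETCH NEXT FROM view_cursor INTO @view;
    END;
    CLOSE view_cursor;
    DEALLOCATE view_cursor;

You can schedule something like this as an overnight job so views never run against stale column offsets.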