MySQL Event Scheduler: Copy and Insert at a Specific Time Every Day

Please, I need the correct syntax to copy and update a table every day from another table's MAX row values using MySQL's Event Scheduler. The task obviously exceeds my basic knowledge of MySQL.
This is what I have, but it's not working:
DELIMITER $$
CREATE
EVENT IF NOT EXISTS `Statement_Opening_Closing_Bal`
ON SCHEDULE EVERY 1 DAY STARTS '00:00:30'
DO BEGIN
-- GET CUSTOMER UNIQUE IDs
SELECT id
FROM customer AS Customer_UniqueIDs;
-- copy associated audit records
SELECT transactiondate, credit, debit, amount FROM passbook.Customer_UniqueIDs WHERE transactionid=MAX(transactionid) AS LastTransactionEachDay;
-- INSERT MAX ROW TRANSACTION FOR ABOVE SELECT INTO StatementofAccountRef.Customer_UniqueIDs,
INSERT INTO StatementofAccountRef.Customer_UniqueIDs (tid, entrydate, dtdebit, dtcredit, dtbalance)
VALUES(NULL, LastTransactionEachDay.entrydate, LastTransactionEachDay.dtdebit, LastTransactionEachDay.dtcredit, LastTransactionEachDay.dtbalance)
END */$$
DELIMITER ;
For clarification, I have a table called Customer with a column named "id" that contains a unique id for each customer.
I also have several tables with names of the form Passbook.id (e.g. Passbook2, Passbook3, etc.), where the "id" in the table name corresponds to a unique "id" in the Customer table. Each Passbook.id table has columns named transactiondate, credit, debit, amount.
And I also have several tables with names of the form StatementofAccountRef.id, where the "id" in the table name likewise corresponds to a unique "id" in the Customer table. Each StatementofAccountRef.id table has columns named tid, entrydate, dtdebit, dtcredit, dtbalance.
What I want to do is to:
1. Take each "id" from the Customer table and use it to SELECT each customer's passbook.id table.
2. Get the max row values from each passbook.id for the columns "transactiondate", "debit", "credit", "amount". By max row values I mean the last (most recent) row entry in passbook.id as at the time the Event Scheduler runs. These values will be picked at the specific time (00:00:30) every day whether they have changed or not, as long as they are the most recent entry in table passbook.id.
3. Insert the values into each corresponding StatementofAccountRef.id column: entrydate, dtdebit, dtcredit, dtbalance. The "tid" column in the StatementofAccountRef.id tables is set to AUTO_INCREMENT. entrydate (in StatementofAccountRef.id) is DATETIME, as is transactiondate in passbook.id.
Any working help is appreciated, guys. Thanks.
Table definitions and sample entries:
Table customer
CREATE TABLE `customer` (
`id` int(5) NOT NULL,
`name` varchar(255) NOT NULL,
`surname` varchar(255) DEFAULT NULL,
`gender` char(1) NOT NULL,
`dob` date NOT NULL,
`nominee` varchar(255) NOT NULL,
`account` varchar(255) NOT NULL,
`address` varchar(255) NOT NULL,
`mobile` varchar(15) NOT NULL,
`email` varchar(255) NOT NULL,
`accessid` varchar(60) NOT NULL,
`password` varchar(255) NOT NULL,
`branch` varchar(255) NOT NULL,
`Branch_Code` varchar(255) NOT NULL,
`lastlogin` datetime NOT NULL,
`accstatus` varchar(255) NOT NULL,
`accnumber` bigint(10) DEFAULT NULL,
`CustomerRefNum` varchar(255) CHARACTER SET utf8 COLLATE utf8_bin DEFAULT NULL,
`CustomerTokenRef` varchar(255) DEFAULT NULL,
`Acc_Manager` varchar(255) DEFAULT NULL,
`currency` varchar(255) DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=latin1
63, John, Denon, M, 1976-12-10, Jonny Lee, GoldPlus, 200 Monroe Street #233 Rockville MD 2085, cmailultimate#gmail.com, 0458ac7536f4101f597820c7f9136b2354f2f156, Zone C, Team 1, 2018-11-18 03-14-45, Active, 12113, CUDG23299-TYWB02323, Raymond Crow, Dollars
Table passbook63
CREATE TABLE `passbook63` (
`transactionid` int(5) NOT NULL,
`transactiondate` date DEFAULT NULL,
`name` varchar(255) DEFAULT NULL,
`surname` varchar(255) DEFAULT NULL,
`fullnames` varchar(255) DEFAULT NULL,
`branch` varchar(255) DEFAULT NULL,
`Branch_Code` varchar(255) DEFAULT NULL,
`credit` int(10) DEFAULT NULL,
`debit` int(10) DEFAULT NULL,
`amount` float(10,2) DEFAULT NULL,
`narration` varchar(255) DEFAULT NULL,
`recall_value` varchar(10) DEFAULT NULL,
`transactionRef` varchar(255) DEFAULT NULL,
`transactionreceipts` varchar(255) DEFAULT NULL,
`receiving_Inst` varchar(255) DEFAULT NULL,
`beneficiary_account` varchar(255) DEFAULT NULL,
`beneficiary_name` varchar(255) DEFAULT NULL,
`transaction_datetime` datetime NOT NULL
) ENGINE=MyISAM DEFAULT CHARSET=latin1
Sample Data: 19, 2017-10-15 06:06:06, John, Denon, John Denon, Zone C, Team 1, 6000, 6000, 6000, Double Bet On Customer 101, 0, WERW39489923, Triple Confirmation of Results Reguired, 12113, Harrison Due, 2017-10-19 01:06:06
Table StatementofAccountRef63
CREATE TABLE `StatementofAccountRef63` (
`tid` int(11) NOT NULL AUTO_INCREMENT,
`entrydate` datetime DEFAULT NULL,
`dtdebit` int(11) NOT NULL,
`dtcredit` int(11) NOT NULL,
`dtbalance` int(11) NOT NULL,
PRIMARY KEY (`tid`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1
Sample Data: 011, 2018-02-28 06:06:06, 36000, 36000, 96000
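Since the passbook.id and StatementofAccountRef.id table names are built from each customer id, the statements have to be assembled dynamically. A minimal sketch of one approach, using the column mapping described in the question (the procedure name is a placeholder; the dynamic SQL lives in a stored procedure so the event body stays a single CALL):
-- The event scheduler must be enabled: SET GLOBAL event_scheduler = ON;
DELIMITER $$
CREATE PROCEDURE copy_daily_closing_balances()
BEGIN
  DECLARE done INT DEFAULT 0;
  DECLARE cust_id INT;
  DECLARE cur CURSOR FOR SELECT id FROM customer;
  DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;
  OPEN cur;
  customer_loop: LOOP
    FETCH cur INTO cust_id;
    IF done THEN LEAVE customer_loop; END IF;
    -- Copy the most recent passbook<id> row (highest transactionid)
    -- into StatementofAccountRef<id>, using the mapping from the question
    SET @sql = CONCAT(
      'INSERT INTO StatementofAccountRef', cust_id,
      ' (entrydate, dtdebit, dtcredit, dtbalance)',
      ' SELECT transactiondate, debit, credit, amount',
      ' FROM passbook', cust_id,
      ' ORDER BY transactionid DESC LIMIT 1');
    PREPARE stmt FROM @sql;
    EXECUTE stmt;
    DEALLOCATE PREPARE stmt;
  END LOOP;
  CLOSE cur;
END$$
-- STARTS needs a full datetime; this expression means "tomorrow at 00:00:30"
CREATE EVENT IF NOT EXISTS `Statement_Opening_Closing_Bal`
ON SCHEDULE EVERY 1 DAY
STARTS CURRENT_DATE + INTERVAL 1 DAY + INTERVAL 30 SECOND
DO CALL copy_daily_closing_balances()$$
DELIMITER ;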

Related

Cannot remove duplicate values from a mysql table

I have a table ship_details which does not have any constraints. The data comes from an external data source, and the original designer of the table assumed the incoming data would not contain duplicates. Now I have to remove the duplicate entries. The table currently has 994,184 entries.
The table definition is
CREATE TABLE `ship_details` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`order_number` varchar(150) DEFAULT NULL,
`delivery_id` varchar(150) DEFAULT NULL,
`transaction_type` varchar(150) DEFAULT NULL,
`pick_date` varchar(150) DEFAULT NULL,
`pn_note_number` varchar(150) DEFAULT NULL,
`item_id` varchar(150) DEFAULT NULL,
`item_code` varchar(150) DEFAULT NULL,
`picked_quantity` varchar(150) DEFAULT NULL,
`lot_number` varchar(150) DEFAULT NULL,
`lot_expiry` varchar(150) DEFAULT NULL,
`name` varchar(150) DEFAULT NULL,
`delivered_date` varchar(150) DEFAULT NULL,
`extra_attrib1` varchar(150) DEFAULT NULL,
`extra_attrib2` varchar(150) DEFAULT NULL,
`extra_attrib3` varchar(150) DEFAULT NULL,
`extra_attrib4` varchar(150) DEFAULT NULL,
`extra_attrib5` varchar(150) DEFAULT NULL,
`extra_attrib6` varchar(150) DEFAULT NULL,
`extra_attrib7` varchar(150) DEFAULT NULL,
`extra_attrib8` varchar(150) DEFAULT NULL,
`extra_attrib9` varchar(150) DEFAULT NULL,
`extra_attrib10` varchar(150) DEFAULT NULL,
`last_updated` varchar(100) DEFAULT NULL,
`outbound_id` varchar(100) DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=994222 DEFAULT CHARSET=latin1;
I tried to delete the duplicate entries using the following script:
delete s1
from ship_details s1
inner join ship_details s2
where s1.id < s2.id
and s2.order_number = s1.order_number
and s2.delivery_id = s1.delivery_id
and s2.item_code = s1.item_code
and s2.lot_number = s1.lot_number
and s2.picked_quantity = s1.picked_quantity;
but that gave a lock wait timeout. Even if I restrict it to a particular order number, it still times out.
So I went for the approach of replicating the table with a unique constraint on order_number, delivery_id, item_code and picked_quantity.
So I tried to export the data from the original table by running the following command:
SELECT distinct order_number, delivery_id, transaction_type, pick_date, pn_note_number,
item_id, item_code, picked_quantity, lot_number, lot_expiry, name, delivered_date,
extra_attrib10,last_updated, outbound_id
FROM ship_details;
But this command did not give me unique results. It returns 154,948 rows. Please see this:
INSERT INTO clean_ship_details (order_number,delivery_id,transaction_type,pick_date,pn_note_number,item_id,item_code,picked_quantity,lot_number,lot_expiry,name,delivered_date,extra_attrib10,last_updated,outbound_id) VALUES
('181020373','10068965','Shipped','2018-11-11T15:50:48.000+04:00','PN176348','516169','VCH128','73','C34142','2021-02-28T00:00:00.000+04:00','DVT-6410','2019-06-18T15:48:12.000+04:00','','2019-06-18T15:54:40.000+04:00','51616973_73_'),
('181020373','10068965','Shipped','2018-11-11T15:50:48.000+04:00','PN176348','516169','VCH128','73','C34142','2021-02-28T00:00:00.000+04:00','DVT-6410','2019-06-18T15:48:12.000+04:00','','2019-06-18T15:54:40.000+04:00','58719373_73_'),
('181020373','10068965','Shipped','2018-11-11T15:50:48.000+04:00','PN176348','516170','VCH120','12','K33471/A','2020-10-31T00:00:00.000+04:00','DVT-6410','2019-06-18T15:48:12.000+04:00','','2019-06-18T15:54:40.000+04:00','51617012_12_'),
('181020373','10068965','Shipped','2019-06-19T12:22:39.000+04:00','PN239867','587193','VCH128','2','E34284','2021-04-30T00:00:00.000+04:00','DVT-6410','2019-06-18T15:48:12.000+04:00','','2019-06-18T15:54:40.000+04:00','5161692_2_'),
('181020373','10068965','Shipped','2019-06-19T12:22:39.000+04:00','PN239867','587193','VCH128','2','E34284','2021-04-30T00:00:00.000+04:00','DVT-6410','2019-06-18T15:48:12.000+04:00','','2019-06-18T15:54:40.000+04:00','5871932_2_'),
('191002479','10091039','Shipped','2019-02-12T07:50:55.000+04:00','PN186154','544495','VTP048','170','205809','2020-07-31T00:00:00.000+04:00','DVT-6479','2019-07-11T07:30:38.000+04:00','','2019-07-11T09:31:22.000+04:00','544495170_170_'),
('191002479','10091039','Shipped','2019-02-12T07:50:55.000+04:00','PN186154','544495','VTP048','170','205809','2020-07-31T00:00:00.000+04:00','DVT-6479','2019-07-11T07:30:38.000+04:00','','2019-07-11T09:31:22.000+04:00','594447170_170_'),
('191002479','10091039','Shipped','2019-07-18T07:45:49.000+04:00','PN249274','594447','VTP048','11','208744','2021-01-31T00:00:00.000+04:00','DVT-6479','2019-07-11T07:30:38.000+04:00','','2019-07-11T09:31:22.000+04:00','54449511_11_'),
('191002479','10091039','Shipped','2019-07-18T07:45:49.000+04:00','PN249274','594447','VTP048','11','208744','2021-01-31T00:00:00.000+04:00','DVT-6479','2019-07-11T07:30:38.000+04:00','','2019-07-11T09:31:22.000+04:00','59444711_11_'),
('191006312','10188037','Shipped','2019-03-31T12:17:39.000+04:00','PN201490','560373','VTP048','26','207783','2020-12-31T00:00:00.000+04:00','DVT-6694','2019-10-08T07:08:45.000+04:00','','2019-10-08T07:11:44.000+04:00','56037326_26_');
I cannot insert this into the new table.
Update: I tried to insert using a script, without success, as I get "lock wait timeout exceeded" even with a limit of just 1 record:
INSERT IGNORE INTO clean_ship_details (order_number,delivery_id,transaction_type,pick_date,pn_note_number,item_id,item_code,picked_quantity,lot_number,lot_expiry,name,delivered_date,last_updated,outbound_id) SELECT order_number,delivery_id,transaction_type,pick_date,pn_note_number,item_id,item_code,picked_quantity,lot_number,lot_expiry,name,delivered_date,last_updated,outbound_id FROM ship_details order by order_number,delivery_id,item_id limit 10;
Your second approach does dedupe if, as you say, there is a unique constraint on order_number, delivery_id, item_code and picked_quantity. Also, you don't need DISTINCT, because the unique key will detect the duplicates and you can INSERT IGNORE the error rows.
Using your sample data, something along the lines of the sketch below should work.
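A minimal sketch of that approach (the unique key columns are taken from the question; whether clean_ship_details already exists, and with which column types, is an assumption):
-- New table with the same structure, plus the dedup key enforced as a unique constraint
CREATE TABLE IF NOT EXISTS clean_ship_details LIKE ship_details;
ALTER TABLE clean_ship_details
  ADD UNIQUE KEY uq_ship (order_number, delivery_id, item_code, picked_quantity);
-- No DISTINCT needed: rows that collide on the unique key are silently skipped
INSERT IGNORE INTO clean_ship_details
SELECT * FROM ship_details;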

Creating a summary from two tables?

I have two tables and I'm trying to create a summary with the sum of the amount due per person, but I don't have the creative ID involved.
Table 1:
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`Name` varchar(255) NOT NULL,
`Lname` varchar(255) NOT NULL,
`phone` varchar(15) NOT NULL,
`address` varchar(255) DEFAULT NULL,
`city` varchar(255) NOT NULL,
`state` char(2) NOT NULL,
`zip` varchar(50) DEFAULT NULL,
`email` varchar(255) DEFAULT NULL,
`business_name` varchar(255) DEFAULT NULL,
Second table:
`id` INT(10) UNSIGNED NOT NULL AUTO_INCREMENT,
`C_ID` INT(10) UNSIGNED NOT NULL,
`Amount_Due` DECIMAL(7,2) NOT NULL DEFAULT 0,
`created_date` DATETIME NOT NULL,
`closed_date` DATETIME default NULL,
PRIMARY KEY (`id`)
) ENGINE=INNODB DEFAULT CHARSET=LATIN1;
Here is what I'm trying to do:
I'm trying to make a summary for dates between 5/1/18 and 6/15/18.
Have the sum of the amount due for each person.
Have these aliases: Business Name, Phone Number, Invoiced Amounts.
I'm trying to test my code but I'm getting errors:
SELECT Name,phone,SUM(Amount_Due) FROM test_customer,test_invoices
WHERE created_date BETWEEN '2018-05-01' AND "2018-06-15'
If I understand correctly, you need to use an explicit JOIN instead of the comma (an implicit CROSS JOIN), and GROUP BY the non-aggregated columns.
SELECT Name 'Customer Name',
phone 'Phone number',
SUM(i.Amount_Due) 'Amount due'
FROM test_customer c
INNER JOIN test_invoices i ON c.id = i.C_ID
GROUP BY Name,phone
sqlfiddle
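If the date range and the aliases from the question are also wanted, a hedged variant of the same query (business_name and created_date are taken from the table definitions above; the half-open range keeps the whole of 2018-06-15, since created_date is a DATETIME):
SELECT c.business_name 'Business Name',
       c.phone 'Phone Number',
       SUM(i.Amount_Due) 'Invoiced Amounts'
FROM test_customer c
INNER JOIN test_invoices i ON c.id = i.C_ID
WHERE i.created_date >= '2018-05-01'
  AND i.created_date < '2018-06-16'
GROUP BY c.business_name, c.phone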

SQLite3 multiple primary key fields with autoincrement (rails project)

I'm trying to create a SQLite3 database for use with a Rails application.
I have the same database created already in MySQL and am trying to replicate the same in SQLite3.
The create syntax for MySQL is
CREATE TABLE `leaderboard_A` (
`league` tinyint(1) NOT NULL,
`id` bigint(10) NOT NULL,
`cut` tinyint(1) NOT NULL,
`wd` tinyint(1) NOT NULL,
`tie` tinyint(1) NOT NULL,
`pos` int(4) NOT NULL,
`pos_s` varchar(10) COLLATE utf8_unicode_ci NOT NULL,
`name` varchar(128) COLLATE utf8_unicode_ci NOT NULL,
`to_par` smallint(3) NOT NULL,
`to_par_s` varchar(5) COLLATE utf8_unicode_ci NOT NULL,
`hole` varchar(8) COLLATE utf8_unicode_ci NOT NULL,
`round` smallint(6) NOT NULL,
`round_s` varchar(8) COLLATE utf8_unicode_ci NOT NULL,
`round_1` int(11) NOT NULL,
`round_2` int(11) NOT NULL,
`round_3` int(11) NOT NULL,
`round_4` int(11) NOT NULL,
`total` smallint(4) NOT NULL,
`tournament_id` varchar(13) COLLATE utf8_unicode_ci NOT NULL,
`tournament_name` varchar(255) COLLATE utf8_unicode_ci NOT NULL,
`year` smallint(4) NOT NULL,
PRIMARY KEY (`league`,`id`,`tournament_id`),
KEY `pos` (`pos`),
KEY `player_key` (`name`,`tournament_name`,`year`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
I believe I need to have an ID column added to my SQLite3 schema.
Is it possible to have an autoincrement column in SQLite3 as well as making other columns part of the primary key?
I have some SQLite3 SQL written already with just a slightly different column setup as below.
CREATE TABLE IF NOT EXISTS Leaderboard (
id INTEGER,
current_position TEXT,
current_round INTEGER,
country TEXT,
is_amateur BOOLEAN,
first_name TEXT,
last_name TEXT,
name TEXT,
player_id INTEGER,
round1 INTEGER,
round2 INTEGER,
round3 INTEGER,
round4 INTEGER,
start_position TEXT,
status TEXT,
thru INTEGER,
today INTEGER,
total INTEGER,
tournament_name TEXT,
tournament_id INTEGER,
start_date datetime,
end_date datetime,
year INTEGER,
PRIMARY KEY (id, player_id, tournament_id, year)
)
My eventual solution is to read in a JSON record, update the details if the record exists, and if not, create the record using the SQL below:
INSERT OR REPLACE INTO Leaderboard (
current_position, current_round, country, is_amateur, first_name, last_name, name, player_id, round1, round2, round3, round4, start_position, status, thru, today, total, tournament_name, tournament_id, start_date, end_date, year)
VALUES
('1','4','USA','false','Jordan','Spieth','Spieth, Jordan','34046','68','67','71','69','T1','active','18','-1','-78','US Open','026','2015-06-18','2015-06-21','2015')
Any feedback, suggestions or fixes would be appreciated. New to working all this out. :)
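One common workaround, as a sketch (the UNIQUE column list is guessed from the draft schema above): SQLite only allows AUTOINCREMENT on a single INTEGER PRIMARY KEY column, so the composite key can be kept as a UNIQUE constraint next to a surrogate id:
CREATE TABLE IF NOT EXISTS Leaderboard (
  id            INTEGER PRIMARY KEY AUTOINCREMENT, -- the only column AUTOINCREMENT is allowed on
  player_id     INTEGER NOT NULL,
  tournament_id INTEGER NOT NULL,
  year          INTEGER NOT NULL,
  name          TEXT,
  total         INTEGER,
  -- ...the remaining columns from the draft above stay unchanged...
  UNIQUE (player_id, tournament_id, year)          -- stands in for the MySQL composite primary key
)
With that constraint in place, INSERT OR REPLACE treats a clash on (player_id, tournament_id, year) as "the record already exists" and replaces it, although the replaced row does get a fresh id.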

Need to generate an absent report based on days worked table data

This is my first question. So please bear with me if I have asked something stupid.
I have two tables in my mySql database named 'users', 'user_checkin_checkout'
Each time a user logs in to the website, I create a new record in the 'user_checkin_checkout' table, and when the user logs out, I update that record with the current time.
Table structure:
# Users
CREATE TABLE IF NOT EXISTS `users` (
`id` int(11) NOT NULL auto_increment,
`name` varchar(255) NOT NULL,
`password` varchar(255) NOT NULL,
`address` varchar(255) NOT NULL,
`city` varchar(255) NOT NULL,
`state` varchar(255) NOT NULL,
`zip` varchar(255) NOT NULL,
`phone` varchar(255) NOT NULL,
`email` varchar(255) NOT NULL,
`last_login` datetime NOT NULL,
`date_added` datetime NOT NULL,
`date_modified` datetime NOT NULL,
PRIMARY KEY (`id`)
)
# User checkin/checkout
CREATE TABLE IF NOT EXISTS `user_checkin_checkout` (
`user_id` int(11) NOT NULL,
`id` int(11) NOT NULL auto_increment,
`checkin_date` datetime NOT NULL,
`checkout_date` datetime NOT NULL,
`checkin_ip_address` varchar(255) NOT NULL,
`checkout_ip_address` varchar(255) NOT NULL,
PRIMARY KEY (`id`)
)
Now, I need to generate a report that takes a from date and a to date as parameters, and I want to know on which days the users were absent (not checked in). I am struggling to write a query for this. A plain join between the two tables only shows the days when a user actually checked in. Should I create another table that holds all of the days between the entered dates and join with that table instead? Please advise.
Thanks in advance.
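That calendar-table idea is the usual approach; a rough sketch (the calendar table, its contents, and the date range are assumptions standing in for the report parameters):
-- One row per date in the reporting window, populated by a script or a series of inserts
CREATE TABLE calendar (cal_date DATE PRIMARY KEY);
-- Every (user, day) combination with no matching check-in is an absence
SELECT u.id, u.name, c.cal_date AS absent_day
FROM users u
CROSS JOIN calendar c
LEFT JOIN user_checkin_checkout cc
       ON cc.user_id = u.id
      AND DATE(cc.checkin_date) = c.cal_date
WHERE c.cal_date BETWEEN '2018-05-01' AND '2018-05-31'  -- the report's from/to parameters
  AND cc.id IS NULL
ORDER BY u.id, c.cal_date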

MySQL: Can this Join query be optimized?

I have two tables. The create commands for those tables are as follows:
CREATE TABLE `invoice` (
`Id` bigint(20) NOT NULL,
`VersionNumber` int(11) NOT NULL,
`CreationDate` datetime default NULL,
`ModificationDate` datetime default NULL,
`CreateBy` bigint(20) default NULL,
`ModifyBy` bigint(20) default NULL,
`BusinessId` varchar(255) character set utf8 default NULL,
`Status` int(11) default NULL,
`VendorId` bigint(20) default NULL,
`CustomerId` bigint(20) default NULL,
`OrderDate` date default NULL,
`ExpectedDate` date default NULL,
`DeliveryDate` date default NULL,
`InvoiceFor` int(11) default NULL,
`Reference` varchar(255) character set utf8 default NULL,
`Agent` varchar(255) character set utf8 default NULL,
`BnAgent` varchar(255) character set utf8 default NULL,
`Note` varchar(255) character set utf8 default NULL,
`HasVoucher` char(1) character set utf8 default NULL,
PRIMARY KEY (`Id`),
KEY `CustomerId` (`CustomerId`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
CREATE TABLE `invoiceitem` (
`Id` bigint(20) NOT NULL,
`VersionNumber` int(11) NOT NULL,
`CreationDate` datetime default NULL,
`ModificationDate` datetime default NULL,
`CreateBy` bigint(20) default NULL,
`ModifyBy` bigint(20) default NULL,
`BusinessId` varchar(255) default NULL,
`Status` int(11) default NULL,
`InvoiceId` bigint(20) default NULL,
`ProductId` bigint(20) default NULL,
`PackageQty` decimal(19,5) default NULL,
`PackagePrice` decimal(19,5) default NULL,
`ItemPerPackage` decimal(19,5) default NULL,
`ItemQty` decimal(19,5) default NULL,
`ItemPrice` decimal(19,5) default NULL,
`StoreQty` decimal(19,5) default NULL,
`RevenuePercent` decimal(19,5) default NULL,
`PurchasePrice` decimal(19,5) default NULL,
`Vat` decimal(19,5) default NULL,
`TotalAmount` decimal(19,5) default NULL,
`InvoiceItemFor` int(11) default NULL,
`LifeTimeUptoDate` datetime default NULL,
PRIMARY KEY (`Id`),
KEY `invoiceid_productid` (`InvoiceId`,`ProductId`),
KEY `ProductId` (`ProductId`),
CONSTRAINT `FK_invoiceitem` FOREIGN KEY (`InvoiceId`) REFERENCES `invoice` (`Id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
Purpose of this query
I have to show the list of invoices to the admin. He needs to see the invoice date, invoice number, customer id, product id, and, if the customer has taken that product previously, that previous date.
Now the Query
SELECT iv.id, iv.CustomerId, ii.ProductId, MAX(iv2.orderdate) AS PriviousDate
FROM invoice AS iv
INNER JOIN invoice AS iv2
ON iv.CustomerId=iv2.CustomerId
AND iv.OrderDate>iv2.OrderDate
INNER JOIN invoiceItem AS ii
ON iv.id=ii.invoiceId
INNER JOIN invoiceItem AS ii2
ON iv2.id=ii2.invoiceId
AND ii.ProductId=ii2.ProductId
WHERE iv.Status=0
AND ii.Status=0
AND iv2.Status=0
AND ii2.Status=0
GROUP BY ii.ProductId, iv.CustomerId
Here iv.id and ii.id are the primary keys,
the customerId and productId fields are also indexed,
and invoiceId is the foreign key (iv.id = ii.invoiceId).
I am running this query on my MySQL server (local),
but most of the time I get a timeout. How can I optimize this query?
The EXPLAIN for this query is as follows:
Now I apply
create index invoiceid_productid
on invoiceItem(invoiceId, productId)
and after that the EXPLAIN result is:
Not much to go on here (it would have been helpful to see what your indexes are), but assuming the invoice ids increase in line with invoice date... I suspect this might be slightly faster:
SELECT iv.id, iv.CustomerId, ii.ProductId, MAX(iv2.orderdate) AS PriviousDate
FROM invoice AS iv
INNER JOIN invoice AS iv2
ON iv.CustomerId=iv2.CustomerId
AND iv.id>iv2.id
INNER JOIN invoiceItem AS ii
ON iv.id=ii.invoiceId
INNER JOIN invoiceItem AS ii2
ON iv2.id=ii2.invoiceId
AND ii.ProductId=ii2.ProductId
AND ii.invoiceId>ii2.invoiceId
WHERE iv.Status=0
AND ii.Status=0
AND iv2.Status=0
AND ii2.Status=0
GROUP BY ii.ProductId, iv.CustomerId
But you really need an index on ProductId, Status and InvoiceId in the invoiceItem table before it's going to be appreciably faster.
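For example, something along these lines (the index name and the exact column order are assumptions and should be checked against the EXPLAIN output):
CREATE INDEX ix_invoiceitem_prod_status_inv
    ON invoiceitem (ProductId, Status, InvoiceId);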
Looking at your EXPLAIN results, the problem is clearly the invoiceItem table. It has 200,000 rows, and MySQL isn't using any indexes to access it.
So you should look at creating indexes to speed access to it.
I mocked up your tables:
create table invoice (
id int auto_increment not null,
customerid int not null,
orderdate timestamp not null,
status int default 0 not null,
primary key(id)
);
create table invoiceItem (
id int auto_increment not null,
invoiceId int not null,
productId int not null,
status int default 0 not null,
primary key (id)
);
I created these indexes:
create index invoiceId on invoiceItem(invoiceId);
create index cod on invoice(customerid, orderdate);
create index stat on invoice(status);
My EXPLAIN (on empty tables) now has all tables using type ref.
You can also see this tutorial which is for MySQL ;-)