I have a table named 'cyclistic'. I'm trying to bin the data in order to make a histogram. Since some bins contain no elements, I created a new table so I could use a LEFT JOIN. However, I still do not get bins with count = 0.
A sample of my table 'cyclistic', which contains more than 5 million rows:
rideable_type  started_at           ended_at             member_casual
electric_bike  2022-01-13 11:59:47  2022-01-13 12:02:44  casual
electric_bike  2022-01-10 08:41:56  2022-01-10 08:46:17  casual
classic_bike   2022-01-25 04:53:40  2022-01-25 04:58:01  member
classic_bike   2022-01-04 00:18:04  2022-01-04 00:33:00  casual
classic_bike   2022-01-20 01:31:10  2022-01-20 01:37:12  member
classic_bike   2022-01-11 18:48:09  2022-01-11 18:51:31  member
classic_bike   2022-01-30 18:32:52  2022-01-30 18:49:26  member
classic_bike   2022-01-22 12:20:02  2022-01-22 12:32:06  member
electric_bike  2022-01-17 07:34:41  2022-01-17 08:00:08  member
classic_bike   2022-01-28 15:27:53  2022-01-28 15:35:16  member
classic_bike   2022-01-11 18:27:59  2022-01-11 18:34:20  member
electric_bike  2022-01-29 12:30:43  2022-01-29 12:43:04  member
classic_bike   2022-01-02 17:56:18  2022-01-02 18:05:38  member
classic_bike   2022-01-20 22:03:06  2022-01-20 22:09:59  member
electric_bike  2022-01-08 05:36:40  2022-01-08 05:46:40  casual
CREATE TABLE bins (
  bin VARCHAR(25),
  start TIME,
  end TIME
);
INSERT INTO bins (bin,start,end)
VALUES
('A [+0 min-1 min]','00:00:00','00:01:00'),
('B [+1 min-113 min]','00:01:01','01:53:00'),
('C [+113 min-223 min]','01:53:01','03:43:00'),
('D [+223 min-335 min]','03:43:01','05:35:00'),
('E [+335 min-447 min]','05:35:01','07:27:00'),
('F [+447 min-559 min]','07:27:01','09:19:00'),
('G [+559 min-671 min]','09:19:01','11:11:00'),
('H [+671 min-783 min]','11:11:01','13:03:00'),
('I [+783 min-895 min]','13:03:01','14:55:00'),
('J [+895 min-1007 min]','14:55:01','16:47:00'),
('K [+1007 min-1119 min]','16:47:01','18:39:00'),
('L [+1119 min-1231 min]','18:39:01','20:31:00'),
('M [+1231 min-1343 min]','20:31:01','22:23:00'),
('N [+1343 min-1440 min]','22:23:01','24:00:00'),
('O [+1440 min]','24:00:01','689:47:15');
I GROUP all data into bins.bin:
SELECT bins.bin,
       c.member_casual,
       c.rideable_type,
       COALESCE(COUNT(*),0)
FROM bins
LEFT JOIN cyclistic AS c
       ON TIMEDIFF(c.ended_at, c.started_at) BETWEEN bins.start AND bins.end
GROUP BY bins.bin, c.member_casual, c.rideable_type;
However, I do not get the expected output of 75 rows, including bins with count = 0. Instead, I get a table with 58 rows:
bin  member_casual  rideable_type  COALESCE(COUNT(*),0)
…    …              classic_bike   12892
…    …              docked_bike    1528
…    …              electric_bike  33882
…    …              classic_bike   25999
…    …              electric_bike  47882
…    …              classic_bike   857810
…    …              docked_bike    158858
…    …              electric_bike  1213471
…    …              classic_bike   1679971
…    …              electric_bike  1584591
…    …              classic_bike   15025
…    …              docked_bike    11824
…    …              electric_bike  5263
…    …              classic_bike   1661
…    …              electric_bike  2426
…    …              classic_bike   1213
…    …              docked_bike    1554
…    …              electric_bike  323
…    …              classic_bike   400
…    …              electric_bike  635
…    …              classic_bike   400
…    …              docked_bike    443
…    …              electric_bike  64
…    …              classic_bike   176
…    …              electric_bike  184
…    …              classic_bike   287
…    …              docked_bike    229
…    …              electric_bike  51
…    …              classic_bike   137
…    …              electric_bike  168
…    …              classic_bike   275
…    …              docked_bike    172
…    …              classic_bike   129
…    …              electric_bike  1
…    …              classic_bike   213
…    …              docked_bike    178
…    …              classic_bike   111
…    …              classic_bike   194
…    …              docked_bike    127
…    …              classic_bike   131
…    …              classic_bike   159
…    …              docked_bike    126
…    …              classic_bike   113
…    …              classic_bike   126
…    …              docked_bike    109
…    …              classic_bike   80
…    …              classic_bike   94
…    …              docked_bike    116
…    …              classic_bike   47
…    …              classic_bike   84
…    …              docked_bike    92
…    …              classic_bike   38
…    …              classic_bike   46
…    …              docked_bike    73
…    …              classic_bike   26
…    …              classic_bike   2602
…    …              docked_bike    2032
…    …              classic_bike   715
Maybe the problem is with COUNT and LEFT JOIN.
How do I get count = 0 when using a LEFT JOIN?
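For what it's worth, COUNT(*) can never return 0 here: an empty bin still produces one joined row (full of NULLs), and grouping by the right-hand table's columns collapses each empty bin into a single NULL group instead of one group per category. A sketch of one way around both issues, assuming the expected 75 rows come from 15 bins times the 5 member_casual/rideable_type combinations that actually occur in the data, is to enumerate the combinations first and count a non-null column from the right-hand table:

SELECT b.bin,
       combos.member_casual,
       combos.rideable_type,
       COUNT(c.started_at) AS ride_count  -- counts only matched rides, so empty bins give 0
FROM bins AS b
CROSS JOIN (SELECT DISTINCT member_casual, rideable_type FROM cyclistic) AS combos
LEFT JOIN cyclistic AS c
       ON TIMEDIFF(c.ended_at, c.started_at) BETWEEN b.start AND b.end
      AND c.member_casual = combos.member_casual
      AND c.rideable_type = combos.rideable_type
GROUP BY b.bin, combos.member_casual, combos.rideable_type;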
I have a table like this
create table users(
  user_id int not null auto_increment,
  user_age date,
  user_address varchar(255),
  primary key(user_id)
);
user_id  user_age    user_address
1        2010-01-05  87 Polk St. Suite 5
2        2010-01-06  Carrera 52 con Ave. Bolivar #65-98 Liano Largo
3        2010-01-07  Ave. 5 de Mayo Porlamar
4        2010-01-08  89 Chiaroscuro Rd.
5        2010-01-09  Via Ludovico il Moro 22
6        2010-01-10  Rue JosephBens 532
7        2011-01-05  43 rue St. Laurent
8        2011-01-06  Heerstr. 22
9        2011-01-07  South House 300 Queensbridge
10       2011-01-08  Ing. Gustavo Moncaa 8585 Piso 20-A
11       2011-01-09  Obere Str. 57
12       2011-01-10  Avda. de la Constitución 2222
13       2012-01-05  Mataderos 2312
14       2012-01-06  120 Hanover Sq.
15       2012-01-07  Berguvsvägen 8
16       2012-01-08  Forsterstr. 57
And I'd like my table to be like this:
user_id  user_age    user_address
1        2010-01-05  87 Polk St. Suite 5
2        2010-01-06  Carrera 52 con Ave. Bolivar #65-98 Liano Largo
7        2011-01-05  43 rue St. Laurent
8        2011-01-06  Heerstr. 22
13       2012-01-05  Mataderos 2312
14       2012-01-06  120 Hanover Sq.
How can I make this happen with a GROUP BY statement?
If your MySQL version supports window functions, try:
with cte as
(
  SELECT *, ROW_NUMBER() OVER (PARTITION BY year(user_age) ORDER BY user_age) AS row_num
  FROM users
)
select user_id, user_age, user_address
from cte
where row_num <= 2;
My table looks like the following:
id person counter
1 Ona 4946
2 Mayra 15077
3 Claire 496
4 Rita 13929
5 Demond 579
6 Winnifred 13580
7 Green 1734
8 Jacquelyn 19092
9 Aisha 5572
10 Kian 8826
11 Alexandrea 7514
12 Dalton 14151
13 Rossie 18403
14 Carson 19537
15 Mason 2022
16 Emie 2394
17 Jonatan 6655
18 June 5037
19 Jazmyn 10856
20 Mittie 18928
Here is the fiddle.
I would like to select the top 5 by counter and group by the first character. Here is the SQL that I tried:
SELECT SUBSTR(person,1,1) AS Alpha, person, counter
FROM myTable
GROUP BY SUBSTR(person,1,1)
ORDER BY SUBSTR(person,1,1) ASC, counter DESC;
How do I select the desired result, like the following:
alpha person counter
a Arvid 9236
a Aisha 5572
a Alf 4000
a Ahmad 3500
a Alvin 2100
b Brandon 13000
b Ben 8230
b Bonny 7131
b Bella 4120
b Bun 1200
c Connie 9320
c Calvin 8310
c Camalia 6123
c Cimon 3419
c Clay 2515
I'm using MySQL 8.0.
You can do:
select *
from (
  select *, row_number() over(partition by substr(person, 1, 1)
                              order by counter desc) as rn
  from myTable
) x
where rn <= 5
order by substr(person, 1, 1), rn;
Result:
id person counter rn
---- ---------- -------- --
153 Alf 19758 1
283 Alycia 19706 2
260 Abe 19463 3
223 Assunta 18808 4
300 Ari 18031 5
210 Bennie 18309 1
159 Barry 18281 2
128 Beulah 18080 3
314 Benny 16795 4
474 Barry 15789 5
342 Casandra 19656 1
14 Carson 19537 2
67 Chaim 19429 3
280 Colin 18507 4
500 Corbin 18433 5
380 Daphney 19138 1
234 Dejah 18781 2
241 Derrick 18722 3
49 Dasia 18562 4
312 Darrel 17903 5
163 Evalyn 19847 1
79 Ernestine 19523 2
344 Emilie 19520 3
371 Eva 19119 4
469 Emma 18403 5
140 Fiona 19522 1
216 Flo 18314 2
356 Frieda 16082 3
254 Floy 15942 4
54 Florencio 12739 5
447 Geoffrey 19858 1
327 Geoffrey 19223 2
335 Grant 19100 3
454 Giuseppe 16175 4
83 Gardner 15235 5
373 Hilario 19507 1
35 Hanna 19276 2
200 Halle 18150 3
491 Hailee 17521 4
411 Hermann 17018 5
21 Idella 7440 1
177 Izabella 5536 2
115 Isai 4164 3
412 Izabella 2112 4
275 Imani 573 5
195 Joannie 19374 1
8 Jacquelyn 19092 2
48 Jalon 18861 3
251 Jamie 18768 4
367 Joanny 17600 5
282 Kendra 19278 1
421 Kendra 19213 2
363 Kaylin 18977 3
96 Kaylie 18423 4
310 Katrine 17754 5
146 Lonzo 19778 1
194 Leonora 18258 2
399 Laurine 16847 3
137 Leslie 16718 4
190 Luther 16318 5
87 Maegan 19112 1
20 Mittie 18928 2
271 Mariana 18149 3
317 Mary 18043 4
305 Maybelle 17666 5
281 Noelia 19203 1
176 Nickolas 19047 2
408 Nelson 15901 3
142 Nasir 13700 4
366 Nicole 10694 5
423 Ova 19759 1
487 Osborne 19539 2
438 Ozella 18911 3
375 Ora 18270 4
414 Onie 17358 5
52 Pascale 19658 1
39 Pearlie 17621 2
364 Price 14177 3
161 Precious 10337 4
294 Paula 9162 5
70 Quincy 18343 1
73 Quincy 16631 2
192 Quentin 13578 3
131 Rodger 19776 1
231 Royal 19033 2
313 Rocky 19008 3
13 Rossie 18403 4
45 Rosanna 15992 5
418 Sydnee 19810 1
470 Sadie 19189 2
123 Shanna 18862 3
485 Savanah 18664 4
302 Steve 16412 5
406 Toney 18283 1
28 Tremaine 16400 2
98 Taurean 15911 3
278 Tremaine 14391 4
311 Treva 14026 5
239 Ubaldo 11630 1
78 Valentina 17736 1
458 Vita 17527 2
170 Vergie 16971 3
158 Vance 15089 4
272 Veronica 12027 5
102 Willis 18155 1
329 Ward 14919 2
156 Westley 14867 3
136 Winnifred 14315 4
6 Winnifred 13580 5
323 Yolanda 17920 1
155 Yesenia 6164 2
402 Zachary 19129 1
37 Zaria 5398 2
See running example at DB Fiddle.
I am trying to parse the 1st table located here using BeautifulSoup in Python. It parsed my first column, but for some reason it didn't parse the entire table. Any help is appreciated!
Note: I am trying to parse the entire table and convert it into a pandas dataframe.
My Code:
import requests
from bs4 import BeautifulSoup
WIKI_URL = requests.get("https://en.wikipedia.org/wiki/NCAA_Division_I_FBS_football_win-loss_records").text
soup = BeautifulSoup(WIKI_URL, features="lxml")
print(soup.prettify())
my_table = soup.find('table',{'class':'wikitable sortable'})
links=my_table.findAll('a')
print(links)
It only parsed one column because you did a findAll for only the items in the first column. To parse the entire table, you'd have to do a findAll for the table rows <tr> and then a findAll within each row for the table data cells <td>. Right now you are just doing a findAll for the links and then printing those links.
my_table = soup.find('table', {'class': 'wikitable sortable'})
for row in my_table.findAll('tr'):
    print(','.join([td.get_text(strip=True) for td in row.findAll('td')]))
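If you then want that as a pandas dataframe (per the note in the question), one sketch is to collect the cell texts into a list of lists and hand it to pandas. This assumes every row has the same number of cells; 'th' is included so the header row is captured too:

import pandas as pd
import requests
from bs4 import BeautifulSoup

html = requests.get("https://en.wikipedia.org/wiki/NCAA_Division_I_FBS_football_win-loss_records").text
soup = BeautifulSoup(html, features="lxml")
my_table = soup.find('table', {'class': 'wikitable sortable'})

# One inner list per <tr>; 'th' cells cover the header row, 'td' cells the data.
rows = [[cell.get_text(strip=True) for cell in row.findAll(['th', 'td'])]
        for row in my_table.findAll('tr')]
df = pd.DataFrame(rows[1:], columns=rows[0])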
NOTE: Accept B.Adler's solution as it is good work and sound advice. This solution is simply so you can see some alternatives as you are learning.
Whenever I see <table> tags, I'll usually check out pandas first to see if I can find what I need from the tables that way. pd.read_html() will return a list of dataframes, and you can work with and manipulate those to extract what you need.
import pandas as pd
WIKI_URL = "https://en.wikipedia.org/wiki/NCAA_Division_I_FBS_football_win-loss_records"
tables = pd.read_html(WIKI_URL)
You can also look through the dataframes to see which has the data you want.
I just used the dataframe at index position 2 for this one, which is the first table you were looking for:
table = tables[2]
Output:
print (table)
0 1 ... 6 7
0 Team Won ... Total Games Conference
1 Michigan 953 ... 1331 Big Ten
2 Ohio State 1 911 ... 1289 Big Ten
3 Notre Dame 2 897 ... 1263 Independent
4 Boise State 448 ... 618 Mountain West
5 Alabama 3 905 ... 1277 SEC
6 Oklahoma 896 ... 1274 Big 12
7 Texas 908 ... 1311 Big 12
8 USC 4 839 ... 1239 Pac-12
9 Nebraska 897 ... 1325 Big Ten
10 Penn State 887 ... 1319 Big Ten
11 Tennessee 838 ... 1281 SEC
12 Florida State 5 544 ... 818 ACC
13 Georgia 819 ... 1296 SEC
14 LSU 797 ... 1259 SEC
15 Appalachian State 617 ... 981 Sun Belt
16 Georgia Southern 387 ... 616 Sun Belt
17 Miami (FL) 630 ... 1009 ACC
18 Auburn 759 ... 1242 SEC
19 Florida 724 ... 1182 SEC
20 Old Dominion 76 ... 121 C-USA
21 Coastal Carolina 112 ... 180 Sun Belt
22 Washington 735 ... 1234 Pac-12
23 Clemson 744 ... 1248 ACC
24 Virginia Tech 743 ... 1262 ACC
25 Arizona State 614 ... 1032 Pac-12
26 Texas A&M 741 ... 1270 SEC
27 Michigan State 701 ... 1204 Big Ten
28 West Virginia 750 ... 1292 Big 12
29 Miami (OH) 690 ... 1195 MAC
.. ... ... ... ... ...
101 Memphis 482 ... 1026 The American
102 Kansas 582 ... 1271 Big 12
103 Wyoming 526 ... 1122 Mountain West
104 Louisiana 510 ... 1098 Sun Belt
105 Colorado State 520 ... 1124 Mountain West
106 Connecticut 508 ... 1107 The American
107 SMU 489 ... 1083 The American
108 Oregon State 530 ... 1173 Pac-12
109 UTSA 38 ... 82 C-USA
110 Kansas State 526 ... 1207 Big 12
111 New Mexico 483 ... 1103 Mountain West
112 Temple 468 ... 1094 The American
113 Iowa State 524 ... 1214 Big 12
114 Tulane 520 ... 1197 The American
115 Northwestern 535 ... 1240 Big Ten
116 UAB 126 ... 284 C-USA
117 Rice 470 ... 1108 C-USA
118 Eastern Michigan 453 ... 1089 MAC
119 Louisiana-Monroe 304 ... 727 Sun Belt
120 Florida Atlantic 87 ... 205 C-USA
121 Indiana 479 ... 1195 Big Ten
122 Buffalo 370 ... 922 MAC
123 Wake Forest 450 ... 1136 ACC
124 New Mexico State 430 ... 1090 Independent
125 UTEP 390 ... 1005 C-USA
126 UNLV11 228 ... 574 Mountain West
127 Kent State 341 ... 922 MAC
128 FIU 64 ... 191 C-USA
129 Charlotte 20 ... 65 C-USA
130 Georgia State 27 ... 94 Sun Belt
[131 rows x 8 columns]
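As a follow-up, pd.read_html() also accepts a header argument, so passing header=0 should promote that first row (Team, Won, ..., Conference) to column names instead of leaving it as data:

tables = pd.read_html(WIKI_URL, header=0)  # first row of each table becomes the column names
table = tables[2]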
I have a text file such as:
1224 926 1380 688 845 109 118 88 1275 1306 91 796 102 1361 27 995
1928 2097 138 1824 198 117 1532 2000 1478 539 1982 125 1856 139 475 1338
848 202 1116 791 1114 236 183 186 150 1016 1258 84 952 1202 988 866
946 155 210 980 896 875 925 613 209 746 147 170 577 942 475 850
1500 322 43 95 74 210 1817 1631 1762 128 181 716 171 1740 145 1123
3074 827 117 2509 161 206 2739 253 2884 248 3307 2760 2239 1676 1137 3055
183 85 143 197 243 72 291 279 99 189 30 101 211 209 77 198
175 149 259 372 140 250 168 142 146 284 273 74 162 112 78 29
169 578 97 589 473 317 123 102 445 217 144 398 510 464 247 109
3291 216 185 1214 167 495 1859 194 1030 3456 2021 1622 3511 222 3534 1580
2066 2418 2324 93 1073 82 102 538 1552 962 91 836 1628 2154 2144 1378
149 963 1242 849 726 1158 164 1134 658 161 1148 336 826 1303 811 178
3421 1404 2360 2643 3186 3352 1112 171 168 177 146 1945 319 185 2927 2289
543 462 111 459 107 353 2006 116 2528 56 2436 1539 1770 125 2697 2432
1356 208 5013 4231 193 169 3152 2543 4430 4070 4031 145 4433 4187 4394 1754
5278 113 4427 569 5167 175 192 3903 155 1051 4121 5140 2328 203 5653 3233
How can I read it into a list of lists of Int in Haskell?
I have tried a few options but could not manage to do it. I am very new to Haskell, so please be patient.
First break your input into lines using lines:
let test = "1 2 3 4\n 5 6 7 \n 4 2 5"
let rows = lines test --literally "lines test"! Beautiful, eh?
Result:
["1 2 3 4"," 5 6 7 "," 4 2 5"] :: [[Char]]
Then, extract individual numbers as strings using words:
let nums_as_strings = map words rows
Result:
[["1","2","3","4"],["5","6","7"],["4","2","5"]] :: :: [[[Char]]]
The last thing to do is convert these strings to integers with read:
let numbers = map (map read) nums_as_strings :: [[Int]]
Result:
[[1,2,3,4],[5,6,7],[4,2,5]] :: [[Int]]
Or, squashed into one line:
let numbers = map (map read) (map words $ lines test) :: [[Int]]
Example with your data:
Prelude> let test = "1224 926 1380 688 845 109 118 88 1275 1306 91 796 102 1361 27 995\n1928 2097 138 1824 198 117 1532 2000 1478 539 1982 125 1856 139 475 1338"
Prelude> map (map read) (map words $ lines test) :: [[Int]]
[[1224,926,1380,688,845,109,118,88,1275,1306,91,796,102,1361,27,995],[1928,2097,138,1824,198,117,1532,2000,1478,539,1982,125,1856,139,475,1338]]
You may need to take care of empty lines, but that's really simple.
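For example, a small helper along these lines (readNumbers is just an illustrative name) drops blank lines before the read; filtering after words also discards lines that contain only whitespace:

-- Parse a whole file's text into rows of Ints, skipping blank lines.
readNumbers :: String -> [[Int]]
readNumbers = map (map read) . filter (not . null) . map words . lines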
import System.IO

readListOfLists :: Handle -> IO [[Int]]
readListOfLists handle = do
  contents <- hGetContents handle
  let ls :: [String]
      ls = lines contents
      ws :: [[String]]
      ws = map words ls
      res :: [[Int]]
      res = map (map read) ws
  return res
or you can write the same code in one line:
readListOfLists :: Handle -> IO [[Int]]
readListOfLists = fmap (map (map read . words) . lines) . hGetContents
To use it:
do
  handle <- openFile fileName ReadMode
  table <- readListOfLists handle
  print table   -- force the lazily-read contents before closing the handle
  hClose handle
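If you don't need the handle afterwards, a sketch using readFile (with a hypothetical file name) avoids the manual open/close, and the laziness pitfall, entirely:

do
  contents <- readFile "numbers.txt"  -- hypothetical file name
  print (map (map read . words) (lines contents) :: [[Int]])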
I'm a complete novice when it comes to SQL, just getting started.
I need help writing a query to update values in my SQL table:
Two tables: Members, Chapters
I'm concerned with three columns in the Chapters table: CHAP_NO, FA, FA2, i.e.:
CHAP_NO FA FA2
111 1234567 2345689
222 2234567 4567899
333 3225545
444
555 2358878 4566665
666 4568799
777 4566878 1233666
888 1119998
999 3555879 6544799
etc. . .
Each value is a unique identifier.
I'm concerned with two columns in the Members table: MEMB_NO, CURR_CHAP, i.e.:
MEMB_NO CURR_CHAP
1234567 665
5468787 664
4577789 122
4578767 233
7775666 588
4114748 787
etc. . .
Is it possible to automate an update based on whether FA or FA2 appears in the Chapters table, updating that member's CURR_CHAP value in the Members table from the CHAP_NO value?
From the above example data, I need MEMB_NO '1234567' to have his CURR_CHAP updated to '111' because he is listed as FA for CHAP_NO '111'.
I really need to do this for similar MS SQL and MySQL databases if possible. If this can't be automated, I need help writing a query to manually update the Members table, as exampled above, with a two-column manual update set of data:
MEMB_NO CURR_CHAP
5470011 547
5030038 545
3880188 544
1140753 543
4130019 543
5420011 542
5410010 541
2590511 540
4190109 540
4180296 539
5380020 538
5370012 537
1050859 536
4390125 535
4860144 535
5330009 533
5330061 533
1080746 532
2060321 531
1750750 529
4250135 528
8070013 528
1080645 527
5270053 527
2580695 526
2440073 525
2440163 525
5240010 524
4980035 523
2120380 522
4000418 521
3270185 520
4350210 519
4610218 518
5160004 516
1610450 515
5150065 515
5130046 513
5130050 513
5120047 512
1940306 510
2500170 510
5090087 509
5080014 508
1270803 505
1381026 505
2260505 504
3900106 504
5030006 503
1770526 501
1780355 501
5000017 500
4980037 498
2380411 497
4970019 497
4960044 496
4960127 496
4950012 495
4950095 495
1720409 494
2260867 494
2300466 493
3990055 492
4920204 492
1311252 491
2100252 491
1750592 490
1760563 490
2520403 489
4890051 489
4870076 487
4870143 487
4860153 486
1670856 485
4840054 484
4840143 484
2920024 483
4830136 483
1751087 482
1790828 481
1970128 481
2050815 480
4800027 480
1870246 478
3210174 478
4770100 477
4760124 476
4760126 476
1350640 475
2280722 475
2200077 474
3410230 474
4730100 473
4250159 470
4250156 470
3790179 464
4630164 463
4630139 463
2210062 461
4610188 461
4210110 460
4870065 459
4500246 450
1110937 449
1110934 449
2280501 447
4450323 445
4440114 444
4410135 441
4410216 441
1600799 435
2280449 435
4080089 431
4310132 431
1780525 427
4270190 427
4260502 426
4260550 426
4250467 425
4250485 425
4210328 421
4190230 419
4180005 418
4180341 418
4250232 417
4130004 413
4110444 411
4090133 409
4080308 408
4430119 408
4070279 407
4070443 407
1650354 405
1670725 404
2240204 402
2870319 400
3990114 399
3980014 398
4050073 398
3170399 397
3970348 397
1760487 395
4180191 395
1800443 394
2580288 394
1280499 393
3930227 393
3780058 391
3900377 390
2590362 389
1720492 385
1720398 384
2840325 383
3710142 381
3800235 380
3780407 378
1760459 375
1730026 373
3710306 371
3710228 371
1051294 370
3700332 370
3670174 367
1780583 359
4640038 359
1280614 358
2580373 358
3570449 357
3530560 353
3500046 350
3490275 349
3490244 349
3320203 348
3480310 348
4210188 346
3440364 344
4490223 344
1750642 342
3990257 342
1790541 341
3370562 337
3370738 337
1870336 334
3340382 334
1950674 333
1460619 328
3280586 328
4250013 326
1340705 324
2590495 324
2870029 322
3030290 322
1880232 321
2280415 321
3200547 320
3200568 320
3180132 318
3180178 318
3930433 317
4850072 317
2870449 315
3150168 315
1390763 313
3120170 312
3110048 311
3110110 311
3070267 307
3500231 306
3980122 306
1160708 305
3050510 305
2280197 304
3040348 304
1060785 303
1340760 303
3020534 302
3980151 301
2990239 299
1770425 297
2950573 295
2280513 294
2320434 287
2870594 287
4110133 284
4260131 278
2770221 277
2770366 277
2760484 276
2750397 275
2580694 272
1751006 267
4010252 267
2660235 266
2780335 265
2640326 264
3840125 263
1270872 259
2590690 259
2580728 258
2030556 257
4600151 257
2550390 255
4440010 255
2520461 252
4130095 252
3910117 250
2490314 249
1361032 247
1900370 247
2440211 244
2440101 244
1730150 243
1440258 242
2420062 242
1350511 238
2380559 238
1800598 237
2350417 235
2340372 234
2320453 232
2590582 232
2120104 230
2280696 228
3480122 227
1111011 226
2260626 226
3230234 222
2270200 221
4470101 221
3010326 219
2180334 218
2170591 217
1620648 213
2120524 212
3010424 212
3130060 210
2070261 207
2070313 207
1640858 206
1620684 205
2030573 203
2030810 203
1270589 201
1111015 200
1990448 199
1950384 195
1920328 192
1920684 192
1750798 188
1880607 188
1870445 187
1850587 185
2960295 185
1800721 180
1791166 179
3990116 178
3130119 177
4170034 177
1051172 176
1380942 176
1751011 175
4500021 175
2840346 174
3460307 174
1730027 173
4070275 173
1110986 171
1670586 167
1111222 166
2060385 164
1560459 163
1740135 162
3130093 161
1600695 160
1600682 160
1350600 159
1590341 159
1580464 158
1570742 157
1570761 157
4440077 156
1520404 152
4700010 152
3390033 147
4170240 145
4730144 143
4250191 142
1400502 140
2170212 140
1360713 139
3040299 139
1800519 136
1270930 135
1720638 134
1800462 133
3930387 133
1111000 131
1311274 131
1360547 128
2260776 128
4830091 127
1800431 123
1280523 122
1750851 122
1291052 121
3850165 121
1180219 118
1180477 118
2240110 116
2870263 116
3900143 114
1111488 111
1490386 111
1060765 110
1780463 110
3200394 108
5050015 108
3870219 105
I think this should work on both...
UPDATE IGNORE Chapters, Members
SET Members.CURR_CHAP = Chapters.CHAP_NO
WHERE Chapters.FA = Members.MEMB_NO
  AND Chapters.FA != ''
  AND Chapters.FA2 != ''
UPDATE members m
INNER JOIN chapters c on (m.memb_no = c.fa or m.memb_no=c.fa2)
SET m.curr_chap=c.chap_no
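For the MS SQL side of the question: SQL Server does not accept MySQL's multi-table UPDATE syntax, but T-SQL's UPDATE ... FROM join should do the same job (a sketch against the column names above, untested):

UPDATE m
SET m.CURR_CHAP = c.CHAP_NO
FROM Members AS m
INNER JOIN Chapters AS c
    ON m.MEMB_NO = c.FA OR m.MEMB_NO = c.FA2;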
This query worked for me in MySQL:
update Members m
INNER JOIN chapters c
ON (m.memb_no=c.FA or m.memb_no=c.FA2)
set CURR_CHAP = c.CHAP_NO