Creating a pandas DataFrame from a JSON object

Objective: I have fetched insights data from my Instagram account using the Instagram Graph API. Below is the JSON object (audience_insights['data']):
[{'name': 'audience_city',
'period': 'lifetime',
'values': [{'value': {'London, England': 1,
'Kharkiv, Kharkiv Oblast': 1,
'Jamui, Bihar': 1,
'Burdwan, West Bengal': 1,
'Kolkata, West Bengal': 112,
'Dhulian, West Bengal': 1,
'Argonne, Wisconsin': 1,
'College Park, Georgia': 1,
'Pakaur, Jharkhand': 1,
'Bristol, England': 1,
'Delhi, Delhi': 1,
'Gaya, Bihar': 1,
'Howrah, West Bengal': 1,
'Kanpur, Uttar Pradesh': 1,
'Jaipur, Rajasthan': 2,
'Panipat, Haryana': 1,
'Saint Etienne, Rhône-Alpes': 1,
'Panagarh, West Bengal': 1,
'Bhagalpur, Bihar': 1,
'Frankfurt, Hessen': 1,
'Riyadh, Riyadh Region': 1,
'Roorkee, Uttarakhand': 1,
'Harinavi, West Bengal': 1,
'Secunderabad, Telangana': 1,
'Mumbai, Maharashtra': 3,
'Patna, Bihar': 11,
'Obando, Valle del Cauca': 1,
'Jaunpur, Uttar Pradesh': 1,
'Sitamau, Madhya Pradesh': 1},
'end_time': '2022-03-24T07:00:00+0000'}],
'title': 'Audience City',
'description': "The cities of this profile's followers",
'id': '17841406112341342/insights/audience_city/lifetime'},
{'name': 'audience_country',
'period': 'lifetime',
'values': [{'value': {'DE': 1,
'IN': 144,
'GB': 2,
'UA': 1,
'FR': 1,
'CO': 1,
'SA': 1,
'US': 2},
'end_time': '2022-03-24T07:00:00+0000'}],
'title': 'Audience Country',
'description': "The countries of this profile's followers",
'id': '17841406112341342/insights/audience_country/lifetime'},
{'name': 'audience_gender_age',
'period': 'lifetime',
'values': [{'value': {'F.13-17': 1,
'F.18-24': 20,
'F.25-34': 5,
'M.13-17': 4,
'M.18-24': 79,
'M.25-34': 15,
'M.35-44': 1,
'M.45-54': 3,
'U.13-17': 4,
'U.18-24': 16,
'U.25-34': 2,
'U.45-54': 3},
'end_time': '2022-03-24T07:00:00+0000'}],
'title': 'Gender and Age',
'description': "The gender and age distribution of this profile's followers",
'id': '17841406112341342/insights/audience_gender_age/lifetime'}]
I wish to loop through this and create three data frames:
The first shows the location and its count:
| | Location | Count |
| ---- | -------------- | ----- |
| 0 | London, England | 1 |
The second would be a similar data frame with country and count.
Finally, the last would show the gender/age category against the count.
So far, I've been able to extract the three dictionaries that I eventually need to convert to separate data frames. I've stored the dictionaries in a list all_data.
all_data = []
for item in audience_insight['data']:
    data = item['values'][0]['value']
    all_data.append(data)
df_location = pd.DataFrame(all_data)
all_data
[{'London, England': 1,
'Kharkiv, Kharkiv Oblast': 1,
'Jamui, Bihar': 1,
'Burdwan, West Bengal': 1,
'Kolkata, West Bengal': 112,
'Dhulian, West Bengal': 1,
'Argonne, Wisconsin': 1,
'College Park, Georgia': 1,
'Bristol, England': 1,
'Bikaner, Rajasthan': 1,
'Delhi, Delhi': 1,
'Gaya, Bihar': 1,
'Howrah, West Bengal': 1,
'Jaipur, Rajasthan': 1,
'Kanpur, Uttar Pradesh': 1,
'Panipat, Haryana': 1,
'Saint Etienne, Rhône-Alpes': 1,
'Panagarh, West Bengal': 1,
'Panchagan, Odisha': 1,
'Bhagalpur, Bihar': 1,
'Frankfurt, Hessen': 1,
'Riyadh, Riyadh Region': 1,
'Roorkee, Uttarakhand': 1,
'Harinavi, West Bengal': 1,
'Mumbai, Maharashtra': 3,
'Patna, Bihar': 11,
'Obando, Valle del Cauca': 1,
'Jaunpur, Uttar Pradesh': 1,
'Hyderabad, Telangana': 1,
'Sitamau, Madhya Pradesh': 1},
{'DE': 1, 'IN': 144, 'GB': 2, 'FR': 1, 'CO': 1, 'UA': 1, 'SA': 1, 'US': 2},
{'F.13-17': 1,
'F.18-24': 20,
'F.25-34': 5,
'M.13-17': 4,
'M.18-24': 79,
'M.25-34': 16,
'M.35-44': 1,
'M.45-54': 3,
'U.13-17': 4,
'U.18-24': 15,
'U.25-34': 2,
'U.45-54': 3}]
I want to be able to convert each of these dictionaries into a data frame such that the keys are in the first column and the values are in the second.
Thank you for your help!
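One way to get there, sketched here with truncated copies of the dictionaries (the full ones collected in all_data work identically): pass each dict's items to the DataFrame constructor, so the keys land in the first column and the values in the second.

```python
import pandas as pd

# Truncated copies of the three dictionaries collected in all_data
all_data = [
    {'London, England': 1, 'Kolkata, West Bengal': 112},  # audience_city
    {'DE': 1, 'IN': 144, 'US': 2},                        # audience_country
    {'F.18-24': 20, 'M.18-24': 79},                       # audience_gender_age
]
first_columns = ['Location', 'Country', 'Category']

frames = []
for data, first in zip(all_data, first_columns):
    # Each (key, value) pair becomes one row of a two-column frame
    frames.append(pd.DataFrame(list(data.items()), columns=[first, 'Count']))

df_location, df_country, df_gender_age = frames
print(df_location)
```

df_location then matches the table in the question: one row per city, with the count beside it.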

Related

MySQL - How to find the rows with the greatest value changes?

I am trying to search a table where I have daily ranked keywords (SEO keywords). I have an index on a key_id per keyword, and a new position value for each keyword.
How can I select the keywords that have the greatest change in value?
MariaDB Table and data:
CREATE TABLE IF NOT EXISTS `daily_rank` (
`rankID` int(24) NOT NULL AUTO_INCREMENT,
`created` timestamp NULL DEFAULT current_timestamp(),
`key_id` int(100) NOT NULL DEFAULT 0,
`position` int(12) NOT NULL DEFAULT 0,
`keyword` varchar(50) NOT NULL DEFAULT '0',
PRIMARY KEY (`rankID`),
KEY `created` (`created`),
KEY `key_id` (`key_id`)
) ENGINE=InnoDB AUTO_INCREMENT=3594 DEFAULT CHARSET=latin1;
INSERT INTO `daily_rank` (`rankID`, `created`, `key_id`, `position`, `keyword`) VALUES
(3594, '2019-10-09 17:59:07', 53, 4, 'SEO'),
(3595, '2019-10-09 17:59:07', 100, 3, 'agency'),
(3596, '2019-10-09 17:59:07', 397, 1, 'bureau marketing'),
(3597, '2019-10-09 17:59:07', 798, 7, 'marketing agency'),
(3598, '2019-10-09 17:59:07', 98, 8, 'search engine optimization'),
(3599, '2019-10-09 17:59:07', 346, 8, 'website optimization'),
(3600, '2019-10-09 17:59:07', 555, 9, 'agency'),
(3608, '2019-10-08 18:07:00', 53, 4, 'SEO'),
(3609, '2019-10-08 18:07:00', 100, 4, 'agency'),
(3610, '2019-10-08 18:07:00', 397, 3, 'bureau marketing'),
(3611, '2019-10-08 18:07:00', 798, 1, 'marketing agency'),
(3612, '2019-10-08 18:07:00', 98, 2, 'search engine optimization'),
(3613, '2019-10-08 18:07:00', 346, 2, 'website optimization'),
(3614, '2019-10-08 18:07:00', 555, 2, 'agency'),
(3615, '2019-10-07 18:07:22', 53, 4, 'SEO'),
(3616, '2019-10-07 18:07:22', 100, 6, 'agency'),
(3617, '2019-10-07 18:07:22', 397, 6, 'bureau marketing'),
(3618, '2019-10-07 18:07:22', 798, 6, 'marketing agency'),
(3619, '2019-10-07 18:07:22', 98, 4, 'search engine optimization'),
(3620, '2019-10-07 18:07:22', 346, 6, 'website optimization'),
(3621, '2019-10-07 18:07:22', 555, 6, 'agency'),
(3622, '2019-10-07 18:07:22', 53, 5, 'SEO'),
(3623, '2019-10-07 18:07:22', 100, 4, 'agency'),
(3624, '2019-10-07 18:07:22', 397, 5, 'bureau marketing'),
(3625, '2019-10-07 18:07:22', 798, 3, 'marketing agency'),
(3626, '2019-10-07 18:07:22', 98, 6, 'search engine optimization'),
(3627, '2019-10-07 18:07:22', 346, 3, 'website optimization'),
(3628, '2019-10-07 18:07:22', 555, 5, 'agency'),
(3629, '2019-10-06 18:07:44', 53, 1, 'SEO'),
(3630, '2019-10-06 18:07:44', 100, 2, 'agency'),
(3631, '2019-10-06 18:07:44', 397, 2, 'bureau marketing'),
(3632, '2019-10-06 18:07:44', 798, 1, 'marketing agency'),
(3633, '2019-10-06 18:07:44', 98, 1, 'search engine optimization'),
(3634, '2019-10-06 18:07:44', 346, 2, 'website optimization'),
(3635, '2019-10-06 18:07:44', 555, 2, 'agency'),
(3636, '2019-10-06 18:07:44', 53, 2, 'SEO'),
(3637, '2019-10-06 18:07:44', 100, 2, 'agency'),
(3638, '2019-10-06 18:07:44', 397, 3, 'bureau marketing'),
(3639, '2019-10-06 18:07:44', 798, 2, 'marketing agency'),
(3640, '2019-10-06 18:07:44', 98, 2, 'search engine optimization'),
(3641, '2019-10-06 18:07:44', 346, 1, 'website optimization'),
(3642, '2019-10-06 18:07:44', 555, 1, 'agency'),
(3643, '2019-10-06 18:07:44', 53, 1, 'SEO'),
(3644, '2019-10-06 18:07:44', 100, 2, 'agency'),
(3645, '2019-10-06 18:07:44', 397, 1, 'bureau marketing'),
(3646, '2019-10-06 18:07:44', 798, 3, 'marketing agency'),
(3647, '2019-10-06 18:07:44', 98, 2, 'search engine optimization'),
(3648, '2019-10-06 18:07:44', 346, 1, 'website optimization'),
(3649, '2019-10-06 18:07:44', 555, 3, 'agency'),
(3650, '2019-10-06 18:07:44', 53, 3, 'SEO'),
(3651, '2019-10-06 18:07:44', 100, 1, 'agency'),
(3652, '2019-10-06 18:07:44', 397, 2, 'bureau marketing'),
(3653, '2019-10-06 18:07:44', 798, 3, 'marketing agency'),
(3654, '2019-10-06 18:07:44', 98, 1, 'search engine optimization'),
(3655, '2019-10-06 18:07:44', 346, 2, 'website optimization'),
(3656, '2019-10-06 18:07:44', 555, 1, 'agency');
How do I write a query that returns the latest position for each keyword and the change since a given date, ordered to show the keywords with the greatest change first?
I imagine a table like this:
[Keyword] - [Todays Position] - [Position Change from yesterday]
where it is ordered by the biggest change descending
UPDATE:
When calculating max − min, today's position is included in the calculation, which skews the result somewhat.
When viewing today's position, I would like to see the keywords that have had the biggest change in position since the compared date.
I think this is what you want.
SELECT a.keyword, a.position AS today_position, b.biggest_position_change_since_yesterday
FROM daily_rank a
JOIN (
    SELECT keyword, MAX(position) - MIN(position) AS biggest_position_change_since_yesterday
    FROM daily_rank
    WHERE CAST(created AS date) >= ADDDATE(CURDATE(), -1)
    GROUP BY keyword
) b ON b.keyword = a.keyword
   AND CAST(a.created AS date) = CURDATE()
ORDER BY biggest_position_change_since_yesterday DESC;
keyword today_position biggest_position_change_since_yesterday
agency 9 7
agency 3 7
website optimization 8 6
marketing agency 7 6
search engine optimization 8 6
bureau marketing 1 2
SEO 4 0
Test Case:
DB<>FIDDLE
SELECT keyword, MAX(position) max_position,MIN(position) min_position FROM daily_rank GROUP BY keyword;
+----------------------------+--------------+--------------+
| keyword | max_position | min_position |
+----------------------------+--------------+--------------+
| agency | 9 | 1 |
| bureau marketing | 6 | 1 |
| marketing agency | 7 | 1 |
| search engine optimization | 8 | 1 |
| SEO | 5 | 1 |
| website optimization | 8 | 1 |
+----------------------------+--------------+--------------+

Error in my JSON file (Error: Parse error on line 176: Expecting 'STRING', got '}')

Well, I have an issue somewhere in my code, and fixing this sort of thing is not my cup of tea. I've gone through it with a fine-tooth comb with practically no luck, and I hope someone can tell me where my code goes wrong. Once again, the error is: Error: Parse error on line 176:
...1, "furnace": 1, }, "ItemListCrates"
---------------------^
Expecting 'STRING', got '}
{
"ItemListBarrels": {
"rifle.ak": 4,
"ammo.handmade.shell": 1,
"ammo.pistol": 2,
"ammo.pistol.fire": 2,
"ammo.pistol.hv": 2,
"ammo.rifle": 3,
"ammo.rifle.explosive": 2,
"ammo.rifle.incendiary": 2,
"ammo.rifle.hv": 2,
"ammo.rocket.basic": 4,
"ammo.rocket.fire": 4,
"ammo.rocket.hv": 4,
"ammo.shotgun": 4,
"ammo.shotgun.slug": 2,
"antiradpills": 1,
"apple": 1,
"arrow.hv": 1,
"axe.salvaged": 2,
"barricade.concrete": 1,
"barricade.metal": 1,
"barricade.sandbags": 1,
"barricade.stone": 1,
"barricade.wood": 1,
"barricade.woodwire": 1,
"tool.binoculars": 1,
"black.raspberries": 4,
"bleach": 1,
"blueberries": 4,
"bone.club": 1,
"bucket.water": 1,
"tool.camera": 1,
"can.beans": 1,
"can.tuna": 1,
"candycane": 1,
"ceilinglight": 1,
"chair": 1,
"chocholate": 1,
"door.double.hinged.metal": 1,
"door.double.hinged.toptier": 1,
"door.double.hinged.wood": 1,
"door.hinged.toptier": 1,
"door.closer": 1,
"dropbox": 1,
"explosive.satchel": 4,
"explosive.timed": 4,
"explosives": 4,
"floor.grill": 1,
"floor.ladder.hatch": 1,
"fridge": 1,
"furnace.large": 1,
"gates.external.high.stone": 1,
"gates.external.high.wood": 1,
"gears": 2,
"burlap.gloves": 1,
"glue": 1,
"granolabar": 1,
"grenade.beancan": 4,
"fun.guitar": 1,
"hammer.salvaged": 2,
"hat.beenie": 1,
"hat.boonie": 1,
"bucket.helmet": 1,
"hat.candle": 1,
"hat.cap": 1,
"coffeecan.helmet": 2,
"hat.miner": 1,
"hatchet": 2,
"hazmatsuit": 3,
"hoodie": 2,
"icepick.salvaged": 2,
"jacket.snow": 1,
"jacket": 1,
"ladder.wooden.wall": 1,
"lantern": 1,
"largemedkit": 2,
"locker": 1,
"longsword": 1,
"mace": 1,
"machete": 1,
"mailbox": 1,
"map": 1,
"mask.balaclava": 1,
"mask.bandana": 1,
"metal.facemask": 1,
"metal.plate.torso": 1,
"metalblade": 2,
"metalpipe": 2,
"mining.quarry": 1,
"burlap.trousers": 1,
"pants": 1,
"roadsign.kilt": 3,
"pants.shorts": 1,
"pickaxe": 2,
"pistol.eoka": 1,
"pistol.revolver": 2,
"planter.large": 1,
"planter.small": 1,
"propanetank": 1,
"target.reactive": 1,
"riflebody": 3,
"roadsign.jacket": 2,
"roadsigns": 2,
"rope": 1,
"rug.bear": 1,
"rug": 1,
"salvaged.cleaver": 1,
"salvaged.sword": 1,
"weapon.mod.small.scope": 1,
"scrap": 1000,
"searchlight": 1,
"semibody": 2,
"sewingkit": 1,
"sheetmetal": 1,
"shelves": 1,
"shirt.collared": 1,
"shirt.tanktop": 1,
"shoes.boots": 1,
"shotgun.waterpipe": 2,
"guntrap": 1,
"shutter.metal.embrasure.a": 1,
"shutter.metal.embrasure.b": 1,
"shutter.wood.a": 1,
"sign.hanging.banner.large": 1,
"sign.hanging": 1,
"sign.hanging.ornate": 1,
"sign.pictureframe.landscape": 1,
"sign.pictureframe.portrait": 1,
"sign.pictureframe.tall": 1,
"sign.pictureframe.xl": 1,
"sign.pictureframe.xxl": 1,
"sign.pole.banner.large": 1,
"sign.post.double": 1,
"sign.post.single": 1,
"sign.post.town": 1,
"sign.post.town.roof": 1,
"sign.wooden.huge": 1,
"sign.wooden.large": 1,
"sign.wooden.medium": 1,
"sign.wooden.small": 1,
"weapon.mod.silencer": 1,
"weapon.mod.simplesight": 1,
"small.oil.refinery": 1,
"smallwaterbottle": 1,
"smgbody": 2,
"spear.stone": 1,
"spikes.floor": 1,
"spinner.wheel": 1,
"metalspring": 1,
"sticks": 2,
"surveycharge": 2,
"syringe.medical": 3,
"table": 1,
"techparts": 3,
"smg.thompson": 2,
"tshirt": 1,
"tshirt.long": 1,
"tunalight": 1,
"wall.external.high.stone": 1,
"wall.external.high": 1,
"wall.frame.cell.gate": 1,
"wall.frame.cell": 1,
"wall.frame.fence.gate": 1,
"wall.frame.fence": 1,
"wall.frame.netting": 1,
"wall.frame.shopfront": 1,
"wall.window.bars.metal": 1,
"wall.window.bars.toptier": 1,
"wall.window.bars.wood": 1,
"water.catcher.large": 1,
"water.catcher.small": 1,
"water.barrel": 1,
"waterjug": 1,
"water.purifier": 1,
"furnace": 1,
},
"ItemListCrates": {
"rifle.ak": 4,
"ammo.handmade.shell": 1,
"ammo.pistol": 2,
"ammo.pistol.fire": 2,
"ammo.pistol.hv": 2,
"ammo.rifle": 3,
"ammo.rifle.explosive": 2,
"ammo.rifle.incendiary": 2,
"ammo.rifle.hv": 2,
"ammo.rocket.basic": 4,
"ammo.rocket.fire": 4,
"ammo.rocket.hv": 4,
"ammo.shotgun": 4,
"ammo.shotgun.slug": 2,
"antiradpills": 1,
"apple": 1,
"arrow.hv": 1,
"axe.salvaged": 2,
"barricade.concrete": 1,
"barricade.metal": 1,
"barricade.sandbags": 1,
"barricade.stone": 1,
"barricade.wood": 1,
"barricade.woodwire": 1,
"tool.binoculars": 1,
"black.raspberries": 4,
"bleach": 1,
"blueberries": 4,
"bone.club": 1,
"bucket.water": 1,
"tool.camera": 1,
"can.beans": 1,
"can.tuna": 1,
"candycane": 1,
"ceilinglight": 1,
"chair": 1,
"chocholate": 1,
"door.double.hinged.metal": 1,
"door.double.hinged.toptier": 1,
"door.double.hinged.wood": 1,
"door.hinged.toptier": 1,
"door.closer": 1,
"dropbox": 1,
"explosive.satchel": 4,
"explosive.timed": 4,
"explosives": 4,
"floor.grill": 1,
"floor.ladder.hatch": 1,
"fridge": 1,
"furnace.large": 1,
"gates.external.high.stone": 1,
"gates.external.high.wood": 1,
"gears": 2,
"burlap.gloves": 1,
"glue": 1,
"granolabar": 1,
"grenade.beancan": 4,
"fun.guitar": 1,
"hammer.salvaged": 2,
"hat.beenie": 1,
"hat.boonie": 1,
"bucket.helmet": 1,
"hat.candle": 1,
"hat.cap": 1,
"coffeecan.helmet": 2,
"hat.miner": 1,
"hatchet": 2,
"hazmatsuit": 3,
"hoodie": 2,
"icepick.salvaged": 2,
"jacket.snow": 1,
"jacket": 1,
"ladder.wooden.wall": 1,
"lantern": 1,
"largemedkit": 2,
"locker": 1,
"longsword": 1,
"mace": 1,
"machete": 1,
"mailbox": 1,
"map": 1,
"mask.balaclava": 1,
"mask.bandana": 1,
"metal.facemask": 1,
"metal.plate.torso": 1,
"metalblade": 2,
"metalpipe": 2,
"mining.quarry": 1,
"burlap.trousers": 1,
"pants": 1,
"roadsign.kilt": 3,
"pants.shorts": 1,
"pickaxe": 2,
"pistol.eoka": 1,
"pistol.revolver": 2,
"planter.large": 1,
"planter.small": 1,
"propanetank": 1,
"target.reactive": 1,
"riflebody": 3,
"roadsign.jacket": 2,
"roadsigns": 2,
"rope": 1,
"rug.bear": 1,
"rug": 1,
"salvaged.cleaver": 1,
"salvaged.sword": 1,
"weapon.mod.small.scope": 1,
"scrap": 700,
"searchlight": 1,
"semibody": 2,
"sewingkit": 1,
"sheetmetal": 1,
"shelves": 1,
"shirt.collared": 1,
"shirt.tanktop": 1,
"shoes.boots": 1,
"shotgun.waterpipe": 2,
"guntrap": 1,
"shutter.metal.embrasure.a": 1,
"shutter.metal.embrasure.b": 1,
"shutter.wood.a": 1,
"sign.hanging.banner.large": 1,
"sign.hanging": 1,
"sign.hanging.ornate": 1,
"sign.pictureframe.landscape": 1,
"sign.pictureframe.portrait": 1,
"sign.pictureframe.tall": 1,
"sign.pictureframe.xl": 1,
"sign.pictureframe.xxl": 1,
"sign.pole.banner.large": 1,
"sign.post.double": 1,
"sign.post.single": 1,
"sign.post.town": 1,
"sign.post.town.roof": 1,
"sign.wooden.huge": 1,
"sign.wooden.large": 1,
"sign.wooden.medium": 1,
"sign.wooden.small": 1,
"weapon.mod.silencer": 1,
"weapon.mod.simplesight": 1,
"small.oil.refinery": 1,
"smallwaterbottle": 1,
"smgbody": 2,
"spear.stone": 1,
"spikes.floor": 1,
"spinner.wheel": 1,
"metalspring": 1,
"sticks": 2,
"surveycharge": 2,
"syringe.medical": 3,
"table": 1,
"techparts": 3,
"smg.thompson": 2,
"tshirt": 1,
"tshirt.long": 1,
"tunalight": 1,
"wall.external.high.stone": 1,
"wall.external.high": 1,
"wall.frame.cell.gate": 1,
"wall.frame.cell": 1,
"wall.frame.fence.gate": 1,
"wall.frame.fence": 1,
"wall.frame.netting": 1,
"wall.frame.shopfront": 1,
"wall.window.bars.metal": 1,
"wall.window.bars.toptier": 1,
"wall.window.bars.wood": 1,
"water.catcher.large": 1,
"water.catcher.small": 1,
"water.barrel": 1,
"waterjug": 1,
"water.purifier": 1,
"rocket.launcher": 4,
"flamethrower": 2,
"flameturret": 2,
}
}
A comma is not allowed after the last item of a JSON object:
"water.purifier": 1,
"furnace": 1, //<--Remove this comma
},
"flamethrower": 2,
"flameturret": 2,//<--Remove this comma
}
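Python's json module enforces the same rule, which makes it a quick way to locate this class of error before pasting a file into a validator:

```python
import json

valid = '{"water.purifier": 1, "furnace": 1}'
invalid = '{"water.purifier": 1, "furnace": 1,}'  # trailing comma

parsed = json.loads(valid)          # parses fine without the trailing comma
print(parsed)

error = None
try:
    json.loads(invalid)
except json.JSONDecodeError as exc:
    error = str(exc)                # the '}' arrived where a key was expected
print(error)
```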

SQL query INSERT column count error

I have removed 'age_range' from this query.
INSERT INTO `filters` (`id`, `user_id`, `profession_preference`, `country`, `city`, `order_by`, `profession_orientation`, `age_range`, `distance_range`, `location_dating`) VALUES
(9, 20, 3, 'All Countries', '', 2, 1, '16,100', '0,500', 0),
(10, 12, 3, 'Egypt', '', 2, 1, '', '', 0),
(11, 19, 3, 'All Countries', '', 2, 1, '16,100', '0,500', 0),
(13, 20, 3, 'All Countries', '', 2, 1, '16,100', '0,500', 0),
(14, 20, 3, 'All Countries', '', 2, 1, '16,100', '0,500', 0),
(15, 20, 3, 'All Countries', '', 2, 1, '16,100', '0,500', 0),
(25, 121, 3, 'All Countries', '', 3, 1, '18,23', '0,500 ', 0),
(26, 316, 3, 'United States', '', 3, 1, '17,25', '0,500', 0);
I executed it again and received this error:
MySQL said:
#1136 - Column count doesn't match value count at row 1
When you insert a record, it matches the values in the VALUES list to the columns in the columns list by comma-separated position. So, this insert statement:
INSERT INTO `foo` (`A`, `B`, `C`)
VALUES ('valueA', 'valueB', 'valueC')
It will insert valueA into column A, valueB into column B, and so on, because they match positions in their respective lists. If you remove B from the columns list and leave VALUES alone, it will still insert valueA into column A, but valueB now lines up with column C because the positions have shifted, and it won't know what to do with valueC because there are now more values than columns. So, since you removed the column from the second position, you would also need to remove the value from the second position.
So back to your query, you would need to determine which position age_range occupied in the columns list and remove the values from the same position in the values list.
Does that make sense?
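The mismatch is easy to reproduce. Purely for illustration, a sketch using SQLite from Python (MySQL reports the same mistake as error #1136):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE foo (A TEXT, B TEXT, C TEXT)")

# Column B removed from the column list, but all three values kept:
error = None
try:
    cur.execute("INSERT INTO foo (A, C) VALUES ('valueA', 'valueB', 'valueC')")
except sqlite3.OperationalError as exc:
    error = str(exc)  # the value count no longer matches the column count
print(error)
```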
You have 9 columns defined in your insert statement and you are trying to insert 10 values. You either need to add another column definition or remove one of your values.
As a rule, the number of columns defined and the number of values provided must be the same. In your case, one value is extra.
As the documentation says
"Column count doesn't match value count"
You specify 9 columns (id, user_id, profession_preference, country, city, order_by, profession_orientation, distance_range, location_dating) in your insert statement
and you are trying to insert 10 values.
You have to remove one value or add another column.
Before removing the column, this is the script that will work:
CREATE TABLE filters (id INT, user_id INT, profession_preference INT, country VARCHAR(50), city VARCHAR(50), order_by INT, profession_orientation INT, age_range VARCHAR(50), distance_range VARCHAR(50), location_dating INT);
INSERT INTO filters (id, user_id, profession_preference, country, city, order_by, profession_orientation, age_range, distance_range, location_dating) VALUES
(9, 20, 3, 'All Countries', '', 2, 1, '16,100', '0,500', 0),
(10, 12, 3, 'Egypt', '', 2, 1, '', '', 0),
(11, 19, 3, 'All Countries', '', 2, 1, '16,100', '0,500', 0),
(13, 20, 3, 'All Countries', '', 2, 1, '16,100', '0,500', 0),
(14, 20, 3, 'All Countries', '', 2, 1, '16,100', '0,500', 0),
(15, 20, 3, 'All Countries', '', 2, 1, '16,100', '0,500', 0),
(25, 121, 3, 'All Countries', '', 3, 1, '18,23', '0,500 ', 0),
(26, 316, 3, 'United States', '', 3, 1, '17,25', '0,500', 0);
Now, since you removed the age_range column, the script below will work:
INSERT INTO filters (id, user_id, profession_preference, country, city, order_by, profession_orientation, distance_range, location_dating) VALUES
(9, 20, 3, 'All Countries', '', 2, 1, '0,500', 0),
(10, 12, 3, 'Egypt', '', 2, 1, '', 0),
(11, 19, 3, 'All Countries', '', 2, 1, '0,500', 0),
(13, 20, 3, 'All Countries', '', 2, 1, '0,500', 0),
(14, 20, 3, 'All Countries', '', 2, 1, '0,500', 0),
(15, 20, 3, 'All Countries', '', 2, 1, '0,500', 0),
(25, 121, 3, 'All Countries', '', 3, 1, '0,500', 0),
(26, 316, 3, 'United States', '', 3, 1, '0,500', 0);
I removed the third-from-last column from the insert script.
Hope this helps!

Update query from multiple tables, for specific IDs and JOIN LIMIT 1

I have two tables, and I want to update the rows of torrents from scrapes every day.
scrapes:
id, torrent_id, name, status, complete, incomplete, downloaded
1, 1, http://tracker1.com, 1, 542, 23, 542
2, 1, http://tracker2.com, 1, 542, 23, 542
3, 2, http://tracker1.com, 1, 123, 34, 43
4, 2, http://tracker2.com, 1, 123, 34, 43
5, 3, http://tracker1.com, 1, 542, 23, 542
6, 3, http://tracker2.com, 1, 542, 23, 542
7, 4, http://tracker1.com, 1, 123, 34, 43
8, 4, http://tracker2.com, 1, 123, 34, 43
9, 5, http://tracker1.com, 1, 542, 23, 542
10, 5, http://tracker2.com, 1, 542, 23, 542
11, 6, http://tracker1.com, 1, 123, 34, 43
12, 6, http://tracker2.com, 1, 123, 34, 43
torrents:
id, name, complete, incomplete, downloaded
1, CentOS, 0, 0, 0
2, Ubuntu, 0, 0, 0
3, Debian, 0, 0, 0
4, Redhat, 0, 0, 0
5, Fedora, 0, 0, 0
6, Gentoo, 0, 0, 0
The scrapes may have multiple rows per torrent, but I want to take the values only from the first one found (for better performance), and I need to update only torrent ids 1, 3, and 6 in one query.
UPDATE (SELECT * FROM scrapes WHERE torrent_id IN (1,3,6) GROUP BY torrent_id) AS `myview`
JOIN torrents ON myview.torrent_id = torrents.id
SET torrent.complete = myview.complete
WHERE 1
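A note on the attempt: a derived table can't be the target of an UPDATE, and torrent.complete would need to be torrents.complete. In MySQL the usual shape is UPDATE torrents JOIN (...) ON ... SET .... As an illustration of the idea only, here is a runnable sketch in SQLite, where a correlated subquery takes the first scrape row (lowest id) per torrent:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE torrents (id INTEGER, name TEXT, complete INTEGER)")
cur.execute("CREATE TABLE scrapes (id INTEGER, torrent_id INTEGER, name TEXT, complete INTEGER)")
cur.executemany("INSERT INTO torrents VALUES (?,?,?)",
                [(1, 'CentOS', 0), (3, 'Debian', 0), (6, 'Gentoo', 0)])
cur.executemany("INSERT INTO scrapes VALUES (?,?,?,?)",
                [(1, 1, 'http://tracker1.com', 542), (2, 1, 'http://tracker2.com', 542),
                 (5, 3, 'http://tracker1.com', 542), (11, 6, 'http://tracker1.com', 123)])

# "First found" per torrent = the scrape row with the lowest id
cur.execute("""
    UPDATE torrents
    SET complete = (SELECT s.complete FROM scrapes s
                    WHERE s.torrent_id = torrents.id
                    ORDER BY s.id LIMIT 1)
    WHERE id IN (1, 3, 6)
""")
rows = cur.execute("SELECT id, complete FROM torrents ORDER BY id").fetchall()
print(rows)  # [(1, 542), (3, 542), (6, 123)]
```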

HighCharts - Filling a heatmap from SQL Query

I'm trying to fill a Highcharts heatmap with data returned from an SQL query.
What I have in the JS file is:
$(function () {
    var chart;
    $(document).ready(function() {
        $.getJSON("php/all-counties-sales-data-box.php", function(json) {
            chart = new Highcharts.Chart({
                chart: {
                    renderTo: 'chart-box-combined',
                    type: 'heatmap',
                    marginTop: 40,
                    marginBottom: 80,
                    plotBorderWidth: 1
                },
                title: {
                    text: 'Sales per employee per weekday'
                },
                xAxis: {
                    categories: ['January', 'February', 'March', 'April', 'May', 'June', 'July', 'August', 'September', 'October', 'November', 'December']
                },
                yAxis: {
                    categories: ['Lucozade', 'Rockstar', 'Sprite', 'Monster', '7Up', 'Fanta', 'Coke'],
                    title: null
                },
                colorAxis: {
                    min: 0,
                    minColor: '#FFFFFF',
                    maxColor: Highcharts.getOptions().colors[0]
                },
                legend: {
                    align: 'right',
                    layout: 'vertical',
                    margin: 0,
                    verticalAlign: 'top',
                    y: 25,
                    symbolHeight: 280
                },
                tooltip: {
                    formatter: function () {
                        return '<b>' + this.series.xAxis.categories[this.point.x] + '</b> sold <br><b>' +
                            this.point.value + '</b> items on <br><b>' + this.series.yAxis.categories[this.point.y] + '</b>';
                    }
                },
                series: [{
                    name: 'Sales per Shell',
                    borderWidth: 1,
                    data: [
                        [0, 0, 10], [0, 1, 19], [0, 2, 8], [0, 3, 24], [0, 4, 67], [0, 5, 67], [0, 6, 67],
                        [1, 0, 92], [1, 1, 58], [1, 2, 78], [1, 3, 117], [1, 4, 48], [1, 5, 48], [1, 6, 48],
                        [2, 0, 35], [2, 1, 15], [2, 2, 123], [2, 3, 64], [2, 4, 52],
                        [3, 0, 72], [3, 1, 132], [3, 2, 114], [3, 3, 19], [3, 4, 16],
                        [4, 0, 38], [4, 1, 5], [4, 2, 8], [4, 3, 117], [4, 4, 115],
                        [5, 0, 88], [5, 1, 32], [5, 2, 12], [5, 3, 6], [5, 4, 120],
                        [6, 0, 13], [6, 1, 44], [6, 2, 88], [6, 3, 98], [6, 4, 96],
                        [7, 0, 31], [7, 1, 1], [7, 2, 82], [7, 3, 32], [7, 4, 30],
                        [8, 0, 85], [8, 1, 97], [8, 2, 123], [8, 3, 64], [8, 4, 84],
                        [9, 0, 47], [9, 1, 114], [9, 2, 31], [9, 3, 48], [9, 4, 91],
                        [10, 0, 47],
                        [11, 0, 47]
                    ],
                    dataLabels: {
                        enabled: true,
                        color: '#000000'
                    }
                }]
            });
        });
    });
});
And what I'm trying to fill it with is data from this query:
$sth = mysql_query("Select SUM(Profit) as profitSum From FactSales GROUP BY ShellType, SaleMonth");
$rows1 = array();
$rows1['profit'] = 'profitSum';
while($rr = mysql_fetch_assoc($sth)) {
    $rows1['series'][] = $rr['profitSum'];
}
$result = array();
array_push($result,$rows1);
What do I actually need to change for the "series" data to be filled with the data returned from the SQL query?
Here's the JSON response, as requested:
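A heatmap series needs [x, y, value] points, not bare sums, so the loop has to attach the month and shell-type coordinates to each profit value. PHP-side that means selecting ShellType and SaleMonth alongside SUM(Profit) and pushing one array per row; the reshaping itself, sketched in Python with hypothetical rows:

```python
# Hypothetical grouped rows, as if SELECT ShellType, SaleMonth, SUM(Profit) ... GROUP BY had run
rows = [
    ('Coke', 0, 1329230),
    ('Coke', 1, 1796743),
    ('Fanta', 0, 642041),
]

# A Highcharts heatmap point is [x, y, value]; map the shell name to its yAxis index
y_categories = ['Lucozade', 'Rockstar', 'Sprite', 'Monster', '7Up', 'Fanta', 'Coke']
points = [[month, y_categories.index(shell), profit] for shell, month, profit in rows]
print(points)  # [[0, 6, 1329230], [1, 6, 1796743], [0, 5, 642041]]
```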
[{"profit":"profitSum","data":[1329230,1796743,1789417,1327813,1457103,1198845,1859826,1770589,1555410,1310369,2183499,1212897,6424306,6426002,6345153,6167415,6969392,5974880,6407699,6278843,6622002,5962102,5198177,5386392,72991,2321397,1751565,2029890,642041,1314314,1322492,1557859,1647784,1831767,1347480,1739353,1742597,1636006,1728247,1709689,1206645,1383206,1119153,1378317,1527356,1937898,1485322,1404498,1868629,1635265,1860456,1293870,1485349,2031389,1834402,1291372,1838382,1616009,781641,1421830,1763592,1279535,1123468,2024766,975863,1461843,1318585,1137336,1111721,1407705,2349652,1260858,1144070,1219659,1378615,1354139,2015115,1408858,2650864,1810850,1380157,1844909,2055306,1913532,1701963]}]