Missing Firebase app_update events - firebase-analytics

I'm seeing a huge difference between the app_update events automatically sent by Firebase and the real conversion of the user base to a newer version when it's released.
For example, over the 5 days around a new version's release: 120 events vs. 3k users (a 20x difference).
I also checked another "alpha" update using data exported to BigQuery:
Count of app_update events (to any version):
SELECT
COUNT(user_pseudo_id)
FROM
`analytics_151378081.events_*`
WHERE
_TABLE_SUFFIX BETWEEN '20190820' AND '20190827' AND
event_name = "app_update"
Result: 49
Count of users on the new version whose first app launch was before this version was pushed to alpha:
SELECT
COUNT(DISTINCT(user_pseudo_id))
FROM
`analytics_151378081.events_*`
WHERE
_TABLE_SUFFIX BETWEEN '20190820' AND '20190827'
AND app_info.version = "0.4.2.1"
AND user_first_touch_timestamp < 1566259200000000
Result: 73
So it's 49 app_update events (to any version) vs. 73 "old" users running the new version.
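For a like-for-like comparison, counting distinct updating users rather than raw events should look roughly like this (a sketch against the same dataset):
SELECT
COUNT(DISTINCT user_pseudo_id)
FROM
`analytics_151378081.events_*`
WHERE
_TABLE_SUFFIX BETWEEN '20190820' AND '20190827'
AND event_name = "app_update"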
Is anyone seeing the same?
My platform is Android TV, and a lot of sessions have no UI (TV Input Framework integration through a service).

Related

Delete custom WordPress posts via SQL

I have a website that operates similarly to many freelancing websites, where people can make bids. It's become painfully slow in recent weeks, and I used Query Monitor to find that the issue is likely the 140,000+ posts that have built up over the last 5 years. Below is a query that takes 36 seconds (from Query Monitor):
...
SELECT wp_4_posts.ID
FROM wp_4_posts
WHERE 1=1
AND wp_4_posts.post_parent = 427941
AND wp_4_posts.post_author IN (1)
AND wp_4_posts.post_type = 'bid'
AND ((wp_4_posts.post_status = 'publish'))
ORDER BY wp_4_posts.post_date DESC
LIMIT 0, 5
WP_Query->get_posts()
...
I'm wondering how I can:
Delete any posts of type "bid" that are in draft status
Delete any posts of type "bid" that are published but were created before 2020
The best solution for you is the Bulk Delete plugin: it lets you target a specific post type and post status, and you can run the operation in batches, e.g. 50 or 200 posts at a time.
https://wordpress.org/plugins/wp-bulk-delete/
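If you'd rather do it directly in SQL, as the title suggests, a minimal sketch could look like this. It assumes the wp_4_posts table from the query above plus a matching wp_4_postmeta table (my assumption, based on the multisite-style prefix), and you should back up the database first, since deleting rows from the posts table directly leaves orphaned postmeta behind:
-- Remove all draft bids
DELETE FROM wp_4_posts
WHERE post_type = 'bid'
AND post_status = 'draft';
-- Remove published bids created before 2020
DELETE FROM wp_4_posts
WHERE post_type = 'bid'
AND post_status = 'publish'
AND post_date < '2020-01-01';
-- Clean up postmeta rows orphaned by the deletes above
DELETE pm FROM wp_4_postmeta pm
LEFT JOIN wp_4_posts p ON p.ID = pm.post_id
WHERE p.ID IS NULL;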

How do I fetch data for a machine running log with different 'start'/'stop' entries in MySQL?

My application keeps track of machines that are 'started' and 'stopped' as many times as needed, and this information is saved in a single table with the fields machine_ID, timestamp, and entry_type (start/stop).
A start creates one row, and a stop creates a different row.
Sample start row : M1 - 2019-06-27 12:08:30 - 'start'
Sample stop row : M1 - 2019-06-27 14:30:05 - 'stop'
I need to be able to display:
1) All currently running machines (machines that don't have a stop entry after the latest start entry)
2) A timeline of the previous machine activity, preferably chronological, e.g. Machine 1 started 9 AM, stopped 10 AM; Machine 2 started 1 PM, stopped 2 PM; etc.
I imagine a result table as follows would be what I need but I can't understand how to build a query to fetch it.
M1 - 2019-06-27 12:08:30 - 'start'
M1 - 2019-06-27 14:30:05 - 'stop'
M1 - 2019-06-27 16:00:00 - 'start'
M1 - 2019-06-27 16:30:00 - 'stop'
M2 - 2019-06-27 16:00:00 - 'start'
M2 - 2019-06-27 16:30:00 - 'stop'
I plan to use PHP to parse through such a table.
I am unable to alter the table structure in any way.
I believe the general principle behind solutions to this kind of problem is well documented; however, not knowing what it is called, I am unable to do my own research.
Thanks.
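A sketch of both queries, assuming the table is called machine_log (the real name isn't given in the question) and has exactly the columns described. This family of problems is often discussed under terms like "pairing start/stop events" or "gaps and islands", which may help as search keywords:
-- 1) Currently running machines: the latest entry per machine is a 'start'
SELECT t.machine_ID, t.timestamp AS running_since
FROM machine_log t
JOIN (
    SELECT machine_ID, MAX(timestamp) AS latest
    FROM machine_log
    GROUP BY machine_ID
) m ON m.machine_ID = t.machine_ID AND m.latest = t.timestamp
WHERE t.entry_type = 'start';
-- 2) Chronological activity timeline, grouped per machine
SELECT machine_ID, timestamp, entry_type
FROM machine_log
ORDER BY machine_ID, timestamp;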

Nested Cron jobs in Nodejs

I have a 'Tournament' SQL table that contains start_time and end_time for my tournaments. I also have another table with playerId and tournamentId columns, so I can tell which players played in which tournaments.
What I'm trying to do is run a cron task that checks my tournament table to see whether a tournament has ended, so it can fetch the players' results from an external API. The problem is that the external API is rate-limited, and I have to space my requests 1.5 seconds apart.
What I tried is a cron job that checks my tournament table every 10 seconds (I couldn't come up with any solution other than polling the db):
cron.job("*/10 * * * * *", function(){
result = Query tournament table Where EndTime=<Now && EndTime+10second>=Now
if(result is not empty)
{
cron.job("*/1.5 * * * * *",function(){
send API requests for that userId
parse & store result in db
});
}
});
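In SQL, the lookup I have in mind is roughly this (a sketch; I'm assuming end_time is a DATETIME column on a tournament table):
SELECT id
FROM tournament
WHERE end_time <= NOW()
AND end_time >= NOW() - INTERVAL 10 SECOND;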
I don't feel right about this, and it seems buggy to me, because the inner cron job might take longer than 10 seconds. Is there a better way to do this? I'm using ExpressJS & MySQL.
The problem you are facing can be solved with event emitters. There is a very useful npm module, node-schedule, which helps in exactly the scenario you describe. What you have to do is schedule a job to fire at each tournament's deadline; that job will hit the 3rd-party API and check for results. You can schedule a job like this:
var schedule = require('node-schedule');
schedule.scheduleJob("jobName", "TimeToFire", function () {
    // Put your API hit here.
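    // e.g. loop over the tournament's players, pausing ~1.5 s between
    // requests to respect the external API's rate limit (my assumption)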
    // Finally, remove the schedule:
    if ("jobName" in schedule.scheduledJobs) {
        schedule.scheduledJobs["jobName"].cancel();
        delete schedule.scheduledJobs["jobName"];
    }
});
Make sure you also store every scheduled job in the database, because a server crash will wipe the in-memory schedules and you will have to re-create them on startup.
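A minimal sketch of such a persistence table (table and column names are my assumptions, not from the question):
CREATE TABLE scheduled_job (
    job_name      VARCHAR(64) PRIMARY KEY,  -- the name passed to scheduleJob
    tournament_id INT NOT NULL,
    fire_at       DATETIME NOT NULL,        -- the tournament's end_time
    done          TINYINT(1) NOT NULL DEFAULT 0
);
On startup, select the rows where done = 0 and re-register each of them with node-schedule.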

windows phone 8.1 database general questions

I'm new to WP8.1 development.
My application currently uses a SQL CE database in isostore (my DBConnectionString is "isostore:/SpaceQuiz.sdf"), and everything works properly.
But here I have some questions:
1) If my .sdf file is in isostore, will it be added to my XAP file after deployment?
2) I want to add some data to this database manually (about 1000 rows; exemplary schema: ID | NAME | ISDONE | TIME).
Some cells in this database (ISDONE | TIME) will be updated by my application. In the future I also want to add another 1000 rows, for example. What is the best approach to achieve this, and how do I prevent updates from resetting the data in my .sdf file?

MySQL db for time-temperature values

I need your help to build my db the right way.
I need to store time-temperature values for the different rooms of my house, and I want to use DyGraph to graph the data sets.
I want to implement different time windows: 1 hour, 24 hours, 48 hours, 1 week, ....
I will be sampling the temperature at 15-minute intervals, so I will have 4 time-temperature values per hour.
Each room has an ID, so the time-temperature values are associated with the proper room.
The table I built is very simple:
------------------------------------
| ID | DATE                | TEMP |
------------------------------------
|  1 | 2014-04-30 00:00:00 | 18.6 |
|  2 | 2014-04-30 00:00:00 | 18.3 |
|  3 | 2014-04-30 00:00:00 | 18.3 |
|  1 | 2014-04-30 00:15:00 | 18.5 |
------------------------------------
For some strange reason, when the number of rows reaches 500 or so, the server becomes very slow.
Also, I have a web page where I can read the different room temperatures: this page polls the server through AJAX every 5 seconds (because it needs to be updated frequently!), but when the table reaches around 500 rows, it hangs.
I tried splitting the table: I created a table for each room, then a table for each time window, and now everything seems to work fine.
Since I do not think this is the best/most efficient way to organize things, I need your help to give it a better structure.
I use a php script to retrieve the temperature data for all the rooms of my house:
$query = "SELECT * FROM temperature t1
WHERE (id, date) IN
(SELECT id,MAX(date) FROM
temperature t2 GROUP BY id)";
this query allows me to collect the temperature values in an array called $options:
$result_set = mysql_query($query, $connection);
while($rows = mysql_fetch_array($result_set)){
$options [] = $rows;
}
then, I json-encode the array:
$j = json_encode($options);
and send it to the ajax script, which shows the data on the web page:
echo $j;
In the ajax script, I save the data in a variable and then parse it:
var return_data = xhr.responseText;
var temperature = JSON.parse(return_data);
next I loop through the array to extract the temperature values and put it in the right place on the web page:
for(var j=0; j<temperature.length; j++){
document.getElementById("TEMPArea" + j).innerHTML = temperature[j].temp + "°C";
}
This works fine as long as the 'temperature' table holds fewer than 600 rows or so: polling every 5 seconds is not a problem.
Above 600, the page refresh gets slow and eventually it hangs and stops refreshing.
EDIT: Right now, I am working on a virtual machine with Windows 7 64-bit, Apache, PHP and MySQL, and 4 GB RAM. Do you think this could be an issue?
I am not an expert; the code is pretty simple and straightforward, so I am having trouble pinpointing the cause.
Thanks again.
I think the query is the main source of problems:
it's a slow way of getting the answer you want (you can always run it in Workbench and study the output of EXPLAIN; see the manual for more details)
it implicitly assumes that all sensors transmit at the same time, and as soon as that's not the case your output dataset won't be complete. Normally you'll want the latest data from each individual sensor
so I propose a somewhat different approach:
add an index on date and one on id to speed up queries (both are sketched after this list). The lack of a PK is an issue, but let's first focus on solving the current issues...
obtain the list of available sensors - minimal solution
select distinct id from temperature;
but it would be better to store the list of available sensors in some other table (also sketched after this list) - this query will get slower as the number of records in temperature grows.
iterate over the results of that list to fetch the latest value for each of the sensors
select * from temperature
where id = (value obtained in previous step)
order by date desc
limit 1;
with this query you'll only get the most recent record associated with each sensor. Thanks to the indexes, the speed impact of a growing table should be minimal.
reassemble these results into a data structure to send to your client web page.
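A quick sketch of the indexes and the sensor table suggested above, using the temperature table name from the question (the sensor table layout is my assumption):
-- Indexes to speed up the lookups
ALTER TABLE temperature ADD INDEX idx_id (id);
ALTER TABLE temperature ADD INDEX idx_date (date);
-- Or, since the per-sensor query filters on id and sorts by date,
-- a composite index can serve it directly:
ALTER TABLE temperature ADD INDEX idx_id_date (id, date);
-- A separate table listing the available sensors
CREATE TABLE sensor (
    id   INT PRIMARY KEY,      -- matches temperature.id
    room VARCHAR(64) NOT NULL  -- e.g. 'kitchen'
);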
Also, as stated in the documentation, the mysql_* extension is deprecated and should not be used in new programs. Use mysqli_* or preferably PDO. Both of these extensions will also allow you to use parameterized queries, the only real protection against SQL injection. See here for a quick introduction on how to use them.