"Data for date 2020-10-13 for application classroom is not available right now, please try again after a few hours." - google-apis-explorer

I've been getting this message back in the JSON since last week when trying to get Google Classroom app usage reports via the Admin API. Has anybody come across this issue before, or know if it is a temporary issue with the service?
If I change the date parameter to 2020-10-12 it works, and I have been able to obtain 'daily' data this way for the last few weeks. (There is generally a delay of about two and a half days before the reports become available for a given day, so the current delay is much longer than usual.)
I am using the classroom:timestamp_last_interaction parameter.
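For reference, the call is roughly this shape (a minimal sketch assuming google-api-python-client with domain-admin credentials already set up; the credentials file, admin address, and the fallback loop are illustration only):

from datetime import date, timedelta

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/admin.reports.usage.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # hypothetical credentials file
).with_subject("admin@example.com")        # hypothetical admin user
service = build("admin", "reports_v1", credentials=creds)

def classroom_usage(day):
    """Return the usage report for one day, or None if not yet available."""
    resp = service.userUsageReport().get(
        userKey="all",
        date=day.isoformat(),
        parameters="classroom:timestamp_last_interaction",
    ).execute()
    # When the data isn't ready yet, the response carries a warning
    # instead of usage rows (verify the exact shape against your responses).
    return None if resp.get("warnings") else resp

day = date(2020, 10, 13)
while classroom_usage(day) is None:
    day -= timedelta(days=1)  # walk back until a report exists
print("latest available report:", day)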

As of 2020-10-20, I can now access Classroom usage data for 2020-10-13. Maybe the lead time has increased to a week, which would be a shame, but easier to remember I suppose!
Will update if I uncover more.


MS Access: Multi-User Application: send msgbox to specific user

I don't know if my idea can be implemented with MS Access. The requirement:
I have one centralized data database and several client databases. It will be used to manage a rescue team in our company. Each member presses a button in their client, and the reception client shows who is available in case of an emergency.
It even shows which part of the building each rescue person is in.
Now, it's human nature that in the evening people forget to log out. The plan is to define a usual end-of-work time. Every 5 minutes, the reception client checks whether someone has reached their end-of-work time and, if so, marks them as out of office.
The problem is that Mr. X may not work until 17:00 as usual today; perhaps he is available until 20:00.
So a message should pop up 15 minutes before his end-of-work time and ask whether he is leaving on time. If he answers that he is working longer, a flag should exclude him from the automatic logout for today.
The solution is about 70% developed. The remaining problem is the small matter of how to pop up the message for the right user. One idea is for each client to check a message table for messages addressed to it.
But I don't want to create too much LAN traffic if each client asks the data DB every 5 minutes whether there is a message for it.
Does anyone have an idea?
Best regards
Roland
Polling a single table every 5 minutes should generate virtually no load. I've used a similar solution that polls every minute without any trouble on a networked database with ~20 users.
You can, of course, pull in these messages once, since they will fire at a set time, and then just raise them at that time.
You can just have a hidden form that's bound to a specific table, filters on the username, requeries every x seconds, tests whether there's a message ready, and then displays it.
Alternatively, you can pull in messages once, and have a hidden form that checks on timer if it's time to raise that message.
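In Access itself this would live in a hidden form's Timer event; purely for illustration, here is the same polling idea as a standalone sketch (Python with pyodbc against the back-end database; the Messages table, its columns, and the file path are all hypothetical):

# Client-side polling loop: every POLL_SECONDS, look for an undelivered
# message addressed to this user, show it, then mark it delivered.
# Table/column names (Messages, Username, Body, Delivered) are hypothetical.
import time
import getpass
import pyodbc

POLL_SECONDS = 60
conn = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=\\server\share\data.accdb"  # hypothetical back-end path
)
user = getpass.getuser()

while True:
    cur = conn.cursor()
    cur.execute(
        "SELECT ID, Body FROM Messages WHERE Username = ? AND Delivered = False",
        user,
    )
    for msg_id, body in cur.fetchall():
        print(f"Message for {user}: {body}")  # in Access this would be a MsgBox
        cur.execute("UPDATE Messages SET Delivered = True WHERE ID = ?", msg_id)
    conn.commit()
    time.sleep(POLL_SECONDS)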

Reporting Project Management

I have a question for BI/reporting analysts regarding project management.
This may be a difficult question to answer, but I just need an idea of what is expected. To frame the question, let's say we are working with customer relationship data (say, Microsoft Dynamics), comms data, shop data, and/or financial transactions.
I understand that the time it takes to create a report from scratch (gathering requirements -> possibly setting up an ETL package -> writing the SQL for the dataset -> building the report on a reporting platform such as SSRS) varies with complexity.
I want to know roughly what sort of timescales to expect for such a project from a BI/reporting analyst's point of view.
Also, do analysts work on one project at a time, or do they split their time between two or maybe three projects at once? I don't mean slight or incremental changes to existing reports; I mean multiple projects built from scratch.
If reporting analysts work on multiple projects, how does that affect the time it takes, compared with building a report while working on one project at a time?
What I want to learn from this question is what is expected from analysts in terms of the time it takes to deliver reports, and whether they are expected to work on multiple projects at the same time.
Thanks a lot in advance for your responses.
Sorry if this question seems pointless, or if I have not been able to write clearly exactly what I want to know.
The question isn't pointless, but I found it a little hard to follow. Here's what I would recommend:
Assuming experienced analysts who are current employees: yes, they typically work on multiple projects at once, usually within a specific business area such as Operations. The time impact depends on how many projects run concurrently. Ask the manager of your BI analysis group how many projects an analyst typically handles at once; for discussion, let's say 3. No one's time is split evenly, but you can roughly estimate by assuming 1/3 of the analyst's time is on your project. So do the math based on 33% of an FTE (full-time equivalent, i.e., one full-time employee) with an average work month of 176 hours (8 per day x 22 working days), and you get 59 hours per month, rounded up.
Given the data examples, I think an ETL tool would help the process. For more information you can read this blog post on ETL tools and using ELT instead. Depending on the software, someone from your data team will likely install it; the analyst would set up the data flows and queries. Time varies, but for a specific use likely 30-40 hours. Once the application is configured, producing the reports is quick, but allow time to format them into presentation quality (dreaded PowerPoint, or a spreadsheet). I'd say that's 2-3 times the minutes required to run the query and produce the report. You could see how long current reports take at your company and base it on that, or use a round number like 60 or 90 minutes per report. I prefer to use actual data.
So to sum up: 40 hours to configure the ETL tool, 90 minutes (1.5 hours) per report, and 1/3 of an FTE. Let's say you need 5 reports:
(40 + (1.5 x 5)) = 47.5 hours of work; at roughly 13 hours per week (1/3 of an FTE), that's about 4 weeks, rounded up.
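As a quick sanity check of that arithmetic (the 40-hour ETL setup, 90 minutes per report, 5 reports, and 1/3-FTE availability are the assumptions from above):

# Rough delivery estimate under the assumptions discussed above.
etl_setup_hours = 40           # one-time ETL configuration
report_hours = 90 / 60         # 90 minutes per report, in hours
num_reports = 5
work_hours = etl_setup_hours + report_hours * num_reports  # 47.5 hours

hours_per_week = 40 / 3        # 1/3 of a full-time analyst, about 13.3 h/week
weeks = work_hours / hours_per_week                        # about 3.6 weeks
print(f"{work_hours} hours of work = {weeks:.1f} weeks, call it 4")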

Active Collab 5 Webhooks / Maintaining "metric" data

I have an application I am working on that takes the data from Active Collab and creates reports and graphs from it. The API itself can't return the proper data on a per-request basis, so I resorted to pulling the data down into a separate dataset that can be queried more efficiently.
In order to avoid querying the entire API constantly, I decided to make use of webhooks to apply the transformations to the relevant data and lower the need to resync the data.
However, I notice not all events are sent, notably the following:
TaskListUpdated
MemberUpdated
TimeRecordUpdated
ProjectUpdated
There are probably more, but these are the main ones I have noticed so far.
Time records are probably the most important; in fact, their absence from webhooks means that almost any application that needs time-record data has a good chance of holding incorrect data. It's fairly common to make a typo in a time record and then adjust it later.
So am I missing anything here? Is there some way to see these events reliably?
EDIT:
In order to avoid a long comment to Ilija I am putting the bulk here.
"Webhooks apart, what information do you need to pull? API that powers time tracking reports can do all sorts of cross project filtering, so your approach to keep a separate database may be an overkill."
Basically we are doing a multi-variable tiered time report. It can be sorted / grouped by any conceivable method you may want to look at.
http://www.appsmagnet.com/product/time-reports-plus/
This is the closest to what we are trying to do. Back when we used Active Collab 4 it did the job, but even with it we had to consolidate the results in our own spreadsheets.
So the idea is to better integrate our Active Collab data into our own workflow.
The main data we are looking for in this case is:
Job Types
Projects
Task Lists
Tasks
Time Records
Categories
Members / Clients
Companies
These items can feed not only our reports but many other aspects of our company as well. For us, Active Collab is the point of truth, so we want the data quickly accessible and fully queryable.
So I have set up a sync system that initially grabs all the data it can from Active Collab and then uses a mix of cron's and webhooks to keep it up to date.
Cron jobs work well for all the aspects that do not have "sub-items". For the rest (projects/tasks/task lists/time records) I need to rely on the webhooks, since a full resync takes too much time to keep them up to date in real time.
For the webhooks, I noticed the events above do not come through. For time records I figured out a workaround, listed in my answer, and members can be handled through the cron. That leaves task-list and project updates as the only two of some concern. Projects are fairly important because the budget can change, and that is used in reports; task lists have start/end dates that could be used as well. Since going through every project and task list constantly to see if something has changed is really not a great idea, I am looking for a way to reliably see updates for them.
I have based this system on https://developers.activecollab.com/api-documentation/ but I know there are at least a few endpoints that are not listed.
Cross-project time-record filtering using Active Collab 5 API
This question is actually from another developer on the same system (and also shows a TrackingFilter report not listed in the docs). Due to issues with maintaining an accurate dataset we had to adapt it. I notice that you (Ilija) are the person replying there, and you did recommend we move over to this style of system.
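For context, the receiving end of that sync is roughly this shape (a sketch using Flask; whether the webhook body exposes "type" and "payload" fields is an assumption to verify against the API docs linked above):

# Sketch of the webhook receiver that applies incoming Active Collab events
# to the local reporting dataset. The JSON envelope (a "type" string plus a
# "payload" object) is an assumption to verify against the API docs.
from flask import Flask, request

app = Flask(__name__)

HANDLED = {"TaskCreated", "TaskUpdated", "TimeRecordCreated", "ProjectCreated"}

def upsert(event_type, payload):
    """Apply one change to the local dataset (stub for the real sync logic)."""
    print(event_type, payload.get("id"))

@app.route("/activecollab/webhook", methods=["POST"])
def webhook():
    event = request.get_json(force=True)
    if event.get("type") in HANDLED:
        upsert(event["type"], event.get("payload", {}))
    return "", 200

if __name__ == "__main__":
    app.run(port=8080)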
This is not a total answer but a way to solve the issue with TimeRecordUpdated not going through the webhook.
There is another API endpoint, /whats-new. This endpoint describes changes over roughly the last day, and it has a category called TrackingObjectUpdatedActivityLog, which refers to an updated time record.
So I set up a cron job to check this fairly frequently and manually push the TimeRecordUpdated event through my system to keep it consistent.
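Roughly, that cron job looks like this (a sketch; the instance URL and token are placeholders, and the auth header and response shape are assumptions to verify against your own instance):

# Poll /whats-new, pick out updated time records, and re-emit them as
# TimeRecordUpdated events into the local sync pipeline.
import requests

BASE = "https://example.com/api/v1"          # hypothetical instance URL
HEADERS = {"X-Angie-AuthApiToken": "TOKEN"}  # hypothetical API token

def handle_time_record_updated(record_id):
    """Push a synthetic TimeRecordUpdated event through the pipeline (stub)."""
    print("TimeRecordUpdated", record_id)

resp = requests.get(f"{BASE}/whats-new", headers=HEADERS)
resp.raise_for_status()

for entry in resp.json():  # assumed: a list of activity-log entries
    if entry.get("type") == "TrackingObjectUpdatedActivityLog":
        handle_time_record_updated(entry.get("parent_id"))  # field name assumed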
For MemberUpdated, since an update to a member's data is unlikely to affect much, a daily cron that checks the users seems good enough.
ProjectUpdated could technically be handled the same way, but combined with the absence of TaskListUpdated that leads to far too many API calls to sync the data. I have not found a solution for this yet, unfortunately.

Amazon API submitting requests too quickly

I am creating a games comparison website and would like to get Amazon prices included within it. The problem I am facing is using their API to get the prices for the 25,000 products I already have.
I am currently using ItemLookup from Amazon's API and have it retrieving prices; however, after about 10 results I get an error saying 'You are submitting requests too quickly. Please retry your requests at a slower rate.'
What is the best way to slow down the request rate?
Thanks,
If your application is trying to submit requests that exceed the maximum request limit for your account, you may receive error messages from Product Advertising API. The request limit for each account is calculated based on revenue performance. Each account used to access the Product Advertising API is allowed an initial usage limit of 1 request per second. Each account will receive an additional 1 request per second (up to a maximum of 10) for every $4,600 of shipped item revenue driven in a trailing 30-day period (about $0.11 per minute).
From Amazon API Docs
If you're just planning on running this once, then simply sleep for a second in between requests.
If this is something you're planning on running more frequently, it'd probably be worth optimising it by subtracting the time the query takes to return from that sleep (so if my API query takes 200 ms to come back, we only sleep for 800 ms).
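A sketch of that pacing logic (one request per second, crediting the time each call already took; lookup_price is a hypothetical stand-in for the actual ItemLookup request):

# Throttle to about 1 request/second, subtracting each call's own duration.
import time

MIN_INTERVAL = 1.0  # Product Advertising API default: 1 request per second

def lookup_price(asin):
    """Stand-in for the real ItemLookup request."""
    ...

def throttled(asins):
    for asin in asins:
        start = time.monotonic()
        yield asin, lookup_price(asin)
        elapsed = time.monotonic() - start
        if elapsed < MIN_INTERVAL:
            time.sleep(MIN_INTERVAL - elapsed)  # 200 ms call -> 800 ms sleep

for asin, price in throttled(["B000000000"]):  # hypothetical ASIN
    print(asin, price)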
Since it only happens after 10 results, you should check how many requests you can make before the error appears. If it always appears after 10 fast requests, you could wait 500 ms (or a bit more) between calls; or, if it really only triggers every 10th request, you could build a loop that pauses after every 9th request.
If your requests involve a lot of repetition, you can create a cache and clear it every day, or contact AWS about purchasing higher request authorization.
I went through the same problem even when I used a delay of 1 second or more.
I believe that once you make too many requests with only a one-second delay, Amazon doesn't like it and decides you're a spammer.
You'll have to generate another key pair (and use it when making further requests) and use a delay of 1.1 seconds to be able to make fast requests again.
This worked for me.

How to create an auto task schedule iOS notification based on an event with MySQL data

I have a problem related to automatic task scheduling.
Currently I am able to find out when a customer last credited his account; how can I find out whether he pays anything in the next 3 days?
If no payment has been made within three days for any customer, I want to be alerted automatically, preferably by a notification sent directly to my iPad.
I don't want the checks to run only when I open the app and log in, because if I come back to the application 6 days later, I could have a customer who hasn't paid in 6 days when the app should have alerted me on the 3rd day, so I could ring the customer up to deal with the matter.
I need to work this way because of the structure of my application and business.
I am able to monitor everything else, but I need some insight into how to go about this. The phone's local notification system only fires at scheduled times, and I cannot do interval checks; if I could run a background task for this, I would have done it that way, but that's not the case.
Pavan
If I understand your question correctly, you should compute the interval to the event you care about and post a wake-up timer that fires that far from "now." If you need it through the notification center, then just handle it silently and clear it from the notifications.
Based on the discussion below:
You will need a little bit of server work. APNS looks complicated, but it really has very few moving parts, especially if it is a private app. What system component is keeping an eye on Amazon? Do you have an app or web server? For example, if I were to poke a record into your system (purchased services), what workflow is triggered to notify Accounting to process an invoice and collections at a later date? Am I making sense of your system architecture?
Perfect, you are done. You have all the system components you need, and the rest is coding. The server app processes the accounts DB and finds new entries. If it finds any, it publishes a record ID to the APNS server (Apple owns this server). You write code to register to receive the push notification (subscriber). When you get a push, it will wake up your registered app with the record ID (plus some other subscription bookkeeping, but you are the only subscriber and are only subscribing to one DB table, so you can largely ignore it). Now turn around and query based on that record. Done!
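To make the moving parts concrete, a sketch of the server-side cron (pymysql for the accounts query, and the APNs HTTP/2 API via httpx with a JWT provider token; the table and column names, credentials, key IDs, bundle ID, and device token are all hypothetical placeholders, and you'll need PyJWT with the cryptography extra and httpx[http2] installed):

# Nightly cron: find customers with no payment in 3+ days and send one
# APNs alert per customer to the registered iPad.
import time
import jwt      # PyJWT, ES256 requires the 'cryptography' package
import httpx
import pymysql

# --- find overdue customers (schema is hypothetical) -------------------------
db = pymysql.connect(host="localhost", user="app", password="...", database="billing")
with db.cursor() as cur:
    cur.execute(
        "SELECT id, name FROM customers "
        "WHERE last_payment_at < NOW() - INTERVAL 3 DAY"
    )
    overdue = cur.fetchall()

# --- push one APNs alert per overdue customer ---------------------------------
with open("AuthKey_ABC123.p8") as f:   # hypothetical APNs auth key file
    key = f.read()
token = jwt.encode(
    {"iss": "TEAMID1234", "iat": int(time.time())},  # your Apple team ID
    key, algorithm="ES256", headers={"kid": "ABC123"},  # your APNs key ID
)
headers = {
    "authorization": f"bearer {token}",
    "apns-topic": "com.example.collections",  # the app's bundle ID
    "apns-push-type": "alert",
}
DEVICE_TOKEN = "hex-device-token"  # the iPad's registered device token

with httpx.Client(http2=True) as client:
    for cust_id, name in overdue:
        client.post(
            f"https://api.push.apple.com/3/device/{DEVICE_TOKEN}",
            headers=headers,
            json={"aps": {"alert": f"{name} has not paid in 3 days"}},
        )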