I'm pretty new to Cloud SQL and decided to use it for a course project. Each time I create a new instance (Second Generation), it ends up suspended and I can no longer access it through the Google Cloud Platform console, the mysql client, etc.
It seems that most of the time instances get suspended due to billing issues, but I've enabled billing in the console and all of my information is correct.
The instance will work and be accessible for a few minutes after creation but soon thereafter it moves to this 'Suspended' state.
Any ideas why this is occurring?
You can visit this article for a number of reasons why Google Cloud SQL may suspend an instance. Also, as Vadim mentioned, you can open a ticket with the Google Cloud Billing team via this form.
I have an MSI installer that I want to submit to Partner Center / the Microsoft Store. I converted the MSI installer into MSIX package format using the MSIX Packaging Tool.
I already had an idea of how to publish my app in the MS Store using this reference: https://www.advancedinstaller.com/msix-publish-microsoft-store.html
However, a question comes to mind: once I've published my application in the MS Store, what are the steps for releasing new updates to it?
I did some research, but most of the results have something to do with source code. My preferred approach is to update the app without any coding.
The submission process is managed through the Microsoft Partner Center dashboard, a web portal that lets developers publish applications and manage updates so that the app stays up to date automatically.
Once you submit an update to a published application, the updated packages are typically available about two hours after submission (though this can sometimes take longer, especially with larger packages). Price, screenshot, or description changes take about 16 hours to go live. Customers receive the updated package the next time their device checks for updates, if automatic app updates are turned on, or when they trigger the update themselves by opening the Microsoft Store and choosing Check for updates on the Downloads page. By default, automatic updates are turned on in the Microsoft Store, so users will generally be running the latest version of your application.
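The main mechanical step on your side, since you want no code changes, is giving the resubmitted package a higher version than the published one (Store packages use Major.Minor.Build.Revision, with Revision conventionally left at 0 for Store submissions). A minimal sketch of that bump in Python; the function name is mine, and it assumes the version bump is the only manifest change you need:

```python
import xml.etree.ElementTree as ET

# Default namespace used by AppxManifest.xml in MSIX packages.
APPX_NS = "http://schemas.microsoft.com/appx/manifest/foundation/windows10"

def bump_package_version(manifest_xml: str) -> str:
    """Increment the Build field of the package version in an AppxManifest.

    Keeps the Revision field at 0, since the Store reserves it.
    """
    ET.register_namespace("", APPX_NS)  # preserve the default namespace on output
    root = ET.fromstring(manifest_xml)
    identity = root.find(f"{{{APPX_NS}}}Identity")
    major, minor, build, _rev = (int(p) for p in identity.get("Version").split("."))
    identity.set("Version", f"{major}.{minor}.{build + 1}.0")
    return ET.tostring(root, encoding="unicode")
```

After editing the manifest you would repackage (e.g. with the MSIX Packaging Tool or MakeAppx), re-sign, and upload the new package in the same Partner Center submission flow you used originally.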
References:
MSIX Auto Updates https://www.advancedinstaller.com/msix-auto-updates.html
Update a public app https://blogs.windows.com/windowsdeveloper/2016/05/13/publish-or-update-a-public-app-dev-center-tip-1/
Mandatory updates https://learn.microsoft.com/en-us/windows/uwp/packaging/self-install-package-updates#mandatory-package-updates
Upload app packages https://learn.microsoft.com/en-us/windows/uwp/publish/upload-app-packages
We are using SSIS 2016 with parameterized connections. Every time we open the solution, the connection manager tries to connect using the credentials in the project parameters; because the password isn't available, the repeated attempts lock out the database user account.
Looking for some input: is there any setting we could change so the connection manager doesn't try to connect when the solution is opened?
Thanks for your time on this.
RR
If you change the project to work in Offline mode, you should be able to open it without the validation checks firing, which should alleviate the ever-so-fun lockout policy. That might be a user solution file setting, so each team member may need to set it themselves.
A different approach is to point the design-time user value at a nonexistent account. The runtime will swap in the properly parameterized connection, but a developer opening the packages won't lock anyone out.
I recently discovered Power BI as part of our Office 365 subscription so am very new to it.
We have a MySQL database with about 5 million rows in AWS. I want to add this as a data source to our Office 365/Power BI service.
How do I do this?
I see there is no content pack service that allows me to do this.
According to this SO question and answer, there is no direct way to do this: How to connect POWER BI web with AW mysql database?.
I also looked at using a Power BI Gateway to achieve this. There are two types: Personal and On-Premises. We don't have any Windows Servers, so this leaves the Personal option: https://powerbi.microsoft.com/en-us/documentation/powerbi-personal-gateway/
For Personal, the documentation at that link says "A personal gateway is not required in order to refresh datasets that get data only from an online data source" which is a little confusing given that this seems to be the only option for connecting to my online data source (maybe this document meant to say "from a supported online data source"?). It seems that I install this on a local machine in our office, connect to my AWS MySQL database, query/model on my desktop, then upload my results to our Power BI Service for the rest of our company to access. I schedule refreshes using the Personal Gateway. Is this correct? I hope this does not involve the transfer of millions of rows to/from desktop and/or Power BI Service?
p.s. I also considered developing something similar to the content packs provided for GitHub, Google Analytics, MailChimp, etc., but there doesn't seem to be a "private" way to do these. Doing it this way seems to involve becoming a Certified Azure Developer (even though there is no Azure in this problem) and then making the solution public (which I obviously don't want to do): https://azure.microsoft.com/marketplace/programs/certified/apply/. If there is a way to develop my own "private" solution without the certification and publication process, I would consider that.
I would tackle this through Power BI Desktop. You will need a Windows machine to install it on, and it will need the MySQL connectors installed; ref:
https://stackoverflow.com/a/32746679/1787137
Then I would develop and publish your queries, datasets and reports using PBI Desktop.
Finally I would configure PBI Personal Gateway to schedule refresh of the published report datasets.
5M rows is not trivial but quite possible in this scenario. You will likely only need a selection of your tables and columns that have analytical value.
I've got an Access VBA application that is responsible for querying a SQL table once per second over ODBC, looking for records where processed = 0. When it finds one, it will process an Access report and print it to PDF using CutePDF, saving it to a network drive that our desktop application (that added the SQL record with processed = 0) can access, opening it on the user's desktop.
I've spent considerable time debugging this, my first VBA application, and have all errors being trapped & logged. This morning, the Access application was closed, so no reports were being processed. Opening the .accdb allowed the backlog of reports to begin processing. The error log is empty, and the server has not rebooted in nearly a month.
As far as I know, this application should ONLY be run from a desktop, so an administrator must log into the server after a reboot. I've read that running MSACCESS as a service is a "terrible idea" and exposes Windows to corruption if Access errors out.
That's the state of things. Here are my questions.
Is there a pre-written application you would recommend that I install on this server to monitor & log the events of the application (particularly, exits & opens)?
As a last resort, I suppose I could add a function that, every 10 minutes, writes to a "status log" text file that basically says, "I am here!" so that I can at least find out WHEN the app closed, if not WHY. This seems like overkill, as the log file could grow enormous unless I create & destroy it every 5-10 minutes instead of appending to it.
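The file-growth worry goes away if the heartbeat overwrites a single line instead of appending; the file's contents (or its modified timestamp) then tell you when the app last checked in. A minimal sketch in Python for illustration (in VBA the equivalent would be opening the file `For Output`, which also truncates); the file path is an assumption:

```python
import datetime
import pathlib

HEARTBEAT_FILE = pathlib.Path("heartbeat.txt")  # hypothetical path on the network drive

def write_heartbeat() -> None:
    """Overwrite (not append) a one-line status file with the current time.

    Because write_text() truncates on every call, the file never grows;
    its single line answers "when did the app last check in?".
    """
    HEARTBEAT_FILE.write_text(
        datetime.datetime.now().isoformat(timespec="seconds") + "\n"
    )
```

Call this from the same 1-second polling timer (or a slower one); no create-and-destroy cycle is needed because the file stays one line long forever.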
You'll have to forgive me; prior to this app I worked exclusively in PHP & JavaScript, and I haven't taken a Visual Basic or C-based class since college six years ago.
Any creative solutions?
My first thought is to use the Profiler to capture what you need.
Does your app reset the processed = 0 flags when the reports are created?
You could have a one-record table with a 'last processed' date field and just update that row every time the app runs.
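That one-record table is just an UPDATE against a fixed key each time the polling loop fires. A sketch of the idea in Python, using sqlite3 purely as a stand-in for the real SQL Server connection over ODBC (table and column names are mine):

```python
import datetime
import sqlite3

# sqlite3 stands in for the real SQL Server / ODBC connection here.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE Heartbeat ("
    "  Id INTEGER PRIMARY KEY CHECK (Id = 1),"  # constraint keeps it a one-row table
    "  LastProcessed TEXT)"
)
conn.execute("INSERT INTO Heartbeat (Id, LastProcessed) VALUES (1, NULL)")

def touch_heartbeat() -> None:
    """Update the single row each time the polling loop runs."""
    conn.execute(
        "UPDATE Heartbeat SET LastProcessed = ? WHERE Id = 1",
        (datetime.datetime.now().isoformat(timespec="seconds"),),
    )
    conn.commit()
```

Unlike an append-only log, the table never grows; a stale `LastProcessed` value immediately tells you roughly when the Access app stopped running.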
I have a Java server providing web services to my front-end web application. Part of our web application's functionality is to accept recurring payments from our customers and also to pay our sales agents using a "pay me" option.
I want my backend Java server to talk to the PayPal APIs directly and handle all payments.
For example: when a sales agent says "pay me", I capture his/her PayPal account ID and store it in my database. Then a backend process kicks in and makes the payment to the agent using the PayPal APIs.
Is this possible with the PayPal APIs?
Everywhere I've looked suggests that PayPal integration can only happen via the web.
As long as your system can carry out an API operation from the backend, there isn't any reason that it shouldn't work. I have a cron job set up that runs every morning and performs an API call.
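For the "pay me" case specifically, PayPal's REST Payouts API (`POST /v1/payments/payouts`) is the usual server-to-server way to send money to an agent's account, authenticated with an OAuth access token rather than any web UI. A sketch of constructing such a request, shown in Python for brevity (the same call is straightforward from Java); the request is built but not sent, and the token would come from an earlier client-credentials call:

```python
import json
import urllib.request

def build_payout_request(receiver_email: str, amount: str,
                         access_token: str) -> urllib.request.Request:
    """Build (but don't send) a PayPal Payouts API request.

    Body shape follows PayPal's REST Payouts API; this targets the
    sandbox host, with api-m.paypal.com used in production.
    """
    body = {
        "sender_batch_header": {"email_subject": "You have a payout"},
        "items": [{
            "recipient_type": "EMAIL",
            "receiver": receiver_email,          # the agent's stored PayPal account
            "amount": {"value": amount, "currency": "USD"},
        }],
    }
    return urllib.request.Request(
        "https://api-m.sandbox.paypal.com/v1/payments/payouts",
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {access_token}",
        },
        method="POST",
    )
```

A scheduled backend job (cron, or a Java scheduler) can issue this call with no browser involved; check the Payouts API docs for the exact response handling and error codes.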