Shared Data Extension to BigQuery (GCP) - function

I'm quite new to using Salesforce, but I have recently been able to use Cloud Data Fusion to transfer information from a Data Extension to a table in BigQuery.
Now I would like to do the same thing with a Shared Data Extension. My concern is that (possibly) features like the external key of the Data Extension would not exist, because the Data Extension is shared by different Business Units (BUs), and I also think I would have a problem with authentication (Client ID / Client Secret), again because it is a Shared Data Extension.
Thank you very much for any insight on this matter.

Related

Best Practice to Store Keys in Google Apps Script

I would like to inquire about best practices for storing keys within the Google Apps Script environment.
Currently, while I am prototyping, I just store the keys as local variables in the file where they are used. This key is used by all users of my Apps Script regardless of their domain.
Moving forward, however, I would like to keep this key in more reliable storage, and would like to ask for advice on how best to safeguard these keys.
Currently, I am thinking of:
Using PropertiesService.getScriptProperties().setProperty(key, value), as this is shared by all users.
Storing the key as part of the manifest? Is there a way to add userData in the contextual and homepage triggers?
Or just keeping it in local variables, as the code is not visible to the users anyway?
Thank you all in advance.
I understand that you are asking about the best way to store a static key that can be retrieved by anybody who runs your Apps Script project, regardless of their domain. I also assume that the script runs as the owner (you) and that the end users shouldn't be able to read the key; please leave a comment if that isn't the case. In this situation you have the following approaches:
The most straightforward approach would be to use the Properties Service. In this particular scenario the key should be accessible to anyone executing the script, so PropertiesService.getScriptProperties() is the way to go (you can learn more about other scenarios here).
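For illustration, here is a minimal sketch of that approach; the property name 'API_KEY' and its value are placeholders, not a real credential:

```javascript
// Run setApiKey() once yourself (the owner); end users only ever call getApiKey().
function setApiKey() {
  PropertiesService.getScriptProperties().setProperty('API_KEY', 'your-secret-value');
}

function getApiKey() {
  return PropertiesService.getScriptProperties().getProperty('API_KEY');
}
```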
As an alternative, you could store the key in your own database and use the JDBC Service to access it. This method is compatible with Google Cloud SQL, MySQL, Microsoft SQL Server, and Oracle databases. Here you can learn more about reading from a database.
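As a rough sketch (the connection string, credentials, table, and column names below are placeholders, not a real setup), reading the key over JDBC could look like this:

```javascript
// Hypothetical example: fetch a key from your own MySQL database via the JDBC Service.
function readKeyFromDatabase() {
  var conn = Jdbc.getConnection('jdbc:mysql://db.example.com:3306/secrets', 'dbUser', 'dbPassword');
  var stmt = conn.createStatement();
  var results = stmt.executeQuery("SELECT value FROM api_keys WHERE name = 'API_KEY'");
  var key = results.next() ? results.getString(1) : null;
  results.close();
  stmt.close();
  conn.close();
  return key;
}
```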
Another possible choice is the Drive API. You could take advantage of the application data folder since it is intended to store any files that the user shouldn't directly interact with. For this option you would need to use the Advanced Drive Service on Apps Script.
Please be aware that an admin from your domain could access (or gain access to) the stored key. Also, please check the quota limits to see if they fit your usage.
As you may have noticed, the PropertiesService provides several methods for storing key/value pairs at the document, user, or script level:
getDocumentProperties()
getUserProperties()
getScriptProperties()
I'd recommend storing a property based on who needs access to it. If only the authenticated user should have access to a property (for example, a setting relevant only to their account, such as their locale), go with the UserProperties. Conversely, if the property is relevant to a document (Google Docs, Google Sheets, etc.), go with the DocumentProperties.
That said, I wouldn't recommend using the ScriptProperties in general. The main reason is that quotas apply to the PropertiesService (see the table below), which means that as your add-on gets more and more users, you will hit the quota limit quite rapidly.
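For example (a minimal sketch; the property names are only illustrative):

```javascript
// Per-user setting (e.g. the user's locale) vs. a per-document setting.
function saveUserLocale(locale) {
  PropertiesService.getUserProperties().setProperty('LOCALE', locale);
}

function saveDocumentSetting(name, value) {
  PropertiesService.getDocumentProperties().setProperty(name, value);
}
```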
| Service | Consumer accounts (e.g. @gmail.com) | Google Workspace accounts |
|--|--|--|
|Properties read/write | 50,000 / day | 500,000 / day |
Source: https://developers.google.com/apps-script/guides/services/quotas
Depending on your use case, you might also be tempted by alternatives to the PropertiesService:
using local variables in your code - as you mentioned
using the CacheService - which stores data for a limited period of time.
making requests to a remote server where you could query your own database.
We rely heavily on the last option at my company, thanks to the UrlFetchApp service. The main reason is that it allows us to pull a user profile from our database without making updates to the codebase.
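As a hedged sketch of that pattern (the endpoint URL and response shape are assumptions, not our actual API):

```javascript
// Pull a per-user profile from a remote server instead of storing it in the script.
function fetchUserProfile() {
  var email = Session.getActiveUser().getEmail();
  var response = UrlFetchApp.fetch('https://api.example.com/profiles?email=' + encodeURIComponent(email));
  return JSON.parse(response.getContentText());
}
```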

Is there a way to store images in Azure or AWS without hosting the application on their platforms?

I am currently developing a full-stack web application for the first time. It is a store that needs to give users the ability to upload, edit, delete, and manage "books". As of now, I have a React front end that calls an Express API (via Axios), which queries a MySQL database. This currently handles the product details, titles, and simple labels and relations.
However, I now need to store images and .json files dynamically as well. I have done some research and need to use Azure Storage to store these images and make them accessible to the end user; the client would like to use Azure Storage as well.
I have gotten quite overwhelmed looking into the Azure documentation for JavaScript image uploading, and every "tutorial" starts with creating a web application, a storage account, and an App Service.
All I would like to do is store users' images in Azure Storage so that, when I eventually deploy the website, the data is available and my front end or API can call Azure to get the images for the user. I apologize if the question is confusing or vague; I am just overwhelmed by Azure's documentation, and it seems like such a simple and common problem. Any guidance or references would be greatly appreciated.
Yes, it's quite possible. You simply create an Azure Storage account and upload your files as blobs via that account. Then they will be available publicly on the Internet, and your web application can reference them from wherever it is hosted.
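If it helps, here is a minimal sketch of an upload from a Node/Express back end using the @azure/storage-blob package; the connection-string environment variable, container name, and public-access assumption are placeholders you would adapt:

```javascript
// Sketch: upload a local file as a blob and return its URL.
const { BlobServiceClient } = require('@azure/storage-blob');

async function uploadImage(localPath, blobName) {
  const blobServiceClient = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING // assumed environment variable
  );
  const containerClient = blobServiceClient.getContainerClient('book-images'); // assumed container name
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  await blockBlobClient.uploadFile(localPath);
  return blockBlobClient.url; // publicly reachable only if the container allows blob-level public access
}
```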
It is possible, as it is quite common practice to use a Content Delivery Network to deliver the images. I am not very familiar with Azure, but I am with AWS, and I can tell you that you can use an AWS S3 bucket to store the images and JSON. It comes with many different configuration options that allow content to be protected or open to the general Internet.
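As a rough sketch on the AWS side (using the AWS SDK for JavaScript v2; the bucket name and key are placeholders):

```javascript
// Sketch: upload an image to S3 and return the object URL.
const AWS = require('aws-sdk');
const fs = require('fs');

const s3 = new AWS.S3();

async function uploadToS3(localPath, key) {
  const result = await s3.upload({
    Bucket: 'my-book-assets',            // assumed bucket name
    Key: key,                            // e.g. 'covers/book-123.jpg'
    Body: fs.createReadStream(localPath),
    ContentType: 'image/jpeg',
  }).promise();
  return result.Location;                // reachable only if the bucket/object policy allows public reads
}
```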

Remote access to an Access database

I need to develop a very simple database (probably no more than 4-5 tables, with up to 50 records per table) for my company, with the following requirements:
The database itself (most likely an Access file) must be stored on a server and accessed through http://www.something.com/my_db.mdb
Users from 6 different countries (with generally low Internet bandwidth) must be able to access this database and to view / edit it through a few masks, as well as produce automatic reports / extracts
The whole solution must be as robust and as low-tech as possible, to reduce maintenance issues (ideally, no development at all)
I cannot pay for an Access license for each user, and using OpenOffice or LibreOffice is not an option (because I cannot go and install it on the computers of all the users)
My first (and naive?) idea was to:
1) Create the mdb file containing only the data and store it on a webserver
2) Create the edition masks and the automatic reports in another file that would define the online file as data source
3) Deploy the file containing the edition masks to the computers of all users
4) The users only have to open their local file to edit the remote DB through their edition masks
Is my approach somehow realistic? Do you see another approach that would make more sense? Can I implement my solution with 1 single Access license?
Thanks a lot in advance for your inputs and insights!
If you provide just the .mdb file as a file source accessible via HTTP, the users won't be able to connect to the database: an HTTP GET simply downloads the .mdb file to their local computer. When they edit something within the database (e.g. add a record), the change will be made only on their local copy of the file.
If you want to use an Access database, the simplest approach I can suggest is to implement a very small web application (e.g. in ASP.NET) that connects to the .mdb file (the .mdb file can then live in a private directory on the server). The web application is then deployed to Microsoft Internet Information Services (IIS) as the web server.
You can provide data forms as a web application implemented with ASP.NET, or develop separate clients that access web services you develop with .NET.
You could try cloud-based solutions like Google Firebase.
For a requirement of this type, one should not use Access tables, which are static (Access is a front-end database), but instead use a back-end database such as SQL Server Express. SQL Server Express is free, and with it you are better positioned to provide real web-based features if needed in the long run.
Further, in terms of cost and management, one should really consider using one of the online database services such as Zoho, Knack, Airtable, etc. One of these could well be faster and less expensive than creating a web app from scratch for such a small requirement.

Hosting JSON Files for Mobile Application

I am creating a mobile application in Swift for my organization. The application reads in data in JSON format to populate the information displayed in the app. I already have a method to generate the JSON files, but I need somewhere to host the actual files. I have an AWS account and an instance running; this is where I initially hosted my JSON files, but I got an email from AWS saying that having the app constantly grab the JSON files I stored there resembled scanning behaviour, which apparently is not allowed. So I was wondering where I could host JSON files so that my mobile app can read in the information it needs. The biggest thing I need is to host them at a static URL that I can keep calling from my app.
I was thinking of potentially putting the files in an AWS S3 bucket with read permissions and having them accessed from there, but since AWS already complained about something similar, I'm iffy. I was also thinking of putting the JSON files on GitHub, but again I'd hate to get an email from GitHub telling me they don't like that an application keeps grabbing the data.
For background, the app essentially has a hardcoded URL from which it grabs the JSON data and parses it. I didn't build an API, because an API takes some time to grab all the information, which doesn't really change that often; it's much easier to generate the JSON files locally and just post them online somewhere. The information can be read by anyone too; it's not private or anything.
Message from AWS:
Hello,
We've received a report(s) that your AWS resource(s)
information
has been implicated in activity which resembles scanning remote hosts on the internet for security vulnerabilities. Activity of this nature is forbidden in the AWS Acceptable Use Policy (https://aws.amazon.com/aup/). We've included the original report below for your review.
Please take action to stop the reported activity and reply directly to this email with details of the corrective actions you have taken. If you do not consider the activity described in these reports to be abusive, please reply to this email with details of your use case.
If you're unaware of this activity, it's possible that your environment has been compromised by an external attacker, or a vulnerability is allowing your machine to be used in a way that it was not intended.
We are unable to assist you with troubleshooting or technical inquiries. However, for guidance on securing your instance, we recommend reviewing the following resources:
I'm new so it won't let me post links but they attached a couple help links
If you require further assistance with this matter, you can take advantage of our developer forums:
more links I can't have
Or, if you are subscribed to a Premium Support package, you may reach out for one-on-one assistance here:
link
Please remember that you are responsible for ensuring that your instances and all applications are properly secured. If you require any further information to assist you in identifying or rectifying this issue, please let us know in a direct reply to this message.
Regards,
AWS Abuse
Abuse Case Number:
Using an AWS EC2 instance to host static files (which is what it sounds like you were doing?) is pretty standard and I suspect that this is not what Amazon is complaining about. More likely, your instance has been infected by some sort of software which is causing it to request many files from other random servers on the web ("scanning for remote vulnerabilities"). You should check that you have not accidentally publicly posted your AWS credentials (in any form), and consider wiping the instance and resetting it. And of course reply to the email explaining this to AWS.

Access 2007 security options to avoid redistribution

My question is as follows. When I receive a usage fee for an application I developed in Access 2007, I send the application out to my client, but how do I make sure that the client won't simply copy the database and redistribute it, thus letting the client's clients avoid the usage fee for the application?
I have put 128-bit encryption on the application to secure the data in the tables and have also converted it from .accdb to .accde to secure the forms, reports, queries, and VBA.
Also, I have them sign a legal document stating that the application cannot be redistributed unless authorized by me, but of course I'd rather they couldn't redistribute it even if they tried.
What are my options here? I thought about linking a license code (hand-made by me) to a certain MAC address that I can retrieve with VBA, and only making the database usable if they match. But would this even work, and is it easy to bypass?
Any help would be greatly appreciated,
thanks in advance for any suggestions/replies.
Edit: Thanks Dork Programmer for your reply.
In the end, I chose to use the drive volume number to grant access to the application. I am aware that this changes when the disk is formatted and that there is a slight possibility that it is not unique; however, I believe this will have to do, as I am unable to retrieve the manufacturer's hard drive serial number (which would be unique).
In conclusion: the client will give me their drive volume number, and I add this to a table that holds these numbers. I then apply all my security measures and send the client the app. When the application opens, it will only be usable if a match is found between the client's disk volume number and the values in the table; otherwise the app will close. Should a client decide to format or replace their disk, they can contact me, and I'll add the new number to the approved numbers table and send the app back to them.
I'm just sharing what I did in my MS Access application.
First, I created a form with VBA code behind it to enter the unique code.
Then I created a hidden table to store the unique code, and also to store the IP address/computer name of the machine where the database is located.
Based on my experience, this method is effective enough to keep users from copying the database or moving it to another computer.