Azure Data Lake Store - File permission update - azure-cli

How do I update the permissions for a file? I want to make it read-only.
Please help!

az dls fs access set-permission --account [your account] --path [your file path] --permission [permission you want]
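For example, a sketch assuming a hypothetical account named mydatalake, using POSIX-style octal permissions (444 makes the file read-only for owner, group, and others):
az dls fs access set-permission --account mydatalake --path /folder/myfile.txt --permission 444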

Related

Specify az active subscription statically - IaC style

Can az cli pick up the current subscription from a bicep file or a local "ansible.cfg" style config file? I don't want to set it from the cli, I want it to be picked up from some local definition.
Right now, the order of deployment commands I need to execute is:
cd ./staging
az login
az account set --subscription <id>
az deployment sub create ...
But az account set --subscription <id> is redundant; I already have subs/envs separated by folders, so from whatever folder I am deploying, I already know which sub I am using. It is statically defined:
staging/main.bicep
targetScope = 'subscription'
var subscriptionId = '00ta4479...'
module poc '../../../main.bicep' = {
  name: 'poc'
  scope: subscription(subscriptionId)
}

Windows batch file - connect to remote MySQL database save resulting text Output

I normally work with PHP/MySQL. A client wants to send variables from a .bat file to a remote MySQL database, where I will then manipulate them for display etc. I do not know how to connect and send these variables from a .bat file in Windows.
I have a small .bat file on Windows that simply writes a few variables to a text file.
@echo off
echo Data: > test.txt
echo VAR_1=777 >> test.txt
echo VAR_2=245.67 >> test.txt
The result of the .bat file is a text file test.txt created with various details in it.
I would like the .bat file commands to also:
1) connect to a remote MySQL database
connect -> '8580922.hostedresource.com'
2) save to a basic table on a remote MySQL database:
INSERT INTO `My_Database`.`My_Table` (
`VAR_1`,
`VAR_2`
)
VALUES (
'777',
'245.67'
);
Is this possible?
If so, how?
I don't have MySQL installed and I'm not familiar with it, but here is a crack at something to try, based on info from the linked page.
REM This needs to be set to the right path
set bin=C:\Program Files\MySQL\MySQL Server 5.6\bin
REM set the host name and db
SET DBHOST=8580922.hostedresource.com
SET DBNAME=MyDatabase
REM set the variables and the SQL
SET VAR_1=777
SET VAR_2=245.67
SET SQL="INSERT INTO `My_Database`.`My_Table` (`VAR_1`,`VAR_2`) VALUES ('%VAR_1%','%VAR_2%');"
"%bin%\mysql" -e %SQL% --user=NAME_OF_USER --password=PASSWORD -h %DBHOST% %DBNAME%
PAUSE
Please try that and post back the resulting error message. There are many reasons that it won't work, but you need to try it to find out.
I'm not sure where test.txt comes into this, but it would be a good idea to export the whole SQL statement to a text file and then have the mysql client run that file, instead of generating the SQL inside the batch file; see the sketch after the link below.
There's a bit more here.
connecting to MySQL from the command line
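A minimal, untested sketch of that file-based approach, reusing the same placeholder credentials and variables as above:
REM Write the statement to a file, then have mysql execute the file
echo INSERT INTO `My_Database`.`My_Table` (`VAR_1`,`VAR_2`) VALUES ('%VAR_1%','%VAR_2%'); > insert.sql
"%bin%\mysql" --user=NAME_OF_USER --password=PASSWORD -h %DBHOST% %DBNAME% < insert.sql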

Boto3 Error: botocore.exceptions.NoCredentialsError: Unable to locate credentials

When I simply run the following code, I always get this error.
import uuid
import boto3

s3 = boto3.resource('s3')
bucket_name = "python-sdk-sample-%s" % uuid.uuid4()
print("Creating new bucket with name:", bucket_name)
s3.create_bucket(Bucket=bucket_name)
I have saved my credential file in
C:\Users\myname\.aws\credentials, from where Boto should read my credentials.
Is my setting wrong?
Here is the output from boto3.set_stream_logger('botocore', level='DEBUG').
2015-10-24 14:22:28,761 botocore.credentials [DEBUG] Skipping environment variable credential check because profile name was explicitly set.
2015-10-24 14:22:28,761 botocore.credentials [DEBUG] Looking for credentials via: env
2015-10-24 14:22:28,773 botocore.credentials [DEBUG] Looking for credentials via: shared-credentials-file
2015-10-24 14:22:28,774 botocore.credentials [DEBUG] Looking for credentials via: config-file
2015-10-24 14:22:28,774 botocore.credentials [DEBUG] Looking for credentials via: ec2-credentials-file
2015-10-24 14:22:28,774 botocore.credentials [DEBUG] Looking for credentials via: boto-config
2015-10-24 14:22:28,774 botocore.credentials [DEBUG] Looking for credentials via: iam-role
Try specifying the keys manually:
import boto3

s3 = boto3.resource(
    's3',
    aws_access_key_id=ACCESS_ID,
    aws_secret_access_key=ACCESS_KEY
)
Make sure you don't include your ACCESS_ID and ACCESS_KEY in the code directly, for security reasons.
Consider using environment configs and injecting them into the code, as suggested by @Tiger_Mike.
For Prod environments consider using rotating access keys:
https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html#Using_RotateAccessKey
I had the same issue and found out that the format of my ~/.aws/credentials file was wrong.
It worked with a file containing:
[default]
aws_access_key_id=XXXXXXXXXXXXXX
aws_secret_access_key=YYYYYYYYYYYYYYYYYYYYYYYYYYY
Note that there must be a profile named "[default]". Some official documentation makes reference to a profile named "[credentials]", which did not work for me.
If you are looking for an alternative way, try adding your credentials using
AmazonCLI
From the terminal, type:
aws configure
Then fill in your keys and region.
Make sure your ~/.aws/credentials file in Unix looks like this:
[MyProfile1]
aws_access_key_id = yourAccessId
aws_secret_access_key = yourSecretKey
[MyProfile2]
aws_access_key_id = yourAccessId
aws_secret_access_key = yourSecretKey
Your Python script should look like this, and it'll work:
from __future__ import print_function
import boto3
import os
os.environ['AWS_PROFILE'] = "MyProfile1"
os.environ['AWS_DEFAULT_REGION'] = "us-east-1"
ec2 = boto3.client('ec2')
# Retrieves all regions/endpoints that work with EC2
response = ec2.describe_regions()
print('Regions:', response['Regions'])
Source: https://boto3.readthedocs.io/en/latest/guide/configuration.html#interactive-configuration.
I also had the same issue; it can be solved by creating config and credentials files in the home directory. Below are the steps I took to solve it.
Create a config file:
touch ~/.aws/config
And in that file enter the region:
[default]
region = us-west-2
Then create the credential file:
touch ~/.aws/credentials
Then enter your credentials:
[Profile1]
aws_access_key_id = XXXXXXXXXXXXXXXXXXXX
aws_secret_access_key = YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY
After setting all of these, here is my Python file to connect to the bucket. Running this file will list all the buckets.
import boto3
import os
os.environ['AWS_PROFILE'] = "Profile1"
os.environ['AWS_DEFAULT_REGION'] = "us-west-2"
s3 = boto3.client('s3', region_name='us-west-2')
print("[INFO:] Connecting to cloud")
# List all the buckets in the account
response = s3.list_buckets()
print('Buckets:', response)
You can also refer to the links below:
Amazon S3 with Python Boto3 Library
Boto 3 documentation
Boto3: Amazon S3 as Python Object Store
From the terminal, type:
aws configure
Then fill in your keys and region.
After this, you can use any environment. You can have multiple keys depending on your account, and can manage multiple environments or keys through named profiles:
import boto3
aws_session = boto3.Session(profile_name="prod")
# Create an S3 client
s3 = aws_session.client('s3')
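A short usage sketch with the session-scoped client created above (this assumes a profile named "prod" exists in ~/.aws/credentials):
# List the names of the buckets visible to the "prod" profile
response = s3.list_buckets()
print([bucket['Name'] for bucket in response['Buckets']])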
Create an S3 client object with your credentials:
AWS_S3_CREDS = {
    "aws_access_key_id": "your access key",  # os.getenv("AWS_ACCESS_KEY")
    "aws_secret_access_key": "your aws secret key"  # os.getenv("AWS_SECRET_KEY")
}
s3_client = boto3.client('s3', **AWS_S3_CREDS)
It is always better to get credentials from the OS environment.
To set the environment variables, run the following commands in the terminal.
If Linux or Mac:
$ export AWS_ACCESS_KEY="aws_access_key"
$ export AWS_SECRET_KEY="aws_secret_key"
If Windows:
C:\> set AWS_ACCESS_KEY="aws_access_key"
C:\> set AWS_SECRET_KEY="aws_secret_key"
Exporting the credentials also works. In Linux:
export AWS_SECRET_ACCESS_KEY="XXXXXXXXXXXX"
export AWS_ACCESS_KEY_ID="XXXXXXXXXXX"
These instructions are for a Windows machine with a single user profile for AWS. Make sure your ~/.aws/credentials file looks like this:
[profile_name]
aws_access_key_id = yourAccessId
aws_secret_access_key = yourSecretKey
I had to set the AWS_DEFAULT_PROFILE environment variable to the profile_name found in your credentials file.
Then my Python was able to connect, e.g. from here:
import boto3
# Let's use Amazon S3
s3 = boto3.resource('s3')
# Print out bucket names
for bucket in s3.buckets.all():
    print(bucket.name)
I work for a large corporation and encountered this same error, but needed a different workaround. My issue was related to proxy settings: I had my proxy set up, so I needed to set my no_proxy to whitelist AWS before I was able to get everything to work. You can set it in your bash script as well if you don't want to muddy up your Python code with os settings.
Python:
import os
os.environ["NO_PROXY"] = "s3.amazonaws.com"
Bash:
export no_proxy="s3.amazonaws.com"
Edit: The above assumes the US East S3 region. For other regions, use s3.[region].amazonaws.com, where region is something like us-east-1 or us-west-2.
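For example, a sketch for the us-west-2 region:
import os
os.environ["NO_PROXY"] = "s3.us-west-2.amazonaws.com"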
If you have multiple AWS profiles in ~/.aws/credentials like...
[Profile 1]
aws_access_key_id = *******************
aws_secret_access_key = ******************************************
[Profile 2]
aws_access_key_id = *******************
aws_secret_access_key = ******************************************
Follow two steps:
Make the one you want to use the default by running export AWS_DEFAULT_PROFILE="Profile 1" in the terminal.
Make sure to run the above command in the same terminal from which you use boto3 or open your editor. To understand why, consider the following scenario:
Scenario:
You have two terminals open, t1 and t2. If you run the export command in t1 and open JupyterLab or any other tool from t2, you will get the NoCredentialsError: Unable to locate credentials error.
Solution:
Run the export command in t1, and then open JupyterLab or any other tool from the same terminal, t1.
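If you cannot guarantee the same terminal, one workaround (a sketch reusing the hypothetical profile name from above) is to select the profile from inside Python, before any boto3 client or resource is created:
import os
# Equivalent to the export above, but scoped to this Python process only
os.environ['AWS_DEFAULT_PROFILE'] = 'Profile 1'

import boto3
s3 = boto3.resource('s3')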
In the case of MLflow, a call to mlflow.log_artifact() will raise this error if you cannot write to the AWS S3/MinIO data lake.
The reason is credentials not being set up in your Python environment (as these two environment variables):
os.environ['DATA_AWS_ACCESS_KEY_ID'] = 'login'
os.environ['DATA_AWS_SECRET_ACCESS_KEY'] = 'password'
Note that you may also access MLflow artifacts directly using the minio client (which requires a separate connection to the data lake, apart from MLflow's connection). This client can be started like this:
import os
import minio

minio_client_mlflow = minio.Minio(
    os.environ['MLFLOW_S3_ENDPOINT_URL'].split('://')[1],
    access_key=os.environ['AWS_ACCESS_KEY_ID'],
    secret_key=os.environ['AWS_SECRET_ACCESS_KEY'],
    secure=False)
I have solved the problem like this:
aws configure
Afterwards I manually entered:
AWS Access Key ID [None]: xxxxxxxxxx
AWS Secret Access Key [None]: xxxxxxxxxx
Default region name [None]: us-east-1
Default output format [None]: (just hit Enter)
After that it worked for me.
Boto3 looks for the credentials in a folder like
C:\ProgramData\Anaconda3\envs\tensorflow\Lib\site-packages\botocore\.aws
You should save two files in this folder: credentials and config.
You may want to check out the general order in which boto3 searches for credentials in this link. Look under the "Configuring Credentials" subheading.
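To see which provider in that chain actually supplied credentials (if any), a small diagnostic sketch:
import boto3

# get_credentials() returns None when the whole provider chain comes up empty
creds = boto3.Session().get_credentials()
print(creds.method if creds else "No credentials found")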
If you're sure you configured your AWS correctly, just make sure the user running the project can read from ~/.aws, or run your project as root.
I just had this problem. This is what worked for me:
pip install botocore==1.13.20
Source: https://github.com/boto/botocore/issues/1892
In case of using AWS EC2:
In my case, I had to add the following policy to the IAM role to allow EC2 tags to be read by the EC2 instances. That eliminated the Unable to locate credentials error:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "ec2:DescribeTags",
            "Resource": "*"
        }
    ]
}

Local BLAST Swissprot Database error

I am trying to run the standalone ncbi-blast-2.2.28+ on my machine (Mac), but I get this error message when running blastp with the SwissProt database:
BLAST Database error: Could not find volume or alias file (nr.00) referenced in alias file (/Users/me/bin/db/swissprot.00).
Here is what I did:
1) Downloaded "ncbi-blast-2.2.28+-universal-macosx.tar.gz" from the NCBI server and decompressed it
2) Moved the bin content of the folder to my $PATH directory "/Users/me/bin"
3) In "/Users/me/bin" I created a "db" folder, plus the ".ncbirc" file containing the following path:
[BLAST]
BLASTDB=/Users/me/bin/db
4) I downloaded the SwissProt database and got the following files in "/Users/me/bin/db/":
swissprot.00.msk
swissprot.01.msk
swissprot.02.msk
swissprot.03.msk
swissprot.04.msk
swissprot.05.msk
swissprot.06.msk
swissprot.07.msk
swissprot.08.msk
swissprot.09.msk
swissprot.10.msk
swissprot.00.pal
swissprot.01.pal
swissprot.02.pal
swissprot.03.pal
swissprot.04.pal
swissprot.05.pal
swissprot.06.pal
swissprot.07.pal
swissprot.08.pal
swissprot.09.pal
swissprot.10.pal
swissprot.pal
Then when I run blastp from any working directory (where my query file is), using this command:
blastp -query input.fasta -db swissprot
I get the following error message:
BLAST Database error: Could not find volume or alias file (nr.00) referenced in alias file (/Users/me/bin/db/swissprot.00).
As I read on other threads, I also tried giving the whole path to where the db is located on the command line, and removing the .pal extension from the file names, but it still doesn't work.
Can someone see what I did wrong?
You are storing your database files in the db folder, so you have to use this command instead of the one you used:
blastp -query input.fasta -db db/swissprot
And I believe you are looking for output in the console itself, as you haven't used the -out option.
Also, this will work only if the bin directory in which the db folder is present is declared as an environment variable.
Have you checked the paths in the .pal file?
The SwissProt database that you have downloaded contains only links to entries in the nr database: "nr - Non-redundant GenBank CDS translations + PDB + SwissProt + PIR + PRF, excluding those in env_nr". So you should additionally download the nr database to run standalone BLAST on your machine with the SwissProt database. It weighs about 20 (!) GB, but without it your BLAST will not work. Here's a link: ftp://ftp.ncbi.nlm.nih.gov/blast/db/
Place all the files from the 00 to 10 volumes into db, and then check the .pal file: it should contain the 00 to 10 parts. For example, for the nr database it looks like:
"nr.00" "nr.01" "nr.02" "nr.03" "nr.04" "nr.05" "nr.06" "nr.07" "nr.08" "nr.09" "nr.10"

Executing a SQL Server Script from a batch file

I have a script that I need to execute using a batch file. Do I use SQLCMD in the batch file to run the script? Also, the script inserts data into a table in a database. How should I format the SQLCMD call in the batch file so it knows which database it is supposed to work with?
First, save your query into an sql text file (text file with .sql extension). Make sure to add the USE statement at the beginning, which tells the server which database you want to work with. Using the example from MSDN:
USE AdventureWorks2008R2;
GO
SELECT p.FirstName + ' ' + p.LastName AS 'Employee Name',
       a.AddressLine1, a.AddressLine2, a.City, a.PostalCode
FROM Person.Person AS p
INNER JOIN HumanResources.Employee AS e
    ON p.BusinessEntityID = e.BusinessEntityID
INNER JOIN Person.BusinessEntityAddress AS bea
    ON bea.BusinessEntityID = e.BusinessEntityID
INNER JOIN Person.Address AS a
    ON a.AddressID = bea.AddressID;
GO
Then in your batch file, you run SQLCMD and pass it the sql file (with path) as a parameter.
sqlcmd -S myServer\instanceName -i C:\myScript.sql
If you need to authenticate as well, you'll need to add the -U and -P parameters to your SQLCMD command.
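For example, with SQL Server authentication (myUser and myPassword are placeholder credentials):
sqlcmd -S myServer\instanceName -U myUser -P myPassword -i C:\myScript.sql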
Here's an MSDN article dealing with the sqlcmd utility in more detail.
Use the -S switch to specify server and instance names, e.g. -S MyDbServer\Database1
SQLCMD documentation found here.
If you want to execute all .sql files in a folder (multiple SQL scripts) against multiple databases, create a batch file "RunScript-All.bat" with the content below:
echo "======Start - Running scripts for master database======="
Call RunScript-master.bat
echo "=======End - Running scripts for master database=========="
pause
echo "=====Start - Running scripts for model database========"
Call RunScript-model.bat
echo "=======End - Running scripts for master database=========="
pause
An individual batch file for a specific database, e.g. "RunScript-master.bat", can be defined as below:
for %%G in (*.sql) do sqlcmd /S .\SQL2014 /U sa /P XXXXXXXXX /d master -i"%%G"
::pause
Create more such files for different databases and call them from "RunScript-All.bat".
Now you will be able to run all the SQL scripts against many databases by clicking on the "RunScript-All.bat" batch file.