Unable to see the "Properties" option when right-clicking the user database and a column under the table in Azure Data Studio [macOS]

Steps
1. Created a user database - DB1
2. Created a table - TB1 with columns cm1 and cm2
3. Installed the extension "Database Administration Tool Extension"
4. Relaunched the application
5. Right-clicked on the database DB1
Observation -> I could see some options like 'Manage', but not Properties
6. Right-clicked on the column 'cm1' under the created table TB1
Observation -> I could see only the Refresh option, but not Properties
7. Additional step: Tried searching for other relevant extensions but couldn't find any.
Question -> How can I get the Properties option enabled for databases and columns in Azure Data Studio on a Mac system?


SSIS - Loop Through Active Directory

Disclaimer: new to SSIS and Active Directory
I need to extract all users within a particular Active Directory (AD) domain and import them into Excel. I have followed this: https://www.itnota.com/query-ldap-in-visual-studio-ssis/ in order to create my SSIS package. My SQL is:
LDAP://DC=JOHN,DC=JANE,DC=DOE;(&(objectCategory=person)(objectClass=user)(name=a*));Name,sAMAccountName
As you know, there is a 1,000-row limit when pulling from AD. In my SQL I currently have (name=a*) to test the process, and it works. I need to know how to set up a loop with variables to pull all records and import them into Excel (or whatever you experts recommend). Also, how do I know what the other field names are that are available to pull?
Thanks in advance.
How do I see what's in Active Directory
Tool recommendations are off topic for the site, but a tool that you can download, no install required, is AD Explorer. It's a Microsoft tool that allows you to view your domain. I highly recommend that people who need to see what's in AD use something like this, as it shows you your basic structure.
What's my domain controller?
Start -> Command Prompt
Type set | find /i "userdnsdomain" and look for USERDNSDOMAIN. Put that value in the connect dialog; I save it because I don't want to enter it every time.
Search/Find and then look yourself up. Here I'm going to find my account by using my sAMAccountName.
The search results show only one user, but there could have been multiples since I did a contains match.
Double-clicking the value in the bottom results section causes the lower pane to update with the details of the search result.
This is nice because while the right side shows all the properties associated with my account, it also updates the left pane to navigate to the CN. In my case it's CN=Users, but again, it could be something else in your specific environment.
You might discover an interesting categorization for your particular domain. At a very large client, I discovered that my target users were all under a CN (Common Name), so I could use that in my AD query.
There are things you'll see here that you'd sure like to bring into a data flow but won't be able to. For example, memberOf is a complex type with no equivalent among the data flow data types. I think Integer8 is another one that didn't work.
Loop the loop
The "trick" here is that we'll need to take advantage of the
The name of the AD provider has changed since I last looked at this. In VS 2017, I see the OLE DB Provider name as "OLE DB Provider for Microsoft Directory Service"
Put in your query and you should get results back. Let that happen so the metadata is set.
An ADO.NET source does not support parameterization the way the OLE DB source does. However, you can apply an Expression on the Data Flow task, which surfaces the inner component's properties, and that's what we'll do.
Click out of the Data Flow and back into the Control Flow, then right-click on the Data Flow and select Properties. In that Properties window, find Expressions and click the ellipsis (...). Up pops the Property Expressions Editor.
Find the ADO.NET source under Property and, in the Expressions section, click the ellipsis.
Here, we'll use your same source query, just to prove we're doing the right things:
"LDAP://DC=JOHN,DC=JANE,DC=DOE;(&(objectCategory=person)(objectClass=user)(name=" + "a" + "*));Name,sAMAccountName"
We're doing string building here, so the problem we're left to solve is how to substitute something for the "a" in the above query.
The laziest route would be to:
Create an SSIS variable of type String called CurrentLetter and initialize it to a
Update the expression we just created to be "LDAP://DC=JOHN,DC=JANE,DC=DOE;(&(objectCategory=person)(objectClass=user)(name=" + @[User::CurrentLetter] + "*));Name,sAMAccountName"
Add a Foreach Loop Container (FELC) to your Control Flow.
Configure the FELC with an enumerator of "Foreach Item Enumerator"
Click the Columns...
Click Add (this results in Column 0 with data type String), then click OK
Fill the collection with each letter of the alphabet
In the Variable Mappings tab, assign Variable User::CurrentLetter to Index 0
Click OK
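With that in place, each iteration of the loop rewrites the source query before the Data Flow runs. For example, when CurrentLetter is "b", the expression evaluates to:
LDAP://DC=JOHN,DC=JANE,DC=DOE;(&(objectCategory=person)(objectClass=user)(name=b*));Name,sAMAccountName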
Old blog posts on the matter because I like clicks
https://billfellows.blogspot.com/2011/04/active-directory-ssis-data-source.html
http://billfellows.blogspot.com/2013/11/biml-active-directory-ssis-data-source.html

SQL Error [91016] [22000]: Remote file 'stage_name/java_udf.jar' was not found

I created a jar file and use it as a function. I created both the function and the Snowflake stage with the same user role, and uploaded the jar file to the stage using snowsql.
When I run the following command in the Snowflake UI (browser), it works:
ls @~/stage_name
However, when I use the service account, which has a similar role to mine, in DBeaver, it does not work; the listing comes up empty.
Same thing with the function: it works in the Snowflake UI but not in DBeaver. Please note that both users have the same role. I also granted "all privileges" and "usage" (which should be part of all) to the roles I want them to use. But again, it does not work. It shows the error below:
SQL Error [91016] [22000]: Remote file 'stage_name/java_udf.jar' was not found. If you are running a copy command, please make sure files are not deleted when they are being loaded or files are not being loaded into two different tables concurrently with auto purge option.
However, when I run the function in the Snowflake UI using my user account, it works fine. My user account has the same role as the service account, but the function doesn't work for the service account. Any ideas?
I followed the steps here in the documentation:
https://docs.snowflake.com/en/developer-guide/udf/java/udf-java-creating.html#label-udf-java-in-line-examples
So I think I know the issue.
The stage can be shared using the same role, but the files uploaded to the stage are not; they belong to the users who uploaded them. I just loaded exactly the same file to the same internal stage from both accounts, and the copies did not overwrite each other:
Service Account:
name: xxxxxxx.jar
size: 389568
md5: be8b59593ae8c4b8baebaa8474bda0a7
last_modified: Tue, 8 Feb 2022 03:26:29 GMT
User account:
name: xxxxxxx.jar
size: 389568
md5: 0c4d85a3a6581fa3007f0a4113570dbc
last_modified: Mon, 7 Feb 2022 17:03:58 GMT
@~ is the USER-LOCAL storage area only; thus, unless the automation runs as the "same" user, it will not be able to access files there.
This should be provable by taking the same "run" command that works from the WebUI for your user, logging in as the automation user, and seeing that you get the error there.
Reading that linked document in full, you can see that you should use a table stage or a named stage, which you can grant access to via the role you both have.
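If you upload the jar with snowsql, as in the question, point the upload at that shared named stage rather than at @~. A minimal sketch (my_stage and the local file path are placeholders):
-- upload to the named stage; UDF jars should be uploaded uncompressed
PUT file:///tmp/java_udf.jar @my_stage AUTO_COMPRESS=FALSE;
-- both accounts (via the shared role) can now see the file
LIST @my_stage;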
working proof:
on user simeon:
create or replace stage my_stage;
create or replace function echo_varchar(x varchar)
returns varchar
language java
called on null input
handler='TestFunc.echo_varchar'
target_path='@my_stage/testfunc.jar'
as
'class TestFunc {
public static String echo_varchar(String x) {
return x;
}
}';
create role my_role;
grant usage on function echo_varchar(varchar) to my_role;
grant all on stage my_stage to my_role;
grant usage on database test to my_role;
grant usage on schema not_test to my_role;
grant usage on warehouse compute_wh to my_role;
then I test it:
use role my_role;
select current_user(), current_role();
/*CURRENT_USER() CURRENT_ROLE()
SIMEON MY_ROLE*/
select test.not_test.echo_varchar('Hello');
/*TEST.NOT_TEST.ECHO_VARCHAR('HELLO')
Hello*/
I created a new user, test_two, and set them to role my_role.
on user test_two:
use role my_role;
select current_user(), current_role();
/*CURRENT_USER() CURRENT_ROLE()
TEST_TWO MY_ROLE*/
select test.not_test.echo_varchar('Hello');
/*TEST.NOT_TEST.ECHO_VARCHAR('HELLO')
Hello*/
OK, so a function put on an accessible stage works. Let's put another one on my user simeon's local stage @~.
on user simeon:
create or replace function echo_varcharb(x varchar)
returns varchar
language java
called on null input
handler='TestFuncB.echo_varcharb'
target_path='@~/testfuncb.jar'
as
'class TestFuncB {
public static String echo_varcharb(String x) {
return x;
}
}';
grant usage on function echo_varcharb(varchar) to my_role;
select test.not_test.echo_varcharb('Hello');
/*TEST.NOT_TEST.ECHO_VARCHARB('HELLO')
Hello*/
back on user test_two:
select test.not_test.echo_varcharb('Hello');
/*Remote file 'testfuncb.jar' was not found. If you are running a copy command, please make sure files are not deleted when they are being loaded or files are not being loaded into two different tables concurrently with auto purge option.*/

MySQL DB - New tables are created randomly

My DB has grown very large, to over 10GB.
I see these tables:
emcxmp_wp_posts
zjrqwg_wp_posts
qtlmkn_wp_posts
shcjpe_wp_posts
stzbcj_wp_posts
tymbkf_wp_posts
ursnzw_wp_posts
vkhjml_wp_posts
oyjfup_wp_posts
voxfcz_wp_posts
xlhpaz_wp_posts
ybazlk_wp_posts
yjmify_wp_posts
ymsaun_wp_posts
yojkzl_wp_posts
yqlfun_wp_posts
wouevx_wp_posts
msyfsp_wp_posts
kqbjhz_wp_posts
kqjsio_wp_posts
lnfjsf_wp_posts
asvpky_wp_posts
bltyyt_wp_posts
cyuhqr_wp_posts
eudjso_wp_posts
And more. What happened, and why were these tables created?
This is the intended behaviour of the plugin WP Reset Pro.
It creates snapshots with names like yours:
the naming template for snapshot tables is {6_char_random_hex}{table_prefix_for_your_site}{original_table_name}
If you no longer need a specific snapshot on your WordPress site, you can quickly delete it:
Open Tools -> WP Reset -> Snapshots
Scroll down to the "User Created Snapshots" card
Select a snapshot & open the "Action" menu
Click "Delete snapshot"
Confirm that you want to delete it by clicking the red button
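If you want to see how much space those snapshot tables take up before deleting them, you can check directly in MySQL. A sketch, assuming your WordPress database is named your_database (a placeholder) and the six-random-character prefix naming shown above:
-- list snapshot tables with their approximate size in MB
SELECT table_name,
       ROUND((data_length + index_length) / 1024 / 1024, 1) AS size_mb
FROM information_schema.tables
WHERE table_schema = 'your_database'
  AND table_name LIKE '______\_wp\_%'  -- six random characters, then _wp_...
ORDER BY size_mb DESC;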

How to prevent ERROR 2013 (HY000) using Aurora Serverless

When performing long-running queries on Aurora Serverless I have seen the following errors after a few minutes:
ERROR 1080 (08S01) Forcing close of thread
ERROR 2013 (HY000) Lost connection to MySQL server during query
The queries use MySQL's LOAD DATA LOCAL INFILE to load large (multi-GB) data files into the database.
How can I avoid these errors?
To solve this, you can change the parameter group item net_write_timeout to a more suitable value. Here are instructions for completing the steps from the console:
Go to RDS Console
Click "Parameter Groups" in the left pane
Click "Create Parameter Group"
On the Parameter Group Details page, for Type, choose DB Cluster Parameter Group; then give it a name and description, and click "Create"
Click on the name of the parameter group you created in step 4
Search for "net_write_timeout"
Click on the checkbox next to the parameter and click "Edit Parameters"
Change the value to an integer between 1 and 31536000 for the number of seconds you want it to wait before timing out, and click "Save Changes"
Click on Databases in the left pane
Click on the database and click "modify"
Under Additional Configuration > Database Options > DB Cluster Parameter Group, select the parameter group you created in step 4, and click "Continue"
Select "Apply Immediately" and click "Modify Cluster"
Break up your large, multi-GB uploads into smaller chunks. Aurora works better (and faster) loading one hundred 10MB files at once rather than one 1GB file. Assuming your data is already in a loadable format:
Split the file into parts using split
split -n l/100 --additional-suffix="_small" big_file.txt
This results in 100 files like xaa_small, xab_small, etc.
Find the files that match the split suffix using find
files=$(find . -name 'x*_small')
Loop through the files and load each in parallel
for file in $files; do
  echo "load data local infile '$file' into table my_table;" |
    mysql --defaults-file=/home/ubuntu/.my.cnf --defaults-group-suffix=test &
done
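One prerequisite worth checking: LOAD DATA LOCAL INFILE only succeeds when local data loading is enabled on the server (and on the client; the mysql CLI accepts --local-infile=1). On Aurora the server side is controlled through the parameter group, and you can verify it in SQL:
-- should report ON for LOAD DATA LOCAL INFILE to work
SHOW GLOBAL VARIABLES LIKE 'local_infile';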

SQL database backup showing error "Exception of type 'System.OutOfMemoryException' was thrown. (mscorlib)"

I need to take a backup of a SQL database with data. I select
Tasks -> Generate Scripts... -> Next -> Next, and in the table view options I change Script Data from FALSE to TRUE -> Next -> Select All -> Script to New Query Window -> Finish.
But it ends up with an error:
"Exception of type 'System.OutOfMemoryException' was thrown. (mscorlib)".
I have checked the free space on my drives: the C drive has more than 10 GB and the D drive more than 3 GB, but the database is only 500MB. The error shows for one particular table, "TBL_SUMM_MENUOPTION", which is an empty table. How can I fix this issue and take the database backup without the error?
As @Alejandro stated in his comment, instead of taking the backup using Tasks -> Generate Scripts... -> Next -> Next (changing Script Data from FALSE to TRUE) -> Next -> Select All -> Script to New Query Window -> Finish, I took the backup using Tasks -> Back Up... -> Database, which is very simple.
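For reference, the same native backup can be run as T-SQL. A minimal sketch, where the database name and target path are placeholders to replace with your own:
-- full native backup to a .bak file (a binary backup, not a script)
BACKUP DATABASE [YourDatabase]
TO DISK = N'D:\Backups\YourDatabase.bak'
WITH INIT, STATS = 10;
-- INIT overwrites any existing file at that path; STATS = 10 prints progress every 10%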