Get list of all remote sessions? - powershell-remoting

I would like to be able to remote into a machine and then list all sessions connected to that machine that are also PowerShell remoting sessions. I'd like to be able to grab their session ID, the state of the session, etc.
Is there a good way to do that?
To clarify, I'm not asking how to remote into the machine; I'm asking how to detect remote PowerShell sessions once I'm already remotely connected.

Get-PSSession -ComputerName <name> will list the PowerShell sessions on the specified computer, including each session's Id, Name, State, and Availability.

Related

How to access a MySQL database that is on another machine in a different location (NOT LOCAL) with Python

I am finished with my project and now I want to put it on my website where people can download it and use it. My project is connected to my MySQL database and works on my machine. On my machine, I can read and modify my database with Python. It obviously will not work if a person from another country tries to access it. How can I make it so that a person from another town, city, or country could access my database and be able to read it?
I tried using SSH but I feel like it only works on a local network.
I have not written a single line of code on this matter because I have no clue how to get started.
I probably know how to read my database on a local network but I have no clue how to access it from anywhere else.
Any help, tips, or solutions would be great and appreciated.
Thank you!
If I'm understanding correctly, you want to run a MySQL server from your home PC and allow others to connect and access data? Well, you would need to make sure the correct port is forwarded in your router and firewall (the default is TCP 3306), then simply provide the user with your current IP address (which could change).
Determine the port the MySQL server is listening on.
Allow port forwarding on the TCP protocol for the port you determined; the default is 3306.
Allow incoming connections on this port through your software firewall, if any.
Provide the user with your current IP address, the port, and the database name.
If you set login credentials, make sure the user has those as well.
That's it. The user should be able to connect with the IP address, port, database name, username, and password.
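For reference, once the port is forwarded, the remote user's Python code looks the same as a local connection, just pointed at your public IP. This is a minimal sketch, assuming the mysql-connector-python package is installed and using placeholder host, database, and credentials:

# Connect to the remote MySQL server over the forwarded port.
import mysql.connector

conn = mysql.connector.connect(
    host="203.0.113.10",    # your current public IP address (placeholder, may change)
    port=3306,              # the forwarded MySQL port
    user="remote_user",     # placeholder account created for the remote user
    password="secret",      # placeholder password
    database="mydatabase",  # placeholder database name
)
cursor = conn.cursor()
cursor.execute("SELECT * FROM some_table LIMIT 5")  # placeholder table
for row in cursor.fetchall():
    print(row)
cursor.close()
conn.close()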

Using an SSH tunnel to connect to a remote MySQL database from Node-RED

I have a set of data rolling out of Node-RED that I want to send to a remote MySQL database. The Node-RED system is running on a Raspberry Pi. How do I make this work? I know how to do it using Node.js, but I'm not sure how to do this in Node-RED. The IP address of the Pi is dynamic, so simply authorizing its IP address sadly does not work.
Thanks in advance!
EDIT for clarification:
I want to connect to a remote MySQL database that is hosted by my web host. I have connected a Raspberry Pi to a battery, and I want to save this information in the aforementioned database. Since there will be several battery setups in different locations, I cannot save the data locally. So, one way or another, I need to access the remote database through Node-RED. Authorizing one IP address doesn't work, since the IP of the Raspberry Pi's network is dynamic and thus changes. I think an SSH tunnel might be the solution, but I have no idea how to do this in Node-RED, and Google isn't very helpful.
OK, so as I said in the comments, you can create a MySQL username/password pair that is granted permission to connect from any IP address (which is less secure if the username/password is compromised; set the host to '%' to allow all hosts when setting up the grant options).
To reduce the risk, you can restrict the username/password to a specific subnet. This could be a Wi-Fi network or the subnet associated with the public IP range of the cellular provider you may be using (it needs to be the public range, as nearly all cellular ISPs use CGNAT). (See this question for details: How to grant remote access to MySQL for a whole subnet?)
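For illustration, the grant itself could also be issued from Python rather than the mysql command-line client; this is a minimal sketch assuming the PyMySQL package, an existing admin account, and placeholder subnet, database, and user names:

# Create an account that may connect only from a specific subnet.
import pymysql

admin = pymysql.connect(host="127.0.0.1", user="root", password="admin_password")
with admin.cursor() as cur:
    # '%' alone would allow any host; the subnet wildcard narrows the exposure.
    cur.execute("CREATE USER 'battery'@'203.0.113.%' IDENTIFIED BY 'strong_password'")
    cur.execute("GRANT SELECT, INSERT ON batterydb.* TO 'battery'@'203.0.113.%'")
    cur.execute("FLUSH PRIVILEGES")
admin.close()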
If you want to use an SSH tunnel, then this will normally be done outside Node-RED with the ssh command line, e.g.
ssh -L localhost:3306:localhost:3306 remote.host.com
Then configure the Node-RED MySQL node to point to localhost.
Since the connection will look like it's coming from localhost on the MySQL machine, you need to make sure the username/password is locked down to that host.
You will probably also want to set up public/private key authentication for the ssh connection.
You may be able to run the ssh command in the node-red-daemon node, which should restart the connection if it gets dropped.
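For testing outside Node-RED, the same forward-then-connect pattern that the ssh -L command sets up can be sketched in Python; this assumes the sshtunnel and PyMySQL packages and uses placeholder hosts, key path, and credentials:

# Open an SSH tunnel to the MySQL host, then connect through its local end.
import pymysql
from sshtunnel import SSHTunnelForwarder

with SSHTunnelForwarder(
    ("remote.host.com", 22),                  # host running MySQL (placeholder)
    ssh_username="pi",                        # placeholder SSH user
    ssh_pkey="/home/pi/.ssh/id_rsa",          # public/private key auth, as above
    remote_bind_address=("127.0.0.1", 3306),  # MySQL listening on localhost remotely
) as tunnel:
    conn = pymysql.connect(
        host="127.0.0.1",
        port=tunnel.local_bind_port,  # local end of the tunnel
        user="battery",               # placeholder credentials, locked to localhost
        password="strong_password",
        database="batterydb",
    )
    with conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchone())
    conn.close()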

Allowing users to connect with SSH without having sudo access?

Here's what I'm trying to do: set up a backup server on Google Compute Engine where employees at my company can have their computers back up nightly via rdiffbackup. A cron job will run rdiffbackup, which uses SSH to send files just like SCP.
With a "normal" server, I can create a new user for each employee and set permissions so they cannot read another employee's files.
It seems like using the "gcloud compute ssh" tool, or configuring regular ssh using "gcloud compute config-ssh", only allows connections from users who have been added to the project and have connected their computer to their Google account. My issue with this is that I don't see a way for a user to have read-write access on a server without also being a sudoer (as far as I know, anyone added to a project with "Can Edit" can get sudo). Obviously, if they have sudo, they can read others' files.
Can I give someone the ability to SSH remotely without having sudo? Thank you.
I recommend avoiding gcloud altogether for this. gcloud's SSH tools are geared towards easily administering a constantly changing set of machines in your project; they are not made to cover every use case that would also use SSH.
Instead, I recommend you set up your backup service as you would a normal server:
assign a static address
(optional) assign a dns name
set up users on the box using adduser
You have a couple of options:
1) You can manage non-root users on your instances as you would on any normal Linux machine, by manually adding them with standard commands like 'adduser', not via the gsutil/UI/metadata update path.
2) Alternatively, if you need to manage a large cluster of machines, you can disable the ACL management provided by Google entirely and run your own LDAP server for this. The file that is responsible for the account updates, and that needs to be disabled, is this one:
https://github.com/GoogleCloudPlatform/compute-image-packages/blob/master/google-daemon/etc/init/google-accounts-manager-service.conf
3) Finally, you can lock down write access for the root users, i.e. disable writes propagating from the metadata server, by setting the immutable flag on the sudoers file: 'chattr +i /etc/sudoers'. It's not a graceful solution, but it is effective. This way you lock in root for the already-added users, and any new users will be added without root privileges; any new root-level user needs to be added manually, machine by machine, though.

Is it possible to specify the IP address for a second database connection on Heroku

I have a Heroku app and use Heroku Postgres for the main repo. We want to access a second database in a read only manner for some data that is hosted on a third party.
I have seen the answer about multiple databases here, but it doesn't solve our static IP issue: How to use multiple databases for one rails 3.1 app in Heroku? (i.e. a SECOND_DATABASE_URL). I should mention that this is read-only, so much of the ActiveRecord infrastructure is somewhat irrelevant.
We really just want to make a connection, get back a hash, and then insert locally (never doing migrations against this second database), etc.
However, the read-only user they are going to create needs an IP address. Is there any way for me to get a static IP or proxy these MySQL requests easily?
I have seen QuotaGuard Static, but that looks mainly geared toward HTTP requests. Or could I use it in this scenario?

Remote (Non-LocalHost) MySQL Calls... Safe/Recommended for Management Purposes?

I'm new to MySQL and I'm using a desktop DB management app called "Querious" to simplify the process while I learn.
I want to work on (mainly just structure & basic population) a database that's hosted elsewhere, but the host won't allow any remote MySQL calls on their server.
What is their reasoning for restricting MySQL calls to localhost only? Is this a security or a performance concern?
This is a security concern. The idea is that if people can't connect remotely, an attacker has to compromise the system itself, not just the files that hold the database information.
You may be able to request that they just add your IP address to a trusted host file, but I doubt they'll do that either.
It's fairly common practice not to allow remote DB connections.
I've run into this problem with GoDaddy, where they implement this by default. You can change it, however, by indicating that you want to allow remote access. If you've already created your DB, though, you can't change it, so I would recommend creating a new DB and deleting your other one.
The reason is security. If only your app can call your DB, you don't have to worry about other people trying to access it.
Distill,
An improperly-configured MySQL instance is dangerous, whether the user is remote or local. This could allow malicious attackers to cause crashes or remote execution of arbitrary code (i.e., owning the machine).
You can use PuTTY to create a tunnel, if the server allows it, so that your application traffic goes through SSH and is then forwarded to the correct port on localhost.