I have a little understanding of how server-sent events work. Say I have a Linux server (a remote server) and I need to continuously monitor its CPU usage from my local machine (via an HTML page on my local machine). Will I be able to get the CPU usage continuously from the server to the local machine using SSE? If so, I need some clarification on how to do so. Or are there any alternatives I could go with that don't involve installing extra software?
You'll need to write some code to run on the server, which will gather whatever data you need. (Common choices include Node.js, PHP, etc.)
That code will need to either directly serve HTTP requests, or connect to a web server.
Your code will send data in this format:
event: someevent
data: {"key": "value"}
Then, your client-side will use EventSource:
const eventSource = new EventSource('https://example.com/your-sse-path');
eventSource.addEventListener('someevent', (e) => {
console.log(JSON.parse(e.data));
});
If I want to live broadcast some work in a text editor embedded in a virtual terminal on my personal computer, I can stream a video of the window containing it on the web.
But since the information consists mainly of a bunch of characters, possibly with some colors and formatting, I think video is a waste of resources, bandwidth- and technology-wise.
What would you recommend for this, and is there a server implementing the solution somewhere?
The requirements are:
the stream must be almost real time (at least 1 update per second and no more than 1 second of delay)
the audience can access the stream with only a web browser (no additional software on their side), read-only (no interaction with the stream or with my terminal)
features from, say, xterm or urxvt should be supported
all necessary software (both the streamer's client side and any potential server side) is open source
Comments on the technical advantages of such a tool compared to video streaming are welcome.
I finally took the time to implement a complete solution, using socket.io within a simple NodeJS server for broadcasting.
On the client side, serve a simple HTML page with an Xterm.js terminal:
<script src='/socket.io/socket.io.js'></script>
<script src="xterm/xterm.js"></script>
...
<div class="terminal" id="terminal"></div>
and script the synchronization along the lines of
var socket = io();
var term = new Terminal();
term.open(document.getElementById('terminal'));
socket.on("updateTerminal", function(data) {
  term.write(data);
});
Now, the data passed to term.write of Xterm.js can be raw terminal data. Several UNIX utilities can capture such raw data from a terminal, for instance tmux, as proposed by jerch in the comments, or script.
To pass these data to the server for broadcasting, the easiest way is to use a named pipe; so on the server side
mkfifo server_pipe
script -f server_pipe
(the terminal issuing that last command will be the one broadcasting; if one does not have physical access to the server, one can use an additional pipe and a tunneling connection
mkfifo local_pipe
cat local_pipe | ssh <server> 'cat > path/to/server_pipe'&
script -f local_pipe
)
Finally, the NodeJS server must listen to the named pipe and broadcast any new data:
/* create server */
const http = require('http');
const server = http.createServer(function (request, response) {
...
});
/* open named pipe for reading */
const fs = require('fs');
const fd = fs.openSync("path/to/server_pipe", 'r+')
const termStream = fs.createReadStream(null, {fd});
termStream.setEncoding('utf8');
/* broadcast any new data with socket.io */
const iolib = require("socket.io");
const io = iolib(server);
termStream.on('data', function(data) {
  io.emit("updateTerminal", data);
});
server.listen(8080); // use whatever port serves the HTML page
All of this mechanism is implemented in my software Remote lecture.
As for the comparison with video broadcasting, I did not take the time to actually quantify the difference, but for equivalent resolution and latency, the above mechanism should use much less network and computing resources than capturing a graphical terminal's output and sharing it as video.
I'm new to Svelte and would like to create an ordering website using it. I know that I will need a database to keep track of the orders, customer names, prices, etc. I have used MySQL before, but I haven't learned how to connect a database to a website.
Is there a specific database that you can use if you are using Svelte?
Or is there a way to connect MySQL to Svelte?
I have searched for this on YouTube and Google, but I'm not sure whether it's different when you are using Svelte, so I wanted to make sure.
Note: I have not started this project yet, so I do not have any code to show; I just want to know how you can connect a database if you're using Svelte.
Svelte is a front-end JavaScript framework that runs in the browser.
Traditionally, in order to use a database like MySQL from a front-end project such as Svelte (which contains only HTML, CSS and JS), you have to do it through a separate backend project. You can then have the Svelte app and the backend project communicate via a REST API. The same applies to other front-end libraries/frameworks like React, Angular, Vue, etc.
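As a rough sketch (not a definitive implementation), such a backend could be an Express server using the mysql2 package; the table name, credentials, port and routes below are placeholders, and the Svelte app would simply call them with fetch('/api/orders'):
const express = require('express');
const mysql = require('mysql2/promise');

const app = express();
app.use(express.json());

// Connection pool to the MySQL database (credentials are placeholders)
const pool = mysql.createPool({
  host: 'localhost',
  user: 'app',
  password: 'secret',
  database: 'shop'
});

// The Svelte app reads orders with fetch('/api/orders')
app.get('/api/orders', async (req, res) => {
  const [rows] = await pool.query('SELECT * FROM orders');
  res.json(rows);
});

// ...and creates them with fetch('/api/orders', { method: 'POST', ... })
app.post('/api/orders', async (req, res) => {
  const { customer, price } = req.body;
  await pool.query('INSERT INTO orders (customer, price) VALUES (?, ?)', [customer, price]);
  res.json({ ok: true });
});

app.listen(3000);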
There are still many ways to achieve this. Since you are focusing on Svelte, here are a few options.
1 Sapper
Sapper is an application framework powered by Svelte. With it you can also write backend code using Express or Polka, so you can connect to the database of your choice (MySQL / MongoDB).
2 Use a serverless database
If you want to keep your app simple and just focus on the Svelte app, you can use a cloud-based database such as Firebase. Svelte can talk to it directly via its JavaScript SDK (see the sketch after this list).
3 Monolithic architecture
To connect to MySQL in the backend, you would need to use a server-side language such as Node.js (Express), PHP, Python, or whatever you are familiar with. Then you can embed the Svelte app, or use an API to pass data to it.
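As an illustration of option 2, a rough sketch with Firebase's Firestore (using the v9 modular SDK; the config values and the "orders" collection are placeholders) that a Svelte component could import directly:
import { initializeApp } from 'firebase/app';
import { getFirestore, collection, addDoc, getDocs } from 'firebase/firestore';

// Placeholder config; copy the real values from your Firebase console
const app = initializeApp({ apiKey: '...', projectId: 'my-shop' });
const db = getFirestore(app);

// Called directly from a Svelte component, no backend of your own needed
export async function placeOrder(order) {
  await addDoc(collection(db, 'orders'), order);
}

export async function listOrders() {
  const snapshot = await getDocs(collection(db, 'orders'));
  return snapshot.docs.map(d => ({ id: d.id, ...d.data() }));
}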
I can give an example with MongoDB.
You have to install the library:
npm install mongodb
or add it to package.json.
Then you have to make a connection file that you call every time you need to use the db:
import mongo from "mongodb";

let client = null;
let db = null;

export async function init() {
  if (!client) {
    client = await mongo.MongoClient.connect("mongodb://localhost");
    db = client.db("name-of-your-db");
  }
  return { client, db };
}
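For instance, an insert could then look like this (the file name db.js and the "orders" collection are made up):
import { init } from './db.js'; // the connection file above

export async function saveOrder(order) {
  const { db } = await init();
  // insertOne writes a single document into the "orders" collection
  return db.collection('orders').insertOne(order);
}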
For a complete example with an insert, you can see this video:
https://www.youtube.com/watch?v=Mey2KZDog_A
You can use PouchDB, which stores its data in IndexedDB in the browser. No backend is needed for this.
The client-side PouchDB can then be replicated/synced with a remote CouchDB. This can all be done inside your Svelte app, from the client side.
It is pretty easy to set up:
var db = new PouchDB('dbname');

db.put({
  _id: 'dave@gmail.com',
  name: 'David',
  age: 69
});

db.changes().on('change', function() {
  console.log('Ch-Ch-Changes');
});

db.replicate.to('http://example.com/mydb');
more on pouchdb.com
Also, the client can save the data offline first and sync it to a remote database later.
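A small sketch of that offline-first flow, using the db from the snippet above (the remote URL is a placeholder): writes land in the local IndexedDB-backed database immediately, and a live sync pushes them to CouchDB whenever a connection is available:
db.sync('http://example.com/mydb', { live: true, retry: true })
  .on('change', function (info) { console.log('synced', info); })
  .on('error', function (err) { console.error('sync error', err); });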
As I read it, the question is mostly about connecting to a backend, not a database. It is a pity, but the Svelte app template has no way to connect to a backend "out of the box".
As for me, I'm using an Express middleware in front of the Rollup dev server. This way you are able to proxy some requests to the backend server. Check the code below:
const proxy = require('express-http-proxy');
const app = require('express')();
app.use('/data/', proxy(
  'http://backend/data',
  {
    proxyReqPathResolver: req => {
      return '/data' + req.url;
    }
  }
));

app.use('/', proxy('http://127.0.0.1:5000'));
app.listen(5001);
This script opens port 5001, where the /data/ URL is proxied to the backend server, while port 5000 is still available from the Rollup dev server. So at http://localhost:5001/ you have the Svelte instance connected to the backend via the /data/ URL, and there you can send requests to fetch data from the database.
I would like to build a web page with interactive content.
I would like to use Socket.IO, but I've got a problem when I send two numbers from the client to the server.
The server adds the two numbers and sends the result to every user (but I want it sent back only to the one user who submitted them). I would rather not store the users. So I would like to ask: how can I build this example using NodeJS and Socket.IO?
If you look into the socket.io docs, you will find out that you can specify the key for each message you send, for instance:
io.on('connection', function (socket) {
// specify a clientId for each client you have, you may define it at the moment the connection starts.
socket.emit(`news:${clientId}`, { hello: 'world' });
});
And, on the client side, you have:
socket.on(`news:${myClientId}`, function (data) {
console.log(data);
});
You may generate this id randomly using one of many libraries, for instance Node-Forge.
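As a side note, for the concrete add-two-numbers case, calling socket.emit on the individual socket inside the connection handler (as above) already sends the result back only to the client that emitted the numbers; a minimal sketch, where the port and the event names add and result are made up:
const io = require('socket.io')(3000);

io.on('connection', function (socket) {
  // Receive two numbers from this particular client
  socket.on('add', function (payload) {
    const sum = payload.a + payload.b;
    // socket.emit targets only this client, unlike io.emit which broadcasts to everyone
    socket.emit('result', { sum: sum });
  });
});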
Hope it helps! Feel free to ask further.
I am about to develop an application where employees go out to service and repair machines at customer premises. They need to fill out a service card using a tablet or any other mobile device.
In case there is no Internet connection, I am thinking about using HTML5 offline storage, mainly IndexedDB, to store the service card (web form) data locally, and then do a sync at the office, where Internet exists. The sync is with a MySQL database.
So the question: is it possible to sync IndexedDB with MySQL? I have never worked with IndexedDB; I am only doing research and saw that it has potential.
Web SQL is deprecated; otherwise, it could have been the closest solution.
Any other alternatives in case the above is difficult or outside the standard?
Your opinions are highly appreciated.
Thanks.
This is definitely doable. I have only just started learning IndexedDB in the last couple of days, but this is how I would see it working (a rough sketch follows the list). Sorry, I don't have real code to give you.
The website knows it's in offline mode somehow
Submitting the form saves the data into IndexedDB
Later, when the laptop (or whatever device) is back online or on the intranet and can talk to the main server again, it sends all the IndexedDB rows to the server to be stored in MySQL via an AJAX call
IndexedDB is cleared
Repeat
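A rough sketch of steps 2-4, assuming a made-up database/store name and a hypothetical /api/sync endpoint (error handling omitted):
// Step 2: submitting the form while offline saves the record into IndexedDB
function saveOffline(record) {
  const open = indexedDB.open('serviceCards', 1);
  open.onupgradeneeded = () => open.result.createObjectStore('pendingCards', { autoIncrement: true });
  open.onsuccess = () => {
    const tx = open.result.transaction('pendingCards', 'readwrite');
    tx.objectStore('pendingCards').add(record);
  };
}

// Steps 3-4: once back online, push everything to the server (which writes to MySQL) and clear
function syncWhenOnline() {
  const open = indexedDB.open('serviceCards', 1);
  open.onsuccess = () => {
    const db = open.result;
    const getAll = db.transaction('pendingCards', 'readonly').objectStore('pendingCards').getAll();
    getAll.onsuccess = async () => {
      await fetch('/api/sync', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(getAll.result)
      });
      // Clear the local store only after the server has accepted the rows
      db.transaction('pendingCards', 'readwrite').objectStore('pendingCards').clear();
    };
  };
}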
A little bit late, but I hope it helps.
This is possible, though I'm not sure it is the best choice. I can tell you that I am building a web app where I have a MySQL database and the app must work offline and keep track of the data. I tried using IndexedDB directly and it was very confusing for me, so I implemented DexieJs, a minimalistic and straightforward API for communicating with IndexedDB in an easy way.
Now the app works online; if the internet goes down, it keeps working offline until the connection comes back, and then it uploads the data to the MySQL database. One of the solutions I read about for saving the data was to store the JSON object (passed through JSON.stringify()) in a TEXT field, and run JSON.parse() once you need the data back.
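A minimal DexieJs sketch of the offline side (the database name serviceCards, the cards table and its fields are made up):
import Dexie from 'dexie';

const db = new Dexie('serviceCards');
db.version(1).stores({
  cards: '++id, customer, machineKey' // ++id = auto-incremented primary key
});

// Save a card locally while offline
export function saveCardOffline(card) {
  return db.cards.add(card);
}

// Later, read everything back to upload to the MySQL-backed server
export function pendingCards() {
  return db.cards.toArray();
}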
This was my motivation to build the app that way, and also the fact that we couldn't change databases:
IndexedDB Tutorial
Sync IndexedDB with MySQL
Connect node to mysql
[Update for 2021]
For anyone reading this, I can recommend checking out AceBase.
AceBase is a realtime database that enables easy storage and synchronization between browser and server databases. It uses IndexedDB in the browser, and its own binary db format or SQL Server / SQLite storage on the server side. MySQL storage is also on the roadmap. Offline edits are synced upon reconnecting and clients are notified of remote database changes in realtime through a websocket (FAST!).
On top of this, AceBase has a unique feature called "live data proxies" that allow you to have all changes to in-memory objects to be persisted and synced to local and server databases, so you can forget about database coding altogether, and program as if you're only using local objects. No matter if you're online or offline.
The following example shows how to create a local IndexedDB database in the browser, how to connect to a remote database server that syncs with the local database, and how to create a live data proxy that eliminates further database coding altogether.
const { AceBaseClient } = require('acebase-client');
const { AceBase } = require('acebase');
// Create local database with IndexedDB storage:
const cacheDb = AceBase.WithIndexedDB('mydb-local');
// Connect to server database, use local db for offline storage:
const db = new AceBaseClient({ dbname: 'mydb', host: 'db.myproject.com', port: 443, https: true, cache: { db: cacheDb } });
// Wait for remote database to be connected, or ready to use when offline:
db.ready(async () => {
  // Create live data proxy for a chat:
  const emptyChat = { title: 'New chat', messages: {} };
  const proxy = await db.ref('chats/chatid1').proxy(emptyChat); // Use emptyChat if chat node doesn't exist

  // Get object reference containing live data:
  const chat = proxy.value;

  // Update chat's properties to save to local database,
  // sync to server AND all other clients monitoring this chat in realtime:
  chat.title = `Changing the title`;
  chat.messages.push({
    from: 'ewout',
    sent: new Date(),
    text: `Sending a message that is stored in the database and synced automatically was never this easy! ` +
      `This message might have been sent while we were offline. Who knows!`
  });

  // To monitor realtime changes to the chat:
  chat.onChanged((val, prev, isRemoteChange, context) => {
    if (val.title !== prev.title) {
      console.log(`Chat title changed to ${val.title} by ${isRemoteChange ? 'someone else' : 'us'}`);
    }
  });
});
For more examples and documentation, see AceBase realtime database engine at npmjs.com
I have a MySQL database hosted on my web site, with a table named UsrLic.
Anyone who wants to buy my software must register and enter his/her generated machine key (plus username, email, etc.).
So my question is:
I want to automate this process from my software; how should this process work?
Should I connect to and update my database directly from my software? This would mean I must store all my database connection parameters in it (my database username, password, and server) and then use ADO or MyDAC to connect to this database. If yes, how secure is this process?
Or do you have any other suggestions?
I recommend creating an API on your web site in PHP and calling the API from Delphi.
That way, the database is only available to your web server and not to the client application, ever. In fact, you should run your database on localhost or with a private IP so that only machines on the same physical network can reach it.
I have implemented this and am implementing it again as we speak.
PHP
Create a new file named register_config.php. In this file, setup your MySQL connection information.
Create a file named register.php. In this file, put your registration functions. From this file, include 'register_config.php'. You will pass parameters to the functions you create here, and they will do the reading and writing to your database.
Create a file named register_api.php. From this file, include 'register.php'. Here, you will process POST or GET variables that are sent from your client application, call functions in register.php, and return results back to the client, all via HTTP.
You will have to research connecting to and querying a MySQL database. The W3Schools tutorials will have you doing this very quickly.
For example:
Your Delphi program calls https://mysite/register_api.php with Post() and sends the following values:
name=Marcus
email=marcus@gmail.com
Here's how the beginning of register_api.php might look:
// Our actual database and registration functions are in this library
include 'register.php';
// These are the name value pairs sent via POST from the client
$name = $_POST['name'];
$email = $_POST['email'];
// Sanitize and validate the input here...
// Register them in the DB by calling my function in register.php
if (registerBuyer($name, $email)) {
  // Let them know we succeeded
  echo "OK";
} else {
  // Let them know we failed
  echo "ERROR";
}
Delphi
Use Indy's TIdHTTP component and its Post() or Get() method to post data to register_api.php on the website.
You will get the response back in text from your API.
Keep it simple.
Security
All validation should be done on the server (API). The server must be the gatekeeper.
Sanitize all input to the API from the user (the client) before you call any functions, especially queries.
If you are using shared web hosting, make sure that register.php and register_config.php are not world readable.
If you are passing sensitive information, and it sounds like you are, you should call the registration API function from Delphi over HTTPS. HTTPS provides end to end protection so that nobody can sniff the data being sent off the wire.
Simply hook up a TIdSSLIOHandlerSocketOpenSSL component to your TIdHTTP component, and you're good to go, minus any certificate verification.
Use the SSL component's OnVerifyPeer event to write your own certificate verification method. This is important. If you don't verify the server side certificate, other sites can impersonate you with DNS poisoning and collect the data from your users instead of you. Though this is important, don't let this hold you up since it requires a bit more understanding. Add this in a future version.
Why don't you use, e.g., share*it? They also handle the buying process (I don't see how you would do this yourself) and let you create a reg key through a Delphi app.