I have a problem with my Docker microservice architecture and several possible approaches to solve it. I hope you can help me with your input.
The question: what is the best way to extract the data from the controllerDB and send it to the stack at creation time?
Given this architecture:
controllerService:
Connects to the Docker socket to create stacks from a given compose file. Implemented in Python 3.7 and pydocker.
controllerDB:
Holds the data for the new stacks. MySQL database.
The problem: every created stack needs a different subset of data from the database. The table that stores the data has columns the stack doesn't need; these columns only indicate that the data is already used by a stack.
The approaches
Create a MySQL dump and place it in the MySQL initdb folder.
Create a Docker config file with a Python function and add it to the stack.
Send the data directly from controllerDB to the stack's database.
Some other approach I don't see.
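Approaches 1 and 2 could share the same core: query only the needed columns from controllerDB and render the result as SQL for the new stack. A minimal sketch (the table name, column names, and the initdb path are assumptions for illustration, not part of the original setup):

```python
# Sketch for approaches 1/2: select only the columns the stack needs
# and render them as an init SQL file / Docker config payload.
# Table and column names here are invented for illustration.

def build_init_sql(table, wanted_columns, rows):
    """Render rows (already fetched from controllerDB with a SELECT
    that names only wanted_columns, omitting the bookkeeping columns)
    as INSERT statements."""
    cols = ", ".join(wanted_columns)
    stmts = []
    for row in rows:
        vals = ", ".join(
            "NULL" if v is None else
            str(v) if isinstance(v, (int, float)) else
            "'" + str(v).replace("'", "''") + "'"  # naive quoting for the sketch
            for v in row
        )
        stmts.append(f"INSERT INTO {table} ({cols}) VALUES ({vals});")
    return "\n".join(stmts)

# The resulting string can be written into a file that the stack mounts
# at /docker-entrypoint-initdb.d/, or passed to the stack as a Docker config.
print(build_init_sql("stack_data", ["id", "payload"], [(1, "abc")]))
```

The quoting here is deliberately naive; for real data you would let a MySQL driver do the escaping.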
Ideas
Is it possible to dump a database by an SQL query? I only know that I can select the rows to dump with the --where= option of mysqldump, but I see no way to exclude columns.
controllerService sends a query to controllerDB. The result set is written into a config file and sent to the stack. The service then has to insert it into its database on its own.
I don't know if this is possible.
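On the mysqldump idea above: rows can be restricted with --where, but columns cannot; you would have to post-process the dump or use a plain SELECT instead. A sketch of building such a dump command from the controllerService (database, table, and WHERE clause are placeholders):

```python
# Sketch: mysqldump can restrict *rows* with --where, but it has no
# option to drop columns. --no-create-info limits the dump to data only.
# Database, table, and condition below are placeholders.
import subprocess

def dump_command(db, table, where):
    """Build the argv for a data-only, row-filtered mysqldump."""
    return ["mysqldump", "--no-create-info",
            f"--where={where}", db, table]

cmd = dump_command("controllerDB", "stack_data", "stack_id IS NULL")
print(cmd)
# e.g. subprocess.run(cmd, stdout=open("init.sql", "w"), check=True)
```

Credentials would be supplied via a ~/.my.cnf or additional options in a real setup.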
Regards
MarLei
Related
Will you please help me with one more important thing? I need to store a dashboard's data in a database. According to my research, ThingsBoard supports three database setups at the moment: NoSQL, MySQL, and hybrid (PostgreSQL + Cassandra). I have researched a lot but could not find any way to send my telemetry data to any database. I know it is possible because the ThingsBoard documentation itself says so, but how do I do that?
I checked the PostgreSQL database I created during the ThingsBoard installation, but it only contains the relations that were created by default. I need to store my project's data in a database, just like in AWS we store IoT Core's data in DynamoDB or in IoT Analytics. ThingsBoard does not seem to provide any database-related node in its rule engine, so how do I build a rule chain that transfers my project's data to a database server?
I installed pgAdmin 4 to inspect the database graphically, but found nothing useful. The documentation and Stack Overflow say to configure the thingsboard.yml file (on a monolithic Linux installation it is at /etc/thingsboard/conf/thingsboard.conf), which has Cassandra, MySQL, and PostgreSQL sections, but how do I configure it properly? I tried to access the default PostgreSQL database named thingsboard that I created at installation time, but when I list its contents it only shows the default ThingsBoard relations. If I create a device in ThingsBoard, why does it not show up in the database?
I could really use your help. Please show me a way to connect my ThingsBoard with a database.
As you can see in my attached images, everything is default; nothing that I created in ThingsBoard appears.
That's wrong: ThingsBoard currently supports three database setups: PostgreSQL only, hybrid PostgreSQL + Cassandra (Cassandra for telemetry only), and PostgreSQL + Timescale. There is no MySQL database used anywhere.
https://thingsboard.io/docs/user-guide/install/ubuntu/#step-3-configure-thingsboard-database
You can find guides for connecting your devices to ThingsBoard here, e.g. via MQTT:
https://thingsboard.io/docs/reference/mqtt-api/
If you would like to forward the stored telemetry of ThingsBoard to different databases, this is not possible directly with rule chains (there is only one node for storing data in a Cassandra table).
One way to achieve this would be to fetch the data with an external microservice/program via the HTTP API and persist it in the database of your choice. You could use a Python script, for example.
https://thingsboard.io/docs/reference/rest-api/
Alternatively, you can send the data to a message queue like Kafka instead of fetching it via the HTTP API. It would still require additional tooling to store the data in external databases outside ThingsBoard.
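A rough sketch of the HTTP-API route in Python (host, device id, telemetry keys, and token are placeholders; the timeseries endpoint below is taken from the ThingsBoard REST docs, but verify it against your version):

```python
# Sketch: pull telemetry from the ThingsBoard REST API so an external
# script can persist it in any database. All identifiers here are
# placeholders; check the endpoint against your ThingsBoard version.
import json
import urllib.request

def telemetry_url(host, device_id, keys):
    """Build the timeseries query URL for one device."""
    return (f"{host}/api/plugins/telemetry/DEVICE/{device_id}"
            f"/values/timeseries?keys={','.join(keys)}")

def fetch_telemetry(host, device_id, keys, jwt_token):
    """Fetch telemetry values; returns the decoded JSON dict."""
    req = urllib.request.Request(
        telemetry_url(host, device_id, keys),
        headers={"X-Authorization": f"Bearer {jwt_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# The returned dict maps each key to a list of {"ts": ..., "value": ...}
# entries, which your script can then INSERT into the target database.
```

You would typically run such a script on a schedule, or switch to the Kafka route mentioned above if you need continuous streaming.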
I'm a bit new to Inno Setup. I'm using it to package my Java executable, with MySQL as the database. I just have two questions:
First, if a user already has MySQL installed and I want to detect it and load a database, do I use MySQL ODBC (Open Database Connectivity)?
Second, which approach would be best for loading/running the script: using a batch file, or simply running the script from within Inno Setup?
I would really appreciate some code snippets.
Thanks.
I have a database on my local system and also a copy on a web server, so when I update data in my local database I then have to replace or add the same data in the web database.
The problem is that I change some specific records frequently for testing purposes.
So I want some mechanism to export specific records to an SQL file as INSERT statements.
Suppose I have made a change in table tbl1 and added 10 records to it.
Right now I am manually adding or replacing the whole table in the web database.
Is there any mechanism in MySQL or in Workbench that lets me export specific records?
Any help with that is appreciated.
The only automatic solution is to use replication, but that is probably not a good fit for your scenario. So what remains is some manual process. Here are some ideas:
Write a script that writes specific records into a dump file. Then use a different script to load this dump file into your target server.
If you frequently change the same records, you could create a script with INSERT statements that you edit for each new value and run against both your local and your remote (web) server.
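The first idea might look like this sketch. It emits REPLACE statements (so replaying the file updates existing rows and inserts missing ones); the table and column names are made up, and the quoting is deliberately naive:

```python
# Sketch: dump specific rows of one table as REPLACE statements so the
# same file can be replayed against both the local and the web server.
# Table/column names are illustrative; values are escaped naively, so
# for real data prefer a MySQL driver's own escaping.

def dump_rows(table, columns, rows):
    """Return REPLACE statements for the given rows (REPLACE updates
    existing records and inserts missing ones, handy for syncing)."""
    stmts = []
    for row in rows:
        vals = ", ".join(
            "NULL" if v is None else
            str(v) if isinstance(v, (int, float)) else
            "'" + str(v).replace("'", "''") + "'"
            for v in row
        )
        stmts.append(
            f"REPLACE INTO {table} ({', '.join(columns)}) VALUES ({vals});"
        )
    return "\n".join(stmts)

print(dump_rows("tbl1", ["id", "name"], [(1, "first"), (2, "it's ok")]))
```

Write the returned string to a .sql file and feed it to the mysql client on the web server.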
I'm trying to duplicate my MySQL tables in HSQLDB in order to run some unit tests in my JPA / Hibernate project.
There are only two tables at the moment, but I can't get either of them created in HSQLDB. I used the example code from the Spring documentation to run a schema.sql script before the test cases:
db = new EmbeddedDatabaseBuilder().addDefaultScripts().build();
But it always fails with "Unexpected token" exceptions, with the token ranging from "DATABASE" to "(".
Is there a straightforward way of converting the MySQL dump into something that HSQLDB understands? Or should I populate the test database some other way?
I worked with HSQLDB and MySQL for many years and there is no tool I know of that converts a MySQL dump into an HSQLDB script. I see two solutions here:
Write a script or a program that converts MySQL dumps to HSQLDB scripts. You can follow the list of steps in this post:
How to load mysql dump to hsqldb database?
Use BOTH responses
https://stackoverflow.com/a/3813164/891479
and
https://stackoverflow.com/a/7791340/891479
as the first one is not complete.
Write a small program that connects to both databases, loads your MySQL tables into objects, and fills your HSQLDB.
We always used the first solution; it's probably the easiest one.
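A minimal starting point for the first solution could be a filter like this. It only handles the most common MySQL-isms (backquoted identifiers, ENGINE/CHARSET table options, AUTO_INCREMENT); a real dump will likely need more rules:

```python
# Sketch of a minimal MySQL-dump-to-HSQLDB filter. It covers only the
# most frequent incompatibilities; extend the rules for your own dump.
import re

def mysql_to_hsqldb(sql):
    sql = sql.replace("`", "")                               # backquoted identifiers
    sql = re.sub(r"\s*ENGINE\s*=\s*\w+", "", sql)            # ENGINE=InnoDB etc.
    sql = re.sub(r"\s*DEFAULT CHARSET\s*=\s*\w+", "", sql)   # charset table option
    sql = re.sub(r"AUTO_INCREMENT",
                 "GENERATED BY DEFAULT AS IDENTITY", sql)    # identity columns
    return sql

print(mysql_to_hsqldb(
    "CREATE TABLE `t` (id INT AUTO_INCREMENT, PRIMARY KEY (id)) ENGINE=InnoDB;"
))
```

Running this over schema.sql before handing it to EmbeddedDatabaseBuilder should get rid of the most common "Unexpected token" errors.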
I have encountered a problem where I need to copy only the data from a PostgreSQL database to a MySQL database. I already have the MySQL database with empty tables. Using pgAdmin I made a backup (data only, without the schema). I tried using the psql tool, but it keeps giving a segmentation fault that I can't fix at the moment. I am using Ubuntu. Any simple guidance on copying the data would be highly appreciated.
Use Postgres COPY to export the data, and MySQL LOAD DATA INFILE to import it.
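The two statements might look like this, built per table from a script (table name and file path are placeholders; note that COPY ... TO a server-side file needs the right Postgres privileges, otherwise use psql's \copy). Both sides default to tab-separated fields with \N for NULL, so the formats line up:

```python
# Sketch: build the matching export/import statements for one table.
# Table name and path are placeholders. Postgres text format and MySQL
# LOAD DATA both default to tab-separated fields and \N for NULL.
def copy_pair(table, path):
    export = (f"COPY {table} TO '{path}' "
              "WITH (FORMAT text)")           # run on the Postgres side
    load = (f"LOAD DATA INFILE '{path}' INTO TABLE {table} "
            "FIELDS TERMINATED BY '\\t'")     # run on the MySQL side
    return export, load

exp, imp = copy_pair("customers", "/tmp/customers.tsv")
print(exp)
print(imp)
```

Repeat per table, minding foreign-key order on the MySQL side.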
psql can crash with out-of-memory if you try to display a few million rows, because it fetches all the data beforehand to determine the column widths for a prettier display. If you intend to fetch a lot of data with psql, disable this behaviour (for example with \set FETCH_COUNT 1000, so rows are retrieved in batches).
You could try:
http://www.lightbox.ca/pg2mysql.php
It looks like you might be trying to load data into MySQL with the Postgres client tool. Use the MySQL client tools to manipulate data on the MySQL server:
mysql client programs
How can you move data into MySQL if you have errors reading from PSQL?
Fix the error, then ask this question.