How to store the order tracking of an ecommerce website in MySQL

This is the schema of the database I am using for my project.
I am appending the messages to the **track column** from different endpoints.
Is there any efficient approach in MySQL to store the track messages?
Currently I append the data to a single column, separating messages with a newline character "\n"; I need a more efficient way to store this data.
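One common alternative, sketched below under the assumption that each tracking message belongs to one order (table and column names are hypothetical, since the original schema isn't shown), is to normalize the messages into their own table with one row per event:

```sql
-- Hypothetical normalized layout: one row per tracking event,
-- instead of one ever-growing text column.
CREATE TABLE order_tracking (
    tracking_id BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    order_id    BIGINT UNSIGNED NOT NULL,   -- FK to your orders table
    message     VARCHAR(255)    NOT NULL,   -- e.g. 'Shipped from warehouse'
    created_at  DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    KEY idx_order_time (order_id, created_at)
);

-- Each endpoint inserts a new row instead of rewriting a text blob:
INSERT INTO order_tracking (order_id, message)
VALUES (1008, 'Out for delivery');

-- The old newline-joined view is still one query away:
SELECT GROUP_CONCAT(message ORDER BY created_at SEPARATOR '\n') AS track
FROM order_tracking
WHERE order_id = 1008;
```

This keeps inserts cheap, makes individual events queryable (e.g. "when did order 1008 ship?"), and avoids rewriting a growing text value on every update.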

Related

Best way to handle discrete sets of information in SQL

I'm very new to SQL and am trying to structure a list of objects in a hierarchy. I have a Flask server that accepts information from a client device and stores the information in a MySQL server. Because the Flask server can accept connections from multiple client devices, I want to organize the information from each individual client in the SQL server. My naive approach would be to create a new table for each Client that connects and insert the data into that new table. It's my understanding that this may be the incorrect way to handle the organization of data. My question is what is the standard way of handling this in SQL?
You are correct, and creating a new table for each new set of client information is bad practice. Instead, consider just having a single table, with a separate column to keep track of the client, e.g.
client_table (client_id, data1, data2, data3, ...)
Now, for each new incoming set of client information, you need only to insert a new record for that client.
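A minimal sketch of that single-table layout (the column names beyond client_id are placeholders for whatever the devices actually send):

```sql
-- One shared table; client_id tells the devices' rows apart.
CREATE TABLE client_table (
    id          BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    client_id   INT UNSIGNED NOT NULL,   -- which device sent the data
    data1       VARCHAR(255),
    data2       VARCHAR(255),
    data3       VARCHAR(255),
    received_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    KEY idx_client (client_id)
);

-- A new reading from client 42 is just another row:
INSERT INTO client_table (client_id, data1, data2, data3)
VALUES (42, 'temp=21.5', 'humidity=40', 'status=ok');
```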

Querying data from 2 MySQL Databases to a new MySQL database

I want to query data from two different MySQL databases to a new MySQL database.
I have two databases containing a lot of irrelevant data, and I want to create what is essentially a data warehouse holding only the relevant data from the two databases.
As of now all data is sent to the two old databases, but I would like scheduled updates so the new database stays up to date. There is a shared key between the two databases, so ideally all the data would end up in one table, though this is not crucial.
I have done similar work with Logstash and ES; however, I do not know how to do it when it comes to MySQL.
The best way to do that is to create an ETL process with Pentaho Data Integration or any other ETL tool. Your sources will be the two different databases; in the transformation step you can apply any business logic, then load the data into the new database.
If you build this ETL, you can schedule it to run once a day so that your database stays up to date.
If you want to do this without an ETL tool, then the databases must be on the same host. You can then simply prefix the table name with the database name in the query, like SELECT * FROM database.table_name.
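A sketch of that same-host approach, using hypothetical database, table, and column names, plus a MySQL scheduled event standing in for an external ETL schedule:

```sql
-- Combine the relevant columns from both source databases into the
-- warehouse, joining on the shared key.
INSERT INTO warehouse.combined_orders (order_id, customer_name, order_total)
SELECT o.order_id, c.name, o.total
FROM sales_db.orders o
JOIN crm_db.customers c ON c.customer_id = o.customer_id;

-- MySQL's event scheduler can run this once a day.
-- Requires the scheduler to be on: SET GLOBAL event_scheduler = ON;
-- REPLACE assumes order_id is the primary key of combined_orders.
CREATE EVENT refresh_combined_orders
ON SCHEDULE EVERY 1 DAY
DO
  REPLACE INTO warehouse.combined_orders (order_id, customer_name, order_total)
  SELECT o.order_id, c.name, o.total
  FROM sales_db.orders o
  JOIN crm_db.customers c ON c.customer_id = o.customer_id;
```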

Convert MySQL database to new schema

I have an old MySQL database with several tables (each table has more than 1,000,000 records).
I have a new MySQL database schema, and I want to convert all the data to the new schema.
What tools are available for this?
If I have to write code to transform the data to the new schema, what is the best programming language?
It seems clear that the data should be transformed using multiple threads or processes;
what is your advice?
I also want visibility into the status of the conversion while the data is converting,
for example the number of records that have been converted,
the number of successful migrations, the number of failed migrations, and so on.
What is the best way to do this?
In case anybody else gets to this question:
I found a tool called Talend Open Studio:
https://www.talend.com/products/talend-open-studio/
You can check it out; it's exactly what I wanted.
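If you end up writing the transformation yourself instead, one way to get the progress counters the question asks for is a log table that each batch updates. A rough sketch in plain SQL, with hypothetical old/new schema and column names:

```sql
-- Hypothetical progress table: one row per table being migrated.
CREATE TABLE new_db.migration_log (
    table_name VARCHAR(64) PRIMARY KEY,
    migrated   INT UNSIGNED NOT NULL DEFAULT 0,
    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
               ON UPDATE CURRENT_TIMESTAMP
);
INSERT INTO new_db.migration_log (table_name) VALUES ('customers');

-- Migrate in primary-key ranges so several workers (threads or
-- processes) can each take a disjoint id range in parallel.
INSERT INTO new_db.customers (id, full_name)
SELECT id, CONCAT(first_name, ' ', last_name)   -- example transformation
FROM old_db.customers
WHERE id BETWEEN 1 AND 10000;

-- ROW_COUNT() reports rows affected by the preceding statement
-- (the batch INSERT above), giving a running progress counter.
UPDATE new_db.migration_log
SET migrated = migrated + ROW_COUNT()
WHERE table_name = 'customers';
```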

How can I import nested JSON data into multiple connected Redshift subtables?

I have server log data that looks something like this:
2014-04-16 00:01:31-0400,583 {"Items": [
{"UsageInfo"=>"P-1008366", "Role"=>"Abstract", "RetailPrice"=>2, "EffectivePrice"=>0},
{"Role"=>"Text", "ProjectCode"=>"", "PublicationCode"=>"", "RetailPrice"=>2},
{"Role"=>"Abstract", "RetailPrice"=>2, "EffectivePrice"=>0, "ParentItemId"=>"396487"}
]}
What I'd like is a relational structure that connects two tables - a UsageLog table and a UsageLogItems table, connected by a primary key id.
You can see that the UsageLog table would have fields like:
UsageLogId
Date
Time
and the UsageLogItems table would have fields like
UsageLogId
UsageInfo
Role
RetailPrice
...
However, I am having trouble writing these into Redshift and being able to associate each record with unique and related ids as keys.
What I am currently doing: a Ruby script reads each line of the log file, parses out the UsageLog info (such as date and time), and writes it to the database (writing single rows to Redshift is VERY slow). It then creates a CSV of the UsageLogItems data and imports that into Redshift via S3, querying the largest id in the UsageLogs table and using that number to relate the two. This is also slow, because many UsageLogs contain no items, so I frequently load 0 records from the CSV files.
This currently does work, but it is far too painfully slow to be effective at all. Is there a better way to handle this?
Amazon Redshift supports JSON ingestion using a JSONPaths file via the COPY command.
http://docs.aws.amazon.com/redshift/latest/dg/copy-usage_notes-copy-from-json.html
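A sketch of what that could look like for the UsageLogItems table, assuming the log lines have first been flattened into newline-delimited JSON objects in S3 (the bucket, paths, and IAM role below are placeholders). First, a JSONPaths file mapping JSON fields to columns, e.g. at s3://my-bucket/jsonpaths/usage_log_items.json:

```json
{
  "jsonpaths": [
    "$['UsageLogId']",
    "$['UsageInfo']",
    "$['Role']",
    "$['RetailPrice']"
  ]
}
```

Then one bulk COPY replaces the row-at-a-time inserts:

```sql
-- Bulk-load the items straight from S3 in a single COPY.
COPY usage_log_items (usage_log_id, usage_info, role, retail_price)
FROM 's3://my-bucket/usage-log-items/'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
JSON 's3://my-bucket/jsonpaths/usage_log_items.json';
```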

Drupal user data storage location

Where does Drupal 7 store user data? I believe our client has been inserting bad data (incorrect characters) copied from MS Word into their Drupal install (for their user list).
How do I export/look at the users to see if any bad characters are in them?
I'm taking a look at the structure of the database in MySQL and I'm not seeing any clear table with the data in it. Is it contained in only one, or multiple tables?
User data is stored in multiple tables, one of which is the users table. Any fields that have been added to the user entity will be in the field_data_field_user... tables.
If you are using phpMyAdmin or similar interface, you will only be able to immediately see a portion of the tables Drupal has. Use the filter to find "users" and the Users table will appear in the list. There is only one table if there were no modifications for custom user fields.
You can export the table as a CSV and open it in a spreadsheet application such as Excel, though Excel is not always friendly with UTF-8 if it's not set up to display it. Or you can save the file as .sql and view it in a database viewer such as SQLite Manager in Firefox.
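You can also look for the bad characters directly in MySQL by comparing each value against its ASCII-converted form; rows that differ contain non-ASCII bytes (smart quotes and the like pasted from Word). A sketch against Drupal 7's users table:

```sql
-- Flag users whose name or e-mail contains non-ASCII characters
-- (CONVERT ... USING ASCII turns them into '?', so any mismatch
-- marks a suspect row); HEX() exposes the raw bytes for inspection.
SELECT uid, name, mail, HEX(name) AS name_bytes
FROM users
WHERE CONVERT(name USING ASCII) <> name
   OR CONVERT(mail USING ASCII) <> mail;
```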