Advice on running a search query over 120 million rows of data, which way? (MySQL)

I have a MySQL database (InnoDB) with 120+ million product records. The database has just 5 tables: categorys, products, category_product_mapping, users, order.
Usually I work on small projects and have no experience with large databases. I use PHP + MySQL with the CodeIgniter framework, and I also use ASP.NET with MSSQL. I need a small script for this database to search products by name: the user logs in to the system, searches for a product by name, sees the price, adds it to the basket and places an order. Which way is better: the CodeIgniter framework with PHP + MySQL, or ASP.NET with MySQL? Could ASP.NET be faster because of its output cache?
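Whichever framework is picked, a name search over 120+ million rows will stand or fall on how MySQL indexes the product name, not on PHP versus ASP.NET. A minimal sketch, assuming a products table with id, name and price columns, MySQL 5.6+ (InnoDB FULLTEXT support) and placeholder connection details not taken from the question:

# Sketch: index-backed name search on a large InnoDB products table.
# Connection details and column names are placeholders, not from the question.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="shop", password="secret", database="shop")
cur = conn.cursor()

# One-time DDL (run once, not on every request): a FULLTEXT index so the
# search does not scan all 120M rows.
cur.execute("ALTER TABLE products ADD FULLTEXT INDEX idx_products_name (name)")

def search_products(term, limit=50):
    # MATCH ... AGAINST uses the FULLTEXT index instead of a full table scan.
    cur.execute(
        "SELECT id, name, price FROM products "
        "WHERE MATCH(name) AGAINST (%s IN NATURAL LANGUAGE MODE) "
        "LIMIT %s",
        (term, limit),
    )
    return cur.fetchall()

print(search_products("wireless mouse"))

Creating a FULLTEXT index on a table of this size is a long one-off operation, so it would normally be done in a maintenance window rather than from application code.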

Related

Best way to get data from 2 different databases (MSSQL Server and MySQL)

I have a website that uses 2 different DBMSs (MSSQL Server and MySQL). There is a table named product in both databases, and these 2 tables share the same product ID.
In MSSQL Server I store price and quantity.
In MySQL I store name, size,...
Now, every time I query products, I do it like this:
- Connect to MySQL -> query products in a loop -> inside every iteration, connect to MSSQL Server to get the other data for that product.
I know this is a bad approach, so I'm looking for a better way, since I think my website is slow because of this kind of query.
Can you help me with pseudo-code or an explanation? Thank you.
You are right! Having your website data in 2 different database technologies is not optimal.
Until you fix that, one workaround could be (assuming we are not talking millions of records):
User selects product A or Product category X on the website.
Get all data for product A or products of category X from SQL Server and store it in memory (e.g. in a C# DataSet or Python data frame)
Get all data for product A or products of category X from MySQL and store it in memory (e.g. in a C# DataSet or Python data frame)
Join the 2 in-memory objects on Product Id
Use this combined dataset to display your website
If required, update (commit) data to the databases at the end of the session (you will need to consider how to handle dirty-read scenarios)
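A minimal sketch of that workaround, assuming both product tables carry product_id and category_id columns and using pandas data frames as the in-memory objects; the connection strings and the exact column split are placeholders:

# Sketch of the suggested workaround: pull the two halves of the product data
# into memory once, then join them on product_id.
# Connection strings and column names are placeholders.
import pandas as pd
from sqlalchemy import create_engine, text

mssql = create_engine("mssql+pyodbc://user:pass@mssql_dsn")
mysql = create_engine("mysql+pymysql://user:pass@localhost/shop")

category_x = 42  # the category the user selected on the website

# Price/quantity live in SQL Server, name/size live in MySQL (as in the question).
q_price = text("SELECT product_id, price, quantity FROM product WHERE category_id = :cat")
q_names = text("SELECT product_id, name, size FROM product WHERE category_id = :cat")

df_price = pd.read_sql(q_price, mssql, params={"cat": category_x})
df_names = pd.read_sql(q_names, mysql, params={"cat": category_x})

# One in-memory join on product_id instead of one MSSQL round trip per MySQL row.
products = df_names.merge(df_price, on="product_id", how="inner")
print(products.head())

Each database is hit exactly once per page instead of once per product, which is where the slowdown described in the question comes from.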

Which database schema should I use?

I am building a REST API using Node, MySQL and MongoDB, but I am confused about which database schema to go for. The business case is B2B, and for each business (customer) there are around 10 tables for general ledger, products, transactions, clients, sales, purchases and many more like these. To accommodate the 1-to-N relationship in sales and purchase records, I will use MongoDB, to avoid defining a default MAX number of product columns in the purchase/sale orders in SQL.
My customers need a separate backup option for their data, and in the near future I am also planning to integrate relationships between the application's customers.
So, which is the best option to go for? I have read this question and its answers quite carefully, and would like to ask whether I should go for option number 2.
Also, I would like to ask whether I should separate my entire backend (DB + server) for specific BUSINESS TYPES using hostname mapping to a business-specific Azure WebApp.

Rails: How to perform a search on a table with more than 20,000 records?

I have a product model, and the associated table contains more than 20,000 records. On the client side I'm using Angular, and in one of the interfaces I have provided an autocomplete search feature for products, but searching products is taking a lot of time. On the SQL side it translates into the following query:
SELECT * FROM products WHERE name LIKE '%abc%';
What would be the best approach to bring this time down?
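One way to bring the time down for autocomplete is to drop the leading wildcard so an ordinary B-tree index on name can be used, and to cap the number of rows returned. A minimal sketch, in Python rather than the poster's Rails stack, with assumed index, column and connection names:

# Sketch: index-friendly autocomplete query. A leading '%' defeats a B-tree
# index; a prefix search ('abc%') with a LIMIT can use one.
# Connection details and column names are assumptions.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="shop", password="secret", database="shop")
cur = conn.cursor()

# One-time DDL (run once): an ordinary index on the searched column.
cur.execute("CREATE INDEX idx_products_name ON products (name)")

def autocomplete(prefix, limit=10):
    cur.execute(
        "SELECT id, name FROM products WHERE name LIKE %s ORDER BY name LIMIT %s",
        (prefix + "%", limit),
    )
    return cur.fetchall()

print(autocomplete("abc"))

If matches in the middle of the name are genuinely needed, a MySQL FULLTEXT index (or an external search engine) is the usual alternative to a leading-wildcard LIKE.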

Best way of reducing product count in an e-commerce web app?

I am developing an e-commerce web app using Spring, Hibernate and MySQL.
For example: in this app, at the time of placing an order for products, I update the Product table record, i.e. I reduce the product count in that table.
I am using a MySQL UPDATE query for that.
It's working fine, but if I send 100 requests at the same time, the product count does not decrease correctly.
For that I got two answers:
Use a message queue (RabbitMQ) to queue all requests and reduce the count one by one
Use a Hibernate lock
So I need to know if there are other possible ways to do this, and which of the above two is better?
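A third option is to let MySQL itself serialize the decrements with a single conditional UPDATE, so the read-modify-write race never happens; in a Spring/Hibernate stack the same statement could be issued as a JPQL bulk update or paired with optimistic @Version locking. A minimal sketch in Python, with assumed table and column names:

# Sketch: atomic, race-free stock decrement done entirely inside MySQL.
# Table and column names (product, stock_count) are illustrative.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="shop", password="secret", database="shop")

def reserve_one(product_id):
    cur = conn.cursor()
    # The WHERE clause guarantees the count never goes below zero; the row lock
    # taken by the UPDATE means 100 concurrent requests decrement correctly.
    cur.execute(
        "UPDATE product SET stock_count = stock_count - 1 "
        "WHERE id = %s AND stock_count > 0",
        (product_id,),
    )
    conn.commit()
    return cur.rowcount == 1  # False means the product is already sold out

if reserve_one(123):
    print("order accepted")
else:
    print("out of stock")

The rowcount check doubles as the out-of-stock signal, so there is nothing to queue or lock at the application level.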

How to synchronize product stock data between different databases in MySQL?

I'm working on a Joomla website (Virtuemart 2.0.6 webshop) and the plan is to make 2 or maybe even 3 separate webshops with different domain names and databases, but the owner uses the same warehouse for all the shops.
So I'm searching for a way to synchronize the product stock across multiple Virtuemart shops.
Basically, I'm looking for a way to share only one specific table field between the shops.
Specifically in Virtuemart 2.xx the product stock data is stored in the table:
"jos_virtuemart_products"
field name: "product_in_stock"
I have looked everywhere for the answer to this one, but I haven't found anything I could use to solve this problem.
If you want to have different product names, descriptions and prices per shop, you are going to have to code a custom plugin; in any case, it is going to take a custom plugin to achieve this synchronization.
Probably the best way to do it is to write a plugin that updates the other sites when an order is completed.
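A minimal sketch of what the core of such a plugin would do, written here as a standalone Python script rather than a Joomla plugin; the other shops' connection details and the use of product_sku as the shared key are assumptions:

# Sketch: after an order completes in one shop, copy the new stock level into
# the other shops' jos_virtuemart_products.product_in_stock field.
# Database credentials and the product_sku join key are assumptions.
import mysql.connector

OTHER_SHOPS = [
    {"host": "localhost", "user": "shop2", "password": "secret", "database": "shop2_joomla"},
    {"host": "localhost", "user": "shop3", "password": "secret", "database": "shop3_joomla"},
]

def sync_stock(product_sku, new_stock):
    for cfg in OTHER_SHOPS:
        conn = mysql.connector.connect(**cfg)
        cur = conn.cursor()
        cur.execute(
            "UPDATE jos_virtuemart_products SET product_in_stock = %s "
            "WHERE product_sku = %s",
            (new_stock, product_sku),
        )
        conn.commit()
        conn.close()

# Called from the order-completion hook in the shop where the sale happened.
sync_stock("SKU-1001", 7)

Wiring this into the order-completion event is the part that needs the custom plugin described above.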