In BookSleeve, how do I iterate through all the keys in a given db?

I'm using BookSleeve (and it is awesome) to access Redis from C#. The only thing I'm missing is API documentation (or perhaps I haven't found it?). I need to flush the Redis DB to SQL Server, so I need to iterate through all the keys in the Redis DB. How is this best done?
Edit
OK, I've managed to do this with:
var conn = MyTable.RedisConnection;
// Keys.Find issues the Redis KEYS command with the given pattern; .Result blocks until the task completes
var keys = conn.Keys.Find(MyTable.redis_db_index, "*").Result;

I would use a publish/subscribe technique instead. Find in BookSleeve corresponds to the Redis KEYS pattern command, and the Redis documentation does not recommend using KEYS in production; see http://redis.io/commands/keys
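If it helps to see the idea outside of BookSleeve, here is a minimal sketch using the redis-py client in Python (a different library entirely, shown only to illustrate walking the keyspace incrementally with SCAN rather than KEYS; the connection details are placeholder assumptions, and SCAN requires Redis 2.8 or later):

import redis

# Placeholder connection details
r = redis.Redis(host="localhost", port=6379, db=0)

# scan_iter wraps the SCAN command, iterating keys in small batches instead of blocking the server
for key in r.scan_iter(match="*", count=1000):
    value = r.get(key)  # assumes plain string values; hashes, lists, and sets need their own read commands
    # ... write key/value to SQL Server here ...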


Using doctrine with multiple MySQL Databases

I'm starting a new project from an existing MySQL DB and I would like to use symfony+doctrine for that.
The problem is that my current setup has multiple databases in it. For instance, it has database.table names like:
customers.info
customers.orders
items.catalog
items.stock
etc....
I've tried to search online, but I've realized that one of the problems is that the word "database" is used for two very different things: database "software", like MySQL, Postgres, MariaDB, etc., and databases as in SQL "CREATE DATABASE".
So while looking at the Symfony docs, I found this page, which seems to state that I cannot use the Doctrine ORM since I have multiple DBs: https://symfony.com/doc/current/doctrine/multiple_entity_managers.html
But the more I read it, the more I get the feeling that what they are saying is "you need one entity manager for MySQL, one for Postgres, etc., and entities cannot define associations across different entity managers", and not "entities cannot define associations across different DBs on the same DB server".
Am I right? And if so, how can I achieve such a thing, knowing that I need to provide a database name in the connection URL (like mysql://user:pass@127.0.0.1/oneOfMyDb)?
Thanks!
OK, so I finally found the answer, which may be useful for other people in the same situation.
It is possible to use Doctrine with multiple databases/schemas in MySQL. Yes, the problem here is that MySQL somewhat blurs the concepts of database and schema, hence the confusion.
In order to do this, you need to declare the table and schema used for every entity, for instance:
<?php

namespace App\Entity;

use App\Repository\PropertyRepository;
use Doctrine\ORM\Mapping as ORM;

/**
 * @ORM\Entity(repositoryClass=PropertyRepository::class)
 * @ORM\Table(name="property", schema="myOtherDB")
 */
class Property
{
    // some stuff here...
}
This way, no matter which DB name you declare in the connection, it will connect to your other DB (schema), and you will be able to fetch data through foreign keys, even if that data is stored in a table in a different DB (schema).
I hope this will help some people!

How to use sqldf in R to manipulate local dataframes?

I am attempting to analyze some data in RStudio which originates from a MySQL database, so I used dbConnect to connect to said database and copied the single table I needed for this project. I then used R to clean the data a bit, getting rid of some unneeded columns. So far, so good.
My problems arose when I realized my data had some outliers, and I needed to delete rows which contained obvious outlier data. This is something I have no problem doing in SQL, but lack the R experience to do effectively. So I looked into it, and found out about sqldf, a package which bills itself as a way to use SQL commands to manipulate data.frames. Perfect! But I'm having some trouble with this, as sqldf seems to require a database connection of some kind. Is there a way to simply connect to a data.frame I have in my global environment in RStudio?
Q: Couldn't you just manipulate the data in MySQL before importing it to R?
A: Yes, and that's what I'll do if I have to, but I'd like to understand sqldf better.
Try (sqldf looks up data.frames in your calling environment by name, so you can query them directly):
library(sqldf)
options(sqldf.driver = "SQLite")              # use the in-memory SQLite backend
sqldf("select * from book;", drv = "SQLite")  # here 'book' is the name of your data.frame

How can I get the current schema name with MyBatis?

Basically, I need to know if there is any way to get the current schema name using MyBatis.
The DB engine I'm using is MySQL.
The easiest way, for which you don't even need anything MyBatis-specific, would simply be a query:
SELECT DATABASE();
This should, according to the documentation, return the current database.
Alternatively, you should be able to get the Configuration from your SqlSession via getConfiguration() and find it somewhere in there, perhaps via the environment, which gives you access to the DataSource; but you will probably need some database-specific code there.

Is there a --des-key-file equivalent for AES?

When you use the DES_ENCRYPT/DES_DECRYPT functions in MySQL, you can point to your key file from my.cnf using the --des-key-file variable.
I thought this should also exist for AES_ENCRYPT/AES_DECRYPT, so I searched for hours but couldn't find it: is there an equivalent option for AES?
As far as I can tell from the documentation, no such option exists for AES_ENCRYPT. Instead, you are supposed to pass the key as a parameter directly in the query.
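For example, here is a minimal sketch of passing the key as a query parameter from a client application, using the pymysql driver in Python (the connection details, table name, and column name are assumptions made up for illustration):

import pymysql

# Placeholder connection details
conn = pymysql.connect(host="127.0.0.1", user="app", password="secret", database="mydb")
aes_key = "my-aes-key"  # in practice, load this from a config file or secret store, not source code

with conn.cursor() as cur:
    # The key travels as an ordinary bound parameter, since there is no server-side --aes-key-file option
    cur.execute(
        "INSERT INTO secrets (payload) VALUES (AES_ENCRYPT(%s, %s))",
        ("sensitive text", aes_key),
    )
conn.commit()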
This answer on DBA.SE suggests writing a User Defined Function that returns the key as one possible work-around.
Alternatively, you might want to consider not using the MySQL AES functions at all, and instead just doing all encryption and decryption in the client application. One potential advantage of such an approach is that, in order to obtain and decrypt the data, an attacker then needs to compromise both your database and your application.

Doing a bulk SQL insert in Django

Suppose I have a CSV file with 1M email addresses. I need to iterate through the file and add each entry, for example:
with open(file) as csv:
    for item in csv:
        Email.objects.create(email=item)
Going through the Django ORM like this to create 1M objects and insert them into the DB seems like it would be very slow. Is there a better way, or should I move away from Django for this task and do it directly against the DB?
You can also try using the new bulk_create.
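A minimal sketch of what that could look like (the file name and batch size are assumptions, not part of the original answer; batch_size is only available in newer Django versions):

# Build the Email instances without touching the DB, then insert them in batches
with open("emails.csv") as f:  # hypothetical file name
    emails = [Email(email=line.strip()) for line in f if line.strip()]

Email.objects.bulk_create(emails, batch_size=5000)  # batch_size keeps each INSERT statement bounded

Note that this still builds all the unsaved Email instances in memory first; see the chunked variant further down if that is a concern.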
Besides bulk_create, you could put all inserts into one transaction as long as your DB backend supports it:
from django.db.transaction import commit_on_success

# with commit_on_success(), open(file) as csv:  # Python 2.7 allows both context managers in one with-statement
with commit_on_success():
    for item in csv:
        Email.objects.create(email=item)
Also note that bulk_create treats items with the same values as identical, so
Email.objects.bulk_create([Email(email=item), Email(email=item)])
actually creates one row instead of two.
Because of the extra SQL round-trips, the transaction solution is still slower than the bulk_create one, but you don't have to create all one million Email() instances in memory (a generator does not seem to work here).
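If building a million unsaved Email() instances at once is the concern, one possible middle ground (not from the original answer; islice-based chunking is just one way to do it) is to feed bulk_create a slice at a time:

from itertools import islice

def bulk_insert_emails(path, chunk_size=5000):
    # Email is the model from the question; only chunk_size instances exist in memory at any time
    with open(path) as f:
        rows = (line.strip() for line in f if line.strip())
        while True:
            chunk = [Email(email=e) for e in islice(rows, chunk_size)]
            if not chunk:
                break
            Email.objects.bulk_create(chunk)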
Furthermore, you could do it directly at the SQL level.
This is something you should drop down to the DB-API to accomplish, since you bypass creating all the model objects.
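A rough sketch of what dropping to the DB-API could look like via Django's raw connection (executemany is standard DB-API; the table and column names below are guesses at what Django generated for the Email model, so adjust them to your schema):

from django.db import connection

def raw_bulk_insert(path):
    with open(path) as f, connection.cursor() as cursor:
        params = [(line.strip(),) for line in f if line.strip()]
        # 'myapp_email' is a hypothetical table name; check your model's db_table
        cursor.executemany("INSERT INTO myapp_email (email) VALUES (%s)", params)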
IMHO, I don't see a very big problem with speed if it's only a one-time insert (1M records won't take hours). If you'll be using the Django API to access those objects in the future, then you should probably avoid resorting to SQL-level inserts and do it through Django's methods, as suggested by livar (if you're using Django 1.4).
You might want to look into the Django DSE package, which is apparently an efficient bulk insert/update library.