How to retrieve multiple documents in Couchbase with the Python client

Currently, the Couchbase Python client doesn't implement a getMulti() method. How can I retrieve multiple documents in bulk by providing multiple keys?
I am asking about the Couchbase client, not the memcached client.

As you noticed, this is not supported by the Couchbase Python client today, so you have to put the gets in a loop and fetch each document individually.
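For illustration, here is a minimal Python sketch of that workaround, assuming cb is an already-connected bucket/client object whose get(key) returns a result with a .value attribute (adjust to whatever your SDK version actually exposes):

    # Emulate a multi-get by looping over individual gets.
    # Assumes cb is an already-connected Couchbase bucket/client.
    def get_multi(cb, keys):
        """Return a dict mapping each key to its value (None if missing)."""
        results = {}
        for key in keys:
            try:
                results[key] = cb.get(key).value   # one network round trip per key
            except Exception:                      # e.g. key not found
                results[key] = None
        return results

    docs = get_multi(cb, ["user::1", "user::2", "user::3"])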
An issue has been logged on the subject:
http://www.couchbase.com/issues/browse/PYCBC-49

Related

How to generate notifications from MySQL table updates?

I have a full stack app that uses React, Node.js, Express, and MySQL. I want the react app to respond to database updates similar to Firebase: When data changes, I want a real-time notification sent to my app.
I want to use stock MySQL (no plugins), so that I can use AWS RDS or whatever.
I will use socket.io to push the real-time notifications to the web app.
To avoid off-target responses, I'll summarize various approaches that are not what I am looking for:
The server could poll, or each client could poll. (Not real-time, but included for completeness. When I search, polling is the only solution I find.)
Write a wrapper that handles all MySQL updates, handles subscriptions, and sends the notifications. This is a complicated component to build and adds complexity. Firebase is popular because it both increases performance and reduces complexity. I like Firebase a lot but want to do the same thing with MySQL.
Use Firebase to handle the real-time notifications. The MySQL wrapper could use Firebase to handle the subscriptions and notifications, but there is still the problem of triggering the notifications in the first place. Also, I don't want to use Firebase. (For example, my application needs to run in an air-gapped environment.)
The question: Using a stock MySQL database, when a table changes, can a notification server discover the change in real-time (no polling), so that it can send notifications?
The approach that works is to listen to the binary logs. This way, any change to the database will be communicated in real-time. The consumer of the binary logs can then publish this information in a number of ways. A common choice is to feed a stream of events to Apache Kafka.
Debezium, Maxwell, and NiFi work this way.
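As a rough sketch of that approach, the snippet below uses the third-party python-mysql-replication package to tail the binlog (the server must run with binlog_format=ROW); notify() here is a hypothetical stand-in for pushing the change over socket.io, Kafka, or whatever transport you choose:

    # Tail the MySQL binary log as a replication client and react to row changes.
    # Requires binlog_format=ROW and a user with replication privileges.
    from pymysqlreplication import BinLogStreamReader
    from pymysqlreplication.row_event import (
        DeleteRowsEvent,
        UpdateRowsEvent,
        WriteRowsEvent,
    )

    def notify(change):
        print(change)  # hypothetical: push over socket.io, Kafka, etc.

    stream = BinLogStreamReader(
        connection_settings={"host": "127.0.0.1", "port": 3306,
                             "user": "repl", "passwd": "secret"},
        server_id=100,          # must be unique among replication clients
        blocking=True,          # wait for new events instead of exiting
        resume_stream=True,
        only_events=[WriteRowsEvent, UpdateRowsEvent, DeleteRowsEvent],
    )

    for event in stream:
        # inserts carry "values", updates "before_values"/"after_values"
        for row in event.rows:
            notify({"schema": event.schema, "table": event.table, "row": row})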

How to synchronize MySQL database with Amazon OpenSearch service

I am new to the Amazon OpenSearch service, and I wish to know if there's any way I can sync a MySQL database with OpenSearch in real time. I thought of Logstash, but it seems like it doesn't support delete and update operations, which means my OpenSearch cluster might not get updated.
I'm going to comment for Elasticsearch as that is the tag used for this question.
You can:
Read from the database (SELECT * FROM TABLE)
Convert each record to a JSON document
Send the JSON document to Elasticsearch, preferably using the _bulk API.
Logstash can help with that. But I'd recommend modifying the application layer if possible and sending data to Elasticsearch in the same "transaction" as you are sending your data to the database.
I shared most of my thoughts there: http://david.pilato.fr/blog/2015/05/09/advanced-search-for-your-legacy-application/
Also have a look at this "live coding" recording.
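To make the three steps above concrete, here is a rough Python sketch; the table and index names are made up, and the official elasticsearch client's bulk helper is used to wrap the _bulk API:

    # SELECT -> JSON -> _bulk, using made-up connection details and a "products" table.
    import mysql.connector                   # pip install mysql-connector-python
    from elasticsearch import Elasticsearch  # pip install elasticsearch
    from elasticsearch.helpers import bulk

    db = mysql.connector.connect(host="localhost", user="app",
                                 password="secret", database="shop")
    es = Elasticsearch("http://localhost:9200")

    cursor = db.cursor(dictionary=True)      # rows come back as dicts -> ready-made JSON documents
    cursor.execute("SELECT * FROM products")

    actions = ({"_index": "products", "_id": row["id"], "_source": row} for row in cursor)
    bulk(es, actions)                        # sends the documents through the _bulk API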
Side note: if you want to run Elasticsearch, have a look at Cloud by Elastic, also available if needed from the AWS Marketplace, Azure Marketplace, and Google Cloud Marketplace.
Cloud by Elastic is one way to get access to all the features, all managed by us. Think about what is already there, like Security, Monitoring, Reporting, SQL, Canvas, Maps UI, and Alerting, plus the built-in solutions named Observability, Security, and Enterprise Search, and what is coming next :) ...
Disclaimer: I'm currently working at Elastic.
Keep a column that indicates when the row was last modified; then you will be able to push updates to OpenSearch. Similarly, for deleting, just have a column indicating whether the row is deleted (a soft delete) and the date it was deleted.
With this DB design, you can send "delete" or "update" actions to OpenSearch/Elasticsearch to update or delete the indexed documents based on the last-modified / deleted date. You can later have a scheduled maintenance job to delete these rows permanently from the database table.
Lastly, this article might be of help to you: How to keep Elasticsearch synchronized with a relational database using Logstash and JDBC.
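A hedged sketch of that incremental approach, assuming a hypothetical products table with last_modified, is_deleted, and deleted_at columns (the elasticsearch Python client is used for illustration; the opensearch-py client is analogous):

    # Push rows changed since the last run; soft-deleted rows become delete actions.
    import mysql.connector
    from elasticsearch import Elasticsearch
    from elasticsearch.helpers import bulk

    db = mysql.connector.connect(host="localhost", user="app",
                                 password="secret", database="shop")
    es = Elasticsearch("http://localhost:9200")

    def sync_since(last_run):
        cursor = db.cursor(dictionary=True)
        cursor.execute(
            "SELECT * FROM products WHERE last_modified > %s OR deleted_at > %s",
            (last_run, last_run),
        )
        actions = []
        for row in cursor:
            if row["is_deleted"]:
                actions.append({"_op_type": "delete", "_index": "products", "_id": row["id"]})
            else:
                actions.append({"_index": "products", "_id": row["id"], "_source": row})
        bulk(es, actions, raise_on_error=False)  # tolerate deletes of docs that are already gone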

How do I store JSON data from WebSockets?

I’m looking at storing raw JSON data received from a bunch of websockets and am feeling a bit overwhelmed with all the choices available. I found this from 2012 but things have moved on a bit since.
My requirements are to be able to query the data through an API, to get the latest message (ideally in realtime) or a subset of messages (like all messages received on a particular date).
I'll be storing around 10,000 messages daily from different sources with different schemas.
From some basic research I think I need some kind of nosql document store? The main ones seem to be:
Elasticsearch
Redis
MongoDB
CouchDB
Am I on the right lines here? What DB service should I use to store JSON data?
Thanks.
I ended up streaming data from Kafka to both Postgres (for persistent storage) and Elasticsearch (you know, for search) if that helps anyone.
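If it's useful, a loose sketch of that fan-out; topic, table, and index names are illustrative, and kafka-python, psycopg2, and the elasticsearch 8.x client are assumed:

    # Consume websocket messages from Kafka and write each one to Postgres and Elasticsearch.
    import json

    import psycopg2                          # pip install psycopg2-binary
    from elasticsearch import Elasticsearch  # pip install elasticsearch
    from kafka import KafkaConsumer          # pip install kafka-python

    consumer = KafkaConsumer(
        "ws-messages",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )
    pg = psycopg2.connect("dbname=messages user=app password=secret host=localhost")
    es = Elasticsearch("http://localhost:9200")

    for record in consumer:
        msg = record.value
        with pg.cursor() as cur:             # durable copy (e.g. a JSONB column)
            cur.execute("INSERT INTO raw_messages (payload) VALUES (%s)", (json.dumps(msg),))
        pg.commit()
        es.index(index="ws-messages", document=msg)  # searchable copy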

Unable to sync PouchDB with Couchbase Sync Gateway

I am trying to sync PouchDB with Couchbase through Sync Gateway, but I only get the data added by PouchDB, not the initial data added to Couchbase. For example, there are 750 docs in Couchbase, but none of them are synced to PouchDB. Also, http://localhost:4985/_admin/db/db is not showing the Couchbase docs either.
The problem is with adding data to Couchbase Server directly. Couchbase Mobile currently requires extra metadata in order to deal with replication and conflict resolution. This isn't handled by the Server SDKs.
The recommended approach is to do all database writes through Sync Gateway.
To simplify use with PHP, you may want to use a Swagger PHP client. (You can see an example of using clients autogenerated by Swagger in this post. The example uses JavaScript and Node.js, but the principles are the same.)
You can read from Couchbase Server directly if you want (to do a N1QL query, for example).
Another option is to use "bucket shadowing". This is trickier, and is likely to get deprecated at some point. I only list it for completeness.
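For completeness, a hedged sketch of the recommended route (writing through Sync Gateway's public REST port rather than the Server SDK); the answer above mentions a PHP client, but the same REST call works from any language, shown here in Python. The database name and ports mirror the defaults in the question and may differ in your setup.

    # Create/update a document via Sync Gateway so it carries the sync metadata.
    import requests

    SYNC_GATEWAY = "http://localhost:4984/db"  # public REST port; 4985 is the admin port

    def save_doc(doc_id, doc):
        resp = requests.put(f"{SYNC_GATEWAY}/{doc_id}", json=doc)
        resp.raise_for_status()
        return resp.json()                     # includes the new revision id

    save_doc("user::42", {"type": "user", "name": "Alice"})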

AWS ElastiCache: Is it possible for Redis-compatible ElastiCache to store JSON files?

Is it possible to store JSON files in Amazon Web Services' Redis-compatible ElastiCache? If so, what is the best method to accomplish this?
Edit: the answer below is now outdated. It seems AWS began to support this sometime after 2021. WOOHOO!
ReJSON (RedisJSON) is a module add-on in the Redis ecosystem that can store JSON directly in the Redis cache. However, modules are not supported in AWS ElastiCache due to licensing issues.
TL;DR: Amazon is profiting from open source and not contributing back. https://techcrunch.com/2019/02/21/redis-labs-changes-its-open-source-license-again/
So if you'd like to store JSON directly, you'd need to spin up your own cluster of EC2 instances and install Redis with ReJSON.
The alternative route is to see if you can store the "hash" or "string" data type in ElastiCache, depending on your use case...
If you go the route of storing a JSONified string in Redis as the value for a key, your application will have to parse it back into JSON after it retrieves the data from Redis.
The same goes for a hash, though with a hash you can access specific fields instead of the entire JSON document.
If you expect your application to add more fields in the future but you only need some fields in most cases, go with a hash. Otherwise, for simplicity's sake, go with a string.
https://redis.io/topics/data-types-intro
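A quick sketch of the two plain-Redis options above, using redis-py (key names are illustrative):

    import json
    import redis

    r = redis.Redis(host="localhost", port=6379, decode_responses=True)
    user = {"id": 42, "name": "Alice", "plan": "pro"}

    # Option 1: JSONified string per key -- simple, but you always read and parse the whole value.
    r.set("user:42", json.dumps(user))
    whole = json.loads(r.get("user:42"))

    # Option 2: hash per key -- lets you read or update individual fields.
    r.hset("user:42:h", mapping={"id": 42, "name": "Alice", "plan": "pro"})
    plan = r.hget("user:42:h", "plan")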
There is now native support for working with JSON in Amazon ElastiCache for Redis and Amazon MemoryDB for Redis.
From the release announcement on May 25th, 2022 (emphasis mine): https://aws.amazon.com/about-aws/whats-new/2022/05/json-support-amazon-elasticache-redis-amazon-memorydb-redis/
Amazon ElastiCache for Redis and Amazon MemoryDB for Redis now support natively storing and accessing data in the JavaScript Object Notation (JSON) format. With this launch, application developers can effortlessly store, fetch, and update their JSON data inside Redis without needing to manage custom code for serialization and deserialization.
Along with the announcement, AWS also published a blog post which describes how the new JSON support works and gives examples of how to use it: https://aws.amazon.com/blogs/database/unlocking-json-workloads-with-elasticache-and-memorydb/
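As a hedged example of what this looks like from code, redis-py 4.x exposes the JSON command set (JSON.SET / JSON.GET); the endpoint and key names below are placeholders:

    import redis

    # Point at an ElastiCache/MemoryDB endpoint with JSON support (placeholder hostname).
    r = redis.Redis(host="my-cluster.example.cache.amazonaws.com", port=6379,
                    decode_responses=True)

    order = {"id": 1001, "status": "new", "items": [{"sku": "abc", "qty": 2}]}

    r.json().set("order:1001", "$", order)             # JSON.SET
    status = r.json().get("order:1001", "$.status")    # fetch a single field
    r.json().set("order:1001", "$.status", "shipped")  # update a field in place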
The Redis JSON module is available with Redis Enterprise which is offered as a managed service on the major cloud providers (AWS, GCP, Azure).
It also works with the Redis Search module, giving you the ability to run more complex, SQL-like queries against your JSON data.
https://developer.redis.com/howtos/redisjson/getting-started/
The Redis Insight tool has some great tutorials walking you through Search+JSON.
https://redis.com/redis-enterprise/redis-insight/
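A rough sketch of combining the JSON and Search modules with redis-py 4.x; the index name, key prefix, and fields are made up:

    import redis
    from redis.commands.json.path import Path
    from redis.commands.search.field import NumericField, TextField
    from redis.commands.search.indexDefinition import IndexDefinition, IndexType
    from redis.commands.search.query import Query

    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    # Index JSON documents stored under the "order:" key prefix.
    r.ft("idx:orders").create_index(
        (
            TextField("$.status", as_name="status"),
            NumericField("$.total", as_name="total"),
        ),
        definition=IndexDefinition(prefix=["order:"], index_type=IndexType.JSON),
    )

    r.json().set("order:1", Path.root_path(), {"status": "shipped", "total": 42.5})

    # SQL-like filtering: shipped orders with a total of at least 40.
    results = r.ft("idx:orders").search(Query("@status:shipped @total:[40 +inf]"))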