Is there a way to (batch) delete all Subscriptions in Fiware Orion? - fiware

Due to an error in a script, I now have thousands of subscriptions (over 5000). Is there a way to delete them all other than by hand?
I know that for entities I can use this batch delete script; however, I could not find out whether there is a similar way to delete subscriptions. Is there even a batch mode for CRUD operations on subscriptions?

Not as far as I know, but it should be easy to do with a shell script and curl. Another option is to go through the MongoDB collection directly, but that would be a hack that could lead to inconsistencies ...
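For what it's worth, here is a minimal sketch of that script-it-yourself approach, written in Python with the requests library rather than shell/curl. It assumes a local Orion exposing the NGSIv2 API on the default port 1026 and no Fiware-Service multi-tenancy headers:

import requests

ORION = "http://localhost:1026"  # assumed local broker on the default port

while True:
    # NGSIv2 paginates results; since everything gets deleted, re-fetching
    # the first page on each pass eventually drains the list.
    resp = requests.get(ORION + "/v2/subscriptions", params={"limit": 100})
    resp.raise_for_status()
    subs = resp.json()
    if not subs:
        break
    for sub in subs:
        requests.delete(ORION + "/v2/subscriptions/" + sub["id"]).raise_for_status()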

Related

Is there any better way to do the data migration?

I have written a couple of methods to retrieve data from LDAP and put it into a MySQL database. I put those methods in a Listener so that they execute while the WAR is being deployed.
This is a one-time action: I have to take all the data from LDAP, put it into the MySQL DB, and then work on the database tables; I have nothing to do with the LDAP data after that.
Is there a better way to do this data migration? Since it is one-time work, once the database is created successfully there is no further need for these methods.
Please Suggest!
Thanks. :)
For migration exercises, look into the open-source Pentaho Data Integration tool (PDI, commonly known as Kettle).
There is a slight learning curve, but it's easy to use, and you'll have it forever.

Automatically Update Data in a Google Fusion Table hourly?

We are going to need to update data in a Google Fusion Table hourly and are looking into possibly using an SSIS package to accomplish this.
Has anyone had any experience with updating Google Fusion Tables automatically? Any methods work better than others?
I would decompose your problem into smaller problems until you reach ones you can solve.
How do I perform a task at X interval
In a pure Windows world, I'd use the native Task Scheduler. It's free and works fine for your scenario of "every hour."
Since SSIS is in the mix, that means you also have access to SQL Agent. It, too, is a good fit for your scenario; at this point I would look at your organization and determine which scheduling tool is predominantly used. It might be "neither."
How can I programmatically update Google Fusion Tables
There is a full Fusion Tables API published. It even has a DML syntax for working with data in a table. However, do observe the warning about using the query syntax for more than 500 rows / 10,000 cells / 1 MB:
Note: You can list up to 500 INSERT statements, separated by semicolons, in one request as long as the total size of the data does not exceed 1 MB and the total number of table cells being added does not exceed 10,000 cells. If you are inserting a large number of rows, use the import method instead, which will be faster and more reliable than using many SQL INSERT statements.
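To make those limits concrete, here is a hedged sketch of batching INSERT statements against the v1 query endpoint. The Fusion Tables service has since been retired, so treat this as illustrative only; the table id, column names, and token are hypothetical:

import requests

TABLE_ID = "1aBcDeFg..."  # hypothetical Fusion Table id
rows = [("2013-01-01 00:00", 42.0), ("2013-01-01 01:00", 43.5)]

# Stay under the documented limits: at most 500 INSERTs and 1 MB per request.
sql = "; ".join(
    "INSERT INTO %s (Timestamp, Reading) VALUES ('%s', %s)" % (TABLE_ID, ts, val)
    for ts, val in rows[:500]
)
resp = requests.post(
    "https://www.googleapis.com/fusiontables/v1/query",
    params={"sql": sql},
    headers={"Authorization": "Bearer <oauth2-access-token>"},  # placeholder
)
resp.raise_for_status()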
How can I use SSIS to update Google Fusion Tables
For anything that's not out of the box with SSIS, I usually re-ask the question as "how do I do X in .NET," because that's what it will boil down to. Since it's a web destination: SSIS has a Web Service Task, but it's not as useful as writing your own .NET caller.
I would envision an SSIS package with at least a Data Flow Task. Depending on where your data is coming from, it would have a source (OLE DB, flat file, etc.) and any transformations you need between that and the destination. Your destination will be a Script Component configured as a destination; there you'll use C# or VB.NET to send your INSERT/UPDATE commands to the web server. I found this sample of C# that sounds logical. I've never actually used the GFT API, so I can't comment on whether there's a better route.
Copying sanya's comment in here:
Warning: the attached sample C# script uses ClientLogin to authenticate against Google. This auth method has been deprecated since April 20, 2012. Usage of OAuth2 is supported.
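For completeness, a sketch of what the OAuth2 side might look like with today's google-auth library and a service account (all identifiers hypothetical, and again, the Fusion Tables service itself has since been retired):

from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # hypothetical key file
    scopes=["https://www.googleapis.com/auth/fusiontables"],
)
session = AuthorizedSession(creds)  # fetches and refreshes tokens as needed

resp = session.post(
    "https://www.googleapis.com/fusiontables/v1/query",
    params={"sql": "SHOW TABLES"},  # simple smoke test of the credentials
)
resp.raise_for_status()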

Put trigger in MySQL database to update Oracle database?

I want to create an insert trigger in MySQL that will automatically insert the record into an Oracle database. I would like to know if there are people who have experience to share on this topic.
Cheers
Invoke a script, as is done in this example, that calls the Oracle code.
Note: you lose support for transactions (there will be no built-in rollback for the Oracle database) when you perform this type of cascading, and you will also likely take a very large performance hit in doing so. The script could turn around and simply call Java code or some other executable that invokes some generic code of yours to insert into Oracle, or it could be a raw query that gets passed arguments from the script.
This is almost certainly a bad idea because of the odd side-effect behavior, but it is one that can be implemented. I think you would be much better off having code that writes to two different DataSources (in Java/.NET speak) rather than a hidden script in a MySQL trigger, which screams unmaintainability as well as hidden failures for future developers.
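A minimal sketch of that dual-DataSource approach at the application layer, here in Python with the mysql-connector-python and python-oracledb drivers (connection settings and table names are hypothetical):

import mysql.connector
import oracledb

mysql_conn = mysql.connector.connect(
    host="mysql-host", user="app", password="...", database="app")
ora_conn = oracledb.connect(user="app", password="...", dsn="oracle-host/ORCLPDB1")

try:
    mysql_conn.cursor().execute("INSERT INTO events (name) VALUES (%s)", ("signup",))
    ora_conn.cursor().execute("INSERT INTO events (name) VALUES (:1)", ("signup",))
    mysql_conn.commit()
    ora_conn.commit()  # not atomic across both databases, but failures are
                       # at least visible here, unlike inside a trigger
except Exception:
    mysql_conn.rollback()
    ora_conn.rollback()
    raise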

Migrating subsets of production data back to dev

In our rails app we sometimes have db entries created by users that we'd like to make part of our dev environment, without exporting the whole table. So, we'd like to be able to have a special 'dev and testing' dump.
Any recommended best practices? mysqldump seems pretty cumbersome, and we'd like to pull in Rails associations as well, so maybe a rake task would make more sense.
Ideas?
You could use an ETL tool like Pentaho Kettle. Once you have the initial transformation set up the way you want, you can easily rerun it with different parameters in the future. This way you can also keep all your associations. I wrote a little blurb about Pentaho for another question here.
If you provide a rough schema I could probably help you get started on what your transformation would look like.
I had a similar need and ended up creating a plugin for it. It was developed for Rails 2.x and worked fine for me, but I haven't had much use for it lately.
The documentation is lacking, but it's pretty simple. You basically install the plugin and then have a to_sql method available on all your models. The options are explained in the README.
You can try it out and let me know if you have any issues, I'll try to help.
I'd go after it using a Rails runner script. That will allow your code to access the same things your Rails app would, including the database initialization; ActiveRecord will be able to take advantage of the model relationships you've defined.
Create some "transfer" tables in your production database and copy the desired data into them using the runner script (a rough sketch of the idea follows below). From there you could serialize the data, or use a dump tool, since you'll be dealing with a reduced number of records. Reverse the process in the development environment to move the data into its database.
I had a need to populate the database in one of my apps from remote web logs, and I wrote a runner script, fired off periodically via cron, that FTPs the data from my site and inserts it.
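Here is a rough sketch of the transfer-table idea, written in plain Python with mysql-connector-python for concreteness (in practice it would live in the Rails runner script; table names and ids are hypothetical):

import mysql.connector

conn = mysql.connector.connect(
    host="prod-host", user="app", password="...", database="app_production")
cur = conn.cursor()

ids = "42, 97"  # the handful of user records wanted in dev

# Copy the selected rows plus their associated rows into transfer tables.
cur.execute("CREATE TABLE transfer_users AS SELECT * FROM users WHERE id IN (%s)" % ids)
cur.execute("CREATE TABLE transfer_posts AS SELECT * FROM posts WHERE user_id IN (%s)" % ids)
conn.commit()

# Then dump only the small transfer tables and load them into dev, e.g.:
#   mysqldump app_production transfer_users transfer_posts | mysql app_development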

Executing shell command from MySQL

I know what I'm looking for is probably a security hole, but since I managed to do it in Oracle and SQL Server, I'll give it a shot:
I'm looking for a way to execute a shell command from a SQL script on MySQL. It is possible to create and use a new stored procedure if necessary.
Note: I'm not looking for the SYSTEM command that the mysql command-line client offers. Instead, I'm looking for something like this:
BEGIN
  IF COND1 ... THEN
    EXEC_OS cmd1;
  ELSE
    EXEC_OS cmd2;
  END IF;
END;
where EXEC_OS is the method that invokes my command.
This isn't so much an answer to the question as a justification for this sort of functionality, to preempt those who would say "you should do something else" or ask "why would you want to?"
I have a database on which I am trying to keep strict rules - I don't want orphans anywhere. Referential integrity checks help me with this at the table level, but I have to keep some of the data as files within the filesystem (the result of a direct order from my boss not to store any binary data in the database itself).
The obvious solution here is to have a trigger which fires on deletion of a record, which then automatically deletes the associated external file.
Now, I do realise that UDFs may provide a solution, but that seems like a lot of C/C++ work simply to delete a file. Surely the database permissions themselves would provide at least some security from would-be assailants.
I also realise that I could write a shell script or some such that deletes the table record and then goes and deletes the related file, but again, that's outside the domain of the database itself. As an old instructor once told me, "the rules of the business should be reflected in the rules of the database." As one can clearly see, I cannot enforce this using MySQL.
You might want to consider writing your scripts in a more featureful scripting language, like Perl, Python, PHP, or Ruby. All of these languages have libraries to run SQL queries.
There is no built-in method in the stored procedure language for running shell commands. This is considered a bad idea, not only because it's a security hole, but because the effects of shell commands do not obey transaction isolation or rollback the way the effects of the SQL operations in a stored procedure do:
START TRANSACTION;
CALL MyProcedure();
ROLLBACK;
If MyProcedure did anything like create or edit a file, or send an email, etc., those operations would not roll back.
I would recommend doing your SQL work in the stored procedure, and do other work in the application that calls the stored procedure.
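A small sketch of that division of labor, assuming mysql-connector-python; the delete_document procedure, documents table, and file_path column are hypothetical, loosely modeled on the delete-record-plus-file scenario above:

import os
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="app", password="...", database="app")
cur = conn.cursor()

record_id = 123
cur.execute("SELECT file_path FROM documents WHERE id = %s", (record_id,))
(path,) = cur.fetchone()

# SQL work happens inside the transaction and can still roll back on error.
cur.callproc("delete_document", (record_id,))
conn.commit()

# Filesystem work only after the commit succeeds; it cannot be rolled back,
# so do it last and log any failure for a cleanup job to retry.
os.remove(path)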
See do_system() in http://www.databasesecurity.com/mysql/HackproofingMySQL.pdf
According to this post on forums.mysql.com, the solution is to use MySQL Proxy.