Using boto v1.8d to work with Amazon Mechanical Turk - boto

I'm new to Amazon MTurk, and I'm trying to create HITs and fetch results through boto v1.8d. However, this has brought up a lot of issues, and I'm wondering if it is because I'm using an older version of boto.
Note: I can't use the latest release since this is my way of learning how to work with MTurk to implement an open source application called CrowdForge, which is only compatible with boto v1.8d.
If you know of any sample code for 1.8d and/or tutorials, I'd appreciate it if you shared a link.
For example, one thing I'm confused about is that the result of the following code is empty brackets "[]" instead of a value such as [10000].
#import classes needed from boto library
from boto.mturk.connection import MTurkConnection
#specify your own access key and secret access key
ACCESS_ID = 'myAccessKey'
SECRET_KEY = 'mySecretKey'
HOST = 'mechanicalturk.amazonaws.com'
SECURE = True
#make a connection
mtc = MTurkConnection(aws_access_key_id=ACCESS_ID,
                      aws_secret_access_key=SECRET_KEY,
                      host=HOST,
                      is_secure=SECURE)
#display the account balance
print mtc.get_account_balance()

How to initialize MediaWiki shared database tables?

I am trying to initialize a new MediaWiki family, using this guide. In the Upgrading section of the guide, it is mentioned:
As of MediaWiki 1.21, when upgrading MediaWiki from the web installer, $wgSharedTables must be temporarily cleared during upgrade. Otherwise, the shared tables are not touched at all (neither tables with $wgSharedPrefix, nor those with $wgDBprefix), which may lead to a failed upgrade.
And the warning is right: using this setting,
$wgSharedDB = 'wiki_shared';
$wgSharedTables[] = array('user','user_groups','actor');
$wgSharedPrefix = '';
I had no success in setting the db up; no shared tables were created in the wiki_shared db (it remained an empty db).
How should I "clear $wgSharedTables" to avoid facing this issue?
(Even though this is old, just in case someone gets here...)
First of all, this is how to clear $wgSharedTables:
$wgSharedTables = [];
All the other variants (such as $wgSharedTables[] = [];) just add a new empty array as an item inside the array.
Also, this is not the way to set $wgSharedTables: you essentially added an array inside the array, whereas each table is supposed to be its own item. Either use array_merge():
$wgSharedTables = array_merge( $wgSharedTables, [ 'user', 'user_groups', 'actor' ] );
Or set each one separately:
$wgSharedTables[] = 'user';
$wgSharedTables[] = 'user_groups';
$wgSharedTables[] = 'actor';

No Role found with id: 106 (HTTP 404) in Keyrock when creating a new application

I have a problem when creating a new application in Horizon (Identity Manager GE).
I'm logged in as the idm user; when creating an application, on the first step, after I specify the name, description, callback and URL and press next, I get the following error:
Error: No Role found with id: 106 (HTTP 404)
Error: Unable to register the application.
What might be the problem?
The mistake was in Horizon's local_settings.py file.
Instead of the attributes FIWARE_PURCHASER_ROLE_ID and FIWARE_PROVIDER_ROLE_ID pointing to the corresponding row ids of the role_fiware table, like this:
FIWARE_PURCHASER_ROLE_ID = '5786623590bc4f3ab01c61733a13ee6d'
FIWARE_PROVIDER_ROLE_ID = '4806909eb4b646c7a1f11ad9f9ed53ed'
the attributes were:
FIWARE_PURCHASER_ROLE_ID = '106'
FIWARE_PROVIDER_ROLE_ID = '191'
I guess those are the default values for an sqlite db.
So if you are using a MySQL db, just set these attributes in this file to the correct ids from the role_fiware table.
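For reference, here is a minimal sketch of how one might list the role ids to copy into those attributes. It assumes a MySQL database named keystone, id and name columns on role_fiware, and the MySQLdb driver; the db name, credentials and column names are assumptions, so adjust them to your setup:
# Hypothetical sketch: list the role_fiware rows so the correct ids can be
# copied into FIWARE_PURCHASER_ROLE_ID / FIWARE_PROVIDER_ROLE_ID.
import MySQLdb
conn = MySQLdb.connect(host='localhost', user='root', passwd='secret', db='keystone')
cur = conn.cursor()
cur.execute("SELECT id, name FROM role_fiware")
for role_id, name in cur.fetchall():
    print role_id, name
conn.close()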

How to use ReplaceRows from .NET Google.Apis.Fusiontables.v2 (stream csv)?

Goal: to update a Fusion Table by replacing the old rows with new ones from a headerless csv file using ReplaceRows().
I am using the Google.Apis.Fusiontables.v2 library.
I have read and reread the documentation, but still can't get my code working.
Authentication is working and I am able to perform simple INSERTs without issue:
string sql = "INSERT INTO 11t9VLt3vzb46oGQMaS2LTSPWUyBYNcfi1shkmvag (rpu_id, NO_BAIL, 'Usage (description)', 'Use (description)', 'Sup. louable m2', 'Sup. Utilisable m2', 'SumTotal Lou', 'Percent Lou', 'SumTotal Util', 'Percent Util') VALUES (9999,1111,'Test','Test En',1,2,3,4,5,6)";
Sqlresponse sqlRspnse = service.Query.Sql(sql).Execute();
I have tried ReplaceRows and ReplaceRowsMediaUpload directly from the TableResource class without luck.
Calling the upload function from the service object doesn't error out, but I'm not sure what to do next that would actually replace the rows in the Fusion Table (service is a FusiontablesService):
StreamReader str = new StreamReader(Server.MapPath("~") + @"\sample2.csv");
service.Table.ReplaceRows("1X7JMLFy75uq20UnU6cLrGTTDfp6lLuD1Fc3vYYjQ", str.BaseStream, "text/csv").Upload();
I've tried:
service.Table.ReplaceRows("1X7JMLFy75uq20UnU6cLrGTTDfp6lLuD1Fc3vYYjQ").Execute()
following the upload, but this just puts the Fusion table in "stuck" mode.
Can someone please provide the lines required to make ReplaceRows work? (Explanations would be appreciated, but aren't necessary!).
You should change "text/csv" to "application/octet-stream" (see the accepted MIME type here: https://developers.google.com/fusiontables/docs/v2/reference/table/replaceRows):
StreamReader str = new StreamReader(Server.MapPath("~") + @"\sample2.csv");
service.Table.ReplaceRows("1X7JMLFy75uq20UnU6cLrGTTDfp6lLuD1Fc3vYYjQ", str.BaseStream, "application/octet-stream").Upload();
The call to Upload should be enough.
Also, try creating a new table to test it out, to be sure it is set up correctly.
You can use a REST API call to replace the rows of your Google Fusion Table directly instead of writing methods to do that. Here is an example:
POST https://www.googleapis.com/upload/fusiontables/v2/tables/tableId/replace
Please refer to this document for more details; it has a testing environment tool too.
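For illustration, here is a rough sketch of that REST call in Python with the requests library; the OAuth token and file name are placeholders, and the table id is the one from the question:
# Hypothetical sketch: replace all rows of a Fusion Table via the REST endpoint.
import requests
table_id = '1X7JMLFy75uq20UnU6cLrGTTDfp6lLuD1Fc3vYYjQ'
url = 'https://www.googleapis.com/upload/fusiontables/v2/tables/%s/replace' % table_id
with open('sample2.csv', 'rb') as f:
    resp = requests.post(url,
        headers={'Authorization': 'Bearer YOUR_OAUTH2_TOKEN',  # placeholder
                 'Content-Type': 'application/octet-stream'},
        data=f)
print resp.status_code, resp.text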

BigQuery: how to get the JobReference of an insert job for a local CSV file

I have a small piece of code to send a CSV file to a BigQuery table.
The CSV file is on a local HD (not on Google Cloud Storage).
Here is a simplified version of the C# code (using version 2.1.5.0.122 of the BigQuery C# API).
BigqueryService bq = someMethodToGetABigqueryServiceInstance();
JobConfigurationLoad loadJobCfg = new JobConfigurationLoad();
loadJobCfg.SourceFormat = "CSV";
// ...
Job job = new Job();
JobConfiguration config = new JobConfiguration();
config.Load = loadJobCfg;
job.Configuration = config;
FileStream fs = new FileStream(@"c:\temp\onecol.csv", FileMode.Open, FileAccess.Read, FileShare.Read);
JobsResource.InsertMediaUpload insert = bq.Jobs.Insert(job, projectId, fs, "application/octet-stream");
var progress = insert.Upload();
// wait for Google.Apis.Upload.UploadStatus.Completed
// ...
The problem is that when I receive the Completed status, the file has been uploaded, but the data is not in the target table yet (i.e. the job is still running).
Normally I should be able to wait for the job to be finished ('DONE') if I can get its reference (or ID), but I can't find a way to obtain that reference in the C# API.
Is it possible to get the job reference from a JobsResource.InsertMediaUpload?
(insert.Body.JobReference is null)
Or is there another way to upload a local file to a bigquery table?
We recommend for all load jobs that you pass your own job reference -- that way, in the event of a network hiccup, you'll be able to tell the state of the job. The job reference must be unique within the project, which is easy to ensure by using a random number or the current time.
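To illustrate the idea (sketched here in Python with the google-api-python-client rather than the C# library; the project, dataset and file names are placeholders), the key part is filling in jobReference yourself before inserting the job, then polling jobs().get with that same id:
# Hypothetical sketch: pass your own jobId so the load job can be polled later.
import time
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload
service = build('bigquery', 'v2')  # assumes credentials are already configured
project_id = 'my-project'  # placeholder
job_id = 'load_onecol_%d' % int(time.time())  # unique within the project
job_body = {
    'jobReference': {'projectId': project_id, 'jobId': job_id},
    'configuration': {'load': {
        'sourceFormat': 'CSV',
        'destinationTable': {'projectId': project_id,
                             'datasetId': 'my_dataset',  # placeholder
                             'tableId': 'onecol'}}}}     # placeholder
media = MediaFileUpload('onecol.csv', mimetype='application/octet-stream')
service.jobs().insert(projectId=project_id, body=job_body, media_body=media).execute()
# Poll the job by the id we chose until the load finishes.
while service.jobs().get(projectId=project_id, jobId=job_id).execute()['status']['state'] != 'DONE':
    time.sleep(2)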

Multiple individual users on one database

I have an SQL database that I interact with using Django.
In the beginning the database is filled with public data that can be accessed by anyone.
Multiple individual users can add rows to a table (private data).
How can a user see only the changes he made in the database (private data)?
I assume you're using django.contrib.auth. You just need to do something like:
from django.contrib.auth.models import User
# ...
class PrivateData(models.Model):
    # ... private data fields ...
    user = models.ForeignKey(User)
Then you can get just that user's fields with:
PrivateData.objects.filter(user=request.user)
EDIT: So, if your users are just IP addresses, and you're not using a login mechanism, you don't really need django.contrib.auth... though it's good to have anyway since you can use it to authenticate yourself and use the built-in admin stuff to manage your site.
If you just want to tie data to IP addresses, set up an IPUser model:
class IPUser(models.Model):
    address = models.CharField(max_length=64, unique=True)  # Big enough for IPv6
    # Add whatever other discrete (not list) data you want to store with this address.

class PrivateData(models.Model):
    # ... private data fields ...
    user = models.ForeignKey(IPUser)
The view function looks something like:
def the_view(request):
    remoteAddr = request.META['REMOTE_ADDR']
    try:
        theUser = IPUser.objects.get(address=remoteAddr)
    except IPUser.DoesNotExist:
        theUser = IPUser.objects.create(address=remoteAddr)
    userModifiedData = PrivateData.objects.filter(user=theUser)
One thing to note: when you're testing this with manage.py runserver, you'll need to specify the IP address via environment variable:
$ REMOTE_ADDR=127.0.0.1 manage.py runserver
When you use Django with a real web server like Apache, the server will set the variable for you.
There are probably several ways to optimize this, but this should get you started.
I'm assuming that users have to log into this application. If so, add a column for the username to every table, and add WHERE username = ? to every query so that each user sees only their own data.
For data manipulation requests, make sure that the username matches the value on every affected row; forbid the operation if it doesn't.
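As a minimal sketch of that pattern with the plain Python DB-API and sqlite3 (the table and column names are made up for illustration):
# Hypothetical sketch: scope every read and write to the logged-in user.
import sqlite3
conn = sqlite3.connect('app.db')
cur = conn.cursor()
username = 'alice'  # taken from the authenticated session in a real app
# Reads: only return the current user's rows.
cur.execute("SELECT id, payload FROM private_data WHERE username = ?", (username,))
rows = cur.fetchall()
# Writes: refuse to touch rows owned by someone else.
cur.execute("UPDATE private_data SET payload = ? WHERE id = ? AND username = ?",
            ('new value', 42, username))
if cur.rowcount == 0:
    raise Exception('row missing or owned by another user')
conn.commit()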