I'm trying to use a CSV file to bulk import users:
username, firstname, lastname, email, idnumber, auth, country, city, institution, course1
a@b.ac.uk, a, z, a@b.ac.uk, a, LDAP, GB, London, B, B-A-STA
The auth entry drops out, saying:
LDAP
Auth plugin not supported here
but the other entries seem fine and the user gets added. I can't seem to find documentation on adding auth types; how do I specify LDAP?
I use CAS auth here and always get a similar message, but users are added normally. I don't know if it is a bug...
We are running a custom app on Invantive Data Access Point, which adds business functionality to Exact Online. For billing purposes, we would like to somehow register actual use of the software as defined in business terms, instead of memory used, CPU time, SQL statements executed, etc.
We do not yet have custom tables and I would like to keep it that way, so the whole state is kept in memory and in Exact Online only. So "insert into mytable@sqlserver..." is not an option. Nor does Exact Online offer the possibility to create custom tables, as Salesforce does.
How can we somehow register billable events, such as "Performed an upload of 8 bank transactions" under this condition?
For billing purposes, you can piggyback on the Customer Service infrastructure, which is similar to functionality offered by AWS or Apple for this purpose in their ecosystems. The "table" which stores the billing events, like a Call Detail Record of a PBX, is managed by the Customer Service infrastructure.
There are two options:
Your apps use the default audit and license event registrations like "User logged on", "First use of partition #xyz", etc. each with a specific message code like 'itgenlic125'.
Your apps define their own event types like "Performed an upload of bank transactions", with a message code 'mybillingmessagecode123' and the number '8' as quantity in the natural key.
The first option is always performed automatically. This data is also used to manage resource consumption and to detect runaway processes.
The second option is best done using Invantive SQL with the data dictionary table "auditevents". All records inserted into auditevents are automatically and asynchronously forwarded to Customer Service. To see the audit events registered since the start of the application:
select *
from auditevents@datadictionary
The columns include:
occurrence_date: when it happened.
logging_level: always "Audit".
message_code: code identifying the type of event.
data_container_id: ID of the data container, used with distributed SQL transactions.
partition: partition within the data container for platforms such as Exact Online or Microsoft SQL Server which store multiple databases under one customer/instance.
session_id: ID of the session.
user_message: actual text.
last_nk: last used natural key.
application_name: name of the application.
application_user: user as known to the application.
gui_action: action within the GUI.
And some auditing and licensing information fields.
To register a custom event:
insert into auditevents@datadictionary select * from auditevents@datadictionary
Only some fields can be provided; the rest are automatically determined:
message_code
user_message
last_natural_key
application_name
application_user
gui_action
gui_module
partition
provider_name
reference_key
reference_table_code
session_id
To receive the billing events yourself from the infrastructure, you will need to access the Customer Service APIs or have them automatically forwarded to a mail, Slack, RocketChat or Mattermost channel.
A sample SQL statement:
insert into auditevents@datadictionary
( message_code
, user_message
, last_natural_key
, application_name
, gui_action
, gui_module
, reference_key
, reference_table_code
, partition
)
select 'xxmycode001' message_code
, 'Processed PayPal payments in Exact Online for ' || divisionlabel user_message
, 'today' last_natural_key
, 'PayPalProcessor' application_name
, 'xx-my-paypal-processor-step-2' gui_action
, 'xx-my-paypal-processor' gui_module
, clr_id reference_key
, 'clr' reference_table_code
, division partition
from settings@inmemorystorage
I am using the Marketo API V1 to get lead data from a customer's Marketo account. I have successfully connected to the API (by going through their documentation).
I can get data for a single lead; however, it only returns the default fields (id, firstName, lastName, email). I know there are a lot of custom data fields (company, salutation, jobTitle, etc.), but these do not show up in the API response. Do you know how I can access this custom data?
Below is the API URL I am using, which works fine, just without all the data I require:
https://<<url>>/rest/v1/leads.json?access_token=<<token>>&filterType=email&filterValues=oliver@test.com
This returns:
{"requestId":"1261b#14f40fc3156","result":[{"id":2755951,"updatedAt":"2015-08-18T10:58:42Z","lastName":"wells","email":"oliver#test.com","createdAt":"2015-06-02T09:36:48Z","firstName":"oliver"}],"success":true}
Thank you very much!
You need to include a parameter, 'fields', with a comma-separated list of the field names to retrieve: http://developers.marketo.com/documentation/rest/get-multiple-leads-by-filter-type/ (see example 2 there).
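For example, to also pull the custom fields mentioned in the question (the field names here are assumptions taken from the question; substitute your own):
https://<<url>>/rest/v1/leads.json?access_token=<<token>>&filterType=email&filterValues=oliver@test.com&fields=email,firstName,lastName,company,salutation,jobTitle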
I'm a real beginner with Node.js. I want to build a chat service using Node.js. I use Node.js/Jade/MySQL for the basic parts of my system, and now I want to provide pub/sub to users.
We receive users' interests from a text field or via hashtags (either way, we receive users' interests and store them in MySQL; this part is done). Then we want to show users a list of chat rooms according to their interests. For instance, if A's interests are 'game', 'car' and 'food', then we search chat rooms tagged 'game', 'car' and 'food' and show A these chat rooms first.
I want to use Redis to provide this service, but I really have no idea how!
1) I installed redis and can run redis-server.
2)
// redis
var redis = require('redis');

// A client in subscriber mode cannot issue regular commands,
// so use separate connections for publishing and subscribing.
var publisher = redis.createClient();
var subscriber = redis.createClient();

// Fired for every message published on a channel we subscribed to.
subscriber.on('message', function(channel, message){
    console.log('Message ' + message + ' on channel ' + channel + ' arrived!');
});

// Fired once the subscription is confirmed; publish two test messages.
subscriber.on('subscribe', function(channel){
    publisher.publish('test', 'the a team');
    publisher.publish('test', 'the b team');
});

subscriber.subscribe('test');
This is a short snippet I wrote to try to understand Redis.
3) I don't know how I can read the data stored in MySQL and show users chat rooms according to their interests using Redis.
Redis is an advanced key-value cache and store. Its operations cannot be directly mapped to MySQL's.
In Redis you can set either a key-value pair or a hash under a key.
That is:
If you want to store your name in Redis, it can be done by:
var client = redis.createClient();
client.set("name", "John")
Retrieve the value using client.get("name").
Similarly, under a single key you can store multiple key-value pairs as a hash.
That is, if under a name you want to store details like age, place, company, etc., then a hash should be used.
Redis has the methods "hmset" and "hmget" for hash operations (see the sketch at the end of this answer).
In Redis, as in a cache, you can set an expiry time on keys.
There are many other methods available; you can explore those.
For reference: http://redis.io/commands
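As a minimal sketch of the set/get, hash and expiry usage described above (shown with Python's redis-py client purely for illustration; node_redis exposes the same SET/GET/HMSET/HMGET/EXPIRE commands, and the key and field names here are made up):

import redis

# Connect to the locally running redis-server.
r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# A plain key-value pair.
r.set("name", "John")
print(r.get("name"))  # "John"

# A hash: several field-value pairs stored under one key
# (hset with a mapping is the modern spelling of HMSET).
r.hset("user:john", mapping={"age": "30", "place": "London", "company": "Acme"})
print(r.hgetall("user:john"))  # {'age': '30', 'place': 'London', 'company': 'Acme'}

# As in a cache, a key can be given an expiry time (in seconds).
r.expire("user:john", 3600)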
I was tinkering around with verifying a couple of domains and found the manual process rather tedious. My DNS controller offers API access, so I figured: why not script the whole thing?
The trick is I can't figure out how to access the TXT & CNAME records required for DKIM verification from boto. When I punch in
dkims = conn.verify_domain_dkim('DOMAIN.COM')
it adds DOMAIN.COM to the list of domains pending verification but doesn't provide the needed records; the returned value of dkims is
{'VerifyDomainDkimResponse': {
'ResponseMetadata': {'RequestId': 'REQUEST_ID_STRING'},
'VerifyDomainDkimResult': {'DkimTokens': {
'member': 'DKIMS_TOKEN_STRING'}}}}
Is there some undocumented way to take the REQUEST_ID or TOKEN_STRING to pull up these records?
UPDATE
If you have an aws account you can see the records I'm after at
https://console.aws.amazon.com/ses/home?region=us-west-2#verified-senders:domain
tab: Details:: Record Type: TXT (Text)
tab: DKIM:: DNS Record 1, 2, 3
These are the records that need to be added to the DNS controller to validate the domain & allow DKIM signing to take place.
This is how I do it with Python.
from boto3 import Session

DOMINIO = 'mydomain.com'

session = Session(
    aws_access_key_id=MY_AWS_ACCESS_KEY_ID,
    aws_secret_access_key=MY_AWS_SECRET_ACCESS_KEY,
    region_name=MY_AWS_REGION_NAME)
client = session.client('ses')

# gets the VerificationToken for the domain, which will be used to add a TXT record to the DNS
result = client.verify_domain_identity(Domain=DOMINIO)
txt = result.get('VerificationToken')

# gets the DKIM tokens that will be used to add 3 CNAME records
result = client.verify_domain_dkim(Domain=DOMINIO)
dkim_tokens = result.get('DkimTokens')  # this is a list
At the end of the code, you will have "txt" and "dkim_tokens" variables, a string and a list respectively.
You will need to add a TXT record to your DNS, where the host name is "_amazonses" and the value is the value of the "txt" variable.
You will also need to add 3 CNAME records to your DNS, one for each token present in the "dkim_tokens" list, where the host name of each record is of the form [dkimtoken]._domainkey and the target is [dkimtoken].dkim.amazonses.com (see the snippet below for the exact records).
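Continuing the snippet above, a quick way to print exactly what to enter in your DNS (the record layout simply follows the naming convention just described):

# Print the DNS records to create for domain verification and DKIM.
print("TXT   _amazonses.%s -> %s" % (DOMINIO, txt))
for token in dkim_tokens:
    print("CNAME %s._domainkey.%s -> %s.dkim.amazonses.com" % (token, DOMINIO, token))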
After adding the DNS records, within some minutes (maybe a couple of hours) Amazon will detect and verify the domain and will send you an email notification. After that, you can enable DKIM signing with this call:
client.set_identity_dkim_enabled(Identity=DOMINIO, DkimEnabled=True)
The methods used here are verify_domain_identity, verify_domain_dkim and set_identity_dkim_enabled.
You may also want to take a look at get_identity_verification_attributes and get_identity_dkim_attributes.
I think the get_identity_dkim_attributes method will return the information you are looking for. You pass in the domain name(s) you are interested in and it returns the status for that identity as well as the DKIM tokens.
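For instance, a minimal sketch reusing the boto3 client and DOMINIO variable from the answer above:

# Returns the DKIM status and tokens for each identity passed in.
response = client.get_identity_dkim_attributes(Identities=[DOMINIO])
attributes = response['DkimAttributes'][DOMINIO]
print(attributes['DkimVerificationStatus'])  # e.g. 'Pending' or 'Success'
print(attributes['DkimTokens'])              # the tokens for the 3 CNAME records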
I want to export all my Facebook contacts into .CSV format or any other contacts format.
I have tried importing Facebook contacts into Yahoo Mail and then exporting them as .CSV, but it didn't work with the new Yahoo.
If all you need is their names, you can call
https://graph.facebook.com/[YOUR_USER_ID]/friends?access_token=[ACCESS_TOKEN]
with an application access token, either from one of your current applications or from the Facebook access token tool.
This will return the name and Facebook ID of all your friends in JSON form, which you can convert to .CSV by just looping through the data array, as in the sketch below.
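A minimal sketch of that loop in Python (the URL placeholders are the ones from the call above; the output file name is made up):

import csv
import json
import urllib.request

# Substitute your own user ID and access token for the placeholders.
url = "https://graph.facebook.com/[YOUR_USER_ID]/friends?access_token=[ACCESS_TOKEN]"
with urllib.request.urlopen(url) as response:
    payload = json.load(response)

# Each entry in the "data" array has a "name" and an "id" field.
with open("friends.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "id"])
    for friend in payload["data"]:
        writer.writerow([friend["name"], friend["id"]])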
If you need more than just their name and Facebook ID, you'll have to use FQL by calling:
https://graph.facebook.com/fql?q=SELECT first_name, last_name, profile_url FROM user WHERE uid in (SELECT uid2 FROM friend WHERE uid1=me())&access_token=[ACCESS_TOKEN]
The full list of fields you can query in your SELECT can be found here.
There appears to be a Facebook application that can do this for you. I'm uncertain whether it actually works, as I have not tested it.