Embedded expression to frame a request with a string in Karate - parameter-passing

I am having trouble parameterizing the field below.
* def temp = 'KSG-' + user+ '-GS'
* print temp
* def user = ('#(temp)'\n-C453/M-R/UVE S/J\n)\n
The actual field looks like user: "(KSG-ABCDE-GS\n-C453/M-R/UVE S/J\n)\n"
(the full field value, including the \n characters, is enclosed in quotes)
I get a JavaScript evaluation failure when I supply the temp value as shown above. Please correct me if I am passing the temp value the wrong way in the * def user part of the code.

Did you read the docs I linked in answer to your previous question?
The code below should work. If you want to embed escaped line feeds, just use \\n:
* def user = 'foo'
* def temp = '(KSG-' + user + '-GS\n-C453/M-R/UVE S/J\n)\n'
* print temp
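If you prefer to keep temp as its own variable, plain string concatenation should also work; a rough sketch (field is just an illustrative name, not anything required by Karate):
* def user = 'ABCDE'
* def temp = 'KSG-' + user + '-GS'
* def field = '(' + temp + '\n-C453/M-R/UVE S/J\n)\n'
* print field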

Related

How to ignore NoneType error in tweet ID statuses_lookup

I am trying to collect tweets with tweepy from a list of tweet ids good_tweet_ids_test, using statuses_lookup.
Since the list is a bit old, some tweets will have been deleted by now. Therefore I ignore errors in the lookup_tweets function, so it does not stop each time.
Here is my code so far:
def lookup_tweets(tweet_IDs, api):
    full_tweets = []
    tweet_count = len(tweet_IDs)
    try:
        for i in range((tweet_count // 100) + 1):
            # Catch the last group if it is less than 100 tweets
            end_loc = min((i + 1) * 100, tweet_count)
            full_tweets.extend(
                api.statuses_lookup(tweet_IDs[i * 100:end_loc], tweet_mode='extended')
            )
        return full_tweets
    except:
        pass
results = lookup_tweets(good_tweet_ids_test, api)
temp = json.dumps([status._json for status in results]) #create JSON
newdf = pd.read_json(temp, orient='records')
newdf.to_json('tweepy_tweets.json')
But when I run the temp = json.dumps([status._json for status in results]) line, it gives me the error:
TypeError: 'NoneType' object is not iterable
I do not know how to fix this. I believe some of the statuses are None because the tweets have been deleted and can therefore no longer be looked up. I simply want my code to move on to the next status if it is None.
EDIT: As has been pointed out, the issue is that results is None. So now I think I need to exclude None values from the full_tweets variable, but I cannot figure out how. Any help?
EDIT2: With further testing I have found that results is only None when the batch contains a tweet ID that has since been deleted. If the batch contains only active tweets, it works. So I think I need my function to look up the batch of tweets and return only those that are not None. Any help on this?
Instead of implicitly returning None when there's an error, you could explicitly return an empty list. That way, the result of lookup_tweets will always be iterable, and the calling code won't have to check its result:
def lookup_tweets(tweet_IDs, api):
    full_tweets = []
    tweet_count = len(tweet_IDs)
    try:
        for i in range((tweet_count // 100) + 1):
            # Catch the last group if it is less than 100 tweets
            end_loc = min((i + 1) * 100, tweet_count)
            full_tweets.extend(
                api.statuses_lookup(tweet_IDs[i * 100:end_loc], tweet_mode='extended')
            )
        return full_tweets
    except:
        return []  # Here!
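If you also want a single failing batch not to throw away everything collected so far (your EDIT2), one option is to move the error handling inside the loop and drop any None statuses. A rough sketch, assuming an older tweepy where the exception class is tweepy.TweepError and api.statuses_lookup is available (the function name is just illustrative):
import tweepy

def lookup_tweets_skipping_failures(tweet_IDs, api):
    full_tweets = []
    tweet_count = len(tweet_IDs)
    for i in range((tweet_count // 100) + 1):
        # Slice out the next batch of up to 100 IDs
        batch = tweet_IDs[i * 100:min((i + 1) * 100, tweet_count)]
        if not batch:
            continue
        try:
            statuses = api.statuses_lookup(batch, tweet_mode='extended')
        except tweepy.TweepError:
            continue  # skip only this batch and keep going
        # Drop any None entries in case missing IDs are mapped to None
        full_tweets.extend(s for s in statuses if s is not None)
    return full_tweets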

How to do a SQL query using a string wildcard and LIKE?

I am new to python and currently learning to use SQL with python. I have the following code:
word = input("Enter a word: ")
query = cursor.execute("SELECT * FROM Dictionary WHERE Expression LIKE '%s%' " % word)
results = cursor.fetchall()
The second line throws an error since I don't think I can use '%s%' like that? How would I change this so as to be able to make this work? I want to be able to return all related entries to the users input. So if the user inputs "rain", then I want the query to return all possible results e.g. "raining", "rainy" etc. Thank you.
You can try
query = cursor.execute(f"SELECT * FROM Dictionary WHERE Expression LIKE '%{word}%' ")
You should use cursor.execute() parameter substitution rather than string formatting, to prevent SQL injection.
Then use CONCAT() to surround the search string with %.
query = cursor.execute("SELECT * FROM Dictionary WHERE Expression LIKE CONCAT('%', %s, '%')", (word,))
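Alternatively, you can build the wildcard pattern in Python and bind it as a single parameter, which sidesteps literal % characters in the SQL (some drivers require those to be doubled as %%). A sketch, assuming a MySQLdb/PyMySQL-style driver that uses %s placeholders and the same cursor as in the question:
word = input("Enter a word: ")
pattern = "%" + word + "%"  # e.g. "rain" becomes "%rain%"
cursor.execute("SELECT * FROM Dictionary WHERE Expression LIKE %s", (pattern,))
results = cursor.fetchall()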

Automation script: Get value from related table

I'm trying to break this problem down into manageable parts: Spatial Query.
I want to create an automation script that will put a work order's LatitudeY coordinate in the work order's DESCRIPTION field.
I understand that a work order's coordinates are not stored in the WORKORDER table; they're stored in the WOSERVICEADDRESS table.
Therefore, I believe the script needs to reference a relationship in the Database Configuration application that will point to the related table.
How can I do this?
(Maximo 7.6.1.1)
You can get the related Mbo, read values from it, and use them as shown in the code below. Once you have the related Mbo, you can also alter its attributes.
from psdi.mbo import MboConstants

serviceAddressSet = mbo.getMboSet("SERVICEADDRESS")
if serviceAddressSet.count() > 0:
    serviceAddressMbo = serviceAddressSet.moveFirst()
    latitudeY = serviceAddressMbo.getString("LATITUDEY")
    longitudeX = serviceAddressMbo.getString("LONGITUDEX")
    mbo.setValue("DESCRIPTION", "%s, %s" % (longitudeX, latitudeY), MboConstants.NOACCESSCHECK)
serviceAddressSet.close()
I've got a sample script that compiles successfully:
from psdi.mbo import MboConstants
wonum = mbo.getString("WONUM")
mbo.setValue("DESCRIPTION",wonum,MboConstants.NOACCESSCHECK)
I can change it to get the LatitudeY value via the SERVICEADDRESS relationship:
from psdi.mbo import MboConstants
laty = mbo.getString("SERVICEADDRESS.LatitudeY")
longx = mbo.getString("SERVICEADDRESS.LONGITUDEX")
mbo.setValue("DESCRIPTION",laty + ", " + longx,MboConstants.NOACCESSCHECK)
This appears to work.

How can I use Django.db.models Q module to query multiple lines of user input text data

How would I go about using the django.db.models Q module to query multiple lines of input from a list of data using a <textarea> html input field? I can query single objects just fine using a normal html <input> field. I've tried using the same code as my input field, except when requesting the input data, I attempt to split the lines like so:
def search_list(request):
    template = 'search_result.html'
    query = request.GET.get('q').split('\n')
    for each in query:
        if each:
            results = Product.objects.filter(Q(name__icontains=each))
This did not work of course. My code to query one line of data (that works) is like this:
def search(request):
    template = 'search_result.html'
    query = request.GET.get('q')
    if query:
        results = Product.objects.filter(Q(name__icontains=query))
I basically just want to search my database for a list of data users input into a list, and return all of those results with one query. Your help would be much appreciated. Thanks.
Based on your comments, you want to implement OR-logic for the given q string.
We can create such a Q object by reduce-ing a list of Q objects that each specify a Q(name__icontains=...) constraint. We combine them with a logical OR (the pipe operator | in Python), like:
from django.db.models import Q
from functools import reduce
from operator import or_

def search_list(request):
    template = 'search_result.html'
    results = Product.objects.all()
    error = None
    query = request.GET.get('q')
    if query:
        query = query.split('\n')
    else:
        error = 'No query specified'
    if query:
        results = results.filter(
            reduce(or_, (Q(name__icontains=itm.strip()) for itm in query))
        )
    elif not error:
        error = 'Empty query'
    some_context = {
        'results': results,
        'error': error
    }
    return render(request, 'app/some_template.html', some_context)
Here we first check whether q exists and is not the empty string; if it does not, the error is 'No query specified'. If there is a query, we split it and then check that the resulting list contains at least one element. If not, the error is 'Empty query' (note that this cannot happen with an ordinary .split('\n'), but perhaps you post-process the list, for example to remove empty elements).
If there are elements in query, we apply the reduce(..) call and thus filter the Products.
Finally we return a render(..)ed response with some_template.html and a context that contains the error and the results.
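If you would rather avoid reduce, the same OR filter can be built up with a plain loop over an empty Q object; a small sketch assuming the same name field (build_icontains_filter is just an illustrative helper name):
from django.db.models import Q

def build_icontains_filter(terms):
    # OR together one icontains constraint per non-empty line
    combined = Q()
    for term in terms:
        term = term.strip()
        if term:
            combined |= Q(name__icontains=term)
    return combined

# usage: results = Product.objects.filter(build_icontains_filter(query))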

MySQL Dynamic Query Statement in Python with Dictionary

Very similar to this question MySQL Dynamic Query Statement in Python
However, what I am looking to do instead of two lists is to use a dictionary.
Let's say I have this dictionary:
instance_insert = {
    # sql column : variable value
    'instance_id': 'instance.id',
    'customer_id': 'customer.id',
    'os': 'instance.platform',
}
I want to populate a MySQL table with an INSERT statement that uses the dictionary keys as the SQL column names and the dictionary values as the names of the variables holding the values to be inserted.
I'm kind of lost because I don't understand exactly what the statement below does; it was pulled from the question linked above, where the author used two lists to do what he wanted.
sql = "INSERT INTO instance_info_test VALUES (%s);" % ', '.join('?' for _ in instance_insert)
cur.execute (sql, instance_insert)
Also, I would like it to be dynamic in the sense that I can add or remove columns in the dictionary.
Before you post, you might want to try searching for something more specific to your question. For instance, when I Googled "python mysqldb insert dictionary", I found a good answer on the first page, at http://mail.python.org/pipermail/tutor/2010-December/080701.html. Relevant part:
Here's what I came up with when I tried to make a generalized version
of the above:
def add_row(cursor, tablename, rowdict):
    # XXX tablename not sanitized
    # XXX test for allowed keys is case-sensitive
    # filter out keys that are not column names
    cursor.execute("describe %s" % tablename)
    allowed_keys = set(row[0] for row in cursor.fetchall())
    keys = allowed_keys.intersection(rowdict)

    if len(rowdict) > len(keys):
        unknown_keys = set(rowdict) - allowed_keys
        print >> sys.stderr, "skipping keys:", ", ".join(unknown_keys)

    columns = ", ".join(keys)
    values_template = ", ".join(["%s"] * len(keys))

    sql = "insert into %s (%s) values (%s)" % (
        tablename, columns, values_template)
    values = tuple(rowdict[key] for key in keys)
    cursor.execute(sql, values)

filename = ...
tablename = ...
db = MySQLdb.connect(...)
cursor = db.cursor()

with open(filename) as instream:
    row = json.load(instream)
    add_row(cursor, tablename, row)
Peter
If you know your inputs will always be valid (the table name is valid and the columns are present in the table), and you're not importing from a JSON file as the example does, you can simplify this function; either way, it will accomplish what you want. While DictCursor may initially seem helpful here, it is for returning rows as dictionaries, not for executing from a dict.
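As a rough illustration of that simplification: if the table name is trusted and every key in the dictionary is a real column, the function can shrink to something like this (a sketch; add_row_simple is just an illustrative name):
def add_row_simple(cursor, tablename, rowdict):
    # Assumes tablename is trusted and every key is an actual column name
    columns = ", ".join(rowdict)
    placeholders = ", ".join(["%s"] * len(rowdict))
    sql = "INSERT INTO %s (%s) VALUES (%s)" % (tablename, columns, placeholders)
    cursor.execute(sql, tuple(rowdict.values()))

# usage: add_row_simple(cursor, 'instance_info_test', instance_insert)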