I want to use a command to upload a file in my wdio tests. According to the documentation (https://webdriver.io/docs/api/browser/uploadFile/), this command only works in the Chrome browser. However, when I use it, I get an error:
Error: The uploadFile command is not available in Chrome
My code:
utilities.upload_file = (file_name, input_class) ->
  await browser.pause(consts.SHORT_TIMER)
  await browser.executeScript("window.document.getElementsByClassName('" + input_class + "')[0].style.display = 'block'", [])
  file_path = path.join(FILE_PATH, file_name)
  cool_file = await browser.uploadFile(file_path)
  await $('input.' + input_class).then((res) ->
    return res.setValue(cool_file)
  )
Please help me
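For reference, the usage pattern in the linked documentation looks roughly like this (plain JavaScript; the file name and selector are placeholders, not from my project):

const path = require('path');

it('uploads a file', async () => {
    // Upload the local file to the machine running the browser session,
    // then set the returned remote path on the file input.
    const filePath = path.join(__dirname, 'example.jpg');       // placeholder file
    const remoteFilePath = await browser.uploadFile(filePath);
    await $('input[type="file"]').setValue(remoteFilePath);     // placeholder selector
});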
I have just started learning Ruby and am trying to learn how to extract web data. I am using the sample web app code given in a site's API documentation.
Executing the test.rb file given below generates a token.
The token.json file contains the following:
{"access_token":"eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiJ9.eyJhdWQiOiIzY2U1NjFhMC01ZmE5LTQ5NjUtYTQ2Mi01NmI2MzEzNjRlZGMiLCJqdGkiOiJmMjc0ZTgwM2FmZjVlMTFiMTAzNzIwMzAwNzRlNDRhZDE3ODg3ZjAzOGQ2ODk0OWIwNzVkMmQ5N2E2MjAwYzc5ZWRhNzU4MmQ3MTg1MTc1YiIsImlhdCI6MTY1NDU2OTc0MS41MjgxNTUsIm5iZiI6MTY1NDU2OTc0MS41MjgxNTcsImV4cCI6MTY1NDU3MzM0MS41MjgwMzksInN1YiI6IjE5NGFlNTU5LTU5OWMtNDQ2ZC1hOTRhLWM2MTIyYjZkMTA2ZiIsInNjb3BlcyI6W10sImNyZWRpdHNfcmVtYWluaW5nIjo0NzAwLCJyYXRlX2xpbWl0cyI6W3sicmF0ZSI6NSwiaW50ZXJ2YWwiOjYwfV19.XoN5A-H8Z7d8p_0e2rCcyV4yWO9MAF3TZlHk5VP5NjUEqtTXTVYKfKuzidYDs7K8ZzAtWzyt3aR5VSUG0_nw-uPd7Z3_Y75alQMqa2b5rHGn5JFWH7nuAriskr26WSCqAdj4cNjccPORryRr5sYKwpE4Y4kTG0IX_8frvwYxwp-8wFpLLj98K3axwN7CCpWdnPDSzuMqmH6tSpF0XhdYxB5LTVH0AyH7lN0S0_Lftq0b3sOLIvEaTfNkuRGNqwLkBYFkHFuPqwrKd8RJhC2W1QZhrmUw3eYnh-0iQABGk2V0skIqDlb6BbQ5GFX6MXgiXAc-h5Ndda7pZ5N5UCQU3g","expires_at":1654573341}
However, it also generates a JSON parser error, as shown below:
/Users/mm/.rbenv/versions/3.0.4/lib/ruby/3.0.0/json/common.rb:216:in `parse': 809: unexpected token at '' (JSON::ParserError)
from /Users/mm/.rbenv/versions/3.0.4/lib/ruby/3.0.0/json/common.rb:216:in `parse'
from /Users/mm/Desktop/prokerala/client.rb:21:in `parseResponse'
from /Users/mm/Desktop/prokerala/client.rb:99:in `get'
I googled around for a while; some of the answers talk about issues with double versus single quotes. I tried changing those as well, but it doesn't work.
I have also tried running it with the system Ruby, which gives a different error.
I would very much appreciate it if someone could explain what the error is and the logic behind it. I am also not clear about the error message itself: it reports a parse error, 809: "unexpected token", but at column 809 of the token there is nothing. Am I reading it wrong?
Code used in the test.rb file:
require 'json'
require_relative 'client'   # loads the ApiClient class shown below

client = ApiClient.new('YOUR CLIENT_ID', 'YOUR CLIENT_SECRET')

result = client.get('v2/astrology/thirumana-porutham/advanced', {
  :girl_nakshatra => 4,
  :girl_nakshatra_pada => 2,
  :boy_nakshatra => 26,
  :boy_nakshatra_pada => 3
})

puts JSON.pretty_generate(result)
Code used in the client.rb file:
require 'net/http'
require 'json'

class ApiError < StandardError
end

class ApiClient
  BASE_URL = "https://api.prokerala.com/"
  # Make sure that the following file path is set to a location that is not publicly accessible
  TOKEN_FILE = "./token.json"

  def initialize(clientId, clientSecret)
    # Instance variables
    @clientId = clientId
    @clientSecret = clientSecret
  end

  def parseResponse(response)
    content = response.body
    res = JSON.parse(content)

    if res.key?('access_token')
      return res
    end

    if res['status'] == "error"
      raise ApiError, res['errors'].map {|e| e['detail']}.join("\n")
    end

    if res['status'] != "ok"
      raise "HTTP request failed"
    end

    return res
  end

  def saveToken(token)
    # Cache the token until it expires.
    File.open(ApiClient::TOKEN_FILE, "w") do |f|
      token = {
        :access_token => token['access_token'],
        :expires_at => Time.now.to_i + token['expires_in']
      }
      f.write(token.to_json)
    end
  end

  def getTokenFromCache
    if not File.file?(ApiClient::TOKEN_FILE)
      return nil
    end

    begin
      # Fetch the cached token, and return if not expired
      text = File.read(ApiClient::TOKEN_FILE)
      token = JSON.parse(text)

      if token['expires_at'] < Time.now.to_i
        return nil
      end

      return token['access_token']
    rescue JSON::ParserError
      return nil
    end
  end

  def fetchNewToken
    params = {
      :grant_type => 'client_credentials',
      :client_id => @clientId,
      :client_secret => @clientSecret
    }

    res = Net::HTTP.post_form(URI(ApiClient::BASE_URL + 'token'), params)
    token = parseResponse(res)
    saveToken(token)

    return token['access_token']
  end

  def get(endpoint, params)
    # Try to fetch the access token from cache
    token = getTokenFromCache

    # If failed, request new token
    token ||= fetchNewToken

    uri = URI(ApiClient::BASE_URL + endpoint)
    uri.query = URI.encode_www_form(params)

    req = Net::HTTP::Get.new(uri.to_s, {'Authorization' => 'Bearer ' + token})
    res = Net::HTTP.start(uri.hostname) do |http|
      http.request(req)
    end

    return parseResponse(res)
  end
end
Running the test file using Ruby 3.0.4 gives a JSON parse error. The full error is reproduced below:
/Users/mm/.rbenv/versions/3.0.4/lib/ruby/3.0.0/json/common.rb:216:in `parse': 809: unexpected token at '' (JSON::ParserError)
from /Users/mm/.rbenv/versions/3.0.4/lib/ruby/3.0.0/json/common.rb:216:in `parse'
from /Users/mm/Desktop/prokerala/client.rb:21:in `parseResponse'
from /Users/mm/Desktop/prokerala/client.rb:99:in `get'
from test.rb:11:in `<main>'
I executed the above code, and the response we get back is
#<Net::HTTPMovedPermanently 301 Moved Permanently readbody=true>
with an empty string '' as the response body, which is what causes the error when parsing the body.
Since the API endpoint is served over HTTPS, we need to set use_ssl: true before starting the session. Try the code below and it will fix the issue:
res = Net::HTTP.start(uri.hostname, use_ssl: true) do |http|
  http.request(req)
end
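As a side note (this is only a sketch, not part of the original sample code), a small guard at the top of ApiClient#parseResponse in client.rb would have surfaced the redirect directly instead of as a JSON::ParserError:

# Sketch: add at the start of ApiClient#parseResponse.
# A 301 redirect (or any non-2xx reply) can carry an empty, non-JSON body,
# so report it explicitly rather than letting JSON.parse raise.
unless response.is_a?(Net::HTTPSuccess)
  raise ApiError, "HTTP #{response.code} #{response.message}; body: #{response.body.inspect}"
end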
Update: A fresh install did not have the problem. I gave up on trying to fix the upgraded Ubuntu.
Original question:
I'm using the Perl CPAN module JSON to convert a hashref to JSON using the to_json function.
This worked fine on Ubuntu 14.04 with Perl 5.18.2, but after upgrading to Ubuntu 16.04 with Perl 5.22.1 I get the error message:
hash- or arrayref expected (not a simple scalar, use allow_nonref to allow this)
The original code was this:
my $lang = {
    'connection_lost' => 'Network connection was lost',
    'connection_lost_more' => 'Please refresh this page to fix this problem'
};
my $json_lang = to_json($lang);
I checked the type of $lang with warn ref($lang), which returned 'HASH', so it should be a hashref?
I tried to change it to this:
my %lang;
$lang{'connection_lost'} = 'Network connection was lost';
$lang{'connection_lost_more'} = 'Please refresh this page to fix this problem';
my $json_lang = to_json(%lang);
and this:
my %lang;
$lang{'connection_lost'} = 'Network connection was lost';
$lang{'connection_lost_more'} = 'Please refresh this page to fix this problem';
my $json_lang = to_json(\%lang);
Both failed.
Then I tried the allow_nonref switch:
my $lang = {
    'connection_lost' => 'Network connection was lost',
    'connection_lost_more' => 'Please refresh this page to fix this problem'
};
my $jsonnonref = JSON->new->allow_nonref;
my $json_lang = $jsonnonref->to_json($lang);
which resulted in the error message "to_json should not be called as a method".
How do I get this to work?
Absolute minimal code that does not work for me:
package Handlers::test_handlers;

use strict;
use warnings;

use Apache2::Const -compile => qw(OK);
use Apache2::Request;
use JSON;

sub handler {
    my $lang = {
        'connection_lost' => 'connection_lost',
        'connection_lost_more' => 'connection_lost_more'
    };
    # my $json_lang = 'Hello world';
    my $json_lang = to_json($lang);
    print $json_lang;
    return Apache2::Const::OK;
}

1;
Using the 'Hello world' line works, while the to_json line does not.
I now had the same issue with a fresh install, and decided to completely remove JSON::XS / libjson-xs-perl from the project, which instantly fixed the problem.
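If removing the package outright feels too drastic, a less invasive check (a sketch based on the JSON.pm documentation, not something I did in the project) is to see which backend JSON is actually loading and pin it to the pure-Perl one without uninstalling anything:

#!/usr/bin/perl
use strict;
use warnings;

# Force the pure-Perl backend before JSON is loaded; JSON.pm honours
# PERL_JSON_BACKEND, but it must be set before "use JSON".
BEGIN { $ENV{PERL_JSON_BACKEND} = 'JSON::PP' }
use JSON;

# Report which backend is in use, then retry the failing call.
warn "JSON backend: " . JSON->backend . "\n";
my $lang = { 'connection_lost' => 'Network connection was lost' };
print to_json($lang), "\n";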
I'm a newcomer to Vim and installed gVim successfully on my Windows 8.1 laptop. I installed the Syntastic plugin using Pathogen and, as I'm planning to write an Ionic project, I also copied the tidy5 for HTML5 binary (from http://www.htacg.org/tidy-html5/) and referred to the binary in my vimrc as follows:
" SYNTASTIC
set statusline+=%#warningmsg#
set statusline+=%{SyntasticStatuslineFlag()}
set statusline+=%*
" automatically load errors in the location list:
let g:syntastic_always_populate_loc_list = 1
let g:syntastic_auto_loc_list = 1
" check on errors when a file is loaded:
let g:syntastic_check_on_open = 1
" check on errors when saving the file:
let g:syntastic_check_on_wq = 0
let g:syntastic_html_tidy_execute = 'C:\Program Files (x86)\tidy-5.1.25-win32\bin\tidy'
let g:syntastic_shell='C:\Windows\System32\cmd'
" check debugging messages in vim with :mes:
let g:syntastic_debug = 1
let g:syntastic_mode_map = {"mode": "active",
\"active_filetypes" : ["html","javascript","json"],
\"passive_filetypes" : ["html", "javascript","json"] }
When I run
:SyntasticInfo
I get the following output
Syntastic version: 3.7.0-76 (Vim 704, Windows)
Info for filetype: html
Global mode: active
Filetype html is active
The current file will be checked automatically
Available checkers: -
Currently enabled checkers: -
For some reason, my default checkers aren't loading, and when I run
echo syntastic#util#system('echo %PATH%')
I get an E484 error:
Error detected while processing function syntastic#util#system:
line 9:
E484: Can't open file C:\Users\Dirk\AppData\Local\Temp\VIo8366.tmp
-1
I suspect there are multiple issues here, so I assume the first one I need to solve is the E484 error. Any help appreciated.
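One thing I plan to check (just a guess on my part, not a confirmed lead): E484 on a file under %TEMP% typically means Vim could not read back the temp file it uses when running external commands, and Syntastic goes through the same system() machinery, so the shell-related options seem worth inspecting:

" Show the shell-related options Vim uses for system() calls; odd values
" here can make the temp-file round trip fail with E484.
:set shell? shellcmdflag? shellquote? shellxquote? shellslash?

" Compare against a plain system() call made outside Syntastic:
:echo system('echo %PATH%')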
I'm using LDAvis for topic modeling and am trying to use the as.gist option to create a gist. When serVis executes, there is a timeout in curl::curl_fetch_memory after about 10 seconds. If I immediately execute serVis again, I get a different error, 'Problems parsing JSON', and from then on, whenever serVis is run, that same error recurs.
If I start all over with a fresh workspace the same behavior occurs. The first time serVis is run, curl::curl_fetch_memory times out after about 10 seconds. Subsequent executions return 'Problems parsing JSON'.
If I don't use the as.gist option it works fine, but of course doesn't create a gist.
Very rarely, it works and a gist is created. If I change parameters to reduce the size of the JSON object it usually works, which makes me think it may be related to size.
I have explored the various RCurlOptions timeout settings. Currently, they are set as
options(RCurlOptions = list(cainfo = system.file("CurlSSL", "cacert.pem",
package = "RCurl"),
connecttimeout = 300, timeout = 3000,
followlocation = TRUE, dns.cache.timeout = 300))
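One thing I am unsure about (so treat this as a sketch, not something I have verified): the traceback goes through curl::curl_fetch_memory, i.e. the curl package that httr sits on, so the RCurlOptions above may never reach the gist upload. If the upload does go through httr, raising the timeout there would look like this, reusing the json object created by createJSON:

library(httr)
library(LDAvis)

# Raise the request timeout for everything httr does in this session,
# then retry the upload that was timing out.
set_config(timeout(300))
serVis(json, open.browser = TRUE, as.gist = TRUE, description = 'APM Community')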
Below is a console listing with debug set on curl::curl_fetch_memory.
> json <- createJSON(phi = cases$phi,
+ theta = cases$theta,
+ doc.len .... [TRUNCATED]
> serVis(json, open.browser = TRUE, as.gist = TRUE, description = 'APM Community')
debugging in: curl::curl_fetch_memory(url, handle = handle)
debug: {
output <- .Call(R_curl_fetch_memory, url, handle)
res <- handle_response_data(handle)
res$content <- output
res
}
Browse[2]> output <- .Call(R_curl_fetch_memory, url, handle)
Error: Timeout was reached
Browse[2]> output <- .Call(R_curl_fetch_memory, url, handle)
Browse[2]> rawToChar(output)
[1] "{\"message\":\"Problems parsing JSON\",\"documentation_url\":\"https://developer.github.com/v3\"}"
Browse[2]>
.
.
exiting from: curl::curl_fetch_memory(url, handle = handle)
Error: Problems parsing JSON
Any hints on how to debug this problem?
There was an earlier question on this, but the asker was just overwriting their output and solved their own problem.
I'm using subprocess.Popen to read video information and write the output to a JSON file. It works fine on MOST videos, but on others it returns an empty string, even though the same command runs fine from the command line. I tried it several times and am getting the data fine through the command line.
Here's the relevant part of the script:
# Imports implied by the snippet (only the relevant part of the script was posted)
import os
import json
import cStringIO
import subprocess as sp

out_prj.write('[')
for m, i in enumerate(files):
    print i
    out_prj.write('{"$type":"BatchProcessor.Job, BatchProcessor","Id":0,"Ver":1.02,"CurrentTask":0,"IsSelected":true,"TaskList":[')
    f_name = os.path.basename(i[0])
    f_json = out_folder + os.sep + "06_Output" + os.sep + os.path.basename(i[0]).split(".")[0] + ".json"
    trans_f = out_folder + os.sep + "04_Video" + os.sep + os.path.basename(i[0]).split(".")[0] + "-tr.ts"
    trans_f_out = out_folder + os.sep + "06_Output" + os.sep + os.path.basename(i[0]).split(".")[0] + "-tr-out.ts"
    ffprobe = 'ffprobe.exe'
    command = [ffprobe, '-v', 'quiet', '-print_format', 'json', '-show_format', '-show_streams', i[0]]
    p = sp.Popen(command, stdout=sp.PIPE, stderr=sp.PIPE, shell=True)
    out, err = p.communicate()
    io = cStringIO.StringIO(out)
    info = json.load(io)
    print info
    filea = open(f_json, 'w')
    filea.write(json.dumps(info))
    filea.close()
    f = open(f_json)
    b = json.load(f)
    print b
    #########################
    ###################
    f_format = str(b['streams'][0]['codec_long_name'])
Your code ignores error messages (the err variable). print err, or don't redirect stderr, to see them.
Unrelated: the JSON handling in your code is needlessly convoluted; most of the operations are redundant.
To save the output of the subprocess to a file:
import os
from subprocess import check_call

f_json = os.path.join(out_folder, "06_Output",
                      os.path.splitext(f_name)[0] + ".json")
with open(f_json, 'wb', 0) as file:
    check_call(command, stdout=file)
Note: shell=True is not necessary here. If subprocess can't find ffprobe.exe, specify the full path, e.g. (use the path appropriate for your system):
ffprobe = r'C:\Program Files\Real\RealPlayer\RPDS\Tools\ffmpeg\ffprobe.exe'
Note: r'' (a raw string literal) is used to avoid doubling the backslashes.
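To make the empty-output case visible in the first place, a quick diagnostic along the lines of the advice above might look like this (a sketch with a placeholder file name, not the asker's actual paths):

import json
import subprocess as sp

video_file = 'input.ts'  # placeholder; in the original loop this would be i[0]
# No '-v quiet' here, so ffprobe's own error messages are not suppressed.
command = ['ffprobe', '-print_format', 'json',
           '-show_format', '-show_streams', video_file]

p = sp.Popen(command, stdout=sp.PIPE, stderr=sp.PIPE)  # no shell=True needed
out, err = p.communicate()
if err:
    print(err)            # shows why ffprobe produced no JSON for this file
info = json.loads(out)    # parse stdout directly; no temp-file round trip
print(info['streams'][0]['codec_long_name'])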