pause a single session but keep processing others - mysql

I want a mysql-proxy Lua script to handle interleaved accesses to a website (e.g. two different browser windows/users), while being able to pause/delay one of the two sessions without influencing the other. Handling sessions in an interleaved fashion is possible in mysql-proxy Lua (so it seems, judging by the output listed below), but as soon as I start delaying within the script, it blocks everything and the other session cannot advance either.
-- the query which indicates the session/connection that shall be delayed at that execution
local qoi = "SELECT loginattempts,uid FROM mybb_users WHERE username='user1' LIMIT 1"

function read_query(packet)
    if string.byte(packet) == proxy.COM_QUERY then
        query = packet:sub(2)
        start_time = os.time()
        if query == qoi then
            print("busy wait")
            while os.time() < start_time + 20 do
                -- nothing
            end
            print("busy wait end")
        end
        print("Connection id: " .. proxy.connection.server.thread_id)
    end
end
However, this script ends up with the following output:
Connection id: 36
busy wait
busy wait end
Connection id: 36
Connection id: 36
Connection id: 36
Connection id: 37
Connection id: 37
Connection id: 36
Connection id: 36
Connection id: 36
Connection id: 37
and not the expected:
Connection id: 36
busy wait
Connection id: 37
Connection id: 37
busy wait end
Connection id: 36
Is my intention even achievable, and if so, how?

It seems to be impossible to delay a single session from within the Lua script itself (apparently the hooks for all connections run on the same thread, so a busy wait stalls every session), but it works just fine if I outsource the delay to the MySQL server, as this forces the interleaving as well:
local DEBUG = true
-- the query which indicates the session/connection that shall be delayed
local qoi = "SELECT loginattempts,uid FROM mybb_users WHERE username='user1' LIMIT 1"

function read_query(packet)
    ret = nil
    comp_query = qoi
    if string.byte(packet) == proxy.COM_QUERY then
        query = packet:sub(2)
        if query == comp_query then
            if DEBUG then
                print("found matching query " .. packet:sub(2))
                print("insert sleep")
            end
            -- queue a server-side sleep in front of the original query;
            -- only this session waits for it, the proxy itself stays free
            inj_query = "SELECT sleep(30);"
            new_packet = string.char(proxy.COM_QUERY) .. inj_query
            proxy.queries:append(1, new_packet, { resultset_is_needed = true })
            proxy.queries:append(2, packet, { resultset_is_needed = true })
            ret = proxy.PROXY_SEND_QUERY
        end
    end
    return ret
end

function read_query_result(inj)
    if inj.id == 1 then
        if DEBUG then
            print("sleep query returns")
        end
        -- hide the injected sleep()'s resultset from the client
        return proxy.PROXY_IGNORE_RESULT
    end
    if inj.id == 2 then
        if DEBUG then
            print("regular query returns")
        end
        return
    end
    return
end
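As a side note, the exact string comparison against qoi is brittle: any difference in whitespace or in the LIMIT clause makes the match fail silently. Below is a minimal variation of the same script using a Lua pattern and a configurable delay; qoi_pattern and DELAY_SECONDS are my own illustrative names, not part of the original.

-- hypothetical refinement: match the query of interest by pattern
local DELAY_SECONDS = 30
local qoi_pattern = "^SELECT loginattempts,uid FROM mybb_users WHERE username='user1'"

function read_query(packet)
    if string.byte(packet) ~= proxy.COM_QUERY then
        return
    end
    local query = packet:sub(2)
    if query:match(qoi_pattern) then
        -- let the MySQL server do the waiting; the proxy's event loop
        -- stays free, so other sessions keep flowing
        local sleep_packet = string.char(proxy.COM_QUERY) .. "SELECT sleep(" .. DELAY_SECONDS .. ")"
        proxy.queries:append(1, sleep_packet, { resultset_is_needed = true })
        proxy.queries:append(2, packet, { resultset_is_needed = true })
        return proxy.PROXY_SEND_QUERY
    end
end

function read_query_result(inj)
    if inj.id == 1 then
        -- swallow the sleep()'s resultset so the client only sees
        -- the answer to its real query
        return proxy.PROXY_IGNORE_RESULT
    end
end

Either version is loaded the usual way, e.g. mysql-proxy --proxy-lua-script=delay.lua.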

Related

Can't close opened serial COM port on script

Sometimes my script exits before reaching ser.close() at the end, so I thought the code below would catch the exception, close the port and reopen it:
import serial
try:
    ser = serial.Serial(port = 'COM7', baudrate = 921600, timeout = 2, bytesize = 8)
except serial.SerialException:
    ser.close()
    ser = serial.Serial(port = 'COM7', baudrate = 921600, timeout = 2, bytesize = 8)
However, I got "NameError: name 'ser' is not defined" at the ser.close() statement when running the whole script.
But strangely enough, there are no issues if I highlight the code section and run it as a section.

Database connections break under simultaneous connections (Crystal-lang)

I set up a class with this class property:
@@db : DB::Database = DB.open "mysql://root:root@localhost/db_name"
Then, in any class method I can use @@db:
@@db.query("select * from thing") do |rs|
  rs.each do
    # make object
  end
end
I was hoping this way I could DB.open only when the app starts, and not again for each request. This works great... until I run wrk -c10 and test multiple connections at the same time. Here is some error feedback:
Unhandled exception on HTTP::Handler Closed stream (IO::Error)
0x10e69a440: *raise:NoReturn at ??
0x10e6a8d57: *IO::FileDescriptor+#IO::Buffered#read:Int32 at ??
0x10e7030db: *MySql::ReadPacket#read:Int32 at ??
0x10e74d89b: *DB::ResultSet+#DB::Disposable#close:(Bool | Nil) at ??
0x10e745ba2: *DefaultController::database:Nil at ??
0x10e7500c4: *HTTP::Server::RequestProcessor#process<(OpenSSL::SSL::Socket::Server | TCPSocket+), (OpenSSL::SSL::Socket::Server | TCPSocket+), IO::FileDescriptor>:Nil at ??
0x10e6995fd: *Fiber#run:(IO::FileDescriptor | Nil) at ??
I "fixed" the problem by doing this on each request:
DB.open("mysql://root:root#localhost/db_name") do |db|
db.query("select * from thing") do |rs|
rs.each do
# make object
end
end
end
This works, but with awful performance. What am I doing wrong?

HTTParty and JSON memory leak?

I've been having a hard time debugging a memory leak.
I do paging with HTTParty inside Sidekiq, and the memory keeps growing and growing. I profiled the Sidekiq process and this is what I found:
540 /home/user/.rvm/gems/ruby-2.2.3/gems/aws-sdk-core-2.4.2/lib/seahorse/client/configuration.rb:157:DATA
540 /home/user/.rvm/gems/ruby-2.2.3/gems/aws-sdk-core-2.4.2/lib/seahorse/client/configuration.rb:157:NODE
609 /home/user/.rvm/gems/ruby-2.2.3/gems/activesupport-4.1.14/lib/active_support/core_ext/class/attribute.rb:86:DATA
1376 /home/user/.rvm/gems/ruby-2.2.3/gems/activerecord-4.1.14/lib/active_record/connection_adapters/postgresql/database_statements.rb:148:STRING
1712 /home/user/.rvm/gems/ruby-2.2.3/gems/activesupport-4.1.14/lib/active_support/dependencies.rb:247:ARRAY
1713 /home/user/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/set.rb:290:STRING
1964 /home/user/.rvm/gems/ruby-2.2.3/gems/mime-types-3.1/lib/mime/types/_columnar.rb:29:OBJECT
1964 /home/user/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/set.rb:72:OBJECT
2572 /home/user/.rvm/gems/ruby-2.2.3/gems/mime-types-3.1/lib/mime/types/_columnar.rb:24:STRING
2987 /home/user/.rvm/gems/ruby-2.2.3/gems/activesupport-4.1.14/lib/active_support/dependencies.rb:247:NODE
3126 /home/user/.rvm/gems/ruby-2.2.3/gems/mime-types-3.1/lib/mime/types/container.rb:10:OBJECT
3506 /home/user/.rvm/gems/ruby-2.2.3/gems/activesupport-4.1.14/lib/active_support/dependencies.rb:247:DATA
3928 /home/user/.rvm/gems/ruby-2.2.3/gems/mime-types-3.1/lib/mime/type.rb:532:STRING
3928 /home/user/.rvm/gems/ruby-2.2.3/gems/mime-types-3.1/lib/mime/type.rb:543:STRING
5175 /home/user/.rvm/rubies/ruby-2.2.3/lib/ruby/2.2.0/set.rb:81:HASH
5234 /home/user/.rvm/gems/ruby-2.2.3/gems/activesupport-4.1.14/lib/active_support/dependencies.rb:247:STRING
5892 /home/user/.rvm/gems/ruby-2.2.3/gems/mime-types-3.1/lib/mime/type.rb:546:STRING
8120 /home/user/.rvm/gems/ruby-2.2.3/gems/json-1.8.3/lib/json/common.rb:155:ARRAY
87403 /home/user/.rvm/gems/ruby-2.2.3/gems/json-1.8.3/lib/json/common.rb:155:HASH
541817 /home/user/.rvm/gems/ruby-2.2.3/gems/json-1.8.3/lib/json/common.rb:155:STRING
Notice the json STRING and HASH entries. Those bottom two just keep growing as Sidekiq keeps running more and more jobs.
The code is this:
...
begin
  reactions = HTTParty.get("https://graph.facebook.com/v2.7/#{vid.external_id}/reactions?summary=true&limit=#{PAGE_SIZE}&access_token=#{ENV['at']}")
rescue Exception => ex
  return
end
r_last_id = "vl/programs/#{vid.program_id}/vids/#{vid.id}/reactions/last_id"
r_last_entry_idx = "vl/programs/#{vid.program_id}/vids/#{vid.id}/reactions/last_entry_idx"
reactions_purged = []
abort = false
total_records = reactions['summary']['total_count'] || 0
last_total_count = $redis.get(r_last_entry_idx).to_i
need_to_run = total_records - last_total_count
need_to_run = 0 if need_to_run < 0
last_id = nil
loop do
  break if reactions['data'].nil? || reactions['data'].empty?
  reactions['data'].each do |r|
    last_id = $redis.get(r_last_id)
    abort = true and break if !last_id.nil? && need_to_run <= 0
    need_to_run -= 1
    reactions_purged << r
  end
  GC.start # <----- Even this didn't solve it
  break if abort
  break if reactions['paging']['next'].nil?
  reactions = HTTParty.get(reactions['paging']['next'])
end
...
Even the GC.start I added, as you can see, didn't solve it.
What causes this JSON leakage? It is coming from HTTParty...
Thanks

meaning of 'return proxy.PROXY_IGNORE_RESULT' in connect_server() hook

I am trying to implement connection pooling (many clients served by a few DB connections) with mysql-proxy.
I took a look at ro-pooling.lua, and it seems that some actions must be done in the connect_server() hook.
If I want to create a new connection:
- Assign the target backend index to proxy.connection.backend_ndx.
- Return nothing.
If I want to use an already existing idle connection:
- Assign the target backend index to proxy.connection.backend_ndx.
- Return proxy.PROXY_IGNORE_RESULT.
(Both cases are sketched in code right below.)
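A sketch only, using the same default_backend variable and the empty-username pool lookup that appear in the script in the EDIT below; this is how I read the two cases, not verified behavior:

local default_backend = 1  -- index of the target backend

function connect_server()
    local pool = proxy.global.backends[default_backend].pool
    -- before authentication the pooled connections are keyed by ""
    if pool.users[""].cur_idle_connections > 0 then
        -- case 2: reuse an existing idle connection
        proxy.connection.backend_ndx = default_backend
        return proxy.PROXY_IGNORE_RESULT
    end
    -- case 1: create a new connection (return nothing)
    proxy.connection.backend_ndx = default_backend
end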
Now, what bothers me is that returning proxy.PROXY_IGNORE_RESULT from the connect_server() hook seems to have no impact on connection reuse: every time a client connects, a new connection is created, and eventually I run into the following error: "User 'username' has exceeded the 'max_user_connections' resource (current value: 4)"
So, the question is: what is the meaning of returning proxy.PROXY_IGNORE_RESULT in the connect_server() hook?
Also, any reference on how mysql-proxy creates and reuses connections would be very helpful; I did not manage to find any...
Any help would be greatly appreciated :)
EDIT:
This is the source of the script I'm currently using:
local default_backend = 1
local min_idle_connections = 1
local max_idle_connections = 4
local connections_limit = 1
local user_name = "user"

if not proxy.global.count then -- never mind this, not using it...
    proxy.global.count = 0
end

function connect_server()
    local backend = proxy.global.backends[1]
    local pool = backend.pool
    local cur_idle = pool.users[""].cur_idle_connections
    print("CONNECT SERVER:")
    print("current backend:" .. proxy.connection.backend_ndx)
    if cur_idle >= min_idle_connections then
        print("using pooled connection")
        proxy.connection.backend_ndx = 0
        print("current backend:" .. proxy.connection.backend_ndx)
        return proxy.PROXY_SEND_RESULT
    end
    proxy.global.count = proxy.global.count + 1
    print("Creating new connection")
    proxy.connection.backend_ndx = default_backend
    print("current backend:" .. proxy.connection.backend_ndx)
end

function read_handshake()
    print("READ_HANDSHAKE")
    print("current backend:" .. proxy.connection.backend_ndx)
end

function read_auth()
    local username = proxy.connection.client.username
    print("READ_AUTH: " .. username)
    print("current backend:" .. proxy.connection.backend_ndx)
end

function disconnect_client()
    print("DISCONNECT CLIENT. ")
    print("current backend:" .. proxy.connection.backend_ndx)
end

function read_auth_result(auth)
    print("READ_AUTH_RESULT")
    if auth.packet:byte() == proxy.MYSQLD_PACKET_OK then
        -- auth was fine, disconnect from the server --
        proxy.connection.backend_ndx = 0
        print("disconnected backend after auth")
        print("current backend:" .. proxy.connection.backend_ndx)
    end
end

function read_query(packet)
    print("READ_QUERY:")
    print("current backend:" .. proxy.connection.backend_ndx)
    if packet:byte() == proxy.COM_QUIT then
        print("received signal QUIT")
        proxy.response = {
            type = proxy.MYSQLD_PACKET_OK,
        }
        return proxy.PROXY_SEND_RESULT
    end
    if proxy.connection.backend_ndx == 0 then
        print("assigning backend to process query...")
        proxy.connection.backend_ndx = default_backend
        print("current backend:" .. proxy.connection.backend_ndx)
    end
    local username = proxy.connection.client.username
    local cur_idle = proxy.global.backends[default_backend].pool.users[username].cur_idle_connections
    print("current idle user" .. username .. " connections: " .. cur_idle)
    if string.byte(packet) == proxy.COM_QUERY then
        print("Query: " .. string.sub(packet, 2))
    end
    proxy.queries:append(1, packet)
    return proxy.PROXY_SEND_QUERY
end

function read_query_result( inj )
    print("READ_QUERY_RESULT:")
    print("current backend:" .. proxy.connection.backend_ndx)
    local res = assert(inj.resultset)
    local flags = res.flags
    if inj.id ~= 1 then
        return proxy.PROXY_IGNORE_RESULT
    end
    is_in_transaction = flags.in_trans
    if not is_in_transaction then
        -- release the backend
        print("releasing backend")
        proxy.connection.backend_ndx = 0
    end
    print("current backend:" .. proxy.connection.backend_ndx)
end

Corona SDK, functions run in wrong order?

I'm trying to save some data into a table. I get the data from a database, and that part works fine.
My problem is that the data is not saved in the table. It is a Lua table, like table = {}, and NOT a database table.
Maybe it is saved, but it looks like the prints are done before the saving, even though I call them afterwards. In fact, it seems like my network request is done last in my program even though I call it first.
I would really like to know the reason for this. Any ideas?
Here is the code:
---TESTING!
print("Begin teting!")
--hej = require ( "test2" )

local navTable = {
    Eng_Spd = 0,
    Spd_Set = 0
}

local changeTab = function()
    navTable.Eng_Spd = 2
end

printNavTable = function()
    print("navTable innehåller: ")
    print(navTable.Eng_Spd)
    print(navTable.Spd_Set)
end

require "sqlite3"
local myNewData
local json = require("json")
local decodedData

local SaveData2 = function()
    local i = 1
    local counter = 1
    local index = "livedata" .. counter
    local navValue = decodedData[index]
    print(navValue)
    while (navValue ~= nil) do
        --tablefill = "INSERT INTO navaltable VALUES (NULL,'" .. navValue[1] .. "','" .. navValue[3] .. "','" .. navValue[4] .. "','" .. navValue[5] .. "','" .. navValue[6] .. "');"
        --print(tablefill)
        --db:exec(tablefill)
        if navValue[3] == "Eng Spd" then navTable.Eng_Spd = navValue[4]
        elseif navValue[3] == "Spd Set" then navTable.Spd_Set = navValue[4]
        else print("blah")
        end
        print(navTable.Eng_Spd)
        print(navTable.Spd_Set)
        counter = counter + 1
        index = "livedata" .. counter
        navValue = decodedData[index]
    end
end

local function networkListener( event )
    if (event.isError) then
        print("Network error!")
    else
        myNewData = event.response
        print("From server: " .. myNewData)
        decodedData = (json.decode(myNewData))
        SaveData2()
        --db:exec("DROP TABLE IN EXISTS navaltable")
    end
end

--function uppdateNavalTable()
network.request( "http://127.0.0.1/firstMidle.php", "GET", networkListener )
--end

changeTab()
printNavTable()
--uppdateNavalTable()
printNavTable()
print("Done!")
And here is the output:
Copyright (C) 2009-2012 Corona Labs Inc.
Version: 2.0.0
Build: 2012.971
Begin teting!
navTable innehåller:
2
0
navTable innehåller:
2
0
Done!
From server: {"livedata1":["1","0","Eng Spd","30","0","2013-03-15 11:35:48"],"livedata2":["1","1","Spd Set","13","0","2013-03-15 11:35:37"]}
table: 008B5018
30
0
30
13
And by the way, "navTable innehåller" means "navTable contains".
The answer is that the network listener runs asynchronously, in parallel with the rest of the code: network.request() only schedules the request and returns immediately, and networkListener is invoked later, once the response arrives.
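Any code that depends on the downloaded data therefore has to be triggered from inside the listener; a sketch based on the code above:

local function networkListener( event )
    if event.isError then
        print("Network error!")
    else
        myNewData = event.response
        decodedData = json.decode(myNewData)
        SaveData2()
        -- navTable is only filled in at this point, so print it from
        -- here rather than right after the network.request() call
        printNavTable()
    end
end

network.request( "http://127.0.0.1/firstMidle.php", "GET", networkListener )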