Differences in TNSNAMES and LISTENER files - listener

I have installed Oracle on two different Azure virtual machines; between the installations the Oracle 12c edition changed from SE1 to SE2. Despite setting up both machines the same way, to the best of my knowledge, the TNSNAMES.ORA differs between the environments:
# tnsnames.ora Network Configuration File: D:\app\oracle\product\12.1.0\dbhome_1\network\admin\tnsnames.ora
# Generated by Oracle configuration tools.
DB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = host-live.f10.internal.cloudapp.net)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = EMMA)
    )
  )

ORACLR_CONNECTION_DATA =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = IPC)(KEY = EXTPROC1521))
    )
    (CONNECT_DATA =
      (SID = CLRExtProc)
      (PRESENTATION = RO)
    )
  )
On my test server, which I believe I set up in the same manner, the TNSNAMES.ORA file is as follows:
# tnsnames.ora Network Configuration File: D:\app\oracle\product\12.1.0\dbhome_1\NETWORK\ADMIN\tnsnames.ora
# Generated by Oracle configuration tools.
LISTENER_EMMATEST =
  (ADDRESS = (PROTOCOL = TCP)(HOST = localhost)(PORT = 1521))

DBTEST =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = host-test.f7.internal.cloudapp.net)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = EMMATEST)
    )
  )

ORACLR_CONNECTION_DATA =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = IPC)(KEY = EXTPROC1521))
    )
    (CONNECT_DATA =
      (SID = CLRExtProc)
      (PRESENTATION = RO)
    )
  )
I am unsure why the test server has this additional listener entry in its TNSNAMES.ORA file. Looking further at the LISTENER.ORA file on each environment, they are as follows:
LIVE:
# listener.ora Network Configuration File: D:\app\oracle\product\12.1.0\dbhome_1\network\admin\listener.ora
# Generated by Oracle configuration tools.
SID_LIST_DB =
  (SID_LIST =
    (SID_DESC =
      (SID_NAME = CLRExtProc)
      (ORACLE_HOME = D:\app\oracle\product\12.1.0\dbhome_1)
      (PROGRAM = extproc)
      (ENVS = "EXTPROC_DLLS=ONLY:D:\app\oracle\product\12.1.0\dbhome_1\bin\oraclr12.dll")
    )
  )

DB =
  (DESCRIPTION_LIST =
    (DESCRIPTION =
      (ADDRESS = (PROTOCOL = TCP)(HOST = host-live.f10.internal.cloudapp.net)(PORT = 1521))
      (ADDRESS = (PROTOCOL = IPC)(KEY = EXTPROC1521))
    )
  )
TEST:
# listener.ora Network Configuration File: D:\app\oracle\product\12.1.0\dbhome_1\NETWORK\ADMIN\listener.ora
# Generated by Oracle configuration tools.
DBTEST =
  (DESCRIPTION_LIST =
    (DESCRIPTION =
      (ADDRESS = (PROTOCOL = TCP)(HOST = host-test.f7.cloudapp.net)(PORT = 1521))
    )
    (DESCRIPTION =
      (ADDRESS = (PROTOCOL = IPC)(KEY = EXTPROC1521))
    )
  )

ADR_BASE_DBTEST = D:\app\oracle\product\12.1.0\dbhome_1\log
I am relatively new to Oracle and would be very grateful if someone could help me discover where I have gone wrong in setting up the test environment.
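One thing worth checking (a suggestion, not a definitive diagnosis): the extra LISTENER_EMMATEST alias usually only matters if the instance's LOCAL_LISTENER parameter points at it, so comparing that parameter and the registered services on both servers may show whether the two environments really behave differently. The commands below use only names that already appear in the files above.

-- run on each server in SQL*Plus as a DBA user
SHOW PARAMETER local_listener
SHOW PARAMETER service_names

-- and from a command prompt, check what each listener has registered
lsnrctl status
lsnrctl services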

Related

SSRS Download All History Snapshots

Is it possible to download all the history snapshots of a report at once, preferably as a CSV? It would save a lot of time compared with clicking into each one individually and selecting Save as CSV.
I only see the option to delete them.
In PowerShell, you can loop through each snapshot and save them using this example:
<#
Description: Save SSRS Report Snapshots
#>
$sql = "
DECLARE @ReportName NVARCHAR(200) = 'Your Report Name'; --change to NULL for every snapshot
DECLARE @FileFormat NVARCHAR(50) = 'CSV'; --HTML5,PPTX,ATOM,HTML4.0,MHTML,IMAGE,EXCEL (for .xls),EXCELOPENXML (for .xlsx),WORD (for .doc),WORDOPENXML (for .docx),CSV,PDF,XML
DECLARE @FileExtn NVARCHAR(50) = 'csv';
DECLARE @ServerName NVARCHAR(100) = 'http://YourServerName';
DECLARE @DateFrom DATE = CAST(DATEADD(DAY, -1, GETDATE()) AS DATE); --change to NULL for every snapshot
DECLARE @ExportPath NVARCHAR(200) = 'C:\Temp\';

SELECT
    --[ReportID] = [c].[itemid]
    -- , [ReportName] = [c].[name]
    -- , [ReportPath] = [c].[path]
    -- , [SnapshotDate] = FORMAT([h].[snapshotdate], 'dd-MMM-yyyy')
    -- , [SnapshotDescription] = [s].[DESCRIPTION]
    -- , [SnapshotEffectiveParams] = [s].[effectiveparams]
    -- , [SnapshotQueryParams] = [s].[queryparams]
    -- , [ScheduleName] = [sc].[name]
    -- , [ScheduleNextRunTime] = CONVERT(VARCHAR(20), [sc].[nextruntime], 113)
    [ExportFileName] = @ExportPath + REPLACE([c].[name], ' ', '_') + '_' + FORMAT([h].[snapshotdate], 'yyyyMMdd_HHmm') + '.' + @FileExtn
    , [SnapshotUrl] =
        @ServerName
        + '/ReportServer/Pages/ReportViewer.aspx?'
        + [c].[path] + '&rs:Command=Render&rs:Format='
        + @FileFormat + '&rs:Snapshot='
        + FORMAT([h].[snapshotdate], 'yyyy-MM-ddTHH:mm:ss')
FROM
    [ReportServer].[dbo].[History] AS [h] WITH(NOLOCK)
    INNER JOIN [ReportServer].[dbo].[SnapshotData] AS [s] WITH(NOLOCK) ON [h].[snapshotdataid] = [s].[snapshotdataid]
    INNER JOIN [ReportServer].[dbo].[Catalog] AS [c] WITH(NOLOCK) ON [c].[itemid] = [h].[reportid]
    INNER JOIN [ReportServer].[dbo].[ReportSchedule] AS [rs] WITH(NOLOCK) ON [rs].[reportid] = [h].[reportid]
    INNER JOIN [ReportServer].[dbo].[Schedule] AS [sc] WITH(NOLOCK) ON [sc].[scheduleid] = [rs].[scheduleid]
WHERE
    1=1
    AND [rs].[reportaction] = 2
    AND (@ReportName IS NULL OR [c].[Name] = @ReportName)
    AND (@DateFrom IS NULL OR [h].[snapshotdate] >= @DateFrom)
ORDER BY
    [c].[name]
    , [h].[snapshotdate];
"
$server = 'YourServerName';
$dbs = 'MASTER';
$dsn = "Data Source=$server; Initial Catalog=$dbs; Integrated Security=SSPI;";
$cn = New-Object System.Data.SqlClient.SqlConnection($dsn);
$cn.Open();
$cmd = $cn.CreateCommand();
$cmd.CommandText = $sql
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $cmd
$cmd.Connection = $cn
$ds = New-Object System.Data.DataSet
$SqlAdapter.Fill($ds)
$cn.Close()
$Result = $ds.Tables[0]
Foreach ($item in $Result)
{
    #Write-Host $item.name
    $SnapshotUrl = $item.SnapshotUrl
    $ExportFileName = $item.ExportFileName
    (Invoke-WebRequest -Uri $SnapshotUrl -OutFile $ExportFileName -UseDefaultCredentials -TimeoutSec 240);
}
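For reference, the SnapshotUrl values produced by the query above take roughly this shape (the server name, report path and snapshot timestamp below are placeholders, not real values):

http://YourServerName/ReportServer/Pages/ReportViewer.aspx?/Some%20Folder/Some%20Report&rs:Command=Render&rs:Format=CSV&rs:Snapshot=2021-06-01T08:00:00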
https://learn.microsoft.com/en-us/sql/reporting-services/url-access-parameter-reference?view=sql-server-ver15
I was having trouble with PowerShell, so I thought I'd post a simplified version of my rough Python solution, inspired by the resource from @aduguid's answer.
import requests
from requests_negotiate_sspi import HttpNegotiateAuth
import os

def downloadFile(url, file_name, download_folder, session):
    response = session.get(url, stream=True)  # open the download link
    file_path = os.path.join(download_folder, file_name)
    with open(file_path, 'wb') as file:  # create a new file in write-binary mode
        for chunk in response.iter_content(chunk_size=1024):
            if chunk:
                file.write(chunk)

# Can also use '/Reports()' for non-linked reports.
# Can also pass in 'path="<report_path>"' instead of using id numbers,
# e.g. '.../Reports(path="/cool%20reports/my%20report")/HistorySnapshots'
api_url = r'http://<server_name>/reports/api/v2.0/LinkedReports(<item_id>)/HistorySnapshots'

session = requests.session()
session.auth = HttpNegotiateAuth()  # uses Windows log in
response = session.get(api_url)
hs_snapshot_list = response.json()['value']

for item_dict in hs_snapshot_list:
    download_url = (r'http://<server_name>/ReportServer/Pages/ReportViewer.aspx?<report_path>'
                    + '&rs:Snapshot=' + item_dict['HistoryId']
                    + '&rs:Format=CSV')
    downloadFile(download_url, '<your_file_name>', '<your_download_folder>', session)
SSRS API Resource:
https://app.swaggerhub.com/apis/microsoft-rs/SSRS/2.0#/Reports/GetReportHistorySnapshots

Selenium Python script runs but is not clicking or entering any values (Firefox)

The script runs all the way through, but it is not entering any data.
Here is my code:
from selenium import webdriver
from selenium.webdriver.support.ui import Select
import datetime
from login_credentials import *
from common_file import *
from selenium.webdriver.firefox.options import Options
from pyvirtualdisplay import Display
from selenium.webdriver.firefox.firefox_binary import FirefoxBinary
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities
start_time = str(sdate)+" "+ str(stime)
end_time = str(edate)+" "+ str(etime)
options = Options()
options.headless = True
driver = webdriver.Firefox(executable_path='/usr/bin/geckodriver',options=options)
driver.get("https://www.goeventz.com/")
driver.find_element_by_xpath("//a[contains(text(),'Login')]").click()
print("going")
#driver.find_element_by_id("userlogin")
driver.find_element_by_id("user_email").send_keys(ge_email)
driver.find_element_by_id("password").send_keys(ge_pswd)
#driver.find_elements_by_class_name(".btn-login").click()
#driver.find_element_by_css_selector('btn-login').click()
driver.find_element_by_xpath("//button[#type='submit']").click()
driver.find_element_by_xpath("//a[contains(text(),'Create Event') and #id='headerbtn']").click()
driver.find_element_by_name("title").clear()
driver.find_element_by_name("title").send_keys(eventname)
driver.find_element_by_xpath("//a[contains(text(),'Enter Address')]").click()
driver.find_element_by_xpath("//input[contains(#name,'venue_name')]").send_keys(full_address)
driver.find_element_by_name("start_date_time").clear()
driver.find_element_by_name("start_date_time").send_keys(start_time)
driver.find_element_by_name("end_date_time").clear()
driver.find_element_by_name("end_date_time").send_keys(end_time)
driver.find_element_by_id("fileToUpload").send_keys("/var/www/html/crons/event_posting/manual/test.jpg")
driver.find_element_by_xpath("//div[contains(#class,'fr-element fr-view')]").send_keys('description')
select = Select(driver.find_element_by_name("booknow_button_value"))
select.select_by_value('Register')
select = Select(driver.find_element_by_name("category"))
select.select_by_value("Sports")
select = Select(driver.find_element_by_name("othercategory"))
select.select_by_value('Festival')
driver.find_element_by_name("support_mobile").send_keys(cont_number)
driver.find_element_by_name('support_email').send_keys(email_id)
driver.find_element_by_name("makeeventlive").click()
print("its complted")
It runs completely on the server, but it does not enter any of the data provided; everything just comes out blank. (Screenshots of the console and browser output are not included here.)
This is common_file:
from dbconnection import get_conn
from datetime import datetime
connection_object, cursor = get_conn()
json_0 = []
json12_in_list = []
json_12 = []
json34_in_list = []
json_34 = []
json5 = []
json678_in_list = []
json_678 = []
json9 = []
json10 = []
main_json = {}
event_details = ''
with open('event_details.txt', 'r') as f:
    event_details = f.read()
event_id = int(event_details.split(',')[0])
site_id = int(event_details.split(',')[1])
site_name = str(event_details.split(',')[2])
#event_id =
sql = """SELECT * FROM articles2 WHERE id ='%d'""" %event_id
cursor.execute(sql)
data = cursor.fetchall()
for info in data:
    eventid = info[0]
    countryname = info[1]
    eventname = info[2]
    profileimg = info[5]
    banner0 = info[6]
    sdate = str(info[7])[:10]
    edate = str(info[8])[:10]
    addr1 = info[9]
    addr2 = info[10]
    pincode = info[11]
    full_address = info[15]
    state = info[12]
    city = info[13]
    stime = str(info[18])
    #s_time = datetime.strptime(stime,"%H:%M:%S")
    #stime = s_time.strftime("%I:%M:%S %p")
    etime = str(info[19])
    # e_time = datetime.strptime(etime,"%H:%M:%S")
    # etime = e_time.strftime("%I:%M:%S %p")
    description = info[20]
    src_url = info[26]
    json0 = {"event id":eventid, "country":countryname, "event name":eventname, "profile image":profileimg, "banner":banner0, "start date":sdate,
             "end date":edate, "address 1":addr1, "address 2":addr2, "pincode":pincode, "full address":full_address, "state":state, "city":city,
             "start time":stime, "end time":etime, "description":description, "source url":src_url}
    json_0.append(json0)

main_json['event info'] = json_0
#tickets
sql1 = """SELECT * FROM tickets WHERE event_id = '%d'""" %event_id
cursor.execute(sql1)
data1 = cursor.fetchall()
for info1 in data1:
    tktid = info1[0]
    eventid1 = info1[1]
    tktname = info1[2]
    original_tkt_price = info1[3]
    other_charges = info1[4]
    other_charges_type = info1[5]
    tkt_qty = info1[6]
    min_qty = info1[7]
    max_qty = info1[8]
    qty_left = info1[9]
    ticket_msg = info1[10]
    ticket_start_date = str(info1[11])[:10]
    ticket_start_time = str(info1[11])[11:]
    expiry_date = str(info1[12])[:10]
    expiry_time = str(info1[12])[11:]
    ticket_label = info1[13]
    active1 = info1[14]
..........................................................................
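One hedged idea, a sketch rather than a confirmed fix: in headless mode the page may not be fully rendered by the time send_keys runs, so adding explicit waits before each interaction is worth trying. The ids and names below are taken from the script above; the 20-second timeout is an arbitrary choice.

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

wait = WebDriverWait(driver, 20)  # wait up to 20 seconds for each element

# wait until the login e-mail box is actually present before typing into it
email_box = wait.until(EC.presence_of_element_located((By.ID, "user_email")))
email_box.send_keys(ge_email)

# the same pattern can be applied to the other fields, e.g. the event title
title_box = wait.until(EC.presence_of_element_located((By.NAME, "title")))
title_box.clear()
title_box.send_keys(eventname)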

Can not join group - dropping presence to unavailable resource - ejabberd

Hey, I'm trying to run a local instance of ejabberd and connect to it using Adium. I am able to do so, but I run into issues when trying to join a group chat.
I've tried creating a room by running ejabberdctl create_room room1 localhost localhost and then connecting through Adium, but here are the error messages I get:
2018-10-19 22:48:51.550 [debug] <0.2234.0>#xmpp_socket:parse:374 (tcp|<0.2234.0>)
Received XML on stream = <<"
<presence to='room1@localhost/dan'>
<c xmlns='http://jabber.org/protocol/caps'
node='http://pidgin.im/' hash='sha-1'
ver='DdnydQG7RGhP9E3k9Sf+b+bF0zo='/>
<x xmlns='http://jabber.org/protocol/muc'/>
</presence>
">>
and:
#presence{id = <<>>,type = available,lang = <<"en">>,
from = #jid{user = <<"danmiller">>,server = <<"localhost">>,
resource = <<"8c859086572c">>,luser = <<"danmiller">>,
lserver = <<"localhost">>,
lresource = <<"8c859086572c">>},
to = #jid{user = <<"room1">>,server = <<"localhost">>,
resource = <<"dan">>,luser = <<"room1">>,
lserver = <<"localhost">>,lresource = <<"dan">>},
show = undefined,status = [],priority = undefined,
sub_els = [#xmlel{name = <<"c">>,
attrs = [{<<"xmlns">>,
<<"http://jabber.org/protocol/caps">>},
{<<"node">>,<<"http://pidgin.im/">>},
{<<"hash">>,<<"sha-1">>},
{<<"ver">>,
<<"DdnydQG7RGhP9E3k9Sf+b+bF0zo=">>}],
children = []},
#xmlel{name = <<"x">>,
attrs = [{<<"xmlns">>,
<<"http://jabber.org/protocol/muc">>}],
children = []},
#vcard_xupdate{hash = <<>>}],
meta = #{ip => {0,0,0,0,0,0,0,1}}}
2018-10-19 22:48:51.552 [debug] <0.2234.0>#ejabberd_local:do_route:141 local route:
#presence{id = <<>>,type = available,lang = <<"en">>,
from = #jid{user = <<"danmiller">>,server = <<"localhost">>,
resource = <<"8c859086572c">>,luser = <<"danmiller">>,
lserver = <<"localhost">>,
lresource = <<"8c859086572c">>},
to = #jid{user = <<"room1">>,server = <<"localhost">>,
resource = <<"dan">>,luser = <<"room1">>,
lserver = <<"localhost">>,lresource = <<"dan">>},
show = undefined,status = [],priority = undefined,
sub_els = [#xmlel{name = <<"c">>,
attrs = [{<<"xmlns">>,
<<"http://jabber.org/protocol/caps">>},
{<<"node">>,<<"http://pidgin.im/">>},
{<<"hash">>,<<"sha-1">>},
{<<"ver">>,
<<"DdnydQG7RGhP9E3k9Sf+b+bF0zo=">>}],
children = []},
#xmlel{name = <<"x">>,
attrs = [{<<"xmlns">>,
<<"http://jabber.org/protocol/muc">>}],
children = []},
#vcard_xupdate{hash = <<>>}],
meta = #{ip => {0,0,0,0,0,0,0,1}}}
2018-10-19 22:48:51.552 [debug] <0.2234.0>#ejabberd_sm:do_route:651 processing packet to full JID:
#presence{id = <<>>,type = available,lang = <<"en">>,
from = #jid{user = <<"danmiller">>,server = <<"localhost">>,
resource = <<"8c859086572c">>,luser = <<"danmiller">>,
lserver = <<"localhost">>,
lresource = <<"8c859086572c">>},
to = #jid{user = <<"room1">>,server = <<"localhost">>,
resource = <<"dan">>,luser = <<"room1">>,
lserver = <<"localhost">>,lresource = <<"dan">>},
show = undefined,status = [],priority = undefined,
sub_els = [#xmlel{name = <<"c">>,
attrs = [{<<"xmlns">>,
<<"http://jabber.org/protocol/caps">>},
{<<"node">>,<<"http://pidgin.im/">>},
{<<"hash">>,<<"sha-1">>},
{<<"ver">>,
<<"DdnydQG7RGhP9E3k9Sf+b+bF0zo=">>}],
children = []},
#xmlel{name = <<"x">>,
attrs = [{<<"xmlns">>,
<<"http://jabber.org/protocol/muc">>}],
children = []},
#vcard_xupdate{hash = <<>>}],
meta = #{ip => {0,0,0,0,0,0,0,1}}}
2018-10-19 22:48:51.553 [debug] <0.2234.0>#ejabberd_sm:do_route:664 dropping presence to unavailable resource:
#presence{id = <<>>,type = available,lang = <<"en">>,
from = #jid{user = <<"danmiller">>,server = <<"localhost">>,
resource = <<"8c859086572c">>,luser = <<"danmiller">>,
lserver = <<"localhost">>,
lresource = <<"8c859086572c">>},
to = #jid{user = <<"room1">>,server = <<"localhost">>,
resource = <<"dan">>,luser = <<"room1">>,
lserver = <<"localhost">>,lresource = <<"dan">>},
show = undefined,status = [],priority = undefined,
sub_els = [#xmlel{name = <<"c">>,
attrs = [{<<"xmlns">>,
<<"http://jabber.org/protocol/caps">>},
{<<"node">>,<<"http://pidgin.im/">>},
{<<"hash">>,<<"sha-1">>},
{<<"ver">>,
<<"DdnydQG7RGhP9E3k9Sf+b+bF0zo=">>}],
children = []},
#xmlel{name = <<"x">>,
attrs = [{<<"xmlns">>,
<<"http://jabber.org/protocol/muc">>}],
children = []},
#vcard_xupdate{hash = <<>>}],
meta = #{ip => {0,0,0,0,0,0,0,1}}}
I believe I have the configuration set up correctly, with modules.mod_muc.access: all.
What am I missing?
I found the answer by paying closer attention to the documentation for the mod_muc module:
Module options:
host: HostName: This option defines the Jabber ID of the service. If the host
option is not specified, the Jabber ID will be the hostname of the virtual host
with the prefix ‘conference.’. The keyword “#HOST#” is replaced at start time with
the real virtual host name.
I was trying to connect and create a room at room1@localhost, when I needed to either connect to room1@conference.localhost instead, or add host: localhost to my configuration under the mod_muc section.
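For reference, a minimal mod_muc section in ejabberd.yml along those lines might look like this (a sketch; the host value is the choice described above, and all other options are left at their defaults):

modules:
  mod_muc:
    ## default service name; rooms are then addressed as room1@conference.localhost
    host: "conference.@HOST@"
    ## or, to keep the room1@localhost address, use instead:
    # host: "localhost"
    access: all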

Sphinx indexer is not working

I want to index an MS SQL 2008 database on Windows 7.
This is my config file:
source main
{
type = mssql
sql_host = localhost
sql_user = sa
sql_pass = pass
#mssql_winauth = 1
sql_db = article
sql_port = 1433 # optional, default is 3306
mssql_unicode = 1 # request Unicode data from server
sql_query_pre = insert into IndexLog(StartDateTime, Count, Successful) select top 1 getdate(), (select COUNT(id) from article),0 from article
sql_query = SELECT ID ,ROW_NUMBER() OVER(ORDER BY Title) AS TitleRowNo\
,Title,abstract \
FROM article
#Log index end
sql_query_post_index = update IndexLog set EndDateTime = getdate(), successful = 1 where id = (select max(id) from IndexLog)
sql_attr_uint = ID
sql_attr_uint = TitleRowNo
}
index news_main
{
source = main
path = C:/sphinx/article/data/news_main
docinfo = extern
charset_type = utf-8
min_word_len = 3
min_infix_len = 3
infix_fields = Title, abstract
# min_stemming_len = 0
# index_exact_words = 1
}
indexer
{
mem_limit = 32M
}
searchd
{
port = 6550 # 9312
log = C:/sphinx/article/log/searchd.log
query_log = C:/sphinx/article/log/query.log
read_timeout = 5
max_children = 0 # concurrent searches to run in parallel
pid_file = C:/sphinx/article/log/searchd.pid
max_matches = 1000
seamless_rotate = 1
preopen_indexes = 0
unlink_old = 1
collation_server = utf8_general_ci
collation_libc_locale = utf8_general_ci
}
I use this command to run the indexer:
indexer --config "C:\sphinx\a.sphinx.conf" news_main
but it does not work.
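Without any error output it is hard to say more, but a hedged first step is to run the indexer with --print-queries so it shows the SQL it actually sends to SQL Server, and to confirm the config path is the one being read:

indexer --config "C:\sphinx\a.sphinx.conf" --print-queries news_main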

Thinking Sphinx: Another index error

While indexing I get this error:
indexing index 'qtl_table_core'...
ERROR: index 'qtl_table_core': sql_range_query: 'soybase.qtl_table.QTLName' isn't in GROUP BY (DSN=mysql://_www:***@xxxxxxx/soybase).
My model:
class QtlTable < ActiveRecord::Base
  ....
  define_index do
    indexes :QTLID, :sortable => true
    indexes :QTLName, :sortable => true
  end
end
development.sphinx.conf
indexer
{
}
searchd
{
listen = 127.0.0.1:1234
log = /usr/home/benjamin/qtl/log/searchd.log
query_log = /usr/home/benjamin/qtl/log/searchd.query.log
pid_file = /usr/home/benjamin/qtl/log/searchd.development.pid
}
source qtl_table_core_0
{
type = mysql
sql_host = xxxxxxxxxxxxxx
sql_user = _www
sql_pass =
sql_db = soybase
sql_query_pre = SET NAMES utf8
sql_query_pre = SET TIME_ZONE = '+0:00'
sql_query = SELECT SQL_NO_CACHE `qtl_table`.`QTLID` * CAST(1 AS SIGNED) + 0 AS `QTLID` , `qtl_table`.`QTLID` AS `QTLID`, `qtl_table`.`QTLName` AS `QTLName`, `qtl_table`.`QTLID` AS `sphinx_internal_id`, 0 AS `sphinx_deleted`, 1786069111 AS `class_crc`, IFNULL(`qtl_table`.`QTLID`, '') AS `QTLID_sort`, IFNULL(`qtl_table`.`QTLName`, '') AS `QTLName_sort` FROM `qtl_table` WHERE (`qtl_table`.`QTLID` >= $start AND `qtl_table`.`QTLID` <= $end) GROUP BY `qtl_table`.`QTLID` ORDER BY NULL
sql_query_range = SELECT IFNULL(MIN(`QTLID`), 1), IFNULL(MAX(`QTLID`), 1) FROM `qtl_table`
sql_attr_uint = sphinx_internal_id
sql_attr_uint = sphinx_deleted
sql_attr_uint = class_crc
sql_attr_str2ordinal = QTLID_sort
sql_attr_str2ordinal = QTLName_sort
sql_query_info = SELECT * FROM `qtl_table` WHERE `QTLID` = (($id - 0) / 1)
}
index qtl_table_core
{
source = qtl_table_core_0
path = /usr/home/benjamin/qtl/db/sphinx/development/qtl_table_core
charset_type = utf-8
min_infix_len = 1
enable_star = 1
}
index qtl_table
{
type = distributed
local = qtl_table_core
}
Try adding the following inside your define_index block:
group_by "`qtl_table`.`QTLName`"
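Put together with the index definition from the question, that would look roughly like this (a sketch of the model, not tested against this schema):

class QtlTable < ActiveRecord::Base
  define_index do
    indexes :QTLID,   :sortable => true
    indexes :QTLName, :sortable => true

    # adds the column to the GROUP BY clause of the generated sql_query
    group_by "`qtl_table`.`QTLName`"
  end
end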