How to track query execution time in N1QL on Couchbase Server?

Is there an easy way to get the query time in Couchbase Server's N1QL, like .explain() in MongoDB?
I have a query like SELECT c.name, c.description FROM customer c and I would like to trace the time it takes.

In N1QL, responses are JSON and include metadata. The metrics field contains statistics, including the time taken to execute the query (notably elapsedTime and executionTime).
In the 2.2.0 developer preview of the Java SDK, you can access these metrics as a QueryMetrics object via the info() method on the QueryResult.
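Independent of any SDK, the metrics come back in the JSON envelope the query service returns. A minimal sketch of pulling them out (the response below is a trimmed sample: the field names match the documented metrics section, but the values are illustrative):

```python
import json

# A trimmed sample of the JSON envelope the N1QL query service returns.
# Field names follow the documented "metrics" section; values are
# illustrative only.
sample_response = """
{
  "results": [ {"name": "Acme", "description": "Widgets"} ],
  "status": "success",
  "metrics": {
    "elapsedTime": "4.5ms",
    "executionTime": "4.3ms",
    "resultCount": 1,
    "resultSize": 58
  }
}
"""

response = json.loads(sample_response)
metrics = response["metrics"]
print(metrics["elapsedTime"])    # total request time
print(metrics["executionTime"])  # time spent executing the statement
```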

Related

select json output from Micros Database

Has anyone who uses the "old" Micros database platform found a way to select the results of a query as JSON content? Meaning, select a from tablea and have it returned as {"a": "somevalue"}.
This is not a question about the mainstream Oracle database but rather about the other database platform called Micros.

What is the SQL injection likelihood on a raw PostgreSQL JSON field query?

I'm starting to build an API using Laravel with PostgreSQL as the data source. From what I've read so far, Laravel's Eloquent ORM currently does not support querying indexed JSON (April 2014). For example, the following simple query is not (currently) possible in Eloquent:
SELECT * FROM mytable WHERE json_data->>'firstname' = 'Paul'
As a result, it looks like I'm manually building queries until Eloquent supports it.
As this is an API, we'll need to take data from the URL/cURL query strings:
https://myapi.com/api/v1/people?firstname=Paul
results in
SELECT * FROM people WHERE json_data->>'firstname' = 'Paul'
Assuming that I do my best to sanitise the incoming query strings, am I taking a risk by running straight queries on JSON field data? Seeing as the JSON data type has not been around for long, I'm still evaluating it for use within this project.
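Sanitising alone is fragile; the standard defence is to whitelist which JSON keys a caller may filter on and to bind the value as a query parameter so the driver escapes it. A minimal sketch (function and table names are hypothetical; the %s placeholder style matches psycopg2):

```python
# A minimal sketch: whitelist the JSON keys a caller may filter on, and
# pass the value as a bind parameter so the driver (e.g. psycopg2)
# escapes it -- never interpolate user input into the SQL string.

ALLOWED_KEYS = {"firstname", "lastname", "city"}

def build_person_query(key, value):
    if key not in ALLOWED_KEYS:
        raise ValueError(f"filtering on {key!r} is not allowed")
    # The key is safe to inline only because it came from the whitelist;
    # the value travels separately as a bind parameter.
    sql = f"SELECT * FROM people WHERE json_data->>'{key}' = %s"
    return sql, (value,)

sql, params = build_person_query("firstname", "Paul")
# cursor.execute(sql, params)  # with a real database cursor
```

Because the key never reaches the SQL text unless it is in the whitelist, a malicious query string such as ?firstname;DROP TABLE=x simply raises an error instead of being interpolated.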

SalesForce: Export SOQL query results to CSV

How can I export the results of a SOQL query to CSV? Currently I'm using workbench.developerforce.com, but I get an error when I try to use Bulk CSV.
Failed: InvalidBatch : Failed to process query: FUNCTIONALITY_NOT_ENABLED: Foreign Key Relationships not supported in Bulk Query
I'm guessing the Bulk API doesn't support queries with relationships; it's fine with single tables, though.
I also tried using the Developer Console Query Editor but there is no option to export the results to CSV. Is there any other way to do this?
The FuseIT SFDC Explorer has an option to export SOQL query results as CSV, and it will handle paging through all the query results for you.
Disclosure: I work for the company that makes this product. It is free to download and use.
Though this is a very old post: the v39.0 API now supports relationship fields in Bulk queries.
https://developer.salesforce.com/docs/atlas.en-us.204.0.api_asynch.meta/api_asynch/asynch_api_using_bulk_query.htm
As of v39.0, a Bulk API query doesn't support the following SOQL:
COUNT
ROLLUP
SUM
GROUP BY CUBE
OFFSET
Nested SOQL queries
In v38.0 and earlier, the unsupported list additionally includes relationship fields.

How to access query execution history?

I want to know how long a certain query took to execute in SQL Server 2008. I could have known if I had put a Profiler trace on the process ID before I executed the query, but I forgot.
Is there any way to pull this information out of SQL Server without running the query again?
You can use the DMV sys.dm_exec_query_stats. There is a lot more information you can get from the query below, such as reads and writes; use * to see all the columns available. Note that the elapsed-time columns are reported in microseconds.
SELECT
    t.text AS QueryText,
    s.last_elapsed_time
FROM sys.dm_exec_query_stats s
CROSS APPLY sys.dm_exec_sql_text(s.sql_handle) t

Is there a way to filter a SQL Profiler trace?

I'm trying to troubleshoot this problem using SQL Profiler (SQL 2008)
After a few days of running the trace in production, the error finally happened again, and now I'm trying to diagnose the cause. The problem is that the trace has 400k rows, 99.9% of which come from "Report Server", which I don't even know why it's running, but it seems to be pinging SQL Server every second...
Is there any way to filter out some records from the trace, to be able to look at the rest?
Can I do this with the current .trc file, or will I have to run the trace again?
Are there other applications to look at the .trc file that can give me this functionality?
You can load a captured trace into SQL Server Profiler: Viewing and Analyzing Traces with SQL Server Profiler.
Or you can load into a tool like ClearTrace (free version) to perform workload analysis.
You can load into a SQL Server table, like so:
SELECT * INTO TraceTable
FROM ::fn_trace_gettable('C:\location of your trace output.trc', default)
Then you can run a query to aggregate the data such as this one:
SELECT
    COUNT(*) AS TotalExecutions,
    EventClass,
    CAST(TextData AS nvarchar(2000)) AS QueryText,
    SUM(Duration) AS DurationTotal,
    SUM(CPU) AS CPUTotal,
    SUM(Reads) AS ReadsTotal,
    SUM(Writes) AS WritesTotal
FROM TraceTable
GROUP BY
    EventClass,
    CAST(TextData AS nvarchar(2000))
ORDER BY ReadsTotal DESC
Also see: MS SQL Server 2008 - How Can I Log and Find the Most Expensive Queries?
It is also common to set up filters on the trace before starting it. For example, a commonly used filter is to capture only events that require more than a certain number of reads, say 5000.
Load the .trc file locally, save it to a table in a local database (File > Save As > Trace Table), and then query it to your heart's content.
These suggestions are great for an existing trace - if you want to filter the trace as it occurs, you can set up event filters on the trace before you start it.
The most useful filter in my experience is the application name. To use it, you have to ensure that every connection string used to connect to your database has an appropriate Application Name value in it, e.g.:
"...Server=MYDB1;Integrated Security=SSPI;Application Name=MyPortal;..."
Then in the trace properties for a new trace, select the Events Selection tab, then click Column Filters...
Select the ApplicationName filter and add values under LIKE to include only the connections you have indicated; e.g. putting MyPortal in the LIKE field will include only events for connections with that application name.
This will stop you from collecting all the crud that Reporting Services generates, for example, and make subsequent analysis a lot faster.
There are a lot of other filters available as well, so if you know what you are looking for, such as long execution (Duration) or large IO (Reads, Writes) then you can filter on that as well.
Since SQL Server 2005, you can filter a .trc file's content directly from SQL Profiler, without importing it into a SQL table. Just follow the procedure described here:
http://msdn.microsoft.com/en-us/library/ms189247(v=sql.90).aspx
An additional hint: you can use '%' as a wildcard in filters. For instance, to match HostName values beginning with SRV, use SRV%.
Here you can find a complete script to query the default trace with the complete list of events you can filter:
http://zaboilab.com/sql-server-toolbox/anayze-sql-default-trace-to-investigate-instance-events
You have to query sys.fn_trace_gettable(@TraceFileName, default), joining sys.trace_events to decode the event numbers.
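The decode-and-aggregate step that join performs can be sketched outside the server as well. A minimal Python sketch, assuming trace rows exported as dictionaries (the EventClass IDs below are a few well-known Profiler event classes; the rows themselves are illustrative):

```python
from collections import defaultdict

# A few well-known Profiler EventClass IDs; the authoritative list
# lives in sys.trace_events.
EVENT_NAMES = {10: "RPC:Completed", 12: "SQL:BatchCompleted",
               14: "Audit Login", 15: "Audit Logout"}

# Illustrative rows, shaped like fn_trace_gettable output.
rows = [
    {"EventClass": 12, "TextData": "SELECT 1", "Duration": 120},
    {"EventClass": 12, "TextData": "SELECT 1", "Duration": 80},
    {"EventClass": 10, "TextData": "exec sp_foo", "Duration": 300},
]

# Group by (decoded event name, statement text), mirroring the
# GROUP BY aggregation query shown earlier.
totals = defaultdict(lambda: {"count": 0, "duration": 0})
for row in rows:
    key = (EVENT_NAMES.get(row["EventClass"], "Unknown"), row["TextData"])
    totals[key]["count"] += 1
    totals[key]["duration"] += row["Duration"]

for (event, text), agg in sorted(totals.items()):
    print(event, text, agg["count"], agg["duration"])
```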