How to get row count of all tables available under 50 schemas in Teradata? - teradata-sql-assistant

I cannot create a stored procedure to get the counts, as I only have read access on the Teradata server.
Please suggest a query to get the row count of all tables available under 50 schemas from DBC.TablesV.
I can't find a data dictionary view like Oracle's ALL_TABLES to fetch row counts.
Please help!!
I did find a row count under dbc.tablestatsv, but it is stored as a decimal and many of the values are wrong.

The row count in the DBC stats views is a snapshot as of the last COLLECT STATISTICS, so it should be considered approximate at best. The only way to get an accurate count is to do a SELECT COUNT(*) from each table. If you can't use a stored procedure, then you will have to iterate through the list with some client-side scripting; fairly simple to do in Java, Python, etc. Or you can do it in two steps: generate a bunch of SELECT statements using the dictionary view, then run that generated SQL:
SELECT 'SELECT '''||DatabaseName||''','''||TableName||''',COUNT(*) FROM '||DatabaseName||'.'||TableName||';'
FROM DBC.TablesV
WHERE DatabaseName IN (_list of names_)
AND TableKind IN ('T','O'); -- limit to base tables so the generated COUNT(*) doesn't hit views or macros
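If you would rather get everything back as a single answer set, the same generator idea can be tweaked to emit UNION ALL branches instead; this is just a sketch, and the trailing UNION ALL on the last generated line has to be removed by hand before you run the second step:
SELECT 'SELECT '''||DatabaseName||''' AS DatabaseName, '''||TableName||''' AS TableName, COUNT(*) AS RowCnt FROM '||DatabaseName||'.'||TableName||' UNION ALL'
FROM DBC.TablesV
WHERE DatabaseName IN (_list of names_)
AND TableKind IN ('T','O');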

Related

mysql 5.7 loop through results of a SELECT statement

I have a simple MySQL database server and want to use the data from the results of one complex query to fetch data from another table. This is the pseudocode:
originTable='SELECT id from original_table WHERE #some_complex_conditions'
for row in $originTable do
SELECT profile,description FROM newTable WHERE id=${row.id}
done
I've seen examples online using cursors or while loops, but nothing seems to work in MySQL 5.7. I'm hoping there's a simple way to loop through a table and use the rows like a JSON element. Any help appreciated.
Thanks!
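Assuming id is the column that links the two tables, that particular pattern can usually be collapsed into a single statement by joining against the first query as a derived table; a sketch, with the original filter left as a placeholder:
SELECT n.profile, n.description
FROM newTable AS n
JOIN (
  SELECT id
  FROM original_table
  WHERE 1 = 1  -- replace with #some_complex_conditions
) AS o ON o.id = n.id;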

Problem with UNION when a table does not exist

I want to get data from several tables which have the same column names in one SQL statement, for example:
SELECT name, age FROM table_a UNION SELECT name, age FROM table_b UNION...
But table_x may not exist, and I can't control which tables the people sending me the request include; if one of the tables in the query doesn't exist, the whole query fails. Is there any syntax to avoid that?
I know I could use SHOW TABLES to get all tables in the database and compare them against the request parameters first, but I was hoping to do it with MySQL syntax alone.
The short answer is no. If you are using another language in front of it, such as PHP or any other language really, you can query the tables as you suggest, but SQL expects the query to be syntactically valid, and if it's not it will error. There is one (IMO bad) way to do this, if you must: you could use a stored procedure, which would allow you to dynamically build the query as you would in PHP or another language, but that's about all you have with MySQL (or any database that I know of).
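As a rough sketch of that stored-procedure idea, assuming all of the candidate tables expose name and age, you could check information_schema for which ones actually exist and build the UNION from that (the table names here are the hypothetical ones from the question):
DELIMITER //
CREATE PROCEDURE union_existing_tables()
BEGIN
  -- Build a "SELECT name, age FROM ..." branch only for the tables that exist
  SELECT GROUP_CONCAT(CONCAT('SELECT name, age FROM ', table_name) SEPARATOR ' UNION ')
    INTO @sql
    FROM information_schema.tables
   WHERE table_schema = DATABASE()
     AND table_name IN ('table_a', 'table_b', 'table_x');
  -- @sql is NULL if none of the tables exist; guard before executing
  IF @sql IS NOT NULL THEN
    PREPARE stmt FROM @sql;
    EXECUTE stmt;
    DEALLOCATE PREPARE stmt;
  END IF;
END//
DELIMITER ;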

Select SQL query FROM table, select using that query?

I'm trying to do the following in SQL alone. In the end it will end up on a WSO2 DSS server, but if it can be done in SQL alone, even better :)
Pseudocode:
Array results=Array;
result = <sql>select id, query from definitions</sql>
foreach result.query
r=<sql>query</sql>
results.push(r)
I am trying to run a select on table a that returns 2 columns.
One of the two columns is named query, and I want to then execute that query returning
id, query_title, query_text
We can assume that the query column always returns the same columns (through aliases written in the query)
The other option would be doing this in WSO2 DSS; however, I thought it can only do what SQL does. Maybe by joining with the ESB I could get it if that doesn't work, but really my goal is to do it ALL in SQL, as I am going to insert the information and then update it into another table anyway.
You cannot do this with a single select query.
One solution is to do this in two steps. Fetch the query in the application using your SQL and then execute the second query from the application.
The second solution is to use a stored procedure and prepare/execute. How you then fetch the results depends on the nature of the results.
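A minimal sketch of the prepare/execute route, assuming a MySQL-style backend and that each row of definitions stores one complete query that returns id, query_title, query_text:
DELIMITER //
CREATE PROCEDURE run_stored_query(IN def_id INT)
BEGIN
  -- Pull the query text saved for this definition
  SELECT query INTO @sql FROM definitions WHERE id = def_id;
  -- Execute it as dynamic SQL; its result set is returned to the caller
  PREPARE stmt FROM @sql;
  EXECUTE stmt;
  DEALLOCATE PREPARE stmt;
END//
DELIMITER ;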

How do I store and iterate over a result set in a MySQL Stored Proc?

I'm relatively new to MySQL stored procs, so I was hoping someone could help me out here. I want to call my stored proc (maybe pass in one IN param) and have it do the following:
SELECT some data
Iterate over the records
Perform some operations on a few of the fields in each record including some INSERTs in other tables based on the data it finds.
My problem is that I don't know how to store the SELECT data set and iterate the records. I know how to declare and set stuff like int and text, but not full data sets. How do I do this?
Thanks
Look into MySql Cursors
http://dev.mysql.com/doc/refman/5.0/en/cursors.html
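For reference, a bare-bones cursor loop looks roughly like this (a sketch only; source_table, category, audit_log and the column names are made-up placeholders):
DELIMITER //
CREATE PROCEDURE process_records(IN p_filter INT)
BEGIN
  DECLARE done INT DEFAULT 0;
  DECLARE v_id INT;
  DECLARE v_name TEXT;
  -- Cursor over the rows to process
  DECLARE cur CURSOR FOR
    SELECT id, name FROM source_table WHERE category = p_filter;
  DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;

  OPEN cur;
  read_loop: LOOP
    FETCH cur INTO v_id, v_name;
    IF done THEN
      LEAVE read_loop;
    END IF;
    -- Per-row work, e.g. an INSERT into another table based on the fetched values
    INSERT INTO audit_log (source_id, note)
      VALUES (v_id, CONCAT('processed ', v_name));
  END LOOP;
  CLOSE cur;
END//
DELIMITER ;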

How can you exclude a large number of records in a cross db query using LINQ2SQL?

So here is my situation: I have a vendor-supplied DB we cannot modify, and a custom DB that imports data from the vendor app and acts on it. Once records are imported from the vendor app, they cannot appear on the list of records to be imported. Also, we only want to display the 250 most recent records that have not been imported.
What I originally started with was selecting the list of IDs that have been imported from the custom DB, and then querying the vendor DB using that list of IDs in a .Where(x => !idList.Contains(x.Id)) clause on the remote query.
This worked up until we broke 2100 records imported into the custom DB, as 2100 is the limit on the number of parameters that can be passed into SQL. After finding out this was the actual problem, and not the 'invalid buffer'/'severe error' ADO.NET reported, my solution was to remove the first 2000 IDs in the remote query and then remove the remaining records in the local query.
Having to pull back a large number of irrelevant records just to exclude them, so I can get the correct 250 records, seems very inelegant. Is there a better way to do this, short of doing a cross-db stored procedure?
Thanks in advance.
This might not be the best answer, depending on how many records you're dealing with, but you could force the SQL to execute and just deal with the results as in-memory objects. Calling the ToList() method will execute the SQL and convert the results to an IEnumerable.
What I might suggest is to start by querying the vendor database first, ordering the results by some kind of criterion (perhaps a date field, oldest to most recent).
You could do a Skip().Take() to "skim" the results, then take each bulk set and insert the records into the custom DB where the ID doesn't already exist. That way you avoid the problem you have now.
If you have db-create access to the SQL Server that the vendor's db is running on (or if your custom db is on the same server), you could create a "has been imported" table in a different database on that same server, and then write a stored proc that does a cross-database join of that table against the vendor db, e.g.:
select top 250 tbi.*
from vendordb.dbo.to_be_imported tbi
where not exists
(select 1 from customdb.dbo.has_been_imported hbi where hbi.idWasImported = tbi.idToBeImported)
order by whatever;
You might even be able to do this in Linq 2 SQL -- I've never tried adding objects from different databases into a single DataContext...