How to export results from an SOQL query to CSV? Currently I'm using workbench.developerforce.com but I get an error when I try to use Bulk CSV.
Failed: InvalidBatch : Failed to process query: FUNCTIONALITY_NOT_ENABLED: Foreign Key Relationships not supported in Bulk Query
I'm guessing the Bulk API doesn't support queries with relationship fields, though it's fine with single objects.
I also tried using the Developer Console Query Editor but there is no option to export the results to CSV. Is there any other way to do this?
The FuseIT SFDC Explorer has an option to export SOQL query results as CSV, and it will handle paging through all the query results for you.
Disclosure: I work for the company that makes this product. It is free to download and use.
Though this is a very old post: as of API v39.0, the Bulk API supports relationship fields in queries.
https://developer.salesforce.com/docs/atlas.en-us.204.0.api_asynch.meta/api_asynch/asynch_api_using_bulk_query.htm
In API v39.0, Bulk API queries still don't support the following SOQL:
COUNT
ROLLUP
SUM
GROUP BY CUBE
OFFSET
Nested SOQL queries
For API v38.0, Bulk API queries don't support the following SOQL:
COUNT
ROLLUP
SUM
GROUP BY CUBE
OFFSET
Nested SOQL queries
Relationship fields
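If you're stuck on v38.0 or earlier, one workaround (a sketch of my own, not from the original post) is to flatten the relationship: export the raw lookup Id instead of the parent field, then export the parent object separately and join the two CSVs outside Salesforce. Assuming a hypothetical Contact-to-Account report:
This fails in Bulk API before v39.0, because Account.Name is a relationship field:
SELECT Name, Account.Name FROM Contact
This works, because AccountId is a plain field on Contact:
SELECT Name, AccountId FROM Contact
Then export the parents in a second batch and match AccountId to Id offline:
SELECT Id, Name FROM Account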
I'm using Data Studio to generate a financial report dashboard and I'm connecting it to Cloud SQL for MySQL, but my problem is that it only lets me select a single table as a data source, and one table isn't enough to generate a financial report.
[Screenshot: the process of selecting a Data Source]
I tried selecting Custom Query, which according to this: https://support.google.com/datastudio/answer/7088031?hl=en
Select the CUSTOM QUERY option to provide a SQL query instead of connecting to a single table. Google Data Studio uses this custom SQL as an inner select statement for each generated query to the database.
But I don't know what query I should write to make all my database tables available as data sources in Google Data Studio.
Regarding Custom Queries: I had a look online and didn't find a sample CUSTOM QUERY specific to Google Data Studio and Google Cloud SQL for MySQL; however, there are a couple of StackOverflow posts on BigQuery Custom Queries involving joins that may be useful:
Data Studio query error when using Big Query view that joins tables
BigQuery Data Studio Custom Query
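For what it's worth, a custom query for this connector is just a standard MySQL SELECT that Data Studio wraps as an inner select, and joins are allowed. A minimal sketch with hypothetical invoices and customers tables (your table and column names will differ; note there is no trailing semicolon, since the query gets wrapped):
SELECT i.invoice_date, i.amount, c.customer_name
FROM invoices i
JOIN customers c ON c.customer_id = i.customer_id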
An alternative is to create individual Data Sources each linked to a single table and then link multiple Data Sources through the use of Data Blending, where a common Join Key links all the respective Tables.
In addition, if you could elaborate on your exact scenario, it would perhaps help users familiar with SQL to provide a more precise solution:
How are the tables structured?
How are the tables linked?
What code have you currently tried?
I also had quite a few issues with the Custom Query using the Cloud MySQL Connector by Google for Data Studio.
The resolution for me was to not run SELECT * but rather to SELECT each column by name. I'm not sure why it doesn't like SELECT *, but hopefully this helps someone else.
[Screenshot: example of a successful query]
[Screenshot: example of a successful query with a join]
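Since the screenshots aren't reproduced here, the shape of the queries that worked was roughly the following (table and column names are hypothetical):
SELECT order_id, order_date, total
FROM orders
SELECT o.order_id, o.total, c.customer_name
FROM orders o
JOIN customers c ON c.customer_id = o.customer_id
The point is simply that every column is named explicitly instead of using SELECT *.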
I have created a View Criteria in ADF that searches employees by name, with the name passed in through a bind variable. When I search using equals it fetches results.
However, my requirement is to search using CONTAINS. When I use CONTAINS it doesn't fetch any data and shows no records found.
I believe that CONTAINS uses Oracle-style wildcards, which might not be the same as what MySQL expects.
Turn on fine logging on the ADF BC layer to see the SQL that is issued and verify.
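To see what you're looking for in that log output, compare the two dialects (an illustration with a hypothetical ENAME column, not the exact SQL ADF emits). Oracle-style concatenation, which is typically what gets generated for CONTAINS:
SELECT * FROM EMPLOYEES WHERE ENAME LIKE '%' || :bindName || '%'
What MySQL expects instead, since || is a logical OR in MySQL's default SQL mode:
SELECT * FROM EMPLOYEES WHERE ENAME LIKE CONCAT('%', :bindName, '%')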
This error was occurring because I was querying MySQL but the generated syntax was Oracle's. What we can do here is change the view criteria's query execution mode from Database to In Memory; this resolves the error, and it worked for me.
I'm new to Elasticsearch. I have learnt how to issue different queries and get search results, with the understanding that each document is stored in JSON format. Is it possible to insert records that were obtained from an SQL query on a relational database? If it is possible, how is it done? By converting each record into JSON format?
You need to build an index in Elasticsearch similar to the way you've got your tables in the RDBMS; this can be done in a lot of ways, and it really depends on what data you need to access via Elasticsearch. You shouldn't just dump your complete RDBMS data into ES.
If you search around you may find bulk data importers/synchronisers/rivers (deprecated) for getting your RDBMS data into ES; some of these can run in the background and keep the indexes in ES up to date with your RDBMS.
You can write your own code as well, which updates ES whenever any data is changed in your RDBMS. Look into the Elasticsearch client API for your platform: https://www.elastic.co/guide/en/elasticsearch/client/index.html
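If it helps to see what "converting each record into JSON" can look like, the database itself can do the conversion. A sketch assuming MySQL 5.7+ and a hypothetical employees table; each result row is one JSON document ready to be sent to ES through the bulk or index API:
SELECT JSON_OBJECT('id', id, 'name', name, 'salary', salary)
FROM employees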
I'm starting to build an API using Laravel with PostgreSQL as the data source. From what I've read so far, Laravel's Eloquent ORM currently does not support querying indexed JSON (April 2014). For example, the following simple query is not (currently) possible in Eloquent:
SELECT * FROM mytable WHERE json_data->>'firstname' = 'Paul'
As a result, it looks like I'm manually building queries until Eloquent supports it.
As this is an API, we'll need to take data from the URL/cURL query strings:
https://myapi.com/api/v1/people?firstname=Paul
results in
SELECT * FROM people WHERE json_data->>'firstname' = 'Paul'
Assuming that I do my best to sanitise the incoming query strings, am I taking a bit of a risk performing straight queries on JSON field data? Seeing as the JSON data type has not been around for long, I'm still evaluating it for use within this project.
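For what it's worth, whatever sanitising you do, the lower-risk pattern is to never interpolate the query-string value into the SQL at all and bind it as a parameter instead; the ->> operator extracts the text value before the comparison, so binding works exactly as it would for an ordinary column. A sketch with a PDO-style named placeholder (Laravel's raw query methods accept bindings like this):
SELECT * FROM people WHERE json_data->>'firstname' = :firstname
with the value from the URL ('Paul') passed as the :firstname binding rather than concatenated into the query string.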
So here is my situation: I have a vendor-supplied DB we cannot modify, and a custom DB that imports data from the vendor app and acts on it. Once records are imported from the vendor app, they must no longer appear in the list of records to be imported. Also, we only want to display the 250 most recent records that have not been imported.
What I originally started with was selecting the list of ids that had been imported from the custom db, and then querying the vendor db using that list in a .Where(x => !idList.Contains(x.Id)) clause on the remote query.
This worked up until we broke 2100 records imported into the custom db, as 2100 is the limit on the number of parameters that can be passed to SQL Server. After finding out this was the actual problem, and not the 'invalid buffer'/'severe error' that ADO.NET reported, my solution was to remove the first 2000 ids in the remote query and then filter out the remaining records in the local query.
Having to pull back a large number of irrelevant records, just to exclude them, so I can get the correct 250 records seems very inelegant. Is there a better way to do this, short of doing a cross db stored procedure?
Thanks in advance.
This might not be the best answer, depending on how many records you're dealing with, but you could force the SQL to execute and just deal with the results as in-memory objects. Calling the ToList() method will execute the SQL and give you the results as an in-memory IEnumerable.
What I might suggest is to start by querying the vendor database first, ordering the results by some kind of criteria (perhaps a date field, oldest to most recent).
You could do a Skip().Take() to "skim" the results and then take each bulk set and insert them into the custom db where the ID doesn't already exist. That way you avoid the problem you have now.
If you have db-create access to the SQL Server that the vendor's db is running on (or if your custom db is on the same server), you could create a "has been imported" table in a different database on that same server, and then write a stored proc that does a cross-database join of that table against the vendor db, e.g.:
select top 250 *
from vendordb.dbo.to_be_imported tbi
where not exists
  (select 1 from customdb.dbo.has_been_imported hbi
   where hbi.idWasImported = tbi.idToBeImported)
order by whatever;
You might even be able to do this in Linq 2 SQL -- I've never tried adding objects from different databases into a single DataContext...