Google Data Studio Gives Error When a 3rd Schema Attribute is Added - google-apps-script

I have created a Custom Community Connector that currently has a schema with two attributes: a metric, Budget Amount, and a dimension, Budget Name. These are pulled from another site's API. When linking this connector to a Data Studio project I am able to create a pie chart that displays all budget names and amounts.
However, once I add a third attribute of any kind (e.g. a metric, Budget Spent, or a dimension, Company Name), the pie chart no longer shows and displays an error:
Script error message:
Script error cause: UNKNOWN
Script error stacktrace:
I have confirmed that data is flowing correctly by using logs within Apps Script and with the Table chart in Data Studio. The Table chart is the only chart that will show up, and only when all possible metrics and dimensions are a part of it.
I am currently looking for any way to debug this, solutions, or advice on where to go from here. I would like to expand the schema, but I cannot continue if charts stop working when I add more attributes.
If more information or code is needed, let me know and I will provide whatever is needed. Thank you for your time.

The Debugging Guide should get you started with debugging. I recommend you use Stackdriver logging to log the request and response for each function. You can see an example of logging here.
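As a minimal sketch of what that could look like inside your connector (getDataImpl here is a hypothetical name standing in for your existing getData body, renamed):

function getData(request) {
  // Log the incoming request so it appears in Stackdriver / Apps Script logs.
  console.log(JSON.stringify({message: 'getData request', request: request}));
  try {
    var response = getDataImpl(request); // your existing data-building logic
    // Log the outgoing response to compare against what Data Studio reports.
    console.log(JSON.stringify({message: 'getData response', response: response}));
    return response;
  } catch (e) {
    // Surface the real failure instead of the generic UNKNOWN script error.
    console.log(JSON.stringify({message: 'getData error', error: e.toString()}));
    throw e;
  }
}

Comparing the logged schema fields against the logged rows should show whether the third attribute's data matches the schema you return.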

Related

UiPath Studio - Data Scraping - Cannot find pattern

This website (Palantir Foundry, a pay-only cloud service, so it is not publicly available) seems to use some React table package that creates a table with headers (which one can sort, filter, etc.) and then rows of data.
We want to capture the header row and the first data row with UiPath Studio (Enterprise) 2020.10.2.
I can use the Chrome extension to grab a row, but when I go for the second element in the pattern I get the pop-up error "Cannot find all pattern elements". Any suggestions on things to try?

Not able to get value from custom table in PowerApps portal

I have created a custom table (entity) in Dataverse for my Power Apps portal and added some dummy data. The table name is "TestTable" and it has a column "TestColumn". I am trying to get data from it in the Power Apps portal using a Liquid query.
{{entities.cr3c9_testtable['ef5398fe-c68f-eb11-b1ac-000d3af25ac1'].cr3c9_testcolumn}}
I am able to get the value in the portal studio.
However, when I browse the site, the value does not show.
I have cleared the cache multiple times, but the result is the same.
Please let me know if anyone has an idea about this.
Thanks in advance.
Check whether the table/column name is correct. Since the column is named Test Column, its schema name should be something like cr3c9_TestColumn.
Also check that you have an entity permission for cr3c9_testtable with read access, and assign the appropriate web roles to that permission.
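As a hedged sketch (the exact prefix and casing depend on your environment's publisher prefix and the column's actual schema name), the Liquid tag from the question would then become:

{{entities.cr3c9_testtable['ef5398fe-c68f-eb11-b1ac-000d3af25ac1'].cr3c9_TestColumn}}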

SSIS OData VSTS WIQL: An error occurred while retrieving the metadata

My project currently uses an SSIS package to get data from VSTS. We use an OData source to get a particular collection, e.g. WorkItems. That works fine.
What I want to do, but can't figure out how, is to retrieve the results of a saved query within VSTS. What I've tried is copying the URL from the VSTS website into the OData Source Editor under "Query options". I get data back when I hit the preview button, but when I try to save or view Columns I get the message: "An error occurred while retrieving the metadata". Does anyone know what's going on, or could give me a suggestion?
If that doesn't work, is there a way I could use WIQL to query VSTS? If I could do that from SSIS, or anything else, that would solve my problem as well.
Use the VSTS REST API instead.
Also, the REST API supports WIQL: Wiql - Query By Wiql
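As a rough sketch of what such a call could look like (the account and project names and the query text are placeholders, and the api-version may differ in your setup):

POST https://{account}.visualstudio.com/{project}/_apis/wit/wiql?api-version=4.1
Content-Type: application/json

{
  "query": "Select [System.Id], [System.Title] From WorkItems Where [System.TeamProject] = @project"
}

For a saved query, the same API area also exposes a query-by-id endpoint (GET .../_apis/wit/wiql/{id}), which returns the results of an existing stored query rather than ad-hoc WIQL.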

TF293000: The data warehouse has detected data conflicts for the following work item fields

Hi, I'm looking for help with the following issue:
In TFS, on our SSRS report server, whenever I run any of the out-of-the-box Sprint Burndown reports, the report seems to run successfully, but I get the TF293000 error from the title in the bottom right-hand corner.
Through some research I found that the issue was due to the field definitions in that particular collection not matching the other collections that we have in TFS. Simple...
In order to determine which field definition in the collection was the issue, I used the witadmin listfields command for all of my collections:
witadmin listfields /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy
This led me to find that the Synchronizes Identity Name Changes definition in the collection mentioned in the TF293000 error was set to true, while it is false in all of my other collections. Issue found! Should be easy from here... wrong.
The following command should solve my problem:
witadmin changefield /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy /syncnamechanges:false
*of course with the proper collection URL substituted for the word Collection
However when run and after I confirm that I want to make the change I get the following error:
TF401327: The operation is not supported. The feature is obsolete.
I looked the error up and it took me to this page: TFS Known Issue, which tells me it is a known issue that was resolved in Update 1... we have Update 3.
I then attempted to simply edit the WIT .xml file and update the attribute for that WIT in that collection to false, but when I import the change to the server it tells me it has imported successfully; however, when I export it I see that the file has not changed.
I have also tried copying the .xml file from the same WIT in another collection and uploading that to the offending collection, and that does not work either. I've never had an issue with uploading a WIT before, as we've made several changes to our TFS workflow in the past. I'm pretty stuck at this point and just wondering if anyone else has experienced this issue before. Thanks!
According to the error info, it seems there is a conflict in the TFS data warehouse because two fields in different collections have different attributes; since there is only one data warehouse, this causes a conflict. To avoid schema conflicts when you export and process data to the data warehouse databases, you must assign the same values to these attributes across all collections:
- Field type (the value for this field cannot be changed for an existing field)
- Reporting type
- Reporting name
What you have done is the correct operation: change/update the attribute for the field in one project collection to match the assignments made in the other project collections.
You could try to narrow down the issue: does it only happen with that specific field in that team project collection? Are all other work item fields working correctly? Also give other collections a try, for example change syncnamechanges to true and then set it back to false, to see whether any issue occurs (a sketch follows below).
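For example, using the same command as above (OtherCollection is a placeholder for one of your healthy collection URLs):

witadmin changefield /collection:OtherCollection /n:Microsoft.VSTS.Common.ReviewedBy /syncnamechanges:true
witadmin changefield /collection:OtherCollection /n:Microsoft.VSTS.Common.ReviewedBy /syncnamechanges:false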
Run the command line on the TFS server machine instead of your development machine, and clear the TFS cache. If the field is not used for reporting in those project collections, you could also try to mark it as non-reportable. For more details, please refer to the links below:
Resolve data warehouse schema conflicts
Change a reportable attribute for a work item field

CloudConnect CSV buffer size

When I try to load a big CSV from a zip file, the execution log gives me the following error:
----------------------------------------- Error details ------------------------------------------
Component [Clientes:CLIENTES1] finished with status ERROR.
The size of data buffer is only 100663296. Set appropriate parameter in defaultProperties file.
--------------------------------------------------------------------------------------------------
How can I set the appropriate parameter in defaultProperties file?
I tried this link, but my CloudConnect run configurations page is different from the one shown there.
I created the parameters file and filled in the additional parameters with the values the tutorial specifies (see below), and the same error appears on the screen.
Name: -config; Value: new_buffer_size.txt
The new_buffer_size.txt content has just this line:
DEFAULT_INTERNAL_IO_BUFFER_SIZE = 200000000
How can I solve this problem? I need to solve this before the world explodes.
CloudConnect is designed for developing ETLs that run on GoodData cloud workers, and therefore some lower-level settings are bypassed, as in this case. The only legitimate way forward is to modify the ETL so that it can process the data with the current settings. Regarding the docs: the referenced article is outdated. The GoodData docs team is aware of it and is preparing a docs refactoring.
Note: As you have probably noticed, CloudConnect is powered by Javlin's CloverETL, so feel free to check their forums; there you can find how to overcome the issue at a lower level (no UI), but that would only work for data processing on the local machine.