Power Automate updating expression automatically to wrong value? - json

Not sure if anyone else has noticed this behavior in Power Automate. I would click on a dynamic content expression and see something I needed to fix, like body('parse json')?['variable_1']?['variable_2'], where ['variable_2'] is not actually inside ['variable_1']. After deleting the ?['variable_1'] part and clicking update, clicking the expression again would pull up the old value, body('parse json')?['variable_1']?['variable_2'], again. I had to delete the whole expression body('parse json')?['variable_1']?['variable_2'], go to Dynamic content, re-add the token, and update; then it worked. I believe it has to do with changing my JSON schema around, or possibly with how the caching is done. If the dynamic content was created with the old schema, where ['variable_2'] was inside ['variable_1'], it keeps correcting back to the old version, because under that schema you can't get to ['variable_2'] without ?['variable_1'] in front of it. I'll try to recreate this phenomenon again, but I was curious if any of you had seen it. I think it means that after changing a JSON schema you need to delete dynamic content created against the old schema, or it may autocorrect based on the old schema it is still pointing to.
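To make the schema change concrete, here's roughly what I mean (the property names are just the placeholders above, and the exact schemas are only an illustration, not my real ones).

Old Parse JSON schema, where variable_2 sits inside variable_1, so the token resolves to body('parse json')?['variable_1']?['variable_2']:

{
  "type": "object",
  "properties": {
    "variable_1": {
      "type": "object",
      "properties": {
        "variable_2": { "type": "string" }
      }
    }
  }
}

New schema, where variable_2 is a top-level property, so the expression should just be body('parse json')?['variable_2']:

{
  "type": "object",
  "properties": {
    "variable_1": { "type": "object" },
    "variable_2": { "type": "string" }
  }
}

Dynamic content tokens created while the old schema was in place seem to keep snapping back to the old path until they are deleted and re-added from Dynamic content.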

Related

After Delete Data Macro

I am running into a problem doing something very simple with MS Access data macros. When a record is deleted from T_Roster, I want it copied to t_Deleted. I know there are a plethora of other and better ways to accomplish this; however, the requirements are such that a data macro is required. The end user/manager does not want me to add columns to the roster table or use a form (both of which I had initially suggested). That said, I have been scouring the internet for a solution and see that this question has been answered before; however, I have not been able to get this simple macro to work on my end. I have finally worked up the nerve to ask the question here again, so please be kind.
There are only two tables in this DB: t_Deleted and T_Roster.
I have attached a screenshot of what my current macro looks like. In the interest of keeping things simple, I only want the "OCIO_Name" field to copy over for now. I assume that if this test works, the rest of the fields will not be an issue.
Table properties are as follows:
T_Roster Table Properties
t_Deleted Table Properties
Macro in T_Roster:
T_Roster AfterDelete Macro
Can anyone tell me what is wrong with what they see?
Edit: "Does not work" means that t_Deleted is not updated when I delete a record from T_Roster.
Fields set as Required will prevent creating a new record unless data is provided for them. Either don't set them as Required, or provide data for them in the macro.
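For reference, a sketch of what the AfterDelete macro usually looks like for this (only OCIO_Name comes from the question; [Old] is how the just-deleted record is referenced in an AfterDelete data macro):

CreateRecord
    In:  t_Deleted
    SetField
        Name:  t_Deleted.OCIO_Name
        Value: [Old].[OCIO_Name]

If t_Deleted has any other fields marked Required, the CreateRecord will fail without a visible error (it only shows up in USysApplicationLog), so either add a SetField for each of those fields as well or clear their Required property.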

How do I refresh a CSV data set in QuickSight and not replace the data set, as this loses my calcs

I am looking to refresh a data set in QuickSight; it is in SPICE. The data set comes from a CSV file that has been updated and now has more data than the original file I uploaded.
I can't seem to find a way to simply repoint to the same file with the same format. I know how to replace the file, but whenever I do this it states that it can't create some of my calculated fields and so drops multiple rows of data!
I assume I'm missing something obvious but I can't seem to find the right method or any help on the issue.
Thanks
Unfortunately, QuickSight doesn't support refreshing file data-sets to my knowledge. One solution, however, is to put your CSV in S3 and refresh from there.
The one gotcha with this approach is that you'll need to create a manifest file pointing to your CSV. This isn't too difficult and the QuickSight documentation is pretty helpful.
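For example, a minimal manifest for a single CSV in S3 looks something like this (the bucket and key are placeholders):

{
  "fileLocations": [
    { "URIs": [ "s3://your-bucket/path/your-file.csv" ] }
  ],
  "globalUploadSettings": {
    "format": "CSV",
    "delimiter": ",",
    "containsHeader": "true"
  }
}

You then create the data set from the S3 data source using that manifest; when the file in S3 is replaced with a newer version, you can refresh the SPICE data set instead of re-uploading and replacing it.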
You can replace the data source by going into the Analysis and clicking on the pencil icon next to the dataset. By replacing the dataset there, you will not lose any calculated fields that were already defined on the old dataset.
If you instead try to replace the data source from the Datasets page directly, you'll lose all calculated fields, modifications, etc.
I don't know when this was introduced but you can now do this exact thing through the "Edit Dataset", starting either from the Dataset page or from the 'pencil' -> Edit dataset inside an Analysis. It is called "update file" and will result in an updated dataset (additional or different data) without losing anything from your analysis including calculated fields, filters etc.
The normal caveat applies in that the newer uploaded file MUST contain the same column names and datatypes as the original - although it can also contain additional columns if needed.

MS Access "Write Conflict" error when adding new field to record source

I want to preface this by saying I don't have any real programming background; I'm just trying to update an existing database at work.
We have an Access database, and I want to add an additional Y/N checkbox to an existing form. The form updates a SQL Server table. Currently the record source is a SQL statement.
I can go to the SQL table, add a new field and make it Yes/No data type. There are other Yes/No fields in the table and the default settings for the new field are identical to the others. I next go and update the linked table through External Data in the ribbon. I can go into the table in Access and see the new field - so far, so good.
Next, I go to the form's design view and its properties, open the record source, and update the SQL statement to include the new field. (I've also tried this through the query builder, same result.) From here, I start to get the error.
If I go back to form view and change any data in the form and hit the next record button or save button, I get the Write Conflict error. "This record has been changed by another user since you started editing it..." The 'Save Record' button is greyed out. I am the only person accessing the database or SQL server.
I've tried finishing the new check box control and linking it to the new field in its Control Source (that went fine), but it didn't make any difference. If I go in and edit the record source to remove the new field, everything works again (but of course, the new field then isn't in the Control Source list, so it isn't linked to the check box).
Any ideas? Thanks.
A strong contender for the reason for your problem is the form itself.
Most likely the form is set up with a specific query as the Record Source. That's not to say that this is inherently incorrect, but it makes adding new columns to the source significantly more difficult.
The problem would likely be solved if you just changed the Record Source to reference the table itself, rather than a query, if that is in fact how it is referenced.
If MS Access pulls data from a table using a query through a form, it will inherently pessimistically lock the table in question, making it unable to be modified. However, you can usually still modify the table through the query itself, but you would need to add the column changes to that query first.
I'm not quite sure if I am making sense here, but something like this (for a table called "Table1"):
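Record Source: Table1

(set directly in the form's property sheet), or equivalently from VBA:

Me.RecordSource = "Table1"

That is, point the form straight at the table rather than at a SELECT statement or a saved query.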
In theory, if the form is the problem, then if you close it and try the table modifications again, the changes should work while the form is closed.
Someone else fixed it for me, but from what I understand, it was a communication issue between SQL Server and Access. Something with the null setting on the new column in SQL Server not being set in a way that Access could understand.
We had the issue narrowed down. When the new field was added to the table, you couldn't change any info directly in the table, but you could with the form.
If you added the new field to the form's record source, you couldn't edit any info at all.
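If it's the same root cause others have reported, a new Yes/No (bit) column in SQL Server that allows NULLs, then Access can't interpret the NULL and reports a write conflict. The fix is typically along these lines (table and column names are placeholders; I can't say this is exactly what was run in our case):

-- give the new bit column a default and backfill the NULLs Access chokes on
ALTER TABLE dbo.MyTable ADD CONSTRAINT DF_MyTable_NewFlag DEFAULT 0 FOR NewFlag;
UPDATE dbo.MyTable SET NewFlag = 0 WHERE NewFlag IS NULL;
-- optionally disallow NULLs going forward
ALTER TABLE dbo.MyTable ALTER COLUMN NewFlag bit NOT NULL;

After that, refresh the linked table in Access again so it picks up the new definition.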
Thanks for everyone's input.

How to know if ADOQuery.Post will change the database?

I have a project in Delphi 7 that uses a database in MySQL to store some configuration.
Whenever I change a config, a "Save" button is enabled. The OnClick procedure of this button selects the fields with the SQL property, calls TADOQuery.Open and TADOQuery.Edit, and sets the various fields with FieldByName. At the end, it calls TADOQuery.Post to save the configuration and then Requery.
This works well, but only if at least one of these fields has actually changed.
If, for example, I check a checkbox (originally unchecked) and then uncheck it again, the Save button becomes enabled, but the actual data in the database doesn't change. In this case, when I call Post, an exception is raised.
I saw this question that would solve my problem by checking if there are any modifications, but as soon as I set the first field, the Modified property of TADOQuery becomes true, invalidating this solution.
Another option would be to check, before setting each field, whether its value will actually change, and set a flag that decides at the end whether to post or not. But there are hundreds of fields to do that for, which would be pretty tedious.
A third alternative I thought of is to create a new field in the database with a "Last Modified" timestamp, which would force there to always be at least one modification, but I prefer not to mess with the existing database.
Is there any other way to know if a TADOQuery.Post will trigger an exception because no data has really changed? How can I solve this problem? Or is there a simple workaround for it?
The ADOQuery variable is dynamically created in the Save button's routine (and freed at the end).
A nice approach would be to use CheckBrowseMode() instead of Post().
ADOQuery1.CheckBrowseMode
CheckBrowseMode():
Automatically posts or cancels data changes when the active record changes.
You can read more about it here:
http://docs.embarcadero.com/products/rad_studio/delphiAndcpp2009/HelpUpdate2/EN/html/delphivclwin32/DB_TDataSet_CheckBrowseMode.html
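For illustration, a rough sketch of how the Save handler could look with this, assuming a dynamically created query as described in the question (the form, connection component, table, field and control names are placeholders, not taken from the original project):

procedure TConfigForm.btnSaveClick(Sender: TObject);
var
  Qry: TADOQuery;
begin
  Qry := TADOQuery.Create(nil);
  try
    Qry.Connection := ADOConnection1;  // placeholder connection component
    Qry.SQL.Text := 'SELECT option_a, option_b FROM config WHERE id = 1';
    Qry.Open;
    Qry.Edit;
    Qry.FieldByName('option_a').AsBoolean := CheckBox1.Checked;
    Qry.FieldByName('option_b').AsString := Edit1.Text;
    // Posts the pending edit only if the dataset still reports modifications,
    // otherwise cancels it, so Post is never called with nothing to save.
    Qry.CheckBrowseMode;
    Qry.Requery;
  finally
    Qry.Free;
  end;
end;

Whether this avoids the exception depends on whether Modified really stays False when the values end up unchanged, so it is worth testing against the checkbox case from the question.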

Data source retains schema after drop and add

I have several packages that are almost identical; they differ only by columns added or removed in different database versions. When I copy a package and modify the data flow of the copy, I delete the OLE DB Source and add a new one. Once the new one is defined, its preview shows exactly what I expect. The columns, however, are still from the OLE DB Source that was deleted. It's like it is being cached somewhere.
It seems like I need to close the package and re-open it after removing the data source. Is there some other way to clear this cached state? What's going on internally that causes this to happen?
More... it looks like it's the parameterized connection manager that is holding on to previous parameters until the package is closed and re-opened.
If I understand your workflow, you are copying and pasting packages and then tweaking the source definition in the data flow. The challenge is that the CustomerID in one system is varchar(7) and defined as varchar(12) in another. The "trick" becomes having the design engine recognize the metadata change and behave accordingly.
My usual hack is to radically change the source. I find using the query SELECT 1 AS foo does the trick. After doing that, the metadata for the OLE DB Source component drops all references to existing columns, which percolates to the downstream components. I then switch back to the proper source and double-click the first red X to have it map the IDs from old to new.
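Spelled out as the two source queries (dbo.MyTable and its columns are just placeholder names):

-- 1. Temporarily swap the source query so every existing column reference is dropped:
SELECT 1 AS foo

-- 2. Then restore the real query so the metadata is rebuilt from scratch:
SELECT T.MyColumn, T.IsFine FROM dbo.MyTable AS T

Then fix up the downstream mappings via the red X as described above.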
If you want a more brain-surgical route than civil-war surgery, change the column name in your source for anything that should have registered a metadata change. Thus SELECT T.MyColumn, T.IsFine FROM dbo.MyTable AS T becomes SELECT T.MyColumnX, T.IsFine FROM dbo.MyTable AS T. Now only the first column gets kiboshed throughout the data flow. Reset it back to the "right" column name and all is well.
Internally, I don't know, but that never stops me from guessing. Validation fires off, and the SSIS engine recognizes that the data types are still compatible, so it doesn't change the existing metadata. A column no longer existing is enough to make it sit up and take notice, and so the cached sizing goes away.
Some folks like to try to use the Advanced Editor properties to change the sizes, but I find I have better success just using the above approach rather than changing the size only to have the designer slap my hand and disallow my proposed changes.