Read the last line of a CSV file and extract one value in KNIME

I am working on a workflow in KNIME, and I have an Excel Writer node as the final node of my workflow. I need to read this file and store the last value of one specific column (time). With this value, I then need to feed another time node to update my API link and make a new request.
To summarize: I need to extract specific information from the last line of my Excel file in KNIME.
My question is: how can I read this file and get this value from my sheet? And then, how can I update a time loop to refresh the data so that the current day is inserted in my API link?
UPDATE: My question is how I can always filter the last 90 days in my concatenated database. I have two date columns in this file, and I need to keep only the last 90 days counting back from the current day.

To read an Excel file, use the Excel Reader node.
The simplest way to get the last row of a table (assuming your date column has a value for every row of this table?) is probably to use a Rule-based Row Filter with the expression
$$ROWINDEX$$ = $$ROWCOUNT$$ => TRUE
Now you have a one-row table with the values from the last line of the Excel sheet. To help further, we need to understand what you mean by "update a time loop to refresh the data for inserting the current day in my API link". Can you update your question with a screenshot of your current KNIME workflow?
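If it helps to see the same idea as a script, here is a minimal pandas sketch of reading the written file and taking the time value from its last row (the file name and column name are assumptions; this could also be adapted to a KNIME Python Script node):

import pandas as pd

# Read the sheet written by the Excel Writer node.
# "output.xlsx" and the column name "time" are assumptions -- substitute your own.
df = pd.read_excel("output.xlsx")

# Grab the value of the "time" column in the last row.
last_time = df["time"].iloc[-1]
print(last_time)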

Related

Is there any way to get a value from Apps Script and update the sheet in real time instead of a calculated field?

I have two spreadsheet files.
File A shows results to customers. You can select a type and an amount, and it shows the total price and time.
File B is the data file. It must be invisible to customers. It contains price data by value, and since the price calculation is not simple, I think it needs Apps Script.
What I want to implement is auto-updating the field, the way calculated fields do. It should update the result fields right away and automatically.
My first question is: is there any way to call the Apps Script like an API? It needs to be called to get the return value.
My second question is: is it possible to update the result fields right away and automatically? As far as I know, there is no way to detect the completion event.

VB.NET How to obtain previous record entered from datatable?

I would like to obtain the last record entered in my database and display it in a textbox.
Here is what I tried
Previous_Project_Float_Detail.Text = projectRecordDT.Rows(projectRecordDT.Rows("FloatNo").Count - 1).ToString()
projectRecordDT is my datatable populated with records using mySqlDataAdapter.
In my database, I would like to retrieve the last record of the column FloatNo.
Usually I am able to retrieve this by creating a new DataRow, but how do I obtain it efficiently without having to loop over the records with DataRows every time?
I have managed to solve my own question: I had misplaced a parenthesis with the column name.
The correct way is:
Previous_Project_Float_Detail.Text = projectRecordDT.Rows(projectRecordDT.Rows.Count - 1)("FloatNo").ToString()
The column name goes after the row count, not inside it.
Also, this only works if the records are already in the correct order in your database.

Does the Socrata SODA API support getting a list of dates on which the dataset was modified?

Does the Socrata SODA API support a method to query out all the dates a dataset has been updated? Basically a changelog for the dataset that has an object for every modification/update to a dataset.
There is an existing question that asks for the last modified date (you can get it through the "/data.json API available on all Socrata-powered sites").
There is also a method to get the modified dates of individual rows using system fields and the :updated_at field. But this is incomplete: a data provider might update every row each time, so there is no guarantee that we are really getting back a history of modifications, just the most recent modification to each row.
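For illustration, a row-level query against the :updated_at system field might look like this in Python (a sketch only; the domain and dataset id are placeholders, and whether system fields can be selected depends on the site's API version):

import requests

# Placeholder domain and dataset id -- substitute a real Socrata-powered site.
DOMAIN = "data.example.gov"
DATASET = "abcd-1234"

# Ask for each row's :updated_at system field, newest first.
resp = requests.get(
    f"https://{DOMAIN}/resource/{DATASET}.json",
    params={"$select": ":updated_at", "$order": ":updated_at DESC", "$limit": 10},
)
resp.raise_for_status()
print(resp.json())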
I'm looking for the complete list of modification dates, at least. We are trying to get a sense of activity on datasets and we need to know how often they are being updated.
Unfortunately, Max, we don't offer what you're looking for. We've got the last time the dataset and metadata were modified, but not a changelog of every single time that there was a change.
A surprisingly large number of datasets change very frequently, as often as every 5 minutes.

Export data from Access to Excel and batch every 1000 records in individual cells

A customer has asked me to export a recordset (which equates to one column) from Access to Excel, which I'm fine with doing using VBA. However, they want the data batched, so every 1000 records will be placed in one cell, and so on. They have asked that the records be separated by commas. I suspect they are then feeding the records into another application (like Business Objects).
So for example:
cell A1 would look like: 1235,1234,2346,346 etc
cell B2 would look like: 7994,345,345
Can anyone offer any help on this as I'm stuck?
You need to create a query that returns one record for every 1000 records, concatenating them into a single field, and then export it to Excel as you already have done.
Using a concatenation function in your query, you can concatenate grouped records into a single field. You can group them in batches of 1000 by using a row counter, dividing it by 1000 and taking the integer part.
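For illustration only, here is the batching logic sketched in Python (not the Access query itself; the values are made up):

# Hypothetical values standing in for the exported recordset.
records = [1235, 1234, 2346, 346, 7994, 345, 345]
batch_size = 1000

# Slice the records into groups of 1000 and join each group with commas --
# one resulting string per target Excel cell.
cells = [
    ",".join(str(r) for r in records[i:i + batch_size])
    for i in range(0, len(records), batch_size)
]
print(cells)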

How do I populate a field with static header row information on import?

As it stands, I am currently looking to import data from an Excel spreadsheet into a table on a monthly basis. The header row in the spreadsheet contains the date that the original query was run.
I have one master table in Access consisting of data from multiple files. I would like to set up an automated process to capture the date in the header upon import, and then record it in a field for every new record that was imported.
There are a few caveats here:
Spreadsheet sizes will vary depending on where data exists.
I have no control over how the data is provided. Fields that contain no data for the month will not populate to the spreadsheet.
Less frequently, fields will be added that do not already exist.
So far I have been identifying these new additions manually and creating a new field for them at the end of the field list. I realize that this is very inefficient and I would like to automate it, if I can.
Does anyone have any insight? Any assistance would be greatly appreciated.
OK, here are the steps you'll want to take.
Create a link from Access to your Excel spreadsheet. Access will now see this as a table.
Create a make-table query using the Excel table as the source, adding the date (derived from a sub-query) as an additional field.
Then run the query. This will automatically create all the fields.
If, however, you need to create new fields in an existing table, then you'll have to use VBA, read each header in the Excel table, compare it to the schema of the existing table, and execute an alter table query to add the field.
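If it helps to see the underlying idea outside of Access, here is a rough pandas sketch of capturing the header date and stamping it onto every imported record (the file name, header layout, and column name are assumptions, not part of the Access steps above):

import pandas as pd

# Assumption: the query date sits in cell A1 and the real column headers
# start on the second row of the sheet.
raw = pd.read_excel("monthly_extract.xlsx", header=None)
run_date = raw.iloc[0, 0]

# Re-read the data proper and record the header date on every imported row.
data = pd.read_excel("monthly_extract.xlsx", skiprows=1)
data["ImportDate"] = run_date
print(data.head())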
Good luck