Using the following query, I found that for items that have a stock location, there are multiple rows returned from the REST API StockLocations of Exact Online:
select spn.item_code_attr || '-' || spn.warehouse_code_attr || '-' || stn.code key
, itm.itemgroupcode
, itm.itemgroupdescription
, spn.item_code_attr
, spn.item_description
, spn.currentquantity
, spn.planning_in
, spn.planning_out
, spn.currentquantity + spn.planning_in - spn.planning_out plannedquantity
, -1 bestelniveau /* out of scope */
, itm.costpricestandard costprijs
, itm.costpricestandard * spn.currentquantity stockvalue
, spn.warehouse_code_attr
, stn.code locatie
, itm.unitcode UOM
, itm.id
, whe.id
, sln.stock
, sln.itemid
, sln.warehouse
, stn.id
from exactonlinexml..StockPositions spn
join exactonlinerest..items itm
on itm.code = spn.item_code_attr
and itm.code = 'LE-10242'
and itm.isstockitem = 1
join exactonlinerest..warehouses whe
on whe.code = spn.warehouse_code_attr
left
outer
join exactonlinerest..stocklocations sln
on sln.itemid = itm.id
and sln.stock != 0
and sln.warehouse = whe.id
left
outer
join storagelocations stn
on stn.id = sln.storagelocation
and stn.warehouse = sln.warehouse
--
-- Filter out no stock nor planned.
--
where ( spn.currentquantity !=0
or
spn.planning_in != 0
or
spn.planning_out != 0
)
and spn.item_code_attr = 'LE-10242'
order
by key
For example, for this item there are 10 StockLocations rows. When I sum the field Stock, the total matches the stock quantity found in StockPositions. However, it seems that every transaction creates an additional StockLocations entry.
I would expect StockLocations to contain, per storage location, the total quantity in stock at that location.
EDIT
The StockLocations API is described in https://start.exactonline.nl/api/v1/{division}/logistics/$metadata as:
<EntityType Name="StockLocation">
<Key>
<PropertyRef Name="ItemID"/>
</Key>
<Property Name="ItemID" Type="Edm.Guid" Nullable="false"/>
<Property Name="Warehouse" Type="Edm.Guid" Nullable="true"/>
<Property Name="WarehouseCode" Type="Edm.String" Nullable="true"/>
<Property Name="WarehouseDescription" Type="Edm.String" Nullable="true"/>
<Property Name="Stock" Type="Edm.Double" Nullable="true"/>
<Property Name="StorageLocation" Type="Edm.Guid" Nullable="true"/>
<Property Name="StorageLocationCode" Type="Edm.String" Nullable="true"/>
<Property Name="StorageLocationDescription" Type="Edm.String" Nullable="true"/>
</EntityType>
Somehow it is not documented at https://start.exactonline.nl/docs/HlpRestAPIResources.aspx
What am I doing wrong?
I discussed this question with an engineer during a hackathon. This is how the StockLocations API works; the naming does not optimally reflect the contents, but it is intended behaviour.
With a select field, sum(stock) from stocklocations group by field you can get the correct totals.
To improve join performance, it is recommended to use an inline view for this, such as select ... from table1 join table2 ... join ( select field, sum(stock) from stocklocations group by field ).
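For example, a minimal sketch of such an inline view applied to the original query (assuming you want the totals per item and warehouse; add the storagelocation column to the group by for totals per storage location):
select spn.item_code_attr
, spn.warehouse_code_attr
, sln.stock total_stock
from exactonlinexml..StockPositions spn
join exactonlinerest..items itm
on itm.code = spn.item_code_attr
join exactonlinerest..warehouses whe
on whe.code = spn.warehouse_code_attr
left
outer
join ( select sln.itemid
       , sln.warehouse
       , sum(sln.stock) stock
       from exactonlinerest..stocklocations sln
       group
       by sln.itemid
       , sln.warehouse
     ) sln
on sln.itemid = itm.id
and sln.warehouse = whe.id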
Related
Currently, when I look at the insurance expiry date in my MS Report Builder report it shows 03/04/2017, but the expiry date for the same insurance in CRM is 04/04/2017, which is correct. The report is one day behind. Why is that, and what might be the fix, so that the report shows the same date as CRM (04/04/2017)? I have been researching, and some articles say to use the UTC value (the one that does not start with 23 hours), but I am not sure how to apply this in my query. I am in the UK, and the CRM options are already set for the UK (I have checked). Please advise a fix for this.
SELECT 'PAS ' +
SectionName AS SectionName,
SectionKey,
FormName,
ItemName,
ImportSequenceNumber,
ExpiryDate,
ExpiresOn,
@Param_MonthlyStatement_EntityRecordId AS AccountId
FROM
(SELECT
sect.mm_name AS SectionName,
sect.mm_key AS SectionKey,
frm.mm_name AS FormName,
frm.mm_name AS ItemName,
frm.mm_importsequencenumber AS ImportSequenceNumber,
MAX(frmans.mm_expires) AS ExpiryDate,
DATEADD(m, 2, GETDATE()) AS ExpiresOn
FROM Filteredmm_section AS sect INNER JOIN
mm_form AS frm ON sect.mm_sectionid = frm.mm_section INNER JOIN
mm_formanswer AS frmans ON frmans.mm_form = frm.mm_formId INNER JOIN
Account AS acc ON frmans.mm_AccountID = acc.AccountId
WHERE (sect.mm_name LIKE '%-%')
AND (sect.mm_parentsection IS NULL)
AND (CONVERT(NVARCHAR(250), frmans.mm_AccountID)
= @Param_MonthlyStatement_EntityRecordId)
AND ( acc.mm_supplier = 1)
GROUP BY sect.mm_name, sect.mm_key, frm.mm_name, frm.mm_importsequencenumber
HAVING (MAX(frmans.mm_expires) BETWEEN GETDATE() AND DATEADD(m, 2, GETDATE()))) AS t1
WHERE (NOT EXISTS (SELECT TOP (1) mm_accountid FROM Filteredmm_formanswer
WHERE (mm_formname = t1.FormName) AND (mm_accountid = @Param_MonthlyStatement_EntityRecordId) AND (mm_statusname = 'Awaiting Verification')))
ORDER BY SectionName, FormName, ImportSequenceNumber
You might want to get the expiry date from the filtered view rather than the base table, since the query already uses Filteredmm_section: the CRM Filtered* views return date/time fields converted to the user's time zone, while the base tables store them in UTC, which would explain the one-day offset. Perhaps try
INNER JOIN Filteredmm_formanswer AS frmans ON frmans.mm_form = frm.mm_formId
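As a rough sketch, the modified FROM clause could then look like this (assuming the filtered view exposes the same mm_form, mm_accountid and mm_expires columns, with the dates already converted to your time zone); the rest of the query stays the same:
FROM Filteredmm_section AS sect INNER JOIN
mm_form AS frm ON sect.mm_sectionid = frm.mm_section INNER JOIN
Filteredmm_formanswer AS frmans ON frmans.mm_form = frm.mm_formId INNER JOIN
Account AS acc ON frmans.mm_accountid = acc.AccountId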
I am trying to generate a report in uniCenta oPOS 4.2.2. The report is created in Jasper, but there is a corresponding .bs file. There is only one SQL query, and for my report I need multiple queries in the .jrxml file.
This is the sales_extendedcashregisterlog.bs file, which has only one SQL query set with report.setSentence().
report = new com.openbravo.pos.reports.PanelReportBean();
report.setTitleKey("Menu.ExtendedCashRegisterLog");
report.setReport("/com/openbravo/reports/sales_extendedcashregisterlog");
report.setResourceBundle("com/openbravo/reports/sales_extendedcashregisterlog_messages");
report.setSentence("SELECT " +
"tickets.TICKETID AS TICKET_NO, " +
"receipts.DATENEW AS TICKET_DATE, " +
"people.NAME AS PERSON, " +
"payments.PAYMENT AS PAYMENT, " +
"payments.NOTES, " +
"payments.TOTAL AS MONEY, " +
"payments.TENDERED " +
"FROM ((tickets tickets " +
"LEFT OUTER JOIN people people ON (tickets.PERSON = people.ID)) " +
"RIGHT OUTER JOIN receipts receipts ON (receipts.ID = tickets.ID)) " +
"LEFT OUTER JOIN payments payments ON (receipts.ID = payments.RECEIPT) " +
"WHERE ?(QBF_FILTER) " +
"ORDER BY TICKET_DATE ASC");
report.addParameter("receipts.DATENEW");
report.addParameter("receipts.DATENEW");
paramdates = new com.openbravo.pos.reports.JParamsDatesInterval();
paramdates.setStartDate(com.openbravo.beans.DateUtils.getToday());
// JG - 8 Jan 14 paramdates.setEndDate(com.openbravo.beans.DateUtils.getToday());
paramdates.setEndDate(com.openbravo.beans.DateUtils.getTodayMinutes());
report.addQBFFilter(paramdates);
report.addField("TICKET_NO", com.openbravo.data.loader.Datas.STRING);
report.addField("TICKET_DATE", com.openbravo.data.loader.Datas.TIMESTAMP);
report.addField("PERSON", com.openbravo.data.loader.Datas.STRING);
report.addField("PAYMENT", com.openbravo.data.loader.Datas.STRING);
report.addField("NOTES", com.openbravo.data.loader.Datas.STRING);
report.addField("MONEY", com.openbravo.data.loader.Datas.DOUBLE);
report.addField("TENDERED", com.openbravo.data.loader.Datas.DOUBLE);
return report;
Now we have a .jrxml file. I can edit this file using the palette feature, but it is the SQL query that gives me the fields I can use. Here is the query that the .jrxml file already has:
<queryString>
<![CDATA[SELECT
tickets.TICKETID AS TICKET_NO,
receipts.DATENEW AS TICKET_DATE,
payments.TOTAL AS MONEY,
people.NAME AS PERSON,
payments.PAYMENT AS PAYMENT
FROM receipts
LEFT JOIN tickets ON receipts.ID = tickets.ID
LEFT JOIN payments ON receipts.ID = payments.RECEIPT
LEFT JOIN people ON tickets.PERSON = people.ID
ORDER BY tickets.TICKETID]]>
</queryString>
<field name="TICKET_NO" class="java.lang.String"/>
<field name="TICKET_DATE" class="java.util.Date"/>
<field name="PERSON" class="java.lang.String"/>
<field name="PAYMENT" class="java.lang.String"/>
<field name="NOTES" class="java.lang.String"/>
<field name="MONEY" class="java.lang.Double"/>
<field name="TENDERED" class="java.lang.Double"/>
The fields are already there. Now I want to add more than one SQL query; how exactly can I do that?
Basically, I am trying to generate a transaction log in uniCenta oPOS with all the needed data.
I have the following query:
select sie.invoicedate sie_invoicedate
, sie.Silitem sle_item
, sie.Silitemcode sle_itemcode
, sie.Silitemdescription sle_itemdescription
, sie.Silnetprice sle_netprice
, sie.Silquantity sle_quantity
, sie.Silunitprice sle_unitprice
, ctr.ctr_code ctr_code
, ctr.ctr_name ctr_name
, ctr.parent_code parent_code
, ctr.parent_name parent_name
, gdlsn.ssrserialnumber serialnumber
from SalesInvoicesExploded sie
join customers#inmemorystorage ctr
on ctr.ctr_id = sie.invoiceto
join GoodsDeliveryLineSerialNumbers gdlsn
on gdlsn.salesorderlineid = sie.silid
where sie.invoicedate >= '2016-01-01'
and sie.invoicedate < '2016-01-03'
order
by sie.invoicedate
How can I get the serial numbers for only this date range? In the debugger I see a lot of requests going to Exact Online.
For now, there is no good filter possibility to get the result you want.
The problem is that the gdlsn.salesorderlineid = sie.silid join condition cannot be applied until both data sets have been fetched.
Only specific filters are executed server-side (like your invoicedate >= '2016-01-01'). This is quite a hard nut to crack from the program side.
It would work if you can specify a filter that can be determined beforehand, for instance when the date in GoodsDeliveryLineSerialNumbers.Created always comes after the invoice date. Narrowing down the set on that date would mean a significant performance improvement.
I suggest using something like this, if possible:
select sie.invoicedate sie_invoicedate
, sie.Silitem sle_item
, sie.Silitemcode sle_itemcode
, sie.Silitemdescription sle_itemdescription
, sie.Silnetprice sle_netprice
, sie.Silquantity sle_quantity
, sie.Silunitprice sle_unitprice
, ctr.ctr_code ctr_code
, ctr.ctr_name ctr_name
, ctr.parent_code parent_code
, ctr.parent_name parent_name
, gdlsn.ssrserialnumber serialnumber
from SalesInvoicesExploded sie
join customers#inmemorystorage ctr
on ctr.ctr_id = sie.invoiceto
join GoodsDeliveryLineSerialNumbers gdlsn
on gdlsn.salesorderlineid = sie.silid
where sie.invoicedate >= '2016-01-01'
and sie.invoicedate < '2016-01-03'
-- Add the following line; use a date that is guaranteed to include all relevant rows:
and gdlsn.created >= '2015-12-01'
--
order
by sie.invoicedate
I have a simple SSIS package, and I'd like to complicate it a little.
Right now, it executes a stored procedure in an OLE DB Source, and adds the rows returned from the stored procedure to the data flow. Then, for each row returned, it executes an OLE DB Command transform, executing a second stored procedure (in a second database), passing the columns from the source as parameters.
The second stored procedure performs a synchronization function, and I would like to log the grand total number of adds, deletes and updates. The "sync" stored procedure uses the OUTPUT clause of a MERGE statement to get this data and return it as a resultset.
I don't see a way to get this resultset out of the OLE DB Command transform. It does not allow me to add output columns.
Short of adding a Script Transform, is there a way for me to log the grand total of the add, delete and update columns?
This is not as straightforward as it ought to be. Either that, or I need to go back to SSIS class.
The OLE DB Command component can't add new rows to the dataflow, as it's a synchronous component.
It also cannot add new columns to the data flow. That's the first thing that was non-intuitive. So you'll see in my source, I have added an ActionName column of type nvarchar(10)/string length of 10. You could add the column in a Derived Column Transformation prior to the OLE DB Command component if you so wish.
Since I can't add rows to the data flow, I'm only able to use an OUTPUT parameter for my proc instead of the recordset it could generate. Perhaps your stored procedure only allows one row to be altered at a time and that is fine, but it has a general code smell to me.
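As an aside: if you could switch to a set-based approach (stage the incoming rows and run a single MERGE), the grand totals could be read straight from the OUTPUT clause. A hypothetical sketch, assuming a staging table dbo.so_27932430_staging with the same columns:
DECLARE @Actions table (ActionName nvarchar(10) NOT NULL);
MERGE dbo.so_27932430 AS T
USING dbo.so_27932430_staging AS S
ON (T.SourceId = S.SourceId)
WHEN MATCHED AND T.SourceValue <> S.SourceValue THEN
UPDATE SET T.SourceValue = S.SourceValue
WHEN NOT MATCHED THEN
INSERT (SourceId, SourceValue)
VALUES (S.SourceId, S.SourceValue)
OUTPUT $action INTO @Actions;
-- Grand totals per action
SELECT ActionName, COUNT(*) AS GrandTotal
FROM @Actions
GROUP BY ActionName;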
Table definition and setup
CREATE TABLE dbo.so_27932430
(
SourceId int NOT NULL
, SourceValue varchar(20) NOT NULL
);
GO
INSERT INTO
dbo.so_27932430
(SourceId, SourceValue)
VALUES
(1, 'No change')
, (3,'Changed');
Stored Proc
CREATE PROCEDURE
dbo.merge_27932430
(
@SourceId int
, @SourceValue varchar(20)
, @ActionName nvarchar(10) OUTPUT
)
AS
BEGIN
SET NOCOUNT ON;
DECLARE
@BloodyHack table
(
ActionName nvarchar(10) NOT NULL
, SourceId int NOT NULL
);
MERGE
dbo.so_27932430 AS T
USING
(
SELECT
D.SourceId
, D.SourceValue
FROM
(
SELECT @SourceId, @SourceValue
) D(SourceId, SourceValue)
) AS S
ON
(
T.SourceId = S.SourceId
)
WHEN
MATCHED
AND T.SourceValue <> S.SourceValue
THEN
UPDATE
SET
T.SourceValue = S.SourceValue
WHEN
NOT MATCHED THEN
INSERT
(
SourceId
, SourceValue
)
VALUES
(
SourceId
, SourceValue
)
OUTPUT
$action, S.SourceId
INTO
@BloodyHack;
/* Pick one, any one*/
SELECT
@ActionName = BH.ActionName
FROM
@BloodyHack AS BH
END
Source Query
SELECT
D.SourceId
, D.SourceValue
, CAST(NULL AS nvarchar(10)) AS ActionName
FROM
(
VALUES
(1, 'No change')
, (2, 'I am new')
, (3,'I Changed')
) D(SourceId, SourceValue);
OLE DB Command setup
EXECUTE dbo.merge_27932430 @SourceId = ?, @SourceValue = ?, @ActionName = ? OUTPUT;
Results
References
OUTPUT clause
Biml
Assuming you have the free BIDS Helper add-in, the following Biml was used to generate this package.
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Connections>
<OleDbConnection Name="CM_OLE" ConnectionString="Data Source=localhost\dev2014;Initial Catalog=tempdb;Provider=SQLNCLI10.1;Integrated Security=SSPI;Auto Translate=False;" />
</Connections>
<Packages>
<Package ConstraintMode="Linear" Name="so_27932430">
<Variables>
<Variable DataType="String" Name="QuerySource">
<![CDATA[SELECT
D.SourceId
, D.SourceValue
, CAST(NULL AS nvarchar(10)) AS ActionName
FROM
(
VALUES
(1, 'No change')
, (2, 'I am new')
, (3,'I Changed')
) D(SourceId, SourceValue);
]]></Variable>
<Variable DataType="String" Name="QueryCommand">EXECUTE dbo.merge_27932430 #SourceId = ?, #SourceValue = ?, #ActionName = ? OUTPUT;</Variable>
</Variables>
<Tasks>
<Dataflow Name="DFT OLEDB Test">
<Transformations>
<OleDbSource ConnectionName="CM_OLE" Name="OLESRC GenData">
<VariableInput VariableName="User.QuerySource" />
</OleDbSource>
<OleDbCommand ConnectionName="CM_OLE" Name="OLECMD Test">
<DirectInput>EXECUTE dbo.merge_27932430 @SourceId = ?, @SourceValue = ?, @ActionName = ? OUTPUT;</DirectInput>
<Parameters>
<Parameter SourceColumn="SourceId" DataType="Int32" TargetColumn="@SourceId"></Parameter>
<Parameter SourceColumn="SourceValue" DataType="AnsiString" Length="20" TargetColumn="@SourceValue"></Parameter>
<Parameter SourceColumn="ActionName" DataType="String" Length="10" TargetColumn="@ActionName"></Parameter>
</Parameters>
</OleDbCommand>
<DerivedColumns Name="DER PlaceHolder" />
</Transformations>
</Dataflow>
</Tasks>
</Package>
</Packages>
</Biml>
I have a simple requirement. I have a table with product names and the quantity sold of each. I want to create an SSIS package that extracts data from this one table into an arbitrary number of tables based on the product name.
If the table has 10 products, then the SSIS package should dynamically create 10 tables, with one product in each table.
Table Name : Products
ProductName , QuantitySold
ABC 10
xyz 15
Testing 25
Table Name : ABC
ProductName , QuantitySold
ABC 10
Table Name : XYZ
ProductName , QuantitySold
xyz 15
Table Name : Testing
ProductName , QuantitySold
Testing 25
Conceptually, you're looking at something like this.
The concept is that you identify all the product names in the table and perform two tasks for each row: create the target table if needed, then run a query against your source for that one product and load the results into it.
Variables
I have 6 variables declared
Query_TableCreateBase is a big string that, formatted, looks like
IF NOT EXISTS
(
SELECT
*
FROM
sys.tables AS T
WHERE
T.name = '<Table/>'
)
BEGIN
CREATE TABLE dbo.<Table/>
(
ProductName varchar(30) NOT NULL
, QuantitySold int NOT NULL
);
END
I have expressions on Query_Source, Query_TableCreate and TargetTable
Query_Source expression
"SELECT ProductName, QuantitySold FROM (
VALUES
('ABC', 10)
, ('xyz', 15)
, ('Testing', 25)
) Products(ProductName, QuantitySold) WHERE ProductName = '" + @[User::ProductName] + "'"
Query_TableCreate expression
replace(@[User::Query_TableCreateBase], "<Table/>", @[User::ProductName])
TargetTable expression
"[dbo].[" +#[User::ProductName] + "]"
SQL Get Rows
I simulate your Products table with a query. I load those results into a variable named RS_Product.
SELECT
ProductName
FROM
(
VALUES
('ABC', 10)
, ('xyz', 15)
, ('Testing', 25)
) Products(ProductName, QuantitySold);
FELC Shred Results
I use a Foreach Loop Container, set to process an ADO Result set and parse out the 0th column into our ProductName variable
SQL Create Table if needed
This is a query that gets evaluated out to something like
IF NOT EXISTS
(
SELECT
*
FROM
sys.tables AS T
WHERE
T.name = 'Foo'
)
BEGIN
CREATE TABLE dbo.Foo
(
ProductName varchar(30) NOT NULL
, QuantitySold int NOT NULL
);
END
DFT Load Table
I have this set as DelayValidation = true as the table may not exist right up until it gets the signal to start.
Again, simulating your Products table, my query looks like
SELECT ProductName, QuantitySold FROM (
VALUES
('ABC', 10)
, ('xyz', 15)
, ('Testing', 25)
) Products(ProductName, QuantitySold) WHERE ProductName = 'Foo'
Wrapup
Strictly speaking, the data flow is not required. It could all be done through your Execute SQL Task if we pulled back all the columns in our source query.
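For instance, a rough sketch of the single statement an Execute SQL Task could run per loop iteration (hypothetical: it reuses the <Table/> placeholder trick from Query_TableCreateBase and assumes your real source table is dbo.Products):
INSERT INTO dbo.<Table/>
(ProductName, QuantitySold)
SELECT
P.ProductName
, P.QuantitySold
FROM
dbo.Products AS P
WHERE
P.ProductName = '<Table/>';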
Biml implementation
Biml, the Business Intelligence Markup Language, describes the platform for business intelligence. Here, we're going to use it to describe the ETL. BIDS Helper is a free add-on for Visual Studio/BIDS/SSDT that addresses a host of shortcomings in it. Specifically, we're going to use its ability to transform a Biml file describing ETL into an SSIS package. This has the added benefit of giving you a mechanism to generate exactly the solution I'm describing, versus clicking through many tedious dialogue boxes.
The following code assumes you have a default instance on your local machine and that within tempdb, you have a table called Foo.
use tempdb;
GO
CREATE TABLE dbo.Foo
(
ProductName varchar(30) NOT NULL
, QuantitySold int NOT NULL
);
Save the following script into a .biml file; when you add it to your SSIS project it will show up under the Miscellaneous virtual folder. Right-click it, choose Generate SSIS Package, and it should create a package called so_27320726.
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Connections>
<OleDbConnection Name="tempdb" ConnectionString="Data Source=localhost;Initial Catalog=tempdb;Provider=SQLNCLI10.1;Integrated Security=SSPI;" />
</Connections>
<Packages>
<Package Name="so_27320726" ConstraintMode="Parallel" >
<Variables>
<Variable Name="ProductName" DataType="String">Foo</Variable>
<Variable Name="Query_Source" DataType="String" EvaluateAsExpression="true">"SELECT ProductName, QuantitySold FROM (
VALUES
('ABC', 10)
, ('xyz', 15)
, ('Testing', 25)
) Products(ProductName, QuantitySold) WHERE ProductName = '" + @[User::ProductName] + "'"</Variable>
<Variable Name="Query_TableCreate" DataType="String" EvaluateAsExpression="true"><![CDATA[replace(@[User::Query_TableCreateBase], "<Table/>", @[User::ProductName])]]></Variable>
<Variable Name="Query_TableCreateBase" DataType="String" ><![CDATA[IF NOT EXISTS
(
SELECT
*
FROM
sys.tables AS T
WHERE
T.name = '<Table/>'
)
BEGIN
CREATE TABLE dbo.<Table/>
(
ProductName varchar(30) NOT NULL
, QuantitySold int NOT NULL
);
END]]></Variable>
<Variable Name="RS_Product" DataType="Object" />
<Variable Name="TargetTable" DataType="String" EvaluateAsExpression="true">"[dbo].[" +#[User::ProductName] + "]"</Variable>
</Variables>
<Tasks>
<ExecuteSQL Name="SQL Get Rows" ConnectionName="tempdb" ResultSet="Full">
<Variables>
<Variable Name="Variable" DataType="Int32" IncludeInDebugDump="Include">0</Variable>
</Variables>
<Results>
<Result Name="0" VariableName="User.RS_Product" />
</Results>
<DirectInput>SELECT
*
FROM
(
VALUES
('ABC', 10)
, ('xyz', 15)
, ('Testing', 25)
) Products(ProductName, QuantitySold);</DirectInput>
</ExecuteSQL>
<ForEachAdoLoop Name="FELC Shred Results" ConstraintMode="Linear" SourceVariableName="User.RS_Product">
<PrecedenceConstraints>
<Inputs>
<Input OutputPathName="SQL Get Rows.Output" SsisName="Constraint" />
</Inputs>
</PrecedenceConstraints>
<Tasks>
<ExecuteSQL Name="SQL Create Table if needed" ConnectionName="tempdb">
<VariableInput VariableName="User.Query_TableCreate" />
</ExecuteSQL>
<Dataflow Name="DFT Load Table" DelayValidation="true">
<Transformations>
<OleDbSource Name="OLE_SRC Get Data" DefaultCodePage="1252" ConnectionName="tempdb">
<VariableInput VariableName="User.Query_Source" />
</OleDbSource>
<OleDbDestination Name="OLE_DST Save data" ConnectionName="tempdb" >
<TableFromVariableOutput VariableName="User.TargetTable" />
<Columns>
<Column SourceColumn="ProductName" TargetColumn="ProductName" />
<Column SourceColumn="QuantitySold" TargetColumn="QuantitySold" />
</Columns>
</OleDbDestination>
</Transformations>
</Dataflow>
</Tasks>
<VariableMappings>
<VariableMapping Name="0" VariableName="User.ProductName" />
</VariableMappings>
</ForEachAdoLoop>
</Tasks>
<Connections>
<Connection ConnectionName="tempdb" />
</Connections>
</Package>
</Packages>
</Biml>