I have an Access DB with the following schema:
+---------+---------+-----------+------------+-------------------------------------------+
| BrandNr | TextNr | LangCode | OngoingNr | Text |
+---------+---------+-----------+------------+-------------------------------------------+
| 1 | AB | 1 | 1 | Text beginns here but it doesn't end here |
| 1 | AB | 1 | 2 | Text isn't finished so need second row |
| 1 | AC | 2 | 1 | New text |
| 2 | hg2 | 1 | 1 | New brand new text |
+---------+---------+-----------+------------+-------------------------------------------+
Now I need to merge the text where BrandNr, TextNr and LangCode are the same. The text should be ordered by the OngoingNr.
Any ideas?
I believe this will require VBA.
One possible method is to call a VBA function that keeps a Static string variable: the variable is reset whenever a new combination of BrandNr, TextNr and LangCode starts (OngoingNr = 1), and otherwise the incoming text is appended to it.
Open the VBA IDE using Alt+F11
Insert a new standard Module (Alt+I, M)
Copy the following basic code into the new Module:
Function Concat(strInp As String, lngOrd As Long) As String
    Static strTmp As String
    ' Reset the buffer on the first part of a group (OngoingNr = 1), otherwise append
    If lngOrd = 1 Then strTmp = strInp Else strTmp = strTmp & " " & strInp
    Concat = strTmp
End Function
In MS Access, create a new query with the following SQL, changing MyTable to the name of your table:
select q.BrandNr, q.TextNr, q.LangCode, Concat(q.Text,q.OngoingNr) as Merged
from
(
select t.* from MyTable t
order by t.BrandNr, t.TextNr, t.LangCode, t.OngoingNr
) q
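Traced against the sample rows above, this returns the running concatenation for every source row, so the complete merged text ends up on the row with the highest OngoingNr of each combination, roughly:
| BrandNr | TextNr | LangCode | Merged |
| 1 | AB | 1 | Text beginns here but it doesn't end here |
| 1 | AB | 1 | Text beginns here but it doesn't end here Text isn't finished so need second row |
| 1 | AC | 2 | New text |
| 2 | hg2 | 1 | New brand new text |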
I'm having trouble returning a JSON representation of a many-to-many join. My plan was to encode the returned columns using the following JSON format:
{
"dog": [
"duke"
],
"location": [
"home",
"scotland"
]
}
This format would handle duplicate keys by aggregating the results into a JSON array; however, all of my attempts at aggregating this structure so far have just removed duplicates, so the arrays only ever have a single element.
Tables
Here is a simplified table structure I've made for the purposes of explaining this query.
media
| media_id | sha256 | filepath |
| 1 | 33327AD02AD09523C66668C7674748701104CE7A9976BC3ED8BA836C74443DBC | /photos/cat.jpeg |
| 2 | 323b5e69e72ba980cd4accbdbb59c5061f28acc7c0963fee893c9a40db929070 | /photos/dog.jpeg |
| 3 | B986620404660DCA7B3DEC4EFB2DE80C0548AB0DE243B6D59DA445DE2841E474 | /photos/dog2.jpeg |
| 4 | 1be439dd87cd87087a425c760d6d8edc484f126b5447beb2203d21e09e2a8f11 | /photos/balloon.jpeg |
media_metadata_labels_has_media (for many-to-many joins)
| media_metadata_labels_label_id | media_media_id |
| 1 | 1 |
| 2 | 1 |
| 3 | 1 |
| 1 | 2 |
| 4 | 2 |
| 5 | 2 |
| 1 | 3 |
| 6 | 3 |
| 7 | 3 |
| 8 | 4 |
| 9 | 4 |
media_metadata_labels
| label_id | label_key | label_value |
| 2 | cat | lily |
| 4 | dog | duke |
| 6 | dog | rex |
| 1 | pet size | small |
| 3 | location | home |
| 7 | location | park |
| 8 | location | scotland |
| 9 | location | sky |
| 5 | location | studio |
My current attempt
My latest attempt at querying this data uses JSON_MERGE_PRESERVE with two arguments, the first is just an empty JSON object and the second is an invalid JSON document. It's invalid because there are duplicate keys, but I was hoping that JSON_MERGE_PRESERVE would merge them. It turns out JSON_MERGE_PRESERVE will only merge duplicates if they're not in the same JSON argument.
For example, this won't merge two keys
SET @key_one = '{}';
SET @key_two = '{"location": ["home"], "location": ["scotland"]}';
SELECT JSON_MERGE_PRESERVE(@key_one, @key_two);
-- returns {"location": ["scotland"]}
but this will
SET #key_one = '{"location": ["home"] }';
SET #key_two = '{"location": ["scotland"]}';
SELECT JSON_MERGE_PRESERVE(#key_one, #key_two);
-- returns {"location": ["home", "scotland"]}
So anyway, here's my current attempt
SELECT
m.media_id,
m.filepath,
JSON_MERGE_PRESERVE(
'{}',
CAST(
CONCAT(
'{',
GROUP_CONCAT(CONCAT('"', l.label_key, '":["', l.label_value, '"]')),
'}'
)
AS JSON)
)
as labels
FROM media AS m
LEFT JOIN media_metadata_labels_has_media AS lm ON lm.media_media_id = m.media_id
LEFT JOIN media_metadata_labels AS l ON l.label_id = lm.media_metadata_labels_label_id
GROUP BY m.media_id, m.filepath
-- HAVING JSON_CONTAINS(labels, '"location"', CONCAT('$.', '"home"')); -- this would let me filter on labels once they're in the correct JSON format
After trying different combinations of JSON_MERGE, JSON_OBJECTAGG, JSON_ARRAYAGG, CONCAT and GROUP_CONCAT this still leaves me scratching my head.
Disclaimer: Since posting this question I've started using MariaDB instead of Oracle MySQL. The function below should work for MySQL too, but if it doesn't, any changes required will likely be small syntax fixes.
I solved this by creating a custom aggregate function
DELIMITER //
CREATE AGGREGATE FUNCTION JSON_LABELAGG (
    json_key TEXT,
    json_value TEXT
) RETURNS JSON
BEGIN
    DECLARE complete_json JSON DEFAULT '{}';
    DECLARE current_jsonpath TEXT;
    DECLARE current_jsonpath_value_type TEXT;
    DECLARE current_jsonpath_value JSON;
    DECLARE CONTINUE HANDLER FOR NOT FOUND RETURN complete_json;

    main_loop: LOOP
        FETCH GROUP NEXT ROW;
        SET current_jsonpath = CONCAT('$."', json_key, '"'); -- the jsonpath to our json_key (quoted so keys containing spaces still form a valid path)
        SET current_jsonpath_value_type = JSON_TYPE(JSON_EXTRACT(complete_json, current_jsonpath)); -- the json object type at the current path
        SET current_jsonpath_value = JSON_QUERY(complete_json, current_jsonpath); -- the json value at the current path

        -- if this is the first label value with this key then place it in a new array
        IF (ISNULL(current_jsonpath_value_type)) THEN
            SET complete_json = JSON_INSERT(complete_json, current_jsonpath, JSON_ARRAY(json_value));
            ITERATE main_loop;
        END IF;

        -- confirm that an array is at this jsonpath, otherwise that's an exception
        CASE current_jsonpath_value_type
            WHEN 'ARRAY' THEN
                -- check if our json_value is already within the array and don't push a duplicate if it is
                IF (ISNULL(JSON_SEARCH(JSON_EXTRACT(complete_json, current_jsonpath), "one", json_value))) THEN
                    SET complete_json = JSON_ARRAY_APPEND(complete_json, current_jsonpath, json_value);
                END IF;
                ITERATE main_loop;
            ELSE
                SIGNAL SQLSTATE '45000'
                    SET MESSAGE_TEXT = 'Expected JSON label object to be an array';
        END CASE;
    END LOOP;
    RETURN complete_json;
END //
DELIMITER ;
and editing my query to use it
SELECT
m.media_id,
m.filepath,
JSON_LABELAGG(l.label_key, l.label_value) as labels
FROM media AS m
LEFT JOIN media_metadata_labels_has_media AS lm ON lm.media_media_id = m.media_id
LEFT JOIN media_metadata_labels AS l ON l.label_id = lm.media_metadata_labels_label_id
GROUP BY m.media_id, m.filepath
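As a follow-up, the filter that was only sketched in the commented-out HAVING clause of the original attempt can now be written against the aggregated column. A hedged sketch, assuming the sample data above; the JSON key order may differ, and only rows whose location array contains "home" survive the filter:
SELECT
    m.media_id,
    m.filepath,
    JSON_LABELAGG(l.label_key, l.label_value) AS labels
FROM media AS m
LEFT JOIN media_metadata_labels_has_media AS lm ON lm.media_media_id = m.media_id
LEFT JOIN media_metadata_labels AS l ON l.label_id = lm.media_metadata_labels_label_id
GROUP BY m.media_id, m.filepath
HAVING JSON_CONTAINS(labels, '"home"', '$.location');
-- with the sample rows this should return something like:
-- media_id | filepath         | labels
-- 1        | /photos/cat.jpeg | {"pet size": ["small"], "cat": ["lily"], "location": ["home"]}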
I'm trying to create a table that looks like the following examples.
| SO_NUMBER | ORDER |
------------------------------------------
| 12345 | iphone5|1|500|APPLE |
| 12345 | icase-blk|1|20|CaseCompany |
| 23411 | galaxy5|1|500|Samsung |
| 23411 | galaxy-blk|1|20|CaseCompany|
Convert that to this:
| SO_NUMBER | ORDER_1 | ORDER_2
-----------------------------------------------------------------------
| 12345 | iphone5|1|500|APPLE | icase-blk|1|20|CaseCompany
| 23411 | galaxy5|1|500|Samsung | galaxy-blk|1|20|CaseCompany
I'm not sure where to start. I can group by the SO_NUMBER fine, but I'm not sure if I need to create a temporary table. I've searched around and all I could find is grouping the values together with commas, but that won't work for this.
EDIT:
I've started looking at an Option Compare Database RunningCount script to do what I asked for. It works for the most part.
Option Compare Database
Option Explicit

Public wName As String
Public wRuningCount As Long

Function GetRunCount(Name1) As Long
    ' Restart the running count whenever the key value changes
    If wName = Name1 Then
        wRuningCount = wRuningCount + 1
    Else
        wName = Name1
        wRuningCount = 1
    End If
    GetRunCount = wRuningCount
End Function
But now I get
| SO_NUMBER | ORDER_1 | ORDER_3 |
instead of
| SO_NUMBER | ORDER_1 | ORDER_2 |
The query that was set up automatically is the following:
TRANSFORM First([123].ORDER) AS FirstOfORDER
SELECT [123].SO_NUMBER
FROM 123
GROUP BY [123].SO_NUMBER
PIVOT [123].GetRunCount;
Use the Query Wizard and create a crosstab query. Below is a sample:
TRANSFORM First([Tbl1].[SO_NUMBER]) AS FirstOftype
SELECT [Tbl1].[SO_NUMBER]
FROM Tbl1
GROUP BY [Tbl1].[SO_NUMBER]
PIVOT [Tbl1].[Order];
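To get the ORDER_1 / ORDER_2 headings from the question, the running-count function and the crosstab can be combined in the PIVOT expression. A hedged sketch, assuming the table is named 123 and the GetRunCount module from the edit above is present (the function relies on the rows arriving grouped by SO_NUMBER):
TRANSFORM First(t.[ORDER]) AS FirstOfORDER
SELECT t.SO_NUMBER
FROM [123] AS t
GROUP BY t.SO_NUMBER
PIVOT "ORDER_" & GetRunCount(t.SO_NUMBER);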
I have a table with two text columns filled with strings (plus a directoryName column):
CREATE TABLE [tbl_text]
(
[directoryName] nvarchar(200),
[text1] nvarchar(200),
[text2] nvarchar(200)
)
The strings are built like the following:
| Text1 | Text2 |
|------------|----------|
|tz1 tz3 tz2 | al1 al2 |
| tz1 tz3 | al1 al3 |
| tz2 | al3 |
| tz3 tz2 | al1 al2 |
Now I want to count, for each value in Text1, how often it appears together with each value in Text2, resulting in the following:
| Text1 | al1 | al2 | al3 |
|-------|------|------|------|
| tz1 | 2 | 1 | 1 |
| tz2 | 2 | 2 | 1 |
| tz3 | 3 | 2 | 1 |
I tried solving it with an SQL query like this:
TRANSFORM Count(tt.directoryName) AS Value
SELECT tt.Text1
FROM tbl_text as tt
GROUP BY tt.Text1
PIVOT tt.Text2;
This works fine if the fields contain only a single value each, like the third row of the sample (i.e. the whole data source would have to be in single-value form).
But in my case I'm using the strings for a multi-select...
If I apply this query to a data source where the values are separated by " ", the result is completely messed up.
Any suggestions on how the query should look to get this result?
You'll have to split the strings inside Text1/Text2 before you can do anything with them. In VBA, you'd loop a recordset, use the Split() function and insert the results into a temp table.
In SQL Server there are more powerful options available.
Coming from here: Split function equivalent in T-SQL? ,
you should read this page:
http://www.sommarskog.se/arrays-in-sql-2005.html#tablelists
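For example, on SQL Server 2016 or later the built-in STRING_SPLIT function can do the splitting without VBA. A minimal sketch, assuming the tbl_text table from the question and single-space separators (older versions would need one of the split functions from the linked article):
-- count how often each Text1 token co-occurs with each Text2 token
SELECT s1.value AS text1_token,
       s2.value AS text2_token,
       COUNT(*) AS cnt
FROM tbl_text AS t
CROSS APPLY STRING_SPLIT(t.text1, ' ') AS s1
CROSS APPLY STRING_SPLIT(t.text2, ' ') AS s2
GROUP BY s1.value, s2.value;
The result can then be pivoted (one column per al-value) to get the cross-tab layout shown above.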
I've been looking around the Internet for quite some time now, but I can't find a way to get a list of just column names. I don't care about the data each column has.
I want to take that list and compare it against a collection. I'm using the VB.NET MySqlConnector. I'm new to using the connector.
Edit:
Dim mscCMD As MySqlCommand = New MySqlCommand("SHOW COLUMNS FROM OCN.cpu", msc)
Dim sqlReader As MySqlDataReader = mscCMD.ExecuteReader()
Dim b As Integer = 0
Dim c As Integer = 0

While sqlReader.Read
    msProjects(c) = sqlReader.Item(b)
    b += 1
    c += 1
End While
Never mind, I figured it out. I had to choose Item, not getName.
Edit: Perhaps I didn't just yet. It's reading one row. I'm unsure how to move to the next row. For example
mysql> SHOW COLUMNS FROM City;
+------------+----------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+------------+----------+------+-----+---------+----------------+
| Id | int(11) | NO | PRI | NULL | auto_increment |
| Name | char(35) | NO | | | |
| Country | char(3) | NO | UNI | | |
| District | char(20) | YES | MUL | | |
| Population | int(11) | NO | | 0 | |
+------------+----------+------+-----+---------+----------------+
5 rows in set (0.00 sec)
That's the row it reads. Once you go past the 5th item, you're out of bounds.
You can use the metadata from a result set:
final ResultSet rs = statement.executeQuery("SELECT * FROM `" + tableName + "` LIMIT 0,1;");
final ResultSetMetaData metaData = rs.getMetaData();
// JDBC column indexes are 1-based and getColumnCount() is inclusive
for (int i = 1; i <= metaData.getColumnCount(); i++)
{
    System.out.println(metaData.getColumnName(i));
}
Also, you can get the java.sql.Types column type the same way.
Got it. Found the answer from a MS KB article.
https://support.microsoft.com/en-us/kb/310108
'Get the schema table for the data reader (one row per result-set column)
Dim schemaTable As DataTable = sqlReader.GetSchemaTable()

For Each myField As DataRow In schemaTable.Rows
    'For each property of the field...
    For Each myProperty As DataColumn In schemaTable.Columns
        'Display the field name and value.
        Console.WriteLine(myProperty.ColumnName & " = " & myField(myProperty).ToString())
    Next
    Console.WriteLine()

    'Pause.
    Console.ReadLine()
Next
Specifically, myField(myProperty).ToString will get you the column names. It will also contain other things about the column; you will have to sort through it, but all the column names from your table will be in there.
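If all you really need is the list of column names, another option is to query INFORMATION_SCHEMA directly instead of looping over the SHOW COLUMNS reader. A minimal sketch, assuming the OCN.cpu table from the edit above:
SELECT COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'OCN'
  AND TABLE_NAME = 'cpu'
ORDER BY ORDINAL_POSITION;
Each row of that result is one column name, so reading it only needs a single Item(0) per row.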
I have data from an MDX query that resembles the following table:
Account | Location | Type | Person | Amount
ABC | XYZ | AA | Tom | 10
ABC | XYZ | AA | Dick | 20
ABC | XYZ | AA | Harry | 30
ABC | XYZ | BB | Jane | 50
ABC | XYZ | BB | Ash | 100
DEF | SDQ | ZA | Bob | 20
DEF | SDQ | ZA | Kate | 10
DEF | LAO | PA | Olivia | 200
DEF | LAO | PA | Hugh | 120
And I need to sum the Amount column for each Account, Location, and Type. If I were using SQL I would perform a query on this data as follows:
Select Account, Location, Type, Sum(Amount) as SumAmount
From Table
Group By Account, Location, Type
but due to the way we store the data I need to roll up this data using SSRS. To do that I created a tablix, added a parent group (which I have called "RollUp") of the default detail group, grouping on Account, Location, and Type, and then deleted the detail group, so when running the report I get:
Account | Location | Type | Amount
ABC | XYZ | AA | 60
ABC | XYZ | BB | 150
DEF | SDQ | ZA | 30
DEF | LAO | PA | 320
What I need to do now is create a page break so that when I export this SSRS report to Excel there are only 1000 rows on each sheet, but I am having trouble writing the expression to split this every 1000 rows. Because I have removed the details group, I cannot use the typical expression I would use to page-break on a specific row of a dataset (e.g. a group expression of Floor(RowNumber(NOTHING) / 1000)).
I have tried a few different things like writing some custom code and some running value expressions but haven't been able to figure it out yet.
I did figure out how to do this.
First I created the following custom code in the report definition:
Dim GroupingDummy As String = "GroupDummy"
Dim RowNumberToReturn As Integer = -1

' Increments the row number only when the concatenated group key changes
Function PopulateRowNumber(GroupString As String) As Integer
    If (GroupString <> GroupingDummy) Then
        GroupingDummy = GroupString
        RowNumberToReturn = RowNumberToReturn + 1
    End If
    Return RowNumberToReturn
End Function
Keeping in mind that the grouping I applied to the dataset used the fields Account, Location, and Type, I added a calculated field to my dataset with the name RowNumberCalc and the expression:
=Code.PopulateRowNumber(Fields!Account.Value + Fields!Location.Value + Fields!Type.Value)
Now I could easily create the group that produces a page break every 1000 rows with the expression:
=Floor(Fields!RowNumberCalc.Value / 1000)