Handling Inconsistent Delimiters in a Flat File Source in a Foreach Loop Container - SSIS

I'm trying to handle inconsistent delimiters in a Flat File Source contained in a Data Flow Task running in a Foreach Loop container in SSIS.
I have several files in a folder with varying names but with one consistent identifier e.g.
File23998723.txt
File39872397.txt
File29387234.txt etc., etc.
These files should, as a standard, be tab delimited, but every so often a user misses cleaning up a file and it ends up delimited with a , or a ; instead, which causes the package import to fail.
Is there an easy approach for me to follow to dynamically change the delimiter or to test for the delimiter beforehand?

I managed to handle it with a Script Task, thanks!
Basically, I added a Script Task to the Foreach Loop Container that executes before my Data Flow Task.
I pass the file name through as a variable.
I added the following namespaces to the script:
using System.IO;
using RuntimeWrapper = Microsoft.SqlServer.Dts.Runtime.Wrapper;
And my script looks like this:
public void Main()
{
    if (!string.IsNullOrEmpty(Dts.Variables["sFileName"].Value.ToString()))
    {
        StreamReader file = new StreamReader(Dts.Variables["sFileName"].Value.ToString());
        if (file != null)
        {
            string HeadRowDelimiter = "";
            string ColDelimiter = "";
            string data = "";

            // Sample the start of the file; the break below exits after one 500-character read
            while (file.Peek() >= -1)
            {
                char[] c = new char[500];
                file.Read(c, 0, c.Length);
                data = string.Join("", c);
                if (!string.IsNullOrEmpty(data))
                {
                    // set row delimiters
                    if (data.Contains("\r\n"))
                    {
                        HeadRowDelimiter = "\r\n";
                    }
                    else if (data.Contains("\r"))
                    {
                        HeadRowDelimiter = "\r";
                    }
                    else if (data.Contains("\n"))
                    {
                        HeadRowDelimiter = "\n";
                    }
                    else if (data.Contains("\0"))
                    {
                        HeadRowDelimiter = "\0";
                    }

                    // set column delimiters (first match wins, so tab is preferred)
                    if (data.Contains("\t"))
                    {
                        ColDelimiter = "\t";
                    }
                    else if (data.Contains(";"))
                    {
                        ColDelimiter = ";";
                    }
                    else if (data.Contains(","))
                    {
                        ColDelimiter = ",";
                    }
                    else if (data.Contains(":"))
                    {
                        ColDelimiter = ":";
                    }
                    else if (data.Contains("|"))
                    {
                        ColDelimiter = "|";
                    }
                    else if (data.Contains("\0"))
                    {
                        ColDelimiter = "\0";
                    }
                }
                break; // only one sample is needed
            }
            file.Close();

            // Push the detected delimiters into the Flat File connection manager
            RuntimeWrapper.IDTSConnectionManagerFlatFile100 flatFileConnection =
                Dts.Connections["FlatFileConnection"].InnerObject as RuntimeWrapper.IDTSConnectionManagerFlatFile100;
            if (flatFileConnection != null)
            {
                flatFileConnection.HeaderRowDelimiter = HeadRowDelimiter;
                flatFileConnection.RowDelimiter = HeadRowDelimiter;
                flatFileConnection.HeaderRowsToSkip = 0;
                flatFileConnection.Columns[0].ColumnDelimiter = ColDelimiter;
            }

            Dts.TaskResult = (int)ScriptResults.Success;
        }
    }
}
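One caveat with the detection above is that it stops at the first candidate character it finds, so its accuracy depends on the order of the checks; a comma-delimited file whose data happens to contain a stray tab would be misread. A slightly more defensive variation (purely a sketch, not part of the original answer; the helper name is made up for illustration) counts each candidate delimiter in the first sampled line and picks the most frequent one:

// Sketch only: pick the candidate delimiter that occurs most often in the first line
// of the sample. "sample" would be the same text read into "data" by the script above.
private static string DetectColumnDelimiter(string sample)
{
    string firstLine = sample.Split(new[] { "\r\n", "\r", "\n" }, StringSplitOptions.None)[0];
    string[] candidates = { "\t", ";", ",", ":", "|" };
    string best = "\t";
    int bestCount = -1;
    foreach (string candidate in candidates)
    {
        // Splitting on the candidate and counting the pieces gives its occurrence count
        int count = firstLine.Split(new[] { candidate }, StringSplitOptions.None).Length - 1;
        if (count > bestCount)
        {
            bestCount = count;
            best = candidate;
        }
    }
    return best;
}

The result could then be assigned to ColDelimiter before the connection manager properties are set, leaving the rest of the answer unchanged.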

Related

How to retrieve entire Logger output?

I have large sets of data (mainly arrays and objects with many elements) and I am trying to log the entire result to check for bugs. However, in some cases it says "Logging output too large. Truncating output." Where can I see the output in its entirety? I am working with Map objects and trying to debug why my calculations don't match Google's output.
Logger.log is limited in the number of lines it can hold. However, you can make your own logger and save it to a text file.
var Log = null;

function testLogger() {
  try {
    Log = new LogFile("testLogFile");
    test1();
    test2();
    throw "done"; // force the catch block so the log gets saved
  }
  catch(err) {
    Log.log(err);
    Log.save();
  }
}

function test1() {
  Log.log("in test1");
}

function test2() {
  Log.log("in test2");
}

class LogFile {
  constructor(name) {
    if (name === undefined) name = "_LogFile";
    this.name = name;
    this.text = [];
  }

  // Buffer a line in memory
  log(text) {
    this.text.push(text);
  }

  // Write the buffered lines to a Drive file, overwriting any existing file of the same name
  save() {
    try {
      let text = "";
      this.text.forEach( line => text = text.concat(line, "\n") );
      let files = DriveApp.getFilesByName(this.name);
      let file = null;
      if (files.hasNext()) {
        file = files.next();
        file.setContent(text);
      }
      else {
        DriveApp.createFile(this.name, text);
      }
    }
    catch(err) {
      Logger.log(err);
    }
  }
}
The resulting text file is saved to Drive under the given name.

Extract data from complex JSON in Flutter

I'm working on extracting data from a more complex JSON with an unknown structure; the structure changes with each operation.
JSON sample link: http://afs-i.com/json.json
Kindly find my code here: http://afs-i.com/main.dart
Thanks in advance.
Update:
I extracted the data using PHP code; you can find the result here: http://afs-i.com/json.php
This is my PHP code:
$arraycars = array();
$y = json_decode($x); // $x holds the raw JSON string
// echo "<pre>";
// var_dump($y->tree[0]->children);

foreach ($y->tree[0]->children as $f) {
    if (isset($f->vid)) {
        global $arraycars;
        $arraycars[] = $f;
    } elseif (isset($f->children)) {
        if (sizeof($f->children) > 0) {
            coolectcars($f->children);
        }
    }
}

function coolectcars($array) {
    // var_dump($array);
    foreach ($array as $f) {
        if (isset($f->vid)) {
            global $arraycars;
            $arraycars[] = $f;
        } elseif (isset($f->children)) {
            if (sizeof($f->children) > 0) {
                coolectcars($f->children);
            }
        }
    }
}

echo json_encode($arraycars);
Update 2:
I now have a problem with a null error in this code.
The error:
I/flutter ( 4264): NoSuchMethodError: The method 'forEach' was called on
null.
I/flutter ( 4264): Receiver: null
I/flutter ( 4264): Tried calling: forEach(Closure: (Children) => Null)
The code:
List<Children> cars = [];
Queue numQ = new Queue();
numQ.addAll(parsed["tree"][0]["children"]);
Iterator i = numQ.iterator;
while (i.moveNext()) {
  // print("ddddddd ${i.current}");
  if (i.current.containsKey("vid")) {
    cars.add(new Children(
        i.current['vid'], i.current['protocol'], i.current['datetime'],
        i.current['gpss']));
  } else {
    Queue numQ = new Queue();
    if (i.current["children"] != null) {
      numQ.addAll(i.current["children"]);
      // iterate(numQ);
      List<Children> carse = [];
      carse = iterate(numQ);
      carse.forEach((data) {
        cars.add(data);
      });
    }
  }
}
cars.forEach((data) {
  print(data.toString());
});

List<Children> iterate(Queue numQ) {
  List<Children> cars = new List<Children>();
  Iterator i = numQ.iterator;
  while (i.moveNext()) {
    print("ddddddd ${i.current}");
    if (i.current.containsKey("vid")) {
      cars.add(new Children(
          i.current['vid'], i.current['protocol'], i.current['datetime'],
          i.current['gpss']));
    } else {
      if (i.current["children"] != null) {
        Queue numQ = new Queue();
        numQ.addAll(i.current["children"]);
        List<Children> carse = [];
        carse = iterate(numQ);
        carse.forEach((data) {
          cars.add(data);
        });
      }
    }
    return cars;
  }
}
I prefer using built_value for JSON deserialization/serialization. It's more elegant: you don't need to write fromJson yourself, because built_value generates the deserializers/serializers for you. You can check built_value's GitHub or the articles linked from it.
A good place to convert JSON to Dart classes is the generator linked here (hat tip to the Reddit post where it was shared): just copy your JSON into the textbox and generate, and it will auto-generate the classes for you.
With this you can call fromJson and feed it the JSON, and you also get autocomplete for it.
Example usage:
final response = await http.get('http://afs-i.com/json.json');
if (response.statusCode == 200) {
  final Autogenerated respJson = Autogenerated.fromJson(json.decode(response.body));
  print(respJson.tree[0].userId);
}
Insert import 'dart:convert'; at the top of your widget file.
Let's say you have your JSON in var response. Then:
var jsonDecoded = json.decode(response);
Now you can get name and user_id like this:
var user_id = jsonDecoded["tree"][0]["user_id"];
var name = jsonDecoded["tree"][0]["name"];
NOTE: I get these values from the first object (index 0). If you want the values for each object, you can loop to get them.
The final answer I reached:
final parsed = json.decode(response.body);
List<Children> cars = [];
Queue numQ = new Queue();
numQ.addAll(parsed["tree"][0]["children"]);
Iterator i = numQ.iterator;
while (i.moveNext()) {
  if (i.current.containsKey("vid")) {
    cars.add(new Children(
      i.current['vid'],
      i.current['datetime'],
    ));
  } else {
    Queue numQ = new Queue();
    if (i.current["children"].toString() != "[]") {
      numQ.addAll(i.current["children"]);
      List<Children> carse = [];
      carse = iterate(numQ);
      carse.forEach((data) {
        cars.add(data);
      });
    }
  }
}

List<Children> iterate(Queue numQ) {
  List<Children> cars = new List<Children>();
  Iterator i = numQ.iterator;
  while (i.moveNext()) {
    if (i.current.containsKey("vid")) {
      cars.add(new Children(
        i.current['vid'],
        i.current['datetime'],
      ));
    } else {
      if (i.current["children"].toString() != "[]") {
        Queue numQ = new Queue();
        numQ.addAll(i.current["children"]);
        List<Children> carse = [];
        carse = iterate(numQ);
        carse.forEach((data) {
          cars.add(data);
        });
      }
    }
    return cars;
  }
}

How to open a CSV file that contains special characters in one of its fields?

Hi, I am working on a Xamarin.Forms app. While trying to open one of the CSV files, the following exception is displayed: "input string is not in a correct format". The CSV file contains a field called item name with values such as "ET Door,E459-2" and "H 91 Ft and Key,Door"; both of these items contain a comma, so I am not able to open a CSV file that includes them, since they contain special characters like commas and underscores. Here is my code to read and open the CSV file; please check it and let me know what changes I need to make so that a file with items containing special characters also opens.
public async void OnProcess(object o, EventArgs args)
{
    if (!string.IsNullOrWhiteSpace(csv_file.Text))
    {
        // _database.AddFiles();
        if (App.Current.MainPage is NavigationPage)
        {
            try
            {
                List<ItemsCSV> items = new List<ItemsCSV>();
                string[] lines = File.ReadAllLines(string.Format(@"{0}", this.file.FilePath));
                if (lines != null)
                {
                    for (int x = 1; x < lines.Length; x++)
                    {
                        string data = lines[x];
                        string[] item = data.Split(',');
                        // ItemsCSV itemsCSV = new ItemsCSV();
                        _itemsCSV = new ItemsCSV();
                        {
                            _itemsCSV.Cycle_Count = string.IsNullOrEmpty(item.ElementAtOrDefault(0)) ? 0 : Convert.ToInt32(item[0]);
                            _itemsCSV.Line_Number = string.IsNullOrEmpty(item.ElementAtOrDefault(1)) ? 0 : Convert.ToInt32(item[1]);
                            _itemsCSV.Item_Number = item.ElementAtOrDefault(2);
                            _itemsCSV.Name = item.ElementAtOrDefault(3);
                            _itemsCSV.Warehouse = item.ElementAtOrDefault(4);
                            _itemsCSV.Aisle = item.ElementAtOrDefault(5);
                            _itemsCSV.Bin = item.ElementAtOrDefault(6);
                            _itemsCSV.Level = item.ElementAtOrDefault(7);
                            _itemsCSV.Order_Qty = string.IsNullOrEmpty(item.ElementAtOrDefault(8)) ? 0 : Convert.ToInt32(item[8]);
                            _itemsCSV.Order_UOM = item.ElementAtOrDefault(9);
                            _itemsCSV.Consumption_Qty = string.IsNullOrEmpty(item.ElementAtOrDefault(10)) ? 0 : Convert.ToInt32(item[10]);
                            _itemsCSV.Consumption_UOM = item.ElementAtOrDefault(11);
                            _itemsCSV.Status = "";
                        };
                        items.Add(_itemsCSV);
                        _database.AddItems(_itemsCSV);
                    }
                    var result = await DisplayAlert("", "CSV has been processed, please do cycle count", "OK", "Cancel");
                    if (result == true)
                    {
                        var cyclecountPage = new CycleCountPage(items, 0, "MainPage", this.file.FilePath);
                        await (App.Current.MainPage as NavigationPage).PushAsync(cyclecountPage);
                    }
                    else
                    {
                    }
                }
                else
                {
                    await DisplayAlert("Alert", "File is empty", "OK");
                }
            }
            catch (Exception e)
            {
                await DisplayAlert("Exception", e.Message, "OK");
            }
        }
    }
    else
    {
        await DisplayAlert("Alert", "File name is mandatory", "OK");
    }
}
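The underlying issue is that data.Split(',') treats every comma as a field separator, so an item name like ET Door,E459-2 is broken into two fields, the remaining columns shift, and Convert.ToInt32 then receives text, which is what raises "input string is not in a correct format". A quote-aware splitter is one way around it. The sketch below is only an assumption-laden example: it presumes the exporting system wraps comma-containing fields in double quotes, and the helper name SplitCsvLine is made up for illustration.

// Sketch: split one CSV line while honouring double-quoted fields.
// Requires: using System.Collections.Generic; using System.Text;
private static List<string> SplitCsvLine(string line)
{
    var fields = new List<string>();
    var current = new StringBuilder();
    bool inQuotes = false;

    for (int i = 0; i < line.Length; i++)
    {
        char ch = line[i];
        if (ch == '"')
        {
            // A doubled quote inside a quoted field is an escaped quote character
            if (inQuotes && i + 1 < line.Length && line[i + 1] == '"')
            {
                current.Append('"');
                i++;
            }
            else
            {
                inQuotes = !inQuotes;
            }
        }
        else if (ch == ',' && !inQuotes)
        {
            fields.Add(current.ToString());
            current.Clear();
        }
        else
        {
            current.Append(ch);
        }
    }
    fields.Add(current.ToString());
    return fields;
}

With that in place, string[] item = data.Split(','); in the loop could become var item = SplitCsvLine(data);, and ElementAtOrDefault plus the integer conversions keep working on whole field values. If the source file does not quote its fields, no parser can tell a data comma from a separator, and the export itself would need to change (for example to a tab or pipe delimiter).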

Adding base64 string to JSON invalidates it

I am creating a JSON array that contains a base64 string. The JSON is invalid, but when I replace the displaypic value with some normal string, the JSON is valid. Please help me with this. Is there any other way to work with images that have to be parsed on cross platforms? How do I deal with this base64 string?
The data is too long to add here (the body limit is 30000 characters). Please see this link for the JSON.
CODE THAT CREATES JSON
function checkLogin_post()
{
    //$responsedata = array();
    if ($this->post('useremail') && $this->post('password'))
    {
        $useremail = $this->post('useremail');
        $password = $this->post('password');
        $this->load->model('loginmodel');
        $table_data = $this->loginmodel->checkLogin($useremail);
        if (sizeof($table_data) != 0)
        {
            foreach ($table_data as $data)
            {
                if ($password == $data->password)
                {
                    $responsedata["firstname"] = $data->firstname;
                    $responsedata["lastname"] = $data->lastname;
                    $responsedata["email"] = $data->email;
                    $responsedata["userid"] = $data->userid;
                    $responsedata["displaypic"] = $data->displaypic; // THIS IS THE BASE64
                    $responsedata["ispersonaldetailsfilled"] = $data->ispersonaldetailsfilled;
                    $responsedata["isexpertisedetailsfilled"] = $data->isexpertisedetailsfilled;
                    $responsedata["isprofessionaldetailsfilled"] = $data->isprofessionaldetailsfilled;
                    $this->response(array("success" => $responsedata), 200);
                    //$this->response($responsedata, 200);
                }
                else
                {
                    $this->response(array("error" => "Password not matched"), 200);
                }
            }
        }
        else
        {
            $this->response(array("error" => "User not found"), 200);
        }
    }
    else
    {
        if ($this->post('useremail') == "")
        {
            $this->response(array("error" => "Useremail can't be null"), 200);
        }
        if ($this->post('password') == "")
        {
            $this->response(array("error" => "Password can't be null"), 200);
        }
    }
}
It was just missing a brace at the end. Here's your corrected JSON.
Always use a tool to validate your JSON before posting a question. I personally like http://pro.jsonlint.com
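For what it's worth, a base64 string is plain ASCII, so placing it inside a JSON string value can never make the JSON invalid by itself; the breakage really does come from structure, like the missing brace above. Below is a minimal, purely illustrative sketch of the cross-platform round trip (C# on a consuming client, with a hypothetical file name and property names mirroring the response above):

// Illustrative only: base64 travels safely inside a JSON string value.
using System;
using System.IO;
using System.Text.Json;

class Base64JsonDemo
{
    static void Main()
    {
        // Hypothetical image file; in the question this value comes from the displaypic column
        byte[] imageBytes = File.ReadAllBytes("displaypic.jpg");
        var payload = new { email = "user@example.com", displaypic = Convert.ToBase64String(imageBytes) };

        // Serializing produces valid JSON no matter how long the base64 string is
        string json = JsonSerializer.Serialize(payload);

        // The consuming platform parses the JSON and decodes the image bytes back out
        using JsonDocument doc = JsonDocument.Parse(json);
        byte[] roundTripped = Convert.FromBase64String(doc.RootElement.GetProperty("displaypic").GetString());
        Console.WriteLine(roundTripped.Length == imageBytes.Length); // True
    }
}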

Mule - how to pad rows in a CSV file with extra delimiters

I have a CSV file coming into my Mule application that looks as follows:
1,Item,Item,Item
2,Field,Field,Field,Field
2,Field,Field,Field,Field
3,Text
Is there a way I can transform this file in Mule to something like the below:
1,Item,Item,Item,,,,,,
2,Field,Field,Field,Field,,,,,
2,Field,Field,Field,Field,,,,,
3,Text,,,,,,,,
Essentially, what I need to do here is append a string (containing x occurrences of a delimiter) to the end of each row. The number of delimiters I need to append to each row can be determined by the first character of that row, e.g. if row[0] == '1' then append ",,,,,,", else if row[0] == '2' then append ",,,,,", etc.
The reason I have this rather annoying problem is that the system providing the input to my Mule application produces a file where the number of columns in each row may vary. I'm trying to pad the file so that every row has the same number of columns, so that I can pass it on to a Java transformer like the one explained here that uses FlatPack (which expects x columns and won't accept a file with a varying number of columns per row).
Does anyone have any ideas on how I could approach this? Thanks in advance.
UPDATE
Based on @Learner's recommendation and @EdC's answer, I've achieved this in Mule using the below:
<flow name="testFlow1" doc:name="testFlow1">
    <file:inbound-endpoint .../>
    <file:file-to-string-transformer doc:name="File to String"/>
    <component doc:name="Java" class="package.etc.etc.MalformData"/>
</flow>
Try this, based on @Learner's answer.
import org.mule.api.MuleEventContext;
import org.mule.api.MuleMessage;

public class MalformData implements org.mule.api.lifecycle.Callable {

    private String[] strTempArray;

    public String findMalformRow(String strInput) {
        String strOutput = "";
        String strFix = "";
        int intArrayLength = 0;
        char charFirst = ' ';

        strTempArray = strInput.split("\\n");
        intArrayLength = strTempArray.length;

        for (int i = 0; i < intArrayLength; i++) {
            charFirst = strTempArray[i].charAt(0);
            strFix = strTempArray[i];
            String missingDelimiter = "";

            // Pad the row with the number of delimiters its record type needs
            if (charFirst == '1') {
                missingDelimiter = ",,,,,,";
                strFix += missingDelimiter;
            } else if (charFirst == '2') {
                missingDelimiter = ",,,,,";
                strFix += missingDelimiter;
            } else if (charFirst == '3') {
                missingDelimiter = ",,,,,,,,";
                strFix += missingDelimiter;
            } else {
                strFix = "Good"; // unrecognised record type: leave the row as it is
            }

            // Only overwrite the row when it was actually padded
            if (!strFix.equals("Good")) {
                strTempArray[i] = strFix;
            } else {
                charFirst = ' ';
                strFix = "";
            }
        }

        for (int i = 0; i < intArrayLength; i++) {
            strOutput += strTempArray[i] + "\n";
        }
        return strOutput;
    }

    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        MuleMessage message = eventContext.getMessage();
        String newPayload = this.findMalformRow(message.getPayloadAsString());
        message.setPayload(newPayload);
        return message;
    }
}
You could write a custom Java component that reads your CSV file line by line, writes to another file, and appends commas to the end of each line based on your logic.