Retrieve the destination DE from an Import Step in an automation using an SSJS script activity - salesforce-marketing-cloud

I have an automation in SFMC containing an import step. There will always be 1 import step and I will have the name of that import step in my SSJS script. However, I would like to retrieve the Destination DE of that import step using SSJS in the same automation. Is there a way to do that?
Step 1: Import step
Step 2: Script that retrieves destination DE name

Related

ipython tab completion only to show imported functions from a module, not other imported modules

I like using tab completion in IPython to see which functions are available in an imported module. I am aware that imports inside a module can be hidden from tab completion if those imports are done with an underscore, like import os as _os. Is there a way to avoid seeing imported modules when the imports are done without an underscore, i.e. a plain import os?
Example:
with_underscore.py:
import os as _os

def list(path):
    return _os.listdir(path)

without_underscore.py:
import os

def list(path):
    return os.listdir(path)
After importing the two modules into ipython
[1]: import with_underscore
[2]: import without_underscore
tab completion on without_underscore.<tab> would yield list and os, while tab completion on with_underscore.<tab> would yield only list, which is what I want. How could I get only list with tab completion without the underscore importing approach?
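One idea (a sketch of my own, not something the question confirms) is to give the module a module-level __dir__ function (PEP 562, Python 3.7+). IPython's completer generally builds its suggestions from dir(module), so names left out of __dir__ should stop showing up:

no_underscore_dir.py (hypothetical module name):
import os

def list(path):
    return os.listdir(path)

def __dir__():
    # Only expose the names we want tab completion to suggest.
    return ['list']

After import no_underscore_dir, completion on no_underscore_dir.<tab> should then offer only list, even though os was imported without an underscore; exact behaviour can vary by IPython version.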

Import pre-trained Deep Learning Models into Foundry Codeworkbooks

How do you import an h5 model from a local machine into a Foundry code workbook?
I want to use the Hugging Face library as shown below, and in its documentation the from_pretrained method expects a path or URL to where the pretrained model lives.
I would ideally like to download the model onto my local machine, upload it onto Foundry, and have Foundry read in said model.
For reference, I'm trying to do this in a code workbook or code authoring. It looks like you can work directly with files from there, but I've read the documentation and the given example was for a CSV file, whereas this model contains a variety of files in h5 and json format. I'm wondering how I can access these files and have them passed into the from_pretrained method from the transformers package.
Relevant links:
https://huggingface.co/transformers/quicktour.html
Pre-trained Model:
https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/tree/main
Thank you!
I've gone ahead and added the transformers (hugging face) package onto the platform.
As for uploading the model files, you can follow these steps:
Step 1: Use your dataset with the model-related files as an input to your code workbook transform.
Step 2: Use Python's raw file access to read the contents of the dataset: https://docs.python.org/3/library/filesys.html
Step 3: Use Python's built-in tempfile to create a folder and write the files from step 2 into it: https://docs.python.org/3/library/tempfile.html#tempfile.mkdtemp , https://www.kite.com/python/answers/how-to-write-a-file-to-a-specific-directory-in-python
Step 4: Pass the temporary folder (tempfile.mkdtemp() returns its absolute path) to the from_pretrained method:
import os
import tempfile
from transformers import TFAutoModelForSequenceClassification, AutoTokenizer

def sample(dataset_with_model_folder_uploaded):
    # Copy the model files out of the dataset into a local temporary folder
    full_folder_path = tempfile.mkdtemp()
    all_file_names = ['config.json', 'tf_model.h5', 'ETC.ot', ...]  # list every file in the model folder
    for file_name in all_file_names:
        with dataset_with_model_folder_uploaded.filesystem().open(file_name, 'rb') as f:
            path_of_file = os.path.join(full_folder_path, file_name)
            with open(path_of_file, 'wb') as new_file:
                new_file.write(f.read())
    # from_pretrained can now load from the local folder
    model = TFAutoModelForSequenceClassification.from_pretrained(full_folder_path)
    tokenizer = AutoTokenizer.from_pretrained(full_folder_path)
    return model, tokenizer
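Once the model and tokenizer load, a quick sanity check along these lines (my own sketch, assuming a TensorFlow sequence-classification checkpoint such as the DistilBERT SST-2 model linked above) can confirm the files were copied over correctly:

# Hypothetical check: classify one sentence with the restored model.
inputs = tokenizer("This pipeline works end to end!", return_tensors="tf")
outputs = model(inputs)
print(outputs.logits)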
Thanks,

Why does a Python script run through a batch file not write to a JSON file?

I have a Python script that does some web scraping, then opens a JSON file in the same directory and dumps the parsed data into it. Everything works when the script is run manually through the CLI, but the data does not get written to the JSON file when the script is run from the batch file launched by the Task Scheduler.
I have managed to show that all the data exists within the Python script when it is run through the batch file; somehow only the part of the function that deals with the JSON file is not run.
Python script:
# Packages used:
import requests
from bs4 import BeautifulSoup
import smtplib
import time
from win10toast import ToastNotifier
import json

# Web Scraping...
my_json = {}

def function1():
    # Web scraping for data...
    json_function(data)

# Below is the function that is not functioning
def json_function(data):
    my_json[time.strftime("%Y-%m-%d %H:%M")] = f"{data}"
    with open('json_file.json') as my_dict:
        info = json.load(my_dict)
    info.update(my_json)
    with open('json_file.json', 'w') as my_dict:
        json.dump(info, my_dict)

# A few other functions that work regardless...

# Call function
function1()
Batch file:
"C:\Users\...pythonw.exe" "C:\Users...script.pyw"
JSON file:
{"Key":"Value"}
Every file is in the same directory.
When run from the CLI, the expected result occurs: the key-value pair is appended to the JSON file. When run automatically (through the batch file and Task Scheduler), there are no visible errors, and all of the script, save for json_function, runs as expected.
Thank you to @PRMoureu for the answer, and @Mofi for a detailed explanation.
The answer is to ensure that every file referenced uses its full path:
def json_function(data):
    my_json[time.strftime("%Y-%m-%d %H:%M")] = f"{data}"
    with open('C:/.../json_file.json') as my_dict:
        info = json.load(my_dict)
    info.update(my_json)
    with open('C:/.../json_file.json', 'w') as my_dict:
        json.dump(info, my_dict)
Alternatively, point the Task Scheduler task at the script's working directory (the "Start in" field) so the batch file is not run from the default directory.
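A third option (a sketch of my own, not from the answers above) is to build the path from the script's own location with pathlib, so it no longer matters which working directory the Task Scheduler uses:

import json
import time
from pathlib import Path

# Resolve json_file.json next to the script itself, not the current working directory.
JSON_PATH = Path(__file__).resolve().parent / 'json_file.json'

def json_function(data):
    my_json = {time.strftime("%Y-%m-%d %H:%M"): f"{data}"}
    info = json.loads(JSON_PATH.read_text())
    info.update(my_json)
    JSON_PATH.write_text(json.dumps(info))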

Odoo 10 .csv file import

I'm trying to import data into a module I created in Odoo. However, the import fails with an error (screenshot not reproduced here). Do you know the reason?
The file contains 400 lines; I tried reducing the import to 50 lines and got the same error.
Thank you
This error normally happens when the Odoo source code has been modified or the import fields are not correct; you can refer to this link.
Please try importing sample data with only a few fields first.

SSIS Import CSV data just one time

I'm planning to import a CSV file into an OLE DB destination just one time, and then the rest of the process should run. The process should check whether the data already exists and move on with the rest of the process. How would I do this?