Node.js - Can I store writable streams as JSON in Redis?

I am still working on fully understanding streams in node.js. If I create a writable stream, would I be able to store the stream object as JSON in Redis, then access it later and continue writing to it (after JSON.parse)?
example:
var fs = require( 'fs' );
var redis = require( 'redis' );

var streamName = fs.createWriteStream(upfilePath, streamopts);
streamName = JSON.stringify(streamName);
rclient.set('streamJSON', streamName);
....
var myNewdata = 'whatever';
rclient.get('streamJSON', function (err, streamJSON) {
    var recoveredStream = JSON.parse(streamJSON);
    recoveredStream.write(myNewdata, function (err, written, buffer) {
        //write successful??
    });
});

You can't store variable references in Redis. You would only need to store the filename, then reopen the stream with the 'a' flag, which allows you to append data to it.
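A minimal sketch of that simpler approach (assuming a connected redis client named rclient; upfilePath comes from the question, the written data is a placeholder):

var fs = require('fs');

// Persist only the file path; the stream object itself cannot survive
// JSON.stringify, since its file descriptor and internal state are only
// meaningful inside the current process.
rclient.set('streamPath', upfilePath);

// Later, possibly after a restart: reopen the same file in append mode.
rclient.get('streamPath', function (err, path) {
    if (err) throw err;
    var stream = fs.createWriteStream(path, { flags: 'a' });
    stream.write('whatever');
    stream.end();
});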
I thought this was a pretty interesting question and created this module that allows you to save the state of a stream and then use it later. But I don't see the point if you can just use the 'a' flag. Might be useful for ReadableStreams though.
var fs = require('fs');

exports.stringify = function(stream) {
    var obj = {
        path: stream.path
      , writable: stream.writable
      , fd: stream.fd
      , options: {
            encoding: stream.encoding
          , mode: stream.mode
        }
    };
    if (stream.writable) {
        obj.bytesWritten = stream.bytesWritten;
    } else {
        obj.options.bufferSize = stream.bufferSize;
        obj.bytesRead = stream.bytesRead;
    }
    return JSON.stringify(obj);
};

exports.parse = function(json, callback) {
    var obj = JSON.parse(json);
    var stream;
    if (obj.writable) {
        obj.options.flags = 'a';
        stream = fs.createWriteStream(obj.path, obj.options);
        stream.bytesWritten = obj.bytesWritten;
    } else {
        stream = fs.createReadStream(obj.path, obj.options);
        stream.bytesRead = obj.bytesRead;
    }
    // if the original stream had already been opened, wait until the new one is open too
    if (obj.fd !== null) {
        stream.on('open', function() {
            callback(null, stream);
        });
    } else {
        process.nextTick(function() {
            callback(null, stream);
        });
    }
    return stream;
};
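For completeness, a usage sketch of the module above (assuming it is saved as stream-state.js next to your script and that rclient is a connected redis client):

var fs = require('fs');
var streamState = require('./stream-state');

var stream = fs.createWriteStream('/tmp/upload.log');

// Store the serialized stream state (path, writability, position) in Redis.
rclient.set('streamJSON', streamState.stringify(stream));

// Later: rebuild an equivalent stream in append mode and keep writing.
rclient.get('streamJSON', function (err, json) {
    if (err) throw err;
    streamState.parse(json, function (err, recoveredStream) {
        recoveredStream.write('more data');
    });
});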

Related

Forge API translating .fbx to SVF2 doesn't work and translates only to SVF

I am using the forge-apis package on Node.js and I want to translate a .fbx file to SVF2. When I do so and load the model, the size and GPU memory used are the same as with a normal translation to SVF, and when I check viewer.model.isSVF2() it returns false.
const {
    DerivativesApi,
    JobPayload,
    JobPayloadInput,
    JobPayloadOutput,
    JobSvfOutputPayload } = require('forge-apis');
and
router.post('/jobs', async (req, res, next) => {
    const xAdsForce = (req.body.xAdsForce === true);
    let job = new JobPayload();
    job.input = new JobPayloadInput();
    job.input.urn = req.body.objectName;
    if (req.body.rootFilename && req.body.compressedUrn) {
        job.input.rootFilename = req.body.rootFilename;
        job.input.compressedUrn = req.body.compressedUrn;
    }
    job.output = new JobPayloadOutput([
        new JobSvfOutputPayload()
    ]);
    job.output.formats[0].type = 'svf2';
    job.output.formats[0].views = ['2d', '3d'];
    try {
        // Submit a translation job using [DerivativesApi](https://github.com/Autodesk-Forge/forge-api-nodejs-client/blob/master/docs/DerivativesApi.md#translate).
        const result = await new DerivativesApi().translate(job, { xAdsForce: xAdsForce }, req.oauth_client, req.oauth_token);
        res.status(200).end();
    } catch (err) {
        next(err);
    }
});
How can I handle this problem? Thanks a lot.

Google Data Studio Community Connector getData() not working as expected

function getData(request) {
    try {
        var options = {
            'method': 'post',
            'contentType': 'application/json',
            'payload': JSON.stringify(request)
        };
        var response = UrlFetchApp.fetch(getDataUrl, options);
        var resData = JSON.parse(response.getContentText());
        return resData;
    } catch (e) {
        e = (typeof e === 'string') ? new Error(e) : e;
        Logger.log("Catch", e);
        throw e;
    }
}
The above is my getData() function.
My isAdminUser() returns true.
When I try to visualize my data, I get the following error
Data Set Configuration Error
Data Studio cannot connect to your data set.
There was an error requesting data from the community connector. Please report the issue to the provider of this community connector if this issue persists.
Error ID: 3d11b88b
https://i.stack.imgur.com/x3Hki.png
The error code changes every time I refresh the data, and I can't find any dictionary to map the error ID to an actual error.
I tried debugging by logging the request parameter, response.getContentText() and the resData variable to make sure my data is formatted correctly.
Following are the logs printed in the Stackdriver logs:
request
{configParams={/Personal config data/}, fields=[{name=LASTNAME}]}
response.getContentText()
{"schema":[{"name":"LASTNAME","dataType":"STRING"}],"rows":[{"values":["test"]},{"values":["test"]},{"values":["Dummy"]},{"values":["One"]},{"values":["Nagargoje"]},{"values":[""]},{"values":[""]},{"values":[""]},{"values":[""]},{"values":[""]}],"filtersApplied":false}
resData
{rows=[{values=[test]}, {values=[test]}, {values=[Dummy]},
{values=[One]}, {values=[Nagargoje]}, {values=[]}, {values=[]},
{values=[]}, {values=[]}, {values=[]}], filtersApplied=false,
schema=[{name=LASTNAME, dataType=STRING}]}
I am not sure what is wrong with my getData() function.
The Object that I am returning seems to match the structure given here https://developers.google.com/datastudio/connector/reference#getdata
So there was no issue with my getData() function; the issue existed in the manifest file.
I was searching about passing parameters via URL and I stumbled upon a field called dataStudio.useQueryConfig, added it to my manifest file and set its value to true.
Google Data Studio was then expecting me to return a query config from getData().
But what I really wanted was this.
Anyway, I was able to debug it thanks to Matthias, who suggested I take a look at open-source implementations.
I implemented JSON Connect, which worked fine, so I logged what it was returning in getData() and used that format/structure in my code, but my connector still didn't work.
My next assumption was that maybe there was something wrong with my getSchema() return value. So I logged that as well, and then copy-pasted the hard-coded values of both the getData() and getSchema() return variables from JSON Connect.
And even that didn't work, so my last bet was that there must be something wrong with the manifest file; maybe the dummy links I added in it were the issue. After carrying out a field-by-field comparison I was finally able to get my community connector working.
This would have been easier to debug if the error messages were a bit more helpful and didn't seem so generic.
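To make the manifest fix concrete, here is a minimal appsscript.json sketch (the names and URLs are placeholders); the lesson above is that dataStudio.useQueryConfig should be omitted, or set to false, unless getData() really is meant to return a query config:

{
  "dataStudio": {
    "name": "My Connector",
    "company": "Example Co",
    "logoUrl": "https://example.com/logo.png",
    "addOnUrl": "https://example.com",
    "supportUrl": "https://example.com/support",
    "description": "Example connector"
  }
}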
First: You can always check out the open-source implementations that others did for custom Google Data Studio connectors. They are a great source of information. For more information, check out the documentation on Open Source Community Connectors.
Second: My implementation is for a time-tracking system with confidential, GDPR-relevant data, so I can't just give you the response messages. But I assembled this code. It contains authentication, an HTTP GET data fetch and data conversions. An explanation is below the code. Again, check out the open-source connectors if you need further assistance.
var cc = DataStudioApp.createCommunityConnector();

const URL_DATA = 'https://www.myverysecretdomain.com/api';
const URL_PING = 'https://www.myverysecretdomain.com/ping';
const AUTH_USER = 'auth.user';
const AUTH_KEY = 'auth.key';
const JSON_TAG = 'user';

String.prototype.format = function() {
    // https://coderwall.com/p/flonoa/simple-string-format-in-javascript
    a = this;
    for (k in arguments) {
        a = a.replace("{" + k + "}", arguments[k]);
    }
    return a;
}
function httpGet(user, token, url, params) {
    try {
        // this depends on the URL you are connecting to
        var headers = {
            'ApiUser': user,
            'ApiToken': token,
            'User-Agent': 'my super freaky Google Data Studio connector'
        };
        var options = {
            headers: headers
        };
        if (params && Object.keys(params).length > 0) {
            var params_ = [];
            for (const [key, value] of Object.entries(params)) {
                var value_ = value;
                if (Array.isArray(value))
                    value_ = value.join(',');
                params_.push('{0}={1}'.format(key, encodeURIComponent(value_)));
            }
            var query = params_.join('&');
            url = '{0}?{1}'.format(url, query);
        }
        var response = UrlFetchApp.fetch(url, options);
        return {
            code: response.getResponseCode(),
            json: JSON.parse(response.getContentText())
        };
    } catch (e) {
        throwConnectorError(e);
    }
}
function getCredentials() {
    var userProperties = PropertiesService.getUserProperties();
    return {
        username: userProperties.getProperty(AUTH_USER),
        token: userProperties.getProperty(AUTH_KEY)
    };
}

function validateCredentials(user, token) {
    if (!user || !token)
        return false;
    var response = httpGet(user, token, URL_PING);
    if (response.code == 200)
        console.log('API key for the user %s successfully validated', user);
    else
        console.error('API key for the user %s is invalid. Code: %s', user, response.code);
    return response;
}

function getAuthType() {
    var cc = DataStudioApp.createCommunityConnector();
    return cc.newAuthTypeResponse()
        .setAuthType(cc.AuthType.USER_TOKEN)
        .setHelpUrl('https://www.myverysecretdomain.com/index.html#authentication')
        .build();
}

function resetAuth() {
    var userProperties = PropertiesService.getUserProperties();
    userProperties.deleteProperty(AUTH_USER);
    userProperties.deleteProperty(AUTH_KEY);
    console.info('Credentials have been reset.');
}

function isAuthValid() {
    var credentials = getCredentials();
    if (credentials == null) {
        console.info('No credentials found.');
        return false;
    }
    var response = validateCredentials(credentials.username, credentials.token);
    return (response != null && response.code == 200);
}

function setCredentials(request) {
    var credentials = request.userToken;
    var response = validateCredentials(credentials.username, credentials.token);
    if (response == null || response.code != 200) return { errorCode: 'INVALID_CREDENTIALS' };
    var userProperties = PropertiesService.getUserProperties();
    userProperties.setProperty(AUTH_USER, credentials.username);
    userProperties.setProperty(AUTH_KEY, credentials.token);
    console.info('Credentials have been stored');
    return {
        errorCode: 'NONE'
    };
}

function throwConnectorError(text) {
    DataStudioApp.createCommunityConnector()
        .newUserError()
        .setDebugText(text)
        .setText(text)
        .throwException();
}
function getConfig(request) {
    // ToDo: handle request.languageCode for different languages being displayed
    console.log(request);
    var params = request.configParams;
    var config = cc.getConfig();
    // ToDo: add your config if necessary
    config.setDateRangeRequired(true);
    return config.build();
}

function getDimensions() {
    var types = cc.FieldType;
    return [
        {
            id: 'id',
            name: 'ID',
            type: types.NUMBER
        },
        {
            id: 'name',
            name: 'Name',
            isDefault: true,
            type: types.TEXT
        },
        {
            id: 'email',
            name: 'Email',
            type: types.TEXT
        }
    ];
}

function getMetrics() {
    return [];
}

function getFields(request) {
    Logger.log(request);
    var fields = cc.getFields();
    var dimensions = this.getDimensions();
    var metrics = this.getMetrics();
    dimensions.forEach(dimension => fields.newDimension().setId(dimension.id).setName(dimension.name).setType(dimension.type));
    metrics.forEach(metric => fields.newMetric().setId(metric.id).setName(metric.name).setType(metric.type).setAggregation(metric.aggregations));
    var defaultDimension = dimensions.find(field => field.hasOwnProperty('isDefault') && field.isDefault == true);
    var defaultMetric = metrics.find(field => field.hasOwnProperty('isDefault') && field.isDefault == true);
    if (defaultDimension)
        fields.setDefaultDimension(defaultDimension.id);
    if (defaultMetric)
        fields.setDefaultMetric(defaultMetric.id);
    return fields;
}

function getSchema(request) {
    var fields = getFields(request).build();
    return { schema: fields };
}
function convertValue(value, id) {
    // ToDo: add special conversion if necessary
    switch (id) {
        default:
            // value will be converted automatically
            return value[id];
    }
}

function entriesToDicts(schema, data, converter, tag) {
    return data.map(function(element) {
        var entry = element[tag];
        var row = {};
        schema.forEach(function(field) {
            // field has same name in connector and original data source
            var id = field.id;
            var value = converter(entry, id);
            // use UI field ID
            row[field.id] = value;
        });
        return row;
    });
}

function dictsToRows(requestedFields, rows) {
    return rows.reduce((result, row) =>
        ([...result, { 'values': requestedFields.reduce((values, field) => ([...values, row[field]]), []) }]), []);
}

function getParams(request) {
    var schema = this.getSchema();
    var params;
    if (request) {
        params = {};
        // ToDo: handle pagination={startRow=1.0, rowCount=100.0}
    } else {
        // preview only
        params = {
            limit: 20
        };
    }
    return params;
}
function getData(request) {
    Logger.log(request);
    var credentials = getCredentials();
    var schema = getSchema();
    var params = getParams(request);
    var requestedFields; // fields structured as I want them (see above)
    var requestedSchema; // fields structured as Google expects them
    if (request) {
        // make sure the ordering of the requested fields is kept correct in the resulting data
        requestedFields = request.fields.filter(field => !field.forFilterOnly).map(field => field.name);
        requestedSchema = getFields(request).forIds(requestedFields);
    } else {
        // use all fields from schema
        requestedFields = schema.map(field => field.id);
        requestedSchema = api.getFields(request);
    }
    var filterPresent = request && request.dimensionsFilters;
    //var filter = ...
    if (filterPresent) {
        // ToDo: apply request filters on API level (before the API call) to minimize data retrieval from API (number of rows) and increase speed
        // see https://developers.google.com/datastudio/connector/filters
        // filter = ... // initialize filter
        // filter.preFilter(params); // low-level API filtering if possible
    }
    // get HTTP response; e.g. check for HTTP RETURN CODE on response.code if necessary
    var response = httpGet(credentials.username, credentials.token, URL_DATA, params);
    // get JSON data from HTTP response
    var data = response.json;
    // convert the full dataset including all fields (the full schema); non-requested fields will be filtered later on
    var rows = entriesToDicts(schema, data, convertValue, JSON_TAG);
    // match rows against filter (high-level filtering)
    //if (filter)
    //    rows = rows.filter(row => filter.match(row) == true);
    // remove non-requested fields
    var result = dictsToRows(requestedFields, rows);
    console.log('{0} rows received'.format(result.length));
    //console.log(result);
    return {
        schema: requestedSchema.build(),
        rows: result,
        filtersApplied: filterPresent ? true : false
    };
}
A sample request that filters for all users whose names start with J:
{
    configParams={},
    dateRange={
        endDate=2020-05-14,
        startDate=2020-04-17
    },
    fields=[
        {name=name}
    ],
    scriptParams={
        lastRefresh=1589543208040
    },
    dimensionsFilters=[
        [
            {
                values=[^J.*],
                operator=REGEXP_EXACT_MATCH,
                type=INCLUDE,
                fieldName=name
            }
        ]
    ]
}
The JSON data returned by the HTTP GET contains all fields (full schema).
[ { user:
     { id: 1,
       name: 'Jane Doe',
       email: 'jane@doe.com' } },
  { user:
     { id: 2,
       name: 'John Doe',
       email: 'john@doe.com' } }
]
Once the data is filtered and converted/transformed, you'll get this result, which is perfectly displayed by Google Data Studio:
{
    filtersApplied=true,
    schema=[
        {
            isDefault=true,
            semantics={
                semanticType=TEXT,
                conceptType=DIMENSION
            },
            label=Name,
            name=name,
            dataType=STRING
        }
    ],
    rows=[
        {values=[Jane Doe]},
        {values=[John Doe]}
    ]
}
getData should return data for only the requested fields. request.fields contains the list of all requested fields. Limit your data to those fields only and then send the parsed data back.
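As a sketch of that advice (reusing the getFields() helper from the long answer above; fetchAllRows() is a hypothetical stand-in for however you retrieve the full records):

function getData(request) {
    // Build the schema for exactly the requested fields, in the requested order.
    var requestedFields = getFields(request).forIds(
        request.fields.map(function(field) { return field.name; }));
    // Reduce every record to just those fields, keeping the requested order.
    var rows = fetchAllRows().map(function(record) {
        return {
            values: requestedFields.asArray().map(function(field) {
                return record[field.getId()];
            })
        };
    });
    return {
        schema: requestedFields.build(),
        rows: rows
    };
}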

Flutter fetch JSON from external Storage

I am trying to read a JSON file from external storage (Android), but I am unable to do it.
I have already set up the permission in the manifest and I also check the permission before reading. Though the file is already in the directory, I cannot read it.
ModelTestModel modelTestModel;
List<ModelTests> listModelTests;

Future<bool> get readPermission async {
  await new Future.delayed(new Duration(seconds: 1));
  bool checkResult = await SimplePermissions.checkPermission(
      Permission.ReadExternalStorage);
  if (!checkResult) {
    var status = await SimplePermissions.requestPermission(
        Permission.ReadExternalStorage);
    if (status == PermissionStatus.authorized) {
      var res = await fetchModelTest;
      return res != null;
    }
  } else {
    var res = await fetchModelTest;
    return res != null;
  }
  return false;
}

Future<List<ModelTests>> get fetchModelTest async {
  var dir = await getExternalStorageDirectory();
  print(dir);
  final data =
      await rootBundle.loadString("${dir.path}/BCS/bsc.json");
  print(data);
  // var data = await rootBundle.loadString('assets/database/bcs-preparation.json'); this works when the file is inside assets
  var jsonData = json.decode(data);
  modelTestModel = ModelTestModel.fromJson(jsonData);
  listModelTests = modelTestModel.modelTests;
  return listModelTests;
}
Log
I/SimplePermission(17862): Checking permission : android.permission.READ_EXTERNAL_STORAGE
I/flutter (17862): Directory: '/storage/emulated/0'
The permission check succeeds, but I cannot read the file.
rootBundle is used to access the resources of the application; it cannot be used to access files in phone storage.
Open the file with
File jsonFile = File("${dir.path}/BCS/bsc.json");
Then decode this jsonFile using
var jsonData = json.decode(jsonFile.readAsStringSync());

Node.js request starvation during large IO

I have a node application that utilizes streams to read and process data from a DB.
I have a reader stream that makes a request via ZeroMQ and as it receives the data it pushes the data out to the next stream in the pipe.
The second stream will write the JSON data to a file and then pass the data on.
The final stream will convert the JSON data to a CSV and then write out a CSV file.
What I'm noticing is that when I'm receiving a "large" amount of data from the DB (over 10k rows and about 2 MB of uncompressed raw data), this process takes a considerable amount of time (~20 seconds), and during those 20 seconds other requests are starved and cannot complete.
Does this sound right? Is there a way to relinquish the thread to allow it to do other work while the stream of data is being read/written? Or is there a better approach to handling this file I/O?
EDIT FOR CODE
var fs = require('fs');
var util = require('util');
var stream = require('stream');
var MB = 1024 * 1024; // assumed constant; not shown in the original snippet

function FirstStream(msg) {
    stream.Readable.call(this, { objectMode: true });
    this.request = msg;
    this._read = function() {
        var self = this;
        DBRequest.send(self.request).then(function(json) {
            json.Body.messages.forEach(function(item) {
                self.push(JSON.stringify(item));
            });
            self.push(null);
        });
    };
}
util.inherits(FirstStream, stream.Readable);

function SecondStream(filename, maxFileSize) {
    stream.Transform.call(this, { objectMode: true });
    this.filename = filename;
    this.jsonArray = [];
    this.buf = '';
    this.bufferTooBig = false;
    this.id = 0;
    this.maxFileSize = maxFileSize || MB;

    // Buffers JSON data; if the buffer gets too large, then don't bother writing the JSON file
    this._write = function(chunk, encoding, done) {
        // If our buffer is too large, don't worry about caching more data
        if (!this.bufferTooBig) {
            var json = JSON.parse(chunk);
            this.jsonArray.push(json);
            this.buf = new Buffer(JSON.stringify(this.jsonArray));
            // If the filesize is going to be over our max filesize, then forget about it
            if (this.buf.length > this.maxFileSize) {
                fs.unlink(filename, function(err) { });
                this.jsonArray = [];
                this.buf = '';
                this.bufferTooBig = true;
            }
        }
        // Pass the data on to the next stream
        this.push(chunk);
        done();
    };

    this._flush = function(done) {
        // If the filesize is within reason, then write out the file
        if (!this.bufferTooBig) {
            fs.writeFile(filename, this.buf.toString(), function(err) {
                if (err) {
                    throw err;
                }
                done();
            });
        } else {
            done();
        }
    };
}
util.inherits(SecondStream, stream.Transform);
function ThirdStream(filename) {
    stream.Transform.call(this, { objectMode: true });
    this.fileStream = fs.createWriteStream(filename);
    this._write = function(chunk, encoding, done) {
        var csvMessage = toCSV(chunk); // assumes a JSON-to-CSV conversion step omitted from the original snippet
        this.fileStream.write(csvMessage);
        this.push(csvMessage);
        done();
    };
    this._flush = function(done) {
        this.fileStream.end();
        done();
    };
}
util.inherits(ThirdStream, stream.Transform);
// USE CASE
var backendStream = new FirstStream(request)
    .pipe(new SecondStream(jsonFileName))
    .pipe(new ThirdStream(csvFileName))
    .on('finish', function() { /* write response back to client */ });

How to Store an Object in Windows Phone 8.1

In WP 8.0 we could store an object in IsolatedStorageSettings. In WP 8.1 the object is not being stored. Is there any way to store an object in WP 8.1?
WRITE OBJECT CODE
NewsList = new ObservableCollection<New>(e.News);
var FileName = "News.xml";
DataContractSerializer serializer = new DataContractSerializer(typeof(ObservableCollection<New>));
var localFolder = ApplicationData.Current.LocalFolder;
var file = await localFolder.CreateFileAsync(FileName, CreationCollisionOption.ReplaceExisting);
IRandomAccessStream sessionRandomAccess = await file.OpenAsync(FileAccessMode.ReadWrite);
IOutputStream sessionOutputStream = sessionRandomAccess.GetOutputStreamAt(0);
serializer.WriteObject(sessionOutputStream.AsStreamForWrite(), NewsList);
READ OBJECT CODE
var FileNameNews = "News.xml";
DataContractSerializer serializer = new DataContractSerializer(typeof(ObservableCollection<New>));
var localFolder = ApplicationData.Current.LocalFolder;
var newsFile = await localFolder.GetFileAsync(FileNameNews);
IInputStream sessionInputStream = await newsFile.OpenReadAsync();
newsVM = new NewsViewModel();
NewsVM.NewsList = (ObservableCollection<New>)serializer.ReadObject(sessionInputStream.AsStreamForRead());
I'm getting an error on this line:
IInputStream sessionInputStream = await newsFile.OpenReadAsync();
What mistake is there in this code?
Thanks
This is how I do it. No using statements; I try to avoid the Stream syntax as much as possible.
Your error is very likely caused either by concurrency (accessing the same file at the same time will throw an exception) or by the stream not being closed properly. I think it is the latter.
You do not dispose of your Stream objects properly (learn the using () {} syntax), which means that the stream remains OPEN after you're done writing. That means you hit the concurrency issue the second time you write, because you're trying to access a stream that's already open.
public async Task CreateOrUpdateData(string key, object o)
{
    try
    {
        if (o != null)
        {
            var sessionFile = await _localFolder.CreateFileAsync(key, CreationCollisionOption.ReplaceExisting);
            var outputString = JToken.FromObject(o).ToString();
            await FileIO.WriteTextAsync(sessionFile, outputString);
        }
    }
    catch (Exception e)
    {
        Debug.WriteLine("Encountered exception: {0}", e);
    }
}

public async Task<T> GetDataOrDefault<T>(string key, T defaultValue)
{
    try
    {
        T results = defaultValue;
        var sessionFile = await _localFolder.CreateFileAsync(key, CreationCollisionOption.OpenIfExists);
        var data = await FileIO.ReadTextAsync(sessionFile);
        if (!String.IsNullOrWhiteSpace(data))
        {
            results = JToken.Parse(data).ToObject<T>();
        }
        return results;
    }
    catch (Exception e)
    {
        Debug.WriteLine("Encountered exception: {0}", e);
    }
    return defaultValue;
}