SVF2 Public Beta - Models are not translated to SVF2 - autodesk-forge

I just tried the SVF2 public beta.
I ran the translation on four models, but none of them worked. They are still in SVF format, and my cloud credits have been deducted.
As explained in the documentation, I only changed output.formats[].type to "svf2":
const job = {
  input,
  output: {
    // force: true,
    formats: [
      {
        views: ["2d", "3d"],
        type: "svf2",
      },
    ],
  },
};
I am using the Node.js SDK version "forge-apis": "^0.7.3" and Viewer version 7.29 with these init options:
const viewerEnv = await this.initialize({
  // useConsolidation: true,
  // env: dbModel.env,
  // edgeRendering: true,
  // lightPreset: "Boardwalk",
  // envMapBackground: true,
  // getAccessToken: function (onGetAccessToken) {
  //   onGetAccessToken(accessToken, expireTimeSeconds)
  // },
  env: "MD20ProdUS",
  api: "D3S",
});
I checked the output format with:
For all four translations, the translation-progress callback stopped between 90% and 98%. I never reached the 100% callback, but all of the models were translated.

We have not updated the Node.js SDK for SVF2 yet, so I suspect that even with those changes, it may be reverting to SVF.
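Until the SDK catches up, one workaround is to call the Model Derivative REST endpoint directly, so the "svf2" output type is sent verbatim. This is a sketch: buildTranslationJob only assembles the job payload, and the commented-out POST (with a placeholder accessToken) shows where it would be sent.

```javascript
// Assemble a Model Derivative translation job payload with the
// SVF2 output type, bypassing the not-yet-updated forge-apis SDK.
function buildTranslationJob(urn) {
  return {
    input: { urn },
    output: {
      formats: [
        {
          type: "svf2",
          views: ["2d", "3d"],
        },
      ],
    },
  };
}

// Posting the job (untested sketch; accessToken is a placeholder):
// await fetch("https://developer.api.autodesk.com/modelderivative/v2/designdata/job", {
//   method: "POST",
//   headers: {
//     "Authorization": `Bearer ${accessToken}`,
//     "Content-Type": "application/json",
//     "x-ads-force": "true", // re-translate even if a derivative already exists
//   },
//   body: JSON.stringify(buildTranslationJob(urn)),
// });
```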

Related

Autodesk forge configurator inventor add new model authorization error

Hello, we are using Autodesk Forge Configurator Inventor, and we created our own JS function. Below you will find the logic we want to import into the application. On its own it works, but with Forge Configurator Inventor we get an authentication error. We tried a lot of different options but failed to make it load the document.
The error is: GET 401 (Unauthorized)
import repo from '../../Repository';

var options = repo.hasAccessToken()
  ? { accessToken: repo.getAccessToken() }
  : { env: 'Local' };
var documentId = 'urn:MyUrn';

Autodesk.Viewing.Document.load(documentId, (doc) => {
  console.log("test");
  let items = doc.getRoot().search(
    {
      type: "geometry",
      role: "3d",
    },
    true
  );
  if (items.length === 0) {
    console.error("Document contains no viewables.");
    return;
  }
  viewer.loadDocumentNode(doc, items[0], {
    keepCurrentModels: true,
    // placementTransform: tr,
  })
    .then(function (model2) {
      secondModel = model2;
      let tr = secondModel.getPlacementTransform();
      let _selecterTr = _selectedModel.getPlacementTransform();
      console.log(_selecterTr);
      tr = _selecterTr;
      secondModel.setPlacementTransform(tr);
      viewer.impl.invalidate(true, true, true);
    });
}, onDocumentLoadFailure, options);

function onDocumentLoadFailure() {
  console.error('Failed fetching Forge manifest');
}
The main issue is that repo does not provide an access token, because one was never needed: the models' SVF contents are always loaded directly from the server instead of relying on the Model Derivative service to provide them.
You just have to provide an endpoint (e.g. /api/viewables/token) on the server side, by adding a controller that the client can call to get an access token with the viewables:read scope.
Detailed information here:
https://forge.autodesk.com/blog/drag-and-drop-design-automation-inventor-sample

Switch AutoDesk Forge Viewer Environment from Local to AutodeskProduction

We are using the autodesk-forge viewer to show 3D models. We have a requirement to load models from both the local environment and the Autodesk cloud environment.
Previously we were using only the local environment, with the viewer initialized with "env": "Local":
let initOptions = {
  'env': 'Local',
  'language': 'en'
};
Autodesk.Viewing.Initializer(initOptions);
Now we need to switch between the local environment and AutodeskProduction. Is that possible? To achieve the switch, what do I have to do? Do I have to re-initialize the viewer with the new env?
Please help me.
Simply terminate/finish the previous/existing Viewer and re-initialize as you normally would all over again:
Autodesk.Viewing.Initializer({
  env: 'Local',
  getAccessToken: /* ... */
}, () => {
  // viewer = ...
});

// ... later, to switch:
viewer.finish();
Autodesk.Viewing.Initializer({
  env: 'AutodeskProduction',
  getAccessToken: /* ... */
}, () => {
  // viewer = ...
});

Is there any way to output logs using log4js on Cloud Functions?

I'm using GCP's Cloud Functions. What I want to achieve is to output logs via log4js.
I know that using console.xxx() works well, and I have tried it.
Environment:
- Google Cloud Functions
- Functions Framework
- nodejs10 runtime
logger.js
const log4js = require('log4js');
const logger = exports = module.exports = {};

log4js.configure({
  appenders: {
    out: { type: 'console' },
    trail: {
      type: 'dateFile',
      filename: './logs/trail',
      pattern: '-yyyy-MMdd-hh.log',
      alwaysIncludePattern: true
    }
  },
  categories: {
    default: { appenders: ['out'], level: 'info' },
    trail: { appenders: ['trail'], level: 'DEBUG' }
  }
});

logger.trail = log4js.getLogger('trail');
index.js
const { logger } = require('./logger');

exports.spTest = (pubSubEvent, context) => {
  console.log('console.log should appear'); // => properly logged
  logger.trail.error('logger should appear'); // => doesn't show up
};
Thanks in advance!
According to the official documentation:
Cloud Logging is part of Google Cloud's operations suite of products in Google Cloud. It includes storage for logs, a user interface called the Logs Viewer, and an API to manage logs programmatically.
And also, on custom StackDriver logs:
Cloud Functions logs are backed by StackDriver Logging. You can use the StackDriver Logging library for Node.js to log events with structured data, enabling easier analysis and monitoring.
const { Logging } = require('@google-cloud/logging');

// ...
// Instantiate the StackDriver Logging SDK. The project ID will
// be automatically inferred from the Cloud Functions environment.
const logging = new Logging();
const log = logging.log('my-custom-log-name');

// This metadata is attached to each log entry. It specifies a fake
// Cloud Function called 'CustomMetrics' in order to make your custom
// log entries appear in the Cloud Functions logs viewer.
const METADATA = {
  resource: {
    type: 'cloud_function',
    labels: {
      function_name: 'CustomMetrics',
      region: 'us-central1'
    }
  }
};

// ...
// Data to write to the log. This can be a JSON object with any properties
// of the event you want to record.
const data = {
  event: 'my-event',
  value: 'foo-bar-baz',
  // The optional 'message' property will show up in the Firebase
  // console and other human-readable logging surfaces
  message: 'my-event: foo-bar-baz'
};

// Write to the log. The log.write() call returns a Promise if you want to
// make sure that the log was written successfully.
const entry = log.entry(METADATA, data);
log.write(entry);
Therefore, I do not think you can use log4js on Cloud Functions.
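A lighter workaround, rather than a log4js fix: Cloud Logging parses JSON lines written to stdout and honors fields such as "severity", so a tiny wrapper gives you leveled, structured logs without any file appender (the function's filesystem is read-only apart from /tmp, which is likely why the dateFile appender produces nothing). This is a sketch; the field names beyond "severity" and "message" are your own.

```javascript
// Emit one structured JSON log line per call; Cloud Logging picks up
// the "severity" field and files the entry at the right level.
function logEntry(severity, message, data) {
  const entry = Object.assign({ severity: severity, message: message }, data || {});
  console.log(JSON.stringify(entry)); // one JSON object per line
  return entry; // returned to make the wrapper easy to inspect/test
}

logEntry("ERROR", "logger should appear", { event: "sp-test" });
```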

Ember data - Cannot read property 'typeKey' of undefined

Trying to load a plan model, embedded, into my app model.
I keep getting the following error when loading (it saves just fine):
Cannot read property 'typeKey' of undefined TypeError: Cannot read property 'typeKey' of undefined
at Ember.Object.extend.modelFor (http://localhost:4200/assets/vendor.js:71051:22)
at Ember.Object.extend.recordForId (http://localhost:4200/assets/vendor.js:70496:21)
at deserializeRecordId (http://localhost:4200/assets/vendor.js:71500:27)
at http://localhost:4200/assets/vendor.js:71477:11
at http://localhost:4200/assets/vendor.js:69701:20
at http://localhost:4200/assets/vendor.js:17687:20
at Object.OrderedSet.forEach (http://localhost:4200/assets/vendor.js:17530:14)
at Object.Map.forEach (http://localhost:4200/assets/vendor.js:17685:14)
at Function.Model.reopenClass.eachRelationship (http://localhost:4200/assets/vendor.js:69700:42)
at normalizeRelationships (http://localhost:4200/assets/vendor.js:71463:12)
With that said, I have the following models:
app/models/app.js
export default DS.Model.extend({
  name: attribute('string'),
  domain: attribute('string'),
  plan: DS.belongsTo('plan', { embedded: 'load' }),
  creator: DS.belongsTo('user', { async: true }),
  time_stamp: attribute('string', {
    defaultValue: function () {
      return moment().format("YYYY/MM/DD HH:mm:ss");
    }
  })
});
app/models/plan.js
export default DS.Model.extend({
  price: attribute('number'),
  description: attribute('string'),
  tagline: attribute('string'),
  title: attribute('string'),
  features: attribute('array') // Array is defined in a transform, don't worry.
});
Plan being kind of a static document.
Here's my server response when calling store.get('creator.apps'):
{
  "apps": [
    {
      "_id": "53da9994b2878d0000a2e68f",
      "name": "Myapp",
      "domain": "http://myapp.com",
      "creator": "53d9598bb25244e9b1a72e53",
      "plan": {
        "_id": "53d93c44b760612f9d07c921",
        "price": 0,
        "description": "Free plan",
        "tagline": "Great for testing",
        "title": "Developer",
        "features": ["5,000 Requests", "API/Plugin Access"],
        "__v": 0
      },
      "time_stamp": "2014/07/31 13:31:32",
      "__v": 0
    }
  ]
}
I realize that the typeKey error is due to Ember not finding a model for the response. I can confirm that it finds the app type, firing a hook under normalizeHash.apps.
Sorry this is such a long post; I just can't wrap my head around the cause of the issue!
App.Thing = DS.Model.extend({
  name: attr('string'),
  children: DS.hasMany('child', { inverse: null })
});

App.ThingSerializer = DS.RESTSerializer.extend(DS.EmbeddedRecordsMixin, {
  attrs: {
    children: { embedded: 'always' }
  }
});
DS.EmbeddedRecordsMixin must be mixed into your serializer, and you must have `embedded: 'always'` for the relevant attribute.
If you have a Thing model, you can make Ember Data load the embedded children (here, an array of nested objects) by using the model-specific serializer.
Resources:
http://emberjs.com/guides/models/customizing-adapters/
http://emberjs.com/api/data/classes/DS.EmbeddedRecordsMixin.html
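Applied to the models in the question, the app serializer might look like this (a sketch; the primaryKey mapping is an assumption based on the Mongo-style "_id" fields in the payload):

```javascript
// app/serializers/app.js (hypothetical file path)
export default DS.RESTSerializer.extend(DS.EmbeddedRecordsMixin, {
  primaryKey: '_id', // the payload uses Mongo-style _id keys
  attrs: {
    plan: { embedded: 'always' } // deserialize the nested plan object
  }
});
```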
Alternatively, Ember doesn't want the record embedded in the parent record's JSON at all. Do whatever you need to get your JSON like this, with just the plan id:
{
  "apps": [
    {
      "_id": "53da9994b2878d0000a2e68f",
      "name": "Myapp",
      "domain": "http://myapp.com",
      "creator": "53d9598bb25244e9b1a72e53",
      "plan_id": "53d93c44b760612f9d07c921", // output just the id, not the embedded record
      "time_stamp": "2014/07/31 13:31:32",
      "__v": 0
    }
  ]
}
This then lets Ember look up the related model itself, using async: true:
export default DS.Model.extend({
  name: attribute('string'),
  domain: attribute('string'),
  plan: DS.belongsTo('plan', { async: true }), // changed
  creator: DS.belongsTo('user', { async: true }),
  time_stamp: attribute('string', {
    defaultValue: function () {
      return moment().format("YYYY/MM/DD HH:mm:ss");
    }
  })
});
I've just gone through the pain of this myself and, with some help, found an answer.
For anyone else who's come here and still has issues, read my answer to my own question for a detailed rundown of what the typeKey error means and the further steps I used to resolve the issue:
Deploying ember-rails to Heroku - TypeError: Cannot read property 'typeKey' of undefined

IBM Worklight JSONStore - Add and get data

I am using the Worklight JSONStore and I am new to it. I tried searching, and read all the docs, but didn't get much of an idea.
I have a login page from which I get some JSON data. I want to store that data using JSONStore, and retrieve it afterwards.
I made a JSONStore adapter.
I made jsonstore adapter.
Json-Store-Impl.js
function getJsonStores(custData) {
  var data = custData; // custData is JSON
  return data;
}
function addJsonStore(param1) {
  var input = {
    method : 'put',
    returnedContentType : 'json',
    path : 'userInputRequired'
  };
  return WL.Server.invokeHttp(input);
}

function updateJsonStore(param1) {
  var input = {
    method : 'post',
    returnedContentType : 'json',
    path : 'userInputRequired'
  };
  return WL.Server.invokeHttp(input);
}

function deleteJsonStore(param1) {
  var input = {
    method : 'delete',
    returnedContentType : 'json',
    path : 'userInputRequired'
  };
  return WL.Server.invokeHttp(input);
}
After that I create a local JSONStore collection.
famlCollection.js
;(function () {
  WL.JSONStore.init({
    faml : {
      searchFields: {"response.mci.txnid":"string","response.mci.scrnseqnbr":"string","response.loginUser":"string","request.fldWebServerId":"string","response.fldRsaImageHeight":"string","request.fldRequestId":"string","request.fldTxnId":"string","response.fldDeviceTokenFSO":"string","response.fldRsaCollectionRequired":"string","response.datlastsuccesslogin":"string","response.fldRsaUserPhrase":"string","response.fldRsaAuthTxnId":"string","response.rc.returncode":"string","response.datcurrentlogin":"string","response.mci.deviceid":"string","response.customername":"string","request.fldDeviceId":"string","response.fldRsaUserStatus":"string","request.fldScrnSeqNbr":"string","response.fldRsaImageWidth":"string","request.fldLangId":"string","response.fldTptCustomer":"string","response.encflag":"string","response.rc.errorcode":"string","response.fldRsaImagePath":"string","response.mci.appid":"string","response.mci.requestid":"string","response.rc.errormessage":"string","response.mci.appserverid":"string","response.fldRsaCollectionType":"string","request.fldAppId":"string","response.fldRsaImageId":"string","request.fldLoginUserId":"string","response.mci.sessionid":"string","response.mci.langid":"string","response.mci.remoteaddress":"string","request.fldAppServerId":"string","response.mci.webserverid":"string","response.fldRsaImageText":"string","response.fldRsaEnrollRequired":"string","response.fldRsaActivityFlag":"string"},
      adapter : {
        name: 'JsonStore',
        replace: 'updateJsonStore',
        remove: 'deleteJsonStore',
        add: 'addJsonStore',
        load: {
          procedure: 'getJsonStores',
          params: [],
          key: 'faml'
        },
        accept: function (data) {
          return (data.status === 200);
        }
      }
    }
  }, {
    password : 'PleaseChangeThisPassword'
  })
  .then(function () {
    WL.Logger.debug(['Take a look at the JSONStore documentation and getting started module for more details and code samples.',
      'At this point there is no data inside your collection ("faml"), but JSONStore is ready to be used.',
      'You can use WL.JSONStore.get("faml").load() to load data from the adapter.',
      'These are some common JSONStore methods: load, add, replace, remove, count, push, find, findById, findAll.',
      'Most operations are asynchronous, wait until the last operation finished before calling the next one.',
      'JSONStore is currently supported for production only in Android and iOS environments.',
      'Search Fields are not dynamic, call WL.JSONStore.destroy() and then initialize the collection with the new fields.'].join('\n'));
  })
  .fail(function (errObj) {
    WL.Logger.ctx({pretty: true}).debug(errObj);
  });
}());
When I click on the login button, I call getJsonStores like this:
getJsonStores = function () {
  custData = responseData();
  var invocationData = {
    adapter : "JsonStore",
    procedure : "getJsonStores",
    parameters : [custData],
    compressResponse : true
  };
  // WL.Logger.debug('invoke msg ' + invocationData, '');
  WL.Client.invokeProcedure(invocationData, {
    onSuccess : sucess,
    onFailure : AdapterFail,
    timeout : timeout
  });
};
I followed these steps.
Is this the right way? How can I check whether JSONStore is working locally? How can I store my JSON data in JSONStore? Where should I initialize the wlCommonInit function in the project?
Please help me out.
Open main.js, find the wlCommonInit function, and add the JSONStore init code there:
WL.JSONStore.init(...)
You already have an adapter that returns the data you want to add to JSONStore; call it any time after init has finished:
WL.Client.invokeProcedure(...)
Inside the onSuccess callback (the function that executes when you successfully get data from the adapter), start using the JSONStore API. One high-level way to write the code: if the collection is empty (the count API returns 0), add all documents to the collection.
WL.JSONStore.get(collectionName).count()
  .then(function (countResult) {
    if (countResult === 0) {
      // collection is empty, add data
      WL.JSONStore.get(collectionName).add([{ name: 'carlos' }, { name: 'mike' }])
        .then(function () {
          // data stored successfully
        });
    }
  });
Instead of adding [{name: 'carlos'}, {name: 'mike'}], you probably want to add the data returned from the adapter.
Later in your application, you can use the find API to get data back:
WL.JSONStore.get(collectionName).findAll()
  .then(function (findResults) {
    // ...
  });
There is also a find API that takes queries (e.g. {name: 'carlos'}); look at the getting-started module and the documentation for details.
It's worth mentioning that the JSONStore API is asynchronous: you must wait for the callbacks before performing the next operation.
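The query form of find mentioned above might look like this (a sketch: 'faml' and the search field come from the init code in the question, the queried value is made up, and exact option support may vary by Worklight version):

```javascript
WL.JSONStore.get('faml')
  .find({ 'response.loginUser': 'jdoe' }, { limit: 10 })
  .then(function (findResults) {
    // each result looks like { _id: ..., json: { ...document... } }
  })
  .fail(function (errObj) {
    WL.Logger.ctx({ pretty: true }).debug(errObj);
  });
```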