I am trying to translate an IFC file to SVF. I'm doing the upload in multiple parts, and that seems to be correct.
This is the code I'm using to translate:
public async convertFileToSvfFormat(forgeApiAccessToken: string, urn: string): Promise<any> {
const forgeFileConversionResponse = await firstValueFrom(
this.forgeHttpService.post(
`modelderivative/v2/designdata/job`,
JSON.stringify({
input: {
urn,
},
output: {
formats: [
{
type: this.DEFAULT_FORGE_FORMAT_TYPE, // 'svf'
views: this.DEFAULT_FORGE_FORMAT_VIEWS, // ["2d", "3d"]
},
],
},
}),
{
headers: {
"Authorization": `Bearer ${forgeApiAccessToken}`,
"Content-Type": "application/json",
},
},
),
)
return forgeFileConversionResponse
}
But after a while the manifest endpoint returns this:
{
"urn": "dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6bnY5a2cyZnB5bTFsdDQ5MDkxdzdobXVsbzlha3RldXdfdHV0b3JpYWxfYnVja2V0L2MyMWE2MzcwZTMwNzJmM2IwZTA3OWE5MzRjYWM4YTZlLmlmYw",
"derivatives": [
{
"hasThumbnail": "false",
"name": "c21a6370e3072f3b0e079a934cac8a6e.ifc",
"progress": "complete",
"messages": [
{
"type": "error",
"message": "Unrecoverable exit code from extractor: -1073741829",
"code": "TranslationWorker-InternalFailure"
}
],
"outputType": "svf",
"status": "failed"
}
],
"hasThumbnail": "false",
"progress": "complete",
"type": "manifest",
"region": "US",
"version": "1.0",
"status": "failed"
}
Am I missing something? How can I find out what the error is?
Thanks!
Edit:
Here is the method I'm using to upload the files.
public async uploadFileToForgeBucket(forgeApiAccessToken: string, file: Express.Multer.File): Promise<void> {
// When de-hardcoding the Forge account to assign accounts to each user, de-hardcode this value here
const bucketKey = this.getBucketKey(this.hardcodedForgeClientId)
const path = join(__dirname, "../../../../upload/") + file.filename
const parts = Math.floor(this.calculateFileChuncks(path))
const signedUrls = await this.getS3SignedUrl(forgeApiAccessToken, file.filename, parts)
parts > 1 && splitFileInChunks(path, parts)
console.log(`Number of parts: ${parts}`)
const timerId = setTimeout(async () => {
const fileName = parts > 1 ? `${file.filename}.sf-part${parts}` : file.filename
const exists = existsSync(`./upload/${fileName}`)
console.log(exists && "Last chunk exists and read...")
if (exists) {
console.log("Starting upload...")
for (let i = 0; i < signedUrls.urls.length; i++) {
const currentUrl = signedUrls.urls[i]
let part: string
if (parts > 9) {
part = (i + 1).toString().length === 1 ? `0${i + 1}` : (i + 1).toString()
} else {
part = (i + 1).toString()
}
const fileName = parts > 1 ? path + `.sf-part${part}` : path
const stream = readFileSync(fileName)
await firstValueFrom(
this.forgeHttpService.put(
currentUrl,
{ data: stream },
{
headers: {
"Content-Type": "application/octet-stream",
"Content-Length": stream.length,
// "Content-Range": `bytes 0-${stream.length}`,
"Content-Disposition": `${file.filename}.sf-part${part}`,
},
},
),
)
const percentage = ((i + 1) / parts) * 100
console.log(`Uploading file... ${percentage.toFixed(2)}%`)
}
console.log("Upload complete. Processing...")
console.log("Re-unifying file:", `${file.filename}`)
const result = await firstValueFrom(
this.forgeHttpService.post(
`https://developer.api.autodesk.com/oss/v2/buckets/${bucketKey}/objects/${file.filename}/signeds3upload`,
{
uploadKey: signedUrls.uploadKey,
},
{
headers: {
"Authorization": `Bearer ${forgeApiAccessToken}`,
"Content-Type": "application/json",
},
},
),
)
clearInterval(timerId)
console.log(result.data)
const toBase64 = stringToBase64(result.data.objectId)
console.log(toBase64)
const translation = await this.convertFileToSvfFormat(forgeApiAccessToken, toBase64)
console.log(translation)
// console.log(translation, "trans")
}
}, 2000)
The result for this is:
data: {
result: 'success',
urn: 'dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6eG02ZjIxbWUxMHNxbTc4NmlhY3c2cTV6bjUxeWdpZmhfdHV0b3JpYWxfYnVja2V0L3JhY19iYXNpY19zYW1wbGVfcHJvamVjdF92My5pZmM',
acceptedJobs: { output: [Object] }
}
After that I call the translation method, which ends up failing with the error code I copied before.
What is your IFC schema version, IFC2x3 or IFC4?
For large IFC files and for IFC4, I would advise you to use the v3 IFC conversion method; here is the modification to your code. You can find the comparison table here: https://forge.autodesk.com/blog/faq-and-tips-ifc-translation-model-derivative-api
public async convertFileToSvfFormat(forgeApiAccessToken: string, urn: string): Promise<any> {
const forgeFileConversionResponse = await firstValueFrom(
this.forgeHttpService.post(
`modelderivative/v2/designdata/job`,
JSON.stringify({
input: {
urn,
},
output: {
formats: [
{
type: this.DEFAULT_FORGE_FORMAT_TYPE, // 'svf'
views: this.DEFAULT_FORGE_FORMAT_VIEWS, // ["3d"],
advanced: {
conversionMethod: 'v3'
}
},
],
},
}),
{
headers: {
"Authorization": `Bearer ${forgeApiAccessToken}`,
"Content-Type": "application/json",
"x-ads-force": true,
},
},
),
)
return forgeFileConversionResponse
}
Here is an example manifest when using v3; the IFC Loader should be 3.
{
"urn": "dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6bXlidWNrZXQvcmFjX2Jhc2ljX3NhbXBsZV9wcm9qZWN0X3YzLmlmYw",
"derivatives": [
{
"hasThumbnail": "true",
"children": [
{
"guid": "d16f9899-ea0f-b40e-73fa-876f51b1352a",
"type": "geometry",
"role": "3d",
"name": "rac_basic_sample_project_v3.ifc",
"status": "success",
"viewableID": "rac_basic_sample_project_v3.ifc",
"hasThumbnail": "true",
"progress": "complete",
"useAsDefault": true,
"children": [
{
"guid": "72890b2b-0c4e-4e60-a168-3764b2c9922a",
"type": "view",
"role": "3d",
"name": "Default",
"status": "success",
"hasThumbnail": "true",
"camera": [
-11528.884765625,
-183404.859375,
5052.05517578125,
-11528.884765625,
-29944.34375,
5052.05517578125,
0,
-2.220446049250313e-16,
1,
1,
0.785398006439209,
1,
0
],
"useAsDefault": true,
"children": [
{
"urn": "urn:adsk.viewing:fs.file:dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6bXlidWNrZXQvcmFjX2Jhc2ljX3NhbXBsZV9wcm9qZWN0X3YzLmlmYw/output/0/0_100.png",
"role": "thumbnail",
"mime": "image/png",
"guid": "92474384-5af9-478e-b98a-be02e5005eab",
"type": "resource",
"resolution": [
100,
100
]
},
{
"urn": "urn:adsk.viewing:fs.file:dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6bXlidWNrZXQvcmFjX2Jhc2ljX3NhbXBsZV9wcm9qZWN0X3YzLmlmYw/output/0/0_200.png",
"role": "thumbnail",
"mime": "image/png",
"guid": "5567e9a6-2de2-46a3-a818-06cf1a8b916c",
"type": "resource",
"resolution": [
200,
200
]
},
{
"urn": "urn:adsk.viewing:fs.file:dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6bXlidWNrZXQvcmFjX2Jhc2ljX3NhbXBsZV9wcm9qZWN0X3YzLmlmYw/output/0/0_400.png",
"role": "thumbnail",
"mime": "image/png",
"guid": "02792503-61de-4880-b5bd-ffe17f06c9f8",
"type": "resource",
"resolution": [
400,
400
]
}
]
},
{
"urn": "urn:adsk.viewing:fs.file:dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6bXlidWNrZXQvcmFjX2Jhc2ljX3NhbXBsZV9wcm9qZWN0X3YzLmlmYw/output/0/0.svf",
"role": "graphics",
"mime": "application/autodesk-svf",
"guid": "a65292fb-4676-47a2-a632-69b3bd01af30",
"type": "resource"
}
]
},
{
"urn": "urn:adsk.viewing:fs.file:dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6bXlidWNrZXQvcmFjX2Jhc2ljX3NhbXBsZV9wcm9qZWN0X3YzLmlmYw/output/0/AECModelData.json",
"role": "Autodesk.AEC.ModelData",
"mime": "application/json",
"guid": "b13160e3-df69-4e43-a0e9-fd7c0b2e8d3a",
"type": "resource",
"status": "success"
},
{
"urn": "urn:adsk.viewing:fs.file:dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6bXlidWNrZXQvcmFjX2Jhc2ljX3NhbXBsZV9wcm9qZWN0X3YzLmlmYw/output/0/properties.db",
"role": "Autodesk.CloudPlatform.PropertyDatabase",
"mime": "application/autodesk-db",
"guid": "06aac8bb-14c7-4775-9d3d-059a26b620ed",
"type": "resource",
"status": "success"
}
],
"name": "rac_basic_sample_project_v3.ifc",
"progress": "complete",
"outputType": "svf",
"properties": {
"Document Information": {
"Navisworks File Creator": "LcNwcLoaderPlugin:lcldifc",
"IFC Application Name": "Autodesk Revit 2022 (ENU)",
"IFC Application Version": "2022",
"IFC Organization": "Autodesk",
"IFC Schema": "IFC2X3",
"IFC Loader": "3"
}
},
"status": "success"
}
],
"hasThumbnail": "true",
"progress": "complete",
"type": "manifest",
"region": "US",
"version": "1.0",
"status": "success"
}
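To confirm which loader was actually used once the job finishes, you can poll the manifest. Here is a minimal sketch, assuming the same forgeHttpService wrapper and relative base URL as in the job call above (the get method and the helper name are illustrative, not part of your code):
public async getTranslationManifest(forgeApiAccessToken: string, urn: string): Promise<any> {
  // GET the Model Derivative manifest for the given base64-encoded URN
  const manifestResponse = await firstValueFrom(
    this.forgeHttpService.get(
      `modelderivative/v2/designdata/${urn}/manifest`,
      {
        headers: {
          "Authorization": `Bearer ${forgeApiAccessToken}`,
        },
      },
    ),
  )
  // manifestResponse.data.status is "pending" | "inprogress" | "success" | "failed" | "timeout"
  // With the v3 conversion, derivatives[n].properties["Document Information"]["IFC Loader"] should be "3"
  return manifestResponse
}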
After some days of trying I found the problem. I leave it here in case someone else has the same error and couldn't solve it after trying all the things above.
In my case it was as simple as creating a new bucket; after doing so, it worked. Maybe the one I was trying with had expired or something, I don't know.
I use a script that runs a Nest thermostat. It works well for me, but when I share it with another person who has multiple thermostats, it does not work, even though I set up the script to run on only one of his thermostats.
This is the script I am talking about:
function changetemp() {
// enter the thermostat ID manually after running makerequesttest function
const THERMOSTAT = '';
const smartService = getSmartService();
const access_token = smartService.getAccessToken();
const url = 'https://smartdevicemanagement.googleapis.com/v1';
const endpoint = '/enterprises/' + PROJECT_ID + '/devices';
const headers = {
'Authorization': 'Bearer ' + access_token,
'Content-Type': 'application/json'
}
const params = {
'headers': headers,
'method': 'get',
'muteHttpExceptions': true
}
try {
const response = UrlFetchApp.fetch(url + endpoint, params);
const responseCode = response.getResponseCode();
const responseBody = JSON.parse(response.getContentText());
console.log("Response code: " + responseCode);
console.log(responseBody);
const devices = responseBody['devices'];
const device = devices.find(d => d.name === 'enterprises/' + PROJECT_ID + '/devices/' + THERMOSTAT)
if (!device) {
console.log("Thermostat with ID " + THERMOSTAT + " not found.")
return
}
} catch (e) {
console.log('Error: ' + e);
}
}
The person with multiple thermostats gets the "not found" message, while it works for me.
The THERMOSTAT const value comes from this function, which we execute first:
function makeRequesttest() {
// get the smart service
const smartService = getSmartService();
// get the access token
const access_token = smartService.getAccessToken();
//console.log(access_token);
// setup the SMD API url
const url = 'https://smartdevicemanagement.googleapis.com/v1';
const endpoint = '/enterprises/' + PROJECT_ID + '/devices';
// setup the headers for the call
const headers = {
'Authorization': 'Bearer ' + access_token,
'Content-Type': 'application/json'
}
// set up params
const params = {
'headers': headers,
'method': 'get',
'muteHttpExceptions': true
}
// try calling API
try {
const response = UrlFetchApp.fetch(url + endpoint, params);
const responseBody = JSON.parse(response.getContentText());
Logger.log('response: ' + response);
return responseBody;
}
catch(e) {
console.log('Error: ' + e);
//throw e;
}
}
The makeRequesttest() function gives me this result:
response: {
"devices": [
{
"name": "enterprises/AAA/devices/BBB",
"type": "sdm.devices.types.THERMOSTAT",
"assignee": "enterprises/AAA/structures/CCC/rooms/DDD",
"traits": {
"sdm.devices.traits.Info": {
"customName": ""
},
"sdm.devices.traits.Humidity": {
"ambientHumidityPercent": 58
},
"sdm.devices.traits.Connectivity": {
"status": "ONLINE"
},
"sdm.devices.traits.Fan": {},
"sdm.devices.traits.ThermostatMode": {
"mode": "OFF",
"availableModes": [
"HEAT",
"OFF"
]
},
"sdm.devices.traits.ThermostatEco": {
"availableModes": [
"OFF",
"MANUAL_ECO"
],
"mode": "OFF",
"heatCelsius": 13.04544,
"coolCelsius": 24.44443
},
"sdm.devices.traits.ThermostatHvac": {
"status": "OFF"
},
"sdm.devices.traits.Settings": {
"temperatureScale": "CELSIUS"
},
"sdm.devices.traits.ThermostatTemperatureSetpoint": {},
"sdm.devices.traits.Temperature": {
"ambientTemperatureCelsius": 9.20999
}
},
"parentRelations": [
{
"parent": "enterprises/AAA/structures/EEE/rooms/FFF",
"displayName": "Office"
}
]
}
]
}
With that log, I manually update the THERMOSTAT const value.
The person with multiple thermostats gets this result with makeRequesttest():
Logging output too large. Truncating output. response: {
"devices": [
{
"name": "enterprises/XXX/devices/AVPHwGGG",
"type": "sdm.devices.types.THERMOSTAT",
"assignee": "enterprises/XXX/structures/YYY/rooms/ZZZ",
"traits": {
"sdm.devices.traits.Info": {
"customName": ""
},
"sdm.devices.traits.Humidity": {
"ambientHumidityPercent": 28
},
"sdm.devices.traits.Connectivity": {
"status": "ONLINE"
},
"sdm.devices.traits.Fan": {},
"sdm.devices.traits.ThermostatMode": {
"mode": "HEAT",
"availableModes": [
"HEAT",
"OFF"
]
},
"sdm.devices.traits.ThermostatEco": {
"availableModes": [
"OFF",
"MANUAL_ECO"
],
"mode": "OFF",
"heatCelsius": 16.52971,
"coolCelsius": 24.44444
},
"sdm.devices.traits.ThermostatHvac": {
"status": "OFF"
},
"sdm.devices.traits.Settings": {
"temperatureScale": "FAHRENHEIT"
},
"sdm.devices.traits.ThermostatTemperatureSetpoint": {
"heatCelsius": 17.777779
},
"sdm.devices.traits.Temperature": {
"ambientTemperatureCelsius": 19.17
}
},
"parentRelations": [
{
"parent": "enterprises/XXX/structures/YYY/rooms/ZZZ",
"displayName": "Notinteresting1"
}
]
},
{
"name": "enterprises/XXX/devices/AVPHAAA",
"type": "sdm.devices.types.THERMOSTAT",
"assignee": "enterprises/XXX/structures/YYY/rooms/BBB",
"traits": {
"sdm.devices.traits.Info": {
"customName": ""
},
"sdm.devices.traits.Humidity": {
"ambientHumidityPercent": 27
},
"sdm.devices.traits.Connectivity": {
"status": "ONLINE"
},
"sdm.devices.traits.Fan": {},
"sdm.devices.traits.ThermostatMode": {
"mode": "HEAT",
"availableModes": [
"HEAT",
"OFF"
]
},
"sdm.devices.traits.ThermostatEco": {
"availableModes": [
"OFF",
"MANUAL_ECO"
],
"mode": "OFF",
"heatCelsius": 12.77776,
"coolCelsius": 24.44444
},
"sdm.devices.traits.ThermostatHvac": {
"status": "HEATING"
},
"sdm.devices.traits.Settings": {
"temperatureScale": "FAHRENHEIT"
},
"sdm.devices.traits.ThermostatTemperatureSetpoint": {
"heatCelsius": 18.98769
},
"sdm.devices.traits.Temperature": {
"ambientTemperatureCelsius": 19.03
}
},
"parentRelations": [
{
"parent": "enterprises/XXX/structures/YYY/rooms/BBB",
"displayName": "Notinteresting2"
}
]
},
{
"name": "enterprises/XXX/devices/CCC",
"type": "sdm.devices.types.THERMOSTAT",
"assignee": "enterprises/XXX/structures/YYY/rooms/DDD",
"traits": {
"sdm.devices.traits.Info": {
"customName": ""
},
"sdm.devices.traits.Humidity": {
"ambientHumidityPercent": 29
},
"sdm.devices.traits.Connectivity": {
"status": "ONLINE"
},
"sdm.devices.traits.Fan": {},
"sdm.devices.traits.ThermostatMode": {
"mode": "HEAT",
"availableModes": [
"HEAT",
"OFF"
]
},
"sdm.devices.traits.ThermostatEco": {
"availableModes": [
"OFF",
"MANUAL_ECO"
],
"mode": "OFF",
"heatCelsius": 17.062832,
"coolCelsius": 24.44444
},
"sdm.devices.traits.ThermostatHvac": {
"status": "OFF"
},
"sdm.devices.traits.Settings": {
"temperatureScale": "FAHRENHEIT"
},
"sdm.devices.traits.ThermostatTemperatureSetpoint": {
"heatCelsius": 16.71161
},
"sdm.devices.traits.Temperature": {
"ambientTemperatureCelsius": 19.67999
}
},
"parentRelations": [
{
"parent": "enterprises/XXX/structures/YYY/rooms/DDD",
"displayName": "Theone"
}
]
},
{
"name": "enterprises/XXX/devices/EEE",
"type": "sdm.devices.types.THERMOSTAT",
"assignee": "enterprises/XXX/structures/YYY/rooms/FFF",
"traits": {
"sdm.devices.traits.Info": {
"customName": ""
},
"sdm.devices.traits.Humidity": {
"ambientHumidityPercent": 26
},
"sdm.devices.traits.Connectivity": {
"status": "ONLINE"
},
"sdm.devices.traits.Fan": {},
"sdm.devices.traits.ThermostatMode": {
"mode": "HEAT",
"availableModes": [
"HEAT",
"OFF"
]
},
"sdm.devices.traits.ThermostatEco": {
"availableModes": [
"OFF",
"MANUAL_ECO"
],
"mode": "OFF",
"heatCelsius": 15.586624,
"coolCelsius": 24.444443
},
"sdm.devices.traits.ThermostatHvac": {
"status": "OFF"
},
"sdm.devices.traits.Settings": {
"temperatureScale": "FAHRENHEIT"
},
"sdm.devices.traits.ThermostatTemperatureSetpoint": {
"heatCelsius": 18.360092
},
"sdm.devices.traits.Temperature": {
"ambientTemperatureCelsius": 19.259995
}
},
"parentRelations": [
{
"parent": "enterprises/XXX/structures/YYY/rooms/AVPHw
The thermostat with display name "Theone" is the one that we are interested in.
We manually updated the THERMOSTAT const with "CCC" in the changetemp function, and we also tried other values he got, like "GGG" or "DDD", but we still got the "device not found" log.
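For reference, the lookup in changetemp() compares the full device name, so a quick way to see exactly which ID should go into THERMOSTAT is to log each device's name next to its room display name. A minimal sketch, to be placed right after the const devices = responseBody['devices'] line:
devices.forEach(d => {
  const room = (d.parentRelations && d.parentRelations[0]) ? d.parentRelations[0].displayName : '';
  // d.name is the full path "enterprises/<PROJECT_ID>/devices/<DEVICE_ID>";
  // THERMOSTAT must contain only the <DEVICE_ID> part for the find() above to match
  console.log(room + ' -> ' + d.name);
});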
For that reason, I decided to create a "dummy" device in Google Cloud Console in order to simulate having many thermostats running.
What is the way to access this thermostat from Google Apps Script? How can I make my script run as if this fake device existed in my installation?
This is what I have at the moment in my registry:
(screenshots: dummydevice-details, dummydevice-configstate, dummydevice-authentication)
Edit: public key added to the device:
(screenshot: dummydevice-authenticationpublickey)
I have been trying desperately for 5 days to create an Elasticsearch Watcher alert that sends a notification to a Microsoft Teams incoming webhook. However, the response I receive is "Bad payload received by generic incoming webhook", and I do not understand why it does not work.
{
"trigger": {
"schedule": {
"interval": "2m"
}
},
"input": {
"simple": {
"summary": "Test Nom",
"text": "test"
}
},
"condition": {
"always": {}
},
"actions": {
"MS_TEAMS_PORT443": {
"webhook": {
"scheme": "https",
"host": "outlook.office.com",
"port": 443,
"method": "post",
"path": "/webhook/XYZ",
"params": {},
"headers": {
"content-type": "application/json"
},
"body": "{{#toJson}}ctx.payload.summary{{/toJson}}"
}
}
}
}
And this is the response when I launch it:
{
"watch_id": "_inlined_",
"node": "QUApyNq4S5GyhHF-CuNjfg",
"state": "executed",
"status": {
"state": {
"active": true,
"timestamp": "2019-10-21T08:40:39.802Z"
},
"last_checked": "2019-10-21T08:40:39.802Z",
"last_met_condition": "2019-10-21T08:40:39.802Z",
"actions": {
"MS_TEAMS_PORT443": {
"ack": {
"timestamp": "2019-10-21T08:40:39.802Z",
"state": "awaits_successful_execution"
},
"last_execution": {
"timestamp": "2019-10-21T08:40:39.802Z",
"successful": false,
"reason": "received [400] status code"
}
}
},
"execution_state": "executed",
"version": -1
},
"trigger_event": {
"type": "manual",
"triggered_time": "2019-10-21T08:40:39.802Z",
"manual": {
"schedule": {
"scheduled_time": "2019-10-21T08:40:39.802Z"
}
}
},
"input": {
"simple": {
"summary": "Test Nom",
"text": "test"
}
},
"condition": {
"always": {}
},
"metadata": {
"name": "testJsonALaMano",
"xpack": {
"type": "json"
}
},
"result": {
"execution_time": "2019-10-21T08:40:39.802Z",
"execution_duration": 125,
"input": {
"type": "simple",
"status": "success",
"payload": {
"summary": "Test Nom",
"text": "test"
}
},
"condition": {
"type": "always",
"status": "success",
"met": true
},
"actions": [
{
"id": "MS_TEAMS_PORT443",
"type": "webhook",
"status": "failure",
"reason": "received [400] status code",
"webhook": {
"request": {
"host": "outlook.office.com",
"port": 443,
"scheme": "https",
"method": "post",
"path": "/webhook/XYZ",
"headers": {
"content-type": "application/json"
},
"body": "Test Nom"
},
"response": {
"status": 400,
"headers": {
"date": [
"Mon, 21 Oct 2019 08:40:38 GMT"
],
"content-length": [
"49"
],
"expires": [
"-1"
],
"x-beserver": [
"VI1PR07MB5053"
],
"x-aspnet-version": [
"4.0.30319"
],
"x-proxy-backendserverstatus": [
"400"
],
"x-cafeserver": [
"VE1PR03CA0023.EURPRD03.PROD.OUTLOOK.COM"
],
"x-calculatedbetarget": [
"VI1PR07MB5053.eurprd07.prod.outlook.com"
],
"request-id": [
"6d651f70-74b5-4010-a2a6-662666fa9985"
],
"pragma": [
"no-cache"
],
"x-msedge-ref": [
"Ref A: C6E8A3DCFF9541DD95D63FD71ACD695C Ref B: PAR02EDGE0513 Ref C: 2019-10-21T08:40:39Z"
],
"x-feserver": [
"VE1PR03CA0023"
],
"x-powered-by": [
"ASP.NET"
],
"x-backendhttpstatus": [
"400"
],
"content-type": [
"text/plain; charset=utf-8"
],
"cache-control": [
"no-cache"
]
},
"body": "Bad payload received by generic incoming webhook."
}
}
}
]
},
"messages": []
}
The body statement is wrong: Teams needs a text field in the body. Change your body to this:
"body": "{\"text\": \"{{ctx.payload.summary}}\"}"
The source property is your friend here:
"body": {
"source": {
"text" : "Test Nom"
}
}
I'm aware that it's possible to get the cameras from saved views in Navisworks models, but it would be great to get the names as well. When uploading an nwd file to a BIM 360 Document Management project, these saved views are shown. Is it possible to do this with the Forge viewer as well, or is this a Document Management feature only?
Regards, Frode
The saved views in Navisworks files are fetchable, together with their viewpoint names, from the response of GET:urn/manifest. Here is an example from the Revit house sample model, rac_basic_sample_project.rvt, exported as rac_basic_sample_project.nwc; see the JSON object with type "folder":
{
"guid": "dc74c06c-0818-46c3-b9cd-6f9666468d12",
"type": "view",
"role": "3d",
"name": "Default",
"status": "success",
"camera": [
-37.01164245605469,
-573.8855590820312,
10.432775497436523,
-37.01164245605469,
-101.42298889160156,
10.432775497436523,
0,
-2.220446049250313e-16,
1,
1,
0.785398006439209,
1,
0
],
"useAsDefault": true,
"hasThumbnail": "true",
"children": [
{
"guid": "59d18972-95cb-4845-a116-55a92e3c7ac3",
"type": "resource",
"urn": "urn:adsk.viewing:fs.file:dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6bGt3ZWo3eHBiZ3A2M3g0aGwzMzV5Nm0yNm9ha2dnb2YyMDE3MDUyOHQwMjQ3MzIzODZ6L3JhY19iYXNpY19zYW1wbGVfcHJvamVjdC5ud2M/output/0/0_100.png",
"role": "thumbnail",
"mime": "image/png",
"resolution": [
100,
100
]
},
{
"guid": "14607723-303c-476a-ac39-8f66cac8f4bf",
"type": "resource",
"urn": "urn:adsk.viewing:fs.file:dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6bGt3ZWo3eHBiZ3A2M3g0aGwzMzV5Nm0yNm9ha2dnb2YyMDE3MDUyOHQwMjQ3MzIzODZ6L3JhY19iYXNpY19zYW1wbGVfcHJvamVjdC5ud2M/output/0/0_200.png",
"role": "thumbnail",
"mime": "image/png",
"resolution": [
200,
200
]
},
{
"guid": "d7fd06cb-4ef5-48df-9e27-297343bf107a",
"type": "resource",
"urn": "urn:adsk.viewing:fs.file:dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6bGt3ZWo3eHBiZ3A2M3g0aGwzMzV5Nm0yNm9ha2dnb2YyMDE3MDUyOHQwMjQ3MzIzODZ6L3JhY19iYXNpY19zYW1wbGVfcHJvamVjdC5ud2M/output/0/0_400.png",
"role": "thumbnail",
"mime": "image/png",
"resolution": [
400,
400
]
}
]
},
{
"guid": "cccca659-8638-4e8d-9554-223f7cc4a23b",
"type": "folder",
"name": "3D View",
"role": "viewable",
"hasThumbnail": "false",
"status": "success",
"progress": "0% complete",
"children": [
{
"guid": "3dc842c3-acf9-4921-8d54-ffebf86500d1",
"type": "view",
"role": "3d",
"name": "Kitchen",
"camera": [
-71.70982360839844,
-77.9845199584961,
4.921259880065918,
10.964564323425293,
-15.158869743347168,
4.921259880065918,
4.996003610813204e-16,
-4.440892098500626e-16,
1,
1,
0.9272952079772949,
1,
0
],
"status": "success"
},
{
"guid": "716c2341-af18-4866-9fb7-57a27ff811d3",
"type": "view",
"role": "3d",
"name": "From Yard",
"camera": [
-98.73897552490234,
-169.06787109375,
0,
-42.515201568603516,
-44.77614212036133,
-1.609189127435573e-14,
0,
1.1102230246251565e-16,
1,
1,
0.9272952079772949,
1,
0
],
"status": "success"
},
{
"guid": "1466f07a-5536-4acd-bb51-ee228fb6a41e",
"type": "view",
"role": "3d",
"name": "Living Room",
"camera": [
-31.575815200805664,
-51.19736862182617,
0.9842519760131836,
38.432044982910156,
-143.84164428710938,
0.9842519760131836,
-5.0237591864288333e-14,
-3.735900477863652e-14,
1,
1,
0.9272952079772949,
1,
0
],
"status": "success"
},
{
"guid": "68ffe8dc-9a9c-45a5-aaf5-29221dd38771",
"type": "view",
"role": "3d",
"name": "Approach",
"camera": [
-41.0597038269043,
38.65303039550781,
32.80839920043945,
-49.91415786743164,
-107.17664337158203,
9.272088050842285,
-0.009639321826398373,
-0.15875616669654846,
0.9872707724571228,
1,
0.9272952079772949,
1,
0
],
"status": "success"
},
{
"guid": "4c8b4f68-7010-47eb-a3e3-3aa699e82674",
"type": "view",
"role": "3d",
"name": "Section Perspective",
"camera": [
8.170970916748047,
29.014333724975586,
5.741469860076904,
-82.0259780883789,
-107.69042205810547,
5.741469860076904,
7.771561172376096e-16,
2.914335439641036e-16,
1,
1,
0.9272952079772949,
1,
0
],
"status": "success"
},
{
"guid": "33f941f3-81d1-41c5-82e5-346731d79f34",
"type": "view",
"role": "3d",
"name": "Solar Analysis",
"camera": [
62.19073486328125,
-142.4400634765625,
161.65139770507812,
-32.913902282714844,
-97.60645294189453,
8.838506698608398,
-0.7451809644699097,
0.3512883186340332,
0.5668349266052246,
1,
45,
273.4084777832031,
1
],
"status": "success"
},
{
"guid": "41b33c50-bcdf-4f57-8101-5f8af6ece8eb",
"type": "view",
"role": "3d",
"name": "{3D}",
"camera": [
-104.73332977294922,
-202.08343505859375,
67.68977355957031,
-103.85852813720703,
-103.78852081298828,
5.5336480140686035,
0.004756217356771231,
0.5344184637069702,
0.8452067375183105,
1,
45,
424.3688049316406,
1
],
"status": "success"
}
]
}
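If the main goal is just the viewpoint names, a minimal sketch for collecting them from the manifest could look like this (assuming you already have an access token in accessToken and the base64-encoded URN in urn; the walk helper is illustrative):
const resp = await fetch(
  `https://developer.api.autodesk.com/modelderivative/v2/designdata/${urn}/manifest`,
  { headers: { 'Authorization': `Bearer ${accessToken}` } }
);
const manifest = await resp.json();

// Walk the manifest tree and collect every node of type "view" that has a name
const savedViews = [];
const walk = (node) => {
  if (node.type === 'view' && node.name) {
    savedViews.push({ name: node.name, camera: node.camera });
  }
  (node.children || []).forEach(walk);
};
(manifest.derivatives || []).forEach(walk);

console.log(savedViews.map(v => v.name)); // e.g. ["Default", "Kitchen", "From Yard", ...]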
Now we use the Kitchen view to illustrate the workflow:
{
"guid": "3dc842c3-acf9-4921-8d54-ffebf86500d1",
"type": "view",
"role": "3d",
"name": "Kitchen",
"camera": [
-71.70982360839844,
-77.9845199584961,
4.921259880065918,
10.964564323425293,
-15.158869743347168,
4.921259880065918,
4.996003610813204e-16,
-4.440892098500626e-16,
1,
1,
0.9272952079772949,
1,
0
],
"status": "success"
}
First, let's convert it from original model space into the viewer's:
const nwVP = JSON.parse(kitchenViewJson); // kitchenViewJson: a string holding the "Kitchen" view JSON shown above
const camera = nwVP.camera;
const nwVPName = nwVP.name;
const placementWithOffset = viewer.model.getData().placementWithOffset;
const pos = new THREE.Vector3( camera[0], camera[1], camera[2] );
const target = new THREE.Vector3( camera[3], camera[4], camera[5] );
const up = new THREE.Vector3( camera[6], camera[7], camera[8] );
const aspect = camera[9];
const fov = camera[10] / Math.PI * 180;
const orthoScale = camera[11];
const isPerspective = !camera[12];
const offsetPos = pos.applyMatrix4( placementWithOffset );
const offsetTarget = target.applyMatrix4( placementWithOffset );
const nwSavedViewpoints = [];
nwSavedViewpoints.push(
{
aspect: aspect,
isPerspective: isPerspective,
fov: fov,
position: offsetPos,
target: offsetTarget,
up: up,
orthoScale: orthoScale,
name: nwVPName
}
);
Afterward, switch the viewpoint by
viewer.impl.setViewFromCamera( nwSavedViewpoints[0] );
Lastly, you may notice that the converted camera definition above has almost the same values (up to floating-point precision) as viewer.model.getData().cameras[1].
Hope it helps!
Cheers,
Updates for sectioning mapping
If your saved viewpoint contains a section plane, the response of GET:urn/manifest would have something like this:
{
"guid": "54794b24-d9ef-4a1a-b5aa-8bbf35de2c55",
"type": "view",
"role": "3d",
"name": "Section Test",
"camera": [
-264.2721252441406,
-79.92520141601562,
148.0021209716797,
-42.678688049316406,
-73.8739013671875,
0.7752543091773987,
0.5530436635017395,
0.01510258112102747,
0.8330153822898865,
1.4948216676712036,
0.785398006439209,
1,
0
],
"sectionPlane": [
-0.803684066258349,
-0.5950562340169588,
0,
-92.04215879314862
],
"status": "success"
}
The sectionPlane attribute is the target we want. So the conversion is:
const forge_model_offset = viewer.model.getData().globalOffset;
// assume the param of Navisworks clip plane is available
//I copied from the response of the GET:urn/manifest
const navis_clip_plane = { x: -0.803684066258349, y: -0.5950562340169588, z: 0, d: -92.04215879314862 };
// calculate the exact distance in Forge Viewer
const dis_in_forge = (forge_model_offset.x * navis_clip_plane.x +
  forge_model_offset.y * navis_clip_plane.y +
  forge_model_offset.z * navis_clip_plane.z) + navis_clip_plane.d;
const cutplanes = [
new THREE.Vector4( navis_clip_plane.x, navis_clip_plane.y, navis_clip_plane.z, dis_in_forge )
];
//apply the plane to sectioning
viewer.setCutPlanes( cutplanes )
Two questions:
1) I want to keep the JSON as is but change the Timestamp to a human-readable date like "2016-12-19T09:21:35Z":
{
"Action": "ALLOW",
"Timestamp": 1482139256.274,
"Request": {
"Country": "US",
"URI": "/version/moot/beta.json",
"Headers": [
{
"Name": "Host",
"Value": "static.tiza.com"
},
{
"Name": "User-Agent",
"Value": "Faraday v0.9.2"
},
{
"Name": "Accept-Encoding",
"Value": "gzip;q=1.0,deflate;q=0.6,identity;q=0.3"
},
{
"Name": "Accept",
"Value": "*/*"
},
{
"Name": "X-Newrelic-Id",
"Value": "Vgcs5gbFU123dFBWGwIdAVFdrXBwc="
},
{
"Name": "X-Newrelic-Transaction",
"Value": "PxQDQVlzZVUd3NKQcrEwWwU"
}
],
"ClientIP": "107.22.17.51",
"Method": "GET",
"HTTPVersion": "HTTP/1.1"
},
"Weight": 1
}
I know I can do it using jq's 'todate' feature, but then I lose all the other data:
sh# cat temp.json | jq -r '.SampledRequests[].Timestamp | todate'
2016-12-19T09:21:44Z
--------- Updated --------
Second question:
2) How do I take the content of .Headers[] out of the array under the "Request{}" level? I want to go from:
{
"TimeWindow": {
"EndTime": 1482156660,
"StartTime": 1482156420
},
"SampledRequests": [
{
"Action": "ALLOW",
"Timestamp": 1482139256.274,
"Request": {
"Country": "US",
"URI": "/version/moot/beta.json",
"Headers": [
{
"Name": "Host",
"Value": "static.tiza.com"
},
{
"Name": "X-Newrelic-Transaction",
"Value": "PxQDQVlzZVUd3NKQcrEwWwU"
}
],
"ClientIP": "107.22.17.51",
"Method": "GET",
"HTTPVersion": "HTTP/1.1"
},
"Weight": 1
}
],
"PopulationSize": 89
}
To:
{
"TimeWindow.EndTime": 1482156660,
"TimeWindow.StartTime": 1482156420,
"Action": "ALLOW",
"Timestamp": 1482139256.274,
"Request.Country": "US",
"Request.URI": "/version/moot/beta.json",
"Headers.Host": "static.tiza.com",
"Headers.X-Newrelic-Transaction": "PxQDQVlzZVUd3NKQcrEwWwU",
"ClientIP": "107.22.17.51",
"Method": "GET",
"HTTPVersion": "HTTP/1.1",
"Weight": 1,
"PopulationSize": 89
}
Thanks a lot,
Moshe
1) Use |= rather than just |
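For (1), that keeps the rest of the document intact. For example, with the same path you already used:
jq '.SampledRequests[].Timestamp |= todate' temp.json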
2) One way to transform the fragment:
{
"Headers": [
{
"Name": "Host",
"Value": "static.tiza.com"
},
{
"Name": "User-Agent",
"Value": "Faraday v0.9.2"
}
]
}
as required would be to use the filter:
.Headers[] | { ("Headers." + .Name): .Value }
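Applied to that fragment, it emits one object per header:
{
  "Headers.Host": "static.tiza.com"
}
{
  "Headers.User-Agent": "Faraday v0.9.2"
}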
In your case, you could therefore use the following filter for (2):
.SampledRequests[].Request.Headers[] |=
{ ("Headers." + .Name): .Value }
I'll leave it to you to put all the pieces together :-)
I am using the Bluemix Workload Scheduler REST API to create processes with a scheduled trigger that has a oneTimeProperty and a startDate.
Additionally, the JSON I am sending also has a restfulStep.
The issue I have is that no matter how I provide the "queryParameters" and "headers" for the restfulStep, they are not accepted/configured in the process after the process is created successfully.
Here is the JSON I am using:
{
"name": "my process name",
"processlibraryid": 1234,
"processstatus": true,
"triggers": [
{
"name": "Scheduled Trigger",
"triggerType": "OnceTrigger",
"oneTimeProperty": {
"startDate": "TIMEVALUE"
}
}
],
"steps": [
{
"restfulStep": {
"agent": "AGENTNAME}",
"action": {
"uri": "MYCUSTOMURL",
"contentType": "application/json",
"method": "POST",
"verifyHostname": true,
"queryParameters": [
["param1", "value1"],
["param2", "value2"]
],
"headers": [
["param3", "param4"]
],
"numberOfRetries": 3,
"retryIntervalSeconds": 30
},
"authdata": {
"username": "USERNAME",
"password": "PASSWORD"
},
"input": {
"input": "",
"isFile": false
}
}
}
]
}
The issue has been fixed with the latest Workload Scheduler upgrade.
Could you try using a JSON like the following?
{
"name": "myname",
"processlibraryid": <1234>,
"processstatus": false,
"triggers": [
{
"name": "Scheduled Trigger",
"triggerType": "OnceTrigger",
"oneTimeProperty": {
"startDate": "2016-12-16T10:30:43.218Z"
}
}
],
"steps": [
{
"restfulStep": {
"agent": "<MY_AGENT_NAME>",
"action": {
"uri": "<MY_URL>",
"contentType": "application/json",
"method": "GET",
"verifyHostname": true,
"queryParameters": [
["param1", "value1"],
["param2", "value2"]
],
"headers": [
["Accept", "application/json"],
["User-Agent", "Mozilla/5.0 "]
],
"numberOfRetries": 3,
"retryIntervalSeconds": 30
},
"authdata": {
"username": "USERNAME",
"password": "PASSWORD"
},
"input": {
"input": "",
"isFile": false
}
}
}
]
}
Regards
Andrea I
Your JSON is correct, but there is a little bug in the Workload Scheduler service.
A fix will be released by the end of December.
As a workaround, you could use Application Lab to create your RESTful step. In addition, you could append the queryParameters to your URI address.
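For example, the query-parameter workaround could look like this, using the placeholder values from your JSON:
"uri": "MYCUSTOMURL?param1=value1&param2=value2"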
At the moment, there is no workaround for headers.
If you find any other issues using the service, do not hesitate to post your comments.
Thanks!
Andrea I