iMac:fbx-conv me$ ./fbx-conv-mac -v -o G3DJ courier.fbx
INFO: FBX to G3Dx converter, version 0.01.0061 x64 , FBXSDK 2014.02
STATUS: Loading source file
PROGRESS: Import FBX 100.00%
STATUS: Converting source file
STATUS: Closing source file
VERBOSE: Listing model information:
VERBOSE: ID :
VERBOSE: Version : Hi=0, Lo=1
VERBOSE: Meshes : 0 (0 vertices, 0 parts, 0 indices)
VERBOSE: Nodes : 0 root, 0 total, 0 parts
VERBOSE: Materials : 0 (0 textures)
STATUS: Exporting to G3DJ file: courier.g3dj
STATUS: Closing exported file
I'm using Maya LT to export an object to FBX for conversion.
The 3D model was downloaded from the Maya market site, then imported and exported to FBX.
The resulting .g3dj file is almost empty. Loading it in libGDX with
assets.load("data/courier.g3dj", Model.class);
doesn't show anything either.
Does anyone know what went wrong?
.g3dj file:
{
  "version": [0, 1],
  "id": "",
  "meshes": [],
  "materials": [],
  "nodes": [],
  "animations": []
}
I had some success trying a Blender export, which returned this:
iMac:fbx-conv me$ ./fbx-conv-mac -v -o G3DJ palm_tree.fbx
INFO: FBX to G3Dx converter, version 0.01.0061 x64 , FBXSDK 2014.02
STATUS: Loading source file
PROGRESS: Import FBX 12.50% palm_tree
PROGRESS: Import FBX 25.00% 02 - Default
PROGRESS: Import FBX 37.50% Map #1
PROGRESS: Import FBX 50.00% Map #1
PROGRESS: Import FBX 62.50% Take 001
PROGRESS: Import FBX 75.00% BaseLayer
PROGRESS: Import FBX 87.50%
PROGRESS: Import FBX 100.00%
STATUS: [shape(palm_tree)] Triangulating FbxMesh geometry
VERBOSE: [shape(palm_tree)] polygons: 1032 (3096 indices), control points: 724
STATUS: Converting source file
STATUS: Closing source file
VERBOSE: Listing model information:
VERBOSE: ID :
VERBOSE: Version : Hi=0, Lo=1
VERBOSE: Meshes : 1 (743 vertices, 1 parts, 3096 indices)
VERBOSE: Nodes : 1 root, 1 total, 1 parts
VERBOSE: Materials : 1 (1 textures)
STATUS: Exporting to G3DJ file: palm_tree.g3dj
STATUS: Closing exported file
I did a translation from .rvt to .dwg with the Model Derivative API, but I didn't get the urn of the DWG file. When I made the GET manifest request to watch the progress, I got ...
{
  "urn": "dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6dGVtcGxhdGVzXzIvUEhBUk0ucnZ0",
  "derivatives": [],
  "hasThumbnail": "false",
  "progress": "0% complete",
  "region": "US",
  "status": "success",
  "type": "manifest",
  "version": "1.0"
}
The progress is 0% complete and the status is success, but in this case the status should be inprogress.
Do you know what the issue is?
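For reference, this is roughly the manifest polling call in question; a minimal sketch in Node (the access token and base64-encoded urn are placeholders, not values from this job):

// poll-manifest.js: poll the Model Derivative manifest until the translation finishes (sketch)
const ACCESS_TOKEN = '<2-legged token>';       // placeholder
const URN = '<base64-encoded source urn>';     // placeholder

async function pollManifest() {
  const res = await fetch(
    `https://developer.api.autodesk.com/modelderivative/v2/designdata/${URN}/manifest`,
    { headers: { Authorization: `Bearer ${ACCESS_TOKEN}` } }
  );
  const manifest = await res.json();
  console.log(manifest.status, manifest.progress, manifest.derivatives.length);
  if (manifest.progress !== 'complete') {
    setTimeout(pollManifest, 5000);            // poll again in 5 s
  }
}

pollManifest();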
Starting in macOS 12 (Monterey), the system apparently writes crash reports as .ips files instead of the traditional .crash file format.
The file appears to contain JSON data:
{"app_name":"Helper","timestamp":"2021-10-30 18:49:32.00 +0100","app_version":"3.0.0(66) beta","slice_uuid":"673198dd-94ac-31a7-9e81-09fe6c781255","build_version":"3.0.0.66","platform":1,"bundleID":"com.dislt.helper","share_with_app_devs":0,"is_first_party":0,"bug_type":"309","os_version":"macOS 12.0.1 (21A559)","incident_id":"CC03C2EC-C1D4-4F6E-AA1F-6C4EC555D6B8","name":"Helper"}
{
  "uptime" : 91000,
  "procLaunch" : "2021-10-30 18:49:29.7791 +0100",
  "procRole" : "Unspecified",
  "version" : 2,
  "userID" : 501,
  "deployVersion" : 210,
  "modelCode" : "MacBookPro14,3",
  "procStartAbsTime" : 91844701503187,
  "coalitionID" : 1244,
  "osVersion" : {
    "train" : "macOS 12.0.1",
    "build" : "21A559",
    "releaseType" : "User"
  },
  "captureTime" : "2021-10-30 18:49:32.4572 +0100",
  "incident" : "92A89610-D70A-4D93-A974-A9018BB5C72A",
  "bug_type" : "309",
  "pid" : 77765,
  "procExitAbsTime" : 91847378271126,
  "cpuType" : "X86-64",
  "procName" : "Helper",
  ...
When I preview the file or open it in the Console app, a traditional crash report is automatically generated:
-------------------------------------
Translated Report (Full Report Below)
-------------------------------------
Process: Helper [77765]
Path: /Users/USER/Library/Application Support/Helper.app/Contents/MacOS/Helper
Identifier: com.distl.helper
Version: 3.0.0(66) beta (3.0.0.66)
Code Type: X86-64 (Native)
Parent Process: TestBead [77726]
Responsible: TestBead [77726]
User ID: 501
Date/Time: 2021-10-30 18:49:32.4572 +0100
OS Version: macOS 12.0.1 (21A559)
Report Version: 12
Bridge OS Version: 3.0 (14Y908)
Anonymous UUID: CC03C2EC-C1D4-4F6E-AA1F-6C4EC555D6B8
Time Awake Since Boot: 91000 seconds
System Integrity Protection: enabled
Crashed Thread: 1 Dispatch queue: com.apple.NSXPCConnection.user.anonymous.77726
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x00007f780071a000
Exception Codes: 0x0000000000000001, 0x00007f780071a000
Exception Note: EXC_CORPSE_NOTIFY
Termination Reason: Namespace SIGNAL, Code 11 Segmentation fault: 11
Terminating Process: exc handler [77765]
...
I have customer support and development tools that scan these crash report files automatically, and I'd like to find out whether there's a way to automate the translation of the JSON data back into the traditional crash report format.
I'd like to do this to (a) avoid rewriting my crash report scanning tools (although that wouldn't be impossible), and (b) automatically translate these files into a human-readable format without resorting to opening each one in the Console app.
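For what it's worth, the .ips payload shown above is just two JSON documents back to back: a one-line header followed by a pretty-printed body. A minimal sketch in Node that pulls a few of the fields shown above out of such a file (the file name is a placeholder):

// read-ips.js: split an .ips file into its header and body JSON objects (sketch)
const fs = require('fs');

const raw = fs.readFileSync('Helper.ips', 'utf8');       // placeholder path
const firstNewline = raw.indexOf('\n');

const header = JSON.parse(raw.slice(0, firstNewline));   // one-line summary object
const body = JSON.parse(raw.slice(firstNewline + 1));    // full report object

console.log(header.app_name, header.app_version, header.os_version);
console.log(body.procName, body.pid, body.captureTime);

This only restructures the JSON; it does not produce the symbolicated text report that Console shows.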
I've run into the same problem. I haven't tried it myself yet, but someone has already created an ips2crash command, available on GitHub. As the name implies, it should convert an .ips file to the (now) legacy crash report format.
According to the info here, plugin_host is an external process that is used to execute plugin code. Unfortunately I'm getting the error in the title, and I probably need to configure this process, but how? Thanks.
Please note I'm using the 32-bit version (https://www.sublimetext.com/3) on Debian 11.
System packages (zip files) are installed in /usr/local/share/sublime-text/Packages
I created a symlink in /usr/local/bin pointing to the previous folder.
Python is installed in $HOME/.config/sublime-text-3/Lib/python3.3
Both sublime.py and sublime_plugin.py are located in the $HOME/.config/sublime-text-3/Lib/ folder.
Console output:
UI scale: 1.002 (gtk text scale)
startup, version: 3211 linux x32 channel: stable
executable: /usr/local/bin/sublime
working dir: /
packages path: /home/fox/.config/sublime-text-3/Packages
state path: /home/fox/.config/sublime-text-3/Local
zip path: /usr/local/bin/Packages
zip path: /home/fox/.config/sublime-text-3/Installed Packages
ignored_packages: ["Markdown", "Vintage"]
pre session restore time: 0.320594
startup time: 1.57776
first paint time: 1.8247
error: plugin_host has exited unexpectedly, plugin functionality won't be available until Sublime Text has been restarted
Preferences.sublime-settings:
{
  "update_check": false,
  "color_scheme": "Packages/dark.scheme",
  "tab_size": 4,
  "translate_tabs_to_spaces": true,
  "trim_automatic_white_space": true,
  "trim_trailing_white_space_on_save": true,
  "ensure_newline_at_eof_on_save": true,
  "detect_indentation": false,
  "copy_with_empty_selection": false,
  "find_selected_text": true,
  "detect_slow_plugins": false,
  "auto_complete_delay": 500,
  "font_face": "Source Code Pro",
  "font_options":
  [
    "directwrite"
  ],
  "font_size": 14,
  "highlight_line": true,
  "ignored_packages": ["Markdown", "Vintage"]
}
❯ /usr/local/bin/plugin_host
Unexpected number of arguments, expected 2
❯ /usr/local/bin/plugin_host --help
unable to open channels
I have a valid appsettings.json file (valid according to jsonlint.com), I've set the tsconfig resolveJsonModule option to true, and I'm importing @rollup/plugin-json and have tried calling it at every position in the plugins chain. But I always get:
(!) Plugin json: Could not parse JSON file
appsettings.json
[!] Error: Unexpected token (Note that you need @rollup/plugin-json to import JSON files)
appsettings.json (2:10)
So the plugin is firing (I think), but it can't parse the file, which seems to be valid. The Rollup config looks like this:
import typescript from '@rollup/plugin-typescript';
import resolve from '@rollup/plugin-node-resolve';
import commonjs from "@rollup/plugin-commonjs";
import dev from 'rollup-plugin-dev';
import copy from 'rollup-plugin-copy';
import replace from '@rollup/plugin-replace';
// Loaders for non-ts/js file types
import postcss from 'rollup-plugin-postcss';
import image from '@rollup/plugin-image';
import json from '@rollup/plugin-json';

console.log(`Node env is ${process.env.NODE_ENV}`);
// console.debug(process);
let isDevEnv = process.env.NODE_ENV === 'development';
let useMsw = process.env.USE_MSW;

const extensions = ['.cjs', '.js', '.jsx', '.json', '.ts', '.tsx', '.css', '.png'];

// const intro = useMsw
//   ? 'global = window; window.NODE_ENV = process.env.NODE_ENV; window.USE_MSW = true'
//   : 'global = window; window.NODE_ENV = process.env.NODE_ENV; window.USE_MSW = false';
const intro = `global = window; window.NODE_ENV = process.env.NODE_ENV; ${useMsw ? 'window.USE_MSW = true;' : ''}`;

export default {
  input: [
    'src/index.tsx'
  ],
  output: {
    intro: intro,
    file: './dist/bundle.js',
    format: 'es',
    sourcemap: isDevEnv,
    inlineDynamicImports: true,
  },
  plugins: [
    postcss({}),
    resolve({
      extensions: extensions,
      browser: true
    }),
    commonjs(),
    typescript(),
    replace({
      'process.env.NODE_ENV': JSON.stringify('development')
    }),
    image(),
    copy({
      targets: [
        {src: './src/index.html', dest: './dist/'},
        {src: './src/mockServiceWorker.js', dest: './dist/'}
      ],
      verbose: true
    }),
    isDevEnv && dev('dist', {
      host: 'localhost'
    }),
    json(),
  ]
};
tsconfig looks like this:
{
  "compilerOptions": {
    "declaration": false,
    "module": "ESNext",
    "noImplicitAny": true,
    "target": "ES2015",
    "jsx": "react",
    "allowSyntheticDefaultImports": true,
    "allowJs": true,
    "moduleResolution": "Node",
    "esModuleInterop": true,
    "resolveJsonModule": true
  },
  "include": [
    "src/**/*.tsx",
    "src/**/*.ts",
    "declaration.d.ts",
    "src/components/TabularVIew/GridContainer/hooks"
  ],
  "exclude": ["node_modules"]
}
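With resolveJsonModule enabled, the settings file would typically be consumed with a plain default import; an assumed usage sketch (not code from the question, and the import path is a guess):

// assumed consumer code: a default import of the JSON module
import appsettings from './appsettings.json';

console.log(appsettings.AUTH_ENDPOINT);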
and the actual json file looks like this:
{
  "HUB_URL": "theHubUrl",
  "AUTH_ENDPOINT": "https://localhost:44330/API/Dispatch/Authentication/v1.0/authenticate",
  "POSITION_ENDPOINT": "https://localhost:44330/API/Dispatch/Data/v1.0/position",
  "SUMMARY_ENDPOINT": "https://localhost:44330/API/Dispatch/Data/v1.0/summaries",
  "GLOBAL_TLM": 1,
  "PERIOD_LENGTH_MINUTES": 30,
  "EFA_BLOCKS": [
    [23, 0, 1, 2],
    [3, 4, 5, 6],
    [7, 8, 9, 10],
    [11, 12, 13, 14],
    [15, 16, 17, 18],
    [19, 20, 21, 22]
  ]
}
and the rollup output is this:
(!) Plugin json: Could not parse JSON file
appsettings.json
[!] Error: Unexpected token (Note that you need @rollup/plugin-json to import JSON files)
appsettings.json (2:10)
Pretty frustrating: one line says the json plugin can't parse the file, and the very next line tells me I need the json plugin. An invalid file, a missing file, or a missing plugin I could understand. Possibly a clash between tsc and the plugin. I'm out of ideas, so suggestions are welcome.
Thanks.
The reason can be that the JSON file is encoded as UTF-8 with a BOM. Try re-encoding the file as plain UTF-8 (without a BOM).
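A quick way to check for (and strip) a byte order mark; a minimal sketch in Node, with the file name taken from the question:

// strip-bom.js: detect and remove a UTF-8 BOM from appsettings.json (sketch)
const fs = require('fs');

const path = 'appsettings.json';
const text = fs.readFileSync(path, 'utf8');

if (text.charCodeAt(0) === 0xFEFF) {
  // Node decodes the BOM bytes EF BB BF to the single code point U+FEFF
  fs.writeFileSync(path, text.slice(1), 'utf8');
  console.log('BOM removed');
} else {
  console.log('no BOM found');
}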
Not really an answer, but the behaviour appears to be linked to some aggressive caching, either by npm or by TypeScript. I opened the project in VS Code, deleted node_modules, ran npm install (the usual drill), created a new JSON file, installed the Rollup json plugin, and it built. Sum total of learning: 0.
This is my JSON log file. I'm trying to get it into Elasticsearch through Logstash.
{"message":"IM: Orchestration","level":"info"}
{"message":"Investment Management","level":"info"}
Here is my filebeat.yml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - D:/Development_Avecto/test-log/tn-logs/im.log
    json.keys_under_root: true
    json.add_error_key: true

processors:
  - decode_json_fields:
      fields: ["message"]

output.logstash:
  hosts: ["localhost:5044"]
And here is my Logstash pipeline config:
input {
  beats {
    port => "5044"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "data"
  }
}
I'm not able to view any output in Elasticsearch and I can't find what the error is.
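One quick way to check whether anything was indexed at all; a minimal sketch (Node 18+ for the global fetch), using the Elasticsearch host and index name from the config above:

// check-index.js: query the "data" index to see whether any documents arrived (sketch)
fetch('http://localhost:9200/data/_search?pretty')
  .then(res => res.text())
  .then(body => console.log(body));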
filebeat log
2019-06-18T11:30:03.448+0530 INFO registrar/registrar.go:134 Loading registrar data from D:\Development_Avecto\filebeat-6.6.2-windows-x86_64\data\registry
2019-06-18T11:30:03.448+0530 INFO registrar/registrar.go:141 States Loaded from registrar: 10
2019-06-18T11:30:03.448+0530 WARN beater/filebeat.go:367 Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2019-06-18T11:30:03.448+0530 INFO crawler/crawler.go:72 Loading Inputs: 1
2019-06-18T11:30:03.448+0530 INFO log/input.go:138 Configured paths: [D:\Development_Avecto\test-log\tn-logs\im.log]
2019-06-18T11:30:03.448+0530 INFO input/input.go:114 Starting input of type: log; ID: 16965758110699470044
2019-06-18T11:30:03.449+0530 INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
2019-06-18T11:30:34.842+0530 INFO [monitoring] log/log.go:144 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":312,"time":{"ms":312}},"total":{"ticks":390,"time":{"ms":390},"value":390},"user":{"ticks":78,"time":{"ms":78}}},"handles":{"open":213},"info":{"ephemeral_id":"66983518-39e6-461c-886d-a1f99da6631d","uptime":{"ms":30522}},"memstats":{"gc_next":4194304,"memory_alloc":2963720,"memory_total":4359488,"rss":22421504}},"filebeat":{"events":{"added":1,"done":1},"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"output":{"type":"logstash"},"pipeline":{"clients":1,"events":{"active":0,"filtered":1,"total":1}}},"registrar":{"states":{"current":10,"update":1},"writes":{"success":1,"total":1}},"system":{"cpu":{"cores":4}}}}}
https://www.elastic.co/guide/en/ecs-logging/dotnet/master/setup.html
Check step 3 at the bottom of the page for the config you need to put in your filebeat.yml file:
filebeat.inputs:
  - type: log
    paths: /path/to/logs.json
    json.keys_under_root: true
    json.overwrite_keys: true
    json.add_error_key: true
    json.expand_keys: true