Getting a reference error on gatsby build - html

I’ve been working on my project as usual and it was working fine. When I tried to run gatsby build, I got this error message:
ReferenceError: Cannot access '_404' before initialization
and
ReferenceError: Cannot access '_mdx_slug_' before initialization
I haven’t added anything that should cause this, and I don’t really understand where the problem is coming from. It seems related to the way Gatsby builds the pages of the website, but I’m not experienced enough to know for sure.
Full output of gatsby build:
success load gatsby config - 0.041s
success load plugins - 0.757s
success onPreInit - 0.029s
success initialize cache - 0.046s
success copy gatsby files - 0.183s
success Compiling Gatsby Functions - 0.185s
success onPreBootstrap - 0.201s
success createSchemaCustomization - 0.011s
success Checking for changed pages - 0.001s
success source and transform nodes - 0.511s
info Writing GraphQL type definitions to /home/daniel_gray/GatsbyProjects/watchsdarot/.cache/schema.gql
warn There are conflicting field types in your data.
If you have explicitly defined a type for those fields, you can safely ignore this warning message.
Otherwise, Gatsby will omit those fields from the GraphQL schema.
If you know all field types in advance, the best strategy is to explicitly define them with the `createTypes` action, and skip inference with the `#dontInfer` directive.
See https://www.gatsbyjs.com/docs/actions/#createTypes
Mdx.frontmatter.EpisodeNum:
- type: number
value: 10
- type: string
value: '09'
Mdx.frontmatter.NumOfEpisodes:
- type: number
value: 10
- type: string
value: '09'
success building schema - 0.469s
success createPages - 0.002s
success createPagesStatefully - 0.336s
info Total nodes: 329, SitePage nodes: 86 (use --verbose for breakdown)
success Checking for changed pages - 0.005s
success Cleaning up stale page-data - 0.010s
success onPreExtractQueries - 0.001s
success extract queries from components - 2.813s
success write out redirect data - 0.010s
success Build manifest and related icons - 0.565s
success onPostBootstrap - 0.609s
info bootstrap finished - 9.239s
success write out requires - 0.048s
success Building production JavaScript and CSS bundles - 8.175s
⠼ Building HTML renderer
[============================] 10.041 s 7/7 100% Running gatsby-plugin-sharp.IMAGE_PROCESSING jobs
<w> [webpack.cache.PackFileCacheStrategy] Skipped not serializable cache item 'mini-css-extract-plugin /home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/css-loader/dist/cjs.js??ruleSet[1].rules[10].oneOf[0].use[1]!/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/postcss-loader/dist/cjs.js??ruleSet[1].rules[10].oneOf[0].use[2]!/home/daniel_gray/GatsbyProjects/watchsdarot/src/components/footer.module.css|0|Compilation/modules|/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/css-loader/dist/cjs.js??ruleSet[1].rules[10].oneOf[0].use[1]!/home/daniel_gray/GatsbyProjects/watchsdar
⠴ Building HTML renderer
[============================] 10.041 s 7/7 100% Running gatsby-plugin-sharp.IMAGE_PROCESSING jobs
<w> [webpack.cache.PackFileCacheStrategy] Skipped not serializable cache item 'Compilation/modules|/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/css-loader/dist/cjs.js??ruleSet[1].rules[9].oneOf[0].use[0]!/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/postcss-loader/dist/cjs.js??ruleSet[1].rules[9].oneOf[0].use[1]!/home/daniel_gray/GatsbyProjects/watchsdarot/src/components/foote
success Building HTML renderer - 22.328s
success Execute page configs - 0.028s
success Caching Webpack compilations - 0.002s
success run queries in workers - 0.133s - 78/78 585.57/s
success Running gatsby-plugin-sharp.IMAGE_PROCESSING jobs - 32.550s - 7/7 0.22/s
success Merge worker state - 0.002s
success Rewriting compilation hashes - 0.002s
success Writing page-data.json files to public directory - 0.048s - 79/79 1656.57/s
[ ] 0.000 s 0/85 0% Building static HTML for pages
/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/yoga-layout-prebuilt/yoga-layout/build/Release/nbind.js:53
throw ex;
^
ReferenceError: Cannot access '_404' before initialization
at Module.default (/home/daniel_gray/GatsbyProjects/watchsdarot/.cache/page-ssr/routes/component---src-pages-404-js.js:182:35)
at RouteHandler.render (/home/daniel_gray/GatsbyProjects/watchsdarot/.cache/page-ssr/routes/webpack:/watchsdarot/.cache/static-entry.js:229:57)
at Ic (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:66:30)
at Kc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:68:210)
at Z (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:74:89)
at Kc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:72:13)
at Z (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:74:89)
at Lc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:77:98)
at Kc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:69:131)
at Z (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:74:89)
at Ic (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:66:373)
at Kc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:68:210)
at Z (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:74:89)
at Kc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:72:271)
at Z (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:74:89)
at Kc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:68:481)
Emitted 'error' event on WritableAsPromise instance at:
at emitErrorNT (node:internal/streams/destroy:157:8)
at emitErrorCloseNT (node:internal/streams/destroy:122:3)
at processTicksAndRejections (node:internal/process/task_queues:83:21)
at runNextTicks (node:internal/process/task_queues:65:3)
failed Building static HTML for pages - 2.072s
/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/yoga-layout-prebuilt/yoga-layout/build/Release/nbind.js:53
throw ex;
^
ReferenceError: Cannot access '_mdx_slug_' before initialization
at Module.default (/home/daniel_gray/GatsbyProjects/watchsdarot/.cache/page-ssr/routes/component---src-pages-player-mdx-slug-js.js:278:35)
at RouteHandler.render (/home/daniel_gray/GatsbyProjects/watchsdarot/.cache/page-ssr/routes/webpack:/watchsdarot/.cache/static-entry.js:229:57)
at Ic (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:66:30)
at Kc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:68:210)
at Z (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:74:89)
at Kc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:72:13)
at Z (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:74:89)
at Lc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:77:98)
at Kc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:69:131)
at Z (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:74:89)
at Ic (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:66:373)
at Kc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:68:210)
at Z (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:74:89)
at Kc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:72:271)
at Z (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:74:89)
at Kc (/home/daniel_gray/GatsbyProjects/watchsdarot/node_modules/react-dom/cjs/react-dom-server.node.production.min.js:68:481)
Emitted 'error' event on WritableAsPromise instance at:
at emitErrorNT (node:internal/streams/destroy:157:8)
at emitErrorCloseNT (node:internal/streams/destroy:122:3)
at processTicksAndRejections (node:internal/process/task_queues:83:21)
at runNextTicks (node:internal/process/task_queues:65:3)
ERROR #95313
Building static HTML failed
See our docs page for more info on this error: https://gatsby.dev/debug-html
1 | 'use strict';
> 2 | module.exports = function (obj, predicate) {
| ^
3 | var ret = {};
4 | var keys = Object.keys(obj);
5 | var isArr = Array.isArray(predicate);
WebpackError: Worker exited before finishing task
- index.js:2
[watchsdarot]/[filter-obj]/index.js:2:1
- index.js:194
[watchsdarot]/[query-string]/index.js:194:1
- dev-404-page.js:209
watchsdarot/.cache/dev-404-page.js:209:19
Thanks to anyone who tried to help.

Found a solution to it on my own. It was a call to document.documentElement that prevented the build process from creating static HTML files. I still have no idea why it pointed to two different pages, since the call was on another page, but everything seems to be working now. For anyone still struggling: wrap your window or document calls in a useEffect() hook, or guard them with a check like if (typeof window !== 'undefined').
Hope this helps someone!
Good luck.
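The guard described above can be sketched like this (getDocumentDir is a hypothetical helper for illustration, not code from the original project):

```javascript
// Hypothetical helper illustrating the guard described above.
// During `gatsby build`, pages are rendered in Node, where `window`
// and `document` do not exist, so touching them throws at build time.
function getDocumentDir(fallback) {
  if (typeof document === "undefined") {
    // Server-side rendering: no DOM available, return the fallback.
    return fallback;
  }
  return document.documentElement.dir || fallback;
}

// In a React component, another option is to defer the DOM access to
// useEffect(), which only runs in the browser after hydration:
//
//   useEffect(() => {
//     document.documentElement.dir = "rtl";
//   }, []);

console.log(getDocumentDir("ltr"));
```

Under Node (and therefore during the static HTML build) the function simply returns the fallback instead of crashing.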

Related

Telegraf json_v2 parser error: Unable to convert field to type int. strconv.ParseInt: parsing invalid syntax

Good day!
I have built a small IoT device that monitors the conditions inside a specific enclosure using an ESP32 and a couple of sensors. I want to monitor that data by publishing it to the ThingSpeak cloud, then writing it to InfluxDB with Telegraf and finally using the InfluxDB data source in Grafana to visualize it.
So far I have made everything work flawlessly, but with one small exception.
Which is: one of the plugins in my Telegraf config fails with the error:
parsing metrics failed: Unable to convert field 'temperature' to type int: strconv.ParseInt: parsing "15.4": invalid syntax
The plugins are [[inputs.http]] and [[inputs.http.json_v2]], and I use them to authenticate against my ThingSpeak API and parse the JSON output of my fields. In my /etc/telegraf/telegraf.conf, under [[inputs.http.json_v2.field]], I have added type = int, because otherwise Telegraf writes my metrics as strings to InfluxDB and the only way to visualize them is a table or a single stat; the rest of my Flux queries fail with the error unsupported input type for mean aggregate: string. However, when I change to type = float in the config file, I get a different error:
unprocessable entity: failure writing points to database: partial write: field type conflict: input field "temperature" on measurement "sensorData" is type float, already exists as type string dropped=1
I suspect that I have misconfigured the parser plugin, but after hours of debugging I couldn't come up with a solution.
Some information that might be of use:
Telegraf version: Telegraf 1.24.2
Influxdb version: InfluxDB v2.4.0
Please see below for my telegraf.conf as well as the error messages.
Any help would be highly appreciated! (:
[agent]
interval = "10s"
round_interval = true
metric_batch_size = 1000
metric_buffer_limit = 1000
collection_jitter = "0s"
flush_interval = "10s"
flush_jitter = "0s"
precision = ""
hostname = ""
omit_hostname = false
[[outputs.influxdb_v2]]
urls = ["http://localhost:8086"]
token = "XXXXXXXX"
organization = "XXXXXXXXX"
bucket = "sensor"
[[inputs.http]]
urls = [
"https://api.thingspeak.com/channels/XXXXX/feeds.json?api_key=XXXXXXXXXX&results=2"
]
name_override = "sensorData"
tagexclude = ["url", "host"]
data_format = "json_v2"
## HTTP method
method = "GET"
[[inputs.http.json_v2]]
[[inputs.http.json_v2.field]]
path = "feeds.1.field1"
rename = "temperature"
type = "int" #Error message 1
#type = "float" #Error message 2
Error when type = "float":
me@myserver:/etc/telegraf$ telegraf -config telegraf.conf --debug
2022-10-16T00:31:43Z I! Starting Telegraf 1.24.2
2022-10-16T00:31:43Z I! Available plugins: 222 inputs, 9 aggregators, 26 processors, 20
parsers, 57 outputs
2022-10-16T00:31:43Z I! Loaded inputs: http
2022-10-16T00:31:43Z I! Loaded aggregators:
2022-10-16T00:31:43Z I! Loaded processors:
2022-10-16T00:31:43Z I! Loaded outputs: influxdb_v2
2022-10-16T00:31:43Z I! Tags enabled: host=myserver
2022-10-16T00:31:43Z I! [agent] Config: Interval:10s, Quiet:false, Hostname:"myserver",
Flush Interval:10s
2022-10-16T00:31:43Z D! [agent] Initializing plugins
2022-10-16T00:31:43Z D! [agent] Connecting outputs
2022-10-16T00:31:43Z D! [agent] Attempting connection to [outputs.influxdb_v2]
2022-10-16T00:31:43Z D! [agent] Successfully connected to outputs.influxdb_v2
2022-10-16T00:31:43Z D! [agent] Starting service inputs
2022-10-16T00:31:53Z E! [outputs.influxdb_v2] Failed to write metric to sensor (will be
dropped: 422 Unprocessable Entity): unprocessable entity: failure writing points to
database: partial write: field type conflict: input field "temperature" on measurement
"sensorData" is type float, already exists as type string dropped=1
2022-10-16T00:31:53Z D! [outputs.influxdb_v2] Wrote batch of 1 metrics in 8.9558ms
2022-10-16T00:31:53Z D! [outputs.influxdb_v2] Buffer fullness: 0 / 10000 metrics
Error when type = "int":
me@myserver:/etc/telegraf$ telegraf -config telegraf.conf --debug
2022-10-16T00:37:05Z I! Starting Telegraf 1.24.2
2022-10-16T00:37:05Z I! Available plugins: 222 inputs, 9 aggregators, 26 processors, 20
parsers, 57 outputs
2022-10-16T00:37:05Z I! Loaded inputs: http
2022-10-16T00:37:05Z I! Loaded aggregators:
2022-10-16T00:37:05Z I! Loaded processors:
2022-10-16T00:37:05Z I! Loaded outputs: influxdb_v2
2022-10-16T00:37:05Z I! Tags enabled: host=myserver
2022-10-16T00:37:05Z I! [agent] Config: Interval:10s, Quiet:false, Hostname:"myserver",
Flush Interval:10s
2022-10-16T00:37:05Z D! [agent] Initializing plugins
2022-10-16T00:37:05Z D! [agent] Connecting outputs
2022-10-16T00:37:05Z D! [agent] Attempting connection to [outputs.influxdb_v2]
2022-10-16T00:37:05Z D! [agent] Successfully connected to outputs.influxdb_v2
2022-10-16T00:37:05Z D! [agent] Starting service inputs
2022-10-16T00:37:10Z E! [inputs.http] Error in plugin:
[url=https://api.thingspeak.com/channels/XXXXXX/feeds.json?
api_key=XXXXXXX&results=2]: parsing metrics failed: Unable to convert field
'temperature' to type int: strconv.ParseInt: parsing "15.3": invalid syntax
Fixed it by leaving type = "float" under [[inputs.http.json_v2.field]] in telegraf.conf and creating a NEW bucket with a new API key in InfluxDB.
The issue was that the bucket sensor I had previously defined in my telegraf.conf already had the field temperature created from earlier attempts, with its type stored as string, and InfluxDB cannot overwrite an existing field's type with float.
As soon as I deleted all pre-existing buckets, everything started working as expected.
InfluxDB dashboard

Caused by: org.apache.ignite.IgniteCheckedException: Failed to validate cache configuration. Cache store factory is not serializable. Cache name:

I am trying to set up an Apache Ignite cache store using MySQL as external storage.
I have read all the official documentation about it and examined many other examples, but I can't make it run:
[2022-06-02 16:45:56:551] [INFO] - 55333 - org.apache.ignite.logger.java.JavaLogger.info(JavaLogger.java:285) - Configured failure handler: [hnd=StopNodeOrHaltFailureHandler [tryStop=false, timeout=0, super=AbstractFailureHandler [ignoredFailureTypes=UnmodifiableSet [SYSTEM_WORKER_BLOCKED, SYSTEM_CRITICAL_OPERATION_TIMEOUT]]]]
[2022-06-02 16:45:56:874] [INFO] - 55333 - org.apache.ignite.logger.java.JavaLogger.info(JavaLogger.java:285) - Successfully bound communication NIO server to TCP port [port=47100, locHost=0.0.0.0/0.0.0.0, selectorsCnt=4, selectorSpins=0, pairedConn=false]
[2022-06-02 16:45:56:874] [WARN] - 55333 - org.apache.ignite.logger.java.JavaLogger.warning(JavaLogger.java:295) - Message queue limit is set to 0 which may lead to potential OOMEs when running cache operations in FULL_ASYNC or PRIMARY_SYNC modes due to message queues growth on sender and receiver sides.
[16:45:56] Message queue limit is set to 0 which may lead to potential OOMEs when running cache operations in FULL_ASYNC or PRIMARY_SYNC modes due to message queues growth on sender and receiver sides.
[2022-06-02 16:45:56:898] [WARN] - 55333 - org.apache.ignite.logger.java.JavaLogger.warning(JavaLogger.java:295) - Checkpoints are disabled (to enable configure any GridCheckpointSpi implementation)
[2022-06-02 16:45:56:926] [WARN] - 55333 - org.apache.ignite.logger.java.JavaLogger.warning(JavaLogger.java:295) - Collision resolution is disabled (all jobs will be activated upon arrival).
[16:45:56] Security status [authentication=off, sandbox=off, tls/ssl=off]
[2022-06-02 16:45:56:927] [INFO] - 55333 - org.apache.ignite.logger.java.JavaLogger.info(JavaLogger.java:285) - Security status [authentication=off, sandbox=off, tls/ssl=off]
[2022-06-02 16:45:57:204] [INFO] - 55333 - org.apache.ignite.logger.java.JavaLogger.info(JavaLogger.java:285) - Successfully bound to TCP port [port=47500, localHost=0.0.0.0/0.0.0.0, locNodeId=b397c114-d34d-4245-9645-f78c5d184888]
[2022-06-02 16:45:57:242] [WARN] - 55333 - org.apache.ignite.logger.java.JavaLogger.warning(JavaLogger.java:295) - DataRegionConfiguration.maxWalArchiveSize instead DataRegionConfiguration.walHistorySize would be used for removing old archive wal files
[2022-06-02 16:45:57:253] [INFO] - 55333 - org.apache.ignite.logger.java.JavaLogger.info(JavaLogger.java:285) - Configured data regions initialized successfully [total=4]
[2022-06-02 16:45:57:307] [ERROR] - 55333 - org.apache.ignite.logger.java.JavaLogger.error(JavaLogger.java:310) - Exception during start processors, node will be stopped and close connections
org.apache.ignite.IgniteCheckedException: Failed to start processor: GridProcessorAdapter []
at org.apache.ignite.internal.IgniteKernal.startProcessor(IgniteKernal.java:1989) ~[ignite-core-2.10.0.jar:2.10.0]
Caused by: org.apache.ignite.IgniteCheckedException: Failed to validate cache configuration. Cache store factory is not serializable. Cache name: StockConfigCache
Caused by: org.apache.ignite.IgniteCheckedException: Failed to serialize object: CacheJdbcPojoStoreFactory [batchSize=512, dataSrcBean=null, dialect=org.apache.ignite.cache.store.jdbc.dialect.MySQLDialect@14993306, maxPoolSize=8, maxWrtAttempts=2, parallelLoadCacheMinThreshold=512, hasher=org.apache.ignite.cache.store.jdbc.JdbcTypeDefaultHasher@73ae82da, transformer=org.apache.ignite.cache.store.jdbc.JdbcTypesDefaultTransformer@6866e740, dataSrc=null, dataSrcFactory=com.anyex.ex.memory.model.CacheConfig$$Lambda$310/1421763091@31183ee2, sqlEscapeAll=false]
Caused by: java.io.NotSerializableException: com.anyex.ex.database.DynamicDataSource
Any advice or idea would be appreciated, thank you!
public static CacheConfiguration cacheStockConfigCache(DataSource dataSource, Boolean writeBehind)
{
    CacheConfiguration ccfg = new CacheConfiguration();
    ccfg.setSqlSchema("public");
    ccfg.setName("StockConfigCache");
    ccfg.setCacheMode(CacheMode.REPLICATED);
    ccfg.setAtomicityMode(CacheAtomicityMode.TRANSACTIONAL);
    ccfg.setIndexedTypes(Long.class, StockConfigMem.class);
    CacheJdbcPojoStoreFactory cacheStoreFactory = new CacheJdbcPojoStoreFactory();
    cacheStoreFactory.setDataSourceFactory((Factory<DataSource>) () -> dataSource);
    //cacheStoreFactory.setDialect(new OracleDialect());
    cacheStoreFactory.setDialect(new MySQLDialect());
    cacheStoreFactory.setTypes(JdbcTypes.jdbcTypeStockConfigMem(ccfg.getName(), "StockConfig"));
    ccfg.setCacheStoreFactory(cacheStoreFactory);
    ccfg.setReadFromBackup(false);
    ccfg.setCopyOnRead(true);
    if (writeBehind) {
        ccfg.setWriteThrough(true);
        ccfg.setWriteBehindEnabled(true);
    }
    return ccfg;
}

public static JdbcType jdbcTypeStockConfigMem(String cacheName, String tableName)
{
    JdbcType type = new JdbcType();
    type.setCacheName(cacheName);
    type.setKeyType(Long.class);
    type.setValueType(StockConfigMem.class);
    type.setDatabaseTable(tableName);
    type.setKeyFields(new JdbcTypeField(Types.NUMERIC, "id", Long.class, "id"));
    type.setValueFields(
        new JdbcTypeField(Types.NUMERIC, "id", Long.class, "id"),
        new JdbcTypeField(Types.NUMERIC, "stockinfoId", Long.class, "stockinfoId"),
        new JdbcTypeField(Types.VARCHAR, "remark", String.class, "remark"),
        new JdbcTypeField(Types.TIMESTAMP, "updateTime", Timestamp.class, "updateTime")
    );
    return type;
}

igniteConfiguration.setCacheConfiguration(
    CacheConfig.cacheStockConfigCache(dataSource, igniteProperties.getJdbc().getWriteBehind())
);

@Bean("igniteInstance")
@ConditionalOnProperty(value = "ignite.enable", havingValue = "true", matchIfMissing = true)
public Ignite ignite(IgniteConfiguration igniteConfiguration)
{
    log.info("igniteConfiguration info:{}", igniteConfiguration.toString());
    Ignite ignite = Ignition.start(igniteConfiguration);
    log.info("{} ignite started with discovery type {}", ignite.name(), igniteProperties.getType());
    return ignite;
}

Error: Invalid dest() folder argument. Please specify a non-empty string or a function

I cloned the repo ng-seed/universal,
then ran npm install,
then npm run clean, then npm run build:spa-dev, which gives me an error while building:
Finished 'bundle:spa-stage' after 1.42 min
[18:43:03] Starting 'copy:index.html'...
[18:43:03] 'copy:index.html' errored after 31 ms
[18:43:03] Error: Invalid dest() folder argument. Please specify a non-empty string or a function.
at Gulp.dest (D:\xx\xx\universal\node_modules\vinyl-fs\lib\dest\index.js:21:11)
at D:\xx\xx\universal\tools\build\gulp-helpers.js:56:36
at D:\xx\xx\universal\node_modules\lazypipe\index.js:27:19
at Array.map (<anonymous>)
at build (D:\xx\xx\universal\node_modules\lazypipe\index.js:26:37)
at Object.exports.debug (D:\xx\xx\universal\tools\build\gulp-helpers.js:57:7)
at copy (D:\xx\xx\universal\tools\build\gulp-tasks.js:90:18)
at bound (domain.js:301:14)
at runBound (domain.js:314:12)
at asyncRunner (D:\xx\xx\universal\node_modules\async-done\index.js:55:18)
at _combinedTickCallback (internal/process/next_tick.js:131:7)
at process._tickDomainCallback (internal/process/next_tick.js:218:9)
Using console.log I printed the variables that were passed to dest() and found that some were not yet set. Maybe this helps someone else.
gulp.task(function minify_html() {
    var SRC = variables.config.sourceHTML;
    console.log(variables.config.targetHTML); // 👈 check that this is actually set
    return (gulp
        .src(SRC)
        (...)
        .pipe(gulp.dest(variables.config.targetHTML))
        .pipe(gulpif(variables.settings.isServer, browser_sync.stream())));
});
For the sake of completeness: I use let variables = require('./variables.json') to maintain the settings separately.
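One way to make this failure mode easier to diagnose is to validate the configured paths up front. requireDest below is a hypothetical helper for illustration, not part of the repo:

```javascript
// Hypothetical guard: fail fast with a readable message when a build
// path from the settings file is missing, instead of letting gulp.dest()
// throw "Invalid dest() folder argument" deep inside vinyl-fs.
function requireDest(name, value) {
  if (typeof value !== "string" || value.length === 0) {
    throw new Error(
      'Missing build path "' + name + '" - check your settings file'
    );
  }
  return value;
}

// Usage inside a task (sketch):
//   .pipe(gulp.dest(requireDest("targetHTML", variables.config.targetHTML)))
```

With this, the error names the missing config key rather than pointing at vinyl-fs internals.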

DataAccessException: Deadlock found when trying to get lock; try restarting transaction

I have a method addEvent() that adds event records from a webhook REST controller:
public void addEvent(String subscriptionToken, DateTime eventTimestamp, SubscriptionEventType subscriptionEventType) {
    LOGGER.debug("Adding event: " + subscriptionEventType.name());
    this.ctx.transaction(configuration -> {
        // Transaction code ..
    });
}
The problem I am facing is that if two webhooks come in at the same time I am getting a DataAccessException:
[http-bio-8080-exec-6] DEBUG com.mz.server.rest.braintree.webhooks.BraintreeWebhooksRestController - Webhook kind: SUBSCRIPTION_WENT_ACTIVE
[http-bio-8080-exec-6] DEBUG com.mz.server.repository.jooq.payment.PaymentGatewaySubscriptionEventRepository - Adding event: WENT_ACTIVE
[http-bio-8080-exec-6] DEBUG com.mz.server.spring.SpringTransactionProvider - ##### begin #####
[http-bio-8080-exec-6] INFO com.mz.server.spring.SpringTransaction - SpringTransaction Ctor()
[http-bio-8080-exec-3] DEBUG com.mz.server.rest.braintree.webhooks.BraintreeWebhooksRestController - Webhook kind: SUBSCRIPTION_CHARGED_SUCCESSFULLY
[http-bio-8080-exec-3] DEBUG com.mz.server.repository.jooq.payment.PaymentGatewaySubscriptionEventRepository - Adding event: CHARGED_SUCCESSFULLY
[http-bio-8080-exec-3] DEBUG com.mz.server.spring.SpringTransactionProvider - ##### begin #####
[http-bio-8080-exec-3] INFO com.mz.server.spring.SpringTransaction - SpringTransaction Ctor()
[http-bio-8080-exec-6] DEBUG com.mz.server.spring.SpringTransactionProvider - ##### commit #####
[http-bio-8080-exec-6] DEBUG com.mz.server.spring.auth.CustomHttpSessionListener - Unhandled event
[http-bio-8080-exec-3] DEBUG com.mz.server.spring.SpringTransactionProvider - ##### rollback #####
[http-bio-8080-exec-3] ERROR com.mz.server.rest.braintree.webhooks.BraintreeWebhooksRestController -
org.jooq.exception.DataAccessException: SQL [update `mz_db`.`payment_gateway_subscription` set `mz_db`.`payment_gateway_subscription`.`subscription_event_type_id` = ? where `mz_db`.`payment_gateway_subscription`.`id` = ?]; Deadlock found when trying to get lock; try restarting transaction
at org.jooq.impl.Utils.translate(Utils.java:1690)
at org.jooq.impl.DefaultExecuteContext.sqlException(DefaultExecuteContext.java:660)
at org.jooq.impl.AbstractQuery.execute(AbstractQuery.java:356)
at org.jooq.impl.AbstractDelegatingQuery.execute(AbstractDelegatingQuery.java:133)
at com.mz.server.repository.jooq.payment.PaymentGatewaySubscriptionEventRepository.lambda$0(PaymentGatewaySubscriptionEventRepository.java:73)
at org.jooq.impl.DefaultDSLContext$1.run(DefaultDSLContext.java:370)
at org.jooq.impl.DefaultDSLContext$1.run(DefaultDSLContext.java:367)
at org.jooq.impl.DefaultDSLContext.transactionResult(DefaultDSLContext.java:339)
at org.jooq.impl.DefaultDSLContext.transaction(DefaultDSLContext.java:367)
What can I do about this?
I don't understand why I'm getting it. It appears that the rows I'm accessing get locked, and the next thread that comes in chokes on that. Is there a sane way to handle this? It's the first time I'm seeing this exception. Since this is the backend of a larger web application, I'm not looking forward to seeing something like this in production, although at the moment I fear that it will happen if I do nothing about it.

graphhopper GHRequest not fetching

I have a problem with this code:
GraphHopperAPI gh = new GraphHopperWeb();
gh.load("http://localhost:8989/api/route");
GHResponse ph = gh.route(new GHRequest(45.104546,7.69043,45.104546,7.69043));
It gives me this error:
2014-03-29 09:33:00,036 [main] INFO graphhopper.http.GraphHopperWeb - Full request took:0.037406, API took:0.0
Exception in thread "main" java.lang.RuntimeException: Problem while fetching path 45.104546, 7.69043->45.104546, 7.69043
at com.graphhopper.http.GraphHopperWeb.route(GraphHopperWeb.java:119)
at provaMain.main(provaMain.java:23)
Caused by: org.json.JSONException: A JSONObject text must begin with '{' at character 0
at org.json.JSONTokener.syntaxError(JSONTokener.java:410)
at org.json.JSONObject.<init>(JSONObject.java:179)
at org.json.JSONObject.<init>(JSONObject.java:402)
at com.graphhopper.http.GraphHopperWeb.route(GraphHopperWeb.java:95)
... 1 more
The documentation is currently undergoing a change (moving it from the wiki into the source). Where did you find that snippet? Please try gh.load("http://localhost:8989/"); with the latest branch, and gh.load("http://localhost:8989/api"); with earlier versions.