I have a single-node Flink instance which has the required jars for logback in the lib folder (logback-classic.jar, logback-core.jar, log4j-over-slf4j.jar), and I have removed the log4j jars from the lib folder (log4j-1.2.17.jar, slf4j-log4j12-1.7.7.jar). logback.xml is also correctly updated in the conf folder. I have also included logback.xml on the classpath, although it does not seem to be considered while the job runs: Flink only reads the logback.xml inside the conf folder. I have updated pom.xml as per Flink's documentation in order to exclude log4j.
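For reference, the exclusions in my pom.xml look roughly like this (a sketch only; the flink-java dependency and the version property are placeholders for whatever the project actually uses):

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>${flink.version}</version>
        <exclusions>
            <!-- keep log4j off the classpath so logback-classic binds instead -->
            <exclusion>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
            </exclusion>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
        </exclusions>
    </dependency>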
I have some log entries set inside a few map and flatMap functions, and some log entries outside those functions (e.g. "program execution started").
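To illustrate, the calls inside the transformations look something like this (a minimal sketch; the class name and messages are made up):

    import org.apache.flink.api.common.functions.MapFunction;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    // Hypothetical mapper: LOG.info() here runs inside the transformation,
    // unlike the "program execution started" entry logged from main().
    public class TraceMapper implements MapFunction<String, String> {
        private static final Logger LOG = LoggerFactory.getLogger(TraceMapper.class);

        @Override
        public String map(String value) throws Exception {
            LOG.info("inside map: {}", value);
            return value;
        }
    }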
When I run the job, Flink writes only those logs that are coded outside the transformations. The logs coded inside the transformations (map, flatMap, etc.) are not written to the log file. Flink also displays strange behavior here: whenever I update the logback jars inside the lib folder (due to version changes), during the next job run all logs (even those inside map and flatMap) are written correctly to the log file, but the logs are not written in any of the runs after that. This suggests that my logback.xml file is correct and the settings are also correct, but I don't understand why the same settings don't work when the same job is run again.
Update
This issue was reported to the Flink team, who have filed it as a bug in JIRA: https://issues.apache.org/jira/browse/FLINK-7990
Related
I have set up SSIS logging to a text file. In the connection manager I selected "create file" and gave the path as c:\logs\log.txt.
Notice that the log file is not generated if the log folder is absent. How do I ensure that the folder is created if it does not exist? I tried choosing "create folder" on the connection manager, but that also does not create the log file in the absence of the c:\logs folder.
How do I ensure the folder is auto-created and the log is always generated?
You have a chicken-and-egg scenario here. Consider the following replication of your problem.
I have the connection manager driven by a variable, LogFileName, which generates the date and time. That file lives in whatever folder is specified by LogPath, and the first thing my package does is create the folder if it does not already exist. "This thing can run anywhere and all is good." I've said that plenty and have the scars to show for it.
The following shows the events you can choose to log (based on what is in my package).
I am only logging OnPostExecute events. So I'm good, right? Because the post-execute event won't fire until after that File System Task has completed.
If that were the case, you wouldn't have posted a question.
The first event that a package generates is a PackageStart event. Look at that list of events - no ability to filter that out. It doesn't matter whether you want that event logged or not, the logging handlers hear the PackageStart event and record it. Always.
The specified Text file logger is supposed to record the data, and it stands ready to record PackageStart to the file... "oh, that path doesn't exist."
It will exist once the very first task (File System Task, Create Folder) has completed, but alas, it is too late. You either get the complete sequence of events or none.
In your Output window, you would see something like the following
SSIS package "C:\Users\bfellows\source\repos\PackageDeploymentModel\PackageDeploymentModel\ChickenAndEgg_Logging.dtsx" starting.
Error: 0xC001404B at ChickenAndEgg_Logging, Log provider "SSIS log provider for Text files": The SSIS logging provider has failed to open the log. Error code: 0x80070003.
The system cannot find the path specified.
Error: 0xC001404B at ChickenAndEgg_Logging, Log provider "SSIS log provider for Text files": The SSIS logging provider has failed to open the log. Error code: 0x80070003.
The system cannot find the path specified.
SSIS package "C:\Users\bfellows\source\repos\PackageDeploymentModel\PackageDeploymentModel\ChickenAndEgg_Logging.dtsx" finished: Success.
The package will show your Control Flow objects as all having gone green/OK, and the status message will say "Package execution completed with success," but on the results tab you'll have a red X showing the log provider couldn't open the log.
What do I do?
Preconfigure your environments as part of the package deployment process. When we used the native logger you're inquiring about, we had a document that laid out everything new developers/new servers needed to have done to ensure all of this was laid out and configured as required.
Unless a client has a strong business case for using the classic logging methodology, I would encourage them not to use it and instead rely on the SSISDB's native logging. It's cleaner, easier to manage, and requires no special setup. To quote the fine folks in Cupertino: it just works.
I changed the config (config.yml) and want to check if it worked.
How can I see the actually loaded configuration? Can I access it from a controller?
When your app starts for the first time, Symfony uses the HttpKernel component to manage the loading of the service container configuration from the application and bundles; it also handles compilation and caching.
After the compilation process has loaded the services from the configuration, extensions, and compiler passes, the container is dumped so that the cache can be used next time. The dumped version is then used during subsequent requests, as it is more efficient.
More info at: http://symfony.com/doc/current/components/dependency_injection/workflow.html
If you dump $this->container in your controller, you will see all the parameters inside the private property parameters, including parameters defined in the parameters.yml file and in config.yml.
Let's say you want to know the current value of the locale parameter; you can write this:
$locale = $this->container->getParameter('locale');
Also, all those parameters are dumped into sf_root/var/cache/your_env/appDevDebugProjectContainer.xml
I have created a jar file, example.jar, to be consumed by various projects at my workplace. The jar uses SLF4J for logging, backed by Logback.
However, during the build I excluded logback.xml, on the assumption that the actual application would have its own logback.xml.
Now, even though the actual application does have its own logback.xml, logger messages from example.jar are missing.
Could you please advise how to handle multiple logback configurations in the scenario described above?
In a Play application, I added lines like the following to the conf/logback.xml file.
You may disable logging by setting level="OFF".
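A minimal sketch of such an entry (assuming the jar's classes live under a hypothetical com.example package):

    <!-- "com.example" is a placeholder for the jar's base package -->
    <logger name="com.example" level="OFF"/>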
In the documentation for app-indexeddb-mirror at https://elements.polymer-project.org/elements/app-storage?active=app-indexeddb-mirror there is a section I've copied below. I think I'm running into an error because the indicated file isn't loading, but I'm not sure how to fix the issue. Do I add a reference in staticFileGlobs in sw-precache-config.js or somewhere else?
In order to ensure that operations on IndexedDB block the main browser thread as little as possible, app-indexeddb-mirror relies on a WebWorker to operate on its corresponding IndexedDB database. If you are vulcanizing or otherwise combining your source files before your app is deployed, make sure that you include the corresponding worker script (app-indexeddb-mirror-worker.js) among your deployable files. You can configure the path to the worker script with the worker-url attribute.
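For reference, a minimal sketch of wiring up that attribute (the bindings are hypothetical, and the path shown assumes the default bower_components layout):

    <app-indexeddb-mirror
        key="user-data"
        data="[[liveData]]"
        persisted-data="{{persistedData}}"
        worker-url="/bower_components/app-storage/app-indexeddb-mirror/app-indexeddb-mirror-worker.js">
    </app-indexeddb-mirror>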
The error I'm getting:
GET https://example.com/src/common-worker-scope.js?https://example.com/bower_components/app-storage/app-indexeddb-mirror/app-indexeddb-mirror-worker.js net::ERR_INTERNET_DISCONNECTED
Packaging a log4j configuration file in a NetBeans Platform application apparently requires some thinking through. This is what I tried...
I put log4j.xml in src/main/resources/my/package/log4j.xml of some_netbeans_module. The package is a public module package (i.e. classes from this package are used from other packages). I rebuilt the module and confirmed that the file does, in fact, get packaged into the module.
In my classes I get an instance of the logger the way I always do:
static final Logger log = Logger.getLogger(ThisClass.class); // org.apache.log4j.Logger
Every NetBeans Platform application has a my_app.conf file which makes it possible to set certain properties. This is where I set log4j.configuration:
log4j.configuration="/my/package/log4j.xml"
Now, when I run the application, I see the following output:
[INFO] /home/me/my_app/application/target/my_app/bin/../etc/my_app.conf: 5: log4j.configuration=/my/package/log4j.xml: not found
What is wrong with the above configuration?
In the my_app.conf file, if you append the log4j.configuration property to the default_options property, like so:
default_options="...<other options> -J-Dlog4j.configuration=my/package/log4j.xml"
then this option will get passed to the JVM. Notice that the log4j property is prefixed with -J-D. The -J is used by NetBeans to mark options that should be passed through to the JVM, and the -D is used by the JVM to define a system property.
Also, you can/should drop the quotes and the initial /: the quotes are not necessary, and NetBeans will complain if you keep the initial /.
The other way to do this, and the way that I prefer since it doesn't require editing the .conf file, is to put the log4j.xml file into the default package. If other requirements prevent you from doing that, then remember that you must put the log4j.configuration property in the app's platform.properties file while you're in dev mode and running the app inside the IDE. Like so:
run.args.extra=-J-Dlog4j.configuration=my/package/log4j.xml
Edit: For questions regarding the NetBeans Platform, you might have better luck posting to the NetBeans Platform Users forum.