Can the Trinidad server use log files in development? - jruby

I'm working on an app that's served using Trinidad, and in development, the server defaults to pushing all of its output to stdout. This is making it very difficult to use a command line debugger (in my case Pry). Is there some way to make it use log files in dev the way that it does in prod? For reference, I'm using version 1.4.4 of Trinidad.
Alternatively, if there's some workaround for this in Pry, I'd love to learn about that too.
Thanks!

It turns out this was actually due to a logger monkey patch in my specific app, which was otherwise using log files. I was just thrown off by the Trinidad README on GitHub, which claims that printing to stdout is the default in dev.
If anyone has clever ideas for working around a situation like this using Pry and/or Unix utilities, please share anyway as that might be useful in other similar scenarios.
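For anyone who hits the same thing: since the stdout output came from an app-level logger patch rather than Trinidad itself, the fix is plain Rails configuration. A minimal sketch of pointing development logging back at a file (illustrative only, not my app's actual patch; Rails 4+ syntax, and the path is just the conventional default):

# config/environments/development.rb -- illustrative sketch
Rails.application.configure do
  # Write Rails logs to log/development.log instead of STDOUT,
  # leaving the terminal free for Pry's interactive prompt.
  config.logger = Logger.new(Rails.root.join("log", "development.log").to_s)
end

On the Pry side, a commonly suggested workaround when the server owns the terminal is the pry-remote gem: use binding.remote_pry in place of binding.pry, then attach from a second terminal by running pry-remote.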

Related

Google Cloud - Stackdriver Debug reports "File was not found in the executable" for GCE Jetty war

I've been trying to follow the guide Setting Up Stackdriver Debugger for Java applications on Google Compute Engine, but am running into issues with Stackdriver Debug.
I'm building my .war file from a separate build server, then deploying it to my GCE server. I added the agent to the start command via /etc/defaults, and my app appears in the https://console.cloud.google.com/debug control panel. The version I set in the run command matches the revision that shows up in the source-context(s).json files.
However, when I open the app I see the message:
No source version information was provided by the deployed application
I connected the app's git repo as a mirrored cloud repository, and I can browse the source files in the sidebar of the Stackdriver Debug page. But if I browse to a file and add a breakpoint, I get the error "File was not found in the executable."
I have run the gcloud preview app gen-repo-info-file command, which created two basic JSON files storing my git repo and revision. Is it supposed to do anything else?
I have tried running Jetty in both normal and extracted modes. If I have Jetty extract the war file first, I can see the source-context.json files in the WEB-INF/classes directory.
What am I missing?
https://github.com/GoogleCloudPlatform/cloud-debug-java#extra-classpath mentions that you can extend the agent path with your WEB-INF/classes directory:
-agentpath:/opt/cdbg/cdbg_java_agent.so=--cdbg_extra_class_path=/opt/tomcat/webapps/myapp/WEB-INF/classes
For multiple class paths:
-agentpath:/opt/cdbg/cdbg_java_agent.so=--cdbg_extra_class_path=/opt/tomcat/webapps/myapp/WEB-INF/classes:/another/path/with/classes
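On a GCE Jetty install like the one in the question, that flag would typically be appended to the JVM options in /etc/default/jetty, roughly like this (hypothetical sketch; the paths and the JAVA_OPTIONS variable name depend on your Jetty packaging):

# /etc/default/jetty -- illustrative, adjust paths to your install
JAVA_OPTIONS="$JAVA_OPTIONS -agentpath:/opt/cdbg/cdbg_java_agent.so=--cdbg_extra_class_path=/opt/jetty/webapps/myapp/WEB-INF/classes"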
There are a couple of things going on here.
First, it sounds like you are doing the correct thing with gen-repo-info-file. The debugger agent should pick up the json files from the WEB-INF/classes directory.
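To the asker's "Is it supposed to do anything else?": as far as I understand, those small JSON descriptors are the whole output. For a plain git repo, source-context.json looks roughly like this (the URL and revision below are placeholders, and the exact schema may vary by gcloud version):

{
  "git": {
    "url": "https://github.com/your-org/your-app.git",
    "revisionId": "c0ffee00aa11bb22cc33dd44ee55ff6677889900"
  }
}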
The debugger uses fuzzy matching to find source files, so as long as the name of the .java file matches a file in your executable, you should not get that error.
The most likely scenario given the information in your question is that you are attaching the debugger to a launcher process, rather than your actual application. Without further details, I can't absolutely confirm that, though.
If you send us more details at cdbg-feedback@google.com, we can look more closely at your case to see if we can understand exactly what's happening, and potentially improve our documentation, since it sounds like you followed the docs pretty closely.

Auto-refresh webpage when a source file changes

I have been learning web development for some time, and I have noticed in tutorials on YouTube that when someone changes a source file (HTML, CSS, JS), the webpage opened in the browser refreshes automatically. I have read something about live-reload, but it's too complicated for me and there is no step-by-step tutorial.
I have found some similar questions, but in those the refreshing is done by the local server, not by the code editor or browser.
I'm using Apache as my local server, Sublime Text for writing code, and Ubuntu as my operating system.
Here is a video that shows exactly what I mean:
https://www.youtube.com/watch?v=q78u9lBXvj0
npm and live-server don't work on my computer at all.
Sorry for my English; I'm not a native speaker. I'm looking forward to your help.
Does anyone know anything?
Install the Sublime web server using the package manager (or, in your case, continue to use Apache).
Use http://livejs.com/
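Live.js works by polling the page's resources (HTML, CSS, JS) and reloading when they change, so it pairs fine with Apache. Per its homepage, you include one script tag while developing and remove it before going live:

<!-- development only: remove before deploying -->
<script type="text/javascript" src="http://livejs.com/live.js"></script>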

Chrome allow-file-access-from-files no longer working (was using it to view WebGL/three.js files)?

I was using a Chrome shortcut with --allow-file-access-from-files in the target to work on my three.js student project files. But sometime this morning this stopped working, and it appeared Chrome had been updated. I redid the shortcut, but no joy.
Part of my project is building a three.js animation that works in a common browser (for which I chose Chrome).
Is there any way to get Chrome to allow file access again?
Thanks.
The answer I came up with was to use Firefox instead of Chrome, changing the security policy as detailed in https://github.com/mrdoob/three.js/wiki/How-to-run-things-locally
Not a perfect answer, but with a deadline looming it's the best workable answer for me right now, as trying different variations of Chrome, Wamp, and also Mongoose didn't work. If I had more time I would work out how to use Python or probably Node.js, as I've seen it mentioned a number of times as being the faster option.
What gman stated is true: using the Chrome flag (and changing Firefox's security policy) does create a big security risk, but only if you use that shortcut (and its tabs etc.) for anything other than accessing your own local files. I've been scrupulous about not using it for the internet, but don't use this method if you can't be strict with yourself.
Ideally I'd recommend beginning any project with Node.js.
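For reference, the Firefox security-policy change mentioned above is typically this about:config preference (check the linked wiki for current guidance; the same local-files-only caution applies as for the Chrome flag):

security.fileuri.strict_origin_policy = false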
Gman's answer is good. If you're in a Windows environment and use npm for package management, the easiest option is to install http-server globally:
npm install -g http-server
Then simply run http-server in any of your project directories:
E.g. d:\my_project> http-server
Starting up http-server, serving ./
Available on:
  http://169.254.116.232:8080
  http://192.168.88.1:8080
  http://192.168.0.7:8080
  http://127.0.0.1:8080
Hit CTRL-C to stop the server
Easy, and no security risk of accidentally leaving your browser open and vulnerable.
DON'T USE THAT FLAG! You're opening yourself up to having your online accounts hacked and your local data stolen; there are public proof-of-concept demonstrations of both. Run a simple server instead. It's super simple: there are several ready-made ones, they won't take more than a couple of minutes to download, and they require no configuration.
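For instance, if Python 3 happens to be installed, its built-in server is a zero-configuration option (just one example; any of the suggested servers works the same way):

cd /path/to/your/project
python -m http.server 8000
# then browse to http://localhost:8000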

Workflow for using TextMate/Coda with Transmit and Versions

I use TextMate for my HTML, PHP, JS, and other languages, and CSSEdit for my CSS.
I want to integrate TextMate with Transmit better because at the moment I work like this:
TextMate: Edit code
Transmit: Look for folder and drag to online server
Firefox: Refresh page
Rinse, Repeat.
It feels very clunky to me, and I do the same with CSSEdit (although CSSEdit's live preview means that I only have to upload once), but I would like Transmit to upload the edited document to the relevant place on the server on save (given that linked browsing is enabled).
Does anyone have a workflow that they follow, or macros set up in TextMate, for tasks like these? They would certainly make my life a lot easier. Coda is also an option instead of TextMate if needed.
Being able to have Versions/Git-Tower auto commit on save would be great too.
I recommend @Adam's solution for the uploading part of your question, but why are you using Git and Transmit simultaneously? Why not Git for everything?
My workflow:
On my machine I keep a Git repository where I do all the work. The working directory is served by MAMP so that I can test my code before committing anything.
When I'm satisfied I commit my latest changes until I think the branch I'm working on is stable.
When I'm ready, I push to the server, where a post-commit hook checks out the latest version to what I call the "pre-prod" server (see the sketch below).
When everything has been tested to death, branches merged, and so on, I manually check out the repository to the "prod" server.
No need to use an FTP client at any point; everything is done from the editor (TextMate before, Vim now).
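As a rough sketch of that hook step (paths and branch are placeholders; on a bare server-side repository the hook is conventionally named post-receive rather than post-commit):

#!/bin/sh
# hooks/post-receive -- check the pushed code out into the pre-prod web root
GIT_WORK_TREE=/var/www/preprod git checkout -f master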
If you set up a site in Transmit and open the local directory that holds your files, you can activate the TextMate Transmit bundle by typing Ctrl-Shift-F. Then hit either 1 or 2: 1 will upload the current directory, 2 will send the current file.
You might consider using Transmit's ability to mount FTP servers as volumes and simply edit the files directly on the server. To TextMate the mounted FTP server will appear to be just another volume. Search the help files for Transmit Disk, their name for this feature.

Unable to get email-ext.hpi to work in Hudson

I have just set up Hudson and have begun playing around with it.
I have downloaded email-ext.hpi into the folder $HUDSON_HOME\plugins.
I have restarted Hudson after step 1. (I am following this manual method because, for proxy-setting reasons, I am unable to use the automatic way of installing plugins via the "Manage Hudson" page.)
I don't see any errors when Hudson starts. In fact, I see the line
INFO: Started all plugins
BUT:
When I open a project configuration page, I do not see the promised option "Editable Email Notification".
FYI:
1. I am able to set up and run a few basic test builds, and they run fine.
2. I am also able to configure and receive the default Hudson emails for failures and subsequent successes. (This confirms the SMTP settings.)
3. I was also able to set up the Subversion tag .hpi in the same way as detailed above, and that works fine as well!
What am I missing? Thanks in advance for any help!
EXTRA INFO:
Hudson version - 1.379 running on Windows XP
OK, I figured out a workaround (although I still need to dig into why this is a problem). Recording it here for anyone else that may face this issue.
The plugin, when copied into $HUDSON_HOME\plugins, was somehow not really being activated/recognized. But when I also copied it to C:\Documents and Settings\mylogin\.hudson\plugins and restarted the Hudson service, voila! It worked.
If anyone knows why this might have occurred, kindly record it here for reference. Thanks.
To install a plugin, you should use the easy route: in Hudson, go to 'Manage Hudson' -> 'Manage Plugins' -> 'Advanced' (it's a tab) and use the 'upload plugin' option.
Then follow the instructions. Usually you have to restart Hudson to actually get the plugin.
That's way safer than messing around with the file system. In general, the approach you took should have been correct, but there seems to be an issue with your $HUDSON_HOME. Have a look at the "Manage Hudson" -> "Configure System" page: what is the Hudson home directory displayed at the top of the page? I don't know what Hudson does if it can't access the home directory. My assumption is that Hudson runs as a service with a user account rather than the local system account, and that you used a different account to copy the .hpi file.
Install the Maven Legacy and Maven3 plugins.