I'm working on a school assignment; my website files are stored on a remote server, which I access via VPN and a remote server connection on macOS.
When I modify my HTML files, the changes aren't reflected immediately; sometimes they show up only after a day or two (in fact quite randomly: it can be an afternoon, or an hour).
It's a bit of a problem when you're trying to have long coding sessions. Sometimes one page refreshes but not the others.
I'm not having any problems with my PHP files; they update immediately.
I've tried several things, none of which made a difference:
Emptying the cache
Trying on different web browsers
Disconnecting from the server and VPN
Waiting :)
System info:
macOS 10.12.2
Safari 10.0.2
Thanks for the help. I personally think it's a problem with the server, but I won't be able to change that; hopefully it's something I can fix on my end.
I'm working with Aegir to migrate/clone sites from platform1/db1 to platform2/db2. platform1, platform2, db1 and db2 all verify successfully, but when I try to migrate/clone a site, the task spins forever. When I go onto the server I can see the database has been created on db2 and the site has been created on platform2, but the task never finishes; it stays in progress. I cancelled the task and re-verified the platform, which didn't help.
Is there anything I'm missing, or am I doing something wrong?
You might wish to compare site creation times on both servers; it could take as long to migrate a site over as it does to create one.
If that works (creating a site on db2), I would try disabling unstable/custom modules until the migration works; it sounds as though something is interfering.
(By the way, in my opinion you should post at least the last lines of the task log, how long you waited, how big the site's backups are, and any relevant info on the DB engines, etc. I notice you also didn't mention installs on db2.)
I created an application that works perfectly on my computer, but when I uploaded it to start server tests it became very slow, especially after a couple of uses (the first minutes work fine)... It even becomes unresponsive: as I move through a TreeTable, a form should be updated from the database, but it stops working after a while...
I'm using an Amazon EC2 Linux server and a MySQL database... I checked whether the database connections were the problem, but I'm using no more than 7 of the 150 max connections.
Is this a common problem?
Any ideas on how to solve this?
Thanks!!!
Note: this is a copy of a thread from the Vaadin forum: https://vaadin.com/forum#!/thread/4816326 ... I hope it's not against the forum rules to do this...
It sounds like you may have a memory leak somewhere in your application that your computer is able to sustain, but your server is not. I would suggest running some load testing on another machine and seeing which actions cause it to spin out.
You can have a look at this SO answer to see how to do that:
https://stackoverflow.com/a/46227692/460802
I've been researching the best way to show a "users online" counter that updates every second, while avoiding continuous AJAX polling.
Obviously WebSockets seem to be the best option. Since this is an intranet, I'll make Chrome or Safari a requirement, so there shouldn't be compatibility issues.
I've been reading some articles about WebSockets, since I'm new to them, and I think I pretty much understand how they work.
What I'm not so sure about is how to implement this with PHP. Node.js seems the natural choice because of its "always running" nature, but that's not an option.
What I'm most confused about is the fact that PHP runs and, when it's done, it ends. If PHP ended, wouldn't the socket connection be lost? Or if the PHP script re-runs, would it look the user back up by IP? (I don't see that as likely.)
Then I found this library:
http://code.google.com/p/phpwebsocket/
but it seems to be a little old (it mentions that only Chrome nightly supports WebSockets).
At one point it says, "From the command line, run the server.php program to listen for socket connections," which means I need SSH access, something many shared hosting plans don't offer.
My other doubt is about this line in the library's source:
set_time_limit(0);
Does that mean the PHP file will run continuously? Is that allowed on shared hosting? From what I know, most hosts kill PHP scripts after a timeout of 1 or 2 minutes.
I have a MySQL table of online users, and I want to use PHP to broadcast the number of logged-in users to those same users via WebSocket. Can someone please help me, or point me to better information on how this could be achieved?
Thanks
WebSockets would require a lot of things even on dedicated hosting, let alone shared hosting.
For your requirement, server-sent events (SSE) are the correct choice, since only the server will be pushing data to the client.
With SSE the client simply calls a server script, much like AJAX, but it receives and can process the data piece by piece as it arrives; a DOM event fires whenever new data comes in.
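Here is a minimal sketch of what such a server script could look like in PHP. The table name online_users, the DSN and the credentials are placeholders for illustration; you said you already have a MySQL table of online users, so adapt the query to it.

<?php
// Minimal SSE endpoint (sketch): push the current user count once a second.
set_time_limit(0);                        // lift PHP's execution time limit
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

while (!connection_aborted()) {           // stop when the client disconnects
    $count = $pdo->query('SELECT COUNT(*) FROM online_users')->fetchColumn();
    echo "data: {$count}\n\n";            // an SSE frame ends with a blank line
    @ob_flush();
    flush();                              // push the frame to the client now
    sleep(1);
}

On the browser side all you need is the EventSource API (for example new EventSource('online.php')); it fires a DOM message event for every data: frame, exactly as described above.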
But IE does not support SSE, even in version 10, so for IE you have to use a fallback technique like the "forever iframe".
As far as hosting is concerned, ordinary shared hosts (at least those that are not very cheap) will allow PHP scripts to run for a long time, as long as they are not seen as a problem.
I was originally planning on using a local machine on our network as the development server.
Then I had the idea of using a subdomain.
So if the site was at www.example.com then the development could be done at dev.example.com.
If I did this, I would know that the entire software stack was configured exactly the same for development and production. Also, development could use the same database as production, removing the hassle of syncing the data. I could even use the same media (images, videos, etc.).
I have never heard of anyone else doing this, and with all these pros I am wondering why not?
What are the cons to this approach?
Update
OK, so it seems the major no-no of this approach is using the same DB for dev and production. If you take that out of the equation, is it still a terrible idea?
The obvious pro is what you mentioned: no need to duplicate files, databases, or even software stacks. The obvious con is slightly bigger: you're using the exact same files, databases, and software stacks. Needless to say, if your development isn't working correctly (infinite loops and whatnot), production will be pulled down right alongside it. Obviously, there are ways to jail both environments within the OS, but in that case you're back to square one.
My suggestion: use a dedicated development machine, not the production server, for development. You want to split it for stability.
PS: Obviously, if the development code misses a "WHERE id = ?", all information in the production database is removed. That sounds like a huge problem, doesn't it? :)
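To make that concrete, here is a hypothetical sketch (table and column names are invented for the example) of how small the gap is between a harmless dev query and a destructive one when both environments share one database:

<?php
// Hypothetical dev code pointed at the shared production database.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$newHash = password_hash('temporary', PASSWORD_DEFAULT);
$userId  = 42;

// Intended: reset one user's password.
$pdo->prepare('UPDATE users SET password = ? WHERE id = ?')
    ->execute([$newHash, $userId]);

// One forgotten WHERE clause later, the same statement silently
// rewrites every row of the production users table:
$pdo->prepare('UPDATE users SET password = ?')
    ->execute([$newHash]);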
People do do this.
However, it is a bad idea to run development against a production database.
What happens if your dev code accidentally overwrites a field?
We use subdomains of the production domain for development as you suggest, but the thought of the dev code touching the prod database is a bit hair-raising.
In my experience, using the same database for production and development is nonsense. How would you change your data model without changing your code?
And two more things:
It's wise to prepare all changes as an SQL script that is run after testing, from a separate environment rather than from your live console (see the sketch after this list). Some accidental updates to a live system gave me headaches for weeks.
It once happened to me that a restored backup didn't reproduce a live-system problem because of an unordered query result. That strange behaviour of the backup later helped us find the real problem more easily than retrying on the live system would have.
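A sketch of the first point: keep the changes in a reviewable file and apply them with a small script instead of typing into a live console. The file name changes.sql and the credentials are placeholders.

<?php
// Apply a reviewed migration file statement by statement (sketch).
// Naive split on ';': fine for simple DDL/DML, wrong for statements
// containing semicolons inside string literals.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$statements = array_filter(array_map('trim',
    explode(';', file_get_contents('changes.sql'))));

foreach ($statements as $sql) {
    echo $sql . "\n";   // log each statement before running it
    $pdo->exec($sql);
}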
Using the production machine for development takes away your capacity to experiment. Trying out new modules/configurations can be very risky in a live environment. If I mess up our dev machine with an error in the apache conf, I will just slightly inconvenience my fellow devs. You will be shutting down the live server while people are trying to give you their money.
Not only that, but you will be sharing resources with the live environment. You can forget about stress testing when the dev server also has to deal with actual customers. Any mistakes that can cause problems on the development server (an infinite loop taking up the entire CPU, running out of HDD space, etc.) suddenly become a real issue.
Previously, when covering events, I've typed my reports into an HTML file and FTP'd it to the server. If I lose connectivity or the laptop crashes, I've got the last saved copy.
We just switched our event coverage area to a database holding Textile-formatted entries entered via a web-form textarea.
Is it at all possible to create a mechanism whereby a local copy of the textarea is saved, so I can keep working during a connectivity failure? I'm using Windows on the laptop. I guess the low-tech way would be to type in a word processor and just keep pasting into the web form.
You could take a look at the local storage features that browsers have to offer.
Local storage is a nice idea; cookies might also be a solution (one that even works in older browsers) if the texts are not too long.
However, I'd use server-side backups (created automatically via AJAX requests) and only fall back to local backups when there's no connection to the server. You never know whether local backups will persist if the browser or even the whole system crashes.
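As a sketch of that approach, the endpoint below is hypothetical (the drafts table, the draft_id parameter and the file name are assumptions), but it shows how little server-side code the AJAX backup needs:

<?php
// save_draft.php: hypothetical autosave endpoint (sketch). The client
// POSTs the textarea contents here every few seconds (e.g. with
// XMLHttpRequest) and falls back to local storage while offline.
// Assumes a drafts table where draft_id is a unique key.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$stmt = $pdo->prepare(
    'INSERT INTO drafts (draft_id, body, saved_at)
     VALUES (?, ?, NOW())
     ON DUPLICATE KEY UPDATE body = VALUES(body), saved_at = NOW()'
);
$stmt->execute([$_POST['draft_id'], $_POST['body']]);

echo 'ok';   // any other response (or none) means: treat as offline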