I've installed WordPress locally on my Mac (OS X Lion).
After enabling vhosts, I've created an entry in my hosts file to point "wordpress" to 127.0.0.1
My vhosts.conf contains:
<VirtualHost *:80>
DocumentRoot "/Users/alex/Sites/wordpress"
ServerName wordpress
<Directory />
Options FollowSymLinks
AllowOverride None
</Directory>
</VirtualHost>
This works, and I can access the default wordpress install, no problem.
Basically, inside my wp-content/themes folder, I've put a symlink to (for example)
/Users/Alex/Projects/SomeTheme/
This folder contains my theme files
However, WordPress just isn't detecting the theme (it doesn't show up in WP admin).
If I copy the folder into wp-content/themes, then it works, so the symlink isn't being followed.
In my /etc/apache2/users/alex.conf I have:
<Directory "/Users/alex/Sites/">
Options Indexes MultiViews FollowSymLinks
AllowOverride None
Order allow,deny
Allow from all
</Directory>
I'm obviously missing something somewhere....
WordPress has issues with symlinks. This is due to how PHP handles the __FILE__ magic constant: __FILE__ returns the absolute path of the file it is used in, and unfortunately it resolves symlinks in the process. (i.e. while you might be accessing the file through /opt/wordpress/instance/wp-content/sym-themes/pretty and the file actually lives in /opt/content/themes/pretty, __FILE__ returns /opt/content/themes/pretty instead of /opt/wordpress/instance/wp-content/sym-themes/pretty, which is what WordPress expects.)
WordPress makes heavy use of __FILE__ in its code, together with the basename() function, to compare the WordPress root directory against the theme directory and derive the theme directory's name for things like parsing files. Because of how PHP resolves __FILE__, WordPress tries to match the two paths, but since they are different it just appends one path to the other and you end up with a path to something that doesn't exist.
Unless you plan on doing a LOT of code modifications, I highly recommend not using symlinks anywhere in the WordPress structure.
As @Drahkar has pointed out, symlinks are difficult in WordPress. To change the theme directory, use a simple plugin:
<?php
// Tell WordPress which filesystem path to scan for themes.
add_filter( 'theme_root', 'sp8963532_theme_root' );
function sp8963532_theme_root()
{
    return 'FULL_LOCAL_PATH_TO_YOUR_THEMES_DIRECTORY';
}

// ...and which URL those themes are served from.
add_filter( 'theme_root_uri', 'sp8963532_theme_root_uri' );
function sp8963532_theme_root_uri()
{
    return 'URI_TO_YOUR_THEMES_DIRECTORY';
}
Copy the code into a file, and put the file in wp-content/mu-plugins/.
This is most likely a beginner's issue, but I can't seem to find the fix anywhere, and the few posts I found dealing with it are unanswered (e.g. xampp in window 7 cannot access files in subfolder inside C:/xampp/htdocs).
So far I have a working localhost using XAMPP (I had to change the port to 8080), located in a custom document root. I can load the index.html, but when I click on a link towards a subdirectory:
<li><a href="examples/test.html">examples/test.html</a></li>
I get the following error:
Service unavailable!
The server is temporarily unable to service your request due to
maintenance downtime or capacity problems. Please try again later.
If you think this is a server error, please contact the webmaster.
Error 503
localhost Apache/2.4.25 (Win32) OpenSSL/1.0.2j PHP/5.6.30
Even if I type in the address (http://localhost:8080/examples/test.html) directly in the browser, it also doesn't work.
Could someone please indicate if this should be working? Or if I should specify something in the apache config file?
I could also note that when simply viewing the html files in my browser (outside of the localhost), the pages work fine and load regardless of their position in the directories. Thanks for any help!
Edit:
Here is my modified conf file DocumentRoot section:
#DocumentRoot "F:/Apps/xampp/htdocs"
DocumentRoot "F:/me/GitWorkDir/myproject_io"
<Directory "F:/me/GitWorkDir/myproject_io">
#
# Possible values for the Options directive are "None", "All",
# or any combination of:
# Indexes Includes FollowSymLinks SymLinksifOwnerMatch ExecCGI MultiViews
#
# Note that "MultiViews" must be named *explicitly* --- "Options All"
# doesn't give it to you.
#
# The Options directive is both complicated and important. Please see
# http://httpd.apache.org/docs/2.4/mod/core.html#options
# for more information.
#
Options Indexes FollowSymLinks Includes ExecCGI
#
# AllowOverride controls what directives may be placed in .htaccess files.
# It can be "All", "None", or any combination of the keywords:
# AllowOverride FileInfo AuthConfig Limit
#
AllowOverride All
#
# Controls who can get stuff from this server.
#
Require all granted
</Directory>
It turns out the issue was due to an unlucky coincidence that prevents a subdirectory from being called, precisely, "examples"...
See this page, which discusses the issue and proposes a fix.
In short, you can:
Either go to the file C:\xampp\apache\conf\extra\httpd-ajp.conf
and add a "#" to comment out the conflicting line (see the sketch after this list):
ProxyPass /examples ajp://127.0.0.1:8009/examples smax=0 ttl=60 retry=5
Or simply rename the "examples" directory to something else (e.g. examples2)
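For option 1, a sketch of what the edited part of httpd-ajp.conf would look like afterwards (the surrounding directives in your copy of the file may differ):

# Tomcat AJP example mapping, commented out so Apache stops proxying
# /examples and serves the local directory instead:
# ProxyPass /examples ajp://127.0.0.1:8009/examples smax=0 ttl=60 retry=5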
Mostly, if you install XAMPP on Windows it runs without any problem. The only thing that tends to cause issues on Windows is permissions, which you can resolve by right-clicking the htdocs folder, going to the Security tab, and giving full rights to Everyone.
Sorry for asking this question again. Even though this has been asked and discussed quite a bit I cannot seem to find the right solution for a local dev environment using VirtualHost. I am using XAMPP Portable for Windows for dev work but assume this is the same for any other local server with regards to the .htaccess file.
DocumentRoot of VirtualHost is D:\dev\www\ for example.
ServerName is devwork.webdev for example.
HOSTS entry is 127.0.0.1 devwork.webdev.
The VirtualHost file also has a default DocumentRoot entry, DocumentRoot "D:/xampp/htdocs", and it works just fine.
The projects are each in a folder under D:\dev\www\ for example D:\dev\www\project01\ or D:\dev\www\project02\ and so on and show nicely in the browser when going to devwork.webdev with Options Indexes FollowSymLinks enabled. Apache is not showing any error and the access log file is also OK, things are working.
Now in my HTML, when I link Project 01 with a root-relative href ("/"), a click on the link takes me to D:\dev\www\, showing all the projects I have in that folder.
Instead I would like to be linked to the root of the project, i.e. D:\dev\www\project01\ or rather http://devwork.webdev/project01/.
How can I get that to work?
I am looking for a solution to this so that I can do dev work locally and without changing the HTML later FTP the data to the live host's root and it will work.
I have read and tried the following:
http://coolestguidesontheplanet.com/redirecting-a-web-folder-directory-to-another-in-htaccess/
https://perishablepress.com/redirect-subdirectory-to-root-via-htaccess/
.htaccess How to redirect root URL to subdirectory files, rewrite to clean URL AND not affect subdomains?
http://alexcican.com/post/how-to-remove-php-html-htm-extensions-with-htaccess/
https://stackoverflow.com/a/990405/1010918
How to redirect /directory/index.html and /directory/index.php to /directory/
Redirecting /directory/index.html to /directory/
with "How to remove .html from URL" and http://forums.modx.com/thread/77211/endless-friendly-url-redirect-from-subdomain-folder-location being closest to what I think I need, but I must be doing something wrong, since I always arrive at the DocumentRoot instead of the folder where the project is kept under the DocumentRoot.
Thank you for any help.
This seems closest:
Add a VirtualHost definition for each project you want to access. (I'm
not sure exactly how to do it on XAMPP for Windows.)
For example, project01.devwork.webdev...
Set DocumentRoot for this VirtualHost to D:\dev\www\project01...
Add the hostname to your hosts file (/etc/hosts on Linux/macOS, C:\Windows\System32\drivers\etc\hosts on Windows).
Open http://project01.devwork.webdev/ in your browser.
You should see the application in D:\dev\www\project01, while all URLs
will be based on "/".
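A sketch of what such an entry in httpd-vhosts.conf might look like, using the example names from this thread (treat it as a starting point rather than the exact config XAMPP ships with):

<VirtualHost *:80>
    ServerName project01.devwork.webdev
    DocumentRoot "D:/dev/www/project01"
    <Directory "D:/dev/www/project01">
        Options Indexes FollowSymLinks
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>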
What happens in the background:
When you open the URL http://project01.devwork.webdev/ in your browser, it will (as usual) translate it to IP address, but along with the request, it will also send Host header with the entered hostname:
GET / HTTP/1.1
Host: project01.devwork.webdev
Based on the Host field, Apache will decide which VirtualHost it needs to "pretend" to be, and serve files from the respective directory.
However, if you need an index of the projects, you will have to create it manually with full URLs.
An alternative with the given logic would be to simply add a VirtualHost for each project in D:\dev\www\projectname, giving it its own domain instead of a subdomain on the host.
There is little extra effort involved now that I'm already editing httpd-vhosts.conf (with XAMPP this file is in the install dir under apache\conf\extra), and ideally it makes copying the local HTML to the live server without changing the HTML possible.
So this is something devs doing local work should keep in mind. I certainly will! Thank you for your help and information.
<VirtualHost *:80>
DocumentRoot "D:/dev/www/newprojectname"
ServerName newdomainname.webdev
ServerAlias www.newdomainname.webdev
ErrorLog "D:/dev/www/log/dev-apache.error.log"
CustomLog "D:/dev/www/log/dev-apache.access.log" common
<Directory "D:/dev/www/newprojectname">
AllowOverride All
Options Indexes FollowSymLinks
Require local
# more detailed local
# Require ip 192.168.188
# or the IP from your local network
</Directory>
</VirtualHost>
I would like to change the default web page that shows up when I browse my site. I currently have a reporting program running, and it outputs a file called index.html. I cannot change what it calls the file. Therefore, my landing page must be called something else. Right now, when I browse my site it takes me to the reporting page.
From what I see, whatever you call index.html it will pull that up as your default. I want to change that to landing.html. How do I do this?
I am a folder (Folding@Home). The reporting program is HFM.net. HFM can output an HTML file with my current folding statistics; it names the HTML file index. I do not want that to be my default home page. I would like a menu-like landing page where I can choose whether I want to see my stats, or something else. The website is at /home/tyler/Documents/hfm/website (landing.html and HFM's index.html are here). Apache2 is in its default directory.
I'm also running Ubuntu 13.04.
I recommend using .htaccess. You only need to add:
DirectoryIndex home.php
or whatever page name you want to have for it.
EDIT: basic htaccess tutorial.
1) Create .htaccess file in the directory where you want to change the index file.
no extension
. in front, to ensure it is a "hidden" file
Enter the line above in there. There will likely be many, many other things you will add to this (AddTypes for webfonts / media files, caching for headers, gzip declaration for compression, etc.), but that one line declares your new "home" page.
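For the setup described in the question, the .htaccess in /home/tyler/Documents/hfm/website might contain just this (landing.html first, with the generated index.html as a fallback):

# Serve landing.html as the directory index; fall back to index.html
DirectoryIndex landing.html index.html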
2) Set the server to allow reading of .htaccess files (this may only be needed on your localhost, if your hosting service defaults to allowing it, as most do)
Assuming you have access, go to your server's enabled site location. I run a Debian server for development, and the default site setup is at /etc/apache2/sites-available/default for Debian / Ubuntu. Not sure what server you run, but just search for "sites-available" and go into the "default" document. In there you will see an entry for Directory. Modify it to look like this:
<Directory /var/www/>
Options Indexes FollowSymLinks MultiViews
AllowOverride All
Order allow,deny
allow from all
</Directory>
Then restart your apache server. Again, not sure about your server, but the command on Debian / Ubuntu is:
sudo service apache2 restart
Technically you only need to reload, but I restart just because I feel safer with a full refresh like that.
Once that is done, your site should be reading from your .htaccess file, and you should have a new default home page! A side note, if you have a sub-directory that runs a site (like an admin section or something) and you want to have a different "home page" for that directory, you can just plop another .htaccess file in that sub-site's root and it will overwrite the declaration in the parent.
You can also set DirectoryIndex in apache's httpd.conf file.
CentOS keeps this file in /etc/httpd/conf/httpd.conf
Debian: /etc/apache2/apache2.conf
Open the file in your text editor and find the line starting with DirectoryIndex
To load landing.html as a default (but index.html if that's not found) change this line to read:
DirectoryIndex landing.html index.html
I had a similar problem. When providing http://server/appDirectory I got a directory listing instead of index.html even though I had
<IfModule dir_module>
DirectoryIndex index.php index.html
</IfModule>
in my httpd.conf file.
My solution was to uncomment the
LoadModule setenvif_module modules/mod_setenvif.so
line in httpd.conf
Apache version: 2.4
In Ubuntu you can add, in the file
/etc/apache2/mods-enabled/dir.conf
the line
DirectoryIndex myhomepage.htm
and then restart the Apache service:
sudo systemctl restart apache2
In Ubuntu, you can update the default page on a site-by-site basis with the site config files, e.g.:
/etc/apache2/sites-available/your.domain.conf
Same syntax for the key line in the file, e.g. mine is:
DirectoryIndex default.htm index.htm
Then don't forget to enable and reload:
sudo a2ensite your.domain.conf
sudo systemctl reload apache2
I just finished serving my pages on the internet through Apache. I can see my webpage nicely, but when I try the admin, the Django admin page doesn't have its CSS, just the HTML page. My own webpage's CSS is displaying nicely, though.
Thank you!
my http.conf snippet:
WSGIPythonPath C:/Users/robin/web/etc/etc
<Directory C:/Users/robin/web/etc/etc>
<Files wsgi.py>
Order deny,allow
Allow from all
</Files>
</Directory>
#Alias /robots.txt /path/to/mysite.com/static/robots.txt
#Alias /favicon.ico /path/to/mysite.com/static/favicon.ico
AliasMatch ^/([^/]*\.css) C:/Users/robin/web/etc/etc/static/styles/$1
#Alias /media/ /path/to/mysite.com/media/
Alias /static/ C:/Users/robin/web/etc/etc/static/
<Directory C:/Users/robin/web/etc/etc/static>
Order deny,allow
Allow from all
</Directory>
#<Directory /path/to/mysite.com/media>
#Order deny,allow
#Allow from all
#</Directory>
WSGIScriptAlias / C:/Users/robin/web/etc/etc/etc/wsgi.py
<Directory C:/Users/robin/web/etc/etc/etc>
<Files wsgi.py>
Order allow,deny
Allow from all
</Files>
</Directory>
my settings snippet:
STATIC_ROOT = 'C:/Users/robin/web/etc/static/'
STATIC_URL = '/static/'
I have made a different directory for the static files, and have already pointed Django to the directory where to find them. I have edited and added the settings snippet above, kindly check it. I have also run the collectstatic command, and it created three directories - admin, css and images - and copied all the static files from the project to that directory, even the admin's CSS and images in the admin dir. The server is also displaying my project's CSS nicely, but not the admin's CSS. What am I missing? Please guide me.
Each application has its own static files (generally in a 'static' directory, but not necessarily; see settings.STATICFILES_FINDERS). Django serves these files in debug mode, but before deploying to a real server you must collect all static files from all apps, put them into one folder, and configure the web server. You can do it manually, or set settings.STATIC_ROOT to a directory Apache serves and run the collectstatic command.
In sum, apache config:
Alias /static/ C:/Users/robin/web/etc/etc/static/
settings.py:
STATIC_ROOT = 'C:/Users/robin/web/etc/etc/static/'
And run collectstatic:
python manage.py collectstatic
Tell Django where to find the static files (if you have not already).
First, make sure you have STATIC_ROOT and STATIC_URL set correctly in settings.py, and then on your live server simply run the following command.
python manage.py collectstatic
This will collect all static files necessary for the admin to render correctly.
When running the development server (runserver), static files are automatically found using STATICFILES_FINDERS, but in production this is not the case. In production, the finders are used when you run python manage.py collectstatic: they locate and collect the files, which are then served, in your case, by Apache.
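For completeness, a rough sketch of the Apache side of this, reusing the redacted C:/Users/robin/web/etc/etc paths from the question and the Apache 2.2-style access directives already used in this thread:

# Map the public /static/ URL prefix to the directory collectstatic fills
# (paths are the question's redacted ones; adjust to your STATIC_ROOT)
Alias /static/ "C:/Users/robin/web/etc/etc/static/"

# Allow Apache to serve files from that directory
<Directory "C:/Users/robin/web/etc/etc/static">
    Order deny,allow
    Allow from all
</Directory>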
OK, so I've previously set up two virtual hosts and they are working fine. They both house simple web projects and work fine with http://project1 and http://project2 in the browser.
Anyway, I've come to add another vhost. I edited the /etc/hosts file with 127.0.0.1 project3 and also updated the httpd-vhosts.conf file by copy and pasting the previous entries for project2 and editing the file path.
I've checked all the file and folder permissions (in fact I copied and pasted from project2) and simply put a "hello world" message in the index.php file.
I get a 403 forbidden permission denied message when accessing http://project3
Why is this? I just can't figure out what step I've missed, as everything seems to be set up correctly.
Check that:
Apache can physically access the file (the user that runs Apache, probably www-data or apache, can access the file in the filesystem)
Apache can list the content of the folder (read permission)
Apache has an "Allow" directive (or, on Apache 2.4, a "Require" directive) for that folder. There should be one for /var/www/; you can check the default vhost for an example, or see the sketch below.
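A hypothetical block illustrating that third point, with a placeholder path (substitute project3's real DocumentRoot):

# Hypothetical path; use project3's actual DocumentRoot here
<Directory "/path/to/project3">
    Options Indexes FollowSymLinks
    AllowOverride All
    # Apache 2.2:
    Order allow,deny
    Allow from all
    # Apache 2.4 equivalent:
    # Require all granted
</Directory>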
Additionally, you can look at the error.log file (usually located at /var/log/apache2/error.log) which will describe why you get the 403 error exactly.
Finally, you may want to restart apache, just to be sure all that configuration is applied.
This can generally be done with /etc/init.d/apache2 restart. On some systems the script is called httpd instead.
I just fixed this issue after struggling for a few days. Here's what worked for me:
First, check your Apache error_log file and look at the most recent error message.
If it says something like:
access to /mySite denied (filesystem path
'/Users/myusername/Sites/mySite') because search permissions
are missing on a component of the path
then there is a problem with your file permissions. You can fix them by running these commands from the terminal:
$ cd /Users/myusername/Sites/mySite
$ find . -type f -exec chmod 644 {} \;
$ find . -type d -exec chmod 755 {} \;
Then, refresh the URL where your website should be (such as http://localhost/mySite).
If you're still getting a 403 error, and if your Apache error_log still says the same thing, then progressively move up your directory tree, adjusting the directory permissions as you go. You can do this from the terminal by:
$ cd ..
$ chmod 755 mySite
If necessary, continue with:
$ cd ..
$ chmod 755 Sites
and, if necessary,
$ cd ..
$ chmod 755 myusername
DO NOT go up farther than that. You could royally mess up your system.
If you still get the error that says search permissions are missing on a component of the path, I don't know what you should do. However, I encountered a different error (the one below) which I fixed as follows:
If your error_log says something like:
client denied by server configuration:
/Users/myusername/Sites/mySite
then your problem is not with your file permissions, but instead with your Apache configuration.
Notice that in your httpd.conf file, you will see a default configuration like this (Apache 2.4+):
<Directory />
AllowOverride none
Require all denied
</Directory>
or like this (Apache 2.2):
<Directory />
Order deny,allow
Deny from all
</Directory>
DO NOT change this! We will not override these permissions globally, but instead in your httpd-vhosts.conf file.
First, however, make sure that your vhost Include line in httpd.conf is uncommented. It should look like this. (Your exact path may be different.)
# Virtual hosts
Include etc/extra/httpd-vhosts.conf
Now, open the httpd-vhosts.conf file that you just Included. Add an entry for your webpage if you don't already have one. It should look something like this. The DocumentRoot and Directory paths should be identical, and should point to wherever your index.html or index.php file is located. For me, that's within the public subdirectory.
For example (adjust the access-control lines to your Apache version, as explained below):
<VirtualHost *:80>
# ServerAdmin webmaster@dummy-host2.example.com
DocumentRoot "/Users/myusername/Sites/mySite/public"
ServerName mysite
# ErrorLog "logs/dummy-host2.example.com-error_log"
# CustomLog "logs/dummy-host2.example.com-access_log" common
<Directory "/Users/myusername/Sites/mySite/public">
Options Indexes FollowSymLinks Includes ExecCGI
AllowOverride All
Order allow,deny
Allow from all
Require all granted
</Directory>
</VirtualHost>
The lines saying
AllowOverride All
Require all granted
are critical for Apache 2.4+. Without these, you will not be overriding the default Apache settings specified in httpd.conf. Note that if you are using Apache 2.2, these lines should instead say
Order allow,deny
Allow from all
This change has been a major source of confusion for googlers of this problem, such as I, because copy-pasting these Apache 2.2 lines will not work in Apache 2.4+, and the Apache 2.2 lines are still commonly found on older help threads.
Once you have saved your changes, restart Apache. The command for this will depend on your OS and installation, so google that separately if you need help with it.
I hope this helps someone else!
PS: If you are having trouble finding these .conf files, try running the find command, such as:
$ find / -name httpd.conf
The restorecon command (which restores the default SELinux security contexts) works as below:
restorecon -v -R /var/www/html/
Notice that another issue that might be causing this is that the "FollowSymLinks" option of a parent directory might have been mistakenly overwritten by the options of your project's directory. This was the case for me and made me pull my hair out until I found the cause!
Here's an example of such a mistake:
<Directory />
Options FollowSymLinks
AllowOverride all
Require all denied
</Directory>
<Directory /var/www/>
Options Indexes # <--- NOT OK! It's overwriting the above option of the "/" directory.
AllowOverride all
Require all granted
</Directory>
So now if you check Apache's log messages (tail -n 50 -f /var/www/html/{the_error_log_file_of_your_site}) you'll see an error like this:
Options FollowSymLinks and SymLinksIfOwnerMatch are both off, so the RewriteRule directive
is also forbidden due to its similar ability to circumvent directory restrictions
That's because the Indexes option in the above rules for the /var/www directory is overwriting the FollowSymLinks option of the / directory. So now that you know the cause, you can fix it in several ways depending on your needs. For instance:
<Directory />
Options FollowSymLinks
AllowOverride all
Require all denied
</Directory>
<Directory /var/www/>
Options FollowSymLinks Indexes # <--- OK.
AllowOverride all
Require all granted
</Directory>
Or even this:
<Directory />
Options FollowSymLinks
AllowOverride all
Require all denied
</Directory>
<Directory /var/www/>
Options -Indexes # <--- OK as well! It will NOT cause an overwrite.
AllowOverride all
Require all granted
</Directory>
The example above will not cause the overwrite issue, because in Apache, if an option is prefixed with "+" it only overwrites other "+" options, and if it is prefixed with "-" it only overwrites other "-" options. (Don't ask me for a reference on that; it's just my interpretation of an Apache error message, checked through journalctl -xe, which says "Either all Options must start with + or -, or no Option may." when one option has a sign but another doesn't, e.g. FollowSymLinks -Indexes. So it's my personal conclusion, to be taken with a grain of salt, that when I use -Indexes as the option, Apache treats it as a distinct set of options from the unsigned options of "/", and so no annoying overwrite occurs in the end, which I could successfully confirm with the above rules in a project directory of my own.)
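A minimal illustration of that merge behaviour, based on the documented Apache semantics (options prefixed with + or - are merged into those inherited from the parent section, while an unsigned Options list replaces them entirely):

<Directory />
    Options FollowSymLinks
</Directory>
<Directory /var/www/>
    # Relative option: merged with the parent's options, so the result here
    # is "FollowSymLinks Indexes" rather than just "Indexes".
    Options +Indexes
</Directory>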
Hope that this will help you pull much less of your hair! :)
This doesn't, however, solve the problem in every case: on e.g. openSUSE Tumbleweed, a custom source build triggers the same error on the default web page, even though it is configured accordingly with Indexes and
Require all granted
The server may need read and search (execute) permission on your home directory, and read permission on the .htaccess therein.
You can try temporarily disabling SELinux and then trying again, using the following command:
setenforce 0
In my case it was failing because the IP of my source server was not whitelisted on the target server.
For example, I was trying to access https://prodcat.ref.test.co.uk from an application running on my source server.
On the source server, find its IP with ifconfig.
This IP should be whitelisted in the target server's Apache config file. If it's not, get it whitelisted.
Steps to add an IP to the whitelist (if you control the target server as well):
ssh to the apache server
sudo su -
cd /usr/local/apache/conf/extra (actual directories can be different based on your config)
Find the config file for the target application, e.g. prodcat-443.conf
RewriteCond %{REMOTE_ADDR} <YOUR Server's IP>
for example:
RewriteCond %{REMOTE_ADDR} !^192\.68\.2\.98
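A hedged sketch of how such a condition is typically paired with a RewriteRule in the vhost config (the actual file on the target server may differ); the idea is to forbid every client whose address does not match the whitelisted IP:

RewriteEngine On
# If the client address does NOT match the whitelisted IP...
RewriteCond %{REMOTE_ADDR} !^192\.68\.2\.98
# ...respond with 403 Forbidden.
RewriteRule ^ - [F]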
Hope this helps someone
Add
<Directory "/path/to/webroot">
Options Indexes FollowSymLinks Includes ExecCGI
AllowOverride All
Order allow,deny
Allow from all
Require all granted
</Directory>
to your config file (e.g. /etc/apache2/sites-enabled/000-default.conf).
What this does is tell Apache2 to override any previous configs, and allow (200) from all before denying (403). It also requires all requests to be granted. This code will have to go in every vhost file, but it does work. I have been using this for over a year.
Tested on a LAMP stack on Debian 11.