Volume 3: nginx: A Successful Foundation

(This is part of My ownCloud Adventure)

The Final Fron… Errrr… Wrong adventure.

In the end I have chosen to utilize Nginx to host my personal ownCloud… At least until I have a reason to change my mind, probably for an arbitrary or subjective reason.

Nginx is rather new to me, but generally speaking, remembering how to pronounce its name was more difficult than getting things working.

Just like when getting lighttpd working, I took advantage of Win32 Disk Imager to write an image to the SD card that already had all of the prep work covered by the Preface completed.

  1.  sudo apt-get update
  2. sudo apt-get install nginx php5-cgi php5-fpm curl libcurl3 php5-curl php5-common php5-gd php-xml-serializer php5-mysql
  3. cd /etc/nginx/sites-available
  4. sudo nano siteName-ssl
    1. copy/paste the Nginx config from ownCloud’s docs
    2. Edits
      • “root /var/www/owncloud”
      • Find:  “location ~ ^/(data|config|\.ht|db_structure\.xml|README)”
        • Change to (Thanks Dave!):
          • “location ~ ^.*/?(data|config|\.ht|db_structure\.xml|README)”
      • comment out:  fastcgi_pass 127.0.0.1:9000;
      • uncomment:  fastcgi_pass unix:/var/run/php5-fpm.sock;
    3. save and exit
  5. cd ../sites-enabled
  6. sudo rm default
  7. sudo ln -s ../sites-available/siteName-ssl
  8. sudo service nginx restart
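
For reference, the edited portions of siteName-ssl ended up looking roughly like the sketch below.  This is trimmed way down from the full config in ownCloud’s docs, and the server_name and certificate paths are just placeholders (mine come from the Preface), so adjust them to your own setup:

    server {
        listen 443 ssl;
        server_name myServer.fqdn;

        ssl_certificate     /etc/ssl/localcerts/myServer.fqdn.pem;
        ssl_certificate_key /etc/ssl/localcerts/myServer.fqdn.key;

        root /var/www/owncloud;

        # Dave's tweaked deny rule
        location ~ ^.*/?(data|config|\.ht|db_structure\.xml|README) {
            deny all;
        }

        location ~ \.php$ {
            include fastcgi_params;
            # fastcgi_pass 127.0.0.1:9000;
            fastcgi_pass unix:/var/run/php5-fpm.sock;
        }
    }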

At this point a choice needs to be made.  One can continue on and install the php-apc module now, or one can go ahead and do the initial ownCloud configuration first… Getting things working and experiencing the load times as-is… Which provides a great before-and-after comparison to fully appreciate what php-apc accomplishes.

PHP-APC

Regardless of which choice is made, the php-apc module is a required addition… And a very quick install.

  1. sudo apt-get install php-apc
  2. sudo service php5-fpm restart

And it is now ready and working!
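
If you want a quick sanity check that the module actually loaded, this one-liner does the trick (it assumes the php command-line binary is present, which the cron entry later on relies on as well; if it is missing, php5-cli is the package):

    php -m | grep -i apc

Seeing “apc” in the output means the extension is registered, and php5-fpm should be using it after the restart above.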

ownCloud Initial Configuration

The first time you bring up ownCloud, which with the above steps can be found at https://server.ip.or.fqdn/owncloud, you are presented with a pretty simple initial configuration page.

For anyone following this path, I believe it’s pretty self-explanatory.  It only needs a few sets of information…

  • Initial username & password
  • ownCloud’s data directory
  • Database details

Some details may be hidden via an Advanced Options link.

The data directory configuration item is the one that may need the most consideration depending on circumstances.  At this early point, with little experience, I did not have a need to divert from the default.

A Little Tuning

There are a couple of options in here that I would recommend changing.

So let’s go into the ownCloud Admin section.

The most important configuration change to make is under the “Cron” section.  While the default is set to “Ajax”, it’s recommended to use the “Cron” option.

Configuring the cron is not difficult, but should be done under the default webserver user.  (ownCloud’s cron doc)

  1. sudo -u www-data crontab -e
  2. Add to the bottom:  */15 * * * * php -f /var/www/owncloud/cron.php
  3. save and exit
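
To double-check that the entry landed under the right user, listing www-data’s crontab should show the line just added:

    sudo -u www-data crontab -l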

Beyond the cron configuration, you can also check “Enforce HTTPS”.  While the webserver should be doing this task, a little backup probably won’t hurt.
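
For the webserver’s side of that task, the usual trick is a plain-HTTP server block that simply bounces every request over to the HTTPS site.  A rough sketch (again, server_name is a placeholder):

    server {
        listen 80;
        server_name myServer.fqdn;
        rewrite ^ https://$server_name$request_uri? permanent;
    }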

App Thoughts

External storage support:  Probably one of the most important available apps.  This lets you connect to Dropbox, Google Drive or other accounts.  It also lets you define local storage directories, such as NFS mounts.

ownCloud Dependencies Info:  I do not believe this one is needed normally, but it is a good tool to have around to check that you have the appropriate dependencies.

Unfortunately I had to disable the Pictures app at this time since it is not working properly for me. I hope it’ll be resolved in the future.

 

And with that… For now… Until next time… The End


Volume 2: lighttpd: An Easy Fling

(This is part of My ownCloud Adventure)

While my initial foray into ownCloud was successful… Even if I did not see it all the way through…  I felt I needed to investigate a solution that would be better suited for the limited hardware of my Raspberry Pi.

The path chosen was a brief dalliance with lighttpd.

Taking advantage of Win32 Disk Imager, I was able to write an image to the SD card with all of the prep work covered in the Preface already completed.  This did save a good bit of time.

  1. sudo apt-get update
  2. sudo apt-get install lighttpd php5-cgi curl libcurl3 php5-curl php5-common php5-gd php-xml-serializer php5-mysql
  3. cd /etc/php5/cgi
  4. sudo nano php.ini
    • uncomment:  “cgi.fix_pathinfo = 1”
    • Not actually sure this is required, as a re-read of the description says that “1” is now the default value
  5. cd /etc/lighttpd
  6. sudo lighty-enable-mod fastcgi-php
  7. cd conf-available
  8. sudo cp 10-expire.conf 10-expire-myHost.conf
  9. sudo nano 10-expire-myHost.conf
    • Append
      • $HTTP["url"] =~ "(css|js|png|jpg|ico|gif)$" {
        expire.url = ( "" => "access 7 days" )
        }
      • etag.use-inode = "enable"
      • etag.use-mtime = "enable"
      • etag.use-size = "enable"
      • static-file.etags = "enable"
  10. sudo service lighttpd restart
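
One item the list above glosses over is the HTTPS side itself.  I won’t swear this is exactly what my config looked like, but lighttpd 1.4 wants the private key and certificate concatenated into a single pemfile, so the two files from the Preface get combined first (the “combined” filename here is just something I made up):

    sudo sh -c 'cat /etc/ssl/localcerts/myServer.fqdn.key /etc/ssl/localcerts/myServer.fqdn.pem > /etc/ssl/localcerts/myServer.fqdn.combined.pem'
    sudo chmod 600 /etc/ssl/localcerts/myServer.fqdn.combined.pem

…and then an SSL socket block, in /etc/lighttpd/lighttpd.conf or a conf-available snippet, points at it:

    $SERVER["socket"] == ":443" {
        ssl.engine  = "enable"
        ssl.pemfile = "/etc/ssl/localcerts/myServer.fqdn.combined.pem"
    }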

And with that the ownCloud first-run webpage should be accessible at https://server.ip.or.fqdn/owncloud.  Which leaves things in basically the same spot as the end of Volume 1.

After describing everything needed to get lighttpd up and running with ownCloud, I’m hoping a detail was not left out… It does appear to live up to the “Easy Fling” description though.

It’s possible this would have been the end of my adventure, except for one detail.  In the Preface I mentioned security… And while I have no particular knowledge or specific security concern to doubt lighttpd… I was a little put off by the last update being over 9 months ago in November 2012.

And similarly, while I cannot speak to nginx’s security posture being better than lighttpd’s, it appears to be more actively maintained… Not to mention its quickly growing popularity.

Which leads us to… Volume 3:  nginx:  A Successful Foundation


Volume 1: Apache2: A Heavy Duty Companion

(This is part of My ownCloud Adventure)

My adventure with ownCloud started out well, focusing on the goal of using Apache for my webserver, but it appears that some of my records were lost during my adventure… With the actual commands I used to install Apache now unavailable (i.e. never recorded to begin with).

So I’ll be providing what I believe to be the best reconstruction (i.e. guess) that I can.

  1. sudo apt-get update
  2. sudo apt-get install apache2 php5-gd php5-curl  php5-cgi libapache2-mod-php5 php5-mysql libcurl3 php5-common php-xml-serializer
  3. cd /etc/apache2/sites-available
  4. sudo cp default mySite
  5. sudo cp default-ssl mySite-ssl
  6. sudo nano mySite-ssl
    1. edit:  SSLCertificateFile /etc/ssl/localcerts/mySite.fqdn.pem
    2. edit:  SSLCertificateKeyFile /etc/ssl/localcerts/mySite.fqdn.key
    3. save and exit
  7. sudo service apache2 stop
  8. sudo a2dissite default
  9. sudo a2ensite mySite
  10. sudo a2ensite mySite-ssl
  11. sudo service apache2 start
  12. Test:  https://server.ip/owncloud
    1. If the certificates are working, you’ll have to tell your browser that you accept the risk of accepting a self-signed cert
  13. Assuming it worked…
    1. cd /etc/apache2/sites-available
    2. sudo nano mySite
      1. After “DocumentRoot” line, add…
        1. Redirect permanent / https://site.ip/
        2. or:  Redirect permanent / https://fqdn/
      2. save and exit
    3. sudo service apache2 reload
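
To picture where the step 6 edits land, here is roughly the shape of the relevant part of mySite-ssl… A sketch only, with the cert paths from the Preface standing in for whatever was actually generated (and, being a reconstruction, I suspect a “sudo a2enmod ssl” belongs in there somewhere too if SSL was not already enabled… My notes don’t say):

    <IfModule mod_ssl.c>
    <VirtualHost _default_:443>
        DocumentRoot /var/www

        SSLEngine on
        SSLCertificateFile    /etc/ssl/localcerts/mySite.fqdn.pem
        SSLCertificateKeyFile /etc/ssl/localcerts/mySite.fqdn.key
    </VirtualHost>
    </IfModule>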

These instructions make a few assumptions that should be mentioned before progressing further.

  • The Preface has been followed
  • apache’s default root directory is “/var/www”
  • an owncloud directory (or symlink) exists at “/var/www/owncloud”

At this point, going to “https://site.ip.or.fqdn/owncloud” should bring one to the initial configuration page for ownCloud.  On a Raspberry Pi, with its limited hardware, it may take more than a few seconds to appear.

One last parting thought… Apache2 is a good webserver.  It has served me well over the years, but as the years have passed it’s put on some weight.  During this initial ownCloud endeavor… and it hit me when loading ownCloud for the first time… I learned that there are other, less weighty (i.e. light) webserver options.

So in an effort to not repeat myself, it’s at this point that Volume 1 will wrap up, as I plan to go into more post-installation details at the end of Volume 3.

Until next time…  Volume 2:  lighttpd:  An Easy Fling


 

Preface: Common ownCloud Path

(This is part of My ownCloud Adventure)

For any adventure to come to a successful conclusion, the proper preparations must first be made.

With my previous experience working with the Raspberry Pi I was able to quickly get a dedicated server setup and connected to my Synology NAS via NFS.

I should mention here, to plant a seed of thought, that throughout my endeavors the security posture of my system has been a constant consideration.  As an example, with my NFS configuration there are mounts available on my network that I did not give my ownCloud host access to… I am just not comfortable with some files being remotely accessible.

While not exhaustive, there are some common tasks that should probably be performed when setting up a new Raspbian instance:

SD Card Images

Throughout my adventure I made extensive use of Win32 Disk Imager to create images of the SD card.  This allowed me to configure common features once and just reload an image to start over if needed.

For example, I have an image that I created after performing my basic Raspbian updates and configurations.  After that I have an image with the SSL certs and MySQL already taken care of.  This definitely made it much easier to go from Apache2, to lighttpd, and finally end up at nginx with a “clean” system.

SSL Certs

To allow any of the webservers to utilize HTTPS, generating SSL certificates is the first task.  There are MANY resources available out there, but here are the basic commands I performed.

  1. cd /etc/ssl
  2. sudo mkdir localcerts
  3. sudo openssl req -newkey rsa:2048 -x509 -days 365 -nodes -out /etc/ssl/localcerts/myServer.fqdn.pem -keyout /etc/ssl/localcerts/myServer.fqdn.key
  4. sudo chmod 600 localcerts/myServer.fqdn.*

These commands result in 2 files as output:  a PEM certificate & a key.  Both are used by any webserver to enable HTTPS.
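
If you want to eyeball what was actually generated before wiring it into a webserver, openssl will happily dump the details:

    openssl x509 -in /etc/ssl/localcerts/myServer.fqdn.pem -noout -subject -dates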

You will be asked a number of questions during key generation.  Since this results in a self-signed certificate, answer them however you like.  Except for the FQDN question, I’m not sure any of them even technically matter.  And in the case of the FQDN question, I didn’t care if its value matched my dynamic DNS name or not.

The one important technical detail is that if you do not want to enter a password every time your webserver starts, then do not enter a password when prompted.


MySQL

ownCloud supports multiple database backends, but I chose MySQL since it’s familiar to me (although I do wish MariaDB were available in the Raspbian repository).

  1. sudo aptitude
    1. Install MySQL server
    2. The install will ask for a ‘root’ password for your new database server
  2. mysql_secure_installation
    • A script that performs a number of standard best-practice configurations.  Be sure to follow its recommendations!
  3. mysql -u root -p
    • No need to put your password in as an option, you will be prompted
  4. At the “mysql>” prompt
    • create database myOcDbase;
    • create user 'myOcUser'@'localhost' identified by 'myUserPass';
    • create user 'myOcUser'@'127.0.0.1' identified by 'myUserPass';
    • grant all privileges on myOcDbase.* to 'myOcUser'@'localhost';
    • grant all privileges on myOcDbase.* to 'myOcUser'@'127.0.0.1';
    • exit
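
Before moving on, it is worth confirming the new account actually works, since these are the exact details the ownCloud first-run page will ask for:

    mysql -u myOcUser -p myOcDbase

If that lands you at a “mysql>” prompt after entering myUserPass, the database side is ready.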

Good Resource:  http://dev.mysql.com/doc/refman/5.5/en/index.html

Acquiring ownCloud

Getting a hold of ownCloud is not difficult and can be accomplished via various means.

I originally dabbled with manually adding an ownCloud repository to my system’s repo list.  I just followed the instructions found for Debian off ownCloud’s Linux packages install link.

  1. cd /etc/apt/sources.list.d
  2. sudo nano owncloud.list
    • Enter:  “deb http://download.opensuse.org/repositories/isv:ownCloud:community/Debian_7.0/ /”
    • save and exit
  3. cd
  4. wget http://download.opensuse.org/repositories/isv:ownCloud:community/Debian_7.0/Release.key
  5. sudo apt-key add - < Release.key
  6. sudo apt-get update
  7. sudo apt-get install owncloud

While this method did work and is not a bad way to go, especially considering its many advantages… I was unsure of how quickly the repository would be updated with new versions, so I instead elected to go with the manual install.
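
If you do go the repository route and share that concern, apt can at least show which version the repo is currently offering:

    apt-cache policy owncloud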

  • cd
  • wget http://download.owncloud.org/community/owncloud-5.0.10.tar.bz2
    • As versions change, this link will change.  So be sure to get the latest Tar link.
  • tar -xjvf owncloud-5.0.10.tar.bz2
  • mv owncloud owncloud_5.0.10
  • sudo cp -r owncloud_5.0.10 /var/www
  • cd /var/www
  • sudo chown -R www-data:www-data owncloud_5.0.10
  • sudo ln -s owncloud_5.0.10 owncloud
    • Using a symbolic link in this fashion can help in the future with manual updates.  Just follow ownCloud’s manual update instructions and pre-position the latest version’s directory under /var/www and re-do the symlink for a quick and easy upgrade
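
To make that concrete, a future manual upgrade would look something like this sketch (the version number is made up, and ownCloud’s own upgrade steps… maintenance mode, carrying over config and data, etc… still apply):

    cd /var/www
    sudo rm owncloud
    sudo ln -s owncloud_x.y.z owncloud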

And that seems to wrap up the common activities across each of the volumes in my adventure.

WHS 2011 Install

The install of WHS 2011 went pretty smoothly.

Biggest issue was a little static electricity that froze the system during the install.  Had to start over, but was not far in, so not a big loss. No hardware lost either (silver lining!).

The install automatically partitions the main drive, with no option to change.  Other than the feeling of losing control, I can understand this decision.

The install DOES require a network connection to perform.  Not sure why it’s needed during the install… Afterwards, when needing to do all the updates (and there are a good number), sure… But during?

I was worried that the Shuttle BIOS would not like the lack of a monitor / keyboard / mouse connected post-install, but it did not complain a bit.  Win!

I did disable “Shadow Copies” on my drives.  Primarily to reduce space usage since I cannot just add more drives to increase space (and I got the biggest 2.5″ drives I could find).  We’ve never had a need for this feature, so it’s not a big loss.

The client connector installs were very time consuming.  The wizard says it could take 30 mins or more… Try well over 1 hour with little feedback that things are even working.  At least this time, patience worked.

One Issue / Resolution: I did hit a small snag when installing the client connector software.  I believe it may have been a conflict with the old client connector software, which I had already removed as a first step.  However, it kept saying that it could not install because another software install was happening (or something like that).  So I restarted and that solved the problem.  I guess there was something still hanging around from the uninstall.