A week with ownCloud

Last week I installed “ownCloud”, a Dropbox-like file-sharing service where you run the server yourself, so nobody you don’t want has control over your files. It also provides calendar and contact sharing, and supposedly an RSS reader/aggregator to replace Google Reader, although I haven’t figured out how to set that up yet.

Installing it was pretty easy – I used the Debian packages hosted on openSUSE’s build service for both the server and my Linux box, and a more direct install on my MacBook. It took a bit of messing around to get SSL working on my web server, because my sites-enabled config files were a mess that had just barely worked in the past.
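
For anyone fighting the same battle, a minimal SSL vhost for ownCloud looks roughly like the sketch below. The hostname, certificate paths, and install directory are placeholders rather than my actual setup, and your Apache version may want slightly different directives.

    <VirtualHost *:443>
        ServerName cloud.example.com

        SSLEngine on
        SSLCertificateFile    /etc/ssl/certs/cloud.example.com.crt
        SSLCertificateKeyFile /etc/ssl/private/cloud.example.com.key

        DocumentRoot /var/www/owncloud
        <Directory /var/www/owncloud>
            # ownCloud ships an .htaccess that needs to be honored
            Options +FollowSymLinks
            AllowOverride All
        </Directory>
    </VirtualHost>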

I added the Documents folder on my MacBook to ownCloud, and it synced; I could see all the files in the web interface. Then I moved the Documents dir on my Linux box out of the way and set the client there to sync a fresh ownCloud Documents dir. A little while later, all the documents from the Mac were there on my Linux box. Then I moved all the docs that had been in the Linux box’s Documents dir back into place, and watched as they appeared in the web interface and on my Mac. The very next day, the server reported that ownCloud 5.0 was out, so I upgraded.

The upgrade didn’t go 100% smoothly. At one point during the web interface part of the upgrade it appeared to stop doing anything, so I reloaded the page. I’m not sure if that was the cause of all my later problems, or just another symptom. Later I noticed an error in the web admin page about a duplicate key in an index. That probably isn’t good.

I didn’t run the old 4.7 version long enough to tell whether these happened there too, but this week I’ve noticed the following big problems:

  • frequently throughout the day, the “dock icon” on Linux or the toolbar icon on the Mac will indicate a problem, but it will go away on its own. The error mentions something about being unable to find a sync file.
  • the Linux client crashes almost every night, and sometimes during the day
  • this morning, the Linux client hadn’t crashed, but it was consuming 150% of a CPU
  • the number of requests to my Apache server has gone up astronomically, mostly “PROPFIND” requests against ownCloud (there’s a quick way to count them after this list)
  • when I noticed a dozen or so directories in Documents that I don’t need any more and removed them on the Linux box, three of them came back. One of them came back seven or eight times, as I removed it on the Linux box, then in the web interface, and finally on the Mac
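
If you’re curious just how chatty the sync client is being, a quick pass over the Apache access log gives a rough count; the log path below is the Debian/Ubuntu default, so adjust for your distribution.

    # count PROPFIND requests per client IP, busiest first
    grep PROPFIND /var/log/apache2/access.log | awk '{print $1}' | sort | uniq -c | sort -rn | head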

I also tried installing the client on a 32-bit Linux VirtualBox VM, and was able to get it to sync the default “ownCloud” directory, but I couldn’t get it to sync “Documents”.

In spite of these problems, I think it’s an awesome idea as a file-sharing tool. I might try blowing the installation away and recreating it to see if that clears up some of the annoyances.

Having a central store for my calendar is pretty nice, and I can use CalDAV to add that calendar to my iPad calendar and others. I exported my whole Google calendar and imported it into the ownCloud calendar without too much trouble. The drawback is I don’t see any easy way to allow Vicki to copy things onto my calendar like she does with my Google calendar. Also, the web interface is pretty basic, but like I say, you can just point other calendar programs, like the iOS and KDE ones, at it.
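
For reference, the URL you point the iOS or KDE calendar at looks something like the line below. The hostname, user, and calendar name are placeholders, and the exact path may vary with your ownCloud version and whether it lives in a subdirectory.

    https://cloud.example.com/owncloud/remote.php/caldav/calendars/USERNAME/CALENDARNAME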

So if you’re like me and wish Dropbox gave you more disk space, and you just happen to have a spare 20 or 30 gig of unused space on a server somewhere, give ownCloud a try.

Big changes coming

So I’ve decided to spend a few bucks to fix a few niggling little issues around the house, mostly in the computer department:

  • First off, I’m worried about some recent break-ins and vandalism in the neighborhood.
  • Secondly, and slightly related, when I’m working in my office at the back of the house, it would be nice to know when the FedEx guy is ninja-ing a non-delivery tag at the front door instead of ringing the doorbell and waiting. Or know when the dogs bark whether it’s somebody at the door or just a shadow across the road.
  • The wifi penetration in the house sucks – in some parts of the house, your device will show one bar but nothing will actually get through. And if the microwave is on, forget about getting any signal on the other side of it. I put in a wifi repeater but it’s dog slow, and it uses a different SSID so you have to switch between SSIDs as you move around the house.

So here is what I’m in the process of doing to fix all those things:

  • I bought a security camera – an Airsight PTZ Pro outdoor camera with pan/tilt/zoom. If I wanted to, I could hook up a microphone and speaker so I could yell at the FedEx delivery guy to wait for 5 seconds while I run down. I’ve been playing with it and it is pretty amazing, although I’ve found one big flaw (more on that later).
  • I am running network cable from my office down into the basement, and from there on to the far corner of the basement, the dining room, and out to the front porch. The cable is currently pulled, but it’s not terminated and tested yet.
  • I’ve got an 8-port Gigabit Ethernet switch tacked to the wall where the first network cable drop comes down.
  • In the far corner of the basement, I’ve got a second router ready to install. I’m going to put this on the same SSID as the main one upstairs, and the same password, but on a different channel, turn off DHCP, and run the outgoing cable from my main router into the “WLAN” port of this one. I believe this will make the switchover from one to the other transparent so you don’t have to remember to switch SSIDs as you walk around the house, and it should perform a lot better than using the repeater. As an added bonus, it also supports 5GHz.
  • In the dining room, where Vicki spends 90% of her time when she’s using her computer, especially when she’s doing Second Life for work, there will be a wired network drop. Wifi is all well and good, especially 5GHz, but nothing beats wired.
  • The camera supports Power over Ethernet (or PoE, as they call it in the brochure). So since I had to run power out to it anyway, I figured I’d give it the advantages of a wired connection and run it all through the same wire.
  • The camera has the option to upload pictures and recordings to an FTP server. I figured that’s not much good to you if the thieves break in and steal your computer as well, so I’ve ordered a tiny little Raspberry Pi (aka Rπ). I already have a hard disk taken from a laptop that’s not doing anything, so I figure I can set up a tiny little FTP server (roughly as sketched after this list) and hide it somewhere thieves won’t find it even if they’re ransacking the house: a closet, an obscure corner of the basement, even inside a wall somewhere. These things are amazingly tiny. I’m also considering using the Rπ to run ZoneMinder as an alternative to the camera’s built-in functionality, because of the already foreshadowed flaw in the camera.
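
The FTP side of that plan is nothing exotic; on Raspbian I expect it to be roughly the sketch below, with the mount point and the disk’s device name being guesses until the hardware actually arrives.

    # install a simple FTP daemon and point it at the salvaged laptop disk
    sudo apt-get install vsftpd
    sudo mkdir -p /srv/camera
    sudo mount /dev/sda1 /srv/camera   # assuming the old disk shows up as sda1

    # in /etc/vsftpd.conf, let a local user log in and upload:
    #   local_enable=YES
    #   write_enable=YES
    sudo service vsftpd restart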

Ok, so what is this big flaw you’ve been talking about, I hear you ask? Well, it’s simple. The camera has the option to, when it detects motion, email you 5 pictures and start recording video to an FTP server. It also has the ability to pan and tilt and zoom. Those are two awesome features, right there. So what’s the problem? Well, when you set it panning, it interprets *that* as actionable movement and starts sending you emails. Not a good thing if you want it to continuously pan back and forth. There is another option in the camera that lets you set up a bunch of fixed locations and have it cycle between those locations at intervals. I haven’t yet tested it to see if it’s smart enough to ignore movement while it’s moving between locations.

Oh, in other techie stuff, I finally got around to upgrading my Gallery site to Gallery 3. In spite of the promises, the “Gallery 2 Importer” isn’t able to properly translate the URLs that Gallery 2 used into Gallery 3 ones, so links to the Gallery are probably all broken. I did put in a mod_rewrite rule to take care of some of them, but not direct links to image files. Also, I seem to have lost all my raw pictures and movies.
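
A rule like that looks along these lines; the old and new paths here are made-up stand-ins, since the real mapping depends on how your Gallery 2 URLs were structured and whether Gallery 3 has clean URLs turned on.

    # in the site's Apache config (or .htaccess): send old Gallery 2 album URLs to their Gallery 3 equivalents
    RewriteEngine On
    RewriteRule ^/?gallery2/v/(.*)$ /gallery3/index.php/$1 [R=301,L]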

I’m also currently looking into installing “ownCloud” as a way to get more space than I have with Dropbox without paying for it. I want enough space that I can throw my entire Documents folder on it instead of having to think “do I need this on all my machines, or is it ok if it’s just here?” for every file. Since one of the two people renting space on my colo box never pays his rent except when I send him an email asking if he’s still using it, I think I know where I can lay my hands on 100 GB of disk space on a server in a rack really cheap.

Well, that could have gone better

I volunteered to give a presentation to the Linux Users Group of Rochester (LUGOR) about LVM, the Logical Volume Manager. I knew I had half an hour, so I made a presentation, rehearsed it several times, and knew I could get through it in half an hour. I did it on my laptop, using VirtualBox to stand in for a computer that I could virtually add and remove drives from. I was told the room we were presenting in had a projector that took HDMI input, and my laptop has an HDMI output, so I figured I was set.
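
The demo was basically the standard LVM workflow: something like the commands below, with /dev/sdb and friends standing in for whatever the virtual disks happen to show up as.

    # turn two new virtual disks into physical volumes and pool them into a volume group
    pvcreate /dev/sdb /dev/sdc
    vgcreate demo_vg /dev/sdb /dev/sdc

    # carve out a logical volume, put a filesystem on it, and mount it
    lvcreate -n demo_lv -L 10G demo_vg
    mkfs.ext4 /dev/demo_vg/demo_lv
    mount /dev/demo_vg/demo_lv /mnt

    # later on: add another disk and grow the volume and the filesystem
    vgextend demo_vg /dev/sdd
    lvextend -L +5G /dev/demo_vg/demo_lv
    resize2fs /dev/demo_vg/demo_lv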

The first hitch was arriving to find out that we had been bumped from our room because some musicians were warming up for a concert they were giving elsewhere in the building, and the new room had a projector that only took VGA or DVI. Oh, and also I’d evidently gotten my signals crossed and I was actually supposed to present next month. But no matter: the guy who was supposed to give the second talk today wanted to go first because he was sick and wanted to bail early, and the guy who was supposed to give the first talk wanted an hour, not half an hour, and would rather postpone. So the guy who wanted to go first talked first, and got me all intrigued about “ownCloud”. I may be setting that up one of these days.

Then the first room became available again, and we trooped back to it. I plugged in my laptop and got the two screens set up non-mirrored so I could do the PowerPoint part of the show, and then the projector screen started randomly flashing between what it was supposed to be showing and a green screen with something about HDCP on it. I didn’t know it at the time, but that means the copy protection on my laptop isn’t compatible with the copy protection on the projector. We spent some time wiggling wires, changing settings on both the laptop and the projector, and so on, and finally I gave up.

Another guy gave a good quick little presentation on the Raspberry Pi. Amazing power in such a small cheap package. I’ve got one on order for another project, but it might be many weeks before I see it.

While he was talking, one of the other members handed over his laptop. It was an Acer that isn’t as high end as my MacBook Air, but it had two things going for it:

  1. It had already proven it could display to the projector, and
  2. It had VirtualBox installed on it.

I copied my VirtualBox disk files and my PowerPoint over to his laptop, and when the Raspberry Pi presentation was over, I started my presentation. And that’s when the next problem reared its ugly head. Every time I booted my VirtualBox instance on my own laptop, it took about 10 seconds. Every time I booted it on his computer, it took literally 10 minutes or more. Since I had to reboot several times during the presentation (because I was simulating adding and removing disks), this caused the talk to drag out drastically. Fortunately there were lots of things to talk about during those long pauses. Charles, the organizer, used one of the pauses to explain in great detail what exactly I was doing with VirtualBox and which parts of what I was showing belonged to it, which belonged to the guest OS, and which belonged to LVM, something I fear I hadn’t even thought to explain in my presentation.

With all the long pauses and delays, my “30 minute talk” ended up being somewhere between an hour and an hour and a half. And worse still, on the very last boot of my talk, I discovered that if I increased the number of virtual CPUs from 1 to 4, the boot went much, much faster. I’d only ever used 1 virtual CPU on my own laptop and hadn’t noticed any problem – I don’t know if that’s a difference between my i7 processor and the loaner laptop’s i5, or because mine is hosted on OS X and his is hosted on Linux. I wish I’d discovered this earlier in the talk, though.
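
For anyone else bitten by this, the change is a one-liner from the command line as long as the VM is powered off; “lvm-demo” is just a stand-in for whatever your VM is called, and the same setting lives under Settings → System → Processor in the GUI.

    # give the (powered-off) VM four virtual CPUs instead of one
    VBoxManage modifyvm "lvm-demo" --cpus 4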

If you care, slides are available at https://www.dropbox.com/sh/y4822v4k6am0s9s/IFhrMz-HEW/lvm.pptx but probably not for too long.

Well, that wasn’t as easy as I’d hoped…

In my job, I often have to make accommodations for the security desires of my clients. That can be a massive pain in the ass, but it’s better than working in an office.

So when I started this new job, I worked on my Linux box and my Mac laptop, with a massive preference for the Linux box because it’s got two nice big monitors and a really nice clicky keyboard, and I have all the ergonomics dialed in. I had Postgres running on both systems already for other purposes, and it wasn’t hard to install the software we were using as the base system on both. I kept the software in sync between both of them and the client’s dev server using git. Everything was beautiful. For accessing things like time sheets and corporate email, as well as connecting to their dev server, I had to use Citrix, which was a minor pain, but fortunately I didn’t have to do it very often.

But then the client said “oh, that test database we gave you has real employee IDs and the like, so we need you to take some security precautions with it. Specifically, you need to turn on full disk encryption on your laptop, and purge the copy of the database on your desktop.” It took a bit of work, but I managed to get it so that my software still runs on the Linux box and connects to PostgreSQL on the laptop over an SSH tunnel, so I’m in compliance with their wishes – I do have to remember to shut down the test server on my Linux box and the SSH tunnel before removing my laptop from the LAN, but that’s ok. That’s what you’ve got to do in this brave new world of computer security.
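
The tunnel itself is nothing fancy: roughly the commands below, run from the Linux box. The laptop’s hostname, the user name, and the database name are placeholders, it assumes PostgreSQL on the laptop is listening on its default port, and forwarding to a non-standard local port keeps it from colliding with the Postgres that’s already running on the Linux box for other things.

    # forward local port 15432 on the Linux box to PostgreSQL (5432) on the laptop
    ssh -f -N -L 15432:localhost:5432 stoney@macbook.local

    # then connect to it as if the database were local
    psql -h localhost -p 15432 -U stoney testdb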

But now we’re entering a new phase of the project, where my code has to talk to a web service that a different group at the client site provides. And that web service is only available inside their firewall. That gives me a few choices for development:

  1. Do my local development without benefit of the web service calls, “comment them out” or the equivalent, and only test them when I “git pull” the code down to their dev server. Not a great option, because the code I’m testing locally is even further away from their code.
  2. Write a dummy web service on the Linux box or my laptop or both, and use that for testing. Probably feasible, but more trouble than I’d like to go through.
  3. Get a VM on their site where I can do development and testing both.

The last option is probably the easiest. It also means I can get rid of my copy of their database, and therefore get rid of full disk encryption on my laptop (which means no more typing my password every time the display blanks). The downside is that the VM will probably be Windows, which is nowhere near as nice to do development on as Linux or the Mac, especially if you don’t have admin privs and so can’t install the stuff you like. (I’m guessing I can’t install Sublime Text; I’m not even sure I can install gvim.) The real clincher is whether I’ll be able to install a version of the base software or not, because if I can’t do that, I can’t work. If I can install it, then I probably can work that way – it’s as simple as that.

But if I’m going to do that, I’m going to want to log in from Linux, because of the ergonomics I mentioned earlier. I’ve been using my MacBook Pro (or even this shitty Dell laptop I have for testing purposes) to log into Citrix because I didn’t want to install the Citrix client program on Linux. But needs must, etc. I looked on the Citrix web site and they have a .deb “for 64 bit Linux”. I downloaded it and clicked on it, and it said it needed to install 246 other packages to satisfy dependencies, including 32 bit versions of nearly every major library out there. Sorry, Citrix, that’s not my definition of a version “for 64 bit Linux”. Ok, I thought, I know a way around this! I’ll install a 32 bit version of Linux in a VirtualBox VM, install the Citrix client in that, and use that to log into the work site.

Well, that turned out to be an adventure in itself. Mostly because I’m using Kubuntu (which is Ubuntu with KDE instead of the god-awful Unity Desktop), and KDE is a little too resource hungry to run in a VM. So I was installing vanilla Ubuntu, Unity Desktop and all. But there was something weird about Ubuntu – I would install it and it was fine, but then it would download the required security updates, and suddenly the “VirtualBox Guest Additions” stopped working and refused to re-install. What that means is that I couldn’t share any directories between the host OS and the guest, and more importantly, I couldn’t get the guest to expand to use the entirety of my beautiful 2560×1440 IPS monitor. And that’s a deal-breaker. I tried installing from scratch, and I tried using a pre-built Ubuntu image, and both times it failed after installing updates.

But I tried a Debian pre-built image, and that worked fine, even after installing updates. The only drawback of Debian is that they don’t have proper Firefox; they have their weird-ass Iceweasel browser, which lags way behind the current version of Firefox. So I had to install real Firefox from a tar file, which is like a throwback to the bad old days of Slackware. But that worked fine, the Citrix 32 bit client installed without any drama, I was able to log into Outlook and PuTTY on the client side, and so I’m ready for when they get the VM set up for me.
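
The tar-file Firefox install is about as unglamorous as it sounds; something like the following, with the file name depending on whichever version you grabbed from mozilla.org.

    # unpack the download under /opt and put the binary on the PATH
    sudo tar xjf firefox-*.tar.bz2 -C /opt
    sudo ln -s /opt/firefox/firefox /usr/local/bin/firefox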

Get it together, guys

If there is one thing that iOS and Android developers seriously need to come together on, it’s a common standard for showing “my app is currently waiting for something to arrive from the internet”. I mean, half the time on Android all you can see is a tiny, barely visible exclamation mark or something on the wifi signal strength meter. The spinner in the title bar that seems to be the “normal” iOS one is at least slightly more visible, although I think we need something more prominent when your app is actually blocking (as opposed to just filling in stuff you can’t see yet). Some apps have taken it upon themselves to replace the “default” spinner (or lame exclamation point) with a much more visible one in the main screen – in the Facebook app on iOS it’s both, and they aren’t 100% in sync – but there are a lot of different spinners and throbbers in different apps, and it’s inconsistent and confusing. Then you get the god-awful flashing color bars in the G+ app on iOS. Please stop trying to be clever. Maybe if Android’s wait notification weren’t so lame, people would actually use it, and then at least we’d have some consistency. (It doesn’t help my case that Chrome on my iPad currently has the spinner in the title bar spinning even though nothing is loading.)