Found a new annoyance in Excel. I am analysing some data in Excel that’s been extracted from CiviCRM. Most of the data was submitted via webform, and I easily managed to pull the URL out of the data to view the original webform submission. However, when I click the link in Excel I get an error:

Microsoft Excel Error
Unable to open URL. Cannot download the information you requested.

A quick Google brings me to this site. Of course, you need to be an authenticated user to view the webform, but Excel is unable to authenticate so it just fails. Well, in reality it just sees a 404 and tells you the link is broken. Once Excel makes this decision it won’t pass the URL to the browser to open. Great.

So, another quick Google about stopping Excel validating links brings me to Stack Exchange. Of course, I can’t comment on SE due to their chicken-and-egg “reputation” rules, so the only option I have for drawing these two threads together is this blog post. Hope it helps!
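For completeness, the workaround those two threads point at (as far as I understand it) is telling Office to stop pre-checking hyperlinks and just hand them straight to the default browser, via a ForceShellExecute registry value. Treat the following as a sketch rather than gospel: the key path follows the Microsoft knowledge-base article (which uses the "9.0" Office key regardless of your installed version), and you should back up your registry first.

:: Run from an elevated command prompt. On 64-bit Windows with 32-bit Office the key sits under Wow6432Node instead.
reg add "HKLM\SOFTWARE\Microsoft\Office\9.0\Common\Internet" /v ForceShellExecute /t REG_DWORD /d 1 /f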

It’s been a long time since I hosted and ran any websites. Back in the day I did the forum administration for Arch Linux and had great fun playing with phpBB patches and CSS. In one of my previous jobs I had to set up an online directory application site for the Tower Hamlets Family Information Service. That was quite fun and I learned a lot. Unfortunately, none of my work remains today. Now, rather than having a landing page for the directory, the application is mainly accessed via forms embedded in the main Tower Hamlets website. You can see the total lack of TLC on the results pages. This is how it looked when we launched in 2010:
[vimeo 13852075 w=640 h=480]
Of course, the overall look of the site has completely changed but even for 10 years ago I think it looks better than the current offer.

We’ve just launched a new site at Healthwatch Bucks and I’ve been making a few tweaks today because our developer is literally moving to France this weekend. He’s put in a major shift on it this week and we’re really grateful for his efforts!

My favourite tweaks of the day were manually editing an SVG file (with a text editor) to change its colour, and finding out how to resize it as a background image. It seems quite simple looking back now, but when you’re starting from near zero everything seems like a challenge!
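For anyone starting from the same near-zero point, the two tweaks amounted to roughly this. It’s a simplified sketch: the colour, file name and class name are made up, not the real ones from our site. The fill attribute in the SVG controls the colour, and background-size handles the resizing.

<!-- inside the .svg file: change the fill on the relevant path or shape -->
<path fill="#663399" d="..." />

/* in the stylesheet: use the SVG as a background image and scale it */
.banner {
    background-image: url("/sites/default/files/icon.svg");
    background-repeat: no-repeat;
    background-size: 48px auto;
}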

I had a devil of a time trying to do this yesterday. The first time through it all seemed fine until I added the certificate to the Windows Cert store. It went in under “Other People,” not “Personal,” and I couldn’t select it to sign my emails. Figuring I’d done something wrong I revoked the cert and applied again. And again. And again. Then I found a great tip that suggested I check to see if the Private Key for my cert was found. It wasn’t. So I started looking into where my Private Key even was…

Turns out the Comodo site plain doesn’t work with Chrome. Probably the permissions are too restrictive. I tried with IE and suddenly it all made sense and it worked just fine first go.

What I really wanted was a certificate that would let me sign a PDF; I’m not really that interested in signing my emails. Sadly the template used by Comodo doesn’t seem to be set up in the ideal way for document signing, but it still works!

I’m running Arch Linux ARM on my Pi and I just had WAY too much trouble getting distcc working.

First, my most idiotic mistake was not having distcc installed on the master (the Pi). I know how that happened: I was going to install it, decided to -Syu first, then forgot.

Secondly, I was trying to get it to work using the makepkg method, as opposed to the standalone method. What’s the difference? That brings me to…

Thirdly, the guides I was using aren’t great.

What you need to know

If you are trying to cross-compile using distcc on a Raspberry Pi you will almost certainly end up here: https://archlinuxarm.org/wiki/Distcc_Cross-Compiling

Note the warning at the top: “This guide will appear vague and incomplete if you aren’t sure what you’re doing. This is intentional.” That’s rubbish. It’s just very badly written. The only knowledge of “compilation and toolchain components” needed to understand this guide is that the toolchain needs to be installed/built on the client. That’s it.

Once you’ve read the dire warning you are directed here: https://archlinuxarm.org/wiki/Distributed_Compiling

Don’t go there yet. The rest of the first page is about setting up the client; finish that first.

The biggest problem with this guide is that it doesn’t adequately explain just how much of it you can skip if you use WarheadsSE’s distccd-alarm package. The fact that WarheadsSE doesn’t explain this in the README on his GitHub doesn’t help either. Basically, WarheadsSE’s package does everything.

As explained in the guide, WarheadsSE’s package builds a toolchain for each ARM architecture and puts them into separate packages. So, build his package on your client machine (x86_64 only) and install the one for the toolchain you need. For me and my Pi that’s ARMv6 hard-float (armv6h).

This package automatically creates the symlinks described in the “Make nice with distcc” section. Furthermore, it creates its own conf file at /etc/conf.d/distccd-armv6h (the name obviously depends on your ARM architecture). This contains the correct $PATH variable and is sourced automatically when distcc is run on the master. No need to edit /etc/conf.d/distccd at all. You will have to set the allowed hosts in /etc/conf.d/distccd-armv6h, though.
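As an illustration, the allowed-hosts bit is just distccd’s --allow flag. Something along these lines in /etc/conf.d/distccd-armv6h should do it (illustrative only: check the variable names the file actually ships with, and swap in the network your master sits on):

# allow the master (the Pi) to reach this distccd instance
DISTCC_ARGS="--allow 192.168.1.0/24"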

Next, read the recommended guide to setting up the master (https://archlinuxarm.org/wiki/Distributed_Compiling). At the bottom you’re directed to configure the client, but the first guide has already taken you through that step. All that remains is to start the systemd service on the client.
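On systemd that last step is just enabling and starting the unit the package installed. Assuming the armv6h flavour (the unit name should match your architecture):

# on the client (the x86_64 box)
systemctl enable --now distccd-armv6h.service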

Troubleshooting

I tried to troubleshoot using the makepkg build method. What’s that? Well, it’s the only method explained in the Arch Linux ARM guides. It basically means you use makepkg to run distcc. This is what you want to do in the long run but the error handling is rubbish. So, instead, go and have a look at this guide: https://wiki.archlinux.org/index.php/Distcc

This guide explains how to run distcc without makepkg, aka the “Standalone” method. Short version:

1) add your client IP address to /etc/distcc/hosts on the master
2) create a file on the master called hello_world.cpp and paste this into it:
// 'Hello World!' program

#include <iostream>

int main()
{
    std::cout << "Hello World!" << std::endl;
    return 0;
}

3) on the master, run: distcc g++ -c hello_world.cpp

Now you can see what distcc is actually trying to do. For me it showed that the connection to the client was being refused. Obviously (duh) I needed to open a port in my firewall on the client. With that done it just built.
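For reference, distccd listens on TCP port 3632 by default, so with plain iptables the hole I had to punch looked roughly like this (substitute your own network, and your own firewall tooling if you don’t use raw iptables):

# on the client: accept distcc connections from the master's network
iptables -A INPUT -p tcp --dport 3632 -s 192.168.1.0/24 -j ACCEPT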

Now you can go back to using makepkg -A.
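In case it saves someone a search, the makepkg method boils down to a few lines in /etc/makepkg.conf on the master; roughly this (the host, job count and exact set of BUILDENV options are examples, not gospel):

# enable distcc by removing the '!' in front of it in BUILDENV
BUILDENV=(distcc color !ccache check !sign)
# hosts that will do the compiling, with a per-host job limit after the slash
DISTCC_HOSTS="192.168.1.10/8"
MAKEFLAGS="-j9"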

I tried really hard to like Evernote. I really did. In that time when it ruled as one of the trendy start-up darlings I really wanted to make it work for me. I loved the design and aesthetic but using it just left me a bit cold. The Android app was OK, and that was probably where I got the best out of it, but as I tried to use it via the web app it just seemed less and less intuitive. Worst of all, though, it was pretty much file and forget: stuff was going in but, because of my lack of buy-in, it wasn’t coming back out.

When I started a new job back in September 2015 I had an opportunity to start fresh with some new ways of working and I was determined that one of them would be digital note taking. I was all set to give Evernote another go but decided I’d better check out the competition as I’d not really looked into it for a while. Like most good geeks/nerds I went straight to Lifehacker.

It’s here that you should know a bit about me in terms of the tech I like. Short version: aside from a dabble with a 4th-generation iPod Touch, I am an avowed Apple critic and Android proponent. I started using Linux in 2003 and am a big fan of the open-source “why pay for something you can get for free” mentality. Apple, to me, was the antithesis of this. Microsoft, on the other hand, was a professional inconvenience: I’ve never worked in an environment where I can choose which software I use, so I’ve made the best of what I had. This might be a reason why I couldn’t get on with Evernote: does it have an Apple-centric philosophy?

So, what did Lifehacker tell me? Well, I was surprised. Not only did the readers and writers of Lifehacker think OneNote was OK, they actually thought it was pretty great. Since I’ve never been one to invest in a platform that didn’t have decent backing, this was important to me. As I write this I’m looking back at Lifehacker and reading some direct comparisons of EN and ON, and much of it agrees with my experience.

Tagging

I never got on with tagging in Evernote, in fact, I don’t get tagging at all. Tagging is something that works well when you want to collate items by tag, like tasks in Remember the Milk. What it doesn’t do well is help you find items with particular tags. Also, most tagging systems don’t do much to help you avoid duplicating tags with slightly different names. Tagging made sense when searching was slow and inefficient but now, for example, OneNote will let you search text in images without any user intervention. Tags are old hat.

Clipping

When I got interested in Evernote it was mainly through the web clipping craze. This is YEARS back now but there was a real goldrush around web clipping until Evernote cornered the market. Most of the other web clipping tools are closed now but my favourite used to be Amplify. I had two amp logs, which I used as pseudo-blogs. Twitter eventually made a lot of what Amplify did rather pointless when it introduced Twitter Cards, effectively providing a clip of the site you were sharing automatically. Anyway, web clipping: it was a big thing and, apparently, it still is. I don’t do it anymore; it doesn’t make sense to me. Maybe the idea of saving whole articles offline was great when online storage didn’t cost $0.02 per GB, but now the only thing that makes content disappear from the web is bankruptcy.

Price

In the early 20-teens, Evernote had a bit of a shake-up. Obviously it needed to monetize, and I can’t begrudge them that, but for me the costs associated with Evernote (before another rethink) became laughable. At around about the same time OneNote became free…

I also blame Evernote for encouraging other providers to offer ridiculously over-priced premium versions (looking at you, Pocket and Feedly) that offer limited additional functionality. Just charge everyone $1 a month for anything other than the most basic product.

Microsoft Office Integration

Work was my main motivation for digital note taking and this is where investing in OneNote has paid dividends for me. We just upgraded to Office 365 and with that came Office 2016 on the desktop, a shared OneDrive and SharePoint sites. All of this makes having a team notebook in OneNote a doddle. You can link meetings to notes in OneNote with a click and the whole team can read and contribute. It’s also helping me get out of my inbox. I used to keep emails indefinitely just in case I needed to refer back to them. Now I can push them to OneNote, cut out all the crap, keep the key details and file them in a relevant notebook or section. Chances are I’ll actually stumble upon it when I need it now, rather than forgetting it was ever there. You can also embed documents into notes, which again everyone can open. While this is a terrible idea for work products, it’s great for the things you want to keep or share “for information.” I mean, where do you file that stuff in a shared drive anyway?

Evernote does none of this that I know of.

Personal Life

With OneNote firmly established in my work life it’s been even easier to adapt to it at home. OneDrive integration means I can access various notebooks kept in various places from any location. I’m drafting this blog post in OneNote. It just makes sense. For me it excels at capturing research from the web: if you copy and paste from a page it captures the URL automatically, and inserting links and images is a piece of cake. I just found out today that you can make sub-pages too, which makes organising your notes so much more intuitive than I ever found Evernote.

Everything Else

I haven’t even touched on some of the more powerful aspects of OneNote because, frankly, I’ve never even used them. There’s a thing called templates. No idea what that does but I can imagine that capturing things like recipes would go very well in a template!

To Sum Up

If you don’t use Macs almost exclusively, you’ll probably have a Microsoft account already and access to the OneNote app. It’s all you need to get started. Microsoft even have a tool to help you migrate from Evernote now.

Since IFTTT improved its OneNote integration I’ve abandoned Evernote completely. And it looks like others will follow soon.


pacman, the package manager that was created as a fundamental feature of Arch Linux, has a long and successful development history. The pace of development might not be speedy but it’s always been very sensible.

Recently, install hooks were introduced to pacman. In short this means, for example, that rather than each font package updating the font cache itself, the font cache is updated once when all the font installations are complete. It’s not groundbreaking but it’s a neat, tidy and important solution.
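If you haven’t seen one, a hook is just a small ini-style file dropped into /usr/share/libalpm/hooks/ (or /etc/pacman.d/hooks/ for your own). A font-cache hook looks roughly like this; I’m paraphrasing from memory rather than quoting the real fontconfig hook:

[Trigger]
Type = File
Operation = Install
Operation = Upgrade
Operation = Remove
Target = usr/share/fonts/*

[Action]
Description = Updating fontconfig cache...
When = PostTransaction
Exec = /usr/bin/fc-cache -s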

However, I was a bit concerned when the post-install script that builds the kernel initramfs wasn’t one of the first scriptlets moved to a hook. I was further dismayed when a discussion on arch-dev-public suggested that some developers might not see the point of doing so.

Part of the fun I have with Linux is fixing things. It probably doesn’t sound like fun but it can be a challenge that rewards with knowledge. However, when the problem is that you can’t boot, well, that’s no fun. It means rescue disks and chroot, and a second box to look up solutions on.

The most frequent cause of a non-booting Arch Linux system (after an update) is not having read the recent News Announcements or followed post-install messages. One time, though, I just got really unlucky: a module that needed to be updated in the initramfs was installed after the kernel had been updated and the initramfs had already been rebuilt. That really sucked. Since then I have rebuilt the initramfs manually after every system update has completed.
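For anyone following along, the manual rebuild on a stock Arch kernel is just one command (substitute the preset name if you run a different kernel package):

# regenerate the initramfs images for the default 'linux' preset
mkinitcpio -p linux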

So I was actually very relieved when other developers did see the point of moving the scriptlet to a hook. This doesn’t fully solve the problem, because that hook still needs to run after other hooks to be as bullet-proof as possible and the current process didn’t seem to support that. Fortunately, this was identified too.

Awesome.

This is a great example of open source development going right. Too often it disintegrates into disagreement and death by committee.

Last night I had a bit of an epiphany and uninstalled Twitter from my tablet. I haven’t deleted my accounts or anything, I’ve just removed it as a possible “go to”. However, this wasn’t because, like many people, I’m starting to resent the distraction it causes, but rather because I’m starting to think I just don’t like the offer.

My grumbles centre on “algorithmic curation” i.e. the service decides what I’m interested in and promotes items in my news feed. The problem is that these algorithms don’t work on me. I don’t follow more than about 50 accounts on ANY service I use. There just isn’t enough data for these algorithms to make meaningful decisions on and so, to me, it’s totally arbitrary. That ruins the experience for me.

I mainly use social media to keep in touch with people that share my interests but that I don’t know personally. In the past I would have used forums for that sort of thing. Forums have become extremely tedious in recent years. There is so much risk of offence or upset in the face of poor wording or tone. Social media doesn’t seem to be blighted by that problem. It’s accepted that you’re stating an opinion and that opinion is your own. Having an opinion on a forum is a recipe for disaster.

Timeline curation was one of two things that ultimately pushed me away from Facebook. I just didn’t have confidence I could actually see everything my friends were saying, or that anyone was even seeing what I was posting. What an awful system. This morning I’m reading that Instagram is planning a similar approach. Apparently “the average user misses 70 percent of posts”, so they think curation is important. I must be way below average then, because I can go without logging in to Instagram for a few days and still catch up on every post in a few minutes. In that sense I don’t think curation on Instagram will be that much of an issue for me. There really is nothing to curate.

I’ve actually just logged in to Twitter to get an idea of just who I am following and, lo, the new timeline was just switched on for me. “Tweets you are likely to care about most will show up first in your timeline,” goes the claim. I think not.


I work for a small not-for-profit and we use Drupal to power our website. There are only 6 people on staff and we all have access to do pretty much anything on the site.

Yesterday, our Chief Exec came across something strange. A link had been sent out in our newsletter that pointed to oursite.com/404-page-not-found. How could this have happened? Not being very familiar with Drupal, I suggested it was some weird redirect as a result of a 404 to the most recent post. But it was the right post. Coincidence? The universe is rarely so lazy.

So I looked into it. It turns out, and I’m not entirely sure how this happened, that someone had actually edited the page that was set as our custom 404 page. So the link in the newsletter was technically correct. The problem was that all 404s were being directed to that article.

Because we didn’t want to break the link in the newsletter, I set up a new custom 404 page. In the process of testing it, and putting in a few aliases and redirects, I discovered a few broken links. Amusingly, these broken links had never come to light because they were supposed to direct to the article that was ON the 404 page!

With the new 404 page in place they really were broken. I soon had that fixed, but it led me to check a few more pages and it turns out that whoever made the original error made a bit of a dog’s dinner of it. There were a few duplicate nodes and aliases pointing all over the place.

It didn’t take more than about half an hour to sort the whole thing out, but it did provide a nice little puzzle!
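In case anyone hits the same thing: the custom 404 target is just a site setting (Configuration » System » Site information in Drupal 7), and with drush you can point it at a node directly. The node ID below is made up, and this assumes Drupal 7’s site_404 variable; other versions keep it elsewhere.

# set the default 404 page to an existing node (Drupal 7, drush vset)
drush vset site_404 "node/123"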


This took me ages to find. I started off messing with Xmodmap until I realised GNOME would ignore it. I finally found the solution here.

Navigate to org >> gnome >> desktop >> input-sources (e.g. in dconf-editor)

Put your options under xkb-options as a list. For example: ['altwin:ctrl_alt_win', 'caps:backspace']
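If you’d rather do it from a terminal than click through the tree, gsettings can set the same key (using the example options above):

gsettings set org.gnome.desktop.input-sources xkb-options "['altwin:ctrl_alt_win', 'caps:backspace']"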

I hope this helps others find it faster!

Then

I’ve been using Arch Linux for years. It’s been my main Linux distribution since I decided I needed to switch from an i586 optimised system to an i686 optimised system. Yes, it was that long ago. To be honest, I’m astonished my original distribution is still going but I’m really pleased for them!

When I first got interested in Linux it was because of a distribution called Mandrake. Back then KDE and the Keramik theme were all the rage. It was all so refreshing compared to Windows.

Anyway, I could never really get on with those big desktop environments. With KDE I always wanted some GTK apps (Firefox) that ruined the look, and Gnome, well, back in the day the word was “cruft”: it was full of dependencies and apps you just didn’t want. Plus it was butt ugly.

The first window manager I really got into was Fluxbox. I even made some popular themes for it. I’d always liked Xfce but, like Gnome and KDE, it still wasn’t very cohesive. At some point Xfce must have grown up, because it became my standard desktop environment from at least 2007. That was probably down to Xfce 4.4.0!

With hindsight, I was lucky to find Arch Linux early on in my Linux experience as I never even considered trying Ubuntu when it was created. Back in 2003 Gentoo was the cause of most flamewars on the Arch forums.

So, my first experience with Ubuntu was around the late noughties, when a laptop with an OEM install of Windows XP went kaput and we needed an OS. I was pretty pleased with it. It always ran well on the laptop and I didn’t need to mess about with it too much. It almost never went wrong. I went off it a lot when Unity came on the scene, but the laptop died permanently shortly after, so I was never really forced to look for an alternative. I always kept half an eye on Ubuntu, though.

Now

A few weeks ago I was at a conference in London for CiviCRM (an open-source CRM used by a lot of charities and not-for-profits). Almost all of the presenters were using either MacBooks or Ubuntu laptops. A few guys from CiviCOOP were also running what I surmised was Gnome, so I asked them about their set-up. They simply said it was Ubuntu Gnome and I decided I’d check it out.

As you might know both Gnome and Ubuntu have recently made major releases. I guess Gnome didn’t freeze early enough to make it into Ubuntu’s release window so Ubuntu Gnome still ships with the previous version.

So, yesterday, I decided to try installing it. I started out by making sure that the installation wasn’t going to mess with my Arch install. That meant revisiting my bootloader configuration and moving /boot back into the Arch root. Then I had a huge balls-up where I deleted everything in my ESP with some sloppy typing. Yay, me! So out came the Windows disks to restore the Windows EFI bootmgr and, bleurgh. Mission.

Once that was finally sorted (and, as is so often the case with Linux, improved) I got around to installing Ubuntu Gnome in some free space on my Windows drive. I was pretty disappointed. I quickly realised that it was going to be a struggle to get the system set up how I liked; I have so many long-forgotten customisations in Arch that I take for granted. Also, the Ubuntu software centre was just weird: there were two versions of Vim and I couldn’t tell what the difference was. Gnome itself was also really unstable; it hung on me a few times and some widgets vanished, making the UI tough to interpret. However, what really put the tin hat on it was trying to change the Gnome icon theme. I couldn’t find any to install in the software centre and the advice I found while Googling was to extract packages straight into /usr! Urgh! That’s like recommending incest!

I decided that Ubuntu wasn’t for me after all and gave Gnome a go on my Arch install instead, especially as Arch is noted by the Gnome Project as already shipping the latest version. It didn’t take long to get it sorted. I’d already decided, while out on a run last night, that I was going to convert wholesale to the “Gnome way” and swap to gdm and NetworkManager.

I can honestly say it’s the most complete Linux desktop experience I have ever had and I think I’ve only scratched the surface. It’s all worked out pretty well in the end!