Information overload: how are we supposed to keep up?

I’ve always been hungry for news. I like to know what’s happening – and I like to know when it happens.

The thing is, I feel I’ve reached saturation point.

I jokingly summed this up in a recent tweet:

Throw in Pokémon Go (I’m not ashamed to say I’m still enjoying it) and I’m torn between the buzz of feeling in the loop, and information overload.

Continue reading “Information overload: how are we supposed to keep up?”

SEO and technical issues we encountered in Drupal 7

Since I joined City A.M., we’ve been gradually migrating away from Drupal 7 to a fully bespoke CMS built on Laravel. We had planned to do this anyway, but we accelerated parts of the migration as we hit issues that we were unable to solve. Here’s a summary of the main ones.

Continue reading “SEO and technical issues we encountered in Drupal 7”

How not to do sticky ads

Making ads sticky can help improve viewability percentages – and viewability matters to many advertisers.

Ads can be made to stick for a short period of time and then disappear, or they can be set up to be “perma-sticky” – so they don’t go away.

When ads are permanently stuck, publishers need to be careful that there’s sufficient space to display everything on the page.

Here’s what I saw on the Telegraph today, while viewing a story:

Continue reading “How not to do sticky ads”

Why large websites should avoid full pagination

One of my long-standing navigation requirements for any website is pagination. It should be possible to see how many pages of content a site has, and it should be possible to jump to any page.

However, I’ve found this doesn’t really work on large news sites. When was the last time you saw a news site with pagination going back thousands of pages?

How news sites work

When readers go to a news site, they will usually be looking for the latest news. This can be found on the homepage, or you can drill down by going to a section page (Sport, Tech, Entertainment etc).

For a small site, or a section with only a few pages of content, it can be useful to click through each page to browse the history of the site. But once a site has published vast quantities of content, pagination becomes a lot less useful.

Who reads old news?

How useful is it to browse through thousands of pages of old news? If you wanted to find an old story, there are a few ways to find it much more quickly:

  1. Search – no need to think about where the content is – just find it with a quick search;
  2. Archives – know when the story was published? Date-based archives can be a lot faster than scrolling through page after page;
  3. Related links – not really the best way to find a specific story, but if you’re on a story and you want to read other stories on similar topics, related links can be quite handy.

Crawlers and performance

There’s also the issue of crawlers. If search bots can crawl thousands of pages in your News category, they will try to. Wouldn’t it be better to limit your pagination to a small number of pages (10-20 at most) – or lose it altogether – so crawlers can index your stories first, and stop crawling your section pages?
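Capping pagination like this can be enforced at the routing layer. Here’s a minimal sketch in Python for illustration (the helper name and cap value are hypothetical – City A.M.’s CMS is built on Laravel, so the real implementation would live in PHP):

```python
from typing import Optional

# Hypothetical cap on section-archive pagination, per the "10-20 at most"
# suggestion above. Pages beyond the cap return a 404 instead of giving
# crawlers thousands of archive pages to walk.
MAX_SECTION_PAGES = 20

def resolve_section_page(requested_page: int, total_pages: int) -> Optional[int]:
    """Return the page number to render, or None to serve a 404."""
    last_allowed = min(total_pages, MAX_SECTION_PAGES)
    if requested_page < 1 or requested_page > last_allowed:
        return None  # deep archive pages simply don't exist
    return requested_page
```

For example, page 3 of a 5,000-page section resolves normally, while page 21 is a 404 – so a crawler quickly runs out of section pages and spends its budget on stories instead.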

We also noticed crawl issues caused by crawlers hitting multiple page numbers in the section archives simultaneously. It didn’t make sense to keep thousands of old section pages active.

As long as your stories are in sitemaps, you don’t need to maintain an endless list of paginated section pages.
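To show why the sitemap takes over the discovery job, here’s a minimal sketch of generating one (Python for illustration; the function name and example URL are hypothetical – a real feed would come from the CMS database):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a sitemap XML string from (loc, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc        # the story's canonical URL
        ET.SubElement(url, "lastmod").text = lastmod  # last modification date
    return ET.tostring(urlset, encoding="unicode")

# Illustrative entry only - not a real City A.M. URL
sitemap_xml = build_sitemap([("https://example.com/news/some-story", "2016-08-01")])
```

With every story listed here (and the sitemap referenced in robots.txt), crawlers can find each URL directly, and the paginated section archives stop being the discovery path.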

Rebuilding the post editor at City AM

When I joined City AM, one of the first things I experienced was the post editor. It did the job, but it was not particularly quick or intuitive to use. We had to add new fields in a predefined way – and doing so usually added extra work for the content team. Plus, there were a number of tasks that had to be done outside the system – such as finding and resizing images from Getty.

I knew we needed to take a radical approach and consider scrapping the existing post editor.

Continue reading “Rebuilding the post editor at City AM”

The Washington Post: publishing observations

There’s a good story on the Washington Post redesign (“New WaPo ‘flexible’ homepage completes site redesign”).

Here’s a quote that stands out:

If you’re going to be quicker, agile and innovative, you can’t be in a place where it takes you six months to build something.

I’d agree with that, and I’d add that any lengthy period of uninterrupted development is risky. Even one month is a long time, particularly for a small team where time is so precious. Big projects need lots of work – but it’s not just about completing the task, it’s about doing it right. Even with perfect project management (is anyone perfect?), it’s surely better to spend a few days to a couple of weeks building core functionality, then give the primary user(s) an early demo to see if you’re on the right track.

In terms of the platform that the Washington Post uses:

The publishing process on the outlet’s website is now entirely powered by Arc, a collaborative tool developed by engineers and journalists

The Post uses WordPress for 70 per cent of the content it produces – Arc integrates through APIs with WordPress and other platforms, such as the Washington Post recipe or quiz database.

Emphasis mine. If WordPress still powers a large part of the process, then it’s not entirely powered by Arc. It sounds like WordPress is largely used for the back-end CMS with some custom tools added onto it, while the front-end of the site is entirely bespoke.

If that’s the case, then that makes a lot of sense – mainly because I think it’s far better to build a theme in Twig than in pure PHP.

Based on this assumption, I’ve updated my CMS usage chart to include the Washington Post.