London

A few more photos again today from the archives. I'm still going through and making sure I've got all of my favourite shots published here on the blog.

This time it's a few shots I took in London back in September.

Monterey Blocking Ports 5000 and 7000

If you're a developer and use macOS Monterey, then you may have run into issues when trying to use ports 5000 and 7000 on your local machine. And seeing as these are pretty common development ports, I can imagine that this will affect quite a few people.

It turns out, what's using these ports is the new AirPlay Receiver functionality added in Monterey.

You can find this in the Sharing pane of System Preferences. And if you don't care about having it enabled, then you can just uncheck it, and the ports will be free.

However, if you do want to make use of AirPlay Receiver, then all you need to do is first disable it, run your local server, and then enable AirPlay Receiver again. It will then use a different port.
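If you want to confirm for yourself what's bound to a port before changing any settings, `lsof` ships with macOS and can tell you (using port 5000 from above as the example):

```shell
# List any process listening on TCP port 5000.
# On Monterey with AirPlay Receiver enabled, this shows ControlCenter.
lsof -nP -iTCP:5000 -sTCP:LISTEN
```

If nothing is printed, the port is free.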

Durdle Door

I've been to Durdle Door before, back in 2019. And only a few months ago, I visited the area again, and of course ended up taking a few photos, which I think turned out a lot better than last time.

How To Edit Your Ghost Theme Using Github

Who knew that you could integrate a GitHub repository with your Ghost blog and have your theme automatically deploy? I for sure didn't, but luckily Greg Morris did, and he's written a great guide on how to set it all up:

Since first trying Ghost, one of the best things about editing my theme is the ability to host on Github. Through a simple integration I can easily edit my theme to make changes from almost anywhere. If you want to do this too, this guide should help you out.

I've just run through this guide myself, and the theme for this blog now automatically updates whenever I push changes to my repository. This is going to be so much better than using scp or sftp to manually upload changes.

Apple Announces Self Service Repair

Apple Newsroom:

Apple today announced Self Service Repair, which will allow customers who are comfortable with completing their own repairs access to Apple genuine parts and tools. Available first for the iPhone 12 and iPhone 13 lineups, and soon to be followed by Mac computers featuring M1 chips, Self Service Repair will be available early next year in the US and expand to additional countries throughout 2022. Customers join more than 5,000 Apple Authorised Service Providers (AASPs) and 2,800 Independent Repair Providers who have access to these parts, tools, and manuals. The initial phase of the program will focus on the most commonly serviced modules, such as the iPhone display, battery, and camera. The ability for additional repairs will be available later next year.

My immediate reaction on Twitter was that this is a good idea, and one that benefits both Apple and consumers. It will surely be good for Apple's reputation, and they'll now gain more control of the iPhone parts market. And for consumers, it means access to official parts that they can trust, and the ability to perform repairs themselves.

I'm not too sure Apple are doing this purely for the benefit of consumers though. I'm starting to wonder if they're introducing this program so that they have a counterargument to the right-to-repair movement.

Matt Birchler also shared his opinions on the new program:

I'm super curious to see how this is received by people on both sides of the right to repair argument. Will people who support right to repair see this as a win or an empty gesture distracting from their real concerns? Will people who have argued against right to repair because it would mean bulky products be annoyed because this shows that's not really the case?

Even though I'm sure that Apple will be very restrictive about what parts they sell, and what they "allow" you to repair, I would find it incredibly amusing if Apple find a way to support reasonably priced repairs for batteries, screens, cases, etc. Because right now, the only manufacturers I see that are even thinking about this kind of stuff are making big phones that look ugly. And the excuse that "it's repairable" won't hold up as much.

The Feeling of Material You

David Imel has made a fantastic video on Google's Material You design language and how it aims to focus on feel rather than just function, and how its imperfection, customisability, and personal touches are what make it good.

Ever since Material You was announced, I've been a fan. I'm not sure if I could pick a winner between Material You and the design language Apple have used in iOS and iPadOS because I think they are born from different perspectives, and both have their own pros and cons.

The best way I can articulate the design differences is that Apple's design feels clean and precise, while Material You feels much more personal. Both are valid choices, but they're still different, and I think that's great, because more choice can only be a good thing.

A Few Things I Use the Command Line For

I’ve noticed myself using the command line a lot more recently, both at home and at work, so I thought I’d share a few of the little tools and handy commands that I use on a day-to-day basis.

Note: I use ZSH as my shell, with oh my zsh, so these may differ slightly if you’re using something different.

Aliases

The most helpful commands that I use have to be my aliases for common commands. They’re minor shortcuts, but because they’re used so often, they save a lot of time.

The majority of them are for two things: moving to common directories, and launching applications.

Here are a few examples of directories I have set up with aliases:

  • h: home directory
  • dev: developer directory where I store all projects
  • tc: Text Case directory
  • blb: Bulba directory

That’s just a small selection, but I have most projects set up with a very short alias. And even if I don’t have one set up for a particular project, I’ve got one that puts me at the root of my developer folder anyway.

As for applications, I’ve got a few that I use a lot:

  • xc: Opens the .xcodeproj file in the current directory
  • xcw: Opens the .xcworkspace file in the current directory
  • vs: Opens the current directory in Visual Studio Code
  • fork: Opens the current directory in Fork - A Git GUI, for when I want to dig into any conflicts.
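For reference, this is roughly what those definitions look like in a ~/.zshrc file (the paths and app commands here are illustrative, not my exact setup):

```shell
# Directory shortcuts
alias h='cd ~'
alias dev='cd ~/Developer'
alias tc='cd ~/Developer/textcase'
alias blb='cd ~/Developer/bulba'

# Application shortcuts
alias xc='open *.xcodeproj'
alias xcw='open *.xcworkspace'
alias vs='code .'
alias fork='open -a Fork .'
```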

Git

Being a developer, I use Git quite a lot. And that is where oh my zsh comes in handy, as it comes with a huge number of aliases for common Git commands. Here’s a great cheatsheet.

Here are the ones I use the most, and also what the full command is:

  • gco: git checkout - checkout branch
  • gaa: git add --all - adds all changes in the current directory
  • gc -m "": git commit -v -m "" - commits the current changes with a message
  • gp: git push - pushes commits to the configured remote repo
  • gf: git fetch - fetches branches and tags from the configured remote repo

I’m aware that those are pretty minor commands, but they’re so much easier when they’re just one, two, or three letters.

Also, oh my zsh does come with an alias for committing changes with a message, but it’s gcmsg and that’s longer than just using gc with the -m option.

The most used Git command I use though has to be a little ZSH function I made myself:

function gacp {
    # Stage everything, commit with the supplied message, then push.
    # && means a failed commit won't trigger a push.
    gaa && gc -m "$1" && gp
}

It stands for “git add, commit, push”, and as you can probably tell, it stages the current changes, commits them with the supplied message, and pushes them to the remote repository.

Most of the time I’m doing stuff like this:

gacp "JIRA-123 fix tests"

Woops!

FTP

I don’t use these a lot, but I do have a few aliases to update various websites. They basically use the scp command (secure copy) to transfer files from a local directory to a remote server. Guide.

This isn’t exactly what I have, but they all follow this rough syntax:

scp -r /Users/chris/website user@123.123.123.213:/var/www/

This will recursively upload the contents of a local directory to a remote server. I use this whenever I update changes to my blog theme.
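Wrapped up as an alias, one of those deploy commands might look something like this (the local path, server, and remote path are all placeholders):

```shell
# Upload the local theme directory to the web root on the server
alias deploytheme='scp -r ~/Developer/blog-theme user@example.com:/var/www/blog/'
```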

HTTP Requests

Whether I’m working on a mobile app or REST API at work, I’m usually testing various requests throughout the day. And while I sometimes use a tool like Postman, especially when I’m building a collection for QA testers, I do find it a bit cumbersome sometimes. So that means I end up resorting back to the terminal.

I’ve seen a tool called httpie which does seem to be quite good, but I’ve found curl to be good enough for my uses.

Tip: If you’re stuck with the syntax and don’t have time to wade through documentation, I’d recommend using a tool like Postman to build the request, and you can then export the curl request.

Most of the time I’m just performing GET requests, so the syntax is simply:

curl https://dev.chrishannah.me/feed.json

If you need just the headers of the response, there’s the -I option, and if you want both the headers and the body, it’s a lowercase -i.
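So the three variants look like this, using my feed URL again as the example:

```shell
curl https://dev.chrishannah.me/feed.json      # body only
curl -I https://dev.chrishannah.me/feed.json   # headers only (sends a HEAD request)
curl -i https://dev.chrishannah.me/feed.json   # headers followed by the body
```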

Environment Variables

Usually, I’m using the command line because I want to test quite quickly, and with slight tweaks, so I find making the command as short as possible helps with this.

The first trick for me is to use environment variables. So for example, I’ll use one for the base URL of the API, and usually a few for any values that need to be in the path, especially if these are user account numbers, as it makes it a lot easier to quickly test different scenarios.

This means that a request like this:

curl https://company.com/api/account/2a3e4832-14e6-430d-8c34-748f4626e864/transactions

Can be made a lot shorter by using two environment variables:

export base=https://company.com/api
export account=2a3e4832-14e6-430d-8c34-748f4626e864

Which means it can look like this:

curl $base/account/$account/transactions

The biggest benefit I find is that it makes the command much easier to edit.

Using Files

Another tip I have for curl is that if you have a bunch of headers that you need to use, then it helps to have these stored in a file.

You can do this by using the -H option followed by @ and then the filename. For example, this command will read the headers from a file named headers.txt:

curl https://website.com -H @headers.txt

The header file needs to be in this format:

Key: Value

So something like this would work:

Content-Type: application/json
Authorization: Bearer [token]

This is especially handy for me as most of our APIs at work require various authorisation tokens that can be quite large.

You can also use files for storing the body of the request (curl’s -d option accepts @filename too), but I’ve not had much experience of that, so I’ll have to defer to Google.

JSON

This goes hand-in-hand with making HTTP requests, in that the responses are usually JSON. For that I use the JSON processor, jq.

Most of the time, I’m just using it to “beautify” data from a curl request, so I pipe the output from curl to jq like so:

curl https://dev.chrishannah.me/feed.json | jq '.'

What that does is take the response from the curl request and output a pretty printed version of it.

But you can also use jq to parse the JSON response and pick out certain fields.

So for example, a request to my blog’s JSON feed at https://dev.chrishannah.me/feed.json will return a fair bit of JSON data. Something like this:

{
  "version": "https://jsonfeed.org/version/1",
  "title": "Chris Hannah's Dev Blog",
  "home_page_url": "https://dev.chrishannah.me/",
  "feed_url": "https://dev.chrishannah.me/feed.json",
  "description": "A devlog by Chris Hannah",
  "author": {
    "name": "Chris Hannah",
    "url": "https://chrishannah.me"
  },
  "items": [
    { ... }
  ]
}

But say I only wanted to read the author object, I’d just need to use this command:

curl https://dev.chrishannah.me/feed.json | jq '.author'

Which will return just this:

{
  "name": "Chris Hannah",
  "url": "https://chrishannah.me"
}
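And since items is an array, you can drill into that too. For example, assuming each item has a title field (as the JSON Feed spec defines), this pulls out just the post titles:

```shell
# -r strips the JSON quotes from the output strings
curl -s https://dev.chrishannah.me/feed.json | jq -r '.items[].title'
```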

There’s a lot more it can do as well, and I’d recommend checking out these examples.


I’m sure there are tons of other resources that go into far more detail on what you can do with the command line. But I thought I’d share a few things that I use it for, just in case it might prompt others to find some ways to save themselves time!

“Built using Bulba”

Two announcements. Firstly, I’m building a static site generator. And secondly, I’m using said generator to power a new blog of mine.

Bulba

That’s the name of my new side-project: a static site generator, built using Node.js, that can transform a few Markdown files into a static site. It’s still early days, but it’s already got paginated index pages, an archive, an about page, a JSON feed, and of course, a page for every blog post.

I’m in no way a Node.js or JavaScript programmer, so I can’t always be certain that my code is the best (nor can I say that it’s the worst). But I think it will be a fun project to work on, and maybe in the future other people can use it.

You can find Bulba on GitHub and NPM.

Development Blog

To both demonstrate the features of my new tool, and to share the progress of its development, I’m using Bulba to power a new development blog. It’s got a pretty nice URL as well: dev.chrishannah.me.

Right now, I’m using it for Bulba specific updates, but I can imagine that in the future I may use it to write about other projects I work on too.

Earlier today, I published a post, “Introduction to Bulba”, on my new blog, where I introduce Bulba in detail, showing how it works, what features are currently implemented, and also what I’m working on next.