Inside OzRunways

Using an interesting tool, you can explode out views into layers to see how they all composite together to achieve the final image. I think it looks cool:

Traffic

 

Drawing individual images from scratch every frame is expensive, so what we’ve done here is load three bitmap images once and rotate (animate) the outer pointer to change the track.
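
Not the actual OzRunways code, but a minimal sketch of the idea in Swift (pointerView and the track value are illustrative names):

import UIKit

// Sketch only: rotate a pre-loaded pointer bitmap to match the current track.
// Rotating an existing image is cheap; redrawing it every frame is not.
func updatePointer(_ pointerView: UIImageView, toTrackDegrees track: Double) {
    let radians = CGFloat(track * .pi / 180.0)
    UIView.animate(withDuration: 0.3) {
        pointerView.transform = CGAffineTransform(rotationAngle: radians)
    }
}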

Below is the vertical toolbar that slides out. The little fuel ones are hidden underneath and animated out. The toolbar’s opacity is set to about 95%, so just a tiny amount of map shows through underneath (to look natural). The GPU has to calculate all of the blending this requires as you pan the map at 60 fps, which leaves less than 16 milliseconds per frame to do all the calculations and maintain smooth scrolling.

Toolbar

Here’s the whole main map screen minus the backing map. You can see the GPU has to composite, crop and blend many layers of images with different z (stacking) orders to achieve the final image. You can override the z order of an object in code to force it to appear at the top of the screen.
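
As a rough illustration (view names made up, not the app’s actual code), overriding the z order looks something like this:

import UIKit

// Sketch: force a view to composite above its siblings regardless of the order
// it was added in. The view names here are illustrative only.
func bringToTop(_ banner: UIView, in container: UIView) {
    banner.layer.zPosition = 1_000          // a larger zPosition draws on top
    container.bringSubviewToFront(banner)   // or simply re-order the hierarchy
}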

Explode

 

Here is the final image with all cropping and blending done. Cool!

Full Screen

Creating RWY Go


Today I want to discuss some things that happen behind the scenes when creating a new app, and in this case, RWY Go.

When Apple announced the Watch, we did some brainstorming about what we would love to see on it, and came up with some screenshots:

 

Early Watch concept screenshots

Yes, very lame I know (particularly the old-school watch!). We had no idea what resolution or features the SDK would support, so this was all just guesswork. It turned out the SDK was fairly restrictive, so it actually led the design in a different direction, which I think turned out to be a good thing.

Goal and design philosophy

We wanted something on iPhone and Watch that gives useful, time-critical information normally scattered across the ERSA and approach plates, with zero or minimal user interaction in a divert or emergency. Here’s what we came up with:

  1. Where to go – The most intuitive way to represent distance & bearing at a glance.
  2. Arrival Plan – Runway orientation, elevation, windsock and runway numbers. The entire diagram should rotate for orientation. Layered based on priority.
  3. Frequencies – Important ones first: CTAF/TWR, then ATIS and navaids, ordered by usefulness.
  4. Met & Notams – Low priority, so accessible via a swipe.
  5. Searchable – One tap shows a list of airfields ordered by distance. A runway diagram makes selection easier.
  6. Contrast – The colour scheme needed to work in both broad daylight and in a darkened night cockpit.

Class Design

As soon as Apple released the Watch and its SDK to developers, we had a look through to see what it could do. We very quickly discovered that the entire binary runs on the iPhone, communicating wirelessly with the Watch to update its interface. The interface needed a static design layout (created at compile time), and the iPhone sends some basic low-bandwidth numbers to update it. It could do some basic images and animations. We also needed to create a ‘container’ iPhone app, which we hadn’t planned on doing. As it turns out, I think the iPhone app for RWY Go ended up being the best bit.

So the basic concept is that the iPhone app launches and gets a location, then chooses the best airport nearby. It procedurally draws some images of the airport and saves them into a shared container that both the app and the Watch binary (also running on the iPhone) can use. It also fetches the Met & Notams and saves those for display.
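
A minimal sketch of that shared-container step (the app group identifier and file name here are made up for illustration):

import UIKit

// Sketch: save a procedurally drawn airport image where both the app and the
// WatchKit extension can read it. Group ID and file name are assumptions.
func saveSharedAirportImage(_ image: UIImage) throws {
    guard let container = FileManager.default.containerURL(
            forSecurityApplicationGroupIdentifier: "group.example.rwygo"),
          let data = image.pngData() else { return }
    let url = container.appendingPathComponent("nearest_airport.png")
    try data.write(to: url, options: .atomic)
}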

Here’s the first draft I used as the overall structure for the app (how all the bits and pieces connect together):

First draft of the app structure

The implementation ended up a lot more complex as you work through issues & test the limits of what the SDK can do for you. This was my first project using Apple’s Auto Layout and Swift, which turned out to be awesome for it. Here’s the auto-layout for the iPhone app in Xcode:

RWY_Go_AutoLayout_iPhone

Essentially, all of the elements on screen are built up with relationships to each other. It requires tons of testing and tweaking on all of the different devices in portrait & landscape, but once done, works perfectly in native resolution in any interface orientation.
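
In code, the same kind of relationships look roughly like this (a sketch, not the app’s actual constraints):

import UIKit

// Sketch: pin a runway diagram to the centre of its parent and give it a width
// relative to the screen, so it scales across devices and orientations.
func constrain(_ runwayView: UIView, in parent: UIView) {
    runwayView.translatesAutoresizingMaskIntoConstraints = false
    NSLayoutConstraint.activate([
        runwayView.centerXAnchor.constraint(equalTo: parent.centerXAnchor),
        runwayView.centerYAnchor.constraint(equalTo: parent.centerYAnchor),
        runwayView.widthAnchor.constraint(equalTo: parent.widthAnchor, multiplier: 0.8),
        runwayView.heightAnchor.constraint(equalTo: runwayView.widthAnchor) // keep it square
    ])
}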

 

The Watch interfaces were created in a similar fashion by dragging & dropping elements onto a screen. With some nested grouping of items, it’s possible to achieve a nice interface on both watch sizes. We didn’t actually have any devices to test on, so it was all done in the Simulator. Apple only allowed 1 day in Sydney playing with the watch prior to public release to check it worked.

RWY_Go_AutoLayout

The Watch doesn’t do animations the same way iPhone/iPad does (e.g. “Rotate this image by π/2”). You need to specify 360 individual images and have them flash up quickly. Yes – you need to create 360 images in Photoshop. Then another 360 every time you change the design!

compass_220-0@2x compass_220-1@2x compass_220-2@2x compass_220-3@2x compass_220-4@2x compass_220-5@2x compass_220-6@2x compass_220-7@2x compass_220-8@2x compass_220-9@2x compass_220-10@2x compass_220-11@2x
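
Picking the right frame from that sequence is then just name arithmetic; something like this sketch (assuming the frames above are bundled with the Watch app):

import WatchKit

// Sketch: show the pre-rendered compass frame closest to the current bearing.
// compassImage is a WKInterfaceImage outlet; frame names match the sequence above.
func showBearing(_ bearingDegrees: Double, on compassImage: WKInterfaceImage) {
    var frame = Int(bearingDegrees.rounded()) % 360
    if frame < 0 { frame += 360 }
    compassImage.setImageNamed("compass_220-\(frame)")
}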

 

The first lot of artwork we used was yellow, then a light blue theme; however, we settled on a darker blue theme which worked better on the iPhone at night in a dark cockpit. The centreline markings were also removed as they looked too busy and cluttered. We wanted an interface that could, in a snapshot, give you the orientation and direction of the runways without your eyes getting lost in the picture.

First version – Runways drawing!

 

Second version – circle concept
Runways don’t fit well yet

 

Here are some other designs – some of these look really awesome and it was a difficult choice. At first the ones with high-contrast circles look the best; however, you’ll quickly realise that the large circle is redundant information and drowns out the actual data. The dark blue circle works better at night and allows the white arrow + text to stand out better.

IMG_3495

And some different ideas for runway drawings and frequency design. It turned out these don’t really fit well on the Watch, and some other technical limitations made them impossible to do well. We went with green/brown lines for grass/dirt runways as it’s more intuitive.

This looks modern but which is the grass runway?

IMG_3487 IMG_3490

 

RWY Go Watch
Final Design.

 

 

Database creation

We had an extensive database of runway thresholds from Airservices Australia and other sources; however, it didn’t cover a lot of the smaller airstrips or international airports. So we created our own data set.

Using Google Earth, we plotted the position of every single airfield we knew about, then went about mapping the thresholds and runway type (grass, dirt, bitumen) for each airport. The current count is 28,164 runways across 23,562 airfields around the world. Painstaking work! We managed to squeeze all of this data, plus navaid / frequency information, into a 6.1 MB database which is updated every 28 days.

Drawing airfields

Probably the most nerd-satisfying part of creating this app was procedurally drawing the airfields from a bunch of lat/lon pairs for runways. The easy bit is creating a class that converts lat/lon pairs to x,y pixel coordinates based on the maximum and minimum latitude and longitude found for that airport. Here’s a snippet with some corrections to squeeze the runways inside a circle:

        //Center any axis that would otherwise be at the top or on the left (so the image is centered)
        if new_xScale < xScale {
            let scaleDifference = 1.0 - (new_xScale / xScale) //(e.g. compressed 50% would be 0.5)
            xOffset += (scaleDifference / 2.0) * Double(clippedRect.size.width)
        } else if new_yScale < yScale {
            let scaleDifference = 1.0 - (new_yScale / yScale)
            yOffset += (scaleDifference / 2.0) * Double(clippedRect.size.height)
        }
        xScale = new_xScale
        yScale = new_yScale
        
        //Correct so any diagonal runways outside of the circle are put back inside for rotations.
        // iPhone & iPad only (not Watch):
        if fitInsideCircle {
            let center = CGPoint(x: rect.width * 0.5, y: rect.height * 0.5)
            var longestThresholdFromCenter = 0.0
            for runway in apt.runways {
                let p1 = latLonToPoint(runway.threshold1.position)
                let p2 = latLonToPoint(runway.threshold2.position)
                let p1Dist = (p1.x - center.x)*(p1.x - center.x) + (p1.y - center.y)*(p1.y - center.y)
                let p2Dist = (p2.x - center.x)*(p2.x - center.x) + (p2.y - center.y)*(p2.y - center.y)
                longestThresholdFromCenter = max(longestThresholdFromCenter, sqrt(Double(p1Dist)))
                longestThresholdFromCenter = max(longestThresholdFromCenter, sqrt(Double(p2Dist)))
            }
            let desiredRadius = sqrt(Double(center.x * center.x)) - Double(rect.width*0.16) //assume square image & reduce by a further 16%
            if longestThresholdFromCenter > desiredRadius {
                let overhang = longestThresholdFromCenter - desiredRadius
                let reduction = sqrt(overhang*overhang*0.5)*2.0
                xOffset += reduction * 0.5
                yOffset += reduction * 0.5
                xScale *= (1.0 - reduction/Double(rect.size.width))
                yScale *= (1.0 - reduction/Double(rect.size.height))
            }
        }
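
The latLonToPoint(_:) helper used above isn’t shown; a minimal sketch of that kind of conversion (parameter names assumed from the snippet) could be:

import CoreGraphics

// Sketch of the lat/lon -> pixel mapping the snippet relies on. minLon/maxLat
// come from the airport's bounding box; the scales/offsets are the corrected
// values computed above.
func latLonToPoint(lat: Double, lon: Double,
                   minLon: Double, maxLat: Double,
                   xScale: Double, yScale: Double,
                   xOffset: Double, yOffset: Double) -> CGPoint {
    let x = (lon - minLon) * xScale + xOffset
    let y = (maxLat - lat) * yScale + yOffset   // flip so north is towards the top
    return CGPoint(x: x, y: y)
}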

The runways are then drawn on as a simple drawing call (“draw a box from (a,b) to (c,d) width 4.0”). The entire thing is saved as a small image and cached. It also draws a diagram for the search results (with no labels, thinner lines) and the Watch, which can be slightly larger as the image doesn’t need to rotate.
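
That drawing step could look roughly like this (UIGraphicsImageRenderer stands in here for whatever the app really uses; colours and widths are illustrative):

import UIKit

// Sketch: stroke each runway as a thick line between its two thresholds and
// return a small image that can then be cached and reused.
func drawAirportDiagram(runways: [(CGPoint, CGPoint)], size: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { ctx in
        ctx.cgContext.setLineWidth(4.0)
        ctx.cgContext.setStrokeColor(UIColor.white.cgColor)
        for (threshold1, threshold2) in runways {
            ctx.cgContext.move(to: threshold1)
            ctx.cgContext.addLine(to: threshold2)
        }
        ctx.cgContext.strokePath()
    }
}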

The tricky part is positioning the labels as subviews of the runway diagram so they rotate in the opposite direction to the diagram and don’t overlap. For example, in the diagram below, RWY 04 at EHAM (Amsterdam Int’l) is shifted to not overlap with the other numbers. It took a couple of hours of tweaking drop shadows, font sizes and colours of the labels to make them readable with high contrast over the top of the runways. Bonus points for the reader if you can figure out how to dynamically place a windsock on a runway diagram (choose an x,y position), in a good-looking spot that doesn’t overlap with any of the runways or labels – the only data is N pairs of runway thresholds (a,b), (c,d) where N >= 1.
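
The counter-rotation itself is the easy bit; a sketch (view names illustrative):

import UIKit

// Sketch: when the runway diagram rotates to match your track, rotate each
// label subview by the opposite angle so the runway numbers stay upright.
func rotateDiagram(_ diagram: UIView, labels: [UIView], toTrackDegrees track: Double) {
    let angle = CGFloat(track * .pi / 180.0)
    diagram.transform = CGAffineTransform(rotationAngle: -angle)
    for label in labels {
        label.transform = CGAffineTransform(rotationAngle: angle)   // cancels the parent rotation
    }
}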

Runway1 Runway2List

The windsock appears after it has downloaded the TTF, TAF and METAR for the airport. It uses a regular expression to search through and find text like 23025KT or 18030G42KT, prioritised by METAR, SPECI, TTF then TAF. The result should be a windsock that is very close to the wind right now.
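
A sketch of that kind of wind-group match (the pattern here is illustrative, not the app’s exact expression):

import Foundation

// Sketch: pull direction and speed (ignoring any gust group) out of text like
// "23025KT" or "18030G42KT". Illustrative only; it won't handle VRB winds etc.
func parseWind(from report: String) -> (directionDegrees: Int, speedKnots: Int)? {
    let pattern = "\\b(\\d{3})(\\d{2,3})(G\\d{2,3})?KT\\b"
    guard let regex = try? NSRegularExpression(pattern: pattern, options: []),
          let match = regex.firstMatch(in: report, options: [],
                                       range: NSRange(report.startIndex..., in: report)),
          let dirRange = Range(match.range(at: 1), in: report),
          let spdRange = Range(match.range(at: 2), in: report),
          let dir = Int(report[dirRange]), let spd = Int(report[spdRange])
    else { return nil }
    return (dir, spd)
}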

Hypoxia Alerts

Soon after releasing RWY Go, we looked at the altimeter in the iPhone 6 and wondered how we could use it. Because it measures cockpit static pressure, which has loads of errors, it’s fairly useless as an aircraft altimeter. The best use for it is cabin altitude! For those that don’t fly above 10,000 ft on oxygen, hypoxia is a big deal. The time of useful consciousness decreases rapidly from about 30 minutes at 18,000 ft to 3-5 minutes at 25,000 ft. And the clock starts ticking above 10,000 ft, so if you have had a gradual ascent (perhaps a slow loss of cabin pressure), your time at 25,000 ft may only be a minute or two.

What RWY Go does is start a timer above 10,000 ft that ticks down at a certain rate to simulate your O₂ saturation. The higher you go, the faster it ticks down. When it gets to a very conservative threshold (about 50% of the published times), it will show a message to check oxygen. It goes without saying this isn’t a certified system and shouldn’t be relied upon in place of proper equipment and procedures. It’s an extra help that may come in useful one day, or provide some education about how long it can take to become hypoxic. Hypoxia alerts can be disabled in iPhone Settings.
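
A very rough sketch of that countdown idea (the numbers here are placeholders, not the model the app actually uses):

import Foundation

// Sketch of a conservative "oxygen budget" countdown. Rates and thresholds are
// placeholders for illustration only.
struct HypoxiaModel {
    private(set) var budget: TimeInterval = 30 * 60   // notional full budget in seconds

    mutating func tick(cabinAltitudeFt: Double, dt: TimeInterval) {
        guard cabinAltitudeFt > 10_000 else { return }          // clock only runs above 10,000 ft
        // Deplete faster the higher you are (placeholder depletion curve).
        let rate = max(1.0, (cabinAltitudeFt - 10_000) / 8_000)
        budget -= dt * rate
    }

    var shouldWarn: Bool { budget < 15 * 60 }   // warn well before the budget is exhausted
}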

http://www.youtube.com/watch?v=UN3W4d-5RPo

Hypoxia_Warning3 Hypoxia_Warning

Well, I hope that gives some insight into what goes on behind the scenes. This has probably been the most enjoyable small project I’ve ever worked on. Swift is a really awesome language to program in, plus working with our artist and brainstorming design paradigms is great fun. Since writing the app, Apple have released watchOS 2, which allows native binaries to run on the Watch. I’ve updated the iPhone/iPad code already, but now need to do a massive rewrite of the Watch code to support this change. It should result in the Watch app loading and displaying content much quicker than the wireless transfers do at the moment.

RWY Go 3_small

 

Creating OzRunways traffic

And why you should use it!

In this post I’ll discuss some technical aspects of how we created the OzRunways traffic system. This will be at a nerd level but others may find it interesting.

OzRunways Traffic
tx.OzRunways.com

Let’s start with the requirements for a traffic system. It must be:

  • Low latency,
  • Reliable,
  • Able to carry a lot of important information, and
  • Usable on flaky mobile data networks with high packet loss.

We settled on UDP as the protocol of choice. UDP has some good characteristics that make it useful for traffic. For a start, it works extremely well over poor network connections. Imagine UDP as “send send send” versus TCP, which involves handshakes and a flow of ‘ack’ packets back and forth to establish a reliable connection. Traffic is time sensitive, so if a UDP data packet is lost we don’t care; the next one to arrive in a few seconds completely replaces the data from the last one.

To use UDP with maximum reliability, we had to look at how to send a single packet with all of the information we need, without the 3G network fragmenting it into multiple packets. A quick Google search showed that keeping each packet under about 500 bytes (plus the IP header) should guarantee it being sent as a single packet across 3G networks.

The basic infrastructure involves the OzRunways iOS client sending out its position, callsign, flight plan, unique identifier and a few other details (climb rate, track etc). Our server is a simple echo server that saves this to a database and sends a single packet back with the details of a bunch of nearby aircraft for displaying on the screen.
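
A sketch of the client side of that exchange, using Apple’s Network framework as a stand-in for whatever socket layer the app really uses (host, port and payload are illustrative):

import Network

// Sketch: fire a small UDP datagram at the traffic server and read back one
// reply containing nearby aircraft. Host/port and payload are illustrative.
func sendPositionPacket(_ payload: Data) {
    let connection = NWConnection(host: "tx.example.com", port: 1234, using: .udp)
    connection.stateUpdateHandler = { state in
        guard case .ready = state else { return }
        connection.send(content: payload, completion: .contentProcessed({ _ in }))
        connection.receiveMessage { reply, _, _, _ in
            if let reply = reply {
                // decode the ~500 byte nearby-aircraft packet and update the map
                print("got \(reply.count) bytes of traffic")
            }
            connection.cancel()
        }
    }
    connection.start(queue: .main)
}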

The OzRunways client first needs to check whether we are flying before sending packets. We settled on a simple algorithm that uses a combination of timers, height above ground (using the NASA SRTM terrain database) and forward speed to decide whether we’re flying. The timers are there for touch & go’s: they allow a minute of slow speed and low height so the system keeps sending packets until you’ve taxied clear of the runway. To protect user privacy, once we detect you are not flying, we send an incorrect user position and clear an ‘airborne’ bit in the packet, so the server still sends back nearby aircraft to display but does not save the client position in the database for other users to see.
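
A simplified sketch of that kind of check (the speed/height thresholds and the one-minute grace period are illustrative):

import Foundation

// Sketch of an "are we flying?" heuristic. Threshold values are illustrative only.
struct FlightDetector {
    private var lastAirborne: Date?

    mutating func isFlying(groundSpeedKt: Double, heightAboveTerrainFt: Double, now: Date = Date()) -> Bool {
        if groundSpeedKt > 40 || heightAboveTerrainFt > 200 {
            lastAirborne = now                       // clearly airborne (or about to be)
            return true
        }
        // Slow and low: keep "flying" for a grace period so a touch-and-go or
        // back-tracking on the runway doesn't drop the traffic feed.
        if let last = lastAirborne, now.timeIntervalSince(last) < 60 {
            return true
        }
        return false
    }
}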

For the actual packet implementation, we needed to squeeze as much data as possible into ~500 bytes. The biggest saving comes from an implementation of a latitude/longitude pair that can be squeezed into a small number of bits. For example, a packet sent back with 10 nearby aircraft, each with a flight plan of 5 points, would contain 50 lat/lon pairs. At 6 bytes for each lat/lon, that would chew up 300 bytes. Using a standard 4-byte or 8-byte float/double was therefore not ideal.

For our lat/lon pair we looked at the resolution required by dividing the Earth up into X segments both vertically and horizontally. Since latitudes run from +90 to -90, there are 180 degrees of space available. For longitudes, it’s -180 to +180 = 360 degrees at the equator. Longitudes further north or south get squished, resulting in better resolution, so we’ll use the equator as the worst case. Here are the calculations for latitudes:

1 bit of information = 90 degrees resolution available (180 / 2^1)
2 bits = 45 degrees (180 / 2^2)
…..
8 bits (1 byte) = 0.7 degrees (42nm resolution)
16 bits (2 bytes) = 0.0027 degrees = 0.16nm (300m – we’re getting close).
24 bits (3 bytes) = 1.2m resolution.

Excellent – so instead of a 4-byte float, if we represent +90 to -90 of Earth’s latitude as 0 to +180 sliced up into 16,777,216 pieces, we can just send a 3-byte integer. The decoder can just multiply the number by 0.00001072883606 to get the latitude back with about 1-2 m resolution, which is perfect for our traffic system.

+ (int)readUnsignedMedium: (const void *)bytes {
    // Read a 3-byte (24-bit), network-order unsigned integer from the packet buffer.
    int n = 0;
    memcpy(&n, bytes, 3);   // copy the 3 bytes into the start of a zeroed 4-byte int
    n = n << 8;             // shift up so the 4-byte byte-swap below lands the value correctly
    return NTOHL(n);        // network (big-endian) to host byte order
}
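
For completeness, here’s a sketch of the matching encode/decode pair in Swift (illustrative only; the constants are just 2^24 and the 180-degree span described above):

import Foundation

// Sketch of the encode side: map latitude (-90...+90) onto a 24-bit unsigned
// integer and emit it as 3 big-endian (network order) bytes.
func encodeLatitude(_ latitude: Double) -> [UInt8] {
    let ticks = min(16_777_215, UInt32(((latitude + 90.0) / 180.0) * 16_777_216.0))
    return [UInt8(truncatingIfNeeded: ticks >> 16),
            UInt8(truncatingIfNeeded: ticks >> 8),
            UInt8(truncatingIfNeeded: ticks)]
}

func decodeLatitude(_ bytes: [UInt8]) -> Double {
    let ticks = (UInt32(bytes[0]) << 16) | (UInt32(bytes[1]) << 8) | UInt32(bytes[2])
    return Double(ticks) * (180.0 / 16_777_216.0) - 90.0   // undo the +90 offset
}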

We did the same calculations for all of the data. We could squeeze a lot of data into 1 byte (for example, vertical speed with a resolution of 100 ft/sec), resulting in an overall packet size of ~80 bytes (including the 20-byte IP header) when sending our information out. This is why the OzRunways traffic system is so reliable — an 80-byte UDP packet is about the most reliable thing you can send out of a little iPad antenna at 24,000 ft. We tried gzip compression but did not get any gains for either the 80 or 500 byte packets.

 

FL240
tx.OzRunways.com

 

For the echo server, we jam as many aircraft as possible into the return packet until we hit 500 bytes. We include all of their information, a timestamp (for dead-reckoning their symbols on the map) and flight plans. We average around 18 aircraft per return packet, which is why you only see nearby aircraft. You can see all traffic at tx.ozrunways.com.
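
Conceptually, the server-side packing loop is as simple as this sketch (types illustrative):

import Foundation

// Sketch: append encoded aircraft records to the reply until adding the next
// one would push the payload past the ~500 byte single-packet budget.
func buildReplyPacket(nearbyAircraft: [Data], maxBytes: Int = 500) -> Data {
    var packet = Data()
    for record in nearbyAircraft {                 // assumed sorted nearest-first
        if packet.count + record.count > maxBytes { break }
        packet.append(record)
    }
    return packet
}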

In the end, we have written an entire packet specification mapping out individual bit values including some reserved bits for additional features (e.g. an ’emergency’ bit) should we choose to implement them. We think it’s more flexible than the ADS-B specification as we can send & receive flight plans of other users.

There’s obviously a lot more that goes into the traffic system, such as database design (storing tens of millions of data packets for fast access), interfaces for SAR agencies, and human-interface design of the traffic icons on screen; however, I can’t go on forever! Let’s finish with a screenshot of somebody landing in remote QLD:

Traffic

As you can see, the 3G coverage is very good with UDP. If this person were to crash, we could give SAR agencies an exact position. You can also see the timer working to disable the traffic feed after the pilot has landed and taxied off the runway.

Fly safe.

Rowan.