If you have flown anywhere in the last fifty years, you have seen an airport code. It is the three-letter abbreviation on your boarding pass, your luggage tag, and the flight board above the gate. LAX. JFK. LHR. DXB. They look like shorthand for city names, and sometimes they are, but just as often they are something stranger. Chicago's main airport is ORD. Orlando is MCO. Toronto is YYZ. Where did those come from?
The three-letter airport code is one of the most widely used classification systems in modern life, and almost nobody thinks about how it works. The rules behind it are a mix of early aviation history, early-twentieth-century telegraph infrastructure, and the mundane bureaucracy of a trade association that has been assigning these codes since the 1940s. Once you understand the patterns, every code tells you something about when and how an airport entered the global aviation network.
The two code systems you will actually see
There is not one airport code. There are two, and they exist for different reasons.
The codes you see on your boarding pass are IATA codes. IATA is the International Air Transport Association, a trade body that represents most of the world's airlines and handles the commercial plumbing of the industry: revenue settlement between carriers, ticketing standards, cargo rules, and the naming conventions that make it possible for an airline in one country to sell a connection through an airport in another. IATA codes are three letters, and they identify airports, cities, and whole metropolitan areas. LAX is an IATA code. So is NYC, which refers to the combined New York City airport system rather than a specific airport. Airlines have IATA designators too, but those are two characters: AA for American, BA for British Airways.
The codes used by pilots, dispatchers, and air traffic controllers are different. Those are ICAO codes, assigned by the International Civil Aviation Organization, a United Nations agency that handles the safety and operational side of aviation. ICAO codes are four letters and they follow a geographic hierarchy. Every ICAO code starts with a letter or two identifying the country or region, and the remaining letters identify the specific airport. KLAX is Los Angeles. EGLL is London Heathrow. OMDB is Dubai. RJTT is Tokyo Haneda. If you have ever heard a flight crew read a route off a clearance, the airport names they say out loud are usually ICAO codes.
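The geographic hierarchy can be made concrete with a small sketch. The prefix table below is a tiny illustrative subset of the real ICAO allocation, and the `icao_region` helper is a made-up name for this example, not part of any real API:

```python
# A minimal sketch of reading an ICAO code's geographic prefix.
# ICAO allocates prefixes of one or two letters, so we try the
# longest match first. This table is a small subset for illustration.
ICAO_PREFIXES = {
    "K": "contiguous United States",
    "C": "Canada",
    "EG": "United Kingdom",
    "ED": "Germany",
    "OM": "United Arab Emirates",
    "RJ": "Japan",
}

def icao_region(code: str) -> str:
    """Return the region implied by an ICAO code's prefix."""
    for length in (2, 1):  # longest prefix wins
        region = ICAO_PREFIXES.get(code[:length])
        if region:
            return region
    return "unknown region"

print(icao_region("KLAX"))  # contiguous United States
print(icao_region("EGLL"))  # United Kingdom
print(icao_region("OMDB"))  # United Arab Emirates
```

The longest-match-first loop matters because some countries get a single letter (K, C) while most get two (EG, ED, OM).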
The practical difference is who needs each. Passengers deal with airlines, tickets, and gates, so they see IATA. Operations deal with air traffic control, flight plans, and international sovereignty questions, so they use ICAO. The two systems run in parallel and almost never overlap in daily life, even though they describe the same physical places.
The rest of this article is about IATA codes, because those are the ones you meet on the ground.

Who assigns them and how
IATA itself manages the code list, but airports and airlines apply for the codes they want. When a new airport opens, or a small airport wants to expand into commercial service, it submits a request with a shortlist of three-letter combinations it would like to use. IATA reviews the request, checks for collisions with existing codes, and assigns one. This happens more often than you might think, because small regional airports come online regularly and occasionally a larger airport gets renamed or reorganized.
There are exactly twenty-six cubed possible three-letter combinations, which is 17,576. IATA has assigned codes to something like ten thousand airports worldwide, which means the namespace is genuinely crowded. Many of the obvious codes are taken. If you wanted the code BRN for a new airport today, you could not have it, because Bern, Switzerland, has held it for decades. New assignments tend to get more creative as the easy three-letter combinations disappear.
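The arithmetic is worth a glance. The ten-thousand figure is this article's rough estimate rather than official data, so the occupancy percentage is only indicative:

```python
# Back-of-the-envelope arithmetic for the IATA namespace.
from string import ascii_uppercase

total = len(ascii_uppercase) ** 3  # 26^3 three-letter combinations
print(total)                       # 17576

assigned = 10_000                  # rough estimate of assigned codes
print(f"{assigned / total:.0%} of the namespace occupied")  # 57%
```

Fifty-seven percent occupancy understates the crowding, because the assigned codes cluster heavily in the pronounceable, city-like corner of the space.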
There is also a small set of codes that IATA reserves and will not assign. Codes starting with Q are generally avoided because three-letter Q groups (QNH, QFE, and the rest of the radiotelegraph Q code) already carry operational meanings in aviation communication. Codes that spell offensive words in major languages are blocked, as are codes that collide with common airline designators or reserved sequences. The system evolves quietly, but it evolves.
The patterns behind the letters
Most airport codes follow one of a few patterns, and once you know them the world gets legible quickly.
The first pattern is the obvious one: the first three letters of the city name. BOS is Boston. SIN is Singapore. CAI is Cairo. MAD is Madrid. BER is Berlin. When the city has a short name and no competing claim on those letters, this is the default. A large share of major airports worldwide use some version of this pattern.
The second pattern is a phonetic or abbreviated version of the city name when the first three letters are already taken or would be confusing. PHX is Phoenix. ABQ is Albuquerque. TPA is Tampa. HNL is Honolulu. These are not the first three letters of the city name, but they keep enough of its skeleton that you can almost always read them aloud and hear the city.
The third pattern is a reference to the airport itself, not the city. Often this is because the airport was originally named for something other than the city it now serves. The most famous example is ORD for Chicago O'Hare. The airport was built on the site of an earlier airfield called Orchard Field, and when the Chicago airport network expanded in the 1940s the site inherited its old code. When the airport was renamed for Lieutenant Commander Edward O'Hare in 1949, the ORD code stuck. It has been the city's airport code for nearly eighty years, even though the word "Orchard" appears nowhere in the airport's current name.
MCO is the same kind of ghost. Orlando's airport is on the former site of McCoy Air Force Base, a Strategic Air Command facility that closed in the 1970s. When the civilian airport took over the field, it inherited the military code. Today, MCO is a tourist airport serving Disney World, but the code is a memorial to a Cold War bomber base.

The X that was added to LAX
One of the most-repeated stories in aviation is the claim that LAX used to be LA and had an X appended when three-letter codes came into use. This is roughly true but needs a little care.
In the 1930s, airport codes in the United States were two letters, managed by the United States Weather Bureau (the forerunner of today's National Weather Service) and derived from telegraph station identifiers. Los Angeles Municipal Airport, which would later become LAX, used the two-letter code LA. When the aviation industry shifted to three-letter codes during and after the Second World War, the Weather Bureau and later IATA adopted conventions that required unique three-letter identifiers. Airports with existing two-letter codes often kept them and added a filler letter, and the letter that ended up being used most often was X.
This is why a whole cluster of American airports have codes ending in X: LAX, PHX, and PDX for Portland. The X is not meaningful. It is a placeholder that was chosen because it was distinctive and unlikely to create collisions with existing three-letter codes from other regions.
Not every airport was so lucky. Some had to take whatever was available. PIT for Pittsburgh is a simple abbreviation. BWI is Baltimore-Washington International, an initialism. MSP for Minneapolis-Saint Paul takes letters from both of the cities it serves. DTW for Detroit adds a W from Wayne County, the name of the airport's original site. Almost every weird code has a reason, usually historical, and usually visible if you dig into the airport's pre-IATA name.
Why Canadian airports start with Y
If you have ever flown to Toronto, Vancouver, Calgary, or Montreal, you have seen the strangest-looking codes in the world. YYZ. YVR. YYC. YUL. YHZ. YEG. Every major Canadian airport starts with a Y.
The reason goes back to the railway telegraph system. In the early twentieth century, Canada's weather and telegraph reporting used two-letter station identifiers. Many of these stations were located at airports, because airports needed weather information. When aviation radio identifiers were introduced in the 1930s, Canadian stations prefixed their existing two-letter telegraph codes with a letter indicating whether the weather reporting station was co-located with a radio navigation aid. The letter Y meant "yes, there is a station here." Other prefixes existed: W for "without a weather station," U for "uncertain," and so on. Over time, Y stuck as the default, the other prefixes fell out of use for major airports, and the Y-prefixed codes became the civilian aviation identifiers.
So YYZ is Y plus YZ, where YZ was the old telegraph code for Malton, the Ontario town where Toronto's airport was built. YVR is Y plus VR, where VR was Vancouver. YUL is Y plus UL, where UL was Dorval, Quebec, the location of Montreal's main airport. The Y is a fossil of a weather-reporting convention that has not been used in its original form for decades.
If you know the system, you can read Canadian codes almost like a map. YHZ is Halifax. YEG is Edmonton. YOW is Ottawa. The second and third letters are usually recognizable.
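The Y-plus-station structure described above can be sketched directly. The station names come from this article's examples; the `decompose` helper is illustrative, and the table is deliberately tiny:

```python
# Decomposing Canadian IATA codes as described above: a Y prefix
# plus an old two-letter telegraph/weather-station identifier.
TELEGRAPH_STATIONS = {
    "YZ": "Malton, Ontario (Toronto)",
    "VR": "Vancouver",
    "UL": "Dorval, Quebec (Montreal)",
}

def decompose(code: str) -> str:
    """Split a Canadian code into its Y prefix and station identifier."""
    station = TELEGRAPH_STATIONS.get(code[1:])
    if code.startswith("Y") and station:
        return f"Y + {code[1:]} ({station})"
    return f"{code}: no station on record here"

print(decompose("YYZ"))  # Y + YZ (Malton, Ontario (Toronto))
print(decompose("YUL"))  # Y + UL (Dorval, Quebec (Montreal))
```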
Codes that collide, and codes that do not
Sometimes a city has more than one airport. When that happens, IATA issues a metropolitan area code that applies to the whole city, and each airport gets its own specific code. New York City is NYC at the metro level, but JFK, LGA, EWR, HPN, SWF, and ISP are the individual airports. London is LON, with LHR, LGW, STN, LTN, LCY, and SEN as the individual airports. Tokyo is TYO, with HND and NRT. Paris is PAR, with CDG and ORY.
The metro code matters because it lets airlines and travel platforms present the whole city as a single origin or destination when you search for a flight. If you look up a fare from "New York to London," the search engine is resolving NYC against LON and finding all the combinations of individual airport pairs, then ranking them. You as a passenger do not see NYC or LON on a boarding pass. You see the specific airport you actually fly through. But the metro code is doing work in the background.
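The resolution step can be sketched in a few lines. The airport lists mirror the article's examples; a real fare engine would then price and rank each candidate pair, which this sketch leaves out:

```python
# A sketch of metro-code resolution: a search for "New York to London"
# expands NYC and LON into every concrete airport pair.
from itertools import product

METRO = {
    "NYC": ["JFK", "LGA", "EWR", "HPN", "SWF", "ISP"],
    "LON": ["LHR", "LGW", "STN", "LTN", "LCY", "SEN"],
}

def expand(origin: str, destination: str) -> list[tuple[str, str]]:
    """Resolve metro codes to airport pairs; a plain airport
    code simply expands to itself."""
    origins = METRO.get(origin, [origin])
    destinations = METRO.get(destination, [destination])
    return list(product(origins, destinations))

pairs = expand("NYC", "LON")
print(len(pairs))  # 36 candidate airport pairs
print(pairs[0])    # ('JFK', 'LHR')
```

Passing a specific airport code on either side collapses that side to a single entry, which is why the same lookup works whether you search by city or by airport.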
Codes also sometimes nearly collide across continents, and the near-misses are memorable. GVA is Geneva and GNV is Gainesville, Florida. Close enough to be confusing if you read fast. IEV is Kyiv's city-centre Zhuliany airport, while the city's main international airport at Boryspil is KBP. YKM is Yakima, Washington. YUM is Yuma, Arizona. And JAL is not an airport at all but the ICAO designator for Japan Airlines, whose two-character IATA code is JL; airline codes live in their own namespaces alongside the airport codes.
When codes retire, or try to
IATA codes are sticky. Once an airport has a code, it keeps it even through renaming, expansion, or reconstruction. O'Hare kept ORD. Orlando kept MCO. Mumbai kept BOM, which is still its IATA code decades after the city itself was renamed from Bombay.
But codes do occasionally change. The most famous American example is New York's main international airport, which was called Idlewild and had the IATA code IDL from 1948 until 1963. When the airport was renamed for President John F. Kennedy after his assassination, the code changed to JFK. The three letters followed the name. This is unusual. More often, the code persists even when the name changes, because changing the code creates years of transition headaches for airlines, reservation systems, and the printed infrastructure of aviation.
China's situation with Beijing is one of the most interesting recent cases. PEK has been Beijing's airport code since the 1950s, based on the Wade-Giles romanization of Peking. When Beijing Daxing International Airport opened in 2019, it did not replace PEK. It was assigned a separate code, PKX, and Beijing now has two major international airport codes operating in parallel. The older transliteration survives as a fossil alongside the newer one.

Reading a code like a map
Once you have seen enough codes, you start to read them geographically without thinking. K-prefixes in ICAO land mean the contiguous United States. E-prefixes mean northern Europe. O-prefixes mean the Middle East. R-prefixes mean East Asia. IATA codes do not have that geographic regularity, but the patterns still accumulate. SIN sounds like Singapore. AUH sounds like Abu Dhabi. YYZ looks Canadian before you know why. ORD signals a long-established American hub. CDG tells you Paris Charles de Gaulle rather than Paris Orly.
The codes are not trying to be elegant. They are trying to be unique, globally communicable, and stable enough to print on a ticket. They are the vocabulary of the aviation system, and they have survived almost a century of technological change because they are short, they are unambiguous once assigned, and they are shared by everyone in the industry.
The next time you see an unfamiliar code on a departure board, try to guess before you look it up. Is it an obvious abbreviation? A phonetic spelling? A historical fossil? Does it start with Y? The answer is almost always there in the letters.
That, more or less, is how to read the world from three letters at a time.