Holding all the cards
THE RISE OF THE CARD
Until the late 1950s, credit cards were linked to a single bank or merchant. Practically speaking, this meant that consumers had to stow a fat stack of cards in anticipation of where they’d spend money. Back then, the cards were cardboard and served as a sort of glorified tab: a way for restaurants, department stores, and hotels to accept payment without forcing customers to travel home for more cash, to engender loyalty, and to get customers to spend more.
This fragmented approach essentially positioned companies to behave a bit like financial institutions by creating their own payment methods. In an earlier era, banks had likewise issued their own banknotes, meaning the number of different types of money went hand in hand with the number of financial institutions in existence. (At one point, there were more than 5,000 different types.) Once companies got on board with individual credit cards, there were even more ways to pay. But in the second half of the 20th century, many charge cards were created only to fail for lack of broad consumer acceptance. In 1951, for example, New York’s Franklin National Bank (now better known for its 1974 collapse, at the time the largest bank failure in American history) introduced the first bank credit card, permitting its use only by the bank’s account holders, and only at merchants with which the bank had partnered. And if a merchant didn’t accept or have its own card, the customer was still stuck using cash.
In 1949, businessman Frank McNamara was dining with clients in New York when he realized, to his horror, that he had left his wallet at home. The incident inspired him to establish the Diners Club card, a charge card accepted mostly at restaurants. The card was a rousing success: in its first year of operation, 28 restaurants and two hotels in New York got on board with the payment method, and 10,000 members of the city’s business elite used it to pay. This kind of broader interoperability had been born of a specific form of necessity, the need to avoid embarrassment, and it quickly became clear that it was here to stay.
The Diners Club card also introduced the concept of a “closed-loop network,” in which the same company acts as both the card issuer and the credit card association. That idea gained in popularity toward the end of 1958, when American Express, which had entered the financial services fray as a money order business, began issuing its own credit cards, complete with a $6 annual fee, an opening salvo in establishing American Express as a premium product. And while American Express still operates a closed-loop network today, other, more popular cards like Visa and MasterCard leave the card-issuing function to third-party banks, though their origins were intertwined.
These early iterations were all precursors to the modern bank card, which was the brainchild of Bank of America. In September 1958, the company mailed 60,000 credit cards to consumers in Fresno, California; by October 1959, more than 2 million credit cards had been issued in California alone. But the program, called BankAmericard, was barred from doing business in other states.
BankAmericard then offered to split its fees with banks in other states, which would essentially sponsor the expansion. The impact was huge: banks began offering the card, which would later become the Visa, to their customers, and merchants began accepting it as a way to drive sales, paying a fee for the privilege. Similarly, in 1967, four California banks founded a competitor to BankAmericard, known as the Master Charge program, which became MasterCard 12 years later. Now that banks were part of the system, merchants got paid directly and immediately, and customers had only to pay back the banks.
THE LITTLE FEE THAT COULD
In 1970, BankAmericard shook off its primary association with Bank of America. The coalition of issuer banks that had helped drive the card’s national expansion took control of the program, creating National BankAmericard Inc., and in 1976 renamed the card Visa. That private coalition, which eventually comprised more than 20,000 member institutions worldwide, continued to own Visa until 2008, when the company went public.
But although the ownership stakes have changed, a bit of a conflict of interest persists: Visa sets the fees payable to banks, and always has, even when the banks owned the company they were taking money from.
Card companies like Visa and MasterCard are only middlemen connected to other middlemen. For example, Visa links the swipe of the card to the customer’s bank (which issued the card) and the merchant’s bank (which receives the funds). In addition to maintaining relationships with the various parties, card companies handle the complexities of the transaction, including secure financial transfers and fraud monitoring.
For their trouble, the card companies levy the aforementioned fee, called an interchange or processing fee. These days, Visa and MasterCard charge hundreds of different rates across the types of cards that run through their networks, some as high as 3 percent. The rate varies with the type of card, the type of transaction, and the merchant’s average transaction volume, as well as with the perceived level of risk, such as whether the card is physically present. The networks then dole that interchange fee out to the various transaction facilitators, allocating most of it to the banks themselves.
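To make those mechanics concrete, here is a minimal sketch, in Python, of how a single purchase might be divided among the parties. Every rate and name below is invented for illustration; real interchange schedules run to hundreds of categories and change regularly.

```python
# Illustrative only: hypothetical rates, not any network's actual fee schedule.

def split_fees(amount: float, card_present: bool) -> dict:
    """Divide a purchase amount between the merchant and the middlemen."""
    # Card-not-present transactions carry more fraud risk, so networks
    # typically price them higher; these specific numbers are made up.
    interchange_rate = 0.015 if card_present else 0.025  # the issuing bank's cut
    network_rate = 0.0013                                # the card network's assessment
    acquirer_rate = 0.0025                               # the merchant's bank's markup

    issuer_fee = amount * interchange_rate
    network_fee = amount * network_rate
    acquirer_fee = amount * acquirer_rate
    merchant_receives = amount - issuer_fee - network_fee - acquirer_fee

    return {
        "issuing_bank": round(issuer_fee, 2),
        "card_network": round(network_fee, 2),
        "acquiring_bank": round(acquirer_fee, 2),
        "merchant": round(merchant_receives, 2),
    }

# A $100 online purchase: the merchant nets about $97.12, and the issuing
# bank takes the largest single cut, just as described above.
print(split_fees(100.00, card_present=False))
```

Even in this toy version, the structural point holds: the network writes the price list, but the biggest share of the fee flows to the banks.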
In the early 1960s, the interchange fee made sense: facilitating transactions was an expensive business. The computer infrastructure that connected banks and businesses was costly to develop and maintain, and the card companies’ share of the fee paid for it. At their establishment, these were staggeringly complex systems that improved the lives of many people. But fees have only risen in the decades since, while transaction speeds and access haven’t improved. The biggest recent innovation to come to the credit card industry, chip-and-PIN, stems from calls for greater security, not convenience, and, if anything, has only slowed things down.
In recent years, the interchange fee has accordingly come under a fair amount of scrutiny, in part because it’s unclear what the fees are being used for. Most merchants, and most consumers, have a hard time predicting just what fees they’ll end up paying, so merchants don’t know how to adjust prices or whether to pass the fees on to consumers. As Aaron Patzer, the co-founder of Mint, put it: “Outside the U.S. government, [credit card associations] are the only entity that has the power to levy a fee across virtually every transaction.”
Banks maintain that they need these fees to help preserve the integrity of their computer networks, extend lines of credit before purchases actually get paid for, and bankroll anti-fraud efforts. Interchange revenue ballooned to $45 billion in 2010, up from $20 billion in 2002, according to The New York Times; banks have used the revenue as a growing profit center and to pay for cardholder perks like rewards programs.
In some ways, the fee is just a vestige of a time when banks needed to be persuaded to play. The bulk of the interchange fee has always gone to the issuing banks, and that hasn’t changed even though the playing field has. The banks’ cut is extracted from the amount merchants collect when they submit credit or debit transactions for payment through their acquiring banks.
Though merchants once saw credit cards as a competitive advantage, a way to make it more convenient for consumers to shop at their establishments, the fees now make processing a credit card purchase cost a merchant six times as much as the same sale in cash.
With the advent of new technologies like Venmo and Square, many consumers view credit cards as an entrenched technology—ancient, even—despite being only half a century old. Though these platforms are still built on credit card rails, they’re already changing the way we pay. Merchants are interacting more directly with consumers than ever before, and these platforms are seeing rapid adoption; we may be at yet another turning point in the way we pay for things.