The evolution of borrowing and lending, from the 1800s to today.
We swipe our credit cards every day. But have you ever wondered how these cards came to be?
The concept of credit stretches back to ancient times. Back then, people used instruments like the tally stick to record debt. With this method, each party to a transaction would make notches in the stick to signify debt owed. Then the stick would be broken in half so each party could keep a record.
Fast-forward to the late 1800s, when charge coins came into fashion. Made of aluminum or steel, these coins were typically issued by department stores. During a purchase, the coin was used to imprint a customer’s information onto a sales slip.
These coins — and charge plates of the early 1900s — were precursors to the modern credit card. In short order, retailers began adopting the rectangular cards we know today.
How early charge cards were used
In the early 1920s, oil companies, airlines and department stores began giving out “courtesy cards.” Customers could use these cards only with the issuing companies.
In 1946, banker John Biggins created the first bank-issued card, calling it the Charg-It. The idea behind this card was simple: A customer used the card to buy something from a local merchant. The local merchant sent the transaction to Biggins’ bank, which paid the merchant and collected money from the customer. There was one small problem, however: The Charg-It could be used for local transactions only.
A solution arrived in 1950, when the Diners Club introduced the first universal card. Instead of being tied down to one establishment, Diners Club members could use the card for many purposes — especially travel and entertainment. Initially, members could use their cards at 27 restaurants in New York City.
The Diners Club Card was a charge card — cardholders had to pay their bill in full each month. It came with a $5 annual fee, which is about $49 in today’s dollars. Diners Club also charged merchants 7% to 10% for every card transaction.
Not to be outdone, the American Express Company introduced its Green Card in 1958. Like the Diners Club Card, it was a travel-and-entertainment card that could be used widely. Though American Express was second to market with charge cards, it was first to introduce plastic cards. (Diners Club switched to plastic in 1961.)
The BankAmericard and revolving credit
But 1958 wasn’t just the year of the Green Card: It was also the year of the BankAmericard.
The BankAmericard was different because it was a general-purpose card: Customers could use it anywhere, as long as merchants accepted it. It was also the first card to offer revolving credit, which meant consumers didn’t need to pay their balances in full each month and could instead pay off their cards over time.
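Revolving credit has a cost: interest accrues each month on whatever balance is carried over. The sketch below illustrates the mechanics with assumed numbers (a 20% APR and a fixed $100 monthly payment are hypothetical, not figures from the article):

```python
# Illustrative sketch of revolving credit: a balance carried month to
# month accrues interest, and the cardholder pays it down over time.
# The APR and monthly payment below are assumed for illustration only.

def months_to_pay_off(balance, apr=0.20, payment=100.0):
    """Return (months, total_interest) needed to pay off a revolving balance."""
    monthly_rate = apr / 12
    if payment <= balance * monthly_rate:
        raise ValueError("payment too small to ever reduce the balance")
    months = 0
    total_interest = 0.0
    while balance > 0:
        interest = balance * monthly_rate  # interest accrues on the carried balance
        total_interest += interest
        balance = balance + interest - payment
        months += 1
    return months, round(total_interest, 2)
```

For example, a $1,000 balance at these assumed terms takes about a year to clear, with roughly $100 of the payments going to interest rather than the original purchases — the trade-off that made revolving credit both popular and risky.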
BankAmericard was initially released in California. It was such a success there that Bank of America started licensing it to banks nationwide in 1966. Soon, international banks started issuing the card — you could find it in Canada, Ireland, Japan and Great Britain.
Seeing the success of the BankAmericard, Chemical Bank and Manufacturers Hanover Trust Co. created the Eastern States Bankcard Association. In collaboration with other regional bank groups, they started issuing cards through a network they called the Master Charge Plan.
In 1970, the BankAmericard’s issuing banks created National BankAmericard Incorporated. In 1976, BankAmericard was renamed Visa. Master Charge followed suit three years later, becoming Mastercard in 1979. And in 1985, Sears, Roebuck and Co. jumped into the credit card market with its Discover card.
Credit cards and the 2008 financial crisis
Revolving credit was an incredible innovation, but it provided consumers with more opportunities to accumulate debt.
At first, this wasn’t a huge problem, because consumers were highly judicious about what they bought on credit. As recently as the 1980s, most Americans didn’t buy nonessential items on credit. But that’s changed over time. Now, according to a recent CreditCards.com survey, roughly 17% of credit card owners use their cards for purchases of $5 or less, an uptick of 5% from the year prior and a sign of increasingly casual credit usage. It’s also normal to buy luxuries on credit, and today’s households that carry debt have an average of $16,000 in credit card balances.
In 2008, Americans held $951 billion in credit card debt. When the global financial crisis hit that year, many Americans found themselves charging more on credit cards for essentials like food and housing. With more debt, it became more difficult for consumers to make payments — and many of them defaulted.
Many card providers took advantage of these defaults with abusive business practices. They’d hike interest rates without notifying consumers, levy egregious over-limit fees and add confusing rules to card terms. Many consumers spiraled further into debt as a result.
Credit CARD Act of 2009
In response to shadowy card-provider practices, the US Congress introduced the Credit Card Accountability Responsibility and Disclosure (CARD) Act in 2009. The bill created many important protections for consumers, including:
- Notifications for rate increases. Card providers must let you know 45 days in advance if they’re increasing your interest rates.
- An end to retroactive rate increases. Prior to the Credit CARD Act, card issuers could retroactively raise interest rates after customers defaulted.
- A legal ban on double-cycle billing. This now-illegal practice allowed providers to charge interest on both your current and previous month’s balances. It was widely considered unfair, because consumers were charged even when they’d already paid off purchases.
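To see why double-cycle billing drew so much criticism, compare it with ordinary single-cycle billing. The numbers here are assumed for illustration: a $1,000 balance last month that was paid in full, a $200 balance this month, and a 1.5% monthly interest rate.

```python
# Hedged illustration of double-cycle billing, the practice banned by
# the Credit CARD Act. All dollar amounts and rates are assumed.

def single_cycle_interest(current_balance, monthly_rate):
    # Interest charged only on this month's balance.
    return current_balance * monthly_rate

def double_cycle_interest(prev_balance, current_balance, monthly_rate):
    # Interest assessed on the average balance across two billing
    # cycles -- even if last month's balance was already paid off.
    return (prev_balance + current_balance) / 2 * monthly_rate

single = single_cycle_interest(200, 0.015)          # $3.00
double = double_cycle_interest(1000, 200, 0.015)    # $9.00
```

Under these assumptions, the cardholder owes $3.00 of interest with single-cycle billing but $9.00 under double-cycle billing, tripling the charge even though the $1,000 from the prior month was paid in full.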
With its many consumer protections, the Credit CARD Act passed Congress with bipartisan support. It was signed into law by President Barack Obama on May 22, 2009.
The future of credit cards
Credit cards have come a long way, and they’ll inevitably evolve as financial payments change.
EMV (“Chip”) Technology
We’re used to the magnetic stripes on our cards, but this technology is highly susceptible to fraud. In recent years, Americans have been introduced to EMV or “chip” cards — technology that the rest of the world has been using for a long time.
EMV cards are difficult to clone and use more advanced encryption, offering a higher level of protection to consumers. Very soon, we’ll do away with swiping our credit cards altogether. Instead, we’ll “dip” our cards, inserting them into chip readers.
This all might be moot, however, in light of big technological advancements in the payments industry. As time goes by, we may not need credit cards at all and instead use our digital wallets.
Many payment-industry experts believe that mobile devices and wearables are on their way to replacing credit cards. In a decade or two, we may use our smartphones for all of our payments. And by midcentury, we may be paying for items with microchips on (or in) our wrists.
A future without credit cards isn’t so far-fetched. We can already make mobile payments through services like Apple Pay and Android Pay. These platforms can be more secure than credit cards, because they don’t transmit card numbers when consumers make purchases. In time, they may become a universal standard — just like credit cards are today.