
Lessons from a Fintech O.G.

Oct 30, 2018

When Phil Meinert began his career in computer programming, core systems were usually a collection of stand-alone applications. The company he worked for served dozens of small banks. Many of these banks would computerize only their deposit applications.

Computer mainframes weighed several tons. Data transmission was in its infancy and involved placing a telephone handset in an acoustic-coupler cradle for a dial-up connection. A courier in a car was the only reliable way of moving checks and reports. Banks in Colorado were small.

Today the banking landscape is very different thanks to the work of Meinert and many other technology pioneers. They were the original fintech innovators, finding ways to make banking automation faster and more efficient by leveraging creativity, new technologies, and more than a little bit of chutzpah.

Meinert was a leader at companies including Bank Services Corporation, an item and data processing provider that was acquired by banking technology firm InterCept, which was in turn acquired by Fidelity.

After retiring from Fidelity, Meinert served as a director of Canon National Bank.

We spoke with Meinert for a glimpse into the first wave of fintech development and to see how the lessons from those early days of banking technology impact us today. If you think it’s hard to implement new hardware and software today, you won’t believe what bankers used to go through.

The First Core Systems

The first banking applications were closely patterned after the manual accounting systems they replaced. The features available were not much more sophisticated than those of the old systems, with a tub of ledger cards sitting next to a posting machine.

Meinert worked for a defense contractor in Colorado Springs. The company had purchased a few banking applications and modified them to run on its Control Data supercomputer, which had excess processing time available. It installed an NCR sorter and item processing software, and a salesman was employed to market the services. On Jan. 1, 1970, the center’s largest bank customer advertised that it was paying simple interest on savings accounts.

One problem: The software did not have that feature.

All available interest routines were based on traditional interest plans, which paid interest on measures such as the minimum balance held after the 10th of the month.
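
To see why this created a scramble, here is a minimal sketch (Python, with invented balances and an assumed 4% rate, none of it from the article) contrasting simple daily interest with a traditional low-balance plan:

```python
from datetime import date

# Hypothetical daily balances for one savings account in January 1970.
# Every number here is invented for illustration.
balances = {date(1970, 1, d): 1000 + 50 * (d % 7) for d in range(1, 32)}

ANNUAL_RATE = 0.04  # assumed 4% passbook rate

def simple_daily_interest(balances, rate, days_in_year=365):
    """Simple interest: accrue on each day's actual balance."""
    return sum(bal * rate / days_in_year for bal in balances.values())

def low_balance_after_10th(balances, rate):
    """Traditional plan: one month's interest on the minimum
    balance held from the 10th of the month onward."""
    low = min(bal for day, bal in balances.items() if day.day >= 10)
    return low * rate / 12

print(f"simple daily interest:  {simple_daily_interest(balances, ANNUAL_RATE):.2f}")
print(f"low balance after 10th: {low_balance_after_10th(balances, ANNUAL_RATE):.2f}")
```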

Meinert had been hired to develop expertise with a newfangled relational database system developed by the University of Texas. Overnight he was reassigned as the savings programmer. He never got back to the database project.

Relying on a computer took a leap of faith. Some early adopters maintained their manual systems in parallel with the core systems until they were confident of the core’s accuracy. A typical conversion consisted of a service center employee standing in a bank on Friday evening, waiting until the bank finished posting its ledger cards, and driving the cards to the data center, where the balances were keypunched and used to update the customer records that had been created in the weeks leading up to the conversion. If all went well, printed reports were delivered to the bank on Monday morning.
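
That parallel run was, in effect, a nightly reconciliation. A toy sketch of the idea (Python; every account number and balance is invented):

```python
# A parallel run in miniature: compare the manual ledger's balances with the
# computer's after posting the same day's work, and list any accounts that
# disagree. All account numbers and figures are invented.

manual_ledger = {"1001": 523.10, "1002": 88.00, "1003": 1412.55}
core_system = {"1001": 523.10, "1002": 86.00, "1003": 1412.55}

mismatches = {
    acct: (manual_ledger[acct], core_system.get(acct))
    for acct in manual_ledger
    if manual_ledger[acct] != core_system.get(acct)
}

for acct, (manual, core) in mismatches.items():
    print(f"account {acct}: ledger says {manual}, core says {core}")
```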

One situation required a bit of subterfuge. Meinert says that a bank president refused to convert his general ledger application. He insisted that his staff continue to use the old posting machine to produce his daily statement of condition. Finally the machine broke one more time, and the vendor refused to fix it. He called the data center wanting to know what they could do.  Unknown to him, his staff had implemented the general ledger on the computer months before. The staff was simply taking the daily data from the computer reports and entering it into the posting machine to print his statement. The morning after his call for help he was presented with the computer reports. He assumed that everyone had made herculean efforts the night before to accomplish the conversion. He was never told differently.

A World of Small Providers and DIY Initiative

The Bank Service Corporation Act of 1962 allowed individual banks to band together for their data processing. These entities were allowed to provide services only to banks.

Under this act regulators had the power to examine third-party vendors, and it decreed that banks were just as responsible for actions their vendors took on their behalf as for their own actions. Inconsistent examination procedures and requirements were part of the motivation for some of the early service providers to form an organization that would help them share their common experiences.

In December 1972 representatives of 19 data processing centers met in Fremont, Ohio, “to consider the advantages of a multi-bank servicing organization.” In January 1973 the Independent Multi-Bank Data Processing Centers (IMBDPC) was born. The independence aspect was reinforced by the bylaws, which prohibited processors owned by a single bank from being members. The name was soon changed to NABS (National Association of Bank Servicers) and later to AFT (Association for Financial Technology).

In 1977 a group of three banks in Colorado Springs formed a cooperative with the very original name of “Bank Services Corporation” (BSC). Meinert joined BSC as its president and only employee. BSC added staff, implemented leased software on a Unisys mainframe, and began servicing the owner banks. It later added three more owner banks and marketed its processing services to other banks throughout Colorado.

An interesting note: the BSC installation was the software vendor’s first without a card reader. All customer input except item processing was done online. The techs who came to install the system had to learn how to start the computer without their usual “Start Deck”.

Meinert had been the AFT representative for his former company. One of his first actions was to add BSC to the growing membership of AFT. During the software and hardware search for BSC's new data center, he relied heavily on other AFT members for guidance and recommendations.

A few years later BSC purchased its own mainframe system from a group of Colorado developers. BSC then worked with Financial Technology Inc. of Chicago in a joint marketing effort to license the software, known as BancPac, to in-house banks and other service centers.

One of the BSC developers, Charles Capel, adapted BancPac to operate in a client-server environment. The transition from traditional mainframes to client-server in the core processing industry is a fascinating story on its own.

Resistance was high. One Wyoming banker came to Colorado Springs for a presentation of BancPac. On his way he stopped in Denver to view a competitor's system. His first comment, after perching one cowboy boot on the toe of the other, was: “I hope you have something to show us. That last guy tried to tell me that someday I was going to run my bank on a (blanked) PC.”

Resistance was evident even on the BSC board. When Meinert asked the board for $60,000 to purchase an IBM iSeries to continue development, it instead approved a $4,000 PC.


Industry Challenges and Changes

Today just a handful of core providers serve the industry, but the field was far more diverse in the beginning. Banks often developed their own software or teamed up to create bank-owned service companies, much like CUSOs today. At one point AFT had 90 core processors as members. (Fun fact: CORE stands for centralized online real-time exchange.)

As mentioned earlier, many core systems were conglomerations of software packages from individual developers. Systematics, located in Little Rock, Ark., was an early developer of banking software that put a customer information file (CIF) at the center of its design. All major banking software systems now employ a CIF.
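
In rough modern terms, a CIF-centered design means every account record links back to a single customer record, so one lookup yields a customer’s entire relationship across applications. A minimal sketch, with invented field names and data:

```python
from dataclasses import dataclass, field

# Toy CIF: each account carries a CIF number that points back to one
# customer record. Field names and data are invented for illustration.

@dataclass
class Customer:
    cif_number: str
    name: str
    accounts: list = field(default_factory=list)

@dataclass
class Account:
    account_number: str
    application: str  # "DDA" (checking), "SAV" (savings), "LN" (loan)
    balance: float
    cif_number: str   # the link back to the customer

cif = {"0001": Customer("0001", "A. Depositor")}

for acct in [
    Account("100-1", "DDA", 2500.00, "0001"),
    Account("200-7", "SAV", 9100.00, "0001"),
    Account("300-3", "LN", -15000.00, "0001"),
]:
    cif[acct.cif_number].accounts.append(acct)

# One CIF lookup yields the whole relationship, across applications:
for a in cif["0001"].accounts:
    print(a.application, a.account_number, a.balance)
```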

While many ancillary applications were developed for specific mainframes, the availability of larger and faster PCs stimulated the emergence of stand-alone applications to fill the gaps. Among the most prevalent early packages were general ledger, accounts payable, asset/liability (A/L) management, document origination for deposits and loans, and collections tracking.

The service provider for a Chicago BancPac customer developed PC software to drive their high-speed check sorter. The designer developed a “big board of spaghetti wiring” to plug into the sorter.

One problem: The sorter was faster than the PC.

An operator sat outside the sorter room window and watched the items scroll across the screen. When the scrolling slowed, he would hold up a sign with the word “STOP” and the sorter operator would stop feeding items until the computer caught up.
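
The hand-held sign was manual backpressure. A minimal sketch of the same idea in modern terms (Python, with invented timings) uses a bounded queue so the fast producer blocks automatically whenever the slow consumer falls behind:

```python
import queue
import threading
import time

# The sorter (producer) is faster than the PC (consumer). A bounded queue
# plays the role of the "STOP" sign: put() blocks when the buffer is full,
# so the producer waits until the consumer catches up. Timings are invented.

items = queue.Queue(maxsize=10)

def sorter():
    for n in range(50):
        items.put(n)   # blocks when the queue is full: "STOP"
    items.put(None)    # sentinel: no more checks to feed

def pc():
    while (n := items.get()) is not None:
        time.sleep(0.01)  # the PC is the slow side
        print(f"processed item {n}")

threading.Thread(target=sorter).start()
pc()
```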

A Transmission Problem

Early core systems faced a big obstacle: affordable, reliable data transmission. Data communication was very slow and very expensive.

Multiple telephone companies were sometimes involved. This could create compatibility problems as well as the inevitable finger-pointing when something malfunctioned.

Serial networks meant that when one terminal in the chain was inoperative, everything downstream was disabled as well. One bad solder connection could cause a few hundred miles of driving and some long nights. Running cables through the basements of century-old buildings in little mining towns could be a daunting task.

One innovative center owner in Mt. Pleasant, Texas, combated the transmission problem by putting a network in each customer bank and having a dedicated courier pick up the database on disk each evening, drive it to the data center, and deliver a refreshed disk back the next morning.

The Power of Paper and Check Processing

The original fintech companies existed in a world where paper drove the industry. Paper was involved in every area. The check drove almost all of the expenses for software, personnel, and hardware. Paper forms were used to collect new customer information and were then keypunched. Paper reports were printed in multiple copies. Printing paper statements at month end and quarter end could produce around-the-clock activity. One Florida software company was able to lure banks to sign on with a buzzword of the time: camera-ready forms. The company would put your logo on its templates so it was easy to get personalized forms printed.

The very early sorters were chain-driven. Operators would put a check in a sleeve and hang the sleeve on the chain. The chain pulled the check through the MICR reader, and then the check was pulled out of the sleeve. (Note: Meinert is not old enough to have actually operated one of these sorters.) The history of check processing leading up to today’s image environment is an adventure in innovation and adaptability.

Closely related to the prevalence of paper and checks was the need for scheduled couriers. Getting checks to the service center and then getting the cash letters to the Fed by the evening deadlines was of extreme importance, especially during periods of high interest rates.

Certain situations required innovative solutions. BSC had contracted with a new bank located 98 miles from the data center. The intent was to employ a proof machine that would transmit the item data to the center; however, the software developer was unable to deliver a working system. On the day of conversion, Meinert bought a car, drove it to the bank, hired a local resident, set up a charge account at a service station, handed the keys of the new car to the new employee, and introduced her to the bank. She drove that night and kept the job for four years. While waiting for the data center operators to process the work, she ran a sorter. When the reports were ready she would head back home, all in one eight-hour shift.

A small center operation in Bellevue, Wash., had to contend with the added difficulty of ferry schedules. It developed a system using its own couriers, who would leave the data center in the morning with the updated reports, drive a circuit radiating out from the center, check into lodging at the end of the route, and then drive back to the center in the evening, picking up input as they went.

What Regulators Thought

Regulatory agencies were somewhat welcoming of new technology, but slow to understand it. Exams didn’t keep pace, with regulators checking the same paperwork and boxes as they had in the past.

Regulators didn’t understand technology, Meinert says. Even his wife expressed security concerns when he showed her their first PC-based “mainframe” sitting on a file cabinet. She wondered what would keep somebody from stealing it. Regulators expressed similar concerns about physical security.

The consolidation of the services industry presented additional regulatory challenges. Meinert spent the last few years of his career as a compliance officer, managing the audit and regulatory processes for all InterCept centers. Many of the small centers had no experience with audits or exams, and it was comforting for them to have a corporate presence during their reviews.

Lessons and Questions

Meinert’s recollections share many parallels with today. The financial services industry is continually implementing new technologies to make operations more cost-effective and efficient. Those technologies include customer-facing products as well as internal products such as regtech.

The most successful of these technologies solve old problems in new ways. It makes me think about:

Risk management. Early adopters of banking software and cores took a risk on technology that lacked full proof of concept, but they were serious about due diligence and about trusting their vendors. So serious, in fact, that they often decided the surest way to trust the entities developing these technologies was to create those entities themselves. They concluded that the risk of failing to keep up with growth opportunities and an increased workload outweighed the risk of replacing manual processes with new technologies.

Fintech relationships. While the first generation of fintech providers worked with or for banks, today they operate in an environment where banks and credit unions are no longer the exclusive providers of financial services. Imagine how the financial services landscape would look today if Meinert’s company had built its mainframe and core for its own use as a financial services competitor instead of for the banking industry.

Vendor consolidation. While there’s a limited set of core providers today, the number of ancillary providers and other third-party vendors is vast. Fintech developers are relying on innovation, technology and creativity to solve new problems and compete for customers. Will these vendors continue to operate independently or will they consolidate in the future?

Regulation and due diligence. Today there is no way bank staff could go behind a bank president’s back and implement a new core system. The board and management must sign off on critical third-party relationships. That means staff has to have proof of concept and make an extremely strong case for any technology investment. I also wonder where staff would have found the budget to take on such a big project behind management’s back.

Thanks to Phil Meinert for sharing his insights into the evolution of core systems and other banking technology.

