Linux Means Business: Linux and Banking

by Josip Almasi

We are a small bank in a small country. Up until two years ago, our management policy regarding software was internal development: standard banking software costs about $1 million, and we were small enough to handle development ourselves. Our servers were Intel-based machines running SCO and Informix SE.

Then management decided to change our business strategy, and soon we started to grow very fast. In less than a year, the number of employees doubled, and so did the number of PCs. Our LAN became a WAN, and the number of transactions increased fivefold and continues to grow. We could not provide efficient software solutions for all the new requests, so the decision was made to buy commercial applications. We also bought two IBM RS/6000 servers running AIX, connected in a high-availability cluster.

A year later, this decision may not seem too smart. We have our own software, which handles 70% of our business, plus a bunch of commercial applications that communicate with each other through shell scripts, 4GL and C programs that perform the needed data conversions. But development takes time, and we cannot deploy a new application without at least three months of extensive testing.
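
Much of that glue is nothing fancy. A sketch like the following shows the idea; the file names and field order are invented for illustration:

    # reorder the fields of one application's export so another one can load it
    awk -F'|' 'BEGIN { OFS = "|" } { print $3, $1, $2 }' \
        /data/export/accounts.unl > /data/import/accounts.unl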

How Linux Fits In

More and more users needed Internet access, and we needed a dial-on-demand router. We didn't want specialized hardware. We wanted to be able to customise logs, schedule access time on a per-user basis and use an existing PC for that purpose.

After a week running NT Workstation with a WinGate demo, we gave up. WinGate used to die two or three times a day. That would not have been a problem in itself, since it was only a demo, but every time it happened the COM port stayed locked until reboot. Obviously, NT did not release allocated resources after process termination. It did not look like a serious operating system to us.

Linux came into our focus by accident. A friend of mine did quite a bit of advocacy and convinced me to try it, giving me precise directions and a Slackware 3.2 CD. Why not? If it works, fine; if it doesn't, it costs me nothing. After all, I am an experienced UNIX administrator; I can do it. So I took a 486 with a 1GB SCSI disk and 16MB RAM and started to play with Linux.

A few hours after inserting the CD into the drive, I had a working proxy, and I still have it. It has been running for a year now. All we have to do with it are the usual administration jobs: adding new users, kernel patches, etc. Furthermore, it became a caching proxy and news server. To be specific, it runs diald, a socks5 proxy, a caching DNS, tcpd, Apache as a caching HTTP proxy and internal web server, and INN as a news server for both local and Usenet newsgroups.
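
To give an idea of the dial-on-demand part, here is a stripped-down diald configuration in the spirit of ours; the device, speed and addresses are placeholders, not our real setup:

    # a minimal /etc/diald.conf: bring the PPP link up whenever traffic appears
    cat > /etc/diald.conf <<'EOF'
    mode ppp
    device /dev/ttyS1             # external modem on the second serial port
    speed 115200
    modem
    crtscts
    connect "/etc/ppp/dial-isp"   # chat script that dials the provider
    local 192.168.1.1             # placeholder addresses for diald's proxy link
    remote 192.168.1.2
    dynamic                       # the real addresses are negotiated by pppd
    defaultroute
    EOF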

My colleagues were very suspicious of Linux, but started respecting it after a few months of uptime.

After the initial experience, I was very pleased with Linux and started to explore. When I discovered the purpose of the iBCS module, I had to try running the SCO version of Informix on it. So I copied Informix, our database and our applications, tested them, and again everything worked on the first try. I even found Linux a better development platform than SCO: it didn't require additional kernel parameter tuning to run complex database queries, it worked faster and was more stable, and it had better development tools.
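
Getting the SCO binaries going is mostly a matter of loading the emulation module and setting the usual Informix environment. Roughly, with paths and the database name as placeholders:

    # load the Intel Binary Compatibility module; the SCO binaries then run as-is
    /sbin/modprobe iBCS                       # insmod iBCS.o with older module tools
    export INFORMIXDIR=/usr/informix          # standard Informix SE environment
    export PATH=$INFORMIXDIR/bin:$PATH
    export SQLEXEC=$INFORMIXDIR/lib/sqlexec   # the SE engine, started per session
    isql stores                               # the SCO build of INFORMIX-SQL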

Well, it's not quite fair to compare Linux to an obsolete product (SCO 4.2), but I tested concurrent queries and updates on millions of records, shut down the machine in the middle of work, killed system processes such as kswapd, tested power-off behaviour and even pulled the IDE disk cable off a running machine. Although I got database corruption, I never managed to produce unrecoverable file system errors; when fsck failed, debugfs worked. For the record, we did have unrecoverable file system errors on SCO. The result is that we now use Linux as our development platform.

Switching to Linux may not be easy for a Windows user, but we are used to working in a UNIX environment. We have set up Linux on a PC as a server for development and testing purposes, and on three developers' PCs instead of Windows. We used Windows mostly as a task switcher for a few terminal sessions and for printer sharing; the only reason we used it was that it came pre-installed on every PC. Linux gave us additional possibilities: we now have our native development environment on every PC, with tools such as Emacs, ddd and other good stuff. Since we are used to GNU tools, we installed them on both AIX and SCO. From a user's point of view, our AIX looks more like Linux now.
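
Installing the GNU tools on the commercial systems is straightforward; the usual cycle for, say, bash looks roughly like this (version and prefix are just an example):

    gunzip -c bash-2.03.tar.gz | tar xf -
    cd bash-2.03
    ./configure --prefix=/usr/local   # the same recipe works on both AIX and SCO
    make
    make install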

It's hard to believe that you can get a good development environment free when you used to pay thousands of dollars for one, but it's true.

When our users wanted to share Windows files across the WAN, we didn't even consider an NT server, as Linux had already proven its reliability. So we installed Samba, acting also as a WINS server. We did not set it up on the AIX servers, because they handle our critical applications and we just don't want to experiment on them. Today we use Linux/Samba for printer sharing too, as it gives our UNIX systems access to Windows print servers, and Windows workstations access to UNIX printers. Basically, when a user wants to print something from UNIX, he can either print to a dot matrix or laser printer in the system room, or to a printer in his room, which may be connected directly to his PC or shared from Windows. When he chooses a local printer, the appropriate Windows print queue is determined by his IP address.
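
The relevant parts of our Samba setup are nothing exotic. A trimmed-down sketch, with the workgroup name and the map file invented for the example:

    # smb.conf fragment: act as WINS server and export the UNIX print queues
    cat >> /etc/smb.conf <<'EOF'
    [global]
       workgroup = BANK
       wins support = yes
       printing = bsd
       printcap name = /etc/printcap
       load printers = yes

    [printers]
       path = /var/spool/samba
       printable = yes
       guest ok = no
    EOF

    # the other direction, one way to do it: a small wrapper that looks up the
    # user's Windows printer share by the terminal's remote address in a map
    # file of our own, then sends the job there with smbclient
    HOST=$(who am i | sed 's/.*(\(.*\))$/\1/')
    SHARE=$(awk -v h="$HOST" '$1 == h { print $2 }' /etc/winprinters)
    smbclient "$SHARE" -N -c "print $1"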

When we realized Samba's possibilities, we wanted to implement centralized backup, based on the following idea: all users have accounts on the Linux box, their Windows boxes mount their home directories upon startup, and all their documents are saved on Linux. Most of our users don't even know where their files reside; they just click on an icon and use the application, so all we had to do was change the working directory of the Windows applications on every PC.
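
On the Samba side this is just the standard [homes] mechanism; a sketch, with the server name as a placeholder:

    # smb.conf fragment: every user gets his home directory as \\linuxbox\username
    cat >> /etc/smb.conf <<'EOF'
    [homes]
       comment = Home directory
       browseable = no
       writable = yes
       valid users = %S        # only the owner of the home share may connect
    EOF

    # on the Windows side each user maps the drive once, for example:
    #   net use H: \\linuxbox\josip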

This policy has some other security benefits. First, only the logged-in user can read his documents. Second, we receive some documents from the National Bank and other government organizations as Microsoft Office files. Once we even received a macro virus with them; it damaged only the user's local disk, but the user lost some important files. Before that happened, we had not been worried about viruses, since they usually arrive with cracked games, but now they may arrive with business data. This way, we can also do periodic virus scans, and file permissions are set so that a user can write only to his own home directory, and a potential virus can't spread around. But we haven't fully implemented this yet, because we have to change every PC's network settings to mount the Linux drive and move all the files to it. We also have trouble forcing our users to log onto Windows with their Linux user names and passwords, and explaining to them how to change the Windows password every time their Linux password expires.
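
The permission part is a one-time job on the Linux side; something along these lines is all it takes:

    # make each home directory readable and writable by its owner only, so a
    # macro virus running as one user cannot touch anyone else's documents
    for u in $(ls /home); do
        chown "$u" "/home/$u"
        chmod 700 "/home/$u"
    done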

One day, we received a request to allow access to an old Clipper application from a remote location. If we had just installed it on the remote PC and mapped a network drive to our machine, the Clipper database would have gone over the modem line, so we would have had to increase bandwidth. More importantly, we would have risked database corruption, and it's the same story with viruses, since we have no physical control over remote locations. Linux took on a role again: we installed DOSEmu, the DOS emulator. A few small problems arose. Clipper is never idle, so DOSEmu takes all available CPU time. Since we have only a few Clipper users, we just run DOSEmu at a lower priority. This way, only screens travel over the network, and with ttysnoop we can see exactly what a user is doing. Also, there's no database corruption when Windows dies, because Linux doesn't die.
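
The remote users simply get a launcher script instead of a shell. A minimal sketch, where the binary name and the niceness value are assumptions rather than our exact setup:

    #!/bin/sh
    # start DOSEmu at reduced priority so the busy-looping Clipper application
    # cannot starve the rest of the system; AUTOEXEC.BAT inside the DOS image
    # then starts the Clipper program itself
    exec nice -n 10 dos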

Linux has made our life much easier, especially on the Monday morning when the hard disk in our SCO server didn't wake up. We had no up-to-date database on our backup server, since the entire bank had moved to a new location the previous weekend. The backup server had been moved first, so there had been no overnight database copying. We had a tape backup, but it takes at least an hour and a half to restore. We had the Informix log file, which takes at least as long to roll the database forward. Luckily, we had an up-to-date database on our development server; unfortunately, copying it over the network takes about an hour. Our customers were waiting, so we chose to run the bank on Linux for a day. The server was not really a server but a PC with a P133, 16MB RAM and an IDE disk, so there was a lot of swapping, but most users didn't notice any difference.

Today, Linux does quite a good job for us, and so do the GNU programs. Our complete development is done on Linux. C programs are then recompiled with the GNU C compiler on either SCO or AIX; 4GL programs are just copied to the proper place. Bash became our standard shell on both AIX and SCO. CVS takes care of version control. Communication with remote locations is also handled by Linux, and so is printer spooling. With the proxy, the internal web server and news, Linux plays a significant part in our work. However, it's not yet ready to be used as the database and application server for our purposes, because it still lacks a few features needed by our mission-critical applications: high-availability cluster software and a journaling file system. We cannot afford any data loss, and that's the main reason we have chosen AIX for certain applications.
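
A typical development cycle therefore looks roughly like this, with host names and module names invented for the example:

    # edit and test on Linux, keep history in CVS, build on the target system
    cvs -d /usr/local/cvsroot checkout reports
    cd reports
    emacs monthly.c                        # develop and test locally with gcc
    cvs commit -m "fix rounding in monthly report" monthly.c

    # then rebuild on AIX (or SCO) with the same GNU compiler
    rcp monthly.c aixhost:/usr/src/reports/
    rsh aixhost 'cd /usr/src/reports && gcc -O2 -o monthly monthly.c'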

At the end of this year, we plan to migrate our database, applications and development completely to AIX, so we'll stop using SCO, but Linux will stay. It works very well and has greatly improved our security and administration.

Josip Almasi (jalmasi@partner-banka.tel.hr) has been in database design/programming for ten years, and in Novell and UNIX system administration for the last five years. He is currently working as a database/system administrator at Partner Bank, Croatia. His hobbies are Aikido and blues harmonica.