Chris Fieggen – 1960s Pioneer Programmer
This is an historical account about my father, Chris Fieggen. Let's begin with a fairly recent photo of me and Dad, taken on his 86th birthday:
I really like how this photo shows the obvious family resemblance, plus gives a glimpse of our similar personalities and the close connection that we'd long shared.
It's also probably the last nice photo of the two of us together before Dad gradually slipped away, his mind and memories slowly eroded by dementia over the next five years. He died on the 21st of March 2022, aged 90 – and not even knowing who I was.
Even worse, Dad had forgotten what an amazing person he was! This seems all the more cruel when I recall what a gifted mind he once had.
Among Dad's many extraordinary accomplishments was that he was one of the pioneering computer programmers for the Australia and New Zealand Bank (ANZ Bank) back in the early 1960s.
Viewed from today's perspective, those first computer programs might easily be perceived as trivial. To do justice to what Dad and his fellow programmers actually achieved, I'll try to paint a picture of the era, the computer, the software, and what was actually involved.
Fortunately, Dad and I had often discussed the subject, having myself been involved in the subsequent adoption of personal computers by small businesses in the early 1980s. Even more fortunately, Dad had kept a stack of artefacts from the time. Finally, I was able to draw from Dad's own memoirs as well as Mum & Dad's family letters.
Here, then, is an account of Dad's early programming years with ANZ Bank – which probably mirrors the experiences of many of his colleagues from that original team.
To you, the reader
If you're reading this account, you're probably using a modern-day computer. I'll probably take you out of your comfort zone by using 1960s-era terminology. Technical terms like “core memory”, “octal”, “coding sheets” and “punched cards” are now long obsolete – most of us have never heard of them. Yet those same terms were so new back in the 1960s that Dad and his colleagues had never heard of them either.
If you sometimes feel overwhelmed, then I've succeeded in giving you a taste of what they would have experienced.
The ANZ Announcement
In 1963, Dad was working at the ANZ Bank's main New Zealand branch in Lambton Quay, Wellington:
Dad had recently been promoted to Internal Audit Officer. Although this was a highly responsible position, it offered little intellectual challenge. To quote Dad:
“This [appointment] heralded a life of utter boredom.”
That all changed when the ANZ Bank issued the following 4-page circular, dated 28-May-1963, to all staff throughout Australia, Papua & New Guinea, New Zealand and Fiji:
A. N. Z. BANK – G/M CIRCULAR
GENERAL MANAGER'S OFFICE,
394 COLLINS STREET,
Last number to Fiji 50/63
Last number to New Zealand 48/63
AUTOMATION IN A.N.Z. BANK
The last twenty-five years, as most senior officers and lady clerks will be aware, have seen many developments in the handling of our voucher traffic. This has progressed from the hand posting of ledgers and pass books through to the modern machinery and procedures used in our larger branches today.
The continued expansion of the Bank's business, the increase in voucher traffic amongst our existing customers and the cost of servicing this business have brought about the need to find ways and means of handling the volume at economic levels with, if possible, an improved service to customers.
Most of our staff will have read of the use of computers to solve the intricate scientific calculations of this modern age, of their increasing use in banks throughout the world and, in the last two years or so, of their use in Australian and New Zealand industry.
The Australian and New Zealand banks through their respective Bankers' Associations have been conducting extensive research into the use of computers. The recent issue to Australian branches of the Common Machine Language Booklet (vide G/M Circular 44/63) is an example of this co-operative research. (For the information of our New Zealand and Fijian branches this booklet sets out for the guidance of printers, customers and staff the technical requirements for printing magnetic ink characters on vouchers.)
2. THE DECISIONS ALREADY MADE
After extensive examination in General Manager's Office into all aspects, the Board of Directors has agreed that we purchase a computer from the General Electric Company of America. This company has been connected with work in the computer field since the introduction to banking of this aid in the early 1950's, commencing with the Bank of America which is currently using 32 of the General Electric Company's computers.
The equipment will be installed in extensions about to be made to the premises of Methods Department, General Manager's Office, at South Yarra.
The first application planned to be undertaken will be the centralisation of all batching and remitting functions of branches in the metropolitan area of Melbourne, and this will be followed by centralisation of the ledger operations of those branches.
3. WHAT THE DECISIONS MEAN AND THEIR EFFECT
(i) Branch Aspect
Little or no effect will be felt by branches outside the Melbourne metropolitan area.
All current accounts in that area will eventually be numbered and cheques will be encoded with magnetic ink character recognition (MICR), and personalised, and branches will therefore observe the gradual introduction of these vouchers during the next twelve months.
Managers and Accountants of those branches will be freed from much of their duties connected with voucher traffic and the staffing of batching and ledger departments. Management information will be available sooner and will be more accurate. There will be savings in space at all branches concerned and complaints from customers about inaccurate or untidy work will be eliminated. Many of the returns, reports and statistics that are now done by hand will be performed by the computer.
In general, Managers at those branches will be able to devote more time to managing and developing their businesses.
(ii) Staff Aspects
It is considered desirable that the opportunities for advancement arising out of the abovementioned decisions should be available to our staff throughout Australia, Papua and New Guinea, New Zealand and Fiji.
The purpose of this circular is therefore not only to advise staff of the decisions reached but to seek applications from staff who may be interested in occupying the first few positions that have been created by this development.
At the same time it is emphasised that there will be no retrenchments of staff because of the computer installation and that any resultant revisions in Authorised Staff Complements will be effected without detriment to the present position of individual members.
As planning proceeds, more opportunities will be available to staff in this specialised field and further advices will issue from time to time as opportunities occur.
Enclosed with this circular is a number of application forms. One of these copies is to be filed with this circular. Additional copies of the circular and application form are forwarded to the larger branches to facilitate advices to interested personnel who are now on leave. Further copies of the circular and application form may be obtained from Divisional Office and General Manager's Office if required.
The positions available in the Data Processing Section of Methods Department, General Manager's Office and for which applications are now invited are :
Supervisor Central Proof – 1 male officer
Supervisor Printer Liaison – 1 male officer
Audit Officer – 1 male officer
Public Relations Officer – 1 male officer
Analyst/Programmers – 5 positions (male or female)
Tabulator Operators – 6 male officers
In all positions referred to, attitude is as important as aptitude. We are aiming to build a closely knit team of people who are interested and enthusiastic in seeing a challenging task implemented in the most efficient and economical way.
Applications may be submitted by any officer (or any lady clerk – in the case of Analyst/Programmers – who is interested in a career post) who has the interest to do this work and considers he has the necessary qualifications listed on the reverse of the application form. The foregoing includes Branch Managers and Accountants and staff already occupying specialist jobs and those who have recently taken up new appointments, subject to the age limits indicated.
Because of the highly specialised nature of the work and the special attributes successful applicants will require to bring to the tasks ahead, particular attention will be paid to the question of emoluments. These will be established at a level higher than normally related to officers in the same age groups and will be commensurate with the particular responsibilities attached to each post.
The intention is that the successful candidates will receive a worthwhile increment on appointment to the particular post. Thence forward the degree of enthusiasm, adaptability and progress displayed will determine the rate and nature of advancement. It is emphasised that it is the intention of the Bank that selected personnel, once they demonstrate complete suitability and competence, will find work in the specialised computer field a well worthwhile and rewarding occupation.
Two copies of the application form are to be completed in the applicant's own hand writing and forwarded in terms of the head note instruction on the form to reach General Manager's Office, Methods Department, South Yarra and Divisional Office, respectively, on or before 18th June, 1963.
On receipt of applications in General Manager's Office each will be examined thoroughly by a small committee and some culling of applications will necessarily take place at this point.
Selected applicants (to this stage) will be required to report at Divisional Office, on a date to be advised, for a personal interview with officers from General Manager's Office. These people will be given an aptitude test whilst at Divisional Office.
When results of these interviews and aptitude tests from all who apply are compared, a number of personnel (except those applying for positions as Analyst/Programmers and Tabulator Operators) will be brought to Melbourne for further interviews and ultimate selection.
All staff concerned will be advised through Divisional Office as to their progress at each stage of selection.
Those applying for positions as Analyst/Programmers will be required to undertake two courses with the Australian General Electric Company before being finally selected. Although only five Analyst/Programmers positions are available, it is expected that about twelve to eighteen people, as deemed necessary, will undertake the two courses.
No discredit will attach to any applicant who is unsuccessful and all applications will be treated as confidential.
Successful applicants will be given one month's notice of transfer to Melbourne.
This date is expected to be about 20th August, 1963 in the case of the first four posts listed, and early 1964 in the case of those selected as Analyst/Programmers. Staff selected for Tabulator Operators will be transferred to Melbourne as required. All Analyst/Programmers and Tabulator Operators will be on probation for six months having regard to aptitude.
Personnel appointed to the various posts will have opportunities for advancement in the Melbourne Data Processing Centre and in any other similar installation which may be established in the future. Entry into this new and expanding field will not preclude staff being considered for re-entry into other avenues in the Bank. All staff will be given a thorough introduction to computers by highly qualified General Electric staff.
5. EDUCATION OF STAFF AND PUBLIC
We are conscious of our responsibilities to staff and customers to keep all informed of progress and it will be noticed one of the positions involved is that of a Public Relations Officer.
This education will apply to staff at all levels and will follow through to our customers and the public generally.
The acquisition and installation of a computer will be a major occurrence in the history of the Bank.
Our intention is to tackle the associated problems with vigour and determination so that the benefits to the staff and the Bank as a whole will be obtained as soon as possible, but with due regard to the technical and planning difficulties that have to be overcome and the need for care in selecting a team of personnel with particular aptitude for the work.
Managers are asked to ensure that all staff read and initial this circular promptly and that those who may be interested are encouraged to make application.
I'm fascinated by some of the quaint language in the circular, such as the way that computers were being used “...to solve the intricate scientific calculations of this modern age”. Computers of this era were vastly expensive machines that typically filled a good-sized room and required a number of dedicated staff to operate them. They were therefore usually only adopted by large companies that performed sufficient “intricate...calculations” to justify such a huge upfront and ongoing investment.
Good-old “Numbers business”
Banks in particular could see the enormous potential of computers. The majority of their business involved dealing with numbers. And these were not the simple “dollars and cents” that we take for granted today but the old fashioned “pounds, shillings and pence” that were still in use back then in Australia and New Zealand. These in turn were derived from the “librae, solidi, and denarii” developed by the ancient Romans! Twelve “pence” to the “shilling” and twenty “shillings” to the “pound” made calculations pretty complicated.
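To see just how fiddly those carrying rules were, here's the arithmetic in a few lines of modern Python – purely illustrative, of course; Dad's generation did all of this in their heads or on paper:

```python
# Minimal sketch of pounds/shillings/pence ("£sd") arithmetic:
# 12 pence (d) to the shilling (s), 20 shillings to the pound (£),
# so carrying happens at 12 and again at 20 - not at 10.

def lsd_to_pence(pounds, shillings, pence):
    """Flatten a £/s/d amount into total pence."""
    return (pounds * 20 + shillings) * 12 + pence

def pence_to_lsd(total_pence):
    """Break total pence back into £/s/d, carrying at 12 and 20."""
    pounds, rem = divmod(total_pence, 240)   # 240 pence per pound
    shillings, pence = divmod(rem, 12)       # 12 pence per shilling
    return pounds, shillings, pence

# Adding £3 15s 9d and £1 7s 6d:
total = lsd_to_pence(3, 15, 9) + lsd_to_pence(1, 7, 6)
print(pence_to_lsd(total))  # (5, 3, 3), i.e. £5 3s 3d
```

Even this trivial sum needs two different carry bases – now imagine balancing an entire ledger column of such amounts by hand.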
Even more amazingly, those very banks were still using traditional paper ledgers to maintain all records of all transactions. And this was the 1960s! It's hard to imagine scores of staff – like my Dad – whose job involved manually adding and subtracting all those fiddly imperial currency amounts by hand in paper ledgers:
And yes, it's true that adding machines were available for staff to use. But in yet another example of the archaic banking industry mindset – most staff didn't use them! Here's a delightful excerpt from Dad's memoirs, describing his very first day at the ANZ Bank in 1955:
“One of the first jobs I was given was to check additions of money columns in some ledger or other. I had to really try hard to remember whether it was 12 shillings to the pound and twenty pence to the shilling, or twelve pennies to the shilling and twenty of those to the pound!
“But the first column I checked proved that it was the latter, and so I merrily worked away at this for a while until I discovered an adding machine. When I asked if I could use that, the answer was a somewhat surprised "yes". Only later did I discover that using adding machines and typewriters was regarded as girls' work!”
Time for a change
Fortunately, all of this was set to change. Australia was planning to switch to decimal currency in Feb-1966, with New Zealand to follow in Jul-1967.
The timing of the May-1963 announcement of the ANZ Bank's plans to introduce computers was thus no coincidence. It was hoped that three years would be sufficient to get the new systems up and running before the impending switch to decimal currency.
Imagine the difference: Converting thousands of customers' account balances from “pounds, shillings and pence” to “dollars and cents” – manually – would be an excruciating task – and almost guaranteed to contain errors. If, instead, all of those balances were already on the system, a carefully written and pre-tested computer program could perform all of the same conversions in hours instead of weeks – and with absolute accuracy.
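For the curious, here's a minimal modern sketch of that conversion, assuming Australia's changeover rate of £1 = $2 (so one shilling became 10 cents and one penny 5/6 of a cent) and simple round-to-nearest-cent handling – the bank's actual rounding rules are an assumption on my part:

```python
from fractions import Fraction

# Hedged sketch of the 1966 Australian decimal changeover arithmetic.
# The dollar was defined as ten shillings (£1 = $2), so each penny
# became exactly 5/6 of a cent; odd pence therefore forced rounding.

def lsd_to_cents(pounds, shillings, pence):
    """Convert a £/s/d balance to whole cents (round to nearest)."""
    total_pence = (pounds * 20 + shillings) * 12 + pence
    exact = Fraction(total_pence * 5, 6)   # pence -> cents, exactly
    return round(exact)                    # nearest whole cent

# A balance of £123 7s 6d:
print(lsd_to_cents(123, 7, 6))   # 24675 cents, i.e. $246.75
```

Run once per account balance, a loop like this really could convert an entire branch's ledgers in hours rather than weeks.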
The 1960s were certainly shaping up to be an exciting time!
It's nice to read that the ANZ Bank chose to: “...seek applications from staff who may be interested in occupying the first few positions...”. Actually, it made good business sense to begin their recruitment in-house. Computers themselves were so new in the 1960s that there was no real “pool” of computer programmers. It was ultimately easier to find existing bank staff with the required smarts and to train them in the new field of computer programming than it was to try to recruit pre-trained programmers – let alone ones with relevant banking experience:
APPLICATION FORM – Complete in own Handwriting –
See Reverse for Job Descriptions and Qualifications.
Despatch under Confidential Cover to reach each Division no later than 18/6/63.
Original to: Admin Office – Data Processing, P.O. Box 39, South Yarra, Victoria
Duplicate to:
- M351, M394, Su. and Lta. Branches and G/M/O Depts. – Your Branch Manager or Departmental Head
- All other branches – Your Divisional Office – Staff Dept.
- Date of Birth / /19
- Entered Service / /19
- Joined Branch/Dept. / /19
- Branch/Dept.
- Position in Branch/Dept.
- Salary £
- Marital Status
- Number of
STATE EDUCATIONAL CERTIFICATES OBTAINED AND SUBJECTS PASSED
(Junior Public Certificates/Intermediate/Leaving Certificates and/or Matriculation)
STATE TRAINING COURSES UNDERTAKEN
- Particulars of studies currently being undertaken
- State Previous Branch/Administrative Experience (Brief Details only)
- Do you own (or are about to purchase) your own home?
- Are you prepared to be transferred to Melbourne?
- State positions occupied in community activities
- State Sporting activities
- State First and Second Preferences of positions sought
- State reasons for wishing to enter this field of banking
- List Major Illnesses in last 5 years
Positions Available and Qualifications Required
Minimum Qualifications Required for All Positions
(Note: Additional Qualifications are indicated for certain positions – see below)
POSITIONS AVAILABLE – with Job Descriptions and Job Qualifications (additional to those above)
SUPERVISOR CENTRAL PROOF
(One Male Position)
Age limit – 28/45.
Sound Branch experience and experience in Clearing Department or Proof work at large City Branch, an advantage.
SUPERVISOR PRINTER LIAISON
(One Male Position)
Age limit – 28/45.
Proven ability in any of the specialised fields, such as, Work Measurement, Methods Department, Stationery and Form Design would be an advantage.
AUDIT OFFICER
(One Male Position)
Age limit – 35/50.
A.B.I.A. or other higher qualification.
Sound knowledge and experience of all Branch Accounting.
Experience in Auditing would be desirable but is not essential.
PUBLIC RELATIONS OFFICER
(One Male Position)
Age limit – 28/45.
A definite interest in English literature and ability in English expression.
Natural flair for this work.
Ability to express himself freely.
Sincere liking for and interest in people.
Experience in debating would be desirable but is not essential.
ANALYSTS / PROGRAMMERS
(Five Positions – Male or Female)
Note: Title – Trainee Programmer (6 months), proceeding to Junior Programmer and then Programmer.
Applications will be considered from career Lady Clerks.
Age limit – 20/45.
At least 3 years Bank experience.
Experience in Current Account work is essential.
An interest in and knowledge of data processing techniques and systems approach would be an advantage but is not a requirement.
High School mathematics, at least to Intermediate or equivalent standard, required.
TABULATOR OPERATORS
(Six Male Positions)
Note: This work is a valuable training ground in Data Processing.
Age limit – 19/28.
Willing to work on shift basis (penalty rates of pay apply).
For Dad, this advancement sounded very enticing. Having experienced first-hand the tedium of manually tallying numbers, the idea of automating such tasks via computers seemed like gold. Plus the role of “Analyst/Programmer” appeared to offer far greater intellectual stimulation than his current position as “Internal Audit Officer”.
So Dad applied – and ended up being one of 194 applicants for those new positions.
As outlined in the initial staff memo, those applicants would then be subjected to far more hurdles than one typically encountered when applying for a new job in the 1960s. For the Analyst/Programmer positions in particular, the ANZ Bank needed to figure out whether applicants had the required analytical and mathematical skills.
The next staff memo that Dad kept was dated 15-Aug-1963, in which the ANZ Bank outlined their progress to that point. Interviews had been conducted, aptitude tests had been taken, eight staff had been appointed to various technical and administrative roles, one had been appointed from the General Manager's Office as Systems Analyst/Programmer, and 17 staff deemed suitable for the remaining Analyst/Programmer positions had been selected to attend two training courses.
As you can probably guess, Dad was one of those original 17 selections. Dad's next big adventure was just beginning.
Flying to Australia
At this early stage, the ANZ Bank didn't yet have its own computer, so the introductory training was to take place in Sydney, Australia. On Saturday 16-Nov-1963, Dad farewelled his wife and four kids at Wellington airport, flying to Australia along with three of his New Zealand colleagues. That was pretty exciting in itself – first-class overseas air travel in the 1960s!
After arrival in Sydney, Dad and his fellow travellers met up with the rest of the trainees. Aside from one colleague who lived in Sydney, the remaining 16 settled into a hotel in Kings Cross, which Dad described as “...the entertainment centre of Sydney.” They had a bit of free time in the evenings and on weekends to eat out, see the sights and shop for souvenirs – but the focus of the next two weeks was to get to grips with this new technology and to see if they had what it took to be programmers.
The two training courses were held at the offices of General Electric (G.E.) in Sydney, Australia, and were presented by an American instructor, C.B. (“Buck”) Spooner.
Training Course 1 – “Introduction to Computers”
The first three-day course was devoted to introducing the computer. The trainees were taught about the computer's various components and their internal workings and how everything interacted.
The system selected by the ANZ Bank was a General Electric GE-225. It consisted of a large number of interconnected cabinets, each of which had a particular discrete function, such as inputting, processing, storing or outputting data:
Remember, this was still the early 1960s. Integrated circuits (ICs) hadn't yet been invented – let alone computer chips. It would be several more years before computers became small enough to accompany man to the moon (as they did in the late 1960s).
In the above photo, the large, wardrobe-sized cabinet right in the middle is appropriately named the “Central Processor”. That one cabinet contained more than 30,000 individual semiconductor components plus a massive array of almost 200,000 tiny memory “cores” (more on those later). Little wonder that the Central Processor unit weighed almost a tonne.
By contrast, today's equivalent “Central Processing Unit” (CPU) would be a single computer chip smaller than the fingernail on your pinkie!
The other cabinets in the above photo include (from left to right):
- “Disc Storage Unit”, for storing programs or data;
- “Document Handler”, for reading and sorting magnetically-encoded documents, such as cheques or deposit slips;
- “Datanet-15”, for transmitting and receiving remote data (like a “modem”);
- “High-speed Printer”, for outputting program listings or printing cheques;
- “Card Punch”, for creating cards containing programs or data;
- “Auxiliary Arithmetic Unit”, for performing faster calculations on larger numbers;
- “Console Typewriter”, for typing out instructions and messages to the operators;
- “Magnetic Tape”, for bulk, long-term storage of programs or data;
- “Card Reader”, for high-speed input of cards containing programs or data;
- “Paper Tape Reader-Punch”, for slower input/output of tapes containing programs or data.
The cabinets were all housed in a spacious, climate-controlled and sound-proofed room. One of my own earliest childhood memories was the amazing experience of visiting Dad's work to see the mystical “computer”. I can vividly recall my surprise at how noisy it was inside that room!
In fact, programming was almost never done in the same room as the computer. With so many electro-mechanical input and output devices whirring and clattering away, it would be nearly impossible to concentrate.
This computer was something totally new to every one of the trainees. None of them could draw on past experience or expertise; instead, they would need to rely on their intellect as they developed the new knowledge and skills required to actually program this beast.
Training Course 2 – “Computer Programming”
The second training course ran for the next seven days. This was where things got way more complicated – even for a bunch of clever people.
I still have Dad's meticulous hand-written notes from those training courses. The extent of technical information that was covered is pretty surprising. Although this account is trying to give you the flavour of what Dad and his fellow trainees experienced, I'll spare you that level of detail:
Consider also that the whole concept of a “computer” was totally foreign to most people in the 1960s. The way that everything was done via computer was quite different to the way they had previously done anything.
First and foremost, although humans are quite comfortable working with pen and paper, computers work with “data” stored in “memory”. The trainees had to learn how to convert the information with which they were familiar – such as names, account numbers and balances – into computer format (via various input devices), process it (via the computer), and output the results (via various output devices).
- Effectively, they had to re-learn how to “read” and “write”.
The next challenge for the trainees was to learn two new number systems. They already knew how to use the “decimal” number system (for everyday arithmetic) plus the somewhat more complicated and archaic “imperial” system (for currencies). They now had to learn how the computer stored and performed arithmetic on numbers in “binary” (the “0” or “1” bits that could be stored in computer memory) and “octal” (a shorthand in which each group of three bits is written as a single digit from 0 to 7, making long strings of bits much easier to read and copy).
- Effectively, they had to re-learn how to do arithmetic.
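For today's readers, the binary/octal relationship is easy to demonstrate in a few lines of Python – illustrative only, since the trainees worked it all out with pencil and paper:

```python
# Octal is simply binary read three bits at a time: each group of
# three bits becomes a single digit from 0 to 7. A long machine word
# is far easier to copy down correctly as octal than as raw bits.

word = 0b010_111_000_101_001_110   # 18 bits, grouped in threes

print(format(word, 'b'))   # the raw binary digits
print(format(word, 'o'))   # one octal digit per 3-bit group: 270516
```

Writing “270516” on a coding sheet is much less error-prone than transcribing eighteen ones and zeroes – which is exactly why programmers of the era preferred octal.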
In addition, although the trainees were used to writing or typing on paper and storing what was written in “files” or “ledgers”, they were now being taught how programs and data were written onto “coding sheets”, from which key punch operators would create “punched cards” or “punched paper tapes”, and which could be more permanently stored on “magnetic tape” or “magnetic discs”.
- Effectively, they had to re-learn how to store and file.
Finally, although the trainees were well versed in “procedures” – and most of them could teach those procedures to others – it was another matter to “teach” them to a computer. They firstly had to define a procedure as a sequence of logical steps and decisions. These then had to be broken down further into the hundreds (sometimes thousands) of individual instructions that could be performed by a computer.
- Effectively they had to learn how to write the “recipe” for any procedure – plus translate that recipe into computer language.
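To give a flavour of that first stage, here's a hypothetical, modern-day sketch of a familiar bank procedure restated as an explicit sequence of steps and decisions. The names and rules are invented for illustration – they are not ANZ's actual logic, and the real programs broke such recipes down much further, into hundreds of machine instructions:

```python
# Hypothetical "recipe" for posting a cheque to an account, written as
# the kind of step-by-step logic the trainees had to learn to produce.

def post_cheque(balance, overdraft_limit, amount):
    # Step 1: validate the cheque amount.
    if amount <= 0:
        return balance, "REJECT: invalid amount"
    # Step 2: decision - would this debit overdraw past the limit?
    if balance - amount < -overdraft_limit:
        return balance, "REJECT: refer to manager"
    # Step 3: post the debit and report the new balance.
    return balance - amount, "POSTED"

print(post_cheque(balance=500, overdraft_limit=100, amount=550))
# (-50, 'POSTED')
```

A teller applies such rules without thinking; the programmer's job was to make every step and every decision explicit enough for a machine to follow.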
Although the trainees did do some actual programming during the course, these were only tiny exercises done on paper, not full programs that would have any worthwhile use. It's like they were being shown how a piano works and how to form pleasing-sounding chords – without ever getting a chance to actually play their own tune on the piano!
Only after the final selections had been made would the successful candidates then begin the challenge of converting their knowledge of banking procedures into a suite of banking programs.
The training course was significant not just for the trainees. As Dad wrote (to relatives in The Netherlands):
“The big boss from Melbourne, one of the people who will make the end-selection, joined us on Wednesday and Thursday to get to know us a bit better.”
The following group photo (with Dad standing in the very centre!) was thus likely taken on one of those two days, as it includes Dad's future boss, Jim Paton, seated just to the right of front-centre, with G.E.'s instructor, Buck Spooner, seated just to the left of front-centre:
The period of the course was significant for another reason, as Dad wrote in his memoirs:
“The course took place in the fortnight that President Kennedy was assassinated (22-Nov-1963), and as is often said, everybody remembers where he was when that occurred. And I certainly did.”
At the conclusion of the course, Dad and his New Zealand colleagues flew back home on 30-Nov-1963. It had been an intense and challenging but nonetheless fascinating two weeks.
Final Staff Selections
When Dad returned from the training course, he was feeling quite optimistic that he had the required skills and was well suited to the role. And even if he wasn't selected, the whole trip had still been a marvellous experience.
On Christmas Eve 1963, the moment of truth finally arrived. Dad and a fellow applicant, Bob Wheeler, were called into their branch manager's office and given an early Christmas present – the news that both of them had been accepted as Analyst/Programmers:
For a recap of the whole selection process, which spanned seven months between May and December 1963, here's a brief summary that Dad wrote to his brother-in-law in January 1964 (translated from Dutch and amended with exact numbers drawn from an ANZ Bank memo):
“For these positions (at first only 5, but later 8) the bank received 194 applications, of those 149 were interviewed and they all had to submit to an IQ test, specially oriented to mathematical aptitude. Of those 149 there were 17 left, and those 17 had to go to Sydney to do this course, and out of those 8 have been selected, and I am one of them. So you can see it was not exactly a recruitment action by the bank, they selected this method of application for these new positions because it was a whole new aspect in the banking world. Only one other bank in Australia and none in New Zealand as yet have computers, so they wanted to be assured of the best. It is starting to sound as if I am blowing my own trumpet, I don't want it to sound like that.”
Well, if Dad was too humble to blow his own trumpet, I'll do it for him. Do the maths and you'll find that Dad and his fellow recruits were just 8 out of 194 applicants – putting them in the top 4%. Which doesn't surprise me – considering that Dad was also a long-time member of Mensa, membership of which requires an IQ score in the top 2% of the population.
It's also worth noting that the ANZ Bank was footing the bill for relocating any staff who weren't already living in Melbourne, Australia. At that time, Dad was living in New Zealand, along with a wife, four kids and a houseful of belongings. Dad's relocation would likely have been the most costly of all of those selected. It seems that no expense was being spared to get the top candidates.
Dad was certainly clever – and he would soon be working with some similarly clever colleagues in the exciting new field of computer programming.
Migrating to Australia
This new chapter in Dad's life carried with it a huge upheaval for the whole of our family. Well – for most of us. As the youngest child (I'd only just turned 1 year old), I had it easy. My parents and siblings each had to help pack up belongings, say goodbye to their friends, and face the uncertainty of “starting over” when they arrived in Australia.
This was also to be the second migration for Mum and Dad, each of whom had separately migrated from The Netherlands to New Zealand in the 1950s after the Second World War. Their years of getting settled, learning the language and customs, making friendships, buying a home and building a life, would now have to be repeated in yet another new country:
For my part, I remember nothing of either New Zealand or of our move to Australia, but I recall that for many years we still had things in “tea chests” – presumably from that move. My brother Mike recalled that our childhood cubby-house was originally the large wooden crate in which our car (or possibly furniture) had been shipped!
As noted previously, we Fieggens were only one family attached to one successful recruit. There were seven other successful recruits. Of those, only three were from Melbourne and thus didn't need relocating. Two more would similarly have to be relocated from New Zealand as well as one from Queensland and one from Western Australia. I'm guessing that some of them may likewise have been married with children.
This added up to a lot of people undergoing similar upheaval before the new programming work could ever begin.
The team is assembled
I won't delve too deeply into the specifics of those moves lest it detract from the main story – the writing of the ANZ Bank's first computer programs. For the moment, I hope that it's sufficient to say that the “Meeting was adjourned” until everyone re-convened at a later date in Melbourne, Australia.
That date was Monday, 24-Feb-1964, when the Data Processing team assembled for the first time in the ANZ Bank's brand-new high-tech offices at 177 Toorak Road, South Yarra.
Writing Programs on 1960s Computers
As a programmer myself, I've written software on everything from an early home-built kit computer through to desktop PCs and most recently to handheld devices. Despite this wide experience, it's still hard for me to comprehend how difficult it was to program in those days.
For a start, the programmers didn't use a screen or keyboard! Instead, most of their work was done with pencil and paper, edited with a pencil eraser, then transcribed onto punched cards or punched paper tape for feeding into the computer.
Here's a glimpse of the complete process, using photos of some of the actual paper and cardboard artefacts that Dad kept from that period.
The programming process
- Worked out the logic of what they wanted to achieve by drawing up a “flow chart” (or “flow diagram”) on paper:
- From that flow chart, worked out the individual computer instructions needed to achieve each piece of that logic, hand-writing those instructions – the “source code” – onto “coding sheets”:
- Gave those coding sheets containing the source code to “key punch operators”, who would carefully type those instructions into a machine to create “punched cards”, each card containing one line of code or data. This resulted in a first “deck” containing the complete “symbolic program”:
- Handed that symbolic program to the “console operator”, who would feed the deck of punched cards into the computer to perform the “assembly” (more on that later), resulting in a finished, working program.
- At long last, sent that finished program on its first actual test-run through the computer.
A team process
As you can see, writing and testing a program was not a one-man job. The computer was a large machine that required a team with various specific skills to operate. Programmers like Dad and his colleagues were expected to devote themselves solely to the “thinking” part of the process.
I spoke recently with one of Dad's colleagues, Bob Wheeler, who recalled that whenever they finished some section of code, it would typically be “sent off” to be assembled and perhaps run on some test data while the programmer took a break or worked on something else. Some time later, the resulting printouts would arrive, which they would then study to determine the success or failure of the code.
If there were problems, debugging and updating the program required repeating some or all of the above steps. Programmers soon learned to be meticulous and think everything through carefully before committing their changes to flow charts, coding sheets, punched cards and ultimately computer time.
There would also presumably have been some public embarrassment when each failure and each repeated attempt was seen by all of those other operators!
Compare this with programming barely 10 years later when the first hobby computers began to appear. Despite being a low-budget amateur, even I had a rudimentary screen (an old TV set that I'd crudely modified) and a keyboard (some cobbled-together push-buttons). Still, I could simply type in a program and test it – with near-zero cost of subsequent edits and tests. Programming took less forethought, allowing experimentation, trial and error, “what-if” testing – and no public embarrassment when anything failed.
In those pioneering days, Dad and his colleagues didn't have that luxury.
Aside from the rigmarole of the primitive input-output, the other major challenge was the limited resources in those early computers – in particular, working memory. This was where all of the instructions of a computer program plus all of the temporary data on which it was working would be stored.
In the next section I'll try to illustrate how limited this memory was by paraphrasing the old saying: “A picture is worth a thousand words” – into – “A thousand words can hold a small picture”.
1960s Computer Memory
The GE-225 computer, like most computers of that era, used “magnetic-core memory” – which was a very expensive commodity. Take a look at how intricate this technology was and you'll probably see why:
Although the above array is not from a GE-225 computer (it's from my collection of old computer bits & pieces, and is actually from the 1970s), it uses similar technology – a grid of thin wires running in rows and columns, with a tiny “core” (ring of ferrite) at each intersection. Each core can be individually magnetised to store a single “bit” of data. The above grid contains 64×64 = 4,096 cores and thus can store 4,096 bits of data.
Just how much is “4,096 bits” of data? How much real-world data can it store?
Not much – it can barely hold the above two paragraphs of text.
Another way to visualise that amount of data is to represent it as an image. Let's use each of the ferrite cores in that 64×64 grid to store one dot (or “pixel”), either black or white (corresponding to whether or not the core was magnetised). The result would be an equivalent 64×64 pixel black & white image – which is pretty tiny:
Of course, that's just one memory grid (and not even from a GE-225 computer). Typical computers would contain many such grids. The ANZ Bank's computer had a storage grid of 8,192×20 = 163,840 cores. Arranged as a square, this would be roughly 404×404:
It's kind of fun to visualise the equivalent of the entire memory of the ANZ Bank's original 1960s-era computer filled solely with black and white pixels of their 1960s-era corporate logo!
Finally, let's try the same exercise with the ANZ Bank's current full-colour logo from the 2020s. This requires up to 32× the amount of memory to store the colour, brightness and transparency value for each and every pixel. The result is that the ANZ Bank's original computer would only have been able to store a tiny full-colour logo of just 71×71 pixels – barely the size that we see today on a smart-phone app icon or website bookmark:
In fact, a typical smart-phone from the 2020s has one million times as much memory and storage as a typical computer from the 1960s – despite being so much smaller!
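For anyone who'd like to check my pixel arithmetic, here's a quick sketch in Python (assuming 1 bit per black & white pixel and 32 bits per full-colour pixel, as described above):

```python
import math

TOTAL_BITS = 8_192 * 20                      # two modules of 4,096 twenty-bit words

# 1 bit per pixel: side length of the largest square black & white image
bw_side = math.isqrt(TOTAL_BITS)

# 32 bits per pixel: side length of the largest square full-colour image
colour_side = math.isqrt(TOTAL_BITS // 32)

print(TOTAL_BITS, bw_side, colour_side)      # 163840 404 71
```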
What I'm trying to show with all of this is just how little memory Dad and his colleagues had to work with. Back then, it was still a precious commodity – thus computers contained precious little.
Yesterday's memory in today's terms
The GE-225 computer used core memory “modules”, each capable of storing 4,096 “words”, with each word comprised of 20 “bits”. The computer was available with 1, 2 or 4 modules. The ANZ Bank chose an initial configuration of two modules, thus 8,192 words.
Nowadays we generally refer to storage capacity in “bytes”, with each byte comprised of 8 “bits”. So let's do the maths to convert that storage capacity to today's terminology:
- 8,192 words × 20 bits per word = 163,840 bits;
- 163,840 bits ÷ 8 bits per byte = 20,480 bytes (= “20 kilobytes” or “20 KB”).
Again, just how much is “20 KB” of memory? How much real-world data can it store?
Again, not much – about 8 or 10 pages of text – maybe enough for one short story.
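If you'd like to verify the conversion maths above, it amounts to just a couple of lines of Python:

```python
WORDS = 8_192            # two 4,096-word core memory modules
BITS_PER_WORD = 20
BITS_PER_BYTE = 8

total_bits = WORDS * BITS_PER_WORD            # 163,840 bits
total_bytes = total_bits // BITS_PER_BYTE     # 20,480 bytes = 20 kilobytes

print(total_bits, total_bytes)   # 163840 20480
```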
In fact, that's probably a good description of a typical computer program of the era – as a “short story” that achieved something worthwhile. And because we're talking about computer programs, those short stories were written in computer language, not English.
General Assembler Language
Although it was possible in 1964 to write programs using high-level computer languages (such as “COBOL”), the ANZ Bank opted to use the low-level “General Assembler Language”. Such programs use the individual native instructions of the GE-225's Central Processor.
As an analogy, if learning a high-level computer language is like learning to drive a car, learning a low-level computer language is like learning how to separately control every single moving part of a car – pistons, valves, crankshafts, gears, levers, wheels, etc.
Central processor instructions
Dad and his colleagues were each given a copy of the “GE-225 Programming Reference Manual”, which listed the more than 300 native instructions understood by the GE-225's Central Processor.
Most of these instructions consist of what to do and where to do it. On the coding sheet, these are entered as the “Operator” (=“Opr”) (a three-letter code, or “mnemonic”) and the “Operand” (usually a memory address), sometimes with an additional “X” entry (usually an address modifier or device number).
To spare programmers from having to calculate and keep track of memory addresses throughout their code, the assembler let them assign a short name to any line of code or data. On the coding sheet, this is entered as the “Symbol” (up to six characters). Any instruction that references that code or data can then use that symbol instead of the memory address.
For example, let's say we're tallying an employee's hours worked. If we use the symbol “HOURS” for the area of memory where the daily hours are being loaded and the symbol “TOTAL” for the area of memory where the total is being tallied, the resulting assembler code might look like this:
|DLD||TOTAL||Load current total|
|DAD||HOURS||Add hours worked|
|DST||TOTAL||Store updated total|
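To modern eyes, those three instructions behave like a simple load/add/store sequence through an accumulator. Here's a toy Python model of that behaviour (the starting values are invented, and this sketch ignores the GE-225's actual 20-bit double-length arithmetic):

```python
# Toy model of the DLD/DAD/DST sequence above. Symbol names stand in
# for memory addresses, just as they do on the coding sheet.
memory = {"TOTAL": 32, "HOURS": 8}   # invented starting values

accumulator = memory["TOTAL"]        # DLD TOTAL - load current total
accumulator += memory["HOURS"]       # DAD HOURS - add hours worked
memory["TOTAL"] = accumulator        # DST TOTAL - store updated total

print(memory["TOTAL"])               # 40
```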
Armed with all of this – plus countless more information about “registers”, “controllers”, “interrupts” and other intricacies – Dad and his colleagues set off on their journey using “General Assembler Language” to write the ANZ Bank's first programs.
Human code to computer code
As mentioned previously, the final part of the journey – converting a “human-readable” program into a finished “computer-readable” program – is known as “assembly”:
- All of the instructions, numbers and text are converted from various alphabetic and numeric formats into “octal” code;
- The memory address of each instruction and each data item is calculated;
- Any “symbols” in the code are replaced with those calculated memory addresses.
The end result is a program containing the identical instructions plus the identical data in the identical sequence as it was programmed in the source code – except that everything is in a format that's compactly stored and is easier for the computer to work with.
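To make those three steps concrete, here's a minimal two-pass “assembler” sketch in Python. To be clear, this is an illustration, not the real General Assembler II: the “LDA” mnemonic and the word layout (opcode in the top bits, resolved address in the low bits) are my own assumptions, though BRU's octal opcode of 26 matches the S05 listing.

```python
# A minimal two-pass "assembler" sketch (NOT the real General Assembler II).
# Pass 1 assigns each line a memory address and records every symbol;
# pass 2 replaces symbols with those addresses and emits 20-bit words
# as 7-digit octal strings.
source = [
    ("START", "LDA", "COUNT"),   # load a value (LDA is an invented mnemonic)
    ("",      "BRU", "START"),   # branch unconditionally back to START
    ("COUNT", "DEC", "0"),       # one word of data
]
OPCODES = {"LDA": 0o01, "BRU": 0o26}   # BRU = octal 26; LDA's code is invented

# Pass 1: assign addresses; assume the program loads at octal 300, as S05 did
symbols, addr = {}, 0o300
for label, opr, operand in source:
    if label:
        symbols[label] = addr
    addr += 1                    # assume one word per line

# Pass 2: resolve symbols into finished octal words
words = []
for label, opr, operand in source:
    if opr == "DEC":
        words.append(int(operand))              # a data constant
    else:
        address = symbols.get(operand, 0)       # unresolved symbol -> 00000
        words.append((OPCODES[opr] << 15) | address)

print([f"{w:07o}" for w in words])   # ['0100302', '2600300', '0000000']
```

Note how an unresolved symbol falls through to address 00000 – exactly the kind of incomplete word (“2600000”) that turns up in the real S05 listing discussed later.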
That may seem like a simple task – but like everything else in the early 1960s, “assembly” was also a complex process.
The Assembly Process
The GE-225 computer came with a software suite called “General Assembler II”. This consisted of four separate “binary” decks of punched cards – “Pass 0”, “Pass 1”, “Pass 2A” (Absolute version) and “Pass 2R” (Relocatable version). Each of these decks was used to perform one separate assembly pass.
For each pass, the console operator would feed one of those binary decks into the card reader, followed by the cards to be processed, followed by one final blank card (to signal end of input).
The following shows the typical three-pass assembly process for a basic punched-card system:
First assembly pass
The “Pass 0” binary deck plus the “Symbolic Program” deck (eg. ANZ Bank's code) were fed into the computer:
This pass generated a new second deck of punched cards called the “Packed Deck” (four instructions per card, comments removed), plus a third deck of cards called “Symbolic Table ST1” (certain symbolic entries in the program), plus a printout.
Second assembly pass
Next, the “Pass 1” binary deck plus those second and third decks were then fed back into the computer:
This pass generated another new fourth deck of punched cards called “Symbolic Table ST2” (all symbols in the program, this time with their calculated memory addresses), plus a second printout.
Third assembly pass
Finally, the “Pass 2” binary deck (either the “A” or “R” version) plus the second, third and fourth decks were fed back into the computer:
This pass generated the fifth (and final) deck of punched cards called the “Object Program” (slightly different “binary” format), plus a third (and final) printout called the “Assembly Listing”.
Note that for the sake of simplicity I've shown only the most basic assembly using the most basic punched-card system. On systems that also include “punched paper tape” and/or “magnetic tape”, the output or subsequent input could use either of those media instead of punched cards. This would be significantly faster (and less tedious):
The finished program
If all went smoothly, the programming was now complete. The finished “object program” was then ready to run whenever or wherever needed – even on a different GE-225 computer – with the finished “assembly listing” printout stored as documentation.
Only when updating the program or fixing bugs was it necessary to repeat any of the above steps. Mostly this involved punching some new cards, inserting them into the first “symbolic program” deck, then – depending on the nature of the change – repeating up to three passes through the computer to create an updated “object program” deck and “assembly listing”.
Sounds complicated, right? I get dizzy just reading that whole procedure!
Having said that, those multiple passes of punched cards/tapes in and out of the computer were only “procedure”. The hard part was the programming itself – and I've only scratched the surface of all that was involved. Hopefully it has given you some appreciation of just how much Dad and his colleagues had to learn and put into practice before the ANZ Bank's first programs could be commenced.
The ANZ Bank Programming Begins
How does one even begin to write the first ever software for a large multi-national organisation like ANZ Bank? It's a daunting task.
One strategy is to “divide and conquer”. Break down everything into smaller pieces.
The initial team of nine analyst programmers was divided into three smaller teams: “Current Accounts”, “Personnel” and “Share Registrar”. These teams would tackle the three main areas of the ANZ Bank's business.
Dad was one of three in the “Personnel” team, along with Doug Vollmerhause and Ted Harding (later replaced by Roger Papps):
Likewise, each of the teams broke down the “library” of needed software into a bunch of “short stories”. Based on the artefacts that Dad kept, the library that the Personnel team designed and created consisted of a number of programs with names like “S01”, “S02”, etc, with the highest I've found being “S28”. The common “S” prefix possibly indicates “Staff” or “Salaries” – but I'm only guessing.
One significant artefact
Of all of those programs, “S05 Master File Update Run” is probably the most fascinating from a historical standpoint. Dad kept a bunch of stuff relating to this particular program, including the flow charts (as seen above) plus a printout of the first assembly, which he had crudely bound with cardboard and string, labelled: “S05 1ST ASSEMBLY”, and annotated: “THE FIRST ANZ-BANK PROGRAM EVER TO BE ASSEMBLED!”
This printout obviously meant enough to Dad for him to save it as an artefact for more than half a century. After all, months of effort had been spent before Dad had this first tangible result to show for it.
There's also a fun story that Dad often related about this very first assembly. Here's another excerpt from Dad's memoirs:
“I still clearly remember the first time I ran the assembly of my first program. The source program cards occupied two almost full card-trays, and when the operator at General Electric's computer bureau (our own computer had not yet arrived at this stage) saw that he said "That'll never fit into memory".”
“So, when the listing was being printed, there I stood alongside the printer, anxiously looking at the addresses, shown in octal notation, ascending. Of course, the highest address (in octal) was 17777; when the numbers started exceeding 17700 I almost resigned myself to the fact that I might have miscalculated. Some instructions took more than 1 word, and I might have under-estimated. But then, at 17770 the assembly was complete! I had 8 words to spare!”
“Later, with experience, we learned how to make maximum use of memory, how to share buffers and how to overlay, how to share constants etc etc. But that first program (which, incidentally, was the ANZ Bank's first in-house written program to be assembled) was an experience never to be forgotten.”
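The octal figures in that recollection check out, as a couple of lines of Python will confirm:

```python
# Checking the memoir's octal arithmetic: 8,192 words of memory means
# addresses run from octal 00000 up to octal 17777.
memory_words = 8_192
print(f"{memory_words - 1:o}")   # 17777 - the highest octal address

# Assembly completing at octal 17770 leaves exactly 8 words spare:
print(memory_words - 0o17770)    # 8
```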
Although it was rare for programmers to oversee the assembly process, that very first assembly was a one-off moment that Dad wasn't going to miss. The resulting printout, with its proud annotation on the binding, captured my imagination enough to study it further. It seemed a perfect case study for this account – and probably holds its own as a solid piece of computing history.
Presenting this program, however, was no easy feat. It ended up taking me months of work (on and off) to transcribe the 120+ pages. You can read more about my own story of how this was done further below.
For now, let's take a look at the “S05” program – both figuratively and literally – to see what Dad and his colleagues created and what it achieved.
“S05” Program Overview
Here's some overall stats on the “S05” program, taken from the spreadsheet into which it was transcribed:
- 122 pages of assembled code printouts;
- 4,875 lines of ANZ-written source code;
- 1,666 lines of G.E.-supplied source code;
- 6,541 lines of source code in total;
- 116,706 bytes of source code in total;
- 193,480 bytes of assembled code;
- 8,173 words of memory filled;
- 20,433 bytes of memory filled.
I found a couple of the above stats particularly interesting – and seemingly contradictory. The program filled 20,433 bytes of memory, only just fitting into the GE-225 computer's working memory (20,480 bytes). Yet the source code of the program was almost 6× as large (116,706 bytes).
Source code size
So – how could the source code fit into memory?
Sure, I understand that the “assembly” process compacts the big, human-readable source code into a tiny, computer-readable program. But wouldn't the computer need 6× as much working memory just to be able to read all that source code?
The assembler accomplished this feat by reading only one “source” card at a time, keeping only the bare minimum from that card in memory, then outputting only one “assembled” card at a time. This partly explains why the computer needed three passes to complete the assembly.
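In modern terms, the assembler streamed its input. Here's a rough Python sketch of the idea (assuming standard 80-column punched cards):

```python
# A sketch of the streaming trick described above: process the deck one
# card at a time via a generator, so only the current card (plus the
# growing symbol table) ever sits in working memory.
def read_cards(deck):
    """Yield one 80-column card at a time, never the whole deck."""
    for card in deck:
        yield card.ljust(80)     # pad each card image to 80 columns

deck = ["DLD TOTAL", "DAD HOURS", "DST TOTAL"]   # a tiny symbolic deck
count = sum(1 for _ in read_cards(deck))
print(count)   # 3 cards processed, one at a time
```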
Visualising the “S05” Program
Remember previously when I used ANZ logos to “visualise” the storage capacity of the ANZ Bank's original computer? How cool would it be to use a similar image to visualise the actual “S05” program as it would have been stored in memory! My spreadsheet contained the complete assembled “S05” program stored as octal strings – I only needed to convert those strings into black and white dots.
Here, then, is the resulting image – followed by an explanation of what it shows:
The idea was to visualise a program stored in the memory of the ANZ Bank's original GE-225 computer. That memory consisted of 8,192 “words”, each consisting of 20 “bits”. Ideally, I could have used image dimensions of 8192×20 to exactly match those storage dimensions. But that would be one ridiculously long, thin “strip”, which would be awkward to view on a web page.
I've therefore chopped this into 16 strips and stacked them into a more sensible rectangle, adding faint grey horizontal lines to show those strips. By analogy, you can think of this as one 16-line “paragraph” on ruled paper, with each line containing 512 “words” of memory.
To me, this image is fascinating! It's like a historic “street map” showing the whole of a 1960s-era memory grid, with every black dot depicting the location of a single memory core magnetised with one bit of code from a 1960s-era computer program.
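For anyone curious about the conversion itself, the core step was simply unpacking each octal string into individual bits. Here's a sketch in Python (assuming each word is stored as a 7-digit octal string, as in my spreadsheet):

```python
# Turn each 20-bit word (a 7-digit octal string) into 20 black/white
# pixels - the core step of the memory visualisation described above.
BITS_PER_WORD = 20

def words_to_bits(octal_words):
    bits = []
    for word in octal_words:
        value = int(word, 8)                 # parse the octal string
        bits.extend((value >> (BITS_PER_WORD - 1 - i)) & 1
                    for i in range(BITS_PER_WORD))
    return bits

# e.g. the word "2600000" becomes 20 dots, mostly white:
print(words_to_bits(["2600000"])[:8])   # [1, 0, 1, 1, 0, 0, 0, 0]
```

From there, the 8,192 words (163,840 bits) just need to be laid out as 16 strips of 512 words each and written out as an image.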
Blank areas = No data
It was interesting to see in this visualisation the large areas without black dots – and hence without data. What are these?
- The first blank area (at top-left) is due to the program loading into memory starting at octal address 300, leaving the first 192 words of memory empty;
- The second, much larger blank area (in the middle) is where memory was allocated to store every bit of data that this program inputs, manipulates or generates for output;
- The final blank area (at bottom-right) is the tiny amount of memory remaining after the program loaded – only 19 words!
It was also interesting to see certain patterns in the data. I'm familiar with binary numbers and thus recognise some of these patterns, but other patterns looked less familiar. I therefore carefully worked through the image, colour-coding and annotating the sections of ANZ code and data plus G.E.-supplied code, to produce the following breakdown:
For Dad and his colleagues, this type of visualisation would have taken place only mentally as they pieced everything together in their minds.
“S05” Program Discoveries
I made plenty of other interesting discoveries as I gradually transcribed the “S05” program. I learned a lot about what the program actually did – as well as what it didn't do.
For one thing, the program as it stood wouldn't have worked!
I only discovered this when I reached page 14 of the assembly listing. That's when I found, for the first time, a solitary letter “U” sitting just to the left of what I had previously assumed was the left-most column. The same character appeared again on page 19 – and after checking further, I found another eight occurrences. Each had been prominently circled with red pen:
These turned out to be error codes indicating: “Undefined Symbol”. In the above example, the code: “BRU PENS7” means to branch to the line of code with the symbol: “PENS7”. However, the assembler could find no such symbol, thus it couldn't “fill in the blank” with the necessary memory address.
The result was the incomplete octal code: “2600000”, which would actually have resulted in the program branching to memory location “00000”. Who knows what that would have triggered, as this is within undefined memory before the start of the “S05” program!
Less critical errors
The next error was somewhat less critical, caused by a single missed character within the “Male salary table”:
|DEC||116400||MALE SCALE 1|
The program would still have worked – but those staff falling into that pay grade would have been quite annoyed to receive less than 1/10 of their usual salary – “29080” instead of “298080”. (Note: The correct figure had been hand-written on the printout.)
Likewise, typos in less-critical sections, such as in the “remarks” column (eg: “CARIIAGE” instead of “CARRIAGE”) or in the “sequence” column (eg: “171()” instead of “17190”) would have made little difference to the program's operation.
Sometimes, however, the code and remarks were out of sync (eg: “TAB1+2” in Code, “TAB1+3” in Remarks), leaving me unsure whether the typo was in the code (which would be critical) or in the remarks (non-critical).
Finally, I found five instances when a single memory address (13035, 14177, 14547, 15135 and 17651) was skipped for no immediately apparent reason:
The first occurrence turned out to be due to a double-length decimal value: “DDC 50000”. These are required to start on an even-numbered memory address, so it was bumped from 13035 to 13036, leaving a gap.
But I couldn't find any such explanation for the remaining four instances – although the fact that each was also at an odd-numbered address hints at a similar alignment issue. In any case, I doubt that they are errors, rather, they are simply a mystery to me. I only mentioned them for the benefit of anyone else examining the code.
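The alignment rule behind that first gap can be expressed in a line of Python (a sketch based purely on the even-address rule described above):

```python
# A double-length constant (DDC) must start on an even word address,
# so an odd load address gets bumped up by one, leaving a one-word gap.
def align_ddc(addr):
    """Return the address a double-length constant would actually occupy."""
    return addr + (addr % 2)

print(f"{align_ddc(0o13035):o}")   # 13036 - bumped, leaving 13035 empty
```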
It was a bit of a surprise to encounter the G.E.-supplied code roughly 3/4 of the way through. These were “subroutines” to perform inbuilt functions – one to “FLIP” values from BCD (binary coded decimal) to binary, one to “FLOP” values from binary to BCD, plus an “#I/O” subroutine for tape input/output services.
Interestingly, the first card of the “FLIP” subroutine was mistakenly placed at the back of that supplied deck. It was found in the assembly listing at octal address 15641 when it should have been at octal address 14630 – more than 500 lines earlier:
|REM||CD225C1.000 FLIP-1 BCD TO BINARY CONVERSION|
That card consisted only of remarks (the heading and version number of the subroutine), so the fact that it was out of sequence didn't affect the actual program.
Interestingly, half of all typos were in the G.E.-supplied code!
Although not an error as such, I did notice one anomaly: Dad had mentioned in his Apr-1966 letter that there were five pension funds, yet there seemed to be only four listed in the “S05” program: “G+P 1861”, “OPF 1933”, “MOPS 1951” and “LSPF 1951”. It's possible that a fifth pension fund was added some time between 1964 (when “S05” was first assembled) and 1966 (when the letter was written).
In addition to the actual GE-225 instructions in this program, which are mainly of interest to programmers and computer geeks like me, the source code contains other snippets that could be of interest to demographers and history buffs.
The “Male salary table” (as mentioned above) indicates that there was a separate table for females. Although there was a strong “Women's Lib” movement back in the 1960s, it would be decades yet before institutions began to address all the issues of gender equality. Equal pay would certainly have made programs like “S05” much simpler!
Some people may likewise find it interesting to study other aspects of payroll in the 1960s, such as normal working hours, overtime, special duties, allowances, deductions, taxes, pensions, etc.
I hope that showcasing this “S05 1st Assembly” is seen as a worthwhile addition to the annals of Australia's computing history. If you'd like to download and examine the code, see the Resources section at the bottom of this document.
Summary of discoveries
All in all, the “S05” program contained plenty of interesting discoveries – even a few errors. But I hope that pointing out those errors hasn't detracted from what Dad and his colleagues accomplished. That 1st assembly was a huge success for them as fledgling analyst programmers – and a huge milestone emotionally.
What the “S05” 1st Assembly Achieved
For the months prior to this first assembly, all that Dad and his colleagues had to show for their work was their own pages of notes, flow charts, coding sheets, punched cards and paper tapes. It was only when that first deck of punched cards was fed through the computer that everything was finally brought together into one cohesive printout.
The result must have seemed like a long-sought treasure!
Suddenly they could see and read for the first time exactly what the computer would read – warts and all. This was their first chance to correct any errors that the assembly process had highlighted, plus eyeball the whole lot to find more subtle errors.
In fact, there are extensive handwritten edits on many pages, with the final page including the note: “Amended 862 cards before 2nd assembly”:
I also wonder if this explains the disparity between Dad's recollection in his memoirs of having only “8 words to spare” and the 19 words to spare revealed by the 1st assembly printout. Perhaps it was that 2nd assembly that came agonisingly close to the limit?
Although I don't have the 2nd assembly listing, in theory it could be reconstructed from all of those handwritten edits on the 1st assembly. But that's way further than I intended to run with my analysis.
Interestingly, I did locate an “S05” assembly listing from Jun-1972 – some seven years later. Although it had undoubtedly evolved substantially, it's nice to see that basically the same program was still in use.
Other Hardware and Software Challenges
This account may already sound pretty complicated to the average reader – yet I've barely touched on the many challenges that Dad and his colleagues faced. In particular, most of the computer hardware that they were using was first-generation, and each piece of hardware came with its own peculiarities.
As a quick example, let's look at one of the biggest units in the GE-225 computer system lineup. The “Document Handler” unit reads magnetically-encoded documents, such as cheques, deposit slips and so forth, sorting them into one of 12 output pockets:
This unit was fast – reading at 1,200 documents per minute. That's 20 documents per second! For a 1960s-era programmer, that made things complicated.
I remember Dad telling me about the time constraint of reading a cheque, making a decision based on what was read, then issuing a command to select the pocket into which to divert that cheque – but only after double-checking that the cheque hadn't already passed the first pocket. At 20 cheques per second, that works out to a cutoff time of only 50 milliseconds.
Writing the code for complex and/or lengthy tests sometimes needed some pretty intricate programming. Typically this required working out the individual execution times for each processor instruction in each section of code, then ensuring that the total time taken through each possible path was under that 50 ms cutoff time.
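That discipline amounts to a simple time-budget check. Here's a Python sketch of the idea – the per-instruction times and path lengths below are invented for illustration, not real GE-225 figures:

```python
# Sum the execution time of every instruction along each possible code
# path and compare the total against the 50 ms pocket-select window.
# Instruction times and path lengths here are invented for illustration.
CUTOFF_MS = 50.0
INSTR_MS = 0.036     # pretend every instruction takes 36 microseconds

paths = {
    "cheque OK":    400,    # instructions executed on the short path
    "cheque odd":   900,    # a longer checking path
    "worst case":  1500,    # the longest possible path
}

for name, instructions in paths.items():
    total = instructions * INSTR_MS
    verdict = "fits" if total <= CUTOFF_MS else "TOO SLOW"
    print(f"{name}: {total:.1f} ms ({verdict})")
```

On the real machine this budgeting was done by hand, instruction by instruction, from the timing tables in the programming manual.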
The above example involves just one piece of hardware. If a program was reading a cheque, comparing it to a record on magnetic tape, then sending a message to the printer, there were three electro-mechanical units involved. It was a real challenge to synchronise everything to reduce or eliminate stopping and starting.
Dad and his colleagues gradually learned how to handle each unit, and as their programming matured, soon had everything working in harmony like a well-rehearsed orchestra.
Intricacies of programming
In this account, I didn't want to delve too deeply into programming itself – which is a whole other story. Briefly though, here are some of the other intricacies that Dad and his colleagues had to absorb:
- All the different parts of the Central Processor;
- All the different processor instructions;
- How “decisions” are made in a program;
- How errors are signalled by devices and coded into a program;
- Troubleshooting and “debugging” a program for coding errors;
- Creating test files with both “good” and “bad” data to check a program's logic;
- Performing a “core dump” (a printout of everything in core memory) to check what was going on inside.
The elephant in the room
On top of all of this was the enormity of what Dad and his colleagues were working on. This was software for a multinational bank. Mistakes would affect real people's lives – and would be damaging to the ANZ Bank's reputation. Everything was expected to tally to the exact cent.
Talk about pressure!
When I recall my own first programs that I wrote on hobby computers in the 1970s, these were just simple experiments while I learned about programming. Any mistakes didn't affect anyone, so I had the luxury of messing around and enjoying myself.
Back in the 1960s, there was no time for fun and games on a brand new multi-million-dollar computer that was being set up for the first time. Those first programs, like “S05”, were the result of months of work and thousands of lines of code. They had to work.
ANZ Bank Software Rollout
The “S05” program was the first of the ANZ Bank's programs to be assembled, leading the way for the remainder of the suite of “Personnel” programs to slowly come to completion. Only when all the required pieces were operational and tested would they be able to make their debut.
That momentous occasion came on 07-Apr-1965 – the date of the first salaries print-run:
I have copies of three slightly different photos taken on that date, one of which also included Mike Clarke, and another that included Bruce Harper. I only have a tiny, blurry copy of the latter – a pity because that's the one that appeared in the ANZ Bank's newsletter, carrying the following caption:
“A proud and exciting moment for Messrs Doug Vollmerhause, Chris Fieggen, Roger Papps, Mike Clarke and Bruce Harper when the high-speed printer produced the first salaries run on April 7 at 100 warrants per minute. EMANZA is fully operational.”
(Note: “EMANZA” was the name for the ANZ Bank's computer.)
I've spoken with Dad many times about the interesting aspects of the personnel software – in particular that fortnightly payroll run. Here's a good summary that Dad wrote to his brother-in-law in The Netherlands in Apr-1966 (about two years into the job):
“So far I have been involved in the "Payroll" computer application for the Bank. The Bank has some 7,500 personnel in Australia, and the application required a "suite" of programs, some of which I have written.”
“We now have, on magnetic tape, all necessary details of those 7500 staff, such as name, salary p.a., certain allowances, pension-fund details (we have 5 pension funds), dates of birth and entry into the Bank, tax scale, etc.”
“You will appreciate that every fortnight, before we can run the payroll, numerous changes and corrections have to be entered. Then there are the overtime details, people have been transferred, etc etc. All those details are advised to us via punched paper tapes by the various personnel-departments in the state capitals. All these details are entered into the computer and written on magnetic tape, then sorted so they end up in the same sequence as the tape with all the basic details (the "Master-file"), and then the two tapes are processed together, the changes are applied, salary-cheques are printed (2 per second!), various amounts are added together and stored in each staff-member's "record" for later reporting, e.g. to the Taxation dept, and when all is done a new "Masterfile" is produced on magnetic tape for use next payday. After this various reports need to be printed, per branch, per department, per state etc etc. After that all is quiet for a fortnight (don't you believe it!)”
I also recall Dad telling me how – many years down the track – the ANZ Bank was considering some new payroll software and Dad was one of those tasked with evaluating offerings from various vendors. One such suite seemed particularly promising – until Dad asked a critical question: “What's the total run-time of a fortnightly payroll run?” From memory, the answer was something like: “18 hours”.
This meant that if there was any error that required a re-run, that 18 hours became 36 hours, and employees would receive their salary one day late! It was an easy decision for Dad to reject that software – and gave him the opportunity to brag a little about his own earlier software that achieved the required outcome within a 24-hour timeframe on a much less powerful computer.
Remaining software rollouts
In due course the “Current Accounts” and “Share Registrar” teams likewise completed their contributions. Between them, the three teams managed to get all of the necessary ANZ Bank software up and running well ahead of the Feb-1966 deadline (when the changeover to decimal currency was scheduled).
As Dad wrote in his memoirs about his team's efforts:
“Our team, consisting of three Analyst/Programmers, was doing the salaries application. Our deadline was 1 July 1965, the start of the new fiscal year. For various technical reasons we decided to bring our deadline forward by two months. Well, contrary to all common wisdom ("all computer systems run over budget and over time") we brought our system in under time and under budget! It may well have been the last computer system in the world to achieve this; it certainly was for the ANZ Bank's major systems.”
You may have noticed throughout this account that Dad and almost all of his original Analyst/Programmer colleagues were men. Looking back from today with the morals of the 2020s, it's easy to assume that sex discrimination may have played a part. The fact that the ANZ Bank had separate pay tables for males and females (as seen previously) lends weight to this assumption.
Dad and I spoke about this a couple of times – and the reality is actually pretty surprising!
Gender neutral positions
In the very first recruitment drive back in May-1963, most of the positions offered were indeed “for the boys”. The Supervisors, Audit Officer, P.R. Officer and Tabulator Operators were all listed as: “Male Position(s)”
On the other hand, the Analyst/Programmer positions were open to anyone – regardless of gender. In fact, the staff circular specifically stated:
“(or any lady clerk – in the case of Analyst/Programmers – who is interested in a career post)”
The staff application form likewise stated:
“Applications will be considered from career Lady Clerks”:
It was therefore disappointing to see so few women at each stage of the recruitment process – only one of whom made it through:
- There were far fewer initial applications from women than from men;
- After the interviews and aptitude tests, one woman was immediately appointed as a Systems Analyst/Programmer;
- For the remaining Analyst/Programmer positions, only one woman attended the computer training courses, whereas 16 men attended;
- Of those, no women made the grade, whereas eight men were selected and four men were kept as “reserves”.
Positive gender bias
From the outset, the ANZ Bank had made the assumption that women might be ideally suited to the Analyst/Programmer roles. After all, a computer program is like a “recipe”, and women in the 1960s still largely filled the traditional gender role of “cooks” in most households. It would be right up their alley, correct?
Sadly, that stereotypical assumption was wrong. The fact that a man or woman may cook, and may be good at following a set recipe, didn't necessarily make them good at creating a recipe – nor even of understanding the reasoning behind each of the ingredients or steps. Perhaps a “chef” would have been closer to the mark.
In a strange twist, it turned out that a different gender stereotype might have been correct. As more women gradually came on board in later years, those who were good at knitting typically proved to be good at programming!
Note that this didn't apply to women who were just average knitters. Someone who could only follow a knitting pattern that was spelled out in detail was likely no better at programming than a cook who could only follow an equally detailed recipe.
On the other hand, a good knitter understood the underlying logic of knitting. A simple instruction or series of instructions, such as “Knit One, Purl One”, repeated over and over, with decisions and variations made at critical points, resulted in a given outcome – a finished piece of knitting with a particular pattern. Good knitters instinctively grasped those same familiar concepts in programming.
Although the connection between knitting and programming was an interesting observation, it never helped the ANZ Bank recruit new female staff. Can you imagine an application form asking: “Are you good at knitting?”
Ongoing Software Development
Through 1965 and 1966, with the various new ANZ Bank programs now performing their intended tasks, Dad and his colleagues could finally shift their focus to making improvements. This was when it really became exciting to be a pioneer programmer!
In particular, this meant that they were often doing things for the first time. As I once wrote of my own experiences in the early days of businesses adopting PCs:
“The tasks, which may seem trivial by today's standards, were both fascinating and challenging because so much had never before been attempted, hence we often explored new territory and pushed the boundaries of those less sophisticated systems.”
On a computer with no screen-based “development environment”, few reference books and no internet, one couldn't simply search for sample programs or algorithms. Each programmer typically had to figure out for themselves the best way to accomplish everything.
Dad described how he or a colleague would occasionally decide to re-work some awkward piece of code to try to gain some improvement. Even a tiny saving of a few milliseconds becomes significant when it occurs within a loop that is repeated millions of times. On succeeding, they would then excitedly share the results with their colleagues, who would marvel at both the resulting speed gain and the genius by which it was accomplished.
Today, if a programmer wants to sort a bunch of data, they can simply search the internet for “sort algorithm” and find countless solutions. They can then choose the algorithm best suited to the type of data being sorted, the software in which the program is being written or the hardware on which it will be run. By contrast, back in the 1960s, pioneer programmers around the world were busy inventing such algorithms.
Chris Fieggen's code
Besides that first “S05” program, Dad really enjoyed writing “tricky” bits of code. I recall Dad being fairly pleased with his suite of “Date Modules”. These were primarily designed to do calculations based on Wednesdays (when the fortnightly payroll run took place), but could equally perform the same function for any weekday:
In his “Perpetual Calendar” modules, Dad went the extra mile with his calculations, programming them to handle dates all the way back to the year 0. To achieve this, the code included complexities such as the fact that ten days were skipped in Oct-1582, when the “Julian Calendar” switched to the “Gregorian Calendar”. That's way more comprehensive date calculations than what is being used by most computer software even today!
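To give a flavour of what Dad's modules had to handle, here's a minimal modern sketch of a day-of-week calculation that respects the Oct-1582 switch, using Zeller's congruence (which has separate Julian and Gregorian forms). This is my own illustration, not Dad's code – and it assumes the original switch date, where 4-Oct-1582 (Julian) was immediately followed by 15-Oct-1582 (Gregorian).

```python
def day_of_week(year, month, day):
    """Day of week for a date, honouring the Julian-to-Gregorian switch:
    4-Oct-1582 (Julian) was immediately followed by 15-Oct-1582 (Gregorian)."""
    gregorian = (year, month, day) >= (1582, 10, 15)
    if month < 3:                       # Zeller treats Jan/Feb as months 13/14
        month += 12
        year -= 1
    q, m, k, j = day, month, year % 100, year // 100
    if gregorian:
        h = (q + (13 * (m + 1)) // 5 + k + k // 4 + j // 4 + 5 * j) % 7
    else:                               # Julian-calendar variant of the formula
        h = (q + (13 * (m + 1)) // 5 + k + k // 4 + 5 + 6 * j) % 7
    return ["Saturday", "Sunday", "Monday", "Tuesday",
            "Wednesday", "Thursday", "Friday"][h]

print(day_of_week(1965, 4, 7))   # Wednesday -- the day of that first salaries run
```

Fittingly, that first salaries run of 07-Apr-1965 fell on a Wednesday – the very weekday Dad's modules were built around.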
Dad was usually his own biggest critic, so it was nice to read in his memoirs a rare moment of positive self-assessment:
“After about three years of programming I decided to do a count of all the instructions in the programs I had written, which were still in use at the time, and which could be regarded as debugged. At that time it was considered that a good programmer, whilst programming, could produce 25 debugged instructions per day. Well, my count came to 31 debugged instructions per day, and those days included all the days I had spent doing systems analysis, systems design and all the other things that are involved in producing computer systems, including the maintenance of all the programs I had written.”
Anyone who immerses themselves in some aspect of computing can become an expert in their field. But that expertise is short-lived. New computer technologies – and with them new methodologies – come fast enough that they're difficult to absorb. Experts find themselves increasingly reluctant to throw out all their well-established knowledge and methods in favour of new (and supposedly better) methods.
Eventually, things had changed enough for Dad to become disenchanted with the direction in which things were heading. Here's a final excerpt from Dad's memoirs:
“After some more years in programming I got heartily bored with it all. The originally prevalent atmosphere of pioneering, of doing "leading edge" work, was replaced by a "bean-counter" mentality. Newly employed programmers, usually university diplomates, were put through months of in-house training; various consultants initiated lengthy development "methodologies"; management, newly augmented by "experts" brought in from heaven knows where, instituted voluminous "justification documentation" for even the most trivial of computer systems requests, and I decided to look for some new challenge.”
That was when Dad shifted to a new “communications” department, headed by former boss Bill Collins. Over the next decade or so, they modernized the ANZ Bank's archaic telephone systems, implementing all-new infrastructure for both voice and data communications throughout Australia and overseas. Again – amazing challenges, into which Dad once again immersed himself fully, continuing in that role until his retirement in 1990.
Of the 34 years that he'd worked for the ANZ Bank, I always had the impression that Dad considered his years as an Analyst/Programmer to be his finest.
A thinking exercise
Dad often said – and I agree – that programming is more of a “thinking” exercise than a “writing” exercise. Much of Dad's programming was done while leaning back in a chair, hands behind his head, sometimes with his eyes closed, thinking through a whole problem to its solution before even starting to write. As you can imagine, this pose didn't go down well in the eyes of those bosses who happened to walk past Dad's office and saw their highly-paid employee apparently taking it easy!
Like father, like son
I guess I'll never know whether my own love of programming, or the idea of crafting something of real quality, stems from Dad's recollections – or whether it's because I already had those same personality traits. Whatever the reason, I ended up largely following in Dad's footsteps.
And even today, I sometimes find myself adopting the same “Dad” pose as I, too, lean back in my chair to think through some problem before hitting the keyboard.
I hope that I've managed to impart some sense of what it took to be one of the earliest programmers in Australia in the 1960s. The lengthy road that each of the staff travelled, the huge amount of knowledge they had to absorb, the challenges that they faced and overcame, and the sense of accomplishment in the finished suite of software that they created.
Although it was a significant undertaking at the time – and quite a turning point in the history of the ANZ Bank – viewed from today, it was just one small part of the ANZ Bank's continued evolution. I'm proud to know that my Dad was a part of it.
This account is in memory of my Dad – and in tribute to that original team of ANZ Bank software pioneers:
- Bill Collins
- Doreen Ellis
- Chris Fieggen
- Ted Harding
- Tom McCullough
- Ian Millard
- Roger Papps
- Dave Roberts
- Laurie Steer
- Doug Vollmerhause
- Bob Wheeler
Thanks to several of Dad's former ANZ colleagues for their contributions:
- Ian Millard, for the 1963 programming course group photo;
- Joanna Newman (ANZ archivist), for the 1957 NZ staff group photo;
- Bob Wheeler, for meeting with me to explain certain aspects of GE-225 assembler code.
End of Part One
The whole of the previous section was about Dad and his colleagues and their pioneering work in the 1960s. The following section covers my own contribution in the 2020s when I transcribed the “S05” program to make it available for current and future generations.
Transcribing the “S05” Program
Previously, I mentioned the challenge that I faced with turning the fading printout of the “S05 1st Assembly” into something not only presentable but also useable. At first glance, it was immediately obvious that transcribing everything manually was going to take a lot of work.
The actual “S05 1st Assembly” printout consisted of the following:
- 1-page cover sheet;
- 1-page “ST1” printout (symbolic table 1);
- 3-page “ST2” printout (symbolic table 2);
- 122-page assembly listing.
Even if I ignored the symbolic tables and focused only on the assembly listing, that's still 122 pages. If each of those pages was full (54 lines of code) and each of those lines was full (120 characters), that would add up to 790,560 characters.
Luckily, most lines were only 1/4 to 3/4 full – but the total still came out to a quarter of a million characters. I wasn't about to embark on quite that much typing.
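The arithmetic behind those estimates is simple enough to check (the 35% average fill is my own rough assumption, chosen to match the "quarter of a million" figure):

```python
pages, lines_per_page, chars_per_line = 122, 54, 120
full_capacity = pages * lines_per_page * chars_per_line
print(full_capacity)                     # 790,560 characters if every line were full

avg_fill = 0.35                          # assumed: most lines only 1/4 to 3/4 full
print(round(full_capacity * avg_fill))   # roughly a quarter of a million characters
```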
Optical Character Recognition (OCR)
Having personally been involved with computers for more than forty years, I figured I'd put my computer graphics skills to the test. I'd scan the pages, then feed those images into OCR software to transform the printing into a useable text file. Easy, right?
Boy, was I mistaken...
Firstly, the pages are on wide, fan-fold computer paper, with most still joined to each other along their perforations. I didn't want to separate them into individual pages – and even if I had, each page was too big for my A4-size flatbed scanner. So I ended up simply taking digital photos:
Feeding these through OCR software gave me the worst results I've ever experienced! For a start, there was a lot of distraction amongst the printing, including blue-lined paper plus extensive hand-written corrections and annotations. Even after applying my image-editing skills to optimize the images, increasing overall brightness and contrast and minimizing the blue lines and red annotations, the results were little better.
The main problem was the indistinct characters produced by the high-speed printer. These were sometimes missing the top, bottom or side strokes, such that “E” sometimes looked like “F”, “t”, “c” or “=”, while “O” sometimes looked like “U”, “∩”, “C” or “J”:
It also didn't help that we were trying to read 1960s-era assembler code, which lacks the usual words, sentences, punctuation and layout of “normal” writing. This gave the OCR software very little context to fall back on when anything was marginal.
And even if I improved the source images and tweaked the OCR settings, the accuracy was still destined to be way below 100%. The end result would thus need a lot of careful proofreading to fix every single error.
So lots of work before, during and after the OCR procedure. Farewell to any time savings!
Although I quickly accepted that OCR was not the way to go, that didn't mean that I'd given up on finding a computer solution. I was still determined to save myself having to manually type a quarter of a million characters!
My new plan was “disassembly”.
Ian's GE-225 Disassembler
The idea of a “disassembler” is simple. The original printout from 1964 is the result of “assembly” – converting human-readable source code into computer-readable octal code. We want to do the same thing in reverse – converting that octal code back into source code – preferably something as close to the original as possible.
Let's begin with what Dad had coded, and which the computer had assembled.
Here are the first few lines of Dad's handwritten source code from the “GE-225 Coding Sheet” (as seen previously) minus the “X” and “Sequence” entries (which Dad hadn't used):
|MRECIN||ALF||BTL||MASTER RECORD IN|
Here are the same lines from the “S05 1st Assembly” printout (as also seen previously):
|00454||0226343||MRECIN||ALF||BTL||MASTER RECORD IN|
These correspond exactly – except for the addition of the first two columns, which contain the memory address plus the assembled octal code. There is a one-to-one relationship between that octal code and the original source code. That is, a given line of source code (particularly the “Opr.” and “Opd.”) should consistently produce a given octal code:
Can we do that in reverse? From a given octal code, can we decode it back into the original source code?
To find out, I created a spreadsheet – an easy decision given the rigidly-structured rows and columns of the assembler source code. As a bonus, I love creating spreadsheets!
It was easy enough to replicate the headings and layout based on the coding sheet and assembly printout, then filling the first column with an incrementing memory address:
The only tricky part was that the address was in octal (digits 0–7) instead of decimal (digits 0–9). This means that the address 00457 was followed by 00460, not 00458. With a simple formula, I soon managed to expand that column to contain every octal number from 00454 through 17777 (decimal 300 through 8,191).
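That column-filling formula is easy to mimic in a modern language – a one-liner, assuming Python in place of the spreadsheet:

```python
# Fill the address column with every 5-digit octal number from 00454 to 17777.
addresses = [format(n, '05o') for n in range(0o454, 0o17777 + 1)]

print(addresses[:5])   # ['00454', '00455', '00456', '00457', '00460'] -- no '00458'!
print(len(addresses))  # 7892 addresses (decimal 300 through 8,191)
```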
Then I was ready for the challenging part – creating the disassembly formulas.
Version 1: Lookup formulas
I started out using simple “vlookup” formulas. When I entered an octal code into a blank row, those formulas would search for the same octal code in previously filled rows. If found, the lookup formulas would “auto-fill” columns 3–7 with the corresponding source code from that previous entry.
The first time this formula kicked in was about 2/3 of the way through the first page (memory address 00520), where I entered the octal code “0226343”. The lookup formulas searched back through previous rows and successfully found a matching octal code at memory address 00454:
|00454||0226343||MRECIN||ALF||BTL||MASTER RECORD IN|
|00520||0226343||TFREC||ALF||BTL||TRANSFER RECORD IN|
Sadly, the source code copied from that previous entry was partly correct (colored green) and partly incorrect (colored red). It was simple enough to overtype those incorrect entries with the correct text. But in the end, the formulas hadn't really saved me much work.
I'd always expected that it would take a few pages of typing before there was a large enough pool of “previous entries” for me to start reaping the rewards. It soon became obvious that probably 90% of octal entries were completely unique, so the lookups rarely managed to find a previous match at all.
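In modern terms, those version-1 lookups amount to a dictionary keyed on the whole 7-digit octal code. A sketch (the "0214155" code and the field values for the miss case are illustrative):

```python
# Each previously typed row maps its full octal code to its source-code fields.
seen = {}   # octal code -> (symbol, operator, operand, remarks)

def lookup(octal_code):
    """Version-1 "vlookup": find an earlier row with the same octal code."""
    return seen.get(octal_code)

# After typing the row at address 00454...
seen["0226343"] = ("MRECIN", "ALF", "BTL", "MASTER RECORD IN")

print(lookup("0226343"))   # a hit: fields auto-filled (some still need overtyping)
print(lookup("0214155"))   # None -- with ~90% of codes unique, misses dominated
```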
I had to delve deeper!
Version 2: Chopped octal formulas
The next step was to chop the 7-digit octal code into three separate fragments, corresponding more closely to the three separate source code fields: “Opr.” (Operator), “X”, “Opd.” (Operand). Note that these weren't chopped neatly at whole-digit boundaries but at roughly 2 digits, 0.7 of a digit and 4.3 digits – the splits fall at the bit level rather than between octal digits.
Here are some sample octal codes chopped into fragments, which I've called: “W”, “X” and “Y”:
Suddenly the path forward became much easier! The “W” fragment returned a 2-digit octal code. It was an easy matter to create a separate lookup-table to convert those 2-digit octal codes into 3-character mnemonics:
The “X” fragment needed no further conversion – it was already the actual “X” value.
The “Y” fragment was somewhat trickier. Although its value was consistent, the way that value was formatted depended on the type of operator:
- An arithmetic instruction might imply a decimal format;
- A storage instruction might imply an octal format;
- A jump instruction might imply a relative address.
It took me a fair amount of experimentation until I had some formulas that almost always produced the correct code in the correct format.
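Assuming the 2 / 0.7 / 4.3-digit split corresponds to 6, 2 and 13 bits of the 21-bit code (my reading of those fractions, not a documented GE-225 fact), the chopping can be sketched with bit operations:

```python
def chop(octal_code: str):
    """Split a 7-digit octal code into the W / X / Y fragments."""
    value = int(octal_code, 8)            # 7 octal digits = 21 bits
    w = value >> 15                       # top 6 bits  ("2 digits")   -> operator
    x = (value >> 13) & 0b11              # next 2 bits ("0.7 digits") -> X field
    y = value & 0o17777                   # low 13 bits ("4.3 digits") -> operand
    return format(w, '02o'), x, format(y, '05o')

print(chop("0226343"))   # the W fragment then feeds the mnemonic lookup table
```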
Manually entering symbols
The next stumbling block was “symbols”. Remember earlier when I discussed the merits of using symbols instead of memory addresses? This feature really benefits anyone coding a program, but in turn hinders anyone trying to decode a program because all of those symbols have disappeared – having been converted into memory addresses during assembly.
In the end, I resorted to a similar tactic to the “symbolic pass” that was performed by the assembler (“pass 1”, as covered earlier). Working through every page of the printout, I entered only the symbols that I found in the “symbol” column, then created yet another lookup table out of them:
Actually, it was slightly more complicated than that because the source code also included countless examples of “relative symbolic addressing” (symbolic addresses plus or minus an offset).
For example, after reading a punched card and storing the results at “CRDIN”, separate data fields from that card can be accessed with relative symbolic addresses like “CRDIN+1”, “CRDIN+2”, etc.
Once the lookup table of symbols plus relative offsets was in place – behold – my formulas now yielded meaningful symbols instead of meaningless memory addresses!
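That symbol lookup, including relative offsets, boils down to "find the nearest symbol at or below the address". A sketch – MRECIN (00454) and TFREC (00520) come from this account, but CRDIN's address here is invented for illustration:

```python
# Symbol table rebuilt from the printout's "symbol" column (CRDIN address invented).
symbols = {0o454: "MRECIN", 0o520: "TFREC", 0o1000: "CRDIN"}

def symbolise(address: int) -> str:
    """Render an address as SYMBOL or SYMBOL+offset, else as plain octal."""
    earlier = [a for a in symbols if a <= address]
    if not earlier:
        return format(address, '05o')     # below all symbols: leave as octal
    base = max(earlier)                   # nearest symbol at or below the address
    offset = address - base
    return symbols[base] if offset == 0 else f"{symbols[base]}+{offset}"

print(symbolise(0o1002))   # a relative symbolic address, e.g. CRDIN+2
print(symbolise(0o454))    # an exact match: MRECIN
```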
Additional tables + formulas
The final tweak was adding in a lookup table of specific one-off mnemonics (all those starting with “25”) as well as “overrides”. These replaced the “theoretical” source code (normally produced by my formulas) with “actual” source code (anything that was more commonly seen in the actual printout).
Towards the end of my project, my GE-225 disassembler was giving excellent results. Entering just the 7-digit octal code generally produced source code that exactly matched the printout!
All that was required to finish off a line was to manually enter any optional “remarks”. These were typically short notes describing what the current line of code was doing. Occasionally there were several lines consisting solely of remarks, particularly the headings of “subroutines” that had been supplied by G.E. for system functions.
Speaking of which, the programmers at G.E. really went a bit overboard, documenting almost every single line of code. Man, that cost me a lot of extra typing!
Incorrect disassembly
My spreadsheet was now performing really well, with around 99% accuracy. I was perfectly happy with that because I know that disassembly is never an exact science, so some errors are to be expected. In fact, even with my most evolved formulas, re-entering the very first seven lines of the “S05” program still gave incorrect results!
Let's look again at how the very first line of source code “ALF BTL” was assembled into octal code:
The GE-225 mnemonic “ALF” means “alphabetic data”. In this example, the text “BTL” is encoded as character codes 22 (= “B”), 63 (= “T”) and 43 (= “L”). In fact, the first seven lines of the original source code were each encoding a group of three letters. Put together, they encoded the text string: “BTL001MSTFILE 200864”.
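That encoding is easy to reproduce. The character table below contains only the three codes quoted above (B = 22, T = 63, L = 43 octal); the full GE-225 character set isn't reproduced here:

```python
CHAR_CODES = {"B": 0o22, "T": 0o63, "L": 0o43}   # partial table: codes from this account

def alf(text: str) -> str:
    """Pack three characters into one 7-digit octal word, ALF-style."""
    word = 0
    for ch in text:
        word = (word << 6) | CHAR_CODES[ch]      # one 6-bit code per character
    return format(word, '07o')

print(alf("BTL"))   # the very first word of the "S05" program
```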
Now, let's look at how my disassembler converted that very first octal code “0226343” back into source code:
My simple disassembler didn't know the context of that first octal code, so my formulas assumed that it was an instruction. But that octal code was actually data, so the decoded result was clearly something totally different and meaningless. As I usually did whenever my formula produced the wrong source code, I simply overtyped it with the correct source code.
Luckily, such text strings using the “ALF” mnemonic proved uncommon within the source code (I only found four more similar examples). The majority of text strings appeared much later in the program (about 3/4 of the way through). These were instead encoded with the mnemonic “MAL”, which means “Multiple Alphanumeric”, or “PAL”, which means “Multiple Alphanumeric for Printer”.
For these, I took a lateral approach – switching from disassembly formulas to assembly formulas!
Up to that point, I'd been entering octal codes into the “octal” column and having the formulas produce the source code in the remaining columns. From here onwards, it made more sense to do the reverse. It was much easier to type the text (usually English words or sentences) into the “remarks” column, then use new assembly formulas to perform exactly what the GE-225 assembler would have done – filling subsequent rows with that text chopped into groups of three letters (in octal format):
That one switch to assembly formulas saved me heaps of typing. Entering a single text string of up to 45 characters would magically generate up to 15 rows of octal code!
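Those assembly formulas do essentially this: chop the text into groups of three characters and encode each group as one octal word. (The character table is again partial – just the codes quoted earlier – and this sketch assumes the text length is a multiple of three, glossing over padding.)

```python
CHAR_CODES = {"B": 0o22, "T": 0o63, "L": 0o43}   # partial, illustrative table

def text_to_words(text: str):
    """Encode text as rows of 3-character octal words, MAL/PAL-style.
    Assumes len(text) is a multiple of 3; padding is not handled here."""
    words = []
    for i in range(0, len(text), 3):             # chop into groups of three
        word = 0
        for ch in text[i:i + 3]:
            word = (word << 6) | CHAR_CODES[ch]  # one 6-bit code per character
        words.append(format(word, '07o'))
    return words

print(text_to_words("BTLBTL"))   # one octal word per group of three letters
```

A full 45-character string would generate 15 such words – which is exactly why this direction saved so much typing.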
Finally, I should also mention another reason that my disassembly formulas sometimes produced the wrong result. This usually related to how data values were defined.
We've just seen how text data can be defined in the source code using either “ALF” (up to 3 characters) or “MAL” / “PAL” (up to 45 characters). Likewise, numeric data can be defined in the source code using either “DEC” (decimal), “DDC” (double-length decimal), “FDC” (floating-point decimal), or “OCT” (octal). Each of these will be assembled into either one or two plain octal codes, with no hint as to their format.
Once again, my simple disassembler doesn't know the context of such octal data. Decoding everything as decimal was usually correct, but not always. As before, any incorrect results were simply overtyped with the correct source code.
Rating my GE-225 Disassembler
Conservatively, my spreadsheet probably only saved me half the keystrokes that I would otherwise have typed manually. That said, it significantly reduced the amount of human error, so much less follow-up proofreading was needed.
Although this spreadsheet was invaluable for my immediate needs, its usefulness is restricted to this one program (“S05”). If I were to now decode a different GE-225 program, I'd have to create a new lookup table with all of the symbolic addresses contained in that program. But even that's not too difficult – just perform one manual “symbolic pass”.
Speaking of which, when I first started decoding the “S05 1st Assembly” printout, I set aside the original printed symbolic tables “ST1” and “ST2”, focussing my initial efforts on the source code. With the benefit of hindsight, I realize that “ST2” would have been invaluable for creating the lookup table of symbols. Everything was right there – symbol names plus their memory addresses:
In the image above, I've added faint vertical lines to help separate the “address” of each entry from the “symbol” of the following entry. Had these been spaced apart rather than squashed together, I might have recognized this table for what it was. Oh well, lesson learned for next time...
It's kind of ironic that my Dad spent months learning how to code his first GE-225 assembler program, then around half a century later, his son (i.e. me) would spend months (on and off) learning how to decode that same GE-225 assembler program!
All in all, the project of transcribing the “S05” program 1st assembly was a fascinating exercise for me. Hopefully you also found it interesting.
The following resources are available for download: