My Technical Journey

I wrote this because much of my early work predates GitHub and modern tooling. I wanted to preserve the story while I still remember the details. What follows is the long-form version of my technical path from the late 1970s through today.

Early Foundations: TRS-80 Model I Era (late 1970s to early 1980s)

My introduction to computers began at home with a TRS-80 Model I, first with 4 KB of RAM and later upgraded to 16 KB. Everything was stored on cassette, so I learned early to type in programs from Creative Computing compendium books, debug by hand, and understand exactly what each line of code was doing. These sessions taught me patience, pattern recognition, and a sense of how software behaves at the smallest scale.

The Model I included a simple block graphics mode, and I experimented with drawing routines that used SET and RESET operations on the 128 by 48 display. This was my first exposure to thinking about algorithms that convert abstract ideas into visible output.
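
None of those little drawing programs survive, so the fragment below is only a present-day Python sketch of the idea: a 128 by 48 grid of on/off cells with SET and RESET style operations, used here to draw a simple box.

    # A present-day sketch that mimics TRS-80 block graphics:
    # a 128 x 48 grid of on/off cells with SET/RESET-style operations.
    WIDTH, HEIGHT = 128, 48
    screen = [[False] * WIDTH for _ in range(HEIGHT)]

    def set_point(x, y):        # analogous to BASIC SET(x, y)
        if 0 <= x < WIDTH and 0 <= y < HEIGHT:
            screen[y][x] = True

    def reset_point(x, y):      # analogous to BASIC RESET(x, y)
        if 0 <= x < WIDTH and 0 <= y < HEIGHT:
            screen[y][x] = False

    def show():
        # Render the grid as text, one character per cell.
        for row in screen:
            print("".join("#" if cell else "." for cell in row))

    # Draw the outline of a rectangle, the kind of routine described above.
    for x in range(20, 60):
        set_point(x, 10)
        set_point(x, 30)
    for y in range(10, 31):
        set_point(20, y)
        set_point(59, y)
    show()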

High School Programming: Model III and Algorithm Experiments (1983 to 1986)

In high school I used TRS-80 Model III systems with dual floppy drives and the Alcor Pascal environment. I wrote Pascal programs with graphics primitives, including a port of the Bresenham line drawing algorithm. I also wrote a PILOT-to-Pascal transpiler that parsed PILOT source and generated Pascal output, a shortcut for writing what would otherwise have been more verbose Pascal by hand. This was my first experience writing code that processed other code.
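
The Pascal source from that era is long gone. As a rough illustration of the algorithm I ported, here is the standard integer-only Bresenham line routine in present-day Python; it is the textbook version, not my original code.

    def bresenham_line(x0, y0, x1, y1):
        """Return the grid points on a line from (x0, y0) to (x1, y1)
        using only integer arithmetic, as in the classic algorithm."""
        points = []
        dx = abs(x1 - x0)
        dy = -abs(y1 - y0)
        sx = 1 if x0 < x1 else -1
        sy = 1 if y0 < y1 else -1
        err = dx + dy
        while True:
            points.append((x0, y0))
            if x0 == x1 and y0 == y1:
                break
            e2 = 2 * err
            if e2 >= dy:
                err += dy
                x0 += sx
            if e2 <= dx:
                err += dx
                y0 += sy
        return points

    # Example: the points approximating a line from (0, 0) to (7, 3).
    print(bresenham_line(0, 0, 7, 3))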

During this period I also became interested in randomness and statistical testing. I wrote my own pseudo-random number generators in Pascal and evaluated them by printing long letter sequences using ASCII offsets and looking for patterns. I borrowed a statistics text and implemented chi-square tests to evaluate the distribution properties of my generators.
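
The generators and the chi-square code are gone as well, so the sketch below recreates the idea in Python with illustrative constants: a simple linear congruential generator, its output mapped to letters through an ASCII offset, and a chi-square statistic computed over the letter counts.

    # A small linear congruential generator (constants are illustrative,
    # not the ones I used in Pascal).
    def make_lcg(seed, a=1103515245, c=12345, m=2**31):
        state = seed
        def next_value():
            nonlocal state
            state = (a * state + c) % m
            return state
        return next_value

    rng = make_lcg(seed=42)

    # Map each draw to a letter with an ASCII offset, as in the old tests.
    # Taking the draw modulo 26 exposes the weak low-order bits of a
    # power-of-two LCG, exactly the kind of flaw a test like this can catch.
    letters = [chr(ord('A') + rng() % 26) for _ in range(26000)]

    # Chi-square statistic against a uniform distribution over 26 letters.
    expected = len(letters) / 26
    counts = {}
    for ch in letters:
        counts[ch] = counts.get(ch, 0) + 1
    chi_square = sum((counts.get(chr(ord('A') + i), 0) - expected) ** 2 / expected
                     for i in range(26))
    print(f"chi-square over 26 bins (25 degrees of freedom): {chi_square:.2f}")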

I also participated in programming contests using BASIC. Many of the techniques came from long hours reading Creative Computing and other programming books, which helped me learn to debug quickly under pressure.

Transition to Unix and Remote Development (1986 to 1987)

After high school I gained access to USDA SCS AT&T 3B2 Unix systems. The sysadmin there granted me permission to dial in and work on my college programming assignments. At first I used an ADM-3A terminal connected through a 1200 baud modem on a 25-pin RS-232 interface. Later we used a TRS-80 Model III as the terminal so I could edit locally and dump buffers to the 3B2, which avoided tying up the phone line for the entire session; having two phone lines at the house made the workflow possible.

I uploaded programs using ed and cat redirection, and I avoided vi whenever TERM settings caused cursor issues. This was my introduction to multi-user Unix systems, terminal behavior, and remote development constraints.

TCJC Era: Formal Programming Foundations (1986 to 1991)

At Tarrant County Junior College I studied K&R C and Fortran 77. My mother and I took the C class together, which meant both the syntax and the Unix tools became part of our daily routine. I completed most of the C coursework on the USDA SCS 3B2 systems rather than on campus machines.

Fortran 77 coursework took place in two environments. One was the IBM 4381 mainframe in the campus computer lab, which used batch-style submission and taught me the IBM toolchain. Later in the semester, as access became difficult, I began compiling and running Fortran 77 code on the same 3B2 Unix systems I used for C. This showed me how the same language behaves across two very different computing cultures.

During this time I attempted a Z80-based linear congruential generator using multi-precision routines from a Lance Leventhal book. I wrote MACROP, a macro processor written in Turbo Pascal that supported nested macros for Z80 and 6502 assembly source. I also took an 8088 assembly class, but by the end of this period C and Unix had become my main direction.
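
MACROP itself has not survived. The fragment below is a minimal Python sketch of the central idea, nested macro expansion, using an invented @NAME@ invocation syntax rather than MACROP's actual one.

    import re

    # Hypothetical macro table; MACROP's real syntax and tables differed.
    MACROS = {
        "SAVE_REGS": "PUSH AF\nPUSH BC\nPUSH HL",
        "RESTORE_REGS": "POP HL\nPOP BC\nPOP AF",
        "PROLOGUE": "@SAVE_REGS@\nLD HL, 0",          # a macro using a macro
    }

    PATTERN = re.compile(r"@(\w+)@")

    def expand(text, depth=0):
        """Expand @NAME@ references, recursing so macros may use macros."""
        if depth > 16:
            raise RecursionError("macro nesting too deep")
        def replace(match):
            body = MACROS[match.group(1)]
            return expand(body, depth + 1)
        return PATTERN.sub(replace, text)

    source = "ORG 0100H\n@PROLOGUE@\nCALL MAIN\n@RESTORE_REGS@\nRET"
    print(expand(source))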

Lone Star Comics: Warehouse and Reporting (1991 to 1993)

I joined Lone Star Comics in 1991 in the warehouse, learning operational workflows for back-issue management, receiving, inventory flows, and SKU-based systems. These experiences informed the tools I later built for the company.

Lone Star Comics Programming: Early Web Engineering and Inventory Tools (1993 to 1997)

I wrote inventory and sales tools in FoxPro, created budgeting and reporting utilities, and built embedded C programs for Symbol PDT barcode scanners used by inventory teams.

In 1995 I built one of the first online back-issue search engines using C-based CGI and CSV exports from FoxPro. I created several pre-sorted datasets for fast lookup and supported keyword searches with AND and OR logic. The search engine generated HTML directly and operated entirely on flat files. Wayback Machine snapshots from 1996 and 1997 show the interface essentially unchanged after I left in 1997.
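
The original engine was C CGI reading flat files exported from FoxPro, and none of that code survives. The sketch below shows the core AND and OR matching idea in Python against a hypothetical CSV layout; the field order and file name are invented.

    import csv

    def matches(title, terms, mode="AND"):
        """Keyword match against a title: all terms (AND) or any term (OR)."""
        text = title.lower()
        hits = [term.lower() in text for term in terms]
        return all(hits) if mode == "AND" else any(hits)

    def search(path, terms, mode="AND"):
        # Hypothetical CSV layout: title, issue, grade, price.
        results = []
        with open(path, newline="") as handle:
            for row in csv.reader(handle):
                if row and matches(row[0], terms, mode):
                    results.append(row)
        return results

    # Example: every row whose title contains both keywords.
    for row in search("backissues.csv", ["batman", "annual"], mode="AND"):
        print(row)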

NHIN AR Data Center and PDX: Enterprise Systems Development (1997 to 2002)

I joined the NHIN AR Data Center inside PDX in 1997. My work involved maintaining and extending a proprietary C-like language for pharmacy accounts receivable systems. I wrote data import programs in C, D-ISAM, and Sybase stored procedures. I supported staging pipelines, generated datasets for external industry clients, and worked with Java servlets and JSP pages. I maintained the AIX development environment and used Perl extensively for text processing and automation.

PDX: Perl and Data Transformation Work (2002 to 2013)

In later years at PDX I continued working with the proprietary language but relied heavily on Perl, shell scripting, MySQL, and PHP for internal tooling. I created XML generators, parsers, and nightly automation processes that ran quietly in production. I became the department contact for regular expressions and text transformation work. I worked within Agile practices using Rally.
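
Those generators were written in Perl and belonged to PDX, so nothing below is the original code. As a rough Python analogue of the nightly XML generation, this is the kind of row-to-document transformation the jobs performed; the element names and sample rows are invented.

    import xml.etree.ElementTree as ET

    # Hypothetical rows standing in for a nightly extract.
    rows = [
        {"account": "1001", "balance": "250.00"},
        {"account": "1002", "balance": "0.00"},
    ]

    root = ET.Element("accounts")
    for row in rows:
        node = ET.SubElement(root, "account", id=row["account"])
        ET.SubElement(node, "balance").text = row["balance"]

    # Write the document the way a nightly job might drop it for pickup.
    ET.ElementTree(root).write("accounts.xml", encoding="utf-8",
                               xml_declaration=True)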

Core-Mark Fort Worth: Operations and Perl Automation (2015 to 2020)

At Core-Mark in Fort Worth I worked as an AS/400 operator supporting nightly operational workflows. I wrote Perl scripts to automate rebate credit processing by converting text-based reports into structured CSV files used by accounting systems. This eliminated manual entry steps and reduced errors.
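
The production scripts were Perl and the report layout was Core-Mark's own. This Python fragment only sketches the general pattern, pulling matching lines out of a text report with a regular expression and writing them as CSV rows; the line format shown is invented.

    import csv
    import re

    # Invented example of a rebate credit line in a text report:
    #   "CUST 004512   REBATE   125.37   2019-08-02"
    LINE = re.compile(r"CUST\s+(\d+)\s+REBATE\s+([\d.]+)\s+(\d{4}-\d{2}-\d{2})")

    def report_to_csv(report_path, csv_path):
        with open(report_path) as report, open(csv_path, "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["customer", "amount", "date"])
            for line in report:
                match = LINE.search(line)
                if match:
                    writer.writerow(match.groups())

    report_to_csv("rebates_report.txt", "rebates.csv")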

Core-Mark Plano Data Center: Python and Analytical Work (2020 to present)

After transferring to the Plano Data Center I began using Python and SQL for internal analysis and automation. My work includes data cleanup scripts, batch automation, reporting tools, and support for operational systems. Python is now my primary tool for transforming and restructuring data. I also maintain personal projects involving Whisper transcription pipelines, ffmpeg automation, metadata extraction, and Grav CMS utilities.
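
As one example of that personal tooling, metadata extraction can be done by wrapping ffprobe, which ships with ffmpeg. The sketch below is a minimal version of that idea; the file name is a placeholder and the script is illustrative rather than one of my actual utilities.

    import json
    import subprocess

    def media_metadata(path):
        """Return ffprobe's JSON description of a media file as a dict."""
        result = subprocess.run(
            ["ffprobe", "-v", "quiet", "-print_format", "json",
             "-show_format", "-show_streams", path],
            capture_output=True, text=True, check=True)
        return json.loads(result.stdout)

    info = media_metadata("lecture.mp4")
    print(info["format"].get("duration"), "seconds")
    for stream in info["streams"]:
        print(stream["codec_type"], stream.get("codec_name"))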

Unifying Thread

Across all phases of my technical life the pattern is consistent. I build tools that automate work, transform data, and reshape information from one form into another. The languages have changed from Pascal and C to Perl and Python, but the underlying problems have remained the same. Most of the tools I create run quietly in the background converting messy or unstructured input into something usable.