Annoying Computer News
The BBC has a short piece on the latest threat, Shellshock: ‘Deadly serious’ new vulnerability found.
There is a relatively simple exploit that allows attackers to take control of computers through Bash [the Bourne Again SHell], which affects Apache servers, multiple flavors of Linux, and Mac OS X machines. A patch has been released, but there are questions about the completeness of the ‘fix’.
Bash has been around since at least 1990, so this problem is buried in some dense C code, which explains why it took so long for the exploit to surface, and why it may not be easy to fix.
On a personal note, they have added insult to injury by installing a third fiber bundle across the street from me, and I still have only two choices for broadband Internet services. The name of the contractor was ‘Fibore’, which is too sophisticated for Alabama, so they may be from up North.
There are actually two vulnerabilities: Shellshock (CVE-2014-6271) and Aftershock (CVE-2014-7169). The second exists because the bash fix for CVE-2014-6271 was incomplete, and command injection is still possible after that patch has been applied. They are being tracked here:
CVE-2014-6271: Remote code execution through bash
If you are using Ubuntu (or a derivative), you can use this to get a patched update:
sudo apt-get update
sudo apt-get install bash
And for those with Apple’s OS X (note that this has been updated to address both vulnerabilities):
Every Mac Is Vulnerable to the Shellshock Bash Exploit: Here’s How to Patch OS X
The following can help you determine if you are vulnerable:
For Shellshock (CVE-2014-6271), enter this into a bash shell:
env x='() { :;}; echo vulnerable' bash -c "echo this is a test"
If you get this, you have a successfully patched bash:
bash: warning: x: ignoring function definition attempt
bash: error importing function definition for `x'
this is a test
If you get this, you are vulnerable:
vulnerable
this is a test
For Aftershock (CVE-2014-7169), enter this into a bash shell:
env var='() {(a)=>\' bash -c "echo date"; cat echo; rm -f echo
If you get this, you have a successfully patched bash:
bash: var: line 1: syntax error near unexpected token `='
bash: var: line 1: `'
bash: error importing function definition for `var'
date
cat: echo: No such file or directory
If you get this, you are vulnerable (Note that the last line will be your current date/time):
bash: var: line 1: syntax error near unexpected token `='
bash: var: line 1: `'
bash: error importing function definition for `var'
Fri Sep 26 09:20:00 UTC 2014
Ubuntu just issued a bash patch this morning. I assume they work closely with Debian on updates, but I still have to patch the Raspberry Pi. I’ll keep it away from the ‘Net and patch it by waiting for a new version and building a new card, which is easier than doing it over the ‘Net. It is interesting to play with the Pi, but the OS is limited and doesn’t have a number of the utilities that I’m used to, so I cheat and do some things on the big box and transfer them over. My other option would be to write them myself, but it has been a while since I have done that kind of coding.
[Sunday update: another bash patch this morning.]
There are actually web applications that execute bash in response to URLs? For real? Even though we’ve been telling people since 1995 that executing shell commands from a CGI script is wrong and evil?
— Badtux the Aghast Penguin
It is people like that who make these exploits possible. Without the unwitting [or witless] assistance of sloppy, lazy coders, many of these attacks would fizzle. Giving people access to the shell on the server is just such a bad idea that you have to wonder what these people were ingesting.
I had a plaque in my office in SoCal: You can’t make anything foolproof – Fools are too ingenious.
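For what it’s worth, the safer pattern is to never let a shell parse the request at all. A minimal Java sketch (the program path and file name are invented for illustration): the arguments go to the program as a list, so no bash ever runs, and an env-var payload like Shellshock’s has nothing to trigger it.

import java.io.IOException;
import java.util.List;

public class NoShell {
    public static void main(String[] args) throws IOException, InterruptedException {
        String userFile = "upload.jpg"; // pretend this came from a request
        // The command is a list of arguments, never a string handed to `sh -c`,
        // so nothing re-parses user input as shell syntax.
        Process p = new ProcessBuilder(List.of("/usr/bin/file", userFile))
                .inheritIO()
                .start();
        p.waitFor();
    }
}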
Ah, idiots. The other stupid thing programmers do is feed user-inputted parameters into database engines. One of the first things I asked our QA department to do was run a standard SQL injection suite against the few pieces of information that get turned into SQL statements rather than fed to Hibernate (which has its own SQL injection protection engine). The template engine I was using for the SQL was supposed to escape/sanitize to prevent such attacks. It did. Substituting a username from the login prompt directly into a SQL statement to query the user database? Man. That would have been *stupid*. I mean, SQL injection attacks have been known about for over 20 years now; what kind of moron writes software susceptible to them? Every other moron out there, apparently, since I regularly hear of web sites compromised via SQL injection attacks (generally creating an admin super user with a known password). SIGH!
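For anyone following along, the difference between injectable and safe is one line. A minimal JDBC sketch (the table and column names are invented): the commented-out form splices user text into the SQL itself, while the placeholder form binds it as pure data.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class LoginLookup {
    // BAD: user input becomes part of the SQL text, so a username like
    // x' OR '1'='1 rewrites the query:
    // String sql = "SELECT * FROM users WHERE name = '" + username + "'";

    // GOOD: the ? placeholder is bound by the driver as data, never as SQL.
    static ResultSet findUser(Connection conn, String username) throws SQLException {
        PreparedStatement ps = conn.prepareStatement(
                "SELECT * FROM users WHERE name = ?");
        ps.setString(1, username);
        return ps.executeQuery();
    }
}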
Over the years I have probably spent more actual programming time validating user input to protect the data and the program – garbage in, garbage out. The fact that my first real civilian computer jobs dealt with accounting, where bad data could have a direct financial and occasionally legal impact on my employer/client, made me hyper-aware of the need to ensure that anything that went into the program had to be valid as to type, purpose, and range.
For me, the weak data typing in languages like C was a major annoyance, but something you had to program around. Like ‘garbage collection’, it was something you had to be aware of, and certain of, before you could send a chunk of code out for beta testing.
I have a collection of modules to do various types of validation for every language I have ever done any serious coding in, because it is SOP. It is especially important when you are working as part of a team, because the code feeding into your module can send you bad parameters and you need to provide a ‘graceful response’ with a meaningful error message.
It doesn’t require bad intent for a user response to screw up your program; if you don’t trap control keystrokes, typos can do a job on you.
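A minimal sketch of that kind of validation module in Java (the field, its range, and the messages are invented): check type, purpose, and range, and hand back a meaningful error instead of blowing up downstream.

public final class InputCheck {
    // Returns null when the input is valid, or a human-readable error message.
    public static String checkAge(String raw) {
        if (raw == null || raw.isBlank())
            return "Age is required.";
        final int age;
        try {
            age = Integer.parseInt(raw.trim());   // type check
        } catch (NumberFormatException e) {
            return "Age must be a whole number, got: '" + raw + "'";
        }
        if (age < 0 || age > 130)                 // range check
            return "Age must be between 0 and 130.";
        return null;                              // valid
    }
}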
Yes, SQL injection attacks are a very old problem and blocking them should be SOP, not something special.
They make libraries to do this for you nowadays — libraries that have been thoroughly vetted for coverage and completeness. I’m using one of those libraries to do validation prior to passing the values in to SQL, and the GUI team is using another of those libraries to validate at the JavaScript level. That can of course be bypassed by doing a manual POST to the REST URL, which is why I do my own validation on the back end with a standard SQL templating engine that properly escapes/validates all inputs to make sure they can’t commit a SQL injection attack.
Yet every day we hear about SQL injection vulnerabilities… for example, WordPress’s plugin all-video-gallery had a vulnerability in config.php because of the following code.
"SELECT * FROM ".$wpdb->prefix."allvideogallery_profiles WHERE id=".$_pid
Where $_pid was sent in via the REST URL and received no — zero — validation.
What.
The.
FUBAR.
Nobody should be building raw SQL queries in today’s day and age. Nobody. It just should Not Be Done. A parameterized template engine should be used. Otherwise you’re likely to goof and shoot yourself in the foot by forgetting to check a parameter, something that is utterly impossible with a parameterized template engine. For example, here’s a sample SQL call in my application:
siteList = dbms.Rows("siteLookupService.findByName", ["siteName": site_name])
The first parameter tells which SQL template I’m using, which is:
siteLookupService.findByName: select * from site where name = '${siteName}';
The second parameter says what to substitute, i.e., ${siteName} gets substituted by whatever is in site_name. The template engine itself escapes any SQL-special characters in site_name at the time it does the substitution. Yes, I thoroughly tested this via attempting SQL injection, it doesn’t work.
The point is that I couldn’t accidentally create a SQL injection here even if I were deliberately trying, because everything goes through the template engine. Nobody ought to be building raw SQL queries from scratch in today’s world. Nobody. All it takes is one mistake and you’re screwed. Everything *must* be forced to go through a template engine that thoroughly scrubs all input prior to substituting it into the SQL statements, because the alternative is insanity. (Note that the ‘dbms’ in my statement is actually a template engine instance, and attempting to do a straight SQL query through it simply won’t work — only the template engine has access to the raw SQL database connection in our application.)
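The same named-parameter idea is available off the shelf; a sketch using Spring’s NamedParameterJdbcTemplate (the DataSource and the site table are assumed), where :siteName plays the role of ${siteName} and the driver does the binding:

import java.util.List;
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;

public class SiteLookup {
    static List<Map<String, Object>> findByName(DataSource ds, String siteName) {
        // :siteName is a bind parameter, not text substitution, so
        // SQL-special characters in siteName are inert data.
        return new NamedParameterJdbcTemplate(ds).queryForList(
                "select * from site where name = :siteName",
                Map.of("siteName", siteName));
    }
}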
Morons. We are surrounded by morons. GAH!
That’s why I would love to see more scrutiny of the code in WordPress to check for errors and security issues. It is too easy for someone to take a ‘shortcut’ and introduce a weakness in the code in order to get it out the door on schedule.
The whole idea behind OOP is to build using vetted, known-good code, not to wing it with whatever comes to mind. You plan the solution rather than hacking away until it seems to work.
I have no idea what they may be teaching in IT courses today, but I’m not impressed with the results.
We had an intern right out of CalPoly, one of the most prestigious engineering schools on the planet. Bright enough guy, but it was clear that he knew nothing about object-oriented programming. “You should not have a class with all static methods,” I told him. He asked, “Why?” I said, “So you can do code re-use and use polymorphism to extend the class for other object types.” He was utterly baffled. I asked him, “they didn’t teach anything about object-oriented design at CalPoly?” He said “Nope.”
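What I was trying to get across, in sketch form (the class names are invented): a class of nothing but statics gives you nothing to extend, while an instance method lets a subclass change behavior without touching any caller.

// All-static version: nothing to extend, nothing to substitute.
// class ReportWriter { static String write(Order o) { ... } }

// Instance version: subclasses override the parts that vary.
abstract class ReportWriter {
    String write(Order o) { return header() + o.summary(); }
    abstract String header();                 // the polymorphic bit
}

class PdfReportWriter extends ReportWriter {
    String header() { return "%PDF-1.4\n"; }
}

class Order {
    String summary() { return "1 widget"; }
}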
Sigh. Doomed. Doomed I say. We are utterly doomed. He didn’t even know what the dinosaur book was — one of the fundamental texts of computer software engineering. They taught no — zero — software engineering courses at CalPoly, apparently. It was all abstract mathematical theory. SIGH!
Sounds like he was being prepared to be a quant, not an IT guy. Abstract theory is useful, but someone has to create the code to use it. Most of the people involved in the creation of COBOL were mathematicians, but they learned to branch out.
It almost appears that they are so interested in being on the cutting edge that they are ignoring the firm foundation needed to keep from falling off. I was working in the field for quite a while before I went back and got the appropriate degree to make HR happy. The ‘kids’ in my classes had very little historical knowledge of hardware or software, and little actual knowledge of how a computer functions. It isn’t absolutely necessary, but it sure makes debugging odd errors easier.
Anybody who read the dinosaur book (by which I mean, naturally, The Mythical Man-Month) would have predicted the fiasco that was healthcare.gov. The classics are the classics *because* they remain relevant, not because they’re old. I was lucky to attend a middle-tier university that tried to strike a balance between theory and practice. But that was then, and this is now. One of my office-mates is attending a middle-tier university which is graduating Computer Science grads who can’t program *at all*. WTF?! The only reason he’s attending is that he wants the sheepskin to satisfy future HR drones; he’s actually been in the field for some years now. The disdain he has for his fellow students and for his professors is palpable.
Theory is useful and vital for future developments, but if you want a job, you have to be able to do something that a business needs – like write programs that serve the goals of the business. I took my courses at night, so most of my instructors were adjuncts who spent their days working in the field locally. They not only taught courses, they were also a source of job referrals if their company needed people. At least a few of the programming ‘exercises’ smelled like something they were working on that needed a fresh approach.
After I developed the habit of adding a musical ending to programs that would run on PCs by making assembler calls, people avoided me. Man, I was bored. Writing CICS scripts/programs for SQL on IBM DB2 was not exactly a world of excitement, and people learned I wouldn’t give them an answer; I would only remind them how to find it themselves.
Now you’re telling me that the degree has become worthless because the graduates don’t actually know how to do anything useful.
I think that’s mostly true. Any IT diploma/degree is a limited means of determining actual skill level. On the weekend, I was helping the neighbor’s 14-year-old son with some homework for his IT class. It’s the first year of what is supposed to be entry-level computer skills. He didn’t even know what RAM was; he thought ‘memory’ was the USB pen drive. He also had a lot of misinformation, and had little idea about the most fundamental building blocks of a PC, let alone any specifics. They are learning to program in BASIC, with very little, if any, understanding of what a computer is or how it works.
If you think s/w is bad today… it will get worse.
I taught myself during the 70’s. I built my first computer from a kit, the EDUC-8 (pronounced ‘Educate’), where I had to solder the components to the boards myself. It didn’t use a dedicated CPU (which is why I wanted to build it; I wanted to understand how a CPU actually worked), it used discrete TTL & CMOS ICs. 🙂 I learned a lot, including machine code. By the 80’s, I was writing code in Basic, BLISS, COBOL, Fortran & Pascal. My favorite was BLISS. 🙂 Developed in 1970 by Carnegie Mellon University and used by DEC until the 90’s, it was one of the first truly optimizing compilers, able to extensively optimize code for performance and size. HP continued using it in-house after acquiring DEC, mainly because DEC had developed a 64-bit version for the Alpha & IA-64 but never released it. 🙂
Did you ever read the classic “The Design of an Optimizing Compiler” (1975)? That was based on BLISS. 🙂 One of the great things about BLISS was that it was very difficult to make errors. It didn’t tolerate syntax or logic mistakes at all! 😀 You had to try really hard if you wanted “buggy” code. It either worked, or it didn’t. 🙂
Now, everything is too hard apparently, and the basics no longer matter. Thinking no longer matters. The kids are not taught how to think, they are taught what to think. What to memorize, to learn by rote. Welcome to the scary new World! *shrug*
They have to teach BASIC because Bill loved BASIC, which is why Visual BASIC is used so extensively at M$. University intro courses dropped BASIC and started using Pascal to teach programming fundamentals because it was a more structured language that avoided a lot of the bad habits you can acquire using BASIC.
I used mock-ups of a half adder to teach people how computers ‘think’. I had one with switches, one with relays, and finally a transistor version to show some of the history of computing. I found that people finally saw the virtues and limitations of computers when you showed them what was really going on. Then I showed them the Boot procedure for a Data General Nova 3 [one of the machines at the college] to demonstrate why they were learning this. You had to use toggle switches to load the initial instructions into memory, which activated the paper tape reader to load the instructions to read the hard drive. It made them appreciate the boot ROM on a PC.
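The logic itself fits in two gates, which is why the half adder makes such a good teaching mock-up; a minimal Java sketch for anyone who never saw the switch version:

public class HalfAdder {
    public static void main(String[] args) {
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++) {
                int sum = a ^ b;    // XOR: 1 when exactly one input is 1
                int carry = a & b;  // AND: 1 only when both inputs are 1
                System.out.printf("%d + %d -> sum=%d carry=%d%n", a, b, sum, carry);
            }
    }
}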
You have to understand the underlying principles and learn the necessary vocabulary to really be successful. Great texting skills and familiarity with smartphone apps don’t make kids computer literate – they make them competent users.
Yeah. 🙂 The EDUC-8 used switches and had a discrete LED display. 🙂 It cost about AU$300 as a kit back in ’74. A lot of money then! I was 15 when the project started in ‘Electronics Australia’. I had some money saved (about $180 from memory) from selling newspapers after school, and I started two other jobs before school: one helping the Milkman (an Uncle) make his deliveries, and delivering newspapers on my bike. I’d figured out that customer service pays. Happy customers give good tips! 😀 So, instead of just throwing the paper into the yard (as most did), I’d actually put it on the doorstep, and if it was raining and there was no cover, I’d put it in a plastic bag. Eventually, I started finding envelopes with “paperboy” on them. Usually 5c or 10c, sometimes 20c (the paper was only 5c then!). That Xmas, I had a few envelopes with a thank-you card and $1 or $2, and one with a nice note and $5! The number of people getting papers delivered had increased also. Mom said that I should ask for a “pay-raise”, so I did! The Newsagent manager just laughed, so I said I quit and walked out! A week later, he came to my home and told Mom some people on my old route said that if I wasn’t back, they would cancel. LOL So I got my raise. 😀 I learned at a very early age!
By mid-’75, I had enough to buy all the tools and the kit. Even an antistatic mat and strap, and a cheapish analog multimeter.
Once I’d got it working and learned all I could, I designed a paper tape reader/punch interface, which I submitted to EA, and it was published. 🙂 Some months later, Jamieson Rowe (editor of EA and designer of the EDUC-8) contacted me to help with a new project: a keypad and 7-segment LED octal display interface (which was 2 boards)! I learned heaps! It was he who suggested I should do a COT (Certificate of Technology) diploma and then either an Electronics or Industrial Design degree. The COT was 3 years and VERY intense! It combined electrical, electronics and mechanical disciplines, but unlike Uni it was 40% theory/60% practical. And the pass mark was 75%!
By the late 70’s/early 80’s, we had 3 major component/kit suppliers: Dick Smith Electronics, Ketsets Aus. and Heathkit. I just about lived in those stores! LOL
Intro courses at the university level nowadays use Java, which at least isn’t BASIC, but does hide the details of the actual physical machine underneath its enormously complex “virtual” machine model. The result is programmers who haven’t the slightest idea of the cost of their algorithms. For example, there was one algorithm that was sometimes taking up to 20 minutes per batch to process a record set for one customer. I sat down and cleaned up the algorithms, replacing a lot of linear searches with a batch hash table load and hash table lookups, and that same record set which took 20 minutes to process now takes 20 seconds. Uhm, yeah. That sort of tells you just how bad it is. Sigh!
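The fix, in sketch form (the record type and field names are invented): load the lookup side into a hash table once, then each probe is O(1) instead of a full scan, turning an O(n*m) batch into O(n+m).

import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BatchMatch {
    record Customer(String id, String name) {}

    // One pass to build the index; each later lookup is constant time.
    static Map<String, Customer> index(List<Customer> all) {
        Map<String, Customer> byId = new HashMap<>(all.size() * 2);
        for (Customer c : all) byId.put(c.id(), c);
        return byId;
    }

    public static void main(String[] args) {
        Map<String, Customer> byId =
                index(List.of(new Customer("42", "Acme")));
        System.out.println(byId.get("42").name()); // no linear search
    }
}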
I did Heathkit and Radio Shack, plus the ads in computer magazines. I even burned a few boards for some of my projects, but had to stop that when the people who took the used chemicals stopped dealing with individuals. That stuff was too hazardous to just dump in the sewer.
My Dad was interested in the new technology and certainly had the experience after fixing missile guidance systems forever, so I have been soldering things for a very long time.
Do you know of a good book, Badtux, that teaches Java programming and is not just a listing of functions and commands, or a ‘cookbook’? My brother is interested but hasn’t found a decent book yet. Everything he has seen ignores the whole and concentrates on the parts. This seems to be the new trend in computer books – no instruction on how to create a functioning program, or the structure of the overall code.
Java can be better than BASIC, but you still need to be taught how to program, before you concentrate on any particular language. The principles have to be learned to write reasonable code, and that includes sorting, even if Knuth is not the most exciting writer in the world.
In 2000 I was asked by RMIT to teach an evening Java class for a 4-week intro course. I declined on the grounds that they were a bunch of crooks who had no chance of finding anyone except a complete idiot, given their unreasonable requirements! But that’s another story! 😉 😀
Anyway, whilst preparing (before I realized I was about to be screwed big time), I found this list. It was quite useful, and may still be. After all, the basics haven’t changed much if at all. It’s a review of 7 introductory Java books from 1999. I know at least some have been updated since. I liked it partly because it included a useful comparison table as a quick reference. Anyway, it may be a starting point. 🙂
In search of the best Java book for beginners
In any case, JavaWorld is a good Java resource. 🙂
I have others more recent, but not on this system. I’ll post later if I find anything useful. Or badtux may have something better. 🙂
He has been in the business a very long time and was working at Sun when Java came out, but he was on the business side and didn’t do a lot of code. He has gone back into coding in retirement to keep busy.
He is looking for something to learn the good practices for creating apps and applications and not just to make things move around on the screen.
Unfortunately, Bryan, I learned Java via the school of hard knocks. As in, I joined a hardware company because I was an expert on the particular kind of hardware they built, and the first thing they did was say “Oh, and here’s this massive hunk of Java code, we need you to implement functionality X, Y, and Z for controlling our hardware.” From there it was keeping the Sun Java docs open in my browser window while I learned enough of the language from reading the existing program to function. Java is vaguely C-ish, so that worked for me: I mapped Java concepts on top of C concepts, mapped the object model to the Python object model (which I’d just finished mastering), and went from there.
Regarding best practices, those are the same regardless of the computer language (at least when you’re talking about object-oriented computer languages); I use the same best practices in Java that I use in Ruby, Python, or C++. Perl and PHP are a bit… special… because their syntax is so loose and makes it so easy to shoot yourself in the foot, so I use a stricter set of rules there. There are rules applicable to particular problem domains, such as “separate out the business logic from the presentation” (otherwise changing the presentation is a pain in the butt when people want the GUI to look or operate differently), and “make everything an API except presentation”, but those are all language-independent too, though some languages make it harder than others (PHP, GRR!!!!).
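In sketch form (the names are invented), “separate the business logic from the presentation” just means the logic never knows it is being displayed, so the GUI can change without touching it:

// Business logic: pure computation, no HTML, no JSON, testable on its own.
interface AccountService {
    long balanceCents(String accountId);
}

// Presentation: formats whatever the service returns; swap this layer freely.
class AccountPage {
    private final AccountService service;
    AccountPage(AccountService service) { this.service = service; }

    String render(String accountId) {
        long cents = service.balanceCents(accountId);
        return String.format("<p>Balance: $%d.%02d</p>", cents / 100, cents % 100);
    }
}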
I learned it the same way, except I had the printed Sun manuals [and a promotional coffee cup]. They were advertising it as a cross-platform, more secure replacement for C in its first iterations, and it did start out much closer to the C model.
The early versions were too slow for the work I was doing so I left it behind.
Perl and PHP are well beyond the borders and need to be seriously reined in to be reliable. They absolutely require defined standards to be maintained and to ‘play well with others’.
I’ll pass along the Javaworld recommendations that Kryten mentioned.
Yeah, the early versions of Java were *ridiculously* slow! As in, pretty much unusable. Then they got the JIT bytecode compiler working in 1.4 and suddenly it was usable. Still slow, but usable. The latest versions have been optimized even more, plus the hardware is ridiculously faster now than the Pentium 233 that I was using back then, so using Java is no longer a penalty box. Still not useful for stuff that needs direct access to hardware, of course. That’s why “C” exists…
I was using 1.0, so you know it was slow unless you were using Sun equipment.
C was definitely a welcome replacement for the assembler I was occasionally having to write to get things to work at a decent speed.
Java is definitely useful these days, so I can understand his interest.
The first version I used was 1.1. It was… ludicrously… slow. I cannot imagine how slow 1.0 must have been…
“C” is, of course, a macro assembler for the PDP-11 ;).
Part of the COT course was learning how to design and make your own PCBs. We had to make our own sealed UV table and etching tank (a narrow fish tank with a thermostatically controlled heater and an air pump feeding a silicone tube with 2 rows of small holes, equally spaced and sealed at the end). It was in a sealed cabinet with an air filter/extraction system, because while air bubbles speed up the process, they also blow the vaporizing etchant into the air! A bad thing! There are much safer etchants now, even environmentally safe ones. 🙂
So long as you only need a single- or double-sided board, that is. Multilayer boards have to be specially manufactured, of course. But back in the late 70’s, they were not even a dream yet! LOL
I made my own solder station with an old electric thermostat-controlled frying pan for the solder bath, and a drill press with an adjustable board holder. 🙂 You could buy a cool tool that would snip and crimp the component leads at the same time. Stopped things falling out while soldering! 😀
Those were the fun days! 😀
As for Java, yeah I started with v1. It sucked (and not just speed!) LOL
LOL badtux! 😀 Actually, DEC used BLISS mostly for the PDPs, and for VMS (all the OS utilities anyway). 🙂
You didn’t need the sweep second hand on your watch to time program execution with 1.0, Badtux 😉
I did the tabletop version with an old plastic dishpan and some tools for developing photographs as well as a surplus military gas mask, because even without the air bubbles the etchant was foul and I couldn’t move air fast enough to avoid it. My neighbors thought I was really strange when I did it. I had conversations with the local cops and firemen on more than one occasion, but it wasn’t illegal and I was following the manufacturer’s directions. I even took out my soft lenses and wore glasses because I wasn’t sure the stuff wouldn’t affect them.
Fortunately there were press-on designs for ICs and other components, so I didn’t have to depend on my nonexistent drawing skills for anything other than lines. Later I mastered the use of the system where you mimicked the photographic process, but it didn’t save as much time and effort as I thought it would for one-off projects.
Both Unix and C owe their existence to an unused PDP-7 in the bowels of Bell Labs, but that’s just history and can be ignored. [What some people will do to play games is amazing.]
LOL yeah. 😀 True about the PDP-7 (and the games bit!) 😉 😀 When I started the COT, we had a PDP-11/03. One of the first things a group of us did was port the original “Colossal Cave” text adventure game to the PDP-11 running RSX-11 AND make it multi-user! Took 4 of us about 8 months of our spare time. The IT Manager allowed us to have access after school hours & weekends, and he got involved about midway through. 😀 We learned a hell of a lot! 😀 Then we ported Star Trek, but it wasn’t finished because the school got a brand new VAX-11/750 from DEC! And that was far more interesting! My group hacked the supposedly unhackable VAX in a month! LOL It was easy when you know hardware and electronics. From a purely s/w standpoint, the VAX was pretty secure. 😉
Hey… want a laugh? (or a job in Canada?) 😉 😀
I know from working for HP in the late 90’s that they were pissed because they wanted to kill the PDP-11 deader than purple suede shoes! But GE and a couple of other major companies had news for them! LOL They said they planned to use them until 2050! And that was that! LOL I did a quick Google and found this posted to a Canadian IT board last year:
“ChrisGE (Join Date: Jun 2013)
Greetings from GE Canada!
I would like to reach out to you to let you know about a fantastic opportunity in Peterborough Ontario Canada for a PDP-11 programmer. The role supports the nuclear industry who has committed to continue the use of PDP-11 until 2050!! Yes I know this is a hard-to-find (existing) skill. We will also consider programming experience with other assembly language. If you are interested, or know of anyone who is, please feel free to email me at chris[.]issel[@]ge[.]com
Thanks!”
(email edited by me).
I am still laughing over that! (I don’t like HP either!) LOL
Actually, Mentec in Ireland have the LSI-11 rights. 🙂
Drafting was a big part of the COT! And VERY strict! They measured everything! Even the text had to be a precise size and shape! Spaces between lines, thickness, radius of bends, pad size & shape were measured, and woe betide you if you hadn’t properly taken current or LCR into account! I am (smugly) proud that I scored top marks in that (98% – a high distinction)! LOL 😉 And it did come in very handy over the years.
At the school I taught at, I administered the PDP-11, but it had AT&T Unix on it [free education license]. It was a breeze compared to working with the original Data General Nova 3 because it had the memory and disk resources to keep everyone moving at the same time, as well as a 9-track tape for loading and back-up. It could also output to the line printer at full speed. We shifted a lot of the student load off the IBM 360 and got to move away from cards for everything except the COBOL and CICS courses. Good times, and no reason to just dump them – if it ain’t broke, don’t fix it. The PDP-11 has been the standard system for nuclear power forever, and there is no good reason to risk something new.
I took drafting in high school and had the tools, but my skills were only adequate as far as I was concerned, even though I took top marks. Now I can do the job with decent software programs that can make up for my weaknesses. I know what I want, but my hands don’t want to go along with my mind when I try it freehand.
The ability to click and have the output go to the plotter is a wonderful thing for me.
Kryten, Dennis Ritchie created the “C” language when he and Ken Thompson were porting Unix from the PDP-7 to the PDP-11, because they didn’t want to re-write everything into a different assembly language again; it’d been enough of a PITA the first time. That is why virtually every concept in “C” maps directly into PDP-11 assembly language. I used to have the PDP-11 architecture manual somewhere around here (it was the computer we learned assembler on back in the late 70’s at the university that I attended), and looking at the original “C” programming language, almost every statement in the language (even auto-incrementing pointers being used to walk down character arrays) mapped to a single PDP-11 instruction.
Thus why I call “C” the world’s most sophisticated PDP-11 macro assembler ;).
What that does emphasize, however, is just how pleasant and well designed the PDP-11 really was, since the “C” language is now the world’s most popular language for writing low-level code (thanks to the success of Linux, amongst other things, but still) despite not mapping well onto the hacked-up Intel assembly language (which is ridiculously nonorthogonal and a PITA to optimize for). Nowadays I only drop down to assembly language when I’m dealing with something ridiculously limited like a PIC chip that has 4K words of program memory and no timers or interrupts, where everything I do has to be cycle-counted state machines to make the timing work. But that’s another story.