Tuesday, December 13, 2022

2022 Was Kind of a Quiet Year

I didn't really get as much done as I'd hoped, but looking back through it, there were still a few cool things. Here's just some of the better stuff...

Learned Yocto in order to implement a Coral AI accelerator for a client.

Took a trip to Vancouver, another to my hometown, and one last one to Los Angeles to close out some business. But Covid still really restricted personal travel this year.

I did continue with the UFW work on Second Life. I suspect I am spending too much time with this, and I am going to need to decide - fun or projects. ;) Some pissy people have me questioning whether it's actually fun anyway. I'll decide by year end. 

I added some enhancements for vision-impaired users to my TI emulator, Classic99. It has background noise (which on the real console was an interesting indicator of activity), and I added a screen reader that works most of the time! It received a bit of attention that was quite pleasing! I have only one vision-impaired user, but he went so far as to create a podcast describing it and demonstrating its use.

Because I was worried about losing access to my domains at the end of last year, I started up Fossil Line Systems (https://fossillinesystems.com/). The hope was to use it for local contracting, though I've been pretty busy. I also ended up not losing access to my domains, so I am a bit torn whether to keep HarmlessLion active or not. I have to admit I quite like the new logo, and I like the concept of advancing.

I did a couple of other contract projects, too. There was a little bit of small animatronic work for a local production that I can't really talk about yet... but I also got to work on a brilliant new animatronic. It was fun to work on a big machine again, and with a skilled team that knows what they are doing.

I guess the last notable thing I did wasn't really something I had to do myself at all - but I sold Cool Herders as a property to Piko Interactive. It was really kind of them, and I need to work on some extra tasks for them as a condition of it. Kind of a relief anyway, since now I can strike several potential projects from my todo list, hehehe. ;)

Anyway... yeah. Really quiet year. I think I need to step it up in 2023. ;)

Thursday, October 6, 2022

How I started in Software

I'm writing this instead of sleeping. Be grateful! ;)

I was thinking about why I get asked so many questions about software, and what makes my answers any better than someone else's (the answer is battle scars, in case you wondered). And that got me thinking back to my early days.

When I got out of the armed forces in '95, I knew that I wanted to work on software. Computers were what I was best at, after all. I bought my first PC (before that I was staunchly anti-PC, going through my TI, Atari ST and Amiga 2000 and defending them vociferously). Without Windows 95, I might have kept that attitude, honestly... but that's another story.

I had found a free small C compiler - GCC what? For DOS? Hah! But this compiler let me build simple programs with 64k code and 64k data (because that's how the 8088's segmented memory worked). I was largely working from books, which is like printing out a wiki and sticking the pages together, but with no search.

Anyway, this was the period that I published on my 5MB website my epics Super Sales Acer, Super Sled Acer and Super Shooting Acer. You can actually find them on archive.org in playable form, and of course still on my website (nothing I have released is abandonware, and anyone who says otherwise is lying to you.)

I'm going to link Super Sled Acer, cause it's relevant:


For some reason, this game received my first professional review. I can't remember where, or who the reviewer was, and sadly this was before I started printing everything off. But hoo-boy, was he mad! I remember phrases like "before Doom and anything good, this is what shareware meant", and "fun for maybe 5 seconds, then you turn it off to play anything else".

I was baffled... who asked him to review it? But from my standpoint, I quite liked it. I still do, actually. First off, it was a port of my TI game, with muchly improved graphics and animation. Secondly, it's a really silly game. And thirdly it was my first go at joystick controls. Complex? No. But I defy you to pass the last stage.

Anyway, while deciding what to do about it, I noticed that the system requirements he had posted were ridiculous. As I noted, I only had an 8088 and a 64k compiler. As such, I wrote my games to run in CGA or VGA mode (detected automatically!), and only a PC XT was needed. Sound support was limited to a simple PC speaker beep. But he had listed 4MB RAM and a 486 PC with VGA and a Sound Blaster card.

I wrote in and pointed out that they were welcome to say what they liked about the game, but they had grossly misrepresented the minimum system specs. They updated the specs as I requested, but about a week later the review was gone. I guess bashing a PC XT game in 1995 wasn't quite as exciting after all.

But anyway, time went on. I began to think that I needed to get more notice. I did something which is probably still the most audacious thing I've ever done. I wrote up a flyer describing myself as the best of the best for software, and I faxed it out (yeah, remember fax? Of course not, you're way too young) to every game company I could find information for.

I wish I had kept that, too. But my hard drive was always full, and I deleted a lot of things from those days. Those were the "A gigabyte? Yeah, THAT'D be nice" days.

Anyway, I only got one contact. A recruiter for Visual Concepts reached out, and we had a number of pretty nice conversations. I sent them samples of my code - the above mentioned games, and my work-in-progress TI emulator from the Amiga (Ami99, now Classic99, could boot the raw console and run TI BASIC and A-Maze-Ing, very slowly.)

Most of the feedback was pretty decent. They felt my code showed inexperience (valid). They thought it was all very simple code -- I asked what was wrong with simple code and the recruiter didn't know. We'll come back to this.

Everyone's comments were generally very constructive, though. Except for one fellow. Again... I really brushed this guy the wrong way. All of his comments were ridiculously hostile. To his credit, the recruiter filtered nothing, though I think he was a little embarrassed when I asked what this one guy's problem was. The one feedback that stuck with me was "The worst programmer in the world could have written this!", about Classic99. Even then I thought it was a stretch to think that the worst programmer could create a functioning CPU and video core from scratch -- or even plot a color pixel on a bit-plane display for that matter. But on the other hand, Classic99 is still around and Visual Concepts is not, so what (apparently) do I know? ;)

I wondered for a long time what set that guy off, and I think that one statement really clarifies it. My fax promised I was the best programmer in the world, and this guy apparently felt that was a challenge of some sort. I guess I sort of get it, but even I wouldn't be that mean about it.

I ended up getting a local job installing networks for local businesses. This was a good way to get a lot of PC experience quickly - I mean, on my first week I needed the boss to help me understand how to sys a C drive to get DOS back up. But it wasn't long before I was doing everything myself. It was a good company and will always be remembered fondly.

But I still wanted to write software, and just before I darted off to my first Canadian software job, a good friend stole me away down south to work in the States. I was really excited, until I got there and realized I didn't even know C++ yet. I can still remember sitting there with a C++ book (remember, the printed wiki with no search function), and going through it step by step. I wrote test programs I could step through until I understood how classes worked, how inheritance worked, what happened when you derived a class, and so on. All the same, I was really, really depressed. Maybe this was all too much for me, I feared.
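The kind of test program I mean looked something like this - a minimal sketch (not my original code, and the class names are made up) of the sort of thing you can step through in a debugger to watch virtual dispatch happen:

```cpp
#include <cassert>
#include <string>

// Base class with a virtual function, to see how dispatch works
class Animal {
public:
    virtual ~Animal() {}
    virtual std::string speak() const { return "..."; }
};

// Derived class overrides the virtual function
class Dog : public Animal {
public:
    std::string speak() const override { return "woof"; }
};

// Calling through a base reference picks the derived version at runtime
std::string speakVia(const Animal &a) { return a.speak(); }
```

Stepping into speakVia() with a Dog and watching it land in Dog::speak() instead of Animal::speak() is exactly the kind of "oh, NOW I get it" moment those little programs gave me.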

But it wasn't, and I did master it, and a number of other techs I'd never seen before. From that telephony background, through to robotics and animatronics, I learned that I never get to work on a project I know how to do. So the first couple of weeks are ALWAYS miserable as I wonder whether I can actually learn this. So if you are feeling that, JUST EMBRACE IT.

So I've got about 25 years professionally now, and a good 5-10 before that depending on how much of it you want to count. What's the biggest thing I've learned?


And you can read those last two words in every context, they all work.

But seriously. The downfall of code is complexity. The code you write needs to make sense when you write it. It needs to make sense when you read it six months from now. It needs to make sense to the next person who reads it. You have to be able to debug it.

Simple code meets all of these in the easiest manner. It's generally easier to write. It's easier to understand. It's easier to debug (particularly because simple code lends itself well to testing just parts of it). And nine times out of ten, the compiler can make it run faster too.

The whole art of programming is condensing complex tasks into simple steps. Don't make your own task harder. Bed time now.

Tuesday, September 13, 2022

Cool Herders Graphics Experiments

Way back in the day, I created a silly little demo for the Sega Dreamcast using the then-new KOS development kit. My buddy Binky came up with the design and all the artwork, and we put the original demo together in like two weeks.

Let's zoom in a little on what's going on there...

We've got some nice overlaps going on there. Boxes overlapping the ground, trees overlapping the ground and boxes (both above them and beside them). There are probably lots of ways to do this, but I'm going to tell you how I did it. It's actually pretty basic.

The Dreamcast was the first 3D hardware I worked with, and knowing that, I made all the graphics out of polygons. Each tile/sprite is just two triangles using flat projection, but with the Z axis tilted slightly. The top of the tile is closer to the screen than the bottom. As a result, for free, we get nice overlaps of the sprites and the tiles. It's as if they are actually (slightly) standing up!
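Roughly, the idea looks like this - a minimal sketch with hypothetical names and constants, not the actual Dreamcast code:

```cpp
// A vertex as submitted to the 3D hardware
struct Vertex { float x, y, z; };

// Tilt each tile slightly on the Z axis: the top edge of a tile sits
// closer to the camera than its bottom edge. A tile lower on the screen
// (larger y) then naturally overlaps the bottom of the tile above it.
float tileDepth(float screenY, float screenH, float depthPerPixel) {
    // Lower on the screen = closer to the camera (smaller depth value)
    return (screenH - screenY) * depthPerPixel;
}

// Build the four corners of one tile quad with the tilted Z
void makeTiltedTile(float x, float y, float size, float screenH,
                    float depthPerPixel, Vertex out[4]) {
    float zTop    = tileDepth(y, screenH, depthPerPixel);
    float zBottom = tileDepth(y + size, screenH, depthPerPixel);
    out[0] = { x,        y,        zTop };
    out[1] = { x + size, y,        zTop };
    out[2] = { x,        y + size, zBottom };
    out[3] = { x + size, y + size, zBottom };
}
```

Since the hardware's depth buffering sorts everything per pixel anyway, the overlap behavior really does come for free - no extra draw ordering logic needed.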

This caused some people to make funny faces at me, but nobody ever suggested a simpler idea. (Some people suggested multiple textures for different situations, but why?? Extra work, extra video RAM, extra processing... for what?)

So this was on the Sega Dreamcast. We had 8MB of video RAM... so we ended up deciding on a basic system with 3 pages of tiles (originally the three had meaning, but in the end only the third page is special). Full color (I think it was 16-bit on the DC), and lots of wasted space. Each page has 4 frames of animation that play continuously, and none of this causes the Dreamcast to even notice that you are there. ;)

Of course, some levels have more graphics than others, but I'm sticking with the NZ stage for now.

So this worked fine, and we released a working game and had some fun with it.

After a while, I started working on porting and updating the game, and we landed on the Nintendo DS. Going from 8MB of video RAM to 768KB (with a few limitations on how to use that memory) required the wasteful old layout to be revised. However, the DS's 3D hardware was still up to the task of all those tiles, so I was able to use the same actual layout. We opted to just zoom the screen in and add a radar, so that the original 640x480 resolution was still honored on the smaller 256x192 display. (This also helped the 3D system, as I only render the polygons that are actually in the viewing rectangle.)
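The visibility test is just a rectangle overlap check - something like this sketch (hypothetical names, not the shipped code):

```cpp
// Axis-aligned rectangle in screen space
struct Rect { int x, y, w, h; };

// True if the tile's bounding rect overlaps the viewing rectangle at all;
// tiles that fail this test never get submitted to the 3D hardware.
bool isVisible(const Rect &tile, const Rect &view) {
    return tile.x < view.x + view.w &&
           tile.x + tile.w > view.x &&
           tile.y < view.y + view.h &&
           tile.y + tile.h > view.y;
}
```

Cheap tests like this one are why the zoomed view helps: most of the 640x480 map falls outside the 256x192 window on any given frame, so most tiles cost nothing.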

Eventually, I wrote a project that parsed the Dreamcast tiles and condensed the texture pages, removing duplicate tiles and making a lookup table that could be used when the code asked for one of the original tiles. We reduced the background color depth to 8 bit as well, which still looked pretty good!

(Still a bit of wasted space, but it was the closest power of two size that fit all the levels. Incidentally, this tight packing is why the game has trouble on DS emulators. They have an off-by-one bug in the texture mapping for flat projection that causes the textures to pull one more pixel than the hardware does. After years and years this has never been fixed, except in DrasticDS for Android.)

With that, the final game pretty much kept the original graphics on a much smaller system. The DS version also added special attacks and improved the story.

(All that said, the DS was actually powerful enough to render the whole screen, as this very early test shot shows!)

Recently, then, I got a challenge and decided to see if I could get this going on the Gameboy Advance. Now, we're talking slightly more restrictions here. Only 64k of video RAM available to tiles, oh, and it's TILE based, not 3D.

I reduced the color depth to 16 colors for this... and after the first pass, I was able to use a different palette for the destructibles compared to the ground tiles, giving me 32 colors overall (well, really 30, since color 0 of each GBA palette is transparent).

Originally, I was considering hand-calculating the tiles, but I didn't think the CPU would be up to the task. It might have, but then I remembered, oh yeah! There are two layers.

So originally, I thought that would do it. I'll put the top halves on layer 2, and the bottom halves on layer 1. I can put the sprites in the middle.

But, then I realized... oh, wait, the destructible objects (crates, etc) need to go on top of the ground, but under the tops. So fine, three layers then. No problem.

That /almost/ worked. I didn't grab a screenshot, unfortunately, but where the leaves overlap on the left, one top tile was deleting the previous one and causing corruption (only on specific animation frames!!). But this was easy to resolve - I used the fourth and final layer and alternated the top layer between even and odd tiles.
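The alternation itself is trivial - a minimal sketch, assuming we alternate by tile column and that the two top layers are background layers 2 and 3 (the actual layer assignment is my guess):

```cpp
// Ground and destructibles take two of the GBA's four background layers,
// leaving two for the overlapping tile tops. Alternating tops between
// those two layers by column means two adjacent tall tiles never fight
// over the same tilemap cell on the same layer.
int topLayerFor(int tileColumn) {
    // even columns -> layer 2, odd columns -> layer 3 (hypothetical mapping)
    return (tileColumn % 2 == 0) ? 2 : 3;
}
```

Since two neighbors always land on different layers, the overlap that caused the corruption can't happen anymore.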


Check out the layers!

(The four separate layers)

(Combined layers 1 and 2, and 3 and 4).

To get here, I have a new script that processes the original Dreamcast graphics, converting the 48x48 pixel blocks (!!) into the 8x8 character tiles that the GBA works with (a 6x6 grid of them per block). The lookup process is just barely fast enough on the GBA, and will probably need a little bit of optimization.

The new tool also looks for duplicate tiles, and so it can actually find more duplicate graphics (at 8x8) than the DS version did (at 48x48, though the DS version removed empty space).

In the end, all 6 levels, with /almost/ all the graphics, fits in about 350KB, and a single level's tiles fit in a single character set on the GBA (32k or less). I had to say 'almost', because the Toy Factory has so much animation that I had to remove a few tiles. Fortunately, no animation was removed - just the paint on the factory floor (there was a whole tile page of arrows), and three of the five colors of gift boxes.

Anyway... that's all I had to write tonight. I was just pleased to see it come together on such a different system. We'll have to see where it goes in the end!

Thursday, July 28, 2022

Necrobiotics My Spinnerets...

If you don't like spiders, don't read this. ;)

Lots of buzz over the relatively recent release that some researchers are using dead spiders as grippers. There's literally nothing groundbreaking here - they inflate the spiders' legs, causing them to extend, then release the pressure and they contract. That's how spiders walk in the first place.

They spend some time trying to justify it - oh, it's biodegradable, they can sometimes lift more mass than the mass of the dead spider, etc, etc. They even invented a term - necrobiotics.

But it's all kind of bunk, isn't it? First off, there isn't a massive issue with grippers filling up landfills as disposable parts. So who cares that they are biodegradable? In fact, let's talk about that part!

I didn't see any mention of the number of useful cycles, but a typical gripper is going to rate this in the tens of thousands, if not hundreds of thousands. Lifespan of a typical gripper will be measured in years.

The spider starts decaying the moment they kill it (and yes, they kill the spiders they use for this experiment). The useful lifespan - before the bladder is damaged or the creature simply becomes too dry to operate - is going to be measured in hours. Which means that at least every day, to keep your machine operating, you need to capture a spider, kill it without damaging it, carefully inject the actuator into the correct bladder, seal the hole, and then put your machine back together. Maybe you can get quick at that, but it seems like a lot of effort!

They talk about using them for pick and place machines, which need to rapidly and accurately pick up, position and place thousands of parts an hour. It's hard to believe that using dead spiders is going to revolutionize the already-very-simple-and-reliable grippers that are used for this. (Suction, if I understand correctly...) And there's no indication about how long the extremely tiny and fragile hairs used to actually make things stick to their feet will last without life to renew them.

That last point got me thinking. Clearly, the answer is not zombie spiders, but borg spiders. If we can implant a small computer that is capable of controlling the spider, we may have something. No, not as a pick and place, that's stupid. But for other things. Reconnaissance, for instance. I dunno what else. Targeted pest control, maybe. ;)

My thinking there is that the computer is able to drive the spider, as well as receive sensor feedback. But most importantly, it's able to turn off the interface and restore the spider to natural operation. The advantage of this is that, presumably, with appropriate rest breaks, the spider will naturally feed itself, and its biological processes will naturally repair the normal wear and tear of operation.

Morally, of course, this falls on the dark side of science. One might imagine that during periods of computer control, the spider would be experiencing a living hell. We might well be able to determine whether or not spiders have any sentience - any sense of self. If they did, I can only imagine the horrors they would try to cope with when the computer turns off.

I'm (fortunately) far too busy to create borg arachnids. Partially because spiders creep me right the hell out. And besides, for a pick and place, clearly ants are a far better choice. ;)

Wednesday, February 23, 2022

You're Doing it Wrong - Allocations

One of the things that frequently amuses me/drives me into a rage is the modern mentality that any old design pattern is bad, so don't learn it. Then code breaks, we old timers nod and say "yes, that sounds about right", and we get back "why is programming so hard? I'm going to fix it so it's easy!"

The article I'm reading right now is about a fellow who documents his battle to understand a kernel level crash that ultimately boiled down to a race causing a use-after-free. For those who don't know, this is exactly what it sounds like. You allocated some memory, then you freed it, then you accessed that pointer after the free. This is very bad and causes everything from security exploits to system crashes, because after a free the system is allowed to do anything it likes with that memory location, including re-allocating it, or marking it illegal because the address space is needed elsewhere.

Use-after-free is controlled by having a clear memory policy. This is a design concept wherein you have a set policy that defines unambiguously who owns a block of allocated memory at any given time. Some systems may have multiple owners - in this case you need a management system such as smart pointers. Some systems may pass ownership from one function to the next - in this case it's usually wise for the code which is losing ownership to forget that pointer as soon as possible. For instance, you can null it out - then you /can't/ accidentally use it. There is literally no reason to remember an address once it has been freed - null that pointer. 
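In C++, the "forget the pointer" rule looks something like this sketch (hypothetical names; the point is the ownership handoff, not the specific API):

```cpp
#include <cstdlib>

// One owner at a time: whoever holds the non-null pointer owns the block.
struct Message {
    char *data;
};

// Transfer ownership out of 'm'. The losing side forgets its pointer
// immediately, so a later accidental use through 'm' is a null deref
// (loud and easy to find), not a silent use-after-free (hard to find).
char *takeData(Message *m) {
    char *p = m->data;
    m->data = NULL;   // the side losing ownership forgets the pointer
    return p;
}

// The new owner is now unambiguously responsible for freeing the block.
void destroyData(char *p) {
    free(p);
}
```

Notice there's no cleverness here - the whole benefit comes from the policy being applied without exception, so anyone reading the code knows exactly who owns what at every line.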

This also helps a lot with memory /leaks/. If you know who owns a block of memory, then that implies quite strongly who is responsible for /deleting/ that block when it's no longer required.

This is all a design concept - nothing happens in code until you decide it. And once decided, no exceptions. Exceptions cause bugs. And besides, if you need exceptions, that means your design doesn't fit the application and needs to be rethought. My personal observation is that my design isn't right till the third time I implement it. Less than that, and I get nervous. It's like having your code build and run the first time - you should be thinking 'Oh no. What did I miss?' ;)

Stable, reliable code beats buggy code that's 0.001% faster on Tuesdays. EVERY. SINGLE. TIME. You'll thank yourself when those good night sleeps start taking the place of early morning panic calls and late night debug sessions.

Although I do agree with how the fellow ended it. I'm not naming him cause my rant isn't really his fault, and it ended up a little off topic anyway - but kudos, bud.

"Just go to sleep, because everything is broken anyways :)"