Saturday, December 6, 2025

Solar Ceilings and Compounding Dreams

It is fashionable to wave away physical constraints with vague references to solar abundance and human ingenuity. Yet every balance sheet eventually meets a balance of energy. Solar photons may shower Earth with roughly 170,000 terawatts, but financial markets expect growth that compounds on top of itself forever. The math linking those stories rarely appears in the same paragraph—so let’s put them together.

Setting the Stage

I keep coming back to Tom Murphy’s dialogue in Exponential Economist Meets Finite Physicist. In Act One, Murphy plots U.S. energy use from 1650 onward, and it traces a remarkably straight exponential line at ~3% per year. The economist in the conversation shrugs; after all, 2–3% feels modest. But even 2.3% compounding multiplies energy demand tenfold every century, and 3% nearly twentyfold. Our economic models implicitly assume something far more optimistic: 8–10% returns in equity markets, pension targets, and venture decks, without asking what energy supply function supports that.
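
To make the mismatch concrete, here is a minimal sketch of the compounding arithmetic; the 2.3% and 3% rates come from Murphy’s plot, and the 8% stands in for the equity-market assumptions:

```python
import math

def century_multiplier(rate):
    """Factor by which a quantity grows over 100 years at a fixed annual rate."""
    return (1 + rate) ** 100

def doubling_time(rate):
    """Years for a quantity to double at a fixed annual rate."""
    return math.log(2) / math.log(1 + rate)

for label, rate in [("energy at 2.3%", 0.023),
                    ("energy at 3%", 0.03),
                    ("equities at 8%", 0.08)]:
    print(f"{label}: x{century_multiplier(rate):,.0f} per century, "
          f"doubles every {doubling_time(rate):.1f} years")
```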

Thermodynamic Guardrails

Murphy distills the second law of thermodynamics into plain language:

“At a 2.3% growth rate (conveniently chosen to represent a 10× increase every century), we would reach boiling temperature in about 400 years… Even if we don’t have a name for the energy source yet, as long as it obeys thermodynamics, we cook ourselves with perpetual energy increase.”

That thought experiment matters less for the literal 400-year timer and more because it shows energy growth must decelerate to avoid turning Earth into a heat engine. Solar panels, fusion, space mirrors … pick your technology. The waste heat still has to radiate away. We cannot spreadsheet, app, and AI our way around the Stefan–Boltzmann law and black-body radiation.
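
A back-of-the-envelope version of Murphy’s calculation, sketched in Python. The constants are standard values and the 18 TW starting point is approximate; the exact crossover year shifts with the assumptions, but it lands in the same few-century ballpark as the quote:

```python
# If all energy used at the surface must ultimately radiate away, the surface
# temperature follows the Stefan-Boltzmann law. A sketch, not a climate model.
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/m^2/K^4
EARTH_AREA = 5.1e14      # Earth's surface area, m^2
T_NOW = 288.0            # mean surface temperature, K
P_NOW = SIGMA * EARTH_AREA * T_NOW ** 4   # power Earth currently radiates, W

growth = 0.023           # 2.3% per year, i.e. 10x per century
p_human = 18e12          # today's human power use, ~18 TW

for years in (100, 200, 300, 400, 500):
    p_total = P_NOW + p_human * (1 + growth) ** years
    t_surface = (p_total / (SIGMA * EARTH_AREA)) ** 0.25
    print(f"after {years} years: surface ~{t_surface - 273.15:.0f} C")
```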

Solar Arithmetic vs Demand Curves

Let’s grant the optimists a heroic build-out: cover 5% of Earth’s land area with 20%-efficient photovoltaic arrays, assume a generous 200 W/m² of average insolation (so roughly 40 W/m² of electrical output), and we net roughly 300 TW—about fifteen times today’s ~20 TW of human primary energy demand. That is fantastic news for decarbonization, but it is not a blank check for compounding GDP. If demand keeps growing at 3%, we would need 20 TW × (1.03)ⁿ in perpetuity: demand overruns even that heroic ceiling within a century, and within 250 years we’d be trying to harvest tens of thousands of terawatts—orders of magnitude more land, materials, storage, and transmission than our initial miracle project. Solar abundance is real; solar infinity is fiction.
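
The crossover arithmetic, using the figures above:

```python
import math

# Assumptions from the paragraph above: ~20 TW of demand today, growing at 3%,
# against a heroic ~300 TW photovoltaic ceiling.
demand_today = 20.0      # TW
solar_ceiling = 300.0    # TW
growth = 0.03

years_to_ceiling = math.log(solar_ceiling / demand_today) / math.log(1 + growth)
print(f"ceiling reached in ~{years_to_ceiling:.0f} years")      # ~92 years

demand_250 = demand_today * (1 + growth) ** 250
print(f"demand after 250 years: ~{demand_250:,.0f} TW")         # ~32,000 TW
```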

Finance Is an Energy IOU

Money is a claim on future work, and work requires energy. When pensions assume 7–8% annual returns, when startups pledge 10× growth, and when national budgets bake in permanent productivity gains, they are effectively promising that future societies will deliver several doublings of net energy per century, even after generous assumptions about efficiency gains. If we instead hit a solar plateau—because land, materials, or social license cap expansion—those financial promises become unmoored. We can pretend that virtual goods, algorithmic trading, or luxury desserts (to borrow Murphy’s Act Four anecdote) deliver infinite utility without added energy, but the chefs, coders, and data centers still eat, commute, and cool their CPUs, GPUs, and tensor processors. The intangible economy rides on a very tangible energy base.
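
To see what a return assumption implies for energy, a rough sketch; the decoupling rates are hypothetical stand-ins for how much faster GDP might grow than energy use:

```python
import math

nominal_return = 0.07                    # a typical pension assumption
for decoupling in (0.00, 0.02, 0.04):    # hypothetical annual efficiency gains
    energy_growth = (1 + nominal_return) / (1 + decoupling) - 1
    doublings = 100 * math.log(1 + energy_growth) / math.log(2)
    print(f"decoupling {decoupling:.0%}: energy grows {energy_growth:.1%}/yr, "
          f"~{doublings:.1f} doublings per century")
```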

Rewriting the Business Plan

Accepting a solar ceiling does not doom us to stagnation. It just forces different design constraints:

  • Grow quality, not quantity—prioritize outcomes per unit energy … reward proof of useful work rather than rolling the dice.
  • Align finance with expected energy supply rather than mythical exponentials … and I am not talking of wasting energy on crypto.
  • Treat efficiency gains as buying time, not as a perpetual motion machine … if you learnt enough physics in high school to reject the perpetual motion machine but have been lulled into perpetual 8% returns from the finance markets, that is a serious case of cognitive dissonance.
  • Embed thermodynamic literacy in economic education so debates start from the same math.

Murphy ends his essay noting that growth is not a “good quantum number.” It is not conserved. Our job is to craft institutions, portfolios, and narratives that can thrive when net energy flattens, because physics already told us that day will arrive long before our spreadsheets hit overflow errors.

Darwin 2022 - Ruminations Compendium

Collected reflections from the July 2022 Darwin trip, so the narrative of adaptation, organisational change, and expansion can live in a single place.

July 19 – Lemmings And Launchpads

There is no exception to the rule that every organic being naturally increases at so high a rate, that if not destroyed, the earth would soon be covered by the progeny of a single pair. Even slow-breeding man has doubled in twenty-five years, and at this rate, in a few thousand years, there would literally not be standing room for his progeny. — Charles Darwin

Like the lemming marching and diving into the ocean to self‑regulate, humanity plunges itself into vices of its own creation: alcohol, drugs, violence, and greed. Perhaps the next plunge is into the real ocean or into the vacuum of space, chasing more room in which to stand or float. Failure in harsh environments creates room by removing weaker individuals, or builds greater resilience by rewarding the most adaptable. Colonial Australia itself was founded on such selection—the most adaptable individuals and the strictest rule enforcers reshaped an unforgiving frontier.

July 20 – Organisational Evolution In Flight

Seeing that a few members of such water-breathing classes as the Crustacea and Mollusca are adapted to live on the land, and seeing that we have flying birds and mammals, flying insects of the most diversified types, and formerly had flying reptiles, it is conceivable that flying fish, which now glide far through the air, slightly rising and falling by the aid of their fluttering fins, might have been modified into perfectly winged animals. — Charles Darwin

The ability to skim over water for a few metres comes from external tweaks, but the ability to cross the Pacific like a bar-tailed godwit comes from internal rewiring: hollow bones, high metabolism, and a brain with a built‑in compass. Organisations face the same distinction. A brief digital-transformation spasm can bolt on an app or a website, yet sustaining that flight demands internal metamorphosis and a sense of direction from leadership. Caterpillars become butterflies through wholesale change—so must companies that aspire to be more than flying fish.

July 23 – Questions For The Corporate Naturalist

  1. Where are the transitional forms?
    Organisations with no lines on the org chart operate as pure adhocracy. Hidden behind corporate veils, they are like pupae in cocoons, waiting to emerge in a more defined shape.
  2. How can specialised organs evolve?
    Marketing machines, technology muscle, sales teeth, enterprise-planning backbone, analyst frontal lobes—each department is an organ honed for a specific survival task.
  3. Is behaviour or instinct inheritable?
    Culture answers this. The rituals, stories, and incentives that survive layoffs and leadership changes become the genetic code of the firm.
  4. Why are some species sterile when crossed, while others are fertile?
    Some mergers and acquisitions thrive; others fail because the two organisational genomes cannot integrate and diverge instead of hybridising.

July 24 – Conquering New Lands

He who believes in the struggle for existence and in the principle of natural selection, will acknowledge that every organic being is constantly endeavouring to increase in numbers; and that if any one being vary ever so little, either in habits or structure, and thus gain an advantage over some other inhabitant of the country, it will seize on the place of that inhabitant, however different it may be from its own place. — Charles Darwin

International expansion is a contest for ecological niches. Bringing hard‑won optimisations from one country to another is a bid to displace incumbents. The organisations that vary—by process, by product, by mindset—claim new ground first.

Saturday, November 5, 2022

Drop shipping products from PCBWay


For a while I have been ordering PCBs from PCBWay and parts from Mouser and Digikey, then hand-assembling them at home. These have been very small-scale, cottage-industry-style runs, and ultimately time-consuming as I focus more on design and evaluation of new energy-monitor ASICs such as the V9261F. When PCBWay started offering to stock and drop-ship my PCBs directly from the factory, using their extensive clout with DHL, I promptly signed up for the service.

Recently I have been getting my ATM90E36 Devkit PCBs assembled there. The service has been excellent, with concierge-like parts selection and purchase, followed by extremely helpful consultation on assembly progress and correctness.

I received the following images after the first stage and confirmed the crystal and LEDs.

Then I received some more inspection photos to allay any doubts.

Looking forward to the stock appearing on the shop front.

NOTE: This is a paid promotion of PCBWay services

Sunday, December 27, 2020

Trucks vs Trains as an analogy for Microservices vs Monoliths

For me, 2018 and 2019 were mostly spent obsessing over containers, trucks, trailers, and hand-written paper invoices. I was helping build out the technology stack and engineering team for Lori Systems. Early in 2019 we made our first DevOps hire, getting Clive from Safaricom and getting started on migrating our hand-rolled Django monolith from EC2 to EKS. We would make jokes about shipping containers using containers. Clive even had a container-shaped stressball with the EKS logo on it. This set me thinking on the parallels between shipping code and shipping goods, and perhaps also laid the foundations of this post.

Intermodal Shipping in the real-world and in software

Over almost two years of work in logistics I learnt a lot about how the global logistics system works; it is almost like the life-blood of the planet. Large container ships abstract away contents and ship things from Taiwan to Timbuktu. The seminal book on this topic is perhaps The Box. Watching global shipping lanes in Marine Traffic and scraping ships arriving in Mombasa from the KPA SharePoint became a daily ritual. I digress; back to the original point on the importance of containerization in shipping code or machinery.

Docker uses the ubiquitous whale/ship logo; most containers arrive at ports this way from the oceans of developers. I don't quite have an analogy here for the massive ships that land the containers at ports, some 500 or 1,000 TEUs at a time. The analogy here covers the land transport aspects, somewhat related to how code runs in production and is typically served to users via datacenters / public clouds.

Containers themselves make the transfer of goods/code from development (ships) to production (trains/trucks) easy. However, even containerized applications can demonstrate tight coupling similar to what a train has, in effect being a distributed monolith instead of a true suite of microservices. In my opinion, any system that requires a release-train approach for new features is most likely a distributed monolith masquerading as microservices. The real flexibility comes from the low coupling between containers and the freedom to release each clearly delineated service at its own cadence on the roads.

Trains are awesome

My 5-year-old is currently obsessed with steam engines, even though they are from an era long gone. There is something magical about a powerful engine pulling everything along smoothly on a set of constraints (rails). It works nicely as long as no quick changes are needed in the carriages and everyone wants to get to the same destination. Trouble arises when something in the closely coupled chain of components goes awry and requires a quick change. I still don't understand the scene in Snowpiercer where a few wagons were dumped in a siding at speed; if we could do that one neat trick, perhaps monoliths would become much more maintainable.

In the early stages of a product, monoliths are a nice, simple entry point, especially if the features are narrowly scoped and well coupled. Conversely, a monolith may be a very good idea for a mature product which is not changing rapidly and perhaps needs to be optimised for performance by reducing communication overhead between components through tight coupling. In both cases a modular approach and service-oriented designs are still feasible, as long as the implementation and maintenance team is aware of the implications. People are still driving around in classic cars from the 1900s, whereas steam locomotives from that era are languishing in museums.

Trucks are flexible

One of the killer advantages of trucks in the logistics business is their ability to deliver right to the factory or warehouse loading bay. It is simply not feasible to build train tracks to serve every address. Even in areas with great railway infrastructure, buffers (known as inland container depots) have to be placed to cover the last few miles of transport from the rail to the industrial areas. This mode can sometimes be seen in microservices being layered on older monoliths to provide user-facing services, especially in banking systems. The other great advantage trucks have is the ability to overtake each other gradually along the road; this manifests in software systems as rolling deployments of new features. Such an approach requires careful management of the stateful parts of the system, such as storage and database schemas. Otherwise it turns into a Fast and Furious game of stealing a container from a moving platform, aka the Romanian Rollover.

This analogy is not new

Logistics analogies are rife in software engineering: we ship code, we package things, we have release trains. The largest real-world container orchestration organization, Maersk, uses a seven-pointed logo surprisingly similar to that of the most popular software container orchestration platform, Kubernetes. I will continue updating this post as more ideas and links come together.

You can engage with the article via comments or the Twitter thread.

Sunday, October 4, 2020

Desktop Software APIs in Python (KiCAD, FreeCAD, Blender, QGIS)

Python wraps around everything

For the last couple of years I have mostly written satellite data processing code in Python, and plenty of Flask/Django web services. However, Python is also an excellent automation tool for GUI-based applications, allowing users to write custom plugins and extend out-of-the-box functionality.

The first desktop application I seriously looked at Python plugins for was QGIS. It was the early days of learning how to wrap C++ code using SWIG/SIP and the like. In the old mailing list you can find a much younger me making inane comments about mixing wrapper metaphors in QGIS with SWIG + SIP. We have come a long way since then, and SIP-based bindings are the mainstay of QGIS plugins.

QGIS

QGIS has so many Python plugins that they need a registry of their own. Occasionally QGIS Python gets twisted around itself due to multiple Pythons in the user environment. You can also flip the Python API around: instead of building a plugin, you can turn QGIS into a custom desktop application, which is what I have done with my basic Airport Viewer demo.
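
A minimal sketch of that flipped-around mode, assuming a local QGIS install and a placeholder airports.gpkg layer; the prefix path varies by platform:

```python
from qgis.core import QgsApplication, QgsProject, QgsVectorLayer

# Run QGIS as a library: a headless app that loads a layer and reports on it.
QgsApplication.setPrefixPath("/usr", True)   # adjust for your install
qgs = QgsApplication([], False)              # False = no GUI
qgs.initQgis()

layer = QgsVectorLayer("airports.gpkg", "airports", "ogr")
if layer.isValid():
    QgsProject.instance().addMapLayer(layer)
    print(f"{layer.featureCount()} airports loaded")

qgs.exitQgis()
```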

QGIS is a fairly extensive and complex C++ application which takes hours to compile, so being able to make small, quick changes in Python is invaluable.

KiCAD

At the time of writing, KiCAD has an extensive Python API for automating the PCB layout part of the workflow, and this has led to many innovations in automating traditionally laborious hand layout, or even performing complex simulations and optimizations to set trace lengths. For example, Josh Johnson has a script for laying parts out in a circle, and Greg Davill has several for length matching and render-file generation. My personal favourite among the KiCAD scripts is the one for generating an interactive BOM.
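
In the spirit of the circular-layout script mentioned above, a small pcbnew sketch; the board file, reference prefix, and coordinates are placeholders, and the position API differs slightly across KiCAD versions:

```python
import math
import pcbnew

# Place all "D" (LED) footprints in a circle. GetFootprints() is KiCAD 6+;
# older versions use GetModules().
board = pcbnew.LoadBoard("ring.kicad_pcb")
leds = [fp for fp in board.GetFootprints()
        if fp.GetReference().startswith("D")]
cx, cy, radius = 100.0, 100.0, 30.0    # millimetres
for i, fp in enumerate(leds):
    angle = 2 * math.pi * i / len(leds)
    x = pcbnew.FromMM(cx + radius * math.cos(angle))
    y = pcbnew.FromMM(cy + radius * math.sin(angle))
    fp.SetPosition(pcbnew.wxPoint(x, y))   # VECTOR2I(x, y) on KiCAD 7+
pcbnew.SaveBoard("ring-placed.kicad_pcb", board)
```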

I am really looking forward to Python script support in the Schematic Editor. Meanwhile programmatic Schematic generation tools like Skidl provide schematic oriented Python fun.

The rendering of the PCBs is often done in Blender, which has its own set of Python niceties.

Blender

My first foray into creating a Blender-API-based application was during the Kinect USB protocol hacking days. The data stream had just been decoded, and I wanted an easy pipeline to a commonly installed, open-source 3D display software. The Python API is mature enough these days for people to quickly put together motion capture plugins for Blender. This plugin, however, demonstrates the challenges of creating native plugins for Blender: the .pyd files for Python have to be recreated for different versions of Blender for ABI compatibility.

Getting the binaries working has had me thrashing about and posting in forums, then sticking to a working Blender build with Python 2.7 for about 5 years, since I did not want to touch it and break it. My integration actually reversed the embedding process: instead of using additional modules in the Blender-embedded Python, I embedded Blender in a 3D GIS automation.

Native plugin weirdness aside, the Blender Python API is a really powerful tool for creating procedural objects, from waves and fluid simulation to astrophysics with AMUSE.
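
A taste of that procedural side, as a minimal sketch; the counts and sizes are arbitrary:

```python
import math
import bpy

# Procedural Blender scripting: a ring of twelve cubes around the origin.
n, ring_radius = 12, 3.0
for i in range(n):
    angle = 2 * math.pi * i / n
    bpy.ops.mesh.primitive_cube_add(
        size=0.5,
        location=(ring_radius * math.cos(angle),
                  ring_radius * math.sin(angle), 0),
    )
```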

FreeCAD

FreeCAD is sort of the third part of my physical electrical/mechanical design triumvirate. I occasionally design parts for KiCAD in FreeCAD, or bring multiple boards together to test enclosure fit. FreeCAD also has an extensive Python library, which is leveraged by KiCAD part library maintainers to parametrically generate parts.

The scripting in FreeCAD can be used much like the PCB layout scripts in KiCAD to create parts with circular symmetry, like ball bearings, which are difficult and repetitive to do by hand.
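
A sketch of that idea inside a scriptable FreeCAD session; the dimensions are arbitrary placeholders:

```python
import math
import FreeCAD
import Part

# A ball-bearing-like ring of spheres, generated rather than hand-placed.
doc = FreeCAD.newDocument("bearing")
n_balls, ring_radius, ball_radius = 9, 20.0, 4.0   # millimetres
for i in range(n_balls):
    angle = 2 * math.pi * i / n_balls
    centre = FreeCAD.Vector(ring_radius * math.cos(angle),
                            ring_radius * math.sin(angle), 0)
    ball = doc.addObject("Part::Feature", f"Ball{i}")
    ball.Shape = Part.makeSphere(ball_radius, centre)
doc.recompute()
```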

Final words

There are lots of other pieces of desktop software I have used that have started shipping with Python APIs to address the never-ending demand from users to easily automate repeated tasks. The live process for making this blog post, in somewhat recursive fashion, can be found here.

I have even made videos with a proprietary one; I will leave that here for anyone interested in my attempts at a voiceover.

Sunday, August 30, 2020

Compiling QGIS in MSVC in 2020

Compiling QGIS on Windows in 200x

I don't quite remember when I decided to help compile QGIS on Windows. It was somewhere between compiling GDAL with ECW support for Photoshop on Windows and getting carried into Direct3D and C# land with NASA WorldWind, sometime in the 2000s while still working at Apogee Imaging in Lobethal.

At that point I was manually building a database of the footprints of satellite imagery that filled up a wall cabinet with CDs and DVDs. The technique was something like: open up the image, go around the edges, and trace a polygon. These were the days before mature boolean thresholding and reliable, easy raster-to-vector logic.

I hopped onto IRC on #qgis on Freenode and chatted with luminaries like timlinux, frankw, and gsherman, and listened to the automated notifications from sigq, the commits bot. Things were heating up, and instead of a Linux cross-compile to Windows using MinGW, something native to Windows, say MSYS+MinGW instead of Cygwin, was desired. A lot of GDAL and Qt worked in MinGW, so presumably QGIS would too. So I set out to put together an MSYS environment with all the third-party dependencies that could be used to happily build QGIS. Eventually I built a release installer with NSIS as well.

My MSYS environment got packed in a zip and shared via FTP/HTTP, on a VPS I had back then, with the rest of the community. I earned myself a pin in the QGIS core contributor map in Adelaide, something I am very proud of to this day. Eventually the MinGW build got deprecated and native MSVC builds were supported. That's how contributions work; nothing lasts forever. In my IRC days I helped on-board Nathan Woodrow to QGIS, who in turn, I believe, helped on-board Nyall Dawson. Nyall has surpassed us all in feature contributions and work on QGIS.

Fast forward to 2020, compiling QGIS in MSVC

I am getting back into doing lots of open-source work after a long hiatus in private industry with Aerometrex and start-up land with Lori Systems. It is great fun working mostly in the open at Geoscience Australia; there is actually a recently archived opendatacube + qgis repository here. Seeing that repo and speaking to Nathan at LinuxConfAu inspired me to have a go at getting back into actively working on the QGIS code base. It has sprawled out, with lots and lots of new features. The build system is still familiar via CMake, and is actually much easier now with MSVC. I cast around for a recent guide and found this. The guide mostly works; however, I made some refinements.

  • Ditched the bison and flex from Cygwin in favour of the ones available via MSYS2, which can be found here. Not needing the whole Cygwin system helps keep the Windows build system light. Simply download the binaries and add them to the OSGeo4W binaries directory.
  • Captured my CMakeCache.txt to make it easier to reproduce and debug the build environment for others.
  • Used Incredibuild in demo mode with a few NUCs I have lying around to speed up the build. Recording while building failed the first time and worked the next. The whole build from scratch still took around 35 minutes overall.

I am planning to throw some of my day-to-day DevOps skills at the QGIS project and start helping again with raster enhancements and Windows release management. Perhaps getting Incredibuild into the hands of the Windows maintainers will help tighten up the iteration cycle and make testing easier.

The Twitter thread / stream-of-consciousness edition of this is available as well.

Wednesday, July 22, 2020

Microservices the hard way - folders in EC2 Instance

For day-to-day work I wrangle containers in EKS these days. However, for personal projects EKS is a luxury (the baseline cost being $70 or so per month). So I decided to do microservice development for the rain radar project with no Docker and no Kubernetes, using:

  • multiple venvs
  • multiple service folders
  • environments in .env files (secrets in plain text)
  • web services started using @reboot in cron
  • scheduled services using ... yes, cron

The whole thing started with noble intentions to use Lambdas all the way; however, I got stuck using S3-SNS to trigger the Lambda, and decided instead to scan the S3 bucket using timestamps to find the latest files to process. More on the pitfalls of that later.
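
The timestamp scan, sketched with boto3; the bucket and prefix are placeholders for the actual rain-radar layout:

```python
import boto3

# Find the newest object under a prefix by LastModified. Note one of the
# pitfalls: list_objects_v2 returns at most 1,000 keys per call, so a real
# scan needs a paginator.
s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket="rain-radar-raw", Prefix="IDR/")
objects = resp.get("Contents", [])
if objects:
    latest = max(objects, key=lambda o: o["LastModified"])
    print("newest file:", latest["Key"], latest["LastModified"])
```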

The major tasks the microservices handle are:

  • Raw radar data preparation using a custom hand-crafted algorithm, being ported to Rust here.
  • Inserting prepared data into DynamoDB as a sparse array and serving it via Flask.
  • Nowcasting using the timeseries of sparse arrays of rain observations, also serving results via Flask.
  • Capturing rain events and nowcasts, and creating text and GIFs to send to Twitter.

Each of these applications consumes the others to some extent, and responsibilities are more or less separated. I decided to deploy them with a basic folder per application in the /home/ubuntu directory, with a venv per folder.

I had it like this for a while. Then I got tired of sshing into the box and git-pulling in each folder, so I decided to write a fabfile per application which would do this for me, and created deployment keys which would be used to pull the code into each folder. Then I got tired of running multiple fabfiles and set up a polled process which ran the fabfiles and git-synced the code from a master pipeline.
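
Roughly what the per-application pull step looked like, as a Fabric 2 sketch; the host, user, and application list are placeholders:

```python
from fabric import Connection

APPS = ["radar-prep", "dynamo-ingest", "nowcast", "tweeter"]

def deploy(host="ec2-host", user="ubuntu"):
    """SSH in once and git-pull each application folder."""
    c = Connection(host=host, user=user)
    for app in APPS:
        with c.cd(f"/home/{user}/{app}"):
            c.run("git pull --ff-only")
            c.run("venv/bin/pip install -r requirements.txt")
```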

Eventually I got around to bootstrapping the whole VM using Packer + Ansible playbooks. The development work for it was done locally using Vagrant with Hyper-V as the VM provider, to test the same Ansible playbooks. I will follow up on this with a few characters on Twitter.

Once the initial Packer AMI is established, the choice is either to keep building this image or to move away from the whole VM-based old-school approach to a more modern/fun Kubernetes way.