tag:nugget.posthaven.com,2013:/posts Nuggetty Goodness 2026-03-09T00:49:09Z tag:nugget.posthaven.com,2013:Post/2269366 2026-03-09T00:49:03Z 2026-03-09T00:49:09Z One-bowl vegan hazelnut brownies aka "Ferrero brownies"

This was an experiment that turned out awesome. There were some vegan humans a nugget was intending to feed. However, being quite carnivorous and distinctly non-vegan, we nuggets didn't want any recipes with vegan butter, flaxseed egg replacements, and so on. Also, refined coconut oil offends a nugget's lineage. 

So we threw this experiment together, and somehow, after a 24-hour rest, these are the best brownies nuggets have ever made. They may even be the best brownies nuggets have ever eaten. D:

And they're one-bowl. And lazy. WHEEEEEEEEEEE

Ingredients

Dry

  • 150g hazelnut meal (or almond meal if you want a more neutral taste)
  • 150g all-purpose flour (or bread flour if that's all you got, we won't be stirring it enough for too much gluten to be an issue)
  • 175g white sugar
  • 40g cocoa powder (preferably Dutch-processed)
  • 20g tapioca starch (don't leave this out, this boosts fudginess. can also use glutinous rice flour instead)
  • 1/2tsp baking powder
  • 1/4tsp salt
  • chocolate chips (optional)
  • cocoa nibs (optional)

Wet

  • 190ml soy milk (I prefer soy milk with no added oil, but an icky one with oil should be fine too)
  • 30ml vegetable oil (something neutral tasting like canola)
  • vanilla essence (to taste)

Craft it!

  1. Preheat oven to 180C fan-forced.
  2. Dump all dry ingredients in a bowl, whisk/stir with a fork until homogeneous-ish (the choc chips will hardly homogenise...)
  3. Make a well in dry ingredients, dump wet ingredients in the well (no need to mix em first).
  4. Stir until just combined. Batter will be very thick, kinda like cookie dough, but a bit squishier.
  5. Line a 9-inch square brownie pan with baking paper.
  6. Pop batter in pan, and pat it down until it's sorta flattish and even. It'll melt the rest of the way as it bakes.
  7. Bake for about 25 minutes. Once it's more or less flattened out and puffing up a little bit, it's done. Toothpick test won't come out clean. Better to underbake than overbake (same as any other brownie).
  8. Pull outta oven, leave it to cool and DO NOT TOUCH IT for at least 24 hours.
Eaten a couple of hours (5 or so) out of the oven, it's nice, but not amazing. Good flavour, but slightly crumbly, and a bit lacking in richness. When I tried it at this point, I started thinking of tweaks...

After at least 24 hours, the darn thing evolves. Somehow it becomes incredibly rich and fudgey, and the flavour has a lot more depth. I don't know why this happens but... it does. I ate some after 24 hours and... no tweaks needed. The thing is glorious. And I don't even like chocolate much, let alone brownies. XD


tag:nugget.posthaven.com,2013:Post/2263853 2026-02-18T09:56:00Z 2026-02-18T09:56:17Z BDO: Male woosa customisation

<.<; Because someone on Reddit wanted one, and I got curious about making one.

Note that I have ONLY customised the face here, nothing else. So body, etc, you gotta do yourself. :)


tag:nugget.posthaven.com,2013:Post/2255911 2026-01-24T04:30:47Z 2026-01-24T07:37:10Z Enable VoLTE on Huawei P20 Pro in Australia via USB debugging (Windows computer needed)

My Huawei P20 Pro's last EMUI update in 2023 added VoLTE support - it was specifically added to do that. Nonetheless, because of the shambolic way that the move off 3G has been handled in Australia, my phone was effectively "bricked" on both Vodafone and Telstra.

If you're in a similar situation, here's how to force unlock/enable VoLTE on Telstra's network in Australia. Note that you will have to do this procedure every time you restart your Huawei P20 Pro. What's more, it seems to drain battery about 10%-15% faster. This makes it a bandaid solution, but does allow you to keep having mobile phone service until you get a new phone.

This works on unrooted devices running EMUI 12. I believe it also works for unrooted devices running Android 10 and below, based on the Shizuku app guide.

I don't know if it'll work with Vodafone, because they handled the situation so badly that I'll never use them again.

What you'll need

  • Your Huawei P20 Pro
  • A Windows 10+ desktop or laptop computer
  • A USB charging cable that can transfer data
    (You can verify this by using the cable to plug your phone into your computer. If Windows picks up the phone as an external device, you're good. If it doesn't, the cable won't work for these purposes.)

Apps to install

On your phone

  • Shizuku
    I recommend downloading this from the Google Play store
    https://shizuku.rikka.app/guide/setup/
  • Pixel IMS
    This isn't available on the Google Play store any longer. Plus, we need the specific patch as linked for our Huawei P20 Pro, to prevent Pixel IMS from crashing when Shizuku is on. Normally I'd caution against downloading random APKs from the interwebs, but it all depends on your own individual desperation and situation. ;)
    Github comment | Github download

If you've never installed apps on your phone outside of Google Play, and don't know what an "APK" is, it can seem a bit scary. But it's really easy, promise! Just follow the Pixel IMS link either on your phone, or type it into your phone's web browser. The APK download should start, and once it's done, your phone should prompt you to install it. 

If the prompt doesn't show up, then once the download is done on your phone, go to Files > Downloads/Received files > Download Manager > Tap on dev.bluehouse.enablevolte.apk and you should get an installation prompt.

On your Windows computer

  • ADB (Android SDK platform-tools)
    Download the platform-tools ZIP from Google's Android developer site, and extract it somewhere handy. The extracted folder is called "platform-tools" - we'll be using it below.

Enable VoLTE

  1. Turn on Developer Mode in your phone (if you haven't already).
    Settings > About phone > Tap "Build number" repeatedly.
  2. Turn on USB debugging on your phone.
    Settings > System & updates > Developer options > Enable "Stay awake", "USB debugging", "Allow ADB debugging in charge only mode", and make sure "Set USB configuration" is "Charge only".
  3. Plug your phone into your Windows computer.
    Again, this needs to be with a cable that can do data transfer. If your computer doesn't detect your phone as an external device when you plug it in, it won't work.

  4. On your computer, go to the "platform-tools" folder you extracted (this is ADB). 
    Shift-right-click "platform-tools" folder > Open Powershell window here
    Windows PowerShell will launch.
  5. On your computer, in Windows PowerShell, type 
    .\adb
    If successful, you should see a wall of text. :)
  6. On your computer, in Windows PowerShell, type 
    .\adb start-server
    You should see "daemon started successfully".
  7. On your computer, in Windows PowerShell, type 
    .\adb devices
    If successful, you should see "List of devices attached", a long serial number, followed by the word "device". That's your phone! :D
  8. On your computer, in Windows PowerShell, type 
    .\adb shell sh /sdcard/Android/data/moe.shizuku.privileged.api/start.sh
    This will use your computer to start the Shizuku app you installed on your phone.

  9. On your phone, open Pixel IMS, and select the "SIM" tab on the bottom of the app.
    For example, on my phone, it says "Telstra (SIM 1)".
  10. On your phone, in Pixel IMS, turn on all the things!
    Enable VoLTE
    Enable VoWiFi
    Enable VoWiFi while roaming
    Enable Supplementary Services over UT
    Enable Supplementary Services over CDMA
    Enable Video Calling (VT)
    Enable Enhanced 4G LTE (LTE+)
    Allow adding APNs
    Show VoWiFi Icon
  11. On your phone, check that everything is enabled and working.
    Settings > Mobile network > Mobile data > Enable "4G" and "Wi-Fi Calling".
    Set preferred network mode to LTE/WCDMA/GSM auto
  12. The top bar of your phone should show "Telstra" and "VoWiFi".
    You've successfully enabled VoLTE and VoWiFi on your Huawei P20 Pro. :D

Resources

A whole bunch of kind folks on the interwebs made this solution possible for me, and here they are. Parts of this guide are pretty much directly quoting them. :)

No guarantees that this will work for you. But if you have a Huawei P20 Pro in Australia, it's worth a shot. After all, the original guide that twigged me on to force-enabling VoLTE in a way an Australian telco would recognise was for a Huawei Honor 90 Lite.

Good luck! :)








tag:nugget.posthaven.com,2013:Post/2246360 2025-12-22T04:43:25Z 2025-12-23T13:12:44Z Significant increase in malevolent supernatural activity in areas with a high uptake of LLM usage

"Now, I shall perform the Ritual of Summoning! Behold my absolute power! You are bound to me, demon!"

"Trifling human! You face Jaraxxus, Eredar Lord of the Burning Legion!"

"Wow, really?!"

"No. You wrote the ritual with ChatGPT."

Screams, and the smell of brimstone, followed by some rather fragrant roast-porky aromas, and contented munching noises.

tag:nugget.posthaven.com,2013:Post/2244960 2025-12-16T02:14:22Z 2025-12-23T13:12:29Z One bowl muffins - no butter, no egg! :D Plus bonus custard filling to make them into Stuffins.

Add flavourings of your choice. The pics are matcha in the batter, and Chinese preserved sweet salted plum powder in the custard.

Muffin base

Ingredients

Flour 200g (AP, bread, cake, all will work)
Almond meal 50g <-- this is to boost the fat, don't leave it out
White sugar 150g
Baking powder 1tsp
Salt 1/4 tsp

Thickened cream / whipping cream 100ml
Fresh full cream milk 200ml

Crafting

  1. Dump dry ingredients in bowl and stir till more or less the same
  2. Dump cream into dry ingredients, mix into pastey doughy thing (can still have bits of flour, it's fine)
  3. Dump milk in 50% at a time, mixing in each time. (But don't overmix, just like... fold it lazily. Lumps are fine.) You're looking for a very thick milkshake type batter, so we wanna control the liquid to make sure it's not too ... liquid.
  4. Fill cupcake cups to about 5mm below the edge. Alternate filled/empty bits in muffin tray for best results/most even heating. If you're using those fancy tall muffin cups like I did in the pic, still fill to about 5mm below where it'll start to leak. The fancy muffin cups are GREAT if your oven (like mine) heats unevenly. They seem to shield the cupcackles and make them come out more symmetrical.
  5. Pop in 180C oven and bake for 30m~ (tops should start to be golden brown on the sides)

If using coconut cream to replace thickened/whipping cream, the texture will be more dense. But still nice!

Microwaveable custard-filling base

Ingredients

Corn starch 20g
Sugar 20g
Milk (cow or other, doesn't matter) to fill a generic :P coffee mug till about 2/3 to 3/4 depending on how squishy you want the custard

Crafting

  1. Dump corn starch and sugar in mug
  2. Stir to homogenise
  3. Dump a bit of milk (enough to make a paste to avoid lumps)
  4. Stir into said paste form
  5. Add the rest of the milk, stir
  6. Nuke for 2 minutes, stirring every 30s, until the custard thickens. At 1m 30s, you may want to change to nuking for 15s rather than 30s.
  7. When it looks thick and gloopydrippy and not-set-custardy, it's done
  8. Shove in fridge to cool for a couple hours (I usually leave it overnight so I can bake the next day).

If you add egg, then the proteins coagulate differently and it will be tetchy and not cook as evenly. I advise not adding egg.

For a chocolate version, also add 20g cocoa (dutch processed is nicer).

For hot chocolate that is amazeballs, just add more milk, all the way to more or less normal-full mug. It's GLORIOUS.

Note that this will only work in a normal coffee mug type mug. If you try to use a 500ml Pyrex jug, for example, it won't cook evenly. I think it's something to do with the size and volume vs microwave penetration, heat, and evenness.

tag:nugget.posthaven.com,2013:Post/2229928 2025-10-12T05:10:41Z 2025-12-24T07:58:28Z ESO Addon: No-frills Companion Rapport Check

Sick of typing "/script d(GetActiveCompanionRapport())" in chat, but don't need or want anything more complicated?

This no-frills addon is for you. ;)

Install it by popping the whole (unzipped) folder in
...\Documents\Elder Scrolls Online\live\AddOns

Type "/rapport" in chat, and see the rapport for your current active companion. That's it.

tag:nugget.posthaven.com,2013:Post/2222363 2025-09-07T03:10:06Z 2025-09-07T03:11:04Z Mostly water

Vampire, "I drink mostly water."

Human, "But you're a vampire!"

Vampire, "And you are mostly water."

tag:nugget.posthaven.com,2013:Post/2212641 2025-07-23T10:44:48Z 2025-12-24T07:58:45Z Fucking Cherubs!

"Did you hear?"
"Hear what?"
"Our sister Candy Haus. She's dead. Burnt to a cinder."
"What! How?"
"Well, you know, she always was a careless one. Picked up two plump brats. Fattening up the one, and making the other do chores. Never occurred to her that maaaaaybe it might have been better to leave Hans-El and Gret-El alone."
"Ugh. Cherubs?"
"Yep."
"Oh well, ashes to ashes and all that."

<.< An original retelling by Nugget.

(I don't know why, but I've never seen anyone else riff on this before...)


tag:nugget.posthaven.com,2013:Post/2205505 2025-06-22T05:39:55Z 2025-08-29T02:03:38Z FrameQuery Figma plugin: Like CSS container queries in your Figma components and frames

I currently lead an enterprise digital Product Design team, and I'm also design co-lead (together with my dev co-lead) for our Design System. We already have multiple UI (code) components that use container queries. This is gonna be soooooooooo helpful on the design model side, because with this plugin, our designers don't have to remember when to manually swap layouts on our models.

First iteration took about 10-12 hours total, and about 100~ credits on my personal Pro account. Fixing the bugs took another 200 or so credits. Learned quite a bit of stuff along the way though. Adding support for components imported into libraries (without which the plugin is basically pointless) took another 300~ credits.

Still need to get it cleaned up, it's a mess, but at least the core bugs are fixed.

This is a working copy, and here's how to use it, if you're curious. :)

Load & use FrameQuery Figma Plugin

  1. In Figma desktop app ONLY, open a Figma file with components that you want frame queries on.
  2. Right-click on empty space in Figma canvas: Plugins > Development > Import Plugin from Manifest.
  3. In FrameQuery 1.0.30 > new-plugin > select manifest.json.
  4. Click on "Nugget's Frame Queries" and the plugin should load.
  5. In the component that you want to have frame queries on, add a new Property FQ-size. Needs to be exactly this string incl. capitalisation. You can name the variants anything you like, for a given value of "anything". Spaces are supported, but there are some other characters that might not be.
  6. FrameQuery should dynamically pull your variants from anything you set in FQ-size.
  7. With the component selected, turn FrameQuery on.
  8. Set your breakpoints. A max is needed for your biggest breakpoint, just make it something silly like 9999px.
  9. Pop a component instance onto the canvas, stick it in a frame, and resize the frame. The current version only cares about width, but I might enhance with height later.
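
Under the hood, what happens at resize time is just container-query-style range matching. Here's a rough Python sketch of the idea - the variant names and pixel ranges are mine, purely illustrative, not FrameQuery's actual internals:

```python
# Container-query-style breakpoint matching, sketched in Python.
# Each FQ-size variant gets a [min, max) width range; the variant whose
# range contains the frame's current width "wins". Names and numbers here
# are made up for illustration - the plugin's internals may differ.
BREAKPOINTS = {
    "small":  (0, 480),       # variant name -> (min_width, max_width)
    "medium": (480, 1024),
    "large":  (1024, 9999),   # biggest breakpoint gets a silly max like 9999
}

def variant_for_width(width: float, breakpoints: dict = BREAKPOINTS) -> str:
    """Return the variant name whose [min, max) range contains width."""
    for name, (lo, hi) in breakpoints.items():
        if lo <= width < hi:
            return name
    raise ValueError(f"no breakpoint covers width {width}")
```

Resizing a frame from 800px down to 400px would flip the instance from "medium" to "small" - which is exactly the manual layout-swapping this saves our designers from.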

FrameQuery appends 🤖 to component names that have FQ enabled, and prepends 🤖 to frames that contain component instances with FQ enabled, so we can keep track of 'em without messing up how our prototypes look in demos.

FrameQuery also works with components imported from libraries (1.0.30)

  1. Follow steps 1-8 from above in your library file, and publish the library.
  2. Close the library file (you don't need to have it open).
  3. In your target file, load FQ.
  4. Pop in the component instance from your library file, just like you normally would. This is the component with 🤖 appended to its name.
  5. Pop the component instance in a frame.
  6. 🤖 is prepended to the frame name, and the frame is now responsive.







tag:nugget.posthaven.com,2013:Post/2204033 2025-06-16T11:38:27Z 2025-12-03T03:44:46Z BDO Barter Planner

Proliferate little web apps, proliferate! Doesn't make up for the crap that's polluting the interwebs in terms of content, but I guess at least I can have my own little web apps now.

I got sick of writing the same stuff in Notepad over and over, and realised that, "hey, now I can get an LLM (via Windsurf) to write this really simple thing for me!"

Behold, the Black Desert Online (BDO) Barter Planner!

Bartering is basically a trading (mini) game. You sail around trading stuff (...bartering it). BDO has a really really big map, and it's almost all non-instanced, so you can (literally) sail around for hours if (a) you want to for some reason; (b) you are bartering or hunting sea monsters.

In the "Item" column, I have the Item (survival kit) that I need to bring to a particular Location (Arakil island) to barter. It's grade "g" for green, and what I'll get in return for the barter is "box". The number of the item I need to bring is in the Quantity column.

This is a far from optimised setup. I don't even try to optimise distance and time, except to my own lackadaisical playstyle of "are those things in the same general vicinity". It doesn't have Margoria or Valencia nodes because I've <.<; memorised all of those. Plus the "Crow Coin" option in the Barter UI in-game renders it unnecessary to track those nodes for trading. At least for me.

Features

  • Easily track the trade goods you need for your non-Margoria barters, grouped by proximity.
  • Search and filter by location names and codes.
  • View location codes (arbitrarily assigned by me) by clicking on "Location code" column header.
  • Clear rows when you've completed the barter.
  • Sort rows to the top as you fill them in, so your to-do barters are always visible.
  • Data is saved in LocalStorage, so you can open/close the file without worrying. Data is removed when you clear it.
  • Runs purely local on your machine.
  • No installation needed. Just unzip the file and open it in a web browser.

Download



tag:nugget.posthaven.com,2013:Post/2202368 2025-06-08T02:51:00Z 2025-12-22T13:23:03Z Droplet aka ChatGPT (via Windsurf) wrote me a knock-off Airdrop/Snapdrop! :D

First iteration / MVP

Windsurf told me how to install Python, and wrote the base HTML and JS, plus the PY file needed to run the Droplet server locally. My original idea was to use the web browser's localstorage, but that didn't work out, not least because the amount of data I could store that way is puny. The first iteration was very ugly and unfriendly, as the text/instructions were written in a way that made sense only to me.

After testing with my partner helping with uploading files (it worked, yay MVP), it was refinement time.
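
For the curious, the core of a Droplet-style server fits in a few dozen lines of stdlib Python. This is a hedged sketch of the general idea (a tiny local HTTP server that drops files into an "uploads" folder), not the actual Droplet code - Droplet itself serves a browser UI rather than taking raw PUTs:

```python
# Sketch of a Droplet-style local file-drop server using only the Python
# standard library. NOT the real Droplet code - just the core idea: a tiny
# HTTP server on your LAN that writes uploaded bytes into an "uploads" folder.
from http.server import BaseHTTPRequestHandler, HTTPServer
from pathlib import Path

UPLOAD_DIR = Path("uploads")

def save_upload(filename: str, data: bytes, upload_dir: Path = UPLOAD_DIR) -> Path:
    """Write raw bytes under upload_dir, stripping any directory components
    from the filename so a client can't traverse outside the folder."""
    upload_dir.mkdir(exist_ok=True)
    target = upload_dir / Path(filename).name
    target.write_bytes(data)
    return target

class DropletishHandler(BaseHTTPRequestHandler):
    def do_PUT(self):
        # Simplest possible upload scheme: PUT /<filename> with the raw file body.
        length = int(self.headers.get("Content-Length", 0))
        saved = save_upload(self.path.lstrip("/") or "unnamed", self.rfile.read(length))
        self.send_response(201)
        self.end_headers()
        self.wfile.write(f"saved {saved.name}\n".encode())

def run(port: int = 8000) -> None:
    HTTPServer(("0.0.0.0", port), DropletishHandler).serve_forever()
```

With run() going, another device on the LAN could upload with e.g. curl -T photo.jpg http://<server-ip>:8000/photo.jpg (the curl invocation is illustrative).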

Later iterations

  • Add determinate loading bar during upload.
  • Make it prettier.
    Hm! I've heard about classless styling/boilerplate HTML frameworks, maybe I can use one of those!
    I ended up using the very lovely Water.css
  • Add understandable-to-humans instructions.
  • Support folder uploads.
  • Delete uploaded files.
    From the web app, instead of me deleting directly from Droplet's "upload" folder.
  • Display human-readable network device name of server.
  • Improved instructions (round 2).
  • Add QR code for easier mobile device access.

This was great fun, and I did manage to pick up a little bit about server-side code, and improve my (very poor) JS knowledge a little to boot. And of course, now I have a Snapdrop replacement. ;)

Get a Droplet of your very own. :D


tag:nugget.posthaven.com,2013:Post/2039456 2023-10-23T07:56:44Z 2025-12-22T02:56:07Z How to get an accessible PDF out of Google Docs (with free tools)

Today, I learned that Google Docs doesn't save accessible PDFs, even if you conscientiously wrote the doc accessibly. I.e. with the correct heading structure, lists that are actual lists, tables that are tables, figures, captions, alt text, oh my!

Instead, when exporting to PDF, Google Docs strips all accessibility-related information, resulting in an untagged PDF.

This was rather annoying to me, since I absolutely needed this particular document (a VPAT) to be accessible while in PDF form.

Poking around the interwebs, I came to the conclusion that most PDF accessibility remediation tools are one of the following:

  • Paid and expensive
    Adobe Acrobat Pro, I'm looking at you. To be fair, it isn't just Adobe that charges rather a lot.
  • Free, but don't work at all, or don't work very well
    Pave-PDF was an example that didn't work for me at all... even when it finally loaded my document.
  • Free, but insert watermarks, and possibly don't work
    PDFix allowed me to tag my PDF, but I couldn't quite trust that it was working. Especially since PDFix finds "no bugs" with a PDF that... has no tagged content. To be clear, a PDF with no tagged content is not accessible. Plus, it inserts watermarks.
    We'll get back to PDFix in a moment though - it does come in handy.
  • Free, very possibly good, but Windows only
    My work machine is a Mac.

Enough with the complaining - tell me how to get that accessible PDF!

It's really simple, but I didn't find anyone else laying out the exact steps, so here they are. Every article or answer I found assumed access to specific paid tools, which I don't have (MS Word, Acrobat Pro).

  1. Ensure your base Google Doc has been authored accessibly.
    If it isn't, make it so. I.e. use the correct heading structure, add captions to your images, etc. This is 95% of the work, "pre-done", almost. And if you need to fix stuff, it's easiest to fix it in your base Google Doc, rather than attempt it with any free tools.
  2. Export your base Google Doc as an MS Word .docx. Yup.
    Because interestingly enough, when you export a Google Doc as an MS Word file, it preserves all of that tasty tasty accessibility information that you've included.
  3. If you have access to MS Word, open the file, and THEN export it as a PDF.
    According to the interwebs, this should give you a nice, clean, accessible PDF.
  4. If you're like me, and don't have access to MS Word...
    Go to ilovepdf.com - it's free.
  5. On the ilovepdf.com site, choose the "Word to PDF" option, and select your exported MS Word file.
  6. Wait for your file to convert, then download it.
  7. Congratulations! You now have a nice, accessible PDF!

    BUT WAIT... How can you be sure it worked?

  8. On a Mac, right-click your nice (and hopefully) accessible PDF, and "Get info".
  9. Look for "Tagged PDF" - if it says "Yes", it worked!
    (Note that depending on your Mac OS version, "Get info" may no longer return this data.)
    That's okay, you can check another way, that also lets you look at the tags, to make sure they're all legit, which is...
  10. Download and install PDFix.
    You're not going to use this to fix your PDF (because that'll make watermarks, among other issues), you're going to use it to check your PDF.
  11. Open your file in PDFix, and click on the "Tag" icon.
    This pretty much works exactly the same way as Adobe Acrobat Pro, except that it's free. ;)
  12. Check out the tags in your (hopefully) lovely, accessible PDF. :D
    Woohoo!
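
If you'd rather script the check (or you're on a machine without Finder or PDFix), here's a crude stdlib-Python heuristic I'd reach for: tagged PDFs carry a /MarkInfo dictionary with /Marked true in the document catalog. Fair warning - this is a rough surface scan (the entry can be hidden inside compressed object streams), not a real accessibility audit:

```python
# Heuristic check for a tagged PDF: scan the raw bytes for the /MarkInfo
# dictionary's /Marked true flag that tagged PDFs set in the catalog.
# This is a quick sanity check, NOT a substitute for inspecting the actual
# tag tree in a tool like PDFix or Acrobat.
from pathlib import Path

def looks_tagged(pdf_path: str) -> bool:
    data = Path(pdf_path).read_bytes()
    return b"/MarkInfo" in data and b"/Marked true" in data
```

A "False" here means the PDF is almost certainly untagged; a "True" means it's worth opening the tag tree to confirm the tags are actually sensible.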



tag:nugget.posthaven.com,2013:Post/2002749 2023-07-22T03:17:15Z 2025-08-14T04:17:21Z LLMs: The best software development tutor ever - with big caveats

I've recently started learning Python for fun, and I've manually copy-typed my way to my very first Streamlit app - CatGPT Nekomancer!

In the process, I've discovered some fascinating things about Large Language Models (LLMs) like ChatGPT, and how they fit into learning a new programming language.

I'm particularly tickled by how I (mostly) implemented CatGPT Nekomancer by blindly following instructions, and then used it to understand what I'd done. There's just something magical about making something, and then having it teach you how you made it.

LLMs are great at explaining what a piece of code does

This is because they're functioning purely as "translators". The translation task plays to the strengths of LLMs - statistical pattern matching. Good judgement is not needed, because we're not concerned with "how" or "should".

Here's an example of CatGPT explaining its source code, in response to the prompt "explain what this code does", followed by the code.

This explanation was a little too high-level for me. I wanted to really understand what each line of code was doing. This was easily fixed with a different prompt: "explain this code line by line to a novice programmer". This gave me exactly the level of detail I needed.

I've worked with some pretty great software developers from all over the world who aren't native English speakers. My partner and I are both bilingual, and we'd previously experimented a little with using LLMs as translators for recipes, where we found that if we were able to avoid the LLM's tendency to interpolate (aka "hallucinate" aka "lie"), they do much better at translating than Google Translate.

So when my partner asked, "What if we get it to explain in Croatian? This would have been HUGE for me when I was learning," it was a no-brainer to give it a try with this prompt: "explain this code line by line to a novice programmer, in Croatian." For Croatian, at least, my partner verified that CatGPT's translation and explanation was "brilliant".


I can also ask the LLM to explain specific parts of the code that I don't understand, and learn new concepts that way. For example, I wasn't familiar with the concept of f-strings in Python, which I encountered when working on a different Python experiment. Thanks to CatGPT, I was very quickly able to understand that f-strings are strings that can hold expressions - nifty! I particularly love how the LLM fits so seamlessly into my personal learning "flow". Instead of having to go off to the greater interwebs to trawl through answers about f-strings in Python to figure it out from there, I have my own personal tutor.
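
For anyone reading along who hasn't met them either, here's the sort of minimal f-string example that made it click for me (my own toy code, not CatGPT's output):

```python
# An f-string is a string literal prefixed with f; anything inside {...}
# is evaluated as a Python expression and spliced into the string.
name, count = "CatGPT", 3
greeting = f"{name} says meow x{count + 1}!"  # expression evaluated in-place
print(greeting)  # → CatGPT says meow x4!
```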

LLMs increase the value that good software developers bring to the table

It's generally agreed - at least among software development managers and similar roles - that a good developer can pick up a new language pretty quickly and competently. That's because the core of what makes a good developer isn't the knowledge of a particular language. Rather, it's their grasp of transferable concepts, frameworks, and understanding of best practices as principles. It's not about rote memorisation - it's about good judgement powered by understanding and experience.

Before LLMs exploded on the scene, a good developer could have everything I listed above, but when picking up a new language, or working with one they're rusty at, there'd still be a lag due to needing to learn the basics of the language (syntax, etc). With LLMs that lag is much smaller, allowing the developer to bring their strengths to bear much faster.

Caveat 1: LLMs are bad at advising on how a feature or function should be implemented

LLMs are based on statistical pattern matching, which makes them great at translation. It's also what makes them bad at anything that requires judgement calls based on a larger and often ambiguous context. They're not always wrong about "shoulds", they're just right far less often than the average human developer.

I believe that this is also what makes LLMs very weak at software development stuff that's presentational, or isn't primarily about logic. HTML, CSS, and web accessibility all fall into the bucket of not being about logic, as well as operating in a large and ambiguous context. It also probably doesn't help that LLMs have ingested the interwebs, and even today, there's probably loads more sites styling button text with <span> tags than sites using the correct approach. It's not like the LLM can tell which approach is better. After all, it's not thinking - it's pattern matching based on statistics.

Caveat 2: LLMs can't coach or mentor

Real "personalisation" is needed for coaching and mentoring. Both of these require human judgement and experience about the subject matter, as well as the individual receiving the coaching or mentoring. They also require (arguably to a lesser extent) a wish on the part of the coach or mentor for the person they're working with to learn and succeed. The simulated thing that we've come to call "personalisation" (e.g. a script grabbing your name from a database) does not and cannot work in this context.

The key to using LLMs effectively when learning software development: Know what you need

If you need explanations on what a piece of code does, LLMs are a great and reliable help. Even more so if you're not a native English speaker - LLMs can function as your own personal translator for both language and code.

If you need good advice on what you should implement, then LLMs aren't going to help much. Quite the opposite. Since they lack judgement (indeed, they do NOT judge), what they come up with is likely to be misguided at best.

It all comes down to the age-old "common sense" wisdom of using the right tools for the job. :)
tag:nugget.posthaven.com,2013:Post/1972273 2023-05-04T07:45:26Z 2025-11-17T05:20:52Z Peoplecraft

I'm proud to say that today, I came up with a new term for what I usually call "pplshit"! :D

Now, I call it "pplshit" because it doesn't come naturally to me, and it's not something I love doing. I've gotten decent at it over the years, within certain boundaries - enough so that I can provide some degree of training/coaching in that area.

Which has led me to the realisation that when I talk to someone who likes doing that stuff, or I'm trying to train or coach someone to be better at it, then the name "pplshit" is probably not the most inspiring one.

And so I will henceforth call "pplshit"... "peoplecraft"!

Peoplecraft
The art of working within organisational and personal environments and dynamics to nudge and bind disparate teams and stakeholders into effectively collaborating on shared goals.

tag:nugget.posthaven.com,2013:Post/1865205 2022-08-07T03:14:23Z 2025-08-14T04:18:28Z Snow on the sahara

"Ma'am, I think we have a problem. This is a still from some footage captured by Midge Ourney. You know that Nat Geo photographer who's on shooting on location in Jebil park for the next three weeks."

"Is this some kind of joke?"

"No Ma'am. We've got multiple reliable corroborating witnesses. Quite a few of them are park rangers."

"But that's impossible! Even with climate change. It's got to be a hoax."

"Yes, that's what I thought too. But last night, one of our best people sent me this. They spotted this woman in Toual el-Hadhali and managed to snap a photo. Don't worry, they weren't spotted

"And you're sure this isn't a coincidence? Maybe some kids were having one of those dress-up party things, what do they call it, co-playing?"

"Afraid not."

"All right, thank you. Looks like we have a situation here. Just when things were finally starting to calm down too."



tag:nugget.posthaven.com,2013:Post/1855394 2022-07-15T08:25:28Z 2025-12-13T08:56:45Z Vegan cheese foam

But Nugget, aren't you a carnivore who loves cream?

Yes, I is! The nugget wasn't setting out to create a vegan cheese foam, it just sort of happened based on other requirements.

The foam is really stable, pretty fuss-free to put together, AND it tastes like cheese foam! All the cheese foam recipes I found either required planning ahead (softened cream cheese), or wouldn't taste like cheese at all. I like coconut milk, and I like maple syrup. I am completely unconvinced that the combination of the two tastes like any sort of cheese. :P

Requirements

  • Must not require planning (pft, wait for cream cheese to soften, pft).
  • Should ideally use ingredients with a long shelf life.
  • Must be brainlessly easy to assemble. Also fast.

Ingredients

  • 50 ml soy milk (I recommend Bonsoy, or Vitasoy manufactured in Taiwan)
  • 2 tsp white sugar (don't use a syrup, white sugar gives this its structure)
  • 1/4 tsp salt (or to taste)
  • 1/2 tsp nutritional yeast <-- this is what makes it taste like cheese

Make it!

  1. Put all ingredients in a microwave safe container, tall enough to whip the stuff in (you want a cheese foam after all).
  2. Stir until combined.
  3. Warm in microwave in 15s intervals (you want it warm, or hot, not boiling).
  4. Whip / froth with milk frother (I recommend Aerolatte) for about 20-30s, until you have a stable foam.
  5. Pour on top of drink!

Notes on ingredients

  • Bonsoy, and the Vitasoy made in Taiwan do not contain oil. Soy milk w/o oil is what I grew up with, and I intensely dislike the mouthfeel of plant "milks" that have added oil.
  • You can probably substitute another plant milk for soy milk, but ideally choose one w/o a strong flavour.
  • If possible, choose a plant milk w/o the added oil (this could just be the prejudice of my young-age imprinting speaking haha).
  • Nutritional yeast can be found at most health food stores. It makes things taste like cheese w/o cheese being added. It's shelf stable, and doesn't need to be refrigerated.
  • Don't combine cow milk AND soy milk foam. The cow milk in the drink will kill the foam structure of the soy milk foam really fast.
  • You can substitute cold heavy cream, whipped to soft peaks, for the plant milk.
    When using heavy cream, don't heat it in the microwave, or it won't whip happily. Just stir the ingredients in, then whip. You will need to carefully spoon and ladle the foam onto your drink, and it may not stay stable for long if your drink is hot. Use dairy in your drink too (see previous point about cow milk killing soy milk foam).
  • Aerolatte milk frothers are awesome. I've had mine for almost 10 years now. I bought another one from a diff brand to use in the office a few years ago, and it's terrible. It doesn't make nice froths! 
]]>
tag:nugget.posthaven.com,2013:Post/1852417 2022-07-08T09:01:25Z 2025-07-22T03:30:54Z It's a bit depressing that the co-bot-art that's mostly bot is better than I ever was...

...but I was never that good anyway. Oh wells!

Final composite

Manual retouching and merging by the nugget.


Midjourney

Prompt: dark skinned magpie woman wearing intricate silver jewelry, trending on artstation, uplight


Real-ESRGAN Inference Demo

This is actually GFPGAN - for some reason the Colab page seems to be titled differently. GFPGAN is for face restoration.



]]>
tag:nugget.posthaven.com,2013:Post/1851404 2022-07-05T14:46:40Z 2025-07-22T03:31:06Z Midjourney is amazeballs. Prompt was "crow girl, trending on artstation".

Midjourney + light retouching

Midjourney isn't great at noses, so that was where the retouching was needed. Very simple job of masking the original nose with the retouched nose.

Midjourney original

Retouched version

Derived from running the original through an image-restoration generative adversarial network - Real-ESRGAN Inference Demo. (This is actually GFPGAN - for some reason the Colab page seems to be titled differently. GFPGAN is for face restoration.)

 ^If you want to use this, you need to log into your Google account, and make a copy of it. Not 100% sure you have to make a copy, but you do need to be logged into your Google account.

Print quality is just another AI away

Cupscale to the rescue! Haven't added the output here, for obvious reasons. But after running the retouched version through Cupscale, I ended up with a 60MB PNG file that's super sharp even at 100%. No artifacts.


]]>
tag:nugget.posthaven.com,2013:Post/1819784 2022-04-17T08:29:10Z 2025-07-22T03:31:16Z Our very first (and only) home-grown blueberry.

]]>
tag:nugget.posthaven.com,2013:Post/1818998 2022-04-15T07:10:21Z 2025-12-22T13:23:34Z Bear and Nugget

I drew these years ago as part of our submission to Immigration for the bear's partner sponsorship visa. In Australia, Immigration requires you to write essays about each other, and "your life together". I figured essays must get kinda boring, so I added cartoons too.

...and then after a loooooooooooooooong pause (laziness, the sponsored visa was approved ages ago), here's a new one! I really like pruning and weeding. <.<; The observant will notice that in 6 years, we both grew 2 extra fingers...

]]>
tag:nugget.posthaven.com,2013:Post/1815478 2022-04-06T07:18:53Z 2025-07-22T03:32:14Z What's something that effective compliments and CSS have in common?

Specificity is the most !important part of the declaration.

<scamper scamper>


]]>
tag:nugget.posthaven.com,2013:Post/1791378 2022-02-04T10:10:09Z 2025-07-22T03:32:24Z Everything is better with eyes.

Even if it does kinda remind me of Marvel comics' Inferno arc from ages ago.

I would send this to Bellroy, but not sure it's the kind of "customer action shot" they'd like.

]]>
tag:nugget.posthaven.com,2013:Post/1779739 2022-01-05T23:29:13Z 2025-07-22T03:32:38Z How to make squeaky clean SVGs for use in applications

Procedure

  1. Open the SVG in Adobe Illustrator.
  2. Where possible, unite shapes with the same fill into compound paths.
  3. Where possible, outline strokes. This will make cleanup easier when we dig into the code.
  4. Remove any additional paths, bounding boxes, etc, that are not a visible part of the SVG.
  5. Scale the SVG to intended rendering size (e.g. if it’s a 24px icon, scale the SVG to be 24px).
    If you want the SVG to be in a 24px "bounding box", then make sure the canvas size is 24px, and your svg is your desired size within the canvas. The canvas functions as your "bounding box".
  6. Clean up sub-pixel alignments where feasible.
  7. Save SVG.

    SVGX
  8. Open the SVG that was just saved by Adobe Illustrator in SVGX.
  9. Select “Optimized” tab.
  10. Hit [Copy].

    VScode
  11. Open SVG that was just saved by Adobe Illustrator in Visual Studio Code (VScode).
    If no image displayed in SVGX’s preview earlier, even though there was code, that’s fine - the code copied from the “Optimized” tab is still what we want.
  12. Paste the SVG code from SVGX below Adobe’s original SVG code.
    We’ll be using the code from SVGX, with a few tweaks based on the Adobe original.
  13. (Optional) Turn on word wrap.

    VScode - SVGX-pasted code
  14. Ensure the svg opening tag contains only xmlns and viewBox attributes.
  15. Add title tag and string directly below (outside of) the opening svg tag.
    Do describe what the image is. Don't describe what the image can be used to represent.
    <title>Speedometer</title> - Do
    <title>Dashboard</title> - Don’t
    What the image represents will be handled separately with aria-label or alt-text, depending on the implementation.

    Single-colour SVGs meant to be used as icons
  16. Do NOT declare the colour of any path(s) using the fill attribute.
    E.g. DON’T <path fill="#000" d="M22 11.8h-3.6V13h3l5.5 8.3h1.4z"/>
    Declaring a fill makes the SVGs harder to colour dynamically.

    Single-colour SVGs meant to be used as images
  17. Declare the colour of any path(s) using the fill attribute.
    E.g. <path fill="#000" d="M22 11.8h-3.6V13h3l5.5 8.3h1.4z"/>
    This is for SVGs used as images, where there’s no intent to dynamically change the colour.

    Multi-colour SVGs
  18. Declare the correct corresponding colour of each path using the fill attribute.
    SVGX-pasted code will often strip the first colour class, and fail to apply it to the first path associated with it. We’ll need to eyeball the path coordinates, and match the colour in the class to the correct path.
    E.g. <path fill="#41B59D" d="M22 11.8h-3.6V13h3l5.5 8.3h1.4z"/>
  19. Remove style tags, and anything in them.
  20. Remove classes, and their values.
    We are replacing style and class with fill and the colour in the corresponding class.
  21. Delete the original code from Adobe Illustrator.
    We only want the code we pasted from SVGX, and then cleaned up.
  22. Save the SVG in VScode.
  23. Open the VScode edited SVG in SVGX.
  24. If the SVG looks as desired / expected - congrats! It’s clean! We can now add it to Iconset.
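For reference, here's a minimal sketch of what a finished single-colour icon SVG could look like after the steps above. The path data and title are just the example snippets from the steps (not an actual speedometer icon):

```xml
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24">
  <title>Speedometer</title>
  <path d="M22 11.8h-3.6V13h3l5.5 8.3h1.4z"/>
</svg>
```

Note the svg opening tag contains only xmlns and viewBox, and the path has no fill attribute, so the icon can be coloured dynamically.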

Troubleshooting

When I open my VScode edited SVG in SVGX, some paths don’t have the right colours.
It can be tricky to assign the right fills to paths, especially if there are a few of them. Try outlining strokes, and/or combining shapes with the same colours into compound paths in Adobe Illustrator before working with SVGX and VScode.

SVGX is showing a blank preview when I open my VScode edited SVG.
Test the paths by declaring a fill, like so.
<path fill="#41B59D" d="M22 11.8h-3.6V13h3l5.5 8.3h1.4z"/>
If you’re making an SVG that needs to be dynamically recoloured, remember to remove the fills after the preview looks good.

SVGX looks fine, but when I import into Iconset, it just shows a black square/circle/shape.
There may be an additional invisible bounding box / path from the original SVG, which is now being filled, and resulting in a black square/circle/shape. Either locate the invisible path in the SVG code and delete it (this can be hard), or delete the invisible path in Adobe Illustrator, and try again.



]]>
tag:nugget.posthaven.com,2013:Post/1761527 2021-11-20T03:51:39Z 2025-07-22T03:32:49Z Every Guild Wars I build I've ever written (all 350 of them)!

...on the off chance that someone, somewhere, somewhen will find them useful. Might even be future-nugget, though that's unlikely.

These builds are literally everything I ever found interesting enough to save, so there are no guarantees that any of them are good.

However, I've linked to my guides for the good ones in the spreadsheet.

350 guild wars I nuggetbuilds

tinyurl.com/nugguildwars1


]]>
tag:nugget.posthaven.com,2013:Post/1731310 2021-09-03T03:29:39Z 2025-12-08T05:29:12Z Strawberry goop shortbread cookie-tart aka tapioca flour is magical!

We've been thickening savoury sauces with tapioca starch for a while now. We like it better than cornstarch, because it doesn't muddy the flavour of things the way cornstarch does.

At some point, we decided to thicken a pie filling with tapioca starch. <.< There's no going back. Tapioca starch is magical in fruit filling type goops. It makes everything so wonderfully blobby and clear and pretty, without being sticky and tacky. And it even re-bakes nicely, if you want to stuff it in a puff pastry and bake it.

Strawberry yuzu goop

Makes about 500g of goop. Don't worry about measuring exactly. :P I don't really measure stuff, and this is all conjecture anyway haha. If you use too much tapioca starch, you'll just end up with a more solid and bouncy goop.

Ingredients

  • 500g strawberries, chopped
  • 30g~ of honey citron tea - brand doesn't matter, they're all nice (yuzu is wonderful with strawberries)
  • 30g~ tapioca starch
  • 30ml water
  • white sugar to taste (depends on how sweet your berries are)
  • ground cardamom to taste

Steps to reproduce

  1. Chop the strawberries into fingernail-sized bits. It's okay if your fingernails are giant or midget. All fingernail sizes are welcome.
  2. Dump chopped strawberries, sugar, honey citron tea, and ground cardamom in a small pot of your choice (needs to be big enough to hold all your strawberries, obviously).
  3. In a separate bowl, add water to the tapioca starch and swirl it around till it forms a slurry. Don't skip this step! If you just dump the tapioca flour into the pot with the rest, you'll end up with tapioca lumps.
  4. Add tapioca flour slurry to the rest of the stuff in the pot.
  5. Cook at low to medium heat for about 10 minutes, stirring pretty much all the time. Yeah, it sucks. :( I hate stirring.
    The tapioca slurry will look white at first, but once it's done, it'll turn clear.
  6. When the strawberries are squishy enough for your taste, and everything is goopy and clear, it's done.

The goop is great both hot and cold, and it reheats and bakes well. So once you have the goop, go on and GOOP ALL THE THINGS!

]]>
tag:nugget.posthaven.com,2013:Post/1711165 2021-07-07T06:10:53Z 2021-07-07T06:11:50Z Properly-built design system components are awesome.

Unfortunately, most UI-kits are not awesome, and so I end up having to roll my own - like this text input component.

Glad to be back to using Adobe XD after just about 2 years of Sketch-Hell.

]]>
tag:nugget.posthaven.com,2013:Post/1701017 2021-06-09T07:48:18Z 2021-06-09T07:48:19Z So for months now, I've been wondering...

...why does my microwave have an icon of a farting cat with a clock for a face?

Is it because cats lick their bowls really clean? Oh well, I like cats.


Today, it dawns on me what the "proper" interpretation of the icon is.


]]>
tag:nugget.posthaven.com,2013:Post/1609334 2020-10-28T10:18:38Z 2020-10-28T22:15:58Z Free COVID-19 customer logbook for small businesses

I made a very very very basic Airtable template for a COVID-19 customer logbook for small businesses.

Like many Victorians, I watch our Victorian Premier's (Dan Andrews) press conferences nearly every day.

At one of the press conferences a couple of days ago, one of the reporters kept talking about "QR codes" for small businesses, as if QR codes are magical things that will somehow record everything when a customer scans em.

After that press conference, I was complaining to my partner, "Does the reporter even know what a QR code does? If it doesn't redirect to a database, with a form, etc., what's the point? How will a small shop set this up?"

Then I realised, "Hey, I happen to know this no-code tool (Airtable)..." ...and this kinda happened.

The bulk of the work was writing the instructions in a way that normal people can understand and follow.

https://airtable.com/universe/expzohzqb7PE07lhl/covid-19-logbook



]]>
tag:nugget.posthaven.com,2013:Post/1604893 2020-10-16T06:16:41Z 2025-07-22T03:33:24Z "Click the link we sent in your email to log-in" turns every log-in into "reset your password". :| ]]> tag:nugget.posthaven.com,2013:Post/1603892 2020-10-13T05:25:34Z 2025-07-22T03:33:58Z That old chestnut again: Should designers code? No... and yes. ;)

Some ponderings as I learn the wonders of CSS-grid, fluid typography, and all the shiny new toys kids these days have.

Gosh, CSS has gotten so much nicer since the days when we had to haul water to the top of the hill both ways barefoot in the snow.

No, designers shouldn't code
I don't think designers need to be able to write production-quality code. Granted, I have a vested interest in this "no", as I haven't committed production code in over a decade. Plus, production-quality code, especially at an enterprise level, is a completely different beast from building a small static website. When it comes to enterprise code, scalability, maintainability, and extensibility are all very important - and I prefer to leave them to the experts (my developers).

Yes, designers should code
Ideally, designers should have some familiarity with, and understanding of, the basic "materials" used to build the digital products they design. Additionally, the "materials" will vary, even across digital products. Just because I can write JS and CSS certainly does not mean I know the "materials" for native Windows, Mac, Android, or Linux.

With that as the caveat - being able to code just enough to know my materials is a very big plus. I did a basic Vue course fairly recently. Nothing fancy, just a single-page app. However, what I learnt from that course gave me a much better idea of how Vue (and React, and Angular) work at a very high level, and how that can translate into implementation. It also made collaborating with front-end web developers easier, as we had some degree of shared knowledge.

I've also been experimenting with the "new" (not so new, I know) CSS toys all the kids have these days. What's really cool about this is that unlike the Vue course, what I'm learning about CSS is changing the way I think and design - and think about design. These learnings change the bounds of what I know to be possible.

For example, I have been reading about fluid typography on the web for a couple of years now - and before I started poking around the code, it's been a very abstract sort of interest. E.g. "Nice and interesting abstract concept, I should try to design for that when I have the opportunity". Now that I've poked around the code, and gotten a basic understanding of how things work, this has changed to a much more real and practical, "ZOMG now that I actually know how that bit of code holds together, I can actually set a typographic scale that way, and see it work. And I can see how I could make it work in so many places. Waoohh!"
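As a concrete sketch of the technique (illustrative numbers, not the code from my pen): fluid type boils down to clamp()-ing a viewport-based slope between a minimum and a maximum size.

```css
/* Illustrative fluid type scale: 16px at a 320px viewport,
   24px at a 1280px viewport, interpolating linearly between.
   slope = (24 - 16) / (1280 - 320) = 0.8333vw
   intercept = 16 - 0.008333 * 320 = 13.333px = 0.8333rem */
html {
  font-size: clamp(1rem, 0.8333rem + 0.8333vw, 1.5rem);
}
```

Between the two viewport widths the font-size scales smoothly with no media queries; outside them, clamp() pins it to 1rem or 1.5rem.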

Here's my supernoob code-pen, which I'm modifying on the fly as I learn more about css-grid and fluid typography.
All the noob inline comments, every noob inline comment!

See the Pen Flying Red Horse - CSS-Grid Experiments by JC (@nuggettyone) on CodePen.

]]>